Platform Engineer
Requirements engineer job in Boston, MA
Cloud Platform Engineer | Aerospace | $210k + Bonus | Boston, MA
Role: Cloud Platform Engineer
Base: $160,000 - $210,000 DOE
An industry-leading aerospace company is seeking a Cloud Platform Engineer to join its Boston-based engineering team. This role is ideal for someone passionate about building scalable cloud-native infrastructure and enabling mission-critical applications in a high-performance environment.
What You'll Do
Design, build, and maintain cloud infrastructure primarily on AWS
Develop and deploy applications using Kubernetes, with a focus on reliability and scalability
Implement Infrastructure as Code (IaC) using Terraform
Build observability and monitoring systems using Grafana and related tools
Collaborate with software engineers and data scientists to support deployment pipelines and cloud architecture
What We're Looking For
Strong experience with AWS services and architecture
Deep understanding of Kubernetes and container orchestration
Hands-on experience with Terraform for infrastructure automation
Familiarity with Grafana, Prometheus, or similar monitoring stacks
Solid programming/scripting skills (Python, Go, or Bash preferred)
Experience in high-availability, distributed systems is a plus
Why Join?
Competitive compensation of up to $210K base
Work on cutting-edge aerospace and AI technologies
Hybrid work environment with a Boston office hub
Collaborative, mission-driven team culture
Cloud Engineer
Requirements engineer job in Boston, MA
Cloud Database Administrator (DBA)/ETL Engineer
Contract Length: Through 6/30/2026, with a high likelihood of extension
As a Cloud Database Administrator (DBA)/ETL Engineer, you will play a key role in a multi-year application and platform modernization initiative within the public education sector. You will be responsible for maintaining, optimizing, and modernizing cloud-hosted databases and data services, ensuring security, high availability, and compliance with governance policies. This hands-on role involves designing and implementing scalable data pipelines, migrating legacy SSIS ETL code to modern SQL-based solutions, and leveraging Apache Airflow for scheduling and dependency management. You will collaborate with cloud engineers, DBAs, and technical leads to deliver streamlined, cost-effective solutions that support critical education programs.
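The dependency-management idea behind the Airflow scheduling described above can be sketched with Python's standard-library topological sorter. The task names below are hypothetical; a real Airflow DAG layers scheduling, retries, and monitoring on top of this ordering:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each task maps to the set of tasks it depends on.
# Airflow DAGs express the same idea, adding schedules, retries, and monitoring.
dag = {
    "extract_oracle": set(),
    "extract_s3": set(),
    "transform_sql": {"extract_oracle", "extract_s3"},
    "load_snowflake": {"transform_sql"},
}

# static_order() yields tasks so that every dependency precedes its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow's scheduler resolves exactly this kind of ordering at runtime, which is why expressing pipelines as explicit dependency graphs (rather than cron chains) simplifies migration from SSIS job steps.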
Minimum Qualifications
Strong experience with Oracle RDS and AWS services (S3, Managed Airflow/MWAA, DMS)
Advanced SQL coding skills; ability to translate Oracle PL/SQL to Snowflake or similar platforms
Experience with multiple backend data sources (SQL Server, Oracle, Postgres, DynamoDB, Snowflake)
Familiarity with data warehouse concepts (facts, dimensions, normalization, slowly changing dimensions)
Basic scripting experience (Python, PowerShell, bash)
Ability to write unit test scripts and validate migrated ETL/ELT code
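A minimal sketch of the validation requirement above in plain Python: both transform functions and the sample rows are hypothetical stand-ins for the legacy SSIS result set and its migrated SQL replacement.

```python
# Hypothetical stand-ins for the legacy SSIS result set and the migrated SQL
# result set; in practice both would be fetched from their respective systems.
def legacy_transform(rows):
    return [{"id": r["id"], "total": r["qty"] * r["price"]} for r in rows]

def migrated_transform(rows):
    # The rewritten SQL logic should produce identical results.
    return [{"id": r["id"], "total": r["qty"] * r["price"]} for r in rows]

def validate_migration(rows):
    """Compare row counts and row-level values between old and new pipelines."""
    old, new = legacy_transform(rows), migrated_transform(rows)
    problems = []
    if len(old) != len(new):
        problems.append(f"row count mismatch: {len(old)} vs {len(new)}")
    for o, n in zip(old, new):
        if o != n:
            problems.append(f"row {o['id']} differs: {o} vs {n}")
    return problems

sample = [{"id": 1, "qty": 2, "price": 5.0}, {"id": 2, "qty": 3, "price": 4.0}]
issues = validate_migration(sample)  # empty list means the outputs match
```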
Nice-to-Have Skills
Experience configuring and managing Apache Airflow DAGs
Knowledge of Snowflake features (Snowpipe, cloning, time travel, RBAC)
Prior experience in government or education data domains
Familiarity with GitHub and Jira for code management and task tracking
Responsibilities
Create and manage cloud-native databases and services (RDS Oracle, Aurora, Postgres, Snowflake)
Design and implement data pipelines and transformations using AWS and Airflow
Optimize query execution, compute scaling, and storage performance
Implement encryption, access policies, and auditing to meet FERPA/PII standards
Migrate legacy SSIS ETL code to modern SQL-based solutions
Perform performance tuning and benchmarking against on-prem solutions
Collaborate with technical teams to troubleshoot and enhance data workflows
What's In It For You
Weekly Paychecks
Opportunity to work on a high-impact modernization project
Collaborative environment with cutting-edge cloud technologies
Professional growth in data engineering and cloud architecture
EEO Statement:
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
SCADA Engineer
Requirements engineer job in Westborough, MA
first PRO is now accepting resumes for a SCADA Engineer role in Westborough, MA. This is a 12+ month contract, onsite 2 days per week.
Typical task breakdown:
Manage, maintain, and enhance SCADA system software and field RTU software.
Analyze, research, develop, maintain, and implement enhancements to SCADA.
Define and maintain supervisory control and data acquisition (SCADA) data and definitions in Aveva Enterprise SCADA 2023.
Develop SCADA operational pages, create system reports, and maintain historical and real-time databases.
Program and implement the installation of all new and upgraded field RTUs and telecommunications.
Education & Experience Required:
Bachelor of Science Degree in Electrical and/or Computer Engineering.
Minimum of five to eight (5-8) years of related experience.
Technical Skills
Requires proficiency in ACCOL Workbench/OpenBSI, Modbus, BSAP, OPC, SQL Server, TCP/IP, and Microsoft Access and Office. Proficiency in Aveva Enterprise SCADA 2023 and ControlWave Designer is preferred, as is an understanding of gas distribution system operations and telecommunications such as serial, Ethernet, and wireless.
User Interface Engineer
Requirements engineer job in Boston, MA
Hello,
We have 3 urgent openings for a "Senior User Interface Developer". These are hybrid roles.
Duration: long-term contract
Interview: There will be an in-person interview for this role
Senior User Interface Developer
Key Responsibilities
Architect and deliver scalable frontend applications using Angular, React, and Next.js, incorporating SSR, SSG, and advanced UI patterns.
Develop backend APIs and server-side logic with Express.js for robust full-stack solutions.
Drive technical decisions on frontend frameworks, component libraries, styling systems, and build tools.
Build, optimize, and maintain CI/CD pipelines, ensuring reliable deployments in cloud environments (AWS preferred).
Mentor and coach engineers on frontend frameworks, state management, accessibility, performance, and testing best practices.
Collaborate closely with design, product, and backend teams to translate business needs into exceptional user experiences.
Continuously evaluate emerging technologies and advocate for best practices in frontend and full-stack engineering.
Qualifications
Extensive experience with Angular and React frameworks, and production use of Next.js and Express.js.
Strong proficiency in JavaScript/TypeScript, HTML5, CSS3, and responsive design principles.
Experience with RESTful and GraphQL API integration.
Familiarity with cloud CI/CD pipelines and deployment models (Azure, AWS, or GCP).
Proven leadership in frontend/full-stack architecture and mentoring development teams.
ABOUT US:
Anagh Technologies is a technical consulting firm specializing in UI, Front-End, and Full-Stack web technologies. We currently have 30+ positions in Angular, React, Node, and Java.
If you are technically strong, we can get you an offer within two weeks at most, as we will consider you for multiple roles at once. If you are interested and available, please email your resume and contact information to arshad AT anaghtech.com. Thank you for your time.
DevOps Engineer
Requirements engineer job in Boston, MA
📣 Platform Engineer - Travel SaaS
A fast-scaling SaaS company in the travel tech space is hiring a Platform Engineer to help build and scale their global infrastructure.
This is a high-impact role in a product-led, engineering-driven environment. The company operates a modern, multi-service architecture on AWS and needs someone who can take ownership of platform reliability, CI/CD tooling, and infrastructure as code.
💻 The role:
Design, build and maintain secure, scalable AWS infrastructure (EC2, S3, RDS, IAM, etc.)
Champion Infrastructure as Code using Terraform or Pulumi
Manage containerised deployments with Docker and ECS or Kubernetes
Improve and maintain CI/CD pipelines (GitHub Actions, CircleCI, etc.)
Collaborate closely with engineering, SRE and security teams
Take part in on-call and incident response as part of a “you build it, you run it” culture
🧠 What they're looking for:
3+ years' hands-on experience with AWS
Strong background in infrastructure as code
Solid understanding of containerisation and orchestration
Comfortable with scripting (Python, Go or Bash)
Experience with observability tools (Datadog, CloudWatch, etc.)
Excellent debugging and troubleshooting skills
🎁 Nice-to-haves:
Exposure to Windows/.NET, serverless architectures or compliance frameworks (e.g. SOC2)
🌍 Why join:
Compensation: $130-150K base + equity
Culture: Low-ego, high-ownership team with a strong engineering voice
Hybrid setup: ~3 days per week in office in Boston
Mission: Helping businesses travel smarter - at global scale
DevOps Engineer
Requirements engineer job in Boston, MA
We're looking for a Senior DevOps Tools Engineer to help modernize and elevate our development ecosystem. If you're passionate about improving how software teams build, test, secure, and deliver high-quality code, this role is built for you.
This is not a traditional infrastructure-heavy DevOps role. It's a developer-enablement, tooling modernization, and process transformation position with real influence.
🔧 Role Overview
You will lead initiatives that reshape how engineering teams work: modernizing tooling, redesigning source control practices, improving CI/CD workflows, and championing DevEx across the organization. This role combines hands-on engineering with strategic process design.
⭐ Key Responsibilities
Drive modernization of development tools and processes, including SVN → Git migration and workflow redesign.
Own and enhance CI/CD pipelines to improve reliability, automation, and performance.
Implement modern DevOps + DevSecOps practices (SAST, DAST, code scanning, dependency checks, etc.).
Automate build, packaging, testing, and release processes.
Advocate for and improve Developer Experience (DevEx) by reducing friction and enabling efficiency.
Collaborate across engineering teams to define standards for source control, branching, packaging, and release workflows.
Guide teams through modernization initiatives and influence technical direction.
🎯 Must-Have Qualifications
Strong experience with CI/CD pipelines, developer tooling, and automation.
Hands-on expertise with Git + Git-based platforms (GitLab, GitHub, Bitbucket).
Experience modernizing tooling or migrating from legacy systems (SVN → Git is a big plus).
Solid understanding of DevOps / DevSecOps workflows: automation, builds, packaging, security integration.
Proficient in scripting/programming for automation (Python, Bash, PowerShell, Groovy, etc.).
Excellent communication skills and ability to guide teams through change.
🏙️ Work Model
This is a full-time, hybrid role based in Boston, MA. Onsite participation is required.
📩 When Applying
Please include:
Updated resume
Expected Salary
Notice period (30 days or less)
A good time for a quick introductory call
If you're excited about modernizing engineering ecosystems, improving developer experience, and driving organization-wide transformation, we'd love to connect.
Junior DevOps Engineer
Requirements engineer job in Woburn, MA
The Alexander Technology Group is looking for a Junior DevOps Engineer for a client in the Woburn, MA area.
Hybrid on-site
No third-party applicants will be considered; please do not reach out.
$85K-$90K
Key Responsibilities:
Support Cloud Infrastructure: Assist in managing AWS infrastructure including VPCs, ECS/EKS clusters, RDS databases, and serverless components under the guidance of senior engineers.
Maintain CI/CD Pipelines: Help maintain and improve deployment pipelines using GitLab CI or GitHub Actions, ensuring smooth software delivery.
Monitor System Health: Set up and monitor alerting systems using CloudWatch, Grafana, or Prometheus, and respond to incidents with support from the team.
Security and Compliance: Support SOC 2 Type II compliance efforts by implementing security controls and following established protocols.
Infrastructure as Code: Gain experience with Terraform and other IaC tools to automate infrastructure provisioning and management.
If interested, please send resume to ************************
Senior Data Engineer
Requirements engineer job in Boston, MA
We are seeking a highly skilled Senior Data Engineer for a client that is based in Boston.
Ideal candidates will have
Power BI or Tableau Experience
SQL experience
AWS Cloud experience
Senior Backend Data Engineer
Requirements engineer job in Boston, MA
Hybrid - Boston MA, Richmond VA, or McLean VA
Long Term - Ex-Capital One
Required Skills & Experience
5-8+ years in backend or data engineering.
PySpark & Python (expert level).
AWS: Hands-on experience with Glue, Lambda, EC2; Step Functions preferred.
Strong background in ETL/ELT and large-scale ingestion pipelines.
Experience supporting accounting/reporting data flows or similar financial processes.
Knowledge of secure file transfer, validation, audit, and compliance workflows.
Solid understanding of distributed systems, CI/CD, and DevOps practices.
DBT SME - Data Modeling, Analytics Engineer
Requirements engineer job in Boston, MA
We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global software organization. This role will play a key part in organizing and maturing that landscape as part of a multi-year strategic roadmap. This position is ideal for a senior-level analytics engineer who can architect data solutions, build robust models, and stay hands-on with development.
This is a remote role with occasional onsite meetings. Candidates must currently be local to the Boston area and reside in MA/CT/RI/NH/ME.
Long-term contract. W2 or C2C.
Highlights:
Architect and build new data models using dbt and modern modeling techniques.
Partner closely with leadership and business teams to translate complex requirements into technical solutions.
Drive structure and clarity within a growing analytics ecosystem.
Qualifications
Bachelor's degree in Economics, Mathematics, Computer Science, or related field.
10+ years of experience in an Analytics Engineering role.
Expert in SQL and dbt with demonstrated modeling experience.
Data Modeling & Transformation: Design and implement robust, reusable data models within the warehouse. Develop and maintain SQL transformations in dbt.
Data Pipeline & Orchestration: Build and maintain reliable data pipelines in collaboration with data engineering. Utilize orchestration tools (Airflow) to manage and monitor workflows. Manage and support dbt environments and transformations.
Hands-on experience with BigQuery or other cloud data warehouses.
Proficiency in Python and Docker.
Experience with Airflow (Composer), Git, and CI/CD pipelines.
Strong attention to detail and communication skills; able to interact with both technical and business stakeholders.
Technical Requirements:
Primary Data Warehouse: BigQuery (mandatory)
Nice to Have: Snowflake, Redshift
Orchestration: Airflow (GCP Composer)
Languages: Expert-level SQL / dbt; strong Python required
Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker
Modeling Techniques: Data Vault 2.0, 3NF, Dimensional Modeling, etc.
Version Control: Git / CI-CD
Quality Tools: dbt-Elementary, dbt-Osmosis, or Great Expectations preferred
Data Engineer (HR Data warehousing exp)
Requirements engineer job in Boston, MA
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company's future with cloud and data. For more information, visit ************
Data Engineer (HR Data warehousing exp)
Boston, MA (3-4 days onsite a week)
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience with developing with Python
Architecting a data warehousing solution leveraging data from Workday or a similar HRIS to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience with architecting or working with ELT technologies (such as dbt) and data architectures
Understanding of HR processes, compliance requirements, and industry best practices
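The SCD Type 2 pattern mentioned in the preferred skills above (keeping full history of changing attributes rather than overwriting them) can be sketched in plain Python. The employee/department rows and field names below are hypothetical:

```python
from datetime import date

def apply_scd2(history, incoming, today):
    """Apply one batch of source rows to an SCD Type 2 history table.

    history: rows with id, tracked attributes, valid_from, valid_to, current
    incoming: source rows with id plus tracked attributes (here just "dept")
    """
    current = {r["id"]: r for r in history if r["current"]}
    for row in incoming:
        old = current.get(row["id"])
        if old and old["dept"] == row["dept"]:
            continue  # no change: keep the current version as-is
        if old:
            old["valid_to"] = today   # close out the superseded version
            old["current"] = False
        history.append({**row, "valid_from": today,
                        "valid_to": None, "current": True})
    return history

# Hypothetical example: an employee changes departments.
hist = [{"id": 1, "dept": "HR", "valid_from": date(2023, 1, 1),
         "valid_to": None, "current": True}]
hist = apply_scd2(hist, [{"id": 1, "dept": "Finance"}], date(2024, 6, 1))
```

In a warehouse this same close-out-and-insert logic is typically written as a MERGE or dbt snapshot, but the bookkeeping is identical.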
Ness is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law
Senior Data Engineer
Requirements engineer job in Boston, MA
This role is with a Maris Financial Services Partner
Boston, MA - Hybrid Role - We are targeting local candidates who can be in the Boston office 3 days per week.
12 Month + contract (or contract to hire, if desired)
This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities.
Key Responsibilities:
Design, enhance, and manage DataOps tools and services to support cloud initiatives.
Develop and maintain scheduled workflows using Airflow.
Create containerized applications for deployment with ECS, Fargate, and EKS.
Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake.
Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs.
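The extract/transform/load flow described in the responsibilities above can be sketched with Python generators. The in-memory sink stands in for a Kafka producer, and the sample records are hypothetical:

```python
# A minimal extract/transform/load pipeline sketched with generators.
# The in-memory sink replaces a Kafka producer here; in the real pipeline
# each record would be published to a topic that ultimately feeds Snowflake.
def extract(source):
    for record in source:          # e.g. rows read from SQL Server or Postgres
        yield record

def transform(records):
    for r in records:
        # Normalize fields before publishing downstream.
        yield {**r, "name": r["name"].strip().upper()}

def load(records, sink):
    for r in records:
        sink.append(r)             # stand-in for producer.send(topic, r)
    return sink

source = [{"id": 1, "name": " alice "}, {"id": 2, "name": "bob"}]
out = load(transform(extract(source)), sink=[])
```

Because each stage is a generator, records stream through one at a time, which is the same backpressure-friendly shape a Kafka-fed pipeline wants.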
Qualifications:
Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools.
Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments.
Proficiency in scripting languages such as Bash.
Experience with Python, Go, or C#.
Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation.
Preferred experience with Apache Kafka and Flink.
Proven experience working with Kubernetes.
Strong knowledge of Linux and Docker environments.
Excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities.
Ability to manage multiple tasks and projects concurrently.
Expertise with SQL Server, Postgres, and Snowflake.
In-depth experience with ETL/ELT processes.
Data Science Engineer
Requirements engineer job in Boston, MA
Role: Data Science Engineer
Note: In-person interview required
This is a 12+ month, ongoing contract with our insurance client in Boston, MA (hybrid, 4 days per week onsite), with a mandatory final onsite interview.
We are seeking a talented Data Science Engineer to join our team and contribute to the development and implementation of advanced data solutions using technologies such as AWS Glue, Python, Spark, Snowflake Data Lake, S3, SageMaker, and machine learning (ML).
Overview: As a Data Science Engineer, you will play a crucial role in designing, building, and optimizing data pipelines, machine learning models, and analytics solutions. You will work closely with cross-functional teams to extract actionable insights from data and drive business outcomes.
Develop and maintain ETL pipelines using AWS Glue for data ingestion, transformation, and integration from various sources.
Utilize Python and Spark for data preprocessing, feature engineering, and model development.
Design and implement data lake architecture using Snowflake Data Lake, Snowflake data warehouse and S3 for scalable and efficient storage and processing of structured and unstructured data.
Leverage SageMaker for model training, evaluation, deployment, and monitoring in production environments.
Collaborate with data scientists, analysts, and business stakeholders to understand requirements, develop predictive models, and generate actionable insights.
Conduct exploratory data analysis (EDA) and data visualization to communicate findings and trends effectively.
Stay updated with advancements in machine learning algorithms, techniques, and best practices to enhance model performance and accuracy.
Ensure data quality, integrity, and security throughout the data lifecycle by implementing robust data governance and compliance measures.
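The exploratory data analysis step above can be illustrated with a standard-library-only summary; the claim amounts below are made up for illustration:

```python
import statistics as st

def summarize(values):
    """Quick exploratory summary of a numeric column (stdlib only)."""
    qs = st.quantiles(values, n=4)          # quartiles (exclusive method)
    return {
        "count": len(values),
        "mean": st.fmean(values),
        "stdev": st.stdev(values),
        "min": min(values),
        "median": qs[1],
        "max": max(values),
    }

# Hypothetical insurance claim amounts
claims = [1200.0, 950.0, 3100.0, 780.0, 2200.0, 1500.0]
summary = summarize(claims)
```

In practice this step would use pandas or Spark DataFrame profiling, but the same few statistics (count, center, spread, extremes) are the starting point for any EDA pass.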
Requirements added by the job poster
• 4+ years of work experience with Amazon Web Services (AWS)
• 2+ years of work experience with Machine Learning
• Accept a background check
• 3+ years of work experience with Python (Programming Language)
• Working in a hybrid setting
Senior Data Engineer
Requirements engineer job in Boston, MA
Hi, this is Eric 👋 We're hiring a stellar Data Engineer to join our engineering org at Basil Systems.
At Basil Systems, we're revolutionizing healthcare data access and insights for the life sciences industry. We've built powerful platforms that help pharmaceutical and medical device companies navigate complex regulatory landscapes, accelerate product development, and ultimately bring life-saving innovations to market faster. Our SaaS platforms transform disconnected data sources into actionable intelligence, empowering organizations to make data-driven decisions that improve patient outcomes and save lives.
The Role
We are seeking a Senior Data Engineer to own and advance the data infrastructure that powers our healthcare insights platform. As our engineering team scales and we expand our data capabilities, we need someone who can build reliable, scalable pipelines while ensuring data quality across increasingly complex regulatory sources.
Key Responsibilities
Design, build, and maintain robust ETL processes for healthcare regulatory data
Integrate new data sources as we onboard customers and expand platform capabilities
Optimize pipeline performance and reliability
Ensure data accuracy and consistency across complex transformation workflows
Qualifications
5+ years of professional experience as a data engineer or in a similar role
Experience with Apache Spark and distributed computing
Familiarity with common ML algorithms and their applications
Knowledge of or willingness to learn and work with Generative AI technologies
Experience with developing for distributed cloud platforms
Experience with MongoDB / ElasticSearch and technologies like BigQuery
Strong commitment to engineering best practices
Nice-to-Haves
Solid understanding of modern security practices, especially in healthcare data contexts
Subject matter expertise in LifeSciences / Pharma / MedTech
This role might not be for you if...
You're a heavy process advocate and want enterprise-grade Scrum or rigid methodologies
You have a need for perfect clarity before taking action
You have a big company mindset
What We Offer
Competitive salary
Health and vision benefits
Attractive equity package
Flexible work environment (remote-friendly)
Opportunity to work on impactful projects that are helping bring life-saving medical products to market
Be part of a mission-driven team solving real healthcare challenges at a critical scaling point
Our Culture
At Basil Systems, we value flexibility and support a distributed team. We actively employ and support remote team members across different geographies, allowing you to work when, where, and how you work best. We are committed to building a diverse, inclusive, and safe work environment for everyone. Our team is passionate about using technology to make a meaningful difference in healthcare.
How to Apply
If you're excited about this opportunity and believe you'd be a great fit for our team, please send your resume and a brief introduction to *****************************.
Basil Systems is an equal opportunity employer. We welcome applicants of all backgrounds and experiences.
Junior Data Engineer
Requirements engineer job in Boston, MA
Job Title: Junior Data Engineer
W2 candidates only
We are on the lookout for engineers who are open to upskilling into the exciting world of Data Engineering. This opportunity is for our client, a top-tier insurance company, and includes a 2-3 week online pre-employment training program (15 hours per week), conveniently scheduled after business hours. Participants who successfully complete the program will receive a $500 stipend. This is a fantastic chance to gain in-demand skills, hands-on experience, and a pathway into a dynamic tech role.
Key Responsibilities:
• Assist in the design and development of big data solutions using technologies such as Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch.
• Develop applications primarily in Scala and Python with guidance from senior team members.
• Write and optimize SQL queries, preferably with Redshift; experience with Snowflake is a plus.
• Work on ETL/ELT processes and frameworks to ensure smooth data integration.
• Participate in development tasks, including configuration, writing unit test cases, and testing support.
• Help identify and troubleshoot defects and assist in root cause analysis during testing.
• Support performance testing and production environment troubleshooting.
• Collaborate with the team on best practices, including Git version control and CI/CD deployment processes.
• Continuously learn and grow your skills in big data technologies and cloud platforms.
Prerequisites:
• Recent graduate with a degree in Computer Science, Information Technology, Engineering, or related fields.
• Basic experience or coursework in Scala, Python, or other programming languages.
• Familiarity with SQL and database concepts.
• Understanding of ETL/ELT concepts is preferred.
• Exposure to AWS cloud services (Glue, Lambda, SNS/SQS) is a plus but not mandatory.
• Strong problem-solving skills and eagerness to learn.
• Good communication and teamwork abilities.
Selection Process & Training:
• Online assessment and technical interview by Quintrix.
• Client Interview(s).
• 2-3 weeks of pre-employment online instructor-led training.
Stipend paid during Training:
• $500.
Benefits:
• 2 weeks of Paid Vacation.
• Health Insurance including Vision and Dental.
• Employee Assistance Program.
• Dependent Care FSA.
• Commuter Benefits.
• Voluntary Life Insurance.
• Relocation Reimbursement.
Who is Quintrix?
Quintrix is on a mission to help individuals develop their technology talent. We have helped hundreds of candidates kick-start their careers in tech. You will be “paid-to-learn”, qualifying you for a high-paying tech job with one of our top employers. To learn more about our candidate experience, go to *************************************
Senior Data Engineer
Requirements engineer job in Boston, MA
Data Engineer (HRIS experience)
Boston, MA ( 4 days onsite a week )
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience with developing with Python
Architecting a data warehousing solution leveraging data from Workday or a similar HRIS to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience with architecting or working with ELT technologies (such as dbt) and data architectures
Understanding of HR processes, compliance requirements, and industry best practices
Software Development Engineer in Test - AI
Requirements engineer job in Boston, MA
JOB MISSION:
New Balance is seeking a forward-thinking Senior SDET with a developer's mindset and a passion for AI to lead the next evolution of our global eCommerce test automation platform. This is a unique opportunity for someone who thrives on staying ahead of AI trends and is eager to apply them to modern software quality engineering. You'll drive the transformation of our Selenium and BDD-based test stack into a cutting-edge, AI-augmented platform that supports everything from unit testing to full user journey validation. If you're a builder at heart, excited by the challenge of creating scalable, self-healing, and autonomous testing systems that empower both engineers and developers, this role is for you.
MAJOR ACCOUNTABILITIES:
Lead the architectural redesign of our test automation platform, transitioning from a legacy Selenium/C# and BDD stack to a modern, intelligent framework.
Design, build, and maintain AI-driven test automation platforms that enable reliable, scalable tests across the entire testing pyramid, from unit and integration to full end-to-end user journeys.
Implement AI-augmented testing strategies to support autonomous test creation, maintenance, and healing.
Integrate visual validation tools such as Applitools Eyes into the automation pipeline.
Collaborate cross-functionally with developers, QA engineers, and DevOps to ensure test coverage, reliability, and scalability across global eCommerce sites.
Evaluate and integrate open-source and commercial tools that enhance test intelligence, observability, and maintainability.
Advocate for testability by partnering with developers and architects to influence solution design.
Mentor and guide other SDETs and QA engineers in modern test automation practices and AI-driven testing approaches.
Continuously research and prototype emerging AI technologies in the testing space to keep the platform at the forefront of innovation.
REQUIREMENTS FOR SUCCESS:
5+ years of experience in test automation, with deep expertise in Selenium and C#.
Strong understanding of BDD frameworks (e.g., SpecFlow, Cucumber) and test design principles.
Hands-on experience with Selenium extensions such as Healenium, Selenide, or Selenium Grid, with a focus on improving test resilience, scalability, and maintainability.
Proven ability to implement self-healing test mechanisms and intelligent locator strategies to reduce flakiness and maintenance overhead.
Familiarity with AI-augmented testing strategies (e.g., intelligent test generation, adaptive test execution).
Experience integrating Selenium-based frameworks into modern CI/CD pipelines (e.g., Azure DevOps, Jenkins), with AI-driven diagnostics or analytics.
Proficiency with visual testing tools like Applitools Eyes.
Experience with modern automation frameworks such as TestRigor, Playwright, or Cypress.
Exposure to machine learning or NLP concepts applied to software testing.
Contributions to open-source testing tools or frameworks.
Strong problem-solving, communication, and mentoring skills.
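The "self-healing test mechanisms and intelligent locator strategies" requirement above usually boils down to trying an ordered list of locators and recording which fallback fired so the primary locator can be repaired. A minimal, framework-agnostic sketch of that idea (the `find` callable stands in for something like Selenium's `driver.find_element`; names here are illustrative, not New Balance's actual framework):

```python
def find_with_fallback(find, locators):
    """Try an ordered list of locator strategies until one succeeds.

    find:     callable (by, value) -> element, raising LookupError on a
              miss (a Selenium wrapper would translate
              NoSuchElementException into LookupError).
    locators: ordered list of (by, value) pairs, most specific first.
    Returns (element, locator_used) so callers can log which fallback
    fired and later promote it to the primary locator.
    """
    last_err = None
    for by, value in locators:
        try:
            return find(by, value), (by, value)
        except LookupError as err:
            last_err = err
    raise LookupError(f"No locator matched: {locators}") from last_err
```

Logging the `(by, value)` pair that actually matched is what turns this from a retry loop into a healing mechanism: flaky locators surface in reports and can be replaced before they break a run.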
Software Engineer
Requirements engineer job in Boston, MA
Work schedule: Hybrid
Key Responsibilities:
Performance Tuning: Monitor and optimize performance, including query performance, resource utilization, and storage management.
User and Access Management: Manage user access, roles, and permissions to ensure data security and compliance with organizational policies.
Data Integration: Support and manage data integration processes, including data loading, transformation, and extraction.
Troubleshooting and Support: Provide technical support and troubleshooting for Snowflake-related issues, including resolving performance bottlenecks and query optimization.
Documentation and Reporting: Maintain detailed documentation of system configurations, procedures, and changes. Generate and deliver regular reports on system performance and usage.
Collaboration: Work closely with data engineers, analysts, and other IT professionals to ensure seamless integration and optimal performance of the Snowflake environment.
Best Practices: Stay up to date with Snowflake best practices and industry trends. Recommend and implement improvements and upgrades to enhance system functionality and performance.
Qualifications and Experience:
5+ years of experience in data architecture, data engineering, or database development.
2+ years of hands-on experience with Snowflake, including data modeling, performance tuning, and security.
Bachelor's degree (minimum) in Computer Science, Information Technology, or a related field.
Experience with source control tools (GitHub preferred), ETL/ELT tools and cloud platforms (AWS preferred).
Experience or exposure to AI tools.
Deep understanding of data warehousing concepts, dimensional modeling, and analytics.
Excellent problem-solving and communication skills.
Experience integrating Snowflake with BI and reporting tools is a plus.
Required Skills:
Strong proficiency in Snowflake architecture, features, and capabilities.
Knowledge of SQL and Snowflake-specific query optimization.
Experience with ETL tools and data integration processes.
Strong proficiency in SQL and Python.
Strong database design and data modeling experience. Experience with data modeling tools.
Ability to identify and drive continuous improvements.
Strong problem solving and analytical skills.
Demonstrated process-oriented and strategic thinking skills.
Strong motivation and a desire to continuously learn and grow.
Knowledge of Snowflake security features including access control, authentication, authorization, encryption, masking, secure view, etc.
Experience working in AWS cloud environments.
Experience working with Power BI and other BI, data visualization, and reporting tools.
Experience gathering business requirements and aligning them to solution delivery.
Experience with data integration solutions and tools, data pipelines, and modern ways of automating data using cloud based and on-premises technologies.
Experience integrating Snowflake with an identity and access management program such as Azure IDP is a plus.
Experience with other relational database management systems, cloud data warehouses and big data platforms is a plus.
Analytical Skills: Excellent problem-solving and analytical skills with strong attention to detail.
Communication: Effective communication skills, both written and verbal, with the ability to convey complex technical information to non-technical stakeholders.
Teamwork: Ability to work independently and collaboratively in a fast-paced environment.
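The Snowflake security features listed above (masking, secure views, role-based access control) are typically applied as DDL. As a hedged illustration, the snippet below holds a dynamic data masking policy in standard Snowflake syntax; the role, table, and column names are made up, and in practice the statements would be executed through a `snowflake.connector` cursor:

```python
# Illustrative Snowflake dynamic data masking DDL; 'HR_ADMIN',
# 'employees', and 'email' are hypothetical names.
MASKING_POLICY_DDL = """
CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('HR_ADMIN') THEN val
    ELSE '***MASKED***'
  END
"""

APPLY_POLICY_DDL = (
    "ALTER TABLE employees MODIFY COLUMN email "
    "SET MASKING POLICY email_mask"
)

def apply_masking(cursor):
    """Run both statements on a snowflake.connector cursor."""
    cursor.execute(MASKING_POLICY_DDL)
    cursor.execute(APPLY_POLICY_DDL)
```

Once the policy is attached, unprivileged roles see the masked value transparently in every query, which is why masking policies pair naturally with secure views for row-level restrictions.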
Preferred Skills:
Snowflake certification (e.g., SnowPro Core or Advanced Certification).
AI & Systems Engineer
Requirements engineer job in Boston, MA
Job Title: Artificial Intelligence Engineer
Location: Boston/Hybrid
Type: Full-time
The Role We're looking for a hands-on AI Systems Engineer to own the deployment, integration, and support of AI-powered tools (LLMs, Copilot, Claude, etc.) while keeping enterprise infrastructure running smoothly in a professional services environment.
What You'll Do
Build, deploy, and maintain AI applications that supercharge legal and knowledge workflows
Manage and optimize cloud (Azure/M365) and on-prem environments (Windows/Linux, VMware/Nutanix, AD, SQL)
Write production-grade Python/PowerShell, automate everything, consume REST APIs and SDKs
Craft high-impact prompts and fine-tune LLM usage
Partner with architecture and ops leadership on strategy, resilience, and continual improvement
Research emerging tech and drive efficiency gains
Rotate in 24×7 on-call (escalation/triage)
You Bring
7+ years supporting mission-critical IT in professional services or similar
Real experience with modern LLMs and AI tools
Strong Python or PowerShell + familiarity with ML libraries
Deep experience with Azure, M365, Active Directory, virtualization, networking, backups
Proven ability to solve complex problems independently and communicate clearly
Bachelor's in CS or related field
Software Engineer
Requirements engineer job in Cambridge, MA
💻 Software Engineer | Scalable Systems | Onsite (Cambridge, MA)
A rapidly growing tech startup is building AI-driven, high-performance systems designed to solve complex, real-world challenges. The team blends software engineering and systems optimization to create scalable, reliable technology that supports next-generation applications.
As a Software Engineer, you'll play a key role in designing and building scalable front-end and back-end systems in TypeScript, collaborating with talented engineers to deliver secure and efficient solutions that perform at scale.
Tech: TypeScript, React, and Node.js
If you enjoy building scalable systems that drive innovation and want to make a visible impact in a fast-moving startup, this could be the perfect fit.
📍 Location: Cambridge, MA (onsite)
💰 Up to $300,000 (dependent on experience) + equity options
Interested? Apply now!