Cloud Engineer
Location: Boston, MA
Cloud Database Administrator (DBA)/ETL Engineer
Contract Length: Through 6/30/2026, with high likelihood of extension
As a Cloud Database Administrator (DBA)/ETL Engineer, you will play a key role in a multi-year application and platform modernization initiative within the public education sector. You will be responsible for maintaining, optimizing, and modernizing cloud-hosted databases and data services, ensuring security, high availability, and compliance with governance policies. This hands-on role involves designing and implementing scalable data pipelines, migrating legacy SSIS ETL code to modern SQL-based solutions, and leveraging Apache Airflow for scheduling and dependency management. You will collaborate with cloud engineers, DBAs, and technical leads to deliver streamlined, cost-effective solutions that support critical education programs.
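For a flavor of the Airflow portion of this work, here is a minimal sketch of a DAG that replaces an SSIS-style job with scheduled, dependency-managed tasks. It assumes Airflow 2.x (as on MWAA); the DAG id, task names, and function bodies are hypothetical.

```python
# A hypothetical Airflow 2.x DAG replacing an SSIS-style job: extract, then a
# SQL-based transform, with Airflow owning scheduling and dependencies.
# DAG id, task names, and function bodies are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_raw(**context):
    """Stand-in: copy the day's extract from S3 into a staging table."""

def run_sql_transform(**context):
    """Stand-in: run the SQL that replaced the old SSIS data flow."""

with DAG(
    dag_id="legacy_ssis_replacement",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_raw", python_callable=load_raw)
    transform = PythonOperator(task_id="run_sql_transform",
                               python_callable=run_sql_transform)
    load >> transform    # transform waits on load: explicit dependency management
```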
Minimum Qualifications
Strong experience with Oracle RDS and AWS services (S3, Managed Airflow/MWAA, DMS)
Advanced SQL coding skills; ability to translate Oracle PL/SQL to Snowflake or similar platforms
Experience with multiple backend data sources (SQL Server, Oracle, Postgres, DynamoDB, Snowflake)
Familiarity with data warehouse concepts (facts, dimensions, normalization, slowly changing dimensions)
Basic scripting experience (Python, PowerShell, bash)
Ability to write unit test scripts and validate migrated ETL/ELT code
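To make the last item concrete, here is a minimal pytest sketch that validates migrated ETL/ELT output by comparing the legacy and new copies. sqlite3 stands in for the real legacy and Snowflake connections, and the table and column names are hypothetical.

```python
# Illustrative pytest sketch for validating migrated ETL/ELT output.
# sqlite3 stands in for the real legacy/Snowflake connections; the
# table and column names are hypothetical.
import sqlite3

import pytest

def _make_conn():
    conn = sqlite3.connect(":memory:")   # stand-in for a real database
    conn.execute("CREATE TABLE sales_fact (customer_id INTEGER, amount REAL)")
    conn.execute("INSERT INTO sales_fact VALUES (1, 9.5), (2, 12.0)")
    return conn

@pytest.fixture
def legacy_conn():
    return _make_conn()

@pytest.fixture
def migrated_conn():
    return _make_conn()

def count(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def test_row_counts_match(legacy_conn, migrated_conn):
    # the most basic migration check: nothing dropped, nothing duplicated
    assert count(legacy_conn, "sales_fact") == count(migrated_conn, "sales_fact")

def test_no_null_keys(migrated_conn):
    nulls = migrated_conn.execute(
        "SELECT COUNT(*) FROM sales_fact WHERE customer_id IS NULL"
    ).fetchone()[0]
    assert nulls == 0
```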
Nice-to-Have Skills
Experience configuring and managing Apache Airflow DAGs
Knowledge of Snowflake features (Snowpipe, cloning, time travel, RBAC)
Prior experience in government or education data domains
Familiarity with GitHub and Jira for code management and task tracking
Responsibilities
Create and manage cloud-native databases and services (RDS Oracle, Aurora, Postgres, Snowflake)
Design and implement data pipelines and transformations using AWS and Airflow
Optimize query execution, compute scaling, and storage performance
Implement encryption, access policies, and auditing to meet FERPA/PII standards
Migrate legacy SSIS ETL code to modern SQL-based solutions
Perform performance tuning and benchmarking against on-prem solutions
Collaborate with technical teams to troubleshoot and enhance data workflows
What's In It For You
Weekly Paychecks
Opportunity to work on a high-impact modernization project
Collaborative environment with cutting-edge cloud technologies
Professional growth in data engineering and cloud architecture
EEO Statement:
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Platform Engineer
Cloud Platform Engineer | Aerospace | $210k + Bonus | Boston, MA
Role: Cloud Platform Engineer
Base: $160,000 - $210,000 DOE
An industry leading aerospace company is seeking a Cloud Platform Engineer to join its Boston-based engineering team. This role is ideal for someone passionate about building scalable cloud-native infrastructure and enabling mission-critical applications in a high-performance environment.
What You'll Do
Design, build, and maintain cloud infrastructure primarily on AWS
Develop and deploy applications using Kubernetes, with a focus on reliability and scalability
Implement Infrastructure as Code (IaC) using Terraform
Build observability and monitoring systems using Grafana and related tools (see the sketch after this list)
Collaborate with software engineers and data scientists to support deployment pipelines and cloud architecture
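As a purely illustrative example of the observability item above: a Python service can expose custom metrics with the open-source prometheus_client library, which Prometheus scrapes and Grafana charts. The metric name, port, and measurement are made up.

```python
# Illustrative only: expose a custom metric for Prometheus to scrape and
# Grafana to chart. Metric name, port, and measurement are hypothetical.
import random
import time

from prometheus_client import Gauge, start_http_server

QUEUE_DEPTH = Gauge("ingest_queue_depth", "Items waiting in the ingest queue")

if __name__ == "__main__":
    start_http_server(8000)   # serves a /metrics endpoint for Prometheus
    while True:
        QUEUE_DEPTH.set(random.randint(0, 100))  # stand-in for a real reading
        time.sleep(15)
```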
What We're Looking For
Strong experience with AWS services and architecture
Deep understanding of Kubernetes and container orchestration
Hands-on experience with Terraform for infrastructure automation
Familiarity with Grafana, Prometheus, or similar monitoring stacks
Solid programming/scripting skills (Python, Go, or Bash preferred)
Experience in high-availability, distributed systems is a plus
Why Join?
Competitive compensation of up to $210K base
Work on cutting-edge aerospace and AI technologies
Hybrid work environment with a Boston office hub
Collaborative, mission-driven team culture
User Interface Engineer
Location: Boston, MA
Hello,
We have 3 urgent openings for a "Senior User Interface Developer". These are hybrid roles.
Duration: long-term contract
Interview: an in-person interview is required for this role
Senior User Interface Developer
Key Responsibilities
Architect and deliver scalable frontend applications using Angular, React, and Next.js, incorporating SSR, SSG, and advanced UI patterns.
Develop backend APIs and server-side logic with Express.js for robust full-stack solutions.
Drive technical decisions on frontend frameworks, component libraries, styling systems, and build tools.
Build, optimize, and maintain CI/CD pipelines, ensuring reliable deployments in cloud environments (AWS preferred).
Mentor and coach engineers on frontend frameworks, state management, accessibility, performance, and testing best practices.
Collaborate closely with design, product, and backend teams to translate business needs into exceptional user experiences.
Continuously evaluate emerging technologies and advocate for best practices in frontend and full-stack engineering.
Qualifications
Extensive experience with Angular and React frameworks, and production use of Next.js and Express.js.
Strong proficiency in JavaScript/TypeScript, HTML5, CSS3, and responsive design principles.
Experience with RESTful and GraphQL API integration.
Familiarity with cloud CI/CD pipelines and deployment models (Azure, AWS, or GCP).
Proven leadership in frontend/full-stack architecture and mentoring development teams.
ABOUT US:
Anagh Technologies is a technical consulting firm specializing in UI, front-end, and full-stack web technologies. We currently have 30+ positions in Angular, React, Node, and Java.
If you are technically strong, we can get you an offer within two weeks at most, as we will consider you for multiple roles at once. If you are interested and available, please email your resume and contact information to arshad AT anaghtech.com. Thank you for your time.
SCADA Engineer
firstPRO is now accepting resumes for a SCADA Engineer role in Westborough, MA. This is a 12+ month contract, onsite 2 days per week.
Typical task breakdown:
Manage, maintain, and enhance SCADA system software and field RTU software.
Analyze, research, develop, maintain, and implement enhancements to SCADA.
Define and maintain SCADA data and definitions in Aveva Enterprise SCADA 2023.
Develop SCADA operational pages, create system reports, and maintain historical and real-time databases.
Program and implement the installation of all new and upgraded field RTUs and telecommunications.
Education & Experience Required:
Bachelor of Science Degree in Electrical and/or Computer Engineering.
Minimum of five to eight (5-8) years of related experience.
Technical Skills
Requires proficiency in ACCOL Workbench/Open BSI, Modbus, BSAP, OPC, SQL Server, TCP/IP, and Microsoft Access and Office. Proficiency in Aveva Enterprise SCADA 2023 and ControlWave Designer is preferred, as is an understanding of gas distribution system operations and telecommunications (serial, Ethernet, and wireless).
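Purely as illustration of the Modbus/TCP side of this work, here is a short polling sketch using the open-source pymodbus library (not named in the posting). The host, register address, and slave id are placeholders, and argument names vary across pymodbus versions.

```python
# Illustrative only: poll two holding registers from a field RTU over
# Modbus/TCP with the open-source pymodbus library (3.x API shown; argument
# names vary by version). Host, register address, and slave id are made up.
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.0.2.10", port=502)   # example RTU address
if client.connect():
    result = client.read_holding_registers(address=0, count=2, slave=1)
    if not result.isError():
        print("register values:", result.registers)
    client.close()
```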
Analytics Engineer
🚀 BI Developer / Data Analyst / Analytics Engineer
hackajob is partnering with Simply Business to help grow their Data & Analytics team. This role is ideal for someone who enjoys working hands-on with SQL, data models, and analytics solutions that support real business decisions.
About the role
📊 Build SQL-driven analytics & data models that power business decisions.
🤝 Work cross-functionally with product, ops and engineering in a collaborative team.
⚙️ Help evolve a modern analytics stack and deliver production-ready data solutions.
📍 Boston, MA (Hybrid ~8 days/month)
💼 Full-time
💵 $60k-$99k
✅ Essential skills
🧾 SQL - advanced querying & performance tuning
🏗 Data modeling - star/snowflake, schemas for analytics
🛠 Analytics engineering - building data pipelines & BI solutions
📈 3+ yrs experience in Data & Analytics roles
➕ Nice to have
❄️ Snowflake or similar warehouse
🛠 dbt / AWS / BI tools (Looker/Tableau)
🐍 Python familiarity
🤖 Exposure to AI / advanced analytics
🔎 Role details
👩‍💻 Levels: 1-6 yrs experience (multiple levels considered)
🛂 Work auth: Due to compliance and project requirements, this role is available to candidates who are currently authorized to work in the United States without the need for future sponsorship.
✅ To apply: Sign up on hackajob (Simply Business' hiring partner)
👉 Quick CTA: Sign up on hackajob to be matched and apply - no long forms.
hackajob has partnered with a forward-thinking tech-driven business that prioritizes innovation in its digital solutions and leverages extensive industry data to drive impactful results.
#Data #SQL #Snowflake #dbt #hackajob
Lead Data Engineer
Immediate need for a talented Lead Data Engineer. This is a 6+ month contract opportunity with long-term potential, located in Smithfield, RI (onsite). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-93890
Pay Range: $63 - $73/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
In this role, you will be responsible for leading, engineering, and developing quality software components and applications for brokerage products.
You will build, modernize, and maintain Core & Common tools and Data Solutions, and apply and adopt a variety of cloud-native technologies for the products.
In addition to building software, you will have an opportunity to help define and implement development practices, standards, and strategies.
This position can be based in Merrimack, NH, or Smithfield, RI (Smithfield is preferred).
Key Requirements and Technology Experience:
Bachelor's or Master's Degree in a technology-related field (e.g., Engineering, Computer Science) required, with 10 years of working experience
Advanced Oracle skills; able to read Oracle SQL/PL/SQL
Knowledge of RDBMS, data modeling, ETL, and SQL concepts
Expertise in Oracle PL/SQL is a must, to read logic, schemas, and stored procedures
AWS data engineering services are a must (Batch, EMR, S3, Glue, Lambda, etc.); see the sketch after this list
Informatica is a must (solid ETL concepts needed)
Unix/Python scripting is a must
Financial domain experience
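As referenced in the AWS item above, a minimal boto3 sketch of driving a Glue job run gives a flavor of the services involved. The job name and argument are hypothetical.

```python
# Illustrative only: start an AWS Glue job run and poll its status with boto3.
# The job name and run argument are hypothetical.
import time

import boto3

glue = boto3.client("glue")

run = glue.start_job_run(
    JobName="nightly-positions-etl",            # hypothetical Glue job
    Arguments={"--run_date": "2025-01-31"},
)

while True:
    status = glue.get_job_run(JobName="nightly-positions-etl",
                              RunId=run["JobRunId"])
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("final state:", state)
        break
    time.sleep(30)   # Glue jobs are batch; poll at a coarse interval
```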
Our client is a leader in the banking and financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
DevOps Engineer
📣 Platform Engineer - Travel SaaS
A fast-scaling SaaS company in the travel tech space is hiring a Platform Engineer to help build and scale their global infrastructure.
This is a high-impact role in a product-led, engineering-driven environment. The company operates a modern, multi-service architecture on AWS and needs someone who can take ownership of platform reliability, CI/CD tooling, and infrastructure as code.
💻 The role:
Design, build and maintain secure, scalable AWS infrastructure (EC2, S3, RDS, IAM, etc.)
Champion Infrastructure as Code using Terraform or Pulumi (see the sketch after this list)
Manage containerised deployments with Docker and ECS or Kubernetes
Improve and maintain CI/CD pipelines (GitHub Actions, CircleCI, etc.)
Collaborate closely with engineering, SRE and security teams
Take part in on-call and incident response as part of a “you build it, you run it” culture
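As referenced above, a minimal Pulumi program in Python gives a flavor of the IaC work; the resource names are hypothetical, and Terraform would express the same thing in HCL.

```python
# Illustrative only: a minimal Pulumi (Python) program declaring a versioned
# S3 bucket. Resource names and tags are hypothetical.
import pulumi
import pulumi_aws as aws

artifacts = aws.s3.Bucket(
    "build-artifacts",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
    tags={"managed-by": "pulumi"},
)

# expose the generated bucket name as a stack output
pulumi.export("bucket_name", artifacts.id)
```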
🧠 What they're looking for:
3+ years' hands-on experience with AWS
Strong background in infrastructure as code
Solid understanding of containerisation and orchestration
Comfortable with scripting (Python, Go or Bash)
Experience with observability tools (Datadog, CloudWatch, etc.)
Excellent debugging and troubleshooting skills
🎁 Nice-to-haves:
Exposure to Windows/.NET, serverless architectures or compliance frameworks (e.g. SOC2)
🌍 Why join:
Compensation: $130-150K base + equity
Culture: Low-ego, high-ownership team with a strong engineering voice
Hybrid setup: ~3 days per week in office in Boston
Mission: Helping businesses travel smarter - at global scale
DevOps Engineer
We're looking for a Senior DevOps Tools Engineer to help modernize and elevate our development ecosystem. If you're passionate about improving how software teams build, test, secure, and deliver high-quality code, this role is built for you.
This is not a traditional infrastructure-heavy DevOps role. It's a developer-enablement, tooling modernization, and process transformation position with real influence.
🔧 Role Overview
You will lead initiatives that reshape how engineering teams work: modernizing tooling, redesigning source control practices, improving CI/CD workflows, and championing DevEx across the organization. This role combines hands-on engineering with strategic process design.
⭐ Key Responsibilities
Drive modernization of development tools and processes, including SVN → Git migration and workflow redesign (see the sketch after this list).
Own and enhance CI/CD pipelines to improve reliability, automation, and performance.
Implement modern DevOps + DevSecOps practices (SAST, DAST, code scanning, dependency checks, etc.).
Automate build, packaging, testing, and release processes.
Advocate for and improve Developer Experience (DevEx) by reducing friction and enabling efficiency.
Collaborate across engineering teams to define standards for source control, branching, packaging, and release workflows.
Guide teams through modernization initiatives and influence technical direction.
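As referenced in the first responsibility, SVN → Git migrations are commonly scripted around the standard git svn tool; here is a hedged sketch of one such step in Python. The repository URL and authors file are placeholders.

```python
# Illustrative only: one common SVN -> Git migration step, scripted in Python
# around the standard `git svn` tool. The URL and authors file are placeholders.
import subprocess

SVN_URL = "https://svn.example.com/repos/myproject"   # placeholder repo
AUTHORS = "authors.txt"   # maps SVN usernames to Git author identities

subprocess.run(
    [
        "git", "svn", "clone", SVN_URL,
        "--stdlayout",                   # assumes trunk/branches/tags layout
        f"--authors-file={AUTHORS}",
        "myproject-git",                 # destination directory
    ],
    check=True,   # fail loudly if the clone step errors out
)
```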
🎯 Must-Have Qualifications
Strong experience with CI/CD pipelines, developer tooling, and automation.
Hands-on expertise with Git + Git-based platforms (GitLab, GitHub, Bitbucket).
Experience modernizing tooling or migrating from legacy systems (SVN → Git is a big plus).
Solid understanding of DevOps / DevSecOps workflows: automation, builds, packaging, security integration.
Proficient in scripting/programming for automation (Python, Bash, PowerShell, Groovy, etc.).
Excellent communication skills and ability to guide teams through change.
🏙️ Work Model
This is a full-time, hybrid role based in Boston, MA. Onsite participation is required.
📩 When Applying
Please include:
Updated resume
Expected Salary
Notice period (30 days or less)
A good time for a quick introductory call
If you're excited about modernizing engineering ecosystems, improving developer experience, and driving organization-wide transformation, we'd love to connect.
Senior Data Engineer
Data Engineer (HRIS experience)
Boston, MA (4 days onsite a week)
Key Responsibilities:
Translate business needs into data modelling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes (see the sketch after this list).
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
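As referenced above, a small pandas sketch illustrates the kind of Python cleansing/validation pass these responsibilities describe; the column names are invented HR-style fields, not Workday's actual schema.

```python
# Illustrative only: a small pandas cleansing/validation pass. Column names
# are invented HR-style fields.
import pandas as pd

def cleanse_workers(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    df["employee_id"] = df["employee_id"].str.strip().str.upper()
    df["hire_date"] = pd.to_datetime(df["hire_date"], errors="coerce")

    # validation: fail fast on rows that would corrupt downstream HR analytics
    if df["employee_id"].duplicated().any():
        raise ValueError("duplicate employee_id values in source extract")
    if df["hire_date"].isna().any():
        raise ValueError("unparseable hire_date values in source extract")
    return df
```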
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience developing with Python
Experience architecting a data warehousing solution that leverages data from Workday or another HRIS to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience with architecting or working with ELT technologies (such as DBT) and data architectures
Understanding of HR processes, compliance requirements, and industry best practices
Senior Data Engineer
We are seeking a highly skilled Senior Data Engineer for a client that is based in Boston.
Ideal candidates will have
Power BI or Tableau Experience
SQL experience
AWS Cloud experience
Senior Backend Data Engineer
Hybrid: Boston, MA; Richmond, VA; or McLean, VA
Long-term contract; ex-Capital One candidates preferred
Required Skills & Experience
5-8+ years in backend or data engineering.
PySpark & Python (expert level).
AWS: Hands-on experience with Glue, Lambda, EC2; Step Functions preferred.
Strong background in ETL/ELT and large-scale ingestion pipelines.
Experience supporting accounting/reporting data flows or similar financial processes.
Knowledge of secure file transfer, validation, audit, and compliance workflows (see the sketch after this list).
Solid understanding of distributed systems, CI/CD, and DevOps practices.
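As referenced above, a minimal sketch of an S3-triggered Lambda handler illustrates the validation/audit flavor of this work. The event shape is the standard S3 notification format, while the size check and log wording are hypothetical.

```python
# Illustrative only: an S3-triggered Lambda doing a basic validation/audit
# pass on an incoming file. The size rule and log wording are hypothetical.
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    for record in event["Records"]:          # standard S3 notification shape
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        head = s3.head_object(Bucket=bucket, Key=key)
        if head["ContentLength"] == 0:
            raise ValueError(f"empty file received: s3://{bucket}/{key}")

        # audit trail: Lambda stdout lands in CloudWatch Logs
        print(f"validated s3://{bucket}/{key} ({head['ContentLength']} bytes)")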
DBT SME - Data Modeling, Analytics Engineer
We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global software organization. This role will play a key part in organizing and maturing that data landscape as part of a multi-year strategic roadmap. This position is ideal for a senior-level analytics engineer who can architect data solutions, build robust models, and stay hands-on with development.
This is a remote role with occasional onsite meetings. Candidates must currently be local to the Boston area and reside in MA/CT/RI/NH/ME.
Long-term contract. W2 or C2C.
Highlights:
Architect and build new data models using dbt and modern modeling techniques.
Partner closely with leadership and business teams to translate complex requirements into technical solutions.
Drive structure and clarity within a growing analytics ecosystem.
Qualifications
Bachelor's degree in Economics, Mathematics, Computer Science, or related field.
10+ years of experience in an Analytics Engineering role.
Expert in SQL and dbt with demonstrated modeling experience.
Data Modeling & Transformation: Design and implement robust, reusable data models within the warehouse. Develop and maintain SQL transformations in dbt.
Data Pipeline & Orchestration: Build and maintain reliable data pipelines in collaboration with data engineering. Utilize orchestration tools (Airflow) to manage and monitor workflows. Manage and support dbt environments and transformations.
Hands-on experience with BigQuery or other cloud data warehouses (see the sketch after this list).
Proficiency in Python and Docker.
Experience with Airflow (Composer), Git, and CI/CD pipelines.
Strong attention to detail and communication skills; able to interact with both technical and business stakeholders.
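As referenced above, a minimal google-cloud-bigquery sketch shows the warehouse side of the stack; the dataset, table, and columns are hypothetical.

```python
# Illustrative only: run a simple query with the google-cloud-bigquery client.
# Dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()   # picks up project/credentials from the environment

job = client.query(
    """
    SELECT order_date, SUM(amount) AS revenue
    FROM `analytics.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 7
    """
)
for row in job.result():   # blocks until the query finishes
    print(row.order_date, row.revenue)
```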
Technical Requirements:
Primary Data Warehouse: BigQuery (mandatory)
Nice to Have: Snowflake, Redshift
Orchestration: Airflow (GCP Composer)
Languages: Expert-level SQL / dbt; strong Python required
Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker
Modeling Techniques: Data Vault 2.0, 3NF, Dimensional Modeling, etc.
Version Control: Git / CI-CD
Quality Tools: dbt-Elementary, dbt-Osmosis, or Great Expectations preferred
Data Engineer (HR Data warehousing exp)
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company's future with cloud and data. For more information, visit ************
Data Engineer (HR Data warehousing exp)
Boston, MA (3-4 days onsite a week)
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience developing with Python
Experience architecting a data warehousing solution that leverages data from Workday or another HRIS to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience with architecting or working with ELT technologies (such as DBT) and data architectures
Understanding of HR processes, compliance requirements, and industry best practices
Ness is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law
Senior Data Engineer
This role is with a Maris Financial Services Partner
Boston, MA - Hybrid role - we are targeting local candidates who can be in the Boston office 3 days per week.
12 Month + contract (or contract to hire, if desired)
This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities.
Key Responsibilities:
Design, enhance, and manage DataOps tools and services to support cloud initiatives.
Develop and maintain scheduled workflows using Airflow.
Create containerized applications for deployment with ECS, Fargate, and EKS.
Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake (see the sketch after this list).
Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs.
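As referenced above, a minimal producer sketch (using the open-source kafka-python package, one of several client options) shows the Kafka leg of such a pipeline; the broker, topic, and payload are placeholders.

```python
# Illustrative only: a minimal Kafka producer using the open-source
# kafka-python package. Broker, topic, and payload are placeholders.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                      # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("trades.raw", {"symbol": "ABC", "qty": 100})   # hypothetical event
producer.flush()   # block until the message is actually delivered
```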
Qualifications:
Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools.
Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments.
Proficiency in scripting languages such as Bash.
Experience with Python, Go, or C#.
Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation.
Preferred experience with Apache Kafka and Flink.
Proven experience working with Kubernetes.
Strong knowledge of Linux and Docker environments.
Excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities.
Ability to manage multiple tasks and projects concurrently.
Expertise with SQL Server, Postgres, and Snowflake.
In-depth experience with ETL/ELT processes.
Data Science Engineer
Role: Data Science Engineer
Note: In-person interview required
This is a 12+ month, ongoing contract with our insurance client in Boston, MA (hybrid, 4 days per week onsite), with a mandatory final in-person interview.
We are seeking a talented Data Science Engineer to join our team and contribute to the development and implementation of advanced data solutions using technologies such as AWS Glue, Python, Spark, Snowflake Data Lake, S3, SageMaker, and machine learning (ML).
Overview: As a Data Science Engineer, you will play a crucial role in designing, building, and optimizing data pipelines, machine learning models, and analytics solutions. You will work closely with cross-functional teams to extract actionable insights from data and drive business outcomes.
Develop and maintain ETL pipelines using AWS Glue for data ingestion, transformation, and integration from various sources.
Utilize Python and Spark for data preprocessing, feature engineering, and model development (see the sketch after this list).
Design and implement data lake architecture using Snowflake Data Lake, Snowflake data warehouse and S3 for scalable and efficient storage and processing of structured and unstructured data.
Leverage SageMaker for model training, evaluation, deployment, and monitoring in production environments.
Collaborate with data scientists, analysts, and business stakeholders to understand requirements, develop predictive models, and generate actionable insights.
Conduct exploratory data analysis (EDA) and data visualization to communicate findings and trends effectively.
Stay updated with advancements in machine learning algorithms, techniques, and best practices to enhance model performance and accuracy.
Ensure data quality, integrity, and security throughout the data lifecycle by implementing robust data governance and compliance measures.
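As referenced above, a short PySpark sketch illustrates the preprocessing/feature-engineering step; the S3 paths and columns are hypothetical.

```python
# Illustrative only: a small PySpark preprocessing/feature-engineering step.
# S3 paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-preprocessing").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/claims/")   # placeholder path

features = (
    claims
    .dropna(subset=["claim_amount"])                 # basic cleansing
    .withColumn("claim_month", F.month("claim_date"))
    .groupBy("policy_id", "claim_month")
    .agg(F.sum("claim_amount").alias("monthly_claim_total"))
)

features.write.mode("overwrite").parquet("s3://example-bucket/features/")
```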
Requirements added by the job poster
• 4+ years of work experience with Amazon Web Services (AWS)
• 2+ years of work experience with Machine Learning
• Accept a background check
• 3+ years of work experience with Python (Programming Language)
• Working in a hybrid setting
Senior Data Engineer
Location: Boston, MA
Hi, this is Eric 👋 We're hiring a stellar Data Engineer to join our engineering org at Basil Systems.
At Basil Systems, we're revolutionizing healthcare data access and insights for the life sciences industry. We've built powerful platforms that help pharmaceutical and medical device companies navigate complex regulatory landscapes, accelerate product development, and ultimately bring life-saving innovations to market faster. Our SaaS platforms transform disconnected data sources into actionable intelligence, empowering organizations to make data-driven decisions that improve patient outcomes and save lives.
The Role
We are seeking a Senior Data Engineer to own and advance the data infrastructure that powers our healthcare insights platform. As our engineering team scales and we expand our data capabilities, we need someone who can build reliable, scalable pipelines while ensuring data quality across increasingly complex regulatory sources.
Key Responsibilities
Design, build, and maintain robust ETL processes for healthcare regulatory data
Integrate new data sources as we onboard customers and expand platform capabilities
Optimize pipeline performance and reliability
Ensure data accuracy and consistency across complex transformation workflows
Qualifications
5+ years of professional experience as a data engineer or in a similar role
Experience with Apache Spark and distributed computing
Familiarity with common ML algorithms and their applications
Knowledge of or willingness to learn and work with Generative AI technologies
Experience with developing for distributed cloud platforms
Experience with MongoDB / ElasticSearch and technologies like BigQuery
Strong commitment to engineering best practices
Nice-to-Haves
Solid understanding of modern security practices, especially in healthcare data contexts
Subject matter expertise in LifeSciences / Pharma / MedTech
This role might not be for you if...
You're a heavy process advocate and want enterprise-grade Scrum or rigid methodologies
You have a need for perfect clarity before taking action
You have a big company mindset
What We Offer
Competitive salary
Health and vision benefits
Attractive equity package
Flexible work environment (remote-friendly)
Opportunity to work on impactful projects that are helping bring life-saving medical products to market
Be part of a mission-driven team solving real healthcare challenges at a critical scaling point
Our Culture
At Basil Systems, we value flexibility and support a distributed team. We actively employ and support remote team members across different geographies, allowing you to work when, where, and how you work best. We are committed to building a diverse, inclusive, and safe work environment for everyone. Our team is passionate about using technology to make a meaningful difference in healthcare.
How to Apply
If you're excited about this opportunity and believe you'd be a great fit for our team, please send your resume and a brief introduction to *****************************.
Basil Systems is an equal opportunity employer. We welcome applicants of all backgrounds and experiences.
Junior Data Engineer
Location: Boston, MA
Job Title: Junior Data Engineer
W2 candidates only
We are on the lookout for engineers who are open to upskilling into the exciting world of Data Engineering. This opportunity is with our client, a top-tier insurance company, and includes a 2-3 week online pre-employment training program (15 hours per week), conveniently scheduled after business hours. Participants who successfully complete the program will receive a $500 stipend. This is a fantastic chance to gain in-demand skills, hands-on experience, and a pathway into a dynamic tech role.
Key Responsibilities:
• Assist in the design and development of big data solutions using technologies such as Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch.
• Develop applications primarily in Scala and Python with guidance from senior team members.
• Write and optimize SQL queries, preferably with Redshift; experience with Snowflake is a plus.
• Work on ETL/ELT processes and frameworks to ensure smooth data integration.
• Participate in development tasks, including configuration, writing unit test cases, and testing support.
• Help identify and troubleshoot defects and assist in root cause analysis during testing.
• Support performance testing and production environment troubleshooting.
• Collaborate with the team on best practices, including Git version control and CI/CD deployment processes.
• Continuously learn and grow your skills in big data technologies and cloud platforms.
Prerequisites:
• Recent graduate with a degree in Computer Science, Information Technology, Engineering, or related fields.
• Basic experience or coursework in Scala, Python, or other programming languages.
• Familiarity with SQL and database concepts.
• Understanding of ETL/ELT concepts is preferred.
• Exposure to AWS cloud services (Glue, Lambda, SNS/SQS) is a plus but not mandatory.
• Strong problem-solving skills and eagerness to learn.
• Good communication and teamwork abilities.
Selection Process & Training:
• Online assessment and technical interview by Quintrix.
• Client Interview(s).
• 2-3 weeks of pre-employment online instructor-led training.
Stipend paid during Training:
• $500.
Benefits:
• 2 weeks of Paid Vacation.
• Health Insurance including Vision and Dental.
• Employee Assistance Program.
• Dependent Care FSA.
• Commuter Benefits.
• Voluntary Life Insurance.
• Relocation Reimbursement.
Who is Quintrix?
Quintrix is on a mission to help individuals develop their technology talent. We have helped hundreds of candidates kick-start their careers in tech. You will be "paid to learn", qualifying you for a high-paying tech job with one of our top employers. To learn more about our candidate experience, go to *************************************
Software Development Engineer in Test - AI
Location: Boston, MA
JOB MISSION:
New Balance is seeking a forward-thinking Senior SDET with a developer's mindset and a passion for AI to lead the next evolution of our global eCommerce test automation platform. This is a unique opportunity for someone who thrives on staying ahead of AI trends and is eager to apply them to modern software quality engineering. You'll drive the transformation of our Selenium and BDD-based test stack into a cutting-edge, AI-augmented platform that supports everything from unit testing to full user-journey validation. If you're a builder at heart, excited by the challenge of creating scalable, self-healing, and autonomous testing systems that empower both engineers and developers, this role is for you.
MAJOR ACCOUNTABILITIES:
Lead the architectural redesign of our test automation platform, transitioning from a legacy Selenium/C# and BDD stack to a modern, intelligent framework.
Design, build, and maintain AI-driven test automation platforms that enable reliable, scalable tests across the entire testing pyramid, from unit and integration to full end-to-end user journeys.
Implement AI-augmented testing strategies to support autonomous test creation, maintenance, and healing.
Integrate visual validation tools such as Applitools Eyes into the automation pipeline.
Collaborate cross-functionally with developers, QA engineers, and DevOps to ensure test coverage, reliability, and scalability across global eCommerce sites.
Evaluate and integrate open-source and commercial tools that enhance test intelligence, observability, and maintainability.
Advocate for testability by partnering with developers and architects to influence solution design.
Mentor and guide other SDETs and QA engineers in modern test automation practices and AI-driven testing approaches.
Continuously research and prototype emerging AI technologies in the testing space to keep the platform at the forefront of innovation.
REQUIREMENTS FOR SUCCESS:
5+ years of experience in test automation, with deep expertise in Selenium and C#.
Strong understanding of BDD frameworks (e.g., SpecFlow, Cucumber) and test design principles.
Hands-on experience with Selenium extensions such as Healenium, Selenide, or Selenium Grid, with a focus on improving test resilience, scalability, and maintainability.
Proven ability to implement self-healing test mechanisms and intelligent locator strategies to reduce flakiness and maintenance overhead (see the sketch after this list).
Familiarity with AI-augmented testing strategies (e.g., intelligent test generation, adaptive test execution).
Experience integrating Selenium-based frameworks into modern CI/CD pipelines (e.g., Azure DevOps, Jenkins), with AI-driven diagnostics or analytics.
Proficiency with visual testing tools like Applitools Eyes.
Experience with modern automation frameworks such as TestRigor, Playwright, or Cypress.
Exposure to machine learning or NLP concepts applied to software testing.
Contributions to open-source testing tools or frameworks.
Strong problem-solving, communication, and mentoring skills.
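As referenced in the self-healing item above, and purely as illustration (in Python for brevity, though this stack is Selenium/C#), a fallback-locator helper is the simplest building block such strategies are built on; the locators are made up.

```python
# Illustrative only (Python for brevity; this stack is Selenium/C#): a tiny
# fallback-locator helper of the kind self-healing strategies build on.
# The page URL and locator lists are hypothetical.
from selenium import webdriver
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.by import By

def find_with_fallbacks(driver, locators):
    """Try locators in priority order; return the first element found."""
    for by, value in locators:
        try:
            return driver.find_element(by, value)
        except NoSuchElementException:
            continue   # locator went stale after a UI change; try the next one
    raise NoSuchElementException(f"no locator matched: {locators}")

driver = webdriver.Chrome()
driver.get("https://example.com/checkout")
add_to_cart = find_with_fallbacks(driver, [
    (By.ID, "add-to-cart"),                            # preferred, most stable
    (By.CSS_SELECTOR, "button[data-test='add-to-cart']"),
    (By.XPATH, "//button[contains(., 'Add to Cart')]"),  # last resort
])
```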
ETL Data Engineer with Spring Batch Experience - SHADC5693360
Location: Smithfield, RI
Job Title: ETL Data Engineer with Spring Batch Experience - W2 only - we can provide sponsorship
Duration: Long Term
MUST HAVES:
Strong SQL for querying and data validation
Oracle
AWS
ETL experience with Java Spring Batch (for the ETL data transformation).
Note: the ETL work is done in Java (so Python is only a nice-to-have).
The Expertise and Skills You Bring
Bachelor's or Master's Degree in a technology-related field (e.g., Engineering, Computer Science) required, with 5+ years of working experience
4+ years of Java development utilizing Spring frameworks. Experience writing batch jobs with Spring Batch is a must
2+ years of experience developing applications that run in AWS, with focus on AWS Batch, S3, IAM
3+ years working with SQL (ANSI SQL, Oracle, Snowflake)
2+ years of Python development
Experience with Unix shell scripting (bash, ksh) and scheduling / orchestration tools (Control-M)
Strong data modeling skills with experience working with 3NF and Star Schema data models
Proven data analysis skills; not afraid to work in a complex data ecosystem
Hands-on experience on SQL query optimization and tuning to improve performance is desirable
Experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Terraform, CloudFormation)
Experience in Agile methodologies (Kanban and SCRUM)
Experience building and deploying containerized applications using Docker
Work experience in the financial services industry is a plus
Proven track record of handling ambiguity and working in a fast-paced environment, either independently or collaboratively
Good interpersonal skills to work with multiple teams within the business unit and across the organization
Senior AWS DevOps Engineer - AI Platform Enablement
Location: Boston, MA
We are seeking a seasoned AWS DevOps Engineer to join our growing AI Platform team, responsible for building and operating the next-generation Agentic AI infrastructure that powers intelligent automation across the enterprise.
This individual will play a key role in establishing the DevOps, security, and compliance foundations for our AWS-based AI platform, while working collaboratively with both the AI Studio engineering team and the Enterprise IT organization to ensure governance, reliability, and speed of innovation.
The ideal candidate combines deep AWS technical expertise with a pragmatic, people-oriented approach to bridge between innovation and enterprise IT standards.
Key Responsibilities:
Design, implement, and maintain AWS infrastructure supporting AI development, model orchestration, and agentic systems (e.g., Bedrock, Lambda, ECS/EKS, API Gateway).
Build and manage CI/CD pipelines for AI and data applications using AWS CDK / CloudFormation / CodePipeline / Terraform.
Implement security guardrails and compliance controls (IAM, KMS, network segmentation, audit logging) aligned with IT standards, but managed autonomously within the AI environment (see the sketch after this list).
Partner with the IT security and cloud teams to ensure adherence to cybersecurity insurance and data governance requirements.
Manage monitoring, observability, and cost-optimization for AI workloads (CloudWatch, X-Ray, Config, Trusted Advisor).
Enable rapid development cycles for the AI team by streamlining environment provisioning, model deployment, and access management.
Serve as the bridge between AI Engineering and IT, building mutual trust through transparency, security-minded automation, and operational excellence.
Document and evangelize best practices in DevSecOps, Infrastructure as Code, and model lifecycle management.
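As referenced in the guardrails item above, here is a small AWS CDK (Python) sketch expressing two such controls: KMS encryption with key rotation, and a fully private, TLS-only bucket. The construct names are hypothetical.

```python
# Illustrative only: an AWS CDK v2 (Python) stack with two guardrails from the
# list above: KMS encryption and a locked-down S3 bucket. Names are made up.
from aws_cdk import App, Stack, aws_kms as kms, aws_s3 as s3
from constructs import Construct

class AiArtifactsStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        key = kms.Key(self, "AiDataKey", enable_key_rotation=True)

        s3.Bucket(
            self, "AiArtifacts",
            encryption=s3.BucketEncryption.KMS,
            encryption_key=key,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            enforce_ssl=True,   # compliance: reject non-TLS access in policy
        )

app = App()
AiArtifactsStack(app, "AiArtifactsStack")
app.synth()
```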
Required Qualifications:
7+ years of DevOps / Cloud Engineering experience, with 4+ years on AWS in production environments.
Proven expertise with AWS services: VPC, IAM, Lambda, ECS/EKS, API Gateway, CloudFront, S3, CloudWatch, CloudFormation/CDK, KMS, Cognito, Secrets Manager.
Experience managing CI/CD pipelines (CodePipeline, GitHub Actions, or Jenkins).
Strong understanding of networking, identity federation (Azure AD / Okta), and data security.
Familiarity with AI/ML workflows (SageMaker, Bedrock, Databricks, or similar).
Hands-on experience implementing security guardrails and compliance controls in AWS.
Proficiency with Terraform or CDK for Infrastructure as Code.
Excellent communication and collaboration skills; able to explain technical decisions to IT, security, and data teams.
Preferred Qualifications:
Prior experience supporting AI or data platform teams.
Exposure to multi-agent systems, Bedrock AgentCore, LangChain, or similar frameworks.
Background in hybrid enterprise environments (AWS + Azure).
AWS certifications (e.g., Solutions Architect, DevOps Engineer, Security Specialty).
Success Indicators:
AI development team can deploy and operate independently within compliant AWS guardrails.
IT leadership gains confidence in the AI team's DevOps maturity and control mechanisms.
Infrastructure and pipelines are fully automated, observable, and secure, without bottlenecks.
Summary:
This is a hands-on DevOps leadership role at the intersection of AI innovation and enterprise trust. You will empower cutting-edge AI development while ensuring the environment remains compliant, secure, and sustainable, helping shape the company's next generation of intelligent construction technology.