Cloud Engineer
Requirements engineer job in Boston, MA
Cloud Database Administrator (DBA)/ETL Engineer
Contract Length: Through 6/30/2026 with high likelihood for extension
As a Cloud Database Administrator (DBA)/ETL Engineer, you will play a key role in a multi-year application and platform modernization initiative within the public education sector. You will be responsible for maintaining, optimizing, and modernizing cloud-hosted databases and data services, ensuring security, high availability, and compliance with governance policies. This hands-on role involves designing and implementing scalable data pipelines, migrating legacy SSIS ETL code to modern SQL-based solutions, and leveraging Apache Airflow for scheduling and dependency management. You will collaborate with cloud engineers, DBAs, and technical leads to deliver streamlined, cost-effective solutions that support critical education programs.
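The "scheduling and dependency management" that the posting assigns to Apache Airflow comes down to running pipeline tasks in dependency order. As a rough illustration only (the task names are hypothetical, and a real Airflow DAG would use the `airflow` library, not the standard library), Python's `graphlib` can express the same ordering:

```python
from graphlib import TopologicalSorter

# Hypothetical steps in an SSIS-to-SQL migration pipeline; each task
# maps to the set of tasks that must finish before it can start.
deps = {
    "extract_oracle": set(),
    "stage_to_s3": {"extract_oracle"},
    "load_snowflake": {"stage_to_s3"},
    "validate_counts": {"load_snowflake"},
}

# static_order() yields a valid execution order respecting every edge.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

An Airflow scheduler performs essentially this resolution, plus retries, backfills, and cross-DAG dependencies.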
Minimum Qualifications
Strong experience with Oracle RDS and AWS services (S3, Managed Airflow/MWAA, DMS)
Advanced SQL coding skills; ability to translate Oracle PL/SQL to Snowflake or similar platforms
Experience with multiple backend data sources (SQL Server, Oracle, Postgres, DynamoDB, Snowflake)
Familiarity with data warehouse concepts (facts, dimensions, normalization, slowly changing dimensions)
Basic scripting experience (Python, PowerShell, bash)
Ability to write unit test scripts and validate migrated ETL/ELT code
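The last qualification, writing unit tests to validate migrated ETL/ELT code, is commonly done by comparing row counts and checksums between source and target tables. A minimal sketch, using sqlite3 as a stand-in for the posting's actual Oracle/Snowflake platforms (table and column names are invented):

```python
import sqlite3

def table_fingerprint(conn, table):
    # Row count plus an order-independent checksum of all rows; if both
    # match between source and target, the migration likely preserved data.
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    return len(rows), sum(hash(r) for r in rows)

# Stand-in "source" (legacy) and "target" (migrated) databases.
src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE enrollments (student_id INT, credits INT)")
    conn.executemany("INSERT INTO enrollments VALUES (?, ?)",
                     [(1, 12), (2, 15), (3, 9)])

# The validation step a migration unit test would assert on.
assert table_fingerprint(src, "enrollments") == table_fingerprint(tgt, "enrollments")
```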
Nice-to-Have Skills
Experience configuring and managing Apache Airflow DAGs
Knowledge of Snowflake features (Snowpipe, cloning, time travel, RBAC)
Prior experience in government or education data domains
Familiarity with GitHub and Jira for code management and task tracking
Responsibilities
Create and manage cloud-native databases and services (RDS Oracle, Aurora, Postgres, Snowflake)
Design and implement data pipelines and transformations using AWS and Airflow
Optimize query execution, compute scaling, and storage performance
Implement encryption, access policies, and auditing to meet FERPA/PII standards
Migrate legacy SSIS ETL code to modern SQL-based solutions
Perform performance tuning and benchmarking against on-prem solutions
Collaborate with technical teams to troubleshoot and enhance data workflows
What's In It For You
Weekly Paychecks
Opportunity to work on a high-impact modernization project
Collaborative environment with cutting-edge cloud technologies
Professional growth in data engineering and cloud architecture
EEO Statement:
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Platform Engineer
Cloud Platform Engineer | Aerospace | $210k + Bonus | Boston, MA
Role: Cloud Platform Engineer
Base: $160,000 - $210,000 DOE
An industry leading aerospace company is seeking a Cloud Platform Engineer to join its Boston-based engineering team. This role is ideal for someone passionate about building scalable cloud-native infrastructure and enabling mission-critical applications in a high-performance environment.
What You'll Do
Design, build, and maintain cloud infrastructure primarily on AWS
Develop and deploy applications using Kubernetes, with a focus on reliability and scalability
Implement Infrastructure as Code (IaC) using Terraform
Build observability and monitoring systems using Grafana and related tools
Collaborate with software engineers and data scientists to support deployment pipelines and cloud architecture
What We're Looking For
Strong experience with AWS services and architecture
Deep understanding of Kubernetes and container orchestration
Hands-on experience with Terraform for infrastructure automation
Familiarity with Grafana, Prometheus, or similar monitoring stacks
Solid programming/scripting skills (Python, Go, or Bash preferred)
Experience in high-availability, distributed systems is a plus
Why Join?
Competitive compensation of up to $210K base
Work on cutting-edge aerospace and AI technologies
Hybrid work environment with a Boston office hub
Collaborative, mission-driven team culture
Cloud Engineer
Cloud Engineer | Hybrid, Devens, MA | $130,000 - $145,000 + Benefits
A well-established firm in Devens, MA is seeking a skilled and experienced Cloud Engineer to join its growing IT team. With over 100 years in business and a strong reputation in their sector, the company offers a collaborative, community-focused environment with a commitment to innovation and long-term growth.
Why This Role Stands Out:
Join a close-knit IT team and serve as the go-to expert for cloud infrastructure
Work directly with leadership to shape the company's Azure strategy
Be part of a business expanding into new locations and investing in technology
Enjoy a family-friendly culture with strong values and a relaxed office environment
Access to professional development, certifications, and regular company events
Key Responsibilities:
Design, implement, and maintain hybrid cloud infrastructure (Azure + Active Directory)
Lead networking, firewall, and security configurations across cloud environments
Manage virtual machines, storage, SSO, conditional access, and enterprise applications
Implement MDR/XDR antivirus solutions and support security audits
Collaborate with cross-functional teams to troubleshoot and optimize systems
Stay current on cloud trends and make recommendations for continuous improvement
Required Skills and Experience
10+ years in IT, with a strong focus on cloud engineering
Deep expertise in Microsoft Azure technologies
Solid understanding of networking, firewalls, and Windows environments
Experience with MDR/XDR cybersecurity tools
Strong documentation and collaboration skills
Relevant certifications (e.g., Azure Solutions Architect, AWS Solutions Architect) preferred
Compensation & Benefits:
$130,000 - $145,000 base salary
Full medical, dental, and vision coverage (Blue Cross)
401(k) with company match and profit sharing after 1 year
One remote day per week (4 days on-site)
Supportive culture with a focus on work-life balance and community engagement
The company is actively interviewing and looking to make a hire soon. To secure an interview slot, send in an application, drop me a message, or forward your resume to the contact below.
📧 *****************************
Analytics Engineer
🚀 BI Developer / Data Analyst / Analytics Engineer
hackajob is partnering with Simply Business to help grow their Data & Analytics team. This role is ideal for someone who enjoys working hands-on with SQL, data models, and analytics solutions that support real business decisions.
About the role
📊 Build SQL-driven analytics & data models that power business decisions.
🤝 Work cross-functionally with product, ops and engineering in a collaborative team.
⚙️ Help evolve a modern analytics stack and deliver production-ready data solutions.
📍 Boston, MA (Hybrid ~8 days/month)
💼 Full-time
💵 $60k-$99k
✅ Essential skills
🧾 SQL - advanced querying & performance tuning
🏗 Data modeling - star/snowflake, schemas for analytics
🛠 Analytics engineering - building data pipelines & BI solutions
📈 3+ yrs experience in Data & Analytics roles
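For readers unfamiliar with the star/snowflake modeling listed above: a central fact table references descriptive dimension tables, and analytics queries join and aggregate across them. A toy sketch via Python's sqlite3 (table names and figures are invented, not from the role):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One dimension table and one fact table: a minimal star schema.
    CREATE TABLE dim_product (product_id INT PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INT, amount REAL);
    INSERT INTO dim_product VALUES (1, 'policy'), (2, 'addon');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 25.0);
""")

# A typical analytics query: join fact to dimension, aggregate by attribute.
revenue = dict(conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category
""").fetchall())

assert revenue == {"policy": 150.0, "addon": 25.0}
```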
➕ Nice to have
❄️ Snowflake or similar warehouse
🛠 dbt / AWS / BI tools (Looker/Tableau)
🐍 Python familiarity
🤖 Exposure to AI / advanced analytics
🔎 Role details
👩‍💻 Levels: 1-6 yrs experience (multiple levels considered)
🛂 Work auth: Due to compliance and project requirements, this role is available to candidates who are currently authorized to work in the United States without the need for future sponsorship.
✅ To apply: Sign up on hackajob (Simply Business' hiring partner)
👉 Quick CTA: Sign up on hackajob to be matched and apply - no long forms.
hackajob has partnered with a forward-thinking tech-driven business that prioritizes innovation in its digital solutions and leverages extensive industry data to drive impactful results.
#Data #SQL #Snowflake #dbt #hackajob
Formulation Engineer
Formulation Engineer job in Woburn, MA
Vaxess is an NIH and venture-funded company developing a pipeline of next-generation therapeutics on the Microneedle Array Patch (MAP) platform. With only five minutes of wear-time on the skin, the self-applied MAP enables sustained delivery in the skin. The platform combines high temperature stability with simplified application to dramatically alter the way that drugs are delivered. Vaxess is committed to enabling products that are not only more effective, but also more accessible to patients around the world.
Manufacturing at Vaxess is cross-disciplinary, integrating mechanical, industrial, biomedical, and chemical engineering with chemistry, biology, and human factors to address important unmet medical needs. We are seeking a talented, collaborative, and highly motivated Formulation Engineer with a proven track record in leading process development, tech transfer, and manufacturing support. The Formulation Engineer will report to the Director of Process Development and will collaborate closely with cross-functional teams such as Manufacturing, Assay Development, Quality Control, Automation, and Facilities.
Responsibilities:
Leads and supervises cGMP formulation operations related to Microarray Patch manufacturing, ensuring adherence to batch records, SOPs, and regulatory requirements. Provides hands-on guidance and oversight to junior staff and technicians during manufacturing campaigns.
Oversees day-to-day manufacturing floor activities during formulation operations, ensuring safety, compliance, and productivity. Coordinates scheduling, resource allocation, and troubleshooting in collaboration with Manufacturing and Facilities teams.
Trains, mentors, and evaluates manufacturing personnel involved in formulation processes. Develops and implements training materials and competency assessments to ensure operational excellence and compliance.
Reviews and verifies documentation for completeness and compliance with regulatory standards.
Ensures the timely completion of production batch records, logbooks, test records, and other documentation following Good Documentation Practices.
Collaborates on process development and optimization studies for Microarray Patch formulations. Works as part of the process development team to optimize excipients and formulations to support project objectives. Maintains detailed experimental records. Documents data in reports and writes detailed protocols.
Collaborates closely with cross-functional team members of Manufacturing, Quality, Research, and Automation functions to support process development, manufacturing, and tech transfer between R&D and product development activities. Acts as a liaison between Research and Manufacturing teams to scale up formulation processes from R&D to production.
Identifies and implements new technologies and assays to support continuous improvement and scale up of processes.
Leads root cause analyses (RCA) and process development investigations.
Authors, edits, and reviews batch records, SOPs, work instructions, and other documentation to support manufacturing readiness and regulatory submissions.
Regularly communicates progress, issues, and results to key stakeholders
Qualifications:
BS or MS in Biological Sciences, Chemistry, Biomedical Engineering, Chemical Engineering or a related discipline, with at least 3 years of relevant experience. Title will be adjusted commensurate with qualifications.
Strong prior hands-on formulation process development experience. Demonstrated ability to design, execute studies, and analyze experimental results.
Prior supervisory experience in a GxP environment is preferred.
Strong understanding of Good Documentation Practices. Experience in authoring and reviewing batch records, SOPs, and work instructions.
Strong verbal and written communication skills. Excellent time and project management skills.
Demonstrated ability to learn new skills, creatively solve challenging technical problems, think independently, and work collaboratively in diverse multidisciplinary teams.
Strong attention to detail. Ability to identify root causes of problems and recommend corrective actions for continuous process improvement.
Vaxess is building a team of exceptional people to rapidly advance product development. We work closely as a team and thrive in a dynamic, exciting, and engaging work environment. If you're interested in joining the Vaxess team, please submit your CV/resume to ******************.
SCADA Engineer
first PRO is now accepting resumes for a SCADA Engineer role in Westborough, MA. This is a 12+ month contract, onsite 2 days per week.
Typical task breakdown:
o Manage, maintain and enhance SCADA system software and field RTU software.
o Analyze, research, develop, maintain and implement enhancements to SCADA.
o Define and maintain supervisory control and data acquisition (SCADA) data and definitions in Aveva Enterprise SCADA 2023.
o Develop SCADA operational pages, create system reports and maintain historical and real-time databases.
o Program and implement the installation of all new and upgraded field RTUs and telecommunications.
Education & Experience Required:
o Bachelor of Science Degree in Electrical and/or Computer Engineering.
o Minimum of five to eight (5-8) years of related experience.
Technical Skills
o Requires proficiency in Accol Workbench Open BSI, Modbus, BSAP, OPC, SQL Server, TCP/IP, Microsoft Access and Office. Proficiency in Aveva Enterprise SCADA 2023 and ControlWave Designer preferred. An understanding of gas distribution system operations and telecommunications such as serial, Ethernet, and wireless is preferred.
DevOps Engineer
📣 Platform Engineer - Travel SaaS
A fast-scaling SaaS company in the travel tech space is hiring a Platform Engineer to help build and scale their global infrastructure.
This is a high-impact role in a product-led, engineering-driven environment. The company operates a modern, multi-service architecture on AWS and needs someone who can take ownership of platform reliability, CI/CD tooling, and infrastructure as code.
💻 The role:
Design, build and maintain secure, scalable AWS infrastructure (EC2, S3, RDS, IAM, etc.)
Champion Infrastructure as Code using Terraform or Pulumi
Manage containerised deployments with Docker and ECS or Kubernetes
Improve and maintain CI/CD pipelines (GitHub Actions, CircleCI, etc.)
Collaborate closely with engineering, SRE and security teams
Take part in on-call and incident response as part of a “you build it, you run it” culture
🧠 What they're looking for:
3+ years' hands-on experience with AWS
Strong background in infrastructure as code
Solid understanding of containerisation and orchestration
Comfortable with scripting (Python, Go or Bash)
Experience with observability tools (Datadog, CloudWatch, etc.)
Excellent debugging and troubleshooting skills
🎁 Nice-to-haves:
Exposure to Windows/.NET, serverless architectures or compliance frameworks (e.g. SOC2)
🌍 Why join:
Compensation: $130-150K base + equity
Culture: Low-ego, high-ownership team with a strong engineering voice
Hybrid setup: ~3 days per week in office in Boston
Mission: Helping businesses travel smarter - at global scale
Senior Data Engineer
Data Engineer (HRIS experience)
Boston, MA (4 days onsite per week)
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience developing with Python
Experience architecting a data warehousing solution leveraging data from Workday or another HRIS to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience with architecting or working with ELT technologies (such as DBT) and data architectures
Understanding of HR processes, compliance requirements, and industry best practices
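The "SCD Type 2" item above refers to slowly changing dimensions where history is preserved: instead of overwriting a changed attribute, the current row is closed out with an end date and a new current row is opened. A minimal pure-Python sketch of that logic (the field names and HR example are invented for illustration; in practice this would be a warehouse MERGE or a dbt snapshot):

```python
from datetime import date

def scd2_apply(history, key, new_attrs, today):
    # SCD Type 2: retain old versions with an end date rather than
    # overwriting them, so point-in-time reporting stays possible.
    for row in history:
        if row["key"] == key and row["end"] is None:
            if row["attrs"] == new_attrs:
                return history          # no change, nothing to do
            row["end"] = today          # close the current version
    history.append({"key": key, "attrs": new_attrs,
                    "start": today, "end": None})
    return history

# Hypothetical HR example: an employee changes department.
hist = [{"key": "emp1", "attrs": {"dept": "Sales"},
         "start": date(2023, 1, 1), "end": None}]
scd2_apply(hist, "emp1", {"dept": "HR"}, date(2024, 6, 1))
```

After the call, the Sales row carries an end date and the HR row is the open, current version.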
DevOps Engineer
We're looking for a Senior DevOps Tools Engineer to help modernize and elevate our development ecosystem. If you're passionate about improving how software teams build, test, secure, and deliver high-quality code, this role is built for you.
This is not a traditional infrastructure-heavy DevOps role. It's a developer-enablement, tooling modernization, and process transformation position with real influence.
🔧 Role Overview
You will lead initiatives that reshape how engineering teams work: modernizing tooling, redesigning source control practices, improving CI/CD workflows, and championing DevEx across the organization. This role combines hands-on engineering with strategic process design.
⭐ Key Responsibilities
Drive modernization of development tools and processes, including SVN → Git migration and workflow redesign.
Own and enhance CI/CD pipelines to improve reliability, automation, and performance.
Implement modern DevOps + DevSecOps practices (SAST, DAST, code scanning, dependency checks, etc.).
Automate build, packaging, testing, and release processes.
Advocate for and improve Developer Experience (DevEx) by reducing friction and enabling efficiency.
Collaborate across engineering teams to define standards for source control, branching, packaging, and release workflows.
Guide teams through modernization initiatives and influence technical direction.
🎯 Must-Have Qualifications
Strong experience with CI/CD pipelines, developer tooling, and automation.
Hands-on expertise with Git + Git-based platforms (GitLab, GitHub, Bitbucket).
Experience modernizing tooling or migrating from legacy systems (SVN → Git is a big plus).
Solid understanding of DevOps / DevSecOps workflows: automation, builds, packaging, security integration.
Proficient in scripting/programming for automation (Python, Bash, PowerShell, Groovy, etc.).
Excellent communication skills and ability to guide teams through change.
🏙️ Work Model
This is a full-time, hybrid role based in Boston, MA. Onsite participation is required.
📩 When Applying
Please include:
Updated resume
Expected Salary
Notice period (30 days or less)
A good time for a quick introductory call
If you're excited about modernizing engineering ecosystems, improving developer experience, and driving organization-wide transformation, we'd love to connect.
Senior Data Engineer
We are seeking a highly skilled Senior Data Engineer for a client that is based in Boston.
Ideal candidates will have
Power BI or Tableau Experience
SQL experience
AWS Cloud experience
Junior Data Engineer
Junior Data Engineer job in Boston, MA
Job Title: Junior Data Engineer
W2 candidates only
We are on the lookout for engineers who are open to upskilling into the exciting world of Data Engineering. This opportunity is for our client, a top-tier insurance company, and includes a 2-3 week online pre-employment training program (15 hours per week), conveniently scheduled after business hours. Participants who successfully complete the program will receive a $500 stipend. This is a fantastic chance to gain in-demand skills, hands-on experience, and a pathway into a dynamic tech role.
Key Responsibilities:
• Assist in the design and development of big data solutions using technologies such as Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch.
• Develop applications primarily in Scala and Python with guidance from senior team members.
• Write and optimize SQL queries, preferably with Redshift; experience with Snowflake is a plus.
• Work on ETL/ELT processes and frameworks to ensure smooth data integration.
• Participate in development tasks, including configuration, writing unit test cases, and testing support.
• Help identify and troubleshoot defects and assist in root cause analysis during testing.
• Support performance testing and production environment troubleshooting.
• Collaborate with the team on best practices, including Git version control and CI/CD deployment processes.
• Continuously learn and grow your skills in big data technologies and cloud platforms.
Prerequisites:
• Recent graduate with a degree in Computer Science, Information Technology, Engineering, or related fields.
• Basic experience or coursework in Scala, Python, or other programming languages.
• Familiarity with SQL and database concepts.
• Understanding of ETL/ELT concepts is preferred.
• Exposure to AWS cloud services (Glue, Lambda, SNS/SQS) is a plus but not mandatory.
• Strong problem-solving skills and eagerness to learn.
• Good communication and teamwork abilities.
Selection Process & Training:
• Online assessment and technical interview by Quintrix.
• Client Interview(s).
• 2-3 weeks of pre-employment online instructor-led training.
Stipend paid during Training:
• $500.
Benefits:
• 2 weeks of Paid Vacation.
• Health Insurance including Vision and Dental.
• Employee Assistance Program.
• Dependent Care FSA.
• Commuter Benefits.
• Voluntary Life Insurance.
• Relocation Reimbursement.
Who is Quintrix?
Quintrix is on a mission to help individuals develop their technology talent. We have helped hundreds of candidates kick-start their careers in tech. You will be "paid to learn," qualifying you for a high-paying tech job with one of our top employers. To learn more about our candidate experience go to *************************************
DBT SME - Data Modeling, Analytics Engineer
We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global software organization. This role will play a key part in organizing and maturing that landscape as part of a multi-year strategic roadmap. This position is ideal for a senior-level analytics engineer who can architect data solutions, build robust models, and stay hands-on with development.
This is a remote role with occasional onsite meetings. Candidates must currently be local to the Boston area and reside in MA/CT/RI/NH/ME.
Long-term contract. W2 or C2C.
Highlights:
Architect and build new data models using dbt and modern modeling techniques.
Partner closely with leadership and business teams to translate complex requirements into technical solutions.
Drive structure and clarity within a growing analytics ecosystem.
Qualifications
Bachelor's degree in Economics, Mathematics, Computer Science, or related field.
10+ years of experience in an Analytics Engineering role.
Expert in SQL and dbt with demonstrated modeling experience.
Data Modeling & Transformation: Design and implement robust, reusable data models within the warehouse. Develop and maintain SQL transformations in dbt.
Data Pipeline & Orchestration: Build and maintain reliable data pipelines in collaboration with data engineering. Utilize orchestration tools (Airflow) to manage and monitor workflows. Manage and support dbt environments and transformations.
Hands-on experience with BigQuery or other cloud data warehouses.
Proficiency in Python and Docker.
Experience with Airflow (Composer), Git, and CI/CD pipelines.
Strong attention to detail and communication skills; able to interact with both technical and business stakeholders.
Technical Requirements:
Primary Data Warehouse: BigQuery (mandatory)
Nice to Have: Snowflake, Redshift
Orchestration: Airflow (GCP Composer)
Languages: Expert-level SQL / dbt; strong Python required
Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker
Modeling Techniques: Vault 2.0, 3NF, Dimensional Modeling, etc.
Version Control: Git / CI-CD
Quality Tools: dbt-Elementary, dbt-Osmosis, or Great Expectations preferred
Data Science Engineer
Role: Data Science Engineer
Note: In-person interview required
This is a 12+ month, ongoing contract with our insurance client in Boston, MA (hybrid, 4 days onsite per week), with a mandatory final onsite interview.
We are seeking a talented Data Science Engineer to join our team and contribute to the development and implementation of advanced data solutions using technologies such as AWS Glue, Python, Spark, Snowflake Data Lake, S3, SageMaker, and machine learning (ML).
Overview: As a Data Science Engineer, you will play a crucial role in designing, building, and optimizing data pipelines, machine learning models, and analytics solutions. You will work closely with cross-functional teams to extract actionable insights from data and drive business outcomes.
Develop and maintain ETL pipelines using AWS Glue for data ingestion, transformation, and integration from various sources.
Utilize Python and Spark for data preprocessing, feature engineering, and model development.
Design and implement data lake architecture using Snowflake Data Lake, Snowflake data warehouse and S3 for scalable and efficient storage and processing of structured and unstructured data.
Leverage SageMaker for model training, evaluation, deployment, and monitoring in production environments.
Collaborate with data scientists, analysts, and business stakeholders to understand requirements, develop predictive models, and generate actionable insights.
Conduct exploratory data analysis (EDA) and data visualization to communicate findings and trends effectively.
Stay updated with advancements in machine learning algorithms, techniques, and best practices to enhance model performance and accuracy.
Ensure data quality, integrity, and security throughout the data lifecycle by implementing robust data governance and compliance measures.
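As a concrete instance of the "data preprocessing, feature engineering" work described above: z-score standardization rescales a numeric feature to zero mean and unit variance before model training. A plain-Python sketch (the role's actual stack would use Spark or SageMaker; the feature name and values are invented):

```python
from statistics import mean, stdev

def zscore(values):
    # Standardize: subtract the mean and divide by the sample standard
    # deviation, so the feature has mean 0 and sample stdev 1.
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

claim_amounts = [100.0, 200.0, 300.0]
print(zscore(claim_amounts))  # [-1.0, 0.0, 1.0]
```

Many models (linear, distance-based, neural) train better when features share a common scale, which is why this step appears so early in most pipelines.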
Requirements added by the job poster
• 4+ years of work experience with Amazon Web Services (AWS)
• 2+ years of work experience with Machine Learning
• Accept a background check
• 3+ years of work experience with Python (Programming Language)
• Working in a hybrid setting
Senior Data Engineer
Senior Data Engineer job in Boston, MA
Hi, this is Eric 👋 We're hiring a stellar Data Engineer to join our engineering org at Basil Systems.
At Basil Systems, we're revolutionizing healthcare data access and insights for the life sciences industry. We've built powerful platforms that help pharmaceutical and medical device companies navigate complex regulatory landscapes, accelerate product development, and ultimately bring life-saving innovations to market faster. Our SaaS platforms transform disconnected data sources into actionable intelligence, empowering organizations to make data-driven decisions that improve patient outcomes and save lives.
The Role
We are seeking a Senior Data Engineer to own and advance the data infrastructure that powers our healthcare insights platform. As our engineering team scales and we expand our data capabilities, we need someone who can build reliable, scalable pipelines while ensuring data quality across increasingly complex regulatory sources.
Key Responsibilities
Design, build, and maintain robust ETL processes for healthcare regulatory data
Integrate new data sources as we onboard customers and expand platform capabilities
Optimize pipeline performance and reliability
Ensure data accuracy and consistency across complex transformation workflows
Qualifications
5+ years of professional experience as a data engineer or in a similar role
Experience with Apache Spark and distributed computing
Familiarity with common ML algorithms and their applications
Knowledge of or willingness to learn and work with Generative AI technologies
Experience with developing for distributed cloud platforms
Experience with MongoDB / ElasticSearch and technologies like BigQuery
Strong commitment to engineering best practices
Nice-to-Haves
Solid understanding of modern security practices, especially in healthcare data contexts
Subject matter expertise in LifeSciences / Pharma / MedTech
This role might not be for you if...
You're a heavy process advocate and want enterprise-grade Scrum or rigid methodologies
You have a need for perfect clarity before taking action
You have a big company mindset
What We Offer
Competitive salary
Health and vision benefits
Attractive equity package
Flexible work environment (remote-friendly)
Opportunity to work on impactful projects that are helping bring life-saving medical products to market
Be part of a mission-driven team solving real healthcare challenges at a critical scaling point
Our Culture
At Basil Systems, we value flexibility and support a distributed team. We actively employ and support remote team members across different geographies, allowing you to work when, where, and how you work best. We are committed to building a diverse, inclusive, and safe work environment for everyone. Our team is passionate about using technology to make a meaningful difference in healthcare.
How to Apply
If you're excited about this opportunity and believe you'd be a great fit for our team, please send your resume and a brief introduction to *****************************.
Basil Systems is an equal opportunity employer. We welcome applicants of all backgrounds and experiences.
Senior Data Engineer
This role is with a Maris Financial Services Partner
Boston, MA - Hybrid Role - We are targeting local candidates that can be in the Boston office 3 days per week.
12 Month + contract (or contract to hire, if desired)
This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities.
Key Responsibilities:
Design, enhance, and manage DataOps tools and services to support cloud initiatives.
Develop and maintain scheduled workflows using Airflow.
Create containerized applications for deployment with ECS, Fargate, and EKS.
Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake.
Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs.
Qualifications:
Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools.
Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments.
Proficiency in scripting languages such as Bash.
Experience with Python, Go, or C#.
Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation.
Preferred experience with Apache Kafka and Flink.
Proven experience working with Kubernetes.
Strong knowledge of Linux and Docker environments.
Excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities.
Ability to manage multiple tasks and projects concurrently.
Expertise with SQL Server, Postgres, and Snowflake.
In-depth experience with ETL/ELT processes.
Software Development Engineer in Test - AI
Requirements engineer job in Boston, MA
JOB MISSION:
New Balance is seeking a forward-thinking Senior SDET with a developer's mindset and a passion for AI to lead the next evolution of our global eCommerce test automation platform. This is a unique opportunity for someone who thrives on staying ahead of AI trends and is eager to apply them to modern software quality engineering. You'll drive the transformation of our Selenium and BDD-based test stack into a cutting-edge, AI-augmented platform that supports everything from unit testing to full user journey validation. If you're a builder at heart, excited by the challenge of creating scalable, self-healing, and autonomous testing systems that empower both engineers and developers, this role is for you.
MAJOR ACCOUNTABILITIES:
Lead the architectural redesign of our test automation platform, transitioning from a legacy Selenium/C# and BDD stack to a modern, intelligent framework.
Design, build, and maintain AI-driven test automation platforms that enable reliable, scalable tests across the entire testing pyramid, from unit and integration to full end-to-end user journeys.
Implement AI-augmented testing strategies to support autonomous test creation, maintenance, and healing.
Integrate visual validation tools such as Applitools Eyes into the automation pipeline.
Collaborate cross-functionally with developers, QA engineers, and DevOps to ensure test coverage, reliability, and scalability across global eCommerce sites.
Evaluate and integrate open-source and commercial tools that enhance test intelligence, observability, and maintainability.
Advocate for testability by partnering with developers and architects to influence solution design.
Mentor and guide other SDETs and QA engineers in modern test automation practices and AI-driven testing approaches.
Continuously research and prototype emerging AI technologies in the testing space to keep the platform at the forefront of innovation.
REQUIREMENTS FOR SUCCESS:
5+ years of experience in test automation, with deep expertise in Selenium and C#.
Strong understanding of BDD frameworks (e.g., SpecFlow, Cucumber) and test design principles.
Hands-on experience with Selenium extensions such as Healenium, Selenide, or Selenium Grid, with a focus on improving test resilience, scalability, and maintainability.
Proven ability to implement self-healing test mechanisms and intelligent locator strategies to reduce flakiness and maintenance overhead.
Familiarity with AI-augmented testing strategies (e.g., intelligent test generation, adaptive test execution).
Experience integrating Selenium-based frameworks into modern CI/CD pipelines (e.g., Azure DevOps, Jenkins), with AI-driven diagnostics or analytics.
Proficiency with visual testing tools like Applitools Eyes.
Experience with modern automation frameworks such as TestRigor, Playwright, or Cypress.
Exposure to machine learning or NLP concepts applied to software testing.
Contributions to open-source testing tools or frameworks.
Strong problem-solving, communication, and mentoring skills.
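The "intelligent locator strategies" called out above can be sketched in a few lines. This is a pure-Python illustration of the fallback-and-promote idea, not any particular tool's API (Healenium and similar frameworks apply this logic against a live Selenium driver; the locators and page dictionary here are hypothetical):

```python
class LocatorFallback:
    """Try locator strategies in priority order; remember which one worked."""

    def __init__(self, locators):
        # Ordered list of (strategy, selector) pairs, most trusted first.
        self.locators = list(locators)

    def find(self, lookup):
        for i, locator in enumerate(self.locators):
            element = lookup(locator)
            if element is not None:
                # Promote the winning locator so the next call tries it
                # first, reducing retries as the page markup evolves.
                self.locators.insert(0, self.locators.pop(i))
                return element
        raise LookupError("no locator matched")

# Demo with a dict standing in for the DOM; a real driver's find_element
# call would replace `page.get`.
page = {("css", ".buy-btn"): "BUY"}
finder = LocatorFallback([("id", "buy"), ("css", ".buy-btn")])
assert finder.find(page.get) == "BUY"
assert finder.locators[0] == ("css", ".buy-btn")  # promoted after healing
```

The reordering step is what makes the mechanism "self-healing": flaky primary locators stop being the first thing every test retries.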
Junior DevOps Engineer
Requirements engineer job in Woburn, MA
The Alexander Technology Group is looking for a junior devops engineer for a client in the Woburn, MA area.
Hybrid on-site
No third-party applicants will be considered; please do not reach out.
$85-90K
Key Responsibilities:
Support Cloud Infrastructure: Assist in managing AWS infrastructure including VPCs, ECS/EKS clusters, RDS databases, and serverless components under the guidance of senior engineers.
Maintain CI/CD Pipelines: Help maintain and improve deployment pipelines using GitLab CI or GitHub Actions, ensuring smooth software delivery.
Monitor System Health: Set up and monitor alerting systems using CloudWatch, Grafana, or Prometheus, and respond to incidents with support from the team.
Security and Compliance: Support SOC 2 Type II compliance efforts by implementing security controls and following established protocols.
Infrastructure as Code: Gain experience with Terraform and other IaC tools to automate infrastructure provisioning and management.
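For the monitoring bullet above, the core of alerting is simple to reason about even before touching CloudWatch or Prometheus. This stdlib sketch mimics CloudWatch-style "M out of N datapoints" alarm evaluation (the threshold and values are hypothetical):

```python
def alarm_state(datapoints, threshold, datapoints_to_alarm, evaluation_periods):
    """Evaluate the most recent `evaluation_periods` datapoints and return
    'ALARM' when at least `datapoints_to_alarm` of them breach the threshold,
    mirroring CloudWatch's M-out-of-N alarm semantics."""
    window = datapoints[-evaluation_periods:]
    breaching = sum(1 for value in window if value > threshold)
    return "ALARM" if breaching >= datapoints_to_alarm else "OK"

# Hypothetical CPU series: the last three datapoints all exceed 90%.
print(alarm_state([50, 91, 95, 92], threshold=90,
                  datapoints_to_alarm=3, evaluation_periods=3))  # ALARM
```

Requiring M of N breaches (rather than a single spike) is what keeps alerting from paging on transient noise.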
If interested, please send resume to ************************
Senior Data Engineer
Requirements engineer job in Merrimack, NH
Immediate need for a talented Senior Data Engineer. This is a 12-month contract opportunity with long-term potential, located in Westlake, TX / Merrimack, NH (hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID:25-93826
Pay Range: $60 - $65/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Requirements and Technology Experience:
Key Skills: ETL, SQL, PL/SQL, Informatica, Snowflake, data modeling, DevOps.
7 years of experience developing quality data solutions
Software development experience in the Financial Industry
Expertise in Oracle PL/SQL development, SQL Scripting, and database performance tuning.
Intermediate experience in Java development is preferred
Solid understanding of ETL tools like Informatica and data warehousing platforms like Snowflake.
You enjoy learning new technologies, analyzing data, identifying gaps, issues, patterns, and building solutions
You can independently analyze technical challenges, assess their impact, and identify innovative solutions
Strong data modeling skills using Quantitative and Multidimensional Analysis
Demonstrate understanding of data design concepts - Transactional, Data Mart, Data Warehouse, etc.
Beginner proficiency in Python, REST APIs, and AWS is a plus
You are passionate about delivering high-quality software using DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Git, Docker) practices!
Experience developing software using Agile methodologies (Kanban and SCRUM)
Strong analytical and problem-solving skills
Excellent written and oral communication skills
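The data design concepts listed above (transactional vs. data mart vs. warehouse) hinge on dimensional modeling. As a minimal, self-contained sketch, here is a star schema with one fact table and one dimension, using Python's built-in sqlite3; the table and column names are illustrative, not from any real system:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- One dimension (descriptive attributes) and one fact (measures),
    -- joined on a surrogate key: the smallest possible star schema.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# The classic dimensional query: join facts to a dimension and aggregate.
rows = con.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

The same join-and-aggregate pattern scales from this toy schema to a Snowflake warehouse; only the dialect and the surrounding tooling change.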
Our client is a leader in the financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Senior AWS DevOps Engineer - AI Platform Enablement
Requirements engineer job in Boston, MA
We are seeking a seasoned AWS DevOps Engineer to join our growing AI Platform team, responsible for building and operating the next-generation Agentic AI infrastructure that powers intelligent automation across the enterprise.
This individual will play a key role in establishing the DevOps, security, and compliance foundations for our AWS-based AI platform, while working collaboratively with both the AI Studio engineering team and the Enterprise IT organization to ensure governance, reliability, and speed of innovation.
The ideal candidate combines deep AWS technical expertise with a pragmatic, people-oriented approach to bridge between innovation and enterprise IT standards.
Key Responsibilities:
Design, implement, and maintain AWS infrastructure supporting AI development, model orchestration, and agentic systems (e.g., Bedrock, Lambda, ECS/EKS, API Gateway).
Build and manage CI/CD pipelines for AI and data applications using AWS CDK / CloudFormation / CodePipeline / Terraform.
Implement security guardrails and compliance controls (IAM, KMS, network segmentation, audit logging) aligned with IT standards, while operating autonomously within the AI environment.
Partner with the IT security and cloud teams to ensure adherence to cybersecurity insurance and data governance requirements.
Manage monitoring, observability, and cost-optimization for AI workloads (CloudWatch, X-Ray, Config, Trusted Advisor).
Enable rapid development cycles for the AI team by streamlining environment provisioning, model deployment, and access management.
Serve as the bridge between AI Engineering and IT, building mutual trust through transparency, security-minded automation, and operational excellence.
Document and evangelize best practices in DevSecOps, Infrastructure as Code, and model lifecycle management.
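The guardrails described above are, in practice, IAM policy documents applied as permissions boundaries or SCPs. This sketch builds one such guardrail as JSON with the stdlib; the Sid and the region-pinning condition are illustrative examples of a guardrail, not this employer's actual policy:

```python
import json

def deny_outside_region_policy(allowed_region):
    """Build a guardrail-style IAM policy document that denies every action
    requested outside one approved region. A real deployment would attach
    this as an SCP or a permissions boundary, not an identity policy."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyOutsideApprovedRegion",  # illustrative statement id
            "Effect": "Deny",
            "Action": "*",
            "Resource": "*",
            "Condition": {
                "StringNotEquals": {"aws:RequestedRegion": allowed_region}
            },
        }],
    }, indent=2)

print(deny_outside_region_policy("us-east-1"))
```

An explicit Deny like this overrides any Allow elsewhere, which is what makes it a guardrail: the AI team can iterate freely inside the boundary without IT reviewing every change.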
Required Qualifications:
7+ years of DevOps / Cloud Engineering experience, with 4+ years on AWS in production environments.
Proven expertise with AWS services: VPC, IAM, Lambda, ECS/EKS, API Gateway, CloudFront, S3, CloudWatch, CloudFormation/CDK, KMS, Cognito, Secrets Manager.
Experience managing CI/CD pipelines (CodePipeline, GitHub Actions, or Jenkins).
Strong understanding of networking, identity federation (Azure AD / Okta), and data security.
Familiarity with AI/ML workflows (SageMaker, Bedrock, Databricks, or similar).
Hands-on experience implementing security guardrails and compliance controls in AWS.
Proficiency with Terraform or CDK for Infrastructure as Code.
Excellent communication and collaboration skills; able to explain technical decisions to IT, security, and data teams.
Preferred Qualifications:
Prior experience supporting AI or data platform teams.
Exposure to multi-agent systems, Bedrock AgentCore, LangChain, or similar frameworks.
Background in hybrid enterprise environments (AWS + Azure).
AWS certifications (e.g., Solutions Architect, DevOps Engineer, Security Specialty).
Success Indicators:
AI development team can deploy and operate independently within compliant AWS guardrails.
IT leadership gains confidence in the AI team's DevOps maturity and control mechanisms.
Infrastructure and pipelines are fully automated, observable, and secure, without bottlenecks.
Summary:
This is a hands-on DevOps leadership role at the intersection of AI innovation and enterprise trust. You will empower cutting-edge AI development while ensuring the environment remains compliant, secure, and sustainable, helping shape the company's next generation of intelligent construction technology.
ETL Data Engineer with Spring Batch Experience - SHADC5693360
Requirements engineer job in Smithfield, RI
Job Title: ETL Data Engineer with Spring Batch Experience - W2 only - We can provide sponsorship
Duration: Long Term
MUST HAVES:
Strong SQL for querying and data validation
Oracle
AWS
ETL experience with Java Spring Batch (for the ETL data transformation).
Note: the ETL work is done in Java (so Python is only a nice-to-have).
The Expertise and Skills You Bring
Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science) required, with 5+ years of working experience
4+ years of Java development utilizing Spring frameworks. Experience writing batch jobs with Spring Batch is a must
2+ years of experience developing applications that run in AWS, with focus on AWS Batch, S3, IAM
3+ years working with SQL (ANSI SQL, Oracle, Snowflake)
2+ years of Python development
Experience with Unix shell scripting (bash, ksh) and scheduling / orchestration tools (Control-M)
Strong data modeling skills with experience working with 3NF and Star Schema data models
Proven data analysis skills; not afraid to work in a complex data ecosystem
Hands-on experience with SQL query optimization and tuning to improve performance is desirable
Experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Terraform, CloudFormation)
Experience in Agile methodologies (Kanban and SCRUM)
Experience building and deploying containerized applications using Docker
Work experience in the financial services industry is a plus
Proven track record of handling ambiguity and working in a fast-paced environment, either independently or collaboratively
Good interpersonal skills to work with multiple teams within the business unit and across the organization
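The "SQL for querying and data validation" requirement above usually means reconciling a migrated table against its legacy source. This posting's ETL itself is Java/Spring Batch; the sketch below shows only the validation pattern, in Python with in-memory SQLite standing in for Oracle and the target (table and column names are hypothetical):

```python
import sqlite3

def profile(con, table, cols):
    """COUNT(*) plus a TOTAL() checksum for each numeric column."""
    agg = ", ".join(f"TOTAL({c})" for c in cols)
    return con.execute(f"SELECT COUNT(*), {agg} FROM {table}").fetchone()

def reconcile(src, tgt, table, cols):
    """True when source and migrated target agree on counts and checksums."""
    return profile(src, table, cols) == profile(tgt, table, cols)

# Demo: two in-memory databases standing in for the legacy and target stores.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for con in (src, tgt):
    con.execute("CREATE TABLE trades (qty INTEGER, price REAL)")
    con.executemany("INSERT INTO trades VALUES (?, ?)", [(10, 9.5), (3, 2.0)])

assert reconcile(src, tgt, "trades", ["qty", "price"])
tgt.execute("DELETE FROM trades WHERE qty = 3")  # simulate a dropped row
assert not reconcile(src, tgt, "trades", ["qty", "price"])
```

Count-plus-checksum profiles catch both missing rows and silently corrupted values without shipping full datasets between systems; the same queries port to Oracle or Snowflake with SUM in place of SQLite's TOTAL.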