Senior Data Engineer
Data engineer job in Merrimack, NH
Immediate need for a talented Senior Data Engineer. This is a 12-month contract opportunity with long-term potential, located in Westlake, TX/Merrimack, NH (hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID:25-93826
Pay Range: $60 - $65/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Requirements and Technology Experience:
Key Skills: ETL, SQL, PL/SQL, Informatica, Snowflake, data modeling, DevOps.
7 years of experience developing quality data solutions
Software development experience in the Financial Industry
Expertise in Oracle PL/SQL development, SQL Scripting, and database performance tuning.
Intermediate experience in Java development is preferred
Solid understanding of ETL tools like Informatica and data warehousing platforms like Snowflake.
You enjoy learning new technologies, analyzing data, identifying gaps, issues, and patterns, and building solutions
You can independently analyze technical challenges, assess their impact, and identify innovative solutions
Strong data modeling skills using Quantitative and Multidimensional Analysis
Demonstrated understanding of data design concepts - transactional, data mart, data warehouse, etc.
Beginner proficiency in Python, REST APIs, and AWS is a plus
You are passionate about delivering high-quality software using DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Git, Docker) practices!
Experience developing software using Agile methodologies (Kanban and SCRUM)
Strong analytical and problem-solving skills
Excellent written and oral communication skills
Our client is a leader in the financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Engineer (HR Data warehousing exp)
Data engineer job in Boston, MA
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company's future with cloud and data. For more information, visit ************
Data Engineer (HR Data warehousing exp)
Boston, MA (3-4 days onsite a week)
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience developing with Python
Experience architecting a data warehousing solution leveraging data from Workday or another HRIS to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience architecting or working with ELT technologies (such as dbt) and modern data architectures
Understanding of HR processes, compliance requirements, and industry best practices
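The SCD Type 2 experience called for above refers to keeping full history in a dimension table: instead of overwriting a changed attribute, the old row is closed out and a new versioned row is opened. A minimal, illustrative Python sketch of that logic (field names such as `emp_id`, `dept`, `valid_from`, `valid_to`, and `is_current` are hypothetical; in a warehouse this would typically run as a SQL `MERGE` or a dbt snapshot):

```python
from datetime import date

# Minimal SCD Type 2 sketch: each dimension row carries effective dates and
# a current flag; a changed attribute closes the old row and opens a new one.

def apply_scd2(dimension, incoming, today):
    """Merge incoming records into an SCD Type 2 dimension (list of dicts)."""
    current = {r["emp_id"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["emp_id"])
        if old is None:
            # Brand-new key: open its first version.
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif old["dept"] != rec["dept"]:
            # Tracked attribute changed: close the old version, open a new one.
            old["valid_to"] = today
            old["is_current"] = False
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # Unchanged records are left untouched, preserving history.
    return dimension

dim = [{"emp_id": 1, "dept": "HR", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"emp_id": 1, "dept": "Finance"}], date(2024, 6, 1))
```

After the merge, the dimension holds both the closed HR row and the new current Finance row, so point-in-time HR reporting can reconstruct the department an employee belonged to on any date.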
Ness is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
Senior Data Engineer
Data engineer job in Boston, MA
This role is with a Maris Financial Services Partner
Boston, MA - Hybrid role - We are targeting local candidates who can be in the Boston office 3 days per week.
12+ month contract (or contract-to-hire, if desired)
This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities.
Key Responsibilities:
Design, enhance, and manage DataOps tools and services to support cloud initiatives.
Develop and maintain scheduled workflows using Airflow.
Create containerized applications for deployment with ECS, Fargate, and EKS.
Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake.
Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs.
Qualifications:
Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools.
Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments.
Proficiency in scripting languages such as Bash.
Experience with Python, Go, or C#.
Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation.
Preferred experience with Apache Kafka and Flink.
Proven experience working with Kubernetes.
Strong knowledge of Linux and Docker environments.
Excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities.
Ability to manage multiple tasks and projects concurrently.
Expertise with SQL Server, Postgres, and Snowflake.
In-depth experience with ETL/ELT processes.
Data Engineer
Data engineer job in Boston, MA
URGENT REQUIREMENT
NO C2C
Role: Data Engineer with Private Wealth Management experience
Boston, MA or San Francisco (SFO), CA (Hybrid)
Duration: Long-term W2 contract
Required Qualifications
7+ years of experience as a Data Engineer
Strong experience in Private Wealth Management, Asset Management, or Investment Management
Hands-on experience with financial data domains such as portfolios, trades, transactions, performance, benchmarks, and client accounts
Strong SQL skills and experience with relational and analytical databases
Experience with ETL tools, data pipelines, and batch processing
Solid programming experience in Python, Java, or Scala
Experience with on-prem and/or cloud data platforms
Senior Data Engineer
Data engineer job in Boston, MA
Hi, this is Eric 👋 We're hiring a stellar Data Engineer to join our engineering org at Basil Systems.
At Basil Systems, we're revolutionizing healthcare data access and insights for the life sciences industry. We've built powerful platforms that help pharmaceutical and medical device companies navigate complex regulatory landscapes, accelerate product development, and ultimately bring life-saving innovations to market faster. Our SaaS platforms transform disconnected data sources into actionable intelligence, empowering organizations to make data-driven decisions that improve patient outcomes and save lives.
The Role
We are seeking a Senior Data Engineer to own and advance the data infrastructure that powers our healthcare insights platform. As our engineering team scales and we expand our data capabilities, we need someone who can build reliable, scalable pipelines while ensuring data quality across increasingly complex regulatory sources.
Key Responsibilities
Design, build, and maintain robust ETL processes for healthcare regulatory data
Integrate new data sources as we onboard customers and expand platform capabilities
Optimize pipeline performance and reliability
Ensure data accuracy and consistency across complex transformation workflows
Qualifications
5+ years of professional experience as a data engineer or in a similar role
Experience with Apache Spark and distributed computing
Familiarity with common ML algorithms and their applications
Knowledge of or willingness to learn and work with Generative AI technologies
Experience with developing for distributed cloud platforms
Experience with MongoDB/Elasticsearch and technologies like BigQuery
Strong commitment to engineering best practices
Nice-to-Haves
Solid understanding of modern security practices, especially in healthcare data contexts
Subject matter expertise in LifeSciences / Pharma / MedTech
This role might not be for you if...
You're a heavy process advocate and want enterprise-grade Scrum or rigid methodologies
You have a need for perfect clarity before taking action
You have a big company mindset
What We Offer
Competitive salary
Health and vision benefits
Attractive equity package
Flexible work environment (remote-friendly)
Opportunity to work on impactful projects that are helping bring life-saving medical products to market
Be part of a mission-driven team solving real healthcare challenges at a critical scaling point
Our Culture
At Basil Systems, we value flexibility and support a distributed team. We actively employ and support remote team members across different geographies, allowing you to work when, where, and how you work best. We are committed to building a diverse, inclusive, and safe work environment for everyone. Our team is passionate about using technology to make a meaningful difference in healthcare.
How to Apply
If you're excited about this opportunity and believe you'd be a great fit for our team, please send your resume and a brief introduction to *****************************.
Basil Systems is an equal opportunity employer. We welcome applicants of all backgrounds and experiences.
Junior Data Engineer
Data engineer job in Boston, MA
Job Title: Junior Data Engineer
W2 candidates only
We are on the lookout for engineers who are open to upskilling into the exciting world of Data Engineering. This opportunity is for our client, a top-tier insurance company, and includes a 2-3 week online pre-employment training program (15 hours per week), conveniently scheduled after business hours. Participants who successfully complete the program will receive a $500 stipend. This is a fantastic chance to gain in-demand skills, hands-on experience, and a pathway into a dynamic tech role.
Key Responsibilities:
• Assist in the design and development of big data solutions using technologies such as Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch.
• Develop applications primarily in Scala and Python with guidance from senior team members.
• Write and optimize SQL queries, preferably with Redshift; experience with Snowflake is a plus.
• Work on ETL/ELT processes and frameworks to ensure smooth data integration.
• Participate in development tasks, including configuration, writing unit test cases, and testing support.
• Help identify and troubleshoot defects and assist in root cause analysis during testing.
• Support performance testing and production environment troubleshooting.
• Collaborate with the team on best practices, including Git version control and CI/CD deployment processes.
• Continuously learn and grow your skills in big data technologies and cloud platforms.
Prerequisites:
• Recent graduate with a degree in Computer Science, Information Technology, Engineering, or related fields.
• Basic experience or coursework in Scala, Python, or other programming languages.
• Familiarity with SQL and database concepts.
• Understanding of ETL/ELT concepts is preferred.
• Exposure to AWS cloud services (Glue, Lambda, SNS/SQS) is a plus but not mandatory.
• Strong problem-solving skills and eagerness to learn.
• Good communication and teamwork abilities.
Selection Process & Training:
• Online assessment and technical interview by Quintrix.
• Client Interview(s).
• 2-3 weeks of pre-employment online instructor-led training.
Stipend paid during Training:
• $500.
Benefits:
• 2 weeks of Paid Vacation.
• Health Insurance including Vision and Dental.
• Employee Assistance Program.
• Dependent Care FSA.
• Commuter Benefits.
• Voluntary Life Insurance.
• Relocation Reimbursement.
Who is Quintrix?
Quintrix is on a mission to help individuals develop their technology talent. We have helped hundreds of candidates kick-start their careers in tech. You will be “paid-to-learn”, qualifying you for a high-paying tech job with one of our top employers. To learn more about our candidate experience, go to *************************************
Data Modelling Architect
Data engineer job in Boston, MA
The Wissen team continues to expand its footprint in the Canada & USA. More openings to come as we continue to grow the team!
Please read below for a brilliant career opportunity.
Role: Data Modelling Architect
Title: Vice President
Location: Boston, MA (Day 1 Onsite/Hybrid)
Mode of Work: 3 days per week onsite required
Required experience: 10+ Years
Job Description
We are looking for an experienced Data Modelling Architect to design and optimize enterprise data models supporting risk, regulatory, and financial domains. The role requires strong expertise in conceptual, logical, and physical data modelling, along with working knowledge of Financial Risk or Operational Risk frameworks used in global banking environments.
Required Skills
10-12 years of strong experience in data modelling and data architecture.
Expertise in ER modelling, dimensional modelling, and industry-standard modelling methodologies.
Hands-on experience with tools like Erwin, ER/Studio.
Strong SQL and experience with relational databases and distributed/cloud data platforms.
Working knowledge of Financial Risk, Operational Risk, or regulatory risk data (Credit Risk, Market Risk, Liquidity Risk, RCSA, Loss Events, KRI, etc.).
Experience supporting regulatory frameworks such as Basel II/III, CCAR, ICAAP, or similar.
Ability to work with cross-functional teams across global locations.
Excellent communication and documentation skills.
Benefits:
Healthcare insurance for you and your family (medical, dental, vision).
Short / Long term disability insurance.
Life Insurance.
Accidental death & disability Insurance.
401K.
3 weeks of Paid Time Off.
Support and fee coverage for immigration needs.
Remote office set up stipend.
Support for industry certifications.
Additional cash incentives.
Re-skilling opportunities to transition between technologies.
Schedule: Monday to Friday
Work Mode: Hybrid
Job Type: Full-time
We are: A high-end technical consulting firm built and run by highly qualified technologists. Our workforce consists of 5000+ highly skilled professionals, with leadership from Wharton, MIT, IITs, IIMs, and NITs and decades of experience at Goldman Sachs, Morgan Stanley, MSCI, Deutsche Bank, Credit Suisse, Verizon, British Telecom, ISRO etc. Without any external funding or investments, Wissen Technology has grown its revenues by 100% every other year since it started as a subsidiary of Wissen Group in 2015. We have a global presence with offices in the US, India, UK, Australia, Mexico, and Canada.
You are: A true tech or domain ninja. Or both. Comfortable working in a quickly growing profitable startup, have a “can do” attitude and are willing to take on any task thrown your way.
You will:
Develop and promote the company's culture of engineering excellence.
Define, develop and deliver solutions at a top tier investment bank or another esteemed client.
Perform other duties as needed.
Your Education and Experience:
We value candidates who can execute on our vision and help us build an industry-leading organization.
Graduate-level degree in computer science, engineering, or related technical field
Wissen embraces diversity and is an equal opportunity employer. We are committed to building a team that represents a variety of backgrounds, skills, and abilities. We believe that the more inclusive our team is, the better our work will be. All qualified applicants, including but not limited to LGBTQ+, Minorities, Females, the Disabled, and Veterans, are encouraged to apply.
About Wissen Technology:
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for diverse industries, including Banking, E-commerce, Telecom, Healthcare, Manufacturing, and Energy. We help clients build world-class products. We have offices in the US, India (Bangalore, Hyderabad, Chennai, Gurugram, Mumbai, Pune), UK, Australia, Mexico, Vietnam, and Canada.
We empower businesses with a dynamic portfolio of services and accelerators tailored to today's digital demands and based on a future-ready technology stack. Our services include Industry Leading Custom Software Development, AI-Driven Software Engineering, Generative AI & Machine Learning, Real-Time Data Analytics & Insights, Interactive Data Visualization & Decision Intelligence, Intelligent Process Automation, Multi-Cloud & Hybrid Cloud Strategies, Cross-Platform Mobile Experiences, CI/CD-Powered Agile DevOps, Automated Quality Engineering, and cutting-edge integrations.
Certified as a Great Place to Work for five consecutive years (2020-2025) and recognized as a Top 20 AI/ML vendor by CIO Insider, Wissen Group has delivered multimillion-dollar projects for over 20 Fortune 500 companies. Wissen Technology delivers exceptional value on mission-critical projects through thought leadership, ownership, and reliable, high-quality, on-time delivery.
Our industry-leading technical expertise stems from the talented professionals we attract. Committed to fostering their growth and providing top-tier career opportunities, Wissen ensures an outstanding experience and value for our clients and employees.
We Value:
Perfection: Pursuit of excellence through continuous improvement.
Curiosity: Fostering continuous learning and exploration.
Respect: Valuing diversity and mutual respect.
Integrity: Commitment to ethical conduct and transparency.
Transparency: Open communication and trust.
Website: **************
Glassdoor Reviews: *************************************************************
Wissen Thought leadership: https://**************/articles/
Latest in Wissen in CIO Insider:
**********************************************************************************************************************
Employee Speak:
***************************************************************
LinkedIn: **************************************************
About Wissen Interview Process:
https://**************/blog/we-work-on-highly-complex-technology-projects-here-is-how-it-changes-whom-we-hire/
Wissen: A Great Place to Work
https://**************/blog/wissen-is-a-great-place-to-work-says-the-great-place-to-work-r-institute-india
https://**************/blog/here-is-what-ownership-and-commitment-mean-to-wissenites/
Wissen | Driving Digital Transformation
A technology consultancy that drives digital innovation by connecting strategy and execution, helping global clients to strengthen their core technology.
Job Type: Full-time
Work Location: In person
HR Data Analytics Architect
Data engineer job in Boston, MA
Key Responsibilities:
Architect & Model: Design and implement scalable, efficient Snowflake data models to support HR analytics, workforce planning, and KPI reporting.
Data Integration: Develop and optimize integrations between Workday, Snowflake, and downstream analytics platforms; ensure seamless, accurate data flow across systems.
Governance & Quality: Define and enforce data governance, quality, and metadata management standards to ensure data consistency and compliance.
Documentation & Metadata: Maintain comprehensive technical documentation and data dictionaries for warehouse structures, transformations, and integrations.
Performance Optimization: Monitor and tune ETL/ELT pipelines, ensuring high-performance data transformation and loading processes.
Collaboration: Partner with HR, Data Engineering, and Analytics teams to translate business logic into reusable and governed data assets.
Testing & Validation: Participate in unit, integration, and regression testing to validate data pipelines and ensure data accuracy.
Lifecycle Support: Support data analysis and troubleshooting across the full implementation and operational lifecycle of HR data solutions.
Required Experience & Skills:
Proven experience architecting and implementing solutions on Snowflake or similar cloud data warehouse platforms.
Advanced SQL skills and hands-on experience with data transformation and pipeline optimization tools.
Strong understanding of ETL/ELT frameworks, data validation, and reconciliation techniques.
Demonstrated experience working with HR data structures, Workday, or other HRIS systems.
Strong analytical mindset and problem-solving ability, with attention to data integrity and business context.
Experience with Python for data engineering, automation, or orchestration tasks.
Track record of designing data warehouses or analytical platforms leveraging HR data to drive insights and advanced reporting.
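The validation and reconciliation work listed above often reduces to comparing row counts, key sets, and per-column aggregates between a source extract and the loaded target. A minimal, illustrative Python sketch of that idea (table contents, column names, and the `reconcile` helper are hypothetical; in practice the same comparisons would typically run as SQL against both systems):

```python
# Minimal reconciliation sketch: compare row counts, key coverage, and a
# per-column sum between a source extract and the loaded target. Plain
# Python structures stand in for query results here.

def reconcile(source_rows, target_rows, key, numeric_cols):
    """Return a dict of pass/fail checks comparing source vs. target."""
    report = {"row_count_match": len(source_rows) == len(target_rows)}
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    report["missing_in_target"] = sorted(src_keys - tgt_keys)
    report["unexpected_in_target"] = sorted(tgt_keys - src_keys)
    for col in numeric_cols:
        # A simple sum check catches truncated or double-loaded values.
        src_sum = sum(r[col] for r in source_rows)
        tgt_sum = sum(r[col] for r in target_rows)
        report[f"{col}_sum_match"] = src_sum == tgt_sum
    return report

source = [{"id": 1, "salary": 50000}, {"id": 2, "salary": 60000}]
target = [{"id": 1, "salary": 50000}]  # one row failed to load
result = reconcile(source, target, key="id", numeric_cols=["salary"])
```

A failed check (here, a missing key and a mismatched salary sum) would feed directly into the troubleshooting and lifecycle-support responsibilities described earlier.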
Preferred Experience & Skills:
Experience building and supporting data warehouses specifically for HR and People Analytics domains.
Hands-on experience with Slowly Changing Dimensions (SCD Type 2) and historical data management.
Proficiency with data visualization tools such as Tableau or Power BI.
Experience with ELT frameworks (e.g., dbt) and modern data architecture patterns (e.g., Data Vault, Medallion Architecture).
Familiarity with HR processes, compliance standards, and industry best practices related to HR data management and reporting.
Experience working in an enterprise environment with cross-functional collaboration between HR, Finance, and Technology teams.
Software Engineer
Data engineer job in Boston, MA
Work schedule: Hybrid
Key Responsibilities:
Performance Tuning: Monitor and optimize performance, including query performance, resource utilization, and storage management.
User and Access Management: Manage user access, roles, and permissions to ensure data security and compliance with organizational policies.
Data Integration: Support and manage data integration processes, including data loading, transformation, and extraction.
Troubleshooting and Support: Provide technical support and troubleshooting for Snowflake-related issues, including resolving performance bottlenecks and query optimization.
Documentation and Reporting: Maintain detailed documentation of system configurations, procedures, and changes. Generate and deliver regular reports on system performance and usage.
Collaboration: Work closely with data engineers, analysts, and other IT professionals to ensure seamless integration and optimal performance of the Snowflake environment.
Best Practices: Stay up to date with Snowflake best practices and industry trends. Recommend and implement improvements and upgrades to enhance system functionality and performance.
Qualifications and Experience:
5+ years of experience in data architecture, data engineering, or database development.
2+ years of hands-on experience with Snowflake, including data modeling, performance tuning, and security.
At a minimum, a Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience with source control tools (GitHub preferred), ETL/ELT tools and cloud platforms (AWS preferred).
Experience or exposure to AI tools.
Deep understanding of data warehousing concepts, dimensional modeling, and analytics.
Excellent problem-solving and communication skills.
Experience integrating Snowflake with BI and reporting tools is a plus
Required Skills:
Strong proficiency in Snowflake architecture, features, and capabilities.
Knowledge of SQL and Snowflake-specific query optimization.
Experience with ETL tools and data integration processes.
Strong proficiency in SQL and Python.
Strong database design and data modeling experience. Experience with data modeling tools.
Ability to identify and drive continuous improvements.
Strong problem solving and analytical skills.
Demonstrated process-oriented and strategic thinking skills.
Strong motivation and a desire to continuously learn and grow.
Knowledge of Snowflake security features including access control, authentication, authorization, encryption, masking, secure view, etc.
Experience working in AWS cloud environments.
Experience working with Power BI and other BI, data visualization, and reporting tools.
Experience gathering business requirements and aligning them to solution delivery.
Experience with data integration solutions and tools, data pipelines, and modern ways of automating data using cloud based and on-premises technologies.
Experience integrating Snowflake with an identity and access management program such as Azure IDP is a plus.
Experience with other relational database management systems, cloud data warehouses and big data platforms is a plus.
Analytical Skills: Excellent problem-solving and analytical skills with strong attention to detail.
Communication: Effective communication skills, both written and verbal, with the ability to convey complex technical information to non-technical stakeholders.
Teamwork: Ability to work independently and collaboratively in a fast-paced environment.
Preferred Skills:
Snowflake certification (e.g., SnowPro Core or Advanced Certification).
SDET - Mainframe Testing
Data engineer job in Merrimack, NH
Role - Mainframe Tester
Full-time role.
Key Skills:
They are looking for someone with a typical QA engineering background, primarily manual and functional testing, with some automation experience (ability to automate CICS screens).
Experience with VSAM files + CICS kick screens + DB2 querying is required.
Self-driven person, capable of independently understanding acceptance criteria.
The role will provide access to the team's GitHub Copilot to assist with writing automation code, with plans to expand to Microsoft Copilot for test coverage.
Automation experience with Java + Cucumber BDD is required.
A good understanding of COBOL, to help run JCL files, is required.
The team is starting to get involved in IOM and doing some automation there.
Nice to Have:
Experience in Financial Services domain, preferably mutual funds, stocks, managed accounts.
Experience working with virtual databases like Delphix, or data-mocking tools like WireMock.
Worked on Cloud based services - Azure, AWS.
Aptitude and Communication skills.
The Expertise and Skills You Bring
Bachelor's degree in Computer Science or equivalent experience is required.
Experience in test automation development using Java or a similar language.
Experience in testing distributed applications at multiple layers of the technology stack.
Proven expertise in standard test automation frameworks - Cucumber, Java, REST services.
Expertise in data validation and processes - SQL, DB2, Oracle.
Experience working in an Agile/Scrum environment, using tools like JIRA and XRAY; solid familiarity with Agile/Scrum processes.
DevOps, CI/CD processes - Maven, Git/GitHub, Jenkins, Sonar.
Knowledge of the SDLC, including coding standards, code reviews, source code management, and build processes.
COBOL language understanding.
VSAM files
Batch process
IOM (Input/Output Modules)
Practical experience working with both functional and regression testing including test automation within an agile environment
Ability to quickly learn, adapt, and thrive to meet the needs of a fast-paced, changing environment
Interpret business requirements and crystallize stories and acceptance criteria for implementation
Identify risks and develop contingency plans in anticipation of test automation issues
Proven ability to work independently as part of an Agile Sprint or Kanban team/squad
Proven understanding of the software development process including planning, analysis, design, coding, system and user testing, and problem resolution.
Use and improve upon an existing automation framework
DevOps Engineer
Data engineer job in Boston, MA
📣 Platform Engineer - Travel SaaS
A fast-scaling SaaS company in the travel tech space is hiring a Platform Engineer to help build and scale their global infrastructure.
This is a high-impact role in a product-led, engineering-driven environment. The company operates a modern, multi-service architecture on AWS and needs someone who can take ownership of platform reliability, CI/CD tooling, and infrastructure as code.
💻 The role:
Design, build and maintain secure, scalable AWS infrastructure (EC2, S3, RDS, IAM, etc.)
Champion Infrastructure as Code using Terraform or Pulumi
Manage containerised deployments with Docker and ECS or Kubernetes
Improve and maintain CI/CD pipelines (GitHub Actions, CircleCI, etc.)
Collaborate closely with engineering, SRE and security teams
Take part in on-call and incident response as part of a “you build it, you run it” culture
🧠 What they're looking for:
3+ years' hands-on experience with AWS
Strong background in infrastructure as code
Solid understanding of containerisation and orchestration
Comfortable with scripting (Python, Go or Bash)
Experience with observability tools (Datadog, CloudWatch, etc.)
Excellent debugging and troubleshooting skills
🎁 Nice-to-haves:
Exposure to Windows/.NET, serverless architectures or compliance frameworks (e.g. SOC2)
🌍 Why join:
Compensation: $130-150K base + equity
Culture: Low-ego, high-ownership team with a strong engineering voice
Hybrid setup: ~3 days per week in office in Boston
Mission: Helping businesses travel smarter - at global scale
Junior DevOps Engineer
Data engineer job in Woburn, MA
The Alexander Technology Group is looking for a junior devops engineer for a client in the Woburn, MA area.
Hybrid on-site
No third-party applicants will be considered; do not reach out
$85-90K
Requirements:
Key Responsibilities:
Support Cloud Infrastructure: Assist in managing AWS infrastructure including VPCs, ECS/EKS clusters, RDS databases, and serverless components under the guidance of senior engineers.
Maintain CI/CD Pipelines: Help maintain and improve deployment pipelines using GitLab CI or GitHub Actions, ensuring smooth software delivery.
Monitor System Health: Set up and monitor alerting systems using CloudWatch, Grafana, or Prometheus, and respond to incidents with support from the team.
Security and Compliance: Support SOC 2 Type II compliance efforts by implementing security controls and following established protocols.
Infrastructure as Code: Gain experience with Terraform and other IaC tools to automate infrastructure provisioning and management.
If interested, please send resume to ************************
AWS Networking / AWS DevOps Engineer
Data engineer job in Quincy, MA
Job Description - AWS Networking / AWS DevOps Engineer
Type: Hybrid (3 to 4 days based on client request and project demand)
Role: AWS Cloud Networking Engineer/DevOps
We are seeking an experienced Networking-focused AWS DevOps Engineer to support and optimize our multi-region cloud infrastructure. The ideal candidate will have strong expertise across AWS networking, multi-region architectures, CI/CD, container orchestration, infrastructure automation, and data platform components such as Redshift. This role is part-time but requires a hands-on engineer who can troubleshoot, optimize, and enhance our production and non-production cloud environments.
Key Responsibilities
AWS Multi-Region Architecture & Networking
• Design, implement, and optimize multi-region VPC architectures, peering, Transit Gateway, and routing policies.
• Configure and manage security groups, NACLs, route tables, NAT gateways, IGWs, and cross-region networking.
• Ensure high availability (HA) and disaster recovery (DR) readiness across multiple AWS regions.
• Support network connectivity for hybrid environments (VPN, Direct Connect).
AWS DevOps & Automation
• Develop and maintain CI/CD pipelines using CodePipeline, CodeBuild, GitHub Actions, GitLab, or Jenkins.
• Automate infrastructure provisioning using Terraform, CloudFormation, or CDK.
• Implement and optimize monitoring, logging, and alerting via CloudWatch, OpenSearch, Prometheus/Grafana, or equivalent.
• Drive continuous improvements in deployment reliability and DevOps best practices.
Compute & Container Services
• Manage and optimize EC2 instances including AMIs, autoscaling, patching, and configurations.
• Deploy, scale, and troubleshoot workloads on ECS (Fargate or EC2).
• Implement workload security, resource optimization, and cost controls across compute services.
Redshift & Data Infrastructure Support
• Support Redshift cluster configuration, security, WLM settings, performance optimization, and connectivity.
• Ensure secure and optimized data flows between ETL layers, Redshift, EC2/ECS services, and S3.
• Collaborate with data teams to tune Redshift workloads and ensure optimal network performance.
Security & Compliance
• Implement IAM policies, role-based access, and least-privilege security controls.
• Implement multi-region failover, backup/restore strategies, and environment hardening.
• Ensure compliance with security best practices, patching, encryption, and CloudTrail logging.
Operations, Troubleshooting & Support
• Troubleshoot multi-region connectivity, latency, DNS, and infrastructure issues.
• Optimize cloud spend across compute, networking, and Redshift workloads.
• Provide on-call / ad-hoc support during deployments or critical incidents (as needed).
Required Skills & Experience
Technical Skills
• 10+ years of experience as a DevOps Engineer, Cloud Engineer, or AWS Infrastructure Engineer.
• Strong AWS networking expertise: VPC, TGW, Route53, VPN, Direct Connect, SGs, NACLs.
• Experience with multi-region, HA, DR architectures.
• Proficient in EC2, ECS (Fargate/EC2 Launch Types), Redshift.
• Strong Terraform / CloudFormation scripting experience.
• Strong experience with Python or Bash for automation.
• Hands-on experience setting up CI/CD pipelines.
• Experience with monitoring/observability tools: CloudWatch, OpenSearch, Grafana/Prometheus, Datadog, etc.
• Familiarity with cloud cost optimization and tagging strategies.
DevOps Engineer
Data engineer job in Boston, MA
We're looking for a Senior DevOps Tools Engineer to help modernize and elevate our development ecosystem. If you're passionate about improving how software teams build, test, secure, and deliver high-quality code, this role is built for you.
This is not a traditional infrastructure-heavy DevOps role. It's a developer-enablement, tooling modernization, and process transformation position with real influence.
🔧 Role Overview
You will lead initiatives that reshape how engineering teams work: modernizing tooling, redesigning source control practices, improving CI/CD workflows, and championing DevEx across the organization. This role combines hands-on engineering with strategic process design.
⭐ Key Responsibilities
Drive modernization of development tools and processes, including SVN → Git migration and workflow redesign.
Own and enhance CI/CD pipelines to improve reliability, automation, and performance.
Implement modern DevOps + DevSecOps practices (SAST, DAST, code scanning, dependency checks, etc.).
Automate build, packaging, testing, and release processes.
Advocate for and improve Developer Experience (DevEx) by reducing friction and enabling efficiency.
Collaborate across engineering teams to define standards for source control, branching, packaging, and release workflows.
Guide teams through modernization initiatives and influence technical direction.
🎯 Must-Have Qualifications
Strong experience with CI/CD pipelines, developer tooling, and automation.
Hands-on expertise with Git + Git-based platforms (GitLab, GitHub, Bitbucket).
Experience modernizing tooling or migrating from legacy systems (SVN → Git is a big plus).
Solid understanding of DevOps / DevSecOps workflows: automation, builds, packaging, security integration.
Proficient in scripting/programming for automation (Python, Bash, PowerShell, Groovy, etc.).
Excellent communication skills and ability to guide teams through change.
🏙️ Work Model
This is a full-time, hybrid role based in Boston, MA. Onsite participation is required.
📩 When Applying
Please include:
Updated resume
Expected Salary
Notice period (30 days or less)
A good time for a quick introductory call
If you're excited about modernizing engineering ecosystems, improving developer experience, and driving organization-wide transformation, we'd love to connect.
Senior Full Stack Developer
Data engineer job in Boston, MA
Boston Energy Trading & Marketing (BETM) has accelerated its shift to digital growth and cloud enablement. Our talented, energetic team is creating next-gen platforms to provide industry leading solutions supporting the green energy transition. We're seeking candidates with the passion to enhance value through technology, and with the experience to effectively manage & mature the solutions we create. If you have those traits, and you are ready to join our Boston-based team in a hybrid work model, we would love to hear from you!
As part of this team, you will engage closely with business & IT colleagues to improve, streamline and automate business processes. You will design, build and manage applications/workflows in a cloud environment. You will leverage tools which automate processes, enabling our DevOps capabilities to manage all aspects of application development. You are organized, driven to solve problems and have a passion for life-long learning. Your strong engineering skills, along with your customer-focused mindset, make you a valuable addition to our team.
Role Overview
As a Senior Full Stack Developer, you will design, build, and maintain cloud-based applications and workflows that streamline and automate business processes. You'll work closely with business and IT teams to deliver scalable, high-quality solutions using modern technologies across the stack.
Key Responsibilities
Develop and maintain full-stack applications using React, TypeScript, Python, and FastAPI.
Build and optimize APIs and microservices for performance and scalability.
Design and implement data workflows leveraging Snowflake and Postgres.
Implement distributed caching using Redis for high-performance applications.
Collaborate with cross-functional teams to gather requirements and deliver solutions in an agile environment.
Write clean, efficient, and maintainable code following best practices.
Implement CI/CD pipelines and DevOps practices for cloud deployments.
Identify opportunities for reusable components and automation to accelerate delivery.
Qualifications
Bachelor's degree in Computer Science, Engineering, or related field.
7+ years of professional software development experience.
Strong proficiency in React, TypeScript, Python, and FastAPI.
Solid understanding of OOP, algorithms, data structures, and design patterns.
Familiarity with Redis for distributed caching.
Familiarity with Azure services (Data Factory, Functions, Storage, SQL Database, Managed Instance) and cloud architecture.
Hands-on experience with DevOps tools, CI/CD pipelines, and workflow automation.
Excellent communication and collaboration skills; passion for continuous learning and improvement.
Range: $145,000 - $175,000
Senior Software Engineer (Python)
Data engineer job in Boston, MA
Senior Developer - FP&A Forecasting & Reporting (Contract)
Industry: Financial Services (Insurance / Investments / FP&A)
We are seeking an experienced Senior Developer to support a large-scale FP&A Forecasting and Reporting transformation within a financial services environment. This role partners closely with FP&A, Investment, Actuarial, and IT Data teams to design, build, and optimise high-performance forecasting and calculation platforms. You will play a key role in developing scalable, production-grade systems capable of handling complex financial calculations and large data volumes in a fast-paced, enterprise setting. This is a hands-on, senior-level contract role, requiring strong Python engineering expertise and deep exposure to financial data and models.
Key Responsibilities
Platform Development & Optimisation
Lead the design, development, and optimisation of core systems using Python and Python-based compute environments (Jupyter, VS Code, Databricks)
Build and enhance FP&A forecasting models, analytics, and data pipelines
Ensure performance, scalability, and reliability across computation-heavy workloads
Drive technical innovation, including the use of parallel compute, data vectors, and AI-assisted techniques where appropriate
Collaboration & Delivery
Partner with FP&A teams and external consultants to deliver robust, performant forecasting solutions
Provide technical guidance on backlog items and architectural decisions
Act as a senior escalation point (Tier-3) for critical incidents and complex production issues
Governance & Best Practices
Establish and promote engineering standards, testing frameworks, and performance monitoring
Contribute to documentation, runbooks, and knowledge-sharing initiatives
Mentor junior developers and data engineers, raising overall engineering maturity
Ensure ongoing maintenance, upgrades, and enhancements are delivered with minimal disruption
Required Experience & Skills (Must-Have)
Expert-level Python development experience (non-negotiable)
Strong experience building high-performance, data-intensive systems
Deep understanding of financial models, FP&A processes, or large-scale computational engines
Hands-on experience with data integration and management in cloud or hybrid environments
Experience optimising computation through parallelisation and performance tuning techniques
Strong communication skills with the ability to engage technical and business stakeholders
Proven technical leadership and mentoring capability
Experience & Background
10+ years of professional software engineering experience
Demonstrated success delivering or maintaining financial modelling or forecasting platforms
Prior experience within financial services, ideally:
Insurance (Life & Annuities)
Investments
Asset or Wealth Management
Experience working with large datasets, analytics platforms, or data-driven systems
Solid understanding of DevOps practices and ability to explain them to non-technical stakeholders
Nice to Have
Cloud platform experience (AWS, Azure, Redshift, Snowflake)
Exposure to AI / ML integration within financial or computational systems
Experience with data visualisation or reporting tools
Contract & Eligibility Information
This is a contract role
Valid US work authorisation is required at the start of the engagement
No additional contractor benefits are provided
Company-issued equipment will be supplied
Engagement is aligned to a long-term FP&A transformation programme
Senior Developer - FP&A Forecasting & Reporting
Data engineer job in Boston, MA
The Senior Developer role partners with the FP&A team and IT data team to design, build, and optimize the FP&A Forecasting and Reporting processes. This position will be accountable for developing and maintaining high-performance components, ensuring the processes scale effectively with complex calculations and large data volumes.
You will be a key member of the FP&A Transformation, helping to build a calculation and forecast modeling platform to support a growing, fast-paced firm.
You will act as a technical leader, bringing deep engineering expertise while working closely with FP&A, Investment, Actuarial and data specialists to ensure the processes deliver robust, efficient, and production-grade solutions.
KEY RESPONSIBILITIES
Platform Development & Optimization
Lead the design, development, and optimization of core code base using Python and Python-based compute environments (e.g., Jupyter, VS Code, Databricks).
Collaborate with FP&A and data teams to integrate FP&A forecasting models, analytics, and data pipelines.
Ensure computational performance, scalability, and reliability across FP&A workloads.
Drive technical innovation, including use of compute grids, data vectors, and AI-based methods where relevant.
Collaboration & Support
Partner with FP&A and external consultants to deliver performant model execution and tooling.
Provide technical guidance on backlog items and requirements for IT Data and supporting teams.
Serve as a senior escalation point for critical incidents, providing Tier-3 expertise when needed.
Governance & Best Practices
Establish and champion engineering standards, testing practices, and performance monitoring.
Contribute to platform documentation, runbooks, and knowledge-sharing initiatives.
Mentor junior developers and data engineers, raising engineering maturity across the team.
Ensure ongoing maintenance, upgrades, and optimizations are delivered with minimal disruption.
EDUCATION
An undergraduate or advanced degree in Computer Science, Engineering, or related field (or equivalent combination of education and experience).
SKILLS
Expert Python skills with proven experience in building high-performance, data-intensive systems.
Strong understanding of Life & Annuity data, financial models, or large-scale computational engines.
Deep knowledge of data management and integration in hybrid cloud ecosystems (AWS / Snowflake).
Hands-on experience with compute parallelization and performance optimization techniques.
Excellent interpersonal and communication skills; ability to interface effectively with FP&A and program leadership stakeholders.
Strong leadership qualities: mentoring, influencing, and guiding technical direction across teams.
EXPERIENCE
10+ years in professional software engineering, with significant hands-on Python development.
Demonstrated success building or maintaining financial modeling or large-scale computational systems.
Experience working with investment banks and/or within life and annuity insurance domains is highly desirable.
Proven track record in data-driven platforms, large-scale computation, or advanced analytics.
Experience with DevOps practices and ability to explain them to business users.
Desirable:
Cloud ecosystem expertise (AWS, Azure, Redshift)
Familiarity with AI/ML integration in computational systems
Experience in data visualization and reporting
Senior Software Engineer
Data engineer job in Boston, MA
Senior Software Engineers - Relocate to NYC (Boston's Best Only)
Industry: High-performance trading & research engineering
Comp: $600k - $1.5m
We're hiring a small number of exceptional Software Engineers to join a high-impact engineering group in NYC. If you're in Boston and operating well above average, this is where your ability actually gets used.
What You'll Work On
Serious engineering - not feature factory work:
Distributed computing & large-scale data systems
Research/modelling platforms
High-performance execution & routing
Core infra powering real-time decisioning
Small teams. High autonomy. Immediate impact.
Who Should Apply
Engineers with clear, proven excellence, typically shown through:
Elite Foundations
Top CS/Math/EE degree (MIT/Harvard/CMU/Stanford/etc. or equivalent)
Strong GPA (usually 3.7+ or equivalent)
Real Engineering Depth
Distributed systems, infra, platform, HPC, research tooling or similar
Ownership of complex, performance-critical systems
Fast progression + high trust roles
High Problem-Solving Ability
Algorithms, systems thinking, clean architecture, first-principle reasoning.
Commercial Awareness
You build with purpose - speed, correctness, and impact matter to you.
Not a Fit
Web dev only
Maintenance roles
Anyone not relocating to NYC
What You Get
Market-leading comp
NYC relocation
High-calibre peers
Work that hits production fast
A genuinely career-defining engineering environment
If You're One of Boston's Top Engineers - Apply.
Data Architect (HR Data & Analytics)
Data engineer job in Boston, MA
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company's future with cloud and data. For more information, visit ************
Data Architect - HR Data & Analytics
Boston, MA (3-4 days onsite a week)
Role Overview:
The Data Architect will play a key role in designing and implementing the enterprise HR data architecture to support KKR's global HR analytics, KPIs, and reporting initiatives. This role involves translating complex business requirements into scalable, governed data solutions built on Snowflake and integrated with Workday and other HR systems. The ideal candidate will combine deep technical expertise with a strong understanding of HR data domains, ensuring data integrity, accessibility, and analytical value across the organization.
Key Responsibilities:
Architect & Model: Design and implement scalable, efficient Snowflake data models to support HR analytics, workforce planning, and KPI reporting.
Data Integration: Develop and optimize integrations between Workday, Snowflake, and downstream analytics platforms; ensure seamless, accurate data flow across systems.
Governance & Quality: Define and enforce data governance, quality, and metadata management standards to ensure data consistency and compliance.
Documentation & Metadata: Maintain comprehensive technical documentation and data dictionaries for warehouse structures, transformations, and integrations.
Performance Optimization: Monitor and tune ETL/ELT pipelines, ensuring high-performance data transformation and loading processes.
Collaboration: Partner with HR, Data Engineering, and Analytics teams to translate business logic into reusable and governed data assets.
Testing & Validation: Participate in unit, integration, and regression testing to validate data pipelines and ensure data accuracy.
Lifecycle Support: Support data analysis and troubleshooting across the full implementation and operational lifecycle of HR data solutions.
Required Experience & Skills:
Proven experience architecting and implementing solutions on Snowflake or similar cloud data warehouse platforms.
Advanced SQL skills and hands-on experience with data transformation and pipeline optimization tools.
Strong understanding of ETL/ELT frameworks, data validation, and reconciliation techniques.
Demonstrated experience working with HR data structures, Workday, or other HRIS systems.
Strong analytical mindset and problem-solving ability, with attention to data integrity and business context.
Experience with Python for data engineering, automation, or orchestration tasks.
Track record of designing data warehouses or analytical platforms leveraging HR data to drive insights and advanced reporting.
Preferred Experience & Skills:
Experience building and supporting data warehouses specifically for HR and People Analytics domains.
Hands-on experience with Slowly Changing Dimensions (SCD Type 2) and historical data management.
Proficiency with data visualization tools such as Tableau or Power BI.
Experience with ELT frameworks (e.g., dbt) and modern data architecture patterns (e.g., Data Vault, Medallion Architecture).
Familiarity with HR processes, compliance standards, and industry best practices related to HR data management and reporting.
Experience working in an enterprise environment with cross-functional collaboration between HR, Finance, and Technology teams.
Ness is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
Principal Software Engineer (Desktop/UI)
Data engineer job in Boston, MA
Join our Mapping & Ablation System Software Team and help shape the next generation of cardiac electrophysiology technology. This system enables physicians to visualize the heart's electrical activity in real time and deliver targeted ablation therapy for patients with arrhythmias. You'll advance the software that makes this life-changing technology usable, responsive, and clinically impactful.
As a Principal Desktop UI Engineer, you'll design and build intuitive, high-performance user interfaces that power advanced real-time visualization tools used by clinicians treating cardiac arrhythmias.
What we offer you in USA
We honor the contract terms you prefer.
20 paid vacation days per year
40 working hours per week
Retirement Plan 401(K)
Medical, Dental, Vision Insurance Plan for you and your Family
100% On-Site position in Newton
What You'll Do
Lead the design and development of desktop UI features using Qt/QML, Python, and C++ in a Linux environment.
Build fast, data-rich, real-time interfaces for complex clinical workflows.
Partner with UI/UX designers, hardware engineers, and clinical teams to translate requirements into polished user experiences.
Own major software components from architecture through implementation, testing, and deployment.
Mentor junior engineers and champion modern engineering practices.
Support pre-clinical evaluations and system demos for internal and clinical stakeholders.
Qualifications
Required
Bachelor's degree + 7 years' experience, or Master's degree + 5 years, or PhD + 3 years.
Strong background building high-performance UI applications.
Preferred
Expertise in Python and C++; experience with ROS or device-level communication is a plus.
Deep experience with Qt/QML or similar UI frameworks (PyQt, OpenGL, Unity, GTK, WPF).
Proven ability to develop real-time or data-intensive applications.
Strong grasp of software architecture, design patterns, and modern development practices.
Experience collaborating with UI designers and implementing designs from tools like Figma.
Familiarity with Agile, Git, CI/CD, and cloud platforms (AWS/Azure).
Experience in regulated environments (medical devices strongly preferred).
Excellent communication, problem-solving skills, and team collaboration abilities.