Data Engineer: HR Systems (Boston, Massachusetts)
Data engineer job in Boston, MA
Data Engineer (HR Data Systems)
Boston, MA
3-4 days onsite a week
Pay Rate: $105/Hr
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
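For illustration, the Snowflake and transformation work described above might look something like the minimal Python sketch below, using the snowflake-connector-python package. The connection parameters, database, schema, and object names (HR_DB, EMPLOYEE_STG, V_HEADCOUNT) are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: run a cleansing transformation and publish a reporting view in Snowflake.
# All connection parameters and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    warehouse="ANALYTICS_WH",
    database="HR_DB",
    schema="HR_ANALYTICS",
)
try:
    cur = conn.cursor()
    # Basic cleansing/validation: trim identifiers and drop rows missing an employee ID.
    cur.execute("""
        CREATE OR REPLACE TABLE EMPLOYEE_CLEAN AS
        SELECT TRIM(EMPLOYEE_ID) AS EMPLOYEE_ID,
               UPPER(DEPARTMENT)  AS DEPARTMENT,
               HIRE_DATE
        FROM EMPLOYEE_STG
        WHERE EMPLOYEE_ID IS NOT NULL
    """)
    # Reporting view consumed by downstream HR KPIs and dashboards.
    cur.execute("""
        CREATE OR REPLACE VIEW V_HEADCOUNT AS
        SELECT DEPARTMENT, COUNT(*) AS HEADCOUNT
        FROM EMPLOYEE_CLEAN
        GROUP BY DEPARTMENT
    """)
finally:
    conn.close()
```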
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience developing with Python
Exposure to data warehousing solutions leveraging data from Workday or other HRIS platforms such as SuccessFactors to support advanced reporting and insights for an organization
Lead Data Scientist - Commercial Pharmaceuticals
Data engineer job in Cambridge, MA
Join a leading pharmaceutical company's Data Science team, where you'll drive and lead advanced analytics across Marketing, Sales, and Access. As Associate Director, a high-level individual contributor, you'll lead strategic initiatives ranging from predictive modeling and personalization to field force optimization, delivering scalable solutions that inform commercial decisions and enhance patient engagement. Deep experience in pharmaceutical marketing analytics is essential to translate brand strategy into actionable insights.
Keywords: MMM, Next Best Action, NLP, Data Science, HCP, GenAI
Location: Onsite 3 days a week in Cambridge, MA
Key Responsibilities
Lead development and deployment of predictive models, segmentation, NLP, and GenAI tools to solve complex commercial challenges
Translate pharmaceutical brand objectives into analytics frameworks across marketing, sales, and access
Design and operationalize Next Best Action strategies to boost omnichannel engagement and HCP ROI
Build and scale Patient 360 models and targeting algorithms for AI-driven lead generation
Guide stakeholders through insight activation and integration into workflows
Champion model governance, experimentation, and analytical rigor
Collaborate with IT to develop ML Ops environments and productized solutions
Manage external analytics partners and ensure alignment across data engineering, insights, and compliance
Who You Are
A strategic data scientist with strong business acumen, leadership presence, and deep experience in pharmaceutical marketing analytics. You thrive at the intersection of data and action, delivering measurable impact.
Qualifications
7+ years in analytics/data science; 4+ years in leadership roles within the pharmaceutical industry
Proven experience in pharmaceutical marketing analytics, including brand strategy, HCP engagement, and omnichannel optimization
Expertise in NBA, MMM, supervised/unsupervised learning, A/B testing, time-series forecasting
Success in marketing mix modeling, decision engines, and GenAI product design
Proficient in Python, R, SQL, Snowflake; skilled in Power BI or Tableau
Familiarity with APLD, PlanTrak, claims, and specialty pharmacy datasets
Strong communicator with executive presence and cross-functional influence
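As a small, hedged illustration of the A/B testing expertise listed above, the Python sketch below runs a two-proportion z-test comparing HCP engagement rates between two promotional channels. The counts are invented and statsmodels is assumed to be available; this is not part of the posting itself.

```python
# Hypothetical A/B test: did channel B produce a higher HCP engagement rate than channel A?
# The counts below are invented for illustration only.
from statsmodels.stats.proportion import proportions_ztest

engaged = [420, 465]    # engaged HCPs in channel A and channel B
exposed = [5000, 5000]  # HCPs exposed in each channel

# Two-sided test of equal proportions.
z_stat, p_value = proportions_ztest(count=engaged, nobs=exposed)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Engagement rates differ at the 5% significance level.")
else:
    print("No statistically significant difference detected.")
```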
Data Engineer
Data engineer job in Peabody, MA
About the Company
Our client is a well-established financial services organization with a strong presence across retail banking, insurance, wealth management, and mortgage services. They are embarking on an exciting journey to build a next-generation data analytics platform from the ground up, leveraging rich datasets and modern cloud technologies.
This is a fantastic opportunity to join a collaborative, mid-sized environment where you can make a real impact on data strategy and analytics capabilities.
The Role
We are seeking a Data Engineer who can wear multiple hats (data modeling, dashboard development, predictive analytics) and help shape the future of data within the organization.
You'll work closely with business stakeholders and IT leadership to design and implement solutions that unlock insights and drive strategic decisions.
Key Responsibilities
Design and develop new data solutions to meet evolving business needs.
Build and optimize predictive models; integrate with BI tools.
Develop dashboards and reports using Power BI.
Create and maintain efficient data pipelines; support cloud-to-cloud environments.
Collaborate with leadership to provide actionable insights.
Mentor team members on data handling best practices.
Stay current with emerging technologies and advocate for ethical data use.
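To picture the "predictive models integrated with BI tools" responsibility above, here is a hedged Python sketch: score a simple model and land the results in a SQL Server table that a Power BI dataset could read. The connection string, tables, and columns are assumptions for illustration only.

```python
# Hedged sketch: train a simple churn-style model and write scores to SQL Server for Power BI.
# The connection string, table names, and columns are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sqlalchemy import create_engine

# Assume a feature table already exists; a tiny synthetic one is used here for illustration.
train = pd.DataFrame({
    "balance": [1200, 300, 8000, 150, 4500],
    "tenure_months": [24, 3, 60, 1, 36],
    "churned": [0, 1, 0, 1, 0],
})

model = LogisticRegression().fit(train[["balance", "tenure_months"]], train["churned"])

scored = train[["balance", "tenure_months"]].copy()
scored["churn_score"] = model.predict_proba(scored[["balance", "tenure_months"]])[:, 1]

# Land the scores where a Power BI dataset can pick them up (placeholder DSN).
engine = create_engine("mssql+pyodbc://user:password@analytics_dsn")  # placeholder
scored.to_sql("customer_churn_scores", engine, if_exists="replace", index=False)
```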
Technical Requirements
Must Have:
Power BI and SQL Server expertise
Azure Cloud experience
ETL/ELT tools (SSIS, Informatica, Databricks, KingswaySoft)
Nice to Have:
Snowflake experience
Multi-cloud exposure (AWS/GCP)
Python and ML frameworks (TensorFlow, PyTorch, scikit-learn)
Predictive modeling and advanced analytics
Soft Skills
Builder mindset: creative with limited resources.
Comfortable in a non-siloed environment; able to wear multiple hats.
Strong collaboration and communication skills.
Open to learning financial services domain.
Why Apply?
Opportunity to design and build a data analytics platform from scratch.
Direct interaction with business stakeholders.
Stable organization with a history of growth and strong financials.
Collaborative culture with room for innovation.
Lead Data Engineer
Data engineer job in Smithfield, RI
Immediate need for a talented Lead Data Engineer. This is a 6+ month contract opportunity with long-term potential and is located in Smithfield, RI (Onsite). Please review the job description below and contact me ASAP if you are interested.
Job ID:25-93890
Pay Range: $63 - $73/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
In this role, you will be responsible for leading, engineering, and developing quality software components and applications for brokerage products.
You will build, modernize, and maintain Core & Common tools and Data Solutions. You will also apply and adopt a variety of cloud-native technologies for the products.
In addition to building software, you will have an opportunity to help define and implement development practices, standards, and strategies.
This position can be based in Merrimack, NH or Smithfield, RI (Smithfield is preferred).
Key Requirements and Technology Experience:
Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science) required, with 10 years of working experience
Advanced Oracle skills, including the ability to read Oracle SQL and PL/SQL
Knowledge of RDBMS, data modeling, ETL, and SQL concepts
Expertise in Oracle PL/SQL is a must for reading logic, schemas, and stored procedures
AWS data engineering services are a must (Batch, EMR, S3, Glue, Lambda, etc.)
Informatica is a must (strong ETL concepts needed)
Unix/Python for scripting is a must
Financial domain experience
Our client is a leader in the banking and financial industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Junior Data Engineer
Data engineer job in Boston, MA
Job Title: Junior Data Engineer
W2 candidates only
We are on the lookout for engineers who are open to upskilling into the exciting world of Data Engineering. This opportunity is for our client, a top-tier insurance company, and includes a 2-3 week online pre-employment training program (15 hours per week), conveniently scheduled after business hours. Participants who successfully complete the program will receive a $500 stipend. This is a fantastic chance to gain in-demand skills, hands-on experience, and a pathway into a dynamic tech role.
Key Responsibilities:
• Assist in the design and development of big data solutions using technologies such as Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch.
• Develop applications primarily in Scala and Python with guidance from senior team members.
• Write and optimize SQL queries, preferably with Redshift; experience with Snowflake is a plus.
• Work on ETL/ELT processes and frameworks to ensure smooth data integration.
• Participate in development tasks, including configuration, writing unit test cases, and testing support.
• Help identify and troubleshoot defects and assist in root cause analysis during testing.
• Support performance testing and production environment troubleshooting.
• Collaborate with the team on best practices, including Git version control and CI/CD deployment processes.
• Continuously learn and grow your skills in big data technologies and cloud platforms.
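As a rough, hedged illustration of the Spark-based pipeline work described above, a minimal PySpark job might look like the sketch below. Bucket paths and column names are placeholders, and a Spark environment with S3 access (for example, AWS Glue or EMR) is assumed.

```python
# Hedged PySpark sketch: read raw policy events from S3, apply simple validation and
# de-duplication, and write a partitioned Parquet dataset a warehouse such as Redshift could load.
# Bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy_events_etl").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/policy_events/")  # placeholder path

cleaned = (
    raw.filter(F.col("policy_id").isNotNull())            # basic validation
       .withColumn("event_date", F.to_date("event_ts"))   # derive a partition column
       .dropDuplicates(["policy_id", "event_ts"])          # de-duplicate replayed events
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/policy_events/"))  # placeholder path
```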
Prerequisites:
• Recent graduate with a degree in Computer Science, Information Technology, Engineering, or related fields.
• Basic experience or coursework in Scala, Python, or other programming languages.
• Familiarity with SQL and database concepts.
• Understanding of ETL/ELT concepts is preferred.
• Exposure to AWS cloud services (Glue, Lambda, SNS/SQS) is a plus but not mandatory.
• Strong problem-solving skills and eagerness to learn.
• Good communication and teamwork abilities.
Selection Process & Training:
• Online assessment and technical interview by Quintrix.
• Client Interview(s).
• 2-3 weeks of pre-employment online instructor-led training.
Stipend paid during Training:
• $500.
Benefits:
• 2 weeks of Paid Vacation.
• Health Insurance including Vision and Dental.
• Employee Assistance Program.
• Dependent Care FSA.
• Commuter Benefits.
• Voluntary Life Insurance.
• Relocation Reimbursement.
Who is Quintrix?
Quintrix is on a mission to help individuals develop their technology talent. We have helped hundreds of candidates kick-start their careers in tech. You will be “paid to learn”, qualifying you for a high-paying tech job with one of our top employers. To learn more about our candidate experience go to *************************************
Data Engineer
Data engineer job in Boston, MA
hackajob has partnered with Simply Business, which delivers digital solutions and leverages extensive industry data to drive impactful results.
Role: Data Analyst / Analytics Engineer
Mission:
You'll be part of the global Data and Analytics (DNA) team at Simply Business, helping drive data-driven decision making across the organization. Your mission will be to provide high-impact analytical support to business stakeholders, measure and evaluate product value streams, and ensure product investments deliver maximum value. As a trusted advisor, you'll translate business needs into actionable insights that improve operational efficiency and customer conversion.
Location: US-based (Hybrid; 8 days/month onsite)
Salary: Up to 99K USD
Work authorization: This role requires you to be authorized to work in the United States without sponsorship.
Qualifications (3+ years of experience):
Experience in data & analytics, with a preference for operational analytics
Experience working with cloud data warehouses (Snowflake or similar)
Strong SQL skills and experience designing data models (dbt is a plus)
Experience integrating new data sources and working with ETL pipelines
Strong data visualization and storytelling skills using BI tools such as Looker, Tableau, or Power BI
Ability to partner with product, engineering, and business teams to translate use cases into analytical solutions
Excellent communication skills, especially when presenting insights to non-technical stakeholders
Comfortable working as a self-starter in a globally distributed team
Benefits:
Opportunity to work on strategic, high-impact analytics initiatives
Exposure to global stakeholders across product, engineering, and business teams
Ongoing investment in modern data platforms and BI capabilities
Career growth within a mature and expanding Data & Analytics function
About hackajob:
hackajob is a recruitment platform that matches you with relevant roles based on your preferences. To be matched with roles like this one, you'll need to create an account with us.
This role requires you to be based in the US.
Data Engineer (HR Data warehousing exp)
Data engineer job in Boston, MA
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company's future with cloud and data. For more information, visit ************
Data Engineer (HR Data warehousing exp)
Boston, MA (3-4 days onsite a week)
Key Responsibilities:
Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting.
Design, build, and maintain Snowflake objects including tables, views, and stored procedures.
Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes.
Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets.
Manage technical metadata and documentation for data warehousing and migration efforts.
Optimize performance of data transformation pipelines and monitor integration performance.
Design, configure, and optimize integrations between Workday and third-party applications.
Participate in system testing including unit, integration, and regression phases.
Support data analysis needs throughout the implementation lifecycle.
Required Experience & Skills:
Experience with Snowflake or similar data warehouse platforms
Strong SQL skills and experience with data transformation tools
Experience with ETL processes and data validation techniques
Understanding of HR data structures and relationships
Excellent analytical and problem-solving abilities
Experience developing with Python
Architecting a data warehousing solution leveraging data from Workday or other HRIS platforms to support advanced reporting and insights for an organization
Preferred Experience & Skills:
Experience in developing and supporting a data warehouse serving the HR domain
Experience with data platforms where SCD Type 2 was required
Experience with data visualization tools such as Tableau
Experience with architecting or working with ELT technologies (such as DBT) and data architectures
Understanding of HR processes, compliance requirements, and industry best practices
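Because the preferred skills call out SCD Type 2, here is a minimal sketch of one common pattern, driven from Python against Snowflake: expire changed dimension rows, then insert fresh current versions. Table and column names (DIM_EMPLOYEE, STG_EMPLOYEE) are hypothetical, and this illustrates the idea rather than a production implementation.

```python
# Hedged SCD Type 2 sketch: expire changed rows, then insert new current versions.
# DIM_EMPLOYEE / STG_EMPLOYEE and their columns are hypothetical placeholders.
import snowflake.connector

EXPIRE_CHANGED_ROWS = """
UPDATE DIM_EMPLOYEE
SET IS_CURRENT = FALSE,
    EFFECTIVE_END = CURRENT_DATE
FROM STG_EMPLOYEE s
WHERE DIM_EMPLOYEE.EMPLOYEE_ID = s.EMPLOYEE_ID
  AND DIM_EMPLOYEE.IS_CURRENT = TRUE
  AND (DIM_EMPLOYEE.DEPARTMENT <> s.DEPARTMENT OR DIM_EMPLOYEE.JOB_TITLE <> s.JOB_TITLE)
"""

INSERT_NEW_VERSIONS = """
INSERT INTO DIM_EMPLOYEE (EMPLOYEE_ID, DEPARTMENT, JOB_TITLE, EFFECTIVE_START, EFFECTIVE_END, IS_CURRENT)
SELECT s.EMPLOYEE_ID, s.DEPARTMENT, s.JOB_TITLE, CURRENT_DATE, NULL, TRUE
FROM STG_EMPLOYEE s
LEFT JOIN DIM_EMPLOYEE d
  ON d.EMPLOYEE_ID = s.EMPLOYEE_ID AND d.IS_CURRENT = TRUE
WHERE d.EMPLOYEE_ID IS NULL  -- no current version exists (new employee, or version expired above)
"""

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",  # placeholders
    warehouse="ANALYTICS_WH", database="HR_DB", schema="HR_ANALYTICS",
)
try:
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)
    cur.execute(INSERT_NEW_VERSIONS)
finally:
    conn.close()
```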
Ness is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law
Senior Data Engineer
Data engineer job in Boston, MA
This role is with a Maris Financial Services Partner
Boston, MA - Hybrid Role - We are targeting local candidates who can be in the Boston office 3 days per week.
12 Month + contract (or contract to hire, if desired)
This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities.
Key Responsibilities:
Design, enhance, and manage DataOps tools and services to support cloud initiatives.
Develop and maintain scheduled workflows using Airflow.
Create containerized applications for deployment with ECS, Fargate, and EKS.
Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake.
Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs.
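To make the Airflow and pipeline responsibilities above concrete, a minimal DAG sketch follows. The task callables are stubs, and the DAG name, schedule, and flow (extract, publish to Kafka, load Snowflake) are placeholder assumptions rather than this team's actual workflow.

```python
# Hedged sketch of a scheduled Airflow workflow: extract, publish to Kafka, then refresh Snowflake.
# The callables below are stubs; all names and the schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_source_data(**context):
    print("extract data from source systems (stub)")


def publish_to_kafka(**context):
    print("publish extracted records to a Kafka topic (stub)")


def load_snowflake(**context):
    print("consume from Kafka and load/refresh Snowflake tables (stub)")


with DAG(
    dag_id="example_kafka_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",   # daily at 06:00; Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
    publish = PythonOperator(task_id="publish_to_kafka", python_callable=publish_to_kafka)
    load = PythonOperator(task_id="load_snowflake", python_callable=load_snowflake)

    extract >> publish >> load
```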
Qualifications:
Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools.
Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments.
Proficiency in scripting languages such as Bash.
Experience with Python, Go, or C#.
Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation.
Preferred experience with Apache Kafka and Flink.
Proven experience working with Kubernetes.
Strong knowledge of Linux and Docker environments.
Excellent communication and interpersonal skills.
Strong analytical and problem-solving abilities.
Ability to manage multiple tasks and projects concurrently.
Expertise with SQL Server, Postgres, and Snowflake.
In-depth experience with ETL/ELT processes.
Senior Data Engineer
Data engineer job in Boston, MA
Hi, this is Eric 👋 We're hiring a stellar Data Engineer to join our engineering org at Basil Systems.
At Basil Systems, we're revolutionizing healthcare data access and insights for the life sciences industry. We've built powerful platforms that help pharmaceutical and medical device companies navigate complex regulatory landscapes, accelerate product development, and ultimately bring life-saving innovations to market faster. Our SaaS platforms transform disconnected data sources into actionable intelligence, empowering organizations to make data-driven decisions that improve patient outcomes and save lives.
The Role
We are seeking a Senior Data Engineer to own and advance the data infrastructure that powers our healthcare insights platform. As our engineering team scales and we expand our data capabilities, we need someone who can build reliable, scalable pipelines while ensuring data quality across increasingly complex regulatory sources.
Key Responsibilities
Design, build, and maintain robust ETL processes for healthcare regulatory data
Integrate new data sources as we onboard customers and expand platform capabilities
Optimize pipeline performance and reliability
Ensure data accuracy and consistency across complex transformation workflows
Qualifications
5+ years of professional experience as a data engineer or in a similar role
Experience with Apache Spark and distributed computing
Familiarity with common ML algorithms and their applications
Knowledge of or willingness to learn and work with Generative AI technologies
Experience with developing for distributed cloud platforms
Experience with MongoDB / ElasticSearch and technologies like BigQuery
Strong commitment to engineering best practices
Nice-to-Haves
Solid understanding of modern security practices, especially in healthcare data contexts
Subject matter expertise in LifeSciences / Pharma / MedTech
This role might not be for you if...
You're a heavy process advocate and want enterprise-grade Scrum or rigid methodologies
You have a need for perfect clarity before taking action
You have a big company mindset
What We Offer
Competitive salary
Health and vision benefits
Attractive equity package
Flexible work environment (remote-friendly)
Opportunity to work on impactful projects that are helping bring life-saving medical products to market
Be part of a mission-driven team solving real healthcare challenges at a critical scaling point
Our Culture
At Basil Systems, we value flexibility and support a distributed team. We actively employ and support remote team members across different geographies, allowing you to work when, where, and how you work best. We are committed to building a diverse, inclusive, and safe work environment for everyone. Our team is passionate about using technology to make a meaningful difference in healthcare.
How to Apply
If you're excited about this opportunity and believe you'd be a great fit for our team, please send your resume and a brief introduction to *****************************.
Basil Systems is an equal opportunity employer. We welcome applicants of all backgrounds and experiences.
Senior Data Engineer
Data engineer job in Boston, MA
first PRO is now accepting resumes for a Senior Data Engineer role in Boston, MA. This is a direct hire role and onsite 2-3 days per week.
RESPONSIBILITIES INCLUDE
Support and enhance the firm's Data Governance, BI platforms, and data stores.
Administer and extend data governance tools including Atlan, Monte Carlo, Snowflake, and Power BI.
Develop production-quality code and data solutions supporting key business initiatives.
Conduct architecture and code reviews to ensure security, scalability, and quality across deliverables.
Collaborate with the cloud migration, information security, and business analysis teams to design and deploy new applications and migrate existing systems to the cloud.
TECHNOLOGY EXPERIENCE
Hands-on experience supporting SaaS, business-facing applications.
Expertise in Python for data processing, automation, and production-grade development.
Strong knowledge of SQL, data modeling, and data warehouse design (Kimball/star schema preferred).
Experience with Power BI or similar BI/reporting tools.
Familiarity with data pipeline technologies and orchestration tools (e.g., Airflow, dbt).
Experience with Snowflake, Redshift, BigQuery, or Athena.
Understanding of data governance, data quality, and metadata management frameworks.
QUALIFICATIONS
BS or MS in Computer Science, Engineering, or a related technical field.
7+ years of professional software or data engineering experience.
Strong foundation in software design and architectural patterns.
Data Modelling Architect
Data engineer job in Boston, MA
The Wissen team continues to expand its footprint in Canada & the USA. More openings to come as we continue to grow the team!
Please read below for a brilliant career opportunity.
Role: Data Modelling Architect
Title: Vice President
Location: Boston, MA (Day 1 Onsite/Hybrid)
Mode of Work: 3 days per week onsite required
Required experience: 10+ Years
Job Description
We are looking for an experienced Data Modelling Architect to design and optimize enterprise data models supporting risk, regulatory, and financial domains. The role requires strong expertise in conceptual, logical, and physical data modelling, along with working knowledge of Financial Risk or Operational Risk frameworks used in global banking environments.
Required Skills
10-12 years of strong experience in data modelling and data architecture.
Expertise in ER modelling, dimensional modelling, and industry-standard modelling methodologies.
Hands-on experience with tools like Erwin, ER/Studio.
Strong SQL and experience with relational databases and distributed/cloud data platforms.
Working knowledge of Financial Risk, Operational Risk, or regulatory risk data (Credit Risk, Market Risk, Liquidity Risk, RCSA, Loss Events, KRI, etc.).
Experience supporting regulatory frameworks such as Basel II/III, CCAR, ICAAP, or similar.
Ability to work with cross-functional teams across global locations.
Excellent communication and documentation skills.
Benefits:
Healthcare insurance for you and your family (medical, dental, vision).
Short / Long term disability insurance.
Life Insurance.
Accidental death & disability Insurance.
401K.
3 weeks of Paid Time Off.
Support and fee coverage for immigration needs.
Remote office set up stipend.
Support for industry certifications.
Additional cash incentives.
Re-skilling opportunities to transition between technologies.
Schedule: Monday to Friday
Work Mode: Hybrid
Job Type: Full-time
We are: A high-end technical consulting firm built and run by highly qualified technologists. Our workforce consists of 5000+ highly skilled professionals, with leadership from Wharton, MIT, IITs, IIMs, and NITs and decades of experience at Goldman Sachs, Morgan Stanley, MSCI, Deutsche Bank, Credit Suisse, Verizon, British Telecom, ISRO, etc. Without any external funding or investments, Wissen Technology has grown its revenues by 100% every other year since it started as a subsidiary of Wissen Group in 2015. We have a global presence with offices in the US, India, UK, Australia, Mexico, and Canada.
You are: A true tech or domain ninja, or both. You are comfortable working in a quickly growing, profitable startup, have a “can do” attitude, and are willing to take on any task thrown your way.
You will:
Develop and promote the company's culture of engineering excellence.
Define, develop, and deliver solutions at a top-tier investment bank or another esteemed client.
Perform other duties as needed
Your Education and Experience:
We value candidates who can execute on our vision and help us build an industry-leading organization.
Graduate-level degree in computer science, engineering, or related technical field
Wissen embraces diversity and is an equal opportunity employer. We are committed to building a team that represents a variety of backgrounds, skills, and abilities. We believe that the more inclusive our team is, the better our work will be. All qualified applicants, including but not limited to LGBTQ+, Minorities, Females, the Disabled, and Veterans, are encouraged to apply.
About Wissen Technology:
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for diverse industries, including Banking, E-commerce, Telecom, Healthcare, Manufacturing, and Energy. We help clients build world-class products. We have offices in the US, India (Bangalore, Hyderabad, Chennai, Gurugram, Mumbai, Pune), UK, Australia, Mexico, Vietnam, and Canada.
We empower businesses with a dynamic portfolio of services and accelerators tailored to today's digital demands and based on a future-ready technology stack. Our services include Industry Leading Custom Software Development, AI-Driven Software Engineering, Generative AI & Machine Learning, Real-Time Data Analytics & Insights, Interactive Data Visualization & Decision Intelligence, Intelligent Process Automation, Multi-Cloud & Hybrid Cloud Strategies, Cross-Platform Mobile Experiences, CI/CD-Powered Agile DevOps, Automated Quality Engineering, and cutting-edge integrations.
Certified as a Great Place to Work for five consecutive years (2020-2025) and recognized as a Top 20 AI/ML vendor by CIO Insider, Wissen Group has delivered multimillion-dollar projects for over 20 Fortune 500 companies. Wissen Technology delivers exceptional value on mission-critical projects through thought leadership, ownership, and reliable, high-quality, on-time delivery.
Our industry-leading technical expertise stems from the talented professionals we attract. Committed to fostering their growth and providing top-tier career opportunities, Wissen ensures an outstanding experience and value for our clients and employees.
We Value:
Perfection: Pursuit of excellence through continuous improvement.
Curiosity: Fostering continuous learning and exploration.
Respect: Valuing diversity and mutual respect.
Integrity: Commitment to ethical conduct and transparency.
Transparency: Open communication and trust.
Website: **************
Glassdoor Reviews: *************************************************************
Wissen Thought leadership: https://**************/articles/
Latest in Wissen in CIO Insider:
**********************************************************************************************************************
Employee Speak:
***************************************************************
LinkedIn: **************************************************
About Wissen Interview Process:
https://**************/blog/we-work-on-highly-complex-technology-projects-here-is-how-it-changes-whom-we-hire/
Wissen: A Great Place to Work
https://**************/blog/wissen-is-a-great-place-to-work-says-the-great-place-to-work-r-institute-india
https://**************/blog/here-is-what-ownership-and-commitment-mean-to-wissenites/
Wissen | Driving Digital Transformation
A technology consultancy that drives digital innovation by connecting strategy and execution, helping global clients to strengthen their core technology.
Job Type: Full-time
Work Location: In person
HR Data Analytics Architect
Data engineer job in Boston, MA
Key Responsibilities:
Architect & Model: Design and implement scalable, efficient Snowflake data models to support HR analytics, workforce planning, and KPI reporting.
Data Integration: Develop and optimize integrations between Workday, Snowflake, and downstream analytics platforms; ensure seamless, accurate data flow across systems.
Governance & Quality: Define and enforce data governance, quality, and metadata management standards to ensure data consistency and compliance.
Documentation & Metadata: Maintain comprehensive technical documentation and data dictionaries for warehouse structures, transformations, and integrations.
Performance Optimization: Monitor and tune ETL/ELT pipelines, ensuring high-performance data transformation and loading processes.
Collaboration: Partner with HR, Data Engineering, and Analytics teams to translate business logic into reusable and governed data assets.
Testing & Validation: Participate in unit, integration, and regression testing to validate data pipelines and ensure data accuracy.
Lifecycle Support: Support data analysis and troubleshooting across the full implementation and operational lifecycle of HR data solutions.
Required Experience & Skills:
Proven experience architecting and implementing solutions on Snowflake or similar cloud data warehouse platforms.
Advanced SQL skills and hands-on experience with data transformation and pipeline optimization tools.
Strong understanding of ETL/ELT frameworks, data validation, and reconciliation techniques.
Demonstrated experience working with HR data structures, Workday, or other HRIS systems.
Strong analytical mindset and problem-solving ability, with attention to data integrity and business context.
Experience with Python for data engineering, automation, or orchestration tasks.
Track record of designing data warehouses or analytical platforms leveraging HR data to drive insights and advanced reporting.
Preferred Experience & Skills:
Experience building and supporting data warehouses specifically for HR and People Analytics domains.
Hands-on experience with Slowly Changing Dimensions (SCD Type 2) and historical data management.
Proficiency with data visualization tools such as Tableau or Power BI.
Experience with ELT frameworks (e.g., dbt) and modern data architecture patterns (e.g., Data Vault, Medallion Architecture).
Familiarity with HR processes, compliance standards, and industry best practices related to HR data management and reporting.
Experience working in an enterprise environment with cross-functional collaboration between HR, Finance, and Technology teams.
Data Architect
Data engineer job in Waltham, MA
For over 16 years, Trilyon has been a leader in global workforce solutions, specializing in Cloud Technology, AI/ML, Software Development, Technical Writing, and Digital Transformation. We partner with top companies to deliver high-quality talent in engineering, IT, and emerging technologies. For additional information or to view all of our job opportunities, please visit our website ************************************
We are seeking an AI Data Architect to join our team. This role will involve designing and leading data and AI architecture solutions, collaborating with business stakeholders, and driving end-to-end solution closures. The ideal candidate will have extensive experience in retail data architecture, cloud data platforms, and AI/ML technologies, along with a passion for building scalable, data-driven AI solutions.
Job Title: AI Data Architect
Location: Waltham, MA
Duration: 6 months
Job Description:
The AI Data Architect will be responsible for leading data architecture and AI initiatives, with a strong focus on retail domain solutions. Key responsibilities include:
Leveraging 10+ years of retail domain experience to design and guide data and AI solutions.
Working independently with business stakeholders to understand requirements, propose solutions, and drive solution closures.
Applying strong proficiency in AI/LLM technologies, including prompt engineering and development of LLM-based agents.
Designing and managing cloud data platforms with mandatory 5+ years of experience in Snowflake or Databricks.
Establishing data architecture and governance frameworks, including standards, security, compliance, and continuous improvement of existing architectures.
Writing advanced SQL queries for data analysis, profiling, and supporting foundational AI and analytical systems.
Developing and deploying machine learning solutions using Python, with hands-on experience in MLOps and CI/CD orchestration, and integrating models into big data platforms.
Building scalable application architectures using Python, Streamlit, and YAML, particularly for chatbot applications.
Providing technical leadership with 5+ years of experience in solution architecture or technical lead roles, including mentoring engineers and leading projects.
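For the Streamlit chatbot item above, a minimal hedged sketch is shown below; the echo-style reply is a stub standing in for whatever LLM or agent backend would actually be used, and none of the names reflect the client's stack.

```python
# Hedged Streamlit chatbot sketch (save as app.py and run with `streamlit run app.py`).
# The "reply" logic is a stub standing in for a real LLM call; all names are placeholders.
import streamlit as st

st.title("Retail assistant (demo)")

# Keep the conversation in session state so it survives reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay prior turns.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# Accept a new user turn and respond.
if prompt := st.chat_input("Ask about sales or inventory..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    reply = f"(stub) You asked: {prompt}"  # replace with a real LLM/agent call
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)
```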
Why Join Us?
Trilyon, Inc., offers a comprehensive benefits package.
Opportunities for growth and professional development.
Collaborative and inclusive company culture.
Equal Employment Opportunity (EEO) Statement:
Trilyon, Inc., is an Equal Opportunity Employer committed to diversity, equity, and inclusion. We do not discriminate based on race, color, religion, gender, gender identity, sexual orientation, national origin, age, disability, veteran status, or any other protected status under applicable laws. Our diverse team drives innovation, competitiveness, and creativity, enhancing our ability to effectively serve our clients and communities. This commitment to diversity makes us stronger and more adaptable.
Lead Data Engineer
Data engineer job in Cambridge, MA
Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Lead Data Engineer, you'll have the opportunity to be on the forefront of driving a major transformation within Capital One.
What You'll Do:
Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
Work with a team of developers with deep experience in machine learning, distributed microservices, and full stack systems
Utilize programming languages like Java, Scala, Python and Open Source RDBMS and NoSQL databases and Cloud based data warehousing services such as Redshift and Snowflake
Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
Basic Qualifications:
Bachelor's Degree
At least 4 years of experience in application development (Internship experience does not apply)
At least 2 years of experience in big data technologies
At least 1 year experience with cloud computing (AWS, Microsoft Azure, Google Cloud)
Preferred Qualifications:
7+ years of experience in application development including Python, SQL, Scala, or Java
4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
4+ years of experience working on real-time data and streaming applications
4+ years of experience with NoSQL implementation (Mongo, Cassandra)
4+ years of data warehousing experience (Redshift or Snowflake)
4+ years of experience with UNIX/Linux including basic commands and shell scripting
2+ years of experience with Agile engineering practices
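As a hedged illustration of the real-time streaming qualifications above, the PySpark Structured Streaming sketch below consumes a Kafka topic and writes decoded records to the console. The broker address, topic, and checkpoint path are placeholders, and the spark-sql-kafka connector is assumed to be available.

```python
# Hedged sketch: consume a Kafka topic with Spark Structured Streaming and write to console.
# Requires the spark-sql-kafka connector; broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_stream_demo").getOrCreate()

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
         .option("subscribe", "transactions")                 # placeholder topic
         .load()
)

# Kafka values arrive as bytes; cast to string for downstream parsing.
decoded = events.select(F.col("value").cast("string").alias("payload"))

query = (
    decoded.writeStream
           .format("console")
           .option("checkpointLocation", "/tmp/checkpoints/kafka_demo")  # placeholder path
           .outputMode("append")
           .start()
)
query.awaitTermination()
```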
At this time, Capital One will not sponsor a new applicant for employment authorization, or offer any immigration related support for this position (i.e. H1B, F-1 OPT, F-1 STEM OPT, F-1 CPT, J-1, TN, E-2, E-3, L-1 and O-1, or any EADs or other forms of work authorization that require immigration support from an employer).
The minimum and maximum full-time annual salaries for this role are listed below, by location. Please note that this salary information is solely for candidates hired to perform work within one of these locations, and refers to the amount Capital One is willing to pay at the time of this posting. Salaries for part-time roles will be prorated based upon the agreed upon number of hours to be regularly worked.
Cambridge, MA: $193,400 - $220,700 for Manager, Data Engineering
McLean, VA: $193,400 - $220,700 for Manager, Data Engineering
Richmond, VA: $175,800 - $200,700 for Manager, Data Engineering
Candidates hired to work in other locations will be subject to the pay range associated with that location, and the actual annualized salary amount offered to any candidate at the time of hire will be reflected solely in the candidate's offer letter.
This role is also eligible to earn performance based incentive compensation, which may include cash bonus(es) and/or long term incentives (LTI). Incentives could be discretionary or non discretionary depending on the plan.
Capital One offers a comprehensive, competitive, and inclusive set of health, financial and other benefits that support your total well-being. Learn more at the Capital One Careers website . Eligibility varies based on full or part-time status, exempt or non-exempt status, and management level.
This role is expected to accept applications for a minimum of 5 business days. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections ; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1- or via email at . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.
For technical support or questions about Capital One's recruiting process, please send an email to
Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site.
Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Software Engineer
Data engineer job in Boston, MA
Work schedule: Hybrid
Key Responsibilities:
Performance Tuning: Monitor and optimize performance, including query performance, resource utilization, and storage management.
User and Access Management: Manage user access, roles, and permissions to ensure data security and compliance with organizational policies.
Data Integration: Support and manage data integration processes, including data loading, transformation, and extraction.
Troubleshooting and Support: Provide technical support and troubleshooting for Snowflake-related issues, including resolving performance bottlenecks and query optimization.
Documentation and Reporting: Maintain detailed documentation of system configurations, procedures, and changes. Generate and deliver regular reports on system performance and usage.
Collaboration: Work closely with data engineers, analysts, and other IT professionals to ensure seamless integration and optimal performance of the Snowflake environment.
Best Practices: Stay up to date with Snowflake best practices and industry trends. Recommend and implement improvements and upgrades to enhance system functionality and performance.
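As a small illustration of the access-management and performance-monitoring responsibilities above, the Python sketch below grants read access to a reporting role and lists recent long-running queries via Snowflake's QUERY_HISTORY table function. All object names, credentials, and thresholds are placeholder assumptions.

```python
# Hedged Snowflake administration sketch: role-based access plus a slow-query check.
# Connection details, role/schema names, and thresholds are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="admin_user", password="your_password",  # placeholders
    warehouse="ADMIN_WH", database="ANALYTICS_DB",
)
try:
    cur = conn.cursor()

    # Grant read access on an analytics schema to a reporting role.
    cur.execute("GRANT USAGE ON SCHEMA ANALYTICS_DB.REPORTING TO ROLE REPORT_READER")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.REPORTING TO ROLE REPORT_READER")

    # Surface queries from the last day that ran longer than 60 seconds.
    cur.execute("""
        SELECT query_id, user_name, total_elapsed_time / 1000 AS seconds
        FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(
            END_TIME_RANGE_START => DATEADD('day', -1, CURRENT_TIMESTAMP())))
        WHERE total_elapsed_time > 60000
        ORDER BY total_elapsed_time DESC
    """)
    for query_id, user_name, seconds in cur.fetchall():
        print(query_id, user_name, f"{seconds:.1f}s")
finally:
    conn.close()
```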
Qualifications and Experience:
5+ years of experience in data architecture, data engineering, or database development.
2+ years of hands-on experience with Snowflake, including data modeling, performance tuning, and security.
At a minimum, a Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience with source control tools (GitHub preferred), ETL/ELT tools and cloud platforms (AWS preferred).
Experience or exposure to AI tools.
Deep understanding of data warehousing concepts, dimensional modeling, and analytics.
Excellent problem-solving and communication skills.
Experience integrating Snowflake with BI and reporting tools is a plus
Required Skills:
Strong proficiency in Snowflake architecture, features, and capabilities.
Knowledge of SQL and Snowflake-specific query optimization.
Experience with ETL tools and data integration processes.
Strong proficiency in SQL and Python.
Strong database design and data modeling experience. Experience with data modeling tools.
Ability to identify and drive continuous improvements.
Strong problem solving and analytical skills.
Demonstrated process-oriented and strategic thinking skills.
Strong motivation and a desire to continuously learn and grow.
Knowledge of Snowflake security features including access control, authentication, authorization, encryption, masking, secure view, etc.
Experience working in AWS cloud environments.
Experience working with Power BI and other BI, data visualization, and reporting tools.
Business requirements gathering and alignment with solutions delivery.
Experience with data integration solutions and tools, data pipelines, and modern ways of automating data using cloud based and on-premises technologies.
Experience integrating Snowflake with an identity and access management program such as Azure IDP is a plus.
Experience with other relational database management systems, cloud data warehouses and big data platforms is a plus.
Analytical Skills: Excellent problem-solving and analytical skills with strong attention to detail.
Communication: Effective communication skills, both written and verbal, with the ability to convey complex technical information to non-technical stakeholders.
Teamwork: Ability to work independently and collaboratively in a fast-paced environment.
Preferred Skills:
Snowflake certification (e.g., SnowPro Core or Advanced Certification).
AWS Networking / AWS DevOps Engineer
Data engineer job in Quincy, MA
Job Description - AWS Networking / AWS DevOps Engineer
Type: Hybrid (3 to 4 days based on client request and project demand)
Role: AWS Cloud Networking Engineer/DevOps
We are seeking an experienced Networking-focused AWS DevOps Engineer to support and optimize our multi-region cloud infrastructure. The ideal candidate will have strong expertise across AWS networking, multi-region architectures, CI/CD, container orchestration, infrastructure automation, and data platform components such as Redshift. This role is part-time but requires a hands-on engineer who can troubleshoot, optimize, and enhance our production and non-production cloud environments.
Key Responsibilities
AWS Multi-Region Architecture & Networking
• Design, implement, and optimize multi-region VPC architectures, peering, Transit Gateway, and routing policies.
• Configure and manage security groups, NACLs, route tables, NAT gateways, IGWs, and cross-region networking.
• Ensure high availability (HA) and disaster recovery (DR) readiness across multiple AWS regions.
• Support network connectivity for hybrid environments (VPN, Direct Connect).
AWS DevOps & Automation
• Develop and maintain CI/CD pipelines using CodePipeline, CodeBuild, GitHub Actions, GitLab, or Jenkins.
• Automate infrastructure provisioning using Terraform, CloudFormation, or CDK.
• Implement and optimize monitoring, logging, and alerting via CloudWatch, OpenSearch, Prometheus/Grafana, or equivalent.
• Drive continuous improvements in deployment reliability and DevOps best practices.
Compute & Container Services
• Manage and optimize EC2 instances including AMIs, autoscaling, patching, and configurations.
• Deploy, scale, and troubleshoot workloads on ECS (Fargate or EC2).
• Implement workload security, resource optimization, and cost controls across compute services.
Redshift & Data Infrastructure Support
• Support Redshift cluster configuration, security, WLM settings, performance optimization, and connectivity.
• Ensure secure and optimized data flows between ETL layers, Redshift, EC2/ECS services, and S3.
• Collaborate with data teams to tune Redshift workloads and ensure optimal network performance.
Security & Compliance
• Enable IAM policies, role-based access, and least-privilege security controls.
• Implement multi-region failover, backup/restore strategies, and environment hardening.
• Ensure compliance with security best practices, patching, encryption, and CloudTrail logging.
Operations, Troubleshooting & Support
• Troubleshoot multi-region connectivity, latency, DNS, and infrastructure issues.
• Optimize cloud spend across compute, networking, and Redshift workloads.
• Provide on-call / ad-hoc support during deployments or critical incidents (as needed).
Required Skills & Experience
Technical Skills
• 10+ years of experience as a DevOps, Cloud Engineer, or AWS Infrastructure Engineer.
• Strong AWS networking expertise: VPC, TGW, Route53, VPN, Direct Connect, SGs, NACLs.
• Experience with multi-region, HA, DR architectures.
• Proficient in EC2, ECS (Fargate/EC2 Launch Types), Redshift.
• Strong Terraform / CloudFormation scripting experience.
• Strong experience with Python or Bash for automation.
• Hands-on experience setting up CI/CD pipelines.
• Experience with monitoring/observability tools: CloudWatch, OpenSearch, Grafana/Prometheus, Datadog, etc.
• Familiarity with cloud cost optimization and tagging strategies.
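The security-group work listed above is often spot-checked with a small script; the hedged boto3 sketch below flags security groups that allow SSH from anywhere. The region and the audit criteria are placeholder assumptions, and credentials are taken from the environment.

```python
# Hedged sketch: flag security groups that allow SSH (port 22) from 0.0.0.0/0.
# Region and "violation" criteria are placeholder assumptions for illustration.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

paginator = ec2.get_paginator("describe_security_groups")
for page in paginator.paginate():
    for sg in page["SecurityGroups"]:
        for rule in sg.get("IpPermissions", []):
            from_port = rule.get("FromPort")
            to_port = rule.get("ToPort")
            covers_ssh = from_port is not None and from_port <= 22 <= to_port
            open_to_world = any(
                r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", [])
            )
            if covers_ssh and open_to_world:
                print(f"{sg['GroupId']} ({sg.get('GroupName', '')}) allows SSH from 0.0.0.0/0")
```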
Data Scientist-Statistics OR Operations Research
Data engineer job in Johnston, RI
Established nearly two centuries ago, FM is a leading mutual insurance company whose capital, scientific research capability and engineering expertise are solely dedicated to property risk management and the resilience of its policyholder-owners. These owners, who share the belief that the majority of property loss is preventable, represent many of the world's largest organizations, including one of every four Fortune 500 companies. They work with FM to better understand the hazards that can impact their business continuity to make cost-effective risk management decisions, combining property loss prevention with insurance protection.
As a Data Scientist, you will focus on translating business needs into analytics, interpretation of analytics into business applications, sophisticated technologies, and artificial intelligence solutions. This role will allow you to innovate, explore, and build solutions for FM using new technologies. You will develop and apply statistics, artificial intelligence, machine learning, and deep learning to various business problems in loss prevention.
You'll be part of our innovative and diverse team of sophisticated data and analytics professionals. You'll work alongside multiple departments, including operations, innovation, business technology transformation, underwriting, and engineering. Through constant learning, innovation, discovery, and collaboration you'll not only help FM deliver on the promise of loss prevention, but you'll also grow your career and the scope of your impact across our company.
Using your creativity and applying a vast array of techniques and tools, you will plan, conduct, and advise the development and evaluation of real-world, large-scale problems using artificial intelligence/machine learning with minimal or limited supervision.
Your projects will be interesting, exciting and ambitious! You will use statistics to advance the mission and goals of FM.
Ph.D. in Statistics, Biostatistics or Operational Research with 2+ years of industry experience or a Master's degree with 5+ years of industry working experience in data science modelling
5+ years of experience in data processing, statistical analysis, and modelling using tools such as Python, R, or SQL
2+ years of experience working with a cloud data analytics platform, such as Databricks
Advanced Knowledge and Working Experience in:
Generalized Linear Models (Logistic Regression, Zero-Inflated Model, etc.)
Model Regularization
Probability Distributions
Hypothesis testing
Statistical Inference
Machine Learning (Random Forest, Clustering, Gradient Descent, Gradient Boosting, etc.)
Simulation
Experiment Design
Non-Parametric Statistics
Experience in a lead role on full-cycle data science projects
Working experience in risk management and/or property insurance is strongly preferred
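To ground the GLM and logistic-regression items above, here is a tiny hedged sketch that fits a logistic regression with statsmodels on synthetic data. The variables and coefficients are invented for illustration and do not reflect FM's actual loss-prevention models.

```python
# Hedged sketch: fit a logistic regression (a GLM with a binomial family) on synthetic data.
# The data and feature names are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "sprinkler_coverage": rng.uniform(0, 1, n),  # fraction of site protected
    "building_age": rng.integers(1, 80, n),      # years
})
# Synthetic loss indicator: older, less-protected sites are more likely to have a loss.
logit = -1.0 - 2.5 * df["sprinkler_coverage"] + 0.03 * df["building_age"]
df["had_loss"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("had_loss ~ sprinkler_coverage + building_age", data=df).fit()
print(model.summary())
```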
Compensation, Grade, and Job Title will be determined based on qualifications, experience, and technical skillset.
The position is eligible to participate in FM's comprehensive Total Rewards program that includes an incentive plan, generous health and well-being programs, a 401(k) and pension plan, career development opportunities, tuition reimbursement, flexible work, paid time off allowances and much more.
FM is an Equal Opportunity Employer and is committed to attracting, developing, and retaining a diverse workforce.
#LI-TA1
DevOps Engineer
Data engineer job in Boston, MA
We're looking for a Senior DevOps Tools Engineer to help modernize and elevate our development ecosystem. If you're passionate about improving how software teams build, test, secure, and deliver high-quality code, this role is built for you.
This is not a traditional infrastructure-heavy DevOps role. It's a developer-enablement, tooling modernization, and process transformation position with real influence.
🔧 Role Overview
You will lead initiatives that reshape how engineering teams work: modernizing tooling, redesigning source control practices, improving CI/CD workflows, and championing DevEx across the organization. This role combines hands-on engineering with strategic process design.
⭐ Key Responsibilities
Drive modernization of development tools and processes, including SVN → Git migration and workflow redesign.
Own and enhance CI/CD pipelines to improve reliability, automation, and performance.
Implement modern DevOps + DevSecOps practices (SAST, DAST, code scanning, dependency checks, etc.).
Automate build, packaging, testing, and release processes.
Advocate for and improve Developer Experience (DevEx) by reducing friction and enabling efficiency.
Collaborate across engineering teams to define standards for source control, branching, packaging, and release workflows.
Guide teams through modernization initiatives and influence technical direction.
🎯 Must-Have Qualifications
Strong experience with CI/CD pipelines, developer tooling, and automation.
Hands-on expertise with Git + Git-based platforms (GitLab, GitHub, Bitbucket).
Experience modernizing tooling or migrating from legacy systems (SVN → Git is a big plus).
Solid understanding of DevOps / DevSecOps workflows: automation, builds, packaging, security integration.
Proficient in scripting/programming for automation (Python, Bash, PowerShell, Groovy, etc.).
Excellent communication skills and ability to guide teams through change.
🏙️ Work Model
This is a full-time, hybrid role based in Boston, MA. Onsite participation is required.
📩 When Applying
Please include:
Updated resume
Expected Salary
Notice period (30 days or less)
A good time for a quick introductory call
If you're excited about modernizing engineering ecosystems, improving developer experience, and driving organization-wide transformation, we'd love to connect.
Java Software Engineer
Data engineer job in Boston, MA
Hello,
We have urgent openings for a "Java Backend Developer". These are hybrid roles.
Title: Backend Java Developer
F2F interview is required
Job Description:
Java/AWS Backend Developer
Senior Individual Contributors with deep expertise in Java, Node.js, AWS system design, event-driven microservices, performance optimization, and LLM integrations.
Key Responsibilities
Architect modular microservices and event-driven systems using Java/Node.js on AWS (SNS/SQS, Lambda, ECS, Batch).
Drive performance improvements, profiling, fine-tuning, and quality gates for production reliability.
Integrate chatbots and LLMs into backend services for intelligent, scalable applications.
Required Qualifications
10+ years backend experience with Java and Node.js.
Proven system design skills in distributed, microservices architectures.
Hands-on AWS expertise and event-driven patterns.
Track record of performance optimization and modular development.
ABOUT US:
Anagh Technologies is a technical consulting firm specializing in UI, Front-End, and Full-Stack web technologies. We currently have 30+ positions in Angular, React, Node, and Java.
If you are technically strong, we can 100% get you an offer within 2 weeks max, as we will consider you for multiple roles at once. If you are interested and available, please email your resume and contact information to jeff.r AT anaghtech.com. Thank you for your time.
Robotics Software Engineer
Data engineer job in Danvers, MA
Open Role: Onboarding Immediately
for REAL is a modern platform focused on simplifying the leasing experience for tenants and landlords. Tenants can browse listings, take 3D tours, and complete the application process seamlessly on their phones. Landlords benefit from centralized management of the leasing cycle, from tours to rent collection, all in one platform.
Role Description
This is a full-time on-site Robotics Engineer role located in Danvers, MA. The Robotics Software Engineer will be responsible for tasks such as developing robotics systems, implementing process automation, and collaborating with the software development team to enhance technology solutions.
Qualifications:
Experience with Structure from Motion (SfM) and camera pose estimation
Strong experience with 3D Gaussian Splatting and surface reconstruction
Proficiency in Python and C++
Hands-on experience designing and implementing computer vision algorithms (segmentation, object detection, classification, tracking)
Familiarity with deep learning models and their deployment
Solid understanding of multi-view geometry
Proficiency in OpenCV, and either PyTorch or TensorFlow
Experience working with 3D point clouds, mesh generation, and libraries such as Open3D, Trimesh, or PCL
Familiarity with 3D reconstruction pipelines (e.g., COLMAP, NerfStudio, Photogrammetry tools)
Strong knowledge of coordinate frames, and camera calibration
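Camera pose estimation, one of the qualifications above, reduces to a Perspective-n-Point solve once 2D-3D correspondences are known. The hedged OpenCV sketch below uses made-up points and intrinsics purely to illustrate the workflow.

```python
# Hedged sketch: recover camera pose from 2D-3D correspondences with OpenCV's solvePnP.
# Object points, image points, and intrinsics are invented for illustration only.
import cv2
import numpy as np

# 3D points of a known planar target (corners of a 1 m x 1 m square), in meters.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0],
], dtype=np.float64)

# Corresponding pixel locations detected in the image (placeholder values).
image_points = np.array([
    [320.0, 240.0],
    [520.0, 245.0],
    [515.0, 440.0],
    [325.0, 435.0],
], dtype=np.float64)

# Simple pinhole intrinsics (fx, fy, cx, cy) with no lens distortion, as an assumption.
camera_matrix = np.array([
    [800.0, 0.0, 320.0],
    [0.0, 800.0, 240.0],
    [0.0, 0.0, 1.0],
])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    rotation_matrix, _ = cv2.Rodrigues(rvec)  # axis-angle -> 3x3 rotation
    print("Rotation:\n", rotation_matrix)
    print("Translation (camera frame):\n", tvec.ravel())
```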
Preferred Qualifications:
Master's degree in Robotics, Computer Science, Electrical/Mechanical Engineering, or a related field
Experience with ROS/ROS 2 concepts
Familiarity with robot localization using SLAM and multi-sensor fusion
Experience working with multi-modal sensors: GPS, LiDAR, stereo/depth cameras, IMUs
Proficient in path planning algorithms (both global and local)
Experience developing robotic software stacks for controls, motion planning, sensor integration, and simulation.