
Data engineer jobs in Worcester, MA

- 2,683 jobs
  • Senior Data Engineer

    Pyramid Consulting, Inc. 4.1 company rating

    Data engineer job in Merrimack, NH

    Immediate need for a talented Senior Data Engineer. This is a 12-month contract opportunity with long-term potential, located in Westlake, TX / Merrimack, NH (Hybrid). Please review the job description below and contact me ASAP if you are interested. Job ID: 25-93826. Pay Range: $60 - $65/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location). Key Requirements and Technology Experience: • Key Skills: ETL, SQL, PL/SQL, Informatica, Snowflake, Data modeling, DevOps. • 7 years of experience developing quality data solutions. • Software development experience in the Financial Industry. • Expertise in Oracle PL/SQL development, SQL scripting, and database performance tuning. • Intermediate experience in Java development is preferred. • Solid understanding of ETL tools like Informatica and Data Warehousing platforms like Snowflake. • You enjoy learning new technologies, analyzing data, identifying gaps, issues, and patterns, and building solutions. • You can independently analyze technical challenges, assess impact, and identify innovative solutions. • Strong data modeling skills using Quantitative and Multidimensional Analysis. • Demonstrated understanding of data design concepts - Transactional, Data Mart, Data Warehouse, etc. • Beginner proficiency in Python, REST API, and AWS is a plus. • You are passionate about delivering high-quality software using DevOps, Continuous Integration, and Continuous Delivery (Maven, Jenkins, Git, Docker) practices. • Experience developing software using Agile methodologies (Kanban and Scrum). • Strong analytical and problem-solving skills. • Excellent written and oral communication skills. Our client is a leader in the Financial Industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration. Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
    $60-65 hourly 1d ago
  • Senior Data Engineer

    Cyber Space Technologies LLC 4.4 company rating

    Data engineer job in Boston, MA

    Data Engineer (HRIS experience) - Boston, MA (4 days onsite a week). Key Responsibilities: • Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting. • Design, build, and maintain Snowflake objects including tables, views, and stored procedures. • Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes. • Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets. • Manage technical metadata and documentation for data warehousing and migration efforts. • Optimize performance of data transformation pipelines and monitor integration performance. • Design, configure, and optimize integrations between Workday and third-party applications. • Participate in system testing including unit, integration, and regression phases. • Support data analysis needs throughout the implementation lifecycle. Required Experience & Skills: • Experience with Snowflake or similar data warehouse platforms. • Strong SQL skills and experience with data transformation tools. • Experience with ETL processes and data validation techniques. • Understanding of HR data structures and relationships. • Excellent analytical and problem-solving abilities. • Experience developing with Python. • Experience architecting a data warehousing solution leveraging data from Workday or other HRIS platforms to support advanced reporting and insights for an organization. Preferred Experience & Skills: • Experience developing and supporting a data warehouse serving the HR domain. • Experience with data platforms where SCD Type 2 was required. • Experience with data visualization tools such as Tableau. • Experience architecting or working with ELT technologies (such as dbt) and data architectures. • Understanding of HR processes, compliance requirements, and industry best practices.
    $79k-109k yearly est. 2d ago
  • Data Engineer

    Mastech Digital 4.7 company rating

    Data engineer job in Smithfield, RI

    Smithfield, RI and Westlake, TX - Full Time. Must-Have Skills: • Strong SQL for querying and data validation. • Oracle. • AWS. • ETL experience with Java Spring Batch (for the ETL data transformation). Note: the ETL work is done in Java, so Python is only a nice-to-have. Java experience for ETL is required, which makes this role difficult to source for; data engineers without Java will not be considered.
    $82k-108k yearly est. 2d ago
  • DBT SME - Data Modeling, Analytics Engineer

    Gardner Resources Consulting, LLC

    Data engineer job in Boston, MA

    We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global software organization. This role will play a key part in organizing and maturing that landscape as part of a multi-year strategic roadmap. This position is ideal for a senior-level analytics engineer who can architect data solutions, build robust models, and stay hands-on with development. This is a remote role with occasional onsite meetings. Candidates must currently be local to the Boston area and reside in MA/CT/RI/NH/ME. Long-term contract; W2 or C2C. Highlights: • Architect and build new data models using dbt and modern modeling techniques. • Partner closely with leadership and business teams to translate complex requirements into technical solutions. • Drive structure and clarity within a growing analytics ecosystem. Qualifications: • Bachelor's degree in Economics, Mathematics, Computer Science, or a related field. • 10+ years of experience in an Analytics Engineering role. • Expert in SQL and dbt with demonstrated modeling experience. • Data Modeling & Transformation: design and implement robust, reusable data models within the warehouse; develop and maintain SQL transformations in dbt. • Data Pipeline & Orchestration: build and maintain reliable data pipelines in collaboration with data engineering; utilize orchestration tools (Airflow) to manage and monitor workflows; manage and support dbt environments and transformations. • Hands-on experience with BigQuery or other cloud data warehouses. • Proficiency in Python and Docker. • Experience with Airflow (Composer), Git, and CI/CD pipelines. • Strong attention to detail and communication skills; able to interact with both technical and business stakeholders. Technical Requirements: • Primary Data Warehouse: BigQuery (mandatory); nice to have: Snowflake, Redshift. • Orchestration: Airflow (GCP Composer). • Languages: expert-level SQL / dbt; strong Python required. • Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker. • Modeling Techniques: Data Vault 2.0, 3NF, Dimensional Modeling, etc. • Version Control: Git / CI-CD. • Quality Tools: dbt-Elementary, dbt-Osmosis, or Great Expectations preferred.
    $85k-115k yearly est. 4d ago
  • Senior Data Engineer

    Dewinter Group

    Data engineer job in Boston, MA

    This role is with a Maris Financial Services Partner. Boston, MA - Hybrid role; we are targeting local candidates who can be in the Boston office 3 days per week. 12-month+ contract (or contract-to-hire, if desired). This team oversees critical systems including Snowflake, Tableau, and RDBMS technologies like SQL Server and Postgres. This role will focus on automating database deployments and creating efficient patterns and practices that enhance our data processing capabilities. Key Responsibilities: • Design, enhance, and manage DataOps tools and services to support cloud initiatives. • Develop and maintain scheduled workflows using Airflow. • Create containerized applications for deployment with ECS, Fargate, and EKS. • Build data pipelines to extract, transform, and load (ETL) data from various sources into Apache Kafka, ultimately feeding into Snowflake. • Provide consultation for infrastructure projects to ensure alignment with technical architecture and end-user needs. Qualifications: • Familiarity with Continuous Integration and Continuous Deployment (CI/CD) practices and tools. • Understanding of application stack architectures (e.g., microservices), PaaS development, and AWS environments. • Proficiency in scripting languages such as Bash. • Experience with Python, Go, or C#. • Hands-on experience with Terraform or other Infrastructure as Code (IaC) tools, such as CloudFormation. • Preferred experience with Apache Kafka and Flink. • Proven experience working with Kubernetes. • Strong knowledge of Linux and Docker environments. • Excellent communication and interpersonal skills. • Strong analytical and problem-solving abilities. • Ability to manage multiple tasks and projects concurrently. • Expertise with SQL Server, Postgres, and Snowflake. • In-depth experience with ETL/ELT processes.
    $85k-115k yearly est. 4d ago
  • Data Engineer (HR Data warehousing exp)

    Ness Digital Engineering

    Data engineer job in Boston, MA

    Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company's future with cloud and data. For more information, visit ************ Data Engineer (HR Data warehousing exp) - Boston, MA (3-4 days onsite a week). Key Responsibilities: • Translate business needs into data modeling strategies and implement Snowflake data models to support HR analytics, KPIs, and reporting. • Design, build, and maintain Snowflake objects including tables, views, and stored procedures. • Develop and execute SQL or Python transformations, data mappings, cleansing, validation, and conversion processes. • Establish and enforce data governance standards to ensure consistency, quality, and completeness of data assets. • Manage technical metadata and documentation for data warehousing and migration efforts. • Optimize performance of data transformation pipelines and monitor integration performance. • Design, configure, and optimize integrations between Workday and third-party applications. • Participate in system testing including unit, integration, and regression phases. • Support data analysis needs throughout the implementation lifecycle. Required Experience & Skills: • Experience with Snowflake or similar data warehouse platforms. • Strong SQL skills and experience with data transformation tools. • Experience with ETL processes and data validation techniques. • Understanding of HR data structures and relationships. • Excellent analytical and problem-solving abilities. • Experience developing with Python. • Experience architecting a data warehousing solution leveraging data from Workday or other HRIS platforms to support advanced reporting and insights for an organization. Preferred Experience & Skills: • Experience developing and supporting a data warehouse serving the HR domain. • Experience with data platforms where SCD Type 2 was required. • Experience with data visualization tools such as Tableau. • Experience architecting or working with ELT technologies (such as dbt) and data architectures. • Understanding of HR processes, compliance requirements, and industry best practices. Ness is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
    $85k-115k yearly est. 1d ago
  • Senior Data Engineer

    Delmar Nord

    Data engineer job in Boston, MA

    We are seeking a highly skilled Senior Data Engineer for a client based in Boston. Ideal candidates will have: • Power BI or Tableau experience. • SQL experience. • AWS Cloud experience.
    $85k-115k yearly est. 4d ago
  • Junior Data Engineer

    Quintrix, By Mindlance

    Data engineer job in Boston, MA

    Job Title: Junior Data Engineer. W2 candidates only. We are on the lookout for engineers who are open to upskilling into the exciting world of Data Engineering. This opportunity is for our client, a top-tier insurance company, and includes a 2-3 week online pre-employment training program (15 hours per week), conveniently scheduled after business hours. Participants who successfully complete the program will receive a $500 stipend. This is a fantastic chance to gain in-demand skills, hands-on experience, and a pathway into a dynamic tech role. Key Responsibilities: • Assist in the design and development of big data solutions using technologies such as Spark, Scala, AWS Glue, Lambda, SNS/SQS, and CloudWatch. • Develop applications primarily in Scala and Python with guidance from senior team members. • Write and optimize SQL queries, preferably with Redshift; experience with Snowflake is a plus. • Work on ETL/ELT processes and frameworks to ensure smooth data integration. • Participate in development tasks, including configuration, writing unit test cases, and testing support. • Help identify and troubleshoot defects and assist in root cause analysis during testing. • Support performance testing and production environment troubleshooting. • Collaborate with the team on best practices, including Git version control and CI/CD deployment processes. • Continuously learn and grow your skills in big data technologies and cloud platforms. Prerequisites: • Recent graduate with a degree in Computer Science, Information Technology, Engineering, or related fields. • Basic experience or coursework in Scala, Python, or other programming languages. • Familiarity with SQL and database concepts. • Understanding of ETL/ELT concepts is preferred. • Exposure to AWS cloud services (Glue, Lambda, SNS/SQS) is a plus but not mandatory. • Strong problem-solving skills and eagerness to learn. • Good communication and teamwork abilities.
Selection Process & Training: • Online assessment and technical interview by Quintrix. • Client interview(s). • 2-3 weeks of pre-employment online instructor-led training. Stipend paid during training: $500. Benefits: • 2 weeks of Paid Vacation. • Health Insurance including Vision and Dental. • Employee Assistance Program. • Dependent Care FSA. • Commuter Benefits. • Voluntary Life Insurance. • Relocation Reimbursement. Who is Quintrix? Quintrix is on a mission to help individuals develop their technology talent. We have helped hundreds of candidates kick-start their careers in tech. You will be “paid-to-learn”, qualifying you for a high-paying tech job with one of our top employers. To learn more about our candidate experience go to *************************************
    $85k-115k yearly est. 4d ago
  • Senior Backend Data Engineer

    Tekvalue It Solutions

    Data engineer job in Boston, MA

    Hybrid - Boston, MA; Richmond, VA; or McLean, VA. Long Term - Ex-Capital One. Required Skills & Experience: • 5-8+ years in backend or data engineering. • PySpark & Python (expert level). • AWS: hands-on experience with Glue, Lambda, EC2; Step Functions preferred. • Strong background in ETL/ELT and large-scale ingestion pipelines. • Experience supporting accounting/reporting data flows or similar financial processes. • Knowledge of secure file transfer, validation, audit, and compliance workflows. • Solid understanding of distributed systems, CI/CD, and DevOps practices.
    $85k-115k yearly est. 1d ago
  • Senior Data Engineer

    Basil Systems

    Data engineer job in Boston, MA

    Hi, this is Eric 👋 We're hiring a stellar Data Engineer to join our engineering org at Basil Systems. At Basil Systems, we're revolutionizing healthcare data access and insights for the life sciences industry. We've built powerful platforms that help pharmaceutical and medical device companies navigate complex regulatory landscapes, accelerate product development, and ultimately bring life-saving innovations to market faster. Our SaaS platforms transform disconnected data sources into actionable intelligence, empowering organizations to make data-driven decisions that improve patient outcomes and save lives. The Role: We are seeking a Senior Data Engineer to own and advance the data infrastructure that powers our healthcare insights platform. As our engineering team scales and we expand our data capabilities, we need someone who can build reliable, scalable pipelines while ensuring data quality across increasingly complex regulatory sources. Key Responsibilities: • Design, build, and maintain robust ETL processes for healthcare regulatory data. • Integrate new data sources as we onboard customers and expand platform capabilities. • Optimize pipeline performance and reliability. • Ensure data accuracy and consistency across complex transformation workflows. Qualifications: • 5+ years of professional experience as a data engineer or in a similar role. • Experience with Apache Spark and distributed computing. • Familiarity with common ML algorithms and their applications. • Knowledge of, or willingness to learn and work with, Generative AI technologies. • Experience developing for distributed cloud platforms. • Experience with MongoDB / ElasticSearch and technologies like BigQuery. • Strong commitment to engineering best practices. Nice-to-Haves: • Solid understanding of modern security practices, especially in healthcare data contexts. • Subject matter expertise in Life Sciences / Pharma / MedTech. This role might not be for you if... • You're a heavy process advocate and want enterprise-grade Scrum or rigid methodologies. • You need perfect clarity before taking action. • You have a big-company mindset. What We Offer: • Competitive salary. • Health and vision benefits. • Attractive equity package. • Flexible work environment (remote-friendly). • Opportunity to work on impactful projects that are helping bring life-saving medical products to market. • Be part of a mission-driven team solving real healthcare challenges at a critical scaling point. Our Culture: At Basil Systems, we value flexibility and support a distributed team. We actively employ and support remote team members across different geographies, allowing you to work when, where, and how you work best. We are committed to building a diverse, inclusive, and safe work environment for everyone. Our team is passionate about using technology to make a meaningful difference in healthcare. How to Apply: If you're excited about this opportunity and believe you'd be a great fit for our team, please send your resume and a brief introduction to *****************************. Basil Systems is an equal opportunity employer. We welcome applicants of all backgrounds and experiences.
    $85k-115k yearly est. 4d ago
  • Data Science Engineer

    Ket Software

    Data engineer job in Boston, MA

    Role: Data Science Engineer. Note: in-person interview required. This is a 12+ month, ongoing contract with our insurance client in Boston, MA; 4x hybrid per week with a mandatory final onsite interview. We are seeking a talented Data Science Engineer to join our team and contribute to the development and implementation of advanced data solutions using technologies such as AWS Glue, Python, Spark, Snowflake Data Lake, S3, SageMaker, and machine learning (ML). Overview: As a Data Science Engineer, you will play a crucial role in designing, building, and optimizing data pipelines, machine learning models, and analytics solutions. You will work closely with cross-functional teams to extract actionable insights from data and drive business outcomes. Responsibilities: • Develop and maintain ETL pipelines using AWS Glue for data ingestion, transformation, and integration from various sources. • Utilize Python and Spark for data preprocessing, feature engineering, and model development. • Design and implement data lake architecture using Snowflake Data Lake, Snowflake data warehouse, and S3 for scalable and efficient storage and processing of structured and unstructured data. • Leverage SageMaker for model training, evaluation, deployment, and monitoring in production environments. • Collaborate with data scientists, analysts, and business stakeholders to understand requirements, develop predictive models, and generate actionable insights. • Conduct exploratory data analysis (EDA) and data visualization to communicate findings and trends effectively. • Stay updated with advancements in machine learning algorithms, techniques, and best practices to enhance model performance and accuracy. • Ensure data quality, integrity, and security throughout the data lifecycle by implementing robust data governance and compliance measures. Requirements added by the job poster: • 4+ years of work experience with Amazon Web Services (AWS). • 2+ years of work experience with Machine Learning. • 3+ years of work experience with Python (Programming Language). • Accept a background check. • Working in a hybrid setting.
    $85k-115k yearly est. 3d ago
  • Data Scientist

    Sotalent

    Data engineer job in Johnston, RI

    Job Title: Data Scientist. Type: Full Time. Our client is looking for a Data Scientist to design and implement advanced analytics and AI solutions that solve real-world business challenges. This role offers the opportunity to innovate, explore cutting-edge technologies, and make a measurable impact. What You'll Do: • Lead end-to-end data science projects, from concept to deployment. • Apply statistical modeling, machine learning, and deep learning to large-scale problems. • Collaborate with cross-functional teams to translate business needs into data-driven strategies. • Drive innovation through experimentation and advanced analytics. What We're Looking For: • Ph.D. in Statistics, Biostatistics, or Operations Research with 2+ years of experience, OR a Master's with 5+ years in data science. • Expertise in Python, R, or SQL. • Cloud platforms (e.g., Databricks). • GLMs, model regularization, probability distributions, hypothesis testing. • Machine learning (Random Forest, Gradient Boosting, clustering). • Simulation, experimental design, non-parametric statistics. • Proven experience leading full-cycle data science projects. • Background in risk management or insurance is a plus. Benefits & Perks: • Competitive compensation and performance-based incentives. • Comprehensive health and wellness programs. • Retirement plans (401(k) and pension). • Flexible work arrangements and generous paid time off. • Tuition reimbursement and continuous learning opportunities.
    $74k-104k yearly est. 1d ago
  • Data Modelling Architect

    Wissen Technology

    Data engineer job in Boston, MA

    The Wissen team continues to expand its footprint in Canada & the USA. More openings to come as we continue to grow the team! Please read below for a brilliant career opportunity. Role: Data Modelling Architect. Title: Vice President. Location: Boston, MA (Day 1 Onsite/Hybrid). Mode of Work: 3 days per week onsite required. Required experience: 10+ years. Job Description: We are looking for an experienced Data Modelling Architect to design and optimize enterprise data models supporting risk, regulatory, and financial domains. The role requires strong expertise in conceptual, logical, and physical data modelling, along with working knowledge of Financial Risk or Operational Risk frameworks used in global banking environments. Required Skills: • 10-12 years of strong experience in data modelling and data architecture. • Expertise in ER modelling, dimensional modelling, and industry-standard modelling methodologies. • Hands-on experience with tools like Erwin and ER/Studio. • Strong SQL and experience with relational databases and distributed/cloud data platforms. • Working knowledge of Financial Risk, Operational Risk, or regulatory risk data (Credit Risk, Market Risk, Liquidity Risk, RCSA, Loss Events, KRIs, etc.). • Experience supporting regulatory frameworks such as Basel II/III, CCAR, ICAAP, or similar. • Ability to work with cross-functional teams across global locations. • Excellent communication and documentation skills. Benefits: • Healthcare insurance for you and your family (medical, dental, vision). • Short/long-term disability insurance. • Life insurance. • Accidental death & disability insurance. • 401K. • 3 weeks of Paid Time Off. • Support and fee coverage for immigration needs. • Remote office setup stipend. • Support for industry certifications. • Additional cash incentives. • Re-skilling opportunities to transition between technologies.
Schedule: Monday to Friday. Work Mode: Hybrid. Job Type: Full-time. We are: A high-end technical consulting firm built and run by highly qualified technologists. Our workforce consists of 5000+ highly skilled professionals, with leadership from Wharton, MIT, IITs, IIMs, and NITs and decades of experience at Goldman Sachs, Morgan Stanley, MSCI, Deutsche Bank, Credit Suisse, Verizon, British Telecom, ISRO, etc. Without any external funding or investments, Wissen Technology has grown its revenues by 100% every other year since it started as a subsidiary of Wissen Group in 2015. We have a global presence with offices in the US, India, UK, Australia, Mexico, and Canada. You are: A true tech or domain ninja. Or both. Comfortable working in a quickly growing, profitable startup, with a “can do” attitude and a willingness to take on any task thrown your way. You will: • Develop and promote the company's culture of engineering excellence. • Define, develop, and deliver solutions at a top-tier investment bank or another esteemed client. • Perform other duties as needed. Your Education and Experience: We value candidates who can execute on our vision and help us build an industry-leading organization. • Graduate-level degree in computer science, engineering, or a related technical field. Wissen embraces diversity and is an equal opportunity employer. We are committed to building a team that represents a variety of backgrounds, skills, and abilities. We believe that the more inclusive our team is, the better our work will be. All qualified applicants, including but not limited to LGBTQ+, Minorities, Females, the Disabled, and Veterans, are encouraged to apply. About Wissen Technology: The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for diverse industries, including Banking, E-commerce, Telecom, Healthcare, Manufacturing, and Energy. We help clients build world-class products. We have offices in the US, India (Bangalore, Hyderabad, Chennai, Gurugram, Mumbai, Pune), UK, Australia, Mexico, Vietnam, and Canada. We empower businesses with a dynamic portfolio of services and accelerators tailored to today's digital demands and based on a future-ready technology stack. Our services include Industry-Leading Custom Software Development, AI-Driven Software Engineering, Generative AI & Machine Learning, Real-Time Data Analytics & Insights, Interactive Data Visualization & Decision Intelligence, Intelligent Process Automation, Multi-Cloud & Hybrid Cloud Strategies, Cross-Platform Mobile Experiences, CI/CD-Powered Agile DevOps, Automated Quality Engineering, and cutting-edge integrations. Certified as a Great Place to Work for five consecutive years (2020-2025) and recognized as a Top 20 AI/ML vendor by CIO Insider, Wissen Group has delivered multimillion-dollar projects for over 20 Fortune 500 companies. Wissen Technology delivers exceptional value on mission-critical projects through thought leadership, ownership, and reliable, high-quality, on-time delivery. Our industry-leading technical expertise stems from the talented professionals we attract. Committed to fostering their growth and providing top-tier career opportunities, Wissen ensures an outstanding experience and value for our clients and employees. We Value: • Perfection: pursuit of excellence through continuous improvement. • Curiosity: fostering continuous learning and exploration. • Respect: valuing diversity and mutual respect. • Integrity: commitment to ethical conduct and transparency. • Transparency: open communication and trust.
Website: ************** Glassdoor Reviews: ************************************************************* Wissen Thought leadership: https://**************/articles/ Latest in Wissen in CIO Insider: ********************************************************************************************************************** Employee Speak: *************************************************************** LinkedIn: ************************************************** About Wissen Interview Process: https://**************/blog/we-work-on-highly-complex-technology-projects-here-is-how-it-changes-whom-we-hire/ Wissen: A Great Place to Work https://**************/blog/wissen-is-a-great-place-to-work-says-the-great-place-to-work-r-institute-india https://**************/blog/here-is-what-ownership-and-commitment-mean-to-wissenites/ Wissen | Driving Digital Transformation A technology consultancy that drives digital innovation by connecting strategy and execution, helping global clients to strengthen their core technology. Job Type: Full-time Work Location: In person
    $93k-127k yearly est. 14h ago
  • Data Architect - HRIS

    TEK Ninjas

    Data engineer job in Boston, MA

    Job Title: Sr. Data Architect - HR Data & Analytics
    Duration: 6-12+ Months
    Notes: Snowflake, ETL/ELT, HRIS, SQL, Python

    The Data Architect will play a key role in designing and implementing the enterprise HR data architecture to support HR analytics, KPIs, and reporting initiatives. This role involves translating complex business requirements into scalable, governed data solutions built on Snowflake and integrated with Workday and other HR systems. The ideal candidate will combine deep technical expertise with a strong understanding of HR data domains, ensuring data integrity, accessibility, and analytical value across the organization.

    Key Responsibilities:
    Architect & Model: Design and implement scalable, efficient Snowflake data models to support HR analytics, workforce planning, and KPI reporting.
    Data Integration: Develop and optimize integrations between Workday, Snowflake, and downstream analytics platforms; ensure seamless, accurate data flow across systems.
    Governance & Quality: Define and enforce data governance, quality, and metadata management standards to ensure data consistency and compliance.
    Documentation & Metadata: Maintain comprehensive technical documentation and data dictionaries for warehouse structures, transformations, and integrations.
    Performance Optimization: Monitor and tune ETL/ELT pipelines, ensuring high-performance data transformation and loading processes.
    Collaboration: Partner with HR, Data Engineering, and Analytics teams to translate business logic into reusable and governed data assets.
    Testing & Validation: Participate in unit, integration, and regression testing to validate data pipelines and ensure data accuracy.
    Lifecycle Support: Support data analysis and troubleshooting across the full implementation and operational lifecycle of HR data solutions.

    Required Experience & Skills:
    Proven experience architecting and implementing solutions on Snowflake or similar cloud data warehouse platforms.
    Advanced SQL skills and hands-on experience with data transformation and pipeline optimization tools.
    Strong understanding of ETL/ELT frameworks, data validation, and reconciliation techniques.
    Demonstrated experience working with HR data structures, Workday, or other HRIS systems.
    Strong analytical mindset and problem-solving ability, with attention to data integrity and business context.
    Experience with Python for data engineering, automation, or orchestration tasks.
    Track record of designing data warehouses or analytical platforms leveraging HR data to drive insights and advanced reporting.

    Preferred Experience & Skills:
    Experience building and supporting data warehouses specifically for HR and People Analytics domains.
    Hands-on experience with Slowly Changing Dimensions (SCD Type 2) and historical data management.
    Proficiency with data visualization tools such as Tableau or Power BI.
    Experience with ELT frameworks (e.g., dbt) and modern data architecture patterns (e.g., Data Vault, Medallion Architecture).
    Familiarity with HR processes, compliance standards, and industry best practices related to HR data management and reporting.
    Experience working in an enterprise environment with cross-functional collaboration between HR, Finance, and Technology teams.
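The preferred skills above call out Slowly Changing Dimensions (SCD Type 2) and historical data management. In Snowflake this is typically implemented with a MERGE, but the row-versioning bookkeeping can be sketched in plain Python; the `DimRow` shape and `apply_scd2` helper below are illustrative, not from the posting.

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    employee_id: str
    department: str
    valid_from: date
    valid_to: Optional[date]   # None while the row is the current version
    is_current: bool

def apply_scd2(history, employee_id, new_department, effective):
    """Close the open row for the employee and append a new version.

    Type 2 semantics: never overwrite an attribute in place; expire the
    old row (valid_to, is_current=False) and add a fresh current row.
    """
    updated, changed = [], False
    for row in history:
        if (row.employee_id == employee_id and row.is_current
                and row.department != new_department):
            updated.append(replace(row, valid_to=effective, is_current=False))
            changed = True
        else:
            updated.append(row)
    if changed:
        updated.append(DimRow(employee_id, new_department, effective, None, True))
    return updated
```

A point-in-time query then filters on `valid_from <= as_of` and (`valid_to` is null or `valid_to > as_of`), which is what makes historical workforce reporting possible.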
    $93k-127k yearly est. 2d ago
  • Azure Data Factory Architect

    Smart It Frame LLC

    Data engineer job in Newton, MA

    Role: Azure Data Factory Architect
    Type: Contract

    About Smart IT Frame: At Smart IT Frame, we connect top talent with leading organizations across the USA. With over a decade of staffing excellence, we specialize in IT, healthcare, and professional roles, empowering both clients and candidates to grow together.

    Job Summary: We are seeking an experienced Architect with 7 to 9 years of experience to join our team in a hybrid work model. The ideal candidate will have expertise in Snowflake Tasks and Azure Data Factory, with a strong understanding of Industry Essentials and Finance & Accounting. This role involves designing and implementing data solutions that align with business objectives, ensuring data integrity and efficiency. The position does not require travel and operates during day shifts.

    Responsibilities:
    Design and implement scalable data architecture solutions using Snowflake Tasks and Azure Data Factory to support business needs.
    Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
    Ensure data integrity, security, and compliance with industry standards and regulations.
    Optimize data processing workflows to enhance performance and reduce latency.
    Provide technical guidance and support to development teams throughout the project lifecycle.
    Develop and maintain documentation for data architecture and processes to ensure knowledge sharing and continuity.
    Conduct regular assessments of data systems to identify areas for improvement and implement necessary changes.
    Monitor and troubleshoot data pipelines to ensure seamless data flow and address any issues promptly.
    Stay updated with the latest industry trends and technologies to continuously improve data solutions.
    Work closely with stakeholders to understand business objectives and align data architecture accordingly.
    Implement best practices for data management and governance to ensure data quality and consistency.
    Collaborate with IT and business teams to integrate data solutions with existing systems and platforms.
    Provide training and support to team members on data architecture and related technologies.

    Qualifications:
    Strong expertise in Snowflake Tasks and Azure Data Factory, with a proven track record of successful implementations.
    Solid understanding of Industry Essentials and Finance & Accounting, with the ability to apply domain knowledge to data solutions.
    Excellent problem-solving skills and the ability to work independently as well as part of a team.
    Strong communication skills to effectively collaborate with technical and non-technical stakeholders.
    Commitment to continuous learning and staying updated with the latest advancements in data architecture and related technologies.

    📩 Apply today or share profiles at ***************************
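Azure Data Factory pipelines are defined as parameterized JSON documents, which is why "reusable data solutions" here largely means templating pipeline definitions. As a hedged sketch, the helper below assembles a minimal copy-style pipeline dict from parameters; the field names loosely mirror ADF's model but are illustrative, not the exact ARM schema, and the dataset names are invented.

```python
import json

def copy_pipeline(name, source_dataset, sink_dataset, schedule_cron):
    """Assemble a minimal ADF-style pipeline definition as a dict.

    One template function can stamp out many pipelines that differ only
    in source, sink, and schedule, instead of hand-editing JSON per feed.
    """
    return {
        "name": name,
        "properties": {
            # Runtime parameters let a trigger pass in the load window.
            "parameters": {"windowStart": {"type": "string"}},
            "activities": [{
                "name": f"Copy_{source_dataset}_to_{sink_dataset}",
                "type": "Copy",
                "inputs": [{"referenceName": source_dataset}],
                "outputs": [{"referenceName": sink_dataset}],
            }],
            "annotations": [f"schedule:{schedule_cron}"],
        },
    }

# Example: a daily finance feed from a hypothetical SAP extract to Snowflake.
pipeline = copy_pipeline("pl_fin_daily", "ds_sap_gl", "ds_snowflake_gl", "0 2 * * *")
pipeline_json = json.dumps(pipeline, indent=2)
```

In a real deployment the generated definition would be checked into source control and published through CI/CD rather than created by hand in the portal.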
    $93k-127k yearly est. 1d ago
  • Data Architect

    Hyrhub

    Data engineer job in Newton, MA

    We are seeking a seasoned Azure Data Architect with deep expertise in the Microsoft Fabric platform to design, develop, and govern enterprise-scale analytics and data-platform solutions. The candidate will be responsible for designing, developing, and maintaining scalable, secure, and efficient data integration pipelines using Microsoft Fabric capabilities. This role requires strong expertise in Azure Data Factory, Azure Synapse / Data Lake, Data Lake Storage Gen2, and associated Azure data services, along with hands-on experience in ETL/ELT development, performance tuning, and automation.

    Key Responsibilities:
    Collaborate with business stakeholders and senior leadership to translate business requirements into a coherent data-platform architecture.
    Define and maintain the data-architecture roadmap, including data-ingestion, transformation, storage, and analytics layers using Microsoft Fabric.
    Design end-to-end data solutions: ingestion pipelines, lakehouses/warehouses, semantic layer, analytics consumption, and real-time capabilities.
    Architect and guide modeling of data (conceptual, logical, physical), ensuring consistency, performance, and reusability.
    Develop and manage lakehouse and warehouse architectures (using Fabric Lakehouse/Warehouse, Delta Lake, etc.).
    Design and develop data pipelines using Azure Data Factory for ingesting, transforming, and orchestrating data from multiple on-prem and cloud sources.
    Implement ETL/ELT processes to move data from various sources (SQL, APIs, files, SAP, etc.) to Azure Data Lake, Azure Synapse, or Databricks.
    Oversee migration of existing platforms (on-premises or legacy cloud systems) to a Fabric-centric architecture.
    Work with data-engineering and analytics teams to implement solutions (e.g., Azure Data Factory/SQL/Synapse, Fabric pipelines, OneLake).
    Build and maintain parameterized, reusable ADF pipelines with dynamic configurations.
    Integrate ADF with Azure Functions, Logic Apps, and DevOps CI/CD pipelines.
    Ensure data quality, data governance, security, and compliance across Fabric solutions.
    Monitor, tune, and optimize performance of data pipelines and storage solutions to ensure efficiency, cost-effectiveness, and reliability.

    Qualifications:
    Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
    Strong problem-solving skills and attention to detail.
    Experience working in Agile/Scrum development teams is preferred.
    Certifications such as Microsoft Certified: Fabric Data Engineer Associate, Azure Solutions Architect, or similar.
    Exposure to customer data platforms (MS Customer Insights Data and MS Customer Insights Journeys).
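Lakehouse architectures like the one described are commonly organized as medallion layers: land raw data (bronze), clean and type it (silver), then aggregate it for consumption (gold). A minimal Python sketch of that refinement, with invented records and field names purely for illustration:

```python
# Bronze: raw ingested rows, untyped and possibly dirty.
bronze = [
    {"order_id": "1", "amount": "19.99", "region": "NE"},
    {"order_id": "2", "amount": "bad",   "region": "NE"},  # malformed value
    {"order_id": "3", "amount": "5.00",  "region": "SW"},
]

def to_silver(rows):
    """Silver: keep only rows whose amount parses, and cast types."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": r["order_id"],
                        "amount": float(r["amount"]),
                        "region": r["region"]})
        except ValueError:
            continue  # a real pipeline would quarantine the bad record
    return out

def to_gold(rows):
    """Gold: aggregate revenue per region for the reporting layer."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

The point of keeping bronze immutable is that silver and gold can always be rebuilt from it when cleaning rules or aggregations change.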
    $93k-127k yearly est. 14h ago
  • ETL Data Engineer with Spring Batch Experience - SHADC5693360

    Compunnel Inc. 4.4company rating

    Data engineer job in Smithfield, RI

    Job Title: ETL Data Engineer with Spring Batch Experience - W2 only - We can provide sponsorship
    Duration: Long Term

    Must Haves:
    Strong SQL for querying and data validation
    Oracle
    AWS
    ETL experience with Java Spring Batch (for the ETL data transformation). Note: the ETL work is done in Java, so Python is only a nice to have.

    The Expertise and Skills You Bring:
    Bachelor's or Master's Degree in a technology-related field (e.g., Engineering, Computer Science, etc.) required, with 5+ years of working experience
    4+ years of Java development utilizing Spring frameworks. Experience writing batch jobs with Spring Batch is a must
    2+ years of experience developing applications that run in AWS, with focus on AWS Batch, S3, IAM
    3+ years working with SQL (ANSI SQL, Oracle, Snowflake)
    2+ years of Python development
    Experience with Unix shell scripting (bash, ksh) and scheduling/orchestration tools (Control-M)
    Strong data modeling skills with experience working with 3NF and Star Schema data models
    Proven data analysis skills; not afraid to work in a complex data ecosystem
    Hands-on experience with SQL query optimization and tuning to improve performance is desirable
    Experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Terraform, CloudFormation)
    Experience in Agile methodologies (Kanban and SCRUM)
    Experience building and deploying containerized applications using Docker
    Work experience in the financial services industry is a plus
    Proven track record of handling ambiguity and working in a fast-paced environment, either independently or collaboratively
    Good interpersonal skills to work with multiple teams within the business unit and across the organization
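The core of Spring Batch is chunk-oriented processing: items are read one at a time, transformed, and written in fixed-size chunks so each chunk can commit (or retry) independently. The posting's stack is Java (Spring Batch formalizes this as ItemReader, ItemProcessor, and ItemWriter); purely as an illustration of the model, here is a minimal Python sketch:

```python
def run_chunked_step(reader, processor, writer, chunk_size=3):
    """Drive a read -> process -> write loop in commit-sized chunks."""
    chunk = []
    for item in reader:
        processed = processor(item)
        if processed is not None:      # returning None mimics filtering an item out
            chunk.append(processed)
        if len(chunk) >= chunk_size:
            writer(chunk)              # in Spring Batch, this is a transaction boundary
            chunk = []
    if chunk:
        writer(chunk)                  # flush the final partial chunk

# Toy run: read ids 1..7, drop even ids, scale the rest, write in pairs.
written = []
run_chunked_step(
    reader=iter(range(1, 8)),
    processor=lambda n: n * 10 if n % 2 else None,
    writer=written.append,
    chunk_size=2,
)
```

Chunking is what lets a failed batch restart from the last committed chunk instead of reprocessing the whole input, which matters when the input is millions of rows.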
    $79k-104k yearly est. 3d ago
  • Sr. Data Architect

    GBIT (Global Bridge Infotech Inc.)

    Data engineer job in Boston, MA

    Job Title: Sr. Data Architect - HR Data & Analytics (Architect level required)

    The Data Architect will play a key role in designing and implementing the enterprise HR data architecture to support KKR's global HR analytics, KPIs, and reporting initiatives. This role involves translating complex business requirements into scalable, governed data solutions built on Snowflake and integrated with Workday and other HR systems. The ideal candidate will combine deep technical expertise with a strong understanding of HR data domains, ensuring data integrity, accessibility, and analytical value across the organization.

    Key Responsibilities:
    Architect & Model: Design and implement scalable, efficient Snowflake data models to support HR analytics, workforce planning, and KPI reporting.
    Data Integration: Develop and optimize integrations between Workday, Snowflake, and downstream analytics platforms; ensure seamless, accurate data flow across systems.
    Governance & Quality: Define and enforce data governance, quality, and metadata management standards to ensure data consistency and compliance.
    Documentation & Metadata: Maintain comprehensive technical documentation and data dictionaries for warehouse structures, transformations, and integrations.
    Performance Optimization: Monitor and tune ETL/ELT pipelines, ensuring high-performance data transformation and loading processes.
    Collaboration: Partner with HR, Data Engineering, and Analytics teams to translate business logic into reusable and governed data assets.
    Testing & Validation: Participate in unit, integration, and regression testing to validate data pipelines and ensure data accuracy.
    Lifecycle Support: Support data analysis and troubleshooting across the full implementation and operational lifecycle of HR data solutions.

    Required Experience & Skills:
    Proven experience architecting and implementing solutions on Snowflake or similar cloud data warehouse platforms.
    Advanced SQL skills and hands-on experience with data transformation and pipeline optimization tools.
    Strong understanding of ETL/ELT frameworks, data validation, and reconciliation techniques.
    Demonstrated experience working with HR data structures, Workday, or other HRIS systems.
    Strong analytical mindset and problem-solving ability, with attention to data integrity and business context.
    Experience with Python for data engineering, automation, or orchestration tasks.
    Track record of designing data warehouses or analytical platforms leveraging HR data to drive insights and advanced reporting.

    Preferred Experience & Skills:
    Experience building and supporting data warehouses specifically for HR and People Analytics domains.
    Hands-on experience with Slowly Changing Dimensions (SCD Type 2) and historical data management.
    Proficiency with data visualization tools such as Tableau or Power BI.
    Experience with ELT frameworks (e.g., dbt) and modern data architecture patterns (e.g., Data Vault, Medallion Architecture).
    Familiarity with HR processes, compliance standards, and industry best practices related to HR data management and reporting.
    Experience working in an enterprise environment with cross-functional collaboration between HR, Finance, and Technology teams.
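The required skills include data validation and reconciliation techniques. One common approach (an assumed example, not something the posting prescribes) is comparing a row count plus an order-independent checksum between a source extract and the warehouse target, so a mismatch is detected without sorting or row-by-row diffing:

```python
import hashlib

def table_fingerprint(rows):
    """Row count plus an order-independent checksum for a table extract."""
    digest = 0
    for row in rows:
        h = hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        digest ^= int(h[:16], 16)   # XOR of per-row hashes ignores row order
    return len(rows), digest

def reconcile(source_rows, target_rows):
    """True when both extracts carry the same rows, regardless of order."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

# Same data loaded in a different order should still reconcile.
src = [{"id": 1, "dept": "HR"}, {"id": 2, "dept": "Finance"}]
tgt = [{"id": 2, "dept": "Finance"}, {"id": 1, "dept": "HR"}]
```

In Snowflake the same idea is usually expressed in SQL (e.g., counts plus an aggregate hash per table), with the comparison scheduled after each load.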
    $100k-135k yearly est. 3d ago
  • Software Development Engineer in Test - AI

    New Balance 4.8company rating

    Data engineer job in Boston, MA

    JOB MISSION:
    New Balance is seeking a forward-thinking Senior SDET with a developer's mindset and a passion for AI to lead the next evolution of our global eCommerce test automation platform. This is a unique opportunity for someone who thrives on staying ahead of AI trends and is eager to apply them to modern software quality engineering. You'll drive the transformation of our Selenium and BDD-based test stack into a cutting-edge, AI-augmented platform that supports everything from unit testing to full user journey validation. If you're a builder at heart, excited by the challenge of creating scalable, self-healing, and autonomous testing systems that empower both engineers and developers, this role is for you.

    MAJOR ACCOUNTABILITIES:
    Lead the architectural redesign of our test automation platform, transitioning from a legacy Selenium/C# and BDD stack to a modern, intelligent framework.
    Design, build, and maintain AI-driven test automation platforms that enable reliable, scalable tests across the entire testing pyramid, from unit and integration to full end-to-end user journeys.
    Implement AI-augmented testing strategies to support autonomous test creation, maintenance, and healing.
    Integrate visual validation tools such as Applitools Eyes into the automation pipeline.
    Collaborate cross-functionally with developers, QA engineers, and DevOps to ensure test coverage, reliability, and scalability across global eCommerce sites.
    Evaluate and integrate open-source and commercial tools that enhance test intelligence, observability, and maintainability.
    Advocate for testability by partnering with developers and architects to influence solution design.
    Mentor and guide other SDETs and QA engineers in modern test automation practices and AI-driven testing approaches.
    Continuously research and prototype emerging AI technologies in the testing space to keep the platform at the forefront of innovation.

    REQUIREMENTS FOR SUCCESS:
    5+ years of experience in test automation, with deep expertise in Selenium and C#.
    Strong understanding of BDD frameworks (e.g., SpecFlow, Cucumber) and test design principles.
    Hands-on experience with Selenium extensions such as Healenium, Selenide, or Selenium Grid, with a focus on improving test resilience, scalability, and maintainability.
    Proven ability to implement self-healing test mechanisms and intelligent locator strategies to reduce flakiness and maintenance overhead.
    Familiarity with AI-augmented testing strategies (e.g., intelligent test generation, adaptive test execution).
    Experience integrating Selenium-based frameworks into modern CI/CD pipelines (e.g., Azure DevOps, Jenkins), with AI-driven diagnostics or analytics.
    Proficiency with visual testing tools like Applitools Eyes.
    Experience with modern automation frameworks such as TestRigor, Playwright, or Cypress.
    Exposure to machine learning or NLP concepts applied to software testing.
    Contributions to open-source testing tools or frameworks.
    Strong problem-solving, communication, and mentoring skills.
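The self-healing mechanisms the posting asks for (as popularized by tools like Healenium) boil down to one idea: when a test's primary locator stops matching after a UI change, fall back to alternate locators recorded from earlier successful runs instead of failing. The production stack here is Selenium/C#; the fallback idea itself can be sketched language-agnostically, below in Python with the page modeled as a plain mapping (locator strings and element values are invented):

```python
def find_element(dom, locators):
    """Try each candidate locator in priority order; return the first hit.

    dom:      mapping of locator string -> element (stand-in for a live page)
    locators: ordered candidates, primary first, healed alternates after
    """
    for locator in locators:
        element = dom.get(locator)
        if element is not None:
            return locator, element    # report which locator actually worked
    raise LookupError(f"no locator matched: {locators}")

# A checkout button whose id changed in a redesign; the data-test attribute
# and text-based XPath recorded from earlier runs still match.
page = {
    "css:[data-test=checkout]": "<button>",
    "xpath://button[text()='Checkout']": "<button>",
}
used, element = find_element(page, [
    "id:checkout-btn",                  # primary locator, now broken
    "css:[data-test=checkout]",         # healed alternate, first to match
    "xpath://button[text()='Checkout']",
])
```

Real self-healing frameworks add the other half: logging which fallback fired so the primary locator can be updated, rather than silently masking drift forever.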
    $89k-115k yearly est. 3d ago
  • Software Engineer

    Acro Service Corporation 4.8company rating

    Data engineer job in Boston, MA

    Work schedule: Hybrid

    Key Responsibilities:
    Performance Tuning: Monitor and optimize performance, including query performance, resource utilization, and storage management.
    User and Access Management: Manage user access, roles, and permissions to ensure data security and compliance with organizational policies.
    Data Integration: Support and manage data integration processes, including data loading, transformation, and extraction.
    Troubleshooting and Support: Provide technical support and troubleshooting for Snowflake-related issues, including resolving performance bottlenecks and query optimization.
    Documentation and Reporting: Maintain detailed documentation of system configurations, procedures, and changes. Generate and deliver regular reports on system performance and usage.
    Collaboration: Work closely with data engineers, analysts, and other IT professionals to ensure seamless integration and optimal performance of the Snowflake environment.
    Best Practices: Stay up to date with Snowflake best practices and industry trends. Recommend and implement improvements and upgrades to enhance system functionality and performance.

    Qualifications and Experience:
    5+ years of experience in data architecture, data engineering, or database development.
    2+ years of hands-on experience with Snowflake, including data modeling, performance tuning, and security.
    At a minimum, a Bachelor's degree in Computer Science, Information Technology, or a related field.
    Experience with source control tools (GitHub preferred), ETL/ELT tools, and cloud platforms (AWS preferred).
    Experience or exposure to AI tools.
    Deep understanding of data warehousing concepts, dimensional modeling, and analytics.
    Excellent problem-solving and communication skills.
    Experience integrating Snowflake with BI and reporting tools is a plus.

    Required Skills:
    Strong proficiency in Snowflake architecture, features, and capabilities.
    Knowledge of SQL and Snowflake-specific query optimization.
    Experience with ETL tools and data integration processes.
    Strong proficiency in SQL and Python.
    Strong database design and data modeling experience.
    Experience with data modeling tools.
    Ability to identify and drive continuous improvements.
    Strong problem-solving and analytical skills.
    Demonstrated process-oriented and strategic thinking skills.
    Strong motivation and a desire to continuously learn and grow.
    Knowledge of Snowflake security features including access control, authentication, authorization, encryption, masking, secure views, etc.
    Experience working in AWS cloud environments.
    Experience working with Power BI and other BI, data visualization, and reporting tools.
    Business requirement gathering and aligning to solutions delivery.
    Experience with data integration solutions and tools, data pipelines, and modern ways of automating data using cloud-based and on-premises technologies.
    Experience integrating Snowflake with an identity and access management program such as Azure IDP is a plus.
    Experience with other relational database management systems, cloud data warehouses, and big data platforms is a plus.
    Analytical Skills: Excellent problem-solving and analytical skills with strong attention to detail.
    Communication: Effective communication skills, both written and verbal, with the ability to convey complex technical information to non-technical stakeholders.
    Teamwork: Ability to work independently and collaboratively in a fast-paced environment.

    Preferred Skills:
    Snowflake certification (e.g., SnowPro Core or Advanced Certification).
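The user and access management responsibilities above reflect Snowflake's role-based model: privileges are granted to roles, roles are granted to users, and roles can inherit other roles. Checking what a role can actually do therefore means walking the grant graph. A minimal sketch of that resolution (the role and privilege names here are invented for illustration, not Snowflake built-ins):

```python
# role -> roles it inherits (in Snowflake: GRANT ROLE child TO ROLE parent)
ROLE_GRANTS = {
    "SYSADMIN_LIKE": {"ANALYST"},
    "ANALYST": {"READER"},
    "READER": set(),
}
# role -> privileges granted directly to it
PRIVILEGES = {
    "READER": {("SELECT", "HR_DB.EMPLOYEES")},
    "ANALYST": {("SELECT", "HR_DB.SALARIES")},
    "SYSADMIN_LIKE": {("CREATE TABLE", "HR_DB")},
}

def effective_privileges(role):
    """Union of a role's own privileges and everything it inherits."""
    seen, stack, privs = set(), [role], set()
    while stack:
        r = stack.pop()
        if r in seen:
            continue          # grant graphs can share ancestors; visit once
        seen.add(r)
        privs |= PRIVILEGES.get(r, set())
        stack.extend(ROLE_GRANTS.get(r, ()))
    return privs
```

An audit script built on this shape (fed from `SHOW GRANTS` output rather than hard-coded dicts) is a common way to verify that no role has quietly accumulated more access than policy allows.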
    $98k-136k yearly est. 4d ago

Learn more about data engineer jobs

How much does a data engineer earn in Worcester, MA?

The average data engineer in Worcester, MA earns between $75,000 and $132,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Worcester, MA

$99,000
Job type you want
Full Time
Part Time
Internship
Temporary