
Data engineer jobs in Hamilton, NJ

1,175 jobs
  • Sr Data Engineer Python Serverside

    Canyon Associates (4.2 company rating)

    Data engineer job in White House Station, NJ

    This is a direct-hire, full-time position in a hybrid format (on-site two days a week). Candidates must be US citizens or green card holders; no other US work-authorization status will be accepted. Candidates must live local to the area and be able to drive on-site a minimum of two days a week.
    The tech stack:
    - 7 years of demonstrated server-side development proficiency
    - Programming languages: Python (NumPy, Pandas, Oracle PL/SQL). Other non-interpreted languages like Java, C++, Rust, etc. are a plus. Must be proficient at the intermediate-advanced level of the language (concurrency, memory management, etc.)
    - Design patterns: typical GoF patterns (Factory, Facade, Singleton, etc.)
    - Data structures: maps, lists, arrays, etc.
    - SCM: solid Git proficiency, MS Azure DevOps (CI/CD)
    $97k-129k yearly est. 2d ago
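The listing above calls out GoF design patterns in Python. A minimal Factory sketch (the loader classes and file formats are illustrative, not from the posting):

```python
# Minimal GoF Factory pattern sketch in Python.
# The Loader classes and file formats are illustrative examples only.

class CsvLoader:
    def load(self, path: str) -> str:
        return f"loading CSV from {path}"

class ParquetLoader:
    def load(self, path: str) -> str:
        return f"loading Parquet from {path}"

class LoaderFactory:
    """Factory: callers ask for a loader by format name instead of
    instantiating concrete classes directly."""
    _registry = {"csv": CsvLoader, "parquet": ParquetLoader}

    @classmethod
    def create(cls, fmt: str):
        try:
            return cls._registry[fmt]()
        except KeyError:
            raise ValueError(f"unsupported format: {fmt}")

loader = LoaderFactory.create("csv")
print(loader.load("/data/trades.csv"))  # loading CSV from /data/trades.csv
```

The registry keeps construction logic in one place, so adding a new format touches only the `_registry` mapping.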
  • Data Engineer

    EXL (4.5 company rating)

    Data engineer job in Philadelphia, PA

    Job Title: Data Engineer
    Experience: 5+ years
    We are seeking an experienced Data Engineer with strong expertise in PySpark and data pipeline operations. This role focuses heavily on performance tuning Spark applications, managing large-scale data pipelines, and ensuring high operational stability. The ideal candidate is a strong technical problem-solver, highly collaborative, and proactive in automation and process improvements.
    Key Responsibilities:
    - Data Pipeline Management & Support: Operate and support Business-as-Usual (BAU) data pipelines, ensuring stability, SLA adherence, and timely incident resolution. Identify and implement opportunities for optimization and automation across pipelines and operational workflows.
    - Spark Development & Performance Tuning: Design, develop, and optimize PySpark jobs for efficient large-scale data processing. Diagnose and resolve complex Spark performance issues such as data skew, shuffle spill, executor OOM errors, slow-running stages, and partition imbalance.
    - Platform & Tool Management: Use Databricks for Spark job orchestration, workflow automation, and cluster configuration. Debug and manage Spark on Kubernetes, addressing pod crashes, OOM kills, resource tuning, and scheduling problems. Work with MinIO/S3 storage for bucket management, permissions, and large-volume file ingestion and retrieval.
    - Collaboration & Communication: Partner with onshore business stakeholders to clarify requirements and convert them into well-defined technical tasks. Provide daily coordination and technical oversight to offshore engineering teams. Participate actively in design discussions and technical reviews.
    - Documentation & Operational Excellence: Maintain accurate and detailed documentation, runbooks, and troubleshooting guides. Contribute to process improvements that enhance operational stability and engineering efficiency.
    Required Skills & Qualifications:
    Primary Skills (Must-Have):
    - PySpark: Advanced proficiency in transformations, performance tuning, and Spark internals.
    - SQL: Strong analytical query design, performance tuning, and foundational data modeling (relational & dimensional).
    - Python: Ability to write maintainable, production-grade code with a focus on modularity, automation, and reusability.
    Secondary Skills (Highly Desirable):
    - Kubernetes: Experience with Spark-on-K8s, including pod diagnostics, resource configuration, and log/monitoring tools.
    - Databricks: Hands-on experience with cluster management, workflow creation, Delta Lake optimization, and job monitoring.
    - MinIO / S3: Familiarity with bucket configuration, policies, and efficient ingestion patterns.
    - DevOps: Experience with Git, CI/CD, and cloud environments (Azure preferred).
    $74k-100k yearly est. 2d ago
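The posting above highlights diagnosing Spark data skew. One standard mitigation is key salting: appending a random suffix to hot keys so one key's rows spread across several partitions, then re-aggregating. A sketch of the idea in plain Python (no Spark dependency; the key names are made up):

```python
import random
from collections import defaultdict

# Key salting sketch: a single hot key ("AAPL") would overload one
# Spark partition; salting splits it into N sub-keys that hash to
# different partitions, and results are re-aggregated afterwards.

N_SALTS = 4
rows = [("AAPL", 1)] * 10 + [("MSFT", 1)] * 2  # "AAPL" is the hot key

def salted(key: str) -> str:
    return f"{key}#{random.randrange(N_SALTS)}"

# Stage 1: partial aggregation on salted keys (what each partition does).
partial = defaultdict(int)
for key, val in rows:
    partial[salted(key)] += val

# Stage 2: strip the salt and combine partial results.
final = defaultdict(int)
for skey, val in partial.items():
    final[skey.split("#")[0]] += val

print(dict(final))  # counts match the unsalted aggregation
```

In real PySpark the same two-stage shape appears as an extra `groupBy` on the salted key followed by a `groupBy` on the original key.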
  • Senior Data Engineer

    Realtime Recruitment

    Data engineer job in Philadelphia, PA

    Full-time Perm, Remote - EAST COAST ONLY. Role open to US Citizens and Green Card Holders only.
    We're looking for a Senior Data Engineer to lead the design, build, and optimization of modern data pipelines and cloud-native data infrastructure. This role is ideal for someone who thrives on solving complex data challenges, improving systems at scale, and collaborating across technical and business teams to deliver high-impact solutions.
    What You'll Do:
    - Architect, develop, and maintain scalable, secure data infrastructure supporting analytics, reporting, and operational workflows.
    - Design and optimize ETL/ELT pipelines to integrate data from diverse internal and external sources.
    - Prepare and transform structured and unstructured data to support modeling, reporting, and advanced analysis.
    - Improve data quality, reliability, and performance across platforms and workflows.
    - Monitor pipelines, troubleshoot discrepancies, and ensure accuracy and timely data delivery.
    - Identify architectural bottlenecks and drive long-term scalability improvements.
    - Collaborate with Product, BI, Finance, and engineering teams to build end-to-end data solutions.
    - Prototype algorithms, transformations, and automation tools to accelerate insights.
    - Lead cloud-native workflow design, including logging, monitoring, and storage best practices.
    - Create and maintain high-quality technical documentation.
    - Contribute to Agile ceremonies, engineering best practices, and continuous improvement initiatives.
    - Mentor teammates and guide adoption of data platform tools and patterns.
    - Participate in on-call rotation to maintain platform stability and availability.
    What You Bring:
    - Bachelor's degree in Computer Science or related technical field.
    - 4+ years of advanced SQL experience (Oracle, PostgreSQL, etc.).
    - 4+ years working with Java or Groovy.
    - 3+ years integrating with SOAP or REST APIs.
    - 2+ years with DBT and data modeling.
    - Strong understanding of modern data architectures, distributed systems, and performance optimization.
    - Experience with Snowflake or similar cloud data platforms (preferred).
    - Hands-on experience with Git, Jenkins, CI/CD, and automation/testing practices.
    - Solid grasp of cloud concepts and cloud-native engineering.
    - Excellent problem-solving, communication, and cross-team collaboration skills.
    - Ability to lead projects, own solutions end-to-end, and influence technical direction.
    - Proactive mindset with strong analytical and consultative abilities.
    $81k-111k yearly est. 2d ago
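The posting above asks for DBT and data-modeling experience. The core of an incremental model is a merge/upsert of new source rows into a target table; a minimal sketch of that pattern using Python's built-in sqlite3 (table and column names are invented for illustration):

```python
import sqlite3

# Incremental-model sketch: upsert "new" source rows into a target
# table keyed by id, the way a dbt incremental materialization would.
# Table/column names are illustrative only.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)")
con.execute("INSERT INTO dim_customer VALUES (1, 'Acme', '2024-01-01')")

new_rows = [(1, "Acme Corp", "2024-02-01"),   # changed row
            (2, "Globex", "2024-02-01")]      # brand-new row

# INSERT OR REPLACE gives upsert semantics keyed on the PRIMARY KEY.
con.executemany(
    "INSERT OR REPLACE INTO dim_customer (id, name, updated_at) VALUES (?, ?, ?)",
    new_rows,
)

print(con.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall())
# [(1, 'Acme Corp'), (2, 'Globex')]
```

Warehouse engines express the same operation with `MERGE INTO`; the filter that selects only new/changed source rows is what makes the model incremental.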
  • Data Engineer

    Ztek Consulting (4.3 company rating)

    Data engineer job in Hamilton, NJ

    Key Responsibilities:
    - Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
    - Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
    - Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
    - Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
    - Manage FTP/SFTP file transfers between internal systems and external vendors.
    - Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
    - Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
    Required Skills & Experience:
    - 10+ years of experience in data engineering or production support within financial services or trading environments.
    - Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
    - Strong Python and SQL programming skills.
    - Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
    - Experience with Git, CI/CD pipelines, and Azure DevOps.
    - Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
    - Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
    - Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
    - Excellent communication, problem-solving, and stakeholder management skills.
    $89k-125k yearly est. 5d ago
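Supporting batch jobs and troubleshooting failures, as the listing above describes, usually means wrapping flaky steps (e.g., vendor SFTP pulls) in retry logic. A minimal retry-with-backoff sketch in plain Python (the failing "transfer" is simulated; function names are illustrative):

```python
import time

def retry(fn, attempts: int = 3, base_delay: float = 0.01):
    """Run fn(); on failure, wait base_delay * 2**i and retry.
    Re-raises the last error once attempts are exhausted."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** i)

# Simulated flaky vendor transfer: fails twice, then succeeds.
calls = {"n": 0}
def flaky_transfer():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("vendor SFTP timeout")
    return "file received"

print(retry(flaky_transfer))  # file received
```

Production schedulers (Data Factory, Databricks Workflows) expose the same attempts/backoff knobs declaratively; a script-level wrapper like this covers steps the scheduler can't see.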
  • Data Modeler

    Capgemini (4.5 company rating)

    Data engineer job in Philadelphia, PA

    Role: Data Modeler (Full-time)
    The Senior Data Modeler is responsible for developing advanced data models associated with the enterprise data warehouse (DART), IHG operational systems, and IHG data exchange with limited management oversight. The primary job function entails gathering and assessing business and technical requirements to create/update database objects. Delivery items include using ERwin Data Modeler to create and maintain entity relationship diagrams, DDL, and supporting documentation as needed. Additional job functions include identifying solution design options, advanced data profiling, and subject matter expertise. This role works closely with all development resources, including Business Systems Analysts, Developers, and Project Managers. The Senior Data Modeler works independently with minimal guidance and acts as a resource for colleagues with less experience.
    A solid understanding of the following is required:
    • Software Development Life Cycle (SDLC)
    • Agile Methodology
    • Logical and Physical data modeling
    • Levels of Normalization
    • Abstraction and Generalization
    • Subtyping and Classification
    • Relational model design
    • Dimensional model design (Star Schemas, Snowflake designs)
    • Inmon & Kimball methodologies
    • Master Data Management (MDM)
    • Data Profiling
    • Metadata Management
    • Data security and protecting information
    • ETL processes, BI processes
    • Cloud Platforms, especially Google Cloud Platform
    Requirements:
    • 7+ years of proven experience in Data Modeling
    • 5+ years of experience with advanced SQL query techniques
    • Highly skilled in ERwin, including but not limited to: Diagramming, Complete Compare, Reverse Engineering, Forward Engineering
    • Strong multi-tasking capability, with adaptability to work simultaneously in multiple environments with differing procedures, SME support, and levels of accountability
    • Ability to estimate scope of effort, to prioritize and/or fast-track requirements, and to provide multiple options for data-driven solution design
    • Demonstrated ability to recognize, elicit, and/or decompose complex business requirements
    • Demonstrated ability to perceive patterns and relationships in data
    • Demonstrated ability to design models which integrate data from disparate sources
    • Demonstrated ability to design data models for complex and ragged hierarchies
    • Demonstrated ability to design data models for complicated historical perspectives
    • Understanding of data warehousing and decision support tools and techniques
    • Strong ability to multi-task with several concurrent issues and projects
    • Demonstrated ability to interact effectively with all levels of the organization
    • Strong interpersonal, written, and verbal communication skills
    • Experience advising or mentoring staff regarding data administration, data modeling, and data mapping
    Desired:
    • Experience creating data models for BigQuery, SQL Server, Oracle, and MySQL databases
    • Experience in Healthcare Insurance or related field strongly preferred
    $80k-132k yearly est. 2d ago
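The data modeler role above centers on dimensional design. A star schema keeps one fact table joined to denormalized dimension tables; a tiny sketch using Python's built-in sqlite3 (schema and data are invented for illustration):

```python
import sqlite3

# Minimal star schema: one fact table with foreign keys into two
# dimension tables. All names and data are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_id INTEGER REFERENCES dim_date,
                          product_id INTEGER REFERENCES dim_product,
                          amount REAL);
INSERT INTO dim_date VALUES (1, '2024-01-01'), (2, '2024-01-02');
INSERT INTO dim_product VALUES (10, 'widget'), (11, 'gadget');
INSERT INTO fact_sales VALUES (1, 10, 5.0), (1, 11, 7.5), (2, 10, 2.5);
""")

# Typical star-schema query: aggregate facts, group by a dimension attribute.
rows = con.execute("""
    SELECT p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 7.5)]
```

A snowflake design would further normalize the dimensions (e.g., split product into product and category tables) at the cost of extra joins.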
  • AWS Data Engineer with Databricks || USC Only || W2 Only

    Ipivot

    Data engineer job in Princeton, NJ

    AWS Data Engineer with Databricks. Princeton, NJ - Hybrid - need locals or nearby. Duration: Long Term. This role is available only to U.S. citizens.
    Key Responsibilities:
    - Design and implement ETL/ELT pipelines with Databricks, Apache Spark, AWS Glue, S3, Redshift, and EMR for processing large-scale structured and unstructured data.
    - Optimize data flows, monitor performance, and troubleshoot issues to maintain reliability and scalability.
    - Collaborate on data modeling, governance, security, and integration with tools like Airflow or Step Functions.
    - Document processes and mentor junior team members on best practices.
    Required Qualifications:
    - Bachelor's degree in Computer Science, Engineering, or related field.
    - 5+ years of data engineering experience, with strong proficiency in Databricks, Spark, Python, SQL, and AWS services (S3, Glue, Redshift, Lambda).
    - Familiarity with big data tools like Kafka and Hadoop, and with data warehousing concepts.
    $82k-112k yearly est. 3d ago
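Orchestrators like Airflow or Step Functions, mentioned in the listing above, boil down to running tasks in dependency order. A toy DAG ordering in plain Python via the standard library's topological sort (task names are made up):

```python
from graphlib import TopologicalSorter

# Toy pipeline DAG: each task maps to the set of tasks it depends on.
# Task names are illustrative; a real orchestrator adds retries,
# scheduling, and parallelism on top of exactly this ordering.
dag = {
    "extract_orders": set(),
    "extract_users":  set(),
    "transform_join": {"extract_orders", "extract_users"},
    "load_warehouse": {"transform_join"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # dependencies always appear before their dependents
```

Independent tasks (the two extracts) may appear in either order, which is precisely where an orchestrator can run them in parallel.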
  • Data Analytics Engineer

    Dale Workforce Solutions

    Data engineer job in Somerset, NJ

    Client: manufacturing company. Type: direct hire.
    Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets. This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams. This role is on-site five days per week in Somerset, NJ.
    Key Responsibilities:
    Power BI Reporting & Administration:
    - Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
    - Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
    - Develop and maintain data models to ensure accuracy, consistency, and reliability
    - Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
    - Optimize Power BI solutions for performance, scalability, and ease of use
    ETL & Data Warehousing:
    - Design and maintain data warehouse structures, including schema and database layouts
    - Develop and support ETL processes to ensure timely and accurate data ingestion
    - Integrate data from multiple systems while ensuring quality, consistency, and completeness
    - Work closely with database administrators to optimize data warehouse performance
    - Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed
    Training & Documentation:
    - Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
    - Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
    - Manage data definitions, lineage documentation, and data cataloging for all enterprise data models
    Project Management:
    - Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
    - Collaborate with key business stakeholders to ensure departmental reporting needs are met
    - Record meeting notes in Confluence and document project updates in Jira
    Data Governance:
    - Implement and enforce data governance policies to ensure data quality, compliance, and security
    - Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness
    Routine IT Functions:
    - Resolve Help Desk tickets related to reporting, dashboards, and BI tools
    - Support general software and hardware installations when needed
    Other Responsibilities:
    - Manage email and phone communication professionally and promptly
    - Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
    - Perform additional assigned duties as needed
    Qualifications (Required):
    - Minimum of 3 years of relevant experience
    - Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
    - Experience with cloud-based BI environments (Azure, AWS, etc.)
    - Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
    - Proficiency in SQL for data extraction, manipulation, and transformation
    - Strong knowledge of DAX
    - Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
    - Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
    - Strong analytical, problem-solving, and documentation skills
    - Excellent written and verbal communication abilities
    - High attention to detail and strong self-review practices
    - Effective time management and organizational skills; ability to prioritize workload
    - Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
    $82k-112k yearly est. 2d ago
  • Python Data Controls Developer

    Thought Byte

    Data engineer job in Mount Laurel, NJ

    Mount Laurel, NJ (3 days onsite). Mode of Hiring: Full Time. Salary: negotiable for the right candidates.
    - Minimum of 7-10 years of experience working in a financial institution, preferably in global banks
    - Minimum of 7-10 years of experience in SQL development, including query optimization, stored procedures, and indexing
    - Strong working experience in Python for data manipulation, scripting, and automation
    - Understanding of the Compliance domain and concepts, e.g., Anti-Money Laundering (AML), Know Your Customer (KYC), Customer Risk Rating, etc., is a must
    - Minimum of 5-7 years of experience in data (data lifecycle, data governance, data quality, metadata, data issue resolution, and other data concepts)
    $77k-102k yearly est. 2d ago
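SQL query optimization and indexing, as required above, can be checked empirically: most engines expose a query plan showing whether an index is used. A small sketch with Python's built-in sqlite3 (table and column names are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txn (id INTEGER, account TEXT, amount REAL)")
con.executemany("INSERT INTO txn VALUES (?, ?, ?)",
                [(i, f"acct{i % 100}", float(i)) for i in range(1000)])

query = "SELECT SUM(amount) FROM txn WHERE account = 'acct7'"

# Without an index on the filter column: full table scan.
before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before[0][-1])  # e.g. "SCAN txn"

# Add an index on the filter column, then re-check the plan.
con.execute("CREATE INDEX idx_txn_account ON txn(account)")
after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after[0][-1])  # e.g. "SEARCH txn USING INDEX idx_txn_account (account=?)"
```

The exact plan wording varies by SQLite version, but the scan-to-search change is the signal; Oracle and SQL Server expose the same check via their own `EXPLAIN PLAN` / execution-plan tooling.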
  • Senior Data Engineer - MDM

    Synechron (4.4 company rating)

    Data engineer job in Iselin, NJ

    We are:
    At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
    Our challenge:
    We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team. The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector. You will be responsible for architecting robust data platforms, evaluating MDM tools, and aligning data strategies to meet business needs.
    Additional Information:
    The base salary for this position will vary based on geography and other factors. In accordance with the law, the base salary for this role if filled within Iselin, NJ is $135K to $150K/year & benefits (see below).
    Key Responsibilities:
    - Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
    - Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
    - Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
    - Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM, etc.) and provide objective recommendations aligned with business requirements.
    - Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
    - Implement data integration pipelines leveraging modern data engineering tools and practices.
    - Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer.
    - Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies.
    - Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services.
    - Ensure compliance with data governance, data privacy, and security standards.
    - Support CI/CD pipelines for continuous integration and deployment of data solutions.
    Qualifications:
    - 12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
    - Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
    - Strong functional knowledge of reference data sources and domain-specific data standards.
    - Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
    - Familiarity with CI/CD practices, tools, and automation pipelines.
    - Ability to work collaboratively across teams to deliver complex data solutions.
    - Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).
    Preferred Skills:
    - Familiarity with financial data models and regulatory requirements.
    - Experience with Azure cloud platforms
    - Knowledge of data governance, data quality frameworks, and metadata management.
    We offer:
    - A highly competitive compensation and benefits package
    - A multinational organization with 58 offices in 21 countries and the possibility to work abroad
    - 10 days of paid annual leave (plus sick leave and national holidays)
    - Maternity & Paternity leave plans
    - A comprehensive insurance plan including: medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region)
    - Retirement savings plans
    - A higher education certification policy
    - Commuter benefits (varies by region)
    - Extensive training opportunities, focused on skills, substantive knowledge, and personal development
    - On-demand Udemy for Business for all Synechron employees, with free access to more than 5000 curated courses
    - Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Center of Excellence (CoE) groups
    - Cutting-edge projects at the world's leading tier-one banks, financial institutions and insurance firms
    - A flat and approachable organization
    - A truly diverse, fun-loving and global work culture
    SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
    Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
    All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
    $135k-150k yearly 4d ago
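At the heart of the MDM work described above is record matching and survivorship: deciding which duplicate source records refer to the same entity and which attribute values survive into the golden record. A deliberately simplified sketch in plain Python (matching on a normalized name with a most-recent-wins rule; the fields and rules are invented, and real MDM platforms use fuzzy matching and stewardship workflows):

```python
from datetime import date

# Toy MDM merge: group source records by a normalized match key,
# then build a "golden record" per group with a most-recent-wins
# survivorship rule. All fields here are illustrative.

records = [
    {"source": "crm", "name": "ACME Corp.", "city": "Newark",  "updated": date(2024, 1, 5)},
    {"source": "erp", "name": "Acme Corp",  "city": "Trenton", "updated": date(2024, 3, 1)},
    {"source": "crm", "name": "Globex Inc", "city": "Edison",  "updated": date(2024, 2, 2)},
]

def match_key(rec: dict) -> str:
    # Normalize: lowercase, strip punctuation and common legal suffixes.
    name = rec["name"].lower().replace(".", "").replace(",", "")
    for suffix in (" corp", " inc", " llc"):
        name = name.removesuffix(suffix)
    return name.strip()

golden = {}
for rec in records:
    key = match_key(rec)
    # Survivorship: keep the record with the latest update date.
    if key not in golden or rec["updated"] > golden[key]["updated"]:
        golden[key] = rec

print(sorted(golden))          # ['acme', 'globex']
print(golden["acme"]["city"])  # Trenton (most recent source wins)
```

Platform evaluations of Informatica, Reltio, and the like largely come down to how configurable these two steps (matching and survivorship) are.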
  • Senior Data Architect

    Valuemomentum (3.6 company rating)

    Data engineer job in Edison, NJ

    - Act as an Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
    - Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
    - Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
    - Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
    - Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
    - Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
    - Troubleshoot complex data and performance issues and propose long-term architectural solutions.
    - Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.
    About ValueMomentum:
    ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
    $82k-113k yearly est. 4d ago
  • Java Software Engineer

    Beaconfire Inc.

    Data engineer job in East Windsor, NJ

    Job Responsibilities:
    - Develop applications using Java 8/JEE (and higher), Angular 2+, React.js, SQL, Spring, HTML5, CSS, JavaScript, and TypeScript, among other tools.
    - Write scalable, secure, and maintainable code that powers our clients' platforms.
    - Create, deploy, and maintain automated system tests.
    - Work with Testers to understand defects and resolve them in a timely manner.
    - Support continuous improvement by investigating alternatives and technologies, and presenting these for architectural review.
    - Collaborate effectively with other team members to accomplish shared user story and sprint goals.
    Requirements:
    - Experience in programming languages: Java and JavaScript
    - Decent understanding of the software development life cycle
    - Basic programming skills using object-oriented programming (OOP) languages, with in-depth knowledge of common APIs and data structures like Collections, Maps, Lists, Sets, etc.
    - Knowledge of relational databases (e.g., SQL Server, Oracle) and basic SQL query language skills
    Preferred Qualifications:
    - Master's Degree in Computer Science (CS)
    - 0-1 year of practical experience in Java coding
    - Experience using Spring, Maven, Angular frameworks, HTML, and CSS
    - Knowledge of other contemporary Java technologies (e.g., WebLogic, RabbitMQ, Tomcat)
    - Familiarity with JSP, J2EE, and JDBC
    $71k-95k yearly est. 1d ago
  • Senior Dotnet Developer

    HNE

    Data engineer job in Ewing, NJ

    The Software Engineer III is a full-stack engineer with strong Node.js and functional programming (Scala & Ruby) skills. Our applications leverage Node.js, Bootstrap, jQuery, MongoDB, Elasticsearch, Redis, and React.js to deliver a delightful interactive experience on the web. Our applications run in the AWS cloud environment. We use Agile methodologies to enable our engineering team to work closely with partners and with our design & product teams. This role is full time and preferably located long-term in the New York City or southern New Jersey areas.
    Essential Job Duties and Responsibilities include:
    - Design, develop, and maintain modern web applications and UIs using .NET technologies such as C#, ASP.NET MVC, ASP.NET Core, Razor Pages, and Blazor.
    - Create clean, maintainable, and well-documented code following industry best practices and coding standards.
    - Develop and consume RESTful APIs and web services.
    - Build responsive and accessible user interfaces using HTML, CSS, JavaScript, and UI libraries/frameworks such as React, Angular, Vue.js, or Bootstrap.
    - Work with relational and NoSQL databases (e.g., SQL Server, MongoDB) and object-relational mappers (ORMs) such as Entity Framework Core.
    - Conduct unit and integration testing to validate functionality and ensure high-quality deliverables.
    - Participate in peer code reviews and provide constructive feedback to ensure continuous improvement and knowledge sharing.
    - Identify, troubleshoot, and resolve complex technical issues in development and production environments.
    - Collaborate with cross-functional teams throughout the software development lifecycle.
    - Stay current with emerging .NET technologies and trends.
    - May mentor and support junior developers in their technical growth and day-to-day work.
    - Maintain regular and punctual attendance.
    Preferred Qualifications:
    - Experience with CI/CD pipelines and DevOps practices.
    - Familiarity with cloud platforms (e.g., Azure, AWS) and deploying .NET applications in cloud environments.
    - Knowledge of Blazor for interactive web UIs using C# instead of JavaScript.
    Education and/or Experience:
    - 7+ years of professional software development experience with a strong focus on web and UI development in the .NET ecosystem.
    - Advanced proficiency in C#, ASP.NET, ASP.NET Core, and MVC frameworks; experience with VBScript is a plus.
    - Deep understanding of object-oriented programming (OOP) and design patterns.
    - Strong front-end development skills, including HTML, CSS, JavaScript, and at least one modern UI framework (React, Angular, Vue.js, etc.).
    - Proven experience developing and integrating RESTful APIs.
    - Hands-on experience with SQL Server and/or NoSQL databases; proficient in using Entity Framework Core or similar ORMs.
    - Familiarity with version control systems such as Git.
    - Solid grasp of Agile/Scrum development methodologies.
    - Excellent problem-solving abilities and strong attention to detail.
    - Effective communication and interpersonal skills with the ability to work independently and within a team.
    $91k-118k yearly est. 3d ago
  • AI / ML Engineer

    Wall Street Consulting Services LLC

    Data engineer job in Warren, NJ

    Title: AI Engineer or MCP Developer. Duration: Long Term Contract. Kindly share your resumes to ****************
    Description:
    An MCP Developer in commercial P&C insurance is typically an IT role focused on developing systems and integrations using the Model Context Protocol (MCP) to leverage Artificial Intelligence (AI) and Large Language Models (LLMs) within insurance operations. This role involves building the infrastructure that allows AI agents to securely and reliably access and act upon internal P&C data sources (e.g., policy systems, claims databases, underwriting documents), thereby enhancing automation and decision-making in core insurance functions like underwriting and claims processing.
    Responsibilities:
    - AI Integration: Develop and implement robust integrations between AI models (LLMs) and internal data repositories and business tools using the Model Context Protocol (MCP).
    - System Development: Build and maintain MCP servers and clients to expose necessary data and capabilities to AI agents.
    - Workflow Automation: Design and implement agentic workflows that allow AI systems to perform complex, multi-step tasks, such as accessing real-time policy data, processing claims information, and updating customer records.
    - Security & Compliance: Implement secure coding practices and ensure all AI interactions and data exchanges via MCP adhere to insurance industry regulations and internal compliance standards (e.g., data privacy, secure data handling).
    - API Management: Work with existing APIs (REST/SOAP) and develop new ones to facilitate data flow to and from the MCP environment.
    - Collaboration: Partner with actuaries, underwriters, claims specialists, and IT teams to identify AI opportunities and ensure seamless solution deployment.
    - Testing & Quality Assurance: Perform testing to ensure AI-driven job outputs are accurate and reliable, and maintain high performance levels.
    - Documentation: Document all development processes, system architectures, and operational procedures for MCP integrations.
    Experience:
    - 3+ years of experience in software development or AI integration, preferably within the insurance or financial services industry.
    - P&C Knowledge: Strong knowledge of Commercial P&C insurance products, underwriting processes, and claims systems is highly preferred.
    - Technical Expertise: Proficiency in programming languages like Python, Java, or similar. Experience with API development and management. Familiarity with cloud platforms (AWS, Azure, GCP) and containerization tools (Docker, Kubernetes). Understanding of the Model Context Protocol (MCP) specification and SDKs.
    $70k-94k yearly est. 5d ago
  • Sr. C++ FIX or Market Data Developer

    Shain Associates

    Data engineer job in Princeton, NJ

    Looking for a highly motivated C++ Trading Systems Developer with demonstrated experience designing, developing, and delivering core production software in a mission-critical trading systems environment.

    Major responsibilities include:
    - Assessing business and systems requirements and developing functional specifications
    - Designing and developing high-quality, high-performance trading systems software in C++ to meet deliverable timelines and requirements
    - Adhering to the software development life cycle process/methodology
    - Building business-level subject matter expertise in trading systems functionality and processing
    - Providing second-level production support on an ad hoc basis when necessary

    Location: Princeton, NJ

    Organizational Structure: The developer will be an integral part of a core development team and report to the Trading System Development management team.

    Qualifications:
    - Full software development life cycle experience in a mission-critical trading systems environment a must (Options, Equities, Futures, etc.)
    - Excellent software design skills and knowledge of advanced data structures
    - Exceptionally strong C++ knowledge and debugging skills in a Linux environment
    - Solid knowledge of object-oriented programming concepts
    - Strong knowledge of TCP/IP, multicast, and socket programming
    - Knowledge of the Boost libraries and the STL
    - Experience developing real-time applications in a distributed processing architecture
    - Excellent organizational and communication skills; able to work effectively in a team environment
    - Strong knowledge of the logical business domain in Options or Equities trading systems a big plus
    - Experience coding interface solutions for FIX, OPRA, CTA, or UTP a big plus
    - Knowledge of scripting languages such as Python, Shell, and Perl a plus

    Education and Experience:
    - Minimum of a Bachelor's degree or equivalent in IT/Computer Science
    - 7+ years of experience in C++ development
    - 5+ years of demonstrated experience delivering software solutions in a trading systems environment for an exchange or a Wall Street firm
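    The "interface solutions for FIX" qualification above refers to the tag=value wire format used between trading counterparties. As a rough, language-agnostic illustration (sketched in Python rather than the C++ the role requires), the sample below parses an SOH-delimited FIX message and validates its standard trailer checksum; the message contents are fabricated, and real feeds would also enforce BodyLength (tag 9) and sequencing.

    ```python
    # FIX fields are "tag=value" pairs separated by the SOH byte (0x01).
    SOH = b"\x01"

    def parse_fix(raw: bytes) -> dict:
        """Split an SOH-delimited FIX message into a tag -> value dict."""
        fields = {}
        for part in raw.strip(SOH).split(SOH):
            tag, _, value = part.partition(b"=")
            fields[tag.decode()] = value.decode()
        return fields

    def checksum_ok(raw: bytes) -> bool:
        """FIX checksum (tag 10): sum of every byte before '10=', mod 256."""
        body, sep, _trailer = raw.rpartition(b"10=")
        if not sep:
            return False
        expected = sum(body) % 256
        return parse_fix(raw).get("10") == f"{expected:03d}"

    def make_msg(*fields: str) -> bytes:
        """Assemble fields and append a correct checksum trailer (demo helper)."""
        body = SOH.join(f.encode() for f in fields) + SOH
        return body + b"10=%03d" % (sum(body) % 256) + SOH
    ```

    The checksum is deliberately weak (a transport integrity check, not security), which is why tampering with any body field invalidates it but exchanges still rely on sequence numbers and session-level controls.
    
    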
    $91k-118k yearly est. 5d ago
  • Java Software Engineer

    CLS Group 4.8company rating

    Data engineer job in Iselin, NJ

    Job Information:
    Functional Title - Assistant Vice President, Java Software Development Engineer
    Department - Technology
    Corporate Level - Assistant Vice President
    Reports to - Director, Application Development
    Expected full-time salary range between $125,000-$145,000 + variable compensation + 401(k) match + benefits

    Job Description: This position is with CLS Technology. The primary responsibilities of the job will be (a) hands-on software application development and (b) Level-3 support.

    Duties, Responsibilities, and Deliverables:
    - Develop scalable, robust applications utilizing appropriate design patterns, algorithms, and Java frameworks
    - Collaborate with Business Analysts, Application Architects, Developers, QA, Engineering, and Technology Vendor teams on design, development, testing, maintenance, and support
    - Adhere to the CLS SDLC process and governance requirements and ensure full compliance with these requirements
    - Plan, implement, and ensure that delivery milestones are met
    - Provide solutions using design patterns, common techniques, and industry best practices that meet the typical challenges/requirements of a financial application, including usability, performance, security, resiliency, and compatibility
    - Proactively recognize system deficiencies and implement effective solutions
    - Participate in, contribute to, and assimilate changes, enhancements, and requirements (functional and non-functional), including requirements traceability
    - Apply significant knowledge of industry trends and developments to improve CLS in-house practices and services
    - Provide Level-3 support, and provide application knowledge and training to Level-2 support teams

    Experience Requirements:
    - 5+ years of hands-on application development and testing experience with proficient knowledge of core Java and JEE technologies such as JDBC and JAXB, and Java/Web technologies
    - Knowledge of Python, Perl, and Unix shell scripting is a plus
    - Expert hands-on experience with SQL and at least one DBMS such as IBM DB2 (preferred) or Oracle is a strong plus
    - Expert knowledge of and experience in securing web applications and secure coding practices
    - Hands-on knowledge of application resiliency, performance tuning, and technology risk management is a strong plus
    - Hands-on knowledge of messaging middleware such as IBM MQ (preferred) or TIBCO EMS, and application servers such as WebSphere or WebLogic
    - Knowledge of SWIFT messaging, payments processing, and the FX business domain is a plus
    - Hands-on knowledge of CI/CD practices and DevOps toolsets such as JIRA, Git, Ant, Maven, Jenkins, Bamboo, Confluence, and ServiceNow
    - Hands-on knowledge of the MS Office toolset, including MS-Excel, MS-Word, PowerPoint, and Visio
    - Proven track record of successful application delivery to production and effective Level-3 support

    Success Factors: In addition, the person selected for the job will:
    - Have strong analytical, written, and oral communication skills with high self-motivation
    - Possess excellent organization skills to manage multiple tasks in parallel
    - Be a team player
    - Be able to work on complex projects with globally distributed teams and manage tight delivery timelines
    - Be able to smoothly handle high-stress application development and support environments
    - Strive continuously to improve stakeholder management for end-to-end application delivery and support

    Qualification Requirements:
    - Bachelor's degree
    - Minimum of 5 years' experience in Information Technology
    $125k-145k yearly 5d ago
  • SRE/DevOps w/ HashiCorp & Clojure Exp

    Dexian

    Data engineer job in Philadelphia, PA

    Locals Only! SRE/DevOps w/ HashiCorp & Clojure Exp
    Philadelphia, PA: 100% Onsite! 12+ Months

    Must have: HashiCorp, Clojure

    Role: Lead SRE initiatives, automating and monitoring cloud infrastructure to ensure reliable, scalable, and secure systems for eCommerce.

    Required:
    - AWS, Terraform, HashiCorp stack (Nomad, Vault, Consul)
    - Programming in Python/Clojure
    - Automation, monitoring, and log centralization (Splunk)
    - Experience leading large-scale cloud infrastructure

    Dexian stands at the forefront of Talent + Technology solutions, with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support. Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************

    Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
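    Day-to-day work against the HashiCorp stack named above often means talking to Vault's HTTP API. The sketch below shows a KV v2 secret read using only the Python standard library; the address, token, and secret path are hypothetical placeholders, and production code would use a maintained Vault client, short-lived tokens, and TLS rather than a hard-coded token.

    ```python
    import json
    import urllib.request

    VAULT_ADDR = "http://127.0.0.1:8200"   # hypothetical dev-mode Vault
    VAULT_TOKEN = "s.example-token"        # placeholder; never hard-code real tokens

    def build_kv_request(mount: str, path: str) -> urllib.request.Request:
        """KV v2 read: GET {addr}/v1/{mount}/data/{path}, X-Vault-Token header."""
        url = f"{VAULT_ADDR}/v1/{mount}/data/{path}"
        return urllib.request.Request(url, headers={"X-Vault-Token": VAULT_TOKEN})

    def read_secret(mount: str, path: str) -> dict:
        """Fetch and unwrap the payload (KV v2 nests the secret under data.data)."""
        with urllib.request.urlopen(build_kv_request(mount, path)) as resp:
            return json.load(resp)["data"]["data"]
    ```

    The `data/` path segment and the doubly nested `data.data` response shape are KV-engine-version-2 specifics; version 1 mounts use `/v1/{mount}/{path}` with a single `data` level.
    
    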
    $85k-112k yearly est. 4d ago
  • Identity and Access Management Software Engineering Lead

    Elsevier 4.2company rating

    Data engineer job in Philadelphia, PA

    Identity and Access Management - Software Engineering Lead (must have KeyCloak, Auth0, Okta, or similar)

    Are you a Software Engineering Lead with a strong security background ready to broaden your impact and take on a hands-on software engineering leadership role? Are you a collaborative Software Engineering Lead looking to work for a mission-driven global organization?

    About the role: As an Engineering Lead for NeoID, Elsevier's next-generation Identity and Access Management (IAM) platform, you'll leverage your deep security expertise to architect, build, and evolve the authentication and authorization backbone for Elsevier's global products. You'll also lead and manage a team of 5 engineers, fostering their growth and ensuring delivery excellence. You'll have the opportunity to work with industry-standard protocols such as OAuth2, OIDC, and SAML, as well as healthcare's SMART on FHIR and EHR integrations.

    About the team: This team is entrusted with building Elsevier's next-generation Identity and Access Management (IAM) platform. This diverse team of engineers is building and evolving the authentication and authorization backbone for Elsevier's global products: a brand-new cybersecurity product that will provide authorization and authentication for all Elsevier products.

    Qualifications:
    - Current and extensive experience with at least one major IAM platform (KeyCloak, Auth0, Okta, or similar); KeyCloak and Auth0 experience are strong pluses. Only candidates with this experience will be considered for this critical role.
    - An in-depth security mindset, with proven experience designing and implementing secure authentication and authorization systems
    - An extensive understanding of the OAuth2, OIDC, and SAML protocols, including relevant RFCs and enterprise/server-side implementations
    - Familiarity with healthcare identity protocols, including SMART on FHIR and EHR integrations
    - Current hands-on experience with AWS cloud services and infrastructure management
    - Proficiency in Infrastructure as Code (IaC) tools, especially Terraform
    - Strong networking skills, including network security, protocols, and troubleshooting
    - Familiarity with software development methodologies (Agile, Waterfall, etc.)
    - Experience with Java/J2EE, JavaScript, and related technologies, or willingness to learn and deepen expertise
    - Knowledge of data modeling, optimization, and secure data handling best practices

    Accountabilities:
    - Leading the design and implementation of secure, scalable IAM solutions, with a focus on OAuth2/OIDC and healthcare protocols such as SMART on FHIR and EHR integrations
    - Managing, mentoring, and supporting a team of 5 engineers, fostering a culture of security, innovation, and technical excellence
    - Collaborating with product managers and stakeholders to define requirements and strategic direction for the platform, including healthcare and life sciences use cases
    - Writing and reviewing code, performing code reviews, and ensuring adherence to security and engineering best practices
    - Troubleshooting and resolving complex technical issues, providing expert guidance on IAM, security, and healthcare protocol topics
    - Contributing to architectural decisions and long-term platform strategy
    - Staying current with industry trends, emerging technologies, and evolving security threats in the IAM and healthcare space

    Why Elsevier? Join a global leader in information and analytics, and help shape the future of secure, seamless access to knowledge for millions of users worldwide, including healthcare professionals and researchers. If you are an Engineering Lead ready to expand your skills, take on a hands-on software engineering leadership role, and grow as a people manager, we want to hear from you.
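    The OAuth2/OIDC work described above constantly involves JWTs: base64url-encoded header, payload, and signature segments joined by dots. As a small debugging sketch, the code below decodes a token's claims with the standard library; note it does NOT verify the signature, so this is for inspection only, and production code must validate signatures (e.g., against the issuer's JWKS with a maintained JWT library) before trusting any claim. The demo token builder and its claims are fabricated.

    ```python
    import base64
    import json

    def b64url_decode(segment: str) -> bytes:
        """Base64url decoding, restoring the '=' padding that JWTs omit."""
        padding = "=" * (-len(segment) % 4)
        return base64.urlsafe_b64decode(segment + padding)

    def peek_claims(token: str) -> dict:
        """Return the (UNVERIFIED) claims from a JWT's middle segment."""
        _header, payload, _signature = token.split(".")
        return json.loads(b64url_decode(payload))

    def make_demo_token(claims: dict) -> str:
        """Build a fake header.payload.signature token for illustration only."""
        enc = lambda obj: base64.urlsafe_b64encode(
            json.dumps(obj).encode()).rstrip(b"=").decode()
        return f"{enc({'alg': 'none', 'typ': 'JWT'})}.{enc(claims)}.sig"
    ```

    In a real OIDC flow the claims would include `iss`, `aud`, `exp`, and `sub`, and every one of them must be checked after signature verification, not instead of it.
    
    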
    $95k-121k yearly est. 1d ago
  • ETL/BI/Data reports consultants

    Ayr Global It Solutions 3.4company rating

    Data engineer job in Philadelphia, PA

    AYR Global IT Solutions is a national staffing firm focused on cloud, cyber security, web application services, ERP, and BI implementations, providing proven and experienced consultants to our clients. Our competitive, transparent pricing model and industry experience make us a top choice of Global System Integrators and enterprise customers, with federal and commercial projects supported nationwide.

    Job Description
    Subject: ETL/BI/Data reports consultants
    Location: Philadelphia, PA
    Duration: One year plus

    Qualifications - Must-haves!
    - 5+ years coding advanced SQL queries, ETL automation, and stored procedures to support business inquiries
    - 5+ years of demonstrated reporting, analytical, and database experience in a dynamic business environment
    - Advanced BI/reporting tool experience (SSRS) required
    - Experience transforming complex datasets into relevant visualizations
    - Familiarity with relational database technology and terminology

    Functional Competencies:
    - SQL coding proficiency (5+ years)
    - Microsoft Reporting Services (SSRS): ability to develop advanced reports, manage subscriptions, etc.
    - Microsoft Integration Services (SSIS): ability to manage SQL Server ETL processes, job failures, etc.
    - Relational database development (SQL Server, Teradata)
    - SQL Server: table, view, and procedure development
    - Teradata environment and Teradata SQL Assistant experience
    - Excel: charts, pivot tables, equations, VBA
    - Tableau experience a plus
    - Oracle experience a plus

    Qualifications: Bachelor's degree required (preferably in Information Systems, Business Intelligence, or Computer Science) with related experience, or an equivalent combination of education and experience from which comparable knowledge and abilities can be acquired. A sound technical background with the ability to analyze complex data sets and business processes, and to quickly adapt to new technologies. Must have excellent communication skills, with the ability to explain technical capabilities and work cross-functionally between business functions.

    Additional Information: If anyone might be interested, please send resumes to kmarsh@ayrglobal (dot) com or reach me directly at **************
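    The "advanced SQL queries ... to support business inquiries" requirement above typically means aggregation queries that feed a report dataset. The sketch below shows that SQL shape against an in-memory SQLite database so it runs anywhere; the schema and figures are fabricated, and the production targets in this role would be SQL Server or Teradata behind SSRS, using the same GROUP BY pattern.

    ```python
    import sqlite3

    # Fabricated demo data standing in for a production orders table.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (region TEXT, amount REAL);
        INSERT INTO orders VALUES
            ('East', 120.0), ('East', 80.0), ('West', 250.0);
    """)

    def sales_by_region(conn):
        """Aggregate totals per region, largest first: a typical report dataset."""
        return conn.execute("""
            SELECT region, SUM(amount) AS total, COUNT(*) AS n_orders
            FROM orders
            GROUP BY region
            ORDER BY total DESC
        """).fetchall()
    ```

    In an SSRS deployment this SELECT would live in a stored procedure or shared dataset, with the region filter parameterized rather than hard-coded.
    
    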
    $82k-113k yearly est. 60d+ ago
  • Microsoft Dynamics 365 CE Data Migration Consultant

    Data-Core System, Inc. 4.2company rating

    Data engineer job in Middletown, PA

    Job Description
    Salary:

    Data-Core Systems, Inc. is a provider of information technology, consulting, and business process services. We offer breakthrough tech solutions and have worked with companies, hospitals, universities, and government organizations. A proven partner with a passion for client satisfaction, we combine technology innovation, business process expertise, and a global, collaborative workforce that exemplifies the future of work. For more information about Data-Core Systems, Inc., please visit *****************************

    Our client is a roadway system, and as part of their digital transformation they are implementing a solution based on SAP BRIM and Microsoft Dynamics CE. Data-Core Systems Inc. is seeking a Microsoft Dynamics 365 CE Data Migration Consultant to join our Consulting team. You will be responsible for planning, designing, and executing the migration of customer, account, vehicle, financial, and transaction data from a variety of source systems (legacy CRMs, ERPs, SQL databases, flat files, Excel, cloud platforms, and tolling systems) into Microsoft Dynamics 365 Customer Engagement (CE). This role involves understanding complex data models, extracting structured and unstructured data, transforming and mapping it to Dynamics CE entities, and ensuring data quality, integrity, and reconciliation throughout the migration lifecycle.

    Roles & Responsibilities:
    - Analyze source system data structures, including customer profiles, accounts, vehicles, transponders, payment methods, transactions, violations, invoices, and billing records
    - Identify critical data relationships, parent/child hierarchies, and foreign key dependencies
    - Develop detailed data mapping and transformation documentation from source systems to Dynamics 365 CE entities (standard and custom)
    - Build, test, and execute ETL pipelines using tools such as SSIS/KingswaySoft, Azure Data Factory, Power Platform Dataflows, or custom .NET utilities
    - Perform data cleansing, normalization, deduplication, and standardization to meet Dynamics CE data model requirements
    - Execute multiple migration cycles, including test loads, validation, and the final production migration
    - Ensure referential integrity, high data quality, and accuracy of historical data
    - Generate reconciliation reports, resolve data inconsistencies, and troubleshoot migration errors
    - Document migration strategies, execution runbooks, and transformation rules for future reference

    Required Skills & Experience:
    - 8-12 years of proven experience migrating data from tolling systems, transportation platforms, legacy CRMs, or other high-volume transactional systems
    - Strong SQL skills for complex queries, stored procedures, data transformation, and data validation
    - Hands-on experience with the Microsoft Dynamics 365 CE/CRM data model, entities, and relationships
    - Proficiency with ETL/migration tools: SSIS with KingswaySoft, Azure Data Factory, Power Platform Dataflows, custom C#/.NET migration scripts
    - Experience with large-scale migrations involving millions of records
    - Strong understanding of relational data structures such as Customer, Account, Vehicle, Transponder, and Transaction
    - Ability to analyze large datasets, identify anomalies, and resolve inconsistencies
    - Bachelor's degree in engineering or technology from a recognized university

    Preferred Skills & Experience:
    - Experience with financial transactions, billing data, or violation/enforcement records
    - Experience in enterprise-scale Dynamics 365 CE migrations
    - Familiarity with data governance, security, and compliance requirements for financial or transportation data
    - Knowledge of historical data migration and archival strategies

    We are an equal opportunity employer.
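    The cleansing/normalization/deduplication step described in the responsibilities above can be sketched in plain Python. The records, field names, and de-duplication key below are hypothetical stand-ins; in the actual migration these rules would be expressed in the chosen ETL tool (KingswaySoft, Azure Data Factory, etc.) and keyed to Dynamics CE entity attributes.

    ```python
    def normalize(record: dict) -> dict:
        """Standardize the fields used as the de-duplication key."""
        return {
            **record,
            "email": record.get("email", "").strip().lower(),
            "plate": record.get("plate", "").replace(" ", "").upper(),
        }

    def dedupe(records):
        """Keep the first record per (email, plate) key, preserving input order."""
        seen, clean = set(), []
        for rec in map(normalize, records):
            key = (rec["email"], rec["plate"])
            if key not in seen:
                seen.add(key)
                clean.append(rec)
        return clean
    ```

    Keeping the first occurrence (rather than the last, or a merged record) is a policy choice; real migrations document such survivorship rules in the transformation spec so reconciliation reports can explain every dropped row.
    
    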
    $78k-106k yearly est. 1d ago
  • Business Intelligence Data Engineer

    Cozen O'Connor Corporation 4.8company rating

    Data engineer job in Philadelphia, PA

    Cozen O'Connor is hiring a Business Intelligence Architect to join the IS team. This is a staff-level role on the Enterprise Data Services team. The individual in this role is responsible for the design, development, and maintenance of data presentation tools within Cozen O'Connor, and shares data engineering responsibilities with other team members.

    Responsibilities:
    - Learn and maintain an understanding of the transactional system data used throughout the firm, and the end-user delivery methods for reporting and dashboards
    - Development, administration, and support of the firm's Power BI-based reporting and dashboards
    - Administration, support, and preparation for decommissioning of the firm's Qlik-based reporting/dashboard tools
    - Act as the (internal) user-facing subject matter expert on firm-wide reporting tools and design practices
    - Apply best practices for data governance and metadata management
    - Create and maintain well-organized, easy-to-use documentation for end-user applications
    - In collaboration with the IS Compliance team, provide evidence as required to satisfy audit requirements and ISO certification renewal

    Qualifications:
    - Minimum of 4 years of development experience with Power BI
    - Minimum of 4 years of development experience with Microsoft Fabric ETL methods
    - Some development experience with the Qlik suite (Qlik Sense, NPrinting) preferred, but not required
    - Some experience with source control/release processes using Azure DevOps as a repository, and Git for version comparison
    - BS or BA degree in Information Systems, Information Technology, or a related field, and/or commensurate work experience
    - Knowledge of enterprise-scale data management methodology, best practices, and associated frameworks (DAMA DMBOK, etc.)
    - Expert-level proficiency in working with technical and non-technical users to facilitate data presentation needs
    - Some experience with T-SQL query building preferred
    - Excellent analytical and communication skills
    $126k-167k yearly est. 60d+ ago

Learn more about data engineer jobs

How much does a data engineer earn in Hamilton, NJ?

The average data engineer in Hamilton, NJ earns between $71,000 and $128,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Hamilton, NJ

$96,000

What are the biggest employers of Data Engineers in Hamilton, NJ?
