
Data engineer jobs in New Jersey

- 2,037 jobs
  • Sr. Azure Data Engineer with Databricks Expertise

    E-Solutions (4.5 company rating)

    Data engineer job in Iselin, NJ

    Role: Sr. Azure Data Engineer with Databricks Expertise
    Experience: 12+ years

    We are seeking a highly skilled Azure Data Engineer with strong expertise in SQL, Python, data warehousing, and cloud ETL tools to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance. This role involves working closely with multiple teams and business stakeholders to deliver cutting-edge data solutions.

    Key Responsibilities:
    1. Data Pipeline Development:
      • Build and maintain scalable ETL/ELT pipelines using Databricks.
      • Leverage PySpark/Spark and SQL to transform and process large datasets.
      • Integrate data from multiple sources including Azure Blob Storage, ADLS, and other relational/non-relational systems.
    2. Collaboration & Analysis:
      • Work closely with multiple teams to prepare data for dashboards and BI tools.
      • Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.
    3. Performance & Optimization:
      • Optimize Databricks workloads for cost efficiency and performance.
      • Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
    4. Governance & Security:
      • Implement and manage data security, access controls, and governance standards using Unity Catalog.
      • Ensure compliance with organizational and regulatory data policies.
    5. Deployment:
      • Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks, and configurations across environments.
      • Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.

    Technical Skills:
      • Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables pipelines, Databricks Runtime, etc.)
      • Proficiency in Azure cloud services.
      • Solid understanding of Spark and PySpark for big data processing.
      • Experience with relational databases.
      • Knowledge of Databricks Asset Bundles and GitLab.

    Preferred Experience:
      • Familiarity with Databricks Runtimes and advanced configurations.
      • Knowledge of streaming frameworks like Spark Streaming.
      • Experience developing real-time data solutions.

    Certifications (optional): Azure Data Engineer Associate or Databricks Certified Data Engineer Associate.
    $108k-150k yearly est. 16h ago
  • Sr Data Engineer Python Serverside

    Canyon Associates (4.2 company rating)

    Data engineer job in White House Station, NJ

    This is a direct-hire, full-time position with a hybrid format (on-site 2 days a week). You must be a US citizen or green card holder; no other work-authorization status will be permitted. You must live local to the area and be able to drive on-site a minimum of two days a week.

    The tech stack:
      • 7 years demonstrated server-side development proficiency
      • 5 years demonstrated server-side development proficiency
      • Programming languages: Python (NumPy, Pandas, Oracle PL/SQL). Other non-interpreted languages like Java, C++, Rust, etc. are a plus. Must be proficient at an intermediate-to-advanced level of the language (concurrency, memory management, etc.)
      • Design patterns: typical GoF patterns (Factory, Facade, Singleton, etc.)
      • Data structures: maps, lists, arrays, etc.
      • SCM: solid Git proficiency, MS Azure DevOps (CI/CD)
    $97k-129k yearly est. 1d ago
  • Senior Data Engineer (Snowflake)

    Epic Placements

    Data engineer job in Parsippany-Troy Hills, NJ

    Senior Data Engineer (Snowflake & Python)
    1-Year Contract | $60/hour + benefit options
    Hybrid: on-site a few days per month (local candidates only)

    Work Authorization Requirement: You must be authorized to work for any employer as a W2 employee. This is required for this role. This position is W-2 only - no C2C, no third-party submissions, and no sponsorship will be considered.

    Overview: We are seeking a Senior Data Engineer to support enterprise-scale data initiatives for a highly collaborative engineering organization. This is a new, long-term contract opportunity for a hands-on data professional who thrives in fast-paced environments and enjoys building high-quality, scalable data solutions on Snowflake. Candidates must be based in or around New Jersey, able to work on-site at least 3 days per month, and meet the W2 employment requirement.

    What You'll Do:
      • Design, develop, and support enterprise-level data solutions with a strong focus on Snowflake
      • Participate across the full software development lifecycle - planning, requirements, development, testing, and QA
      • Partner closely with engineering and data teams to identify and implement optimal technical solutions
      • Build and maintain high-performance, scalable data pipelines and data warehouse architectures
      • Ensure platform performance, reliability, and uptime, maintaining strong coding and design standards
      • Troubleshoot production issues, identify root causes, implement fixes, and document preventive solutions
      • Manage deliverables and priorities effectively in a fast-moving environment
      • Contribute to data governance practices including metadata management and data lineage
      • Support analytics and reporting use cases leveraging advanced SQL and analytical functions

    Required Skills & Experience:
      • 8+ years of experience designing and developing data solutions in an enterprise environment
      • 5+ years of hands-on Snowflake experience
      • Strong hands-on development skills with SQL and Python
      • Proven experience designing and developing data warehouses in Snowflake
      • Ability to diagnose, optimize, and tune SQL queries
      • Experience with Azure data frameworks (e.g., Azure Data Factory)
      • Strong experience with orchestration tools such as Airflow, Informatica, Automic, or similar
      • Solid understanding of metadata management and data lineage
      • Hands-on experience with SQL analytical functions
      • Working knowledge of shell scripting and Java scripting
      • Experience using Git, Confluence, and Jira
      • Strong problem-solving and troubleshooting skills
      • Collaborative mindset with excellent communication skills

    Nice to Have:
      • Experience supporting pharma industry data
      • Exposure to omni-channel data environments

    Why This Opportunity:
      • $60/hour W2 on a long-term 1-year contract
      • Benefit options available
      • Hybrid structure with limited on-site requirement
      • High-impact role supporting enterprise data initiatives
      • Clear expectations: W-2 only, no third-party submissions, no Corp-to-Corp

    This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
    $60 hourly 16h ago
  • Data Engineer

    Ztek Consulting (4.3 company rating)

    Data engineer job in Hamilton, NJ

    Key Responsibilities:
      • Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
      • Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
      • Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
      • Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
      • Manage FTP/SFTP file transfers between internal systems and external vendors.
      • Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
      • Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.

    Required Skills & Experience:
      • 10+ years of experience in data engineering or production support within financial services or trading environments.
      • Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
      • Strong Python and SQL programming skills.
      • Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
      • Experience with Git, CI/CD pipelines, and Azure DevOps.
      • Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
      • Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
      • Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
      • Excellent communication, problem-solving, and stakeholder management skills.
    $89k-125k yearly est. 4d ago
  • Azure Data Engineer

    Programmers.Io (3.8 company rating)

    Data engineer job in Weehawken, NJ

      • Expert-level skills writing and optimizing complex SQL
      • Experience with complex data modelling, ETL design, and using large databases in a business environment
      • Experience with building data pipelines and applications to stream and process datasets at low latencies
      • Fluent with big data technologies like Spark, Kafka, and Hive
      • Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
      • Designing and building data pipelines using API ingestion and streaming ingestion methods
      • Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential
      • Experience in developing NoSQL solutions using Azure Cosmos DB is essential
      • Thorough understanding of Azure and AWS cloud infrastructure offerings
      • Working knowledge of Python is desirable
      • Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
      • Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
      • Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
      • Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
      • Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
      • Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making
      • Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
      • Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
    $92k-132k yearly est. 1d ago
  • Data Architect

    Pyramid Consulting, Inc. (4.1 company rating)

    Data engineer job in Ridgefield, NJ

    Immediate need for a talented Data Architect. This is a 12-month contract opportunity with long-term potential and is located in Basking Ridge, NJ (hybrid). Please review the job description below and contact me ASAP if you are interested.

    Job ID: 25-93859
    Pay Range: $110 - $120/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

    Key Requirements and Technology Experience:
      • Key skills: ETL, LTMC, SaaS
      • 5 years as a Data Architect
      • 5 years in ETL
      • 3 years in LTMC

    Our client is a leading telecom-industry company, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.

    Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

    By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
    $110-120 hourly 1d ago
  • Senior Data Engineer

    Apexon

    Data engineer job in New Providence, NJ

    Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies - in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX - and our deep expertise in BFSI, healthcare, and life sciences to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents.

    Job Description:
      • Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
      • Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
      • Work in tandem with our engineering team to identify and implement the most optimal solutions
      • Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
      • Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
      • Able to manage deliverables in fast-paced environments

    Areas of Expertise:
      • At least 10 years of experience designing and developing data solutions in an enterprise environment
      • At least 5 years' experience on the Snowflake platform
      • Strong hands-on SQL and Python development
      • Experience with designing and developing data warehouses in Snowflake
      • A minimum of three years' experience in developing production-ready data ingestion and processing pipelines using Spark and Scala
      • Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic
      • Good understanding of metadata and data lineage
      • Hands-on knowledge of SQL analytical functions
      • Strong knowledge and hands-on experience in shell scripting and Java scripting
      • Able to demonstrate experience with software engineering practices including CI/CD, automated testing, and performance engineering
      • Good understanding and exposure to Git, Confluence, and Jira
      • Good problem-solving and troubleshooting skills
      • Team player, collaborative approach, and excellent communication skills

    Our Commitment to Diversity & Inclusion: Did you know that Apexon has been Certified™ by Great Place To Work, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK. Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We are taking affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law.

    You can read about our Job Applicant Privacy Policy here: Job Applicant Privacy Policy (apexon.com)
    $82k-112k yearly est. 4d ago
  • Azure Data Engineer

    Sharp Decisions (4.6 company rating)

    Data engineer job in Jersey City, NJ

    Title: Senior Azure Data Engineer
    Client: Major Japanese bank
    Experience Level: Senior (10+ years)

    The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.

    Key Responsibilities:
      • Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
      • Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
      • Ensure data security, compliance, lineage, and governance controls.
      • Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
      • Troubleshoot performance issues and improve system efficiency.

    Required Skills:
      • 10+ years of data engineering experience.
      • Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
      • Azure certifications strongly preferred.
      • Strong SQL, Python, and cloud data architecture skills.
      • Experience in financial services or large enterprise environments preferred.
    $77k-101k yearly est. 1d ago
  • Data Engineer / Manager (Only W2)

    Nexify Infosystems

    Data engineer job in Edison, NJ

      • 12-15 years of overall experience, combining solution architecture and hands-on delivery.
      • Strong client-facing skills, with the ability to lead technical conversations, manage expectations, and handle escalations calmly.
      • Hands-on technologist capable of designing, reviewing, and, when required, building/debugging components - not limited to high-level guidance.
      • Proficient in data solutions, with practical, hands-on experience in Databricks including Lakehouse architecture, ETL pipelines, analytics, and AI workloads.
      • Strong hands-on experience with PySpark and Python for large-scale data processing and transformation.
      • Experience designing and implementing microservices-based architectures.
    $82k-112k yearly est. 4d ago
  • Data Analytics Engineer

    Dale Workforce Solutions

    Data engineer job in Somerset, NJ

    Client: manufacturing company
    Type: direct hire

    Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets. This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams. This role is on-site five days per week in Somerset, NJ.

    Key Responsibilities

    Power BI Reporting & Administration:
      • Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
      • Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
      • Develop and maintain data models to ensure accuracy, consistency, and reliability
      • Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
      • Optimize Power BI solutions for performance, scalability, and ease of use

    ETL & Data Warehousing:
      • Design and maintain data warehouse structures, including schema and database layouts
      • Develop and support ETL processes to ensure timely and accurate data ingestion
      • Integrate data from multiple systems while ensuring quality, consistency, and completeness
      • Work closely with database administrators to optimize data warehouse performance
      • Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed

    Training & Documentation:
      • Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
      • Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
      • Manage data definitions, lineage documentation, and data cataloging for all enterprise data models

    Project Management:
      • Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
      • Collaborate with key business stakeholders to ensure departmental reporting needs are met
      • Record meeting notes in Confluence and document project updates in Jira

    Data Governance:
      • Implement and enforce data governance policies to ensure data quality, compliance, and security
      • Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness

    Routine IT Functions:
      • Resolve Help Desk tickets related to reporting, dashboards, and BI tools
      • Support general software and hardware installations when needed

    Other Responsibilities:
      • Manage email and phone communication professionally and promptly
      • Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
      • Perform additional assigned duties as needed

    Required Qualifications:
      • Minimum of 3 years of relevant experience
      • Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
      • Experience with cloud-based BI environments (Azure, AWS, etc.)
      • Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
      • Proficiency in SQL for data extraction, manipulation, and transformation
      • Strong knowledge of DAX
      • Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
      • Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
      • Strong analytical, problem-solving, and documentation skills
      • Excellent written and verbal communication abilities
      • High attention to detail and strong self-review practices
      • Effective time management and organizational skills; ability to prioritize workload
      • Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
    $82k-112k yearly est. 1d ago
  • Senior Bigdata Engineer

    Iris Software Inc. (4.3 company rating)

    Data engineer job in New Jersey

    Greetings! We are looking for a Big Data Engineer to design, build, and maintain scalable data solutions. This role focuses on developing reliable data pipelines and platforms that support analytics, reporting, and data-driven decision making. The ideal candidate has strong hands-on experience with Python and SQL and is comfortable working with large, complex datasets.

    Position: Sr. Big Data Engineer
    Location: Whippany, NJ (hybrid)
    Contract: Long-term contract
    Client: One of the largest financial clients

    Responsibilities:
      • Design, develop, and maintain large-scale data pipelines and data platforms
      • Build efficient ETL and ELT processes using Python and SQL
      • Optimize data models, queries, and workflows for performance and reliability
      • Work with structured and unstructured data from multiple sources
      • Collaborate with data scientists, analysts, and software engineers to support analytics and machine learning use cases
      • Ensure data quality, consistency, and availability across systems
      • Monitor and troubleshoot data pipelines in production environments
      • Document data processes, models, and best practices

    Required Qualifications:
      • Strong experience in Python for data processing and pipeline development
      • Advanced SQL skills, including query optimization and complex data transformations
      • Experience working with big data technologies such as Spark, Hadoop, or similar frameworks
      • Solid understanding of data modeling, warehousing, and lakehouse concepts
      • Experience with cloud data platforms (AWS, Azure, or Google Cloud)
      • Familiarity with version control systems such as Git

    Preferred Qualifications:
      • Experience with workflow orchestration tools such as Airflow or similar
      • Knowledge of streaming technologies such as Kafka or equivalent
      • Experience with containerization and deployment tools (Docker, Kubernetes)
      • Exposure to data governance, security, and compliance best practices
    $84k-114k yearly est. 3d ago
  • Sr Data Modeler with Capital Markets/ Custody

    LTIMindtree

    Data engineer job in Jersey City, NJ

    LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit *******************

    Job Title: Principal Data Modeler / Data Architecture Lead - Capital Markets
    Work Location: Jersey City, NJ (onsite, 5 days/week)

    Job Description: We are seeking a highly experienced Principal Data Modeler / Data Architecture Lead to reverse engineer an existing logical data model supporting all major lines of business in the capital markets domain. The ideal candidate will have deep capital markets domain expertise and will work closely with business and technology stakeholders to elicit and document requirements, map those requirements to the data model, and drive enhancements or rationalization of the logical model prior to its conversion to a physical data model. A software development background is not required.

    Key Responsibilities:
      • Reverse engineer the current logical data model, analyzing entities, relationships, and subject areas across capital markets (including customer, account, portfolio, instruments, trades, settlement, funds, reporting, and analytics).
      • Engage with stakeholders (business, operations, risk, finance, compliance, technology) to capture and document business and functional requirements, and map these to the data model.
      • Enhance or streamline the logical data model, ensuring it is fit-for-purpose, scalable, and aligned with business needs before conversion to a physical model.
      • Lead the logical-to-physical data model transformation, including schema design, indexing, and optimization for performance and data quality.
      • Perform advanced data analysis using SQL or other data analysis tools to validate model assumptions, support business decisions, and ensure data integrity.
      • Document all aspects of the data model, including entity and attribute definitions, ERDs, source-to-target mappings, and data lineage.
      • Mentor and guide junior data modelers, providing coaching, peer reviews, and best practices for modeling and documentation.
      • Champion a detail-oriented and documentation-first culture within the data modeling team.

    Qualifications:
      • Minimum 15 years of experience in data modeling, data architecture, or related roles within capital markets or financial services.
      • Strong domain expertise in capital markets (e.g., trading, settlement, reference data, funds, private investments, reporting, analytics).
      • Proven expertise in reverse engineering complex logical data models and translating business requirements into robust data architectures.
      • Strong skills in data analysis using SQL and/or other data analysis tools.
      • Demonstrated ability to engage with stakeholders, elicit requirements, and produce high-quality documentation.
      • Experience in enhancing, rationalizing, and optimizing logical data models prior to physical implementation.
      • Ability to mentor and lead junior team members in data modeling best practices.
      • Passion for detail, documentation, and continuous improvement.
      • A software development background is not required.

    Preferred Skills:
      • Experience with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
      • Familiarity with capital markets business processes and data flows.
      • Knowledge of regulatory and compliance requirements in financial data management.
      • Exposure to modern data platforms (e.g., Snowflake, Databricks, cloud databases).

    Benefits and Perks:
      • Comprehensive medical plan covering medical, dental, and vision
      • Short-term and long-term disability coverage
      • 401(k) plan with company match
      • Life insurance
      • Vacation time, sick leave, paid holidays
      • Paid paternity and maternity leave

    LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth, or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
    $79k-111k yearly est. 16h ago
  • AI & Data Engineering Leader - IT Presales

    Infovision Inc. (4.4 company rating)

    Data engineer job in Ridgefield, NJ

    Job Description: IT Presales - AI & Data Engineering Specialist (full-time)

    We are seeking a dynamic IT presales professional with deep expertise in AI, data engineering, and cloud technologies to lead solutioning and proposal development for enterprise clients. This role demands someone who lives and breathes AI and data, with the ability to translate complex business challenges into innovative, scalable solutions. The ideal candidate will combine strong technical credibility in data platforms, AI/ML, and analytics with exceptional presales skills, including crafting proposals, responding to RFPs/RFIs, and engaging with senior client stakeholders.

    Key Responsibilities

    Solutioning & Proposal Development:
      • Own end-to-end presales activities including RFP/RFI responses, proposals, and solution presentations.
      • Design and articulate solutions leveraging data engineering, AI/ML, cloud platforms (AWS, Azure, GCP), and modern analytics frameworks.
      • Develop pricing models and effort estimations for T&M, SOW, and Managed Services engagements.
      • Ensure proposals meet compliance standards and align with client objectives.

    Client Engagement & Strategic Positioning:
      • Partner with Sales and Account teams to understand client needs and position AI/data-driven solutions.
      • Conduct client workshops and solution walkthroughs to demonstrate technical depth and business value.
      • Build trusted relationships with client stakeholders, influencing decision-making at senior levels.

    Innovation & Practice Enablement:
      • Collaborate with the AI/ML and Data practices to incorporate emerging technologies into proposals.
      • Contribute to reusable assets such as accelerators, PoCs, and case studies to strengthen InfoVision's positioning.
      • Drive thought leadership in AI and data engineering within presales and client engagements.

    Qualifications & Experience:
      • 10-12+ years of IT services experience, with at least 5 years in presales or solutioning roles.
      • Proven expertise in data engineering (ETL, data warehousing, cloud data platforms) and AI/ML solutioning.
      • Strong understanding of IT delivery models (onsite/offshore/nearshore) and engagement types (T&M, SOW, Managed Services).
      • Familiarity with modern data architectures, cloud migration strategies, and AI/ML adoption frameworks.
      • Excellent communication, presentation, and stakeholder management skills.
      • Telecom domain experience preferred.

    Preferred Skills:
      • Ability to influence CXO-level stakeholders and articulate AI/data-driven business value.
      • Experience working on large-scale RFPs for Fortune 500 clients.
      • MBA or advanced degree is a plus.
    $96k-124k yearly est. 2d ago
  • Python Data Controls Developer

    Thought Byte

    Data engineer job in Mount Laurel, NJ

    Mount Laurel, NJ (3 days onsite)
    Mode of hiring: Full time
    Salary: Negotiable for the right candidate

      • Minimum of 7-10 years of experience working in a financial institution, preferably in global banks
      • Minimum of 7-10 years of experience in SQL development, including query optimization, stored procedures, and indexing
      • Strong working experience in Python for data manipulation, scripting, and automation
      • Understanding of the compliance domain and concepts, i.e., Anti-Money Laundering (AML), Know Your Customer (KYC), Customer Risk Rating, etc., is a must
      • Minimum of 5-7 years of experience in data (data lifecycle, data governance, data quality, metadata, data issue resolution, and other data concepts)
    $77k-102k yearly est. 1d ago
  • Big Data Developer

    Infocepts (3.7 company rating)

    Data engineer job in Jersey City, NJ

      • Designing Hive/HCatalog data models, including creating table definitions, file formats, and compression techniques for structured and semi-structured data processing
      • Implementing Spark-based ETL processing frameworks
      • Implementing big data pipelines for data ingestion, storage, processing, and consumption
      • Modifying the Informatica-Teradata and Unix-based data pipeline
      • Enhancing the Talend-Hive/Spark and Unix-based data pipelines
      • Developing and deploying Scala/Python-based Spark jobs for ETL processing
      • Strong SQL and DWH concepts
    $75k-98k yearly est. 3d ago
  • SAP Data Migration Developer

    Numeric Technologies (4.5 company rating)

    Data engineer job in Englewood, NJ

    SAP S/4 Data Migration Developer
    Duration: 6 months
    Rate: Competitive market rate

    This key role is responsible for development and configuration of the SAP Data Services platform within the client's corporate technology organization to deliver a successful data conversion and migration from SAP ECC to SAP S/4 as part of project Keystone.

    Key Responsibilities:
      • Responsible for SAP Data Services development, design, job creation, and execution.
      • Responsible for efficient design, performance tuning, and ensuring timely data processing, validation, and verification.
      • Responsible for creating content within SAP Data Services for both master and transaction data conversion (standard SAP and custom data objects).
      • Responsible for data conversion using staging tables and working with SAP teams on data loads into SAP S/4 and MDG environments.
      • Responsible for building validation rules, scorecards, and data for consumption in Information Steward pursuant to conversion rules per the functional specifications.
      • Responsible for adhering to project timelines and deliverables and accounting for object delivery across the teams involved.
      • Take part in meetings; execute plans; design and develop custom solutions within the client's O&T Engineering scope.
      • Work in all facets of SAP data migration projects with a focus on SAP S/4 data migration using the SAP Data Services platform.
      • Hands-on development experience with ETL from the legacy SAP ECC environment, conversions, and jobs.
      • Demonstrate capabilities with performance tuning and handling large data sets.
      • Understand SAP tables, fields, and load processes into SAP S/4 and MDG systems.
      • Build validation rules; customize and deploy Information Steward scorecards, data reconciliation, and validation.
      • Be a problem solver and build robust conversion and validation per requirements.

    Skills and Experience:
      • 6-8 years of experience as a developer in the SAP Data Services application
      • At least 2 SAP S/4 conversion projects with DMC, staging tables, and updating SAP Master Data Governance
      • Good communication skills; ability to deliver key objects on time and support testing and mock cycles
      • 4-5 years of development experience in SAP Data Services 4.3 Designer and Information Steward
      • Taking ownership and ensuring high-quality results
      • Active in seeking feedback and making necessary changes
      • Proven experience implementing SAP Data Services in a multinational environment
      • Experience in designing data loads of large volumes to SAP S/4 from SAP ECC
      • Must have used HANA staging tables
      • Experience in developing Information Steward for data reconciliation and validation (not profiling)

    Requirements:
      • Adhere to the work availability schedule as noted above; be on time for meetings
      • Written and verbal communication in English
    $78k-98k yearly est. 4d ago
  • Data Scientist

    Marlabs LLC (4.1 company rating)

    Data engineer job in Parsippany-Troy Hills, NJ

    Data Scientist - Parsippany, NJ (Hybrid)

    Summary: Provide analytics, telemetry, and ML/GenAI-driven insights to measure SDLC health, prioritize improvements, validate pilot outcomes, and implement AI-driven development lifecycle capabilities.

    Responsibilities:
      • Define metrics and instrumentation for SDLC/CI pipelines, incidents, and delivery KPIs.
      • Build dashboards, anomaly detection, and data models; implement GenAI solutions (e.g., code suggestion, PR summarization, automated test generation) to improve developer workflows.
      • Design experiments and validate AI-driven features during the pilot.
      • Collaborate with engineering and SRE to operationalize models and ensure observability and data governance.

    Required skills:
      • Applied data science/ML in production; hands-on experience with GenAI/LLMs applied to developer workflows or DevOps automation.
      • Strong Python (pandas, scikit-learn), ML frameworks, SQL, and data visualization (Tableau/Power BI).
      • Experience with observability/telemetry data (logs/metrics/traces) and A/B experiment design.

    Preferred:
      • Experience with model deployment, MLOps, prompt engineering, and cloud data platforms (AWS/GCP/Azure).
    $72k-98k yearly est. 4d ago
  • Senior Data Architect

    ValueMomentum (3.6 company rating)

    Data engineer job in Edison, NJ

      • Act as an Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
      • Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
      • Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
      • Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
      • Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
      • Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
      • Troubleshoot complex data and performance issues and propose long-term architectural solutions.
      • Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.

    About ValueMomentum: ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain - including underwriting, claims, distribution, and more - empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
    $82k-113k yearly est. 3d ago
  • Associate Software Engineer

    JSR Tech Consulting (4.0 company rating)

    Data engineer job in Newark, NJ

    Associate Software Engineer (Entry Level)

    We're looking for an Associate Software Engineer to join our technology team and help build and improve modern applications. This is a great opportunity for recent graduates or engineers with 0-2 years of experience who want to grow their skills in a collaborative, fast-moving environment. You'll work closely with product managers, designers, and senior engineers to build, test, and enhance software using Java, Python, AWS, and React. Industry experience is not required - we value strong fundamentals, curiosity, and a willingness to learn. Candidates must have permanent work authorization in the United States.

    What You'll Do:
      • Build, test, and maintain applications using Java, Python, JavaScript, and React
      • Develop clean, well-documented code following best practices
      • Work with AWS services for cloud-based development and deployment
      • Collaborate with team members to understand requirements and deliver features
      • Write unit and integration tests and help troubleshoot issues
      • Learn new tools and technologies and apply them in real projects
      • Participate in Agile development processes

    Required Qualifications:
      • Bachelor's degree in Computer Science, Engineering, or a related field
      • 0-2 years of software development experience (internships and projects count)
      • Basic experience or coursework with: Java and/or Python; JavaScript and React; AWS (cloud fundamentals)
      • Understanding of object-oriented programming concepts
      • Strong problem-solving and communication skills
      • Eagerness to learn and grow as a software engineer

    Nice to Have (Not Required):
      • Experience with frameworks such as Spring Boot, Node.js, Flask, or Django
      • Exposure to APIs (REST/JSON)
      • Familiarity with Git and basic DevOps concepts
      • Knowledge of databases (SQL or NoSQL)
      • Interest or exposure to AI-assisted development tools (e.g., GitHub Copilot, Claude)
      • Financial or insurance industry experience (a plus, not required)

    Why This Role:
      • Entry-level friendly with strong mentorship
      • Hands-on experience with modern tech stacks
      • Opportunity to grow your skills in cloud, full-stack development, and software engineering best practices
      • Inclusive, collaborative team environment
    $76k-99k yearly est. 16h ago
  • Java Software Engineer (Trading)-- AGADC5642050

    Compunnel Inc. (4.4 company rating)

    Data engineer job in Jersey City, NJ

    Must Haves:
      1. Low-latency Java development experience (trading preferred but not mandatory). The remaining items are more from a screening standpoint; candidates with low-latency Java development experience should have the following:
      2. Garbage collection, threading and/or multithreading, and memory management experience
      3. FIX Protocol
      4. Optimization techniques or profiling techniques

    Nice to Haves: Order management systems, smart order routers, market data experience
    $72k-93k yearly est. 4d ago

Learn more about data engineer jobs

What are the top employers for data engineers in NJ?

Top 10 Data Engineer companies in NJ

  1. Incedo

  2. Ernst & Young

  3. Inizio Partners Corp

  4. JPMC

  5. Capital One

  6. Colgate-Palmolive

  7. Oracle

  8. Colgate University

  9. A.M. Best

  10. Throtle
