
Data engineer jobs in Piscataway, NJ

- 4,907 jobs
  • Applied Data Scientists

    Mercor

    Data engineer job in Sayreville, NJ

    **1. Role Overview**

    Mercor is seeking applied data science professionals to support a strategic analytics initiative with a global enterprise. This contract-based opportunity focuses on extracting insights, building statistical models, and informing business decisions through advanced data science techniques. Freelancers will translate complex datasets into actionable outcomes using tools like Python, SQL, and visualization platforms. This short-term engagement emphasizes experimentation, modeling, and stakeholder communication, distinct from production ML engineering.

    **2. Key Responsibilities**

    - Translate business questions into data science problems and analytical workflows
    - Conduct data wrangling, exploratory analysis, and hypothesis testing
    - Develop statistical models and predictive tools for decision support
    - Create compelling data visualizations and dashboards for business users
    - Present findings and recommendations to non-technical stakeholders

    **3. Ideal Qualifications**

    - 5+ years of applied data science or analytics experience in business settings
    - Proficiency in Python or R (pandas, NumPy, Jupyter) and strong SQL skills
    - Experience with data visualization tools (e.g., Tableau, Power BI)
    - Solid understanding of statistical modeling, experimentation, and A/B testing
    - Strong communication skills for translating technical work into strategic insights

    **4. More About the Opportunity**

    - Remote
    - Expected commitment: min 30 hours/week
    - Project duration: ~6 weeks

    **5. Compensation & Contract Terms**

    - $75-100/hour
    - Paid weekly via Stripe Connect
    - You'll be classified as an independent contractor

    **6. Application Process**

    - Submit your resume, followed by a domain expertise interview and a short form

    **7. About Mercor**

    - Mercor is a talent marketplace that connects top experts with leading AI labs and research organizations
    - Our investors include Benchmark, General Catalyst, Adam D'Angelo, Larry Summers, and Jack Dorsey
    - Thousands of professionals across domains like law, creatives, engineering, and research have joined Mercor to work on frontier projects shaping the next era of AI
    $75-100 hourly 17d ago
  • Lead Data Scientist (Ref: 190351)

    Forsyth Barnes

    Data engineer job in New York, NY

    Industry: Retail
    Salary: $150,000-$175,000 + Bonus
    Contact: ********************************

    Our client is a prominent player in the Apparel sector, committed to fusing fashion with cutting-edge data and technological solutions. Situated in New York, this organization is looking for a Data Science Manager to drive their Operations Intelligence initiatives within the Data & Analytics department. This critical role leverages advanced analytics, predictive modeling, and state-of-the-art Generative AI technologies to bolster decision-making across key operational areas such as Planning, Supply Chain, Sourcing, Sales, and Logistics. The selected candidate will oversee the integration of data science methodologies into essential operational workflows, aiming to automate processes, improve visibility, accurately forecast business dynamics, and facilitate strategic planning through insightful data analysis.

    Requirements:

    - A minimum of 6 years of experience in data science, with at least 2 years in a leadership or product-related role.
    - Proven ability to apply analytics in complex operational environments such as planning, supply chain, and sourcing.
    - Strong expertise in Python and SQL, along with a solid grasp of cloud-based data ecosystems.
    - Experience with advanced modeling techniques, including forecasting, optimization, and classification.
    - Familiarity with Generative AI technologies or LLMs, combined with a keen interest in applying these to practical business problems.
    - Excellent business acumen and communication skills, facilitating effective collaboration between data insights and strategic goals.
    $83k-117k yearly est. 4d ago
  • Data Architect

    Aligned Automation, LLC (4.3 company rating)

    Data engineer job in New York, NY

    *Data Architect*

    The Data Architect designs, governs, and evolves enterprise data architectures that enable reliable analytics, AI, and operational reporting. The Data Architect defines standards for data modeling, integration, quality, security, and lifecycle management across cloud and on-prem platforms, ensuring data is trusted, performant, and cost-efficient. This contract position is onsite.

    *Job Description:*

    - Define end-to-end data architecture patterns (warehouse, lake/lakehouse, streaming, operational data stores) and reference designs aligned to business outcomes.
    - Own enterprise data models (conceptual, logical, physical), canonical data definitions, and metadata standards to drive consistency and reuse.
    - Architect data integration pipelines (batch and streaming), including ingestion, transformation, enrichment, and distribution, with strong SLAs and observability.
    - Establish data governance controls (cataloging, lineage, quality rules, MDM, access policies) in partnership with security, compliance, and business stakeholders.
    - Drive platform selection and design (e.g., cloud data services, analytics engines, storage tiers), balancing scalability, performance, resilience, and total cost.
    - Implement security and privacy by design (RBAC/ABAC, encryption, tokenization, masking, retention) and ensure regulatory compliance requirements are met.
    - Set standards and guardrails for SQL, schema evolution, event design, job orchestration, and CI/CD for data workloads; review solutions for architectural fit.
    - Partner with product, engineering, and analytics teams to translate business requirements into data structures, interfaces, and service contracts.
    - Lead migration and modernization initiatives (e.g., to cloud/lakehouse), including dependency mapping, cutover plans, and performance optimization.
    - Define SLOs/SLAs and capacity plans; monitor cost, reliability, and performance; drive continuous improvement via benchmarking and right-sizing.
    - Mentor engineers and analysts; contribute to architecture governance, patterns, and best practices; present roadmaps and decisions to senior stakeholders.

    *Required Qualifications:*

    - Strong expertise in data modeling (3NF, dimensional, Data Vault), SQL, and distributed compute/storage paradigms.
    - Practical experience with major cloud platforms (AWS, Azure, GCP) and modern data ecosystems (e.g., Snowflake, BigQuery, Databricks, Starburst/Trino, Apache Spark).
    - Proficiency in ETL/ELT orchestration and workflow tools (e.g., Airflow, dbt, native cloud services) and event/streaming systems (e.g., Kafka).
    - Proven track record implementing data governance: catalog, lineage, quality frameworks, MDM, and access controls.
    - Solid understanding of security and compliance for data (PII/PHI/PCI), including policy enforcement, encryption, and auditability.
    - Strong programming/scripting in Python (or Scala/Java) for data processing, automation, and tooling.
    - Excellent communication and stakeholder management; ability to translate complex technical concepts into clear business value.
    - Master's degree in management information systems or computer science.
    - 10-15 years of experience.

    *Preferred Skills:*

    - Experience with lakehouse architectures, open table formats (e.g., Apache Iceberg/Delta), and data sharing patterns.
    - Familiarity with metadata-driven design, semantic layers, and BI acceleration techniques.
    - Exposure to ML/AI data readiness practices (feature engineering, data labeling, model data pipelines).
    - Infrastructure-as-Code (e.g., Terraform) and CI/CD for data platform provisioning and jobs.
    - Cost optimization and FinOps practices for data services.

    *Key Outcomes:*

    - Deliver a scalable, secure, and well-governed data platform that improves time-to-insight and reduces total cost of ownership.
    - Establish enterprise data standards that increase interoperability and reduce duplication.
    - Enable trusted analytics and AI by elevating data quality, lineage, and accessibility.

    *Duration of Contract:* 6 months with probable extension
    Job Type: Contract
    Pay: $90.00 - $105.00 per hour
    Education: Master's (Required)
    Experience: relevant work: 10 years (Required)
    Ability to Commute: Manhattan, NY 10001 (Required)
    Willingness to travel: 25% (Required)
    Work Location: In person
    $90-105 hourly 1d ago
  • Lead HPC Architect Cybersecurity - High Performance & Computational Data Ecosystem

    Icahn School of Medicine at Mount Sinai (4.8 company rating)

    Data engineer job in New York, NY

    The Scientific Computing and Data group at the Icahn School of Medicine at Mount Sinai partners with scientists to accelerate scientific discovery. To achieve these aims, we support a cutting-edge high-performance computing and data ecosystem along with MD/PhD-level support for researchers. The group is composed of a high-performance computing team, a clinical data warehouse team, and a data services team. The Lead HPC Architect, Cybersecurity, High Performance Computational and Data Ecosystem, is responsible for designing, implementing, and managing the cybersecurity infrastructure and technical operations of Scientific Computing's computational and data science ecosystem. This ecosystem includes a 25,000+ core, 40+ petabyte (usable) high-performance computing (HPC) system, clinical research databases, and a software development infrastructure for local and national projects. The HPC system is the fastest at any academic biomedical center in the world (Top 500 list). To meet Sinai's scientific and clinical goals, the Lead brings a strategic, tactical, and customer-focused vision to evolve the ecosystem to be continually more resilient, secure, scalable, and productive for basic and translational biomedical research. The Lead combines deep technical expertise in cybersecurity, HPC systems, storage, networking, and software infrastructure with a strong focus on service, collaboration, and strategic planning for researchers and clinicians throughout the organization and beyond. The Lead is an expert troubleshooter, productive partner, and leader of projects. The Lead will work with stakeholders to ensure the HPC infrastructure complies with governmental funding agency requirements and to promote efficient resource utilization by researchers. This position reports to the Director for HPC and Data Ecosystem in Scientific Computing and Data.
    Key Responsibilities (HPC Cybersecurity & System Administration):

    - Design, implement, and manage all cybersecurity operations within the HPC environment, ensuring alignment with industry standards (NIST, ISO, GDPR, HIPAA, CMMC, NYC Cyber Command, etc.).
    - Implement best practices for data security, including but not limited to encryption (at rest, in transit, and in use), audit logging, access control, authentication control, configuration management, secure enclaves, and confidential computing.
    - Perform full-spectrum HPC system administration: installation, monitoring, maintenance, usage reporting, troubleshooting, backup, and performance tuning across HPC applications, web services, databases, job schedulers, networking, storage, compute, and hardware to optimize workload efficiency.
    - Lead resolution of complex cybersecurity and system issues; provide mentorship and technical guidance to team members.
    - Ensure that all designs and implementations meet cybersecurity, performance, scalability, and reliability goals.
    - Ensure that the design and operation of the HPC ecosystem is productive for research.
    - Lead the integration of HPC resources with laboratory equipment such as genomic sequencers, microscopes, and clinical systems for data ingestion, in alignment with all regulatory requirements.
    - Develop, review, and maintain security policies, risk assessments, and compliance documentation accurately and efficiently.
    - Collaborate with institutional IT, compliance, and research teams to ensure regulatory, Sinai policy, and operational alignment.
    - Design and implement hybrid and cloud-integrated HPC solutions using on-premise and public cloud resources.
    - Partner with peers regionally, nationally, and internationally to discover, propose, and deploy a world-class research infrastructure for Mount Sinai.
    - Stay current with emerging HPC, cloud, and cybersecurity technologies to keep the organization's infrastructure up to date.
    - Work collaboratively, effectively, and productively with other team members within the group and across Mount Sinai.
    - Provide after-hours support as needed.
    - Perform other duties as assigned or requested.

    Requirements:

    - Bachelor's degree in computer science, engineering, or another scientific field; Master's or PhD preferred.
    - 10 years of progressive HPC system administration experience with Enterprise Linux releases (RedHat/CentOS/Rocky) and batch cluster environments.
    - Experience with all aspects of high-throughput HPC, including schedulers (LSF or Slurm), networking (InfiniBand/Gigabit Ethernet), parallel file systems and storage, configuration management systems (xCAT, Puppet, and/or Ansible), etc.
    - Proficient in cybersecurity processes, posture, regulations, approaches, protocols, firewalls, and data protection in a regulated environment (e.g., finance, healthcare).
    - In-depth knowledge of HIPAA, NIST, FISMA, GDPR, and related compliance standards, with proven experience building and maintaining compliant HPC systems.
    - Experience with secure enclaves and confidential computing.
    - Proven ability to provide mentorship and technical leadership to team members.
    - Proven ability to lead complex projects to completion in collaborative, interdisciplinary settings with minimal guidance.
    - Excellent analytical ability and troubleshooting skills.
    - Excellent communication, documentation, collaboration, and interpersonal skills.
    - Must be a team player and customer focused.
    - Scripting and programming experience.

    Preferred Experience:

    - Proficient with cloud services, orchestration tools (OpenShift/Kubernetes), cost optimization, and hybrid HPC architectures.
    - Experience with Azure, AWS, or Google cloud services.
    - Experience with the LSF job scheduler and GPFS/Spectrum Scale.
    - Experience in a healthcare environment.
    - Experience in a research environment (highly preferred).
    - Experience with software that enables privacy-preserving linking of PHI.
    - Experience with Globus data transfer.
    - Experience with web services, SAP HANA, Oracle, SQL, MariaDB, and other database technologies.

    Strength through Unity and Inclusion

    The Mount Sinai Health System is committed to fostering an environment where everyone can contribute to excellence. We share a common dedication to delivering outstanding patient care. When you join us, you become part of Mount Sinai's unparalleled legacy of achievement, education, and innovation as we work together to transform healthcare. We encourage all team members to actively participate in creating a culture that ensures fair access to opportunities, promotes inclusive practices, and supports the success of every individual. At Mount Sinai, our leaders are committed to fostering a workplace where all employees feel valued, respected, and empowered to grow. We strive to create an environment where collaboration, fairness, and continuous learning drive positive change, improving the well-being of our staff, patients, and organization. Our leaders are expected to challenge outdated practices, promote a culture of respect, and work toward meaningful improvements that enhance patient care and workplace experiences. We are dedicated to building a supportive and welcoming environment where everyone has the opportunity to thrive and advance professionally. Explore this opportunity and be part of the next chapter in our history.

    About the Mount Sinai Health System: Mount Sinai Health System is one of the largest academic medical systems in the New York metro area, with more than 48,000 employees working across eight hospitals, more than 400 outpatient practices, more than 300 labs, a school of nursing, and a leading school of medicine and graduate education.
Mount Sinai advances health for all people, everywhere, by taking on the most complex health care challenges of our time - discovering and applying new scientific learning and knowledge; developing safer, more effective treatments; educating the next generation of medical leaders and innovators; and supporting local communities by delivering high-quality care to all who need it. Through the integration of its hospitals, labs, and schools, Mount Sinai offers comprehensive health care solutions from birth through geriatrics, leveraging innovative approaches such as artificial intelligence and informatics while keeping patients' medical and emotional needs at the center of all treatment. The Health System includes more than 9,000 primary and specialty care physicians; 13 joint-venture outpatient surgery centers throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and more than 30 affiliated community health centers. We are consistently ranked by U.S. News & World Report's Best Hospitals, receiving high "Honor Roll" status. Equal Opportunity Employer The Mount Sinai Health System is an equal opportunity employer, complying with all applicable federal civil rights laws. We do not discriminate, exclude, or treat individuals differently based on race, color, national origin, age, religion, disability, sex, sexual orientation, gender, veteran status, or any other characteristic protected by law. We are deeply committed to fostering an environment where all faculty, staff, students, trainees, patients, visitors, and the communities we serve feel respected and supported. Our goal is to create a healthcare and learning institution that actively works to remove barriers, address challenges, and promote fairness in all aspects of our organization.
    $89k-116k yearly est. 1d ago
  • Data & Performance Analytics (Hedge Fund)

    Coda Search│Staffing

    Data engineer job in New York, NY

    Our client is a $28B NY-based multi-strategy hedge fund currently seeking to add a talented Associate to their Data & Performance Analytics Team. This individual will work closely with senior managers across finance, investment management, operations, technology, investor services, compliance/legal, and marketing.

    Responsibilities:

    - Compile periodic fund performance analyses
    - Review and analyze portfolio performance data, benchmark performance, and risk statistics
    - Review and make necessary adjustments to client quarterly reports to ensure reports are sent out in a timely manner
    - Work with all levels of team members across the organization to help coordinate data feeds for various internal and external databases, to ensure the integrity and consistency of portfolio data reported across client reporting systems
    - Apply queries, pivot tables, filters, and other tools to analyze data
    - Maintain the client relationship management database and provide reports to Directors on a regular basis
    - Coordinate submissions of RFPs by working with the RFP/Marketing Team and other groups internally to gather information for accurate data and performance analysis
    - Identify opportunities to enhance the strategic reporting platform by gathering and analyzing field feedback and collaborating with partners across the organization
    - Provide various ad hoc data research and analysis as needed

    Desired Skills and Experience:

    - Bachelor's degree with at least 2+ years of Financial Services/Private Equity data/client reporting experience
    - Proficiency in Microsoft Office, particularly Excel modeling
    - Technical knowledge of data analytics using CRMs (Salesforce), Excel, and PowerPoint
    - Outstanding communication skills; proven ability to work effectively with all levels of management
    - Comfortable working in a fast-paced, deadline-driven, dynamic environment
    - Innovative and creative thinker
    - Must be detail oriented
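    The core of the performance-analysis work described above is geometrically linking periodic returns and comparing them against a benchmark. The sketch below shows that calculation in plain Python; all return figures are invented example data, not real fund numbers.

```python
# Illustrative benchmark-relative performance summary.
# The monthly return figures are hypothetical, chosen only to
# demonstrate the calculation.

fund_monthly = [0.012, -0.004, 0.021]        # hypothetical fund returns
benchmark_monthly = [0.010, -0.006, 0.015]   # hypothetical benchmark returns

def cumulative_return(monthly):
    """Geometrically link periodic returns into one cumulative return."""
    total = 1.0
    for r in monthly:
        total *= 1.0 + r
    return total - 1.0

fund_cum = cumulative_return(fund_monthly)
bench_cum = cumulative_return(benchmark_monthly)
excess = fund_cum - bench_cum   # fund performance relative to benchmark

print(f"fund {fund_cum:.4%}  benchmark {bench_cum:.4%}  excess {excess:.4%}")
```

    In practice this kind of summary would be driven by data pulled from the reporting systems mentioned in the posting rather than hard-coded lists.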
    $68k-96k yearly est. 1d ago
  • Data Engineer

    Drillo.Ai

    Data engineer job in New Providence, NJ

    Job Title: Senior Data Engineer (Python, Snowflake, SQL)
    Employment Type: Contract

    The developer should have strong Python, Snowflake, and SQL coding skills, be able to walk through a few real-world scenarios from experience, and demonstrate solutions to practical problems in Snowflake and Python. The developer should be able to write Python code for intermediate-level problems given during the L1 assessment. Lead qualities are required to guide a team and own end-to-end support of the project. Around 8 years' experience as a Snowflake Developer designing and developing data solutions within the Snowflake Data Cloud, leveraging its cloud-based data warehousing capabilities. Responsible for designing and implementing data pipelines, data models, and ETL processes, ensuring efficient and effective data storage, processing, and analysis. Able to write complex SQL queries and Python stored procedure code in Snowflake.

    Job Description Summary:

    - Data Modeling and Schema Design: Create and maintain well-structured data models and schemas within Snowflake, ensuring data integrity and efficient query performance.
    - ETL/ELT Development: Design and implement ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes to load data into Snowflake from various sources.
    - Data Pipeline Management: Build and optimize data pipelines to ingest data into Snowflake, ensuring accurate and timely data flow.
    - SQL Optimization: Write and optimize SQL queries to enhance performance and efficiency within Snowflake.
    - Performance Tuning: Identify and address performance bottlenecks within Snowflake, optimizing query execution and resource allocation.
    - Security and Governance: Implement data security and governance best practices within Snowflake environments, including access control and encryption.
    - Documentation and Maintenance: Maintain documentation for data models, data pipelines, and other Snowflake solutions.
    - Troubleshooting and Support: Troubleshoot and resolve issues within Snowflake, providing technical support to users.
    - Collaboration: Collaborate with data architects, data engineers, and business users to understand requirements and deliver solutions.

    Other Skills:

    - Experience with data warehousing concepts and data modeling.
    - Hands-on experience creating stored procedures, functions, tables, and cursors.
    - Experience in database testing, data comparison, and data transformation scripting.
    - Capable of troubleshooting common database issues.
    - Hands-on experience with GitLab, with an understanding of CI/CD pipelines and DevOps tools.
    - Knowledge of AWS Lambda and Azure Functions.
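    The ELT pattern this role centers on (load raw rows into a staging table, then merge them into a target table) can be sketched without a Snowflake account. The example below runs the same staging-plus-upsert shape against an in-memory SQLite database; in Snowflake the final step would be a `MERGE` statement instead of SQLite's `ON CONFLICT` upsert, and the table and column names here are invented for illustration.

```python
import sqlite3

# In-memory stand-in for a warehouse; Snowflake would use a staging
# schema and a MERGE statement for the final step.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Target dimension table and a staging table for raw loads.
cur.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE stg_customer (customer_id INTEGER, name TEXT, city TEXT)")

# Extract/Load: raw rows land in staging (key 1 already exists in the
# target, key 2 is new).
cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'Newark')")
cur.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)",
                [(1, "Acme Corp", "Piscataway"), (2, "Globex", "Edison")])

# Transform/Merge: upsert staging rows into the target. The
# "WHERE true" disambiguates SQLite's INSERT ... SELECT ... ON CONFLICT
# parsing; Snowflake's MERGE needs no such workaround.
cur.execute("""
    INSERT INTO dim_customer (customer_id, name, city)
    SELECT customer_id, name, city FROM stg_customer WHERE true
    ON CONFLICT(customer_id) DO UPDATE SET
        name = excluded.name,
        city = excluded.city
""")
conn.commit()

rows = cur.execute("SELECT * FROM dim_customer ORDER BY customer_id").fetchall()
print(rows)  # [(1, 'Acme Corp', 'Piscataway'), (2, 'Globex', 'Edison')]
```

    The existing row is updated in place and the new row is inserted, which is the behavior a Snowflake `MERGE ... WHEN MATCHED / WHEN NOT MATCHED` clause pair would give for the same data.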
    $82k-112k yearly est. 4d ago
  • Lead Data Engineer (Marketing Technology)

    Sogeti (4.7 company rating)

    Data engineer job in New York, NY

    About the job: We're seeking a Lead Data Engineer to drive innovation and excellence across our Marketing Technology data ecosystem. You thrive in dynamic, fast-paced environments and are comfortable navigating both legacy systems and modern data architectures. You balance long-term strategic planning with short-term urgency, responding to challenges with clarity, speed, and purpose. You take initiative, quickly familiarize yourself with source systems, ingestion pipelines, and operational processes, and integrate seamlessly into agile work rhythms. Above all, you bring a solution-oriented, win-win mindset, owning outcomes and driving progress.

    What you will do at Sogeti:

    - Rapidly onboard into our Martech data ecosystem, understanding source systems, ingestion flows, and operational processes.
    - Build and maintain scalable data pipelines across Martech, Loyalty, and Engineering teams.
    - Balance long-term projects with short-term reactive tasks, including urgent bug fixes and business-critical issues.
    - Identify gaps in data infrastructure or workflows and proactively propose and implement solutions.
    - Collaborate with product managers, analysts, and data scientists to ensure data availability and quality.
    - Participate in agile ceremonies and contribute to backlog grooming, sprint planning, and team reviews.

    What you will bring:

    - 7+ years of experience in data engineering, with a strong foundation in ETL design, cloud platforms, and real-time data processing.
    - Deep expertise in Snowflake, Airflow, dbt, Fivetran, AWS S3, Lambda, Python, and SQL.
    - Previous experience integrating data from multiple retail and ecommerce source systems.
    - Experience with implementation and data management for loyalty platforms, customer data platforms, marketing automation systems, and ESPs.
    - Deep expertise in data modeling with dbt.
    - Demonstrated ability to lead critical and complex platform migrations and new deployments.
    - Strong communication and stakeholder management skills.
    - Self-driven, adaptable, proactive problem solver.

    Education: Bachelor's or Master's degree in Computer Science, Software Engineering, Information Systems, Business Administration, or a related field.

    Life at Sogeti: Sogeti supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:

    - Flexible work options
    - 401(k) with 150% match up to 6%
    - Employee Share Ownership Plan
    - Medical, Prescription, Dental & Vision Insurance
    - Life Insurance
    - 100% Company-Paid Mobile Phone Plan
    - 3 Weeks PTO + 7 Paid Holidays
    - Paid Parental Leave
    - Adoption, Surrogacy & Cryopreservation Assistance
    - Subsidized Back-up Child/Elder Care & Tutoring
    - Career Planning & Coaching
    - $5,250 Tuition Reimbursement & 20,000+ Online Courses
    - Employee Resource Groups
    - Counseling & Support for Physical, Financial, Emotional & Spiritual Well-being
    - Disaster Relief Programs

    About Sogeti: Part of the Capgemini Group, Sogeti creates business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data and automation. Become Your Best | *************

    Disclaimer: Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law. This is a general description of the Duties, Responsibilities and Qualifications required for this position.
Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodation during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact. Please be aware that Capgemini may capture your image (video or screenshot) during the interview process and that image may be used for verification, including during the hiring and onboarding process. Click the following link for more information on your rights as an Applicant ************************************************************************** Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by Capgemini. Capgemini discloses salary range information in compliance with state and local pay transparency obligations. The disclosed range represents the lowest to highest salary we, in good faith, believe we would pay for this role at the time of this posting, although we may ultimately pay more or less than the disclosed range, and the range may be modified in the future. The disclosed range takes into account the wide range of factors that are considered in making compensation decisions including, but not limited to, geographic location, relevant education, qualifications, certifications, experience, skills, seniority, performance, sales or revenue-based metrics, and business or organizational needs. 
At Capgemini, it is not typical for an individual to be hired at or near the top of the range for their role. The base salary range for the tagged location is $125,000 - $175,000. This role may be eligible for other compensation including variable compensation, bonus, or commission. Full time regular employees are eligible for paid time off, medical/dental/vision insurance, 401(k), and any other benefits to eligible employees. Note: No amount of pay is considered to be wages or compensation until such amount is earned, vested, and determinable. The amount and availability of any bonus, commission, or any other form of compensation that are allocable to a particular employee remains in the Company's sole discretion unless and until paid and may be modified at the Company's sole discretion, consistent with the law.
    $125k-175k yearly 1d ago
  • Data Engineer

    Beauty By Imagination (BBI)

    Data engineer job in New York, NY

    About Beauty by Imagination: Beauty by Imagination is a global haircare company dedicated to boosting self-confidence with imaginative solutions for every hair moment. We are a platform company of diverse, market-leading brands, including Wet Brush, Goody, Bio Ionic, and Ouidad - all of which are driven to be the most trusted choice for happy, healthy hair. Our talented team is passionate about delivering high-performing products for consumers and salon professionals alike. Position Overview: We are looking for a skilled Data Engineer to design, build, and maintain our enterprise Data Warehouse (DWH) and analytics ecosystem - with a growing focus on enabling AI-driven insights, automation, and enterprise-grade AI usage. In this role, you will architect scalable pipelines, improve data quality and reliability, and help lay the foundational data structures that power tools like Microsoft Copilot, Copilot for Power BI, and AI-assisted analytics across the business. You'll collaborate with business stakeholders, analysts, and IT teams to modernize our data environment, integrate complex data sources, and support advanced analytics initiatives. Your work will directly influence decision-making, enterprise reporting, and next-generation AI capabilities built on top of our Data Warehouse. Key Responsibilities Design, develop, and maintain Data Warehouse architecture, including ETL/ELT pipelines, staging layers, and data marts. Build and manage ETL workflows using SQL Server Integration Services (SSIS) and other data integration tools. Integrate and transform data from multiple systems, including ERP platforms such as NetSuite. Develop and optimize SQL scripts, stored procedures, and data transformations for performance and scalability. Support and enhance Power BI dashboards and other BI/reporting systems. Implement data quality checks, automation, and process monitoring. 
Collaborate with business and analytics teams to translate requirements into scalable data solutions. Contribute to data governance, standardization, and documentation practices. Support emerging AI initiatives by ensuring model-ready data quality, accessibility, and semantic alignment with Copilot and other AI tools. Required Qualifications Proven experience with Data Warehouse design and development (ETL/ELT, star schema, SCD, staging, data marts). Hands-on experience with SSIS (SQL Server Integration Services) for building and managing ETL workflows. Strong SQL skills and experience with Microsoft SQL Server. Proficiency in Power BI or other BI tools (Tableau, Looker, Qlik). Understanding of data modeling, performance optimization, and relational database design. Familiarity with Python, Airflow, or Azure Data Factory for data orchestration and automation. Excellent analytical and communication skills. Preferred Qualifications Experience with cloud data platforms (Azure, AWS, or GCP). Understanding of data security, governance, and compliance (GDPR, SOC2). Experience with API integrations and real-time data ingestion. Background in finance, supply chain, or e-commerce analytics. Experience with NetSuite ERP or other ERP systems (SAP, Oracle, Dynamics, etc.). AI Focused Preferred Skills: Experience implementing AI-driven analytics or automation inside Data Warehouses. Hands-on experience using Microsoft Copilot, Copilot for Power BI, or Copilot Studio to accelerate SQL, DAX, data modeling, documentation, or insights. Familiarity with building RAG (Retrieval-Augmented Generation) or AI-assisted query patterns using SQL Server, Synapse, or Azure SQL. Understanding of how LLMs interact with enterprise data, including grounding, semantic models, and data security considerations (Purview, RBAC). Experience using AI tools to optimize ETL/ELT workflows, generate SQL scripts, or streamline data mapping/design. 
Exposure to AI-driven data quality monitoring, anomaly detection, or pipeline validation tools. Experience with Microsoft Fabric, semantic models, or ML-integrated analytics environments. Soft Skills Strong analytical and problem-solving mindset. Ability to communicate complex technical concepts to business stakeholders. Detail-oriented, organized, and self-motivated. Collaborative team player with a growth mindset. Impact You will play a key role in shaping the company's modern data infrastructure - building scalable pipelines, enabling advanced analytics, and empowering the organization to safely and effectively adopt AI-powered insights across all business functions. Our Tech Stack SQL Server, SSIS, Azure Synapse Python, Airflow, Azure Data Factory Power BI, NetSuite ERP, REST APIs CI/CD (Azure DevOps, GitHub) What We Offer Location: New York, NY (Hybrid work model) Employment Type: Full-time Compensation: Competitive salary based on experience Benefits: Health insurance, 401(k), paid time off Opportunities for professional growth and participation in enterprise AI modernization initiatives
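The slowly changing dimension (SCD) handling listed in the qualifications above can be illustrated with a minimal sketch. This is a generic Type 2 SCD example in Python with hypothetical table and field names, not BBI's actual pipeline:

```python
from datetime import date

# Hypothetical Type 2 slowly changing dimension: preserve history by
# closing the current row and inserting a new version when an attribute changes.
def apply_scd2(dimension, staged_row, today):
    """dimension: list of dicts with natural_key, attrs, valid_from, valid_to."""
    current = next(
        (r for r in dimension
         if r["natural_key"] == staged_row["natural_key"] and r["valid_to"] is None),
        None,
    )
    if current is None:
        # brand-new entity: insert as the open (current) version
        dimension.append({**staged_row, "valid_from": today, "valid_to": None})
    elif current["attrs"] != staged_row["attrs"]:
        current["valid_to"] = today  # close the old version
        dimension.append({**staged_row, "valid_from": today, "valid_to": None})
    return dimension

dim = []
apply_scd2(dim, {"natural_key": "SKU-1", "attrs": {"brand": "Goody"}}, date(2024, 1, 1))
apply_scd2(dim, {"natural_key": "SKU-1", "attrs": {"brand": "Wet Brush"}}, date(2024, 6, 1))
print(dim)  # one closed row plus one current row
```

In a real SSIS or SQL Server implementation the same logic is typically expressed as a `MERGE` against the dimension table; the sketch only shows the versioning rule itself.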
    $90k-123k yearly est. 18h ago
  • Synthetic Data Engineer (Observability & DevOps)

    Sepal

    Data engineer job in New York, NY

    About the Role: We're building a large-scale synthetic data generation engine to produce realistic observability datasets - metrics, logs, and traces - to support AI/ML training and benchmarking. You will design, implement, and scale pipelines that simulate complex production environments and emit controllable, parameterized telemetry data. 🧠 What You'll Do • Design and implement generators for metrics (CPU, latency, throughput) and logs (structured/unstructured). • Build configurable pipelines to control data rate, shape, and anomaly injection. • Develop reproducible workload simulations and system behaviors (microservices, failures, recoveries). • Integrate synthetic data storage with Prometheus, ClickHouse, or Elasticsearch. • Collaborate with ML researchers to evaluate realism and coverage of generated datasets. • Optimize for scale and reproducibility using Docker containers. ✅ Who You Are • Strong programming skills in Python. • Familiarity with observability tools (Grafana, Prometheus, ELK, OpenTelemetry). • Solid understanding of distributed systems metrics and log structures. • Experience building data pipelines or synthetic data generators. • (Bonus) Knowledge of anomaly detection, time-series analysis, or generative ML models. 💸 Pay $50-75/hr depending on experience Remote, flexible hours Project timeline: 5-6 weeks
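The controllable, parameterized metric generation with anomaly injection described above can be sketched roughly as follows. The function name, parameters, and defaults are illustrative assumptions, not Sepal's actual engine:

```python
import math
import random

# Sketch of a parameterized CPU-utilization generator with anomaly injection.
# One sample per minute; a daily sinusoidal cycle plus Gaussian noise,
# with an optional fixed-duration spike injected at a chosen offset.
def cpu_series(n_points, base=40.0, daily_amplitude=15.0,
               noise=2.0, anomaly_at=None, anomaly_magnitude=45.0, seed=0):
    rng = random.Random(seed)  # seeded for reproducibility
    points = []
    for t in range(n_points):
        value = base + daily_amplitude * math.sin(2 * math.pi * t / 1440)
        value += rng.gauss(0, noise)
        if anomaly_at is not None and anomaly_at <= t < anomaly_at + 30:
            value += anomaly_magnitude  # 30-minute CPU spike
        points.append(max(0.0, min(100.0, value)))  # clamp to valid percent range
    return points

# One simulated day with a spike starting at minute 600.
series = cpu_series(1440, anomaly_at=600)
```

The same shape generalizes to latency or throughput by swapping the baseline and clamp; labels for the injected window can be emitted alongside the values to produce supervised anomaly-detection training data.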
    $50-75 hourly 1d ago
  • Data Architect - Semantic & AI Readiness

    GS1 4.3company rating

    Data engineer job in Ewing, NJ

    This is a contractor role with project funding for 3.5 years. The Future of Data Sharing Programme is central to GS1's Vision 2030 - building a globally unified, interoperable and trusted data ecosystem that powers AI-enabled supply chains, supports sustainability data exchange and reinforces digital trust. As Data Architect - Semantic & AI Readiness, you will serve as a data architect and ontology specialist, helping GS1 move from traditional data exchange to semantic, machine-readable and AI-ready data infrastructures. You will design and document the data models, ontologies and governance rules that ensure GS1 registries become the reliable “source of truth” for industry and regulators. This is a unique opportunity to combine hands-on data modelling with global impact - helping define how trusted data will power AI, sustainability, environmental transparency and digital transformation across industries. Responsibilities include the following: Design semantic foundations - Lead the development and maintenance of ontologies, taxonomies and canonical data models aligned with GS1 standards and industry vocabularies. Translate business concepts into data - Work with domain experts and Member Organisations to extract meaning, model relationships and represent business entities in precise, interoperable formats. Specify interoperable data structures - Define and validate linked-data outputs (e.g. RDF, OWL, JSON-LD, SKOS), metadata schemas and API contracts supporting GS1 Registries and data services. Embed data quality by design - Establish validation rules, provenance metadata and governance controls to ensure trustworthy, machine-actionable data for AI and analytics. Support architecture and delivery - Collaborate with product owners, programme leads and technical teams on solution design, requirements, testing and rollout. Align and influence - Engage Member Organisations and partners to harmonise semantic models and promote consistent implementation across the federation. 
Communicate value - Produce clear technical summaries, architecture documents and executive briefings that demonstrate how GS1 data supports AI and sustainability use cases. Education/experience Bachelor's or Master's degree in Information Science, Knowledge Engineering, Computer Science, Data Architecture or related field. 4-6 years' experience in data modelling, ontology development, semantic data integration or information architecture, ideally in international or standards-based environments. Excellent collaboration and communication skills - able to bridge technical and business perspectives and explain complex concepts clearly. Strategic mindset with a passion for AI, sustainability and data trust, and a drive to make technical concepts deliver real-world impact. Skills Must Have Practical experience with ontology and taxonomy tools (e.g. Protégé) and linked-data technologies (RDF, OWL, JSON-LD, SKOS). Familiarity with modern data platforms, API design and data exchange standards. Must be fluent in English, oral and written. Fluency in other languages is helpful. Strong organisational, analytical, verbal, and written communication skills. Demonstrates passion, energy, and drive in their work. Excellent time management skills and flexibility to cater for commitments across multiple time zones. Operates in a manner that demonstrates honesty; keeps promises and honours commitments; behaves in a consistent manner. Nice to Have Understanding of metadata design, data governance, interoperability frameworks and knowledge graph architectures. Interest or experience in sustainability standards, ESG reporting frameworks, lifecycle or circularity data, product environmental footprinting or related domains. This job may require up to 10% global travel. This is a hybrid role with a minimum of 4 to 8 days per month in the Ewing, NJ office, or remote for other locations in the US. IMPORTANT! Please do not contact hiring managers. 
Apply through LinkedIn Recruiter and we will be in touch if you are a good fit. GS1 Overview GS1 develops and maintains the most widely used supply chain standards that are fundamental to numerous enterprises around the world. The best-known symbol of GS1 standards is the barcode, named by the BBC as one of “the 50 things that made the modern economy”. Five decades ago, we started by helping food retailers do business more efficiently and reduce consumer prices. Today, GS1 standards improve the efficiency, safety, and visibility of supply chains across physical and digital channels in 25 sectors, including retail omnichannel and e-commerce, healthcare, transport and logistics, food service, technical industries, and humanitarian logistics. Our scale and reach - local Member Organisations (MO) in 120 countries, 2 million user companies, and 10 billion transactions every day - help ensure that GS1 standards create a common language that supports systems and processes across the globe. GS1 is an Equal Opportunity Employer. We will never unlawfully discriminate on the grounds of race, religion, belief, ethnic origin, colour, nationality, gender, gender reassignment, sexual orientation, age, disability, marriage and civil partnership, pregnancy, maternity, or political opinions.
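The linked-data outputs named in the responsibilities (RDF, OWL, JSON-LD, SKOS) can be illustrated with a minimal JSON-LD product record. The context and property names below are assumptions in the style of schema.org and the GS1 Web Vocabulary, not an official GS1 model:

```python
import json

# Illustrative JSON-LD document for a product identified by a GS1 GTIN.
# "@context" maps short prefixes to vocabulary namespaces; "@id" is a
# resolvable GS1 Digital Link style URI. All terms here are for illustration.
doc = {
    "@context": {
        "schema": "https://schema.org/",
        "gs1": "https://gs1.org/voc/",
    },
    "@type": "schema:Product",
    "@id": "https://id.gs1.org/01/09506000134352",
    "schema:name": "Example product",
    "gs1:gtin": "09506000134352",
}

# JSON-LD is plain JSON, so it round-trips through any JSON serializer;
# semantic tooling (RDF stores, SHACL validators) consumes the same bytes.
serialized = json.dumps(doc, indent=2)
round_tripped = json.loads(serialized)
```

Because the payload stays valid JSON, ordinary APIs can exchange it while RDF-aware consumers expand the prefixed terms into full IRIs for knowledge-graph use.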
    $101k-143k yearly est. 2d ago
  • Data Engineer

    Beaconfire Inc.

    Data engineer job in East Windsor, NJ

    🚀 Junior Data Engineer 📝 E-Verified | Visa Sponsorship Available 🔍 About Us: BeaconFire, based in Central NJ, is a fast-growing company specializing in Software Development, Web Development, and Business Intelligence. We're looking for self-motivated and strong communicators to join our team as a Junior Data Engineer! If you're passionate about data and eager to learn, this is your opportunity to grow in a collaborative and innovative environment. 🌟 🎓 Qualifications We're Looking For: Passion for data and a strong desire to learn and grow. Master's Degree in Computer Science, Information Technology, Data Analytics, Data Science, or a related field. Intermediate Python skills (Experience with NumPy, Pandas, etc. is a plus!) Experience with relational databases like SQL Server, Oracle, or MySQL. Strong written and verbal communication skills. Ability to work independently and collaboratively within a team. 🛠️ Your Responsibilities: Collaborate with analytics teams to deliver reliable, scalable data solutions. Design and implement ETL/ELT processes to meet business data demands. Perform data extraction, manipulation, and production from database tables. Build utilities, user-defined functions, and frameworks to optimize data flows. Create automated unit tests and participate in integration testing. Troubleshoot and resolve operational and performance-related issues. Work with architecture and engineering teams to implement high-quality solutions and follow best practices. 🌟 Why Join BeaconFire? ✅ E-Verified employer 🌍 Work Visa Sponsorship Available 📈 Career growth in data engineering and BI 🤝 Supportive and collaborative work culture 💻 Exposure to real-world, enterprise-level projects 📩 Ready to launch your career in Data Engineering? Apply now and let's build something amazing together! 🚀
    $82k-112k yearly est. 3d ago
  • AI Engineer

    Oscar 4.6company rating

    Data engineer job in New York, NY

    AI Engineer - Healthcare Automation Platform Well-Funded Startup | Healthcare AI | Hybrid (NYC preferred) or Remote About Us We're building an AI-powered automation platform that streamlines critical workflows in healthcare operations. Our system processes complex, unstructured data to ensure time-sensitive information gets where it needs to go-reducing delays and improving operational efficiency for healthcare providers. We're in production with paying enterprise customers and experiencing rapid growth. The Role We're looking for a high-agency AI Engineer to bridge the gap between cutting-edge ML research and real-world product delivery. You'll design and build agentic workflows that automate complex operational processes, combining LLMs, vision models, and structured automation to solve challenging infrastructure and workflow problems. This role involves creating pipelines, evaluation harnesses, and scalable production-grade agents. You'll research and implement the best-fit technology for each workflow, working across the full stack from data collection to orchestration to frontend integration. 
What You'll Build Ship full-stack AI systems end-to-end-from prototype to production Build observability and debugging tools to capture model performance, user feedback, and edge cases Go from ideation to working code within hours; iterate rapidly on experiments and data Design agentic workflows powered by LLMs and vision models for document understanding Create evaluation frameworks to test AI system performance beyond raw model accuracy Work directly with cross-functional teams (ML, Sales, Customer Success) to build AI solutions for diverse use cases What We're Looking For Full-stack engineering experience with web frameworks, backend systems, and cloud infrastructure Proven track record of building, testing, deploying, scaling, and monitoring LLM-centered software architectures Hands-on expertise with LLM APIs and production AI system deployment Understanding of how to evaluate AI systems holistically-beyond model accuracy alone Strong communication skills-ability to write clear technical documentation and explain complex systems Bonus: Experience in healthcare or working with unstructured documents Why Join Us? Drive Impact: High-agency culture where you set the pace and see direct results Own Your Work: End-to-end ownership from research to production deployment Innovate with Purpose: Join a high-caliber team solving real problems at scale Competitive Package: $200K-$240K + equity + comprehensive benefits Great Perks: Unlimited PTO, 100% paid health benefits, 401(k) match, catered lunch, snacks Location: NYC office 4 days/week preferred (Chelsea), remote considered for exceptional candidates Oscar Associates Limited (US) is acting as an Employment Agency in relation to this vacancy.
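The evaluation frameworks this posting mentions ("beyond raw model accuracy") can be sketched in miniature: score a document-extraction system on per-field correctness plus a latency budget. The task, fields, and thresholds are hypothetical, and the stub below stands in for a real LLM call:

```python
# Minimal evaluation harness: grade an extraction system on per-field
# correctness and on meeting a per-case latency budget, not just accuracy.
def evaluate(cases, extract):
    field_hits = total_fields = on_time = 0
    for case in cases:
        predicted, latency_ms = extract(case["document"])
        for field, expected in case["expected"].items():
            total_fields += 1
            field_hits += predicted.get(field) == expected
        on_time += latency_ms <= case["latency_budget_ms"]
    return {
        "field_accuracy": field_hits / total_fields,
        "within_latency_budget": on_time / len(cases),
    }

# Deterministic stub standing in for an LLM-backed extractor.
def fake_extract(document):
    return {"patient_name": document.split(":")[1].strip()}, 120

cases = [{
    "document": "Name: Jane Doe",
    "expected": {"patient_name": "Jane Doe"},
    "latency_budget_ms": 500,
}]
scores = evaluate(cases, fake_extract)
```

Real harnesses add more axes (cost per call, refusal rate, calibration), but the pattern is the same: a fixed case set, a pluggable system under test, and metrics beyond a single accuracy number.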
    $200k-240k yearly 3d ago
  • GTM Engineer

    Camber 3.2company rating

    Data engineer job in New York, NY

    About us: Camber builds software to improve the quality and accessibility of healthcare. We streamline and replace manual work so clinicians can focus on what they do best: providing great care. For more details on our thesis, check out our write-up: What is Camber? We've raised $50M in funding from phenomenal supporters at a16z, Craft Ventures, YCombinator, Manresa, and many others who are committed to improving the accessibility of care. For more information, take a look at: Announcing Camber About our Culture: Our mission to change behavioral health starts with us and how we operate. We don't want to just change behavioral health, we want to change the way startups operate. Here are a few tactical examples: 1) Improving accessibility and quality of healthcare is something we live and breathe. Everyone on Camber's team cares deeply about helping clinicians and patients. 2) We have to have a sense of humor. Healthcare is so broken, it's depressing if you don't laugh with us. About the role: We're seeking a proactive, tech-savvy sales operations professional with a startup mindset-someone who thrives on breaking growth barriers and enabling sales excellence. This person will be both a systems admin and a strategic partner: ensuring HubSpot and our tech stack are humming, while also helping shape compensation, territories, and GTM expansion. 
What you'll do: Systems & CRM Administration Manage and optimize current CRM (HubSpot) and other tech stack integrations: build workflows, dashboards, and troubleshoot system issues Support onboarding/offboarding of users, governance, data hygiene, and adoption Data, Forecasting & Reporting Design and maintain dashboards, reports, and metrics that drive decision-making (e.g., pipeline health, forecast accuracy, win rates) Deliver actionable insights to stakeholders across sales leadership Compensation & Territory Strategy Assist in designing incentive and quota plans that align with sales goals Collaborate on territory definition, alignment, and carve strategy to ensure balanced coverage Process & Cross-Functional Enablement Streamline sales workflows and marketing-sales handoffs Partner across teams-Sales, Marketing, Finance-to ensure operational alignment and seamless execution Strategic & Tactical Execution Be hands-on when needed (data crunching, HubSpot tweaks) while contributing to broader sales strategy planning What we're looking for: 2-4 years in startup, sales operations, or Rev-Ops environment (or similar roles) CRM administration experience-ideally HubSpot; bonus if familiar with other tools and workflows Strong analytical skills-coding, Excel, BI, sales forecasting, data modeling Operational rigor and problem-solving mindset A strategic thinker who can scale systems and structure Thrives in growth-stage constraints; comfortable wearing multiple hats and moving quickly Perks & Benefits at Camber: Comprehensive Health Coverage: Medical, dental, and vision plans with nationwide coverage, including 24/7 virtual urgent care. Mental Health Support: Weekly therapy reimbursement up to $100, so you can prioritize the care that works best for you. Paid Parental Leave: Up to 12 weeks of fully paid time off for new parents (birth, adoption, or foster care). 
Financial Wellness: 401K (traditional & Roth), HSA & FSA options, and monthly commuter benefits for NYC employees. Time Off That Counts: 18 PTO days per year (plus rollover), plus office closures for holidays, monthly team events, company off-sites, and daily, in-office lunches for our team. Fitness Stipend: $100/month to use on fitness however you choose. Hybrid Flexibility: In NYC? We gather in the office 3-5x/week, with flexibility when life happens. Fridays are remote-friendly. Camber is based in New York City, and we prioritize in-person and hybrid candidates. Building an inclusive culture is one of our core tenets as a company. We're very aware of structural inequalities that exist, and recognize that underrepresented minorities are less likely to apply for a role if they don't think they meet all of the requirements. If that's you and you're reading this, we'd like to encourage you to apply regardless - we'd love to get to know you and see if there's a place for you here! In addition, we take security seriously, and all of our employees help uphold security requirements and maintain compliance with HIPAA security regulations.
    $101k-143k yearly est. 5d ago
  • Plumbing Engineer

    Precision Engineering

    Data engineer job in New York, NY

    🔎 We're Hiring: Senior Plumbing & Fire Protection Engineer / MEP Designer (On-Site - Brooklyn, NY) Precision Design, a leading MEP Engineering firm in Brooklyn, NY, is seeking a Senior Plumbing Engineer / MEP Designer with strong experience in Plumbing and Fire Protection. We are looking for a highly skilled professional who can independently design systems, coordinate across multiple disciplines, and manage multiple projects in a fast-paced environment. Candidates must have at least 5 years of industry experience, including a minimum of 3 years designing in NYC, and must be fully knowledgeable of NYC Building and Energy Codes. 💼 Responsibilities Design Plumbing & Fire Protection systems from concept through full construction documents Prepare calculations for water, gas, sanitary/sewer, and storm loads Perform field surveys and assess existing building conditions Produce drawings, specifications, and all phases of design (schematic → construction administration) Coordinate with architectural, engineering, and external project teams, including contractors and city agencies Manage multiple projects simultaneously Review shop drawings and participate in project meetings 📘 Required Skills & Experience 5+ years of related experience in Plumbing and/or Fire Protection design At least 3 years of NYC-specific design experience Strong knowledge of NYC Building Codes, NYC Energy Conservation Code, and NYC filing requirements Experience with utility company filing procedures Proficiency in AutoCAD (Revit is a plus) Familiarity with NFPA-13, NFPA-13R, and hydraulic calculations Experience with DEP cross-connection and site connection submissions is strongly preferred Excellent communication, teamwork, and interpersonal skills Ability to work independently and manage multiple deadlines 📍 Work Location On-site in our Brooklyn, NY office (no remote option)
    $74k-100k yearly est. 4d ago
  • Staff Software Engineer

    Cloudkitchens 3.6company rating

    Data engineer job in New York, NY

    Who We Are At City Storage Systems (CSS), we are dedicated to building Infrastructure for Better Food. Our mission is to empower restaurateurs worldwide to thrive in the online food delivery market. By making food more affordable, of higher quality, and convenient, we're transforming the industry for everyone, from budding entrepreneurs opening their first restaurant to global quick-service chains. What You'll Do As a backend-focused Software Engineer at CSS, you'll play a crucial role in our data-driven development team, helping to advance our state-of-the-art menu platform. Your responsibilities will include: Data-Driven Development: Contribute to our data-centric development efforts. Project Planning: Participate in strategic planning for various internal tools. Agile Methodologies: Implement and test software using agile methodologies. Collaborative Teamwork: Work closely with a team to enhance and support our technology. Code Contribution: Write, debug, maintain, and test code across multiple projects. Architectural Design: Design scalable systems with a focus on robust architecture. Continuous Improvement: Engage in continuous improvement initiatives. Innovation: Drive innovation within the team and support technological advancements at CSS. What the Team Focuses On Our menu platform (check our tech blog) offers comprehensive menu management features designed to streamline restaurant operations, enhance customer experiences, and optimize performance. It serves as a single source of truth for menus, seamlessly integrating with online channels such as DoorDash, UberEats, and Grubhub and offline point-of-sale (POS) systems like Square, Toast, and NCR. Key capabilities include updating menus with new items, pricing, and taxes, performing A/B testing on different structures, setting availability by channel, creating combos and promotions, managing ingredients and SKUs, and configuring operational hours. 
Additionally, our platform features automated linking to ensure POS and online menus are always synchronized, minimizing discrepancies. Boasting a 99.9% availability rate, our platform supports a vast network of brands in the US and worldwide, ensuring uninterrupted service. Over 100,000 restaurateurs use our platform daily to streamline their operations and consistently express high satisfaction. What We're Looking For Education: Bachelor's Degree in Computer Science or equivalent. Experience: 7-10 years of experience in a relevant role. Individual Contribution: Proven track record of significant contributions in previous roles, demonstrating your impact. Architectural Skills: Ability to design and create robust architecture from scratch and evolve existing systems. Communication Skills: Strong communication and presentation skills, with the ability to collaborate with non-engineering stakeholders. Technical Expertise: Experience designing and implementing scalable, reliable, and efficient distributed systems. Familiarity with Java / Go / Kotlin is required. Concurrency: Experience building systems that can execute multiple tasks while managing overlapping run-time and space complexities simultaneously. Application Maintenance: Experience in maintaining and extending large-scale, high-traffic applications. Why Join Us Growing Market: You'll be part of an $80 billion market projected to reach at least $500 billion by 2030 in the US alone. Industry Impact: Join a team that is transforming the restaurant industry and helping restaurants succeed in online food delivery. Collaborative Environment: Benefit from the support and guidance of experienced colleagues and managers, who will help you learn, grow, and achieve your goals. Work closely with other teams to ensure our customers' success. Additional Information This role is based in our Mountain View office. We look forward to sharing more about a meaningful career at CSS!
    $119k-163k yearly est. 3d ago
  • ETL Talend MDM Architect

    TRG 4.6company rating

    Data engineer job in New York, NY

    Responsibilities: • Develop and test Extract, Transformation, and Loading (ETL) modules based on design specifications • Develop and test ETL Mappings in Talend • Plan, test, and deploy ETL mappings and database code as part of the application build process across the enterprise • Provide effective communications with all levels of internal and external customers and staff • Must demonstrate knowledge in the following areas: o Data Integration o Data Architecture o Team Lead experience is a plus • Understand, analyze, assess and recommend the ETL environment from a technology strategy and operational standpoint • Understand and assess source system data issues and recommend solutions from a data integration standpoint • Create high-level and low-level technical design documents for data integration • Design exception handling, audit and data resolution processes • Performance tune the ETL environment • Conduct proofs of concept • Estimate work based on functional requirements documents • Identify system deficiencies and recommend solutions • Design, code, and write unit test cases from functional requirements • Deliver efficient and bug-free ETL packages and documentation • Maintain and support enterprise ETL jobs • Experience with Talend Hadoop tools is a plus Basic Qualifications: • 3+ years of development experience on Talend ETL tools • 7+ years working with one or more of the following ETL tools: Talend, Informatica, Ab Initio or DataStage • 7+ years of proficient experience as a developer • Bachelor's Degree in Computer Science or equivalent • Database (Oracle, SQL Server, DB2) • Database Programming (complex SQL, PL/SQL development knowledge) • Data Modeling • Business Analysis • Top-level performer with the ability to work independently in short time frames • Proficient working in a Linux environment • Experience in scripting languages (Shell, Python or Perl) • 5+ years of experience deploying large-scale ETL projects • 3+ years of experience in a development lead position • Data analysis, data mapping, data loading, and data validation • Understanding of reusability, parameterization, workflow design, etc. • Thorough understanding of the entire software life cycle and various software engineering methodologies • Performance tuning of interfaces that extract, transform and load tens of millions of records • Knowledge of Hadoop ecosystem technologies is a plus Additional Information If you are comfortable with the position and location, please reply at the earliest with your updated resume and the following details, or call me back at my number. Full Name: Email: Skype id: Contact Nos.: Current Location: Open to relocate: Start Availability: Work Permit: Flexible time for INTERVIEW: Current Company: Current Rate: Expected Rate: Total IT Experience [Years]: Total US Experience [Years]: Key Skill Set: Best time to call: If you are not interested, I would be grateful if you could pass this position along to colleagues or friends who might be. All your information will be kept confidential according to EEO guidelines.
    $100k-125k yearly est. 60d+ ago
  • ETL Architect

    Us Tech Solutions 4.4company rating

    Data engineer job in Jersey City, NJ

    US Tech Solutions is a global staff augmentation firm providing a wide range of on-demand talent and total workforce solutions. To learn more about US Tech Solutions, please visit our website ************************ We are constantly on the lookout for professionals to fulfill our clients' staffing needs, setting the right expectations and thereby accelerating the mutual growth of the individual and the organization. With that intent in mind, we would like you to consider the job opening with US Tech Solutions that fits your expertise and skill set.

    Job Description

    Responsibilities:
    • Assist in identifying and resolving critical-path issues related to the generation of technical specifications and the associated Informatica ETL development
    • Design and develop ETL mappings and related components; able to articulate solutions in architectural terms
    • Author technical solution documents and architecture components
    • Provide technical support and troubleshooting for data integration issues
    • Work with the business intelligence team to design and develop reporting and dashboard solutions
    • Create and maintain detailed technical documentation on developments
    • Fine-tune ETLs and database components for performance
    • Define the testing strategy and support its execution
    • Define the deployment and release management strategy and provide support

    Qualifications:
    • BS in Computer Science or a related IT field, or equivalent work experience
    • 3-5 years of development experience with database solutions (Oracle SQL, PL/SQL) and Unix scripting
    • 5+ years of hands-on data integration (ETL) experience, specifically with Informatica PowerCenter 9.5 and above
    • Experience developing data warehouse solutions in the finance industry
    • Ability to analyze information architectures and to understand and derive business requirements
    • Ability to collaborate and work in a team environment
    • Proactive thinking and the ability to work under minimal supervision
    • Excellent analytical and communication skills and the ability to deal effectively with customers
    • Good oral and written communication skills for working with both technical and banking users
    • Strong project experience providing data integration solutions for Data Warehouse and/or Data Mart implementations

    Additional Information
    Kushal Kumar **********
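    The ETL mapping work described above follows the same extract-transform-load pattern regardless of tool. A minimal sketch in plain Python, purely illustrative (the posting's actual work would be done in PowerCenter mappings, and all record/field names here are hypothetical):

```python
# Minimal extract-transform-load sketch (illustrative; names are hypothetical).
# A PowerCenter mapping would express the same steps as a source qualifier,
# expression transformations, and a target definition.

def extract(source_rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize fields and cast types for the target schema."""
    out = []
    for r in rows:
        out.append({
            "customer_id": int(r["customer_id"]),
            "name": r["name"].strip().title(),
            "balance_usd": round(float(r["balance"]), 2),
        })
    return out

def load(rows, target):
    """Load: append transformed records to the target table (a list here)."""
    target.extend(rows)
    return len(rows)

source = [{"customer_id": "42", "name": "  ada lovelace ", "balance": "1000.45"}]
target = []
loaded = load(transform(extract(source)), target)
print(target[0])  # {'customer_id': 42, 'name': 'Ada Lovelace', 'balance_usd': 1000.45}
```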
    $110k-141k yearly est. 60d+ ago
  • ETL Architect

    Integrated Resources 4.5company rating

    Data engineer job in New York, NY

    A Few Words About Us
    Integrated Resources, Inc. is a premier staffing firm recognized as one of the tri-state area's most well-respected professional specialty firms. IRI has built its reputation on excellent service and integrity since its inception in 1996. Our mission centers on delivering only the best quality talent, the first time and every time. We provide quality resources in four specialty areas: Information Technology (IT), Clinical Research, Rehabilitation Therapy, and Nursing.

    Position: ETL Architect
    Location: NYC
    Duration: 6 months

    Job Description:
    This opportunity is for individuals with hands-on experience in data warehouse design and development. The role demands more than a typical ETL lead role, as it interacts on projects with architects, PMs, operations, data modelers, developers, admins, DBAs, and testers. This is a hands-on, delivery-focused role; the individual will be responsible for the technical delivery of data warehouse and data integration projects.

    Must-have skills:
    • 7-10 years of hands-on experience designing and developing Informatica ETL processes based on multiple sources
    • Experience architecting end-to-end ETL solutions
    • Hands-on UNIX experience, including scripting (e.g., shell, Perl, alerts, cron, automation)
    • Expertise in all aspects of relational database design
    • Experience working with engineering teams on database-related performance tuning, writing complex SQL, indexing, etc.

    Good to have:
    • Experience with IDQ, MDM, and other ETL tools
    • Experience with dashboard and report development
    • Experience with financial services firms preferred

    Additional Information
    Kind Regards
    Sachin Gaikwad
    Technical Recruiter
    Integrated Resources, Inc.
    Direct Line: 732-429-1920
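    The performance-tuning and indexing skills this role calls for can be illustrated with a small, self-contained example. The sketch below uses Python's built-in sqlite3 as a stand-in database (the posting's actual environment, Oracle plus Informatica on UNIX, is not assumed); it shows how adding an index on a filtered column changes the query plan from a full scan to an index search:

```python
import sqlite3

# Self-contained illustration of index-driven query tuning, using SQLite as a
# stand-in for whatever RDBMS the project actually runs on.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades (account, amount) VALUES (?, ?)",
    [(f"ACCT{i % 100:03d}", float(i)) for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM trades WHERE account = ?"

# Without an index, the planner falls back to a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, ("ACCT007",)).fetchall()

# Adding an index on the filtered column changes the access path.
conn.execute("CREATE INDEX idx_trades_account ON trades (account)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, ("ACCT007",)).fetchall()

print("before:", plan_before[-1][-1])  # a SCAN of trades
print("after: ", plan_after[-1][-1])   # a SEARCH using idx_trades_account
```

On a real Oracle warehouse the same investigation would go through the optimizer's execution plan, but the reasoning (scan vs. index search on selective predicates) is identical.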
    $102k-130k yearly est. 60d+ ago
  • Exceptional Software Engineers (Coding Agent Experience)

    Mercor

    Data engineer job in New York, NY

    Mercor is seeking software engineers to support one of the world's leading AI labs in building **robust, high-performance systems** that serve the needs of next-generation machine learning applications. This role involves **real-world engineering work**, including environment configuration, database design, and the creation of scalable APIs and service layers that interface with advanced AI models.

    **You are a good fit if you:**
    - **Have experience using coding agents** as part of your software engineering workflow
    - Have 3+ years of elite software engineering experience from top-tier technology startups, quantitative trading firms, hedge funds, or similarly demanding environments
    - Hold a Computer Science degree from a prestigious university
    - Have demonstrated success leading teams to build complex database schemas
    - Possess expert-level proficiency in API development, including creation, testing, and integration
    - Are highly skilled in SQL and database structuring
    - Demonstrate exceptional attention to detail and rigorous problem-solving skills
    - Excel in both written and verbal communication

    **About the Role**
    - This project will be a high-impact 24-hour sprint starting in the next 1-2 weeks
    - The role offers task-based pay (top performers in the previous iteration earned upwards of $1,000 in the sprint)

    **Compensation and Legal Details**
    - You will be legally classified as an hourly contractor for Mercor
    - We will pay you at the end of each week via Stripe Connect

    **About Mercor**
    Mercor connects elite creative and technical talent with leading AI research labs, and is headquartered in San Francisco, CA. Our distinguished investors include Benchmark, General Catalyst, Peter Thiel, Adam D'Angelo, Larry Summers, and Jack Dorsey. Apply today and redefine digital creativity alongside groundbreaking AI technologies!
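    The "complex database schemas" skill this posting highlights is largely about enforcing relationships between tables. A minimal sketch, using Python's built-in sqlite3 so it runs anywhere (all table and column names are hypothetical, not from the posting):

```python
import sqlite3

# Two related tables with a foreign-key constraint, shown in SQLite so the
# example is self-contained; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK checks off by default
conn.executescript("""
    CREATE TABLE users (
        id    INTEGER PRIMARY KEY,
        email TEXT NOT NULL UNIQUE
    );
    CREATE TABLE api_keys (
        id      INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL REFERENCES users(id),
        token   TEXT NOT NULL UNIQUE
    );
""")
conn.execute("INSERT INTO users (id, email) VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO api_keys (user_id, token) VALUES (1, 'tok-1')")

# Referential integrity: inserting a key for a nonexistent user must fail.
try:
    conn.execute("INSERT INTO api_keys (user_id, token) VALUES (99, 'tok-2')")
    fk_violation = False
except sqlite3.IntegrityError:
    fk_violation = True
print(fk_violation)  # True
```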
    $80k-107k yearly est. 35d ago
  • Java Software Engineer

    Beaconfire Inc.

    Data engineer job in New York, NY

    BeaconFire is based in Central NJ, specializing in Software Development, Web Development, and Business Intelligence. We are looking for candidates with a strong background in Software Engineering or Computer Science for a Java/Software Developer position.

    Responsibilities:
    ● Develop software and web applications using Java 8/J2EE/Java EE (and higher), React.js, Angular 2+, SQL, Spring, HTML5, CSS, JavaScript, and TypeScript, among other tools
    ● Write scalable, secure, maintainable code that powers our clients' platforms
    ● Create, deploy, and maintain automated system tests
    ● Work with testers to understand opened defects and resolve them in a timely manner
    ● Support continuous improvement by investigating alternative technologies and presenting them for architectural review
    ● Collaborate effectively with other team members to accomplish shared user story and sprint goals

    Basic Qualifications:
    ● Experience in JavaScript or a similar programming language (e.g., Java, Python, C, C++, C#) and an understanding of the software development life cycle
    ● Basic object-oriented programming (OOP) skills, with in-depth knowledge of common APIs and data structures such as Collections, Maps, Lists, and Sets
    ● Knowledge of relational databases (e.g., SQL Server, Oracle) and basic SQL query skills

    Preferred Qualifications:
    ● Master's Degree in Computer Science (CS)
    ● 0-1 year of practical experience in Java coding
    ● Experience using the Spring, Maven, and Angular frameworks, plus HTML and CSS
    ● Knowledge of other contemporary Java technologies (e.g., WebLogic, RabbitMQ, Tomcat)
    ● Knowledge of JSP, J2EE, and JDBC

    Compensation: $65,000.00 to $80,000.00 /year
    BeaconFire is an E-Verify company. Work visa sponsorship is available.
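    The collections knowledge this posting asks for (Maps, Lists, Sets) is language-agnostic. A compact sketch of the idea, shown in Python for a self-contained example even though the posting's stack is Java; the dict/set/list used here correspond directly to Java's HashMap, HashSet, and ArrayList (the defect data is invented for illustration):

```python
# Map/List/Set exercise of the kind the qualifications describe.
# Java equivalent: HashMap<String, HashSet<String>> plus a sorted ArrayList.
defect_reports = [
    ("login", "DEF-1"), ("checkout", "DEF-2"),
    ("login", "DEF-3"), ("login", "DEF-1"),  # duplicate report
]

# Map from component -> Set of unique defect IDs (the set dedupes DEF-1).
by_component = {}
for component, defect_id in defect_reports:
    by_component.setdefault(component, set()).add(defect_id)

# List of components ordered by unique-defect count, descending.
ranked = sorted(by_component, key=lambda c: len(by_component[c]), reverse=True)

print(ranked)                         # ['login', 'checkout']
print(sorted(by_component["login"]))  # ['DEF-1', 'DEF-3']
```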
    $65k-80k yearly 1d ago

Learn more about data engineer jobs

How much does a data engineer earn in Piscataway, NJ?

The average data engineer in Piscataway, NJ earns between $71,000 and $128,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Piscataway, NJ

$96,000

What are the biggest employers of Data Engineers in Piscataway, NJ?

The biggest employers of Data Engineers in Piscataway, NJ are:
  1. Colgate-Palmolive
  2. InvestCloud
  3. Orion Innovation
  4. Ernst & Young
  5. Citizens Financial Group
  6. CompoSecure
  7. MetLife
  8. Johnson & Johnson
  9. Colgate University
  10. Citizens Alliance