
Requirements engineer jobs in Plainfield, NJ

- 2,568 jobs
  • AI/ML Engineer (Customer Facing, FDE, Python AI Agent/LLM)

    The Developer Link

    Requirements engineer job in New York, NY

    Python Engineer / Data Scientist (Forward Deployed) - LLM/AI Agent products in Legal Tech. Salary: $170,000-$190,000 + benefits. Company: Late-Stage Scaleup in Legal AI Software.

    Our client, a rapidly expanding global Legal AI software company backed by top-tier investors, is transforming how legal teams operate through intelligent automation and applied AI. They are hiring 2 x customer-facing, hands-on Python Engineers / Data Scientists to join their brand-new Forward Deployed team in Manhattan, New York. This role sits at the intersection of engineering, data, and client delivery, ideal for someone who thrives on technical problem-solving while working directly with enterprise customers. You should expect the role to be 50% hands-on coding. As a Forward Deployed technologist, you will work with real customers on real problems, delivering bespoke, high-impact solutions.

    Responsibilities include:
    • Working closely with Technical and Legal Architects to qualify, scope, and execute bespoke client development requests.
    • Rapidly prototyping solutions using APIs, large language models, and supporting technologies to demonstrate feasibility and value.
    • Building and adapting integrations that fit into complex client environments, ensuring smooth onboarding and adoption.
    • Engaging directly with client technical teams to troubleshoot, debug, and optimise deployments in real time.
    • Translating experimental R&D concepts into production-quality code that can evolve into productised features.
    • Maintaining a strong feedback loop between client engagements and core engineering so that real-world learnings influence the product roadmap.
    • Balancing speed and stability, knowing when to produce a quick proof of concept and when to harden code for long-term reliability.
    • Collaborating with legal architects, product managers, and researchers to push the boundaries of AI-enabled legal technology.

    What We're Looking For:
    • Strong hands-on Python development experience (for example FastAPI, data pipelines, automation, integrations).
    • Experience with AI/ML workflows, NLP, or LLM-driven solutions.
    • Strong communication skills and confidence working directly with both technical and non-technical customer stakeholders.
    • Ability to own problems end-to-end, from diagnosis to delivery.
    • Experience in a customer-facing engineering / forward-deployed environment.
    • Bonus: exposure to legal tech, enterprise SaaS, or complex integration projects.

    Please apply with your resume if interested. If you have any exposure to Legal Tech, please email ************************ for faster review.
    $170k-190k yearly 2d ago
  • Cloud Engineer

    The Phoenix Group 4.8company rating

    Requirements engineer job in New York, NY

    Cloud Infrastructure Engineer

    We are seeking a skilled Cloud Infrastructure Engineer to design, implement, and maintain secure, scalable, and resilient cloud infrastructure solutions. The role involves leveraging SaaS and cloud-based technologies to solve complex business challenges and support global operations.

    Responsibilities:
    • Implement and support enterprise-scale cloud solutions and integrations.
    • Build and automate cloud infrastructure using IaC tools such as Terraform, CloudFormation, or ARM templates.
    • Deploy and support Generative AI platforms and cloud-based vendor solutions.
    • Implement and enforce cloud security best practices, including IAM, encryption, network segmentation, and compliance with industry standards.
    • Establish monitoring, logging, and alerting frameworks to ensure high availability and performance.
    • Optimize cost, performance, and reliability of cloud services.
    • Participate in on-call rotations and provide support for cloud infrastructure issues.
    • Maintain documentation, conduct knowledge transfer sessions, and perform design peer reviews.

    Experience:
    • 5+ years in cloud infrastructure engineering, preferably in regulated industries.
    • Deep expertise in at least one major cloud platform (Azure, AWS, or GCP).
    • Proficient with Azure and related services (AI/ML tools, security, automation, governance).
    • Familiarity with SIEM, CNAPP, EDR, Zero Trust architecture, and MDM solutions.
    • Experience with SaaS integrations and managing third-party cloud services.
    • Understanding of virtualization, containerization, auto-scaling, and fully automated systems.
    • Experience scripting in PowerShell and Python; working knowledge of REST APIs.
    • Networking knowledge (virtual networks, DNS, SSL, firewalls) and IT change management.
    • Strong collaboration, interpersonal, and communication skills.
    • Willingness to participate in on-call rotations and after-hours support.

    The Phoenix Group Advisors is an equal opportunity employer.
We are committed to creating a diverse and inclusive workplace and prohibit discrimination and harassment of any kind based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. We strive to attract talented individuals from all backgrounds and provide equal employment opportunities to all employees and applicants for employment.
    $83k-115k yearly est. 2d ago
  • AV Engineer

    DTG Consulting Solutions, Inc.

    Requirements engineer job in New York, NY

    Full-Time Onsite Position

    We're looking for an experienced Audio/Visual Engineer to design, implement, and support a cutting-edge AV infrastructure. As our organization continues to grow, technology remains at the heart of our business, and our AV needs are evolving right alongside it. In this role, you'll be the subject matter expert for all things AV: from designing globally scalable, fault-tolerant systems to supporting live events, collaboration platforms, digital signage, and complex event spaces. You'll work closely with the Technology and Infrastructure teams to deliver high-quality, reliable AV solutions that enhance communication and collaboration across the firm.

    What You'll Do:
    • Architect and manage enterprise-scale AV systems (meeting rooms, event spaces, IPTV, digital signage, Cisco VC solutions).
    • Provide hands-on support for live and hybrid events.
    • Collaborate on acoustic treatments and space design to optimize performance.
    • Partner with vendors, architects, and cross-functional teams on large-scale buildouts.
    • Ensure world-class reliability and user experience in every solution.

    About You:
    • 7+ years in AV engineering with enterprise-scale systems.
    • Expertise in AVoIP, DSPs, control systems, and collaboration tools (Zoom, Webex, Teams).
    • Strong technical production skills for hybrid/live events.
    • Solid understanding of networking fundamentals (VLANs, routing, firewalls).
    • Skilled in multitasking and leading multiple complex projects.
    • Bonus: AutoCAD/Revit or programming experience.
    $74k-100k yearly est. 4d ago
  • M365 Collaboration Engineer

    Stand 8 Technology Consulting

    Requirements engineer job in New York, NY

    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including internationally in Mexico and India.

    We are seeking a highly experienced M365 & Collaboration Engineer to architect, implement, and support modern collaboration and unified communication solutions across the enterprise. This senior-level role will lead end-to-end design, optimization, troubleshooting, and lifecycle management for Microsoft 365, unified communications, and integration capabilities. The ideal candidate combines technical depth, architectural vision, strong scripting skills, and the ability to collaborate across engineering, security, and business teams. This is a hybrid position, working on-site Monday through Thursday and remotely on Fridays.

    Key Responsibilities

    Architecture, Design & Implementation:
    • Lead design, implementation, and lifecycle support for enterprise collaboration products (M365, Slack, Zoom, Teams, Unified Communication systems, etc.).
    • Architect scalable solutions aligned with organizational strategy and the future technology roadmap.
    • Oversee full-stack collaboration deployments while simultaneously supporting ongoing architectural projects.
    • Design and implement unified communication and collaboration workflows using Microsoft 365 technologies.

    Operations, Troubleshooting & Optimization:
    • Provide advanced troubleshooting, performance tuning, and reliability improvements for M365 and other collaboration tools.
    • Perform incident root-cause analysis and develop remediation strategies.
    • Plan and deliver infrastructure improvements with a 6-month outlook.

    Integration & Cross-Functional Collaboration:
    • Work closely with cybersecurity teams and vendors to integrate M365 with other cloud and on-premise systems.
    • Support technology assessments and collaborate with solution partners and internal stakeholders.
    • Ensure interoperability between Microsoft 365 and other enterprise applications.

    Security, Governance & Documentation:
    • Maintain security and compliance across the Microsoft 365 environment by implementing appropriate policies and controls.
    • Document architecture, configurations, and solution designs.
    • Provide training and knowledge transfer to internal teams and stakeholders.

    Qualifications:
    • 7+ years of professional experience in Microsoft 365 architecture, engineering, and implementation.
    • Expert-level knowledge of M365 services (Exchange Online, SharePoint Online, Teams, OneDrive, Power Platform, etc.).
    • 7+ years of experience with PowerShell and scripting for administration and automation.
    • 5+ years of hands-on experience with Azure services (AAD, App Services, SQL, Storage, Functions, Logic Apps, DevOps, etc.).
    • Strong background in enterprise unified communication and collaboration solutions.
    • Experience monitoring, troubleshooting, and performing root-cause analysis of M365 issues.
    • Understanding of security/compliance practices within Microsoft environments.
    • Experience integrating M365 with cloud and on-prem systems.
    • Strong documentation, communication, and cross-functional collaboration skills.
    • Knowledge of networking fundamentals and common internet protocols.
    • Email/calendaring experience (Exchange, Outlook, Proofpoint).
    • Ability to work in fast-paced Agile environments.
    • Bachelor's degree in a computer-related field or equivalent experience.
    • Flexibility to work evenings/weekends as needed.

    Preferred Qualifications:
    • Microsoft certifications such as M365 Enterprise Administrator Expert, Teams Administrator Associate, or Security Administrator Associate.
    • Strong presentation and documentation abilities.
    • Ability to manage multiple projects simultaneously.
    • Ability to work both independently and within a team environment.

    Benefits:
    • Medical coverage and Health Savings Account (HSA) through Anthem
    • Dental/Vision/Various Ancillary coverages through Unum
    • 401(k) retirement savings plan
    • Paid-time-off options
    • Company-paid Employee Assistance Program (EAP)
    • Discount programs through ADP WorkforceNow

    Additional Details
    The base range for this contract position is $35-$45 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.

    About Us
    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally, with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees. Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY. Check out more at ************** and reach out today to explore opportunities to grow together!

    By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
    $35-45 hourly 3d ago
  • AI / ML Engineer

    Wall Street Consulting Services LLC

    Requirements engineer job in Warren, NJ

    Title: AI Engineer or MCP Developer
    Duration: Long Term Contract
    Kindly share your resumes to ****************

    Description: An MCP Developer in commercial P&C insurance is typically an IT role focused on developing systems and integrations using the Model Context Protocol (MCP) to leverage Artificial Intelligence (AI) and Large Language Models (LLMs) within insurance operations. This role involves building the infrastructure that allows AI agents to securely and reliably access and act upon internal P&C data sources (e.g., policy systems, claims databases, underwriting documents), thereby enhancing automation and decision-making in core insurance functions like underwriting and claims processing.

    Responsibilities:
    • AI Integration: Develop and implement robust integrations between AI models (LLMs) and internal data repositories and business tools using the Model Context Protocol (MCP).
    • System Development: Build and maintain MCP servers and clients to expose necessary data and capabilities to AI agents.
    • Workflow Automation: Design and implement agentic workflows that allow AI systems to perform complex, multi-step tasks, such as accessing real-time policy data, processing claims information, and updating customer records.
    • Security & Compliance: Implement secure coding practices and ensure all AI interactions and data exchanges via MCP adhere to insurance industry regulations and internal compliance standards (e.g., data privacy, secure data handling).
    • API Management: Work with existing APIs (REST/SOAP) and develop new ones to facilitate data flow to and from the MCP environment.
    • Collaboration: Partner with actuaries, underwriters, claims specialists, and IT teams to identify AI opportunities and ensure seamless solution deployment.
    • Testing & Quality Assurance: Perform testing to ensure AI-driven outputs are accurate and reliable, and maintain high performance levels.
    • Documentation: Document all development processes, system architectures, and operational procedures for MCP integrations.

    Experience:
    • 3+ years of experience in software development or AI integration, preferably within the insurance or financial services industry.
    • P&C Knowledge: Strong knowledge of Commercial P&C insurance products, underwriting processes, and claims systems is highly preferred.
    • Technical Expertise: Proficiency in programming languages like Python, Java, or similar. Experience with API development and management. Familiarity with cloud platforms (AWS, Azure, GCP) and containerization tools (Docker, Kubernetes). Understanding of the Model Context Protocol (MCP) specification and SDKs.
    $70k-94k yearly est. 4d ago
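The listing above centers on exposing internal insurance data to LLM agents through MCP servers. As a rough illustration of the request/dispatch pattern such a server implements (the real MCP SDKs handle transport, JSON-RPC framing, schemas, and auth), here is a toy stdlib-only sketch; the policy record, the `get_policy` tool, and the `handle` dispatcher are all hypothetical, not part of the actual protocol implementation:

```python
# Simplified, hypothetical sketch of an MCP-style tool server:
# an agent first lists available tools, then calls one by name
# with arguments. Data and tool names are invented.

# In-memory stand-in for a P&C policy system.
POLICIES = {
    "POL-1001": {"insured": "Acme Corp",
                 "line": "Commercial Property",
                 "limit": 5_000_000},
}

def tool_get_policy(args: dict) -> dict:
    """Return policy details for an AI agent to reason over."""
    return POLICIES.get(args["policy_id"], {"error": "not found"})

TOOLS = {"get_policy": tool_get_policy}

def handle(request: dict) -> dict:
    """Dispatch a simplified MCP-style request to the right tool."""
    if request["method"] == "tools/list":
        return {"tools": sorted(TOOLS)}
    if request["method"] == "tools/call":
        params = request["params"]
        return {"result": TOOLS[params["name"]](params["arguments"])}
    return {"error": "unknown method"}

print(handle({"method": "tools/list"}))
print(handle({"method": "tools/call",
              "params": {"name": "get_policy",
                         "arguments": {"policy_id": "POL-1001"}}}))
```

In a production MCP integration the same shape (discover tools, call a tool, return structured results) is what lets an agent pull live policy or claims data without direct database access.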
  • Neo4j Engineer

    Tata Consultancy Services 4.3company rating

    Requirements engineer job in Summit, NJ

    Must-Have Technical/Functional Skills: Neo4j, Graph Data Science, Cypher, Python, Graph Algorithms, Bloom, GraphXR, Cloud, Kubernetes, ETL

    Roles & Responsibilities:
    • Design and implement graph-based data models using Neo4j.
    • Develop Cypher queries and procedures for efficient graph traversal and analysis.
    • Apply Graph Data Science algorithms for community detection, centrality, and similarity.
    • Integrate Neo4j with enterprise data platforms and APIs.
    • Collaborate with data scientists and engineers to build graph-powered applications.
    • Optimize performance and scalability of graph queries and pipelines.
    • Support deployment and monitoring of Neo4j clusters in cloud or on-prem environments.

    Salary Range: $110,000-$140,000 / year

    TCS Employee Benefits Summary: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
    $110k-140k yearly 2d ago
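The Neo4j listing above names centrality as one of the Graph Data Science algorithms the role applies. As a hedged, stdlib-only sketch of what degree centrality computes (in Neo4j itself this would be a GDS procedure call from Cypher rather than hand-rolled Python), here is a toy example on an invented graph:

```python
# Toy illustration of degree centrality on a tiny undirected graph.
# Node names and edges are made up; in Neo4j the equivalent would be
# a Graph Data Science call such as CALL gds.degree.stream(...).
from collections import defaultdict

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

degree = defaultdict(int)
for u, v in edges:        # undirected: each edge counts for both endpoints
    degree[u] += 1
    degree[v] += 1

ranked = sorted(degree.items(), key=lambda kv: -kv[1])
print(ranked)  # node "C" ranks highest with degree 3
```

Nodes with high degree centrality are the well-connected hubs; the same idea, run at scale inside the database, is what surfaces influential entities in fraud rings or customer networks.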
  • Senior Data Engineer

    Godel Terminal

    Requirements engineer job in New York, NY

    Godel Terminal is a cutting-edge financial platform that puts the world's financial data at your fingertips. From equities and SEC filings to global news delivered in milliseconds, thousands of customers rely on Godel every day to be their guide to the world of finance. We are looking for a senior engineer in New York City to join our team and help build out live data services as well as historical data for US markets and international exchanges. This position will specifically work on new asset classes and exchanges, but will be expected to contribute to the core architecture as we expand to international markets. Our team works quickly and efficiently; we are opinionated but flexible when it's time to ship. We know what needs to be done, and how to do it. We are laser-focused on not just giving our customers what they want, but exceeding their expectations. We are very proud that when someone opens the app for the first time they ask: "How on earth does this work so fast?" If that sounds like a team you want to be part of, here is what we need from you:

    Minimum qualifications:
    • Able to work out of our Manhattan office a minimum of 4 days a week
    • 5+ years of experience in a financial or startup environment
    • 5+ years of experience working on live data as well as historical data
    • 3+ years of experience in Java, Python, and SQL
    • Experience managing multiple production ETL pipelines that reliably store and validate financial data
    • Experience launching, scaling, and improving backend services in cloud environments
    • Experience migrating critical data across different databases
    • Experience owning and improving critical data infrastructure
    • Experience teaching best practices to junior developers

    Preferred qualifications:
    • 5+ years of experience in a fintech startup
    • 5+ years of experience in Java, Kafka, Python, PostgreSQL
    • 5+ years of experience working with WebSockets (e.g., RxStomp or Socket.io)
    • 5+ years of experience wrangling cloud providers like AWS, Azure, GCP, or Linode
    • 2+ years of experience shipping and optimizing Rust applications
    • Demonstrated experience keeping critical systems online
    • Demonstrated creativity and resourcefulness under pressure
    • Experience with corporate debt/bonds and commodities data

    Salary range begins at $150,000 and increases with experience. Benefits: Health Insurance, Vision, Dental. To try the product, go to *************************
    $150k yearly 1d ago
  • Data Engineer

    DL Software Inc. 3.3company rating

    Requirements engineer job in New York, NY

    DL Software produces Godel, a financial information and trading terminal.

    Role Description
    This is a full-time, on-site role based in New York, NY, for a Data Engineer. The Data Engineer will design, build, and maintain scalable data systems and pipelines. Responsibilities include data modeling, developing and managing ETL workflows, optimizing data storage solutions, and supporting data warehousing initiatives. The role also involves collaborating with cross-functional teams to improve data accessibility and analytics capabilities.

    Qualifications:
    • Strong proficiency in Data Engineering and Data Modeling
    • Mandatory: strong experience in global financial instruments, including equities, fixed income, options, and exotic asset classes
    • Strong Python background
    • Expertise in Extract, Transform, Load (ETL) processes and tools
    • Experience in designing, managing, and optimizing Data Warehousing solutions
    $91k-123k yearly est. 2d ago
  • Data Analytics Engineer

    Dale Workforce Solutions

    Requirements engineer job in Somerset, NJ

    Client: manufacturing company
    Type: direct hire

    Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets. This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams. This role is on-site five days per week in Somerset, NJ.

    Key Responsibilities

    Power BI Reporting & Administration:
    • Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
    • Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
    • Develop and maintain data models to ensure accuracy, consistency, and reliability
    • Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
    • Optimize Power BI solutions for performance, scalability, and ease of use

    ETL & Data Warehousing:
    • Design and maintain data warehouse structures, including schema and database layouts
    • Develop and support ETL processes to ensure timely and accurate data ingestion
    • Integrate data from multiple systems while ensuring quality, consistency, and completeness
    • Work closely with database administrators to optimize data warehouse performance
    • Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed

    Training & Documentation:
    • Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
    • Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
    • Manage data definitions, lineage documentation, and data cataloging for all enterprise data models

    Project Management:
    • Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
    • Collaborate with key business stakeholders to ensure departmental reporting needs are met
    • Record meeting notes in Confluence and document project updates in Jira

    Data Governance:
    • Implement and enforce data governance policies to ensure data quality, compliance, and security
    • Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness

    Routine IT Functions:
    • Resolve Help Desk tickets related to reporting, dashboards, and BI tools
    • Support general software and hardware installations when needed

    Other Responsibilities:
    • Manage email and phone communication professionally and promptly
    • Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
    • Perform additional assigned duties as needed

    Qualifications Required:
    • Minimum of 3 years of relevant experience
    • Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
    • Experience with cloud-based BI environments (Azure, AWS, etc.)
    • Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
    • Proficiency in SQL for data extraction, manipulation, and transformation
    • Strong knowledge of DAX
    • Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
    • Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
    • Strong analytical, problem-solving, and documentation skills
    • Excellent written and verbal communication abilities
    • High attention to detail and strong self-review practices
    • Effective time management and organizational skills; ability to prioritize workload
    • Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
    $82k-112k yearly est. 1d ago
  • Azure Data Engineer

    Programmers.Io 3.8company rating

    Requirements engineer job in Weehawken, NJ

    Requirements:
    • Expert-level skills writing and optimizing complex SQL
    • Experience with complex data modeling, ETL design, and using large databases in a business environment
    • Experience with building data pipelines and applications to stream and process datasets at low latencies
    • Fluent with Big Data technologies like Spark, Kafka, and Hive
    • Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
    • Designing and building data pipelines using API ingestion and streaming ingestion methods
    • Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential
    • Experience in developing NoSQL solutions using Azure Cosmos DB is essential
    • Thorough understanding of Azure and AWS cloud infrastructure offerings
    • Working knowledge of Python is desirable

    Responsibilities:
    • Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
    • Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
    • Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
    • Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
    • Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
    • Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making
    • Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
    • Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging

    Best Regards,
    Dipendra Gupta
    Technical Recruiter
    *****************************
    $92k-132k yearly est. 1d ago
  • DevOps Engineer

    Confidential Company 4.2company rating

    Requirements engineer job in New York, NY

    About the Team
    The DevOps team is responsible for supporting the development teams and interfacing with the infrastructure teams. As a DevOps engineer, you'll have the exciting opportunity to work in a fast-paced, entrepreneurial environment.

    What You'll Do:
    • Drive the design, engineering, integration, and enhancement of DevOps enablement tools and applications by utilizing Site Reliability and DevOps principles suited for an on-prem environment
    • Follow software development processes and practices (Functional Specification and Testing, Design Specifications, Code Reviews, Unit Testing, Monitoring)
    • Document and maintain processes and procedures
    • Implement and support established Continuous Integration / Continuous Delivery (CI/CD) practices
    • Mentor and train the Technology team on tools that increase the use of automation and improve stability, advocating solutions
    • Evaluate new technologies and explore their applicability to address new requirements in our environment

    Skills and Experience:
    • Bachelor's degree in computer science, software engineering, or a related field
    • 3+ years of total IT experience
    • 3+ years of development experience in Python, C#, or Java
    • Experience building, deploying, and maintaining container images (e.g., Docker, Kubernetes)
    • Experience with one or more configuration management tools (e.g., Ansible, Terraform, Git, Bash)
    • Familiarity with DevOps practices and Site Reliability Engineering processes and tools (e.g., InfluxDB, Grafana, PagerDuty, REST, Prometheus)
    • Experience with system administration, such as provisioning and managing servers, deploying databases, security monitoring, system patching, and managing internal and external network connectivity

    What does it take to be successful in this role?
    • Excellent problem-solving skills, soft skills, and a quality and delivery mindset
    • Strong communicator and collaborator
    • Ability to thrive in a fast-paced, start-up environment with individuals in dispersed locations
    • Self-starter, results-driven individual with a proven track record
    • Comfortable with navigating ambiguity and translating it to impactful results

    What are some skills to make you stand out?
    • Experience with trading strategies for securities, options, crypto, and trading platforms
    • Experience with big data and distributed systems (e.g., Kafka, Cassandra)
    • Ability to demonstrate integrating different software using code (e.g., Python, shell, C#, Java)
    $97k-132k yearly est. 4d ago
  • AWS Data engineer with Databricks || USC Only || W2 Only

    Ipivot

    Requirements engineer job in Princeton, NJ

    AWS Data Engineer with Databricks
    Location: Princeton, NJ - Hybrid - need locals or nearby
    Duration: Long Term
    This position is available only to U.S. citizens.

    Key Responsibilities:
    • Design and implement ETL/ELT pipelines with Databricks, Apache Spark, AWS Glue, S3, Redshift, and EMR for processing large-scale structured and unstructured data.
    • Optimize data flows, monitor performance, and troubleshoot issues to maintain reliability and scalability.
    • Collaborate on data modeling, governance, security, and integration with tools like Airflow or Step Functions.
    • Document processes and mentor junior team members on best practices.

    Required Qualifications:
    • Bachelor's degree in Computer Science, Engineering, or a related field.
    • 5+ years of data engineering experience, with strong proficiency in Databricks, Spark, Python, SQL, and AWS services (S3, Glue, Redshift, Lambda).
    • Familiarity with big data tools like Kafka and Hadoop, and with data warehousing concepts.
    $82k-112k yearly est. 2d ago
  • Azure Data Engineer

    Sharp Decisions 4.6company rating

    Requirements engineer job in Jersey City, NJ

    Title: Senior Azure Data Engineer
    Client: Major Japanese Bank
    Experience Level: Senior (10+ Years)

    The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.

    Key Responsibilities:
    • Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
    • Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
    • Ensure data security, compliance, lineage, and governance controls.
    • Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
    • Troubleshoot performance issues and improve system efficiency.

    Required Skills:
    • 10+ years of data engineering experience.
    • Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
    • Azure certifications strongly preferred.
    • Strong SQL, Python, and cloud data architecture skills.
    • Experience in financial services or large enterprise environments preferred.
    $77k-101k yearly est. 1d ago
  • Market Data Engineer

    Harrington Starr

    Requirements engineer job in New York, NY

    🚀 Market Data Engineer - New York | Cutting-Edge Trading Environment

    I'm partnered with a leading technology-driven trading team in New York looking to bring on a Market Data Engineer to support global research, trading, and infrastructure groups. This role is central to managing the capture, normalization, and distribution of massive volumes of historical market data from exchanges worldwide.

    What You'll Do:
    • Own large-scale, time-sensitive market data capture and normalization pipelines
    • Improve internal data formats and downstream datasets used by research and quantitative teams
    • Partner closely with infrastructure to ensure reliability of packet-capture systems
    • Build robust validation, QA, and monitoring frameworks for new market data sources
    • Provide production support, troubleshoot issues, and drive quick, effective resolutions

    What You Bring:
    • Experience building or maintaining large-scale ETL pipelines
    • Strong proficiency in Python and Bash, with familiarity in C++
    • Solid understanding of networking fundamentals
    • Experience with workflow/orchestration tools (Airflow, Luigi, Dagster)
    • Exposure to distributed computing frameworks (Slurm, Celery, HTCondor, etc.)

    Bonus Skills:
    • Experience working with binary market data protocols (ITCH, MDP3, etc.)
    • Understanding of high-performance filesystems and columnar storage formats
    $90k-123k yearly est. 2d ago
  • Data Engineer - VC Backed Healthcare Firm - NYC or San Francisco

    Saragossa

    Requirements engineer job in New York, NY

    Are you a data engineer who loves building systems that power real impact in the world? A fast-growing healthcare technology organization is expanding its innovation team and is looking for a Data Engineer II to help build the next generation of its data platform. This team sits at the center of a major transformation effort, partnering closely with engineering, analytics, and product to design the foundation that supports advanced automation, AI, intelligent workflows, and high-scale data operations that drive measurable outcomes for hospitals, health systems, and medical groups.

    In this role, you will design, develop, and maintain software applications that process large volumes of data every day. You will collaborate with cross-functional teams to understand data requirements, build and optimize data models, and create systems that ensure accuracy, reliability, and performance. You will write code that extracts, transforms, and loads data from a variety of sources into modern data warehouses and data lakes, while implementing best-in-class data quality and governance practices. You will work hands-on with big data technologies such as Hadoop, Spark, and Kafka, and you will play a critical role in troubleshooting, performance tuning, and ensuring the scalability of complex data applications.

    To thrive here, you should bring strong problem-solving ability, analytical thinking, and excellent communication skills. This is an opportunity to join an expanding innovation group within a leading healthcare platform that is investing heavily in data, AI, and the future of intelligent revenue operations. If you want to build systems that make a real difference and work with teams that care deeply about improving patient experiences and provider performance, this is a chance to do highly meaningful engineering at scale.
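The extract-transform-load loop this description centers on can be sketched end to end in a few lines. This toy pass uses hypothetical claims records and SQLite in place of the Spark/Kafka-scale stack the role actually involves:

```python
import csv
import io
import sqlite3

# Hypothetical source feed; real pipelines would read from files, APIs,
# or streaming topics rather than an inline string.
RAW = """claim_id,amount,status
C-1001,1250.00,paid
C-1002,80.50,denied
C-1003,430.25,paid
"""

def run_etl(conn: sqlite3.Connection, raw_csv: str) -> int:
    # Extract: read rows from the source feed.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: normalize types and keep only paid claims.
    cleaned = [(r["claim_id"], float(r["amount"]))
               for r in rows if r["status"] == "paid"]
    # Load: write into the warehouse table.
    conn.execute("CREATE TABLE IF NOT EXISTS paid_claims "
                 "(claim_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO paid_claims VALUES (?, ?)", cleaned)
    return len(cleaned)

conn = sqlite3.connect(":memory:")
loaded = run_etl(conn, RAW)
total = conn.execute("SELECT SUM(amount) FROM paid_claims").fetchone()[0]
```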
    $90k-123k yearly est. 1d ago
  • Data Engineer

    Gotham Technology Group

    Requirements engineer job in New York, NY

    Our client is seeking a Data Engineer with hands-on experience in Web Scraping technologies to help build and scale a new scraping capability within their Data Engineering team. This role will work directly with Technology, Operations, and Compliance to source, structure, and deliver alternative data from websites, APIs, files, and internal systems. This is a unique opportunity to shape a new service offering and grow into a senior engineering role as the platform evolves.

    Responsibilities
    • Develop scalable Web Scraping solutions using AI-assisted tools, Python frameworks, and modern scraping libraries.
    • Manage the full lifecycle of scraping requests, including intake, feasibility assessment, site access evaluation, extraction approach, data storage, validation, entitlement, and ongoing monitoring.
    • Coordinate with Compliance to review Terms of Use, secure approvals, and ensure all scrapes adhere to regulatory and internal policy guidelines.
    • Build and support AWS-based data pipelines using tools such as Cron, Glue, EventBridge, Lambda, Python ETL, and Redshift.
    • Normalize and standardize raw, vendor, and internal datasets for consistent consumption across the firm.
    • Implement data quality checks and monitoring to ensure the reliability, historical continuity, and operational stability of scraped datasets.
    • Provide operational support, troubleshoot issues, respond to inquiries about scrape behavior or data anomalies, and maintain strong communication with users.
    • Promote data engineering best practices, including automation, documentation, repeatable workflows, and scalable design patterns.

    Required Qualifications
    • Bachelor's degree in Computer Science, Engineering, Mathematics, or related field.
    • 2-5 years of experience in a similar Data Engineering or Web Scraping role.
    • Capital markets knowledge with familiarity across asset classes and experience supporting trading systems.
    • Strong hands-on experience with AWS services (S3, Lambda, EventBridge, Cron, Glue, Redshift).
    • Proficiency with modern Web Scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright).
    • Strong Python programming skills and experience with SQL and NoSQL databases.
    • Familiarity with market data and time series datasets (Bloomberg, Refinitiv) is a plus.
    • Experience with DevOps/IaC tooling such as Terraform or CloudFormation is desirable.
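The extraction core of scrapes like the ones this role describes can be sketched with nothing but the standard library; a production version would use the listed frameworks (Scrapy, Playwright) plus retries, pagination, and the compliance checks noted above. The page markup and `price` class here are hypothetical:

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collects the text of every <a class="price"> element on a page."""

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples.
        if tag == "a" and ("class", "price") in attrs:
            self._in_price = True

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_price = False

# Hypothetical listings-page fragment; a real scrape would fetch this
# over HTTP subject to Terms-of-Use review.
PAGE = ('<ul><li><a class="price">$12.99</a></li>'
        '<li><a class="price">$8.50</a></li></ul>')
parser = PriceParser()
parser.feed(PAGE)
```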
    $86k-120k yearly est. 2d ago
  • Senior Data Engineer (Snowflake)

    Epic Placements

    Requirements engineer job in Parsippany-Troy Hills, NJ

    Senior Data Engineer (Snowflake & Python)
    1-Year Contract | $60/hour + Benefit Options
    Hybrid: On-site a few days per month (local candidates only)

    Work Authorization Requirement
    You must be authorized to work for any employer as a W-2 employee. This position is W-2 only - no C2C, no third-party submissions, and no sponsorship will be considered.

    Overview
    We are seeking a Senior Data Engineer to support enterprise-scale data initiatives for a highly collaborative engineering organization. This is a new, long-term contract opportunity for a hands-on data professional who thrives in fast-paced environments and enjoys building high-quality, scalable data solutions on Snowflake. Candidates must be based in or around New Jersey, able to work on-site at least 3 days per month, and meet the W-2 employment requirement.

    What You'll Do
    • Design, develop, and support enterprise-level data solutions with a strong focus on Snowflake
    • Participate across the full software development lifecycle - planning, requirements, development, testing, and QA
    • Partner closely with engineering and data teams to identify and implement optimal technical solutions
    • Build and maintain high-performance, scalable data pipelines and data warehouse architectures
    • Ensure platform performance, reliability, and uptime, maintaining strong coding and design standards
    • Troubleshoot production issues, identify root causes, implement fixes, and document preventive solutions
    • Manage deliverables and priorities effectively in a fast-moving environment
    • Contribute to data governance practices including metadata management and data lineage
    • Support analytics and reporting use cases leveraging advanced SQL and analytical functions

    Required Skills & Experience
    • 8+ years of experience designing and developing data solutions in an enterprise environment
    • 5+ years of hands-on Snowflake experience
    • Strong hands-on development skills with SQL and Python
    • Proven experience designing and developing data warehouses in Snowflake
    • Ability to diagnose, optimize, and tune SQL queries
    • Experience with Azure data frameworks (e.g., Azure Data Factory)
    • Strong experience with orchestration tools such as Airflow, Informatica, Automic, or similar
    • Solid understanding of metadata management and data lineage
    • Hands-on experience with SQL analytical functions
    • Working knowledge of shell scripting and Java scripting
    • Experience using Git, Confluence, and Jira
    • Strong problem-solving and troubleshooting skills
    • Collaborative mindset with excellent communication skills

    Nice to Have
    • Experience supporting Pharma industry data
    • Exposure to Omni-channel data environments

    Why This Opportunity
    • $60/hour W-2 on a long-term 1-year contract
    • Benefit options available
    • Hybrid structure with limited on-site requirement
    • High-impact role supporting enterprise data initiatives
    • Clear expectations: W-2 only, no third-party submissions, no Corp-to-Corp

    This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
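The "SQL analytical functions" requirement in listings like this refers to window functions such as ROW_NUMBER() OVER (...). The query shape is portable across warehouses; this sketch runs it in SQLite (3.25+, which added window functions) rather than Snowflake, so the table and data are illustrative only:

```python
import sqlite3

# Rank reps within each region by amount, then keep the top seller per
# region. Snowflake would express this the same way (and also offers
# QUALIFY as a shorthand for filtering on the window result).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 'ana', 500), ('east', 'bo', 700),
        ('west', 'cy', 300), ('west', 'di', 900);
""")
rows = conn.execute("""
    SELECT region, rep, amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
""").fetchall()
top_per_region = [(region, rep) for region, rep, amount, rnk in rows
                  if rnk == 1]
```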
    $60 hourly 22h ago
  • Senior Data Engineer

    Apexon

    Requirements engineer job in New Providence, NJ

    Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies - in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences - to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents.

    Job Description
    Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems.
    • Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
    • Work in tandem with our engineering team to identify and implement the most optimal solutions
    • Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
    • Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
    • Able to manage deliverables in fast-paced environments

    Areas of Expertise
    • At least 10 years of experience designing and developing data solutions in an enterprise environment
    • At least 5 years' experience on the Snowflake platform
    • Strong hands-on SQL and Python development
    • Experience with designing and developing data warehouses in Snowflake
    • A minimum of three years' experience in developing production-ready data ingestion and processing pipelines using Spark and Scala
    • Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic
    • Good understanding of metadata and data lineage
    • Hands-on knowledge of SQL analytical functions
    • Strong knowledge and hands-on experience in shell scripting and Java scripting
    • Demonstrated experience with software engineering practices including CI/CD, automated testing, and performance engineering
    • Good understanding of and exposure to Git, Confluence, and Jira
    • Good problem-solving and troubleshooting skills
    • Team player with a collaborative approach and excellent communication skills

    Our Commitment to Diversity & Inclusion:
    Did you know that Apexon has been Certified™ by Great Place To Work, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK. Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We are taking affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com)
    $82k-112k yearly est. 4d ago
  • Data Engineer

    Neenopal Inc.

    Requirements engineer job in Newark, NJ

    NeenOpal is a global consulting firm specializing in Data Science and Business Intelligence, with offices in Bengaluru, Newark, and Fredericton. We provide end-to-end solutions tailored to the unique needs of businesses, from startups to large organizations, across domains like digital strategy, sales and marketing, supply chain, and finance. Our mission is to help organizations achieve operational excellence and transform into data-driven enterprises.

    Role Description
    This is a full-time, hybrid Data Engineer role located in Newark, NJ. The Data Engineer will be responsible for designing, implementing, and managing data engineering solutions to support business needs. Day-to-day tasks include building and optimizing data pipelines, developing and maintaining data models and ETL processes, managing data warehousing solutions, and contributing to the organization's data analytics initiatives. Collaboration with cross-functional teams to ensure robust data infrastructure will be a key aspect of this role.

    Key Responsibilities
    • Data Pipeline Development: Design, implement, and manage robust data pipelines to ensure efficient data flow into data warehouses. Automate ETL processes using Python and advanced data engineering tools.
    • Data Integration: Integrate and transform data using industry-standard tools. Experience required with AWS services (AWS Glue, Data Pipeline, Redshift, and S3) and Azure services (Azure Data Factory, Synapse Analytics, and Blob Storage).
    • Data Warehousing: Implement and optimize solutions using Snowflake and Amazon Redshift.
    • Database Management: Develop and manage relational databases (SQL Server, MySQL, PostgreSQL) to ensure data integrity.
    • Performance Optimization: Continuously monitor and improve data processing workflows and apply best practices for query optimization.
    • Global Collaboration: Work closely with cross-functional teams in the US, India, and Canada to deliver high-quality solutions.
    • Governance & Support: Document ETL processes and data mappings in line with governance standards. Diagnose and resolve data-related issues promptly.

    Required Skills and Experience
    • Experience: Minimum of 2 years of experience designing and developing ETL processes (AWS Glue, Azure Data Factory, or similar).
    • Integration: Experience integrating data via RESTful / GraphQL APIs.
    • Programming: Proficient in Python for ETL automation and SQL for database management.
    • Cloud Platforms: Strong experience with AWS or Azure data services (GCP familiarity is a plus).
    • Data Warehousing: Expertise with Snowflake, Amazon Redshift, or Azure Synapse Analytics.
    • Communication: Excellent articulation skills to explain technical work directly to clients and stakeholders.
    • Authorization: Must have valid work authorization in the United States.

    Salary Range: $65,000-$80,000 per year

    Benefits: This role includes health insurance, paid time off, and opportunities for professional growth and continuous learning within a fast-growing global analytics company.

    Equal Opportunity Employer
    NeenOpal Inc. is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
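The RESTful-API integration work this listing asks for typically follows a paginated-fetch pattern. A sketch with an injected fetcher so it runs without a network; the endpoint shape and the `records`/`next_page` fields are hypothetical:

```python
from typing import Callable, Dict, Iterator

def iter_records(fetch_page: Callable[[int], dict]) -> Iterator[dict]:
    """Walk a paginated API, yielding every record across all pages.

    `fetch_page(n)` stands in for an HTTP GET of page n; in production
    it would wrap urllib/requests calls against the real endpoint.
    """
    page = 1
    while True:
        body = fetch_page(page)          # e.g. GET /records?page=N
        yield from body["records"]
        if not body.get("next_page"):    # hypothetical pagination flag
            break
        page += 1

# Fake two-page API response for demonstration.
PAGES: Dict[int, dict] = {
    1: {"records": [{"id": 1}, {"id": 2}], "next_page": True},
    2: {"records": [{"id": 3}], "next_page": False},
}
records = list(iter_records(lambda p: PAGES[p]))
```

Injecting the fetcher keeps the pagination logic unit-testable and lets the same loop serve multiple SaaS sources.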
    $65k-80k yearly 3d ago
  • Data Engineer

    Haptiq

    Requirements engineer job in New York, NY

    Haptiq is a leader in AI-powered enterprise operations, delivering digital solutions and consulting services that drive value and transform businesses. We specialize in using advanced technology to streamline operations, improve efficiency, and unlock new revenue opportunities, particularly within the private capital markets. Our integrated ecosystem includes PaaS (Platform as a Service), the Core Platform, an AI-native enterprise operations foundation built to optimize workflows, surface insights, and accelerate value creation across portfolios; SaaS (Software as a Service), a cloud platform delivering unmatched performance, intelligence, and execution at scale; and S&C (Solutions and Consulting Suite), modular technology playbooks designed to manage, grow, and optimize company performance. With over a decade of experience supporting high-growth companies and private equity-backed platforms, Haptiq brings deep domain expertise and a proven ability to turn technology into a strategic advantage.

    The Opportunity
    As a Data Engineer within the Global Operations team, you will be responsible for managing the internal data infrastructure, building and maintaining data pipelines, and ensuring the integrity, cleanliness, and usability of data across our critical business systems. This role will play a foundational part in developing a scalable internal data capability to drive decision-making across Haptiq's operations.

    Responsibilities and Duties
    • Design, build, and maintain scalable ETL/ELT pipelines to consolidate data from delivery, finance, and HR systems (e.g., Kantata, Salesforce, JIRA, HRIS platforms).
    • Ensure consistent data hygiene, normalization, and enrichment across source systems.
    • Develop and maintain data models and data warehouses optimized for analytics and operational reporting.
    • Partner with business stakeholders to understand reporting needs and ensure the data structure supports actionable insights.
    • Own the documentation of data schemas, definitions, lineage, and data quality controls.
    • Collaborate with the Analytics, Finance, and Ops teams to build centralized reporting datasets.
    • Monitor pipeline performance and proactively resolve data discrepancies or failures.
    • Contribute to architectural decisions related to internal data infrastructure and tools.

    Requirements
    • 3-5 years of experience as a data engineer, analytics engineer, or similar role.
    • Strong experience with SQL, data modeling, and pipeline orchestration (e.g., Airflow, dbt).
    • Hands-on experience with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift).
    • Experience working with REST APIs and integrating with SaaS platforms like Salesforce, JIRA, or Workday.
    • Proficiency in Python or another scripting language for data manipulation.
    • Familiarity with modern data stack tools (e.g., Fivetran, Stitch, Segment).
    • Strong understanding of data governance, documentation, and schema management.
    • Excellent communication skills and ability to work cross-functionally.

    Benefits
    • Flexible work arrangements (including hybrid mode)
    • Generous paid time off (PTO) policy
    • Comprehensive benefits package (Medical / Dental / Vision / Disability / Life)
    • Healthcare and Dependent Care Flexible Spending Accounts (FSAs)
    • 401(k) retirement plan
    • Access to HSA-compatible plans
    • Pre-tax commuter benefits
    • Employee Assistance Program (EAP)
    • Opportunities for professional growth and development
    • A supportive, dynamic, and inclusive work environment

    Why Join Us?
    We value creative problem solvers who learn fast, work well in an open and diverse environment, and enjoy pushing the bar for success ever higher. We do work hard, but we also choose to have fun while doing it. The compensation range for this role is $75,000 to $80,000 USD.
    $75k-80k yearly 3d ago

Learn more about requirements engineer jobs

How much does a requirements engineer earn in Plainfield, NJ?

The average requirements engineer in Plainfield, NJ earns between $61,000 and $108,000 annually. This compares to the national average requirements engineer range of $62,000 to $120,000.

Average requirements engineer salary in Plainfield, NJ

$81,000

What are the biggest employers of Requirements Engineers in Plainfield, NJ?

The biggest employers of Requirements Engineers in Plainfield, NJ are:
  1. Tata Group
  2. Fiserv
  3. South Jersey Industries
  4. Jacobs Enterprises
  5. Zone It Solutions
  6. Orion Innovation
  7. Motion Recruitment
  8. Insight Global
  9. Scadea Solutions
  10. Transcat