
Requirements engineer jobs in New Jersey

- 990 jobs
  • BI Engineer (Tableau & Power BI - platforms/server)

    Harvey Nash

    Requirements engineer job in Newark, NJ

    Job Title: BI Engineer (Tableau & Power BI - platforms/server)
    Duration: 12 months, long-term project
    US citizens, Green Card holders, and those authorized to work in the US are encouraged to apply. We are unable to sponsor H1B candidates at this time.
    Summary of the job:
    - Extremely technical, hands-on skills in Power BI, Python, and some Tableau
    - Financial, asset management, or banking background; Fixed Income specifically is a big plus
    - Azure Cloud
    Job Description:
    Our Role: We are looking for an astute, determined professional like you to fulfil a BI Engineering role within our Technology Solutions Group. You will showcase your success in a fast-paced environment through collaboration, ownership, and innovation. Your expertise in emerging trends and practices will evoke stimulating discussions around optimization and change to help keep our competitive edge. This rewarding opportunity will enable you to make a big impact in our organization, so if this sounds exciting, this might be the place for you.
    Your Impact:
    - Build and maintain new and existing applications in preparation for a large-scale architectural migration within an Agile function.
    - Align with the Product Owner and Scrum Master in assessing business needs and transforming them into scalable applications.
    - Build and maintain code to manage data received in heterogeneous formats (binary, ASCII) from web-based sources, internal/external databases, and flat files.
    - Help build a new enterprise data warehouse and maintain the existing one.
    - Design and support effective storage and retrieval of very large internal and external data sets, and think forward about the convergence strategy with our AWS cloud migration.
    - Assess the impact of scaling up and scaling out, and ensure sustained data management and data delivery performance.
    - Build interfaces to support evolving and new applications and to accommodate new data sources and types of data.
    Your Required Skills:
    - 5+ years of hands-on experience in BI platform administration, such as Power BI and Tableau
    - 3+ years of hands-on experience in Power BI/Tableau report development
    - Experience with both server- and desktop-based data visualization tools
    - Expertise with multiple database platforms, including relational databases (e.g., SQL Server) as well as cloud-based data warehouses such as Azure
    - Fluent with SQL for data analysis
    - Working experience in a Windows-based environment
    - Knowledge of data warehousing, ETL procedures, and BI technologies
    - Excellent analytical and problem-solving skills, with the ability to think quickly and offer alternatives both independently and within teams
    - Exposure to working in an Agile environment with a Scrum Master/Product Owner, and the ability to deliver
    - Ability to communicate status and challenges with the team
    - Demonstrated ability to learn new skills and work as a team
    - Strong interpersonal skills
    A reasonable, good-faith estimate of the minimum and maximum pay rate for this position is $70/hr. to $80/hr.
    $70-80 hourly 1d ago
  • Neo4j Engineer

    Tata Consultancy Services (4.3 company rating)

    Requirements engineer job in Summit, NJ

    Must-Have Technical/Functional Skills: Neo4j, Graph Data Science, Cypher, Python, graph algorithms, Bloom, GraphXR, cloud, Kubernetes, ETL
    Roles & Responsibilities:
    - Design and implement graph-based data models using Neo4j.
    - Develop Cypher queries and procedures for efficient graph traversal and analysis.
    - Apply Graph Data Science algorithms for community detection, centrality, and similarity.
    - Integrate Neo4j with enterprise data platforms and APIs.
    - Collaborate with data scientists and engineers to build graph-powered applications.
    - Optimize performance and scalability of graph queries and pipelines.
    - Support deployment and monitoring of Neo4j clusters in cloud or on-prem environments.
    Salary Range: $110,000 - $140,000/year
    TCS Employee Benefits Summary: Discretionary annual incentive. Comprehensive medical coverage: medical & health, dental & vision, disability planning & insurance, pet insurance plans. Family support: maternal & parental leaves. Insurance options: auto & home insurance, identity theft protection. Convenience & professional growth: commuter benefits & certification & training reimbursement. Time off: vacation, time off, sick leave & holidays. Legal & financial assistance: legal assistance, 401(k) plan, performance bonus, college fund, student loan refinancing.
    $110k-140k yearly 5d ago
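The Graph Data Science work in the listing above (community detection, centrality, similarity) runs natively inside Neo4j over a projected graph; as a dependency-free illustration of what a centrality algorithm computes, here is degree centrality in plain Python. The graph and node names are made up for the example.

```python
def degree_centrality(edges):
    """Return normalized degree centrality for an undirected edge list."""
    nodes = {n for edge in edges for n in edge}
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    # Normalize by the maximum possible degree, n - 1.
    denom = max(len(nodes) - 1, 1)
    return {n: degree[n] / denom for n in nodes}

edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C")]
scores = degree_centrality(edges)
print(scores["A"])  # A touches all 3 other nodes -> 1.0
```

In Neo4j itself the equivalent call would go through the GDS library over a named graph projection rather than an in-memory edge list, but the normalization and interpretation of the score are the same.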
  • Gen AI/ML Engineer

    Capgemini (4.5 company rating)

    Requirements engineer job in Jersey City, NJ

    Gen AI/ML Engineer with Data Engineering Exposure
    Experience: 8+ years preferred
    Employee Type: Full time with benefits
    Job Description: We are seeking a highly skilled and experienced AI/ML Engineer with a strong background in Machine Learning (ML), Large Language Models (LLMs), Generative AI (GenAI), and Data Engineering. The ideal candidate will have successfully delivered 3-4 end-to-end AI/ML projects, demonstrating expertise in building scalable ML systems and deploying them in production environments. A solid foundation in Python, SQL, PySpark, and NLP technologies is essential. Experience with cloud platforms such as AWS, Azure, or GCP is highly desirable.
    Key Responsibilities:
    - Design, develop, and deploy scalable ML/AI solutions, including robust MLOps pipelines for CI/CD, model monitoring, and governance.
    - Lead the development of LLM and GenAI applications, including text summarization, conversational AI, and entity recognition.
    - Build and optimize data pipelines using PySpark and SQL for large-scale data processing and feature engineering.
    - Architect and implement production-grade ML systems with a focus on performance, scalability, and reliability.
    - Collaborate with cross-functional teams to align AI initiatives with business goals and drive innovation.
    - Mentor junior engineers and contribute to team-wide knowledge sharing and best practices.
    Required Skills & Qualifications:
    - Bachelor's degree in Computer Science, Data Science, or a related field.
    - 7+ years of hands-on experience in ML/AI solution development and deployment.
    - Proven track record of working on at least 3-4 AI/ML projects from concept to production.
    - Programming languages: Python (Pandas, NumPy, PyTorch, TensorFlow), SQL.
    - MLOps tools: MLflow, Kubeflow, Docker, Kubernetes, CI/CD pipelines.
    - GenAI & NLP: expertise in transformer models (e.g., GPT, BERT), Hugging Face, LangChain.
    - Data engineering: strong experience with PySpark and distributed data processing.
    - Cloud platforms: proficiency in AWS, Azure, or GCP.
    - Strong problem-solving skills and ability to thrive in a fast-paced, collaborative environment.
    Life at Capgemini: Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer: flexible work; healthcare including dental, vision, mental health, and well-being programs; financial well-being programs such as 401(k) and Employee Share Ownership Plan; paid time off and paid holidays; paid parental leave; family-building benefits like adoption assistance, surrogacy, and cryopreservation; social well-being benefits like subsidized back-up child/elder care and tutoring; mentoring, coaching, and learning programs; Employee Resource Groups; disaster relief.
    Disclaimer: Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status, or any other characteristic protected by law. This is a general description of the duties, responsibilities, and qualifications required for this position. Physical, mental, sensory, or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please get in touch with your recruiting contact.
    Click the following link for more information on your rights as an Applicant: **************************************************************************
    Salary Transparency: Capgemini discloses salary range information in compliance with state and local pay transparency obligations. The disclosed range represents the lowest to highest salary we, in good faith, believe we would pay for this role at the time of this posting, although we may ultimately pay more or less than the disclosed range, and the range may be modified in the future. The disclosed range takes into account the wide range of factors that are considered in making compensation decisions, including, but not limited to, geographic location, relevant education, qualifications, certifications, experience, skills, seniority, performance, sales- or revenue-based metrics, and business or organizational needs. At Capgemini, it is not typical for an individual to be hired at or near the top of the range for their role. The base salary range for the tagged location is $103,330 - $128,656/year. This role may be eligible for other compensation, including variable compensation, bonus, or commission. Full-time regular employees are eligible for paid time off, medical/dental/vision insurance, 401(k), and any other benefits to eligible employees. Note: No amount of pay is considered to be wages or compensation until such amount is earned, vested, and determinable. The amount and availability of any bonus, commission, or any other form of compensation that is allocable to a particular employee remains in the Company's sole discretion unless and until paid, and may be modified at the Company's sole discretion, consistent with the law.
    $103.3k-128.7k yearly 3d ago
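The text-summarization work in the posting above would in practice use transformer models through Hugging Face or LangChain. As a dependency-free stand-in for the task shape only, this sketch does naive extractive summarization by word-frequency scoring; the sample text and function name are illustrative, not from the posting.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Pick the n highest-scoring sentences by total word frequency."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        # A sentence scores higher when its words are frequent overall.
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    ranked = sorted(sentences, key=score, reverse=True)
    return " ".join(ranked[:n_sentences])

doc = ("Data pipelines move data into the data lake. "
       "Spark processes data quickly. Cats sleep all day.")
print(extractive_summary(doc))  # the data-heavy first sentence wins
```

A transformer-based summarizer is abstractive (it generates new text) rather than extractive, but the evaluation question is the same: does the output retain the high-signal content of the input?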
  • Sr Data Engineer Python Serverside

    Canyon Associates (4.2 company rating)

    Requirements engineer job in White House Station, NJ

    This is a direct-hire, full-time position with a hybrid format: on-site two days a week.
    YOU MUST BE A US CITIZEN OR GREEN CARD HOLDER; NO OTHER STATUS TO WORK IN THE US WILL BE PERMITTED.
    YOU MUST LIVE LOCAL TO THE AREA AND BE ABLE TO DRIVE ON-SITE A MINIMUM OF TWO DAYS A WEEK.
    THE TECH STACK WILL BE:
    - 7 years demonstrated server-side development proficiency
    - 5 years demonstrated server-side development proficiency
    - Programming languages: Python (NumPy, Pandas, Oracle PL/SQL). Other non-interpreted languages like Java, C++, Rust, etc. are a plus. Must be proficient at the intermediate-advanced level of the language (concurrency, memory management, etc.)
    - Design patterns: typical GoF patterns (Factory, Facade, Singleton, etc.)
    - Data structures: maps, lists, arrays, etc.
    - SCM: solid Git proficiency, MS Azure DevOps (CI/CD)
    $97k-129k yearly est. 4d ago
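The GoF patterns the listing above names can be shown briefly; here is a minimal Factory sketch in Python. The feed classes and registry are hypothetical, purely to illustrate the pattern: callers request a kind by name and never touch concrete classes.

```python
from abc import ABC, abstractmethod

class Feed(ABC):
    """Abstract product: a parser for one wire format."""
    @abstractmethod
    def parse(self, raw: str) -> list[str]: ...

class CsvFeed(Feed):
    def parse(self, raw: str) -> list[str]:
        return raw.split(",")

class PipeFeed(Feed):
    def parse(self, raw: str) -> list[str]:
        return raw.split("|")

def feed_factory(kind: str) -> Feed:
    """Factory: map a kind name to a concrete Feed instance."""
    registry = {"csv": CsvFeed, "pipe": PipeFeed}
    try:
        return registry[kind]()
    except KeyError:
        raise ValueError(f"unknown feed kind: {kind}") from None

print(feed_factory("csv").parse("a,b,c"))  # ['a', 'b', 'c']
```

The registry dict keeps the factory open for extension: adding a new format means registering one class, with no changes at the call sites.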
  • Data Analytics Engineer

    Dale Workforce Solutions

    Requirements engineer job in Somerset, NJ

    Client: manufacturing company
    Type: direct hire
    Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets. This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams. This role is on-site five days per week in Somerset, NJ.
    Key Responsibilities:
    Power BI Reporting & Administration
    - Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
    - Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
    - Develop and maintain data models to ensure accuracy, consistency, and reliability
    - Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
    - Optimize Power BI solutions for performance, scalability, and ease of use
    ETL & Data Warehousing
    - Design and maintain data warehouse structures, including schema and database layouts
    - Develop and support ETL processes to ensure timely and accurate data ingestion
    - Integrate data from multiple systems while ensuring quality, consistency, and completeness
    - Work closely with database administrators to optimize data warehouse performance
    - Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed
    Training & Documentation
    - Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
    - Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
    - Manage data definitions, lineage documentation, and data cataloging for all enterprise data models
    Project Management
    - Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
    - Collaborate with key business stakeholders to ensure departmental reporting needs are met
    - Record meeting notes in Confluence and document project updates in Jira
    Data Governance
    - Implement and enforce data governance policies to ensure data quality, compliance, and security
    - Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness
    Routine IT Functions
    - Resolve Help Desk tickets related to reporting, dashboards, and BI tools
    - Support general software and hardware installations when needed
    Other Responsibilities
    - Manage email and phone communication professionally and promptly
    - Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
    - Perform additional assigned duties as needed
    Qualifications (Required):
    - Minimum of 3 years of relevant experience
    - Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
    - Experience with cloud-based BI environments (Azure, AWS, etc.)
    - Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
    - Proficiency in SQL for data extraction, manipulation, and transformation
    - Strong knowledge of DAX
    - Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
    - Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
    - Strong analytical, problem-solving, and documentation skills
    - Excellent written and verbal communication abilities
    - High attention to detail and strong self-review practices
    - Effective time management and organizational skills; ability to prioritize workload
    - Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
    $82k-112k yearly est. 1d ago
  • Azure Data Engineer

    Programmers.Io (3.8 company rating)

    Requirements engineer job in Weehawken, NJ

    - Expert-level skills writing and optimizing complex SQL
    - Experience with complex data modelling, ETL design, and using large databases in a business environment
    - Experience with building data pipelines and applications to stream and process datasets at low latencies
    - Fluent with Big Data technologies like Spark, Kafka, and Hive
    - Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
    - Designing and building data pipelines using API ingestion and streaming ingestion methods
    - Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential
    - Experience in developing NoSQL solutions using Azure Cosmos DB is essential
    - Thorough understanding of Azure and AWS cloud infrastructure offerings
    - Working knowledge of Python is desirable
    - Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
    - Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
    - Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
    - Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
    - Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
    - Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making
    - Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
    - Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
    Best regards, Dipendra Gupta, Technical Recruiter *****************************
    $92k-132k yearly est. 4d ago
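The streaming ingestion the listing above describes is typically processed in micro-batches (for example, a Databricks structured-streaming trigger collects events and processes them in chunks). This generator-based sketch mimics that pattern in plain Python; the event source and batch size are illustrative.

```python
from itertools import islice

def micro_batches(events, batch_size):
    """Yield lists of up to batch_size events, like a streaming trigger."""
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return  # source exhausted
        yield batch

stream = range(7)  # stand-in for an unbounded event feed
batches = list(micro_batches(stream, 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

The key property mirrored here is that the consumer never needs the whole stream in memory: each batch is materialized, processed, and released before the next is pulled.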
  • Guidewire DevOps Engineer (with experience in Environment Strategy Build)

    Ltimindtree

    Requirements engineer job in Princeton, NJ

    About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
    Job Title: Guidewire DevOps Engineer (with experience in Environment Strategy Build)
    Work Location: Princeton, NJ (Hybrid - Onsite)
    Required Experience: 7+ Years
    Role Overview: HSB is seeking a DevOps Engineer to lead the design, build, and ongoing management of Guidewire environments for the Cyber Admitted PolicyCenter program. This role will define a repeatable and scalable environment strategy, ensuring that environments across development, SIT, UAT, and production are built and maintained with consistency, automation, and traceability. The ideal candidate will have strong experience in Guidewire infrastructure setup, Azure DevOps pipelines, and infrastructure automation, and will be comfortable operating in a complex, multi-stream program environment involving PolicyCenter, Digital Portal, SmartCOMM, and Integration components.
    Key Responsibilities:
    Environment Strategy & Planning
    - Define an end-to-end environment strategy and roadmap, including topology, refresh cadence, and configuration management across the program
    - Establish environment standards and documentation to ensure consistency across all Guidewire instances
    - Recommend and implement an iterative build approach that allows environments to evolve progressively as functionality matures
    Environment Build & Configuration
    - Lead the provisioning, configuration, and validation of Guidewire environments (Dev, SIT, UAT, PreProd, Prod)
    - Develop and maintain automation pipelines or scripts (e.g., Azure DevOps, Terraform, PowerShell) to standardize environment setup and deployment processes
    - Collaborate with infrastructure and database teams on environment readiness, configuration, and SQL Server remediation or upgrades
    - Maintain alignment between PolicyCenter, Digital, and SmartCOMM environments for integrated end-to-end testing
    Environment Maintenance & Operations
    - Manage environment lifecycle activities, including data refreshes, DB drops, configuration synchronization, and release preparation
    - Monitor and maintain environment health, ensuring stability and readiness for testing and release activities
    - Create and maintain environment status dashboards or tracking mechanisms within Azure DevOps for transparency and reporting
    Access & Security Enablement
    - Partner with internal IT and security teams to implement secure access models and environment-level permissions
    - Manage service accounts, secrets, and credentials used across environments in alignment with enterprise security standards
    Cross-Stream Coordination
    - Coordinate with PolicyCenter, Digital Portal, SmartCOMM, and Integration workstreams to ensure consistent environment dependencies and deployment sequencing
    - Support integration testing by ensuring endpoint configurations and API connectivity are aligned across systems
    - Participate in environment planning sessions, readiness reviews, and go-live preparations
    Skills & Experience:
    - 5 years of DevOps or CloudOps experience in enterprise software or insurance platforms
    - 3 years of experience supporting Guidewire applications (PolicyCenter, BillingCenter, or ClaimCenter), preferably on cloud or hybrid infrastructure
    - Strong working knowledge of Azure DevOps (ADO) pipelines and automation for environment deployment
    - Proficiency with infrastructure-as-code tools (e.g., Terraform, ARM templates, PowerShell)
    - Experience managing SQL Server environments, data refreshes, and system configuration
    - Familiarity with environment topology design, CI/CD pipelines, and multi-tier application environments
    - Excellent collaboration and documentation skills across technical and non-technical teams
    Success Metrics:
    - Standardized environment build and refresh process implemented across all Guidewire environments
    - Lead time for new environment setup reduced through automation and reusability
    - All environments traceable, version controlled, and integrated with Azure DevOps
    - Clear visibility into environment readiness for SIT, UAT, and production through dashboards or reports
    Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree ("LTIM"):
    Benefits and Perks: Comprehensive medical plan covering medical, dental, vision; short-term and long-term disability coverage; 401(k) plan with company match; life insurance; vacation time, sick leave, paid holidays; paid paternity and maternity leave.
    The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors, including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation, like an annual performance-based bonus, sales incentive pay, and other forms of bonus or variable compensation.
    Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting. LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
    Safe return to office: In order to comply with LTIMindtree's company COVID-19 vaccine mandate, candidates must be able to provide proof of full vaccination against COVID-19 before or by the date of hire. Alternatively, one may submit a request for reasonable accommodation from LTIMindtree's COVID-19 vaccination mandate for approval, in accordance with applicable state and federal law, by the date of hire. Any request is subject to review through LTIMindtree's applicable processes.
    $88k-115k yearly est. 3d ago
  • Data Engineer

    Ztek Consulting (4.3 company rating)

    Requirements engineer job in Hamilton, NJ

    Key Responsibilities:
    - Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
    - Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
    - Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
    - Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
    - Manage FTP/SFTP file transfers between internal systems and external vendors.
    - Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
    - Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
    Required Skills & Experience:
    - 10+ years of experience in data engineering or production support within financial services or trading environments.
    - Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
    - Strong Python and SQL programming skills.
    - Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
    - Experience with Git, CI/CD pipelines, and Azure DevOps.
    - Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
    - Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
    - Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
    - Excellent communication, problem-solving, and stakeholder management skills.
    $89k-125k yearly est. 2d ago
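Batch-support work like the role above usually wraps flaky steps (vendor SFTP pulls, feed loads) in retry-with-backoff logic. A minimal sketch, with the failing step simulated rather than a real SFTP call; all names here are illustrative.

```python
import time

def run_with_retry(step, attempts=3, base_delay=0.01):
    """Run step(), retrying with exponential backoff on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == attempts:
                raise  # exhausted retries: surface the error to the scheduler
            time.sleep(base_delay * 2 ** (attempt - 1))

calls = {"n": 0}
def flaky_pull():
    """Simulated vendor pull that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient SFTP failure")
    return "file.csv"

result = run_with_retry(flaky_pull)
print(result)  # succeeds on the third attempt: file.csv
```

In a real scheduler (Azure Data Factory, cron, or a managed file transfer tool), the re-raise on the final attempt is what marks the job failed and triggers alerting.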
  • Data Engineer

    Quantum World Technologies Inc. (4.2 company rating)

    Requirements engineer job in Berkeley Heights, NJ

    Role: Senior Data Engineer
    Job Type: Full-time only
    Visa: Independent visa only
    Job Description:
    - Hands-on experience in building and optimizing data processing applications using Java and Python, ensuring high performance and scalability of data pipelines
    - Advanced knowledge of Apache Spark to handle large-scale data processing tasks, including the development and optimization of complex Spark applications for efficient data transformation
    - Comprehensive understanding of Hadoop, HDFS, and cloud Big Data technologies, with hands-on experience in managing and processing vast amounts of data effectively
    $92k-132k yearly est. 1d ago
  • Data Engineer

    Company (3.0 company rating)

    Requirements engineer job in Fort Lee, NJ

    The Senior Data Analyst will be responsible for developing MS SQL queries and procedures, building custom reports, and modifying ERP user forms to support and enhance organizational productivity. This role will also design and maintain databases, ensuring high levels of stability, reliability, and performance.
    Responsibilities:
    - Analyze, structure, and interpret raw data.
    - Build and maintain datasets for business use.
    - Design and optimize database tables, schemas, and data structures.
    - Enhance data accuracy, consistency, and overall efficiency.
    - Develop views, functions, and stored procedures.
    - Write efficient SQL queries to support application integration.
    - Create database triggers to support automation processes.
    - Oversee data quality, integrity, and database security.
    - Translate complex data into clear, actionable insights.
    - Collaborate with cross-functional teams on multiple projects.
    - Present data through graphs, infographics, dashboards, and other visualization methods.
    - Define and track KPIs to measure the impact of business decisions.
    - Prepare reports and presentations for management based on analytical findings.
    - Conduct daily system maintenance and troubleshoot issues across all platforms.
    - Perform additional ad hoc analysis and tasks as needed.
    Qualifications:
    - Bachelor's degree in Information Technology or a relevant field
    - 4+ years of experience as a Data Analyst or Data Engineer, including database design experience.
    - Strong ability to extract, manipulate, analyze, and report on data, as well as develop clear and effective presentations.
    - Proficiency in writing complex SQL queries, including table joins, data aggregation (SUM, AVG, COUNT), and creating, retrieving, and updating views.
    - Excellent written, verbal, and interpersonal communication skills.
    - Ability to manage multiple tasks in a fast-paced and evolving environment.
    - Strong work ethic, professionalism, and integrity.
    - Advanced proficiency in Microsoft Office applications.
    $93k-132k yearly est. 1d ago
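The SQL skills the listing above asks for (joins, SUM/AVG/COUNT aggregation, views) can be exercised end to end with Python's built-in sqlite3 module; the posting's stack is MS SQL, so treat this only as a portable sketch, with made-up table names.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'acme', 100.0), (2, 'acme', 50.0),
                              (3, 'globex', 70.0);
    -- A view with aggregation, as the posting mentions developing views:
    CREATE VIEW customer_totals AS
        SELECT customer, COUNT(*) AS n, SUM(amount) AS total
        FROM orders GROUP BY customer;
""")
rows = conn.execute(
    "SELECT customer, n, total FROM customer_totals ORDER BY customer"
).fetchall()
print(rows)  # [('acme', 2, 150.0), ('globex', 1, 70.0)]
```

The same view/aggregate pattern carries over to MS SQL almost verbatim; only the DDL details (identity columns, schemas) differ.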
  • Data Engineer

    Mastech Digital (4.7 company rating)

    Requirements engineer job in Jersey City, NJ

    Mastech Digital Inc. (NYSE: MHH) is a minority-owned, publicly traded IT staffing and digital transformation services company. Headquartered in Pittsburgh, PA, and established in 1986, we serve clients nationwide through 11 U.S. offices.
    Role: Data Engineer
    Location: Merrimack, NH / Smithfield, RI / Jersey City, NJ
    Duration: Full-Time/W2
    Job Description:
    Must-Haves:
    - Python for running ETL batch jobs
    - Heavy SQL for data analysis, validation, and querying
    - AWS and the ability to move data through the data stages and into the target databases. The Postgres database is the target, so that is required.
    Nice-to-haves:
    - Snowflake
    - Java for API development (will teach this)
    - Experience in asset management for domain knowledge
    - Production support: debugging and processing of vendor data
    The Expertise and Skills You Bring:
    - A proven foundation in data engineering - bachelor's degree preferred, 10+ years' experience
    - Extensive experience with ETL technologies
    - Design and develop ETL reporting and analytics solutions
    - Knowledge of data warehousing methodologies and concepts - preferred
    - Advanced data manipulation languages and frameworks (Java, Python, JSON) - required
    - RDBMS experience (Snowflake, PostgreSQL) - required
    - Knowledge of cloud platforms and services (AWS - IAM, EC2, S3, Lambda, RDS) - required
    - Designing and developing low- to moderate-complexity data integration solutions - required
    - Experience with DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Stash, Ansible, Docker) - preferred
    - Expert in SQL and stored procedures on any relational database
    - Good at debugging, analysis, and production support
    - Application development based on JIRA stories (Agile environment)
    - Demonstrable experience with ETL tools (Informatica, SnapLogic)
    - Experience working with Python in an AWS environment
    - Create, update, and maintain technical documentation for software-based projects and products
    - Solve production issues
    - Interact effectively with business partners to understand business requirements and assist in generation of technical requirements
    - Participate in architecture, technical design, and product implementation discussions
    - Working knowledge of Unix/Linux operating systems and shell scripting
    - Experience developing sophisticated Continuous Integration & Continuous Delivery (CI/CD) pipelines, including software configuration management, test automation, version control, and static code analysis
    - Excellent interpersonal and communication skills
    - Ability to work with global Agile teams
    - Proven ability to deal with ambiguity and work in a fast-paced environment
    - Ability to mentor junior data engineers
    The Value You Deliver:
    - Help the team design and build best-in-class data solutions using a very diversified tech stack
    - Strong experience working in large teams and proven technical leadership capabilities
    - Knowledge of enterprise-level implementations like data warehouses and automated solutions
    - Ability to negotiate, influence, and work with business peers and management
    - Ability to develop and drive a strategy per the needs of the team
    Good to have: full-stack programming knowledge; hands-on test case/plan preparation within Jira
    $81k-105k yearly est. 1d ago
  • Senior Data Engineer - Master Data Management (MDM)

    Synechron 4.4company rating

    Requirements engineer job in Iselin, NJ

We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team. The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector. You will be responsible for architecting robust data platforms, evaluating MDM tools, and aligning data strategies to meet business needs.

Additional Information: The base salary for this role will vary based on geography and other factors. In accordance with law, the base salary for this role if filled within Iselin, NJ is $140K - $150K/year plus benefits (see below).

The Role - Responsibilities:
- Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
- Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
- Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
- Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM) and provide objective recommendations aligned with business requirements.
- Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
- Implement data integration pipelines leveraging modern data engineering tools and practices.
- Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer.
- Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies.
- Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services.
- Ensure compliance with data governance, data privacy, and security standards.
- Support CI/CD pipelines for continuous integration and deployment of data solutions.
Requirements:
- 12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
- Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
- Strong functional knowledge of reference data sources and domain-specific data standards.
- Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
- Familiarity with CI/CD practices, tools, and automation pipelines.
- Ability to work collaboratively across teams to deliver complex data solutions.
- Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).

Preferred Qualifications:
- Familiarity with financial data models and regulatory requirements.
- Experience with Azure cloud platforms.
- Knowledge of data governance, data quality frameworks, and metadata management.

We offer:
- A highly competitive compensation and benefits package.
- A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
- 10 days of paid annual leave (plus sick leave and national holidays).
- Maternity & paternity leave plans.
- A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
- Retirement savings plans.
- A higher education certification policy.
- Commuter benefits (varies by region).
- Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
- On-demand Udemy for Business for all Synechron employees, with free access to more than 5,000 curated courses.
- Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups.
- Cutting-edge projects at the world's leading tier-one banks, financial institutions, and insurance firms.
- A flat and approachable organization.
- A truly diverse, fun-loving, and global work culture.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference' is committed to fostering an inclusive culture, promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, more successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
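The core MDM task this role describes, merging duplicate records from multiple source systems into a single "golden record", can be sketched in a few lines. This is a hedged illustration only: the field names and the most-recent-non-empty survivorship rule are invented for the example and do not reflect any specific MDM platform's behavior.

```python
# Minimal golden-record merge: for each field, the most recently updated
# non-empty value survives. Real MDM platforms add matching, stewardship
# workflows, and lineage on top of this basic idea.
def golden_record(records):
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if field != "updated" and value:
                merged[field] = value  # later non-empty values win
    return merged

# Two versions of the same customer, from two hypothetical source systems.
sources = [
    {"name": "J. Smith", "email": "", "phone": "555-0100", "updated": "2023-01-01"},
    {"name": "Jane Smith", "email": "jane@example.com", "phone": "", "updated": "2024-06-01"},
]

golden = golden_record(sources)
print(golden)  # newer name and email survive; the older phone fills the gap
```

Platform evaluation, one of the responsibilities above, largely comes down to how each tool lets you express and govern survivorship rules like this one.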
    $140k-150k yearly 1d ago
  • Azure Data Engineer

    Sharp Decisions 4.6company rating

    Requirements engineer job in Jersey City, NJ

    Title: Senior Azure Data Engineer Client: Major Japanese Bank Experience Level: Senior (10+ Years) The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices. Key Responsibilities: Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows. Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions. Ensure data security, compliance, lineage, and governance controls. Partner with architecture, data governance, and business teams to deliver high-quality data solutions. Troubleshoot performance issues and improve system efficiency. Required Skills: 10+ years of data engineering experience. Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL. Azure certifications strongly preferred. Strong SQL, Python, and cloud data architecture skills. Experience in financial services or large enterprise environments preferred.
    $77k-101k yearly est. 1d ago
  • Azure Data Engineer

    Kaizen Technologies 3.6company rating

    Requirements engineer job in Princeton, NJ

We are seeking an experienced Azure Data Engineer with strong expertise in modern data platform technologies, including Azure Synapse, Microsoft Fabric, SQL Server, Azure Storage, Azure Data Factory (ADF), Python, Power BI, and Azure OpenAI. The ideal candidate will design, build, and optimize scalable data pipelines and analytics solutions to support enterprise-wide reporting, AI, and data integration initiatives.

Key Responsibilities:
- Design, develop, and maintain Azure-based data pipelines using ADF, Synapse Pipelines, and Fabric Dataflows.
- Build and optimize data warehouses, data lakes, and lakehouse architectures on Azure.
- Develop complex SQL queries, stored procedures, and data transformations in SQL Server and Synapse SQL Pools.
- Implement data ingestion, transformation, and orchestration solutions using Python and Azure services.
- Manage and optimize Azure Storage solutions (ADLS Gen2, Blob Storage).
- Leverage Power BI and Fabric for data modeling, dataset creation, and dashboard/report development.
- Integrate and utilize Azure OpenAI for data enrichment, intelligent automation, and advanced analytics where applicable.
- Ensure data quality, data governance, and security best practices across the data lifecycle.
- Troubleshoot data pipeline issues, optimize performance, and support production workloads.
- Collaborate with data architects, analysts, BI developers, and business stakeholders to deliver end-to-end data solutions.
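The orchestration idea behind pipelines like the ADF and Synapse ones described above is a dependency graph: each activity runs only after its upstream activities succeed. A minimal sketch using Python's standard-library `graphlib` (3.9+); the step names are invented, and a real orchestrator adds scheduling, retries, and monitoring on top.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# A toy pipeline as a dependency graph, in the spirit of ADF/Synapse
# pipeline activities: keys are steps, values are their upstream steps.
graph = {
    "ingest_raw": set(),
    "transform": {"ingest_raw"},
    "load_warehouse": {"transform"},
    "refresh_report": {"load_warehouse"},
}

executed = []
for step in TopologicalSorter(graph).static_order():
    executed.append(step)  # a real runner would invoke the activity here

print(executed)  # steps come out in dependency order
```

Because the graph here is a single chain, the order is fully determined; with branching graphs, independent steps can run in parallel, which is exactly what pipeline engines exploit.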
    $76k-106k yearly est. 3d ago
  • DevOps Engineer

    Optomi 4.5company rating

    Requirements engineer job in Short Hills, NJ

DevOps Engineer | Direct Hire | Hybrid (2 days on-site) | Short Hills, NJ

Optomi, in partnership with a leading insurance organization, is seeking an accomplished Senior DevOps Engineer to join their team. This role offers the opportunity to leverage cloud technologies to accelerate value delivery to customers and drive innovation across the organization. The Senior DevOps Engineer will play a critical role in shaping and enhancing development practices by defining and implementing best practices, patterns, and automation strategies. This individual will lead efforts to design, improve, and sustain continuous integration and delivery pipelines while providing hands-on technical oversight to ensure projects align with organizational strategy, architecture, and methodologies. Acting as both a technical leader and trusted advisor, the Senior DevOps Engineer will bring thought leadership in modernization, technology advancement, and application lifecycle management, while also providing expert consulting, mentorship, and guidance to organizational leaders and development teams.

What the right candidate will enjoy:
- Direct Hire full-time opportunity
- Flexible hybrid schedule
- Acting as a leader in modernization, technology advancement, and application lifecycle management
- Driving efficient development practices and influencing best practices and patterns across teams

Experience of the right candidate:
- Over 7 years of experience in applications development
- More than 5 years of experience designing DevOps pipelines using tools and technologies including Azure DevOps, SonarQube, and YAML
- In-depth knowledge of Azure services including but not limited to Azure Compute, Azure Storage, Azure Networking, Azure App Service, Logic Apps, VMSS, and Azure Security
- Proficiency in Azure DevOps and building CI/CD pipelines, including Azure environment provisioning tasks
- Experience with Infrastructure as Code (IaC) using tools such as Azure Resource Manager (ARM) templates, Terraform, Puppet, or Ansible
- Experience with scripting languages such as Bicep, PowerShell, Bash, or Python
- Demonstrated experience in cloud cost optimization, governance, and implementing FinOps practices
- Strong leadership and influencing skills with the ability to drive change and foster a DevOps culture across teams
- Experience designing and implementing disaster recovery strategies and high-availability architectures in cloud environments
- Self-starter capable of working independently and making decisions as needed
- Strong verbal, written, and interpersonal communication skills and the ability to communicate with audiences at varying technical levels
- Preferred: Experience working in an Agile environment, preferably SAFe
- Preferred: Azure certifications such as Azure Administrator Associate or Azure DevOps Engineer Expert
- Preferred: Experience in Application Security / DevSecOps roles

Responsibilities of the right candidate:
- Design and oversee the implementation of cloud-based architecture, networking, and containerization, utilizing Infrastructure-as-Code for automation and patterns
- Lead the creation and deployment of CI/CD and other automation solutions, focusing on design patterns that emphasize reuse, scalability, performance, availability, and security
- Develop and enhance process flows, release pipeline documentation, mockups, and other materials to convey technical details and their alignment with desired outcomes
- Conduct technical evaluations of DevOps solutions, understand existing industry options, and design necessary custom system integrations
- Serve as a strategic thinker, thought leader, internal consultant, advocate, mentor, and change agent for DevOps architecture within development teams
- Measure and demonstrate the benefits and business value of DevOps improvements
- Present innovative and complex solutions and ideas to participants at all levels, working both as a leader and an individual contributor
- Identify customer, business, and technology needs through relationship building and communication with key stakeholders
- Identify gaps and propose modernization opportunities that involve both process and technical/automation aspects of the SDLC
- Debug and troubleshoot issues with new and existing CI/CD pipelines
    $91k-122k yearly est. 1d ago
  • Senior Data Engineer (Snowflake)

    Epic Placements

    Requirements engineer job in Parsippany-Troy Hills, NJ

Senior Data Engineer (Snowflake & Python)
1-Year Contract | $60/hour + Benefit Options | Hybrid: on-site a few days per month (local candidates only)

Work Authorization Requirement: You must be authorized to work for any employer as a W2 employee. This is required for this role. This position is W-2 only; no C2C, no third-party submissions, and no sponsorship will be considered.

Overview: We are seeking a Senior Data Engineer to support enterprise-scale data initiatives for a highly collaborative engineering organization. This is a new, long-term contract opportunity for a hands-on data professional who thrives in fast-paced environments and enjoys building high-quality, scalable data solutions on Snowflake. Candidates must be based in or around New Jersey, able to work on-site at least 3 days per month, and meet the W2 employment requirement.

What You'll Do:
- Design, develop, and support enterprise-level data solutions with a strong focus on Snowflake
- Participate across the full software development lifecycle: planning, requirements, development, testing, and QA
- Partner closely with engineering and data teams to identify and implement optimal technical solutions
- Build and maintain high-performance, scalable data pipelines and data warehouse architectures
- Ensure platform performance, reliability, and uptime, maintaining strong coding and design standards
- Troubleshoot production issues, identify root causes, implement fixes, and document preventive solutions
- Manage deliverables and priorities effectively in a fast-moving environment
- Contribute to data governance practices, including metadata management and data lineage
- Support analytics and reporting use cases leveraging advanced SQL and analytical functions

Required Skills & Experience:
- 8+ years of experience designing and developing data solutions in an enterprise environment
- 5+ years of hands-on Snowflake experience
- Strong hands-on development skills with SQL and Python
- Proven experience designing and developing data warehouses in Snowflake
- Ability to diagnose, optimize, and tune SQL queries
- Experience with Azure data frameworks (e.g., Azure Data Factory)
- Strong experience with orchestration tools such as Airflow, Informatica, Automic, or similar
- Solid understanding of metadata management and data lineage
- Hands-on experience with SQL analytical functions
- Working knowledge of shell scripting and Java scripting
- Experience using Git, Confluence, and Jira
- Strong problem-solving and troubleshooting skills
- Collaborative mindset with excellent communication skills

Nice to Have:
- Experience supporting pharma industry data
- Exposure to omni-channel data environments

Why This Opportunity:
- $60/hour W2 on a long-term 1-year contract
- Benefit options available
- Hybrid structure with limited on-site requirement
- High-impact role supporting enterprise data initiatives
- Clear expectations: W-2 only, no third-party submissions, no Corp-to-Corp

This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
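The "SQL analytical functions" this posting asks for are window functions: per-row calculations over a partition of rows. A hedged sketch below; the table and data are invented, and `sqlite3` (which supports window functions from SQLite 3.25) stands in for Snowflake so the example runs anywhere.

```python
import sqlite3

# Demonstrate two common analytical functions: ROW_NUMBER to rank rows
# within a partition, and SUM(...) OVER to compute a partition total
# without collapsing the detail rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100), ("East", 300), ("West", 200)])

rows = conn.execute("""
    SELECT region, amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY region) AS region_total
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(rows)
```

Unlike `GROUP BY`, the window form keeps every detail row while still attaching the aggregate, which is why these functions show up so often in reporting queries.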
    $60 hourly 1d ago
  • Data Engineer

    The Judge Group 4.7company rating

    Requirements engineer job in Jersey City, NJ

ONLY LOCALS TO NJ/NY - NO RELOCATION CANDIDATES
Skillset: Data Engineer
Must-haves: Python, PySpark, AWS (ECS, Glue, Lambda, S3)
Nice-to-haves: Java, Spark, React JS
Interview Process: 2 rounds; the 2nd will be on-site

You're ready to gain the skills and experience needed to grow within your role and advance your career, and we have the perfect software engineering opportunity for you. As a Data Engineer III - Python / Spark / Data Lake at JPMorgan Chase within the Consumer and Community Bank, you will be a seasoned member of an agile team, tasked with designing and delivering reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. Your responsibilities will include developing, testing, and maintaining essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's business objectives.

Job responsibilities:
• Supports review of controls to ensure sufficient protection of enterprise data.
• Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request.
• Updates logical or physical data models based on new use cases.
• Frequently uses SQL and understands NoSQL databases and their niche in the marketplace.
• Adds to team culture of diversity, opportunity, inclusion, and respect.
• Develops enterprise data models; designs, develops, and maintains large-scale data processing pipelines and infrastructure; leads code reviews and provides mentoring through the process; drives data quality; ensures data accessibility to analysts and data scientists; ensures compliance with data governance requirements; and ensures data engineering practices align with business goals.
Required qualifications, capabilities, and skills:
• Formal training or certification on data engineering concepts and 2+ years of applied experience
• Experience across the data lifecycle, advanced experience with SQL (e.g., joins and aggregations), and a working understanding of NoSQL databases
• Experience with statistical data analysis and the ability to determine appropriate tools and data patterns to perform analysis
• Extensive experience in AWS and in the design, implementation, and maintenance of data pipelines using Python and PySpark
• Proficient in Python and PySpark; able to write and execute complex queries to perform curation and build the views required by end users (single- and multi-dimensional)
• Proven experience in performance tuning to ensure jobs run at optimal levels with no performance bottlenecks
• Advanced proficiency in leveraging Gen AI models from Anthropic (or OpenAI, or Google) using APIs/SDKs
• Advanced proficiency in a cloud data lakehouse platform such as AWS data lake services, Databricks, or Hadoop; a relational data store such as Postgres, Oracle, or similar; and at least one NoSQL data store such as Cassandra, DynamoDB, MongoDB, or similar
• Advanced proficiency in a cloud data warehouse such as Snowflake or AWS Redshift
• Advanced proficiency in at least one scheduling/orchestration tool such as Airflow, AWS Step Functions, or similar
• Proficiency in Unix scripting; data structures; data serialization formats such as JSON, AVRO, or Protobuf; big-data storage formats such as Parquet or Iceberg; data processing methodologies such as batch, micro-batching, or streaming; one or more data modelling techniques such as Dimensional, Data Vault, Kimball, or Inmon; Agile methodology; TDD or BDD; and CI/CD tools

Preferred qualifications, capabilities, and skills:
• Knowledge of data governance and security best practices
• Experience in carrying out data analysis to support business insights
• Strong Python and Spark skills
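Of the processing methodologies the qualifications name (batch, micro-batching, streaming), micro-batching is the least self-explanatory: rather than processing one record at a time or one huge batch, the pipeline consumes a stream in small fixed-size chunks. A minimal, hedged sketch in plain Python; frameworks like Spark Structured Streaming implement the same idea with checkpointing and fault tolerance on top.

```python
from itertools import islice

def micro_batches(stream, size):
    """Yield fixed-size lists from any iterable, trailing partial batch included."""
    it = iter(stream)
    while batch := list(islice(it, size)):
        yield batch

# A toy event stream of 7 records, consumed in batches of 3.
batches = list(micro_batches(range(7), 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

Batch size is the key tuning knob: larger batches amortize per-batch overhead (job startup, commits), while smaller ones reduce end-to-end latency.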
    $79k-111k yearly est. 5d ago
  • Senior Data Engineer

    Apexon

    Requirements engineer job in New Providence, NJ

Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX, along with deep expertise in BFSI, healthcare, and life sciences, to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents.

Job Description:
Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems.
- Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
- Work in tandem with our engineering team to identify and implement the most optimal solutions
- Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
- Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
- Able to manage deliverables in fast-paced environments

Areas of Expertise:
- At least 10 years of experience designing and developing data solutions in an enterprise environment
- At least 5 years' experience on the Snowflake platform
- Strong hands-on SQL and Python development
- Experience designing and developing data warehouses in Snowflake
- A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
- Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic
- Good understanding of metadata and data lineage
- Hands-on knowledge of SQL analytical functions
- Strong knowledge of, and hands-on experience in, shell scripting and Java scripting
- Demonstrated experience with software engineering practices, including CI/CD, automated testing, and performance engineering
- Good understanding of, and exposure to, Git, Confluence, and Jira
- Good problem-solving and troubleshooting skills
- Team player with a collaborative approach and excellent communication skills

Our Commitment to Diversity & Inclusion:
Did you know that Apexon has been Certified™ by Great Place To Work, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK? Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We are taking affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com)
    $82k-112k yearly est. 1d ago
  • Data Engineer

    Neenopal Inc.

    Requirements engineer job in Newark, NJ

NeenOpal is a global consulting firm specializing in Data Science and Business Intelligence, with offices in Bengaluru, Newark, and Fredericton. We provide end-to-end solutions tailored to the unique needs of businesses, from startups to large organizations, across domains like digital strategy, sales and marketing, supply chain, and finance. Our mission is to help organizations achieve operational excellence and transform into data-driven enterprises.

Role Description: This is a full-time, hybrid Data Engineer role located in Newark, NJ. The Data Engineer will be responsible for designing, implementing, and managing data engineering solutions to support business needs. Day-to-day tasks include building and optimizing data pipelines, developing and maintaining data models and ETL processes, managing data warehousing solutions, and contributing to the organization's data analytics initiatives. Collaboration with cross-functional teams to ensure robust data infrastructure will be a key aspect of this role.

Key Responsibilities:
- Data Pipeline Development: Design, implement, and manage robust data pipelines to ensure efficient data flow into data warehouses. Automate ETL processes using Python and advanced data engineering tools.
- Data Integration: Integrate and transform data using industry-standard tools. Experience required with AWS services (AWS Glue, Data Pipeline, Redshift, and S3) and Azure services (Azure Data Factory, Synapse Analytics, and Blob Storage).
- Data Warehousing: Implement and optimize solutions using Snowflake and Amazon Redshift.
- Database Management: Develop and manage relational databases (SQL Server, MySQL, PostgreSQL) to ensure data integrity.
- Performance Optimization: Continuously monitor and improve data processing workflows and apply best practices for query optimization.
- Global Collaboration: Work closely with cross-functional teams in the US, India, and Canada to deliver high-quality solutions.
Governance & Support: Document ETL processes and data mappings in line with governance standards. Diagnose and resolve data-related issues promptly.

Required Skills and Experience:
- Experience: Minimum 2+ years of experience designing and developing ETL processes (AWS Glue, Azure Data Factory, or similar).
- Integration: Experience integrating data via RESTful / GraphQL APIs.
- Programming: Proficient in Python for ETL automation and SQL for database management.
- Cloud Platforms: Strong experience with AWS or Azure data services (GCP familiarity is a plus).
- Data Warehousing: Expertise with Snowflake, Amazon Redshift, or Azure Synapse Analytics.
- Communication: Excellent articulation skills to explain technical work directly to clients and stakeholders.
- Authorization: Must have valid work authorization in the United States.

Salary Range: $65,000-$80,000 per year

Benefits: This role includes health insurance, paid time off, and opportunities for professional growth and continuous learning within a fast-growing global analytics company.

Equal Opportunity Employer: NeenOpal Inc. is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
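The API-integration step this posting calls out (ingesting data via RESTful APIs) typically means pulling a nested JSON payload and flattening it into rows for warehouse loading. A hedged sketch: the payload shape and field names are invented, and the JSON is shown inline here, whereas a real pipeline would fetch it over HTTP with e.g. urllib or requests.

```python
import json

# Stand-in for the body of a hypothetical GET /orders response.
PAYLOAD = json.dumps({
    "orders": [
        {"id": 1, "customer": {"name": "Acme"}, "total": 250.0},
        {"id": 2, "customer": {"name": "Globex"}, "total": 99.5},
    ]
})

def to_rows(payload):
    """Flatten the nested JSON payload into (id, customer_name, total) tuples."""
    data = json.loads(payload)
    return [(o["id"], o["customer"]["name"], o["total"]) for o in data["orders"]]

rows = to_rows(PAYLOAD)
print(rows)  # [(1, 'Acme', 250.0), (2, 'Globex', 99.5)]
```

Flattening at ingestion time, rather than storing raw JSON, keeps the downstream SQL simple; the trade-off is that schema changes in the API require a pipeline change.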
    $65k-80k yearly 1d ago
  • Java Software Engineer (Trading)-- AGADC5642050

    Compunnel Inc. 4.4company rating

    Requirements engineer job in Jersey City, NJ

Must-haves:
1. Low-latency Java development experience (trading preferred, but not mandatory). These are more from a screening standpoint; candidates with low-latency Java development experience should also have:
2. Garbage collection, threading and/or multithreading, and memory management experience
3. FIX Protocol
4. Optimization or profiling techniques

Nice-to-haves: Order Management System, Smart Order Router, market data experience
    $72k-93k yearly est. 2d ago

Learn more about requirements engineer jobs


What are the top employers for requirements engineer in NJ?

Zone It Solutions

Howmet Holdings Corporation

Grid Dynamics

Top 10 Requirements Engineer companies in NJ

  1. Tata Group

  2. Cushman & Wakefield

  3. Jacobs Enterprises

  4. Safway Group Holding LLC

  5. Zone It Solutions

  6. South Jersey Industries

  7. Howmet Holdings Corporation

  8. Grid Dynamics

  9. Photon Group

  10. Campbell Soup

