
Requirements engineer jobs in Ramapo, NY

- 576 jobs
  • O365 Engineer

    NLB Services 4.3 company rating

    Requirements engineer job in Mahwah, NJ

    Key Responsibilities:
    • Serve as the primary technical lead for O365 Office applications (Word, Excel, PowerPoint, Outlook) and integrated services.
    • Manage the monthly patching cadence for 110,000 devices, from Alpha/Beta patch deployment testing through remediation of unpatched devices across UPS network desktops, Citrix workstations, and servers.
    • Analyze business requirements and design solutions leveraging O365 capabilities. Manage configurations, updates, and integrations for O365 Office applications.
    • Provide Tier 3 support for complex O365 Office-related issues and escalations. Collaborate with IT teams and business units to optimize workflows and enhance productivity.
    • Develop and maintain documentation for O365 Office configurations, policies, and best practices.
    • Monitor system performance and ensure compliance with security and governance standards.
    • Stay current with Microsoft updates and recommend enhancements to improve user experience.
    Qualifications:
    • Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
    • 5+ years of experience in systems analysis or administration, with at least 3 years focused on O365 Office applications.
    • 3 years of experience with O365 architecture, Office application deployment, and integration with enterprise systems.
    • 3 years of experience with PowerShell scripting, Azure AD, and security compliance.
    • Excellent analytical, problem-solving, and communication skills.
    $74k-107k yearly est. 1d ago
  • BI Engineer (Tableau & Power BI - platforms/server)

    Harvey Nash

    Requirements engineer job in Newark, NJ

    Job Title: BI Engineer (Tableau & Power BI - platforms/server)
    Duration: 12 months, long-term project
    US citizens, Green Card holders, and those authorized to work in the US are encouraged to apply. We are unable to sponsor H1B candidates at this time.
    Summary:
    • Extremely technical, hands-on skills in Power BI, Python, and some Tableau
    • Financial, asset management, or banking background - Fixed Income specifically is a big plus
    • Azure Cloud
    Our Role: We are looking for an astute, determined professional like you to fulfill a BI Engineering role within our Technology Solutions Group. You will showcase your success in a fast-paced environment through collaboration, ownership, and innovation. Your expertise in emerging trends and practices will evoke stimulating discussions around optimization and change to help keep our competitive edge. This rewarding opportunity will enable you to make a big impact in our organization, so if this sounds exciting, then this might be the place.
    Your Impact:
    • Build and maintain new and existing applications in preparation for a large-scale architectural migration within an Agile function.
    • Align with the Product Owner and Scrum Master in assessing business needs and transforming them into scalable applications.
    • Build and maintain code to manage data received from web-based sources, internal/external databases, flat files, and heterogeneous data formats (binary, ASCII).
    • Help build a new enterprise data warehouse and maintain the existing one.
    • Design and support effective storage and retrieval of very large internal and external data sets, and think ahead about the convergence strategy with our AWS cloud migration.
    • Assess the impact of scaling up and scaling out, and ensure sustained data management and data delivery performance.
    • Build interfaces to support evolving and new applications and to accommodate new data sources and types of data.
    Your Required Skills:
    • 5+ years of hands-on experience in BI platform administration (Power BI and Tableau)
    • 3+ years of hands-on experience in Power BI/Tableau report development
    • Experience with both server and desktop-based data visualization tools
    • Expertise with multiple database platforms, including relational databases (e.g., SQL Server) as well as cloud-based data warehouses such as Azure
    • Fluent with SQL for data analysis
    • Working experience in a Windows-based environment
    • Knowledge of data warehousing, ETL procedures, and BI technologies
    • Excellent analytical and problem-solving skills with the ability to think quickly and offer alternatives both independently and within teams
    • Exposure to working in an Agile environment with a Scrum Master/Product Owner and the ability to deliver
    • Ability to communicate status and challenges with the team
    • Demonstrated ability to learn new skills and work as a team
    • Strong interpersonal skills
    A reasonable, good faith estimate of the minimum and maximum pay rate for this position is $70/hr. to $80/hr.
    $70-80 hourly 1d ago
  • GenAI Engineer

    AGM Tech Solutions - A Woman- and Latina-Owned IT Staffing Firm - An Inc. 5000 Company

    Requirements engineer job in Parsippany-Troy Hills, NJ

    ***This is an architect role, not a Data Scientist position.*** Local candidates only, please. Required three days per week onsite.
    GenAI Engineer Responsibilities:
    • Design and develop innovative AI/ML solutions in collaboration with the Center of Excellence
    • Develop systems architecture, technical roadmaps, and prototypes
    • Build, test, and deploy AI models into production leveraging AWS services
    • Continuously explore new AI techniques and methodologies to drive innovation
    Requirements:
    • Degree in Computer Science, Statistics, or a related quantitative field
    • 5+ years of experience architecting and developing AI/ML systems, including 1+ year with GenAI systems
    • Expertise in Python, SQL, PyTorch, TensorFlow, and other ML libraries/frameworks
    • Experience deploying solutions on cloud platforms like AWS
    • Familiarity with MLOps and AIOps principles
    • Strong communication, collaboration, and coaching skills
    Nice to have: Databricks experience
    You can expect:
    • Courageous collaboration across high-performing teams
    • Opportunity to deliver AI innovations at epic scale
    • An environment surrounded by curious lifelong learners
    • A culture of innovation, ownership, and hands-on creativity
    • Industry leadership in applying AI/ML to transform HR
    • Belonging in a company committed to equality, diversity, and inclusion
    Let's talk if you're ready to architect the next generation of AI!
    $70k-94k yearly est. 3d ago
  • Gen AI/ML Engineer

    Capgemini 4.5 company rating

    Requirements engineer job in Jersey City, NJ

    Gen AI/ML Engineer with Data Engineering Exposure
    Experience: 8+ years preferred
    Employee Type: Full time with benefits
    Job Description: We are seeking a highly skilled and experienced AI/ML Engineer with a strong background in Machine Learning (ML), Large Language Models (LLMs), Generative AI (GenAI), and Data Engineering. The ideal candidate will have successfully delivered 3-4 end-to-end AI/ML projects, demonstrating expertise in building scalable ML systems and deploying them in production environments. A solid foundation in Python, SQL, PySpark, and NLP technologies is essential. Experience with cloud platforms such as AWS, Azure, or GCP is highly desirable.
    Key Responsibilities:
    • Design, develop, and deploy scalable ML/AI solutions, including robust MLOps pipelines for CI/CD, model monitoring, and governance.
    • Lead the development of LLM and GenAI applications, including text summarization, conversational AI, and entity recognition.
    • Build and optimize data pipelines using PySpark and SQL for large-scale data processing and feature engineering.
    • Architect and implement production-grade ML systems with a focus on performance, scalability, and reliability.
    • Collaborate with cross-functional teams to align AI initiatives with business goals and drive innovation.
    • Mentor junior engineers and contribute to team-wide knowledge sharing and best practices.
    Required Skills & Qualifications:
    • Bachelor's degree in Computer Science, Data Science, or a related field.
    • 7+ years of hands-on experience in ML/AI solution development and deployment.
    • Proven track record of working on at least 3-4 AI/ML projects from concept to production.
    • Programming languages: Python (Pandas, NumPy, PyTorch, TensorFlow), SQL.
    • MLOps tools: MLflow, Kubeflow, Docker, Kubernetes, CI/CD pipelines.
    • GenAI & NLP: expertise in transformer models (e.g., GPT, BERT), Hugging Face, LangChain.
    • Data engineering: strong experience with PySpark and distributed data processing.
    • Cloud platforms: proficiency in AWS, Azure, or GCP.
    • Strong problem-solving skills and the ability to thrive in a fast-paced, collaborative environment.
    Life at Capgemini: Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
    • Flexible work
    • Healthcare including dental, vision, mental health, and well-being programs
    • Financial well-being programs such as 401(k) and Employee Share Ownership Plan
    • Paid time off and paid holidays
    • Paid parental leave
    • Family building benefits like adoption assistance, surrogacy, and cryopreservation
    • Social well-being benefits like subsidized back-up child/elder care and tutoring
    • Mentoring, coaching and learning programs
    • Employee Resource Groups
    • Disaster Relief
    Disclaimer: Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law. This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please get in touch with your recruiting contact. Click the following link for more information on your rights as an Applicant: **************************************************************************
    Salary Transparency: Capgemini discloses salary range information in compliance with state and local pay transparency obligations. The disclosed range represents the lowest to highest salary we, in good faith, believe we would pay for this role at the time of this posting, although we may ultimately pay more or less than the disclosed range, and the range may be modified in the future. The disclosed range takes into account the wide range of factors that are considered in making compensation decisions, including, but not limited to, geographic location, relevant education, qualifications, certifications, experience, skills, seniority, performance, sales or revenue-based metrics, and business or organizational needs. At Capgemini, it is not typical for an individual to be hired at or near the top of the range for their role. The base salary range for the tagged location is $103,330 - $128,656 per year. This role may be eligible for other compensation including variable compensation, bonus, or commission. Full time regular employees are eligible for paid time off, medical/dental/vision insurance, 401(k), and any other benefits to eligible employees. Note: No amount of pay is considered to be wages or compensation until such amount is earned, vested, and determinable. The amount and availability of any bonus, commission, or any other form of compensation that are allocable to a particular employee remains in the Company's sole discretion unless and until paid and may be modified at the Company's sole discretion, consistent with the law.
    $103.3k-128.7k yearly 4d ago
  • Data Engineer

    Company 3.0 company rating

    Requirements engineer job in Fort Lee, NJ

    The Senior Data Analyst will be responsible for developing MS SQL queries and procedures, building custom reports, and modifying ERP user forms to support and enhance organizational productivity. This role will also design and maintain databases, ensuring high levels of stability, reliability, and performance.
    Responsibilities:
    • Analyze, structure, and interpret raw data.
    • Build and maintain datasets for business use.
    • Design and optimize database tables, schemas, and data structures.
    • Enhance data accuracy, consistency, and overall efficiency.
    • Develop views, functions, and stored procedures.
    • Write efficient SQL queries to support application integration.
    • Create database triggers to support automation processes.
    • Oversee data quality, integrity, and database security.
    • Translate complex data into clear, actionable insights.
    • Collaborate with cross-functional teams on multiple projects.
    • Present data through graphs, infographics, dashboards, and other visualization methods.
    • Define and track KPIs to measure the impact of business decisions.
    • Prepare reports and presentations for management based on analytical findings.
    • Conduct daily system maintenance and troubleshoot issues across all platforms.
    • Perform additional ad hoc analysis and tasks as needed.
    Qualifications:
    • Bachelor's degree in Information Technology or a relevant field.
    • 4+ years of experience as a Data Analyst or Data Engineer, including database design experience.
    • Strong ability to extract, manipulate, analyze, and report on data, as well as develop clear and effective presentations.
    • Proficiency in writing complex SQL queries, including table joins, data aggregation (SUM, AVG, COUNT), and creating, retrieving, and updating views.
    • Excellent written, verbal, and interpersonal communication skills.
    • Ability to manage multiple tasks in a fast-paced and evolving environment.
    • Strong work ethic, professionalism, and integrity.
    • Advanced proficiency in Microsoft Office applications.
    $93k-132k yearly est. 3d ago
  • Lead Data Engineer

    Themesoft Inc. 3.7 company rating

    Requirements engineer job in Roseland, NJ

    Job Title: Lead Data Engineer
    Hybrid Role: 3 times/week
    Type: 12-month contract - rolling/extendable
    Work Authorization: Candidates must be authorized to work in the U.S. without current or future sponsorship requirements.
    Must haves:
    • AWS
    • Databricks
    • Lead experience (this can be supplemented for staff as well)
    • Python
    • PySpark
    • Contact center experience is a nice to have
    Job Description: As a Lead Data Engineer, you will spearhead the design and delivery of a data hub/marketplace aimed at providing curated client service data to internal data consumers, including analysts, data scientists, analytic content authors, downstream applications, and data warehouses. You will develop a service data hub solution that enables internal data consumers to create and maintain data integration workflows, manage subscriptions, and access content to understand data meaning and lineage. You will design and maintain enterprise data models for contact center-oriented data lakes, warehouses, and analytic models (relational, OLAP/dimensional, columnar, etc.). You will collaborate with source system owners to define integration rules and data acquisition options (streaming, replication, batch, etc.). You will work with data engineers to define workflows and data quality monitors. You will perform detailed data analysis to understand the content and viability of data sources to meet desired use cases, and help define and maintain the enterprise data taxonomy and data catalog. This role requires clear, compelling, and influential communication skills. You will mentor developers and collaborate with peer architects and developers on other teams.
    To succeed in this role:
    • Ability to define and design complex data integration solutions with general direction and stakeholder access.
    • Capability to work independently and as part of a global, multi-faceted data warehousing and analytics team.
    • Advanced knowledge of cloud-based data engineering and data warehousing solutions, especially AWS, Databricks, and/or Snowflake.
    • Highly skilled in RDBMS platforms such as Oracle and SQL Server.
    • Familiarity with NoSQL database platforms like MongoDB.
    • Understanding of data modeling and data engineering, including SQL and Python.
    • Strong understanding of data quality, compliance, governance, and security.
    • Proficiency in languages such as Python, SQL, and PySpark.
    • Experience building data ingestion pipelines for structured and unstructured data for storage and optimal retrieval.
    • Ability to design and develop scalable data pipelines.
    • Knowledge of cloud-based and on-prem contact center technologies such as Salesforce.com, ServiceNow, Oracle CRM, Genesys Cloud, Genesys InfoMart, Calabrio Voice Recording, Nuance Voice Biometrics, IBM Chatbot, etc., is highly desirable.
    • Experience with code repository and project tools such as GitHub, JIRA, and Confluence.
    • Working experience with CI/CD (Continuous Integration & Continuous Deployment) processes, with hands-on expertise in Jenkins, Terraform, Splunk, and Dynatrace.
    • Highly innovative, with an aptitude for foresight, systems thinking, and design thinking, and a bias towards simplifying processes.
    • Detail-oriented with strong analytical, problem-solving, and organizational skills.
    • Ability to communicate clearly with both technical and business teams.
    • Knowledge of Informatica PowerCenter, Data Quality, and Data Catalog is a plus.
    • Knowledge of Agile development methodologies is a plus.
    • A Databricks Data Engineer Associate certification is a plus but not mandatory.
    Data Engineer Requirements:
    • Bachelor's degree in computer science, information technology, or a similar field.
    • 8+ years of experience integrating and transforming contact center data into standard, consumption-ready data sets incorporating standardized KPIs, supporting metrics, attributes, and enterprise hierarchies.
    • Expertise in designing and deploying data integration solutions using web services with client-driven workflows and subscription features.
    • Knowledge of mathematical foundations and statistical analysis.
    • Strong interpersonal skills.
    • Excellent communication and presentation skills.
    • Advanced troubleshooting skills.
    Regards, Purnima Pobbathy, Senior Technical Recruiter, ************ | ********************* | Themesoft Inc
    $78k-106k yearly est. 2d ago
  • Azure Data Engineer

    Programmers.Io 3.8 company rating

    Requirements engineer job in Weehawken, NJ

    • Expert-level skills writing and optimizing complex SQL
    • Experience with complex data modeling, ETL design, and using large databases in a business environment
    • Experience building data pipelines and applications to stream and process datasets at low latencies
    • Fluent with Big Data technologies like Spark, Kafka, and Hive
    • Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
    • Designing and building data pipelines using API ingestion and streaming ingestion methods
    • Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential
    • Experience developing NoSQL solutions using Azure Cosmos DB is essential
    • Thorough understanding of Azure and AWS cloud infrastructure offerings
    • Working knowledge of Python is desirable
    • Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
    • Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
    • Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
    • Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
    • Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
    • Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making
    • Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
    • Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
    Best Regards, Dipendra Gupta, Technical Recruiter, *****************************
    $92k-132k yearly est. 19h ago
  • Data Engineer

    Mastech Digital 4.7 company rating

    Requirements engineer job in Jersey City, NJ

    Mastech Digital Inc. (NYSE: MHH) is a minority-owned, publicly traded IT staffing and digital transformation services company. Headquartered in Pittsburgh, PA, and established in 1986, we serve clients nationwide through 11 U.S. offices.
    Role: Data Engineer
    Location: Merrimack, NH / Smithfield, RI / Jersey City, NJ
    Duration: Full-Time/W2
    Job Description:
    Must-haves:
    • Python for running ETL batch jobs
    • Heavy SQL for data analysis, validation, and querying
    • AWS and the ability to move data through the data stages and into the target databases; Postgres is the target database, so that is required
    Nice to haves:
    • Snowflake
    • Java for API development (will teach this)
    • Experience in asset management for domain knowledge
    • Production support, debugging, and processing of vendor data
    The Expertise and Skills You Bring:
    • A proven foundation in data engineering - bachelor's degree or higher preferred, 10+ years' experience
    • Extensive experience with ETL technologies
    • Design and develop ETL reporting and analytics solutions
    • Knowledge of data warehousing methodologies and concepts - preferred
    • Advanced data manipulation languages and frameworks (Java, Python, JSON) - required
    • RDBMS experience (Snowflake, PostgreSQL) - required
    • Knowledge of cloud platforms and services (AWS - IAM, EC2, S3, Lambda, RDS) - required
    • Designing and developing low to moderately complex data integration solutions - required
    • Experience with DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Stash, Ansible, Docker) - preferred
    • Expert in SQL and stored procedures on any relational database
    • Good at debugging, analysis, and production support
    • Application development based on JIRA stories (Agile environment)
    • Demonstrable experience with ETL tools (Informatica, SnapLogic)
    • Experience working with Python in an AWS environment
    • Create, update, and maintain technical documentation for software-based projects and products
    • Solve production issues
    • Interact effectively with business partners to understand business requirements and assist in the generation of technical requirements
    • Participate in architecture, technical design, and product implementation discussions
    • Working knowledge of Unix/Linux operating systems and shell scripting
    • Experience developing sophisticated Continuous Integration & Continuous Delivery (CI/CD) pipelines, including software configuration management, test automation, version control, and static code analysis
    • Excellent interpersonal and communication skills
    • Ability to work with global Agile teams
    • Proven ability to deal with ambiguity and work in a fast-paced environment
    • Ability to mentor junior data engineers
    The Value You Deliver:
    • Help the team design and build best-in-class data solutions using a very diversified tech stack
    • Strong experience working in large teams and proven technical leadership capabilities
    • Knowledge of enterprise-level implementations like data warehouses and automated solutions
    • Ability to negotiate, influence, and work with business peers and management
    • Ability to develop and drive a strategy as per the needs of the team
    Good to have: full-stack programming knowledge, hands-on test case/plan preparation within Jira
    $81k-105k yearly est. 4d ago
  • Data Engineer

    Spectraforce 4.5 company rating

    Requirements engineer job in Newark, NJ

    Data Engineer
    Duration: 6 months (with possible extension)
    Visas: USC, GC, GC EAD
    Contract type: W2 only (no H1B or H4 EAD)
    Responsibilities:
    • Prepares data for analytical or operational uses
    • Builds data pipelines to pull information from different source systems, integrating, consolidating, and cleansing data and structuring it for use in applications
    • Creates interfaces and mechanisms for the flow and access of information
    Required Skills:
    • ETL
    • AWS
    • Data analysis
    • Multi-source data gathering
    $85k-119k yearly est. 19h ago
  • Lead DevOps Engineer (Jenkins)

    PRI Technology 4.1 company rating

    Requirements engineer job in Jersey City, NJ

    Employment Type: Full-Time, Direct Hire. This is not a contract role and is not available for third-party agencies or contractors.
    About the Role: We are seeking a hands-on Lead DevOps Engineer to drive the development of a next-generation enterprise pipeline infrastructure. This role is ideal for a technical leader with deep experience building scalable Jenkins environments, defining CI/CD strategy, and promoting DevOps best practices across large organizations. If you thrive in fast-paced, highly collaborative environments and enjoy solving complex engineering challenges, this is an excellent opportunity.
    What You'll Do:
    • Lead the design and implementation of a unified enterprise pipeline framework using Jenkins, Octopus Deploy, and related CI/CD tools.
    • Build, optimize, and maintain a highly scalable Jenkins platform supporting multiple concurrent teams and workloads.
    • Evaluate emerging CI/CD technologies and lead enterprise-wide adoption initiatives.
    • Manage and mentor a team of developers and DevOps engineers; foster a culture of operational excellence.
    • Collaborate with cross-functional stakeholders to gather requirements, align strategy, and advance DevOps maturity.
    • Enforce Infrastructure-as-Code practices with proper governance, compliance, and audit controls.
    • Implement monitoring, alerting, and automation to ensure strong operational performance.
    • Lead incident response efforts; drive root-cause analysis and long-term remediation.
    • Identify bottlenecks and drive end-to-end automation to improve deployment speed and reliability.
    What You Bring:
    • Strong expertise with Jenkins, Octopus Deploy, and modern CI/CD ecosystems.
    • Hands-on experience with AWS or Azure, Docker, Kubernetes, Terraform, and IaC principles.
    • Strong programming skills (Python, Node.js) and solid Git fundamentals.
    • 3+ years of experience leading technical teams and delivering complex solutions.
    • Experience with software-defined networks, VPCs, cloud networking, and infrastructure automation.
    • Familiarity with DevOps methodologies and ITIL best practices.
    • Proactive, collaborative, and driven by innovation.
    $90k-121k yearly est. 1d ago
  • Azure Data Engineer

    Sharp Decisions 4.6 company rating

    Requirements engineer job in Jersey City, NJ

    Title: Senior Azure Data Engineer
    Client: Major Japanese bank
    Experience Level: Senior (10+ years)
    The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.
    Key Responsibilities:
    • Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
    • Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
    • Ensure data security, compliance, lineage, and governance controls.
    • Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
    • Troubleshoot performance issues and improve system efficiency.
    Required Skills:
    • 10+ years of data engineering experience.
    • Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
    • Azure certifications strongly preferred.
    • Strong SQL, Python, and cloud data architecture skills.
    • Experience in financial services or large enterprise environments preferred.
    $77k-101k yearly est. 19h ago
  • Azure DevOps Engineer

    LTIMindtree

    Requirements engineer job in Jersey City, NJ

    About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
    Job Title: Azure DevOps Engineer
    Work Location: Jersey City, NJ
    Job Description:
    1. Extensive hands-on experience with GitHub Actions, writing workflows in YAML using reusable templates
    2. Extensive hands-on experience with application CI/CD pipelines, both for Azure and on-prem, for different frameworks
    3. Hands-on experience with Azure DevOps and CI/CD pipeline migration programs, preferably from Azure DevOps to GitHub Actions
    4. Proficiency in integrating and consuming REST APIs to achieve automation through scripting
    5. Hands-on experience with at least one scripting language, and experience building out-of-the-box automations for platforms like PeopleSoft, SharePoint, MDM, etc.
    6. Hands-on experience with CI/CD for databases
    7. Good to have: experience with infrastructure-as-code, including ARM templates, Terraform, Azure CLI, and Azure PowerShell modules
    8. Exposure to monitoring tools like ELK, Prometheus, and Grafana
    Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
    Benefits and Perks:
    • Comprehensive Medical Plan covering Medical, Dental, Vision
    • Short-Term and Long-Term Disability Coverage
    • 401(k) Plan with Company match
    • Life Insurance
    • Vacation Time, Sick Leave, Paid Holidays
    • Paid Paternity and Maternity Leave
    The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation, like an annual performance-based bonus, sales incentive pay, and other forms of bonus or variable compensation.
    Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting. LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
    $88k-115k yearly est. 1d ago
  • DevOps Engineer

    Optomi 4.5 company rating

    Requirements engineer job in Short Hills, NJ

    DevOps Engineer | Direct Hire | Hybrid, 2 days on-site | Short Hills, NJ
    Optomi, in partnership with a leading insurance organization, is seeking an accomplished Senior DevOps Engineer to join their team. This role offers the opportunity to leverage cloud technologies to accelerate value delivery to customers and drive innovation across the organization. The Senior DevOps Engineer will play a critical role in shaping and enhancing development practices by defining and implementing best practices, patterns, and automation strategies. This individual will lead efforts to design, improve, and sustain continuous integration and delivery pipelines while providing hands-on technical oversight to ensure projects align with organizational strategy, architecture, and methodologies. Acting as both a technical leader and trusted advisor, the Senior DevOps Engineer will bring thought leadership in modernization, technology advancement, and application lifecycle management, while also providing expert consulting, mentorship, and guidance to organizational leaders and development teams.
    What the right candidate will enjoy:
    • Direct Hire, full-time opportunity
    • Flexible hybrid schedule
    • Acting as a leader in modernization, technology advancement, and application lifecycle management
    • Driving efficient development practices and influencing best practices and patterns across teams
    Experience of the right candidate:
    • Over 7 years of experience in applications development
    • More than 5 years of experience designing DevOps pipelines using tools and technologies including Azure DevOps, SonarQube, and YAML
    • In-depth knowledge of Azure services including but not limited to Azure Compute, Azure Storage, Azure Networking, Azure App Service, Logic Apps, VMSS, and Azure Security
    • Proficiency in Azure DevOps and building CI/CD pipelines, including Azure environment provisioning tasks
    • Experience with Infrastructure as Code (IaC) using tools such as Azure Resource Manager (ARM) templates, Terraform, Puppet, or Ansible
    • Experience with scripting languages such as Bicep, PowerShell, Bash, or Python
    • Demonstrated experience in cloud cost optimization, governance, and implementing FinOps practices
    • Strong leadership and influencing skills with the ability to drive change and foster a DevOps culture across teams
    • Experience designing and implementing disaster recovery strategies and high-availability architectures in cloud environments
    • Self-starter who is capable of working independently and making decisions when necessary/as applicable
    • Strong verbal, written, and interpersonal communication skills and the ability to communicate with audiences at varying technical levels
    • Preferred: Experience working in an Agile environment, preferably SAFe
    • Preferred: Azure certifications such as Azure Administrator Associate or Azure DevOps Engineer Expert
    • Preferred: Experience in Application Security / DevSecOps roles
    Responsibilities of the right candidate:
    • Design and oversee the implementation of cloud-based architecture, networking, and containerization, utilizing Infrastructure-as-Code for automation and patterns
    • Lead the creation and deployment of CI/CD and other automation solutions, focusing on design patterns that emphasize reuse, scalability, performance, availability, and security
    • Develop and enhance process flows, release pipeline documentation, mockups, and other materials to convey technical details and their alignment with desired outcomes
    • Conduct technical evaluations of DevOps solutions, understand existing industry options, and design necessary custom system integrations
    • Serve as a strategic thinker, thought leader, internal consultant, advocate, mentor, and change agent for DevOps architecture within development teams
    • Measure and demonstrate the benefits and business value of DevOps improvements
    • Present innovative and complex solutions and ideas to participants at all levels, working both as a leader and an individual contributor
    • Identify customer, business, and technology needs through relationship building and communication with key stakeholders
    • Identify gaps and propose modernization opportunities that involve both process and technical/automation aspects of the SDLC
    • Debug and troubleshoot issues with new and existing CI/CD pipelines
    $91k-122k yearly est. 1d ago
  • C++ Market Data Engineer

    TBG | The Bachrach Group

    Requirements engineer job in Stamford, CT

    We are seeking a C++ Market Data Engineer to design and optimize ultra-low-latency feed handlers that power global trading systems. This is a high-impact role where your code directly drives real-time decision making.
    What You'll Do:
    • Build high-performance feed handlers in modern C++ (14/17/20) for equities, futures, and options
    • Optimize systems for microsecond/nanosecond latency with lock-free algorithms and cache-friendly design
    • Ensure reliable data delivery with failover, gap recovery, and replay mechanisms
    • Collaborate with researchers and engineers to align data formats for trading and simulation
    • Instrument and test systems for continuous performance improvements
    What We're Looking For:
    • 3+ years of C++ development experience (low-latency, high-throughput systems)
    • Experience with real-time market data feeds (e.g., Bloomberg B-PIPE, CME MDP, Refinitiv, OPRA, ITCH)
    • Strong knowledge of concurrency, memory models, and compiler optimizations
    • Python scripting skills for testing and automation
    • Familiarity with Docker/Kubernetes and cloud networking (AWS/GCP) is a plus
    $84k-114k yearly est. 19h ago
  • Senior Azure Data Engineer

    Oakridge Staffing

    Requirements engineer job in Stamford, CT

    Great opportunity with a private equity firm located in Stamford, CT. The Azure Data Engineer in this role will partner closely with investment and operations teams to build scalable data pipelines and modern analytics solutions across the firm and its portfolio.
    The Senior Azure Data Engineer's main responsibilities will be:
    • Designing and implementing machine learning solutions as part of high-volume data ingestion and transformation pipelines
    • Designing solutions for large data warehouses and databases (Azure, Databricks, and/or Snowflake)
    • Gathering requirements from business stakeholders
    • Data architecture, data governance, data modeling, data transformation (from converting data, to data cleansing, to building data structures), data lineage, data integration, and master data management
    Technical Skills:
    • Architecting and delivering solutions using the Azure data analytics platform, including Azure Databricks and Azure SQL Data Warehouse
    • Utilizing Databricks for processing and transforming massive quantities of data and exploring the data through machine learning models
    • Designing and building solutions powered by DBT models and integrating them with Databricks
    • Utilizing Snowflake for data application development and for secure sharing and consumption of real-time and/or shared data
    • Expertise in data manipulation and analysis using Python
    • SQL for data migration and analysis
    Pluses: Past work experience in financial markets (Asset Management, Multi-strategy, Private Equity, Structured Products, Fixed Income, Trading, Portfolio Management, etc.).
    E-mail: DIANA@oakridgestaffing.com
    Please feel free to connect with me on LinkedIn: www.linkedin.com/in/dianagjuraj
    $84k-114k yearly est. 3d ago
  • Data Engineer

    The Judge Group 4.7 company rating

    Requirements engineer job in Jersey City, NJ

    ONLY LOCALS TO NJ/NY - NO RELOCATION CANDIDATES
    Skillset: Data Engineer
    Must Haves: Python, PySpark, AWS - ECS, Glue, Lambda, S3
    Nice to Haves: Java, Spark, React JS
    Interview Process: 2 rounds; the 2nd will be on site
    You're ready to gain the skills and experience needed to grow within your role and advance your career - and we have the perfect software engineering opportunity for you. As a Data Engineer III - Python / Spark / Data Lake at JPMorgan Chase within the Consumer and Community Bank, you will be a seasoned member of an agile team, tasked with designing and delivering reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. Your responsibilities will include developing, testing, and maintaining essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's business objectives.
    Job responsibilities:
    • Supports review of controls to ensure sufficient protection of enterprise data.
    • Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request.
    • Updates logical or physical data models based on new use cases.
    • Frequently uses SQL and understands NoSQL databases and their niche in the marketplace.
    • Adds to team culture of diversity, opportunity, inclusion, and respect.
    • Develops enterprise data models; designs, develops, and maintains large-scale data processing pipelines and infrastructure; leads code reviews and provides mentoring through the process; drives data quality; ensures data accessibility to analysts and data scientists; ensures compliance with data governance requirements; and ensures data engineering practices align with business goals.
    Required qualifications, capabilities, and skills:
    • Formal training or certification on data engineering concepts and 2+ years of applied experience
    • Experience across the data lifecycle, advanced experience with SQL (e.g., joins and aggregations), and a working understanding of NoSQL databases
    • Experience with statistical data analysis and the ability to determine appropriate tools and data patterns to perform analysis
    • Extensive experience in AWS and in the design, implementation, and maintenance of data pipelines using Python and PySpark
    • Proficient in Python and PySpark; able to write and execute complex queries to perform curation and build views required by end users (single and multi-dimensional)
    • Proven experience in performance tuning to ensure jobs run at optimal levels with no performance bottlenecks
    • Advanced proficiency in leveraging Gen AI models from Anthropic (or OpenAI, or Google) using APIs/SDKs
    • Advanced proficiency in a cloud data lakehouse platform such as AWS data lake services, Databricks, or Hadoop; a relational data store such as Postgres, Oracle, or similar; and at least one NoSQL data store such as Cassandra, Dynamo, MongoDB, or similar
    • Advanced proficiency in a cloud data warehouse such as Snowflake or AWS Redshift
    • Advanced proficiency in at least one scheduling/orchestration tool such as Airflow, AWS Step Functions, or similar
    • Proficiency in Unix scripting; data structures; data serialization formats such as JSON, AVRO, Protobuf, or similar; big-data storage formats such as Parquet, Iceberg, or similar; data processing methodologies such as batch, micro-batching, or streaming; one or more data modeling techniques such as Dimensional, Data Vault, Kimball, Inmon, etc.; Agile methodology; TDD or BDD; and CI/CD tools
    Preferred qualifications, capabilities, and skills:
    • Knowledge of data governance and security best practices
    • Experience in carrying out data analysis to support business insights
    • Strong Python and Spark
    $79k-111k yearly est. 1d ago
  • Data Engineer

    NeenOpal Inc.

    Requirements engineer job in Newark, NJ

    NeenOpal is a global consulting firm specializing in Data Science and Business Intelligence, with offices in Bengaluru, Newark, and Fredericton. We provide end-to-end solutions tailored to the unique needs of businesses, from startups to large organizations, across domains like digital strategy, sales and marketing, supply chain, and finance. Our mission is to help organizations achieve operational excellence and transform into data-driven enterprises.
    Role Description: This is a full-time, hybrid Data Engineer role located in Newark, NJ. The Data Engineer will be responsible for designing, implementing, and managing data engineering solutions to support business needs. Day-to-day tasks include building and optimizing data pipelines, developing and maintaining data models and ETL processes, managing data warehousing solutions, and contributing to the organization's data analytics initiatives. Collaboration with cross-functional teams to ensure robust data infrastructure will be a key aspect of this role.
    Key Responsibilities:
    • Data Pipeline Development: Design, implement, and manage robust data pipelines to ensure efficient data flow into data warehouses. Automate ETL processes using Python and advanced data engineering tools.
    • Data Integration: Integrate and transform data using industry-standard tools. Experience required with AWS services (AWS Glue, Data Pipeline, Redshift, and S3) and Azure services (Azure Data Factory, Synapse Analytics, and Blob Storage).
    • Data Warehousing: Implement and optimize solutions using Snowflake and Amazon Redshift.
    • Database Management: Develop and manage relational databases (SQL Server, MySQL, PostgreSQL) to ensure data integrity.
    • Performance Optimization: Continuously monitor and improve data processing workflows and apply best practices for query optimization.
    • Global Collaboration: Work closely with cross-functional teams in the US, India, and Canada to deliver high-quality solutions.
    • Governance & Support: Document ETL processes and data mappings in line with governance standards. Diagnose and resolve data-related issues promptly.
    Required Skills and Experience:
    • Experience: Minimum 2+ years of experience designing and developing ETL processes (AWS Glue, Azure Data Factory, or similar).
    • Integration: Experience integrating data via RESTful / GraphQL APIs.
    • Programming: Proficient in Python for ETL automation and SQL for database management.
    • Cloud Platforms: Strong experience with AWS or Azure data services (GCP familiarity is a plus).
    • Data Warehousing: Expertise with Snowflake, Amazon Redshift, or Azure Synapse Analytics.
    • Communication: Excellent articulation skills to explain technical work directly to clients and stakeholders.
    • Authorization: Must have valid work authorization in the United States.
    Salary Range: $65,000 - $80,000 per year
    Benefits: This role includes health insurance, paid time off, and opportunities for professional growth and continuous learning within a fast-growing global analytics company.
    Equal Opportunity Employer: NeenOpal Inc. is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
    $65k-80k yearly 2d ago
  • Software Engineer, Banking Operations

    BIP

    Requirements engineer job in Jersey City, NJ

    Business Integration Partners (BIP) is Europe's fastest growing digital consulting company and is on track to reach the Top 20 by 2030, with an expanding global footprint in the US. Operating at the intersection of business and technology, we design, develop, and deliver sustainable solutions at pace and scale, creating greater value for our customers, employees, shareholders, and society. BIP specializes in high-impact consulting services across multiple industries, with 6,000 employees worldwide. Our Financial Services business serves the Capital Markets, Insurance, and Payments verticals, supplemented with Data & AI, Cybersecurity, Risk & Compliance, Change Management, and Digital Transformation practices. We integrate deep industry expertise with business, technology, and quantitative disciplines to deliver high-impact results for our clients. BIP is currently expanding its footprint in the United States, focusing on growing its Capital Markets and Financial Services lines. Our teams operate at the intersection of business strategy, technology, and data to help our clients drive smarter decisions, reduce risks, and stay ahead in a fast-evolving market environment.
    About the Role: The Software Engineer will contribute to the design, development, and enhancement of core payments and wire processing applications within our corporate and investment banking client's technology organization. Engineers will work across distributed systems, real-time transaction pipelines, settlement engines, and compliance/monitoring platforms supporting high-volume, low-latency financial operations. You must have valid US work authorization and must physically reside around the posted city, within a 50-mile commute. We are unable to support relocation costs. Please do not apply for this position unless you meet the criteria outlined above.
    Key Responsibilities:
    • Develop and maintain services supporting high-volume payments, wire transfers, and money movement workflows.
    • Build scalable applications using Python, Java, or .NET across distributed environments.
    • Implement integrations with internal banking platforms, payment rails, and ledger systems.
    • Troubleshoot production issues, improve resiliency, and reduce latency across transaction flows.
    • Contribute to modernization efforts, including cloud migration, refactoring legacy components, and API enablement.
    • Collaborate closely with BAs, architects, PMs, and offshore/nearshore teams.
    • Follow secure coding standards, operational controls, and SDLC processes required by the bank.
    Required Skills:
    • 3-10+ years of hands-on experience in Python, Java, or C#/.NET.
    • Experience with relational databases (Oracle, SQL Server).
    • Understanding of payments, wire transfers, clearing systems, or financial services workflows.
    • Familiarity with distributed systems, messaging, and event-driven architectures.
    • Strong debugging and production support experience.
    • Understanding of CI/CD and Agile environments.
    Preferred Skills:
    • Hadoop / Informatica ecosystem knowledge.
    • Experience with ISO 20022, SWIFT, Fedwire, CHIPS.
    • Microservices architecture, REST/gRPC APIs.
    • Performance tuning and low-latency engineering.
    **The base salary range for this role is $125,000 - $175,000**
    Benefits: Choice of medical, dental, and vision insurance. Voluntary benefits. Short- and long-term disability. HSA and FSAs. Matching 401(k). Discretionary performance bonus. Employee referral bonus. Employee assistance program. 11 public holidays. 20 days PTO. 7 sick days. PTO buy and sell program. Volunteer days. Paid parental leave. Remote/hybrid work environment support.
    For more information about BIP US, visit *********************************
    Equal Employment Opportunity: It is BIP US Consulting policy to provide equal employment opportunities to all individuals based on job-related qualifications and ability to perform a job, without regard to age, gender, gender identity, sexual orientation, race, color, religion, creed, national origin, disability, genetic information, veteran status, citizenship, or marital status, and to maintain a non-discriminatory environment free from intimidation, harassment or bias based upon these grounds. BIP US provides a reasonable range of compensation for our roles. Actual compensation is influenced by a wide array of factors including but not limited to skill set, education, level of experience, and knowledge.
    $125k-175k yearly 1d ago
  • Java Software Engineer (Trading) - AGADC5642050

    Compunnel Inc. 4.4 company rating

    Requirements engineer job in Jersey City, NJ

    Must Haves:
    1. Low-latency Java development experience (trading experience preferred but not mandatory). The following are more from a screening standpoint; candidates with low-latency Java development experience should have:
    2. Garbage collection, threading and/or multithreading, and memory management experience
    3. FIX Protocol
    4. Optimization techniques or profiling techniques
    Nice to Haves: Order Management System, Smart Order Router, market data experience
    $72k-93k yearly est. 3d ago
  • Full Stack Hedge Fund Software Engineer (Java/Angular/React)

    Focus Capital Markets

    Requirements engineer job in Stamford, CT

    Focus Capital Markets is supporting its Connecticut-based hedge fund by providing a unique opportunity for a talented senior software engineer to work across verticals within the organization. In this role, you will assist the business by developing front-end and back-end applications, building and scaling APIs, and working with the business to define technical solutions for business requirements. You will work with both sides of the stack, with a Core Java back-end (and C#/.NET) and the latest versions of Angular on the front-end. Although the organization is outside of the NYC area, it is just as lucrative and would afford someone career stability, longevity, and growth opportunity within the organization. The parent company is best of breed in the hedge fund industry and there are opportunities to grow from within. You will work onsite Monday-Thursday.
    Requirements:
    • 5+ years of software engineering experience leveraging Angular on the front-end and Core Java or C# on the back-end. Experience with React is relevant.
    • Experience with SQL is preferred.
    • Experience with REST APIs.
    • Bachelor's degree or higher in computer science, mathematics, or a related field.
    • Excellent communication skills.
    $70k-93k yearly est. 1d ago

Learn more about requirements engineer jobs

How much does a requirements engineer earn in Ramapo, NY?

The average requirements engineer in Ramapo, NY earns between $65,000 and $115,000 annually. This compares to the national requirements engineer salary range of $62,000 to $120,000.

Average requirements engineer salary in Ramapo, NY

$86,000

What are the biggest employers of Requirements Engineers in Ramapo, NY?

The biggest employers of Requirements Engineers in Ramapo, NY are:
  1. BD (Becton, Dickinson and Company)
  2. Weston & Sampson
  3. Watts Water Technologies
  4. KCI Technologies
  5. NLB Services
  6. Constantin Control Associates
  7. HCL Technologies
  8. BD Systems Inc
  9. Career Mentors, LLC