Neo4j Engineer
Location: Summit, NJ
Must Have Technical/Functional Skills
Neo4j, Graph Data Science, Cypher, Python, Graph Algorithms, Bloom, GraphXR, Cloud, Kubernetes, ETL
Roles & Responsibilities
Design and implement graph-based data models using Neo4j.
Develop Cypher queries and procedures for efficient graph traversal and analysis.
Apply Graph Data Science algorithms for community detection, centrality, and similarity.
Integrate Neo4j with enterprise data platforms and APIs.
Collaborate with data scientists and engineers to build graph-powered applications.
Optimize performance and scalability of graph queries and pipelines.
Support deployment and monitoring of Neo4j clusters in cloud or on-prem environments.
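As a toy illustration of the graph algorithms this role applies at scale through Neo4j's Graph Data Science library, the sketch below computes normalized degree centrality over a hypothetical edge list in plain Python; in practice the equivalent would run as a GDS procedure inside Neo4j.

```python
from collections import defaultdict

# Hypothetical undirected edge list standing in for a Neo4j graph projection.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

def degree_centrality(edges):
    """Normalized degree centrality for an undirected edge list."""
    degree = defaultdict(int)
    nodes = set()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
        nodes.update((u, v))
    n = len(nodes)
    # Normalize by the maximum possible degree, n - 1.
    return {node: degree[node] / (n - 1) for node in nodes}

centrality = degree_centrality(edges)
print(max(centrality, key=centrality.get))  # "C" is the most central node
```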
Salary Range: $110,000 to $140,000 per year
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
MSD 365 Engineer
Location: Weehawken, NJ
MSD 365 Engineer (Only W2)
Contract
We are currently seeking an experienced Senior Microsoft Dynamics 365 Professional to join our team. The candidate will join a team of highly dedicated professionals who thrive on new challenges daily, at a company that demonstrates genuine care for its employees and has a track record of sound business decisions.
Responsibilities:
• Develop and customize Microsoft Dynamics 365 applications using C#/.NET and Power Platform tools.
• Build integrations between Dynamics 365 and Azure services (Logic Apps, Functions) as part of a modern cloud architecture.
• Support the migration from Salesforce to Dynamics 365 - helping to unify global customer data and business processes.
• Work closely with senior developers and solution architects to design clean, scalable solutions aligned with best practices.
• Participate in code reviews, testing, and CI/CD pipeline activities to ensure high-quality deliverables.
• Troubleshoot and optimize Dynamics plugins and SQL queries to improve system performance.
Thanks & Kind Regards,
Avinash Pathak
Delta System & Software, Inc.
Email Id: ***************************
AI / ML Engineer
Location: Warren, NJ
Title: AI Engineer or MCP Developer
Duration: Long Term Contract
Kindly share your resume to ****************
Description: An MCP Developer in commercial P&C insurance is typically an IT role focused on developing systems and integrations that use the Model Context Protocol (MCP) to bring Artificial Intelligence (AI) and Large Language Models (LLMs) into insurance operations. The role involves building the infrastructure that allows AI agents to securely and reliably access and act on internal P&C data sources (e.g., policy systems, claims databases, underwriting documents), enhancing automation and decision-making in core insurance functions such as underwriting and claims processing.
Responsibilities:
AI Integration: Develop and implement robust integrations between AI models (LLMs) and internal data repositories and business tools using the Model Context Protocol (MCP).
System Development: Build and maintain MCP servers and clients to expose necessary data and capabilities to AI agents.
Workflow Automation: Design and implement agentic workflows that allow AI systems to perform complex, multi-step tasks, such as accessing real-time policy data, processing claims information, and updating customer records.
Security & Compliance: Implement secure coding practices and ensure all AI interactions and data exchanges via MCP adhere to insurance industry regulations and internal compliance standards (e.g., data privacy, secure data handling).
API Management: Work with existing APIs (REST/SOAP) and develop new ones to facilitate data flow to and from the MCP environment.
Collaboration: Partner with actuaries, underwriters, claims specialists, and IT teams to identify AI opportunities and ensure seamless solution deployment.
Testing & Quality Assurance: Perform testing to ensure AI-driven outputs are accurate and reliable, and maintain high performance levels.
Documentation: Document all development processes, system architectures, and operational procedures for MCP integrations.
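The core pattern behind the responsibilities above is exposing named capabilities that an AI agent can discover and invoke. The sketch below is a toy registry in plain Python, not the official MCP SDK; the tool name and policy record are hypothetical stand-ins for real policy-system calls.

```python
# Toy tool registry illustrating the server-side pattern MCP formalizes:
# capabilities are registered under names an agent can discover and call.
TOOLS = {}

def tool(name):
    """Register a callable as an agent-invokable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("get_policy")
def get_policy(policy_id: str) -> dict:
    # A real MCP server would query the policy admin system here.
    return {"policy_id": policy_id, "line": "commercial-property", "status": "active"}

def invoke(name, **kwargs):
    """Dispatch an agent's tool call to the registered handler."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)

print(invoke("get_policy", policy_id="P-1001")["status"])  # prints "active"
```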
Experience: 3+ years of experience in software development or AI integration, preferably within the insurance or financial services industry.
P&C Knowledge: Strong knowledge of Commercial P&C insurance products, underwriting processes, and claims systems is highly preferred.
Technical Expertise:
Proficiency in programming languages like Python, Java, or similar.
Experience with API development and management.
Familiarity with cloud platforms (AWS, Azure, GCP) and containerization tools (Docker, Kubernetes).
Understanding of the Model Context Protocol (MCP) specification and SDKs.
Data Analytics Engineer
Location: Somerset, NJ
Client: manufacturing company
Type: direct hire
Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets.
This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams.
This role is on-site five days per week in Somerset, NJ.
Key Responsibilities
Power BI Reporting & Administration
Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
Develop and maintain data models to ensure accuracy, consistency, and reliability
Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
Optimize Power BI solutions for performance, scalability, and ease of use
ETL & Data Warehousing
Design and maintain data warehouse structures, including schema and database layouts
Develop and support ETL processes to ensure timely and accurate data ingestion
Integrate data from multiple systems while ensuring quality, consistency, and completeness
Work closely with database administrators to optimize data warehouse performance
Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed
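The ETL duties above reduce to a repeatable extract-transform-load step. The sketch below is a minimal, hedged illustration using the standard-library sqlite3 module; the table and column names are hypothetical, and a production pipeline here would use tools such as SSIS or Azure Synapse.

```python
import sqlite3

# Stage raw data and a warehouse-style target table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL);
    CREATE TABLE fact_sales (region TEXT PRIMARY KEY, total_amount REAL);
    INSERT INTO raw_orders VALUES (1, 'east', 100.0), (2, 'east', 50.0),
                                  (3, 'west', 75.0), (4, NULL, 20.0);
""")

# Transform: drop rows failing a quality check (NULL region), then aggregate
# by region and load the result into the fact table.
conn.execute("""
    INSERT INTO fact_sales (region, total_amount)
    SELECT region, SUM(amount)
    FROM raw_orders
    WHERE region IS NOT NULL
    GROUP BY region
""")

totals = dict(conn.execute(
    "SELECT region, total_amount FROM fact_sales ORDER BY region"))
print(totals)  # {'east': 150.0, 'west': 75.0}
```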
Training & Documentation
Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
Manage data definitions, lineage documentation, and data cataloging for all enterprise data models
Project Management
Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
Collaborate with key business stakeholders to ensure departmental reporting needs are met
Record meeting notes in Confluence and document project updates in Jira
Data Governance
Implement and enforce data governance policies to ensure data quality, compliance, and security
Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness
Routine IT Functions
Resolve Help Desk tickets related to reporting, dashboards, and BI tools
Support general software and hardware installations when needed
Other Responsibilities
Manage email and phone communication professionally and promptly
Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
Perform additional assigned duties as needed
Qualifications
Required
Minimum of 3 years of relevant experience
Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
Experience with cloud-based BI environments (Azure, AWS, etc.)
Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
Proficiency in SQL for data extraction, manipulation, and transformation
Strong knowledge of DAX
Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
Strong analytical, problem-solving, and documentation skills
Excellent written and verbal communication abilities
High attention to detail and strong self-review practices
Effective time management and organizational skills; ability to prioritize workload
Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
Azure Data Engineer
Location: Weehawken, NJ
· Expert-level skills writing and optimizing complex SQL
· Experience with complex data modeling, ETL design, and using large databases in a business environment
· Experience with building data pipelines and applications to stream and process datasets at low latencies
· Fluent with Big Data technologies like Spark, Kafka and Hive
· Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
· Designing and building data pipelines using API ingestion and streaming ingestion methods
· Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential
· Experience in developing NoSQL solutions using Azure Cosmos DB is essential
· Thorough understanding of Azure and AWS Cloud Infrastructure offerings
· Working knowledge of Python is desirable
· Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
· Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
· Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
· Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
· Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
· Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making.
· Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
· Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
Best Regards,
Dipendra Gupta
Technical Recruiter
*****************************
Data Engineer
Location: Hamilton, NJ
Key Responsibilities:
Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
Manage FTP/SFTP file transfers between internal systems and external vendors.
Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
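Several of the responsibilities above, such as managing SFTP transfers and resolving batch failures, come down to handling transient faults gracefully. The sketch below is a hedged illustration of a retry wrapper with backoff; the transfer function is a hypothetical stand-in, since a real pipeline would call an SFTP client or managed file transfer tool and alert on final failure.

```python
import time

def with_retries(fn, attempts=3, delay=0.0):
    """Call fn, retrying transient I/O failures with linear backoff."""
    last_exc = None
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except OSError as exc:           # retry only transient I/O errors
            last_exc = exc
            time.sleep(delay * attempt)  # linear backoff between attempts
    raise last_exc

# Hypothetical transfer that fails twice before succeeding.
calls = {"n": 0}

def flaky_transfer():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("connection reset")
    return "transferred"

print(with_retries(flaky_transfer))  # succeeds on the third attempt
```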
Required Skills & Experience:
10+ years of experience in data engineering or production support within financial services or trading environments.
Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
Strong Python and SQL programming skills.
Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
Experience with Git, CI/CD pipelines, and Azure DevOps.
Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
Excellent communication, problem-solving, and stakeholder management skills.
Data Engineer / Manager (Only W2)
Location: Edison, NJ
12-15 years of overall experience, combining solution architecture and hands-on delivery.
Strong client-facing skills, with the ability to lead technical conversations, manage expectations, and handle escalations calmly.
Hands-on technologist capable of designing, reviewing, and, when required, building and debugging components, rather than being limited to high-level guidance.
Proficient in Data Solutions, with practical, hands-on experience in Databricks including Lakehouse architecture, ETL pipelines, analytics, and AI workloads.
Strong hands-on experience with PySpark and Python for large-scale data processing and transformation.
Experience designing and implementing microservices-based architectures.
Senior Big Data Engineer
Location: New Jersey
Greetings!
We are looking for a Big Data Engineer to design, build, and maintain scalable data solutions. This role focuses on developing reliable data pipelines and platforms that support analytics, reporting, and data-driven decision making. The ideal candidate has strong hands-on experience with Python and SQL and is comfortable working with large, complex datasets.
Position: Sr. Big Data Engineer
Location: Whippany NJ (Hybrid)
Contract: Long term contract
Client: One of the largest financial institutions.
Responsibilities
Design, develop, and maintain large-scale data pipelines and data platforms
Build efficient ETL and ELT processes using Python and SQL
Optimize data models, queries, and workflows for performance and reliability
Work with structured and unstructured data from multiple sources
Collaborate with data scientists, analysts, and software engineers to support analytics and machine learning use cases
Ensure data quality, consistency, and availability across systems
Monitor and troubleshoot data pipelines in production environments
Document data processes, models, and best practices
Required Qualifications
Strong experience in Python for data processing and pipeline development
Advanced SQL skills, including query optimization and complex data transformations
Experience working with big data technologies such as Spark, Hadoop, or similar frameworks
Solid understanding of data modeling, warehousing, and lakehouse concepts
Experience with cloud data platforms (AWS, Azure, or Google Cloud)
Familiarity with version control systems such as Git
Preferred Qualifications
Experience with workflow orchestration tools such as Airflow or similar
Knowledge of streaming technologies such as Kafka or equivalent
Experience with containerization and deployment tools (Docker, Kubernetes)
Exposure to data governance, security, and compliance best practices
Best Regards,
Sr. Azure Data Engineer with Databricks Expertise
Location: Iselin, NJ
Role: Sr. Azure Data Engineer with Databricks Expertise
Experience: 12+ years
We are seeking a highly skilled Azure Data Engineer with strong expertise in SQL, Python, data warehousing, and cloud ETL tools to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance. This role involves working closely with multiple teams and business stakeholders to deliver cutting-edge data solutions.
Key Responsibilities:
1. Data Pipeline Development:
• Build and maintain scalable ETL/ELT pipelines using Databricks.
• Leverage PySpark/Spark and SQL to transform and process large datasets.
• Integrate data from multiple sources including Azure Blob Storage, ADLS and other relational/non-relational systems.
2. Collaboration & Analysis:
• Work closely with multiple teams to prepare data for dashboards and BI tools.
• Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.
3. Performance & Optimization:
• Optimize Databricks workloads for cost efficiency and performance.
• Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
4. Governance & Security:
• Implement and manage data security, access controls and governance standards using Unity Catalog.
• Ensure compliance with organizational and regulatory data policies.
5. Deployment:
• Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks and configurations across environments.
• Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.
Technical Skills:
• Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables, Databricks Runtime, etc.)
• Proficiency in Azure Cloud Services.
• Solid Understanding of Spark and PySpark for big data processing.
• Experience in relational databases.
• Knowledge of Databricks Asset Bundles and GitLab.
Preferred Experience:
• Familiarity with Databricks Runtimes and advanced configurations.
• Knowledge of streaming frameworks like Spark Streaming.
• Experience in developing real-time data solutions.
Certifications:
• Azure Data Engineer Associate or Databricks Certified Data Engineer Associate certification (optional).
Azure Data Engineer
Location: Jersey City, NJ
Title: Senior Azure Data Engineer
Client: Major Japanese Bank
Experience Level: Senior (10+ Years)
The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.
Key Responsibilities:
Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
Ensure data security, compliance, lineage, and governance controls.
Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
Troubleshoot performance issues and improve system efficiency.
Required Skills:
10+ years of data engineering experience.
Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
Azure certifications strongly preferred.
Strong SQL, Python, and cloud data architecture skills.
Experience in financial services or large enterprise environments preferred.
Lead DevOps Engineer (Jenkins)
Location: Jersey City, NJ
Employment Type: Full-Time, Direct Hire
This is not a contract role and is not available for third-party agencies or contractors.
About the Role
We are seeking a hands-on Lead DevOps Engineer to drive the development of a next-generation enterprise pipeline infrastructure. This role is ideal for a technical leader with deep experience building scalable Jenkins environments, defining CI/CD strategy, and promoting DevOps best practices across large organizations. If you thrive in fast-paced, highly collaborative environments and enjoy solving complex engineering challenges, this is an excellent opportunity.
What You'll Do
Lead the design and implementation of a unified enterprise pipeline framework using Jenkins, Octopus Deploy, and related CI/CD tools.
Build, optimize, and maintain a highly scalable Jenkins platform supporting multiple concurrent teams and workloads.
Evaluate emerging CI/CD technologies and lead enterprise-wide adoption initiatives.
Manage and mentor a team of developers and DevOps engineers; foster a culture of operational excellence.
Collaborate with cross-functional stakeholders to gather requirements, align strategy, and advance DevOps maturity.
Enforce Infrastructure-as-Code practices with proper governance, compliance, and audit controls.
Implement monitoring, alerting, and automation to ensure strong operational performance.
Lead incident response efforts; drive root-cause analysis and long-term remediation.
Identify bottlenecks and drive end-to-end automation to improve deployment speed and reliability.
What You Bring
Strong expertise with Jenkins, Octopus Deploy, and modern CI/CD ecosystems.
Hands-on experience with AWS or Azure, Docker, Kubernetes, Terraform, and IaC principles.
Strong programming skills (Python, Node.js) and solid Git fundamentals.
3+ years of experience leading technical teams and delivering complex solutions.
Experience with Software-Defined Networks, VPCs, cloud networking, and infrastructure automation.
Familiarity with DevOps methodologies and ITIL best practices.
Proactive, collaborative, and driven by innovation.
Azure DevOps Engineer
Location: Jersey City, NJ
About US:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
Job Title: Azure DevOps Engineer
Work Location
Jersey City, NJ
Job Description:
1. Extensive hands-on experience with GitHub Actions, writing workflows in YAML using reusable templates
2. Extensive hands-on experience with application CI/CD pipelines, both for Azure and on-prem, across different frameworks
3. Hands-on experience with Azure DevOps and CI/CD pipeline migration programs, preferably from Azure DevOps to GitHub Actions
4. Proficiency in integrating and consuming REST APIs to achieve automation through scripting
5. Hands-on experience with at least one scripting language, including out-of-the-box automations for platforms like PeopleSoft, SharePoint, MDM, etc.
6. Hands-on experience with CI/CD of databases
7. Good to have: experience with infrastructure-as-code, including ARM templates, Terraform, Azure CLI, and Azure PowerShell modules
8. Exposure to monitoring tools like ELK, Prometheus, and Grafana
Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation like an annual performance-based bonus, sales incentive pay and other forms of bonus or variable compensation.
Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Senior Data Engineer (Snowflake)
Location: Parsippany-Troy Hills, NJ
Senior Data Engineer (Snowflake & Python)
1-Year Contract | $60/hour + Benefit Options
Hybrid: On-site a few days per month (local candidates only)
Work Authorization Requirement
You must be authorized to work for any employer as a W2 employee. This is required for this role.
This position is W-2 only - no C2C, no third-party submissions, and no sponsorship will be considered.
Overview
We are seeking a Senior Data Engineer to support enterprise-scale data initiatives for a highly collaborative engineering organization. This is a new, long-term contract opportunity for a hands-on data professional who thrives in fast-paced environments and enjoys building high-quality, scalable data solutions on Snowflake.
Candidates must be based in or around New Jersey, able to work on-site at least 3 days per month, and meet the W2 employment requirement.
What You'll Do
Design, develop, and support enterprise-level data solutions with a strong focus on Snowflake
Participate across the full software development lifecycle - planning, requirements, development, testing, and QA
Partner closely with engineering and data teams to identify and implement optimal technical solutions
Build and maintain high-performance, scalable data pipelines and data warehouse architectures
Ensure platform performance, reliability, and uptime, maintaining strong coding and design standards
Troubleshoot production issues, identify root causes, implement fixes, and document preventive solutions
Manage deliverables and priorities effectively in a fast-moving environment
Contribute to data governance practices including metadata management and data lineage
Support analytics and reporting use cases leveraging advanced SQL and analytical functions
Required Skills & Experience
8+ years of experience designing and developing data solutions in an enterprise environment
5+ years of hands-on Snowflake experience
Strong hands-on development skills with SQL and Python
Proven experience designing and developing data warehouses in Snowflake
Ability to diagnose, optimize, and tune SQL queries
Experience with Azure data frameworks (e.g., Azure Data Factory)
Strong experience with orchestration tools such as Airflow, Informatica, Automic, or similar
Solid understanding of metadata management and data lineage
Hands-on experience with SQL analytical functions
Working knowledge of shell scripting and JavaScript
Experience using Git, Confluence, and Jira
Strong problem-solving and troubleshooting skills
Collaborative mindset with excellent communication skills
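As a concrete illustration of the SQL analytical functions called out above, the hedged sketch below uses the standard-library sqlite3 module as a stand-in for Snowflake; the table and column names are hypothetical. It uses `ROW_NUMBER()` over a partition, a common pattern for picking the latest record per key.

```python
import sqlite3

# Hypothetical load-history table: multiple loads per batch over time.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loads (batch_id TEXT, load_ts INTEGER, row_count INTEGER);
    INSERT INTO loads VALUES ('daily', 1, 100), ('daily', 2, 120),
                             ('weekly', 1, 500);
""")

# Keep only the most recent load per batch using a window function.
latest = conn.execute("""
    SELECT batch_id, row_count FROM (
        SELECT batch_id, row_count,
               ROW_NUMBER() OVER (PARTITION BY batch_id
                                  ORDER BY load_ts DESC) AS rn
        FROM loads
    ) WHERE rn = 1
    ORDER BY batch_id
""").fetchall()
print(latest)  # [('daily', 120), ('weekly', 500)]
```

The same `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` idiom carries over directly to Snowflake SQL.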
Nice to Have
Experience supporting Pharma industry data
Exposure to Omni-channel data environments
Why This Opportunity
$60/hour W2 on a long-term 1-year contract
Benefit options available
Hybrid structure with limited on-site requirement
High-impact role supporting enterprise data initiatives
Clear expectations: W-2 only, no third-party submissions, no Corp-to-Corp
This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
Senior Data Engineer
Location: New Providence, NJ
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies - in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences - to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents.
Job Description
Experienced Data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
Work in tandem with our engineering team to identify and implement the most optimal solutions
Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
Able to manage deliverables in fast-paced environments
Areas of Expertise
At least 10 years of experience designing and developing data solutions in an enterprise environment
5+ years' experience on the Snowflake platform
Strong hands-on SQL and Python development
Experience with designing and developing data warehouses in Snowflake
A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic
Good understanding of metadata and data lineage
Hands-on knowledge of SQL analytical functions
Strong knowledge and hands-on experience in shell scripting and JavaScript
Able to demonstrate experience with software engineering practices including CI/CD, automated testing, and performance engineering
Good understanding of and exposure to Git, Confluence, and Jira
Good problem-solving and troubleshooting skills
Team player with a collaborative approach and excellent communication skills
Our Commitment to Diversity & Inclusion:
Did you know that Apexon has been Certified™ by Great Place To Work, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK. Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We are taking affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com)
Distinguished Data Engineer- Bank Tech
Location: Cherry Hill, NJ
Distinguished Data Engineers are individual contributors who strive to be diverse in thought so we visualize the problem space. At Capital One, we believe diversity of thought strengthens our ability to influence, collaborate and provide the most innovative solutions across organizational boundaries. Distinguished Engineers will significantly impact our trajectory and devise clear roadmaps to deliver next generation technology solutions.
Join the Horizontal Bank Data organization to accelerate data modernization across the bank by defining, building, and operating unified, resilient, and compliant enterprise data platforms, enabling bank domains to produce and leverage modern data for a modern bank. The position focuses on setting the technical vision, prototyping, and driving the most complex data architecture for the banking domains. In addition, the role partners closely with enterprise teams to develop highly resilient data platforms.
Deep technical experts and thought leaders who help accelerate adoption of the very best engineering practices, while maintaining knowledge of industry innovations, trends, and practices
Visionaries, collaborating on Capital One's toughest issues, to deliver on business needs that directly impact the lives of our customers and associates
Role models and mentors, helping to coach and strengthen the technical expertise and know-how of our engineering and product community
Evangelists, both internally and externally, helping to elevate the Distinguished Engineering community and establish themselves as a go-to resource on given technologies and technology-enabled capabilities
Responsibilities:
Build awareness, increase knowledge and drive adoption of modern technologies, sharing consumer and engineering benefits to gain buy-in
Strike the right balance between lending expertise and providing an inclusive environment where others' ideas can be heard and championed; leverage expertise to grow skills in the broader Capital One team
Promote a culture of engineering excellence, using opportunities to reuse and innersource solutions where possible
Effectively communicate with and influence key stakeholders across the enterprise, at all levels of the organization
Operate as a trusted advisor for a specific technology, platform, or capability domain, helping to shape use cases and implementation in a unified manner
Lead the way in creating next-generation talent for Tech, mentoring internal talent and actively recruiting external talent to bolster Capital One's Tech talent
Basic Qualifications:
Bachelor's Degree
At least 7 years of experience in data engineering
At least 3 years of experience in data architecture
At least 2 years of experience building applications in AWS
Preferred Qualifications:
Master's Degree
9+ years of experience in data engineering
3+ years of data modeling experience
2+ years of experience with ontology standards for defining a domain
2+ years of experience using Python, SQL or Scala
1+ year of experience deploying machine learning models
3+ years of experience implementing big data processing solutions on AWS
Capital One will consider sponsoring a new qualified applicant for employment authorization for this position.
The minimum and maximum full-time annual salaries for this role are listed below, by location. Please note that this salary information is solely for candidates hired to perform work within one of these locations, and refers to the amount Capital One is willing to pay at the time of this posting. Salaries for part-time roles will be prorated based upon the agreed upon number of hours to be regularly worked.
McLean, VA: $263,900 - $301,200 for Distinguished Data Engineer
Philadelphia, PA: $239,900 - $273,800 for Distinguished Data Engineer
Richmond, VA: $239,900 - $273,800 for Distinguished Data Engineer
Wilmington, DE: $239,900 - $273,800 for Distinguished Data Engineer
Candidates hired to work in other locations will be subject to the pay range associated with that location, and the actual annualized salary amount offered to any candidate at the time of hire will be reflected solely in the candidate's offer letter.
This role is also eligible to earn performance-based incentive compensation, which may include cash bonus(es) and/or long-term incentives (LTI). Incentives could be discretionary or non-discretionary depending on the plan.
Capital One offers a comprehensive, competitive, and inclusive set of health, financial and other benefits that support your total well-being. Learn more at the Capital One Careers website. Eligibility varies based on full or part-time status, exempt or non-exempt status, and management level.
This role is expected to accept applications for a minimum of 5 business days. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections ; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1- or via email at . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.
For technical support or questions about Capital One's recruiting process, please send an email to
Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site.
Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Java Software Engineer
Requirements engineer job in East Windsor, NJ
Job Responsibilities:
Develop applications using Java 8/JEE (and higher), Angular 2+, React.js, SQL, Spring, HTML5, CSS, JavaScript, and TypeScript, among other tools.
Write scalable, secure, and maintainable code that powers our clients' platforms.
Create, deploy, and maintain automated system tests.
Work with Testers to understand defects and resolve them in a timely manner.
Support continuous improvement by investigating alternatives and technologies, and presenting these for architectural review.
Collaborate effectively with other team members to accomplish shared user story and sprint goals.
Requirement:
Experience in programming languages: Java and JavaScript
Decent understanding of the software development life cycle
Basic programming skills using object-oriented programming (OOP) languages, with in-depth knowledge of common APIs and data structures such as Collections, Maps, Lists, and Sets
Knowledge of relational databases (e.g., SQL Server, Oracle) and basic SQL query language skills
Preferred Qualifications:
Master's Degree in Computer Science (CS)
0-1 year of practical experience in Java coding
Experience using Spring, Maven, Angular frameworks, HTML, and CSS
Knowledge of other contemporary Java technologies (e.g., WebLogic, RabbitMQ, Tomcat)
Familiarity with JSP, J2EE, and JDBC
Software Engineer, Banking Operations
Requirements engineer job in Jersey City, NJ
Business Integration Partners (BIP) is Europe's fastest-growing digital consulting company and is on track to reach the Top 20 by 2030, with an expanding global footprint in the US. Operating at the intersection of business and technology, we design, develop, and deliver sustainable solutions at pace and scale, creating greater value for our customers, employees, shareholders, and society.
BIP specializes in high-impact consulting services across multiple industries, with 6,000 employees worldwide. Our Financial Services business serves the Capital Markets, Insurance, and Payments verticals, supplemented by Data & AI, Cybersecurity, Risk & Compliance, Change Management, and Digital Transformation practices. We integrate deep industry expertise with business, technology, and quantitative disciplines to deliver high-impact results for our clients.
BIP is currently expanding its footprint in the United States, focusing on growing its Capital Markets and Financial Services lines. Our teams operate at the intersection of business strategy, technology, and data to help our clients drive smarter decisions, reduce risk, and stay ahead in a fast-evolving market environment.
About the Role:
The Software Engineer will contribute to the design, development, and enhancement of core payments and wire processing applications within our corporate and investment banking client's technology organization. Engineers will work across distributed systems, real-time transaction pipelines, settlement engines, and compliance/monitoring platforms supporting high-volume, low-latency financial operations.
You must have valid US work authorization and must physically reside around the posted city, within a 50-mile commute. We are unable to support relocation costs.
Please do not apply for this position unless you meet the criteria outlined above.
Key Responsibilities:
Develop and maintain services supporting high-volume payments, wire transfers, and money movement workflows.
Build scalable applications using Python, Java, or .NET across distributed environments.
Implement integrations with internal banking platforms, payment rails, and ledger systems.
Troubleshoot production issues, improve resiliency, and reduce latency across transaction flows.
Contribute to modernization efforts, including cloud migration, refactoring legacy components, and API enablement.
Collaborate closely with BAs, architects, PMs, and offshore/nearshore teams.
Follow secure coding standards, operational controls, and SDLC processes required by the bank.
Required Skills:
3-10+ years hands-on experience in Python, Java, or C#/.NET.
Experience with relational databases (Oracle, SQL Server).
Understanding of payments, wire transfers, clearing systems, or financial services workflows.
Familiarity with distributed systems, messaging, and event-driven architectures.
Strong debugging and production support experience.
Understanding of CI/CD and Agile environments.
Preferred Skills:
Hadoop / Informatica ecosystem knowledge.
Experience with ISO 20022, SWIFT, Fedwire, CHIPS.
Microservices architecture, REST/gRPC APIs.
Performance tuning and low-latency engineering.
**The base salary range for this role is $125,000 - $175,000**
Benefits:
Choice of medical, dental, vision insurance.
Voluntary benefits.
Short- and long-term disability.
HSA and FSAs.
Matching 401k.
Discretionary performance bonus.
Employee referral bonus.
Employee assistance program.
11 public holidays.
20 days PTO.
7 Sick Days.
PTO buy and sell program.
Volunteer days.
Paid parental leave.
Remote/hybrid work environment support.
For more information about BIP US, visit *********************************
Equal Employment Opportunity:
It is BIP US Consulting policy to provide equal employment opportunities to all individuals based on job-related qualifications and ability to perform a job, without regard to age, gender, gender identity, sexual orientation, race, color, religion, creed, national origin, disability, genetic information, veteran status, citizenship, or marital status, and to maintain a non-discriminatory environment free from intimidation, harassment or bias based upon these grounds.
BIP US provides a reasonable range of compensation for our roles. Actual compensation is influenced by a wide array of factors including but not limited to skill set, education, level of experience, and knowledge.
Java Software Engineer (Trading)-- AGADC5642050
Requirements engineer job in Jersey City, NJ
Must Haves:
1.) Low-latency Java development experience (trading experience preferred but not mandatory)
These are more from a screening standpoint; if they have low-latency Java development experience, they should also have the following:
2.) Garbage collection, threading and/or multithreading, and memory management experience
3.) FIX Protocol
4.) Optimization or profiling techniques
Nice to Haves:
Order management system, smart order router, or market data experience
Java Software Engineer
Requirements engineer job in Iselin, NJ
Job Information:
Functional Title - Assistant Vice President, Java Software Development Engineer
Department - Technology
Corporate Level - Assistant Vice President
Report to - Director, Application Development
Expected full-time salary range between $125,000 - $145,000 + variable compensation + 401(k) match + benefits
Job Description:
This position is with CLS Technology. The primary responsibilities of the job will be
(a) Hands-on software application development
(b) Level 3 support
Duties, Responsibilities, and Deliverables:
Develop scalable, robust applications utilizing appropriate design patterns, algorithms and Java frameworks
Collaborate with Business Analysts, Application Architects, Developers, QA, Engineering, and Technology Vendor teams for design, development, testing, maintenance and support
Adhere to the CLS SDLC process and governance requirements and ensure full compliance with these requirements
Plan, implement and ensure that delivery milestones are met
Provide solutions using design patterns, common techniques, and industry best practices that meet the typical challenges/requirements of a financial application including usability, performance, security, resiliency, and compatibility
Proactively recognize system deficiencies and implement effective solutions
Participate in, contribute to, and assimilate changes, enhancements, requirements (functional and non-functional), and requirements traceability
Apply significant knowledge of industry trends and developments to improve CLS in-house practices and services
Provide Level-3 support. Provide application knowledge and training to Level-2 support teams
Experience Requirements:
5+ years of hands-on application development and testing experience, with proficient knowledge of core Java and JEE technologies such as JDBC and JAXB, and Java/web technologies
Knowledge of Python, Perl, Unix shell scripting is a plus
Expert hands-on experience with SQL and with at least one DBMS such as IBM DB2 (preferred) or Oracle is a strong plus
Expert knowledge of and experience in securing web applications, secure coding practices
Hands-on knowledge of application resiliency, performance tuning, technology risk management is a strong plus
Hands-on knowledge of messaging middleware such as IBM MQ (preferred) or TIBCO EMS, and application servers such as WebSphere, or WebLogic
Knowledge of SWIFT messaging, payments processing, FX business domain is a plus
Hands-on knowledge of CI/CD practices and DevOps toolsets such as JIRA, Git, Ant, Maven, Jenkins, Bamboo, Confluence, and ServiceNow.
Hands-on knowledge of MS Office toolset including MS-Excel, MS-Word, PowerPoint, and Visio
Proven track record of successful application delivery to production and effective Level-3 support.
Success factors: In addition, the person selected for the job will
Have strong analytical, written and oral communication skills with a high self-motivation factor
Possess excellent organization skills to manage multiple tasks in parallel
Be a team player
Have the ability to work on complex projects with globally distributed teams and manage tight delivery timelines
Have the ability to smoothly handle high stress application development and support environments
Strive continuously to improve stakeholder management for end-to-end application delivery and support
Qualification Requirements:
Bachelor's Degree
Minimum 5 years' experience in Information Technology
Java Software Engineer
Requirements engineer job in Monroe, NJ
Java 21 SE, Spring Boot, REST API Development, TCP/IP Socket communications, JDBC database interfacing, JSON
Proficiency with Maven/Gradle for build automation and Git for version control.
Experience with unit and integration testing tools such as JUnit and Mockito
IDE - Eclipse IDE
Knowledge of Distribution Centre (Supply Chain Management) systems would be an added advantage