Sr Data Engineer, Python Server-Side
Data engineer job in White House Station, NJ
This is a direct-hire, full-time position with a hybrid format: on-site two days a week.
YOU MUST BE A US CITIZEN OR GREEN CARD HOLDER; NO OTHER STATUS TO WORK IN THE US WILL BE PERMITTED.
YOU MUST LIVE LOCAL TO THE AREA AND BE ABLE TO DRIVE ON-SITE A MINIMUM OF TWO DAYS A WEEK.
THE TECH STACK WILL BE:
7 years of demonstrated server-side development proficiency
5 years of demonstrated server-side development proficiency
Programming Languages: Python (NumPy, Pandas, Oracle PL/SQL). Other non-interpreted languages such as Java, C++, or Rust are a plus. Must be proficient at an intermediate-to-advanced level of the language (concurrency, memory management, etc.)
Design patterns: typical GoF patterns (Factory, Facade, Singleton, etc.); a minimal Factory sketch follows this list
Data structures: maps, lists, arrays, etc.
SCM: solid Git proficiency, MS Azure DevOps (CI/CD)
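To make the design-pattern requirement concrete, here is a minimal Python sketch of the Factory pattern named above; the loader classes and names are invented for illustration, not taken from the employer's stack:

    from abc import ABC, abstractmethod
    import csv
    import json

    class Loader(ABC):
        """Common interface that every concrete loader implements."""
        @abstractmethod
        def load(self, path: str) -> list[dict]: ...

    class CsvLoader(Loader):
        def load(self, path: str) -> list[dict]:
            with open(path, newline="") as f:
                return list(csv.DictReader(f))

    class JsonLoader(Loader):
        def load(self, path: str) -> list[dict]:
            with open(path) as f:
                return json.load(f)

    def loader_factory(kind: str) -> Loader:
        """Factory: callers ask for a kind, never a concrete class."""
        loaders = {"csv": CsvLoader, "json": JsonLoader}
        return loaders[kind]()  # KeyError surfaces unsupported kinds early

The point of the pattern is that calling code depends only on the Loader interface, so new formats can be added without touching consumers.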
Senior Data Engineer (Snowflake)
Data engineer job in Parsippany-Troy Hills, NJ
Senior Data Engineer (Snowflake & Python)
1-Year Contract | $60/hour + Benefit Options
Hybrid: On-site a few days per month (local candidates only)
Work Authorization Requirement
You must be authorized to work for any employer in the US as a W-2 employee; this is a firm requirement for the role.
This position is W-2 only - no C2C, no third-party submissions, and no sponsorship will be considered.
Overview
We are seeking a Senior Data Engineer to support enterprise-scale data initiatives for a highly collaborative engineering organization. This is a new, long-term contract opportunity for a hands-on data professional who thrives in fast-paced environments and enjoys building high-quality, scalable data solutions on Snowflake.
Candidates must be based in or around New Jersey, able to work on-site at least 3 days per month, and meet the W2 employment requirement.
What You'll Do
Design, develop, and support enterprise-level data solutions with a strong focus on Snowflake
Participate across the full software development lifecycle - planning, requirements, development, testing, and QA
Partner closely with engineering and data teams to identify and implement optimal technical solutions
Build and maintain high-performance, scalable data pipelines and data warehouse architectures
Ensure platform performance, reliability, and uptime, maintaining strong coding and design standards
Troubleshoot production issues, identify root causes, implement fixes, and document preventive solutions
Manage deliverables and priorities effectively in a fast-moving environment
Contribute to data governance practices including metadata management and data lineage
Support analytics and reporting use cases leveraging advanced SQL and analytical functions
Required Skills & Experience
8+ years of experience designing and developing data solutions in an enterprise environment
5+ years of hands-on Snowflake experience
Strong hands-on development skills with SQL and Python
Proven experience designing and developing data warehouses in Snowflake
Ability to diagnose, optimize, and tune SQL queries
Experience with Azure data frameworks (e.g., Azure Data Factory)
Strong experience with orchestration tools such as Airflow, Informatica, Automic, or similar
Solid understanding of metadata management and data lineage
Hands-on experience with SQL analytical functions (a hedged sketch follows this list)
Working knowledge of shell scripting and JavaScript
Experience using Git, Confluence, and Jira
Strong problem-solving and troubleshooting skills
Collaborative mindset with excellent communication skills
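To ground the analytical-functions requirement, here is a hedged sketch using the snowflake-connector-python package; the account, warehouse, and table names are placeholders, not the client's:

    import snowflake.connector

    # Connect with placeholder credentials (use a secrets manager in practice)
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="...",
        warehouse="ANALYTICS_WH",
    )
    try:
        cur = conn.cursor()
        # Window (analytical) functions: running totals and recency ranking
        cur.execute("""
            SELECT order_id,
                   customer_id,
                   amount,
                   SUM(amount) OVER (PARTITION BY customer_id
                                     ORDER BY order_date) AS running_total,
                   ROW_NUMBER() OVER (PARTITION BY customer_id
                                      ORDER BY order_date DESC) AS recency_rank
            FROM sales.orders
        """)
        for row in cur.fetchmany(10):
            print(row)
    finally:
        conn.close()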
Nice to Have
Experience supporting Pharma industry data
Exposure to Omni-channel data environments
Why This Opportunity
$60/hour W2 on a long-term 1-year contract
Benefit options available
Hybrid structure with limited on-site requirement
High-impact role supporting enterprise data initiatives
Clear expectations: W-2 only, no third-party submissions, no Corp-to-Corp
This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
Data Engineer
Data engineer job in Hamilton, NJ
Key Responsibilities:
Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
Manage FTP/SFTP file transfers between internal systems and external vendors (a minimal transfer sketch follows this list).
Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
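For the SFTP responsibility above, a minimal hedged sketch using the paramiko library; the host, credentials, and file paths are placeholders:

    import paramiko

    def fetch_vendor_file(host: str, user: str, key_path: str,
                          remote_path: str, local_path: str) -> None:
        """Download one vendor file over SFTP with key-based auth."""
        client = paramiko.SSHClient()
        # AutoAddPolicy is convenient for a sketch; pin host keys in production
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=user, key_filename=key_path)
        try:
            sftp = client.open_sftp()
            sftp.get(remote_path, local_path)
            sftp.close()
        finally:
            client.close()

    fetch_vendor_file("sftp.vendor.example", "svc_user", "~/.ssh/id_rsa",
                      "/outbound/prices.csv", "/data/incoming/prices.csv")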
Required Skills & Experience:
10+ years of experience in data engineering or production support within financial services or trading environments.
Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
Strong Python and SQL programming skills.
Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
Experience with Git, CI/CD pipelines, and Azure DevOps.
Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
Excellent communication, problem-solving, and stakeholder management skills.
Senior Data Engineer
Data engineer job in New Providence, NJ
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies - in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences - to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents.
Job Description
Experienced Data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
Work in tandem with our engineering team to identify and implement the most optimal solutions
Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
Able to manage deliverables in fast-paced environments
Areas of Expertise
At least 10 years of experience designing and developing data solutions in an enterprise environment
At least 5 years of experience on the Snowflake platform
Strong hands-on SQL and Python development
Experience with designing and developing data warehouses in Snowflake
A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
Strong hands-on experience with orchestration tools (e.g., Airflow, Informatica, Automic)
Good understanding of metadata management and data lineage
Hands-on knowledge of SQL analytical functions
Strong knowledge and hands-on experience in shell scripting and JavaScript
Able to demonstrate experience with software engineering practices, including CI/CD, automated testing, and performance engineering
Good understanding of and exposure to Git, Confluence, and Jira
Good problem-solving and troubleshooting skills
Team player, collaborative approach and excellent communication skills
Our Commitment to Diversity & Inclusion:
Did you know that Apexon has been Certified™ by Great Place To Work, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK? Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We are taking affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. You can read our Job Applicant Privacy Policy at apexon.com.
AWS Data Engineer with Databricks || USC Only || W2 Only
Data engineer job in Princeton, NJ
AWS Data Engineer with Databricks
Princeton, NJ - Hybrid - Need Locals or Nearby
Duration: Long Term
This position is available only to U.S. citizens.
Key Responsibilities
Design and implement ETL/ELT pipelines with Databricks, Apache Spark, AWS Glue, S3, Redshift, and EMR for processing large-scale structured and unstructured data (a minimal sketch follows this list).
Optimize data flows, monitor performance, and troubleshoot issues to maintain reliability and scalability.
Collaborate on data modeling, governance, security, and integration with tools like Airflow or Step Functions.
Document processes and mentor junior team members on best practices.
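As a rough illustration of the pipeline work above, here is a hedged PySpark sketch; the bucket names and columns are assumptions, not the actual environment:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Ingest raw JSON from S3 (bucket names are placeholders)
    raw = spark.read.json("s3://raw-bucket/orders/")

    # Basic cleanup: dedupe, derive a date column, drop bad rows
    clean = (raw
             .dropDuplicates(["order_id"])
             .withColumn("order_date", F.to_date("order_ts"))
             .filter(F.col("amount") > 0))

    # Write curated, partitioned Parquet for downstream Redshift/Athena use
    (clean.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3://curated-bucket/orders/"))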
Required Qualifications
Bachelor's degree in Computer Science, Engineering, or related field.
5+ years of data engineering experience, with strong proficiency in Databricks, Spark, Python, SQL, and AWS services (S3, Glue, Redshift, Lambda).
Familiarity with big data tools like Kafka, Hadoop, and data warehousing concepts.
Azure Data Engineer
Data engineer job in Jersey City, NJ
Title: Senior Azure Data Engineer
Client: Major Japanese Bank
Experience Level: Senior (10+ Years)
The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.
Key Responsibilities:
Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
Ensure data security, compliance, lineage, and governance controls.
Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
Troubleshoot performance issues and improve system efficiency.
Required Skills:
10+ years of data engineering experience.
Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
Azure certifications strongly preferred.
Strong SQL, Python, and cloud data architecture skills.
Experience in financial services or large enterprise environments preferred.
Data Scientist
Data engineer job in Parsippany-Troy Hills, NJ
***This role is hybrid three days per week onsite in Parsippany, NJ. LOCAL CANDIDATES ONLY. No relocation***
Data Scientist
• Summary: Provide analytics, telemetry, and ML/GenAI-driven insights to measure SDLC health, prioritize improvements, validate pilot outcomes, and implement AI-driven development lifecycle capabilities.
• Responsibilities:
o Define metrics and instrumentation for SDLC/CI pipelines, incidents, and delivery KPIs.
o Build dashboards, anomaly detection, and data models; implement GenAI solutions (e.g., code suggestion, PR summarization, automated test generation) to improve developer workflows. (A hedged anomaly-detection sketch follows this posting.)
o Design experiments and validate AI-driven features during the pilot.
o Collaborate with engineering and SRE to operationalize models and ensure observability and data governance.
• Required skills:
o 3+ years of applied data science/ML in production; hands-on experience with GenAI/LLMs applied to developer workflows or DevOps automation.
o Strong Python (pandas, scikit-learn), ML frameworks, SQL, and data visualization (Tableau/Power BI).
o Experience with observability/telemetry data (logs/metrics/traces) and A/B experiment design.
• Preferred:
o Experience with model deployment, MLOps, prompt engineering, and cloud data platforms (AWS/GCP/Azure).
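As a hedged illustration of the anomaly-detection responsibility above, a minimal scikit-learn sketch over invented CI-run telemetry (the data frame and column names are placeholders):

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    # Toy telemetry: pipeline durations and test counts; two rows are anomalous
    runs = pd.DataFrame({
        "duration_s": [312, 298, 305, 2900, 310, 301, 295, 3100],
        "tests_run":  [840, 835, 842, 850, 838, 836, 841, 12],
    })

    model = IsolationForest(contamination=0.25, random_state=0)
    runs["anomaly"] = model.fit_predict(runs[["duration_s", "tests_run"]])

    # -1 marks suspected anomalies worth surfacing on a dashboard or alert
    print(runs[runs["anomaly"] == -1])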
Data Analytics Engineer
Data engineer job in Somerset, NJ
Client: manufacturing company
Type: direct hire
Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets.
This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams.
This role is on-site five days per week in Somerset, NJ.
Key Responsibilities
Power BI Reporting & Administration
Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
Develop and maintain data models to ensure accuracy, consistency, and reliability
Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
Optimize Power BI solutions for performance, scalability, and ease of use
ETL & Data Warehousing
Design and maintain data warehouse structures, including schema and database layouts
Develop and support ETL processes to ensure timely and accurate data ingestion
Integrate data from multiple systems while ensuring quality, consistency, and completeness
Work closely with database administrators to optimize data warehouse performance
Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed
Training & Documentation
Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
Manage data definitions, lineage documentation, and data cataloging for all enterprise data models
Project Management
Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
Collaborate with key business stakeholders to ensure departmental reporting needs are met
Record meeting notes in Confluence and document project updates in Jira
Data Governance
Implement and enforce data governance policies to ensure data quality, compliance, and security
Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness
Routine IT Functions
Resolve Help Desk tickets related to reporting, dashboards, and BI tools
Support general software and hardware installations when needed
Other Responsibilities
Manage email and phone communication professionally and promptly
Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
Perform additional assigned duties as needed
Qualifications
Required
Minimum of 3 years of relevant experience
Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
Experience with cloud-based BI environments (Azure, AWS, etc.)
Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
Proficiency in SQL for data extraction, manipulation, and transformation
Strong knowledge of DAX
Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
Strong analytical, problem-solving, and documentation skills
Excellent written and verbal communication abilities
High attention to detail and strong self-review practices
Effective time management and organizational skills; ability to prioritize workload
Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
Sr Data Modeler with Capital Markets/Custody
Data engineer job in Jersey City, NJ
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit *******************
Job Title: Principal Data Modeler / Data Architecture Lead - Capital Markets
Work Location
Jersey City, NJ (Onsite, 5 days / week)
Job Description:
We are seeking a highly experienced Principal Data Modeler / Data Architecture Lead to reverse engineer an existing logical data model supporting all major lines of business in the capital markets domain.
The ideal candidate will have deep capital markets domain expertise and will work closely with business and technology stakeholders to elicit and document requirements, map those requirements to the data model, and drive enhancements or rationalization of the logical model prior to its conversion to a physical data model.
A software development background is not required.
Key Responsibilities
Reverse engineer the current logical data model, analyzing entities, relationships, and subject areas across capital markets (including customer, account, portfolio, instruments, trades, settlement, funds, reporting, and analytics).
Engage with stakeholders (business, operations, risk, finance, compliance, technology) to capture and document business and functional requirements, and map these to the data model.
Enhance or streamline the logical data model, ensuring it is fit-for-purpose, scalable, and aligned with business needs before conversion to a physical model.
Lead the logical-to-physical data model transformation, including schema design, indexing, and optimization for performance and data quality.
Perform advanced data analysis using SQL or other data analysis tools to validate model assumptions, support business decisions, and ensure data integrity.
Document all aspects of the data model, including entity and attribute definitions, ERDs, source-to-target mappings, and data lineage.
Mentor and guide junior data modelers, providing coaching, peer reviews, and best practices for modeling and documentation.
Champion a detail-oriented and documentation-first culture within the data modeling team.
Qualifications
Minimum 15 years of experience in data modeling, data architecture, or related roles within capital markets or financial services.
Strong domain expertise in capital markets (e.g., trading, settlement, reference data, funds, private investments, reporting, analytics).
Proven expertise in reverse engineering complex logical data models and translating business requirements into robust data architectures.
Strong skills in data analysis using SQL and/or other data analysis tools.
Demonstrated ability to engage with stakeholders, elicit requirements, and produce high-quality documentation.
Experience in enhancing, rationalizing, and optimizing logical data models prior to physical implementation.
Ability to mentor and lead junior team members in data modeling best practices.
Passion for detail, documentation, and continuous improvement.
Software development background is not required.
Preferred Skills
Experience with data modeling tools (e.g., ER/Studio, ERwin, Power Designer).
Familiarity with capital markets business processes and data flows.
Knowledge of regulatory and compliance requirements in financial data management.
Exposure to modern data platforms (e.g., Snowflake, Databricks, cloud databases).
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, colour, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Data Engineer
Data engineer job in Newark, NJ
NeenOpal is a global consulting firm specializing in Data Science and Business Intelligence, with offices in Bengaluru, Newark, and Fredericton. We provide end-to-end solutions tailored to the unique needs of businesses, from startups to large organizations, across domains like digital strategy, sales and marketing, supply chain, and finance. Our mission is to help organizations achieve operational excellence and transform into data-driven enterprises.
Role Description
This is a full-time, hybrid, Data Engineer role located in Newark, NJ. The Data Engineer will be responsible for designing, implementing, and managing data engineering solutions to support business needs. Day-to-day tasks include building and optimizing data pipelines, developing and maintaining data models and ETL processes, managing data warehousing solutions, and contributing to the organization's data analytics initiatives. Collaboration with cross-functional teams to ensure robust data infrastructure will be a key aspect of this role.
Key Responsibilities
Data Pipeline Development: Design, implement, and manage robust data pipelines to ensure efficient data flow into data warehouses. Automate ETL processes using Python and advanced data engineering tools.
Data Integration: Integrate and transform data using industry-standard tools (a minimal API-to-warehouse sketch follows this list). Experience required with:
AWS Services: AWS Glue, Data Pipeline, Redshift, and S3.
Azure Services: Azure Data Factory, Synapse Analytics, and Blob Storage.
Data Warehousing: Implement and optimize solutions using Snowflake and Amazon Redshift.
Database Management: Develop and manage relational databases (SQL Server, MySQL, PostgreSQL) to ensure data integrity.
Performance Optimization: Continuously monitor and improve data processing workflows and apply best practices for query optimization.
Global Collaboration: Work closely with cross-functional teams in the US, India, and Canada to deliver high-quality solutions.
Governance & Support: Document ETL processes and data mappings in line with governance standards. Diagnose and resolve data-related issues promptly.
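A minimal, hedged sketch of the API-integration work described above: pull JSON from a REST endpoint and land it in a warehouse staging table. The endpoint, token, connection string, and table name are all placeholders:

    import requests
    import pandas as pd
    from sqlalchemy import create_engine

    # Call a placeholder REST endpoint with bearer auth and a timeout
    resp = requests.get("https://api.example.com/v1/orders",
                        headers={"Authorization": "Bearer <token>"},
                        timeout=30)
    resp.raise_for_status()

    # Flatten nested JSON into a tabular frame
    df = pd.json_normalize(resp.json())

    # Land the frame in a staging table (placeholder DSN; driver must be installed)
    engine = create_engine("postgresql://user:pwd@host:5432/dw")
    df.to_sql("stg_orders", engine, if_exists="replace", index=False)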
Required Skills and Experience
Experience: 2+ years of experience designing and developing ETL processes (AWS Glue, Azure Data Factory, or similar).
Integration: Experience integrating data via RESTful / GraphQL APIs.
Programming: Proficient in Python for ETL automation and SQL for database management.
Cloud Platforms: Strong experience with AWS or Azure data services (GCP familiarity is a plus).
Data Warehousing: Expertise with Snowflake, Amazon Redshift, or Azure Synapse Analytics.
Communication: Excellent articulation skills to explain technical work directly to clients and stakeholders.
Authorization: Must have valid work authorization in the United States.
Salary Range: $65,000 - $80,000 per year
Benefits: This role includes health insurance, paid time off, and opportunities for professional growth and continuous learning within a fast-growing global analytics company.
Equal Opportunity Employer NeenOpal Inc. is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
Senior Data Engineer - MDM
Data engineer job in Iselin, NJ
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our challenge:
We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team. The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector. You will be responsible for architecting robust data platforms, evaluating MDM tools, and aligning data strategies to meet business needs.
Additional Information
The base salary for this position will vary based on geography and other factors. In accordance with the law, the base salary for this role if filled within Iselin, NJ is $135K to $150K/year & benefits (see below).
Key Responsibilities:
Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM, etc.) and provide objective recommendations aligned with business requirements.
Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
Implement data integration pipelines leveraging modern data engineering tools and practices.
Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer (a minimal DAG sketch follows this list).
Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies.
Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services.
Ensure compliance with data governance, data privacy, and security standards.
Support CI/CD pipelines for continuous integration and deployment of data solutions.
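As one hedged illustration of the workflow-automation item above, a minimal Airflow DAG (assuming Airflow 2.4+; the DAG, task names, and task logic are placeholders, not Synechron's pipeline):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_reference_data(**context):
        print("pulling reference data...")    # placeholder for the real extract

    def load_master_records(**context):
        print("upserting golden records...")  # placeholder for the real MDM load

    with DAG(
        dag_id="mdm_reference_data_sync",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract",
                                 python_callable=extract_reference_data)
        load = PythonOperator(task_id="load",
                              python_callable=load_master_records)
        extract >> load  # simple linear dependency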
Qualifications:
12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
Strong functional knowledge of reference data sources and domain-specific data standards.
Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
Familiarity with CI/CD practices, tools, and automation pipelines.
Ability to work collaboratively across teams to deliver complex data solutions.
Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).
Preferred Skills:
Familiarity with financial data models and regulatory requirements.
Experience with Azure cloud platforms
Knowledge of data governance, data quality frameworks, and metadata management.
We offer:
A highly competitive compensation and benefits package
A multinational organization with 58 offices in 21 countries and the possibility to work abroad
10 days of paid annual leave (plus sick leave and national holidays)
Maternity & Paternity leave plans
A comprehensive insurance plan including: medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region)
Retirement savings plans
A higher education certification policy
Commuter benefits (varies by region)
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms
A flat and approachable organization
A truly diverse, fun-loving and global work culture
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Big Data Developer
Data engineer job in Jersey City, NJ
Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing (a small DDL sketch follows this list)
Implementing Spark processing based ETL frameworks
Implementing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
Modifying the Informatica-Teradata & Unix based data pipeline
Enhancing the Talend-Hive/Spark & Unix based data pipelines
Developing and deploying Scala/Python-based Spark jobs for ETL processing
Strong SQL & DWH concepts
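To ground the table-definition point above, a small hedged sketch: a partitioned, ORC-backed, Snappy-compressed Hive table created through PySpark (the database, table, and column names are invented):

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive-ddl")
             .enableHiveSupport()   # lets Spark talk to the Hive metastore
             .getOrCreate())

    # Hive DDL: columnar ORC storage with Snappy compression, partitioned by date
    spark.sql("""
        CREATE TABLE IF NOT EXISTS staging.trades (
            trade_id BIGINT,
            symbol   STRING,
            qty      INT,
            px       DOUBLE
        )
        PARTITIONED BY (trade_date DATE)
        STORED AS ORC
        TBLPROPERTIES ('orc.compress' = 'SNAPPY')
    """)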
Senior Data Architect
Data engineer job in Edison, NJ
Act as an Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
Troubleshoot complex data and performance issues and propose long-term architectural solutions.
Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.
About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
Neo4j Engineer
Data engineer job in Summit, NJ
Must Have Technical/Functional Skills
Neo4j, Graph Data Science, Cypher, Python, Graph Algorithms, Bloom, GraphXR, Cloud, Kubernetes, ETL
Roles & Responsibilities
Design and implement graph-based data models using Neo4j.
Develop Cypher queries and procedures for efficient graph traversal and analysis (a minimal driver sketch follows this list).
Apply Graph Data Science algorithms for community detection, centrality, and similarity.
Integrate Neo4j with enterprise data platforms and APIs.
Collaborate with data scientists and engineers to build graph-powered applications.
Optimize performance and scalability of graph queries and pipelines.
Support deployment and monitoring of Neo4j clusters in cloud or on-prem environments.
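A minimal, hedged sketch of the Cypher and Graph Data Science work above, using the official neo4j Python driver; the URI, credentials, labels, and projected graph name are assumptions, and the GDS call requires the GDS plugin plus a pre-existing graph projection:

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "secret"))

    with driver.session() as session:
        # Plain Cypher traversal: who transferred money to whom
        result = session.run(
            "MATCH (a:Account)-[t:TRANSFERRED_TO]->(b:Account) "
            "RETURN a.id, b.id, t.amount LIMIT 10"
        )
        for record in result:
            print(record)

        # Graph Data Science: PageRank over a previously projected graph
        session.run("CALL gds.pageRank.write('accounts-graph', "
                    "{writeProperty: 'pagerank'})")

    driver.close()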
Salary Range: $110,000 - $140,000 / Year
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Time Off, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
Java Software Engineer
Data engineer job in East Windsor, NJ
Job Responsibilities:
Develop applications using Java 8/JEE (and higher), Angular 2+, React.js, SQL, Spring, HTML5, CSS, JavaScript, and TypeScript, among other tools.
Write scalable, secure, and maintainable code that powers our clients' platforms.
Create, deploy, and maintain automated system tests.
Work with Testers to understand defects and resolve them in a timely manner.
Support continuous improvement by investigating alternatives and technologies, and presenting these for architectural review.
Collaborate effectively with other team members to accomplish shared user story and sprint goals.
Requirement:
Experience in programming languages: Java and JavaScript
Decent understanding of the software development life cycle
Basic programming skills using object-oriented programming (OOP) languages, with in-depth knowledge of common APIs and data structures like Collections, Maps, Lists, Sets, etc.
Knowledge of relational databases (e.g., SQL Server, Oracle) and basic SQL query language skills
Preferred Qualifications:
Master's Degree in Computer Science (CS)
0-1 year of practical experience in Java coding
Experience using Spring, Maven, Angular frameworks, HTML, and CSS
Knowledge of other contemporary Java technologies (e.g., WebLogic, RabbitMQ, Tomcat)
Familiarity with JSP, J2EE, and JDBC
Java Software Engineer
Data engineer job in Iselin, NJ
Job Information:
Functional Title - Assistant Vice President, Java Software Development Engineer
Department - Technology
Corporate Level - Assistant Vice President
Report to - Director, Application Development
Expected full-time salary range between $125,000 - $145,000 + variable compensation + 401(k) match + benefits
Job Description:
This position is with CLS Technology. The primary responsibilities of the job will be
(a) Hands-on software application development
(b) Level 3 support
Duties, Responsibilities, and Deliverables:
Develop scalable, robust applications utilizing appropriate design patterns, algorithms and Java frameworks
Collaborate with Business Analysts, Application Architects, Developers, QA, Engineering, and Technology Vendor teams for design, development, testing, maintenance and support
Adhere to CLS SDLC process and governance requirements and ensure full compliance of these requirements
Plan, implement and ensure that delivery milestones are met
Provide solutions using design patterns, common techniques, and industry best practices that meet the typical challenges/requirements of a financial application including usability, performance, security, resiliency, and compatibility
Proactively recognize system deficiencies and implement effective solutions
Participate in, contribute to, and assimilate changes, enhancements, requirements (functional and non-functional), and requirements traceability
Apply significant knowledge of industry trends and developments to improve CLS in-house practices and services
Provide Level-3 support. Provide application knowledge and training to Level-2 support teams
Experience Requirements:
5+ years of hands-on application development and testing experience with proficient knowledge of core Java and JEE technologies such as JDBC and JAXB, Java/Web technologies
Knowledge of Python, Perl, Unix shell scripting is a plus
Expert hands-on experience with SQL and with at least one DBMS such as IBM DB2 (preferred) or Oracle is a strong plus
Expert knowledge of and experience in securing web applications, secure coding practices
Hands-on knowledge of application resiliency, performance tuning, technology risk management is a strong plus
Hands-on knowledge of messaging middleware such as IBM MQ (preferred) or TIBCO EMS, and application servers such as WebSphere, or WebLogic
Knowledge of SWIFT messaging, payments processing, FX business domain is a plus
Hands-on knowledge of CI/CD practices and DevOps toolsets such as JIRA, GIT, Ant, Maven, Jenkins, Bamboo, Confluence, and ServiceNow.
Hands-on knowledge of MS Office toolset including MS-Excel, MS-Word, PowerPoint, and Visio
Proven track record of successful application delivery to production and effective Level-3 support.
Success factors: In addition, the person selected for the job will
Have strong analytical, written and oral communication skills with a high self-motivation factor
Possess excellent organization skills to manage multiple tasks in parallel
Be a team player
Have the ability to work on complex projects with globally distributed teams and manage tight delivery timelines
Have the ability to smoothly handle high stress application development and support environments
Strive continuously to improve stakeholder management for end-to-end application delivery and support
Qualification Requirements:
Bachelor's Degree
Minimum 5 years' experience in Information Technology
Software Engineer, Banking Operations
Data engineer job in Jersey City, NJ
Business Integration Partners (BIP) is Europe's fastest-growing digital consulting company and is on track to reach the Top 20 by 2030, with an expanding global footprint in the US. Operating at the intersection of business and technology, we design, develop, and deliver sustainable solutions at pace and scale, creating greater value for our customers, employees, shareholders, and society.
BIP specializes in high-impact consulting services across multiple industries, with 6,000 employees worldwide. Our Financial Services business serves the Capital Markets, Insurance, and Payments verticals, supplemented with Data & AI, Cybersecurity, Risk & Compliance, Change Management, and Digital Transformation practices. We integrate deep industry expertise with business, technology, and quantitative disciplines to deliver high-impact results for our clients.
BIP is currently expanding its footprint in the United States, focusing on growing its Capital Markets and Financial Services lines. Our teams operate at the intersection of business strategy, technology, and data to help our clients in driving smarter decisions, reducing risks, and staying ahead in a fast-evolving market environment.
About the Role:
The Software Engineer will contribute to the design, development, and enhancement of core payments and wire processing applications within our corporate and investment banking client's technology organization. Engineers will work across distributed systems, real-time transaction pipelines, settlement engines, and compliance/monitoring platforms supporting high-volume, low-latency financial operations.
You must have valid US work authorization and must physically reside around the posted city, within a 50-mile commute. We are unable to support relocation costs.
Please do not apply for this position unless you meet the criteria outlined above.
Key Responsibilities:
Develop and maintain services supporting high-volume payments, wire transfers, and money movement workflows.
Build scalable applications using Python, Java, or .NET across distributed environments.
Implement integrations with internal banking platforms, payment rails, and ledger systems.
Troubleshoot production issues, improve resiliency, and reduce latency across transaction flows.
Contribute to modernization efforts, including cloud migration, refactoring legacy components, and API enablement.
Collaborate closely with BAs, architects, PMs, and offshore/nearshore teams.
Follow secure coding standards, operational controls, and SDLC processes required by the bank.
Required Skills:
3-10+ years of hands-on experience in Python, Java, or C#/.NET.
Experience with relational databases (Oracle, SQL Server).
Understanding of payments, wire transfers, clearing systems, or financial services workflows.
Familiarity with distributed systems, messaging, and event-driven architectures.
Strong debugging and production support experience.
Understanding of CI/CD and Agile environments.
Preferred Skills:
Hadoop / Informatica ecosystem knowledge.
Experience with ISO 20022, SWIFT, Fedwire, CHIPS.
Microservices architecture, REST/gRPC APIs.
Performance tuning and low-latency engineering.
**The base salary range for this role is $125,000 - $175,000**
Benefits:
Choice of medical, dental, vision insurance.
Voluntary benefits.
Short- and long-term disability.
HSA and FSAs.
Matching 401k.
Discretionary performance bonus.
Employee referral bonus.
Employee assistance program.
11 public holidays.
20 days PTO.
7 Sick Days.
PTO buy and sell program.
Volunteer days.
Paid parental leave.
Remote/hybrid work environment support.
For more information about BIP US, visit *********************************
Equal Employment Opportunity:
It is BIP US Consulting policy to provide equal employment opportunities to all individuals based on job-related qualifications and ability to perform a job, without regard to age, gender, gender identity, sexual orientation, race, color, religion, creed, national origin, disability, genetic information, veteran status, citizenship, or marital status, and to maintain a non-discriminatory environment free from intimidation, harassment or bias based upon these grounds.
BIP US provides a reasonable range of compensation for our roles. Actual compensation is influenced by a wide array of factors including but not limited to skill set, education, level of experience, and knowledge.
Senior Dotnet Developer
Data engineer job in Berkeley Heights, NJ
Net2Source is a Global Workforce Solutions Company headquartered in NJ, USA, with branch offices in the Asia Pacific region. We are one of the fastest-growing IT consulting companies in the USA, and we are hiring a
"Senior Dotnet Developer"
for one of our clients. We offer a wide gamut of consulting solutions customized to our 450+ clients ranging from Fortune 500/1000 to Start-ups across various verticals like Technology, Financial Services, Healthcare, Life Sciences, Oil & Gas, Energy, Retail, Telecom, Utilities, Technology, Manufacturing, the Internet, and Engineering.
Position : Senior Dotnet Developer with Azure
Location: Berkeley Heights, NJ ( Onsite) - Only Locals
Type: Contract
Experience Level- 10+ Years
Client Interview Mode: In-person @ Berkeley Heights
As an experienced Software Engineer working for our leading client, you will serve as a member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. Depending on the team that you join, you could be developing mobile features that give our customers and clients more control over how they bank with the client, strategizing on how big data can make the client's trading systems quicker, creating the next innovation in payments for merchants, or supporting the integration of the client's private and public cloud platforms.
Mandatory Skills:
Good experience in C#, .NET Core, Web API, ADO, Azure DevOps, GitHub, Kafka, and Redis Cache.
Understanding of security principles, RESTful APIs, and identity and token handling (OAuth/OpenID Connect).
Good hands-on experience in WebForms / MVC.
Minimum 1.5 years of experience in MVC and WebForms.
Understanding of building .NET apps for Docker/Linux; exposure to Kubernetes is a plus.
Why Work With Us?
We believe in more than just jobs-we build careers. At Net2Source, we champion leadership at all levels, celebrate diverse perspectives, and empower you to make an impact. Think work-life balance, professional growth, and a collaborative culture where your ideas matter.
Our Commitment to Inclusion & Equity
Net2Source is an equal opportunity employer, dedicated to fostering a workplace where diverse talents and perspectives are valued. We make all employment decisions based on merit, ensuring a culture of respect, fairness, and opportunity for all, regardless of age, gender, ethnicity, disability, or other protected characteristics.
Awards & Recognition
America's Most Honored Businesses (Top 10%)
Fastest-Growing Staffing Firm by Staffing Industry Analysts
INC 5000 List for Eight Consecutive Years
Top 100 by Dallas Business Journal
Spirit of Alliance Award by Agile1
Madhukar Singh
Email:***********************
Senior Python Developer
Data engineer job in Jersey City, NJ
Job Title: Senior Python Developer
The job involves designing, developing, and maintaining software, with a strong focus on data pipelines and financial systems. Key responsibilities include writing clean Python code, building and optimizing ETL processes, and collaborating with teams to create and enhance scalable, secure solutions that meet business needs. This role requires strong skills in Python and SQL and a background in enterprise-level platforms and data-driven solutions.
Key responsibilities
Application development: Design, develop, and maintain software applications using the Python language.
Database management: Work with relational and/or NoSQL databases to manage and store data.
Code quality: Write reusable, testable, and efficient code; participate in code reviews and debugging (a small sketch follows this list).
Collaboration: Work with cross-functional teams, including data engineers, business users, and other developers, to deliver solutions.
System enhancement: Contribute to system enhancements, and support the deployment of data-driven solutions.
Automation: Use Python scripts to automate tasks and processes.
Compliance and security: Ensure applications comply with security and regulatory standards.
Troubleshooting: Troubleshoot issues in existing systems and fix bugs.
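As a small illustration of the reusable, testable code this role calls for, a hedged sketch under invented domain names (a pure function with no I/O is trivially unit-testable):

    from decimal import Decimal

    def net_amount(gross: Decimal, fee_rate: Decimal) -> Decimal:
        """Pure transform: no I/O or hidden state, so tests are one-liners."""
        if not (Decimal("0") <= fee_rate < Decimal("1")):
            raise ValueError("fee_rate must be in [0, 1)")
        return (gross * (Decimal("1") - fee_rate)).quantize(Decimal("0.01"))

    # Minimal check, runnable as-is or under pytest
    assert net_amount(Decimal("100.00"), Decimal("0.015")) == Decimal("98.50")

Decimal (rather than float) is the usual choice for money in financial systems, which is why the sketch uses it.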
Key skills
Proficiency in Python
Experience with SQL
Knowledge of ETL development
Familiarity with databases (e.g., Microsoft SQL Server, PostgreSQL, MongoDB)
Understanding of enterprise-level platforms
Experience with agile development methodologies
Strong problem-solving and analytical skills
Experience with big data technologies (e.g., Hadoop, PySpark) is a plus
Java Software Engineer (Trading)-- AGADC5642050
Data engineer job in Jersey City, NJ
Must Haves:
1.) Low Latency Java Development experience (Trading would be preferred but not mandatory)
These are more from a screening standpoint; if they have low-latency Java development experience, they should have the following:
2.) Garbage collection, threading and/or multithreading, and memory management experience
3.) FIX Protocol
4.) Optimization techniques or profiling techniques
Nice to Haves:
Order Management System, Smart Order Router, and market data experience