Sr. Data Engineer - Python Server-Side
Data engineer job in White House Station, NJ
This is a direct hire full-time position, with a hybrid on-site 2 days a week format.
YOU MUST BE A US CITIZEN OR GREEN CARD HOLDER; NO OTHER STATUS TO WORK IN THE US WILL BE PERMITTED
YOU MUST LIVE LOCAL TO THE AREA AND BE ABLE TO DRIVE ONSITE A MINIMUM OF TWO DAYS A WEEK
REQUIRED EXPERIENCE:
7+ years demonstrated server-side development proficiency
THE TECH STACK WILL BE:
Programming Languages: Python (NumPy, Pandas) and Oracle PL/SQL. Compiled languages like Java, C++, Rust, etc. are a plus. Must be proficient at an intermediate-to-advanced level of the language (concurrency, memory management, etc.)
Design patterns: typical GoF patterns (Factory, Facade, Singleton, etc.)
Data structures: maps, lists, arrays, etc.
SCM: solid Git proficiency, MS Azure DevOps (CI/CD)
Data Engineer
Data engineer job in Philadelphia, PA
Job Title: Data Engineer
Experience: 5+ years
We are seeking an experienced Data Engineer with strong expertise in PySpark and data pipeline operations. This role focuses heavily on performance tuning Spark applications, managing large-scale data pipelines, and ensuring high operational stability. The ideal candidate is a strong technical problem-solver, highly collaborative, and proactive in automation and process improvements.
Key Responsibilities:
Data Pipeline Management & Support
Operate and support Business-as-Usual (BAU) data pipelines, ensuring stability, SLA adherence, and timely incident resolution.
Identify and implement opportunities for optimization and automation across pipelines and operational workflows.
Spark Development & Performance Tuning
Design, develop, and optimize PySpark jobs for efficient large-scale data processing.
Diagnose and resolve complex Spark performance issues such as data skew, shuffle spill, executor OOM errors, slow-running stages, and partition imbalance.
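The data-skew problem listed above is commonly mitigated with key salting: spreading a hot key over several salted variants so no single partition receives all of its rows. A minimal plain-Python sketch of the idea (not the Spark API; the function name, bucket counts, and data here are illustrative):

```python
import random
import zlib
from collections import defaultdict

def salted_partition(rows, num_partitions, salt_buckets=4):
    # Spread each key over `salt_buckets` salted variants so a hot key
    # no longer lands on a single partition -- the core idea behind
    # salting a skewed join or aggregation.
    partitions = defaultdict(list)
    for key, value in rows:
        salt = random.randrange(salt_buckets)
        bucket = (zlib.crc32(key.encode()) + salt) % num_partitions
        partitions[bucket].append((key, value))
    return partitions

random.seed(0)  # deterministic for the demo
# A skewed dataset: one hot key dominates.
rows = [("hot", i) for i in range(1000)] + [("cold", i) for i in range(10)]
parts = salted_partition(rows, num_partitions=8)
print(sorted(len(v) for v in parts.values()))  # hot rows now spread across buckets
```

In PySpark this typically translates to appending a random salt column before the wide aggregation or join and removing it afterwards; Spark 3's adaptive query execution can also handle some skew cases automatically.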
Platform & Tool Management
Use Databricks for Spark job orchestration, workflow automation, and cluster configuration.
Debug and manage Spark on Kubernetes, addressing pod crashes, OOM kills, resource tuning, and scheduling problems.
Work with MinIO/S3 storage for bucket management, permissions, and large-volume file ingestion and retrieval.
Collaboration & Communication
Partner with onshore business stakeholders to clarify requirements and convert them into well-defined technical tasks.
Provide daily coordination and technical oversight to offshore engineering teams.
Participate actively in design discussions and technical reviews.
Documentation & Operational Excellence
Maintain accurate and detailed documentation, runbooks, and troubleshooting guides.
Contribute to process improvements that enhance operational stability and engineering efficiency.
Required Skills & Qualifications:
Primary Skills (Must-Have)
PySpark: Advanced proficiency in transformations, performance tuning, and Spark internals.
SQL: Strong analytical query design, performance tuning, and foundational data modeling (relational & dimensional).
Python: Ability to write maintainable, production-grade code with a focus on modularity, automation, and reusability.
Secondary Skills (Highly Desirable)
Kubernetes: Experience with Spark-on-K8s, including pod diagnostics, resource configuration, and log/monitoring tools.
Databricks: Hands-on experience with cluster management, workflow creation, Delta Lake optimization, and job monitoring.
MinIO / S3: Familiarity with bucket configuration, policies, and efficient ingestion patterns.
DevOps: Experience with Git, CI/CD, and cloud environments (Azure preferred).
Data Modeler
Data engineer job in Philadelphia, PA
Role: Data Modeler
Fulltime
The Senior Data Modeler is responsible for developing advanced data models associated with the enterprise data warehouse (DART), IHG operational systems and IHG data exchange with limited management oversight. Primary job function entails gathering and assessing business and technical requirements to create/update database objects. Delivery items include using ERwin Data Modeler to create and maintain entity relationship diagrams, DDL, and supporting documentation as needed. Additional job functions include identifying solution design options, advanced data profiling and subject matter expertise.
This role works closely with all development resources including Business Systems Analysts, Developers and Project Managers. The Senior Data Modeler works independently with minimal guidance and acts as a resource for colleagues with less experience.
A solid understanding of the following is required:
• Software Development Life Cycle (SDLC)
• Agile Methodology
• Logical and Physical data modeling
• Levels of Normalization
• Abstraction and Generalization
• Subtyping and Classification
• Relational model design
• Dimensional model design (Star Schemas, Snowflake designs)
• Inmon & Kimball methodologies
• Master Data Management (MDM)
• Data Profiling
• Metadata Management
• Data security and protecting information
• ETL processes, BI processes
• Cloud Platforms, especially Google Cloud Platform
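The dimensional-design concepts above (fact tables keyed to dimensions in a star schema) can be illustrated with a toy example; all table and column names below are invented for illustration:

```python
# Toy star schema: one fact table keyed to two dimension tables.
dim_date = {1: {"date": "2024-01-01", "quarter": "Q1"},
            2: {"date": "2024-04-01", "quarter": "Q2"}}
dim_product = {10: {"name": "Widget", "category": "Hardware"}}

fact_sales = [
    {"date_key": 1, "product_key": 10, "amount": 120.0},
    {"date_key": 2, "product_key": 10, "amount": 80.0},
]

# A "query" against the star: total sales per quarter,
# resolving each fact row's surrogate key through the date dimension.
totals = {}
for row in fact_sales:
    quarter = dim_date[row["date_key"]]["quarter"]
    totals[quarter] = totals.get(quarter, 0.0) + row["amount"]

print(totals)  # {'Q1': 120.0, 'Q2': 80.0}
```

The same shape in SQL would be a fact table joined to conformed dimensions on surrogate keys, which is what the star/snowflake designs listed above formalize.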
Requirements:
• 7+ years of proven experience in Data Modeling
• 5+ years of experience with advanced SQL query techniques
• Highly skilled in ERwin, including but not limited to: diagramming, Complete Compare, reverse engineering, and forward engineering
• Strong multi-tasking capability, with adaptability to work simultaneously in multiple environments with differing procedures, SME support, and levels of accountability
• Ability to estimate scope of effort, to prioritize and/or fast track requirements, and to provide multiple options for data-driven solution design
• Demonstrated ability to recognize, elicit and/or decompose complex business requirements
• Demonstrated ability to perceive patterns and relationships in data
• Demonstrated ability to design models which integrate data from disparate sources
• Demonstrated ability to design data models for complex and ragged hierarchies
• Demonstrated ability to design data models for complicated historical perspectives
• Understanding of data warehousing and decision support tools and techniques
• Strong ability to multi-task with several concurrent issues and projects
• Demonstrated ability to interact effectively with all levels of the organization
• Strong interpersonal, written and verbal communication skills
• Experience advising or mentoring staff regarding data administration, data modeling, and data mapping
Desired
• Experience creating data models for BigQuery, SQL Server, Oracle and MySQL databases
• Experience in Healthcare Insurance or a related field strongly preferred
Data Engineer
Data engineer job in Philadelphia, PA
Data Engineer - Job Opportunity
Full time Permanent
Remote - East coast only
Please note this role is open to US citizens or Green Card holders only
We're looking for a Data Engineer to help build and enhance scalable data systems that power analytics, reporting, and business decision-making. This role is ideal for someone who enjoys solving complex technical challenges, optimizing data workflows, and collaborating across teams to deliver reliable, high-quality data solutions.
What You'll Do
Develop and maintain scalable data infrastructure, cloud-native workflows, and ETL/ELT pipelines supporting analytics and operational workloads.
Transform, model, and organize data from multiple sources to enable accurate reporting and data-driven insights.
Improve data quality and system performance by identifying issues, optimizing architecture, and enhancing reliability and scalability.
Monitor pipelines, troubleshoot discrepancies, and resolve data or platform issues, including participating in on-call support when needed.
Prototype analytical tools, automation solutions, and algorithms to support complex analysis and drive operational efficiency.
Collaborate closely with BI, Finance, and cross-functional teams to deliver robust and scalable data products.
Create and maintain clear, detailed documentation (configurations, specifications, test scripts, and project tracking).
Contribute to Agile development processes, engineering excellence, and continuous improvement initiatives.
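The orchestration tools named below (e.g., Airflow) fundamentally run pipeline tasks in dependency order. A minimal sketch of that idea using only the standard library; task names are illustrative, not from any real pipeline:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each task maps to the set of tasks it depends on --
# the same shape as an Airflow DAG's upstream dependencies.
pipeline = {
    "load_warehouse": {"transform"},
    "transform": {"extract_api", "extract_db"},
    "extract_api": set(),
    "extract_db": set(),
}

# A valid execution order: every task runs after its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and parallel execution of ready tasks on top of this ordering, but the dependency graph is the core abstraction.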
What You Bring
Bachelor's degree in Computer Science or a related technical field.
2-4 years of hands-on SQL experience (Oracle, PostgreSQL, etc.).
2-4 years of experience with Java or Groovy.
2+ years working with orchestration and ingestion tools (e.g., Airflow, Airbyte).
2+ years integrating with APIs (SOAP, REST).
Experience with cloud data warehouses and modern ELT/ETL frameworks (e.g., Snowflake, Redshift, dbt) is a plus.
Comfortable working in an Agile environment.
Practical knowledge of version control and CI/CD workflows.
Experience with automation, including unit and integration testing.
Understanding of cloud storage solutions (e.g., S3, Blob Storage, Object Store).
Proactive mindset with strong analytical, logical-thinking, and consultative skills.
Ability to reason about design decisions and understand their broader technical impact.
Strong collaboration, adaptability, and prioritization abilities.
Excellent problem-solving and troubleshooting skills.
Data Modeler
Data engineer job in Philadelphia, PA
Philadelphia, PA
Hybrid / Remote
Brooksource is seeking an experienced Data Modeler to support an enterprise data warehousing team responsible for designing and implementing information solutions across large operational and analytical systems. You'll work closely with data stewards, architects, and DBAs to understand business needs and translate them into high-quality logical and physical data models that align with enterprise standards.
Key Responsibilities
Build and maintain logical and physical data models for the Active Enterprise Data Warehouse (AEDW), operational systems, and data exchange processes.
Collaborate with data stewards and architects to capture and refine business requirements and translate them into scalable data structures.
Ensure physical models accurately implement approved logical models.
Partner with DBAs on schema design, change management, and database optimization.
Assess and improve existing data structures for performance, consistency, and scalability.
Document data definitions, lineage, relationships, and standards using ERwin or similar tools.
Participate in design reviews, data governance work, and data quality initiatives.
Support impact analysis for enhancements, new development, and production changes.
Adhere to enterprise modeling standards, naming conventions, and best practices.
Deliver high-quality modeling artifacts with minimal supervision.
Required Skills & Experience
5+ years as a Data Modeler, Data Architect, or similar role.
Strong expertise with ERwin or other modeling tools.
Experience supporting EDW, ODS, or large analytics environments.
Proficiency developing conceptual, logical, and physical data models.
Strong understanding of relational design, dimensional modeling, and normalization.
Hands-on experience with Oracle, SQL Server, PostgreSQL, or comparable databases.
Ability to translate complex business requirements into clear technical solutions.
Familiarity with data governance, metadata management, and data quality concepts.
Strong communication skills and ability to collaborate across technical and business teams.
Preferred Skills
Experience in healthcare or insurance data environments.
Understanding of ETL/ELT concepts and how data models impact integration workflows.
Exposure to cloud data platforms (AWS, Azure, GCP) or modern modeling approaches.
Knowledge of enterprise architecture concepts.
About the Team
You'll join a collaborative, fast-moving data warehousing team focused on building reliable, scalable information systems that support enterprise decision-making. This role is key in aligning business needs with the data structures that power core operations and analytics.
Data Engineer
Data engineer job in Hamilton, NJ
Key Responsibilities:
Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
Manage FTP/SFTP file transfers between internal systems and external vendors.
Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
Required Skills & Experience:
10+ years of experience in data engineering or production support within financial services or trading environments.
Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
Strong Python and SQL programming skills.
Experience with Bloomberg data feeds (B-PIPE, TSIP, SFTP).
Experience with Git, CI/CD pipelines, and Azure DevOps.
Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
Excellent communication, problem-solving, and stakeholder management skills.
AWS Data engineer with Databricks || USC Only || W2 Only
Data engineer job in Princeton, NJ
AWS Data Engineer with Databricks
Princeton, NJ - Hybrid - Local or nearby candidates only
Duration: Long Term
This position is available only to U.S. citizens.
Key Responsibilities
Design and implement ETL/ELT pipelines with Databricks, Apache Spark, AWS Glue, S3, Redshift, and EMR for processing large-scale structured and unstructured data.
Optimize data flows, monitor performance, and troubleshoot issues to maintain reliability and scalability.
Collaborate on data modeling, governance, security, and integration with tools like Airflow or Step Functions.
Document processes and mentor junior team members on best practices.
Required Qualifications
Bachelor's degree in Computer Science, Engineering, or related field.
5+ years of data engineering experience, with strong proficiency in Databricks, Spark, Python, SQL, and AWS services (S3, Glue, Redshift, Lambda).
Familiarity with big data tools like Kafka, Hadoop, and data warehousing concepts.
Data Analytics Engineer
Data engineer job in Somerset, NJ
Client: manufacturing company
Type: direct hire
Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets.
This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams.
This role is on-site five days per week in Somerset, NJ.
Key Responsibilities
Power BI Reporting & Administration
Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
Develop and maintain data models to ensure accuracy, consistency, and reliability
Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
Optimize Power BI solutions for performance, scalability, and ease of use
ETL & Data Warehousing
Design and maintain data warehouse structures, including schema and database layouts
Develop and support ETL processes to ensure timely and accurate data ingestion
Integrate data from multiple systems while ensuring quality, consistency, and completeness
Work closely with database administrators to optimize data warehouse performance
Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed
Training & Documentation
Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
Manage data definitions, lineage documentation, and data cataloging for all enterprise data models
Project Management
Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
Collaborate with key business stakeholders to ensure departmental reporting needs are met
Record meeting notes in Confluence and document project updates in Jira
Data Governance
Implement and enforce data governance policies to ensure data quality, compliance, and security
Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness
Routine IT Functions
Resolve Help Desk tickets related to reporting, dashboards, and BI tools
Support general software and hardware installations when needed
Other Responsibilities
Manage email and phone communication professionally and promptly
Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
Perform additional assigned duties as needed
Qualifications
Required
Minimum of 3 years of relevant experience
Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
Experience with cloud-based BI environments (Azure, AWS, etc.)
Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
Proficiency in SQL for data extraction, manipulation, and transformation
Strong knowledge of DAX
Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
Strong analytical, problem-solving, and documentation skills
Excellent written and verbal communication abilities
High attention to detail and strong self-review practices
Effective time management and organizational skills; ability to prioritize workload
Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
Data Architect
Data engineer job in Piscataway, NJ
Data Architecture & Modeling
Design and maintain enterprise-level logical, conceptual, and physical data models.
Define data standards, naming conventions, metadata structures, and modeling best practices.
Ensure scalability, performance, and alignment of data models with business requirements.
Data Governance & Quality
Implement and enforce data governance principles and policies.
Define data ownership, stewardship, data lineage, and lifecycle management.
Lead initiatives to improve data quality, consistency, and compliance.
Enterprise Data Management
Develop enterprise data strategies, including data integration, master data management (MDM), and reference data frameworks.
Define and oversee the enterprise data architecture blueprint.
Ensure alignment between business vision and data technology roadmaps.
Senior Data Engineer - MDM
Data engineer job in Iselin, NJ
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our challenge:
We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team. The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector. You will be responsible for architecting robust data platforms, evaluating MDM tools, and aligning data strategies to meet business needs.
Additional Information
The base salary for this position will vary based on geography and other factors. In accordance with the law, the base salary for this role if filled within Iselin, NJ is $135K to $150K/year & benefits (see below).
Key Responsibilities:
Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM, etc.) and provide objective recommendations aligned with business requirements.
Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
Implement data integration pipelines leveraging modern data engineering tools and practices.
Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer.
Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies.
Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services.
Ensure compliance with data governance, data privacy, and security standards.
Support CI/CD pipelines for continuous integration and deployment of data solutions.
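At the heart of the MDM work described above is record matching and survivorship: collapsing duplicate records from multiple sources into one golden record. A toy sketch of the pattern; the matching rule (exact email match) and field-precedence rules are invented for illustration, and real MDM platforms use far richer fuzzy matching:

```python
# Duplicate customer records from two source systems.
records = [
    {"source": "crm",     "email": "a@x.com", "name": "A. Smith",    "phone": None},
    {"source": "billing", "email": "a@x.com", "name": "Alice Smith", "phone": "555-0100"},
    {"source": "crm",     "email": "b@y.com", "name": "Bob Jones",   "phone": None},
]

def golden_records(records, match_key="email"):
    merged = {}
    for rec in records:
        survivor = merged.setdefault(rec[match_key], {})
        for field, value in rec.items():
            if value is None:
                continue  # never let a null overwrite a known value
            if field == "name":
                # Survivorship rule: the most complete (longest) name wins.
                if len(value) > len(survivor.get("name", "")):
                    survivor["name"] = value
            elif field not in survivor:
                survivor[field] = value  # otherwise, first non-null value wins
    return merged

masters = golden_records(records)
print(masters["a@x.com"])
```

Platform evaluations like those described above largely come down to how each tool expresses these match rules, survivorship policies, and stewardship workflows at scale.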
Qualifications:
12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
Strong functional knowledge of reference data sources and domain-specific data standards.
Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
Familiarity with CI/CD practices, tools, and automation pipelines.
Ability to work collaboratively across teams to deliver complex data solutions.
Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).
Preferred Skills:
Familiarity with financial data models and regulatory requirements.
Experience with Azure cloud platforms
Knowledge of data governance, data quality frameworks, and metadata management.
We offer:
A highly competitive compensation and benefits package
A multinational organization with 58 offices in 21 countries and the possibility to work abroad
10 days of paid annual leave (plus sick leave and national holidays)
Maternity & Paternity leave plans
A comprehensive insurance plan including: medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region)
Retirement savings plans
A higher education certification policy
Commuter benefits (varies by region)
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Center of Excellences (CoE) groups
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms
A flat and approachable organization
A truly diverse, fun-loving and global work culture
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Senior Data Architect
Data engineer job in Edison, NJ
Act as an Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
Troubleshoot complex data and performance issues and propose long-term architectural solutions.
Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.
About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
Guidewire DevOps Engineer (with experience in Environment Strategy Build)
Data engineer job in Princeton, NJ
About Us:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700+ clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
Job Title: Guidewire DevOps Engineer (with experience in Environment Strategy Build)
Work Location: Princeton, NJ (Hybrid - Onsite)
Required Experience: 7+ Years
Role Overview:
HSB is seeking a DevOps Engineer to lead the design, build, and ongoing management of Guidewire environments for the Cyber Admitted PolicyCenter program.
This role will define a repeatable and scalable environment strategy, ensuring that environments across development, SIT, UAT, and production are built and maintained with consistency, automation, and traceability.
The ideal candidate will have strong experience in Guidewire infrastructure setup, Azure DevOps pipelines, and infrastructure automation, and will be comfortable operating in a complex, multi-stream program environment involving PolicyCenter, Digital Portal, SmartCOMM, and Integration components.
Key Responsibilities
Environment Strategy & Planning
Define an end-to-end environment strategy and roadmap, including topology, refresh cadence, and configuration management across the program
Establish environment standards and documentation to ensure consistency across all Guidewire instances
Recommend and implement an iterative build approach that allows environments to evolve progressively as functionality matures
Environment Build & Configuration
Lead the provisioning, configuration, and validation of Guidewire environments (Dev, SIT, UAT, Pre-Prod, Prod)
Develop and maintain automation pipelines or scripts (e.g., Azure DevOps, Terraform, PowerShell) to standardize environment setup and deployment processes
Collaborate with infrastructure and database teams on environment readiness, configuration, and SQL Server remediation or upgrades
Maintain alignment between PolicyCenter, Digital, and SmartCOMM environments for integrated end-to-end testing
Environment Maintenance & Operations
Manage environment lifecycle activities, including data refreshes, DB drops, configuration synchronization, and release preparation
Monitor and maintain environment health, ensuring stability and readiness for testing and release activities
Create and maintain environment status dashboards or tracking mechanisms within Azure DevOps for transparency and reporting
Access & Security Enablement
Partner with internal IT and security teams to implement secure access models and environment-level permissions
Manage service accounts, secrets, and credentials used across environments in alignment with enterprise security standards
Cross-Stream Coordination
Coordinate with the PolicyCenter, Digital Portal, SmartCOMM, and Integration workstreams to ensure consistent environment dependencies and deployment sequencing
Support integration testing by ensuring endpoint configurations and API connectivity are aligned across systems
Participate in environment planning sessions, readiness reviews, and go-live preparations
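Keeping many environments consistent, as described above, usually starts with detecting configuration drift against a baseline. A minimal sketch of that check; the configuration keys, versions, and environment names below are invented for illustration:

```python
# Baseline ("golden") configuration every environment should match.
baseline = {"sql_server": "2019", "pc_version": "10.2", "tls": "1.2"}

# Recorded configuration per environment (e.g., pulled by a pipeline).
environments = {
    "SIT": {"sql_server": "2019", "pc_version": "10.2", "tls": "1.2"},
    "UAT": {"sql_server": "2016", "pc_version": "10.2", "tls": "1.2"},
}

def drift(baseline, env):
    # Report every key whose value differs as (expected, actual).
    keys = set(baseline) | set(env)
    return {k: (baseline.get(k), env.get(k))
            for k in keys if baseline.get(k) != env.get(k)}

report = {name: drift(baseline, cfg) for name, cfg in environments.items()}
print(report)  # UAT shows sql_server drift; SIT is clean
```

In practice this comparison would run inside an Azure DevOps pipeline against configuration exported by Terraform or PowerShell, feeding the environment-status dashboards mentioned above.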
Skills & Experience:
5+ years of DevOps or CloudOps experience in enterprise software or insurance platforms
3+ years of experience supporting Guidewire applications (PolicyCenter, BillingCenter, or ClaimCenter), preferably on cloud or hybrid infrastructure
Strong working knowledge of Azure DevOps (ADO) pipelines and automation for environment deployment
Proficiency with infrastructure-as-code tools (e.g., Terraform, ARM templates, PowerShell)
Experience managing SQL Server environments, data refreshes, and system configuration
Familiarity with environment topology design, CI/CD pipelines, and multi-tier application environments
Excellent collaboration and documentation skills across technical and non-technical teams
Success Metrics:
Standardized environment build and refresh process implemented across all Guidewire environments
Lead time for new environment setup reduced through automation and reusability
All environments traceable, version-controlled, and integrated with Azure DevOps
Clear visibility into environment readiness for SIT, UAT, and production through dashboards or reports
Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation like an annual performance-based bonus, sales incentive pay and other forms of bonus or variable compensation.
Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Safe return to office:
In order to comply with LTIMindtree' s company COVID-19 vaccine mandate, candidates must be able to provide proof of full vaccination against COVID-19 before or by the date of hire. Alternatively, one may submit a request for reasonable accommodation from LTIMindtree's COVID-19 vaccination mandate for approval, in accordance with applicable state and federal law, by the date of hire. Any request is subject to review through LTIMindtree's applicable processes.
Java Software Engineer
Data engineer job in East Windsor, NJ
Job Responsibilities:
Develop applications using Java 8/JEE (and higher), Angular 2+, React.js, SQL, Spring, HTML5, CSS, JavaScript, and TypeScript, among other tools.
Write scalable, secure, and maintainable code that powers our clients' platforms.
Create, deploy, and maintain automated system tests.
Work with Testers to understand defects and resolve them in a timely manner.
Support continuous improvement by investigating alternatives and technologies, and presenting these for architectural review.
Collaborate effectively with other team members to accomplish shared user story and sprint goals.
Requirement:
Experience in programming languages: Java and JavaScript
Decent understanding of the software development life cycle
Basic programming skills using object-oriented programming (OOP) languages, with in-depth knowledge of common APIs and data structures like Collections, Maps, Lists, Sets, etc.
Knowledge of relational databases (e.g., SQL Server, Oracle) and basic SQL query language skills
Preferred Qualifications:
Master's Degree in Computer Science (CS)
0-1 year of practical experience in Java coding
Experience using Spring, Maven, Angular frameworks, HTML, and CSS
Knowledge of other contemporary Java technologies (e.g., WebLogic, RabbitMQ, Tomcat)
Familiarity with JSP, J2EE, and JDBC
Senior Dotnet Developer
Data engineer job in Ewing, NJ
The Software Engineer III is a full-stack engineer with strong Node.js and functional programming (Scala & Ruby) skills. Our applications leverage Node.js, Bootstrap, jQuery, MongoDB, Elasticsearch, Redis, and React.js to deliver a delightful interactive experience on the web. Our applications run in the AWS cloud environment. We use Agile methodologies to enable our engineering team to work closely with partners and with our design & product teams. This role is full time and preferably located long-term in the New York City or southern New Jersey areas.
Essential Job Duties and Responsibilities include:
Design, develop, and maintain modern web applications and UIs using .NET technologies such as C#, ASP.NET MVC, ASP.NET Core, Razor Pages, and Blazor.
Create clean, maintainable, and well-documented code following industry best practices and coding standards.
Develop and consume RESTful APIs and web services.
Build responsive and accessible user interfaces using HTML, CSS, JavaScript, and UI libraries/frameworks such as React, Angular, Vue.js, or Bootstrap.
Work with relational and NoSQL databases (e.g., SQL Server, MongoDB) and object-relational mappers (ORMs) such as Entity Framework Core.
Conduct unit and integration testing to validate functionality and ensure high-quality deliverables.
Participate in peer code reviews and provide constructive feedback to ensure continuous improvement and knowledge sharing.
Identify, troubleshoot, and resolve complex technical issues in development and production environments.
Collaborate with cross-functional teams throughout the software development lifecycle.
Stay current with emerging .NET technologies and trends.
May mentor and support junior developers in their technical growth and day-to-day work.
Maintain regular and punctual attendance.
Preferred Qualifications:
Experience with CI/CD pipelines and DevOps practices.
Familiarity with cloud platforms (e.g., Azure, AWS) and deploying .NET applications in cloud environments.
Knowledge of Blazor for interactive web UIs using C# instead of JavaScript
Education and/or Experience:
7+ years of professional software development experience with a strong focus on web and UI development in the .NET ecosystem.
Advanced proficiency in C#, ASP.NET, ASP.NET Core, and MVC frameworks; experience with VBScript is a plus.
Deep understanding of object-oriented programming (OOP) and design patterns.
Strong front-end development skills, including HTML, CSS, JavaScript, and at least one modern UI framework (React, Angular, Vue.js, etc.).
Proven experience developing and integrating RESTful APIs.
Hands-on experience with SQL Server and/or NoSQL databases; proficient in using Entity Framework Core or similar ORMs.
Familiarity with version control systems such as Git.
Solid grasp of Agile/Scrum development methodologies.
Excellent problem-solving abilities and strong attention to detail.
Effective communication and interpersonal skills with the ability to work independently and within a team.
Sr. C++ FIX or Market Data Developer
Data engineer job in Princeton, NJ
Looking for a highly motivated C++ Trading Systems Developer with demonstrated experience in designing, developing, and delivering core production software solutions in a mission-critical trading systems environment.
Major responsibilities include:
Assessing business and systems requirements and developing functional specifications
Designing and developing high-quality, high-performance trading systems software written in C++ to meet deliverable timelines and requirements
Adhering to the software development life cycle process/methodology
Building business-level subject matter expertise in trading systems functionality and processing
Providing second-level support for production on an ad hoc basis when necessary
Location: Princeton, NJ
Organizational Structure: The developer will be an integral part of a core development team and report to the Trading System Development management team.
Qualifications:
Full software development life cycle experience in a mission-critical trading systems environment a must… Options, Equities, Futures, etc.
Must possess excellent software design skills and knowledge of advanced data structures
Must have exceptionally strong C++ knowledge and debugging skills in a Linux environment
Solid knowledge of object-oriented programming concepts a must
Strong knowledge of TCP/IP multicast and socket programming required
Knowledge of the Boost libraries and STL required
Must have experience in developing real-time applications in a distributed processing architecture
Must have excellent organizational and communication skills
Must be able to work effectively in a team environment
Strong knowledge of the logical business domain in Options or Equities trading systems a big plus
Experience coding interface solutions for FIX, OPRA, CTA, or UTP a big plus
Knowledge of scripting languages such as Python, Shell, and Perl a plus
Education and Experience:
Minimum of a Bachelor's degree or equivalent in IT/Computer Science
7+ years of experience in C++ development
5+ years of demonstrated experience in delivering software solutions in a trading systems environment for an Exchange or a Wall Street firm
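The multicast and socket-programming requirement can be sketched briefly. This illustration uses Python rather than C++ for compactness, and the group address and port are placeholders, not a real market-data feed:

```python
import socket
import struct

MCAST_GROUP = "239.1.1.1"   # placeholder group in the administratively scoped range
MCAST_PORT = 5007           # placeholder port, not a real feed

def group_membership_request(group: str) -> bytes:
    """Pack a struct ip_mreq: 4-byte group address + INADDR_ANY interface."""
    return struct.pack("=4sl", socket.inet_aton(group), socket.INADDR_ANY)

def make_multicast_receiver(group: str, port: int) -> socket.socket:
    """Create a UDP socket bound to `port` and joined to multicast `group`."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # accept datagrams for this port on all interfaces
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    group_membership_request(group))
    return sock

# Usage (blocks until a datagram arrives on the feed):
#   sock = make_multicast_receiver(MCAST_GROUP, MCAST_PORT)
#   data, sender = sock.recvfrom(65535)
```

Production market-data handlers layer gap detection, arbitration across A/B feeds, and zero-copy decoding on top of this basic subscription step.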
Java Software Engineer
Data engineer job in Iselin, NJ
Job Information:
Functional Title - Assistant Vice President, Java Software Development Engineer
Department - Technology
Corporate Level - Assistant Vice President
Report to - Director, Application Development
Expected full-time salary range between $125,000 - $145,000 + variable compensation + 401(k) match + benefits
Job Description:
This position is with CLS Technology. The primary responsibilities of the job will be
(a) Hands-on software application development
(b) Level 3 support
Duties, Responsibilities, and Deliverables:
Develop scalable, robust applications utilizing appropriate design patterns, algorithms and Java frameworks
Collaborate with Business Analysts, Application Architects, Developers, QA, Engineering, and Technology Vendor teams for design, development, testing, maintenance and support
Adhere to the CLS SDLC process and governance requirements and ensure full compliance with them
Plan, implement and ensure that delivery milestones are met
Provide solutions using design patterns, common techniques, and industry best practices that meet the typical challenges/requirements of a financial application including usability, performance, security, resiliency, and compatibility
Proactively recognize system deficiencies and implement effective solutions
Participate in, contribute to, and assimilate changes, enhancements, requirements (functional and non-functional), and requirements traceability
Apply significant knowledge of industry trends and developments to improve CLS in-house practices and services
Provide Level-3 support. Provide application knowledge and training to Level-2 support teams
Experience Requirements:
5+ years of hands-on application development and testing experience with proficient knowledge of core Java and JEE technologies such as JDBC and JAXB, Java/Web technologies
Knowledge of Python, Perl, Unix shell scripting is a plus
Expert hands-on experience with SQL and with at least one DBMS such as IBM DB2 (preferred) or Oracle is a strong plus
Expert knowledge of and experience in securing web applications, secure coding practices
Hands-on knowledge of application resiliency, performance tuning, technology risk management is a strong plus
Hands-on knowledge of messaging middleware such as IBM MQ (preferred) or TIBCO EMS, and application servers such as WebSphere, or WebLogic
Knowledge of SWIFT messaging, payments processing, FX business domain is a plus
Hands-on knowledge of CI/CD practices and DevOps toolsets such as Jira, Git, Ant, Maven, Jenkins, Bamboo, Confluence, and ServiceNow.
Hands-on knowledge of MS Office toolset including MS-Excel, MS-Word, PowerPoint, and Visio
Proven track record of successful application delivery to production and effective Level-3 support.
Success factors: In addition, the person selected for the job will
Have strong analytical, written and oral communication skills with a high self-motivation factor
Possess excellent organization skills to manage multiple tasks in parallel
Be a team player
Have the ability to work on complex projects with globally distributed teams and manage tight delivery timelines
Have the ability to smoothly handle high stress application development and support environments
Strive continuously to improve stakeholder management for end-to-end application delivery and support
Qualification Requirements:
Bachelor's Degree
Minimum 5 years of experience in Information Technology
SRE/DevOps w/ HashiCorp & Clojure Exp
Data engineer job in Philadelphia, PA
Locals Only! SRE/DevOps w/ HashiCorp & Clojure Exp | Philadelphia, PA: 100% Onsite! | 12+ Months
MUST: HashiCorp, Clojure
Role: Lead SRE initiatives, automating and monitoring cloud infrastructure to ensure reliable, scalable, and secure systems for eCommerce.
Required: Must Have:
AWS, Terraform, HashiCorp Stack (Nomad, Vault, Consul)
Programming in Python/Clojure
Automation, monitoring, and log centralization (Splunk)
Experience leading large-scale cloud infrastructure
Dexian stands at the forefront of Talent + Technology solutions with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support.
Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Identity and Access Management - Software Engineering Lead
Data engineer job in Philadelphia, PA
About the role - As Engineering Lead for NeoID, Elsevier's next-generation Identity and Access Management (IAM) platform, you'll leverage your deep security expertise to architect, build, and evolve the authentication and authorization backbone for Elsevier's global products. You'll also lead and manage a team of 5 engineers, fostering their growth and ensuring delivery excellence. You'll have the opportunity to work with industry-standard protocols such as OAuth2, OIDC, and SAML, as well as healthcare's SMART on FHIR and EHR integrations.
About the team - This team is entrusted with building Elsevier's next-generation Identity and Access Management (IAM) platform. This diverse team of engineers is also building and evolving the authentication and authorization backbone for Elsevier's global products. The team is building a brand-new product in Cyber Security that will provide Authorization and Authentication for ALL Elsevier products.
Qualifications
Current and extensive experience with at least one major IAM platform (KeyCloak, Auth0, Okta, or similar) - KeyCloak and Auth0 experience are strong pluses. Only candidates with this experience will be considered for this critical role.
Possess an in-depth security mindset, with proven experience designing and implementing secure authentication and authorization systems
Have an extensive understanding of OAuth2, OIDC and SAML protocols, including relevant RFCs and enterprise/server-side implementations
Familiarity with healthcare identity protocols, including SMART on FHIR and EHR integrations
Have current hands-on experience with AWS cloud services and infrastructure management. Proficiency in Infrastructure as Code (IaC) tools, especially Terraform
Strong networking skills, including network security, protocols, and troubleshooting
Familiarity with software development methodologies (Agile, Waterfall, etc.)
Current experience as a people manager, ideally of Software and Security professionals.
Experience with Java/J2EE, JavaScript, and related technologies, or willingness to learn and deepen expertise
Knowledge of data modeling, optimization, and secure data handling best practices
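For candidates weighing the OAuth2/OIDC depth this role expects, a minimal sketch of building an authorization-code request is shown below. The endpoint, client ID, and redirect URI are hypothetical placeholders, not Elsevier or NeoID values:

```python
import secrets
from urllib.parse import urlencode

# Hypothetical registration values; real ones come from the identity
# provider's OIDC discovery document and client registration.
AUTHORIZE_ENDPOINT = "https://idp.example.com/oauth2/authorize"
CLIENT_ID = "neoid-demo-client"
REDIRECT_URI = "https://app.example.com/callback"

def build_authorization_url(scopes: list[str]) -> tuple[str, str]:
    """Build an OAuth2 authorization-code request URL.

    Including the 'openid' scope turns the plain OAuth2 flow into an
    OIDC authentication request. The random `state` value is stored by
    the client and checked on the callback to block CSRF.
    """
    state = secrets.token_urlsafe(16)
    params = {
        "response_type": "code",      # authorization-code grant
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": " ".join(scopes),
        "state": state,
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}", state

url, state = build_authorization_url(["openid", "profile"])
```

A full implementation would add PKCE (`code_challenge`) and a nonce for the ID token; the sketch only shows the request shape the RFCs define.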
Accountabilities
Leading the design and implementation of secure, scalable IAM solutions, with a focus on OAuth2/OIDC and healthcare protocols such as SMART on FHIR and EHR integrations
Managing, mentoring and supporting a team of 5 engineers, fostering a culture of security, innovation, and technical excellence
Collaborating with product managers and stakeholders to define requirements and strategic direction for the platform, including healthcare and life sciences use cases
Writing and reviewing code, performing code reviews, and ensuring adherence to security and engineering best practices
Troubleshooting and resolving complex technical issues, providing expert guidance on IAM, security, and healthcare protocol topics
Contributing to architectural decisions and long-term platform strategy
Staying current with industry trends, emerging technologies, and evolving security threats in the IAM and healthcare space
Senior Software Developer
Data engineer job in Horsham, PA
About Us
TherapyNotes is the go-to superhero for behavioral health Practice Management and EHR software! Our top-notch SaaS solution handles scheduling, billing, documenting, telehealth, and more so clinicians can focus on awesome patient care.
We're a dynamic team of pros who love to innovate and push the envelope, keeping our software cutting-edge. Join us, and let's revolutionize behavioral health software together while making a real difference!
About The Job
TherapyNotes is seeking a Senior Software Developer to join our growing team. We are looking for a passionate and experienced engineer skilled in building scalable and responsive web applications and services using Angular and ASP.NET Core. The ideal candidate will have demonstrated expertise in implementing robust APIs using event-based software design and adhering to Service-Oriented Architecture (SOA) principles. They should excel in a collaborative environment and have a proven track record of mentoring and developing others.
What You'll Do
Perform full-stack development including front end, business logic, and data access layers.
Responsible for the entire development lifecycle from planning to release and support
Actively contribute to software architecture decisions, design strategies, and code reviews to ensure high-quality, scalable, and maintainable solutions
Collaborate closely with development team members and stakeholders
Mentor and assist in the training and onboarding of new developers
Maintain high standards, attention to detail, accuracy and completeness
What We're Looking For
5 or more years of experience developing software in an Agile, team-based environment
3 or more years of experience developing responsive web applications
BS and/or MS in a technical discipline (Computer Science or Software Engineering required)
Strong understanding of OOP concepts and design patterns
Expertise with Angular, ASP.NET Core, C#, JavaScript, TypeScript, CSS, SASS, and HTML
Expertise in building robust APIs and adhering to Service-Oriented Architecture (SOA) principles
Experience in event-based software design and event-driven architecture
Experience with PostgreSQL or other relational databases, and Entity Framework Core or similar object-relational mapping frameworks
Excellent problem solving and communication skills
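The event-based design requirement above can be sketched as a minimal in-process publish/subscribe bus. This is in Python for brevity; the event name and handler are illustrative, not TherapyNotes APIs:

```python
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    """Minimal in-process publish/subscribe bus."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: Any) -> None:
        # Publisher knows only the event name, never the subscribers.
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
audit_log: list[str] = []
bus.subscribe("note.signed", lambda note_id: audit_log.append(f"signed:{note_id}"))
bus.publish("note.signed", "N-42")
```

A production event-driven system would add asynchronous dispatch, durable delivery, and retries; the point here is the decoupling between the publisher and its subscribers that SOA-style services rely on.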
What We Offer
Competitive salary $110,000 - $135,000
Employer sponsored health, dental, vision, life, and disability insurance
Retirement plan with company contribution
Annual company profit sharing
Personal development/training budget
Open, collaborative work environment
Extensive 2-week onboarding plan
Comprehensive mentorship program
Equal Opportunity Employer Statement & Applicant Rights
TherapyNotes LLC is an Equal Opportunity Employer and does not discriminate based on race, color, religion, sex, national origin, age, disability, genetic information, or any other protected status under federal, state, or local law. We are committed to providing a workplace free of discrimination and harassment. For more information about your rights under federal employment laws, please review the following-
Know Your Rights- Workplace Discrimination is Illegal
Family and Medical Leave Act (FMLA)- Employee Rights Under FMLA
If you require a reasonable accommodation during the application process, please contact *******************************.
12/3/2025
Senior Data Engineer
Data engineer job in Philadelphia, PA
Full-time Perm
Remote - EAST COAST ONLY
Role open to US Citizens and Green Card Holders only
We're looking for a Senior Data Engineer to lead the design, build, and optimization of modern data pipelines and cloud-native data infrastructure. This role is ideal for someone who thrives on solving complex data challenges, improving systems at scale, and collaborating across technical and business teams to deliver high-impact solutions.
What You'll Do
Architect, develop, and maintain scalable, secure data infrastructure supporting analytics, reporting, and operational workflows.
Design and optimize ETL/ELT pipelines to integrate data from diverse internal and external sources.
Prepare and transform structured and unstructured data to support modeling, reporting, and advanced analysis.
Improve data quality, reliability, and performance across platforms and workflows.
Monitor pipelines, troubleshoot discrepancies, and ensure accuracy and timely data delivery.
Identify architectural bottlenecks and drive long-term scalability improvements.
Collaborate with Product, BI, Finance, and engineering teams to build end-to-end data solutions.
Prototype algorithms, transformations, and automation tools to accelerate insights.
Lead cloud-native workflow design, including logging, monitoring, and storage best practices.
Create and maintain high-quality technical documentation.
Contribute to Agile ceremonies, engineering best practices, and continuous improvement initiatives.
Mentor teammates and guide adoption of data platform tools and patterns.
Participate in on-call rotation to maintain platform stability and availability.
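As a hedged illustration of the transform step in the ETL/ELT pipelines described above (the field names and quality rules are invented for the example, not this employer's schema):

```python
from datetime import datetime, timezone

def transform(rows: list[dict]) -> list[dict]:
    """Normalize raw records: drop rows missing a key, coerce types,
    and stamp the load time -- the 'T' of a minimal ETL step."""
    out = []
    for row in rows:
        if not row.get("id"):
            continue  # quality gate: reject records without a key
        out.append({
            "id": str(row["id"]),
            "amount": round(float(row.get("amount", 0)), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return out

# Two raw records from a hypothetical source; the second fails the quality gate.
raw = [{"id": 101, "amount": "19.994"}, {"amount": "5"}]
clean = transform(raw)
```

In practice this logic would live in a DBT model or Spark job with the rejected rows routed to a quarantine table for monitoring, which is where the pipeline-accuracy responsibilities above come in.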
What You Bring
Bachelor's degree in Computer Science or related technical field.
4+ years of advanced SQL experience (Oracle, PostgreSQL, etc.).
4+ years working with Java or Groovy.
3+ years integrating with SOAP or REST APIs.
2+ years with DBT and data modeling.
Strong understanding of modern data architectures, distributed systems, and performance optimization.
Experience with Snowflake or similar cloud data platforms (preferred).
Hands-on experience with Git, Jenkins, CI/CD, and automation/testing practices.
Solid grasp of cloud concepts and cloud-native engineering.
Excellent problem-solving, communication, and cross-team collaboration skills.
Ability to lead projects, own solutions end-to-end, and influence technical direction.
Proactive mindset with strong analytical and consultative abilities.