Senior Data Engineer
Data engineer job in Iselin, NJ
Sr. Data Engineer (Snowflake, Databricks, Python, PySpark, SQL and Banking)
Iselin, NJ (Need local profiles only)
In-Person interview will be required.
Overall 11+ years of experience; recent banking domain experience required.
Only W2 and visa-independent candidates
Job Description:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team.
Responsibilities:
Understand technical specifications
Discuss business requirements with business analysts and business users
Python/SQL Server/Snowflake/Databricks application development and system design
Develop and maintain data models and schemas to support data integration and analysis.
Implement data quality and validation checks to ensure accuracy and consistency of data.
Execute unit testing (UT) and system integration testing (SIT) with business analysts to ensure high-quality testing
Support for UAT with business users
Production support and maintenance of application platform
Qualifications:
General:
12+ years of IT industry experience
Agile methodology and SDLC processes
Design and Architecture experience
Experience working in global delivery model (onshore/offshore/nearshore)
Strong problem-solving and analytical skills
Self-starter and collaborative team player who works with minimal guidance
Strong communication skills
Technical Skills:
Mandatory (Strong) -
Python, SQL Server and relational database concepts, Azure Databricks, Snowflake, schedulers (Autosys/Control-M), ETL, CI/CD
Plus:
PySpark
Financial systems/capital markets/credit risk/regulatory application development experience
Data Engineer
Data engineer job in Hamilton, NJ
Key Responsibilities:
Manage and support batch processes and data pipelines in Azure Databricks and Azure Data Factory.
Integrate and process Bloomberg market data feeds and files into trading or analytics platforms.
Monitor, troubleshoot, and resolve data and system issues related to trading applications and market data ingestion.
Develop, automate, and optimize ETL pipelines using Python, Spark, and SQL.
Manage FTP/SFTP file transfers between internal systems and external vendors.
Ensure data quality, completeness, and timeliness for downstream trading and reporting systems.
Collaborate with operations, application support, and infrastructure teams to resolve incidents and enhance data workflows.
Required Skills & Experience:
10+ years of experience in data engineering or production support within financial services or trading environments.
Hands-on experience with Azure Databricks, Azure Data Factory, Azure Storage, Logic Apps, and Fabric.
Strong Python and SQL programming skills.
Experience with Bloomberg data feeds (BPIPE, TSIP, SFTP).
Experience with Git, CI/CD pipelines, and Azure DevOps.
Proven ability to support batch jobs, troubleshoot failures, and manage job scheduling.
Experience handling FTP/SFTP file transfers and automation (e.g., using scripts or managed file transfer tools).
Solid understanding of equities trading, fixed income trading, trading workflows, and financial instruments.
Excellent communication, problem-solving, and stakeholder management skills.
Data Architect
Data engineer job in Piscataway, NJ
Data Architecture & Modeling
Design and maintain enterprise-level logical, conceptual, and physical data models.
Define data standards, naming conventions, metadata structures, and modeling best practices.
Ensure scalability, performance, and alignment of data models with business requirements.
Data Governance & Quality
Implement and enforce data governance principles and policies.
Define data ownership, stewardship, data lineage, and lifecycle management.
Lead initiatives to improve data quality, consistency, and compliance.
Enterprise Data Management
Develop enterprise data strategies, including data integration, master data management (MDM), and reference data frameworks.
Define and oversee the enterprise data architecture blueprint.
Ensure alignment between business vision and data technology roadmaps.
Python Data Engineer
Data engineer job in Iselin, NJ
Job Title: Data Engineer (Python, Spark, Cloud)
Pay: $90,000 per year, DOE
Term: Contract
Work Authorization: US citizens only (may need a security clearance in the future)
Job Summary:
We are seeking a mid-level Data Engineer with strong Python and Big Data skills to design, develop, and maintain scalable data pipelines and cloud-based solutions. This role involves hands-on coding, data integration, and collaboration with cross-functional teams to support enterprise analytics and reporting.
Key Responsibilities:
Build and maintain ETL pipelines using Python and PySpark for batch and streaming data.
Develop data ingestion frameworks for structured/unstructured sources.
Implement data workflows using Airflow and integrate with Kafka for real-time processing.
Deploy solutions on Azure or GCP using container platforms (Kubernetes/OpenShift).
Optimize SQL queries and ensure data quality and governance.
Collaborate with data architects and analysts to deliver reliable data solutions.
Required Skills:
Python (3.x) - scripting, API development, automation.
Big Data: Spark/PySpark, Hadoop ecosystem.
Streaming: Kafka.
SQL: Oracle, Teradata, or SQL Server.
Cloud: Azure or GCP (BigQuery, Dataflow).
Containers: Kubernetes/OpenShift.
CI/CD: GitHub, Jenkins.
Preferred Skills:
Airflow for orchestration.
ETL tools (Informatica, Talend).
Financial services experience.
Education & Experience:
Bachelor's in Computer Science or related field.
3-5 years of experience in data engineering and Python development.
Keywords for Visibility:
Python, PySpark, Spark, Hadoop, Kafka, Airflow, Azure, GCP, Kubernetes, CI/CD, ETL, Data Lake, Big Data, Cloud Data Engineering.
Reply to this posting with your profile and send it to ******************
Data Analytics Engineer
Data engineer job in Somerset, NJ
Client: manufacturing company
Type: direct hire
Our client is a publicly traded, globally recognized technology and manufacturing organization that relies on data-driven insights to support operational excellence, strategic decision-making, and digital transformation. They are seeking a Power BI Developer to design, develop, and maintain enterprise reporting solutions, data pipelines, and data warehousing assets.
This role works closely with internal stakeholders across departments to ensure reporting accuracy, data availability, and the long-term success of the company's business intelligence initiatives. The position also plays a key role in shaping BI strategy and fostering collaboration across cross-functional teams.
This role is on-site five days per week in Somerset, NJ.
Key Responsibilities
Power BI Reporting & Administration
Lead the design, development, and deployment of Power BI and SSRS reports, dashboards, and analytics assets
Collaborate with business stakeholders to gather requirements and translate needs into scalable technical solutions
Develop and maintain data models to ensure accuracy, consistency, and reliability
Serve as the Power BI tenant administrator, partnering with security teams to maintain data protection and regulatory compliance
Optimize Power BI solutions for performance, scalability, and ease of use
ETL & Data Warehousing
Design and maintain data warehouse structures, including schema and database layouts
Develop and support ETL processes to ensure timely and accurate data ingestion
Integrate data from multiple systems while ensuring quality, consistency, and completeness
Work closely with database administrators to optimize data warehouse performance
Troubleshoot data pipelines, ETL jobs, and warehouse-related issues as needed
Training & Documentation
Create and maintain technical documentation, including specifications, mappings, models, and architectural designs
Document data warehouse processes for reference, troubleshooting, and ongoing maintenance
Manage data definitions, lineage documentation, and data cataloging for all enterprise data models
Project Management
Oversee Power BI and reporting projects, offering technical guidance to the Business Intelligence team
Collaborate with key business stakeholders to ensure departmental reporting needs are met
Record meeting notes in Confluence and document project updates in Jira
Data Governance
Implement and enforce data governance policies to ensure data quality, compliance, and security
Monitor report usage metrics and follow up with end users as needed to optimize adoption and effectiveness
Routine IT Functions
Resolve Help Desk tickets related to reporting, dashboards, and BI tools
Support general software and hardware installations when needed
Other Responsibilities
Manage email and phone communication professionally and promptly
Respond to inquiries to resolve issues, provide information, or direct to appropriate personnel
Perform additional assigned duties as needed
Qualifications
Required
Minimum of 3 years of relevant experience
Bachelor's degree in Computer Science, Data Analytics, Machine Learning, or equivalent experience
Experience with cloud-based BI environments (Azure, AWS, etc.)
Strong understanding of data modeling, data visualization, and ETL tools (e.g., SSIS, Azure Synapse, Snowflake, Informatica)
Proficiency in SQL for data extraction, manipulation, and transformation
Strong knowledge of DAX
Familiarity with data warehouse technologies (e.g., Azure Blob Storage, Redshift, Snowflake)
Experience with Power Pivot, SSRS, Azure Synapse, or similar reporting tools
Strong analytical, problem-solving, and documentation skills
Excellent written and verbal communication abilities
High attention to detail and strong self-review practices
Effective time management and organizational skills; ability to prioritize workload
Professional, adaptable, team-oriented, and able to thrive in a dynamic environment
Data Engineer
Data engineer job in East Windsor, NJ
🚀 Junior Data Engineer
📝 E-Verified | Visa Sponsorship Available
🔍 About Us:
BeaconFire, based in Central NJ, is a fast-growing company specializing in Software Development, Web Development, and Business Intelligence. We're looking for self-motivated and strong communicators to join our team as a Junior Data Engineer!
If you're passionate about data and eager to learn, this is your opportunity to grow in a collaborative and innovative environment. 🌟
🎓 Qualifications We're Looking For:
Passion for data and a strong desire to learn and grow.
Master's Degree in Computer Science, Information Technology, Data Analytics, Data Science, or a related field.
Intermediate Python skills (Experience with NumPy, Pandas, etc. is a plus!)
Experience with relational databases like SQL Server, Oracle, or MySQL.
Strong written and verbal communication skills.
Ability to work independently and collaboratively within a team.
🛠️ Your Responsibilities:
Collaborate with analytics teams to deliver reliable, scalable data solutions.
Design and implement ETL/ELT processes to meet business data demands.
Perform data extraction, manipulation, and production from database tables.
Build utilities, user-defined functions, and frameworks to optimize data flows.
Create automated unit tests and participate in integration testing.
Troubleshoot and resolve operational and performance-related issues.
Work with architecture and engineering teams to implement high-quality solutions and follow best practices.
🌟 Why Join BeaconFire?
✅ E-Verified employer
🌍 Work Visa Sponsorship Available
📈 Career growth in data engineering and BI
🤝 Supportive and collaborative work culture
💻 Exposure to real-world, enterprise-level projects
📩 Ready to launch your career in Data Engineering?
Apply now and let's build something amazing together! 🚀
SENIOR DATA ENGINEER DEVELOPER/ARCHITECT
Data engineer job in Red Bank, NJ
MUST BE WILLING TO GO ONSITE ON AN AS-NEEDED BASIS/OPEN TO 2 DAYS A WEEK
MUST LIVE IN THE TRI-STATE AREA/NO RELOCATION AVAILABLE AND NO RELOCATING OPTION
YOU MUST BE ACTIVELY LOCAL TO THE BRIDGEWATER NJ LOCATION
YOU MUST BE A US CITIZEN OR GREEN CARD HOLDER; NO OTHER WORK STATUS IS AN OPTION
MUST HAVE STRONG SKILLS IN DATABRICKS/PYTHON/ORACLE
MUST HAVE STRONG EXPERIENCE WITH AZURE AND AIRFLOW
MUST HAVE PYTHON, DATA WAREHOUSING, ORACLE
Bachelor's Degree in Computer Science, Data Science, Software Engineering, Information Systems, or related quantitative field
7 plus years of experience working as a Data Engineer, ETL Engineer, Data/ETL Architect or similar roles
MUST HAVE SOLID SQL Server, Fabric, Synapse
DO NOT WANT TO SEE AWS OR GCP BACKGROUNDS
DATABRICKS CERTIFICATE IS A HUGE PLUS
Solid continuous experience in Python.
years of continuous experience with Airflow.
years of experience with Azure Data Factory (ADF) or similar.
years of experience working with relational databases: Oracle, SQL Server, PostgreSQL, or similar.
years of experience working with NoSQL databases: MongoDB, Cosmos DB, DocumentDB or similar
years of experience writing SQL code.
years of experience in Kimball Dimensional Modeling (Star-Schema).
years of experience in columnar databases: Snowflake, Azure Synapse, or similar.
Senior Data Engineer - Master Data Management (MDM)
Data engineer job in Iselin, NJ
We are seeking a highly skilled and experienced Senior Data Engineer specializing in Master Data Management (MDM) to join our data team. The ideal candidate will have a strong background in designing, implementing, and managing end-to-end MDM solutions, preferably within the financial sector. You will be responsible for architecting robust data platforms, evaluating MDM tools, and aligning data strategies to meet business needs.
Additional Information*
The base salary for this role will vary based on geography and other factors. In accordance with law, the base salary for this role if filled within Iselin, NJ is $140K - $150K/year & benefits (see below).
The Role
Responsibilities:
Lead the design, development, and deployment of comprehensive MDM solutions across the organization, with an emphasis on financial data domains.
Demonstrate extensive experience with multiple MDM implementations, including platform selection, comparison, and optimization.
Architect and present end-to-end MDM architectures, ensuring scalability, data quality, and governance standards are met.
Evaluate various MDM platforms (e.g., Informatica, Reltio, Talend, IBM MDM, etc.) and provide objective recommendations aligned with business requirements.
Collaborate with business stakeholders to understand reference data sources and develop strategies for managing reference and master data effectively.
Implement data integration pipelines leveraging modern data engineering tools and practices.
Develop, automate, and maintain data workflows using Python, Airflow, or Astronomer.
Build and optimize data processing solutions using Kafka, Databricks, Snowflake, Azure Data Factory (ADF), and related technologies.
Design microservices, especially utilizing GraphQL, to enable flexible and scalable data services.
Ensure compliance with data governance, data privacy, and security standards.
Support CI/CD pipelines for continuous integration and deployment of data solutions.
Requirements:
12+ years of experience in data engineering, with a proven track record of MDM implementations, preferably in the financial services industry.
Extensive hands-on experience designing and deploying MDM solutions and comparing MDM platform options.
Strong functional knowledge of reference data sources and domain-specific data standards.
Expertise in Python, PySpark, Kafka, microservices architecture (particularly GraphQL), Databricks, Snowflake, Azure Data Factory, SQL, and orchestration tools such as Airflow or Astronomer.
Familiarity with CI/CD practices, tools, and automation pipelines.
Ability to work collaboratively across teams to deliver complex data solutions.
Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).
Preferred Qualifications:
Familiarity with financial data models and regulatory requirements.
Experience with Azure cloud platforms
Knowledge of data governance, data quality frameworks, and metadata management.
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Senior Data Architect
Data engineer job in Edison, NJ
Act as an Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
Troubleshoot complex data and performance issues and propose long-term architectural solutions.
Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.
About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.
Salesforce Engineer
Data engineer job in Yardley, PA
🚀 We're Hiring: Salesforce Engineer
📍 Hybrid to Yardley, PA OR Madison, WI OR Boise, ID
Professional Experience
3-5 years of hands-on Salesforce engineering/development experience.
Proven success designing scalable Salesforce solutions for sales & commercial teams.
Expertise in Apex, LWC, Visualforce, SOQL, APIs, and integration tools.
Experience implementing AI solutions (Einstein Analytics, predictive modeling, or third-party AI).
Strong experience integrating Salesforce with Pardot, Marketo, HubSpot, or similar platforms.
Implement Salesforce AgentForce and other AI tools to deliver predictive insights and intelligent automation.
Skills & Competencies
Strong analytical and problem-solving abilities.
Excellent communication skills - ability to clearly articulate work across teams is critical.
Experience working in Agile environments (Scrum/Kanban).
Ability to excel both independently and collaboratively in a fast-paced environment.
Guidewire DevOps Engineer (with experience in Environment Strategy Build)
Data engineer job in Princeton, NJ
About Us:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700+ clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
Job Title: Guidewire DevOps Engineer (with experience in Environment Strategy Build)
Work Location: Princeton, NJ (Hybrid - Onsite)
Required Experience: 7+ Years
Role Overview:
HSB is seeking a DevOps Engineer to lead the design, build, and ongoing management of Guidewire environments for the Cyber Admitted PolicyCenter program.
This role will define a repeatable and scalable environment strategy, ensuring that environments across development, SIT, UAT, and production are built and maintained with consistency, automation, and traceability.
The ideal candidate will have strong experience in Guidewire infrastructure setup, Azure DevOps pipelines, and infrastructure automation, and will be comfortable operating in a complex, multi-stream program environment involving PolicyCenter, Digital Portal, SmartCOMM, and Integration components.
Key Responsibilities
Environment Strategy & Planning
Define an end-to-end environment strategy and roadmap, including topology, refresh cadence, and configuration management across the program
Establish environment standards and documentation to ensure consistency across all Guidewire instances
Recommend and implement an iterative build approach that allows environments to evolve progressively as functionality matures
Environment Build & Configuration
Lead the provisioning, configuration, and validation of Guidewire environments (Dev, SIT, UAT, PreProd, Prod)
Develop and maintain automation pipelines or scripts (e.g., Azure DevOps, Terraform, PowerShell) to standardize environment setup and deployment processes
Collaborate with infrastructure and database teams on environment readiness, configuration, and SQL Server remediation or upgrades
Maintain alignment between PolicyCenter, Digital, and SmartCOMM environments for integrated end-to-end testing
Environment Maintenance & Operations
Manage environment lifecycle activities, including data refreshes, DB drops, configuration synchronization, and release preparation
Monitor and maintain environment health, ensuring stability and readiness for testing and release activities
Create and maintain environment status dashboards or tracking mechanisms within Azure DevOps for transparency and reporting
Access & Security Enablement
Partner with internal IT and security teams to implement secure access models and environment-level permissions
Manage service accounts, secrets, and credentials used across environments in alignment with enterprise security standards
Cross-Stream Coordination
Coordinate with the PolicyCenter, Digital Portal, SmartCOMM, and Integration workstreams to ensure consistent environment dependencies and deployment sequencing
Support integration testing by ensuring endpoint configurations and API connectivity are aligned across systems
Participate in environment planning sessions, readiness reviews, and go-live preparations
Skills & Experience:
5 years of DevOps or CloudOps experience in enterprise software or insurance platforms
3 years of experience supporting Guidewire applications (PolicyCenter, BillingCenter, or ClaimCenter), preferably on cloud or hybrid infrastructure
Strong working knowledge of Azure DevOps (ADO) pipelines and automation for environment deployment
Proficiency with infrastructure-as-code tools (e.g., Terraform, ARM templates, PowerShell)
Experience managing SQL Server environments, data refreshes, and system configuration
Familiarity with environment topology design, CI/CD pipelines, and multi-tier application environments
Excellent collaboration and documentation skills across technical and non-technical teams
Success Metrics:
Standardized environment build and refresh process implemented across all Guidewire environments
Lead time for new environment setup reduced through automation and reusability
All environments traceable, version-controlled, and integrated with Azure DevOps
Clear visibility into environment readiness for SIT, UAT, and production through dashboards or reports
Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation like an annual performance-based bonus, sales incentive pay and other forms of bonus or variable compensation.
Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Safe return to office:
In order to comply with LTIMindtree' s company COVID-19 vaccine mandate, candidates must be able to provide proof of full vaccination against COVID-19 before or by the date of hire. Alternatively, one may submit a request for reasonable accommodation from LTIMindtree's COVID-19 vaccination mandate for approval, in accordance with applicable state and federal law, by the date of hire. Any request is subject to review through LTIMindtree's applicable processes.
Java Software Engineer
Data engineer job in Iselin, NJ
Job Information:
Functional Title - Assistant Vice President, Java Software Development Engineer
Department - Technology
Corporate Level - Assistant Vice President
Report to - Director, Application Development
Expected full-time salary range between $125,000 - $145,000 + variable compensation + 401(k) match + benefits
Job Description:
This position is with CLS Technology. The primary responsibilities of the job will be
(a) Hands-on software application development
(b) Level 3 support
Duties, Responsibilities, and Deliverables:
Develop scalable, robust applications utilizing appropriate design patterns, algorithms and Java frameworks
Collaborate with Business Analysts, Application Architects, Developers, QA, Engineering, and Technology Vendor teams for design, development, testing, maintenance and support
Adhere to CLS SDLC process and governance requirements and ensure full compliance of these requirements
Plan, implement and ensure that delivery milestones are met
Provide solutions using design patterns, common techniques, and industry best practices that meet the typical challenges/requirements of a financial application including usability, performance, security, resiliency, and compatibility
Proactively recognize system deficiencies and implement effective solutions
Participate in, contribute to, and assimilate changes, enhancements, requirements (functional and non-functional), and requirements traceability
Apply significant knowledge of industry trends and developments to improve CLS in-house practices and services
Provide Level-3 support. Provide application knowledge and training to Level-2 support teams
Experience Requirements:
5+ years of hands-on application development and testing experience with proficient knowledge of core Java and JEE technologies such as JDBC, JAXB, and Java/Web technologies
Knowledge of Python, Perl, Unix shell scripting is a plus
Expert hands-on experience with SQL and with at least one DBMS such as IBM DB2 (preferred) or Oracle is a strong plus
Expert knowledge of and experience in securing web applications, secure coding practices
Hands-on knowledge of application resiliency, performance tuning, technology risk management is a strong plus
Hands-on knowledge of messaging middleware such as IBM MQ (preferred) or TIBCO EMS, and application servers such as WebSphere, or WebLogic
Knowledge of SWIFT messaging, payments processing, FX business domain is a plus
Hands-on knowledge of CI/CD practices and DevOps toolsets such as JIRA, GIT, Ant, Maven, Jenkins, Bamboo, Confluence, and ServiceNow.
Hands-on knowledge of MS Office toolset including MS-Excel, MS-Word, PowerPoint, and Visio
Proven track record of successful application delivery to production and effective Level-3 support.
Success factors: In addition, the person selected for the job will
Have strong analytical, written and oral communication skills with a high self-motivation factor
Possess excellent organization skills to manage multiple tasks in parallel
Be a team player
Have the ability to work on complex projects with globally distributed teams and manage tight delivery timelines
Have the ability to smoothly handle high stress application development and support environments
Strive continuously to improve stakeholder management for end-to-end application delivery and support
Qualification Requirements:
Bachelor's Degree
Minimum 5 years of experience in Information Technology
Sr. C++FIX or Market Data Developer
Data engineer job in Princeton, NJ
Looking for a highly motivated C++ Trading Systems Developer with demonstrated experience in designing, developing and delivering core production software solutions in a mission critical trading systems environment. Major responsibilities include: Assessing business and systems requirements and developing functional specifications Designing and developing high quality, high performance trading systems software written in C++ to meet deliverable timelines and requirements Adhering to software development life cycle process/methodology Building business level subject matter expertise in trading systems functionality and processing Provide second level support for production on an ad hoc basis when necessary Location: Princeton, NJ Organizational Structure: The developer will be an integral part of a core development team and report to the Trading System Development management team. Qualifications: Full software development life cycle experience in a mission critical trading systems environment a must… Options, Equities, Futures, etc. 
Must possess excellent software design skills and knowledge of advanced data structures
Must have exceptionally strong C++ knowledge and debugging skills in a Linux environment
Solid knowledge of Object-Oriented Programming concepts a must
Strong knowledge of TCP/IP multicast and socket programming required
Knowledge of the BOOST libraries and STL required
Must have experience in developing real-time applications in a distributed processing architecture
Must have excellent organizational and communication skills
Must be able to work effectively in a team environment
Strong knowledge of the logical business domain in Options or Equities trading systems a big plus
Experience coding interface solutions for FIX, OPRA, CTA or UTP a big plus
Knowledge of scripting languages such as Python, Shell, and Perl a plus
Education and Experience:
Minimum of a Bachelor's degree or equivalent in IT/Computer Science
7+ years of experience in C++ development
5+ years of demonstrated experience in delivering software solutions in a trading systems environment for an Exchange or a Wall Street firm
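The FIX interface work mentioned above centers on a simple wire format: SOH-delimited tag=value pairs with a trailing modulo-256 checksum field (tag 10). The sketch below, in Python for brevity, is a minimal illustration of parsing and checksum validation only; the message content is hand-built for this example, and a production FIX engine would validate far more (sequence numbers, BodyLength, session state).

```python
# Minimal FIX tag=value parser and checksum validator (illustrative sketch,
# not an exchange-certified implementation). Tags follow the standard FIX
# dictionary: 8=BeginString, 9=BodyLength, 35=MsgType, 10=CheckSum.
SOH = "\x01"

def fix_checksum(prefix: str) -> str:
    """Sum of all bytes up to (and including) the SOH that precedes the
    CheckSum(10) field, modulo 256, zero-padded to three digits."""
    return f"{sum(prefix.encode('ascii')) % 256:03d}"

def parse_fix(raw: str) -> dict:
    """Split a SOH-delimited FIX message into a {tag: value} dict after
    verifying the trailing CheckSum(10) field."""
    head, _, tail = raw.rpartition(SOH + "10=")
    if not head:
        raise ValueError("missing CheckSum(10) field")
    expected = tail.rstrip(SOH)
    actual = fix_checksum(head + SOH)
    if actual != expected:
        raise ValueError(f"checksum mismatch: sent {expected}, computed {actual}")
    return dict(f.split("=", 1) for f in raw.split(SOH) if f)

# Example: a NewOrderSingle (35=D) skeleton. 9=18 is the byte count of the
# fields between BodyLength and CheckSum, hand-computed for this message.
body = SOH.join(["8=FIX.4.2", "9=18", "35=D", "55=MSFT", "54=1"]) + SOH
msg = body + "10=" + fix_checksum(body) + SOH
fields = parse_fix(msg)
```

The checksum round-trips because the sender and parser compute it over the same prefix; a corrupted byte anywhere before tag 10 changes the modulo-256 sum and raises on parse.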
Senior Dotnet Developer
Data engineer job in Ewing, NJ
The Software Engineer, III is a full-stack engineer with strong Node.js and functional programming (Scala & Ruby) skills. Our applications leverage Node.js, Bootstrap, jQuery, MongoDB, Elasticsearch, Redis, and React.js to deliver a delightful interactive experience on the web. Our applications run in the AWS cloud environment. We use Agile methodologies to enable our engineering team to work closely with partners and with our design & product teams. This role is full-time and preferably located long-term in the New York City or southern New Jersey areas.
Essential Job Duties and Responsibilities include:
Design, develop, and maintain modern web applications and UIs using .NET technologies such as C#, ASP.NET MVC, ASP.NET Core, Razor Pages, and Blazor.
Create clean, maintainable, and well-documented code following industry best practices and coding standards.
Develop and consume RESTful APIs and web services.
Build responsive and accessible user interfaces using HTML, CSS, JavaScript, and UI libraries/frameworks such as React, Angular, Vue.js, or Bootstrap.
Work with relational and NoSQL databases (e.g., SQL Server, MongoDB) and object-relational mappers (ORMs) such as Entity Framework Core.
Conduct unit and integration testing to validate functionality and ensure high-quality deliverables.
Participate in peer code reviews and provide constructive feedback to ensure continuous improvement and knowledge sharing.
Identify, troubleshoot, and resolve complex technical issues in development and production environments.
Collaborate with cross-functional teams throughout the software development lifecycle.
Stay current with emerging .NET technologies and trends.
May mentor and support junior developers in their technical growth and day-to-day work.
Maintain regular and punctual attendance.
Preferred Qualifications:
Experience with CI/CD pipelines and DevOps practices.
Familiarity with cloud platforms (e.g., Azure, AWS) and deploying .NET applications in cloud environments.
Knowledge of Blazor for interactive web UIs using C# instead of JavaScript
Education and/or Experience:
7+ years of professional software development experience with a strong focus on web and UI development in the .NET ecosystem.
Advanced proficiency in C#, ASP.NET, ASP.NET Core, and MVC frameworks; experience with VBScript is a plus.
Deep understanding of object-oriented programming (OOP) and design patterns.
Strong front-end development skills, including HTML, CSS, JavaScript, and at least one modern UI framework (React, Angular, Vue.js, etc.).
Proven experience developing and integrating RESTful APIs.
Hands-on experience with SQL Server and/or NoSQL databases; proficient in using Entity Framework Core or similar ORMs.
Familiarity with version control systems such as Git.
Solid grasp of Agile/Scrum development methodologies.
Excellent problem-solving abilities and strong attention to detail.
Effective communication and interpersonal skills with the ability to work independently and within a team.
Software Engineer
Data engineer job in Middletown, NJ
Software Development Engineer (Python)
Contract type: Long term contract
Pay rate: $45-55 hourly (based on years of experience)
Brooksource is seeking a Software Development Engineer (Python) to join our client's technical team focused on automation infrastructure and systems integration. This role involves building Python-based tools, developing CI/CD pipelines, designing REST APIs, and supporting full-stack development where needed. The ideal candidate is adaptable, collaborative, and enjoys solving complex technical challenges in a fast-paced environment. This position offers exposure to modern development practices, network automation, and integration with backend systems.
Responsibilities:
Develop Python-based backend services and automation frameworks supporting networking systems (routers, switches).
Design, build, and maintain RESTful APIs and microservices for internal and external integrations.
Support and optimize CI/CD pipelines for development and deployment efficiency.
Collaborate with cross-functional teams to support both frontend and backend components.
Interface with SQL/NoSQL databases for data storage, retrieval, and analytics.
Write clean, maintainable, and well-documented code following best practices.
Ensure backend systems are secure, reliable, and performant.
Quickly adapt to new technologies and evolving project requirements.
Required Skills & Qualifications:
5+ years of hands-on experience in Python development.
Strong knowledge of object-oriented programming (OOP), decorators, generators, and context managers.
Experience with multi-threading, asynchronous programming, and performance optimization.
Proficiency in error handling, logging, and testing frameworks.
Hands-on experience with CI/CD tools and automation workflows.
Proven ability to design and implement REST APIs.
Solid understanding of database systems (e.g., MySQL, InfluxDB).
Ability to thrive in a fast-paced, agile environment.
Excellent communication and collaboration skills.
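The Python features named in the requirements above (decorators, generators, context managers) can be illustrated in one short, self-contained sketch. All of the helpers below are hypothetical examples, not from any specific codebase:

```python
# Illustrative sketch of a decorator, a generator, and a context manager.
import time
from contextlib import contextmanager

def log_calls(func):
    """Decorator: record each call's positional arguments on the wrapper."""
    def wrapper(*args, **kwargs):
        wrapper.calls.append(args)
        return func(*args, **kwargs)
    wrapper.calls = []
    return wrapper

@log_calls
def scale(x, factor=2):
    return x * factor

def batched(items, size):
    """Generator: yield fixed-size chunks lazily instead of building a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

@contextmanager
def timed(label):
    """Context manager: measure wall-clock time of a block
    (a real implementation would log `elapsed` somewhere)."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start

with timed("demo"):
    results = [scale(n) for n in range(4)]    # [0, 2, 4, 6]
chunks = list(batched(list(range(5)), 2))      # [[0, 1], [2, 3], [4]]
```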
Preferred Skills:
Prior experience in network automation involving routers and switches.
Knowledge of networking protocols and configuration automation.
Experience with containerization and orchestration (e.g., Docker, Kubernetes).
What's in it for you:
Work on automation infrastructure and backend systems supporting high-impact projects.
Collaborate with experienced engineers and cross-functional teams in a dynamic environment.
Opportunity to expand technical skills in Python, CI/CD, APIs, microservices, and network automation.
Exposure to modern development practices and large-scale system integration.
Senior Drupal Developer/Tester- Need Local or Nearby Candidates - Inperson Interview
Data engineer job in Trenton, NJ
Senior Drupal Developer / Tester
Duration: 3 Months (with possible extension)
Interview Type - In-person
Experience:
Experience with testing and testing tools.
-Drupal 10/11
-Experience with writing automated tests
-Experience with Playwright
-Troubleshooting and debugging skills
-Manual testing
-Drupal Site Building
-Experience with version control and package management tools.
-Experience with PHP, HTML5, CSS3, SASS, Twig
-GitHub Repo/Projects
-Code review and deployment
-Demonstrated understanding of accessibility standards and best practices for inclusive web development.
-Ability to write secure code and tests following Drupal coding standards and security guidelines.
Example of Duties:
-Develop, apply, and implement tests and automated testing.
-Troubleshoot issues and provide insights on how to solve.
-Keep tests up to date with the latest code changes and deployments.
-Collaborate with other team members and teams on projects to plan, develop, test, deploy, support, and enhance Drupal websites.
-Work independently and efficiently as required.
-Communicate and meet with management and stakeholders to get a better understanding of business requirements.
-Implement best practices and standards for website accessibility according to the latest published guidelines.
-Participate in code quality checks and deployment processes.
-Work directly with cloud platform vendors and infrastructure support teams to create and manage cloud-hosted website environments and code repositories.
-Perform code updates to ensure websites remain stable and secure.
-Stay up to date with the latest in Drupal developments and trends.
-Provide guidance and mentorship to team members regarding best practices and efficient use of Drupal.
-Ensure all websites meet Judiciary standards for information security.
The selected candidate will possess strong analytical skills, attention to detail, and familiarity with the latest version of Drupal and its dependencies.
The ideal candidate will be adept at writing and executing automated tests, experienced with manual testing, proactive in keeping all tests up to date with the latest code changes, and adept at identifying and minimizing errors and vulnerabilities in the Judiciary's Drupal environment.
Acquia Drupal Developer Certification
Experience with Drupal, version control software, including Git, and proficiency in PHP are required. At times, the candidate will contribute their expertise to other areas of web development and write steps to replicate issues, apply troubleshooting skills, and provide clear requirements for developers.
Additionally, the candidate will test all code, develop automated test suites, and provide documentation for current and future team members. This individual will collaborate with a high-performing team of Drupal developers to deliver dynamic websites that meet coding and design standards for maintainability, usability, performance, security, and accessibility.
RELEVANT WORK EXPERIENCE: 4 to 6 yrs.
PREFERRED EDUCATION: 4 year college degree
Required Skills:
4 Years - Experience in Drupal site building, features, and theming
4 Years - Working experience with the Drupal 10 and 11 ecosystem components (Git, Composer, TWIG). Experience with Playwright.
4 Years - Understand Drupal architecture and can develop a strategy for site testing
4 Years - Experience with Drupal user interface development to include user experience, theme development, and customization of community themes
4 Years - Ability to code and debug in PHP, HTML, CSS and JavaScript/jQuery
4 Years - Experience working in an iterative development environment (Jira, GitHub Projects, Agile, SCRUM)
6 Years - Keeping tests up to date with the latest code changes
4 Years - Experience with advanced PHP development frameworks and workflows (Symfony, Composer)
5 Years - Familiarity with Acquia tools and solutions (Cloud IDE, Acquia Pipelines, BLT, DDEV, Acquia Cloud)
Acquia Drupal Developer Certification is highly desirable
Bachelor's Degree - Required
Thanks & Regards
Infojini Consulting
Website: **********************************
Address: 10015 Old Columbia Road, Suite B 215, Columbia, MD 21046
Data Scientist
Data engineer job in Trenton, NJ
VTS3 is a Veteran-owned and operated company which provides full-service Recruiting, Talent Search, Staffing, and IT Professional Services. For more information visit *************
Job Description
We are looking for a Data Scientist Consultant in Trenton, NJ. The qualified Data Scientist will be part of our client's Prescription Overdose: Data-Driven Prevention Initiative (DDPI). The successful candidate will ideally have extensive experience working with large public health databases such as hospitalization data, vital records (death records), and the prescription monitoring database.
The Data Scientist will lead the preparation and consolidation of public health data to identify indicators for the dashboard and perform predictive modeling and analytics. The Data Scientist will work closely with the Department's Healthcare Quality and Informatics (HQI) team assigned to this project, the Data Architect, and Office of Information Technology Services (HIT) staff.
Responsibilities:
Research and develop statistical learning models for data analysis
Collaborate with HQI and HIT to understand general Department and the specific DDPI project needs and devise possible solutions
Keep up to date with the latest technology trends
Communicate results and ideas to key staff in DOH
Implement new statistical or other mathematical methodologies as needed for specific models or analyses
Optimize joint development efforts through appropriate database use and project design
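As one concrete illustration of the statistical modeling described above, a trend in a public health indicator can be fit with ordinary least squares. The sketch below uses pure Python and made-up counts purely for illustration; real analyses would use R, SAS, or Python statistical libraries:

```python
# Minimal ordinary least-squares fit of y = intercept + slope * x
# (the simplest regression model; data below is fabricated for the demo).
def ols_fit(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    return mean_y - slope * mean_x, slope

# Hypothetical indicator counts against a year index.
xs = [0, 1, 2, 3, 4]
ys = [10, 12, 15, 19, 24]
intercept, slope = ols_fit(xs, ys)   # intercept 9.0, slope 3.5
```

The closed-form slope is the ratio of the x-y covariance sum to the x variance sum, which is term-by-term what `sxy / sxx` computes.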
This is an hourly position with opportunity for overtime.
**All Candidates Must be Authorized to Legally Work in the US Without Sponsorship**
Qualifications
• Extensive background in data mining and statistical analysis
• Ability to understand various data structures and common methods in data transformation
• Excellent pattern recognition and predictive modeling skills
• Experience with programming languages such as Java/Python is an asset
• HDFS / Data Lake Stores / Analytics
• SQL Databases / Warehouses / BISM Stack (SSMS, SSDT, SSIS, SSAS, SSRS)
• Analysis Services (Tabular Mode / DAX / Cube concepts)
• Experience with modeling and presentation technologies (SQL Server Analysis Services, Reporting Services, MicroStrategy, Business Objects, Cognos, Tableau, PowerBI, others)
• Expertise in BI-related data architecture, integration, modeling, and presentation
• Understands BI-related features and limitations of common database platforms (SQL Server, Analysis Services, Oracle, DB2, Postgres, others)
• Experience with ETL technologies (SQL Server Integration Services, Data Factory)
• At least 5-10 years as a data scientist with all of the skills listed above
• Bachelor's Degree in Computer Science, Statistics, Applied Math or related field
Education Preferred:
• Master's Degree in Computer Science, Statistics, Applied Math, or a related field
Additional Information
$500 Referral Fee Program
Earn extra cash while helping your friends!
VTS3 will pay you up to $500.00 for each person you refer to us and we place into a contract or full-time position. If you know someone who's a good candidate for any of our openings, use the "Refer a friend" button on this page and earn extra cash.
The rules are simple:
The referral must be made by using the "Refer a friend" button on this page
The person you refer must be placed within 90 days of being referred
The person you refer must complete 480 billable hours
Cannot be someone we already have on our team or are currently working with
VTS3 ONLY works directly with candidates.
All candidates registered must be able to legally work in the United States.
Distinguished Engineer - Data Architecture
Data engineer job in Atlantic City, NJ
Company Description
Jobs for Humanity is partnering with Capital One to build an inclusive and just employment ecosystem. Therefore, we prioritize individuals coming from the following communities: Refugee, Neurodivergent, Single Parent, Blind or Low Vision, Deaf or Hard of Hearing, Black, Hispanic, Asian, Military Veterans, the Elderly, the LGBTQ, and Justice Impacted individuals. This position is open to candidates who reside in and have the legal right to work in the country where the job is located.
Company Name: Capital One
Job Description
Locations: VA - McLean, United States of America, McLean, Virginia
Distinguished Engineer - Data Architecture
Come join my teams in the Enterprise Data & Machine Learning group to build a highly scalable, well-governed data ecosystem that scales across thousands of AWS accounts, tens of thousands of database instances, and millions of files stored across the enterprise. You will also have the opportunity to shape the future of the platform that exchanges terabytes of data daily between Capital One and its partners.
Distinguished Engineers are individual contributors who strive to be diverse in thought so we can fully visualize the problem space. At Capital One, we believe diversity of thought strengthens our ability to influence, collaborate and provide the most innovative solutions across organizational boundaries. Distinguished Engineers will significantly impact our trajectory and devise clear roadmaps to deliver next generation technology solutions.
Deep technical experts and thought leaders that help accelerate adoption of the very best engineering practices, while maintaining knowledge on industry innovations, trends and practices
Visionaries, collaborating on Capital One's toughest issues, to deliver on business needs that directly impact the lives of our customers and associates
Role models and mentors, helping to coach and strengthen the technical expertise and know-how of our engineering and product community
Evangelists, both internally and externally, helping to elevate the Distinguished Engineering community and establish themselves as a go-to resource on given technologies and technology-enabled capabilities
Build awareness, increase knowledge and drive adoption of modern technologies, sharing consumer and engineering benefits to gain buy-in
Strike the right balance between lending expertise and providing an inclusive environment where others' ideas can be heard and championed; leverage expertise to grow skills in the broader Capital One team
Promote a culture of engineering excellence, using opportunities to reuse and inner source solutions where possible
Effectively communicate with and influence key stakeholders across the enterprise, at all levels of the organization
Operate as a trusted advisor for a specific technology, platform or capability domain, helping to shape use cases and implementation in a unified manner
Lead the way in creating next-generation talent for Tech, mentoring internal talent and actively recruiting external talent to bolster Capital One's Tech talent
Basic Qualifications:
Bachelor's Degree
At least 7 years of Data Architecture or Enterprise Architecture experience
At least 5 years of experience in design patterns
At least 5 years of AWS cloud experience
At least 3 years of experience building data products
At least 3 years of experience with enterprise level data governance
Preferred Qualifications:
Master's Degree
8+ years of data governance, data access, data lineage, data monitoring, and security controls experience
7+ years of experience developing in Python, Java, Scala or Node
6+ years of experience dealing with large number of cross AWS account communications
5+ years of experience in software architecture
3+ years of rich experience in modern database technology evaluation and data modeling
3+ years of experience in building highly resilient distributed data systems
3+ years in data engineering including experience in distributed data pipelines and test data engineering
3+ years of experience in Agile practices
1+ year of experience with developing strategy and implementing target architectures
AWS Solution Architect - Professional or AWS Certified Data Analytics - Specialty certified
Capital One will consider sponsoring a new qualified applicant for employment authorization for this position.
The minimum and maximum full-time annual salaries for this role are listed below, by location. Please note that this salary information is solely for candidates hired to perform work within one of these locations, and refers to the amount Capital One is willing to pay at the time of this posting. Salaries for part-time roles will be prorated based upon the agreed upon number of hours to be regularly worked.
New York City (Hybrid On-Site): $269,400 - $307,500 for Distinguished Engineer
Candidates hired to work in other locations will be subject to the pay range associated with that location, and the actual annualized salary amount offered to any candidate at the time of hire will be reflected solely in the candidate's offer letter.
Capital One offers a comprehensive, competitive, and inclusive set of health, financial and other benefits that support your total well-being. Learn more at the Capital One Careers website. Eligibility varies based on full or part-time status, exempt or non-exempt status, and management level.
This role is expected to accept applications for a minimum of 5 business days. No agencies please. Capital One is an equal opportunity employer committed to diversity and inclusion in the workplace. All qualified applicants will receive consideration for employment without regard to sex (including pregnancy, childbirth or related medical conditions), race, color, age, national origin, religion, disability, genetic information, marital status, sexual orientation, gender identity, gender reassignment, citizenship, immigration status, protected veteran status, or any other basis prohibited under applicable federal, state or local law. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at ************** or via email at [email protected]. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.
For technical support or questions about Capital One's recruiting process, please send an email to [email protected]
Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site.
Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Microsoft Dynamics 365 CE Data Migration Consultant
Data engineer job in Middletown, PA
Data-Core Systems, Inc. is a provider of information technology, consulting and business process services. We offer breakthrough tech solutions and have worked with companies, hospitals, universities and government organizations. A proven partner with a passion for client satisfaction, we combine technology innovation, business process expertise and a global, collaborative workforce that exemplifies the future of work. For more information about Data-Core Systems, Inc., please visit *****************************
Our client operates a roadway system, and as part of their digital transformation they are implementing a solution based on SAP BRIM & Microsoft Dynamics CE.
Data-Core Systems Inc. is seeking a Microsoft Dynamics 365 CE Data Migration Consultant to be part of our Consulting team. You will be responsible for planning, designing, and executing the migration of customer, account, vehicle, financial, and transaction data from a variety of source systems (including legacy CRMs, ERPs, SQL databases, flat files, Excel, cloud platforms, and tolling systems) into Microsoft Dynamics 365 Customer Engagement (CE). This role involves understanding complex data models, extracting structured and unstructured data, transforming and mapping it to Dynamics CE entities, and ensuring data quality, integrity, and reconciliation throughout the migration lifecycle.
Roles & Responsibilities:
Analyze source system data structures, including customer profiles, accounts, vehicles, transponders, payment methods, transactions, violations, invoices, and billing records
Identify critical data relationships, parent/child hierarchies, and foreign key dependencies
Develop detailed data mapping and transformation documentation from source systems to Dynamics 365 CE entities (standard and custom)
Build, test, and execute ETL pipelines using tools such as SSIS/KingswaySoft, Azure Data Factory, Power Platform Dataflows, or custom .NET utilities
Perform data cleansing, normalization, deduplication, and standardization to meet Dynamics CE data model requirements
Execute multiple migration cycles, including test loads, validation, and final production migration
Ensure referential integrity, high data quality, and accuracy of historical data
Generate reconciliation reports, resolve data inconsistencies, and troubleshoot migration errors
Document migration strategies, execution runbooks, and transformation rules for future reference
Required Skills & Experience:
8-12 years of proven experience migrating data from tolling systems, transportation platforms, legacy CRMs, or other high-volume transactional systems
Strong SQL skills for complex queries, stored procedures, data transformation, and data validation
Hands-on experience with Microsoft Dynamics 365 CE / CRM data model, entities, and relationships
Proficiency with ETL/migration tools: SSIS with KingswaySoft, Azure Data Factory, Power Platform Dataflows, Custom C#/.NET migration scripts
Experience with large-scale migrations involving millions of records
Strong understanding of relational data structures such as: Customer ⇄ Account ⇄ Vehicle ⇄ Transponder ⇄ Transaction
Ability to analyze large datasets, identify anomalies, and resolve inconsistencies
Bachelor's degree in engineering or technology from a recognized university
Preferred Skills & Experience:
Experience with financial transactions, billing data, or violation/enforcement records.
Experience in enterprise-scale Dynamics 365 CE migrations.
Familiarity with data governance, security, and compliance requirements for financial or transportation data.
Knowledge of historical data migration and archival strategies.
We are an equal opportunity employer.
Data Scientist - R01555533
Data engineer job in Edison, NJ
Data Scientist
Primary Skills
Hypothesis Testing, T-Test, Z-Test, Regression (Linear, Logistic), Python/PySpark, SAS/SPSS, Statistical analysis and computing, Probabilistic Graph Models, Great Expectations, Evidently AI, Forecasting (Exponential Smoothing, ARIMA, ARIMAX), Tools (Kubeflow, BentoML), Classification (Decision Trees, SVM), ML Frameworks (TensorFlow, PyTorch, scikit-learn, CNTK, Keras, MXNet), Distance (Hamming Distance, Euclidean Distance, Manhattan Distance), R/RStudio
Specialization
Data Science Advanced: Data Specialist
Job requirements
JD: The Agentic AI Lead is a pivotal role responsible for driving the research, development, and deployment of semi-autonomous AI agents to solve complex enterprise challenges. This role involves hands-on experience with LangGraph, leading initiatives to build multi-agent AI systems that operate with greater autonomy, adaptability, and decision-making capabilities. The ideal candidate will have deep expertise in LLM orchestration, knowledge graphs, reinforcement learning (RLHF/RLAIF), and real-world AI applications. As a leader in this space, they will be responsible for designing, scaling, and optimizing agentic AI workflows, ensuring alignment with business objectives while pushing the boundaries of next-gen AI automation.
Key Responsibilities:
1. Architecting & Scaling Agentic AI Solutions
- Design and develop multi-agent AI systems using LangGraph for workflow automation, complex decision-making, and autonomous problem-solving.
- Build memory-augmented, context-aware AI agents capable of planning, reasoning, and executing tasks across multiple domains.
- Define and implement scalable architectures for LLM-powered agents that seamlessly integrate with enterprise applications.
2. Hands-On Development & Optimization
- Develop and optimize agent orchestration workflows using LangGraph, ensuring high performance, modularity, and scalability.
- Implement knowledge graphs, vector databases (Pinecone, Weaviate, FAISS), and retrieval-augmented generation (RAG) techniques for enhanced agent reasoning.
- Apply reinforcement learning (RLHF/RLAIF) methodologies to fine-tune AI agents for improved decision-making.
3. Driving AI Innovation & Research
- Lead cutting-edge AI research in Agentic AI, LangGraph, LLM Orchestration, and Self-improving AI Agents.
- Stay ahead of advancements in multi-agent systems, AI planning, and goal-directed behavior, applying best practices to enterprise AI solutions.
- Prototype and experiment with self-learning AI agents, enabling autonomous adaptation based on real-time feedback loops.
4. AI Strategy & Business Impact
- Translate Agentic AI capabilities into enterprise solutions, driving automation, operational efficiency, and cost savings.
- Lead Agentic AI proof-of-concept (PoC) projects that demonstrate tangible business impact and scale successful prototypes into production.
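The orchestration pattern behind the multi-agent workflows described above can be sketched without any framework: a planner decomposes a goal into tagged steps, each step is routed to the executor agent that handles it, and results accumulate in shared state. This is a library-free, hedged illustration; real systems would build the graph and agents on LangGraph with LLM-backed nodes, and every name below (the planner, the toy executors) is a hypothetical stand-in.

```python
# Minimal planner/executor orchestration loop (toy stand-in for a
# LangGraph-style multi-agent workflow; no LLMs involved).
def planner(goal):
    """Decompose a goal into (agent_kind, task) steps for routing.
    A real planner would be an LLM call; this one is hard-coded."""
    return [("math", "2+3"), ("text", "hello"), ("math", "4*5")]

EXECUTORS = {
    # Toy executor agents keyed by the kind of step they handle.
    "math": lambda task: eval(task, {"__builtins__": {}}),  # arithmetic only
    "text": lambda task: task.upper(),
}

def run(goal):
    """Route each planned step to its executor, feeding shared state."""
    state = {"goal": goal, "results": []}
    for kind, task in planner(goal):
        agent = EXECUTORS[kind]
        state["results"].append(agent(task))
    return state

state = run("demo goal")   # results: [5, 'HELLO', 20]
```

The design point is the routing table: adding a new agent capability means registering one more executor, not rewriting the loop, which is the same modularity argument made for graph-based orchestrators.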
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.