AI/ML Engineer (Customer Facing, FDE, Python AI Agent/LLM)
Location: New York, NY
Python Engineer / Data Scientist (Forward Deployed) - LLM/AI Agent products in Legal Tech.
Salary: $170,000-$190,000 + benefits
Company: Late-Stage Scaleup in Legal AI Software
Our client, a rapidly expanding global Legal AI software company backed by top-tier investors, is transforming how legal teams operate through intelligent automation and applied AI. They are hiring two customer-facing, hands-on Python Engineers / Data Scientists to join their brand-new Forward Deployed team in Manhattan, New York.
This role sits at the intersection of engineering, data, and client delivery, and is ideal for someone who thrives on technical problem-solving while working directly with enterprise customers. Expect the role to be roughly 50% hands-on coding. As a Forward Deployed technologist, you will work with real customers on real problems, delivering bespoke, high-impact solutions.
Responsibilities include:
• Working closely with Technical and Legal Architects to qualify, scope, and execute bespoke client development requests.
• Rapidly prototyping solutions using APIs, large language models, and supporting technologies to demonstrate feasibility and value.
• Building and adapting integrations that fit into complex client environments, ensuring smooth onboarding and adoption.
• Engaging directly with client technical teams to troubleshoot, debug, and optimise deployments in real time.
• Translating experimental R&D concepts into production-quality code that can evolve into productised features.
• Maintaining a strong feedback loop between client engagements and core engineering to ensure real-world learnings influence the product roadmap.
• Balancing speed and stability, knowing when to produce a quick proof of concept and when to harden code for long-term reliability.
• Collaborating with legal architects, product managers, and researchers to push the boundaries of AI-enabled legal technology.
What We're Looking For
• Strong hands-on Python development experience (for example FastAPI, data pipelines, automation, integrations).
• Experience with AI/ML workflows, NLP, or LLM-driven solutions.
• Strong communication skills and confidence working directly with both technical and non-technical customer stakeholders.
• Ability to own problems end-to-end, from diagnosis to delivery.
• Experience in a customer-facing engineering / forward-deployed environment.
• Bonus: exposure to legal tech, enterprise SaaS, or complex integration projects.
Please apply with your resume if interested.
If you have any exposure to Legal Tech please email ************************ for faster review.
Cloud Engineer
Location: New York, NY
Cloud Infrastructure Engineer
We are seeking a skilled Cloud Infrastructure Engineer to design, implement, and maintain secure, scalable, and resilient cloud infrastructure solutions. The role involves leveraging SaaS and cloud-based technologies to solve complex business challenges and support global operations.
Responsibilities
Implement and support enterprise-scale cloud solutions and integrations.
Build and automate cloud infrastructure using IaC tools such as Terraform, CloudFormation, or ARM templates.
Deploy and support Generative AI platforms and cloud-based vendor solutions.
Implement and enforce cloud security best practices, including IAM, encryption, network segmentation, and compliance with industry standards.
Establish monitoring, logging, and alerting frameworks to ensure high availability and performance.
Optimize cost, performance, and reliability of cloud services.
Participate in on-call rotations and provide support for cloud infrastructure issues.
Maintain documentation, conduct knowledge transfer sessions, and perform design peer reviews.
Experience Level
5+ years in cloud infrastructure engineering, preferably in regulated industries.
Deep expertise in at least one major cloud platform (Azure, AWS, or GCP).
Proficient with Azure and related services (AI/ML tools, security, automation, governance).
Familiarity with SIEM, CNAPP, EDR, Zero Trust architecture, and MDM solutions.
Experience with SaaS integrations and managing third-party cloud services.
Understanding of virtualization, containerization, auto-scaling, and fully automated systems.
Experience scripting in PowerShell and Python; working knowledge of REST APIs.
Networking knowledge (virtual networks, DNS, SSL, firewalls) and IT change management.
Strong collaboration, interpersonal, and communication skills.
Willingness to participate in on-call rotations and after-hours support.
The Phoenix Group Advisors is an equal opportunity employer. We are committed to creating a diverse and inclusive workplace and prohibit discrimination and harassment of any kind based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. We strive to attract talented individuals from all backgrounds and provide equal employment opportunities to all employees and applicants for employment.
Founding Engineer
Location: New York, NY
As an AI Software Engineer, you'll help architect and build the foundation of our agent platform. You'll work at the intersection of LLMs, reasoning systems, and real-world enterprise automation, owning mission-critical infrastructure and shipping features that directly impact how Fortune 1000 IT teams operate.
You'll collaborate closely with the founder, design partners, and real users as we bring the next wave of intelligent IT systems to life.
You'll thrive in this role if you:
Are fluent in Python and/or TypeScript, and care about writing clean, testable code
Have built or worked on AI/agentic systems or LLM-powered products
Enjoy turning ambiguous workflows into structured, automatable systems
Have an entrepreneurial mindset or prior startup/founder experience
Are excited to work across the stack, from backend infra to prompt engineering and UI tools
Tech Stack
Python + TypeScript
LLM APIs (OpenAI, Anthropic, etc.)
LangChain / semantic search / vector DBs
Agentic orchestration systems
AI coding assistants (Cursor, Copilot, etc.)
NG911 Cybersecurity Engineer
Location: New York, NY
NG911 - Cyber Security Tools Implementation Engineer
• Implement solutions for DNS, email, and remote-access configuration, integration, performance monitoring, and security management.
• Test Next-Generation firewall platforms, host operating systems, and services such as LDAP and SMTP.
• Support application development and database administration.
• Provide support for email, DNS, and remote-access solutions.
• Deploy appropriate network security solutions.
MANDATORY SKILLS/EXPERIENCE
• At least 12 years of experience in an enterprise data center environment planning, designing, and installing network and security infrastructure systems for public safety.
• 3+ years of working experience with IBM QRadar SIEM solution integration with Cascade, FireMon, Citrix, and other critical security service technologies
• CISSP or other industry Cyber Security Certification
• Experience migrating DNS to a new platform.
• Experience participating in the design and implementation of a DMZ and all associated requirements for monitoring external threats.
• Experience with security infrastructure and implementation of perimeter network security components such as Next Generation firewalls.
Neo4j Engineer
Location: Summit, NJ
Must Have Technical/Functional Skills
Neo4j, Graph Data Science, Cypher, Python, Graph Algorithms, Bloom, GraphXR, Cloud, Kubernetes, ETL
Roles & Responsibilities
Design and implement graph-based data models using Neo4j.
Develop Cypher queries and procedures for efficient graph traversal and analysis.
Apply Graph Data Science algorithms for community detection, centrality, and similarity.
Integrate Neo4j with enterprise data platforms and APIs.
Collaborate with data scientists and engineers to build graph-powered applications.
Optimize performance and scalability of graph queries and pipelines.
Support deployment and monitoring of Neo4j clusters in cloud or on-prem environments.
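The community-detection and centrality work listed above would normally run through Neo4j's Graph Data Science library (e.g., `gds.degree.stream` over a projected graph), but the underlying idea is easy to illustrate. A minimal pure-Python sketch, where the tiny graph and the degree-centrality measure are illustrative stand-ins for a GDS call:

```python
from collections import defaultdict

# Tiny undirected graph as an adjacency list (stand-in for a Neo4j graph projection).
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def degree_centrality(adj):
    """Normalized degree centrality: degree / (n - 1), the same quantity
    GDS's degree algorithm streams per node."""
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

print(degree_centrality(adj))  # node "c" touches all 3 other nodes, so its score is 1.0
```

In Neo4j itself the equivalent would be a Cypher call into GDS against a named projection rather than an in-memory dict.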
Salary Range: $110,000-$140,000 / Year
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
MSD 365 Engineer
Location: Weehawken, NJ
MSD 365 Engineer (Only W2)
Contract
We are currently seeking an experienced Senior Microsoft Dynamics 365 professional to join our team. The candidate will be joining a team of highly dedicated professionals who thrive on new challenges daily, at a company that demonstrates great care for its employees and has a track record of sound business decisions.
Responsibilities:
• Develop and customize Microsoft Dynamics 365 applications using C#/.NET and Power Platform tools.
• Build integrations between Dynamics 365 and Azure services (Logic Apps, Functions) as part of a modern cloud architecture.
• Support the migration from Salesforce to Dynamics 365 - helping to unify global customer data and business processes.
• Work closely with senior developers and solution architects to design clean, scalable solutions aligned with best practices.
• Participate in code reviews, testing, and CI/CD pipeline activities to ensure high-quality deliverables.
• Troubleshoot and optimize Dynamics plugins and SQL queries to improve system performance.
Thanks & Kind Regards,
Avinash Pathak
Delta System & Software, Inc.
Email Id: ***************************
Cloud Engineer
Location: Garden City, NY
About the Role
We are seeking a skilled Cloud Application Engineer to join our on-site team in Garden City, NY. The ideal candidate has strong experience with AWS cloud infrastructure, solid SQL development skills, and exposure to .NET application environments.
Responsibilities
Implement, maintain and troubleshoot AWS cloud infrastructure for scalability, reliability, and performance.
Build and enhance CI/CD pipelines and deployment automation.
Manage SQL databases, including backups, upgrades, migrations, performance monitoring, and troubleshooting.
Collaborate with developers to automate deployment of, and manually deploy, .NET applications in the cloud environment.
Implement infrastructure as code (IaC) using CloudFormation or Terraform.
Monitor and troubleshoot production systems to ensure uptime and efficiency.
Contribute to architecture and design discussions focused on cloud optimization and modernization.
Partner with cross-functional teams to ensure secure, efficient, and well-documented deployments.
Qualifications
Strong hands-on experience with AWS services (e.g., EC2, RDS, S3, Lambda, ECS, CloudWatch, Cognito, IAM).
Experience implementing DevOps practices including CI/CD automation and environment provisioning.
Proficiency in SQL and database development best practices.
Exposure to .NET (C#, ASP.NET, or .NET Core) and comfort collaborating with application teams.
Familiarity with scripting and automation using PowerShell, Python, Go, or Bash.
Strong understanding of cloud networking, security, and monitoring.
Ability to work on-site in Garden City, NY.
Preferred Skills
Experience with microservices and containerization (Docker, Kubernetes).
Knowledge of infrastructure-as-code and configuration management tools.
Exposure to data pipelines or analytics workloads within AWS.
Familiarity with BitBucket Pipelines, GitLab CI, GitHub Actions, Jenkins or other CI/CD systems.
Strong problem-solving and troubleshooting skills.
Experience with JIRA and Confluence.
Senior DevOps Engineer, Serverless & Cloud Operations
Location: New York, NY
We're urgently hiring a Senior Cloud & DevOps Engineer on behalf of our client, a fast-moving, innovation-driven company in New York! This is a hybrid role in NYC with a base salary between $160-185K and an outstanding benefits package.
This is not a managerial role, but it demands someone highly professional, self-directed, and able to independently prioritize and execute. The ideal candidate moves with urgency, communicates clearly, and is available in emergency situations when needed.
Key Responsibilities
• Own and optimize CI/CD pipelines using AWS, including setup, troubleshooting, and workflow improvements.
• Work confidently across AWS, with hands-on experience in maximizing Lambda performance, managing costs, and using the latest serverless capabilities.
• Design, deploy, and operate serverless architectures using services such as AWS Lambda, API Gateway, and Step Functions.
• Maintain strong observability using Datadog: build dashboards, alerts, and monitoring frameworks to ensure reliability and proactive issue detection.
• Collaborate across teams with a startup-level sense of urgency, accountability, and autonomy.
• Uphold high standards of uptime, resilience, performance, and security across all systems.
Key Requirements
• 5 to 10 years of DevOps experience in fast-paced environments.
• Strong hands-on experience with Azure DevOps, AWS (particularly Lambda), and Datadog.
• Proven ability to build and maintain serverless architectures.
• A clear understanding of operational excellence and its impact on the business.
• Strong communication skills, ownership mentality, and proactive work style.
• U.S. Citizen or Green Card holder. Able to provide three references from direct line managers in previous roles.
• Able to start ASAP and based near NYC.
Sr. DevOps Engineer
Location: New York, NY
Job Title: Senior DevOps Engineer
Compensation: $145,000-165,000
Who we are:
Vernovis is a Total Talent Solutions company that specializes in Technology, Cybersecurity, Finance & Accounting functions. At Vernovis, we help these professionals achieve their career goals, matching them with innovative projects and dynamic direct hire opportunities.
Overview:
The Senior DevOps Engineer will design, implement, and maintain secure, scalable, and reliable cloud infrastructure supporting analytics and machine learning workloads in the insurance and analytics industry. This role focuses on automation, CI/CD, and infrastructure management to ensure efficient and compliant delivery across teams.
Key Responsibilities:
• Design and manage cloud-native infrastructure for analytics and business applications.
• Build and maintain CI/CD pipelines using tools such as GitHub Actions and ArgoCD.
• Develop Infrastructure as Code (IaC) with Terraform or similar tools.
• Implement monitoring, logging, and alerting systems (Prometheus, Grafana, Datadog, ELK).
• Automate processes to improve DevSecOps practices and operational efficiency.
• Collaborate with engineering and data teams to support technical and business goals.
• Lead incident response and root cause analysis to maintain system reliability.
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or related field.
• Experience with Infrastructure as Code (Terraform, Pulumi, or Crossplane).
• Proficiency with Kubernetes or OpenShift.
• Experience with AWS, Azure, or Google Cloud Platform.
• Strong programming skills in Python, Golang, or JavaScript/TypeScript.
• Proficient in Linux systems, networking, and cybersecurity fundamentals.
• Experience with Docker and containerized environments.
Data Engineer
Location: New York, NY
DL Software produces Godel, a financial information and trading terminal.
Role Description
This is a full-time, on-site role based in New York, NY, for a Data Engineer. The Data Engineer will design, build, and maintain scalable data systems and pipelines. Responsibilities include data modeling, developing and managing ETL workflows, optimizing data storage solutions, and supporting data warehousing initiatives. The role also involves collaborating with cross-functional teams to improve data accessibility and analytics capabilities.
Qualifications
Strong proficiency in Data Engineering and Data Modeling
Mandatory: strong experience in global financial instruments including equities, fixed income, options and exotic asset classes
Strong Python background
Expertise in Extract, Transform, Load (ETL) processes and tools
Experience in designing, managing, and optimizing Data Warehousing solutions
Senior Data Engineer
Location: New York, NY
Godel Terminal is a cutting-edge financial platform that puts the world's financial data at your fingertips. From equities and SEC filings to global news delivered in milliseconds, thousands of customers rely on Godel every day to be their guide to the world of finance.
We are looking for a senior engineer in New York City to join our team and help build out live data services as well as historical data for US markets and international exchanges. This position will specifically work on new asset classes and exchanges, but will be expected to contribute to the core architecture as we expand to international markets.
Our team works quickly and efficiently; we are opinionated but flexible when it's time to ship. We know what needs to be done, and how to do it. We are laser-focused on not just giving our customers what they want, but exceeding their expectations. We are very proud that when someone opens the app for the first time, they ask: “How on earth does this work so fast?” If that sounds like a team you want to be part of, here is what we need from you:
Minimum qualifications:
Able to work out of our Manhattan office minimum 4 days a week
5+ years of experience in a financial or startup environment
5+ years of experience working on live data as well as historical data
3+ years of experience in Java, Python, and SQL
Experience managing multiple production ETL pipelines that reliably store and validate financial data
Experience launching, scaling, and improving backend services in cloud environments
Experience migrating critical data across different databases
Experience owning and improving critical data infrastructure
Experience teaching best practices to junior developers
Preferred qualifications:
5+ years of experience in a fintech startup
5+ years of experience in Java, Kafka, Python, PostgreSQL
5+ years of experience working with Websockets like RXStomp or Socket.io
5+ years of experience wrangling cloud providers like AWS, Azure, GCP, or Linode
2+ years of experience shipping and optimizing Rust applications
Demonstrated experience keeping critical systems online
Demonstrated creativity and resourcefulness under pressure
Experience with corporate debt / bonds and commodities data
Salary range begins at $150,000 and increases with experience
Benefits: Health Insurance, Vision, Dental
To try the product, go to *************************
Azure Data Engineer
Location: Weehawken, NJ
· Expert level skills writing and optimizing complex SQL
· Experience with complex data modelling, ETL design, and using large databases in a business environment
· Experience with building data pipelines and applications to stream and process datasets at low latencies
· Fluent with Big Data technologies like Spark, Kafka and Hive
· Expert level understanding on Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
· Designing and building of data pipelines using API ingestion and Streaming ingestion methods
· Knowledge of Dev-Ops processes (including CI/CD) and Infrastructure as code is essential
· Experience in developing NoSQL solutions using Azure Cosmos DB is essential
· Thorough understanding of Azure and AWS Cloud Infrastructure offerings
· Working knowledge of Python is desirable
· Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
· Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
· Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
· Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
· Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
· Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making.
· Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
· Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
Best Regards,
Dipendra Gupta
Technical Recruiter
*****************************
Data Engineer - VC Backed Healthcare Firm - NYC or San Francisco
Location: New York, NY
Are you a data engineer who loves building systems that power real impact in the world?
A fast-growing healthcare technology organization is expanding its innovation team and is looking for a Data Engineer II to help build the next generation of its data platform. This team sits at the center of a major transformation effort, partnering closely with engineering, analytics, and product to design the foundation that supports advanced automation, AI, intelligent workflows, and high-scale data operations that drive measurable outcomes for hospitals, health systems, and medical groups.
In this role, you will design, develop, and maintain software applications that process large volumes of data every day. You will collaborate with cross-functional teams to understand data requirements, build and optimize data models, and create systems that ensure accuracy, reliability, and performance. You will write code that extracts, transforms, and loads data from a variety of sources into modern data warehouses and data lakes, while implementing best-in-class data quality and governance practices. You will work hands-on with big data technologies such as Hadoop, Spark, and Kafka, and you will play a critical role in troubleshooting, performance tuning, and ensuring the scalability of complex data applications.
To thrive here, you should bring strong problem solving ability, analytical thinking, and excellent communication skills. This is an opportunity to join an expanding innovation group within a leading healthcare platform that is investing heavily in data, AI, and the future of intelligent revenue operations. If you want to build systems that make a real difference and work with teams that care deeply about improving patient experiences and provider performance, this is a chance to do highly meaningful engineering at scale.
Cloud Data Engineer
Location: New York, NY
Title: Enterprise Data Management - Data Cloud, Senior Developer I
Duration: FTE/Permanent
Salary: 130-165k
The Data Engineering team oversees the organization's central data infrastructure, which powers enterprise-wide data products and advanced analytics capabilities in the investment management sector. We are seeking a senior cloud data engineer to spearhead the architecture, development, and rollout of scalable, reusable data pipelines and products, emphasizing the creation of semantic data layers to support business users and AI-enhanced analytics. The ideal candidate will work hand-in-hand with business and technical groups to convert intricate data needs into efficient, cloud-native solutions using cutting-edge data engineering techniques and automation tools.
Responsibilities:
Collaborate with business and technical stakeholders to collect requirements, pinpoint data challenges, and develop reliable data pipeline and product architectures.
Design, build, and manage scalable data pipelines and semantic layers using platforms like Snowflake, dbt, and similar cloud tools, prioritizing modularity for broad analytics and AI applications.
Create semantic layers that facilitate self-service analytics, sophisticated reporting, and integration with AI-based data analysis tools.
Build and refine ETL/ELT processes with contemporary data technologies (e.g., dbt, Python, Snowflake) to achieve top-tier reliability, scalability, and efficiency.
Incorporate and automate AI analytics features atop semantic layers and data products to enable novel insights and process automation.
Refine data models (including relational, dimensional, and semantic types) to bolster complex analytics and AI applications.
Advance the data platform's architecture, incorporating data mesh concepts and automated centralized data access.
Champion data engineering standards, best practices, and governance across the enterprise.
Establish CI/CD workflows and protocols for data assets to enable seamless deployment, monitoring, and versioning.
Partner across Data Governance, Platform Engineering, and AI groups to produce transformative data solutions.
Qualifications:
Bachelor's or Master's in Computer Science, Information Systems, Engineering, or equivalent.
10+ years in data engineering, cloud platform development, or analytics engineering.
Extensive hands-on work designing and tuning data pipelines, semantic layers, and cloud-native data solutions, ideally with tools like Snowflake, dbt, or comparable technologies.
Expert-level SQL and Python skills, plus deep familiarity with data tools such as Spark, Airflow, and cloud services (e.g., Snowflake, major hyperscalers).
Preferred: Experience containerizing data workloads with Docker and Kubernetes.
Track record architecting semantic layers, ETL/ELT flows, and cloud integrations for AI/analytics scenarios.
Knowledge of semantic modeling, data structures (relational/dimensional/semantic), and enabling AI via data products.
Bonus: Background in data mesh designs and automated data access systems.
Skilled in dev tools like Azure DevOps equivalents, Git-based version control, and orchestration platforms like Airflow.
Strong organizational skills, precision, and adaptability in fast-paced settings with tight deadlines.
Proven self-starter who thrives independently and collaboratively, with a commitment to ongoing tech upskilling.
Bonus: Exposure to BI tools (e.g., Tableau, Power BI), though not central to the role.
Familiarity with investment operations systems (e.g., order management or portfolio accounting platforms).
Azure Data Engineer
Location: Jersey City, NJ
Title: Senior Azure Data Engineer
Client: Major Japanese Bank
Experience Level: Senior (10+ Years)
The Senior Azure Data Engineer will design, build, and optimize enterprise data solutions within Microsoft Azure for a major Japanese bank. This role focuses on architecting scalable data pipelines, enhancing data lake environments, and ensuring security, compliance, and data governance best practices.
Key Responsibilities:
Develop, maintain, and optimize Azure-based data pipelines and ETL/ELT workflows.
Design and implement Azure Data Lake, Synapse, Databricks, and ADF solutions.
Ensure data security, compliance, lineage, and governance controls.
Partner with architecture, data governance, and business teams to deliver high-quality data solutions.
Troubleshoot performance issues and improve system efficiency.
Required Skills:
10+ years of data engineering experience.
Strong hands-on expertise with Azure Synapse, Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure SQL.
Azure certifications strongly preferred.
Strong SQL, Python, and cloud data architecture skills.
Experience in financial services or large enterprise environments preferred.
Azure DevOps Engineer
Location: Jersey City, NJ
About US:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
Job Title: Azure DevOps Engineer
Work Location
Jersey City, NJ
Job Description:
1. Extensive hands-on experience with GitHub Actions, writing workflows in YAML using reusable templates
2. Extensive hands-on experience with application CI/CD pipelines, both for Azure and on-prem, across different frameworks
3. Hands-on experience with Azure DevOps and CI/CD pipeline migration programs, preferably from Azure DevOps to GitHub Actions
4. Proficiency in integrating and consuming REST APIs to achieve automation through scripting
5. Hands-on experience with at least one scripting language, having built out-of-the-box automations for platforms like PeopleSoft, SharePoint, MDM, etc.
6. Hands-on experience with CI/CD for databases
7. Good to have: experience with infrastructure-as-code, including ARM templates, Terraform, Azure CLI, and Azure PowerShell modules
8. Exposure to monitoring tools like ELK, Prometheus, and Grafana
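The reusable YAML workflows called out in point 1 above pair a callable workflow with a caller via `workflow_call`. A hedged sketch: the file paths, job names, and the .NET build step are illustrative, not taken from the posting:

```yaml
# .github/workflows/build-reusable.yml -- a callable (reusable) workflow
name: build-reusable
on:
  workflow_call:
    inputs:
      dotnet-version:
        required: false
        type: string
        default: "8.0.x"
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ inputs.dotnet-version }}
      - run: dotnet build --configuration Release
---
# .github/workflows/ci.yml -- the caller workflow
name: ci
on: [push]
jobs:
  call-build:
    uses: ./.github/workflows/build-reusable.yml
    with:
      dotnet-version: "8.0.x"
```

Migrating from Azure DevOps typically means translating pipeline templates into callable workflows like this one, keeping shared build logic in a single file.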
Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation like an annual performance-based bonus, sales incentive pay and other forms of bonus or variable compensation.
Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Senior Data Engineer (Snowflake)
Requirements engineer job in Parsippany-Troy Hills, NJ
Senior Data Engineer (Snowflake & Python)
1-Year Contract | $60/hour + Benefit Options
Hybrid: On-site a few days per month (local candidates only)
Work Authorization Requirement
You must be authorized to work for any employer as a W-2 employee. This position is W-2 only: no C2C, no third-party submissions, and no sponsorship will be considered.
Overview
We are seeking a Senior Data Engineer to support enterprise-scale data initiatives for a highly collaborative engineering organization. This is a new, long-term contract opportunity for a hands-on data professional who thrives in fast-paced environments and enjoys building high-quality, scalable data solutions on Snowflake.
Candidates must be based in or around New Jersey, able to work on-site at least 3 days per month, and meet the W2 employment requirement.
What You'll Do
Design, develop, and support enterprise-level data solutions with a strong focus on Snowflake
Participate across the full software development lifecycle - planning, requirements, development, testing, and QA
Partner closely with engineering and data teams to identify and implement optimal technical solutions
Build and maintain high-performance, scalable data pipelines and data warehouse architectures
Ensure platform performance, reliability, and uptime, maintaining strong coding and design standards
Troubleshoot production issues, identify root causes, implement fixes, and document preventive solutions
Manage deliverables and priorities effectively in a fast-moving environment
Contribute to data governance practices including metadata management and data lineage
Support analytics and reporting use cases leveraging advanced SQL and analytical functions
Required Skills & Experience
8+ years of experience designing and developing data solutions in an enterprise environment
5+ years of hands-on Snowflake experience
Strong hands-on development skills with SQL and Python
Proven experience designing and developing data warehouses in Snowflake
Ability to diagnose, optimize, and tune SQL queries
Experience with Azure data frameworks (e.g., Azure Data Factory)
Strong experience with orchestration tools such as Airflow, Informatica, Automic, or similar
Solid understanding of metadata management and data lineage
Hands-on experience with SQL analytical functions
Working knowledge of shell scripting and JavaScript
Experience using Git, Confluence, and Jira
Strong problem-solving and troubleshooting skills
Collaborative mindset with excellent communication skills
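The "SQL analytical functions" requirement above usually means window functions (running totals, rankings, moving averages). A minimal sketch using Python's built-in sqlite3, which supports window functions; the table and column names are purely illustrative, not from the posting:

```python
import sqlite3

# In-memory database with a hypothetical sales table for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('East', '2024-01', 100), ('East', '2024-02', 150),
        ('West', '2024-01', 200), ('West', '2024-02', 120);
""")

# A typical analytical (window) function: a running total per region,
# computed with SUM(...) OVER (PARTITION BY ... ORDER BY ...).
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (
               PARTITION BY region ORDER BY month
           ) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

The same OVER/PARTITION BY syntax carries over to Snowflake, where these functions are heavily used for analytics and reporting workloads.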
Nice to Have
Experience supporting Pharma industry data
Exposure to Omni-channel data environments
Why This Opportunity
$60/hour W2 on a long-term 1-year contract
Benefit options available
Hybrid structure with limited on-site requirement
High-impact role supporting enterprise data initiatives
Clear expectations: W-2 only, no third-party submissions, no Corp-to-Corp
This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
Data Engineer
Requirements engineer job in New York, NY
Haptiq is a leader in AI-powered enterprise operations, delivering digital solutions and consulting services that drive value and transform businesses. We specialize in using advanced technology to streamline operations, improve efficiency, and unlock new revenue opportunities, particularly within the private capital markets.
Our integrated ecosystem includes PaaS - Platform as a Service, the Core Platform, an AI-native enterprise operations foundation built to optimize workflows, surface insights, and accelerate value creation across portfolios; SaaS - Software as a Service, a cloud platform delivering unmatched performance, intelligence, and execution at scale; and S&C - Solutions and Consulting Suite, modular technology playbooks designed to manage, grow, and optimize company performance. With over a decade of experience supporting high-growth companies and private equity-backed platforms, Haptiq brings deep domain expertise and a proven ability to turn technology into a strategic advantage.
The Opportunity
As a Data Engineer within the Global Operations team, you will be responsible for managing the internal data infrastructure, building and maintaining data pipelines, and ensuring the integrity, cleanliness, and usability of data across our critical business systems. This role will play a foundational part in developing a scalable internal data capability to drive decision-making across Haptiq's operations.
Responsibilities and Duties
Design, build, and maintain scalable ETL/ELT pipelines to consolidate data from delivery, finance, and HR systems (e.g., Kantata, Salesforce, JIRA, HRIS platforms).
Ensure consistent data hygiene, normalization, and enrichment across source systems.
Develop and maintain data models and data warehouses optimized for analytics and operational reporting.
Partner with business stakeholders to understand reporting needs and ensure the data structure supports actionable insights.
Own the documentation of data schemas, definitions, lineage, and data quality controls.
Collaborate with the Analytics, Finance, and Ops teams to build centralized reporting datasets.
Monitor pipeline performance and proactively resolve data discrepancies or failures.
Contribute to architectural decisions related to internal data infrastructure and tools.
Requirements
3-5 years of experience as a data engineer, analytics engineer, or similar role.
Strong experience with SQL, data modeling, and pipeline orchestration (e.g., Airflow, dbt).
Hands-on experience with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift).
Experience working with REST APIs and integrating with SaaS platforms like Salesforce, JIRA, or Workday.
Proficiency in Python or another scripting language for data manipulation.
Familiarity with modern data stack tools (e.g., Fivetran, Stitch, Segment).
Strong understanding of data governance, documentation, and schema management.
Excellent communication skills and ability to work cross-functionally.
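The ETL/ELT and SaaS-integration work described above typically includes a normalization step between extraction from a REST API and loading into the warehouse. A minimal sketch in pure Python; the field names (`Id`, `Email`, `LastModifiedDate`) are hypothetical stand-ins for a SaaS payload, not an actual Salesforce or JIRA schema:

```python
from datetime import datetime, timezone

def normalize_record(raw: dict) -> dict:
    """Normalize one raw API record into a warehouse-friendly shape.

    Field names are illustrative; real SaaS payloads differ per platform.
    """
    return {
        # Cast IDs to strings so numeric and alphanumeric IDs share a type.
        "id": str(raw["Id"]),
        # Basic hygiene: trim whitespace, lowercase, map empty to None.
        "email": raw.get("Email", "").strip().lower() or None,
        # Normalize timestamps to UTC ISO-8601.
        "updated_at": datetime.fromisoformat(
            raw["LastModifiedDate"].replace("Z", "+00:00")
        ).astimezone(timezone.utc).isoformat(),
    }

# Example payload, shaped the way a REST API page of results might arrive.
raw_rows = [
    {"Id": 101, "Email": "  Alice@Example.COM ",
     "LastModifiedDate": "2024-05-01T12:00:00Z"},
    {"Id": 102, "Email": "",
     "LastModifiedDate": "2024-05-02T08:30:00+00:00"},
]

clean = [normalize_record(r) for r in raw_rows]
```

In practice this kind of step runs inside an orchestrated pipeline (e.g., an Airflow task or dbt staging model) rather than as a standalone script.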
Benefits
Flexible work arrangements (including hybrid mode)
Great Paid Time Off (PTO) policy
Comprehensive benefits package (Medical / Dental / Vision / Disability / Life)
Healthcare and Dependent Care Flexible Spending Accounts (FSAs)
401(k) retirement plan
Access to HSA-compatible plans
Pre-tax commuter benefits
Employee Assistance Program (EAP)
Opportunities for professional growth and development.
A supportive, dynamic, and inclusive work environment.
Why Join Us?
We value creative problem solvers who learn fast, work well in an open and diverse environment, and enjoy pushing the bar for success ever higher. We do work hard, but we also choose to have fun while doing it.
The compensation range for this role is $75,000 to $80,000 USD
Senior Data Engineer
Requirements engineer job in New York, NY
Our client is a growing fintech software company headquartered in New York, NY. They have several hundred employees and are in growth mode.
They are currently looking for a Senior Data Engineer with 6+ years of overall professional experience. Qualified candidates will have hands-on experience with Python (6 years), SQL (6 years), dbt (3 years), AWS (Lambda, Glue), Airflow, and Snowflake (3 years), plus a BS in Computer Science and strong CS fundamentals.
The Senior Data Engineer will work in a collaborative team environment and will be responsible for building, optimizing, and scaling ETL data pipelines, dbt models, and data warehousing. Excellent communication and organizational skills are expected.
This role features competitive base salary, equity, 401(k) with company match and many other attractive perks. Please send your resume to ******************* for immediate consideration.
Senior Data Engineer
Requirements engineer job in New Providence, NJ
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies - in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences - to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents.
Job Description
Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
Work in tandem with our engineering team to identify and implement the most optimal solutions
Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
Able to manage deliverables in fast-paced environments
Areas of Expertise
At least 10 years of experience designing and developing data solutions in an enterprise environment
At least 5 years of experience on the Snowflake platform
Strong hands-on SQL and Python development
Experience with designing and developing data warehouses in Snowflake
A minimum of three years of experience developing production-ready data ingestion and processing pipelines using Spark and Scala
Strong hands-on experience with orchestration tools (e.g., Airflow, Informatica, Automic)
Good understanding of metadata management and data lineage
Hands-on knowledge of SQL analytical functions
Strong knowledge and hands-on experience in shell scripting and JavaScript
Demonstrated experience with software engineering practices, including CI/CD, automated testing, and performance engineering
Good understanding of and exposure to Git, Confluence, and Jira
Good problem-solving and troubleshooting skills
Team player with a collaborative approach and excellent communication skills
Our Commitment to Diversity & Inclusion:
Did you know that Apexon has been Certified™ by Great Place To Work, the global authority on workplace culture, in each of the three regions in which it operates: the USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK? Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We are taking affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com)