AI/ML Engineer (Customer Facing, FDE, Python AI Agent/LLM)
Requirements engineer job in New York, NY
Python Engineer / Data Scientist (Forward Deployed) - LLM/AI Agent products in Legal Tech.
Salary: $170,000-$190,000 + benefits
Company: Late-Stage Scaleup in Legal AI Software
Our client, a rapidly expanding global Legal AI software company backed by top-tier investors, is transforming how legal teams operate through intelligent automation and applied AI. They are hiring two customer-facing, hands-on Python Engineers / Data Scientists to join their brand-new Forward Deployed team in Manhattan, New York.
This role sits at the intersection of engineering, data, and client delivery, making it ideal for someone who thrives on technical problem-solving while working directly with enterprise customers. Expect the role to be about 50% hands-on coding. As a Forward Deployed technologist, you will work with real customers on real problems, delivering bespoke, high-impact solutions.
Responsibilities include:
• Working closely with Technical and Legal Architects to qualify, scope, and execute bespoke client development requests.
• Rapidly prototyping solutions using APIs, large language models, and supporting technologies to demonstrate feasibility and value.
• Building and adapting integrations that fit into complex client environments, ensuring smooth onboarding and adoption.
• Engaging directly with client technical teams to troubleshoot, debug, and optimise deployments in real time.
• Translating experimental R&D concepts into production-quality code that can evolve into productised features.
• Maintaining a strong feedback loop between client engagements and core engineering to ensure real-world learnings influence the product roadmap.
• Balancing speed and stability, knowing when to produce a quick proof of concept and when to harden code for long-term reliability.
• Collaborating with legal architects, product managers, and researchers to push the boundaries of AI-enabled legal technology.
What We're Looking For
• Strong hands-on Python development experience (for example FastAPI, data pipelines, automation, integrations).
• Experience with AI/ML workflows, NLP, or LLM-driven solutions.
• Strong communication skills and confidence working directly with both technical and non-technical customer stakeholders.
• Ability to own problems end-to-end, from diagnosis to delivery.
• Experience in a customer-facing engineering / forward-deployed environment.
• Bonus: exposure to legal tech, enterprise SaaS, or complex integration projects.
Please apply with your resume if interested.
If you have any exposure to Legal Tech please email ************************ for faster review.
BI Engineer (Tableau & Power BI - platforms/server)
Requirements engineer job in Newark, NJ
Job Title: BI Engineer (Tableau & Power BI - platforms/server)
Duration: 12 months long term project
US citizens, Green Card holders, and those authorized to work in the US are encouraged to apply. We are unable to sponsor H1B candidates at this time.
Summary of the job
- Extremely technical, hands-on skills in Power BI, Python, and some Tableau
- Financial services, asset management, or banking background
- Fixed Income experience specifically is a big plus
- Azure Cloud
Job Description:
Our Role:
We are looking for an astute, determined professional like you to fulfill a BI Engineering role within our Technology Solutions Group.
You will showcase your success in a fast-paced environment through collaboration, ownership, and innovation.
Your expertise in emerging trends and practices will spark stimulating discussions around optimization and change, helping us keep our competitive edge.
This rewarding opportunity will enable you to make a big impact in our organization, so if this sounds exciting, this might be the place for you.
Your Impact:
Build and maintain new and existing applications in preparation for a large-scale architectural migration within an Agile function.
Align with the Product Owner and Scrum Master in assessing business needs and transforming them into scalable applications.
Build and maintain code to manage data received from heterogeneous sources, including web-based sources, internal/external databases, and flat files, in heterogeneous formats (binary, ASCII).
Help build a new enterprise data warehouse and maintain the existing one.
Design and support effective storage and retrieval of very large internal and external data sets, and think ahead about the convergence strategy with our AWS cloud migration.
Assess the impact of scaling up and scaling out and ensure sustained data management and data delivery performance.
Build interfaces for supporting evolving and new applications and accommodating new data sources and types of data.
Your Required Skills:
5+ years of hands-on experience in BI platform administration, such as Power BI and Tableau
3+ years of hands-on experience in Power BI/Tableau report development
Experience with both server and desktop-based data visualization tools
Expertise with multiple database platforms, including relational databases (e.g., SQL Server) as well as cloud-based data warehouses such as Azure
Fluent with SQL for data analysis
Working experience in a Windows based environment
Knowledge of data warehousing, ETL procedures, and BI technologies
Excellent analytical and problem-solving skills with the ability to think quickly and offer alternatives both independently and within teams.
Exposure working in an Agile environment with Scrum Master/Product owner and ability to deliver
Ability to communicate the status and challenges with the team
Demonstrating the ability to learn new skills and work as a team
Strong interpersonal skills
A reasonable, good-faith estimate of the minimum and maximum pay rate for this position is $70/hr. to $80/hr.
GTM Engineer
Requirements engineer job in New York, NY
About us:
Camber builds software to improve the quality and accessibility of healthcare. We streamline and replace manual work so clinicians can focus on what they do best: providing great care. For more details on our thesis, check out our write-up: What is Camber?
We've raised $50M in funding from phenomenal supporters at a16z, Craft Ventures, YCombinator, Manresa, and many others who are committed to improving the accessibility of care. For more information, take a look at: Announcing Camber
About our Culture:
Our mission to change behavioral health starts with us and how we operate. We don't want to just change behavioral health, we want to change the way startups operate. Here are a few tactical examples:
1) Improving accessibility and quality of healthcare is something we live and breathe. Everyone on Camber's team cares deeply about helping clinicians and patients.
2) We have to have a sense of humor. Healthcare is so broken, it's depressing if you don't laugh with us.
About the role:
We're seeking a proactive, tech-savvy sales operations professional with a startup mindset-someone who thrives on breaking growth barriers and enabling sales excellence. This person will be both a systems admin and a strategic partner: ensuring HubSpot and our tech stack are humming, while also helping shape compensation, territories, and GTM expansion.
What you'll do:
Systems & CRM Administration
Manage and optimize current CRM (HubSpot) and other tech stack integrations: build workflows, dashboards, and troubleshoot system issues
Support onboarding/offboarding of users, governance, data hygiene, and adoption
Data, Forecasting & Reporting
Design and maintain dashboards, reports, and metrics that drive decision-making (e.g., pipeline health, forecast accuracy, win rates)
Deliver actionable insights to stakeholders across sales leadership
Compensation & Territory Strategy
Assist in designing incentive and quota plans that align with sales goals
Collaborate on territory definition, alignment, and carve strategy to ensure balanced coverage
Process & Cross-Functional Enablement
Streamline sales workflows and marketing-to-sales handoffs
Partner across teams-Sales, Marketing, Finance-to ensure operational alignment and seamless execution
Strategic & Tactical Execution
Be hands-on when needed (data crunching, HubSpot tweaks) while contributing to broader sales strategy planning
What we're looking for:
2-4 years in a startup, sales operations, or RevOps environment (or similar roles)
CRM administration experience-ideally HubSpot; bonus if familiar with other tools and workflows
Strong analytical skills-coding, Excel, BI, sales forecasting, data modeling
Operational rigor and problem-solving mindset
A strategic thinker who can scale systems and structure
Thrives in growth-stage constraints; comfortable wearing multiple hats and moving quickly
Perks & Benefits at Camber:
Comprehensive Health Coverage: Medical, dental, and vision plans with nationwide coverage, including 24/7 virtual urgent care.
Mental Health Support: Weekly therapy reimbursement up to $100, so you can prioritize the care that works best for you.
Paid Parental Leave: Up to 12 weeks of fully paid time off for new parents (birth, adoption, or foster care).
Financial Wellness: 401K (traditional & Roth), HSA & FSA options, and monthly commuter benefits for NYC employees.
Time Off That Counts: 18 PTO days per year (plus rollover), plus office closures for holidays, monthly team events, company off-sites, and daily in-office lunches for our team.
Fitness Stipend: $100/month to use on fitness however you choose.
Hybrid Flexibility: In NYC? We gather in the office 3-5x/week, with flexibility when life happens. Fridays are remote-friendly.
Camber is based in New York City, and we prioritize in-person and hybrid candidates.
Building an inclusive culture is one of our core tenets as a company. We're very aware of structural inequalities that exist, and recognize that underrepresented minorities are less likely to apply for a role if they don't think they meet all of the requirements. If that's you and you're reading this, we'd like to encourage you to apply regardless - we'd love to get to know you and see if there's a place for you here!
In addition, we take security seriously, and all of our employees contribute to uphold security requirements and maintain compliance with HIPAA security regulations.
Founding Engineer
Requirements engineer job in New York, NY
About the Role
We are a stealth U.S. startup building a high-impact software platform that will redefine operations within our target industry.
We are searching for a Founding Engineer who has previously built and shipped full software products, someone who can take ownership, work closely with the CEO, and help craft the product from the ground up.
You will be building a platform that must be secure, fast, reliable, and scalable.
Your work will shape the foundation of the company.
Responsibilities
Build the MVP quickly while maintaining clean, modern architecture
Write production-level frontend, backend, and database code
Implement platform security, authentication, and data protection
Collaborate closely with the CEO on product decisions and sprint priorities
Build internal tools, dashboards, and new features at startup speed
Help shape engineering culture from day one
Requirements
Experience building software from end-to-end (solo or as a lead)
Strong knowledge of React/Next.js, TypeScript, Node.js, PostgreSQL, or similar
Ability to architect secure systems with best practices
Entrepreneurial mindset, comfortable iterating quickly
Open to equity-only compensation during initial phase
Compensation
Competitive founding-team equity
Equity-only in the early phase
Remote
Path to salary after funding
If this role resonates with you and you have built full products before, please apply. I review every application personally and would love to explore whether this could be a strong fit.
Cloud Engineer
Requirements engineer job in New York, NY
Cloud Infrastructure Engineer
We are seeking a skilled Cloud Infrastructure Engineer to design, implement, and maintain secure, scalable, and resilient cloud infrastructure solutions. The role involves leveraging SaaS and cloud-based technologies to solve complex business challenges and support global operations.
Responsibilities
Implement and support enterprise-scale cloud solutions and integrations.
Build and automate cloud infrastructure using IaC tools such as Terraform, CloudFormation, or ARM templates.
Deploy and support Generative AI platforms and cloud-based vendor solutions.
Implement and enforce cloud security best practices, including IAM, encryption, network segmentation, and compliance with industry standards.
Establish monitoring, logging, and alerting frameworks to ensure high availability and performance.
Optimize cost, performance, and reliability of cloud services.
Participate in on-call rotations and provide support for cloud infrastructure issues.
Maintain documentation, conduct knowledge transfer sessions, and perform design peer reviews.
Experience Level
5+ years in cloud infrastructure engineering, preferably in regulated industries.
Deep expertise in at least one major cloud platform (Azure, AWS, or GCP).
Proficient with Azure and related services (AI/ML tools, security, automation, governance).
Familiarity with SIEM, CNAPP, EDR, Zero Trust architecture, and MDM solutions.
Experience with SaaS integrations and managing third-party cloud services.
Understanding of virtualization, containerization, auto-scaling, and fully automated systems.
Experience scripting in PowerShell and Python; working knowledge of REST APIs.
Networking knowledge (virtual networks, DNS, SSL, firewalls) and IT change management.
Strong collaboration, interpersonal, and communication skills.
Willingness to participate in on-call rotations and after-hours support.
The Phoenix Group Advisors is an equal opportunity employer. We are committed to creating a diverse and inclusive workplace and prohibit discrimination and harassment of any kind based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. We strive to attract talented individuals from all backgrounds and provide equal employment opportunities to all employees and applicants for employment.
Plumbing Engineer
Requirements engineer job in New York, NY
🔎 We're Hiring: Senior Plumbing & Fire Protection Engineer / MEP Designer (On-Site - Brooklyn, NY)
Precision Design, a leading MEP Engineering firm in Brooklyn, NY, is seeking a Senior Plumbing Engineer / MEP Designer with strong experience in Plumbing and Fire Protection.
We are looking for a highly skilled professional who can independently design systems, coordinate across multiple disciplines, and manage multiple projects in a fast-paced environment.
Candidates must have at least 5 years of industry experience, including a minimum of 3 years designing in NYC, and must be fully knowledgeable of NYC Building and Energy Codes.
💼 Responsibilities
Design Plumbing & Fire Protection systems from concept through full construction documents
Prepare calculations for water, gas, sanitary/sewer, and storm loads
Perform field surveys and assess existing building conditions
Produce drawings, specifications, and all phases of design (schematic → construction administration)
Coordinate with architectural, engineering, and external project teams, including contractors and city agencies
Manage multiple projects simultaneously
Review shop drawings and participate in project meetings
📘 Required Skills & Experience
5+ years of related experience in Plumbing and/or Fire Protection design
At least 3 years of NYC-specific design experience
Strong knowledge of NYC Building Codes, NYC Energy Conservation Code, and NYC filing requirements
Experience with utility company filing procedures
Proficiency in AutoCAD (Revit is a plus)
Familiarity with NFPA-13, NFPA-13R, and hydraulic calculations
Experience with DEP cross-connection and site connection submissions is strongly preferred
Excellent communication, teamwork, and interpersonal skills
Ability to work independently and manage multiple deadlines
📍 Work Location
On-site in our Brooklyn, NY office (no remote option)
AV Engineer
Requirements engineer job in New York, NY
Full-Time Onsite Position
We're looking for an experienced Audio/Visual Engineer to design, implement, and support a cutting-edge AV infrastructure. As our organization continues to grow, technology remains at the heart of our business, and our AV needs are evolving right alongside it.
In this role, you'll be the subject matter expert for all things AV: from designing globally scalable, fault-tolerant systems to supporting live events, collaboration platforms, digital signage, and complex event spaces. You'll work closely with the Technology and Infrastructure teams to deliver high-quality, reliable AV solutions that enhance communication and collaboration across the firm.
What You'll Do
Architect and manage enterprise-scale AV systems (meeting rooms, event spaces, IPTV, digital signage, Cisco VC solutions).
Provide hands-on support for live and hybrid events.
Collaborate on acoustic treatments and space design to optimize performance.
Partner with vendors, architects, and cross-functional teams on large-scale buildouts.
Ensure world-class reliability and user experience in every solution.
About You
7+ years in AV engineering with enterprise-scale systems.
Expertise in AVoIP, DSPs, control systems, and collaboration tools (Zoom, Webex, Teams).
Strong technical production skills for hybrid/live events.
Solid understanding of networking fundamentals (VLANs, routing, firewalls).
Skilled in multitasking and leading multiple complex projects.
Bonus: AutoCAD/Revit or programming experience.
Backend Engineer
Requirements engineer job in New York, NY
🚀 Coders Connect is partnering with one of New York's fastest-growing AI startups to hire a Backend Engineer who thrives on building systems at scale.
This company is pioneering a new product category-helping global brands understand and control how they show up in AI-driven search environments. If you're a backend builder who wants to shape the infrastructure of a real-time, data-intensive platform, this one's for you.
⚙️ The Role:
As a Backend Engineer, you'll be designing the scalable APIs, data pipelines, and high-performance systems that power a cutting-edge analytics platform at the intersection of AI, data, and search visibility.
You'll work closely with front-end, data, and product teams to create the backend layer that makes fast, intelligent insights possible.
🎯 What You'll Do:
Architect and implement backend systems that support real-time, AI-driven analytics.
Design and maintain APIs that handle large-scale structured and unstructured data.
Optimize data processing pipelines for speed, scalability, and reliability.
Collaborate with engineers, designers, and data scientists to ship features that matter.
Own key technical decisions and balance speed vs long-term maintainability.
Contribute to a fast-moving, product-first engineering culture.
✅ What You Bring:
A software engineering degree or related degree from a top U.S. university
Production experience with Node.js, Python, or Go.
Deep knowledge of relational or NoSQL databases (PostgreSQL, MySQL, or similar).
Experience designing and optimizing data pipelines and working with distributed systems.
Solid understanding of API security, authentication, and performance tuning.
Familiarity with cloud infrastructure (AWS, GCP, Azure) and containerization tools like Docker or Kubernetes.
Bonus: background working on analytics or data-heavy products at fast-scaling startups.
✨ Culture & Perks:
Impact from Day 1: Own core infrastructure in a startup that's scaling fast.
Design + Data Mindset: Work closely with frontend and product teams to deliver real-time insights beautifully and reliably.
Tight-Knit Team: Join a lean, ambitious engineering squad with startup roots and a hacker culture.
Onsite Energy: The team thrives on in-person collaboration, speed, and iteration.
If you're excited to build the infrastructure behind the next generation of AI analytics-and want to do it with a world-class team in NYC-this could be your next big move. Apply now to learn more and explore this opportunity!
M365 Collaboration Engineer
Requirements engineer job in New York, NY
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including internationally in Mexico and India. We are seeking a highly experienced M365 & Collaboration Engineer to architect, implement, and support modern collaboration and unified communication solutions across the enterprise. This senior-level role will lead end-to-end design, optimization, troubleshooting, and lifecycle management for Microsoft 365, unified communications, and integration capabilities.
The ideal candidate combines technical depth, architectural vision, strong scripting skills, and the ability to collaborate across engineering, security, and business teams.
This is a hybrid position, working on-site Monday through Thursday and remotely on Fridays.
Key Responsibilities
Architecture, Design & Implementation
Lead design, implementation, and lifecycle support for enterprise collaboration products (M365, Slack, Zoom, Teams, Unified Communication systems, etc.).
Architect scalable solutions aligned with organizational strategy and future technology roadmap.
Oversee full-stack collaboration deployments while simultaneously supporting ongoing architectural projects.
Design and implement unified communication and collaboration workflows using Microsoft 365 technologies.
Operations, Troubleshooting & Optimization
Provide advanced troubleshooting, performance tuning, and reliability improvements for M365 and other collaboration tools.
Perform incident root-cause analysis and develop remediation strategies.
Plan and deliver infrastructure improvements with a 6-month outlook.
Integration & Cross-Functional Collaboration
Work closely with cybersecurity teams and vendors to integrate M365 with other cloud and on-premises systems.
Support technology assessments and collaborate with solution partners and internal stakeholders.
Ensure interoperability between Microsoft 365 and other enterprise applications.
Security, Governance & Documentation
Maintain security and compliance across the Microsoft 365 environment by implementing appropriate policies and controls.
Document architecture, configurations, and solution designs.
Provide training and knowledge transfer to internal teams and stakeholders.
Qualifications
7+ years of professional experience in Microsoft 365 architecture, engineering, and implementation.
Expert-level knowledge of M365 services (Exchange Online, SharePoint Online, Teams, OneDrive, Power Platform, etc.).
7+ years of experience with PowerShell and scripting for administration and automation.
5+ years hands-on experience with Azure services (AAD, App Services, SQL, Storage, Functions, Logic Apps, DevOps, etc.).
Strong background in enterprise unified communication and collaboration solutions.
Experience monitoring, troubleshooting, and performing root-cause analysis of M365 issues.
Understanding of security/compliance practices within Microsoft environments.
Experience integrating M365 with cloud and on-prem systems.
Strong documentation, communication, and cross-functional collaboration skills.
Knowledge of networking fundamentals and common internet protocols.
Email/calendaring experience (Exchange, Outlook, Proofpoint).
Ability to work in fast-paced Agile environments.
Bachelor's degree in a computer-related field or equivalent experience.
Flexibility to work evenings/weekends as needed.
Preferred Qualifications
Microsoft certifications such as: M365 Enterprise Administrator Expert, Teams Administrator Associate, Security Administrator Associate.
Strong presentation and documentation abilities.
Ability to manage multiple projects simultaneously.
Ability to work both independently and within a team environment.
Benefits
Medical coverage and Health Savings Account (HSA) through Anthem
Dental/Vision/Various Ancillary coverages through Unum
401(k) retirement savings plan
Paid-time-off options
Company-paid Employee Assistance Program (EAP)
Discount programs through ADP WorkforceNow
Additional Details
The base range for this contract position is $35 - $45 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.
About Us
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees.
Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY.
Check out more at ************** and reach out today to explore opportunities to grow together!
By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
Data Engineer Manager
Requirements engineer job in New York, NY
Be part of a global consulting powerhouse, partnering with clients on their most critical strategic transformations.
We are Wavestone. Energetic, solution-driven experts who focus as much on people as on performance and growth. Hand in hand, we share a deep desire to make a positive impact. We are an ambitious firm with a worldwide reach and an ever-expanding portfolio of clients, topics, and projects. In North America, Wavestone operates from hubs in New York City, Pittsburgh, Dallas and Toronto. We work closely with CEOs and technology leaders to optimize IT strategy, sourcing models, and business processes and are committed to building lasting partnerships with our clients.
Are you a true team player, living strong values? Are you a passionate learner, aiming to grow every day? Are you a driven go-getter, tackling challenges head-on? Then we could be the right fit for you. Join Wavestone and thrive in an environment that's empowering, collaborative, and full of opportunities to turn today's challenges into tomorrow's solutions - contributing to one or more of our core 4 capabilities:
Business Consulting | Business Strategy & Transformation, Organizational Effectiveness & Change Management, Operating Model Design & Agility, Program Leadership & Project Management, Marketing, Innovation, & Customer Experience
Technology Consulting | IT Strategy & CTO Advisory, Technology Delivery, Data & Artificial Intelligence, Software & Application: Development & Integration, SAP Consulting, Insurance/Reinsurance
Cybersecurity | Cyber Transformation Remediation, Cyber Defense & Recovery, Digital Identity, Audit & Incident Response, Product & Industrial Cybersecurity
Sourcing & Service Optimization | Global Services Strategy, IT & Business Process Services Outsourcing, Global In-House Center Support, Services Optimization, Sourcing Program Management
Read more at *****************
Job Description
As a manager-level Data Engineer at Wavestone, you will help address both strategic and detailed client needs, serving as a trusted advisor to C-level executives while being comfortable supporting and leading hands-on data projects with technical teams.
In this role you will lead or support high-impact data transformation, data modernization, and data initiatives that accelerate and enable AI solutions, bridging business strategy and technical execution. You will architect and deliver robust, scalable data solutions while mentoring teams and helping to shape the firm's data consulting offerings and skills. This role requires a unique blend of strategic vision, technical depth, and consulting leadership.
Key Responsibilities
Lead complex client engagements in data engineering, analytics, and digital transformation, from strategy through hands-on implementation.
Advise C-level and senior stakeholders on data strategy, architecture, governance, and technology adoption to drive measurable business value.
Architect and implement enterprise-scale data platforms, pipelines, and cloud-native solutions (Azure, AWS, Snowflake, Databricks, etc.).
Oversee and optimize ETL/ELT processes, data integration, and data quality frameworks for large, complex organizations.
Translate business objectives into actionable technical road maps, balancing innovation, scalability, and operational excellence.
Mentor and develop consultants and client teams, fostering a culture of technical excellence, continuous learning, and high performance.
Drive business development by shaping proposals, leading client pitches, and contributing to thought leadership and market offerings.
Stay at the forefront of emerging technologies and industry trends in data engineering, AI/ML, and cloud platforms.
Key Competencies & Skills
Strategic Data Leadership: Proven ability to set and execute data strategy, governance, and architecture at the enterprise level.
Advanced Data Engineering: Deep hands-on experience designing, building, and optimizing data pipelines and architectures (Python, SQL, Spark, Databricks, Snowflake, Azure, AWS, etc.).
Designing Data Models: Experience creating conceptual, logical, and physical data models that leverage different data modeling concepts and methodologies (normalization/denormalization, dimensional typing, data vault methodology, partitioning/embedding strategies, etc.) to meet solution requirements.
Cloud Data Platforms: Expertise in architecting and deploying solutions on leading cloud platforms (Azure, AWS, GCP, Snowflake).
Data Governance & Quality: Mastery of data management, MDM, data quality, and regulatory compliance (e.g., IFRS17, GDPR).
Analytics & AI Enablement: Experience enabling advanced analytics, BI, and AI/ML initiatives in complex environments.
Executive Stakeholder Management: Ability to communicate and influence at the C-suite and senior leadership level.
Project & Team Leadership: Demonstrated success managing project delivery, budgets, and cross-functional teams in a consulting context.
Continuous Learning & Innovation: Commitment to staying ahead of industry trends and fostering innovation within teams.
Qualifications
Bachelor's or master's degree in Computer Science, Engineering, Data Science, or related field, or equivalent business experience.
8+ years of experience in data engineering, data architecture, or analytics consulting, with at least 2 years in a leadership or management role.
Demonstrated success in client-facing roles, ideally within a consulting or professional services environment.
Advanced proficiency in Python, SQL, and modern data engineering tools (e.g., Spark, Databricks, Airflow).
Experience with cloud data platforms (Azure, AWS, GCP, Snowflake).
Relevant certifications (e.g., AWS Certified Data Analytics, Azure Data Engineer, Databricks, Snowflake) are a strong plus.
Exceptional problem-solving, analytical, and communication skills.
Industry exposure: Deep experience in Insurance, Pharma, or Financial Services
Additional Information
Salary Range: $157k - $200k annual salary
We are recruiting across several levels of seniority from Senior Consultant to Manager.
*Only candidates legally authorized to work for any employer in the U.S. on a full-time basis without the need for sponsorship will be considered. We are unable to sponsor or take over sponsorship of an employment visa at this time.
Our Commitment
Wavestone values and Positive Way
At Wavestone, we believe our employees are our greatest ambassadors. By embodying our shared values, vision, mission, and corporate brand, you'll become a powerful force for positive change. We are united by a shared commitment to making a positive impact, no matter where we are. This is better defined by our value base, "The Positive Way," which serves as the glue that binds us together:
Energetic - A positive attitude gives energy to lead projects to success. While we may not control the circumstances, we can always choose how we respond to them.
Responsible - We act with integrity and take ownership of our decisions and actions, considering their impact around us.
Together - We want to be a great team, not a team of greats. The team's strength is each individual member; each member's strength is the team.
We are Energetic, Responsible and Together!
Benefits
25 PTO / 6 Federal Holidays / 4 Floating Holidays
Great parental leave (birthing parent: 4 months | supporting parent: 2 months)
Medical / Dental / Vision coverage
401K Savings Plan with Company Match
HSA/FSA
Up to 4% bonus based on personal and company performance with room to grow as you progress in your career
Regular Compensation increases based on performance
Employee Stock Purchase Plan (ESPP)
Travel and Location
This full-time position is based in our New York office. You must reside or be willing to relocate within commutable distance to the office.
Travel requirements tend to fluctuate depending on your projects and client needs
Diversity and Inclusion
Wavestone seeks diversity among our team members and is an Equal Opportunity Employer.
At Wavestone, we celebrate diversity and inclusion. We have a strong global CSR agenda and an active Diversity & Inclusion committee with Gender Equality, LGBTQ+, Disability Inclusion and Anti-Racism networks.
If you need flexibility, assistance, or an adjustment to our recruitment process due to a disability or impairment, you may reach out to us to discuss this.
Feel free to visit our Wavestone website and LinkedIn page to see our most trending insights!
Data Engineer
Requirements engineer job in New York, NY
DL Software produces Godel, a financial information and trading terminal.
Role Description
This is a full-time, on-site role based in New York, NY, for a Data Engineer. The Data Engineer will design, build, and maintain scalable data systems and pipelines. Responsibilities include data modeling, developing and managing ETL workflows, optimizing data storage solutions, and supporting data warehousing initiatives. The role also involves collaborating with cross-functional teams to improve data accessibility and analytics capabilities.
Qualifications
Strong proficiency in Data Engineering and Data Modeling
Mandatory: strong experience in global financial instruments including equities, fixed income, options and exotic asset classes
Strong Python background
Expertise in Extract, Transform, Load (ETL) processes and tools
Experience in designing, managing, and optimizing Data Warehousing solutions
Azure Data Engineer
Requirements engineer job in Weehawken, NJ
· Expert level skills writing and optimizing complex SQL
· Experience with complex data modelling, ETL design, and using large databases in a business environment
· Experience with building data pipelines and applications to stream and process datasets at low latencies
· Fluent with Big Data technologies like Spark, Kafka and Hive
· Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
· Designing and building of data pipelines using API ingestion and Streaming ingestion methods
· Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential
· Experience in developing NoSQL solutions using Azure Cosmos DB is essential
· Thorough understanding of Azure and AWS Cloud Infrastructure offerings
· Working knowledge of Python is desirable
· Designing and implementing scalable and secure data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
· Managing and optimizing data storage using Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure Cosmos DB
· Monitoring and troubleshooting data-related issues within the Azure environment to maintain high availability and performance
· Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
· Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks
· Utilizing Azure's analytics services, such as Azure Synapse Analytics, to provide insights and support data-driven decision-making.
· Documenting data procedures, systems, and architectures to maintain clarity and ensure compliance with regulatory standards
· Providing guidance and support for data governance, including metadata management, data lineage, and data cataloging
Best Regards,
Dipendra Gupta
Technical Recruiter
DevOps Engineer
Requirements engineer job in New York, NY
About the Team
The DevOps team is responsible for supporting the development teams and interfacing with the infrastructure teams. As a DevOps engineer, you'll have the exciting opportunity to work in a fast-paced, entrepreneurial environment.
What You'll Do
Drive the design, engineering, integration, and enhancements of DevOps enablement tools and applications by utilizing Site Reliability and DevOps principles suited for an on-prem environment
Follow software development processes and practices (Functional Specification and Testing, Design Specifications, Code Reviews, Unit Testing, Monitoring)
Document and maintain processes and procedures
Implement and support established Continuous Integration / Continuous Delivery (CI/CD) practices
Mentor and train the Technology team on tools that increase the use of automation and improve stability, advocating solutions
Evaluate new technologies and explore their applicability to address new requirements in our environment
Skills and Experience
Bachelor's Degree in computer science, software engineering or related field
3+ years of total IT experience
3+ years of development experience in Python, C#, or Java
Experience building, deploying and maintaining container images (e.g., Docker, Kubernetes)
Experience with one or more configuration management tools (e.g., Ansible, Terraform, Git, Bash)
Familiarity with DevOps practices and Site Reliability Engineering processes and tools (e.g., InfluxDB, Grafana, PagerDuty, REST, Prometheus)
Experience with system administration, such as provisioning and managing servers, deploying databases, security monitoring, system patching, and managing internal and external network connectivity
What does it take to be successful in this role?
Excellent problem-solving skills, soft skills, quality, and delivery mindset
Strong communicator and collaborator
Ability to thrive in a fast-paced, start-up environment with individuals in dispersed locations
Self-starter, results driven individual with a proven track record
Comfortable with navigating ambiguity and translating it to impactful results
What are some skills to make you stand out?
Experience with trading strategies for securities, options, crypto and trading platforms
Experience with big data and distributed systems (e.g., Kafka, Cassandra)
Demonstrated ability to integrate different software using code (e.g., Python, shell, C#, Java)
Data Engineer
Requirements engineer job in New York, NY
About Beauty by Imagination:
Beauty by Imagination is a global haircare company dedicated to boosting self-confidence with imaginative solutions for every hair moment. We are a platform company of diverse, market-leading brands, including Wet Brush, Goody, Bio Ionic, and Ouidad - all of which are driven to be the most trusted choice for happy, healthy hair. Our talented team is passionate about delivering high-performing products for consumers and salon professionals alike.
Position Overview:
We are looking for a skilled Data Engineer to design, build, and maintain our enterprise Data Warehouse (DWH) and analytics ecosystem - with a growing focus on enabling AI-driven insights, automation, and enterprise-grade AI usage. In this role, you will architect scalable pipelines, improve data quality and reliability, and help lay the foundational data structures that power tools like Microsoft Copilot, Copilot for Power BI, and AI-assisted analytics across the business.
You'll collaborate with business stakeholders, analysts, and IT teams to modernize our data environment, integrate complex data sources, and support advanced analytics initiatives. Your work will directly influence decision-making, enterprise reporting, and next-generation AI capabilities built on top of our Data Warehouse.
Key Responsibilities
Design, develop, and maintain Data Warehouse architecture, including ETL/ELT pipelines, staging layers, and data marts.
Build and manage ETL workflows using SQL Server Integration Services (SSIS) and other data integration tools.
Integrate and transform data from multiple systems, including ERP platforms such as NetSuite.
Develop and optimize SQL scripts, stored procedures, and data transformations for performance and scalability.
Support and enhance Power BI dashboards and other BI/reporting systems.
Implement data quality checks, automation, and process monitoring.
Collaborate with business and analytics teams to translate requirements into scalable data solutions.
Contribute to data governance, standardization, and documentation practices.
Support emerging AI initiatives by ensuring model-ready data quality, accessibility, and semantic alignment with Copilot and other AI tools.
Required Qualifications
Proven experience with Data Warehouse design and development (ETL/ELT, star schema, SCD, staging, data marts).
Hands-on experience with SSIS (SQL Server Integration Services) for building and managing ETL workflows.
Strong SQL skills and experience with Microsoft SQL Server.
Proficiency in Power BI or other BI tools (Tableau, Looker, Qlik).
Understanding of data modeling, performance optimization, and relational database design.
Familiarity with Python, Airflow, or Azure Data Factory for data orchestration and automation.
Excellent analytical and communication skills.
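To make the star-schema and SCD requirements above concrete, here is a minimal sketch of a Type 2 slowly changing dimension, which preserves history by closing the current row and inserting a new version. The field names and table shape are illustrative assumptions, not this company's actual schema.

```python
from datetime import date

# Hypothetical Type 2 SCD: each dimension row carries effective dates
# and a current-row flag; a change closes the old row and adds a new one.
def scd2_upsert(dim_rows, key, attrs, today=None):
    """Apply a change to a Type 2 slowly changing dimension (in-memory sketch)."""
    today = today or date.today().isoformat()
    current = next((r for r in dim_rows
                    if r["key"] == key and r["is_current"]), None)
    if current and all(current[k] == v for k, v in attrs.items()):
        return dim_rows  # no attribute changed; nothing to do
    if current:
        current["valid_to"] = today      # close the existing version
        current["is_current"] = False
    dim_rows.append({"key": key, **attrs,
                     "valid_from": today, "valid_to": None,
                     "is_current": True})
    return dim_rows

customers = []
scd2_upsert(customers, "C1", {"segment": "Retail"}, today="2024-01-01")
scd2_upsert(customers, "C1", {"segment": "Salon Pro"}, today="2024-06-01")
# customers now holds two versions of C1; only the latest is current
```

In a real warehouse this logic would live in an SSIS package or SQL MERGE statement rather than Python, but the versioning pattern is the same.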
Preferred Qualifications
Experience with cloud data platforms (Azure, AWS, or GCP).
Understanding of data security, governance, and compliance (GDPR, SOC2).
Experience with API integrations and real-time data ingestion.
Background in finance, supply chain, or e-commerce analytics.
Experience with NetSuite ERP or other ERP systems (SAP, Oracle, Dynamics, etc.).
AI Focused Preferred Skills:
Experience implementing AI-driven analytics or automation inside Data Warehouses.
Hands-on experience using Microsoft Copilot, Copilot for Power BI, or Copilot Studio to accelerate SQL, DAX, data modeling, documentation, or insights.
Familiarity with building RAG (Retrieval-Augmented Generation) or AI-assisted query patterns using SQL Server, Synapse, or Azure SQL.
Understanding of how LLMs interact with enterprise data, including grounding, semantic models, and data security considerations (Purview, RBAC).
Experience using AI tools to optimize ETL/ELT workflows, generate SQL scripts, or streamline data mapping/design.
Exposure to AI-driven data quality monitoring, anomaly detection, or pipeline validation tools.
Experience with Microsoft Fabric, semantic models, or ML-integrated analytics environments.
Soft Skills
Strong analytical and problem-solving mindset.
Ability to communicate complex technical concepts to business stakeholders.
Detail-oriented, organized, and self-motivated.
Collaborative team player with a growth mindset.
Impact
You will play a key role in shaping the company's modern data infrastructure - building scalable pipelines, enabling advanced analytics, and empowering the organization to safely and effectively adopt AI-powered insights across all business functions.
Our Tech Stack
SQL Server, SSIS, Azure Synapse
Python, Airflow, Azure Data Factory
Power BI, NetSuite ERP, REST APIs
CI/CD (Azure DevOps, GitHub)
What We Offer
Location: New York, NY (Hybrid work model)
Employment Type: Full-time
Compensation: Competitive salary based on experience
Benefits: Health insurance, 401(k), paid time off
Opportunities for professional growth and participation in enterprise AI modernization initiatives
Data Engineer - VC Backed Healthcare Firm - NYC or San Francisco
Requirements engineer job in New York, NY
Are you a data engineer who loves building systems that power real impact in the world?
A fast-growing healthcare technology organization is expanding its innovation team and is looking for a Data Engineer II to help build the next generation of its data platform. This team sits at the center of a major transformation effort, partnering closely with engineering, analytics, and product to design the foundation that supports advanced automation, AI, intelligent workflows, and high-scale data operations that drive measurable outcomes for hospitals, health systems, and medical groups.
In this role, you will design, develop, and maintain software applications that process large volumes of data every day. You will collaborate with cross-functional teams to understand data requirements, build and optimize data models, and create systems that ensure accuracy, reliability, and performance. You will write code that extracts, transforms, and loads data from a variety of sources into modern data warehouses and data lakes, while implementing best-in-class data quality and governance practices. You will work hands-on with big data technologies such as Hadoop, Spark, and Kafka, and you will play a critical role in troubleshooting, performance tuning, and ensuring the scalability of complex data applications.
To thrive here, you should bring strong problem solving ability, analytical thinking, and excellent communication skills. This is an opportunity to join an expanding innovation group within a leading healthcare platform that is investing heavily in data, AI, and the future of intelligent revenue operations. If you want to build systems that make a real difference and work with teams that care deeply about improving patient experiences and provider performance, this is a chance to do highly meaningful engineering at scale.
Data Engineer (Web Scraping technologies)
Requirements engineer job in New York, NY
Title: Data Engineer (Web Scraping technologies)
Duration: FTE/Perm
Salary: 125-190k plus bonus
Responsibilities:
Utilize AI Models, Code, Libraries or applications to enable a scalable Web Scraping capability
Web scraping request management, including intake, assessment, accessing sites to scrape, utilizing tools to scrape, storage of scraped data, validation, and entitlement to users
Fielding questions from users about the scrapes and websites
Coordinating with Compliance on approvals and TOU reviews
Building data pipelines on the AWS platform utilizing existing tools like cron, Glue, EventBridge, Python-based ETL, and AWS Redshift
Normalizing/standardizing vendor and firm data for firm consumption
Implement data quality checks to ensure reliability and accuracy of scraped data
Coordinate with Internal teams on delivery, access, requests, support
Promote Data Engineering best practices
Required Skills and Qualifications:
Bachelor's degree in computer science, Engineering, Mathematics or related field
2-5 years of experience in a similar role
Prior buy side experience is strongly preferred (Multi-Strat/Hedge Funds)
Capital markets experience is necessary with good working knowledge of reference data across asset classes and experience with trading systems
AWS cloud experience with common services (S3, Lambda, cron, EventBridge, etc.)
Experience with web-scraping frameworks (Scrapy, BeautifulSoup, Selenium, Playwright etc.)
Strong hands-on skills with NoSQL and SQL databases, programming in Python, data pipeline orchestration tools and analytics tools
Familiarity with time series data and common market data sources (Bloomberg, Refinitiv etc.)
Familiarity with modern Dev Ops practices and infrastructure-as-code tools (e.g. Terraform, CloudFormation)
Strong communication skills to work with stakeholders across technology, investment, and operations teams.
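The scrape-then-validate flow this role describes can be sketched in a few lines. To keep the example dependency-free it uses Python's standard-library `html.parser` rather than the Scrapy/BeautifulSoup stack named in the qualifications; the page structure and the "price" quality check are hypothetical.

```python
from html.parser import HTMLParser

# Minimal sketch of scrape + data-quality validation; a production
# pipeline would use Scrapy/BeautifulSoup/Playwright and real requests.
class PriceTableParser(HTMLParser):
    """Collect the text of every <td> cell on a page."""
    def __init__(self):
        super().__init__()
        self._in_td = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self.cells.append(data.strip())

def validate_prices(cells):
    """Basic quality check: flag any scraped value that is not numeric."""
    bad = [c for c in cells if not c.replace(".", "", 1).isdigit()]
    return {"rows": len(cells), "invalid": bad}

html = "<table><tr><td>101.25</td><td>99.80</td><td>N/A</td></tr></table>"
parser = PriceTableParser()
parser.feed(html)
report = validate_prices(parser.cells)
# report flags "N/A" as an invalid value for review before loading
```

The same check-before-load pattern is what "implement data quality checks to ensure reliability and accuracy of scraped data" typically means in practice.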
Senior Azure Data Engineer
Requirements engineer job in Stamford, CT
Great opportunity with a private equity firm located in Stamford, CT.
The Azure Data Engineer in this role will partner closely with investment and operations teams to build scalable data pipelines and modern analytics solutions across the firm and its portfolio.
The Senior Azure Data Engineer's main responsibilities will be:
Designing and implementing machine learning solutions as part of high-volume data ingestion and transformation pipelines
Experience in designing solutions for large data warehouses and databases (Azure, Databricks and/or Snowflake)
Gather requirements from business stakeholders.
Experience in data architecture, data governance, data modeling, data transformation (from converting data, to data cleansing, to building data structures), data lineage, data integration, and master data management.
Technical Skills
Architecting and delivering solutions using the Azure Data Analytics platform including Azure Databricks/Azure SQL Data Warehouse
Utilizing Databricks (for processing and transforming massive quantities of data and exploring the data through machine learning models)
Design and build solutions powered by DBT models and integrate with Databricks.
Utilize Snowflake for data application development, and secure sharing and consumption of real-time and/or shared data.
Expertise in data manipulation and analysis using Python.
SQL for data migration and analysis.
Pluses:
Past work experience in financial markets is a plus (Asset Management, Multi-strategy, Private Equity, Structured Products, Fixed Income, Trading, Portfolio Management, etc.).
E-Mail: DIANA@oakridgestaffing.com
Please feel free to connect with me on LinkedIn:
www.linkedin.com/in/dianagjuraj
C++ Market Data Engineer
Requirements engineer job in Stamford, CT
We are seeking a C++ Market Data Engineer to design and optimize ultra-low-latency feed handlers that power global trading systems. This is a high-impact role where your code directly drives real-time decision making.
What You'll Do:
Build high-performance feed handlers in modern C++ (14/17/20) for equities, futures, and options
Optimize systems for micro/nanosecond latency with lock-free algorithms and cache-friendly design
Ensure reliable data delivery with failover, gap recovery, and replay mechanisms
Collaborate with researchers and engineers to align data formats for trading and simulation
Instrument and test systems for continuous performance improvements
What We're Looking For:
3+ years of C++ development experience (low-latency, high-throughput systems)
Experience with real-time market data feeds (e.g., Bloomberg B-PIPE, CME MDP, Refinitiv, OPRA, ITCH)
Strong knowledge of concurrency, memory models, and compiler optimizations
Python scripting skills for testing and automation
Familiarity with Docker/Kubernetes and cloud networking (AWS/GCP) is a plus
Data Engineer
Requirements engineer job in Jersey City, NJ
ONLY LOCALS TO NJ/NY - NO RELOCATION CANDIDATES
Skillset: Data Engineer
Must Haves: Python, PySpark, AWS - ECS, Glue, Lambda, S3
Nice to Haves: Java, Spark, React Js
Interview Process: 2 rounds; the 2nd will be on-site
You're ready to gain the skills and experience needed to grow within your role and advance your career - and we have the perfect software engineering opportunity for you.
As a Data Engineer III - Python / Spark / Data Lake at JPMorgan Chase within the Consumer and Community Bank, you will be a seasoned member of an agile team, tasked with designing and delivering reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. Your responsibilities will include developing, testing, and maintaining essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's business objectives.
Job responsibilities:
• Supports review of controls to ensure sufficient protection of enterprise data.
• Advises and makes custom configuration changes in one to two tools to generate a product at the business or customer request.
• Updates logical or physical data models based on new use cases.
• Frequently uses SQL and understands NoSQL databases and their niche in the marketplace.
• Adds to team culture of diversity, opportunity, inclusion, and respect.
• Develop enterprise data models; design, develop, and maintain large-scale data processing pipelines and infrastructure; lead code reviews and provide mentoring through the process; drive data quality; ensure data accessibility to analysts and data scientists; ensure compliance with data governance requirements; and ensure data engineering practices align with business goals.
Required qualifications, capabilities, and skills
• Formal training or certification on data engineering concepts and 2+ years applied experience
• Experience across the data lifecycle, advanced experience with SQL (e.g., joins and aggregations), and working understanding of NoSQL databases
• Experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis
• Extensive experience in AWS, design, implementation, and maintenance of data pipelines using Python and PySpark.
• Proficient in Python and PySpark, able to write and execute complex queries to perform curation and build views required by end users (single and multi-dimensional).
• Proven experience in performance tuning to ensure jobs run at optimal levels with no performance bottlenecks.
• Advanced proficiency in leveraging Gen AI models from Anthropic (or OpenAI, or Google) using APIs/SDKs
• Advanced proficiency with cloud data lakehouse platforms such as AWS data lake services, Databricks, or Hadoop; relational data stores such as Postgres, Oracle, or similar; and at least one NoSQL data store such as Cassandra, Dynamo, MongoDB, or similar
• Advanced proficiency with cloud data warehouses such as Snowflake and AWS Redshift
• Advanced proficiency in at least one scheduling/orchestration tool such as Airflow, AWS Step Functions or similar
• Proficiency in Unix scripting; data structures; data serialization formats such as JSON, AVRO, or Protobuf; big-data storage formats such as Parquet, Iceberg, or similar; data processing methodologies such as batch, micro-batching, or streaming; one or more data modelling techniques such as Dimensional, Data Vault, Kimball, or Inmon; Agile methodology; TDD or BDD; and CI/CD tools.
Preferred qualifications, capabilities, and skills
• Knowledge of data governance and security best practices.
• Experience in carrying out data analysis to support business insights.
• Strong Python and Spark skills
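The "joins and aggregations" and "build views required by end users" bullets above describe the same curation pattern regardless of engine. A minimal, runnable sketch using the standard-library `sqlite3` module (the table names and data are invented for illustration; the real work would run in PySpark or Snowflake):

```python
import sqlite3

# Hypothetical curated view: join a transactions fact to an accounts
# dimension and aggregate by segment, as end-user views typically do.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (account_id INTEGER PRIMARY KEY, segment TEXT);
    CREATE TABLE txns (txn_id INTEGER, account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'Retail'), (2, 'Business');
    INSERT INTO txns VALUES (10, 1, 25.0), (11, 1, 75.0), (12, 2, 40.0);
    CREATE VIEW segment_totals AS
        SELECT a.segment, COUNT(*) AS n_txns, SUM(t.amount) AS total
        FROM txns t JOIN accounts a USING (account_id)
        GROUP BY a.segment;
""")
rows = conn.execute("SELECT * FROM segment_totals ORDER BY segment").fetchall()
# rows -> [('Business', 1, 40.0), ('Retail', 2, 100.0)]
```

Exposing the aggregation as a view rather than a one-off query is what lets analysts and data scientists consume curated, multi-dimensional results without re-deriving the join logic.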
Data Engineer
Requirements engineer job in Newark, NJ
NeenOpal is a global consulting firm specializing in Data Science and Business Intelligence, with offices in Bengaluru, Newark, and Fredericton. We provide end-to-end solutions tailored to the unique needs of businesses, from startups to large organizations, across domains like digital strategy, sales and marketing, supply chain, and finance. Our mission is to help organizations achieve operational excellence and transform into data-driven enterprises.
Role Description
This is a full-time, hybrid, Data Engineer role located in Newark, NJ. The Data Engineer will be responsible for designing, implementing, and managing data engineering solutions to support business needs. Day-to-day tasks include building and optimizing data pipelines, developing and maintaining data models and ETL processes, managing data warehousing solutions, and contributing to the organization's data analytics initiatives. Collaboration with cross-functional teams to ensure robust data infrastructure will be a key aspect of this role.
Key Responsibilities
Data Pipeline Development: Design, implement, and manage robust data pipelines to ensure efficient data flow into data warehouses. Automate ETL processes using Python and advanced data engineering tools.
Data Integration: Integrate and transform data using industry-standard tools. Experience required with:
AWS Services: AWS Glue, Data Pipeline, Redshift, and S3.
Azure Services: Azure Data Factory, Synapse Analytics, and Blob Storage.
Data Warehousing: Implement and optimize solutions using Snowflake and Amazon Redshift.
Database Management: Develop and manage relational databases (SQL Server, MySQL, PostgreSQL) to ensure data integrity.
Performance Optimization: Continuously monitor and improve data processing workflows and apply best practices for query optimization.
Global Collaboration: Work closely with cross-functional teams in the US, India, and Canada to deliver high-quality solutions.
Governance & Support: Document ETL processes and data mappings in line with governance standards. Diagnose and resolve data-related issues promptly.
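The extract-transform-load flow in the responsibilities above can be sketched end to end in plain Python. The payload shape, field names, and target "warehouse" here are illustrative assumptions; in practice the extract step would call a REST/GraphQL API and the load step would target Redshift or Snowflake.

```python
import json

# Hedged ETL sketch: parse an API-style payload, normalize it for the
# warehouse, and load idempotently (keyed upsert, safe to re-run).
def extract(raw_json):
    """Extract: parse an API payload (here, a static JSON string)."""
    return json.loads(raw_json)["orders"]

def transform(orders):
    """Transform: normalize field names and types for the warehouse."""
    return [{"order_id": int(o["id"]),
             "amount_usd": round(float(o["amt"]), 2),
             "region": o.get("region", "UNKNOWN").upper()}
            for o in orders]

def load(rows, warehouse):
    """Load: idempotent upsert keyed on order_id."""
    for row in rows:
        warehouse[row["order_id"]] = row
    return len(warehouse)

payload = ('{"orders": [{"id": "7", "amt": "19.99", "region": "us"},'
           ' {"id": "8", "amt": "5"}]}')
warehouse = {}
n = load(transform(extract(payload)), warehouse)
# missing region defaults to "UNKNOWN"; re-running load changes nothing
```

Keying the load on a stable identifier is a common way to make pipeline re-runs safe, which matters for the monitoring and issue-resolution duties listed above.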
Required Skills and Experience
Experience: Minimum 2+ years of experience designing and developing ETL processes (AWS Glue, Azure Data Factory, or similar).
Integration: Experience integrating data via RESTful / GraphQL APIs.
Programming: Proficient in Python for ETL automation and SQL for database management.
Cloud Platforms: Strong experience with AWS or Azure data services (GCP familiarity is a plus).
Data Warehousing: Expertise with Snowflake, Amazon Redshift, or Azure Synapse Analytics.
Communication: Excellent articulation skills to explain technical work directly to clients and stakeholders.
Authorization: Must have valid work authorization in the United States.
Salary Range: $65,000-$80,000 per year
Benefits: This role includes health insurance, paid time off, and opportunities for professional growth and continuous learning within a fast-growing global analytics company.
Equal Opportunity Employer NeenOpal Inc. is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.