
Data engineer jobs in White Plains, NY - 442 jobs

  • Senior Data Architect - Power & Utilities AI Platforms

    Ernst & Young Oman (4.7 company rating)

    Data engineer job in Stamford, CT

    A leading global consulting firm is seeking a Senior Manager in Data Architecture for the Power & Utilities sector. This role requires at least 12 years of consulting experience and expertise in data architecture and engineering. The successful candidate will manage technology projects, lead teams, and develop innovative data solutions that drive significant business outcomes. Strong relationship management and communication skills are essential for engaging with clients and stakeholders. Join us to help shape a better working world.
    $112k-156k yearly est. 2d ago

  • Data Scientist

    Gartner (4.7 company rating)

    Data engineer job in Stamford, CT

    About this role

    In Gartner's Services Data Science team, we innovate the way our team helps clients receive value, so technology leaders can make smarter decisions in a different way. We are searching for a talented data scientist to join our team. You will have access to the best facilities, technology and expertise within the industry and will work on challenging business problems. This is an excellent opportunity to be part of a new venture, in a start-up environment where you can truly develop your skill set and knowledge and bring impact to the team.

    What you'll do
    * Design and implement state-of-the-art Large Language Model (LLM) based agents that seamlessly synthesize complex information and initiate important actions in a business workflow.
    * Use advanced Generative AI techniques to derive actionable insights from unstructured text data, such as call transcripts and emails.
    * Predict client interest based on their digital footprint and make relevant recommendations to drive higher client value delivery.
    * Leverage statistical and machine learning techniques to extract actionable insights from client retention data.
    * Develop customer churn prediction models that proactively identify at-risk clients.
    * Build tools to process structured and unstructured data.
    * Engineer features and signals to train ML models from diverse data collections.

    (A minimal churn-model code sketch follows this listing.)

    What you'll need
    * BS required, MS preferred, in Computer Science or another technology discipline, Math, Physics, Statistics or Economics (focus on Natural Language Processing or Information Retrieval a plus)
    * 4 years' experience in data science methodologies as applied to live initiatives or software development; experience working with Gen AI projects
    * Minimum 4+ years of experience in Python coding and statistical analysis
    * Minimum 2 years of working experience in several of the following:
      - Prompt engineering and working with LLMs
      - Machine learning and statistical techniques
      - Data mining and recommendation systems
      - Natural Language Processing and Information Retrieval
      - Working with large volumes of data
      - User behavior modeling

    Who you are
    * A team player. You get along well with your colleagues and are always ready to help get things done. You enjoy working on projects with multiple people and share knowledge.
    * Passionate about learning. You thrive on complex technical challenges and are always eager to learn the latest technologies.
    * Organized and detail-oriented. You think ahead of time about how best to implement new features, and your code is clean, well-organized and properly documented.
    * Innovative. You are always proactively looking for opportunities to problem-solve using innovative methods that impact the business.

    What we offer
    * A collaborative, positive culture. You'll work with people who are as enthusiastic, smart and driven as you are. You'll be managed by the best too.
    * Limitless growth and learning opportunities. We offer the excitement of a fast-paced entrepreneurial workplace and the professional growth opportunities of an established global organization.

    About Gartner: Gartner, Inc. (NYSE: IT) is the world's leading information technology research and advisory company. We deliver the technology-related insight necessary for our clients to make the right decisions, every day. We work with every client to research, analyze and interpret the business of IT within the context of their individual role. Founded in 1979, Gartner is headquartered in Stamford, Connecticut, U.S.A. Visit gartner.com to learn more.

    Diversity, inclusion and engagement at Gartner: The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, caste, creed, religion, sex, sexual orientation, gender identity or expression, marital status, citizenship status, age, national origin, ancestry, disability, or any other characteristic protected by applicable law. Gartner affirmatively seeks to advance the principles of equal employment opportunity and values diversity and inclusion. Gartner is an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified applicant with a disability and unable to or limited in your ability to use or access Gartner's career webpage as a result of your disability, you may request reasonable accommodations by calling Human Resources or by sending an email.

    Who are we? At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective business and technology insights, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

    What makes Gartner a great place to work? Our vast, virtually untapped market potential offers limitless opportunities - opportunities that may not even exist right now - for you to grow professionally and flourish personally. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

    What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive - working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

    Gartner believes in fair and equitable pay. A reasonable estimate of the base salary range for this role is 98,000 USD - 133,000 USD. Please note that actual salaries may vary within the range, or be above or below the range, based on factors including, but not limited to, education, training, experience, professional achievement, business need, and location. In addition to base salary, employees will participate in either an annual bonus plan based on company and individual performance, or a role-based, uncapped sales incentive plan.
Our talent acquisition team will provide the specific opportunity on our bonus or incentive programs to eligible candidates. We also offer market leading benefit programs including generous PTO, a 401k match up to $7,200 per year, the opportunity to purchase company stock at a discount, and more. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at or by sending an email . Job Requisition ID:106172 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
    $78k-103k yearly est. 4d ago
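The Gartner listing above calls for churn-prediction models built on client retention data. The sketch below is a minimal, hypothetical illustration of that kind of work in scikit-learn; the file name, feature columns, and label are invented placeholders, not Gartner's actual data model.

```python
# Hypothetical churn-model sketch: the CSV path and column names are illustrative only.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Assumed schema: one row per client with engagement features and a churn label.
df = pd.read_csv("client_retention.csv")
features = ["logins_last_90d", "reports_downloaded", "analyst_calls", "tenure_months"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42, stratify=df["churned"]
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Rank clients by churn risk so account teams can intervene proactively.
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))
at_risk = X_test.assign(churn_risk=scores).sort_values("churn_risk", ascending=False)
print(at_risk.head(10))
```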
  • AWS Data Migration Consultant

    Slalom (4.6 company rating)

    Data engineer job in Bogota, NJ

    We have a hybrid and flexible environment and encourage employees to come into the office at least 1x/week.

    Who You'll Work With

    As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.

    We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.

    What You'll Do
    * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
    * Contribute to and possibly lead cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
    * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
    * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
    * Implement high-availability and disaster recovery (HA/DR) strategies including Always On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS.
    * Ensure security best practices are followed, including IAM-based access control, encryption, and compliance with industry standards.
    * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
    * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
    * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.

    (A minimal AWS DMS code sketch follows this listing.)

    What You'll Bring
    * 3+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
    * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
    * Hands-on experience with AWS database services (RDS, EC2-hosted databases).
    * Strong understanding of HA/DR solutions and cloud database design patterns.
    * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
    * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
    * Strong troubleshooting and analytical skills to resolve complex database and performance issues.
    * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.

    Nice to Have
    * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
    * Experience with NoSQL databases or hybrid data architectures.
    * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
    * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
    * Experience with DB2 on-premise or cloud-hosted environments.

    About Us

    Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
    $113k-147k yearly est. 7d ago
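The Slalom role above centers on migrations with AWS Database Migration Service (DMS). Below is a hedged sketch of creating and starting a full-load-plus-CDC replication task with boto3; the endpoint and replication-instance ARNs are placeholders, and a real migration would typically convert the schema with SCT first and wait for the task to reach the ready state before starting it.

```python
# Illustrative DMS task setup with boto3; all ARNs and identifiers are placeholders.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Include every table in the dbo schema; real projects use finer-grained rules.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-dbo",
        "object-locator": {"schema-name": "dbo", "table-name": "%"},
        "rule-action": "include",
    }]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="sqlserver-to-rds-full-load-cdc",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",
    MigrationType="full-load-and-cdc",          # initial copy, then ongoing replication
    TableMappings=json.dumps(table_mappings),
)

# In practice, poll describe_replication_tasks until the task is "ready" before this call.
dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)
```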
  • Data Scientist

    Drive Devilbiss Healthcare

    Data engineer job in Port Washington, NY

    Who is Drive Medical?

    Drive Medical has become a leading manufacturer of medical products with a strong and consistent track record of growth achieved both organically and through acquisitions. We are proud of our high-quality, diverse product portfolio, channel footprint and global operating scale. Our products are sold into the homecare, long-term care, retail, and e-commerce channels in more than 100 countries around the world. "Leading the World with Innovative Healthcare Solutions that Enhance Lives"

    Summary (Major Purpose of the Role): The Sales Data Scientist will use data analytics and statistical techniques to generate insights that support sales performance and revenue growth. This role focuses on building and improving reporting tools, analyzing data, and providing actionable recommendations to help the sales organization make informed decisions.

    Key Responsibilities
    * Data Analysis & Reporting: Analyze sales data to identify trends, patterns, and opportunities. Create and maintain dashboards and reports for Sales and leadership teams. Support root-cause analysis and process improvement initiatives.
    * Sales Insights: Provide data-driven recommendations for pricing, discount strategies, and sales funnel optimization. Assist in segmentation analysis to identify key customer groups and markets.
    * Collaboration: Work closely with Sales, Marketing, Finance, and Product teams to align analytics with business needs. Present findings in clear, actionable formats to stakeholders.
    * Data Infrastructure: Ensure data accuracy and integrity across reporting tools. Help automate reporting processes for efficiency and scalability.

    Required Qualifications:
    * 2-4 years of experience in a data analytics or sales operations role.
    * Strong Excel skills (pivot tables, formulas, data analysis).
    * Bachelor's degree in Mathematics, Statistics, Economics, Data Science, or a related field, or equivalent experience.

    Preferred Qualifications:
    * Familiarity with Python, R, SQL, and data visualization tools (e.g., Power BI).
    * Experience leveraging AI/ML tools and platforms (e.g., predictive analytics, natural language processing, automated insights).
    * Experience with CRM systems (Salesforce) and marketing automation platforms.
    * Strong analytical and problem-solving skills with attention to detail.
    * Ability to communicate insights clearly to non-technical audiences.
    * Collaborative mindset and willingness to learn new tools and techniques.

    Why Apply: Competitive Benefits, Paid Time Off, 401(k) Savings Plan. This position does not offer sponsorship opportunities.

    Pursuant to New York law, Drive Medical provides a salary range in job advertisements. The salary range for this role is $95,000.00 to $125,000.00 per year. Actual salaries may vary depending on factors such as the applicant's experience, specialization, and education, as well as the company's requirements. The provided salary range does not include bonuses, incentives, differential pay, or other forms of compensation or benefits which may be offered to the applicant, if eligible according to the company's policies. Drive Medical is an Equal Opportunity Employer and provides equal employment opportunities to all employees and applicants for employment.
Drive Medical strictly prohibits and does not tolerate discrimination against employees, applicants, or any other covered person because of race, color, religion, gender, sexual orientation, gender identity, pregnancy and/or parental status, national origin, age, disability status, protected veteran status, genetic information (including family medical history), or any other characteristic protected by federal, state, or local law. Drive Medical complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities.
    $95k-125k yearly 29d ago
  • Data Scientist

    Mindlance (4.6 company rating)

    Data engineer job in Franklin Lakes, NJ

    * Bachelor's or Master's degree with specialization in Computer Science, Information Systems, Mathematics, Statistics or other quantitative disciplines
    * Excellent ability to query large datasets using SQL queries and working with databases
    * Excellent in programming languages such as Java, Python, etc.
    * Experience in data extraction and processing, using MapReduce, Pig, Hive, etc.
    * Proficient programming skills using SAS and/or R
    * Knowledge of building and applying machine learning or predictive modeling
    * Strong problem-solving skills
    * Exceptional ability to communicate and present findings clearly to both technical and non-technical audiences
    * Excellent interpersonal and collaboration skills

    Additional Information: Thanks & Regards, Praveen K. Paila ************
    $77k-107k yearly est. 8h ago
  • Ice Cream Customer Analytics Data Scientist

    Magnum ICC Us

    Data engineer job in Englewood Cliffs, NJ

    Scope: US, Customer Development
    Terms & Conditions: Full time; international sponsorship or relocation not supported

    If you are in the Ice Cream business, or are considering choosing to work for The Magnum Ice Cream Company, you will work for the global, leading Ice Cream player, with €7.9bn turnover in 2023. The Ice Cream business is operating in a highly attractive category, as we are part of the 1 trillion snacking and refreshment industry, growing consistently at a high pace. We have strong brand equities: 5 of the top 10 selling brands, including Magnum and Ben & Jerry's. We are investing to unlock the full growth potential of Ice Cream as a standalone entity following our separation from Unilever in December 2025. Ice Cream has distinct characteristics from TMICC's other operating businesses, and the growth potential of Ice Cream will be better delivered under a different ownership structure. As an Ice Cream company we are committed to developing and nurturing talent within. You will have ample options for career growth and exploration, allowing you to explore roles and opportunities across the new organization. Your career development will be a priority for us, and we are dedicated to supporting your growth journey within the new company. We hope that you will want to build the new chapter of our Ice Cream history together with us.

    ABOUT ICE CREAM: Life Tastes Better with Ice Cream. TMICC Ice Cream is the largest global Ice Cream Company in the world, with over 100 years of experience delivering a diverse range of indulgent, yet responsible, craft food experiences and treats delighting consumers. Committed to innovation, quality, and sustainability, we have 35 brands, including 3 one-billion-Euro brands (Magnum, Wall's, Ben & Jerry's), a strong presence in over 60 countries, generating annual revenue of over $8 billion. All brands are driven to transform moments into memories through indulgent yet responsibly made and marketed products. We have a well-developed strategy to deliver growth and value creation which is clear on where to play and how to win. We turn the ordinary into the extraordinary by designing unique and innovative Ice Cream experiences that make life taste better, creating joyful experiences. In our Ice Cream business, we're crafting the future through innovation and imaginative minds, creating unique products. We spark moments of happiness for people and within the communities where we operate. However, it is not as simple as it may seem. As Ice Cream makers we are serious about happiness. With warm hearts, we create the coolest products.

    JOB PURPOSE: The Customer Analytics Data Scientist will report to the Associate Director of CD Analytics and Insights and will be responsible for developing advanced analytical models and insights that empower the US Customer Development function to achieve its sales ambitions. In this role, you will have the opportunity to design and implement predictive and prescriptive analytics solutions that drive decision-making across sales, category management, shopper marketing, and channel strategy. This is a hands-on technical role with strong business integration. The ideal candidate not only has deep expertise in statistical modeling and machine learning but also thrives on partnering with commercial and customer development teams to translate complex data into actionable business stories. You will collaborate closely with data engineering to ensure models are seamlessly operationalized within our data ecosystem and brought to life through visualization and stakeholder engagement.

    KEY RESPONSIBILITIES:
    * Work closely with business stakeholders to translate needs into well-designed and executed technology products that are embedded into daily decision-making, with a focus on promotion optimization, syndicated and retailer data, demand forecasting, shopper marketing, digital commerce, and sales reporting.
    * Develop predictive and forecasting models (e.g., forecasting demand, pricing, promotions optimization, customer/shopper behavior) that drive commercial decision-making and business performance.
    * Collaborate with data engineering to define consistent business logic and unified data definitions, consistent with how the business operates, that enable self-service analytics and integration across data products.
    * Build advanced analytics capabilities including demand forecasting, promotion performance, assortment optimization, and shopper behavior modeling.
    * Liaise with local and global business and technology teams to define, manage, and enhance Magnum's Customer Development analytics capabilities.
    * Conduct advanced analytics analyses using data science, machine learning, and AI to uncover insights that generate tangible business value.
    * Communicate and deliver engaging data visualizations and data narratives that bring data to life for both technical and non-technical audiences, simplifying complex concepts and enabling decision-making across commercial teams.
    * Guide and lead squads or project teams, driving adoption of data science principles and best-in-class analytics approaches to deliver impactful solutions.

    (A minimal demand-forecasting code sketch follows this listing.)

    WHAT YOU NEED TO SUCCEED:
    * Technical aptitude and the ability to lead teams toward business-value-focused technology solutions
    * 6+ years of experience with cloud data platforms (Azure, AWS, GCP) and Databricks, with strong proficiency in SQL, PySpark, Python, and DAX for building high-performance data transformations and pipelines
    * Hands-on experience with Databricks, Azure Data Factory, Delta Lake, Power BI, Alteryx and related cloud data technologies
    * Proficiency in machine learning frameworks and libraries including scikit-learn, TensorFlow, PyTorch, XGBoost
    * Experience with data visualization tools like Power BI, Tableau
    * Strong ability to communicate technical findings to non-technical audiences, turning data into actionable business narratives
    * Intellectual curiosity and strong analytic ability
    * Preparedness to develop a deep understanding of business operations and needs for the Customer Development function
    * Excellent planning, organization, and project estimation skills
    * Strong understanding of the use and management of retailer and syndicated data
    * Ability to work collaboratively with cross-functional teams both locally and globally
    * Ability to communicate and present complex topics clearly and compellingly to business stakeholders

    SKILLS:
    * Strong understanding of data science principles and data visualization approaches
    * Solid understanding of data modelling to support analytics and reporting use cases
    * Proficiency in coding languages such as Python, PySpark, DAX and SQL
    * Ability to collect business requirements and translate them into technical solutions
    * Experience using tools like Databricks, Azure Data Factory, Power BI, and Alteryx
    * Experience leveraging data sources such as Circana, Nielsen, panel data, and retailer data
    * Ability to design intuitive and compelling front-end business applications
    * Data Engineering, Computer Science or study in a related field

    EXPERIENCES & QUALIFICATIONS:

    Required
    * Experience applying data science, machine learning, and AI to business problems in Consumer Goods organizations
    * Experience designing and implementing end-to-end data analyses and capabilities for a business-focused audience
    * Experience creating data visualization dashboards in tools such as Power BI or Tableau
    * Experience collecting business requirements and translating them into technology projects
    * Experience working in a consumer goods organization
    * Experience with the use and management of retailer and syndicated data (such as Circana, Nielsen)

    Preferred
    * Experience in a consumer goods Customer Development function
    * Experience with retailer data platforms such as Scintilla, SymphonyAI, 1010data and 84.51 Stratum
    * Familiarity with Digital Commerce and Shopper Marketing data solutions
    * Experience managing analytics teams

    Pay: The pay range for this position is $99,760 to $149,640. TMICC takes into consideration a wide range of factors that are utilized in making compensation decisions including, but not limited to, skill sets, experience and training, licensure and certifications, qualifications and education, and other business and organizational needs.

    Bonus: This position is bonus eligible. Long-Term Incentive (LTI): This position is LTI eligible.

    Benefits: TMICC employees are eligible to participate in our benefits plan. Should the employee choose to participate, they can choose from a range of benefits to include, but not limited to, health insurance (including prescription drug, dental, and vision coverage), retirement savings benefits, life insurance and disability benefits, parental leave, sick leave, paid vacation and holidays, as well as access to numerous voluntary benefits. Any coverages for health insurance and retirement benefits will be in accordance with the terms and conditions of the applicable plans and associated governing plan documents.

    ABOUT THE MAGNUM ICE CREAM COMPANY: With 19,000 expert ice cream colleagues and iconic brands like Wall's, Cornetto and Ben & Jerry's, loved in 76 countries, we are the world's largest Ice Cream company leading the industry. We have been taking pleasure seriously for more than 100 years, serving happiness with every lick or scoop of ice cream for generations. The Magnum Ice Cream Company (formerly part of Unilever) is all about growth. Growing our business. Growing our customers' businesses. Growing our people's careers. Growth begins with empowerment. So we free our people to be innovative, responsible entrepreneurs, driven and equipped to give our consumers more amazing products and unforgettable moments - and having fun doing it.

    Here's what defines success in our organization:
    · We are all about growth
    · We operate with speed and simplicity
    · We win together with fun
    · We boldly innovate to disrupt our industry
    · We care and challenge
    · We are experts in the Ice Cream Category

    ARE YOU EXCITED TO CRAFT THE ICE CREAM FUTURE? Please apply online and do not forget to upload your CV.
Your application will be reviewed against the requirements, and we will be in touch shortly after the closing date, with an update on the status of your application. The Magnum Ice Cream Company is an organization committed to diversity and inclusion to drive our business results and create a better future every day for our diverse employees, global consumers, partners, and communities. We believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, protected veteran status, or any other characteristic protected by local, state, or federal law and will not be discriminated against on the basis of disability. For more information about your Federal rights, please see Know Your Rights: Workplace Discrimination is Illegal and Pay Transparency Non discrimination Provision. Employment is subject to verification of pre-screening tests, which may include drug screening, background check, credit check and DMV check. If you are an individual with a disability in need of assistance at any time during our recruitment process, please contact your recruiter. Please note: This email is reserved for individuals with disabilities in need of assistance and is not a means of inquiry about positions or application statuses. The Protected Veterans or Individuals with Disabilities AAP narratives are available for inspection by any employee or applicant for employment Monday through Friday during normal business hours at establishment.
    $99.8k-149.6k yearly 12d ago
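The Magnum listing above emphasizes demand forecasting and promotion analytics. Below is a minimal, assumption-laden sketch of a promotion-aware weekly demand model; the CSV, column names, and split date are invented stand-ins for syndicated or retailer data, not the company's actual pipeline.

```python
# Illustrative promotion-aware demand model on hypothetical weekly SKU sales data.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_percentage_error

sales = pd.read_csv("weekly_sku_sales.csv", parse_dates=["week"])
sales = sales.sort_values(["sku", "week"])

# Lag features: last week's and last year's volume per SKU.
sales["units_lag_1"] = sales.groupby("sku")["units"].shift(1)
sales["units_lag_52"] = sales.groupby("sku")["units"].shift(52)
sales = sales.dropna()

features = ["price", "promo_flag", "units_lag_1", "units_lag_52"]
train = sales[sales["week"] < "2024-07-01"]    # time-based split, not random
test = sales[sales["week"] >= "2024-07-01"]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train[features], train["units"])

pred = model.predict(test[features])
print("MAPE:", mean_absolute_percentage_error(test["units"], pred))
```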
  • Data Scientist

    Psychogenics (4.5 company rating)

    Data engineer job in Paramus, NJ

    **A spinout from PsychoGenics Inc, Axelyra** is a well-funded biotech startup focused on unmet needs in psychiatric and neurological disorders. We leverage proprietary preclinical platforms and AI-driven analytics spanning both clinical and preclinical behavioral and electrophysiological (EEG) phenotyping to enable a compound re-innovation strategy. We are building AI-enabled platforms that support clinical development, including tools and models that help quantify treatment response, stratify patients, and accelerate learning across trials. Our work is highly multidisciplinary, with day-to-day collaboration across data science/engineering, biologists, translational scientists, and clinicians.

    PsychoGenics is a preclinical CRO with expertise in central nervous system (CNS) and orphan disorders. PsychoGenics is known for its cutting-edge translational approach to research, customized solutions, the breadth and quality of our work, as well as for our ability to identify statistically relevant phenotypic changes that help clients quantify the efficacy of their treatments. With an extensive portfolio of highly predictive disease models and unparalleled experience performing studies for biopharmaceutical companies of all sizes, we enable clients to deliver much-needed superior clinical candidates to patients.

    **Location:** Paramus, NJ (hybrid remote allowed)
    **Level:** Open (commensurate with experience)

    **Responsibilities:**
    + Design, develop, and evaluate ML/DL models for EEG and behavioral datasets.
    + Perform time-series analytics and feature engineering; build interpretable model outputs for scientific stakeholders.
    + Create dashboards and visualizations supporting translational insights and clinical trial readouts.
    + Apply rigorous statistical analysis (study design support, QC, model evaluation, uncertainty).
    + Work closely with biologists and clinicians to translate scientific questions into measurable signals and actionable endpoints.

    (A minimal EEG feature-extraction code sketch follows this listing.)

    **Required Qualifications and Skills:**
    + Bachelor's degree in Computer Science, Software Engineering, Electrical Engineering, Statistics, Mathematics, or a related field. Master's degree is a plus.
    + Strong programming and software design skills in Python, with a thorough understanding of software engineering principles, application packaging, and deployment.
    + Strong experience with data analysis libraries or tools (e.g., **NumPy, Pandas, Jupyter notebooks**; reproducible workflows).
    + Experience in developing DL models (e.g., Transformers, CNNs, RNN/LSTM).
    + Solid background in statistics.
    + Experience with time-series data.
    + Experience with computer vision methods (as relevant to behavioral video/pose/features).
    + Proficiency with Git and GitHub workflows.
    + Comfort working with biomedical/clinical data and interpreting biological signals (EEG, behavior, trial endpoints).
    + **Collaborative and driven, with experience working in a fast-paced start-up environment.**
    + Excellent analytical and problem-solving skills, with the ability to deeply understand the "why" and "how" of our work.
    + Strong communication and teamwork abilities.

    Salary will be determined by factors including job-related skills, experience, knowledge, education and geographic location.

    **Total Rewards Plan** Axelyra offers competitive performance-based compensation, comprehensive benefits, and career development opportunities. This role is eligible for an annual performance-based raise and a discretionary bonus based on both individual and PsychoGenics annual performance.

    Our benefits include:
    + Medical, dental, vision and life insurance plans
    + HSA, FSA and dependent FSA
    + Short- and long-term disability plans
    + Stock-based incentive plan
    + 401K with company contribution
    + Tuition reimbursement
    + Generous paid time off policy and paid parental leave policy
    + EAP
    + Additional voluntary benefits

    **Axelyra is a veteran/disability/equal opportunity employer.** We take pride in maintaining a diverse work environment. We do not discriminate in recruitment, hiring, training, promotion, or other employment practices for reasons of race, color, religion, age, sex (including pregnancy, gender identity or expression and sexual orientation), parental status, national origin, disability, genetic information (including family medical history), political affiliation, status as a protected veteran, or any other legally protected status. We strive to create a workplace that cultivates bold innovation through collaboration and allows our people to unleash their full potential.

    **Reasonable Accommodation** PsychoGenics provides qualified individuals with reasonable accommodation during the application and hiring process. If you require assistance or need reasonable accommodation due to disability, please contact Human Resources at: ************************
    $80k-117k yearly est. 3d ago
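The Axelyra role above combines EEG signal processing with machine learning. The sketch below shows one conventional approach, band-power features from a Welch power spectral density feeding a simple classifier, on simulated data; it is illustrative only, with invented sampling rate, epoch shape, and labels, and is not the company's actual pipeline.

```python
# EEG band-power features + classifier on simulated epochs (placeholder data only).
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                           # sampling rate in Hz (assumed)
epochs = rng.standard_normal((200, 8, fs * 4))     # 200 epochs, 8 channels, 4 s each
labels = rng.integers(0, 2, size=200)              # e.g., treatment vs. vehicle

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """Mean spectral power per channel in canonical EEG bands (Welch PSD)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

X = np.vstack([band_powers(e) for e in epochs])
print(cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5).mean())
```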
  • Data Engineer w/ AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation

    Intermedia Group

    Data engineer job in Ridgefield, CT

    OPEN JOB: Data Engineer w/ AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation
    HYBRID: the candidate will work on site 2-3x per week at the Ridgefield, CT location
    SALARY: $140,000 to $185,000
    2 openings
    NOTE: CANDIDATE MUST BE US CITIZEN OR GREEN CARD HOLDER

    We are seeking a highly skilled and experienced Data Engineer to design, build, and maintain our scalable and robust data infrastructure on a cloud platform. In this pivotal role, you will be instrumental in enhancing our data infrastructure, optimizing data flow, and ensuring data availability. You will be responsible for both the hands-on implementation of data pipelines and the strategic design of our overall data architecture. We are seeking a candidate with hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation; proficiency in Python and SQL; and DevOps/CI/CD experience.

    Duties & Responsibilities
    * Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
    * Collaborate with data architects, modelers and IT team members to help define and evolve the overall cloud-based data architecture strategy, including data warehousing, data lakes, streaming analytics, and data governance frameworks.
    * Collaborate with data scientists, analysts, and other business stakeholders to understand data requirements and deliver solutions.
    * Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift), ensuring data quality, integrity, security, and accessibility.
    * Implement data quality and validation processes to ensure data accuracy and reliability.
    * Develop and maintain documentation for data processes, architecture, and workflows.
    * Monitor and troubleshoot data pipeline performance and resolve issues promptly.
    * Consulting and Analysis: Meet regularly with defined clients and stakeholders to understand and analyze their processes and needs. Determine requirements to present possible solutions or improvements.
    * Technology Evaluation: Stay updated with the latest industry trends and technologies to continuously improve data engineering practices.

    (A minimal AWS Glue job skeleton follows this listing.)

    Requirements
    * Cloud Expertise: Expert-level proficiency in at least one major cloud platform (AWS, Azure, or GCP) with extensive experience in their respective data services (e.g., AWS S3, Glue, Lambda, Redshift, Kinesis; Azure Data Lake, Data Factory, Synapse, Event Hubs; GCP BigQuery, Dataflow, Pub/Sub, Cloud Storage); experience with the AWS data cloud platform preferred.
    * SQL Mastery: Advanced SQL writing and optimization skills.
    * Data Warehousing: Deep understanding of data warehousing concepts, Kimball methodology, and various data modeling techniques (dimensional, star/snowflake schemas).
    * Big Data Technologies: Experience with big data processing frameworks (e.g., Spark, Hadoop, Flink) is a plus.
    * Database Systems: Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
    * DevOps/CI/CD: Familiarity with DevOps principles and CI/CD pipelines for data solutions.
    * Hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation.
    * Proficiency in Python and SQL.

    Desired Skills, Experience and Abilities
    * 4+ years of progressive experience in data engineering, with a significant portion dedicated to cloud-based data platforms.
    * ETL/ELT Tools: Hands-on experience with ETL/ELT tools and orchestrators (e.g., Apache Airflow, Azure Data Factory, AWS Glue, dbt).
    * Data Governance: Understanding of data governance, data quality, and metadata management principles.
    * AWS Experience: Ability to evaluate AWS cloud applications and make architecture recommendations; AWS Solutions Architect certification (Associate or Professional) is a plus.
    * Familiarity with Snowflake.
    * Knowledge of dbt (data build tool).
    * Strong problem-solving skills, especially in data pipeline troubleshooting and optimization.

    If you are interested in pursuing this opportunity, please respond back and include the following: full CURRENT resume, required compensation, contact information, and availability. Upon receipt, one of our managers will contact you to discuss in full.

    STEPHEN FLEISCHNER, Recruiting Manager, INTERMEDIA GROUP, INC. EMAIL: *******************************
    $140k-185k yearly 60d+ ago
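The listing above asks for hands-on AWS Glue experience. Below is a skeleton of a Glue PySpark job that reads from the Glue Data Catalog, filters records, and writes Parquet to S3; the database, table, and bucket names are placeholders, and the script assumes it runs inside the Glue job environment (which provides the awsglue libraries).

```python
# Skeleton AWS Glue PySpark job; catalog and S3 names are hypothetical.
import sys
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source table registered in the Glue Data Catalog (e.g., by a crawler).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Keep only completed orders before loading the curated zone.
completed = Filter.apply(frame=orders, f=lambda row: row["status"] == "COMPLETED")

glue_context.write_dynamic_frame.from_options(
    frame=completed,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()
```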
  • ETL/Data Platform Engineer

    Clarapath

    Data engineer job in Hawthorne, NY

    Job Description
    JOB TITLE: ETL/Data Platform Engineer
    TYPE: Full time, regular
    COMPENSATION: $130,000 - $180,000/yr

    Clarapath is a medical robotics company based in Westchester County, NY. Our mission is to transform and modernize laboratory workflows with the goal of improving patient care, decreasing costs, and enhancing the quality and consistency of laboratory processes. SectionStar™ by Clarapath is a ground-breaking electro-mechanical system designed to elevate and automate the workflow in histology laboratories and provide pathologists with the tissue samples they need to make the most accurate diagnoses. Through the use of innovative technology, data, and precision analytics, Clarapath is paving the way for a new era of laboratory medicine.

    Role Summary: The ETL/Data Platform Engineer will play a key role in designing, building, and maintaining Clarapath's data pipelines and platform infrastructure supporting SectionStar™, our advanced electro-mechanical device. This role requires a strong foundation in data engineering, including ETL/ELT development, data modeling, and scalable data platform design. Working closely with cross-functional teams, including software, firmware, systems, and mechanical engineering, this individual will enable reliable ingestion, transformation, and storage of device and operational data. The engineer will help power analytics, system monitoring, diagnostics, and long-term insights that support product performance, quality, and continuous improvement. We are seeking a proactive, detail-oriented engineer who thrives in a fast-paced, rapidly growing environment and is excited to apply data engineering best practices to complex, data-driven challenges in a regulated medical technology setting.

    Responsibilities:
    * Design, develop, and maintain robust ETL/ELT pipelines for device, telemetry, and operational data
    * Build and optimize data models to support analytics, reporting, and system insights
    * Develop and maintain scalable data platform infrastructure (cloud and/or on-prem)
    * Ensure data quality, reliability, observability, and performance across pipelines
    * Support real-time or near real-time data ingestion where applicable
    * Collaborate with firmware and software teams to integrate device-generated data
    * Enable dashboards, analytics, and internal tools for engineering, quality, and operations teams
    * Implement best practices for data security, access control, and compliance
    * Troubleshoot pipeline failures and improve system resilience
    * Document data workflows, schemas, and platform architecture

    (A minimal telemetry ETL code sketch follows this listing.)

    Qualifications:
    * Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
    * 3+ years of experience in data engineering, ETL development, or data platform roles
    * Strong proficiency in SQL and at least one programming language (Python preferred)
    * Experience building and maintaining ETL/ELT pipelines
    * Familiarity with data modeling concepts and schema design
    * Experience with cloud platforms (AWS, GCP, or Azure) or hybrid environments
    * Understanding of data reliability, monitoring, and pipeline orchestration
    * Strong problem-solving skills and attention to detail
    * Experience with streaming data or message-based systems (ex: Kafka, MQTT), a plus
    * Experience working with IoT, device, or telemetry data, a plus
    * Familiarity with data warehouses and analytics platforms, a plus
    * Experience in regulated environments (medical device, healthcare, life sciences), a plus
    * Exposure to DevOps practices, CI/CD, or infrastructure-as-code, a plus

    Company Offers:
    * Competitive salary, commensurate with experience and education
    * Comprehensive benefits package available (healthcare, vision, dental and life insurance; 401k; PTO and holidays)
    * A collaborative and diverse work environment where our teams thrive on solving complex challenges
    * Ability to file IP with the company
    * Connections with world-class researchers and their laboratories
    * Collaboration with strategic leaders in the healthcare and pharmaceutical world
    * A mission-driven organization where every team member will be responsible for changing the standards of delivering healthcare

    Clarapath is proud to be an equal opportunity employer. We are committed to providing equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. In addition to federal law requirements, Clarapath complies with applicable state and local laws governing nondiscrimination in employment. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
    $130k-180k yearly 5d ago
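The Clarapath role above is about ETL for device telemetry with data-quality controls. The following is a minimal pandas sketch of an extract-transform-load step with basic validation; the column names, file locations, and plausibility thresholds are hypothetical, not SectionStar's real schema.

```python
# Minimal telemetry ETL sketch with basic quality checks (all names are placeholders).
import pandas as pd

REQUIRED = ["device_id", "timestamp", "blade_temp_c", "slice_thickness_um"]

def extract(path: str) -> pd.DataFrame:
    # Newline-delimited JSON telemetry exported by the device fleet (assumed format).
    return pd.read_json(path, lines=True)

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    missing = set(REQUIRED) - set(raw.columns)
    if missing:
        raise ValueError(f"telemetry missing columns: {missing}")
    df = raw[REQUIRED].copy()
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)
    # Drop duplicate events and physically implausible readings.
    df = df.drop_duplicates(subset=["device_id", "timestamp"])
    df = df[df["slice_thickness_um"].between(1, 50)]
    return df

def load(df: pd.DataFrame, out_path: str) -> None:
    df.to_parquet(out_path, index=False)

if __name__ == "__main__":
    load(transform(extract("telemetry.jsonl")), "curated/telemetry.parquet")
```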
  • Network Planning Data Scientist (Manager)

    Atlas Air (4.9 company rating)

    Data engineer job in White Plains, NY

    Atlas Air is seeking a detail-oriented and analytical Network Planning Analyst to help optimize our global cargo network. This role plays a critical part in the 2-year to 11-day planning window, driving insights that enable operational teams to execute the most efficient and reliable schedules. The successful candidate will provide actionable analysis on network delays, utilization trends, and operating performance, build models and reports to govern network operating parameters, and contribute to the development and implementation of software optimization tools that improve reliability and streamline planning processes. This position requires strong analytical skills, a proactive approach to problem-solving, and the ability to translate data into operational strategies that protect service quality and maximize network efficiency.

    Responsibilities
    * Analyze and Monitor Network Performance
      - Track and assess network delays, capacity utilization, and operating constraints to identify opportunities for efficiency gains and reliability improvements.
      - Develop and maintain key performance indicators (KPIs) for network operations and planning effectiveness.
    * Modeling & Optimization
      - Build and maintain predictive models to assess scheduling scenarios and network performance under varying conditions.
      - Support the design, testing, and implementation of software optimization tools to enhance operational decision-making.
    * Reporting & Governance
      - Develop periodic performance and reliability reports for customers, assisting in presentation creation.
      - Produce regular and ad hoc reports to monitor compliance with established operating parameters.
      - Establish data-driven processes to govern scheduling rules, protect operational integrity, and ensure alignment with reliability targets.
    * Cross-Functional Collaboration
      - Partner with Operations, Planning, and Technology teams to integrate analytics into network planning and execution.
      - Provide insights that inform schedule adjustments, fleet utilization, and contingency planning.
    * Innovation & Continuous Improvement
      - Identify opportunities to streamline workflows and automate recurring analyses.
      - Contribute to the development of new planning methodologies and tools that enhance decision-making and operational agility.

    (A minimal on-time-performance KPI code sketch follows this listing.)

    Qualifications
    * Proficiency in SQL (Python and R are a plus) for data extraction and analysis; experience building decision-support tools and reporting dashboards (e.g., Tableau, Power BI).
    * Bachelor's degree required in Industrial Engineering, Operations Research, Applied Mathematics, Data Science or a related quantitative discipline, or equivalent work experience.
    * 5+ years of experience in strategy, operations planning, finance or continuous improvement, ideally with airline network planning.
    * Strong analytical skills with experience in statistical analysis, modeling, and scenario evaluation.
    * Strong problem-solving skills with the ability to work in a fast-paced, dynamic environment.
    * Excellent communication skills with the ability to convey complex analytical findings to non-technical stakeholders.
    * A proactive, solution-focused mindset with a passion for operational excellence and continuous improvement.
    * Knowledge of operations, scheduling, and capacity planning, ideally in airlines, transportation or other complex network operations.

    Salary Range: $131,500 - $177,500. The financial offer within the stated range will be based on multiple factors including, but not limited to, location, relevant experience/level and skillset.

    The Company is an Equal Opportunity Employer. It is our policy to afford equal employment opportunity to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, national origin, citizenship, place of birth, age, disability, protected veteran status, gender identity or any other characteristic or status protected by applicable law in accordance with federal, state and local laws. If you'd like more information about your EEO rights as an applicant under the law, please download the available EEO is the Law document at ****************************************** To view our Pay Transparency Statement, please click here: Pay Transparency Statement. "Know Your Rights: Workplace Discrimination is Illegal" Poster. The "EEO Is The Law" Poster.
    $131.5k-177.5k yearly 35d ago
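The Atlas Air role above revolves around KPIs for network delays and reliability. Below is a small illustrative pandas rollup of on-time performance by route and month; the input file and column names are assumed, and the 15-minute threshold is simply the common D15 convention, not an Atlas Air specification.

```python
# Illustrative on-time-performance KPI rollup on a hypothetical flight-leg extract.
import pandas as pd

legs = pd.read_csv("flight_legs.csv", parse_dates=["sched_dep", "actual_dep"])

legs["dep_delay_min"] = (legs["actual_dep"] - legs["sched_dep"]).dt.total_seconds() / 60
legs["on_time"] = legs["dep_delay_min"] <= 15          # D15 on-time convention

kpi = (
    legs.groupby(["route", legs["sched_dep"].dt.to_period("M")])
        .agg(legs_flown=("on_time", "size"),
             on_time_rate=("on_time", "mean"),
             avg_delay_min=("dep_delay_min", "mean"))
        .reset_index()
)

# Surface the worst-performing route-months first for planners to review.
print(kpi.sort_values("on_time_rate").head(10))
```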
  • Data Scientist

    Akkodis

    Data engineer job in Ridgefield, NJ

    Akkodis is seeking a Data Scientist for a contract with a client in Basking Ridge, NJ. You will be responsible for analyzing large-scale wireless and IEN alarm data to detect patterns, reduce noise, and improve network reliability through KPIs, dashboards, and ML-driven insights.

    Rate Range: $50/hour to $53/hour; the rate may be negotiable based on experience, education, geographic location, and other factors.

    Data Scientist job responsibilities include:
    * Analyze large-scale alarm and time-series data from OSS/NMS/EMS systems to identify patterns, recurring fault signatures, and cross-domain trends (RAN, Core, IEN).
    * Build and deploy ML models for alarm correlation, noise reduction, anomaly detection, root cause analysis, and predictive fault forecasting using Python/R (Pandas, NumPy, Scikit-learn, PySpark).
    * Design KPIs, dashboards, and real-time monitoring in tools like Tableau/Power BI/Grafana to track network health, MTTR/SLA, and reliability metrics for NOC and engineering teams.
    * Clean, normalize, and enrich multi-source data (OSS, EMS, NMS, CMDB, performance systems); integrate pipelines on big data platforms (Spark/Hadoop) and streaming tools (Kafka/Flink).
    * Automate fault insight pipelines and model operations, including versioning, performance tracking, and documentation; collaborate with NOC, Network Engineering, and Reliability teams to operationalize insights.
    * Communicate findings to technical and non-technical stakeholders, driving proactive maintenance, incident prevention initiatives, and continuous improvement of AIOps/network intelligence capabilities.

    (A minimal alarm anomaly-detection code sketch follows this listing.)

    Required Qualifications:
    * Bachelor's or Master's degree in Computer Science, Data Science, Telecommunications, or a related field.
    * 3-5 years of experience in data analytics, machine learning, or network fault management.
    * Strong proficiency in Python or R with experience in libraries such as Pandas, NumPy, Scikit-learn, and PySpark.
    * Hands-on experience with wireless networks (2G/3G/4G/5G), OSS/NMS systems, and big data platforms like Spark or Hadoop.

    If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, feel free to contact me at *****************************.

    Pay Details: $50.00 to $53.00 per hour

    Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, EAP program, commuter benefits and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable.

    Equal Opportunity Employer/Veterans/Disabled. Military connected talent encouraged to apply.

    To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to ******************************************************

    The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
    * The California Fair Chance Act
    * Los Angeles City Fair Chance Ordinance
    * Los Angeles County Fair Chance Ordinance for Employers
    * San Francisco Fair Chance Ordinance

    Massachusetts Candidates Only: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
    $50-53 hourly 6d ago
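The Akkodis listing above describes alarm noise reduction and anomaly detection on network alarm feeds. The sketch below aggregates a hypothetical alarm feed into hourly per-site features and flags unusual hours with scikit-learn's IsolationForest; the column names and contamination rate are assumptions, not the client's actual OSS schema.

```python
# Alarm-volume anomaly detection sketch on a hypothetical alarm export.
import pandas as pd
from sklearn.ensemble import IsolationForest

alarms = pd.read_csv("alarms.csv", parse_dates=["raised_at"])

# Aggregate the raw feed into hourly per-site features to cut alarm noise.
hourly = (
    alarms.assign(hour=alarms["raised_at"].dt.floor("h"),
                  is_critical=alarms["severity"].eq("CRITICAL"))
          .groupby(["site_id", "hour"])
          .agg(alarm_count=("severity", "size"),
               critical_count=("is_critical", "sum"))
          .reset_index()
)

model = IsolationForest(contamination=0.01, random_state=0)
hourly["anomaly"] = model.fit_predict(hourly[["alarm_count", "critical_count"]])

# -1 marks hours whose alarm profile deviates from typical behaviour, for NOC review.
print(hourly[hourly["anomaly"] == -1].head())
```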
  • Data Engineer

    Orion Innovation (3.7 company rating)

    Data engineer job in Montvale, NJ

    Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.

    Role Overview

    We're seeking Financial Statement Data Developers/Engineers to support client engagements. You'll work with modern data tooling to transform accounting trial balance data into accurate Income Statements and Balance Sheets. This role blends data engineering, financial data transformation, and client-facing consulting.

    Key Responsibilities
    * Ingest and transform trial balance data from various sources (CSV, Excel, ERP extract, database).
    * Apply chart of accounts mappings, including entity-specific variations.
    * Utilize purpose-built applications to aggregate outputs into Income Statement and Balance Sheet formats.
    * Build repeatable Alteryx workflows, SQL transformations, and Python scripts for automation.
    * Perform data reconciliation, tie-outs, variance checks, and data quality controls across entities and periods.

    (A minimal trial-balance roll-up code sketch follows this listing.)

    Required Technical Skills
    * Strong proficiency in SQL, Python and Alteryx.
    * Hands-on experience in accounting database development, data engineering and analytics.
    * Accounting fundamentals: trial balance, chart of accounts, journal entries, Income Statement and Balance Sheet.
    * Experience converting ledger-level data into financial statements and performing reconciliations.
    * Strong communication skills and client-facing experience; ability to work onsite when required.
    * Experience with cloud environments (Azure).

    Qualifications & Experience
    * Education: Bachelor's degree in Computer Science, Accounting or a related field (or equivalent experience).
    * Experience: 8+ years of hands-on experience with database development and scripting.

    Preferred nice-to-have skills
    * Exposure to M&A/Deal Advisory workflows: multi-entity consolidation, intercompany eliminations, pro forma reporting.
    * Proficiency with Power BI reporting and validation dashboards.
    * ERP familiarity (e.g., SAP, Oracle).
    * Big-4 or top-tier Professional Services experience is a plus.

    Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.

    Candidate Privacy Policy: Orion Systems Integrators, LLC and its subsidiaries and its affiliates (collectively, “Orion,” “we” or “us”) are committed to protecting your privacy. This Candidate Privacy Policy (orioninc.com) (“Notice”) explains: what information we collect during our application and recruitment process and why we collect it; how we handle that information; and how to access and update that information. Your use of Orion services is governed by any applicable terms in this notice and our general Privacy Policy.
    $93k-133k yearly est. Auto-Apply 5d ago
  • Data Engineer (AI, ML, and Data Science)

    Consumer Reports

    Data engineer job in Yonkers, NY

WHO WE ARE
Consumer Reports is an independent, nonprofit organization dedicated to a fair and just marketplace for all. CR is known for our rigorous testing and trusted ratings on thousands of products and services. We report extensively on consumer trends and challenges, and survey millions of people in the U.S. each year. We leverage our evidence-based approach to advocate for consumer rights, working with policymakers and companies to find solutions for safer products and fair practices. Our mission starts with you. We offer medical benefits that start on your first day as a CR employee, including behavioral health coverage, family planning and a generous 401K match. Learn more about how CR advocates on behalf of our employees.
OVERVIEW
Data powers everything we do at CR, and it's the foundation for our AI and machine learning efforts that are transforming how we serve consumers. The Data Engineer (AI/ML & Data Science) will play a critical role in building the data infrastructure that powers advanced AI applications, machine learning models, and analytics systems across CR. Reporting to the Associate Director, AI/ML & Data Science, in this role you will design and maintain robust data pipelines and services that support experimentation, model training, and AI application deployment. If you're passionate about solving complex data challenges, working with cutting-edge AI technologies, and enabling impactful, data-driven products that support CR's mission, this is the role for you. This is a hybrid position. This position is not eligible for sponsorship or relocation assistance.
How You'll Make An Impact
As a mission-based organization, CR and our Software team are pursuing an AI strategy that will drive value for our customers, give our employees superpowers, and address AI harms in the digital marketplace. We're looking for an AI/ML engineer to help us execute on our multi-year roadmap around generative AI. As a Data Engineer (AI/ML & Data Science) you will:
* Design, develop, and maintain ETL/ELT pipelines for structured and unstructured data to support AI/ML model and application development, evaluation, and monitoring.
* Build and optimize data processing workflows in Databricks, AWS SageMaker, or similar cloud platforms.
* Collaborate with AI/ML engineers to deliver clean, reliable datasets for model training and inference.
* Implement data quality, observability, and lineage tracking within the ML lifecycle.
* Develop data APIs/microservices to power AI applications and reporting/analytics dashboards.
* Support the deployment of AI/ML applications by building and maintaining feature stores and data pipelines optimized for production workloads.
* Ensure adherence to CR's data governance, security, and compliance standards across all AI and data workflows.
* Work with Product, Engineering and other stakeholders to define project requirements and deliverables.
* Integrate data from multiple internal and external systems, including APIs, third-party datasets, and cloud storage.
ABOUT YOU
You'll Be Highly Rated If:
* You have the experience. You have 3+ years of experience designing and developing data pipelines, data models/schemas, APIs, or services for analytics or ML workloads.
* You have the education. You've earned a Bachelor's degree in Computer Science, Engineering, or a related field.
* You have programming skills. You are skilled in Python and SQL, and have experience with PySpark on large-scale datasets.
* You have experience with data orchestration tools such as Airflow, dbt and Prefect, plus CI/CD pipelines for data delivery.
* You have experience with data and AI/ML platforms such as Databricks, AWS SageMaker or similar.
* You have experience working with Kubernetes on cloud platforms such as AWS, GCP, or Azure.
You'll Be One of Our Top Picks If:
* You are passionate about automation and continuous improvement.
* You have excellent documentation and technical communication skills.
* You are an analytical thinker with troubleshooting abilities.
* You are self-driven and proactive in solving infrastructure bottlenecks.
FAIR PAY AND A JUST WORKPLACE
At Consumer Reports, we are committed to fair, transparent pay and we strive to provide competitive, market-informed compensation. The target salary range for this position is $100K-$120K. It is anticipated that most qualified candidates will fall near the middle of this range. Compensation for the successful candidate will be informed by the candidate's particular combination of knowledge, skills, competencies, and experience. We have three locations: Yonkers, NY, Washington, DC and Colchester, CT. We are registered to do business in and can only hire from the following states and federal district: Arizona, California, Connecticut, Illinois, Maryland, Massachusetts, Michigan, New Hampshire, New Jersey, New York, Texas, Vermont, Virginia and Washington, DC.
Salary ranges:
* NY/California: $120K-$140K annually
* DMV/Massachusetts: $115K-$135K annually
* Colchester, CT and additional approved CR locations: $100K-$120K annually
Consumer Reports is an equal opportunity employer and does not discriminate in employment on the basis of actual or perceived race, color, creed, religion, age, national origin, ancestry, citizenship status, sex or gender (including pregnancy, childbirth, related medical conditions or lactation), gender identity and expression (including transgender status), sexual orientation, marital status, military service or veteran status, protected medical condition as defined by applicable state or local law, disability, genetic information, or any other basis protected by applicable federal, state or local laws. Consumer Reports will provide you with any reasonable assistance or accommodation for any part of the application and hiring process.
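As a rough illustration of the pipeline and data-quality responsibilities listed above, the sketch below shows a tiny extract-validate-load step in Python that gates loading on explicit quality checks. The table names, checks, and connection string are hypothetical placeholders; CR's actual platforms (Databricks, SageMaker, etc.) are not assumed here.

```python
# Illustrative only: a simple extract-validate-load step of the kind the posting
# describes. Table names, checks, and the DSN are assumptions, not CR's stack.
import pandas as pd
from sqlalchemy import create_engine

def extract(engine) -> pd.DataFrame:
    # Pull a raw table from the source system.
    return pd.read_sql("SELECT * FROM raw.product_reviews", engine)

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Basic data-quality gate: fail fast before anything reaches the curated layer.
    checks = {
        "no_null_ids": df["review_id"].notna().all(),
        "ratings_in_range": df["rating"].between(1, 5).all(),
        "not_empty": not df.empty,
    }
    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        raise ValueError(f"Data quality checks failed: {failed}")
    return df.drop_duplicates(subset="review_id")

def load(df: pd.DataFrame, engine) -> None:
    # Replace the curated table with the validated snapshot.
    df.to_sql("product_reviews", engine, schema="curated",
              if_exists="replace", index=False)

if __name__ == "__main__":
    engine = create_engine("postgresql+psycopg2://user:pass@host/db")  # placeholder DSN
    load(validate(extract(engine)), engine)
```

In a production setting the same extract/validate/load functions would typically be wired into an orchestrator such as Airflow or Prefect rather than run as a script.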
    $120k-140k yearly Auto-Apply 45d ago
  • Strategy Data & AI Consultant - AIC 26-00535

    Navitaspartners

    Data engineer job in North Bergen, NJ

Job Title: Strategy Data & AI Consultant
Contract Duration: 12 Months
Work Schedule: Part-time (4 days per week), with potential to transition to full-time
An Engineering Operations organization is seeking an experienced Strategy Data & AI Consultant to support digital transformation initiatives within its Engineering Solutions and Application Management environment. This role will focus on defining and executing enterprise-wide Data Governance and AI strategies, while partnering closely with technical and business stakeholders to drive innovation, responsible AI adoption, and organizational data maturity. This position requires onsite presence as part of a hybrid work arrangement.
Key Responsibilities
* Develop and implement a Data Governance Strategy to support digital transformation initiatives
* Define and lead an AI Strategy, including AI adoption, governance frameworks, and staff upskilling
* Assess organizational data maturity and create a roadmap to advance data capabilities
* Integrate Responsible AI principles into strategic vision and execution
* Drive cultural change focused on innovation, experimentation, and AI adoption
* Establish processes to curate and evaluate AI innovations across the organization
* Develop and implement an AI Governance Risk Assessment framework
* Identify required people, processes, and technologies to design, deploy, and operate AI solutions
* Lead and manage Data Governance initiatives from discovery through delivery and knowledge transfer
* Collaborate closely with technical teams to align Data & AI strategy with broader organizational AI initiatives
* Address complex business and technical challenges through strategic analysis and solution design
* Manage stakeholder relationships to identify opportunities for efficiency, collaboration, and ROI
* Plan, assign, and oversee project work to ensure timely delivery and effective resource utilization
* Facilitate workshops, interviews, and working groups to gather requirements and drive change initiatives
Required Education & Experience
* Bachelor's degree in Engineering, Data Science, Computer Science, or a related field (Master's degree highly desirable)
* 3+ years of experience leading AI strategy and execution in large organizations
* 5+ years of experience working in or supporting engineering, design, or construction environments
* 2+ years of hands-on experience with Microsoft Azure AI and storage platforms
* Strong proficiency with Microsoft Office 365 (Word, PowerPoint, Excel)
* Demonstrated proficiency with Adobe Acrobat Pro
Required Soft Skills
* Excellent analytical, conceptual, and problem-solving skills
* Strong verbal and written communication abilities
* Self-motivated, with the ability to manage multiple priorities and deadlines in a dynamic environment
Preferred Qualifications
* Experience with additional AI platforms and technologies
For more details, reach out at ***********************
About Navitas Partners, LLC: Navitas Partners is a WBENC-certified company and one of the fastest-growing technical/IT staffing firms in the US, providing services to numerous clients. We offer the most competitive pay for every position. We understand this is a partnership: you will not be blindsided, and your salary will be discussed upfront.
    $92k-124k yearly est. Easy Apply 4d ago
  • Onsite Data Engineer Miami

    It Search Corp

    Data engineer job in Norwood, NJ

Benefits:
* 401(k) matching
* Bonus based on performance
* Dental insurance
* Health insurance
Data Engineer - Onsite, Miami, FL - $130-150K base, 10% bonus
We are seeking an experienced Senior Data Engineer to lead the design, development, and optimization of end-to-end data pipelines and cloud-based solutions. You will be responsible for architecting scalable data and analytic systems, ensuring data integrity, and implementing software engineering best practices and patterns. The ideal candidate has a strong background in ETL, big data technologies, and cloud services, with a proven ability to drive complex projects from concept to production.
Primary Responsibilities/Essential Functions
This job description in no way states or implies that these are the only duties to be performed by the teammate occupying this position. The selected candidate may perform other related duties assigned to meet the ongoing needs of the business.
Qualifications
Required:
* Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
* 5+ years of experience as a Data Engineer with expertise in building large-scale data solutions.
* Proficiency in Python, SQL, and scripting languages (Bash, PowerShell).
* Deep understanding of big data tools (Hadoop, Spark) and ETL processes.
* Hands-on experience with cloud platforms (AWS S3, Azure Data Lake, Google BigQuery, Snowflake).
* Strong knowledge of database systems (SQL, NoSQL), database design, and query optimization.
* Experience designing and managing data warehouses for performance and scalability.
* Proficiency in software engineering practices: version control (Git), CI/CD pipelines, and unit testing.
Preferred:
* Strong experience in software architecture, design patterns, and code optimization.
* Expertise in Python-based pipelines and ETL frameworks.
* Experience with Azure Data Services and Databricks.
* Excellent problem-solving, analytical, and communication skills.
* Experience working in agile environments and collaborating with diverse teams.
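As a small example of the software-engineering practices the posting emphasizes (version control, CI/CD, unit testing), here is a sketch of a pure, unit-testable ETL transformation in Python/pandas. The function name, column names, and deduplication rule are illustrative assumptions, not part of the employer's codebase.

```python
# Sketch only: a pure transformation plus a unit test, the kind of pattern that
# keeps pipeline logic testable in CI. Names and rules are illustrative.
import pandas as pd

def deduplicate_latest(events: pd.DataFrame) -> pd.DataFrame:
    """Keep only the most recent record per event_id, a common ETL cleanup step."""
    return (events.sort_values("updated_at")
                  .drop_duplicates(subset="event_id", keep="last")
                  .reset_index(drop=True))

def test_deduplicate_latest():
    events = pd.DataFrame({
        "event_id": [1, 1, 2],
        "updated_at": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
        "value": ["old", "new", "only"],
    })
    result = deduplicate_latest(events)
    assert len(result) == 2
    assert result.loc[result["event_id"] == 1, "value"].item() == "new"

if __name__ == "__main__":
    test_deduplicate_latest()
    print("ok")
```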
    $82k-112k yearly est. 10d ago
  • Tech Lead, Data & Inference Engineer

    Catalyst Labs

    Data engineer job in Stamford, CT

Our Client
A fast-moving, venture-backed advertising technology startup based in San Francisco. They have raised twelve million dollars in funding and are transforming how business-to-business marketers reach their ideal customers. Their identity resolution technology blends business and consumer signals to convert static audience lists into high-match, cross-channel segments without the use of cookies. By transforming first-party and third-party data into precision-targetable audiences across platforms such as Meta, Google and YouTube, they enable marketing teams to reach higher match rates, reduce wasted advertising spend and accelerate pipeline growth. With a strong understanding of how business buyers behave in channels that have traditionally been focused on business-to-consumer activity, they are redefining how business brands scale demand generation and account-based efforts.
About Us
Catalyst Labs is a leading talent agency with a specialized vertical in Applied AI, Machine Learning, and Data Science. We stand out as an agency that's deeply embedded in our clients' recruitment operations. We collaborate directly with Founders, CTOs, and Heads of AI who are driving the next wave of applied intelligence, from model optimization to productized AI workflows. We take pride in facilitating conversations that align with your technical expertise, creative problem-solving mindset, and long-term growth trajectory in the evolving world of intelligent systems.
Location: San Francisco
Work type: Full Time
Compensation: above-market base + bonus + equity
Roles & Responsibilities
* Lead the design, development and scaling of an end-to-end data platform, from ingestion to insights, ensuring that data is fast, reliable and ready for business use.
* Build and maintain scalable batch and streaming pipelines, transforming diverse data sources and third-party APIs into trusted, low-latency systems.
* Take full ownership of reliability, cost and service level objectives, including achieving 99.9% uptime, maintaining minutes-level latency and optimizing cost per terabyte. Conduct root cause analysis and provide long-lasting solutions.
* Operate inference pipelines that enhance and enrich data, including enrichment, scoring and quality assurance using large language models and retrieval augmented generation. Manage version control, caching and evaluation loops.
* Work across teams to deliver data as a product through the creation of clear data contracts, ownership models, lifecycle processes and usage-based decision making.
* Guide architectural decisions across the data lake and the entire pipeline stack. Document lineage, trade-offs and reversibility while making practical build-versus-buy decisions.
* Scale integration with APIs and internal services while ensuring data consistency, high data quality and support for both real-time and batch-oriented use cases.
* Mentor engineers, review code and raise the overall technical standard across teams. Promote data-driven best practices throughout the organization.
Qualifications
* Bachelor's or Master's degree in Computer Science, Computer Engineering, Electrical Engineering, or Mathematics.
* Excellent written and verbal communication; proactive and collaborative mindset.
* Comfortable in hybrid or distributed environments with strong ownership and accountability.
* A founder-level bias for action: the ability to identify bottlenecks, automate workflows, and iterate rapidly based on measurable outcomes.
* Demonstrated ability to teach, mentor, and document technical decisions and schemas clearly.
Core Experience
* 6 to 12 years of experience building and scaling production-grade data systems, with deep expertise in data architecture, modeling, and pipeline design.
* Expert SQL (query optimization on large datasets) and Python skills.
* Hands-on experience with distributed data technologies (Spark, Flink, Kafka) and modern orchestration tools (Airflow, Dagster, Prefect).
* Familiarity with dbt, DuckDB, and the modern data stack; experience with IaC, CI/CD, and observability.
* Exposure to Kubernetes and cloud infrastructure (AWS, GCP, or Azure).
* Bonus: Strong Node.js skills for faster onboarding and system integration.
* Previous experience at a high-growth startup (10 to 200 people) or early-stage environment with a strong product mindset.
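One pattern implied by the inference-pipeline responsibilities above is caching enrichment results so a model is not re-invoked for records it has already scored. The Python sketch below illustrates that caching loop with a stand-in scoring function; it is not the client's actual LLM integration, and the record fields and SQLite cache are assumptions made purely for illustration.

```python
# Conceptual sketch: enrichment with a content-addressed cache, so identical
# records are scored once. The scoring function is a placeholder, not a real LLM call.
import hashlib, json, sqlite3

def cache_key(record: dict) -> str:
    # Stable hash of the record contents.
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def score_record(record: dict) -> dict:
    # Placeholder for an LLM / retrieval-augmented enrichment call.
    return {"fit_score": 0.5, "segment": "unknown"}

def enrich(records: list[dict], db_path: str = "enrichment_cache.db") -> list[dict]:
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, value TEXT)")
    out = []
    for record in records:
        key = cache_key(record)
        row = conn.execute("SELECT value FROM cache WHERE key = ?", (key,)).fetchone()
        if row:
            enrichment = json.loads(row[0])      # cache hit: skip the model call
        else:
            enrichment = score_record(record)    # cache miss: call the model once
            conn.execute("INSERT INTO cache VALUES (?, ?)", (key, json.dumps(enrichment)))
            conn.commit()
        out.append({**record, **enrichment})
    conn.close()
    return out

print(enrich([{"company": "Acme", "employees": 120}]))
```

The same idea extends to evaluation loops: because results are keyed by record content, re-running a pipeline after a model or prompt change only re-scores what actually changed if the cache key also incorporates a model/prompt version.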
    $84k-114k yearly est. 60d+ ago
  • Data Engineer - Databricks

    A.M. Best 4.4company rating

    Data engineer job in Waldwick, NJ

* Flexible and hybrid work arrangements
* Paid time off/Paid company holidays
* Medical plan options/prescription drug plan
* Dental plan/vision plan options
* Flexible spending and health savings accounts
* 401(k) retirement savings plan with a Roth savings option and company matching contributions
* Educational assistance program
Overview
The Data Engineer is responsible for designing, building, and optimizing scalable data solutions to support a wide range of business needs. This role requires a strong ability to work both independently and collaboratively in a fast-paced, agile environment. The ideal candidate will engage with cross-functional teams to gather data requirements, propose enhancements to existing data pipelines and structures, and ensure the reliability and efficiency of data processes.
Responsibilities
* Assist with leading the team's transition to the Databricks platform and utilize newer features such as Delta Live Tables and Workflows
* Design and develop data pipelines that extract data from Oracle, load it into the data lake, transform it into the desired format, and load it into the Databricks data lakehouse
* Optimize data pipelines and data processing workflows for performance, scalability, and efficiency
* Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data
* Help create and maintain documentation for data mappings, data definitions, architecture and data flow diagrams
* Build proof-of-concepts to determine viability of possible new processes and technologies
* Deploy and manage code in non-prod and prod environments
* Investigate and troubleshoot data-related issues and fix or provide solutions to fix defects
* Identify and resolve performance bottlenecks, which could include suggesting ways to optimize and performance-tune databases and queries to enhance query performance
Qualifications
* Bachelor's Degree in Computer Science, Data Science, Software Engineering, Information Systems, or related quantitative field
* 4+ years of experience working as a Data Engineer, ETL Engineer, Data/ETL Architect or similar roles
* Must hold a current/active Databricks Data Engineer/Analyst certification
Skills
* 4+ years of solid continuous experience in Python
* 3+ years working with Databricks, with knowledge and expertise of data structures, data storage and change data capture gained from prior production implementations of data pipelines, optimizations, and best practices
* 3+ years of experience in Kimball dimensional modeling (star schema comprising facts, Type 1 and Type 2 dimensions, aggregates, etc.) with a solid understanding of ELT/ETL
* 3+ years of solid experience writing SQL and PL/SQL code
* 2+ years of experience with Airflow
* 3+ years of experience working with relational databases (Oracle preferred)
* 2+ years of experience working with NoSQL databases: MongoDB, Cosmos DB, DocumentDB or similar
* 2+ years of cloud experience (Azure preferred)
* Experience with CI/CD utilizing Git/Azure DevOps
* Experience with storage formats including Parquet/Arrow/Avro
* Ability to collaborate effectively with team members while being able to work independently with minimal supervision
* A creative mindset, a knack for solving complex problems, a passion for working with data, and a positive attitude
* Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products
* Expert problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems
Pluses, but not required - any work experience with the following:
* ETL/ELT tools: Spark, Kafka, Azure Data Factory (ADF)
* Languages: R, Java, Scala
* Databases: Redis, Elasticsearch
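To illustrate the Oracle-to-lakehouse flow described in the responsibilities above, here is a hedged PySpark sketch of an extract over JDBC, a basic data-quality gate, and an append into a Delta table. The JDBC URL, table names, and checks are placeholders, Delta Lake availability is assumed (as it is on Databricks), and this is not A.M. Best's actual pipeline.

```python
# Hedged sketch of an Oracle -> lake -> lakehouse step; assumes a Spark/Databricks
# environment with an Oracle JDBC driver and Delta Lake available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: pull a source table from Oracle via JDBC (connection details are placeholders).
ratings = (spark.read.format("jdbc")
           .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")
           .option("dbtable", "APP.RATINGS")
           .option("user", "etl_user").option("password", "***")
           .load())

# Transform: basic cleanup plus a data-quality gate before loading.
clean = (ratings
         .withColumn("load_ts", F.current_timestamp())
         .dropDuplicates(["rating_id"]))
bad_rows = clean.filter(F.col("rating_id").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed the not-null check on rating_id")

# Load: append into a Delta table in the lakehouse.
clean.write.format("delta").mode("append").saveAsTable("lakehouse.curated_ratings")
```

In a Delta Live Tables implementation, the same quality gate would typically be expressed declaratively as expectations on the pipeline tables rather than as an explicit count-and-raise step.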
    $99k-138k yearly est. Auto-Apply 9d ago
  • C++ Market Data Engineer (USA)

    Trexquant Investment 4.0company rating

    Data engineer job in Stamford, CT

Trexquant is a growing systematic fund at the forefront of quantitative finance, with a core team of highly accomplished researchers and engineers. To keep pace with our expanding global trading operations, we are seeking a C++ Market Data Engineer to design and build ultra-low-latency feed handlers for premier vendor feeds and major exchange multicast feeds. This is a high-impact role that sits at the heart of Trexquant's trading platform; the quality, speed, and reliability of your code directly influence every strategy we run.
Responsibilities
* Design and implement high-performance feed handlers in modern C++ for equities, futures, and options across global venues (e.g., NYSE, CME, Refinitiv RTS, Bloomberg B-PIPE).
* Optimize for micro- and nanosecond latency using lock-free data structures, cache-friendly memory layouts, and kernel-bypass networking where appropriate.
* Build reusable libraries for message decoding, normalization, and publication to internal buses shared by research, simulation, and live trading systems.
* Collaborate with cross-functional teams to tune TCP/UDP multicast stacks, kernel parameters, and NIC settings for deterministic performance.
* Provide robust failover, gap-recovery, and replay mechanisms to guarantee data integrity under packet loss or venue outages.
* Instrument code paths with precision timestamping and performance metrics; drive continuous latency regression testing and capacity planning.
* Partner closely with quantitative researchers to understand downstream data requirements and to fine-tune delivery formats for both simulation and live trading.
* Produce clear architecture documents, operational run-books, and post-mortems; participate in a 24×7 follow-the-sun support rotation for mission-critical market-data services.
Requirements
* BS/MS/PhD in Computer Science, Electrical Engineering, or related field.
* 3+ years of professional C++ (14/17/20) development experience focused on low-latency, high-throughput systems.
* Proven track record building or maintaining real-time market-data feeds (e.g., Refinitiv RTS/TREP, Bloomberg B-PIPE, OPRA, CME MDP, ITCH).
* Strong grasp of concurrency, lock-free algorithms, memory-model semantics, and compiler optimizations.
* Familiarity with serialization formats (FAST, SBE, Protocol Buffers) and time-series databases or in-memory caches.
* Comfort with scripting in Python for prototyping, testing, and ops automation.
* Excellent problem-solving skills, ownership mindset, and ability to thrive in a fast-paced trading environment.
* Familiarity with containerization (Docker/K8s) and public-cloud networking (AWS, GCP).
Benefits
* Competitive salary, plus bonus based on individual and company performance.
* Collaborative, casual, and friendly work environment while solving the hardest problems in the financial markets.
* PPO health, dental and vision insurance premiums fully covered for you and your dependents.
* Pre-tax commuter benefits.
Applications are now open for our NYC office, opening in September 2026. The base salary range is $175,000 - $200,000 depending on the candidate's educational and professional background. Base salary is one component of Trexquant's total compensation, which may also include a discretionary, performance-based bonus. Trexquant is an Equal Opportunity Employer.
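Production feed handlers of the kind described above are written in low-latency C++; purely to illustrate the gap-recovery idea in a compact, language-agnostic way, here is a conceptual Python sketch of sequence-gap detection with a replay request. The class and method names are invented for illustration and do not reflect any particular venue protocol.

```python
# Conceptual sketch only: sequence-gap detection and replay, the core of the
# "gap-recovery and replay mechanisms" mentioned above. Not production code.
from dataclasses import dataclass, field

@dataclass
class GapDetector:
    expected_seq: int = 1
    pending_gaps: list[tuple[int, int]] = field(default_factory=list)

    def on_packet(self, seq: int, payload: bytes) -> None:
        if seq == self.expected_seq:
            self.publish(payload)
            self.expected_seq = seq + 1
        elif seq > self.expected_seq:
            # Packets were missed: record the gap, request a replay from the venue
            # or a secondary feed, then continue from the new sequence number.
            self.pending_gaps.append((self.expected_seq, seq - 1))
            self.request_replay(self.expected_seq, seq - 1)
            self.publish(payload)
            self.expected_seq = seq + 1
        # seq < expected_seq: duplicate or late packet; drop it.

    def publish(self, payload: bytes) -> None:
        pass  # hand off to normalization / the internal bus

    def request_replay(self, start: int, end: int) -> None:
        print(f"replay requested for sequences {start}-{end}")

handler = GapDetector()
for seq in (1, 2, 5):          # packets 3 and 4 are lost
    handler.on_packet(seq, b"")
```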
    $175k-200k yearly Auto-Apply 60d+ ago
  • Data Architect - Power & Utilities - Senior Manager- Consulting - Location OPEN

    Ernst & Young Oman 4.7company rating

    Data engineer job in Stamford, CT

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
AI & Data - Data Architecture - Senior Manager - Power & Utilities Sector
EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager with a robust background in Data Architecture, Data Modernization, end-to-end data capabilities, AI, Gen AI, and Agentic AI, preferably with a power systems / electrical engineering background and a record of delivering business use cases in Transmission / Distribution / Generation / Customer. The ideal candidate will have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale, AI-ready data architectures.
The opportunity
You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high-growth area and a high-visibility role with plenty of opportunities to enhance your skill set and build your career. As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.
Your key responsibilities
In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions.
As Data Architect - Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients' problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.
Skills and attributes for success
To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:
Technical Skills
* Applications Integration
* Cloud Computing and Cloud Computing Architecture
* Data Architecture Design and Modelling
* Data Integration and Data Quality
* AI/Agentic AI driven data operations
* Experience delivering business use cases in Transmission / Distribution / Generation / Customer
* Strong relationship management and business development skills
* Become a trusted advisor to your clients' senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems
* Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements
* Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients' objectives
* Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, and key business processes, brainstorm solutions, and align on data architecture strategies and projects
* Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions
* Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement
* Establish data governance policies and practices, including data security, quality, and lifecycle management
* Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities
To qualify for the role, you must have
* A Bachelor's degree in STEM
* 12+ years of professional consulting experience in industry or in technology consulting
* 12+ years of hands-on experience in architecting, designing, delivering or optimizing data lake solutions
* 5+ years of experience with native cloud products and services such as Azure or GCP
* 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development
* In-depth knowledge of data architecture principles and best practices, including data modelling, data warehousing, data lakes, and data integration
* Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features, including various data flow patterns, relational and NoSQL databases, production-grade performance, and delivery to downstream use cases and applications
* Hands-on experience in designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably
* Proficiency in data modelling techniques and the ability to choose appropriate architectural design patterns, including Data Fabrics, Data Mesh, Lake Houses, or Delta Lakes
* Ability to manage complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimizations
* Previous hands-on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala
* Ability to design data solutions that can scale horizontally and vertically while optimizing performance
* Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads
* Experience with version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps)
* Practical understanding of data encryption, access control, and security best practices to protect sensitive data
* Experience leading Infrastructure and Security engineers and architects in overall platform build
* Excellent leadership, communication, and project management skills
* Data Security and Database Management
* Enterprise Data Management and Metadata Management
* Ontology Design and Systems Design
Ideally, you'll also have
* A Master's degree in Electrical / Power Systems Engineering, Computer Science, Statistics, Applied Mathematics, Data Science, Machine Learning, or commensurate professional experience
* Experience working at a Big 4 firm or a major utility
* Experience with cloud data platforms like Databricks
* Experience in leading and influencing teams, with a focus on mentorship and professional development
* A passion for innovation and the strategic application of emerging technologies to solve real-world challenges
* The ability to foster an inclusive environment that values diverse perspectives and empowers team members
* Building and Managing Relationships
* Client Trust and Value and Commercial Astuteness
* Communicating With Impact and Digital Fluency
What we look for
We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role. FY26NATAID
What we offer you
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for the New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client-serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being. Are you ready to shape your future with confidence? Apply today. EY accepts applications for this position on an on‑going basis. For those living in California, please click here for additional information. EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities. EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories. EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law. EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** . #J-18808-Ljbffr
    $112k-156k yearly est. 2d ago
  • Data Scientist

    Mindlance 4.6company rating

    Data engineer job in Franklin Lakes, NJ

Mindlance is a national recruiting company which partners with many of the leading employers across the country. Feel free to check us out at *************************
Job Description
* Bachelor's or Master's degree with specialization in Computer Science, Information Systems, Mathematics, Statistics or other quantitative disciplines
* Excellent ability to query large datasets using SQL and to work with databases
* Excellent in programming languages such as Java, Python, etc.
* Experience in data extraction and processing using MapReduce, Pig, Hive, etc.
* Proficient programming skills using SAS and/or R
* Knowledge of building and applying machine learning or predictive modeling
* Strong problem-solving skills
* Exceptional ability to communicate and present findings clearly to both technical and non-technical audiences
* Excellent interpersonal and collaboration skills
Additional Information
Thanks & Regards,
Praveen K. Paila
************
    $77k-107k yearly est. 60d+ ago

Learn more about data engineer jobs

How much does a data engineer earn in White Plains, NY?

The average data engineer in White Plains, NY earns between $79,000 and $141,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in White Plains, NY

$106,000

What are the biggest employers of Data Engineers in White Plains, NY?

The biggest employers of Data Engineers in White Plains, NY are:
  1. Interactive Brokers
  2. Idexcel
  3. Starwood Capital Group
  4. Clarapath
  5. Health Alliance
  6. IBM
  7. Kforce
  8. PepsiCo
  9. Consumer Reports