
Data engineer jobs in New Haven, CT - 1,464 jobs

  • Data Engineer - Data Integration

    IBM Corporation (4.7 company rating)

    Data engineer job in Armonk, NY

    Data Engineer - Data Integration, IBM Corporation, Armonk, NY and various unanticipated client sites throughout the US: Manage the end-to-end delivery of data migration projects, implementing ETL/ELT concepts and leveraging ETL tools such as Informatica and DataStage, and cloud platforms like Google Cloud. Design and build end-to-end data pipelines to extract, integrate, transform, and load data from diverse source systems into target environments such as databases, data warehouses, or data marts. Collaborate with clients to define data mapping and transformation rules, ensuring accurate application prior to loading. Normalize data and establish relational structures to support system migrations. Develop processes for data cleaning, filtering, aggregation, and augmentation to maintain data integrity. Implement validation checks and data quality controls to ensure accuracy and consistency across systems. Create, maintain, and optimize SQL procedures, functions, triggers, and ETL/ELT processes. Develop, debug, and maintain ETL jobs while applying query optimization techniques, such as indexing, clustering, partitioning, and use of analytical functions, to enhance performance on large datasets. Partner with data analysts, data scientists, and business stakeholders to understand requirements and ensure delivery of the right data. Capture fallouts and prepare reports using Excel, Power BI, Looker, Crystal Reports, etc. Perform root cause analysis and resolution. Monitor and maintain pipelines to ensure stability and efficiency through regular monitoring, troubleshooting, and performance optimization. Maintain thorough and up-to-date documentation of all data integration processes, pipelines, and architectures. Analyze current trends, tools, and technologies in data engineering and integration.

Utilize: Google Cloud Platform (Google BigQuery, Cloud Storage, Google Looker), Procedural Language/Structured Query Language (PL/SQL), Informatica, DataStage, Data Integration, Data Warehousing, Database Design/Modeling, Data Visualization (Power BI/Crystal Reports).

Required: Master's degree or equivalent in Computer Science or a related field (employer will accept a bachelor's degree plus five (5) years of progressive experience in lieu of a master's degree) and one (1) year of experience as a Data Engineer or in a related role. The one (1) year of experience must include utilizing Google Cloud Platform (Google BigQuery, Cloud Storage, Google Looker), PL/SQL, Informatica, DataStage, Data Integration, Data Warehousing, Database Design/Modeling, and Data Visualization (Power BI/Crystal Reports).

$167,835 to $216,700 per year. Please send resumes referencing D185 in the subject line. Keywords: Data Engineer. Location: North Castle, NY 10504
    $167.8k-216.7k yearly 1d ago
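
    The IBM listing above describes tuning ETL jobs with partitioning and analytical (window) functions on Google BigQuery. As a purely illustrative sketch, assuming hypothetical project, dataset, and table names plus application-default credentials, the following Python snippet runs that kind of query through the google-cloud-bigquery client:

```python
# Hedged sketch: hypothetical table `my_project.sales.orders`, assumed to be
# partitioned by DATE(order_ts). The WHERE clause prunes the scan to a single
# partition; SUM(...) OVER (...) is an analytical (window) function.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
SELECT
  customer_id,
  order_ts,
  amount,
  SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_ts) AS running_total
FROM `my_project.sales.orders`
WHERE DATE(order_ts) = '2024-01-15'   -- partition filter limits data scanned
"""

for row in client.query(sql).result():
    print(row.customer_id, row.order_ts, row.running_total)
```

    Filtering on the partition column keeps the scanned bytes (and cost) roughly proportional to one day of data rather than the full table.
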

  • GCP Data Engineer

    E-Solutions (4.5 company rating)

    Data engineer job in Hartford, CT

    Role: GCP Data Engineer. Must-have skills: 7+ years of experience with GCP, Python, PySpark, and SQL; GCP services including BigQuery, Dataproc, Pub/Sub, and Dataflow. The client will conduct a CoderPad interview, so strong Python coding is required.
    $110k-154k yearly est. 2d ago
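
    For the BigQuery/Dataproc/PySpark stack this posting lists, a minimal PySpark sketch might look like the following. The project, dataset, table, and staging-bucket names are placeholders, and it assumes the spark-bigquery connector is available (it ships on Dataproc images; elsewhere it must be added as a package):

```python
# Illustrative only: read a BigQuery table, aggregate it, write the result back.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcp-demo").getOrCreate()

events = (
    spark.read.format("bigquery")
    .option("table", "my_project.analytics.events")   # placeholder table
    .load()
)

daily = events.groupBy("event_date").agg(F.count("*").alias("event_count"))

(
    daily.write.format("bigquery")
    .option("table", "my_project.analytics.daily_counts")  # placeholder table
    .option("temporaryGcsBucket", "my-staging-bucket")      # staging bucket for indirect writes
    .mode("overwrite")
    .save()
)
```
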
  • Senior Data Architect - Power & Utilities AI Platforms

    Ernst & Young Oman (4.7 company rating)

    Data engineer job in Stamford, CT

    A leading global consulting firm is seeking a Senior Manager in Data Architecture for the Power & Utilities sector. This role requires at least 12 years of consulting experience and expertise in data architecture and engineering. The successful candidate will manage technology projects, lead teams, and develop innovative data solutions that drive significant business outcomes. Strong relationship management and communication skills are essential for engaging with clients and stakeholders. Join us to help shape a better working world.
    $112k-156k yearly est. 3d ago
  • Cloud Engineer

    Transcend Softech LLC

    Data engineer job in Hartford, CT

    GCP Engineer. Duration: Long-Term Contract. Unable to provide sponsorship for this role (visa-independent candidates needed). Required: At least 5 years of Information Technology experience. Proficiency in Python is highly desirable. Strong knowledge of ETL tools and processes. Expertise in GCP services (BigQuery, Cloud Storage, Dataflow, Dataproc, Cloud Composer). Preferred Qualifications: At least 5 years of experience with Google Cloud Platform (especially BigQuery & Dataflow). Experience with Python and Google Cloud SDK & API scripting. GCP certifications (e.g., Google Cloud Professional Data Engineer) are a plus.
    $70k-95k yearly est. 1d ago
  • Data Scientist, Analytics (Technical Leadership)

    Meta (4.8 company rating)

    Data engineer job in Hartford, CT

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

**Responsibilities (Data Scientist, Analytics - Technical Leadership):**
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams

**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g., Mathematics, Statistics, Operations Research), or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization, and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics, and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner

**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Master's or Ph.D. degree in a quantitative field
18. Bachelor's degree in an analytical or scientific field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics

**Public Compensation:** $210,000/year to $281,000/year + bonus + equity + benefits

**Industry:** Internet

**Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer.
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
    $210k-281k yearly 60d+ ago
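
    The Meta listing above calls for experience with experimentation and causal inference. As a tiny, self-contained illustration (with made-up counts, not figures from the posting), here is the kind of two-proportion z-test that backs a basic A/B readout in Python:

```python
# Hedged sketch: compare conversion rates in a control and a test group.
# The counts below are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

conversions = [530, 610]      # control, test
exposures = [10_000, 10_000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the lift is unlikely under the null of equal rates.
```
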
  • Data Scientist II

    Coinbase (4.2 company rating)

    Data engineer job in Hartford, CT

    Ready to be pushed beyond what you think you're capable of? At Coinbase, our mission is to increase economic freedom in the world. It's a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform - and with it, the future global financial system. To achieve our mission, we're seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company's hardest problems. Our ******************************** is intense and isn't for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there's no better place to be. While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment. Attendance is expected and fully supported. Data Science is an integral component of Coinbase's product and decision-making process: we work in partnership with Product, Engineering, and Design to influence the roadmap and better understand our users. With deep expertise in experimentation, analytics, and advanced modeling, we produce insights which directly move the company's bottom line.

*What you'll be doing* * Perform analyses on products to answer open-ended questions and provide strategic recommendations. * Design and guide experiments/analysis to measure impact and drive product improvements. * Develop and maintain key metrics and reports, enhancing data infrastructure for better analysis.

*What we look for in you:* * At least a BA/BS in a quantitative field (e.g., Math, Stats, Physics, or Computer Science) with 2+ years of relevant experience. * Experience driving impact for a digital product with an iterative development cycle. * Understanding of statistical concepts and practical experience applying them (in A/B testing, causal inference, ML, etc.). * Experience in data analyses using SQL. * Experience in programming/modeling in Python. * Demonstration of our core cultural values: clear communication, positive energy, continuous learning, and efficient execution.

Disclaimer: Applying for a specific role does not guarantee consideration for that exact position. Leveling and team matching are assessed throughout the interview process. ID: G2462 #LI-Remote

*Pay Transparency Notice:* Depending on your work location, the target annual salary for this position can range as detailed below. Full-time offers from Coinbase also include bonus eligibility + equity eligibility + benefits (including medical, dental, vision and 401(k)). Pay Range: $152,405-$179,300 USD. Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying. Commitment to Equal Opportunity: Coinbase is proud to be an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the *********************************************** in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations*********************************** *Global Data Privacy Notice for Job Candidates and Applicants* Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available ********************************************************** By submitting your application, you are agreeing to our use and processing of your data as required. *AI Disclosure* For select roles, Coinbase is piloting an AI tool based on machine learning technologies to conduct initial screening interviews to qualified applicants. The tool simulates realistic interview scenarios and engages in dynamic conversation. A human recruiter will review your interview responses, provided in the form of a voice recording and/or transcript, to assess them against the qualifications and characteristics outlined in the job description. For select roles, Coinbase is also piloting an AI interview intelligence platform to transcribe and summarize interview notes, allowing our interviewers to fully focus on you as the candidate. *The above pilots are for testing purposes and Coinbase will not use AI to make decisions impacting employment*. To request a reasonable accommodation due to disability, please contact accommodations[at]coinbase.com
    $152.4k-179.3k yearly 60d+ ago
  • Data Scientist - Analytics

    Boxncase

    Data engineer job in Commack, NY

    About the Role: We believe that the best decisions are backed by data. We are seeking a curious and analytical Data Scientist to champion our data-driven culture. In this role, you will act as a bridge between technical data and business strategy. You will mine massive datasets, build predictive models, and, most importantly, tell the story behind the numbers to help our leadership team make smarter choices. You are perfect for this role if you are as comfortable with SQL queries as you are with slide decks.

What You Will Do: Exploratory Analysis: Dive deep into raw data to discover trends, patterns, and anomalies that others miss. Predictive Modeling: Build and test statistical models (regression, time-series, clustering) to forecast business outcomes and customer behavior. Data Visualization: Create clear, impactful dashboards using Tableau, Power BI, or Python libraries (Matplotlib/Seaborn) to visualize success metrics. Experimentation: Design and analyze A/B tests to optimize product features and marketing campaigns. Data Cleaning: Work with Data Engineers to clean and structure messy data for analysis. Strategy: Present findings to stakeholders, translating complex math into clear, actionable business recommendations.

Requirements: Experience: 2+ years of experience in Data Science or Advanced Analytics. The Toolkit: Expert proficiency in Python or R for statistical analysis. Data Querying: Advanced SQL skills are non-negotiable (joins, window functions, CTEs). Math Mindset: Strong grasp of statistics (hypothesis testing, distributions, probability). Visualization: Ability to communicate data visually using Tableau, Power BI, or Looker. Communication: Excellent verbal and written skills; you can explain a p-value to a non-technical manager.

Preferred Tech Stack (Keywords): Languages: Python (Pandas, NumPy), R, SQL. Viz Tools: Tableau, Power BI, Looker, Plotly. Machine Learning: Scikit-learn, XGBoost (applied to business problems). Big Data: Spark, Hadoop, Snowflake.

Benefits: Salary Range: $50,000 - $180,000 USD / year (commensurate with location and experience). Remote Friendly: Work from where you are most productive. Learning Budget: Stipend for data courses (Coursera, DataCamp) and books.
    $50k-180k yearly 34d ago
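
    The Boxncase listing above treats joins, window functions, and CTEs as table stakes. A small self-contained sketch of those constructs, run from Python against an in-memory SQLite database with toy data (window functions need SQLite 3.25+), could look like this:

```python
# Illustrative only: a CTE plus a ROW_NUMBER() window function that returns
# the most recent order per customer from a toy table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-01', 40.0),
  ('alice', '2024-01-08', 25.0),
  ('bob',   '2024-01-03', 60.0);
""")

query = """
WITH ranked AS (                                   -- common table expression
  SELECT customer, order_date, amount,
         ROW_NUMBER() OVER (                       -- window function
           PARTITION BY customer ORDER BY order_date DESC
         ) AS rn
  FROM orders
)
SELECT customer, order_date, amount FROM ranked WHERE rn = 1;
"""

for row in conn.execute(query):
    print(row)   # latest order per customer
```
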
  • Senior Data Engineer

    Stratacuity

    Data engineer job in Bristol, CT

    Description/Comment: Disney Streaming is the leading premium streaming service offering live and on-demand TV and movies, with and without commercials, both in and outside the home. Operating at the intersection of entertainment and technology, Disney Streaming has a unique opportunity to be the number one choice for TV. We captivate and connect viewers with the stories they love, and we're looking for people who are passionate about redefining TV through innovation, unconventional thinking, and embracing fun. Join us and see what this is all about. The Product Performance Data Solutions team for the Data organization within Disney Streaming (DS), a segment under the Disney Media & Entertainment Distribution is in search of a Senior Data Engineer. As a member of the Product Performance team, you will work on building foundational datasets from clickstream and quality of service telemetry data - enabling dozens of engineering and analytical teams to unlock the power of data to drive key business decisions and provide engineering, analytics, and operational teams the critical information necessary to scale the largest streaming service. The Product Performance Data Solutions team is seeking to grow their team of world-class Data Engineers that share their charisma and enthusiasm for making a positive impact. Responsibilities: * Contribute to maintaining, updating, and expanding existing data pipelines in Python / Spark while maintaining strict uptime SLAs * Architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across all data pipelines * Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Scala, Python * Collaborate with product managers, architects, and other engineers to drive the success of the Product Performance Data and key business stakeholders * Contribute to developing and documenting both internal and external standards for pipeline configurations, naming conventions, partitioning strategies, and more * Ensure high operational efficiency and quality of datasets to ensure our solutions meet SLAs and project reliability and accuracy to all our partners (Engineering, Data Science, Operations, and Analytics teams) * Be an active participant and advocate of agile/scrum ceremonies to collaborate and improve processes for our team * Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements * Maintain detailed documentation of your work and changes to support data quality and data governance requirements Additional Information: NOTE: There will be no SPC for this role Interview process: 4 rounds (1 with HM, 2 tech rounds, and a final with Product) We need an expert in SQL, extensive experience with Scala, a proven self-starter (expected to discover the outcome, and then chase after it), not only able to speak technical but clearly articulate that info to the business as well. Preferred Qualifications: Candidates with Click stream, user browse data are highly preferred Apex Systems is a world-class IT services company that serves thousands of clients across the globe. When you join Apex, you become part of a team that values innovation, collaboration, and continuous learning. We offer quality career resources, training, certifications, development opportunities, and a comprehensive benefits package. 
Our commitment to excellence is reflected in many awards, including ClearlyRated's Best of Staffing in Talent Satisfaction in the United States and Great Place to Work in the United Kingdom and Mexico. Apex uses a virtual recruiter as part of the application process. Apex Benefits Overview: Apex offers a range of supplemental benefits, including medical, dental, vision, life, disability, and other insurance plans that offer an optional layer of financial protection. We offer an ESPP (employee stock purchase program) and a 401K program which allows you to contribute typically within 30 days of starting, with a company match after 12 months of tenure. Apex also offers a HSA (Health Savings Account on the HDHP plan), a SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions, a corporate discount savings program and other discounts. In terms of professional development, Apex hosts an on-demand training program, provides access to certification prep and a library of technical and leadership courses/books/seminars once you have 6+ months of tenure, and certification discounts and other perks to associations that include CompTIA and IIBA. Apex has a dedicated customer service team for our Consultants that can address questions around benefits and other resources, as well as a certified Career Coach. You can access a full list of our benefits, programs, support teams and resources within our 'Welcome Packet' as well, which an Apex team member can provide. Employee Type: Contract Location: Bristol, CT, US Job Type: Date Posted: January 8, 2026 Pay Range: $50 - $100 per hour
    $50-100 hourly 2d ago
  • Senior Data Engineer - Product Performance Data - 1573

    Akube

    Data engineer job in Bristol, CT

    City: Bristol, CT / NYC. Onsite/Hybrid/Remote: Hybrid (4 days a week onsite). Duration: 10 months. Rate Range: Up to $96/hr on W2 depending on experience (no C2C, 1099, or sub-contract). Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B. Must Have: Advanced SQL expertise. Strong Scala development experience. Python for data engineering. Apache Spark in production. Airflow for orchestration. Databricks platform experience. Cloud data storage experience (S3 or equivalent). Responsibilities: Build and maintain large-scale data pipelines with strict SLAs. Design shared libraries in Scala and Python to standardize data logic. Develop foundational datasets from clickstream and telemetry data. Ensure data quality, reliability, and operational efficiency. Partner with product, engineering, and analytics teams. Define and document data standards and best practices. Participate actively in Agile and Scrum ceremonies. Communicate technical outcomes clearly to business stakeholders. Maintain detailed technical and data governance documentation. Qualifications: 5+ years of data engineering experience. Strong problem-solving and algorithmic skills. Expert-level SQL with complex analytical queries. Hands-on experience with distributed systems at scale. Experience supporting production data platforms. Self-starter who can define outcomes and drive solutions. Ability to translate technical concepts for non-technical audiences. Bachelor's degree or equivalent experience.
    $96 hourly 3d ago
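
    Both this posting and the Disney Streaming role above center on Airflow-orchestrated Spark pipelines over clickstream data. A minimal, hypothetical Airflow DAG skeleton (Airflow 2.4+ argument names; the task bodies are placeholders, not the actual pipeline) might look like:

```python
# Illustrative skeleton only: a daily DAG with a build step and a validation step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_daily_clickstream(**context):
    # In a real pipeline this would submit a Spark/Databricks job.
    print("building clickstream aggregate for", context["ds"])


def validate_output(**context):
    # Placeholder for row-count / freshness checks against the target table.
    print("validating output for", context["ds"])


with DAG(
    dag_id="clickstream_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",        # 'schedule_interval' on Airflow versions before 2.4
    catchup=False,
) as dag:
    build = PythonOperator(task_id="build", python_callable=build_daily_clickstream)
    check = PythonOperator(task_id="validate", python_callable=validate_output)
    build >> check
```
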
  • Data Scientist, Privacy

    Datavant

    Data engineer job in Hartford, CT

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care. By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare. As part of the Privacy Science team within Privacy Hub you will play a crucial role in ensuring that privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets. **You Will:** + Critically analyze large health datasets using standard and bespoke software libraries + Discuss your findings and progress with internal and external stakeholders + Produce high quality reports which summarise your findings + Contribute to research activities as we explore novel and established sources of re-identification risk **What You Will Bring to the Table:** + Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports + A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods + Seeks to understand real-world data in context rather than consider it in abstraction. + Familiarity or proficiency with programmable data analysis software R or Python, and the desire to develop expertise in its language + Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions + Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines + Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base + An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation + Familiarity with Amazon Web Services cloud-based storage and computing facilities **Bonus Points If You Have:** + Experience creating documents using LATEX + Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images + Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued. \#LI-BC1 We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. 
We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $104,000-$130,000 USD To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship. Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay. At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way. Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis. 
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
    $104k-130k yearly 15d ago
  • AWS Data Migration Consultant

    Slalom (4.6 company rating)

    Data engineer job in Hartford, CT

    Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment. Who You'll Work With As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments. What You'll Do * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters). * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools. * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques. * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud. * Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS. * Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards. * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK. * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools. * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms. What You'll Bring * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2. * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2. * Hands-on experience with AWS database services (RDS, EC2-hosted databases). * Strong understanding of HA/DR solutions and cloud database design patterns. * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions. * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity. * Strong troubleshooting and analytical skills to resolve complex database and performance issues. * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders. Nice to Have * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional. 
* Experience with NoSQL databases or hybrid data architectures. * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau). * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate). * Experience with DB2 on-premise or cloud-hosted environments. About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay range in the following locations: Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, New Jersey, for Consultant level is $105,000-147,000 and for Senior Consultant level it is $120,000-$169,000 and for Principal level it is $133,000-$187,000. In all other markets, the target base salary pay range for Consultant level is $96,000-$135,000 and for Senior Consultant level it is $110,000-$155,000 and for Principal level it is $122,000-$172,000. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 4d ago
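
    The Slalom posting above leans on AWS Database Migration Service (DMS) for lift-and-shift work. As a hedged sketch, assuming a replication task already exists and using a placeholder ARN, the boto3 calls to start and monitor it look roughly like this:

```python
# Illustrative only: start an existing DMS replication task and poll its status.
import time

import boto3

dms = boto3.client("dms", region_name="us-east-1")
TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"  # placeholder ARN

dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",
)

while True:
    task = dms.describe_replication_tasks(
        Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
    )["ReplicationTasks"][0]
    print("status:", task["Status"])
    if task["Status"] in ("stopped", "failed"):
        break
    time.sleep(30)
```
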
  • Junior Data Scientist

    Bexorg

    Data engineer job in New Haven, CT

    About Us Bexorg is revolutionizing drug discovery by restoring molecular activity in postmortem human brains. Through our BrainEx platform, we directly experiment on functionally preserved human brain tissue, creating enormous high-fidelity molecular datasets that fuel AI-driven breakthroughs in treating CNS diseases. We are looking for a Junior Data Scientist to join our team and dive into this one-of-a-kind data. In this onsite role, you will work at the intersection of computational biology and machine learning, helping analyze high-dimensional brain data and uncover patterns that could lead to the next generation of CNS therapeutics. This is an ideal opportunity for a recent graduate or early-career scientist to grow in a fast-paced, mission-driven environment. The Job Data Analysis & Exploration: Work with large-scale molecular datasets from our BrainEx experiments - including transcriptomic, proteomic, and metabolic data. Clean, transform, and explore these high-dimensional datasets to understand their structure and identify initial insights or anomalies. Collaborative Research Support: Collaborate closely with our life sciences, computational biology and deep learning teams to support ongoing research. You will help biologists interpret data results and assist machine learning researchers in preparing data for modeling, ensuring that domain knowledge and data science intersect effectively. Machine Learning Model Execution: Run and tune machine learning and deep learning models on real-world central nervous system (CNS) data. You'll help set up experiments, execute training routines (for example, using scikit-learn or PyTorch models), and evaluate model performance to extract meaningful patterns that could inform drug discovery. Statistical Insight Generation: Apply statistical analysis and visualization techniques to derive actionable insights from complex data. Whether it's identifying gene expression patterns or correlating molecular changes with experimental conditions, you will contribute to turning data into scientific discoveries. Reporting & Communication: Document your analysis workflows and results in clear reports or dashboards. Present findings to the team, highlighting key insights and recommendations. You will play a key role in translating data into stories that drive decision-making in our R&D efforts. Qualifications and Skills: Strong Python Proficiency: Expert coding skills in Python and deep familiarity with the standard data science stack. You have hands-on experience with NumPy, pandas, and Matplotlib for data manipulation and visualization; scikit-learn for machine learning; and preferably PyTorch (or similar frameworks like TensorFlow) for deep learning tasks. Educational Background: A Bachelor's or Master's degree in Data Science, Computer Science, Computational Biology, Bioinformatics, Statistics, or a related field. Equivalent practical project experience or internships in data science will also be considered. Machine Learning Knowledge: Solid understanding of machine learning fundamentals and algorithms. Experience developing or applying models to real or simulated datasets (through coursework or projects) is expected. Familiarity with high-dimensional data techniques or bioinformatics methods is a plus. Analytical & Problem-Solving Skills: Comfortable with statistics and data analysis techniques for finding signals in noisy data. Able to break down complex problems, experiment with solutions, and clearly interpret the results. 
Team Player: Excellent communication and collaboration skills. Willingness to learn from senior scientists and ability to contribute effectively in a multidisciplinary team that includes biologists, data engineers, and AI researchers. Motivation and Curiosity: Highly motivated, with an evident passion for data-driven discovery. You are excited by Bexorg's mission and eager to take on challenging tasks - whether it's mastering a new analysis method or digging into scientific literature - to push our research forward. Local to New Haven, CT preferred. No relocation offered for this position. Bexorg is an equal opportunity employer. We strive to create a supportive and inclusive workplace where contributions are valued and celebrated, and our employees thrive by being themselves and are inspired to do their best work. We seek applicants of all backgrounds and identities, across race, color, ethnicity, national origin or ancestry, citizenship, religion, sex, sexual orientation, gender identity or expression, veteran status, marital status, pregnancy or parental status, or disability. Applicants will not be discriminated against based on these or other protected categories or social identities. Bexorg will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law.
    $75k-105k yearly est. Auto-Apply 60d+ ago
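
    The Bexorg listing above mentions running and tuning scikit-learn or PyTorch models on high-dimensional molecular data. Purely as an illustration on synthetic data (nothing here reflects BrainEx datasets), a minimal scikit-learn train-and-evaluate loop looks like:

```python
# Illustrative only: fit and score a classifier on synthetic high-dimensional data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=2000, n_informative=25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"held-out ROC AUC: {auc:.3f}")
```
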
  • Data Scientist

    Tsunami Tsolutions (4.0 company rating)

    Data engineer job in Glastonbury, CT

    Tsunami Tsolutions is seeking a motivated Data Scientist to join its Aviation Analytics department. This person will be responsible for developing and deploying solutions to various customers that utilize a wide range of analytics tools. They will work with team members and customers to identify new sources of value and take action to capture it. The selected individual will also work to ensure data quality and provide metrics as indicators of the current state. This role will also require the presentation of findings to customers and senior management. NOTE: This position requires access to technologies and hardware subject to U.S. national-security-based export control requirements. All applicants must be U.S. citizens (8 USC 1324b(a)(3)) or otherwise authorized by the U.S. Government. No company sponsorship is offered. Responsibilities: • Work with stakeholders to understand complex business processes and data streams. • Collaborate with stakeholders and whiteboard solutions. • Identify, collect, and clean data from multiple sources. • Validate, interpret, and provide business insights. • Research, build, implement, and evaluate various analytic techniques to select the best application. • Work with team members to deploy solutions across the organization. • Report periodic progress on projects, including tracking usage and value derived from the models. Position Requirements: • B.S. degree in computer science, data science, or engineering; advanced degree preferred. • 3-5 years of industry experience preferred. • Experience building dashboards and visualizations using Qlik, Power BI, Tableau, or similar. • Strong programming skills in languages used in data science, including Python, R, and SQL. • Advanced proficiency in Microsoft Excel, with competency in vlookups, pivot tables, etc. • Ability to learn new concepts quickly and translate them into practical applications. • Ability to effectively communicate findings to non-technical audiences. • Experience building and implementing machine learning models in Python or R a plus. Offer contingent upon successful completion of a background check and drug screen.
    $78k-114k yearly est. Auto-Apply 60d+ ago
  • IBM IIB, WMB, DataPower Consultant

    Sonsoft (3.7 company rating)

    Data engineer job in Hartford, CT

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy, and Information Technology Enabled Services. Job Description (Preferred): • At least 4 years of experience with IBM IIB, WMB, and DataPower • At least 4 years of experience in the software development life cycle • At least 4 years of experience in project life cycle activities on development and maintenance projects • At least 2 years of experience in design and architecture review • Ability to work in a team in diverse/multiple-stakeholder environments • Ability to work in a Scrum team in diverse/multiple-stakeholder environments • Interface analysis, technical leadership, activities coordination, etc. • Perform reviews • Interactions with application teams, the GI Team, and other stakeholders relevant to the technology • Experience in the Automation domain • Analytical skills • Experience and desire to work in a global delivery environment. Qualifications (Basic): • Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education. • At least 4 years of experience with Information Technology. Additional Information: U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time. Note: This is a Full-Time Permanent job opportunity. Only US Citizens, Green Card Holders, and GC-EAD, H4-EAD & L2-EAD holders can apply. No OPT-EAD, TN Visa & H1B consultants, please. Please mention your visa status in your email or resume.
    $83k-112k yearly est. 60d+ ago
  • Data Scientist, Media

    Digital United

    Data engineer job in Farmington, CT

    Accepting applicants in CT, FL, MN, NJ, NC, OH, TX. Mediate.ly is seeking a hands-on Data Scientist to elevate media performance analysis, predictive modeling, and channel optimization. In this role, you'll leverage advanced machine learning techniques and generative AI tools to uncover actionable insights, automate reporting, and enhance campaign effectiveness across digital channels. You'll manage and evolve our existing performance dashboard (with a small external team), own the feature roadmap, and collaborate closely with Primacy on SEO/CRO data integration. A key part of the role involves supporting Account teams with clear, insight-rich reporting powered by enhanced data storytelling and visualization. This role was meant for you if you are passionate about, and skilled in, transforming complex datasets into clear, compelling insights. Measures: AI-Enhanced Reporting & Insight Automation; Business & Media Impact; Reporting Standardization and Quality; Dashboard & Data Product Ownership. Reports to: President. RESPONSIBILITIES: Media & Channel Analytics: Analyze paid media across Google Ads, Meta, LinkedIn, Programmatic, YouTube; translate results into clear recommendations. Build/maintain attribution approaches (last-click, MTA, assisted) and funnel diagnostics. Integrate CRM/GA4/platform data to surface actionable trends by geo, audience, and creative. Predictive Modeling & Experimentation: Develop forecasting and propensity models to guide budget allocation and channel mix. Run simulations (CPM/CPC/conversion-rate scenarios) and design A/B and lift tests. Partner with SEO/CRO to connect acquisition with on-site conversion improvements. Dashboard Ownership (Existing Platform): Manage the dashboard development team (backlog, priorities, sprints) and collaborate on new features that improve usability and insight depth. Gather stakeholder requirements (Accounts, Media, Leadership) and maintain a transparent roadmap. Ensure data reliability (ETL QA, schema governance, tagging/UTM standards). Reporting & Client Enablement: Support Account teams with data-backed, insight-driven reporting (monthly/quarterly reviews, executive summaries, narrative analyses). Build repeatable report templates; automate where possible while preserving clear storytelling. AI & Product Ideation: Explore LLM/ML use cases (persona signals, creative scoring, conversion prediction). Prototype lightweight tools for planners/buyers (e.g., channel recommender, influence maps). What it takes to succeed in this role - QUALIFICATIONS: 5-7 years in data science/marketing analytics/digital media performance. Proficient in Python or R; strong SQL; experience with GA4/BigQuery and media platform exports. Comfort with BI tools (Looker Studio, Tableau, Power BI), dashboard product management, and data visualization. Familiarity with generative AI tools (e.g., OpenAI, Hugging Face, or Google Vertex AI) for automating insights, reporting, or content analysis. Comfortable in a fast-paced environment with competing priorities. Experience applying machine learning models to media mix modeling, customer segmentation, or predictive performance forecasting. Strong understanding of marketing attribution models and how to evaluate cross-channel performance using statistical techniques. Excellent communicator who can turn data into decisions for non-technical stakeholders. Experience with paid media is a plus!
Key Competencies Data Visualization & Storytelling - Skilled in transforming complex datasets into clear, compelling insights using tools like Tableau, Power BI, or Python libraries. AI & Machine Learning Expertise - Proficient in applying supervised and unsupervised learning techniques to optimize media performance and audience targeting. Media Analytics & Attribution - Deep understanding of digital media metrics, multi-touch attribution models, and cross-channel performance analysis. Dashboard Development & Management - Experience managing analytics dashboards, defining feature roadmaps, and collaborating with developers for scalable solutions. SEO/CRO Data Integration - Ability to synthesize SEO and conversion rate optimization data to inform predictive models and campaign strategies. Stakeholder Communication - Strong ability to translate data into actionable insights for Account teams and clients, supporting strategic decision-making. Automation & Efficiency - Familiarity with AI tools to streamline reporting, anomaly detection, and campaign optimization workflows. Statistical Analysis & Experimentation - Proficient in A/B testing, regression analysis, and causal inference to validate media strategies. The Perks: The best co-workers you'll ever find Unlimited PTO Medical, Dental, Vision, 401k plus match Annual performance bonus eligibility Ongoing training opportunities Planned outings and team events (remote workers included!) PHYSICAL DEMANDS AND WORK ENVIRONMENT: Prolonged periods of sitting at a desk and working on a computer. Occasional standing, walking, or lifting of office supplies (up to 10-20 lbs.) Frequent communication via phone, email, and video conferencing. Work is performed in a temperature-controlled office environment with standard lighting and noise levels. Position may require occasional travel to client site Compensation Range: We offer a competitive salary based on experience and qualifications. The compensation range for this position is $90,000 to $100,000 annually, with potential for bonuses, stock and additional benefits. EEO & Accessibility Statement Primacy is an Equal Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. If you require reasonable accommodation during the application or interview process, please contact [email protected]
    $90k-100k yearly Auto-Apply 60d+ ago
  • Staff Data Scientist - Underwriting and Operations Analytics

    Travelers Insurance Company (4.4 company rating)

    Data engineer job in Hartford, CT

    **Who Are We?**
    Taking care of our customers, our communities and each other. That's the Travelers Promise. By honoring this commitment, we have maintained our reputation as one of the best property casualty insurers in the industry for over 170 years. Join us to discover a culture that is rooted in innovation and thrives on collaboration. Imagine loving what you do and where you do it.

    **Job Category** Data Science

    **Compensation Overview**
    The annual base salary range provided for this position is a nationwide market range and represents a broad range of salaries for this role across the country. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. As part of our comprehensive compensation and benefits program, employees are also eligible for performance-based cash incentive awards.

    **Salary Range** $161,400.00 - $266,300.00

    **Target Openings** 1

    **What Is the Opportunity?**
    As a Staff Data Scientist, you will build complex models that solve key business problems to support underwriting, risk control, and business operations. This may include the use of the most advanced technical tools in the data science practice, allowing you to develop sophisticated solutions that enhance risk segmentation, streamline decision-making processes, and drive operational excellence across these critical business functions.

    **What Will You Do?**
    + Lead business or technical projects focused on the design or development of analytical solutions.
    + Lead development of community best practices in AI/Machine Learning, statistical techniques, and coding.
    + Establish a practice/process of sharing expertise with the community through discussions, presentations, or peer reviews.
    + Begin to challenge conventional thinking where appropriate.
    + Anticipate potential objections and persuade peers, technical and business leaders to adopt a different point of view.
    + Guide the technical strategy of teams through your own technical expertise.
    + Set and manage expectations with business partners for multiple projects, generate ideas and build consensus, and be aware of potential conflicts.
    + Communicate analysis, insights, and results to the team, peers, and business partners.
    + Partner with cross-functional teams and leaders to support the successful execution of data science strategies.
    + Be a mentor or resource for less experienced analytic talent, onboard new employees and interns, and provide support for recruiting and talent assessment efforts.
    + Collaborate with the Sr Staff Data Scientist on various training and skill development initiatives, including delivering training to the analytics community.
    + Perform other duties as assigned.

    **What Will Our Ideal Candidate Have?**
    + Subject matter expertise in modeling, research, analytics, or actuarial work (required)
    + Subject matter expertise in value creation and business model concepts
    + Subject matter expertise in multiple statistical software programs
    + Ability to develop highly complex models, interpret model results, and recommend adjustments
    + Expertise in the advanced statistics underlying data models
    + Ability to apply emerging statistical procedures to complex work
    + Subject matter expertise in 3-5 of the following: Regression, Classification, Machine Vision, Natural Language Processing, Deep Learning, and Statistical Modeling.

    **What Is a Must Have?**
    + Master's degree in Statistics, Mathematics, Decision Sciences, Actuarial Science or a related analytical STEM field plus five years of experience, or any suitable and equivalent combination of education and work experience.
    + Heavy concentration in mathematics, including statistics and programming, business intelligence/analytics, as well as data science tools and research using large data sets. Additional verification of specific coursework will be required.

    **What Is in It for You?**
    + **Health Insurance:** Employees and their eligible family members - including spouses, domestic partners, and children - are eligible for coverage from the first day of employment.
    + **Retirement:** Travelers matches your 401(k) contributions dollar-for-dollar up to your first 5% of eligible pay, subject to an annual maximum. If you have student loan debt, you can enroll in the Paying it Forward Savings Program. When you make a payment toward your student loan, Travelers will make an annual contribution into your 401(k) account. You are also eligible for a Pension Plan that is 100% funded by Travelers.
    + **Paid Time Off:** Start your career at Travelers with a minimum of 20 days Paid Time Off annually, plus nine paid company holidays.
    + **Wellness Program:** The Travelers wellness program comprises tools, discounts and resources that empower you to achieve your wellness goals and caregiving needs. In addition, our mental health program provides access to free professional counseling services, health coaching and other resources to support your daily life needs.
    + **Volunteer Encouragement:** We have a deep commitment to the communities we serve and encourage our employees to get involved. Travelers has a Matching Gift and Volunteer Rewards program that enables you to give back to the charity of your choice.

    **Employment Practices**
    Travelers is an equal opportunity employer. We value the unique abilities and talents each individual brings to our organization and recognize that we benefit in numerous ways from our differences. In accordance with local law, candidates seeking employment in Colorado are not required to disclose dates of attendance at or graduation from educational institutions. If you are a candidate and have specific questions regarding the physical requirements of this role, please send us an email (*******************) so we may assist you. Travelers reserves the right to fill this position at a level above or below the level included in this posting. To learn more about our comprehensive benefit programs please visit ******************************************************** .
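As an illustration only of the kind of risk-segmentation classification work this posting describes, here is a minimal sketch that trains a gradient-boosted classifier on synthetic policy-like features. Every feature, coefficient, and the model choice are assumptions made for the example; nothing here reflects Travelers' actual data or methods.

```python
# Illustrative sketch: a classification model for risk segmentation on
# synthetic data. Features and target are invented for the example.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.normal(45, 12, n),      # hypothetical insured age
    rng.gamma(2.0, 1_500, n),   # hypothetical prior claim cost
    rng.integers(0, 2, n),      # hypothetical prior-loss indicator
])
# Synthetic "true" risk relationship used only to generate labels.
logit = -3.0 + 0.02 * X[:, 0] + 0.0003 * X[:, 1] + 1.2 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("holdout AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```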
    $77k-103k yearly est. 60d+ ago
  • Data Engineer w/ AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation

    Intermedia Group

    Data engineer job in Ridgefield, CT

    OPEN JOB: Data Engineer w/ AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation

    **HYBRID** - This candidate will work on site 2-3X per week at the Ridgefield, CT location
    SALARY: $140,000 to $185,000
    2 Openings
    NOTE: CANDIDATE MUST BE A US CITIZEN OR GREEN CARD HOLDER

    We are seeking a highly skilled and experienced Data Engineer to design, build, and maintain our scalable and robust data infrastructure on a cloud platform. In this pivotal role, you will be instrumental in enhancing our data infrastructure, optimizing data flow, and ensuring data availability. You will be responsible for both the hands-on implementation of data pipelines and the strategic design of our overall data architecture. We are seeking a candidate with hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation; proficiency in Python and SQL; and DevOps/CI/CD experience.

    Duties & Responsibilities
    + Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics (a pipeline sketch follows this listing).
    + Collaborate with data architects, modelers and IT team members to help define and evolve the overall cloud-based data architecture strategy, including data warehousing, data lakes, streaming analytics, and data governance frameworks.
    + Collaborate with data scientists, analysts, and other business stakeholders to understand data requirements and deliver solutions.
    + Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift), ensuring data quality, integrity, security, and accessibility.
    + Implement data quality and validation processes to ensure data accuracy and reliability.
    + Develop and maintain documentation for data processes, architecture, and workflows.
    + Monitor and troubleshoot data pipeline performance and resolve issues promptly.
    + Consulting and Analysis: Meet regularly with defined clients and stakeholders to understand and analyze their processes and needs. Determine requirements to present possible solutions or improvements.
    + Technology Evaluation: Stay updated with the latest industry trends and technologies to continuously improve data engineering practices.

    Requirements
    + Cloud Expertise: Expert-level proficiency in at least one major cloud platform (AWS, Azure, or GCP) with extensive experience in its data services (e.g., AWS S3, Glue, Lambda, Redshift, Kinesis; Azure Data Lake, Data Factory, Synapse, Event Hubs; GCP BigQuery, Dataflow, Pub/Sub, Cloud Storage); experience with the AWS data cloud platform preferred.
    + SQL Mastery: Advanced SQL writing and optimization skills.
    + Data Warehousing: Deep understanding of data warehousing concepts, Kimball methodology, and various data modeling techniques (dimensional, star/snowflake schemas).
    + Big Data Technologies: Experience with big data processing frameworks (e.g., Spark, Hadoop, Flink) is a plus.
    + Database Systems: Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
    + DevOps/CI/CD: Familiarity with DevOps principles and CI/CD pipelines for data solutions.
    + Hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation.
    + Proficiency in Python and SQL.

    Desired Skills, Experience and Abilities
    + 4+ years of progressive experience in data engineering, with a significant portion dedicated to cloud-based data platforms.
    + ETL/ELT Tools: Hands-on experience with ETL/ELT tools and orchestrators (e.g., Apache Airflow, Azure Data Factory, AWS Glue, dbt).
    + Data Governance: Understanding of data governance, data quality, and metadata management principles.
    + AWS Experience: Ability to evaluate AWS cloud applications and make architecture recommendations; AWS Solutions Architect certification (Associate or Professional) is a plus.
    + Familiarity with Snowflake.
    + Knowledge of dbt (data build tool).
    + Strong problem-solving skills, especially in data pipeline troubleshooting and optimization.

    If you are interested in pursuing this opportunity, please respond and include the following: your full, current resume; required compensation; contact information; and availability. Upon receipt, one of our managers will contact you to discuss in full.

    STEPHEN FLEISCHNER
    Recruiting Manager
    INTERMEDIA GROUP, INC.
    EMAIL: *******************************
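To make the AWS Glue requirement above concrete, here is a minimal Glue PySpark job sketch: read a cataloged table, remap a few columns, and write Parquet to S3. The database, table, column, and bucket names are placeholders invented for illustration, not anything from the posting.

```python
# Minimal AWS Glue ETL job sketch (runs inside a Glue job, not locally).
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Keep and retype a subset of columns.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_ts", "string", "order_ts", "timestamp"),
        ("amount", "double", "amount", "double"),
    ],
)

# Write curated Parquet back to S3 (hypothetical bucket/prefix).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()
```

In practice a job like this would be scheduled by Step Functions or Airflow and queried downstream with Athena, as the listing describes.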
    $140k-185k yearly Easy Apply 60d+ ago
  • C++ Market Data Engineer (USA)

    Trexquant Investment 4.0 company rating

    Data engineer job in Stamford, CT

    Trexquant is a growing systematic fund at the forefront of quantitative finance, with a core team of highly accomplished researchers and engineers. To keep pace with our expanding global trading operations, we are seeking a C++ Market Data Engineer to design and build ultra-low-latency feed handlers for premier vendor feeds and major exchange multicast feeds. This is a high-impact role that sits at the heart of Trexquant's trading platform; the quality, speed, and reliability of your code directly influence every strategy we run.

    Responsibilities
    + Design & implement high-performance feed handlers in modern C++ for equities, futures, and options across global venues (e.g., NYSE, CME, Refinitiv RTS, Bloomberg B-PIPE).
    + Optimize for micro- and nanosecond latency using lock-free data structures, cache-friendly memory layouts, and kernel-bypass networking where appropriate.
    + Build reusable libraries for message decoding, normalization, and publication to internal buses shared by research, simulation, and live trading systems.
    + Collaborate with cross-functional teams to tune TCP/UDP multicast stacks, kernel parameters, and NIC settings for deterministic performance.
    + Provide robust failover, gap-recovery, and replay mechanisms to guarantee data integrity under packet loss or venue outages.
    + Instrument code paths with precision timestamping and performance metrics; drive continuous latency regression testing and capacity planning.
    + Partner closely with quantitative researchers to understand downstream data requirements and to fine-tune delivery formats for both simulation and live trading.
    + Produce clear architecture documents, operational run-books, and post-mortems; participate in a 24×7 follow-the-sun support rotation for mission-critical market-data services.

    Requirements
    + BS/MS/PhD in Computer Science, Electrical Engineering, or a related field.
    + 3+ years of professional C++ (14/17/20) development experience focused on low-latency, high-throughput systems.
    + Proven track record building or maintaining real-time market-data feeds (e.g., Refinitiv RTS/TREP, Bloomberg B-PIPE, OPRA, CME MDP, ITCH).
    + Strong grasp of concurrency, lock-free algorithms, memory-model semantics, and compiler optimizations.
    + Familiarity with serialization formats (FAST, SBE, Protocol Buffers) and time-series databases or in-memory caches.
    + Comfort with scripting in Python for prototyping, testing, and ops automation.
    + Excellent problem-solving skills, ownership mindset, and ability to thrive in a fast-paced trading environment.
    + Familiarity with containerization (Docker/K8s) and public-cloud networking (AWS, GCP).

    Benefits
    + Competitive salary, plus bonus based on individual and company performance.
    + Collaborative, casual, and friendly work environment while solving the hardest problems in the financial markets.
    + PPO health, dental and vision insurance premiums fully covered for you and your dependents.
    + Pre-tax commuter benefits.

    Applications are now open for our NYC office, opening in September 2026. The base salary range is $175,000 - $200,000 depending on the candidate's educational and professional background. Base salary is one component of Trexquant's total compensation, which may also include a discretionary, performance-based bonus. Trexquant is an Equal Opportunity Employer.
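The production feed handlers this role describes would be written in C++; purely to illustrate the message-decoding side of the work (and the Python "prototyping and testing" the requirements mention), here is a small Python sketch that unpacks a fixed-layout binary trade message. The wire layout is invented for the example and is not any real exchange or vendor protocol.

```python
# Prototype decoder for a hypothetical fixed-layout binary trade message.
import struct

# <  little-endian | Q seq_no | Q timestamp_ns | 8s symbol | q price_nanos | I size
TRADE_FMT = struct.Struct("<QQ8sqI")

def decode_trade(payload: bytes) -> dict:
    seq_no, ts_ns, symbol, price_nanos, size = TRADE_FMT.unpack_from(payload)
    return {
        "seq_no": seq_no,
        "ts_ns": ts_ns,
        "symbol": symbol.rstrip(b"\x00").decode(),
        "price": price_nanos / 1e9,   # fixed-point -> float, for inspection only
        "size": size,
    }

# Round-trip a sample message to sanity-check the layout.
sample = TRADE_FMT.pack(42, 1_700_000_000_000_000_000,
                        b"AAPL\x00\x00\x00\x00", 189_250_000_000, 100)
print(decode_trade(sample))
```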
    $175k-200k yearly Auto-Apply 60d+ ago
  • Data Engineer

    Innovative Rocket Technologies Inc. 4.3 company rating

    Data engineer job in Hauppauge, NY

    Job Description
    Data is pivotal to our goal of frequent launch and rapid iteration. We're recruiting a Data Engineer at iRocket to build pipelines, analytics, and tools that support propulsion test, launch operations, manufacturing, and vehicle performance.

    The Role
    + Design and build data pipelines for test stands, manufacturing machines, launch telemetry, and operations systems.
    + Develop dashboards, real-time monitoring, data-driven anomaly detection, performance trending, and predictive maintenance tools (see the sketch after this listing).
    + Work with engineers across propulsion, manufacturing, and operations to translate data needs into data products.
    + Maintain data architecture, ETL processes, cloud/edge data systems, and analytics tooling.
    + Support A/B testing and performance metrics, and feed insights back into design/manufacturing cycles.

    Requirements
    + Bachelor's degree in Computer Science, Data Engineering, or a related technical field.
    + 2+ years of experience building data pipelines, ETL/ELT workflows, and analytics systems.
    + Proficient in Python, SQL, cloud data platforms (AWS, GCP, Azure), streaming/real-time analytics, and dashboarding (e.g., Tableau, Power BI).
    + Strong ability to work cross-functionally and deliver data products to engineering and operations teams.
    + Strong communication, documentation, and a curiosity-driven mindset.

    Benefits
    + Health Care Plan (Medical, Dental & Vision)
    + Retirement Plan (401k, IRA)
    + Life Insurance (Basic, Voluntary & AD&D)
    + Paid Time Off (Vacation, Sick & Public Holidays)
    + Family Leave (Maternity, Paternity)
    + Short Term & Long Term Disability
    + Wellness Resources
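As a rough illustration of the "data-driven anomaly detection" this role mentions, the sketch below flags telemetry samples whose rolling z-score exceeds a threshold. The CSV file, column names, window size, and threshold are all placeholders, not iRocket's actual schema or method.

```python
# Minimal rolling z-score anomaly flagging on test-stand telemetry.
import pandas as pd

def flag_anomalies(df: pd.DataFrame, col: str = "chamber_pressure_kpa",
                   window: int = 200, threshold: float = 4.0) -> pd.DataFrame:
    rolling = df[col].rolling(window, min_periods=window)
    zscore = (df[col] - rolling.mean()) / rolling.std()
    out = df.copy()
    out["zscore"] = zscore
    out["anomaly"] = zscore.abs() > threshold
    return out

if __name__ == "__main__":
    # Hypothetical hot-fire telemetry export with a timestamp column.
    telemetry = pd.read_csv("hotfire_42.csv", parse_dates=["timestamp"])
    flagged = flag_anomalies(telemetry)
    print(flagged.loc[flagged["anomaly"],
                      ["timestamp", "chamber_pressure_kpa", "zscore"]])
```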
    $102k-146k yearly est. 3d ago
  • Sr Data Engineer (MFT - IBM Sterling)

    The Hertz Corporation 4.3 company rating

    Data engineer job in Hartford, CT

    **A Day in the Life:**
    The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment. The ideal candidate will have a passion for technology and the ability to create change and facilitate this transformation. They will have experience tailoring software design, developing, and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met. We expect the starting salary to be around $135k, commensurate with experience.

    **What You'll Do:**

    TECHNICAL LEADERSHIP
    + Communicate with internal and external business users on Sterling Integrator mappings.
    + Make changes to existing partner integrations to meet internal and external requirements.
    + Design, develop and implement solutions based on standards and processes that establish consistency across enterprise data, reduce risks, and promote efficiencies in support of organizational goals and objectives.
    + Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
    + Facilitate the review and vetting of these designs with the architecture governance bodies, as required.
    + Be aware of all aspects of security related to the Sterling environment and integrations.

    INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
    + Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.

    TEAMWORK & COMMUNICATION
    + Superior and demonstrated team building and development skills to harness powerful teams.
    + Ability to communicate effectively with different levels of leadership within the organization.
    + Provide timely updates so that progress against each individual incident can be tracked as required.
    + Write and review high-quality technical documentation.

    CONTROL & AUDIT
    + Ensure your workstation and all processes and procedures follow organization standards.

    CONTINUOUS IMPROVEMENT
    + Encourage and maintain a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.

    **What We're Looking For:**
    + Bachelor's degree in Engineering, Statistics, Computer Science or another quantitative field, required
    + 5+ years of IT experience
    + 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
    + 3+ years' experience with scripting to enable automation of standard activities (examples: Ansible, Python, Bash, Java); a small illustrative script follows this listing
    + Strong interpersonal and communication skills with Agile/Scrum experience
    + Strong problem-solving and critical-thinking skills with a proven record of identifying and diagnosing problems and solving complex problems with simple, logical solutions, plus the ability to develop custom setups
    + Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels
    + Travel, transportation, or hospitality experience preferred
    + Experience designing application data models for mobile or web applications preferred
    + Excellent written and verbal communication skills
    + Flexibility in scheduling, which may include nights, weekends, and holidays

    **What You'll Get:**
    + Up to 40% off the base rate of any standard Hertz rental
    + Paid Time Off
    + Medical, Dental & Vision plan options
    + Retirement programs, including 401(k) employer matching
    + Paid Parental Leave & Adoption Assistance
    + Employee Assistance Program for employees & family
    + Educational Reimbursement & Discounts
    + Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
    + Perks & Discounts - Theme Park Tickets, Gym Discounts & more

    The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, the Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general-use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.

    **US EEO STATEMENT**
    At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran
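To illustrate the kind of scripted automation the requirements mention, here is a small stand-alone Python sketch that checks an MFT landing directory for expected partner files and reports any that are missing or stale. The paths, file names, and SLA are invented for the example; this is a generic watcher, not the Sterling B2B Integrator or Control Center API.

```python
# Generic MFT landing-zone freshness check (illustrative only).
import time
from pathlib import Path

LANDING_DIR = Path("/data/mft/inbound")          # hypothetical landing zone
EXPECTED = {"partner_a_orders.csv", "partner_b_inventory.csv"}
MAX_AGE_SECONDS = 6 * 3600                       # hypothetical 6-hour SLA

def check_inbound() -> list[str]:
    problems = []
    now = time.time()
    present = {p.name: p for p in LANDING_DIR.glob("*.csv")}
    for name in sorted(EXPECTED):
        path = present.get(name)
        if path is None:
            problems.append(f"MISSING: {name}")
        else:
            age_h = (now - path.stat().st_mtime) / 3600
            if age_h * 3600 > MAX_AGE_SECONDS:
                problems.append(f"STALE: {name} ({age_h:.1f}h old)")
    return problems

if __name__ == "__main__":
    for line in check_inbound() or ["OK: all expected files present and fresh"]:
        print(line)
```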
    $135k yearly 60d+ ago

Learn more about data engineer jobs

How much does a data engineer earn in New Haven, CT?

The average data engineer in New Haven, CT earns between $73,000 and $131,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in New Haven, CT

$98,000

What are the biggest employers of Data Engineers in New Haven, CT?

The biggest employers of Data Engineers in New Haven, CT are:
  1. Waters
  2. Rockridge Resources
  3. Bexorg