
Data engineer jobs in Meriden, CT - 1,139 jobs

  • Data & Analytics Architect

    Harnham

    Data engineer job in Hartford, CT

    Data & Analytics Solutions Architect | 2x days/week onsite | Up to $180,000 + 10% bonus

    **This client is not able to sponsor or transfer visas at this time.**

    A high-performing, financially strong U.S. insurance group is investing heavily in its Data & Analytics platform as it enters a major growth phase, and is hiring a hands-on Data & Analytics Solutions Architect to help shape what comes next. This is not an Enterprise Architect role. It's a deeply technical, delivery-oriented architecture position sitting within a central Data & Analytics function, working directly with engineering teams and business partners to design, evolve, and execute modern data platforms at scale.

    🚀 Why this role stands out

    - A $5.5B revenue business with an ambition to reach $10B in the next few years
    - Expanding rapidly across the U.S., with growth driven by technology, automation, and data, not headcount alone
    - Moving decisively toward a Databricks-centric Lakehouse architecture (Azure-based)
    - Production ML and early LLM use cases already live, not just experimentation
    - Architects here own real problems end-to-end, with visibility across the full data lifecycle

    You'll join a small, senior architecture group supporting multiple business domains (claims, underwriting, operations, analytics), helping teams move faster while setting the right technical guardrails.

    🧠 What you'll be doing

    - Designing and evolving modern Data & Analytics architectures (batch, real-time, cloud & hybrid)
    - Partnering with delivery teams and business stakeholders to translate needs into scalable solutions
    - Defining standards, patterns, and frameworks across Databricks, Azure, Power BI, ML platforms, and ingestion tools
    - Supporting execution: reviewing designs, advising engineers, solving complex technical problems
    - Evaluating new tools and capabilities, running POCs, and influencing platform roadmaps
    - Acting as a technical mentor across data engineering, analytics, and ML teams

    This is an individual contributor role (manager-grade), ideal for someone who wants architectural ownership without people management.

    ✅ What they're looking for

    - ~7+ years in Data & Analytics architecture or senior data engineering with strong architectural ownership
    - Strong Databricks experience (Lakehouse / medallion architecture is essential)
    - Solid Azure knowledge (data, networking concepts, security patterns)
    - Experience designing data pipelines, analytics platforms, and ML-adjacent architectures
    - Comfortable working across teams, articulating technical decisions, and challenging designs constructively
    - Insurance experience is helpful but not required; technical depth matters most

    Nice to have: Power BI, Azure ML, Informatica/IICS, real-time ingestion, governance & data quality frameworks.

    🌱 Culture & growth

    - High autonomy, low politics, strong internal mobility
    - Long tenure and real investment in upskilling
    - Architects here influence strategy and delivery; no ivory towers
    $86k-119k yearly est. 2d ago
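The listing above names the Databricks Lakehouse "medallion" architecture as an essential skill. For readers unfamiliar with the term, the bronze → silver → gold layering can be sketched in plain Python (all field names and figures below are made up for illustration; a real implementation would use Spark jobs writing Delta tables rather than in-memory lists):

```python
# Illustrative medallion flow: bronze (raw) -> silver (cleansed) -> gold (aggregated).
# All record shapes here are hypothetical examples, not the client's actual schema.

raw_events = [  # "bronze" layer: raw ingested records, possibly malformed
    {"policy_id": "P1", "claim_amount": "1200.50", "state": "CT"},
    {"policy_id": "P2", "claim_amount": "bad-value", "state": "NY"},
    {"policy_id": "P1", "claim_amount": "300.00", "state": "CT"},
]

def to_silver(rows):
    """Cleanse and type bronze records; drop rows that fail validation."""
    silver = []
    for r in rows:
        try:
            silver.append({**r, "claim_amount": float(r["claim_amount"])})
        except ValueError:
            continue  # a real pipeline would quarantine bad rows, not drop them
    return silver

def to_gold(rows):
    """Aggregate silver records into a business-level summary per state."""
    totals = {}
    for r in rows:
        totals[r["state"]] = totals.get(r["state"], 0.0) + r["claim_amount"]
    return totals

gold = to_gold(to_silver(raw_events))
print(gold)  # {'CT': 1500.5}
```

In Databricks, each stage would typically be a separate job or notebook writing to its own Delta table, with the gold layer feeding BI tools such as Power BI.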

  • Sr. Principal Software Developer - Cloud Storage

    Oracle (4.6 company rating)

    Data engineer job in Hartford, CT

    At Oracle Cloud Infrastructure (OCI), we build the future of the cloud for enterprises as a diverse team of fellow creators and inventors. Oracle's Cloud Infrastructure team is building Infrastructure-as-a-Service technologies that operate at high scale in a broadly distributed multi-tenant cloud environment. We act with the speed and attitude of a start-up, with the scale and customer focus of the leading enterprise software company in the world. Are you interested in working on Paxos, IO path availability and performance, replication, and data and metadata consistency checking? The data plane team is building a new storage layer supporting a low-latency, high-throughput storage system in 100+ regions. Our customers run their businesses on our cloud, and our mission is to provide them with industry-leading compute, storage, networking, database, security, and an ever-expanding set of foundational cloud-based services. As part of this effort, the Object Storage Service team is looking for hands-on engineers with expertise and passion in solving difficult problems in distributed systems, large-scale storage, and scaling services to meet future growth. If this is you, you can be part of the team that drives the best-in-class Object Storage Service into the next phase of its development. These are exciting times for the service: we are growing fast and delivering innovative, enterprise-class features to satisfy customer workloads. As a technical leader, you will own the software design and development for major components and features of the Object Storage Service. You should be able to design complex systems, and be both a strong programmer and a distributed systems generalist, able to dive deep into any part of the stack and low-level systems, as well as design broad distributed system interactions. You should value simplicity and scale, work comfortably in a collaborative, agile environment, and be excited to learn.

    **This position is located in Seattle or Santa Clara. No remote. Relocation assistance is available.**

    **Responsibilities**

    As a Consulting Member of Technical Staff, you will be called upon to lead major projects and have significant participation in design and architecture. You will be expected to act as a technical leader on your team and demonstrate core values for other, more junior engineers. You should be both a rock-solid coder and a distributed systems generalist, able to dive deep into any part of the stack and low-level systems, as well as design broad distributed system interactions.

    To succeed with these responsibilities, you will need:

    - Bachelor's or Master's in Computer Science, Computer Engineering, or a related field
    - 15+ years of experience delivering storage systems
    - Cloud experience is a plus
    - Proven experience with a major object-oriented programming language such as Java or C++
    - Strong knowledge of data structures, algorithms, operating systems, and distributed systems fundamentals
    - Very strong knowledge of databases, storage, and distributed persistence technologies
    - Strong troubleshooting and performance tuning skills

    Disclaimer: **Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.**

    **Range and benefit information provided in this posting are specific to the stated locations only.**

    US: Hiring range in USD from $96,800 to $251,600 per annum. May be eligible for bonus, equity, and compensation deferral. Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as to reflect Oracle's differing products, industries and lines of business. Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.

    Oracle US offers a comprehensive benefits package which includes the following:

    1. Medical, dental, and vision insurance, including expert medical opinion
    2. Short-term disability and long-term disability
    3. Life insurance and AD&D
    4. Supplemental life insurance (Employee/Spouse/Child)
    5. Health care and dependent care Flexible Spending Accounts
    6. Pre-tax commuter and parking benefits
    7. 401(k) Savings and Investment Plan with company match
    8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime-eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
    9. 11 paid holidays
    10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
    11. Paid parental leave
    12. Adoption assistance
    13. Employee Stock Purchase Plan
    14. Financial planning and group legal
    15. Voluntary benefits including auto, homeowner, and pet insurance

    The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted. Career Level - IC5

    **About Us**

    As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

    Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_************* or by calling *************** in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
    $83k-106k yearly est. 8d ago
  • Data Scientist, Analytics (Technical Leadership)

    Meta (4.8 company rating)

    Data engineer job in Hartford, CT

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

    **Responsibilities:**

    1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
    2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
    3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
    4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
    5. Build long-term vision and strategy for programs and products
    6. Collaborate with executives to define and develop data platforms and instrumentation
    7. Effectively communicate insights and recommendations to stakeholders
    8. Define success metrics, forecast changes, and set team goals
    9. Support developing roadmaps and coordinate analytics efforts across teams

    **Minimum Qualifications:**

    1. Bachelor's degree in Computer Science, Computer Engineering, or a relevant technical field (e.g. Mathematics, Statistics, Operations Research), or equivalent practical experience
    2. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
    3. 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization, and strategy development
    4. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics, and Finance
    5. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
    6. Experience communicating complex technical topics in a clear, precise, and actionable manner

    **Preferred Qualifications:**

    1. 10+ years of experience communicating the results of analyses to leadership teams to influence strategy
    2. Master's or Ph.D. degree in a quantitative field
    3. Bachelor's degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
    4. 10+ years of experience doing complex quantitative analysis in product analytics

    **Public Compensation:** $210,000/year to $281,000/year + bonus + equity + benefits

    **Industry:** Internet

    **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law.

    Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
    $210k-281k yearly 60d+ ago
  • Senior Data Engineer

    Stratacuity

    Data engineer job in Bristol, CT

    Description/Comment: Disney Streaming is the leading premium streaming service offering live and on-demand TV and movies, with and without commercials, both in and outside the home. Operating at the intersection of entertainment and technology, Disney Streaming has a unique opportunity to be the number one choice for TV. We captivate and connect viewers with the stories they love, and we're looking for people who are passionate about redefining TV through innovation, unconventional thinking, and embracing fun. Join us and see what this is all about.

    The Product Performance Data Solutions team in the Data organization within Disney Streaming (DS), a segment under Disney Media & Entertainment Distribution, is in search of a Senior Data Engineer. As a member of the Product Performance team, you will work on building foundational datasets from clickstream and quality-of-service telemetry data, enabling dozens of engineering and analytical teams to unlock the power of data to drive key business decisions and provide engineering, analytics, and operational teams the critical information necessary to scale the largest streaming service. The Product Performance Data Solutions team is seeking to grow its team of world-class Data Engineers who share its charisma and enthusiasm for making a positive impact.

    Responsibilities:

    - Contribute to maintaining, updating, and expanding existing data pipelines in Python / Spark while maintaining strict uptime SLAs
    - Architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across all data pipelines
    - Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Scala, Python
    - Collaborate with product managers, architects, and other engineers to drive the success of Product Performance Data and key business stakeholders
    - Contribute to developing and documenting both internal and external standards for pipeline configurations, naming conventions, partitioning strategies, and more
    - Ensure high operational efficiency and quality of datasets so our solutions meet SLAs and project reliability and accuracy to all our partners (Engineering, Data Science, Operations, and Analytics teams)
    - Be an active participant and advocate of agile/scrum ceremonies to collaborate and improve processes for our team
    - Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
    - Maintain detailed documentation of your work and changes to support data quality and data governance requirements

    Additional Information: NOTE: There will be no SPC for this role. Interview process: 4 rounds (1 with HM, 2 tech rounds, and a final with Product). We need an expert in SQL with extensive experience in Scala, a proven self-starter (expected to discover the outcome, and then chase after it), not only able to speak technical but able to clearly articulate that info to the business as well.

    Preferred Qualifications: Candidates with clickstream and user-browse data experience are highly preferred.

    Apex Systems is a world-class IT services company that serves thousands of clients across the globe. When you join Apex, you become part of a team that values innovation, collaboration, and continuous learning. We offer quality career resources, training, certifications, development opportunities, and a comprehensive benefits package. Our commitment to excellence is reflected in many awards, including ClearlyRated's Best of Staffing in Talent Satisfaction in the United States and Great Place to Work in the United Kingdom and Mexico. Apex uses a virtual recruiter as part of the application process.

    Apex Benefits Overview: Apex offers a range of supplemental benefits, including medical, dental, vision, life, disability, and other insurance plans that offer an optional layer of financial protection. We offer an ESPP (employee stock purchase program) and a 401K program which allows you to contribute typically within 30 days of starting, with a company match after 12 months of tenure. Apex also offers an HSA (Health Savings Account on the HDHP plan), a SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions, a corporate discount savings program, and other discounts. In terms of professional development, Apex hosts an on-demand training program, provides access to certification prep and a library of technical and leadership courses/books/seminars once you have 6+ months of tenure, and offers certification discounts and other perks with associations that include CompTIA and IIBA. Apex has a dedicated customer service team for our Consultants that can address questions around benefits and other resources, as well as a certified Career Coach. You can access a full list of our benefits, programs, support teams, and resources within our 'Welcome Packet' as well, which an Apex team member can provide.

    Employee Type: Contract. Location: Bristol, CT, US. Date Posted: January 8, 2026. Pay Range: $50 - $100 per hour
    $50-100 hourly 9d ago
  • Senior Data Engineer - Product Performance Data -1573

    Akube

    Data engineer job in Bristol, CT

    City: Bristol, CT / NYC. Onsite/Hybrid/Remote: Hybrid (4 days a week onsite). Duration: 10 months. Rate Range: Up to $96/hr on W2 depending on experience (no C2C, 1099, or subcontract). Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B.

    Must Have:

    - Advanced SQL expertise
    - Strong Scala development experience
    - Python for data engineering
    - Apache Spark in production
    - Airflow for orchestration
    - Databricks platform experience
    - Cloud data storage experience (S3 or equivalent)

    Responsibilities:

    - Build and maintain large-scale data pipelines with strict SLAs.
    - Design shared libraries in Scala and Python to standardize data logic.
    - Develop foundational datasets from clickstream and telemetry data.
    - Ensure data quality, reliability, and operational efficiency.
    - Partner with product, engineering, and analytics teams.
    - Define and document data standards and best practices.
    - Participate actively in Agile and Scrum ceremonies.
    - Communicate technical outcomes clearly to business stakeholders.
    - Maintain detailed technical and data governance documentation.

    Qualifications:

    - 5+ years of data engineering experience.
    - Strong problem-solving and algorithmic skills.
    - Expert-level SQL with complex analytical queries.
    - Hands-on experience with distributed systems at scale.
    - Experience supporting production data platforms.
    - Self-starter who can define outcomes and drive solutions.
    - Ability to translate technical concepts for non-technical audiences.
    - Bachelor's degree or equivalent experience.
    $96 hourly 10d ago
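The streaming-data listings above center on building foundational datasets from clickstream and telemetry data. One core step in that kind of work is sessionization: splitting each user's event stream on inactivity gaps. A toy pure-Python version follows (the 30-minute gap, field shapes, and function name are all illustrative assumptions; production pipelines would express this in Spark/Scala over partitioned storage):

```python
# Toy sessionization of clickstream events: group each user's events into
# sessions, starting a new session whenever the gap between consecutive
# events exceeds 30 minutes. Event shapes here are hypothetical.

SESSION_GAP_SECONDS = 30 * 60

def sessionize(events):
    """events: list of (user_id, epoch_seconds) tuples.
    Returns {user_id: [session, ...]} where each session is a list of timestamps."""
    sessions = {}
    for user, ts in sorted(events, key=lambda e: (e[0], e[1])):
        user_sessions = sessions.setdefault(user, [])
        if user_sessions and ts - user_sessions[-1][-1] <= SESSION_GAP_SECONDS:
            user_sessions[-1].append(ts)   # within the gap: continue current session
        else:
            user_sessions.append([ts])     # gap exceeded (or first event): new session
    return sessions

clicks = [("u1", 0), ("u1", 600), ("u1", 4000), ("u2", 100)]
print(sessionize(clicks))  # {'u1': [[0, 600], [4000]], 'u2': [[100]]}
```

In Spark the same logic is usually written with a window function over `(user_id, timestamp)` that flags rows where the lag gap exceeds the threshold, then a running sum of flags as the session id.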
  • Data Scientist

    Tsunami Tsolutions (4.0 company rating)

    Data engineer job in Glastonbury, CT

    Tsunami Tsolutions is seeking a motivated Data Scientist to join its Aviation Analytics department. This person will be responsible for developing and deploying solutions to various customers that utilize a wide range of analytics tools. They will work with team members and customers to identify new sources of value and take actions to capture it. The selected individual will also work to ensure data quality and provide metrics as indicators of the current state. This role will also require the presentation of findings to customers and senior management.

    NOTE: This position requires access to technologies and hardware subject to US national security-based export control requirements. All applicants must be US citizens (8 USC 1324b(a)(3)) or otherwise authorized by the U.S. Government. No company sponsorship offered.

    Responsibilities:

    - Working with stakeholders to understand complex business processes and data streams
    - Collaborating with stakeholders and whiteboarding solutions
    - Identifying, collecting, and cleaning data from multiple sources
    - Validating, interpreting, and providing business insights
    - Researching, building, implementing, and evaluating various analytic techniques to select the best application
    - Working with team members to deploy solutions across the organization
    - Reporting periodic progress on projects, including tracking usage and value derived from the models

    Position Requirements:

    - B.S. degree in computer science, data science, or engineering; advanced degree preferred
    - 3-5 years of industry experience preferred
    - Experience building dashboards and visualizations using Qlik, Power BI, Tableau, or similar
    - Strong programming skills in languages used in data science, including Python, R, and SQL
    - Advanced proficiency in Microsoft Excel, with competency in vlookups, pivot tables, etc.
    - Ability to learn new concepts quickly and translate them into practical applications
    - Ability to effectively communicate findings to non-technical audiences
    - Experience building and implementing machine learning models in Python or R a plus

    Offer contingent upon successful completion of a background check and drug screen.
    $78k-114k yearly est. 60d+ ago
  • Data Scientist, Privacy

    Datavant

    Data engineer job in Hartford, CT

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible, and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care. By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational, and life experiences to realize our bold vision for healthcare.

    As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us industry leaders in this area, and in stimulating discussions on re-identification risk. You will be supported in developing and consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.

    **You Will:**

    - Critically analyze large health datasets using standard and bespoke software libraries
    - Discuss your findings and progress with internal and external stakeholders
    - Produce high-quality reports which summarize your findings
    - Contribute to research activities as we explore novel and established sources of re-identification risk

    **What You Will Bring to the Table:**

    - Excellent communication skills and meticulous attention to detail in the production of comprehensive, well-presented reports
    - A good understanding of statistical probability distributions, bias, error, and power, as well as sampling and resampling methods
    - A drive to understand real-world data in context rather than consider it in abstraction
    - Familiarity or proficiency with programmable data analysis software (R or Python), and the desire to develop expertise in its language
    - Application of scientific methods to practical problems through experimental design, exploratory data analysis, and hypothesis testing to reach robust conclusions
    - Strong time management skills and demonstrable experience of prioritizing work to meet tight deadlines
    - Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
    - An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
    - Familiarity with Amazon Web Services cloud-based storage and computing facilities

    **Bonus Points If You Have:**

    - Experience creating documents using LaTeX
    - Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
    - Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners; familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued

    #LI-BC1

    We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.

    At Datavant, our total rewards strategy powers a high-growth, high-performance health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $104,000-$130,000 USD.

    To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship.

    Datavant is committed to a work environment free from job discrimination. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** ; explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.

    At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.

    Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request; you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis. For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
    $104k-130k yearly 22d ago
  • Principal Data Scientist

    Maximus (4.3 company rating)

    Data engineer job in Bridgeport, CT

    Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work). This position requires occasional travel to the DC area for client meetings. U.S. citizenship is required for this position due to government contract requirements.

    Essential Duties and Responsibilities:
    - Make deep dives into the data, pulling out objective insights for business leaders.
    - Initiate, craft, and lead advanced analyses of operational data.
    - Provide a strong voice for the importance of data-driven decision making.
    - Provide expertise to others in data wrangling and analysis.
    - Convert complex data into visually appealing presentations.
    - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
    - Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
    - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
    - Utilize various languages for scripting and write SQL queries.
    - Serve as the primary point of contact for data and analytical usage across multiple projects.
    - Guide operational partners on product performance and solution improvement/maturity options.
    - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
    - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
    - Mentor more junior data analysts/data scientists as needed.
    - Apply a strategic approach to lead projects from start to finish.

    Job-Specific Responsibilities:
    - Develop, collaborate, and advance the applied and responsible use of AI, ML, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
    - Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned, and firm intellectual capital.
    - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs, and classical machine learning.
    - Contribute to the development of mathematically rigorous process improvement procedures.
    - Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.

    Minimum Requirements:
    - Bachelor's degree in related field required.
    - 10-12 years of relevant professional experience required.

    Job-Specific Minimum Requirements:
    - 10+ years of relevant software development + AI/ML/DS experience.
    - Professional programming experience (e.g., Python, R).
    - Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
    - Experience with API programming.
    - Experience with Linux.
    - Experience with statistics.
    - Experience with classical machine learning.
    - Experience working as a contributor on a team.

    Preferred Skills and Qualifications:
    - Master's or BS in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science).
    - Experience developing machine learning or signal processing algorithms: the ability to leverage mathematical principles to model new and novel behaviors, and to leverage statistics to identify true signals from noise or clutter.
    - Experience working as an individual contributor in AI.
    - Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
    - Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
    - Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
    - Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.

    Background Investigations: IRS MBI - Eligibility #techjobs #VeteransPage #LI-Remote

    EEO Statement: Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.

    Pay Transparency: Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment.

    Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances. Accommodations: Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************. Minimum Salary: $156,740.00 Maximum Salary: $234,960.00
    $77k-112k yearly est. Easy Apply 7d ago
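The posting's mix of SQL querying and scripting for operational insights can be illustrated with a minimal, self-contained sketch; the table name, columns, and figures below are hypothetical, not taken from the role:

```python
import sqlite3

# Hypothetical operational dataset: case-processing records by region.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (region TEXT, days_to_close INTEGER)")
conn.executemany(
    "INSERT INTO cases VALUES (?, ?)",
    [("east", 5), ("east", 9), ("west", 3), ("west", 4), ("west", 2)],
)

# A typical operational-insight query: average cycle time per region.
rows = conn.execute(
    "SELECT region, AVG(days_to_close) FROM cases "
    "GROUP BY REGION ORDER BY region"
).fetchall()
print(rows)  # [('east', 7.0), ('west', 3.0)]
```

In practice the same aggregate would run against a production warehouse rather than an in-memory SQLite database; the shape of the query is what carries over.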
  • IBM IIB, WMB, DataPower Consultant

    Sonsoft 3.7 company rating

    Data engineer job in Hartford, CT

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy, and Information Technology Enabled Services.

    Job Description (Preferred):
    • At least 4 years of experience with IBM IIB, WMB, and DataPower
    • At least 4 years of experience in the software development life cycle.
    • At least 4 years of experience in project life cycle activities on development and maintenance projects.
    • At least 2 years of experience in design and architecture review.
    • Ability to work in a team in diverse/multiple-stakeholder environments
    • Ability to work in a Scrum team in diverse/multiple-stakeholder environments
    • Interface analysis, technical leadership, activities coordination, etc.
    • Perform reviews
    • Interactions with application teams, the GI Team, and other stakeholders relevant to the technology
    • Experience in the Automation domain.
    • Analytical skills
    • Experience and desire to work in a global delivery environment

    Qualifications (Basic):
    • Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
    • At least 4 years of experience with Information Technology.

    Additional Information: U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time. Note: This is a full-time permanent job opportunity. Only US Citizen, Green Card Holder, GC-EAD, H4-EAD & L2-EAD can apply. No OPT-EAD, TN Visa & H1B consultants, please. Please mention your visa status in your email or resume.
    $83k-112k yearly est. 60d+ ago
  • OP Corporate Bank - Data Scientist

    MPS Baltic

    Data engineer job in Springfield, MA

    Job Description: MPS Baltic client OP Corporate Bank is part of OP Financial Group, the leading financial services group in Finland. OP Corporate Bank has been operating in Lithuania for 12 years, offers services to business customers, and focuses on large companies. Currently OP Corporate Bank is looking for a Data Scientist to join their team in Vilnius. Do you have a passion for credit risk, data science, and automation? We invite you to make a deep dive into genuine banking expertise, managing credit risk, and solving substantial real-world financial challenges in a dynamic and forward-thinking environment. This opportunity allows you to grow as a credit risk expert by applying cutting-edge data science techniques, collaborating with top professionals in Lithuania and Finland, and contributing to innovative banking solutions.

    The Role:
    - Improve existing and create new data-driven tools in Databricks for efficiency and automation
    - Contribute to the ongoing Azure cloud transformation in the credit risk area
    - Contribute to the optimization/automation of MLflow deployments
    - Promote best practices across the department, contributing to and organizing workshops
    - Work with Python, Spark, SQL, and Git to manage and analyse data

    The Person:
    - 2+ years of experience in data analysis, modelling, or machine learning
    - Python & SQL proficiency
    - Experience with Git
    - Strong analytical & problem-solving skills
    - Fluent Lithuanian and English (written & spoken)
    - Credit risk and/or Azure Databricks experience is a bonus

    What is offered:
    - Professional growth: work in a Scandinavian corporate culture that values learning and development
    - Advanced technology: use Azure, Databricks, Python, and ML models in a cloud-based environment
    - Impactful work: contribute to credit risk modelling, automation, and analytics that drive business decisions
    - Collaboration: work with top data professionals in Lithuania and Finland
    - Competitive package: salary depending on your experience and competencies, plus health insurance, lunch subsidies, event budget, parking, and more
    - International experience: opportunity for business trips to Finland

    Should you feel that your skills and experience match the above, we would be delighted to receive your application. To apply in the strictest confidentiality, please send your CV marked "Data Scientist" to executive search company MPS Baltic by email: *********. Only short-listed candidates will be contacted. Additional information provided on ph.: +************5, Ina Skiauterien. Check out more about OP Corporate Bank Lithuania branch on *********************
    $79k-111k yearly est. 20d ago
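For flavor, credit risk modelling of the kind this role describes commonly starts from the standard expected-loss decomposition, EL = PD x LGD x EAD; the sketch below uses hypothetical figures and is not drawn from OP Corporate Bank's actual models:

```python
def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """Expected loss = probability of default x loss given default x exposure at default."""
    if not (0.0 <= pd_ <= 1.0 and 0.0 <= lgd <= 1.0 and ead >= 0.0):
        raise ValueError("PD and LGD must be in [0, 1]; EAD must be non-negative")
    return pd_ * lgd * ead

# Hypothetical corporate exposure: 2% PD, 45% LGD, EUR 1,000,000 EAD.
print(round(expected_loss(0.02, 0.45, 1_000_000), 2))  # 9000.0
```

Real credit-risk pipelines estimate PD with fitted models (logistic regression, gradient boosting) rather than taking it as an input, but the downstream arithmetic is this simple.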
  • AWS Data Migration Consultant

    Slalom 4.6 company rating

    Data engineer job in Hartford, CT

    Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment.

    Who You'll Work With: As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.

    What You'll Do
    * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
    * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
    * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
    * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
    * Implement high-availability and disaster recovery (HA/DR) strategies including Always On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS.
    * Ensure security best practices are followed, including IAM-based access control, encryption, and compliance with industry standards.
    * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
    * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
    * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.

    What You'll Bring
    * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
    * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
    * Hands-on experience with AWS database services (RDS, EC2-hosted databases).
    * Strong understanding of HA/DR solutions and cloud database design patterns.
    * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
    * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
    * Strong troubleshooting and analytical skills to resolve complex database and performance issues.
    * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.

    Nice to Have
    * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
    * Experience with NoSQL databases or hybrid data architectures.
    * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
    * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
    * Experience with DB2 on-premise or cloud-hosted environments.
    About Us: Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits: Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay range in the following locations (Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, New Jersey) is $105,000-$147,000 for the Consultant level, $120,000-$169,000 for the Senior Consultant level, and $133,000-$187,000 for the Principal level. In all other markets, the target base salary pay range is $96,000-$135,000 for the Consultant level, $110,000-$155,000 for the Senior Consultant level, and $122,000-$172,000 for the Principal level. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors.
The salary pay range is subject to change and may be modified at any time. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 11d ago
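Database migrations like the ones this role describes typically end with reconciliation checks before cutover. A minimal, self-contained sketch of the simplest such check, row-count reconciliation, using SQLite stand-ins for the source and target (table name and data are hypothetical; a real AWS DMS migration would compare against RDS endpoints):

```python
import sqlite3

def row_count(conn: sqlite3.Connection, table: str) -> int:
    # Basic post-migration reconciliation: compare row counts per table.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")

source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

# Naive full-copy "migration" step standing in for DMS.
rows = source.execute("SELECT id, total FROM orders").fetchall()
target.executemany("INSERT INTO orders VALUES (?, ?)", rows)

match = row_count(source, "orders") == row_count(target, "orders")
print(match)  # True
```

Production validation usually goes further (checksums per column, sampled row comparisons), but the pattern of source-vs-target assertions is the same.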
  • Data Scientist, Media

    Digital United

    Data engineer job in Farmington, CT

    Accepting applicants in CT, FL, MN, NJ, NC, OH, TX. Mediate.ly is seeking a hands-on Data Scientist to elevate media performance analysis, predictive modeling, and channel optimization. In this role, you'll leverage advanced machine learning techniques and generative AI tools to uncover actionable insights, automate reporting, and enhance campaign effectiveness across digital channels. You'll manage and evolve our existing performance dashboard (with a small external team), own the feature roadmap, and collaborate closely with Primacy on SEO/CRO data integration. A key part of the role involves supporting Account teams with clear, insight-rich reporting powered by enhanced data storytelling and visualization. This role was meant for you if you are passionate about and skilled in transforming complex datasets into clear, compelling insights.

    Measures:
    - AI-Enhanced Reporting & Insight Automation
    - Business & Media Impact
    - Reporting Standardization and Quality
    - Dashboard & Data Product Ownership

    Reports to: President

    RESPONSIBILITIES:

    Media & Channel Analytics
    - Analyze paid media across Google Ads, Meta, LinkedIn, Programmatic, YouTube; translate results into clear recommendations.
    - Build/maintain attribution approaches (last-click, MTA, assisted) and funnel diagnostics.
    - Integrate CRM/GA4/platform data to surface actionable trends by geo, audience, and creative.

    Predictive Modeling & Experimentation
    - Develop forecasting and propensity models to guide budget allocation and channel mix.
    - Run simulations (CPM/CPC/conv-rate scenarios) and design A/B and lift tests.
    - Partner with SEO/CRO to connect acquisition with on-site conversion improvements.

    Dashboard Ownership (Existing Platform)
    - Manage the dashboard development team (backlog, priorities, sprints) and collaborate on new features that improve usability and insight depth.
    - Gather stakeholder requirements (Accounts, Media, Leadership) and maintain a transparent roadmap.
    - Ensure data reliability (ETL QA, schema governance, tagging/UTM standards).

    Reporting & Client Enablement
    - Support Account teams with data-backed, insight-driven reporting (monthly/quarterly reviews, executive summaries, narrative analyses).
    - Build repeatable report templates; automate where possible while preserving clear storytelling.

    AI & Product Ideation
    - Explore LLM/ML use cases (persona signals, creative scoring, conversion prediction).
    - Prototype lightweight tools for planners/buyers (e.g., channel recommender, influence maps).

    What it takes to succeed in this role - QUALIFICATIONS:
    - 5-7 years in data science/marketing analytics/digital media performance.
    - Proficient in Python or R; strong SQL; experience with GA4/BigQuery and media platform exports.
    - Comfort with BI tools (Looker Studio, Tableau, Power BI) and dashboard product management / data visualization.
    - Familiarity with generative AI tools (e.g., OpenAI, Hugging Face, or Google Vertex AI) for automating insights, reporting, or content analysis.
    - Comfortable in a fast-paced environment with competing priorities.
    - Experience applying machine learning models to media mix modeling, customer segmentation, or predictive performance forecasting.
    - Strong understanding of marketing attribution models and how to evaluate cross-channel performance using statistical techniques.
    - Excellent communicator who can turn data into decisions for non-technical stakeholders.
    - Experience with paid media a plus!

    Key Competencies
    - Data Visualization & Storytelling: skilled in transforming complex datasets into clear, compelling insights using tools like Tableau, Power BI, or Python libraries.
    - AI & Machine Learning Expertise: proficient in applying supervised and unsupervised learning techniques to optimize media performance and audience targeting.
    - Media Analytics & Attribution: deep understanding of digital media metrics, multi-touch attribution models, and cross-channel performance analysis.
    - Dashboard Development & Management: experience managing analytics dashboards, defining feature roadmaps, and collaborating with developers for scalable solutions.
    - SEO/CRO Data Integration: ability to synthesize SEO and conversion rate optimization data to inform predictive models and campaign strategies.
    - Stakeholder Communication: strong ability to translate data into actionable insights for Account teams and clients, supporting strategic decision-making.
    - Automation & Efficiency: familiarity with AI tools to streamline reporting, anomaly detection, and campaign optimization workflows.
    - Statistical Analysis & Experimentation: proficient in A/B testing, regression analysis, and causal inference to validate media strategies.

    The Perks:
    - The best co-workers you'll ever find
    - Unlimited PTO
    - Medical, Dental, Vision, 401k plus match
    - Annual performance bonus eligibility
    - Ongoing training opportunities
    - Planned outings and team events (remote workers included!)

    PHYSICAL DEMANDS AND WORK ENVIRONMENT:
    - Prolonged periods of sitting at a desk and working on a computer.
    - Occasional standing, walking, or lifting of office supplies (up to 10-20 lbs.)
    - Frequent communication via phone, email, and video conferencing.
    - Work is performed in a temperature-controlled office environment with standard lighting and noise levels.
    - Position may require occasional travel to client sites.

    Compensation Range: We offer a competitive salary based on experience and qualifications. The compensation range for this position is $90,000 to $100,000 annually, with potential for bonuses, stock, and additional benefits.

    EEO & Accessibility Statement: Primacy is an Equal Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. If you require reasonable accommodation during the application or interview process, please contact [email protected]
    $90k-100k yearly Auto-Apply 60d+ ago
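The attribution approaches the posting names (last-click vs. multi-touch) can be sketched in a few lines of pure Python; the channel names and journey below are hypothetical:

```python
from collections import defaultdict

def last_click(path: list[str]) -> dict[str, float]:
    # Last-click attribution: the final touchpoint gets 100% of conversion credit.
    return {path[-1]: 1.0}

def linear_mta(path: list[str]) -> dict[str, float]:
    # Linear multi-touch attribution: credit is split evenly across all touchpoints,
    # so a channel appearing twice earns twice the share.
    credit: dict[str, float] = defaultdict(float)
    for channel in path:
        credit[channel] += 1.0 / len(path)
    return dict(credit)

journey = ["paid_search", "social", "email", "paid_search"]
print(last_click(journey))  # {'paid_search': 1.0}
print(linear_mta(journey))  # {'paid_search': 0.5, 'social': 0.25, 'email': 0.25}
```

Comparing the two outputs on the same journey shows why model choice matters: last-click credits only `paid_search`, while linear MTA surfaces the assisting `social` and `email` touches.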
  • Sr GenAI Data Engineer

    The Hartford 4.5 company rating

    Data engineer job in Hartford, CT

    Sr Staff Data Engineer - GE07DE

    We're determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals - and to help others accomplish theirs, too. Join our team as we help shape the future. Join our team as a Sr GenAI Data Engineer and lead the charge in developing cutting-edge AI solutions and data engineering strategies. Embrace our core values of innovation, collaboration, and excellence as you unlock unparalleled growth opportunities in the dynamic field of AI and data engineering. Shape the future of technology with us! Apply now to be part of our innovative journey and make a significant impact!

    Primary Job Responsibilities
    - Develop AI-driven systems to improve data capabilities, ensuring compliance with industry best practices.
    - Implement efficient Retrieval-Augmented Generation (RAG) architectures and integrate with enterprise data infrastructure.
    - Collaborate with cross-functional teams to integrate solutions into operational processes and systems supporting various functions.
    - Stay up to date with industry advancements in AI and apply modern technologies and methodologies to our systems.
    - Design, build, and maintain scalable and robust real-time data streaming pipelines using technologies such as GCP, Vertex AI, S3, AWS Bedrock, Spark streaming, or similar.
    - Develop data domains and data products for various consumption archetypes including Reporting, Data Science, AI/ML, Analytics, etc.
    - Ensure the reliability, availability, and scalability of data pipelines and systems through effective monitoring, alerting, and incident management.
    - Implement best practices in reliability engineering, including redundancy, fault tolerance, and disaster recovery strategies.
    - Collaborate closely with DevOps and infrastructure teams to ensure seamless deployment, operation, and maintenance of data systems.
    - Mentor junior team members and engage in communities of practice to deliver high-quality data and AI solutions while promoting best practices, standards, and adoption of reusable patterns.
    - Apply AI solutions to insurance-specific data use cases and challenges.
    - Partner with architects and stakeholders to influence and implement the vision of the AI and data pipelines while safeguarding the integrity and scalability of the environment.

    Required Skills & Experience (Mandatory):
    + 8+ years of strong hands-on programming experience in Python.
    + 7+ years of hands-on data engineering experience, including data solutions, SQL and NoSQL, Snowflake, ETL/ELT tools, CI/CD, big data, cloud technologies (AWS/Google/Azure), and Python/Spark.
    + 3+ years of data engineering experience focused on supporting Generative AI technologies.
    + 2+ years of hands-on experience implementing production-ready, enterprise-grade GenAI data solutions.
    + 3+ years of experience implementing Retrieval-Augmented Generation (RAG) pipelines, integrating retrieval mechanisms with language models.
    + 3+ years of experience with vector databases and graph databases, including implementation and optimization.
    + 3+ years of experience processing and leveraging unstructured data for GenAI applications.
    + 3+ years of proficiency implementing scalable AI-driven data systems supporting agentic solutions (AWS Lambda, S3, EC2, LangChain, LangGraph).
    + 3+ years of experience building AI pipelines that bring together structured, semi-structured, and unstructured data. This includes pre-processing with extraction, chunking, embedding and grounding strategies, semantic modeling, and getting the data ready for models and agentic solutions.

    Nice to Have
    + Experience with prompt engineering techniques for large language models.
    + Experience implementing data governance practices, including data quality, lineage, and data catalogue capture, holistically, strategically, and dynamically on a large-scale data platform.
    + Experience with multi-cloud hybrid AI solutions.
    + AI certifications.
    + Experience in the P&C or Employee Benefits industry.
    + Knowledge of natural language processing (NLP) and computer vision technologies.
    + Contributions to open-source AI projects or research publications in the field of Generative AI.

    Candidate must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.

    Compensation: The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency, and demonstration of competencies required for the role. The base pay is just one component of The Hartford's total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is: $135,040 - $202,560.

    Equal Opportunity Employer/Sex/Race/Color/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age

    About Us (*************************************) | Our Culture (*******************************************************) | What It's Like to Work Here (**************************************************) | Perks & Benefits (*********************************************)

    Every day, a day to do right. Showing up for people isn't just what we do. It's who we are - and have been for more than 200 years. We're devoted to finding innovative ways to serve our customers, communities and employees, continually asking ourselves what more we can do. Is our policy language as simple and inclusive as it can be?
    Can we better help businesses navigate our ever-changing world? What else can we do to destigmatize mental health in the workplace? Can we make our communities more equitable? That we can rise to the challenge of these questions is due in no small part to our company values that our employees have shaped and defined. And while how we contribute looks different for each of us, it's these values that drive all of us to do more and to do better every day.
    $135k-202.6k yearly 46d ago
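The chunking step in the RAG preparation this posting describes can be illustrated with a minimal pure-Python sketch of overlapping sliding-window chunking. The sizes, function name, and sample sentence are hypothetical, and production pipelines usually chunk by tokens rather than characters:

```python
def chunk_text(text: str, chunk_size: int = 40, overlap: int = 10) -> list[str]:
    # Sliding-window chunking: consecutive chunks share `overlap` characters
    # so that a fact spanning a chunk boundary still survives retrieval.
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "Policyholders must report a claim within thirty days of the loss event."
chunks = chunk_text(doc, chunk_size=30, overlap=8)
print(len(chunks))  # 3
print(all(len(c) <= 30 for c in chunks))  # True
```

After chunking, each chunk would be embedded and written to a vector store; the overlap parameter trades index size against the risk of splitting a retrievable fact across two chunks.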
  • Data Engineer w/ AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation

    Intermedia Group

    Data engineer job in Ridgefield, CT

    OPEN JOB: Data Engineer w AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake **HYBRID - This candidate will work on site 2-3X per week in Ridgefield CT location SALARY: $140,000 to $185,000 2 Openings NOTE: CANDIDATE MUST BE US CITIZEN OR GREEN CARD HOLDER We are seeking a highly skilled and experienced Data Engineer to design, build, and maintain our scalable and robust data infrastructure on a cloud platform. In this pivotal role, you will be instrumental in enhancing our data infrastructure, optimizing data flow, and ensuring data availability. You will be responsible for both the hands-on implementation of data pipelines and the strategic design of our overall data architecture. Seeking a candidate with hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake, Proficiency in Python and SQL and DevOps/CI/CD experience Duties & Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics. Collaborate with data architects, modelers and IT team members to help define and evolve the overall cloud-based data architecture strategy, including data warehousing, data lakes, streaming analytics, and data governance frameworks Collaborate with data scientists, analysts, and other business stakeholders to understand data requirements and deliver solutions. Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift) ensuring data quality, integrity, security, and accessibility. Implement data quality and validation processes to ensure data accuracy and reliability. Develop and maintain documentation for data processes, architecture, and workflows. Monitor and troubleshoot data pipeline performance and resolve issues promptly. Consulting and Analysis: Meet regularly with defined clients and stakeholders to understand and analyze their processes and needs. Determine requirements to present possible solutions or improvements. 
Technology Evaluation: Stay updated with the latest industry trends and technologies to continuously improve data engineering practices. Requirements Cloud Expertise: Expert-level proficiency in at least one major cloud platform (AWS, Azure, or GCP) with extensive experience in their respective data services (e.g., AWS S3, Glue, Lambda, Redshift, Kinesis; Azure Data Lake, Data Factory, Synapse, Event Hubs; GCP BigQuery, Dataflow, Pub/Sub, Cloud Storage); experience with AWS data cloud platform preferred SQL Mastery: Advanced SQL writing and optimization skills. Data Warehousing: Deep understanding of data warehousing concepts, Kimball methodology, and various data modeling techniques (dimensional, star/snowflake schemas). Big Data Technologies: Experience with big data processing frameworks (e.g., Spark, Hadoop, Flink) is a plus. Database Systems: Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra). DevOps/CI/CD: Familiarity with DevOps principles and CI/CD pipelines for data solutions. Hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation Proficiency in Python and SQL Desired Skills, Experience and Abilities 4+ years of progressive experience in data engineering, with a significant portion dedicated to cloud-based data platforms. ETL/ELT Tools: Hands-on experience with ETL/ELT tools and orchestrators (e.g., Apache Airflow, Azure Data Factory, AWS Glue, dbt). Data Governance: Understanding of data governance, data quality, and metadata management principles. 
+ AWS Experience: Ability to evaluate AWS cloud applications and make architecture recommendations; AWS Solutions Architect certification (Associate or Professional) is a plus.
+ Familiarity with Snowflake.
+ Knowledge of dbt (data build tool).
+ Strong problem-solving skills, especially in data pipeline troubleshooting and optimization.

If you are interested in pursuing this opportunity, please respond and include the following:
+ Full current resume
+ Required compensation
+ Contact information
+ Availability

Upon receipt, one of our managers will contact you to discuss in full.

STEPHEN FLEISCHNER
Recruiting Manager
INTERMEDIA GROUP, INC.
EMAIL: *******************************
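The posting above centers on ETL pipelines with data quality and validation steps. As a purely illustrative, stdlib-only sketch (hypothetical table and field names, SQLite standing in for a warehouse target, not this employer's actual stack), a minimal extract-transform-load step with a quality gate might look like:

```python
import sqlite3

# Hypothetical raw records "extracted" from an upstream source.
raw_rows = [
    {"order_id": "1001", "amount": "250.00", "region": "CT"},
    {"order_id": "1002", "amount": "not-a-number", "region": "NY"},  # bad record
    {"order_id": "1003", "amount": "75.50", "region": "CT"},
]

def transform(rows):
    """Cast types and split valid rows from rejects (a simple quality gate)."""
    valid, rejects = [], []
    for r in rows:
        try:
            valid.append((r["order_id"], float(r["amount"]), r["region"]))
        except ValueError:
            rejects.append(r)
    return valid, rejects

def load(valid_rows):
    """Load cleansed rows into a warehouse-style table (SQLite stands in here)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_id TEXT, amount REAL, region TEXT)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", valid_rows)
    return con

valid, rejects = transform(raw_rows)
con = load(valid)
loaded = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(f"loaded={loaded}, rejected={len(rejects)}")  # loaded=2, rejected=1
```

In an AWS Glue job the same shape would typically appear as PySpark transforms, with rejected records routed to a quarantine location in S3 rather than silently dropped.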
    $140k-185k yearly Easy Apply 60d+ ago
  • Sr Data Engineer (MFT - IBM Sterling)

    The Hertz Corporation 4.3company rating

    Data engineer job in Hartford, CT

    **A Day in the Life:**
The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data, and infrastructure components within the B2B integration team. The software focus areas include software infrastructure components, continuous improvement, and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment. The ideal candidate will have a passion for technology and the ability to create change and facilitate this transformation. They will have experience tailoring software design, developing, and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met. We expect the starting salary to be around $135k but it will be commensurate with experience.

**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communicate with internal and external business users on Sterling Integrator mappings
+ Make changes to existing partner integrations to meet internal and external requirements
+ Design, develop, and implement solutions based on standards and processes that establish consistency across enterprise data, reduce risk, and promote efficiencies in support of organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services, and perform root cause analysis.
+ Facilitate the review and vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations

INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.

TEAMWORK & COMMUNICATION
+ Superior, demonstrated team-building and development skills to harness powerful teams
+ Ability to communicate effectively with different levels of leadership within the organization
+ Provide timely updates so that progress against each individual incident can be tracked as required
+ Write and review high-quality technical documentation

CONTROL & AUDIT
+ Ensure their workstation and all processes and procedures follow organization standards

CONTINUOUS IMPROVEMENT
+ Encourage and maintain a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.

**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science, or another quantitative field, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (examples: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience
+ Strong problem-solving and critical-thinking skills with a proven record of identifying and diagnosing problems, solving complex problems with simple, logical solutions, and the ability to develop custom setups
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate technical and non-technical issues and concepts at multiple organization levels
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Excellent written and verbal communication skills
+ Flexibility in scheduling, which may include nights, weekends, and holidays

**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts - Theme Park Tickets, Gym Discounts & more

The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, the Caribbean, Latin America, Africa, the Middle East, Asia, Australia, and New Zealand. The Hertz Corporation is one of the largest worldwide airport general-use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.

**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills, and talents that our employees invest in their work every day represents a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran.
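The posting above asks for scripting to automate standard managed-file-transfer activities. As a hedged, stdlib-only sketch (invented file names; Sterling File Gateway provides its own routing and verification services, so this is only the general idea), a post-transfer integrity check might look like:

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file in chunks so large transfers don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_transfer(source: Path, delivered: Path) -> bool:
    """Compare digests of the source file and the delivered copy."""
    return sha256_of(source) == sha256_of(delivered)

# Demo: two temp files stand in for the two ends of a transfer.
with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "payload.dat"
    dst = Path(d) / "payload_delivered.dat"
    src.write_bytes(b"EDI batch 2024-01-15")
    dst.write_bytes(b"EDI batch 2024-01-15")
    ok = verify_transfer(src, dst)
    print("transfer verified:", ok)  # transfer verified: True
```

A real automation script would typically be driven by Control Center events and alert (rather than print) on a digest mismatch.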
    $135k yearly 60d+ ago
  • Consultant, Data Engineer

    IBM Corporation 4.7company rating

    Data engineer job in Southbury, CT

    Introduction
At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.

Your role and responsibilities
We are in search of a skilled Consultant Data Engineer to join our expanding team of experts. This role will be pivotal in the design and development of Snowflake Data Cloud solutions, encompassing responsibilities such as constructing data ingestion pipelines, establishing sound data architecture, and implementing stringent data governance and security protocols. The ideal candidate brings experience as a proficient data pipeline builder and adept data wrangler, deriving satisfaction from optimizing data systems from their foundational stages. Collaborating closely with database architects, data analysts, and data scientists, the Data Engineer will play a crucial role in ensuring a consistent and optimal data delivery architecture across ongoing customer projects. This position demands a self-directed individual comfortable navigating the diverse data needs of multiple teams, systems, and products. If you are enthusiastic about the prospect of contributing to a startup environment and supporting our customers in their next generation of data initiatives, we invite you to explore this opportunity.

As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM. IBM will be the hiring entity. This role can be performed from anywhere in the US.
Required education: Bachelor's Degree
Preferred education: Bachelor's Degree

Required technical and professional expertise
* Bachelor's degree in engineering, computer science, or an equivalent area
* 3+ years in related technical roles with experience in data management, database development, ETL, and/or data prep domains
* Experience developing data warehouses
* Experience building ETL/ELT ingestion pipelines
* Proficiency in using cloud platform services for data engineering tasks, including managed database services (Snowflake and its pros and cons vs. Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow)
* Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance
* Knowledge of how to manipulate, process, and extract value from large disconnected datasets
* SQL and Python scripting experience required; Scala and JavaScript are a plus
* Cloud experience (AWS, Azure, or GCP) is a plus
* Knowledge of any of the following tools is also a plus: Snowflake, Matillion/Fivetran, or dbt
* Strong interpersonal skills, including assertiveness and the ability to build strong client relationships
* Strong project management and organizational skills
* Ability to support and work with cross-functional and agile teams in a dynamic environment
* Advanced English required

ABOUT BUSINESS UNIT
IBM Consulting is IBM's consulting and global professional services business, with market-leading capabilities in business and technology transformation. With deep expertise in many industries, we offer strategy, experience, technology, and operations services to many of the most innovative and valuable companies in the world. Our people are focused on accelerating our clients' businesses through the power of collaboration. We believe in the power of technology responsibly used to help people, partners, and the planet.
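The requirements above center on warehouse development and ELT ingestion. As an illustrative sketch only (SQLite stands in for Snowflake, and the table and column names are invented), a tiny star-schema warehouse and a typical dimensional query might look like:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# A minimal star schema: one fact table keyed to two dimension tables.
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (date_key INTEGER, product_key INTEGER, amount REAL);

INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Auto'), (2, 'Home');
INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240101, 2, 50.0),
                              (20240201, 1, 25.0);
""")

# A typical dimensional query: slice the fact table by dimension attributes.
rows = con.execute("""
    SELECT p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    WHERE d.year = 2024
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Auto', 125.0), ('Home', 50.0)]
```

The same SQL shape carries over to Snowflake; what changes at scale is the loading side (staged ELT via COPY INTO or tools like Fivetran and dbt rather than literal INSERTs).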
YOUR LIFE @ IBM
In a world where technology never stands still, we understand that dedication to our clients' success, innovation that matters, and trust and personal responsibility in all our relationships live in what we do as IBMers as we strive to be the catalyst that makes the world work better. Being an IBMer means you'll be able to learn and develop yourself and your career, and you'll be encouraged to be courageous and experiment every day, all whilst having continuous trust and support in an environment where everyone can thrive whatever their personal or professional background. Our IBMers are growth minded, always staying curious, open to feedback and learning new information and skills to constantly transform themselves and our company. They are trusted to provide ongoing feedback to help other IBMers grow, as well as collaborate with colleagues, keeping in mind a team-focused approach to include different perspectives to drive exceptional outcomes for our customers. The courage our IBMers have to make critical decisions every day is essential to IBM becoming the catalyst for progress, always embracing challenges with the resources they have to hand, a can-do attitude, and an outcome-focused approach in everything that they do. Are you ready to be an IBMer?

ABOUT IBM
IBM's greatest invention is the IBMer. We believe that through the application of intelligence, reason and science, we can improve business, society and the human condition, bringing the power of an open hybrid cloud and AI strategy to life for our clients and partners around the world. Restlessly reinventing since 1911, we are not only one of the largest corporate organizations in the world, we're also one of the biggest technology and consulting employers, with many of the Fortune 500 companies relying on the IBM Cloud to run their business. At IBM, we pride ourselves on being an early adopter of artificial intelligence, quantum computing and blockchain.
Now it's time for you to join us on our journey to being a responsible technology innovator and a force for good in the world. IBM is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, neurodivergence, age, or other characteristics protected by the applicable law. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

OTHER RELEVANT JOB DETAILS
IBM offers a competitive and comprehensive benefits program. Eligible employees may have access to:
* Healthcare benefits including medical & prescription drug coverage, dental, vision, and mental health & well-being
* Financial programs such as 401(k), cash balance pension plan, the IBM Employee Stock Purchase Plan, financial counseling, life insurance, short & long-term disability coverage, and opportunities for performance-based salary incentive programs
* Generous paid time off including 12 holidays, minimum 56 hours sick time, 120 hours vacation, 12 weeks parental bonding leave in accordance with IBM Policy, and other Paid Care Leave programs. IBM also offers paid family leave benefits to eligible employees where required by applicable law
* Training and educational resources on our personalized, AI-driven learning platform where IBMers can grow skills and obtain industry-recognized certifications to achieve their career goals
* Diverse and inclusive employee resource groups, giving & volunteer opportunities, and discounts on retail products, services & experiences

We consider qualified applicants with criminal histories, consistent with applicable law. This position was posted on the date cited in the key job details section and is anticipated to remain posted for 21 days from this date or less if not needed to fill the role.
IBM will not be providing visa sponsorship for this position now or in the future. Therefore, in order to be considered for this position, you must have the ability to work without a need for current or future visa sponsorship. The compensation range and benefits for this position are based on a full-time schedule for a full calendar year. The salary will vary depending on your job-related skills, experience and location. Pay increment and frequency of pay will be in accordance with employment classification and applicable laws. For part time roles, your compensation and benefits will be adjusted to reflect your hours. Benefits may be pro-rated for those who start working during the calendar year.
    $77k-101k yearly est. 4d ago
  • Senior Data Engineer, Personal Insurance

    Travelers Insurance Company 4.4company rating

    Data engineer job in Hartford, CT

    **Who Are We?**
Taking care of our customers, our communities and each other. That's the Travelers Promise. By honoring this commitment, we have maintained our reputation as one of the best property casualty insurers in the industry for over 170 years. Join us to discover a culture that is rooted in innovation and thrives on collaboration. Imagine loving what you do and where you do it.

**Job Category** Technology

**Compensation Overview**
The annual base salary range provided for this position is a nationwide market range and represents a broad range of salaries for this role across the country. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. As part of our comprehensive compensation and benefits program, employees are also eligible for performance-based cash incentive awards.

**Salary Range** $139,400.00 - $230,000.00

**Target Openings** 1

**What Is the Opportunity?**
Travelers' Data Engineering team constructs pipelines that contextualize and provide easy access to data for the entire enterprise. As a Senior Data Engineer you will accelerate the growth and transformation of our analytics landscape. You will bring a strong desire to guide team members' growth and develop data solutions that translate complex data into user-friendly terminology. You will leverage your ability to design, build and deploy data solutions that capture, explore, transform, and utilize data to support Artificial Intelligence, Machine Learning and business intelligence/insights.

**What Will You Do?**
+ Build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions.
+ Design complex data solutions, including incorporating new data sources and ensuring designs are consistent across projects and aligned to data strategies. + Perform analysis of complex sources to determine value and use and recommend data to include in analytical processes. + Incorporate core data management competencies including data governance, data security and data quality. + Act as a data and technology subject matter expert within lines of business to support delivery and educate end users on data products/analytic environment. + Perform data and system analysis, assessment and resolution for defects and incidents of high complexity and correct as appropriate. + Collaborate across team to support delivery and educate end users on complex data products/analytic environment. + Perform other duties as assigned. **What Will Our Ideal Candidate Have?** + Bachelor's Degree in STEM related field or equivalent. + Ten years of related experience. + Primary Job Requirements: + Architect and design scalable, secure data solutions using AWS, Databricks, and Ab Initio. + Lead technical direction for data engineering initiatives across cloud and on-premises infrastructure. + Hands-on development: build ETL pipelines, optimize Spark jobs, and create Ab Initio graphs. + Troubleshoot production issues and provide technical guidance to junior engineers. + Conduct mentoring sessions and offer technical guidance to the 20-person admin team. + Collaborate with DBA teams, business analysts, and QA teams to ensure data governance and quality. + Manage infrastructure deployment and optimize cloud resources. + Lead technical design reviews and architecture discussions. + Implement data integration solutions and ensure compliance with data protection regulations. + Establish and enforce coding standards, best practices, and data governance policies. 
+ Technical Skills:
+ **Ab Initio:** Expert proficiency with GDE, Co>Operating System, EME, BRE, Express>It, metaprogramming (PDL)
+ **Programming:** Python, PySpark, SQL
+ **Cloud:** AWS architecture and services
+ **Databricks:** Workspace management, cluster configuration, Delta Lake, Unity Catalog
+ **Data Warehousing:** Strong understanding of data modeling, dimensional modeling (star/snowflake schemas)
+ **ETL/ELT:** End-to-end ETL development lifecycle
+ **Version Control:** Git, CI/CD pipelines
+ Advanced knowledge of tools, techniques, and manipulation including cloud platforms, programming languages, and modern software engineering practices.
+ Strong delivery skills including the ability to determine the software design strategy and methodology to be used for efforts, use automated tests, analysis, and informed feedback loops to ensure the quality and production readiness of work before release, and monitor the health of work efforts and that of adjacent systems.
+ Demonstrated track record of domain expertise including the ability to develop business partnerships and influence priorities by identifying solutions that are aligned with current business objectives, closely follow industry trends relevant to the domain, understand how to apply them, and share knowledge with coworkers.
+ Strong problem solver who utilizes data and proofs of concept to find creative solutions to difficult problems involving a significant number of factors with broad implications, reflects on solutions, measures impact, and uses that information to ideate and optimize.
+ Excellent communication skills with the ability to develop business partnerships, describe technology concepts in ways the business can understand, document initiatives in a concise and clear manner, and empathetically and attentively listen to others' thoughts and ideas.
+ Ability to lead and take action even when there is no clear owner, inspire and motivate others, and be effective at influencing team members. **What is a Must Have?** + Bachelor's degree in computer science, related STEM field, or its equivalent in education and/or work experience. + 6 additional years of data engineering experience. **What Is in It for You?** + **Health Insurance** : Employees and their eligible family members - including spouses, domestic partners, and children - are eligible for coverage from the first day of employment. + **Retirement:** Travelers matches your 401(k) contributions dollar-for-dollar up to your first 5% of eligible pay, subject to an annual maximum. If you have student loan debt, you can enroll in the Paying it Forward Savings Program. When you make a payment toward your student loan, Travelers will make an annual contribution into your 401(k) account. You are also eligible for a Pension Plan that is 100% funded by Travelers. + **Paid Time Off:** Start your career at Travelers with a minimum of 20 days Paid Time Off annually, plus nine paid company Holidays. + **Wellness Program:** The Travelers wellness program is comprised of tools, discounts and resources that empower you to achieve your wellness goals and caregiving needs. In addition, our mental health program provides access to free professional counseling services, health coaching and other resources to support your daily life needs. + **Volunteer Encouragement:** We have a deep commitment to the communities we serve and encourage our employees to get involved. Travelers has a Matching Gift and Volunteer Rewards program that enables you to give back to the charity of your choice. **Employment Practices** Travelers is an equal opportunity employer. We value the unique abilities and talents each individual brings to our organization and recognize that we benefit in numerous ways from our differences. 
In accordance with local law, candidates seeking employment in Colorado are not required to disclose dates of attendance at or graduation from educational institutions. If you are a candidate and have specific questions regarding the physical requirements of this role, please send us an email (*******************) so we may assist you. Travelers reserves the right to fill this position at a level above or below the level included in this posting. To learn more about our comprehensive benefit programs please visit ******************************************************** .
    $139.4k-230k yearly 7d ago
  • GCP Data Engineer

    Inizio Partners Corp

    Data engineer job in Hartford, CT

    Role - GCP Data Engineer (visualization experience)

As a Data Engineer you will work on the process of transforming raw data into a usable format, which is then further analyzed by other teams. Cleansing, organizing, and manipulating data using pipelines are some of the key responsibilities of a data engineer. You will also apply data engineering principles on the Google Cloud Platform to optimize its services, and create interactive dashboards/reports to present to stakeholders.

Role and Responsibilities:
+ Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
+ Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
+ Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other tools.
+ Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
+ Design intuitive and visually appealing visualizations to communicate complex data insights to users using Tableau, Streamlit, Dash Enterprise, and Power BI.
+ Use Flask to develop APIs that integrate data from various sources and facilitate automation of business processes.
+ Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Candidate Profile - Required Qualifications and Skills:
+ Expert in Dash Enterprise, developing user-friendly UIs, with strong storytelling experience.
+ Ability to conduct real-time data exploration, integrate predictive modeling, and seamlessly deploy machine learning modules to GCP.
+ Overall 5-8 years of experience with ETL technologies
+ 3+ years of experience with data engineering technologies like SQL, Hadoop, BigQuery, Dataproc, and Composer
+ Strong Python/PySpark data engineering skills
+ Hands-on experience with data visualization tools like Tableau, Power BI, Dash, and Streamlit
+ Experience building interactive dashboards, visualizations, and custom reports for business users
+ Knowledge of Flask for developing APIs and automating data workflows
+ Experience with data automation and implementing workflows in a cloud environment
+ Strong SQL ETL skills
+ GCP Data Engineer certification preferred
+ Ability to understand and design the underlying data/schema
+ Strong communication skills to effectively communicate client updates
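This role pairs pipeline work with dashboards and a small Flask API layer. As a stdlib-only illustration (the metric names and event records are invented; a real version would sit behind a Flask route feeding a Dash or Tableau front end), computing dashboard-ready KPIs from pipeline output might look like:

```python
import json
from collections import defaultdict

# Hypothetical rows emitted by an upstream pipeline.
events = [
    {"channel": "web", "revenue": 120.0, "converted": True},
    {"channel": "web", "revenue": 0.0, "converted": False},
    {"channel": "email", "revenue": 80.0, "converted": True},
]

def kpi_summary(rows):
    """Aggregate per-channel revenue and conversion rate for a dashboard."""
    agg = defaultdict(lambda: {"revenue": 0.0, "n": 0, "conversions": 0})
    for r in rows:
        a = agg[r["channel"]]
        a["revenue"] += r["revenue"]
        a["n"] += 1
        a["conversions"] += int(r["converted"])
    return {
        ch: {"revenue": a["revenue"], "conversion_rate": a["conversions"] / a["n"]}
        for ch, a in agg.items()
    }

summary = kpi_summary(events)
# JSON is what a Flask endpoint would return to the dashboard layer.
print(json.dumps(summary, sort_keys=True))
```

Wrapping `kpi_summary` in a Flask `@app.route` handler, with BigQuery as the source instead of an in-memory list, gives the API-plus-visualization shape the posting describes.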
    $84k-114k yearly est. 60d+ ago
  • AWS Data Engineer Onshore_HartMap - C102534 6.0 Hartford, CT

    CapB Infotek

    Data engineer job in Hartford, CT

    For one of our long-term, multiyear projects we are looking for an AWS Data Engineer Onshore_HartMap out of Hartford, CT.

Required qualifications to be successful in this role:
+ Experience using Amazon RDS, AWS Glue, AWS Lake Formation, Amazon EMR, AWS Data Pipeline
+ Additional skills: Amazon Athena, Amazon Redshift, Amazon Kinesis, Kafka, Amazon Neptune, Amazon DynamoDB
+ Prior experience as an AWS architect across multiple projects with go-live milestones
+ Leadership and mentoring skills are mandatory to lead diverse global architecture teams.

Desired AWS certifications:
+ Specialty: AWS Certified Security Specialty, AWS Certified Big Data Specialty
+ Professional: AWS Certified Solutions Architect - Professional
    $84k-114k yearly est. 60d+ ago
  • Data Engineer

    Global Channel Management

    Data engineer job in Windsor, CT

    Global Channel Management is a technology company that specializes in various types of recruiting and staff augmentation. Our account managers and recruiters have over a decade of experience in various verticals. GCM understands the challenges companies face when it comes to the skills and experience needed to fill the void of day-to-day functions. Organizations need to reduce training and labor costs while at the same time requiring the best talent for the job.

Qualifications
The Data Engineer needs 5 years of experience.

Data Engineer requires:
+ BS in Mathematics, Economics, Computer Science, Information Management, or Statistics
+ Expert knowledge of the SQL query language is required
+ Experience with getting and cleansing data (ETL), including Salesforce and big data sources, is highly desirable
+ Experience with IBM's SPSS product is desirable
+ Working knowledge of Linux-based client-server applications
+ Aptitude for research, organization, and structuring of complex problems
+ Strong technical, analytical, and problem-solving skills
+ Strong technical writing and communication skills
+ Prior experience with metadata management repositories is a plus
+ Prior experience in the financial services, retirement, or insurance industry is highly desirable
+ Strong team-oriented interpersonal and communication skills are required

Data Engineer duties:
+ Create best practices, and enforce those practices, to optimize Analytics Data Store data loading and consumption.
+ Identify data needs for projects; architect, plan, and execute extract, transform, and load (ETL) activities using a combination of third-party and open-source technologies.

Additional Information: $55/hr CTH
    $55 hourly 1d ago

Learn more about data engineer jobs

How much does a data engineer earn in Meriden, CT?

The average data engineer in Meriden, CT earns between $73,000 and $131,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Meriden, CT

$98,000