Data & Analytics Solutions Architect
2 days/week onsite
Up to $180,000 + 10% bonus
**this client is not able to sponsor or transfer visas at this time**
A high-performing, financially strong U.S. insurance group is investing heavily in its Data & Analytics platform as it enters a major growth phase - and is hiring a hands-on Data & Analytics Solutions Architect to help shape what comes next.
This is not an Enterprise Architect role. It's a deeply technical, delivery-oriented architecture position sitting within a central Data & Analytics function, working directly with engineering teams and business partners to design, evolve, and execute modern data platforms at scale.
🚀 Why this role stands out
A $5.5B revenue business with an ambition to reach $10B in the next few years
Expanding rapidly across the U.S. - growth driven by technology, automation, and data, not headcount alone
Moving decisively toward a Databricks-centric Lakehouse architecture (Azure-based)
Production ML and early LLM use cases already live - not just experimentation
Architects here own real problems end-to-end, with visibility across the full data lifecycle
You'll join a small, senior architecture group supporting multiple business domains (claims, underwriting, operations, analytics), helping teams move faster while setting the right technical guardrails.
🧠What you'll be doing
Designing and evolving modern Data & Analytics architectures (batch, real-time, cloud & hybrid)
Partnering with delivery teams and business stakeholders to translate needs into scalable solutions
Defining standards, patterns, and frameworks across Databricks, Azure, Power BI, ML platforms, and ingestion tools
Supporting execution - reviewing designs, advising engineers, solving complex technical problems
Evaluating new tools and capabilities, running POCs, and influencing platform roadmaps
Acting as a technical mentor across data engineering, analytics, and ML teams
This is an individual contributor role (manager-grade) - ideal for someone who wants architectural ownership without people management.
✅ What they're looking for
~7+ years in Data & Analytics Architecture or senior data engineering with strong architectural ownership
Strong Databricks experience (Lakehouse / medallion architecture is essential)
Solid Azure knowledge (data, networking concepts, security patterns)
Experience designing data pipelines, analytics platforms, and ML-adjacent architectures
Comfortable working across teams, articulating technical decisions, and challenging designs constructively
Insurance experience is helpful but not required - technical depth matters most
Nice to have: Power BI, Azure ML, Informatica/IICS, real-time ingestion, governance & data quality frameworks.
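Since the posting calls the Lakehouse/medallion pattern "essential," here is a minimal sketch of that bronze/silver/gold layering. Real Databricks implementations use PySpark and Delta Lake tables; plain Python lists stand in for tables here, and the claims schema is invented for illustration.

```python
# Bronze: land raw records as-is; Silver: validate and deduplicate;
# Gold: business-level aggregates. Each layer only reads from the previous one.

def bronze_ingest(raw_records):
    """Bronze layer: keep raw data untouched, tagged with a layer marker."""
    return [dict(r, _layer="bronze") for r in raw_records]

def silver_clean(bronze):
    """Silver layer: drop records missing a claim id, deduplicate, cast types."""
    seen, cleaned = set(), []
    for r in bronze:
        key = r.get("claim_id")
        if key is not None and key not in seen:
            seen.add(key)
            cleaned.append({"claim_id": key, "amount": float(r["amount"])})
    return cleaned

def gold_aggregate(silver):
    """Gold layer: a consumption-ready business aggregate."""
    return {"total_amount": sum(r["amount"] for r in silver),
            "claim_count": len(silver)}

raw = [{"claim_id": 1, "amount": "100.0"},
       {"claim_id": 1, "amount": "100.0"},   # duplicate, dropped in silver
       {"claim_id": None, "amount": "50.0"}, # invalid, dropped in silver
       {"claim_id": 2, "amount": "25.5"}]
summary = gold_aggregate(silver_clean(bronze_ingest(raw)))
# summary == {"total_amount": 125.5, "claim_count": 2}
```

The point of the layering is that each table's contract is explicit: bronze preserves raw history, silver enforces quality rules, and gold serves analytics.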
🌱 Culture & growth
High autonomy, low politics, strong internal mobility
Long tenure and real investment in upskilling
Architects here influence strategy and delivery - no ivory towers
$86k-119k yearly est. 4d ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Data engineer job in Hartford, CT
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g. Mathematics, Statistics, Operations Research), or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Masters or Ph.D. Degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$210k-281k yearly 60d+ ago
Senior Data Engineer
Stratacuity
Data engineer job in Bristol, CT
Description/Comment: Disney Streaming is the leading premium streaming service offering live and on-demand TV and movies, with and without commercials, both in and outside the home. Operating at the intersection of entertainment and technology, Disney Streaming has a unique opportunity to be the number one choice for TV. We captivate and connect viewers with the stories they love, and we're looking for people who are passionate about redefining TV through innovation, unconventional thinking, and embracing fun. Join us and see what this is all about.
The Product Performance Data Solutions team within the Data organization of Disney Streaming (DS), a segment under Disney Media & Entertainment Distribution, is in search of a Senior Data Engineer. As a member of the Product Performance team, you will work on building foundational datasets from clickstream and quality-of-service telemetry data - enabling dozens of engineering and analytical teams to unlock the power of data to drive key business decisions, and providing engineering, analytics, and operational teams the critical information necessary to scale the largest streaming service. The Product Performance Data Solutions team is seeking to grow its team of world-class Data Engineers who share its charisma and enthusiasm for making a positive impact.
Responsibilities:
* Contribute to maintaining, updating, and expanding existing data pipelines in Python / Spark while maintaining strict uptime SLAs
* Architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across all data pipelines
* Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Scala, Python
* Collaborate with product managers, architects, and other engineers to drive the success of the Product Performance Data and key business stakeholders
* Contribute to developing and documenting both internal and external standards for pipeline configurations, naming conventions, partitioning strategies, and more
* Ensure high operational efficiency and quality of datasets to ensure our solutions meet SLAs and project reliability and accuracy to all our partners (Engineering, Data Science, Operations, and Analytics teams)
* Be an active participant and advocate of agile/scrum ceremonies to collaborate and improve processes for our team
* Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
* Maintain detailed documentation of your work and changes to support data quality and data governance requirements
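The responsibilities above center on shared libraries that abstract business logic over clickstream data. As a hedged sketch of the kind of logic such libraries centralize, here is sessionization - grouping a user's events into sessions by a 30-minute inactivity gap. Production pipelines would express this in Spark/Scala; the function name, timestamps (plain epoch seconds), and gap threshold are all illustrative, not Disney's actual implementation.

```python
# Split time-ordered click events into sessions: a new session starts
# whenever the gap to the previous event exceeds 30 minutes.

SESSION_GAP_SECONDS = 30 * 60

def sessionize(events):
    """events: list of (timestamp, page) tuples sorted by timestamp.
    Returns a list of sessions, each a list of events."""
    sessions = []
    for ts, page in events:
        if sessions and ts - sessions[-1][-1][0] <= SESSION_GAP_SECONDS:
            sessions[-1].append((ts, page))   # continue current session
        else:
            sessions.append([(ts, page)])     # start a new session
    return sessions

clicks = [(0, "home"), (600, "show"), (5000, "home"), (5100, "play")]
sessions = sessionize(clicks)
# Two sessions: the event at t=5000 arrives more than 30 minutes after t=600.
```

Putting logic like this in one shared library, rather than re-implementing it per pipeline, is what keeps session counts consistent across the dozens of downstream teams the posting mentions.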
Additional Information: NOTE: There will be no SPC for this role.
Interview process: 4 rounds (1 with HM, 2 tech rounds, and a final with Product).
We need an expert in SQL with extensive experience in Scala - a proven self-starter (expected to discover the outcome and then chase after it) who can not only speak technically but clearly articulate that information to the business as well.
Preferred Qualifications: Candidates with clickstream and user-browse data experience are highly preferred.
Apex Systems is a world-class IT services company that serves thousands of clients across the globe. When you join Apex, you become part of a team that values innovation, collaboration, and continuous learning. We offer quality career resources, training, certifications, development opportunities, and a comprehensive benefits package. Our commitment to excellence is reflected in many awards, including ClearlyRated's Best of Staffing in Talent Satisfaction in the United States and Great Place to Work in the United Kingdom and Mexico. Apex uses a virtual recruiter as part of the application process.
Apex Benefits Overview: Apex offers a range of supplemental benefits, including medical, dental, vision, life, disability, and other insurance plans that offer an optional layer of financial protection. We offer an ESPP (employee stock purchase program) and a 401K program which allows you to contribute typically within 30 days of starting, with a company match after 12 months of tenure. Apex also offers a HSA (Health Savings Account on the HDHP plan), a SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions, a corporate discount savings program and other discounts. In terms of professional development, Apex hosts an on-demand training program, provides access to certification prep and a library of technical and leadership courses/books/seminars once you have 6+ months of tenure, and certification discounts and other perks to associations that include CompTIA and IIBA. Apex has a dedicated customer service team for our Consultants that can address questions around benefits and other resources, as well as a certified Career Coach. You can access a full list of our benefits, programs, support teams and resources within our 'Welcome Packet' as well, which an Apex team member can provide.
Employee Type:
Contract
Location:
Bristol, CT, US
Job Type:
Date Posted:
January 8, 2026
Pay Range:
$50 - $100 per hour
$50-100 hourly 11d ago
Senior Data Engineer - Product Performance Data -1573
Akube
Data engineer job in Bristol, CT
City: Bristol, CT / NYC
Onsite/Hybrid/Remote: Hybrid (4 days a week onsite)
Duration: 10 months
Rate Range: Up to $96/hr on W2 depending on experience (no C2C, 1099, or sub-contract)
Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B
Please note that visa transfers are not supported for this role.
Candidates must be on our payroll.
If you require visa sponsorship, please do not apply.
Must Have:
Advanced SQL expertise
Strong Scala development experience
Python for data engineering
Apache Spark in production
Airflow for orchestration
Databricks platform experience
Cloud data storage experience (S3 or equivalent)
Responsibilities:
Build and maintain large-scale data pipelines with strict SLAs.
Design shared libraries in Scala and Python to standardize data logic.
Develop foundational datasets from clickstream and telemetry data.
Ensure data quality, reliability, and operational efficiency.
Partner with product, engineering, and analytics teams.
Define and document data standards and best practices.
Participate actively in Agile and Scrum ceremonies.
Communicate technical outcomes clearly to business stakeholders.
Maintain detailed technical and data governance documentation.
Qualifications:
5+ years of data engineering experience.
Strong problem-solving and algorithmic skills.
Expert-level SQL with complex analytical queries.
Hands-on experience with distributed systems at scale.
Experience supporting production data platforms.
Self-starter who can define outcomes and drive solutions.
Ability to translate technical concepts for non-technical audiences.
Bachelor's degree or equivalent experience.
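For a concrete sense of the "expert-level SQL with complex analytical queries" this role asks for, here is a small window-function example runnable with Python's bundled sqlite3 (window functions require SQLite 3.25+). The events schema and data are made up for illustration.

```python
# Find each user's first action using ROW_NUMBER() over a per-user window -
# a common analytical pattern that plain GROUP BY cannot express directly.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, ts INTEGER, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("a", 1, "open"), ("a", 2, "play"), ("b", 1, "open"),
])

rows = conn.execute("""
    SELECT user_id, action FROM (
        SELECT user_id, action,
               ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY ts) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
# rows == [('a', 'open'), ('b', 'open')]
```

The same PARTITION BY / ORDER BY pattern carries over directly to Spark SQL and Databricks, the platforms listed above.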
$96 hourly 12d ago
Data Scientist
Tsunami Tsolutions 4.0
Data engineer job in Glastonbury, CT
Tsunami Tsolutions is seeking a motivated Data Scientist to join its Aviation Analytics department. This person will be responsible for developing and deploying solutions to various customers that utilize a wide range of analytics tools. They will work with team members and customers to identify new sources of value and take actions to capture it. The selected individual will also work to ensure data quality and provide metrics as indicators of the current state.
This role will also require the presentation of findings to customers and senior management.
NOTE: This position requires access to technologies and hardware subject to U.S. national-security-based export control requirements. All applicants must be U.S. citizens (8 USC 1324b(a)(3)) or otherwise authorized by the U.S. Government. No company sponsorship offered.
Responsibilities:
• Work with stakeholders to understand complex business processes and data streams
• Collaborate with stakeholders and whiteboard solutions
• Identify, collect, and clean data from multiple sources
• Validate, interpret, and provide business insights
• Research, build, implement, and evaluate various analytic techniques to select the best application
• Work with team members to deploy solutions across the organization
• Report periodic progress on projects, including tracking usage and value derived from the models
Position Requirements
• B.S. degree in computer science, data science, or engineering; advanced degree preferred
• 3-5 years of industry experience preferred
• Experience building dashboards and visualizations using Qlik, Power BI, Tableau or similar
• Strong programming skills in languages used in data science, including Python, R, and SQL
• Advanced proficiency in Microsoft Excel with competency in vlookups, pivot tables, etc.
• Ability to learn new concepts quickly and translate them into practical applications
• Ability to effectively communicate findings to non-technical audiences
• Experience building and implementing machine learning models in Python or R a plus
Offer contingent upon successful completion of a background check and drug screen.
$78k-114k yearly est. Auto-Apply 60d+ ago
Data Scientist, Privacy
Datavant
Data engineer job in Hartford, CT
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub you will play a crucial role in ensuring that privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high-quality reports that summarize your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ A drive to understand real-world data in context rather than considering it in the abstract
+ Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in its language
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
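The list above asks for grounding in "sampling and resampling methods." As a minimal, hedged illustration, here is a bootstrap estimate of the standard error of a mean using only the standard library; the dataset and resample count are invented for the example.

```python
# Bootstrap the standard error of the mean: resample the data with
# replacement many times and take the spread of the resample means.
import random
import statistics

random.seed(0)  # deterministic for the example
data = [2.1, 3.4, 2.9, 4.0, 3.1, 2.5, 3.8, 3.3]

def bootstrap_se(sample, n_resamples=2000):
    """Standard error of the mean via bootstrap resampling."""
    means = []
    for _ in range(n_resamples):
        resample = random.choices(sample, k=len(sample))  # with replacement
        means.append(statistics.mean(resample))
    return statistics.stdev(means)

se = bootstrap_se(data)
# se lands close to the analytic SE, stdev(data)/sqrt(n), about 0.22 here.
```

The same resample-and-recompute idea extends to re-identification risk metrics, where no closed-form standard error may exist.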
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement and Know Your Rights, and explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request. Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy.
$104k-130k yearly 24d ago
Principal Data Scientist
Maximus 4.3
Data engineer job in Bridgeport, CT
Description & Requirements Maximus has an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., supporting existing or future projects and providing solutions to capture new work).
This is a remote position.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Essential Duties and Responsibilities:
- Develop, collaborate, and advance the applied and responsible use of AI, ML, simulation, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements (required skills that align with contract LCAT, verifiable, and measurable):
- 10+ years of relevant Software Development + AI / ML / DS experience
- Professional Programming experience (e.g. Python, R, etc.)
- Experience with AI / Machine Learning
- Experience working as a contributor on a team
- Experience leading AI/DS/or Analytics teams
- Experience mentoring Junior Staff
- Experience with Modeling and Simulation
- Experience with program management
Preferred Skills and Qualifications:
- Masters in quantitative discipline (Math, Operations Research, Computer Science, etc.)
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors
- Ability to leverage statistics to identify true signals from noise or clutter
- Experience working as an individual contributor in AI or modeling and simulation
- Use of state-of-the-art technology to solve operational problems in AI, Machine Learning, or Modeling and Simulation spheres
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions
- Use and development of program automation, CI/CD, DevSecOps, and Agile
- Experience managing technical teams delivering technical solutions for clients.
- Experience working with optimization problems like scheduling
- Experience with Data Analytics and Visualizations
- Cloud certifications (AWS, Azure, or GCP)
- 10+ yrs of related experience in AI, advanced analytics, computer science, or software development
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary
$156,640.00
Maximum Salary
$234,960.00
$77k-112k yearly est. Easy Apply 2d ago
IBM IIB, WMB, DataPower Consultant
Sonsoft 3.7
Data engineer job in Hartford, CT
Sonsoft , Inc. is a USA based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services.
Job Description
Preferred
• At least 4 years of experience with IBM IIB, WMB, Datapower
• At least 4 years of experience in software development life cycle.
• At least 4 years of experience in Project life cycle activities on development and maintenance projects.
• At least 2 years of experience in Design and architecture review.
• Ability to work in a team, including Scrum teams, in diverse/multiple-stakeholder environments
• Interface analysis, Technical leadership, activities coordination, etc.
• Perform reviews
• Interactions with application teams, the GI Team, and other stakeholders relevant to the technology
• Experience in Automation Domain.
• Analytical skills
• Experience and desire to work in a Global delivery environment
Qualifications
Qualifications Basic
• Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
• At least 4 years of experience with Information Technology.
Additional Information
U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
Note:
This is a full-time, permanent job opportunity.
Only US Citizen, Green Card Holder, GC-EAD, H4-EAD & L2-EAD can apply.
No OPT-EAD, TN Visa & H1B Consultants please.
Please mention your Visa Status in your email or resume.
$83k-112k yearly est. 60d+ ago
OP Corporate Bank - Data Scientist
MPS Baltic
Data engineer job in Springfield, MA
Job Description
MPS Baltic's client OP Corporate Bank is part of OP Financial Group, the leading financial services group in Finland. OP Corporate Bank has been operating in Lithuania for 12 years, offering services to business customers with a focus on large companies.
Currently, OP Corporate Bank is looking for a Data Scientist to join their team in Vilnius.
Do you have a passion for credit risk, data science, and automation? We invite you to make a deep dive into genuine banking expertise, managing credit risk, and solving substantial real-world financial challenges in a dynamic and forward-thinking environment.
This opportunity allows you to grow as a credit risk expert by applying cutting-edge data science techniques, collaborating with top professionals in Lithuania and Finland, and contributing to innovative banking solutions.
The Role:
Improve existing and create new data-driven tools in Databricks for efficiency and automation
Contribute to the ongoing Azure cloud transformation in credit risk area
Contribute to the optimization/automation of MLflow deployments
Promote best practices across the department, contributing to and organizing workshops
Work with Python, Spark, SQL, and Git to manage and analyse data
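The stack above (Python, Spark, SQL) centers on credit-risk scoring; as a loose illustration of the kind of data-driven tool involved, here is a minimal logistic scorer in plain Python. The feature names and coefficients are invented for the example, not anything from OP's actual models:

```python
import math

# Hypothetical coefficients for an illustrative logistic credit-risk model;
# in practice these would come from a model trained and tracked in Databricks.
COEFFS = {"intercept": -2.0, "debt_ratio": 3.5, "late_payments": 0.8}

def default_probability(debt_ratio: float, late_payments: int) -> float:
    """Logistic score: P(default) = 1 / (1 + exp(-z))."""
    z = (COEFFS["intercept"]
         + COEFFS["debt_ratio"] * debt_ratio
         + COEFFS["late_payments"] * late_payments)
    return 1.0 / (1.0 + math.exp(-z))

# A low-risk and a high-risk applicant under the invented coefficients
low = default_probability(0.1, 0)   # z = -1.65, p ~ 0.16
high = default_probability(0.6, 3)  # z =  2.5,  p ~ 0.92
```

In a production setting the coefficients would be versioned model artifacts (e.g., via MLflow) rather than a hard-coded dictionary.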
The Person:
2+ years of experience in data analysis, modelling, or machine learning
Python & SQL proficiency
Experience with Git
Strong analytical & problem-solving skills
Fluent Lithuanian and English (written & spoken)
Credit risk and/or Azure Databricks experience is a bonus
What is offered:
Professional Growth: Work in a Scandinavian corporate culture that values learning and development
Advanced Technology: Use Azure, Databricks, Python, and ML models in a cloud-based environment
Impactful Work: Contribute to credit risk modelling, automation, and analytics that drive business decisions
Collaboration: Work with top data professionals in Lithuania and Finland
Competitive Package: Salary depending on your experience and competencies, plus health insurance, lunch subsidies, event budget, parking, and more
International Experience: Opportunity for business trips to Finland
Should you feel that your skills and experience match the above, we would be delighted to receive your application. To apply in the strictest confidence, please send your CV marked "Data Scientist" to executive search company MPS Baltic by email: *********.
Only short-listed candidates will be contacted.
Additional information provided by phone: +************5, Ina Skiauterien.
Check out more about the OP Corporate Bank Lithuania branch on *********************
$79k-111k yearly est. 22d ago
AWS Data Migration Consultant
Slalom 4.6
Data engineer job in Hartford, CT
Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment.
Who You'll Work With
As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.
We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions.
As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.
What You'll Do
* Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
* Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
* Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
* Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
* Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS.
* Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards.
* Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
* Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
* Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.
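Schema conversion is one concrete piece of such a migration. As a simplified sketch (the type mapping below is an illustrative assumption, not AWS SCT's actual rule set), translating a few SQL Server column types to PostgreSQL-style equivalents might look like:

```python
# Illustrative type mapping in the spirit of AWS Schema Conversion Tool:
# a handful of SQL Server column types mapped to PostgreSQL equivalents.
# This mapping is a simplified assumption for the example.
TYPE_MAP = {
    "DATETIME": "TIMESTAMP",
    "NVARCHAR(MAX)": "TEXT",
    "BIT": "BOOLEAN",
    "MONEY": "NUMERIC(19,4)",
}

def convert_column(name: str, sqlserver_type: str) -> str:
    """Map one column definition; fall back to the original type if unknown."""
    pg_type = TYPE_MAP.get(sqlserver_type.upper(), sqlserver_type)
    return f"{name} {pg_type}"

ddl = ", ".join(convert_column(n, t) for n, t in [
    ("created_at", "DATETIME"),
    ("notes", "NVARCHAR(MAX)"),
    ("active", "BIT"),
])
# ddl -> "created_at TIMESTAMP, notes TEXT, active BOOLEAN"
```

Real migrations also have to handle precision, collation, identity columns, and procedural code, which is where tools like SCT and DMS earn their keep.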
What You'll Bring
* 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
* Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
* Hands-on experience with AWS database services (RDS, EC2-hosted databases).
* Strong understanding of HA/DR solutions and cloud database design patterns.
* Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
* Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
* Strong troubleshooting and analytical skills to resolve complex database and performance issues.
* Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.
Nice to Have
* AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
* Experience with NoSQL databases or hybrid data architectures.
* Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
* Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
* Experience with DB2 on-premise or cloud-hosted environments.
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay range in the following locations:
Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, New Jersey, for Consultant level is $105,000-147,000 and for Senior Consultant level it is $120,000-$169,000 and for Principal level it is $133,000-$187,000.
In all other markets, the target base salary pay range for Consultant level is $96,000-$135,000 and for Senior Consultant level it is $110,000-$155,000 and for Principal level it is $122,000-$172,000.
In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
We will accept applications until 1/31/2026 or until the positions are filled.
$133k-187k yearly 13d ago
Data Scientist, Media
Digital United
Data engineer job in Farmington, CT
Accepting applicants in CT, FL, MN, NJ, NC, OH, TX. Mediate.ly is seeking a hands-on Data Scientist to elevate media performance analysis, predictive modeling, and channel optimization. In this role, you'll leverage advanced machine learning techniques and generative AI tools to uncover actionable insights, automate reporting, and enhance campaign effectiveness across digital channels. You'll manage and evolve our existing performance dashboard (with a small external team), own the feature roadmap, and collaborate closely with Primacy on SEO/CRO data integration. A key part of the role involves supporting Account teams with clear, insight-rich reporting powered by enhanced data storytelling and visualization. This role is meant for you if you are passionate about and skilled in transforming complex datasets into clear, compelling insights.
Measures:
AI-Enhanced Reporting & Insight Automation
Business & Media Impact
Reporting Standardization and Quality
Dashboard & Data Product Ownership
Reports to: President
RESPONSIBILITIES:
Media & Channel Analytics
Analyze paid media across Google Ads, Meta, LinkedIn, Programmatic, YouTube; translate results into clear recommendations.
Build/maintain attribution approaches (last-click, MTA, assisted) and funnel diagnostics.
Integrate CRM/GA4/platform data to surface actionable trends by geo, audience, and creative.
Predictive Modeling & Experimentation
Develop forecasting and propensity models to guide budget allocation and channel mix.
Run simulations (CPM/CPC/conv-rate scenarios) and design A/B and lift tests.
Partner with SEO/CRO to connect acquisition with on-site conversion improvements.
Dashboard Ownership (Existing Platform)
Manage the dashboard development team (backlog, priorities, sprints) and collaborate on new features that improve usability and insight depth.
Gather stakeholder requirements (Accounts, Media, Leadership) and maintain a transparent roadmap.
Ensure data reliability (ETL QA, schema governance, tagging/UTM standards).
Reporting & Client Enablement
Support Account teams with data-backed, insight-driven reporting (monthly/quarterly reviews, executive summaries, narrative analyses).
Build repeatable report templates; automate where possible while preserving clear storytelling.
AI & Product Ideation
Explore LLM/ML use cases (persona signals, creative scoring, conversion prediction).
Prototype lightweight tools for planners/buyers (e.g., channel recommender, influence maps).
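The A/B and lift-test work in the responsibilities above typically reduces to a significance check. A minimal two-proportion z-test in plain Python (the conversion counts below are hypothetical):

```python
import math

def lift_z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test for an A/B lift test, using pooled variance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical campaign: control converts 200/10,000, variant 260/10,000
z = lift_z_score(200, 10_000, 260, 10_000)
significant = abs(z) > 1.96  # ~95% two-sided threshold
```

In practice you would also pre-register the sample size and guard against peeking; libraries like `statsmodels` wrap this test, but the arithmetic is this simple.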
What it takes to succeed in this role-
QUALIFICATIONS:
5-7 years in data science/marketing analytics/digital media performance.
Proficient in Python or R; strong SQL; experience with GA4/BigQuery and media platform exports.
Comfort with BI tools (Looker Studio, Tableau, Power BI), dashboard product management, and data visualization.
Familiarity with generative AI tools (e.g., OpenAI, Hugging Face, or Google Vertex AI) for automating insights, reporting, or content analysis.
Comfortable in a fast-paced environment with competing priorities.
Experience applying machine learning models to media mix modeling, customer segmentation, or predictive performance forecasting.
Strong understanding of marketing attribution models and how to evaluate cross-channel performance using statistical techniques.
Excellent communicator who can turn data into decisions for non-technical stakeholders.
Experience with paid media a plus!
Key Competencies
Data Visualization & Storytelling - Skilled in transforming complex datasets into clear, compelling insights using tools like Tableau, Power BI, or Python libraries.
AI & Machine Learning Expertise - Proficient in applying supervised and unsupervised learning techniques to optimize media performance and audience targeting.
Media Analytics & Attribution - Deep understanding of digital media metrics, multi-touch attribution models, and cross-channel performance analysis.
Dashboard Development & Management - Experience managing analytics dashboards, defining feature roadmaps, and collaborating with developers for scalable solutions.
SEO/CRO Data Integration - Ability to synthesize SEO and conversion rate optimization data to inform predictive models and campaign strategies.
Stakeholder Communication - Strong ability to translate data into actionable insights for Account teams and clients, supporting strategic decision-making.
Automation & Efficiency - Familiarity with AI tools to streamline reporting, anomaly detection, and campaign optimization workflows.
Statistical Analysis & Experimentation - Proficient in A/B testing, regression analysis, and causal inference to validate media strategies.
The Perks:
The best co-workers you'll ever find
Unlimited PTO
Medical, Dental, Vision, 401k plus match
Annual performance bonus eligibility
Ongoing training opportunities
Planned outings and team events (remote workers included!)
PHYSICAL DEMANDS AND WORK ENVIRONMENT:
Prolonged periods of sitting at a desk and working on a computer.
Occasional standing, walking, or lifting of office supplies (up to 10-20 lbs.)
Frequent communication via phone, email, and video conferencing.
Work is performed in a temperature-controlled office environment with standard lighting and noise levels.
Position may require occasional travel to client site
Compensation Range: We offer a competitive salary based on experience and qualifications. The compensation range for this position is $90,000 to $100,000 annually, with potential for bonuses, stock, and additional benefits.
EEO & Accessibility Statement
Primacy is an Equal Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. If you require reasonable accommodation during the application or interview process, please contact [email protected]
$90k-100k yearly Auto-Apply 60d+ ago
Data Scientist
The Connecticut Rise Network
Data engineer job in New Haven, CT
RISE Data Scientist
Reports to: Monitoring, Evaluation, and Learning Manager
Salary: Competitive and commensurate with experience
Please note: Due to the upcoming holidays, application review for this position will begin the first week of January. Applicants can expect outreach by the end of the week of January 5.
Overview:
The RISE Network's mission is to ensure all high school students graduate with a plan and the skills and confidence to achieve college and career success. Founded in 2016, RISE partners with public high schools to lead networks where communities work together to use data to learn and improve. Through its core and most comprehensive network, RISE partners with nine high schools and eight districts, serving over 13,000 students in historically marginalized communities.
RISE high schools work together to ensure all students experience success as they transition to, through, and beyond high school by using data to pinpoint needs, form hypotheses, and pursue ideas to advance student achievement. Partner schools have improved Grade 9 promotion rates by nearly 20 percentage points, while also decreasing subgroup gaps and increasing schoolwide graduation and college access rates. In 2021, the RISE Network was honored to receive the Carnegie Foundation's annual Spotlight on Quality in Continuous Improvement recognition. Increasingly, RISE is pursuing opportunities to scale its impact through research publications, consulting partnerships, professional development experiences, and other avenues to drive excellent student outcomes.
Position Summary and Essential Job Functions:
The RISE Data Scientist will play a critical role in leveraging data to support continuous improvement, program evaluation, and research, enhancing the organization's evidence-based learning and decision-making. RISE is seeking a talented and motivated individual to design and conduct rigorous quantitative analyses to assess the outcomes and impacts of programs.
The ideal candidate is an experienced analyst who is passionate about using data to drive social change, with strong skills in statistical modeling, data visualization, and research design. This individual will also lead efforts to monitor and analyze organization-wide data related to mission progress and key performance indicators (KPIs), and communicate these insights in ways that inspire improvement and action. This is an exciting opportunity for an individual who thrives in an entrepreneurial environment and is passionate about closing opportunity gaps and supporting the potential of all students, regardless of life circumstances. The role will report to the Monitoring, Evaluation, and Learning (MEL) Manager and sit on the MEL team.
Responsibilities include, but are not limited to:
1. Research and Evaluation (30%)
Collaborate with MEL and network teams to design and implement rigorous process, outcome, and impact evaluations.
Lead in the development of data collection tools and survey instruments.
Manage survey data collection, reporting, and learning processes.
Develop RISE learning and issue briefs supported by quantitative analysis.
Design and implement causal inference approaches where applicable, including quasi-experimental designs.
Provide technical input on statistical analysis plans, monitoring frameworks, and indicator selection for network programs.
Translate complex findings into actionable insights and policy-relevant recommendations for non-technical audiences.
Report data for RISE leadership and staff, generating new insights to inform program design.
Create written reports, presentations, publications, and communications pieces.
2. Quantitative Analysis and Statistical Modeling (30%)
Clean, transform, and analyze large and complex datasets from internal surveys, the RISE data warehouse, and external data sources such as the National Student Clearinghouse (NSC).
Conduct exploratory research that informs organizational learning.
Lead complex statistical analyses using advanced methods (regression modeling, propensity score matching, difference in differences analysis, time-series analysis, etc).
Contribute to data cleaning and analysis for key performance indicator reporting.
Develop processes that support automation of cleaning and analysis for efficiency.
Develop and maintain analytical code and workflows to ensure reproducibility.
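Difference-in-differences, one of the methods listed above, can be illustrated in a few lines (the promotion-rate numbers are hypothetical, not RISE's actual results):

```python
def did_estimate(treat_pre: float, treat_post: float,
                 ctrl_pre: float, ctrl_post: float) -> float:
    """Difference-in-differences: change in the treated group
    minus change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical Grade 9 promotion rates (%) before/after an intervention:
# treated schools rose 70 -> 85, comparison schools rose 72 -> 76.
effect = did_estimate(70.0, 85.0, 72.0, 76.0)  # 15 - 4 = 11 points
```

A real analysis would run this as a regression with school and year fixed effects so you get standard errors, but the point estimate is exactly this double difference.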
3. Data Visualization and Tool-building (30%)
Work closely with non-technical stakeholders to understand the question(s) they are asking and the use cases they have for specific data visualizations or tools
Develop well-documented overviews and specifications for new tools.
Create clear, compelling data visualizations and dashboards.
Collaborate with Data Engineering to appropriately and sustainably source data for new tools.
Manage complex projects to build novel and specific tools for internal or external stakeholders.
Maintain custom tools for the duration of their usefulness, including by responding to feedback and requests from project stakeholders.
4. Data Governance and Quality Assurance (10%)
Support data quality assurance protocols and standards across the MEL team.
Ensure compliance with data protection, security, and ethical standards.
Maintain organized, well-documented code and databases.
Collaborate with the Data Engineering team to maintain RISE MEL data infrastructure.
Qualifications
Master's degree (or PhD) in statistics, economics, quantitative social sciences, public policy, data science, or related field.
Minimum of 3 years of professional experience conducting statistical analysis and managing large datasets.
Advanced proficiency in R, Python, or Stata for data analysis and modeling.
Experience designing and implementing quantitative research and evaluation studies.
Strong understanding of inferential statistics, experimental and quasi-experimental methods, and sampling design.
Strong knowledge of survey data collection tools such as Key Surveys, Google Forms, etc.
Excellent data visualization and communication skills.
Experience with data visualization tools; strong preference for Tableau.
Ability to translate complex data into insights for diverse audiences, including non-technical stakeholders.
Ability to cultivate relationships and earn credibility with a diverse range of stakeholders.
Strong organizational and project management skills.
Strong sense of accountability and responsibility for results.
Ability to work in an independent and self-motivated manner.
Demonstrated proficiency with Google Workspace.
Commitment to equity, ethics, and learning in a nonprofit or mission-driven context.
Positive attitude and willingness to work in a collaborative environment.
Strong belief that all students can learn and achieve at high levels.
Preferred
Experience working on a monitoring, evaluation, and learning team.
Familiarity with school data systems and prior experience working in a school, district, or similar K-12 educational context preferred.
Experience working with survey data (e.g., DHS, LSMS), administrative datasets, or real-time digital data sources.
Working knowledge of data engineering or database management (SQL, cloud-based platforms).
Salary Range
$85k - $105k
Most new hires' salaries fall within the first half of the range, allowing team members to grow in their roles. For those who already have significant and aligned experiences at the same level as the role, placement may be at the higher end of the range.
The Connecticut RISE Network is an equal opportunity employer and welcomes candidates from diverse backgrounds.
RISE Interview & Communication Policy
The RISE interview process includes:
A video or phone screening with the hiring manager
Interviews with the hiring panel
A performance task
Reference checks
Applicants will never receive an offer unless they have completed the full interview process.
All official communications with applicants are sent only through ADP or CT RISE email addresses (@ctrise.org). There has been a job offer scam circulating from various email addresses using the domain @careers-ctrise.org; this is not a valid RISE email address.
If you receive an email from anyone claiming to represent RISE with a job offer outside of our official channels, or requesting written screening information, and you have not completed the full interview process, please do not respond and report it to ******************.
$85k-105k yearly Auto-Apply 41d ago
Lead Data Engineer
Launch Potato
Data engineer job in New Haven, CT
WHO ARE WE?
Launch Potato is a profitable digital media company that reaches over 30M+ monthly visitors through brands such as FinanceBuzz, All About Cookies, and OnlyInYourState.
As The Discovery and Conversion Company, our mission is to connect consumers with the world's leading brands through data-driven content and technology.
Headquartered in South Florida with a remote-first team spanning over 15 countries, we've built a high-growth, high-performance culture where speed, ownership, and measurable impact drive success.
WHY JOIN US?
At Launch Potato, you'll accelerate your career by owning outcomes, moving fast, and driving impact with a global team of high-performers.
BASE SALARY: $150,000 to $190,000 per year
MUST HAVE:
5+ years of experience in data engineering within fast-paced, cloud-native environments
Deep expertise in Python, SQL, Docker, and AWS (S3, Glue, Kinesis, Athena/Presto)
Experience building and managing scalable ETL pipelines and data lake infrastructure
Familiarity with distributed systems, Spark, and data quality best practices
Strong cross-functional collaboration skills to support BI, analytics, and engineering teams
EXPERIENCE: 5+ years of data engineering experience in an AWS-based environment where data powers decision-making across product, marketing, and operations.
YOUR ROLE
Lead scalable data engineering efforts that empower cross-functional teams with reliable, timely, and actionable data, ensuring Launch Potato's analytics and business intelligence infrastructure fuels strategic growth.
OUTCOMES
Build and optimize scalable, efficient ETL and data lake processes that proactively catch issues before they impact the business
Own the ingestion, modeling, and transformation of structured and unstructured data to support reporting and analysis across all business units
Partner closely with BI and Analytics to deliver clean, query-ready datasets that improve user acquisition, engagement, and revenue growth
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows
Serve as the internal point of contact for reporting infrastructure, delivering ad hoc data analyses and driving consistent data integrity
Drive adoption and advancement of Looker dashboards by ensuring robust and scalable backend data support
Contribute to the future of Launch Potato's data team by mentoring peers and shaping a high-performance, quality-first engineering culture
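The anomaly-detection workflow mentioned in the outcomes above can be sketched as a simple z-score check over a daily metric (the row counts and threshold are illustrative):

```python
import statistics

def flag_anomalies(series: list[float], threshold: float = 3.0) -> list[int]:
    """Return indexes of points more than `threshold` standard
    deviations from the mean of the series."""
    mean = statistics.fmean(series)
    sd = statistics.stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mean) > threshold * sd]

# Hypothetical daily row counts from an ingestion pipeline; day 5 looks broken
daily_rows = [1000, 1020, 980, 1010, 995, 40, 1005]
anomalies = flag_anomalies(daily_rows, threshold=2.0)  # [5]
```

Production monitoring usually uses rolling windows and seasonality-aware baselines rather than a global mean, but this is the core check behind "catch issues before they impact the business."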
COMPETENCIES
Data Engineering Mastery: Demonstrates technical excellence in building data pipelines, troubleshooting distributed systems, and scaling infrastructure using AWS and open-source tools
Cross-Functional Collaboration: Communicates clearly across technical and non-technical teams, translating business needs into robust data solutions
Proactive Ownership: Operates with a strong sense of accountability; identifies and solves data issues independently and efficiently
Quality-Driven Execution: Holds a high bar for data accuracy, auditability, and documentation throughout all systems and workflows
Strategic Thinking: Anticipates how data infrastructure impacts wider company OKRs and proactively suggests improvements and innovations
Growth Mindset: Seeks out opportunities to elevate team capabilities, mentor others, and stay ahead of evolving best practices in data engineering
TOTAL COMPENSATION
Base salary is set according to market rates for the nearest major metro and varies based on Launch Potato's Levels Framework. Your compensation package includes a base salary, profit-sharing bonus, and competitive benefits. Launch Potato is a performance-driven company, which means once you are hired, future increases will be based on company and personal performance, not annual cost of living adjustments.
Want to accelerate your career? Apply now!
Since day one, we've been committed to having a diverse, inclusive team and culture. We are proud to be an Equal Employment Opportunity company. We value diversity, equity, and inclusion.
We do not discriminate based on race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
$150k-190k yearly Auto-Apply 2d ago
Lead AI Data Engineer
The Hartford 4.5
Data engineer job in Hartford, CT
Sr Staff Data Engineer - GE07DE
We're determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals - and to help others accomplish theirs, too. Join our team as we help shape the future.
The Enterprise Data Services Department's Customer Data Ecosystem Operations Team is looking for a skilled Lead AI Data Engineer to join our team. This is an exciting opportunity to join us on our multi-year Cloud Modernization journey. The role focuses on integrating data from multiple systems, curating and transforming it into high-quality data products for actionable insights, using a mix of solutions spanning AI and cloud tools and technologies. You'll uphold data integrity, accessibility, and compliance while creating curated Data Products to support diverse analytics needs, including descriptive, diagnostic, predictive, and prescriptive use cases for visualization and advanced data science.
Ideal candidates bring deep expertise in data engineering frameworks and tools (e.g., Spark, Snowflake), proficiency in programming languages (Python, SQL, PL/SQL), and experience with DevOps/DataOps pipelines, cloud platforms (AWS services), and agile methodologies (Scrum, Kanban). Familiarity with data warehousing, streaming technologies, and modern architecture such as Lakehouse and Data Mesh is highly desirable. Strong problem-solving, communication, and collaboration skills are essential, along with a proactive mindset and the ability to thrive in complex, fast-paced environments.
To succeed in this role, you should be a strong critical thinker with solid technical acumen, able to derive the root causes of business problems.
This role will have a Hybrid work schedule, with the expectation of working in an office (Columbus, OH, Chicago, IL, Hartford, CT or Charlotte, NC) 3 days a week (Tuesday through Thursday).
Key Responsibilities
* Design, develop, and optimize ETL/ELT pipelines for both structured and unstructured data.
* Mentor junior team members and engage in communities of practice to deliver high-quality data and AI solutions while promoting best practices, standards, and adoption of reusable patterns.
* Partner with architects and stakeholders to influence and implement the vision of the AI and data pipelines while safeguarding the integrity and scalability of the environment.
* Ingest and process large-scale datasets into the Enterprise Data Lake and downstream systems.
* Curate and publish Data Products to support analytics, visualization, and machine learning use cases.
* Collaborate with data analysts, data scientists, and BI teams to build data models and pipelines for research, reporting, and advanced analytics.
* Apply best practices for data modeling, governance, and security across all solutions.
* Partner with cross-functional teams to ensure alignment and delivery of high-value outcomes.
* Monitor and fine-tune data pipelines for performance, scalability, and reliability.
* Automate auditing, balance, reconciliation, and data quality checks to maintain high data integrity.
* Develop self-healing pipelines with robust re-startability mechanisms for resilience.
* Schedule and orchestrate complex, dependent workflows using tools like MWAA, Autosys, or Control-M.
* Leverage CI/CD pipelines to enable automated integration, testing, and deployment processes.
* Lead Proof of Concepts (POCs) and technology evaluations to drive innovation.
* Develop AI-driven systems to improve data capabilities, ensuring compliance with industry best practices.
* Implement efficient Retrieval-Augmented Generation (RAG) architectures and integrate with enterprise data infrastructure.
* Implement data observability practices to proactively monitor data health, lineage, and quality across pipelines, ensuring transparency and trust in data assets.
Required Skills & Experience:
* Bachelor's or master's degree in computer science or a related discipline.
* 5+ years of experience in data analysis, transformation, and development, ideally with 2+ years in insurance or a related industry.
* 3+ years of experience developing and deploying large-scale data and analytics applications on cloud platforms such as AWS and Snowflake.
* Strong proficiency in SQL, Python, and ETL tools such as Informatica IDMC for data integration and transformation (3+ years).
* Experience designing and optimizing data models for Data Warehouses, Data Marts, and Data Fabric, including dimensional modeling, semantic layers, metadata management, and integration for scalable, governed, and high-performance analytics (3+ years).
* 3+ years of hands-on experience in processing large-scale structured and unstructured data in both batch and near-real-time environments, leveraging distributed computing frameworks and streaming technologies for high-performance data pipelines.
* Strong technical knowledge of AI solutions leveraging cloud and other modern technologies.
* 3+ years of experience in Agile methodologies, including Scrum and Kanban frameworks.
* 2+ years of experience in leveraging DevOps pipelines for automated testing and deployment, ensuring continuous integration and delivery of data solutions.
* Proficient in data visualization tools such as Tableau and Power BI, with expertise in creating interactive dashboards, reports, and visual analytics to support data-driven decision-making.
* Ability to analyze source systems, provide business solutions, and translate these solutions into actionable steps.
Nice to Have
* Certifications in AWS Data & Analytics Services, Snowflake, and Informatica IDMC.
* Experience in building and integrating AI agents into data workflows.
* Strong knowledge of data governance, including metadata management and lineage tracking, with hands-on experience collaborating with the Data Governance Office.
Candidate must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.
Compensation
The listed annualized base pay range is primarily based on analysis of similar positions in the external market. Actual base pay could vary and may be above or below the listed range based on factors including but not limited to performance, proficiency and demonstration of competencies required for the role. The base pay is just one component of The Hartford's total compensation package for employees. Other rewards may include short-term or annual bonuses, long-term incentives, and on-the-spot recognition. The annualized base pay range for this role is:
$135,040 - $202,560
Equal Opportunity Employer/Sex/Race/Color/Veterans/Disability/Sexual Orientation/Gender Identity or Expression/Religion/Age
$135k-202.6k yearly 8d ago
Data Engineer with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation
Intermedia Group
Data engineer job in Ridgefield, CT
OPEN JOB: Data Engineer with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation
**HYBRID - This candidate will work on site 2-3x per week at the Ridgefield, CT location**
SALARY: $140,000 to $185,000
2 Openings
NOTE: CANDIDATE MUST BE US CITIZEN OR GREEN CARD HOLDER
We are seeking a highly skilled and experienced Data Engineer to design, build, and maintain our scalable and robust data infrastructure on a cloud platform. In this pivotal role, you will be instrumental in enhancing our data infrastructure, optimizing data flow, and ensuring data availability. You will be responsible for both the hands-on implementation of data pipelines and the strategic design of our overall data architecture.
Seeking a candidate with hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation; proficiency in Python and SQL; and DevOps/CI/CD experience.
Duties & Responsibilities
Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
Collaborate with data architects, modelers and IT team members to help define and evolve the overall cloud-based data architecture strategy, including data warehousing, data lakes, streaming analytics, and data governance frameworks
Collaborate with data scientists, analysts, and other business stakeholders to understand data requirements and deliver solutions.
Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift) ensuring data quality, integrity, security, and accessibility.
Implement data quality and validation processes to ensure data accuracy and reliability.
Develop and maintain documentation for data processes, architecture, and workflows.
Monitor and troubleshoot data pipeline performance and resolve issues promptly.
Consulting and Analysis: Meet regularly with defined clients and stakeholders to understand and analyze their processes and needs. Determine requirements to present possible solutions or improvements.
Technology Evaluation: Stay updated with the latest industry trends and technologies to continuously improve data engineering practices.
Requirements
Cloud Expertise: Expert-level proficiency in at least one major cloud platform (AWS, Azure, or GCP) with extensive experience in their respective data services (e.g., AWS S3, Glue, Lambda, Redshift, Kinesis; Azure Data Lake, Data Factory, Synapse, Event Hubs; GCP BigQuery, Dataflow, Pub/Sub, Cloud Storage); experience with AWS data cloud platform preferred
SQL Mastery: Advanced SQL writing and optimization skills.
Data Warehousing: Deep understanding of data warehousing concepts, Kimball methodology, and various data modeling techniques (dimensional, star/snowflake schemas).
Big Data Technologies: Experience with big data processing frameworks (e.g., Spark, Hadoop, Flink) is a plus.
Database Systems: Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
DevOps/CI/CD: Familiarity with DevOps principles and CI/CD pipelines for data solutions.
Hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation
Proficiency in Python and SQL
Desired Skills, Experience and Abilities
4+ years of progressive experience in data engineering, with a significant portion dedicated to cloud-based data platforms.
ETL/ELT Tools: Hands-on experience with ETL/ELT tools and orchestrators (e.g., Apache Airflow, Azure Data Factory, AWS Glue, dbt).
Data Governance: Understanding of data governance, data quality, and metadata management principles.
AWS Experience: Ability to evaluate AWS cloud applications, make architecture recommendations; AWS solutions architect certification (Associate or Professional) is a plus
Familiarity with Snowflake
Knowledge of dbt (data build tool)
Strong problem-solving skills, especially in data pipeline troubleshooting and optimization
If you are interested in pursuing this opportunity, please respond back and include the following:
Full CURRENT Resume
Required compensation
Contact information
Availability
Upon receipt, one of our managers will contact you to discuss in full
STEPHEN FLEISCHNER
Recruiting Manager
INTERMEDIA GROUP, INC.
EMAIL: *******************************
$140k-185k yearly 60d+ ago
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation 4.3
Data engineer job in Hartford, CT
**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, Control Center. This position requires an expert level knowledge of these technologies. You'll provide third-level support for core hardware, software, data and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology; possess the ability to create change and be the facilitator for this transformation. They will have experience tailoring software design, developing, and implementing of solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k but will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review, vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior & demonstrated team building & development skills to harness powerful teams
+ Ability to communicate effectively with different levels of leadership within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourages and maintains a 'best practice sharing culture', always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling which may include nights, weekends, and holidays
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
$135k yearly 60d+ ago
Senior Data Engineer
Travelers Insurance Company 4.4
Data engineer job in Hartford, CT
**Who Are We?** Taking care of our customers, our communities and each other. That's the Travelers Promise. By honoring this commitment, we have maintained our reputation as one of the best property casualty insurers in the industry for over 170 years. Join us to discover a culture that is rooted in innovation and thrives on collaboration. Imagine loving what you do and where you do it.
**Job Category**
Data Analytics, Data Science, Technology
**Compensation Overview**
The annual base salary range provided for this position is a nationwide market range and represents a broad range of salaries for this role across the country. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. As part of our comprehensive compensation and benefits program, employees are also eligible for performance-based cash incentive awards.
**Salary Range**
$139,400.00 - $230,000.00
**Target Openings**
1
**What Is the Opportunity?**
Travelers' Data Engineering team constructs pipelines that contextualize and provide easy access to data across the entire enterprise. As a Senior Data Engineer you will accelerate growth and transformation of our analytics landscape. You will bring a strong desire to guide team members' growth and develop data solutions that translate complex data into user-friendly terminology. You will leverage your ability to design, build and deploy data solutions that capture, explore, transform, and utilize data to support Artificial Intelligence, Machine Learning and business intelligence/insights.
**What Will You Do?**
+ Build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions.
+ Design complex data solutions, including incorporating new data sources and ensuring designs are consistent across projects and aligned to data strategies.
+ Perform analysis of complex sources to determine value and use and recommend data to include in analytical processes.
+ Incorporate core data management competencies including data governance, data security and data quality.
+ Act as a data and technology subject matter expert within lines of business to support delivery and educate end users on data products/analytic environment.
+ Perform data and system analysis, assessment and resolution for defects and incidents of high complexity and correct as appropriate.
+ Collaborate across team to support delivery and educate end users on complex data products/analytic environment.
+ Perform other duties as assigned.
**What Will Our Ideal Candidate Have?**
+ Bachelor's Degree in STEM related field or equivalent.
+ Ten years of related experience.
+ Advanced knowledge of ETL tools like Ab Initio and Databricks, AWS cloud platforms, programming languages like Python and SQL, and modern data engineering practices.
+ Strong delivery skills, including the ability to determine the software design strategy and methodology to be used for efforts; use automated tests, analysis, and informed feedback loops to ensure the quality and production readiness of work before release; and monitor the health of work efforts and that of adjacent systems.
+ Demonstrated track record of domain expertise, including the ability to develop business partnerships and influence priorities by identifying solutions aligned with current business objectives, closely follow industry trends relevant to the domain, understand how to apply them, and share knowledge with coworkers.
+ Strong problem solver who utilizes data and proofs of concepts to find creative solutions to difficult problems involving a significant number of factors with broad implications, reflects on solutions, measures impact, and uses that information to ideate and optimize.
+ Excellent communication skills with the ability to develop business partnerships, describe technology concepts in ways the business can understand, document initiatives in a concise and clear manner, and empathetically and attentively listen to others' thoughts and ideas.
+ Ability to lead and take action even when there is no clear owner, inspire and motivate others, and be effective at influencing team members.
**What is a Must Have?**
+ Bachelor's degree in computer science, related STEM field, or its equivalent in education and/or work experience.
+ 6 additional years of data engineering experience.
**What Is in It for You?**
+ **Health Insurance** : Employees and their eligible family members - including spouses, domestic partners, and children - are eligible for coverage from the first day of employment.
+ **Retirement:** Travelers matches your 401(k) contributions dollar-for-dollar up to your first 5% of eligible pay, subject to an annual maximum. If you have student loan debt, you can enroll in the Paying it Forward Savings Program. When you make a payment toward your student loan, Travelers will make an annual contribution into your 401(k) account. You are also eligible for a Pension Plan that is 100% funded by Travelers.
+ **Paid Time Off:** Start your career at Travelers with a minimum of 20 days Paid Time Off annually, plus nine paid company Holidays.
+ **Wellness Program:** The Travelers wellness program is comprised of tools, discounts and resources that empower you to achieve your wellness goals and caregiving needs. In addition, our mental health program provides access to free professional counseling services, health coaching and other resources to support your daily life needs.
+ **Volunteer Encouragement:** We have a deep commitment to the communities we serve and encourage our employees to get involved. Travelers has a Matching Gift and Volunteer Rewards program that enables you to give back to the charity of your choice.
**Employment Practices**
Travelers is an equal opportunity employer. We value the unique abilities and talents each individual brings to our organization and recognize that we benefit in numerous ways from our differences.
In accordance with local law, candidates seeking employment in Colorado are not required to disclose dates of attendance at or graduation from educational institutions.
If you are a candidate and have specific questions regarding the physical requirements of this role, please send us an email (*******************) so we may assist you.
Travelers reserves the right to fill this position at a level above or below the level included in this posting.
To learn more about our comprehensive benefit programs please visit ******************************************************** .
$139.4k-230k yearly 8d ago
AWS Data Engineer Onshore_HartMap - C102534
CapB Infotek
Data engineer job in Hartford, CT
For one of our long-term multiyear projects we are looking for an AWS Data Engineer Onshore_HartMap out of Hartford, CT. Required qualifications to be successful in this role: Experience using Amazon RDS, AWS Glue, AWS Lake Formation, Amazon EMR, AWS Data Pipeline
Additional Skills: Amazon Athena, Amazon Redshift, Amazon Kinesis, Kafka, Amazon Neptune, Amazon DynamoDB
Prior Experience as AWS Architect across multiple projects with go-live milestones
Leadership and mentoring skills are mandatory to lead diverse global architecture teams.
Desired AWS certifications:
Specialty: AWS Certified Security Specialist, AWS Certified Big Data Specialist
Professional: AWS Certified Solutions Architect Professional
$84k-114k yearly est. 60d+ ago
GCP Data Engineer
Inizio Partners Corp
Data engineer job in Hartford, CT
Role - GCP Data Engineer (visualization experience)
As a Data Engineer you will work on the process of transforming raw data into a usable format, which is then further analyzed by other teams. Cleansing, organizing, and manipulating data using pipelines are some of the key responsibilities of a data engineer. You will also apply data engineering principles on the Google Cloud Platform to optimize its services, and create interactive dashboards/reports to present to stakeholders.
Role and Responsibilities:
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional / non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other technologies
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Design intuitive and visually appealing visualizations to communicate complex data insights to users using Tableau, Streamlit, Dash Enterprise and Power BI
Use Flask to develop APIs that integrate data from various sources and facilitate automation of business processes.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Candidate Profile:
Required Qualification and Skills:
Expert in Dash Enterprise, with experience developing user-friendly UIs and strong data storytelling.
Ability to conduct real-time data exploration, predictive modeling integration and a seamless deployment of machine learning modules to GCP.
Overall 5-8 years of experience with ETL technologies
3+ years of experience in data engineering technologies like SQL, Hadoop, BigQuery, Dataproc, Composer
Strong Python/PySpark data engineering skills
Hands-on experience with data visualizations like Tableau, Power BI, Dash and Streamlit
Experience in building interactive dashboards, visualizations, and custom reports for business users.
Knowledge of Flask for developing APIs and automating data workflows
Experience with data automation and implementing workflows in a cloud environment.
Strong SQL ETL skills
GCP Data Engineer certification preferred.
Ability to understand and design the underlying data/schema
Strong communication skills to effectively communicate client updates.
$84k-114k yearly est. 60d+ ago
Data Engineer
Global Channel Management
Data engineer job in Windsor, CT
Global Channel Management is a technology company that specializes in various types of recruiting and staff augmentation. Our account managers and recruiters have over a decade of experience in various verticals. GCM understands the challenges companies face when it comes to the skills and experience needed to fill the void of the day-to-day function. Organizations need to reduce training and labor costs but at the same time require the best talent for the job.
Qualifications
Data Engineer needs 5 years of experience
Data Engineer requires:
BS in Mathematics, Economics, Computer Science, Information Management or Statistics.
Expert knowledge in SQL query language is required.
Experience with getting and cleansing data (ETL), including Salesforce and Big Data sources, is highly desirable.
Experience with IBM's SPSS product is desirable.
Working knowledge of Linux based client server applications.
Aptitude for research, organization and structuring of complex problems.
Strong technical, analytical and problem-solving skills.
Strong technical writing skills and communication skills.
Prior experience with metadata management repositories is a plus.
Prior experience in the financial services, retirement or insurance industry is highly desirable
Strong team-oriented interpersonal and communication skills are required.
Data Engineer duties:
Create Best Practices and enforce those practices, to optimize the Analytics Data Store data loading and consumption.
Identify data needs for projects; architect, plan, and execute the extract, transform, and load (ETL) activities, using a combination of 3rd-party and open source technologies.
Additional Information
$55/hr
CTH
How much does a data engineer earn in Meriden, CT?
The average data engineer in Meriden, CT earns between $73,000 and $131,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.