The Data Architect will be responsible for designing, creating, and managing our organization's data architecture. This role will involve collaborating with various stakeholders to understand business requirements and translate them into efficient and scalable data solutions. The Data Architect will also play a key role in implementing data governance best practices and ensuring data quality and integrity across all systems.
Responsibilities and Qualifications:
* Experience defining strategy and approach for an Architecture (preferably Data Architecture) function, including but not limited to: scope/vision, roles and responsibilities, operational processes, and governance structure.
* Deep domain knowledge of Data Management concepts including but not limited to: Data Acquisition, Storage, Processing/Transformation, Modeling, Retention, and Protection.
* Hands-on practical experience designing and delivering enterprise-grade data solutions, including both operational and analytic use cases spanning data pipelines with enterprise ETL (batch and real-time) and logical/physical data modeling.
* Experience with Cloud Computing, notably for Operational Data Processing and Analytic Use Cases.
* Strong communication and collaboration skills, with the ability to interact effectively with technical and non-technical stakeholders at all levels of the organization.
* Demonstrated ability to collaborate with business stakeholders, data analysts, and software developers to understand data requirements and translate them into technical specifications.
* History of providing guidance and support to development teams during the implementation of data-related projects, including database design, data migration, and performance optimization.
* Experience defining and implementing strategies and processes for effective Reference Data Management.
* Experience defining and implementing strategies and processes for Data Retention in operational and analytic data sets (a brief illustrative sketch follows this list).
* Proven track record of aligning technical data strategy with overall business goals and objectives to ensure that data initiatives contribute to organizational success.
* Technical acumen to evaluate and select appropriate data management technologies and tools to support data architecture goals, such as data warehouses, data lakes, and ETL processes.
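To make the retention item above concrete, here is a minimal, illustrative Python sketch of applying a retention window to a tabular dataset; the column names, the 13-month window, and the archive target are assumptions, not part of the original posting.

```python
from datetime import datetime, timedelta, timezone

import pandas as pd

# Assumed example data; in practice this would come from an operational table.
events = pd.DataFrame(
    {
        "event_id": [1, 2, 3],
        "created_at": pd.to_datetime(
            ["2023-01-15", "2025-06-01", "2025-11-20"], utc=True
        ),
    }
)

RETENTION_DAYS = 395  # roughly 13 months; an assumed policy for illustration only
cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)

# Split the dataset into rows to keep and rows to archive before purging.
to_keep = events[events["created_at"] >= cutoff]
to_archive = events[events["created_at"] < cutoff]

# Archive first so the purge is reversible and auditable.
to_archive.to_csv("events_expired.csv", index=False)
print(f"kept {len(to_keep)} rows, archived {len(to_archive)} rows")
```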
Additional Requirements:
* Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred.
* Certification in data management or related field (e.g., Certified Data Management Professional) preferred.
* Familiarity with agile software development methodologies and DevOps practices.
* Strong project management skills and the ability to prioritize tasks in a fast-paced environment.
* Excellent analytical and problem-solving skills, with the ability to understand complex data requirements and design scalable solutions.
* Knowledge of data governance frameworks, data security best practices, and regulatory compliance standards (e.g., GDPR, HIPAA).
* Stay current with emerging trends and technologies in data management and recommend innovative solutions to enhance the organization's data architecture capabilities.
* Monitor and analyze data architecture performance metrics to identify areas for improvement and optimization.
* Strong understanding of enterprise architecture principles and practices
* Independent thinker and self-starter
*Job Type & Location*
This is a Permanent position based out of Cleveland, OH.
*Pay and Benefits*
The pay range for this position is $130,000.00 - $160,000.00/yr.
Medical, Dental, Vision offered through United Health Group.
4 weeks of PTO
Holidays: 8 observed and 3 floating
*Workplace Type*
This is a hybrid position in Cleveland, OH.
*Application Deadline*
This position is anticipated to close on Jan 21, 2026.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
About TEKsystems and TEKsystems Global Services
We're a leading provider of business and technology services. We accelerate business transformation for our customers. Our expertise in strategy, design, execution and operations unlocks business value through a range of solutions. We're a team of 80,000 strong, working with over 6,000 customers, including 80% of the Fortune 500 across North America, Europe and Asia, who partner with us for our scale, full-stack capabilities and speed. We're strategic thinkers, hands-on collaborators, helping customers capitalize on change and master the momentum of technology. We're building tomorrow by delivering business outcomes and making positive impacts in our global communities. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at TEKsystems.com.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
$130k-160k yearly
Data Architect
Novocure Inc. 4.6
Data engineer job in Cleveland, OH
We are seeking an experienced and innovative Data Architect to lead the design, development, and optimization of our enterprise data architecture. This individual will play a critical role in aligning data strategy with business objectives, ensuring data integrity, and driving value from data across multiple platforms. The ideal candidate will have deep expertise in data architecture best practices and technologies, particularly across SAP S/4 HANA, Veeva CRM, Veeva Vault, SaaS platforms, Operational Data Stores (ODS), and Master Data Management (MDM) platforms.
This is a full-time position reporting to the Director, Enterprise Architecture.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
Design, develop, and maintain scalable and secure enterprise data architecture solutions across SAP S/4 HANA, Veeva CRM, and Veeva Vault environments.
Serve as a subject matter expert for Operational Data Stores and Master Data Management architecture, ensuring clean, consistent, and governed data across the enterprise.
Collaborate with cross-functional teams to identify data needs, establish data governance frameworks, and define data integration strategies.
Develop data models, data flows, and system integration patterns that support enterprise analytics and reporting needs.
Evaluate and recommend new tools, platforms, and methodologies for improving data management capabilities.
Ensure architectural alignment with data privacy, regulatory, and security standards.
Provide leadership and mentoring to data engineering and analytics teams on best practices in data modeling, metadata management, and data lifecycle management.
Contribute to data governance initiatives by enforcing standards, policies, and procedures for enterprise data.
QUALIFICATIONS/KNOWLEDGE:
Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
8+ years of experience in data architecture, data integration, or enterprise data management roles.
Proven experience in designing and implementing data solutions on SAP S/4 HANA, including integration with other enterprise systems.
Strong hands-on experience with SaaS platforms, including data extraction, modeling, and harmonization.
Deep understanding of Operational Data Stores and MDM design patterns, implementation, and governance practices.
Proficiency in data modeling tools (e.g., Erwin, SAP PowerDesigner), ETL tools (e.g., Business Objects Data Services, SAP Data Services), and integration platforms (e.g., MuleSoft).
Familiarity with cloud data architecture (e.g., AWS, Azure, GCP) and hybrid data environments.
Excellent communication and stakeholder management skills.
OTHER:
Experience with pharmaceutical, life sciences, or regulated industry environments.
Knowledge of data privacy regulations such as GDPR, HIPAA, and data compliance frameworks
Ability to travel domestically and internationally as needed for high priority projects
Novocure is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sexual orientation, gender identity, age, national origin, disability, protected veteran status or other characteristics protected by federal, state, or local law. We actively seek qualified candidates who are protected veteran and individuals with disabilities as defined under VEVRAA and Section 503 of the Rehabilitation Act.
Novocure is committed to providing an interview process that is inclusive of our applicant's needs. If you are an individual with a disability and would like to request an accommodation, please email
ABOUT NOVOCURE:
Our vision
Patient-forward: aspiring to make a difference in cancer.
Our patient-forward mission
Together with our patients, we strive to extend survival in some of the most aggressive forms of cancer by developing and commercializing our innovative therapy.
Our patient-forward values
innovation
focus
drive
courage
trust
empathy
#LI-RJ1
$91k-121k yearly (estimated)
Lead DevOps Engineer
Strategic Systems Inc. 4.4
Data engineer job in Cleveland, OH
Role: Lead DevOps Engineer
Duration: Direct Hire
The Lead DevOps Engineer will be responsible for growing, contributing and supporting the Infrastructure for all Safeguard Properties platforms and applications. The DevOps Engineer will ensure that all platforms are healthy and reliable and that alerting and monitoring are in place for all systems. The DevOps Engineer will also manage production deployments across all platforms, ensuring deployments are a “non-event.” The DevOps engineer will work closely with other IT team members, including the development team, to troubleshoot and resolve all hardware and software issues in addition to being part of a 24/7 on-call rotation.
Responsibilities and Expectations
Responsible for the planning, implementation, and growth of the AWS cloud infrastructure.
Windows OS EC2
AutoScaleGroup management
SQL Server on RDS
EC2 Image Builder
Hybrid Patch Management with SSM
Build, release, and manage the configuration of all production AWS systems.
Stay current with new AWS technology options and vendor products, continually evaluate which ones would be a good fit for the company.
Terraform
Design and build infrastructure using code
Automate infrastructure provisioning and configuration
Maintain and enhance the existing Terraform codebase
Develop and implement best practices for infrastructure as code
IAM Integration - Okta
Application Integration
Security policy configuration
Workflow automation
Troubleshooting
Building and maintaining technical relationships with influential technical decision makers.
Building, working, maintaining production applications in multiple areas of the business pipeline.
Design, implement, and manage CI/CD pipelines to support rapid development and deployment, working with the development teams.
Ensure that SLA for our production and supporting environments is maintained.
Continually seek opportunities to improve SLA/Uptime and minimize customer impacts.
Performing administration to include installs, upgrades, configurations, tuning and monitoring of Safeguard applications and third-party platforms.
Use judgment to develop and clarify expectations, scope, and scale to achieve shared objectives and minimize redundancy.
Manage several concurrent projects involving multiple stakeholders.
Provide Proof of Concept and prototyping as needed.
May be required to work irregular or non-standard hours.
Employees will be expected to attend regular Zoom collaboration sessions throughout the day.
All other duties as assigned.
Competencies
Adheres to Safeguard's core values and competencies
Customer Service = Resolution;
Teamwork
Integrity
Adaptability
Leadership
Employee Management
Project Management
Qualifications and Requirements
7+ years of sysadmin experience with Windows/Linux/Unix operating systems or cloud platforms, including system administration, networking concepts and protocols, and programming skills.
Demonstrated ability to adapt to new technologies and learn quickly.
Comfortable working in a production environment requiring 24x7 support and being part of an on-call rotation.
Strong experience with deploying, configuring, scaling, monitoring, and securing AWS EC2, ELB, RDS, S3, ECS, Fargate, EKS, Lambda, and SSM.
Experience with AWS CloudFormation, Terraform, or similar Infrastructure-as-Code (IaC) platforms, familiarity with IaC principles.
Experience with Container solutions. Docker, DC/OS, Kubernetes, Tectonic, OpenShift or similar.
Strong experience using DevOps tooling in the AWS environment including Ansible, GitHub, Jenkins, or similar.
Experience developing and maintaining CI/CD pipelines in GitHub Actions to automate testing, building, and deployment.
Experience designing AWS solutions inside the AWS Well-Architected Framework.
Strong experience with monitoring and alerting solutions, using tools such as CloudWatch, Nagios, Graphite, SolarWinds, PagerDuty, etc. (a brief illustrative sketch follows this list).
Strong experience with web application technology: IIS, Tomcat, Apache, Elasticsearch, NGINX, haproxy, etc.
Exceptional customer service orientation.
Ability to deliver high quality documentation paying attention to detail.
Scripting and automation experience: AWS API, Go, Bash, Python, Shell, PowerShell, Azure REST, or similar.
Experience with source control tooling, CI/CD tooling, and team management tooling.
Familiarity with ITIL/ITSM security management concepts and best practices.
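As a hedged illustration of the monitoring and alerting item above, the sketch below creates a CloudWatch alarm with boto3; the alarm name, instance ID, SNS topic ARN, and threshold are placeholders, and running it requires valid AWS credentials.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Alarm when average CPU on one EC2 instance stays above 80% for two
# consecutive 5-minute periods. All names, ARNs, and thresholds are
# illustrative placeholders, not values from the posting.
cloudwatch.put_metric_alarm(
    AlarmName="example-ec2-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,               # evaluate in 5-minute windows
    EvaluationPeriods=2,      # two breaching windows in a row
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    AlarmDescription="Illustrative alarm definition only.",
)
```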
$80k-103k yearly (estimated)
Data Scientist
6090-Johnson & Johnson Services Legal Entity
Data engineer job in Brunswick, OH
At Johnson & Johnson, we believe health is everything. Our strength in healthcare innovation empowers us to build a world where complex diseases are prevented, treated, and cured, where treatments are smarter and less invasive, and solutions are personal. Through our expertise in Innovative Medicine and MedTech, we are uniquely positioned to innovate across the full spectrum of healthcare solutions today to deliver the breakthroughs of tomorrow, and profoundly impact health for humanity. Learn more at *******************
Job Function:
Data Analytics & Computational Sciences
Job Sub Function:
Data Science
Job Category:
Scientific/Technology
All Job Posting Locations:
New Brunswick, New Jersey, United States of America
Job Description:
About Innovative Medicine
Our expertise in Innovative Medicine is informed and inspired by patients, whose insights fuel our science-based advancements. Visionaries like you work on teams that save lives by developing the medicines of tomorrow.
Join us in developing treatments, finding cures, and pioneering the path from lab to life while championing patients every step of the way.
Learn more at *******************/innovative-medicine
We are searching for the best talent for Data Scientist
Purpose:
As a Data Scientist on the Global Finance Data Science Team, you will contribute to the high-standard delivery of data-science-driven predictive financial statements consumed by senior leadership. You will work in a global team of Data Scientists, Data Engineers and Machine Learning Engineers to advance the Data Science and AI roadmap for J&J's Global Finance function. You will assist in delivering value-added insights and analytics to our finance and business leaders, reduce manual work through automated reasoning, and enhance user experience.
You will be responsible for:
You will mainly focus on advancing and broadening the capabilities of our Predictive Financial Statements, plugging results into SAP Analytics Cloud reporting - where most stakeholders go for corporate internal financial statements. Your job will also include aligning finance and business needs, validation of data from different source systems, and data reconciliation. You will mainly be required to create new models and leverage or extend the use of existing models for other financial statements. You will also enhance existing models for accuracy, speed and cost.
You will be involved in data science projects across their lifecycle, from design to production and adoption by end users. This will include creating proofs of concept for new projects, data science model development, data pipeline development and production deployment. The capabilities developed will include forecasting, descriptive analytics, data visualization, GenAI and decision support. This role will involve understanding the needs of business stakeholders and advocating the merits of data-driven analytics to provide viable solutions.
You will be responsible for:
Adopting highly successful forecasting processes and technologies and delivering predictive financial statements monthly.
Modeling the impact of future events to enhance forecast accuracy (see the illustrative sketch after this list).
Developing data pipelines for large datasets sourced from financial systems and automating data science processes.
Documenting and aligning model changes within the team and stakeholders.
Communicating insights to stakeholders leveraging data visualization tools.
Monitoring model performance, and continuously improving existing models.
Collaborating with finance, commercial leaders, technology teams, and external partners to deliver end-to-end solutions, ensuring compliance and risk management.
Advocating for data-driven insights and data science methods across the organization and managing compliance adherence.
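For illustration only (not J&J's actual models or data), here is a minimal Python sketch of trend-based forecasting on a monthly financial series; the series values and the three-month horizon are assumptions.

```python
import numpy as np
import pandas as pd

# Assumed example series: 24 months of a financial metric (illustrative numbers only).
months = pd.date_range("2024-01-01", periods=24, freq="MS")
revenue = pd.Series(
    100 + 1.5 * np.arange(24) + np.random.default_rng(0).normal(0, 2, 24),
    index=months,
)

# Fit a simple linear trend (month index -> value) and project 3 months ahead.
x = np.arange(len(revenue))
slope, intercept = np.polyfit(x, revenue.values, deg=1)

future_x = np.arange(len(revenue), len(revenue) + 3)
future_index = pd.date_range(months[-1] + pd.offsets.MonthBegin(1), periods=3, freq="MS")
forecast = pd.Series(intercept + slope * future_x, index=future_index)

print(forecast.round(1))
```

Real predictive financial statements would layer seasonality, business drivers, and event impacts on top of such a baseline trend.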
Qualifications / Requirements:
Minimum of 2 years of Data Science/ AI experience in an industry setting is required, preferably in a Finance or Healthcare setting.
Minimum of a Bachelor's degree is required, preferably in Science, Economics, Business Analytics, Data Science, Finance, Computer Science, Engineering or any other quantitative or STEM discipline.
Technical Requirements
Proficient in Python/R/Alteryx programming and experienced with Data Science Cloud platforms like AWS and Databricks or Domino.
Experienced using finance data and SAP HANA data tables is a plus.
Proficient in interpreting data sources and correlating to financial metrics.
Able to work independently and under time pressure to deliver results, investigating and solving data issues in an explainable way.
Skilled in data visualization and dashboarding using Tableau or Power BI.
Knowledgeable in statistical techniques and concepts, such as regression, properties of distributions, and statistical tests.
Strong data analytics skills and attention to detail.
Other:
The position will be located in New Brunswick, NJ and may require up to 10% travel.
Johnson & Johnson is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, disability, protected veteran status or other characteristics protected by federal, state or local law. We actively seek qualified candidates who are protected veterans and individuals with disabilities as defined under VEVRAA and Section 503 of the Rehabilitation Act.
Johnson & Johnson is committed to providing an interview process that is inclusive of our applicants' needs. If you are an individual with a disability and would like to request an accommodation, external applicants please contact us via *******************/contact-us/careers , internal employees contact AskGS to be directed to your accommodation resource.
#LI-Hybrid #JNJDataScience
Required Skills:
Artificial Intelligence (AI), Python (Programming Language)
Preferred Skills:
Advanced Analytics, Analytical Reasoning, Business Intelligence (BI), Business Writing, Coaching, Collaborating, Communication, Data Analysis, Data Compilation, Data Privacy Standards, Data Savvy, Data Science, Data Visualization, Econometric Models, Execution Focus, Technical Credibility, Technologically Savvy, Workflow Analysis
The anticipated base pay range for this position is $89,000 to $143,750 USD.
Additional Description for Pay Transparency:
Subject to the terms of their respective plans, employees and/or eligible dependents are eligible to participate in the following Company sponsored employee benefit programs: medical, dental, vision, life insurance, short- and long-term disability, business accident insurance, and group legal insurance. Subject to the terms of their respective plans, employees are eligible to participate in the Company's consolidated retirement plan (pension) and savings plan (401(k)). This position is eligible to participate in the Company's long-term incentive program. Subject to the terms of their respective policies and date of hire, employees are eligible for the following time off benefits:
Vacation - 120 hours per calendar year
Sick time - 40 hours per calendar year; for employees who reside in the State of Washington - 56 hours per calendar year
Holiday pay, including Floating Holidays - 13 days per calendar year
Work, Personal and Family Time - up to 40 hours per calendar year
Parental Leave - 480 hours within one year of the birth/adoption/foster care of a child
Condolence Leave - 30 days for an immediate family member; 5 days for an extended family member
Caregiver Leave - 10 days
Volunteer Leave - 4 days
Military Spouse Time-Off - 80 hours
Additional information can be found through the link below. *********************************************
$89k-143.8k yearly
Data Consultant - Employee Benefits
Oswald Company 4.2
Data engineer job in Cleveland, OH
Would you like to take ownership in a dynamic, high-growth business that truly walks the talk?
Oswald Companies seeks goal-driven professionals ready to take their career to the next level.
Responsible for performing and presenting data analysis for assigned accounts and completing special project deliverables in the Group Benefits Business Unit; conducts complex analysis and customized models on multi-national accounts across many lines of coverage and varying funding mechanisms. Supports sales growth and retention objectives.
A Day in The Life:
Manages large projects from inception and design through post-implementation assessment, working closely with internal departments and clients.
Attends client meetings as lead data consultant; presents to key business leaders to explain analytical findings and financial recommendations; demonstrates strong client-facing skills and engenders client's confidence in the data recommendations / findings.
Provides insight on various funding methodologies to include fully insured projections, carrier rating methodologies, self-insured stop-loss risk levels, and other alternative funding programs.
Analyzes data to identify plan utilization, financial trends, and comparative industry benchmarks as the basis for preparing financial projections, utilizing benchmarking resources and tools, developing COBRA rates, and making recommendations for plan design modification (a simplified projection sketch follows this list).
Ability to assess reasonableness of results of own work and establish strong peer review strategies; provides mentorship and coaching for Analysts and Senior Analysts.
Interfaces with colleagues from other practices and markets on assignments that reflect the client's broader business issues.
Collaborates with Client Executive to drive strategic direction with data-driven analysis; works closely to identify and develop solutions to minimize risk and maximize cost effectiveness.
Accesses the data of existing clients to conduct a financial analysis of relevant information to aid in the evaluation of plan performance, both current and projected experience.
Assists in the RFP process for new business and participates in finalist meetings as the Analytics SME.
Takes ownership of an assigned book of business. Provides internal teams with accurate deliverables in a timely fashion to meet client expectations; maintains a concise and consistent level of communication within the team.
Collaborates with other Data Analytics team members to share knowledge and contribute to the design and utilization of client deliverables; remains current on compliance regulations and incorporates that knowledge into existing data analytics tools; builds team synergy for department effectiveness.
Prioritizes and manages workload effectively, thinks through issues and determines alternative solutions to meet deadlines and improve personal productivity.
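A deliberately simplified, illustrative Python sketch of trending claims forward for a projection; real benefits-funding analysis relies on carrier rating methodologies, credibility weighting, and plan-specific factors not shown here, and every figure below is a placeholder.

```python
# Illustrative only: project next plan year's medical claims with a flat
# annual trend factor. All inputs are assumed placeholder values.
current_paid_claims = 1_250_000.00   # assumed current plan-year paid claims
annual_trend = 0.07                  # assumed 7% medical trend
enrolled_employees = 450             # assumed enrollment

# Apply one year of trend to the current experience.
projected_claims = current_paid_claims * (1 + annual_trend)
# Express the projection on a per-employee-per-month (PEPM) basis.
projected_pepm = projected_claims / (enrolled_employees * 12)

print(f"Projected claims: ${projected_claims:,.0f}")
print(f"Projected PEPM:   ${projected_pepm:,.2f}")
```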
What You'll Need:
Bachelor's degree in actuarial science, Finance, Mathematics, Economics, Statistics, or related field from a four-year college or university
Ten or more years of related experience or the equivalent combination of education and experience.
Prior Experience Specifically Required
Prior experience gathering, manipulating, and deciphering data required.
Prior experience in delivering analytical findings and recommendations required.
Prior experience working in a group benefits environment required.
Prior analytical and research experience required.
Who You Are:
The specific personal traits required to accomplish the essential duties of this job successfully include:
Strong attention to detail particularly with mathematical information
Strong organizational skills with the ability to prioritize accordingly.
Ability to focus on work-at-hand; not easily distracted.
Exceptional written and verbal communication skills
Demonstrated resourcefulness; works well independently and on a team.
Manages stress well.
Self-confident with capable interpersonal skills
Strong client-facing and presentation abilities
Who is Oswald?
Oswald is a 129-year-old company that creates a world of protection around the lives and businesses of our clients.
We are an independent, employee-owned company. So, essentially, you own your own success in a personally and financially rewarding opportunity.
Inclusivity is a priority. We foster an environment of collaboration and belonging where our Employee-Owners thrive on their unique path. Our diverse talent reflects the communities and clients we serve, while driving unmatched risk and insurance innovations.
Our people-first culture and client service excellence have built our reputation of integrity, resourcefulness, and a relentless care for our clients and employees. Don't believe us? Ask your friends, colleagues, and mentors about Oswald. There's a reason Oswald has been named a Top Workplace for nine consecutive years.
What you'll get...
At Oswald, you will have the opportunity to build a long-term career with unlimited growth potential. Aim high, work hard and we'll help you achieve your goals.
At Oswald, you will experience our caring work environment. We care about our Employee-Owners, we care about our customers, and we care about the world around us. Our caring personality comes to life in the form of volunteering in the community. We even give employees paid time off to volunteer with an organization of their choice.
At Oswald, you will achieve a work-life balance. We care about your physical and emotional well-being, so work-life balance is encouraged and practiced. We understand you have a life outside of work, and we want you to live it.
At Oswald, you will have access to a world-class Total Rewards package. We truly value our people, which shows in our compensation, benefits, and perks.
In addition to competitive pay, we have designed a performance-based annual incentive program. All employees are eligible to earn a bonus by meeting performance objectives.
Comprehensive medical, dental and vision plans and numerous supplemental benefit offerings.
Paid time off annually and a sabbatical at every 10-year service anniversary.
Ownership in the company in the form of company stock (discretionary profit-sharing and 401(k) match contribution)
Assistance with parking expenses, discount programs for area services/experiences, and financial support for professional development and licensure/designations
Access to specialized leadership development programming designed to take your career to the next level.
And so much more!
To learn more about Oswald, our culture and everything we have to offer, visit us on LinkedIn.
Oswald, a Unison Risk Advisors company, is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
$72k-96k yearly (estimated)
Data Engineer
GD Information Technology
Data engineer job in Fairview Park, OH
Type of Requisition:
Regular
Clearance Level Must Currently Possess:
None
Clearance Level Must Be Able to Obtain:
None
Public Trust/Other Required:
None
Job Family:
Data Science and Data Engineering
Job Qualifications:
Skills:
Business Functions, Data Science, Data Solutions
Certifications:
None
Experience:
5 + years of related experience
US Citizenship Required:
No
Job Description:
Own your opportunity to turn data into measurable outcomes for our customers' most complex challenges. As a Data Engineer at GDIT, you'll power innovation to drive mission impact and grow your expertise to power your career forward. We are a fast-growing AI and Data team within the Global CIO organization at GDIT. We design and deliver enterprise-grade solutions that integrate AI, data, and human-AI collaboration workflows across key business functions (e.g., Growth, Finance, HR, Legal, and Supply Chain). We are looking for a highly skilled Data Engineer (Agentic AI) who shares a passion for delivering data engineering for agentic AI applications and building an AI-first data ecosystem foundational to enterprise AI transformation.
MEANINGFUL WORK AND PERSONAL IMPACT:
Design, build, and operate scalable end-to-end data pipelines and curated data products that support enterprise analytics and agentic AI use cases
Integrate data from enterprise systems and external sources, including structured, semi-structured, and unstructured data
Deliver reliable data services for agentic AI workflows, including APIs, retrieval/indexing, and governed context delivery for AI agents (a minimal retrieval sketch follows this list)
Implement data quality, observability, and governance best practices across data pipelines and products
Optimize performance and cost across storage, compute, orchestration, and serving layers
Collaborate with cross-functional teams, including business stakeholders, AI engineers, and software developers, to translate requirements into production solutions
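As a hedged sketch of governed context retrieval for AI agents (not GDIT's actual architecture), the example below uses TF-IDF similarity as a stand-in for an embedding or vector index; the documents and query are placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Assumed, curated snippets an agent would be allowed to retrieve from.
documents = [
    "FY25 travel policy: economy class required for flights under 6 hours.",
    "Supplier onboarding requires a completed risk questionnaire and W-9.",
    "Quarterly revenue is reported in the consolidated finance data product.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Return the top_k snippets most similar to an agent's query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix)[0]
    ranked = scores.argsort()[::-1][:top_k]
    return [documents[i] for i in ranked]

print(retrieve("what does the travel policy say about flights?"))
```

A production system would replace TF-IDF with governed embeddings, access controls, and lineage, as the posting's bullets describe.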
WHAT YOU'LL NEED TO SUCCEED
Bring your expertise and drive for innovation to GDIT. The Data Engineer must have:
Education: Bachelor's degree in Computer Science/Engineering, Data Science, or a related field
Experience: 5+ years of experience delivering production-grade data engineering across databases, data integration, data services, and data governance
Role requirements: Proficiency in programming languages (Python or Java) and databases (SQL and NoSQL). Strong collaboration and communication skills in cross-functional enterprise environments.
Preferred Skills and Abilities: Master's degree in Computer Science/Engineering, Data Science, or a related field. Relevant certifications in Data Engineering, AI, or Cloud. Experience delivering data engineering for agentic AI applications. Experience with retrieval-based AI data foundations (document processing, metadata, embeddings, vector or hybrid search). Familiarity with agent workflows and how agents interact with data services and tools in production. Experience with lakehouse architectures and cloud data platforms such as Azure (preferred), OCI (preferred), or AWS. Experience in real-time streaming applications or other high-velocity solutions. Experience leveraging AI tools to improve data engineering productivity and quality in coding, testing, and documentation.
Location: This is a hybrid position working at our GDIT facility in Falls Church, VA. Must be comfortable working (3) days a week onsite.
US Persons required
GDIT IS YOUR PLACE
At GDIT, the mission is our purpose, and our people are at the center of everything we do.
Growth: AI-powered career tool that identifies career steps and learning opportunities
Support: An internal mobility team focused on helping you achieve your career goals
Rewards: Comprehensive benefits and wellness packages, 401K with company match, and competitive pay and paid time off
Flexibility: Full-flex work week to own your priorities at work and at home
Community: Award-winning culture of innovation and a military-friendly workplace
OWN YOUR OPPORTUNITY
Explore a career in data science and engineering at GDIT and you'll find endless opportunities to grow alongside colleagues who share your determination for solving complex data challenges.
The likely salary range for this position is $119,000 - $161,000. This is not, however, a guarantee of compensation or salary. Rather, salary will be set based on experience, geographic location and possibly contractual requirements and could fall outside of this range.
Scheduled Weekly Hours:
40
Travel Required:
10-25%
Telecommuting Options:
Hybrid
Work Location:
USA VA Falls Church
Additional Work Locations:
Total Rewards at GDIT:
Our benefits package for all US-based employees includes a variety of medical plan options, some with Health Savings Accounts, dental plan options, a vision plan, and a 401(k) plan offering the ability to contribute both pre and post-tax dollars up to the IRS annual limits and receive a company match. To encourage work/life balance, GDIT offers employees full flex work weeks where possible and a variety of paid time off plans, including vacation, sick and personal time, holidays, paid parental, military, bereavement and jury duty leave. To ensure our employees are able to protect their income, other offerings such as short and long-term disability benefits, life, accidental death and dismemberment, personal accident, critical illness and business travel and accident insurance are provided or available. We regularly review our Total Rewards package to ensure our offerings are competitive and reflect what our employees have told us they value most.
We are GDIT. A global technology and professional services company that delivers consulting, technology and mission services to every major agency across the U.S. government, defense and intelligence community. Our 30,000 experts extract the power of technology to create immediate value and deliver solutions at the edge of innovation. We operate across 50 countries worldwide, offering leading capabilities in digital modernization, AI/ML, Cloud, Cyber and application development. Together with our clients, we strive to create a safer, smarter world by harnessing the power of deep expertise and advanced technology. Join our Talent Community to stay up to date on our career opportunities and events at
gdit.com/tc.
Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans
$119k-161k yearly
Data Technology Lead
Westfield Group, Insurance
Data engineer job in Westfield Center, OH
The role is part of the Data, Analytics and Reporting team. This role leads a team of data engineers and data testers to deliver secure, scalable, and high-quality data solutions that support analytics, reporting, and business operations. The Data Technology Lead collaborates with stakeholders to understand technology and data requirements, implements best practices for data governance and testing, and drives innovation in data engineering. The position involves managing modern cloud-based platforms while fostering a culture of continuous improvement and technical excellence within the IT team.
Job Responsibilities
* Data Architecture & Engineering: Design, develop, and maintain robust data pipelines and architectures for structured and unstructured data; optimize workflows across Azure Data Lake, Snowflake, and other environments; implement best practices for data modeling and transformation using dbt.
* Team Leadership: Lead and mentor data engineers and testers; manage workload distribution; foster collaboration and innovation.
* Testing & Quality Assurance: Establish data testing frameworks; ensure data accuracy and reliability; integrate testing into broader QA processes (a minimal example follows this list).
* Collaboration & Stakeholder Engagement: Partner with analytics, BI, and business teams to deliver data solutions; provide technical guidance.
* Vendor & Tool Management: Evaluate and select tools and vendors; negotiate contracts and manage relationships.
* Business Continuity: Develop and maintain disaster recovery and business continuity plans for data systems.
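A minimal, illustrative Python sketch of the kind of automated data test the Testing & Quality Assurance item refers to, written as plain pandas assertions; the table and column names are assumptions rather than the actual schema.

```python
import pandas as pd

# Assumed example of a curated policy table; in practice this would be
# queried from Snowflake or the data lake rather than constructed inline.
policies = pd.DataFrame(
    {
        "policy_id": ["P-001", "P-002", "P-003"],
        "premium": [1200.0, 950.0, 1480.0],
        "effective_date": pd.to_datetime(["2025-01-01", "2025-02-15", "2025-03-01"]),
    }
)

def run_data_tests(df: pd.DataFrame) -> None:
    # Primary-key style checks: unique, non-null policy identifiers.
    assert df["policy_id"].notna().all(), "null policy_id found"
    assert df["policy_id"].is_unique, "duplicate policy_id found"
    # Business-rule checks: positive premiums, populated effective dates.
    assert (df["premium"] > 0).all(), "non-positive premium found"
    assert df["effective_date"].notna().all(), "null effective_date found"

run_data_tests(policies)
print("all data quality checks passed")
```

In practice these checks would live in a testing framework (or dbt tests) and run as part of the pipeline rather than ad hoc.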
Job Qualifications
* 7+ years of experience in data engineering, with at least 2 years in a data leadership role.
* Insurance industry experience required
* Bachelor's degree in Computer Science, Information Technology, or a related field and/or commensurate experience.
* Master's degree in related field is preferred.
* Proficiency in SQL.
* Optional skills in Python and experience with modern data frameworks (e.g., Spark).
* Expertise in Snowflake, Azure Data Lake, dbt, and modern data platforms.
* Strong experience in data integration, data warehousing, and data lake architectures.
* Experience with Azure DevOps, CI/CD pipelines, and Git for code management
* Awareness of Generative AI (GenAI) capabilities to accelerate development and testing processes.
* Familiarity with data testing methodologies and tools.
* Excellent leadership, communication, and problem-solving skills.
Behavioral Competencies
* Directs work
* Collaborates
* Develops talent
* Customer focus
* Communicates effectively
* Ensures accountability
* Decision quality
* Business insight
* Nimble learning
* Builds effective teams
* Manages complexity
Technical Skills
* Technical Support
* Operating Systems
* Workflow Management
* Budgeting
* Disaster Recovery
* Process Improvement
* Project Management
* IT Strategy & Framework
* IT Regulatory Compliance
* Stakeholder Management
This job description describes the general nature and level of work performed in this role. It is not intended to be an exhaustive list of all duties, skills, responsibilities, knowledge, etc. These may be subject to change and additional functions may be assigned as needed by management.
$92k-127k yearly (estimated)
GTM Data Engineer
Partssource 4.4
Data engineer job in Cleveland, OH
PartsSource is the leading technology and software platform for managing mission-critical healthcare equipment. Trusted by over 5,000 US hospitals and 15,000 clinical sites, PartsSource empowers providers and service organizations to maximize clinical availability for patient care and automates the procurement of parts, services and training through a unique digital experience.
PartsSource team members are deeply committed to our mission of Ensuring Healthcare is Always On, which is foundational to our success and growth. Our vibrant culture is built upon aligned values, shared ownership, mutual respect, and a passion for collaborating to solve complex customer problems.
About the Job Opportunity
The GTM Data Engineer is responsible for building and maintaining a single, trusted customer and revenue data foundation across Marketing, Sales, and Customer Success. This role partners closely with Revenue Operations to ensure all GTM teams operate from a consistent source of truth for pipeline, revenue, retention, and growth. You will own how GTM data is structured, enriched, validated, and made available, eliminating data ambiguity and enabling confident, data-driven decision making.
What You'll Do
GTM Data Modeling & Governance
(Technology - Data Engineering: Data Modeling & Architecture, Data Quality & Governance)
Design and maintain the canonical customer, account, and revenue data model across GTM systems
Resolve identity across contacts, accounts, users, assets, services, and subscriptions (a minimal resolution sketch follows this list)
Define authoritative objects and metrics for pipeline, bookings, renewals, expansion, and churn
Ensure historical accuracy, data lineage, and consistent metric definitions
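A minimal pandas sketch of contact-to-account identity resolution by normalized email domain, for illustration only; the records, field names, and the one-account-per-domain rule are simplifying assumptions, not PartsSource's actual model.

```python
import pandas as pd

# Assumed contact records from two GTM systems (CRM + marketing automation).
contacts = pd.DataFrame(
    {
        "source": ["crm", "marketing", "crm"],
        "email": ["Jane.Doe@AcmeHealth.com", "jane.doe@acmehealth.com", "bob@mercy.org"],
        "account_name": ["Acme Health", "ACME HEALTH INC", "Mercy Hospital"],
    }
)

# Normalize join keys so the same person/account resolves to one identity.
contacts["email_norm"] = contacts["email"].str.strip().str.lower()
contacts["domain"] = contacts["email_norm"].str.split("@").str[1]

# Simplifying rule: one canonical account per email domain.
canonical = (
    contacts.groupby("domain", as_index=False)
    .agg(
        canonical_account=("account_name", "first"),
        contact_count=("email_norm", "nunique"),
    )
)

print(canonical)
```

A production model would add survivorship rules, fuzzy account matching, and explicit conflict resolution rather than a first-value pick.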
Data Enrichment, Integration & Pipelines
(Technology - Data Engineering: ETL & Data Integration, Data Pipeline Development)
Build and manage data pipelines across CRM, marketing automation, services, and financial systems
Identify data gaps and implement enrichment strategies to improve completeness and usability
Merge datasets into unified customer and account views with clear conflict-resolution rules
Own schema changes, backfills, reprocessing, and validation as systems evolve
Attribution, Revenue Logic & Reporting Enablement
(Sales Revenue Operations: Performance Metrics & Reporting, Sales Analytics)
Implement approved attribution and revenue logic consistently across channels and time periods
Validate sourced, influenced, and assisted revenue before executive reporting
Enable trusted funnel, pipeline, retention, and expansion reporting within systems of record
Reduce reliance on spreadsheets and manual reconciliation
GTM Architecture, CDP & AI Readiness
(Technology - Systems & Applications: Systems Integration, Systems Thinking)
Support a warehouse-centric or composable CDP approach for GTM data
Partner with GTM leadership to evolve long-term data architecture
Prepare high-quality, LLM-ready datasets for AI-enabled GTM workflows
Ensure access controls, privacy, and compliance requirements are met
What You'll Bring
Your Background
5+ years in data engineering, analytics engineering, or GTM data roles
Strong experience with CRM and GTM data models
Advanced SQL skills and experience with modern data stacks and ETL tools
Experience supporting attribution, lifecycle, and revenue reporting
Familiarity with Customer Data Platforms or warehouse-centric CDP approaches
Ability to work cross-functionally with Marketing, Sales, Customer Success, Finance, and RevOps
Who We Want to Meet
Act Like an Owner - Accountability & Execution: You take full ownership of GTM data quality and follow through to reliable outcomes.
Serve with Purpose - Business Impact: You connect data architecture decisions to revenue visibility and GTM effectiveness.
Adapt to Thrive - Managing Ambiguity: You remain productive amid evolving systems, definitions, and priorities.
Collaborate to Win - Influence & Communication: You partner effectively with RevOps and GTM teams to align on shared metrics.
Challenge the Status Quo - Data-Informed Decision Making: You use evidence and clarity to replace assumptions and debates.
Benefits & Perks
Competitive compensation package with salary, incentives, company ownership/equity, and comprehensive benefits (401k match, health, college debt reduction, and more!)
Career and professional development through training, coaching and new experiences.
Hybrid culture with new & beautiful workspaces that balance flexibility, collaboration, and productivity.
Inclusive and diverse community of passionate professionals learning and growing together.
Interested?
We'd love to hear from you! Submit your resume and an optional cover letter explaining why you'd be a great fit.
About PartsSource
Since 2001, PartsSource has evolved into the leading technology and software platform for managing mission-critical equipment, serving over half of the U.S. hospital infrastructure. Our digital systems modernize and automate the procurement of parts, services, technical support, and training for HTM professionals to efficiently and effectively maintain their mission-critical equipment. PartsSource employs over 700 employees nationwide who are committed to supporting healthcare providers and ensuring healthcare is always on.
In 2021, Bain Capital invested in the business, further accelerating our growth and positive impact within the healthcare industry.
Read more about us here:
· PartsSource Named to Newsweek's List of the Top 200 America's Most Loved Workplaces for 2024
· PartsSource Named Among the Top 50 Healthcare Technology Companies of 2025
· PartsSource Named Among the Top 25 Healthcare Software Companies of 2025
· PartsSource President and CEO Philip Settimi Named to Top 50 Healthcare Technology CEO List 2025
· WSJ: Bain Capital Private Equity Scoops Up PartsSource
EEO PartsSource, Inc., and its affiliates and subsidiaries, provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Legal authorization to work in the U.S. is required.
$86k-122k yearly (estimated)
Junior Data Engineer
Qode
Data engineer job in Cleveland, OH
PNC Bank is seeking a Junior Data Engineer to support the design, development, and maintenance of scalable data pipelines and data platforms that enable analytics, reporting, and regulatory compliance. This role is ideal for early-career professionals eager to build hands-on experience in enterprise data engineering within the financial services domain.
Key Responsibilities
Assist in building and maintaining ETL/ELT pipelines to ingest, transform, and load data from multiple source systems.
Support development of batch and near real-time data processing workflows.
Work with structured and semi-structured data using SQL and Python.
Participate in data validation, reconciliation, and quality checks to ensure accuracy and completeness (a minimal example follows this list).
Collaborate with senior data engineers, data analysts, and business stakeholders to understand data requirements.
Help manage data storage solutions such as data warehouses and data lakes.
Assist with documentation of data models, pipelines, and operational processes.
Follow data governance, security, and compliance standards relevant to banking and financial services.
Monitor data pipelines and troubleshoot failures under guidance.
Support deployment and version control using Git and CI/CD practices.
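A minimal, illustrative Python sketch of the validation and reconciliation check mentioned above; the source and target frames stand in for real extracts from a source system and the warehouse.

```python
import pandas as pd

# Stand-ins for real extracts; in practice these would come from SQL queries
# against the source system and the warehouse load.
source = pd.DataFrame({"txn_id": [101, 102, 103, 104], "amount": [25.0, 40.5, 13.2, 99.9]})
target = pd.DataFrame({"txn_id": [101, 102, 104], "amount": [25.0, 40.5, 99.9]})

# Reconciliation: row counts, missing keys, and total-amount drift.
missing_ids = set(source["txn_id"]) - set(target["txn_id"])
amount_diff = source["amount"].sum() - target["amount"].sum()

print(f"source rows: {len(source)}, target rows: {len(target)}")
print(f"missing txn_ids in target: {sorted(missing_ids)}")
print(f"amount difference: {amount_diff:.2f}")
```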
Required Qualifications
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
1-3 years of experience or strong academic/project experience in data engineering or data analytics.
Proficiency in SQL (joins, subqueries, performance tuning basics).
Working knowledge of Python for data processing.
Basic understanding of ETL concepts, data modeling, and data warehousing.
Familiarity with relational databases (Oracle, PostgreSQL, SQL Server, or similar).
Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
Experience with version control tools such as Git.
Preferred / Nice-to-Have Skills
Exposure to Big Data technologies (Spark, Hadoop).
Familiarity with cloud data services (AWS S3, Glue, Redshift, Azure Data Factory, Snowflake).
Understanding of banking or financial data, including transactions, risk, or regulatory reporting.
Knowledge of data quality frameworks and basic data governance concepts.
Experience with workflow orchestration tools (Airflow, Control-M).
Soft Skills
Strong analytical and problem-solving skills.
Willingness to learn and adapt in a regulated environment.
Good communication and documentation skills.
Ability to work effectively in a team-oriented, Agile environment.
$78k-106k yearly (estimated)
Data Engineer
Rockwool
Data engineer job in Cleveland, OH
Ready to help build a better future for generations to come? In an ever-changing, fast paced world, we owe it to ourselves and our future generations to live life responsibly. At ROCKWOOL, we work relentlessly to enrich modern living through our innovative stone wool solutions.
Join us and make a difference!
Your future team
You will join our Factory of the Future team within Group R&D - a passionate community of specialists who develop and implement the latest technologies for stone wool production.
Our team's mission is to transform the way we operate by building next-generation OT & IT solutions that drive smarter, more efficient, and more sustainable manufacturing. As part of this journey, we are expanding our Data Science and Engineering capabilities and you could be a key part of it.
What you will be doing
* Design, build, and automate data pipelines for industrial process analysis (see the brief sketch after this list)
* Create systems for efficient storage and access to large-scale operational data
* Ensure data quality, reliability, and accessibility for ML model training and deployment
* Develop real-time visualizations and dashboards to monitor industrial operations
* Engineer data infrastructure for deploying ML solutions both on-premises and in the cloud
* Evaluate and validate new technologies in production environments
* Collaborate closely with Data Scientists and stakeholders to align technology across all levels of the organization
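A brief, illustrative Python sketch of the first bullet above: batching raw sensor readings into an analysis-friendly table by resampling to one-minute averages; the signal names, frequencies, and CSV output are assumptions.

```python
import numpy as np
import pandas as pd

# Assumed raw readings; in production these might stream from Kafka or land as files.
rng = np.random.default_rng(1)
raw = pd.DataFrame(
    {
        "timestamp": pd.date_range("2025-01-01 08:00", periods=600, freq="s"),
        "melt_temp_c": 1450 + rng.normal(0, 5, 600),
        "line_speed_mpm": 60 + rng.normal(0, 1, 600),
    }
)

# Resample second-level readings to one-minute averages for ML training tables.
per_minute = (
    raw.set_index("timestamp")
    .resample("1min")
    .mean()
    .reset_index()
)

per_minute.to_csv("line1_per_minute.csv", index=False)
print(per_minute.head())
```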
What you bring
* 2+ years of experience in data engineering, preferably in a production or industrial setting
* Degree in Computer Science, Computer Engineering, or a related technical field (or equivalent practical foundations)
* Experience with relational and non-relational databases (SQL, NoSQL, InfluxDB)
* Solid programming skills in Python
* Strong knowledge of data pipelines and ELT processes
* Proficiency in data warehousing and data lake technologies (on-prem & cloud)
* Hands-on experience with on-premise data infrastructure
* Experience with big data frameworks (Kafka, Apache Spark, Flink)
* Understanding of IT/OT convergence and data quality management
* Familiarity with cloud platforms (AWS, Azure, or GCP)
* Experience with DevOps tools
* Familiarity with Linux environments
Tools and technologies you'll work with
* Data storage: PostgreSQL, MS SQL Server, MongoDB, S3/MinIO
* Transformation & orchestration: DBT, Apache Airflow, Luigi
* Big data: Apache Kafka, Apache Flink, Apache Spark
* Cloud platforms: AWS S3, Azure Blob Storage, GCP Cloud Storage
* Visualization: Grafana, Apache Superset, Power BI
* DevOps: Git, Docker, CI/CD, OpenShift
What we offer
By joining our team, you become a part of the people-centric work environment of a Danish company. We offer you a competitive salary, permanent contract after the probation period, development package, team building events, activity-based office in Poznan's city center in the new prestigious office building - Nowy Rynek. The building is recognized as a building without barriers, which means that it is fully adapted to the needs of people with disabilities.
Our compensation package on employment contracts includes:
* An office-first approach: home office is available up to 1 day per week
* Adaptable Hours: start your workday anytime between 7:00 AM and 9:00 AM
* Home office subsidy
* Private Medical Care
* Multikafeteria MyBenefit
* Wellbeing program
* Extra Day Off for voluntary activities
… and while in the office you can also use modern office space with beautiful view and high standard furniture, bicycle parking facilities & showers, chill-out rooms with PlayStation, football table, pool table, board games, subsidized canteen with delicious food & fruit.
Interested?
If you recognize yourself in this profile and challenge, we kindly invite you to apply with CV written in English.
Who we are
We are the world leader in stone wool solutions. Founded in 1937 in Denmark, we transform volcanic rock into safe, sustainable products that help people and communities thrive. We are a global company with more than 12,200 employees, located in 40+ countries with 42 manufacturing facilities… all focused on one common purpose - to release the natural power of stone to enrich modern living.
Sustainability is central to our business strategy. ROCKWOOL was one of the first companies to commit to actively contributing to the United Nations Sustainable Development Goals (SDGs) framework and are actively committed to 11 SDGs, including SDG 14, Life Below Water. Through our partnership with the One Ocean Foundation and in connection with our sponsorship of the ROCKWOOL Denmark SailGP team, we will help raise awareness around ocean health challenges in an effort to accelerate solutions to protect it.
Diverse and Inclusive Culture
We want all our people to feel valued, respected, included and heard. We employ 79 different nationalities worldwide and are committed to providing equal opportunities to all employees, promote diversity, and work against all forms of discrimination among ROCKWOOL employees.
At ROCKWOOL, you will experience a friendly team environment. Our culture is very important to us. In fact, we refer to our culture as "The ROCKWOOL Way". This is the foundation on which we operate, based upon our values of ambition, responsibility, integrity and efficiency.
$78k-106k yearly est. 60d+ ago
Senior Data Engineer - N.I.
Cleveland Clinic 4.7
Data engineer job in Cleveland, OH
Join the Cleveland Clinic team where you will work alongside passionate caregivers and make a lasting, meaningful impact on patient care. Here, you will receive endless support and appreciation while building a rewarding career with one of the most respected healthcare organizations in the world.
The Senior Data Engineer will extract, transform, and integrate data from diverse clinical systems, including Epic Clarity, third-party APIs, and other internal and external sources, using SQL, Python, and additional programming languages as needed. The role ensures seamless integration of structured, semi-structured, and unstructured data within Snowflake and Databricks environments; familiarity with REDCap databases is considered a plus.
This position plays a key role in organizing, managing, and optimizing the organization's data ecosystem, enabling enhanced decision-making across clinical operations and research. You will be responsible for designing, building, and maintaining scalable data pipelines, data warehouses, and data lakes that support clinical and financial reporting.
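For illustration only, here is a minimal sketch of the kind of extract-and-load task described above, assuming a recent snowflake-connector-python with the pandas extras installed; the API endpoint, credentials, and table names are hypothetical placeholders, not an actual Cleveland Clinic integration.

```python
# Minimal sketch: pull JSON from a hypothetical third-party API and land it in
# a Snowflake table with the official Python connector. Endpoint, credentials,
# payload shape, and table names are placeholders, not a real integration.
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

API_URL = "https://api.example-vendor.com/v1/lab_results"  # hypothetical endpoint


def extract() -> pd.DataFrame:
    resp = requests.get(API_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
    resp.raise_for_status()
    # Assumes the payload wraps records in a "results" key.
    return pd.json_normalize(resp.json()["results"])


def load(df: pd.DataFrame) -> None:
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ANALYTICS_WH", database="RAW", schema="VENDOR",
    )
    try:
        # Appends the DataFrame; auto_create_table creates the table if missing
        # (supported in recent connector versions).
        write_pandas(conn, df, table_name="LAB_RESULTS", auto_create_table=True)
    finally:
        conn.close()


if __name__ == "__main__":
    load(extract())
```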
The role is full time (40 hours per week), Monday through Friday from 8:00 a.m. to 5:00 p.m., with minimal weekend or holiday work and occasional on-call coverage as needed. The caregiver must work onsite at Cleveland Main Campus, with eligibility for a hybrid work arrangement after 90 days.
A caregiver who excels in this role will:
* Analyze, design and coordinate the development of software systems.
* Develop new software and proof/test them to assure production of a quality product.
* Analyze current programs and processes.
* Make recommendations for more cost-effective products and better streamlined workflows.
* Define and implement high performance software by leveraging a strong understanding of embedded hardware design.
* Provide technical support to other Developers when project support is needed, including the design of relational databases and client-side programming strategies using the latest HTML, CSS and JavaScript frameworks.
* Determine and communicate the implications of system-level decisions on subsystems and components and mitigate issues.
* Translate clinical problems into innovative healthcare solutions.
* Gather business and application specific requirements.
* Ensure all requirements are met as well as maintained within defined project scope.
* Determine whether a particular problem is caused by hardware, operating system software, application programs or network failures, and provide support as necessary.
Minimum qualifications for the ideal future caregiver include:
* Bachelor's Degree in Computer Science, Computer Engineering or a related field and 10 years of software development experience, including healthcare software development
* OR High School Diploma/GED and 15 years of experience
* OR Associate's Degree and 13 years of experience
* ITIL Foundations Certification within six months of hire
* Proven track record of enterprise architecture experience with large volume, high availability enterprise applications.
* Experience with the Microsoft .NET technology stack and relational database design
* Microsoft C#, iOS Objective-C, VBScript, Visual Basic, ColdFusion, Microsoft T-SQL, JavaScript, CSS3 and/or HTML5 experience
* jQuery, jQuery Mobile, KnockoutJS, UnderscoreJS and/or YUI Compressor experience
* Experience in interfacing with internal and external customers
* SOA development using SOAP and REST; mobile application development and associated libraries experience
Preferred qualifications for the ideal future caregiver include:
* Experience with hospital or medical systems
Physical Requirements:
* Ability to perform work in a stationary position for extended periods.
* Ability to travel throughout the hospital system.
* Ability to operate a computer and other office equipment.
* Ability to communicate and exchange accurate information.
Personal Protective Equipment:
* Follows Standard Precautions using personal protective equipment as required for procedures.
Pay Range
Minimum Annual Salary: $92,620.00
Maximum Annual Salary: $141,265.00
The pay range displayed on this job posting reflects the anticipated range for new hires. A successful candidate's actual compensation will be determined after taking factors into consideration such as the candidate's work history, experience, skill set and education. The pay range displayed does not include any applicable pay practices (e.g., shift differentials, overtime, etc.). The pay range does not include the value of Cleveland Clinic's benefits package (e.g., healthcare, dental and vision benefits, retirement savings account contributions, etc.).
$92.6k-141.3k yearly 4d ago
Senior Data Engineer
Advance Local 3.6
Data engineer job in Cleveland, OH
**Advance Local** is looking for a **Senior Data Engineer** to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with the data product team and business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.
The base salary range is $120,000 - $140,000 per year.
**What you'll be doing:**
+ Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
+ Partner with platform owners across business units to establish and maintain data integrations from third party systems into the central data platform.
+ Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control and disaster recovery capabilities.
+ Design and implement API integrations and event-driven data flows to support real-time and batch data requirements (see the ingestion sketch after this list).
+ Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
+ Partner with the Data Architect and data product to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
+ Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
+ Support rapid prototyping of new data products in collaboration with data product by building flexible, reusable data infrastructure components.
+ Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
+ Collaborate with data product, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
+ Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources.
+ Develop and maintain comprehensive documentation for data engineering processes and systems, architecture, integration patterns, and runbooks.
+ Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
+ Stay current with emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
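As referenced in the list above, here is a minimal sketch of an event-driven ingestion step, assuming an AWS Lambda function behind an API Gateway proxy integration; the bucket, prefix, and payload shape are illustrative assumptions, not Advance Local's actual architecture.

```python
# Minimal sketch: an AWS Lambda handler that lands webhook payloads from a
# hypothetical marketing platform in S3 as JSON for later loading into
# Snowflake. Bucket, prefix, and payload shape are illustrative assumptions.
import json
import uuid
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-landing"   # hypothetical bucket
PREFIX = "cdp_events/raw"         # hypothetical prefix


def handler(event, context):
    # API Gateway proxy integrations deliver the webhook body as a string.
    payload = json.loads(event.get("body") or "{}")
    now = datetime.now(timezone.utc)
    key = f"{PREFIX}/{now:%Y/%m/%d}/{uuid.uuid4()}.json"
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(payload).encode("utf-8"),
        ContentType="application/json",
    )
    return {"statusCode": 202, "body": json.dumps({"stored_at": key})}
```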
**Our ideal candidate will have the following:**
+ Bachelor's degree in computer science, engineering, or a related field
+ Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
+ Expert proficiency in Snowflake data engineering patterns
+ Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
+ Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
+ Proven ability to work with third party APIs, webhooks, and data exports
+ Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
+ Proven ability to design and implement API integrations and event-driven architecture
+ Experience with data modeling, data warehousing, and ETL processes at scale
+ Advanced proficiency in Python and SQL for data pipeline development
+ Experience with data orchestration tools (Airflow, dbt, Snowflake Tasks)
+ Strong understanding of data security, access controls, and compliance requirements
+ Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
+ Excellent problem-solving skills and attention to detail
+ Strong communication and collaboration skills
**Additional Information**
Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity.
Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** .
Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext.
_Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._
_If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._
Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
$120k-140k yearly 60d+ ago
Data Technology Lead
Westfield High School 3.3
Data engineer job in Westfield Center, OH
The role is part of the Data, Analytics and Reporting team. This role leads a team of data engineers and data testers to deliver secure, scalable, and high-quality data solutions that support analytics, reporting, and business operations. The Data Technology Lead collaborates with stakeholders to understand technology and data requirements, implements best practices for data governance and testing, and drives innovation in data engineering. The position involves managing modern cloud-based platforms while fostering a culture of continuous improvement and technical excellence within the IT team.
Job Responsibilities
Data Architecture & Engineering: Design, develop, and maintain robust data pipelines and architectures for structured and unstructured data; optimize workflows across Azure Data Lake, Snowflake, and other environments; implement best practices for data modeling and transformation using dbt.
Team Leadership: Lead and mentor data engineers and testers; manage workload distribution; foster collaboration and innovation.
Testing & Quality Assurance: Establish data testing frameworks (a minimal sketch follows this list); ensure data accuracy and reliability; integrate testing into broader QA processes.
Collaboration & Stakeholder Engagement: Partner with analytics, BI, and business teams to deliver data solutions; provide technical guidance.
Vendor & Tool Management: Evaluate and select tools and vendors; negotiate contracts and manage relationships.
Business Continuity: Develop and maintain disaster recovery and business continuity plans for data systems.
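As referenced above, here is a minimal sketch of what a data test might look like in a pytest-based framework; the extract and the column rules are illustrative placeholders, not actual checks used on this team.

```python
# Minimal data-test sketch for pytest. The extract and rules are placeholders;
# in practice the DataFrame might come from Snowflake, Azure Data Lake, or a
# dbt model rather than an in-memory literal.
import pandas as pd


def load_policy_extract() -> pd.DataFrame:
    # Placeholder source standing in for a warehouse query.
    return pd.DataFrame(
        {
            "policy_id": ["P-1001", "P-1002", "P-1003"],
            "premium": [1250.00, 980.50, 1410.75],
            "effective_date": pd.to_datetime(["2024-01-01", "2024-02-15", "2024-03-01"]),
        }
    )


def test_policy_id_is_unique_and_non_null():
    df = load_policy_extract()
    assert df["policy_id"].notna().all()
    assert df["policy_id"].is_unique


def test_premium_is_positive():
    df = load_policy_extract()
    assert (df["premium"] > 0).all()
```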
Job Qualifications
7+ years of experience in data engineering, with at least 2 years in a data leadership role.
Insurance industry experience required
Bachelor's degree in Computer Science, Information Technology, or a related field and/or commensurate experience.
Master's degree in related field is preferred.
Proficiency in SQL.
Python skills and experience with modern data frameworks (e.g., Spark) are a plus.
Expertise in Snowflake, Azure Data Lake, dbt, and modern data platforms.
Strong experience in data integration, data warehousing, and data lake architectures.
Experience with Azure DevOps, CI/CD pipelines, and Git for code management
Awareness of Generative AI (GenAI) capabilities to accelerate development and testing processes.
Familiarity with data testing methodologies and tools.
Excellent leadership, communication, and problem-solving skills.
Behavioral Competencies
Directs work
Collaborates
Develops talent
Customer focus
Communicates effectively
Ensures accountability
Decision quality
Business insight
Nimble learning
Builds effective teams
Manages complexity
Technical Skills
Technical Support
Operating Systems
Workflow Management
Budgeting
Disaster Recovery
Process Improvement
Project Management
IT Strategy & Framework
IT Regulatory Compliance
Stakeholder Management
This job description describes the general nature and level of work performed in this role. It is not intended to be an exhaustive list of all duties, skills, responsibilities, knowledge, etc. These may be subject to change and additional functions may be assigned as needed by management.
$76k-95k yearly est. 48d ago
BI Data Engineer
Quadax Careers & Culture
Data engineer job in Middleburg Heights, OH
Job Title: Data Engineer
We are seeking a Data Engineer to join our growing Big Data team. In this role, you will design and implement data models, build semantic layers, create data extracts, and develop robust data pipelines to ingest data into our data warehouse. You will collaborate with BI developers, data analysts, and data scientists to support a variety of data initiatives. The ideal candidate is self-directed, highly organized, and comfortable managing the data needs of multiple teams, systems, and products. This is a hybrid position: 4 days per week onsite in Middleburg Heights, OH, following an initial 3-month, 5-day-per-week onsite period. Visa sponsorship is not available.
Key Responsibilities:
Design, build, and maintain data pipelines for optimal extraction, transformation, and loading (ETL) from diverse sources into our Snowflake Data Warehouse.
Perform data analysis, mapping, and validation across multiple sources and formats to produce consolidated data models.
Assemble large, complex datasets to meet technical requirements for reporting, extraction, and analytics.
Define and implement semantic layers on top of data models for reporting and analytical purposes.
Create custom data extracts to support ad-hoc reporting requests.
Identify and implement process improvements, including automation of manual tasks, pipeline optimization for scalability and performance, and data model enhancements for query efficiency.
Support Data Science initiatives by creating and automating pipelines for model training datasets.
Maintain and optimize the Snowflake environment, including monitoring consumption, configuring warehouses, managing role-based security, applying environment changes, and evaluating new functionality.
Collaborate with stakeholders such as BI Developers, Product Owners, Data Scientists, and Architects to resolve data-related technical issues and support infrastructure needs.
Ensure compliance with PHI and HIPAA standards and guidelines.
Perform other duties as assigned.
Education / Experience:
3+ years of experience in data engineering, including building and optimizing ETL processes, data pipelines, and datasets.
Strong understanding of data engineering principles and best practices.
Hands-on experience working with structured and unstructured datasets.
Proven ability to analyze datasets to answer business questions and identify improvement opportunities.
Preferred experience with Snowflake, Microsoft SQL Server, Microsoft Fabric, and Python.
Experience with C# and/or Java is a plus.
Strong project management and organizational skills.
Demonstrated success working with cross-functional teams in dynamic environments.
Familiarity with Scrum/Agile development methodologies is a plus.
Medical billing knowledge is a plus.
$78k-106k yearly est. 5d ago
Azure Data Engineer - 6013916
Accenture 4.7
Data engineer job in Cleveland, OH
Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists.
As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges.
You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing and/or eliminating the demands to travel.
Job Description:
Join our dynamic team and embark on a journey where you will be empowered to perform independently and become an SME. Active participation and contribution in team discussions will be key as you help provide solutions to work-related problems. Let's work together to achieve greatness!
Responsibilities:
* Create new data pipelines leveraging existing data ingestion frameworks and tools
* Orchestrate data pipelines using the Azure Data Factory service.
* Develop/enhance data transformations based on requirements to parse, transform and load data into the Enterprise Data Lake, Delta Lake, and Enterprise DWH (Synapse Analytics); a minimal transformation sketch follows this list
* Perform unit testing, coordinate integration testing and UAT, and create HLD/DD/runbooks for the data pipelines
* Configure compute and DQ rules; perform maintenance, performance tuning and optimization
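As referenced in the list above, here is a minimal sketch of the parse/transform/load pattern, assuming a Databricks notebook where `spark` is already provided and Delta Lake is enabled; the storage paths and column names are illustrative placeholders.

```python
# Minimal parse/transform/load sketch for a Databricks notebook (PySpark).
# Assumes the `spark` session is provided by the runtime; paths and columns
# are hypothetical placeholders.
from pyspark.sql import functions as F

raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/orders/")

cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
)

(cleaned.write
        .format("delta")
        .mode("append")
        .save("abfss://curated@examplelake.dfs.core.windows.net/orders_delta/"))
```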
Qualification
Basic Qualifications:
* Minimum of 3 years of work experience with one or more of the following: Databricks Data Engineering, DLT, Azure Data Factory, SQL, PySpark, Synapse Dedicated SQL Pool, Azure DevOps, Python
Preferred Qualifications:
* Azure Function Apps
* Azure Logic Apps
* Precisely & Cosmos DB
* Advanced proficiency in PySpark.
* Advanced proficiency in Microsoft Azure Databricks, Azure DevOps, Databricks Delta Live Tables and Azure Data Factory.
* Bachelor's or Associate's degree
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below. We accept applications on an on-going basis and there is no fixed deadline to apply.
Information on benefits is here.
Role Location:
California - $47.69 - $57.69
Cleveland - $47.69 - $57.69
Colorado - $47.69 - $57.69
District of Columbia - $47.69 - $57.69
Illinois - $47.69 - $57.69
Minnesota - $47.69 - $57.69
Maryland - $47.69 - $57.69
Massachusetts - $47.69 - $57.69
New York/New Jersey - $47.69 - $57.69
Washington - $47.69 - $57.69
$69k-89k yearly est. 2d ago
Data Architect
Zone It Solutions
Data engineer job in Strongsville, OH
Job Description
Zone IT Solutions is in search of a highly skilled Data Architect. In this pivotal role, you will be responsible for designing, implementing, and managing our data architecture strategies to support our business objectives and enhance our data capabilities.
Requirements
Bachelor's or Master's degree in Computer Science, Information Systems, or related field.
6+ years of experience in data architecture, data modeling, and data management.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, MySQL).
Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
In-depth understanding of data integration, ETL processes, and data warehousing concepts.
Proven experience in designing scalable data solutions for large datasets.
Strong analytical and problem-solving skills, with the ability to work under pressure.
Excellent communication and collaboration skills to work effectively with cross-functional teams.
Familiarity with cloud data solutions (e.g., AWS, Azure, GCP) is preferred.
Benefits
About Us
We specialize in Digital, ERP, and larger IT Services. We offer flexible, efficient and collaborative solutions to any organization that requires IT experts. Our agile, agnostic, and flexible solutions will help you source the IT expertise you need. If you are looking for new opportunities, send your profile to *******************************.
Also follow our LinkedIn page for new job opportunities and more.
Zone IT Solutions is an equal opportunity employer and our recruitment process focuses on essential skills and abilities. We encourage applications from a diverse array of backgrounds, including individuals of various ethnicities, cultures, and linguistic backgrounds, as well as those with disabilities.
$85k-117k yearly est. 10d ago
Lead Data Engineer (Mentor, OH, US, 44060)
Steris Corporation 4.5
Data engineer job in Mentor, OH
At STERIS, we help our Customers create a healthier and safer world by providing innovative healthcare and life science product and service solutions around the globe. The Lead Data Engineer demonstrates mastery of skills and knowledge and acts as a mentor, strategist, thought leader, evangelist, and champion, planning and leading data engineering activities.
Responsible for overseeing the design, development, deployment, and maintenance of scalable and robust data solutions. Develops and manages data integration pipelines connecting disparate data sources. Works closely with data architects, data scientists, analysts, and other stakeholders to support business needs in analytical and data solutions/projects. Collaborates with Infrastructure and DBA teams to ensure appropriate infrastructure is in place. Optimizes and streamlines data processing efforts to ensure data quality, security, privacy, on-time delivery and compliance. Provides technical leadership and mentorship, reviews deliverables, and provides feedback to the data engineering team.
What You'll do as a Lead Data Engineer
* Data Architecture and Technical Infrastructure: Defines, plans, designs and supports implementation of enterprise data architectures and the enterprise data platform. Plans and leads data engineering activities for strategic, large, and complex programs. Leads the selection and development of data engineering methods, tools, and techniques.
* SDLC Methodology & Project Management: Plans technical transitions between development, testing, and production phases of solutions' lifecycle, and the facilitation of the change control, problem management, and communication processes.
* Innovation, Continuous Improvement & Optimization: Develops organizational policies, standards, and guidelines for the development and secure operation of data services and products. Ensures adherence to technical strategies and architectures.
* Data Modelling / Designing Datasets: Coordinates the application of analysis, design, and modelling techniques to establish, modify or maintain data structures and their associated components. Manages the iteration, review and maintenance of data requirements and data models.
* Partnership and Community Building: Collaborates with other IT teams, business community, data scientists and other architects to meet business requirements.
* Data Pipeline/ETL: Sets standards for data modelling and design tools and techniques, advises on their application, and ensures compliance. Defines and implements administration and control activities related to data warehouse planning and development and the establishment of policies and procedures pertaining to its management, security, maintenance, and utilization.
* Support & Operations: Manages the investigation of enterprise data requirements based upon a detailed understanding of information requirements.
* Data Governance and Data Quality: Ensures that data is reliable, secure, and timely. Implements data privacy best practices. Defines, designs and implements data quality assessment and improvement methodology, processes and solutions.
* End-User Support, Education and Enablement: Plans, designs, develops and facilitates training and Data Literacy initiatives within the team and End user community.
* Metadata Management & Documentation: Ensures standards and best practices in documentation of metadata, data engineering processes, and architectures.
The Experience, Abilities and Skills Needed
* Bachelor's Degree and 6+ years of experience
* Development, maintenance, and enhancement of data pipelines (ETL/ELT) and processes, with thorough knowledge of star/snowflake schemas (a minimal dimensional-model sketch follows this list)
* Developing complex SQL queries and SQL optimization
* Development experience must be full life-cycle experience, including business requirements gathering, data sourcing, testing/data reconciliation, and deployment within a Business Intelligence/Data Warehousing architecture.
* Designing and implementing data security
* Monitoring and optimizing data storage and data processing
* Delivering Data Solutions using Cloud Technologies
* Advanced SQL skills and experience with relational databases and database design, such as Oracle and SQL Server.
* Significant experience working with cloud Data Warehouse and Data Lake solutions (e.g., Snowflake, Redshift, BigQuery, Azure Data Lake Storage, Amazon S3, etc.)
* Experience working with data ingestion tools such as Informatica (IDMC, IICS, PowerCenter), Fivetran, Stitch, or Matillion.
* Working knowledge of Cloud-based solutions (e.g., Azure, AWS and GCP).
* Experience building and deploying machine learning models in production.
* Strong proficiency in object-oriented languages: Python, Java, C++, Scala.
* Strong proficiency in scripting languages like Bash.
* Strong proficiency in data pipeline and workflow management tools (e.g., Airflow, Azkaban).
* Familiarity with big data frameworks such as Apache Hadoop and Apache Spark
* Strong project management and organizational skills.
* Excellent problem-solving, communication, and organizational skills.
* Demonstrated leadership experience and skills
* Ability to communicate effectively and influence technical and business stakeholders at all levels of the organization
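As referenced above, here is a minimal star-schema sketch in Python/pandas, assuming surrogate keys are generated in the pipeline; the tables and columns are illustrative placeholders, not an actual STERIS model.

```python
# Minimal star-schema load sketch: derive a customer dimension with a surrogate
# key, then build a fact table that references the dimension only through that
# key. Tables and columns are hypothetical placeholders.
import pandas as pd

# Hypothetical cleansed order extract from a staging layer.
orders = pd.DataFrame(
    {
        "order_id": [1, 2, 3],
        "customer_code": ["C-10", "C-11", "C-10"],
        "order_amount": [250.0, 99.0, 410.0],
    }
)

# Build the customer dimension and assign surrogate keys.
dim_customer = (
    orders[["customer_code"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(customer_key=lambda d: d.index + 1)
)

# Fact table keeps the measures plus the surrogate key, not the natural key.
fact_orders = (
    orders.merge(dim_customer, on="customer_code", how="left")
          .loc[:, ["order_id", "customer_key", "order_amount"]]
)

print(dim_customer)
print(fact_orders)
```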
What STERIS Offers
At STERIS, we invest in our employees and their families for the long term! STERIS wouldn't be where it is today without our incredible people. We share our success with you by rewarding you for your hard work and achievements.
Here is just a brief overview of what we offer:
* Competitive Pay
* Extensive Paid Time Off and (9) added Holidays.
* Excellent healthcare, dental, and vision benefits
* 401(k) with a company match
* Long/Short term disability coverage
* Parental Leave
* Additional add-on benefits/discounts for programs such as Pet Insurance
* Continued training and education programs
* Excellent opportunities for advancement in a stable long-term career
* #LI-KS1 #LI-Hybrid
Pay range for this opportunity is $105,000-$125,000. This position is eligible for bonus participation.
Minimum pay rates offered will comply with county/city minimums, if higher than range listed. Pay rates are based on a number of factors, including but not limited to local labor market costs, years of relevant experience, education, professional certifications, foreign language fluency, etc.
STERIS offers a comprehensive and competitive benefits portfolio. Click here for a complete list of benefits: STERIS Benefits
Open until position is filled.
STERIS is a leading global provider of products and services that support patient care with an emphasis on infection prevention. WE HELP OUR CUSTOMERS CREATE A HEALTHIER AND SAFER WORLD by providing innovative healthcare and life sciences products and services around the globe. For more information, visit ***************
If you need assistance completing the application process, please call ****************. This contact information is for accommodation inquiries only and cannot be used to check application status.
STERIS is an Equal Opportunity Employer. We are committed to equal employment opportunity to ensure that persons are recruited, hired, trained, transferred and promoted in all job groups regardless of race, color, religion, age, disability, national origin, citizenship status, military or veteran status, sex (including pregnancy, childbirth and related medical conditions), sexual orientation, gender identity, genetic information, and any other category protected by federal, state or local law. We are not only committed to this policy by our status as a federal government contractor, but also we are strongly bound by the principle of equal employment opportunity.
The full affirmative action program, absent the data metrics required by § 60-741.44(k), shall be available to all employees and applicants for employment for inspection upon request. The program may be obtained at your location's HR Office during normal business hours.
$105k-125k yearly 60d+ ago
Data Warehouse Applications Developer
Maverick Direct
Data engineer job in Akron, OH
We currently have a challenging opportunity for a Data Warehouse Applications Developer: a direct hire with a $2-billion-a-year sales organization in the Akron, OH area. The company markets, manufactures, and distributes a variety of products to a variety of customers across the industrial, OE, aftermarket, and retail channels.
Job Description
The development/application support includes, but is not limited to:
• Applying systems solutions to business problems through design and programming;
• Assisting in preparing detailed specifications through discussion with clients;
• Breaking down program specification into its simplest elements and translating this logic into a programming language;
• Combining all elements of the program design and testing;
• Testing sample data-sets to check that output from the program works as intended;
• Reacting to problems and correcting the program as necessary;
• Evaluating and increasing the program's effectiveness;
• Adapting the program to new requirements, as necessary;
• Conducting user acceptance testing to ensure the program can be used easily, quickly and accurately;
• Writing detailed documentation for the operation of the program by users and computer operators;
• Updating, repairing, modifying and developing existing software and generic applications.
Qualifications
Position Requirements
• 3+ years' experience: SSIS development, developing SQL queries, stored procedures, triggers, and functions for SQL Server
• 3+ years in C# and .Net framework
• Data warehouse concepts, design patterns, best practices, interface programming, n-tier architecture, business objects; experience developing WinForms/Web Service/Windows Service/Web Application/MVC solutions.
• Strong problem solving and organizational skills
Additional Information
Target salary range $80-90k/year
***Excellent benefits package that includes a low deductible medical plan, pension plan, 401k, dental, vision, life, prescription drug, A&S, attendance and plant performance bonus, paid vacations and holidays.***
Contact Information:
Jim Replogle
Talent Acquisition Manager
************ x208
[email protected]
How much does a data engineer earn in Cleveland, OH?
The average data engineer in Cleveland, OH earns between $68,000 and $122,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Cleveland, OH
$91,000
What are the biggest employers of Data Engineers in Cleveland, OH?
The biggest employers of Data Engineers in Cleveland, OH are: