Senior Data Scientist
Senior data scientist job in Milwaukee, WI
What you will do
In this exciting role you will lead the effort to build and maintain a digital twin of our mission-critical North American closed-loop critical minerals supply chain. This $2.2B global network represents over half of Clarios North America's cost of goods sold and is vital to the US economy, directly enabling consumers and businesses to stay on the road!
This will require working with predictive and prescriptive analytics and decision intelligence across the US/Canada region at Clarios. You'll apply modern statistics, machine learning, and AI to real manufacturing and supply chain problems, working side-by-side with our business stakeholders and our global analytics team to deploy transformative solutions, not just models.
How you will do it
Build production-ready ML/statistical models (regression/classification, clustering, time series, linear/non-linear optimization) to detect patterns, perform scenario and what-if analytics, and generate prescriptive insights and outcomes.
Wrangle, analyze, and model data with Python and SQL; perform feature engineering, data quality checks, and exploratory analysis to validate hypotheses and model readiness (see the sketch after this list).
Develop digital solutions and visuals in Power BI and our decision intelligence platform to communicate results and monitor performance with business users.
Partner with stakeholders to clarify use cases, translate needs into technical tasks/user stories, and iterate solutions in sprints.
Manage model deployment (e.g., packaging models, basic MLOps) with guidance from Global Analytics.
Lead complex solution development and scoping, and apply design thinking principles for customer-centric model delivery.
Document and communicate model methodology, assumptions, and results to non-technical audiences; support troubleshooting and continuous improvement of delivered analytics.
Deliver value realization as part of our business analytics team to drive positive business outcomes for our metals team.
Potential travel up to 5-10%
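For illustration only (not part of the posting's requirements), the sketch below shows the kind of baseline work described above: engineering lag features on a toy weekly demand series, applying a simple data quality gate, and fitting a scikit-learn regressor with a time-ordered holdout. The series, column names, and model choice are hypothetical stand-ins, not Clarios data or methodology.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Toy weekly demand series; in practice this would come from SQL extracts.
rng = np.random.default_rng(0)
weeks = pd.date_range("2022-01-03", periods=156, freq="W-MON")
demand = (100 + 0.2 * np.arange(156)
          + 10 * np.sin(np.arange(156) * 2 * np.pi / 52)
          + rng.normal(0, 3, 156))
df = pd.DataFrame({"week": weeks, "demand": demand})

# Simple data-quality gate before modeling.
assert df["demand"].notna().all() and (df["demand"] > 0).all()

# Feature engineering: lagged demand and a calendar feature.
df["lag_1"] = df["demand"].shift(1)
df["lag_52"] = df["demand"].shift(52)
df["week_of_year"] = df["week"].dt.isocalendar().week.astype(int)
df = df.dropna()

# Time-ordered split: last 26 weeks held out, no shuffling.
train, test = df.iloc[:-26], df.iloc[-26:]
features = ["lag_1", "lag_52", "week_of_year"]

model = GradientBoostingRegressor(random_state=0)
model.fit(train[features], train["demand"])
pred = model.predict(test[features])
print(f"Holdout MAE: {mean_absolute_error(test['demand'], pred):.2f}")
```

A baseline like this would typically be benchmarked against a seasonal-naive forecast before anything more complex is put into production.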
What we look for
Required
Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or a related field, or equivalent practical experience.
3-5 years of experience applying ML/statistics to business data.
Python proficiency (pandas, scikit-learn, SciPy/statsmodels) and SQL across common platforms (e.g., SQL Server, Snowflake).
Core math/stats fundamentals: probability, hypothesis testing/DoE basics, linear algebra, and the principles behind common ML methods.
Data visualization experience with Power BI / Decision Intelligence Platforms for analysis and stakeholder storytelling.
Ability to work in cross-functional teams and explain technical work clearly to non-technical partners. Candidates must be self-driven, curious, and creative.
Preferred
Cloud & big data exposure: Azure (or AWS), Databricks/Spark; Snowpark is a plus.
Understanding of ETL/ELT tools such as ADF, SSIS, Talend, Informatica, or Matillion.
MLOps concepts (model validation, monitoring, packaging with Docker/Kubernetes).
Deep learning basics (PyTorch/Keras) for the right use cases.
Experience contributing to agile backlogs, user stories, and sprint delivery.
5+ years of experience in data analytics
Master's Degree in Statistics, Economics, Data Science or Computer Science.
How you will be successful
Deliver incremental value quickly (first dashboards, baseline models) and iterate with stakeholder feedback.
Balance rigor with practicality: choose the simplest model that solves the problem and can be supported in production.
Keep data quality front-and-center; instrument checks to protect decisions from drift and bad inputs.
What you get:
Medical, dental and vision care coverage and a 401(k) savings plan with company matching - all starting on date of hire
Tuition reimbursement, perks, and discounts
Parental and caregiver leave programs
All the usual benefits such as paid time off, flexible spending, short-and long-term disability, basic life insurance, business travel insurance, Employee Assistance Program, and domestic partner benefits
Global market strength and worldwide market share leadership
LEED-certified HQ location, plus a full-service cafeteria and workout facility
Clarios has been recognized as one of 2025's Most Ethical Companies by Ethisphere. This prestigious recognition marks the third consecutive year Clarios has received this distinction.
Who we are:
Clarios is the force behind the world's most recognizable car battery brands, powering vehicles from leading automakers like Ford, General Motors, Toyota, Honda, and Nissan. With 18,000 employees worldwide, we develop, manufacture, and distribute energy storage solutions while recovering, recycling, and reusing up to 99% of battery materials, setting the standard for sustainability in our industry. At Clarios, we're not just making batteries; we're shaping the future of sustainable transportation. Join our mission to innovate, push boundaries, and make a real impact. Discover your potential at Clarios, where your power meets endless possibilities.
Veterans/Military Spouses:
We value the leadership, adaptability, and technical expertise developed through military service. At Clarios, those capabilities thrive in an environment built on grit, ingenuity, and passion-where you can grow your career while helping to power progress worldwide. All qualified applicants will be considered without regard to protected characteristics.
We recognize that people come with a wealth of experience and talent beyond just the technical requirements of a job. If your experience is close to what you see listed here, please apply. Diversity of experience and skills combined with passion is key to challenging the status quo. Therefore, we encourage people from all backgrounds to apply to our positions. Please let us know if you require accommodations during the interview process by emailing Special.Accommodations@Clarios.com. We are an Equal Opportunity Employer and value diversity in our teams in terms of work experience, area of expertise, gender, ethnicity, and all other characteristics protected by laws in the countries where we operate. For more information on our commitment to sustainability, diversity, and equal opportunity, please read our latest report. We want you to know your rights because EEO is the law.
A Note to Job Applicants: please be aware of scams being perpetrated through the Internet and social media platforms. Clarios will never require a job applicant to pay money as part of the application or hiring process.
To all recruitment agencies: Clarios does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Clarios employees or any other company location. Clarios is not responsible for any fees related to unsolicited resumes/CVs.
Data Scientist
Senior data scientist job in Park City, IL
Department
BSD MED - Hematology and Oncology - Howard Research Staff
About the Department
The Section of Hematology/Oncology has a proud and long tradition of excellence in research-based patient care and clinical discovery. Ranked among the finest cancer programs in the country, the Section is comprised of nationally and internationally known faculty with expertise in all major types of malignancies, blood disorders, and experimental therapies.
Job Summary
The Data Scientist will lead data science research efforts within the breast cancer program in the Section of Hematology/Oncology in the Department of Medicine.
Responsibilities
Manage the program's projects.
Develop frameworks for housing multi-modal clinical data - including genomic, pathology, and radiographic imaging data.
Develop and code data dictionaries for discrete data.
Write, implement, and maintain efficient code, machine learning tools, and pipelines for data processing and analysis (see the sketch after this list).
Work with collaborators to implement emerging external computational tools for program data.
Draft grant proposals.
Develop training material.
Keep lab's knowledge-based trainings up to date.
Supervise trainees.
Perform reproducible genomic data analysis.
Calibrates data between large and complex research and administrative datasets. Guides and may set the operational protocols for collecting and analyzing information from the University's various internal data systems as well as from external sources.
Designs and evaluates statistical models and reproducible data processing pipelines using expertise of best practices in machine learning and statistical inference.
Provides expertise for high level or complex data-related requests and engages other IT resources as needed.
Analyzes moderately complex data sets for the purpose of extracting and purposefully using applicable information.
Provides professional support to staff or faculty members in defining the project and applying principles of data science in manipulation, statistical applications, programming, analysis, and modeling.
Performs other related work as needed.
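As a hedged illustration of the reproducible-analysis expectation above, the sketch below runs a fixed-seed, cross-validated scikit-learn pipeline on a toy feature matrix. The synthetic "gene_*" features and responder label are placeholders, not program data or the lab's actual pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

SEED = 42  # fixed seed so the analysis is reproducible end to end

# Toy expression-like matrix standing in for processed genomic features;
# real inputs would come from the program's curated multi-modal datasets.
rng = np.random.default_rng(SEED)
X = pd.DataFrame(rng.normal(size=(200, 50)),
                 columns=[f"gene_{i}" for i in range(50)])
y = rng.integers(0, 2, size=200)  # e.g., responder vs. non-responder label

# Scaling and model fitting wrapped in one pipeline to avoid leakage across folds.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=SEED)
scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")

print(f"Cross-validated AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```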
Minimum Qualifications
Education:
Minimum requirements include a college or university degree in a related field.
Work Experience:
Minimum requirements include knowledge and skills developed through 2-5 years of work experience in a related job discipline.
Certifications:
---
Preferred Qualifications
Education:
Advanced degree in computer science.
Experience:
Experience in Python, including the torch and scikit-learn libraries, strongly preferred.
Experience working with image data or unstructured text data with large language models.
Preferred Competencies
Proficiency in Linux/Ubuntu OS, including basic file and user management and SSH.
Familiarity with tmux and SLURM.
Strong analytical skills.
Problem-solving skills.
Organizational skills.
Verbal and written communication skills.
Ability to work independently and as part of a team.
Application Documents
Resume (required)
Cover Letter (required)
When applying, the document(s) MUST be uploaded via the My Experience page, in the section titled Application Documents of the application.
Job Family
Research
Role Impact
Individual Contributor
Scheduled Weekly Hours
40
Drug Test Required
No
Health Screen Required
No
Motor Vehicle Record Inquiry Required
No
Pay Rate Type
Salary
FLSA Status
Exempt
Pay Range
$70,000.00 - $100,000.00
The included pay rate or range represents the University's good faith estimate of the possible compensation offer for this role at the time of posting.
Benefits Eligible
Yes
The University of Chicago offers a wide range of benefits programs and resources for eligible employees, including health, retirement, and paid time off. Information about the benefit offerings can be found in the Benefits Guidebook.
Posting Statement
The University of Chicago is an equal opportunity employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender, gender identity, or expression, national or ethnic origin, shared ancestry, age, status as an individual with a disability, military or veteran status, genetic information, or other protected classes under the law. For additional information please see the University's Notice of Nondiscrimination.
Job seekers in need of a reasonable accommodation to complete the application process should call ************ or submit a request via Applicant Inquiry Form.
All offers of employment are contingent upon a background check that includes a review of conviction history. A conviction does not automatically preclude University employment. Rather, the University considers conviction information on a case-by-case basis and assesses the nature of the offense, the circumstances surrounding it, the proximity in time of the conviction, and its relevance to the position.
The University of Chicago's Annual Security & Fire Safety Report (Report) provides information about University offices and programs that provide safety support, crime and fire statistics, emergency response and communications plans, and other policies and information. The Report can be accessed online at: *********************************** Paper copies of the Report are available, upon request, from the University of Chicago Police Department, 850 E. 61st Street, Chicago, IL 60637.
Lead Data Scientist GenAI, Strategic Analytics - Data Science
Senior data scientist job in Milwaukee, WI
Deloitte is at the leading edge of GenAI innovation, transforming Strategic Analytics and shaping the future of Finance. We invite applications from highly skilled and experienced Lead Data Scientists ready to drive the development of our next-generation GenAI solutions.
The Team
Strategic Analytics is a dynamic part of our Finance FP&A organization, dedicated to empowering executive leaders across the firm, as well as our partners in financial and operational functions. Our team harnesses the power of cloud computing, data science, AI, and strategic expertise-combined with deep institutional knowledge-to deliver insights that inform our most critical business decisions and fuel the firm's ongoing growth.
GenAI is at the forefront of our innovation agenda and a key strategic priority for our future. We are rapidly developing groundbreaking products and solutions poised to transform both our organization and our clients. As part of our team, the selected candidate will play a pivotal role in driving the success of these high-impact initiatives.
Recruiting for this role ends on December 14, 2025
Work You'll Do
Client Engagement & Solution Scoping
+ Partner with stakeholders to analyze business requirements, pain points, and objectives relevant to GenAI use cases.
+ Facilitate workshops to identify, prioritize, and scope impactful GenAI applications (e.g., text generation, code synthesis, conversational agents).
+ Clearly articulate GenAI's value proposition, including efficiency gains, risk mitigation, and innovation.
Solution Architecture & Design
+ Architect holistic GenAI solutions, selecting and customizing appropriate models (GPT, Llama, Claude, Zora AI, etc.).
+ Design scalable integration strategies for embedding GenAI into existing client systems (ERP, CRM, KM platforms).
+ Define and govern reliable, ethical, and compliant data sourcing and management.
Development & Customization
+ Lead model fine-tuning, prompt engineering, and customization for client-specific needs (see the illustrative sketch following this list).
+ Oversee the development of GenAI-powered applications and user-friendly interfaces, ensuring robustness and exceptional user experience.
+ Drive thorough validation, testing, and iteration to ensure quality and accuracy.
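By way of illustration only, the sketch below shows the basic shape of prompt engineering against an open Hugging Face model; distilgpt2, the prompt template, and the finance commentary are placeholders and do not represent Deloitte's GenAI stack or any client solution.

```python
from transformers import pipeline

# Minimal prompt-engineering sketch using an open model from the Hugging Face hub.
# distilgpt2 is purely a stand-in; larger instruction-tuned models would be used in practice.
generator = pipeline("text-generation", model="distilgpt2")

PROMPT_TEMPLATE = (
    "You are a finance analyst assistant.\n"
    "Summarize the following variance commentary in one sentence:\n\n"
    "{commentary}\n\nSummary:"
)

commentary = (
    "Q3 travel spend exceeded plan by 12% driven by client-facing engagements, "
    "partially offset by lower software license costs."
)

result = generator(
    PROMPT_TEMPLATE.format(commentary=commentary),
    max_new_tokens=40,
    do_sample=False,  # deterministic output for easier validation and testing
)
print(result[0]["generated_text"])
```

Templating prompts and pinning generation parameters this way is what makes the validation and iteration step above repeatable.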
Implementation, Deployment & Change Management
+ Manage solution rollout, including cloud setup, configuration, and production deployment.
+ Guide clients through adoption: deliver training, create documentation, and provide enablement resources for users.
Risk, Ethics & Compliance
+ Lead efforts in responsible AI, ensuring safeguards against bias, privacy breaches, and unethical outcomes.
+ Monitor performance, implement KPIs, and manage model retraining and auditing processes.
Stakeholder Communication
+ Prepare executive-level reports, dashboards, and demos to summarize progress and impact.
+ Coordinate across internal teams, tech partners, and clients for effective project delivery.
Continuous Improvement & Thought Leadership
+ Stay current on GenAI trends, best practices, and emerging technologies; share insights across teams.
+ Mentor junior colleagues, promote knowledge transfer, and contribute to reusable methodologies.
Qualifications
Required:
+ Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Mathematics, or related field.
+ 5+ years of hands-on experience delivering machine learning or AI solutions, preferably including generative AI.
+ Independent thinker who can create the vision and execute on transforming data into high-end client products.
+ Demonstrated accomplishments in the following areas:
+ Deep understanding of GenAI models and approaches (LLMs, transformers, prompt engineering).
+ Proficiency in Python (PyTorch, TensorFlow, HuggingFace), Databricks, ML pipelines, and cloud-based deployment (Azure, AWS, GCP).
+ Experience integrating AI into enterprise applications, building APIs, and designing scalable workflows.
+ Knowledge of solution architecture, risk assessment, and mapping technology to business goals.
+ Familiarity with agile methodologies and iterative delivery.
+ Commitment to responsible AI, including data ethics, privacy, and regulatory compliance.
+ Ability to travel 0-10%, on average, based on the work you do and the clients and industries/sectors you serve
+ Limited immigration sponsorship may be available.
Preferred:
+ Relevant Certifications: May include Google Cloud Professional ML Engineer, Microsoft Azure AI Engineer, AWS Certified Machine Learning, or specialized GenAI/LLM credentials.
+ Experience with data visualization tools such as Tableau
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $102,500 - $188,900.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation
************************************************************************************************************
EA_FA_ExpHire
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Lead Data & BI Scientist
Senior data scientist job in Milwaukee, WI
The Company
Zurn Elkay Water Solutions Corporation is a thriving, values-driven company focused on doing the right things. We're a fast growing, publicly traded company (NYSE: ZWS), with an enduring reputation for integrity, giving back, and providing an engaging, inclusive environment where careers flourish and grow.
Named by Newsweek as One of America's Most Responsible Companies and an Energage USA Top Workplace, at Zurn Elkay Water Solutions Corporation, we never forget that our people are at the center of what makes us successful. They are the driving force behind our superior quality, product ingenuity, and exceptional customer experience. Our commitment to our people and their professional development is a recipe for success that has fueled our growth for over 100 years, as one of today's leading international suppliers of plumbing and water delivery solutions.
Headquartered in Milwaukee, WI, Zurn Elkay Water Solutions Corporation employs over 2800 employees worldwide, working from 24 locations across the U.S., China, Canada, Dubai, and Mexico, with sales offices available around the globe. We hope you'll visit our website and learn more about Zurn Elkay at zurnelkay.com.
If you're ready to join a company where what you do makes a difference and you have pride in the work you are doing, talk to us about joining the Zurn Elkay Water Solutions Corporation family!
If you are a current employee, please navigate here to apply internally.
Job Description
The Lead Data & BI Scientist is a senior-level role that blends advanced data science capabilities with business intelligence leadership. This position is responsible for driving strategic insight generation, building predictive models, and leading analytics initiatives across departments such as sales, marketing, pricing, manufacturing, logistics, supply chain, and finance.
The role requires both technical depth and business acumen to ensure that data-driven solutions are aligned with organizational goals and deliver measurable value.
Key Accountabilities
Strategic Insight & Business Partnership
* Partner with business leaders to identify high-impact opportunities and form hypotheses.
* Present findings and recommendations to leadership in a clear, impactful manner.
* Demonstrate ROI and business value from analytics initiatives.
Data Science Leadership
* Define and implement data science processes, tools, and governance frameworks.
* Mentor junior team members and foster a culture of continuous learning.
Advanced Analytics & Modeling
* Design, build and validate predictive models, machine learning algorithms and statistical analyses.
* Translate complex data into actionable insights for strategic decision-making.
Technology & Tools
* Utilize tools such as Tableau, Power BI, OBIEE/OAC, Snowflake, SQL, R, Python, and data catalogs.
* Stay current with emerging technologies like agentic analytics and AI-driven insights.
* Evaluate and recommend BI platforms and data science tools.
Project & Change Management
* Manage analytics projects from inception to delivery.
* Provide training and change management support for new tools and processes.
* Lead the establishment of a data science center of excellence.
Qualifications/Requirements
* Bachelor's degree required, in a quantitative field such as engineering, mathematics, science, and/or MIS. Master's degree preferred
* 10+ years of overall work experience
* 7+ years of experience in data science and statistical analysis
* Strong understanding of and experience with analytics tools such as Tableau and OBIEE/OAC (or similar tools) for reporting and visualization, Snowflake for data storage, data modeling, data prep or ETL tools, R or Python, SQL, and data catalogs.
* Strong understanding of and experience with multiple statistical and quantitative models and techniques such as (but not limited to) those used for predictive analytics, machine learning, AI, linear models and optimization, clustering, and decision trees.
* Deep experience applying data science to solve problems in at least one of the following areas is required. Experience in multiple areas is preferred: marketing, manufacturing, pricing, logistics, sourcing, and sales.
* Strong communication (verbal, written) skills, and ability to work with all levels of the organization effectively
* Working knowledge of and proven experience applying project management tools
* Strong analytical skills
* Ability to lead and mentor the work of others
* High degree of creativity and latitude is expected
Capabilities and Success Factors
* Decision Quality - Making good and timely decisions that keep the organization moving forward.
* Manages Complexity - Making sense of complex, high quantity and sometimes contradictory information to effectively solve problems.
* Plans & Aligns - Planning and prioritizing work to meet commitments aligned with organizational goals.
* Drives Results - Consistently achieving results, even under tough circumstances.
* Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
Total Rewards and Benefits
* Competitive Salary
* Medical, Dental, Vision, STD, LTD, AD&D, and Life Insurance
* Matching 401(k) Contribution
* Health Savings Account
* Up to 3 weeks starting Vacation (may increase with tenure)
* 12 Paid Holidays
* Annual Bonus Eligibility
* Educational Reimbursement
* Matching Gift Program
* Employee Stock Purchase Plan - purchase company stock at a discount!
THIRD PARTY AGENCY: Any unsolicited submissions received from recruitment agencies will be considered property of Zurn Elkay, and we will not be liable for any fees or obligations related to those submissions.
Equal Opportunity Employer - Minority/Female/Disability/Veteran
Healthcare Data Analyst Lead (CMH Health)
Senior data scientist job in Brookfield, WI
Individual(s) must be legally authorized to work in the United States without the need for immigration support or sponsorship from Milliman now or in the future. Milliman is seeking a technically savvy, analytically strong Healthcare Data Analyst Lead to manage analytics and technology projects as well as coach and mentor junior analytical staff. The ideal candidate is someone seeking to join a challenging, yet rewarding environment focused on delivering world-class analytics across a variety of healthcare-related domains to a variety of healthcare entities.
Who We Are
Independent for over 75 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, data analytics and business transformation.
Job Responsibilities
* Lead and manage analytics and technology projects from data ingestion through delivery.
* Design, develop, and optimize data processes and workflows supporting large healthcare datasets.
* Perform and oversee ETL, validation, and transformation tasks for claims, eligibility, and pharmacy data (see the sketch after this list).
* Guide project teams through the full "data-to-deliverable" lifecycle, ensuring accuracy and efficiency.
* Build analytical models, dashboards, and data pipelines to support consulting engagements.
* Collaborate with consultants, actuaries, and project managers to interpret results and deliver client insights.
* Review and approve technical work from peers and junior analysts to ensure quality standards are met.
* Mentor, coach, and delegate to analytical staff to strengthen technical and professional development.
* Contribute to innovation and process improvement initiatives, including automation, cloud enablement, and AI integration.
* Participate in client meetings and presentations, occasionally requiring travel.
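As an illustrative, non-prescriptive example of the validation work above, the sketch below runs a few basic quality checks on a toy claims extract; the column names and rules are assumptions, not Milliman's actual data layouts or standards.

```python
import pandas as pd

# Toy claims extract; column names are illustrative, not a specific client layout.
claims = pd.DataFrame({
    "claim_id":     ["A1", "A2", "A2", "A3"],
    "member_id":    ["M10", "M11", "M11", None],
    "service_date": ["2024-01-05", "2024-02-30", "2024-02-10", "2024-03-01"],
    "paid_amount":  [125.00, 80.00, 80.00, -15.00],
})

# Coerce bad dates to NaT so they can be counted as violations.
claims["service_date"] = pd.to_datetime(claims["service_date"], errors="coerce")

# Basic validation checks a lead analyst might run before any downstream modeling.
checks = {
    "duplicate_claim_id":   claims["claim_id"].duplicated().sum(),
    "missing_member_id":    claims["member_id"].isna().sum(),
    "invalid_service_date": claims["service_date"].isna().sum(),
    "negative_paid_amount": (claims["paid_amount"] < 0).sum(),
}

report = pd.Series(checks, name="violations")
print(report)
# In production these counts would drive quarantine/reject rules rather than a print.
```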
Minimum requirements
* Bachelor's degree required (Computer Science, Management Information Systems, Computer Engineering, Math, Actuarial Science, Data Analytics, or related degree is preferred)
* 6+ years of experience in healthcare data analytics or a related technical analytics role.
* Advanced proficiency with SQL and Microsoft Excel for data analysis, validation, and automation.
* Strong programming skills in Python, R, or other analytical languages.
* Experience with data visualization and reporting tools (e.g., Power BI, Tableau, or R Shiny).
* Solid understanding of healthcare data structures, including claims, eligibility, and provider data.
* Proven ability to lead multiple projects simultaneously while mentoring and developing junior team members.
* Experience with cloud data technologies (e.g., Azure Data Factory, Databricks, Snowflake, AWS Redshift, or similar).
* Exposure to AI or Generative AI tools for data analysis, automation, or insight generation is preferred, but not required.
Competencies and Behaviors that Support Success in this Role
* Deep understanding of database architecture and large-scale healthcare data environments.
* Strong analytical thinking and the ability to translate complex data into actionable insights.
* Excellent communication skills, including the ability to explain technical concepts to non-technical audiences.
* Highly organized, detail-oriented, and able to manage competing priorities.
* Collaborative and proactive leadership style with a focus on mentorship and knowledge-sharing.
* Passion for applying analytics to improve healthcare performance, quality, and cost outcomes.
* Demonstrated accountability for quality, timelines, and client satisfaction.
* Fast learner who thrives in a dynamic, innovation-driven environment.
The Team
The Healthcare Data Analyst Lead will join a team that thrives on leveraging data, analytics, and technology to deliver meaningful business value. This is a team with technical aptitude and analytical prowess that enjoys building efficient and scalable products and processes. Ultimately, we are passionate about effecting change in healthcare. We also believe that collaboration and communication are cornerstones of success.
The Healthcare Data Analyst Lead will also join a mix of Healthcare Analysts, Leads, Consultants, and Principals. In addition, as part of the broader Milliman landscape, they will work alongside Healthcare Actuaries, Pharmacists, Clinicians, and Physicians. We aim to provide everyone a supportive environment, where we foster learning and growth through rewarding challenges.
Salary:
The overall salary range for this role is $104,900 - $199,065.
For candidates residing in:
* Alaska, California, Connecticut, Illinois, Maryland, Massachusetts, New Jersey, New York City, Newark, San Jose, San Francisco, Pennsylvania, Virginia, Washington, or the District of Columbia, the salary range is $120,635 - $199,065.
* All other locations the salary range is $104,900 - $173,100.
A combination of factors will be considered, including, but not limited to, education, relevant work experience, qualifications, skills, certifications, etc.
Location: It is preferred that candidates work on-site at our Brookfield, Wisconsin office, however, remote candidates will be considered.
The expected application deadline for this job is May 25, 2026.
Benefits
We offer a comprehensive benefits package designed to support employees' health, financial security, and well-being. Benefits include:
* Medical, Dental and Vision - Coverage for employees, dependents, and domestic partners.
* Employee Assistance Program (EAP) - Confidential support for personal and work-related challenges.
* 401(k) Plan - Includes a company matching program and profit-sharing contributions.
* Discretionary Bonus Program - Recognizing employee contributions.
* Flexible Spending Accounts (FSA) - Pre-tax savings for dependent care, transportation, and eligible medical expenses.
* Paid Time Off (PTO) - Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis.
* Holidays - A minimum of 10 observed holidays per year.
* Family Building Benefits - Includes adoption and fertility assistance.
* Paid Parental Leave - Up to 12 weeks of paid leave for employees who meet eligibility criteria.
* Life Insurance & AD&D - 100% of premiums covered by Milliman.
* Short-Term and Long-Term Disability - Fully paid by Milliman.
Equal Opportunity:
All qualified applicants will receive consideration for employment, without regard to race, color, religion, sex, sexual orientation, national origin, disability, or status as a protected veteran.
Security Lead-Data Protection
Senior data scientist job in Milwaukee, WI
Currently working with a Fortune 500 global manufacturing client that is looking for a Security Lead focused on Data Protection. This client is looking to add a permanent member to their security team who can grow within the company.
Responsibilities:
• Develop and lead the global Data Protection program across all of the client's business units
• Build and manage an effective data classification program
• Develop and drive Data Protection policies, standards, and processes
• Develop and drive a data loss prevention (DLP) tools strategy, partnering with the company's Information Technology groups and the client's business units
• Align Data Protection programs with other information security and cross-functional programs
• Direct and improve the Data Protection and DLP programs and associated governance activities, including metrics, issue tracking and remediation, and programs supporting client policies.
• Establish and maintain cross-business relationships with key Leads and stakeholders across the client's business, information security, and IT functions
• Coordinate with the Global IT Council to ensure compliance with the client's standards and share best practices
• Conduct research and make recommendations on products, services, and standards in support of all global infrastructure security efforts.
• Develop and maintain appropriate response playbooks, facilitate routine exercises, and ensure a sound communication process for all cyber events
Job Requirements:
Formal Education & Certification
• Bachelor's degree in Information Security, IT Engineering, or Computer Science with 5+ years of IT experience
• Industry-recognized security certifications such as GIAC, CISSP, CISM, or CISA are preferred
Qualifications
• Minimum 5 years' information security experience including experience mapping and securing business processes / data flows
• Demonstrated experience building and leading a cyber-security program
• Advanced knowledge of data protection/DLP and Windows client/server security concepts, best practices, and procedures
• Solid “hands on” technical experience is essential
• Experience in Information Security Incident Response
• Experience in IaaS/SaaS environments
• Broad understanding of all aspects of IT and enterprise systems interoperability
• Ability to communicate technical topics (verbal and written) to multiple organizational levels
• Global enterprise experience is preferred
Personal Attributes:
• Demonstrated leadership managing direct and matrix-reporting global teams
• Demonstrated experience leading global programs across technology and business functions
• Strong interpersonal, written, and oral communication skills
• Able to conduct research into issues and products as required
• Ability to prioritize and execute tasks in a fast-paced environment and make sound decisions in emergency situations
• Highly self-motivated and directed
• Keen attention to detail
• Proven analytical and problem-solving abilities
Additional Information
Work with blue Stone recruiting to find your next Cyber Security role. You can find us at ******************************* We look forward to speaking with you.
All your information will be kept confidential according to EEO guidelines.
Data Modeler - Manager - Consulting - Miami/South Florida
Senior data scientist job in Milwaukee, WI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Modeling - Manager**
**The opportunity**
We are looking for a dynamic and experienced Manager of Data Modeling to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills
**Your Key Responsibilities**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions.
+ Understand and analyze business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.
+ Lead the design and development of conceptual, logical, and physical data models for the Enterprise Data Warehouse (EDW), adhering to the Kimball dimensional modeling methodology (a core skill for this role).
+ Design and implement **star schemas** (Fact and Dimension tables) to support enterprise analytical, Business Intelligence (BI), and reporting requirements with high performance and query efficiency.
+ Define and model **Slowly Changing Dimensions (SCD)**, with an emphasis on **SCD Type 2**, to accurately capture and preserve the full historical context of dimension attributes (e.g., customer addresses, product categories) over time (see the sketch after this list).
+ Collaboratively determine the appropriate **grain** (atomic level of detail) for fact tables and identify measurable **facts** (metrics) and their relationships to the surrounding dimension tables.
+ Design highly denormalized dimension structures to reduce joins and simplify querying for end-users and analytical tools.
+ Collaborate with stakeholders and cross-functional teams to understand data requirements and design appropriate data models that align with business needs.
+ Create and maintain data dictionaries and metadata repositories to ensure consistency and integrity of data models.
+ Identify and resolve data model performance issues to optimize database performance and enhance overall system functionality.
+ Manage and provide guidance to a small team of 1-3 resources
+ Document and communicate data model designs and standards to ensure understanding and compliance across the organization.
+ Create detailed **source-to-target mapping (STM) documentation** and data transformation specifications for ETL/ELT developers, including all SCD Type 2 logic and surrogate key generation rules.
+ Stay current with industry best practices and trends in data modelling and incorporate new techniques and methodologies into our data modelling processes.
+ Guide team on implementing data modelling techniques such as Kimball's dimensional modeling
+ Developing conceptual, logical, and physical data models to support data analysis and business intelligence.
+ Utilize industry-standard data modeling tools (e.g., Erwin, PowerDesigner, ER/Studio) to create and maintain conceptual, logical, and physical models, and to manage version control.
+ Having cloud knowledge and certification will be an advantage. Preferably Azure DP-203 certification.
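For illustration, here is a minimal sketch of the SCD Type 2 pattern referenced above, expressed in pandas rather than a warehouse ETL tool: changed current rows are expired and new versions are appended with fresh surrogate keys, effective dates, and a current-row flag. Table and column names are hypothetical.

```python
import pandas as pd

TODAY = pd.Timestamp("2024-01-31")
HIGH_DATE = pd.Timestamp("2999-12-31")

# Current SCD Type 2 dimension: surrogate key, natural key, tracked attribute,
# effective date range, and a current-row flag.
dim_customer = pd.DataFrame({
    "customer_sk":    [1, 2],
    "customer_id":    ["C001", "C002"],            # natural/business key
    "address":        ["12 Oak St", "9 Elm Ave"],
    "effective_from": [pd.Timestamp("2023-01-01")] * 2,
    "effective_to":   [HIGH_DATE] * 2,
    "is_current":     [True, True],
})

# Incoming snapshot from the source system (C001 moved, C003 is new).
incoming = pd.DataFrame({
    "customer_id": ["C001", "C002", "C003"],
    "address":     ["77 Pine Rd", "9 Elm Ave", "3 Birch Ln"],
})

def apply_scd2(dim, snap):
    """Apply SCD Type 2 logic for one tracked attribute. Mutates `dim` in place for brevity."""
    current = dim[dim["is_current"]]
    merged = snap.merge(current, on="customer_id", how="left", suffixes=("", "_cur"))
    changed = merged[merged["address_cur"].notna() & (merged["address"] != merged["address_cur"])]
    new_keys = merged[merged["address_cur"].isna()]

    # 1) Expire current rows whose tracked attribute changed.
    expire_mask = dim["is_current"] & dim["customer_id"].isin(changed["customer_id"])
    dim.loc[expire_mask, ["effective_to", "is_current"]] = [TODAY, False]

    # 2) Insert new versions for changed rows and brand-new natural keys.
    inserts = pd.concat([changed, new_keys])[["customer_id", "address"]].copy()
    next_sk = dim["customer_sk"].max() + 1
    inserts["customer_sk"] = range(next_sk, next_sk + len(inserts))
    inserts["effective_from"] = TODAY
    inserts["effective_to"] = HIGH_DATE
    inserts["is_current"] = True
    return pd.concat([dim, inserts], ignore_index=True)

print(apply_scd2(dim_customer, incoming))
```

In a production EDW the same logic would typically be implemented as a warehouse MERGE or via the ETL/ELT framework, following the source-to-target mapping and surrogate key generation rules documented for the model.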
**To qualify for the role, you must have**
+ A Bachelor's degree in STEM
+ 10+ years of hands-on experience in data modelling
+ Strong understanding of SQL and data modelling concepts such as Kimball dimensional modeling
+ Experience working with major enterprise relational database platforms or cloud data warehouses (e.g., Oracle, SQL Server).
+ Strong understanding of **data warehousing** principles, ETL/ELT architecture, data quality management, and metadata management.
+ **Mastery of SCD Type 2** (SCD2) implementation for preserving historical data (e.g., using **surrogate keys**, start/end dates, and current/active flags). Familiarity with other SCD types (Type 1, 3, 4, 6) is also expected.
+ The ability to analyze complex business processes and translate abstract business requirements into precise, efficient technical data models.
+ Proficiency in data modelling tools such as Erwin / Visio or Power Designer
+ Experience in designing and implementing database structures
+ Ability to create and maintain conceptual, logical, and physical data models
+ Knowledge of industry best practices for data modelling and database design
+ Ability to collaborate with cross-functional teams to gather and analyze data requirements
+ Experience in performance tuning and optimization of database queries
+ Having cloud certification is an added advantage
+ Strong analytical and problem-solving skills
+ Excellent communication and documentation skills.
+ Knowledge of the Hospitality domain is mandatory.
**Ideally, you'll also have**
+ Strong business acumen with the ability to manage complex client relationships
+ Proven experience in managing and leading teams within a dynamic environment
+ Excellent communication and interpersonal skills
+ Cloud knowledge and certification will be an advantage. Preferably Azure DP-203 certification.
+ Experience in leading and influencing teams, with a focus on mentorship and professional development.
+ A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
+ The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
+ Spanish speaking
+ Italian speaking
**What we look for**
We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Business Intelligence Data Modeler
Senior data scientist job in Milwaukee, WI
CapB is a global leader in IT Solutions and Managed Services. Our R&D is focused on providing cutting-edge products and solutions across Digital Transformations, from Cloud, AI/ML, IoT, and Blockchain to MDM/PIM, Supply Chain, ERP, CRM, HRMS, and Integration solutions. For our growing needs we need consultants who can work with us on a salaried or contract basis. We provide industry-standard benefits and an environment for learning and growth.
For one of our ongoing projects we are looking for a Business Intelligence Data Modeler. The position is based out of Milwaukee.
Responsibilities:
The Business Intelligence Analyst performs a variety of project-oriented tasks to support the information needs of the organization.
This position is responsible for all phases of reporting, decision support, and data analysis activities including report design, measure development, data collection, summarization, and validation.
The BI Analyst exercises independent judgement and discretionary decision making related to managing multiple and more complex reporting projects.
The BI Analyst is proficient with analytical and reporting tools, database development, ETL processes, query languages, and database and spreadsheet tools.
The BI Analyst will participate in reporting and presentations to various levels of management and staff and may also be included in action plan development.
BI Analyst will have in depth experience with Power BI to create dashboards, data exploration, visualization, and data storytelling from concept to final deliverables.
Advanced experience with Power BI, Power BI dataflows, and dashboards. Technical expertise with data modeling and design to interact with multiple sources of data.
Ability to write complex DAX code and SQL queries for data manipulation.
Skills:
10 years of experience required in data analysis.
10 years of experience required in dashboarding/Business Objects Xcelsius.
Staff Data Engineer
Senior data scientist job in Park City, IL
Abbott is a global healthcare leader that helps people live more fully at all stages of life. Our portfolio of life-changing technologies spans the spectrum of healthcare, with leading businesses and products in diagnostics, medical devices, nutritionals and branded generic medicines. Our 114,000 colleagues serve people in more than 160 countries.
We're focused on helping people with diabetes manage their health with life-changing products that provide accurate data to drive better-informed decisions. We're revolutionizing the way people monitor their glucose levels with our new sensing technology.
**Working at Abbott**
At Abbott, you can do work that matters, grow, and learn, care for yourself and family, be your true self and live a full life. You'll also have access to:
+ Career development with an international company where you can grow the career you dream of.
+ Employees can qualify for free medical coverage in our Health Investment Plan (HIP) PPO medical plan in the next calendar year
+ An excellent retirement savings plan with high employer contribution
+ Tuition reimbursement, the Freedom 2 Save (******************************************************************************************************* student debt program and FreeU (*************************************************************************************************************** education benefit - an affordable and convenient path to getting a bachelor's degree.
+ A company recognized as a great place to work in dozens of countries around the world and named one of the most admired companies in the world by Fortune.
+ A company that is recognized as one of the best big companies to work for as well as a best place to work for diversity, working mothers, female executives, and scientists.
**THE OPPORTUNITY**
This **Staff Data Engineer** position can work **remotely within the U.S.**
Are you ready to apply your technical expertise to make a real impact in the medical field and help improve the lives of people with diabetes? This role offers the opportunity to lead cloud-based big data engineering efforts, including data wrangling, analysis, and pipeline development. You'll help define and implement the organization's Big Data strategy, working closely with data engineers, analysts, and scientists to solve complex business problems using data science and machine learning.
As a staff member of the Data Engineering & Analytics team, you will be building big data collection and analytics capabilities to uncover customer, product and operational insights. Candidate should be able to work on a geographically distributed team to develop data pipelines capable of handling complex data sets quickly and securely as well as operationalize data science solutions. Additionally, they will be working in a technology-driven environment utilizing the latest tools and techniques such as Databricks, Redshift, S3, Lambda, DynamoDB, Spark and Python.
The candidate should have a passion for software engineering to help shape the direction of the team. Highly sought-after qualities include versatility and a desire to continuously learn, improve, and empower other team members. Candidate will support building scalable, highly available, efficient, and secure software solutions for big data initiatives.
**What You'll Work On**
+ Design and implement data pipelines whose outputs are processed and visualized across a variety of projects and initiatives (see the sketch after this list)
+ Develop and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS using AWS native services.
+ Design and optimize data models on AWS Cloud using Databricks and AWS data stores such as Redshift, RDS, S3
+ Integrate and assemble large, complex data sets that meet a broad range of business requirements
+ Read, extract, transform, stage and load data to selected tools and frameworks as required and requested
+ Customizing and managing integration tools, databases, warehouses, and analytical systems
+ Process unstructured data into a form suitable for analysis and assist in analysis of the processed data
+ Working directly with the technology and engineering teams to integrate data processing and business objectives
+ Monitoring and optimizing data performance, uptime, and scale; Maintaining high standards of code quality and thoughtful design
+ Create software architecture and design documentation for the supported solutions and overall best practices and patterns
+ Support team with technical planning, design, and code reviews including peer code reviews
+ Provide Architecture and Technical Knowledge training and support for the solution groups
+ Develop good working relations with the other solution teams and groups, such as Engineering, Marketing, Product, Test, QA.
+ Stay current with emerging trends, making recommendations as needed to help the organization innovate
+ Proactively planning complex projects from scope/timeline development through technical design and execution.
+ Demonstrate leadership through mentoring other team members.
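Purely as a sketch of the pipeline work described above (not Abbott's actual pipeline), the example below uses PySpark to clean a tiny in-memory set of device readings, apply a basic data-quality filter, and produce a daily aggregate; the device/glucose column names and the commented S3 path are placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal batch-transform sketch, assuming a local Spark session; in a
# Databricks/AWS stack this would read from S3 and write to a curated store.
spark = SparkSession.builder.appName("readings-demo").getOrCreate()

raw = spark.createDataFrame(
    [("dev-001", "2024-03-01T08:00:00", 105.0),
     ("dev-001", "2024-03-01T08:05:00", None),      # missing reading to filter out
     ("dev-002", "2024-03-01T08:00:00", 98.5)],
    ["device_id", "reading_ts", "glucose_mg_dl"],
)

clean = (
    raw
    .withColumn("reading_ts", F.to_timestamp("reading_ts"))
    .filter(F.col("glucose_mg_dl").isNotNull())           # basic data-quality gate
    .withColumn("reading_date", F.to_date("reading_ts"))  # partition-friendly column
)

# Daily aggregate per device: the kind of curated dataset downstream analytics consume.
daily = clean.groupBy("device_id", "reading_date").agg(
    F.avg("glucose_mg_dl").alias("avg_glucose_mg_dl"),
    F.count("*").alias("n_readings"),
)

daily.show()
# daily.write.mode("overwrite").parquet("s3://<bucket>/curated/daily_readings/")  # placeholder path
spark.stop()
```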
**Required Qualifications**
+ Bachelors Degree in Computer Science, Information Technology or other relevant field
+ **At least 5 to 10 years** of recent experience in Software Engineering, Data Engineering or Big Data
+ Ability to work effectively within a team in a fast-paced changing environment
+ Knowledge of or direct experience with Databricks and/or Spark.
+ Software development experience, ideally in Python, PySpark, Kafka or Go, and a willingness to learn new software development languages to meet goals and objectives.
+ Knowledge of strategies for processing large amounts of structured and unstructured data, including integrating data from multiple sources
+ Knowledge of data cleaning, wrangling, visualization and reporting
+ Ability to explore new alternatives or options to solve data mining issues, and utilize a combination of industry best practices, data innovations and experience
+ Familiarity with databases, BI applications, data quality, and performance tuning
+ Excellent written, verbal and listening communication skills
+ Comfortable working asynchronously with a distributed team
**Preferred Qualifications**
+ Knowledge of or direct experience with the following AWS Services desired S3, RDS, Redshift, DynamoDB, EMR, Glue, and Lambda.
+ Experience working in an agile environment
+ Practical Knowledge of Linux
**Learn more about our health and wellness benefits, which provide the security to help you and your family live full lives:** ********************** (http://**********************/pages/candidate.aspx)
Follow your career aspirations to Abbott for diverse opportunities with a company that can help you build your future and live your best life. Abbott is an Equal Opportunity Employer, committed to employee diversity.
Connect with us at ************** , on Facebook at *********************** and on Twitter @AbbottNews and @AbbottGlobal
The base pay for this position is $97,300.00 - $194,700.00. In specific locations, the pay range may vary from the range posted.
An Equal Opportunity Employer
Abbott welcomes and encourages diversity in our workforce.
We provide reasonable accommodation to qualified individuals with disabilities.
To request accommodation, please call ************ or email ******************
Data Engineer - Platform & Product
Senior data scientist job in Milwaukee, WI
We are seeking a skilled and solution-oriented Data Engineer to contribute to the development of our growing Data Engineering function. This role will be instrumental in designing and optimizing data workflows, building domain-specific pipelines, and enhancing platform services. The ideal candidate will help evolve our Snowflake-based data platform into a scalable, domain-oriented architecture that supports business-critical analytics and machine learning initiatives.
Responsibilities
The candidate is expected to:
* Design and build reusable platform services, including pipeline frameworks, CI/CD workflows, data validation utilities, data contracts, and lineage integrations (see the sketch after this list)
* Develop and maintain data pipelines for sourcing, transforming, and delivering trusted datasets into Snowflake
* Partner with Data Domain Owners to onboard new sources, implement data quality checks, and model data for analytics and machine learning use cases
* Collaborate with the Lead Data Platform Engineer and Delivery Manager to deliver monthly feature releases and support bug remediation
* Document and promote best practices for data pipeline development, testing, and deployment
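As a hedged sketch of what a reusable data-contract validation utility might look like (the posting names the concept but not an implementation), the example below checks a toy pricing DataFrame against a declared column contract; the contract fields and dataset are illustrative assumptions, not Artisan Partners' platform code.

```python
from dataclasses import dataclass
import pandas as pd

@dataclass(frozen=True)
class ColumnSpec:
    """One column in a data contract: name, expected pandas dtype, and nullability."""
    name: str
    dtype: str
    nullable: bool = False

# Hypothetical contract for a daily pricing dataset.
PRICE_CONTRACT = [
    ColumnSpec("security_id", "object"),
    ColumnSpec("price_date", "datetime64[ns]"),
    ColumnSpec("close_price", "float64"),
    ColumnSpec("currency", "object", nullable=True),
]

def validate(df: pd.DataFrame, contract: list[ColumnSpec]) -> list[str]:
    """Return a list of human-readable contract violations (empty means pass)."""
    errors = []
    for col in contract:
        if col.name not in df.columns:
            errors.append(f"missing column: {col.name}")
            continue
        if str(df[col.name].dtype) != col.dtype:
            errors.append(f"{col.name}: expected {col.dtype}, got {df[col.name].dtype}")
        if not col.nullable and df[col.name].isna().any():
            errors.append(f"{col.name}: nulls not allowed")
    return errors

prices = pd.DataFrame({
    "security_id": ["US0378331005", "US5949181045"],
    "price_date": pd.to_datetime(["2024-06-03", "2024-06-03"]),
    "close_price": [194.03, 415.13],
    # "currency" column intentionally omitted to show a reported violation
})
print(validate(prices, PRICE_CONTRACT))
```

A utility like this is the kind of shared service domain teams can call from their own pipelines, keeping quality checks consistent across the platform.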
Qualifications
The successful candidate will possess strong analytical skills and attention to detail. Additionally, the ideal candidate will possess:
* 3-6 years of experience in data engineering or analytics engineering
* Strong SQL and Python skills; experience with dbt or similar transformation frameworks
* Demonstrated experience building pipelines and services on Snowflake or other modern cloud data platforms
* Understanding of data quality, validation, lineage, and schema evolution
* Background in financial or market data (trading, pricing, benchmarks, ESG) is a plus
* Strong collaboration and communication skills, with a passion for enabling domain teams
Privacy Notice for California Applicants
Artisan Partners Limited Partnership is an equal opportunity employer. Artisan Partners does not discriminate on the basis of race, religion, color, national origin, gender, age, disability, marital status, sexual orientation or any other characteristic protected under applicable law. All employment decisions are made on the basis of qualifications, merit and business need.
#LI-Hybrid
Data Engineer
Senior data scientist job in Milwaukee, WI
*Hybrid Work Schedule - Coming into ManpowerGroup Headquarters in Milwaukee, WI*
The purpose of the Data Engineer role is to design and implement data models that maintain data integrity by staying informed of the ways the data is used within the organization and its information systems. The Data Engineer must be highly experienced in the analytics environment, so that they possess a strong foundation for understanding as well as creating models that meet business requirements.
Making an Impact
* Working with business leaders, business analysts and web architects to define a long-term vision for the Production system application data and a strategy to implement that vision
* Defining, documenting, and enforcing the use of standards, policies and procedures for use by the team, including naming conventions, estimating guidelines, review and promotion procedures, security and user/role access, etc.
* Understanding of a phased project approach and the ability to identify the tasks, timelines, and interdependencies with other teams' tasks to complete the project on time and on budget
* Creating and maintaining the conceptual (logical) and physical data models
* Facilitating meetings and communications
o with the business to define business requirements, to gain consensus/approval, and to communicate status
o within the project team to ensure a common understanding, drive decisions, and promote cohesion
o with other Data & Analytics functional teams to foster a collaborative work environment and drive issues to resolution
* Leveraging Production Database expertise to produce analytics platform data structures consumed by data modeling, dashboards, analytics, and other stakeholder needs
* Designing workflows to create scalable data solutions in an automated / scheduled manner
* Monitoring live data transformation workflows to ensure content availability
Sharing Expertise
* Participating in Production system development review sessions
* Influencing through functional knowledge/solution options; leading workflow and influencing the team and customers to ensure they're aligned on timing and approach
* Responsible for recommending ways to continually improve own workflow, the department workflow, and delivery of solutions for the customers of the functional area. Responsible for bringing issues to management and presenting options for solutions.
* Acts as an escalation point to resolving problems and conflicts where necessary, ensuring alternate points of view are taken into account and problems are brought to management where necessary with pros and cons of the various options identified, as well as a recommendation where feasible.
Gaining Exposure
* Partnering with leaders across NA to understand their unique and shared areas of focus from a D&A perspective
* Sharing D&A perspective on how to solve or approach gaps that aren't yet identified by leaders, identifying alternatives, and providing recommendations to address them proactively
Your Typical Day
* Defining and documenting overall data architecture, infrastructure, and performance requirements and working with the DBA to implement it
Required
* Bachelor's degree
* 2 years working in an analytics or IT environment or specialized Bachelor's degree in data modeling or similar focus
Nice to Have
* Experience with conceptual data modeling methodologies, and the development of architectures supporting web-based applications
ManpowerGroup is proud to be an equal opportunity affirmative action workplace. We celebrate diversity and are committed to providing an inclusive environment for all employees. Qualified applicants will receive consideration for employment without regard to race, religion, creed, color, national origin, citizenship, marital status, pregnancy (including childbirth, lactation and related medical conditions), age, gender, gender identity or expression, sexual orientation, protected veteran status, political ideology, ancestry, the presence of any physical, sensory, or mental disabilities, or other legally protected status.
A strong commitment is made by each employee and is necessary to ensure equal employment opportunity for all. ManpowerGroup is an inclusive workplace that will recruit, hire, train, and promote persons of all job titles, and ensure all other personnel actions are administered without regard to non-merit-based characteristics of individuals.
Reasonable accommodation during the interview process can be provided. Contact talentacquisition@manpowergroup.com for assistance.
Data Engineer
Senior data scientist job in Mequon, WI
Charter Manufacturing is a fourth-generation family-owned business where our will to grow drives us to do it better. Join the team and become part of our family!
Applicants must be authorized to work for ANY employer in the U.S. Charter Manufacturing is unable to sponsor for employment visas at this time.
This position is hybrid, 3 days a week in office in Mequon, WI.
BI&A- Lead Data Engineer
Charter Manufacturing continues to invest in Data & Analytics. Come join a great team and great culture leveraging your expertise to drive analytics transformation across Charter's companies. This is a key role in the organization that will provide thought leadership, as well as add substantial value by delivering trusted data pipelines that will be used to develop models and visualizations that tell a story and solve real business needs/problems. This role will collaborate with team members and business stakeholders to leverage data as an asset driving business outcomes aligned to business strategies.
Seven or more years of prior experience developing data pipelines and partnering with team members and business stakeholders to drive adoption will be critical to success in this role.
MINIMUM QUALIFICATIONS:
Bachelor's degree in computer science, data science, software engineering, information systems, or related quantitative field; master's degree preferred
At least seven years of work experience in data management disciplines, including data integration, modeling, optimization and data quality, or other areas directly relevant to data engineering responsibilities and tasks
Proven project experience designing, developing, deploying, and maintaining data pipelines used to support AI, ML, and BI using big data solutions (Azure, Snowflake)
Strong knowledge in Azure technologies such as Azure Web Application, Azure Data Explorer, Azure DevOps, and Azure Blob Storage to build scalable and efficient data pipelines
Strong knowledge of programming languages such as R, Python, and C#, and of Azure Machine Learning Workspace development
Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks)
Experience with database technologies such as SQL, Oracle, and Snowflake
Prior experience with ETL/ELT data ingestion into data lakes/data warehouses for analytics consumption
Strong SQL skills
Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products
Passionate about teaching, coaching, and mentoring others
Strong problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems, and the ability to recognize and solve problems
Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals
Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options
Demonstrated experience delivering business value by structuring and analyzing large, complex data sets
Demonstrated initiative, strong sense of accountability, collaboration, and known as a trusted business partner
PREFERRED QUALIFICATIONS INCLUDE EXPERIENCE WITH:
Manufacturing industry experience, specifically heavy industry, supply chain, and operations
Designing and supporting data integrations with ERP systems such as Oracle or SAP
MAJOR ACCOUNTABILITIES:
Designs, develops, and supports data pipelines for batch and streaming data extraction from various sources (databases, APIs, external systems), transforming it into the desired format, and loading it into the appropriate data storage systems (a minimal transformation sketch follows this list)
Collaborates with data scientists and analysts to optimize models and algorithms in accordance with data quality, security, and governance policies
Ensures data quality, consistency, and integrity during the integration process, performing data cleansing, aggregation, filtering, and validation as needed to ensure accuracy, consistency, and completeness of data
Optimizes data pipelines and data processing workflows for performance, scalability, and efficiency.
Monitors and tunes data pipelines for performance, scalability, and efficiency resolving performance bottlenecks
Establishes architecture patterns, design standards, and best practices to accelerate delivery and adoption of solutions
Assists, educates, and trains users to drive self-service enablement leveraging best practices
Collaborates with business subject matter experts, analysts, and offshore team members to develop and deliver solutions in a timely manner
Embraces and establishes governance of data and algorithms, quality, standards, and best practices, ensuring data accuracy
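To make the cleansing and aggregation accountability above concrete, here is a minimal PySpark sketch of a batch transform. The source rows, column names, and validation rules are invented for illustration and do not describe Charter's actual pipelines.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("illustrative_cleanse").getOrCreate()

# Hypothetical raw extract landed from a source system; values are made up.
raw = spark.createDataFrame(
    [
        ("P-100", "2024-05-01", 12.5),
        ("P-100", "2024-05-01", 12.5),   # duplicate record
        ("P-200", "2024-05-02", None),   # fails a basic validation rule
        ("P-300", "2024-05-02", 7.0),
    ],
    ["part_id", "ship_date", "weight_tons"],
)

cleansed = (
    raw
    .filter(F.col("weight_tons").isNotNull())          # drop rows failing validation
    .dropDuplicates(["part_id", "ship_date"])           # remove duplicate keys
    .withColumn("ship_date", F.to_date("ship_date"))    # normalize types
)

# Aggregate into an analytics-ready shape and land it in the target zone
daily = cleansed.groupBy("ship_date").agg(F.sum("weight_tons").alias("total_tons"))
daily.write.mode("overwrite").parquet("/tmp/illustrative/daily_shipments")
```

In practice the same pattern scales from a local sketch like this to scheduled Azure/Snowflake pipelines with monitoring around each step.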
We offer comprehensive health, dental, and vision benefits, along with a 401(k) plan that includes employer matching and profit sharing. Additionally, we offer company-paid life insurance, disability coverage, and paid time off (PTO).
Senior Data Engineer
Senior data scientist job in Milwaukee, WI
At Wipfli, people count. At Wipfli, our people are core to everything we do-the catalyst behind our ability to create exceptional impact and extraordinary results. We believe in flexibility. We focus on relationships. We encourage each individual to follow their own path.
People truly matter and they feel it. For those looking to make a difference and find a professional home, Wipfli offers a career-defining opportunity.
Requisition Number: 2025-7427
Position Overview:
This role will take direction from the Information Team Director and will be responsible for contributing to the continuous advancement of a modern data lakehouse built to support a rapidly growing firm's desire to democratize its data asset.
Responsibilities
Responsibilities:
+ Lead influence and consensus-building efforts for recommended solutions
+ Support and document the continued evolution of the firm's data lakehouse using the medallion architecture (a bronze-to-silver sketch follows this list)
+ Translate requirements into effective data models to support visualizations, AI/ML models, etc., leveraging design best practices and team standards using approved tools
+ Develop using data technologies such as Databricks, Microsoft Azure Data Factory, Python, and T-SQL
+ Manage the execution of project life cycle activities in accordance with the Information Team scrum processes and tools such as Microsoft Azure DevOps
+ Achieve/maintain proficiency in required skills identified by the Information Team to effectively deliver defined products
+ Collaborate with team members to evolve products and internal processes
+ Mentor other Engineers and IT associates as needed.
+ Perform after-hours on-call support as needed.
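As a small, hedged sketch of the medallion-style promotion referenced in the responsibilities above, the snippet below moves an illustrative dataset from a bronze to a silver layer. It assumes a Databricks (or other Delta-enabled) Spark runtime, and the paths and columns are hypothetical rather than the firm's actual lakehouse layout.

```python
from pyspark.sql import SparkSession, functions as F

# Assumes a Databricks / Delta-enabled Spark runtime; paths and columns are illustrative.
spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

bronze_path = "/tmp/lake/bronze/engagements"
silver_path = "/tmp/lake/silver/engagements"

# Bronze: raw, append-only landing of source data (seeded here for the sketch)
spark.createDataFrame(
    [
        ("E-1", "C-10", "2024-04-01"),
        ("E-1", "C-10", "2024-04-01"),   # duplicate from the source extract
        ("E-2", None, "2024-04-02"),     # missing client reference
    ],
    ["engagement_id", "client_id", "engagement_date"],
).write.format("delta").mode("overwrite").save(bronze_path)

# Silver: cleaned, conformed, de-duplicated records ready for modeling and BI
silver = (
    spark.read.format("delta").load(bronze_path)
    .filter(F.col("client_id").isNotNull())
    .withColumn("engagement_date", F.to_date("engagement_date"))
    .dropDuplicates(["engagement_id"])
)
silver.write.format("delta").mode("overwrite").save(silver_path)
```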
Knowledge, Skills and Abilities
Qualifications:
+ Demonstrated success in working on a modern data platform, with Databricks experience preferred. Accredited certification(s) and/or 5+ years of hands-on experience desired.
+ Naturally curious with the ability to learn and implement new concepts quickly.
+ A mastery of extracting and landing data from source systems via all access methods. Extra credit for Workday RaaS and/or Microsoft Dynamics/Dataverse skills.
+ A commitment to operational standards, quality, and accountability for testing, code reviews/management and documentation.
+ Engaged in the virtual team experience leveraging video conferencing (cameras on) and a focus on relationship building. Travel is rare, but we do occasionally organize in-person events.
Benjamin Dzanic, from our recruiting team, will be guiding you through this process. Visit his LinkedIn page (*************************************) to connect!
#LI-REMOTE #LI-BD1
Additional Details
Additional Details:
Wipfli is an equal opportunity/affirmative action employer. All candidates will receive consideration for employment without regard to race, creed, color, religion, national origin, sex, age, marital status, sexual orientation, gender identity, veteran status, disability, or any other characteristics protected by federal, state, or local laws.
Wipfli is committed to providing reasonable accommodations for people with disabilities. If you require a reasonable accommodation to complete an application, interview, or participate in our recruiting process, please send us an email at *************
Wipfli values fair, transparent, and competitive compensation, considering each candidate's unique skills and experiences. The estimated base pay range for this role is $107,000 to $144,000, with offers typically not made at the maximum, allowing for future salary increases. The actual salary at the time of offer depends on business related factors like location, skills, experience, training/education, licensure, certifications, business needs, current associate pay, and relevant employment laws.
Individuals may be eligible for an annual discretionary bonus, subject to participation rules and based on a variety of factors including, but not limited to, individual and Firm performance.
Wipfli cares about our associates and offers a variety of benefits to support their well-being. Highlights include 8 health plan options (both HMO & PPO plans), dental and vision coverage, opportunity to enroll in HSA with potential Firm contribution and an Employee Assistance Program. Other benefits include firm-sponsored basic life and short and long-term disability coverage, a 401(k) savings plan & profit share as well as Firm matching contribution, well-being incentive, education & certification assistance, flexible time off, family care leave, parental leave, family formation benefits, cell phone reimbursement, and travel rewards. Voluntary benefit offerings include critical illness & accident insurance, hospital indemnity insurance, legal, long-term care, pet insurance, ID theft protection, and supplemental life/AD&D. Eligibility for all benefits programs is dependent on annual hours expectation, position status/level and location.
"Wipfli" is the brand name under which Wipfli LLP and Wipfli Advisory LLC and its respective subsidiary entities provide professional services. Wipfli LLP and Wipfli Advisory LLC (and its respective subsidiary entities) practice in an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable law, regulations, and professional standards. Wipfli LLP is a licensed independent CPA firm that provides attest services to its clients, and Wipfli Advisory LLC provides tax and business consulting services to its clients. Wipfli Advisory LLC and its subsidiary entities are not licensed CPA firms.
Job LocationsUS
Job ID 2025-7427
Category Internal IT
Remote Yes
Associate Data Engineer
Senior data scientist job in Milwaukee, WI
Baker Tilly is a leading advisory, tax and assurance firm, providing clients with a genuine coast-to-coast and global advantage in major regions of the U.S. and in many of the world's leading financial centers - New York, London, San Francisco, Los Angeles, Chicago and Boston. Baker Tilly Advisory Group, LP and Baker Tilly US, LLP (Baker Tilly) provide professional services through an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable laws, regulations and professional standards. Baker Tilly US, LLP is a licensed independent CPA firm that provides attest services to its clients. Baker Tilly Advisory Group, LP and its subsidiary entities provide tax and business advisory services to their clients. Baker Tilly Advisory Group, LP and its subsidiary entities are not licensed CPA firms.
Baker Tilly Advisory Group, LP and Baker Tilly US, LLP, trading as Baker Tilly, are independent members of Baker Tilly International, a worldwide network of independent accounting and business advisory firms in 141 territories, with 43,000 professionals and a combined worldwide revenue of $5.2 billion. Visit bakertilly.com or join the conversation on LinkedIn, Facebook and Instagram.
Please discuss the work location status with your Baker Tilly talent acquisition professional to understand the requirements for an opportunity you are exploring.
Baker Tilly is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, gender identity, sexual orientation, or any other legally protected basis, in accordance with applicable federal, state or local law.
Any unsolicited resumes submitted through our website or to Baker Tilly Advisory Group, LP, employee e-mail accounts are considered property of Baker Tilly Advisory Group, LP, and are not subject to payment of agency fees. In order to be an authorized recruitment agency ("search firm") for Baker Tilly Advisory Group, LP, there must be a formal written agreement in place and the agency must be invited, by Baker Tilly's Talent Attraction team, to submit candidates for review via our applicant tracking system.
Job Description:
Associate Data Engineer
As a Senior Consultant - Associate Data Engineer, you will design, build, and optimize modern data solutions for our mid‑market and enterprise clients. Working primarily inside the Microsoft stack (Azure, Synapse, and Microsoft Fabric), you will transform raw data into trusted, analytics‑ready assets that power dashboards, advanced analytics, and AI use cases. You'll collaborate with solution architects, analysts, and client stakeholders while sharpening both your technical depth and consulting skills.
Key Responsibilities:
* Data Engineering: Develop scalable, well‑documented ETL/ELT pipelines using T‑SQL, Python, Azure Data Factory/Fabric Data Pipelines, and Databricks; implement best‑practice patterns for performance, security, and cost control.
* Modeling & Storage: Design relational and lakehouse models; create Fabric OneLake shortcuts, medallion‑style layers, and dimensional/semantic models for Power BI.
* Quality & Governance: Build automated data‑quality checks, lineage, and observability metrics; contribute to CI/CD workflows in Azure DevOps or GitHub.
* Client Delivery: Gather requirements, demo iterative deliverables, document technical designs, and translate complex concepts to non‑technical audiences.
* Continuous Improvement: Research new capabilities, share findings in internal communities of practice, and contribute to reusable accelerators. Collaborate with clients and internal stakeholders to design and implement scalable data engineering solutions.
Qualifications:
* Education - Bachelor's in Computer Science, Information Systems, Engineering, or related field (or equivalent experience)
* Experience - 2-3 years delivering production data solutions, preferably in a consulting or client‑facing role.
* Technical Skills:
Strong T‑SQL for data transformation and performance tuning.
Python for data wrangling, orchestration, or notebook‑based development.
Hands‑on ETL/ELT with at least one Microsoft service (ADF, Synapse Pipelines, Fabric Data Pipelines).
* Project experience with Microsoft Fabric (OneLake, Lakehouses, Data Pipelines, Notebooks, Warehouse, Power BI DirectLake) preferred
* Familiarity with Databricks, Delta Lake, or comparable lakehouse technologies preferred
* Exposure to DevOps (YAML pipelines, Terraform/Bicep) and test automation frameworks preferred
* Experience integrating SaaS/ERP sources (e.g., Dynamics 365, Workday, Costpoint) preferred
AI Data Scientist
Senior data scientist job in Milwaukee, WI
What you Will Do
Clarios is seeking a skilled AI Data Scientist to design, develop, and deploy machine learning and AI solutions that unlock insights, optimize processes, and drive innovation across operations, offices, and products. This role focuses on transforming complex, high-volume data into actionable intelligence and enabling predictive and prescriptive capabilities that deliver measurable business impact. The AI Data Scientist will collaborate closely with AI Product Owners and business SMEs to ensure solutions are robust, scalable, and aligned with enterprise objectives.
This role requires an analytical, innovative, and detail-oriented team member with a strong foundation in AI/ML and a passion for solving complex problems. The individual must be highly collaborative, an effective communicator, and committed to continuous learning and improvement. This will be onsite three days a week in Glendale.
How you will do it
Hypothesis Framing & Metric Measurement: Translate business objectives into well-defined AI problem statements with clear success metrics and decision criteria. Prioritize opportunities by ROI, feasibility, risk, and data readiness; define experimental plans and acceptance thresholds to progress solutions from concept to scaled adoption.
Data Analysis & Feature Engineering: Conduct rigorous exploratory data analysis to uncover patterns, anomalies, and relationships across heterogeneous datasets. Apply advanced statistical methods and visualization to generate actionable insights; engineer high-value features (transformations, aggregations, embeddings) and perform preprocessing (normalization, encoding, outlier handling, dimensionality reduction). Establish data quality checks, schemas, and data contracts to ensure trustworthy inputs.
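A compact sketch of the preprocessing described above might look like the following, using scikit-learn. The plant-floor columns, outlier thresholds, and encodings are purely illustrative assumptions, not a description of any specific Clarios dataset.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical plant-floor dataset; column names and values are illustrative.
df = pd.DataFrame({
    "line_speed": [120.0, 118.5, 400.0, 121.2],   # 400.0 stands in for an outlier
    "temperature": [65.1, 64.8, 66.0, 65.5],
    "shift": ["A", "B", "A", "C"],
})

# Simple outlier handling: clip numeric features to the 1st-99th percentile range
numeric_cols = ["line_speed", "temperature"]
low, high = df[numeric_cols].quantile(0.01), df[numeric_cols].quantile(0.99)
df[numeric_cols] = df[numeric_cols].clip(lower=low, upper=high, axis=1)

# Normalization and categorical encoding bundled into one reusable transformer
preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric_cols),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["shift"]),
])
features = Pipeline([("prep", preprocess)]).fit_transform(df)
print(features.shape)   # (rows, engineered feature columns)
```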
Model Development & Iteration: Design and build models across classical ML and advanced techniques-deep learning, NLP, computer vision, time-series forecasting, anomaly detection, and optimization. Run statistically sound experiments (cross-validation, holdouts, A/B testing), perform hyperparameter tuning and model selection, and balance accuracy, latency, stability, and cost. Extend beyond prediction to prescriptive decision-making (policy, scheduling, setpoint optimization, reinforcement learning), with domain applications such as OEE improvement, predictive maintenance, production process optimization, and digital twin integration in manufacturing contexts.
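One common pattern for the statistically sound experimentation described above is cross-validated hyperparameter search on a training split plus a held-out evaluation. The sketch below uses scikit-learn on synthetic data and is a generic example under stated assumptions, not a prescription for any particular use case.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a real labeled dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_hold, y_train, y_hold = train_test_split(X, y, test_size=0.2, random_state=0)

# Cross-validated hyperparameter search on the training split only
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 3]},
    cv=5,
    scoring="roc_auc",
)
search.fit(X_train, y_train)

# Held-out evaluation guards against overfitting the search itself
print("best params:", search.best_params_)
print("holdout AUC:", search.score(X_hold, y_hold))
```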
MLOps & Performance: Develop end-to-end pipelines for ingestion, training, validation, packaging, and deployment using CI/CD, reproducibility, and observability best practices. Implement performance and drift monitoring, automated retraining triggers, rollback strategies, and robust versioning to ensure reliability in dynamic environments. Optimize for scale, latency, and cost; support real-time inference and edge/plant-floor constraints under defined SLAs/SLOs.
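Drift monitoring can be implemented in many ways; one simple, widely used signal is the population stability index (PSI) between a training baseline and live feature values. The sketch below is a minimal NumPy version with invented data and an assumed ~0.2 alert threshold, intended only to illustrate the idea of an automated retraining trigger.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a baseline (training) sample and a live sample of one feature.
    Values above ~0.2 are often treated as a sign of meaningful drift."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Avoid division by zero / log of zero in sparse bins
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 5_000)
live = rng.normal(0.4, 1.2, 5_000)   # shifted distribution simulating drift
print(f"PSI: {population_stability_index(baseline, live):.3f}")
```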
Collaboration & Vendor Leadership: Partner with AI Product Owners, business SMEs, IT, and operations teams to translate requirements into pragmatic, integrated solutions aligned with enterprise standards. Engage process owners to validate data sources, constraints, and hypotheses; design human-in-the-loop workflows that drive adoption and continuous feedback. Provide technical oversight of external vendors-evaluating capabilities, directing data scientists/engineers/solution architects, validating architectures and algorithms, and ensuring seamless integration, timely delivery, and measurable value. Mentor peers, set coding/modeling standards, and foster a culture of excellence.
Responsible AI & Knowledge Management: Ensure data integrity, model explainability, fairness, privacy, and regulatory compliance throughout the lifecycle. Establish model risk controls; maintain documentation (model cards, data lineage, decision logs), audit trails, and objective acceptance criteria for production release. Curate reusable assets (feature catalogs, templates, code libraries) and best-practice playbooks to accelerate delivery while enforcing Responsible AI principles and rigorous quality assurance.
What we look for
5+ years of experience in data science and machine learning, delivering production-grade solutions in corporate or manufacturing environments.
Strong proficiency in Python and common data science libraries (e.g., Pandas, NumPy, scikit-learn); experience with deep learning frameworks (TensorFlow, PyTorch) and advanced techniques (NLP, computer vision, time-series forecasting).
Hands-on experience with data preprocessing, feature engineering, and EDA for large, complex datasets.
Expertise in model development, validation, and deployment, including hyperparameter tuning, optimization, and performance monitoring.
Experience interacting with databases and writing SQL queries.
Experience using data visualization techniques for analysis and model explanation.
Familiarity with MLOps best practices-CI/CD pipelines, containerization (Docker), orchestration, model versioning, and drift monitoring.
Knowledge of cloud platforms (e.g., Microsoft Azure, Snowflake) and distributed computing frameworks (e.g., Spark) for scalable AI solutions.
Experience with agile methodologies and collaboration tools (e.g., JIRA, Azure DevOps), working in matrixed environments across IT, analytics, and business teams.
Strong analytical and business acumen, with the ability to quantify ROI and build business cases for AI initiatives.
Excellent communication and stakeholder engagement skills; able to present insights and recommendations to technical and non-technical audiences.
Knowledge of LLMs and VLMs is a strong plus.
Understanding of manufacturing systems (SCADA, PLCs, MES) and the ability to integrate AI models into operational workflows is a strong plus.
Willingness to travel up to 10% as needed.
#LI-AL1
#LI-HYBRID
What you get:
Medical, dental and vision care coverage and a 401(k) savings plan with company matching - all starting on date of hire
Tuition reimbursement, perks, and discounts
Parental and caregiver leave programs
All the usual benefits such as paid time off, flexible spending, short-and long-term disability, basic life insurance, business travel insurance, Employee Assistance Program, and domestic partner benefits
Global market strength and worldwide market share leadership
HQ location earns LEED certification for sustainability plus a full-service cafeteria and workout facility
Clarios has been recognized as one of 2025's Most Ethical Companies by Ethisphere. This prestigious recognition marks the third consecutive year Clarios has received this distinction.
Who we are:
Clarios is the force behind the world's most recognizable car battery brands, powering vehicles from leading automakers like Ford, General Motors, Toyota, Honda, and Nissan. With 18,000 employees worldwide, we develop, manufacture, and distribute energy storage solutions while recovering, recycling, and reusing up to 99% of battery materials-setting the standard for sustainability in our industry. At Clarios, we're not just making batteries; we're shaping the future of sustainable transportation. Join our mission to innovate, push boundaries, and make a real impact. Discover your potential at Clarios-where your power meets endless possibilities.
Veterans/Military Spouses:
We value the leadership, adaptability, and technical expertise developed through military service. At Clarios, those capabilities thrive in an environment built on grit, ingenuity, and passion-where you can grow your career while helping to power progress worldwide. All qualified applicants will be considered without regard to protected characteristics.
We recognize that people come with a wealth of experience and talent beyond just the technical requirements of a job. If your experience is close to what you see listed here, please apply. Diversity of experience and skills combined with passion is key to challenging the status quo. Therefore, we encourage people from all backgrounds to apply to our positions. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, status as a protected veteran or other protected characteristics protected by law. As a federal contractor, we are committed to not discriminating against any applicant or employee based on these protected statuses. We will also take affirmative action to ensure equal employment opportunities. Please let us know if you require accommodations during the interview process by emailing Special.Accommodations@Clarios.com. We are an Equal Opportunity Employer and value diversity in our teams in terms of work experience, area of expertise, and all characteristics protected by laws in the countries where we operate. For more information on our commitment to sustainability, diversity, and equal opportunity, please read our latest report. We want you to know your rights because EEO is the law.
A Note to Job Applicants: please be aware of scams being perpetrated through the Internet and social media platforms. Clarios will never require a job applicant to pay money as part of the application or hiring process.
To all recruitment agencies: Clarios does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Clarios employees or any other company location. Clarios is not responsible for any fees related to unsolicited resumes/CVs.
Security Lead-Data Protection
Senior data scientist job in Milwaukee, WI
blue Stone Recruiting is a national search firm with a focus of placing top Cyber Security talent from the Analyst level to CISO with prestigious organizations nationwide.
Job Description
Currently working with a Fortune 500 global manufacturing client that is looking for a Security Lead- Data Protection focused individual. This client is looking to add a permanent member to their security team that can grow within the company.
Responsibilities:
• Develop and lead the Security Lead-Global Data Protection program across all of the client's business units
• Build and manage an effective data classification program
• Develop and drive Security Lead-Data Protection policies, standards and processes
• Develop and drive a data loss prevention tools strategy, partnering with the company's Information Technology groups and the client's business units
• Align Security Lead-Data Protection programs with other information security and cross-functional programs
• Direct and improve the Security Lead-Data Protection and DLP programs and associated governance activities including metrics, issue tracking and remediation, and programs supporting client policies.
• Establish and maintain cross-business relationships with key Leads and stakeholders across the client's business, information security, and IT functions
• Coordinate with the Global IT Council to ensure compliance with the client's standards and share best practices
• Conduct research and make recommendations on products, services, and standards in support of all global infrastructure security efforts.
• Develop and maintain appropriate response playbooks, facilitate routine exercises, and ensure a sound communication process for all cyber events
Job Requirements:
Formal Education & Certification
• Bachelor's degree in Information Security, IT Engineering, or Computer Science with 5+ years of IT experience
• Industry-recognized security certification such as GIAC, CISSP, CISM, or CISA are preferred
Qualifications
• Minimum 5 years' information security experience including experience mapping and securing business processes / data flows
• Demonstrated experience building and leading a cyber-security program
• Advanced knowledge of data protection/DLP and Windows client/server security concepts, best practices, and procedures
• Solid “hands on” technical experience is essential
• Experience in Information Security Incident Response
• Experience in IaaS/SaaS environments
• Broad understanding of all aspects of IT and enterprise systems interoperability
• Ability to communicate technical topics (verbal and written) to multiple organizational levels
• Global enterprise experience is preferred
Personal Attributes:
• Demonstrated leadership managing direct and matrix-reporting global teams
• Demonstrated experience leading global programs across technology and business functions
• Strong interpersonal, written, and oral communication skills
• Able to conduct research into issues and products as required
• Ability to prioritize and execute tasks in a fast-paced environment and make sound decisions in emergency situations
• Highly self-motivated and directed
• Keen attention to detail
• Proven analytical and problem-solving abilities
Additional Information
Work with blue Stone Recruiting to find your next Cyber Security role. You can find us at *******************************. We look forward to speaking with you.
All your information will be kept confidential according to EEO guidelines.
Senior Data Engineer
Senior data scientist job in Park City, IL
Abbott is a global healthcare leader that helps people live more fully at all stages of life. Our portfolio of life-changing technologies spans the spectrum of healthcare, with leading businesses and products in diagnostics, medical devices, nutritionals and branded generic medicines. Our 114,000 colleagues serve people in more than 160 countries.
JOB DESCRIPTION:
We're focused on helping people with diabetes manage their health with life-changing products that provide accurate data to drive better-informed decisions. We're revolutionizing the way people monitor their glucose levels with our new sensing technology.
Working at Abbott
At Abbott, you can do work that matters, grow, and learn, care for yourself and family, be your true self and live a full life. You'll also have access to:
Career development with an international company where you can grow the career you dream of.
Employees can qualify for free medical coverage in our Health Investment Plan (HIP) PPO medical plan in the next calendar year
An excellent retirement savings plan with high employer contribution
Tuition reimbursement, the Freedom 2 Save student debt program and FreeU education benefit - an affordable and convenient path to getting a bachelor's degree.
A company recognized as a great place to work in dozens of countries around the world and named one of the most admired companies in the world by Fortune.
A company that is recognized as one of the best big companies to work for as well as a best place to work for diversity, working mothers, female executives, and scientists.
THE OPPORTUNITY
This Senior Data Engineer position can work out remotely within the U.S.
Are you ready to apply your technical expertise to make a real impact in the medical field and help improve the lives of people with diabetes? This role offers the opportunity to lead cloud-based big data engineering efforts, including data wrangling, analysis, and pipeline development. You'll help define and implement the organization's Big Data strategy, working closely with data engineers, analysts, and scientists to solve complex business problems using data science and machine learning.
As a senior member of the Data Engineering & Analytics team, you'll build scalable data solutions that uncover insights across customer behavior, product performance, and operations. You'll work in a distributed team environment using modern technologies like Databricks, Redshift, S3, Lambda, DynamoDB, Spark, and Python. The ideal candidate is passionate about software engineering, thrives in fast-paced environments, and brings versatility, curiosity, and a collaborative spirit to the team.
What You'll Work On
Design and implement data pipelines to be processed and visualized across a variety of projects and initiatives
Develop and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS using AWS native services.
Design and optimize data models on AWS Cloud using Databricks and AWS data stores such as Redshift, RDS, S3
Integrate and assemble large, complex data sets that meet a broad range of business requirements
Read, extract, transform, stage and load data to selected tools and frameworks as required and requested
Customizing and managing integration tools, databases, warehouses, and analytical systems
Process unstructured data into a form suitable for analysis and assist in analysis of the processed data (a small flattening sketch follows this list)
Working directly with the technology and engineering teams to integrate data processing and business objectives
Monitoring and optimizing data performance, uptime, and scale; Maintaining high standards of code quality and thoughtful design
Create software architecture and design documentation for the supported solutions and overall best practices and patterns
Support team with technical planning, design, and code reviews including peer code reviews
Provide Architecture and Technical Knowledge training and support for the solution groups
Develop good working relations with the other solution teams and groups, such as Engineering, Marketing, Product, Test, QA.
Stay current with emerging trends, making recommendations as needed to help the organization innovate
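As a small illustration of turning unstructured data into an analysis-ready form (see the note in the list above), the sketch below flattens hypothetical nested device readings with pandas. The record structure and field names are assumptions for the example, not Abbott's actual schema.

```python
import pandas as pd

# Hypothetical semi-structured device readings as they might land in object storage;
# field names and values are illustrative only.
raw_records = [
    {"device": {"id": "sensor-01", "fw": "3.2"},
     "readings": [{"ts": "2024-06-01T00:00:00Z", "glucose": 104}]},
    {"device": {"id": "sensor-02", "fw": "3.1"},
     "readings": [{"ts": "2024-06-01T00:05:00Z", "glucose": 98},
                  {"ts": "2024-06-01T00:10:00Z", "glucose": 101}]},
]

# Flatten the nested JSON into a tabular frame suitable for analysis
flat = pd.json_normalize(
    raw_records,
    record_path="readings",
    meta=[["device", "id"], ["device", "fw"]],
)
flat["ts"] = pd.to_datetime(flat["ts"])
print(flat)
```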
Qualifications
Bachelor's Degree in Computer Science, Information Technology or other relevant field
At least 2 to 6 years of recent experience in Software Engineering, Data Engineering or Big Data
Ability to work effectively within a team in a fast-paced changing environment
Knowledge of or direct experience with Databricks and/or Spark.
Software development experience, ideally in Python, PySpark, Kafka or Go, and a willingness to learn new software development languages to meet goals and objectives.
Knowledge of strategies for processing large amounts of structured and unstructured data, including integrating data from multiple sources
Knowledge of data cleaning, wrangling, visualization and reporting
Ability to explore new alternatives or options to solve data mining issues, and utilize a combination of industry best practices, data innovations and experience
Familiarity with databases, BI applications, data quality and performance tuning
Excellent written, verbal and listening communication skills
Comfortable working asynchronously with a distributed team
Preferred
Knowledge of or direct experience with the following AWS services desired: S3, RDS, Redshift, DynamoDB, EMR, Glue, and Lambda.
Experience working in an agile environment
Practical Knowledge of Linux
#software
Learn more about our health and wellness benefits, which provide the security to help you and your family live full lives: **********************
Follow your career aspirations to Abbott for diverse opportunities with a company that can help you build your future and live your best life. Abbott is an Equal Opportunity Employer, committed to employee diversity.
Connect with us at *************** on Facebook at *********************** and on Twitter @AbbottNews and @AbbottGlobal
The base pay for this position is
$75,300.00 - $150,700.00
In specific locations, the pay range may vary from the range posted.
JOB FAMILY: Product Development
DIVISION: ADC Diabetes Care
LOCATION: United States of America : Remote
ADDITIONAL LOCATIONS:
WORK SHIFT: Standard
TRAVEL: Yes, 10% of the Time
MEDICAL SURVEILLANCE: Not Applicable
SIGNIFICANT WORK ACTIVITIES: Continuous sitting for prolonged periods (more than 2 consecutive hours in an 8 hour day), Keyboard use (greater or equal to 50% of the workday)
Abbott is an Equal Opportunity Employer of Minorities/Women/Individuals with Disabilities/Protected Veterans.
EEO is the Law link - English: ************************************************************
EEO is the Law link - Espanol: ************************************************************
Databricks Data Engineer - Manager - Consulting - Location Open 1
Senior data scientist job in Milwaukee, WI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Manager**
We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills.
**The opportunity:**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include:
+ Understanding and analyzing business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.
**Key Responsibilities:**
As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including:
+ Leading workstream delivery and ensuring quality in all processes.
+ Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services.
+ Implementing resource plans and budgets while managing engagement economics.
This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs.
**Skills and attributes for success:**
To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact:
+ Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices.
+ Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
+ Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value.
+ Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance.
+ Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics.
+ Strong analytical and problem-solving abilities.
+ Excellent communication skills, with the ability to convey complex information clearly.
+ Proven experience in managing and delivering projects effectively.
+ Ability to build and manage relationships with clients and stakeholders.
**To qualify for the role, you must have:**
+ Bachelor's degree in computer science, Engineering, or a related field required; Master's degree preferred.
+ Typically, no less than 4-6 years of relevant experience in data engineering, with a focus on cloud data solutions and analytics.
+ Proven expertise in Databricks and experience with Spark for big data processing.
+ Strong background in data architecture and design, with experience in building complex cloud analytics solutions.
+ Experience in leading and managing teams, with a focus on mentoring and developing talent.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Managerial Role:**
+ **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision.
+ **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Large-Scale Implementation Programs:**
1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities.
2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing.
3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting.
**Ideally, you'll also have:**
+ Experience with advanced data analytics tools and techniques.
+ Familiarity with machine learning concepts and applications.
+ Knowledge of industry trends and best practices in data engineering.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for:**
We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more .
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Data Scientist, US Supply Chain
Senior data scientist job in Milwaukee, WI
What you will do
In this exciting role you will lead the effort to build and deploy predictive and prescriptive analytics in our next generation decision intelligence platform. The work will require helping to build /maintain a digital twin of our production supply chain, perform optimization and forecasting, and connect our analytics and ML solutions to enable our people to make the best data driven decisions possible!
This will require working with predictive, prescriptive analytics and decision-intelligence across the US / Canada region at Clarios. You'll apply modern statistics, machine learning and AI to real manufacturing and supply chain problems, working side-by-side with our business stakeholders and our global analytics team to deploy transformative solutions- not just models.
How you will do it
Build production-ready ML/statistical models (regression/classification, clustering, time series, linear / non-linear optimizations) to detect patterns, perform scenario analytics and generate actionable insights / outcomes. (A toy optimization sketch follows this section.)
Wrangle and analyze data with Python and SQL; perform feature engineering, data quality checks, and exploratory analysis to validate hypotheses and model readiness.
Develop digital solutions /visuals in Power BI and our decision intelligence platform to communicate results and monitor performance with business users.
Partner with stakeholders to clarify use cases, translate needs into technical tasks/user stories, and iterate solutions in sprints.
Manage model deployment (e.g., packaging models, basic MLOps) with guidance from Global Analytics
Document and communicate model methodology, assumptions, and results to non-technical audiences; support troubleshooting and continuous improvement of delivered analytics.
Deliver value realization as part of our business analytics team to drive positive business outcomes for our metals team.
Deliver incremental value quickly (first dashboards, baseline models) and iterate with stakeholder feedback.
Balance rigor with practicality-choose the simplest model that solves the problem and can be supported in production.
Keep data quality front-and-center; instrument checks to protect decisions from drift and bad inputs.
Travel: Up to ~10% for plant or stakeholder visits
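To give a flavor of the linear-optimization work mentioned in the first item above, the toy sketch below frames a two-plant shipping allocation as a linear program with SciPy. The plants, capacities, demands, and costs are entirely made up; a real digital-twin model would be far larger and driven by live supply chain data.

```python
import numpy as np
from scipy.optimize import linprog

# Toy allocation: ship material from 2 plants to 3 smelters at minimum cost.
# Costs, capacities, and demands are invented for illustration.
cost = np.array([
    [4.0, 6.0, 9.0],   # plant 1 -> smelters 1..3 ($/ton)
    [5.0, 3.0, 7.0],   # plant 2 -> smelters 1..3 ($/ton)
]).flatten()

supply = [80.0, 70.0]          # plant capacities (tons)
demand = [50.0, 60.0, 40.0]    # smelter requirements (tons)

# Supply constraints: shipments out of each plant <= capacity
A_ub = [[1, 1, 1, 0, 0, 0],
        [0, 0, 0, 1, 1, 1]]
# Demand constraints: shipments into each smelter == requirement
A_eq = [[1, 0, 0, 1, 0, 0],
        [0, 1, 0, 0, 1, 0],
        [0, 0, 1, 0, 0, 1]]

res = linprog(cost, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print("total cost:", round(res.fun, 1))
print("shipments (plants x smelters):")
print(res.x.reshape(2, 3))
```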
What we look for
Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or related field-or equivalent practical experience.
1-3 years (or strong internship/co-op) applying ML/statistics on business data.
Python proficiency (pandas, scikit-learn, SciPy/statsmodels) and SQL across common platforms (e.g., SQL Server, Snowflake).
Core math/stats fundamentals: probability, hypothesis testing/DoE basics, linear algebra, and the principles behind common ML methods.
Data visualization experience with Power BI / Decision Intelligence Platforms for analysis and stakeholder storytelling.
Ability to work in cross-functional teams and explain technical work clearly to non-technical partners. Candidates must be self-driven, curious, and creative
Preferred
Cloud & big data exposure: Azure (or AWS), Databricks/Spark; Snowpark is a plus.
Understanding of ETL/ELT tools such as ADF, SSIS, Talend, Informatica, or Matillion.
Experience building and deploying models in a Decision Intelligence platform such as Palantir, Aera, etc.
MLOps concepts (model validation, monitoring, packaging with Docker/Kubernetes).
Deep learning basics (PyTorch/Keras) for the right use cases.
Experience contributing to agile backlogs, user stories, and sprint delivery.
3+ years of experience in data analytics
Master's Degree in Statistics, Economics, Data Science, or Computer Science.
What you get:
Medical, dental and vision care coverage and a 401(k) savings plan with company matching - all starting on date of hire
Tuition reimbursement, perks, and discounts
Parental and caregiver leave programs
All the usual benefits such as paid time off, flexible spending, short-and long-term disability, basic life insurance, business travel insurance, Employee Assistance Program, and domestic partner benefits
Global market strength and worldwide market share leadership
HQ location earns LEED certification for sustainability plus a full-service cafeteria and workout facility
Clarios has been recognized as one of 2025's Most Ethical Companies by Ethisphere. This prestigious recognition marks the third consecutive year Clarios has received this distinction.
Who we are:
Clarios is the force behind the world's most recognizable car battery brands, powering vehicles from leading automakers like Ford, General Motors, Toyota, Honda, and Nissan. With 18,000 employees worldwide, we develop, manufacture, and distribute energy storage solutions while recovering, recycling, and reusing up to 99% of battery materials-setting the standard for sustainability in our industry. At Clarios, we're not just making batteries; we're shaping the future of sustainable transportation. Join our mission to innovate, push boundaries, and make a real impact. Discover your potential at Clarios-where your power meets endless possibilities.
Veterans/Military Spouses:
We value the leadership, adaptability, and technical expertise developed through military service. At Clarios, those capabilities thrive in an environment built on grit, ingenuity, and passion-where you can grow your career while helping to power progress worldwide. All qualified applicants will be considered without regard to protected characteristics.
We recognize that people come with a wealth of experience and talent beyond just the technical requirements of a job. If your experience is close to what you see listed here, please apply. Diversity of experience and skills combined with passion is key to challenging the status quo. Therefore, we encourage people from all backgrounds to apply to our positions. Please let us know if you require accommodations during the interview process by emailing Special.Accommodations@Clarios.com. We are an Equal Opportunity Employer and value diversity in our teams in terms of work experience, area of expertise, gender, ethnicity, and all other characteristics protected by laws in the countries where we operate. For more information on our commitment to sustainability, diversity, and equal opportunity, please read our latest report. We want you to know your rights because EEO is the law.
A Note to Job Applicants: please be aware of scams being perpetrated through the Internet and social media platforms. Clarios will never require a job applicant to pay money as part of the application or hiring process.
To all recruitment agencies: Clarios does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Clarios employees or any other company location. Clarios is not responsible for any fees related to unsolicited resumes/CVs.
Senior Data Engineer
Senior data engineer job in Park City, IL
Abbott is a global healthcare leader that helps people live more fully at all stages of life. Our portfolio of life-changing technologies spans the spectrum of healthcare, with leading businesses and products in diagnostics, medical devices, nutritionals and branded generic medicines. Our 114,000 colleagues serve people in more than 160 countries.
We're focused on helping people with diabetes manage their health with life-changing products that provide accurate data to drive better-informed decisions. We're revolutionizing the way people monitor their glucose levels with our new sensing technology.
**Working at Abbott**
At Abbott, you can do work that matters, grow, and learn, care for yourself and family, be your true self and live a full life. You'll also have access to:
+ Career development with an international company where you can grow the career you dream of.
+ Employees can qualify for free medical coverage through our Health Investment Plan (HIP) PPO in the next calendar year
+ An excellent retirement savings plan with high employer contribution
+ Tuition reimbursement, the Freedom 2 Save student debt program, and the FreeU education benefit, an affordable and convenient path to getting a bachelor's degree.
+ A company recognized as a great place to work in dozens of countries around the world and named one of the most admired companies in the world by Fortune.
+ A company that is recognized as one of the best big companies to work for as well as a best place to work for diversity, working mothers, female executives, and scientists.
**THE OPPORTUNITY**
This **Senior Data Engineer** position can be performed **remotely within the U.S.**
Are you ready to apply your technical expertise to make a real impact in the medical field and help improve the lives of people with diabetes? This role offers the opportunity to lead cloud-based big data engineering efforts, including data wrangling, analysis, and pipeline development. You'll help define and implement the organization's Big Data strategy, working closely with data engineers, analysts, and scientists to solve complex business problems using data science and machine learning.
As a senior member of the Data Engineering & Analytics team, you'll build scalable data solutions that uncover insights across customer behavior, product performance, and operations. You'll work in a distributed team environment using modern technologies like Databricks, Redshift, S3, Lambda, DynamoDB, Spark, and Python. The ideal candidate is passionate about software engineering, thrives in fast-paced environments, and brings versatility, curiosity, and a collaborative spirit to the team.
**What You'll Work On**
+ Design and implement data pipelines to be processed and visualized across a variety of projects and initiatives
+ Develop and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS using AWS native services (see the illustrative sketch after this list).
+ Design and optimize data models on AWS Cloud using Databricks and AWS data stores such as Redshift, RDS, S3
+ Integrate and assemble large, complex data sets that meet a broad range of business requirements
+ Read, extract, transform, stage and load data to selected tools and frameworks as required and requested
+ Customize and manage integration tools, databases, warehouses, and analytical systems
+ Process unstructured data into a form suitable for analysis and assist in analysis of the processed data
+ Work directly with the technology and engineering teams to integrate data processing and business objectives
+ Monitor and optimize data performance, uptime, and scale; maintain high standards of code quality and thoughtful design
+ Create software architecture and design documentation for the supported solutions and overall best practices and patterns
+ Support the team with technical planning, design, and code reviews, including peer code reviews
+ Provide Architecture and Technical Knowledge training and support for the solution groups
+ Develop good working relationships with other solution teams and groups, such as Engineering, Marketing, Product, Test, and QA.
+ Stay current with emerging trends, making recommendations as needed to help the organization innovate
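
The bullets above describe pipeline and ingestion work only in general terms; the snippet below is a minimal, illustrative PySpark sketch of that kind of job, assuming a Spark environment (such as Databricks) with S3 access. All bucket paths, column names, and the app name are hypothetical placeholders, not references to Abbott systems.

```python
# Illustrative only: a minimal PySpark ingestion job of the kind described above.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-ingestion").getOrCreate()

# Read raw, semi-structured event data landed in S3 (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/events/2024/")

# Basic cleaning: drop malformed rows, normalize timestamps, remove duplicates.
cleaned = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Write to a partitioned Parquet staging area (Delta would be typical on Databricks).
(
    cleaned.write
           .mode("append")
           .partitionBy("event_date")
           .parquet("s3://example-curated-bucket/events/")
)
```

In practice a job like this would be parameterized and scheduled (e.g., via a workflow orchestrator) rather than hard-coded, but the read-clean-partition-write pattern is the core of the pipeline work described here.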
**Qualifications**
+ Bachelor's degree in Computer Science, Information Technology, or other relevant field
+ 2 to 6 years of recent experience in Software Engineering, Data Engineering, or Big Data
+ Ability to work effectively within a team in a fast-paced changing environment
+ Knowledge of or direct experience with Databricks and/or Spark.
+ Software development experience, ideally in Python, PySpark, Kafka or Go, and a willingness to learn new software development languages to meet goals and objectives.
+ Knowledge of strategies for processing large amounts of structured and unstructured data, including integrating data from multiple sources
+ Knowledge of data cleaning, wrangling, visualization and reporting
+ Ability to explore new alternatives or options to solve data mining issues, and utilize a combination of industry best practices, data innovations and experience
+ Familiarity with databases, BI applications, data quality, and performance tuning
+ Excellent written, verbal and listening communication skills
+ Comfortable working asynchronously with a distributed team
**Preferred**
+ Knowledge of or direct experience with the following AWS services: S3, RDS, Redshift, DynamoDB, EMR, Glue, and Lambda.
+ Experience working in an agile environment
+ Practical Knowledge of Linux
**Learn more about our health and wellness benefits, which provide the security to help you and your family live full lives.**
Follow your career aspirations to Abbott for diverse opportunities with a company that can help you build your future and live your best life. Abbott is an Equal Opportunity Employer, committed to employee diversity.
Connect with us on Facebook and on Twitter @AbbottNews and @AbbottGlobal
The base pay for this position is $75,300.00 - $150,700.00. In specific locations, the pay range may vary from the range posted.
An Equal Opportunity Employer
Abbott welcomes and encourages diversity in our workforce.
We provide reasonable accommodation to qualified individuals with disabilities.
To request accommodation, please call ************ or email ******************