Sr Boomi Developer
Data engineer job in Kenosha, WI
Responsibilities:
Design and Architect Solutions: Bringing deep knowledge to design stable, reliable, and scalable integration solutions using the Dell Boomi AtomSphere platform and its components (Integration, API Management, MDM, etc.)
Hands-on Development: Designing, developing, and implementing complex integration processes, workflows, and APIs (REST/SOAP) to connect various applications (on-premises and cloud-based), ERP systems (like Microsoft Dynamics, Oracle EBS, SAP), and other data sources.
Data Transformation: Proficiently handling data formats such as XML, JSON, and CSV as well as database sources, and using Boomi's capabilities and scripting languages (like Groovy or JavaScript) for complex data mapping and transformations (a brief illustration follows this list).
Dell Boomi Platform Knowledge: Proficiency in Dell Boomi is crucial. Familiarize yourself with Boomi components such as connectors, processes, maps, and APIs. Understand how to design, build, and deploy integrations using Boomi.
API Development: Strong knowledge of RESTful and SOAP APIs. You'll create, consume, and manage APIs within Boomi.
Working with team members and business users to understand project requirements and deliver successful design, implementation, and post-implementation support.
Working closely with team members to translate business requirements into feasible and efficient technical solutions.
Develop and maintain documentation for integration and testing processes
Be highly accurate in activity assessment, effort estimation, and delivery commitment to ensure all project activities are delivered on time without compromising quality.
Diagnose complex technical issues and provide recommendations on solutions with consideration of best practices and longer-term impacts of decisions.
Lead/Perform third party testing, performance testing and UAT coordination.
Selecting the appropriate development platform(s) to execute business requirements and ensure post implementation success.
Serve as technical lead on projects to design, develop, test, document and deploy robust integration solutions.
Working both independently and as part of a team; collaborating closely with other IT and non-IT team members.
Assessing and troubleshooting production issues with a varying degree of priority and complexity.
Optimizing existing and developing new integration solutions to support business requirements.
Providing continuous support and management of the integration layer, ensuring the integrity of our data and integrations and removing single points of failure.
Good knowledge of best practices in error handling, logging, and monitoring.
Documenting and cross-training team members for support continuity.
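As a rough illustration of the data-mapping responsibility above (in Boomi itself this logic would normally live in a map shape or a Groovy/JavaScript scripting step rather than standalone code), the following Python sketch maps a source JSON record into a flat CSV layout; the field names and formats are hypothetical.

```python
# Minimal illustration (not Boomi-specific) of the field mapping and
# transformation work described above. Field names are hypothetical.
import csv
import io
import json
from datetime import datetime

source_record = json.loads(
    '{"OrderId": "SO-1001", "OrderDate": "2024-03-05T14:22:00Z", "Total": "199.99"}'
)

# Map source fields to the target layout and normalize types/formats.
target_record = {
    "order_number": source_record["OrderId"],
    "order_date": datetime.fromisoformat(
        source_record["OrderDate"].replace("Z", "+00:00")
    ).date().isoformat(),
    "order_total": float(source_record["Total"]),
}

# Emit the mapped record as a CSV row, mirroring a JSON-to-flat-file profile.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=target_record.keys())
writer.writeheader()
writer.writerow(target_record)
print(buffer.getvalue())
```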
Qualifications:
10-15 years of experience with enterprise integration platforms
Bachelor's degree in computer science
Troubleshooting Skills: Be adept at diagnosing and resolving integration issues. Familiarity with Boomi's debugging tools is valuable.
Security Awareness: Knowledge of authentication methods, encryption, and secure data transmission.
Experience and proven track record of implementing integration projects.
Extensible Stylesheet Language Transformations (XSLT) experience is a plus.
Project Management experience is a plus
Experience with ERP systems within a fast-moving wholesale, retail, and e-commerce environment is highly desirable.
Experience implementing Boomi with the Microsoft Dynamics ERP system is a plus.
Strong communication skills and the ability to work cross-functionally in a fast-paced environment.
Healthcare Data Analyst Lead (CMH Health)
Data engineer job in Brookfield, WI
Individual(s) must be legally authorized to work in the United States without the need for immigration support or sponsorship from Milliman now or in the future.
Milliman is seeking a technically savvy, analytically strong Healthcare Data Analyst Lead to manage analytics and technology projects as well as coach and mentor junior analytical staff. The ideal candidate is someone seeking to join a challenging, yet rewarding environment focused on delivering world-class analytics across a variety of healthcare-related domains to a variety of healthcare entities.
Who We Are
Independent for over 75 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, data analytics and business transformation.
Job Responsibilities
Lead and manage analytics and technology projects from data ingestion through delivery.
Design, develop, and optimize data processes and workflows supporting large healthcare datasets.
Perform and oversee ETL, validation, and transformation tasks for claims, eligibility, and pharmacy data (a brief validation sketch follows this list).
Guide project teams through the full “data-to-deliverable” lifecycle, ensuring accuracy and efficiency.
Build analytical models, dashboards, and data pipelines to support consulting engagements.
Collaborate with consultants, actuaries, and project managers to interpret results and deliver client insights.
Review and approve technical work from peers and junior analysts to ensure quality standards are met.
Mentor, coach, and delegate to analytical staff to strengthen technical and professional development.
Contribute to innovation and process improvement initiatives, including automation, cloud enablement, and AI integration.
Participate in client meetings and presentations, occasionally requiring travel.
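As a hedged illustration of the claims ETL and validation work described above, the following pandas sketch applies a few basic data-quality rules to a toy claims extract; the column names and rules are placeholders, not Milliman's actual standards.

```python
# Hypothetical sketch of a claims validation/transformation step of the kind
# described above; column names and rules are illustrative only.
import pandas as pd

claims = pd.DataFrame(
    {
        "claim_id": ["C001", "C002", "C003"],
        "member_id": ["M10", "M11", None],
        "paid_amount": [125.50, -20.00, 310.00],
        "service_date": ["2024-01-15", "2024-02-30", "2024-03-02"],
    }
)

# Basic data-quality checks: required keys, valid dates, non-negative paid amounts.
claims["service_date"] = pd.to_datetime(claims["service_date"], errors="coerce")
issues = pd.DataFrame(
    {
        "missing_member": claims["member_id"].isna(),
        "bad_service_date": claims["service_date"].isna(),
        "negative_paid": claims["paid_amount"] < 0,
    }
)
claims["passed_validation"] = ~issues.any(axis=1)

print(claims[["claim_id", "passed_validation"]])
print(issues.sum())  # failure count per rule, useful for a QA summary
```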
Minimum requirements
Bachelor's degree required (Computer Science, Management Information Systems, Computer Engineering, Math, Actuarial Science, Data Analytics, or related degree is preferred)
6+ years of experience in healthcare data analytics or a related technical analytics role.
Advanced proficiency with SQL and Microsoft Excel for data analysis, validation, and automation.
Strong programming skills in Python, R, or other analytical languages.
Experience with data visualization and reporting tools (e.g., Power BI, Tableau, or R Shiny).
Solid understanding of healthcare data structures, including claims, eligibility, and provider data.
Proven ability to lead multiple projects simultaneously while mentoring and developing junior team members.
Experience with cloud data technologies (e.g., Azure Data Factory, Databricks, Snowflake, AWS Redshift, or similar).
Exposure to AI or Generative AI tools for data analysis, automation, or insight generation is preferred, but not required.
Competencies and Behaviors that Support Success in this Role
Deep understanding of database architecture and large-scale healthcare data environments.
Strong analytical thinking and the ability to translate complex data into actionable insights.
Excellent communication skills, including the ability to explain technical concepts to non-technical audiences.
Highly organized, detail-oriented, and able to manage competing priorities.
Collaborative and proactive leadership style with a focus on mentorship and knowledge-sharing.
Passion for applying analytics to improve healthcare performance, quality, and cost outcomes.
Demonstrated accountability for quality, timelines, and client satisfaction.
Fast learner who thrives in a dynamic, innovation-driven environment.
The Team
The Healthcare Data Analyst Lead will join a team that thrives on leveraging data, analytics, and technology to deliver meaningful business value. This is a team with technical aptitude and analytical prowess that enjoys building efficient and scalable products and processes. Ultimately, we are passionate about effecting change in healthcare. We also believe that collaboration and communication are cornerstones of success.
The Healthcare Data Analyst Lead will also join a mix of Healthcare Analysts, Leads, Consultants, and Principals. In addition, as part of the broader Milliman landscape, they will work alongside Healthcare Actuaries, Pharmacists, Clinicians, and Physicians. We aim to provide everyone a supportive environment, where we foster learning and growth through rewarding challenges.
Salary:
The overall salary range for this role is $104,900 - $199,065.
For candidates residing in:
Alaska, California, Connecticut, Illinois, Maryland, Massachusetts, New Jersey, New York City, Newark, San Jose, San Francisco, Pennsylvania, Virginia, Washington, or the District of Columbia, the salary range is $120,635 - $199,065.
All other locations the salary range is $104,900 - $173,100.
A combination of factors will be considered, including, but not limited to, education, relevant work experience, qualifications, skills, certifications, etc.
Location:
It is preferred that candidates work on-site at our Brookfield, Wisconsin office; however, remote candidates will be considered.
The expected application deadline for this job is May 25, 2026.
Benefits
We offer a comprehensive benefits package designed to support employees' health, financial security, and well-being. Benefits include:
Medical, Dental and Vision - Coverage for employees, dependents, and domestic partners.
Employee Assistance Program (EAP) - Confidential support for personal and work-related challenges.
401(k) Plan - Includes a company matching program and profit-sharing contributions.
Discretionary Bonus Program - Recognizing employee contributions.
Flexible Spending Accounts (FSA) - Pre-tax savings for dependent care, transportation, and eligible medical expenses.
Paid Time Off (PTO) - Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis.
Holidays - A minimum of 10 observed holidays per year.
Family Building Benefits - Includes adoption and fertility assistance.
Paid Parental Leave - Up to 12 weeks of paid leave for employees who meet eligibility criteria.
Life Insurance & AD&D - 100% of premiums covered by Milliman.
Short-Term and Long-Term Disability - Fully paid by Milliman.
Equal Opportunity:
All qualified applicants will receive consideration for employment, without regard to race, color, religion, sex, sexual orientation, national origin, disability, or status as a protected veteran.
Databricks Data Engineer - Senior - Consulting - Location Open
Data engineer job in Milwaukee, WI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Senior**
We are seeking a highly skilled Senior Consultant Data Engineer with expertise in cloud data engineering, specifically Databricks. The ideal candidate will have strong client management and communication skills, along with a proven track record of successful end-to-end implementations in data engineering projects.
**The opportunity**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that technical requirements align with business needs. Your responsibilities will include creating scalable data architecture and modeling solutions that support the entire data asset lifecycle.
**Your key responsibilities**
As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. Your time will be spent on various responsibilities, including:
+ Designing, building, and operating scalable on-premises or cloud data architecture.
+ Analyzing business requirements and translating them into technical specifications.
+ Optimizing data flows for target data platform designs.
+ Design, develop, and implement data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Collaborate with clients to understand their data needs and provide tailored solutions that meet their business objectives.
+ Lead end-to-end data pipeline development, including data ingestion, transformation, and storage.
+ Ensure data quality, integrity, and security throughout the data lifecycle.
+ Provide technical guidance and mentorship to junior data engineers and team members.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay updated with the latest trends and technologies in data engineering and cloud computing.
This role offers the opportunity to work with cutting-edge technologies and stay ahead of industry trends, ensuring you gain a competitive advantage in the market. The position may require regular travel to meet with external clients.
**Skills and attributes for success**
To thrive in this role, you will need a blend of technical and interpersonal skills. Your ability to communicate effectively and build relationships will be crucial. Here are some key attributes we look for:
+ Strong analytical and decision-making skills.
+ Proficiency in cloud computing and data architecture design.
+ Experience in data integration and security.
+ Ability to manage complex problem-solving scenarios.
**To qualify for the role, you must have**
+ A Bachelor's degree in Computer Science, Engineering, or a related field required (4-year degree). Master's degree preferred
+ Typically no less than 2-4 years of relevant experience in data engineering, with a focus on cloud data solutions.
+ 5+ years of experience in data engineering, with a focus on cloud data solutions.
+ Expertise in Databricks and experience with Spark for big data processing.
+ Proven experience in at least two end-to-end data engineering implementations, including:
+ Implementation of a data lake solution using Databricks, integrating various data sources, and enabling analytics for business intelligence.
+ Development of a real-time data processing pipeline using Databricks and cloud services, delivering insights for operational decision-making (see the streaming sketch after this list).
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Experience with data modeling, ETL processes, and data warehousing concepts.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
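For the real-time pipeline experience mentioned above, the sketch below shows one common pattern: Spark Structured Streaming appending incoming JSON events to a Delta table. It assumes an existing Databricks/Spark environment with Delta enabled; the paths and schema are hypothetical placeholders, not a prescribed EY implementation.

```python
# Hedged sketch of a real-time ingestion pipeline of the kind referenced above,
# assuming a Spark/Databricks environment with Delta Lake available; paths and
# schema are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

event_schema = StructType(
    [
        StructField("event_id", StringType()),
        StructField("event_type", StringType()),
        StructField("event_time", TimestampType()),
    ]
)

# Read newly arriving JSON files from a landing path as a stream.
events = (
    spark.readStream.schema(event_schema)
    .json("/mnt/landing/events/")
    .withColumn("ingested_at", F.current_timestamp())
)

# Append the enriched stream to a Delta table for downstream analytics.
query = (
    events.writeStream.format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/events/")
    .start("/mnt/delta/events/")
)
```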
**Required Expertise for Senior Consulting Projects:**
+ **Strategic Thinking:** Ability to align data engineering solutions with business strategies and objectives.
+ **Project Management:** Experience in managing multiple projects simultaneously, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Ideally, you'll also have**
+ Experience with data quality management.
+ Familiarity with semantic layers in data architecture.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for**
We seek individuals who are not only technically proficient but also possess the qualities of top performers. You should be adaptable, collaborative, and driven by a desire to achieve excellence in every project you undertake.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $106,900 to $176,500. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $128,400 to $200,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an ongoing basis.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Senior Data Engineer
Data engineer job in Milwaukee, WI
At Wipfli, people count. At Wipfli, our people are core to everything we do-the catalyst behind our ability to create exceptional impact and extraordinary results. We believe in flexibility. We focus on relationships. We encourage each individual to follow their own path.
People truly matter and they feel it. For those looking to make a difference and find a professional home, Wipfli offers a career-defining opportunity.
Position Overview:
This role will take direction from the Information Team Director and will be responsible for contributing to the continuous advancement of a modern data lakehouse built to support a rapidly growing firm's desire to democratize its data asset.
Responsibilities:
+ Lead influence and consensus-building efforts for recommended solutions
+ Support and document the continued evolution of the firm's data lakehouse using the medallion architecture (a brief sketch of this layering follows this list)
+ Translate requirements into effective data models to support visualizations, AI/ML models, etc., leveraging design best practices and team standards using approved tools
+ Develop in data technologies such as Databricks, Microsoft Azure Data Factory, Python, and T-SQL
+ Manage the execution of project life cycle activities in accordance with the Information Team scrum processes and tools such as Microsoft Azure DevOps
+ Achieve/maintain proficiency in required skills identified by the Information Team to effectively deliver defined products
+ Collaborate with team members to evolve products and internal processes
+ Mentor other engineers and IT associates as needed.
+ Perform after-hours on-call support as needed.
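As a hedged sketch of the medallion layering referenced above, the following PySpark example refines a raw bronze table into a cleaned silver table and a business-ready gold aggregate. It assumes a Databricks workspace with Delta tables; all table and column names are hypothetical.

```python
# Hedged sketch of a medallion-style refinement flow (bronze -> silver -> gold),
# assuming a Databricks environment with Delta tables; table and column names
# are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw engagement records landed as-is from the source system.
bronze = spark.read.table("bronze.engagements_raw")

# Silver: cleaned and conformed records (typed columns, deduplicated keys).
silver = (
    bronze.withColumn("engagement_date", F.to_date("engagement_date"))
    .dropDuplicates(["engagement_id"])
    .filter(F.col("engagement_id").isNotNull())
)
silver.write.mode("overwrite").saveAsTable("silver.engagements")

# Gold: business-ready aggregate used by downstream visualizations.
gold = silver.groupBy("service_line").agg(
    F.countDistinct("engagement_id").alias("engagement_count"),
    F.sum("billed_amount").alias("total_billed"),
)
gold.write.mode("overwrite").saveAsTable("gold.engagement_summary")
```

The layer boundaries shown here are illustrative; in practice each firm defines its own cleansing and conformance rules per layer.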
Knowledge, Skills and Abilities
Qualifications:
+ Demonstrated success in working on a modern data platform, with Databricks experience preferred. Accredited certification(s) and/or 5+ years of hands-on experience desired.
+ Naturally curious with the ability to learn and implement new concepts quickly.
+ A mastery of extracting and landing data from source systems via all access methods. Extra credit for Workday RaaS and/or Microsoft Dynamics/Dataverse skills.
+ A commitment to operational standards, quality, and accountability for testing, code reviews/management and documentation.
+ Engaged in the virtual team experience leveraging video conferencing (cameras on) and a focus on relationship building. Travel is rare, but we do occasionally organize in-person events.
Benjamin Dzanic, from our recruiting team, will be guiding you through this process. Visit his LinkedIn (************************************* page to connect!
#LI-REMOTE #LI-BD1
Additional Details:
Wipfli is an equal opportunity/affirmative action employer. All candidates will receive consideration for employment without regard to race, creed, color, religion, national origin, sex, age, marital status, sexual orientation, gender identity, veteran status, disability, or any other characteristics protected by federal, state, or local laws.
Wipfli is committed to providing reasonable accommodations for people with disabilities. If you require a reasonable accommodation to complete an application, interview, or participate in our recruiting process, please send us an email at *************
Wipfli values fair, transparent, and competitive compensation, considering each candidate's unique skills and experiences. The estimated base pay range for this role is $107,000 to $144,000, with offers typically not made at the maximum, allowing for future salary increases. The actual salary at the time of offer depends on business related factors like location, skills, experience, training/education, licensure, certifications, business needs, current associate pay, and relevant employment laws.
Individuals may be eligible for an annual discretionary bonus, subject to participation rules and based on a variety of factors including, but not limited to, individual and Firm performance.
Wipfli cares about our associates and offers a variety of benefits to support their well-being. Highlights include 8 health plan options (both HMO & PPO plans), dental and vision coverage, opportunity to enroll in HSA with potential Firm contribution and an Employee Assistance Program. Other benefits include firm-sponsored basic life and short and long-term disability coverage, a 401(k) savings plan & profit share as well as Firm matching contribution, well-being incentive, education & certification assistance, flexible time off, family care leave, parental leave, family formation benefits, cell phone reimbursement, and travel rewards. Voluntary benefit offerings include critical illness & accident insurance, hospital indemnity insurance, legal, long-term care, pet insurance, ID theft protection, and supplemental life/AD&D. Eligibility for all benefits programs is dependent on annual hours expectation, position status/level and location.
"Wipfli" is the brand name under which Wipfli LLP and Wipfli Advisory LLC and its respective subsidiary entities provide professional services. Wipfli LLP and Wipfli Advisory LLC (and its respective subsidiary entities) practice in an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable law, regulations, and professional standards. Wipfli LLP is a licensed independent CPA firm that provides attest services to its clients, and Wipfli Advisory LLC provides tax and business consulting services to its clients. Wipfli Advisory LLC and its subsidiary entities are not licensed CPA firms.
Job Locations: US
Job ID 2025-7427
Category Internal IT
Remote Yes
Lead Data Scientist GenAI, Strategic Analytics - Data Science
Data engineer job in Milwaukee, WI
Deloitte is at the leading edge of GenAI innovation, transforming Strategic Analytics and shaping the future of Finance. We invite applications from highly skilled and experienced Lead Data Scientists ready to drive the development of our next-generation GenAI solutions.
The Team
Strategic Analytics is a dynamic part of our Finance FP&A organization, dedicated to empowering executive leaders across the firm, as well as our partners in financial and operational functions. Our team harnesses the power of cloud computing, data science, AI, and strategic expertise-combined with deep institutional knowledge-to deliver insights that inform our most critical business decisions and fuel the firm's ongoing growth.
GenAI is at the forefront of our innovation agenda and a key strategic priority for our future. We are rapidly developing groundbreaking products and solutions poised to transform both our organization and our clients. As part of our team, the selected candidate will play a pivotal role in driving the success of these high-impact initiatives.
Recruiting for this role ends on December 14, 2025
Work You'll Do
Client Engagement & Solution Scoping
+ Partner with stakeholders to analyze business requirements, pain points, and objectives relevant to GenAI use cases.
+ Facilitate workshops to identify, prioritize, and scope impactful GenAI applications (e.g., text generation, code synthesis, conversational agents).
+ Clearly articulate GenAI's value proposition, including efficiency gains, risk mitigation, and innovation.
Solution Architecture & Design
+ Architect holistic GenAI solutions, selecting and customizing appropriate models (GPT, Llama, Claude, Zora AI, etc.).
+ Design scalable integration strategies for embedding GenAI into existing client systems (ERP, CRM, KM platforms).
+ Define and govern reliable, ethical, and compliant data sourcing and management.
Development & Customization
+ Lead model fine-tuning, prompt engineering, and customization for client-specific needs (a brief prompt-construction sketch follows this list).
+ Oversee the development of GenAI-powered applications and user-friendly interfaces, ensuring robustness and exceptional user experience.
+ Drive thorough validation, testing, and iteration to ensure quality and accuracy.
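As a minimal, non-authoritative sketch of the prompt engineering and customization work described above, the example below builds a simple prompt template and generates text with the Hugging Face transformers pipeline. The model name and prompt template are illustrative placeholders, not the team's actual stack.

```python
# Minimal sketch of prompt construction and text generation with the Hugging Face
# transformers pipeline; the model name and prompt template are illustrative
# placeholders, not a description of any specific internal solution.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# A simple prompt template: system-style framing plus the user's request.
prompt_template = (
    "You are a finance analytics assistant.\n"
    "Summarize the key drivers in the following variance commentary:\n{commentary}\n"
    "Summary:"
)
prompt = prompt_template.format(
    commentary="Travel spend rose 12% quarter over quarter while headcount was flat."
)

outputs = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(outputs[0]["generated_text"])
```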
Implementation, Deployment & Change Management
+ Manage solution rollout, including cloud setup, configuration, and production deployment.
+ Guide clients through adoption: deliver training, create documentation, and provide enablement resources for users.
Risk, Ethics & Compliance
+ Lead efforts in responsible AI, ensuring safeguards against bias, privacy breaches, and unethical outcomes.
+ Monitor performance, implement KPIs, and manage model retraining and auditing processes.
Stakeholder Communication
+ Prepare executive-level reports, dashboards, and demos to summarize progress and impact.
+ Coordinate across internal teams, tech partners, and clients for effective project delivery.
Continuous Improvement & Thought Leadership
+ Stay current on GenAI trends, best practices, and emerging technologies; share insights across teams.
+ Mentor junior colleagues, promote knowledge transfer, and contribute to reusable methodologies.
Qualifications
Required:
+ Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Mathematics, or related field.
+ 5+ years of hands-on experience delivering machine learning or AI solutions, preferably including generative AI.
+ Independent thinker who can create the vision and execute on transforming data into high-end client products.
+ Demonstrated accomplishments in the following areas:
+ Deep understanding of GenAI models and approaches (LLMs, transformers, prompt engineering).
+ Proficiency in Python (PyTorch, TensorFlow, HuggingFace), Databricks, ML pipelines, and cloud-based deployment (Azure, AWS, GCP).
+ Experience integrating AI into enterprise applications, building APIs, and designing scalable workflows.
+ Knowledge of solution architecture, risk assessment, and mapping technology to business goals.
+ Familiarity with agile methodologies and iterative delivery.
+ Commitment to responsible AI, including data ethics, privacy, and regulatory compliance.
+ Ability to travel 0-10%, on average, based on the work you do and the clients and industries/sectors you serve
+ Limited immigration sponsorship may be available.
Preferred:
+ Relevant Certifications: May include Google Cloud Professional ML Engineer, Microsoft Azure AI Engineer, AWS Certified Machine Learning, or specialized GenAI/LLM credentials.
+ Experience with data visualization tools such as Tableau
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $102,500 - $188,900.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation
************************************************************************************************************
EA_FA_ExpHire
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Data Engineer - Platform & Product
Data engineer job in Milwaukee, WI
We are seeking a skilled and solution-oriented Data Engineer to contribute to the development of our growing Data Engineering function. This role will be instrumental in designing and optimizing data workflows, building domain-specific pipelines, and enhancing platform services. The ideal candidate will help evolve our Snowflake-based data platform into a scalable, domain-oriented architecture that supports business-critical analytics and machine learning initiatives.
Responsibilities
The candidate is expected to:
Design and build reusable platform services, including pipeline frameworks, CI/CD workflows, data validation utilities, data contracts, and lineage integrations (a brief data-contract check sketch follows this list)
Develop and maintain data pipelines for sourcing, transforming, and delivering trusted datasets into Snowflake
Partner with Data Domain Owners to onboard new sources, implement data quality checks, and model data for analytics and machine learning use cases
Collaborate with the Lead Data Platform Engineer and Delivery Manager to deliver monthly feature releases and support bug remediation
Document and promote best practices for data pipeline development, testing, and deployment
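As a hedged sketch of the data validation utilities and data contracts mentioned above, the following Python example checks a dataset against a simple contract (required columns and dtypes) before it would be loaded into Snowflake. The contract and column names are hypothetical.

```python
# Hedged sketch of a lightweight data-contract check of the kind a pipeline might
# run before loading a dataset into Snowflake; the contract and column names are
# hypothetical.
import pandas as pd

CONTRACT = {
    "security_id": "object",
    "price": "float64",
    "price_date": "datetime64[ns]",
}

def validate_contract(df: pd.DataFrame, contract: dict) -> list[str]:
    """Return a list of contract violations (missing columns or wrong dtypes)."""
    violations = []
    for column, expected_dtype in contract.items():
        if column not in df.columns:
            violations.append(f"missing column: {column}")
        elif str(df[column].dtype) != expected_dtype:
            violations.append(
                f"{column}: expected {expected_dtype}, got {df[column].dtype}"
            )
    return violations

prices = pd.DataFrame(
    {
        "security_id": ["A123", "B456"],
        "price": [101.25, 87.10],
        "price_date": pd.to_datetime(["2024-06-03", "2024-06-03"]),
    }
)

problems = validate_contract(prices, CONTRACT)
print("contract OK" if not problems else problems)
```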
Qualifications
The successful candidate will possess strong analytical skills and attention to detail. Additionally, the ideal candidate will possess:
3-6 years of experience in data engineering or analytics engineering
Strong SQL and Python skills; experience with dbt or similar transformation frameworks
Demonstrated experience building pipelines and services on Snowflake or other modern cloud data platforms
Understanding of data quality, validation, lineage, and schema evolution
Background in financial or market data (trading, pricing, benchmarks, ESG) is a plus
Strong collaboration and communication skills, with a passion for enabling domain teams
Artisan Partners Limited Partnership is an equal opportunity employer. Artisan Partners does not discriminate on the basis of race, religion, color, national origin, gender, age, disability, marital status, sexual orientation or any other characteristic protected under applicable law. All employment decisions are made on the basis of qualifications, merit and business need.
#LI-Hybrid
Principal Data Scientist
Data engineer job in Milwaukee, WI
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This position requires occasional travel to the DC area for client meetings.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or BS in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
Data Scientist, US Supply Chain
Data engineer job in Milwaukee, WI
What you will do
In this exciting role you will lead the effort to build and deploy predictive and prescriptive analytics in our next-generation decision intelligence platform. The work will require helping to build/maintain a digital twin of our production supply chain, perform optimization and forecasting, and connect our analytics and ML solutions to enable our people to make the best data-driven decisions possible!
This will require working with predictive and prescriptive analytics and decision intelligence across the US/Canada region at Clarios. You'll apply modern statistics, machine learning, and AI to real manufacturing and supply chain problems, working side-by-side with our business stakeholders and our global analytics team to deploy transformative solutions, not just models.
How you will do it
Build production-ready ML/statistical models (regression/classification, clustering, time series, linear/non-linear optimization) to detect patterns, perform scenario analytics, and generate actionable insights and outcomes (a brief modeling sketch follows this section).
Wrangle and analyze data with Python and SQL; perform feature engineering, data quality checks, and exploratory analysis to validate hypotheses and model readiness.
Develop digital solutions /visuals in Power BI and our decision intelligence platform to communicate results and monitor performance with business users.
Partner with stakeholders to clarify use cases, translate needs into technical tasks/user stories, and iterate solutions in sprints.
Manage model deployment (e.g., packaging models, basic MLOps) with guidance from Global Analytics
Document and communicate model methodology, assumptions, and results to non-technical audiences; support troubleshooting and continuous improvement of delivered analytics.
Deliver value realization as part of our business analytics team to drive positive business outcomes for our metals team.
Deliver incremental value quickly (first dashboards, baseline models) and iterate with stakeholder feedback.
Balance rigor with practicality-choose the simplest model that solves the problem and can be supported in production.
Keep data quality front-and-center; instrument checks to protect decisions from drift and bad inputs.
Travel: Up to ~10% for plant or stakeholder visits
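As a minimal sketch of the predictive modeling described in this section, the example below engineers lag features from a synthetic weekly demand series and fits a scikit-learn regressor to forecast a holdout period. The data and feature set are illustrative only, not a Clarios implementation.

```python
# Minimal sketch of lag-based feature engineering feeding a regression model to
# forecast demand; the data is synthetic and the features are illustrative only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
weeks = pd.date_range("2023-01-01", periods=104, freq="W")
demand = 1000 + 50 * np.sin(np.arange(104) * 2 * np.pi / 52) + rng.normal(0, 25, 104)
df = pd.DataFrame({"week": weeks, "demand": demand})

# Feature engineering: lagged demand and a simple seasonality indicator.
df["lag_1"] = df["demand"].shift(1)
df["lag_4"] = df["demand"].shift(4)
df["week_of_year"] = df["week"].dt.isocalendar().week.astype(int)
df = df.dropna()

train, test = df.iloc[:-12], df.iloc[-12:]
features = ["lag_1", "lag_4", "week_of_year"]

model = GradientBoostingRegressor(random_state=0)
model.fit(train[features], train["demand"])

predictions = model.predict(test[features])
print("MAE over holdout weeks:", round(mean_absolute_error(test["demand"], predictions), 1))
```

In practice the simplest model that solves the problem and can be supported in production (per the guidance above) is usually the right starting point.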
What we look for
Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or related field-or equivalent practical experience.
1-3 years (or strong internship/co-op) applying ML/statistics on business data.
Python proficiency (pandas, scikit-learn, SciPy/statsmodels) and SQL across common platforms (e.g., SQL Server, Snowflake).
Core math/stats fundamentals: probability, hypothesis testing/DoE basics, linear algebra, and the principles behind common ML methods.
Data visualization experience with Power BI / Decision Intelligence Platforms for analysis and stakeholder storytelling.
Ability to work in cross-functional teams and explain technical work clearly to non-technical partners. Candidates must be self-driven, curious, and creative
Preferred
Cloud & big data exposure: Azure (or AWS), Databricks/Spark; Snowpark is a plus.
Understanding of ETL/ELT tools such as ADF, SSIS, Talend, Informatica, or Matillion.
Experience with a Decision Intelligence platform like Palantir, Aera, etc., building and deploying models.
MLOps concepts (model validation, monitoring, packaging with Docker/Kubernetes).
Deep learning basics (PyTorch/Keras) for the right use cases.
Experience contributing to agile backlogs, user stories, and sprint delivery.
3+ years of experience in data analytics
Master's degree in Statistics, Economics, Data Science, or Computer Science.
What you get:
Medical, dental and vision care coverage and a 401(k) savings plan with company matching - all starting on date of hire
Tuition reimbursement, perks, and discounts
Parental and caregiver leave programs
All the usual benefits such as paid time off, flexible spending, short-and long-term disability, basic life insurance, business travel insurance, Employee Assistance Program, and domestic partner benefits
Global market strength and worldwide market share leadership
HQ location earns LEED certification for sustainability plus a full-service cafeteria and workout facility
Clarios has been recognized as one of 2025's Most Ethical Companies by Ethisphere. This prestigious recognition marks the third consecutive year Clarios has received this distinction.
Who we are:
Clarios is the force behind the world's most recognizable car battery brands, powering vehicles from leading automakers like Ford, General Motors, Toyota, Honda, and Nissan. With 18,000 employees worldwide, we develop, manufacture, and distribute energy storage solutions while recovering, recycling, and reusing up to 99% of battery materials-setting the standard for sustainability in our industry. At Clarios, we're not just making batteries; we're shaping the future of sustainable transportation. Join our mission to innovate, push boundaries, and make a real impact. Discover your potential at Clarios-where your power meets endless possibilities.
Veterans/Military Spouses:
We value the leadership, adaptability, and technical expertise developed through military service. At Clarios, those capabilities thrive in an environment built on grit, ingenuity, and passion-where you can grow your career while helping to power progress worldwide. All qualified applicants will be considered without regard to protected characteristics.
We recognize that people come with a wealth of experience and talent beyond just the technical requirements of a job. If your experience is close to what you see listed here, please apply. Diversity of experience and skills combined with passion is key to challenging the status quo. Therefore, we encourage people from all backgrounds to apply to our positions. Please let us know if you require accommodations during the interview process by emailing Special.Accommodations@Clarios.com. We are an Equal Opportunity Employer and value diversity in our teams in terms of work experience, area of expertise, gender, ethnicity, and all other characteristics protected by laws in the countries where we operate. For more information on our commitment to sustainability, diversity, and equal opportunity, please read our latest report. We want you to know your rights because EEO is the law.
A Note to Job Applicants: please be aware of scams being perpetrated through the Internet and social media platforms. Clarios will never require a job applicant to pay money as part of the application or hiring process.
To all recruitment agencies: Clarios does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Clarios employees or any other company location. Clarios is not responsible for any fees related to unsolicited resumes/CVs.
Manager, Data Operations, Data Engineer
Data engineer job in Milwaukee, WI
Known for being a great place to work and build a career, KPMG provides audit, tax and advisory services for organizations in today's most important industries. Our growth is driven by delivering real results for our clients. It's also enabled by our culture, which encourages individual development, embraces an inclusive environment, rewards innovative excellence and supports our communities. With qualities like those, it's no wonder we're consistently ranked among the best companies to work for by Fortune Magazine, Consulting Magazine, Seramount, Fair360 and others. If you're as passionate about your future as we are, join our team.
KPMG is currently seeking a Manager of Data Engineering to join our Digital Nexus technology organization. This is a hybrid work opportunity.
Responsibilities:
* Lead a team of Azure Data Lake and business intelligence engineers in designing and delivering ADL Pipelines, Notebooks and interactive Power BI dashboards that clearly communicate actionable insights to stakeholders; contribute strategic thought leadership to shape the firm's business intelligence vision and standards
* Design and maintain scalable data pipelines using Azure Data Factory and Databricks to ingest, transform, and deliver data across medallion architecture layers; develop production-grade ETL/ELT solutions using PySpark and SQL to produce analytics-ready Delta Lake datasets aligned with enterprise standards
* Apply critical thinking and creativity to design innovative, non-standard BI solutions that address complex and evolving business challenges; design, build, and optimize data models to support analytics, ensuring accuracy, reliability, and efficiency
* Stay ahead of emerging technologies including Generative AI and AI agents to identify novel opportunities that improve analytics, automation, and decision-making across the enterprise
* Manage and provide technical expertise and strategic guidance to counselees (direct reports), department peers, and cross-functional team members; set goals, participate in strategic initiatives for the team, and foster the development of high-performance teams
* Act with integrity, professionalism, and personal responsibility to uphold KPMG's respectful and courteous work environment
Qualifications:
* Minimum seven years of recent experience designing and building ADL pipelines, Databricks notebooks, and interactive dashboards using modern business intelligence tools (preferably Power BI); minimum two years of recent experience designing scalable data pipelines using Azure Data Factory and Azure Databricks to support ingestion, transformation, and delivery of data across medallion architecture layers
* Bachelor's degree from an accredited college or university is preferred; minimum of a high school diploma or GED is required
* Demonstrated analytical and problem-solving abilities, with a creative and methodical approach to addressing complex challenges
* Advanced knowledge of SQL, DAX, and data modeling concepts; proven track record in defining, managing, and delivering BI projects; ability to participate in the development of resource plans and influence organizational priorities
* Excellent written and verbal communication skills, including the ability to effectively present proposals and vision to executive leadership
* Applicants must be authorized to work in the U.S. without the need for employment-based visa sponsorship now or in the future; KPMG LLP will not sponsor applicants for U.S. work visa status for this opportunity (no sponsorship is available for H-1B, L-1, TN, O-1, E-3, H-1B1, F-1, J-1, OPT, CPT or any other employment-based visa)
KPMG LLP and its affiliates and subsidiaries ("KPMG") complies with all local/state regulations regarding displaying salary ranges. If required, the ranges displayed below or via the URL below are specifically for those potential hires who will work in the location(s) listed. Any offered salary is determined based on relevant factors such as applicant's skills, job responsibilities, prior relevant experience, certain degrees and certifications and market considerations. In addition, KPMG is proud to offer a comprehensive, competitive benefits package, with options designed to help you make the best decisions for yourself, your family, and your lifestyle. Available benefits are based on eligibility. Our Total Rewards package includes a variety of medical and dental plans, vision coverage, disability and life insurance, 401(k) plans, and a robust suite of personal well-being benefits to support your mental health. Depending on job classification, standard work hours, and years of service, KPMG provides Personal Time Off per fiscal year. Additionally, each year KPMG publishes a calendar of holidays to be observed during the year and provides eligible employees two breaks each year where employees will not be required to use Personal Time Off; one is at year end and the other is around the July 4th holiday. Additional details about our benefits can be found towards the bottom of our KPMG US Careers site at Benefits & How We Work.
Follow this link to obtain salary ranges by city outside of CA:
**********************************************************************
KPMG offers a comprehensive compensation and benefits package. KPMG is an equal opportunity employer. KPMG complies with all applicable federal, state and local laws regarding recruitment and hiring. All qualified applicants are considered for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, citizenship status, disability, protected veteran status, or any other category protected by applicable federal, state or local laws. The attached link contains further information regarding KPMG's compliance with federal, state and local recruitment and hiring laws. No phone calls or agencies please.
KPMG recruits on a rolling basis. Candidates are considered as they apply, until the opportunity is filled. Candidates are encouraged to apply expeditiously to any role(s) for which they are qualified that is also of interest to them.
Los Angeles County applicants: Material job duties for this position are listed above. Criminal history may have a direct, adverse, and negative relationship with some of the material job duties of this position. These include the duties and responsibilities listed above, as well as the abilities to adhere to company policies, exercise sound judgment, effectively manage stress and work safely and respectfully with others, exhibit trustworthiness, and safeguard business operations and company reputation. Pursuant to the California Fair Chance Act, Los Angeles County Fair Chance Ordinance for Employers, Fair Chance Initiative for Hiring Ordinance, and San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
Business Intelligence Data Modeler
Data engineer job in Milwaukee, WI
CapB is a global leader in IT solutions and managed services. Our R&D is focused on providing cutting-edge products and solutions across digital transformations, from Cloud, AI/ML, IoT, and Blockchain to MDM/PIM, supply chain, ERP, CRM, HRMS, and integration solutions. For our growing needs we need consultants who can work with us on a salaried or contract basis. We provide industry-standard benefits and an environment for learning and growth.
For one of our ongoing projects we are looking for a Business Intelligence Data Modeler. The position is based out of Milwaukee.
Responsibilities:
The Business Intelligence Analyst performs a variety of project-oriented tasks to support the information needs of the organization.
This position is responsible for all phases of reporting, decision support, and data analysis activities including report design, measure development, data collection, summarization, and validation.
The BI Analyst exercises independent judgement and discretionary decision making related to managing multiple and more complex reporting projects.
The BI Analyst is proficient with analytical and reporting tools, database development, ETL processes, query languages, and database and spreadsheet tools.
The BI Analyst will participate in reporting and presentations to various levels of management and staff and may also be included in action plan development.
The BI Analyst will have in-depth experience with Power BI to create dashboards, data exploration, visualization, and data storytelling from concept to final deliverables.
Advanced experience with Power BI, Power BI dataflows, and dashboards. Technical expertise with data modeling and design to interact with multiple sources of data.
Ability to write complex DAX code and SQL queries for data manipulation.
Skills:
10 years of experience required in data analysis.
10 years of experience required in dashboarding/Business Objects Xcelsius.
Data Warehouse Developer
Data engineer job in Milwaukee, WI
Description We are looking for a skilled Data Warehouse Analyst to join our team on a contract basis in Milwaukee, Wisconsin. In this role, you will play a pivotal part in developing and maintaining data solutions that support organizational analytics and decision-making. You will work closely with cross-functional teams to ensure data integration, accuracy, and accessibility using modern tools and methodologies.
Responsibilities:
- Design and implement data warehouse solutions to support business intelligence and reporting needs.
- Develop and maintain ETL processes to extract, transform, and load data from Oracle into Azure SQL Server.
- Collaborate with stakeholders and business analysts to gather requirements and translate them into actionable technical solutions.
- Optimize workflows and ensure efficient performance of the data warehouse environment.
- Validate and monitor data quality to ensure integrity and reliability.
- Create and maintain documentation for processes, architecture, and data models.
- Troubleshoot and resolve issues related to data integration and system performance.
- Utilize Azure Data Factory for data processing and workflow management.
- Apply Kimball methodology to design and maintain efficient data models.
- Support the ongoing improvement of data systems and analytics processes.
Requirements:
- Proven experience in data warehousing, including design and development.
- Expertise in ETL processes and tools, with a focus on data integration.
- Proficiency in Azure Data Factory for creating workflows and managing data pipelines.
- Strong knowledge of Microsoft SQL Server and Azure SQL Database.
- Familiarity with Oracle Autonomous Data Warehouse systems.
- Experience with business intelligence and data warehousing methodologies.
- Ability to apply Kimball methodology in data model design.
- Strong problem-solving skills and attention to detail to ensure data accuracy and quality.
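As a loose illustration of the Kimball-style modeling and Oracle-to-Azure ETL work described in this posting, the sketch below shows a Type 1 dimension upsert in pandas; the table, column names, and surrogate-key handling are simplified assumptions, not the client's actual design, and in practice the source would be an Oracle extract landed in Azure SQL Server via Azure Data Factory.

```python
import pandas as pd

# Hypothetical existing customer dimension and incoming source rows.
existing_dim = pd.DataFrame(
    {"customer_key": [1, 2], "customer_id": ["C001", "C002"],
     "customer_name": ["Acme", "Globex"]}
)
source = pd.DataFrame(
    {"customer_id": ["C002", "C003"],
     "customer_name": ["Globex Corp", "Initech"]}
)

merged = source.merge(existing_dim, on="customer_id", how="left", suffixes=("", "_dim"))

# Type 1: overwrite changed attributes; assign surrogate keys to new rows.
updates = merged[merged["customer_key"].notna()].copy()
updates["customer_key"] = updates["customer_key"].astype(int)
new_rows = merged[merged["customer_key"].isna()].copy()
next_key = int(existing_dim["customer_key"].max()) + 1
new_rows["customer_key"] = range(next_key, next_key + len(new_rows))

dim = pd.concat(
    [existing_dim[~existing_dim["customer_id"].isin(updates["customer_id"])],
     updates[["customer_key", "customer_id", "customer_name"]],
     new_rows[["customer_key", "customer_id", "customer_name"]]],
    ignore_index=True,
)
print(dim.sort_values("customer_key"))
```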
Technology Doesn't Change the World, People Do.
Robert Half is the world's first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.
Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. Download the Robert Half app (https://www.roberthalf.com/us/en/mobile-app) and get 1-tap apply, notifications of AI-matched jobs, and much more.
All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit roberthalf.gobenefits.net for more information.
© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking "Apply Now," you're agreeing to Robert Half's Terms of Use (https://www.roberthalf.com/us/en/terms) .
Senior Data Engineer
Data engineer job in Milwaukee, WI
We're looking for those individuals, the creative thinkers and innovation seekers, who are content with nothing short of changing the world. Discover the endless opportunities within the Medical College of Wisconsin (MCW) and be inspired by the work we can do together to improve health and make a positive, daily impact in our communities.
As a Sr. Data Engineer, you will prepare and transform data for analytical, operational, and intelligent automation uses to support diverse stakeholder needs throughout the Center for International Blood and Marrow Transplant Research (CIBMTR), including statisticians, data scientists, scientific directors, and CIBMTR partners. You will design, develop, and implement data- and AI-centric solutions aligned with evolving user needs, leveraging structured and unstructured data, optimizing data quality, and enabling scalable Machine Learning and Agentic Artificial Intelligence (AI) systems.
Responsibilities:
Lead or actively contribute to multidisciplinary data and AI workstreams, collaborating with data scientists, analytics engineers, and partner engineering teams to design and build next-generation intelligent data solutions.
Design and implement modern, scalable, secure, high-performing data architectures, including data lakes, lakehouses, and data commons.
Develop and optimize data pipelines (ETL/ELT) and orchestration systems for large-scale ingestion and processing of structured and unstructured data, integrating organizational data for analytics and ML/AI applications.
Build reusable, modular components and APIs to support scalable Agentic AI frameworks and enable autonomous data operations aligned with outcomes research and clinical trials.
Work independently or as part of a team with subject matter experts to identify user needs and requirements for efficient, scalable and reliable data pipelines and models that support data-driven initiatives.
Support automation of operational workflows using Agentic AI, including evaluation, performance tuning and observability.
Develop and maintain clear documentation aligned with Standard Operating Procedures (SOPs), best practices, and regulatory requirements for both data engineering and AI components.
Mentor and train data engineers and analytics engineers in the adoption and integration of AI and ML methods and frameworks.
Perform other duties as required.
Knowledge - Skills - Abilities
Experience in designing, implementing, and scaling data pipelines for structured and unstructured data in modern Lakehouse or data lake architectures.
Strong proficiency in SQL and at least one scripting language (e.g., Python).
Solid understanding of data engineering principles, including data preparation, feature engineering, and model lifecycle management.
Solid understanding of prompt engineering.
Experience with cloud-based AI/ML platforms (e.g., AWS SageMaker, Bedrock, or comparable).
Familiarity with LLMs, agentic frameworks, and AI orchestration patterns (e.g., LangChain, AutoGen, or similar).
Familiarity with techniques to integrate organizational data into AI workflows, such as Retrieval Augmented Generation (RAG).
Strong data profiling and data quality assurance experience.
Knowledge of workflow decomposition for automation.
Excellent problem-solving, analytical thinking, and communication skills.
Ability to mentor technical staff and communicate data engineering and AI concepts to non-expert audiences.
Preferred:
Knowledge of data interoperability and data standards (e.g., FHIR, HL7, JSON, XML).
Experience with Agentic AI reasoning patterns and system frameworks.
Experience working in AGILE Scrum team framework.
Familiarity with orchestration tools (e.g., Airflow or Dagster); a minimal DAG sketch follows this list.
Familiarity with R, SAS, JupyterLab, or other statistical software is a plus.
Experience with Model Context Protocol (MCP), RESTful APIs and modern integration methods.
Familiarity with model evaluation and observability.
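The DAG sketch referenced above is a minimal example of pipeline orchestration with Airflow; the DAG id, schedule, and task bodies are placeholders, and the actual CIBMTR pipelines may use Dagster or an entirely different structure.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull structured and unstructured source data.
    print("extracting source data")


def transform():
    # Placeholder: clean, profile, and prepare data for ML/AI workloads.
    print("transforming data")


with DAG(
    dag_id="example_research_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```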
Preferred Schedule:
Monday through Friday
Position Requirements:
Minimum Qualifications:
Appropriate experience may be substituted for education on an equivalent basis.
Minimum education: Bachelor's Degree
Minimum experience: 5 years
Preferred Qualifications:
Preferred education: Master's degree
Preferred experience: 8 years in Computer Science, Informatics, Data Science, or a technical discipline in healthcare or life sciences. AI/ML certifications are preferred.
Why MCW?
Outstanding Healthcare Coverage, including but not limited to Health, Vision, and Dental.
403B Retirement Package
Competitive Vacation, Sick Time, and Paid Holidays
Tuition Reimbursement
Paid Parental Leave
For a brief overview of our benefits see: ******************************************************** #LI-NK1
MCW as an Equal Opportunity Employer and Commitment to Non-Discrimination
The Medical College of Wisconsin (MCW) is an Equal Opportunity Employer. We are committed to fostering an inclusive community of outstanding faculty, staff, and students, as well as ensuring equal educational opportunity, employment, and access to services, programs, and activities, without regard to an individual's race, color, national origin, religion, age, disability, sex, gender identity/expression, sexual orientation, marital status, pregnancy, predisposing genetic characteristic, or military status. Employees, students, applicants, or other members of the MCW community (including but not limited to vendors, visitors, and guests) may not be subjected to harassment that is prohibited by law or treated adversely or retaliated against based upon a protected characteristic.
Data Engineer
Data engineer job in Mequon, WI
Charter Manufacturing is a fourth-generation family-owned business where our will to grow drives us to do it better. Join the team and become part of our family!
Applicants must be authorized to work for ANY employer in the U.S. Charter Manufacturing is unable to sponsor for employment visas at this time.
This position is hybrid, 3 days a week in office in Mequon, WI.
BI&A- Lead Data Engineer
Charter Manufacturing continues to invest in Data & Analytics. Come join a great team and great culture leveraging your expertise to drive analytics transformation across Charter's companies. This is a key role in the organization that will provide thought leadership, as well as add substantial value by delivering trusted data pipelines that will be used to develop models and visualizations that tell a story and solve real business needs/problems. This role will collaborate with team members and business stakeholders to leverage data as an asset driving business outcomes aligned to business strategies.
Having 7+ years of prior experience developing data pipelines and partnering with team members and business stakeholders to drive adoption will be critical to the success of this role.
MINIMUM QUALIFICATIONS:
Bachelor's degree in computer science, data science, software engineering, information systems, or related quantitative field; master's degree preferred
At least seven years of work experience in data management disciplines, including data integration, modeling, optimization and data quality, or other areas directly relevant to data engineering responsibilities and tasks
Proven project experience designing, developing, deploying, and maintaining data pipelines used to support AI, ML, and BI using big data solutions (Azure, Snowflake)
Strong knowledge in Azure technologies such as Azure Web Application, Azure Data Explorer, Azure DevOps, and Azure Blob Storage to build scalable and efficient data pipelines
Strong knowledge of programming languages such as R, Python, and C#, and of Azure Machine Learning Workspace development
Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks)
Experience with database technologies such as SQL, Oracle, and Snowflake
Prior experience with ETL/ELT data ingestion into data lakes/data warehouses for analytics consumption
Strong SQL skills
Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products
Passionate about teaching, coaching, and mentoring others
Strong problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems, and the ability to recognize and solve problems
Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals
Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options
Demonstrated experience delivering business value by structuring and analyzing large, complex data sets
Demonstrated initiative, strong sense of accountability, collaboration, and known as a trusted business partner
PREFERRED QUALIFICATIONS INCLUDE EXPERIENCE WITH:
Manufacturing industry experience, specifically heavy industry, supply chain, and operations
Designing and supporting data integrations with ERP systems such as Oracle or SAP
MAJOR ACCOUNTABILITIES:
Designs, develops, and supports data pipelines for batch and streaming data extraction from various sources (databases, APIs, external systems), transforming data into the desired format and loading it into the appropriate data storage systems (a minimal extraction sketch follows this list)
Collaborates with data scientists and analysts to optimize models and algorithms in accordance with data quality, security, and governance policies
Ensures data quality, consistency, and integrity during the integration process, performing data cleansing, aggregation, filtering, and validation as needed to ensure accuracy, consistency, and completeness of data
Optimizes data pipelines and data processing workflows for performance, scalability, and efficiency.
Monitors and tunes data pipelines for performance, scalability, and efficiency resolving performance bottlenecks
Establish architecture patterns, design standards, and best practices to accelerate delivery and adoption of solutions
Assist, educate, train users to drive self-service enablement leveraging best practices
Collaborate with business subject matter experts, analysts, and offshore team members to develop and deliver solutions in a timely manner
Embraces and establishes governance of data and algorithms, quality, standards, and best practices, ensuring data accuracy
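The extraction sketch referenced above is a minimal batch extract-transform-load step in Python; the endpoint, field names, and output file are placeholder assumptions, and a production pipeline at this level would typically land data in Azure or Snowflake storage rather than a local file.

```python
import pandas as pd
import requests

API_URL = "https://example.com/api/machine-readings"  # hypothetical endpoint

# Extract: pull a batch of records from a source system's REST API.
response = requests.get(API_URL, params={"since": "2025-01-01"}, timeout=30)
response.raise_for_status()
records = response.json()

# Transform: normalize timestamps and apply a basic data-quality filter.
df = pd.DataFrame(records)
df["reading_ts"] = pd.to_datetime(df["reading_ts"])
df = df.dropna(subset=["machine_id", "reading_value"])

# Load: write the cleansed batch to the storage layer (local CSV here for simplicity).
df.to_csv("machine_readings.csv", index=False)
```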
We offer comprehensive health, dental, and vision benefits, along with a 401(k) plan that includes employer matching and profit sharing. Additionally, we offer company-paid life insurance, disability coverage, and paid time off (PTO).
Azure Data Engineer (Python/SQL) - 6013914
Data engineer job in Milwaukee, WI
Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists.
As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges.
You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing and/or eliminating the demands to travel.
Job Description:
Join our dynamic team and embark on a journey where you will be empowered to perform independently and become an SME. Active participation and contribution in team discussions will be key as you help provide solutions to work-related problems. Let's work together to achieve greatness!
Responsibilities:
+ Create new data pipelines leveraging existing data ingestion frameworks and tools
+ Orchestrate data pipelines using the Azure Data Factory service.
+ Develop/Enhance data transformations based on requirements to parse, transform, and load data into the Enterprise Data Lake, Delta Lake, and Enterprise DWH (Synapse Analytics); a minimal transformation sketch follows this list
+ Perform unit testing; coordinate integration testing and UAT
+ Create HLD/DD/runbooks for the data pipelines
+ Configure compute and DQ rules; perform maintenance, performance tuning, and optimization
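The transformation sketch referenced above is shown below in PySpark; the storage path, column names, and Delta table are illustrative assumptions, and the code presumes a Databricks or Delta Lake-enabled Spark session.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Parse: read raw landing-zone files (hypothetical ADLS path).
raw = (
    spark.read.option("header", "true")
    .csv("abfss://landing@exampleaccount.dfs.core.windows.net/orders/")
)

# Transform: type-cast, deduplicate, and filter invalid rows.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount").isNotNull())
)

# Load: append to a Delta table (assumes the Delta Lake runtime is available).
clean.write.format("delta").mode("append").saveAsTable("bronze.orders")
```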
Basic Qualifications:
+ Minimum of 3 years of work experience with one or more of the following: Databricks Data Engineering, DLT, Azure Data Factory, SQL, PySpark, Synapse Dedicated SQL Pool, Azure DevOps, Python
Preferred Qualifications:
+ Azure Function Apps
+ Azure Logic Apps
+ Precisely & COSMOS DB
+ Advanced proficiency in PySpark.
+ Advanced proficiency in Microsoft Azure Databricks, Azure DevOps, Databricks Delta Live Tables and Azure Data Factory.
+ Bachelor's or Associate's degree
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below. We accept applications on an on-going basis and there is no fixed deadline to apply.
Information on benefits is here. (************************************************************
Role Location:
California - $47.69 - $57.69
Cleveland - $47.69 - $57.69
Colorado - $47.69 - $57.69
District of Columbia - $47.69 - $57.69
Illinois - $47.69 - $57.69
Minnesota - $47.69 - $57.69
Maryland - $47.69 - $57.69
Massachusetts - $47.69 - $57.69
New York/New Jersey - $47.69 - $57.69
Washington - $47.69 - $57.69
Requesting an Accommodation
Accenture is committed to providing equal employment opportunities for persons with disabilities or religious observances, including reasonable accommodation when needed. If you are hired by Accenture and require accommodation to perform the essential functions of your role, you will be asked to participate in our reasonable accommodation process. Accommodations made to facilitate the recruiting process are not a guarantee of future or continued accommodations once hired.
If you would like to be considered for employment opportunities with Accenture and have accommodation needs such as for a disability or religious observance, please call us toll free at **************** or send us an email or speak with your recruiter.
Equal Employment Opportunity Statement
We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities.
For details, view a copy of the Accenture Equal Opportunity Statement (********************************************************************************************************************************************
Accenture is an EEO and Affirmative Action Employer of Veterans/Individuals with Disabilities.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Other Employment Statements
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States.
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process. Further, at Accenture a criminal conviction history is not an absolute bar to employment.
The Company will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Additionally, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the Company's legal duty to furnish information.
California requires additional notifications for applicants and employees. If you are a California resident, live in or plan to work from Los Angeles County upon being hired for this position, please click here for additional important information.
Please read Accenture's Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
Security Lead-Data Protection
Data engineer job in Milwaukee, WI
blue Stone Recruiting is a national search firm with a focus on placing top Cyber Security talent, from the Analyst level to CISO, with prestigious organizations nationwide.
Job Description
Currently working with a Fortune 500 global manufacturing client that is looking for a Security Lead focused on Data Protection. This client is looking to add a permanent member to their security team who can grow within the company.
Responsibilities:
• Develop and lead the Global Data Protection program across all of the client's business units
• Build and manage an effective data classification program
• Develop and drive Data Protection policies, standards, and processes
• Develop and drive a data loss prevention (DLP) tools strategy, partnering with the client's Information Technology groups and business units
• Align Data Protection programs with other information security and cross-functional programs
• Direct and improve the Data Protection and DLP programs and associated governance activities, including metrics, issue tracking and remediation, and programs supporting client policies.
• Establish and maintain cross-business relationships with key Leads and stakeholders across the client's business, information security, and IT functions
• Coordinate with the Global IT Council to ensure compliance with the client's standards and share best practices
• Conduct research and make recommendations on products, services, and standards in support of all global infrastructure security efforts.
• Develop and maintain appropriate response playbooks, facilitate routine exercises, and ensure a sound communication process for all cyber events
Job Requirements:
Formal Education & Certification
• Bachelor's degree in Information Security, IT Engineering, or Computer Science with 5+ years of IT experience
• Industry-recognized security certifications such as GIAC, CISSP, CISM, or CISA are preferred
Qualifications
• Minimum 5 years' information security experience including experience mapping and securing business processes / data flows
• Demonstrated experience building and leading a cyber-security program
• Advanced knowledge of data protection/DLP and Windows client/server security concepts, best practices, and procedures
• Solid “hands on” technical experience is essential
• Experience in Information Security Incident Response
• Experience in IaaS/SaaS environments
• Broad understanding of all aspects of IT and enterprise systems interoperability
• Ability to communicate technical topics (verbal and written) to multiple organizational levels
• Global enterprise experience is preferred
Personal Attributes:
• Demonstrated leadership managing direct and matrix-reporting global teams
• Demonstrated experience leading global programs across technology and business functions
• Strong interpersonal, written, and oral communication skills
• Able to conduct research into issues and products as required
• Ability to prioritize and execute tasks in a fast-paced environment and make sound decisions in emergency situations
• Highly self-motivated and directed
• Keen attention to detail
• Proven analytical and problem-solving abilities
Additional Information
Work with blue Stone recruiting to find your next Cyber Security role. You can find us at ******************************* We look forward to speaking with you.
All your information will be kept confidential according to EEO guidelines.
Data Architect
Data engineer job in West Allis, WI
Invest In You! Tri City National Bank is your hometown bank. We believe in putting customers first, building relationships, and fostering a sense of community. We work in a team environment with opportunities for hard workers to grow personally and professionally. We enjoy celebrating success and great benefits along the way. Most importantly, we believe superior customer service paired with the right banking solutions help our customers and businesses fulfill their financial dreams, and our communities grow. Our ideal candidate believes in our mission, values continuous learning, and is comfortable adapting to change. If this resonates with you, apply today and come join our team. #investinyou
The Data Architect is a strategic support role responsible for designing and implementing the organization's enterprise data management strategy. Reporting to the Director of IT, this role will define and drive standards, policies, and architectural practices for data access, modeling, stewardship, categorization, quality, archival, and destruction.
The Data Architect will serve as the organizational authority on data architecture, owning database platform strategy and participating in enterprise governance through the Architectural Review Board. This role will collaborate across technical and business domains to ensure data is managed as a strategic asset, with a strong emphasis on governance across SaaS platforms and third-party data processors.
Compensation: $150,000+ annually depending on experience.
This position is in-office at our Operations Center in West Allis, WI
Responsibilities
Strategic Planning & Execution
Conduct a comprehensive assessment of the current data environment and maturity.
Deliver a multi-year data strategy including prioritized recommendations, implementation timeline, and required resources.
Provide data governance thought leadership.
Build and/or coordinate the future data management organization. This may include shaping potential staffing needs.
Strategic Architecture & Governance
Develop and maintain the enterprise data architecture strategy, including standards for data access, modeling, categorization, stewardship, and quality.
Own the architectural design of database platforms, including redundancy, backup, and recovery strategies.
Participate in the Architectural Review Board as the data management representative.
Recommend and guide the adoption of supporting tools for data inventory, modeling, categorization, and quality assurance.
Establish architectural standards that apply across both on-premises and SaaS-hosted data environments.
SaaS & Third-Party Data Governance
Lead efforts to inventory and categorize data stored in SaaS applications and cloud platforms.
Track and document data processors and sub-processors handling corporate data to meet regulatory obligations.
Define governance policies for data shared with third-party applications, ensuring visibility and control over sensitive information.
Collaborate with Information Security and Compliance teams to align SaaS data governance with enterprise risk and regulatory frameworks.
Policy & Process Development
Define and implement policies and procedures for data archival and destruction, including unstructured data across the corporate network.
Co-lead efforts to track and manage sensitive data shared with third-party and cloud-based applications.
Establish standards and constraints to guide the Business Intelligence team, while maintaining separation from BI execution.
Cross-Functional Collaboration
Facilitate the development of a data stewardship program by partnering with business departments and subject matter experts.
Serve as a key contributor to corporate projects involving the creation, modification, or destruction of sensitive data.
Collaborate with Information Security, Infrastructure, and Application teams to ensure alignment with enterprise architecture and compliance requirements.
Operational
Review database design decisions to ensure that solutions exhibit data storage and access that is secure, recoverable, performant, scalable, and maintainable.
Provide recommendations or, where applicable, implement enhancements that enhance data security, access performance, and platform reliability.
Support regulatory data security and compliance through routine review and monitoring of relevant controls. Participate in design and development of data-oriented control processes.
Maintain a current and working knowledge of Tri City's products, processes, and data exchanges.
Perform other duties as requested.
Qualifications
Minimum 7 years of experience in data architecture or related disciplines.
Strong understanding of enterprise data management principles and practices.
Experience with third-party and cloud data integrations.
Background in highly regulated industries with demonstrated compliance awareness.
Familiarity with database administration and platform ownership (SQL Server preferred).
Prior career demonstration of technical, analytical, and administrative skills.
Excellent written, verbal communication skills and ability to work well independently and with others.
Preferred Qualifications
Prior hands-on experience as a database administrator.
Certifications such as CDMP, CIMP, DAMA, or relevant cloud/data platform credentials.
Experience facilitating data governance programs or stewardship initiatives.
Prior experience with typical data architecture technologies (e.g., ETL, data modeling, data visualization, database performance monitoring, BI, MDM, etc.)
Why Join Us:
Community Impact: Be part of a local bank deeply rooted in community values, contributing to the growth and prosperity of our neighborhoods.
Innovation: Embrace a dynamic and evolving work environment that encourages fresh perspectives and continuous learning.
Career Growth: Unlock future opportunities for personal and professional development as you navigate through our Pathways for Success.
Celebration of Success: Join a team that values and celebrates individual and collective achievements.
Work Life Balance: No early mornings or late nights, enjoy a predictable schedule with major holidays off.
Great Employee Benefits that start on the 1st of the month after your hire date!
Part-Time:
401(k) with company match
Up to 20 hours of paid vacation after 3 months (must work an average of 20+ hours per week in order to be eligible for paid vacation.)
Full-Time:
401(k) with company match
Tuition reimbursement
Medical, dental, and vision coverage
Paid vacation and more!
Equal Opportunity Employer/Veterans/Disabled
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Reasonable Accommodation
If you are an individual with a disability and would like to request a reasonable accommodation as part of the employment selection process, please contact Human Resources at ************ or ************
Slalom Flex (Project Based) - Java Data Engineer
Data engineer job in Milwaukee, WI
About the Role: We are seeking a highly skilled and motivated Data Engineer to join our team as an individual contributor. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support our data-driven initiatives. You will work closely with cross-functional teams to ensure data availability, quality, and performance across the organization.
About Us
Slalom is a purpose-led, global business and technology consulting company. From strategy to implementation, our approach is fiercely human. In six countries and 43 markets, we deeply understand our customers, and their customers, to deliver practical, end-to-end solutions that drive meaningful impact. Backed by close partnerships with over 400 leading technology providers, our 10,000+ strong team helps people and organizations dream bigger, move faster, and build better tomorrows for all. We're honored to be consistently recognized as a great place to work, including being one of Fortune's 100 Best Companies to Work For seven years running. Learn more at slalom.com.
Key Responsibilities:
* Design, develop, and maintain robust data pipelines using Java and Python.
* Build and optimize data workflows on AWS using services such as EMR, Glue, Lambda, and NoSQL databases.
* Leverage open-source frameworks to enhance data processing capabilities and performance.
* Collaborate with data scientists, analysts, and other engineers to deliver high-quality data solutions.
* Participate in Agile development practices, including sprint planning, stand-ups, and retrospectives.
* Ensure data integrity, security, and compliance with internal and external standards.
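As a small, hedged example of the AWS-based workflow building mentioned in the responsibilities above, the following Python sketch triggers and polls an AWS Glue job with boto3; the job name and arguments are hypothetical, and real workflows would more likely be orchestrated by Step Functions, Airflow, or EventBridge than by a standalone script.

```python
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Kick off a hypothetical Glue ETL job with a source-path argument.
run = glue.start_job_run(
    JobName="example-orders-etl",
    Arguments={"--source_prefix": "s3://example-bucket/landing/orders/"},
)
run_id = run["JobRunId"]

# Check the run's current state (a real orchestrator would poll or subscribe to events).
status = glue.get_job_run(JobName="example-orders-etl", RunId=run_id)
print("Glue job state:", status["JobRun"]["JobRunState"])
```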
Required Qualifications:
* 5+ years of hands-on experience in software development using Java and Python (Spring Boot).
* 1+ years of experience working with AWS services including EMR, Glue, Lambda, and NoSQL databases.
* 3+ years of experience working with open-source data processing frameworks (e.g., Apache Spark, Kafka, Airflow).
* 2+ years of experience in Agile software development environments.
* Strong problem-solving skills and the ability to work independently in a fast-paced environment.
* Excellent communication and collaboration skills.
Preferred Qualifications:
* Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform, CloudFormation).
* Familiarity with data governance and data quality best practices.
* Exposure to data lake and data warehouse architectures.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements.
Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
Associate Data Engineer
Data engineer job in Milwaukee, WI
Baker Tilly is a leading advisory, tax and assurance firm, providing clients with a genuine coast-to-coast and global advantage in major regions of the U.S. and in many of the world's leading financial centers - New York, London, San Francisco, Los Angeles, Chicago and Boston. Baker Tilly Advisory Group, LP and Baker Tilly US, LLP (Baker Tilly) provide professional services through an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable laws, regulations and professional standards. Baker Tilly US, LLP is a licensed independent CPA firm that provides attest services to its clients. Baker Tilly Advisory Group, LP and its subsidiary entities provide tax and business advisory services to their clients. Baker Tilly Advisory Group, LP and its subsidiary entities are not licensed CPA firms.
Baker Tilly Advisory Group, LP and Baker Tilly US, LLP, trading as Baker Tilly, are independent members of Baker Tilly International, a worldwide network of independent accounting and business advisory firms in 141 territories, with 43,000 professionals and a combined worldwide revenue of $5.2 billion. Visit bakertilly.com or join the conversation on LinkedIn, Facebook and Instagram.
Please discuss the work location status with your Baker Tilly talent acquisition professional to understand the requirements for an opportunity you are exploring.
Baker Tilly is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, gender identity, sexual orientation, or any other legally protected basis, in accordance with applicable federal, state or local law.
Any unsolicited resumes submitted through our website or to Baker Tilly Advisory Group, LP, employee e-mail accounts are considered property of Baker Tilly Advisory Group, LP, and are not subject to payment of agency fees. In order to be an authorized recruitment agency ("search firm") for Baker Tilly Advisory Group, LP, there must be a formal written agreement in place and the agency must be invited, by Baker Tilly's Talent Attraction team, to submit candidates for review via our applicant tracking system.
Job Description:
Associate Data Engineer
As a Senior Consultant - Associate Data Engineer you will design, build, and optimize modern data solutions for our mid‑market and enterprise clients. Working primarily inside the Microsoft stack (Azure, Synapse, and Microsoft Fabric), you will transform raw data into trusted, analytics‑ready assets that power dashboards, advanced analytics, and AI use cases. You'll collaborate with solution architects, analysts, and client stakeholders while sharpening both your technical depth and consulting skills.
Key Responsibilities:
* Data Engineering: Develop scalable, well‑documented ETL/ELT pipelines using T‑SQL, Python, Azure Data Factory/Fabric Data Pipelines, and Databricks; implement best‑practice patterns for performance, security, and cost control.
* Modeling & Storage: Design relational and lakehouse models; create Fabric OneLake shortcuts, medallion‑style layers, and dimensional/semantic models for Power BI.
* Quality & Governance: Build automated data‑quality checks, lineage, and observability metrics; contribute to CI/CD workflows in Azure DevOps or GitHub.
* Client Delivery: Gather requirements, demo iterative deliverables, document technical designs, and translate complex concepts to non‑technical audiences.
* Continuous Improvement: Research new capabilities, share findings in internal communities of practice, and contribute to reusable accelerators. Collaborate with clients and internal stakeholders to design and implement scalable data engineering solutions.
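As an illustration of the automated data-quality checks mentioned under Quality & Governance above, here is a minimal pandas sketch; the dataset, column names, and rules are hypothetical examples rather than Baker Tilly's actual standards, and in practice such checks would run inside a pipeline step and feed observability metrics.

```python
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> dict:
    # Each rule returns a boolean; names and thresholds are illustrative.
    return {
        "row_count_nonzero": len(df) > 0,
        "no_null_keys": df["customer_id"].notna().all(),
        "unique_keys": df["customer_id"].is_unique,
        "amounts_non_negative": (df["order_amount"] >= 0).all(),
    }


orders = pd.DataFrame(
    {"customer_id": ["C1", "C2", "C3"], "order_amount": [10.0, 0.0, 25.5]}
)
results = run_quality_checks(orders)
failed = [name for name, passed in results.items() if not passed]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed:", results)
```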
Qualifications:
* Education - Bachelor's in Computer Science, Information Systems, Engineering, or related field (or equivalent experience)
* Experience - 2-3 years delivering production data solutions, preferably in a consulting or client‑facing role.
* Technical Skills:
Strong T‑SQL for data transformation and performance tuning.
Python for data wrangling, orchestration, or notebook‑based development.
Hands‑on ETL/ELT with at least one Microsoft service (ADF, Synapse Pipelines, Fabric Data Pipelines).
* Project experience with Microsoft Fabric (OneLake, Lakehouses, Data Pipelines, Notebooks, Warehouse, Power BI DirectLake) preferred
* Familiarity with Databricks, Delta Lake, or comparable lakehouse technologies preferred
* Exposure to DevOps (YAML pipelines, Terraform/Bicep) and test automation frameworks preferred
* Experience integrating SaaS/ERP sources (e.g., Dynamics 365, Workday, Costpoint) preferred
SAP Data Migration Lead / Architect
Data engineer job in Milwaukee, WI
Deltacubes is currently looking for a SAP Data Migration Lead/ Architect for one of our clients located in Milwaukee, WI
This role is mostly remote, with some travel to the client site on an as-needed basis.
If you are interested, kindly apply here.
Acceptable visa types: H1B; US Citizen/Green Card holders preferred.
Healthcare Data Analyst Lead (CMH Health)
Data engineer job in Brookfield, WI
Individual(s) must be legally authorized to work in the United States without the need for immigration support or sponsorship from Milliman now or in the future. Milliman is seeking a technically savvy, analytically strong Healthcare Data Analyst Lead to manage analytics and technology projects as well as coach and mentor junior analytical staff. The ideal candidate is someone seeking to join a challenging, yet rewarding environment focused on delivering world-class analytics across a variety of healthcare-related domains to a variety of healthcare entities.
Who We Are
Independent for over 75 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, data analytics and business transformation.
Job Responsibilities
* Lead and manage analytics and technology projects from data ingestion through delivery.
* Design, develop, and optimize data processes and workflows supporting large healthcare datasets.
* Perform and oversee ETL, validation, and transformation tasks for claims, eligibility, and pharmacy data.
* Guide project teams through the full "data-to-deliverable" lifecycle, ensuring accuracy and efficiency.
* Build analytical models, dashboards, and data pipelines to support consulting engagements.
* Collaborate with consultants, actuaries, and project managers to interpret results and deliver client insights.
* Review and approve technical work from peers and junior analysts to ensure quality standards are met.
* Mentor, coach, and delegate to analytical staff to strengthen technical and professional development.
* Contribute to innovation and process improvement initiatives, including automation, cloud enablement, and AI integration.
* Participate in client meetings and presentations, occasionally requiring travel.
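As a rough sketch of the claims validation and summarization work described in the responsibilities above, the following pandas example flags negative payments and aggregates paid amounts per member and claim type; the column names and rules are placeholders, not Milliman's actual data model.

```python
import pandas as pd

# Hypothetical claims extract with member, category, date, and payment fields.
claims = pd.DataFrame(
    {
        "member_id": ["M1", "M1", "M2", "M2"],
        "claim_type": ["inpatient", "pharmacy", "outpatient", "pharmacy"],
        "service_date": pd.to_datetime(
            ["2025-01-05", "2025-01-20", "2025-02-02", "2025-02-15"]
        ),
        "paid_amount": [1200.0, 85.5, 300.0, -10.0],
    }
)

# Validation: flag negative payments (often reversals) for analyst review.
reversals = claims[claims["paid_amount"] < 0]

# Summarization: per-member, per-category paid totals for downstream analysis.
summary = (
    claims.groupby(["member_id", "claim_type"], as_index=False)["paid_amount"]
    .sum()
    .rename(columns={"paid_amount": "total_paid"})
)
print(summary)
print(f"{len(reversals)} claim(s) flagged as possible reversals")
```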
Minimum requirements
* Bachelor's degree required (Computer Science, Management Information Systems, Computer Engineering, Math, Actuarial Science, Data Analytics, or related degree is preferred)
* 6+ years of experience in healthcare data analytics or a related technical analytics role.
* Advanced proficiency with SQL and Microsoft Excel for data analysis, validation, and automation.
* Strong programming skills in Python, R, or other analytical languages.
* Experience with data visualization and reporting tools (e.g., Power BI, Tableau, or R Shiny).
* Solid understanding of healthcare data structures, including claims, eligibility, and provider data.
* Proven ability to lead multiple projects simultaneously while mentoring and developing junior team members.
* Experience with cloud data technologies (e.g., Azure Data Factory, Databricks, Snowflake, AWS Redshift, or similar).
* Exposure to AI or Generative AI tools for data analysis, automation, or insight generation is preferred, but not required.
Competencies and Behaviors that Support Success in this Role
* Deep understanding of database architecture and large-scale healthcare data environments.
* Strong analytical thinking and the ability to translate complex data into actionable insights.
* Excellent communication skills, including the ability to explain technical concepts to non-technical audiences.
* Highly organized, detail-oriented, and able to manage competing priorities.
* Collaborative and proactive leadership style with a focus on mentorship and knowledge-sharing.
* Passion for applying analytics to improve healthcare performance, quality, and cost outcomes.
* Demonstrated accountability for quality, timelines, and client satisfaction.
* Fast learner who thrives in a dynamic, innovation-driven environment.
The Team
The Healthcare Data Analyst Lead will join a team that thrives on leveraging data, analytics, and technology to deliver meaningful business value. This is a team with technical aptitude and analytical prowess that enjoys building efficient and scalable products and processes. Ultimately, we are passionate about effecting change in healthcare. We also believe that collaboration and communication are cornerstones of success.
The Healthcare Data Analyst Lead will also join a mix of Healthcare Analysts, Leads, Consultants, and Principals. In addition, as part of the broader Milliman landscape, they will work alongside Healthcare Actuaries, Pharmacists, Clinicians, and Physicians. We aim to provide everyone a supportive environment, where we foster learning and growth through rewarding challenges.
Salary:
The overall salary range for this role is $104,900 - $199,065.
For candidates residing in:
* Alaska, California, Connecticut, Illinois, Maryland, Massachusetts, New Jersey, New York City, Newark, San Jose, San Francisco, Pennsylvania, Virginia, Washington, or the District of Columbia, the salary range is $120,635 - $199,065.
* All other locations the salary range is $104,900 - $173,100.
A combination of factors will be considered, including, but not limited to, education, relevant work experience, qualifications, skills, certifications, etc.
Location: It is preferred that candidates work on-site at our Brookfield, Wisconsin office, however, remote candidates will be considered.
The expected application deadline for this job is May 25, 2026.
Benefits
We offer a comprehensive benefits package designed to support employees' health, financial security, and well-being. Benefits include:
* Medical, Dental and Vision - Coverage for employees, dependents, and domestic partners.
* Employee Assistance Program (EAP) - Confidential support for personal and work-related challenges.
* 401(k) Plan - Includes a company matching program and profit-sharing contributions.
* Discretionary Bonus Program - Recognizing employee contributions.
* Flexible Spending Accounts (FSA) - Pre-tax savings for dependent care, transportation, and eligible medical expenses.
* Paid Time Off (PTO) - Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis.
* Holidays - A minimum of 10 observed holidays per year.
* Family Building Benefits - Includes adoption and fertility assistance.
* Paid Parental Leave - Up to 12 weeks of paid leave for employees who meet eligibility criteria.
* Life Insurance & AD&D - 100% of premiums covered by Milliman.
* Short-Term and Long-Term Disability - Fully paid by Milliman.
Equal Opportunity:
All qualified applicants will receive consideration for employment, without regard to race, color, religion, sex, sexual orientation, national origin, disability, or status as a protected veteran.