Sr Boomi Developer
Data engineer job in Kenosha, WI
Responsibilities:
Design and Architect Solutions: Bringing deep knowledge to design stable, reliable, and scalable integration solutions using the Dell Boomi AtomSphere platform and its components (Integration, API Management, MDM, etc.)
Hands-on Development: Designing, developing, and implementing complex integration processes, workflows, and APIs (REST/SOAP) to connect various applications (on-premises and cloud-based), ERP systems (like Microsoft Dynamics, Oracle EBS, SAP), and other data sources.
Data Transformation: Proficiently handling various data formats such as XML, JSON, CSV, and database formats, and using Boomi's capabilities and scripting languages (such as Groovy or JavaScript) for complex data mapping and transformations; an illustrative mapping sketch appears after this list of responsibilities.
Dell Boomi Platform Knowledge: Proficiency in Dell Boomi is crucial, including familiarity with components such as connectors, processes, maps, and APIs, and an understanding of how to design, build, and deploy integrations on the platform.
API Development: Strong knowledge of RESTful and SOAP APIs, including creating, consuming, and managing APIs within Boomi.
Working with team members and business users to understand project requirements and deliver successful design, implementation, and post-implementation support.
Working closely with team members to translate business requirements into feasible and efficient technical solutions.
Develop and maintain documentation for integration and testing processes
Be highly accurate in activity assessment, effort estimation, and delivery commitment to ensure all project activities are delivered on time without compromising quality.
Diagnose complex technical issues and provide recommendations on solutions with consideration of best practices and longer-term impacts of decisions.
Lead/Perform third party testing, performance testing and UAT coordination.
Selecting the appropriate development platform(s) to execute business requirements and ensure post implementation success.
Serve as technical lead on projects to design, develop, test, document and deploy robust integration solutions.
Working both independently and as part of a team; collaborating closely with other IT and non-IT team members.
Assessing and troubleshooting production issues with a varying degree of priority and complexity.
Optimizing existing and developing new integration solutions to support business requirements.
Providing continuous support and management of the integration layer, ensuring the integrity of our data and integrations and removing single points of failure.
Good knowledge of best practices in error handling, logging, and monitoring.
Documenting and cross-training team members for support continuity.
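As a hedged illustration of the data mapping and transformation work described earlier in this list, the Python sketch below flattens a nested JSON order into CSV-ready rows. In Boomi this logic would typically live in a map shape or a Groovy/JavaScript scripting step; the field names and document structure here are hypothetical.

```python
import csv
import json
from io import StringIO

# Hypothetical source document, e.g. returned by a REST connector.
source = json.loads("""
{
  "orderId": "SO-1001",
  "customer": {"id": "C-42", "name": "Acme Corp"},
  "lines": [
    {"sku": "WIDGET-A", "qty": 3, "unitPrice": 9.99},
    {"sku": "WIDGET-B", "qty": 1, "unitPrice": 24.50}
  ]
}
""")

def flatten_order(order: dict) -> list[dict]:
    """Map one nested order document to one flat row per order line."""
    rows = []
    for line in order["lines"]:
        rows.append({
            "order_id": order["orderId"],
            "customer_id": order["customer"]["id"],
            "customer_name": order["customer"]["name"],
            "sku": line["sku"],
            "quantity": line["qty"],
            "extended_price": round(line["qty"] * line["unitPrice"], 2),
        })
    return rows

rows = flatten_order(source)
buffer = StringIO()
writer = csv.DictWriter(buffer, fieldnames=list(rows[0].keys()))
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```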
Qualifications:
10-15 years of experience with enterprise integration platforms
Bachelor's degree in computer science
Troubleshooting Skills: Be adept at diagnosing and resolving integration issues. Familiarity with Boomi's debugging tools is valuable.
Security Awareness: Knowledge of authentication methods, encryption, and secure data transmission.
Experience and proven track record of implementing integration projects.
Extensible Stylesheet Language Transformations (XSLT) experience is a plus.
Project Management experience is a plus
Experience with ERP systems within a fast-moving wholesale, retail, and e-commerce environment is highly desirable.
Experience with Boomi implementations for the Microsoft Dynamics ERP system is a plus.
Strong communication skills and the ability to work cross-functionally in a fast-paced environment.
Databricks Data Engineer - Senior - Consulting - Location Open 1
Data engineer job in Milwaukee, WI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Senior**
We are seeking a highly skilled Senior Consultant Data Engineer with expertise in cloud data engineering, specifically Databricks. The ideal candidate will have strong client management and communication skills, along with a proven track record of successful end-to-end implementations in data engineering projects.
**The opportunity**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that technical requirements align with business needs. Your responsibilities will include creating scalable data architecture and modeling solutions that support the entire data asset lifecycle.
**Your key responsibilities**
As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. Your time will be spent on various responsibilities, including:
+ Design, build, and operate scalable on-premises or cloud data architecture.
+ Analyze business requirements and translate them into technical specifications.
+ Optimize data flows for target data platform designs.
+ Design, develop, and implement data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Collaborate with clients to understand their data needs and provide tailored solutions that meet their business objectives.
+ Lead end-to-end data pipeline development, including data ingestion, transformation, and storage (see the sketch after this list).
+ Ensure data quality, integrity, and security throughout the data lifecycle.
+ Provide technical guidance and mentorship to junior data engineers and team members.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay updated with the latest trends and technologies in data engineering and cloud computing.
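As a minimal sketch of the ingestion, transformation, and storage flow referenced above, the PySpark snippet below reads raw CSV files, applies basic cleansing, and writes a curated Delta table. The paths, column names, and table name are placeholders, and the exact setup would depend on the client's Databricks workspace.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided; the builder keeps this sketch self-contained.
spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# 1. Ingest: read raw files from a landing zone (placeholder path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/orders/")
)

# 2. Transform: standardize types, drop malformed rows, derive a partition column.
curated = (
    raw
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull() & F.col("amount").isNotNull())
    .withColumn("order_year", F.year("order_date"))
)

# 3. Store: write a Delta table partitioned for downstream analytics.
(
    curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_year")
    .saveAsTable("analytics.curated_orders")
)
```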
This role offers the opportunity to work with cutting-edge technologies and stay ahead of industry trends, ensuring you gain a competitive advantage in the market. The position may require regular travel to meet with external clients.
**Skills and attributes for success**
To thrive in this role, you will need a blend of technical and interpersonal skills. Your ability to communicate effectively and build relationships will be crucial. Here are some key attributes we look for:
+ Strong analytical and decision-making skills.
+ Proficiency in cloud computing and data architecture design.
+ Experience in data integration and security.
+ Ability to manage complex problem-solving scenarios.
**To qualify for the role, you must have**
+ A Bachelor's degree in Computer Science, Engineering, or a related field required (4-year degree). Master's degree preferred.
+ Typically, no less than 2-4 years of relevant experience in data engineering, with a focus on cloud data solutions.
+ 5+ years of experience in data engineering, with a focus on cloud data solutions.
+ Expertise in Databricks and experience with Spark for big data processing.
+ Proven experience in at least two end-to-end data engineering implementations, including:
+ Implementation of a data lake solution using Databricks, integrating various data sources, and enabling analytics for business intelligence.
+ Development of a real-time data processing pipeline using Databricks and cloud services, delivering insights for operational decision-making.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Experience with data modeling, ETL processes, and data warehousing concepts.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Senior Consulting Projects:**
+ **Strategic Thinking:** Ability to align data engineering solutions with business strategies and objectives.
+ **Project Management:** Experience in managing multiple projects simultaneously, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Ideally, you'll also have**
+ Experience with data quality management.
+ Familiarity with semantic layers in data architecture.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for**
We seek individuals who are not only technically proficient but also possess the qualities of top performers. You should be adaptable, collaborative, and driven by a desire to achieve excellence in every project you undertake.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $106,900 to $176,500. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $128,400 to $200,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Healthcare Data Analyst Lead (CMH Health)
Data engineer job in Brookfield, WI
Individual(s) must be legally authorized to work in the United States without the need for immigration support or sponsorship from Milliman now or in the future.
Milliman is seeking a technically savvy, analytically strong Healthcare Data Analyst Lead to manage analytics and technology projects as well as coach and mentor junior analytical staff. The ideal candidate is someone seeking to join a challenging, yet rewarding environment focused on delivering world-class analytics across a variety of healthcare-related domains to a variety of healthcare entities.
Who We Are
Independent for over 75 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, data analytics and business transformation.
Job Responsibilities
Lead and manage analytics and technology projects from data ingestion through delivery.
Design, develop, and optimize data processes and workflows supporting large healthcare datasets.
Perform and oversee ETL, validation, and transformation tasks for claims, eligibility, and pharmacy data; an illustrative validation sketch follows this list of responsibilities.
Guide project teams through the full “data-to-deliverable” lifecycle, ensuring accuracy and efficiency.
Build analytical models, dashboards, and data pipelines to support consulting engagements.
Collaborate with consultants, actuaries, and project managers to interpret results and deliver client insights.
Review and approve technical work from peers and junior analysts to ensure quality standards are met.
Mentor, coach, and delegate to analytical staff to strengthen technical and professional development.
Contribute to innovation and process improvement initiatives, including automation, cloud enablement, and AI integration.
Participate in client meetings and presentations, occasionally requiring travel.
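As one hedged illustration of the claims ETL and validation work listed above, the pandas sketch below applies a few typical checks to a claims extract before it moves downstream. The column names and rules are hypothetical and would be replaced by the engagement's actual data dictionary.

```python
import pandas as pd

# Hypothetical medical claims extract.
claims = pd.DataFrame({
    "claim_id":     ["A1", "A2", "A3", "A4"],
    "member_id":    ["M01", "M02", None, "M04"],
    "service_date": ["2025-01-15", "2025-02-03", "2025-02-10", "2030-01-01"],
    "paid_amount":  [125.00, -10.00, 88.50, 42.00],
})
claims["service_date"] = pd.to_datetime(claims["service_date"])

# Basic validation rules: required keys, non-negative payments, plausible dates.
issues = pd.DataFrame({
    "missing_member": claims["member_id"].isna(),
    "negative_paid":  claims["paid_amount"] < 0,
    "future_service": claims["service_date"] > pd.Timestamp.today(),
})

claims["is_valid"] = ~issues.any(axis=1)
print(claims[["claim_id", "is_valid"]])
print("Validation failure counts:")
print(issues.sum())
```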
Minimum requirements
Bachelor's degree required (Computer Science, Management Information Systems, Computer Engineering, Math, Actuarial Science, Data Analytics, or related degree is preferred)
6+ years of experience in healthcare data analytics or a related technical analytics role.
Advanced proficiency with SQL and Microsoft Excel for data analysis, validation, and automation.
Strong programming skills in Python, R, or other analytical languages.
Experience with data visualization and reporting tools (e.g., Power BI, Tableau, or R Shiny).
Solid understanding of healthcare data structures, including claims, eligibility, and provider data.
Proven ability to lead multiple projects simultaneously while mentoring and developing junior team members.
Experience with cloud data technologies (e.g., Azure Data Factory, Databricks, Snowflake, AWS Redshift, or similar).
Exposure to AI or Generative AI tools for data analysis, automation, or insight generation is preferred, but not required.
Competencies and Behaviors that Support Success in this Role
Deep understanding of database architecture and large-scale healthcare data environments.
Strong analytical thinking and the ability to translate complex data into actionable insights.
Excellent communication skills, including the ability to explain technical concepts to non-technical audiences.
Highly organized, detail-oriented, and able to manage competing priorities.
Collaborative and proactive leadership style with a focus on mentorship and knowledge-sharing.
Passion for applying analytics to improve healthcare performance, quality, and cost outcomes.
Demonstrated accountability for quality, timelines, and client satisfaction.
Fast learner who thrives in a dynamic, innovation-driven environment.
The Team
The Healthcare Data Analyst Lead will join a team that thrives on leveraging data, analytics, and technology to deliver meaningful business value. This is a team with technical aptitude and analytical prowess that enjoys building efficient and scalable products and processes. Ultimately, we are passionate about effecting change in healthcare. We also believe that collaboration and communication are cornerstones of success.
The Healthcare Data Analyst Lead will also join a mix of Healthcare Analysts, Leads, Consultants, and Principals. In addition, as part of the broader Milliman landscape, they will work alongside Healthcare Actuaries, Pharmacists, Clinicians, and Physicians. We aim to provide everyone a supportive environment, where we foster learning and growth through rewarding challenges.
Salary:
The overall salary range for this role is $104,900 - $199,065.
For candidates residing in:
Alaska, California, Connecticut, Illinois, Maryland, Massachusetts, New Jersey, New York City, Newark, San Jose, San Francisco, Pennsylvania, Virginia, Washington, or the District of Columbia, the salary range is $120,635 - $199,065.
All other locations the salary range is $104,900 - $173,100.
A combination of factors will be considered, including, but not limited to, education, relevant work experience, qualifications, skills, certifications, etc.
Location:
It is preferred that candidates work on-site at our Brookfield, Wisconsin office; however, remote candidates will be considered.
The expected application deadline for this job is May 25, 2026.
Benefits
We offer a comprehensive benefits package designed to support employees' health, financial security, and well-being. Benefits include:
Medical, Dental and Vision - Coverage for employees, dependents, and domestic partners.
Employee Assistance Program (EAP) - Confidential support for personal and work-related challenges.
401(k) Plan - Includes a company matching program and profit-sharing contributions.
Discretionary Bonus Program - Recognizing employee contributions.
Flexible Spending Accounts (FSA) - Pre-tax savings for dependent care, transportation, and eligible medical expenses.
Paid Time Off (PTO) - Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis.
Holidays - A minimum of 10 observed holidays per year.
Family Building Benefits - Includes adoption and fertility assistance.
Paid Parental Leave - Up to 12 weeks of paid leave for employees who meet eligibility criteria.
Life Insurance & AD&D - 100% of premiums covered by Milliman.
Short-Term and Long-Term Disability - Fully paid by Milliman.
Equal Opportunity:
All qualified applicants will receive consideration for employment, without regard to race, color, religion, sex, sexual orientation, national origin, disability, or status as a protected veteran.
Senior Data Engineer
Data engineer job in Milwaukee, WI
At Wipfli, people count. At Wipfli, our people are core to everything we do-the catalyst behind our ability to create exceptional impact and extraordinary results. We believe in flexibility. We focus on relationships. We encourage each individual to follow their own path.
People truly matter and they feel it. For those looking to make a difference and find a professional home, Wipfli offers a career-defining opportunity.
Requisition Number: 2025-7427
Position Overview:
This role will take direction from the Information Team Director and will be responsible for contributing to the continuous advancement of a modern data lakehouse built to support a rapidly growing firm's desire to democratize its data asset.
Responsibilities
Responsibilities:
+ Lead influence- and consensus-building efforts for recommended solutions
+ Support and document the continued evolution of the firm's data lakehouse using the medallion architecture
+ Translate requirements into effective data models to support visualizations, AI/ML models, etc., leveraging design best practices and team standards using approved tools (a star-schema sketch appears after this list)
+ Develop in data technologies such as Databricks, Microsoft Azure Data Factory, Python, and T-SQL
+ Manage the execution of project life cycle activities in accordance with the Information Team scrum processes and tools such as Microsoft Azure DevOps
+ Achieve/maintain proficiency in required skills identified by the Information Team to effectively deliver defined products
+ Collaborate with team members to evolve products and internal processes
+ Mentor other engineers and IT associates as needed.
+ Perform on-call support after business hours as needed.
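To make the data-modeling responsibility above more concrete, here is a minimal PySpark sketch that shapes a cleansed "silver" table into a simple star schema, one pattern a medallion-style lakehouse might expose to visualization and AI/ML consumers. The table and column names are placeholders, not the firm's actual model.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gold-star-schema").getOrCreate()

# Cleansed "silver" engagement data (placeholder table name).
silver = spark.table("silver.engagements")

# Dimension: one row per client, with a surrogate key.
dim_client = (
    silver.select("client_id", "client_name", "industry")
    .dropDuplicates(["client_id"])
    .withColumn("client_key", F.monotonically_increasing_id())
)

# Fact: measures at the engagement grain, joined to the dimension key.
fact_engagement = (
    silver.join(dim_client.select("client_id", "client_key"), "client_id")
    .select(
        "engagement_id",
        "client_key",
        F.to_date("start_date").alias("start_date"),
        F.col("billed_hours").cast("double").alias("billed_hours"),
        F.col("revenue").cast("double").alias("revenue"),
    )
)

dim_client.write.format("delta").mode("overwrite").saveAsTable("gold.dim_client")
fact_engagement.write.format("delta").mode("overwrite").saveAsTable("gold.fact_engagement")
```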
Knowledge, Skills and Abilities
Qualifications:
+ Demonstrated success in working on a modern data platform, with Databricks experience preferred. Accredited certification(s) and/or 5+ years of hands-on experience desired.
+ Naturally curious with the ability to learn and implement new concepts quickly.
+ A mastery of extracting and landing data from source systems via all access methods. Extra credit for Workday RaaS and/or Microsoft Dynamics/Dataverse skills.
+ A commitment to operational standards, quality, and accountability for testing, code reviews/management and documentation.
+ Engaged in the virtual team experience leveraging video conferencing (cameras on) and a focus on relationship building. Travel is rare, but we do occasionally organize in-person events.
Benjamin Dzanic, from our recruiting team, will be guiding you through this process. Visit his LinkedIn (************************************* page to connect!
#LI-REMOTE #LI-BD1
Additional Details
Additional Details:
Wipfli is an equal opportunity/affirmative action employer. All candidates will receive consideration for employment without regard to race, creed, color, religion, national origin, sex, age, marital status, sexual orientation, gender identity, veteran status, disability, or any other characteristics protected by federal, state, or local laws.
Wipfli is committed to providing reasonable accommodations for people with disabilities. If you require a reasonable accommodation to complete an application, interview, or participate in our recruiting process, please send us an email at *************
Wipfli values fair, transparent, and competitive compensation, considering each candidate's unique skills and experiences. The estimated base pay range for this role is $107,000 to $144,000, with offers typically not made at the maximum, allowing for future salary increases. The actual salary at the time of offer depends on business related factors like location, skills, experience, training/education, licensure, certifications, business needs, current associate pay, and relevant employment laws.
Individuals may be eligible for an annual discretionary bonus, subject to participation rules and based on a variety of factors including, but not limited to, individual and Firm performance.
Wipfli cares about our associates and offers a variety of benefits to support their well-being. Highlights include 8 health plan options (both HMO & PPO plans), dental and vision coverage, opportunity to enroll in HSA with potential Firm contribution and an Employee Assistance Program. Other benefits include firm-sponsored basic life and short and long-term disability coverage, a 401(k) savings plan & profit share as well as Firm matching contribution, well-being incentive, education & certification assistance, flexible time off, family care leave, parental leave, family formation benefits, cell phone reimbursement, and travel rewards. Voluntary benefit offerings include critical illness & accident insurance, hospital indemnity insurance, legal, long-term care, pet insurance, ID theft protection, and supplemental life/AD&D. Eligibility for all benefits programs is dependent on annual hours expectation, position status/level and location.
"Wipfli" is the brand name under which Wipfli LLP and Wipfli Advisory LLC and its respective subsidiary entities provide professional services. Wipfli LLP and Wipfli Advisory LLC (and its respective subsidiary entities) practice in an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable law, regulations, and professional standards. Wipfli LLP is a licensed independent CPA firm that provides attest services to its clients, and Wipfli Advisory LLC provides tax and business consulting services to its clients. Wipfli Advisory LLC and its subsidiary entities are not licensed CPA firms.
Job Locations: US
Job ID 2025-7427
Category Internal IT
Remote Yes
Lead Data Scientist GenAI, Strategic Analytics - Data Science
Data engineer job in Milwaukee, WI
Deloitte is at the leading edge of GenAI innovation, transforming Strategic Analytics and shaping the future of Finance. We invite applications from highly skilled and experienced Lead Data Scientists ready to drive the development of our next-generation GenAI solutions.
The Team
Strategic Analytics is a dynamic part of our Finance FP&A organization, dedicated to empowering executive leaders across the firm, as well as our partners in financial and operational functions. Our team harnesses the power of cloud computing, data science, AI, and strategic expertise-combined with deep institutional knowledge-to deliver insights that inform our most critical business decisions and fuel the firm's ongoing growth.
GenAI is at the forefront of our innovation agenda and a key strategic priority for our future. We are rapidly developing groundbreaking products and solutions poised to transform both our organization and our clients. As part of our team, the selected candidate will play a pivotal role in driving the success of these high-impact initiatives.
Recruiting for this role ends on December 14, 2025
Work You'll Do
Client Engagement & Solution Scoping
+ Partner with stakeholders to analyze business requirements, pain points, and objectives relevant to GenAI use cases.
+ Facilitate workshops to identify, prioritize, and scope impactful GenAI applications (e.g., text generation, code synthesis, conversational agents).
+ Clearly articulate GenAI's value proposition, including efficiency gains, risk mitigation, and innovation.
Solution Architecture & Design
+ Architect holistic GenAI solutions, selecting and customizing appropriate models (GPT, Llama, Claude, Zora AI, etc.).
+ Design scalable integration strategies for embedding GenAI into existing client systems (ERP, CRM, KM platforms).
+ Define and govern reliable, ethical, and compliant data sourcing and management.
Development & Customization
+ Lead model fine-tuning, prompt engineering, and customization for client-specific needs (a prompt-construction sketch follows this section).
+ Oversee the development of GenAI-powered applications and user-friendly interfaces, ensuring robustness and exceptional user experience.
+ Drive thorough validation, testing, and iteration to ensure quality and accuracy.
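As a small, hedged illustration of the prompt-engineering work described above, the Python sketch below assembles a few-shot prompt from structured business context. It deliberately stops short of calling any particular model API; the instruction, examples, and field names are all hypothetical.

```python
# Hypothetical few-shot examples that steer the model toward the desired output format.
EXAMPLES = [
    {"request": "Summarize Q3 travel spend variance.",
     "response": "Travel spend ran 8% over plan, driven by client site visits in September."},
    {"request": "Summarize Q3 software spend variance.",
     "response": "Software spend was 3% under plan due to delayed license renewals."},
]

INSTRUCTION = (
    "You are a finance analytics assistant. Answer in two sentences, "
    "state figures relative to plan, and never invent numbers that are "
    "not in the provided context."
)

def build_prompt(task: str, context: str) -> str:
    """Compose the instruction, few-shot examples, and the live request."""
    shots = "\n\n".join(
        f"Request: {ex['request']}\nResponse: {ex['response']}" for ex in EXAMPLES
    )
    return (
        f"{INSTRUCTION}\n\n"
        f"{shots}\n\n"
        f"Context:\n{context}\n\n"
        f"Request: {task}\nResponse:"
    )

prompt = build_prompt(
    task="Summarize Q3 contractor spend variance.",
    context="Plan: $2.0M. Actual: $2.3M. Driver: surge staffing on two engagements.",
)
print(prompt)
```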
Implementation, Deployment & Change Management
+ Manage solution rollout, including cloud setup, configuration, and production deployment.
+ Guide clients through adoption: deliver training, create documentation, and provide enablement resources for users.
Risk, Ethics & Compliance
+ Lead efforts in responsible AI, ensuring safeguards against bias, privacy breaches, and unethical outcomes.
+ Monitor performance, implement KPIs, and manage model retraining and auditing processes.
Stakeholder Communication
+ Prepare executive-level reports, dashboards, and demos to summarize progress and impact.
+ Coordinate across internal teams, tech partners, and clients for effective project delivery.
Continuous Improvement & Thought Leadership
+ Stay current on GenAI trends, best practices, and emerging technologies; share insights across teams.
+ Mentor junior colleagues, promote knowledge transfer, and contribute to reusable methodologies.
Qualifications
Required:
+ Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Mathematics, or related field.
+ 5+ years of hands-on experience delivering machine learning or AI solutions, preferably including generative AI.
+ Independent thinker who can create the vision and execute on transforming data into high-end client products.
+ Demonstrated accomplishments in the following areas:
+ Deep understanding of GenAI models and approaches (LLMs, transformers, prompt engineering).
+ Proficiency in Python (PyTorch, TensorFlow, HuggingFace), Databricks, ML pipelines, and cloud-based deployment (Azure, AWS, GCP).
+ Experience integrating AI into enterprise applications, building APIs, and designing scalable workflows.
+ Knowledge of solution architecture, risk assessment, and mapping technology to business goals.
+ Familiarity with agile methodologies and iterative delivery.
+ Commitment to responsible AI, including data ethics, privacy, and regulatory compliance.
+ Ability to travel 0-10%, on average, based on the work you do and the clients and industries/sectors you serve
+ Limited immigration sponsorship may be available.
Preferred:
+ Relevant Certifications: May include Google Cloud Professional ML Engineer, Microsoft Azure AI Engineer, AWS Certified Machine Learning, or specialized GenAI/LLM credentials.
+ Experience with data visualization tools such as Tableau
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $102,500 - $188,900.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation
************************************************************************************************************
EA_FA_ExpHire
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Data Engineer - Platform & Product
Data engineer job in Milwaukee, WI
We are seeking a skilled and solution-oriented Data Engineer to contribute to the development of our growing Data Engineering function. This role will be instrumental in designing and optimizing data workflows, building domain-specific pipelines, and enhancing platform services. The ideal candidate will help evolve our Snowflake-based data platform into a scalable, domain-oriented architecture that supports business-critical analytics and machine learning initiatives.
Responsibilities
The candidate is expected to:
* Design and build reusable platform services, including pipeline frameworks, CI/CD workflows, data validation utilities, data contracts, and lineage integrations (see the contract-check sketch after this list)
* Develop and maintain data pipelines for sourcing, transforming, and delivering trusted datasets into Snowflake
* Partner with Data Domain Owners to onboard new sources, implement data quality checks, and model data for analytics and machine learning use cases
* Collaborate with the Lead Data Platform Engineer and Delivery Manager to deliver monthly feature releases and support bug remediation
* Document and promote best practices for data pipeline development, testing, and deployment
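As a hedged sketch of the data-contract and validation utilities mentioned in the first bullet above, the Python snippet below checks an incoming batch against a declared contract (expected columns, types, and nullability) before it would be loaded into Snowflake. The contract fields and sample feed are illustrative only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ColumnSpec:
    name: str
    dtype: type
    nullable: bool = False

# Hypothetical contract for a daily pricing feed.
PRICING_CONTRACT = [
    ColumnSpec("instrument_id", str),
    ColumnSpec("price_date", str),
    ColumnSpec("close_price", float),
    ColumnSpec("currency", str, nullable=True),
]

def validate_batch(rows: list[dict], contract: list[ColumnSpec]) -> list[str]:
    """Return human-readable contract violations; an empty list means the batch passes."""
    violations = []
    expected = {spec.name for spec in contract}
    for i, row in enumerate(rows):
        missing = expected - row.keys()
        if missing:
            violations.append(f"row {i}: missing columns {sorted(missing)}")
            continue
        for spec in contract:
            value = row[spec.name]
            if value is None:
                if not spec.nullable:
                    violations.append(f"row {i}: {spec.name} is null")
            elif not isinstance(value, spec.dtype):
                violations.append(f"row {i}: {spec.name} expected {spec.dtype.__name__}")
    return violations

batch = [
    {"instrument_id": "US0378331005", "price_date": "2025-06-30", "close_price": 212.44, "currency": "USD"},
    {"instrument_id": "US5949181045", "price_date": "2025-06-30", "close_price": None, "currency": "USD"},
]
for problem in validate_batch(batch, PRICING_CONTRACT):
    print(problem)
```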
Qualifications
The successful candidate will possess strong analytical skills and attention to detail. Additionally, the ideal candidate will possess:
* 3-6 years of experience in data engineering or analytics engineering
* Strong SQL and Python skills; experience with dbt or similar transformation frameworks
* Demonstrated experience building pipelines and services on Snowflake or other modern cloud data platforms
* Understanding of data quality, validation, lineage, and schema evolution
* Background in financial or market data (trading, pricing, benchmarks, ESG) is a plus
* Strong collaboration and communication skills, with a passion for enabling domain teams
Privacy Notice for California Applicants
Artisan Partners Limited Partnership is an equal opportunity employer. Artisan Partners does not discriminate on the basis of race, religion, color, national origin, gender, age, disability, marital status, sexual orientation or any other characteristic protected under applicable law. All employment decisions are made on the basis of qualifications, merit and business need.
#LI-Hybrid
Security Lead - Data Protection
Data engineer job in Milwaukee, WI
Currently working with a Fortune 500 global manufacturing client that is looking for a Security Lead focused on Data Protection. This client is looking to add a permanent member to their security team who can grow within the company.
Responsibilities:
• Develop and lead the global Data Protection program across all of the client's business units
• Build and manage an effective data classification program
• Develop and drive data protection policies, standards, and processes
• Develop and drive a data loss prevention (DLP) tools strategy, partnering with the company's Information Technology groups and the client's business units
• Align data protection programs with other information security and cross-functional programs
• Direct and improve the Data Protection and DLP programs and associated governance activities, including metrics, issue tracking and remediation, and programs supporting client policies.
• Establish and maintain cross-business relationships with key Leads and stakeholders across the client's business, information security, and IT functions
• Coordinate with the Global IT Council to ensure compliance with the client's standards and share best practices
• Conduct research and make recommendations on products, services, and standards in support of all global infrastructure security efforts.
• Develop and maintain appropriate response playbooks, facilitate routine exercises, and ensure a sound communication process for all cyber events
Job Requirements:
Formal Education & Certification
• Bachelor's degree in Information Security, IT Engineering, or Computer Science with 5+ years of IT experience
• Industry-recognized security certifications such as GIAC, CISSP, CISM, or CISA are preferred
Qualifications
• Minimum 5 years' information security experience including experience mapping and securing business processes / data flows
• Demonstrated experience building and leading a cyber-security program
• Advanced knowledge of data protection/DLP and Windows client/server security concepts, best practices, and procedures
• Solid hands-on technical experience is essential
• Experience in Information Security Incident Response
• Experience in IaaS/SaaS environments
• Broad understanding of all aspects of IT and enterprise systems interoperability
• Ability to communicate technical topics (verbal and written) to multiple organizational levels
• Global enterprise experience is preferred
Personal Attributes:
• Demonstrated leadership managing direct and matrix-reporting global teams
• Demonstrated experience leading global programs across technology and business functions
• Strong interpersonal, written, and oral communication skills
• Able to conduct research into issues and products as required
• Ability to prioritize and execute tasks in a fast-paced environment and make sound decisions in emergency situations
• Highly self-motivated and directed
• Keen attention to detail
• Proven analytical and problem-solving abilities
Additional Information
Work with blue Stone recruiting to find your next Cyber Security role. You can find us at ******************************* We look forward to speaking with you.
All your information will be kept confidential according to EEO guidelines.
Data Scientist, US Supply Chain
Data engineer job in Milwaukee, WI
What you will do
In this exciting role you will lead the effort to build and deploy predictive and prescriptive analytics in our next generation decision intelligence platform. The work will require helping to build and maintain a digital twin of our production supply chain, perform optimization and forecasting, and connect our analytics and ML solutions to enable our people to make the best data-driven decisions possible!
This will require working with predictive and prescriptive analytics and decision intelligence across the US / Canada region at Clarios. You'll apply modern statistics, machine learning, and AI to real manufacturing and supply chain problems, working side-by-side with our business stakeholders and our global analytics team to deploy transformative solutions, not just models.
How you will do it
Build production-ready ML/statistical models (regression/classification, clustering, time series, linear/non-linear optimizations) to detect patterns, perform scenario analytics, and generate actionable insights and outcomes (a minimal modeling sketch follows this section).
Wrangle and analyze data with Python and SQL; perform feature engineering, data quality checks, and exploratory analysis to validate hypotheses and model readiness.
Develop digital solutions /visuals in Power BI and our decision intelligence platform to communicate results and monitor performance with business users.
Partner with stakeholders to clarify use cases, translate needs into technical tasks/user stories, and iterate solutions in sprints.
Manage model deployment (e.g., packaging models, basic MLOps) with guidance from Global Analytics
Document and communicate model methodology, assumptions, and results to non-technical audiences; support troubleshooting and continuous improvement of delivered analytics.
Deliver value realization as part of our business analytics team to drive positive business outcomes for our metals team.
Deliver incremental value quickly (first dashboards, baseline models) and iterate with stakeholder feedback.
Balance rigor with practicality-choose the simplest model that solves the problem and can be supported in production.
Keep data quality front-and-center; instrument checks to protect decisions from drift and bad inputs.
Travel: Up to ~10% for plant or stakeholder visits
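To ground the modeling responsibilities above, here is a minimal scikit-learn sketch that fits a regression model on synthetic supply-chain features and reports holdout error. The features, target, and data are entirely synthetic placeholders, not Clarios data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Synthetic supply-chain features: supplier lead time, order backlog, scrap rate.
X = np.column_stack([
    rng.normal(14, 3, n),        # supplier lead time (days)
    rng.poisson(120, n),         # open order backlog (units)
    rng.uniform(0.01, 0.06, n),  # scrap rate
])
# Synthetic target: weekly shipment shortfall with noise.
y = 0.8 * X[:, 0] + 0.05 * X[:, 1] + 300 * X[:, 2] + rng.normal(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"Holdout MAE: {mean_absolute_error(y_test, pred):.2f} units")
```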
What we look for
Bachelor's degree in Statistics, Mathematics, Computer Science, Engineering, or related field-or equivalent practical experience.
1-3 years (or strong internship/co-op) applying ML/statistics on business data.
Python proficiency (pandas, scikit-learn, SciPy/statsmodels) and SQL across common platforms (e.g., SQL Server, Snowflake).
Core math/stats fundamentals: probability, hypothesis testing/DoE basics, linear algebra, and the principles behind common ML methods.
Data visualization experience with Power BI / Decision Intelligence Platforms for analysis and stakeholder storytelling.
Ability to work in cross-functional teams and explain technical work clearly to non-technical partners. Candidates must be self-driven, curious, and creative
Preferred
Cloud & big data exposure: Azure (or AWS), Databricks/Spark; Snowpark is a plus.
Understanding of ETL/ELT tools such as ADF, SSIS, Talend, Informatica, or Matillion.
Experience building and deploying models in a Decision Intelligence platform such as Palantir or Aera.
MLOps concepts (model validation, monitoring, packaging with Docker/Kubernetes).
Deep learning basics (PyTorch/Keras) for the right use cases.
Experience contributing to agile backlogs, user stories, and sprint delivery.
3+ years of experience in data analytics
Master's Degree in Statistics, Economics, Data Science or computer science.
What you get:
Medical, dental and vision care coverage and a 401(k) savings plan with company matching - all starting on date of hire
Tuition reimbursement, perks, and discounts
Parental and caregiver leave programs
All the usual benefits such as paid time off, flexible spending, short-and long-term disability, basic life insurance, business travel insurance, Employee Assistance Program, and domestic partner benefits
Global market strength and worldwide market share leadership
HQ location has earned LEED certification for sustainability and offers a full-service cafeteria and workout facility
Clarios has been recognized as one of 2025's Most Ethical Companies by Ethisphere. This prestigious recognition marks the third consecutive year Clarios has received this distinction.
Who we are:
Clarios is the force behind the world's most recognizable car battery brands, powering vehicles from leading automakers like Ford, General Motors, Toyota, Honda, and Nissan. With 18,000 employees worldwide, we develop, manufacture, and distribute energy storage solutions while recovering, recycling, and reusing up to 99% of battery materials-setting the standard for sustainability in our industry. At Clarios, we're not just making batteries; we're shaping the future of sustainable transportation. Join our mission to innovate, push boundaries, and make a real impact. Discover your potential at Clarios-where your power meets endless possibilities.
Veterans/Military Spouses:
We value the leadership, adaptability, and technical expertise developed through military service. At Clarios, those capabilities thrive in an environment built on grit, ingenuity, and passion-where you can grow your career while helping to power progress worldwide. All qualified applicants will be considered without regard to protected characteristics.
We recognize that people come with a wealth of experience and talent beyond just the technical requirements of a job. If your experience is close to what you see listed here, please apply. Diversity of experience and skills combined with passion is key to challenging the status quo. Therefore, we encourage people from all backgrounds to apply to our positions. Please let us know if you require accommodations during the interview process by emailing Special.Accommodations@Clarios.com. We are an Equal Opportunity Employer and value diversity in our teams in terms of work experience, area of expertise, gender, ethnicity, and all other characteristics protected by laws in the countries where we operate. For more information on our commitment to sustainability, diversity, and equal opportunity, please read our latest report. We want you to know your rights because EEO is the law.
A Note to Job Applicants: please be aware of scams being perpetrated through the Internet and social media platforms. Clarios will never require a job applicant to pay money as part of the application or hiring process.
To all recruitment agencies: Clarios does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Clarios employees or any other company location. Clarios is not responsible for any fees related to unsolicited resumes/CVs.
Business Intelligence Data Modeler
Data engineer job in Milwaukee, WI
CapB is a global leader in IT Solutions and Managed Services. Our R&D is focused on providing cutting-edge products and solutions across Digital Transformations from Cloud, AI/ML, IOT, Blockchain to MDM/PIM, Supply chain, ERP, CRM, HRMS and Integration solutions. For our growing needs we need consultants who can work with us on a salaried or contract basis. We provide industry standard benefits, and an environment for LEARNING & Growth.
For one of our ongoing projects we are looking for a Business Intelligence Data Modeler. The position is based out of Milwaukee.
Responsibilities:
The Business Intelligence Analyst performs a variety of project-oriented tasks to support the information needs of the organization.
This position is responsible for all phases of reporting, decision support, and data analysis activities including report design, measure development, data collection, summarization, and validation.
The BI Analyst exercises independent judgement and discretionary decision making related to managing multiple and more complex reporting projects.
The BI Analyst is proficient with analytical and reporting tools, database development, ETL processes, query languages, and database and spreadsheet tools.
The BI Analyst will participate in reporting and presentations to various levels of management and staff and may also be included in action plan development.
The BI Analyst will have in-depth experience with Power BI to create dashboards, data exploration, visualization, and data storytelling from concept to final deliverables.
Advanced experience with Power BI, Power BI dataflows, and dashboards. Technical expertise in data modeling and design to interact with multiple sources of data.
Ability to write complex DAX code and SQL queries for data manipulation.
Skills:
10 years of experience required in data analysis.
10 years of experience required in dashboarding/Business Objects Xcelsius.
Data Warehouse Developer
Data engineer job in Milwaukee, WI
Description
We are looking for a skilled Data Warehouse Analyst to join our team on a contract basis in Milwaukee, Wisconsin. In this role, you will play a pivotal part in developing and maintaining data solutions that support organizational analytics and decision-making. You will work closely with cross-functional teams to ensure data integration, accuracy, and accessibility using modern tools and methodologies.
Responsibilities:
- Design and implement data warehouse solutions to support business intelligence and reporting needs.
- Develop and maintain ETL processes to extract, transform, and load data from Oracle into Azure SQL Server.
- Collaborate with stakeholders and business analysts to gather requirements and translate them into actionable technical solutions.
- Optimize workflows and ensure efficient performance of the data warehouse environment.
- Validate and monitor data quality to ensure integrity and reliability.
- Create and maintain documentation for processes, architecture, and data models.
- Troubleshoot and resolve issues related to data integration and system performance.
- Utilize Azure Data Factory for data processing and workflow management.
- Apply Kimball methodology to design and maintain efficient data models.
- Support the ongoing improvement of data systems and analytics processes.
Requirements:
- Proven experience in data warehousing, including design and development.
- Expertise in ETL processes and tools, with a focus on data integration.
- Proficiency in Azure Data Factory for creating workflows and managing data pipelines.
- Strong knowledge of Microsoft SQL Server and Azure SQL Database.
- Familiarity with Oracle Autonomous Data Warehouse systems.
- Experience with business intelligence and data warehousing methodologies.
- Ability to apply Kimball methodology in data model design.
- Strong problem-solving skills and attention to detail to ensure data accuracy and quality.
Technology Doesn't Change the World, People Do.
Robert Half is the world's first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.
Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. Download the Robert Half app (https://www.roberthalf.com/us/en/mobile-app) and get 1-tap apply, notifications of AI-matched jobs, and much more.
All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit roberthalf.gobenefits.net for more information.
© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking "Apply Now," you're agreeing to Robert Half's Terms of Use (https://www.roberthalf.com/us/en/terms) .
Senior Data Engineer
Data engineer job in Milwaukee, WI
We're looking for those individuals-the creative thinkers and innovation seekers-who are content with nothing short of changing the world. Discover the endless opportunities within the Medical College of Wisconsin (MCW) and be inspired by the work we can do together to improve health, and make a positive, daily impact in our communities.
As a Sr. Data Engineer, you will prepare and transform data for analytical, operational, and intelligent automation uses to support diverse stakeholder needs throughout the Center for International Blood and Marrow Transplant Research (CIBMTR), including statisticians, data scientists, scientific directors, and CIBMTR partners. You will design, develop, and implement data- and AI-centric solutions aligned with evolving user needs, leveraging structured and unstructured data, optimizing data quality, and enabling scalable Machine Learning and Agentic Artificial Intelligence (AI) systems.
Responsibilities:
Lead or actively contribute to multidisciplinary data and AI workstreams, collaborating with data scientists, analytics engineers, and partner engineering teams to design and build next-generation intelligent data solutions.
Design and implement modern, scalable, secure, high-performing data architectures, including data lakes, lakehouses, and data commons.
Develop and optimize data pipelines (ETL/ELT) and orchestration systems for large-scale ingestion and processing of structured and unstructured data, integrating organizational data for analytics and ML/AI workloads and applications.
Build reusable, modular components and APIs to support scalable Agentic AI frameworks and enable autonomous data operations aligned with outcomes research and clinical trials.
Work independently or as part of a team with subject matter experts to identify user needs and requirements for efficient, scalable and reliable data pipelines and models that support data-driven initiatives.
Support automation of operational workflows using Agentic AI, including evaluation, performance tuning and observability.
Develop and maintain clear documentation aligned with Standard Operating Procedures (SOPs), best practices, and regulatory requirements for both data engineering and AI components.
Mentor and train data engineers and analytics engineers in the adoption and integration of AI and ML methods and frameworks.
Perform other duties as required.
Knowledge - Skills - Abilities
Experience in designing, implementing, and scaling data pipelines for structured and unstructured data in modern Lakehouse or data lake architectures.
Strong proficiency in SQL and at least one scripting language (e.g., Python).
Solid understanding of data engineering principles, including data preparation, feature engineering, and model lifecycle management.
Solid understanding of prompt engineering.
Experience with cloud-based AI/ML platforms (e.g., AWS SageMaker, Bedrock, or comparable).
Familiarity with LLMs, agentic frameworks, and AI orchestration patterns (e.g., LangChain, AutoGen, or similar).
Familiarity with techniques to integrate organizational data into AI workflows, such as Retrieval Augmented Generation (RAG); a minimal retrieval sketch follows this list.
Strong data profiling and data quality assurance experience.
Knowledge of workflow decomposition for automation.
Excellent problem-solving, analytical thinking, and communication skills.
Ability to mentor technical staff and communicate data engineering and AI concepts to non-expert audiences.
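As a hedged illustration of the Retrieval Augmented Generation pattern referenced in this list, the sketch below implements the retrieval half with plain NumPy: embed a small document set and a query with a toy hashing embedder, rank by cosine similarity, and assemble a grounded prompt. In practice the embedder and generator would be real models behind a managed service; every document, name, and score here is a stand-in.

```python
import numpy as np

def toy_embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a real embedding model (consistent within one process)."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Hypothetical knowledge-base passages.
corpus = [
    "Doc 1: Follow-up data are submitted through the registry's web forms.",
    "Doc 2: Center staff can request analytic extracts via the data portal.",
    "Doc 3: Quarterly summary reports are distributed to participating centers.",
]
doc_vectors = np.vstack([toy_embed(d) for d in corpus])

query = "How are follow-up data submitted?"
scores = doc_vectors @ toy_embed(query)   # cosine similarity (vectors are unit length)
top = np.argsort(scores)[::-1][:2]        # keep the two most relevant passages

context = "\n".join(corpus[i] for i in top)
prompt = (
    "Answer using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
)
print(prompt)
```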
Preferred:
Knowledge of data interoperability and data standards (e.g., FHIR, HL7, JSON, XML).
Experience with Agentic AI reasoning patterns and system frameworks.
Experience working in AGILE Scrum team framework.
Familiarity with orchestration tools (e.g. Airflow or Dagster).
Familiarity with R, SAS, JupyterLab, or other statistical software is a plus.
Experience with Model Context Protocol (MCP), RESTful APIs and modern integration methods.
Familiarity with model evaluation and observability.
Preferred Schedule:
Monday-Friday
Position Requirements:
Minimum Qualifications:
Appropriate experience may be substituted for education on an equivalent basis.
Minimum education: Bachelor's Degree
Minimum experience: 5 years
Preferred Qualifications:
Preferred education: Master's degree
Preferred experience: 8 years in Computer Science, Informatics, Data Science, or technical discipline in healthcare or life sciences. AI/ML certifications are preferred.
Why MCW?
Outstanding Healthcare Coverage, including but not limited to Health, Vision, and Dental.
403B Retirement Package
Competitive Vacation, Sick Time, and Paid Holidays
Tuition Reimbursement
Paid Parental Leave
For a brief overview of our benefits see: ******************************************************** #LI-NK1
MCW as an Equal Opportunity Employer and Commitment to Non-Discrimination
The Medical College of Wisconsin (MCW) is an Equal Opportunity Employer. We are committed to fostering an inclusive community of outstanding faculty, staff, and students, as well as ensuring equal educational opportunity, employment, and access to services, programs, and activities, without regard to an individual's race, color, national origin, religion, age, disability, sex, gender identity/expression, sexual orientation, marital status, pregnancy, predisposing genetic characteristic, or military status. Employees, students, applicants, or other members of the MCW community (including but not limited to vendors, visitors, and guests) may not be subjected to harassment that is prohibited by law or treated adversely or retaliated against based upon a protected characteristic.
Data Engineer
Data engineer job in Mequon, WI
Charter Manufacturing is a fourth-generation family-owned business where our will to grow drives us to do it better. Join the team and become part of our family!
Applicants must be authorized to work for ANY employer in the U.S. Charter Manufacturing is unable to sponsor for employment visas at this time.
This position is hybrid, 3 days a week in office in Mequon, WI.
BI&A - Lead Data Engineer
Charter Manufacturing continues to invest in Data & Analytics. Come join a great team and great culture leveraging your expertise to drive analytics transformation across Charter's companies. This is a key role in the organization that will provide thought leadership, as well as add substantial value by delivering trusted data pipelines that will be used to develop models and visualizations that tell a story and solve real business needs/problems. This role will collaborate with team members and business stakeholders to leverage data as an asset driving business outcomes aligned to business strategies.
Having 7+ years of prior experience developing data pipelines and partnering with team members and business stakeholders to drive adoption will be critical to the success of this role.
MINIMUM QUALIFICATIONS:
Bachelor's degree in computer science, data science, software engineering, information systems, or related quantitative field; master's degree preferred
At least seven years of work experience in data management disciplines, including data integration, modeling, optimization and data quality, or other areas directly relevant to data engineering responsibilities and tasks
Proven project experience designing, developing, deploying, and maintaining data pipelines used to support AI, ML, and BI using big data solutions (Azure, Snowflake)
Strong knowledge in Azure technologies such as Azure Web Application, Azure Data Explorer, Azure DevOps, and Azure Blob Storage to build scalable and efficient data pipelines
Strong knowledge using programming languages such as R, Python, C#, and Azure Machine Learning Workspace development
Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks)
Experience with database technologies such as SQL, Oracle, and Snowflake
Prior experience with ETL/ELT data ingestion into data lakes/data warehouses for analytics consumption
Strong SQL skills
Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products
Passionate about teaching, coaching, and mentoring others
Strong problem-solving and debugging skills, including the ability to trace issues to their source in unfamiliar code or systems and to recognize and solve problems
Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals
Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options
Demonstrated experience delivering business value by structuring and analyzing large, complex data sets
Demonstrated initiative, strong sense of accountability, collaboration, and known as a trusted business partner
PREFERRED QUALIFICATIONS INCLUDE EXPERIENCE WITH:
Manufacturing industry experience, specifically heavy industry, supply chain, and operations
Designing and supporting data integrations with ERP systems such as Oracle or SAP
MAJOR ACCOUNTABILITIES:
Designs, develops, and supports data pipelines for batch and streaming workloads, extracting data from various sources (databases, APIs, external systems), transforming it into the desired format, and loading it into the appropriate data storage systems (a minimal pipeline sketch follows this list)
Collaborates with data scientists and analysts to optimize models and algorithms in accordance with data quality, security, and governance policies
Ensures data quality, consistency, and integrity during the integration process, performing data cleansing, aggregation, filtering, and validation as needed
Optimizes data pipelines and data processing workflows for performance, scalability, and efficiency; monitors and tunes pipelines to resolve performance bottlenecks
Establishes architecture patterns, design standards, and best practices to accelerate delivery and adoption of solutions
Assists, educates, and trains users to drive self-service enablement, leveraging best practices
Collaborates with business subject matter experts, analysts, and offshore team members to develop and deliver solutions in a timely manner
Embraces and establishes governance of data and algorithms, quality standards, and best practices, ensuring data accuracy
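As a rough illustration of the batch extract-transform-load work described above, here is a minimal Python sketch with a simple data-quality gate before loading. The connection URLs, table names, and columns are assumptions for illustration only (the Snowflake URL also assumes the snowflake-sqlalchemy driver); this is not Charter's actual pipeline.

```python
# Illustrative batch ETL step with a fail-fast data-quality gate.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-host/erp")        # hypothetical source system
warehouse = create_engine("snowflake://user:pass@account/db/schema")    # hypothetical warehouse target

# Extract: pull yesterday's shipments from the source system.
df = pd.read_sql("SELECT * FROM shipments WHERE ship_date = CURRENT_DATE - 1", source)

# Transform: conform types and stamp the load time.
df["ship_date"] = pd.to_datetime(df["ship_date"])
df["loaded_at"] = pd.Timestamp.now(tz="UTC")

# Data-quality gate: stop the load if required keys are missing or duplicated.
assert df["shipment_id"].notna().all(), "null shipment_id detected"
assert not df["shipment_id"].duplicated().any(), "duplicate shipment_id detected"

# Load: append into a warehouse staging table for downstream modeling.
df.to_sql("stg_shipments", warehouse, if_exists="append", index=False)
```

In practice the same pattern would typically run inside an orchestrated job (Azure Data Factory, Databricks, or similar) with logging and alerting rather than bare assertions.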
We offer comprehensive health, dental, and vision benefits, along with a 401(k) plan that includes employer matching and profit sharing. Additionally, we offer company-paid life insurance, disability coverage, and paid time off (PTO).
Data Architect
Data engineer job in West Allis, WI
Invest In You! Tri City National Bank is your hometown bank. We believe in putting customers first, building relationships, and fostering a sense of community. We work in a team environment with opportunities for hard workers to grow personally and professionally. We enjoy celebrating success and great benefits along the way. Most importantly, we believe superior customer service paired with the right banking solutions help our customers and businesses fulfill their financial dreams, and our communities grow. Our ideal candidate believes in our mission, values continuous learning, and is comfortable adapting to change. If this resonates with you, apply today and come join our team. #investinyou
The Data Architect is a strategic support role responsible for designing and implementing the organization's enterprise data management strategy. Reporting to the Director of IT, this role will define and drive standards, policies, and architectural practices for data access, modeling, stewardship, categorization, quality, archival, and destruction.
The Data Architect will serve as the organizational authority on data architecture, owning database platform strategy and participating in enterprise governance through the Architectural Review Board. This role will collaborate across technical and business domains to ensure data is managed as a strategic asset, with a strong emphasis on governance across SaaS platforms and third-party data processors.
Compensation: $150,000+ annually depending on experience.
This position is in-office at our Operations Center in West Allis, WI
Responsibilities
Strategic Planning & Execution
Conduct a comprehensive assessment of the current data environment and maturity.
Deliver a multi-year data strategy including prioritized recommendations, implementation timeline, and required resources.
Provide data governance thought leadership.
Build and/or coordinate the future data management organization. This may include shaping potential staffing needs.
Strategic Architecture & Governance
Develop and maintain the enterprise data architecture strategy, including standards for data access, modeling, categorization, stewardship, and quality.
Own the architectural design of database platforms, including redundancy, backup, and recovery strategies.
Participate in the Architectural Review Board as the data management representative.
Recommend and guide the adoption of supporting tools for data inventory, modeling, categorization, and quality assurance.
Establish architectural standards that apply across both on-premises and SaaS-hosted data environments.
SaaS & Third-Party Data Governance
Lead efforts to inventory and categorize data stored in SaaS applications and cloud platforms.
Track and document data processors and sub-processors handling corporate data to meet regulatory obligations.
Define governance policies for data shared with third-party applications, ensuring visibility and control over sensitive information.
Collaborate with Information Security and Compliance teams to align SaaS data governance with enterprise risk and regulatory frameworks.
Policy & Process Development
Define and implement policies and procedures for data archival and destruction, including unstructured data across the corporate network.
Co-lead efforts to track and manage sensitive data shared with third-party and cloud-based applications.
Establish standards and constraints to guide the Business Intelligence team, while maintaining separation from BI execution.
Cross-Functional Collaboration
Facilitate the development of a data stewardship program by partnering with business departments and subject matter experts.
Serve as a key contributor to corporate projects involving the creation, modification, or destruction of sensitive data.
Collaborate with Information Security, Infrastructure, and Application teams to ensure alignment with enterprise architecture and compliance requirements.
Operational
Review database design decisions to ensure that solutions exhibit data storage and access that is secure, recoverable, performant, scalable, and maintainable.
Provide recommendations or, where applicable, implement enhancements that improve data security, access performance, and platform reliability.
Support regulatory data security and compliance through routine review and monitoring of relevant controls. Participate in design and development of data-oriented control processes.
Maintain a current and working knowledge of Tri City's products, processes, and data exchanges.
Perform other duties as requested.
Qualifications
Minimum 7 years of experience in data architecture or related disciplines.
Strong understanding of enterprise data management principles and practices.
Experience with third-party and cloud data integrations.
Background in highly regulated industries with demonstrated compliance awareness.
Familiarity with database administration and platform ownership (SQL Server preferred).
Prior career demonstration of technical, analytical, and administrative skills.
Excellent written and verbal communication skills and the ability to work well independently and with others.
Preferred Qualifications
Prior hands-on experience as a database administrator.
Certifications such as CDMP, CIMP, DAMA, or relevant cloud/data platform credentials.
Experience facilitating data governance programs or stewardship initiatives.
Prior experience with typical data architecture technologies (e.g., ETL, data modeling, data visualization, database performance monitoring, BI, MDM)
Why Join Us:
Community Impact: Be part of a local bank deeply rooted in community values, contributing to the growth and prosperity of our neighborhoods.
Innovation: Embrace a dynamic and evolving work environment that encourages fresh perspectives and continuous learning.
Career Growth: Unlock future opportunities for personal and professional development as you navigate through our Pathways for Success.
Celebration of Success: Join a team that values and celebrates individual and collective achievements.
Work Life Balance: No early mornings or late nights, enjoy a predictable schedule with major holidays off.
Great Employee Benefits that start on the 1st of the month after your hire date!
Part-Time:
401(k) with company match
Up to 20 hours of paid vacation after 3 months (must work an average of 20+ hours per week in order to be eligible for paid vacation.)
Full-Time:
401(k) with company match
Tuition reimbursement
Medical, dental, and vision coverage
Paid vacation and more!
Equal Opportunity Employer/Veterans/Disabled
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Reasonable Accommodation
If you are an individual with a disability and would like to request a reasonable accommodation as part of the employment selection process, please contact Human Resources at ************ or ************
Associate Data Engineer
Data engineer job in Milwaukee, WI
Baker Tilly is a leading advisory, tax and assurance firm, providing clients with a genuine coast-to-coast and global advantage in major regions of the U.S. and in many of the world's leading financial centers - New York, London, San Francisco, Los Angeles, Chicago and Boston. Baker Tilly Advisory Group, LP and Baker Tilly US, LLP (Baker Tilly) provide professional services through an alternative practice structure in accordance with the AICPA Code of Professional Conduct and applicable laws, regulations and professional standards. Baker Tilly US, LLP is a licensed independent CPA firm that provides attest services to its clients. Baker Tilly Advisory Group, LP and its subsidiary entities provide tax and business advisory services to their clients. Baker Tilly Advisory Group, LP and its subsidiary entities are not licensed CPA firms.
Baker Tilly Advisory Group, LP and Baker Tilly US, LLP, trading as Baker Tilly, are independent members of Baker Tilly International, a worldwide network of independent accounting and business advisory firms in 141 territories, with 43,000 professionals and a combined worldwide revenue of $5.2 billion. Visit bakertilly.com or join the conversation on LinkedIn, Facebook and Instagram.
Please discuss the work location status with your Baker Tilly talent acquisition professional to understand the requirements for an opportunity you are exploring.
Baker Tilly is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, gender identity, sexual orientation, or any other legally protected basis, in accordance with applicable federal, state or local law.
Any unsolicited resumes submitted through our website or to Baker Tilly Advisory Group, LP, employee e-mail accounts are considered property of Baker Tilly Advisory Group, LP, and are not subject to payment of agency fees. In order to be an authorized recruitment agency ("search firm") for Baker Tilly Advisory Group, LP, there must be a formal written agreement in place and the agency must be invited, by Baker Tilly's Talent Attraction team, to submit candidates for review via our applicant tracking system.
Job Description:
Associate Data Engineer
As a Senior Consultant - Associate Data Engineer you will design, build, and optimize modern data solutions for our mid‑market and enterprise clients. Working primarily inside the Microsoft stack (Azure, Synapse, and Microsoft Fabric), you will transform raw data into trusted, analytics‑ready assets that power dashboards, advanced analytics, and AI use cases. You'll collaborate with solution architects, analysts, and client stakeholders while sharpening both your technical depth and consulting skills.
Key Responsibilities:
* Data Engineering: Develop scalable, well‑documented ETL/ELT pipelines using T‑SQL, Python, Azure Data Factory/Fabric Data Pipelines, and Databricks; implement best‑practice patterns for performance, security, and cost control.
* Modeling & Storage: Design relational and lakehouse models; create Fabric OneLake shortcuts, medallion‑style layers, and dimensional/semantic models for Power BI (a medallion-layer sketch follows this list).
* Quality & Governance: Build automated data‑quality checks, lineage, and observability metrics; contribute to CI/CD workflows in Azure DevOps or GitHub.
* Client Delivery: Gather requirements, demo iterative deliverables, document technical designs, and translate complex concepts to non‑technical audiences.
* Continuous Improvement: Research new capabilities, share findings in internal communities of practice, and contribute to reusable accelerators. Collaborate with clients and internal stakeholders to design and implement scalable data engineering solutions.
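A minimal bronze-to-silver medallion step, sketched in PySpark against Delta tables, illustrating the kind of pipeline and modeling work listed above. It assumes a Fabric or Databricks lakehouse where Delta is available; the table paths and columns (orders, order_id, amount) are hypothetical.

```python
# Illustrative bronze -> silver refinement in a medallion-style lakehouse.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-silver").getOrCreate()

# Bronze: raw orders landed as-is from the source system (hypothetical path).
bronze = spark.read.format("delta").load("Tables/bronze_orders")

# Silver: typed, de-duplicated, and conformed for downstream semantic models.
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("Tables/silver_orders")
```

A gold layer would then aggregate silver tables into the dimensional or semantic models consumed by Power BI.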
Qualifications:
* Education - Bachelor's in Computer Science, Information Systems, Engineering, or related field (or equivalent experience)
* Experience - 2-3 years delivering production data solutions, preferably in a consulting or client‑facing role.
* Technical Skills:
Strong T‑SQL for data transformation and performance tuning.
Python for data wrangling, orchestration, or notebook‑based development.
Hands‑on ETL/ELT with at least one Microsoft service (ADF, Synapse Pipelines, Fabric Data Pipelines).
* Project experience with Microsoft Fabric (OneLake, Lakehouses, Data Pipelines, Notebooks, Warehouse, Power BI DirectLake) preferred
* Familiarity with Databricks, Delta Lake, or comparable lakehouse technologies preferred
* Exposure to DevOps (YAML pipelines, Terraform/Bicep) and test automation frameworks preferred
* Experience integrating SaaS/ERP sources (e.g., Dynamics 365, Workday, Costpoint) preferred
SAP Data Migration Lead/Architect
Data engineer job in Milwaukee, WI
Deltacubes is currently looking for a SAP Data Migration Lead/ Architect for one of our clients located in Milwaukee, WI
This role is mostly remote with some travel to client on need basis.
If you are interested, kindly apply here
Acceptable Visa Type: H1B; US Citizen/Green Card holder preferred.
Data Architect
Data engineer job in Waukegan, IL
Skill: Data Architect
Total Experience: 10 yrs.
Max Salary: DOE per hour
Employment Type: Contract (Temp/Consulting)
Job Duration: 10+ months
Domain: Any
Description
• 10+ years as a Data Architect
• Hands-on experience with various data architecture and modeling tools (Erwin and ER Studio)
• Some experience with ALL of Informatica Suite (ETL)
o Metadata manager
o PowerCenter
o Master Data Management (MDM)
o Analyst
• Excellent written and verbal communication skills
PLUSSES:
• Experience working in Biopharmaceutical industry (regulatory affairs)
• Experience with Data Warehouse and Data Mart Design
Additional Information
Multiple openings for OPT/CPT/H4/L2/EAD/Citizens.
Healthcare Data Analyst Lead (CMH Health)
Data engineer job in Brookfield, WI
Individual(s) must be legally authorized to work in the United States without the need for immigration support or sponsorship from Milliman now or in the future. Milliman is seeking a technically savvy, analytically strong Healthcare Data Analyst Lead to manage analytics and technology projects as well as coach and mentor junior analytical staff. The ideal candidate is someone seeking to join a challenging, yet rewarding environment focused on delivering world-class analytics across a variety of healthcare-related domains to a variety of healthcare entities.
Who We Are
Independent for over 75 years, Milliman delivers market-leading services and solutions to clients worldwide. Today, we are helping companies take on some of the world's most critical and complex issues, including retirement funding and healthcare financing, risk management and regulatory compliance, data analytics and business transformation.
Job Responsibilities
* Lead and manage analytics and technology projects from data ingestion through delivery.
* Design, develop, and optimize data processes and workflows supporting large healthcare datasets.
* Perform and oversee ETL, validation, and transformation tasks for claims, eligibility, and pharmacy data (a brief validation sketch follows this list).
* Guide project teams through the full "data-to-deliverable" lifecycle, ensuring accuracy and efficiency.
* Build analytical models, dashboards, and data pipelines to support consulting engagements.
* Collaborate with consultants, actuaries, and project managers to interpret results and deliver client insights.
* Review and approve technical work from peers and junior analysts to ensure quality standards are met.
* Mentor, coach, and delegate to analytical staff to strengthen technical and professional development.
* Contribute to innovation and process improvement initiatives, including automation, cloud enablement, and AI integration.
* Participate in client meetings and presentations, occasionally requiring travel.
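To make the ETL and validation responsibilities above concrete, here is a small pandas sketch that checks claims data and flags claims falling outside a member's eligibility window. The file names and columns (member_id, paid_amount, service_date, effective_date, term_date) are illustrative assumptions, not Milliman data specifications.

```python
# Illustrative claims/eligibility validation step.
import pandas as pd

claims = pd.read_csv("claims.csv", parse_dates=["service_date"])
elig = pd.read_csv("eligibility.csv", parse_dates=["effective_date", "term_date"])

# Basic checks: paid amounts should be non-negative and member IDs populated.
failed_basic = claims[(claims["paid_amount"] < 0) | (claims["member_id"].isna())]

# Join claims to eligibility and flag service dates outside the coverage window.
merged = claims.merge(elig, on="member_id", how="left")
outside_coverage = merged[
    (merged["service_date"] < merged["effective_date"])
    | (merged["service_date"] > merged["term_date"])
]

print(f"{len(failed_basic)} claims failed basic checks; "
      f"{len(outside_coverage)} fall outside eligibility coverage")
```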
Minimum requirements
* Bachelor's degree required (Computer Science, Management Information Systems, Computer Engineering, Math, Actuarial Science, Data Analytics, or related degree is preferred)
* 6+ years of experience in healthcare data analytics or a related technical analytics role.
* Advanced proficiency with SQL and Microsoft Excel for data analysis, validation, and automation.
* Strong programming skills in Python, R, or other analytical languages.
* Experience with data visualization and reporting tools (e.g., Power BI, Tableau, or R Shiny).
* Solid understanding of healthcare data structures, including claims, eligibility, and provider data.
* Proven ability to lead multiple projects simultaneously while mentoring and developing junior team members.
* Experience with cloud data technologies (e.g., Azure Data Factory, Databricks, Snowflake, AWS Redshift, or similar).
* Exposure to AI or Generative AI tools for data analysis, automation, or insight generation is preferred, but not required.
Competencies and Behaviors that Support Success in this Role
* Deep understanding of database architecture and large-scale healthcare data environments.
* Strong analytical thinking and the ability to translate complex data into actionable insights.
* Excellent communication skills, including the ability to explain technical concepts to non-technical audiences.
* Highly organized, detail-oriented, and able to manage competing priorities.
* Collaborative and proactive leadership style with a focus on mentorship and knowledge-sharing.
* Passion for applying analytics to improve healthcare performance, quality, and cost outcomes.
* Demonstrated accountability for quality, timelines, and client satisfaction.
* Fast learner who thrives in a dynamic, innovation-driven environment.
The Team
The Healthcare Data Analyst Lead will join a team that thrives on leveraging data, analytics, and technology to deliver meaningful business value. This is a team with technical aptitude and analytical prowess that enjoys building efficient and scalable products and processes. Ultimately, we are passionate about effecting change in healthcare. We also believe that collaboration and communication are cornerstones of success.
The Healthcare Data Analyst Lead will also join a mix of Healthcare Analysts, Leads, Consultants, and Principals. In addition, as part of the broader Milliman landscape, they will work alongside Healthcare Actuaries, Pharmacists, Clinicians, and Physicians. We aim to provide everyone a supportive environment, where we foster learning and growth through rewarding challenges.
Salary:
The overall salary range for this role is $104,900 - $199,065.
For candidates residing in:
* Alaska, California, Connecticut, Illinois, Maryland, Massachusetts, New Jersey, New York City, Newark, San Jose, San Francisco, Pennsylvania, Virginia, Washington, or the District of Columbia, the salary range is $120,635 - $199,065.
* All other locations the salary range is $104,900 - $173,100.
A combination of factors will be considered, including, but not limited to, education, relevant work experience, qualifications, skills, certifications, etc.
Location: It is preferred that candidates work on-site at our Brookfield, Wisconsin office, however, remote candidates will be considered.
The expected application deadline for this job is May 25, 2026.
Benefits
We offer a comprehensive benefits package designed to support employees' health, financial security, and well-being. Benefits include:
* Medical, Dental and Vision - Coverage for employees, dependents, and domestic partners.
* Employee Assistance Program (EAP) - Confidential support for personal and work-related challenges.
* 401(k) Plan - Includes a company matching program and profit-sharing contributions.
* Discretionary Bonus Program - Recognizing employee contributions.
* Flexible Spending Accounts (FSA) - Pre-tax savings for dependent care, transportation, and eligible medical expenses.
* Paid Time Off (PTO) - Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis.
* Holidays - A minimum of 10 observed holidays per year.
* Family Building Benefits - Includes adoption and fertility assistance.
* Paid Parental Leave - Up to 12 weeks of paid leave for employees who meet eligibility criteria.
* Life Insurance & AD&D - 100% of premiums covered by Milliman.
* Short-Term and Long-Term Disability - Fully paid by Milliman.
Equal Opportunity:
All qualified applicants will receive consideration for employment, without regard to race, color, religion, sex, sexual orientation, national origin, disability, or status as a protected veteran.
AI Data Scientist
Data engineer job in Milwaukee, WI
What you Will Do Clarios is seeking a skilled AI Data Scientist to design, develop, and deploy machine learning and AI solutions that unlock insights, optimize processes, and drive innovation across operations, offices, and products. This role focuses on transforming complex, high-volume data into actionable intelligence and enabling predictive and prescriptive capabilities that deliver measurable business impact. The AI Data Scientist will collaborate closely with AI Product Owners and business SMEs to ensure solutions are robust, scalable, and aligned with enterprise objectives.
This role requires an analytical, innovative, and detail-oriented team member with a strong foundation in AI/ML and a passion for solving complex problems. The individual must be highly collaborative, an effective communicator, and committed to continuous learning and improvement. This will be onsite three days a week in Glendale.
How you will do it
* Hypothesis Framing & Metric Measurement: Translate business objectives into well-defined AI problem statements with clear success metrics and decision criteria. Prioritize opportunities by ROI, feasibility, risk, and data readiness; define experimental plans and acceptance thresholds to progress solutions from concept to scaled adoption.
* Data Analysis & Feature Engineering: Conduct rigorous exploratory data analysis to uncover patterns, anomalies, and relationships across heterogeneous datasets. Apply advanced statistical methods and visualization to generate actionable insights; engineer high-value features (transformations, aggregations, embeddings) and perform preprocessing (normalization, encoding, outlier handling, dimensionality reduction). Establish data quality checks, schemas, and data contracts to ensure trustworthy inputs.
* Model Development & Iteration: Design and build models across classical ML and advanced techniques (deep learning, NLP, computer vision, time-series forecasting, anomaly detection, and optimization). Run statistically sound experiments (cross-validation, holdouts, A/B testing), perform hyperparameter tuning and model selection, and balance accuracy, latency, stability, and cost. Extend beyond prediction to prescriptive decision-making (policy, scheduling, setpoint optimization, reinforcement learning), with domain applications such as OEE improvement, predictive maintenance, production process optimization, and digital twin integration in manufacturing contexts.
* MLOps & Performance: Develop end-to-end pipelines for ingestion, training, validation, packaging, and deployment using CI/CD, reproducibility, and observability best practices. Implement performance and drift monitoring, automated retraining triggers, rollback strategies, and robust versioning to ensure reliability in dynamic environments. Optimize for scale, latency, and cost; support real-time inference and edge/plant-floor constraints under defined SLAs/SLOs. A minimal drift-check sketch follows this list.
* Collaboration & Vendor Leadership: Partner with AI Product Owners, business SMEs, IT, and operations teams to translate requirements into pragmatic, integrated solutions aligned with enterprise standards. Engage process owners to validate data sources, constraints, and hypotheses; design human-in-the-loop workflows that drive adoption and continuous feedback. Provide technical oversight of external vendors: evaluating capabilities, directing data scientists/engineers/solution architects, validating architectures and algorithms, and ensuring seamless integration, timely delivery, and measurable value. Mentor peers, set coding/modeling standards, and foster a culture of excellence.
* Responsible AI & Knowledge Management: Ensure data integrity, model explainability, fairness, privacy, and regulatory compliance throughout the lifecycle. Establish model risk controls; maintain documentation (model cards, data lineage, decision logs), audit trails, and objective acceptance criteria for production release. Curate reusable assets (feature catalogs, templates, code libraries) and best-practice playbooks to accelerate delivery while enforcing Responsible AI principles and rigorous quality assurance.
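As a minimal sketch of the drift monitoring mentioned in the MLOps responsibilities above, the snippet below compares a training-time feature distribution against recent production data with a two-sample Kolmogorov-Smirnov test and flags drift when the distributions differ significantly. The feature, threshold, and synthetic data are assumptions for illustration, not Clarios specifics.

```python
# Illustrative feature-drift check that could feed an automated retraining trigger.
import numpy as np
from scipy.stats import ks_2samp

def drift_detected(train_values: np.ndarray, live_values: np.ndarray, alpha: float = 0.01) -> bool:
    """Two-sample KS test between training and production feature values."""
    _, p_value = ks_2samp(train_values, live_values)
    return p_value < alpha  # small p-value: distributions differ, treat as drift

# Hypothetical usage with one numeric sensor feature.
rng = np.random.default_rng(0)
train_temp = rng.normal(70.0, 5.0, size=5_000)   # training-time readings
live_temp = rng.normal(74.0, 5.0, size=5_000)    # shifted production readings

if drift_detected(train_temp, live_temp):
    print("Drift detected: schedule retraining and alert the model owner")
```

In a production MLOps pipeline this check would run on a schedule per feature, with results logged to the monitoring system rather than printed.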
What we look for
* 5+ years of experience in data science and machine learning, delivering production-grade solutions in corporate or manufacturing environments.
* Strong proficiency in Python and common data science libraries (e.g., Pandas, NumPy, scikit-learn); experience with deep learning frameworks (TensorFlow, PyTorch) and advanced techniques (NLP, computer vision, time-series forecasting).
* Hands-on experience with data preprocessing, feature engineering, and EDA for large, complex datasets.
* Expertise in model development, validation, and deployment, including hyperparameter tuning, optimization, and performance monitoring.
* Experience interacting with databases and writing SQL queries.
* Experience using data visualization techniques for analysis and model explanation.
* Familiarity with MLOps best practices: CI/CD pipelines, containerization (Docker), orchestration, model versioning, and drift monitoring.
* Knowledge of cloud platforms (e.g., Microsoft Azure, Snowflake) and distributed computing frameworks (e.g., Spark) for scalable AI solutions.
* Experience with agile methodologies and collaboration tools (e.g., JIRA, Azure DevOps), working in matrixed environments across IT, analytics, and business teams.
* Strong analytical and business acumen, with the ability to quantify ROI and build business cases for AI initiatives.
* Excellent communication and stakeholder engagement skills; able to present insights and recommendations to technical and non-technical audiences.
* Knowledge of LLMs and VLMs is a strong plus.
* Understanding of manufacturing systems (SCADA, PLCs, MES) and the ability to integrate AI models into operational workflows is a strong plus.
* Willingness to travel up to 10% as needed.
#LI-AL1
#LI-HYBRID
What you get:
* Medical, dental and vision care coverage and a 401(k) savings plan with company matching - all starting on date of hire
* Tuition reimbursement, perks, and discounts
* Parental and caregiver leave programs
* All the usual benefits such as paid time off, flexible spending, short-and long-term disability, basic life insurance, business travel insurance, Employee Assistance Program, and domestic partner benefits
* Global market strength and worldwide market share leadership
* HQ location with LEED certification for sustainability, plus a full-service cafeteria and workout facility
* Clarios has been recognized as one of 2025's Most Ethical Companies by Ethisphere. This prestigious recognition marks the third consecutive year Clarios has received this distinction.
Who we are:
Clarios is the force behind the world's most recognizable car battery brands, powering vehicles from leading automakers like Ford, General Motors, Toyota, Honda, and Nissan. With 18,000 employees worldwide, we develop, manufacture, and distribute energy storage solutions while recovering, recycling, and reusing up to 99% of battery materials, setting the standard for sustainability in our industry. At Clarios, we're not just making batteries; we're shaping the future of sustainable transportation. Join our mission to innovate, push boundaries, and make a real impact. Discover your potential at Clarios, where your power meets endless possibilities.
Veterans/Military Spouses:
We value the leadership, adaptability, and technical expertise developed through military service. At Clarios, those capabilities thrive in an environment built on grit, ingenuity, and passion-where you can grow your career while helping to power progress worldwide. All qualified applicants will be considered without regard to protected characteristics.
We recognize that people come with a wealth of experience and talent beyond just the technical requirements of a job. If your experience is close to what you see listed here, please apply. Diversity of experience and skills combined with passion is key to challenging the status quo. Therefore, we encourage people from all backgrounds to apply to our positions. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, status as a protected veteran or other protected characteristics protected by law. As a federal contractor, we are committed to not discriminating against any applicant or employee based on these protected statuses. We will also take affirmative action to ensure equal employment opportunities. Please let us know if you require accommodations during the interview process by emailing Special.Accommodations@Clarios.com. We are an Equal Opportunity Employer and value diversity in our teams in terms of work experience, area of expertise, and all characteristics protected by laws in the countries where we operate. For more information on our commitment to sustainability, diversity, and equal opportunity, please read our latest report. We want you to know your rights because EEO is the law.
A Note to Job Applicants: please be aware of scams being perpetrated through the Internet and social media platforms. Clarios will never require a job applicant to pay money as part of the application or hiring process.
To all recruitment agencies: Clarios does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Clarios employees or any other company location. Clarios is not responsible for any fees related to unsolicited resumes/CVs.
Databricks Data Engineer - Manager - Consulting - Location Open
Data engineer job in Milwaukee, WI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Manager**
We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills.
**The opportunity:**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include:
+ Understanding and analyzing business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.
**Key Responsibilities:**
As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including:
+ Leading workstream delivery and ensuring quality in all processes.
+ Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services.
+ Implementing resource plans and budgets while managing engagement economics.
This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs.
**Skills and attributes for success:**
To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact:
+ Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices.
+ Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
+ Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value.
+ Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance.
+ Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services (a streaming ingestion sketch follows this list).
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics.
+ Strong analytical and problem-solving abilities.
+ Excellent communication skills, with the ability to convey complex information clearly.
+ Proven experience in managing and delivering projects effectively.
+ Ability to build and manage relationships with clients and stakeholders.
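A brief Structured Streaming sketch (PySpark) illustrating the kind of end-to-end ingestion referenced above: reading JSON events from cloud storage and appending them to a Delta table. The paths, schema, and checkpoint location are hypothetical, and the Delta sink assumes a Databricks (or otherwise Delta-enabled) environment.

```python
# Illustrative streaming ingestion into a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-ingest").getOrCreate()

events = (
    spark.readStream
    .format("json")
    .schema("event_id STRING, event_ts TIMESTAMP, amount DOUBLE")  # DDL-style schema
    .load("/mnt/raw/events/")                                      # hypothetical landing path
)

cleaned = (
    events
    .filter(F.col("event_id").isNotNull())
    .withColumn("ingest_date", F.to_date("event_ts"))
)

query = (
    cleaned.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events/")
    .outputMode("append")
    .partitionBy("ingest_date")
    .start("/mnt/silver/events/")
)
```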
**To qualify for the role, you must have:**
+ Bachelor's degree in computer science, Engineering, or a related field required; Master's degree preferred.
+ Typically, no fewer than 4-6 years of relevant experience in data engineering, with a focus on cloud data solutions and analytics.
+ Proven expertise in Databricks and experience with Spark for big data processing.
+ Strong background in data architecture and design, with experience in building complex cloud analytics solutions.
+ Experience in leading and managing teams, with a focus on mentoring and developing talent.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Managerial Role:**
+ **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision.
+ **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Large-Scale Implementation Programs:**
1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities.
2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing.
3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting.
**Ideally, you'll also have:**
+ Experience with advanced data analytics tools and techniques.
+ Familiarity with machine learning concepts and applications.
+ Knowledge of industry trends and best practices in data engineering.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for:**
We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Security Lead-Data Protection
Data engineer job in Milwaukee, WI
blue Stone Recruiting is a national search firm focused on placing top Cyber Security talent, from the Analyst level to CISO, with prestigious organizations nationwide.
Job Description
Currently working with a Fortune 500 global manufacturing client that is looking for a Security Lead- Data Protection focused individual. This client is looking to add a permanent member to their security team that can grow within the company.
Responsibilities:
• Develop and lead the global Data Protection program across all of the client's business units
• Build and manage an effective data classification program
• Develop and drive Data Protection policies, standards, and processes
• Develop and drive a data loss prevention (DLP) tools strategy, partnering with the company's Information Technology groups and the client's business units
• Align Data Protection programs with other information security and cross-functional programs
• Direct and improve the Data Protection and DLP programs and associated governance activities, including metrics, issue tracking and remediation, and programs supporting client policies.
• Establish and maintain cross-business relationships with key Leads and stakeholders across the client's business, information security, and IT functions
• Coordinate with the Global IT Council to ensure compliance with the client's standards and share best practices
• Conduct research and make recommendations on products, services, and standards in support of all global infrastructure security efforts.
• Develop and maintain appropriate response playbooks, facilitate routine exercises, and ensure a sound communication process for all cyber events
Job Requirements:
Formal Education & Certification
• Bachelor's degree in Information Security, IT Engineering, or Computer Science with 5+ years of IT experience
• Industry-recognized security certifications such as GIAC, CISSP, CISM, or CISA are preferred
Qualifications
• Minimum 5 years' information security experience including experience mapping and securing business processes / data flows
• Demonstrated experience building and leading a cyber-security program
• Advanced knowledge of data protection/DLP and Windows client/server security concepts, best practices, and procedures
• Solid “hands on” technical experience is essential
• Experience in Information Security Incident Response
• Experience in IaaS/SaaS environments
• Broad understanding of all aspects of IT and enterprise systems interoperability
• Ability to communicate technical topics (verbal and written) to multiple organizational levels
• Global enterprise experience is preferred
Personal Attributes:
• Demonstrated leadership managing direct and matrix-reporting global teams
• Demonstrated experience leading global programs across technology and business functions
• Strong interpersonal, written, and oral communication skills
• Able to conduct research into issues and products as required
• Ability to prioritize and execute tasks in a fast-paced environment and make sound decisions in emergency situations
• Highly self-motivated and directed
• Keen attention to detail
• Proven analytical and problem-solving abilities
Additional Information
Work with blue Stone Recruiting to find your next Cyber Security role. You can find us at *******************************. We look forward to speaking with you.
All your information will be kept confidential according to EEO guidelines.