
Data engineer jobs in Kenosha, WI - 502 jobs

  • Data Scientist II (Clinicogenomic)

    US Tech Solutions (4.4 company rating)

    Data engineer job in North Chicago, IL

    We are seeking a data scientist to apply advanced analytics and AI methods to EHR phenotyping data from clinicogenomic sources such as the UK Biobank (UKBB) and Alliance for Genomic Discovery (AGD) datasets, driving discovery through large-scale biobank analyses. The candidate will extract patient cohorts from the EHR data to support Genome-wide Association Studies (GWAS) and related applications that derive actionable insights from rich biobank data resources. Responsibilities: working with UKBB, AGD, and other real-world data (RWD); data curation and mining using the Linux command line; advanced analytics; advanced programming skills, including writing reusable scripts (R, Python, Spark, or SQL); learning the existing automated EHR phenotyping built on large longitudinal UK Biobank and AGD data; communication and teamwork. Preferred skills: biostatistics, genetics, and experience with the latest AI methodologies. Education and experience: 2-3 years with a PhD degree or 5 years with an MSc degree. Work environment: individual role within a collaborative team. Other notable details: advance state-of-the-art Phenomics Data Science initiatives using large-scale real-world data. Background successful in the role: experienced data scientists/data engineers with an interest in driving new mathematical solutions and innovation in line with business strategy and rapidly changing data streams. About US Tech Solutions: US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ************************ US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Recruiter Details: Name: Shivangi Shivpuri Email: ********************************* Internal Id: 26-01351
    $69k-98k yearly est. 3d ago

  • Software Engineer (IAM / Identity) 4840

    Tier4 Group

    Data engineer job in Franklin, WI

    Must haves:
    - Experience designing, developing, and implementing CIAM solutions that integrate with various systems and platforms
    - Hands-on experience with CIAM IDP tools such as Okta, Ping, or equivalent
    - Knowledge of orchestration tools like PingOne DaVinci or Transmit Security FlexID
    - Strong understanding of identity protocols (OIDC, OAuth, SAML, AD-Fed, API Gateways, SCIM)
    - Experience implementing and managing SSO, MFA, FIDO authentication, PAM, and Identity Governance & Administration
    Stand-out skills:
    - Experience with Okta and Transmit Security is a plus
    - Ability to provide technical leadership and mentorship
    Responsibilities:
    - Collaborate with business units, developers, vendors, and security engineers
    - Integrate with third-party software and on-premises infrastructure
    - Ensure high availability and a seamless user experience
    - Conduct impact analysis, analyze data, and create work-effort estimates
    - Lead automation and process-improvement initiatives
    - Support CIAM infrastructure upgrades, patches, and performance tuning
    - Maintain compliance with industry standards and best practices
    $63k-83k yearly est. 3d ago
  • Principal Data Scientist

    Maximus (4.3 company rating)

    Data engineer job in Milwaukee, WI

    Description & Requirements Maximus has an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work.) This is a remote position. Essential Duties and Responsibilities: - Make deep dives into the data, pulling out objective insights for business leaders. - Initiate, craft, and lead advanced analyses of operational data. - Provide a strong voice for the importance of data-driven decision making. - Provide expertise to others in data wrangling and analysis. - Convert complex data into visually appealing presentations. - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners. - Understand the importance of automation and look to implement and initiate automated solutions where appropriate. - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects. - Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects. 
- Guide operational partners on product performance and solution improvement/maturity options. - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization. - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages. - Mentor more junior data analysts/data scientists as needed. - Apply strategic approach to lead projects from start to finish; Job-Specific Essential Duties and Responsibilities: - Develop, collaborate, and advance the applied and responsible use of AI, ML, simulation, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation. - Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital. - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning. - Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments Minimum Requirements - Bachelor's degree in related field required. - 10-12 years of relevant professional experience required. Job-Specific Minimum Requirements (required skills that align with contract LCAT, verifiable, and measurable): - 10+ years of relevant Software Development + AI / ML / DS experience - Professional Programming experience (e.g. Python, R, etc.) 
- Experience with AI / Machine Learning - Experience working as a contributor on a team - Experience leading AI, DS, or Analytics teams - Experience mentoring junior staff - Experience with Modeling and Simulation - Experience with program management Preferred Skills and Qualifications: - Master's degree in a quantitative discipline (Math, Operations Research, Computer Science, etc.) - Experience developing machine learning or signal processing algorithms: - Ability to leverage mathematical principles to model new and novel behaviors - Ability to leverage statistics to identify true signals from noise or clutter - Experience working as an individual contributor in AI or modeling and simulation - Use of state-of-the-art technology to solve operational problems in AI, Machine Learning, or Modeling and Simulation spheres - Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles - Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions - Use and development of program automation, CI/CD, DevSecOps, and Agile - Experience managing technical teams delivering technical solutions for clients - Experience working with optimization problems like scheduling - Experience with Data Analytics and Visualizations - Cloud certifications (AWS, Azure, or GCP) - 10+ years of related experience in AI, advanced analytics, computer science, or software development #techjobs #Veteranspage EEO Statement Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances. Accommodations Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************. Minimum Salary: $156,640.00 Maximum Salary: $234,960.00
    $70k-98k yearly est. Easy Apply 2d ago
  • Data Scientist

    Bayforce (4.4 company rating)

    Data engineer job in Milwaukee, WI

    Role Title: Data Scientist Employment Type: Contract Duration: 3-6 month contract Preferred Location: Local to Milwaukee with 3 days onsite The Data Scientist will play a key role in supporting a predictive maintenance initiative by developing and deploying advanced analytical models focused on asset reliability and anomaly detection. This role is hands-on and highly specialized, requiring direct experience in predictive maintenance use cases rather than a broad or generalized data science background. This position is well-suited for candidates with strong experience in industrial analytics, asset performance management, or building anomaly detection models who can apply data science techniques to real-world operational challenges. Key Responsibilities * Design, develop, and implement predictive maintenance and anomaly detection models * Analyze asset and operational data to identify failure patterns and reliability risks * Build and validate models to support asset reliability and proactive maintenance strategies * Collaborate with engineering, operations, and technical teams to understand asset behavior and data sources * Deploy and manage data science solutions within an Azure-based environment * Write clean, maintainable code and collaborate through GitHub * Document modeling approaches, assumptions, and results for technical and business audiences * Support continuous improvement of predictive models based on performance and feedback Requirements Required Qualifications * Strong data science experience with a specific background in predictive maintenance, asset reliability, or building anomaly detection models * Proven experience developing and deploying machine learning or statistical models for operational use cases * Proficiency working in an Azure environment * Strong analytical, problem-solving, and communication skills Preferred Qualifications: * Experience with Python and/or R * Experience using GitHub for version control and collaboration * Background working with industrial, facilities, or asset-based data environments
    $66k-89k yearly est. 22h ago
  • Lead Data & BI Scientist

    Zurn Elkay Water Solutions Corporation

    Data engineer job in Milwaukee, WI

    The Company Zurn Elkay Water Solutions Corporation is a thriving, values-driven company focused on doing the right things. We're a fast growing, publicly traded company (NYSE: ZWS), with an enduring reputation for integrity, giving back, and providing an engaging, inclusive environment where careers flourish and grow. Named by Newsweek as One of America's Most Responsible Companies and an Energage USA Top Workplace, at Zurn Elkay Water Solutions Corporation, we never forget that our people are at the center of what makes us successful. They are the driving force behind our superior quality, product ingenuity, and exceptional customer experience. Our commitment to our people and their professional development is a recipe for success that has fueled our growth for over 100 years, as one of today's leading international suppliers of plumbing and water delivery solutions. Headquartered in Milwaukee, WI, Zurn Elkay Water Solutions Corporation employs over 2800 employees worldwide, working from 24 locations across the U.S., China, Canada, Dubai, and Mexico, with sales offices available around the globe. We hope you'll visit our website and learn more about Zurn Elkay at zurnelkay.com. If you're ready to join a company where what you do makes a difference and you have pride in the work you are doing, talk to us about joining the Zurn Elkay Water Solutions Corporation family! If you are a current employee, please navigate here to apply internally. Job Description The Lead Data & BI Scientist is a senior-level role that blends advanced data science capabilities with business intelligence leadership. This position is responsible for driving strategic insight generation, building predictive models, and leading analytics initiatives across departments such as sales, marketing, pricing, manufacturing, logistics, supply chain, and finance. 
The role requires both technical depth and business acumen to ensure that data-driven solutions are aligned with organizational goals and deliver measurable value. Key Accountabilities Strategic Insight & Business Partnership * Partner with business leaders to identify high-impact opportunities and form hypotheses. * Present findings and recommendations to leadership in a clear, impactful manner. * Demonstrate ROI and business value from analytics initiatives. Data Science Leadership * Define and implement data science processes, tools, and governance frameworks. * Mentor junior team members and foster a culture of continuous learning. Advanced Analytics & Modeling * Design, build and validate predictive models, machine learning algorithms and statistical analyses. * Translate complex data into actionable insights for strategic decision-making. Technology & Tools * Utilize tools such as Tableau, Power BI, OBIEE/OAC, Snowflake, SQL, R, Python, and data catalogs. * Stay current with emerging technologies like agentic analytics and AI-driven insights. * Evaluate and recommend BI platforms and data science tools. Project & Change Management * Manage analytics projects from inception to delivery. * Provide training and change management support for new tools and processes. * Lead the establishment of a data science center of excellence. Qualifications/Requirements * Bachelor's degree required in a quantitative field such as engineering, mathematics, science, and/or MIS; Master's degree preferred * 10+ years of overall work experience * 7+ years of experience in data science and statistical analysis * Strong understanding of and experience with analytics tools such as Tableau and OBIEE/OAC (or similar tools) for reporting and visualization, Snowflake for data storage, data modeling, data prep or ETL tools, R or Python, SQL, and data catalogs.
* Strong understanding of and experience with multiple statistical and quantitative models and techniques such as (but not limited to) those used for predictive analytics, machine learning, AI, linear models and optimization, clustering, and decision trees. * Deep experience applying data science to solve problems in at least one of the following areas is required. Experience in multiple areas is preferred: marketing, manufacturing, pricing, logistics, sourcing, and sales. * Strong communication (verbal, written) skills, and ability to work with all levels of the organization effectively * Working knowledge of and proven experience applying project management tools * Strong analytical skills * Ability to lead and mentor the work of others * High degree of creativity and latitude is expected Capabilities and Success Factors * Decision Quality - Making good and timely decisions that keep the organization moving forward. * Manages Complexity - Making sense of complex, high quantity and sometimes contradictory information to effectively solve problems. * Plans & Aligns - Planning and prioritizing work to meet commitments aligned with organizational goals. * Drives Results - Consistently achieving results, even under tough circumstances. * Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Total Rewards and Benefits * Competitive Salary * Medical, Dental, Vision, STD, LTD, AD&D, and Life Insurance * Matching 401(k) Contribution * Health Savings Account * Up to 3 weeks starting Vacation (may increase with tenure) * 12 Paid Holidays * Annual Bonus Eligibility * Educational Reimbursement * Matching Gift Program * Employee Stock Purchase Plan - purchase company stock at a discount! THIRD PARTY AGENCY: Any unsolicited submissions received from recruitment agencies will be considered property of Zurn Elkay, and we will not be liable for any fees or obligations related to those submissions.
Equal Opportunity Employer - Minority/Female/Disability/Veteran
    $68k-94k yearly est. Auto-Apply 16d ago
  • Data Scientist

    Northwestern Mutual (4.5 company rating)

    Data engineer job in Milwaukee, WI

    You and Northwestern Mutual. We believe relationships are built on trust. That our lives and our work matter. And we're much stronger together than we are apart. These beliefs launched our company nearly 160 years ago. Today, they're just a few of the reasons why people choose to build careers at Northwestern Mutual. Our business is about helping people secure their financial futures, and that starts with putting people first - our clients, our employees and our field representatives. Northwestern Mutual is known for financial strength. We're strong, innovative and growing. Come grow with us. Job Description In a company with such a long and storied history, this may be the most exciting and important time to be a part of Northwestern Mutual. We invest in our people. We provide opportunities for employees to grow themselves, their career and in turn, our business. We care. We make a positive difference in our communities. Nationally, thousands have benefitted from our support of research and programs to fight childhood cancer. Each year, our Foundation, employees and financial representatives donate time, talent and financial support to causes they're passionate about. We are an equal opportunity/affirmative action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender identity or expression, sexual orientation, national origin, disability, age or status as a protected veteran, or any other characteristic protected by law. What's the role? We collaborate with internal business partners to implement data-driven solutions with measurable business value.
Our innovative team of data scientists works in a highly collaborative, agile environment. As a Data Scientist, you will work with a diverse and talented team with responsibilities that include: Applying statistical and machine-driven techniques to address business objectives and client needs; Building relationships and collaborating with stakeholders to gather, consolidate and validate business assumptions prior to undertaking analytical work; Collaborating within the team to research, evaluate and select data sources; Partnering with the team to document requirements, assumptions and methodologies, including validation, testing and implementation strategies; Designing rich data visualizations to communicate complex ideas to internal clients and leaders; Assuring compliance with regulatory and privacy requirements during design and implementation of database development, modeling and analysis projects; Creating clear and easy-to-understand presentations/reports; Maintaining a high level of understanding of cutting-edge data science techniques and tools. Bring Your Best! What this role needs: Graduate degree in a quantitative discipline such as Data Science, Statistics, Mathematics, Economics, Computer Science or a related field; Experience in statistical modeling and data mining using large and complex datasets; Strong intellectual curiosity and willingness to adapt data science concepts and tools to solve real-world business problems; Demonstrated experience in influencing business decisions, including knowing when to escalate issues; Strong verbal and written communication skills, listening and teamwork skills and effective presentation skills; Demonstrated competencies such as excellent relationship management, creative problem solving, flexibility and willingness to challenge the status quo; Programming experience with SAS, R and other software; SQL Server or Hadoop experience is a plus.
#LI-PB1 Req ID: 10401 Position Type: Regular Full Time Education Experience: Bachelor's Required Employment Experience: 6-8 years Licenses/Certifications: FLSA Status: Exempt Posting Date: 01/10/2017
    $78k-100k yearly est. 60d+ ago
  • Lead ETL Architect (No H1B)

    Sonsoft (3.7 company rating)

    Data engineer job in Deerfield, IL

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services. As background, this client has been working with us and internal recruiting to fill this role for a LONG time. Previously, the client was looking for someone who could both technically lead the team specific to SAG technologies and also work with the business. Two things have changed: 1) the span of control of the team increased from SAG to include the other technologies listed in the job description, and 2) he has been unsuccessful in finding someone who was both the best technical architect on the team and also had manager qualities. He is now open to lesser capabilities on the technical side, as long as they have the managerial qualities. I WOULD TARGET CURRENT/FORMER SR. MANAGERS/DIRECTORS OF INTEGRATION/SOA. PREFERABLY, THOSE WHO WERE TECHNICAL AT SOME POINT IN THEIR CAREER BUT MOVED INTO MANAGEMENT. WITH THIS TARGET, YOU SHOULD BE ABLE TO ACCESS THE APPROPRIATE TALENT AT THIS PAY RATE. Lead ETL Architect: The client is seeking a Lead ETL Architect as a key member of their Center of Expertise for Integration Technologies. This consultant will be primarily responsible for leading the demand intake/management process with the business leaders and other IT groups related to managing the demand for the design, build and ongoing support of new ETL architectures. They will also interact extensively with the team of architects supporting these technologies.
Expertise in one or more of the following integration technology areas is required: ETL - DataStage, AbInitio, Talend (the client is moving from AbInitio to Talend as their primary ETL tool). The overall team is responsible for addressing any architecture impacts and defining technology solution architectures focusing on infrastructure and the logical and physical layout of the systems and technologies, not limited to hardware, distributed and virtual systems, software, security, data management, storage, backup/recovery, appliances, messaging, networking, workflow, interoperability, flexibility, scalability, monitoring and reliability, including fault tolerance and disaster recovery, in collaboration with Enterprise Architect(s). Additionally, the team is responsible for the build-out and ongoing run support of these integration platforms. Skills Set · Possession of the fundamental skills of a solution architect, with the ability to inquire about and resolve vague requirements · Ability to analyze existing systems through an interview process with technology SMEs; does not need to be the SME · Takes a holistic view and communicates the enterprise view to others to ensure coherence of all aspects of the project as an integrated system · Performs gap analysis between the current state and the future-state architecture to identify single points of failure, capabilities, capacity, fault tolerance, hours of operation (SLA), and change windows · Strong verbal and written communication, with proven skills in facilitating design sessions · Able to influence, conceptualize, visualize and communicate the target architecture · Able to communicate complex technical or architecture concepts in a simple manner and adapt to different audiences · Ability to work independently and market architecture best practices, guidelines, standards, principles, and patterns while working with infrastructure and technology project teams.
· Ability to document end-to-end application transaction flows through the enterprise · Ability to document the technology architecture decision process and, if required, develop relevant templates · Resolve conflicts within infrastructure and application teams, business units and other architects · Identify opportunities to cut cost without sacrificing the overall business goals · Ability to estimate the financial impact of solution architecture alternatives/options · Knowledge of all components of an enterprise technical architecture. Additional Information ** U.S. Citizens and those who are authorized to work independently in the United States are encouraged to apply. We are unable to sponsor at this time. Note: This is a Contract job opportunity. Only US Citizen, Green Card Holder, GC-EAD, H4-EAD, L2-EAD, OPT-EAD & TN-Visa candidates can apply. No H1B candidates, please. Please mention your visa status in your email or resume. ** All your information will be kept confidential according to EEO guidelines.
    $87k-114k yearly est. 1d ago
  • Big Data Engineer

    Forhyre

    Data engineer job in Rolling Meadows, IL

    Job Description: Looking for an experienced Senior Big Data Developer. Experience: 8-10 years. Primary / Essential Skills: Spark with Scala or Python. Secondary / Optional Skills: AWS, UNIX & SQL. The Senior Big Data Developer will be responsible for:
    - Technical leadership in driving solutions and hands-on contributions
    - Building new cloud-based ingestion, transformation and data movement applications
    - Migrating / modernizing legacy data platforms
    - Contributing to and assisting in translating requirements into high-level and low-level solution designs and working program code
    - Interacting with business/IT stakeholders and other involved teams to understand requirements, identify dependencies, and suggest and convincingly present solutions to any or all parties involved
    - Performing hands-on work to deliver on commitments, coordinating with team members both onsite and offshore, and enabling the team to deliver on commitments
    To be successful in this role, the candidate needs:
    - Good working knowledge of, and strong grasp of the concepts in, the Spark framework
    - Comfort working with one or more of the following scripting languages, in order of preference: Scala, Python, Unix shell scripting
    - Experience / exposure with AWS services (EC2, Lambda, S3, RDS) and related cloud technologies
    - Good understanding of the DATA space: data integration and building data warehouse solutions
    $75k-100k yearly est. 2d ago
  • Principal Data Engineer

    Sdevops

    Data engineer job in Rolling Meadows, IL

    Key Responsibilities of the Principal Data Engineer:
    - Guide the team towards successful project delivery
    - Provide technical leadership and key decision making
    - Work with and mentor individual team members to meet professional goals
    - Continuously work to automate and increase team efficiency
    - Maintain high standards of software quality via code review, testing, automation standardization, and tooling
    - Collaborate directly with both developers and business stakeholders
    - Provide estimates and risk-reducing spike stories
    - Assist in the collection and documentation of requirements from the business
    - Assist in planning deployments and migrations
    - Prepare status updates and run the daily standup
    - Analyze data problems and provide solutions
    - Assess opportunities for improvements and optimization
    Required Qualifications of the Principal Data Engineer:
    - Master's degree or equivalent work experience
    - Minimum 12 years of experience working with open source databases
    - Minimum 10 years of experience working with ETL and related data pipeline technologies
    - Highly proficient in open source SQL systems, particularly MySQL and PostgreSQL
    - Proficiency in multiple scripting languages, including Python and Ruby
    - Demonstrable experience with cloud-based ETL tools, including EMR
    - Expertise with distributed data stores, with demonstrable experience using Redshift and optimizing query performance
    - Deep understanding of data structures and schema design
    - Prior work with the AWS ecosystem (particularly RDS, SQS, SNS, Step Functions, and CDK)
    - Exceptional analytical, organizational, interpersonal, and communication (both oral and written) skills
    - Self-motivated, driven, resourceful, and able to get things done
    - Enjoys working in fast-paced, collaborative, Agile environments
    - Ability to adapt and learn quickly is a fundamental necessity
    Benefits: 401(k); Dental Insurance; Health insurance; Health savings account; Paid time off; Professional development assistance; Vision insurance
    $75k-100k yearly est. 60d+ ago
  • Data Analytics and Insights Engineer

    American College of Chest Physicians (4.2 company rating)

    Data engineer job in Glenview, IL

    Primary Purpose: Supports CHEST's growing data ecosystem by supporting and monitoring high-quality data pipelines, performing advanced analytics, and supporting our enterprise AI analytics platform. This role works closely with stakeholders across the organization to deliver accurate insights, maintain reliable data systems, and support key strategic initiatives such as Customer 360 and leadership-level analytics. This position is located in Glenview, IL. Essential Functions/Responsibilities: Demonstrates a passion for crushing lung disease and embodies CHEST values: honor the team, lead with integrity, leverage passion, cultivate innovation, and have serious fun. CHEST Data Analytics and Engineering: Provide SQL and data support to the Analytics team. Conduct exploratory analysis and validate analytical models. Support Customer 360 segmentation and data products. Develop and maintain SQL queries and data models using Snowflake. Maintain and support ETL/ELT data pipelines and curated datasets. Integrate CHEST proprietary data with third-party sources. Monitor data jobs and troubleshoot workflow issues in support of the Data Architecture & Engineering Team. Analytics Platform Support: Work with leadership and users to understand deep-dive questions. Validate AI-generated results and ensure data accuracy. Maintain curated datasets and support platform improvements. Provide documentation and training to help users get the most value from the platform. Project and Agile Support: Translate business needs into user stories and Jira tasks. Participate in Agile/SCRUM ceremonies and maintain project notes. Communicate status updates, risks, and findings to stakeholders. Collaboration: Align with cross-functional teams to support organizational goals. Share best practices for SQL, data modeling, and analytics.
Requirements Required Qualifications: 5 years of related experience Strong SQL skills (Snowflake, MS SQL preferred) Experience with data modeling Analytical experience with dataset exploration and model validation Strong communication skills to support both technical and non-technical users Familiarity with Agile/SCRUM and Jira Preferred Qualifications: Experience with AI analytics platforms or prompt-driven insights tools Customer 360 or identity-resolution experience Python or dbt experience is a plus MS SQL and/or Snowflake monitoring, troubleshooting, and resolution of job failures. CRM Support in NetFORUM for job creation and maintenance. Benefits While we offer benefits that you'd expect from any forward-thinking, progressive organization, we offer a lot of extras too, including the standardization of a hybrid working environment. From tuition reimbursement to parental leave, we offer the benefits that you want most. Health and Wellness: Medical, dental, and vision insurance*; flexible spending account*; long-term and short-term disability insurance; life/AD&D insurance Work/life Balance: 37.5-hour work week with flexible start times; Paid Time Off; Paid parental leave; Hybrid work environment; Paid holidays Giving and sharing: 401(k) with matching contribution from CHEST*; Health club and fitness reimbursement; Employee counseling program; Reimbursement for professional memberships; Tuition Reimbursement Office perks: Lunch & Learns; Annual Health Fair; Professional development courses; Volunteering opportunities; Annual Holiday Party; In-office “Busy Breaks” Additional Information The annual base salary range for this position is USD $105,000 to USD $115,000. This pay range represents base pay only and excludes any additional items such as incentives, bonuses, or other items. 
CHEST considers factors such as, but not limited to, the scope and responsibilities of the position, the candidate's work experience, education/training, key skills, internal peer equity, and market and organizational considerations when extending an offer. CHEST is proud to be an equal opportunity employer and does not discriminate on the basis of race, color, national origin, sex, gender identity, religion, sexual orientation, age, disability, parental status, veteran status, or any other protected status under applicable laws. At CHEST, our employees come from different backgrounds with various lived experiences and dynamic strengths. We strive to continuously improve our way of working to reflect our commitment to inclusion and equity and build a workforce that reflects the communities that we serve. And that means we need you! Your experiences may only perfectly align with some qualifications listed in the job description. But if you are excited by this position, we highly encourage you to consider still applying. You may be just the right candidate to help us with our mission to improve patient care. This description was designed as a convenience to acquaint employees and managers with the essential elements of the position. It is solely to summarize basic duties, and it is not intended to be a contract or guarantee of employment or any specific terms or conditions of employment. *Participation is voluntary
    $105k-115k yearly 19d ago
  • Data Engineer - Data Products & Delivery

    Parts Town 3.4company rating

    Data engineer job in Addison, IL

    at Parts Town See What We're All About As the fastest-growing distributor of restaurant equipment, HVAC and residential appliance parts, we like to do things a little differently. First, you need to understand and demonstrate our Core Values with safety being your first priority. That's key. But we're also looking for unique enthusiasm, high integrity, courage to embrace change…and if you know a few jokes, that puts you on the top of our list! Do you have a genius-level knowledge of original equipment manufacturer parts? If not, no problem! We're more interested in passionate people with fresh ideas from different backgrounds. That's what keeps us at the top of our game. We're proud that our workplace has been recognized for its growth and innovation on the Inc. 5000 list 15 years in a row and the Crain's Fast 50 list ten times. We are honored to be voted by our Chicagoland team as a Chicago Tribune Top Workplace for the last four years. If you're ready to roll up your sleeves, go above and beyond and put your ambition to work, all while having some fun, let's chat - Apply Today! Perks Parts Town Pride - check out our virtual tour and culture! Quarterly profit-sharing bonus Hybrid work schedule Team member appreciation events and recognition programs Volunteer opportunities Monthly IT stipend Casual dress code On-demand pay options: Access your pay as you earn it, to cover unexpected or even everyday expenses All the traditional benefits like health insurance, 401k/401k match, employee assistance programs and time away - don't worry, we've got you covered. The Job at a Glance The Data Engineer- Data Products & Delivery will specialize in turning raw data into business-ready products within GCP. They will design and optimize data models, marts, and semantic layers in BigQuery, enabling analytics, BI, and ML use cases across the enterprise. You will also support downstream systems and APIs that deliver trusted data for operational and AI-driven processes. 
You will play a foundational role in shaping this future - building the pipelines, products, and platforms that power the next generation of digital distribution. A Typical Day Build silver/gold layers in BigQuery, transforming raw data into clean, business-ready models. Design semantic layers using Looker or dbt for consistent business metrics Develop data marts and star schemas optimized for analytics and self-service BI Build APIs and services for data delivery (Cloud Functions, Cloud Run) Partner with analysts, data scientists, and ML engineers to ensure data & AI readiness and support advanced modeling Collaborate with the Data Governance team to embed stewardship, lineage, and metadata into Dataplex and other MDM tooling Support real-time analytics using BigQuery streaming and Pub/Sub as needed. Optimize query performance and cost efficiency in BigQuery Drive adoption of AI/automation by ensuring data models are accessible for predictive and agentic use cases To Land This Opportunity You have 4+ years of experience in data engineering or BI-focused data modeling You have hands-on expertise in BigQuery (partitioning, clustering, performance tuning, cost management) You have strong knowledge of dbt, Looker, and SQL for transformation and semantic modeling You have experience with Cloud Functions, Cloud Run, and APIs for data delivery You're familiar with Pub/Sub and BigQuery streaming for real-time use cases Exposure to ML feature engineering in Vertex AI or similar platforms is a plus You have a quality, high-speed internet connection at home About Your Future Team Our IT team's favorite pastimes include corny jokes, bowling, pool, and good pizza. 
They like vehicles that go really fast, Harry Potter, and coffee…a lot (they'll hear you out on whether Dunkin or Starbucks gets your vote). At Parts Town, we value transparency and are committed to ensuring our team members feel appreciated and supported. We prioritize our positive workplace culture where collaboration, growth, and work-life balance are celebrated. The salary range for this role is $97,628.20-$131,752.87, which is based on factors including but not limited to qualifications, experience, and geographical location. Parts Town is a pay-for-performance company. In addition to base pay, some roles offer a profit-sharing program and an annual bonus depending on the role. Our comprehensive benefits package includes health, dental and vision insurance, 401(k) with match, employee assistance programs, paid time off, paid sick time off, paid holidays, paid parental leave, and professional development opportunities. Parts Town welcomes diversity and as an equal opportunity employer all qualified applicants will be considered regardless of race, religion, color, national origin, sex, age, sexual orientation, gender identity, disability or protected veteran status.
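The "silver/gold layers" responsibility above is a medallion-style pattern. A toy Python sketch of the idea, with invented field names and plain dicts standing in for BigQuery tables (in practice these would be SQL or dbt models):

```python
# Hypothetical sketch: raw records -> cleaned "silver" layer ->
# aggregated "gold" business metric. Field names are invented.

def to_silver(raw_rows):
    """Silver layer: drop records missing an order id, normalize types."""
    silver = []
    for row in raw_rows:
        if not row.get("order_id"):
            continue  # reject unusable records
        silver.append({
            "order_id": str(row["order_id"]),
            "region": (row.get("region") or "UNKNOWN").upper(),
            "amount": float(row.get("amount") or 0.0),
        })
    return silver

def to_gold(silver_rows):
    """Gold layer: business-ready revenue-per-region rollup."""
    revenue = {}
    for row in silver_rows:
        revenue[row["region"]] = revenue.get(row["region"], 0.0) + row["amount"]
    return revenue

raw = [
    {"order_id": 1, "region": "midwest", "amount": "19.5"},
    {"order_id": None, "region": "east", "amount": "5.0"},  # dropped in silver
    {"order_id": 2, "region": "Midwest", "amount": 10.5},
]
print(to_gold(to_silver(raw)))  # {'MIDWEST': 30.0}
```

The same cleaning and rollup logic would typically live in layered dbt models so the "gold" metric definition is shared by every dashboard.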
    $97.6k-131.8k yearly Auto-Apply 1h ago
  • Advanced Analytics & Data Science Engineer

    AAOS

    Data engineer job in Rosemont, IL

    You are ready to turn data into decisions that shape the future of healthcare. Join a purpose-driven organization where your expertise in data science, machine learning, and advanced analytics will make a meaningful impact on the orthopaedic community. As an Advanced Analytics and Data Science Engineer, you'll play a pivotal role in designing and implementing predictive models, uncovering actionable insights, and driving data-informed strategies across Membership, Marketing, Education, and our Annual Meeting. Your work will spark innovation, elevate member experience, and help us deliver greater value to the professionals we serve. If this sounds like you, please read on! In this role, you'll lead the development of machine learning algorithms, statistical models, and AI-driven solutions to solve complex business challenges. You'll collaborate with stakeholders to identify opportunities, analyze large datasets, and deliver intelligence that drives smarter decisions. Your ability to communicate technical findings in a clear, business-focused manner and align data initiatives with organizational goals will be essential to your success. Travel: Up to 5 days per year Qualifications: Required: Strong foundation in data science, AI/ML, and advanced analytics (Bachelor's required, Master's preferred) 7+ years of experience in data analysis and modeling, including 5+ years applying machine learning techniques in real-world scenarios Hands-on expertise in Python, R, SQL, and ML frameworks such as Scikit-Learn or TensorFlow Experience with statistical modeling, A/B testing, forecasting, and anomaly detection Ability to translate ambiguous business problems into clear, data-driven solutions Desired: Experience with healthcare or medical associations, Agile methodologies, and cloud platforms like Microsoft Azure Familiarity with data visualization tools (Power BI or similar) Salary Range: $105,000 ‒ $115,000, depending on qualifications and experience. 
Ready to make an impact? Your next big opportunity starts here! Apply now and help us turn data into decisions that matter. Please share the following: Clearly communicate why you are the ideal candidate for this role, providing specific examples and experiences as proof points. Resumes must be accompanied by a cover letter with salary expectations to be considered. Please note: This position is based in Rosemont, Illinois and is open to applicants who are able to commute at least twice per week to this office. Applicants must already be authorized to work in the United States on a full-time basis. We are unable to sponsor or take over sponsorship of work visas. JOB CODE: 1000148
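The qualifications above mention anomaly detection alongside forecasting and A/B testing. As a hedged, dependency-free illustration (the data and threshold are invented; a real stack would likely use Scikit-Learn or similar), a minimal z-score detector looks like this:

```python
# Illustrative only: flag points far from the mean in units of
# standard deviation. Input values are made up for the example.

def zscore_anomalies(values, threshold=3.0):
    """Return indices of points more than `threshold` std devs from the mean."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    std = variance ** 0.5
    if std == 0:
        return []  # constant series: nothing stands out
    return [i for i, v in enumerate(values) if abs(v - mean) / std > threshold]

weekly_registrations = [102, 98, 105, 99, 101, 400, 97, 103]
print(zscore_anomalies(weekly_registrations, threshold=2.0))  # [5]
```

A production version would account for trend and seasonality before scoring residuals, but the thresholding idea is the same.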
    $105k-115k yearly 60d+ ago
  • GTM Data Engineer

    Partssource 4.4company rating

    Data engineer job in Hoffman Estates, IL

PartsSource is the leading technology and software platform for managing mission-critical healthcare equipment. Trusted by over 5,000 US hospitals and 15,000 clinical sites, PartsSource empowers providers and service organizations to maximize clinical availability for patient care and automates the procurement of parts, services and training through a unique digital experience. PartsSource team members are deeply committed to our mission of Ensuring Healthcare is Always On, which is foundational to our success and growth. Our vibrant culture is built upon aligned values, shared ownership, mutual respect, and a passion for collaborating to solve complex customer problems. About the Job Opportunity The GTM Data Engineer is responsible for building and maintaining a single, trusted customer and revenue data foundation across Marketing, Sales, and Customer Success. This role partners closely with Revenue Operations to ensure all GTM teams operate from a consistent source of truth for pipeline, revenue, retention, and growth. You will own how GTM data is structured, enriched, validated, and made available - eliminating data ambiguity and enabling confident, data-driven decision making. 
What You'll Do GTM Data Modeling & Governance (Technology - Data Engineering: Data Modeling & Architecture, Data Quality & Governance) Design and maintain the canonical customer, account, and revenue data model across GTM systems Resolve identity across contacts, accounts, users, assets, services, and subscriptions Define authoritative objects and metrics for pipeline, bookings, renewals, expansion, and churn Ensure historical accuracy, data lineage, and consistent metric definitions Data Enrichment, Integration & Pipelines (Technology - Data Engineering: ETL & Data Integration, Data Pipeline Development) Build and manage data pipelines across CRM, marketing automation, services, and financial systems Identify data gaps and implement enrichment strategies to improve completeness and usability Merge datasets into unified customer and account views with clear conflict-resolution rules Own schema changes, backfills, reprocessing, and validation as systems evolve Attribution, Revenue Logic & Reporting Enablement (Sales Revenue Operations: Performance Metrics & Reporting, Sales Analytics) Implement approved attribution and revenue logic consistently across channels and time periods Validate sourced, influenced, and assisted revenue before executive reporting Enable trusted funnel, pipeline, retention, and expansion reporting within systems of record Reduce reliance on spreadsheets and manual reconciliation GTM Architecture, CDP & AI Readiness (Technology - Systems & Applications: Systems Integration, Systems Thinking) Support a warehouse-centric or composable CDP approach for GTM data Partner with GTM leadership to evolve long-term data architecture Prepare high-quality, LLM-ready datasets for AI-enabled GTM workflows Ensure access controls, privacy, and compliance requirements are met What You'll Bring Your Background 5+ years in data engineering, analytics engineering, or GTM data roles Strong experience with CRM and GTM data models Advanced SQL skills and experience 
with modern data stacks and ETL tools Experience supporting attribution, lifecycle, and revenue reporting Familiarity with Customer Data Platforms or warehouse-centric CDP approaches Ability to work cross-functionally with Marketing, Sales, Customer Success, Finance, and RevOps Who We Want to Meet Act Like an Owner - Accountability & Execution : You take full ownership of GTM data quality and follow through to reliable outcomes. Serve with Purpose - Business Impact : You connect data architecture decisions to revenue visibility and GTM effectiveness. Adapt to Thrive - Managing Ambiguity : You remain productive amid evolving systems, definitions, and priorities. Collaborate to Win - Influence & Communication : You partner effectively with RevOps and GTM teams to align on shared metrics. Challenge the Status Quo - Data-Informed Decision Making : You use evidence and clarity to replace assumptions and debates. Benefits & Perks Competitive compensation package with salary, incentives, company ownership/equity, and comprehensive benefits (401k match, health, college debt reduction, and more!) Career and professional development through training, coaching and new experiences. Hybrid culture with new & beautiful workspaces that balance flexibility, collaboration, and productivity. Inclusive and diverse community of passionate professionals learning and growing together. Interested? We'd love to hear from you! Submit your resume and an optional cover letter explaining why you'd be a great fit. About PartsSource Since 2001, PartsSource has evolved into the leading technology and software platform for managing mission-critical equipment, serving over half of the U.S. hospital infrastructure. Our digital systems modernize and automate the procurement of parts, services, technical support, and training for HTM professionals to efficiently and effectively maintain their mission-critical equipment. 
PartsSource employs over 700 people nationwide who are committed to supporting healthcare providers and ensuring healthcare is always on. In 2021, Bain Capital invested in the business, further accelerating our growth and positive impact within the healthcare industry. Read more about us here: · PartsSource Named to Newsweek's List of the Top 200 America's Most Loved Workplaces for 2024 · PartsSource Named Among the Top 50 Healthcare Technology Companies of 2025 · PartsSource Named Among the Top 25 Healthcare Software Companies of 2025 · PartsSource President and CEO Philip Settimi Named to Top 50 Healthcare Technology CEO List 2025 · WSJ: Bain Capital Private Equity Scoops Up PartsSource EEO PartsSource, Inc., and its affiliates and subsidiaries, provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. Legal authorization to work in the U.S. is required.
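The posting's "merge datasets into unified customer and account views with clear conflict-resolution rules" responsibility can be sketched in a few lines. The rule below (key on lowercased email; the most recently updated source wins per field) is an invented example, not PartsSource's actual logic:

```python
# Hypothetical identity-resolution sketch: merge contact records from
# multiple GTM systems into one view per email address.

def unify_contacts(records):
    """Merge records keyed on lowercased email; latest update wins per field."""
    unified = {}
    for rec in sorted(records, key=lambda r: r["updated_at"]):  # oldest first
        key = rec["email"].strip().lower()
        merged = unified.setdefault(key, {})
        for field, value in rec.items():
            if value is not None:  # newer non-null values overwrite older ones
                merged[field] = value
    return unified

crm = {"email": "Ada@Example.com", "phone": None,
       "title": "Director", "updated_at": "2024-01-05"}
marketing = {"email": "ada@example.com", "phone": "555-0100",
             "title": None, "updated_at": "2024-03-01"}
view = unify_contacts([marketing, crm])["ada@example.com"]
print(view["title"], view["phone"])  # Director 555-0100
```

Note the asymmetry the conflict rule creates: the newer record supplies the phone number, but its null title does not clobber the older record's "Director".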
    $82k-113k yearly est. Auto-Apply 4d ago
  • Data Engineer

    Charter Manufacturing 4.1company rating

    Data engineer job in Mequon, WI

Charter Manufacturing is a fourth-generation family-owned business where our will to grow drives us to do it better. Join the team and become part of our family! Applicants must be authorized to work for ANY employer in the U.S. Charter Manufacturing is unable to sponsor for employment visas at this time. This position is hybrid, 3 days a week in office in Mequon, WI. BI&A - Lead Data Engineer Charter Manufacturing continues to invest in Data & Analytics. Come join a great team and great culture leveraging your expertise to drive analytics transformation across Charter's companies. This is a key role in the organization that will provide thought leadership, as well as add substantial value by delivering trusted data pipelines that will be used to develop models and visualizations that tell a story and solve real business needs/problems. This role will collaborate with team members and business stakeholders to leverage data as an asset driving business outcomes aligned to business strategies. Having 7+ years of prior experience in developing data pipelines and partnering with team members and business stakeholders to drive adoption will be critical to the success of this role. 
MINIMUM QUALIFICATIONS: Bachelor's degree in computer science, data science, software engineering, information systems, or related quantitative field; master's degree preferred At least seven years of work experience in data management disciplines, including data integration, modeling, optimization and data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience designing, developing, deploying, and maintaining data pipelines used to support AI, ML, and BI using big data solutions (Azure, Snowflake) Strong knowledge in Azure technologies such as Azure Web Application, Azure Data Explorer, Azure DevOps, and Azure Blob Storage to build scalable and efficient data pipelines Strong knowledge using programming languages such as R, Python, C#, and Azure Machine Learning Workspace development Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks) Experience with database technologies such as SQL, Oracle, and Snowflake Prior experience with ETL/ELT data ingestion into data lakes/data warehouses for analytics consumption Strong SQL skills Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products Passionate about teaching, coaching, and mentoring others Strong problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems, and the ability to recognize and solve problems Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options Demonstrated experience delivering business value by structuring and analyzing large, complex data 
sets Demonstrated initiative, strong sense of accountability, collaboration, and known as a trusted business partner PREFERRED QUALIFICATIONS INCLUDE EXPERIENCE WITH: Manufacturing industry experience, specifically heavy industry, supply chain, and operations Designing and supporting data integrations with ERP systems such as Oracle or SAP MAJOR ACCOUNTABILITIES: Designs, develops, and supports data pipelines for batch and streaming data extraction from various sources (databases, APIs, external systems), transforming it into the desired format, and loading it into the appropriate data storage systems Collaborates with data scientists and analysts to optimize models and algorithms in accordance with data quality, security, and governance policies Ensures data quality, consistency, and integrity during the integration process, performing data cleansing, aggregation, filtering, and validation as needed to ensure accuracy, consistency, and completeness of data Optimizes data pipelines and data processing workflows for performance, scalability, and efficiency. Monitors and tunes data pipelines for performance, scalability, and efficiency, resolving performance bottlenecks Establish architecture patterns, design standards, and best practices to accelerate delivery and adoption of solutions Assist, educate, and train users to drive self-service enablement leveraging best practices Collaborate with business subject matter experts, analysts, and offshore team members to develop and deliver solutions in a timely manner Embraces and establishes governance of data and algorithms, quality, standards, and best practices, ensuring data accuracy We offer comprehensive health, dental, and vision benefits, along with a 401(k) plan that includes employer matching and profit sharing. Additionally, we offer company-paid life insurance, disability coverage, and paid time off (PTO).
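The accountability above about "data cleansing, aggregation, filtering, and validation as needed" is the classic validation gate in a pipeline. A minimal, illustrative pass might look like this; the schema and rules are assumptions for the sketch, not Charter's actual checks:

```python
# Hypothetical validation step: partition a batch into records the
# pipeline can load and records rejected with human-readable reasons.

REQUIRED_FIELDS = ("part_id", "quantity")  # invented schema for the example

def validate(record):
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for field in REQUIRED_FIELDS:
        if record.get(field) is None:
            problems.append(f"missing {field}")
    qty = record.get("quantity")
    if isinstance(qty, (int, float)) and qty < 0:
        problems.append("negative quantity")
    return problems

def split_valid(records):
    """Partition records into (valid, rejected-with-reasons)."""
    valid, rejected = [], []
    for rec in records:
        problems = validate(rec)
        if problems:
            rejected.append((rec, problems))
        else:
            valid.append(rec)
    return valid, rejected

batch = [{"part_id": "A1", "quantity": 3}, {"part_id": None, "quantity": -2}]
good, bad = split_valid(batch)
print(len(good), bad[0][1])  # 1 ['missing part_id', 'negative quantity']
```

Keeping the rejection reasons alongside the records is what makes the later monitoring and troubleshooting accountabilities tractable.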
    $80k-110k yearly est. Auto-Apply 60d+ ago
  • Data Engineer

    360-Tsg

    Data engineer job in Des Plaines, IL

As a Data Engineer you are primarily responsible for maintaining and enhancing the Registry's data acquisition, integration, and ETL pipelines in support of both operational and business intelligence data stores. The incumbent is responsible for applying diverse data cleansing and transformation techniques as well as the ongoing management and monitoring of all Registry databases. This will include addressing issues pertaining to the ongoing operations and optimization of the data environment including performance, reliability, logging, scalability, etc. You will also provide support for the organization's database systems, warehouse, marts, and supporting applications. You will also lead the efforts to develop a unified enterprise data model for the organization. As a Data Engineer you will also design, develop, and maintain high-performance data platforms on-premises and in Microsoft Azure cloud-based environments including leading the development of a data warehouse environment to support the Registry's business intelligence roadmap. You will champion efforts that will ensure that the organization's business intelligence applications remain relevant for use by internal business groups by actively participating in strategy and project planning discussions. Work collaboratively with Registry participants and internal support teams, identify and implement changes that improve system performance and the user experience. Design, develop, and maintain the ETL pipelines using SSIS that standardize raw data from multiple data sources and optimize both the operational and dimensional/star schema data model necessary for transactional systems and business intelligence applications. Gather business and functional requirements and translate these requirements into robust, scalable, operable solutions aligning to an overall data architecture. 
Extract, transform, and load data to and from various data sources including relational databases, NoSQL databases, web services, and flat files. Produce various technical documents such as ER diagrams, table schemas, data lineage, API documents, etc. Provide leadership and oversight on database architectural design for existing systems including database administration, performance monitoring, and troubleshooting. Provide complex analysis, conceptualize, design, implement, and develop solutions for critical data-centric projects. Perform dataflow, system and data analysis, and develop meaningful and useful presentation of data in downstream applications. Plan and implement standards, define/code conformed global and reusable objects, perform complex database design and data repository modelling. Monitor ETL processes, system audits, dashboard reporting, and presentation layer functioning and performance. Proactively identify and implement procedures that resolve performance and/or data reporting issues. Support the optimal performance of the organization's data and BI systems. Monitor database performance, provide optimization recommendations, and implement recommendations. Follow the release cycles and implement on-time delivery of task assignments, defect correction, change requests, and enhancements. Troubleshoot and solve technical problems. Perform other responsibilities as assigned by management. 
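The dimensional/star schema modeling this posting calls for boils down to splitting a flat extract into dimension tables with surrogate keys and a fact table that references them. A toy sketch with invented table and column names (in practice this would be SSIS packages and SQL DDL):

```python
# Hypothetical star-schema split: flat rows -> customer dimension
# (surrogate keys) + sales fact table referencing those keys.

def build_star(flat_rows):
    """Return (customer_dim, sales_fact) built from flat extract rows."""
    customer_dim = {}  # natural key -> surrogate key
    sales_fact = []
    for row in flat_rows:
        natural_key = row["customer_name"]
        if natural_key not in customer_dim:
            customer_dim[natural_key] = len(customer_dim) + 1  # assign surrogate key
        sales_fact.append({
            "customer_sk": customer_dim[natural_key],  # foreign key into the dimension
            "sale_date": row["sale_date"],
            "amount": row["amount"],
        })
    return customer_dim, sales_fact

flat = [
    {"customer_name": "Acme", "sale_date": "2024-05-01", "amount": 120.0},
    {"customer_name": "Acme", "sale_date": "2024-05-02", "amount": 80.0},
    {"customer_name": "Globex", "sale_date": "2024-05-02", "amount": 50.0},
]
dim, fact = build_star(flat)
print(dim)  # {'Acme': 1, 'Globex': 2}
```

Surrogate keys decouple the facts from source-system identifiers, which is what lets the warehouse absorb upstream changes without rewriting history.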
Requirements 3 years of relevant work experience in a data engineering role leveraging SQL, SSIS; including design and support of ETL routines that support the import of data from multiple data sources 2 years of experience with PowerBI 3 years of data warehousing experience including the design, development, and ongoing support of star or snowflake data schemas to support business intelligence applications 5 years of database administration or database development experience in a SQL or MySQL environment; knowledge of Microsoft technology stack; background in Azure Infrastructure as a Service environment desired Experience working with both structured and unstructured data Demonstrated understanding of Business Intelligence and data solutions including cubes, data warehouses, data marts, and supporting schema types (star, snowflake, etc.) Data modeling experience in building logical and physical data models Applied knowledge of Microsoft Security/Authentication Concepts (Active Directory, IIS, Windows OS) Strong technical planning skills with the ability to prioritize and multitask across a number of work streams Polished presentation skills; experience creating and presenting findings to executive-level staff Strong written, verbal and interpersonal communication skills, with an ability to communicate ideas and solutions effectively Must be highly collaborative with the ability to manage and motivate project teams and meet deliverables Ability to build strong stakeholder relationships and translate complex technical concepts to non-technical stakeholders Experience with Data Warehouse is a plus Knowledge of SSRS is a plus
    $75k-100k yearly est. 60d+ ago
  • Staff Data Scientist-Promo Analytics

    Milwaukee Tool 4.8company rating

    Data engineer job in Menomonee Falls, WI

Staff Data Scientist - Promo Analytics Applicants must be authorized to work in the U.S.; sponsorship is not available for this position. INNOVATE without boundaries! At Milwaukee Tool we firmly believe that our People and our Culture are the secrets to our success - so we give you unlimited access to everything you need to provide support to your business unit. Behind our doors you'll be empowered every day to own it, drive it, and do what it takes to support the biggest breakthroughs in the industry. Meanwhile, you'll have the support and resources of the fastest-growing brand in the construction industry to make it happen. Your Role on Our Team: As a Staff Data Scientist - Promo Analytics, you will serve as a senior technical leader, driving the architecture and delivery of innovative analytics solutions that enable Milwaukee Tool to make fast, data-driven decisions about promotional strategies. Working closely with business partners and the Promo Analytics pod, you will design, build, and support the foundational systems that power our promo analytics capabilities. You will develop and implement scalable machine learning models and forecasting tools to optimize promotions, ensuring trusted and actionable insights across the organization. You'll build deep expertise in our promotional data, shape long-term analytics strategy, and lead advanced analytics initiatives. Success in this role requires technical excellence, a passion for experimentation, and the ability to influence and elevate the team while delivering significant business value. You'll be DISRUPTIVE through these duties and responsibilities: Architect, design, and build scalable machine learning and forecasting models to optimize promotional strategies and drive significant business impact. Set technical direction for promo analytics, establishing best practices and standards for model development, validation, and deployment. 
Lead the development and implementation of advanced analytics frameworks, reusable tools, and templates to accelerate team and organizational productivity. Translate complex, cross-functional business needs into actionable data science projects and long-term analytics strategy. Oversee and guide the integration of promo analytics solutions across multiple business domains, ensuring consistency and scalability. Lead root cause analysis for major promo analytics issues, drive long-term remediation, and influence operational reliability improvements. Mentor and provide technical guidance to other data scientists and analytics professionals within the pod and across teams. Collaborate with Data Engineers, Product Owners, Business Intelligence Engineers, and other technical team members to deliver integrated, enterprise-scale solutions. Engage directly with marketing and sales leadership to understand strategic business challenges and deliver high-impact analytics solutions. Perform other duties as assigned. The TOOLS you'll bring with you: Bachelor's degree in Statistics, Mathematics, Computer Science, Data Science, or a related field, or equivalent experience. 8 or more years of experience in data science, advanced analytics, or a closely related technical field. Expert-level proficiency in statistical modeling, machine learning, and advanced analytics techniques. Advanced proficiency in at least one programming language used for data science (Python preferred) Extensive hands-on experience building, validating, and deploying scalable machine learning and forecasting models in production environments. Demonstrated experience with cloud-based data science platforms (Databricks, Azure Machine Learning, or similar). Proven track record developing and implementing complex statistical and ML models for business impact, ideally in pricing, promotion, or revenue management. Strong knowledge of experimental design, A/B testing, and model evaluation best practices. 
Ability to lead and influence cross-functional analytics initiatives and set technical direction for data science projects. Strong problem-solving, debugging, and analytical skills, especially in complex, multi-system environments. Ability to thrive in agile, dynamic, and collaborative teams, and to communicate complex analytics clearly to technical and non-technical audiences. Other TOOLS we prefer you to have: Deep expertise with advanced machine learning techniques (e.g., time series forecasting, causal inference, uplift modeling, optimization, or reinforcement learning). Experience applying analytics to promo, pricing, or revenue management challenges. Hands-on experience with cloud-based data science platforms (Databricks, Azure Machine Learning, or similar). Strong background in experimental design and statistical testing (A/B testing, multivariate testing, quasi-experimental methods). Proficiency with data visualization and storytelling tools (e.g., Power BI, Tableau, matplotlib, or similar). Demonstrated ability to work with large, complex, and messy datasets in real-world business contexts. Experience collaborating in Agile, cross-functional teams. Exceptional communication skills for translating complex analytics into actionable insights for both technical and non-technical audiences. A track record of thought leadership, publication, or conference presentations in data science or applied analytics (preferred but not required). Working Conditions: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. 
Frequently required to stand, walk, bend, stretch, reach, and effectively communicate with others in the workplace Sitting for prolonged periods of time Prolonged exposure to computer screens Repetitive use of hands and fingers to operate office equipment, machinery, hand tools and/or power tools Specific vision abilities required by this job include close vision, color vision, peripheral vision, depth perception, and ability to adjust focus May require to wear personal protective equipment which includes, but is not limited to, safety glasses, gloves, and hearing protection May work in laboratories and/or controlled, enclosed, restricted areas Noise levels range from moderate to loud Must be able to lift up to 50 pounds at a time May require travel dependent on company needs We provide these great perks and benefits: Robust health, dental and vision insurance plans Generous 401 (K) savings plan Education assistance On-site wellness, fitness center, food, and coffee service And many more, check out our benefits site HERE. Milwaukee Tool is an equal opportunity employer.
    $68k-86k yearly est. 22d ago
  • Data Scientist (38948)

    Young Innovations 4.3company rating

    Data engineer job in Algonquin, IL

    At Young Innovations, we foster a dynamic environment where team members make an impact every day as part of a collaborative, inclusive culture. Together, we serve the dental profession and their patients, united in our mission of achieving a Lifetime of Oral Health™. We embrace diverse perspectives and encourage bold thinking, challenging traditional approaches with a bias for action. Whether you're looking to expand your skills or grow your career, Young is here to support your goals and continuous learning. At Young, YOU are at the core of what we do.

    Position Overview: The Data Scientist will drive adoption and implementation of artificial intelligence, machine learning, and strategic analytics within our global data fabric environment. This role is ideal for someone with a strong foundation in statistical analysis, machine learning, and natural language processing, particularly in the application and fine-tuning of Large Language Models (LLMs). This professional will bring expertise in correlation analysis, predictive modeling, and the ability to work with large, complex datasets to drive data-driven decision-making across the organization.

    You'll like this role if you like:
    • Building processes from the ground up
    • Partnering across the business
    • Wearing multiple hats in a fast-paced environment

    Why You'll Love Working Here:
    • Medium-sized company - not too big, not too small - just enough to get things done and see your impact.
    • Did we say benefits? Full medical, dental, vision, 401k, parental leave, paid holidays, paid time off, and more!
    • You will have the ability to create everything from the ground up and fully own this function.

    Who you'll work with: Stakeholders from all across the business - IT, Supply Chain, Finance, Marketing, and more.

    What You'll Do:

    AI/ML Development & Integration
    • Integrate large language models (LLMs), such as OpenAI, BERT, and LLaMA, into business processes and products to enable text summarization, classification, personalized recommendations, and conversational interfaces.
    • Build and optimize predictive models using advanced machine learning techniques to support forecasting, customer segmentation, retention strategies, fraud detection, and other high-impact business functions.
    • Stay up to date with advancements in AI/ML, NLP, and data analytics; proactively recommend new tools, frameworks, and best practices to enhance capabilities.

    Statistical & Analytical Modeling
    • Perform statistical analyses to uncover correlations and causal relationships across structured and unstructured data, identifying key drivers of business performance and emerging trends.
    • Develop customer segmentation frameworks by analyzing behavioral, transactional, and demographic data to support personalized marketing and engagement initiatives.
    • Translate complex results into clear, actionable insights through dashboards, visualizations, and reports for both technical and non-technical audiences.
    • Leverage large-scale datasets (e.g., customer interactions, transaction records) using distributed computing and modern data platforms to support data-driven decision-making.

    Cross-Functional & Strategic Collaboration
    • Partner with Product, Marketing, Commercial, Customer Service, Finance, and Operations teams to understand analytical needs and deliver impactful solutions.
    • Support enterprise-wide initiatives, including M&A data integration and other strategic analytics projects.

    Qualifications - What You'll Bring:
    • Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field; Master's, MBA, and/or PhD a plus
    • 3+ years of professional experience in data science, analytics, or machine learning roles
    • Hands-on experience working with and implementing Large Language Models (LLMs) and NLP techniques
    • Experience working in a private equity-backed environment is a plus

    Technical Skills
    • Strong foundation in statistical analysis and predictive modeling, including correlation analysis, hypothesis testing, regression, classification, and time-series forecasting using tools like Python (SciPy, statsmodels), R, or SAS
    • Experience working with large datasets and data warehouses using SQL and cloud platforms (e.g., AWS, GCP, Azure)
    • Strong proficiency in Python and experience with ML frameworks (e.g., PyTorch, TensorFlow)

    The pay range reflects the minimum and maximum target for the position at the time of posting. Within the range, the compensation will be determined based on education/training, skill set, experience, and other organizational needs.

    Physical Requirements/Working Locations: Office/Remote Environment. This position is based in an office environment and is primarily sedentary in nature. It requires regular use of standard office equipment, including computers, phones, photocopiers, scanners, filing cabinets, and fax machines. Some roles may involve wearing a headset and sustained computer use for 8 or more hours per day. Employees may be required to sit, stand, or walk for extended periods, and occasional bending, lifting, or carrying items up to 50 pounds may be necessary.

    We are an Equal Opportunity Employer and value diversity at all levels of the organization. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
    $75k-104k yearly est. 19d ago
  • Data Scientist

    Bayforce 4.4company rating

    Data engineer job in Mequon, WI

    No 3rd-party vendor candidates or sponsorship.

    Role Title: Data Scientist
    Client: Manufacturing Company
    Employment Type: Contract
    Duration: 3-6 months
    Preferred Location: Local to Milwaukee with 3 days onsite

    Role Description: Looking for a data scientist to help with a predictive maintenance project. We don't need a general data science background, but someone with a specific background in areas like predictive maintenance, asset reliability, and building anomaly detection models.

    Requirements:
    * Strong data science experience with a specific background in areas like predictive maintenance, asset reliability, and building anomaly detection models
    * Proficient in an Azure environment
    * Experience with Python, R, and GitHub highly preferred
    $66k-89k yearly est. 22h ago
  • Lead ETL Architect (No H1B)

    Sonsoft 3.7company rating

    Data engineer job in Deerfield, IL

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy, and Information Technology Enabled Services.

    As background, this client has been working with us and internal recruiting to fill this role for a LONG time. Previously, the client was looking for someone who could both technically lead the team specific to SAG technologies and also work with the business. Two things have changed: 1) the span of control of the team increased from SAG to include the other technologies listed in the job description, and 2) the client has been unsuccessful in finding someone who was both the best technical architect on the team and also had manager qualities. He is now open to lesser technical capabilities, as long as the candidate has the management qualities. I WOULD TARGET CURRENT/FORMER SR. MANAGERS/DIRECTORS OF INTEGRATION/SOA. PREFERABLY, CANDIDATES WHO WERE TECHNICAL AT SOME POINT IN THEIR CAREER BUT MOVED INTO MANAGEMENT. WITH THIS TARGET, YOU SHOULD BE ABLE TO ACCESS THE APPROPRIATE TALENT AT THIS PAY RATE.

    Lead ETL Architect: The client is seeking a Lead ETL Architect as a key member of their Center of Expertise for Integration Technologies. This consultant will be primarily responsible for leading the demand intake/management process with business leaders and other IT groups related to managing the demand for the design, build, and ongoing support of new ETL architectures. They will also interact extensively with the team of architects supporting these technologies.

    Expertise in one or more of the following integration technology areas is required:
    -- ETL - DataStage, Ab Initio, Talend (the client is moving from Ab Initio to Talend as their primary ETL tool)

    The overall team is responsible for addressing any architecture impacts and defining technology solution architectures focusing on infrastructure and the logical and physical layout of the systems and technologies, including but not limited to hardware, distributed and virtual systems, software, security, data management, storage, backup/recovery, appliances, messaging, networking, workflow, interoperability, flexibility, scalability, monitoring, and reliability (including fault tolerance and disaster recovery), in collaboration with Enterprise Architect(s). The team is additionally responsible for the build-out and ongoing run support of these integration platforms.

    Skills Set
    · Possession of the fundamental skills of a solution architect, with the ability to probe and resolve vague requirements
    · Ability to analyze existing systems through an interview process with technology SMEs (does not need to be the SME)
    · Takes a holistic view and communicates the enterprise view to others to ensure coherence of all aspects of the project as an integrated system
    · Performs gap analysis between the current state and the future state architecture to identify single points of failure, capabilities, capacity, fault tolerance, hours of operation (SLAs), change windows, etc.
    · Strong verbal and written communication, with proven skills in facilitating design sessions
    · Able to influence, conceptualize, visualize, and communicate the target architecture
    · Able to communicate complex technical or architecture concepts in a simple manner and adapt to different audiences
    · Ability to work independently and market architecture best practices, guidelines, standards, principles, and patterns while working with infrastructure and technology project teams
    · Ability to document end-to-end application transaction flows through the enterprise
    · Ability to document the technology architecture decision process and, if required, develop relevant templates
    · Ability to resolve conflicts within infrastructure and application teams, business units, and other architects
    · Ability to identify opportunities to cut costs without sacrificing the overall business goals
    · Ability to estimate the financial impact of solution architecture alternatives/options
    · Knowledge of all components of an enterprise technical architecture

    Additional Information
    ** U.S. Citizens and those who are authorized to work independently in the United States are encouraged to apply. We are unable to sponsor at this time.
    Note: This is a contract job opportunity. Only US Citizen, Green Card Holder, GC-EAD, H4-EAD, L2-EAD, OPT-EAD & TN-Visa candidates can apply. No H1B candidates, please. Please mention your visa status in your email or resume. **
    All your information will be kept confidential according to EEO guidelines.
    $87k-114k yearly est. 60d+ ago
  • Data Engineer II-Promo Analytics

    Milwaukee Tool 4.8company rating

    Data engineer job in Menomonee Falls, WI

    Data Engineer II - Promo Analytics

    Applicants must be authorized to work in the U.S.; sponsorship is not available for this position.

    INNOVATE without boundaries! At Milwaukee Tool we firmly believe that our People and our Culture are the secrets to our success, so we give you unlimited access to everything you need to support your business unit. Behind our doors you'll be empowered every day to own it, drive it, and do what it takes to support the biggest breakthroughs in the industry. Meanwhile, you'll have the support and resources of the fastest-growing brand in the construction industry to make it happen.

    Your Role on Our Team: As a Data Engineer II, you will play a critical role in enabling fast, accurate, and scalable data-driven decisions at Milwaukee Tool. You will help build and evolve the pipelines, models, and governance frameworks that power analytics for retail promotions and enterprise-wide initiatives. Partnering with business teams and Data Platform engineers, you will turn requirements into high-quality data products using Databricks and modern cloud technologies. Your work ensures that teams have timely, reliable, and well-structured data to support operational reporting, strategic insights, and advanced analytics.

    You'll be DISRUPTIVE through these duties and responsibilities:
    • Design and build scalable data pipelines to ingest, transform, and curate data from a variety of systems including APIs, databases, files, and event streams.
    • Review functional requirements and design specs with Senior Data Engineers and business partners, and convert them into data transformations.
    • Implement and maintain data models such as dimensional models, star schemas, normalized models, and data vault approaches to support analytics and BI.
    • Work with Data Architects and Data Leads to optimize cloud-based data platforms, ensuring performance, reliability, and cost-efficient execution of data workloads.
    • Develop and enforce data quality checks, lineage, and monitoring to ensure accuracy, completeness, and trust in enterprise datasets.
    • Leverage your expertise within the software development lifecycle, continuous improvement, and best practices to help drive the team toward rapid success.
    • Automate and operationalize data pipelines using CI/CD, Infrastructure-as-Code, and modern orchestration tools.
    • Profile, tune, and optimize SQL, Python, and Spark workloads running in Databricks.
    • Author technical documentation, promote reusable components, and contribute to engineering standards and best practices.
    • Troubleshoot pipeline issues, participate in root-cause analysis, and help maintain healthy, reliable data operations.
    • Perform other duties as assigned.

    The TOOLS you'll bring with you:
    • Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
    • 3-5 years of experience in data engineering or a related technical field.
    • Strong proficiency in SQL and a programming language such as Python (preferred).
    • Experience building and orchestrating data workflows in Databricks, including Delta Lake, notebooks, jobs, and workflows.
    • Hands-on experience with distributed data processing technologies such as Apache Spark.
    • Experience with cloud data ecosystems (Azure, AWS, or GCP), especially Azure Databricks.
    • Familiarity with cloud data warehouses such as Snowflake, Synapse, Redshift, or BigQuery.
    • Experience working with structured and semi-structured data (Parquet, Avro, JSON, Delta).
    • Strong understanding of version control (Git) and modern CI/CD workflows.
    • Strong problem-solving, debugging, and analytical skills.
    • Ability to work effectively in agile, cross-functional engineering teams.

    Other TOOLS we prefer you to have:
    • Experience with Databricks Unity Catalog, Delta Live Tables, or Databricks Workflows.
    • DataOps experience (pipeline observability, monitoring, automated quality).
    • Knowledge of metadata management or cataloging platforms (Purview, Collibra, Alation).
    • Experience with ML pipelines and feature engineering in Databricks.
    • Familiarity with streaming frameworks (Kafka, Event Hubs, Kinesis) used with Spark Structured Streaming.
    • Knowledge of and experience working in an Agile environment.
    • Experience working with retail product promotion data.

    Working Conditions: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
    • Frequently required to stand, walk, bend, stretch, reach, and effectively communicate with others in the workplace
    • Sitting for prolonged periods of time
    • Prolonged exposure to computer screens
    • Repetitive use of hands and fingers to operate office equipment, machinery, hand tools and/or power tools
    • Specific vision abilities required by this job include close vision, color vision, peripheral vision, depth perception, and ability to adjust focus
    • May be required to wear personal protective equipment, which includes, but is not limited to, safety glasses, gloves, and hearing protection
    • May work in laboratories and/or controlled, enclosed, restricted areas
    • Noise levels range from moderate to loud
    • Must be able to lift up to 50 pounds at a time
    • May require travel depending on company needs

    We provide these great perks and benefits:
    • Robust health, dental, and vision insurance plans
    • Generous 401(k) savings plan
    • Education assistance
    • On-site wellness, fitness center, food, and coffee service
    • And many more! Check out our benefits site HERE.

    Milwaukee Tool is an equal opportunity employer.
    $86k-106k yearly est. 22d ago

Learn more about data engineer jobs

How much does a data engineer earn in Kenosha, WI?

The average data engineer in Kenosha, WI earns between $66,000 and $115,000 annually. This compares to the national average data engineer salary range of $80,000 to $149,000.

Average data engineer salary in Kenosha, WI

$88,000