About us: At Plexus, our vision is to help create the products that build a better world. Driven by a passion for excellence, we partner with leading Aerospace/Defense, Healthcare/Life Sciences and Industrial companies to design, manufacture and service some of the world's most transformative products, including advanced surgical systems, diagnostic instruments, healthcare imaging equipment, mission critical aerospace systems, and electric vehicle (EV) charging solutions. Visit Plexus.com to learn more about our unwavering commitment to our vision.
When we invest in our people, we invest in building a better world.
With a vision rooted in the wellbeing and inclusive engagement of our team members, our customers, their end users and our communities, people are the heart of what we do and who we are. It is our values that unite us and guide us in everything that we do, including how we operate, behave and interact to foster a workplace where every team member feels valued and empowered to contribute their best.
Our values include: Growing our People, Building Belonging, Innovating Responsibly, Delivering Excellence and Creating Customer Success.
As a team member, you will engage in impactful work through global collaboration and the use of emerging technologies, join an inclusive culture where every team member is valued and working toward a greater purpose, and be empowered to reach your full potential through various development programs designed to accelerate your growth.
Plexus offers a comprehensive benefits package designed to support team members' wellbeing, including medical, dental, and vision insurance, paid time off, retirement savings, and opportunities for professional development. We also prioritize work-life balance and offer a variety of perks to enhance the team member experience. For more information, visit our US benefits website at usbenefits.plexus.com. Our commitment to pay range transparency fosters an equitable workplace, where everyone can feel valued. The annual compensation range for this position is stated below. The salary offered within this range will be based upon the geographic location, work experience, education, licensure requirements and/or skill level. Salary Range:
$85,100.00 - $127,700.00
Purpose Statement: The Data Engineer II designs, builds, and maintains robust and scalable data pipelines and infrastructure to support the organization's data needs, particularly related to cloud-based technologies. This includes acquiring, transforming, and optimizing large datasets to ensure high-quality, reliable, and accessible data for analytics, reporting, machine learning, and AI initiatives.
Key Job Accountabilities:
Lead Data Pipeline and Infrastructure Development: Design, develop, and optimize end-to-end ETL/ELT pipelines (on-premises and cloud) using Python, SQL, Spark, etc. to ingest data into scalable cloud infrastructure, including Data Lakes and Data Warehouses (AWS, Azure, or GCP).
Collaborate on data modeling and schema design: Collaborate with stakeholders to design and implement efficient data models and schema structures. Enforce data quality through robust validation, monitoring, and alerting mechanisms to ensure accuracy and reliability.
Optimize Performance, Cost, and Scalability: Proactively evaluate, tune, and improve the performance, scalability, and cost-efficiency of all data pipelines and underlying cloud infrastructure.
Enforce Best Practices and Documentation: Champion and enforce modern data engineering best practices, including version control, automated testing, and CI/CD. Create and maintain comprehensive technical documentation for all data systems and operational procedures.
Mentorship: Provide technical guidance and mentorship, and perform code reviews for junior team members, actively fostering a culture of continuous learning and high standards within the data engineering team.
All GT team members are responsible for upholding the organization's cybersecurity posture by adhering to security policies and procedures, actively participating in training, protecting data and systems, actively identifying and mitigating vulnerabilities, and promptly reporting any suspicious activity or potential security incidents.
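As a miniature illustration of the ETL/ELT pipeline work these accountabilities describe, the sketch below extracts rows from a CSV source, applies a basic quality transform, and loads the result into a warehouse table. The `orders` table, its columns, and the sample data are hypothetical, not details from the posting:

```python
import csv
import io
import sqlite3

# Hypothetical ETL sketch: extract from a CSV source, transform (type
# conversion plus a basic quality filter), and load into a warehouse table.
RAW_CSV = """order_id,amount,region
1,19.99,NA
2,,EU
3,42.50,APAC
"""

def extract(text):
    """Read raw CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows that fail a basic quality check and coerce types."""
    clean = []
    for r in rows:
        if not r["amount"]:  # missing amount: reject the row
            continue
        clean.append((int(r["order_id"]), float(r["amount"]), r["region"]))
    return clean

def load(rows, conn):
    """Insert the transformed rows and return the loaded row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
```

In a production pipeline the same extract/transform/load shape would target Spark jobs and a cloud warehouse rather than an in-memory SQLite database.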
Education/Experience Qualifications:
Typically requires a minimum of 5 years of related experience with a Bachelor's degree; or 3 years and a Master's degree; or equivalent work experience.
Other Qualifications:
Proven experience with cloud data platforms (e.g., AWS S3, Redshift, Glue; Azure Data Lake, Synapse, Data Factory; Google Cloud Storage, BigQuery, Dataflow).
Strong proficiency in SQL and at least one programming language (e.g., Python, Java, Scala) for data manipulation scripting.
Experience with big data technologies (e.g., Spark, Hadoop) is a plus.
Familiarity with data governance, data security, and compliance principles.
Excellent problem-solving, analytical, and communications skills.
This document does not represent a contract of employment and is not intended to capture every possible assignment the incumbent could be asked to perform.
We are pleased to provide reasonable accommodations to individuals with disabilities or special requirements. If you need an application accommodation, please contact us by email at *****************. Please include your contact information and clearly describe how we can help you. This email is for accommodation requests only and cannot be used to inquire about the status of applications.
We are an Equal Opportunity Employer (EOE) and do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Some offers of employment are contingent upon successfully passing a drug screen and/or background check.
AZCO Data Scientist - IT (Appleton, WI)
AZCO
Data engineer job in Appleton, WI
The Staff Data Scientist plays a critical role in leveraging data to drive business insights, optimize operations, and support strategic decision-making. You will be responsible for designing and implementing advanced analytical models, developing data pipelines, and applying statistical and machine learning techniques to solve complex business challenges. This position requires a balance of technical expertise, business acumen, and communication skills to translate data findings into actionable recommendations.
+ Develop and apply data solutions for cleansing data to remove errors and review consistency.
+ Perform analysis of data to discover information, business value, patterns, and trends that guide development of asset business solutions.
+ Gather data, find patterns and relationships and create prediction models to evaluate client assets.
+ Conduct research and apply existing data science methods to business line problems.
+ Monitor client assets and perform predictive and root cause analysis to identify adverse trends; choose best fit methods, define algorithms, and validate and deploy models to achieve desired results.
+ Produce reports and visualizations to communicate technical results and interpretation of trends; effectively communicate findings and recommendations to all areas of the business.
+ Collaborate with cross-functional stakeholders to assess needs, provide assistance and resolve problems.
+ Translate business problems into data science solutions.
+ Perform other duties as assigned.
+ Comply with all policies and standards.
**Qualifications**
+ Bachelor's degree in Analytics, Computer Science, Information Systems, Statistics, Math, or a related field from an accredited program and 4 years of related experience required; applicable experience may be substituted for the degree requirement.
+ Experience in data mining and predictive analytics.
+ Strong problem-solving skills, analytical thinking, attention to detail and hypothesis-driven approach.
+ Excellent verbal/written communication, and the ability to present and explain technical concepts to business audiences.
+ Proficiency with data visualization tools (Power BI, Tableau, or Python libraries).
+ Experience with Azure Machine Learning, Databricks, or similar ML platforms.
+ Expert proficiency in Python with pandas, scikit-learn, and statistical libraries.
+ Advanced SQL skills and experience with large datasets.
+ Experience with predictive modeling, time series analysis, and statistical inference.
+ Knowledge of A/B testing, experimental design, and causal inference.
+ Familiarity with computer vision for image/video analysis.
+ Understanding of NLP techniques for document processing.
+ Experience with optimization algorithms and operations research techniques preferred.
+ Knowledge of machine learning algorithms, feature engineering, and model evaluation.
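The A/B testing and statistical inference knowledge listed here can be illustrated by a two-proportion z-test with a normal-approximation p-value; the conversion counts below are invented:

```python
from math import sqrt, erf

# Illustrative two-proportion z-test for an A/B experiment.
# Inputs (conversion counts per variant) are made-up numbers.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided normal-approximation p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Phi(x) via erf; two-sided tail probability
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
```

For real experiments a stats library (e.g. `scipy.stats`) would be preferable, but the formula above is the standard pooled-variance test.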
This job posting will remain open a minimum of 72 hours and on an ongoing basis until filled.
EEO/Disabled/Veterans
**Job** Information Technology
**Primary Location** US-WI-Appleton
**Schedule:** Full-time
**Travel:** Yes, 5 % of the Time
**Req ID:** 253790
Data Scientist
W.W. Grainger, Inc. 4.6
Data engineer job in Green Bay, WI
Imperial Supplies, a Grainger Company, is a national distributor of quality maintenance products. Serving the fleet maintenance industry since 1958, Imperial has formed lasting relationships with customers by tailoring our services to meet their changing needs.
Our welcoming workplace enables you to learn, grow and make a difference by keeping businesses running and their people safe. As a Great Place to Work-Certified company, we're looking for passionate people to join our team as we continue leading the industry.
We are seeking a Data Scientist to join our team at Imperial Supplies and drive data-driven decision-making across the organization. In this role, you'll design, test, and implement statistical models and machine learning applications to uncover actionable insights and create measurable business value. You'll work closely with cross-functional teams and executive leadership, applying innovative approaches to complex business challenges and identifying opportunities to leverage data for strategic impact.
At Imperial, people come first. Here's what we offer:
* Competitive salary
* Hybrid / Remote schedule
* Work/Life balance
* Immediate medical, dental, vision; 18 days paid vacation, 6 paid holidays and 6% of annual earnings contributed to your retirement, immediately vested!
Key Responsibilities
* Collect, organize, and analyze large datasets from internal and external sources to support predictive modeling and strategic recommendations.
* Develop, test, and validate custom models, algorithms, and simulations using R/Python and SQL.
* Research and build machine learning applications involving structured and unstructured data, including regression, classification, and clustering models.
* Create automated processes to test and monitor model performance and ensure data accuracy.
* Develop and own measurement plans in partnership with key stakeholders.
* Present key insights and recommended actions to stakeholders at all levels using effective communication and storytelling.
* Build dynamic data visualizations using Power BI to monitor key business trends and metrics.
* Stay current on advancements in data science methodologies and tools, fostering innovation in analytical processes.
* Collaborate with IT and other departments to enhance data quality through improved cleansing and wrangling processes.
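The "automated processes to test and monitor model performance" bullet above might look, in its simplest form, like a baseline-versus-live accuracy check; the labels and thresholds are illustrative:

```python
# Minimal model-monitoring sketch: compare live prediction accuracy
# against a baseline and flag the model when it degrades.
# Baseline, max_drop, and the label data are illustrative assumptions.

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def performance_alert(baseline_acc, y_true, y_pred, max_drop=0.05):
    """Return (alert flag, live accuracy); alert if accuracy fell too far."""
    live = accuracy(y_true, y_pred)
    return live < baseline_acc - max_drop, live

alert, live_acc = performance_alert(
    baseline_acc=0.90,
    y_true=[1, 0, 1, 1, 0, 1, 0, 1],
    y_pred=[1, 0, 0, 1, 0, 0, 0, 1],
)
```

In practice this check would run on a schedule against fresh labeled data, with the alert wired into monitoring.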
Qualifications
Required:
* Master's degree in Data Science, Statistics, Economics, Mathematics, Computer Science, or a related quantitative field - or equivalent work experience.
* At least 1 year of experience in a data science or related analytics role.
* Proficiency in R and/or Python, SQL, Power BI, and machine learning techniques (e.g., clustering, classification, regression).
* Strong skills in data mining, algorithm development, and performance testing.
* Excellent communication and presentation skills, with the ability to translate complex data into actionable insights.
* Strong organization and time management skills with the ability to manage multiple priorities in a fast-paced environment.
Preferred:
* 3+ years of experience in a data science role.
* Familiarity with tools such as Snowflake, Smartsheet, or Jira.
* Demonstrated experience using large and multiple datasets to drive business value.
* Proven ability to work cross-functionally with stakeholders at all levels.
We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex (including pregnancy), national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or expression, protected veteran status or any other protected characteristic under federal, state, or local law. We are proud to be an equal opportunity workplace.
We are committed to fostering an inclusive, accessible work environment, which includes providing reasonable accommodations to individuals with disabilities during the application and hiring process as well as throughout the course of one's employment. Should you need a reasonable accommodation during the application and selection process, including, but not limited to, use of our website or any part of the application, interview or hiring process, please advise us so that we can provide appropriate assistance.
Data Scientist
Cognizant 4.6
Data engineer job in Appleton, WI
**Core Responsibilities:**
+ Develop and iterate on analytical frameworks to evaluate capacity, throughput, and system constraints across potato fries production stages.
+ Work closely with plant teams, engineering, and insights teams to validate assumptions, refine models, and drive data-informed decision-making.
+ Translate complex data into actionable insights using clear and impactful visualizations.
+ Build tools and dashboards to monitor key performance metrics and enable self-serve analytics.
**Key Skills & Qualifications**
**1. Technical & Analytical Proficiency**
+ Strong command of Python for data analysis and modeling.
+ Ability to wrangle and analyze time-series and operational datasets.
+ Familiarity with statistical methods and basic machine learning techniques.
**2. Data Visualization & Communication**
+ Proficient in data visualization tools (e.g., Python and Power BI).
+ Ability to translate data findings into clear, concise, and actionable visuals.
+ Skilled in crafting presentation-style storytelling for both technical and non-technical audiences.
**3. Manufacturing Domain Acumen**
+ Demonstrated interest in or experience with manufacturing, operations, or industrial systems (e.g., refrigeration, production lines, capacity planning).
+ Ability to understand plant operations, process flows, and constraints to contextualize analysis.
**4. Problem Solving & Initiative**
+ Capable of working on ambiguous problems and breaking them down into structured analytical tasks.
+ Demonstrated experience designing analysis frameworks from scratch or improving existing ones.
**5. Communication & Stakeholder Engagement**
+ Strong written and verbal communication skills with a top-down approach (start with the "so what").
+ Ability to collaborate cross-functionally with engineering, operations, and analytics teams.
+ Comfortable presenting findings and recommendations to mid-to-senior level stakeholders.
**Nice to Have:**
+ Exposure to manufacturing data systems (e.g., PI System, SCADA, MES).
+ Experience with simulation tools or modeling physical processes (e.g., heat transfer, fluid dynamics).
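The time-series wrangling called out in these qualifications often starts with something as simple as a rolling average over throughput readings; the window size and capacity target below are assumptions:

```python
from collections import deque

# Rolling-average monitor over line-throughput readings that flags dips
# below a capacity target. Window size, target, and data are illustrative.

def rolling_mean(series, window):
    """Rolling mean with a partial window at the start of the series."""
    buf, out = deque(maxlen=window), []
    for x in series:
        buf.append(x)
        out.append(sum(buf) / len(buf))
    return out

throughput = [100, 98, 97, 80, 78, 79]     # units per minute (made up)
smoothed = rolling_mean(throughput, window=3)
below_target = [v < 90 for v in smoothed]  # flag sustained dips below target
```

Smoothing before comparing to the target avoids alerting on single-reading noise while still catching the sustained drop at the end of the series.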
**_Please note, this role is not able to offer visa transfer or sponsorship now or in the future_**
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
Data Engineer
Insight Global
Data engineer job in Neenah, WI
Day to Day
- Own data pipeline development on GCP using BigQuery, Dataform, Dataflow, and Cloud Composer; design and implement performant, cost-aware patterns for batch and near-real-time workloads.
- Orchestrate source-to-warehouse ingestion (e.g., Fivetran) and transformations in dbt to move data "from point A to point B" with robust testing, documentation, and version control.
- Model warehouse tables (staging, marts, dimensional models) that support analytics and manufacturing operations; drive standards for naming, partitioning, clustering, and data quality.
- Partner cross functionally with application teams, manufacturing operations, and business stakeholders to understand requirements and translate them into scalable data solutions.
- Contribute to platform governance: CI/CD for data assets, secrets management, cost monitoring, access controls, and compliance aligned with enterprise standards.
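At its core, the Cloud Composer/Airflow-style orchestration described above runs tasks in dependency order; the standard library's `graphlib` can sketch this (the task names are hypothetical, not from the posting):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Toy pipeline DAG: each key depends on the tasks in its set.
# Task names are illustrative stand-ins for ingestion/staging/mart steps.
dag = {
    "staging_orders": {"ingest_orders"},
    "staging_customers": {"ingest_customers"},
    "mart_sales": {"staging_orders", "staging_customers"},
}

# static_order() yields a valid execution order: every task appears
# after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

Real orchestrators add scheduling, retries, and backfills on top, but the dependency-ordering core is exactly a topological sort like this.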
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************.To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
- 7+ years of professional Data Engineering experience (Senior/Staff level).
- Proven delivery on GCP: BigQuery, Dataform, Dataflow, Cloud Composer (or equivalent orchestration like Airflow).
- Hands on with dbt (testing, documentation, packages) and experience operationalizing connector based ingestion (e.g., Fivetran).
- Strong SQL engineering and performance tuning in columnar warehouses; familiarity with query cost optimization.
- Experience migrating from on prem systems to cloud data warehouses, including schema rationalization and data quality strategies.
- Solid software engineering foundations: version control (git), code review, CI/CD for data, Infrastructure as Code (Terraform or similar preferred).
- Excellent communication skills for stakeholder alignment and knowledge sharing.
Data Engineer
Amplifi Group 4.3
Data engineer job in Appleton, WI
Amplifi is the go-to data consultancy for enterprise organizations that want their success to be driven by data. We empower our clients to innovate, grow and succeed by establishing and delivering strategies across all elements of the data value chain. From the governance and management of data through to analytics and automation, our integrated approach to modern data ecosystems delivers measurable results through a combination of expert consultancy and best-in-breed technology. Our company and team members are proud to empower our clients' businesses by providing exceptional solutions and value, as we truly believe their success is our success. We thrive on delivering excellent solutions and overcoming technical and business challenges. As such, we're looking for like-minded individuals to learn, grow, and mentor others as a part of the Amplifi family.
Position Summary
The Data Engineer will be responsible for designing, building, and maintaining scalable, secure data pipelines that drive analytics and support operational data products. The ideal candidate brings a strong foundation in SQL, Python, and modern data warehousing, with a deep understanding of Snowflake, Databricks, or Microsoft Fabric and a solid understanding of cloud-based architectures.
What You Will Get To Do
Design, develop, and optimize robust ETL/ELT pipelines to ingest, transform, and expose data across multiple systems.
Build and maintain data models and warehouse layers, enabling high-performance analytics and reporting.
Collaborate with analytics, product, and engineering teams to understand data needs and deliver well-structured data solutions.
Write clean, efficient, and testable code in SQL and Python to support automation, data quality, and transformation logic.
Support deployment and orchestration workflows, using Azure Data Factory, dbt, or similar tools.
Work across multi-cloud environments (Azure preferred; AWS and GCP optional) to integrate data sources and manage cloud-native components.
Contribute to CI/CD practices and data pipeline observability (monitoring, logging, alerting).
Ensure data governance, security, and compliance in all engineering activities.
Support ad hoc data science and machine learning workflows within Dataiku.
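The data-quality and testing duties listed above can be sketched as declarative column checks run against rows before loading; the check set and sample rows are illustrative:

```python
# Declarative data-quality sketch: each column maps to a predicate that
# rows must satisfy before loading. Checks and sample rows are made up.

CHECKS = {
    "order_id": lambda v: v is not None,
    "amount": lambda v: v is not None and v >= 0,
}

def validate(rows):
    """Return a list of (row index, column) pairs that failed a check."""
    failures = []
    for i, row in enumerate(rows):
        for col, check in CHECKS.items():
            if not check(row.get(col)):
                failures.append((i, col))
    return failures

rows = [
    {"order_id": 1, "amount": 9.5},
    {"order_id": None, "amount": 3.0},   # fails the not-null check
    {"order_id": 3, "amount": -1.0},     # fails the non-negative check
]
bad = validate(rows)
```

Tools such as dbt tests express the same idea declaratively in the warehouse; the Python form just makes the mechanism explicit.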
What You Bring to the Team
4+ years of experience in a data engineering or related software engineering role.
Proficiency in SQL and Python for data manipulation, transformation, and scripting.
Strong experience working with Snowflake and MSSQL Server.
Practical knowledge of working with cloud data platforms, especially Microsoft Azure.
Experience with modern data modeling and warehouse optimization techniques.
Experience with Databricks, Azure Data Factory, or dbt preferred.
Exposure to Microsoft Fabric components like OneLake, Pipelines, or Direct Lake.
Familiarity with cloud services across AWS, GCP, or hybrid cloud environments.
Understanding of or curiosity about Dataiku for data science and advanced analytics collaboration.
Ability to work independently and with a team in a hybrid/remote environment.
Location
Wisconsin is preferred.
Travel
Ability to travel up to 10% of the time.
Benefits & Compensation
Amplifi offers excellent compensation and benefits including, but not limited to, health, dental, 401(k) program, employee assistance program, short and long-term disability, life insurance, accidental death and dismemberment (AD&D), PTO program, flex work schedules and paid holidays.
Equal Opportunity Employer
Amplifi is proud to be an equal opportunity employer. We do not discriminate against applicants based on race, religion, disability, medical condition, national origin, gender, sexual orientation, marital status, gender identity, pregnancy, childbirth, age, veteran status or other legally protected characteristics.
Digital Technology Data Scientist Lead
Oshkosh 4.7
Data engineer job in Oshkosh, WI
At Oshkosh, we build, serve and protect people and communities around the world by designing and manufacturing some of the toughest specialty trucks and access equipment. We employ over 18,000 team members all united by a common purpose. Our engineering and product innovation help keep soldiers and firefighters safe, is critical in building and keeping communities clean and helps people do their jobs every day.
SUMMARY:
As a Data Scientist, your primary responsibilities will be to analyze and interpret datasets by using statistical techniques, machine learning, and/or programming skills in order to extract insights and build models which solve business problems.
YOUR IMPACT:
Familiarity with data science tools, such as Azure Databricks, Power BI, Microsoft Office (including Excel pivot tables), Spark, Python, R and C.
Undertake the processing of multiple datasets, including structured and unstructured, to analyze large amounts of information to discover trends and patterns.
Prepare and concisely deliver analysis results in visual and written forms that communicate data insights to both technical and non-technical audiences.
Collaborate with cross-functional teams (e.g. data analysts, data engineers, architects, business stakeholders) to understand data needs for complex business requirements.
Build highly complex predictive models and machine-learning algorithms; integrate them into existing systems or use them to create new products.
Direct some data science assignments, projects, visualization tasks, data quality improvements, and troubleshooting of data incidents, including the resolution of root causes.
Lead efforts to resolve and document solutions to track and manage incidents, changes, problems, tasks, and demands.
Coach and mentor other team members on new technologies and best practices across data science and business intelligence.
Possesses advanced understanding of business' processes in at least one area of business and has an understanding of business processes in multiple areas of the business.
Actively supports the advancement of the strategic roadmap for data science.
MINIMUM QUALIFICATIONS:
Bachelor's degree with five (5) or more years of experience in the field or in a related area.
STANDOUT QUALIFICATIONS:
Masters or doctorate degree
Expertise in Power Platform
Familiarity with LLMs (open source or closed)
Experience in front-end web app development (Flask apps, Gradio, etc.)
Familiarity with RAG architecture
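The RAG architecture mentioned above pairs an LLM with a document retriever; a toy retriever that ranks documents by bag-of-words cosine similarity (the documents and queries are made-up examples) looks like:

```python
from collections import Counter
from math import sqrt

# Toy retrieval step for a RAG pipeline: rank documents by cosine
# similarity over bag-of-words vectors. Documents and queries are made up;
# real systems use learned embeddings and a vector store.

def vec(text):
    """Bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)  # Counter returns 0 for missing terms
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

docs = {
    "d1": "hydraulic lift maintenance schedule",
    "d2": "fire truck ladder inspection checklist",
    "d3": "employee cafeteria menu",
}

def retrieve(query):
    """Return the id of the best-matching document for the query."""
    q = vec(query)
    return max(docs, key=lambda d: cosine(q, vec(docs[d])))
```

The retrieved document would then be stuffed into the LLM prompt as grounding context, which is the "retrieval-augmented" part of RAG.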
Pay Range:
$118,600.00 - $204,000.00
The above pay range reflects the minimum and maximum target pay for the position across all U.S. locations. Within this range, individual pay is determined by various factors, including the scope and responsibilities of the role, the candidate's experience, education and skills, as well as the equity of pay among team members in similar positions. Beyond offering a competitive total rewards package, we prioritize a people-first culture and offer various opportunities to support team member growth and success.
Oshkosh is committed to working with and offering reasonable accommodation to job applicants with disabilities. If you need assistance or an accommodation due to disability for any part of the employment process, please contact us at ******************************************.
Oshkosh Corporation is a merit-based Equal Opportunity Employer. Job opportunities are open for application to all qualified individuals and selection decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, or other protected characteristic. To the extent that information is provided or collected regarding categories as provided by law it will in no way affect the decision regarding an employment application.
Oshkosh Corporation will not discharge or in any manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with Oshkosh Corporation's legal duty to furnish information.
Certain positions with Oshkosh Corporation require access to controlled goods and technologies subject to the International Traffic in Arms Regulations or the Export Administration Regulations. Applicants for these positions may need to be "U.S. Persons," as defined in these regulations. Generally, a "U.S. Person" is a U.S. citizen, lawful permanent resident, or an individual who has been admitted as a refugee or granted asylum.
Data Scientist - Strategic Analytics
Acuity 4.7
Data engineer job in Sheboygan, WI
Acuity is seeking a Data Scientist - Strategic Analytics to work with business leaders to inform company strategies and decision-making through an objective viewpoint, drawing on the team's deep business knowledge, treating data as an asset, and applying statistical and advanced analytical techniques as appropriate. The Data Scientist - Strategic Analytics uses advanced statistical and analytical techniques to address highly complex and sophisticated business questions. The data scientist will apply and integrate advanced statistical, mathematical, data mining, and business analysis skills and techniques to discover new behavioral insights, predict outcomes, prescribe decision options, and advise on turning new insights into actionable opportunities.
Internal deadline to apply: December 22nd, 2025
ESSENTIAL FUNCTIONS:
Develop solutions to support analytic insights and visualization using mathematical models, algorithms, machine learning techniques, and robust analytics.
Partner with business clients and technical leaders to spearhead the development of standards for appropriate statistical methodology, study design, power and sample size estimation, and data analysis plans for behavioral analytics.
Provide data driven decision support for key initiatives of company strategy and measurement related to client, claims, sales, and campaign analyses to understand both growth and economic impact.
Determine requisite data elements and partner with data engineers to design integrated datasets for analytical research purposes.
Work closely with machine learning engineers and business partners to develop and build behavioral data science and analytics products.
Effectively present analytical results and derived recommendations for action to senior leaders, peers, and product team members.
Regular and predictable attendance.
Perform other duties as assigned.
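The power and sample-size estimation work named in the essential functions above can be illustrated with a small, self-contained sketch using the normal approximation for a two-proportion test. The function name and the 10%-to-12% conversion-lift scenario are invented for illustration, not Acuity's methodology:

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided two-proportion z-test
    (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical study: how many subjects per group to detect a lift
# from 10% to 12% conversion at alpha=0.05 with 80% power?
n = sample_size_two_proportions(0.10, 0.12)
```

Note how the required sample size grows sharply as the effect to be detected shrinks, which is why sample-size planning precedes any behavioral experiment.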
EDUCATION:
Bachelor's degree in data science, statistics, math, computer science, economics, or a related field. An advanced graduate-level degree in a quantitative discipline (statistics, applied mathematics, computer science, econometrics, or a related field) is preferred.
EXPERIENCE:
A minimum of three years of relevant experience, including research and data analysis, experiment design and measurement, or application of statistical research techniques.
OTHER QUALIFICATIONS:
Expertise in one or more development or statistical analysis tools such as R, Python, SAS, SQL, or SPSS. Tableau experience is a plus.
Proven excellence in research, quantitative analysis, problem solving, and analytical working techniques.
Statistical knowledge and intuition.
Strong aptitude and desire for learning new analytical and visualization tools, modeling, and quantitative techniques. Initiative to independently design and develop own deliverables while still being a team player.
Demonstrates ability to deliver results and recommendations in written, verbal and presentation form at an appropriate level for a business audience.
*Acuity does not sponsor applicants for U.S. work authorization.*
This job is classified as exempt.
We are an Equal Employment Opportunity employer. Applicants and employees are considered for positions and are evaluated without regard to mental or physical disability, race, color, religion, gender, national origin, age, genetic information, military or veteran status, sexual orientation, marital status or any other protected Federal, State/Province or Local status unrelated to the performance of the work involved.
If you have a disability and require reasonable accommodations to apply or during the interview process, please contact our Talent Acquisition team at ******************. Acuity is dedicated to offering reasonable accommodations during our recruitment process for qualified individuals.
$74k-98k yearly est. 50d ago
Lead Data Steward
Faith Technologies 4.0
Data engineer job in Menasha, WI
You've discovered something special. A company that cares. Cares about leading the way in construction, engineering, manufacturing and renewable energy. Cares about redefining how energy is designed, applied and consumed. Cares about thoughtfully growing to meet market demands. And, as "one of the Healthiest 100 Workplaces in America," is focused on the mind/body/soul of team members through our Culture of Care.
We are seeking a highly motivated and detail-oriented FTI Data Steward Leader to establish and champion data governance within our organization. As data governance is a new initiative for the company, this role will be pivotal in laying the foundation, defining standards, and ensuring that data is treated as a critical enterprise asset. The FTI Data Steward Leader will help lead a community of business and technical data stewards across domains, promote best practices, and work closely with data governance leadership to develop policies, data quality standards, and stewardship processes. This role requires a strategic thinker with strong leadership skills and a deep understanding of data management principles.
MINIMUM REQUIREMENTS
Education: Bachelor's degree in Information Management, Data Science, Business, or a related field, or equivalent experience in lieu of a degree.
Experience: 5+ years of experience in data management, data governance, or related disciplines, with demonstrated experience leading data stewardship or governance initiatives enterprise-wide. Strong knowledge of data quality principles, metadata management, and master data management (MDM). Familiarity with data governance frameworks (e.g., DAMA-DMBOK, DCAM) and tools (e.g., Informatica).
Travel: 0-10%
Work Schedule: This position works between the hours of 7 AM and 5 PM, Monday-Friday. However, work may be performed at any time on any day of the week to meet business needs.
KEY RESPONSIBILITIES
Leadership and Strategy:
Serves as the primary representative and advocate for the community of master data leads and data coordinators.
Leads master data management (MDM) strategies and policies to support FTI.
Collaborates with data governance leadership to define the enterprise data stewardship framework, standards, and playbooks.
Creates and implements data governance policies, procedures, and maturity roadmaps.
Leads the formation of a data governance council and facilitates regular working sessions.
Stewardship and Quality:
Ensures consistent application of data definitions, metadata standards, and classification across data domains.
Leads the development of data quality standards with data stewards and helps resolve data quality issues with the appropriate stewards.
Collaboration and Stakeholder Engagement:
Partners with business units, BT, and Data Analytics teams to identify data steward leaders and data ownership roles.
Facilitates communication between business and technical stakeholders to resolve data issues and improve data understanding.
Acts as a liaison between the data governance and operational teams to ensure stewardship initiatives are aligned with business needs.
Metadata and Cataloging:
Works with data governance and Data Analytics team to maintain an enterprise data catalog.
Supports documentation of business glossaries, data dictionaries, and lineage across key data assets.
Training and Change Management:
Promotes data literacy and fosters a data-centric culture across the organization.
Leads change management efforts related to data governance adoption and tool implementation.
Performs other related duties as required and assigned.
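The data quality standards described in the responsibilities above are often codified as executable rules that stewards maintain and run against key datasets. The sketch below is a minimal, hypothetical illustration; the field names, checks, and thresholds are invented, not FTI's actual standards:

```python
# Hypothetical data-quality rules a steward community might codify.
# Each rule maps a field to a validity check; all names are invented.
RULES = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",
    "email": lambda v: isinstance(v, str) and "@" in v,
    "state": lambda v: v in {"WI", "IL", "MN", "MI"},
}

def audit(records, rules=RULES):
    """Return per-field failure counts so stewards can prioritize fixes."""
    failures = {field: 0 for field in rules}
    for rec in records:
        for field, check in rules.items():
            if not check(rec.get(field)):
                failures[field] += 1
    return failures

sample = [
    {"customer_id": "C1", "email": "a@b.com", "state": "WI"},
    {"customer_id": "", "email": "not-an-email", "state": "WI"},
]
report = audit(sample)  # {'customer_id': 1, 'email': 1, 'state': 0}
```

In practice such rules would live in a governance tool alongside the business glossary, but the principle is the same: standards become checks, and checks become measurable quality metrics.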
The job description and responsibilities described are intended to provide guidelines for job expectations and the employee's ability to perform the position described. It is not intended to be construed as an exhaustive list of all functions, responsibilities, skills, and abilities. Additional functions and requirements may be assigned by supervisors as deemed appropriate.
How Does FTI Give YOU the Chance to Thrive?
If you're energized by new challenges, FTI provides you with many opportunities. Joining FTI opens doors to redefine what's possible for your future.
Once you're a team member, you're supported and provided with the knowledge and resources to achieve your career goals with FTI. You're officially in the driver's seat of your career, and FTI's career development and continued education programs give you opportunities to position yourself for success.
FTI is a “merit to the core” organization. We recognize and reward top performers, offering competitive, merit-based compensation, career path development and a flexible and robust benefits package.
Benefits are the Game-Changer
We provide industry-leading benefits as an investment in the lives of team members and their families. You're invited to review the full list of FTI benefits available to regular/full-time team members. Start here. Grow here. Succeed here. If you're ready to learn more about your career with FTI, apply today!
Faith Technologies, Inc. is an Equal Opportunity Employer - veterans/disabled.
$101k-131k yearly est. 60d+ ago
Data Warehouse Engineer I
Menasha 4.8
Data engineer job in Menasha, WI
The Data Warehouse Engineer I is part of a team dedicated to supporting Network Health's Enterprise Data Warehouse. This individual will perform development, analysis, testing, debugging, documentation, implementation, and maintenance of interfaces to support the Enterprise Data Warehouse and related applications. They will consult with other technical resources and key departmental users on solutions and best practices. They will monitor performance and effectiveness of the data warehouse and recommend changes as appropriate.
Location: Candidates must reside in the state of Wisconsin for consideration. This position is eligible to work at your home office (reliable internet is required), at our office in Brookfield or Menasha, or a combination of both in our hybrid workplace model.
Hours: 1.0 FTE, 40 hours per week, 8am-5pm Monday through Friday, may be required to work later hours when system changes are being implemented or problems arise
Check out our 2024 Community Report to learn a little more about the difference our employees make in the communities where we live and work. As an employee, you will have the opportunity to work hard and have fun while getting paid to volunteer in your local neighborhood. You, too, can be part of the team and make a difference. Apply to this position to learn more about our team.
Job Responsibilities:
Perform end-to-end delivery of data interfaces in various stages of the Enterprise Data Warehouse in accordance with professional standards and industry best practices
Perform all phases of the development lifecycle including solution design, creation of acceptance criteria, implementation, technical documentation, development and execution of test cases, performance monitoring, troubleshooting, data analysis, and profiling
Consult with Developers, Engineers, DBAs, key departmental stakeholders, data governance and leadership on technical solutions and best practice
Monitor and audit the Enterprise Data Warehouse for effectiveness, throughput, and responsiveness. Recommend changes as appropriate. Troubleshoot customer complaints related to system performance issues
Maintain effective communication with customers from all departments for system development, implementation, and problem resolution
Required to take call to assist in resolution of technical problems
Other duties and responsibilities as assigned
Job Requirements:
Requires Associate Degree in Computer Science, Business, or related technical field; equivalent years of experience may be substituted
Minimum of 1 year experience in program interfacing required
Experience with T-SQL development, SSIS development, and database troubleshooting skills required
Network Health is an Equal Opportunity Employer
$82k-107k yearly est. 60d+ ago
Data Architect
Rosen's Diversified Inc. 4.5
Data engineer job in Green Bay, WI
We are seeking a highly skilled Data Architect to play a pivotal role in designing scalable, secure, and high-performing data solutions across the enterprise. This role will focus on architecting the next generation of data platforms, driving the adoption of modern technologies, and ensuring alignment with enterprise goals and governance standards. As part of RDI's broader mission, this role serves as a hands-on architect who blends technical depth with strategic insight to shape data engineering, analytics, and AI/ML capabilities, unlocking the power of data to inform decisions, optimize performance, and create competitive advantage for internal stakeholders.
ESSENTIAL FUNCTIONS AND RESPONSIBILITIES
* Develop end-to-end data architecture strategies for big data and advanced analytics solutions using Azure Databricks.
* Experience with building a Databricks Delta Lake-based Lakehouse using DLT, PySpark jobs, Databricks Workflows, Unity Catalog, and the Medallion architecture from the ground up.
* Ensure data models support scalability, performance, and security requirements.
* Configure and optimize Databricks clusters, notebooks, and workflows for ETL/ELT processes.
* Integrate Databricks with other Azure services such as Azure Data Lake Storage (ADLS), Azure Data Factory, Azure Key Vault, and Microsoft Fabric.
* Implement best practices for data governance, security, and cost optimization.
* Work closely with data engineers, analysts, and business stakeholders to translate requirements into technical solutions.
* Provide technical leadership and mentorship to team members on Databricks and Azure technologies.
* Monitor and troubleshoot data pipelines and workflows for reliability and efficiency.
* Ensure compliance with organizational and regulatory standards for data privacy and security.
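For readers unfamiliar with the Medallion architecture named in the responsibilities above, it stages data in bronze (raw), silver (validated and deduplicated), and gold (business-level aggregate) layers. The sketch below illustrates that layering in plain Python with invented records; a real implementation would use PySpark and Delta Live Tables on Databricks:

```python
# Illustrative Medallion (bronze/silver/gold) flow; all records invented.
bronze = [  # raw ingested events, duplicates and bad rows included
    {"order_id": 1, "plant": "Green Bay", "qty": 10},
    {"order_id": 1, "plant": "Green Bay", "qty": 10},  # duplicate
    {"order_id": 2, "plant": None, "qty": 5},          # invalid record
    {"order_id": 3, "plant": "Green Bay", "qty": 7},
]

def to_silver(rows):
    """Silver layer: validate and deduplicate on the business key."""
    seen, silver = set(), []
    for r in rows:
        if r["plant"] is None or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append(r)
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate (total qty per plant)."""
    totals = {}
    for r in rows:
        totals[r["plant"]] = totals.get(r["plant"], 0) + r["qty"]
    return totals

gold = to_gold(to_silver(bronze))  # {'Green Bay': 17}
```

The design choice is that each layer only ever reads from the one below it, so quality rules and lineage stay auditable at every hop.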
Qualifications
QUALIFICATIONS, KNOWLEDGE, SKILLS, AND EXPERIENCE
* Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
* 7+ years of progressive experience in data engineering and data modeling, including at least 2 years in an architecture-focused or lead technical role.
* Proven experience architecting enterprise-grade cloud data platforms using Azure.
* Experience with building a Databricks Delta Lake-based Lakehouse using DLT, PySpark jobs, Databricks Workflows, Unity Catalog, and the Medallion architecture from the ground up.
ADDITIONAL SKILLS/EXPERIENCE/REQUIREMENTS
* Strong understanding of data security principles and best practices.
* Experience with Databricks monitoring and performance tuning tools.
* Solid understanding of DevOps and Infrastructure-as-Code practices, including CI/CD pipelines and automated deployment frameworks.
* Analytical & Problem-Solving Skills: Demonstrated ability to troubleshoot complex issues and deliver effective solutions under time constraints.
* Documentation & Process Management: Ability to create and maintain clear documentation for configurations, processes, and governance standards.
* Certifications such as Microsoft Certified: Azure Data Engineer Associate or Databricks Certified Data Engineer.
* Ability to work in a fast-paced, collaborative environment.
* Excellent organizational skills, time management, and ability to work independently on multiple deadlines ensuring high quality deliverables while managing shifting responsibilities. Strong written and verbal communication skills and ability to tailor message to appropriate audience.
* Ability to travel up to 15% to various company locations as needed to provide support.
Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other legally protected status. Applicants with a disability who require a reasonable accommodation for any part of the application or hiring process can contact Human Resources at the location(s) where you are applying. We participate in the E-Verify program in certain locations as required by law.
Summary
OUR FAMILY CULTURE
We are a family-owned business established in 1946 with nearly 5,000 employees. Over the years, Rosen's Diversified, Inc. ("RDI") has grown into a holding company of vertically integrated business units, including American Foods Group, America's Service Line, Scientific Life Solutions, and Rosen's Inc. By understanding our employees, our customers and ourselves, we are preparing RDI for the future generations of success.
Our Company is comprised of innovative entrepreneurs who value our casual and down to earth culture. As a member of the Rosen's family, you will find yourself challenged and rewarded for your professional contributions as well as the Company's success.
WHAT WE OFFER
* Privately held, family-owned (three generations) business which operates with a mentality of what should be done versus shareholder requirements.
* Excellent health and welfare benefits including but not limited to medical, dental, vision, disability, and a variety of voluntary benefit options.
* 401(k) benefits with annual company match for eligible employees.
* Professional and personal development programs including Career and Learning Paths providing opportunities for advancement.
$83k-116k yearly est. 17d ago
Data Engineer
Alliance Laundry Systems 4.7
Data engineer job in Ripon, WI
Our Enterprise Analytics Center of Excellence (CoE) is building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. We're looking for a Data Engineer to help design, build, and operate this modern data architecture. This role focuses on developing efficient, governed data pipelines and enabling analytics teams across the business to access reliable, high-quality data.
Responsibilities
Ingestion & Transformation
Build and maintain scalable data pipelines using Snowflake OpenFlow and related Snowflake-native tools.
Assist in migrating existing Dataiku pipelines into Snowflake to simplify architecture and improve efficiency.
Continuously improve workflows for performance, reliability, and maintainability.
Semantic Modeling & Views
Develop and maintain Snowflake semantic views that support analytics and reporting needs.
Apply data modeling best practices to ensure consistency across business domains.
Support creation of governed, role-based semantic layers for Finance, Operations, and other enterprise areas.
Governance & Security
Implement data access controls aligned with enterprise RBAC frameworks.
Support metadata management and cataloging for visibility in Sigma.
Ensure pipelines and models meet enterprise data governance standards.
Analytics Enablement
Deliver clean, governed data sets for Sigma dashboards and embedded analytics use cases.
Monitor and optimize query performance to balance cost and speed.
Partner with analysts and BI developers to ensure data is reliable and ready for consumption.
Continuous Improvement
Stay current with Snowflake and Sigma feature updates.
Recommend improvements to data engineering processes and automation where possible.
Contribute to a culture of learning and collaboration within the CoE.
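The governed, role-based semantic layers described in the responsibilities above come down to mapping each role to the datasets it may read. The sketch below is a hypothetical illustration with invented role and view names; in Snowflake this would be expressed as GRANT statements on views tied to directory-synced RBAC roles, surfaced through Sigma:

```python
# Hypothetical role-to-dataset grants for a governed semantic layer.
# Role names and view names are invented for illustration.
ROLE_GRANTS = {
    "FINANCE_ANALYST": {"finance.revenue", "finance.ar_aging"},
    "OPS_ANALYST": {"ops.throughput"},
}

def can_read(role, dataset):
    """Check whether a role has been granted read access to a view."""
    return dataset in ROLE_GRANTS.get(role, set())

def visible_datasets(role):
    """The catalog a BI tool would show this role after access filtering."""
    return sorted(ROLE_GRANTS.get(role, set()))

finance_views = visible_datasets("FINANCE_ANALYST")
# ['finance.ar_aging', 'finance.revenue']
```

Keeping the grant map in one governed place, rather than scattered per dashboard, is what makes audits and least-privilege reviews tractable.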
Qualifications
Education & Experience:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
At least 5 years of experience as a Data Engineer or in a similar role.
Required Skills & Experience
Snowflake expertise: SQL, OpenFlow, Tasks/Streams, and performance optimization.
Cloud experience: Hands-on with AWS (S3, IAM, Glue, Lambda a plus).
Data modeling: Familiar with dimensional modeling and semantic layer design.
ETL/ELT engineering: Experience building and optimizing data pipelines.
Governance & Security: Understanding of AD/LDAP integration, RBAC, and data cataloging.
Programming: Proficient in SQL and Python.
Strong problem-solving skills and a desire to learn emerging data technologies.
Preferred Skills
Experience supporting Power BI to Sigma migrations or similar modern BI transitions.
Familiarity with Sigma features such as Ask Sigma, Write-Back, or Embedded Analytics.
Exposure to Dataiku or similar ETL platforms.
Interest in modern data practices such as AI-assisted analytics or NLP-based BI tools.
Travel: Approximately 10%
Standard and Physical Requirements:
Position involves sitting for long periods, standing, manual dexterity, stooping, bending and minimal lifting.
Alliance Team Members Demonstrate DRIVE:
Dedicated: Follows through on commitments. Strong say/do.
Respectful: Acts with integrity and values diverse perspectives.
Innovative: Always looking for a better way; leads change.
Versatile: Adapts quickly to changing circumstances. Demonstrates agility.
Engaged: Acts like an owner. Wants to create and grow a business which is tightly aligned with market needs.
EEO: We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
ID 2025-10914 | Position Type: Full-Time
$72k-97k yearly est. 22d ago
Building Lead - Appleton 25313 (Fox Valley Operations Non Medical)
Kleenmark Services Corp 4.1
Data engineer job in Appleton, WI
Operating in a $1 billion-plus industry, KleenMark is Wisconsin's largest independent commercial cleaning and supply company. Built on 60 years of experience, KleenMark uses proven processes and the industry's best-trained teams to deliver unmatched service. With expertise in healthcare, commercial, life sciences, manufacturing, and education, KleenMark's 900-plus technicians clean more than 30 million square feet daily. We are a family-owned and family-run business that lives out our values of Trust, Teamwork and Results.
We have excellent opportunities for you to join our team!
Job Skills / Requirements
Job details:
Schedule: Monday-Thursday 9pm-1:30am, Friday 6pm-10:30pm
Pay: $17
Additional Details
Building Leads are responsible for maintaining the cleanliness of the building in which they work by performing various cleaning duties. Duties and hours may vary depending on the size of the building and the number of teammates they may be working with. A cleaner may be responsible for any or all of the following tasks. Tasks may also change throughout a cleaner's employment.
ESSENTIAL JOB FUNCTIONS
Note: This is not an all-inclusive list. Additional duties may be assigned.
Restrooms | Cleans and disinfects sinks, countertops, toilets, mirrors, floors, etc. Replenishes bathroom supplies. Polishes metalwork, such as fixtures and fittings.
Floors | Sweeps, mops, vacuums, floors using brooms, mops, and vacuum cleaners. Other floor work may be required such as: scrubbing, waxing and polishing floors.
Break rooms /Kitchenettes | Cleans and disinfects sinks, countertops, tables, chairs, refrigerators, etc. Replenishes break room supplies.
Dust | Dusts furniture, equipment, partitions, etc.
Trash | Empties wastebaskets and recyclables and transports to disposal area.
Other Duties | Cleans rugs, carpets, and upholstered furniture, using vacuum cleaner (hip or backpack). Washes walls and woodwork. Washes windows, door panels, partitions, sills, etc.
EXPECTATIONS
Reports to work each day and on time and works extra hours when needed.
Employee must comply with proper safety policies and procedures as required (i.e., when using cleaning chemicals, reporting incidents, etc.).
Provides excellent level of customer service to both internal and external customers by maintaining a positive attitude.
The employee must be able to determine the neatness, accuracy and thoroughness of the work assigned.
Additional Information / Benefits
Medical, Vision & Dental Insurance for qualifying positions.
Personal Time Off (PTO) for qualifying positions.
6 Paid federal holidays after 90 days for qualifying positions.
Employee Referral Bonus
Instant Pay Access through DailyPay.
Employee of the Month, Quarter and Year Employee Recognition Program.
Growth within the company.
Great work/life balance
Safety First:
Personal protective equipment provided or required
Safety Monthly Trainings for all employees.
Sanitizing, disinfecting, or cleaning procedures in place
Employees working in medical facilities are required to wear a mask and gloves during the entirety of their shift. We provide all necessary PPE.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Affirmative Action/EEO statement Kleenmark is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
This is a part-time position, 2nd shift or 3rd shift.
Number of Openings for this position: 1
$58k-84k yearly est. 12d ago
Software Engineer, Platform - Green Bay, USA
Speechify
Data engineer job in Green Bay, WI
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its 2025 Design Award winner for Inclusivity.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. These include frontend and backend engineers, AI research scientists, and others from Amazon, Microsoft, and Google, leading PhD programs like Stanford, high growth startups like Stripe, Vercel, Bolt, and many founders of their own companies.
Overview
The responsibilities of our Platform team include building and maintaining all backend services, including, but not limited to, payments, analytics, subscriptions, new products, text to speech, and external APIs.
This is a key role and ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Design, develop, and maintain robust APIs, including the public TTS API and internal APIs like Payment, Subscription, Auth, and Consumption Tracking, ensuring they meet business and scalability requirements
Oversee the full backend API landscape, enhancing and optimizing for performance and maintainability
Collaborate on B2B solutions, focusing on customization and integration needs for enterprise clients
Work closely with cross-functional teams to align backend architecture with overall product strategy and user experience
An Ideal Candidate Should Have
Proven experience in backend development: TS/Node (required)
Direct experience with GCP and knowledge of AWS, Azure, or other cloud providers
Efficiency in ideation and implementation, prioritizing tasks based on urgency and impact
Preferred: Experience with Docker and containerized deployments
Preferred: Proficiency in deploying high availability applications on Kubernetes
What We Offer
A dynamic environment where your contributions shape the company and its products
A team that values innovation, intuition, and drive
Autonomy, fostering focus and creativity
The opportunity to have a significant impact in a revolutionary industry
Competitive compensation, a welcoming atmosphere, and a commitment to an exceptional asynchronous work culture
The privilege of working on a product that changes lives, particularly for those with learning differences like dyslexia, ADD, and more
An active role at the intersection of artificial intelligence and audio - a rapidly evolving tech domain
The United States-based salary range for this role is 140,000-200,000 USD/year + Bonus + Stock, depending on experience
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
$63k-83k yearly est. 10d ago
Software Engineer
Actalent
Data engineer job in Ripon, WI
Join our innovative team as a Software Engineer where you will collaborate with fellow engineers to develop new and enhance existing applications using Python, Rust, and other languages based on high-level specifications and requirements. You will lead the development of prototypes, demos, and internal tooling, as well as refactor and optimize front-end user experiences using frameworks such as React and Flutter. Your role will also involve maintaining and improving legacy systems as needed.
Responsibilities
* Spearhead the development of prototypes, demos, and internal tooling.
* Refactor and optimize front-end user experiences using React and Flutter.
* Maintain, replace, and improve legacy systems when needed.
* Use Jira to track and manage assigned stories, epics, and escalated service tickets.
* Utilize Git and BitBucket for creating branches, pull requests, and commits.
* Provide support to end users directly or by creating detailed feature requests, bug reports, and process improvements.
* Review and approve documentation, manuals, and procedures for both internal and external use.
* Identify and patch bugs in development and production under time constraints.
Essential Skills
* Experience with embedded software, software development, and firmware.
* Proficiency in Python, Rust, and front-end frameworks such as React and Flutter.
* Familiarity with Jira for task tracking and management.
* Proficient use of version control systems like Git and BitBucket.
Additional Skills & Qualifications
* Familiarity with specific hardware such as HP Aruba.
* Ability to work with customers to deploy and maintain audio and access control systems.
Job Type & Location
This is a Permanent position based out of Ripon, WI.
Pay and Benefits
The pay range for this position is $70,000.00 - $90,000.00/yr.
Health, Vision, Dental, and PTO
Workplace Type
This is a fully onsite position in Ripon, WI.
Application Deadline
This position is anticipated to close on Jan 29, 2026.
About Actalent
Actalent is a global leader in engineering and sciences services and talent solutions. We help visionary companies advance their engineering and science initiatives through access to specialized experts who drive scale, innovation and speed to market. With a network of almost 30,000 consultants and more than 4,500 clients across the U.S., Canada, Asia and Europe, Actalent serves many of the Fortune 500.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
If you would like to request a reasonable accommodation, such as the modification or adjustment of the job application process or interviewing due to a disability, please email actalentaccommodation@actalentservices.com for other accommodation options.
$70k-90k yearly 13d ago
Digital Technology Data Scientist Lead
Oshkosh Corp 4.7
Data engineer job in Oshkosh, WI
At Oshkosh, we build, serve and protect people and communities around the world by designing and manufacturing some of the toughest specialty trucks and access equipment. We employ over 18,000 team members, all united by a common purpose. Our engineering and product innovation help keep soldiers and firefighters safe, are critical in building and keeping communities clean, and help people do their jobs every day.
SUMMARY:
As a Data Scientist, your primary responsibilities will be to analyze and interpret datasets by using statistical techniques, machine learning, and/or programming skills in order to extract insights and build models which solve business problems.
YOUR IMPACT:
* Apply data science tools such as Azure Databricks, Power BI, Microsoft Office (including Excel pivot tables), Spark, Python, R, and C.
* Undertake the processing of multiple datasets, including structured and unstructured, to analyze large amounts of information to discover trends and patterns.
* Prepare and concisely deliver analysis results in visual and written forms that communicate data insights to both technical and non-technical audiences.
* Collaborate with cross-functional teams (e.g. data analysts, data engineers, architects, business stakeholders) to understand data needs for complex business requirements.
* Build highly complex predictive models and machine-learning algorithms, integrating them into existing systems or creating new products.
* Direct some data science assignments, projects, visualization tasks, data quality improvements, and troubleshooting of data incidents, including the resolution of root causes.
* Lead efforts to resolve and document solutions to track and manage incidents, changes, problems, tasks, and demands.
* Coach and mentor other team members on new technologies and best practices across data science and business intelligence.
* Possess an advanced understanding of business processes in at least one area of the business and a working understanding of processes across multiple other areas.
* Actively support the advancement of the strategic roadmap for data science.
MINIMUM QUALIFICATIONS:
* Bachelor's degree with five (5) or more years of experience in the field or in a related area.
STANDOUT QUALIFICATIONS:
* Master's or doctoral degree
* Expertise in Power Platforms
* Familiarity with LLMs (open-source or closed)
* Experience in front-end web app development (e.g. Flask apps, Gradio)
* Familiarity with RAG architecture
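For context on the RAG (retrieval-augmented generation) architecture named above: the pattern retrieves the documents most relevant to a query and passes them to an LLM as grounding context. A toy, stdlib-only sketch follows; the keyword-overlap scoring stands in for a real embedding store, and the `ask_llm` stub is a hypothetical placeholder, not any particular library's API.

```python
# Toy RAG sketch: keyword-overlap retrieval stands in for a vector store;
# the LLM call is stubbed out. All names and documents are illustrative.

DOCS = [
    "The M-ATV is a mission-critical military all-terrain vehicle.",
    "JLG access equipment includes boom lifts and scissor lifts.",
    "Pierce manufactures custom fire apparatus for fire departments.",
]

def retrieve(query, docs, k=1):
    """Rank docs by word overlap with the query (a stand-in for embeddings)."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def ask_llm(prompt):
    """Hypothetical stub; a real system would call an LLM API here."""
    return f"[LLM answer grounded in a prompt of {len(prompt)} chars]"

def rag_answer(query):
    # Retrieved context is prepended to the question before the model call.
    context = "\n".join(retrieve(query, DOCS))
    prompt = f"Context:\n{context}\n\nQuestion: {query}"
    return ask_llm(prompt)

print(retrieve("which fire apparatus does Pierce make", DOCS))
```

Production systems replace the overlap scoring with vector similarity over embedded document chunks, but the retrieve-then-prompt flow is the same.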
Pay Range:
$118,600.00 - $204,000.00
The above pay range reflects the minimum and maximum target pay for the position across all U.S. locations. Within this range, individual pay is determined by various factors, including the scope and responsibilities of the role, the candidate's experience, education and skills, as well as the equity of pay among team members in similar positions. Beyond offering a competitive total rewards package, we prioritize a people-first culture and offer various opportunities to support team member growth and success.
Oshkosh is committed to working with and offering reasonable accommodation to job applicants with disabilities. If you need assistance or an accommodation due to disability for any part of the employment process, please contact us at ******************************************.
Oshkosh Corporation is a merit-based Equal Opportunity Employer. Job opportunities are open for application to all qualified individuals and selection decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, or other protected characteristic. To the extent that information is provided or collected regarding categories as provided by law it will in no way affect the decision regarding an employment application.
Oshkosh Corporation will not discharge or in any manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with Oshkosh Corporation's legal duty to furnish information.
Certain positions with Oshkosh Corporation require access to controlled goods and technologies subject to the International Traffic in Arms Regulations or the Export Administration Regulations. Applicants for these positions may need to be "U.S. Persons," as defined in these regulations. Generally, a "U.S. Person" is a U.S. citizen, lawful permanent resident, or an individual who has been admitted as a refugee or granted asylum.
$68k-88k yearly est. Auto-Apply 60d+ ago
Analytics Data Developer - Strategic Analytics
Acuity 4.7
Data engineer job in Sheboygan, WI
Acuity is seeking an Analytics Data Developer to be responsible for designing, building, and optimizing Acuity's Strategic Analytics Team's data pipelines and data marts to enable efficient and scalable analytical workflows. This role plays a critical part in ensuring that high-quality, trusted data is readily available to drive insight generation, predictive modeling, and data-driven decision-making across the enterprise. The Analytics Data Developer will support our data scientists, analytics product managers, insight analysts, and reporting analysts by developing robust data ingestion, transformation, and enrichment processes. They will ensure the reliability, performance, and integrity of the data ecosystem while maintaining strong data governance and comprehensive documentation.
Internal deadline to apply: December 22nd, 2025
ESSENTIAL FUNCTIONS:
Design, build, and optimize scalable and maintainable data pipelines and data marts to support analytics and reporting needs.
Perform data ingestion, transformation, and enrichment from various structured and unstructured data sources.
Monitor, troubleshoot, and optimize pipelines to ensure data quality, reliability, and performance.
Maintain strong data governance by adhering to established standards for data quality, lineage, security, and documentation.
Collaborate closely with data scientists, analysts, engineers, and business partners to translate analytic requirements into efficient data solutions.
Contribute to continuous improvement efforts through automation, performance tuning, and best practice adoption in data development.
Support data platform modernization efforts by integrating new technologies and approaches that enhance analytical capability.
Demonstrate excellent teamwork, coordination, influence, and communication skills.
Develop timely and effective solutions for challenging design problems.
Establish relationships with data owners, experts, and SMEs across a wide variety of Acuity data domains.
Work closely with data scientists and analysts to develop and build analytical products.
Acquire, analyze, combine, synthesize, and store data from a wide range of internal and external sources as it pertains to model development.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Partner with business and technical leaders to prioritize data needs to expand Acuity's analytical data universe.
Develop and support monitoring solutions for the ML system and its components.
EDUCATION:
Bachelor's degree in Computer Science, MIS, Data Engineering, Business, or a related field.
EXPERIENCE:
3+ years of experience developing data pipelines, data marts, or ETL/ELT workflows in a modern data environment. Proficiency in SQL and at least one programming language such as Python, Scala, or Java.
OTHER QUALIFICATIONS:
Demonstrated experience designing and implementing Medallion Architecture (Bronze-Silver-Gold layers) in a cloud or distributed data environment.
Experience with modern data warehousing and processing platforms (e.g., Snowflake, Databricks, Azure Synapse, or similar).
Familiarity with data governance, metadata management, and version control best practices.
Strong problem-solving skills, attention to detail, and ability to operate independently in a fast-paced, collaborative environment.
Excellent communication and interpersonal skills with a demonstrated ability to work effectively with both technical and non-technical stakeholders.
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
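To illustrate the Medallion Architecture (Bronze-Silver-Gold) named in the qualifications above: raw records land untouched in a bronze layer, are deduplicated and typed in silver, and are aggregated into business-ready tables in gold. The sketch below is a minimal, plain-Python illustration under assumed data (in practice these layers would be Delta, Snowflake, or Synapse tables, and the field names here are hypothetical).

```python
# Minimal Medallion (bronze/silver/gold) sketch in plain Python.
# Layers are plain lists/dicts here; real pipelines use warehouse tables.

# Bronze: raw ingestion, kept as-is (bad rows and duplicates included).
bronze = [
    {"policy_id": "P1", "premium": "1200.50", "state": "WI"},
    {"policy_id": "P2", "premium": "n/a",     "state": "WI"},   # unparseable row
    {"policy_id": "P1", "premium": "1200.50", "state": "WI"},   # duplicate
    {"policy_id": "P3", "premium": "980.00",  "state": "MN"},
]

def to_silver(rows):
    """Silver: deduplicate on key, drop unparseable rows, enforce types."""
    seen, out = set(), []
    for r in rows:
        if r["policy_id"] in seen:
            continue  # drop duplicate
        try:
            premium = float(r["premium"])
        except ValueError:
            continue  # a real pipeline would quarantine this row
        seen.add(r["policy_id"])
        out.append({"policy_id": r["policy_id"], "premium": premium, "state": r["state"]})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (total premium per state)."""
    agg = {}
    for r in rows:
        agg[r["state"]] = agg.get(r["state"], 0.0) + r["premium"]
    return agg

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'WI': 1200.5, 'MN': 980.0}
```

The design point is that each layer is reproducible from the one below it, so data-quality fixes happen in the transformation code rather than by hand-editing tables.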
*Acuity does not sponsor applicants for U.S. work authorization.*
This job is classified as exempt.
We are an Equal Employment Opportunity employer. Applicants and employees are considered for positions and are evaluated without regard to mental or physical disability, race, color, religion, gender, national origin, age, genetic information, military or veteran status, sexual orientation, marital status or any other protected Federal, State/Province or Local status unrelated to the performance of the work involved.
If you have a disability and require reasonable accommodations to apply or during the interview process, please contact our Talent Acquisition team at
******************
. Acuity is dedicated to offering reasonable accommodations during our recruitment process for qualified individuals.
$88k-112k yearly est. Easy Apply 50d ago
Software Engineer
Actalent
Data engineer job in Ripon, WI
Join our innovative team as a Software Engineer where you will collaborate with fellow engineers to develop new and enhance existing applications using Python, Rust, and other languages based on high-level specifications and requirements. You will lead the development of prototypes, demos, and internal tooling, as well as refactor and optimize front-end user experiences using frameworks such as React and Flutter. Your role will also involve maintaining and improving legacy systems as needed.
Responsibilities
+ Spearhead the development of prototypes, demos, and internal tooling.
+ Refactor and optimize front-end user experiences using React and Flutter.
+ Maintain, replace, and improve legacy systems when needed.
+ Use Jira to track and manage assigned stories, epics, and escalated service tickets.
+ Utilize Git and BitBucket for creating branches, pull requests, and commits.
+ Provide support to end users directly or by creating detailed feature requests, bug reports, and process improvements.
+ Review and approve documentation, manuals, and procedures for both internal and external use.
+ Identify and patch bugs in development and production under time constraints.
Essential Skills
+ Experience with embedded software, software development, and firmware.
+ Proficiency in Python, Rust, and front-end frameworks such as React and Flutter.
+ Familiarity with Jira for task tracking and management.
+ Proficient use of version control systems like Git and BitBucket.
Additional Skills & Qualifications
+ Familiarity with specific hardware such as HP Aruba.
+ Ability to work with customers to deploy and maintain audio and access control systems.
Job Type & Location
This is a Permanent position based out of Ripon, WI.
Pay and Benefits
The pay range for this position is $70000.00 - $90000.00/yr.
Health, Vision, Dental, and PTO
Workplace Type
This is a fully onsite position in Ripon, WI.
Application Deadline
This position is anticipated to close on Jan 29, 2026.
About Actalent
Actalent is a global leader in engineering and sciences services and talent solutions. We help visionary companies advance their engineering and science initiatives through access to specialized experts who drive scale, innovation and speed to market. With a network of almost 30,000 consultants and more than 4,500 clients across the U.S., Canada, Asia and Europe, Actalent serves many of the Fortune 500.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
If you would like to request a reasonable accommodation, such as the modification or adjustment of the job application process or interviewing due to a disability, please email actalentaccommodation@actalentservices.com for other accommodation options.
$70k-90k yearly 13d ago
Building Lead - Neenah Weekends 23961 (Fox Valley Operations Non Medical)
Kleenmark Services Corp 4.1
Data engineer job in Neenah, WI
Operating in a $1 billion-plus industry, KleenMark is Wisconsin's largest independent commercial cleaning and supply company. Built on 60 years of experience, KleenMark uses proven processes and the industry's best-trained teams to deliver unmatched service. With expertise in healthcare, commercial, life sciences, manufacturing, and education, KleenMark's 900-plus technicians clean more than 30 million square feet daily. We are a family-owned and family-run business that lives out our values of Trust, Teamwork and Results.
We have excellent opportunities for you to join our team!
Job Skills / Requirements
Job details:
Schedule: Saturday & Sunday
Hours: 4:00pm-8:00pm
Pay: $19.50
Additional Details
Building Leads are responsible for maintaining the cleanliness of the building in which they work by performing various cleaning duties. Duties and hours may vary depending on the size of the building and the number of teammates they work with. A cleaner may be responsible for any or all of the following tasks. Tasks may also change throughout a cleaner's employment.
ESSENTIAL JOB FUNCTIONS
Note: This is not an all-inclusive list. Additional duties may be assigned.
Restrooms | Cleans and disinfects sinks, countertops, toilets, mirrors, floors, etc. Replenishes bathroom supplies. Polishes metalwork, such as fixtures and fittings.
Floors | Sweeps, mops, and vacuums floors using brooms, mops, and vacuum cleaners. Other floor work may be required, such as scrubbing, waxing, and polishing floors.
Break rooms /Kitchenettes | Cleans and disinfects sinks, countertops, tables, chairs, refrigerators, etc. Replenishes break room supplies.
Dust | Dusts furniture, equipment, partitions, etc.
Trash | Empties wastebaskets and recyclables and transports to disposal area.
Other Duties | Cleans rugs, carpets, and upholstered furniture, using vacuum cleaner (hip or backpack). Washes walls and woodwork. Washes windows, door panels, partitions, sills, etc.
EXPECTATIONS
Reports to work on time each day and works extra hours when needed.
Employee must comply with proper safety policies and procedures as required (i.e., when using cleaning chemicals, reporting incidents, etc.).
Provides excellent level of customer service to both internal and external customers by maintaining a positive attitude.
The employee must be able to determine the neatness, accuracy and thoroughness of the work assigned.
Additional Information / Benefits
Medical, Vision & Dental Insurance for qualifying positions.
Personal Time Off (PTO) for qualifying positions.
6 Paid federal holidays after 90 days for qualifying positions.
Employee Referral Bonus
Instant Pay Access through DailyPay.
Employee of the Month, Quarter and Year Employee Recognition Program.
Growth within the company.
Great work/life balance
Safety First:
Personal protective equipment provided or required
Safety Monthly Trainings for all employees.
Sanitizing, disinfecting, or cleaning procedures in place
Employees working in medical facilities are required to wear a mask and gloves during the entirety of their shift. We provide all necessary PPE.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Affirmative Action/EEO statement Kleenmark is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
This job reports to Krissia Henriquez.
This is a Part-Time position 2nd Shift, Weekends.
Number of Openings for this position: 1
How much does a data engineer earn in Oshkosh, WI?
The average data engineer in Oshkosh, WI earns between $67,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.