Cloud Engineer
Data engineer job in Oshkosh, WI
Insight Global is looking for a Cloud Engineer to support one of our largest federal clients in Wisconsin. The Senior Cloud Engineer will focus on driving profitable cloud adoption while advancing the innovation and consumption of cloud-based services across the enterprise. This role will ensure the security of the cloud environment by implementing security controls and threat protection and by managing identity and access; handle data feeds and integration processes to ensure seamless data exchange between systems; assess innovation and vendor alignment for the continuous build-out and scale of the cloud ecosystem; and provide infrastructure support for operational excellence, ensuring that issue resolution, disaster recovery, and data backup standards are defined.
• Manage and support Azure Gov Cloud operations, including installations, configurations, deployments, integrations, and administration using tools like Azure Kubernetes, Terraform Enterprise, and GitHub.
• Troubleshoot and resolve cloud ecosystem issues, ensuring uptime and performance monitoring.
• Collaborate across teams (server, storage, database, security) to implement and maintain solutions aligned with project plans.
• Apply advanced cloud expertise in areas such as containerization, DevOps, networking, and scripting while adopting new technologies and best practices.
• Design and operate complex solutions following ITIL processes for ticket resolution and stakeholder communication.
Required Skills and Experience:
• Four (4) or more years of experience in the field or in a related area.
• Current Microsoft Azure Gov. Cloud experience
• Monitoring, troubleshooting, scripting, deployment, integration, messaging, automation, orchestration
• Strong written and verbal communication skills, problem solving, time management, teamwork, attention to detail, and customer service.
Nice to have:
• Bachelor's degree in Information Technology or related field.
Compensation:
$125,000-$153,000 per year.
Exact compensation may vary based on several factors, including skills, experience, and education. Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law.
AZCO Data Scientist - IT (Appleton, WI)
Data engineer job in Appleton, WI
The Staff Data Scientist plays a critical role in leveraging data to drive business insights, optimize operations, and support strategic decision-making. You will be responsible for designing and implementing advanced analytical models, developing data pipelines, and applying statistical and machine learning techniques to solve complex business challenges. This position requires a balance of technical expertise, business acumen, and communication skills to translate data findings into actionable recommendations.
+ Develop and apply data solutions for cleansing data to remove errors and ensure consistency.
+ Analyze data to discover information, business value, patterns, and trends that guide the development of asset business solutions.
+ Gather data, find patterns and relationships and create prediction models to evaluate client assets.
+ Conduct research and apply existing data science methods to business line problems.
+ Monitor client assets and perform predictive and root cause analysis to identify adverse trends; choose best fit methods, define algorithms, and validate and deploy models to achieve desired results.
+ Produce reports and visualizations to communicate technical results and interpretation of trends; effectively communicate findings and recommendations to all areas of the business.
+ Collaborate with cross-functional stakeholders to assess needs, provide assistance and resolve problems.
+ Translate business problems into data science solutions.
+ Perform other duties as assigned.
+ Comply with all policies and standards.
**Qualifications**
+ Bachelor's degree in Analytics, Computer Science, Information Systems, Statistics, Math, or a related field from an accredited program and four years of related experience required; experience may be substituted for the degree requirement.
+ Experience in data mining and predictive analytics.
+ Strong problem-solving skills, analytical thinking, attention to detail and hypothesis-driven approach.
+ Excellent verbal/written communication, and the ability to present and explain technical concepts to business audiences.
+ Proficiency with data visualization tools (Power BI, Tableau, or Python libraries).
+ Experience with Azure Machine Learning, Databricks, or similar ML platforms.
+ Expert proficiency in Python with pandas, scikit-learn, and statistical libraries.
+ Advanced SQL skills and experience with large datasets.
+ Experience with predictive modeling, time series analysis, and statistical inference.
+ Knowledge of A/B testing, experimental design, and causal inference.
+ Familiarity with computer vision for image/video analysis.
+ Understanding of NLP techniques for document processing.
+ Experience with optimization algorithms and operations research techniques preferred.
+ Knowledge of machine learning algorithms, feature engineering, and model evaluation.
This job posting will remain open a minimum of 72 hours and on an ongoing basis until filled.
EEO/Disabled/Veterans
**Job** Information Technology
**Primary Location** US-WI-Appleton
**Schedule:** Full-time
**Travel:** Yes, 5% of the time
**Req ID:** 253790
Senior Data Engr
Data engineer job in Neenah, WI
The Sr Data Engineer will oversee the department's data integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data integration and analysis. They will troubleshoot and resolve data-related issues, collaborating with the team and our vendor consultant to identify root causes, and will recommend and deploy data models and solutions for existing data systems. This individual plays a key role in the design and development of data pipelines to modernize our financial back end into our Azure-based solution. The Data Engineer will collaborate with SMEs, Data Analysts, and Business Analysts to achieve this goal, and will serve as a mentor for junior Data Engineers by sharing knowledge and experience.
RESPONSIBILITIES:
Serve as a mentor to less senior Data Engineers on the team
Troubleshoot data discrepancies within the database tables of the IAP data model
Provide data to Data Analyst
Assist other developers with further developing the IAP data model
Develop, maintain, and optimize current data pipelines, including but not limited to data from our policy administration systems, claims system, agent-facing portals, and third-party data assets
Apply development best practices, including query optimization, version control, code reviews, and technical documentation
Develop complex data objects for business analytics using data modeling techniques
Ensure data quality and implement tools and frameworks for automating the identification of data quality issues
Analyze and propose improvements to existing Data Structures and Data Movement Processes
Perform assigned project tasks in a highly interactive team environment while maintaining a productive and positive atmosphere
Stay current with P&C insurance knowledge and industry technology and utilize that knowledge with existing productivity tools, standards, and procedures to contribute to the cost-effective operation of the department and company
Other duties as assigned
QUALIFICATIONS:
ESSENTIAL:
Associate's or bachelor's degree in an IT-related field, or an equivalent combination of education and experience, with business analysis experience in a related field
5+ years of data pipeline/ETL/ELT development experience in a corporate environment
Knowledge of the SDLC, business processes and technical aspects of data pipelines and business intelligence outcomes
Experience with Azure DevOps, Azure Synapse, and Azure Data Lake/SQL
Experience working with XML and JSON data formats
Expert in SQL for data mapping, extracting, and validating source data to enable accurate reporting and data feeds
Experience with large system design and implementation
Collaborative team member with a strong ability to contribute positively to team dynamics and culture (Team and culture fit)
Results oriented, self-motivated, resourceful team player
Superior oral and written communication skills
Demonstrated thought leadership
Able to prioritize and work through multiple tasks
PREFERRED:
P&C Insurance industry experience
Policy, Claims, Billing and Finance experience
Agile methodology
Experience with Duck Creek software solutions
Experience with Azure DevOps (ADO)
At SECURA, we are transforming the insurance experience by putting authenticity at the forefront of everything we do. Our mission is clear: we're making insurance genuine. We recognize that our associates are our greatest assets, and we invest in their well-being and professional growth. We offer opportunities for continuous learning and career advancement, competitive benefits, and a culture that champions work-life balance. Joining SECURA means becoming part of a dynamic team that values each individual's contribution and fosters a collaborative atmosphere. Here, you'll not only find a fulfilling career but also a place where you can make a positive impact every day.
SECURA Insurance strives to provide equal opportunity for all employees and is committed to fostering an inclusive work environment. We welcome applicants from all backgrounds and walks of life.
Data Engineer
Data engineer job in Appleton, WI
Amplifi is the go-to data consultancy for enterprise organizations that want their success to be driven by data. We empower our clients to innovate, grow and succeed by establishing and delivering strategies across all elements of the data value chain. From the governance and management of data through to analytics and automation, our integrated approach to modern data ecosystems delivers measurable results through a combination of expert consultancy and best-in-breed technology. Our company and team members are proud to empower our clients' businesses by providing exceptional solutions and value, as we truly believe their success is our success. We thrive on delivering excellent solutions and overcoming technical and business challenges. As such, we're looking for like-minded individuals to learn, grow, and mentor others as a part of the Amplifi family.
Position Summary
The Data Engineer will be responsible for designing, building, and maintaining scalable, secure data pipelines that drive analytics and support operational data products. The ideal candidate brings a strong foundation in SQL, Python, and modern data warehousing with a deep understanding of Snowflake, Databricks, or Microsoft Fabric, and a solid understanding of cloud-based architectures.
What You Will Get To Do
Design, develop, and optimize robust ETL/ELT pipelines to ingest, transform, and expose data across multiple systems.
Build and maintain data models and warehouse layers, enabling high-performance analytics and reporting.
Collaborate with analytics, product, and engineering teams to understand data needs and deliver well-structured data solutions.
Write clean, efficient, and testable code in SQL and Python to support automation, data quality, and transformation logic.
Support deployment and orchestration workflows, using Azure Data Factory, dbt, or similar tools.
Work across multi-cloud environments (Azure preferred; AWS and GCP optional) to integrate data sources and manage cloud-native components.
Contribute to CI/CD practices and data pipeline observability (monitoring, logging, alerting).
Ensure data governance, security, and compliance in all engineering activities.
Support ad hoc data science and machine learning workflows within Dataiku.
What You Bring to the Team
4+ years of experience in a data engineering or related software engineering role.
Proficiency in SQL and Python for data manipulation, transformation, and scripting.
Strong experience working with Snowflake and MSSQL Server.
Practical knowledge of working with cloud data platforms, especially Microsoft Azure.
Experience with modern data modeling and warehouse optimization techniques.
Experience with Databricks, Azure Data Factory, or dbt preferred.
Exposure to Microsoft Fabric components like OneLake, Pipelines, or Direct Lake.
Familiarity with cloud services across AWS, GCP, or hybrid cloud environments.
Understanding of or curiosity about Dataiku for data science and advanced analytics collaboration.
Ability to work independently and with a team in a hybrid/remote environment.
Location
Wisconsin is preferred.
Travel
Ability to travel up to 10% of the time.
Benefits & Compensation
Amplifi offers excellent compensation and benefits including, but not limited to, health, dental, 401(k) program, employee assistance program, short and long-term disability, life insurance, accidental death and dismemberment (AD&D), PTO program, flex work schedules and paid holidays.
Equal Opportunity Employer
Amplifi is proud to be an equal opportunity employer. We do not discriminate against applicants based on race, religion, disability, medical condition, national origin, gender, sexual orientation, marital status, gender identity, pregnancy, childbirth, age, veteran status or other legally protected characteristics.
Data Architect II
Data engineer job in Appleton, WI
Join our Corporate Data and Analytics team as we continue to expand our data capabilities! The Data Architect II role will be responsible to design, manage, and optimize the organization's data infrastructure for a specific data domain. This individual will ensure that data is structured, secure, and accessible to support business operations and decision-making, will mentor junior data architects and will provide cross functional governance leadership.
The ideal/preferred location for this position is in Appleton, WI. Will consider candidates in other locations based on relevancy of related experience.
JOB RESPONSIBILITIES
Essential Job Responsibilities:
* Create and maintain conceptual, logical, and physical data models. Document data flows, lineage, and dependencies for assigned data domains.
* Collaborate with data engineers, data analysts, and business partners to align the data model with business requirements.
* Capture metadata associated with new data projects. Manage Metadata Repositories. Coordinate with business partners to maintain data catalog information for assigned data domains. Implement metadata and lineage tracking within domain.
* Manage and monitor data quality assessments. Communicate and resolve data quality issues with business stewards.
* Enforce governance rules and policies for assigned data domain. Leverage master data management tools and canonical data practices to govern data.
* Define schemas, transformations, and integration rules. Plan and design data integration methods. Support data integration activities.
* Enforce security protocols and policies on assigned data domains. Measure and monitor access to sensitive and secure data.
* Work closely with data engineers, analysts, and business stakeholders to align data architecture with organizational goals.
* Own one or more data domains end-to-end, including integration rules, data catalog upkeep, and governance enforcement.
* Support regulatory compliance initiatives (GDPR, CCPA, HIPAA, SOX depending on company domain).
* Introduce cloud-first design patterns and domain-oriented governance practices.
Additional Job Responsibilities:
* Live our values of High Performance, Caring Relationships, Strategic Foresight, and Enterprising Spirit
* Find A Better Way by championing continuous improvement and quality control efforts to identify opportunities to innovate and improve efficiency, accuracy, and standardization.
* Continuously learn and develop self professionally.
* Support corporate efforts for safety, government compliance, and all other company policies & procedures.
* Perform other related duties as required and assigned.
QUALIFICATIONS
Required:
* Bachelor's degree in Computer Science, Information Systems, or a related field
* 4+ years in a data engineering or BI Developer role, including 2 years experience in data modeling
* Experience with data analysis tools for data research including languages such as SQL or Python and exploration tools such as Power BI, Tableau, or Looker.
* Experience using cloud data platforms such as AWS, Azure, GCP, Snowflake, and Databricks.
* Ability to translate complex business needs into technical architectural solutions.
* Strong documentation and communication skills
* Excellent analytical and problem-solving skills
Preferred:
* Awareness of data governance frameworks and tools
* Familiarity with compliance regulations
* Familiarity with database management tools and business intelligence tools
DIVISION:
Corporate
U.S. Venture requires that a team member have and maintain authorization to work in the country in which the role is based. In general, U.S. Venture does not sponsor candidates for nonimmigrant visas or permanent residency unless based on business need.
U.S. Venture will not accept unsolicited resumes from recruiters or employment agencies. In the absence of an executed recruitment Master Service Agreement, there will be no obligation to any referral compensation or recruiter fee. In the event a recruiter or agency submits a resume or candidate without an agreement, U.S. Venture shall reserve the right to pursue and hire those candidate(s) without any financial obligation to the recruiter or agency. Any unsolicited resumes, including those submitted to hiring managers, shall be deemed the property of U.S. Venture.
U.S. Venture, Inc. is an equal opportunity employer that is committed to inclusion and diversity. We ensure equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, marital status, age, national origin, disability, veteran status, genetic information, or other protected characteristic. If you need assistance or an accommodation due to a disability, you may call Human Resources at **************.
Digital Technology Data Scientist Lead
Data engineer job in Oshkosh, WI
**At Oshkosh, we build, serve and protect people and communities around the world by designing and manufacturing some of the toughest specialty trucks and access equipment. We employ over 18,000 team members all united by a common purpose. Our engineering and product innovation helps keep soldiers and firefighters safe, is critical in building and keeping communities clean, and helps people do their jobs every day.**
**SUMMARY:**
As a Data Scientist, your primary responsibilities will be to analyze and interpret datasets by using statistical techniques, machine learning, and/or programming skills in order to extract insights and build models which solve business problems.
**YOUR IMPACT** **:**
+ Apply data science tools such as Azure Databricks, Power BI, Microsoft Office (including Excel pivot tables), Spark, Python, R, and C.
+ Undertake the processing of multiple datasets, including structured and unstructured, to analyze large amounts of information to discover trends and patterns.
+ Prepare and concisely deliver analysis results in visual and written forms that communicate data insights to both technical and non-technical audiences.
+ Collaborate with cross-functional teams (e.g. data analysts, data engineers, architects, business stakeholders) to understand data needs for complex business requirements.
+ Build highly complex predictive models and machine-learning algorithms; execute integration into existing systems or creation of new products.
+ Direct some data science assignments, projects, visualization tasks, data quality improvements, and troubleshooting of data incidents, including the resolution of root causes.
+ Lead efforts to resolve and document solutions to track and manage incidents, changes, problems, tasks, and demands.
+ Coach and mentor other team members on new technologies and best practices across data science and business intelligence.
+ Possess an advanced understanding of business processes in at least one area of the business, and an understanding of business processes in multiple areas.
+ Actively support the advancement of the strategic roadmap for data science.
**MINIMUM QUALIFICATIONS:**
+ Bachelor's degree with five (5) or more years of experience in the field or in a related area.
**STANDOUT QUALIFICATIONS:**
+ Master's or doctoral degree
+ Expertise in Power Platforms
+ Familiarity with LLMs (open source or closed)
+ Experience in front-end web app development (Flask apps, Gradio, etc.)
+ Familiarity with RAG architecture
**Pay Range:**
$115,600.00 - $196,400.00
The above pay range reflects the minimum and maximum target pay for the position across all U.S. locations. Within this range, individual pay is determined by various factors, including the scope and responsibilities of the role, the candidate's experience, education and skills, as well as the equity of pay among team members in similar positions. Beyond offering a competitive total rewards package, we prioritize a people-first culture and offer various opportunities to support team member growth and success.
Oshkosh is committed to working with and offering reasonable accommodation to job applicants with disabilities. If you need assistance or an accommodation due to disability for any part of the employment process, please contact us at ******************************************.
Oshkosh Corporation is a merit-based Equal Opportunity Employer. Job opportunities are open for application to all qualified individuals and selection decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, or other protected characteristic. To the extent that information is provided or collected regarding categories as provided by law it will in no way affect the decision regarding an employment application.
Oshkosh Corporation will not discharge or in any manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with Oshkosh Corporation's legal duty to furnish information.
Certain positions with Oshkosh Corporation require access to controlled goods and technologies subject to the International Traffic in Arms Regulations or the Export Administration Regulations. Applicants for these positions may need to be "U.S. Persons," as defined in these regulations. Generally, a "U.S. Person" is a U.S. citizen, lawful permanent resident, or an individual who has been admitted as a refugee or granted asylum.
Data Scientist - Strategic Analytics
Data engineer job in Sheboygan, WI
Acuity is seeking a Data Scientist - Strategic Analytics to work with business leaders to inform company strategies and decision-making from an objective viewpoint, drawing on the team's deep business knowledge, treating data as an asset, and applying statistical and advanced analytical techniques as appropriate. The role uses advanced statistical and analytical techniques to address highly complex and sophisticated business questions. The data scientist will apply and integrate advanced statistical, mathematical, data mining, and business analysis skills and techniques to discover new behavioral insights, predict outcomes, prescribe decision options, and advise on turning new insights into actionable opportunities.
Internal deadline to apply: December 22nd, 2025
ESSENTIAL FUNCTIONS:
* Develop solutions to support analytic insights and visualization using mathematical models, algorithms, machine learning techniques, and robust analytics.
* Partner with business clients and technical leaders to spearhead the development of standards for appropriate statistical methodology, study design, power and sample size estimation, and data analysis plans for behavioral analytics.
* Provide data driven decision support for key initiatives of company strategy and measurement related to client, claims, sales, and campaign analyses to understand both growth and economic impact.
* Determine requisite data elements and partner with data engineers to design integrated datasets for analytical research purposes.
* Work closely with machine learning engineers and business partners to develop and build behavioral data science and analytics products.
* Effectively present analytical results and derived recommendations for action to senior leaders, peers, and product team members.
* Regular and predictable attendance.
* Performs other duties as assigned.
EDUCATION:
Bachelor's degree in data science, statistics, math, computer science, economics, or related field. Advanced graduate level degree in a quantitative discipline (statistics, applied mathematics, computer science, econometrics, or a related field) is preferred.
EXPERIENCE:
A minimum of three years relevant experience to include research and data analysis, experiment design and measurement, or application of statistical research techniques.
OTHER QUALIFICATIONS:
* Expertise in one or more development or statistical analysis tools such as R, Python, SAS, SQL, or SPSS. Tableau experience is a plus.
* Proven excellence in research, quantitative analysis, problem solving, and analytical working techniques.
* Statistical knowledge and intuition
* Strong aptitude and desire for learning new analytical and visualization tools, modeling, and quantitative techniques. Initiative to independently design and develop own deliverables while still being a team player.
* Demonstrates ability to deliver results and recommendations in written, verbal and presentation form at an appropriate level for a business audience.
* Acuity does not sponsor applicants for U.S. work authorization.
This job is classified as exempt.
We are an Equal Employment Opportunity employer. Applicants and employees are considered for positions and are evaluated without regard to mental or physical disability, race, color, religion, gender, national origin, age, genetic information, military or veteran status, sexual orientation, marital status or any other protected Federal, State/Province or Local status unrelated to the performance of the work involved.
If you have a disability and require reasonable accommodations to apply or during the interview process, please contact our Talent Acquisition team at ******************. Acuity is dedicated to offering reasonable accommodations during our recruitment process for qualified individuals.
Data Architect
Data engineer job in Appleton, WI
Seeking a Data Architect to help a growing data team transform how the company operates with data. This person will own data architecture for smaller projects, design models end-to-end, and collaborate with business stakeholders to define sources, business logic, and governance standards.
Responsibilities:
Design and implement data models across multiple domains
Define source systems, tables, and business logic for unified models
Partner with IT and business teams to ensure governed, reliable data
Support cloud adoption (Azure/GCP) while managing on-prem data
Contribute to data governance and architecture best practices
Requirements:
4+ years in data roles (engineering, BI, analytics)
2+ years in data architecture
Strong data modeling skills
Business-facing communication experience
Familiarity with Azure or GCP
Understanding of data governance principles
Skills
Data modeling, data architecture, GCP, Azure, data governance, SQL, Power BI, Python, database management, compliance regulations, warehouse management systems
Top Skills Details
Data modeling, data architecture, GCP, Azure, data governance, SQL, Power BI
Additional Skills & Qualifications
Strong analytical and problem-solving skills
Ability to work independently and collaboratively in a team environment
Comfortable with hybrid work model and occasional travel
Experience with relational databases and SQL
Exposure to BI tools and ETL processes
Awareness of data governance frameworks and tools
Familiarity with compliance regulations
Familiarity with database management tools and business intelligence tools
Job Type & Location
This is a Permanent position based out of Appleton, WI.
Pay and Benefits
The pay range for this position is $105,000.00 - $130,000.00/yr.
1. Reimbursement programs for wellness (gym memberships, group classes)
2. Health, Vision, and Dental coverage starting day 1
3. 7% company match on 401k
4. PTO, holidays, sick days, volunteer time off, caregiver leave
5. Short- and long-term disability
Workplace Type
This is a fully onsite position in Appleton, WI.
Application Deadline
This position is anticipated to close on Dec 25, 2025.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
Data Warehouse Engineer I
Data engineer job in Menasha, WI
The Data Warehouse Engineer I is part of a team dedicated to supporting Network Health's Enterprise Data Warehouse. This individual will perform development, analysis, testing, debugging, documentation, implementation, and maintenance of interfaces to support the Enterprise Data Warehouse and related applications. They will consult with other technical resources and key departmental users on solutions and best practices. They will monitor performance and effectiveness of the data warehouse and recommend changes as appropriate.
Location: Candidates must reside in the state of Wisconsin for consideration. This position is eligible to work at your home office (reliable internet is required), at our office in Brookfield or Menasha, or a combination of both in our hybrid workplace model.
Hours: 1.0 FTE, 40 hours per week, 8am-5pm Monday through Friday, may be required to work later hours when system changes are being implemented or problems arise
Check out our 2024 Community Report to learn a little more about the difference our employees make in the communities we live and work in. As an employee, you will have the opportunity to work hard and have fun while getting paid to volunteer in your local neighborhood. You, too, can be part of the team and make a difference. Apply to this position to learn more about our team.
Job Responsibilities:
Perform end-to-end delivery of data interfaces in various stages of the Enterprise Data Warehouse in accordance with professional standards and industry best practices
Perform all phases of the development lifecycle including solution design, creation of acceptance criteria, implementation, technical documentation, development and execution of test cases, performance monitoring, troubleshooting, data analysis, and profiling
Consult with Developers, Engineers, DBAs, key departmental stakeholders, data governance and leadership on technical solutions and best practice
Monitor and audit the Enterprise Data Warehouse for effectiveness, throughput, and responsiveness. Recommend changes as appropriate. Troubleshoot customer complaints related to system performance issues
Maintain effective communication with customers from all departments for system development, implementation, and problem resolution
Required to take on-call shifts to assist in the resolution of technical problems
Other duties and responsibilities as assigned
Job Requirements:
Requires Associate Degree in Computer Science, Business, or related technical field; equivalent years of experience may be substituted
Minimum of 1 year experience in program interfacing required
Experience with T-SQL development, SSIS development, and database troubleshooting skills required
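Much of the interface development this role describes follows a staging-then-upsert pattern: land source data in a staging table, then merge it into the warehouse table. As a hedged illustration — sketched in Python with sqlite3 rather than T-SQL/SSIS, and with invented table names — the core idea looks like this (T-SQL would express the same step with a MERGE statement):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_member (member_id INTEGER, plan TEXT);           -- staging
    CREATE TABLE dw_member  (member_id INTEGER PRIMARY KEY, plan TEXT);
""")
conn.execute("INSERT INTO dw_member VALUES (1, 'Bronze')")
conn.executemany("INSERT INTO stg_member VALUES (?, ?)",
                 [(1, "Silver"), (2, "Gold")])  # one update, one new row

# Upsert from staging into the warehouse table. SQLite spells the
# MERGE idea as INSERT ... ON CONFLICT (the WHERE true is required
# by SQLite's parser when the source is a SELECT).
conn.execute("""
    INSERT INTO dw_member (member_id, plan)
    SELECT member_id, plan FROM stg_member WHERE true
    ON CONFLICT(member_id) DO UPDATE SET plan = excluded.plan
""")
rows = conn.execute("SELECT * FROM dw_member ORDER BY member_id").fetchall()
print(rows)  # [(1, 'Silver'), (2, 'Gold')]
```

Existing member 1 is updated in place and member 2 is inserted, which is the behavior an SSIS interface load would typically be built to guarantee.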
Network Health is an Equal Opportunity Employer
Data Scientist
Data engineer job in Kohler, WI
_Work Mode: Onsite_
**Opportunity**
To serve as a data-driven innovator and strategic enabler, transforming business decisions through advanced analytics, machine learning, and AI. This role combines statistical and technical expertise with business insight to unlock patterns, forecast trends, and deliver actionable intelligence that accelerates growth and operational excellence. The Data Scientist will champion the use of data as a competitive advantage, designing robust analytical models, ensuring data integrity, and driving scalable solutions that align with enterprise goals.
**Key Responsibilities**
**Analytical Modeling and AI Development**
+ Design experiments, test hypotheses, and build predictive models to uncover actionable insights.
+ Develop and refine algorithms to identify patterns and trends in large, complex datasets.
+ Validate findings through rigorous statistical methods and iterative testing.
+ Continuously monitor and optimize models for accuracy, scalability, and business impact.
+ Prototype and deploy AI-driven solutions that align with strategic business objectives and deliver measurable value.
**Data Exploration & Architecture**
+ Independently collate, analyze, and interpret large, complex datasets to generate actionable insights.
+ Consult on data architecture and governance to ensure quality, accuracy, and readiness for advanced analytics.
+ Identify relevant internal and external data sources and ensure compliance with data standards.
+ Drive improvements in data pipelines and integration to enable faster, more reliable analytics.
**Decision Intelligence & Transformation**
+ Partner with stakeholders to translate business requirements into analytical solutions that drive process and product improvements.
+ Support creation of scalable AI/ML solutions aligned with strategic goals.
+ Model and frame business scenarios that influence critical decisions and outcomes.
+ Provide clear metrics and KPIs to demonstrate the impact of analytics initiatives on business performance.
+ Identify opportunities for automation and optimization to accelerate enterprise-wide transformation.
**Mentorship & Evangelism**
+ Mentor Functional teams globally on AI tools, prompt engineering, and experimentation. Lead workshops, proofs-of-concept, and training sessions to build momentum and literacy across cultures and time zones.
+ Share personal and professional learnings to foster a culture of innovation and continuous improvement.
+ Advocate for responsible AI use and digital fluency across the enterprise.
**Skills/Requirements**
+ Bachelor's degree in Mathematics, Statistics, Computer Science, or related field (Master's preferred).
+ 1+ years of relevant quantitative and qualitative research and analytics experience.
+ Demonstrated experience in using, building, or configuring AI tools, bots, or agents, especially by those who actively explore and experiment with emerging technologies, including in personal contexts. Strong familiarity with large language models (LLMs), prompt engineering, and commonly used AI tools.
**Technical Fluency**
+ Proficiency in Python or R
+ Strong skills in SQL
+ Experience with cloud platforms (Azure and Databricks preferred)
+ Familiarity with prompt engineering and agent-based tools (Copilot Studio, Azure AI Foundry, or similar)
+ Knowledge of data visualization tools (Power BI, Tableau, or similar)
**Core Competencies**
+ Creative problem solving: Works with full competence to find practical solutions for unexpected stakeholder problems. Typically works without supervision.
+ Brings a cross-functional collaborative mindset across business and technical teams; thrives in cross-disciplinary environments to align AI efforts across groups.
+ Communication and storytelling: Tailors communication content and style to the needs of others. Pays attention to others' input and perspectives, asks questions, and summarizes to confirm understanding.
+ Nimble learning: Learns through experimentation when tackling new problems, using both successes and failures as learning fodder. Swiftly incorporates new concepts and principles into own expertise; skillfully uses these fresh insights to solve problems.
+ Cultivates innovation: Creates new and better ways for the organization to be successful. Offers creative ideas, finds unique connections between previously unrelated elements. Builds upon and strengthens new solutions in a positive and collaborative manner.
#LI-DNI
**_Applicants must be authorized to work in the US without requiring sponsorship now or in the future._**
_We believe in supporting you from the moment you join us, which is why Kohler offers day 1 benefits. This means you'll have access to your applicable benefit programs from your first day on the job, with no waiting period. The salary range for this position is $64,750 - $98,350. The specific salary offered to a candidate may be influenced by a variety of factors including the candidate's experience, their education, and the work location._
**Why Choose Kohler?**
We empower each associate to #BecomeMoreAtKohler with a competitive total rewards package to support your health and wellbeing, access to career growth and development opportunities, a diverse and inclusive workplace, and a strong culture of innovation. With more than 30,000 bold leaders across the globe, we're driving meaningful change in our mission to help people live gracious, healthy, and sustainable lives.
**About Us**
It is Kohler's policy to recruit, hire, and promote qualified applicants without regard to race, creed, religion, age, sex, sexual orientation, gender identity or expression, marital status, national origin, disability or status as a protected veteran. If, as an individual with a disability, you need reasonable accommodation during the recruitment process, please contact ********************* . Kohler Co. is an equal opportunity/affirmative action employer.
Senior Data Engineer
Data engineer job in Ripon, WI
The Enterprise Analytics Center of Excellence (CoE) is building the next-generation analytics stack centered on Snowflake (AWS) and Sigma. We are looking for a Sr Data Engineer who will play a critical role in shaping and operating this modern architecture. This role is not just about maintaining pipelines. It's about helping lead the way in adopting cutting-edge analytics practices, scaling governed data models, and exploring new trends in the industry.
Responsibilities
Ingestion & Transformation
Design, build, and maintain scalable data pipelines using Snowflake OpenFlow and other Snowflake-native services.
Migrate existing Dataiku pipelines to Snowflake, reducing complexity and cost.
Continuously improve workflows for performance, reliability, and efficiency.
Semantic Modeling & Views
Create and maintain Snowflake semantic views that serve as the governed data layer for analytics.
Apply dimensional modeling best practices to ensure consistency and reusability across business domains.
Enable role-based semantic views for Finance, Operations, and other enterprise functions.
Governance & Security
Implement and maintain RBAC frameworks using Active Directory roles/groups.
Support metadata management and cataloging surfaced directly in Sigma.
Ensure pipelines and models comply with enterprise data governance standards.
Analytics Enablement
Deliver clean, governed data sets for Sigma dashboards, Ask Sigma (NLP), Write-Back, and embedded analytics.
Monitor and tune query performance to optimize cost and speed.
Provide technical depth to ensure analytics can scale securely and efficiently across the enterprise.
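Query tuning in Snowflake has its own tooling (query profiles, clustering keys, warehouse sizing), but the underlying habit — inspect the plan, add an access path, confirm the plan changed — can be sketched generically. A minimal sqlite3 illustration with an invented table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "east" if i % 2 else "west", float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite intends to execute a statement;
    # column 3 of each row is the human-readable detail string.
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE region = 'east'"
before = plan(query)            # full table scan
conn.execute("CREATE INDEX idx_orders_region ON orders(region)")
after = plan(query)             # indexed search

print("SCAN" in before, "idx_orders_region" in after)  # True True
```

In Snowflake the equivalent loop is reading the query profile and adjusting clustering or pruning rather than adding a B-tree index, but the discipline of measuring before and after a change is the same.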
Innovation & Continuous Improvement
Stay on top of the latest trends in data engineering and analytics (semantic layers, LLM-powered BI, write-back, real-time pipelines, etc.).
Bring new ideas and emerging technologies to ALS for evaluation and potential adoption.
Automate testing, monitoring, and deployment processes to keep the stack modern and efficient.
Qualifications
Education and Experience:
Bachelor's degree in Computer Science or a related field; equivalent experience may be considered
Skills and Abilities:
Snowflake expertise: SQL, OpenFlow, Tasks/Streams, performance tuning, semantic views.
Cloud experience: Strong hands-on with AWS (S3, IAM, Glue, Lambda a plus).
Data modeling: Dimensional modeling, facts/dimensions, semantic design.
ETL/ELT engineering: Proven ability to build and optimize large-scale pipelines.
Governance & Security: AD/LDAP integration, RBAC, cataloging.
Programming: SQL and Python
Curiosity and enthusiasm for cutting-edge analytics tools and approaches.
Preferred Skills:
Experience migrating Power BI models/reports into Snowflake + Sigma.
Familiarity with Sigma features: Ask Sigma, Write-Back, Embedded Analytics.
Exposure to Dataiku (to support migration away from it).
Interest in AI/ML-powered analytics and NLP-based BI interfaces.
Travel:
Up to 25% travel
Standard and Physical Requirements:
Position involves sitting for long periods, standing, manual dexterity, stooping, bending, and minimal lifting
Alliance Team Members Demonstrate DRIVE:
Dedicated: Follows through on commitments. Strong say/do.
Respectful: Acts with integrity and values diverse perspectives.
Innovative: Always looking for a better way; leads change.
Versatile: Adapts quickly to changing circumstances. Demonstrates agility.
EEO
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
ID: 2025-10436
Position Type: Full-Time
Lead Data Steward
Data engineer job in Menasha, WI
You've discovered something special. A company that cares. Cares about leading the way in construction, engineering, manufacturing and renewable energy. Cares about redefining how energy is designed, applied and consumed. Cares about thoughtfully growing to meet market demands. And, as "one of the Healthiest 100 Workplaces in America," is focused on the mind/body/soul of team members through our Culture of Care.
We are seeking a highly motivated and detail-oriented FTI Data Steward Leader to establish and champion the data governance within our organization. As data governance is a new initiative for the company, this role will be pivotal in laying the foundation, defining standards, and ensuring that data is treated as a critical enterprise asset. The FTI Data Steward Leader will help lead a community of business and technical data stewards across domains, promote best practices, and work closely with data governance leadership to develop policies, data quality standards, and stewardship processes. This role requires a strategic thinker with strong leadership skills and a deep understanding of data management principles.
MINIMUM REQUIREMENTS
Education: Bachelor's degree in Information Management, Data Science, Business, or a related field, or equivalent experience in lieu of a degree.
Experience: 5+ years of experience in data management, data governance, or related disciplines, with demonstrated experience leading enterprise-wide data stewardship or governance initiatives. Strong knowledge of data quality principles, metadata management, and master data management (MDM). Familiarity with data governance frameworks (e.g., DAMA-DMBOK, DCAM) and tools (e.g., Informatica).
Travel: 0-10%
Work Schedule: This position works between the hours of 7 AM and 5 PM, Monday through Friday. However, work may be performed at any time on any day of the week to meet business needs.
KEY RESPONSIBILITIES
Leadership and Strategy:
Serves as the primary representative and advocate for the community of master data leads and data coordinators.
Leads master data management (MDM) strategies and policies to support FTI.
Collaborates with data governance leadership to define the enterprise data stewardship framework, standards, and playbooks.
Creates and implements data governance policies, procedures, and maturity roadmaps.
Leads the formation of a data governance council and facilitates regular working sessions.
Stewardship and Quality
Ensures consistent application of data definitions, metadata standards, and classification across data domains.
Leads the development of data quality standards with data stewards and helps resolve data quality issues with the appropriate data stewards.
Collaboration and Stakeholder Engagement:
Partners with business units, BT, and Data Analytics teams to identify data steward leaders and data ownership roles.
Facilitates communication between business and technical stakeholders to resolve data issues and improve data understanding.
Acts as a liaison between the data governance and operational teams to ensure stewardship initiatives are aligned with business needs.
Metadata and Cataloging:
Works with data governance and Data Analytics team to maintain an enterprise data catalog.
Supports documentation of business glossaries, data dictionaries, and lineage across key data assets.
Training and Change Management:
Promotes data literacy and fosters a data-centric culture across the organization.
Leads change management efforts related to data governance adoption and tool implementation.
Performs other related duties as required and assigned.
The job description and responsibilities described are intended to provide guidelines for job expectations and the employee's ability to perform the position described. It is not intended to be construed as an exhaustive list of all functions, responsibilities, skills, and abilities. Additional functions and requirements may be assigned by supervisors as deemed appropriate.
How Does FTI Give YOU the Chance to Thrive?
If you're energized by new challenges, FTI provides you with many opportunities. Joining FTI opens doors to redefine what's possible for your future.
Once you're a team member, you're supported and provided with the knowledge and resources to achieve your career goals with FTI. You're officially in the driver's seat of your career, and FTI's career development and continued education programs give you opportunities to position yourself for success.
FTI is a “merit to the core” organization. We recognize and reward top performers, offering competitive, merit-based compensation, career path development and a flexible and robust benefits package.
Benefits are the Game-Changer
We provide industry-leading benefits as an investment in the lives of team members and their families. You're invited to review the full list of FTI benefits available to regular/full-time team members. Start here. Grow here. Succeed here. If you're ready to learn more about your career with FTI, apply today!
Faith Technologies, Inc. is an Equal Opportunity Employer - veterans/disabled.
Software Engineer
Data engineer job in West Bend, WI
Recognized as a Milwaukee Journal Sentinel Top Workplace for 14 consecutive years, including three years of being honored as number one! Join us at West Bend, where we believe that our associates are our greatest asset. We hire talented individuals who are conscientious, dedicated, customer focused, and able to build lasting relationships. We create and maintain an environment where you feel a sense of belonging and appreciation. Your diversity of thought, experience, and knowledge are valued. We're committed to fostering a welcoming culture, offering you opportunities for meaningful work and professional growth. More than a workplace, we celebrate our successes and take pride in serving our communities.
Job Summary
The Enterprise applications team is looking for a Software Engineer to join the team. Though this individual will be interacting with full stack technologies, there is an emphasis on back-end technologies (specifically .NET). This team is building high-visibility, enterprise capabilities to underpin our diverse software landscape. In this role, you will be a key contributor to the enterprise technology platform and help drive business value with emerging technologies. Further, you will embrace craftsmanship and innovation to develop a best-of-breed experience for our engineering community. Come join us and be part of a growing team!
Work Location
This is not a remote position. Candidates who are located within 50 miles of a West Bend office location will work a hybrid schedule (at least 3 days/week) for collaboration days, team meetings, or other in-person events. The position can be based in West Bend or Madison.
The internal deadline to apply is 12/12/2025. External applications will be accepted on a rolling basis while the position remains open.
Responsibilities & Qualifications
Requirements:
* .NET & REST expertise
* Microservices experience
* Familiarity with API specification and practices
* Automated unit, integration, and regression testing
* CI/CD pipelines to build, test, deploy and release code to the cloud
* Continuous advocacy of technical craftsmanship and preferred/best practices
* Review and oversight of peer proposals and solutions
Practical Experience With One or More of the Following:
* Cloud experience (Microsoft Azure preferred)
* Event-driven design (Azure Service Bus / NServiceBus)
* Python and FastAPI
* Prompt Engineering
* React frontend development. Experience with micro-frontends.
#LI-LW1
#LI-Hybrid
Salary Statement
The salary range for this position is $94,205 - $117,756.
The actual base pay offered to the successful candidate will be based on multiple factors, including but not limited to job-related knowledge/skills, experience, business needs, geographical location, and internal equity. Compensation decisions are made by West Bend and are dependent upon the facts and circumstances of each position and candidate.
Benefits
West Bend offers a comprehensive benefit plan including but not limited to:
* Medical & Prescription Insurance
* Health Savings Account
* Dental Insurance
* Vision Insurance
* Short and Long Term Disability
* Flexible Spending Accounts
* Life and Accidental Death & Disability
* Accident and Critical Illness Insurance
* Employee Assistance Program
* 401(k) Plan with Company Match
* Pet Insurance
* Paid Time Off. Standard first year PTO is 17 days, pro-rated based on month of hire. Enhanced PTO may be available for experienced candidates
* Bonus eligible based on performance
* West Bend will comply with any applicable state and local laws regarding employee leave benefits, including, but not limited to providing time off pursuant to the Colorado Healthy Families and Workplaces Act for Colorado employees, in accordance with its plans and policies.
EEO
West Bend provides equal employment opportunities to all associates and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, and promotion.
Entry Level Engineer
Data engineer job in Neenah, WI
United Plastic Fabricating is the industry leader in the manufacture of plastic water tanks for the fire industry. In addition, we design and manufacture a variety of products for the industrial and transportation markets.
As an entry level engineer you could work in any of these disciplines and/or rotate through each for training:
Sales Engineer, Quality Engineer, Manufacturing Engineer, Design Engineer.
Assists in the preparation of preliminary outlines, layouts, and sketches for sales quotes. Assists customers with questions such as design feasibility and volume calculations. Evaluates new orders for manufacturability/warranty concerns. Receives, examines, and enters sales orders to verify completeness and accuracy of data and manufacturability. Reviews and resolves any previously reported issues prior to releasing.
Provides engineering and quality assurance support to ensure products are produced to meet customer requirements and expectations. Works in conjunction with other business and engineering disciplines, using a cross-functional approach to ensure new, improved, and current products and processes comply with applicable quality management systems and standards.
Works cross-functionally to capture and communicate information for possible design and manufacturability concerns or opportunities.
Improve manufacturing efficiency by analyzing and planning work flow, space requirements and equipment layout. Develop fixtures, and automate, semi-automate processes.
Develop manufacturing processes by studying product requirements, researching, designing, modifying and testing manufacturing methods, equipment and material handling conveyance.
Assure product and process quality by designing testing methods, process capabilities, establishing standards and confirming manufacturing processes.
Perform data gathering and historical trending for continuous improvement.
Requirements
BS Degree in Engineering
CAD and drafting skills.
Excellent mechanical aptitude necessary
Excellent benefits including Medical, Life, Dental, Disability insurance, 401K with employer match, student loan assistance, and gainsharing!
Visit UPF's website at ********************* to view our career page and submit your resume.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
No relocation allowance for this position.
100% in person role
This Employer uses E-Verify
Building Lead - Appleton Weekends-23960 (Fox Valley Operations Non Medical)
Data engineer job in Appleton, WI
Operating in a $1 billion-plus industry, KleenMark is Wisconsin's largest independent commercial cleaning and supply company. Built on 60 years of experience, KleenMark uses proven processes and the industry's best-trained teams to deliver unmatched service. With expertise in healthcare, commercial, life sciences, manufacturing, and education, KleenMark's 900-plus technicians clean more than 30 million square feet daily. We are a family-owned and -run business that lives out our values of Trust, Teamwork, and Results.
We have excellent opportunities for you to join our team!
Job Skills / Requirements
Job details:
Schedule: Saturday & Sunday
Hours: 4pm-8pm
Pay: $19.50
Additional Details
Building Leads are responsible for maintaining the cleanliness of the building in which they work by performing various cleaning duties. Duties and hours may vary depending upon the size of the building and the number of teammates they may be working with. A cleaner may be responsible for any or all of the following tasks. Tasks may also change throughout a cleaner's employment.
ESSENTIAL JOB FUNCTIONS
Note: This is not an all-inclusive list. Additional duties may be assigned.
Restrooms | Cleans and disinfects sinks, countertops, toilets, mirrors, floors, etc. Replenishes bathroom supplies. Polishes metalwork, such as fixtures and fittings.
Floors | Sweeps, mops, and vacuums floors using brooms, mops, and vacuum cleaners. Other floor work may be required, such as scrubbing, waxing, and polishing floors.
Break rooms /Kitchenettes | Cleans and disinfects sinks, countertops, tables, chairs, refrigerators, etc. Replenishes break room supplies.
Dust | Dusts furniture, equipment, partitions, etc.
Trash | Empties wastebaskets and recyclables and transports to disposal area.
Other Duties | Cleans rugs, carpets, and upholstered furniture, using vacuum cleaner (hip or backpack). Washes walls and woodwork. Washes windows, door panels, partitions, sills, etc.
EXPECTATIONS
Reports to work on time each day and works extra hours when needed.
Employee must comply with proper safety policies and procedures as required (i.e., when using cleaning chemicals, reporting incidents, etc.).
Provides excellent level of customer service to both internal and external customers by maintaining a positive attitude.
The employee must be able to assess the neatness, accuracy, and thoroughness of the work assigned.
Additional Information / Benefits
Medical, Vision & Dental Insurance for qualifying positions.
Personal Time Off (PTO) for qualifying positions.
6 Paid federal holidays after 90 days for qualifying positions.
Employee Referral Bonus
Instant Pay Access through DailyPay.
Employee of the Month, Quarter, and Year employee recognition program.
Growth within the company.
Great work/life balance
Safety First:
Personal protective equipment provided or required
Safety Monthly Trainings for all employees.
Sanitizing, disinfecting, or cleaning procedures in place
Employees working in medical facilities are required to wear a mask and gloves during the entirety of their shift. We provide all necessary PPE.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Affirmative Action/EEO statement Kleenmark is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
This job reports to Krissia Henriquez.
This is a Part-Time position 2nd Shift, Weekends.
Number of Openings for this position: 1
Software Engineer, Platform - Green Bay, USA
Data engineer job in Green Bay, WI
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its 2025 Design Award winner for Inclusivity.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. The team includes frontend and backend engineers and AI research scientists who have come from Amazon, Microsoft, and Google, from leading PhD programs like Stanford's, and from high-growth startups like Stripe, Vercel, and Bolt, as well as many founders of their own companies.
Overview
The responsibilities of our Platform team include building and maintaining all backend services, including, but not limited to, payments, analytics, subscriptions, new products, text to speech, and external APIs.
This is a key role and ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and quickly. Work ethic, solid communication skills, and an obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Design, develop, and maintain robust APIs, including the public TTS API and internal APIs such as Payment, Subscription, Auth, and Consumption Tracking, ensuring they meet business and scalability requirements
Oversee the full backend API landscape, enhancing and optimizing for performance and maintainability
Collaborate on B2B solutions, focusing on customization and integration needs for enterprise clients
Work closely with cross-functional teams to align backend architecture with overall product strategy and user experience
An Ideal Candidate Should Have
Proven experience in backend development: TS/Node (required)
Direct experience with GCP and knowledge of AWS, Azure, or other cloud providers
Efficiency in ideation and implementation, prioritizing tasks based on urgency and impact
Preferred: Experience with Docker and containerized deployments
Preferred: Proficiency in deploying high availability applications on Kubernetes
What We Offer
A dynamic environment where your contributions shape the company and its products
A team that values innovation, intuition, and drive
Autonomy, fostering focus and creativity
The opportunity to have a significant impact in a revolutionary industry
Competitive compensation, a welcoming atmosphere, and a commitment to an exceptional asynchronous work culture
The privilege of working on a product that changes lives, particularly for those with learning differences like dyslexia, ADD, and more
An active role at the intersection of artificial intelligence and audio - a rapidly evolving tech domain
The United States-based salary range for this role is $140,000-$200,000 USD/year + Bonus + Stock, depending on experience
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
Java Software Engineer
Data engineer job in Green Bay, WI
Candidates need 10 years of experience, including 5 years engineering software solutions using Java and Spring Boot.
Core Characteristics and Soft Skills:
Beyond technical proficiency, the right mindset and interpersonal skills are crucial for success on our team. We'd prioritize candidates who demonstrate:
Problem-Solving Acumen: The ability to analyze complex problems, break them down, evaluate different approaches, and implement robust, efficient solutions. This includes troubleshooting existing systems and designing new ones.
Independence and Initiative: We value engineers who can take ownership of tasks, research potential solutions independently, make informed decisions, and drive work forward with minimal supervision once objectives are clear.
Dependability and Accountability: Team members must be reliable, meet commitments, deliver high-quality work, and take responsibility for their contributions.
Strong Communication Skills: Clear, concise communication (both written and verbal) is essential. This includes explaining technical concepts to varied audiences, actively listening, providing constructive feedback, and documenting work effectively.
Collaboration and Teamwork: Ability to work effectively within a team structure, share knowledge, participate in code reviews, and contribute to a positive team dynamic.
Adaptability and Eagerness to Learn: The technology landscape and business needs evolve. We seek individuals who are curious, adaptable, and willing to learn new technologies and methodologies.
Core Technical Skillset:
Our current technology stack forms the foundation of our work. Proficiency or strong experience in the following areas is highly desirable:
Backend Development:
Java: Deep understanding of Java (latest LTS versions preferred).
Spring Boot: Extensive experience building applications and microservices using the Spring Boot framework and its ecosystem (e.g., Spring Data, Spring Security, Spring Cloud).
Messaging Systems:
Apache Kafka: Solid understanding of Kafka concepts (topics, producers, consumers, partitioning, brokers) and experience building event-driven systems.
Containerization & Orchestration:
Kubernetes: Practical experience deploying, managing, and troubleshooting applications on Kubernetes.
OCP (OpenShift Container Platform): Experience specifically with OpenShift is a significant advantage.
AKS (Azure Kubernetes Service): Experience with AKS is also highly relevant.
(General Docker knowledge is expected)
CI/CD & DevOps:
GitHub Actions: Proven experience in creating, managing, and optimizing CI/CD pipelines using GitHub Actions for build, test, and deployment automation.
Understanding of Git branching strategies and DevOps principles.
Frontend Development:
JavaScript: Strong proficiency in modern JavaScript (ES6+).
React: Experience building user interfaces with the React library and its common patterns/ecosystem (e.g., state management, hooks).
Database & Data Warehousing:
Oracle: Experience with Oracle databases, including writing efficient SQL queries, understanding data modeling, and potentially PL/SQL.
Snowflake: Experience with Snowflake cloud data warehouse, including data loading, querying (SQL), and understanding its architecture.
Scripting:
Python: Proficiency in Python for scripting, automation, data manipulation, or potentially backend API development (e.g., using Flask/Django, though Java/Spring is primary).
Requirements
Domain Understanding (Transportation & Logistics):
While not strictly mandatory, candidates with experience or a demonstrated understanding of the transportation and logistics industry (e.g., supply chain management, freight operations, warehousing, fleet management, routing optimization, TMS systems) will be able to contribute more quickly and effectively, as they can better grasp the business context and user needs.
Additional Valuable Skills:
We are also interested in candidates who may possess skills in related areas that complement our core activities:
Data Science & Analytics:
Experience with data analysis techniques.
Knowledge of Machine Learning (ML) concepts and algorithms (particularly relevant for optimization, forecasting, anomaly detection in logistics).
Proficiency with Python data science libraries (Pandas, NumPy, Scikit-learn).
Experience with data visualization tools and techniques.
Understanding of optimization algorithms (linear programming, vehicle routing problem algorithms, etc.).
Cloud Platforms: Broader experience with cloud services (particularly Azure, but also AWS or GCP) beyond Kubernetes (e.g., managed databases, serverless functions, monitoring services).
Testing: Strong experience with automated testing practices and tools (e.g., JUnit, Mockito, Cypress, Selenium, Postman/Newman).
API Design & Management: Deep understanding of RESTful API design principles, API security (OAuth, JWT), and potentially experience with API gateways.
Monitoring & Observability: Experience with tools like Prometheus, Grafana, ELK Stack (Elasticsearch, Logstash, Kibana), Datadog, Dynatrace, etc., for monitoring application health and performance.
Security: Awareness and application of secure coding practices (e.g., OWASP Top 10).
Data Scientist
Data engineer job in Kohler, WI
Work Mode: Onsite
Opportunity
To serve as a data-driven innovator and strategic enabler, transforming business decisions through advanced analytics, machine learning, and AI. This role combines statistical and technical expertise with business insight to unlock patterns, forecast trends, and deliver actionable intelligence that accelerates growth and operational excellence. The Data Scientist will champion the use of data as a competitive advantage, designing robust analytical models, ensuring data integrity, and driving scalable solutions that align with enterprise goals.
Key Responsibilities
Analytical Modeling and AI Development
* Design experiments, test hypotheses, and build predictive models to uncover actionable insights.
* Develop and refine algorithms to identify patterns and trends in large, complex datasets.
* Validate findings through rigorous statistical methods and iterative testing.
* Continuously monitor and optimize models for accuracy, scalability, and business impact.
* Prototype and deploy AI-driven solutions that align with strategic business objectives and deliver measurable value.
Data Exploration & Architecture
* Independently collate, analyze, and interpret large, complex datasets to generate actionable insights.
* Consult on data architecture and governance to ensure quality, accuracy, and readiness for advanced analytics.
* Identify relevant internal and external data sources and ensure compliance with data standards.
* Drive improvements in data pipelines and integration to enable faster, more reliable analytics.
Decision Intelligence & Transformation
* Partner with stakeholders to translate business requirements into analytical solutions that drive process and product improvements.
* Support creation of scalable AI/ML solutions aligned with strategic goals.
* Model and frame business scenarios that influence critical decisions and outcomes.
* Provide clear metrics and KPIs to demonstrate the impact of analytics initiatives on business performance.
* Identify opportunities for automation and optimization to accelerate enterprise-wide transformation.
Mentorship & Evangelism
* Mentor Functional teams globally on AI tools, prompt engineering, and experimentation. Lead workshops, proofs-of-concept, and training sessions to build momentum and literacy across cultures and time zones.
* Share personal and professional learnings to foster a culture of innovation and continuous improvement.
* Advocate for responsible AI use and digital fluency across the enterprise.
Skills/Requirements
* Bachelor's degree in Mathematics, Statistics, Computer Science, or related field (Master's preferred).
* 1+ years of relevant quantitative and qualitative research and analytics experience.
* Demonstrated experience in using, building, or configuring AI tools, bots, or agents, especially by candidates who actively explore and experiment with emerging technologies, including in personal contexts. Strong familiarity with large language models (LLMs), prompt engineering, and commonly used AI tools.
Technical Fluency
* Proficiency in Python or R
* Strong skills in SQL
* Experience with cloud platforms (Azure and Databricks preferred)
* Familiarity with prompt engineering and agent-based tools (Copilot Studio, Azure AI Foundry, or similar)
* Knowledge of data visualization tools (Power BI, Tableau, or similar)
Core Competencies
* Creative problem solving: Works with full competence to find practical solutions for unexpected stakeholder problems. Typically works without supervision.
* Brings a cross-functional collaborative mindset across business and technical teams; thrives in cross-disciplinary environments to align AI efforts across groups.
* Communication and storytelling: Tailors communication content and style to the needs of others. Pays attention to others' input and perspectives, asks questions, and summarizes to confirm understanding
* Nimble learning: Learns through experimentation when tackling new problems, using both successes and failures as learning fodder. Swiftly incorporates new concepts and principles into own expertise; skillfully uses these fresh insights to solve problems
* Cultivates innovation: Creates new and better ways for the organization to be successful. Offers creative ideas, finds unique connections between previously unrelated elements. Builds upon and strengthens new solutions in a positive and collaborative manner.
Applicants must be authorized to work in the US without requiring sponsorship now or in the future.
We believe in supporting you from the moment you join us, which is why Kohler offers day 1 benefits. This means you'll have access to your applicable benefit programs from your first day on the job, with no waiting period. The salary range for this position is $64,750 - $98,350. The specific salary offered to a candidate may be influenced by a variety of factors including the candidate's experience, their education, and the work location.
Why Choose Kohler?
We empower each associate to #BecomeMoreAtKohler with a competitive total rewards package to support your health and wellbeing, access to career growth and development opportunities, a diverse and inclusive workplace, and a strong culture of innovation. With more than 30,000 bold leaders across the globe, we're driving meaningful change in our mission to help people live gracious, healthy, and sustainable lives.
About Us
It is Kohler's policy to recruit, hire, and promote qualified applicants without regard to race, creed, religion, age, sex, sexual orientation, gender identity or expression, marital status, national origin, disability or status as a protected veteran. If, as an individual with a disability, you need reasonable accommodation during the recruitment process, please contact *********************. Kohler Co. is an equal opportunity/affirmative action employer.