Cloud Engineer
Data engineer job in Oshkosh, WI
Insight Global is looking for a Cloud Engineer to support one of our largest federal clients in Wisconsin. The Senior Cloud Engineer will focus on driving profitable cloud adoption while driving innovation and consumption of cloud-based services across the enterprise. Ensure the security of the cloud environment by implementing security controls and threat protection and by managing identity and access. Handle data feeds and integration processes to ensure seamless data exchange between systems. Assess innovation and vendor alignment for the continuous build-out and scale of the cloud ecosystem. Provide infrastructure support to ensure operational excellence, ensuring issue resolution, disaster recovery, and data backup standards are defined.
• Manage and support Azure Gov Cloud operations, including installations, configurations, deployments, integrations, and administration using tools like Azure Kubernetes, Terraform Enterprise, and GitHub.
• Troubleshoot and resolve cloud ecosystem issues, ensuring uptime and performance monitoring.
• Collaborate across teams (server, storage, database, security) to implement and maintain solutions aligned with project plans.
• Apply advanced cloud expertise in areas such as containerization, DevOps, networking, and scripting while adopting new technologies and best practices.
• Design and operate complex solutions following ITIL processes for ticket resolution and stakeholder communication.
Required Skills and Experience:
• Four (4) or more years of experience in the field or in a related area.
• Current Microsoft Azure Gov. Cloud experience
• Monitoring, troubleshooting, scripting, deployment, integration, messaging, automation, orchestration
• Written and verbal communication skills, problem solving, time management, teamwork, attention to detail, customer service.
Nice to have:
• Bachelor's degree in Information Technology or related field.
Compensation:
$125,000-$153,000 annual salary.
Exact compensation may vary based on several factors, including skills, experience, and education. Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law.
AZCO Data Scientist - IT (Appleton, WI)
Data engineer job in Appleton, WI
The Staff Data Scientist plays a critical role in leveraging data to drive business insights, optimize operations, and support strategic decision-making. You will be responsible for designing and implementing advanced analytical models, developing data pipelines, and applying statistical and machine learning techniques to solve complex business challenges. This position requires a balance of technical expertise, business acumen, and communication skills to translate data findings into actionable recommendations.
+ Develop and apply data solutions for cleansing data to remove errors and ensure consistency.
+ Analyze data to discover information, business value, patterns, and trends that guide development of asset business solutions.
+ Gather data, find patterns and relationships and create prediction models to evaluate client assets.
+ Conduct research and apply existing data science methods to business line problems.
+ Monitor client assets and perform predictive and root cause analysis to identify adverse trends; choose best fit methods, define algorithms, and validate and deploy models to achieve desired results.
+ Produce reports and visualizations to communicate technical results and interpretation of trends; effectively communicate findings and recommendations to all areas of the business.
+ Collaborate with cross-functional stakeholders to assess needs, provide assistance and resolve problems.
+ Translate business problems into data science solutions.
+ Perform other duties as assigned
+ Comply with all policies and standards
**Qualifications**
+ Bachelor's degree in Analytics, Computer Science, Information Systems, Statistics, Math, or a related field from an accredited program and 4 years of related experience required; experience may be substituted for the degree requirement
+ Experience in data mining and predictive analytics.
+ Strong problem-solving skills, analytical thinking, attention to detail and hypothesis-driven approach.
+ Excellent verbal/written communication, and the ability to present and explain technical concepts to business audiences.
+ Proficiency with data visualization tools (Power BI, Tableau, or Python libraries).
+ Experience with Azure Machine Learning, Databricks, or similar ML platforms.
+ Expert proficiency in Python with pandas, scikit-learn, and statistical libraries.
+ Advanced SQL skills and experience with large datasets.
+ Experience with predictive modeling, time series analysis, and statistical inference.
+ Knowledge of A/B testing, experimental design, and causal inference.
+ Familiarity with computer vision for image/video analysis.
+ Understanding of NLP techniques for document processing.
+ Experience with optimization algorithms and operations research techniques preferred.
+ Knowledge of machine learning algorithms, feature engineering, and model evaluation.
This job posting will remain open a minimum of 72 hours and on an ongoing basis until filled.
EEO/Disabled/Veterans
**Job** Information Technology
**Primary Location** US-WI-Appleton
**Schedule:** Full-time
**Travel:** Yes, 5% of the time
**Req ID:** 253790
Data Engineer
Data engineer job in Appleton, WI
Amplifi is the go-to data consultancy for enterprise organizations that want their success to be driven by data. We empower our clients to innovate, grow and succeed by establishing and delivering strategies across all elements of the data value chain. From the governance and management of data through to analytics and automation, our integrated approach to modern data ecosystems delivers measurable results through a combination of expert consultancy and best-in-breed technology. Our company and team members are proud to empower our clients' businesses by providing exceptional solutions and value, as we truly believe their success is our success. We thrive on delivering excellent solutions and overcoming technical and business challenges. As such, we're looking for like-minded individuals to learn, grow, and mentor others as a part of the Amplifi family.
Position Summary
The Data Engineer will be responsible for designing, building, and maintaining scalable, secure data pipelines that drive analytics and support operational data products. The ideal candidate brings a strong foundation in SQL, Python, and modern data warehousing with a deep understanding of Snowflake, Databricks, or Microsoft Fabric, and a solid understanding of cloud-based architectures.
What You Will Get To Do
Design, develop, and optimize robust ETL/ELT pipelines to ingest, transform, and expose data across multiple systems.
Build and maintain data models and warehouse layers, enabling high-performance analytics and reporting.
Collaborate with analytics, product, and engineering teams to understand data needs and deliver well-structured data solutions.
Write clean, efficient, and testable code in SQL and Python to support automation, data quality, and transformation logic.
Support deployment and orchestration workflows, using Azure Data Factory, dbt, or similar tools.
Work across multi-cloud environments (Azure preferred; AWS and GCP optional) to integrate data sources and manage cloud-native components.
Contribute to CI/CD practices and data pipeline observability (monitoring, logging, alerting).
Ensure data governance, security, and compliance in all engineering activities.
Support ad hoc data science and machine learning workflows within Dataiku.
What You Bring to the Team
4+ years of experience in a data engineering or related software engineering role.
Proficiency in SQL and Python for data manipulation, transformation, and scripting.
Strong experience working with Snowflake and MSSQL Server.
Practical knowledge of working with cloud data platforms, especially Microsoft Azure.
Experience with modern data modeling and warehouse optimization techniques.
Experience with Databricks, Azure Data Factory, or DBT preferred.
Exposure to Microsoft Fabric components like OneLake, Pipelines, or Direct Lake.
Familiarity with cloud services across AWS, GCP, or hybrid cloud environments.
Understanding of or curiosity about Dataiku for data science and advanced analytics collaboration.
Ability to work independently and with a team in a hybrid/remote environment.
Location
Wisconsin is preferred.
Travel
Ability to travel up to 10% of the time.
Benefits & Compensation
Amplifi offers excellent compensation and benefits including, but not limited to, health, dental, 401(k) program, employee assistance program, short and long-term disability, life insurance, accidental death and dismemberment (AD&D), PTO program, flex work schedules and paid holidays.
Equal Opportunity Employer
Amplifi is proud to be an equal opportunity employer. We do not discriminate against applicants based on race, religion, disability, medical condition, national origin, gender, sexual orientation, marital status, gender identity, pregnancy, childbirth, age, veteran status or other legally protected characteristics.
Data Architect II
Data engineer job in Appleton, WI
Join our Corporate Data and Analytics team as we continue to expand our data capabilities! The Data Architect II will be responsible for designing, managing, and optimizing the organization's data infrastructure for a specific data domain. This individual will ensure that data is structured, secure, and accessible to support business operations and decision-making, will mentor junior data architects, and will provide cross-functional governance leadership.
The ideal/preferred location for this position is Appleton, WI. We will consider candidates in other locations based on the relevance of their related experience.
JOB RESPONSIBILITIES
Essential Job Responsibilities:
* Create and maintain conceptual, logical, and physical data models. Document data flows, lineage, and dependencies for assigned data domains.
* Collaborate with data engineers, data analysts, and business partners to understand business requirements and align the data model to them.
* Capture metadata associated with new data projects. Manage Metadata Repositories. Coordinate with business partners to maintain data catalog information for assigned data domains. Implement metadata and lineage tracking within domain.
* Manage and monitor data quality assessments. Communicate and resolve data quality issues with business stewards.
* Enforce governance rules and policies for assigned data domain. Leverage master data management tools and canonical data practices to govern data.
* Define schemas, transformations, and integration rules. Plan and design data integration methods. Support data integration activities.
* Enforce security protocols and policies on assigned data domains. Measure and monitor access to sensitive and secure data.
* Work closely with data engineers, analysts, and business stakeholders to align data architecture with organizational goals.
* Own one or more data domains end-to-end, including integration rules, data catalog upkeep, and governance enforcement.
* Support regulatory compliance initiatives (GDPR, CCPA, HIPAA, SOX depending on company domain).
* Introduce cloud-first design patterns and domain-oriented governance practices.
Additional Job Responsibilities:
* Live our values of High Performance, Caring Relationships, Strategic Foresight, and Enterprising Spirit
* Find A Better Way by championing continuous improvement and quality control efforts to identify opportunities to innovate and improve efficiency, accuracy, and standardization.
* Continuously learn and develop self professionally.
* Support corporate efforts for safety, government compliance, and all other company policies & procedures.
* Perform other related duties as required and assigned.
QUALIFICATIONS
Required:
* Bachelor's degree in Computer Science, Information Systems, or a related field
* 4+ years in a data engineering or BI developer role, including 2 years of experience in data modeling
* Experience with data analysis tools for data research including languages such as SQL or Python and exploration tools such as Power BI, Tableau, or Looker.
* Experience using cloud data platforms such as AWS, Azure, GCP, Snowflake, and Databricks.
* Ability to translate complex business needs into technical architectural solutions.
* Strong documentation and communication skills
* Excellent analytical and problem-solving skills
Preferred:
* Awareness of data governance frameworks and tools
* Familiarity with compliance regulations
* Familiarity with database management tools and business intelligence tools
DIVISION:
Corporate
U.S. Venture requires that a team member have and maintain authorization to work in the country in which the role is based. In general, U.S. Venture does not sponsor candidates for nonimmigrant visas or permanent residency unless based on business need.
U.S. Venture will not accept unsolicited resumes from recruiters or employment agencies. In the absence of an executed recruitment Master Service Agreement, there will be no obligation to any referral compensation or recruiter fee. In the event a recruiter or agency submits a resume or candidate without an agreement, U.S. Venture shall reserve the right to pursue and hire those candidate(s) without any financial obligation to the recruiter or agency. Any unsolicited resumes, including those submitted to hiring managers, shall be deemed the property of U.S. Venture.
U.S. Venture, Inc. is an equal opportunity employer that is committed to inclusion and diversity. We ensure equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, marital status, age, national origin, disability, veteran status, genetic information, or other protected characteristic. If you need assistance or an accommodation due to a disability, you may call Human Resources at **************.
Senior Data Engineer
Data engineer job in Neenah, WI
The Sr. Data Engineer will oversee the Department's data integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data integration and analysis. Troubleshoot and resolve data-related issues, collaborating with the team and our Vendor Consultant to identify root causes. Recommend and deploy data models and solutions for existing data systems. This individual plays a key role in the design and development of data pipelines to modernize our financial back end into our Azure-based solution. The Data Engineer will collaborate with SMEs, Data Analysts, and Business Analysts to achieve this goal. This role will serve as a mentor for Junior Data Engineers by sharing knowledge and experience.
RESPONSIBILITIES:
Serve as a mentor to less senior Data Engineers on the team
Troubleshoot data discrepancies within the database tables of the IAP data model
Provide data to Data Analysts
Assist other developers with further developing the IAP data model
Develop, maintain and optimize current data pipelines which includes, but isn't limited to data from our policy administration systems, claims system, agent facing portals, and third party data assets
Knowledge and understanding of best practices for development including query optimization, version control, code reviews, and technical documentation
Develop complex data objects for business analytics using data modeling techniques
Ensure data quality and implement tools and frameworks for automating the identification of data quality issues
Analyze and propose improvements to existing Data Structures and Data Movement Processes
Perform assigned project tasks in a highly interactive team environment while maintaining a productive and positive atmosphere
Stay current with P&C insurance knowledge and industry technology and utilize that knowledge with existing productivity tools, standards, and procedures to contribute to the cost-effective operation of the department and company
Other duties as assigned
QUALIFICATIONS:
ESSENTIAL:
Associate's or Bachelor's degree in an IT-related field, or an equivalent combination of education and experience, with business analysis experience in a related field
5+ years of data pipeline/ETL/ELT development experience in a corporate environment
Knowledge of the SDLC, business processes and technical aspects of data pipelines and business intelligence outcomes
Experience with Azure DevOps, Azure Synapse, and Azure Data Lake/SQL
Experience working with XML and JSON data formats
Expert in SQL for data mapping, extracting, and validating source data to enable accurate reporting and data feeds
Experience with large system design and implementation
Collaborative team member with a strong ability to contribute positively to team dynamics and culture (Team and culture fit)
Results oriented, self-motivated, resourceful team player
Superior oral and written communication skills
Demonstrates thought leadership
Able to prioritize and work through multiple tasks
PREFERRED:
P&C Insurance industry experience
Policy, Claims, Billing and Finance experience
Agile methodology
Experience with Duck Creek software solutions
Experience with Azure DevOps (ADO)
At SECURA, we are transforming the insurance experience by putting authenticity at the forefront of everything we do. Our mission is clear: we're making insurance genuine. We recognize that our associates are our greatest assets, and we invest in their well-being and professional growth. We offer opportunities for continuous learning and career advancement, competitive benefits, and a culture that champions work-life balance. Joining SECURA means becoming part of a dynamic team that values each individual's contribution and fosters a collaborative atmosphere. Here, you'll not only find a fulfilling career but also a place where you can make a positive impact every day.
SECURA Insurance strives to provide equal opportunity for all employees and is committed to fostering an inclusive work environment. We welcome applicants from all backgrounds and walks of life.
Data Architect
Data engineer job in Appleton, WI
Seeking a Data Architect to help their growing data team transform how the company operates with data. This person will own data architecture for smaller projects, design models end-to-end, and collaborate with business stakeholders to define sources, business logic, and governance standards.
Responsibilities:
Design and implement data models across multiple domains
Define source systems, tables, and business logic for unified models
Partner with IT and business teams to ensure governed, reliable data
Support cloud adoption (Azure/GCP) while managing on-prem data
Contribute to data governance and architecture best practices
Requirements:
4+ years in data roles (engineering, BI, analytics)
2+ years in data architecture
Strong data modeling skills
Business-facing communication experience
Familiarity with Azure or GCP
Understanding of data governance principles
Skills
Data modeling, data architecture, GCP, Azure, data governance, SQL, Power BI, Python, database management, compliance regulations, warehouse management systems
Top Skills Details
Data modeling, data architecture, GCP, Azure, data governance, SQL, Power BI
Additional Skills & Qualifications
Strong analytical and problem-solving skills
Ability to work independently and collaboratively in a team environment
Comfortable with hybrid work model and occasional travel
Experience with relational databases and SQL
Exposure to BI tools and ETL processes
Awareness of data governance frameworks and tools
Familiarity with compliance regulations
Familiarity with database management tools and business intelligence tools
Job Type & Location
This is a Permanent position based out of Appleton, WI.
Pay and Benefits
The pay range for this position is $105,000.00 - $130,000.00/yr.
1. Reimbursement programs for wellness (gym memberships, group classes)
2. Health, vision, and dental starting day 1
3. 7% company match on 401(k)
4. PTO, holidays, sick days, volunteer time off, caregiver leave
5. Short- and long-term disability
Workplace Type
This is a fully onsite position in Appleton, WI.
Application Deadline
This position is anticipated to close on Dec 25, 2025.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
About TEKsystems and TEKsystems Global Services
We're a leading provider of business and technology services. We accelerate business transformation for our customers. Our expertise in strategy, design, execution and operations unlocks business value through a range of solutions. We're a team of 80,000 strong, working with over 6,000 customers, including 80% of the Fortune 500 across North America, Europe and Asia, who partner with us for our scale, full-stack capabilities and speed. We're strategic thinkers, hands-on collaborators, helping customers capitalize on change and master the momentum of technology. We're building tomorrow by delivering business outcomes and making positive impacts in our global communities. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at TEKsystems.com.
Data Warehouse Engineer I
Data engineer job in Menasha, WI
The Data Warehouse Engineer I is part of a team dedicated to supporting Network Health's Enterprise Data Warehouse. This individual will perform development, analysis, testing, debugging, documentation, implementation, and maintenance of interfaces to support the Enterprise Data Warehouse and related applications. They will consult with other technical resources and key departmental users on solutions and best practices. They will monitor performance and effectiveness of the data warehouse and recommend changes as appropriate.
Location: Candidates must reside in the state of Wisconsin for consideration. This position is eligible to work at your home office (reliable internet is required), at our office in Brookfield or Menasha, or a combination of both in our hybrid workplace model.
Hours: 1.0 FTE, 40 hours per week, 8am-5pm Monday through Friday, may be required to work later hours when system changes are being implemented or problems arise
Check out our 2024 Community Report to learn a little more about the difference our employees make in the communities we live and work in. As an employee, you will have the opportunity to work hard and have fun while getting paid to volunteer in your local neighborhood. You, too, can be part of the team and make a difference. Apply to this position to learn more about our team.
Job Responsibilities:
Perform end to end delivery of data interfaces in various stages of the Enterprise Data Warehouse in accordance with professional standards and industry best practice
Perform all phases of the development lifecycle including solution design, creation of acceptance criteria, implementation, technical documentation, development and execution of test cases, performance monitoring, troubleshooting, data analysis, and profiling
Consult with Developers, Engineers, DBAs, key departmental stakeholders, data governance and leadership on technical solutions and best practice
Monitor and audit the Enterprise Data Warehouse for effectiveness, throughput, and responsiveness. Recommend changes as appropriate. Troubleshoot customer complaints related to system performance issues
Maintain effective communication with customers from all departments for system development, implementation, and problem resolution
Required to be on call to assist in the resolution of technical problems
Other duties and responsibilities as assigned
Job Requirements:
Requires an Associate's degree in Computer Science, Business, or a related technical field; equivalent years of experience may be substituted
Minimum of 1 year of experience in program interfacing required
Experience with T-SQL development, SSIS development, and database troubleshooting skills required
Network Health is an Equal Opportunity Employer
Data Scientist
Data engineer job in Luxemburg, WI
Your career at Deutsche Börse Group
Your Area of Work
Join Clearstream Fund Services as a Data Scientist to design and prototype data products that empower data monetization and business users through curated datasets, semantic models, and advanced analytics. You'll work across the data stack, from pipelines to visualizations, and contribute to the evolution of AI-driven solutions.
Your Responsibilities
* Prototype data products including curated datasets and semantic models to support data democratization and self-service BI
* Design semantic layers to simplify data access and usability
* Develop and optimize data pipelines using data engineering tools (e.g., Databricks)
* Use SQL, Python, and PySpark for data processing and transformation
* Create Power BI dashboards to support prototyping and reporting
* Apply ML/AI techniques to support early-stage modeling and future product innovation
* Collaborate with data product managers, functional analysts, engineers, and business stakeholders
* Ensure data quality, scalability, and performance in all deliverables
Your Profile
* Master's in Data Science, Computer Science, Engineering, or related field
* 3+ years of experience in data pipeline development and prototyping in financial services or fund administration
* Proficiency in SQL, Python, and PySpark
* Hands-on experience with Databricks
* Experience building Power BI dashboards and semantic models
* Strong analytical and communication skills
* Fluent in English
Digital Technology Data Scientist Lead
Data engineer job in Oshkosh, WI
At Oshkosh, we build, serve and protect people and communities around the world by designing and manufacturing some of the toughest specialty trucks and access equipment. We employ over 18,000 team members all united by a common purpose. Our engineering and product innovation help keep soldiers and firefighters safe, are critical in building and keeping communities clean, and help people do their jobs every day.
SUMMARY:
As a Data Scientist, your primary responsibilities will be to analyze and interpret datasets by using statistical techniques, machine learning, and/or programming skills in order to extract insights and build models which solve business problems.
YOUR IMPACT:
* Familiarity with data science tools, such as Azure Databricks, Power BI, Microsoft Office (including Excel pivot tables), Spark, Python, R and C.
* Undertake the processing of multiple datasets, including structured and unstructured, to analyze large amounts of information to discover trends and patterns.
* Prepare and concisely deliver analysis results in visual and written forms that communicate data insights to both technical and non-technical audiences.
* Collaborate with cross-functional teams (e.g. data analysts, data engineers, architects, business stakeholders) to understand data needs for complex business requirements.
* Build highly complex predictive models and machine-learning algorithms. Execute integration into existing systems or the creation of new products.
* Direct some data science assignments, projects, visualization tasks, data quality improvements, and troubleshooting of data incidents, including the resolution of root causes.
* Lead efforts to resolve and document solutions to track and manage incidents, changes, problems, tasks, and demands.
* Coach and mentor other team members on new technologies and best practices across data science and business intelligence.
* Possesses advanced understanding of business' processes in at least one area of business and has an understanding of business processes in multiple areas of the business.
* Actively supports the advancement of the strategic roadmap for data science.
MINIMUM QUALIFICATIONS:
* Bachelor's degree with five (5) or more years of experience in the field or in a related area.
STANDOUT QUALIFICATIONS:
* Master's or doctorate degree
* Expertise in Power Platform
* Familiarity with LLMs (open source or closed)
* Experience in front-end web app development (e.g., Flask, Gradio)
* Familiarity with RAG architecture
Pay Range:
$115,600.00 - $196,400.00
The above pay range reflects the minimum and maximum target pay for the position across all U.S. locations. Within this range, individual pay is determined by various factors, including the scope and responsibilities of the role, the candidate's experience, education and skills, as well as the equity of pay among team members in similar positions. Beyond offering a competitive total rewards package, we prioritize a people-first culture and offer various opportunities to support team member growth and success.
Oshkosh is committed to working with and offering reasonable accommodation to job applicants with disabilities. If you need assistance or an accommodation due to disability for any part of the employment process, please contact us at ******************************************.
Oshkosh Corporation is a merit-based Equal Opportunity Employer. Job opportunities are open for application to all qualified individuals and selection decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, or other protected characteristic. To the extent that information is provided or collected regarding categories as provided by law it will in no way affect the decision regarding an employment application.
Oshkosh Corporation will not discharge or in any manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with Oshkosh Corporation's legal duty to furnish information.
Certain positions with Oshkosh Corporation require access to controlled goods and technologies subject to the International Traffic in Arms Regulations or the Export Administration Regulations. Applicants for these positions may need to be "U.S. Persons," as defined in these regulations. Generally, a "U.S. Person" is a U.S. citizen, lawful permanent resident, or an individual who has been admitted as a refugee or granted asylum.
Lead Data Steward
Data engineer job in Menasha, WI
You've discovered something special. A company that cares. Cares about leading the way in construction, engineering, manufacturing and renewable energy. Cares about redefining how energy is designed, applied and consumed. Cares about thoughtfully growing to meet market demands. And, as “one of the Healthiest 100 Workplaces in America,” is focused on the mind/body/soul of team members through our Culture of Care.
We are seeking a highly motivated and detail-oriented FTI Data Steward Leader to establish and champion data governance within our organization. As data governance is a new initiative for the company, this role will be pivotal in laying the foundation, defining standards, and ensuring that data is treated as a critical enterprise asset. The FTI Data Steward Leader will help lead a community of business and technical data stewards across domains, promote best practices, and work closely with data governance leadership to develop policies, data quality standards, and stewardship processes. This role requires a strategic thinker with strong leadership skills and a deep understanding of data management principles.
MINIMUM REQUIREMENTS
Education: Bachelor's degree in Information Management, Data Science, Business, or a related field, or equivalent experience in lieu of a degree.
Experience: 5+ years of experience in data management, data governance, or related disciplines, with demonstrated experience leading data stewardship or governance initiatives enterprise-wide. Strong knowledge of data quality principles, metadata management, and master data management (MDM). Familiarity with data governance frameworks (e.g., DAMA DMBOK, DCAM) and tools (e.g., Informatica).
Travel: 0-10%
Work Schedule: This position works between the hours of 7 AM and 5 PM, Monday-Friday. However, work may be performed at any time on any day of the week to meet business needs.
KEY RESPONSIBILITIES
Leadership and Strategy:
Serves as the primary representative and advocate for the community of master data leads and data coordinators.
Leads master data management (MDM) strategies and policies to support FTI.
Collaborates with data governance leadership to define the enterprise data stewardship framework, standards, and playbooks.
Creates and implements data governance policies, procedures, and maturity roadmaps.
Leads the formation of a data governance council and facilitates regular working sessions.
Stewardship and Quality:
Ensures consistent application of data definitions, metadata standards, and classification across data domains.
Develops data quality standards with data stewards and helps resolve data quality issues with the appropriate stewards.
Collaboration and Stakeholder Engagement:
Partners with business units, BT, and Data Analytics teams to identify data steward leaders and data ownership roles.
Facilitates communication between business and technical stakeholders to resolve data issues and improve data understanding.
Acts as a liaison between the data governance and operational teams to ensure stewardship initiatives are aligned with business needs.
Metadata and Cataloging:
Works with data governance and Data Analytics team to maintain an enterprise data catalog.
Supports documentation of business glossaries, data dictionaries, and lineage across key data assets.
Training and Change Management:
Promotes data literacy and fosters a data-centric culture across the organization.
Leads change management efforts related to data governance adoption and tool implementation.
Performs other related duties as required and assigned.
The job description and responsibilities described are intended to provide guidelines for job expectations and the employee's ability to perform the position described. It is not intended to be construed as an exhaustive list of all functions, responsibilities, skills, and abilities. Additional functions and requirements may be assigned by supervisors as deemed appropriate.
How Does FTI Give YOU the Chance to Thrive?
If you're energized by new challenges, FTI provides you with many opportunities. Joining FTI opens doors to redefine what's possible for your future.
Once you're a team member, you're supported and provided with the knowledge and resources to achieve your career goals with FTI. You're officially in the driver's seat of your career, and FTI's career development and continued education programs give you opportunities to position yourself for success.
FTI is a “merit to the core” organization. We recognize and reward top performers, offering competitive, merit-based compensation, career path development and a flexible and robust benefits package.
Benefits are the Game-Changer
We provide industry-leading benefits as an investment in the lives of team members and their families. You're invited to review the full list of FTI benefits available to regular/full-time team members. Start here. Grow here. Succeed here. If you're ready to learn more about your career with FTI, apply today!
Faith Technologies, Inc. is an Equal Opportunity Employer - veterans/disabled.
Software Engineering Technical Lead, Go (Cilium)
Data engineer job in Appleton, WI
The application window is expected to close on December 18, 2025. NOTE: The job posting may be removed earlier if the position is filled or if a sufficient number of applications are received. NOTE: This is a US remote role, but preference is given to candidates located in the US Eastern time zone (ET).
Meet the Team
Isovalent, now part of Cisco, is the company founded by the creators of Cilium and eBPF. Cisco Isovalent builds open-source software and enterprise solutions solving networking, security, and observability needs for modern cloud native infrastructure. The flagship technology, Cilium, is the choice of numerous industry-leading global organizations.
We believe in fostering an inclusive and diverse workplace where every team member feels valued, respected, and empowered. We believe that every employee contributes to our success, and we are committed to fostering an environment where everyone can thrive. We encourage candidates from all backgrounds to apply and join us in our mission to deliver exceptional products and services.
Your Impact
Cisco Isovalent is seeking a skilled and experienced Software Engineer to troubleshoot, mature, and improve the reliability and scalability of Isovalent products. You will be responsible for building and enhancing a Kubernetes-native control plane to provide seamless management of its network traffic, enabling scalable, secure, and resilient traffic management in cloud and hybrid enterprise environments. You will work with a highly collaborative and skilled team to build solutions that advance the next generation of networking and security in Kubernetes environments.
What you'll do:
* Develop high-quality Go and eBPF code for Cilium OSS and Enterprise, following open-source development principles and best practices.
* Debug, troubleshoot, and resolve performance, reliability, and security issues in the control plane.
* Design highly scalable solutions to reliably run Cilium in very large environments.
* Participate in code reviews, architectural discussions, and contribute to technical documentation.
* Work with and coordinate across US and EMEA teams, ensuring alignment during East Coast business hours (EST/EDT).
Minimum Qualifications
* Bachelor's degree in Computer Science or a related field
* 4+ years of experience in Go, 2+ years of experience in C or eBPF
* Experience with Kubernetes, Cloud Native workloads and/or distributed systems
Preferred Qualifications
* Knowledge about Linux systems design, security and/or networking, Linux kernel
* Desire to write high quality and efficient code
* Experience designing and implementing APIs
* A public track record of open-source code commits is a plus
Why Cisco?
At Cisco, we're revolutionizing how data and infrastructure connect and protect organizations in the AI era - and beyond. We've been innovating fearlessly for 40 years to create solutions that power how humans and technology work together across the physical and digital worlds. These solutions provide customers with unparalleled security, visibility, and insights across the entire digital footprint.
Fueled by the depth and breadth of our technology, we experiment and create meaningful solutions. Add to that our worldwide network of doers and experts, and you'll see that the opportunities to grow and build are limitless. We work as a team, collaborating with empathy to make really big things happen on a global scale. Because our solutions are everywhere, our impact is everywhere.
We are Cisco, and our power starts with you.
Message to applicants applying to work in the U.S. and/or Canada:
The starting salary range posted for this position is $183,800.00 to $263,600.00 and reflects the projected salary range for new hires in this position in U.S. and/or Canada locations, not including incentive compensation*, equity, or benefits.
Individual pay is determined by the candidate's hiring location, market conditions, job-related skillset, experience, qualifications, education, certifications, and/or training. The full salary range for certain locations is listed below. For locations not listed below, the recruiter can share more details about compensation for the role in your location during the hiring process.
U.S. employees are offered benefits, subject to Cisco's plan eligibility rules, which include medical, dental and vision insurance, a 401(k) plan with a Cisco matching contribution, paid parental leave, short and long-term disability coverage, and basic life insurance. Please see the Cisco careers site to discover more benefits and perks. Employees may be eligible to receive grants of Cisco restricted stock units, which vest following continued employment with Cisco for defined periods of time.
U.S. employees are eligible for paid time away as described below, subject to Cisco's policies:
* 10 paid holidays per full calendar year, plus 1 floating holiday for non-exempt employees
* 1 paid day off for employee's birthday, paid year-end holiday shutdown, and 4 paid days off for personal wellness determined by Cisco
* Non-exempt employees receive 16 days of paid vacation time per full calendar year, accrued at rate of 4.92 hours per pay period for full-time employees
* Exempt employees participate in Cisco's flexible vacation time off program, which has no defined limit on how much vacation time eligible employees may use (subject to availability and some business limitations)
* 80 hours of sick time off provided on hire date and each January 1st thereafter, and up to 80 hours of unused sick time carried forward from one calendar year to the next
* Additional paid time away may be requested to deal with critical or emergency issues for family members
* Optional 10 paid days per full calendar year to volunteer
For non-sales roles, employees are also eligible to earn annual bonuses subject to Cisco's policies.
Employees on sales plans earn performance-based incentive pay on top of their base salary, which is split between quota and non-quota components, subject to the applicable Cisco plan. For quota-based incentive pay, Cisco typically pays as follows:
* .75% of incentive target for each 1% of revenue attainment up to 50% of quota;
* 1.5% of incentive target for each 1% of attainment between 50% and 75%;
* 1% of incentive target for each 1% of attainment between 75% and 100%; and
* Once performance exceeds 100% attainment, incentive rates are at or above 1% for each 1% of attainment with no cap on incentive compensation.
For non-quota-based sales performance elements such as strategic sales objectives, Cisco may pay 0% up to 125% of target. Cisco sales plans do not have a minimum threshold of performance for sales incentive compensation to be paid.
The applicable full salary ranges for this position, by specific state, are listed below:
New York City Metro Area:
$183,800.00 - $303,100.00
Non-Metro New York state & Washington state:
$163,600.00 - $269,800.00
* For quota-based sales roles on Cisco's sales plan, the ranges provided in this posting include base pay and sales target incentive compensation combined.
Employees in Illinois, whether exempt or non-exempt, will participate in a unique time off program to meet local requirements.
Java Software Engineer
Data engineer job in Green Bay, WI
Candidates need to have 10 years of experience, including 5 years engineering software solutions using Java and Spring Boot.
Core Characteristics and Soft Skills:
Beyond technical proficiency, the right mindset and interpersonal skills are crucial for success on our team. We'd prioritize candidates who demonstrate:
Problem-Solving Acumen: The ability to analyze complex problems, break them down, evaluate different approaches, and implement robust, efficient solutions. This includes troubleshooting existing systems and designing new ones.
Independence and Initiative: We value engineers who can take ownership of tasks, research potential solutions independently, make informed decisions, and drive work forward with minimal supervision once objectives are clear.
Dependability and Accountability: Team members must be reliable, meet commitments, deliver high-quality work, and take responsibility for their contributions.
Strong Communication Skills: Clear, concise communication (both written and verbal) is essential. This includes explaining technical concepts to varied audiences, actively listening, providing constructive feedback, and documenting work effectively.
Collaboration and Teamwork: Ability to work effectively within a team structure, share knowledge, participate in code reviews, and contribute to a positive team dynamic.
Adaptability and Eagerness to Learn: The technology landscape and business needs evolve. We seek individuals who are curious, adaptable, and willing to learn new technologies and methodologies.
Core Technical Skillset:
Our current technology stack forms the foundation of our work. Proficiency or strong experience in the following areas is highly desirable:
Backend Development:
Java: Deep understanding of Java (latest LTS versions preferred).
Spring Boot: Extensive experience building applications and microservices using the Spring Boot framework and its ecosystem (e.g., Spring Data, Spring Security, Spring Cloud).
Messaging Systems:
Apache Kafka: Solid understanding of Kafka concepts (topics, producers, consumers, partitioning, brokers) and experience building event-driven systems.
Containerization & Orchestration:
Kubernetes: Practical experience deploying, managing, and troubleshooting applications on Kubernetes.
OCP (OpenShift Container Platform): Experience specifically with OpenShift is a significant advantage.
AKS (Azure Kubernetes Service): Experience with AKS is also highly relevant.
(General Docker knowledge is expected)
CI/CD & DevOps:
GitHub Actions: Proven experience in creating, managing, and optimizing CI/CD pipelines using GitHub Actions for build, test, and deployment automation.
Understanding of Git branching strategies and DevOps principles.
Frontend Development:
JavaScript: Strong proficiency in modern JavaScript (ES6+).
React: Experience building user interfaces with the React library and its common patterns/ecosystem (e.g., state management, hooks).
Database & Data Warehousing:
Oracle: Experience with Oracle databases, including writing efficient SQL queries, understanding data modeling, and potentially PL/SQL.
Snowflake: Experience with Snowflake cloud data warehouse, including data loading, querying (SQL), and understanding its architecture.
Scripting:
Python: Proficiency in Python for scripting, automation, data manipulation, or potentially backend API development (e.g., using Flask/Django, though Java/Spring is primary).
Requirements
Domain Understanding (Transportation & Logistics):
While not strictly mandatory, candidates with experience or a demonstrated understanding of the transportation and logistics industry (e.g., supply chain management, freight operations, warehousing, fleet management, routing optimization, TMS systems) will be able to contribute more quickly and effectively. They can better grasp the business context and user needs.
Additional Valuable Skills:
We are also interested in candidates who may possess skills in related areas that complement our core activities:
Data Science & Analytics:
Experience with data analysis techniques.
Knowledge of Machine Learning (ML) concepts and algorithms (particularly relevant for optimization, forecasting, anomaly detection in logistics).
Proficiency with Python data science libraries (Pandas, NumPy, Scikit-learn).
Experience with data visualization tools and techniques.
Understanding of optimization algorithms (linear programming, vehicle routing problem algorithms, etc.).
Cloud Platforms: Broader experience with cloud services (particularly Azure, but also AWS or GCP) beyond Kubernetes (e.g., managed databases, serverless functions, monitoring services).
Testing: Strong experience with automated testing practices and tools (e.g., JUnit, Mockito, Cypress, Selenium, Postman/Newman).
API Design & Management: Deep understanding of RESTful API design principles, API security (OAuth, JWT), and potentially experience with API gateways.
Monitoring & Observability: Experience with tools like Prometheus, Grafana, ELK Stack (Elasticsearch, Logstash, Kibana), Datadog, Dynatrace, etc., for monitoring application health and performance.
Security: Awareness and application of secure coding practices (e.g., OWASP Top 10).
Software Engineer, Platform - Green Bay, USA
Data engineer job in Green Bay, WI
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its 2025 Design Award winner for Inclusivity.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. These include frontend and backend engineers, AI research scientists, and others from Amazon, Microsoft, and Google; leading PhD programs like Stanford's; high-growth startups like Stripe, Vercel, and Bolt; and many founders of their own companies.
Overview
The responsibilities of our Platform team include building and maintaining all backend services, including, but not limited to, payments, analytics, subscriptions, new products, text to speech, and external APIs.
This is a key role and ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Design, develop, and maintain robust APIs, including the public TTS API and internal APIs like Payment, Subscription, Auth, and Consumption Tracking, ensuring they meet business and scalability requirements
Oversee the full backend API landscape, enhancing and optimizing for performance and maintainability
Collaborate on B2B solutions, focusing on customization and integration needs for enterprise clients
Work closely with cross-functional teams to align backend architecture with overall product strategy and user experience
An Ideal Candidate Should Have
Proven experience in backend development: TS/Node (required)
Direct experience with GCP and knowledge of AWS, Azure, or other cloud providers
Efficiency in ideation and implementation, prioritizing tasks based on urgency and impact
Preferred: Experience with Docker and containerized deployments
Preferred: Proficiency in deploying high availability applications on Kubernetes
What We Offer
A dynamic environment where your contributions shape the company and its products
A team that values innovation, intuition, and drive
Autonomy, fostering focus and creativity
The opportunity to have a significant impact in a revolutionary industry
Competitive compensation, a welcoming atmosphere, and a commitment to an exceptional asynchronous work culture
The privilege of working on a product that changes lives, particularly for those with learning differences like dyslexia, ADD, and more
An active role at the intersection of artificial intelligence and audio - a rapidly evolving tech domain
The United States-based salary range for this role is $140,000-$200,000 USD/year + bonus + stock, depending on experience
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
DevOps Engineer (IGEN)
Data engineer job in Appleton, WI
We're seeking a hands-on and detail-oriented DevOps Engineer to join our Operations team and help build the foundation for scalable, reliable software delivery. This role is a key contributor to our cloud infrastructure, CI/CD pipelines, and deployment automation across both legacy and modern systems.
IGEN, a division of U.S. Venture, is a leading tax compliance software company trusted by industry leaders. Our tax compliance platform, powered by a data engine, gives you the complete toolkit for reducing redundancy and risk while managing tax compliance implications across the organization.
The DevOps Engineer will work closely with software developers and IT team members to improve build systems, optimize cloud environments, and ensure our applications are deployed reliably and securely.
This position will be located at our Appleton, WI office.
JOB RESPONSIBILITIES
Essential Job Responsibilities:
Own the administration and continuous improvement of CI/CD pipelines and deploy automation.
Participate in sprint planning and technical scoping to ensure DevOps tasks align with objectives.
Implement infrastructure-as-code (IaC) to assist with the administration of hybrid environments.
Contribute to the design and maintenance of secure, automated deployment of API workflows.
Create and document infrastructure processes, scripts, and configuration standards for APIs.
Design and implement backup, recovery, and high-availability strategies for critical systems.
Contribute to the creation and execution of unit tests to ensure quality and reliability of code.
Collaborate with development teams to troubleshoot software build and deployment issues.
Support the technical transition of legacy systems to cloud-native application architecture.
Support Azure cloud operations, including resource provisioning, scaling, and monitoring.
Additional Job Responsibilities:
Live our values of High Performance, Caring Relationships, Strategic Foresight, and Entrepreneurial Spirit
Find A Better Way by championing continuous improvement and quality control efforts to identify opportunities to innovate and improve efficiency, accuracy, and standardization
Continuously learn and develop self professionally
Support corporate efforts for safety, government compliance, and all other company policies & procedures
Perform other related duties as required and assigned
QUALIFICATIONS
Required:
5 - 8+ years of experience in DevOps, Site Reliability Engineering, IT automation, or related roles.
Hands-on knowledge of CI/CD best practices, including rollback, gateway, & release strategies.
Experience with building, deploying, and hosting full-stack web applications (React & Node.js).
Proficient in Jenkins, Git/Bitbucket, PowerShell, Terraform, and scripting for automation tasks.
Familiarity with Windows Server, Active Directory, RDS/RDP, SQL Server, and/or Microsoft IIS.
Knowledge of software development methodologies including Agile, Kanban, and Scrum.
A problem-solver with high standards, attention to detail, and a bias for execution.
A mindset for security, performance, and reliability in software delivery.
Exposure to .NET build pipelines and C# application architecture.
Preferred:
Experience migrating and modernizing desktop applications.
Experience with hybrid cloud environments and MSP collaboration.
Knowledge of Windows Forms (WinForms) deployments and versioning.
Experience supporting compliance-heavy environments (i.e., tax, financial services).
Knowledge of networking, security, and compliance principles in cloud environments.
DIVISION:
IGEN
U.S. Venture requires that a team member have and maintain authorization to work in the country in which the role is based. In general, U.S. Venture does not sponsor candidates for nonimmigrant visas or permanent residency unless based on business need.
U.S. Venture will not accept unsolicited resumes from recruiters or employment agencies. In the absence of an executed recruitment Master Service Agreement, there will be no obligation to any referral compensation or recruiter fee. In the event a recruiter or agency submits a resume or candidate without an agreement, U.S. Venture shall reserve the right to pursue and hire those candidate(s) without any financial obligation to the recruiter or agency. Any unsolicited resumes, including those submitted to hiring managers, shall be deemed the property of U.S. Venture.
U.S. Venture, Inc. is an equal opportunity employer that is committed to inclusion and diversity. We ensure equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, marital status, age, national origin, disability, veteran status, genetic information, or other protected characteristic. If you need assistance or an accommodation due to a disability, you may call Human Resources at **************.
Cloud Engineer
Data engineer job in Oshkosh, WI
**Direct Hire Opportunity**
Salary range, depending on experience: $91K-$153K
Standard working hours
Hybrid work model, with three days in-office and two days remote each week.
The preferred location for this role is Oshkosh, WI, with relocation assistance available for candidates who are not local. However, we are also open to candidates based out of the following locations:
Frederick, MD
McConnellsburg, PA
Must Have
Current Microsoft Azure Gov. Cloud experience
Job Description
The Senior Cloud Engineer will have an emphasis on driving profitable cloud adoption while driving the innovation and consumption of cloud-based services across the enterprise. Ensure the security of the cloud environment by implementing security controls, threat protection, and managing identity and access. Handle data feeds and integration processes to ensure seamless data exchange between systems. Assess innovation and vendor alignment for the continuous build-out and scale of the cloud ecosystem. Provide infrastructure support to ensure operational excellence, ensuring issue resolution, disaster recovery, and data backup standards are defined.
YOUR IMPACT
Provide day-to-day operational support of the Microsoft Azure Gov Cloud ecosystem, ensuring installations, configurations, deployments, integrations, updates, and administration are streamlined using Azure Kubernetes, Terraform Enterprise, GitHub, and equivalent tools.
Advanced understanding of public and private clouds, IaaS, data security standards, system messaging, and data feeds.
Conducting triage and resolution of cloud ecosystem issues and outages
Active member of project teams following senior leadership design and project plans
Cross functional team communicating and coordinating efforts for server, storage, support, database, security, etc.
Learning and adopting new technologies and best practices across cloud, continuous integration, automation, and process improvement
Continuously monitor and analyze cloud ecosystem and performance.
Advanced knowledge in one of the primary cloud ecosystems (database, virtualization, containerization, DevOps, networking, servers, scripting, etc.)
Cross functional team communicating and coordinating efforts for database, applications, and infrastructure activities.
Intimate knowledge of the ITIL process, ticket resolution, and stakeholder response
Design, implement, and operate complex solutions.
Other duties as assigned.
Regular attendance is required.
MINIMUM QUALIFICATIONS
Bachelor's degree in information technology or related field.
Four (4) or more years of experience in the field or in a related area.
Monitoring, troubleshooting, scripting, deployment, integration, messaging, automation, orchestration
Written and verbal communication skills, problem solving, time management, teamwork, detail orientation, customer service.
Entry Level Engineer
Data engineer job in Neenah, WI
United Plastic Fabricating is the industry leader in the manufacture of plastic water tanks for the fire industry. In addition, we design and manufacture a variety of products for the industrial and transportation markets.
As an entry-level engineer, you could work in any of these disciplines and/or rotate through each for training:
Sales Engineer, Quality Engineer, Manufacturing Engineer, Design Engineer.
Assists in preparation of preliminary outlines, layouts and sketches for sales quotes. Assist customers with questions such as design feasibility and volume calculations. Evaluates new orders for manufacturability/warranty concerns. Receive, examine and enter sales orders to verify completeness and accuracy of data and manufacturability. Review and resolve any previously reported issues prior to releasing.
Provide engineering and quality assurance to ensure products are produced to meet customer requirements and expectations. Work in conjunction with other business and engineering disciplines, using a cross-functional approach, to ensure new, improved, and current products and processes comply with applicable quality management systems and standards.
Works cross-functionally to capture and communicate information for possible design and manufacturability concerns or opportunities.
Improve manufacturing efficiency by analyzing and planning workflow, space requirements, and equipment layout. Develop fixtures and automate or semi-automate processes.
Develop manufacturing processes by studying product requirements, researching, designing, modifying and testing manufacturing methods, equipment and material handling conveyance.
Assure product and process quality by designing testing methods, verifying process capabilities, establishing standards, and confirming manufacturing processes.
Perform data gathering and historical trending for continuous improvement.
Requirements
BS Degree in Engineering
CAD and drafting skills.
Excellent mechanical aptitude necessary
Excellent benefits including Medical, Life, Dental, Disability insurance, 401K with employer match, student loan assistance, and gainsharing!
Visit UPF's website @ ********************* to see our career page and submit your resume
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
No relocation allowance for this position.
100% in person role
This Employer uses E-Verify
Building Lead - Appleton Weekends-23960 (Fox Valley Operations Non Medical)
Data engineer job in Appleton, WI
Operating in a $1 billion-plus industry, KleenMark is Wisconsin's largest independent commercial cleaning and supply company. Built on 60 years of experience, KleenMark uses proven processes and the industry's best-trained teams to deliver unmatched service. With expertise in healthcare, commercial, life sciences, manufacturing, and education, KleenMark's 900-plus technicians clean more than 30 million square feet daily. We are a family-owned and family-run business that lives out our values of Trust, Teamwork and Results.
We have excellent opportunities for you to join our team!
Job Skills / Requirements
Job details:
Schedule: Saturday & Sunday
Hours: 4pm- 8pm
Pay: $19.50
Additional Details
Building Leads are responsible for maintaining the cleanliness of the building in which they work by performing various cleaning duties. Duties and hours may vary depending on the size of the building and the number of teammates they work with. A cleaner may be responsible for any or all of the following tasks, and tasks may change throughout a cleaner's employment.
ESSENTIAL JOB FUNCTIONS
Note: This is not an all-inclusive list. Additional duties may be assigned.
Restrooms | Cleans and disinfects sinks, countertops, toilets, mirrors, floors, etc. Replenishes bathroom supplies. Polishes metalwork, such as fixtures and fittings.
Floors | Sweeps, mops, and vacuums floors using brooms, mops, and vacuum cleaners. Other floor work may be required, such as scrubbing, waxing, and polishing floors.
Break rooms /Kitchenettes | Cleans and disinfects sinks, countertops, tables, chairs, refrigerators, etc. Replenishes break room supplies.
Dust | Dusts furniture, equipment, partitions, etc.
Trash | Empties wastebaskets and recyclables and transports to disposal area.
Other Duties | Cleans rugs, carpets, and upholstered furniture, using vacuum cleaner (hip or backpack). Washes walls and woodwork. Washes windows, door panels, partitions, sills, etc.
EXPECTATIONS
Reports to work on time each day and works extra hours when needed.
Employee must comply with proper safety policies and procedures as required (i.e., when using cleaning chemicals, reporting incidents, etc.).
Provides excellent level of customer service to both internal and external customers by maintaining a positive attitude.
The employee must be able to assess the neatness, accuracy, and thoroughness of the work assigned.
Additional Information / Benefits
Medical, Vision & Dental Insurance for qualifying positions.
Personal Time Off (PTO) for qualifying positions.
6 Paid federal holidays after 90 days for qualifying positions.
Employee Referral Bonus
Instant Pay Access through DailyPay.
Employee of the Month, Quarter and Year Employee Recognition Program.
Growth within the company.
Great work/life balance
Safety First:
Personal protective equipment provided or required
Safety Monthly Trainings for all employees.
Sanitizing, disinfecting, or cleaning procedures in place
Employees working in medical facilities are required to wear a mask and gloves during the entirety of their shift. We provide all necessary PPE.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Affirmative Action/EEO statement Kleenmark is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
This job reports to Krissia Henriquez.
This is a Part-Time position 2nd Shift, Weekends.
Number of Openings for this position: 1
Senior ServiceNow Developer
Data engineer job in Oshkosh, WI
Job Description
Travel: Minimal (occasional conference or annual HQ visit)
Citizenship Requirement: Must be a U.S. Citizen
Function as a change agent in process optimization and strategic planning for ServiceNow Technology and Business innovation.
Have competencies around Automation and AI in ServiceNow and help deploy solutions that drive efficiencies and business benefits.
Hybrid: must work 3 days a week (Monday-Wednesday) at one of the following sites:
Oshkosh, WI (preferred)
Hagerstown, MD
Requirement:
1. Strong knowledge of web technologies, including REST/SOAP APIs, HTML, and JavaScript
2. Strong ServiceNow development skills
3. Experience with ITSM/ITOM
Preference:
1. ServiceNow certifications such as Certified Application Developer or Certified Implementation Specialist
2. Familiarity with ITIL and enterprise service management frameworks
As a Senior ServiceNow Developer, you will be responsible for the architecture and development of solutions and optimization of our ServiceNow platform. You will collaborate closely with business stakeholders and technical teams to design and deliver scalable, secure, and efficient solutions aligned with enterprise strategies and business goals.
Additional responsibilities include driving process improvement, innovation, and building a culture based on the DT Competencies of customer obsession, agility, results, and entrepreneurship as well as the Oshkosh leadership traits.
WORK LOCATION
The preferred location for this role is at our Global Headquarters in Oshkosh, WI, with relocation assistance available for candidates who are not local.
However, we are also open to candidates based out of the following U.S. office locations: Frederick, MD
YOUR IMPACT
Design, development, and implementation of ServiceNow solutions tailored to business requirements.
Drive platform enhancements through ServiceNow upgrades and new feature implementations, including participation in semi-annual upgrades.
Apply advanced knowledge of ServiceNow modules such as ITAM and ITOM to drive best practices and efficiencies.
Provide prompt and effective support for day-to-day platform operations, addressing incidents, root cause analysis, and long-term resolution.
Collaborate with stakeholders to analyze requirements, propose solutions, and translate business needs into technical deliverables.
Advanced experience with SDLC or Agile methodologies, ensuring on-time, within-budget delivery.
Document architectural designs, data flows, code, and configurations to maintain traceability and platform integrity.
Partner with ITOM, ITAM, and CMDB experts to enhance discovery, automation, and asset management capabilities.
Ensure platform performance, data quality, and compliance with security standards.
Lead special projects and workstreams within the ecosystem to drive business process optimization and ROI.
Consult with business partners on best practices for integrating across systems using REST, SOAP, JSON, and API standards.
Partner with cross-functional stakeholders and business leads to educate, present, and showcase design impacts and innovation.
Function as a change agent in process optimization and strategic planning for technology and business innovation.
MINIMUM QUALIFICATIONS
Five (5) or more years of ServiceNow development or implementation experience
ITIL, COBIT, financial planning, budgeting, business acumen
Written and verbal communication, customer service, stakeholder alignment, relationship building, problem solving, critical thinking, leadership, coaching, delegation, design thinking
STANDOUT QUALIFICATIONS
Bachelor's degree in Computer Science, Information Technology, or a related field
7+ years of experience managing enterprise IT platforms, including ServiceNow and/or integrations platforms like Boomi
Proven experience leading ServiceNow Implementations and custom development
Knowledge of ITIL, DevOps, and Agile Methodologies
Experience in AI/ML driven automation and self-service solutions
Digital Technology Data Scientist Lead
Data engineer job in Oshkosh, WI
**At Oshkosh, we build, serve and protect people and communities around the world by designing and manufacturing some of the toughest specialty trucks and access equipment. We employ over 18,000 team members all united by a common purpose. Our engineering and product innovation help keep soldiers and firefighters safe, is critical in building and keeping communities clean and helps people do their jobs every day.**
**SUMMARY:**
As a Data Scientist, your primary responsibilities will be to analyze and interpret datasets using statistical techniques, machine learning, and programming skills to extract insights and build models that solve business problems.
**YOUR IMPACT** **:**
+ Apply familiarity with data science tools such as Azure Databricks, Power BI, Microsoft Office (including Excel pivot tables), Spark, Python, R, and C.
+ Undertake the processing of multiple datasets, including structured and unstructured, to analyze large amounts of information to discover trends and patterns.
+ Prepare and concisely deliver analysis results in visual and written forms that communicate data insights to both technical and non-technical audiences.
+ Collaborate with cross-functional teams (e.g. data analysts, data engineers, architects, business stakeholders) to understand data needs for complex business requirements.
+ Build highly complex predictive models and machine-learning algorithms, and execute their integration into existing systems or the creation of new products.
+ Direct some data science assignments, projects, visualization tasks, data quality improvements, and troubleshooting of data incidents, including the resolution of root causes.
+ Lead efforts to resolve and document solutions to track and manage incidents, changes, problems, tasks, and demands.
+ Coach and mentor other team members on new technologies and best practices across data science and business intelligence.
+ Possess an advanced understanding of business processes in at least one area of the business and a working understanding of processes across multiple areas.
+ Actively support the advancement of the strategic roadmap for data science.
**MINIMUM QUALIFICATIONS:**
+ Bachelor's degree with five (5) or more years of experience in the field or in a related area.
**STANDOUT QUALIFICATIONS:**
+ Master's or doctoral degree
+ Expertise in Power Platforms
+ Familiarity with LLMs (open-source or closed)
+ Experience in front-end web app development (e.g., Flask, Gradio)
+ Familiarity with RAG architecture
**Pay Range:**
$115,600.00 - $196,400.00
The above pay range reflects the minimum and maximum target pay for the position across all U.S. locations. Within this range, individual pay is determined by various factors, including the scope and responsibilities of the role, the candidate's experience, education and skills, as well as the equity of pay among team members in similar positions. Beyond offering a competitive total rewards package, we prioritize a people-first culture and offer various opportunities to support team member growth and success.
Oshkosh is committed to working with and offering reasonable accommodation to job applicants with disabilities. If you need assistance or an accommodation due to disability for any part of the employment process, please contact us at ******************************************.
Oshkosh Corporation is a merit-based Equal Opportunity Employer. Job opportunities are open for application to all qualified individuals and selection decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, or other protected characteristic. To the extent that information is provided or collected regarding categories as provided by law it will in no way affect the decision regarding an employment application.
Oshkosh Corporation will not discharge or in any manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with Oshkosh Corporation's legal duty to furnish information.
Certain positions with Oshkosh Corporation require access to controlled goods and technologies subject to the International Traffic in Arms Regulations or the Export Administration Regulations. Applicants for these positions may need to be "U.S. Persons," as defined in these regulations. Generally, a "U.S. Person" is a U.S. citizen, lawful permanent resident, or an individual who has been admitted as a refugee or granted asylum.
Building Lead - Appleton-23960 (Fox Valley Operations Non Medical)
Data engineer job in Appleton, WI
Operating in a $1 billion-plus industry, KleenMark is Wisconsin's largest independent commercial cleaning and supply company. Built on 60 years of experience, KleenMark uses proven processes and the industry's best-trained teams to deliver unmatched service. With expertise in healthcare, commercial, life sciences, manufacturing, and education, KleenMark's 900-plus technicians clean more than 30 million square feet daily. We are a family-owned and family-run business that lives out our values of Trust, Teamwork, and Results.
We have excellent opportunities for you to join our team!
Job Skills / Requirements
Job details:
Schedule: Monday-Friday
Hours: 5pm-12:30am
Pay: $18
Additional Details
Building Leads are responsible for maintaining the cleanliness of the building in which they work by performing various cleaning duties. Duties and hours may vary depending on the size of the building and the number of teammates they work with. A cleaner may be responsible for any or all of the following tasks, and tasks may change throughout a cleaner's employment.
ESSENTIAL JOB FUNCTIONS
Note: This is not an all-inclusive list. Additional duties may be assigned.
Restrooms | Cleans and disinfects sinks, countertops, toilets, mirrors, floors, etc. Replenishes bathroom supplies. Polishes metalwork, such as fixtures and fittings.
Floors | Sweeps, mops, and vacuums floors using brooms, mops, and vacuum cleaners. Other floor work may be required, such as scrubbing, waxing, and polishing floors.
Break rooms /Kitchenettes | Cleans and disinfects sinks, countertops, tables, chairs, refrigerators, etc. Replenishes break room supplies.
Dust | Dusts furniture, equipment, partitions, etc.
Trash | Empties wastebaskets and recyclables and transports to disposal area.
Other Duties | Cleans rugs, carpets, and upholstered furniture, using vacuum cleaner (hip or backpack). Washes walls and woodwork. Washes windows, door panels, partitions, sills, etc.
EXPECTATIONS
Reports to work on time each day and works extra hours when needed.
Employee must comply with proper safety policies and procedures as required (i.e., when using cleaning chemicals, reporting incidents, etc.).
Provides excellent level of customer service to both internal and external customers by maintaining a positive attitude.
The employee must be able to assess the neatness, accuracy, and thoroughness of the work assigned.
Additional Information / Benefits
Medical, Vision & Dental Insurance for qualifying positions.
Personal Time Off (PTO) for qualifying positions.
6 Paid federal holidays after 90 days for qualifying positions.
Employee Referral Bonus
Instant Pay Access through DailyPay.
Employee of the Month, Quarter and Year Employee Recognition Program.
Growth within the company.
Great work/life balance
Safety First:
Personal protective equipment provided or required
Safety Monthly Trainings for all employees.
Sanitizing, disinfecting, or cleaning procedures in place
Employees working in medical facilities are required to wear a mask and gloves during the entirety of their shift. We provide all necessary PPE.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Affirmative Action/EEO statement Kleenmark is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.
This job reports to Krissia Henriquez.
This is a Full-Time position 2nd Shift, 3rd Shift.
Number of Openings for this position: 1