AZCO Data Scientist - IT (Appleton, WI)
Data engineer job in Appleton, WI
The Staff Data Scientist plays a critical role in leveraging data to drive business insights, optimize operations, and support strategic decision-making. You will be responsible for designing and implementing advanced analytical models, developing data pipelines, and applying statistical and machine learning techniques to solve complex business challenges. This position requires a balance of technical expertise, business acumen, and communication skills to translate data findings into actionable recommendations.
+ Develop and apply data solutions that cleanse data, remove errors, and check consistency.
+ Analyze data to discover information, business value, patterns, and trends that guide the development of asset business solutions.
+ Gather data, find patterns and relationships, and create prediction models to evaluate client assets.
+ Conduct research and apply existing data science methods to business line problems.
+ Monitor client assets and perform predictive and root cause analysis to identify adverse trends; choose best fit methods, define algorithms, and validate and deploy models to achieve desired results.
+ Produce reports and visualizations to communicate technical results and interpretation of trends; effectively communicate findings and recommendations to all areas of the business.
+ Collaborate with cross-functional stakeholders to assess needs, provide assistance and resolve problems.
+ Translate business problems into data science solutions.
+ Perform other duties as assigned.
+ Comply with all policies and standards.
**Qualifications**
+ Bachelor's degree in Analytics, Computer Science, Information Systems, Statistics, Math, or a related field from an accredited program, plus 4 years of related experience, required; additional experience may be substituted for the degree requirement.
+ Experience in data mining and predictive analytics.
+ Strong problem-solving skills, analytical thinking, attention to detail and hypothesis-driven approach.
+ Excellent verbal/written communication, and the ability to present and explain technical concepts to business audiences.
+ Proficiency with data visualization tools (Power BI, Tableau, or Python libraries).
+ Experience with Azure Machine Learning, Databricks, or similar ML platforms.
+ Expert proficiency in Python with pandas, scikit-learn, and statistical libraries.
+ Advanced SQL skills and experience with large datasets.
+ Experience with predictive modeling, time series analysis, and statistical inference.
+ Knowledge of A/B testing, experimental design, and causal inference.
+ Familiarity with computer vision for image/video analysis.
+ Understanding of NLP techniques for document processing.
+ Experience with optimization algorithms and operations research techniques preferred.
+ Knowledge of machine learning algorithms, feature engineering, and model evaluation.
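As an illustration of the A/B-testing and statistical-inference skills listed above, a two-proportion z-test can be sketched with only the Python standard library. The conversion counts here are hypothetical, and real experiments would also consider power and multiple-comparison corrections:

```python
import math
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (A/B test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))        # two-sided p-value
    return z, p_value

# Hypothetical counts: variant B converts 120/1000 vs. A's 100/1000.
z, p = two_proportion_z_test(100, 1000, 120, 1000)
print(round(z, 3), round(p, 3))
```

At these sample sizes the observed lift does not reach conventional significance, which is exactly the kind of nuance the role would need to communicate to a business audience.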
This job posting will remain open a minimum of 72 hours and on an ongoing basis until filled.
EEO/Disabled/Veterans
**Job** Information Technology
**Primary Location** US-WI-Appleton
**Schedule:** Full-time
**Travel:** Yes, 5% of the time
**Req ID:** 253790
Data Engineer
Data engineer job in Appleton, WI
Amplifi is the go-to data consultancy for enterprise organizations that want their success to be driven by data. We empower our clients to innovate, grow and succeed by establishing and delivering strategies across all elements of the data value chain. From the governance and management of data through to analytics and automation, our integrated approach to modern data ecosystems delivers measurable results through a combination of expert consultancy and best-in-breed technology. Our company and team members are proud to empower our clients' businesses by providing exceptional solutions and value, as we truly believe their success is our success. We thrive on delivering excellent solutions and overcoming technical and business challenges. As such, we're looking for like-minded individuals to learn, grow, and mentor others as a part of the Amplifi family.
Position Summary
The Data Engineer will be responsible for designing, building, and maintaining scalable, secure data pipelines that drive analytics and support operational data products. The ideal candidate brings a strong foundation in SQL, Python, and modern data warehousing with a deep understanding of Snowflake, Databricks, or Microsoft Fabric, and a solid understanding of cloud-based architectures.
What You Will Get To Do
Design, develop, and optimize robust ETL/ELT pipelines to ingest, transform, and expose data across multiple systems.
Build and maintain data models and warehouse layers, enabling high-performance analytics and reporting.
Collaborate with analytics, product, and engineering teams to understand data needs and deliver well-structured data solutions.
Write clean, efficient, and testable code in SQL and Python to support automation, data quality, and transformation logic.
Support deployment and orchestration workflows, using Azure Data Factory, dbt, or similar tools.
Work across multi-cloud environments (Azure preferred; AWS and GCP optional) to integrate data sources and manage cloud-native components.
Contribute to CI/CD practices and data pipeline observability (monitoring, logging, alerting).
Ensure data governance, security, and compliance in all engineering activities.
Support ad hoc data science and machine learning workflows within Dataiku.
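The SQL-and-Python pipeline work described above can be sketched with the standard library alone. This is a minimal extract-validate-load loop; the table and column names are hypothetical, and a production pipeline would use an orchestrator and log its data-quality rejects:

```python
import sqlite3

# Hypothetical staging table: raw order events, including one bad row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount TEXT, region TEXT);
    INSERT INTO staging_orders VALUES
        (1, '19.99', 'NE'), (2, 'oops', 'NE'), (3, '5.00', 'SW');
    CREATE TABLE fact_orders (order_id INTEGER PRIMARY KEY,
                              amount REAL NOT NULL, region TEXT NOT NULL);
""")

def run_pipeline(conn):
    """Extract from staging, drop rows that fail validation, load the rest."""
    rows = conn.execute("SELECT order_id, amount, region FROM staging_orders")
    clean = []
    for order_id, amount, region in rows:
        try:
            clean.append((order_id, float(amount), region))  # type check
        except ValueError:
            continue  # data-quality reject; a real pipeline would log this
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)

loaded = run_pipeline(conn)
print(loaded)
```

The same pattern, with the validation expressed as testable transformation logic, scales up to the dbt- or Data Factory-orchestrated pipelines the role describes.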
What You Bring to the Team
4+ years of experience in a data engineering or related software engineering role.
Proficiency in SQL and Python for data manipulation, transformation, and scripting.
Strong experience working with Snowflake and MSSQL Server.
Practical knowledge of working with cloud data platforms, especially Microsoft Azure.
Experience with modern data modeling and warehouse optimization techniques.
Experience with Databricks, Azure Data Factory, or dbt preferred.
Exposure to Microsoft Fabric components like OneLake, Pipelines, or Direct Lake.
Familiarity with cloud services across AWS, GCP, or hybrid cloud environments.
Understanding of or curiosity about Dataiku for data science and advanced analytics collaboration.
Ability to work independently and with a team in a hybrid/remote environment.
Location
Wisconsin is preferred.
Travel
Ability to travel up to 10% of the time.
Benefits & Compensation
Amplifi offers excellent compensation and benefits including, but not limited to, health, dental, 401(k) program, employee assistance program, short and long-term disability, life insurance, accidental death and dismemberment (AD&D), PTO program, flex work schedules and paid holidays.
Equal Opportunity Employer
Amplifi is proud to be an equal opportunity employer. We do not discriminate against applicants based on race, religion, disability, medical condition, national origin, gender, sexual orientation, marital status, gender identity, pregnancy, childbirth, age, veteran status or other legally protected characteristics.
Data Architect II
Data engineer job in Appleton, WI
Join our Corporate Data and Analytics team as we continue to expand our data capabilities! The Data Architect II role will be responsible to design, manage, and optimize the organization's data infrastructure for a specific data domain. This individual will ensure that data is structured, secure, and accessible to support business operations and decision-making, will mentor junior data architects and will provide cross functional governance leadership.
The ideal/preferred location for this position is Appleton, WI. We will consider candidates in other locations based on the relevance of their related experience.
JOB RESPONSIBILITIES
Essential Job Responsibilities:
* Create and maintain conceptual, logical, and physical data models. Document data flows, lineage, and dependencies for assigned data domains.
* Collaborate with data engineers, data analysts, and business partners to understand business requirements and align the data model to them.
* Capture metadata associated with new data projects. Manage Metadata Repositories. Coordinate with business partners to maintain data catalog information for assigned data domains. Implement metadata and lineage tracking within domain.
* Manage and monitor data quality assessments. Communicate and resolve data quality issues with business stewards.
* Enforce governance rules and policies for assigned data domain. Leverage master data management tools and canonical data practices to govern data.
* Define schemas, transformations, and integration rules. Plan and design data integration methods. Support data integration activities.
* Enforce security protocols and policies on assigned data domains. Measure and monitor access to sensitive and secure data.
* Work closely with data engineers, analysts, and business stakeholders to align data architecture with organizational goals.
* Own one or more data domains end-to-end, including integration rules, data catalog upkeep, and governance enforcement.
* Support regulatory compliance initiatives (GDPR, CCPA, HIPAA, SOX depending on company domain).
* Introduce cloud-first design patterns and domain-oriented governance practices.
Additional Job Responsibilities:
* Live our values of High Performance, Caring Relationships, Strategic Foresight, and Enterprising Spirit
* Find A Better Way by championing continuous improvement and quality control efforts to identify opportunities to innovate and improve efficiency, accuracy, and standardization.
* Continuously learn and develop self professionally.
* Support corporate efforts for safety, government compliance, and all other company policies & procedures.
* Perform other related duties as required and assigned.
QUALIFICATIONS
Required:
* Bachelor's in computer science, Information Systems, or related field
* 4+ years in a data engineering or BI Developer role, including 2 years of experience in data modeling
* Experience with data analysis tools for data research including languages such as SQL or Python and exploration tools such as Power BI, Tableau, or Looker.
* Experience using cloud data platforms such as AWS, Azure, GCP, Snowflake, and Databricks.
* Ability to translate complex business needs into technical architectural solutions.
* Strong documentation and communication skills
* Excellent analytical and problem-solving skills
Preferred:
* Awareness of data governance frameworks and tools
* Familiarity with compliance regulations
* Familiarity with database management tools and business intelligence tools
DIVISION:
Corporate
U.S. Venture requires that a team member have and maintain authorization to work in the country in which the role is based. In general, U.S. Venture does not sponsor candidates for nonimmigrant visas or permanent residency unless based on business need.
U.S. Venture will not accept unsolicited resumes from recruiters or employment agencies. In the absence of an executed recruitment Master Service Agreement, there will be no obligation to any referral compensation or recruiter fee. In the event a recruiter or agency submits a resume or candidate without an agreement, U.S. Venture shall reserve the right to pursue and hire those candidate(s) without any financial obligation to the recruiter or agency. Any unsolicited resumes, including those submitted to hiring managers, shall be deemed the property of U.S. Venture.
U.S. Venture, Inc. is an equal opportunity employer that is committed to inclusion and diversity. We ensure equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender, gender identity or expression, marital status, age, national origin, disability, veteran status, genetic information, or other protected characteristic. If you need assistance or an accommodation due to a disability, you may call Human Resources at **************.
Senior Data Engr
Data engineer job in Neenah, WI
The Sr. Data Engineer will oversee the department's data integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data integration and analysis. This role will troubleshoot and resolve data-related issues, collaborating with the team and our vendor consultant to identify root causes, and will recommend and deploy data models and solutions for existing data systems. This individual plays a key role in designing and developing the data pipelines that modernize our financial back end into our Azure-based solution, collaborating with SMEs, Data Analysts, and Business Analysts to achieve this goal. This role also serves as a mentor for junior Data Engineers by sharing knowledge and experience.
RESPONSIBILITIES:
Serve as a mentor to less senior Data Engineers on the team
Troubleshoot data discrepancies within the database tables of the IAP data model
Provide data to Data Analysts
Assist other developers with further developing the IAP data model
Develop, maintain, and optimize current data pipelines, including, but not limited to, data from our policy administration systems, claims system, agent-facing portals, and third-party data assets
Apply best practices for development, including query optimization, version control, code reviews, and technical documentation
Develop complex data objects for business analytics using data modeling techniques
Ensure data quality and implement tools and frameworks for automating the identification of data quality issues
Analyze and propose improvements to existing Data Structures and Data Movement Processes
Perform assigned project tasks in a highly interactive team environment while maintaining a productive and positive atmosphere
Stay current with P&C insurance knowledge and industry technology and utilize that knowledge with existing productivity tools, standards, and procedures to contribute to the cost-effective operation of the department and company
Other duties as assigned
QUALIFICATIONS:
ESSENTIAL:
Associate's or Bachelor's degree in an IT-related field, or an equivalent combination of education and experience, with business analysis experience in a related field
5+ years of data pipeline/ETL/ELT development experience in a corporate environment
Knowledge of the SDLC, business processes and technical aspects of data pipelines and business intelligence outcomes
Experience with Azure DevOps, Azure Synapse, and Azure Data Lake/SQL
Experience working with XML and JSON data formats
Expert in SQL for data mapping, extracting, and validating source data to enable accurate reporting and data feeds
Experience with large system design and implementation
Collaborative team member with a strong ability to contribute positively to team dynamics and culture (Team and culture fit)
Results oriented, self-motivated, resourceful team player
Superior oral and written communication skills
Demonstrates thought leadership
Able to prioritize and work through multiple tasks
PREFERRED:
P&C Insurance industry experience
Policy, Claims, Billing and Finance experience
Agile methodology
Experience with Duck Creek software solutions
Experience with Azure DevOps (ADO)
At SECURA, we are transforming the insurance experience by putting authenticity at the forefront of everything we do. Our mission is clear: we're making insurance genuine. We recognize that our associates are our greatest assets, and we invest in their well-being and professional growth. We offer opportunities for continuous learning and career advancement, competitive benefits, and a culture that champions work-life balance. Joining SECURA means becoming part of a dynamic team that values each individual's contribution and fosters a collaborative atmosphere. Here, you'll not only find a fulfilling career but also a place where you can make a positive impact every day.
SECURA Insurance strives to provide equal opportunity for all employees and is committed to fostering an inclusive work environment. We welcome applicants from all backgrounds and walks of life.
Data Architect
Data engineer job in Appleton, WI
Seeking a Data Architect to help their growing data team transform how the company operates with data. This person will own data architecture for smaller projects, design models end-to-end, and collaborate with business stakeholders to define sources, business logic, and governance standards.
Responsibilities:
Design and implement data models across multiple domains
Define source systems, tables, and business logic for unified models
Partner with IT and business teams to ensure governed, reliable data
Support cloud adoption (Azure/GCP) while managing on-prem data
Contribute to data governance and architecture best practices
Requirements:
4+ years in data roles (engineering, BI, analytics)
2+ years in data architecture
Strong data modeling skills
Business-facing communication experience
Familiarity with Azure or GCP
Understanding of data governance principles
Skills
Data modeling, data architecture, GCP, Azure, data governance, SQL, Power BI, Python, database management, compliance regulations, warehouse management systems
Top Skills Details
Data modeling, data architecture, GCP, Azure, data governance, SQL, Power BI
Additional Skills & Qualifications
Strong analytical and problem-solving skills
Ability to work independently and collaboratively in a team environment
Comfortable with hybrid work model and occasional travel
Experience with relational databases and SQL
Exposure to BI tools and ETL processes
Awareness of data governance frameworks and tools
Familiarity with compliance regulations
Familiarity with database management tools and business intelligence tools
Job Type & Location
This is a Permanent position based out of Appleton, WI.
Pay and Benefits
The pay range for this position is $105,000.00 - $130,000.00/yr.
1. Reimbursement programs for wellness (gym memberships, group classes)
2. Health, vision, and dental coverage starting day 1
3. 7% company match on 401(k)
4. PTO, holidays, sick days, volunteer time off, and caregiver leave
5. Short- and long-term disability
Workplace Type
This is a fully onsite position in Appleton, WI.
Application Deadline
This position is anticipated to close on Dec 25, 2025.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
Data Warehouse Engineer I
Data engineer job in Menasha, WI
The Data Warehouse Engineer I is part of a team dedicated to supporting Network Health's Enterprise Data Warehouse. This individual will perform development, analysis, testing, debugging, documentation, implementation, and maintenance of interfaces to support the Enterprise Data Warehouse and related applications. They will consult with other technical resources and key departmental users on solutions and best practices. They will monitor performance and effectiveness of the data warehouse and recommend changes as appropriate.
Location: Candidates must reside in the state of Wisconsin for consideration. You may work from your home office (reliable internet is required), from our office in Brookfield or Menasha, or a combination of both in our hybrid workplace model.
Hours: 1.0 FTE, 40 hours per week, 8am-5pm Monday through Friday, may be required to work later hours when system changes are being implemented or problems arise
Check out our 2024 Community Report to learn a little more about the difference our employees make in the communities we live and work in. As an employee, you will have the opportunity to work hard and have fun while getting paid to volunteer in your local neighborhood. You, too, can be part of the team and make a difference. Apply to this position to learn more about our team.
Job Responsibilities:
Perform end to end delivery of data interfaces in various stages of the Enterprise Data Warehouse in accordance with professional standards and industry best practice
Perform all phases of the development lifecycle including solution design, creation of acceptance criteria, implementation, technical documentation, development and execution of test cases, performance monitoring, troubleshooting, data analysis, and profiling
Consult with Developers, Engineers, DBAs, key departmental stakeholders, data governance and leadership on technical solutions and best practice
Monitor and audit the Enterprise Data Warehouse for effectiveness, throughput, and responsiveness. Recommend changes as appropriate. Troubleshoot customer complaints related to system performance issues
Maintain effective communication with customers from all departments for system development, implementation, and problem resolution
Participate in on-call rotation to assist in the resolution of technical problems
Other duties and responsibilities as assigned
Job Requirements:
Requires Associate Degree in Computer Science, Business, or related technical field; equivalent years of experience may be substituted
Minimum of 1 year of experience in program interfacing required
Experience with T-SQL development, SSIS development, and database troubleshooting skills required
Network Health is an Equal Opportunity Employer
Data Scientist
Data engineer job in Luxemburg, WI
Your career at Deutsche Börse Group
Your Area of Work
Join Clearstream Fund Services as a Data Scientist to design and prototype data products that empower data monetization and business users through curated datasets, semantic models, and advanced analytics. You'll work across the data stack, from pipelines to visualizations, and contribute to the evolution of AI-driven solutions.
Your Responsibilities
* Prototype data products including curated datasets and semantic models to support data democratization and self-service BI
* Design semantic layers to simplify data access and usability
* Develop and optimize data pipelines using data engineering tools (e.g., Databricks)
* Use SQL, Python, and PySpark for data processing and transformation
* Create Power BI dashboards to support prototyping and reporting
* Apply ML/AI techniques to support early-stage modeling and future product innovation
* Collaborate with data product managers, functional analysts, engineers, and business stakeholders
* Ensure data quality, scalability, and performance in all deliverables
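The SQL-plus-Python transformation work listed above can be sketched with a minimal, stdlib-only example. Table and column names here are hypothetical, and sqlite3 stands in for the Databricks/PySpark stack the posting actually names:

```python
import sqlite3

# Hypothetical "fund positions" dataset; a real pipeline would read from
# a lakehouse table rather than an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (fund TEXT, asset TEXT, value REAL)")
conn.executemany(
    "INSERT INTO positions VALUES (?, ?, ?)",
    [("Alpha", "Bond", 100.0), ("Alpha", "Equity", 250.0), ("Beta", "Bond", 80.0)],
)

# A curated, aggregated view of the kind a semantic layer might expose
# to self-service BI users.
rows = conn.execute(
    "SELECT fund, SUM(value) AS total FROM positions GROUP BY fund ORDER BY fund"
).fetchall()
print(rows)  # [('Alpha', 350.0), ('Beta', 80.0)]
```

The same GROUP BY aggregation would typically be expressed in PySpark (`df.groupBy("fund").sum("value")`) when running on Databricks.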
Your Profile
* Master's in Data Science, Computer Science, Engineering, or related field
* 3+ years of experience in data pipeline development and prototyping in financial services or fund administration
* Proficiency in SQL, Python, and PySpark
* Hands-on experience with Databricks
* Experience building Power BI dashboards and semantic models
* Strong analytical and communication skills
* Fluent in English
Digital Technology Data Scientist Lead
Data engineer job in Oshkosh, WI
At Oshkosh, we build, serve and protect people and communities around the world by designing and manufacturing some of the toughest specialty trucks and access equipment. We employ over 18,000 team members all united by a common purpose. Our engineering and product innovation helps keep soldiers and firefighters safe, is critical in building and keeping communities clean, and helps people do their jobs every day.
SUMMARY:
As a Data Scientist, your primary responsibilities will be to analyze and interpret datasets by using statistical techniques, machine learning, and/or programming skills in order to extract insights and build models which solve business problems.
YOUR IMPACT:
Familiarity with data science tools, such as Azure Databricks, Power BI, Microsoft Office (including Excel pivot tables), Spark, Python, R and C.
Undertake the processing of multiple datasets, including structured and unstructured, to analyze large amounts of information to discover trends and patterns.
Prepare and concisely deliver analysis results in visual and written forms that communicate data insights to both technical and non-technical audiences.
Collaborate with cross-functional teams (e.g. data analysts, data engineers, architects, business stakeholders) to understand data needs for complex business requirements.
Build highly complex predictive models and machine-learning algorithms, and execute their integration into existing systems or the creation of new products.
Direct some data science assignments, projects, visualization tasks, data quality improvements, and troubleshooting of data incidents, including the resolution of root causes.
Lead efforts to resolve and document solutions to track and manage incidents, changes, problems, tasks, and demands.
Coach and mentor other team members on new technologies and best practices across data science and business intelligence.
Possesses an advanced understanding of business processes in at least one area of the business and a working understanding of business processes in multiple other areas.
Actively supports the advancement of the strategic roadmap for data science.
MINIMUM QUALIFICATIONS:
Bachelor's degree with five (5) or more years of experience in the field or in a related area.
STANDOUT QUALIFICATIONS:
Master's or doctoral degree
Expertise in Power Platform
Familiarity with LLMs (open source or closed)
Experience in front-end web app development (e.g., Flask, Gradio)
Familiarity with RAG architecture
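The RAG architecture mentioned above pairs a retrieval step with an LLM: the most relevant documents are retrieved and passed to the model as grounding context. The retrieval half can be sketched in pure Python; the documents, query, and bag-of-words similarity below are illustrative stand-ins for a real embedding-based retriever:

```python
from collections import Counter
import math

# Toy corpus of maintenance documents (illustrative placeholders).
docs = [
    "boom lift hydraulic maintenance schedule",
    "telehandler operator safety manual",
    "fire truck pump service procedure",
]

def bow(text):
    """Bag-of-words term counts (a crude stand-in for embeddings)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Retrieve the best-matching document; in a real RAG system this text
# would then be inserted into the LLM prompt as context.
query = bow("hydraulic maintenance for boom lift")
best = max(docs, key=lambda d: cosine(query, bow(d)))
print(best)
```

A production setup would swap the bag-of-words scoring for dense embeddings in a vector store, but the retrieve-then-generate flow is the same.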
Pay Range:
$115,600.00 - $196,400.00
The above pay range reflects the minimum and maximum target pay for the position across all U.S. locations. Within this range, individual pay is determined by various factors, including the scope and responsibilities of the role, the candidate's experience, education and skills, as well as the equity of pay among team members in similar positions. Beyond offering a competitive total rewards package, we prioritize a people-first culture and offer various opportunities to support team member growth and success.
Oshkosh is committed to working with and offering reasonable accommodation to job applicants with disabilities. If you need assistance or an accommodation due to disability for any part of the employment process, please contact us at ******************************************.
Oshkosh Corporation is a merit-based Equal Opportunity Employer. Job opportunities are open for application to all qualified individuals and selection decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, or other protected characteristic. To the extent that information is provided or collected regarding categories as provided by law it will in no way affect the decision regarding an employment application.
Oshkosh Corporation will not discharge or in any manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with Oshkosh Corporation's legal duty to furnish information.
Certain positions with Oshkosh Corporation require access to controlled goods and technologies subject to the International Traffic in Arms Regulations or the Export Administration Regulations. Applicants for these positions may need to be "U.S. Persons," as defined in these regulations. Generally, a "U.S. Person" is a U.S. citizen, lawful permanent resident, or an individual who has been admitted as a refugee or granted asylum.
Lead Data Steward
Data engineer job in Menasha, WI
You've discovered something special. A company that cares. Cares about leading the way in construction, engineering, manufacturing and renewable energy. Cares about redefining how energy is designed, applied and consumed. Cares about thoughtfully growing to meet market demands. And, as "one of the Healthiest 100 Workplaces in America," is focused on the mind/body/soul of team members through our Culture of Care.
We are seeking a highly motivated and detail-oriented FTI Data Steward Leader to establish and champion the data governance within our organization. As data governance is a new initiative for the company, this role will be pivotal in laying the foundation, defining standards, and ensuring that data is treated as a critical enterprise asset. The FTI Data Steward Leader will help lead a community of business and technical data stewards across domains, promote best practices, and work closely with data governance leadership to develop policies, data quality standards, and stewardship processes. This role requires a strategic thinker with strong leadership skills and a deep understanding of data management principles.
MINIMUM REQUIREMENTS
Education: Bachelor's degree in Information Management, Data Science, Business, or a related field, or equivalent experience in lieu of a degree.
Experience: 5+ years of experience in data management, data governance, or related disciplines, with demonstrated experience leading data stewardship or governance initiatives enterprise-wide. Strong knowledge of data quality principles, metadata management, and master data management (MDM). Familiarity with data governance frameworks (e.g., DAMA DMBOK, DCAM) and tools (e.g., Informatica).
Travel: 0-10%
Work Schedule: This position works between the hours of 7 AM and 5 PM, Monday through Friday. However, work may be performed at any time on any day of the week to meet business needs.
KEY RESPONSIBILITIES
Leadership and Strategy:
Serves as the primary representative and advocate for the community of master data leads and data coordinators.
Leads master data management (MDM) strategies and policies to support FTI.
Collaborates with data governance leadership to define the enterprise data stewardship framework, standards, and playbooks.
Creates and implements data governance policies, procedures, and maturity roadmaps.
Leads the formation of a data governance council and facilitates regular working sessions.
Stewardship and Quality:
Ensures consistent application of data definitions, metadata standards, and classification across data domains.
Develops data quality standards with data stewards and helps resolve data quality issues with the appropriate stewards.
Collaboration and Stakeholder Engagement:
Partners with business units, BT, and Data Analytics teams to identify data steward leaders and data ownership roles.
Facilitates communication between business and technical stakeholders to resolve data issues and improve data understanding.
Acts as a liaison between the data governance and operational teams to ensure stewardship initiatives are aligned with business needs.
Metadata and Cataloging:
Works with data governance and Data Analytics team to maintain an enterprise data catalog.
Supports documentation of business glossaries, data dictionaries, and lineage across key data assets.
Training and Change Management:
Promotes data literacy and fosters a data-centric culture across the organization.
Leads change management efforts related to data governance adoption and tool implementation.
Performs other related duties as required and assigned.
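The data quality standards work described above often starts by codifying rules that stewards can run against records. A minimal sketch, where the field names and rules are illustrative assumptions rather than FTI's actual standards:

```python
# Hypothetical data quality rules a steward might define for a
# customer domain: each field maps to a validation predicate.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v.strip() != "",
    "country": lambda v: v in {"US", "CA", "MX"},
}

def validate(record):
    """Return the list of fields that fail their data quality rule."""
    return [f for f, rule in RULES.items() if not rule(record.get(f))]

good = {"customer_id": "C-100", "country": "US"}
bad = {"customer_id": "", "country": "Wisconsin"}
print(validate(good))  # []
print(validate(bad))   # ['customer_id', 'country']
```

Governance tools such as Informatica express the same idea as managed rule sets with scorecards; the value of codifying rules is that quality issues become measurable and routable to the right steward.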
The job description and responsibilities described are intended to provide guidelines for job expectations and the employee's ability to perform the position described. It is not intended to be construed as an exhaustive list of all functions, responsibilities, skills, and abilities. Additional functions and requirements may be assigned by supervisors as deemed appropriate.
How Does FTI Give YOU the Chance to Thrive?
If you're energized by new challenges, FTI provides you with many opportunities. Joining FTI opens doors to redefine what's possible for your future.
Once you're a team member, you're supported and provided with the knowledge and resources to achieve your career goals with FTI. You're officially in the driver's seat of your career, and FTI's career development and continued education programs give you opportunities to position yourself for success.
FTI is a “merit to the core” organization. We recognize and reward top performers, offering competitive, merit-based compensation, career path development and a flexible and robust benefits package.
Benefits are the Game-Changer
We provide industry-leading benefits as an investment in the lives of team members and their families. You're invited to review the full list of FTI benefits available to regular/full-time team members. Start here. Grow here. Succeed here. If you're ready to learn more about your career with FTI, apply today!
Faith Technologies, Inc. is an Equal Opportunity Employer - veterans/disabled.
Upgrades Engineer
Data engineer job in Green Bay, WI
Description
Upgrades Department
FAC ENG/CARPENTER 1
Data engineer job in Green Bay, WI
Desired Previous Job Experience
General carpentry experience especially in food retail and/or commercial facilities.
Higher level internal Administrative positions, higher-level General Office Administrative positions, and Division office administrative positions.
Minimum Position Qualifications:
Ability to lift at least 25 lbs.
Ability to work as part of a team in a fast-paced work environment and a willingness to help all members of the department.
Flexibility to handle ever-changing scope of assignments and projects.
Ability to work a variety of schedules as required (including nights, weekends and holidays). Overtime may be required.
Excellent communication skills.
Ability to prioritize job functions.
Personal initiative and follow-through to completion.
Attention to detail and accuracy.
Strong PC skills. Position will utilize Windows XP, MS Excel, CPC Ultrasite/Controllers and MS Outlook. Candidates must possess the ability to utilize and learn Kroger-specific mainframe applications including Viryanet Service Hub.
Valid driver's license
Essential Job Functions:
Complete commercial carpentry skills such as installation of drywall, ceramic and vinyl floor tile installation, cabinet or shelving repair, door and window repairs or installations, cement work, acoustical ceilings and locks.
Complete and submit accurate reports of time and vehicle inventory by scheduled dates via company computer programs.
Perform preventive maintenance in assigned stores at least twice per year.
Ensure maintenance expense (by account code) of assigned stores meet annual budgetary guidelines.
Must be able to perform the essential functions of this position with or without reasonable accommodation
Plumbing Engineer
Data engineer job in Green Bay, WI
Job Description
Specific duties include plan, design and engineer building plumbing systems and fire protection systems for a wide variety of markets, production of construction documents utilizing Revit software, and maintaining client relationships in the plumbing and fire protection disciplines for facilities projects. Additional duties include complete plumbing calculations, energy analysis and sustainable design (pertaining to plumbing systems), existing system condition assessments, and construction administration activities. This position reports to the Director of Engineering.
General Duties and Responsibilities
Exceptional interpersonal skills are essential.
Demonstrated ability to clearly communicate project objectives, problems and solutions with clients, field personnel and internal staff.
Ability to facilitate close work with clients, field personnel and internal staff to achieve deadlines and quality work.
Demonstrated ability to work well in a team/collaborative setting, make sound decisions and drive action cross-functionally.
Demonstrated strong written, verbal presentation and communication skills.
Developing knowledge of building codes, regulations and requirements; thorough knowledge of building materials and construction methods; knowledge of procedures and the sequencing of construction documents.
Ability to organize and prioritize personal workload and make necessary adjustments to meet strict deadlines; must be results-oriented and milestone-driven.
Demonstrated exceptional planning, analytical and problem-solving skills.
Minimum Requirements
Licensed/registered mechanical engineer (PE).
5 years of plumbing and fire protection design/engineering experience
Must be proficient in Microsoft Office (Word, Excel, Outlook); Revit experience is preferred.
Ability to function effectively in a demanding environment and work on multiple projects simultaneously
Physical Requirements and Environmental Adaptability
Ability to move around construction sites in various stages of development from excavation to completion and beyond, without the assistance of others
Work is performed in an office setting and in the field; job site monitoring may be performed in various weather conditions
Activities may be necessary outside of the normal office hours depending on workload, schedules and client geographical location
Occasional travel will be required
The salary range for this position is between $90,000 - $102,000 based on experience
Software Engineer, Platform - Green Bay, USA
Data engineer job in Green Bay, WI
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its 2025 Design Award winner for Inclusivity.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. These include frontend and backend engineers and AI research scientists, with backgrounds at Amazon, Microsoft, and Google, leading PhD programs like Stanford's, and high-growth startups like Stripe, Vercel, and Bolt, as well as many founders of their own companies.
Overview
The responsibilities of our Platform team include building and maintaining all backend services, including, but not limited to, payments, analytics, subscriptions, new products, text-to-speech, and external APIs.
This is a key role and ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Design, develop, and maintain robust APIs including public TTS API, internal APIs like Payment, Subscription, Auth and Consumption Tracking, ensuring they meet business and scalability requirements
Oversee the full backend API landscape, enhancing and optimizing for performance and maintainability
Collaborate on B2B solutions, focusing on customization and integration needs for enterprise clients
Work closely with cross-functional teams to align backend architecture with overall product strategy and user experience
An Ideal Candidate Should Have
Proven experience in backend development: TS/Node (required)
Direct experience with GCP and knowledge of AWS, Azure, or other cloud providers
Efficiency in ideation and implementation, prioritizing tasks based on urgency and impact
Preferred: Experience with Docker and containerized deployments
Preferred: Proficiency in deploying high availability applications on Kubernetes
What We Offer
A dynamic environment where your contributions shape the company and its products
A team that values innovation, intuition, and drive
Autonomy, fostering focus and creativity
The opportunity to have a significant impact in a revolutionary industry
Competitive compensation, a welcoming atmosphere, and a commitment to an exceptional asynchronous work culture
The privilege of working on a product that changes lives, particularly for those with learning differences like dyslexia, ADD, and more
An active role at the intersection of artificial intelligence and audio - a rapidly evolving tech domain
The United States-based salary range for this role is 140,000-200,000 USD/year + bonus + stock, depending on experience
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
Java Software Engineer
Data engineer job in Green Bay, WI
Candidates need 10 years of experience, including 5 years engineering software solutions using Java and Spring Boot.
Core Characteristics and Soft Skills:
Beyond technical proficiency, the right mindset and interpersonal skills are crucial for success on our team. We'd prioritize candidates who demonstrate:
Problem-Solving Acumen: The ability to analyze complex problems, break them down, evaluate different approaches, and implement robust, efficient solutions. This includes troubleshooting existing systems and designing new ones.
Independence and Initiative: We value engineers who can take ownership of tasks, research potential solutions independently, make informed decisions, and drive work forward with minimal supervision once objectives are clear.
Dependability and Accountability: Team members must be reliable, meet commitments, deliver high-quality work, and take responsibility for their contributions.
Strong Communication Skills: Clear, concise communication (both written and verbal) is essential. This includes explaining technical concepts to varied audiences, actively listening, providing constructive feedback, and documenting work effectively.
Collaboration and Teamwork: Ability to work effectively within a team structure, share knowledge, participate in code reviews, and contribute to a positive team dynamic.
Adaptability and Eagerness to Learn: The technology landscape and business needs evolve. We seek individuals who are curious, adaptable, and willing to learn new technologies and methodologies.
Core Technical Skillset:
Our current technology stack forms the foundation of our work. Proficiency or strong experience in the following areas is highly desirable:
Backend Development:
Java: Deep understanding of Java (latest LTS versions preferred).
Spring Boot: Extensive experience building applications and microservices using the Spring Boot framework and its ecosystem (e.g., Spring Data, Spring Security, Spring Cloud).
Messaging Systems:
Apache Kafka: Solid understanding of Kafka concepts (topics, producers, consumers, partitioning, brokers) and experience building event-driven systems.
Containerization & Orchestration:
Kubernetes: Practical experience deploying, managing, and troubleshooting applications on Kubernetes.
OCP (OpenShift Container Platform): Experience specifically with OpenShift is a significant advantage.
AKS (Azure Kubernetes Service): Experience with AKS is also highly relevant.
(General Docker knowledge is expected)
CI/CD & DevOps:
GitHub Actions: Proven experience in creating, managing, and optimizing CI/CD pipelines using GitHub Actions for build, test, and deployment automation.
Understanding of Git branching strategies and DevOps principles.
Frontend Development:
JavaScript: Strong proficiency in modern JavaScript (ES6+).
React: Experience building user interfaces with the React library and its common patterns/ecosystem (e.g., state management, hooks).
Database & Data Warehousing:
Oracle: Experience with Oracle databases, including writing efficient SQL queries, understanding data modeling, and potentially PL/SQL.
Snowflake: Experience with Snowflake cloud data warehouse, including data loading, querying (SQL), and understanding its architecture.
Scripting:
Python: Proficiency in Python for scripting, automation, data manipulation, or potentially backend API development (e.g., using Flask/Django, though Java/Spring is primary).
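The Kafka partitioning concept listed under Messaging Systems can be illustrated in a few lines: records with the same key always route to the same partition, which is what preserves per-key ordering in an event-driven system. This pure-Python sketch uses CRC32 as a simplified stand-in for Kafka's actual murmur2 default partitioner:

```python
import zlib

NUM_PARTITIONS = 4  # assumed topic configuration, for illustration

def partition_for(key: str) -> int:
    # Deterministic hash so routing is stable across producers and runs;
    # Kafka's default partitioner applies the same idea with murmur2.
    return zlib.crc32(key.encode()) % NUM_PARTITIONS

# Events keyed by order ID: every event for "order-1" lands on the same
# partition, so a single consumer of that partition sees them in order.
orders = ["order-1", "order-2", "order-1", "order-3", "order-1"]
routing = {k: partition_for(k) for k in orders}
print(routing)
```

This is why choosing a good record key matters in Kafka: it determines both ordering guarantees and how evenly load spreads across partitions.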
Requirements
Domain Understanding (Transportation & Logistics): While not strictly mandatory, candidates with experience or a demonstrated understanding of the transportation and logistics industry (e.g., supply chain management, freight operations, warehousing, fleet management, routing optimization, TMS systems) will be able to contribute more quickly and effectively. They can better grasp the business context and user needs.
Additional Valuable Skills: We are also interested in candidates who may possess skills in related areas that complement our core activities:
Data Science & Analytics:
Experience with data analysis techniques.
Knowledge of Machine Learning (ML) concepts and algorithms (particularly relevant for optimization, forecasting, anomaly detection in logistics).
Proficiency with Python data science libraries (Pandas, NumPy, Scikit-learn).
Experience with data visualization tools and techniques.
Understanding of optimization algorithms (linear programming, vehicle routing problem algorithms, etc.).
Cloud Platforms: Broader experience with cloud services (particularly Azure, but also AWS or GCP) beyond Kubernetes (e.g., managed databases, serverless functions, monitoring services).
Testing: Strong experience with automated testing practices and tools (e.g., JUnit, Mockito, Cypress, Selenium, Postman/Newman).
API Design & Management: Deep understanding of RESTful API design principles, API security (OAuth, JWT), and potentially experience with API gateways.
Monitoring & Observability: Experience with tools like Prometheus, Grafana, ELK Stack (Elasticsearch, Logstash, Kibana), Datadog, Dynatrace, etc., for monitoring application health and performance.
Security: Awareness and application of secure coding practices (e.g., OWASP Top 10).
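The optimization algorithms mentioned under Data Science & Analytics (e.g., vehicle routing) can be illustrated with a toy nearest-neighbor heuristic: always drive to the closest unvisited stop. The stop coordinates below are hypothetical, and real routing systems use far stronger methods (savings algorithms, metaheuristics, MIP solvers), but this shows the shape of the problem:

```python
import math

# Hypothetical depot and delivery stops on a 2D plane.
stops = {"depot": (0, 0), "A": (2, 0), "B": (2, 2), "C": (0, 3)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_route(start="depot"):
    """Greedy single-vehicle route: repeatedly visit the closest unvisited stop."""
    route, remaining = [start], set(stops) - {start}
    while remaining:
        # sorted() makes tie-breaking deterministic
        nxt = min(sorted(remaining), key=lambda s: dist(stops[route[-1]], stops[s]))
        route.append(nxt)
        remaining.remove(nxt)
    return route

print(nearest_neighbor_route())  # ['depot', 'A', 'B', 'C']
```

Nearest-neighbor gives no optimality guarantee; it is just a fast baseline that production logistics code would improve on with 2-opt moves or a dedicated VRP solver.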
Cloud Engineer
Data engineer job in Oshkosh, WI
**Direct Hire Opportunity**
Salary range depending on experience: $91K - $153K
Standard working hours. Hybrid work model, with three days in-office and two days remote each week. The preferred location for this role is Oshkosh, WI, with relocation assistance available for candidates who are not local. However, we are also open to candidates based out of the following locations:
Frederick, MD
McConnellsburg, PA
Must Have
Current Microsoft Azure Gov. Cloud experience
Job Description
The Senior Cloud Engineer will have an emphasis on driving profitable cloud adoption while driving the innovation and consumption of cloud-based services across the enterprise. Ensure the security of the cloud environment by implementing security controls, threat protection, and managing identity and access. Handle data feeds and integration processes to ensure seamless data exchange between systems. Assess innovation and vendor alignment for the continuous build-out and scale of the cloud ecosystem. Provide infrastructure support to ensure operational excellence, ensuring issue resolution, disaster recovery, and data backup standards are defined.
YOUR IMPACT
Provide day-to-day operational support of the Microsoft Azure Gov Cloud ecosystem, ensuring installations, configuration, deployments, integrations, updates, and administration are streamlined using Azure Kubernetes, Terraform Enterprise, GitHub, and equivalent tools.
Advanced understanding of public and private clouds, IaaS, data security standards, system messaging, data feeds.
Conducting triage and resolution of cloud ecosystem issues and outages
Active member of project teams following senior leadership design and project plans
Cross-functional team communication and coordination of efforts for server, storage, support, database, security, etc.
Learning and adopting new technologies and best practices across cloud, continuous integration, automation, and process improvement
Continuously monitor and analyze cloud ecosystem and performance.
Advanced knowledge in 1 of the primary cloud ecosystems (database, virtualization, containerization, DevOps, networking, servers, scripting, etc.)
Cross-functional team communication and coordination of efforts for database, application, and infrastructure activities.
Intimate knowledge of the ITIL process, ticket resolution, and stakeholder response
Design, implement, and operate complex solutions.
Other duties as assigned.
Regular attendance is required.
MINIMUM QUALIFICATIONS
Bachelor's degree in information technology or related field.
Four (4) or more years of experience in the field or in a related area.
Monitoring, troubleshooting, scripting, deployment, integration, messaging, automation, orchestration
Written and communication skills, problem solving, time management, teamwork, detail oriented, customer service.
Senior ServiceNow Developer
Data engineer job in Oshkosh, WI
Job Description
Senior ServiceNow Developer (Hands-On | End-to-End Delivery)
Oshkosh, WI (preferred)
Frederick, MD
Employment Type: Full-Time
Travel: Minimal (occasional conference or annual team meeting)
Citizenship: U.S. Citizen required
Role Overview
We are seeking a hands-on Senior ServiceNow Developer to serve as a technical leader within our Digital Technology organization. This role is responsible for owning the full development lifecycle, from solution design and hands-on development through implementation, deployment, and ongoing optimization of the ServiceNow platform.
This is not a ServiceNow Administrator or configuration-only role. The successful candidate will actively write code, build custom solutions, and lead implementation efforts while partnering with business and technical stakeholders to deliver scalable, secure, and high-impact ServiceNow solutions.
Key Responsibilities
End-to-End Development & Implementation
Lead the design, development, testing, and implementation of ServiceNow solutions aligned to enterprise and business objectives.
Perform hands-on development using ServiceNow scripting and web technologies, including:
JavaScript (Business Rules, Script Includes, Client Scripts, UI Policies)
Flow Designer and workflow automation
Custom applications and modules
Translate business requirements into technical designs and working solutions, ensuring maintainability and performance.
Own deployments, upgrades, and feature releases, including participation in semi-annual ServiceNow upgrades.
Platform Architecture, Automation & Integration
Architect and implement integrations using REST, SOAP, JSON, and APIs.
Partner with ITOM, ITAM, and CMDB teams to enhance:
Discovery and service mapping
Automation and event-driven workflows
Asset and configuration data quality
Design and deploy automation and AI-enabled solutions within ServiceNow to improve efficiency, reduce manual work, and drive business outcomes.
Technical Leadership & Governance
Serve as a technical lead, setting development standards and best practices.
Review and guide other developers' code and designs.
Document solution architectures, data flows, code, and configurations to ensure traceability and platform integrity.
Function as a change agent for process optimization and ServiceNow-driven business innovation.
Collaboration & Continuous Improvement
Collaborate with business partners and cross-functional teams to analyze needs and propose scalable solutions.
Educate stakeholders on platform capabilities and design decisions.
Lead special projects and workstreams focused on ROI, operational excellence, and continuous improvement.
Required Qualifications
5+ years of hands-on ServiceNow development or implementation experience
Strong proficiency in:
JavaScript
REST/SOAP APIs
HTML
Demonstrated experience delivering custom ServiceNow development, not just configuration
Advanced experience with ITSM, ITOM, ITAM, and CMDB
Experience owning solutions across the full SDLC (design, build, test, deploy, support)
Strong communication, stakeholder engagement, and problem-solving skills
Preferred / Standout Qualifications
ServiceNow certifications (e.g., Certified Application Developer, Implementation Specialist)
Experience leading enterprise-scale ServiceNow implementations
Familiarity with ITIL, DevOps, and Agile methodologies
Experience with AI/ML-driven automation and self-service solutions
Exposure to integration platforms such as Boomi
Bachelor's degree in Computer Science, Information Technology, or related field
What This Role Is Not
Not a ServiceNow Administrator-only role
Not a low-code / no-code or citizen developer position
Not a process or governance-only role
This position requires active, hands-on development through implementation.
Requirement:
1- Strong knowledge of web technologies including REST/SOAP APIs, HTML and JavaScript
2- Strong ServiceNow development skills
3- Experience with ITSM/ITOM
Preference:
1- ServiceNow certifications such as Certified Application Developer or Implementation Specialist
2-Familiarity with ITIL and enterprise service management frameworks
Waterfront Engineer II
Data engineer job in Marinette, WI
Full-time Description
Days: M-F Hours of Work: Full Time - 40 Hours/Week + potential OT
Travel Requirements: 100% On-Site, Marinette, WI
Signing Bonus: No
Salary: Negotiable based on experience
Background check, drug testing and US Citizenship are requirements for this position.
The Engineer II's purpose is to solve technical problems, develop applicable documentation, and supervise all assigned resources to support the assigned project. The Engineer II assists the Senior Engineers, Engineering Supervisors and Project Engineers in determining the technical requirements of the project, defining resource requirements to achieve schedule and budget commitments, and solving technical issues as they arise within the field of competence.
• Work independently in solving technical issues
• Plan, estimate, schedule and monitor assigned work and the work of assigned resources
• Track and report progress to budget and schedule commitments
• Supervise assigned resources
• Facilitate communications of issues and solutions throughout MMC
• Prepare, check and issue professional quality deliverables
• Interpret contract requirements
• Develop contract change documents to support the project needs
• May be required to define, track and control subcontractor activities
• Abide by and enforce all organizational policies and procedures.
• May be required to lead or participate in product improvement projects
LABOR CATEGORY - SUPERVISORY RESPONSIBILITIES
The Engineer II may directly supervise 2 to 8 employees. Supervisory responsibilities are carried out in accordance with the Company's policies, procedures, and applicable laws. Responsibilities include interviewing and training employees; planning, assigning, and directing work; appraising performance; and addressing complaints and resolving problems. The Engineer II also demonstrates leadership skills that align with the mission, vision, and values of the Company.
LABOR CATEGORY - QUALIFICATIONS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
LABOR CATEGORY - EDUCATION AND EXPERIENCE
• Bachelor's Degree (4 Year Degree) in field of competence (or equivalent) or High School Diploma/GED and 13 years of equivalent experience.
• 4 years of experience of increasing responsibility in the engineering field of competence.
• Supervisory experience desired.
• Six Sigma or LEAN training desired.
LABOR CATEGORY - SKILLS
• Must have technological skills (see attachment for specific skills) proven by verifiable and specific work experience, technical publication, or patent issue.
• Must have a working knowledge of standard shipbuilding rules and regulations, as well as the associated regulatory agencies and the review & approval process.
• Must have current computer skills relevant to the field of competence.
• Must have ability to work within schedule and budget restraints.
• Must have the ability to maintain a professional demeanor.
• Must have effective written and verbal communication skills.
• Must have ability to maintain confidentiality.
• A basic Earned Value Management System competency is a bonus
• Process Improvement Specialist knowledge and experience beneficial.
LABOR CATEGORY - COMPETENCIES: SPARC
• Safety First
• People oriented, determined to assist where able
• Accountability and Ownership: Demonstrate a sense of urgency and ownership in your role
• Respect people's differences and be open to differing opinions and beliefs
• Continue to improve your current role and help develop it for the future based on lean management
LABOR CATEGORY - COMPETENCIES
• Ownership and Accountability
• Team Work and Collaboration
• Business Discipline
• Sense of Urgency
Education Requirements:
• Shipyard Waterfront Marine Engineer BS in Systems Engineering, Electrical Engineering, Mechanical Engineering, Structure Engineering or Shipyard Management Engineering
• USCG Steam/Motor/Gas Turbine, Unlimited Horsepower license
Requirements
TMMG is seeking a full-time Shipyard Waterfront Marine Engineer to provide waterfront support during the construction of Naval ship systems in a shipyard environment. The marine engineer must be able to read prints, work inside a 3D model of the ship, and develop drawings and reports to effect corrections to the 3D model's piping, electrical, or structural systems. Candidates must be self-motivated, self-directed, and self-managed, and must exhibit excellent verbal and written communication skills. Training in shipyard procedures and in using Navisworks to view the 3D model will be provided. The position is located in Marinette, WI.
Waterfront Engineer - Responsibilities
• Demonstrate experience in AutoCAD drawing development and 2D & 3D modeling
• Compare as-built ship conditions with design drawings using Navisworks 3D modeling software
• Verify the vessel is constructed IAW the 3D model and drawings, and troubleshoot interferences not identified during design
• Read prints, work inside a 3D model of the ship, and develop drawings and reports to effect corrections to the 3D model's piping, electrical, and structural systems
• Utilize engineering software to review designs and create engineering drawings. Produce 2D drawings from a 3D model.
• Identify deck plate deficiencies to assist ship production and manufacturing schedules
• Verify engineering changes are implemented IAW engineering drawings
• Conduct design calculations for piping, electrical and structural systems
• Identify new equipment and technologies to meet customer requirements
• Develop shipboard equipment operating instructions, manuals, and test procedures
• Develop maintenance procedures for marine equipment
• Experienced in AutoCAD, Navisworks, and/or SolidWorks engineering drawing development
• Proficient in Microsoft Word, Excel, and PowerPoint
• Must meet DOT drug screening requirements
• Familiar with the development of work items, drawings, analysis, cost estimates, and technical reports for ship repair or related projects
• Company Background Investigation upon hire
• COVID vaccination dependent on customer's requirements
• Secret security clearance desired but not required
• Climb tall ladders, crawl tanks/bilges in confined spaces, inspect from aerial lifts, and work in adverse environments for long periods. Periodically lift up to 50 pounds and walk to/from job sites.
Software Engineer, iOS Core Product - Green Bay, USA
Data engineer job in Green Bay, WI
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its App of the Day.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. These include frontend and backend engineers, AI research scientists, and others from Amazon, Microsoft, and Google, leading PhD programs like Stanford, high growth startups like Stripe, Vercel, Bolt, and many founders of their own companies.
Overview
With the growth of our iOS app, now the #18 app in the App Store's Productivity category, and our recent receipt of Apple's 2025 Design Award for Inclusivity, we need a Senior iOS Engineer to help us support the new user base as well as work on new and exciting projects to push our mission forward.
This is a key role, ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Lead key engineering and product decisions
Actively ship production code for the Speechify iOS app
Work within a dedicated product team
Participate in product discussions to shape the product roadmap
Maintain and enhance the existing complex app architecture
An Ideal Candidate Should Have
Experience. You've worked on products that scaled to a large user base
Track record. You have worked on various products from inception to decent traction. You have been responsible for engineering the product
Customer obsession. We expect every team member whose responsibilities directly impact customers to be constantly obsessed about providing the best possible experience
Product thinking. You make thoughtful decisions about the evolution of your product and guide internal teams and designers in the right direction
Speed. You work quickly to generate ideas and know how to decide which things can ship now and what things need time
Focus. We're a high-growth startup with a busy, remote team. You know how and when to engage or be heads down
Technical skills. Swift, SwiftUI
Technical Requirements:
Swift Programming Language
SwiftUI experience
Experience in Multithreading Programming
Working with CI/CD infrastructure
Experience with Fastlane
SOLID principles, with the ability to write every class according to them
Experience with Git and understanding of different Git strategies
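The SOLID requirement in the list above is language-agnostic even though the role itself is Swift-focused; as one concrete illustration, here is a minimal sketch of the dependency-inversion principle. All class and channel names are invented for the example, not taken from the posting.

```javascript
// Dependency inversion: the high-level Notifier depends on an abstract
// "channel" (any object with a send() method), not on a concrete transport.
class Notifier {
  constructor(channel) {
    this.channel = channel; // injected dependency
  }
  notify(message) {
    return this.channel.send(message);
  }
}

// Concrete channels can be swapped without touching Notifier,
// which also keeps the class open for extension but closed for modification.
const emailChannel = { send: (msg) => `email: ${msg}` };
const pushChannel = { send: (msg) => `push: ${msg}` };

new Notifier(emailChannel).notify("build passed"); // "email: build passed"
new Notifier(pushChannel).notify("build passed");  // "push: build passed"
```

The same shape translates directly to Swift, where the channel abstraction would be a protocol and the concrete channels conforming types.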
What We Offer:
A fast-growing environment where you can help shape the company and product
An entrepreneurial crew that supports risk, intuition, and hustle
The opportunity to make a big impact in a transformative industry
A competitive salary, a collegiate atmosphere, and a commitment to building a great asynchronous culture
Work on a product that millions of people use and where daily feedback includes users sharing that they cried when they first found the product because it was so impactful on their lives
Support people with learning differences like Dyslexia, ADD, Low Vision, Concussions, Autism, and Second Language Learners, and give reading superpowers to professionals all over the world
Work in one of the fastest growing sectors of tech: Intersection of Artificial Intelligence and Audio
The United States-based salary range for this role is $140,000-$200,000 USD/year + bonus + stock, depending on experience
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.