Data Scientist
Data scientist job in Portage, MI
# Job Description: AI Task Evaluation & Statistical Analysis Specialist
## Role Overview

We're seeking a data-driven analyst to conduct comprehensive failure analysis on AI agent performance across finance-sector tasks. You'll identify patterns, root causes, and systemic issues in our evaluation framework by analyzing task performance across multiple dimensions (task types, file types, criteria, etc.).

## Key Responsibilities

- **Statistical Failure Analysis**: Identify patterns in AI agent failures across task components (prompts, rubrics, templates, file types, tags)
- **Root Cause Analysis**: Determine whether failures stem from task design, rubric clarity, file complexity, or agent limitations
- **Dimension Analysis**: Analyze performance variations across finance sub-domains, file types, and task categories
- **Reporting & Visualization**: Create dashboards and reports highlighting failure clusters, edge cases, and improvement opportunities
- **Quality Framework**: Recommend improvements to task design, rubric structure, and evaluation criteria based on statistical findings
- **Stakeholder Communication**: Present insights to data labeling experts and technical teams

## Required Qualifications

- **Statistical Expertise**: Strong foundation in statistical analysis, hypothesis testing, and pattern recognition
- **Programming**: Proficiency in Python (pandas, scipy, matplotlib/seaborn) or R for data analysis
- **Data Analysis**: Experience with exploratory data analysis and creating actionable insights from complex datasets
- **AI/ML Familiarity**: Understanding of LLM evaluation methods and quality metrics
- **Tools**: Comfortable working with Excel, data visualization tools (Tableau/Looker), and SQL

## Preferred Qualifications

- Experience with AI/ML model evaluation or quality assurance
- Background in finance or willingness to learn finance domain concepts
- Experience with multi-dimensional failure analysis
- Familiarity with benchmark datasets and evaluation frameworks
- 2-4 years of relevant experience
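By way of illustration only, here is a minimal sketch of the kind of failure analysis this posting describes, assuming a hypothetical task-level export `tasks.csv` with columns such as `task_type` and a boolean `passed`; nothing here reflects the employer's actual data or tooling.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical export of task-level evaluation results:
# one row per task, with the dimension ("task_type") and a pass/fail outcome.
df = pd.read_csv("tasks.csv")  # columns assumed: task_type, file_type, passed (bool)

# Failure rate per task type, sorted to surface the worst-performing clusters.
failure_rates = (
    df.assign(failed=~df["passed"])
      .groupby("task_type")["failed"]
      .agg(["mean", "count"])
      .sort_values("mean", ascending=False)
)
print(failure_rates)

# Chi-square test of independence: is failure associated with task type,
# or are the observed differences plausible under random variation?
contingency = pd.crosstab(df["task_type"], df["passed"])
chi2, p_value, dof, _ = chi2_contingency(contingency)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.4f}")
```

Repeating the same group-by over `file_type` or individual rubric criteria, and plotting the results with matplotlib/seaborn, would give the multi-dimensional view and dashboards the role calls for.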
AI & GenAI Data Scientist - Manager
Data scientist job in Grand Rapids, MI
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals.
In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.
Enhancing your leadership style, you motivate, develop and inspire others to deliver quality. You are responsible for coaching, leveraging team members' unique strengths, and managing performance to deliver on client expectations. With your growing knowledge of how business works, you play an important role in identifying opportunities that contribute to the success of our Firm. You are expected to lead with integrity and authenticity, articulating our purpose and values in a meaningful way. You embrace technology and innovation to enhance your delivery and encourage others to do the same.
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
* Analyse and identify the linkages and interactions between the component parts of an entire system.
* Take ownership of projects, ensuring their successful planning, budgeting, execution, and completion.
* Partner with team leadership to ensure collective ownership of quality, timelines, and deliverables.
* Develop skills outside your comfort zone, and encourage others to do the same.
* Effectively mentor others.
* Use the review of work as an opportunity to deepen the expertise of team members.
* Address conflicts or issues, engaging in difficult conversations with clients, team members and other stakeholders, escalating where appropriate.
* Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
Minimum Degree Required
Bachelor's Degree
Minimum Year(s) of Experience
7 year(s)
Demonstrates extensive-level abilities and/or a proven record of success managing the identification and addressing of client needs:
* Managing development teams in building of AI and GenAI solutions, including but not limited to analytical modeling, prompt engineering, general all-purpose programming (e.g., Python), testing, communication of results, front end and back-end integration, and iterative development with clients
* Documenting and analyzing business processes for AI and Generative AI opportunities, including gathering of requirements, creation of initial hypotheses, and development of AI/GenAI solution approach
* Collaborating with client team to understand their business problem and select the appropriate models and approaches for AI/GenAI use cases
* Designing and solutioning AI/GenAI architectures for clients, specifically for plugin-based solutions (i.e., ChatClient application with plugins) and custom AI/GenAI application builds
* Managing teams to process unstructured and structured data to be consumed as context for LLMs, including but not limited to embedding of large text corpus, generative development of SQL queries, building connectors to structured databases
* Managing daily operations of a global data and analytics team on client engagements, reviewing developed models, providing feedback, and assisting in analysis;
* Directing data engineers and other data scientists to deliver efficient solutions to meet client requirements;
* Leading and contributing to development of proof of concepts, pilots, and production use cases for clients while working in cross-functional teams;
* Facilitating and conducting executive level presentations to showcase GenAI solutions, development progress, and next steps
* Structuring, writing, communicating, and facilitating client presentations; and
* Managing associates and senior associates through coaching, providing feedback, and guiding work performance.
Demonstrates extensive abilities and/or a proven record of success learning and performing in functional and technical capacities, including the following areas:
* Managing GenAI application development teams including back-end and front-end integrations
* Using Python (e.g., Pandas, NLTK, Scikit-learn, Keras, etc.), common LLM development frameworks (e.g., Langchain, Semantic Kernel), Relational storage (SQL), Non-relational storage (NoSQL);
* Experience in analytical techniques such as Machine Learning, Deep Learning and Optimization
* Vectorization and embedding, prompt engineering, and retrieval-augmented generation (RAG) workflow development (see the sketch after this list)
* Understanding of or hands-on experience with Azure, AWS, and/or Google Cloud platforms
* Experience with Git Version Control, Unit/Integration/End-to-End Testing, CI/CD, release management, etc.
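To illustrate the RAG workflow development item flagged in the list above, here is a minimal, hedged sketch of the retrieval step, with a toy `embed()` stand-in for a real embedding model; a production build would instead call an embedding API or a framework such as LangChain or Semantic Kernel, as the posting notes, and the corpus and query below are placeholders.

```python
import numpy as np

def embed(texts):
    """Toy stand-in for a real embedding model: hashed bag-of-words vectors.
    In practice this would call an embedding API or an LLM development framework."""
    vecs = np.zeros((len(texts), 256))
    for i, text in enumerate(texts):
        for tok in text.lower().split():
            vecs[i, hash(tok) % 256] += 1.0
    return vecs

def retrieve(query, corpus, corpus_vectors, k=3):
    """Return the k passages most similar to the query by cosine similarity."""
    q = embed([query])[0]
    norms = np.linalg.norm(corpus_vectors, axis=1) * np.linalg.norm(q) + 1e-9
    sims = corpus_vectors @ q / norms
    return [corpus[i] for i in np.argsort(sims)[::-1][:k]]

# Illustrative corpus of finance-flavored snippets.
corpus = [
    "Revenue grew 12% year over year.",
    "Headcount was flat in Q3.",
    "Cloud costs fell after the migration.",
]
vectors = embed(corpus)
print(retrieve("How did revenue change?", corpus, vectors, k=1))
# The retrieved passages would then be placed into the LLM prompt as context
# (the generation half of the RAG workflow).
```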
Travel Requirements
Up to 80%
Job Posting End Date
Learn more about how we work: **************************
PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: ***********************************
As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or, any other status protected by law.
For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles' Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for Employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all.
Applications will be accepted until the position is filled or the posting is removed, unless otherwise set forth on the following webpage. Please visit this link for information about anticipated application deadlines: ***************************************
The salary range for this position is: $99,000 - $232,000, plus individuals may be eligible for an annual discretionary bonus. For roles that are based in Maryland, this is the listed salary range for this position. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications and location, and applicable employment laws. PwC offers a wide range of benefits, including medical, dental, vision, 401k, holiday pay, vacation, personal and family sick leave, and more. To view our benefits at a glance, please visit the following link: ***********************************
Lead Data Scientist, Strategic Analytics - Data Science
Data scientist job in Grand Rapids, MI
Are you passionate about harnessing data to drive business strategy and shape decision-making at the highest levels? Do you thrive at the intersection of finance, analytics, and innovative problem-solving? If so, Deloitte invites you to step into a pivotal role as a Lead Data Scientist.
At Deloitte, you'll work alongside a team of accomplished professionals, delivering impactful financial planning and advanced analytics for our Strategic Analytics function in a rapidly evolving business landscape. As a strategic advisor to executive leaders, you'll have the opportunity to influence major decisions, develop cutting-edge data science solutions, and see your insights translate into real-world results.
We offer an environment built for growth-for both your technical and leadership abilities. You'll collaborate on high-visibility projects, contribute directly to business outcomes, and expand your expertise with the resources of a top-tier firm.
If you're driven to make a difference and ready for meaningful professional and personal development, consider joining our Data Science team at Deloitte Strategic Analytics.
The team and the role
The Strategic Analytics team is dedicated to delivering actionable insights to finance and operational leaders, empowering them to make data-driven decisions that drive organizational success. By leveraging advanced analytical techniques and cutting-edge technologies, the team transforms complex data into clear, strategic recommendations. Our team members collaborate closely with various departments to understand their unique challenges and opportunities, ensuring that our analyses are both relevant and impactful. Join us to be at the forefront of strategic decision-making, where your analytical skills will directly contribute to shaping the future of our Firm.
The successful candidate will possess:
Recruiting for this role ends on December 14, 2025
Key responsibilities:
+ Lead Data-Driven Consulting Projects: Manage end-to-end delivery of analytics and strategic initiatives for clients, translating business objectives into actionable data science solutions.
+ Advanced Data Analysis: Develop statistical models and machine learning algorithms to solve complex client problems and inform decision-making.
+ Strategic Planning & Execution: Support and drive high-impact projects by evaluating business challenges, designing project roadmaps, and implementing analytic solutions.
+ Stakeholder Engagement: Collaborate with cross-functional teams (including business, technology, and finance leads) to understand requirements, secure buy-in, and communicate findings effectively.
+ Insight Generation & Reporting: Transform large and complex data sets into meaningful insights, and communicate findings via presentations and dashboards tailored for executive audiences.
+ Continuous Improvement: Stay up to date with emerging data science trends, tools, and best practices to enhance project outcomes and client value.
+ Financial Analysis: Apply financial acumen to project work, offering insights on revenue impact, cost optimization, and other finance-related considerations.
Required Qualifications
+ Education: Bachelor's or Master's degree in a quantitative field (e.g., Data Science, Mathematics, Statistics, Engineering, Business Analytics, Finance, or related discipline).
+ Minimum of 3 years of relevant experience
+ Data Science Techniques: Proficient in advanced analytics, statistical modeling, machine learning, and data visualization tools (such as Python, R, SQL, Tableau, or Power BI).
+ Strategic Thinking: Demonstrated ability to translate analytical findings into actionable business strategies and recommendations.
+ Consulting Experience: Proven track record of delivering value in a client-facing or advisory capacity.
+ Communication: Demonstrated ability to explain technical concepts to non-technical stakeholders and craft executive presentations.
+ Ability to travel 0-5%, on average, based on the work you do and the clients and industries/sectors you serve
+ Limited immigration sponsorship may be available
Preferred Qualifications:
+ Finance Acumen: Familiarity with financial principles, statements, and metrics; previous experience supporting finance-related projects or functions is a plus.
+ Familiarity with Databricks and Azure platforms.
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $102,500 to $188,900.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation:
************************************************************************************************************
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Data Scientist
Data scientist job in Zeeland, MI
Why join us? Our purpose is to design for the good of humankind. It's the ideal we strive toward each day in everything we do. Being a part of MillerKnoll means being a part of something larger than your work team, or even your brand. We are redefining modern for the 21st century. And our success allows MillerKnoll to support causes that align with our values, so we can build a more sustainable, equitable, and beautiful future for everyone.
About the Role
We're looking for an experienced and adaptable Data Scientist to join our growing AI & Data Science team. You'll be part of a small, highly technical group focused on delivering impactful machine learning, forecasting, and generative AI solutions.
In this role, you'll work closely with stakeholders to translate business challenges into well-defined analytical problems, design and validate models, and communicate results in clear, actionable terms. You'll collaborate extensively with our ML Engineer to transition solutions from experimentation to production, ensuring models are both effective and robust in real-world environments. You'll be expected to quickly prototype and iterate on solutions, adapt to new tools and approaches, and share knowledge with the broader organization. This is a hands-on role with real impact and room to innovate.
Key Responsibilities
* Partner with business stakeholders to identify, scope, and prioritize data science opportunities.
* Translate complex business problems into structured analytical tasks and hypotheses.
* Design, develop, and evaluate machine learning, forecasting, and statistical models, considering fairness, interpretability, and business impact.
* Perform exploratory data analysis, feature engineering, and data preprocessing.
* Rapidly prototype solutions to assess feasibility before scaling.
* Interpret model outputs and clearly communicate findings, implications, and recommendations to both technical and non-technical audiences.
* Collaborate closely with the ML Engineer to transition models from experimentation into scalable, production-ready systems.
* Develop reproducible code, clear documentation, and reusable analytical workflows to support org-wide AI adoption.
* Stay up to date with advances in data science, AI/ML, and generative AI, bringing innovative approaches to the team.
Required Technical Skills
* Bachelor's or Master's degree in Data Science, Statistics, Applied Mathematics, Computer Science, or a related quantitative field, with 3+ years of applied experience in data science.
* Strong foundation in statistics, probability, linear algebra, and optimization.
* Proficiency with Python and common data science libraries (Pandas, NumPy, Scikit-learn, XGBoost, PyTorch or TensorFlow).
* Experience with time series forecasting, regression, classification, clustering, or recommendation systems.
* Familiarity with GenAI concepts and tools (LLM APIs, embeddings, prompt engineering, evaluation methods).
* Strong SQL skills and experience working with large datasets and cloud-based data warehouses (Snowflake, BigQuery, etc.).
* Solid understanding of experimental design and model evaluation metrics beyond accuracy (see the sketch after this list).
* Experience with data visualization and storytelling tools (Plotly, Tableau, Power BI, or Streamlit).
* Exposure to MLOps/LLMOps concepts and working in close collaboration with engineering teams.
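A brief, hedged illustration of the "evaluation metrics beyond accuracy" requirement flagged above, using scikit-learn on synthetic, imbalanced data; the dataset and model choice are placeholders, not MillerKnoll specifics.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (precision_score, recall_score, f1_score,
                             roc_auc_score, confusion_matrix)
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced data stands in for a real business problem (e.g., churn or defects).
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]

# Accuracy alone hides performance on the rare class; these metrics do not.
print("precision:", precision_score(y_test, pred))
print("recall:   ", recall_score(y_test, pred))
print("f1:       ", f1_score(y_test, pred))
print("roc auc:  ", roc_auc_score(y_test, proba))
print(confusion_matrix(y_test, pred))
```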
Soft Skills & Qualities
* Excellent communication skills with the ability to translate analysis into actionable business recommendations.
* Strong problem-solving abilities and business acumen.
* High adaptability to evolving tools, frameworks, and industry practices.
* Curiosity and continuous learning mindset.
* Stakeholder empathy and ability to build trust while introducing AI solutions.
* Strong collaboration skills and comfort working in ambiguous, fast-paced environments.
* Commitment to clear documentation and knowledge sharing.
Who We Hire?
Simply put, we hire qualified applicants representing a wide range of backgrounds and abilities. MillerKnoll is comprised of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veterans from every branch of military service, and more. Here, you can bring your whole self to work. We're committed to equal opportunity employment, including veterans and people with disabilities.
This organization participates in E-Verify Employment Eligibility Verification. In general, MillerKnoll positions are closed within 45 days and are open for applications for a minimum of 5 days. We encourage our prospective candidates to submit their application(s) expediently so as not to miss out on our opportunities. We frequently post new opportunities and encourage prospective candidates to check back often for new postings.
MillerKnoll complies with applicable disability laws and makes reasonable accommodations for applicants and employees with disabilities. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact MillerKnoll Talent Acquisition at careers_********************.
Data Scientist III
Data scientist job in Grand Rapids, MI
As a family company, we serve people and communities. When you work at Meijer, you're provided with career and community opportunities centered around leadership, personal growth and development. Consider joining our family - take care of your career and your community!
Meijer Rewards
Weekly pay
Scheduling flexibility
Paid parental leave
Paid education assistance
Team member discount
Development programs for advancement and career growth
Please review the job profile below and apply today!
The Data Science team at Meijer leads the strategy, development and integration of Machine Learning and Artificial Intelligence at Meijer. Data Scientists on the team will drive customer loyalty, digital conversion, and system efficiencies by delivering innovative data-driven solutions. Through these solutions, the data scientists drive material business value, mitigate business and operational risk, and significantly impact the customer experience. This role works directly with product development, merchandising, marketing, operations, ITS, ecommerce, and vendor partners.
What You'll Be Doing:
Deliver against the overall data science strategy to drive in-store and digital merchandising, marketing, customer loyalty, and operational performance
Partner with product development to define requirements which meet system and customer experience needs for data science projects
Partner with Merchandising, Supply Chain, Operations and customer insights to understand the journey that will be improved with the data science deliverables
Build prototypes for, and iteratively develop, end-to-end data science pipelines including custom algorithms, statistical models, machine learning and artificial intelligence functions to meet end user needs (a minimal sketch follows this list)
Partner with product development and technology teams to deploy pipelines into production MLOps environment following Safe Agile methodology
Proactively identify data driven solutions for strategic cross-functional initiatives, develop and present business cases, and gain stakeholder alignment of solution
Define, document, follow, and approve best practices for ML/AI development at Meijer
Own communication with data consumers (internal and external) to ensure they understand the data science products, have the proper training, and are following the best practices in application of data science products
Define and analyze Key Performance Indicators to monitor the usage, adoption, health and value of the data products
Identify and scope, in conjunction with IT, the architecture of systems and environment needs to ensure that data science systems can deliver against strategy
Build and maintain relationships with key partners, suppliers and industry associations and continue to advance data science capabilities, knowledge and impact
This job profile is not meant to be all-inclusive of the responsibilities of this position; may perform other duties as assigned or required
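To ground the pipeline-prototyping responsibility referenced above, here is a minimal sketch with scikit-learn; the customer table, features, and churn target are invented for illustration and are not Meijer data or models.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical customer table; real features would come from the curated data platform.
df = pd.DataFrame({
    "visits_90d": [12, 3, 25, 7, 18, 1, 9, 30],
    "avg_basket": [54.2, 20.1, 88.0, 35.5, 61.3, 15.0, 42.7, 95.4],
    "channel":    ["store", "digital", "store", "digital", "store", "digital", "store", "store"],
    "churned":    [0, 1, 0, 1, 0, 1, 0, 0],
})
X, y = df.drop(columns="churned"), df["churned"]

# End-to-end pipeline: preprocessing and model travel together, which keeps the
# prototype reproducible and easier to hand off to an MLOps deployment process.
pipeline = Pipeline([
    ("prep", ColumnTransformer([
        ("num", StandardScaler(), ["visits_90d", "avg_basket"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["channel"]),
    ])),
    ("model", GradientBoostingClassifier(random_state=0)),
])
print(cross_val_score(pipeline, X, y, cv=2).mean())
```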
What You'll Bring:
Advanced Degree (MA/MS, PhD) in Mathematics, Statistics, Economics, or related quantitative field
Certifications: Azure Data Science Associate, Azure AI, Safe Agile
6+ years of relevant data science experience in an applied role, preferably within retail, logistics, supply chain, or CPG
Advanced, hands-on experience using: Python, Databricks, Azure ML, Azure Cognitive Services, SAS, R, SQL, PySpark, NumPy, Pandas, scikit-learn, TensorFlow, PyTorch, AutoTS, Prophet, NLTK
Experience with Azure Cloud technologies including Azure DevOps, Azure Synapse, MLOps, GitHub
Solid experience working with large datasets and developing ML/AI systems such as: natural language processing, speech/text/image recognition, supervised and unsupervised learning models, forecasting and/or econometric time series models
Proactive and action oriented
Ability to collaborate with, and present to internal and external partners
Able to learn company systems, processes and tools, and identify opportunities to improve
Detail oriented and organized
Ability to meet production deadlines
Strong communications, interpersonal and organizational skills
Excellent written and verbal communication skills
Understanding of intellectual property rights, compliance and enforcement
Sr. Biomedical Data Scientist
Data scientist job in Grand Rapids, MI
Join BAMF Health, where you're not just part of a team; you're at the forefront of a revolution in Theranostics, changing lives for the better. As a member of our global team, you'll contribute to pioneering technology and deliver top-tier patient care.
Located in the heart of downtown Grand Rapids, our cutting-edge global headquarters resides within the state-of-the-art Doug Meijer Medical Innovation Building. Step into our modern and spacious facilities, where innovation thrives and collaboration knows no bounds.
Join us in our mission to make Theranostics accessible and affordable for all, and be part of something truly remarkable at BAMF Health.
The Biomedical Data Scientist leads the transformation of complex clinical and imaging data into actionable AI models in the oncology space. The role combines clinical informatics, data engineering, and machine learning for real-world precision medicine. It is responsible for translating multimodal data, including PACS imaging, EMRs, physician notes, and optionally molecular assays, into unified structured datasets, and for building prognostic and progression-free survival (PFS) models from these datasets to advance cancer treatment personalization.
Data Integration & Engineering:
Build and maintain robust pipelines that ingest and harmonize multimodal data (DICOM imaging, EMR, pathology, molecular/omics data, and free-text clinical notes) (see the sketch after this list).
Work across HL7, FHIR, DICOM, and unstructured text formats to enable a comprehensive longitudinal patient view.
Apply NLP or LLM tools to extract structured information from freeform physician notes, pathology reports, and radiology impressions.
Collaborate with solution architects and engineers to ensure AI-readiness and scalability of the data infrastructure.
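As a hedged illustration of the DICOM side of the data-integration bullet flagged above: a minimal sketch using pydicom to pull a few header fields into a table, assuming a hypothetical ./dicom_inbox folder; the field choices are illustrative and not BAMF's actual schema.

```python
from pathlib import Path

import pandas as pd
import pydicom

def dicom_metadata(dicom_dir):
    """Harvest a few header fields from DICOM files into one tidy table."""
    rows = []
    for path in Path(dicom_dir).glob("*.dcm"):
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only, skip pixel data
        rows.append({
            "patient_id": getattr(ds, "PatientID", None),
            "study_date": getattr(ds, "StudyDate", None),
            "modality": getattr(ds, "Modality", None),
            "study_uid": getattr(ds, "StudyInstanceUID", None),
        })
    return pd.DataFrame(rows)

# A table like this can then be joined on patient/study identifiers with EMR extracts
# and NLP-derived fields from physician notes to build the longitudinal patient view.
if __name__ == "__main__":
    print(dicom_metadata("./dicom_inbox"))
```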
AI Modeling & Analytics:
Develop and validate AI/ML models for:
Progression-Free Survival (PFS) estimation
Overall survival prediction
Prognostic classification and risk stratification
Use classical survival analysis (e.g., Kaplan-Meier, Cox models) as well as modern ML approaches (e.g., DeepSurv, time-to-event modeling, random survival forests) (see the sketch after this section).
Perform feature engineering across clinical, imaging, and molecular features.
Lead the data science component of retrospective and prospective studies in collaboration with clinicians and statisticians.
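To illustrate the survival-modeling bullet flagged above, a minimal sketch using the lifelines library (one common Python choice; the posting does not prescribe it), with a small invented follow-up table standing in for real patient data.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical analysis table: one row per patient, with follow-up time in months,
# an event flag (1 = progression observed, 0 = censored), and candidate covariates.
df = pd.DataFrame({
    "months":     [6.0, 14.2, 9.5, 22.1, 3.8, 18.0, 11.3, 7.7],
    "progressed": [1,   0,    1,   0,    1,   0,    1,    1],
    "age":        [61,  54,   70,  48,   66,  59,   72,   57],
    "suv_max":    [8.1, 4.2,  9.7, 3.5,  11.0, 5.1, 7.4,  6.8],
})

# Kaplan-Meier estimate of the progression-free survival curve.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["progressed"])
print("median PFS (months):", kmf.median_survival_time_)

# Cox proportional hazards model relating covariates to progression risk.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="progressed")
cph.print_summary()
```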
Collaboration & Strategy:
Partner with oncologists, nuclear medicine physicians, and radiologists to define clinically meaningful endpoints and labels.
Contribute to publications, presentations, and regulatory submissions where needed.
Ensure all processes adhere to data privacy regulations (e.g., HIPAA, GDPR, 21 CFR Part 11).
Basic Qualifications:
Advanced degree (MS/PhD) in Biomedical Informatics, Biostatistics, Computer Science, Bioengineering, or related discipline required.
5 years of experience in biomedical data science, clinical informatics, or a related field required
Proven experience with: Clinical data (EMR, PACS, structured/unstructured EHR), NLP/LLM tools, and AI/ML models for prognostic or survival prediction required
Proficiency in Python (pandas, scikit-learn, PyTorch), SQL, and data pipeline frameworks required
Preferred Qualifications:
Experience with time-to-event models (DeepSurv, CoxNet, survival forests) preferred
Familiarity with radiology and nuclear medicine workflows preferred
Exposure to omics datasets and molecular biomarkers preferred
Prior experience in oncology, radiology, or theranostics domains preferred
Schedule Details:
Employment Status: Full time (1.0 FTE)
Weekly Scheduled Hours: 40
Hours of work: 8:00 a.m. to 5:00 p.m.
Days worked: Monday-Friday (on-site role with potential hybrid flexibility)
At BAMF Health, our top priority is patient care. To ensure we are able to drive a Bold Advance Medical Future, we offer a well-rounded benefit package to care for our team members and their families. Highlights include:
Employer paid High Deductible Health Plan with employer HSA contribution
Flexible Vacation Time
401(k) Retirement Plan with generous employer match
Several benefit options including, but not limited to; dental, vision, disability, life, supplemental coverages, legal and identity protection
Free Grand Rapids downtown parking
Disclaimer
BAMF Health provides equal opportunities to all employees for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
BAMF Health will reasonably accommodate qualified individuals with a disability so that they can perform the essential functions of a job unless doing so causes a direct threat to these individuals or others in the workplace and the threat cannot be eliminated by reasonable accommodation or if the accommodation creates an undue hardship to BAMF Health.
BAMF Health is an Equal Opportunity Employer and will not accept or tolerate discrimination or harassment against any applicant, employee, intern, or volunteer based upon the following characteristics: race, color, religion, creed, national origin, ancestry, sex, age, qualified mental or physical disability or handicap, sexual orientation, gender identity/expression, transgender status, genetic information, pregnancy or pregnancy-related status, marital status, veteran status, military service, any application for any military service, or any other category or class protected by applicable federal, state, or local laws.
Data Science Intern
Data scientist job in Grand Rapids, MI
Feyen Zylstra is a team of hardworking doers and thinkers proud to use our brains and brawn to solve the complex problems associated with the design, installation, and maintenance of electrical and low voltage systems. We tend to work in industries like healthcare, industrial manufacturing, commercial, and data centers where our customers benefit most from our technical expertise and the experience we provide them.
We exist to have a positive impact on the lives of people. This starts with our FZers and a commitment to providing safe and energizing work environments, opportunities to learn and grow, and great pay and benefits. It then moves to our customers and a passion for helping them solve their most challenging problems. When we are successful in meeting the needs of our employees and our customers, we have the opportunity to have a positive impact well beyond ourselves in each of the communities where we live and work.
FZ is headquartered in Grand Rapids, Michigan and is focused on serving customers throughout Michigan, Tennessee, and the Carolinas.
FZ is looking for a Data Science Intern for the Grand Rapids, MI office to join our 2026 Summer Intern program. Our interns will be responsible for supporting their assigned department on day-to-day duties as well as working on an assigned project for the duration of this 14-week internship. The internship will also include developmental opportunities such as visiting construction jobsites to learn more about FZ's business, networking opportunities, a training session to help interns understand their own personal workstyle and how to work better with others, taking part in National Intern Day, and much more!
What We're Looking for:
* A Storyteller. You see data as a way to tell the business story, directly impacting the outcome of complex construction projects. Key stakeholders make quick but educated business decisions based on your reports and interactive dashboards.
* A Learner. Learning new skills excites you and you're not afraid of new programs or systems. You are inquisitive. You love a challenge. You see challenging situations as an opportunity to learn and grow. You are accurate, thorough, credible, and organized. You are open to feedback.
* An Initiator. When you see a problem or an area of improvement, you don't wait for others to solve it. You keep yourself productive and engaged at work. You share your perspectives on fresh ways to do things.
* A Team Player. You love supporting others and working as a team. No task is too big or too small for you to complete. You are upbeat and positive, treating others with respect even during conflict. You appreciate differences.
* A Communicator. You ask questions when you are unsure or want to learn more. You listen so that you may gain understanding and enhance your internship experience. You understand that there are many mediums to communicate but many times face-to-face interaction gets the best result. You have great written and verbal communication skills.
Areas of focus during the internship:
* Transition legacy reporting into a modern tech stack that is AI production ready.
* Build interactive dashboards based on data that has been queried and manipulated for analysis and action.
* Leverage APIs from a variety of systems across the business (see the sketch after this list).
* Manage workflow and version control through Git repositories & an Azure DevOps environment.
* Work cross-functionally in tight proximity to many stakeholders.
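As a small illustration of the API-and-dashboard workflow flagged above: a hedged sketch that pulls JSON from a hypothetical endpoint and flattens it into a table a dashboard tool could consume; the URL, token, and payload shape are placeholders, not FZ systems.

```python
import pandas as pd
import requests

# Hypothetical endpoint and token: the real business systems and their APIs would be
# provided by FZ; this only shows the general extract-and-flatten pattern.
API_URL = "https://example-construction-api.local/v1/projects"
headers = {"Authorization": "Bearer <token>"}

response = requests.get(API_URL, headers=headers, timeout=30)
response.raise_for_status()

# Flatten the JSON payload into a table that Power BI (or any dashboard tool) can consume.
projects = pd.json_normalize(response.json())
projects.to_csv("projects_snapshot.csv", index=False)
print(projects.head())
```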
Key Qualifications:
* Enrolled in a bachelor's or master's degree program pursuing a degree in computer or data science - preference towards Junior or Senior status for Fall '26 semester
* Baseline technical awareness and experience with Apache Spark (Python/R), Power BI, Azure DevOps, SQL, Data Warehousing, APIs, and Git
* Ability to work in-office 40 hours per week from May-August '26
* Strong communication and interpersonal skills
Candidates are required to take a pre-employment drug screen. FZ is an Equal Opportunity Employer and considers applicants without regard to race, color, religion, sex, national origin, or other protected classes.
Databricks Data Engineer - Senior - Consulting - Location Open
Data scientist job in Grand Rapids, MI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Senior**
We are seeking a highly skilled Senior Consultant Data Engineer with expertise in cloud data engineering, specifically Databricks. The ideal candidate will have strong client management and communication skills, along with a proven track record of successful end-to-end implementations in data engineering projects.
**The opportunity**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that technical requirements align with business needs. Your responsibilities will include creating scalable data architecture and modeling solutions that support the entire data asset lifecycle.
**Your key responsibilities**
As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. Your time will be spent on various responsibilities, including:
+ Designing, building, and operating scalable on-premises or cloud data architecture.
+ Analyzing business requirements and translating them into technical specifications.
+ Optimizing data flows for target data platform designs.
+ Design, develop, and implement data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Collaborate with clients to understand their data needs and provide tailored solutions that meet their business objectives.
+ Lead end-to-end data pipeline development, including data ingestion, transformation, and storage (see the sketch after this section).
+ Ensure data quality, integrity, and security throughout the data lifecycle.
+ Provide technical guidance and mentorship to junior data engineers and team members.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay updated with the latest trends and technologies in data engineering and cloud computing.
This role offers the opportunity to work with cutting-edge technologies and stay ahead of industry trends, ensuring you gain a competitive advantage in the market. The position may require regular travel to meet with external clients.
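As a hedged illustration of the end-to-end pipeline work flagged above: a minimal PySpark sketch of ingestion, transformation, and Delta storage on Databricks; the paths, column names, and Delta choice are illustrative assumptions, not any client's architecture.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` already exists; getOrCreate() keeps the
# sketch runnable elsewhere too. Paths and columns below are illustrative only.
spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("/mnt/landing/orders/*.csv"))           # ingestion from a landing zone

clean = (raw
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["order_id"])
         .filter(F.col("amount").isNotNull()))       # basic transformation + quality filter

(clean.write
      .format("delta")                               # Delta Lake storage on Databricks
      .mode("overwrite")
      .save("/mnt/curated/orders"))
```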
**Skills and attributes for success**
To thrive in this role, you will need a blend of technical and interpersonal skills. Your ability to communicate effectively and build relationships will be crucial. Here are some key attributes we look for:
+ Strong analytical and decision-making skills.
+ Proficiency in cloud computing and data architecture design.
+ Experience in data integration and security.
+ Ability to manage complex problem-solving scenarios.
**To qualify for the role, you must have**
+ A Bachelor's degree in Computer Science, Engineering, or a related field required (4-year degree). Master's degree preferred
+ Typically, no less than 2 - 4 years relevant experience in data engineering, with a focus on cloud data solutions.
+ 5+ years of experience in data engineering, with a focus on cloud data solutions.
+ Expertise in Databricks and experience with Spark for big data processing.
+ Proven experience in at least two end-to-end data engineering implementations, including:
+ Implementation of a data lake solution using Databricks, integrating various data sources, and enabling analytics for business intelligence.
+ Development of a real-time data processing pipeline using Databricks and cloud services, delivering insights for operational decision-making.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Experience with data modeling, ETL processes, and data warehousing concepts.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Senior Consulting Projects:**
+ **Strategic Thinking:** Ability to align data engineering solutions with business strategies and objectives.
+ **Project Management:** Experience in managing multiple projects simultaneously, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Ideally, you'll also have**
+ Experience with data quality management.
+ Familiarity with semantic layers in data architecture.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for**
We seek individuals who are not only technically proficient but also possess the qualities of top performers. You should be adaptable, collaborative, and driven by a desire to achieve excellence in every project you undertake.
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $106,900 to $176,500. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $128,400 to $200,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Data Engineer
Data scientist job in Grand Rapids, MI
Job Title: Data Engineer
Skill Category: Data Engineering
Work Location: Grand Rapids, MI
Onsite/Remote: Hybrid
Hourly C2C Rate: $80/hr
Job Description: We are looking for a Data Engineer with over 6 years of industry experience in business application design, development, implementation, and solution architecture. The ideal candidate should have experience with Databricks and building and designing data and analytics on enterprise solutions such as:
Azure Data Factory
Azure Function App
Log Analytics
Databricks
Synapse
Power BI
ADLS Gen2
Logic Apps
Required Skills:
Data Classification
Data Modeling
Data Architecture
Data Quality
Design
Network Components
Solution Architecture
Teamwork
Technical Training
.Net Core
Agile
C#
Data Warehouse
Infrastructure
Product Management
Functional Requirements
Interfaces
Research
Subsystems
Support
.Net
Python
Data Engineering
Data Analysis
Data Science
Responsibilities:
Design, code, test, and implement data movement, dashboarding, and analytical assets.
Develop system documentation according to SAFe Agile principles and industry standards.
Evaluate architectural options and define the overall architecture of enterprise Data Lake and Data Warehouse.
Provide subject matter expertise and technical consulting support on vendor or internal applications and interfaces.
Develop Azure Function App using C# and .Net Core.
Define functional and non-functional requirements, including performance monitoring, alerting, and code management.
Partner with business areas to gather requirements for Data and Analytics and design solutions.
Define major elements and subsystems and their interfaces.
Mentor and coach team members.
Engage with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
Interface with the Product Manager and IT partners to define and estimate features for agile teams.
Conduct industry research, facilitate new product and vendor evaluations, and assist in vendor selection.
Qualifications:
6+ years of industry experience in business application design, development, implementation, and solution architecture.
Experience with Azure Data Factory, Azure Function App, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, and Logic Apps.
Databricks experience is required.
Experience designing data pipelines and implementing data quality, governance, and security compliance in Azure architecture.
Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering, or related discipline or equivalent work experience and technical training.
Excellent written and oral communication skills.
Experience in Power BI, Data Modeling, Data Classification, Data Architecture, and reporting.
In-depth understanding of computer, storage, and network components, including backup, monitoring, and DR environment requirements.
Preferred knowledge and experience in Python and API architecture in Azure.
SAFe certification or training is a plus.
Experience with diverse technical configurations, technologies, and processing environments.
Exceptional interpersonal skills, including teamwork, facilitation, and negotiation.
Must Be Included with Submittal:
Full Legal Name
Phone
Email
Current Location
Rate
Work Authorization
Willingness to Relocate
Confirmation that the candidate is or will be on your W2
Senior Data Engineer
Data scientist job in Grand Rapids, MI
**Advance Local** is looking for a **Senior Data Engineer** to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with data product and business unit teams to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.
The base salary range is $120,000 - $140,000 per year.
**What you'll be doing:**
+ Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
+ Partner with platform owners across business units to establish and maintain data integrations from third party systems into the central data platform.
+ Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control, and disaster recovery capabilities.
+ Design and implement API integrations and event-driven data flows to support real time and batch data requirements.
+ Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
+ Partner with the Data Architect and data product to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
+ Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
+ Support rapid prototyping of new data products in collaboration with data product by building flexible, reusable data infrastructure components.
+ Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
+ Collaborate with data product, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
+ Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources (see the sketch after this list).
+ Develop and maintain comprehensive documentation for data engineering processes and systems, architecture, integration patterns, and runbooks.
+ Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
+ Stay current with the emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
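To illustrate the data-quality validation bullet flagged above: a minimal, hedged sketch of batch-level checks in Python; the thresholds, column names, and sample batch are invented, and in production the findings would feed monitoring/alerting rather than print().

```python
import pandas as pd

def validate(df, required_cols, key_cols, max_null_rate=0.01):
    """Lightweight data-quality checks for an ingested batch; thresholds are illustrative."""
    problems = []
    missing = [c for c in required_cols if c not in df.columns]
    if missing:
        problems.append(f"missing columns: {missing}")
    for col in (c for c in required_cols if c in df.columns):
        null_rate = df[col].isna().mean()
        if null_rate > max_null_rate:
            problems.append(f"{col}: null rate {null_rate:.1%} exceeds {max_null_rate:.0%}")
    present_keys = [c for c in key_cols if c in df.columns]
    if present_keys:
        dupes = df.duplicated(subset=present_keys).sum()
        if dupes:
            problems.append(f"{dupes} duplicate rows on key {present_keys}")
    return problems

# Tiny invented batch with one null and one duplicate key, to show the checks firing.
batch = pd.DataFrame({"user_id": [1, 2, 2], "event_ts": ["2024-01-01", None, "2024-01-02"]})
issues = validate(batch, required_cols=["user_id", "event_ts"], key_cols=["user_id"])
print(issues or "batch passed all checks")
```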
**Our ideal candidate will have the following:**
+ Bachelor's degree in computer science, engineering, or a related field
+ Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
+ Expert proficiency in Snowflake data engineering patterns
+ Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
+ Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
+ Proven ability to work with third party APIs, webhooks, and data exports
+ Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
+ Proven ability to design and implement API integrations and event-driven architecture
+ Experience with data modeling, data warehousing, and ETL processes at scale
+ Advanced proficiency in Python and SQL for data pipeline development
+ Experience with data orchestration tools (Airflow, dbt, Snowflake tasks)
+ Strong understanding of data security, access controls, and compliance requirements
+ Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
+ Excellent problem-solving skills and attention to detail
+ Strong communication and collaboration skills
**Additional Information**
Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity.
Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** .
Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext.
_Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._
_If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._
Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
Senior Data Engineer Solution Design Focused
Data scientist job in Texas, MI
We're looking for a Senior Data Engineer with deep expertise in GCP solution architecture to join our engineering team. You'll play a key role in designing, building, and optimizing secure, scalable, and high-performance cloud solutions on Google Cloud Platform. By combining your technical skills with your understanding of e-commerce, CPG, or retail, you'll transform complex business needs into innovative, impactful solutions. If you're passionate about cloud architecture, thrive in fast-paced environments, and enjoy solving complex challenges, we'd love to hear from you.
What You'll Do
* Architect Scalable Cloud Solutions: Lead the design and implementation of end-to-end GCP-based architectures covering data, applications, infrastructure, and security.
* Build Robust Data Pipelines: Develop and maintain reliable ETL/ELT pipelines using tools like Airflow and DBT, and languages like Python and SQL (an illustrative sketch follows this list).
* Optimize BigQuery Warehousing: Design schemas, tune queries, and manage costs while working with large, complex datasets.
* Ensure Data Quality & Governance: Implement standards for data accuracy, consistency, and lineage tracking.
* Drive Performance & Efficiency: Monitor and optimize the performance and cost-effectiveness of cloud solutions.
* Leverage Industry Knowledge: Apply your domain expertise to solve real-world retail, e-commerce, and supply chain challenges.
* Maintain Security & Compliance: Build secure solutions aligned with privacy laws and industry best practices.
* Lead & Mentor: Guide engineers, share best practices, and foster a culture of technical excellence.
* Collaborate Across Teams: Work closely with stakeholders to define requirements and ensure solution success.
* Document Clearly: Produce high-quality design docs, architecture diagrams, and best-practice guides.
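As a rough illustration of the pipeline and BigQuery warehousing work described above, the sketch below loads a CSV extract into a date-partitioned, clustered BigQuery table with the google-cloud-bigquery client; partitioning and clustering are one common way to keep per-query bytes scanned (and cost) down. The project, dataset, table, and bucket names are hypothetical.

```python
# Illustrative sketch: project, dataset, table, and bucket names are placeholders.
from google.cloud import bigquery  # pip install google-cloud-bigquery


def load_daily_orders(client: bigquery.Client) -> None:
    table_id = "example-project.retail.orders"  # hypothetical table

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        # Partitioning and clustering reduce bytes scanned by typical queries.
        # Assumes order_date is detected as a DATE column.
        time_partitioning=bigquery.TimePartitioning(field="order_date"),
        clustering_fields=["store_id", "sku"],
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/exports/orders_*.csv",  # hypothetical bucket
        table_id,
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to finish


if __name__ == "__main__":
    load_daily_orders(bigquery.Client())
```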
What You Bring
* 8+ years in IT, with 3+ in cloud architecture or GCP solutions
* Proven hands-on experience with Google Cloud Platform services
* Strong knowledge of data architecture, ETL/ELT, and data warehousing (BigQuery)
* Solid coding skills in SQL, Python (preferred), Java, or Scala
* Familiarity with Airflow, DBT, and version control (e.g., Git)
* Industry experience in e-commerce, retail, or CPG
* Excellent communication and problem-solving skills
You'll Thrive Here If You Are:
* A fast learner who keeps up with evolving tech
* A proactive problem-solver who owns their work end-to-end
* A collaborative team player who values knowledge sharing
* Passionate about creating real business impact through data
Join us and help shape the future of data-driven decision-making in a fast-paced, high-impact environment. Apply now if you're ready to build solutions that matter.
Salary Range: USD 70-100k
Data Engineer
Data scientist job in Grand Rapids, MI
Our client, a family-owned Midwestern grocery/retailer striving to better people's lives in all communities, seeks a senior-level Data Engineer.
The ideal candidate for this role has 5 to 10 years of relevant work experience.
Designs, modifies, develops, writes and implements software programming applications.
Supports and/or installs software applications/operating systems.
Participates in the testing process through test review and analysis, test witnessing and certification of software.
Familiar with a variety of the field's concepts, practices, and procedures.
Relies on experience and judgment to plan and accomplish goals.
Performs a variety of complicated tasks.
A wide degree of creativity and latitude is expected.
Staff Data Engineer
Data scientist job in Grand Rapids, MI
Cybercrime is rising, reaching record highs in 2024. According to the FBI's IC3 report, total losses exceeded $16 billion. With investment fraud and BEC scams at the forefront, the message is clear: the real estate sector remains a lucrative target for cybercriminals. At CertifID, we take this threat seriously and provide a secure platform that verifies the identities of parties involved in transactions, authenticates wire transfer instructions, and detects potential fraud attempts. Our technology is designed to mitigate risks and ensure that every transaction is conducted with confidence and peace of mind.
We know we couldn't take on this challenge without our incredible team. We have been recognized as one of the Best Startups to Work for in Austin, made the Inc. 5000 list, and won Best Culture by Purpose Jobs two years in a row. We are guided by our core values and our vision of a world without wire fraud. We offer a dynamic work environment where you can contribute to meaningful impact and be part of a team dedicated to enhancing security and fighting fraud.
We're seeking an exceptional Staff Data Engineer to help us take our data platform to the next level. This is a high-impact role where you'll architect and build scalable data infrastructure, empower data-driven decision-making, and help shape the future of fraud prevention. We're scaling fast, and you'll have the chance to shape the future of our data platform.
You will collaborate with data scientists, engineers, and business stakeholders to ensure high-quality, actionable data insights that enable business outcomes.
What You Will Do
Design and build robust, scalable, secure data pipelines and infrastructure to support analytics, product intelligence, and machine learning.
Lead data architecture decisions, ensuring high performance, reliability, and maintainability across our platform.
Collaborate cross-functionally with product, engineering, and business teams to deliver data solutions that drive strategic insights and operational excellence.
Champion data quality and governance, implementing best practices for data validation, lineage, and observability.
Mentor and guide other data engineers, fostering a culture of technical excellence and continuous learning.
What We're Looking For
Proven experience as a staff-level data engineer in a fast-paced, product-driven environment, with 8+ years of engineering experience.
Experience in leading teams of other engineers to build a long-term technical vision and plans to achieve it.
Deep expertise in cloud data platforms (e.g., Snowflake, BigQuery, Redshift) and orchestration tools (e.g., Airflow, dbt); an illustrative orchestration sketch follows this list.
Strong programming skills in Python or Scala, and proficiency with SQL.
Experience with real-time data processing (e.g., Kafka, Spark Streaming) and data modeling for analytics and ML.
A growth mindset, strong ownership, and a passion for solving complex problems that matter.
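A minimal, hypothetical Airflow sketch of the orchestration pattern referenced above: extract and load first, then run dbt transformations. The DAG id, schedule, and dbt paths are placeholders, not an actual CertifID pipeline.

```python
# Illustrative sketch (Airflow 2.4+); DAG id, schedule, and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_and_load() -> None:
    # Placeholder for the EL step, e.g. pulling from a source API and
    # loading into the warehouse's raw schema.
    pass


with DAG(
    dag_id="daily_warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_raw = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )

    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse --profiles-dir /opt/dbt",
    )

    load_raw >> run_dbt  # transformations run only after the raw data lands
```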
Preferred Experience
Experience in fintech, cybersecurity, or fraud prevention.
Familiarity with data privacy regulations (e.g., SOC 2, GDPR, CCPA).
Contributions to open-source data tools or communities.
What We Offer
Flexible vacation
12 company-paid holidays
10 paid sick days
No work on your birthday
Health, dental, and vision Insurance (including a $0 option)
401(k) with matching, and no waiting period
Equity
Life insurance
Generous parental paid leave
Wellness reimbursement of $300/year
Remote worker reimbursement of $300/year
Professional development reimbursement
Competitive pay
An award-winning culture
Change doesn't happen overnight, and the same goes for us here at CertifID. We PROGRESS collectively and individually as we grow, abiding by our core values. Protect the Customer, Raise the Bar, Operate with Urgency, Grow with Grit, Ride the Wave, Enthusiasm Spreads, Stay Connected, Send It.
Marketing Data Engineer
Data scientist job in Grand Rapids, MI
For over half a decade, Hudson Manpower has been a trusted partner in delivering specialized talent and technology solutions across IT, Energy, and Engineering industries worldwide. We work closely with startups, mid-sized firms, and Fortune 500 clients to support their digital transformation journeys. Our teams are empowered to bring fresh ideas, shape innovative solutions, and drive meaningful impact for our clients. If you're looking to grow in an environment where your expertise is valued and your voice matters, then Hudson Manpower is the place for you. Join us and collaborate with forward thinking professionals who are passionate about building the future of work.
Job Description:
Designs, codes, tests, and implements data movement pipelines, dashboarding and analytical assets; develops system documentation according to SAFe Agile principles and industry standards.
Design and hands-on development of vendor or internal applications and interfaces, including Azure services (Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2), GCP Cloud Storage, and BigQuery
Assist in configuration and administration of the marketing platforms that the team supports (Google Marketing Platform, Salesforce MCI, The Trade Desk, ADvendio, etc.)
Build data ingestion into and out of the marketing platforms listed above (an illustrative ingestion sketch follows this list)
Gathers and implements functional and non-functional requirements, including performance monitoring, alerting, and code management, ensuring alignment with technology best practices and SLAs.
Partnering with all areas of the business to gather requirements for Data and Analytics and designing solutions.
Participate in agile ceremonies and follow agile cadence to build, test and deliver solutions
Driving engagement with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
Interfaces with the Product Manager and IT partners at the Program level and within other Release Trains to define and estimate features for agile teams.
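As an illustration of the marketing-platform ingestion work described above, here is a minimal sketch that pages a hypothetical REST reporting endpoint and lands the result as Parquet for downstream loading (for example into ADLS Gen2 or BigQuery). The endpoint, token variable, and field names are assumptions, not any specific vendor's API.

```python
# Illustrative sketch: the endpoint, token, and response shape are hypothetical.
import os

import pandas as pd
import requests


def pull_campaign_report(report_date: str) -> pd.DataFrame:
    base_url = "https://api.example-marketing-platform.com/v1/reports"  # hypothetical
    headers = {"Authorization": f"Bearer {os.environ['MARKETING_API_TOKEN']}"}
    rows, page = [], 1

    while True:
        resp = requests.get(
            base_url,
            headers=headers,
            params={"date": report_date, "page": page, "page_size": 500},
            timeout=30,
        )
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload["results"])
        if not payload.get("next_page"):
            break
        page += 1

    return pd.DataFrame(rows)


if __name__ == "__main__":
    df = pull_campaign_report("2024-06-01")
    # Land the extract as Parquet; a later job can copy it into the warehouse.
    df.to_parquet("campaign_report_2024-06-01.parquet", index=False)
```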
Job requirements
Requirements
6+ years industry experience (business application design, development, implementation, and/or solution architecture)
Understanding of architecture practices and execution for large projects / programs.
Experience building and designing data and analytics solutions on enterprise platforms such as Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, and ADLS Gen2.
Databricks experience is required.
Experience designing data pipelines (ingestion, storage, prep/train, model, and serve) using the above technologies; automating Azure workloads; and addressing data quality, governance/standards, security, and legal compliance in the Azure architecture
Working knowledge of these tools preferred: Google Marketing Platform, Salesforce MCI, The Trade Desk, ADvendio, etc.
Experience with Marketing-domain data ingestion and analytics is a must
Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering or related discipline or equivalent work experience and technical training is required.
Excellent written and oral communications skills.
Previous experience in Power BI, Data Modeling, Data Classification and zones, data movement, Data architecture and reporting
Preferred knowledge of and experience with Python and API architecture in Azure
Any SAFe certification or training.
Experience with multiple, diverse technical configurations, technologies, and processing environments.
Exceptional interpersonal skills, including teamwork, facilitation, and negotiation
Experienced Data Engineer Analyst
Data scientist job in Grand Rapids, MI
Ascential Technologies is a global leader in automated diagnostic, inspection, assembly, and test systems. We serve mission-critical industries including automotive (EOL testing, ADAS), aerospace, industrial automation (balancing, grinding, test stands), and medical & life sciences (instruments, devices, consumables, and automation). Our solutions help customers accelerate innovation, ensure safety, and improve performance across the product lifecycle.
Role Overview:
We are seeking an experienced Data Engineer/Analyst to join our team to lead and support the ongoing development of our data analytics platform. This individual will play a critical role in building, maintaining, and enhancing scalable data systems while also supporting customers and internal teams. The role requires strong technical expertise in data engineering, analytics, visualization, and AI integration along with proven experience managing on-prem and cloud-based stacks.
Key Responsibilities:
Data Engineering & Processing
Design, develop, and maintain advanced data collectors for diverse industrial, aerospace, automotive, and medical systems.
Architect robust data pipelines for on-premises and cloud environments (Elastic Cloud, OpenSearch).
Analyze machine-generated log files from test stands, medical devices, and automation systems.
Clean and preprocess data to remove duplicates, noise, and errors.
Implement updates and enhancements to accommodate evolving data file types and formats (CSV, SQL, PLC logs, text, etc.).
Augment datasets with derived features to enhance analytical value.
Optimize performance, scalability, and fault tolerance of ingestion pipelines.
Visualization & Reporting
Design and implement dashboards tailored to specific test and measurement applications.
Apply standard dashboard templates to new and existing datasets.
Generate reports and visual summaries for internal and customer-facing use.
Provide automatic report generation using AI/LLM frameworks to summarize data, generate insights and guidance on performance improvements, provide predictive analytics, anomaly detection, and automated recommendations.
Data Collection & Ingestion
Develop and deploy data collectors for machines across automotive, aerospace, and medical domains.
Configure systems to monitor, clean, transform, and ingest data into cloud platforms.
Ensure data pipelines are robust, scalable, and secure.
Integrate new communication protocols (e.g., PLCs of different makes/models) and data formats; adapt to customer-specific software platforms.
Monitoring & Alerting
Set up thresholds and integrate webhooks for real-time alerts and notifications (an illustrative alerting sketch follows the responsibilities list).
Investigate and resolve issues related to data collectors, dashboards, and ELK stack components.
Collaboration & Maintenance
Collaborate directly with customers to gather requirements, resolve technical issues, and deliver solutions.
Provide guidance to internal teams, mentoring junior engineers and analysts.
Participate in daily standups and agile team activities.
Review and provide feedback on new datasets and data models.
Perform backups and conduct code reviews for data-related components.
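For the monitoring and alerting responsibilities above, here is a minimal sketch of one common pattern: compare a pipeline metric against a threshold and post an alert to a webhook. The metric, threshold, and webhook URL are hypothetical placeholders.

```python
# Illustrative sketch: metric source, threshold, and webhook URL are placeholders.
import os

import requests


def check_and_alert(metric_name: str, value: float, threshold: float) -> None:
    if value <= threshold:
        return  # within tolerance, nothing to do

    payload = {
        "text": (
            f"ALERT: {metric_name} = {value:.2f} exceeded threshold "
            f"{threshold:.2f}; check the collector and dashboard."
        )
    }
    resp = requests.post(os.environ["ALERT_WEBHOOK_URL"], json=payload, timeout=10)
    resp.raise_for_status()


if __name__ == "__main__":
    # Example: failed-ingest rate from the last pipeline run (placeholder value).
    check_and_alert("failed_ingest_rate", value=0.07, threshold=0.05)
```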
Qualifications:
Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
5+ years of professional experience in data engineering, data science or analytics roles for both on-prem and cloud environments.
Proven expertise in ELK, OpenSearch, and/or OpenTelemetry.
Strong programming and scripting skills (e.g., Python, Bash, PowerShell).
Experience with data cleaning, transformation, and preprocessing for engineering/industrial data.
Hands-on experience with cloud deployments (Elastic Cloud, AWS, Azure, or GCP).
Proficiency with visualization frameworks (Kibana, OpenSearch Dashboards, Grafana).
Experience with core programming principles and design patterns.
Strong analytical and problem-solving skills.
Excellent communication and teamwork abilities with both technical and customer stakeholders.
Bonus Skills:
Experience working with industrial protocols and PLC integration.
Familiarity with containerization (Docker, Kubernetes).
Experience incorporating cybersecurity principles for secure data handling.
Experience with REST APIs and microservices architectures.
Background in test and measurement systems for automotive, aerospace, or medical device industries.
Prior experience in customer-facing engineering roles.
Knowledge of test and measurement systems in automotive, aerospace, or medical device industries.
This role can be remote; we prefer a candidate within commuting distance of one of the offices listed in the posting (Wisconsin, Michigan).
IT Data Engineer
Data scientist job in Grand Rapids, MI
Your passion. Your purpose. If you're here, you're looking for something more. More opportunity, more impact, more purpose. At Rehmann, each and every one of our associates plays a pivotal role in the Firm's success. When you join our team, you can count on exceptional support, encouragement, and guidance from your colleagues and from leadership.
To learn more about Rehmann, visit: ********************************
The IT Data Engineer with a focus on Integrations will be responsible for designing, implementing, and maintaining integration solutions that connect various systems and applications within our organization. This role requires a strong understanding of integration technologies, excellent problem-solving skills, and the ability to work collaboratively with cross-functional teams. The engineer will play a key role in ensuring seamless data flow between systems, automating workflows, and supporting insightful reports to drive business operations.
Key Responsibilities:
* Design and develop integration solutions using APIs, middleware, and other integration tools (an illustrative sketch follows this list).
* Collaborate with software developers, system administrators, and business analysts to understand integration requirements.
* Implement and maintain integration solutions to ensure seamless data flow between systems.
* Troubleshoot and resolve integration issues in a timely manner.
* Monitor and optimize integration performance.
* Document integration processes and solutions.
* Stay up to date with the latest integration technologies and best practices.
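As an illustration of the integration work listed above, here is a minimal sketch, assuming a REST source API and a Microsoft SQL Server target, that pulls JSON records and upserts them with pyodbc. The endpoint, connection string, table, and columns are placeholders.

```python
# Illustrative sketch: endpoint, connection string, and schema are hypothetical.
import os

import pyodbc  # pip install pyodbc
import requests


def sync_clients() -> None:
    resp = requests.get(
        "https://source-system.example.com/api/clients",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {os.environ['SOURCE_API_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    clients = resp.json()

    conn = pyodbc.connect(os.environ["MSSQL_CONN_STR"])
    try:
        cur = conn.cursor()
        for c in clients:
            # Simple upsert keyed on the source system's client id.
            cur.execute(
                """
                MERGE dbo.Clients AS target
                USING (SELECT ? AS ClientId, ? AS ClientName) AS source
                ON target.ClientId = source.ClientId
                WHEN MATCHED THEN UPDATE SET ClientName = source.ClientName
                WHEN NOT MATCHED THEN INSERT (ClientId, ClientName)
                    VALUES (source.ClientId, source.ClientName);
                """,
                c["id"],
                c["name"],
            )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    sync_clients()
```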
Qualifications:
* Proven experience as an Integration Engineer or similar role.
* Strong knowledge of integration technologies such as RESTful APIs, SOAP, XML, JSON, and middleware platforms.
* Knowledge of Microsoft SQL databases.
* Understanding of cloud computing concepts and services, particularly Microsoft Azure.
* Proficiency in programming and scripting languages such as Python, C#, PowerShell and SQL.
* Strong background in managing priorities from planning to execution.
* Excellent problem-solving and analytical skills.
* Strong communication and interpersonal skills.
* Ability to work independently and as part of a team.
* Detail-oriented with a focus on delivering high-quality solutions.
No matter where you want to go in your career, Rehmann can help you get there. Whether you're in the early stages of your professional journey or you're further down your path, we're focused on helping you achieve your goals - whatever they may be. When you join Rehmann, you are part of a culture that Puts People First and aims to help everyone reach their fullest potential. Let us show you all the ways we can Empower Your Purpose.
We Put People First in all that we do. Our associates are our greatest assets and we provide programs and benefits that encourage growth and development and align with their needs and goals. This includes benefits focused on physical and mental health, paid time off for volunteering and diversity-related activities, flexible work arrangements, and more. When you join Rehmann, you become part of a firm dedicated to helping Empower Your Purpose, whatever it may be.
Rehmann is an Equal Opportunity Employer
#LI-VK1
Data Integrity Analyst
Data scientist job in Three Rivers, MI
At AAM, the POWER is in our people. We believe that an equitable and inclusive workplace benefits everyone, and that the diversity of our Associates drives creativity and innovation. Our global team is made of dreamers, doers and innovators who are Delivering POWER for a safer, brighter and more sustainable tomorrow.
Job Posting Title
Data Integrity Analyst
Summary
#TeamAAM is looking for a Data Integrity Analyst to join our team in Three Rivers, Michigan. This is a fully onsite role on the first shift.
The Data Integrity Analyst is responsible for the organization's inventory integrity and master data management, which includes the creation and maintenance of item numbers, resources, routings and bills of material and PFEP set-ups. They will monitor and audit inventory and master data integrity, and analyze and troubleshoot discrepancies to the inventory integrity and master data management.
The ideal candidate is an excellent problem solver with strong communication skills. Relevant experience in an automotive manufacturing environment is strongly preferred.
Ready to join the team that is Bringing the Future Faster? Apply today!
Job Description
Monitors inventory integrity, utilizing available reports and transactions to help validate daily performance. Provides root cause analysis for incorrect data.
Performs the API process and root cause analysis of discrepancies.
Prepares and distributes operational reports by collecting, analyzing, and summarizing data and trends.
Audits bills of material and routings, and develops action plans to implement corrective actions for non-conformances identified.
Performs and maintains cycle count records and provides root cause analysis for identified discrepancies.
Tracks, maintains records, and communicates master data changes to the organization for new program launches and changes to existing programs.
Creates item numbers, departments, resources, routings, and bills of material for new program launches and changes to existing programs, including both engineering and process changes, and maintains correct records for these changes to inventory and master data.
Responsible for year-end budget process standard load.
Responsible for month-end close process.
Creates and maintains PFEP records.
All other duties as assigned.
Required Skills and Education
Bachelor's Degree in Supply Chain, Business Administration, Process Engineering, Industrial Engineering, or equivalent experience.
2-4+ years of relevant professional experience in manufacturing processes, particularly in an automotive environment, with hands-on involvement in new project execution and supporting business cases.
Knowledge of Lean Manufacturing principles to support and implement continuous improvement initiatives.
Practical experience with automotive manufacturing processes and the ability to assist in the implementation of new technologies and systems.
About AAM:
As a leading global Tier 1 Automotive and Mobility Supplier, AAM designs, engineers and manufactures Driveline and Metal Forming technologies to support electric, hybrid and internal combustion vehicles. Headquartered in Detroit with over 80 facilities in 18 countries, we are Bringing the Future Faster for a safer and more sustainable tomorrow. To learn more, visit AAM.com.
Why Join #TeamAAM:
As a member of #TeamAAM, you'll get to make a difference on day one. From your first day with us, you'll have the opportunity to grow, embrace challenges, build your skills, and bring your authentic self to work every day, all while helping to shape the future of mobility for AAM…and the world.
AAM will not discriminate against any Associate or applicant for employment because of age, race, color, gender, religion, weight, height, marital status, sexual orientation, genetic history or information, gender identity or expression, disability, protected veteran status, national origin, or other characteristic protected by law. AAM will take affirmative action to ensure that applicants are employed, and that Associates are treated equally during employment, without regard to their age, race, color, gender, religion, weight, height, marital status, sexual orientation, genetic history or information, gender identity or expression, disability, protected veteran status, national origin, or other characteristic protected by law. For the Disabled Job Seeker: We offer reasonable accommodations for qualified disabled individuals who are applicants for employment. To request assistance or accommodations, please e-mail *************************. AAM is an equal opportunity/affirmative action employer.
Manager, Data Operations, Data Engineer
Data scientist job in Grand Rapids, MI
Known for being a great place to work and build a career, KPMG provides audit, tax and advisory services for organizations in today's most important industries. Our growth is driven by delivering real results for our clients. It's also enabled by our culture, which encourages individual development, embraces an inclusive environment, rewards innovative excellence and supports our communities. With qualities like those, it's no wonder we're consistently ranked among the best companies to work for by Fortune Magazine, Consulting Magazine, Seramount, Fair360 and others. If you're as passionate about your future as we are, join our team.
KPMG is currently seeking a Manager of Data Engineering to join our Digital Nexus technology organization. This is a hybrid work opportunity.
Responsibilities:
* Lead a team of Azure Data Lake and business intelligence engineers in designing and delivering ADL Pipelines, Notebooks and interactive Power BI dashboards that clearly communicate actionable insights to stakeholders; contribute strategic thought leadership to shape the firm's business intelligence vision and standards
* Design and maintain scalable data pipelines using Azure Data Factory and Databricks to ingest, transform, and deliver data across medallion architecture layers; develop production-grade ETL/ELT solutions using PySpark and SQL to produce analytics-ready Delta Lake datasets aligned with enterprise standards (an illustrative PySpark sketch follows this list)
* Apply critical thinking and creativity to design innovative, non-standard BI solutions that address complex and evolving business challenges; design, build, and optimize data models to support analytics, ensuring accuracy, reliability, and efficiency
* Stay ahead of emerging technologies including Generative AI and AI agents to identify novel opportunities that improve analytics, automation, and decision-making across the enterprise
* Manage and provide technical expertise and strategic guidance to counselees (direct reports), department peers, and cross-functional team members; set goals, participate in strategic initiatives for the team, and foster the development of high-performance teams
* Act with integrity, professionalism, and personal responsibility to uphold KPMG's respectful and courteous work environment
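A minimal sketch, assuming a Databricks/Delta Lake environment, of the medallion-style transformation mentioned above: read a bronze Delta table with PySpark, apply basic cleansing, and write an analytics-ready silver table. The paths and column names are hypothetical placeholders.

```python
# Illustrative sketch: assumes Delta Lake is available; paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze layer: raw, append-only ingested records.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/engagements")

# Silver layer: de-duplicated, validated, analytics-ready records.
silver = (
    bronze
    .dropDuplicates(["engagement_id"])                       # de-duplicate on the business key
    .filter(F.col("engagement_id").isNotNull())              # drop malformed records
    .withColumn("event_date", F.to_date("event_timestamp"))  # derive a partition-friendly date
    .select("engagement_id", "client_id", "event_date", "amount")
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/lake/silver/engagements")
)
```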
Qualifications:
* Minimum seven years of recent experience designing and building ADL Pipelines, Databricks notebooks, and interactive dashboards using modern business intelligence tools (preferably Power BI); minimum two years of recent experience designing scalable data pipelines using Azure Data Factory and Azure Databricks to support ingestion, transformation, and delivery of data across medallion architecture layers
* Bachelor's degree from an accredited college or university is preferred; minimum of a high school diploma or GED is required
* Demonstrated analytical and problem-solving abilities, with a creative and methodical approach to addressing complex challenges
* Advanced knowledge of SQL, DAX, and data modeling concepts; proven track record in defining, managing, and delivering BI projects; ability to participate in the development of resource plans and influence organizational priorities
* Excellent written and verbal communication skills, including the ability to effectively present proposals and vision to executive leadership
* Applicants must be authorized to work in the U.S. without the need for employment-based visa sponsorship now or in the future; KPMG LLP will not sponsor applicants for U.S. work visa status for this opportunity (no sponsorship is available for H-1B, L-1, TN, O-1, E-3, H-1B1, F-1, J-1, OPT, CPT or any other employment-based visa)
KPMG LLP and its affiliates and subsidiaries ("KPMG") complies with all local/state regulations regarding displaying salary ranges. If required, the ranges displayed below or via the URL below are specifically for those potential hires who will work in the location(s) listed. Any offered salary is determined based on relevant factors such as applicant's skills, job responsibilities, prior relevant experience, certain degrees and certifications and market considerations. In addition, KPMG is proud to offer a comprehensive, competitive benefits package, with options designed to help you make the best decisions for yourself, your family, and your lifestyle. Available benefits are based on eligibility. Our Total Rewards package includes a variety of medical and dental plans, vision coverage, disability and life insurance, 401(k) plans, and a robust suite of personal well-being benefits to support your mental health. Depending on job classification, standard work hours, and years of service, KPMG provides Personal Time Off per fiscal year. Additionally, each year KPMG publishes a calendar of holidays to be observed during the year and provides eligible employees two breaks each year where employees will not be required to use Personal Time Off; one is at year end and the other is around the July 4th holiday. Additional details about our benefits can be found towards the bottom of our KPMG US Careers site at Benefits & How We Work.
Follow this link to obtain salary ranges by city outside of CA:
**********************************************************************
KPMG offers a comprehensive compensation and benefits package. KPMG is an equal opportunity employer. KPMG complies with all applicable federal, state and local laws regarding recruitment and hiring. All qualified applicants are considered for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, citizenship status, disability, protected veteran status, or any other category protected by applicable federal, state or local laws. The attached link contains further information regarding KPMG's compliance with federal, state and local recruitment and hiring laws. No phone calls or agencies please.
KPMG recruits on a rolling basis. Candidates are considered as they apply, until the opportunity is filled. Candidates are encouraged to apply expeditiously to any role(s) for which they are qualified that is also of interest to them.
Los Angeles County applicants: Material job duties for this position are listed above. Criminal history may have a direct, adverse, and negative relationship with some of the material job duties of this position. These include the duties and responsibilities listed above, as well as the abilities to adhere to company policies, exercise sound judgment, effectively manage stress and work safely and respectfully with others, exhibit trustworthiness, and safeguard business operations and company reputation. Pursuant to the California Fair Chance Act, Los Angeles County Fair Chance Ordinance for Employers, Fair Chance Initiative for Hiring Ordinance, and San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
Mfg. Product Data Engineer
Data scientist job in Grand Rapids, MI
We make life more comfortable. Leggett & Platt's overall mission is a commitment to enhance lives - by delivering quality products, offering empowering and rewarding careers, and doing our part in bringing about a better future. Leggett & Platt's inventive heritage and leadership in the residential products industry span more than 130 years. As The Components People, we are the leading supplier of a wide range of products and components for all areas of life, including mattress springs and carpet cushion, as well as bedding machinery and erosion-control products.
From aerospace tubing and fabricated assemblies to flooring underlayment and carpet cushion, Leggett & Platt has divisions that design, manufacture, and sell a variety of products. Our reliable product development and launch capability, coupled with our global footprint, make us a trusted partner for customers in the aerospace, hydraulic cylinders, flooring, textile, and geo components industries.
Learn more about the history of Leggett: ***************************
Department / Division: Engineering
Summary: Responsible for all computer-related product data, such as three-dimensional model definitions, fully dimensioned drawings, and all necessary system requirements.
Essential Duties and Responsibilities: (Other duties may be assigned)
* 5.1. Create three-dimensional product definition from customer and product engineering specifications.
* 5.2. Create fully dimensioned product drawings for customer approval and quality check sheets.
* 5.3. Create three-dimensional tool definition (tooling data sheets TDS) from manufacturing engineering specifications.
* 5.4. Create fully dimensioned tool drawings as required.
* 5.5. Responsible for revision control and maintaining drawing files.
* 5.6. Responsible for computer system troubleshooting and upgrade installations.
* 5.7. Responsible for managing customer computer files as required.
* 5.8. Plans and formulates engineering program and organizes project staff according to project requirements.
* 5.9. Reviews product design for compliance with engineering principles, company standards, and customer contract requirements, and related specifications.
* 5.10. Coordinates activities concerned with technical developments, scheduling, and resolving engineering design and test problems.
* 5.11. Directs integration of technical activities and products.
* 5.12. Evaluates and approves design changes, specifications, and drawing releases.
* 5.13. Controls expenditures within limitations of project budget.
* 5.14. Prepares interim and completion project status reports.
* 5.15. Coordinates first product runs including sample submission and product verification studies.
* 5.16. Controls all aspects of engineering-related quality issues, including customer follow-up visits. Verifies that production personnel conduct follow-up visits on production-related quality issues.
* 5.17. Issues engineering changes as requested from customer including internal production modifications and cost changes.
* 5.18. Issues internal engineering changes related to processes and material changes.
* 5.19. Follows products from initial quote through production release servicing the account with engineering changes as required by the customer.
Supervisory Responsibilities: Coordinates employees involved with projects, including accounting, tooling, and production. Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws. Responsibilities include planning, assigning, and directing work; addressing complaints; and resolving problems.
Qualification Requirements: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
* 7.1. Education and/or Experience: Associate's degree (A. A.) or equivalent from two-year college or technical school; three years related experience and/or training; or equivalent combination of education and experience.
* 7.2. Language Skills: Ability to read and interpret documents such as safety rules, operating and maintenance instructions, and procedure manuals. Ability to write routine reports and correspondence. Ability to speak effectively before groups of customers or employees of organization.
* 7.3. Mathematical Skills: Ability to calculate figures and amounts such as proportions, percentages, area, circumference, and volume. Ability to apply concepts of basic algebra, geometry and trigonometry.
* 7.4. Reasoning Ability: Ability to define problems, collect data, establish facts, and draw valid conclusions. Ability to interpret an extensive variety of technical instructions in mathematical or diagram form and deal with several abstract and concrete variables.
* 7.5. Certificates, Licenses, Registrations:
Other Skills and Abilities:
Physical Demands: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
* 9.1. While performing the duties of this job, the employee is frequently required to stand; walk; use hands to finger, handle, or feel objects, tools, or controls; and talk or hear. The employee is occasionally required to sit and reach with hands and arms.
* 9.2. The employee must occasionally lift and/or move up to 10 pounds. Specific vision abilities required by this job include color vision.
Work Environment: The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
* 10.1. While performing the duties of this job, the employee occasionally works near moving mechanical parts and is occasionally exposed to fumes or airborne particles.
* 10.2. The noise level in the work environment is usually quiet.
What to Do Next
Now that you've had a chance to learn more about us, what are you waiting for! Apply today and allow us the opportunity to learn more about you and the value you can bring to our team. Once you apply, be sure to create a profile, and sign up for job alerts, so you can be the first to know when new opportunities become available. If you require assistance completing an application, please contact our team at *******************
Our Values
Our values speak to our shared beliefs, and describe how we approach working together.
* Put People First reflects our commitment to safety and care of each other, learning and development, and creating an inclusive environment of mutual respect, empathy and belonging.
* Do the Right Thing focuses us on acting with honesty and integrity, delivering the results the right way, taking pride in our work, and speaking the truth - good or bad.
* Do Great Work…Together occurs when we engage without hierarchy, collaborate as a team, embrace challenges, and work for the good of all of us.
* Take Ownership and Raise the Bar demonstrates our responsibility to add value and make a difference, challenge the status quo and biases to make things better, foster innovative and creative solutions to drive impact, and explore new perspectives and embrace change.
Our Commitment to You
We're actively taking steps to make sure our culture is inclusive and that our processes and practices promote equity for all. Leggett & Platt is comprised of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veteran status, and more. Join us!
We welcome and encourage applications if you meet the minimum qualifications. Even if you do not meet the preferred qualifications, we'd love the opportunity to consider you.
For more information about how we handle your personal data in connection with our recruiting processes, please refer to the Recruiting Privacy Notice on the "Privacy Notice" tab located at **************************
IMPORTANT NOTICE TO APPLICANTS
Regarding Equal Employment / Equal Access
Leggett & Platt, Incorporated, and all its Canadian affiliates value the diversity of skills, knowledge, and perspectives our employees bring to our workplace. We believe the presence in our workforce of individuals with characteristics protected under local human rights legislation, including but not limited to females, minorities, individuals with disabilities, veterans, persons from all faiths and national origins and of all colors reward us with greater talent, new ideas and unique perspectives.
It is therefore our policy to recruit, hire, promote, transfer, administer benefits and handle all other conditions of employment in a non-discriminatory manner without discrimination regarding any protected status under local human rights legislation. Our Chief Executive Officer has requested that all of our employees make equal opportunity a priority.
We are committed to workplace accessibility and accommodation of persons with disabilities. Should you have any need for accommodation at any stage of the hiring or recruitment process, please let us know and we will work with you to provide appropriate accommodation or address any accessibility-related needs. This information will be handled as confidentially as practical. You are welcome to contact a member of our Corporate Employee Relations staff at ************ if you need assistance to apply, wish to discuss an accommodation, or have questions about this notice.