
Data scientist jobs in Comstock, MI

40 jobs
  • Principal Data Scientist

Maximus (4.3 company rating)

    Data scientist job in Grand Rapids, MI

Description & Requirements

We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work). This position requires occasional travel to the DC area for client meetings.

Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.

Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.

Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.

Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.

Preferred Skills and Qualifications:
- Masters or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
  - Ability to leverage mathematical principles to model new and novel behaviors.
  - Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.

Background Investigations:
- IRS MBI - Eligibility

#techjobs #VeteransPage

EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.

Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.

Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.

Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
    $70k-99k yearly est. Easy Apply 5d ago
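The Maximus listing above asks for classical machine learning and NLP experience alongside Python scripting. As a rough, hedged illustration only (the example notes, labels, and category names below are invented and are not part of the posting), a minimal scikit-learn text-classification pipeline might look like this:

```python
# Hypothetical sketch: a classical-ML text classifier with scikit-learn.
# The texts, labels, and categories are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "claim resolved after routine document review",
    "caller reported repeated system outages",
    "payment processed without issue",
    "application escalated due to missing forms",
    "standard status inquiry, no action needed",
    "urgent follow-up requested by case manager",
]
labels = [0, 1, 0, 1, 0, 1]  # 0 = routine, 1 = needs follow-up

# TF-IDF features feeding a logistic regression classifier.
model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(texts, labels)

print(model.predict(["multiple outages reported, please escalate"]))
```

In practice the text would come from operational systems and the model choice would depend on the project; this is only meant to show the shape of a classical NLP workflow.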
  • Lead Data Scientist GenAI, Strategic Analytics - Data Science

Deloitte (4.7 company rating)

    Data scientist job in Grand Rapids, MI

Deloitte is at the leading edge of GenAI innovation, transforming Strategic Analytics and shaping the future of Finance. We invite applications from highly skilled and experienced Lead Data Scientists ready to drive the development of our next-generation GenAI solutions.

The Team
Strategic Analytics is a dynamic part of our Finance FP&A organization, dedicated to empowering executive leaders across the firm, as well as our partners in financial and operational functions. Our team harnesses the power of cloud computing, data science, AI, and strategic expertise, combined with deep institutional knowledge, to deliver insights that inform our most critical business decisions and fuel the firm's ongoing growth. GenAI is at the forefront of our innovation agenda and a key strategic priority for our future. We are rapidly developing groundbreaking products and solutions poised to transform both our organization and our clients. As part of our team, the selected candidate will play a pivotal role in driving the success of these high-impact initiatives.

Recruiting for this role ends on December 14, 2025.

Work You'll Do

Client Engagement & Solution Scoping
* Partner with stakeholders to analyze business requirements, pain points, and objectives relevant to GenAI use cases.
* Facilitate workshops to identify, prioritize, and scope impactful GenAI applications (e.g., text generation, code synthesis, conversational agents).
* Clearly articulate GenAI's value proposition, including efficiency gains, risk mitigation, and innovation.

Solution Architecture & Design
* Architect holistic GenAI solutions, selecting and customizing appropriate models (GPT, Llama, Claude, Zora AI, etc.).
* Design scalable integration strategies for embedding GenAI into existing client systems (ERP, CRM, KM platforms).
* Define and govern reliable, ethical, and compliant data sourcing and management.

Development & Customization
* Lead model fine-tuning, prompt engineering, and customization for client-specific needs.
* Oversee the development of GenAI-powered applications and user-friendly interfaces, ensuring robustness and exceptional user experience.
* Drive thorough validation, testing, and iteration to ensure quality and accuracy.

Implementation, Deployment & Change Management
* Manage solution rollout, including cloud setup, configuration, and production deployment.
* Guide clients through adoption: deliver training, create documentation, and provide enablement resources for users.

Risk, Ethics & Compliance
* Lead efforts in responsible AI, ensuring safeguards against bias, privacy breaches, and unethical outcomes.
* Monitor performance, implement KPIs, and manage model retraining and auditing processes.

Stakeholder Communication
* Prepare executive-level reports, dashboards, and demos to summarize progress and impact.
* Coordinate across internal teams, tech partners, and clients for effective project delivery.

Continuous Improvement & Thought Leadership
* Stay current on GenAI trends, best practices, and emerging technologies; share insights across teams.
* Mentor junior colleagues, promote knowledge transfer, and contribute to reusable methodologies.

Qualifications

Required:
* Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Mathematics, or related field.
* 5+ years of hands-on experience delivering machine learning or AI solutions, preferably including generative AI.
* Independent thinker who can create the vision and execute on transforming data into high-end client products.
* Demonstrated accomplishments in the following areas:
  * Deep understanding of GenAI models and approaches (LLMs, transformers, prompt engineering).
  * Proficiency in Python (PyTorch, TensorFlow, HuggingFace), Databricks, ML pipelines, and cloud-based deployment (Azure, AWS, GCP).
  * Experience integrating AI into enterprise applications, building APIs, and designing scalable workflows.
  * Knowledge of solution architecture, risk assessment, and mapping technology to business goals.
  * Familiarity with agile methodologies and iterative delivery.
  * Commitment to responsible AI, including data ethics, privacy, and regulatory compliance.
* Ability to travel 0-10%, on average, based on the work you do and the clients and industries/sectors you serve.
* Limited immigration sponsorship may be available.

Preferred:
* Relevant certifications: may include Google Cloud Professional ML Engineer, Microsoft Azure AI Engineer, AWS Certified Machine Learning, or specialized GenAI/LLM credentials.
* Experience with data visualization tools such as Tableau.

The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $102,500 - $188,900. You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.

Information for applicants with a need for accommodation: ************************************************************************************************************

EA_FA_ExpHire

Recruiting tips
From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Learn more.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

As used in this posting, "Deloitte" means Deloitte Services LP, a subsidiary of Deloitte LLP. Please see ************************* for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.

Requisition code: 316523
Job ID: 316523
    $63k-85k yearly est. 9d ago
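The Deloitte listing above calls out prompt engineering and the Python/HuggingFace stack. Purely as an illustrative sketch (the model name, prompt wording, and variance notes are placeholders I chose, not anything the posting specifies; it assumes `transformers` plus a backend such as PyTorch are installed), a prompt-templating pattern with the `transformers` text-generation pipeline could look like:

```python
# Illustrative only: prompt templating with a small open model via transformers.
# "gpt2" is a stand-in; a real GenAI project would use a far more capable model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

PROMPT_TEMPLATE = (
    "You are a finance analytics assistant.\n"
    "Summarize the key driver of the variance below in one sentence.\n\n"
    "Variance notes: {notes}\n"
    "Summary:"
)

notes = "Travel spend rose 18% quarter over quarter, driven by client workshops."
prompt = PROMPT_TEMPLATE.format(notes=notes)

# Deterministic decoding keeps the sketch reproducible.
result = generator(prompt, max_new_tokens=40, do_sample=False)
print(result[0]["generated_text"])
```

The same templating idea carries over to hosted LLM APIs; only the client call changes.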
  • Principal Data Scientist

Meijer (4.5 company rating)

    Data scientist job in Grand Rapids, MI

As a family company, we serve people and communities. When you work at Meijer, you're provided with career and community opportunities centered around leadership, personal growth and development. Consider joining our family - take care of your career and your community!

Meijer Rewards
* Weekly pay
* Scheduling flexibility
* Paid parental leave
* Paid education assistance
* Team member discount
* Development programs for advancement and career growth

Please review the job profile below and apply today!

The Data Science team at Meijer leads the strategy, development and integration of Machine Learning and Artificial Intelligence at Meijer. Data Scientists on the team will drive customer loyalty, digital conversion, and system efficiencies by delivering innovative data-driven solutions. Through these solutions, the data scientists drive material business value, mitigate business and operational risk, and significantly impact the customer experience. This role works directly with product development, merchandising, marketing, operations, ITS, ecommerce, and vendor partners. The position will lead, consult or oversee multiple highly complex data science projects/programs/domains that have significant impacts and require in-depth technical knowledge across multiple specific architecture disciplines such as technology, solution, business, or information/data.

What You'll Be Doing:
* Deliver against the overall data science strategy to drive in-store and digital merchandising, marketing, customer loyalty, and operational performance.
* Partner with product development to define requirements which meet system and customer experience needs for data science projects.
* Partner with Merchandising, Supply Chain, Operations and customer insights to understand the journey that will be improved with the data science deliverables.
* Build production-ready prototypes for, and iteratively develop, end-to-end data science pipelines including custom algorithms, statistical models, machine learning and artificial intelligence functions to meet end user needs.
* Partner with product development and technology teams to deploy pipelines into a production MLOps environment following SAFe Agile methodology.
* Architect, design, and lead the development and implementation of machine learning algorithms and models for a data science capability within Digital Services, Merchandising, Marketing, Supply Chain and Operations.
* Design, code, and implement new industry-leading design patterns in data science, to create the convention/technique/practice that teams are accountable to follow.
* Foster a culture of innovation by keeping up-to-date with the latest industry trends and research across data science, AI and retail.
* Mentor data science teams and promote a culture of curiosity, accountability, & enthusiasm.
* Provide data science strategy to ensure code modernization and industry-standard patterns for data science.
* Provide guidance to development teams within the domain (Commerce, Services or Mobile).
* Lead non-functional requirements (NFRs) such as security, reliability, performance, maintainability, scalability, and usability related to data science.
* Promote the implementation of new technology, solutions, and methods to improve business processes, efficiency, effectiveness, and value delivered to customers.
* Proactively identify data-driven solutions for strategic cross-functional initiatives, develop and present business cases, and gain stakeholder alignment on the solution.
* Responsible for defining, documenting and following best practices, and for approving ML/AI development at Meijer.
* Own communication with data consumers (internal and external) to ensure they understand the data science products, have the proper training, and are following the best practices in application of data science products.
* Define and analyze Key Performance Indicators to monitor the usage, adoption, health and value of the data products.
* Identify and scope, in conjunction with IT, the architecture of systems and environment needs to ensure that data science systems can deliver against strategy.
* Build and maintain relationships with key partners, suppliers and industry associations and continue to advance data science capabilities, knowledge and impact.
* This job profile is not meant to be all inclusive of the responsibilities of this position; may perform other duties as assigned or required.

What You'll Bring With You:
* Advanced degree (MA/MS, PhD) in Data Science, Computer Science, Mathematics, Statistics, Economics, or related quantitative field.
* Certifications: Azure Data Science Associate, Azure AI, SAFe Agile.
* 8+ years of relevant data science experience in an applied role - preferably within retail, logistics, supply chain or CPG, with a focus on NLP and AI.
* Advanced and hands-on experience using: Python, Databricks, Azure ML, Azure Cognitive Services, Ads Data Hub, BigQuery, SAS, R, SQL, PySpark, NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch, AutoTS, Prophet, NLTK.
* Experience with Azure Cloud technologies including Azure DevOps, Azure Synapse, MLOps, GitHub.
* Solid experience working with large datasets and developing ML/AI systems such as: natural language processing, speech/text/image recognition, supervised and unsupervised learning models, forecasting and/or econometric time series models.
* Developed efficient and effective solutions to diverse and complex business problems.
* Technical team leadership experience. Experience in the Scaled Agile framework is preferred.
* Proactive and action-oriented.
* Ability to collaborate with, and present to, internal and external partners.
* Experience in the retail industry or in a production/service environment is preferred.
    $73k-93k yearly est. Auto-Apply 60d+ ago
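The Meijer posting above lists Prophet among its forecasting tools. As a hedged sketch only (the data below is synthetic, and the 28-day horizon and column values are illustrative assumptions, not anything Meijer prescribes), a minimal daily-demand forecast with Prophet typically follows this pattern:

```python
# Minimal Prophet forecasting sketch on synthetic daily data.
import numpy as np
import pandas as pd
from prophet import Prophet

# Prophet expects a dataframe with columns 'ds' (date) and 'y' (value).
dates = pd.date_range("2024-01-01", periods=120, freq="D")
noise = np.random.default_rng(0).normal(0, 3, 120)
demand = 100 + 10 * np.sin(np.arange(120) * 2 * np.pi / 7) + noise  # weekly pattern
history = pd.DataFrame({"ds": dates, "y": demand})

model = Prophet(weekly_seasonality=True, yearly_seasonality=False)
model.fit(history)

# Forecast 28 days beyond the training window, with uncertainty intervals.
future = model.make_future_dataframe(periods=28)
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```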
  • Data Scientist

MillerKnoll, Inc.

    Data scientist job in Zeeland, MI

Why join us?
Our purpose is to design for the good of humankind. It's the ideal we strive toward each day in everything we do. Being a part of MillerKnoll means being a part of something larger than your work team, or even your brand. We are redefining modern for the 21st century. And our success allows MillerKnoll to support causes that align with our values, so we can build a more sustainable, equitable, and beautiful future for everyone.

About the Role
We're looking for an experienced and adaptable Data Scientist to join our growing AI & Data Science team. You'll be part of a small, highly technical group focused on delivering impactful machine learning, forecasting, and generative AI solutions. In this role, you'll work closely with stakeholders to translate business challenges into well-defined analytical problems, design and validate models, and communicate results in clear, actionable terms. You'll collaborate extensively with our ML Engineer to transition solutions from experimentation to production, ensuring models are both effective and robust in real-world environments. You'll be expected to quickly prototype and iterate on solutions, adapt to new tools and approaches, and share knowledge with the broader organization. This is a hands-on role with real impact and room to innovate.

Key Responsibilities
* Partner with business stakeholders to identify, scope, and prioritize data science opportunities.
* Translate complex business problems into structured analytical tasks and hypotheses.
* Design, develop, and evaluate machine learning, forecasting, and statistical models, considering fairness, interpretability, and business impact.
* Perform exploratory data analysis, feature engineering, and data preprocessing.
* Rapidly prototype solutions to assess feasibility before scaling.
* Interpret model outputs and clearly communicate findings, implications, and recommendations to both technical and non-technical audiences.
* Collaborate closely with the ML Engineer to transition models from experimentation into scalable, production-ready systems.
* Develop reproducible code, clear documentation, and reusable analytical workflows to support org-wide AI adoption.
* Stay up to date with advances in data science, AI/ML, and generative AI, bringing innovative approaches to the team.

Required Technical Skills
* Bachelor's or Master's degree in Data Science, Statistics, Applied Mathematics, Computer Science, or a related quantitative field, with 3+ years of applied experience in data science.
* Strong foundation in statistics, probability, linear algebra, and optimization.
* Proficiency with Python and common data science libraries (Pandas, NumPy, Scikit-learn, XGBoost, PyTorch or TensorFlow).
* Experience with time series forecasting, regression, classification, clustering, or recommendation systems.
* Familiarity with GenAI concepts and tools (LLM APIs, embeddings, prompt engineering, evaluation methods).
* Strong SQL skills and experience working with large datasets and cloud-based data warehouses (Snowflake, BigQuery, etc.).
* Solid understanding of experimental design and model evaluation metrics beyond accuracy.
* Experience with data visualization and storytelling tools (Plotly, Tableau, Power BI, or Streamlit).
* Exposure to MLOps/LLMOps concepts and working in close collaboration with engineering teams.

Soft Skills & Qualities
* Excellent communication skills with the ability to translate analysis into actionable business recommendations.
* Strong problem-solving abilities and business acumen.
* High adaptability to evolving tools, frameworks, and industry practices.
* Curiosity and continuous learning mindset.
* Stakeholder empathy and ability to build trust while introducing AI solutions.
* Strong collaboration skills and comfort working in ambiguous, fast-paced environments.
* Commitment to clear documentation and knowledge sharing.

Who We Hire?
Simply put, we hire qualified applicants representing a wide range of backgrounds and abilities. MillerKnoll is comprised of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veterans from every branch of military service, and more. Here, you can bring your whole self to work. We're committed to equal opportunity employment, including veterans and people with disabilities. This organization participates in E-Verify Employment Eligibility Verification. In general, MillerKnoll positions are closed within 45 days and are open for applications for a minimum of 5 days. We encourage our prospective candidates to submit their application(s) expediently so as not to miss out on our opportunities. We frequently post new opportunities and encourage prospective candidates to check back often for new postings. MillerKnoll complies with applicable disability laws and makes reasonable accommodations for applicants and employees with disabilities. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact MillerKnoll Talent Acquisition at careers_********************.
    $68k-94k yearly est. Auto-Apply 60d+ ago
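The MillerKnoll listing above stresses model evaluation metrics beyond accuracy. As a small hedged sketch (synthetic, deliberately imbalanced data and an arbitrary model choice, not MillerKnoll's actual workflow), reporting precision, recall, and ROC-AUC with scikit-learn looks roughly like this:

```python
# Sketch: evaluating a classifier on more than accuracy (synthetic data).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_score, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split

# 10% positive class, so plain accuracy would look deceptively good.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=7)

clf = GradientBoostingClassifier(random_state=7).fit(X_train, y_train)
proba = clf.predict_proba(X_test)[:, 1]
pred = (proba >= 0.5).astype(int)

print("precision:", precision_score(y_test, pred))
print("recall:   ", recall_score(y_test, pred))
print("roc_auc:  ", roc_auc_score(y_test, proba))
```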
  • Sr. Biomedical Data Scientist

BAMF Health Inc.

    Data scientist job in Grand Rapids, MI

Join BAMF Health, where you're not just part of a team; you're at the forefront of a revolution in Theranostics, changing lives for the better. As a member of our global team, you'll contribute to pioneering technology and deliver top-tier patient care. Located in the heart of downtown Grand Rapids, our cutting-edge global headquarters resides within the state-of-the-art Doug Meijer Medical Innovation Building. Step into our modern and spacious facilities, where innovation thrives and collaboration knows no bounds. Join us in our mission to make Theranostics accessible and affordable for all, and be part of something truly remarkable at BAMF Health.

The Biomedical Data Scientist leads the transformation of complex clinical and imaging data into actionable AI models in the oncology space. The role combines clinical informatics, data engineering, and machine learning for real-world precision medicine. The role is responsible for translating multimodal data, including PACS imaging, EMRs, physician notes, and optionally molecular assays, into unified structured datasets and for building prognostic and progression-free survival (PFS) models from these datasets to advance cancer treatment personalization.

Data Integration & Engineering:
* Build and maintain robust pipelines that ingest and harmonize multimodal data (DICOM imaging, EMR, pathology, molecular/omics data, and free-text clinical notes).
* Work across HL7, FHIR, DICOM, and unstructured text formats to enable a comprehensive longitudinal patient view.
* Apply NLP or LLM tools to extract structured information from freeform physician notes, pathology reports, and radiology impressions.
* Collaborate with solution architects and engineers to ensure AI-readiness and scalability of the data infrastructure.

AI Modeling & Analytics:
* Develop and validate AI/ML models for Progression-Free Survival (PFS) estimation, overall survival prediction, and prognostic classification and risk stratification.
* Use classical survival analysis (e.g., Kaplan-Meier, Cox models) as well as modern ML approaches (e.g., DeepSurv, time-to-event modeling, random survival forests).
* Perform feature engineering across clinical, imaging, and molecular features.
* Lead the data science component of retrospective and prospective studies in collaboration with clinicians and statisticians.

Collaboration & Strategy:
* Partner with oncologists, nuclear medicine physicians, and radiologists to define clinically meaningful endpoints and labels.
* Contribute to publications, presentations, and regulatory submissions where needed.
* Ensure all processes adhere to data privacy regulations (e.g., HIPAA, GDPR, 21 CFR Part 11).

Basic Qualifications:
* Advanced degree (MS/PhD) in Biomedical Informatics, Biostatistics, Computer Science, Bioengineering, or related discipline required.
* 5 years of experience in biomedical data science, clinical informatics, or a related field required.
* Proven experience with clinical data (EMR, PACS, structured/unstructured EHR), NLP/LLM tools, and AI/ML models for prognostic or survival prediction required.
* Proficiency in Python (pandas, scikit-learn, PyTorch), SQL, and data pipeline frameworks required.

Preferred Qualifications:
* Experience with time-to-event models (DeepSurv, CoxNet, survival forests) preferred.
* Familiarity with radiology and nuclear medicine workflows preferred.
* Exposure to omics datasets and molecular biomarkers preferred.
* Prior experience in oncology, radiology, or theranostics domains preferred.

Schedule Details:
* Employment Status: Full time (1.0 FTE)
* Weekly Scheduled Hours: 40
* Hours of work: 8:00 a.m. to 5:00 p.m.
* Days worked: Monday-Friday
* On-site role with potential hybrid flexibility

At BAMF Health, our top priority is patient care. To ensure we are able to drive a Bold Advance Medical Future, we offer a well-rounded benefit package to care for our team members and their families. Highlights include:
* Employer-paid High Deductible Health Plan with employer HSA contribution
* Flexible Vacation Time
* 401(k) Retirement Plan with generous employer match
* Several benefit options including, but not limited to: dental, vision, disability, life, supplemental coverages, legal and identity protection
* Free Grand Rapids downtown parking

Disclaimer
BAMF Health provides equal opportunities to all employees for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. BAMF Health will reasonably accommodate qualified individuals with a disability so that they can perform the essential functions of a job unless doing so causes a direct threat to these individuals or others in the workplace and the threat cannot be eliminated by reasonable accommodation or if the accommodation creates an undue hardship to BAMF Health. BAMF Health is an Equal Opportunity Employer and will not accept or tolerate discrimination or harassment against any applicant, employee, intern, or volunteer based upon the following characteristics: race, color, religion, creed, national origin, ancestry, sex, age, qualified mental or physical disability or handicap, sexual orientation, gender identity/expression, transgender status, genetic information, pregnancy or pregnancy-related status, marital status, veteran status, military service, any application for any military service, or any other category or class protected by applicable federal, state, or local laws.
    $78k-107k yearly est. Auto-Apply 60d+ ago
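The BAMF Health listing above names Kaplan-Meier and Cox models for progression-free survival work in Python. The `lifelines` library is one common option for this (that choice, along with the synthetic patient data and covariate names below, is my assumption; the posting does not specify a library). A minimal sketch:

```python
# Hedged sketch: Kaplan-Meier curve and Cox proportional-hazards fit with lifelines.
# All data below is synthetic; column names are illustrative placeholders.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "pfs_months": rng.exponential(scale=12, size=n).round(1),  # time to progression
    "progressed": rng.integers(0, 2, size=n),                  # 1 = event observed, 0 = censored
    "age": rng.normal(68, 8, size=n).round(),
    "baseline_suv": rng.normal(6.0, 1.5, size=n).round(2),     # hypothetical imaging feature
})

# Non-parametric survival curve for the whole cohort.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["pfs_months"], event_observed=df["progressed"])
print("median PFS (months):", kmf.median_survival_time_)

# Cox PH model relating covariates to the hazard of progression.
cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_months", event_col="progressed")
cph.print_summary()
```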
  • Data Engineer - Senior Manager - Consulting - Location Open

EY (4.7 company rating)

    Data scientist job in Grand Rapids, MI

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

**AI & Data - Data Engineer - Senior Manager**

EY delivers unparalleled tech consulting services in data strategy, business intelligence, digital, machine learning and Artificial Intelligence. We support and enable big ideas, always with the ambition to keep doing more.

**The Opportunity:**

You will help our clients enable better business outcomes. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high-growth area and a high-visibility role with plenty of opportunities to enhance your skillset and build your career. As a Senior Manager in our AI and Data practice, you will lead data engineering initiatives, driving the design and implementation of robust data solutions for our clients. You will manage a team of data engineers, collaborating with cross-functional teams to deliver high-quality data products that support advanced analytics and AI applications.

**Key Responsibilities:**

In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:

+ Lead the development and optimization of data pipelines and architectures to ensure efficient data processing and integration.
+ Oversee the design and implementation of data models, ensuring data quality and accessibility.
+ Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical specifications.
+ Manage project timelines, budgets, and resources, ensuring successful delivery of data engineering projects.
+ Mentor and develop junior team members, fostering a culture of innovation and continuous improvement.
+ Engage with clients to understand their business needs, providing strategic insights and solutions that align with their goals.
+ Develop and execute go-to-market strategies for data engineering services, identifying opportunities for growth and innovation.
+ Stay current with industry trends and emerging technologies in data engineering and AI.

**Skills and attributes for success**

To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:

+ Technical skills, including:
  + Applications Integration
  + Data Engineering and Modelling
  + Data Integration and Data Quality
+ Become a trusted advisor to your clients' senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
+ Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
+ Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients' objectives.
+ Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, and key business processes, brainstorm solutions, and align on data architecture strategies and projects.
+ Direct and mentor global data engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
+ Establish data governance policies and practices, including data security, quality, and lifecycle management.
+ Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.

**To qualify for the role, you must have**

+ Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
+ 10+ years professional consulting experience in industry or in technology consulting.
+ 10+ years of hands-on work experience in data engineering, with a strong background in data architecture, ETL processes, and data warehousing.
+ 3+ years' experience with native cloud products and services such as AWS, Azure or GCP.
+ 5+ years of experience mentoring and leading teams of data engineers, fostering a culture of innovation and professional development.
+ Proficiency in programming languages such as Python, Java, or Scala, and experience with big data technologies (e.g., Hadoop, Spark).
+ Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features including various data flow patterns, relational and NoSQL databases, production-grade performance, and delivery to downstream use cases and applications.
+ Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimizations.
+ Previous hands-on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
+ Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
+ Excellent client management skills, with the ability to build and maintain strong relationships with stakeholders.
+ Experience in developing and executing go-to-market strategies for data and AI solutions.
+ Knowledge of machine learning concepts and frameworks, with the ability to integrate AI capabilities into data engineering processes.
+ Excellent leadership, communication, and project management skills.
+ Data Security and Database Management
+ Enterprise Data Management and Metadata Management

**Ideally, you'll also have**

+ Master's degree in Computer Science, Management Information Systems, Informatics, Statistics, Applied Mathematics, Data Science, Machine Learning or commensurate professional experience.
+ Experience in cloud platforms (e.g., AWS, Azure, Google Cloud) and data visualization tools.
+ Experience in leading and influencing teams, with a focus on mentorship and professional development.
+ A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
+ The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
+ Building and Managing Relationships
+ Client Trust and Value and Commercial Astuteness
+ Communicating With Impact and Digital Fluency

**What we look for**

We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.

**What we offer you**

At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.

+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.

**Are you ready to shape your future with confidence? Apply today.**

EY accepts applications for this position on an on-going basis. For those living in California, please click here for additional information. EY focuses on high ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.

**EY | Building a better working world**

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories. EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
    $77k-112k yearly est. 6d ago
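The EY role above centers on data pipelines built with big-data technologies such as Spark. As an illustrative sketch only (the file paths, column names, and aggregation are made-up placeholders, not an EY deliverable), a small PySpark batch transform might look like:

```python
# Illustrative PySpark batch job: read raw CSV, clean, aggregate, write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Hypothetical raw order events with an order_id, order_ts, and amount column.
orders = (
    spark.read.option("header", True).csv("/data/raw/orders/*.csv")
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "order_ts", "amount"])
)

# Roll up to one row per day for downstream analytics.
daily = (
    orders.groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.count("*").alias("order_count"), F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders_daily")
spark.stop()
```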
  • Data Engineer

Pro IT

    Data scientist job in Grand Rapids, MI

Job Title: Data Engineer
Skill Category: Data Engineering
Work Location: Grand Rapids, MI
Onsite/Remote: Hybrid
Hourly C2C Rate: $80/hr

Job Description:
We are looking for a Data Engineer with over 6 years of industry experience in business application design, development, implementation, and solution architecture. The ideal candidate should have experience with Databricks and building and designing data and analytics on enterprise solutions such as:
* Azure Data Factory
* Azure Function App
* Log Analytics
* Databricks
* Synapse
* Power BI
* ADLS Gen2
* Logic Apps

Required Skills:
Data Classification, Data Modeling, Data Architecture, Data Quality, Design, Network Components, Solution Architecture, Teamwork, Technical Training, .Net Core, Agile, C#, Data Warehouse, Infrastructure, Product Management, Functional Requirements, Interfaces, Research, Subsystems, Support, .Net, Python, Data Engineering, Data Analysis, Data Science

Responsibilities:
* Design, code, test, and implement data movement, dashboarding, and analytical assets.
* Develop system documentation according to SAFe Agile principles and industry standards.
* Evaluate architectural options and define the overall architecture of the enterprise Data Lake and Data Warehouse.
* Provide subject matter expertise and technical consulting support on vendor or internal applications and interfaces.
* Develop Azure Function Apps using C# and .Net Core.
* Define functional and non-functional requirements, including performance monitoring, alerting, and code management.
* Partner with business areas to gather requirements for Data and Analytics and design solutions.
* Define major elements and subsystems and their interfaces.
* Mentor and coach team members.
* Engage with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
* Interface with the Product Manager and IT partners to define and estimate features for agile teams.
* Conduct industry research, facilitate new product and vendor evaluations, and assist in vendor selection.

Qualifications:
* 6+ years of industry experience in business application design, development, implementation, and solution architecture.
* Experience with Azure Data Factory, Azure Function App, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, and Logic Apps. Databricks experience is required.
* Experience designing data pipelines and implementing data quality, governance, and security compliance in Azure architecture.
* Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering, or a related discipline, or equivalent work experience and technical training.
* Excellent written and oral communication skills.
* Experience in Power BI, Data Modeling, Data Classification, Data Architecture, and reporting.
* In-depth understanding of computer, storage, and network components, including backup, monitoring, and DR environment requirements.
* Preferred knowledge and experience in Python and API architecture in Azure.
* SAFe certification or training is a plus.
* Experience with diverse technical configurations, technologies, and processing environments.
* Exceptional interpersonal skills, including teamwork, facilitation, and negotiation.

Must Be Included with Submittal:
* Full Legal Name
* Phone
* Email
* Current Location
* Rate
* Work Authorization
* Willingness to Relocate
* Confirmation that the candidate is or will be on your W2
    $80 hourly 60d+ ago
  • Staff Data Engineer

CertifID (3.9 company rating)

    Data scientist job in Grand Rapids, MI

Cybercrime is rising, reaching record highs in 2024. According to the FBI's IC3 report, total losses exceeded $16 billion. With investment fraud and BEC scams at the forefront, the message is clear: the real estate sector remains a lucrative target for cybercriminals. At CertifID, we take this threat seriously and provide a secure platform that verifies the identities of parties involved in transactions, authenticates wire transfer instructions, and detects potential fraud attempts. Our technology is designed to mitigate risks and ensure that every transaction is conducted with confidence and peace of mind.

We know we couldn't take on this challenge without our incredible team. We have been recognized as one of the Best Startups to Work for in Austin, made the Inc. 5000 list, and won Best Culture by Purpose Jobs two years in a row. We are guided by our core values and our vision of a world without wire fraud. We offer a dynamic work environment where you can contribute to meaningful impact and be part of a team dedicated to enhancing security and fighting fraud.

We're seeking an exceptional Staff Data Engineer to help us take our data platform to the next level. This is a high-impact role where you'll architect and build scalable data infrastructure, empower data-driven decision-making, and help shape the future of fraud prevention. We're scaling fast, and you'll have the chance to shape the future of our data platform. You will collaborate with data scientists, engineers, and business stakeholders to ensure high-quality, actionable data insights to enable business outcomes.

What You Will Do
* Design and build robust, scalable, secure data pipelines and infrastructure to support analytics, product intelligence, and machine learning.
* Lead data architecture decisions, ensuring high performance, reliability, and maintainability across our platform.
* Collaborate cross-functionally with product, engineering, and business teams to deliver data solutions that drive strategic insights and operational excellence.
* Champion data quality and governance, implementing best practices for data validation, lineage, and observability.
* Mentor and guide other data engineers, fostering a culture of technical excellence and continuous learning.

What We're Looking For
* Proven experience as a staff-level data engineer in a fast-paced, product-driven environment.
* Engineering experience of 8+ years.
* Experience in leading teams of other engineers to build a long-term technical vision and plans to achieve it.
* Deep expertise in cloud data platforms (e.g., Snowflake, BigQuery, Redshift) and orchestration tools (e.g., Airflow, dbt).
* Strong programming skills in Python or Scala, and proficiency with SQL.
* Experience with real-time data processing (e.g., Kafka, Spark Streaming) and data modeling for analytics and ML.
* A growth mindset, strong ownership, and a passion for solving complex problems that matter.

Preferred Experience
* Experience in fintech, cybersecurity, or fraud prevention.
* Familiarity with data privacy regulations (e.g., SOC 2, GDPR, CCPA).
* Contributions to open-source data tools or communities.

What We Offer
* Flexible vacation
* 12 company-paid holidays
* 10 paid sick days
* No work on your birthday
* Health, dental, and vision insurance (including a $0 option)
* 401(k) with matching, and no waiting period
* Equity
* Life insurance
* Generous parental paid leave
* Wellness reimbursement of $300/year
* Remote worker reimbursement of $300/year
* Professional development reimbursement
* Competitive pay
* An award-winning culture

Change doesn't happen overnight, and the same goes for us here at CertifID. We PROGRESS collectively and individually as we grow, abiding by our core values: Protect the Customer, Raise the Bar, Operate with Urgency, Grow with Grit, Ride the Wave, Enthusiasm Spreads, Stay Connected, Send It.
    $81k-113k yearly est. Auto-Apply 60d+ ago
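The CertifID posting above names orchestration tools such as Airflow and dbt for its data platform. Purely as a hedged sketch (the DAG id, task names, and callables are invented for illustration, and it assumes an Airflow 2.x environment), wiring an extract step ahead of a validation step could look like:

```python
# Hypothetical Airflow 2.x DAG sketch: one extract task feeding one validation task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transactions(**_):
    # Placeholder: pull the day's transaction events from an upstream source.
    print("extracting transactions")

def validate_transactions(**_):
    # Placeholder: run row-count and schema checks before loading downstream.
    print("validating transactions")

with DAG(
    dag_id="transactions_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_transactions)
    validate = PythonOperator(task_id="validate", python_callable=validate_transactions)
    extract >> validate  # validation only runs after extraction succeeds
```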
  • Senior Data Engineer

Advance Local (3.6 company rating)

    Data scientist job in Grand Rapids, MI

    **Advance Local** is looking for a **Senior Data Engineer** to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position will combine your deep technical expertise in data engineering with team leadership responsibilities for data engineering, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with data product and across business units to translate requirements into technical solutions, integrate data from numerous third-party platforms, (CDPs, DMPs, analytics platforms, marketing tech) into central data platform, collaborate closely with the Data Architect on platform strategy and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations. The base salary range is $120,000 - $140,000 per year. **What you'll be doing:** + Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake. + Partner with platform owners across business units to establish and maintain data integrations from third party systems into the central data platform. + Architect and maintain data infrastructure using IAC, ensuring reproducibility, version control and disaster recovery capabilities. + Design and implement API integrations and event-driven data flows to support real time and batch data requirements. + Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities. + Partner with the Data Architect and data product to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs. + Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability. + Support rapid prototyping of new data products in collaboration with data product by building flexible, reusable data infrastructure components. + Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability. + Collaborate with data product, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization. + Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources. + Develop and maintain comprehensive documentation for data engineering processes and systems, architecture, integration patterns, and runbooks. + Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact. + Stay current with the emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices. 
**Our ideal candidate will have the following:** + Bachelor's degree in computer science, engineering, or a related field + Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role + Expert proficiency in Snowflake data engineering patterns + Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform + Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs) + Proven ability to work with third-party APIs, webhooks, and data exports + Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure + Proven ability to design and implement API integrations and event-driven architecture + Experience with data modeling, data warehousing, and ETL processes at scale + Advanced proficiency in Python and SQL for data pipeline development + Experience with data orchestration tools (Airflow, dbt, Snowflake Tasks) + Strong understanding of data security, access controls, and compliance requirements + Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms + Excellent problem-solving skills and attention to detail + Strong communication and collaboration skills **Additional Information** Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity. Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** . Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext. _Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds.
All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._ _If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._ Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
    $120k-140k yearly 30d ago
  • Data Engineer

    Impact Business Group 4.1company rating

    Data scientist job in Grand Rapids, MI

    Our client, a family-owned Midwestern grocery retailer striving to better people's lives in all communities, seeks a senior-level Data Engineer. The ideal candidate has 5 to 10 years of relevant work experience. This role designs, modifies, develops, writes, and implements software applications; supports and/or installs software applications and operating systems; and participates in the testing process through test review and analysis, test witnessing, and certification of software. The engineer should be familiar with a variety of the field's concepts, practices, and procedures, rely on experience and judgment to plan and accomplish goals, and perform a variety of complicated tasks. A wide degree of creativity and latitude is expected.
    $82k-114k yearly est. 21d ago
  • Data Engineer (AI-RPA)

    Padnos 3.8company rating

    Data scientist job in Grandville, MI

    Title: Data Engineer YOUR ROLE PADNOS is seeking a Data Engineer on our Data and Software team who thrives at the intersection of data, automation, and applied AI. This role builds intelligent data pipelines and robotic process automations (RPAs) that connect systems, streamline operations, and unlock efficiency across the enterprise. You'll design and develop pipelines using Python, SQL Server, and modern APIs, integrating services such as OpenAI, Anthropic, and Azure ML to drive automation and accelerate business processes. Your work will extend beyond traditional data engineering, applying AI models and API logic to eliminate manual effort and make data more actionable across teams. You will report directly to the IT Manager at PADNOS Corporate in Grandville, MI. This is an in-person role based in Grandville, Michigan. Must reside within daily commuting distance of Grandville, Michigan. We do not relocate, sponsor visas, or consider remote applicants. ACCOUNTABILITIES Design and develop automated data pipelines that integrate AI and machine learning services to process, enrich, and deliver high-value data for analytics and automation use cases. Build, maintain, and optimize SQL Server ELT workflows and Python-based automation scripts. Connect to external APIs (OpenAI, Anthropic, Azure ML, and other SaaS systems) to retrieve, transform, and post data as part of end-to-end workflows. Partner with business stakeholders to identify manual workflows and translate them into AI-enabled automations. Work with software developers to integrate automation logic directly into enterprise applications. Implement and monitor data quality, reliability, and observability metrics across pipelines. Apply performance tuning and best practices for database and process efficiency. Develop and maintain reusable Python modules and configuration standards for automation scripts. Support data governance and version control processes to ensure consistency and transparency across environments. Collaborate closely with analytics, software, and operations teams to prioritize and deliver automation solutions that create measurable business impact. MEASUREMENTS Reduction in manual hours across teams through implemented automations. Reliable and reusable data pipelines supporting AI and analytics workloads. Delivery of production-ready automation projects in collaboration with business units. Adherence to data quality and reliability standards. Continuous improvement in data pipeline performance and maintainability. QUALIFICATIONS/EXPERIENCE Bachelor's degree or equivalent experience in data engineering, computer science, or software development. Must have personally owned an automated pipeline end-to-end (design → build → deploy → maintain). Minimum of 3 years of hands-on experience building production data pipelines using Python and SQL Server. Contract, academic, bootcamp, or coursework experience does not qualify. Intermediate to advanced Python development skills, particularly for data and API automation. Experience working with RESTful APIs and JSON data structures. Familiarity with AI/ML API services (OpenAI, Anthropic, Azure ML, etc.) and their integration into data workflows. Experience with modern data stack components such as Fivetran, dbt, or similar tools preferred. Knowledge of SQL Server performance tuning and query optimization. Familiarity with Git and CI/CD workflows for data pipeline deployment. Bonus: Experience deploying or maintaining RPA or AI automation solutions.
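As a rough illustration of the AI-enriched pipeline step this posting describes (read from SQL Server, call an LLM API, write results back), here is a minimal Python sketch. It is not PADNOS code; the connection string, table, columns, category labels, and model choice are all hypothetical.

```python
# Illustrative sketch only. Table, column, label, and model names are hypothetical.
# Pattern: read unclassified rows from SQL Server, classify free text with an LLM,
# and write the result back.
import os
import pyodbc                 # pip install pyodbc
from openai import OpenAI     # pip install openai

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-host;DATABASE=Operations;Trusted_Connection=yes;"  # hypothetical
)
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

cur = conn.cursor()
rows = cur.execute(
    "SELECT TicketId, Description FROM dbo.InboundTickets WHERE Category IS NULL"
).fetchall()

for ticket_id, description in rows:
    # Ask the model for a single-word category; prompt kept deliberately simple.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[{
            "role": "user",
            "content": f"Classify this ticket as SCALE, LOGISTICS, or OTHER: {description}",
        }],
    )
    category = resp.choices[0].message.content.strip()
    cur.execute(
        "UPDATE dbo.InboundTickets SET Category = ? WHERE TicketId = ?",
        category, ticket_id,
    )

conn.commit()
conn.close()
```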
PADNOS is an Equal Opportunity Employer and does not discriminate on the basis of race, color, religion, sex, age, national origin, disability, veteran status, sexual orientation or any other classification protected by Federal, State or local law.
    $79k-109k yearly est. 2d ago
  • Marketing Data Engineer

    Hudson Manpower

    Data scientist job in Grand Rapids, MI

    For over half a decade, Hudson Manpower has been a trusted partner in delivering specialized talent and technology solutions across IT, Energy, and Engineering industries worldwide. We work closely with startups, mid-sized firms, and Fortune 500 clients to support their digital transformation journeys. Our teams are empowered to bring fresh ideas, shape innovative solutions, and drive meaningful impact for our clients. If you're looking to grow in an environment where your expertise is valued and your voice matters, then Hudson Manpower is the place for you. Join us and collaborate with forward-thinking professionals who are passionate about building the future of work. Job Description: Designs, codes, tests, and implements data movement pipelines, dashboarding, and analytical assets; develops system documentation according to SAFe Agile principles and industry standards. Design and hands-on development of either vendor or internal applications and interfaces, including Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, GCP Cloud Storage, and BigQuery. Assist in configuration and administration of the marketing platforms the team supports (Google Marketing Platform, Salesforce MCI, The Trade Desk, ADvendio, etc.). Build data ingestion into and out of the above-listed marketing platforms. Gather and implement functional and non-functional requirements, including performance monitoring, alerting, and code management, ensuring alignment with technology best practices and SLAs. Partner with all areas of the business to gather requirements for data and analytics and design solutions. Participate in agile ceremonies and follow agile cadence to build, test, and deliver solutions. Drive engagement with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions. Interface with the Product Manager and IT partners at the Program level and within other Release Trains to define and estimate features for agile teams. Job Requirements: 6+ years of industry experience (business application design, development, implementation, and/or solution architecture). Understanding of architecture practices and execution for large projects/programs. Experience building and designing data and analytics solutions on enterprise platforms such as Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, and ADLS Gen2; Databricks experience is required. Experience designing data pipelines (ingestion, storage, prep-train, model, and serve) using the above technologies; automating Azure workloads; and data quality, governance/standards, security, and legal compliance in the Azure architecture. Working knowledge of these tools preferred: Google Marketing Platform, Salesforce MCI, The Trade Desk, ADvendio. Experience with marketing domain data ingestion and analytics is a must. Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering, or a related discipline, or equivalent work experience and technical training, is required. Excellent written and oral communication skills. Previous experience in Power BI, data modeling, data classification and zones, data movement, data architecture, and reporting. Preferred: knowledge and experience with Python and API architecture in Azure. Any SAFe certification or training. Experience with multiple, diverse technical configurations, technologies, and processing environments. Exceptional interpersonal skills, including teamwork, facilitation, and negotiation.
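To make the marketing-data ingestion pattern above concrete, here is a minimal PySpark sketch that reads a raw platform export from ADLS Gen2 and lands it in a Delta table. It is illustrative only and assumes a Databricks-style environment with Delta Lake configured; the storage account, container, columns, and table names are hypothetical.

```python
# Illustrative only - not part of the posting. Assumes Databricks (or another
# Spark environment with Delta Lake configured); all paths and names are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw_path = "abfss://marketing@examplelake.dfs.core.windows.net/gmp/daily/"  # hypothetical

campaign_df = (
    spark.read.format("json")
    .load(raw_path)                               # raw export from a marketing platform
    .withColumn("ingest_date", F.current_date())  # partition column for downstream pruning
    .dropDuplicates(["campaign_id", "event_ts"])  # basic dedupe before landing
)

(
    campaign_df.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .saveAsTable("marketing_bronze.campaign_events")  # hypothetical bronze table
)
```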
    $74k-100k yearly est. 59d ago
  • Experienced Data Engineer Analyst

    Ascential Technologies

    Data scientist job in Grand Rapids, MI

    Ascential Technologies is a global leader in automated diagnostic, inspection, assembly, and test systems. We serve mission-critical industries including automotive (EOL testing, ADAS), aerospace, industrial automation (balancing, grinding, test stands), and medical & life sciences (instruments, devices, consumables, and automation). Our solutions help customers accelerate innovation, ensure safety, and improve performance across the product lifecycle. Role Overview: We are seeking an experienced Data Engineer/Analyst to join our team to lead and support the ongoing development of our data analytics platform. This individual will play a critical role in building, maintaining, and enhancing scalable data systems while also supporting customers and internal teams. The role requires strong technical expertise in data engineering, analytics, visualization, and AI integration along with proven experience managing on-prem and cloud-based stacks. Key Responsibilities: Data Engineering & Processing Design, develop, and maintain advanced data collectors for diverse industrial, aerospace, automotive, and medical systems. Architect robust data pipelines for on-premises and cloud environments (Elastic Cloud, OpenSearch). Analyze machine-generated log files from test stands, medical devices, and automation systems. Clean and preprocess data to remove duplicates, noise, and errors. Implement updates and enhancements to accommodate evolving data file types and formats (CSV, SQL, PLC logs, text, etc.). Augment datasets with derived features to enhance analytical value. Optimize performance, scalability, and fault tolerance of ingestion pipelines. Visualization & Reporting Design and implement dashboards tailored to specific test and measurement applications. Apply standard dashboard templates to new and existing datasets. Generate reports and visual summaries for internal and customer-facing use. Provide automatic report generation using AI/LLM frameworks to summarize data, generate insights and guidance on performance improvements, provide predictive analytics, anomaly detection, and automated recommendations. Data Collection & Ingestion Develop and deploy data collectors for machines across automotive, aerospace, and medical domains. Configure systems to monitor, clean, transform, and ingest data into cloud platforms. Ensure data pipelines are robust, scalable, and secure. Integrate new communication protocols (e.g. PLCs of different makes/models) and data formats; adapt to customer-specific software platforms Monitoring & Alerting Set up thresholds and integrate webhooks for real-time alerts and notifications. Investigate and resolve issues related to data collectors, dashboards, and ELK stack components. Collaboration & Maintenance Collaborate directly with customers to gather requirements, resolve technical issues, and deliver solutions. Provide guidance to internal teams, mentoring junior engineers and analysts. Participate in daily standups and agile team activities. Review and provide feedback on new datasets and data models. Perform backups and conduct code reviews for data-related components. Qualifications: Bachelor's degree in Computer Science, Data Science, Engineering, or a related field. 5+ years of professional experience in data engineering, data science or analytics roles for both on-prem and cloud environments. Proven expertise of ELK, OpenSearch, and/or OpenTelemetry. Strong programming and scripting skills (e.g., Python, Bash, PowerShell). 
Experience with data cleaning, transformation, and preprocessing for engineering/industrial data. Hands-on experience with cloud deployments (Elastic Cloud, AWS, Azure, or GCP). Proficiency with visualization frameworks (Kibana, OpenSearch Dashboards, Grafana). Experience with core programming principles and design patterns. Strong analytical and problem-solving skills. Excellent communication and teamwork abilities with both technical and customer stakeholders. Bonus Skills: Experience working with industrial protocols and PLC integration. Familiarity with containerization (Docker, Kubernetes). Experience incorporating cybersecurity principles for secure data handling. Experience with REST APIs and microservices architectures. Background in test and measurement systems for automotive, aerospace, or medical device industries. Prior experience in customer-facing engineering roles. This role can be remote; we prefer a candidate within commuting distance of one of the offices listed in the posting (Wisconsin, Michigan).
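As an illustration of the "data collector" pattern this posting centers on (parse machine-generated logs, clean them, and index them for dashboards), here is a minimal Python sketch using the OpenSearch client. The log format, index name, and host are hypothetical, not details from the posting.

```python
# Illustrative sketch only; the log format, index name, and host are hypothetical.
# Pattern: parse machine log lines into documents and bulk-index them into
# OpenSearch for dashboarding.
import re
from datetime import datetime, timezone
from opensearchpy import OpenSearch, helpers   # pip install opensearch-py

LINE_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),(?P<station>\w+),"
    r"(?P<measurement>[\d.]+),(?P<result>PASS|FAIL)$"
)

def parse_lines(path: str):
    """Yield OpenSearch bulk actions for each well-formed log line."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = LINE_RE.match(line.strip())
            if not m:                      # skip noise / malformed rows
                continue
            doc = m.groupdict()
            doc["measurement"] = float(doc["measurement"])
            doc["@timestamp"] = (
                datetime.strptime(doc.pop("ts"), "%Y-%m-%d %H:%M:%S")
                .replace(tzinfo=timezone.utc)
                .isoformat()
            )
            yield {"_index": "test-stand-results", "_source": doc}

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])
helpers.bulk(client, parse_lines("eol_test_stand.log"))
```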
    $74k-100k yearly est. 60d+ ago
  • Senior Data Engineer Solution Design Focused

    Unybrands

    Data scientist job in Texas, MI

    We're looking for a Senior Data Engineer with deep expertise in GCP solution architecture to join our engineering team. You'll play a key role in designing, building, and optimizing secure, scalable, and high-performance cloud solutions on Google Cloud Platform. By combining your technical skills with your understanding of e-commerce, CPG, or retail, you'll transform complex business needs into innovative, impactful solutions. If you're passionate about cloud architecture, thrive in fast-paced environments, and enjoy solving complex challenges, we'd love to hear from you. What You'll Do * Architect Scalable Cloud Solutions: Lead the design and implementation of end-to-end GCP-based architectures covering data, applications, infrastructure, and security. * Build Robust Data Pipelines: Develop and maintain reliable ETL/ELT pipelines using tools like Airflow and DBT and languages like Python and SQL. * Optimize BigQuery Warehousing: Design schemas, tune queries, and manage costs while working with large, complex datasets. * Ensure Data Quality & Governance: Implement standards for data accuracy, consistency, and lineage tracking. * Drive Performance & Efficiency: Monitor and optimize the performance and cost-effectiveness of cloud solutions. * Leverage Industry Knowledge: Apply your domain expertise to solve real-world retail, e-commerce, and supply chain challenges. * Maintain Security & Compliance: Build secure solutions aligned with privacy laws and industry best practices. * Lead & Mentor: Guide engineers, share best practices, and foster a culture of technical excellence. * Collaborate Across Teams: Work closely with stakeholders to define requirements and ensure solution success. * Document Clearly: Produce high-quality design docs, architecture diagrams, and best-practice guides. What You Bring * 8+ years in IT, with 3+ in cloud architecture or GCP solutions * Proven hands-on experience with Google Cloud Platform services * Strong knowledge of data architecture, ETL/ELT, and data warehousing (BigQuery) * Solid coding skills in SQL, Python (preferred), Java, or Scala * Familiarity with Airflow, DBT, and version control (e.g., Git) * Industry experience in e-commerce, retail, or CPG * Excellent communication and problem-solving skills You'll Thrive Here If You Are: * A fast learner who keeps up with evolving tech * A proactive problem-solver who owns their work end-to-end * A collaborative team player who values knowledge sharing * Passionate about creating real business impact through data Join us and help shape the future of data-driven decision-making in a fast-paced, high-impact environment. Apply now if you're ready to build solutions that matter. Salary Range: USD 70-100k
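For context on the Airflow/BigQuery style of ELT described above, the sketch below shows a single Airflow task that loads newline-delimited JSON from Cloud Storage into BigQuery. It is illustrative only and assumes a recent Airflow 2.x release; the project, dataset, bucket, and schedule are hypothetical.

```python
# Illustrative sketch only; project, dataset, bucket, and schedule are hypothetical.
# Pattern: an ELT-style Airflow task that loads a GCS export into BigQuery.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery   # pip install google-cloud-bigquery

def load_orders_to_bq() -> None:
    client = bigquery.Client(project="example-ecom")          # hypothetical project
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
    client.load_table_from_uri(
        "gs://example-ecom-raw/orders/*.json",                 # hypothetical bucket
        "example-ecom.raw.orders",                             # hypothetical table
        job_config=job_config,
    ).result()  # block until the load job finishes

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",      # `schedule` assumes Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="load_orders", python_callable=load_orders_to_bq)
```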
    $74k-99k yearly est. Auto-Apply 60d+ ago
  • Manager, Data Operations, Data Engineer

    KPMG 4.8company rating

    Data scientist job in Grand Rapids, MI

    Known for being a great place to work and build a career, KPMG provides audit, tax and advisory services for organizations in today's most important industries. Our growth is driven by delivering real results for our clients. It's also enabled by our culture, which encourages individual development, embraces an inclusive environment, rewards innovative excellence and supports our communities. With qualities like those, it's no wonder we're consistently ranked among the best companies to work for by Fortune Magazine, Consulting Magazine, Seramount, Fair360 and others. If you're as passionate about your future as we are, join our team. KPMG is currently seeking a Manager of Data Engineering to join our Digital Nexus technology organization. This is a hybrid work opportunity. Responsibilities: * Lead a team of Azure Data Lake and business intelligence engineers in designing and delivering ADL Pipelines, Notebooks and interactive Power BI dashboards that clearly communicate actionable insights to stakeholders; contribute strategic thought leadership to shape the firm's business intelligence vision and standards * Design and maintain scalable data pipelines using Azure Data Factory and Databricks to ingest, transform, and deliver data across medallion architecture layers; develop production-grade ETL/ELT solutions using PySpark and SQL to produce analytics-ready Delta Lake datasets aligned with enterprise standards * Apply critical thinking and creativity to design innovative, non-standard BI solutions that address complex and evolving business challenges; design, build, and optimize data models to support analytics, ensuring accuracy, reliability, and efficiency * Stay ahead of emerging technologies including Generative AI and AI agents to identify novel opportunities that improve analytics, automation, and decision-making across the enterprise * Manage and provide technical expertise and strategic guidance to counselees (direct reports), department peers, and cross-functional team members; set goals, participate in strategic initiatives for the team, and foster the development of high-performance teams * Act with integrity, professionalism, and personal responsibility to uphold KPMG's respectful and courteous work environment Qualifications: * Minimum seven years of recent experience designing and building ADL pipelines, Databricks notebooks, and interactive dashboards using modern business intelligence tools (preferably Power BI); minimum two years of recent experience designing scalable data pipelines using Azure Data Factory and Azure Databricks to support ingestion, transformation, and delivery of data across medallion architecture layers * Bachelor's degree from an accredited college or university is preferred; minimum of a high school diploma or GED is required * Demonstrated analytical and problem-solving abilities, with a creative and methodical approach to addressing complex challenges * Advanced knowledge of SQL, DAX, and data modeling concepts; proven track record in defining, managing, and delivering BI projects; ability to participate in the development of resource plans and influence organizational priorities * Excellent written and verbal communication skills, including the ability to effectively present proposals and vision to executive leadership * Applicants must be authorized to work in the U.S. without the need for employment-based visa sponsorship now or in the future; KPMG LLP will not sponsor applicants for U.S.
work visa status for this opportunity (no sponsorship is available for H-1B, L-1, TN, O-1, E-3, H-1B1, F-1, J-1, OPT, CPT or any other employment-based visa) KPMG LLP and its affiliates and subsidiaries ("KPMG") complies with all local/state regulations regarding displaying salary ranges. If required, the ranges displayed below or via the URL below are specifically for those potential hires who will work in the location(s) listed. Any offered salary is determined based on relevant factors such as applicant's skills, job responsibilities, prior relevant experience, certain degrees and certifications and market considerations. In addition, KPMG is proud to offer a comprehensive, competitive benefits package, with options designed to help you make the best decisions for yourself, your family, and your lifestyle. Available benefits are based on eligibility. Our Total Rewards package includes a variety of medical and dental plans, vision coverage, disability and life insurance, 401(k) plans, and a robust suite of personal well-being benefits to support your mental health. Depending on job classification, standard work hours, and years of service, KPMG provides Personal Time Off per fiscal year. Additionally, each year KPMG publishes a calendar of holidays to be observed during the year and provides eligible employees two breaks each year where employees will not be required to use Personal Time Off; one is at year end and the other is around the July 4th holiday. Additional details about our benefits can be found towards the bottom of our KPMG US Careers site at Benefits & How We Work. Follow this link to obtain salary ranges by city outside of CA: ********************************************************************** KPMG offers a comprehensive compensation and benefits package. KPMG is an equal opportunity employer. KPMG complies with all applicable federal, state and local laws regarding recruitment and hiring. All qualified applicants are considered for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, citizenship status, disability, protected veteran status, or any other category protected by applicable federal, state or local laws. The attached link contains further information regarding KPMG's compliance with federal, state and local recruitment and hiring laws. No phone calls or agencies please. KPMG recruits on a rolling basis. Candidates are considered as they apply, until the opportunity is filled. Candidates are encouraged to apply expeditiously to any role(s) for which they are qualified that is also of interest to them. Los Angeles County applicants: Material job duties for this position are listed above. Criminal history may have a direct, adverse, and negative relationship with some of the material job duties of this position. These include the duties and responsibilities listed above, as well as the abilities to adhere to company policies, exercise sound judgment, effectively manage stress and work safely and respectfully with others, exhibit trustworthiness, and safeguard business operations and company reputation. Pursuant to the California Fair Chance Act, Los Angeles County Fair Chance Ordinance for Employers, Fair Chance Initiative for Hiring Ordinance, and San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
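To ground the medallion-architecture work this posting describes, here is a minimal PySpark sketch of a bronze-to-silver refinement step writing to Delta Lake. It is not KPMG code; it assumes a Databricks-style environment with Delta Lake available, and the table and column names are hypothetical.

```python
# Illustrative sketch only. Assumes a Spark environment with Delta Lake
# configured (e.g., Databricks); table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

bronze = spark.read.table("bronze.engagement_events")    # raw ingested records

silver = (
    bronze
    .filter(F.col("event_id").isNotNull())               # drop unusable rows
    .withColumn("event_ts", F.to_timestamp("event_ts"))  # normalize types
    .dropDuplicates(["event_id"])                        # enforce one row per event
    .select("event_id", "client_id", "event_ts", "amount")
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("silver.engagement_events")             # analytics-ready layer
)
```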
    $49k-69k yearly est. 27d ago
  • Mfg. Product Data Engineer

    Leggett & Platt, Incorporated 4.4company rating

    Data scientist job in Grand Rapids, MI

    We make life more comfortable. Leggett & Platt's overall mission is a commitment to enhance lives - by delivering quality products, offering empowering and rewarding careers, and doing our part in bringing about a better future. Leggett & Platt's inventive heritage and leadership in the residential products industry span more than 130 years. As The Components People, we are the leading supplier of a wide range of products and components for all areas of life, including mattress springs and carpet cushion, as well as bedding machinery and erosion-control products. From aerospace tubing and fabricated assemblies to flooring underlayment and carpet cushion, Leggett & Platt has divisions that design, manufacture, and sell a variety of products. Our reliable product development and launch capability, coupled with our global footprint, make us a trusted partner for customers in the aerospace, hydraulic cylinders, flooring, textile, and geo components industries. Learn more about the history of Leggett: *************************** This job is located in Grand Rapids, Michigan and is on site. There is no relocation assistance for this role. Department / Division: Engineering Summary: Responsible for all computer related product data such as three-dimensional model definitions, fully dimensioned drawings, and all necessary system requirements. Essential Duties and Responsibilities: (Other duties may be assigned) * 5.1. Create three-dimensional product definition from customer and product engineering specifications. * 5.2. Create fully dimensioned product drawings for customer approval and quality check sheets. * 5.3. Create three-dimensional tool definition (tooling data sheets TDS) from manufacturing engineering specifications. * 5.4. Create fully dimensioned tool drawings as required. * 5.5. Responsible for revision control and maintaining drawing files. * 5.6. Responsible for computer system troubleshooting and upgrade installations. * 5.7. Responsible for managing customer computer files as required. * 5.8. Plans and formulates engineering program and organizes project staff according to project requirements. * 5.9. Reviews product design for compliance with engineering principles, company standards, and customer contract requirements, and related specifications. * 5.10. Coordinates activities concerned with technical developments, scheduling, and resolving engineering design and test problems. * 5.11. Directs integration of technical activities and products. * 5.12. Evaluates and approves design changes, specifications, and drawing releases. * 5.13. Controls expenditures within limitations of project budget. * 5.14. Prepares interim and completion project status reports. * 5.15. Coordinates first product runs including sample submission and product verification studies. * 5.16. Controls all aspects of engineering related quality issues including customer follow up visits. Verifies production personnel follow up visits on production related quality issues. * 5.17. Issues engineering changes as requested from customer including internal production modifications and cost changes. * 5.18. Issues internal engineering changes related to processes and material changes. * 5.19. Follows products from initial quote through production release servicing the account with engineering changes as required by the customer. Supervisory Responsibilities: Coordinates employees involved with projects including accounting, tooling and production. 
Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws. Responsibilities: planning, assigning, and directing work, addressing complaints and resolving problems. Qualification Requirements: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. * 7.1. Education and/or Experience: Associate's degree (A. A.) or equivalent from two-year college or technical school; three years related experience and/or training; or equivalent combination of education and experience. * 7.2. Language Skills: Ability to read and interpret documents such as safety rules, operating and maintenance instructions, and procedure manuals. Ability to write routine reports and correspondence. Ability to speak effectively before groups of customers or employees of organization. * 7.3. Mathematical Skills: Ability to calculate figures and amounts such as proportions, percentages, area, circumference, and volume. Ability to apply concepts of basic algebra, geometry and trigonometry. * 7.4. Reasoning Ability: Ability to define problems, collect data, establish facts, and draw valid conclusions. Ability to interpret an extensive variety of technical instructions in mathematical or diagram form and deal with several abstract and concrete variables. * 7.5. Certificates, Licenses, Registrations: Other Skills and Abilities: Physical Demands: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. * 9.1. While performing the duties of this job, the employee is frequently required to stand; walk; use hands to finger, handle, or feel objects, tools, or controls; and talk or hear. The employee is occasionally required to sit and reach with hands and arms. * 9.2. The employee must occasionally lift and/or move up to 10 pounds. Specific vision abilities required by this job include color vision. Work Environment: The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. * 10.1. While performing the duties of this job, the employee occasionally works near moving mechanical parts and is occasionally exposed to fumes or airborne particles. * 10.2. The noise level in the work environment is usually quiet. What to Do Next Now that you've had a chance to learn more about us, what are you waiting for! Apply today and allow us the opportunity to learn more about you and the value you can bring to our team. Once you apply, be sure to create a profile, and sign up for job alerts, so you can be the first to know when new opportunities become available. If you require assistance completing an application, please contact our team at ******************* Our Values Our values speak to our shared beliefs, and describe how we approach working together. 
* Put People First reflects our commitment to safety and care of each other, learning and development, and creating an inclusive environment of mutual respect, empathy and belonging. * Do the Right Thing focuses us on acting with honesty and integrity, delivering the results the right way, taking pride in our work, and speaking the truth - good or bad. * Do Great Work…Together occurs when we engage without hierarchy, collaborate as a team, embrace challenges, and work for the good of all of us. * Take Ownership and Raise the Bar demonstrates our responsibility to add value and make a difference, challenge the status quo and biases to make things better, foster innovative and creative solutions to drive impact, and explore new perspectives and embrace change. Our Commitment to You We're actively taking steps to make sure our culture is inclusive and that our processes and practices promote equity for all. Leggett & Platt is comprised of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veteran status, and more. Join us! We welcome and encourage applications if you meet the minimum qualifications. Even if you do not meet the preferred qualifications, we'd love the opportunity to consider you. For more information about how we handle your personal data in connection with our recruiting processes, please refer to the Recruiting Privacy Notice on the "Privacy Notice" tab located at ************************** IMPORTANT NOTICE TO APPLICANTS Regarding Equal Employment / Equal Access Leggett & Platt, Incorporated, and all its Canadian affiliates value the diversity of skills, knowledge, and perspectives our employees bring to our workplace. We believe the presence in our workforce of individuals with characteristics protected under local human rights legislation, including but not limited to females, minorities, individuals with disabilities, veterans, persons from all faiths and national origins and of all colors reward us with greater talent, new ideas and unique perspectives. It is therefore our policy to recruit, hire, promote, transfer, administer benefits and handle all other conditions of employment in a non-discriminatory manner without discrimination regarding any protected status under local human rights legislation. Our Chief Executive Officer has requested that all of our employees make equal opportunity a priority. We are committed to workplace accessibility and accommodation of persons with disabilities. Should you have any need for accommodation at any stage of the hiring or recruitment process, please let us know and we will work with you to provide appropriate accommodation or address any accessibility-related needs. This information will be handled as confidentially as practical. You are welcome to contact a member of our Corporate Employee Relations staff at ************ if you need assistance to apply, wish to discuss an accommodation, or have questions about this notice.
    $80k-103k yearly est. 7d ago
  • Lead Data Scientist, Strategic Analytics - Data Science

    Deloitte 4.7company rating

    Data scientist job in Grand Rapids, MI

    Are you passionate about harnessing data to drive business strategy and shape decision-making at the highest levels? Do you thrive at the intersection of finance, analytics, and innovative problem-solving? If so, Deloitte invites you to step into a pivotal role as a Lead Data Scientist. At Deloitte, you'll work alongside a team of accomplished professionals, delivering impactful financial planning and advanced analytics for our Strategic Analytics function in a rapidly evolving business landscape. As a strategic advisor to executive leaders, you'll have the opportunity to influence major decisions, develop cutting-edge data science solutions, and see your insights translate into real-world results. We offer an environment built for growth, for both your technical and leadership abilities. You'll collaborate on high-visibility projects, contribute directly to business outcomes, and expand your expertise with the resources of a top-tier firm. If you're driven to make a difference and ready for meaningful professional and personal development, consider joining our Data Science team at Deloitte Strategic Analytics. The team and the role The Strategic Analytics team is dedicated to delivering actionable insights to finance and operational leaders, empowering them to make data-driven decisions that drive organizational success. By leveraging advanced analytical techniques and cutting-edge technologies, the team transforms complex data into clear, strategic recommendations. Our team members collaborate closely with various departments to understand their unique challenges and opportunities, ensuring that our analyses are both relevant and impactful. Join us to be at the forefront of strategic decision-making, where your analytical skills will directly contribute to shaping the future of our Firm. Recruiting for this role ends on December 14, 2025. Key responsibilities: * Lead Data-Driven Consulting Projects: Manage end-to-end delivery of analytics and strategic initiatives for clients, translating business objectives into actionable data science solutions. * Advanced Data Analysis: Develop statistical models and machine learning algorithms to solve complex client problems and inform decision-making. * Strategic Planning & Execution: Support and drive high-impact projects by evaluating business challenges, designing project roadmaps, and implementing analytic solutions. * Stakeholder Engagement: Collaborate with cross-functional teams (including business, technology, and finance leads) to understand requirements, secure buy-in, and communicate findings effectively. * Insight Generation & Reporting: Transform large and complex data sets into meaningful insights, and communicate findings via presentations and dashboards tailored for executive audiences. * Continuous Improvement: Stay up to date with emerging data science trends, tools, and best practices to enhance project outcomes and client value. * Financial Analysis: Apply financial acumen to project work, offering insights on revenue impact, cost optimization, and other finance-related considerations. Required Qualifications * Education: Bachelor's or Master's degree in a quantitative field (e.g., Data Science, Mathematics, Statistics, Engineering, Business Analytics, Finance, or related discipline).
* Minimum of 3+ years of relevant experience * Data Science Techniques: Proficient in advanced analytics, statistical modeling, machine learning, and data visualization tools (such as Python, R, SQL, Tableau, or Power BI). * Strategic Thinking: Demonstrated ability to translate analytical findings into actionable business strategies and recommendations. * Consulting Experience: Proven track record of delivering value in a client-facing or advisory capacity. * Communication: Demonstrated ability to explain technical concepts to non-technical stakeholders and craft executive presentations. * Ability to travel 0-5 %, on average, based on the work you do and the clients and industries/sectors you serve * Limited immigration sponsorship may be available Preferred Qualifications: * Finance Acumen: Familiarity with financial principles, statements, and metrics; previous experience supporting finance-related projects or functions is a plus. * Familiarity with Databricks and Azure platforms. The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $102,500 to $188,900. You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance. Information for applicants with a need for accommodation: ************************************************************************************************************ EA_ExpHire EA_FA_ExpHire Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Learn more. Professional development From entry-level employees to senior leaders, we believe there's always room to learn. 
We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. As used in this posting, "Deloitte" means Deloitte Services LP, a subsidiary of Deloitte LLP. Please see ************************* for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law. Requisition code: 317999 Job ID 317999
    $63k-85k yearly est. 13d ago
  • Data Scientist III

    Meijer 4.5company rating

    Data scientist job in Grand Rapids, MI

    As a family company, we serve people and communities. When you work at Meijer, you're provided with career and community opportunities centered around leadership, personal growth, and development. Consider joining our family - take care of your career and your community! Meijer Rewards: weekly pay; scheduling flexibility; paid parental leave; paid education assistance; team member discount; and development programs for advancement and career growth. Please review the job profile below and apply today! The Data Science team at Meijer leads the strategy, development, and integration of Machine Learning and Artificial Intelligence at Meijer. Data Scientists on the team will drive customer loyalty, digital conversion, and system efficiencies by delivering innovative data-driven solutions. Through these solutions, the data scientists drive material business value, mitigate business and operational risk, and significantly impact the customer experience. This role works directly with product development, merchandising, marketing, operations, ITS, ecommerce, and vendor partners. What You'll Be Doing: Deliver against the overall data science strategy to drive in-store and digital merchandising, marketing, customer loyalty, and operational performance. Partner with product development to define requirements which meet system and customer experience needs for data science projects. Partner with Merchandising, Supply Chain, Operations, and customer insights to understand the journey that will be improved with the data science deliverables. Build prototypes for, and iteratively develop, end-to-end data science pipelines including custom algorithms, statistical models, machine learning, and artificial intelligence functions to meet end user needs. Partner with product development and technology teams to deploy pipelines into the production MLOps environment following SAFe Agile methodology. Proactively identify data-driven solutions for strategic cross-functional initiatives, develop and present business cases, and gain stakeholder alignment on the solution. Define, document, follow, and approve best practices for ML/AI development at Meijer. Own communication with data consumers (internal and external) to ensure they understand the data science products, have the proper training, and are following best practices in applying data science products. Define and analyze Key Performance Indicators to monitor the usage, adoption, health, and value of the data products. Identify and scope, in conjunction with IT, the architecture of systems and environment needs to ensure that data science systems can deliver against strategy. Build and maintain relationships with key partners, suppliers, and industry associations, and continue to advance data science capabilities, knowledge, and impact. This job profile is not meant to be all-inclusive of the responsibilities of this position; the role may perform other duties as assigned or required. What You'll Bring: Advanced degree (MA/MS, PhD) in Mathematics, Statistics, Economics, or a related quantitative field. Certifications: Azure Data Science Associate, Azure AI, SAFe Agile. 6+ years of relevant data science experience in an applied role, preferably within retail, logistics, supply chain, or CPG. Advanced, hands-on experience using Python, Databricks, Azure ML, Azure Cognitive Services, SAS, R, SQL, PySpark, NumPy, Pandas, scikit-learn, TensorFlow, PyTorch, AutoTS, Prophet, and NLTK. Experience with Azure Cloud technologies including Azure DevOps, Azure Synapse, MLOps, and GitHub. Solid experience working with large datasets and developing ML/AI systems such as natural language processing, speech/text/image recognition, supervised and unsupervised learning models, and forecasting and/or econometric time series models. Proactive and action-oriented. Ability to collaborate with, and present to, internal and external partners. Able to learn company systems, processes, and tools, and identify opportunities to improve. Detail-oriented and organized. Ability to meet production deadlines. Strong communication, interpersonal, and organizational skills. Excellent written and verbal communication skills. Understanding of intellectual property rights, compliance, and enforcement.
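Since the stack above lists Prophet among the forecasting tools, here is a minimal, purely illustrative Python sketch of the kind of time-series forecast it supports. The input file, column names, and horizon are hypothetical, not details from the posting.

```python
# Illustrative sketch only. The extract file, column names, and horizon are
# hypothetical; Prophet expects columns named `ds` (timestamp) and `y` (target).
import pandas as pd
from prophet import Prophet   # pip install prophet

# Daily units sold for one item/store combination (hypothetical extract).
history = pd.read_csv("item_store_daily_sales.csv")
history = history.rename(columns={"sales_date": "ds", "units_sold": "y"})

model = Prophet(weekly_seasonality=True, yearly_seasonality=True)
model.fit(history)

future = model.make_future_dataframe(periods=28)   # 4-week forecast horizon
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```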
    $73k-93k yearly est. Auto-Apply 20d ago
  • Databricks Data Engineer - Manager - Consulting - Location Open 1

    EY 4.7company rating

    Data scientist job in Grand Rapids, MI

    At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. **Technology - Data and Decision Science - Data Engineering - Manager** We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills. **The opportunity:** In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include: + Understanding and analyzing business requirements to translate them into technical requirements. + Designing, building, and operating scalable data architecture and modeling solutions. + Staying up to date with the latest trends and emerging technologies to maintain a competitive edge. **Key Responsibilities:** As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including: + Leading workstream delivery and ensuring quality in all processes. + Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services. + Implementing resource plans and budgets while managing engagement economics. This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs. **Skills and attributes for success:** To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact: + Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP). + Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices. + Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement. + Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value. + Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance. + Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services. + Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress. + Manage client relationships and expectations, ensuring high levels of satisfaction and engagement. + Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics. + Strong analytical and problem-solving abilities. 
+ Excellent communication skills, with the ability to convey complex information clearly. + Proven experience in managing and delivering projects effectively. + Ability to build and manage relationships with clients and stakeholders. **To qualify for the role, you must have:** + Bachelor's degree in computer science, Engineering, or a related field required; Master's degree preferred. + Typically, no less than 4 - 6 years relevant experience in data engineering, with a focus on cloud data solutions and analytics. + Proven expertise in Databricks and experience with Spark for big data processing. + Strong background in data architecture and design, with experience in building complex cloud analytics solutions. + Experience in leading and managing teams, with a focus on mentoring and developing talent. + Strong programming skills in languages such as Python, Scala, or SQL. + Excellent problem-solving skills and the ability to work independently and as part of a team. + Strong communication and interpersonal skills, with a focus on client management. **Required Expertise for Managerial Role:** + **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision. + **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope. + **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively. + **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption. + **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies. + **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes. + **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients. **Large-Scale Implementation Programs:** 1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities. 2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing. 3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting. **Ideally, you'll also have:** + Experience with advanced data analytics tools and techniques. + Familiarity with machine learning concepts and applications. + Knowledge of industry trends and best practices in data engineering. + Familiarity with cloud platforms (AWS, Azure, GCP) and their data services. + Knowledge of data governance and compliance standards. + Experience with machine learning frameworks and tools. 
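As a concrete, purely illustrative companion to the Databricks and real-time analytics work referenced in this posting, here is a minimal Spark Structured Streaming sketch that ingests JSON files into a Delta table. It assumes a Databricks-style environment with Delta Lake configured; the schema, paths, and checkpoint location are hypothetical.

```python
# Illustrative sketch only. Assumes a Spark environment with Delta Lake
# configured (e.g., Databricks); schema, paths, and checkpoints are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("txn_ts", TimestampType()),
])

stream = (
    spark.readStream.format("json")
    .schema(schema)                          # file-based streams require an explicit schema
    .load("/mnt/landing/transactions/")      # hypothetical landing path
)

(
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/transactions/")
    .outputMode("append")
    .start("/mnt/delta/bronze/transactions/")  # hypothetical bronze table path
)
```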
**What we look for:** We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you. FY26NATAID **What we offer you** At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more . + We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. + Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year. + Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. **Are you ready to shape your future with confidence? Apply today.** EY accepts applications for this position on an on-going basis. For those living in California, please click here for additional information. EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities. **EY | Building a better working world** EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories. EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law. EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. 
If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
    $77k-112k yearly est. 9d ago

Learn more about data scientist jobs

How much does a data scientist earn in Comstock, MI?

The average data scientist in Comstock, MI earns between $59,000 and $109,000 annually. This compares to the national average data scientist range of $75,000 to $148,000.

Average data scientist salary in Comstock, MI

$80,000