Cybersecurity Data & AI Consultant
Data engineer job in Grand Rapids, MI
Consultant - Cyber Defense & Resilience - Security Operations Are you interested in working in a dynamic environment that offers opportunities for professional growth and new responsibilities? If so, Deloitte could be the place for you. Traditional security programs have often been unsuccessful in unifying the need to both secure and support technology innovation required by the business. Join Deloitte's Cyber Defense & Resilience (D&R) Security Operations team and become a member of the largest group of cybersecurity professionals worldwide.
Recruiting for this role ends on 12/31/2025
The Team
Cyber Defense & Resilience is an integrated team of security and data technologists working at the intersection of cybersecurity, advanced cyber data engineering, and the use of AI and ML for cyber defense and operations. We serve as a trusted advisor and managed service provider, bringing a mix of capability and capacity across security data modernization, data ops, AI, and ML, and applying these disciplines to cyber-specific solutions. Through our unrivaled breadth and depth of services across every major industry and domain, we help our clients run smarter, faster, and more efficiently. With Deloitte's AI & Data, our clients have the support they need to continuously develop, innovate, automate, scale, and operate in service of organizational performance and growth.
Cyber Detect & Respond practitioners work with our clients to:
+ Design and modernize large scale cyber data and analytics programs that promote organizational intelligence, provide embedded capacity, and implement as-a-service based subscription models at scale
+ Harness the potential of bleeding-edge cyber big data and AI technologies such as Databricks for Cyber, AWS Security Lake, and Google SecOps, along with the latest from traditional security providers like Splunk, CrowdStrike, Palo Alto, and others.
+ Enable day-to-day operations, maintenance and ongoing enhancements of their data platforms and applications as well as managing and governing the underlying data leveraging standardized, automated and AI enabled Data Ops capabilities
+ Mature their AI and Analytics journey with fluid capacity and flexible capability of AI and Analytics experts, complemented with experience-hardened assets and curated data sets to experiment with AI and scale their AI and Analytics ambitions.
Qualifications:
Required:
+ 2-4 years of relevant Analytics consulting or industry experience
+ 2-4 years of experience with AI development tools such as vector databases (Pinecone, Elastic, etc.) and AI development frameworks (LangChain, CrewAI, etc.)
+ 2-4 years of experience in statistical analysis, machine learning, and data mining techniques.
+ 2-4 years of experience using statistical computer languages (Python, SQL, R, SAS, etc.) to prepare data for analysis, visualize data as part of exploratory analysis, generate features, and other similar data science driven data handling
+ 2-4 years of experience using cyber security cloud platforms (Google SecOps, AWS, Azure, etc.)
+ 1-4 years of experience with SOC threat hunting and incident response
+ Demonstrated expertise with one full life cycle analytics engagement across strategy, design and implementation.
+ Bachelor's Degree in Engineering, Mathematics, or Empirical Statistics, or 4 years of equivalent professional experience
+ Ability to travel up to 50%, on average, based on the work you do and the clients and industries/sectors you serve
+ Limited immigration sponsorship may be available.
Preferred:
+ Experience architecting, designing, developing, and deploying enterprise data science solutions that include components across the artificial intelligence spectrum, such as NLP, chatbots, virtual assistants, computer vision, and cognitive services, as well as the use of big data tools for the management of massive datasets.
+ Knowledge of the intersection of AI/ML/advanced data engineering and cybersecurity-specific use cases for detection and cyber threat response acceleration.
+ Experience parsing and normalizing cyber or IT specific telemetry datasets
+ Expertise in Python machine and deep learning frameworks and libraries, e.g. PyTorch, Keras, Tensorflow, Scikit-learn, Numpy, SciPy
+ Experience designing and implementing Apache open-source frameworks (Kafka, Storm, Spark) to support the end-to-end data management life cycle
+ Ability to work independently and manage multiple task assignments.
+ Strong oral and written communication skills, including presentation skills (MS Visio, MS PowerPoint).
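The telemetry parsing and normalization work mentioned in the preferred qualifications can be sketched minimally as follows; the log format, regular expression, and ECS-style field names are all illustrative assumptions, not any specific vendor's schema:

```python
import re
from datetime import datetime, timezone

# Hypothetical firewall log format; pattern and field names are illustrative.
LOG_PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+"
    r"(?P<action>ALLOW|DENY)\s+"
    r"(?P<src>\d{1,3}(?:\.\d{1,3}){3}):(?P<sport>\d+)\s+->\s+"
    r"(?P<dst>\d{1,3}(?:\.\d{1,3}){3}):(?P<dport>\d+)"
)

def normalize(line: str) -> dict:
    """Parse one raw log line into a flat, typed record."""
    m = LOG_PATTERN.match(line.strip())
    if not m:
        raise ValueError(f"unparseable line: {line!r}")
    return {
        "@timestamp": datetime.fromisoformat(m["ts"])
                      .replace(tzinfo=timezone.utc).isoformat(),
        "event.action": m["action"].lower(),
        "source.ip": m["src"],
        "source.port": int(m["sport"]),
        "destination.ip": m["dst"],
        "destination.port": int(m["dport"]),
    }

record = normalize("2025-01-15T08:30:00 DENY 10.0.0.5:51555 -> 203.0.113.9:443")
```

Real telemetry pipelines would handle many formats and malformed records, but the shape of the work, raw text in, normalized typed fields out, is the same.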
Information for applicants with a need for accommodation: ************************************************************************************************************
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $80,400 to $148,000.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Senior Data Engineer
Data engineer job in Home, MI
At CVS Health, we're building a world of health around every consumer and surrounding ourselves with dedicated colleagues who are passionate about transforming health care. As the nation's leading health solutions company, we reach millions of Americans through our local presence, digital channels and more than 300,000 purpose-driven colleagues - caring for people where, when and how they choose in a way that is uniquely more connected, more convenient and more compassionate.
And we do it all with heart, each and every day.
Position Summary
The Senior Data Engineer will be responsible for delivering high-quality modern data solutions through collaboration with our engineering, analyst, data scientist, and product teams in a fast-paced, agile environment, leveraging cutting-edge technology to reimagine how healthcare is provided.
You will be instrumental in designing, integrating, and implementing solutions on-premises, as well as supporting migrations of existing workloads to the cloud.
The Senior Data Engineer is expected to have extensive knowledge of modern programming languages, designing and developing data solutions.
The position is open in a data engineering team that is responsible for processing Claims, Revenue, Rx Medicare Specific files and data from 30+ payors into our Data Warehouse.
Required Qualifications
6+ years' experience in data engineering, working with SQL and relational database management systems
4+ years' experience in cloud technologies (Azure, GCP or AWS)
3+ years' experience with Microsoft SQL Server
3+ years' experience with Databricks
3+ years' experience programming and modifying code in languages like SQL, Python, and PySpark to support and implement cloud-based and on-prem data warehousing services
3+ years' hands-on experience with dimensional data modeling, schema design, and data warehousing
3+ years' hands-on experience with troubleshooting data issues
2+ years' hands-on experience with various performance improvement techniques
Preferred Qualifications
Familiarity with Azure
Willingness to identify and implement process improvements and best practices, as well as the ability to take ownership and work within a fast-paced, collaborative, and team-based support environment
Excellent oral and written communication skills
Familiarity with healthcare data and healthcare insurance feeds
Education
Bachelor's degree in Computer Science / Engineering
Anticipated Weekly Hours
40
Time Type
Full time
Pay Range
The typical pay range for this role is: $83,430.00 - $222,480.00
This pay range represents the base hourly rate or base annual full-time salary for all positions in the job grade within which this position falls.
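The dimensional modeling and warehousing experience called for above can be illustrated with a minimal star-schema sketch. The table and column names are invented for illustration, and SQLite stands in here for a production engine like SQL Server or Databricks:

```python
import sqlite3

# Minimal star schema: one payor dimension and one claims fact table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_payor (payor_key INTEGER PRIMARY KEY, payor_name TEXT);
    CREATE TABLE fact_claim (claim_id INTEGER, payor_key INTEGER, paid_amount REAL);
""")
con.executemany("INSERT INTO dim_payor VALUES (?, ?)",
                [(1, "Payor A"), (2, "Payor B")])
con.executemany("INSERT INTO fact_claim VALUES (?, ?, ?)",
                [(101, 1, 250.0), (102, 1, 90.0), (103, 2, 40.0)])

# Typical warehouse query: aggregate facts grouped by a dimension attribute.
rows = con.execute("""
    SELECT p.payor_name, ROUND(SUM(f.paid_amount), 2) AS total_paid
    FROM fact_claim f
    JOIN dim_payor p USING (payor_key)
    GROUP BY p.payor_name
    ORDER BY p.payor_name
""").fetchall()
```

The pattern, surrogate keys on dimensions, measures on facts, joins at query time, is the core of dimensional design regardless of the engine.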
The actual base salary offer will depend on a variety of factors including experience, education, geography and other relevant factors.
This position is eligible for a CVS Health bonus, commission or short-term incentive program in addition to the base pay range listed above.
Our people fuel our future.
Our teams reflect the customers, patients, members and communities we serve and we are committed to fostering a workplace where every colleague feels valued and that they belong.
Great benefits for great people We take pride in our comprehensive and competitive mix of pay and benefits - investing in the physical, emotional and financial wellness of our colleagues and their families to help them be the healthiest they can be.
In addition to our competitive wages, our great benefits include:Affordable medical plan options, a 401(k) plan (including matching company contributions), and an employee stock purchase plan.
No-cost programs for all colleagues including wellness screenings, tobacco cessation and weight management programs, confidential counseling and financial coaching.
Benefit solutions that address the different needs and preferences of our colleagues including paid time off, flexible work schedules, family leave, dependent care resources, colleague assistance programs, tuition assistance, retiree medical access and many other benefits depending on eligibility.
For more information, visit *************cvshealth.com/us/en/benefits
We anticipate the application window for this opening will close on: 12/18/2025
Qualified applicants with arrest or conviction records will be considered for employment in accordance with all federal, state and local laws.
Data Scientist III
Data engineer job in Grand Rapids, MI
As a family company, we serve people and communities. When you work at Meijer, you're provided with career and community opportunities centered around leadership, personal growth and development. Consider joining our family - take care of your career and your community!
Meijer Rewards
Weekly pay
Scheduling flexibility
Paid parental leave
Paid education assistance
Team member discount
Development programs for advancement and career growth
Please review the job profile below and apply today!
The Data Science team at Meijer leads the strategy, development, and integration of machine learning and artificial intelligence at Meijer. Data scientists on the team drive customer loyalty, digital conversion, and system efficiencies by delivering innovative, data-driven solutions. Through these solutions, data scientists deliver material business value, mitigate business and operational risk, and significantly impact the customer experience. This role works directly with product development, merchandising, marketing, operations, ITS, ecommerce, and vendor partners.
What You'll Be Doing:
Deliver against the overall data science strategy to drive in-store and digital merchandising, marketing, customer loyalty, and operational performance
Partner with product development to define requirements which meet system and customer experience needs for data science projects
Partner with Merchandising, Supply Chain, Operations and customer insights to understand the journey that will be improved with the data science deliverables
Build prototypes for, and iteratively develop, end-to-end data science pipelines including custom algorithms, statistical models, machine learning and artificial intelligence functions to meet end user needs
Partner with product development and technology teams to deploy pipelines into production MLOps environment following Safe Agile methodology
Proactively identify data driven solutions for strategic cross-functional initiatives, develop and present business cases, and gain stakeholder alignment of solution
Define, document, follow, and approve best practices for ML/AI development at Meijer
Own communication with data consumers (internal and external) to ensure they understand the data science products, have the proper training, and are following the best practices in application of data science products
Define and analyze Key Performance Indicators to monitor the usage, adoption, health and value of the data products
Identify and scope, in conjunction with IT, the architecture of systems and environment needs to ensure that data science systems can deliver against strategy
Build and maintain relationships with key partners, suppliers and industry associations and continue to advance data science capabilities, knowledge and impact
This job profile is not meant to be all inclusive of the responsibilities of this position; may perform other duties as assigned or required
What You'll Bring:
Advanced Degree (MA/MS, PhD) in Mathematics, Statistics, Economics, or related quantitative field
Certifications: Azure Data Science Associate, Azure AI, Safe Agile
6+ years of relevant data science experience in an applied role - preferable w/in retail, logistics, supply chain or CPG
Advanced and hands on experience using: Python, Databricks, Azure ML, Azure Cognitive Service, SAS, R, SQL, PySpark, Numpy, Pandas, Scikit Learn, TensorFlow, PyTorch, AutoTS, Prophet, NLTK
Experience with Azure Cloud technologies including Azure DevOps, Azure Synapse, MLOps, GitHub
Solid experience working with large datasets and developing ML/AI systems such as: natural language processing, speech/text/image recognition, supervised and unsupervised learning models, forecasting and/or econometric time series models
Proactive and action oriented
Ability to collaborate with, and present to internal and external partners
Able to learn company systems, processes and tools, and identify opportunities to improve
Detail oriented and organized
Ability to meet production deadlines
Strong communications, interpersonal and organizational skills
Excellent written and verbal communication skills
Understanding of intellectual property rights, compliance and enforcement
Principal Data Scientist
Data engineer job in Grand Rapids, MI
Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This position requires occasional travel to the DC area for client meetings.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
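A toy sketch of the operational-data analysis described in the duties above: flagging outliers in a metric series with a simple standard-deviation threshold. The data and the 2-sigma cutoff are illustrative assumptions, not a Maximus standard:

```python
from statistics import mean, stdev

# Invented daily call-volume series with one obvious spike at index 6.
daily_call_volume = [100, 102, 98, 101, 99, 103, 180, 100, 97, 101]

mu = mean(daily_call_volume)
sigma = stdev(daily_call_volume)

# Flag days more than 2 standard deviations from the mean
# (a common heuristic; a single large spike inflates sigma,
# so robust methods like median absolute deviation are often preferred).
anomalies = [(i, v) for i, v in enumerate(daily_call_volume)
             if abs(v - mu) > 2 * sigma]
```

Turning a raw metric feed into a short list of days worth investigating is the kind of objective, actionable insight the role calls for.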
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Masters or BS in quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary
$156,740.00
Maximum Salary
$234,960.00
Databricks Data Engineer - Manager - Consulting - Location Open 1
Data engineer job in Grand Rapids, MI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Manager**
We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills.
**The opportunity:**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include:
+ Understanding and analyzing business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.
**Key Responsibilities:**
As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including:
+ Leading workstream delivery and ensuring quality in all processes.
+ Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services.
+ Implementing resource plans and budgets while managing engagement economics.
This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs.
**Skills and attributes for success:**
To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact:
+ Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices.
+ Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
+ Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value.
+ Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance.
+ Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics.
+ Strong analytical and problem-solving abilities.
+ Excellent communication skills, with the ability to convey complex information clearly.
+ Proven experience in managing and delivering projects effectively.
+ Ability to build and manage relationships with clients and stakeholders.
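The ingestion, transformation, and storage responsibilities listed above can be sketched as a minimal batch pipeline. A production version would typically run as PySpark on Databricks; the standard library stands in here, and all file, table, and column names are illustrative:

```python
import csv
import io
import sqlite3

# Ingest: a raw CSV feed (inlined here; normally read from cloud storage).
raw = io.StringIO("order_id,amount\n1, 19.99 \n2,5.00\n3,not_a_number\n")

# Transform: trim whitespace, enforce types, and drop unparseable records.
cleaned = []
for row in csv.DictReader(raw):
    try:
        cleaned.append((int(row["order_id"]), float(row["amount"].strip())))
    except ValueError:
        continue  # a real pipeline would quarantine bad records for review

# Store: load the cleaned rows into a warehouse table and verify with a query.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
total = con.execute("SELECT ROUND(SUM(amount), 2) FROM orders").fetchone()[0]
```

The same ingest-transform-store shape scales from this toy to Spark jobs over terabytes; what changes is the engine, not the structure.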
**To qualify for the role, you must have:**
+ Bachelor's degree in Computer Science, Engineering, or a related field required; Master's degree preferred.
+ Typically, no less than 4 - 6 years relevant experience in data engineering, with a focus on cloud data solutions and analytics.
+ Proven expertise in Databricks and experience with Spark for big data processing.
+ Strong background in data architecture and design, with experience in building complex cloud analytics solutions.
+ Experience in leading and managing teams, with a focus on mentoring and developing talent.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Managerial Role:**
+ **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision.
+ **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Large-Scale Implementation Programs:**
1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities.
2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing.
3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting.
**Ideally, you'll also have:**
+ Experience with advanced data analytics tools and techniques.
+ Familiarity with machine learning concepts and applications.
+ Knowledge of industry trends and best practices in data engineering.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for:**
We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Data Scientist
Data engineer job in Zeeland, MI
Why join us?
Our purpose is to design for the good of humankind. It's the ideal we strive toward each day in everything we do. Being a part of MillerKnoll means being a part of something larger than your work team, or even your brand. We are redefining modern for the 21st century. And our success allows MillerKnoll to support causes that align with our values, so we can build a more sustainable, equitable, and beautiful future for everyone.
About the Role
We're looking for an experienced and adaptable Data Scientist to join our growing AI & Data Science team. You'll be part of a small, highly technical group focused on delivering impactful machine learning, forecasting, and generative AI solutions.
In this role, you'll work closely with stakeholders to translate business challenges into well-defined analytical problems, design and validate models, and communicate results in clear, actionable terms. You'll collaborate extensively with our ML Engineer to transition solutions from experimentation to production, ensuring models are both effective and robust in real-world environments. You'll be expected to quickly prototype and iterate on solutions, adapt to new tools and approaches, and share knowledge with the broader organization. This is a hands-on role with real impact and room to innovate.
Key Responsibilities
Partner with business stakeholders to identify, scope, and prioritize data science opportunities.
Translate complex business problems into structured analytical tasks and hypotheses.
Design, develop, and evaluate machine learning, forecasting, and statistical models, considering fairness, interpretability, and business impact.
Perform exploratory data analysis, feature engineering, and data preprocessing.
Rapidly prototype solutions to assess feasibility before scaling.
Interpret model outputs and clearly communicate findings, implications, and recommendations to both technical and non-technical audiences.
Collaborate closely with the ML Engineer to transition models from experimentation into scalable, production-ready systems.
Develop reproducible code, clear documentation, and reusable analytical workflows to support org-wide AI adoption.
Stay up to date with advances in data science, AI/ML, and generative AI, bringing innovative approaches to the team.
Required Technical Skills
Bachelor's or Master's degree in Data Science, Statistics, Applied Mathematics, Computer Science, or a related quantitative field, with 3+ years of applied experience in data science.
Strong foundation in statistics, probability, linear algebra, and optimization.
Proficiency with Python and common data science libraries (Pandas, NumPy, Scikit-learn, XGBoost, PyTorch or TensorFlow).
Experience with time series forecasting, regression, classification, clustering, or recommendation systems.
Familiarity with GenAI concepts and tools (LLM APIs, embeddings, prompt engineering, evaluation methods).
Strong SQL skills and experience working with large datasets and cloud-based data warehouses (Snowflake, BigQuery, etc.).
Solid understanding of experimental design and model evaluation metrics beyond accuracy.
Experience with data visualization and storytelling tools (Plotly, Tableau, Power BI, or Streamlit).
Exposure to MLOps/LLMOps concepts and working in close collaboration with engineering teams.
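The skills list above calls out model evaluation metrics beyond accuracy. As a minimal illustration of why that matters, here is a plain-Python sketch of precision, recall, and F1 on an imbalanced sample (the labels are made up for demonstration; in practice you would use a library such as scikit-learn):

```python
# Illustrative sketch: evaluation metrics beyond accuracy for a binary classifier.
def confusion_counts(y_true, y_pred):
    """Count true positives, false positives, false negatives, true negatives."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def precision_recall_f1(y_true, y_pred):
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# On an imbalanced sample, accuracy alone is misleading:
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 0, 0, 0, 0, 0, 1, 0]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.9
p, r, f1 = precision_recall_f1(y_true, y_pred)  # recall is only 0.5
```

Here the classifier scores 90% accuracy while missing half of the positive class, which is exactly the gap recall exposes.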
Soft Skills & Qualities
Excellent communication skills with the ability to translate analysis into actionable business recommendations.
Strong problem-solving abilities and business acumen.
High adaptability to evolving tools, frameworks, and industry practices.
Curiosity and continuous learning mindset.
Stakeholder empathy and ability to build trust while introducing AI solutions.
Strong collaboration skills and comfort working in ambiguous, fast-paced environments.
Commitment to clear documentation and knowledge sharing.
Who We Hire
Simply put, we hire qualified applicants representing a wide range of backgrounds and abilities. MillerKnoll is composed of people of all abilities, gender identities and expressions, ages, ethnicities, sexual orientations, veterans from every branch of military service, and more. Here, you can bring your whole self to work. We're committed to equal opportunity employment, including veterans and people with disabilities.
This organization participates in E-Verify Employment Eligibility Verification. In general, MillerKnoll positions are closed within 45 days and are open for applications for a minimum of 5 days. We encourage prospective candidates to submit their application(s) promptly so as not to miss out on our opportunities. We frequently post new opportunities and encourage prospective candidates to check back often for new postings.
MillerKnoll complies with applicable disability laws and makes reasonable accommodations for applicants and employees with disabilities. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact MillerKnoll Talent Acquisition at careers_********************.
Data Engineer
Data engineer job in Grand Rapids, MI
Our client, a family-owned Midwestern grocery/retailer striving to better people's lives in all communities, seeks a senior-level Data Engineer.
The ideal candidate for this role has 5 to 10 years of relevant work experience.
Designs, modifies, develops, writes and implements software programming applications.
Supports and/or installs software applications/operating systems.
Participates in the testing process through test review and analysis, test witnessing and certification of software.
Familiar with a variety of the field's concepts, practices, and procedures.
Relies on experience and judgment to plan and accomplish goals.
Performs a variety of complicated tasks.
A wide degree of creativity and latitude is expected.
Data Engineer
Data engineer job in Grand Rapids, MI
Job Title: Data Engineer
Skill Category: Data Engineering
Work Location: Grand Rapids, MI
Onsite/Remote: Hybrid
Hourly C2C Rate: $80/hr
Job Description: We are looking for a Data Engineer with over 6 years of industry experience in business application design, development, implementation, and solution architecture. The ideal candidate should have experience with Databricks and building and designing data and analytics on enterprise solutions such as:
Azure Data Factory
Azure Function App
Log Analytics
Databricks
Synapse
Power BI
ADLS Gen2
Logic Apps
Required Skills:
Data Classification
Data Modeling
Data Architecture
Data Quality
Design
Network Components
Solution Architecture
Teamwork
Technical Training
.Net Core
Agile
C#
Data Warehouse
Infrastructure
Product Management
Functional Requirements
Interfaces
Research
Subsystems
Support
.Net
Python
Data Engineering
Data Analysis
Data Science
Responsibilities:
Design, code, test, and implement data movement, dashboarding, and analytical assets.
Develop system documentation according to SAFe Agile principles and industry standards.
Evaluate architectural options and define the overall architecture of enterprise Data Lake and Data Warehouse.
Provide subject matter expertise and technical consulting support on vendor or internal applications and interfaces.
Develop Azure Function App using C# and .Net Core.
Define functional and non-functional requirements, including performance monitoring, alerting, and code management.
Partner with business areas to gather requirements for Data and Analytics and design solutions.
Define major elements and subsystems and their interfaces.
Mentor and coach team members.
Engage with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
Interface with the Product Manager and IT partners to define and estimate features for agile teams.
Conduct industry research, facilitate new product and vendor evaluations, and assist in vendor selection.
Qualifications:
6+ years of industry experience in business application design, development, implementation, and solution architecture.
Experience with Azure Data Factory, Azure Function App, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, and Logic Apps.
Databricks experience is required.
Experience designing data pipelines and implementing data quality, governance, and security compliance in Azure architecture.
Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering, or related discipline or equivalent work experience and technical training.
Excellent written and oral communication skills.
Experience in Power BI, Data Modeling, Data Classification, Data Architecture, and reporting.
In-depth understanding of computer, storage, and network components, including backup, monitoring, and DR environment requirements.
Preferred knowledge and experience in Python and API architecture in Azure.
SAFe certification or training is a plus.
Experience with diverse technical configurations, technologies, and processing environments.
Exceptional interpersonal skills, including teamwork, facilitation, and negotiation.
Must Be Included with Submittal:
Full Legal Name
Phone
Email
Current Location
Rate
Work Authorization
Willingness to Relocate
Confirmation that the candidate is or will be on your W2
Manager, Data Operations, Data Engineer
Data engineer job in Grand Rapids, MI
Known for being a great place to work and build a career, KPMG provides audit, tax and advisory services for organizations in today's most important industries. Our growth is driven by delivering real results for our clients. It's also enabled by our culture, which encourages individual development, embraces an inclusive environment, rewards innovative excellence and supports our communities. With qualities like those, it's no wonder we're consistently ranked among the best companies to work for by Fortune Magazine, Consulting Magazine, Seramount, Fair360 and others. If you're as passionate about your future as we are, join our team.
KPMG is currently seeking a Manager of Data Engineering to join our Digital Nexus technology organization. This is a hybrid work opportunity.
Responsibilities:
* Lead a team of Azure Data Lake and business intelligence engineers in designing and delivering ADL Pipelines, Notebooks and interactive Power BI dashboards that clearly communicate actionable insights to stakeholders; contribute strategic thought leadership to shape the firm's business intelligence vision and standards
* Design and maintain scalable data pipelines using Azure Data Factory and Databricks to ingest, transform, and deliver data across medallion architecture layers; develop production-grade ETL/ELT solutions using PySpark and SQL to produce analytics-ready Delta Lake datasets aligned with enterprise standards
* Apply critical thinking and creativity to design innovative, non-standard BI solutions that address complex and evolving business challenges; design, build, and optimize data models to support analytics, ensuring accuracy, reliability, and efficiency
* Stay ahead of emerging technologies including Generative AI and AI agents to identify novel opportunities that improve analytics, automation, and decision-making across the enterprise
* Manage and provide technical expertise and strategic guidance to counselees (direct reports), department peers, and cross-functional team members; set goals, participate in strategic initiatives for the team, and foster the development of high-performance teams
* Act with integrity, professionalism, and personal responsibility to uphold KPMG's respectful and courteous work environment
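The medallion-architecture responsibility above (bronze raw ingestion, silver cleansing, gold analytics-ready datasets) can be sketched end to end. This is a plain-Python stand-in for what would be PySpark DataFrame transforms over Delta Lake tables in practice; the record shapes and field names are illustrative assumptions, not KPMG's actual schema:

```python
# Hedged sketch of medallion-architecture layering (bronze -> silver -> gold).

raw_events = [  # bronze: data as ingested, duplicates and bad rows included
    {"id": 1, "region": "west", "amount": "100.0"},
    {"id": 1, "region": "west", "amount": "100.0"},   # duplicate
    {"id": 2, "region": "east", "amount": None},      # invalid record
    {"id": 3, "region": "east", "amount": "250.5"},
]

def to_silver(events):
    """Silver layer: deduplicate by id, drop invalid rows, cast types."""
    seen, silver = set(), []
    for e in events:
        if e["id"] in seen or e["amount"] is None:
            continue
        seen.add(e["id"])
        silver.append({**e, "amount": float(e["amount"])})
    return silver

def to_gold(silver):
    """Gold layer: analytics-ready aggregate (revenue by region)."""
    totals = {}
    for e in silver:
        totals[e["region"]] = totals.get(e["region"], 0.0) + e["amount"]
    return totals

gold = to_gold(to_silver(raw_events))  # {'west': 100.0, 'east': 250.5}
```

Each layer is a pure transform of the one below it, which is what makes the pattern easy to test, replay, and schedule in a tool like Azure Data Factory.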
Qualifications:
* Minimum seven years of recent experience designing and building ADL pipelines, Databricks notebooks, and interactive dashboards using modern business intelligence tools (preferably Power BI); minimum two years of recent experience designing scalable data pipelines using Azure Data Factory and Azure Databricks to support ingestion, transformation, and delivery of data across medallion architecture layers
* Bachelor's degree from an accredited college or university is preferred; minimum of a high school diploma or GED is required
* Demonstrated analytical and problem-solving abilities, with a creative and methodical approach to addressing complex challenges
* Advanced knowledge of SQL, DAX, and data modeling concepts; proven track record in defining, managing, and delivering BI projects; ability to participate in the development of resource plans and influence organizational priorities
* Excellent written and verbal communication skills, including the ability to effectively present proposals and vision to executive leadership
* Applicants must be authorized to work in the U.S. without the need for employment-based visa sponsorship now or in the future; KPMG LLP will not sponsor applicants for U.S. work visa status for this opportunity (no sponsorship is available for H-1B, L-1, TN, O-1, E-3, H-1B1, F-1, J-1, OPT, CPT or any other employment-based visa)
KPMG LLP and its affiliates and subsidiaries ("KPMG") complies with all local/state regulations regarding displaying salary ranges. If required, the ranges displayed below or via the URL below are specifically for those potential hires who will work in the location(s) listed. Any offered salary is determined based on relevant factors such as applicant's skills, job responsibilities, prior relevant experience, certain degrees and certifications and market considerations. In addition, KPMG is proud to offer a comprehensive, competitive benefits package, with options designed to help you make the best decisions for yourself, your family, and your lifestyle. Available benefits are based on eligibility. Our Total Rewards package includes a variety of medical and dental plans, vision coverage, disability and life insurance, 401(k) plans, and a robust suite of personal well-being benefits to support your mental health. Depending on job classification, standard work hours, and years of service, KPMG provides Personal Time Off per fiscal year. Additionally, each year KPMG publishes a calendar of holidays to be observed during the year and provides eligible employees two breaks each year where employees will not be required to use Personal Time Off; one is at year end and the other is around the July 4th holiday. Additional details about our benefits can be found towards the bottom of our KPMG US Careers site at Benefits & How We Work.
Follow this link to obtain salary ranges by city outside of CA:
**********************************************************************
KPMG offers a comprehensive compensation and benefits package. KPMG is an equal opportunity employer. KPMG complies with all applicable federal, state and local laws regarding recruitment and hiring. All qualified applicants are considered for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, citizenship status, disability, protected veteran status, or any other category protected by applicable federal, state or local laws. The attached link contains further information regarding KPMG's compliance with federal, state and local recruitment and hiring laws. No phone calls or agencies please.
KPMG recruits on a rolling basis. Candidates are considered as they apply, until the opportunity is filled. Candidates are encouraged to apply expeditiously to any role(s) for which they are qualified that is also of interest to them.
Los Angeles County applicants: Material job duties for this position are listed above. Criminal history may have a direct, adverse, and negative relationship with some of the material job duties of this position. These include the duties and responsibilities listed above, as well as the abilities to adhere to company policies, exercise sound judgment, effectively manage stress and work safely and respectfully with others, exhibit trustworthiness, and safeguard business operations and company reputation. Pursuant to the California Fair Chance Act, Los Angeles County Fair Chance Ordinance for Employers, Fair Chance Initiative for Hiring Ordinance, and San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
Data Engineer (AI-RPA)
Data engineer job in Grandville, MI
Title: Data Engineer
YOUR ROLE
PADNOS is seeking a Data Engineer on our Data and Software team who thrives at the intersection of data, automation, and applied AI. This role builds intelligent data pipelines and robotic process automations (RPAs) that connect systems, streamline operations, and unlock efficiency across the enterprise.
You'll design and develop pipelines using Python, SQL Server, and modern APIs, integrating services such as OpenAI, Anthropic, and Azure ML to drive automation and accelerate business processes. Your work will extend beyond traditional data engineering, applying AI models and API logic to eliminate manual effort and make data more actionable across teams.
You will report directly to the IT Manager at PADNOS Corporate in Grandville, MI. This is an in-person role based in Grandville, Michigan, and you must reside within daily commuting distance. We do not relocate, sponsor visas, or consider remote applicants.
ACCOUNTABILITIES
Design and develop automated data pipelines that integrate AI and machine learning services to process, enrich, and deliver high-value data for analytics and automation use cases.
Build, maintain, and optimize SQL Server ELT workflows and Python-based automation scripts.
Connect to external APIs (OpenAI, Anthropic, Azure ML, and other SaaS systems) to retrieve, transform, and post data as part of end-to-end workflows.
Partner with business stakeholders to identify manual workflows and translate them into AI-enabled automations.
Work with software developers to integrate automation logic directly into enterprise applications.
Implement and monitor data quality, reliability, and observability metrics across pipelines.
Apply performance tuning and best practices for database and process efficiency.
Develop and maintain reusable Python modules and configuration standards for automation scripts.
Support data governance and version control processes to ensure consistency and transparency across environments.
Collaborate closely with analytics, software, and operations teams to prioritize and deliver automation solutions that create measurable business impact.
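To make the AI-enrichment accountability above concrete, here is a minimal sketch of one pipeline step: building the JSON request a pipeline might send to a chat-completion endpoint to classify a free-text record. The endpoint path, model name, categories, and prompt are assumptions for illustration; the actual HTTP call is deliberately left out so the sketch stays self-contained:

```python
# Hedged sketch: construct (but do not send) an AI-enrichment API request.
import json

def build_enrichment_request(record_text, model="gpt-4o-mini"):
    """Return (url_path, json_body) for a classification call on one record."""
    body = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Classify the record into one of: scrap, freight, admin."},
            {"role": "user", "content": record_text},
        ],
        "temperature": 0,  # deterministic output aids pipeline reproducibility
    }
    return "/v1/chat/completions", json.dumps(body)

path, payload = build_enrichment_request("Inbound load of shredded steel")
parsed = json.loads(payload)  # round-trips cleanly for logging and retries
```

Keeping request construction in a pure function like this makes the enrichment step unit-testable without touching the external service.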
MEASUREMENTS
Reduction in manual hours across teams through implemented automations.
Reliable and reusable data pipelines supporting AI and analytics workloads.
Delivery of production-ready automation projects in collaboration with business units.
Adherence to data quality and reliability standards.
Continuous improvement in data pipeline performance and maintainability.
QUALIFICATIONS/EXPERIENCE
Bachelor's degree or equivalent experience in data engineering, computer science, or software development.
Must have personally owned an automated pipeline end-to-end (design → build → deploy → maintain).
Minimum 3 years hands-on experience building production data pipelines using Python and SQL Server. Contract, academic, bootcamp, or coursework experience does not qualify.
Intermediate to advanced Python development skills, particularly for data and API automation.
Experience working with RESTful APIs and JSON data structures.
Familiarity with AI/ML API services (OpenAI, Anthropic, Azure ML, etc.) and their integration into data workflows.
Experience with modern data stack components such as Fivetran, dbt, or similar tools preferred.
Knowledge of SQL Server performance tuning and query optimization.
Familiarity with Git and CI/CD workflows for data pipeline deployment.
Bonus: Experience deploying or maintaining RPA or AI automation solutions.
PADNOS is an Equal Opportunity Employer and does not discriminate on the basis of race, color, religion, sex, age, national origin, disability, veteran status, sexual orientation or any other classification protected by Federal, State or local law.
Staff Data Engineer
Data engineer job in Grand Rapids, MI
Cybercrime is rising, reaching record highs in 2024. According to the FBI's IC3 report, total losses exceeded $16 billion. With investment fraud and BEC scams at the forefront, the message is clear: the real estate sector remains a lucrative target for cybercriminals. At CertifID, we take this threat seriously and provide a secure platform that verifies the identities of parties involved in transactions, authenticates wire transfer instructions, and detects potential fraud attempts. Our technology is designed to mitigate risks and ensure that every transaction is conducted with confidence and peace of mind.
We know we couldn't take on this challenge without our incredible team. We have been recognized as one of the Best Startups to Work for in Austin, made the Inc. 5000 list, and won Best Culture by Purpose Jobs two years in a row. We are guided by our core values and our vision of a world without wire fraud. We offer a dynamic work environment where you can contribute to meaningful impact and be part of a team dedicated to enhancing security and fighting fraud.
We're seeking an exceptional Staff Data Engineer to help us take our data platform to the next level. This is a high-impact role where you'll architect and build scalable data infrastructure, empower data-driven decision-making, and help shape the future of fraud prevention. We're scaling fast, and you'll have the chance to shape the future of our data platform.
You will collaborate with data scientists, engineers, and business stakeholders to ensure high-quality, actionable data insights to enable business outcomes.
What You Will Do
Design and build robust, scalable, secure data pipelines and infrastructure to support analytics, product intelligence, and machine learning.
Lead data architecture decisions, ensuring high performance, reliability, and maintainability across our platform.
Collaborate cross-functionally with product, engineering, and business teams to deliver data solutions that drive strategic insights and operational excellence.
Champion data quality and governance, implementing best practices for data validation, lineage, and observability.
Mentor and guide other data engineers, fostering a culture of technical excellence and continuous learning.
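The data quality and governance duty above typically means running validation checks before a pipeline publishes a table. A minimal sketch, assuming hypothetical column names and rules (in practice a framework such as dbt tests or Great Expectations would express these declaratively):

```python
# Hedged sketch of pre-publish data-quality checks for a pipeline.
def validate_rows(rows, required=("txn_id", "amount"), max_null_rate=0.0):
    """Return a list of human-readable failures; an empty list means pass."""
    failures = []
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if rows and nulls / len(rows) > max_null_rate:
            failures.append(f"{col}: {nulls}/{len(rows)} null values")
    ids = [r.get("txn_id") for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("txn_id: duplicate keys found")
    return failures

rows = [
    {"txn_id": "a1", "amount": 120.0},
    {"txn_id": "a1", "amount": 95.5},   # duplicate key -> should fail
]
issues = validate_rows(rows)
```

Returning failures as data rather than raising immediately lets the pipeline log every violation for observability before deciding whether to block the publish.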
What We're Looking For
8+ years of engineering experience, with proven staff-level data engineering experience in a fast-paced, product-driven environment.
Experience in leading teams of other engineers to build a long-term technical vision and plans to achieve it.
Deep expertise in cloud data platforms (e.g., Snowflake, BigQuery, Redshift) and orchestration tools (e.g., Airflow, dbt).
Strong programming skills in Python or Scala, and proficiency with SQL.
Experience with real-time data processing (e.g., Kafka, Spark Streaming) and data modeling for analytics and ML.
A growth mindset, strong ownership, and a passion for solving complex problems that matter.
Preferred Experience
Experience in fintech, cybersecurity, or fraud prevention.
Familiarity with data privacy regulations (e.g., SOC 2, GDPR, CCPA).
Contributions to open-source data tools or communities.
What We Offer
Flexible vacation
12 company-paid holidays
10 paid sick days
No work on your birthday
Health, dental, and vision Insurance (including a $0 option)
401(k) with matching, and no waiting period
Equity
Life insurance
Generous parental paid leave
Wellness reimbursement of $300/year
Remote worker reimbursement of $300/year
Professional development reimbursement
Competitive pay
An award-winning culture
Change doesn't happen overnight, and the same goes for us here at CertifID. We PROGRESS collectively and individually as we grow, abiding by our core values: Protect the Customer, Raise the Bar, Operate with Urgency, Grow with Grit, Ride the Wave, Enthusiasm Spreads, Stay Connected, Send It.
Senior Data Engineer
Data engineer job in Grand Rapids, MI
**Advance Local** is looking for a **Senior Data Engineer** to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with data product teams across business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.
The base salary range is $120,000 - $140,000 per year.
**What you'll be doing:**
+ Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
+ Partner with platform owners across business units to establish and maintain data integrations from third party systems into the central data platform.
+ Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control, and disaster recovery capabilities.
+ Design and implement API integrations and event-driven data flows to support real time and batch data requirements.
+ Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
+ Partner with the Data Architect and data product teams to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
+ Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
+ Support rapid prototyping of new data products in collaboration with data product teams by building flexible, reusable data infrastructure components.
+ Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
+ Collaborate with data product teams, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
+ Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources.
+ Develop and maintain comprehensive documentation for data engineering processes and systems, architecture, integration patterns, and runbooks.
+ Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
+ Stay current with the emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
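The quality-validation and deduplication responsibilities above can be sketched as a small transform step. This is a generic Python illustration, not Advance Local's actual pipeline; the record shape, required fields, and last-write-wins rule are all invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    passed: list = field(default_factory=list)
    rejected: list = field(default_factory=list)

def validate_and_dedupe(records, required=("id", "event_ts")):
    """Drop records missing required fields, then dedupe on 'id',
    keeping the record with the latest event_ts (last-write-wins)."""
    report = QualityReport()
    latest = {}
    for rec in records:
        if any(rec.get(f) is None for f in required):
            report.rejected.append(rec)  # fails data-quality validation
            continue
        key = rec["id"]
        if key not in latest or rec["event_ts"] > latest[key]["event_ts"]:
            latest[key] = rec
    report.passed = list(latest.values())
    return report
```

In a real pipeline the `rejected` list would feed the monitoring and alerting systems mentioned above rather than being silently dropped.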
**Our ideal candidate will have the following:**
+ Bachelor's degree in computer science, engineering, or a related field
+ Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
+ Expert proficiency in Snowflake data engineering patterns
+ Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
+ Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
+ Proven ability to work with third party APIs, webhooks, and data exports
+ Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
+ Proven ability to design and implement API integrations and event-driven architecture
+ Experience with data modeling, data warehousing, and ETL processes at scale
+ Advanced proficiency in Python and SQL for data pipeline development
+ Experience with data orchestration tools (Airflow, dbt, Snowflake tasks)
+ Strong understanding of data security, access controls, and compliance requirements
+ Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
+ Excellent problem-solving skills and attention to detail
+ Strong communication and collaboration skills
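One concrete slice of the API-and-webhook skill set listed above is verifying a webhook's signature before ingesting its payload. The HMAC-SHA256 scheme below is a common vendor pattern but is only an assumption here; each platform defines its own header name and algorithm:

```python
import hashlib
import hmac

def verify_webhook(payload: bytes, signature_hex: str, secret: bytes) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it to
    the vendor-supplied signature using a constant-time comparison."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)
```

`hmac.compare_digest` avoids timing side channels that a plain `==` comparison would expose.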
**Additional Information**
Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity.
Advance Local Media is one of the largest media groups in the United States, operating leading news and information companies in more than 20 cities and reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** .
Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext.
_Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._
_If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._
Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
Marketing Data Engineer
Data engineer job in Grand Rapids, MI
For over half a decade, Hudson Manpower has been a trusted partner in delivering specialized talent and technology solutions across IT, Energy, and Engineering industries worldwide. We work closely with startups, mid-sized firms, and Fortune 500 clients to support their digital transformation journeys. Our teams are empowered to bring fresh ideas, shape innovative solutions, and drive meaningful impact for our clients. If you're looking to grow in an environment where your expertise is valued and your voice matters, then Hudson Manpower is the place for you. Join us and collaborate with forward thinking professionals who are passionate about building the future of work.
Job Description:
Designs, codes, tests, and implements data movement pipelines, dashboarding and analytical assets; develops system documentation according to SAFe Agile principles and industry standards.
Design and perform hands-on development of vendor or internal applications and interfaces, including Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2, GCP Cloud Storage and BigQuery
Assist in configuration and administration of the marketing platforms the team supports (Google Marketing Platform, Salesforce MCI, The Trade Desk, ADvendio, etc.)
Build data ingestion into and out of the above listed marketing platforms
Gathers and implements functional and non-functional requirements including performance monitoring, alerting and code management and ensuring alignment with technology best practices and SLAs.
Partnering with all areas of the business to gather requirements for Data and Analytics and designing solutions.
Participate in agile ceremonies and follow agile cadence to build, test and deliver solutions
Driving engagement with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
Interfaces with the Product Manager and IT partners at the Program level and within other Release Trains to define and estimate features for agile teams.
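Pipelines like those described above typically pull from marketing platforms incrementally rather than re-reading everything. The watermark pattern below is a generic Python sketch of that idea (in this stack it would normally be expressed in Data Factory or Databricks); the row shape and `fetch_page` callable are hypothetical:

```python
def extract_incremental(fetch_page, watermark):
    """Pull only rows newer than the last successful watermark, then
    advance the watermark to the max timestamp seen.

    fetch_page(since) -> list of {"id": ..., "updated_at": ...} dicts.
    """
    rows = [r for r in fetch_page(watermark) if r["updated_at"] > watermark]
    # If nothing new arrived, keep the old watermark unchanged.
    new_watermark = max((r["updated_at"] for r in rows), default=watermark)
    return rows, new_watermark
```

Persisting `new_watermark` only after a successful load makes the extract safely re-runnable.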
Requirements
6+ years industry experience (business application design, development, implementation, and/or solution architecture)
Understanding of architecture practices and execution for large projects / programs.
Experience building and designing data and analytics solutions on enterprise solutions such as Azure - Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2.
Databricks experience is required.
Experience designing data pipelines (ingestion, storage, prep-train, model and serve) using the above technologies; automating Azure workloads; and addressing data quality, governance/standards, security, and legal compliance in the Azure architecture
Working knowledge of these tools preferred: Google Marketing Platform, Salesforce MCI, The Trade Desk, ADvendio, etc.
Experience with Marketing domain data ingestion and analytics a must
Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering or related discipline or equivalent work experience and technical training is required.
Excellent written and oral communications skills.
Previous experience in Power BI, Data Modeling, Data Classification and zones, data movement, Data architecture and reporting
Preferred knowledge and experience on Python and API architecture in Azure
Any SAFe certification or training.
Experience with multiple, diverse technical configurations, technologies, and processing environments.
Exceptional interpersonal skills, including teamwork, facilitation, and negotiation
Experienced Data Engineer Analyst
Data engineer job in Grand Rapids, MI
Ascential Technologies is a global leader in automated diagnostic, inspection, assembly, and test systems. We serve mission-critical industries including automotive (EOL testing, ADAS), aerospace, industrial automation (balancing, grinding, test stands), and medical & life sciences (instruments, devices, consumables, and automation). Our solutions help customers accelerate innovation, ensure safety, and improve performance across the product lifecycle.
Role Overview:
We are seeking an experienced Data Engineer/Analyst to join our team to lead and support the ongoing development of our data analytics platform. This individual will play a critical role in building, maintaining, and enhancing scalable data systems while also supporting customers and internal teams. The role requires strong technical expertise in data engineering, analytics, visualization, and AI integration along with proven experience managing on-prem and cloud-based stacks.
Key Responsibilities:
Data Engineering & Processing
Design, develop, and maintain advanced data collectors for diverse industrial, aerospace, automotive, and medical systems.
Architect robust data pipelines for on-premises and cloud environments (Elastic Cloud, OpenSearch).
Analyze machine-generated log files from test stands, medical devices, and automation systems.
Clean and preprocess data to remove duplicates, noise, and errors.
Implement updates and enhancements to accommodate evolving data file types and formats (CSV, SQL, PLC logs, text, etc.).
Augment datasets with derived features to enhance analytical value.
Optimize performance, scalability, and fault tolerance of ingestion pipelines.
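The log-analysis and cleaning duties above can be illustrated with a minimal parser for machine-generated log lines that skips noise and drops exact duplicates. The line format, field names, and station naming are invented for this sketch, not an actual test-stand format:

```python
import re

# Hypothetical test-stand log line: "2024-01-02 12:00:00 | STATION_3 | torque=12.5"
LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) \| "
    r"(?P<station>\w+) \| (?P<metric>\w+)=(?P<value>[\d.]+)"
)

def parse_log(text):
    """Parse well-formed lines into dicts, skip malformed noise,
    and drop exact duplicate lines."""
    seen, rows = set(), []
    for raw in text.splitlines():
        line = raw.strip()
        m = LINE.match(line)
        if not m or line in seen:
            continue  # noise or duplicate
        seen.add(line)
        rows.append({**m.groupdict(), "value": float(m.group("value"))})
    return rows
```

Real collectors would add per-format parsers (CSV, SQL exports, PLC logs) behind the same interface.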
Visualization & Reporting
Design and implement dashboards tailored to specific test and measurement applications.
Apply standard dashboard templates to new and existing datasets.
Generate reports and visual summaries for internal and customer-facing use.
Provide automatic report generation using AI/LLM frameworks to summarize data, generate insights and performance-improvement guidance, and deliver predictive analytics, anomaly detection, and automated recommendations.
Data Collection & Ingestion
Develop and deploy data collectors for machines across automotive, aerospace, and medical domains.
Configure systems to monitor, clean, transform, and ingest data into cloud platforms.
Ensure data pipelines are robust, scalable, and secure.
Integrate new communication protocols (e.g. PLCs of different makes/models) and data formats; adapt to customer-specific software platforms
Monitoring & Alerting
Set up thresholds and integrate webhooks for real-time alerts and notifications.
Investigate and resolve issues related to data collectors, dashboards, and ELK stack components.
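A minimal version of the threshold-and-webhook alerting described above can be sketched with the alert sink abstracted as a callback. Metric names and limits are hypothetical; in production the callback would POST to a webhook endpoint or an ELK alerting integration:

```python
def check_thresholds(metrics, limits, notify):
    """Compare each metric to its configured limit and call
    notify(name, value, limit) for every breach.

    Returns the list of breached metric names."""
    breached = []
    for name, value in metrics.items():
        limit = limits.get(name)
        if limit is not None and value > limit:
            notify(name, value, limit)  # e.g. fire a webhook in production
            breached.append(name)
    return breached
```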
Collaboration & Maintenance
Collaborate directly with customers to gather requirements, resolve technical issues, and deliver solutions.
Provide guidance to internal teams, mentoring junior engineers and analysts.
Participate in daily standups and agile team activities.
Review and provide feedback on new datasets and data models.
Perform backups and conduct code reviews for data-related components.
Qualifications:
Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
5+ years of professional experience in data engineering, data science or analytics roles for both on-prem and cloud environments.
Proven expertise with ELK, OpenSearch, and/or OpenTelemetry.
Strong programming and scripting skills (e.g., Python, Bash, PowerShell).
Experience with data cleaning, transformation, and preprocessing for engineering/industrial data.
Hands-on experience with cloud deployments (Elastic Cloud, AWS, Azure, or GCP).
Proficiency with visualization frameworks (Kibana, OpenSearch Dashboards, Grafana).
Experience with core programming principles and design patterns.
Strong analytical and problem-solving skills.
Excellent communication and teamwork abilities with both technical and customer stakeholders.
Bonus Skills:
Experience working with industrial protocols and PLC integration.
Familiarity with containerization (Docker, Kubernetes).
Experience incorporating cybersecurity principles for secure data handling.
Experience with REST APIs and microservices architectures.
Background in test and measurement systems for automotive, aerospace, or medical device industries.
Prior experience in customer-facing engineering roles.
This role can be remote; we prefer a candidate within commuting distance of one of the offices listed in the posting (Wisconsin, Michigan)
Salesforce Data 360 Architect
Data engineer job in Grand Rapids, MI
Who You'll Work With In our Salesforce business, we help our clients bring the most impactful customer experiences to life and we do that in a way that makes our clients the hero of their transformation story. We are passionate about and dedicated to building a diverse and inclusive team, recognizing that diverse team members who are celebrated for bringing their authentic selves to their work build solutions that reach more diverse populations in innovative and impactful ways. Our team is comprised of customer strategy experts, Salesforce-certified experts across all Salesforce capabilities, industry experts, organizational and cultural change consultants, and project delivery leaders. As the 3rd largest Salesforce partner globally and in North America, we are committed to growing and developing our Salesforce talent, offering continued growth opportunities, and exposing our people to meaningful work that aligns to their personal and professional goals.
We're looking for individuals who have experience implementing Salesforce Data Cloud or similar platforms and are passionate about customer data. The ideal candidate has a desire for continuous professional growth and can deliver complex, end-to-end Data Cloud implementations from strategy and design, through to data ingestion, segment creation, and activation; all while working alongside both our clients and other delivery disciplines. Our Global Salesforce team is looking to add a passionate Principal or Senior Principal to take on the role of Data Cloud Architect within our Salesforce practice.
What You'll Do:
* Responsible for business requirements gathering, architecture design, data ingestion and modeling, identity resolution setup, calculated insight configuration, segment creation and activation, end-user training, and support procedures
* Lead technical conversations with both business and technical client teams; translate those outcomes into well-architected solutions that best utilize Salesforce Data Cloud and the wider Salesforce ecosystem
* Ability to direct technical teams, both internal and client-side
* Provide subject matter expertise as warranted via customer needs and business demands
* Build lasting relationships with key client stakeholders and sponsors
* Collaborate with digital specialists across disciplines to innovate and build premier solutions
* Participate in compiling industry research, thought leadership and proposal materials for business development activities
* Experience with scoping client work
* Experience with hyperscale data platforms (ex: Snowflake), robust database modeling and data governance is a plus.
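Identity resolution, mentioned in the responsibilities above, is essentially keyed record merging. Salesforce Data Cloud configures this declaratively with match and reconciliation rules, so the Python below is only a toy illustration of the concept; the email match key and first-non-null-wins rule are assumptions:

```python
from collections import defaultdict

def resolve_identities(profiles, key="email"):
    """Group profiles sharing the same (case-insensitive) match key into
    one unified profile; the first non-null value for a field wins."""
    unified = defaultdict(dict)
    for profile in profiles:
        k = (profile.get(key) or "").lower()
        if not k:
            continue  # profiles with no match key are skipped in this sketch
        for field_name, value in profile.items():
            if value is not None and field_name not in unified[k]:
                unified[k][field_name] = value
    return dict(unified)
```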
What You'll Bring:
* Have been part of at least one Salesforce Data Cloud implementation
* Familiarity with Salesforce's technical architecture: APIs, Standard and Custom Objects, APEX. Proficient with ANSI SQL and supported functions in Salesforce Data Cloud
* Strong proficiency toward presenting complex business and technical concepts using visualization aids
* Ability to conceptualize and craft sophisticated wireframes, workflows, and diagrams
* Strong understanding of data management concepts, including data quality, data distribution, data modeling and data governance
* Detailed understanding of the fundamentals of digital marketing and complementary Salesforce products that organizations may use to run their business. Experience defining strategy, developing requirements, and implementing practical business solutions.
* Experience in delivering projects using Agile-based methodologies
* Salesforce Data Cloud certification preferred
* Additional Salesforce certifications like Administrator are a plus
* Strong interpersonal skills
* Bachelor's degree in a related field preferred, but not required
* Open to travel (up to 50%)
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this role, we are hiring at the following levels and salary ranges:
* East Bay, San Francisco, Silicon Valley:
* Principal: $145,000-$225,000
* San Diego, Los Angeles, Orange County, Seattle, Boston, Houston, New Jersey, New York City, Washington DC, Westchester:
* Principal: $133,000-$206,000
* All other locations:
* Principal: $122,000-$189,000
In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.
We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************.
We will accept applications until December 31, 2025 or until the position is filled.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team or contact ****************************** if you require accommodations during the interview process.
#LI-EC
Embedded Software Development Engineer - DoD Secret Clearance
Data engineer job in Grand Rapids, MI
Are you an Embedded Software Engineer who loves working on innovative technologies? If so, GE Aerospace Defense and Systems wants you to join their Emerging Technologies team in Grand Rapids, MI. You will be part of a cross-functional team that develops innovative solutions for capabilities on custom hardware. You will work on projects related to stores management, cyber security, networking, sensing and signal processing, and AI/ML.
This opportunity is located in Grand Rapids, MI, but don't worry, GE will provide comprehensive corporate relocation assistance.
GE Aerospace is a leader in inventing the future of flight. GE offers competitive salaries and a full range of benefits, including 401K contributions and matching, flexible work arrangements, generous time off, tuition reimbursement, and various health insurance options.
If you are interested in this exciting opportunity, please apply today!
**Job Description**
As an Embedded Software Development Engineer, you will play a vital role in designing, developing, and testing embedded software (firmware), platforms, and systems for our mission-critical technologies and products. You will work with a team of passionate and skilled engineers who share your vision of creating innovative and reliable embedded systems. To succeed in this position, you will need a background in embedded systems, a keen eye for detail, and a deep understanding of real-time systems and their constraints. You will also need to demonstrate your ability to take full ownership of your role and deliver high-quality software solutions. The role has moderate autonomy, requiring high levels of operational judgment.
**Responsibilities:**
+ Designs and/or programs/develops a small module or a large component, feature, set of features, whole feature area or entire embedded software product.
+ Define and formalize system, hardware, software, and human integration requirements.
+ Define and execute engineering test, validation, and verification activities.
+ Follow established software development methodologies and principles and document your tasks and designs.
+ Validate and verify software designs in a diversity of system integration environments - from local desktop computer simulations to fully representative flight tests.
+ Implement protocols and algorithms for resource-constrained environments and collaborate with the hardware team to enable communication between modules and applications.
+ Ensure software robustness, resilience, and fail-safe operation for critical devices, and mitigate potential safety and security vulnerabilities.
+ Create detailed design and technical documentation, optimize existing applications, and implement new features.
+ Participate in task prioritization, execution, requirements, specifications, code and design reviews, and mentorship across the software development life cycle.
+ Apply best practices for software engineering and understand the key business drivers and product roadmap.
+ Deliver your work to support project scope, cost, and schedule targets, and interface effectively with all levels of the organization and customers.
+ Propose novel solutions to technical challenges, generate cost and time estimates for future bids and programs, and utilize hardware/software to demonstrate capability against customer expectations.
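The protocol work described above (framing messages between modules over constrained links) can be sketched in Python for clarity, though the role itself calls for C/C++. The sync byte, frame layout, and 8-bit additive checksum below are invented for the sketch, not any real GE protocol:

```python
import struct

SYNC = 0xA5  # invented start-of-frame marker

def encode_frame(payload: bytes) -> bytes:
    """Frame as [SYNC][len][payload][checksum], with an 8-bit
    additive checksum over sync byte, length, and payload."""
    checksum = (SYNC + len(payload) + sum(payload)) & 0xFF
    return struct.pack("BB", SYNC, len(payload)) + payload + bytes([checksum])

def decode_frame(frame: bytes) -> bytes:
    """Validate sync byte, length, and checksum; return the payload."""
    sync, length = struct.unpack_from("BB", frame)
    if sync != SYNC or len(frame) != length + 3:
        raise ValueError("bad frame")
    payload, checksum = frame[2:-1], frame[-1]
    if (sync + length + sum(payload)) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    return payload
```

Production firmware would typically use a CRC rather than an additive checksum and add escaping or a stronger sync sequence for resynchronization.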
**Required Qualifications:**
+ Grand Rapids, MI opportunity - Corporate relocation assistance provided.
+ Bachelor's degree + 5 years of related software engineering experience, or Master's degree + 2 years of experience, in Computer Science, Electrical Engineering, Mathematics, Physics, or related fields
+ Proficiency in C, C++, or equivalent languages.
+ **This role requires the successful candidate to obtain and maintain US Government Security Clearance; prerequisite for a security clearance is U.S. citizenship.**
**Desired Qualifications:**
+ Master's degree in engineering or computer science with extensive experience in Ada, C/C++ for embedded software design, development, and testing.
+ Knowledge of scripting languages (Python, Perl, Tcl, etc.) and code management tools (Git, CVS, SVN, Perforce, etc.)
+ Experience with RTOS (Linux, FreeRTOS, QNX, VxWorks, etc.) and device drivers for complex systems using parallel processing, multi-threading, distributed processing, multi-core, SoM, and/or secure processing.
+ Experience with embedded software testing, debugging, and integration on hardware using debuggers (gdb, lldb, etc.), test equipment (oscilloscopes, analyzers, multimeters, etc.), and interfaces such as UARTs and JTAG.
+ Experience working in mission-critical industries (aerospace, automotive, defense, first responder, medical devices, etc.) and turning CONOPS, Specifications or Requirements into software design, code, test plans and execution.
+ Innovative, critical thinking and troubleshooting skills and proficiency with IDEs, version control tools, defect tracking tools and scripting tools.
+ Experience with bare metal software design and optimization for cycles and memory and fundamental facility with compilers, build and source code control tools.
+ Experience with model based engineering on Cameo.
+ Outstanding written and verbal communication skills.
+ Knowledgeable of system interfaces (e.g. Ethernet, Mil-Std-1553, Serial).
+ Knowledgeable of component interfaces (e.g. I2C, SPI, PCIe).
**The base pay range for this position is $90,800.00 - $121,000.00 USD Annual. The specific pay offered may be influenced by a variety of factors, including the candidate's experience, education, and skill set. This position is also eligible for an annual discretionary bonus based on a percentage of your base salary/ commission based on the plan. This posting is expected to close on 12/31/25. **
GE Aerospace offers comprehensive benefits and programs to support your health and, along with programs like HealthAhead, your physical, emotional, financial and social wellbeing. Healthcare benefits include medical, dental, vision, and prescription drug coverage; access to a Health Coach from GE Aerospace; and the Employee Assistance Program, which provides 24/7 confidential assessment, counseling and referral services. Retirement benefits include the GE Aerospace Retirement Savings Plan, a 401(k) savings plan with company matching contributions and company retirement contributions, as well as access to Fidelity resources and planning consultants. Other benefits include tuition assistance, adoption assistance, paid parental leave, disability insurance, life insurance, and paid time-off for vacation or illness.
GE Aerospace (General Electric Company or the Company) and its affiliates each sponsor certain employee benefit plans or programs (i.e., is a "Sponsor"). Each Sponsor reserves the right to terminate, amend, suspend, replace or modify its benefit plans and programs at any time and for any reason, in its sole discretion. No individual has a vested right to any benefit under a Sponsor's welfare benefit plan or program. This document does not create a contract of employment with any individual.
\#LI-KS1
**\#securityclearance**
_This role requires access to U.S. export-controlled information. Therefore, employment will be contingent upon the ability to prove that you meet the status of a U.S. Person as one of the following: U.S. lawful permanent resident, U.S. Citizen, have been granted asylee or refugee status (i.e., a protected individual under the Immigration and Naturalization Act, 8 U.S.C. 1324b(a)(3))._
**Additional Information**
GE Aerospace offers a great work environment, professional development, challenging careers, and competitive compensation. GE Aerospace is an Equal Opportunity Employer (****************************************************************************************** . Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
GE Aerospace will only employ those who are legally authorized to work in the United States for this opening. Any offer of employment is conditioned upon the successful completion of a drug screen (as applicable).
**Relocation Assistance Provided:** Yes
GE Aerospace is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
Cybersecurity Data & AI Consultant
Data engineer job in Grand Rapids, MI
Consultant - Cyber Defense & Resilience - Security Operations Are you interested in working in a dynamic environment that offers opportunities for professional growth and new responsibilities? If so, Deloitte could be the place for you. Traditional security programs have often been unsuccessful in unifying the need to both secure and support technology innovation required by the business. Join Deloitte's Cyber Defense & Resilience (D&R) Security Operations team and become a member of the largest group of cybersecurity professionals worldwide.
Recruiting for this role ends on 12/31/2025
The Team
Cyber Defense & Resilience is an integrated team of security and data technologists working at the intersection of cybersecurity, advanced cyber data engineering, and the use of AI and ML for cyber defense and operations. We serve as a trusted advisor and managed service provider, bringing a mix of capability and capacity across security data modernization, data ops, AI, and ML, and applying these disciplines to cyber-specific solutions. Through our unrivaled breadth and depth of services across every major industry and domain, we help our clients run smarter, faster, and more efficiently. With Deloitte's AI & Data, our clients have the support they need to continuously develop, innovate, automate, scale, and operate in service of organizational performance and growth.
Cyber Detect & Respond practitioners work with our clients to:
* Design and modernize large scale cyber data and analytics programs that promote organizational intelligence, provide embedded capacity, and implement as-a-service based subscription models at scale
* Harness the potential of bleeding-edge cyber big data and AI technologies such as Databricks for Cyber, AWS Security Lake, and Google SecOps, along with the latest from traditional security providers like Splunk, CrowdStrike, Palo Alto and others.
* Enable day-to-day operations, maintenance and ongoing enhancements of their data platforms and applications as well as managing and governing the underlying data leveraging standardized, automated and AI enabled Data Ops capabilities
* Mature their AI and Analytics journey with fluid capacity and flexible capability of AI and Analytics experts complemented with experience hardened assets and curated data sets to experiment with AI and scale their AI and Analytics ambition.
Qualifications:
Required:
* 2-4 years of relevant Analytics consulting or industry experience
* 2-4 years of experience with AI development tools such as vector databases (Pinecone, Elastic, etc.) and AI development frameworks (LangChain, CrewAI, etc.)
* 2-4 years of experience in statistical analysis, machine learning, and data mining techniques
* 2-4 years of experience using statistical computer languages (Python, SQL, R, SAS, etc.) to prepare data for analysis, visualize data as part of exploratory analysis, generate features, and other similar data science driven data handling
* 2-4 years of experience using cyber security cloud platforms (Google SecOps, AWS, Azure, etc.)
* 1-4 years of experience with SOC threat hunting and incident response
* Demonstrated expertise with one full life cycle analytics engagement across strategy, design and implementation.
* Bachelor's Degree in Engineering, Mathematics, or Empirical Statistics, or 4 years of equivalent professional experience
* Ability to travel up to 50%, on average, based on the work you do and the clients and industries/sectors you serve
* Limited immigration sponsorship may be available.
Preferred:
* Experience architecting, designing, developing and deploying enterprise data science solutions which include components across the Artificial Intelligence spectrum such as NLP, Chatbots, Virtual Assistants, Computer Vision, and Cognitive Services, as well as the use of big data tools for the management of massive datasets.
* Knowledge of the intersection of AI / ML / Advanced Data Engineering and cybersecurity specific use cases for Detection, cyber threat response acceleration.
* Experience parsing and normalizing cyber or IT specific telemetry datasets
* Expertise in Python machine learning and deep learning frameworks and libraries, e.g., PyTorch, Keras, TensorFlow, scikit-learn, NumPy, SciPy
* Experience designing and implementing open-source Apache frameworks (Kafka, Storm, Spark) to support the end-to-end data management life cycle
* Ability to work independently and manage multiple task assignments.
* Strong oral and written communication skills, including presentation skills (MS Visio, MS PowerPoint).
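The telemetry parsing and normalization called out above is, at its core, mapping heterogeneous source records onto a common schema. A minimal sketch (pure Python; the source names and field names are illustrative, not any vendor's actual schema):

```python
def normalize_event(raw, source):
    """Map a source-specific log record onto a minimal common schema.

    Field names are hypothetical placeholders for whatever the
    firewall or EDR product actually emits.
    """
    if source == "firewall":
        return {
            "timestamp": raw["ts"],
            "src_ip": raw["src"],
            "dst_ip": raw["dst"],
            "action": raw["action"].lower(),
        }
    if source == "edr":
        return {
            "timestamp": raw["event_time"],
            "src_ip": raw.get("host_ip"),
            "dst_ip": raw.get("remote_ip"),
            "action": raw["verdict"].lower(),
        }
    raise ValueError(f"unknown source: {source}")

fw = {"ts": "2024-01-01T00:00:00Z", "src": "10.0.0.5",
      "dst": "8.8.8.8", "action": "ALLOW"}
edr = {"event_time": "2024-01-01T00:00:01Z", "host_ip": "10.0.0.5",
       "remote_ip": "1.2.3.4", "verdict": "Blocked"}
events = [normalize_event(fw, "firewall"), normalize_event(edr, "edr")]
```

In practice this mapping layer is what lets downstream detections query one field name (`src_ip`, `action`) regardless of which product produced the event.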
Information for applicants with a need for accommodation: ************************************************************************************************************
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $80,400 to $148,000.
Recruiting tips
From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Learn more.
Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.
As used in this posting, "Deloitte" means Deloitte & Touche LLP, a subsidiary of Deloitte LLP. Please see ************************* for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Qualified applicants with criminal histories, including arrest or conviction records, will be considered for employment in accordance with the requirements of applicable state and local laws, including the Los Angeles County Fair Chance Ordinance for Employers, City of Los Angeles's Fair Chance Initiative for Hiring Ordinance, San Francisco Fair Chance Ordinance, and the California Fair Chance Act. See notices of various fair chance hiring and ban-the-box laws where available. Fair Chance Hiring and Ban-the-Box Notices | Deloitte US Careers
Requisition code: 313848
Databricks Data Engineer - Manager - Consulting - Location Open
Data engineer job in Grand Rapids, MI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Manager**
We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills.
**The opportunity:**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include:
+ Understanding and analyzing business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.
**Key Responsibilities:**
As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including:
+ Leading workstream delivery and ensuring quality in all processes.
+ Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services.
+ Implementing resource plans and budgets while managing engagement economics.
This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs.
**Skills and attributes for success:**
To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact:
+ Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices.
+ Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
+ Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value.
+ Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance.
+ Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics.
+ Strong analytical and problem-solving abilities.
+ Excellent communication skills, with the ability to convey complex information clearly.
+ Proven experience in managing and delivering projects effectively.
+ Ability to build and manage relationships with clients and stakeholders.
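The end-to-end pipeline work described above (ingestion, transformation, storage) is commonly structured in layered stages, often called bronze/silver/gold in Databricks-style architectures. A minimal pure-Python sketch of that layering, with toy retail rows standing in for real feeds (store ids and field names are hypothetical):

```python
def bronze_ingest(raw_rows):
    """Bronze layer: land raw records as-is, tagging lineage."""
    return [dict(row, _source="pos_feed") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver layer: enforce types and drop malformed records."""
    out = []
    for row in bronze_rows:
        try:
            out.append({"store": row["store"],
                        "amount": float(row["amount"]),
                        "_source": row["_source"]})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine these rows
    return out

def gold_aggregate(silver_rows):
    """Gold layer: business-level aggregate ready for reporting."""
    totals = {}
    for row in silver_rows:
        totals[row["store"]] = totals.get(row["store"], 0.0) + row["amount"]
    return totals

raw = [{"store": "GR-01", "amount": "19.99"},
       {"store": "GR-01", "amount": "5.01"},
       {"store": "GR-02", "amount": "bad"}]   # malformed row is filtered out
report = gold_aggregate(silver_clean(bronze_ingest(raw)))
```

In Databricks the same stages would be Spark DataFrames persisted as Delta tables, but the contract between layers — raw, cleaned, aggregated — is the idea the sketch illustrates.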
**To qualify for the role, you must have:**
+ Bachelor's degree in computer science, engineering, or a related field required; master's degree preferred.
+ Typically no fewer than 4-6 years of relevant experience in data engineering, with a focus on cloud data solutions and analytics.
+ Proven expertise in Databricks and experience with Spark for big data processing.
+ Strong background in data architecture and design, with experience in building complex cloud analytics solutions.
+ Experience in leading and managing teams, with a focus on mentoring and developing talent.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Managerial Role:**
+ **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision.
+ **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Large-Scale Implementation Programs:**
1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities.
2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing.
3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting.
**Ideally, you'll also have:**
+ Experience with advanced data analytics tools and techniques.
+ Familiarity with machine learning concepts and applications.
+ Knowledge of industry trends and best practices in data engineering.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for:**
We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Data Architect
Data engineer job in Grand Rapids, MI
**Advance Local** is looking for a **Data Architect** to lead the design and implementation of enterprise-level data solutions within our modern cloud data platform. This role combines deep technical expertise in analytics engineering with leadership responsibilities to ensure the delivery of well-documented, tested, and high-quality data assets that enable AI, data products, advanced analytics, and self-service reporting. You'll guide strategic data initiatives, mentor a team of analytics engineers, and collaborate with data engineering and business stakeholders to deliver impactful, scalable solutions.
The base salary range is $150,000 - $165,000 per year.
**What you'll be doing:**
+ Architect and oversee scalable data models, pipelines, and frameworks in Snowflake using dbt, ensuring that they meet quality standards for AI agents, advanced analytics, and self-service reporting.
+ Lead the design and governance of analytics-ready data models, ensuring they are well-modeled, performant, and accessible to downstream consumers.
+ Drive rapid prototyping of new data products and features, providing technical direction and hands-on guidance when needed.
+ Establish and enforce data quality, testing, and documentation standards across all data assets, ensuring reliability and trustworthiness.
+ Develop advanced solutions for audience data modeling and identity resolution, supporting personalization and segmentation strategies.
+ Partner with Audience Strategy and Insights teams to translate requirements into technical solutions and automation.
+ Collaborate with the Lead Data Engineer on data integration patterns and ensure seamless handoffs between raw data ingestion and analytics-ready models.
+ Establish data architecture standards and development practices (version control, CI/CD, testing frameworks) that enable team scalability.
+ Enable data accessibility and integration solutions that support both technical and non-technical users across the organization.
+ Provide technical leadership to a Data Manager and their team of analytics engineers, fostering a culture of best practices, code review, and continuous improvement.
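The identity-resolution work mentioned above amounts to clustering records that share any linking key (email, device id, etc.). One common approach is connected components via union-find; a minimal sketch in pure Python (record ids and keys are hypothetical — production systems would add fuzzy matching and run at warehouse scale in dbt/Snowflake):

```python
class UnionFind:
    """Minimal union-find for clustering ids that share a linking key."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def resolve_identities(records):
    """Cluster records that share any key into identity groups."""
    uf = UnionFind()
    key_owner = {}
    for rec_id, keys in records.items():
        for key in keys:
            if key in key_owner:
                uf.union(rec_id, key_owner[key])
            else:
                key_owner[key] = rec_id
    clusters = {}
    for rec_id in records:
        clusters.setdefault(uf.find(rec_id), set()).add(rec_id)
    return list(clusters.values())

records = {
    "web-1":    {"email:a@x.com", "device:abc"},
    "mobile-7": {"device:abc"},      # shares a device with web-1
    "web-2":    {"email:b@y.com"},   # no overlap: its own identity
}
clusters = resolve_identities(records)
```

The resulting clusters are what segmentation and personalization logic would treat as a single audience member across web and mobile touchpoints.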
**Our ideal candidate will have the following:**
+ Bachelor's or master's degree in computer science, data engineering, information systems, or related field
+ Minimum ten years' experience in data engineering, data analytics engineering, architecture, or related roles, with proven experience leading data teams and managing complex data ecosystems
+ Expert level proficiency in dbt and Snowflake with demonstrated ability to build production-grade data models and pipelines
+ Strong knowledge of cloud platforms (AWS, Azure, GCP) and data warehousing best practices
+ Proficiency in big data technologies (Spark, Hadoop) and streaming frameworks
+ Familiarity with data governance, security, and compliance standards
+ Experience with audience segmentation, marketing analytics, or customer data platforms
+ Knowledge of machine learning pipelines, advanced analytics and AI applications
+ Strategic thinking and ability to align data initiatives with business objectives
+ Strong communication and stakeholder management skills
+ Proven ability to lead cross-functional teams and drive organizational change
+ Experience building data solutions that support self-service analytics and data democratization
**Additional Information**
Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity.
Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit ******************** .
Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Originals, Advance Recruitment, Advance Travel & Tourism, BookingsCloud, Cloud Theory, Fox Dealer, Hoot Interactive, Search Optics, Subtext.
_Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law._
_If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information._
Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.