
Data engineer jobs in Bloomfield, MI

- 899 jobs
All
Data Engineer
Data Scientist
Data Architect
Software Engineer
Senior Data Architect
Senior Software Engineer
ETL Architect
Software Systems Engineer
Requirements Engineer
Software Engineer Lead
Configuration Engineer
  • Controls Software Engineer

    Lincoln Electric (4.6 company rating)

    Data engineer job in Shelby, MI

    Lincoln Electric is the world leader in the engineering, design, and manufacturing of advanced arc welding solutions, automated joining, assembly and cutting systems, and plasma and oxy-fuel cutting equipment, and has a leading global position in brazing and soldering alloys. Lincoln is recognized as the Welding Expert™ for its leading materials science, software development, automation engineering, and application expertise, which advance customers' fabrication capabilities to help them build a better world. Headquartered in Cleveland, Ohio, Lincoln Electric is a $4.2B publicly traded company (NASDAQ:LECO) with over 12,000 employees around the world, operations in 71 manufacturing and automation system integration locations across 21 countries, and a worldwide network of distributors and sales offices serving customers in over 160 countries.

    Location: Shelby | Employment Status: Hourly Full-Time | Function: Engineering | Req ID: 26527

    Summary: Fori Automation, LLC, a Lincoln Electric Company, is a global supplier of welding, assembly, material handling, and testing equipment for automotive and non-automotive customers worldwide. Fori Automation focuses on delivering cost-effective, highly engineered products and systems designed and manufactured globally with localized sales, project management, and service. We are seeking an experienced Controls Software Engineer for our Shelby Township, MI site with a background in industrial software development. The Controls Software Engineer will initially support active projects and then transition to completing projects directly. They will take the lead on developing software for new projects and debugging software on new machines. This role requires travel to customer sites for equipment installation and customer interaction.

    What You Will Do:
    - Design PLC software and HMIs for industrial automation equipment
    - Debug and troubleshoot PLC software and HMIs
    - Collaborate with cross-functional teams to maintain project timelines and critical-path milestones
    - Maintain task lists and reports of open items
    - Maintain project design documentation and prepare customer deliverables
    - Ensure the controls engineering process is tracked and followed
    - Assist customers and local tradespeople in troubleshooting equipment issues
    - Conduct end-user training on equipment operation

    Education & Experience Requirements:
    - Electrical Engineering or Computer Engineering degree preferred; Mechatronics degrees will also be considered
    - Minimum of two years of experience as a Controls Engineer or Controls Software Engineer, with experience designing for Rockwell Logix 5000 or Siemens S7-1500 family processors
    - Knowledge of or education in electrical circuits, schematic reading, design, and troubleshooting
    - Experience with electrical CAD systems such as AutoCAD Electrical and/or ePLAN
    - Experience with PLC programming in ladder and structured text
    - Experience programming HMIs
    - Travel required: approximately 30%, domestic and international; weekend work may be required based on project schedules

    Preferred:
    - Experience in computer programming languages such as VB, C/C++, or C#
    - Experience with Rockwell and Siemens HMIs

    Lincoln Electric is an Equal Opportunity Employer.
We are committed to promoting equal employment opportunity for applicants, without regard to their race, color, national origin, religion, sex (including pregnancy, childbirth, or related medical conditions, including, but not limited to, lactation), sexual orientation, gender identity, age, veteran status, disability, genetic information, and any other category protected by federal, state, or local law.
    $77k-99k yearly est. 2d ago
  • GCP Data Engineer

    Miracle Software Systems, Inc. (4.2 company rating)

    Data engineer job in Dearborn, MI

    Experience Required: 8+ years | Work Status: Hybrid

    We're seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deploying on Google Cloud Platform.

    You will:
    - Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers
    - Work on a small agile team to deliver working, tested software
    - Work effectively with fellow data engineers, product owners, data champions, and other technical experts
    - Demonstrate technical knowledge/leadership skills and advocate for technical excellence
    - Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform with solid data warehouse principles
    - Be the Subject Matter Expert in Data Engineering and GCP tool technologies

    Skills Required: BigQuery
    Skills Preferred: N/A

    Experience Required:
    - In-depth understanding of Google's product technology (or another cloud platform) and underlying architectures
    - 5+ years of analytics application development experience
    - 5+ years of SQL development experience
    - 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
    - Experience working in GCP-based Big Data deployments (batch/real-time) leveraging Terraform, BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Airflow, etc.
    - 2+ years of professional development experience in Java or Python, and Apache Beam
    - Extracting, loading, transforming, cleaning, and validating data
    - Designing pipelines and architectures for data processing
    - 1+ year of designing and building CI/CD pipelines

    Experience Preferred:
    - Experience building Machine Learning solutions using TensorFlow, BigQueryML, AutoML, Vertex AI
    - Experience building solution architecture, provisioning infrastructure, and securing reliable data-centric services and applications in GCP
    - Experience with Dataplex
    - Experience with development ecosystems such as Git, Jenkins, and CI/CD
    - Exceptional problem-solving and communication skills
    - Experience working with DBT/Dataform
    - Experience working with Agile and Lean methodologies
    - Team player and attention to detail
    - Performance tuning experience

    Education Required: Bachelor's Degree
    Education Preferred: Master's Degree
    Additional Safety Training/Licensing/Personal Protection Requirements:
    Additional Information: ***POSITION IS HYBRID***

    Primary Skills Required:
    - Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
    - Implement methods for automating all parts of the pipeline to minimize labor in development and production
    - Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products
    - Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting
    - Experience working with all stakeholders to formulate business problems as technical data requirements and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management; this includes designing and deploying a pipeline with automated data lineage
    - Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report out a point of view on the best solution
    - Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine

    Additional Skills Preferred:
    - Strong drive for results and ability to multi-task and work independently
    - Self-starter with proven innovation skills
    - Ability to communicate and work with cross-functional teams and all levels of management
    - Demonstrated commitment to quality and project timing
    - Demonstrated ability to document complex systems
    - Experience creating and executing detailed test plans

    Additional Education Preferred:
    - GCP Professional Data Engineer certification
    - In-depth software engineering knowledge

    (A brief Apache Beam sketch follows this listing.)
    $71k-94k yearly est. 2d ago
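For context on the Apache Beam requirement above, a minimal batch pipeline might look like the following sketch. The input file, the column position of "region", and the output path are hypothetical; this runs locally on the DirectRunner, and a real Dataflow deployment would add pipeline options for project, region, and runner.

```python
# Minimal Apache Beam batch pipeline: read a CSV, count rows per region,
# write results. File name, column position, and output path are hypothetical.
import apache_beam as beam

with beam.Pipeline() as p:  # DirectRunner by default; Dataflow needs options
    (
        p
        | "Read" >> beam.io.ReadFromText("orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeyByRegion" >> beam.Map(lambda f: (f[1], 1))  # assume col 1 = region
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda region, n: f"{region},{n}")
        | "Write" >> beam.io.WriteToText("region_counts")
    )
```

The same pipeline shape works for streaming by swapping the file source for a Pub/Sub read and adding windowing.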
  • Data Architect

    Millennium Software and Staffing Inc. (4.2 company rating)

    Data engineer job in Detroit, MI

    Millennium Software is looking for a Data Architect for one of its direct clients based in Michigan. This is an onsite role. Title: Data Architect. Tax term: W2 only, no C2C. Description (all of the below are must-haves): Senior Data Architect with 12+ years of experience in data modeling. Develop conceptual, logical, and physical data models. Experience with GCP Cloud.
    $91k-117k yearly est. 1d ago
  • GCP Data Architect (only W2 Position - No C2C Accepted) 11.18.2025

    Systems Technology Group, Inc. (STG) (4.0 company rating)

    Data engineer job in Dearborn, MI

    Description: STG is an SEI CMMI Level 5 company with several Fortune 500 and State Government clients. STG has an opening for a GCP Data Architect. Please note that this project assignment is with our own direct clients; we do not go through any vendors. STG only does business with direct end clients. This is expected to be a long-term position. STG will provide immigration and permanent residency sponsorship assistance to those candidates who need it.

    Job Description: Employees in this job function are responsible for designing, building, and maintaining reliable, efficient, and scalable data architecture and data models that serve as a foundation for all data solutions. They also collaborate closely with senior leadership and IT teams to ensure alignment of data strategy with overall business goals.

    Key Responsibilities:
    - Align data strategy to business goals to support a mix of business strategy, improved decision-making, operations efficiency, and risk management
    - Ensure data assets are available, consumable, and secure for end users across the enterprise (applications, platforms, and infrastructure) within the confines of enterprise and security architecture
    - Design and build reliable, efficient, and scalable data architecture to be used by the organization for all data solutions
    - Implement and maintain scalable architectural data patterns, solutions, and tooling to support business strategy
    - Design, build, and launch shared data services and APIs to support and expose data-driven solutions in line with enterprise architecture standards
    - Research and optimize data architecture technologies to enhance and support enterprise technology and data strategy

    Skills Required: PowerBuilder, PostgreSQL, GCP, BigQuery
    - Senior Specialist experience: 10+ years in IT; 7+ years in concentration
    - Must have experience presenting technical material to business users
    - Must be able to envision larger strategies and anticipate where possible synergies can be realized
    - Experience acting as the voice of the architecture/data model and defending its relevancy to ensure adherence to its principles and purpose
    - Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
    - Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer those questions.
    - Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
    - Database Architecture: Design and optimize database schemas using the medallion architecture, which includes relational, star-schema, and denormalized data sets for BI and ML data consumers.
    - Metadata Management: Team with the data governance team so that detailed documentation on data definitions, data lineage, and data quality statistics is available to data consumers.
    - Data Quality Assurance: Establish master data management data modeling that captures the history of how customer, provider, and other party data is consolidated into a single version of the truth.
    - Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
    - Continuous Improvement: Stay abreast of emerging trends in data modeling, analytics platforms, and big data technologies. Recommend enhancements to existing data models and approaches.
    - Performance Optimization: Monitor and optimize data models for query performance and scalability. Troubleshoot and resolve performance bottlenecks in collaboration with database administrators.
    - Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.

    The GCP Data Architect is based in Dearborn, MI. A great opportunity to experience the corporate environment while leading personal career growth.

    Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Vasavi Konda - vasavi.konda(.@)stgit.com and/or contact @(Two-Four-Eight) Seven-One-Two - Six-Seven-Two-Five (@*************. In the subject line of the email, please include: First and Last Name: GCP Data Architect. For more information about STG, please visit us at **************

    Sincerely,
    Vasavi Konda | Recruiting Specialist
    "Opportunities don't happen, you create them."
    Systems Technology Group (STG)
    3001 W. Big Beaver Road, Suite 500
    Troy, Michigan 48084
    Phone: @(Two-Four-Eight) Seven-One-Two - Six-Seven-Two-Five: @************(O)
    Email: vasavi.konda(.@)stgit.com
    $86k-125k yearly est. 1d ago
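As a small illustration of the BigQuery side of the posting above, the sketch below creates a date-partitioned, clustered table with the google-cloud-bigquery client. The project, dataset, and field names are hypothetical placeholders, not anything from the posting.

```python
# Sketch: create a date-partitioned, clustered BigQuery table with the
# google-cloud-bigquery client. Project, dataset, and fields are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.gold.fact_orders", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",             # partition on the event timestamp
)
table.clustering_fields = ["order_id"]

client.create_table(table)        # raises if the table already exists
```

Partitioning plus clustering is the usual lever for the query-performance and cost concerns this role describes.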
  • Sr. Data Engineer/Architect (Python, Pyspark, Airflow, SDET) - (Face to Face Interview)

    Centraprise

    Data engineer job in Auburn Hills, MI

    The Senior Data Engineer & Technical Lead (SDET Lead) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

    Mandatory Skills: Data Engineering, Python, PySpark, CI/CD, Airflow, Workflow Orchestration

    Key Responsibilities:
    - Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
    - Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability.
    - CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
    - Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
    - Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
    - Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
    - Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
    - Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
    - Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
    - Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.

    Includes all of the above skills, plus the following:
    - Minimum of 7+ years of overall IT experience
    - Experienced in waterfall, iterative, and agile methodologies

    Technical Experience:
    - Hands-on Data Engineering: Minimum 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
    - Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
    - CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.
    - Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
    - Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
    - Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
    - Unix/Linux: Strong command-line skills in Unix-like environments.
    - SQL: Solid understanding of SQL for data ingestion and analysis.
    - Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
    - Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software.

    Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.

    Unique Skills:
    - Graduate degree in a related field, such as Computer Science or Data Analytics
    - Familiarity with Test-Driven Development (TDD)
    - A high tolerance for OpenShift, Cloudera, Tableau, Confluence, Jira, and other enterprise tools

    (A minimal Airflow DAG sketch follows this listing.)
    $89k-119k yearly est. 1d ago
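Since the role above centers on Airflow DAG design, here is a minimal, hypothetical DAG in the Airflow 2.x style. The dag_id, schedule, and task bodies are placeholders, not anything specified by the posting.

```python
# Minimal Airflow 2.x DAG sketch: two stub tasks with retries, run daily.
# dag_id, task bodies, and schedule are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull from source system")


def load():
    print("write to warehouse")


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",            # cron strings also work here
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task     # simple linear dependency
```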
  • Senior Data Architect

    LatentView Analytics

    Data engineer job in Detroit, MI

    LatentView Analytics is one of the world's largest and fastest-growing digital analytics firms. We help companies drive digital transformation by helping them combine digital and traditional data to gain a competitive advantage. LatentView provides a 360-degree view of the digital consumer, enabling companies to predict new revenue streams, anticipate product trends and popularity, improve customer retention rates, and optimize investment decisions.

    Role: Databricks Architect
    Location: Detroit, Michigan
    Must be authorized to work in the United States; no visa sponsorship available.

    Key Responsibilities:
    - Architecture Design: Lead the end-to-end design of scalable data solutions on the Databricks Lakehouse Platform, including Delta Lake, Delta Live Tables (DLT), and Unity Catalog.
    - Data Engineering Leadership: Guide teams on best practices for batch/streaming ingestion, Spark optimization (PySpark/Scala), performance tuning, and CI/CD for data pipelines.
    - Automotive Domain Expertise: Understand and architect solutions for automotive-specific challenges (e.g., large-scale telematics, manufacturing data, connected vehicle data).
    - Cloud Integration: Manage and optimize Databricks on AWS, Azure, or GCP, integrating with other cloud services (S3, ADLS, IAM, etc.).
    - Governance & Security: Implement fine-grained access controls, data lineage, and compliance with security standards.
    - Stakeholder Collaboration: Work with business leaders, data scientists, analysts, and product owners to translate needs into technical roadmaps.
    - Innovation: Act as a subject matter expert (SME), recommending emerging tech and driving innovation in data & AI.

    Core Skills & Qualifications:
    - 9-15 years of experience in Databricks, Python, Big Data, Apache Spark, SQL, and Spark SQL
    - Strong hands-on experience in PySpark and Apache Spark
    - Experience building data governance solutions such as Unity Catalog
    - Build a very strong orchestration layer in Databricks/ADF workflows
    - Build CI/CD for Databricks in Azure DevOps
    - Process near-real-time data through Auto Loader and DLT pipelines
    - Implement a security layer in Delta Lake
    - Implement massively parallel processing layers in Spark SQL and PySpark
    - Implement cost-effective infrastructure in Databricks

    (A short Auto Loader sketch follows this listing.)

    EEO Statement: "At LatentView Analytics LLC, we value a diverse, inclusive workforce and we provide equal employment opportunities for all applicants and employees. All qualified applicants for employment will be considered without regard to an individual's race, color, sex, gender identity, gender expression, religion, age, national origin or ancestry, citizenship, physical or mental disability, medical condition, family care status, marital status, domestic partner status, sexual orientation, genetic information, military or veteran status, or any other basis protected by federal, state or local laws. If you are unable to submit your application because of incompatible assistive technology or a disability, please contact us at ********************. LatentView Analytics LLC will reasonably accommodate qualified individuals with disabilities to the extent required by applicable law."
    $89k-119k yearly est. 1d ago
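The posting's mention of Auto Loader corresponds to Databricks' "cloudFiles" streaming source. A minimal sketch, assuming a Databricks runtime where `spark` is predefined; the storage paths and table name are hypothetical.

```python
# Sketch: Databricks Auto Loader ("cloudFiles") streaming JSON into a bronze
# Delta table. Assumes a Databricks runtime where `spark` is predefined;
# paths and table names are hypothetical.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/chk/orders/_schema")
    .load("/mnt/raw/orders")
)

(
    df.writeStream
    .option("checkpointLocation", "/mnt/chk/orders")
    .trigger(availableNow=True)   # drain available files, then stop
    .toTable("bronze.orders")
)
```

The schemaLocation option lets Auto Loader track inferred schemas across runs, which is what makes the incremental, near-real-time pattern practical.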
  • Configuration Engineer - NO C2C

    A-Line Staffing Solutions (3.5 company rating)

    Data engineer job in Troy, MI

    Title: Configuration Engineer
    Rate: $45/hr. This is a contract on W2 and is NOT open to C2C.

    This Configuration Engineer will be responsible for the configuration build-out of mortgage-related business processes. This includes configuring workflows, creating automated decision points and tasks, and managing system users. They will work with our Business Process Management team to support loan servicing business partners, vendors, and data providers as directed by Technology and Product Development leadership. The role calls for strong consultative skills, root cause analysis, a strong understanding of data modeling, the ability to provide solutions and alternative methods to meet internal client expectations, and an in-depth understanding of process workflow and data integrations such as APIs.

    Responsibilities:
    - Create and manage automated workflow solutions in a low-code environment, including developing program modules.
    - Support data reporting partners and data integration partners in testing and provide insight into the data structure in the Business Management Model.
    - Recommend and facilitate system enhancements to improve efficiencies throughout the servicing organization.
    - Support internal and external partners in resolving defects by triaging issues, identifying root-cause failures, and providing solutions to facilitate a fix.
    - Learn and master the low-code platform from the perspective of both end users and engineers.
    - Develop and maintain workflow automations and third-party integrations via API from data providers and internal data owners.
    - Coordinate with business partners and vendors to execute requirements.
    - Support training teams in understanding the workflow automation.
    - Support internal customers to provide a positive technical experience.
    - Interface with other departments as necessary to ensure the smooth operation and growth of the organization.
    - Design, document, manage testing of, and deliver solutions for assigned program modules.
    - Other projects and assignments as needed.

    Qualifications and Experience:
    - Bachelor's degree in science or equivalent experience
    - 2-5 years in SaaS application deployment and/or similar experience
    - Proficient in a programming query language
    - Able to read and understand API documentation; versed in API authentication methods including OAuth, Basic Auth, tokens, and SAML
    - Proficient in working with REST APIs and in a major programming language
    - Understanding of mortgage servicing processes, including default servicing
    - Ability to work in a fast-paced, fluid environment
    - Excellent communication skills, both written and verbal
    - Ability to work independently and as a member of various teams and committees
    - Commitment to excellence and high standards

    Not required but nice to have:
    - Experience integrating attorney networks and common vendors in the industry
    - Experience with JavaScript

    (A brief OAuth/REST sketch follows this listing.)
    $45 hourly 4d ago
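As a small illustration of the API-integration skills listed above (OAuth plus REST), the following sketch obtains a client-credentials token and calls a hypothetical endpoint with `requests`. Both URLs, the credential values, and the response shape are assumptions, not a real vendor API.

```python
# Sketch: OAuth client-credentials flow plus a REST call with `requests`.
# Both URLs, the credentials, and the response shape are assumptions.
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"  # hypothetical
API_URL = "https://api.example.com/v1/loans"        # hypothetical

tok = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "my-client-id",
        "client_secret": "my-secret",
    },
    timeout=30,
)
tok.raise_for_status()
token = tok.json()["access_token"]

resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()
for loan in resp.json().get("items", []):   # assumed payload shape
    print(loan["id"], loan["status"])
```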
  • CAE Engineer

    Pentangle Tech Services | P5 Group

    Data engineer job in Novi, MI

    Job title: THERMAL-3D MANAGEMENT CAE SPECIALIST
    Location: Novi, MI (Onsite)
    Duration: Contract

    Job Description: Responsible for designing and optimizing thermal management systems for vehicle components, this role focuses on ensuring the reliability and longevity of critical components such as batteries, power electronics, and electric motors. The Thermal Management Engineer leverages advanced simulation tools to analyze and improve thermal performance under various operating conditions.

    Key Responsibilities:
    • Develop and implement thermal management strategies for high-power components, including batteries, inverters, electric motors, and power electronics, using tools such as ANSYS, GT-SUITE, and STAR-CCM+.
    • Conduct thermal simulations to evaluate and enhance the performance of cooling and heating systems under different operating conditions, such as high-speed driving, rapid acceleration, and charging.
    • Collaborate with powertrain, electrical, and control teams to integrate thermal management solutions into overall vehicle design, ensuring compatibility and efficiency.
    • Analyze heat transfer and cooling requirements, assessing the effectiveness of components like radiators, heat exchangers, and HVAC systems to maintain optimal operating temperatures in electric and hybrid vehicle applications.
    • Optimize battery thermal management to ensure consistent performance, prolong battery life, and enhance vehicle range under various environmental conditions.
    • Validate thermal models against real-world data by conducting physical tests under different load and environmental conditions, adjusting simulation parameters as needed to improve accuracy.
    • Document thermal analysis results, preparing detailed reports with recommendations for design improvements to enhance cooling efficiency and component reliability.
    • Research and implement advanced cooling technologies, such as phase change materials, liquid cooling systems, and thermal insulation, to improve overall vehicle thermal performance.
    • Proficiency in thermal simulation tools like ANSYS, GT-SUITE, and STAR-CCM+.
    • Use expertise to correlate virtual to physical; sign off virtual performance based on simulation models.

    (A toy lumped thermal model sketch follows this listing.)
    $64k-85k yearly est. 5d ago
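The simulations this role describes run in commercial tools such as ANSYS or STAR-CCM+, but the underlying heat-balance idea can be illustrated with a toy one-node lumped-capacitance battery model. All parameter values below are illustrative assumptions, not real cell data.

```python
# Toy 1-node lumped-capacitance battery thermal model:
#   m*c * dT/dt = Q_gen - h*A * (T - T_amb)
# All parameter values are illustrative assumptions, not real cell data.
from scipy.integrate import solve_ivp

m_c = 1200.0    # thermal mass m*c [J/K]
h_A = 15.0      # convective conductance h*A [W/K]
T_amb = 25.0    # ambient temperature [degC]
Q_gen = 250.0   # heat generated under load [W]


def dT_dt(t, T):
    return [(Q_gen - h_A * (T[0] - T_amb)) / m_c]


sol = solve_ivp(dT_dt, (0.0, 3600.0), [T_amb])
print(f"cell temperature after 1 h: {sol.y[0, -1]:.1f} degC")
# steady state is T_amb + Q_gen/h_A, about 41.7 degC with these numbers
```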
  • Java Software Engineer

    Mindlance (4.6 company rating)

    Data engineer job in Ann Arbor, MI

    Looking for candidates local to Ann Arbor, MI.

    Required Skills:
    • 5+ years of Java, J2EE, and web/internet-based programming experience (both client and server side)
    • 5+ years of experience with OOA/OOD, distributed systems/software, real-time processing, relational database systems, and messaging systems
    • Experience with concurrency & multi-threading
    • Experience with scaling, Java garbage collection, and performance tuning preferred
    • Deep understanding of data structures, algorithms, and design patterns (GoF)
    • Experience with agile, test-driven development
    • Experience with Unix/Linux
    • Experience with build, deploy, and test automation tools like Ant, Gradle, Maven, Jenkins, TeamCity, JUnit, TestNG, JaCoCo, or similar tools
    • Demonstrated experience working with core business logic within applications
    • Experience in developing APIs and frameworks
    • Excellent written and verbal communication skills

    Preferred Skills:
    • Experience with application development frameworks like Spring, Hibernate, JSF, or similar frameworks
    • Experience with compilers or DSLs preferred

    "Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans."
    $67k-88k yearly est. 4d ago
  • Senior Java Software Engineer

    Apexon

    Data engineer job in Detroit, MI

    Sr. Fullstack Java Developer - Detroit, MI - Onsite
    Duration: 1 year | Employment Type: Contract | Can attend an in-person interview

    We are looking for an experienced Fullstack Java Developer (12-15 years of experience) to join our team for a long-term engagement. The ideal candidate will have strong hands-on experience across Java, Spring, front-end frameworks, databases, and cloud-ready tools, with the ability to lead a team and work directly with customers.

    Responsibilities (brief):
    - Develop and enhance applications using Java 17/8+, Spring Framework, JSON/XML, and AngularJS / Angular 8-11 / React.js. Strong hands-on coding experience is a must.
    - Work with MongoDB, MySQL, SQL, and NoSQL databases.
    - Support upgrade/migration projects using Java, Spring, and Gradle.
    - Must have at least 3 years of experience in deployment (CI/CD pipelines).
    - Lead development activities and guide technical teams.
    - Follow Agile methodologies and drive customer value.
    - Participate in client discussions and deliver quality solutions.
    - Preferred: experience with front-end technologies and the healthcare insurance domain.
    - Communicate effectively with technical and business stakeholders.

    Required Technical Skills:
    - Java - Mandatory | 10+ years
    - AngularJS / Angular 8-11 - Mandatory | 5+ years
    - Spring Framework - Mandatory | 5+ years
    - JSON / XML - Mandatory | 5+ years
    - MongoDB / MySQL / SQL / NoSQL DBs - Mandatory | 5+ years
    - Gradle - Mandatory | 5+ years

    Good to Have:
    - Spring Boot - 3+ years
    - AngularJS / React.js / JSP - 3+ years
    - IntelliJ - 3+ years
    $82k-107k yearly est. 1d ago
  • Robotics Software/Systems Engineer

    Akkodis

    Data engineer job in Warren, MI

    A Robotics Software/Systems Engineer job in Warren, MI is available courtesy of Akkodis. We are seeking a Senior Engineer, AI Systems Engineering - Integration to join a Manufacturing Technology Development team within the Research and Development organization. In this role, you will lead system-level integration of new technologies, validating novel AI and robotics algorithms in full-stack collaborative robot prototypes. You will develop frameworks for iterative assembly and testing, ensuring innovations can be evaluated in realistic workflows. You will serve as the convergence point where Robotics Intelligence breakthroughs and AI & Simulation models are combined into functional prototypes.

    Pay: $40/hr to $60/hr

    Robotics Software/Systems Engineer job responsibilities:
    - Lead integration of AI, perception, and robotics software into full-stack prototype systems.
    - Develop and maintain frameworks for iterative build, test, and validation cycles.
    - Ensure innovations are evaluated under realistic, production-relevant workflows.
    - Collaborate closely with Robotics Intelligence, AI & Simulation, Controls, and Hardware teams.
    - Manage system-level prototype bring-up, debugging, and performance validation.

    Qualifications:
    - Bachelor's degree in Robotics, Computer Engineering, Electrical Engineering, or a related field.
    - 5+ years of experience in robotics software or systems integration.
    - Strong background in AI model deployment, ROS/ROS2, and hardware-software integration.
    - Experience working with collaborative robots, sensors, and real-world task workflows.
    - Excellent system-level debugging, communication, and cross-functional collaboration skills.

    (A minimal ROS 2 node sketch follows this listing.)

    If you are interested in this Software/Systems Engineer job in Warren, MI, please click APPLY NOW. For other opportunities available at Akkodis, go to **************** If you have questions about the position, please contact *****************************.

    Equal Opportunity Employer/Veterans/Disabled. Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits, and a 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by federal, state, or local law; and holiday pay upon meeting eligibility criteria. Disclaimer: these benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client. To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit ********************************************** The Company will consider qualified applicants with arrest and conviction records.
    $40 hourly 4d ago
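For the ROS/ROS2 requirement above, a minimal ROS 2 node in Python (rclpy) looks like the sketch below; the node and topic names are hypothetical, chosen only to illustrate the publisher/timer pattern.

```python
# Minimal ROS 2 node in Python (rclpy): publishes a heartbeat on a
# hypothetical "status" topic once per second.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class PrototypeBridge(Node):
    def __init__(self):
        super().__init__("prototype_bridge")
        self.pub = self.create_publisher(String, "status", 10)
        self.create_timer(1.0, self.tick)  # fire tick() every second

    def tick(self):
        msg = String()
        msg.data = "heartbeat"
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = PrototypeBridge()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```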
  • Mid-Senior Software Engineer (Go/TypeScript/C++): $125-185K

    IC Resources (4.4 company rating)

    Data engineer job in Ann Arbor, MI

    We're assisting our European-based engineering client in identifying a Senior GoLang Software Engineer as they build out their US headquarters in Ann Arbor, Michigan. This is a very exciting opportunity to be among the first members of an emerging-tech team here in the U.S. We're only considering local, Michigan candidates at this time. Candidates must be able to obtain a Security Clearance (US citizen).

    - Highly competitive salary and benefits
    - Ability to work several days from home
    - Cutting-edge/unique tech: greenfield development
    - Your contributions will have true impact

    What we're looking for in a Senior GoLang Engineer:
    - 3+ years of software engineering experience
    - Highly proficient with Go / GoLang
    - Proficient in C++ and Linux environments
    - Experience with TypeScript
    - CUDA experience is icing on the cake!
    - Experience in an AWS environment highly preferred
    - Experience in machine learning and new model architectures
    - B.S. degree in Computer Science or STEM
    $92k-120k yearly est. 2d ago
  • Data Scientist

    Ford Global

    Data engineer job in Dearborn, MI

    At Ford Motor Company, we believe freedom of movement drives human progress. We also believe in providing you with the freedom to define and realize your dreams. With our incredible plans for the future of mobility, we have a wide variety of opportunities for you to accelerate your career potential as you help us define tomorrow's transportation.

    Do you believe data tells the real story? We do! Redefining mobility requires quality data, metrics, and analytics, as well as insightful interpreters and analysts. That's where Global Data Insight & Analytics makes an impact. We advise leadership on business conditions, customer needs, and the competitive landscape. With our support, key decision makers can act in meaningful, positive ways. Join us and use your data expertise and analytical skills to drive evidence-based, timely decision making.

    You'll have... Master's degree or foreign equivalent in Computer Science, Data Science, Computer Engineering, or a related field and 3 years of experience in the job offered or a related occupation.

    3 years of experience with each of the following skills is required:
    1. Developing NLP pipelines in Python using at least 2 of the following: NLTK, spaCy, Gensim, or HuggingFace.
    2. Implementing text preprocessing workflows, creating feature extraction algorithms, and building and training models with scikit-learn, TensorFlow, or PyTorch.
    3. Developing reusable modules for NLP and writing production-ready code.
    4. Querying large datasets in SQL to extract textual information.
    5. Designing database schemas optimized for NLP applications.
    6. Writing complex queries to join structured and unstructured data sources.
    7. Creating ETL processes for text data, optimizing query performance for large text corpora, and implementing database operations in analytics pipelines.
    8. Applying supervised machine learning techniques to NLP problems, implementing unsupervised methods for text analysis, evaluating model performance with appropriate metrics, building ensemble models, conducting hyperparameter optimization, and applying transfer learning with pre-trained embeddings.
    9. Deploying NLP models and pipelines on Google Cloud Platform (GCP) infrastructure.
    10. Utilizing AI Platform for training and serving ML models.
    11. Managing data storage with Cloud Storage, BigQuery, or Cloud SQL.
    12. Implementing data processing pipelines with Dataflow or Dataproc.

    2 years of experience with each of the following skills is required:
    1. Managing code versioning for collaborative NLP model development, implementing code review processes, and resolving merge conflicts in multi-developer environments.
    2. Using Git for CI/CD integration with model deployment and organizing repositories for maintainable ML codebases.

    1 year of experience with each of the following skills is required:
    1. Using Cloud Functions for serverless text processing and monitoring model performance.
    2. Containerizing NLP applications for deployment, creating and managing deployment configurations, and setting up routes and services with OpenShift.
    3. Implementing resource allocation and scaling strategies, configuring persistent storage for models and data, and managing deployments with rolling updates.
    4. Fine-tuning pre-trained Large Language Models (BERT, GPT, or T5) for domain-specific tasks.
    5. Implementing prompt engineering techniques and evaluating LLM outputs for accuracy.
    6. Creating embeddings for semantic search, optimizing inference for production, and reducing hallucinations and improving factuality.
    7. Building CI/CD pipelines in Tekton for NLP model deployment.
    8. Creating reusable pipeline components for text processing, managing workflow triggers, and implementing testing and validation steps.
    9. Configuring resource requirements, integrating model evaluation metrics, and setting up automated retraining pipelines.

    We are offering a salary of $107,848.00 - $182,338.56/yr. You may not check every box, or your experience may look a little different from what we've outlined, but if you think you can bring value to Ford Motor Company, we encourage you to apply! As an established global company, we offer the benefit of choice. You can choose what your Ford future will look like: will your story span the globe, or keep you close to home? Will your career be a deep dive into what you love, or a series of new teams and new skills? Will you be a leader, a changemaker, a technical expert, a culture builder…or all the above? No matter what you choose, we offer a work life that works for you, including:
    • Immediate medical, dental, and prescription drug coverage
    • Flexible family care, parental leave, new parent ramp-up programs, subsidized back-up child care and more
    • Vehicle discount program for employees and family members, and management leases
    • Tuition assistance
    • Established and active employee resource groups
    • Paid time off for individual and team community service
    • A generous schedule of paid holidays, including the week between Christmas and New Year's Day
    • Paid time off and the option to purchase additional vacation time
    For a detailed look at our benefits, click here: *******************************

    Candidates for positions with Ford Motor Company must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, age, sex, national origin, sexual orientation, gender identity, disability status, or protected veteran status. In the United States, if you need a reasonable accommodation for the online application process due to a disability, please call **************. #LI-DNI #DNP

    What you'll be able to do: Data Scientist - positions offered by Ford Motor Company (Dearborn, Michigan). Note, this is a purely telecommuting/work-from-home position whereby the employee may reside anywhere within the U.S. Develop and deploy Natural Language Processing (NLP) models to extract insights from unstructured textual data. Collaborate with cross-functional teams to identify opportunities and develop strategies for applying NLP techniques to enhance quality analytics. Design and implement data pre-processing and feature engineering techniques for NLP tasks. Utilize supervised and unsupervised machine learning techniques to solve complex NLP problems. Evaluate and fine-tune models for performance optimization, accuracy, and efficiency. Contribute maintainable code to existing and new pipelines. Stay up-to-date with the latest advancements in NLP and contribute to the continuous improvement of methodologies and algorithms. Communicate findings, insights, and recommendations to stakeholders in a clear and concise manner.

    (A toy scikit-learn text-classification sketch follows this listing.)
    $107.8k-182.3k yearly 60d+ ago
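As a toy illustration of the scikit-learn NLP skills the posting enumerates, the sketch below trains a TF-IDF plus logistic-regression classifier on a handful of synthetic warranty-style comments; all data is made up for illustration.

```python
# Toy text-classification sketch: TF-IDF features + logistic regression.
# The warranty-style comments and labels below are synthetic.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "engine stalls at idle",
    "transmission slips in second gear",
    "check engine light after cold start",
    "paint chipped on hood",
    "rust spots on door panel",
    "clear coat peeling on roof",
]
labels = ["powertrain", "powertrain", "powertrain", "body", "body", "body"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("lr", LogisticRegression(max_iter=1000)),
])
clf.fit(texts, labels)
print(clf.predict(["hood paint bubbling near the seam"]))  # likely ['body']
```

A production version would swap the toy corpus for a real labeled dataset and add held-out evaluation, which is exactly the model-metrics work the posting describes.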
  • ETL Architect

    360 IT Professionals (3.6 company rating)

    Data engineer job in Southfield, MI

    360 IT Professionals is a software development company based in Fremont, California that offers complete technology services in mobile development, web development, cloud computing, and IT staffing. Merging information technology skills in all its services and operations, the company caters to its globally positioned clients by providing dynamic, feasible IT solutions. 360 IT Professionals works along with its clients to deliver high-performance results, based exclusively on each one-of-a-kind requirement. Our services are vast and we produce software and web products. We specialize in mobile development, i.e., iPhone and Android apps. We use the Objective-C and Swift programming languages to create native applications for iPhone, whereas we use Android code to develop native applications for Android devices. To create applications that work across platforms, we use a number of frameworks such as Titanium, PhoneGap, and jQuery Mobile. Furthermore, we build web products and offer services such as web design, layouts, responsive design, graphic design, web application development using frameworks based on model-view-controller architecture, and content management systems. Our services also extend to the domain of cloud computing, where we provide Salesforce CRM to effectively manage one's business and ease all operations with an easy platform. Apart from this, we also provide IT staffing services that can help your organization to a great extent, as you can hire highly skilled personnel through us. We make sure that we deliver performance-driven products that are optimally developed as per your organization's needs. Take a shot at us for your IT requirements and experience a radical change.

    Job Description
    Position: ETL Architect
    Location: Southfield, MI
    Duration: Contract to hire
    Need candidates on W2 only; 15-17 years of experience.

    · This person will lead teams and work with management and executives
    · Must have excellent communication
    · This person is not hands-on but must be able to speak to and understand how things work (healthcare)
    · Must have 3-4 years as an architect and be able to show their career progression
    · Cognos and Business Objects are nice to have

    The Informatica ETL Architect has overall responsibility for assessing requirements and defining the strategy, technical architecture, implementation plan, and delivery of data warehouse projects. The Informatica ETL Architect must have prior experience completing successful data warehousing implementations as well as a broad background and experience with IT application development. This individual is responsible for establishing the long-term strategy and technical architecture as well as the short-term scope for a multi-phased data warehouse effort, and should have strong professional consulting skills and the ability to communicate well at all levels of the organization.

    Top Skill Set:
    • Lead ETL architecture and design as well as data flow diagramming
    • Define and implement ETL development standards & procedures
    • Ensure ETL quality through code reviews and thorough inspection & knowledge sharing
    • At least 12 years of experience with Informatica in a Developer/Tech Lead role
    • Knowledge of health care insurance payer data warehousing preferred
    • Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects

    Required Skills/Experience:
    • 12 to 16 years of Informatica ETL development experience
    • At least 4 years of experience as an Informatica ETL Architect
    • At least 8-10 years of experience with Informatica in a Developer/Tech Lead role
    • Mastery of data warehousing concepts. The candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects.
    • MUST HAVE strong SQL skills in an Oracle partitioned environment
    • Experience in business intelligence reporting tools like Cognos and Business Objects preferred
    • Experience in Oracle database programming using partitioning, materialized views, and OLAP
    • Experience in tuning Oracle queries/processes and with performance management tools

    TOOLS & TECHNOLOGIES:
    - Informatica 8.x and above (9.1 preferred): PowerCenter, PowerExchange, Data Quality
    - Oracle 10g and above
    - Unix shell scripting (AIX, Linux)
    - Scheduling tools (any one of Tivoli, Autosys, and Ctrl-M)
    - SQL

    Additional Information
    Regards,
    Vishal Rana
    Talent & Client Acquisition Specialist
    Phone: 510 254 3300 Ext 178
    $102k-128k yearly est. 11h ago
  • Data Scientist 1

    Denso Career Connection

    Data engineer job in Southfield, MI

    Under general supervision, applies knowledge of statistics, machine learning, programming, data modeling, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to increased efficiency and productivity. Analysis focuses on product warranty related data sets that may be applied to various areas of the business (e.g., Manufacturing, Design, Marketing/Advertising, etc.).

    ESSENTIAL DUTIES AND RESPONSIBILITIES:
    - Develops and maintains statistical models capable of extracting valuable business and/or technical insights from large data sets.
    - Proactively participates in developing scalable Business Intelligence (BI) and predictive analytics solutions using a variety of techniques ranging from data aggregation to data mining.
    - Assists with conducting needs assessments and requirements gathering to design and assist in deploying data analytics solutions.
    - Performs data manipulation and analytics and translates insights into actionable recommendations for management and customers.
    - Prepares presentations and reports on statistical concepts and research results related to efficiency initiatives to be shared with a non-statistical audience and senior stakeholders.
    - Facilitates training and education of associates related to new systems and procedures.
    - Creates standards for process improvement projects using data analytics and statistical analysis theory.
    - Applies statistical expertise to advance current and future regional initiatives.
    - Maintains current knowledge of industry trends in the field of Big Data analytics by attending related seminars, conferences, and training sessions.
    - Performs other duties as assigned.

    QUALIFICATIONS:
    - Bachelor's degree in math, statistics, computer science, software engineering, or a related field
    - Experience with Node.js, SQL, Python, APIs, MySQL/MongoDB, large language models, and business intelligence tools
    - 0+ years of related experience

    SKILLS AND ABILITIES:
    - Experience with databases and large data sets and related reporting tools
    - Analytical skills with an ability to independently evaluate and develop innovative solutions to complex situations
    - Basic knowledge of automotive parts and related vehicle systems (e.g., charging/starting electrical, emission, HVAC systems, vehicle electronic components)
    - Written and verbal communication skills and presentation skills
    - Ability to communicate with internal and external customers on issues of moderate to considerable importance, up to and including senior management
    - Demonstrated ability to foster and maintain relationships with an ability to work as part of a cross-functional team
    - Continuous improvement mindset
    - Ability to apply process improvement planning and checking to own work output and to assist others with identifying gaps in work results

    Benefits Summary: Health, dental, vision, and prescription drug plans; life and accidental death & dismemberment insurance; flexible spending account; employee assistance program; 401K with 4% company match; bonus program; wellness program; onsite fitness center (varies by location); tuition reimbursement; career development and ongoing training; paid holidays and vacation; cafeteria and food markets (vary by location); volunteer opportunities; employee recognition (employee and milestone events).

    Annual Salary: $78,000 - $98,000

    (A small pandas aggregation sketch follows this listing.)
    $78k-98k yearly 60d+ ago
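As a minimal sketch of the kind of warranty-data aggregation this role describes, here is a pandas groupby over synthetic claims; the column names and values are hypothetical.

```python
# Sketch: aggregate synthetic warranty claims by plant and part with pandas.
import pandas as pd

claims = pd.DataFrame({
    "part":  ["alternator", "compressor", "alternator", "starter"],
    "plant": ["A", "A", "B", "B"],
    "cost":  [310.0, 540.0, 295.0, 120.0],
})

summary = (
    claims.groupby(["plant", "part"])
    .agg(claim_count=("cost", "size"), total_cost=("cost", "sum"))
    .sort_values("total_cost", ascending=False)
)
print(summary)
```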
  • Data Scientist

    Cs&S Staffing Solutions

    Data engineer job in Detroit, MI

    Please review and apply for this position through the QCI system following the link below (copy and paste): http://tinyurl.com/nzn6msu *You can apply through Indeed using mobile devices with this link.

    Job Description
    The Data Scientist will delve into the recesses of large data sets of structured, semi-structured, and unstructured data to discover hidden knowledge about our business and develop methods to leverage that knowledge within our line of business. The successful candidate will combine strengths in mathematics and applied statistics, computer science, and visualization capabilities with a healthy sense of exploration and knowledge acquisition. You must have USA/Canadian citizenship or your Green Card/EAD.

    Responsibilities:
    - Work closely with various teams across the company to identify and solve business challenges utilizing large structured, semi-structured, and unstructured data in a distributed processing environment.
    - Develop predictive statistical, behavioral, or other models via supervised and unsupervised machine learning, statistical analysis, and other predictive modeling techniques.
    - Drive the collection of new data and the refinement of existing data sources.
    - Analyze and interpret the results of product experiments.
    - Collaborate with the engineering and product teams to develop and support our internal data platform to support ongoing analyses.

    Requirements:
    - M.S. or Ph.D. in a relevant technical field (e.g., applied mathematics, statistics, physics, computer science, operations research), or 3+ years of experience in a relevant role.
    - Extensive experience solving analytics problems using quantitative approaches.
    - A proven passion for generating insights from data.
    - Strong knowledge of statistical methods generally, particularly in the areas of modeling and business analytics.
    - Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources.
    - Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner.
    - Fluency with at least one scripting language such as Python, Java, or C/C++.
    - Familiarity with relational databases and SQL.

    Additional Information: All your information will be kept confidential according to EEO guidelines.
    $69k-96k yearly est. 11h ago
  • Lead Data Scientist

    OneMagnify

    Data engineer job in Detroit, MI

    OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The Company's core offerings accelerate business, amplify real-time results, and help set their clients apart from their competitors. OneMagnify partners with clients to design, implement, and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion.

    OneMagnify's commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace, and Cool Workplace in the United States for 10 consecutive years and was recently recognized as a Top Workplace in India.

    You'll be joining our RXA Data Science team, a group dedicated to leveraging advanced analytics, predictive modeling, and machine learning to drive smarter marketing and business decisions. As Lead Data Scientist, you will play a critical role in delivering impactful, data-driven solutions. In this role, you will bridge strategy and execution, translating complex business problems into analytically sound solutions while ensuring technical excellence, timely delivery, and cross-functional collaboration. The Lead Data Scientist is responsible for leading the execution of end-to-end data science projects, from scoping and modeling to operationalization and insight delivery. You will partner with clients, internal teams, and technical stakeholders to develop and deploy scalable solutions that drive measurable business value.

    What you'll do:
    - Lead the design, development, and deployment of statistical models, machine learning algorithms, and custom analytics solutions
    - Collaborate consistently with team members to understand the purpose, focus, and objectives of each data analysis project, ensuring alignment and meaningful support
    - Translate client goals into clear modeling strategies, project plans, and deliverables
    - Guide the development of production-level model pipelines using tools such as Databricks and Azure ML
    - Collaborate with engineering, marketing, and strategic partners to integrate models into real-world applications
    - Monitor and improve model performance, ensuring high standards for reliability and business relevance
    - Present complex analytical results to technical and non-technical audiences in a clear, actionable format
    - Support innovation by identifying new tools, methods, and data sources, including the use of Snowflake for modern data architecture
    - Promote best practices in model governance, data ethics, and responsible AI

    What you need:
    - Minimum 5-7 years of experience in data science, analytics, or predictive modeling
    - Experience leading all aspects of sophisticated data science initiatives with a solid foundation in technical strategy and execution
    - Strong programming skills in Python, R, or SAS for modeling and data analysis
    - Advanced SQL capabilities and experience working in cloud-based environments (e.g., Azure, AWS)
    - Hands-on experience with Databricks, Azure Machine Learning, and Snowflake strongly preferred
    - Experience applying the modeling rigor and documentation standards required in regulated industries such as financial services is a strong plus
    - Expertise in regression, classification, clustering, A/B testing, and audience segmentation
    - Proficiency with Tableau, Power BI, and Excel for data visualization and communication
    - Strong communication skills and the ability to translate complex technical findings into business insight
    - Bachelor's degree in Data Science, Statistics, Computer Science, or a related quantitative field (Master's preferred)

    Benefits: We offer a comprehensive benefits package including medical, dental, 401(k), paid holidays, vacations, and more.

    About us: Whether it's awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications, and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges.

    We are an equal opportunity employer. We believe that innovative ideas and solutions start with unique perspectives. That's why we're committed to providing every employee a workplace that's free of discrimination and intolerance. We're proud to be an equal opportunity employer and actively search for like-minded people to join our team. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform job functions, and to receive benefits and privileges of employment. Please contact us to request accommodation.

    (A toy k-means segmentation sketch follows this listing.)
    $69k-96k yearly est. 8d ago
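Audience segmentation, one of the areas of expertise listed above, is often prototyped with k-means clustering. A minimal sketch on synthetic RFM-style features; the data and cluster count are illustrative assumptions.

```python
# Toy audience-segmentation sketch: k-means on synthetic RFM-style features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))      # stand-ins for recency/frequency/monetary

X_scaled = StandardScaler().fit_transform(X)  # scale before clustering
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_scaled)
print(np.bincount(km.labels_))     # size of each segment
```

Scaling first matters because k-means is distance-based; unscaled monetary values would otherwise dominate the segments.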
  • Data Scientist III

    United Wholesale Mortgage Corp. (DBA UWM) (4.6 company rating)

    Data engineer job in Pontiac, MI

    Team members in the Data Scientist role at UWM are responsible for modeling complex problems, discovering insights and identifying opportunities using statistics, algorithms, machine learning, and visualization techniques. Data Scientists work closely with executives, product owners, SME's, and other business teams to leverage data and help answer critical business decisions. Data Scientists at UWM need to be creative thinkers and propose innovative ways to look at problems by examining and discovering new patterns within our datasets and collaborating with our business stakeholders. They will need to validate their results using an experimental and iterative approach. Perhaps most importantly, they will need to be able to communicate their insights and results to the business in a clear, concise, and approachable way. They need to be storytellers of their work. These professionals will need a combination of business focus, data programming knowledge, and strong analytical and problem solving skills, to be able to quickly develop and test hypothesis, and provide conclusions in a clear, structured manner. This role includes the full data science lifecycle from analytic problem definition, through data wrangling, analysis, model development, reporting/visualization development, testing, deployment, and feedback. WHAT YOU WILL BE DOING * Work with stakeholders throughout the organization to identify opportunities for leveraging company data to increase efficiency or improve the bottom line. * Analyze UWM data sets to identify areas of optimization and improvement of business strategies. * Assess the effectiveness and accuracy of new data sources and data gathering techniques. * Develop custom data models, algorithms, simulations, and predictive modeling to support insights and opportunities for improvement. * Develop A/B testing framework and test model quality. * Coordinate with different business areas to implement models and monitor outcomes. * Develop processes and tools to monitor and analyze model performance and data accuracy WHAT WE NEED FROM YOU Must Have * Bachelor's degree in Finance, Statistics, Economics, Data Science, Computer Science, Engineering or Mathematics, or related field * 5+ years of experience in statistical analysis, and/or machine learning * 5+ years of experience with one or more of the following tools: machine learning (Python, MATLAB), data wrangling skills/tools (Hadoop, Teradata, SAS, or other), statistical analysis (Python, R, SAS) and/or visualization skills/tools (PowerBI, Tableau, Qlikview) * 3+ years of experience collaborating with teams (either internal or external) to develop analytics solutions * Strong problem solving skills * Strong communication skills (interpersonal, written, and presentation) Nice to Have * Master's degree in Finance, Statistics, Economics, Data Science, Computer Science, Mathematics or related field * 3+ years of experience with R, SQL, Tableau, MATLAB, Python * 3+ years of professional experience in machine learning, data mining, statistical analysis, modeling, optimization * Experience in Accounting, Finance, and Economics THE PLACE & THE PERKS Ready to join thousands of talented team members who are making the dream of home ownership possible for more Americans? It's all happening on UWM's campus, where our award-winning workplace packs plenty of perks and amenities that keep the atmosphere buzzing with energy and excitement. It's no wonder that out of our six pillars, People Are Our Greatest Asset is number one. 
    THE PLACE & THE PERKS
    Ready to join thousands of talented team members who are making the dream of home ownership possible for more Americans? It's all happening on UWM's campus, where our award-winning workplace packs plenty of perks and amenities that keep the atmosphere buzzing with energy and excitement. It's no wonder that out of our six pillars, People Are Our Greatest Asset is number one. It's at the very heart of how we treat each other, our clients, and our community. Whether it's providing elite client service or continuously striving to improve, our pillars provide a pathway to a more successful personal and professional life. From the team member that holds a door open to the one that helps guide your career, you'll feel the encouragement and support on day one. No matter your race, creed, gender, age, sexual orientation, or ethnicity, you'll be welcomed here. Accepted here. And empowered to Be You Here.

    More reasons you'll love working here include:
    * Paid Time Off (PTO) after just 30 days
    * Additional parental and maternity leave benefits after 12 months
    * Adoption reimbursement program
    * Paid volunteer hours
    * Paid training and career development
    * Medical, dental, vision and life insurance
    * 401k with employer match
    * Mortgage discount and area business discounts
    * Free membership to our large, state-of-the-art fitness center, including exercise classes such as yoga and Zumba, various sports leagues and a full-size basketball court
    * Wellness area, including an in-house primary-care physician's office, full-time massage therapist and hair salon
    * Gourmet cafeteria featuring homemade breakfast and lunch
    * Convenience store featuring healthy grab-and-go snacks
    * In-house Starbucks and Dunkin'
    * Indoor/outdoor café with Wi-Fi

    DISCLAIMER
    All the above duties and responsibilities are essential job functions subject to reasonable accommodation and change. All job requirements listed indicate the minimum level of knowledge, skills and/or ability deemed necessary to perform the job proficiently. Team members may be required to perform other or different job-related duties as requested by their team lead, subject to reasonable accommodation. This document does not create an employment contract, implied or otherwise. Employment with UWM is "at-will." UWM is an Equal Opportunity Employer. By selecting "Apply for this job online" you provide consent to UWM to record phone call conversations between you and UWM to be used for quality control purposes.
    $73k-93k yearly est. 14d ago
  • Data Engineer

    Vibe Credit Union 3.8 company rating

    Data engineer job in Novi, MI

    Our Purpose
    At Vibe, we are driven by our mission to elevate community and create opportunity. We believe in fostering an environment of inclusivity where every team member has the chance to grow professionally. Guided by our core values - be inclusive, educate, embrace change, and seek opportunities - we are dedicated to making a positive impact in the lives of our members and communities. As we continue to grow and expand our team, we are seeking passionate individuals who share our vision and are eager to join us in our journey. If you are someone who is passionate about making a difference and is committed to creating a brighter future for our communities, we invite you to explore this exciting opportunity at Vibe!

    Position Purpose
    The Data Engineer will be responsible for designing, developing, and managing data pipelines, ensuring data quality, and optimizing performance on the Snowflake, Power BI, and SQL platforms. This role will work closely with the Business Intelligence team, data analysts, and our IT departments to make our data sources accessible and actionable.

    Essential Duties
    Design & Development:
    Develop and maintain robust, scalable data pipelines for extracting, transforming, and loading (ETL) data into Snowflake from various sources (e.g., databases, APIs, third-party systems). A minimal ingestion sketch follows this list.
    Implement and optimize Snowflake schemas (e.g., star and snowflake schemas) and data models for analytics and reporting.
    Write complex SQL queries that enable business analysts in partner departments to extract and transform data.
    Develop complex data models within Power BI and produce self-service data visualization dashboards.
    Data Integration:
    Collaborate with cross-functional teams to gather data requirements and design integrations between Snowflake and other data sources (e.g., on-premise databases, cloud platforms, data lakes).
    Work with our vendors to develop efficient ways to exchange data.
    Implement data ingestion frameworks using tools like Snowpipe for real-time or batch processing.
    Optimization & Performance:
    Optimize Snowflake queries, databases, and storage to improve performance and reduce costs (e.g., clustering, pruning, and partitioning).
    Monitor and troubleshoot data pipelines and optimize processes for reliability, efficiency, and scalability.
    Develop and execute a cost-conscious data lake ingestion strategy by evaluating, prioritizing, and onboarding high-value data sources to ensure scalable, efficient, and business-aligned data architecture.
    Data Governance & Quality:
    Implement and maintain data quality checks and validation mechanisms.
    Ensure compliance with data privacy and security standards, especially when working with sensitive or regulated data.
    Maintain documentation on data processes, pipelines, and data models for internal use.
    Collaboration:
    Work with data analysts, IT development teams, and partner departments to ensure that data solutions align with business requirements.
    Assist in providing access to data and creating views in Snowflake for reporting and analytics purposes.
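    As referenced in the Design & Development duties above, here is a minimal sketch of batch-loading staged files into Snowflake from Python with the snowflake-connector-python package. Every credential, table, and stage name below is a hypothetical placeholder, not a detail from the posting.

        # Minimal sketch: batch-load staged CSV files into a Snowflake table.
        # All credentials and object names are hypothetical placeholders.
        import snowflake.connector

        conn = snowflake.connector.connect(
            user="ETL_USER",
            password="...",             # supply via a secrets manager in practice
            account="myorg-myaccount",  # hypothetical account identifier
            warehouse="LOAD_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            cur = conn.cursor()
            # COPY INTO pulls files from a named stage; a Snowpipe is a
            # CREATE PIPE wrapper around the same COPY statement when
            # continuous, near-real-time ingestion is needed.
            cur.execute("""
                COPY INTO RAW.MEMBER_TRANSACTIONS
                FROM @RAW.MEMBER_TXN_STAGE
                FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
                ON_ERROR = 'ABORT_STATEMENT'
            """)
            print(cur.fetchall())  # one row of load results per staged file
        finally:
            conn.close()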
    Education/Experience
    Strong experience in data engineering or related roles with a focus on Snowflake.
    Hands-on experience with Snowflake's architecture, performance tuning, and advanced features (e.g., Snowflake Streams, Tasks, Snowpipe).
    Strong partnership and collaboration experience to develop solutions for several internal departments and external vendors.
    Proficiency in SQL, Snowflake SQL, DB2, and Python.
    Experience with ETL/ELT tools such as Talend, Apache NiFi, or custom solutions.
    Experience with cloud platforms (AWS, Azure, or Google Cloud) and data integration services.

    Skills/Abilities
    Expertise in Snowflake data warehousing concepts such as data loading, performance tuning, partitioning, and storage optimization.
    Proficiency with DB2, SnowSQL, and SQL Server, and with programming languages like Python or Java, for data pipeline development.
    Knowledge of data visualization and reporting tools (e.g., Power BI) is a plus.
    Strong problem-solving and troubleshooting skills.
    Excellent communication skills for collaborating with stakeholders and team members.
    Ability to manage multiple projects and meet deadlines in a fast-paced environment.

    Physical Requirements
    These physical demands are representative of the physical requirements necessary for an employee to successfully perform the essential functions of the position. Reasonable accommodations can be made to enable people with disabilities to perform the described essential functions of the position. While performing the responsibilities of the job, the employee is required to hear, see, talk, stand, walk, stoop, kneel, lift, push, pull, and grasp.
    $88k-109k yearly est. 3d ago
  • Data Scientist

    Publicis Groupe

    Data engineer job in Southfield, MI

    Mars United℠ Commerce is a global commerce marketing practice that aligns people, technology, and intelligence to make the business of our clients better today than it was yesterday. Our worldwide capabilities coalesce into four key disciplines - Strategy & Analytics, Content & Experiences, Digital Commerce, and Retail Consultancy - that individually deliver unmatched results for clients and collectively give them an unparalleled network of seamlessly integrated functions across the entire commerce marketing ecosystem. These disciplines are powered by our industry-leading technology platform, Marilyn, which helps marketers understand the total business impact of their commerce marketing activation, enabling them to make better decisions, create connected experiences, and drive stronger, measurable results. Learn more at ****************************

    Overview
    Part of the overall Analytics Group, the Data Science team is responsible for all data modeling, algorithm development, and creation of machine learning and AI models. The team applies techniques such as regression, classification, clustering, natural language processing (NLP), and more. Additionally, they focus on marketing analytics, using advanced data science techniques to analyze marketing performance, optimize campaigns, and provide actionable insights to enhance marketing effectiveness.
    ● Core Responsibilities: Data Modeling, Feature Engineering, Sentiment Analysis, Propensity Modeling, CM3, Model Training & Testing, Forecasting, Multi-touch/Data-driven Attribution, etc.
    ● Primary Tools: Databricks, Azure Synapse, Alteryx, Python, SQL

    Responsibilities
    As a Data Scientist, you will leverage your strong technical skills and experience to develop data science and AI solutions. This role requires a deep understanding of data science and machine learning techniques and the ability to collaborate with various teams to ensure data quality and actionable insights. Specifically, the Data Scientist will:
    ● Collaborate with stakeholders to understand business requirements and translate them into actionable data science projects.
    ● Work closely with cross-functional teams, including analysts, product managers, and domain experts, to understand business requirements, formulate problem statements, and deliver relevant data science solutions.
    ● Develop and optimize machine learning models by processing, analyzing, and extracting data from varying internal and external data sources.
    ● Develop supervised, unsupervised, and semi-supervised machine learning models using state-of-the-art techniques to solve client problems (see the sketch after this list).
    ● Own and manage complex ETL pipelines to clean, preprocess, and transform large datasets.
    ● Identify and engineer relevant features to enhance model performance and accuracy.
    ● Design and implement robust evaluation metrics and frameworks to assess and monitor the performance of machine learning models.
    ● Communicate findings and recommendations through comprehensive reports and engaging presentations.
    ● Support wider agency initiatives.
    ● Show up - be accountable, take responsibility, and get back up when you are down.
    ● Make stuff.
    ● Share so others can see what's happening.
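    As referenced in the modeling responsibility above, here is a minimal, hypothetical sketch of training a supervised classifier and scoring it on a held-out set with scikit-learn. The data is synthetic and the model choice is illustrative; nothing here comes from the posting.

        # Minimal supervised-learning sketch on synthetic data.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import classification_report, roc_auc_score
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for a curated campaign/customer feature matrix.
        X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, random_state=42
        )

        model = GradientBoostingClassifier(random_state=42)
        model.fit(X_train, y_train)

        # Held-out evaluation: AUC for ranking quality plus per-class metrics.
        probs = model.predict_proba(X_test)[:, 1]
        print(f"ROC AUC: {roc_auc_score(y_test, probs):.3f}")
        print(classification_report(y_test, model.predict(X_test)))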
    Qualifications
    ● A Bachelor's/Master's degree in Mathematics, Statistics, Data Analytics, Computer Science, or a directly related field.
    ● 1+ years of industry experience in a data science/data analysis/statistical analyst role.
    ● Comfortable manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources using Python/R libraries and SQL.
    ● Familiarity with relational, SQL, and NoSQL databases.
    ● Databricks experience is a strong plus.
    ● Knowledge of statistical analysis tools such as R is a plus.
    ● Ability to script in SQL and in Python using OOP concepts.
    ● Experience with Power BI or Tableau.
    ● Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies.
    ● Experience in DevOps or MLOps is a plus.
    ● Experience with DS or ML frameworks and libraries (e.g., Spark, TensorFlow, PyTorch) is a plus.
    ● Strong communication skills to effectively convey complex findings to non-technical stakeholders.
    ● Collaborative mindset to work seamlessly with creative, strategic, and client-facing teams.
    ● Critical thinking to analyze data and derive meaningful insights.
    ● Experience in the marketing domain is preferred.
    ● Ensure the accuracy and reliability of data through rigorous QA processes.
    ● Validate model outputs to ensure they meet business requirements.
    ● Conduct unit tests and validation checks on data and models.
    ● Perform A/B testing to evaluate model performance and impact.
    ● Document all data analysis and modeling processes.
    ● Maintain comprehensive records of data sources, methodologies, and results.
    ● Ensure compliance with data governance and security policies.

    Additional information
    Our Publicis Groupe motto "Viva La Différence" means we're better together, and we believe that our differences make us stronger. It means we honor and celebrate all identities, across all facets of intersectionality, and it underpins all that we do as an organization. We are focused on fostering belonging and creating equitable & inclusive experiences for all talent.
    Publicis Groupe provides robust and inclusive benefit programs and policies to support the evolving and diverse needs of our talent and enable every person to grow and thrive. Our benefits package includes medical coverage, dental, vision, disability, 401K, as well as parental and family care leave, family forming assistance, tuition reimbursement, and flexible time off. If you require accommodation or assistance with the application or onboarding process specifically, please contact *****************************.
    Compensation Range: $54,910 to $72,300 annually. This is the pay range the Company believes it will pay for this position at the time of this posting. Consistent with applicable law, compensation will be determined based on the skills, qualifications, and experience of the applicant along with the requirements of the position, and the Company reserves the right to modify this pay range at any time. Temporary roles may be eligible to participate in our freelancer/temporary employee medical plan through a third-party benefits administration system once certain criteria have been met. Temporary roles may also qualify for participation in our 401(k) plan after eligibility criteria have been met. For regular roles, the Company will offer medical coverage, dental, vision, disability, 401k, and paid time off. The Company anticipates the application deadline for this job posting will be 1/15/2026.
    All your information will be kept confidential according to EEO guidelines.
    $54.9k-72.3k yearly 4d ago

Learn more about data engineer jobs

How much does a data engineer earn in Bloomfield, MI?

The average data engineer in Bloomfield, MI earns between $66,000 and $116,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Bloomfield, MI

$87,000

What are the biggest employers of Data Engineers in Bloomfield, MI?

The biggest employers of Data Engineers in Bloomfield, MI are:
  1. Ernst & Young
  2. Lear
  3. Integra Seating
  4. Lucid Motors
  5. Qualified Staffing
  6. Williams International
  7. Raymond James Financial
  8. Brightwing
  9. Guardian Alarm
  10. Guardian Alarm Company of Michigan