Controls Software Engineer
Data engineer job in Shelby, MI
Lincoln Electric is the world leader in the engineering, design, and manufacturing of advanced arc welding solutions, automated joining, assembly and cutting systems, plasma and oxy-fuel cutting equipment, and has a leading global position in brazing and soldering alloys. Lincoln is recognized as the Welding Expert™ for its leading materials science, software development, automation engineering, and application expertise, which advance customers' fabrication capabilities to help them build a better world. Headquartered in Cleveland, Ohio, Lincoln Electric is a $4.2B publicly traded company (NASDAQ:LECO) with over 12,000 employees around the world. It operates 71 manufacturing and automation system integration locations across 21 countries and maintains a worldwide network of distributors and sales offices serving customers in over 160 countries.
Location: Shelby
Employment Status: Hourly Full-Time
Function: Engineering
Req ID: 26527
Summary
Fori Automation, LLC, a Lincoln Electric Company, is a global supplier of welding, assembly, material handling, and testing equipment for automotive and non-automotive customers worldwide. Fori Automation focuses on delivering cost-effective, highly engineered products and systems designed and manufactured globally with localized sales, project management, and service.
We are seeking an experienced Controls Software Engineer for our Shelby Township, MI site with a background in industrial software development. The Controls Software Engineer will initially support active projects and then transition to completing projects directly. They will take the lead on developing software for new projects and debugging software on new machines. This role requires travel to customer sites for equipment installation and customer interaction.
What You Will Do
Design PLC software and HMIs for industrial automation equipment.
Debug and troubleshoot PLC software and HMIs.
Collaborate with cross-functional teams to maintain project timelines and critical path milestones.
Maintain task lists and reports of open items.
Maintain project design documentation and prepare customer deliverables.
Ensure the controls engineering process is tracked and followed.
Assist customers and local tradespeople in troubleshooting equipment issues.
Conduct end-user training on equipment operation.
Education & Experience Requirements
Electrical Engineering or Computer Engineering degree preferred; Mechatronics degrees will also be considered.
Minimum of two years of experience as a Controls Engineer or Controls Software Engineer, with experience designing for Rockwell Logix 5000 or Siemens S7-1500 family processors.
Knowledge or education in electrical circuits, schematic reading, design, and troubleshooting.
Experience with electrical CAD systems, such as AutoCAD Electrical and/or ePLAN.
Experience with PLC programming in ladder and structured text.
Experience programming HMIs.
Travel required: approximately 30% domestic and international.
Weekend work may be required based on project schedules.
Preferred
Experience in computer programming languages, such as VB, C/C++, or C#.
Experience with Rockwell and Siemens HMIs.
Lincoln Electric is an Equal Opportunity Employer. We are committed to promoting equal employment opportunity for applicants, without regard to their race, color, national origin, religion, sex (including pregnancy, childbirth, or related medical conditions, including, but not limited to, lactation), sexual orientation, gender identity, age, veteran status, disability, genetic information, and any other category protected by federal, state, or local law.
GCP Data Engineer
Data engineer job in Dearborn, MI
Experience Required: 8+ years
Work Status: Hybrid
We're seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.
You will:
Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers
Work on a small agile team to deliver working, tested software
Work effectively with fellow data engineers, product owners, data champions and other technical experts
Demonstrate technical knowledge/leadership skills and advocate for technical excellence
Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform with solid data warehouse principles
Be the Subject Matter Expert in Data Engineering and GCP tool technologies
Skills Required:
BigQuery
Skills Preferred:
N/A
Experience Required:
In-depth understanding of Google's product technology (or another cloud platform) and underlying architectures
5+ years of analytics application development experience
5+ years of SQL development experience
3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
Experience working in GCP-based Big Data deployments (batch/real-time) leveraging Terraform, BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Airflow, etc. (a minimal sketch of the streaming ingestion pattern follows this list)
2+ years of professional development experience in Java or Python, and Apache Beam
Extracting, loading, transforming, cleaning, and validating data
Designing pipelines and architectures for data processing
1+ year of designing and building CI/CD pipelines
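To make the ingestion pattern above concrete, here is a minimal, hedged sketch of a streaming Pub/Sub-to-BigQuery pipeline in Apache Beam's Python SDK. It is illustrative only, not this employer's code; the project, subscription, table, and schema names are hypothetical placeholders.

```python
# Minimal streaming ingestion sketch: Pub/Sub -> Beam -> BigQuery.
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True marks this as an unbounded (Pub/Sub) pipeline; on Dataflow,
    # runner/project/region flags would be supplied on the command line.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(json.loads)  # Pub/Sub delivers bytes; json.loads accepts them
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Launched with the DataflowRunner and project/region flags, the same script runs as a managed Dataflow job; the pipeline shape is unchanged.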
Experience Preferred:
Experience building Machine Learning solutions using TensorFlow, BigQueryML, AutoML, Vertex AI
Experience in building solution architecture, provisioning infrastructure, and building secure and reliable data-centric services and applications in GCP
Experience with DataPlex is preferred
Experience with development ecosystems such as Git, Jenkins, and CI/CD
Exceptional problem solving and communication skills
Experience in working with DBT/Dataform
Experience in working with Agile and Lean methodologies
Team player and attention to detail
Performance tuning experience
Education Required:
Bachelor's Degree
Education Preferred:
Master's Degree
Additional Safety Training/Licensing/Personal Protection Requirements:
Additional Information:
***POSITION IS HYBRID***
Primary Skills Required:
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
Implement methods for automating all parts of the pipeline to minimize labor in development and production
Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products
Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting
Experience working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management; this includes designing and deploying a pipeline with automated data lineage
Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report out a point of view on the best solution
Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal Cloud Composer sketch follows below)
Additional Skills Preferred:
Strong drive for results and ability to multi-task and work independently
Self-starter with proven innovation skills
Ability to communicate and work with cross-functional teams and all levels of management
Demonstrated commitment to quality and project timing
Demonstrated ability to document complex systems
Experience in creating and executing detailed test plans
Additional Education Preferred:
GCP Professional Data Engineer Certified
In-depth software engineering knowledge
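Cloud Composer, named in the services list above, is managed Apache Airflow. A minimal, hedged sketch of a daily BigQuery rollup DAG, assuming hypothetical DAG, dataset, and table names (not this employer's environment):

```python
# Minimal Airflow/Cloud Composer DAG: one daily BigQuery rollup task.
# dag_id, project, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                # Rebuild a small daily-counts table from the raw events table.
                "query": (
                    "CREATE OR REPLACE TABLE `my-project.analytics.daily_counts` AS "
                    "SELECT DATE(ts) AS d, COUNT(*) AS n "
                    "FROM `my-project.analytics.events` GROUP BY d"
                ),
                "useLegacySql": False,
            }
        },
    )
```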
Data Architect
Data engineer job in Detroit, MI
Millennium Software is looking for a Data Architect for one of its direct clients based in Michigan. This is an onsite role.
Title: Data Architect
Tax term: W2 only, no C2C
Description:
All of the below are must-haves:
Senior Data Architect with 12+ years of experience in Data Modeling.
Develop conceptual, logical, and physical data models.
Experience with GCP Cloud
GCP Data Architect (only W2 Position - No C2C Accepted) 11.18.2025
Data engineer job in Dearborn, MI
Description: STG is an SEI CMMI Level 5 company with several Fortune 500 and State Government clients. STG has an opening for a GCP Data Architect.
Please note that this project assignment is with our own direct clients. We do not go through any vendors. STG only does business with direct end clients. This is expected to be a long-term position. STG will provide immigration and permanent residency sponsorship assistance to those candidates who need it.
Job Description:
Employees in this job function are responsible for designing, building and maintaining reliable, efficient and scalable data architecture and data models that serve as a foundation for all data solutions. They also closely collaborate with senior leadership and IT teams to ensure alignment of data strategy with overall business goals.
Key Responsibilities:
Align data strategy to business goals to support a mix of business strategy, improved decision-making, operations efficiency and risk management
Ensure data assets are available, consumable and secure for end users across the enterprise - applications, platforms and infrastructure - within the confines of enterprise and security architecture
Design and build reliable, efficient and scalable data architecture to be used by the organization for all data solutions
Implement and maintain scalable architectural data patterns, solutions and tooling to support business strategy
Design, build, and launch shared data services and APIs to support and expose data-driven solutions in line with enterprise architecture standards
Research and optimize data architecture technologies to enhance and support enterprise technology and data strategy
Skills Required:
PowerBuilder, PostgreSQL, GCP, BigQuery
Senior Specialist Exp.: 10+ years in IT; 7+ years in concentration
Must have experience presenting technical material to business users
Must be able to envision larger strategies and anticipate where possible synergies can be realized
Experience acting as the voice of the architecture/data model and defending its relevancy to ensure adherence to its principles and purpose
Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer these questions.
Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
Database Architecture: Design and optimize database schemas using the medallion architecture, which includes relational, star-schema, and denormalized data sets for BI and ML data consumers.
Metadata Management: Partner with the data governance team so that detailed documentation on data definitions, data lineage, and data quality statistics is available to data consumers.
Data Quality Assurance: Establish master data management data modeling so that the history of how customer, provider, and other party data is consolidated into a single version of the truth is preserved.
Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
Continuous Improvement: Stay abreast of emerging trends in data modeling, analytics platforms, and big data technologies. Recommend enhancements to existing data models and approaches.
Performance Optimization: Monitor and optimize data models for query performance and scalability. Troubleshoot and resolve performance bottlenecks in collaboration with database administrators.
Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.
The GCP Data Architect is based in Dearborn, MI. A great opportunity to experience the corporate environment, leading to personal career growth.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Vasavi Konda - vasavi.konda@stgit.com and/or call (248) 712-6725. In the subject line of the email, please include: First and Last Name: GCP Data Architect.
For more information about STG, please visit us at **************
Sincerely,
Vasavi Konda| Recruiting Specialist
“Opportunities don't happen, you create them.”
Systems Technology Group (STG)
3001 W. Big Beaver Road, Suite 500
Troy, Michigan 48084
Phone: (248) 712-6725 (O)
Email: vasavi.konda@stgit.com
Java Software Engineer
Data engineer job in Ann Arbor, MI
Looking for candidates local to Ann Arbor, MI
Required Skills:
• 5+ Years of Java, J2EE and web/internet based programming experience (both client and server side)
• 5+ years of experience with OOA/OOD, distributed systems/software, real-time processing, relational database systems, messaging systems
• Experience with concurrency & multi-threading
• Experience with scaling, Java Garbage Collection, and performance tuning preferred
• Deep understanding of data structures, algorithms and design patterns (GoF)
• Experience with agile, test-driven development
• Experience with Unix/Linux
• Experience with build, deploy and test automation tools like Ant, Gradle, Maven, Jenkins, TeamCity, JUnit, TestNG, JaCoCo or similar tools
• Demonstrated experience working with core business logic within applications
• Experience in developing APIs and Frameworks
• Excellent written and verbal communication skills
Preferred Skills
• Experience with application development frameworks like Spring, Hibernate, JSF or similar frameworks
• Experience with compilers or DSLs preferred
“Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”
CAE Engineer
Data engineer job in Novi, MI
Job title: THERMAL-3D MANAGEMENT CAE SPECIALIST
Location: Novi, MI (Onsite)
Duration: Contract
Job Description: Responsible for designing and optimizing thermal management systems for vehicle components, this role focuses on ensuring the reliability and longevity of critical components such as batteries, power electronics, and electric motors. The Thermal Management Engineer leverages advanced simulation tools to analyze and improve thermal performance under various operating conditions.
Key Responsibilities:
• Develop and implement thermal management strategies for high-power components, including batteries, inverters, electric motors, and power electronics, using tools such as ANSYS, GT-SUITE, and STAR-CCM+.
• Conduct thermal simulations to evaluate and enhance the performance of cooling and heating systems under different operating conditions, such as high-speed driving, rapid acceleration, and charging.
• Collaborate with powertrain, electrical, and control teams to integrate thermal management solutions into overall vehicle design, ensuring compatibility and efficiency.
• Analyze heat transfer and cooling requirements, assessing the effectiveness of components like radiators, heat exchangers, and HVAC systems to maintain optimal operating temperatures in electric and hybrid vehicle applications.
• Optimize battery thermal management to ensure consistent performance, prolong battery life, and enhance vehicle range under various environmental conditions.
• Validate thermal models against real-world data by conducting physical tests under different load and environmental conditions, adjusting simulation parameters as needed to improve accuracy.
• Document thermal analysis results, preparing detailed reports with recommendations for design improvements to enhance cooling efficiency and component reliability.
• Research and implement advanced cooling technologies, such as phase change materials, liquid cooling systems, and thermal insulation, to improve overall vehicle thermal performance.
• Proficiency in thermal simulation tools like ANSYS, GT-SUITE, and STAR-CCM+.
• Use expertise to correlate virtual results to physical tests; sign off virtual performance based on simulation models.
Senior Java Software Engineer
Data engineer job in Detroit, MI
Sr. Fullstack Java Developer - Detroit, MI - Onsite
Duration: 1 Year
Employment Type: Contract - Can go for in-person interview
We are looking for an experienced Fullstack Java Developer (12-15 yrs of exp) to join our team for a long-term engagement. The ideal candidate will have strong hands-on experience across Java, Spring, front-end frameworks, databases, and cloud-ready tools, with the ability to lead a team and work directly with customers.
Responsibilities (Brief)
Develop and enhance applications using Java 17/8+, Spring Framework, JSON/XML, AngularJS / Angular 8-11 / React.js.
Strong hands-on coding experience is a must.
Work with MongoDB, MySQL, SQL, NoSQL databases.
Support upgrade/migration projects using Java, Spring, and Gradle.
Must have at least 3 yrs of experience in deployment (CI/CD pipelines)
Lead development activities and guide technical teams.
Follow Agile methodologies and drive customer value.
Participate in client discussions and deliver quality solutions.
Preferred: Experience with front-end technologies and healthcare insurance domain.
Communicate effectively with technical and business stakeholders.
Required Technical Skills
Java - Mandatory | 10+ years
AngularJS / Angular 8-11 - Mandatory | 5+ years
Spring Framework - Mandatory | 5+ years
JSON / XML - Mandatory | 5+ years
MongoDB / MySQL / SQL / NoSQL DBs - Mandatory | 5+ years
Gradle - Mandatory | 5+ years
Good to Have
Spring Boot - 3+ years
AngularJS / React.js / JSP - 3+ years
IntelliJ - 3+ years
Robotics Software/Systems Engineer
Data engineer job in Warren, MI
A Robotics Software/Systems Engineer job in Warren, MI is available courtesy of Akkodis. We are seeking a Senior Engineer, AI Systems Engineering - Integration to join a Manufacturing Technology Development team within the Research and Development organization. In this role, you will lead system-level integration of new technologies, validating novel AI and robotics algorithms in full-stack collaborative robot prototypes. You will develop frameworks for iterative assembly and testing, ensuring innovations can be evaluated in realistic workflows. You will serve as the convergence point where Robotics Intelligence breakthroughs and AI & Simulation models are combined into functional prototypes.
Pay: $40/hr to $60/hr
Robotics Software/Systems Engineer job responsibilities:
Lead integration of AI, perception, and robotics software into full-stack prototype systems.
Develop and maintain frameworks for iterative build, test, and validation cycles.
Ensure innovations are evaluated under realistic, production-relevant workflows.
Collaborate closely with Robotics Intelligence, AI & Simulation, Controls, and Hardware teams.
Manage system-level prototype bring-up, debugging, and performance validation.
Qualifications:
Bachelor's degree in Robotics, Computer Engineering, Electrical Engineering, or related field.
5+ years of experience in robotics software or systems integration.
Strong background in AI model deployment, ROS/ROS2, and hardware-software integration.
Experience working with collaborative robots, sensors, and real-world task workflows.
Excellent system-level debugging, communication, and cross-functional collaboration skills.
If you are interested in this Software/System Engineer job in Warren, MI please click APPLY NOW. For other opportunities available at Akkodis go to **************** If you have questions about the position, please contact *****************************.
Equal Opportunity Employer/Veterans/Disabled
Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits and 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State or local law; and Holiday pay upon meeting eligibility criteria.
Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit **********************************************
The Company will consider qualified applicants with arrest and conviction records.
Data Scientist
Data engineer job in Dearborn, MI
At Ford Motor Company, we believe freedom of movement drives human progress. We also believe in providing you with the freedom to define and realize your dreams. With our incredible plans for the future of mobility, we have a wide variety of opportunities for you to accelerate your career potential as you help us define tomorrow's transportation.
Do you believe data tells the real story? We do! Redefining mobility requires quality data, metrics, and analytics, as well as insightful interpreters and analysts. That's where Global Data Insight & Analytics makes an impact. We advise leadership on business conditions, customer needs and the competitive landscape. With our support, key decision makers can act in meaningful, positive ways. Join us and use your data expertise and analytical skills to drive evidence-based, timely decision making.
You'll have...
Master's degree or foreign equivalent in Computer Science, Data Science, Computer Engineering or a related field and 3 years of experience in the job offered or a related occupation.
3 years of experience with each of the following skills is required:
1. Developing NLP pipelines in Python using at least 2 of the following: NLTK, spaCy, Gensim, or HuggingFace.
2. Implementing text preprocessing workflows, creating feature extraction algorithms, and building and training models with scikit-learn, TensorFlow, or PyTorch (see the sketch after this list).
3. Developing reusable modules for NLP and writing production-ready code.
4. Querying large datasets in SQL to extract textual information.
5. Designing database schemas optimized for NLP applications.
6. Writing complex queries to join structured and unstructured data sources.
7. Creating ETL processes for text data, optimizing query performance for large text corpora, and implementing database operations in analytics pipelines.
8. Applying supervised Machine Learning techniques to NLP problems, implementing unsupervised methods for text analysis, evaluating model performance with appropriate metrics, building ensemble models, conducting hyperparameter optimization, and applying transfer learning with pre-trained embeddings.
9. Deploying NLP models and pipelines on Google Cloud Platform (GCP) infrastructure.
10. Utilizing AI Platform for training and serving ML models.
11. Managing data storage with Cloud Storage, BigQuery, or Cloud SQL.
12. Implementing data processing pipelines with Dataflow or Dataproc.
2 years of experience with each of the following skills is required:
1. Managing code versioning for collaborative NLP model development, implementing code review processes, and resolving merge conflicts in multi-developer environments.
2. Using Git for CI/CD integration with model deployment and organizing repositories for maintainable ML codebases.
1 year of experience with each of the following skills is required:
1. Using Cloud Functions for serverless text processing and monitoring model performance.
2. Containerizing NLP applications for deployment, creating and managing deployment configurations, and setting up routes and services with OpenShift.
3. Implementing resource allocation and scaling strategies, configuring persistent storage for models and data, and managing deployments with rolling updates.
4. Fine-tuning pre-trained Large Language Models (BERT, GPT, or T5) for domain-specific tasks.
5. Implementing prompt engineering techniques and evaluating LLM outputs for accuracy.
6. Creating embeddings for semantic search, optimizing inference for production, and reducing hallucinations and improving factuality.
7. Building CI/CD pipelines in Tekton for NLP model deployment.
8. Creating reusable pipeline components for text processing, managing workflow triggers, and implementing testing and validation steps.
9. Configuring resource requirements, integrating model evaluation metrics, and setting up automated retraining pipelines.
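Item 2 in the first list names a standard scikit-learn text-classification workflow: chain preprocessing, feature extraction, and a supervised model in one pipeline. A minimal, hedged sketch, with a tiny invented dataset standing in for real quality-analytics text:

```python
# Minimal text-classification pipeline: TF-IDF features + logistic regression.
# The inline dataset is hypothetical and only illustrates the API shape.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "engine stalls at idle",          # hypothetical complaint text
    "rattle from rear suspension",
    "paint peeling on hood",
    "clear coat bubbling near trim",
]
labels = ["powertrain", "chassis", "body", "body"]  # hypothetical categories

clf = Pipeline([
    ("tfidf", TfidfVectorizer(lowercase=True, ngram_range=(1, 2))),  # preprocessing + features
    ("model", LogisticRegression(max_iter=1000)),                    # supervised classifier
])
clf.fit(texts, labels)
print(clf.predict(["hood paint is flaking"]))  # -> likely ['body']
```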
We are offering a salary of $107,848.00 - $182,338.56/yr.
You may not check every box, or your experience may look a little different from what we've outlined, but if you think you can bring value to Ford Motor Company, we encourage you to apply!
As an established global company, we offer the benefit of choice. You can choose what your Ford future will look like: will your story span the globe, or keep you close to home? Will your career be a deep dive into what you love, or a series of new teams and new skills? Will you be a leader, a changemaker, a technical expert, a culture builder…or all the above? No matter what you choose, we offer a work life that works for you, including:
• Immediate medical, dental, and prescription drug coverage
• Flexible family care, parental leave, new parent ramp-up programs, subsidized back-up child care and more
• Vehicle discount program for employees and family members, and management leases
• Tuition assistance
• Established and active employee resource groups
• Paid time off for individual and team community service
• A generous schedule of paid holidays, including the week between Christmas and New Year's Day
• Paid time off and the option to purchase additional vacation time.
For a detailed look at our benefits, click here:
*******************************
Candidates for positions with Ford Motor Company must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, age, sex, national origin, sexual orientation, gender identity, disability status or protected veteran status. In the United States, if you need a reasonable accommodation for the online application process due to a disability, please call **************.
#LI-DNI
#DNP
What you'll be able to do:
Data Scientist - positions offered by Ford Motor Company (Dearborn, Michigan). Note, this is a purely telecommuting/work-from-home position whereby the employee may reside anywhere within the U.S. Develop and deploy Natural Language Processing (NLP) models to extract insights from unstructured textual data. Collaborate with cross-functional teams to identify opportunities and develop strategies for applying NLP techniques to enhance quality analytics. Design and implement data pre-processing and feature engineering techniques for NLP tasks. Utilize supervised and unsupervised machine learning techniques to solve complex NLP problems. Evaluate and fine-tune models for performance optimization, accuracy, and efficiency. Contribute maintainable code to existing and new pipelines. Stay up-to-date with the latest advancements in NLP and contribute to the continuous improvement of methodologies and algorithms. Communicate findings, insights, and recommendations to stakeholders in a clear and concise manner.
Principal Data Scientist
Data engineer job in Detroit, MI
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This position requires occasional travel to the DC area for client meetings.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to leading projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process-including accessing job postings, completing assessments, or participating in interviews,-please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
Data Engineer
Data engineer job in Dearborn, MI
Details:
Stefanini Group is hiring!
Stefanini is looking for a Data Engineer, Dearborn, MI (Onsite)
For quick apply, please reach out to Lokesh Sharma at ************/***************************
You will be responsible for designing, building, and maintaining data solutions including data infrastructure, pipelines, etc. for collecting, storing, processing and analyzing large volumes of data efficiently and accurately.
Responsibilities
Collaborate with business and technology stakeholders to understand current and future data requirements.
Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis.
Plan, design, build and maintain scalable data solutions including data pipelines, data models, and applications for efficient and reliable data workflow.
Design, implement and maintain existing and future data platforms like data warehouses, data lakes, data lakehouses, etc. for structured and unstructured data.
Design and develop analytical tools, algorithms, and programs to support data engineering activities like writing scripts and automating tasks.
Ensure optimum performance and identify improvement opportunities.
Details:
Experience Required
7+ years of experience in Data Engineering.
SQL, Big Data, Data Analysis, Data Warehousing, Python.
Experience Preferred
Experience with Airflow (Astronomer), Google Cloud Platform (GCP), BigQuery, Machine Learning Models.
Education Required
Bachelor's Degree
***Listed salary ranges may vary based on experience, qualifications, and local market. Also, some positions may include bonuses or other incentives.***
Stefanini takes pride in hiring top talent and developing relationships with our future employees. Our talent acquisition teams will never make an offer of employment without having a phone conversation with you. Those face-to-face conversations will involve a description of the job for which you have applied. We also speak with you about the process, including interviews and job offers.
About Stefanini Group
The Stefanini Group is a global provider of offshore, onshore and nearshore outsourcing, IT digital consulting, systems integration, application, and strategic staffing services to Fortune 1000 enterprises around the world. We have a presence across the Americas, Europe, Africa, and Asia, and serve more than four hundred clients across a broad spectrum of markets, including financial services, manufacturing, telecommunications, chemical services, technology, public sector, and utilities. Stefanini is a CMM Level 5 IT consulting company with a global presence.
#LI-LS1
#LI-ONSITE
ETL Architect
Data engineer job in Southfield, MI
360 IT Professionals is a Software Development Company based in Fremont, California that offers complete technology services in Mobile development, Web development, Cloud computing and IT staffing. Merging Information Technology skills in all its services and operations, the company caters to its globally positioned clients by providing dynamic, feasible IT solutions. 360 IT Professionals works along with its clients to deliver high-performance results, based exclusively on each client's one-of-a-kind requirements.
Our services are vast, and we produce software and web products. We specialize in Mobile development, i.e. iPhone and Android apps. We use the Objective-C and Swift programming languages to create native applications for iPhone, and native Android code to develop applications for Android devices. To create applications that work across platforms, we use a number of frameworks such as Titanium, PhoneGap and jQuery Mobile.
Furthermore, we build web products and offer services such as web designing, layouts, responsive designing, graphic designing, web application development using frameworks based on model view controller architecture, and content management systems. Our services also extend to the domain of Cloud Computing, where we provide Salesforce CRM to effectively manage one's business and ease operations by providing an easy platform. Apart from this, we also provide IT Staffing services that can help your organization to a great extent, as you can hire highly skilled personnel through us.
We make sure that we deliver performance driven products that are optimally developed as per your organization's needs. Take a shot at us for your IT requirements and experience a radical change.
Job Description
Position: ETL Architect
Location: Southfield, MI
Duration: Contract to hire
Need candidates on W2 only
15-17 yrs. experience
· This person will lead teams and work with management and executives
· Must have excellent communication
· This person is not hands on but must be able to speak to and understand how things work (Healthcare)
· Must have 3-4 yrs. as an architect and be able to show their career progression
· Cognos and Business Objects are nice to have
The Informatica ETL Architect has overall responsibility for assessing requirements and defining the strategy, technical architecture, implementation plan and delivery of data warehouse projects. The Informatica ETL Architect must have prior experience completing successful data warehousing implementations as well as a broad background and experience with IT application development. This individual is responsible for establishing the long-term strategy and technical architecture as well as the short-term scope for a multi-phased data warehouse effort, and should have strong professional consulting skills and the ability to communicate well at all levels of the organization.
Top Skill Set:
• Lead ETL architecture and design as well as data flow diagramming
• Define and implement ETL development standards & procedures
• Ensure ETL quality through code reviews, thorough inspection & knowledge sharing
• At least 12 years of experience with Informatica in a Developer/Tech Lead role
• Knowledge of Health Care Insurance Payer Data Warehousing preferred
• Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects
Required Skills/Experience:
• 12 to 16 years of Informatica ETL development experience
• At least 4 years of experience as an Informatica ETL Architect
• At least 8-10 years of experience with Informatica in a Developer/Tech Lead role
• Mastery of data warehousing concepts. Candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects.
• MUST HAVE strong SQL skills in an Oracle partitioned environment
• Experience with Business Intelligence reporting tools like Cognos and Business Objects preferred
• Experience in Oracle database programming using Partitioning, Materialized Views and OLAP
• Experience in tuning Oracle queries/processes and with performance management tools
TOOLS & TECHNOLOGIES: Informatica 8.x and above (9.1 preferred), PowerCenter, PowerExchange, Data Quality, Oracle 10g and above, Unix Shell Scripting (AIX, Linux), Scheduling Tools (any one of Tivoli, Autosys and Ctrl-M), SQL
Additional Information
Regards,
Vishal Rana
Talent & Client Acquisition Specialist
Phone: 510 254 3300 Ext 178
Data Scientist 1
Data engineer job in Southfield, MI
Under general supervision, applies knowledge of statistics, machine learning, programming, data modeling and advanced mathematics to recognize patterns, identify opportunities, pose business questions and make valuable discoveries leading to increased efficiency and productivity. Analysis focuses on product warranty related data sets that may be applied to various areas of the business (e.g., Manufacturing, Design, Marketing/Advertising, etc.).
ESSENTIAL DUTIES AND RESPONSIBILITIES:
Develops and maintains statistical models capable of extracting valuable business and/or technical insights from large data sets.
Proactively participates in developing scalable Business Intelligence (BI) and predictive analytics solutions using a variety of techniques ranging from data aggregation to data mining.
Assists with conducting needs assessment and requirements gathering to design and assist in deploying data analytics solutions.
Performs data manipulation and analytics and translates insights into actionable recommendations for management and customers.
Prepares presentations and reports of statistical concepts and research results related to efficiency initiatives to be shared with a non-statistical audience and senior stakeholders.
Facilitates training and education of associates related to new systems and procedures.
Creates standards for process improvement projects using Data Analytics and Statistical Analysis Theory.
Applies statistical expertise to advance current and future regional initiatives.
Maintains current knowledge of industry trends in the field of Big Data Analytics by attending related seminars, conferences and training sessions.
Performs other duties as assigned.
QUALIFICATIONS:
Bachelor's Degree in math, statistics, computer science, software engineering or a related field
Experience with Node.js, SQL, Python, APIs, MySQL/MongoDB, large language models, and business intelligence tools.
0+ years of related experience
SKILLS AND ABILITIES:
Experience with databases and large data sets and related reporting tools
Analytical skills with an ability to independently evaluate and develop innovative solutions to complex situations
Basic knowledge of automotive parts and related vehicle systems (e.g., charging/starting electrical, emission, HVAC systems, vehicle electronic components).
Written and verbal communication skills and presentation skills. Ability to communicate with internal and external customers on issues of moderate to considerable importance, up to and including senior management.
Demonstrated ability to foster and maintain relationships with an ability to work as part of a cross functional team
Continuous improvement mindset
Ability to apply process improvement planning and checking to own work output and to assist others with identifying gaps in work results
Benefits Summary:
Health, Dental, Vision, Prescription Drug plans
Life and Accidental Death & Dismemberment Insurance
Flexible Spending Account
Employee Assistance Program
401K with 4% company match
Bonus Program
Wellness Program
Onsite Fitness Center (vary by location)
Tuition Reimbursement
Career Development and Ongoing Training
Paid holidays and vacation
Cafeteria and food markets (vary by location)
Volunteer opportunities
Employee recognition (employee and milestone events)
Annual Salary: $78,000 - $98,000
Data Scientist
Data engineer job in Detroit, MI
Please review and apply for this position through the QCI system via the link below (copy and paste): http://tinyurl.com/nzn6msu *You can apply through Indeed using mobile devices with this link.
Job Description
The Data Scientist will delve into the recesses of large data sets of structured, semi-structured, and unstructured data to discover hidden knowledge about our business and develop methods to leverage that knowledge within our line of business. The successful candidate will combine strengths in mathematics and applied statistics, computer science, visualization capabilities, and a healthy sense of exploration and knowledge acquisition. You must have USA/Canadian Citizenship or your Green Card/EAD.
Responsibilities
Work closely with various teams across the company to identify and solve business challenges utilizing large structured, semi-structured, and unstructured data in a distributed processing environment.
Develop predictive statistical, behavioral or other models via supervised and unsupervised machine learning, statistical analysis, and other predictive modeling techniques.
Drive the collection of new data and the refinement of existing data sources.
Analyze and interpret the results of product experiments.
Collaborate with the engineering and product teams to develop and support our internal data platform to support ongoing analyses.
Requirements
M.S. or Ph.D. in a relevant technical field (e.g. applied mathematics, statistics, physics, computer science, operations research), or 3+ years experience in a relevant role.
Extensive experience solving analytics problems using quantitative approaches.
A proven passion for generating insights from data.
Strong knowledge of statistical methods generally, and particularly in the areas of modeling and business analytics.
Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources.
Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner.
Fluency in at least one programming language such as Python, Java, or C/C++.
Familiarity with relational databases and SQL.
Additional Information
All your information will be kept confidential according to EEO guidelines.
Lead Data Scientist
Data engineer job in Detroit, MI
OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The Company's core offerings accelerate business, amplify real-time results, and help set their clients apart from their competitors. OneMagnify partners with clients to design, implement and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion.
OneMagnify's commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace and Cool Workplace in the United States for 10 consecutive years and recently was recognized as a Top Workplace in India.
You'll be joining our RXA Data Science team, a group dedicated to leveraging advanced analytics, predictive modeling, and machine learning to drive smarter marketing and business decisions. As Lead Data Scientist, you will play a critical role in delivering impactful, data-driven solutions. In this role, you will bridge strategy and execution, translating complex business problems into analytically sound solutions while ensuring technical excellence, timely delivery, and cross-functional collaboration.
The Lead Data Scientist is responsible for leading the execution of end-to-end data science projects, from scoping and modeling to operationalization and insight delivery. You will partner with clients, internal teams, and technical stakeholders to develop and deploy scalable solutions that drive measurable business value.
What you'll do:
Lead the design, development, and deployment of statistical models, machine learning algorithms, and custom analytics solutions
Collaborate consistently with team members to understand the purpose, focus, and objectives of each data analysis project, ensuring alignment and meaningful support
Translate client goals into clear modeling strategies, project plans, and deliverables
Guide the development of production-level model pipelines using tools such as Databricks and Azure ML
Collaborate with engineering, marketing, and strategic partners to integrate models into real-world applications
Monitor and improve model performance, ensuring high standards for reliability and business relevance
Present complex analytical results to technical and non-technical audiences in a clear, actionable format
Support innovation by identifying new tools, methods, and data sources, including the use of Snowflake for modern data architecture
Promote best practices in model governance, data ethics, and responsible AI
What you need:
Minimum 5-7 years of experience in data science, analytics, or predictive modeling
Experience leading all aspects of sophisticated data science initiatives with a solid foundation in technical strategy and execution
Strong programming skills in Python, R, or SAS for modeling and data analysis
Advanced SQL capabilities and experience working in cloud-based environments (e.g., Azure, AWS)
Hands-on experience with Databricks, Azure Machine Learning, and Snowflake strongly preferred
Experience applying the modeling rigor and documentation standards required in regulated industries such as financial services is a strong plus
Expertise in regression, classification, clustering, A/B testing, and audience segmentation (a brief A/B-test sketch follows this list)
Proficiency with Tableau, Power BI, and Excel for data visualization and communication
Strong communication skills and the ability to translate complex technical findings into business insight
Bachelor's degree in Data Science, Statistics, Computer Science, or a related quantitative field (Master's preferred)
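To make the A/B-testing expertise named in the list above concrete, here is a minimal, hedged sketch of a two-proportion z-test using statsmodels. The conversion counts are invented for illustration and do not describe any real campaign.

```python
# Two-proportion z-test: did variant A convert differently from variant B?
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: conversions and audience sizes for variants A and B.
conversions = [412, 318]
audience = [10000, 10000]

z_stat, p_value = proportions_ztest(conversions, audience)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")  # small p suggests a real difference
```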
Benefits
We offer a comprehensive benefits package including medical, dental, 401(k), paid holidays, vacations, and more.
About us
Whether it's awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges.
We are an equal opportunity employer
We believe that Innovative ideas and solutions start with unique perspectives. That's why we're committed to providing every employee a workplace that's free of discrimination and intolerance. We're proud to be an equal opportunity employer and actively search for like-minded people to join our team.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform job functions, and to receive benefits and privileges of employment. Please contact us to request accommodation.
Data Engineer
Data engineer job in Novi, MI
Our Purpose
At Vibe, we are driven by our mission to elevate community and create opportunity. We believe in fostering an environment of inclusivity where every team member has the chance to grow professionally. Guided by our core values - be inclusive, educate, embrace change, and seek opportunities - we are dedicated to making a positive impact in the lives of our members and communities. As we continue to grow and expand our team, we are seeking passionate individuals who share our vision and are eager to join us in our journey. If you are someone who is passionate about making a difference and is committed to creating a brighter future for our communities, we invite you to explore this exciting opportunity at Vibe!
Position Purpose
The Data Engineer will be responsible for designing, developing, and managing data pipelines, ensuring data quality, and optimizing performance on the Snowflake, Power BI and SQL platforms. This role will work closely with the Business Intelligence team, data analysts and our IT departments to enable our data sources to be accessible and actionable.
Essential Duties
Design & Development:
Develop and maintain robust, scalable data pipelines for extracting, transforming, and loading (ETL) data into Snowflake from various sources (e.g., databases, APIs, third-party systems).
Implement and optimize Snowflake schemas (e.g., star and snowflake schemas) and data models for analytics and reporting.
Write complex SQL queries to help enable business analysts in partner departments for data extraction and transformation.
Develop complex Data Models within Power BI and produce self service data visualization dashboards.
Data Integration:
Collaborate with cross-functional teams to gather data requirements and design integrations between Snowflake and other data sources (e.g., on-premise databases, cloud platforms, data lakes).
Work with our vendors to develop efficient ways to exchange data.
Implement data ingestion frameworks using tools like Snowpipe for real-time or batch processing (a minimal loading sketch follows after this list).
Optimization & Performance:
Optimize Snowflake queries, databases, and storage to improve performance and reduce costs (e.g., clustering, pruning, and partitioning).
Monitor and troubleshoot data pipelines and optimize processes for reliability, efficiency, and scalability.
Develop and execute a cost-conscious data lake ingestion strategy by evaluating, prioritizing, and onboarding high-value data sources to ensure scalable, efficient, and business-aligned data architecture.
Data Governance & Quality:
Implement and maintain data quality checks and validation mechanisms.
Ensure compliance with data privacy and security standards, especially when working with sensitive or regulated data.
Maintain documentation on data processes, pipelines, and data models for internal use.
Collaboration:
Work with data analysts, IT development teams and partner departments to ensure that data solutions align with business requirements.
Assist in providing access to data and creating views in Snowflake for reporting and analytics purposes.
Education/Experience
Strong experience in data engineering or related roles with a focus on Snowflake.
Hands-on experience with Snowflake's architecture, performance tuning, and advanced features (e.g., Snowflake Streams, Tasks, Snowpipe).
Strong partnership and collaboration experience to develop solutions for several internal departments and external vendors.
Proficiency in SQL, Snowflake SQL, DB2 and Python.
Experience with ETL/ELT tools such as Talend, Apache NiFi, or custom solutions.
Experience with cloud platforms (AWS, Azure, or Google Cloud) and data integration services.
Skills/Abilities
Expertise in Snowflake data warehousing concepts such as data loading, performance tuning, partitioning, and storage optimization.
Proficient in SQL dialects (DB2, SnowSQL, SQL Server) and programming languages such as Python or Java for data pipeline development.
Knowledge of data visualization and reporting tools (e.g., Power BI) is a plus.
Strong problem-solving and troubleshooting skills.
Excellent communication skills for collaborating with stakeholders and team members.
Ability to manage multiple projects and meet deadlines in a fast-paced environment.
Physical Requirements
These physical demands are representative of the physical requirements necessary for an employee to successfully perform the essential functions of the position. Reasonable accommodations can be made to enable people with disabilities to perform the described essential functions of the position. While performing the responsibilities of the job, the employee is required to hear, see, talk, stand, walk, stoop, kneel, lift, push, pull, and grasp.
Data Scientist III
Data engineer job in Pontiac, MI
Job Description
Ready to join thousands of talented team members who are making the dream of home ownership possible for more Americans? It's all happening on UWM's campus, where our award-winning workplace packs plenty of perks and amenities that keep the atmosphere buzzing with energy and excitement.
It's no wonder that out of our six pillars, People Are Our Greatest Asset is number one. It's at the very heart of how we treat each other, our clients and our community. Whether it's providing elite client service or continuously striving to improve, our pillars provide a pathway to a more successful personal and professional life.
From the team member that holds a door open to the one that helps guide your career, you'll feel the encouragement and support on day one. No matter your race, creed, gender, age, sexual orientation and ethnicity, you'll be welcomed here. Accepted here. And empowered to Be You Here.
More reasons you'll love working here include:
Paid Time Off (PTO) after just 30 days
Additional parental and maternity leave benefits after 12 months
Adoption reimbursement program
Paid volunteer hours
Paid training and career development
Medical, dental, vision and life insurance
401k with employer match
Mortgage discount and area business discounts
Free membership to our large, state-of-the-art fitness center, including exercise classes such as yoga and Zumba, various sports leagues and a full-size basketball court
Wellness area, including an in-house primary-care physician's office, full-time massage therapist and hair salon
Gourmet cafeteria featuring homemade breakfast and lunch
Convenience store featuring healthy grab-and-go snacks
In-house Starbucks and Dunkin
Indoor/outdoor café with Wi-Fi
Responsibilities
Work with stakeholders throughout the organization to identify opportunities for leveraging company data to increase efficiency or improve the bottom line.
Analyze UWM data sets to identify areas of optimization and improvement of business strategies.
Assess the effectiveness and accuracy of new data sources and data gathering techniques.
Develop custom data models, algorithms, simulations, and predictive modeling to support insights and opportunities for improvement.
Develop an A/B testing framework and test model quality (a minimal sketch follows this list).
Coordinate with different business areas to implement models and monitor outcomes.
Develop processes and tools to monitor and analyze model performance and data accuracy.
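The A/B testing responsibility could be grounded in something as small as a two-proportion z-test; the sketch below uses invented counts for a binary outcome and assumes SciPy is available.

```python
# Hedged sketch of an A/B comparison for a binary outcome
# (e.g., completion rate); all counts are illustrative.
import math

from scipy.stats import norm


def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * norm.sf(abs(z))
    return z, p_value


z, p = two_proportion_ztest(success_a=460, n_a=5000, success_b=510, n_b=5000)
print(f"z = {z:.3f}, p = {p:.4f}")  # significant at alpha = 0.05 only if p < 0.05
```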
Qualifications
Must Have
Bachelor's degree in Finance, Statistics, Economics, Data Science, Computer Science, Engineering or Mathematics, or related field
5+ years of experience in statistical analysis, and/or machine learning
5+ years of experience with one or more of the following tools: machine learning (Python, MATLAB), data wrangling skills/tools (Hadoop, Teradata, SAS, or other), statistical analysis (Python, R, SAS) and/or visualization skills/tools (Power BI, Tableau, QlikView)
3+ years of experience collaborating with teams (either internal or external) to develop analytics solutions
Strong problem-solving skills
Strong communication skills (interpersonal, written, and presentation)
Nice to Have
Master's degree in Finance, Statistics, Economics, Data Science, Computer Science, Mathematics or related field
3+ years of experience with R, SQL, Tableau, MATLAB, Python
3+ years of professional experience in machine learning, data mining, statistical analysis, modeling, optimization
Experience in Accounting, Finance, and Economics
Data Scientist
Data engineer job in Southfield, MI
Mars United℠ Commerce is a global commerce marketing practice that aligns people, technology, and intelligence to make the business of our clients better today than it was yesterday. Our worldwide capabilities coalesce into four key disciplines - Strategy & Analytics, Content & Experiences, Digital Commerce, and Retail Consultancy - that individually deliver unmatched results for clients and collectively give them an unparalleled network of seamlessly integrated functions across the entire commerce marketing ecosystem. These disciplines are powered by our industry-leading technology platform, Marilyn, which helps marketers understand the total business impact of their commerce marketing activation, enabling them to make better decisions, create connected experiences, and drive stronger, measurable results. Learn more at ****************************
Overview
Part of the overall Analytics Group, the Data Science team is responsible for all data modeling, algorithm development, and creating machine learning and AI models. They develop techniques such as regression, classification, clustering, natural language processing (NLP), and more. Additionally, they focus on marketing analytics, using advanced data science techniques to analyze marketing performance, optimize campaigns, and provide actionable insights to enhance marketing effectiveness.
● Core Responsibilities: Data Modeling, Feature Engineering, Sentiment Analysis, Propensity Modeling, CM3, Model Training & Testing, Forecasting, Multi-touch/Data-driven Attribution, etc.
● Primary Tools: Databricks, Azure Synapse, Alteryx, Python, SQL
Responsibilities
As a Data Scientist, you will leverage your strong technical skills and experience to develop data science and AI solutions. This role requires a deep understanding of data science and machine learning techniques and the ability to collaborate with various teams to ensure data quality and actionable insights. Specifically, the Data Scientist will:
● Collaborate with stakeholders to understand business requirements and translate them into actionable data science projects.
● Work closely with cross-functional teams, including analysts, product managers, and domain experts, to understand business requirements, formulate problem statements, and deliver relevant data science solutions.
● Develop and optimize machine learning models by processing, analyzing and extracting data from varying internal and external data sources.
● Develop supervised, unsupervised, and semi-supervised machine learning models using state-of-the-art techniques to solve client problems.
● Own and manage complex ETL pipelines to clean, preprocess, and transform large datasets.
● Identify and engineer relevant features to enhance model performance and accuracy.
● Design and implement robust evaluation metrics and frameworks to assess and monitor the performance of machine learning models (see the sketch after this list).
● Communicate findings and recommendations through comprehensive reports and engaging presentations.
● Support wider agency initiatives.
● Show up - be accountable, take responsibility, and get back up when you are down.
● Make stuff.
● Share so others can see what's happening.
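As a minimal, hedged illustration of the supervised-modeling and evaluation responsibilities above (not the team's actual pipeline), the following trains a classifier on synthetic data and scores it on a held-out set:

```python
# Illustrative supervised-learning loop with a held-out evaluation metric;
# the synthetic data stands in for real campaign or commerce data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 10))  # hypothetical features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print(f"holdout ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```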
Qualifications
A Bachelor's or Master's degree in Mathematics, Statistics, Data Analytics, Computer Science, or a directly related field.
● 1+ years of industry experience in a data science/data analysis/statistical analyst role.
● Comfortable in manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources using Python/R libraries and SQL.
● Familiarity with relational, SQL, and NoSQL databases.
● Databricks experience is a big plus.
● Knowledge of statistical analysis tools such as R is a plus.
● Knowledge of scripting in SQL and Python using OOP concepts.
● Experience with PowerBI or Tableau.
● Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies.
● Experience in DevOps or MLOps is a plus.
● Experience with DS or ML frameworks and libraries (e.g., Spark, TensorFlow, PyTorch) is a plus.
● Strong communication skills to effectively convey complex findings to non-technical stakeholders.
● Collaborative mindset to work seamlessly with creative, strategic, and client-facing teams.
● Critical thinking to analyze data and derive meaningful insights.
● Experience in the marketing domain is preferred.
● Ensure the accuracy and reliability of data through rigorous QA processes.
● Validate model outputs to ensure they meet business requirements.
● Conduct unit tests and validation checks on data and models.
● Perform A/B testing to evaluate model performance and impact.
● Document all data analysis and modeling processes.
● Maintain comprehensive records of data sources, methodologies, and results.
● Ensure compliance with data governance and security policies.
Additional information
Our Publicis Groupe motto "Viva La Différence" means we're better together, and we believe that our differences make us stronger. It means we honor and celebrate all identities, across all facets of intersectionality, and it underpins all that we do as an organization. We are focused on fostering belonging and creating equitable & inclusive experiences for all talent.
Publicis Groupe provides robust and inclusive benefit programs and policies to support the evolving and diverse needs of our talent and enable every person to grow and thrive. Our benefits package includes medical coverage, dental, vision, disability, 401K, as well as parental and family care leave, family forming assistance, tuition reimbursement, and flexible time off.
If you require accommodation or assistance with the application or onboarding process specifically, please contact *****************************.
Compensation Range: $54,910 to $72,300 annually. This is the pay range the Company believes it will pay for this position at the time of this posting. Consistent with applicable law, compensation will be determined based on the skills, qualifications, and experience of the applicant along with the requirements of the position, and the Company reserves the right to modify this pay range at any time. Temporary roles may be eligible to participate in our freelancer/temporary employee medical plan through a third-party benefits administration system once certain criteria have been met. Temporary roles may also qualify for participation in our 401(k) plan after eligibility criteria have been met. For regular roles, the Company will offer medical coverage, dental, vision, disability, 401k, and paid time off. The Company anticipates the application deadline for this job posting will be 1/15/2026.
All your information will be kept confidential according to EEO guidelines.
GCP Data Architect
Data engineer job in Dearborn, MI
Title: GCP Data Architect
Description: STG is a fast-growing Digital Transformation services company providing Fortune 500 companies with Digital Transformation, Mobility, Analytics and Cloud Integration services in both information technology and engineering product lines. STG has a 98% repeat business rate from existing clients and has achieved industry awards and recognition for our services.
Responsibilities:
Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer these questions.
Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
Database Architecture: Design and optimize database schemas using the medallion architecture, which includes relational, star-schema, and denormalized data sets for BI and ML data consumers (a hypothetical BigQuery sketch follows this list).
Metadata Management: Partner with the data governance team so that detailed documentation on data definitions, data lineage, and data quality statistics is available to data consumers.
Data Quality Assurance: Establish master data management (MDM) data models that capture how customer, provider, and other party data is consolidated into a single version of the truth.
Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.
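For the medallion-architecture responsibility above, here is a hedged, hypothetical sketch of a bronze-to-silver-to-gold flow in BigQuery; the dataset and table names are invented, and the client assumes Application Default Credentials are configured.

```python
# Hypothetical bronze -> silver -> gold medallion flow in BigQuery.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Silver: cleaned, conformed records built from raw (bronze) landing data.
silver_sql = """
CREATE OR REPLACE TABLE silver.orders AS
SELECT
  CAST(order_id AS INT64)      AS order_id,
  TIMESTAMP(order_ts)          AS order_ts,
  LOWER(TRIM(customer_email))  AS customer_email,
  SAFE_CAST(amount AS NUMERIC) AS amount
FROM bronze.orders_raw
WHERE order_id IS NOT NULL;
"""

# Gold: denormalized, analytics-ready aggregate for BI consumers.
gold_sql = """
CREATE OR REPLACE TABLE gold.fct_daily_sales
PARTITION BY order_date AS
SELECT
  DATE(order_ts) AS order_date,
  customer_email,
  SUM(amount)    AS total_amount,
  COUNT(*)       AS order_count
FROM silver.orders
GROUP BY order_date, customer_email;
"""

for sql in (silver_sql, gold_sql):
    client.query(sql).result()  # wait for each job to finish
```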
Experience Required:
Specialist Exp: 10+ yrs in IT; 7+ yrs as Data Architect
PowerBuilder
PostgreSQL
GCP
BigQuery
The GCP Data Architect position is based at our corporate office located in Dearborn, Michigan. This is a great opportunity to experience a corporate environment that supports personal career growth.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Ms. Shweta Huria at ********************** and/or contact at ************. In the subject line of the email please include: First and Last Name (GCP Data Architect).
For more information about STG, please visit us at **************
ETL Architect
Data engineer job in Southfield, MI
360 IT Professionals is a software development company based in Fremont, California that offers complete technology services in mobile development, web development, cloud computing, and IT staffing. Merging information technology skills across all its services and operations, the company caters to its globally positioned clients by providing dynamic, feasible IT solutions. 360 IT Professionals works alongside its clients to deliver high-performance results tailored to each client's unique requirements.
Our services are vast, and we produce software and web products. We specialize in mobile development, i.e., iPhone and Android apps. We use the Objective-C and Swift programming languages to create native iPhone applications, and native Android code for Android devices. To create cross-platform applications, we use a number of frameworks such as Titanium, PhoneGap, and jQuery Mobile.
Furthermore, we build web products and offer services such as web design, layouts, responsive design, graphic design, web application development using frameworks based on model-view-controller architecture, and content management systems. Our services also extend to the domain of cloud computing, where we provide Salesforce CRM to effectively manage a business and simplify its operations through an easy-to-use platform. Apart from this, we also provide IT staffing services that can help your organization to a great extent, as you can hire highly skilled personnel through us.
We make sure we deliver performance-driven products optimally developed to your organization's needs. Give us a shot at your IT requirements and experience a radical change.
Job Description
Position: ETL Architect
Location: Southfield, MI
Duration: Contract to hire
Need candidates on W2 only
15-17 years of experience
· This person will lead teams and work with management and executives
· Must have excellent communication skills
· This person is not hands-on but must be able to speak to and understand how things work (Healthcare)
· Must have 3-4 years as an architect and be able to show their career progression
· Cognos and Business Objects are nice to have
The Informatica ETL Architect has overall responsibility for assessing requirements and defining the strategy, technical architecture, implementation plan, and delivery of data warehouse projects. The Informatica ETL Architect must have prior experience completing successful data warehousing implementations as well as a broad background in IT application development. This individual is responsible for establishing the long-term strategy and technical architecture as well as the short-term scope for a multi-phased data warehouse effort, and should have strong professional consulting skills and the ability to communicate well at all levels of the organization.
Top Skill Set:
• Lead ETL architecture and design as well as data flow diagramming
• Define and implement ETL development standards & procedures
• Ensure ETL quality through code reviews, inspections, and knowledge sharing
• At least 12 years' experience with Informatica in a Developer/Tech Lead role
• Knowledge of Health Care Insurance Payer data warehousing preferred
• Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects
Required Skills/Experience:
• 12 to 16 years of Informatica ETL development experience
• At least 4 years' experience as an Informatica ETL Architect
• At least 8-10 years' experience with Informatica in a Developer/Tech Lead role
• Mastery of data warehousing concepts; the candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects
• MUST HAVE strong SQL skills in an Oracle partitioned environment
• Experience in Business Intelligence reporting tools such as Cognos and Business Objects preferred
• Experience in Oracle database programming using partitioning, materialized views, and OLAP
• Experience in tuning Oracle queries/processes and performance management tools
Tools & Technologies: Informatica 8.x and above (9.1 preferred), PowerCenter, PowerExchange, Data Quality, Oracle 10g and above, Unix shell scripting (AIX, Linux), scheduling tools (any one of Tivoli, Autosys, and Ctrl-M), SQL
Additional Information
Regards,
Vishal Rana
Talent & Client Acquisition Specialist
Phone: 510 254 3300 Ext 178