Controls Software Engineer
Data engineer job in Shelby, MI
Lincoln Electric is the world leader in the engineering, design, and manufacturing of advanced arc welding solutions, automated joining, assembly and cutting systems, and plasma and oxy-fuel cutting equipment, and holds a leading global position in brazing and soldering alloys. Lincoln is recognized as the Welding Expert™ for its leading materials science, software development, automation engineering, and application expertise, which advance customers' fabrication capabilities to help them build a better world. Headquartered in Cleveland, Ohio, Lincoln Electric is a $4.2B publicly traded company (NASDAQ:LECO) with over 12,000 employees worldwide, operations in 71 manufacturing and automation system integration locations across 21 countries, and a worldwide network of distributors and sales offices serving customers in over 160 countries.
Location: Shelby
Employment Status: Hourly Full-Time
Function: Engineering
Req ID: 26527
Summary
Fori Automation, LLC, a Lincoln Electric Company, is a global supplier of welding, assembly, material handling, and testing equipment for automotive and non-automotive customers worldwide. Fori Automation focuses on delivering cost-effective, highly engineered products and systems designed and manufactured globally with localized sales, project management, and service.
We are seeking an experienced Controls Software Engineer for our Shelby Township, MI site with a background in industrial software development. The Controls Software Engineer will initially support active projects and then transition to completing projects directly. They will take the lead on developing software on new projects and debug software on new machines. This role requires travel to customer sites for equipment installation and customer interaction.
What You Will Do
Design PLC software and HMIs for industrial automation equipment
Debug and troubleshoot PLC software and HMIs
Collaborate with cross-functional teams to maintain project timelines and critical path milestones.
Maintain task lists and reports of open items.
Maintain project design documentation and prepare customer deliverables.
Ensure the controls engineering process is tracked and followed.
Assist customers and local tradespeople in troubleshooting equipment issues.
Conduct end-user training on equipment operation.
Education & Experience Requirements
Electrical Engineering or Computer Engineering degree preferred; Mechatronics degrees will also be considered.
Minimum of two years of experience as a Controls Engineer or Controls Software Engineer with experience in designing Rockwell Logix 5000 or Siemens S7-1500 family processors.
Knowledge or education in electrical circuits, schematic reading, design, and troubleshooting.
Experience with electrical CAD systems, such as AutoCAD Electrical and/or ePLAN
Experience with PLC programming in ladder and structured text.
Experience programming HMIs
Travel required: approximately 30% domestic and international.
Weekend work may be required based on project schedules.
Preferred
Experience in computer programming languages, such as VB, C/C++, or C#.
Experience with Rockwell and Siemens HMI preferred.
Lincoln Electric is an Equal Opportunity Employer. We are committed to promoting equal employment opportunity for applicants, without regard to their race, color, national origin, religion, sex (including pregnancy, childbirth, or related medical conditions, including, but not limited to, lactation), sexual orientation, gender identity, age, veteran status, disability, genetic information, and any other category protected by federal, state, or local law.
GCP Data Engineer
Data engineer job in Dearborn, MI
Experience Required: 8+ years
Work Status: Hybrid
We're seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and the operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.
You will:
Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers
Work on a small agile team to deliver working, tested software
Work effectively with fellow data engineers, product owners, data champions, and other technical experts
Demonstrate technical knowledge/leadership skills and advocate for technical excellence
Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform with solid data warehouse principles
Be the Subject Matter Expert in Data Engineering and GCP tool technologies
Skills Required:
BigQuery
Skills Preferred:
N/A
Experience Required:
In-depth understanding of Google's product technology (or another cloud platform) and underlying architectures
5+ years of analytics application development experience
5+ years of SQL development experience
3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
Experience working in GCP-based Big Data deployments (batch/real-time) leveraging Terraform, BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Airflow, etc. (a minimal pipeline sketch follows this list)
2+ years of professional development experience in Java or Python, and Apache Beam
Extracting, loading, transforming, cleaning, and validating data
Designing pipelines and architectures for data processing
1+ years of designing and building CI/CD pipelines
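The experience list above names Dataflow (Apache Beam) alongside Pub/Sub and BigQuery. Below is a minimal sketch of what such a streaming pipeline can look like in Python, offered only as an illustration: the project, topic, and table names are hypothetical, not taken from this posting.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> parse -> BigQuery.
# All resource names below are placeholders, not from the posting.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub message into a BigQuery-ready row."""
    return json.loads(message.decode("utf-8"))


def run() -> None:
    options = PipelineOptions(streaming=True, project="example-project")
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```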
Experience Preferred:
Experience building machine learning solutions using TensorFlow, BigQuery ML, AutoML, or Vertex AI
Experience building solution architecture, provisioning infrastructure, and building secure, reliable data-centric services and applications in GCP
Experience with Dataplex preferred
Experience with a development ecosystem such as Git, Jenkins, and CI/CD
Exceptional problem-solving and communication skills
Experience working with dbt/Dataform
Experience working with Agile and Lean methodologies
Team player with attention to detail
Performance tuning experience
Education Required:
Bachelor's Degree
Education Preferred:
Master's Degree
Additional Information:
***POSITION IS HYBRID***
Primary Skills Required:
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
Implement methods to automate all parts of the pipeline to minimize labor in development and production
Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products
Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting
Experience working with all stakeholders to formulate business problems as technical data requirements, and identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management; this includes designing and deploying a pipeline with automated data lineage
Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution
Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
Additional Skills Preferred:
Strong drive for results and ability to multi-task and work independently
Self-starter with proven innovation skills
Ability to communicate and work with cross-functional teams and all levels of management
Demonstrated commitment to quality and project timing
Demonstrated ability to document complex systems
Experience creating and executing detailed test plans
Additional Education Preferred:
GCP Professional Data Engineer certification
In-depth software engineering knowledge
Data Architect
Data engineer job in Detroit, MI
Millennium Software is looking for a Data Architect for one of its direct clients based in Michigan. This is an onsite role.
Title: Data Architect
Tax term: Only w2, no c2c
Description:
All of the below are must-haves:
Senior Data Architect with 12+ years of experience in Data Modeling.
Develop conceptual, logical, and physical data models.
Experience with GCP Cloud
GCP Data Architect (only W2 Position - No C2C Accepted) 11.18.2025
Data engineer job in Dearborn, MI
Description: STG is an SEI CMMI Level 5 company with several Fortune 500 and State Government clients. STG has an opening for a GCP Data Architect.
Please note that this project assignment is with our own direct clients. We do not go through any vendors. STG only does business with direct end clients. This is expected to be a long-term position. STG will provide immigration and permanent residency sponsorship assistance to those candidates who need it.
Job Description:
Employees in this job function are responsible for designing, building and maintaining reliable, efficient and scalable data architecture and data models that serve as a foundation for all data solutions. They also closely collaborate with senior leadership and IT teams to ensure alignment of data strategy with overall business goals.
Key Responsibilities:
Align data strategy to business goals to support a mix of business strategy, improved decision-making, operations efficiency and risk management
Ensure data assets are available, consumable and secure for end users across the enterprise - applications, platforms and infrastructure - within the confines of enterprise and security architecture
Design and build reliable, efficient and scalable data architecture to be used by the organization for all data solutions
Implement and maintain scalable architectural data patterns, solutions and tooling to support business strategy
Design, build, and launch shared data services and APIs to support and expose data-driven solutions in line with enterprise architecture standards
Research and optimize data architecture technologies to enhance and support enterprise technology and data strategy
Skills Required:
PowerBuilder, PostgreSQL, GCP, BigQuery
Senior Specialist Exp.: 10+ years in IT; 7+ years in concentration
Must have experience presenting technical material to business users
Must be able to envision larger strategies and anticipate where possible synergies can be realized
Experience acting as the voice of the architecture/data model and defending its relevancy to ensure adherence to its principles and purpose
Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer these questions.
Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
Database Architecture: Design and optimize database schemas using the medallion architecture, which includes relational, star-schema, and denormalized data sets for BI and ML data consumers.
Metadata Management: Partner with the data governance team so that detailed documentation on data definitions, data lineage, and data quality statistics is available to data consumers.
Data Quality Assurance: Establish master data management data modeling that records the history of how customer, provider, and other party data is consolidated into a single version of the truth.
Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
Continuous Improvement: Stay abreast of emerging trends in data modeling, analytics platforms, and big data technologies. Recommend enhancements to existing data models and approaches.
Performance Optimization: Monitor and optimize data models for query performance and scalability. Troubleshoot and resolve performance bottlenecks in collaboration with database administrators.
Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.
The GCP Data Architect position is based in Dearborn, MI. This is a great opportunity to experience a corporate environment while advancing your career.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Vasavi Konda at vasavi.konda@stgit.com and/or call (248) 712-6725. In the subject line of the email, please include: First and Last Name: GCP Data Architect.
For more information about STG, please visit us at **************
Sincerely,
Vasavi Konda| Recruiting Specialist
“Opportunities don't happen, you create them.”
Systems Technology Group (STG)
3001 W. Big Beaver Road, Suite 500
Troy, Michigan 48084
Phone: (248) 712-6725 (O)
Email: vasavi.konda@stgit.com
Senior Data Architect
Data engineer job in Detroit, MI
LatentView Analytics is one of the world's largest and fastest-growing digital analytics firms.
We help companies drive digital transformation by helping them combine digital and traditional data to gain a competitive advantage. LatentView provides a 360-degree view of the digital consumer, enabling companies to predict new revenue streams, anticipate product trends and popularity, improve customer retention rates, and optimize investment decisions.
Role: Databricks Architect
Location: Detroit, Michigan
Must be authorized to work in the United States
No visa sponsorship available
Key Responsibilities
Architecture Design: Lead the end-to-end design of scalable data solutions on the Databricks Lakehouse Platform, including Delta Lake, Delta Live Tables (DLT), and Unity Catalog.
Data Engineering Leadership: Guide teams on best practices for batch/streaming ingestion, Spark optimization (PySpark/Scala), performance tuning, and CI/CD for data pipelines.
Automotive Domain Expertise: Understand and architect solutions for automotive-specific challenges (e.g., large-scale telematics, manufacturing data, connected vehicle data).
Cloud Integration: Manage and optimize Databricks on AWS, Azure, or GCP, integrating with other cloud services (S3, ADLS, IAM, etc.).
Governance & Security: Implement fine-grained access controls, data lineage, and compliance with security standards.
Stakeholder Collaboration: Work with business leaders, data scientists, analysts, and product owners to translate needs into technical roadmaps.
Innovation: Act as a subject matter expert (SME), recommending emerging tech and driving innovation in data & AI.
Core Skills & Qualifications
9-15 years of experience in Databricks, Python, Big Data, Apache Spark, SQL, and Spark SQL
Strong hands-on experience in PySpark and Apache Spark
Experience building data governance solutions such as Unity Catalog
Build a robust orchestration layer using Databricks Workflows and/or ADF
Build CI/CD pipelines for Databricks in Azure DevOps
Process near-real-time data through Auto Loader and DLT pipelines (a minimal sketch follows this list)
Implement a security layer in Delta Lake
Implement massively parallel processing layers in Spark SQL and PySpark
Implement cost-effective infrastructure in Databricks
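Since the list above names Auto Loader and near-real-time processing, here is a compact sketch of that ingestion pattern, assuming a Databricks runtime where `spark` is already defined; all paths and table names are placeholders, not from this posting.

```python
# Auto Loader ingestion sketch; paths and table names are placeholders.
# Assumes a Databricks runtime where `spark` is already available.
raw_stream = (
    spark.readStream.format("cloudFiles")               # Auto Loader source
    .option("cloudFiles.format", "json")                # incoming file format
    .option("cloudFiles.schemaLocation", "/mnt/schemas/events")
    .load("/mnt/landing/events")
)

(
    raw_stream.writeStream.format("delta")              # land as a Delta table
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .trigger(availableNow=True)                         # process available files, then stop
    .toTable("bronze.events")
)
```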
EEO Statement
“At LatentView Analytics LLC, we value a diverse, inclusive workforce and we provide equal employment opportunities for all applicants and employees. All qualified applicants for employment will be considered without regard to an individual's race, color, sex, gender identity, gender expression, religion, age, national origin or ancestry, citizenship, physical or mental disability, medical condition, family care status, marital status, domestic partner status, sexual orientation, genetic information, military or veteran status, or any other basis protected by federal, state or local laws. If you are unable to submit your application because of incompatible assistive technology or a disability, please contact us at ********************. LatentView Analytics LLC will reasonably accommodate qualified individuals with disabilities to the extent required by applicable law.”
Sr. Data Engineer/Architect (Python, Pyspark, Airflow, SDET) - (Face to Face Interview)
Data engineer job in Auburn Hills, MI
The Senior Data Engineer & Technical Lead (SDET Lead) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.
Mandatory Skills: Data Engineering, Python, PySpark, CI/CD, Airflow, Workflow Orchestration
Key Responsibilities:
Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability.
CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.
The role includes all of the above skills, plus the following:
Minimum of 7+ years overall IT experience
Experienced in waterfall, iterative, and agile methodologies
Technical Experience:
Hands-on Data Engineering: Minimum of 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments (a minimal DAG sketch follows this list).
CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.
Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
Unix/Linux: Strong command-line skills in Unix-like environments.
SQL: Solid understanding of SQL for data ingestion and analysis.
Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software.
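As referenced in the Airflow item above, here is a minimal DAG sketch of the kind this role would build; the DAG and task names are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch; all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("pull source data")          # stand-in for real extract logic


def transform() -> None:
    print("clean and validate")        # stand-in for real transform logic


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # run once per day (Airflow 2.4+ syntax)
    catchup=False,                     # do not backfill past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task     # extract runs before transform
```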
Education: Bachelor's or graduate degree in Computer Science, Data Analytics or related field, or equivalent work experience.
Unique Skills:
Graduate degree in a related field, such as Computer Science or Data Analytics
Familiarity with Test-Driven Development (TDD)
A high tolerance for OpenShift, Cloudera, Tableau, Confluence, Jira, and other enterprise tools.
CAE Engineer
Data engineer job in Novi, MI
Job Title: Thermal 3D Management CAE Specialist
Location: Novi, MI (Onsite)
Duration: Contract
Job Description: Responsible for designing and optimizing thermal management systems for vehicle components, this role focuses on ensuring the reliability and longevity of critical components such as batteries, power electronics, and electric motors. The Thermal Management Engineer leverages advanced simulation tools to analyze and improve thermal performance under various operating conditions.
Key Responsibilities:
• Develop and implement thermal management strategies for high-power components, including batteries, inverters, electric motors, and power electronics, using tools such as ANSYS, GT-SUITE, and STAR-CCM+.
• Conduct thermal simulations to evaluate and enhance the performance of cooling and heating systems under different operating conditions, such as high-speed driving, rapid acceleration, and charging.
• Collaborate with powertrain, electrical, and control teams to integrate thermal management solutions into the overall vehicle design, ensuring compatibility and efficiency.
• Analyze heat transfer and cooling requirements, assessing the effectiveness of components like radiators, heat exchangers, and HVAC systems to maintain optimal operating temperatures in electric and hybrid vehicle applications.
• Optimize battery thermal management to ensure consistent performance, prolong battery life, and enhance vehicle range under various environmental conditions.
• Validate thermal models against real-world data by conducting physical tests under different load and environmental conditions, adjusting simulation parameters as needed to improve accuracy.
• Document thermal analysis results, preparing detailed reports with recommendations for design improvements to enhance cooling efficiency and component reliability.
• Research and implement advanced cooling technologies, such as phase change materials, liquid cooling systems, and thermal insulation, to improve overall vehicle thermal performance.
• Proficiency in thermal simulation tools like ANSYS, GT-SUITE, and STAR-CCM+.
• Use expertise to correlate virtual results to physical tests; sign off virtual performance based on simulation models.
Java Software Engineer
Data engineer job in Ann Arbor, MI
Looking for candidates local to Ann Arbor, MI
Required Skills:
• 5+ years of Java, J2EE, and web/internet-based programming experience (both client and server side)
• 5+ years of experience with OOA/OOD, distributed systems/software, real-time processing, relational database systems, and messaging systems
• Experience with concurrency & multi-threading
• Experience with scaling, Java Garbage Collection, and performance tuning preferred
• Deep understanding of data structures, algorithms and design patterns (GoF)
• Experience with agile, test-driven development
• Experience with Unix/Linux
• Experience with build, deploy and test automation tools like Ant, Gradle, Maven, Jenkins, TeamCity, Junit, TestNG, JaCoCo or similar tools
• Demonstrated experience working with core business logic within applications
• Experience in developing APIs and Frameworks
• Excellent written and verbal communication skills
Preferred Skills
• Experience with application development frameworks like Spring, Hibernate, JSF or similar frameworks
• Experience with compilers or DSLs preferred
“Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”
Configuration Engineer - NO C2C
Data engineer job in Troy, MI
Title: Configuration Engineer
Rate: $45/hr
This is a contract on W2 and is NOT open to C2C.
This Configuration Engineer will be responsible for the configuration build-out of mortgage-related business processes. This includes configuring workflows, creating automated decision points and tasks, and managing system users. They will work with our Business Process Management team to support loan servicing business partners, vendors, and data providers as directed by Technology and Product Development leadership. The role requires strong consultative skills, root cause analysis, a strong understanding of data modeling, the ability to provide solutions and alternative methods to meet internal client expectations, and an in-depth understanding of process workflows and data integrations, such as APIs.
Responsibilities
Create and manage automated workflow solutions in a low-code environment including developing program modules.
Supports data reporting partners and data integration partners in testing and providing insight to the data structure in the Business Management Model.
Recommend and facilitate system enhancements to improve efficiencies throughout the servicing organization.
Supports internal and external partners in resolving defects by triaging issues, identifying root-cause failures, and providing solutions to facilitate a fix.
Ability to learn and master the low-code platform from the perspective of both end users and engineers.
Develops and maintains workflow automations and 3rd party integrations via API from data providers and internal data owners.
Coordinates with business partners and vendors to execute requirements.
Supports training teams in understanding the workflow automation.
Support internal customers to provide a positive technical experience.
Interface with other departments as necessary to ensure the smooth operation and growth of the organization.
Designs, documents, manages testing, and delivers solutions for assigned program modules.
Other projects and assignments as needed.
Qualifications And Experience
Bachelor's degree in science or equivalent experience
2-5 years of SaaS application deployment and/or similar experience.
Proficient in a programming or query language.
Able to read and understand API documentation, and versed in API authentication methods including OAuth, Basic Auth, tokens, and SAML (see the OAuth sketch at the end of this posting).
Proficient in working with REST APIs and in a major programming language.
Understanding of Mortgage Servicing processes including default servicing
Ability to work in a fast-paced fluid environment.
Excellent communication skills both written and verbal.
Ability to work independently and as a member of various teams and committees.
Commitment to excellence and high standards.
Not required but nice to have
Experience in integrating attorney networks and common vendors in the industry
Experience with JavaScript
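The qualifications above call for reading API documentation and working with OAuth-authenticated REST APIs. The sketch below shows a generic OAuth 2.0 client-credentials flow in Python; every endpoint and credential is a placeholder, not a real vendor API.

```python
# Generic OAuth 2.0 client-credentials sketch; endpoints are placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth/token"   # hypothetical auth server
API_URL = "https://api.example.com/v1/loans"         # hypothetical REST endpoint


def get_access_token(client_id: str, client_secret: str) -> str:
    """Exchange client credentials for a bearer token."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def list_loans(token: str) -> list:
    """Call the REST endpoint with the bearer token."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```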
Senior Java Software Engineer
Data engineer job in Detroit, MI
Sr. Fullstack Java Developer - Detroit, MI - Onsite
Duration: 1 Year
Employment Type: Contract - Can go for Inperson Interview
We are looking for an experienced Fullstack Java Developer (12-15 years of experience) to join our team for a long-term engagement. The ideal candidate will have strong hands-on experience across Java, Spring, front-end frameworks, databases, and cloud-ready tools, with the ability to lead a team and work directly with customers.
Responsibilities (Brief)
Develop and enhance applications using Java 17/8+, Spring Framework, JSON/XML, and AngularJS / Angular 8-11 / React.js.
Strong hands-on coding experience is required.
Work with MongoDB, MySQL, SQL, NoSQL databases.
Support upgrade/migration projects using Java, Spring, and Gradle.
Must have at least 3 years of experience with deployment (CI/CD pipelines)
Lead development activities and guide technical teams.
Follow Agile methodologies and drive customer value.
Participate in client discussions and deliver quality solutions.
Preferred: Experience with front-end technologies and healthcare insurance domain.
Communicate effectively with technical and business stakeholders.
Required Technical Skills
Java - Mandatory | 10+ years
AngularJS / Angular 8-11 - Mandatory | 5+ years
Spring Framework - Mandatory | 5+ years
JSON / XML - Mandatory | 5+ years
MongoDB / MySQL / SQL / NoSQL DBs - Mandatory | 5+ years
Gradle - Mandatory | 5+ years
Good to Have
Spring Boot - 3+ years
AngularJS / React.js / JSP - 3+ years
IntelliJ - 3+ years
Robotics Software/Systems Engineer
Data engineer job in Warren, MI
A Robotics Software/Systems Engineer job in Warren, MI is available courtesy of Akkodis. We are seeking a Senior Engineer, AI Systems Engineering - Integration to join a Manufacturing Technology Development team within the Research and Development organization. In this role, you will lead system-level integration of new technologies, validating novel AI and robotics algorithms in full-stack collaborative robot prototypes. You will develop frameworks for iterative assembly and testing, ensuring innovations can be evaluated in realistic workflows. You will serve as the convergence point where Robotics Intelligence breakthroughs and AI & Simulation models are combined into functional prototypes.
Pay: $40/hr to $60/hr
Robotics Software/Systems Engineer job responsibilities:
Lead integration of AI, perception, and robotics software into full-stack prototype systems.
Develop and maintain frameworks for iterative build, test, and validation cycles.
Ensure innovations are evaluated under realistic, production-relevant workflows.
Collaborate closely with Robotics Intelligence, AI & Simulation, Controls, and Hardware teams.
Manage system-level prototype bring-up, debugging, and performance validation.
Qualifications:
Bachelor's degree in Robotics, Computer Engineering, Electrical Engineering, or related field.
5+ years of experience in robotics software or systems integration.
Strong background in AI model deployment, ROS/ROS2, and hardware-software integration.
Experience working with collaborative robots, sensors, and real-world task workflows.
Excellent system-level debugging, communication, and cross-functional collaboration skills.
If you are interested in this Software/System Engineer job in Warren, MI please click APPLY NOW. For other opportunities available at Akkodis go to **************** If you have questions about the position, please contact *****************************.
Equal Opportunity Employer/Veterans/Disabled
Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits and 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State or local law; and Holiday pay upon meeting eligibility criteria.
Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit **********************************************
The Company will consider qualified applicants with arrest and conviction records.
Mid-Senior Software Engineer (Go/TypeScript/C++): $125-185K
Data engineer job in Ann Arbor, MI
We're assisting our European-based engineering client in identifying a Senior GoLang Software Engineer as they build out their US headquarters in Ann Arbor, Michigan.
This is a very exciting opportunity to be among the first members of an Emerging-Tech team here in the U.S.
We're Only Considering Local, Michigan Candidates at This Time. Candidates must be able to obtain a Security Clearance (US Citizen).
Highly Competitive Salary and Benefits
Ability to Work Several Days from Home
Cutting-Edge/Unique Tech: Greenfield Development
Your Contributions Will Have True Impact
What We're Looking for in a Senior GoLang Engineer:
3+ Years of Software Engineering Experience
Highly Proficient with Go / GoLang
Proficient in C++ and Linux Environments
Experience with TypeScript
CUDA Experience is Icing On The Cake!
Experience in an AWS Environment Highly Preferred
Experience in Machine Learning and New Model Architecture
B.S. Degree in Computer Science or STEM
ETL Architect
Data engineer job in Southfield, MI
360 IT Professionals is a software development company based in Fremont, California that offers complete technology services in mobile development, web development, cloud computing, and IT staffing. Merging information technology skills across all its services and operations, the company caters to its globally positioned clients by providing dynamic, feasible IT solutions. 360 IT Professionals works with its clients to deliver high-performance results based on each client's one-of-a-kind requirements.
Our services are vast and we produce software and web products. We specialize in Mobile development, i.e. iPhone and Android apps. We use Objective C and Swift programming languages to create native applications for iPhone, whereas we use Android Code to develop native applications for Android devices. To create applications that work on cross-platforms, we use a number of frameworks such as Titanium, PhoneGap and JQuery mobile.
Furthermore, we build web products and offer services such as web designing, layouts, responsive designing, graphic designing, web application development using frameworks based on model-view-controller architecture, and content management systems. Our services also extend to the domain of cloud computing, where we provide Salesforce CRM to effectively manage one's business and ease all operations through an easy platform. Apart from this, we also provide IT staffing services that can help your organization to a great extent, as you can hire highly skilled personnel through us.
We make sure that we deliver performance driven products that are optimally developed as per your organization's needs. Take a shot at us for your IT requirements and experience a radical change.
Job Description
Position: ETL Architect
Location: Southfield, MI
Duration: Contract to hire
Need candidates on W2 only
15-17 yrs. Experience
· This person will lead teams and work with management and executives
· Must have excellent communication
· This person is not hands on but must be able to speak to and understand how things work (Healthcare)
· Must have 3-4 yrs. as an architect and be able to show their career progression
· Cognos and Business Objects are nice to have
The Informatica ETL Architect has overall responsibility for assessing requirements and defining the strategy, technical architecture, implementation plan, and delivery of data warehouse projects. The Informatica ETL Architect must have prior experience completing successful data warehousing implementations as well as a broad background and experience with IT application development. This individual is responsible for establishing the long-term strategy and technical architecture as well as the short-term scope for a multi-phased data warehouse effort, and should have strong professional consulting skills and the ability to communicate well at all levels of the organization.
Top Skill Set:
• Lead ETL architecture and design as well as data flow diagramming
• Define and implement ETL development standards & procedures
• Ensure ETL quality through code reviews, thorough inspection, and knowledge sharing
• At least 12 years of experience with Informatica in a Developer/Tech Lead role
• Knowledge of Health Care Insurance Payer Data Warehousing preferred
• Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects
Required Skills/Experience:
• 12 to 16 years of Informatica ETL development experience
• At least 4 years of experience as an Informatica ETL Architect
• At least 8-10 years of experience with Informatica in a Developer/Tech Lead role
• Mastery of data warehousing concepts; the candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects
• MUST HAVE strong SQL skills in an Oracle Partitioned Environment
• Experience in Business Intelligence reporting tools like Cognos and Business Objects preferred
• Experience in Oracle database programming using Partitioning, Materialized Views, and OLAP
• Experience in tuning Oracle queries/processes and performance management tools
Tools & Technologies: Informatica 8.x and above (9.1 preferred), PowerCenter, PowerExchange, Data Quality; Oracle 10g and above; Unix Shell Scripting (AIX, Linux); Scheduling Tools (any one of Tivoli, Autosys, and Ctrl-M); SQL
Additional Information
Regards,
Vishal Rana
Talent & Client Acquisition Specialist
Phone: 510 254 3300 Ext 178
Data Scientist III
Data engineer job in Pontiac, MI
Team members in the Data Scientist role at UWM are responsible for modeling complex problems, discovering insights, and identifying opportunities using statistics, algorithms, machine learning, and visualization techniques. Data Scientists work closely with executives, product owners, SMEs, and other business teams to leverage data and help inform critical business decisions. Data Scientists at UWM need to be creative thinkers who propose innovative ways to look at problems by examining and discovering new patterns within our datasets and collaborating with our business stakeholders.
They will need to validate their results using an experimental and iterative approach. Perhaps most importantly, they will need to be able to communicate their insights and results to the business in a clear, concise, and approachable way. They need to be storytellers of their work.
These professionals will need a combination of business focus, data programming knowledge, and strong analytical and problem-solving skills to be able to quickly develop and test hypotheses and provide conclusions in a clear, structured manner. This role includes the full data science lifecycle, from analytic problem definition through data wrangling, analysis, model development, reporting/visualization development, testing, deployment, and feedback.
WHAT YOU WILL BE DOING
* Work with stakeholders throughout the organization to identify opportunities for leveraging company data to increase efficiency or improve the bottom line.
* Analyze UWM data sets to identify areas of optimization and improvement of business strategies.
* Assess the effectiveness and accuracy of new data sources and data gathering techniques.
* Develop custom data models, algorithms, simulations, and predictive modeling to support insights and opportunities for improvement.
* Develop an A/B testing framework and test model quality (a minimal sketch follows this list).
* Coordinate with different business areas to implement models and monitor outcomes.
* Develop processes and tools to monitor and analyze model performance and data accuracy
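To make the A/B testing item above concrete, here is a minimal two-proportion z-test, the statistical core of a simple A/B framework; the conversion counts are invented for illustration, not UWM data.

```python
# Two-proportion z-test sketch; counts below are purely illustrative.
from math import sqrt

from scipy.stats import norm


def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - norm.cdf(abs(z)))                  # two-sided p-value


if __name__ == "__main__":
    # 120/1000 control conversions vs. 150/1000 variant conversions
    print(f"p-value: {ab_test(120, 1000, 150, 1000):.4f}")
```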
WHAT WE NEED FROM YOU
Must Have
* Bachelor's degree in Finance, Statistics, Economics, Data Science, Computer Science, Engineering or Mathematics, or related field
* 5+ years of experience in statistical analysis, and/or machine learning
* 5+ years of experience with one or more of the following tools: machine learning (Python, MATLAB), data wrangling skills/tools (Hadoop, Teradata, SAS, or other), statistical analysis (Python, R, SAS) and/or visualization skills/tools (PowerBI, Tableau, Qlikview)
* 3+ years of experience collaborating with teams (either internal or external) to develop analytics solutions
* Strong problem solving skills
* Strong communication skills (interpersonal, written, and presentation)
Nice to Have
* Master's degree in Finance, Statistics, Economics, Data Science, Computer Science, Mathematics or related field
* 3+ years of experience with R, SQL, Tableau, MATLAB, Python
* 3+ years of professional experience in machine learning, data mining, statistical analysis, modeling, optimization
* Experience in Accounting, Finance, and Economics
THE PLACE & THE PERKS
Ready to join thousands of talented team members who are making the dream of home ownership possible for more Americans? It's all happening on UWM's campus, where our award-winning workplace packs plenty of perks and amenities that keep the atmosphere buzzing with energy and excitement.
It's no wonder that out of our six pillars, People Are Our Greatest Asset is number one. It's at the very heart of how we treat each other, our clients and our community. Whether it's providing elite client service or continuously striving to improve, our pillars provide a pathway to a more successful personal and professional life.
From the team member that holds a door open to the one that helps guide your career, you'll feel the encouragement and support on day one. No matter your race, creed, gender, age, sexual orientation and ethnicity, you'll be welcomed here. Accepted here. And empowered to Be You Here.
More reasons you'll love working here include:
* Paid Time Off (PTO) after just 30 days
* Additional parental and maternity leave benefits after 12 months
* Adoption reimbursement program
* Paid volunteer hours
* Paid training and career development
* Medical, dental, vision and life insurance
* 401k with employer match
* Mortgage discount and area business discounts
* Free membership to our large, state-of-the-art fitness center, including exercise classes such as yoga and Zumba, various sports leagues and a full-size basketball court
* Wellness area, including an in-house primary-care physician's office, full-time massage therapist and hair salon
* Gourmet cafeteria featuring homemade breakfast and lunch
* Convenience store featuring healthy grab-and-go snacks
* In-house Starbucks and Dunkin
* Indoor/outdoor café with Wi-Fi
DISCLAIMER
All the above duties and responsibilities are essential job functions subject to reasonable accommodation and change. All job requirements listed indicate the minimum level of knowledge, skills and/or ability deemed necessary to perform the job proficiently. Team members may be required to perform other or different job-related duties as requested by their team lead, subject to reasonable accommodation. This document does not create an employment contract, implied or otherwise. Employment with UWM is "at-will." UWM is an Equal Opportunity Employer. By selecting "Apply for this job online" you provide consent to UWM to record phone call conversations between you and UWM to be used for quality control purposes.
Data Scientist
Data engineer job in Detroit, MI
Please review and apply for this position through the QCI system using the link below (copy and paste): http://tinyurl.com/nzn6msu. You can also apply through Indeed on mobile devices with this link.
Job Description
The Data Scientist will delve into the recesses of large data sets of structured, semi-structured, and unstructured data to discover hidden knowledge about our business and develop methods to leverage that knowledge within our line of business. The successful candidate will combine strengths in mathematics and applied statistics, computer science, and visualization capabilities with a healthy sense of exploration and knowledge acquisition. You must have USA/Canadian citizenship or your Green Card/EAD.
Responsibilities
Work closely with various teams across the company to identify and solve business challenges utilizing large structured, semi-structured, and unstructured data in a distributed processing environment.
Develop predictive statistical, behavioral or other models via supervised and unsupervised machine learning, statistical analysis, and other predictive modeling techniques.
Drive the collection of new data and the refinement of existing data sources.
Analyze and interpret the results of product experiments.
Collaborate with the engineering and product teams to develop and support our internal data platform to support ongoing analyses.
Requirements
M.S. or Ph.D. in a relevant technical field (e.g. applied mathematics, statistics, physics, computer science, operations research), or 3+ years experience in a relevant role.
Extensive experience solving analytics problems using quantitative approaches.
A proven passion for generating insights from data.
Strong knowledge of statistical methods generally, and particularly in the areas of modeling and business analytics.
Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources.
Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner.
Fluency with at least one scripting language such as Python, Java, or C/C++.
Familiarity with relational databases and SQL.
Additional Information
All your information will be kept confidential according to EEO guidelines.
Lead Data Scientist
Data engineer job in Detroit, MI
OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The Company's core offerings accelerate business, amplify real-time results, and help set their clients apart from their competitors. OneMagnify partners with clients to design, implement and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion.
OneMagnify's commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace and Cool Workplace in the United States for 10 consecutive years and recently was recognized as a Top Workplace in India.
You'll be joining our RXA Data Science team, a group dedicated to leveraging advanced analytics, predictive modeling, and machine learning to drive smarter marketing and business decisions. As Lead Data Scientist, you will play a critical role in delivering impactful, data-driven solutions. In this role, you will bridge strategy and execution-translating complex business problems into analytically sound solutions while ensuring technical excellence, timely delivery, and cross-functional collaboration.
The Lead Data Scientist is responsible for leading the execution of end-to-end data science projects, from scoping and modeling to operationalization and insight delivery. You will partner with clients, internal teams, and technical stakeholders to develop and deploy scalable solutions that drive measurable business value.
What you'll do:
Lead the design, development, and deployment of statistical models, machine learning algorithms, and custom analytics solutions
Collaborate consistently with team members to understand the purpose, focus, and objectives of each data analysis project, ensuring alignment and meaningful support
Translate client goals into clear modeling strategies, project plans, and deliverables
Guide the development of production-level model pipelines using tools such as Databricks and Azure ML
Collaborate with engineering, marketing, and strategic partners to integrate models into real-world applications
Monitor and improve model performance, ensuring high standards for reliability and business relevance
Present complex analytical results to technical and non-technical audiences in a clear, actionable format
Support innovation by identifying new tools, methods, and data sources-including the use of Snowflake for modern data architecture
Promote best practices in model governance, data ethics, and responsible AI
What you need:
Minimum 5-7 years of experience in data science, analytics, or predictive modeling
Experience leading all aspects of sophisticated data science initiatives with a solid foundation in technical strategy and execution
Strong programming skills in Python, R, or SAS for modeling and data analysis
Advanced SQL capabilities and experience working in cloud-based environments (e.g., Azure, AWS)
Hands-on experience with Databricks, Azure Machine Learning, and Snowflake strongly preferred
Experience applying the modeling rigor and documentation standards required in regulated industries such as financial services is a strong plus
Expertise in regression, classification, clustering, A/B testing, and audience segmentation (a toy segmentation sketch follows this list)
Proficiency with Tableau, Power BI, and Excel for data visualization and communication
Strong communication skills and the ability to translate complex technical findings into business insight
Bachelor's degree in Data Science, Statistics, Computer Science, or a related quantitative field (Master's preferred)
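As a toy illustration of the audience-segmentation expertise listed above, the sketch below clusters synthetic customers with k-means; the data and feature names are invented, not client data.

```python
# K-means audience-segmentation sketch on synthetic data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Synthetic customers: columns are [annual spend, visits per month]
customers = np.vstack([
    rng.normal([200, 2], [50, 1], size=(100, 2)),      # low-engagement group
    rng.normal([1500, 12], [300, 3], size=(100, 2)),   # high-engagement group
])

features = StandardScaler().fit_transform(customers)   # scale before clustering
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for label in np.unique(segments):
    group = customers[segments == label]
    print(f"segment {label}: n={len(group)}, mean spend=${group[:, 0].mean():.0f}")
```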
Benefits
We offer a comprehensive benefits package including medical, dental, 401(k), paid holidays, vacations, and more.
About us
Whether it's awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges.
We are an equal opportunity employer
We believe that Innovative ideas and solutions start with unique perspectives. That's why we're committed to providing every employee a workplace that's free of discrimination and intolerance. We're proud to be an equal opportunity employer and actively search for like-minded people to join our team.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform job functions, and to receive benefits and privileges of employment. Please contact us to request accommodation.
Data Scientist 1
Data engineer job in Southfield, MI
Under general supervision, applies knowledge of statistics, machine learning, programming, data modeling, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to increased efficiency and productivity. Analysis focuses on product warranty-related data sets that may be applied to various areas of the business (e.g., Manufacturing, Design, Marketing/Advertising).
ESSENTIAL DUTIES AND RESPONSIBILITIES:
Develops and maintains statistical models capable of extracting valuable business and/or technical insights from large data sets.
Proactively participates in developing scalable Business Intelligence (BI) and predictive analytics solutions using a variety of techniques ranging from data aggregation to data mining.
Assists with conducting needs assessment and requirements gathering to design and assist in deploying data analytics solutions.
Performs data manipulation and analytics and translates insights into actionable recommendations for management and customers.
Prepares presentations and reports of statistical concepts and research results related to efficiency initiatives to be shared with a non-statistical audience and senior stakeholders.
Facilitates training and education of associates related to new systems and procedures.
Creates standards for process improvement projects using Data Analytics and Statistical Analysis Theory.
Applies statistical expertise to advance current and future regional initiatives.
Maintains current knowledge of industry trends in the field of Big Data Analytics by attending related seminars, conferences and training sessions.
Performs other duties as assigned.
QUALIFICATIONS:
Bachelor's Degree in math, statistics, computer science, software engineering or a related field
Experience with Node.js, SQL, Python, APIs, MySQL/MongoDB, large language models, and business intelligence tools.
0+ years of related experience
SKILLS AND ABILITIES:
Experience with databases and large data sets and related reporting tools
Analytical skills with an ability to independently evaluate and develop innovative solutions to complex situations
Basic knowledge of automotive parts and related vehicle systems (e.g., charging/starting electrical, emission, HVAC systems, vehicle electronic components).
Written and verbal communication skills and presentation skills. Ability to communicate with internal and external customers on issues of moderate to considerable importance, up to and including senior management.
Demonstrated ability to foster and maintain relationships with an ability to work as part of a cross functional team
Continuous improvement mindset
Ability to apply process improvement planning and checking to own work output and to assist others with identifying gaps in work results
Benefits Summary:
Health, Dental, Vision, Prescription Drug plans
Life and Accidental Death & Dismemberment Insurance
Flexible Spending Account
Employee Assistance Program
401K with 4% company match
Bonus Program
Wellness Program
Onsite Fitness Center (vary by location)
Tuition Reimbursement
Career Development and Ongoing Training
Paid holidays and vacation
Cafeteria and food markets (vary by location)
Volunteer opportunities
Employee recognition (employee and milestone events)
Annual Salary: $78,000 - $98,000
Data Scientist
Data engineer job in Dearborn, MI
In Global Data Insight & Analytics (GDIA), we aspire to navigate Ford Motor Company through the disruptiveness of the information age, harnessing the power of data and artificial intelligence to realize the enterprise's known goals, reveal hidden opportunities, and achieve data superiority.
The GDIA Complexity Analytics team develops data products and analytic software, provides insights to a broad range of skill teams, and delivers value to Ford using critical thinking, artificial intelligence (AI), machine learning (ML), and optimization techniques.
As a Data Scientist, you will use your knowledge of data and advanced analytics to identify and articulate the role data and analytics products play in helping the business achieve their goals. You will develop analytics products using your expertise in visualization, AI/ML, Statistics and Optimization using GDI&A approved packages and architectures. You will collaborate with Data Engineers and Software Engineers to develop robust analytics products. You will use your knowledge of the product driven operating model, analytic and software delivery via Google Cloud Platform to optimize the delivery of value. You will interact with business partners in Complexity to align with their needs and processes to ensure relevancy of products.
You'll have...
Master's degree in a quantitative field, such as Data Science, Engineering, Operations Research, Industrial Engineering, Statistics, Mathematics OR Computer Science
2+ years of hands-on experience with mathematical programming, machine learning, artificial intelligence, optimization/simulation techniques, or statistical analysis
Demonstrated technical skills in data analytics, AI/ML, operations research, and/or optimization
1+ year of experience delivering analytics solutions
1+ year of experience with Agile team methodology
Even better, you may have...
A PhD in a quantitative field, such as Data Science, Engineering, Operations Research, Industrial Engineering, Statistics, Mathematics, Computer Science, or a related field, is preferred
Proven experience with developing data products/solutions to support analytic applications in Ford's data ecosystem
Experience with Product Driven Operating Model or Agile Product Development Process
Proven proficiency in developing and deploying analytic models, working in a team environment, supporting customers and/or end users
Comfortable working in an environment where problems are not always well-defined
Strong interpersonal and leadership skills, with ability to communicate complex topics to leaders and peers in a simple and clear manner
Well-organized, independent, and ready to work with minimal supervision
Inquisitive, proactive, and interested in learning new tools and techniques
Demonstrated hands on experience with deploying data products and/or analytic models in Ford's on-prem and/or Google Cloud Platform
Demonstrated experience translating real-world business problems into analytical formulations and interpreting analytics results with non-analytics business partners
You may not check every box, or your experience may look a little different from what we've outlined, but if you think you can bring value to Ford Motor Company, we encourage you to apply!
As an established global company, we offer the benefit of choice. You can choose what your Ford future will look like: will your story span the globe, or keep you close to home? Will your career be a deep dive into what you love, or a series of new teams and new skills? Will you be a leader, a changemaker, a technical expert, a culture builder…or all of the above? No matter what you choose, we offer a work life that works for you, including:
• Immediate medical, dental, vision and prescription drug coverage
• Flexible family care days, paid parental leave, new parent ramp-up programs, subsidized back-up child care and more
• Family building benefits including adoption and surrogacy expense reimbursement, fertility treatments, and more
• Vehicle discount program for employees and family members and management leases
• Tuition assistance
• Established and active employee resource groups
• Paid time off for individual and team community service
• A generous schedule of paid holidays, including the week between Christmas and New Year's Day
• Paid time off and the option to purchase additional vacation time.
For a detailed look at our benefits, click here:
*******************************
This position falls within salary grades 6-8.
Visa sponsorship is available for this position.
Candidates for positions with Ford Motor Company must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
We are an Equal Opportunity Employer committed to a culturally diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, color, age, sex, national origin, sexual orientation, gender identity, disability status or protected veteran status. In the United States, if you need a reasonable accommodation for the online application process due to a disability, please call **************.
#LI-Hybrid
What you'll do...
Accelerate the application of value-added analytics and machine learning into the portfolio of products for Complexity Analytics.
Drive analytic excellence into product teams by collaborating with Data Scientists, Data Engineers and Software Engineers in analytic and machine learning methods.
Work closely with the Product Manager and Product Owner to translate Business Value needs into analytic deliverables and, where appropriate, software products for delivery by product teams.
Work hands-on with the team and other partners to deliver solutions that meet our customers' requirements and needs.
Act as a consultant to the business rather than an order taker.
Balance "doing it right" with "speed to delivery" by identifying and mitigating risk, generating options, educating business and other decision makers, and taking on justified technical debt.
Data Engineer
Data engineer job in Novi, MI
Our Purpose At Vibe, we are driven by our mission to elevate community and create opportunity. We believe in fostering an environment of inclusivity where every team member has the chance to grow professionally. Guided by our core values - be inclusive, educate, embrace change, and seek opportunities - we are dedicated to making a positive impact in the lives of our members and communities. As we continue to grow and expand our team, we are seeking passionate individuals who share our vision and are eager to join us in our journey. If you are someone who is passionate about making a difference and is committed to creating a brighter future for our communities, we invite you to explore this exciting opportunity at Vibe!
Position Purpose
The Data Engineer will be responsible for designing, developing, and managing data pipelines, ensuring data quality, and optimizing performance on the Snowflake, Power BI, and SQL platforms. This role will work closely with the Business Intelligence team, data analysts, and our IT departments to make our data sources accessible and actionable.
Essential Duties
Design & Development:
* Develop and maintain robust, scalable data pipelines for extracting, transforming, and loading (ETL) data into Snowflake from various sources (e.g., databases, APIs, third-party systems).
* Implement and optimize Snowflake schemas (e.g., star and snowflake schemas) and data models for analytics and reporting.
* Write complex SQL queries that enable business analysts in partner departments to extract and transform data.
* Develop complex data models within Power BI and produce self-service data visualization dashboards. (A minimal pipeline sketch follows this list.)
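For orientation, a minimal sketch of the kind of ELT step described above, written against the snowflake-connector-python package, might look like the following; every connection parameter, table, and stage name here is an invented placeholder, not Vibe's actual environment.

    # Illustrative sketch only: load staged files into Snowflake, then
    # populate a star-schema fact table from the landing table.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",   # placeholder
        user="etl_service",          # placeholder
        password="***",              # real code would use a secrets manager
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="MARTS",
    )

    LOAD_RAW = """
    COPY INTO RAW.ORDERS_LANDING
    FROM @RAW.ORDERS_STAGE
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """

    BUILD_FACT = """
    INSERT INTO MARTS.FCT_ORDERS (order_key, customer_key, order_date, amount)
    SELECT o.order_id, c.customer_key, o.order_date, o.amount
    FROM RAW.ORDERS_LANDING o
    JOIN MARTS.DIM_CUSTOMER c ON c.customer_id = o.customer_id
    """

    with conn.cursor() as cur:
        cur.execute(LOAD_RAW)
        cur.execute(BUILD_FACT)
    conn.close()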
Data Integration:
* Collaborate with cross-functional teams to gather data requirements and design integrations between Snowflake and other data sources (e.g., on-premise databases, cloud platforms, data lakes).
* Work with our vendors to develop efficient ways to exchange data.
* Implement data ingestion frameworks using tools like Snowpipe for real-time or batch processing. (A minimal pipe definition is sketched after this list.)
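As a hypothetical illustration of the Snowpipe-style ingestion mentioned above (the pipe, table, and stage names are invented):

    # Illustrative sketch only: define a pipe that continuously copies
    # newly staged JSON files into a raw table.
    CREATE_PIPE = """
    CREATE PIPE IF NOT EXISTS RAW.MEMBER_EVENTS_PIPE
      AUTO_INGEST = TRUE
    AS
      COPY INTO RAW.MEMBER_EVENTS
      FROM @RAW.MEMBER_EVENTS_STAGE
      FILE_FORMAT = (TYPE = 'JSON')
    """

    with conn.cursor() as cur:   # conn: the connection from the earlier sketch
        cur.execute(CREATE_PIPE)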
Optimization & Performance:
* Optimize Snowflake queries, databases, and storage to improve performance and reduce costs (e.g., clustering, pruning, and partitioning); see the sketch after this list.
* Monitor and troubleshoot data pipelines and optimize processes for reliability, efficiency, and scalability.
* Develop and execute a cost-conscious data lake ingestion strategy by evaluating, prioritizing, and onboarding high-value data sources to ensure scalable, efficient, and business-aligned data architecture.
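To make the clustering and pruning point concrete, here is a hedged sketch of two common Snowflake levers; the table and column names are hypothetical.

    # Illustrative sketch only: cluster a large fact table on its most
    # common filter column, then inspect how well pruning will work.
    CLUSTER_TABLE = "ALTER TABLE MARTS.FCT_ORDERS CLUSTER BY (order_date)"
    CLUSTERING_INFO = (
        "SELECT SYSTEM$CLUSTERING_INFORMATION('MARTS.FCT_ORDERS', '(order_date)')"
    )

    with conn.cursor() as cur:   # conn: the connection from the earlier sketch
        cur.execute(CLUSTER_TABLE)
        print(cur.execute(CLUSTERING_INFO).fetchone()[0])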
Data Governance & Quality:
* Implement and maintain data quality checks and validation mechanisms (a simple validation sketch follows this list).
* Ensure compliance with data privacy and security standards, especially when working with sensitive or regulated data.
* Maintain documentation on data processes, pipelines, and data models for internal use.
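A simple validation sketch, again with hypothetical table and column names, might fail the pipeline when key invariants break:

    # Illustrative sketch only: post-load data quality checks.
    CHECKS = {
        "no_null_keys": (
            "SELECT COUNT(*) FROM MARTS.FCT_ORDERS WHERE order_key IS NULL"
        ),
        "no_duplicate_keys": (
            "SELECT COUNT(*) FROM (SELECT order_key FROM MARTS.FCT_ORDERS "
            "GROUP BY order_key HAVING COUNT(*) > 1)"
        ),
    }

    with conn.cursor() as cur:   # conn: the connection from the earlier sketch
        for name, sql in CHECKS.items():
            (violations,) = cur.execute(sql).fetchone()
            if violations:
                raise ValueError(f"data quality check failed: {name} ({violations} rows)")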
Collaboration:
* Work with data analysts, IT development teams and partner departments to ensure that data solutions align with business requirements.
* Assist in providing access to data and creating views in Snowflake for reporting and analytics purposes.
Education/Experience
* Strong experience in data engineering or related roles with a focus on Snowflake.
* Hands-on experience with Snowflake's architecture, performance tuning, and advanced features (e.g., Snowflake Streams, Tasks, Snowpipe).
* Strong partnership and collaboration skills, with experience developing solutions for several internal departments and external vendors.
* Proficiency in SQL, Snowflake SQL, DB2 and Python.
* Experience with ETL/ELT tools such as Talend, Apache NiFi, or custom solutions.
* Experience with cloud platforms (AWS, Azure, or Google Cloud) and data integration services.
Skills/Abilities
* Expertise in Snowflake data warehousing concepts such as data loading, performance tuning, partitioning, and storage optimization.
* Proficient in SQL dialects (e.g., DB2 SQL, SnowSQL, SQL Server T-SQL) and in programming languages such as Python or Java for data pipeline development.
* Knowledge of data visualization and reporting tools (e.g., Power BI) is a plus.
* Strong problem-solving and troubleshooting skills.
* Excellent communication skills for collaborating with stakeholders and team members.
* Ability to manage multiple projects and meet deadlines in a fast-paced environment.
Physical Requirements
These physical demands are representative of the physical requirements necessary for an employee to successfully perform the essential functions of the position. Reasonable accommodations can be made to enable people with disabilities to perform the described essential functions of the position. While performing the responsibilities of the job, the employee is required to hear, see, talk, stand, walk, stoop, kneel, lift, push, pull, and grasp.
GCP Data Architect
Data engineer job in Dearborn, MI
Title: GCP Data Architect
Description: STG is a fast-growing Digital Transformation services company providing Fortune 500 companies with Digital Transformation, Mobility, Analytics, and Cloud Integration services in both information technology and engineering product lines. STG has a 98% repeat business rate from existing clients and has achieved industry awards and recognition for its services.
Responsibilities:
Data Modeling and Design: Develop conceptual, logical, and physical data models for business intelligence, analytics, and reporting solutions. Transform requirements into scalable, flexible, and efficient data structures that can support advanced analytics.
Requirement Analysis: Collaborate with business analysts, stakeholders, and subject matter experts to gather and interpret requirements for new data initiatives. Translate business questions into data models that can answer these questions.
Data Integration: Work closely with data engineers to integrate data from multiple sources, ensuring consistency, accuracy, and reliability. Map data flows and document relationships between datasets.
Database Architecture: Design and optimize database schemas using the medallion architecture, which includes relational, star-schema, and denormalized data sets for BI and ML data consumers (see the sketch after this list).
Metadata Management: Partner with the data governance team so that detailed documentation on data definitions, data lineage, and data quality statistics is available to data consumers.
Data Quality Assurance: Establish master data management data modeling that records how customer, provider, and other party data is consolidated into a single version of the truth.
Collaboration and Communication: Serve as a bridge between technical teams and business units, clearly communicating the value and limitations of various data sources and structures.
Governance and Compliance: Ensure that data models and processes adhere to regulatory standards and organizational policies regarding privacy, access, and security.
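As a hedged illustration of the medallion-style refinement referenced above, the following sketch runs one bronze-to-silver and one silver-to-gold step on BigQuery via the google-cloud-bigquery package; the project, dataset, and table names are invented, not a client environment.

    # Illustrative sketch only: medallion layers on BigQuery
    # (bronze = raw, silver = cleaned, gold = star-schema marts).
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project

    SILVER_SQL = """
    CREATE OR REPLACE TABLE silver.orders AS
    SELECT CAST(order_id AS INT64) AS order_id,
           customer_id,
           DATE(order_ts) AS order_date,
           amount
    FROM bronze.orders_raw
    WHERE order_id IS NOT NULL
    """

    GOLD_SQL = """
    CREATE OR REPLACE TABLE gold.fct_orders AS
    SELECT o.order_id, c.customer_key, o.order_date, o.amount
    FROM silver.orders o
    JOIN gold.dim_customer c USING (customer_id)
    """

    for sql in (SILVER_SQL, GOLD_SQL):
        client.query(sql).result()  # wait for each job to finish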
Experience Required:
Specialist experience: 10+ years in IT; 7+ years as a Data Architect
PowerBuilder
PostgreSQL
GCP
BigQuery
The GCP Data Architect position is based at our corporate office in Dearborn, Michigan. It is a great opportunity to experience a corporate environment while advancing your career.
Resume Submittal Instructions: Interested/qualified candidates should email their Word-formatted resumes to Ms. Shweta Huria at ********************** and/or contact at ************. In the subject line of the email, please include: First and Last Name (GCP Data Architect).
For more information about STG, please visit us at **************