Senior React Developer (Next.js)
Data engineer job in Lansing, MI
What You'll Do:
Develop and maintain scalable, responsive, and accessible web applications using React and modern frontend frameworks.
Implement reusable UI components, design systems, and custom hooks to accelerate development.
Collaborate with product managers, designers, and backend engineers to deliver high-quality features.
Optimize application performance through memoization, code splitting, and efficient state management.
Participate in code reviews, documentation, and continuous improvement of development processes.
Contribute to CI/CD pipelines and deployment processes for production applications.
What We're Looking For:
Technical Skills:
3+ years of experience with React, including modern hooks and patterns.
Strong expertise with Redux or similar state management solutions.
Proficient in JavaScript (ES6+) and CSS (modern frameworks like Tailwind, SCSS, or styled-components).
Experience integrating RESTful APIs and handling asynchronous operations.
Familiarity with Next.js or server-side rendering is a strong advantage.
Git version control and experience with development tooling (npm, yarn, linting, CI/CD).
Core Competencies:
Building accessible web applications (WCAG 2.1 AA).
Strong debugging, problem-solving, and performance optimization skills.
Comfortable working in Agile/Scrum environments.
Ability to implement and maintain complex forms and dynamic UI components.
Nice-to-Haves:
Experience with state persistence, design systems, and multi-step forms.
Knowledge of deployment pipelines, testing frameworks, and automated CI/CD processes.
Why You'll Love Working Here:
Impactful projects: Work on dashboards, dynamic forms, and reusable component libraries.
Collaborative culture: A team that values your ideas and growth.
Professional development: Opportunities to learn, mentor, and advance your career.
Flexible work environment: Remote or hybrid options available.
Senior .NET Developer
Data engineer job in Jackson, MI
We are seeking an experienced and motivated .NET Developer with a strong background in Electronic Data Interchange (EDI) development leveraging Microsoft Azure services. In this role, you will be responsible for the design, development, and maintenance of scalable and secure EDI integration solutions on the Azure cloud platform. You will also share knowledge with and coach CE development resources as needed to transition future development and support to company employees, and develop appropriate standards, processes, and procedures for runbook and monitoring purposes. The ideal candidate will have expertise in both .NET development and EDI transaction standards, and will thrive in a collaborative, agile environment.
Skills
Design, develop, and deploy EDI solutions and applications using the .NET framework and Azure services.
Work with stakeholders and business analysts to gather and translate business requirements into technical specifications for EDI integration.
Develop, configure, and maintain EDI mapping and translation processes using standard formats such as ANSI X12.
Utilize Azure services such as Azure Functions, Logic Apps, and App Service to build and manage EDI workflows.
Collaborate with trading partners to coordinate and test EDI file transfers and resolve any issues related to connectivity or data integrity.
Write and maintain high-quality, scalable code in C# and SQL, ensuring adherence to coding standards and best practices.
Implement and manage DevOps practices for CI/CD pipelines using Azure DevOps.
Provide technical support and troubleshooting for existing EDI and cloud-based applications.
Create and maintain technical documentation related to EDI processes, system architecture, and integrations.
Train and mentor other team members as needed.
Education
Bachelor's degree in Computer Science, Information Technology, or a related field.
Data Scientist
Data engineer job in Lansing, MI
At Western Digital, our vision is to power global innovation and push the boundaries of technology to make what you thought was once impossible, possible. At our core, Western Digital is a company of problem solvers. People achieve extraordinary things given the right technology. For decades, we've been doing just that: our technology helped people put a man on the moon and capture the first-ever picture of a black hole.
We offer an expansive portfolio of technologies, HDDs, and platforms for business, creative professionals, and consumers alike under our Western Digital, WD, WD_BLACK, and SanDisk Professional brands.
We are a key partner to some of the largest and highest-growth organizations in the world. From enabling systems to make cities safer and more connected, to powering the data centers behind many of the world's biggest companies and hyperscale cloud providers, to meeting the massive and ever-growing data storage needs of the AI era, Western Digital is fueling a brighter, smarter future.
Today's exceptional challenges require your unique skills. Together, we can build the future of data storage.
**Job Description**
ESSENTIAL DUTIES AND RESPONSIBILITIES
+ **Business Partnership & Consulting**
+ Serve as the primary analytics partner to HR and business leaders, understanding their challenges and translating them into analytical solutions.
+ Provide insights and recommendations that inform decisions on talent strategy, workforce planning, retention, and employee experience.
+ Build strong relationships with HRBPs, COEs, and leadership teams to ensure alignment on priorities.
+ Experience advising, presenting to, and serving as a thought partner to senior executives.
+ **Analytics & Insights**
+ Develop dashboards, reports, and analyses on workforce metrics (e.g., attrition, DEI, engagement, recruiting, performance).
+ Translate complex data into clear, actionable insights with strong storytelling and visualization.
+ Deliver executive-ready materials that connect people data to business outcomes.
+ Partner cross-functionally with analytics and technical teams to ensure data accuracy, resolve quality issues, and maintain consistent, reliable insights.
+ **Advanced People Analytics**
+ Use statistical analysis, predictive modeling, and trend forecasting to identify workforce risks and opportunities.
+ Partner with HR Technology and Data teams to enhance data quality, governance, and reporting capabilities.
+ Lead initiatives to evolve people analytics from descriptive to predictive and prescriptive insights.
+ **Strategy & Enablement**
+ Guide stakeholders in building a data-driven culture within HR and across the business.
+ Drive adoption of self-service analytics platforms and democratize access to people insights.
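As a rough illustration of the descriptive-to-predictive step described in the responsibilities above, the sketch below fits a simple attrition-risk model; it is not a Western Digital asset, and the dataset path and column names are hypothetical placeholders.

```python
# Illustrative only: a minimal attrition-risk model of the kind the
# "Advanced People Analytics" bullets describe. The CSV path and all
# column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical workforce snapshot: one row per employee.
df = pd.read_csv("employee_snapshot.csv")  # placeholder path
features = ["tenure_months", "engagement_score", "comp_ratio", "months_since_promotion"]
X, y = df[features], df["attrited"]  # attrited: 1 if the employee left

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# AUC gives a quick read on whether the model separates leavers from stayers.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```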
**Qualifications**
REQUIRED
+ **Education & Experience**
+ Bachelor's or Master's in HR, Business, Data Analytics, Industrial/Organizational Psychology, Statistics, or a related field.
+ 6+ years of experience in People Analytics, HR Analytics, Workforce Planning, or related fields.
SKILLS
+ **Technical Skills**
+ Strong expertise in data visualization tools (e.g., Tableau, Power BI, Workday People Analytics, Visier).
+ Advanced Excel, SQL, or Python/R for data analysis preferred.
+ Understanding of HR systems (Workday, SuccessFactors, etc.) and data structures.
+ **Business & Consulting Skills**
+ Exceptional ability to translate data into business insights and recommendations.
+ Strong stakeholder management, influencing, and storytelling skills.
+ Experience in partnering with senior leaders to drive data-informed decisions
**Additional Information**
Western Digital is committed to providing equal opportunities to all applicants and employees and will not discriminate against any applicant or employee based on their race, color, ancestry, religion (including religious dress and grooming standards), sex (including pregnancy, childbirth or related medical conditions, breastfeeding or related medical conditions), gender (including a person's gender identity, gender expression, and gender-related appearance and behavior, whether or not stereotypically associated with the person's assigned sex at birth), age, national origin, sexual orientation, medical condition, marital status (including domestic partnership status), physical disability, mental disability, medical condition, genetic information, protected medical and family care leave, Civil Air Patrol status, military and veteran status, or other legally protected characteristics. We also prohibit harassment of any individual on any of the characteristics listed above. Our non-discrimination policy applies to all aspects of employment. We comply with the laws and regulations set forth in the "Know Your Rights: Workplace Discrimination is Illegal (************************************************************************************** " poster. Our pay transparency policy is available here (****************************************************** .
Western Digital thrives on the power and potential of diversity. As a global company, we believe the most effective way to embrace the diversity of our customers and communities is to mirror it from within. We believe the fusion of various perspectives results in the best outcomes for our employees, our company, our customers, and the world around us. We are committed to an inclusive environment where every individual can thrive through a sense of belonging, respect and contribution.
Western Digital is committed to offering opportunities to applicants with disabilities and ensuring all candidates can successfully navigate our careers website and our hiring process. Please contact us at jobs.accommodations@wdc.com to advise us of your accommodation request. In your email, please include a description of the specific accommodation you are requesting as well as the job title and requisition number of the position for which you are applying.
Based on our experience, we anticipate that the application deadline will be **12/2/2025** (3 months from posting), although we reserve the right to close the application process sooner if we hire an applicant for this position before the application deadline. If we are not able to hire someone from this role before the application deadline, we will update this posting with a new anticipated application deadline.
#LI-VV1
**Compensation & Benefits Details**
+ An employee's pay position within the salary range may be based on several factors including but not limited to (1) relevant education; qualifications; certifications; and experience; (2) skills, ability, knowledge of the job; (3) performance, contribution and results; (4) geographic location; (5) shift; (6) internal and external equity; and (7) business and organizational needs.
+ The salary range is what we believe to be the range of possible compensation for this role at the time of this posting. We may ultimately pay more or less than the posted range and this range is only applicable for jobs to be performed in California, Colorado, New York or remote jobs that can be performed in California, Colorado and New York. This range may be modified in the future.
+ If your position is non-exempt, you are eligible for overtime pay pursuant to company policy and applicable laws. You may also be eligible for shift differential pay, depending on the shift to which you are assigned.
+ You will be eligible to be considered for bonuses under **either** Western Digital's Short Term Incentive Plan ("STI Plan") or the Sales Incentive Plan ("SIP") which provides incentive awards based on Company and individual performance, depending on your role and your performance. You may be eligible to participate in our annual Long-Term Incentive (LTI) program, which consists of restricted stock units (RSUs) or cash equivalents, pursuant to the terms of the LTI plan. Please note that not all roles are eligible to participate in the LTI program, and not all roles are eligible for equity under the LTI plan. RSU awards are also available to eligible new hires, subject to Western Digital's Standard Terms and Conditions for Restricted Stock Unit Awards.
+ We offer a comprehensive package of benefits including paid vacation time; paid sick leave; medical/dental/vision insurance; life, accident and disability insurance; tax-advantaged flexible spending and health savings accounts; employee assistance program; other voluntary benefit programs such as supplemental life and AD&D, legal plan, pet insurance, critical illness, accident and hospital indemnity; tuition reimbursement; transit; the Applause Program; employee stock purchase plan; and the Western Digital Savings 401(k) Plan.
+ **Note:** No amount of pay is considered to be wages or compensation until such amount is earned, vested, and determinable. The amount and availability of any bonus, commission, benefits, or any other form of compensation and benefits that are allocable to a particular employee remains in the Company's sole discretion unless and until paid and may be modified at the Company's sole discretion, consistent with the law.
**Notice To Candidates:** Please be aware that Western Digital and its subsidiaries will never request payment as a condition for applying for a position or receiving an offer of employment. Should you encounter any such requests, please report it immediately to Western Digital Ethics Helpline (******************************************************************** or email ****************** .
Data Scientist (Technical Leadership)
Data engineer job in Lansing, MI
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
11. 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization, and strategy development
12. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
13. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
14. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
15. Masters or Ph.D. Degree in a quantitative field
16. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
17. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$206,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
Principal Cloud Architect, AI Computational Data Scientist
Data engineer job in Lansing, MI
Oracle Cloud Infrastructure (OCI) is a pioneering force in cloud technology, merging the agility of startups with the robustness of an enterprise software leader. Within OCI, the Oracle AI Infra / Gen AI Cloud Engineering team spearheads innovative solutions at the convergence of artificial intelligence and cloud infrastructure. As part of this team, you'll contribute to large-scale cloud solutions utilizing cutting-edge machine learning technologies, aimed at addressing complex global challenges.
Join us to create innovative solutions using top-notch machine learning technologies to solve global challenges. We're looking for an experienced Principal Applied Data/Computational Scientist to join our Cloud Engineering team for strategic customers. In this role, you'll collaborate with applied scientists and product managers to design, develop, and deploy tailored Gen-AI solutions, with an emphasis on Large Language Models (LLMs), Agents, MCP, and Retrieval-Augmented Generation (RAG) with large OpenSearch clusters. You will be responsible for identifying, designing, and implementing AI solutions on the corresponding GPU IaaS or PaaS platforms.
**Qualifications and experience**
+ Doctoral or master's degree in computer science or equivalent technical field with 10+ years of experience
+ Able to communicate technical ideas effectively, both verbally and in writing (technical proposals, design specs, architecture diagrams, and presentations).
+ Demonstrated experience in designing and implementing scalable AI models and solutions for production, relevant professional experience as end-to-end solutions engineer or architect (data engineering, data science and ML engineering is a plus), with evidence of close collaborations with PM and Dev teams.
+ Experience with OpenSearch, Vector databases, PostgreSQL and Kafka Streaming.
+ Practical experience setting up and tuning large OpenSearch clusters.
+ Experience in setting up data ingestion pipelines with OpenSearch.
+ Experience with search algorithms, indexing, optimizing latency and response times.
+ Practical experience with the latest technologies in LLM and generative AI, such as parameter-efficient fine-tuning, instruction fine-tuning, and advanced prompt engineering techniques like Tree-of-Thoughts.
+ Familiarity with agents, agent frameworks, and the Model Context Protocol (MCP).
+ Hands-on experience with emerging LLM frameworks and plugins, such as LangChain, LlamaIndex, VectorStores and Retrievers, LLM Cache, LLMOps (MLFlow), LMQL, Guidance, etc.
+ Strong publication record, including as a lead author or reviewer, in top-tier journals or conferences.
+ Ability and passion to mentor and develop junior machine learning engineers.
+ Proficient in Python and shell scripting tools.
**Preferred Qualifications** :
+ PhD/Master's in a related field with 5+ years of relevant experience
+ Experience with RAG-based solution architectures; familiarity with OpenSearch and vector stores as knowledge stores.
+ Knowledge of LLMs and experience delivering generative AI and agent models is a significant plus.
+ Familiarity and experience with the latest advancements in computer vision and multimodal modeling is a plus.
+ Experience with semantic search, multi-modal search and conversational search.
+ Experience working in a public cloud environment, and in-depth knowledge of the IaaS/PaaS industry and competitive capabilities. Experience with popular model training and serving frameworks such as KServe, Kubeflow, Triton, etc.
+ Experience with LLM fine-tuning, especially the latest parameter efficient fine-tuning technologies and multi-task serving technologies.
+ Deep technical understanding of Machine Learning, Deep Learning architectures like Transformers, training methods, and optimizers.
+ Experience with deep learning frameworks (such as PyTorch, JAX, or TensorFlow) and deep learning architectures (especially Transformers).
+ Experience in diagnosing, fixing, and resolving issues in AI model training and serving.
**Responsibilities**
As part of the **OCI Gen AI Cloud Engineering team** for strategic customers, you will be responsible for developing innovative Gen AI and data services for those customers. As a Principal Applied Data/Computational Scientist, you'll lead the development of advanced Gen AI solutions using the latest ML technologies combined with Oracle's cloud expertise. Your work will significantly impact sectors like financial services, telecom, healthcare, and code generation by creating distributed, scalable, high-performance solutions for strategic customers.
+ Work directly with key customers and accompany them on their Gen AI journey: understanding their requirements, helping them envision, design, and build the right solutions, and working with their ML engineering teams to remove blockers.
+ You will dive deep into model structure to optimize model performance and scalability.
+ You will build state-of-the-art solutions with brand-new technologies in this fast-evolving area.
+ You will configure large-scale OpenSearch clusters and set up ingestion pipelines to load data into OpenSearch (a brief illustrative sketch follows this list).
+ You will diagnose, troubleshoot, and resolve issues in AI model training and serving. You may also perform other duties as assigned.
+ Build re-usable solution patterns and reference solutions / showcases that can apply across multiple customers.
+ Be enthusiastic, self-motivated, and a great collaborator.
+ Be our product evangelist - engage directly with customers and partners, participate and present in external events and conferences, etc.
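A rough sketch of the OpenSearch ingestion step referenced in the list above, assuming the opensearch-py client and the OpenSearch k-NN plugin; the endpoint, credentials, index name, embedding dimension, and documents are all placeholders rather than an actual OCI setup.

```python
# Rough sketch of an OpenSearch ingestion step for a RAG knowledge store.
# Assumes the opensearch-py client and the OpenSearch k-NN plugin; host,
# credentials, index name, embedding dimension, and documents are placeholders.
from opensearchpy import OpenSearch, helpers

client = OpenSearch(
    hosts=[{"host": "localhost", "port": 9200}],  # placeholder endpoint
    http_auth=("admin", "admin"),                 # placeholder credentials
    use_ssl=True,
    verify_certs=False,
)

index = "rag-knowledge-base"
client.indices.create(
    index=index,
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "text": {"type": "text"},
                "embedding": {"type": "knn_vector", "dimension": 384},
            }
        },
    },
)

# In practice the embeddings would come from an embedding model; here they
# are placeholder vectors of the right shape.
docs = [
    {"text": "Example passage one.", "embedding": [0.0] * 384},
    {"text": "Example passage two.", "embedding": [0.1] * 384},
]
actions = [{"_index": index, "_id": i, "_source": d} for i, d in enumerate(docs)]
helpers.bulk(client, actions)  # bulk-index the chunks so they can be retrieved
```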
Disclaimer:
**Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.**
**Range and benefit information provided in this posting are specific to the stated locations only**
US: Hiring range in USD from $113,100 to $185,100 per annum. May be eligible for equity. Eligible for commission with an estimated pay mix of 70/30.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC4
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_************* or by calling *************** in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Data Scientist, NLP
Data engineer job in Lansing, MI
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
We are looking for a motivated Data Scientist to help Datavant revolutionize the healthcare industry with AI. This is a critical role where the right candidate will have the ability to work on a wide range of problems in the healthcare industry with an unparalleled amount of data.
You'll join a team focused on deep medical document understanding, extracting meaning, intent, and structure from unstructured medical and administrative records. Our mission is to build intelligent systems that can reliably interpret complex, messy, and high-stakes healthcare documentation at scale.
This role is a unique blend of applied machine learning, NLP, and product thinking. You'll collaborate closely with cross-functional teams to:
+ Design and develop models to extract entities, detect intents, and understand document structure
+ Tackle challenges like long-context reasoning, layout-aware NLP, and ambiguous inputs
+ Evaluate model performance where ground truth is partial, uncertain, or evolving
+ Shape the roadmap and success metrics for replacing legacy document processing systems with smarter, scalable solutions
We operate in a high-trust, high-ownership environment where experimentation and shipping value quickly are key. If you're excited by building systems that make healthcare data more usable, accurate, and safe, please reach out.
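Purely as an illustration of the entity-extraction work described above, the sketch below runs a Hugging Face token-classification pipeline over a short note; the model name and sample text are placeholders, not Datavant's production models or data.

```python
# Illustrative only: a minimal entity-extraction pass over an unstructured
# clinical-style note using a Hugging Face token-classification pipeline.
# The model name and sample text are placeholders.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",    # placeholder general-purpose NER model
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

note = "Patient John Smith was seen at Mercy General Hospital on 2024-03-02."
for entity in ner(note):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```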
**Qualifications**
+ 3+ years of experience with data science and machine learning in an industry setting, particularly in designing and building NLP models.
+ Proficiency with Python
+ Experience with the latest in language models (transformers, LLMs, etc.)
+ Proficiency with standard data analysis toolkits such as SQL, Numpy, Pandas, etc.
+ Proficiency with deep learning frameworks like PyTorch (preferred) or TensorFlow
+ Industry experience shepherding ML/AI projects from ideation to delivery
+ Demonstrated ability to influence company KPIs with AI
+ Demonstrated ability to navigate ambiguity
**Bonus Experience**
+ Experience with document layout analysis (using vision, NLP, or both).
+ Experience with Spark/PySpark
+ Experience with Databricks
+ Experience in the healthcare industry
**Responsibilities**
+ Play a key role in the success of our products by developing models for document understanding tasks.
+ Perform error analysis, data cleaning, and other related tasks to improve models.
+ Collaborate with your team by making recommendations for the development roadmap of a capability.
+ Work with other data scientists and engineers to optimize machine learning models and insert them into end-to-end pipelines.
+ Understand product use-cases and define key performance metrics for models according to business requirements.
+ Set up systems for long-term improvement of models and data quality (e.g. active learning, continuous learning systems, etc.).
**After 3 Months, You Will...**
+ Have a strong grasp of technologies upon which our platform is built.
+ Be fully integrated into ongoing model development efforts with your team.
**After 1 Year, You Will...**
+ Be independent in reading literature and doing research to develop models for new and existing products.
+ Have ownership over models internally, communicating with product managers, customer success managers, and engineers to make the model and the encompassing product succeed.
+ Be a subject matter expert on Datavant's models and a source from which other teams can seek information and recommendations.
#LI-BC1
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$136,000-$170,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
Databricks Data Engineer - Senior - Consulting - Location Open
Data engineer job in Lansing, MI
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Senior**
We are seeking a highly skilled Senior Consultant Data Engineer with expertise in cloud data engineering, specifically Databricks. The ideal candidate will have strong client management and communication skills, along with a proven track record of successful end-to-end implementations in data engineering projects.
**The opportunity**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that technical requirements align with business needs. Your responsibilities will include creating scalable data architecture and modeling solutions that support the entire data asset lifecycle.
**Your key responsibilities**
As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. Your time will be spent on various responsibilities, including:
+ Designing, building, and operating scalable on-premises or cloud data architecture.
+ Analyzing business requirements and translating them into technical specifications.
+ Optimizing data flows for target data platform designs.
+ Design, develop, and implement data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Collaborate with clients to understand their data needs and provide tailored solutions that meet their business objectives.
+ Lead end-to-end data pipeline development, including data ingestion, transformation, and storage.
+ Ensure data quality, integrity, and security throughout the data lifecycle.
+ Provide technical guidance and mentorship to junior data engineers and team members.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay updated with the latest trends and technologies in data engineering and cloud computing.
This role offers the opportunity to work with cutting-edge technologies and stay ahead of industry trends, ensuring you gain a competitive advantage in the market. The position may require regular travel to meet with external clients.
**Skills and attributes for success**
To thrive in this role, you will need a blend of technical and interpersonal skills. Your ability to communicate effectively and build relationships will be crucial. Here are some key attributes we look for:
+ Strong analytical and decision-making skills.
+ Proficiency in cloud computing and data architecture design.
+ Experience in data integration and security.
+ Ability to manage complex problem-solving scenarios.
**To qualify for the role, you must have**
+ A Bachelor's degree (4-year) in Computer Science, Engineering, or a related field is required; a Master's degree is preferred.
+ Typically, no less than 2-4 years of relevant experience in data engineering, with a focus on cloud data solutions.
+ 5+ years of experience in data engineering, with a focus on cloud data solutions.
+ Expertise in Databricks and experience with Spark for big data processing.
+ Proven experience in at least two end-to-end data engineering implementations, including:
+ Implementation of a data lake solution using Databricks, integrating various data sources, and enabling analytics for business intelligence.
+ Development of a real-time data processing pipeline using Databricks and cloud services, delivering insights for operational decision-making.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Experience with data modeling, ETL processes, and data warehousing concepts.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
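As a rough illustration of the real-time Databricks pipeline experience described in the qualifications above, the sketch below streams JSON events into a Delta table; it assumes a Databricks environment where `spark` is already defined (Spark 3.3+ for `availableNow`), and all paths and the schema are placeholders.

```python
# Minimal sketch of a real-time Databricks pipeline: stream JSON events
# from cloud storage into a Delta table. Assumes a Databricks notebook or
# cluster where `spark` is predefined; paths and schema are placeholders.
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

schema = (
    StructType()
    .add("event_id", StringType())
    .add("amount", DoubleType())
    .add("event_ts", TimestampType())
)

events = (
    spark.readStream.format("json")
    .schema(schema)
    .load("/mnt/raw/events/")                     # placeholder landing zone
    .withColumn("ingest_date", F.to_date("event_ts"))
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events/")  # placeholder
    .outputMode("append")
    .trigger(availableNow=True)                   # process available data, then stop
    .start("/mnt/curated/events/")                # placeholder Delta location
)
```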
**Required Expertise for Senior Consulting Projects:**
+ **Strategic Thinking:** Ability to align data engineering solutions with business strategies and objectives.
+ **Project Management:** Experience in managing multiple projects simultaneously, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Ideally, you'll also have**
+ Experience with data quality management.
+ Familiarity with semantic layers in data architecture.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for**
We seek individuals who are not only technically proficient but also possess the qualities of top performers. You should be adaptable, collaborative, and driven by a desire to achieve excellence in every project you undertake.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $106,900 to $176,500. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $128,400 to $200,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an on-going basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
Data Scientist, Marketing
Data engineer job in Lansing, MI
**Employment Type:** Full-Time, Remote
**Department:** Finance & Operations, Data
**Compensation:** $160.1K - $188.1K; Offers Equity
_At Confluent, we are committed to providing competitive pay and benefits that are in line with industry standards. We analyze and carefully consider several factors when determining compensation, including work history, education, professional experience, and location. The actual pay may vary depending on your skills, qualifications, experience, and work location. In addition, Confluent offers a wide range of employee benefits. To learn more about our benefits click_ here (****************************** _._
**Overview**
We're not just building better tech. We're rewriting how data moves and what the world can do with it. With Confluent, data doesn't sit still. Our platform puts information in motion, streaming in near real-time so companies can react faster, build smarter, and deliver experiences as dynamic as the world around them.
It takes a certain kind of person to join this team. Those who ask hard questions, give honest feedback, and show up for each other. No egos, no solo acts. Just smart, curious humans pushing toward something bigger, together.
One Confluent. One Team. One Data Streaming Platform.
**About the Role:**
As a Marketing Data Scientist at Confluent, you will be a key player in analyzing lead funnel efficiency, marketing measurement, and pipeline management. In this role, you will collaborate with Marketing, Growth, Sales & Finance stakeholders. Your focus will include data exploration, conducting in-depth analyses, and generating actionable insights to optimize and sustain a healthy marketing funnel.
**What You Will Do:**
+ Identify high-impact opportunities through data exploration and generate strategic insights by collaborating with cross-functional teams, contributing to improvements in top-of-funnel conversion metrics.
+ Define marketing metrics to measure success, develop scalable reporting to monitor the health of our pipeline, and analyze drivers for company KPIs
+ Share actionable insights and recommendations with leadership at various levels, playing a pivotal role in determining marketing campaign and pipeline strategy
+ Design and execute A/B tests to measure the impact of product features and launches, providing insights that guide iterative improvements.
+ Provide a deep understanding of customer journey phases, from awareness to conversion, with expertise in analyzing metrics such as ROI and CLTV to identify opportunities for growth and optimization
+ Collaborate with other data scientists across our organization to share best practices, learn new analytical techniques, and champion an organizational culture where data is central to decision making
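As a minimal illustration of the A/B-test readouts mentioned in the list above, the sketch below runs a two-proportion z-test on made-up conversion counts; it assumes the statsmodels library, and none of the numbers reflect real campaign data.

```python
# Illustrative only: a two-proportion z-test of the kind used to read out a
# simple A/B test on a funnel conversion metric. The counts are made up.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: (conversions, visitors) for control and variant.
conversions = [420, 480]
visitors = [10_000, 10_000]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"control={conversions[0] / visitors[0]:.2%}, variant={conversions[1] / visitors[1]:.2%}")
print(f"z={z_stat:.2f}, p={p_value:.4f}")  # a small p-value suggests a real lift
```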
**What You Will Bring:**
+ 3+ years of experience in Data Science supporting a marketing function, preferably at a B2B SaaS company
+ Strong proficiency in data manipulation, especially using SQL (3+ years of experience) and a scripting language (Python/R)
+ Proficiency in data visualization and building dashboards with tools such as Tableau
+ Excellent communication skills, and ability to collaborate effectively with technical and non-technical stakeholders
+ Experience with marketing attribution models across a variety of channels (SEO, Paid Search, Social, etc.) and A/B testing frameworks
+ Exceptional problem-solving skills and a detail-oriented mindset
+ Bachelor's or advanced degree in a quantitative discipline: computer science, engineering, statistics, economics, etc.
**What Gives You an Edge:**
+ Familiarity with lead funnel, marketing attribution models, and marketing measurement using causal inference and other advanced analytical methods.
+ Understanding of the B2B marketing stack and Salesforce objects
**Ready to build what's next? Let's get in motion.**
**Come As You Are**
Belonging isn't a perk here. It's the baseline. We work across time zones and backgrounds, knowing the best ideas come from different perspectives. And we make space for everyone to lead, grow, and challenge what's possible.
We're proud to be an equal opportunity workplace. Employment decisions are based on job-related criteria, without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by law.
Senior Data Engineer - Real-Time Data & Visualization Focus
Data engineer job in Lansing, MI
**Candidates must be U.S. citizens (due to contractual/access requirements) and available within the U.S.** In this role, you will be a key contributor to our data infrastructure, responsible for the design, development, and maintenance of efficient, reliable, and high-quality data flow and storage. Your expertise will drive real-time data ingestion and transformation processes, structuring data from diverse sources using cloud platforms (Azure, GCP, AWS), databases (BigQuery, Azure Data Explorer), and streaming technologies like Cribl, Kafka, and Azure Event Hubs. A primary responsibility will be creating impactful visualizations and dashboards for internal and enterprise reporting, leveraging tools like Power BI and Tableau to translate business needs into actionable insights. This role requires strong proficiency in scripting/query languages (KQL, SQL, Python, JavaScript) and strong skills in building and optimizing data pipelines, ensuring data quality and integrity throughout the data lifecycle.
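A minimal sketch of the real-time ingestion and transformation flow described above, assuming the kafka-python client; the topic, broker, and field names are placeholders rather than an actual production pipeline.

```python
# Rough sketch: consume JSON events from a Kafka topic and reshape them for
# downstream storage or a dashboard. Topic, broker, and field names are
# placeholders; assumes the kafka-python client.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "telemetry-events",                      # placeholder topic
    bootstrap_servers="localhost:9092",      # placeholder broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Minimal transformation: keep only the fields the reporting layer needs.
    row = {
        "device_id": event.get("deviceId"),
        "metric": event.get("metricName"),
        "value": float(event.get("value", 0.0)),
        "event_time": event.get("timestamp"),
    }
    print(row)  # in practice this would be written to a warehouse table
```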
**ESSENTIAL RESPONSIBILITIES**
+ Design, develop, and maintain robust data processes and solutions to ensure the efficient movement and transformation of data across multiple systems
+ Develop and maintain data models, databases, and data warehouses to support business intelligence and analytics needs
+ Collaborate with stakeholders across IT, product, analytics, and business teams to gather requirements and provide data solutions that meet organizational needs
+ Monitor work against the production schedule, provide progress updates, and report any issues or technical difficulties to lead developers regularly
+ Implement and manage data governance practices, ensuring data quality, integrity, and compliance with relevant regulations.
+ Collaborate on the design and implementation of data security measures, including access controls, encryption, and data masking
+ Mentor other associate and intermediate data engineers as needed
+ Perform data analysis and provide insights to support decision-making across various departments
+ Stay current with industry trends and emerging technologies in data engineering, recommending new tools and best practices as needed
+ Other duties as assigned or requested.
**EXPERIENCE**
**Required**
+ 5 years of experience in design and analysis of algorithms, data structures, and design patterns in the building and deploying of scalable, highly available systems
+ 5 years of experience in a data engineering, ETL development, or data management role.
+ 5 years of experience in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, MongoDB).
+ 5 years of experience in data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery)
**Preferred**
+ 7 years of experience with data pipeline and workflow management tools (e.g., Apache Airflow, AWS Glue, Azure Data Factory).
+ 7 years of experience defining system architectures and exploring technical feasibility trade-offs for optimizing short term execution while planning for long term technical capabilities
+ 7 years of experience working with a variety of technology systems, designing solutions or developing data solutions in healthcare
+ 7 years of experience with cloud platforms (AWS, Azure, GCP) and their respective data services
+ 7 years of experience in data governance, data quality, and data security best practices
+ 7 years of experience translating requirements, design mockups, prototypes or user stories into technical designs
+ 7 years of experience in producing data-related code that is fault-tolerant, efficient, and maintainable
+ 7 years of experience structuring and transforming data from multiple source systems.
+ 7 years of experience creating data visualizations and dashboards using Power BI or Tableau for internal and external audiences.
+ 7 years of experience using Cribl as a streaming and transformation solution.
**SKILLS**
+ Demonstrated ability to achieve stretch goals in a highly innovative and fast-paced environment
+ Adaptability: Strong ability to take on diverse tasks and projects, adapting to the evolving needs of the organization
+ Analytical Thinking: Strong analytical skills with a focus on detail and accuracy
+ Interest and ability to learn other data development technologies/languages as needed
+ Technical Proficiency: Comfortable with a range of data tools and technologies, with a willingness to learn new skills as needed
+ Strong track record in designing and implementing large-scale data sources
+ Strong sense of ownership, urgency, and drive
+ Demonstrated passion for user experience and improving usability
+ Team Collaboration: A team player who can work effectively in cross-functional environments
+ Experience and willingness to mentor junior data engineers and help develop their skills and leadership
**EDUCATION**
**Required**
+ Bachelor's degree in Computer Science, Information Systems, Data Science, Computer Engineering or related field
**Preferred**
+ Master's degree in Computer Science, Information Systems, Data Science, Computer Engineering or related field
**LICENSES or CERTIFICATIONS**
**Required**
+ None
**Preferred**
+ None
**Language (Other than English):**
None
**Travel Requirement:**
0% - 25%
**PHYSICAL, MENTAL DEMANDS and WORKING CONDITIONS**
**Position Type:** Office- or Remote-based
Teaches / trains others: Occasionally
Travel from the office to various work sites or from site-to-site: Rarely
Works primarily out of the office selling products/services (sales employees): Never
Physical work site required: No
Lifting (up to 10 pounds): Constantly
Lifting (10 to 25 pounds): Occasionally
Lifting (25 to 50 pounds): Rarely
**_Disclaimer:_** _The job description has been designed to indicate the general nature and essential duties and responsibilities of work performed by employees within this job title. It may not contain a comprehensive inventory of all duties, responsibilities, and qualifications required of employees to do this job._
**_Compliance Requirement_** _: This job adheres to the ethical and legal standards and behavioral expectations as set forth in the code of business conduct and company policies._
_As a component of job responsibilities, employees may have access to covered information, cardholder data, or other confidential customer information that must be protected at all times. In connection with this, all employees must comply with both the Health Insurance Portability Accountability Act of 1996 (HIPAA) as described in the Notice of Privacy Practices and Privacy Policies and Procedures as well as all data security guidelines established within the Company's Handbook of Privacy Policies and Practices and Information Security Policy._
_Furthermore, it is every employee's responsibility to comply with the company's Code of Business Conduct. This includes but is not limited to adherence to applicable federal and state laws, rules, and regulations as well as company policies and training requirements._
**Pay Range Minimum:**
$78,900.00
**Pay Range Maximum:**
$147,500.00
_Base pay is determined by a variety of factors including a candidate's qualifications, experience, and expected contributions, as well as internal peer equity, market, and business considerations. The displayed salary range does not reflect any geographic differential Highmark may apply for certain locations based upon comparative markets._
Highmark Health and its affiliates prohibit discrimination against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibit discrimination against all individuals based on any category protected by applicable federal, state, or local law.
We endeavor to make this site accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact the email below.
For accommodation requests, please contact HR Services Online at *****************************
California Consumer Privacy Act Employees, Contractors, and Applicants Notice
Req ID: J273322
Senior Data Engineer
Data engineer job in Lansing, MI
At CVS Health, we're building a world of health around every consumer and surrounding ourselves with dedicated colleagues who are passionate about transforming health care. As the nation's leading health solutions company, we reach millions of Americans through our local presence, digital channels and more than 300,000 purpose-driven colleagues - caring for people where, when and how they choose in a way that is uniquely more connected, more convenient and more compassionate. And we do it all with heart, each and every day.
**Position Summary**
The Senior Data Engineer will be responsible for delivering high-quality, modern data solutions through collaboration with our engineering, analyst, data science, and product teams in a fast-paced, agile environment, leveraging cutting-edge technology to reimagine how healthcare is provided. You will be instrumental in designing, integrating, and implementing solutions on-premises as well as supporting migrations of existing workloads to the cloud. The Senior Data Engineer is expected to have extensive knowledge of modern programming languages and of designing and developing data solutions.
The position is open in a data engineering team that is responsible for processing Claims, Revenue, Rx Medicare Specific files and data from 30+ payors into our Data Warehouse.
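To give a concrete sense of the pipeline work involved, here is a minimal, purely illustrative PySpark sketch that stages a hypothetical payor claims extract before a warehouse load. The file name, column names, and target table are assumptions for illustration and are not part of this posting.

```python
# Illustrative PySpark staging sketch (not an actual CVS Health pipeline).
# "claims_2024.csv", its columns, and "stg_claims" are hypothetical names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_staging_sketch").getOrCreate()

claims = (
    spark.read.option("header", True).csv("claims_2024.csv")
    .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
    .withColumn("paid_amount", F.col("paid_amount").cast("decimal(12,2)"))
    .dropDuplicates(["claim_id"])  # basic de-duplication on the claim key
)

# Simple quality gate before loading: keep only rows with required identifiers.
valid = claims.filter(F.col("claim_id").isNotNull() & F.col("payor_id").isNotNull())

# Stage for the downstream warehouse load.
valid.write.mode("overwrite").saveAsTable("stg_claims")
```

A real claims feed would add schema enforcement, payor-specific parsing, and auditing, but the shape of the work (ingest, cleanse, validate, stage) is the same.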
**Required Qualifications**
+ 6+ years' experience in data engineering, working with SQL and relational database management systems
+ 4+ years' experience in Cloud technologies (Azure, GCP or AWS)
+ 3+ years' experience with Microsoft SQL server
+ 3+ years' experience with Databricks
+ 3+ years' experience programming and modifying code in languages like SQL, Python, and PySpark to support and implement Cloud based and on-prem data warehousing services
+ 3+ years hands-on experience with dimensional data modeling, schema design, and data warehousing
+ 3+ years hands-on experience with troubleshooting data issues
+ 2+ years' hands-on experience with various performance improvement techniques
**Preferred Qualifications**
+ Familiarity with Azure
+ Willingness to identify and implement process improvements and best practices, as well as the ability to take ownership and work within a fast-paced, collaborative, team-based support environment
+ Excellent oral and written communication skills
+ Familiarity with healthcare data and healthcare insurance feeds
**Education**
+ Bachelor's degree in Computer Science / Engineering
**Anticipated Weekly Hours**
40
**Time Type**
Full time
**Pay Range**
The typical pay range for this role is:
$83,430.00 - $222,480.00
This pay range represents the base hourly rate or base annual full-time salary for all positions in the job grade within which this position falls. The actual base salary offer will depend on a variety of factors including experience, education, geography and other relevant factors. This position is eligible for a CVS Health bonus, commission or short-term incentive program in addition to the base pay range listed above.
Our people fuel our future. Our teams reflect the customers, patients, members and communities we serve and we are committed to fostering a workplace where every colleague feels valued and that they belong.
**Great benefits for great people**
We take pride in our comprehensive and competitive mix of pay and benefits - investing in the physical, emotional and financial wellness of our colleagues and their families to help them be the healthiest they can be. In addition to our competitive wages, our great benefits include:
+ **Affordable medical plan options,** a **401(k) plan** (including matching company contributions), and an **employee stock purchase plan** .
+ **No-cost programs for all colleagues** including wellness screenings, tobacco cessation and weight management programs, confidential counseling and financial coaching.
+ **Benefit solutions that address the different needs and preferences of our colleagues** including paid time off, flexible work schedules, family leave, dependent care resources, colleague assistance programs, tuition assistance, retiree medical access and many other benefits depending on eligibility.
For more information, visit *****************************************
We anticipate the application window for this opening will close on: 12/18/2025
Qualified applicants with arrest or conviction records will be considered for employment in accordance with all federal, state and local laws.
We are an equal opportunity and affirmative action employer. We do not discriminate in recruiting, hiring, promotion, or any other personnel action based on race, ethnicity, color, national origin, sex/gender, sexual orientation, gender identity or expression, religion, age, disability, protected veteran status, or any other characteristic protected by applicable federal, state, or local law.
Filenet or Datacap consultant - EAD or GC or CITIZENS
Data engineer job in Lansing, MI
USM Business Systems Inc. is a fast-growing global systems integrator and software and product development, IT outsourcing, and technology services provider headquartered in Chantilly, VA, with offshore delivery centers in India. We offer world-class expertise, delivering the highest quality services through industry best practices designed to provide exceptional value to our customers.
Utilizing our industry knowledge, service-offering expertise, and innovation capabilities, we identify new business and technology trends and develop solutions to help customers around the globe, delivering high-end, reliable, and cost-effective IT services.
Established in 1999, the organization has core strengths in building and managing business-oriented IT environments, with deep experience in technology innovation, ERP and CRM consulting, Product Engineering, Business Intelligence, Data Management, SOA, BPM, Data Warehousing, SharePoint Consulting, and IT Infrastructure. Our other offerings include customized solutions and services in ERP, CRM, enterprise architecture, offshore advisory services, e-commerce, Social, Mobile, Cloud, Analytics (SMAC), and DevOps.
USM, a US-certified Minority Business Enterprise (MBE), is recognized as one of the fastest-growing IT systems integrators in the Washington, DC area. Most recently, USM was ranked #9 on the list of top services companies in the DC Metro Area by the Washington Business Journal (2011). We are a project-driven firm that reliably meets the IT needs of our state and government customers through innovation and business acumen.
Job Description
Filenet or Datacap consultant - EAD or GC or CITIZENS
Duration: 12 months plus
Location: Lansing, MI
TOP SKILLS:
FileNet/DataCap
Application Development
Communication
Navya
*********************
************
Additional Information
If you are interested in this position, please forward your profile to navyar@usmsystems(dot)com or call me at ************
Data Architect
Data engineer job in Lansing, MI
**Summary**
We're Concentrix. The intelligent transformation partner. Solution-focused. Tech-powered. Intelligence-fueled.
The global technology and services leader that powers the world's best brands, today and into the future. We're solution-focused, tech-powered, intelligence-fueled. With unique data and insights, deep industry expertise, and advanced technology solutions, we're the intelligent transformation partner that powers a world that works, helping companies become refreshingly simple to work, interact, and transact with. We shape new game-changing careers in over 70 countries, attracting the best talent.
The Concentrix Technical Products and Services team is the driving force behind Concentrix's transformation, data, and technology services. We integrate world-class digital engineering, creativity, and a deep understanding of human behavior to find and unlock value through tech-powered and intelligence-fueled experiences. We combine human-centered design, powerful data, and strong tech to accelerate transformation at scale. You will be surrounded by the best in the world, providing market-leading technology and insights to modernize and simplify the customer experience. Within our professional services team, you will deliver strategic consulting, design, advisory services, market research, and contact center analytics that deliver insights to improve outcomes and value for our clients, helping us achieve our vision.
Our game-changers around the world have devoted their careers to ensuring every relationship is exceptional. And we're proud to be recognized with awards such as "World's Best Workplaces," "Best Companies for Career Growth," and "Best Company Culture," year after year.
Join us and be part of this journey towards greater opportunities and brighter futures.
**Description**
**Data Architect**
**Job Location:** USA, East Coast or Remote; willing to work Eastern Time (EST) hours
**Job Summary:**
We are seeking an experienced Data Architect to design and build the unified data model supporting our Operational Data Store (ODS) initiative. This is a working architect role requiring both strategic data modeling expertise and hands-on technical execution. The ideal candidate will bridge domain-level data ownership (Data Mesh principles) with enterprise-wide integration (Data Fabric patterns), creating a unified data layer that respects source system context while enabling cross-domain analytics and operations. This role requires someone who can define the architectural vision while personally building data models, writing transformation logic, and validating implementations against source systems. This is a remote, work at home opportunity in the US.
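As a minimal sketch of the hands-on profiling and validation work this role calls for, the snippet below runs a few generic data quality checks. The table, columns, and sample rows are stand-ins (an in-memory SQLite table substitutes for a real source system), so treat it as an illustration of the technique rather than project code.

```python
# Minimal data-profiling sketch with placeholder names ("customer_src", etc.).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer_src (customer_id INTEGER, email TEXT, created_at TEXT);
    INSERT INTO customer_src VALUES
        (1, 'a@example.com', '2024-01-03'),
        (2, NULL,            '2024-02-10'),
        (2, 'b@example.com', '2024-02-11');   -- duplicate key on purpose
""")

profiling_checks = {
    "row_count":           "SELECT COUNT(*) FROM customer_src",
    "null_email_count":    "SELECT COUNT(*) FROM customer_src WHERE email IS NULL",
    "duplicate_key_count": """
        SELECT COUNT(*) FROM (
            SELECT customer_id FROM customer_src
            GROUP BY customer_id HAVING COUNT(*) > 1
        )
    """,
}

# Each check returns a single number; thresholds would turn these into pass/fail rules.
for name, sql in profiling_checks.items():
    print(name, conn.execute(sql).fetchone()[0])
```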
**Key Responsibilities:**
+ Design and build the unified data model for the Operational Data Store, balancing domain-specific context with enterprise integration requirements.
+ Define the architectural approach for federating domain data products into a cohesive enterprise data layer without creating monolithic dependencies.
+ Develop data transformation specifications, mapping rules, and working examples that preserve domain semantics while enabling cross-domain consistency.
+ Establish canonical data models that integrate across domain boundaries while respecting source system ownership and business context.
+ Perform source system analysis, data profiling, and gap assessments to understand domain data products and their integration requirements.
+ Write and validate SQL queries, transformation logic, and data quality rules to prove out architectural decisions before handoff.
+ Define system of record ownership across domains and maintain accurate data lineage documentation for federated data sources.
+ Design integration patterns that allow domains to evolve independently while maintaining enterprise data consistency.
+ Collaborate directly with Data Engineers during implementation, troubleshooting data quality issues and refining transformation logic.
+ Partner with domain stakeholders and enterprise architects to align domain data products with cross-domain analytics and operational needs.
+ Establish federated data governance standards that balance domain autonomy with enterprise consistency requirements.
+ Conduct architecture reviews focusing on data integrity, performance optimization, and scalability of the ODS.
+ Ensure compliance with data privacy regulations, security standards, and audit requirements in financial services.
+ Stay current with industry trends in Data Mesh, Data Fabric, and regulatory changes in the FinTech sector.
**Required Qualifications:**
+ Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
+ 10+ years of experience in data management with at least 5+ years in data architecture roles.
+ Strong understanding of Data Mesh principles (domain ownership, data as product, federated governance) and Data Fabric concepts (unified access, integration layer, cross-domain visibility).
+ Proven expertise in logical and physical data modeling using tools such as ERwin, PowerDesigner, or similar.
+ Experience designing canonical data models that integrate multiple domain data sources while preserving business context.
+ Strong hands-on SQL skills with ability to write complex queries for data profiling, validation, and transformation.
+ Experience building Operational Data Stores, data warehouses, or enterprise data hubs with multiple source system integrations.
+ Demonstrated ability to perform data profiling, source system analysis, and data quality assessments independently.
+ Hands-on experience developing data transformation specifications and mapping documentation that engineers can implement.
+ Working knowledge of ETL/ELT patterns and data pipeline architecture.
+ Familiarity with event streaming platforms (Kafka, Kinesis) for real time data ingestion scenarios.
+ Solid understanding of data governance, metadata management, and data lineage practices.
**Preferred Qualifications:**
+ Demonstrates judgment and flexibility - positively deals with shifting priorities and rapid change of environments.
+ Experience in financial services, insurance, or large scale enterprise data platforms.
+ Experience implementing Data Mesh architectures or federated data governance models.
+ Proficiency with Python or similar scripting language for data analysis and validation.
+ Familiarity with cloud data services (AWS Glue, Azure Data Factory, Snowflake, Databricks).
+ Knowledge of data catalog and lineage tools (Collibra, Alation, Apache Atlas).
+ Exposure to domain-driven design principles applied to data architecture.
+ Understanding of regulatory requirements (SOX, GDPR, CCPA) as they apply to data management.
+ Experience mentoring Data Engineers and establishing team technical standards.
At Concentrix, we provide customer experience solutions that may involve handling sensitive data. As part of our hiring process, all candidates must undergo a background check in accordance with applicable law, which will include identity verification and employment eligibility.
The base salary for this position is $106,087-$197,020, plus incentives that align with individual and company performance. Actual salaries will vary based on work location, qualifications, skills, education, experience, and competencies. Benefits available to eligible employees in this role include medical, dental, and vision insurance, a comprehensive employee assistance program, a 401(k) retirement plan, paid time off and holidays, and paid learning days.
The deadline to apply for this position is: 12/15/2025
\#LI-Remote
\#Remote
Sr Data Engineer, PySpark
Data engineer job in Lansing, MI
**A Day in the Life:** The **Senior Data Engineer, PySpark** will be responsible for building and maintaining data pipelines and workflows that support ML, BI, analytics, and software products. This individual will work closely with data scientists, data engineers, analysts, software developers and SME's within the business to deliver new and exciting products and services. The main objectives are to develop data pipelines and fully automated workflows to drive operational efficiency and effectiveness by enabling data-driven decisions across the organization. This includes fostering collaboration, building partnerships, co-developing products, sharing knowledge, providing insights and valuable predictive information to business teams and leaders to highlight potential risks and opportunities that initiate the drive for change.
We expect the starting salary to be around $135k, but it will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Development of high-quality code for the core data stack including data integration hub, data warehouse and data pipelines.
+ Build data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms (a minimal sketch follows this list)
+ Empower data scientists and data analysts to be as self-sufficient as possible by building core systems and developing reusable library code
+ Support and optimize data tools and associated cloud environments for consumption by downstream systems, data analysts and data scientists
+ Ensure code, configuration and other technology artifacts are delivered within agreed time schedules and any potential delays are escalated in advance
+ Collaborate across developers as part of a SCRUM or Kanban team ensuring collective team productivity
+ Participate in peer reviews and QA processes to drive higher quality
+ Ensure that 100% of code is well documented and maintained in the source code repository.
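The sketch below illustrates the batch-plus-streaming pattern referenced in the list above. The paths, topic name, broker address, and table name are placeholders, and the streaming read assumes the Spark Kafka connector package is available, so read it as the shape of such a pipeline rather than a working one.

```python
# Illustrative batch + streaming sketch in PySpark (placeholder names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pipeline_sketch").getOrCreate()

# Batch: aggregate landed reservation events into a reporting table.
daily = (
    spark.read.parquet("/data/raw/reservations/")   # hypothetical landing path
    .groupBy("location_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("reservations"))
)
daily.write.mode("overwrite").saveAsTable("rpt_daily_reservations")

# Streaming: continuously land the same events from a Kafka topic.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "reservation-events")           # placeholder topic
    .load()
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)
query = (
    stream.writeStream.format("parquet")
    .option("path", "/data/raw/reservations/")
    .option("checkpointLocation", "/chk/reservations/")
    .start()
)
```

Checkpointing, schema handling, and late-arriving data all need more care in practice; the point here is only how the batch and streaming halves of such a pipeline fit together.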
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Proactively educate others on basic data management concepts such as data governance, master data management, data warehousing, big data, reporting, data quality, and database performance.
+ Superior & demonstrated team building & development skills to harness powerful teams
+ Ability to communicate effectively with different levels of leadership within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourages and maintains a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ 5+ years professional experience as a data engineer, software engineer, data analyst, data scientist or related role
+ Strongly prefer hands-on experience with Databricks or Palantir
+ Experience with relational and dimensional database modelling (Relational, Kimball, or Data Vault)
+ Proven experience with all aspects of the Data Pipeline (Data Sourcing, Transformations, Data Quality, Etc...)
+ Bachelor's or Master's in Computer Science, Information Systems, or an engineering field, or equivalent work experience
+ Prefer Travel, transportation, or hospitality experience
+ Prefer experience with designing application data models for mobile or web applications
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling which may include nights, weekends, and holidays preferred
+ Prefer experience with event driven architectures and data streaming pub/sub technologies such as IBM MQ, Kafka, or Amazon Kinesis.
+ Strong capabilities in a scripting language such as Python, R, Scala, etc.
+ Strong capabilities in SQL and experience with stored procedures
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
Data Engineer
Data engineer job in Okemos, MI
We are looking for a Data Engineer to join our Data Engineering Team. The ideal candidate should have a minimum of 3 years of experience with excellent analytical reasoning and critical thinking skills. The candidate will be a part of a team that creates data pipelines that use change data capture (CDC) mechanisms to move data to a cloud provider and then transform data to make it available to Customers to consume. The Data Engineering Team also does general extraction, transformation, and load (ETL) work, along with traditional Enterprise Data Warehousing (EDW) work.
Responsibilities:
Participates in the analysis and development of technical specifications, programming, and testing of Data Engineering components.
Participates in creating data pipelines and ETL workflows to ensure that design and enterprise programming standards and guidelines are followed. Assist with updating the enterprise standards when gaps are identified.
Follows technology best practices and standards and escalates any issues as deemed appropriate. Follows architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architectural team).
Responsible for assisting in configuration and scripting to implement fully automated data pipelines, stored procedures, and functions, and ETL workflows that allow for data to flow from on-premises Oracle databases to Snowflake where the data will be consumable by our end customers.
Follows standard change control and configuration management practices.
Participates in 24-hour on-call rotation in support of the platform.
Data Engineer
Data engineer job in Okemos, MI
We are looking for a Data Engineer to join our Data Engineering Team. The ideal candidate should have a minimum of 3 years of experience with excellent analytical reasoning and critical thinking skills. The candidate will be a part of a team that creates data pipelines that use change data capture (CDC) mechanisms to move data from on-premises to cloud-based destinations and then transform data to make it available to Customers to consume. The Data Engineering Team also does general extraction, transformation, and load (ETL) work, along with traditional Enterprise Data Warehousing (EDW) work.
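For orientation only, here is a bare-bones sketch of the incremental (watermark-based) movement pattern this paragraph describes. Pipelines like these are normally built with CDC tooling such as the StreamSets and Informatica products listed under the required skills rather than hand-written scripts, and every connection detail, table, and column name below is a placeholder.

```python
# Watermark-based incremental pull from an on-premises Oracle source into a
# Snowflake staging table. Illustrative only; names and credentials are fake.
import oracledb
import snowflake.connector

last_loaded = "2024-06-01 00:00:00"  # normally read from a pipeline control table

src = oracledb.connect(user="etl_user", password="***", dsn="onprem-db:1521/ORCLPDB1")
rows = src.cursor().execute(
    """SELECT order_id, status, updated_at
         FROM orders
        WHERE updated_at > TO_TIMESTAMP(:ts, 'YYYY-MM-DD HH24:MI:SS')""",
    ts=last_loaded,
).fetchall()

tgt = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="***",
    warehouse="LOAD_WH", database="EDW", schema="STAGING",
)
tgt.cursor().executemany(
    "INSERT INTO stg_orders (order_id, status, updated_at) VALUES (%s, %s, %s)",
    rows,
)
```

A production load would land files in a stage and use COPY INTO (or true log-based CDC) rather than row-by-row inserts; the sketch only shows the extract-and-load hand-off.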
Responsibilities:
Participates in the analysis and development of technical specifications, programming, and testing of Data Engineering components.
Participates in creating data pipelines and ETL workflows to ensure that design and enterprise programming standards and guidelines are followed. Assist with updating the enterprise standards when gaps are identified.
Follows technology best practices and standards and escalates any issues as deemed appropriate. Follows architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architectural team).
Responsible for assisting in configuration and scripting to implement fully automated data pipelines, stored procedures, and functions, and ETL workflows that allow data to flow from on-premises data sources to cloud-based data platforms (e.g. Snowflake) and application platforms (e.g. Salesforce), where data may be consumed by end customers.
Follows standard change control and configuration management practices.
Participates in 24-hour on-call rotation in support of the platform.
Required Skills/Qualifications:
Database Platforms: Snowflake, Oracle, and SQL Server
OS Platforms: RedHat Enterprise Linux and Windows Server
Languages and Tools: PL/SQL, Python, T-SQL, StreamSets, Snowflake Cloud Data Platform, Informatica PowerCenter, and Informatica IICS or IDMC.
Experience creating and maintaining ETL processes that use Salesforce as a destination.
Drive and desire to automate repeatable processes.
Excellent interpersonal skills and communication, as well as the willingness to collaborate with teams across the organization.
Desired Skills/Qualifications:
Experience creating and maintaining solutions within Snowflake that involve internal file stages, procedures and functions, tasks, and dynamic tables (a brief sketch follows this list).
Experience creating and working with near-real-time data pipelines between relational sources and destinations.
Experience working with StreamSets Data Collector or similar data streaming/pipelining tools (Fivetran, Striim, Airbyte etc...).
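As referenced in the Snowflake item above, the sketch below shows what an internal stage, a dynamic table, and a scheduled task can look like when issued through the Snowflake Python connector. Account details, identifiers, the warehouse, and the called stored procedure are all placeholders, so this is a shape-of-the-objects illustration rather than project DDL.

```python
# Illustrative Snowflake DDL issued via the Python connector (placeholder names).
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="***",
    warehouse="LOAD_WH", database="EDW", schema="STAGING",
)
cur = conn.cursor()

# Internal stage for file loads.
cur.execute("CREATE STAGE IF NOT EXISTS raw_files")

# Dynamic table that keeps a cleansed projection of staged orders refreshed.
cur.execute("""
    CREATE OR REPLACE DYNAMIC TABLE clean_orders
    TARGET_LAG = '15 minutes'
    WAREHOUSE = LOAD_WH
    AS
    SELECT order_id, status, updated_at
      FROM stg_orders
     WHERE order_id IS NOT NULL
""")

# Scheduled task calling a hypothetical stored procedure on a fixed cadence.
cur.execute("""
    CREATE OR REPLACE TASK refresh_order_metrics
    WAREHOUSE = LOAD_WH
    SCHEDULE = '60 MINUTE'
    AS CALL refresh_order_metrics_sp()
""")
cur.execute("ALTER TASK refresh_order_metrics RESUME")
```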
Data architect
Data engineer job in Lansing, MI
Founded in 2009 and headquartered in Ann Arbor, MI, TEKWISSEN™ provides a unique portfolio of innovative capabilities that seamlessly combines clients' insights, strategy, design, software engineering, and systems integration. Our tightly integrated offerings are tailored to each client's requirements and span the services spectrum from application development and maintenance to testing, technology consulting, and staffing. The company is primarily focused on information technology, engineering, healthcare, financial technology, and contingent workforce solutions. It operates in seven business segments, including Commercial, Professional & Technical, EMEA Commercial, and EMEA Professional & Technical. The company provides professional and technical expertise in the fields of telecom, education, banking, retail, e-commerce, automotive, life sciences, insurance, legal, and healthcare, among others. It also offers outsourcing, consulting, recruitment, career transition, and vendor management services.
We strongly believe:
" If something cannot be measured, it cannot be managed. "
TEKWISSEN™ measures all of these processes and applies corrective interventions to manage the quality process at its core.
We are an Equal Employment Opportunity Employer M/F/V/D
Recognitions:
2015 -America's Fastest Growing Company by Inc.com
2015- SPARK FastTrack Award from Ann Arbor SPARK
2015 -Honoree of Diversity Focused Company by Corp! Magazine
2014- America's Fastest Growing Company by Inc.com
2014- Michigan 50 Companies to Watch
2014 - DiSciTech Award in Technology by Corp! Magazine
2014- DiSciTech TECHNOLOGY Company of the year by Corp! Magazine
2014- SPARK FastTrack Award from Ann Arbor SPARK
Specialties:
Enterprise Solutions, Web Development, Data Warehousing, Systems Integration, IT Security, Storage Technologies, Development and Delivery, Business Intelligence, Telecommunications, Consulting and Planning, Network design, Implementation &Administration
Job Description
Hi,
If you find the below description matching your profile, please contact me.
Job Title: Data Architect
Duration: 12+ Months
Location: Lansing, MI
Complete Description:
IT professional with 16 years of experience in data architecture, data warehousing, and business intelligence solutions, including diversified experience in data analysis, data modeling, physical database design, data warehousing, and reporting solutions. My passion is business information: analyzing, organizing, and structuring data so that it can provide insight and add value to the business. Demonstrated competency in collaborating and delivering effective solutions for businesses of different sizes, information service providers, banks, and financial institutions. Data architecture and data modeling for transactional and data warehouse systems. Data modeling: conceptual, logical, and physical modeling using Erwin, ER Studio, and PowerDesigner. Implementation of data models on different database technologies such as Oracle, SQL Server, DB2, and Teradata. Data analysis on structured and unstructured data sets. Metadata management and master data management (MDM).
Skills:
• Data Architect Experience - Required 2
• Database Design Experience - Highly desired 5
• Data Modelling/Data Cleansing/ETL experience - Required 7
• Full Lifecycle Development - Required 8
• Business Analysis Experience - Highly desired 8
• Business Process Engineering Experience - Highly desired 8
• Experience working directly with customers/in a heavily customer-facing role - Required 5
• Experience establishing best practices for Data Governance - Required 2
• Exp building customer-centric data glossaries & change management workflows using standard governance tools (like IBM InfoSphere Data Governance) - Required 2
• Experience developing Business Intelligence solutions - Required 5
Additional Information
Thanks & Regards,
B RajShekhar
Technical Recruiter.
TEL : ************ Ext: 286
Direct : ************
Fax : ************
Web : *****************
ITPA11 - Data Warehouse ETL Developer
Data engineer job in Lansing, MI
The Department of Technology, Management and Budget supports the business operations of state agencies through a variety of services, including building management and maintenance, information technology, centralized contracting and procurement, budget and financial management, space planning and leasing, construction management, motor vehicle fleet operations, and oversight of the state retirement systems. This position is with the Agency Services area of DTMB.
This position functions as an information technology analyst responsible for developing, maintaining and enhancing a variety of automated data warehouse ETL applications located on the State of Michigan's Teradata Data Warehouse platform. The position will represent the team and coordinate team activities with other data warehouse teams on complex, multi-team projects. In addition, the position will work with other data warehouse developers to review existing data warehouse policies and practices, modify work items as needed, or propose new ones. This position is located on Michigan Department of Health and Human Services (MDHHS) Data Warehouse Team. Knowledge of Teradata and Business Objects, along with relational database experience is required to help develop and maintain MDHHS data warehouse applications and queries.
SIGNING BONUS:
This position may be eligible for a sign-on bonus of up to $2,500. Up to $1,250 to be paid upon new hire and the remainder to be paid after satisfactory completion of the initial probationary period (12-month period). This does not apply to current state employees.
The State of Michigan offers a competitive work experience that includes a tuition reduction program at several key higher education institutions if you would like to advance your education, good benefits, excellent vacation and sick time policies, and the ability to successfully balance your work and family life. We would like the opportunity to share more about the benefits of working for the State of Michigan and joining the state employee family if you are interested. Please consider sending in your application today.
ITPA P11 - Position Description
DTMB does not participate in STEM-OPT.
Position Location/Remote Office: The office location is Lansing, MI. The State of Michigan is not able to offer employment to out-of-state applicants that do not plan to relocate. The Department of Technology Management and Budget currently offers a hybrid work option which requires two days working on-site at the official work location and three days of remote work per week.
DTMB is proud to be a Michigan Veteran's Affairs Agency (MVAA) Gold Level Veteran-Friendly Employer.
Education
Information Technology Programmer/Analyst P11
Possession of a Bachelor's degree with 21 semester (32 term) credits in one or a combination of the following: computer science, data processing, computer information systems, data communications, networking, systems analysis, computer programming, information assurance, IT project management or mathematics.
Experience
No specific type or amount is required.
Information Technology Programmer/Analyst P11
Possession of an associate's degree with 16 semester (24 term) credits in computer science, information assurance, data processing, computer information, data communications, networking, systems analysis, computer programming, IT project management, or mathematics and two years of experience as an application programmer, computer operator, or information technology technician; or two years (4,160 hours) of experience as an Information Technology Student Assistant may be substituted for the education requirement.
OR
Educational level typically acquired through completion of high school and four years of experience as an application programmer, computer operator, information technology technician, or four years (8,320 hours) of experience as an Information Technology Student Assistant may be substituted for the education requirement.
To be considered for this position you must:
* Apply for this position online via NEOGOV; click on "Apply" in the job posting for instructions on submitting your electronic application. Hard copy applications are not accepted.
* Relevant experience and/or education referred to in the supplemental questions must be documented in the resume, transcript and/or application to allow for accurate screening.
* Attach a resume identifying specific experience and dates of employment. Dates of employment should include month and year and hours per week.
* Attach a cover letter.
* If applicable, attach a copy of an official transcript(s). We accept scanned copies of official transcripts. We do not accept web-based, internet, or copies of unofficial transcripts. Official transcripts provide the name of the institution, confirmation that a degree was awarded and on what date, and the registrar's signature.
Failure to complete any of the above items may result in your application not being considered and being screened out. See instructions for attaching files here: Instructions.
Your application for any position does not guarantee that you will be contacted by the Department/Agency for further consideration. Only those applicants interviewed will be notified of the results.
The Office of Career Services offers current State Employees and interested applicants access to the job application process and a variety of career planning resources.
In accordance with federal law, all new employees must provide proof of eligibility to work in the United States within three business days of beginning employment. If selected for employment, you must be able to submit proof of your legal right to work in the United States.
Certain positions may require certification in specific information technology programs.
View the job specification at: *****************************************************************************************************************
Data Modeler
Data engineer job in Lansing, MI
**Careers With Purpose**
Sanford Health is one of the largest and fastest-growing not-for-profit health systems in the United States. We're proud to offer many development and advancement opportunities to our nearly 50,000 members of the Sanford Family who are dedicated to the work of health and healing across our broad footprint.
**Facility:** Remote MI (Central Time)
**Location:** Remote, MI
**Address:**
**Shift:** 8 Hours - Day Shifts
**Job Schedule:** Full time
**Weekly Hours:** 40.00
**Salary Range:** $34.50 - $57.00
**Pay Info:** Pay starts at $34.50 and increases according to years of applicable experience.
**Department Details**
***Working remotely is an option if you currently live in SD, ND, MN, IA, NE, WI, IL or MI.
The Data Engineering team is a collaborative, fun and dynamic group. We strive to innovate and improve the process to bring quality data to our end users.
**Job Summary**
The Data Modeler is responsible for ensuring the organization's strategic goals are optimized through the models and designs that detail the structure of data within the enterprise data infrastructure. Guides the organization's data-related efforts and champions the use of technology across all aspects of data management and movement, modeling at all levels, including conceptual, logical, and physical. Works with data architects and data engineers to model data, translating business rules into usable conceptual, logical, and physical models and database designs. Designs detailed data representations applying normalization techniques, star schemas, and dimensional modeling, and generates Entity-Relationship Diagrams (ERDs) to ensure accuracy, integrity, quality, and understanding and to enable long-term scalability.
Assumes responsibility for the detailed data models, which includes reviewing and approving changes to ensure a consistent and high-quality set of models. Assists with the establishment of enterprise-wide data model principles and standards to ensure alignment with the overall Technology Solutions (TS) architecture principles set by the Architecture COE team.
Facilitates highly effective technical team discussions by acting as a leader and mediator between the data engineers and business users to meet the data modeling needs of the organization. Participates in the development of corporate healthcare-based data models to facilitate efficient and effective use of the data. Consults with the principal data architects on data architecture concerns and directions while providing input on data model issues and solutions. Establishes an organizational discipline within various applications and business areas to ensure information is effectively captured for data pipeline management. Proposes appropriate model changes to the enterprise data architecture to keep pace with industry and organizational developments.
Mentors others on data modeling techniques and shares knowledge with the appropriate areas so it can be leveraged and incorporated into business processes across the organization. Reviews and meets ongoing competency requirements of the role to maintain the skills, knowledge, and abilities to perform role-specific functions within scope. Builds and optimizes data models for Snowflake, Redshift, or other similar database platforms.
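To make the modeling work described above concrete, here is a toy star schema with one fact table and two dimensions. The table and column names are hypothetical (not Sanford models), and SQLite is used only so the sketch is self-contained and runnable.

```python
# Toy star schema: one fact table with foreign keys to two dimension tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
        calendar_date TEXT,
        fiscal_year   INTEGER
    );
    CREATE TABLE dim_patient (
        patient_key   INTEGER PRIMARY KEY,   -- surrogate key
        patient_id    TEXT,                  -- natural/business key
        birth_year    INTEGER
    );
    CREATE TABLE fact_encounter (
        encounter_key INTEGER PRIMARY KEY,
        date_key      INTEGER REFERENCES dim_date(date_key),
        patient_key   INTEGER REFERENCES dim_patient(patient_key),
        charge_amount NUMERIC
    );
""")
print([row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")])
```

In an actual warehouse the same structure would be captured first as a conceptual and logical ERD and then generated as physical DDL for Snowflake, Redshift, or a similar platform.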
**Qualifications**
Bachelor's degree required in computer science, management information systems, or related field.
Minimum of two years of experience in data modeling, database design and development, systems or applications development, or related field. Must be proficient in data modeling tools such as Erwin, ER/Studio, PowerDesigner, or other similar tools. Proficiency with SQL, strong understanding of data pipeline and data warehousing principles and concepts is required.
Strong communication and documentation skills, with the ability to explain complex concepts to both technical and non-technical stakeholders. Strong problem-solving skills and attention to detail.
Experience in data modeling and design within the health insurance or health care industry is preferred. Experience with cloud data platforms (e.g., Azure, AWS, or Google Cloud) and healthcare analytics platforms (e.g., Epic Caboodle). Understanding of data lake and lakehouse architectures.
**Benefits**
Sanford Health offers an attractive benefits package for qualifying full-time and part-time employees. Depending on eligibility, a variety of benefits include health insurance, dental insurance, vision insurance, life insurance, a 401(k) retirement plan, work/life balance benefits, and a generous time off package to maintain a healthy home-work balance. For more information about Total Rewards, visit *********************************** .
Sanford is an EEO/AA Employer M/F/Disability/Vet. If you are an individual with a disability and would like to request an accommodation for help with your online application, please call ************** or send an email to ************************ .
Sanford Health has a Drug Free Workplace Policy. An accepted offer will require a drug screen and pre-employment background screening as a condition of employment.
**Req Number:** R-0243184
**Job Function:** Information Technology
**Featured:** No
Data Engineer
Data engineer job in Lansing, MI
OpTech is a woman-owned company that values your ideas, encourages your growth, and always has your back. When you work at OpTech, not only do you get health and dental benefits, but you also have training opportunities, flexible/remote work options, growth opportunities, 401K and competitive pay. Apply today!
Data Engineer
Location: Hybrid: Minimum 2 days per week onsite in East Lansing, MI
Description:
We are looking for a Data Engineer to join our Data Engineering Team. The ideal candidate should have a minimum of 3 years of experience with excellent analytical reasoning and critical thinking skills. The candidate will be a part of a team that creates data pipelines that use change data capture (CDC) mechanisms to move data from on-premises to cloud-based destinations and then transform data to make it available to Customers to consume. The Data Engineering Team also does general extraction, transformation, and load (ETL) work, along with traditional Enterprise Data Warehousing (EDW) work.
Responsibilities:
* Participates in the analysis and development of technical specifications, programming, and testing of Data Engineering components.
* Participates in creating data pipelines and ETL workflows to ensure that design and enterprise programming standards and guidelines are followed. Assist with updating the enterprise standards when gaps are identified.
* Follows technology best practices and standards and escalates any issues as deemed appropriate. Follows architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architectural team).
* Responsible for assisting in configuration and scripting to implement fully automated data pipelines, stored procedures, and functions, and ETL workflows that allow data to flow from on-premises data sources to cloud-based data platforms (e.g. Snowflake) and application platforms (e.g. Salesforce), where data may be consumed by end customers.
* Follows standard change control and configuration management practices.
* Participates in 24-hour on-call rotation in support of the platform.
Required Skills/Qualifications:
Database Platforms: Snowflake, Oracle, and SQL Server
OS Platforms: RedHat Enterprise Linux and Windows Server
Languages and Tools: PL/SQL, Python, T-SQL, StreamSets, Snowflake Cloud Data Platform, Informatica PowerCenter, and Informatica IICS or IDMC.
* Experience creating and maintaining ETL processes that use Salesforce as a destination.
* Drive and desire to automate repeatable processes.
* Excellent interpersonal skills and communication, as well as the willingness to collaborate with teams across the organization.
Desired Skills/Qualifications:
* Experience creating and maintaining solutions within Snowflake that involve internal file stages, procedures and functions, tasks, and dynamic tables.
* Experience creating and working with near-real-time data pipelines between relational sources and destinations.
* Experience working with StreamSets Data Collector or similar data streaming/pipelining tools (Fivetran, Striim, Airbyte etc…).
We are an EOE; all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. *************************************************