Data Engineer
Data engineer job in Maple Grove, MN
Data Engineer - MDP Project
6+ Month Contract
Maple Grove, MN - Hybrid (3 days/week onsite)
Our client is seeking a data engineering contractor to join its team and help build and maintain the existing marketing data warehouse. This role will build DBT models in a medallion-style data mart that combines 10+ data sources and transforms them into a holistic model, allowing marketing teams to derive actionable insights.
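As context for what such a DBT model looks like, here is a minimal, hypothetical staging model in the style this role would build; the source, table, and column names are invented for illustration and are not from the client's warehouse:

```sql
-- models/staging/stg_campaign_events.sql  (all names hypothetical)
with source as (
    select * from {{ source('marketing_raw', 'campaign_events') }}
),

cleaned as (
    select
        event_id,
        lower(trim(channel)) as channel,          -- normalize free-text channel values
        try_to_timestamp(event_ts) as event_ts,   -- Snowflake-tolerant timestamp parsing
        campaign_id
    from source
    where event_id is not null                    -- drop unkeyed rows before the mart layer
)

select * from cleaned
```

In a medallion layout, staging (silver) models like this feed a final mart (gold) model that joins the many sources into the holistic marketing view.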
Responsibilities:
Collaborate with cross-functional teams to ingest, maintain, and deliver analytics data.
Develop and maintain data ingestion and transformation processes within Snowflake using SQL and DBT.
Develop datasets to support ad-hoc analytical needs and reporting.
Monitor data pipelines to ensure data is captured and processed accurately and on time.
Ensure that all processes for receiving, processing, and evaluating data are efficient, replicable and documented.
Perform validation and testing of data transformations and automated jobs to ensure quality data.
Fulfill ad-hoc requests and data processing needs.
Comply with all required internal governance and data privacy requirements.
Required Qualifications:
5+ years professional work experience with relational data models and databases.
Strong experience with Snowflake, DBT, and AWS.
Strong SQL experience focused on data transformation, cleansing, and preparation.
Experience documenting data processes.
Collaborative team-player
Strong ability to multi-task and balance competing priorities effectively.
Experience working with highly regulated industries.
Preferred Qualifications:
Bachelor's degree in IT, software engineering or related fields.
Healthcare data experience
Marketing data experience
ABOUT EIGHT ELEVEN GROUP:
At Eight Eleven, our business is people. Relationships are at the center of what we do. A successful partnership is only as strong as the relationship built. We're your trusted partner for IT hiring, recruiting and staffing needs. For over 16 years, Eight Eleven has established and maintained relationships that are designed to meet your IT staffing needs. Whether it's contract, contract-to-hire, or permanent placement work, we customize our search based upon your company's unique initiatives, culture and technologies. With our national team of recruiters placed at 21 major hubs around the nation, Eight Eleven finds the people best-suited for your business. When you work with us, we work with you. That's the Eight Eleven promise. Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Data Engineer
Data engineer job in Minneapolis, MN
Job Title: Data Engineer
Job Type: Contract to Hire
US Citizens or Green Card holders only; this contract-to-hire role cannot provide sponsorship.
Must have requirements:
GCP, SQL, Python, Airflow
System design mindset
Communication - ability to articulate what they are doing and what/how they are achieving in their work. Accents are not an issue as long as they are comprehensible.
Healthcare experience is not required, but is a nice-to-have.
Location: Onsite at any of 4 office locations; the focus is Minneapolis, MN; Arlington, VA; Portland, OR; and Raleigh, NC
100% onsite initially, switching to a 2-3x/week hybrid schedule for strong performers
Job Summary:
The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining
data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The
incumbent will collaborate with data analysts, data scientists, and other engineers to
ensure timely access to high-quality data for data-driven decision-making across the
organization.
The Senior Cloud Data Engineer is a highly technical person who has mastered hands-on coding in data processing solutions and scalable data pipelines that support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that enable data integrity and quality, and that best support the BI and analytics capabilities powering decision-making. This includes building data acquisition programs that handle the business's growing data volume as part of the data lake in the GCP BigQuery ecosystem, and maintaining a robust data catalog.
This is a Senior Data Engineering role within the Data & Analytics organization's Data Core team, working closely with Data & Analytics leaders. The incumbent will continually improve the business's data and analytics solutions, processes, and data engineering capabilities. The incumbent embraces industry best practices and trends and, through acquired knowledge, drives process and system improvement opportunities.
Responsibilities:
• Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and
Airflow for data ingestion, transformation, and loading.
• Optimize data pipelines for performance, scalability, and cost-efficiency.
• Ensure data quality through data cleansing, validation, and monitoring processes.
• Develop and maintain data models and schemas in BigQuery to support various
data analysis needs.
• Automate data pipeline tasks using scripting languages like Python and tools like
Dataflow.
• Collaborate with data analysts and data scientists to understand data requirements
and translate them into technical data solutions.
• Leverage DevOps Terraform (IaC) to ensure seamless integration of data pipelines
with CI/CD workflows.
• Monitor and troubleshoot data pipelines and infrastructure to identify and resolve
issues.
• Stay up-to-date with the latest advancements in GCP BigQuery and other related
technologies.
• Document data pipelines and technical processes for future reference and
knowledge sharing.
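The ingest-transform-load ordering that an Airflow DAG encodes can be sketched with Python's standard library alone; the task names below are hypothetical and stand in for real Airflow operators:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG wires ingest -> transform -> load.
dag = {
    "ingest_raw": set(),                      # e.g., land files into Cloud Storage
    "transform": {"ingest_raw"},              # e.g., a Dataflow or BigQuery SQL step
    "load_bigquery": {"transform"},           # write to the BigQuery data lake tables
    "data_quality_check": {"load_bigquery"},  # validate row counts / nulls post-load
}

# static_order() yields every task with its dependencies ahead of it.
order = list(TopologicalSorter(dag).static_order())
print(order)  # dependencies always precede dependents
```

In Airflow proper, the same dependencies would be declared with operators and `>>` chaining; this topological order is what the scheduler ultimately respects.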
Basic Requirements:
• Bachelor's degree or equivalent experience in Computer Science, Mathematics,
Information Technology or related field.
• 5+ years of solid experience as a data engineer.
• Strong understanding of data warehousing / data lake concepts and data modeling principles.
• Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.
• Strong SQL and scripting languages like Python (or similar) skills.
• Experience with data quality tools and techniques.
• Ability to work independently and as part of a team.
• Strong problem-solving and analytical skills.
• Passion for data and a desire to learn and adapt to new technologies.
• Experience with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub.
• Experience with cloud deployment and automation tools like Terraform.
• Experience with data visualization tools like Tableau or Power BI or Looker.
• Experience with healthcare data.
• Familiarity with machine learning, artificial intelligence and data science concepts.
• Experience with data governance and healthcare PHI data security best practices.
• Ability to work independently on tasks and projects to deliver data engineering
solutions.
• Ability to communicate effectively and convey complex technical concepts as well
as tasks / project updates.
The projected hourly range for this position is $78 to $89.
On-Demand Group (ODG) provides employee benefits which includes healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
Associate Data Scientist
Data engineer job in Minneapolis, MN
This role is remote.
Develop service specific knowledge through greater exposure to peers, internal experts, clients, regular self-study, and formal training opportunities
Gain exposure to a variety of program/project situations to develop business and organizational/planning skills
Retain knowledge gained and performance feedback provided to transfer into future work
Approach all problems and projects with a high level of professionalism, objectivity and an open mind to new ideas and solutions
Collaborate with internal teams to collect, analyze, and automate data processing
Leverage AI models, including LLMs, for developing intelligent solutions that enhance data-driven decision-making processes for both internal projects and external clients
Leverage machine learning methodologies, including non-linear, linear, and forecasting methods to help build solutions aimed at better understanding the business, making the business more efficient, and planning our future
Work under the guidance of a variety of Data Science team members, gain exposure to developing custom data models and algorithms to apply to data sets
Gain experience with predictive and inferential analytics, machine learning, and artificial intelligence techniques
Use existing processes and tools to monitor and analyze solution performance and accuracy and communicate findings to team members and end users
Contribute to automating business workflows by incorporating LLMs and other AI models to streamline processes and improve efficiency
Integrate AI-driven solutions within existing systems to provide advanced predictive capabilities and actionable insights
Learn to work individually as well as in collaboration with others
Desired Skills/Experience:
Bachelor's degree required, preferably in Statistics, Computer Science, Economics, Analytics, or Data Science
1+ year of experience preferred
Experience with APIs, web scraping, SQL/no-SQL databases, and cloud-based data solutions preferred
Combination of relevant experience, education, and training may be accepted in lieu of degree
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position is $90,000 - $125,000. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
Senior Data Platform Engineer (28702)
Data engineer job in Minnetonka, MN
Title: Senior Data Platform Engineer - Oracle/Snowflake/Azure
Job Type: Contract-to-Hire (6 months) *All candidates must be interested & eligible for conversion without sponsorship.
Industry: Health Insurance
Pay range: $65 to $78/hour
Key Technologies: Oracle, Snowflake, Azure Cloud, MS SQL
---
About the Role
We are seeking a highly skilled Senior Data Platform Engineer to join a leading healthcare organization headquartered in Minnetonka, MN. This role focuses on designing, implementing, and maintaining both legacy and modern data platforms that support enterprise operations. You will collaborate with experienced engineers and architects to optimize databases, develop data pipelines, and drive cloud integration initiatives.
This position is ideal for a seasoned professional who thrives on solving complex data challenges, contributing to modernization efforts, and working in a fast-paced Agile environment.
Responsibilities
Design, build, and maintain robust data pipelines across cloud and on-premises environments.
Administer, monitor, and optimize databases including Oracle, Snowflake, Azure SQL, and MS SQL.
Manage database provisioning, configuration, patching, and backup/recovery processes.
Collaborate with developers, analysts, and DBAs to troubleshoot issues and optimize queries.
Support data migration and integration efforts as part of cloud transformation initiatives.
Ensure database security, access controls, and compliance with internal standards.
Contribute to documentation, runbooks, and knowledge sharing within the team.
Participate in Agile ceremonies and planning activities, fostering a culture of shared ownership and continuous improvement.
Join an on-call rotation to support 24/7 database operations and incident response.
Required Qualifications
7+ years of experience in database engineering or a related technical role.
Hands-on experience with at least one of the following: Oracle, Snowflake, or Azure SQL Database.
Solid knowledge of cloud platforms (Azure preferred) and cloud-native data services.
Strong understanding of system performance tuning and query optimization.
Ability to work collaboratively and communicate effectively with technical peers.
Preferred Qualifications
Experience building and maintaining data pipelines in cloud or hybrid environments.
Familiarity with Liquibase or other database change management tools.
Proficiency in scripting or automation (e.g., Ansible, Python, Terraform).
Experience with CI/CD pipelines or DevOps practices.
Knowledge of monitoring tools and observability platforms.
Background in Agile or SAFe environments.
Salary range for this position is $110,400-$154,600.
Annual salary range placement will depend on a variety of factors including, but not limited to, education, work experience, applicable certifications and/or licensure, the position's scope and responsibility, internal pay equity and external market salary data.
Benefits
Dahl Consulting is proud to offer a comprehensive benefits package to eligible employees that will allow you to choose the best coverage to meet your family's needs. For details, please review the DAHL Benefits Summary: ***********************************************
Data Engineer
Data engineer job in Eagan, MN
Insight Global is seeking a talented Azure Data Engineer to join one of our large utility clients on-site in Eagan, Minnesota. Please find more details below; we look forward to connecting with you!
**This client works closely with the US Government, so candidates need to be eligible to receive a Secret Clearance or higher.
Title: Azure Data Engineer
Client: Utilities Administration Company
Location: Eagan, MN
Schedule: Hybrid onsite - 4 days per week (Monday - Thursday)
Skills Needed:
Ideally, 5+ years of prior Data Engineering experience
Expertise in Azure Cloud (experience with Azure Monitor is a plus)
Experience with the following: Azure Data Factory, Azure Synapse, PySpark, Python and SQL
Bachelor's Degree (or higher) in a related STEM discipline
Willingness to work in-office 4 days per week in Eagan, MN
Compensation: $60/hour to $75/hour. Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
Senior Data Engineer
Data engineer job in Minneapolis, MN
About the Company
A global leader in the alternative investment and asset management space is expanding its Data & Analytics capabilities. The firm oversees multi-billion-dollar portfolios across credit, real estate, private equity, lending, and structured investments and maintains a high-performance, data-driven culture committed to integrity, innovation, and excellence.
Position Overview
We are hiring a Senior Data Engineer to join the Data & Analytics team and drive the evolution of our cloud-based data ecosystem. This individual will architect scalable data pipelines, optimize cloud infrastructure, and enable advanced analytics across multiple business units.
The role is based in Minneapolis and requires close collaboration with investment, operations, and technology teams.
Key Responsibilities
Data Engineering
Design, build, and optimize reliable ETL/ELT pipelines for large-scale data ingestion and transformation.
Modernize and enhance the current data ecosystem to support high-quality, consistent data delivery.
Manage cloud-based data infrastructure, including resource deployment, configuration, and performance tuning.
Monitor system health, troubleshoot pipeline issues, and streamline processes for speed and efficiency.
Implement robust data security, governance, and privacy controls for sensitive financial data.
Stay updated with emerging technologies and best practices in cloud data engineering.
Analytics & Business Enablement
Develop data models, tools, and frameworks that support self-service analytics across the organization.
Translate business needs into data-driven solutions such as dashboards, metrics, and analytical tools.
Mentor analysts and help strengthen analytical maturity across the company.
Support commercial and custom applications through configuration, administration, and maintenance.
Required Qualifications
Bachelor's degree in a STEM discipline (Computer Science, Engineering, Math, etc.).
10+ years overall experience in data engineering or analytics roles.
5+ years designing and maintaining ETL/ELT pipelines.
5+ years experience with data warehouses and analytics platforms.
5+ years strong SQL experience working with complex datasets.
5+ years experience with business intelligence tools (Looker, Power BI, Sigma, Tableau, Cognos, etc.).
5+ years scripting in Python or Scala.
5+ years cloud experience, preferably AWS.
5+ years working with data governance, data quality, and related tools.
Hands-on experience with Infrastructure as Code (Terraform, CloudFormation, ARM templates, etc.).
Familiarity with the alternative investments / private equity / hedge fund domain preferred.
Strong communication, stakeholder management, and cross-team collaboration skills.
Ability to thrive in a fast-paced environment with multiple competing priorities.
U.S. Citizen or Permanent Resident only (ITAR requirement).
Must work onsite 3-4 days/week in Minneapolis - no remote or hybrid exceptions
Note:
This role requires U.S. Citizenship or Permanent Residency (ITAR compliance). Candidates must already live in Minneapolis or be willing to relocate prior to start. Onsite attendance 3-4 days per week is mandatory - no remote exceptions.
Do not apply if you are an OPT student with a STEM OPT EAD; due to ITAR compliance, this position requires US citizenship or a Green Card.
Data Modeler
Data engineer job in Minneapolis, MN
“Healthcare Insurance” Industry Experience
Data Analysis/Architecture
Source to Target Mapping
ERWIN
Data Modeling
ETL
Informatica-Powercenter
Oracle PL/SQL
Healthcare Cloud Data Transformation Lead
Data engineer job in Minneapolis, MN
About the job
We are seeking a highly skilled Cloud Data Transformation Architect/Leader to lead the design, implementation, and optimization of large-scale cloud-based data platforms and transformation initiatives. The architect will play a critical role in defining data strategy and roadmap, modernizing legacy environments, and enabling advanced analytics and AI/ML capabilities. This position requires deep expertise in cloud ecosystems, data integration, governance, and performance optimization, along with strong leadership in guiding cross-functional teams. The Transformation leader will drive the implementation roadmap and work with client leadership to demonstrate the ROI of the modernized platform and transformation roadmap.
What you will do at Sogeti
Advisory Consulting, Collaboration & Leadership
Provide leadership and advisory consulting to client data and analytics leadership
Partner with business stakeholders, data leaders, data engineers, and analysts to understand business needs and align with data capabilities
Provide CXO level status reporting and communication along with ROI and NPV measurement/reporting
Provide technical leadership, mentorship, and best practices for data engineering teams.
Serve as the subject matter expert on cloud data and analytics transformation.
Data Strategy, Roadmap & Architecture
Define and maintain the enterprise cloud data architecture, strategy and roadmap aligned with business objectives.
Define key metrics to measure progress and ROI from the roadmap
Design end-to-end cloud data ecosystems (data lakes, data warehouses, lakehouses, streaming pipelines).
Design metadata-driven architecture and open table formats (Delta Lake, Apache Hudi, and Apache Iceberg)
Experienced in implementing cost management measures on cloud data platforms
Evaluate emerging technologies and recommend adoption strategies.
Data Transformation & Migration
Lead the modernization of on-premises data platforms to cloud-native architectures.
Architect scalable, secure, and high-performance ETL/ELT and real-time data pipelines.
Ensure seamless integration of structured, semi-structured, and unstructured data.
Governance & Security
Implement data governance, lineage, cataloging, and quality frameworks.
Ensure compliance with regulatory standards (GDPR, HIPAA, SOC 2, etc.).
Define data security models for data access, encryption, and masking.
BI Modernization
Lead the modernization of legacy BI platforms to Power BI
Architect and develop the semantic layer needed for the consumption layer (BI, AI, etc.)
Optimization & Innovation
Drive performance tuning, cost optimization, and scalability of cloud data platforms.
Explore opportunities to leverage AI/ML, advanced analytics, and automation in data transformation.
Establish reusable frameworks and accelerators for faster delivery.
Data Operations:
Definition of SLAs/KPIs for data platform operations, together with the client
Tracking and reporting of SLAs/KPIs to executive leadership
Identify and roll out solutions for improving SLA/KPI adherence
What you will bring
Experience:
18+ years in data engineering/architecture, with 5+ years in leading enterprise-scale cloud data transformations.
Experience in Healthcare Payer industry
Experience defining enterprise-level data strategy and roadmap, and driving the implementation, for at least 3 enterprise clients
Experience playing a key advisory role for client data and analytics leadership (CDO and direct reports)
Hands-on expertise with at least one major cloud provider: Azure is a must-have; AWS and GCP are good to have
Experienced in implementing Snowflake on Azure, as well as medallion lakehouse architecture using Databricks and MS Fabric with open table formats
Experienced in various data modeling techniques and standards for cloud data warehouses
Experienced in designing and implementing high performing data pipelines; performance tuning expertise is required
Experienced with data governance implementation, with a focus on metadata management using Alation and data quality using industry-standard tools
Experienced with BI modernization from legacy BI platforms to Power BI (big plus)
Technical Skills:
Data platforms: Snowflake (must have); Databricks, MS Fabric (big plus); Synapse, Redshift, BigQuery (good to have)
Data integration: Azure Data Factory, DBT, Snowpipe, Informatica Power Center, IDMC CDI, Matillion, Kafka
Programming: SQL, Python, SnowSQL, Snowpark, PySpark, Scala, or Java.
Data Governance: Alation (must have); Informatica, Collibra, Ataccama (good to have)
BI Platforms: Power BI (must have); Qlik, SSRS, SAS (good to have)
Infrastructure as Code: Terraform, ARM, CloudFormation.
Strong understanding of APIs, microservices, and event-driven architectures.
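As a concrete illustration of the Infrastructure as Code item above, here is a minimal, hypothetical Terraform sketch (resource names and values are placeholders, not a prescribed setup) that provisions an ADLS Gen2 storage account for a lakehouse landing zone on Azure:

```hcl
# Hypothetical Terraform: resource group plus an ADLS Gen2 account for the lakehouse.
resource "azurerm_resource_group" "data_platform" {
  name     = "rg-data-platform"    # placeholder name
  location = "eastus2"
}

resource "azurerm_storage_account" "lake" {
  name                     = "examplelakesa"    # placeholder; must be globally unique
  resource_group_name      = azurerm_resource_group.data_platform.name
  location                 = azurerm_resource_group.data_platform.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
  is_hns_enabled           = true               # hierarchical namespace = ADLS Gen2
}
```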
Life at Sogeti: Sogeti supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
Flexible work options
401(k) with 150% match up to 6%
Employee Share Ownership Plan
Medical, Prescription, Dental & Vision Insurance
Life Insurance
100% Company-Paid Mobile Phone Plan
3 Weeks PTO + 7 Paid Holidays
Paid Parental Leave
Adoption, Surrogacy & Cryopreservation Assistance
Subsidized Back-up Child/Elder Care & Tutoring
Career Planning & Coaching
$5,250 Tuition Reimbursement & 20,000+ Online Courses
Employee Resource Groups
Counseling & Support for Physical, Financial, Emotional & Spiritual Well-being
Disaster Relief Programs
About Sogeti
Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data and automation.
Become Your Best | *************
Disclaimer
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Capgemini is committed to providing reasonable accommodation during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact.
Please be aware that Capgemini may capture your image (video or screenshot) during the interview process and that image may be used for verification, including during the hiring and onboarding process.
Click the following link for more information on your rights as an Applicant **************************************************************************
Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by Capgemini.
Data Scientist
Data engineer job in Mendota Heights, MN
We are seeking a Data Scientist to deliver predictive analytics and actionable insights that enhance financial forecasting and supply chain performance. This role will partner with business leaders and analysts to design models that inform strategic decisions. You will work primarily within Microsoft Fabric, leveraging Delta Lake/OneLake and Medallion Architecture (Bronze-Silver-Gold) to build scalable solutions and lay the groundwork for future AI-driven capabilities.
This is a full-time, direct hire role which will be onsite in Mendota Heights, MN. Local candidates only. Target salary is between $120,000-140,000.
Candidates must be eligible to work in the United States without sponsorship both now and in the future. No C2C or third parties.
Key Responsibilities
Develop and deploy machine learning models for cost modeling, sales forecasting, and long-term work order projections.
Analyze large, complex datasets to uncover trends, anomalies, and opportunities for operational improvement.
Collaborate with finance, supply chain, and business teams to translate challenges into data-driven solutions.
Work with engineering teams to create robust pipelines for data ingestion, transformation, and modeling using cloud-native tools.
Utilize Azure services (Data Lake, Synapse, ML Studio) to operationalize models and manage workflows.
Present insights through clear visualizations and executive-level presentations.
Contribute to governance standards, audit trails, and model documentation.
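The Bronze-Silver-Gold flow referenced above can be sketched in plain Python; the records and field names below are invented for illustration, and a real implementation would operate on Fabric/Delta tables rather than in-memory lists:

```python
# Hypothetical medallion flow: raw (bronze) -> cleansed (silver) -> aggregated (gold).
bronze = [
    {"order_id": "A1", "region": " north ", "cost": "120.5"},
    {"order_id": "A2", "region": "South",   "cost": "80"},
    {"order_id": None, "region": "south",   "cost": "15"},   # unkeyed row, dropped in silver
]

# Silver: enforce keys, normalize text, cast types.
silver = [
    {"order_id": r["order_id"], "region": r["region"].strip().lower(), "cost": float(r["cost"])}
    for r in bronze
    if r["order_id"] is not None
]

# Gold: business-level aggregate (total cost per region) ready for forecasting models.
gold: dict[str, float] = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["cost"]

print(gold)  # {'north': 120.5, 'south': 80.0}
```

The same pattern scales up: Bronze preserves raw fidelity, Silver enforces keys and types, and Gold carries the business-level aggregates that forecasting models consume.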
Qualifications
Education & Certifications
Bachelor's degree required; Master's in Computer Science, IT, or related field preferred.
Cloud certifications (Azure or similar) are a plus.
Experience & Skills
5+ years as a Data Scientist or similar role.
Hands-on experience with Microsoft Fabric, Azure Synapse, and related cloud technologies.
Proficiency in Python, R, SQL, and visualization tools (Power BI, Tableau).
Strong background in financial modeling, cost allocation, and supply chain analytics.
Familiarity with Oracle and Salesforce UI navigation is helpful.
Excellent business acumen and ability to communicate complex concepts to senior leadership.
Strong problem-solving skills and ability to design scalable solutions.
Preferred
Experience with Azure Machine Learning.
Knowledge of Jitterbit is a plus.
All qualified applicants will receive consideration for employment without regard to race, color, national origin, age, ancestry, religion, sex, sexual orientation, gender identity, gender expression, marital status, disability, medical condition, genetic information, pregnancy, or military or veteran status. We consider all qualified applicants, including those with criminal histories, in a manner consistent with state and local laws, including the California Fair Chance Act, City of Los Angeles' Fair Chance Initiative for Hiring Ordinance, and Los Angeles County Fair Chance Ordinance. For unincorporated Los Angeles county, to the extent our customers require a background check for certain positions, the Company faces a significant risk to its business operations and business reputation unless a review of criminal history is conducted for those specific job positions.
Data Engineer
Data engineer job in Bloomington, MN
Are you an experienced Data Engineer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Data Engineer to work at their company in Bloomington, MN.
Primary Responsibilities/Accountabilities:
Develop and maintain scalable ETL/ELT pipelines using Databricks and Airflow.
Build and optimize Python-based data workflows and SQL queries for large datasets.
Ensure data quality, reliability, and high performance across pipelines.
Collaborate with cross-functional teams to support analytics and reporting requirements.
Monitor, troubleshoot, and improve production data workflows.
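For the data-quality responsibility above, here is a hedged, stdlib-only sketch of the kind of validation gate a pipeline task might run before publishing a batch; the column names and threshold are hypothetical:

```python
# Minimal data-quality gate of the kind a Databricks/Airflow pipeline task might run
# before publishing a dataset (column names and thresholds are hypothetical).
def validate(rows: list[dict], required: list[str], max_null_rate: float = 0.05) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if not rows:
        return ["batch is empty"]
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(f"{col}: null rate {rate:.1%} exceeds {max_null_rate:.1%}")
    return failures

batch = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": None}, {"id": 3, "amount": 4.0}]
print(validate(batch, required=["id", "amount"]))  # amount fails: 1 of 3 rows is null
```

A gate like this would typically fail the task (and alert) rather than silently publish a degraded dataset downstream.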
Qualifications:
Strong hands-on experience with Databricks, Python, SQL, and Apache Airflow.
6-10+ years of experience in Data Engineering.
Experience with cloud platforms (Azure/AWS/GCP) and big data ecosystems.
Solid understanding of data warehousing, data modeling, and distributed data processing.
AirWatch MDM Engineer
Data engineer job in Saint Paul, MN
12-month assignment with possibility for extension
We are seeking an experienced AirWatch MDM Engineer to manage and support enterprise mobile device management (MDM) solutions. The role involves maintaining the existing VMware Workspace ONE / AirWatch platform, providing advanced technical support, and leading migration efforts to Microsoft Intune. This position requires strong troubleshooting skills, collaboration with security teams, and the ability to work independently in a fast-paced environment.
Key Responsibilities
Maintain and administer the AirWatch MDM platform, including:
Device enrollment and lifecycle management
Policy configuration and compliance monitoring
Application deployment for iOS, Android, and Windows devices
Provide Tier 2/3 support for mobile device issues across multiple platforms.
Manage vendor portals (Verizon, AT&T, T-Mobile) for cellular activations and support.
Collaborate with security and compliance teams to ensure alignment with organizational standards.
Monitor system performance, generate reports, and implement improvements for security and user experience.
Lead assessment, planning, and phased migration from AirWatch to Microsoft Intune:
Stakeholder engagement
Pilot testing
Documentation
Develop and maintain technical documentation, SOPs, and knowledge base articles.
Stay current with industry trends and best practices in endpoint management and mobile security.
Perform knowledge transfer and provide guidance to internal teams.
Minimum Qualifications
3+ years of hands-on experience with VMware Workspace ONE / AirWatch administration.
2+ years of experience with AirWatch MDM software, mobile OS platforms, and enterprise mobility architecture.
2+ years of experience managing cellular activations via vendor portals (Verizon, AT&T, T-Mobile).
1+ year of experience with Microsoft Intune, Azure AD, and Microsoft Endpoint Manager.
Desired Skills
Experience with Intune deployment or migration projects.
Microsoft certifications (e.g., MD-102, SC-300, AZ-104).
Knowledge of Zero Trust principles and conditional access policies.
Experience integrating MDM with identity and access management solutions.
Proficiency in PowerShell scripting or other automation tools.
SharePoint Engineer
Data engineer job in Minneapolis, MN
Job Title: SharePoint Engineer II
Introduction
We are seeking a highly skilled SharePoint Engineer II to join our End User Technology - Productivity Tools team, responsible for enterprise collaboration and productivity platforms across Microsoft 365. This role plays a critical part in administering and engineering SharePoint Online environments, ensuring secure, scalable, and efficient collaboration solutions. The ideal candidate will combine deep expertise in SharePoint administration with strong automation and integration skills, driving modernization efforts and enhancing user productivity.
Roles and Responsibilities
Administer and manage SharePoint Online tenant and site collections, including site templates, hubs, term store, content types, app catalog/add-ins, sharing policies, retention labels, and search configurations. Maintain governance and lifecycle management at scale.
Collaborate with business owners to design modern site architectures (communication and team sites), navigation, and metadata strategies to improve content findability and user adoption.
Implement and enforce role-based access controls, group permissions, and external sharing policies. Conduct audits to identify and remediate permission drift, supporting least-privilege security models.
Lead and support classic-to-modern migration projects, performing content inventory, page and web part modernization, workflow replacements, and post-migration stabilization including training and documentation.
Develop, maintain, and optimize operational scripts and automation workflows using PowerShell (including PnP and Graph modules) and Azure Automation runbooks for provisioning, compliance enforcement, reporting, and policy management.
Utilize Microsoft Graph API and SharePoint REST API to integrate SharePoint with enterprise workflows, external data sources, and approval processes, creating lightweight custom extensions as needed.
Provide advanced (3rd-level) technical support and troubleshooting for complex SharePoint and Microsoft 365 collaboration issues, including root cause analysis and incident response.
Produce detailed documentation such as SOPs, runbooks, migration playbooks, and quick-start guides. Deliver targeted training sessions for site owners and administrators.
Collaborate with solution architects to evaluate and recommend new capabilities. Mentor junior team members to build overall team expertise and capacity.
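The permission-audit work described above can be pictured with a small sketch. This is an illustrative Python model of detecting permission drift against a least-privilege baseline; the site, group, and permission names are invented, and a real audit would pull this data via PnP PowerShell or the Microsoft Graph API rather than hard-coded dictionaries.

```python
# Illustrative permission-drift audit (assumed data shapes; in practice the
# input would come from PnP PowerShell or the Microsoft Graph API).

BASELINE = {
    "Finance Site": {"finance-team": "Edit", "auditors": "Read"},
    "HR Site": {"hr-team": "Edit"},
}

def find_drift(baseline, actual):
    """Return (site, group, expected, found) tuples where the actual
    permissions diverge from the least-privilege baseline."""
    drift = []
    for site, groups in actual.items():
        expected = baseline.get(site, {})
        for group, level in groups.items():
            want = expected.get(group)
            if want is None:
                drift.append((site, group, "none", level))   # unexpected grant
            elif want != level:
                drift.append((site, group, want, level))     # changed level
    return drift

actual = {
    "Finance Site": {"finance-team": "Edit", "auditors": "Edit", "interns": "Read"},
    "HR Site": {"hr-team": "Edit"},
}
for site, group, want, got in find_drift(BASELINE, actual):
    print(f"{site}: {group} expected {want}, found {got}")
```

The real remediation step would then revoke or downgrade each flagged grant through the same administrative APIs.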
Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field preferred.
5+ years of experience deploying, managing, and administering SharePoint Online within Microsoft 365, including tenant-level governance, site provisioning, app catalog management, search, and security/compliance.
3+ years of hands-on experience with PowerShell scripting (including PnP and Graph modules) and Azure Automation for operational task automation and reporting.
3+ years working with Microsoft Graph API and SharePoint REST API to extend and integrate SharePoint solutions.
Proven experience supporting classic-to-modern SharePoint migrations, including content inventory, page modernization, web part remediation, and workflow replacement.
Solid understanding of IT Service Management (ITSM) processes, with practical experience using tools such as ServiceNow for incident, change, and problem management.
Strong analytical, problem-solving, and communication skills with the ability to work effectively in a hybrid team environment.
Preferred:
Experience with Azure AD/Entra ID (dynamic groups, conditional access), Power Platform governance, and cross-service integrations (Teams, OneDrive, Viva).
Familiarity with cloud platforms such as Azure, AWS, or GCP, and CI/CD pipelines using GitHub Actions or Azure DevOps for infrastructure as code and SharePoint automation.
Exposure to enterprise monitoring and analytics tools like Nexthink to drive adoption and performance improvements.
Tools and Technologies
SharePoint Online (tenant and site-level administration)
PowerShell (including PnP PowerShell, Microsoft Graph PowerShell)
Microsoft Graph API
SharePoint REST API
Azure Automation
ServiceNow (or equivalent ITSM platforms)
Azure AD/Entra ID
Microsoft 365 collaboration tools (Teams, OneDrive, Viva)
Cloud platforms: Azure, AWS, GCP (preferred)
CI/CD tools: GitHub Actions, Azure DevOps (preferred)
Senior Software Engineer
Data engineer job in Minneapolis, MN
DOCSI is seeking a talented, driven software engineer to join our engineering team. We need a passionate and creative mind to help us continue building our cutting-edge surgical waste elimination platform. The person who accepts this role will not only work closely with our Director of Engineering, but will also gain full exposure to the inner workings and decision-making challenges of an early-stage startup. They will inevitably be called upon to contribute to significant decisions that impact the technical direction of the company. They should also be willing and able to grow into a technical or people management role as the engineering team grows.
This role will:
Work alongside the Director of Engineering and other DOCSI engineers to expand and maintain our software solution.
Design and build new user experiences that streamline the complex and confusing process of managing surgical waste.
Inform the creation of machine learning tools to amplify the quality of surgical waste reduction recommendations.
Create seamless data pipelines and integrations that enable our highly scalable, always available platform.
Influence and guide critical design discussions that determine the future direction of our product.
Gain access and connections to key members of the Twin Cities startup community.
Help shape the culture of a new and growing engineering team.
Minimum Qualifications:
4+ years of experience working as a software engineer or similar role
Experience in web development with one or more of the following languages/frameworks: PHP, React, Python, Java
Expertise working with relational database systems such as MySQL or PostgreSQL
Demonstrable experience leading technical projects from start to finish (with or without assistance from other team members)
An understanding of building systems to scale with large, often inconsistent data imports
Action driven self-starter who enjoys improving existing processes
A lifelong learning mindset with a desire to explore new ideas and connect them to their work
Ability to work in an often ambiguous, fast-paced environment
Bonus Qualifications:
Previous work with PHI or other sensitive data. Experience undergoing compliance audits is even better
Experience in designing seamless, mobile-friendly user experiences
A history of, or a deep interest in, working at startups or early-stage companies
A background/experience in healthcare and/or supply chain
(Extra plus) Experience specifically with Laravel, Apache Spark, Terraform, and/or AWS cloud services
Salary and Benefits:
Expected salary range is $100,000 - $140,000
An equity package relative to the candidate's skills and experience
Unlimited vacation policy
A healthcare stipend is available; full healthcare benefits will be available in 2026
AI/ML Engineer
Data engineer job in Medina, MN
Top Technical Skills & Requirements
Programming Expertise: 3+ years of hands-on experience with C# and Python, including building scalable applications and integrating ML models.
API Development & Management: Experience designing, building, and managing RESTful API endpoints using C# (.NET) and Python (FastAPI), with a focus on performance, security, and maintainability.
Cloud ML Deployment: Proven experience in end-to-end deployment, monitoring, and maintenance of ML models on Azure (preferred) or AWS, including CI/CD pipelines and MLOps practices.
Deep Microsoft Azure Experience: Extensive hands-on experience with Azure services including Azure Machine Learning, Azure AI Foundry, Azure Functions, and Azure DevOps, enabling scalable and secure AI/ML solutions across enterprise environments.
Azure AI Foundry: Practical experience leveraging Azure AI Foundry for model development, orchestration, and deployment.
Applied Data Science: Strong background in data exploration, feature engineering, model selection, and validation across supervised and unsupervised learning tasks.
Key Responsibilities
AI Solution Development & Deployment
Architect and implement AI/ML solutions using Azure Machine Learning, Azure AI Foundry, Cognitive Services, and Azure OpenAI.
Build and deploy NLP and LLM-based models, utilizing frameworks such as LangChain, Semantic Kernel, or LlamaIndex.
Design and implement Retrieval-Augmented Generation (RAG) pipelines using Azure AI Search to enhance generative AI capabilities.
Develop and manage RESTful API endpoints using C# (.NET) and Python (FastAPI) to serve ML models and data services.
Implement CI/CD pipelines and MLOps workflows using Azure DevOps and GitHub for scalable and automated model deployment.
Leverage Terraform for infrastructure-as-code (IaC) to provision and manage Azure cloud resources in a repeatable and secure manner.
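As a rough illustration of the retrieval step in a RAG pipeline like the one described above, the sketch below ranks documents by cosine similarity to a query embedding. The document names and vectors are toy data invented for this example; in the actual stack, an embedding model would produce the vectors and Azure AI Search would handle indexing and ranking.

```python
import math

# Toy retrieval step of a RAG pipeline. The three-dimensional "embeddings"
# below are made up; real embeddings come from a model and live in a vector
# index such as Azure AI Search.
DOCS = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.1],
    "warranty terms": [0.7, 0.2, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k document keys most similar to the query embedding."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

# A query embedding close to "refund policy":
print(retrieve([0.85, 0.15, 0.0]))
```

The retrieved passages would then be injected into the LLM prompt to ground its answer, which is the "augmented generation" half of RAG.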
Model Lifecycle Management
Lead end-to-end ML model lifecycle including data exploration, feature engineering, model training, validation, and deployment.
Monitor and maintain deployed models in production environments, ensuring performance, reliability, and scalability.
Apply best practices in model versioning, automated retraining, and performance monitoring using Azure ML and related tools.
Quality Assurance & Governance
Conduct thorough testing and validation of AI models to ensure accuracy, reliability, and performance.
Optimize and fine-tune models, addressing issues related to data quality, bias, and fairness.
Stay current with industry trends and best practices in AI technology, incorporating them into solution development.
Education & Experience
Bachelor's Degree in Computer Science, Data Science or similar (relevant work experience is acceptable)
3+ years of experience in AI/ML development, with a focus on OpenAI services, NLP, and LLMs.
Experience in a consulting environment, engaging with clients and delivering tailored solutions.
Preferred Consulting Experience
Collaborate with sales and delivery teams to scope and design AI solutions tailored to client needs.
Contribute to proposal development, including technical architecture, effort estimation, and value articulation.
Deliver technical presentations and demos to stakeholders, showcasing solution capabilities and business impact.
Equal opportunity employer including disability/veterans.
IAM Engineer
Data engineer job in Thief River Falls, MN
Key Responsibilities
Design and implement IAM solutions, including SSO, MFA, and RBAC.
Manage and maintain IAM systems for high availability and security.
Develop and enforce IAM policies and best practices.
Integrate IAM systems with applications, infrastructure, and cloud services.
Conduct security assessments and audits of IAM processes.
Lead user provisioning, de-provisioning, and access certification processes.
Troubleshoot complex IAM issues and provide technical support.
Collaborate with IT, security, and business teams to define IAM requirements.
Mentor junior engineers and share best practices.
Stay updated on IAM trends and emerging technologies.
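The SSO/MFA/RBAC responsibilities above center on role-based access control, whose core check can be sketched in a few lines. The roles, users, and permission names below are hypothetical; IAM platforms such as Okta or Microsoft Entra manage the same role-to-permission mapping at enterprise scale.

```python
# Minimal RBAC sketch. Role and permission names are illustrative only.
ROLE_PERMISSIONS = {
    "admin": {"read", "write", "delete", "manage-users"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob": {"editor", "viewer"},
}

def is_allowed(user, permission):
    """A user is allowed an action if any of their roles grants it."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

print(is_allowed("bob", "write"))    # editor role grants write
print(is_allowed("bob", "delete"))   # no role of bob's grants delete
```

Provisioning and de-provisioning then reduce to adding or removing entries in the user-to-role mapping, which is what keeps access certification tractable.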
Required Qualifications
Experience: 6-8 years in IAM with strong architectural knowledge.
Expertise in Single Sign-On (OAuth) and IAM tools such as:
Ping Identity, Okta, CyberArk (PAM), Active Directory, Microsoft Entra, Delinea.
Strong understanding of IAM technologies and their functionality.
Excellent communication and presentation skills for technical and non-technical audiences.
Senior Software Engineer
Data engineer job in New Brighton, MN
We are seeking a skilled Power Platform Developer to design, develop, and implement solutions using Microsoft Power Platform tools including Power Apps, Power Automate, Power BI, and Dataverse. The ideal candidate will collaborate with business stakeholders to automate processes, build custom applications, and deliver data-driven insights that enhance operational efficiency.
Key Responsibilities:
Develop and maintain custom applications using Power Apps.
Automate workflows and integrate systems using Power Automate.
Create interactive dashboards and reports with Power BI.
Work with Dataverse and other data sources to manage and model data.
Collaborate with cross-functional teams to gather requirements and deliver scalable solutions.
Ensure solutions are secure, compliant, and aligned with best practices.
Qualifications:
Proven experience with Microsoft Power Platform.
Strong understanding of data modeling, connectors, and integration techniques.
Familiarity with Microsoft 365, SharePoint, and Azure services.
Excellent problem-solving and communication skills.
Senior Software Engineer
Data engineer job in Bloomington, MN
At TempWorks, the Senior Software Engineer is responsible for creating software that delights our customers and users in a way that is also easily maintainable.
The Senior Software Engineer is responsible for leading the design, development, and implementation of software solutions. You will collaborate closely with cross-functional teams to understand requirements, design scalable architectures, and deliver robust, efficient software products.
General Responsibilities:
Research, design, implement, and maintain software features through ongoing feature development, refactoring, and by addressing bugs.
Build highly performant, fault tolerant, high-quality, scalable software.
Actively seek to learn and improve the company, department, team, and themselves.
Develop intuitive software that meets the needs of the company and our customers.
Leverage technical knowledge, skills, and experience to improve department processes and software quality.
Write quality unit and integration tests.
Analyze and test programs and products before formal launch.
Contribute and adhere to best practices in software development.
Participate in agile development processes, including sprint planning, daily stand-ups, and retrospectives.
Communicate with and train stakeholders on completed work for documentation, customer training, troubleshooting, and quality.
Provide mentoring for other Software Engineers.
Perform code reviews and provide constructive feedback.
Stay up to date with emerging technologies and trends in software development and recommend new tools and techniques to improve efficiency and productivity.
Participate in architectural discussions and contribute to the continuous improvement of development processes and methodologies.
Participate in educational opportunities like online course materials, professional publications, conferences, meet-ups, etc.
Perform other related duties as assigned.
Additional Required Skills and Abilities:
Excellent verbal and written communication skills.
Excellent interpersonal and customer service skills.
Strong architectural and design skills, with the ability to architect complex systems and make informed technical decisions.
Analytical and creative problem solving.
High level of organization and attention to detail.
Ability to work independently.
Education and Experience:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
5+ years of relevant experience developing enterprise scale, web-based software applications.
4+ years of C# experience.
2+ years of Microsoft SQL database experience required (4+ preferred).
4+ years' experience developing applications using RESTful APIs.
4+ years' experience developing REST API driven applications using C# .NET framework and/or ASP.NET.
Expertise in front-end technologies such as HTML, CSS, JavaScript, and modern JavaScript frameworks (e.g., React, Angular, Vue.js), React preferred.
Experience with version control systems (e.g., Git) to manage source code and facilitate collaboration within the development team.
Experience with testing and mocking frameworks (e.g., MSTest, NUnit, XUnit, Moq)
Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and DevOps practices. Azure preferred.
Experience with CI/CD, preferably Azure YAML pipelines.
Experience with static and dynamic code analysis tools (e.g., SonarQube, Veracode, ReSharper).
Experience with one or more of the following required: Domain Driven Design, event-based architecture, distributed systems, microservices, clean architecture, 12-factor App.
Physical Requirements:
Prolonged periods sitting at desk and working on a computer.
Must be able to lift up to 10 pounds at times.
Principal Software Engineer
Data engineer job in Eden Prairie, MN
Job Title: Principal Software Engineer
Work Style: Full-time onsite (some flexibility on Fridays)
Salary: $120,000 - $145,000 per year (no bonus or additional compensation currently)
Projected Total Compensation: $120,000 - $145,000 annually
Start: ASAP
Duration: Full-time / Direct Hire
Interview Process:
Round 1: 30-minute phone screen with hiring manager
Round 2: Onsite interview with engineering team
About the Role (Summary of project)
Gentis Solutions is seeking a Principal Software Engineer to design, develop, and customize Linux board support packages (BSPs), focusing primarily on bootloaders (U-Boot) and Linux kernel development for Yocto and Buildroot-based distributions.
This role is not an IT or application development position; it is deeply embedded, system-level engineering, supporting processor platforms, device drivers, bare-metal systems, RTOS environments, and board bring-up.
The Principal Software Engineer will provide technical leadership, mentor other engineers, and collaborate cross-functionally to deliver cutting-edge embedded solutions across multiple processor architectures.
What You'll Do (Job Description):
Technical Leadership & Architecture
Translate product requirements into scalable, implementable system architectures.
Provide day-to-day mentorship and technical leadership to design engineers.
Lead multi-discipline engineering projects and occasionally manage customer project deliverables.
Embedded Software Development
Develop software for 32-bit and 64-bit processor platforms.
Build and customize bootloaders (U-Boot) and Linux kernel components.
Develop software for bare metal, RTOS, Linux, Android, and QNX platforms.
Design and implement device drivers for USB, Video, Audio, Ethernet, CAN, NAND/NOR flash, DDR/SDRAM, HDMI, PCIe, SPI, I2C, etc.
Develop software for wireless technologies: Wi-Fi, Bluetooth, 802.11, GPS, cellular.
System Debug & Hardware Integration
Support hardware and electrical engineering teams with board bring-up, debugging, and validation.
Read and interpret complex electrical schematics and datasheets.
Utilize oscilloscopes, JTAG debuggers, spectrum analyzers, and related tools.
Documentation & Project Execution
Prepare verification test plans, development plans, software specifications, and requirements documents.
Complete projects within budget and timeline requirements.
Communicate technical details and project status across internal and external stakeholders.
Engage with external technical communities through writing or speaking engagements.
What We're Looking For (Must Haves):
Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or similar.
7-12+ years of embedded software development experience (flexible; the right fit is prioritized).
Strong experience with embedded processor platforms (ARM, PowerPC, MSP430, PIC32, x86 preferred).
Expertise with embedded Linux, device drivers, BSPs, bootloaders, Yocto, Buildroot.
Experience with bare-metal development, RTOS platforms, and low-level system programming.
Strong understanding of CPU internals (caches, MMU, interrupts, DMA, power states).
Experience working with cross-functional engineering teams on product design.
Ability to write detailed technical documentation and proposals.
Hands-on experience with Ethernet, USB, I2C, CAN, Flash, SPI, and other embedded peripherals.
Strong communication skills; able to present to leadership and engineering groups.
Experience with Agile/Scrum development environments.
Preferred (Nice-to-Have Skills):
Experience managing offshore engineering teams or partner organizations.
Experience working on wireless technologies like Bluetooth, Wi-Fi, GPS, cellular.
Familiarity with TCP/IP networking, routing protocols, and similar technologies.
Experience using oscilloscopes, JTAG tools, and system debuggers.
Experience contributing to technical blogs, conferences, or community events.
Senior Application Developer - OneStream
Data engineer job in Wayzata, MN
Senior Application Developer - OneStream
Job Type: Full-Time (FTE)
Base Salary: $103,393 to $148,700 + best-in-class benefits
Qualifications:
*Minimum of 4 years of relevant work experience; 5 or more years is typical.
Preferred Qualifications:
*Proficient in .Net, C#
*Strong previous experience with finance applications
*Has the desire to learn Finance processes and gain solution expertise
*Previous experience with OneStream, Hyperion or other corporate consolidation and planning tools
*Knowledge of financial close and consolidation processes
*Knowledge of financial planning and analysis
*VB.Net and/or C# experience for business rules
Skills and Certifications:
*OneStream
Candidate Details:
*Seniority Level - Mid-Senior
*Minimum Education - Bachelor's Degree
Consultant, Quality Improvement & Data Management
Data engineer job in Hutchinson, MN
Hutchinson Health is seeking a skilled Quality Improvement & Data Management Consultant to lead moderate to complex projects aimed at enhancing performance and supporting regional and departmental strategic goals. In this role, you will provide expertise in quality improvement methods, data analysis, change management, and team facilitation within HealthPartners, primarily focusing on Hutchinson Health and Olivia Hospital and Clinics. The ideal candidate will have a Bachelor's degree in a relevant field, at least 3 years of healthcare quality improvement experience, and proficiency in Lean, Six Sigma, and PDSA methodologies. To be successful in this role, qualified individuals will possess strong leadership, multi-tasking, technology, and self-starting skills. Join us in driving continuous improvement and delivering high-quality care to the Central MN community.
This position will be on-site primarily at Hutchinson Health and Olivia Hospital and Clinics, but will also include time at other Health Partners locations depending on need.
Job Summary:
Provides quality improvement and data expertise acting as a consultant in performance improvement methods, systems thinking, change management, team facilitation, and data collection and analysis. Manages all aspects of mid-sized projects in support of regional or departmental strategic goals. Provides expertise and facilitates development of standardized approaches to create performance improvement plans, define appropriate tools, methodologies and metrics, analyze and interpret data, manage change and facilitate improvement teams. Mentors and coaches individuals and teams in improvement methods, project management, change management, group dynamics and planning methods. Actively partners with leaders to select and implement solutions and develop appropriate monitors and control plans to ensure implementation and hardwiring of improvement/change. Creates and presents project status updates to senior leadership. Identifies and removes barriers to project success or escalates to leadership when appropriate.
Essential Duties and Responsibilities:
Acts as quality consultant, project manager and facilitator for mid-sized to complex projects that support the organization's mission, vision and strategic priorities.
Develops and supports a standardized performance improvement approach to influence the overall Central MN Performance Improvement culture.
Identifies and develops recommendations and material for educational and communication needs in the Quality Performance Improvement department and throughout the Central MN Region.
Establishes appropriate measurement and data monitoring approach to achieve desired results.
Supports local leaders in the identification of data sources/appropriate reports, including serving as a liaison to the HealthPartners system data teams when new report builds are required to evaluate a local improvement initiative.
Prepares charts, tables, and diagrams to assist others in conducting second level analysis and/or in problem-solving.
Partners with the Quality Director and other leaders to design reports and scorecards for local leaders/committees. Assists to ensure that any quality metrics required by accrediting/regulatory bodies (e.g., Joint Commission) are available to appropriate stakeholders.
Performs all other related duties as assigned.
Accountabilities for All Employees:
Adheres to the Hutchinson Health Employee Values.
Maintains confidentiality of the organization and patients.
Reports any health/medical errors.
Observes all Environment of Care policies and reports safety risks or hazards immediately.
Education, Training or Degree Required:
Bachelor's degree (BA/BS) required, preferably in business, nursing, operations management, industrial engineering, health care, statistics, or a related discipline.
3 years of clinical or quality improvement experience in the healthcare industry; Master's-level coursework may substitute for years of experience.
Previous project management/quality improvement/data management experience.
License/Registration/Certification: (will be primary source verified by Human Resources)
Green Belt certification, Lean or Six Sigma training and certification, or similar preferred
Experience and Skills: (indicate preferred or required)
Required:
Demonstrated experience in quality improvement methods such as Lean, Six Sigma, PDSA (Plan-Do-Study-Act), and A3 thinking; measurement definition and analysis; team facilitation; and project management.
Proficiency with Microsoft Office applications, including Excel, Word, and PowerPoint, and with various project management tools, including flowcharting.
Knowledge of The Joint Commission (TJC) and Centers for Medicare & Medicaid Services (CMS) standards.
Exceptional organizational capabilities and prioritization skills.
Proficient in preparing, leading and facilitating meetings, bringing teams to decisions in facilitating improvement sessions and/or workgroups.
Proficient in tracking and reporting project or initiative progress.
Strong change management, interpersonal communication, and negotiation/conflict management skills.
Preferred:
Systems thinking/change management coursework or experience
Experience working in a matrix organization
Experience with Epic
Previous experience in a licensed clinical position helpful
Date created: 10/07/2025 DR/KM