
Data engineer jobs in Duluth, MN

- 1,621 jobs
  • Data Engineer

    On-Demand Group (4.3 company rating)

    Data engineer job in Bloomington, MN

    Key Responsibilities
    - Design, build, and maintain scalable data pipelines for ingesting, cleaning, and transforming provider data.
    - Develop and optimize workflows in Databricks for large-scale data processing.
    - Implement and manage data storage solutions using the Microsoft Azure suite, including Azure Data Lake, Blob Storage, and Azure SQL.
    - Collaborate with API developers and data consumers to ensure seamless API data consumption.
    - Work closely with data scientists, analysts, and product owners to ensure data quality, consistency, and availability.
    - Contribute to the evolution of our data lake and warehouse architecture to support current and future analytics needs.

    Required Qualifications
    - Hands-on experience with Databricks and Apache Spark.
    - Proficiency in SQL, Python, PySpark, and Git.
    - Strong proficiency with Microsoft Azure cloud services, especially data storage and compute.
    - Proven experience with data lakes and/or data warehouses.
    - Solid understanding of REST APIs and experience consuming them in data workflows.
    - Experience with data ingestion, ETL/ELT pipelines, and data cleaning techniques.

    Preferred Qualifications
    - Hands-on experience with Power BI.
    - Experience building or maintaining GraphQL APIs.
    - Experience designing and developing REST APIs.
    - Familiarity with AI/ML integration in data pipelines or analytics workflows.
    - Knowledge of healthcare data standards and provider data models is a plus.

    The projected hourly range for this position is $65.00 to $85.00. On-Demand Group (ODG) provides employee benefits including healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
    $65-85 hourly 4d ago
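Several of the data engineering postings above and below ask for experience with data cleaning and deduplication during ingestion. As a rough, hedged illustration of that kind of task, here is a minimal pure-Python sketch that standardizes and deduplicates provider records; the field names and cleaning rules are hypothetical examples, not taken from any posting, and real pipelines of this sort would typically run in Databricks/PySpark rather than plain Python.

```python
# Minimal sketch of a provider-data cleaning step. All field names and
# rules below are hypothetical illustrations.

def clean_provider_records(records):
    """Standardize fields and collapse duplicate provider IDs (last record wins)."""
    seen = {}
    for rec in records:
        pid = str(rec.get("provider_id", "")).strip()
        if not pid:
            continue  # skip records with no usable key
        seen[pid] = {
            "provider_id": pid,
            # collapse internal whitespace and normalize casing
            "name": " ".join(str(rec.get("name", "")).split()).title(),
            "state": str(rec.get("state", "")).strip().upper(),
        }
    return list(seen.values())

raw = [
    {"provider_id": " 101 ", "name": "  acme   clinic ", "state": "mn"},
    {"provider_id": "101", "name": "Acme Clinic", "state": "MN"},  # duplicate key
    {"provider_id": "", "name": "No ID", "state": "WI"},           # unusable key
]
cleaned = clean_provider_records(raw)
```

The same shape of logic (standardize, then deduplicate on a business key) translates directly to a Spark `dropDuplicates` step in the Databricks workflows these roles describe.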
  • Senior Data Platform Engineer (28702)

    Dahl Consulting (4.4 company rating)

    Data engineer job in Minnetonka, MN

    Title: Senior Data Platform Engineer - Oracle/Snowflake/Azure
    Job Type: Contract-to-Hire (6 months); all candidates must be interested in, and eligible for, conversion without sponsorship.
    Industry: Health Insurance
    Pay range: $65 to $78/hour
    Key Technologies: Oracle, Snowflake, Azure Cloud, MS SQL

    About the Role
    We are seeking a highly skilled Senior Data Platform Engineer to join a leading healthcare organization headquartered in Minnetonka, MN. This role focuses on designing, implementing, and maintaining both legacy and modern data platforms that support enterprise operations. You will collaborate with experienced engineers and architects to optimize databases, develop data pipelines, and drive cloud integration initiatives. This position is ideal for a seasoned professional who thrives on solving complex data challenges, contributing to modernization efforts, and working in a fast-paced Agile environment.

    Responsibilities
    - Design, build, and maintain robust data pipelines across cloud and on-premises environments.
    - Administer, monitor, and optimize databases including Oracle, Snowflake, Azure SQL, and MS SQL.
    - Manage database provisioning, configuration, patching, and backup/recovery processes.
    - Collaborate with developers, analysts, and DBAs to troubleshoot issues and optimize queries.
    - Support data migration and integration efforts as part of cloud transformation initiatives.
    - Ensure database security, access controls, and compliance with internal standards.
    - Contribute to documentation, runbooks, and knowledge sharing within the team.
    - Participate in Agile ceremonies and planning activities, fostering a culture of shared ownership and continuous improvement.
    - Join an on-call rotation to support 24/7 database operations and incident response.

    Required Qualifications
    - 7+ years of experience in database engineering or a related technical role.
    - Hands-on experience with at least one of the following: Oracle, Snowflake, or Azure SQL Database.
    - Solid knowledge of cloud platforms (Azure preferred) and cloud-native data services.
    - Strong understanding of system performance tuning and query optimization.
    - Ability to work collaboratively and communicate effectively with technical peers.

    Preferred Qualifications
    - Experience building and maintaining data pipelines in cloud or hybrid environments.
    - Familiarity with Liquibase or other database change management tools.
    - Proficiency in scripting or automation (e.g., Ansible, Python, Terraform).
    - Experience with CI/CD pipelines or DevOps practices.
    - Knowledge of monitoring tools and observability platforms.
    - Background in Agile or SAFe environments.

    The salary range for this position is $110,400-$154,600. Placement within the range will depend on a variety of factors including, but not limited to, education, work experience, applicable certifications and/or licensure, the position's scope and responsibility, internal pay equity, and external market salary data.

    Benefits
    Dahl Consulting is proud to offer a comprehensive benefits package to eligible employees that will allow you to choose the best coverage to meet your family's needs. For details, please review the DAHL Benefits Summary: ***********************************************
    $110.4k-154.6k yearly 4d ago
  • Data Engineer

    Insight Global

    Data engineer job in Eagan, MN

    Insight Global is seeking a talented Azure Data Engineer to join one of our large utility clients on-site in Eagan, Minnesota. Please find more details below; we look forward to connecting with you! Note: this client works closely with the US Government, so candidates need to be eligible to receive a Secret Clearance or higher.

    Title: Azure Data Engineer
    Client: Utilities Administration Company
    Location: Eagan, MN
    Schedule: Hybrid onsite - 4 days per week (Monday - Thursday)

    Skills Needed
    - Ideally, 5+ years of prior data engineering experience
    - Expertise in Azure Cloud (experience with Azure Monitor is a plus)
    - Experience with Azure Data Factory, Azure Synapse, PySpark, Python, and SQL
    - Bachelor's degree (or higher) in a related STEM discipline
    - Willingness to work in-office 4 days per week in Eagan, MN

    Compensation: $60/hour to $75/hour. Exact compensation may vary based on several factors, including skills, experience, and education. Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
    $60 hourly 1d ago
  • Data Engineer

    FAC Services, LLC

    Data engineer job in Madison, WI

    About FAC Services
    Want to build your career helping those who build the world? At FAC Services, we handle the business side so architecture, engineering, and construction firms can focus on shaping the future. Our trusted, high-quality solutions empower our partners, and our people, to achieve excellence with integrity, precision, and a personal touch.

    Job Purpose
    FAC Services is investing in a modern data platform to enable trustworthy, timely, and scalable data for analytics, operations, and product experiences. The Data Engineer will design, build, and maintain core data pipelines and models for Power BI reporting, application programming interfaces (APIs), and downstream integrations. This role partners closely with Infrastructure, Quality Assurance (QA), the Database Administrator, and application teams to deliver production-grade, automated data workflows with strong reliability, governance, observability, and Infrastructure as Code (IaC) for resource orchestration.

    Primary Responsibilities
    - Data Architecture & Modeling: Design and evolve canonical data models, marts, and lake/warehouse structures to support analytics, APIs, and applications. Establish standards for naming, partitioning, schema evolution, and Change Data Capture (CDC).
    - Pipeline Development (ETL/ELT): Build resilient, testable pipelines across Microsoft Fabric Data Factory, notebooks (Apache Spark), and Lakehouse tables for batch and streaming workloads. Design Lakehouse tables (Delta/Parquet) in OneLake. Optimize Direct Lake models for Power BI. Implement reusable ingestion and transformation frameworks emphasizing modularity, idempotency, and performance.
    - Integration & APIs: Engineer reliable data services and APIs to feed web applications, Power BI, and partner integrations. Publish consumer-facing data contracts (Swagger) and implement change notification (webhooks/eventing). Use semantic versioning for breaking changes and maintain a deprecation policy for endpoints and table schemas. Ensure secure connectivity and least-privilege access in coordination with the DBA.
    - Infrastructure as Code (IaC), Resource Orchestration & Security: Author and maintain IaC modules to deploy and configure core resources. Use Bicep/ARM (and, where appropriate, Terraform/Ansible) with CI/CD to promote changes across environments.
    - DevOps, CI/CD & Testing: Own CI/CD pipelines (Git-based promotion) for data code, configurations, and infrastructure. Practice test-driven development with QA (unit, integration, regression) and embed data validations throughout pipelines; collaborate with the Data Quality Engineer to maximize coverage.
    - Observability & Reliability: Instrument pipelines and datasets for lineage, logging, metrics, and alerts; define Service Level Agreements (SLAs) for data freshness and quality. Perform performance tuning (e.g., Spark optimization, partition strategies) and cost management across cloud services.
    - Data Quality & Governance: Implement rules for deduplication, reconciliation, and anomaly detection across environments (Microsoft Fabric Lakehouse and Power BI). Contribute to standards for sensitivity labels, Role-Based Access Control (RBAC), auditability, and secure data movement aligned with Infrastructure and Security.
    - Collaboration & Leadership: Work cross-functionally with Infrastructure, QA, and application teams; mentor peers in modern data engineering practices; contribute to documentation and knowledge sharing. Hand off to the Data Quality Engineer for release gating; coordinate with the Database Administrator on backup/restore posture, access roles, High Availability / Disaster Recovery (HA/DR), and source CDC readiness.

    Qualifications
    To perform this job successfully, an individual must be able to perform each primary duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.

    Experience (Required)
    - 3+ years designing and operating production ETL/ELT pipelines and data models.
    - Apache Spark (Fabric notebooks, Synapse Spark pools, or Databricks).
    - Advanced T-SQL and Python; experience with orchestration, scheduling, and dependency management.
    - Azure Event Hubs (or Kafka) for streaming; Change Data Capture (CDC).
    - Infrastructure as Code (Bicep/ARM/Terraform); CI/CD (Azure DevOps).
    - API design for data services (REST/OpenAPI), including versioning, pagination, error handling, authentication, and authorization.

    Experience (Preferred)
    - Lakehouse design patterns on Microsoft Fabric; optimization of Power BI with Direct Lake models.
    - Kusto Query Language (KQL), Eventstream, and Eventhouse familiarity.
    - Experience with lineage/metadata platforms and cost governance.
    $76k-101k yearly est. 4d ago
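The FAC Services posting above calls out API design for data services, including pagination. As a hedged sketch of what consuming such a paginated data API looks like, here is a small pure-Python example that follows cursor links until a service reports no more pages; the `items`/`next_cursor` response shape is a hypothetical convention, not an API defined by the posting, and the in-memory fetcher stands in for a real HTTP call.

```python
# Sketch of cursor-based pagination when consuming a REST data service.
# The response shape {"items": [...], "next_cursor": ...} is a
# hypothetical example convention.

def fetch_all(fetch_page):
    """Follow 'next_cursor' links until the service signals the last page.

    `fetch_page(cursor)` returns a dict with an "items" list and a
    "next_cursor" that is None on the final page.
    """
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)
        items.extend(page["items"])
        cursor = page.get("next_cursor")
        if cursor is None:
            return items

# Fake in-memory "service" standing in for an HTTP endpoint; the first
# request passes cursor=None, later requests pass the returned cursor.
_pages = {
    None: {"items": [1, 2], "next_cursor": "p2"},
    "p2": {"items": [3], "next_cursor": None},
}
all_items = fetch_all(lambda cursor: _pages[cursor])
```

In a production consumer the lambda would be an authenticated HTTP call, with retry and error handling per the API's published data contract.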
  • Senior Data Platform Engineer

    The Nycor Group

    Data engineer job in Edina, MN

    Data Platform Engineer
    As a Data Platform Engineer, you will be responsible for the ingestion, transformation, and maintenance of enterprise data used to serve analytics needs for the business. Working closely with Business Analysts and Data Architects, you will use your technical skills to understand and execute business requirements. This role requires strong fundamentals in data engineering and a collaborative, business-process-oriented mindset.

    Essential Job Functions

    Data Preparation (70% time allocation)
    - Use DBT to move data through a medallion architecture in Snowflake.
    - Apply standardization and resolve conflicts in raw-layer data (cleansing).
    - Use cleansed data and dimensional modeling techniques (Kimball) to create facts and dimensions in the data warehouse.
    - Create curated, highly consumable data products that fulfill business needs.
    - Represent business processes digitally in data models, ensuring accurate reflection of the underlying processes.

    Quality Assurance (15% time allocation)
    - Validate data outputs against Business Analyst-provided test cases.
    - Ensure quality of data pipelines via analysis and unit tests (standardization, completeness, grain, redundancy, etc.).

    Team Development (10% time allocation)
    - Set development standards and lead code reviews.
    - Mentor other team members to develop their skills and abilities.
    - Research technologies to improve processes.
    - Collaborate with a team of 9 reporting to the BI Manager, including engineers, a Data Scientist, a Data Architect, and Business Analysts.

    Data Ingestion (5% time allocation)
    - Use Fivetran/HVR to create data connections from source systems to Snowflake.

    Knowledge, Skills, and Abilities
    - Minimum of 5-7 years of in-depth work experience in data warehousing or data engineering.
    - Manufacturing industry experience required.
    - Expertise in DBT and Snowflake (must-have).
    - Strong fundamentals: Kimball dimensional modeling, normalization vs. denormalization, Type 1 vs. Type 2 dimensions, cardinality, data granularity and aggregation, hierarchies, etc.
    - Experience in ELT and data analysis with SQL and at least one programming language (Python preferred).
    - Conceptual knowledge of data and analytics, including dimensional modeling, ELT, reporting tools, data governance, and structured and unstructured data.
    - Experience and/or knowledge of CI/CD practices using GitHub or Azure Repos.
    - Familiarity with ERP systems (D365 experience is a plus).
    - Ability to design and build systems that handle data, including cleaning messy data and building real-time pipelines.
    - Collaborative, optimistic personality with integrity; able to pivot quickly and work closely with business teams.

    Education / Experience
    - Bachelor's degree in Business Information Systems, Computer Science, or equivalent.
    - Related work experience in a manufacturing setting is preferred.
    - Minimum of 5-7 years in data engineering roles.

    Additional Notes
    Ideal candidates understand the fundamentals of data engineering and can articulate their experience designing and building data systems. They should be collaborative, positive, and business-process oriented, avoiding rigid enforcement approaches. Flexibility and adaptability are key. Please note: unfortunately, no visa sponsorship or transfers will be available for this position.
    $75k-99k yearly est. 4d ago
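The Nycor posting above expects candidates to know the difference between Type 1 and Type 2 dimensions (Kimball). As a toy, hedged illustration of the Type 2 pattern: instead of overwriting a changed attribute in place (Type 1), the current row is closed out and a new versioned row is inserted. The column names below are illustrative only; in the posting's stack this logic would typically live in dbt snapshots/models over Snowflake, not in Python.

```python
# Toy sketch of a Kimball Type 2 slowly changing dimension update.
# Column names (key, attr, valid_from, valid_to, current) are illustrative.

def apply_scd2(dim_rows, update, effective_date):
    """Close the current row for a changed member and append a new version."""
    for row in dim_rows:
        if row["key"] == update["key"] and row["current"]:
            if row["attr"] == update["attr"]:
                return dim_rows  # no attribute change; nothing to do
            row["valid_to"] = effective_date  # close out the old version
            row["current"] = False
    dim_rows.append({
        "key": update["key"], "attr": update["attr"],
        "valid_from": effective_date, "valid_to": None, "current": True,
    })
    return dim_rows

dim = [{"key": "C1", "attr": "Duluth", "valid_from": "2024-01-01",
        "valid_to": None, "current": True}]
dim = apply_scd2(dim, {"key": "C1", "attr": "Minneapolis"}, "2025-06-01")
```

After the update, the dimension holds both the closed-out Duluth row and a new current Minneapolis row, which is what lets historical facts still join to the attribute values in effect when they occurred.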
  • Data Engineer

    Talent Software Services (3.6 company rating)

    Data engineer job in Bloomington, MN

    Are you an experienced Data Engineer with a desire to excel? If so, Talent Software Services may have the job for you! Our client is seeking an experienced Data Engineer to work at their company in Bloomington, MN.

    Primary Responsibilities/Accountabilities
    - Develop and maintain scalable ETL/ELT pipelines using Databricks and Airflow.
    - Build and optimize Python-based data workflows and SQL queries for large datasets.
    - Ensure data quality, reliability, and high performance across pipelines.
    - Collaborate with cross-functional teams to support analytics and reporting requirements.
    - Monitor, troubleshoot, and improve production data workflows.

    Qualifications
    - Strong hands-on experience with Databricks, Python, SQL, and Apache Airflow.
    - 6-10+ years of experience in data engineering.
    - Experience with cloud platforms (Azure/AWS/GCP) and big data ecosystems.
    - Solid understanding of data warehousing, data modeling, and distributed data processing.
    $71k-96k yearly est. 2d ago
  • AWS Data Architect

    New York Technology Partners (4.7 company rating)

    Data engineer job in Neenah, WI

    We are seeking a highly skilled AWS Data Architect to design, build, and optimize cloud-based data platforms that enable scalable analytics and business intelligence. The ideal candidate will have deep expertise in AWS cloud services, data modeling, data lakes, ETL pipelines, and big data ecosystems.

    Key Responsibilities
    - Design and implement end-to-end data architectures on AWS (data lakes, data warehouses, and streaming solutions).
    - Define data ingestion, transformation, and storage strategies using AWS-native services (Glue, Lambda, EMR, S3, Redshift, Athena, etc.).
    - Architect ETL/ELT pipelines and ensure efficient, secure, and reliable data flow.
    - Collaborate with data engineers, analysts, and business stakeholders to translate business needs into scalable data solutions.
    - Establish data governance, security, and compliance frameworks following AWS best practices (IAM, KMS, Lake Formation).
    - Optimize data systems for performance, cost, and scalability.
    - Lead data migration projects from on-premises or other clouds to AWS.
    - Provide technical guidance and mentorship to data engineering teams.

    Required Skills & Qualifications
    - 10+ years of experience in data architecture, data engineering, or cloud architecture.
    - Strong hands-on experience with AWS services:
      - Storage & Compute: S3, EC2, Lambda, ECS, EKS
      - Data Processing: Glue, EMR, Kinesis, Step Functions
    $90k-117k yearly est. 1d ago
  • Cloud Engineer

    Manifest Technology

    Data engineer job in Minneapolis, MN

    Cloud AWS Engineer - III to Serve the Finance Industry

    MANIFEST Technology is seeking a Senior Cloud AWS Engineer with at least six years of experience designing, deploying, and supporting cloud-based solutions in AWS. Candidates should bring strong expertise in Infrastructure as Code (Terraform), containerized environments using Docker, and CI/CD practices, particularly leveraging GitLab for automated deployments. The position requires proficiency in Python and data engineering concepts, including ETL pipelines, Apache Spark, data pipelines, serverless architectures, and REST APIs. Successful applicants will be skilled in creating and interpreting complex technical documentation and communicating technical concepts to non-technical audiences. Experience with observability and monitoring tools such as Grafana, CloudWatch, OpenSearch, Dynatrace, and OpenTelemetry (Prometheus, Jaeger, ADOT) is essential.

    In this role, the engineer will manage and administer AWS environments, guide architecture decisions for the observability platform, ensure application reliability and performance, support full-lifecycle deployment processes, and collaborate closely with development, SRE, and operations teams. The ideal candidate is a technical leader capable of evaluating new technologies, ensuring compliance with security and audit standards, and contributing to the ongoing improvement of engineering practices and platform stability.

    Position type: W-2; US citizens only; no C2C
    Duration: 1 year
    Location/Hours: Onsite 5 days a week in Minneapolis, MN
    Pay Range: $70-80/hr; W-2; no C2C

    Responsibilities
    - Primary focus on cloud-based technology solutions.
    - Evaluate, recommend, and select new software/architecture for the observability platform.
    - Manage and administer the AWS cloud environment, including provisioning, configuration, performance monitoring, policy governance, and security.
    - Function as the subject matter expert for coordinating and managing the deployment process and support of the full lifecycle of the observability platform in AWS.
    - Prepare detailed guidance from which application programs will be written.
    - Analyze and revise existing system logic and documentation as necessary.
    - May authorize risk-level changes and recommend solutions to minimize and/or prevent system interruption.
    - Provide technical assistance and operational guidelines for business operations and application development to ensure applications are running optimally in production, test, and development environments.
    - Work with diverse technologies to design, build (code), test, debug, document, implement, and maintain solutions and/or patterns for existing and new systems or hardware within the boundaries of existing standards, processes, or operational plans.
    - Ensure that Treasury application services are highly available, reliable, and performant through monitoring and alerting.
    - May lead highly technical/complex projects utilizing system or local staff and resources.
    - Follow and ensure adherence to technical standards for programming and design techniques.
    - May train technical staff on the use of software/hardware tools in accordance with required standards and procedures.
    - Contribute to the development and revision of department standards and procedures.
    - Collaborate with the development staff and Site Reliability Engineers to understand the software products and any enhancements that are deployed; consult on issues related to the impact of development on the infrastructure; work with system engineers and developers to define server configuration settings; lead the migration of code through staging environments to production; and assist software quality assurance technicians during system acceptance testing.
    - Monitor compliance with internal audit requirements and Information Security Manual guidelines.

    Key Qualifications
    Education and experience at the Senior Engineer level: Bachelor's degree and a minimum of six (6) years of relevant work experience, including:
    - AWS and Infrastructure as Code (Terraform)
    - Docker / containerized solutions
    - Configuring and deploying multiple deployments through GitLab
    - Coding languages such as Python, Go, and NodeJS/TypeScript
    - Ability to communicate complex technical topics to non-technical audiences
    - Ability to create, read, and comprehend complex technical documentation
    - Observability tools (e.g., Grafana, CloudWatch, OpenSearch, Dynatrace)
    - OpenTelemetry tools and frameworks (e.g., Prometheus, Jaeger, ADOT)

    Necessary Skills: data engineering, Python, ETL, Terraform, CI/CD, Docker, Amazon Web Services (AWS), Apache Spark, data pipelines, serverless computing, REST APIs

    Next Steps: Qualified candidates should apply now for immediate consideration! Please send your resume to ********************************* and then text/call David Slaymaker at ************.
    $70-80 hourly 1d ago
  • AirWatch MDM Engineer

    Trioptus

    Data engineer job in Saint Paul, MN

    12-month assignment with possibility of extension.

    We are seeking an experienced AirWatch MDM Engineer to manage and support enterprise mobile device management (MDM) solutions. The role involves maintaining the existing VMware Workspace ONE / AirWatch platform, providing advanced technical support, and leading migration efforts to Microsoft Intune. This position requires strong troubleshooting skills, collaboration with security teams, and the ability to work independently in a fast-paced environment.

    Key Responsibilities
    - Maintain and administer the AirWatch MDM platform, including device enrollment and lifecycle management, policy configuration and compliance monitoring, and application deployment for iOS, Android, and Windows devices.
    - Provide Tier 2/3 support for mobile device issues across multiple platforms.
    - Manage vendor portals (Verizon, AT&T, T-Mobile) for cellular activations and support.
    - Collaborate with security and compliance teams to ensure alignment with organizational standards.
    - Monitor system performance, generate reports, and implement improvements for security and user experience.
    - Lead assessment, planning, and phased migration from AirWatch to Microsoft Intune, including stakeholder engagement, pilot testing, and documentation.
    - Develop and maintain technical documentation, SOPs, and knowledge base articles.
    - Stay current with industry trends and best practices in endpoint management and mobile security.
    - Perform knowledge transfer and provide guidance to internal teams.

    Minimum Qualifications
    - 3+ years of hands-on experience with VMware Workspace ONE / AirWatch administration.
    - 2+ years of experience with AirWatch MDM software, mobile OS platforms, and enterprise mobility architecture.
    - 2+ years of experience managing cellular activations via vendor portals (Verizon, AT&T, T-Mobile).
    - 1+ year of experience with Microsoft Intune, Azure AD, and Microsoft Endpoint Manager.

    Desired Skills
    - Experience with Intune deployment or migration projects.
    - Microsoft certifications (e.g., MD-102, SC-300, AZ-104).
    - Knowledge of Zero Trust principles and conditional access policies.
    - Experience integrating MDM with identity and access management solutions.
    - Proficiency in PowerShell scripting or other automation tools.
    $64k-85k yearly est. 1d ago
  • Senior Software Engineer

    Docsi

    Data engineer job in Minneapolis, MN

    DOCSI is seeking a talented, driven software engineer to join our engineering team. We need a passionate and creative mind to help us continue building our cutting-edge surgical waste elimination platform. The person who accepts this role will work closely with our Director of Engineering and will benefit from full exposure to the inner workings and decision-making challenges of an early-stage startup. They will inevitably be called upon to contribute to significant decisions that impact the technical direction of the company, and should be willing and able to grow into a technical or people management role as the engineering team grows.

    This role will:
    - Work alongside the Director of Engineering and other DOCSI engineers to expand and maintain our software solution.
    - Design and build new user experiences that streamline the complex and confusing process of managing surgical waste.
    - Inform the creation of machine learning tools to amplify the quality of surgical waste reduction recommendations.
    - Create seamless data pipelines and integrations that enable our highly scalable, always-available platform.
    - Influence and guide critical design discussions that determine the future direction of our product.
    - Gain access and connections to key members of the Twin Cities startup community.
    - Help shape the culture of a new and growing engineering team.

    Minimum Qualifications
    - 4+ years of experience working as a software engineer or in a similar role.
    - Experience in web development with one or more of the following languages/frameworks: PHP, React, Python, Java.
    - Expertise working with relational database systems such as MySQL or PostgreSQL.
    - Demonstrable experience leading technical projects from start to finish (with or without assistance from other team members).
    - An understanding of building systems to scale with large, often inconsistent data imports.
    - Action-driven self-starter who enjoys improving existing processes.
    - A lifelong-learning mindset with a desire to explore new ideas and connect them to their work.
    - Ability to work in an often ambiguous, fast-paced environment.

    Bonus Qualifications
    - Previous work with PHI or other sensitive data; experience undergoing compliance audits is even better.
    - Experience designing seamless, mobile-friendly user experiences.
    - A history of, or deep interest in, working in startups or early-stage companies.
    - A background/experience in healthcare and/or supply chain.
    - (Extra plus) Experience specifically with Laravel, Apache Spark, Terraform, and/or AWS cloud services.

    Salary and Benefits
    - Expected salary range is between $100,000 - $140,000.
    - An equity package relative to the candidate's skills and experience.
    - Unlimited vacation policy.
    - A healthcare stipend is available; full healthcare benefits will be available in 2026.
    $100k-140k yearly 2d ago
  • IAM Engineer

    The Judge Group (4.7 company rating)

    Data engineer job in Thief River Falls, MN

    Key Responsibilities
    - Design and implement IAM solutions, including SSO, MFA, and RBAC.
    - Manage and maintain IAM systems for high availability and security.
    - Develop and enforce IAM policies and best practices.
    - Integrate IAM systems with applications, infrastructure, and cloud services.
    - Conduct security assessments and audits of IAM processes.
    - Lead user provisioning, de-provisioning, and access certification processes.
    - Troubleshoot complex IAM issues and provide technical support.
    - Collaborate with IT, security, and business teams to define IAM requirements.
    - Mentor junior engineers and share best practices.
    - Stay updated on IAM trends and emerging technologies.

    Required Qualifications
    - Experience: 6-8 years in IAM with strong architectural knowledge.
    - Expertise in Single Sign-On (OAuth) and IAM tools such as Ping Identity, Okta, CyberArk (PAM), Active Directory, Microsoft Entra, and Delinea.
    - Strong understanding of IAM technologies and their functionality.
    - Excellent communication and presentation skills for technical and non-technical audiences.
    $68k-88k yearly est. 2d ago
  • Senior Software Engineer

    Robert Half (4.5 company rating)

    Data engineer job in New Brighton, MN

    We are seeking a skilled Power Platform Developer to design, develop, and implement solutions using Microsoft Power Platform tools, including Power Apps, Power Automate, Power BI, and Dataverse. The ideal candidate will collaborate with business stakeholders to automate processes, build custom applications, and deliver data-driven insights that enhance operational efficiency.

    Key Responsibilities
    - Develop and maintain custom applications using Power Apps.
    - Automate workflows and integrate systems using Power Automate.
    - Create interactive dashboards and reports with Power BI.
    - Work with Dataverse and other data sources to manage and model data.
    - Collaborate with cross-functional teams to gather requirements and deliver scalable solutions.
    - Ensure solutions are secure, compliant, and aligned with best practices.

    Qualifications
    - Proven experience with Microsoft Power Platform.
    - Strong understanding of data modeling, connectors, and integration techniques.
    - Familiarity with Microsoft 365, SharePoint, and Azure services.
    - Excellent problem-solving and communication skills.
    $89k-116k yearly est. 2d ago
  • Senior Software Engineer

    Tempworks Software, Inc. (3.6 company rating)

    Data engineer job in Bloomington, MN

    At TempWorks, the Senior Software Engineer is responsible for creating software that delights our customers and users in a way that is also easily maintainable. The Senior Software Engineer is responsible for leading the design, development, and implementation of software solutions. You will collaborate closely with cross-functional teams to understand requirements, design scalable architectures, and deliver robust, efficient software products. General Responsibilities: Research, design, implement, and maintain software features through ongoing feature development, refactoring, and by addressing bugs. Build highly performant, fault tolerant, high-quality, scalable software. Actively seek to learn and improve the company, department, team, and themselves. Develop intuitive software that meets the needs of the company and our customers. Leverage technical knowledge, skills, and experience to improve department processes and software quality. Write quality unit and integration tests. Analyze and test programs and products before formal launch. Contribute and adhere to best practices in software development. Participate in agile development processes, including sprint planning, daily stand-ups, and retrospectives. Communicate with and train stakeholders on completed work for documentation, customer training, troubleshooting, and quality. Provide mentoring for other Software Engineers. Perform code reviews and provide constructive feedback. Stay up to date with emerging technologies and trends in software development and recommend new tools and techniques to improve efficiency and productivity. Participate in architectural discussions and contribute to the continuous improvement of development processes and methodologies. Participate in educational opportunities like online course materials, professional publications, conferences, meet-ups, etc. Performs other related duties as assigned. 
Additional Required Skills and Abilities: Excellent verbal and written communication skills. Excellent interpersonal and customer service skills. Strong architectural and design skills, with the ability to architect complex systems and make informed technical decisions. Analytical and creative problem solving. High level of organization and attention to detail. Ability to work independently. Education and Experience: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 5+ years of relevant experience developing enterprise-scale, web-based software applications. 4+ years of C# experience. 2+ years of Microsoft SQL database experience required (4+ preferred). 4+ years' experience developing applications using RESTful APIs. 4+ years' experience developing REST API-driven applications using C# .NET framework and/or ASP.NET. Expertise in front-end technologies such as HTML, CSS, JavaScript, and modern JavaScript frameworks (e.g., React, Angular, Vue.js), React preferred. Experience with version control systems (e.g., Git) to manage source code and facilitate collaboration within the development team. Experience with testing and mocking frameworks (e.g., MSTest, NUnit, XUnit, Moq). Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and DevOps practices. Azure preferred. Experience with CI/CD, preferably Azure YAML pipelines. Experience with static and dynamic code analysis tools (e.g., SonarQube, Veracode, ReSharper). Experience with one or more of the following required: Domain Driven Design, event-based architecture, distributed systems, microservices, clean architecture, 12-factor App. Physical Requirements: Prolonged periods sitting at a desk and working on a computer. Must be able to lift up to 10 pounds at times.
    $84k-107k yearly est. 2d ago
  • Principal Software Engineer

    Gentis Solutions 3.8 company rating

    Data engineer job in Eden Prairie, MN

    Job Title: Principal Software Engineer Work Style: Full-time onsite (some flexibility on Fridays) Salary: $120,000 - $145,000 per year (no bonus or additional compensation currently) Projected Total Compensation: $120,000 - $145,000 annually Start: ASAP Duration: Full-time / Direct Hire Interview Process: Round 1: 30-minute phone screen with hiring manager Round 2: Onsite interview with engineering team About the Role (Summary of project) Gentis Solutions is seeking a Principal Software Engineer to design, develop, and customize Linux board support packages (BSPs), focusing primarily on bootloaders (U-Boot) and Linux kernel development for Yocto and Buildroot-based distributions. This role is not an IT or application development position; it is deeply embedded, system-level engineering, supporting processor platforms, device drivers, bare-metal systems, RTOS environments, and board bring-up. The Principal Software Engineer will provide technical leadership, mentor other engineers, and collaborate cross-functionally to deliver cutting-edge embedded solutions across multiple processor architectures. What You'll Do (Job Description): Technical Leadership & Architecture Translate product requirements into scalable, implementable system architectures. Provide day-to-day mentorship and technical leadership to design engineers. Lead multi-discipline engineering projects and occasionally manage customer project deliverables. Embedded Software Development Develop software for 32-bit and 64-bit processor platforms. Build and customize bootloaders (U-Boot) and Linux kernel components. Develop software for bare metal, RTOS, Linux, Android, and QNX platforms. Design and implement device drivers for USB, Video, Audio, Ethernet, CAN, NAND/NOR flash, DDR/SDRAM, HDMI, PCIe, SPI, I2C, etc. Develop software for wireless technologies: Wi-Fi, Bluetooth, 802.11, GPS, cellular.
System Debug & Hardware Integration Support hardware and electrical engineering teams with board bring-up, debugging, and validation. Read and interpret complex electrical schematics and datasheets. Utilize oscilloscopes, JTAG debuggers, spectrum analyzers, and related tools. Documentation & Project Execution Prepare verification test plans, development plans, software specifications, and requirements documents. Complete projects within budget and timeline requirements. Communicate technical details and project status across internal and external stakeholders. Engage with external technical communities through writing or speaking engagements. What We're Looking For (Must Haves): Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or similar. 7-12+ years of embedded software development experience (flexible; right fit prioritized). Strong experience with embedded processor platforms (ARM, PowerPC, MSP430, PIC32, x86 preferred). Expertise with embedded Linux, device drivers, BSPs, bootloaders, Yocto, Buildroot. Experience with bare-metal development, RTOS platforms, and low-level system programming. Strong understanding of CPU internals (caches, MMU, interrupts, DMA, power states). Experience working with cross-functional engineering teams on product design. Ability to write detailed technical documentation and proposals. Hands-on experience with Ethernet, USB, I2C, CAN, Flash, SPI, and other embedded peripherals. Strong communication skills; able to present to leadership and engineering groups. Experience with Agile/Scrum development environments. Preferred (Nice-to-Have Skills): Experience managing offshore engineering teams or partner organizations. Experience working on wireless technologies like Bluetooth, Wi-Fi, GPS, cellular. Familiarity with TCP/IP networking, routing protocols, and similar technologies. Experience using oscilloscopes, JTAG tools, and system debuggers.
Experience contributing to technical blogs, conferences, or community events.
    $120k-145k yearly 2d ago
  • Sr Boomi Developer

    Vista Applied Solutions Group Inc. 4.0 company rating

    Data engineer job in Kenosha, WI

    Responsibilities: Design and Architect Solutions: Bringing deep knowledge to design stable, reliable, and scalable integration solutions using the Dell Boomi AtomSphere platform and its components (Integration, API Management, MDM, etc.). Hands-on Development: Designing, developing, and implementing complex integration processes, workflows, and APIs (REST/SOAP) to connect various applications (on-premises and cloud-based), ERP systems (like Microsoft Dynamics, Oracle EBS, SAP), and other data sources. Data Transformation: Proficiently handling various data formats such as XML, JSON, CSV, and database formats, and using Boomi's capabilities and scripting languages (like Groovy or JavaScript) for complex data mapping and transformations. Dell Boomi Platform Knowledge: Proficiency in Dell Boomi is crucial, including familiarity with Boomi components such as connectors, processes, maps, and APIs, and an understanding of how to design, build, and deploy integrations using Boomi. API Development: Strong knowledge of RESTful and SOAP APIs; creating, consuming, and managing APIs within Boomi. Working with team members and business users to understand project requirements and deliver successful design, implementation, and post-implementation support. Working closely with team members to translate business requirements into feasible and efficient technical solutions. Develop and maintain documentation for integration and testing processes. Be highly accurate in activity assessment, effort estimation, and delivery commitments to ensure all project activities are delivered on time without compromising quality. Diagnose complex technical issues and provide recommendations on solutions with consideration of best practices and longer-term impacts of decisions. Lead/Perform third-party testing, performance testing, and UAT coordination. Selecting the appropriate development platform(s) to execute business requirements and ensure post-implementation success.
Serve as technical lead on projects to design, develop, test, document and deploy robust integration solutions. Working both independently and as part of a team; collaborating closely with other IT and non-IT team members. Assessing and troubleshooting production issues with a varying degree of priority and complexity. Optimizing existing and developing new integration solutions to support business requirements. Providing continuous support and management of the integration layer, ensuring the integrity of our data and integrations and removing single points of failure. Good knowledge of best practices in error handling, logging, and monitoring. Documenting and cross-training team members for support continuity. Qualifications: 10-15 years of experience with enterprise integration platforms. Bachelor's degree in Computer Science. Troubleshooting Skills: Be adept at diagnosing and resolving integration issues. Familiarity with Boomi's debugging tools is valuable. Security Awareness: Knowledge of authentication methods, encryption, and secure data transmission. Experience and proven track record of implementing integration projects. Extensible Stylesheet Language Transformations (XSLT) experience is a plus. Project Management experience is a plus. Experience with ERP systems within a fast-moving wholesale, retail, and e-commerce environment is highly desirable. Experience implementing Boomi with the Microsoft Dynamics ERP system is a plus. Strong communication and ability to work cross-functionally in a fast-paced environment.
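The data-transformation work described above, mapping between XML, JSON, CSV, and database formats, is easiest to see in miniature. The sketch below is illustrative only: Boomi map steps normally use its visual mapper plus Groovy or JavaScript scripting, and every field name here is invented rather than taken from any real integration.

```python
import csv
import io
import json

# Hypothetical source payload, standing in for an order arriving from an ERP system.
source = json.loads("""
{
  "OrderID": "SO-1001",
  "Customer": {"Name": "Acme Corp", "Region": "Midwest"},
  "Lines": [
    {"Sku": "A-100", "Qty": 2, "UnitPrice": 25.0},
    {"Sku": "B-200", "Qty": 1, "UnitPrice": 99.5}
  ]
}
""")

def map_order(order):
    """Flatten a nested order into one row per line item, like a map step."""
    for line in order["Lines"]:
        yield {
            "order_id": order["OrderID"],
            "customer": order["Customer"]["Name"],
            "region": order["Customer"]["Region"],
            "sku": line["Sku"],
            "amount": round(line["Qty"] * line["UnitPrice"], 2),
        }

# Serialize the mapped rows as CSV, the target format in this sketch.
rows = list(map_order(source))
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

A real integration would also need error handling and dead-letter routing for malformed records, which is where the error-handling, logging, and monitoring best practices listed in the posting come in.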
    $82k-106k yearly est. 1d ago
  • Senior Application Developer - OneStream

    Bestinfo Systems LLC

    Data engineer job in Wayzata, MN

    Senior Application Developer - OneStream | Wayzata, MN | Full-Time (FTE) | Direct Hire. Base Salary: $103,393 to $148,700 + best-in-class benefits. Qualifications: *Minimum requirement of 4 years of relevant work experience. Typically reflects 5 years or more of relevant experience. Preferred Qualifications: *Proficient in .Net, C# *Strong previous experience with finance applications *Has the desire to learn Finance processes and gain solution expertise *Previous experience with OneStream, Hyperion or other corporate consolidation and planning tools *Knowledge of financial close and consolidation processes *Knowledge of financial planning and analysis *VB.Net and/or C# experience for business rules Skills and Certifications: *OneStream Candidate Details: *Seniority Level - Mid-Senior *Minimum Education - Bachelor's Degree
    $103.4k-148.7k yearly 2d ago
  • Senior Software Engineer - Payments

    AccuLynx 3.4 company rating

    Data engineer job in Beloit, WI

    **Please only apply if you live in one of the following states: Wisconsin, Illinois, Michigan, Texas, Colorado, Florida, Missouri, Pennsylvania, Maryland, Arkansas** AccuLynx is a rapidly growing SaaS provider of CRM and project management software for roofing contractors. With over 15 years of experience and impressive year-over-year revenue growth, we have quickly established ourselves as the leading software product in this multi-billion-dollar industry. AccuLynx is actively seeking an innovative and passionate Senior Software Engineer - Payments to lead the next phase of our payments platform development. You will design and expand systems that integrate with payment processors, gateways, and financial service APIs, as well as our subscription billing and sales tools. This will enable contractors to collect, disburse, and reconcile payments directly through AccuLynx, while also allowing the business to collect subscription and expansion revenue from customers. What You Will Do: Lead the technical direction of projects from conception to deployment Architect and design scalable and robust software systems Contribute to team output by writing clean, efficient, and maintainable code Review code, enforce standards, and mentor team members Collaborate with product managers and designers to define technical requirements Drive agile ceremonies Utilize an Agile process to experiment and refine software development practices at AccuLynx. Ensure on-time delivery of features with high quality and performance Identify and resolve technical issues and bottlenecks Lead technical direction for building integrations with modern payment providers (Worldpay, Stripe, Adyen, etc.). Architect systems for card payments, ACH, digital wallets, surcharges, and refunds. Ensure PCI compliance, tokenization, and end-to-end encryption in payment flows. Design scalable solutions for real-time payment processing and settlement reconciliation. 
Collaborate with Product and Legal on compliance (e.g., Reg E, Reg Z, NACHA, AML/KYC). Drive the implementation of dispute management workflows, chargeback processes, and fraud detection tools. Mentor developers on payment APIs, financial protocols, and secure coding practices. Partner with Data/Finance to ensure accurate payment reporting, settlement batching, and reconciliation. Stay up to date with emerging fintech trends and provider APIs. Your Qualifications: 10+ years of professional software development experience, including 3+ years in a lead role. Proven experience integrating with payment gateways, processors, or fintech APIs. Strong proficiency in C#, .NET Core, SQL Server, REST & gRPC APIs. Deep understanding of the payment lifecycle (authorization, capture, settlement, refunds, chargebacks). Experience with tokenization, PCI compliance, encryption standards, OAuth flows. Strong architectural skills in high-volume, high-availability financial systems. Excellent communication and mentoring skills; ability to collaborate with technical and non-technical stakeholders. Bonus Points If You Have: Knowledge of banking APIs, ACH/NACHA protocols, and card network rules. Familiarity with disbursement systems, earned wage access, or embedded finance. Experience with real-time risk scoring or fraud detection models. Contributions to fintech/payment-related open-source projects. Why We Love AccuLynx: AccuLynx's success as the #1 business management software for roofing contractors over the past 11 years is thanks to our investing in our employees, maintaining company values, and focusing on a strong company culture. Our positive work environment has enabled us to retain employees who have been with us since the company's inception, providing the solid foundation for developing an industry-leading product that consistently exceeds our customers' expectations.
Because of our commitment to our company values and culture, we were recently officially recognized as a Great Place to Work Certified™ organization, with 90% of our employees naming AccuLynx as a great place to work. We're proud to be regularly recognized for our achievements in software, products, and company culture. Our team's shared belief in AccuLynx's mission promotes a culture of collaboration, innovation, and fun. We have built a benefits program to match the strength of our team. This program includes: Attractive compensation packages Flexible paid time off - 3 weeks off in your first year! Competitive health coverage (medical, dental, vision) 401K matching and safe harbor contributions AccuLynx is an Equal Opportunity Employer committed to inclusion and employing a diverse workforce. All applicants will receive consideration without regard to race, color, religion, sex, national origin, age, sexual orientation, gender identity, gender expression, veteran status, disability, or other legally protected characteristics.
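The payment lifecycle named in the qualifications (authorization, capture, settlement, refunds, chargebacks) is at heart a state machine. The sketch below is a simplified illustration under stated assumptions: real processors add states such as partial captures, voids, and representments, and the transition table is invented rather than drawn from AccuLynx's actual system.

```python
# Simplified card-payment lifecycle as a transition table (illustrative only).
TRANSITIONS = {
    "created":    {"authorized", "declined"},
    "authorized": {"captured", "voided"},
    "captured":   {"settled"},
    "settled":    {"refunded", "chargeback"},
}

class Payment:
    """Tracks one payment's state and rejects illegal transitions."""

    def __init__(self):
        self.state = "created"

    def advance(self, new_state):
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"illegal transition: {self.state} -> {new_state}")
        self.state = new_state
        return self.state

# A refund must walk the full authorize/capture/settle path first.
payment = Payment()
for step in ("authorized", "captured", "settled", "refunded"):
    payment.advance(step)
print(payment.state)
```

Encoding the legal transitions explicitly is one common way to keep settlement reconciliation honest: in this sketch, a refund or chargeback can only ever be recorded against a settled payment.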
    $87k-113k yearly est. 4d ago
  • Marketing Data Scientist

    Polaris 4.5 company rating

    Data engineer job in Plymouth, MN

    At Polaris Inc., we have fun doing what we love by driving change and innovation. We empower employees to take on challenging assignments and roles with an elevated level of responsibility in our agile working environment. Our people make us who we are, and we create incredible products and experiences that empower us to THINK OUTSIDE. JOB SUMMARY: The Data Scientist role is responsible for using their skillset to create recommendations for complex business decisions for Marketing at Polaris using a wide variety of data science techniques - from exploratory data analysis to predictive analytics. In addition to advanced analytics skills, the ideal candidate is proficient at integrating and preparing large and varied datasets, designing and implementing production algorithm scoring, and communicating results effectively to both technical and non-technical audiences. Your focus will be on Marketing Analytics across Polaris, optimizing decisions related to marketing personalization, channel performance and media investment. You are primarily a data scientist but are passionate about developing advanced end-to-end analytics and technology solutions that drive value for the business and customer. Our Data Scientists are instrumental in using data to help drive Polaris forward as we pioneer product breakthroughs and enriching experiences that help people work and play outside. 
ESSENTIAL DUTIES & RESPONSIBILITIES: Acquire a deep understanding of marketing and business problems facing Polaris and develop end-to-end data-driven solutions Extract, cleanse, and combine large datasets from multiple sources and systems Perform exploratory and targeted analyses, with a wide variety of statistical methods including clustering, regression, decision tree/random forest, time series, neural network, and others Collaborate cross-functionally to arrive at actionable insights Synthesize results with business input to drive measurable change and effectively communicate technical analyses and results to business management Integrate and productionize model results into both cloud and edge compute hardware platforms Ensure quality of data & solutions throughout development process Utilize existing templates to diligently write up findings and results for storage in a central location Collect feedback from business users to continuously improve data science products Manage suite of customer, personalization, and marketing algorithms and apply to business problems Design marketing tests and apply measurement best practices Leverage knowledge of marketing operations to identify opportunities for new marketing algorithms and new variables to introduce into existing algorithms Develop new in-platform marketing measurement tools with Meta, The Trade Desk, and Google Support development of innovative Marketing capabilities that leverage technology advancements by synthesizing complex datasets and providing predictive algorithms Support the Marketing team with statistical consulting and apply it to test results and marketing performance SKILLS & KNOWLEDGE Bachelor's degree in Applied Statistics/Mathematics, Data or Computer Science, or equivalent work experience.
Advanced degree in a quantitative field preferred 2+ years of professional work experience (post schooling & internships) in a predictive analytics-focused role Strong understanding of statistical methods and predictive/analytical modeling techniques and practices Strong experience with one or more data mining/predictive modeling tools: SQL, R, Python, Snowflake Proficient with one or more data visualization tools: Tableau, Power BI Experience with data preparation, rationalization, and processing in a cloud environment - Azure a plus Ability to effectively foster cross-functional relationships across various teams (e.g., data & analytics teams, sales, supply chain, etc.) and develop a good view of how the business operates so that analytical results can be framed in a business context. Excellent attention to detail and strong organizational skills to manage multiple work requests and projects. Ability to take on diverse ad-hoc data science requests involving multiple factors that have various scope/goals. High energy and results-oriented, looking to make improvements and drive change. Strong verbal and written communication skills WORKING CONDITIONS Office environment Limited travel may be required This position is not eligible for sponsorship. The starting pay range for Minnesota is $85,000 to $112,000 per year. Individual salaries and positioning within the range are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills, and geography. While individual pay could fall anywhere in the range based on these factors, it is not common to start at the high end or top of the range. To qualify for this position, former employees must be eligible for rehire, and current employees must be in good standing. We are an ambitious, resourceful, and driven workforce, which empowers us to THINK OUTSIDE. Apply today!
At Polaris we put our employees first by offering a holistic approach to their health and financial wellbeing. Polaris is proud to offer competitive compensation, including a market-leading profit-sharing plan that is fundamental to our pay-for-performance culture. At Polaris, employees are owners of the company through company contributions to our Employee Stock Ownership Plan and discounted employee stock purchase plan. Employees receive a generous matching contribution to 401(k), financial wellness education and consultation to plan for their financial future. In addition to competitive pay, Polaris provides a comprehensive suite of benefits, including health, dental, and vision insurance, wellness programs, paid time off, gym & personal training reimbursement, life insurance and disability offerings. Through the Polaris Foundation and our Polaris Gives paid volunteer time off, we support employees who actively volunteer their time, efforts, and passions to improve the health and wellbeing of the communities in which they live, play and work. Employees at Polaris drive our success and are rewarded for their commitment. About Polaris As the global leader in powersports, Polaris Inc. (NYSE: PII) pioneers product breakthroughs and enriching experiences and services that have invited people to discover the joy of being outdoors since our founding in 1954. Polaris' high-quality product line-up includes the Polaris RANGER, RZR, and Polaris GENERAL™ side-by-side off-road vehicles; Sportsman all-terrain off-road vehicles; military and commercial off-road vehicles; snowmobiles; Indian Motorcycle mid-size and heavyweight motorcycles; Slingshot moto-roadsters; Aixam quadricycles; Goupil electric vehicles; and pontoon and deck boats, including industry-leading Bennington pontoons. Polaris enhances the riding experience with a robust portfolio of parts, garments, and accessories. Proudly headquartered in Minnesota, Polaris serves more than 100 countries across the globe.
*************** EEO Statement Polaris Inc. is an Equal Opportunity Employer and will make all employment-related decisions without regard to race, color, religion, creed, sex, sexual orientation, gender identity, national origin, age, disability, marital status, familial status, status with regard to public assistance, membership or activity in a local commission, protected veteran status, or any other status protected by applicable law. Applicants with a disability that are in need of an accommodation to complete the application process, or otherwise need assistance or an accommodation in the recruiting process, should contact Human Resources at ************ or ****************************** . To read more about employment discrimination protection under U.S. federal law, see: Know Your Rights: Workplace Discrimination is Illegal (eeoc.gov) .
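As a toy illustration of the regression techniques listed in the duties, an ordinary least-squares line can be fit from the closed-form normal equations alone. The data below is synthetic and implies nothing about Polaris's marketing datasets.

```python
# Fit y = a + b*x by ordinary least squares on a small synthetic dataset.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.1, 5.9, 8.2, 9.9]  # roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form slope and intercept from the normal equations.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
a = mean_y - b * mean_x
print(f"intercept={a:.3f} slope={b:.3f}")
```

In practice this is one call to scikit-learn or statsmodels; the point is only that the estimate is a transparent function of the data, which helps when communicating results to non-technical marketing stakeholders, as the role requires.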
    $85k-112k yearly 60d+ ago
  • Alation Data Governance Consultant

    Tata Consultancy Services 4.3 company rating

    Data engineer job in Minneapolis, MN

    Must Have Technical/Functional Skills * Hands-on experience with Alation or similar tools (e.g., Collibra, OvalEdge). * Strong understanding of metadata management, data lineage, and data quality principles. * Familiarity with cloud platforms (AWS, GCP, Azure) and modern data stacks. * Proficiency in Agile methodologies and tools (e.g., Jira, Confluence). Roles & Responsibilities We are seeking a strategic and technically adept Product Owner to lead initiatives involving the Alation Data Catalog. This role will be responsible for defining and executing the product vision, roadmap, and delivery of data catalog capabilities that enhance data discovery, governance, and literacy across the organization. Job Responsibilities: * Define and maintain the product roadmap for Alation-based data catalog initiatives. * Align catalog capabilities with enterprise data strategy and business goals. * Collaborate with data stewards, analysts, engineers, and business users to gather requirements and feedback. * Act as the primary liaison between business and technical teams. * Create and manage a prioritized product backlog. * Write clear user stories and acceptance criteria for catalog features and enhancements. * Drive adoption of metadata standards, data lineage, and stewardship practices using Alation. * Ensure catalog content is accurate, up-to-date, and aligned with governance policies. * Promote data literacy and catalog usage through training, documentation, and internal advocacy. * Define KPIs to measure catalog adoption and data quality improvements. * Work closely with data engineers and architects to integrate Alation with data sources (e.g., Snowflake, AWS, data pipelines built with AWS Glue). * Support automation of metadata ingestion and policy enforcement. * 6+ years of experience in data management, data governance, or product ownership. * Excellent communication and stakeholder management skills.
* Identify cost-effective opportunities for improvement and recommend solutions to meet customer and user requirements, where appropriate. * Engage with the Automation SME Lead, looking for opportunities for team development in a manner consistent with company policy and procedures. * Develop internal team processes to ensure the integrity and quality of the service provision is maintained. Where possible, look for and propose opportunities for improvement. * Escalate and manage situations arising from customer issues to Manager. Preferred Qualifications: * Experience with SQL and data modeling. * Knowledge of data privacy regulations (e.g., GDPR, CCPA). * A background in financial services, asset management, or regulated industries is a plus. Salary Range: $120,000-$135,000 a year
    $120k-135k yearly 28d ago
  • Consultant, Quality Improvement & Data Management

    HealthPartners 4.2 company rating

    Data engineer job in Hutchinson, MN

    Hutchinson Health is seeking a skilled Quality Improvement & Data Management Consultant to lead moderate to complex projects aimed at enhancing performance and supporting regional and departmental strategic goals. In this role, you will provide expertise in quality improvement methods, data analysis, change management, and team facilitation within HealthPartners, primarily focusing on Hutchinson Health and Olivia Hospital and Clinics. The ideal candidate will have a Bachelor's degree in a relevant field, at least 3 years of healthcare quality improvement experience, and proficiency in Lean, Six Sigma, and PDSA methodologies. To be successful in this role, qualified individuals will possess strong leadership, multi-tasking, technology, and self-starting skills. Join us in driving continuous improvement and delivering high-quality care to the Central MN community. This position will be on-site primarily at Hutchinson Health and Olivia Hospital and Clinics, but will also include time at other HealthPartners locations depending on need. Job Summary: Provides quality improvement and data expertise acting as a consultant in performance improvement methods, systems thinking, change management, team facilitation, and data collection and analysis. Manages all aspects of mid-sized projects in support of regional or departmental strategic goals. Provides expertise and facilitates development of standardized approaches to create performance improvement plans, define appropriate tools, methodologies and metrics, analyze and interpret data, manage change and facilitate improvement teams. Mentors and coaches individuals and teams in improvement methods, project management, change management, group dynamics and planning methods. Actively partners with leaders to select and implement solutions and develop appropriate monitors and control plans to ensure implementation and hardwiring of improvement/change.
Creates and presents project status updates to senior leadership. Identifies and removes barriers to project success or escalates to leadership when appropriate. Essential Duties and Responsibilities: Acts as quality consultant, project manager, and facilitator for mid-sized to complex projects that support the organization's mission, vision and strategic priorities. Develops and supports a standardized performance improvement approach to influence the overall Central MN Performance Improvement culture. Identifies and develops recommendations and material for educational and communication needs in the Quality Performance Improvement department and throughout the Central MN Region. Establishes appropriate measurement and data monitoring approach to achieve desired results. Supports local leaders in the identification of data sources/appropriate reports, including serving as a liaison to the HealthPartners system data teams when new report builds are required to evaluate a local improvement initiative. Prepares charts, tables, and diagrams to assist others in conducting second-level analysis and/or in problem-solving. Partners with the Quality Director and other leaders to design reports and scorecards for local leaders/committees. Assists to ensure that any quality metrics required by accrediting/regulatory bodies (i.e., Joint Commission) are available to appropriate stakeholders. Performs all other related duties as assigned. Accountabilities for All Employees: Adheres to the Hutchinson Health Employee Values. Maintains confidentiality of the organization and patients. Reports any health/medical errors. Observes all Environment of Care policies and reports safety risks or hazards immediately. Education, Training or Degree Required: Bachelor's degree required (BA/BS), preferably in business, nursing, operations management, industrial engineering, health care, statistics or related disciplines.
3 years of clinical or quality improvement experience in the healthcare industry; Master's-level coursework may substitute for years of experience. Previous project management/quality improvement/data management experience. License/Registration/Certification: (will be primary source verified by Human Resources) Green Belt certification, Lean or Six Sigma training and certification, or similar preferred Experience and Skills: (indicate preferred or required) Required: Demonstrated experience in quality improvement methods (Lean, Six Sigma, and PDSA (Plan, Do, Study, Act) processes, A3 thinking), measurement definition and analysis, team facilitation and project management. Proficiency with Microsoft Office applications including Excel, Word, and PowerPoint and various project management tools to include flow charting. Knowledge of The Joint Commission (TJC) and Centers for Medicare & Medicaid Services (CMS) standards. Exceptional organizational capabilities and prioritization skills. Proficient in preparing, leading, and facilitating meetings, and in bringing teams to decisions in improvement sessions and/or workgroups. Proficient in tracking and reporting project or initiative progress. Strong change management, interpersonal communication, and negotiation/conflict management skills. Preferred: System thinking/Change management coursework or experience Experience working in a matrix organization Experience with Epic Previous experience in a licensed clinical position helpful Date created: 10/07/2025 DR/KM
    $82k-102k yearly est. 55d ago

Learn more about data engineer jobs

How much does a data engineer earn in Duluth, MN?

The average data engineer in Duluth, MN earns between $67,000 and $112,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Duluth, MN

$86,000