
Data engineer jobs in Duluth, MN

- 1,745 jobs
  • Data Engineer

    On-Demand Group (4.3 company rating)

    Data engineer job in Minneapolis, MN

    Job Title: Data Engineer
    Job Type: Contract to Hire (USC or GC holders only; no sponsorship)
    Must-have requirements: GCP, SQL, Python, Airflow; a system-design mindset; communication — the ability to vocalize what they are doing and how they are achieving their work. Accents are not an issue as long as they are comprehensible. Healthcare experience is not required, but is a nice-to-have.
    Location: Onsite at any of four offices — the focus is Minneapolis, MN; Arlington, VA; Portland, OR; Raleigh, NC. 100% onsite to start, then switching to a 2-3x/week hybrid schedule if they do well.

    Job Summary: The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and other engineers to ensure timely access to high-quality data for data-driven decision-making across the organization. The Senior Cloud Data Engineer is a highly technical person who has mastered hands-on coding in data processing solutions and scalable data pipelines to support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that enable data integrity and quality, and best support the BI and analytic capability needs that power decision-making. This includes building data acquisition programs that handle the business's growing data volume as part of the Data Lake in the GCP BigQuery ecosystem, and maintaining a robust data catalog. This is a senior data engineering role within Data & Analytics' Data Core organization, working closely with leaders of Data & Analytics. The incumbent will continually improve the business's data and analytic solutions, processes, and data engineering capabilities, embracing industry best practices and trends and driving process and system improvement opportunities through acquired knowledge.

    Responsibilities:
    • Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading (a minimal orchestration sketch follows this listing).
    • Optimize data pipelines for performance, scalability, and cost-efficiency.
    • Ensure data quality through data cleansing, validation, and monitoring processes.
    • Develop and maintain data models and schemas in BigQuery to support various data analysis needs.
    • Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.
    • Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.
    • Leverage DevOps Terraform (IaC) to ensure seamless integration of data pipelines with CI/CD workflows.
    • Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.
    • Stay up to date with the latest advancements in GCP BigQuery and other related technologies.
    • Document data pipelines and technical processes for future reference and knowledge sharing.

    Basic Requirements:
    • Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
    • 5+ years of solid experience as a data engineer.
    • Strong understanding of data warehousing / data lake concepts and data modeling principles.
    • Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.
    • Strong SQL and scripting-language (Python or similar) skills.
    • Experience with data quality tools and techniques.
    • Ability to work independently and as part of a team.
    • Strong problem-solving and analytical skills.
    • Passion for data and a desire to learn and adapt to new technologies.
    • Experience with other GCP services like Cloud Storage, Dataflow, and Pub/Sub.
    • Experience with cloud deployment and automation tools like Terraform.
    • Experience with data visualization tools like Tableau, Power BI, or Looker.
    • Experience with healthcare data.
    • Familiarity with machine learning, artificial intelligence, and data science concepts.
    • Experience with data governance and healthcare PHI data security best practices.
    • Ability to work independently on tasks and projects to deliver data engineering solutions.
    • Ability to communicate effectively and convey complex technical concepts as well as task/project updates.

    The projected hourly range for this position is $78 to $89. On-Demand Group (ODG) provides employee benefits including healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
    $78-89 hourly 4d ago
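Since this listing centers on GCP BigQuery, Dataflow, and Airflow, here is a minimal sketch of the kind of orchestration it describes: an Airflow DAG that loads Cloud Storage files into BigQuery and then runs a SQL transform. This is illustrative only — the DAG name, bucket, dataset, and table names are invented, not taken from the posting.

```python
# Minimal GCS -> BigQuery ingestion DAG of the kind this role describes.
# All resource names (bucket, dataset, tables) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="claims_daily_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land raw files from a Cloud Storage bucket into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_claims",
        bucket="example-landing-bucket",
        source_objects=["claims/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.staging.claims_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staged data into a curated table with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_claims",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.curated.claims AS
                    SELECT claim_id, member_id, CAST(amount AS NUMERIC) AS amount
                    FROM analytics.staging.claims_raw
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

In practice the transform SQL would live in version control and the curated table would be registered in the data catalog, in line with the documentation and governance duties the posting lists.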
  • Senior Data Platform Engineer (28702)

    Dahl Consulting (4.4 company rating)

    Data engineer job in Minnetonka, MN

    Title: Senior Data Platform Engineer - Oracle/Snowflake/Azure
    Job Type: Contract-to-Hire (6 months). All candidates must be interested in and eligible for conversion without sponsorship.
    Industry: Health Insurance
    Pay range: $65 to $78/hour
    Key Technologies: Oracle, Snowflake, Azure Cloud, MS SQL

    About the Role: We are seeking a highly skilled Senior Data Platform Engineer to join a leading healthcare organization headquartered in Minnetonka, MN. This role focuses on designing, implementing, and maintaining both legacy and modern data platforms that support enterprise operations. You will collaborate with experienced engineers and architects to optimize databases, develop data pipelines, and drive cloud integration initiatives. This position is ideal for a seasoned professional who thrives on solving complex data challenges, contributing to modernization efforts, and working in a fast-paced Agile environment.

    Responsibilities:
    • Design, build, and maintain robust data pipelines across cloud and on-premises environments.
    • Administer, monitor, and optimize databases including Oracle, Snowflake, Azure SQL, and MS SQL.
    • Manage database provisioning, configuration, patching, and backup/recovery processes.
    • Collaborate with developers, analysts, and DBAs to troubleshoot issues and optimize queries.
    • Support data migration and integration efforts as part of cloud transformation initiatives.
    • Ensure database security, access controls, and compliance with internal standards.
    • Contribute to documentation, runbooks, and knowledge sharing within the team.
    • Participate in Agile ceremonies and planning activities, fostering a culture of shared ownership and continuous improvement.
    • Join an on-call rotation to support 24/7 database operations and incident response.

    Required Qualifications:
    • 7+ years of experience in database engineering or a related technical role.
    • Hands-on experience with at least one of the following: Oracle, Snowflake, or Azure SQL Database.
    • Solid knowledge of cloud platforms (Azure preferred) and cloud-native data services.
    • Strong understanding of system performance tuning and query optimization.
    • Ability to work collaboratively and communicate effectively with technical peers.

    Preferred Qualifications:
    • Experience building and maintaining data pipelines in cloud or hybrid environments.
    • Familiarity with Liquibase or other database change management tools.
    • Proficiency in scripting or automation (e.g., Ansible, Python, Terraform).
    • Experience with CI/CD pipelines or DevOps practices.
    • Knowledge of monitoring tools and observability platforms.
    • Background in Agile or SAFe environments.

    Salary range for this position is $110,400-$154,600. Annual salary range placement will depend on a variety of factors including, but not limited to, education, work experience, applicable certifications and/or licensure, the position's scope and responsibility, internal pay equity, and external market salary data.

    Benefits: Dahl Consulting is proud to offer a comprehensive benefits package to eligible employees that will allow you to choose the best coverage to meet your family's needs. For details, please review the DAHL Benefits Summary: ***********************************************
    $110.4k-154.6k yearly 12h ago
  • Associate Data Scientist

    KellyMitchell Group (4.5 company rating)

    Data engineer job in Minneapolis, MN

    This position is remote.

    Responsibilities:
    • Develop service-specific knowledge through greater exposure to peers, internal experts, clients, regular self-study, and formal training opportunities.
    • Gain exposure to a variety of program/project situations to develop business and organizational/planning skills.
    • Retain knowledge gained and performance feedback provided to transfer into future work.
    • Approach all problems and projects with a high level of professionalism, objectivity, and an open mind to new ideas and solutions.
    • Collaborate with internal teams to collect, analyze, and automate data processing.
    • Leverage AI models, including LLMs, to develop intelligent solutions that enhance data-driven decision-making processes for both internal projects and external clients.
    • Leverage machine learning methodologies, including non-linear, linear, and forecasting methods, to help build solutions aimed at better understanding the business, making the business more efficient, and planning its future.
    • Work under the guidance of a variety of Data Science team members and gain exposure to developing custom data models and algorithms to apply to data sets.
    • Gain experience with predictive and inferential analytics, machine learning, and artificial intelligence techniques.
    • Use existing processes and tools to monitor and analyze solution performance and accuracy, and communicate findings to team members and end users.
    • Contribute to automating business workflows by incorporating LLMs and other AI models to streamline processes and improve efficiency.
    • Integrate AI-driven solutions within existing systems to provide advanced predictive capabilities and actionable insights.
    • Learn to work individually as well as in collaboration with others.

    Desired Skills/Experience:
    • Bachelor's degree required; a field of Statistics, Computer Science, Economics, Analytics, or Data Science is preferred.
    • 1+ year of experience preferred.
    • Experience with APIs, web scraping, SQL/NoSQL databases, and cloud-based data solutions preferred.
    • A combination of relevant experience, education, and training may be accepted in lieu of a degree.

    Benefits:
    • Medical, Dental, & Vision Insurance Plans
    • Employee-Owned Profit Sharing (ESOP)
    • 401K offered

    The approximate pay range for this position is $90,000-$125,000. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
    $90k-125k yearly 2d ago
  • Data Engineer

    Brooksource (4.1 company rating)

    Data engineer job in Maple Grove, MN

    Data Engineer - MDP Project
    6+ Month Contract
    Maple Grove, MN - Hybrid (3 days/week onsite)

    Our client is seeking a data engineering contractor to join the team and help build and maintain the existing marketing data warehouse. This role will build DBT models in a medallion-style data mart that combines 10+ data sources and transforms them into a holistic model that allows marketing teams to derive actionable insights (a sketch of the pattern follows this listing).

    Responsibilities:
    • Collaborate with cross-functional teams to ingest, maintain, and deliver analytics data.
    • Develop and maintain data ingestion and transformation processes within Snowflake using SQL and DBT.
    • Develop datasets to support ad-hoc analytical needs and reporting.
    • Monitor data pipelines to ensure data is captured and processed accurately and on time.
    • Ensure that all processes for receiving, processing, and evaluating data are efficient, replicable, and documented.
    • Perform validation and testing of data transformations and automated jobs to ensure quality data.
    • Fulfill ad-hoc requests and data processing needs.
    • Comply with all required internal governance and data privacy requirements.

    Required Qualifications:
    • 5+ years of professional work experience with relational data models and databases.
    • Strong experience with Snowflake, DBT, and AWS.
    • Strong SQL experience focused on data transformation, cleansing, and preparation.
    • Experience documenting data processes.
    • Collaborative team player.
    • Strong ability to multi-task and balance competing priorities effectively.
    • Experience working with highly regulated industries.

    Preferred Qualifications:
    • Bachelor's degree in IT, software engineering, or related fields.
    • Healthcare data experience.
    • Marketing data experience.

    ABOUT EIGHT ELEVEN GROUP: At Eight Eleven, our business is people. Relationships are at the center of what we do. A successful partnership is only as strong as the relationship built. We're your trusted partner for IT hiring, recruiting, and staffing needs. For over 16 years, Eight Eleven has established and maintained relationships that are designed to meet your IT staffing needs. Whether it's contract, contract-to-hire, or permanent placement work, we customize our search based upon your company's unique initiatives, culture, and technologies. With our national team of recruiters placed at 21 major hubs around the nation, Eight Eleven finds the people best-suited for your business. When you work with us, we work with you. That's the Eight Eleven promise. Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.
    $80k-107k yearly est. 3d ago
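The Brooksource role above describes dbt models in a medallion-style mart on Snowflake. dbt transformations are usually plain SQL models, but dbt also supports Python models on Snowflake (executed via Snowpark), which makes the shape of a silver-to-gold rollup easy to show in one self-contained file. All model, table, and column names below are hypothetical, not drawn from the client's warehouse.

```python
# models/gold/marketing_channel_summary.py — a hypothetical dbt Python model.
# Rolls cleansed silver-layer tables up into a gold-layer mart, medallion style.
# On Snowflake, dbt.ref() returns a Snowpark DataFrame.
from snowflake.snowpark.functions import col, count, sum as sum_


def model(dbt, session):
    dbt.config(materialized="table")

    # Cleansed (silver) inputs; model names are placeholders.
    touches = dbt.ref("silver_marketing_touches")
    orders = dbt.ref("silver_orders")

    # Join on the shared key, then roll up to one row per marketing channel.
    joined = touches.join(orders, "order_id")
    return joined.group_by("channel").agg(
        count(col("order_id")).alias("orders"),
        sum_(col("revenue")).alias("revenue"),
    )
```

The same rollup would more commonly be a SQL model selecting from `{{ ref('silver_orders') }}`; the Python form is shown only to keep the whole medallion step in one runnable file.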
  • Data Engineer

    Insight Global

    Data engineer job in Eagan, MN

    Insight Global is seeking a talented Azure Data Engineer to join one of our large utility clients on-site in Eagan, Minnesota. Please find more details below; we look forward to connecting with you! Note: this client works closely with the US Government, so candidates need to be eligible to receive a Secret Clearance or higher.

    Title: Azure Data Engineer
    Client: Utilities Administration Company
    Location: Eagan, MN
    Schedule: Hybrid onsite - 4 days per week (Monday - Thursday)

    Skills Needed:
    • Ideally, 5+ years of prior data engineering experience
    • Expertise in Azure Cloud (experience with Azure Monitor is a plus)
    • Experience with Azure Data Factory, Azure Synapse, PySpark, Python, and SQL (a generic PySpark sketch follows this listing)
    • Bachelor's degree (or higher) in a related STEM discipline
    • Willingness to work in-office 4 days per week in Eagan, MN

    Compensation: $60/hour to $75/hour. Exact compensation may vary based on several factors, including skills, experience, and education. Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
    $60 hourly 2d ago
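For context on the Synapse/PySpark requirement in the Insight Global listing, this is a generic batch transform of the sort such pipelines run — a sketch under assumed storage paths and column names, not anything from the client's environment.

```python
# Generic PySpark batch transform of the kind run in Azure Synapse Spark pools.
# Storage account, container paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("meter_readings_daily").getOrCreate()

# Read raw readings landed by an ingestion job (e.g., Azure Data Factory).
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/meters/")

daily = (
    raw.filter(F.col("quality_flag") == "OK")        # drop bad reads
    .withColumn("read_date", F.to_date("read_ts"))   # normalize the timestamp
    .groupBy("meter_id", "read_date")
    .agg(F.sum("kwh").alias("kwh_total"))
)

# Write a curated daily table for downstream Synapse / Power BI consumers.
daily.write.mode("overwrite").partitionBy("read_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/meter_daily/"
)
```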
  • Senior Data Engineer

    Ken Systems, Inc.

    Data engineer job in Minneapolis, MN

    About the Company: A global leader in the alternative investment and asset management space is expanding its Data & Analytics capabilities. The firm oversees multi-billion-dollar portfolios across credit, real estate, private equity, lending, and structured investments, and maintains a high-performance, data-driven culture committed to integrity, innovation, and excellence.

    Position Overview: We are hiring a Senior Data Engineer to join the Data & Analytics team and drive the evolution of our cloud-based data ecosystem. This individual will architect scalable data pipelines, optimize cloud infrastructure, and enable advanced analytics across multiple business units. The role is based in Minneapolis and requires close collaboration with investment, operations, and technology teams.

    Key Responsibilities

    Data Engineering:
    • Design, build, and optimize reliable ETL/ELT pipelines for large-scale data ingestion and transformation.
    • Modernize and enhance the current data ecosystem to support high-quality, consistent data delivery.
    • Manage cloud-based data infrastructure, including resource deployment, configuration, and performance tuning.
    • Monitor system health, troubleshoot pipeline issues, and streamline processes for speed and efficiency.
    • Implement robust data security, governance, and privacy controls for sensitive financial data.
    • Stay updated on emerging technologies and best practices in cloud data engineering.

    Analytics & Business Enablement:
    • Develop data models, tools, and frameworks that support self-service analytics across the organization.
    • Translate business needs into data-driven solutions such as dashboards, metrics, and analytical tools.
    • Mentor analysts and help strengthen analytical maturity across the company.
    • Support commercial and custom applications through configuration, administration, and maintenance.

    Required Qualifications:
    • Bachelor's degree in a STEM discipline (Computer Science, Engineering, Math, etc.).
    • 10+ years of overall experience in data engineering or analytics roles.
    • 5+ years designing and maintaining ETL/ELT pipelines.
    • 5+ years of experience with data warehouses and analytics platforms.
    • 5+ years of strong SQL experience working with complex datasets.
    • 5+ years of experience with business intelligence tools (Looker, Power BI, Sigma, Tableau, Cognos, etc.).
    • 5+ years scripting in Python or Scala.
    • 5+ years of cloud experience, preferably AWS.
    • 5+ years working with data governance, data quality, and related tools.
    • Hands-on experience with Infrastructure as Code (Terraform, CloudFormation, ARM templates, etc.).
    • Familiarity with the alternative investments / private equity / hedge fund domain preferred.
    • Strong communication, stakeholder management, and cross-team collaboration skills.
    • Ability to thrive in a fast-paced environment with multiple competing priorities.

    Note: This role requires U.S. citizenship or permanent residency (ITAR compliance); OPT students with STEM OPT EADs are not eligible. Candidates must already live in Minneapolis or be willing to relocate prior to start. Onsite attendance 3-4 days per week is mandatory - no remote or hybrid exceptions.
    $75k-99k yearly est. 2d ago
  • Data Engineer

    FAC Services, LLC

    Data engineer job in Madison, WI

    About FAC Services: Want to build your career helping those who build the world? At FAC Services, we handle the business side so architecture, engineering, and construction firms can focus on shaping the future. Our trusted, high-quality solutions empower our partners, and our people, to achieve excellence with integrity, precision, and a personal touch.

    Job Purpose: FAC Services is investing in a modern data platform to enable trustworthy, timely, and scalable data for analytics, operations, and product experiences. The Data Engineer will design, build, and maintain core data pipelines and models for Power BI reporting, application programming interfaces (APIs), and downstream integrations. This role partners closely with Infrastructure, Quality Assurance (QA), the Database Administrator, and application teams to deliver production-grade, automated data workflows with strong reliability, governance, observability, and Infrastructure as Code (IaC) for resource orchestration.

    Primary Responsibilities:
    • Data Architecture & Modeling: Design and evolve canonical data models, marts, and lake/warehouse structures to support analytics, APIs, and applications. Establish standards for naming, partitioning, schema evolution, and Change Data Capture (CDC).
    • Pipeline Development (ETL/ELT): Build resilient, testable pipelines across Microsoft Fabric Data Factory, notebooks (Apache Spark), and Lakehouse tables for batch and streaming workloads. Design Lakehouse tables (Delta/Parquet) in OneLake. Optimize Direct Lake models for Power BI. Implement reusable ingestion and transformation frameworks emphasizing modularity, idempotency, and performance.
    • Integration & APIs: Engineer reliable data services and APIs to feed web applications, Power BI, and partner integrations. Publish consumer-facing data contracts (Swagger) and implement change notification (webhooks/eventing). Use semantic versioning for breaking changes and maintain a deprecation policy for endpoints and table schemas. Ensure secure connectivity and least-privilege access in coordination with the DBA. (A sketch of a versioned, paginated data endpoint follows this listing.)
    • Infrastructure as Code (IaC) - Resource Orchestration & Security: Author and maintain IaC modules to deploy and configure core resources. Use Bicep/ARM (and, where appropriate, Terraform/Ansible) with CI/CD to promote changes across environments.
    • DevOps, CI/CD & Testing: Own CI/CD pipelines (Git-based promotion) for data code, configurations, and infrastructure. Practice test-driven development with QA (unit, integration, regression) and embed data validations throughout pipelines; collaborate with the Data Quality Engineer to maximize coverage.
    • Observability & Reliability: Instrument pipelines and datasets for lineage, logging, metrics, and alerts; define Service Level Agreements (SLAs) for data freshness and quality. Perform performance tuning (e.g., Spark optimization, partition strategies) and cost management across cloud services.
    • Data Quality & Governance: Implement rules for deduplication, reconciliation, and anomaly detection across environments (Microsoft Fabric Lakehouse and Power BI). Contribute to standards for sensitivity labels, Role-Based Access Control (RBAC), auditability, and secure data movement aligned with Infrastructure and Security.
    • Collaboration & Leadership: Work cross-functionally with Infrastructure, QA, and application teams; mentor peers in modern data engineering practices; contribute to documentation and knowledge sharing. Hand off to the Data Quality Engineer for release gating; coordinate with the Database Administrator on backup/restore posture, access roles, High Availability / Disaster Recovery (HA/DR), and source CDC readiness.

    Qualifications: To perform this job successfully, an individual must be able to perform each primary duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.

    Experience (Required):
    • 3+ years designing and operating production ETL/ELT pipelines and data models.
    • Apache Spark (Fabric notebooks, Synapse Spark pools, or Databricks).
    • Advanced T-SQL and Python; experience with orchestration, scheduling, and dependency management.
    • Azure Event Hubs (or Kafka) for streaming; Change Data Capture (CDC).
    • Infrastructure as Code (Bicep/ARM/Terraform); CI/CD (Azure DevOps).
    • API design for data services (REST/OpenAPI), including versioning, pagination, error handling, authentication, and authorization.

    Experience (Preferred):
    • Lakehouse design patterns on Microsoft Fabric; optimization of Power BI with Direct Lake models.
    • Kusto Query Language (KQL), Eventstream, and Eventhouse familiarity.
    • Experience with lineage/metadata platforms and cost governance.
    $76k-101k yearly est. 12h ago
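The FAC Services listing calls out API design for data services with versioning, pagination, and error handling. As a minimal sketch of that pattern — not FAC's actual API; the /v1/projects resource and its fields are invented — a FastAPI service might look like this:

```python
# Sketch of a versioned, paginated read endpoint for a data service.
# The /v1/projects resource and its fields are hypothetical examples.
from fastapi import FastAPI, HTTPException, Query
from pydantic import BaseModel

app = FastAPI(title="Data Services API", version="1.0.0")

# Stand-in for a warehouse/lakehouse query layer.
_PROJECTS = [{"id": i, "name": f"project-{i}"} for i in range(1, 101)]


class Project(BaseModel):
    id: int
    name: str


class Page(BaseModel):
    items: list[Project]
    total: int
    offset: int
    limit: int


@app.get("/v1/projects", response_model=Page)
def list_projects(
    offset: int = Query(0, ge=0),
    limit: int = Query(25, ge=1, le=100),  # cap the page size
) -> Page:
    return Page(
        items=_PROJECTS[offset : offset + limit],
        total=len(_PROJECTS),
        offset=offset,
        limit=limit,
    )


@app.get("/v1/projects/{project_id}", response_model=Project)
def get_project(project_id: int) -> Project:
    for row in _PROJECTS:
        if row["id"] == project_id:
            return Project(**row)
    raise HTTPException(status_code=404, detail="project not found")
```

FastAPI generates the OpenAPI (Swagger) document automatically, which lines up with the listing's "consumer-facing data contracts" duty; the `/v1/` prefix is where the semantic-versioning and deprecation policy would apply.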
  • Senior Data Platform Engineer

    The Nycor Group

    Data engineer job in Edina, MN

    Data Platform Engineer: As a Data Platform Engineer, you will be responsible for the ingestion, transformation, and maintenance of enterprise data used to serve analytics needs for the business. Working closely with Business Analysts and Data Architects, you will use your technical skills to understand and execute business requirements. This role requires strong fundamentals in data engineering and a collaborative, business-process-oriented mindset.

    Essential Job Functions

    Data Preparation (70% time allocation):
    • Use DBT to move data through a medallion architecture in Snowflake.
    • Apply standardization and resolve conflicts in raw-layer data (cleansing).
    • Use cleansed data and dimensional modeling techniques (Kimball) to create facts and dimensions in the data warehouse (a Type 2 dimension sketch follows this listing).
    • Create curated, highly consumable data products that fulfill business needs.
    • Represent business processes digitally in data models, ensuring accurate reflection of underlying processes.

    Quality Assurance (15% time allocation):
    • Validate data outputs against Business Analyst-provided test cases.
    • Ensure quality of data pipelines via analysis and unit tests (standardization, completeness, grain, redundancy, etc.).

    Team Development (10% time allocation):
    • Set development standards and lead code reviews.
    • Mentor other team members to develop their skills and abilities.
    • Research technologies to improve processes.
    • Collaborate with a team of 9 reporting to the BI Manager, including engineers, a Data Scientist, a Data Architect, and Business Analysts.

    Data Ingestion (5% time allocation):
    • Use Fivetran/HVR to create data connections from source systems to Snowflake.

    Knowledge, Skills, and Abilities:
    • Minimum of 5-7 years of in-depth work experience in data warehousing or data engineering.
    • Manufacturing industry experience required.
    • Expertise in DBT and Snowflake (must-have).
    • Strong fundamentals: Kimball dimensional modeling, normalization vs. denormalization, Type 1 vs. Type 2 dimensions, cardinality, data granularity and aggregation, hierarchies, etc.
    • Experience in ELT and data analysis with SQL and at least one programming language (Python preferred).
    • Conceptual knowledge of data and analytics, including dimensional modeling, ELT, reporting tools, data governance, and structured and unstructured data.
    • Experience and/or knowledge of CI/CD practices using GitHub or Azure repos.
    • Familiarity with ERP systems (D365 experience is a plus).
    • Ability to design and build systems that handle data, including cleaning messy data and building real-time pipelines.
    • Collaborative, optimistic personality with integrity; able to pivot quickly and work closely with business teams.

    Education / Experience:
    • Bachelor's degree in Business Information Systems, Computer Science, or equivalent.
    • Related work experience in a manufacturing setting is preferred.
    • Minimum of 5-7 years in data engineering roles.

    Additional Notes: Ideal candidates understand the fundamentals of data engineering and can articulate their experience designing and building data systems. They should be collaborative, positive, and business-process-oriented, avoiding rigid enforcement approaches. Flexibility and adaptability are key. Please note: unfortunately, no visa sponsorship or transfers will be available for this position.
    $75k-99k yearly est. 12h ago
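Since the Nycor posting leans on Kimball fundamentals such as Type 1 vs. Type 2 dimensions, here is a compact illustration of the Type 2 pattern — expire the superseded row, insert a new version — shown in pandas with invented columns. In the stack this listing describes, the same pattern would typically be a dbt snapshot or a Snowflake MERGE.

```python
# Type 2 slowly changing dimension, illustrated with pandas.
# A changed attribute expires the current row and inserts a new version.
# Table and column names are hypothetical.
import pandas as pd

TODAY = pd.Timestamp("2025-01-15")
HIGH_DATE = pd.Timestamp("9999-12-31")

dim = pd.DataFrame(
    {
        "customer_id": [1, 2],
        "region": ["NW", "SE"],
        "valid_from": [pd.Timestamp("2020-01-01")] * 2,
        "valid_to": [HIGH_DATE] * 2,
        "is_current": [True, True],
    }
)

incoming = pd.DataFrame({"customer_id": [1], "region": ["SW"]})  # region changed

# Find current rows whose tracked attribute differs from the incoming feed.
current = dim[dim["is_current"]].merge(incoming, on="customer_id", suffixes=("", "_new"))
changed = current[current["region"] != current["region_new"]]

# Expire the superseded rows...
mask = dim["is_current"] & dim["customer_id"].isin(changed["customer_id"])
dim.loc[mask, "valid_to"] = TODAY
dim.loc[mask, "is_current"] = False

# ...and append the new versions.
new_rows = changed.assign(
    region=changed["region_new"], valid_from=TODAY, valid_to=HIGH_DATE, is_current=True
)[dim.columns]
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim)
```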
  • Data Modeler

    Nextgenpros Inc.

    Data engineer job in Minneapolis, MN

    • "Healthcare Insurance" industry experience
    • Data analysis/architecture
    • Source-to-target mapping
    • Erwin data modeling
    • ETL: Informatica PowerCenter
    • Oracle PL/SQL
    $70k-94k yearly est. 1d ago
  • Healthcare Cloud Data Transformation Lead

    Sogeti (4.7 company rating)

    Data engineer job in Minneapolis, MN

    About the job: We are seeking a highly skilled Cloud Data Transformation Architect/Leader to lead the design, implementation, and optimization of large-scale cloud-based data platforms and transformation initiatives. The architect will play a critical role in defining data strategy and roadmap, modernizing legacy environments, and enabling advanced analytics and AI/ML capabilities. This position requires deep expertise in cloud ecosystems, data integration, governance, and performance optimization, along with strong leadership in guiding cross-functional teams. The transformation leader will drive the implementation roadmap and work with client leadership to demonstrate the ROI of the modernized platform and transformation roadmap.

    What you will do at Sogeti

    Advisory Consulting, Collaboration & Leadership:
    • Provide leadership and advisory consulting to client data and analytics leadership.
    • Partner with business stakeholders, data leaders, data engineers, and analysts to understand business needs and align them with data capabilities.
    • Provide CXO-level status reporting and communication along with ROI and NPV measurement/reporting.
    • Provide technical leadership, mentorship, and best practices for data engineering teams.
    • Serve as the subject matter expert on cloud data and analytics transformation.

    Data Strategy, Roadmap & Architecture:
    • Define and maintain the enterprise cloud data architecture, strategy, and roadmap aligned with business objectives.
    • Define key metrics to measure progress and ROI from the roadmap.
    • Design end-to-end cloud data ecosystems (data lakes, data warehouses, lakehouses, streaming pipelines).
    • Design metadata-driven architecture and open table formats (Delta Lake, Apache Hudi, and Apache Iceberg).
    • Implement cost management measures on cloud data platforms.
    • Evaluate emerging technologies and recommend adoption strategies.

    Data Transformation & Migration:
    • Lead the modernization of on-premises data platforms to cloud-native architectures.
    • Architect scalable, secure, and high-performance ETL/ELT and real-time data pipelines.
    • Ensure seamless integration of structured, semi-structured, and unstructured data.

    Governance & Security:
    • Implement data governance, lineage, cataloging, and quality frameworks.
    • Ensure compliance with regulatory standards (GDPR, HIPAA, SOC 2, etc.).
    • Define data security models for data access, encryption, and masking.

    BI Modernization:
    • Lead the modernization of legacy BI platforms to Power BI.
    • Architect and develop the semantic layer needed for the consumption layer (BI, AI, etc.).

    Optimization & Innovation:
    • Drive performance tuning, cost optimization, and scalability of cloud data platforms.
    • Explore opportunities to leverage AI/ML, advanced analytics, and automation in data transformation.
    • Establish reusable frameworks and accelerators for faster delivery.

    Data Operations:
    • Define SLAs/KPIs for data platform operations along with the client.
    • Track and report SLAs/KPIs to executive leadership.
    • Identify and roll out solutions for improving SLA/KPI adherence.

    What you will bring
    • Experience: 18+ years in data engineering/architecture, with 5+ years leading enterprise-scale cloud data transformations.
    • Experience in the Healthcare Payer industry.
    • Experience defining enterprise-level data strategy and roadmap, and driving the implementation for at least 3 enterprise clients.
    • Experience playing a key advisory role for client data and analytics leadership (CDO and direct reports).
    • Hands-on expertise with at least one major cloud provider: Azure is a must-have; AWS and GCP are good to have.
    • Experience implementing Snowflake on Azure, as well as Medallion lakehouse architecture using Databricks and MS Fabric with open table formats.
    • Experience with various data modeling techniques and standards for cloud data warehouses.
    • Experience designing and implementing high-performing data pipelines; performance tuning expertise is required.
    • Experience with data governance implementation, with a focus on metadata management using Alation and data quality using industry-standard tools.
    • Experience with BI modernization from legacy BI platforms to Power BI is a big plus.

    Technical Skills:
    • Data platforms: Snowflake (must-have); Databricks, MS Fabric (big plus); Synapse, Redshift, BigQuery (good to have)
    • Data integration: Azure Data Factory, DBT, Snowpipe, Informatica PowerCenter, IDMC CDI, Matillion, Kafka
    • Programming: SQL, Python, SnowSQL, Snowpark, PySpark, Scala, or Java
    • Data governance: Alation (must-have); Informatica, Collibra, Ataccama (good to have)
    • BI platforms: Power BI (must-have); Qlik, SSRS, SAS (good to have)
    • Infrastructure as Code: Terraform, ARM, CloudFormation
    • Strong understanding of APIs, microservices, and event-driven architectures

    Life at Sogeti: Sogeti supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
    • Flexible work options
    • 401(k) with 150% match up to 6%
    • Employee Share Ownership Plan
    • Medical, Prescription, Dental & Vision Insurance
    • Life Insurance
    • 100% Company-Paid Mobile Phone Plan
    • 3 Weeks PTO + 7 Paid Holidays
    • Paid Parental Leave
    • Adoption, Surrogacy & Cryopreservation Assistance
    • Subsidized Back-up Child/Elder Care & Tutoring
    • Career Planning & Coaching
    • $5,250 Tuition Reimbursement & 20,000+ Online Courses
    • Employee Resource Groups
    • Counseling & Support for Physical, Financial, Emotional & Spiritual Well-being
    • Disaster Relief Programs

    About Sogeti: Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data, and automation. Become Your Best | *************

    Disclaimer: Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law. This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed.
Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodation during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact. Please be aware that Capgemini may capture your image (video or screenshot) during the interview process and that image may be used for verification, including during the hiring and onboarding process. Click the following link for more information on your rights as an Applicant ************************************************************************** Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by Capgemini.
    $103k-138k yearly est. 2d ago
  • Data Scientist

    Ledgent Technology (3.5 company rating)

    Data engineer job in Mendota Heights, MN

    We are seeking a Data Scientist to deliver predictive analytics and actionable insights that enhance financial forecasting and supply chain performance. This role will partner with business leaders and analysts to design models that inform strategic decisions. You will work primarily within Microsoft Fabric, leveraging Delta Lake/OneLake and Medallion Architecture (Bronze-Silver-Gold) to build scalable solutions and lay the groundwork for future AI-driven capabilities. This is a full-time, direct-hire role which will be onsite in Mendota Heights, MN. Local candidates only. Target salary is between $120,000-140,000. Candidates must be eligible to work in the United States without sponsorship both now and in the future. No C2C or third parties.

    Key Responsibilities:
    • Develop and deploy machine learning models for cost modeling, sales forecasting, and long-term work order projections (a generic forecasting sketch follows this listing).
    • Analyze large, complex datasets to uncover trends, anomalies, and opportunities for operational improvement.
    • Collaborate with finance, supply chain, and business teams to translate challenges into data-driven solutions.
    • Work with engineering teams to create robust pipelines for data ingestion, transformation, and modeling using cloud-native tools.
    • Utilize Azure services (Data Lake, Synapse, ML Studio) to operationalize models and manage workflows.
    • Present insights through clear visualizations and executive-level presentations.
    • Contribute to governance standards, audit trails, and model documentation.

    Qualifications

    Education & Certifications:
    • Bachelor's degree required; Master's in Computer Science, IT, or a related field preferred.
    • Cloud certifications (Azure or similar) are a plus.

    Experience & Skills:
    • 5+ years as a Data Scientist or in a similar role.
    • Hands-on experience with Microsoft Fabric, Azure Synapse, and related cloud technologies.
    • Proficiency in Python, R, SQL, and visualization tools (Power BI, Tableau).
    • Strong background in financial modeling, cost allocation, and supply chain analytics.
    • Familiarity with Oracle and Salesforce UI navigation is helpful.
    • Excellent business acumen and ability to communicate complex concepts to senior leadership.
    • Strong problem-solving skills and ability to design scalable solutions.

    Preferred:
    • Experience with Azure Machine Learning.
    • Knowledge of Jitterbit is a plus.

    All qualified applicants will receive consideration for employment without regard to race, color, national origin, age, ancestry, religion, sex, sexual orientation, gender identity, gender expression, marital status, disability, medical condition, genetic information, pregnancy, or military or veteran status. We consider all qualified applicants, including those with criminal histories, in a manner consistent with state and local laws, including the California Fair Chance Act, City of Los Angeles' Fair Chance Initiative for Hiring Ordinance, and Los Angeles County Fair Chance Ordinance. For unincorporated Los Angeles County, to the extent our customers require a background check for certain positions, the Company faces a significant risk to its business operations and business reputation unless a review of criminal history is conducted for those specific job positions.
    $120k-140k yearly 4d ago
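As a generic illustration of the forecasting work the Ledgent listing describes — not the client's actual models — here is a minimal lag-feature regression with scikit-learn; the monthly series and its features are synthetic.

```python
# Minimal sales-forecasting sketch: lag features + gradient boosting.
# Purely illustrative synthetic data, not the client's models or data.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
periods = 120  # ten years of monthly sales
t = np.arange(periods)
sales = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, periods)

df = pd.DataFrame({"sales": sales})
for lag in (1, 2, 12):                 # recent history plus a seasonal lag
    df[f"lag_{lag}"] = df["sales"].shift(lag)
df = df.dropna()

X, y = df.drop(columns="sales"), df["sales"]
split = len(df) - 12                   # hold out the last year for validation
model = GradientBoostingRegressor().fit(X.iloc[:split], y.iloc[:split])

preds = model.predict(X.iloc[split:])
print("MAE on holdout year:", round(mean_absolute_error(y.iloc[split:], preds), 2))
```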
  • Data Engineer

    Talent Software Services (3.6 company rating)

    Data engineer job in Bloomington, MN

    Are you an experienced Data Engineer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Data Engineer to work at their company in Bloomington, MN.

    Primary Responsibilities/Accountabilities:
    • Develop and maintain scalable ETL/ELT pipelines using Databricks and Airflow.
    • Build and optimize Python-based data workflows and SQL queries for large datasets.
    • Ensure data quality, reliability, and high performance across pipelines.
    • Collaborate with cross-functional teams to support analytics and reporting requirements.
    • Monitor, troubleshoot, and improve production data workflows.

    Qualifications:
    • Strong hands-on experience with Databricks, Python, SQL, and Apache Airflow.
    • 6-10+ years of experience in data engineering.
    • Experience with cloud platforms (Azure/AWS/GCP) and big data ecosystems.
    • Solid understanding of data warehousing, data modeling, and distributed data processing.
    $71k-96k yearly est. 3d ago
  • AirWatch MDM Engineer

    Trioptus

    Data engineer job in Saint Paul, MN

    12-month assignment with possibility for extension. We are seeking an experienced AirWatch MDM Engineer to manage and support enterprise mobile device management (MDM) solutions. The role involves maintaining the existing VMware Workspace ONE / AirWatch platform, providing advanced technical support, and leading migration efforts to Microsoft Intune. This position requires strong troubleshooting skills, collaboration with security teams, and the ability to work independently in a fast-paced environment.

    Key Responsibilities:
    • Maintain and administer the AirWatch MDM platform, including device enrollment and lifecycle management, policy configuration and compliance monitoring, and application deployment for iOS, Android, and Windows devices.
    • Provide Tier 2/3 support for mobile device issues across multiple platforms.
    • Manage vendor portals (Verizon, AT&T, T-Mobile) for cellular activations and support.
    • Collaborate with security and compliance teams to ensure alignment with organizational standards.
    • Monitor system performance, generate reports, and implement improvements for security and user experience.
    • Lead assessment, planning, and phased migration from AirWatch to Microsoft Intune: stakeholder engagement, pilot testing, and documentation.
    • Develop and maintain technical documentation, SOPs, and knowledge base articles.
    • Stay current with industry trends and best practices in endpoint management and mobile security.
    • Perform knowledge transfer and provide guidance to internal teams.

    Minimum Qualifications:
    • 3+ years of hands-on experience with VMware Workspace ONE / AirWatch administration.
    • 2+ years of experience with AirWatch MDM software, mobile OS platforms, and enterprise mobility architecture.
    • 2+ years of experience managing cellular activations via vendor portals (Verizon, AT&T, T-Mobile).
    • 1+ year of experience with Microsoft Intune, Azure AD, and Microsoft Endpoint Manager.

    Desired Skills:
    • Experience with Intune deployment or migration projects.
    • Microsoft certifications (e.g., MD-102, SC-300, AZ-104).
    • Knowledge of Zero Trust principles and conditional access policies.
    • Experience integrating MDM with identity and access management solutions.
    • Proficiency in PowerShell scripting or other automation tools.
    $64k-85k yearly est. 2d ago
  • SharePoint Engineer

    Hyqoo

    Data engineer job in Minneapolis, MN

    Job Title: SharePoint Engineer II

    Introduction: We are seeking a highly skilled SharePoint Engineer II to join our End User Technology - Productivity Tools team, responsible for enterprise collaboration and productivity platforms across Microsoft 365. This role plays a critical part in administering and engineering SharePoint Online environments, ensuring secure, scalable, and efficient collaboration solutions. The ideal candidate will combine deep expertise in SharePoint administration with strong automation and integration skills, driving modernization efforts and enhancing user productivity.

    Roles and Responsibilities:
    • Administer and manage the SharePoint Online tenant and site collections, including site templates, hubs, term store, content types, app catalog/add-ins, sharing policies, retention labels, and search configurations. Maintain governance and lifecycle management at scale.
    • Collaborate with business owners to design modern site architectures (communication and team sites), navigation, and metadata strategies to improve content findability and user adoption.
    • Implement and enforce role-based access controls, group permissions, and external sharing policies. Conduct audits to identify and remediate permission drift, supporting least-privilege security models.
    • Lead and support classic-to-modern migration projects, performing content inventory, page and web part modernization, workflow replacements, and post-migration stabilization, including training and documentation.
    • Develop, maintain, and optimize operational scripts and automation workflows using PowerShell (including PnP and Graph modules) and Azure Automation runbooks for provisioning, compliance enforcement, reporting, and policy management.
    • Utilize the Microsoft Graph API and SharePoint REST API to integrate SharePoint with enterprise workflows, external data sources, and approval processes, creating lightweight custom extensions as needed (a minimal Graph call is sketched after this listing).
    • Provide advanced (third-level) technical support and troubleshooting for complex SharePoint and Microsoft 365 collaboration issues, including root cause analysis and incident response.
    • Produce detailed documentation such as SOPs, runbooks, migration playbooks, and quick-start guides. Deliver targeted training sessions for site owners and administrators.
    • Collaborate with solution architects to evaluate and recommend new capabilities. Mentor junior team members to build overall team expertise and capacity.

    Qualifications:
    • Bachelor's degree in Computer Science, Information Technology, or a related field preferred.
    • 5+ years of experience deploying, managing, and administering SharePoint Online within Microsoft 365, including tenant-level governance, site provisioning, app catalog management, search, and security/compliance.
    • 3+ years of hands-on experience with PowerShell scripting (including PnP and Graph modules) and Azure Automation for operational task automation and reporting.
    • 3+ years working with the Microsoft Graph API and SharePoint REST API to extend and integrate SharePoint solutions.
    • Proven experience supporting classic-to-modern SharePoint migrations, including content inventory, page modernization, web part remediation, and workflow replacement.
    • Solid understanding of IT Service Management (ITSM) processes, with practical experience using tools such as ServiceNow for incident, change, and problem management.
    • Strong analytical, problem-solving, and communication skills with the ability to work effectively in a hybrid team environment.

    Preferred:
    • Experience with Azure AD/Entra ID (dynamic groups, conditional access), Power Platform governance, and cross-service integrations (Teams, OneDrive, Viva).
    • Familiarity with cloud platforms such as Azure, AWS, or GCP, and CI/CD pipelines using GitHub Actions or Azure DevOps for infrastructure as code and SharePoint automation.
    • Exposure to enterprise monitoring and analytics tools like Nexthink to drive adoption and performance improvements.

    Tools and Technologies: SharePoint Online (tenant and site-level administration); PowerShell (including PnP PowerShell, Microsoft Graph PowerShell); Microsoft Graph API; SharePoint REST API; Azure Automation; ServiceNow (or equivalent ITSM platforms); Azure AD/Entra ID; Microsoft 365 collaboration tools (Teams, OneDrive, Viva); cloud platforms: Azure, AWS, GCP (preferred); CI/CD tools: GitHub Actions, Azure DevOps (preferred).
    $64k-85k yearly est. 12h ago
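To make the Graph-integration requirement in the SharePoint listing concrete, here is a minimal app-only call to Microsoft Graph that enumerates SharePoint Online sites. It assumes a registered Entra ID app with the Sites.Read.All application permission; the tenant, client ID, and secret values are placeholders.

```python
# Minimal app-only Microsoft Graph call: search SharePoint Online sites.
# Assumes an Entra ID app registration with Sites.Read.All (application)
# permission. Tenant, client ID, and secret below are placeholders.
import msal
import requests

TENANT_ID = "00000000-0000-0000-0000-000000000000"   # placeholder
CLIENT_ID = "11111111-1111-1111-1111-111111111111"   # placeholder
CLIENT_SECRET = "<app-secret>"                        # placeholder

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
# Client-credentials flow; raises a KeyError below if the grant fails.
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/sites?search=*",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()

for site in resp.json().get("value", []):
    print(site["displayName"], site["webUrl"])
```

The same inventory task is often done in PnP PowerShell, as the posting notes; the Python form is shown here only to keep the section's examples in one language.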
  • Senior Software Engineer

    Docsi

    Data engineer job in Minneapolis, MN

    DOCSI is seeking a talented, driven software engineer to join our engineering team. We need a passionate and creative mind to help us continue building our cutting-edge surgical waste elimination platform. The person who accepts this role will not only work closely with our Director of Engineering, but will also benefit from full exposure to the inner workings and decision-making challenges of an early-stage startup. They will inevitably be called upon to contribute to significant decisions that impact the technical direction of the company. They should also be willing and able to grow into a technical or people management role as the engineering team grows.

    This role will:
    • Work alongside the Director of Engineering and other DOCSI engineers to expand and maintain our software solution.
    • Design and build new user experiences that streamline the complex and confusing process of managing surgical waste.
    • Inform the creation of machine learning tools to amplify the quality of surgical waste reduction recommendations.
    • Create seamless data pipelines and integrations that enable our highly scalable, always-available platform.
    • Influence and guide critical design discussions that determine the future direction of our product.
    • Gain access and connections to key members of the Twin Cities startup community.
    • Help shape the culture of a new and growing engineering team.

    Minimum Qualifications:
    • 4+ years of experience working as a software engineer or in a similar role.
    • Experience in web development with one or more of the following languages/frameworks: PHP, React, Python, Java.
    • Expertise working with relational database systems such as MySQL or PostgreSQL.
    • Demonstrable experience leading technical projects from start to finish (with or without assistance from other team members).
    • An understanding of building systems to scale with large, often inconsistent data imports.
    • Action-driven self-starter who enjoys improving existing processes.
    • A lifelong-learning mindset with a desire to explore new ideas and connect them to their work.
    • Ability to work in an often ambiguous, fast-paced environment.

    Bonus Qualifications:
    • Previous work with PHI or other sensitive data; experience undergoing compliance audits is even better.
    • Experience in designing seamless, mobile-friendly user experiences.
    • A history of or deep interest in working in startups or early-stage companies.
    • A background/experience in healthcare and/or supply chain.
    • (Extra plus) Experience specifically with Laravel, Apache Spark, Terraform, and/or AWS cloud services.

    Salary and Benefits:
    • Expected salary range is between $100,000 - $140,000.
    • An equity package relative to the candidate's skills and experience.
    • Unlimited vacation policy.
    • A healthcare stipend is available; full healthcare benefits will be available in 2026.
    $100k-140k yearly 3d ago
  • IAM Engineer

    The Judge Group (4.7 company rating)

    Data engineer job in Thief River Falls, MN

    Key Responsibilities:
    • Design and implement IAM solutions, including SSO, MFA, and RBAC.
    • Manage and maintain IAM systems for high availability and security.
    • Develop and enforce IAM policies and best practices.
    • Integrate IAM systems with applications, infrastructure, and cloud services.
    • Conduct security assessments and audits of IAM processes.
    • Lead user provisioning, de-provisioning, and access certification processes.
    • Troubleshoot complex IAM issues and provide technical support.
    • Collaborate with IT, security, and business teams to define IAM requirements.
    • Mentor junior engineers and share best practices.
    • Stay updated on IAM trends and emerging technologies.

    Required Qualifications:
    • Experience: 6-8 years in IAM with strong architectural knowledge.
    • Expertise in Single Sign-On (OAuth) and IAM tools such as Ping Identity, Okta, CyberArk (PAM), Active Directory, Microsoft Entra, and Delinea.
    • Strong understanding of IAM technologies and their functionality.
    • Excellent communication and presentation skills for technical and non-technical audiences.
    $68k-88k yearly est. 3d ago
  • Senior Software Engineer

    TempWorks Software, Inc. (3.6 company rating)

    Data engineer job in Bloomington, MN

    At TempWorks, the Senior Software Engineer is responsible for creating software that delights our customers and users in a way that is also easily maintainable. The Senior Software Engineer is responsible for leading the design, development, and implementation of software solutions. You will collaborate closely with cross-functional teams to understand requirements, design scalable architectures, and deliver robust, efficient software products.

    General Responsibilities:
    • Research, design, implement, and maintain software features through ongoing feature development, refactoring, and by addressing bugs.
    • Build highly performant, fault-tolerant, high-quality, scalable software.
    • Actively seek to learn and improve the company, department, team, and themselves.
    • Develop intuitive software that meets the needs of the company and our customers.
    • Leverage technical knowledge, skills, and experience to improve department processes and software quality.
    • Write quality unit and integration tests. Analyze and test programs and products before formal launch.
    • Contribute and adhere to best practices in software development.
    • Participate in agile development processes, including sprint planning, daily stand-ups, and retrospectives.
    • Communicate with and train stakeholders on completed work for documentation, customer training, troubleshooting, and quality.
    • Provide mentoring for other Software Engineers. Perform code reviews and provide constructive feedback.
    • Stay up to date with emerging technologies and trends in software development, and recommend new tools and techniques to improve efficiency and productivity.
    • Participate in architectural discussions and contribute to the continuous improvement of development processes and methodologies.
    • Participate in educational opportunities like online course materials, professional publications, conferences, meet-ups, etc.
    • Perform other related duties as assigned.

    Additional Required Skills and Abilities:
    • Excellent verbal and written communication skills.
    • Excellent interpersonal and customer service skills.
    • Strong architectural and design skills, with the ability to architect complex systems and make informed technical decisions.
    • Analytical and creative problem solving.
    • High level of organization and attention to detail.
    • Ability to work independently.

    Education and Experience:
    • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
    • 5+ years of relevant experience developing enterprise-scale, web-based software applications.
    • 4+ years of C# experience.
    • 2+ years of Microsoft SQL database experience required (4+ preferred).
    • 4+ years' experience developing applications using RESTful APIs.
    • 4+ years' experience developing REST API-driven applications using the C# .NET framework and/or ASP.NET.
    • Expertise in front-end technologies such as HTML, CSS, JavaScript, and modern JavaScript frameworks (e.g., React, Angular, Vue.js); React preferred.
    • Experience with version control systems (e.g., Git) to manage source code and facilitate collaboration within the development team.
    • Experience with testing and mocking frameworks (e.g., MSTest, NUnit, XUnit, Moq).
    • Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and DevOps practices; Azure preferred.
    • Experience with CI/CD, preferably Azure YAML pipelines.
    • Experience with static and dynamic code analysis tools (e.g., SonarQube, Veracode, ReSharper).
    • Experience with one or more of the following required: Domain-Driven Design, event-based architecture, distributed systems, microservices, clean architecture, 12-Factor App.

    Physical Requirements:
    • Prolonged periods sitting at a desk and working on a computer.
    • Must be able to lift up to 10 pounds at times.
    $84k-107k yearly est. 3d ago
  • AI/ML Engineer

    Iris Consulting LLC

    Data engineer job in Medina, MN

    Top Technical Skills & Requirements:
    • Programming Expertise: 3+ years of hands-on experience with C# and Python, including building scalable applications and integrating ML models.
    • API Development & Management: Experience designing, building, and managing RESTful API endpoints using C# (.NET) and Python (FastAPI), with a focus on performance, security, and maintainability.
    • Cloud ML Deployment: Proven experience in end-to-end deployment, monitoring, and maintenance of ML models on Azure (preferred) or AWS, including CI/CD pipelines and MLOps practices.
    • Deep Microsoft Azure Experience: Extensive hands-on experience with Azure services including Azure Machine Learning, Azure AI Foundry, Azure Functions, and Azure DevOps, enabling scalable and secure AI/ML solutions across enterprise environments.
    • Azure AI Foundry: Practical experience leveraging Azure AI Foundry for model development, orchestration, and deployment.
    • Applied Data Science: Strong background in data exploration, feature engineering, model selection, and validation across supervised and unsupervised learning tasks.

    Key Responsibilities

    AI Solution Development & Deployment:
    • Architect and implement AI/ML solutions using Azure Machine Learning, Azure AI Foundry, Cognitive Services, and Azure OpenAI.
    • Build and deploy NLP and LLM-based models, utilizing frameworks such as LangChain, Semantic Kernel, or LlamaIndex.
    • Design and implement Retrieval-Augmented Generation (RAG) pipelines using Azure AI Search to enhance generative AI capabilities (a bare-bones RAG sketch follows this listing).
    • Develop and manage RESTful API endpoints using C# (.NET) and Python (FastAPI) to serve ML models and data services.
    • Implement CI/CD pipelines and MLOps workflows using Azure DevOps and GitHub for scalable and automated model deployment.
    • Leverage Terraform for infrastructure-as-code (IaC) to provision and manage Azure cloud resources in a repeatable and secure manner.

    Model Lifecycle Management:
    • Lead the end-to-end ML model lifecycle, including data exploration, feature engineering, model training, validation, and deployment.
    • Monitor and maintain deployed models in production environments, ensuring performance, reliability, and scalability.
    • Apply best practices in model versioning, automated retraining, and performance monitoring using Azure ML and related tools.

    Quality Assurance & Governance:
    • Conduct thorough testing and validation of AI models to ensure accuracy, reliability, and performance.
    • Optimize and fine-tune models, addressing issues related to data quality, bias, and fairness.
    • Stay current with industry trends and best practices in AI technology, incorporating them into solution development.

    Education & Experience:
    • Bachelor's degree in Computer Science, Data Science, or similar (relevant work experience is acceptable).
    • 3+ years of experience in AI/ML development, with a focus on OpenAI services, NLP, and LLMs.
    • Experience in a consulting environment, engaging with clients and delivering tailored solutions.

    Preferred Consulting Experience:
    • Collaborate with sales and delivery teams to scope and design AI solutions tailored to client needs.
    • Contribute to proposal development, including technical architecture, effort estimation, and value articulation.
    • Deliver technical presentations and demos to stakeholders, showcasing solution capabilities and business impact.

    Equal opportunity employer, including disability/veterans.
    $64k-85k yearly est. 4d ago
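The Iris listing asks for RAG pipelines built on Azure AI Search and Azure OpenAI. Here is a bare-bones sketch of that flow — retrieve passages, then ground a chat completion on them. Endpoints, keys, the index name, the `content` field, and the deployment name are placeholders for an assumed environment, not the client's setup.

```python
# Bare-bones RAG sketch: retrieve passages from Azure AI Search, then ground
# an Azure OpenAI chat completion on them. All endpoints, keys, index and
# deployment names below are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint="https://example-search.search.windows.net",   # placeholder
    index_name="docs-index",                                 # placeholder
    credential=AzureKeyCredential("<search-key>"),
)
llm = AzureOpenAI(
    azure_endpoint="https://example-openai.openai.azure.com",  # placeholder
    api_key="<openai-key>",
    api_version="2024-02-01",
)

question = "What is our parental leave policy?"

# 1) Retrieve: top passages matching the question.
hits = search.search(search_text=question, top=3)
context = "\n\n".join(doc["content"] for doc in hits)  # assumes a 'content' field

# 2) Generate: answer grounded only on the retrieved context.
answer = llm.chat.completions.create(
    model="gpt-4o",  # your Azure OpenAI deployment name here
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```

A production pipeline would typically add vector or hybrid search, chunking, and citation of the retrieved sources; frameworks like LangChain or Semantic Kernel (named in the listing) wrap these same two steps.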
  • Sr Boomi Developer

    Vista Applied Solutions Group Inc. (4.0 company rating)

    Data engineer job in Kenosha, WI

    Responsibilities:
    • Design and architect solutions: bring deep knowledge to the design of stable, reliable, and scalable integration solutions using the Dell Boomi AtomSphere platform and its components (Integration, API Management, MDM, etc.).
    • Hands-on development: design, develop, and implement complex integration processes, workflows, and APIs (REST/SOAP) to connect various applications (on-premises and cloud-based), ERP systems (like Microsoft Dynamics, Oracle EBS, SAP), and other data sources.
    • Data transformation: proficiently handle various data formats such as XML, JSON, CSV, and database formats, using Boomi's capabilities and scripting languages (like Groovy or JavaScript) for complex data mapping and transformations.
    • Dell Boomi platform knowledge: proficiency in Dell Boomi is crucial, including familiarity with Boomi components such as connectors, processes, maps, and APIs, and with designing, building, and deploying integrations in Boomi.
    • API development: strong knowledge of RESTful and SOAP APIs; create, consume, and manage APIs within Boomi.
    • Work with team members and business users to understand project requirements and deliver successful design, implementation, and post-implementation support.
    • Work closely with team members to translate business requirements into feasible and efficient technical solutions.
    • Develop and maintain documentation for integration and testing processes.
    • Be highly accurate in activity assessment, effort estimation, and delivery commitment to ensure all project activities are delivered on time without compromising quality.
    • Diagnose complex technical issues and provide recommendations on solutions with consideration of best practices and the longer-term impacts of decisions.
    • Lead/perform third-party testing, performance testing, and UAT coordination.
    • Select the appropriate development platform(s) to execute business requirements and ensure post-implementation success.
    • Serve as technical lead on projects to design, develop, test, document, and deploy robust integration solutions.
    • Work both independently and as part of a team, collaborating closely with other IT and non-IT team members.
    • Assess and troubleshoot production issues with a varying degree of priority and complexity.
    • Optimize existing and develop new integration solutions to support business requirements.
    • Provide continuous support and management of the integration layer, ensuring the integrity of our data and integrations and removing single points of failure.
    • Apply good knowledge of best practices in error handling, logging, and monitoring.
    • Document and cross-train team members for support continuity.

    Qualifications:
    • 10-15 years of experience with enterprise integration platforms.
    • Bachelor's degree in Computer Science.
    • Troubleshooting skills: adept at diagnosing and resolving integration issues; familiarity with Boomi's debugging tools is valuable.
    • Security awareness: knowledge of authentication methods, encryption, and secure data transmission.
    • Experience and a proven track record of implementing integration projects.
    • Extensible Stylesheet Language Transformations (XSLT) experience is a plus.
    • Project management experience is a plus.
    • Experience with ERP systems within a fast-moving wholesale, retail, and e-commerce environment is highly desirable.
    • Experience with Boomi implementation with the Microsoft Dynamics ERP system is a plus.
    • Strong communication and ability to work cross-functionally in a fast-paced environment.
    $82k-106k yearly est. 2d ago
  • Senior Application Developer - OneStream

    Bestinfo Systems LLC

    Data engineer job in Wayzata, MN

    Senior Application Developer - OneStream | Wayzata, MN | Full-Time (FTE) | Direct Hire
    Job Type: Full-Time (FTE)
    Base Salary: $103,393 to $148,700 + best-in-class benefits

    Qualifications:
    • Minimum requirement of 4 years of relevant work experience; typically reflects 5 years or more of relevant experience.

    Preferred Qualifications:
    • Proficient in .NET and C#
    • Strong previous experience with finance applications
    • Desire to learn finance processes and gain solution expertise
    • Previous experience with OneStream, Hyperion, or other corporate consolidation and planning tools
    • Knowledge of financial close and consolidation processes
    • Knowledge of financial planning and analysis
    • VB.NET and/or C# experience for business rules

    Skills and Certifications: OneStream

    Candidate Details:
    • Seniority level: Mid-Senior
    • Minimum education: Bachelor's degree
    $103.4k-148.7k yearly 3d ago

Learn more about data engineer jobs

How much does a data engineer earn in Duluth, MN?

The average data engineer in Duluth, MN earns between $67,000 and $112,000 annually. This compares to the national average data engineer salary range of $80,000 to $149,000.

Average data engineer salary in Duluth, MN

$86,000