
Data Engineer Jobs in Minnesota

- 1,024 Jobs
  • Data Engineer / Developer

    Simplicity Group Holdings

    Data Engineer Job In Minneapolis, MN

    Job Description
    Data Engineer/Developer
    Reports to: VP, Technology
    Department: Development and Technology
    Classification: Full-time; Exempt

    Summary / Job Objective: As a data engineer/developer, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure used for collecting, storing, processing, and analyzing large volumes of data.

    Essential Job Functions: Your role involves collaborating with business stakeholders, outside partners, software developers, analysts, and other stakeholders to ensure data availability, reliability, and quality for decision-making and operational purposes.

    Primary Responsibilities:
    · Data Processing and Transformation: Implement data processing and transformation using state-of-the-art tools and processes. Build on top of existing capabilities and develop both custom features (e.g., Laravel/PHP, Python, or other frameworks) and enterprise software (e.g., Informatica, Mulesoft, or other tools).
    · Data Pipeline Development: Design, develop, and maintain robust, scalable data pipelines for ingesting, transforming, and loading structured and unstructured data from various sources into data storage systems.
    · Data Infrastructure: Manage and optimize data storage solutions, including databases, operational data stores, and data warehouses, to ensure high performance, reliability, and security.
    · Data Quality and Governance: Implement data quality checks, monitoring, and governance processes to ensure data accuracy, consistency, and compliance with regulatory requirements.
    · Performance Tuning: Identify and address performance bottlenecks in data pipelines and infrastructure through optimization, tuning, and capacity planning.
    · Collaboration: Work closely with data scientists, analysts, software engineers, and business stakeholders to understand data requirements, develop data solutions, and support analytical use cases.
    · Documentation: Document data pipelines, infrastructure configurations, and best practices for data engineering standards, ensuring knowledge sharing and team collaboration.

    Qualifications:
    · Hands-on experience building data-focused processes in languages such as Laravel/PHP, Python, or other languages.
    · Knowledge of and experience working in cloud platforms like AWS and/or Azure.
    · Experience working with complex data objects in various formats like JSON and XML, and across different DB designs like MySQL, MS SQL, and Salesforce DBs.
    · Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
    · Proven experience in data engineering, ETL (Extract, Transform, Load) processes, and data pipeline development.

    Additional Qualifications (Preferred):
    · Knowledge of or experience with Salesforce data models, APIs, and high-level architecture. Experience with other CRM tools and their data is also a plus.
    · Experience with Business Intelligence tools such as Domo, Microsoft Power BI, or other BI tools.
    · Agile development experience working with tools such as Atlassian Jira, Confluence, and Bitbucket.
    · Experience with financial services products such as insurance, wealth management, or related fields is a plus.
    · Certification or coursework in data engineering or related technologies.

    Core Competencies:
    · Excellent problem-solving skills, analytical thinking, and attention to detail.
    · Effective communication and teamwork abilities to collaborate across multidisciplinary teams.
    · Proven success working both in a team environment and independently.
    Compensation & Benefits:
    Compensation (based on experience)
    ● Base salary: $110,000 - $130,000
    ● This is an exempt position
    ● Annual performance bonus target: 10%
    Benefits
    ● Employee benefits (medical, dental, vision, life insurance, other)
    ● 401k with employer match
    ● Paid Time Off
    ● Paid parking

    Location: Simplicity Office, 3600 American Blvd W, Suite 700, Minnesota, MN 55431
    *This role is an in-office position

    Company Description
    Headquartered in Summit, New Jersey, Simplicity Financial Marketing Group Holdings (“Simplicity Group”) is a financial holding company in the independent financial services sector that specializes in the distribution of retirement and financial planning products. Simplicity Group partners with insurance and investment professionals to help provide consumers with guaranteed income and life insurance products, wealth accumulation strategies, and disability and long-term care protection in support of a holistic financial strategy. Through its vast distribution network, Simplicity Group has assisted with the placement of more than $10 billion of insurance financial assets and has $10 billion of assets under management and advisement as of Q3 2024. Simplicity Group is a fast-growing business, focused on organic growth initiatives to help its distribution partners expand their businesses. It is also focused on growing by acquisition. Simplicity Group has over 1,000 employees and 70 operating subsidiaries. Simplicity Group is owned by two of the leading San Francisco-based financial and tech-enabled services private equity firms and by its operating Partners, who help drive Simplicity Group's day-to-day business. For more information, please visit simplicitygroup.com.
    $110k-130k yearly 4d ago
  • Lead Azure Data Engineer

    ESB Technologies

    Data Engineer Job In Minnesota

    Hi, hope you are doing great. Immediate need: Lead Azure Data Engineer
    Title: Lead Azure Data Engineer
    Exp: 9+ years
    Only W2

    Job Description:
    We are seeking a Lead Developer with advanced skills in Python, Apache Spark, Azure Synapse, and Azure Data Engineering services to develop and manage ETL pipelines and data processing solutions that support AI/ML initiatives. The ideal candidate will have strong expertise in Azure Cloud, including SQL, Data Factory, SQL Pools, Spark Pools, and Data Warehousing. A background in CI/CD processes, cross-functional collaboration, and Agile methodologies is essential. In addition, knowledge of Data Science, Java, Kubernetes, Azure Data Lake Storage (ADLS) Gen2, and API development is required.

    Key Responsibilities
    1. ETL and Data Pipeline Development
    - Design, develop, and optimize scalable ETL processes using Python, Apache Spark, and Azure Synapse.
    - Build and manage Azure Data Factory pipelines to orchestrate complex data workflows.
    - Use SQL Pools and Spark Pools within Synapse to manage and process large datasets efficiently.
    - Implement Data Warehousing solutions using Azure Synapse Analytics to provide structured and queryable data layers.
    - Ensure the data platform supports real-time and batch AI/ML data requirements.
    2. Azure Cloud Development & CI/CD Deployment
    - Build, configure, and manage CI/CD pipelines on Azure DevOps for ETL and data processing tasks.
    - Automate infrastructure provisioning, testing, and deployment using Infrastructure-as-Code (IaC) tools like ARM templates or Terraform.
    - Optimize Azure Data Lake Storage (ADLS Gen2) to store and manage raw and processed data efficiently, ensuring proper access control and data security.
    3. Cross-Functional Collaboration
    - Collaborate with Data Scientists, Data Engineers, ML Engineers, and Business Analysts to translate business requirements into data solutions.
    - Work with the DevOps and Security teams to ensure smooth and secure deployment of applications and pipelines.
    - Act as the technical lead in designing, developing, and implementing data solutions, mentoring junior team members.
    4. Data Engineering and API Development
    - Develop and integrate with external and internal APIs for data ingestion and data exchange.
    - Build, test, and deploy RESTful APIs for secure data access.
    - Use Kubernetes for containerizing and deploying data processing applications.
    - Manage data storage and transformation to support advanced Data Science and AI/ML models.
    5. Agile Project Management
    - Participate in and lead Agile ceremonies, such as sprint planning, daily stand-ups, and retrospectives.
    - Collaborate with cross-functional teams in iterative development to ensure high-quality and timely feature delivery.
    - Adapt to changing project priorities and business needs in an Agile environment.

    Required Skills and Qualifications
    1. Technical Skills:
    - Expertise in Python and Apache Spark for large-scale data processing.
    - Strong experience in Azure Synapse Analytics, including SQL Pools and Spark Pools.
    - Advanced proficiency in Azure Data Factory for ETL pipeline orchestration and management.
    - Knowledge of Data Warehousing principles, with hands-on experience building solutions on Azure.
    - Experience with SQL, including complex queries, optimization, and performance tuning.
    - Familiarity with CI/CD tools like Azure DevOps and managing infrastructure in Azure Cloud.
    - Experience in Java for API integration and microservices architecture.
    - Hands-on knowledge of Kubernetes for containerized data processing environments.
    - Proficiency in working with Azure Data Lake Storage (ADLS) Gen2 for data storage and management.
    - Experience working with APIs (REST, SOAP) and building API-based data integrations.
    2. Agile and Cross-Functional Skills:
    - Experience working in an Agile environment, using Scrum or Kanban.
    - Ability to lead, mentor, and coach junior developers in the team.
    - Strong collaboration skills to work with data scientists, analysts, and cross-functional teams to deliver end-to-end data solutions.
    3. Behavioral Skills:
    - Strong analytical and problem-solving skills with a passion for data-driven solutions.
    - Excellent communication and presentation skills; able to explain complex technical concepts to non-technical stakeholders.
    - Ability to work in a fast-paced, dynamic environment with changing priorities.
    - Self-motivated and results-oriented with attention to detail.

    Preferred Qualifications
    - Azure certifications in data engineering or cloud architecture.
    - Experience deploying AI/ML models on cloud platforms.
    - Familiarity with Data Governance best practices, ensuring compliance with data privacy regulations.
    $75k-99k yearly est. 3d ago
  • Data Engineer

    Lucas James Talent Partners

    Data Engineer Job In Bloomington, MN

    A Data Engineer at TempWorks is responsible for delivering data-driven insights to innovate and drive business success. The Data Engineer plays a crucial role in designing, developing, and maintaining our data architecture to support our growing data needs, and collaborates closely with cross-functional teams including data scientists, analysts, and software engineers to ensure the reliability, scalability, and performance of our data systems.

    General Responsibilities:
    - Designing and implementing scalable and efficient data pipelines to ingest, process, and transform data from various sources.
    - Building and maintaining robust data warehousing solutions to support analytical and reporting requirements.
    - Optimizing database performance and ensuring data quality and integrity.
    - Developing and maintaining ETL processes to enable efficient data movement across systems.
    - Implementing data security and privacy measures to ensure compliance with regulations and company policies.
    - Evaluating and adopting new technologies and tools to enhance our data infrastructure and analytics capabilities.
    - Performing other related duties as assigned.

    Required Skills and Abilities:
    - Hands-on experience with SQL database design.
    - A deep understanding of relational databases (e.g., MySQL, PostgreSQL).
    - Experience with Azure data storage solutions such as Azure SQL Database, Azure Cosmos DB, and Azure Data Lake Storage.
    - Knowledge of data integration and analysis tools such as Azure Data Factory, Azure Databricks, Azure Synapse, and Power BI.
    - Understanding of data modeling and schema design principles.
    - Ability to work with large datasets and perform data analysis.
    - Strong experience in common data warehouse modeling principles.
    - Knowledge of DevOps processes (including CI/CD).
    - Experience developing NoSQL solutions is desirable.

    Education and Experience:
    - Bachelor's degree or higher in Computer Science or Engineering preferred.
    - 5+ years of SQL experience (NoSQL experience is a plus).
    - 5+ years of experience with schema design and dimensional data modeling.
    - Experience designing, building, and maintaining data processing systems.

    Physical Requirements:
    - Prolonged periods sitting and/or standing at a desk and working on a computer.
    - Must be able to lift up to 10 pounds at times.
    $75k-99k yearly est. 14d ago
  • Data Scientist

    On-Demand Group (4.3 company rating)

    Data Engineer Job In Arden Hills, MN

    The Data Scientist is responsible for creating and delivering advanced analytics and insights for the Finance teams. This position will partner with business partners within the organization, as well as with colleagues in the Data & Analytics technology team(s), to analyze data from diverse sources and inform data-driven decision making.

    Experience-Education (Required):
    • Bachelor's degree in business analytics, computer science, statistics, or related fields
    • Demonstrated experience in statistical programming such as Python, R, SAS, SPSS, etc.
    • Demonstrated knowledge of statistical and analytic techniques including regression, clustering, classification, etc.
    • Demonstrated ability to translate between business requirements and technical specifications and to communicate results and findings with non-technical business partners
    • Experience using traditional and modern platforms including cloud storage and compute technologies
    • Intermediate to advanced SQL skills

    Competencies-Skills (Required):
    • Identify key business requirements that will drive the design, development, implementation, and delivery of models, quantitative analyses, and other analytical solutions
    • Collaborate with and influence multiple stakeholder groups to ensure alignment with standards, guidelines, and best practices
    • Provide thought leadership and education to business partners regarding analytic best practices
    • Aid in enabling enterprise roll-out of advanced analytic capabilities
    • Identify, report, and escalate risks to manager and stakeholders appropriately

    Competencies-Skills (Preferred):
    • Experience with cloud-based model training and production
    • Familiarity/experience with Agriculture, CPG, and/or manufacturing industries

    EOE M/F/Vets/Disabled. Pre-employment substance abuse testing. The projected hourly range for this position is $70 to $73. On-Demand Group (ODG) provides employee benefits which include healthcare, dental, and vision insurance.
ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
    $70-73 hourly 12d ago
  • Data Engineer

    Medasource (4.2 company rating)

    Data Engineer Job In Minnesota

    We are searching for a curious and passionate team player who is a quick learner and has the ability to pivot as needed. The ideal candidate should be a clear and personable communicator who can work closely with our business and architects. They should have experience developing proofs of concept and MVP products within a large organization and thrive in a fast-paced, agile environment. Additionally, the candidate should be capable of bringing modern solutions to the team/project and identifying necessary changes.

    Key Experience Required:
    - Experience with NoSQL databases
    - Experience with data migration projects (moving from on-prem to cloud)

    Must-Haves:
    - 3+ years of recent experience as a Data Engineer in an Azure environment (not heavy AWS)
    - Proficient in scripting with Python, as well as other languages such as PySpark, Scala, and SQL
    - Strong experience building pipelines with Azure Databricks and other Azure services, such as Azure Data Factory and Azure SQL Database
    - Experience with data integration, transformation, and orchestration frameworks such as Airflow, dbt, Fivetran, or similar
    - Clear, personable communicator able to work closely with the business
    - Experience in a large organization and ability to work in a fast-paced, agile environment

    Job Description:
    - Design, develop, and implement end-to-end data solutions using Azure Databricks and other relevant technologies
    - Build and maintain scalable data pipelines to extract, transform, and load large volumes of data from various sources
    - Collaborate with cross-functional teams to gather and analyze requirements, then translate business needs into optimized data models
    - Ensure data quality and integrity by implementing data validation and cleansing processes
    - Optimize data processing and storage for improved performance and efficiency
    - Conduct data profiling, analysis, and modeling to identify opportunities for data-driven insights and recommendations
    - Monitor and troubleshoot data pipelines and processes to ensure continuous data flow and availability
    $75k-98k yearly est. 7d ago
  • Salesforce Data Cloud Architect

    Akkodis

    Data Engineer Job In Minneapolis, MN

    We are looking for a highly motivated and skilled Salesforce Data Cloud Architect to design, develop, and optimize the Data Cloud data model and use cases. The successful candidate will work closely with cross-functional teams, including Marketing, IT, and Data Analytics, to bring dynamic content and tailored experiences to our customers.

    Key Responsibilities:
    - Lead the end-to-end implementation of Salesforce Data Cloud (CDP), including data acquisition, integration, quality assurance, and utilization.
    - Configure and implement data-driven segmentation strategies, ensuring accurate audience targeting and content delivery.
    - Design, document, and implement data models, data pipelines, and transformations to support data ingestion, integration, and enrichment within Salesforce Data Cloud.
    - Be curious and stay up to speed with the fast-paced releases coming to the platform.
    - Collaborate with IT teams to ensure seamless data integration, troubleshoot technical issues, and optimize system performance for data initiatives.
    - Integrate data from various sources, including CRM systems, databases, and third-party platforms, to support marketing and personalization efforts.
    - Provide training and support to development teams on utilizing Salesforce Data Cloud features and capabilities.

    Qualifications:
    - Proven experience with a strong focus on Salesforce Data Cloud and/or custom database solutions
    - Salesforce Data Cloud Accredited Professional certification strongly preferred
    - Strong understanding of marketing automation, data segmentation, personalized customer journeys, decisioning, and Next Best Action
    - Experience working with Data Actions and Flow integrations, data integration, and API utilization
    - Expertise in data modeling, ETL processes, data integration tools, and SQL
    - Experience with customer data platforms (CDPs) and data management practices, including data governance and compliance
    - Familiarity with cloud technologies (e.g., AWS, Azure, GCP) and data modeling/scoring technologies
    - Bachelor's degree in Computer Science, Information Technology, Marketing, or equivalent work experience

    Equal Opportunity Employer/Veterans/Disabled

    Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.

    To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit ******************************************

    The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
    · The California Fair Chance Act
    · Los Angeles City Fair Chance Ordinance
    · Los Angeles County Fair Chance Ordinance for Employers
    · San Francisco Fair Chance Ordinance
    $84k-112k yearly est. 17h ago
  • Cybersecurity Engineer (Solventum)

    Solventum

    Data Engineer Job In Maplewood, MN

    Thank you for your interest in working for our Company. Recruiting the right talent is crucial to our goals. On April 1, 2024, 3M Healthcare underwent a corporate spin-off leading to the creation of a new company named Solventum. We are still in the process of updating our Careers Page and applicant documents, which currently have 3M branding. Please bear with us. In the interim, our Privacy Policy here: *************************************************************************************** continues to apply to any personal information you submit, and the 3M-branded positions listed on our Careers Page are for Solventum positions. As it was with 3M, at Solventum all qualified applicants will receive consideration for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

    Job Description: Cybersecurity Engineer (Solventum)

    3M Health Care is now Solventum. At Solventum, we enable better, smarter, safer healthcare to improve lives. As a new company with a long legacy of creating breakthrough solutions for our customers' toughest challenges, we pioneer game-changing innovations at the intersection of health, material and data science that change patients' lives for the better while enabling healthcare professionals to perform at their best. Because people, and their wellbeing, are at the heart of every scientific advancement we pursue. We partner closely with the brightest minds in healthcare to ensure that every solution we create melds the latest technology with compassion and empathy. Because at Solventum, we never stop solving for you.

    The Impact You'll Make in this Role
    As a Cybersecurity Engineer, you will have the opportunity to tap into your curiosity and collaborate with some of the most innovative and diverse people around the world.
    Here, you will make an impact by:
    - Leading Cybersecurity Risk Management activities for Solventum's hardgood medical devices
    - Supporting Authority to Operate and Post-Market Surveillance requirements
    - Working closely with process leads to develop new Cyber-related processes
    - Identifying opportunities to improve Solventum's Cybersecurity test capabilities
    - Participating in new product introduction as well as sustaining product improvement projects to assess cyber impact

    Your Skills and Expertise
    To set you up for success in this role from day one, Solventum requires (at a minimum) the following qualifications:
    - Bachelor's Degree or higher (completed and verified prior to start) from an accredited institution AND 1 year of Cybersecurity experience in a private, public, government or military environment, OR
    - High School Diploma/GED AND 5 years of Cybersecurity experience in a private, public, government or military environment

    Additional qualifications that could help you succeed even further in this role include:
    - Master's degree in Cybersecurity from an accredited institution
    - Two (2) years of Medical Device Cybersecurity experience
    - Experience with Quality Management Systems and ISO 13485 and/or IEC 62304 design documentation requirements
    - Proven experience collaborating with laboratory, quality, regulatory, and manufacturing functions

    Work location: Remote
    Travel: May include up to 10% domestic
    Relocation Assistance: Is not authorized
    Must be legally authorized to work in country of employment without sponsorship for employment visa status (e.g., H1B status).

    Supporting Your Well-being
    Solventum offers many programs to help you live your best life - both physically and financially. To ensure competitive pay and benefits, Solventum regularly benchmarks with other companies that are comparable in size and scope.
    Applicable to US Applicants Only: The expected compensation range for this position is $95,825 - $117,120, which includes base pay plus variable incentive pay, if eligible. This range represents a good faith estimate for this position. The specific compensation offered to a candidate may vary based on factors including, but not limited to, the candidate's relevant knowledge, training, skills, work location, and/or experience. In addition, this position may be eligible for a range of benefits (e.g., Medical, Dental & Vision, Health Savings Accounts, Health Care & Dependent Care Flexible Spending Accounts, Disability Benefits, Life Insurance, Voluntary Benefits, Paid Absences and Retirement Benefits, etc.). Additional information is available at: ***************************************************************************************

    Requirements of this position include that corporate policies, procedures and security standards are complied with while performing assigned duties.

    Solventum is committed to maintaining the highest standards of integrity and professionalism in our recruitment process. Applicants must remain alert to fraudulent job postings and recruitment schemes that falsely claim to represent Solventum and seek to exploit job seekers. Please note that all email communications from Solventum regarding job opportunities with the company will be from an email with a domain *****************. Be wary of unsolicited emails or messages regarding Solventum job opportunities from emails with other email domains.

    Solventum is an equal opportunity employer. Solventum will not discriminate against any applicant for employment on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, or veteran status.

    Please note: your application may not be considered if you do not provide your education and work history, either by: 1) uploading a resume, or 2) entering the information into the application fields directly.
Solventum Global Terms of Use and Privacy Statement Carefully read these Terms of Use before using this website. Your access to and use of this website and application for a job at Solventum are conditioned on your acceptance and compliance with these terms. Please access the linked document by clicking here, select the country where you are applying for employment, and review. Before submitting your application you will be asked to confirm your agreement with the terms.
    $95.8k-117.1k yearly 17d ago
  • Sr. Stibo Developer

    Cognizant (4.6 company rating)

    Data Engineer Job In Stillwater, MN

    This position is open to any qualified applicant in the United States.

    We are Cognizant Artificial Intelligence. Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. However, clients need new business models built from analyzing customers and business operations at every angle to really understand them. With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models to enterprise scale within weeks.

    * You must be legally authorized to work in the United States without the need of employer sponsorship, now or at any time in the future *

    Job Title: Sr. Stibo Developer (Remote)

    Roles and responsibilities:
    - Work as a Stibo Sr. Developer responsible for carrying out all activities to ensure the optimal delivery of the STEP solution using standard methodologies.
    - Configure, build, and unit test STEP components, including data model, user roles, workflows, inbound and outbound interfaces, import/export, and portals.
    - Build and unit test data migration components.
    - Define technical specifications for workflows and business rules.
    - Prepare detailed design documents for data migration.
    - Guide and support junior team members in implementing STEP activities using standard methodologies.
    - Support integration, system, and user acceptance testing.
    Required Qualifications:
    - At least one full-cycle implementation of Stibo Systems STEP MDM in a leadership role; additional MDM/PIM implementation experience a plus
    - 3-5 years of relevant Stibo development work experience and 5-8 years of IT industry experience
    - Effective interpersonal, communication, and team-facilitation skills
    - Strong analytical and problem-solving skills
    - Experienced technical skills in software development (Java and/or JavaScript), data and database design, SQL, XML, and Web services
    - Bachelor's degree or equivalent experience in computer science or a related major

    Salary and Other Compensation:
    Applications will be accepted until December 01, 2024. The annual salary for this position is between $133,000.00 - $156,000.00, depending on experience and other qualifications of the successful candidate. This position is also eligible for Cognizant's discretionary annual incentive program and stock awards, based on performance and subject to the terms of Cognizant's applicable plans.

    Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
    - Medical/Dental/Vision/Life Insurance
    - Paid holidays plus Paid Time Off
    - 401(k) plan and contributions
    - Long-term/Short-term Disability
    - Paid Parental Leave
    - Employee Stock Purchase Plan

    Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.

    Cognizant is an Equal Opportunity Employer M/F/D/V. Cognizant is committed to ensuring that all current and prospective associates are afforded equal opportunities and treatment and a work environment free of harassment. Cognizant is recognized as a Military Friendly Employer and is a coalition member of the Veteran Jobs Mission.
    Our Cognizant Veterans Network assists veterans in building and growing a career at Cognizant that allows them to leverage the leadership, loyalty, integrity, and commitment to excellence instilled in them through participation in military service.
    $133k-156k yearly 17d ago
  • Data Analytics Engineer

    Datadrive

    Data Engineer Job In Minneapolis, MN

Job Description: DataDrive is looking for a Data Analytics Engineer who loves automation, building data pipelines with modern technologies, and creatively exploring the art of the possible with data and analytics in the cloud! The primary responsibility will be supporting our managed analytics service clients through their data-driven journey and delivering measurable business value through data modeling, API integration, SQL scripting, and data pipeline development. The data analytics engineer will bridge the important gap between data applications and insightful business reports.

DataDrive has a fun culture that celebrates individual differences. We believe in and are committed to creating a diverse, equitable, and inclusive workplace. We value those who embrace adventure, subscribe to life-long learning, and take ownership and pride in their work. In our professional-yet-relaxed environment, we want employees who enjoy what they do, seek balance outside of work, and constantly strive to help us grow by helping our clients grow their analytics capabilities.

This individual would:
· Participate in building our data platform from the ground up by exploring new technologies & vendors within our cloud-first environment
· Embrace an incremental learning and improvement mindset, always exploring how to do things better
· Thrive in an ambiguous environment and be a well-organized, creative, and strong listener
· Be collaborative, with an optimistic style of problem-solving, and relentlessly seek opportunities to go the extra mile for others
· Align with our closely-held company values

In a growing small company rooted in analytics, change is the only constant. A candidate for this role should be adaptable and excited about the opportunity to ‘wear many hats’, as every member plays an influential role in making DataDrive an amazing place to work and grow together. We want you to embrace the adventure! We are unable to sponsor visas at this time.
As part of day-to-day responsibilities, the Data Analytics Engineer will:
· Monitor, optimize, and integrate batch data source pipelines into Snowflake databases using ELT vendors (e.g., Fivetran, Airbyte) and Python integrations with databases and APIs
· Build and maintain dbt pipelines to transform source data into reporting data models for our visualization team
· Set up easy-to-use data sources in Tableau and build out Tableau dashboards following internal design standards
· Code orchestration pipelines leveraging Python to connect data ingestion, Snowflake, dbt, and Tableau
· Design and implement DataOps practices to make our lives easier using GitHub Actions, dbt testing, and/or data observability/monitoring tools
· Identify where we can incrementally improve our platform and build the automation to accomplish it
· Stay informed on the constantly evolving cloud warehousing and ETL ecosystems
· Commit to strong application SDLC concepts during development
· Other duties as assigned, including Tableau-related dashboard development

Ideal Background for the Role:
· 4+ years of professional experience in building cloud data pipelines and data warehousing within a cloud data warehouse (ideally Snowflake)
· High proficiency and experience working with SQL and Python
· Experience in designing data models to support reporting and business outcomes
· Experience with, or a strong desire to learn, AWS, Snowflake, dbt, and Tableau
· Familiarity with Prefect/Airflow, Fivetran, Tableau, Terraform, and GitHub Actions

We recognize that there is no such thing as a perfect candidate. We embrace professional and personal growth - so however you identify and no matter what your experience level, background, or education is, please apply if this role would make you excited to come to work every day!

Anticipated Schedule & Workplace Expectations: DataDrive is a remote-first company that embraces a flexible, virtual workplace.
We expect employees to be generally available for meetings, collaboration, client communication, and support Monday - Friday during normal business hours (9a - 4p CST). To maintain connection in a virtual world with our team and clients, employees are expected to:
· maintain regular access to high-speed Internet connections
· turn on video/webcam for the majority of meetings
· dress appropriately for their day (e.g., business casual for external audiences)

About DataDrive: Founded in 2017, DataDrive is a fast-growing managed analytics service provider that not only provides modern cloud analytics data platforms to data-driven organizations, but also supports ongoing training, adoption, and growth of our clients’ data cultures. We seek to elevate people and organizations by building strong data cultures and platforms. DataDrive offers a unique team-oriented environment where one can develop their skills and work directly with some of the most talented analytics professionals in the business. We are a connected group who work hard, live well, care for others, & celebrate as a team. We help our clients and ourselves win by showing up with the following values:
· Embrace the Adventure
· Win as a Team
· Grow through Curiosity
· Own the Outcome

DataDrive has been recognized as a ‘Best Place to Work’ by the Minneapolis Business Journal. Powered by JazzHR
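The orchestration responsibility described in this posting, chaining ingestion, warehouse transformation, and dashboard refresh in dependency order, can be sketched with only the Python standard library. The step names and placeholder functions below are illustrative assumptions, not DataDrive's actual stack; a real deployment would call Fivetran, dbt, and Tableau APIs (or use Prefect/Airflow) instead.

```python
# Minimal stdlib-only dependency runner: execute pipeline steps in
# topological order (ingestion -> transform -> publish).
from graphlib import TopologicalSorter

# Hypothetical step functions standing in for real ingestion/dbt/Tableau calls.
def ingest_orders():      return "orders_raw"
def ingest_customers():   return "customers_raw"
def dbt_transform():      return "reporting_models"
def refresh_dashboards(): return "dashboards_refreshed"

# Each step maps to (callable, set of upstream step names).
STEPS = {
    "ingest_orders": (ingest_orders, set()),
    "ingest_customers": (ingest_customers, set()),
    "dbt_transform": (dbt_transform, {"ingest_orders", "ingest_customers"}),
    "refresh_dashboards": (refresh_dashboards, {"dbt_transform"}),
}

def run_pipeline(steps):
    """Execute steps in dependency order; return an execution log."""
    order = TopologicalSorter({name: deps for name, (_, deps) in steps.items()})
    log = []
    for name in order.static_order():  # predecessors always come first
        fn, _ = steps[name]
        log.append((name, fn()))
    return log
```

In practice an orchestrator like Prefect or Airflow provides this scheduling plus retries and observability; the sketch only shows the dependency-ordering idea.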
    $75k-99k yearly est. 19d ago
  • Senior Data Engineer

    Asset Marketing Services, LLC

    Data Engineer Job In Saint Paul, MN

Job Description: Asset Marketing Services, LLC (AMS) is a leading direct-to-consumer collector platform, focusing on high-end collectibles and bullion. We offer a curated selection that ranges from ancient artifacts to vintage and modern pieces. With our concierge service, we place sought-after pieces directly into the hands of collectors. Our close work with major world mints also allows us to deliver desirable coins to our customers quickly, while developing new and exclusive products. Our company vision is to bring the obsession with beauty, scarcity, and value to life for our customers. Our customers' passion fuels our purpose as a company. We're on the lookout for talented, motivated individuals to take our team to the next level. Everything we do is personal, motivated by the desire to inspire our customers and each other. If that sounds like you, a career with AMS may be in your future. For more information about AMS, please visit our company website: *****************

Position Overview: AMS is a forward-thinking organization dedicated to leveraging data-driven insights to drive business success. We pride ourselves on our innovative approach to data management and analytics, empowering our users with robust data solutions. We are seeking an experienced Senior Data Engineer to join our team in designing, developing, and maintaining our data architecture. As a Senior Data Engineer, you will play a critical, hands-on role in designing, creating, implementing, and managing our data architecture, movement, transformation, and infrastructure across various platforms. This position is pivotal in ensuring the reliability, scalability, integrity, and security of data while collaborating closely with data scientists, analysts, and other key stakeholders to understand business requirements and translate them into data solutions that ensure high data quality, performance, and availability.
Your experience is crucial in defining how and where data is stored, consumed, integrated, and managed across different data management technologies, as well as in building robust data pipelines and solutions that support our business objectives. If you are passionate about engineering and managing data solutions that drive business value and innovation, we invite you to join our team as a Senior Data Engineer.

Essential Functions, Duties & Accountabilities:
· Design, build, and maintain scalable and efficient data pipelines for collecting, processing, transforming, consuming, and storing large datasets
· Design and develop data solutions using Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and other Azure services
· Implement and manage data storage solutions, including data lakes, data warehouses, and databases
· Build and maintain an enterprise data model that represents the data needs of the organization
· Implement data management practices to ensure data quality, accuracy, consistency, and availability, including data cleansing, normalization, and validation techniques
· Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions
· Document data architecture and maintain data catalogs and data lineage
· Stay up to date with emerging trends and technologies in data architecture and data management
· Monitor and troubleshoot data systems to ensure reliability, performance, and cost optimization
· Continuously evaluate and improve existing data infrastructure and processes for efficiency and scalability
· Lead and mentor data engineers, helping to prioritize daily activities, providing technical guidance, and fostering a culture of continuous learning and improvement

Additional Functions, Duties & Accountabilities:
· Serve as a member of the Technology team and support others as necessary
· Maintain flexibility in work schedule to meet business needs
· Actively and positively contribute to the overall success of the company through regular contributions to the general management of the company
· Participate and provide leadership in inter-departmental meetings, projects, and informal activities
· Support the company and other departments wherever and whenever your skills and experience would be beneficial

Educational Requirements: Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field; alternately, equivalent experience in a relevant role.

Required Experience and Skills:
· 5-7 years of experience in a Data Engineering role
· Hands-on ETL pipeline design and development
· Hands-on data modeling design and implementation experience
· Strong knowledge of database management systems (e.g., SQL Server, Oracle, MySQL)
· Proficiency in programming languages such as SQL, Python, and Spark
· Hands-on experience with Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and other Azure data services
· Hands-on DataOps techniques to monitor data pipelines and systems, ensuring reliability and performance and identifying and resolving issues promptly
· Strong experience with data governance, data quality, and data security best practices
· Familiarity with market trends and AI concepts
· Experience with DevOps practices, agile methodologies, and collaboration tools
· Certifications in relevant technologies, such as Microsoft Certified: Azure Data Engineer or other industry-standard certifications, would be beneficial
· Excellent communication, problem-solving, and analytical skills
· Ability to work independently, prioritize tasks, and adapt to evolving requirements
· Passion for technology innovation, continuous learning, and driving business success through technological excellence

Benefits: AMS believes that investing in our employees is critical to having a successful company. We offer competitive compensation and benefit programs.
Our Benefit programs include: Medical, Dental, Vision, HSA & FSA Plans, Accidental & Critical Illness, Life Insurance, Disability, AD&D, 401(k) Employer Match, Quarterly Profit Distribution, Paid Holidays and 16 Paid days of Personal Time Off. Closing AMS is an equal opportunity employer and is committed to hiring practices and a workplace environment that provides equal opportunities for all without regard to race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, marital status, pregnancy, or any other characteristics protected by federal and state law. Asset Marketing Services, LLC participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S. See E-Verify's official poster at ********************************************** Contents/E-Verify_Participation_Poster_ES.pdf
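The cleansing and validation duties this posting lists (data cleansing, normalization, and validation techniques) can be illustrated with a small stdlib-only sketch. The field names and quality rules below are hypothetical examples, not AMS's actual schema or pipeline code.

```python
# Sketch of a cleansing + validation step: normalize records, then
# separate rows that pass simple quality rules from rejected rows.

def clean_record(rec):
    """Normalize string fields: trim whitespace, lowercase."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in rec.items()}

def validate(records, required=("id", "email"), unique_key="id"):
    """Return (valid, rejected) after null, format, and duplicate checks."""
    seen, valid, rejected = set(), [], []
    for rec in map(clean_record, records):
        if any(not rec.get(f) for f in required):
            rejected.append((rec, "missing required field"))
        elif "@" not in rec.get("email", ""):
            rejected.append((rec, "malformed email"))
        elif rec[unique_key] in seen:
            rejected.append((rec, "duplicate key"))
        else:
            seen.add(rec[unique_key])
            valid.append(rec)
    return valid, rejected
```

In an Azure pipeline the same checks would typically run as Databricks/Spark transformations or Data Factory validation activities; the sketch only shows the rule structure.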
    $75k-99k yearly est. 7d ago
  • Data Engineer Local Candidates only

    Ip Corporation 4.1company rating

    Data Engineer Job In Saint Paul, MN

Job Posting - Data Engineer (Local Candidates Only)

About Us: IP Corporation is an industry leader in the manufacturing and distribution of composites. We are a company committed to leveraging data to drive strategic decisions and enhance our operations. We are looking for a talented Data Engineer to join our team and help us build robust data solutions that power our business insights.

Job Description: As a Data Engineer at IP Corporation, you will be responsible for designing, building, and maintaining data pipelines that support our data-driven initiatives. You will work closely with technical and business stakeholders across the enterprise to ensure the efficient, reliable, and secure processing of data, enabling all corners of the organization to leverage it.

Required Qualifications:
· Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience
· 3-5 years of experience in data engineering
· Strong SQL skills and experience with relational databases
· Proficiency in ETL tools and frameworks
· Programming experience in languages supporting data operations
· Familiarity with cloud platforms for data operations
· Deep knowledge of data warehousing concepts
· Experience with data architecture best practices and data modeling for OLTP and data warehouse/lakehouse/BI applications
· Experience with managing data catalogs, business glossaries, and data lineage
· A demonstrated ability to deliver in a complex business environment

Preferred Qualifications:
· Experience with Microsoft Fabric
· Experience with SQL Server
· Experience with REST APIs
· Experience with data visualization tools (Power BI, Tableau, etc.)
· Knowledge of data modeling
· Familiarity with DevOps practices
· Understanding of data governance and security principles

Primary Responsibilities:
· Play a pivotal role in evolving our data architecture to support data-intensive initiatives and technologies
· Develop and maintain ETL processes to integrate data from various sources into our data management architectures
· Optimize data pipelines for performance and reliability
· Ensure data accuracy and integrity through data cleansing and validation
· Monitor and troubleshoot data workflows
· Implement data governance and security practices
· Collaborate with cross-functional teams to understand data requirements and deliver solutions
· Document data processes and architecture
· Stay updated with industry trends and best practices in data engineering
    $69k-92k yearly est. 27d ago
  • DevOps Engineer

    Kigo

    Data Engineer Job In Saint Paul, MN

    Description: Kigo is paving a new path forward for customer loyalty and digital advertising. Kigo’s innovative platform connects advertisers with high-value customers through some of the world’s leading loyalty and rewards programs. As a subsidiary of Augeo, a leading global loyalty platform and services company, we leverage decades of industry expertise supporting Fortune 500 brands and their customers and employees across the globe. Kigo’s expanding loyalty network allows brands to reach millions of customers with proven engagement, spending power, and brand affinities—creating new revenue streams for loyalty programs while fostering deeper member engagement. Join Kigo and step into the future of loyalty and digital advertising! Here, your work will have an impact, driving meaningful results and shaping the loyalty experience of tomorrow. Job Description: As a DevOps Engineer at Kigo, you will play a pivotal role in managing and optimizing our deployment pipelines and infrastructure across multiple platforms. Primarily working within AWS, you will architect robust, reliable, and maintainable solutions for our applications. Your ability to identify opportunities for automation will be essential in streamlining processes and saving time. You will collaborate closely with developers and managed service providers to devise optimal solutions that meet the evolving needs of our innovative platform. This role also involves integrating and maintaining existing codebases, requiring strong problem-solving skills and adaptability. 
Key Responsibilities:
· Design and implement scalable, secure, and cost-effective infrastructure solutions for our applications and services
· Create and maintain deployment pipelines for our codebases
· Establish best practices for infrastructure monitoring, updating, patching, security, and compliance
· Stay up-to-date with the latest trends and technologies in cloud computing
· Troubleshoot and resolve infrastructure and deployment issues
· Document processes, configurations, and best practices
· Participate in on-call rotation to support critical infrastructure
· Collaborate with development and operations teams to ensure SLAs are met
· Collaborate with internal IT teams to ensure the smooth operation of networking, domain and DNS management, access management, and other technology systems used by Kigo

What you need to be successful in this role:
· 5+ years of experience in DevOps or similar technical roles
· Strong experience with AWS services including EC2, ECS, EKS, S3, CloudFront, Route 53, Lambda, RDS, and VPC
· Experience with CI/CD platforms, specifically GitHub Actions, AWS CodePipeline, and Azure DevOps
· Proficiency with infrastructure-as-code tools (e.g., Terraform, AWS CloudFormation)
· Knowledge of container orchestration using Kubernetes, particularly OpenShift, ArgoCD, and Helm
· Experience working with managed service providers and third-party vendors
· Familiarity with DNS management and networking concepts
· Strong scripting skills (Python, Bash, or similar)
· Experience with monitoring and observability tools
· Knowledge of security best practices and compliance requirements
· Strong problem-solving and analytical skills
· Excellent communication and collaboration abilities
· Ability to manage multiple priorities in a fast-paced environment

Benefits of joining our team:
· Competitive salary
· Comprehensive benefits package
· Opportunity to work at the nexus of technology, marketing, and loyalty
· Continuous learning and development opportunities

Kigo is an equal opportunity employer committed to diversity and inclusion in the workplace.
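The monitoring and troubleshooting duties above can be illustrated with a minimal health-check pattern: probe a service with exponential backoff before declaring it unhealthy. This is a generic stdlib-only sketch, not Kigo's tooling; the probe callable is injected so the logic is testable, and a real check would wrap `urllib.request` against the service URL.

```python
# Health-check with exponential backoff between retries.
import time

def check_with_backoff(probe, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Return True as soon as probe() succeeds; back off between tries."""
    for attempt in range(attempts):
        try:
            if probe():
                return True
        except OSError:
            pass  # treat connection errors like a failed probe
        if attempt < attempts - 1:
            sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return False
```

Dedicated observability tools add alerting and dashboards on top of this basic probe-and-retry loop.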
    $82k-107k yearly est. 29d ago
  • Software Engineer

    Lexisnexis Risk Solutions 4.6company rating

    Data Engineer Job In Minnesota

Must be able to work in our St. Cloud, Minnesota office on a weekly basis. This is a full-time role, and we will be unable to transfer H-1Bs at this time.

LexisNexis Risk Solutions provides customers with solutions and decision tools that combine public and industry-specific content with advanced technology and analytics to assist them in evaluating and predicting risk and enhancing operational efficiency. We use the power of data and advanced analytics to help our customers make better, timelier decisions. By bringing clarity to information, we ultimately help make communities safer, insurance rates more accurate, commerce more transparent, business decisions easier, and processes more efficient. You can learn more about LexisNexis Risk at the link below: ****************************

About our Team: This technology team collaborates to reduce risks and create opportunities for customers in more than 100 countries. We're adaptable, curious, and ambitious. That's why here, you'll have the freedom to drive change, the trust to find your own path, and the space to explore more.

About the Role: You will work in all phases of project delivery, from initial design and code development to application testing and delivery to the production environments. We utilize ECL (Enterprise Control Language) & KEL (Knowledge Engineering Language), proprietary languages; you should be open to learning these.

Responsibilities:
· Designing, coding, and documenting automated test cases within a defined framework to ensure the quality of our product
· Under the guidance of senior-level engineers, successfully implementing development processes, coding best practices, and code reviews
· Operating in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders
· Writing and reviewing portions of detailed specifications for the development of system components of simple complexity
· Resolving basic technical issues, as necessary
· Keeping abreast of new technology developments
· Completing simple bug fixes
· Writing unit tests and other dev tests
· Learning the codebase and improving your coding skills
· Maintaining flexibility to react quickly to changes in priorities or circumstances to meet the needs of the business

Requirements:
· Basic knowledge of software development methodologies (e.g., Agile, Waterfall)
· Proficiency with data manipulation languages and modeling principles
· Willingness to learn ECL (Enterprise Control Language) & KEL (Knowledge Engineering Language)
· Knowledge of data storage subsystems
· Experience with development languages including but not limited to: Object-Oriented Development, Java/J2EE, HTML, XML, SQL
· Familiarity with development in a Windows environment
· A basic understanding of the Microsoft Azure cloud platform
· Knowledge of test-driven development
· Knowledge of source-code control and configuration management (GitLab/GitHub)

LexisNexis Risk Solutions is supportive of women in technology and was a founding signatory of the Tech Talent Charter. We have the following initiatives in place to support women in technology:
· A mentoring scheme for women in technology
· A women's network forum; we regularly run events for schools about careers in technology to inspire the next generation of girls in technology

Learn more about the LexisNexis Risk team and how we work here
    $73k-97k yearly est. 12d ago
  • DevSecOps Engineer

    Seneca Resources 4.6company rating

    Data Engineer Job In Eagan, MN

Hello,

Job Title: (CI/CD) Cloud Systems Engineer III (Contract)

Roles and Responsibilities (include but are not limited to):
· Suppliers must provide an expert CI/CD engineer resource in support of Postal DevSecOps initiatives for cloud and on-prem infrastructure platforms and applications. A minimum of 8-12 years' experience in technology or software development is preferred.
· The CI/CD engineer is responsible for designing, building, testing, and maintaining the continuous integration and continuous delivery pipelines for software development.
· Write automations and workflows utilizing GitHub Actions to integrate different components of the full CI/CD pipeline, such as Terraform, Trivy, MegaLinter, and ServiceNow.
· Utilize tools like GitHub, GitHub Actions, ArgoCD, and other CI/CD technologies to develop and manage pipelines.
· Develop infrastructure design patterns and deploy infrastructure as code with tools like Terraform and Ansible.
· Design and lead implementation of solutions in Cloud Service Providers (CSPs) like AWS, GCP, and Azure, as well as on-prem.
· Utilize container registries such as Artifactory, GHCR, and CSP registries, and support base container images.
· Support Kubernetes clusters, including AKS, GKE, EKS, ROSA & Rancher.
· Collaborate with development and operations teams to understand their needs and create a pipeline that meets the requirements of their key areas.
· Advise and oversee process improvement initiatives and IT requirements related to all aspects of applications' CI/CD pipelines.
· Automate security processes, such as vulnerability scanning, testing, and monitoring.
· Troubleshoot and resolve incidents, implement changes, and act as an escalation resource for the Operations team.
· Mentor and transfer knowledge to less experienced team members.
· Document processes and procedures.
· Prioritize and schedule projects/tasks at the direction of Postal leadership.
· Typically performs all functional duties independently.
Required Certifications, if any: GitHub Certification or Cloud Certification (AWS, GCP, or Azure) preferred

Educational Requirements: A degree from an accredited college/university in the applicable field of services is required. If the individual's degree is not in the applicable field, then four additional years of related experience is required.

Description: Infrastructure Design is a group within NIT that provides overall design for infrastructure delivery and interoperability for the 200+ technologies, 24,000 servers, and cloud services in use today to support applications. It provides expertise and design solutions, based on CIO architectures and strategies, for the architectural layouts of systems, and ensures that development teams, Endpoint teams, CISO teams, and others have secure, performant, and available cloud and on-premises platforms and services at their disposal. The role is highly specialized in one or more phases of software systems development, systems integration, or network engineering, and provides technical assistance and advice on complex activities. The engineer formulates/defines specifications and develops/modifies/maintains complex systems and subsystems, using vendor engineering releases and utilities, for overall operational systems; applies analytical techniques when gathering information from users, defining work problems, designing technology solutions, and developing procedures to resolve the problems; and develops complete specifications to enable computer programmers to prepare required programs. The engineer analyzes methods of approach; reviews task proposal requirements, gathers information, analyzes data, prepares project synopses, compares alternatives, prepares specifications, resolves processing problems, coordinates work with programmers and engineers, and orients users to new systems. The role works with considerable freedom to make decisions on the techniques and approaches to be used, and prepares recommendations for system improvement for management and user consideration.
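One of the responsibilities above, automating vulnerability scanning in the pipeline, can be sketched as a small CI gate. The report shape below is loosely modeled on Trivy's JSON output (`Results` / `Vulnerabilities` / `Severity`); treat the exact fields as an assumption, and note that a real pipeline would run this against the scanner's actual report file.

```python
# CI security gate: fail the build when a scan report contains findings
# at or above a severity threshold.
import json

SEVERITY_RANK = {"LOW": 0, "MEDIUM": 1, "HIGH": 2, "CRITICAL": 3}

def gate(report_json, threshold="HIGH"):
    """Return finding IDs at/above threshold; an empty list means pass."""
    report = json.loads(report_json)
    floor = SEVERITY_RANK[threshold]
    return [v["VulnerabilityID"]
            for result in report.get("Results", [])
            for v in result.get("Vulnerabilities", [])
            if SEVERITY_RANK.get(v.get("Severity", "LOW"), 0) >= floor]
```

In a GitHub Actions workflow this would run as a step after the scan job, with a non-empty return value failing the pipeline.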
About Seneca Resources: Seneca Resources is a client driven provider of strategic Information Technology consulting services and Workforce Solutions to government and industry. Seneca Resources is a leading IT services provider with offices in Virginia, Alabama, Georgia, Florida and Texas that service clients throughout the United States. We are an Equal Opportunity Employer and value the benefits of diversity in our workplace.
    $76k-108k yearly est. 7d ago
  • Senior DevOps Engineer

    Idexcel 4.5company rating

    Data Engineer Job In Eagan, MN

Infrastructure Design is a group within NIT that provides overall design for infrastructure delivery and interoperability for the 200+ technologies, 24,000 servers, and cloud services in use today to support applications. This team provides expertise and design solutions based on CIO Architectures and Strategies, creating architectural layouts for systems. They ensure development teams, Endpoint teams, CISO teams, and others have secure, performant, and available cloud and on-premises platforms and services.

The role requires specialization in one or more phases of software systems development, systems integration, or network engineering. It involves technical assistance and advice on complex activities, formulating and defining specifications, and developing, modifying, and maintaining complex systems and subsystems using vendor engineering releases and utilities. The engineer will apply analytical techniques to gather information, define work problems, design technology solutions, and develop procedures to resolve problems. Responsibilities include creating complete specifications for programmers, analyzing methods, reviewing task proposal requirements, preparing project synopses, and coordinating work across teams. The role allows significant freedom to make decisions on techniques and approaches.

Roles and Responsibilities (include but are not limited to):
· Design, build, test, and maintain continuous integration and delivery (CI/CD) pipelines for software development
· Write automations and workflows using GitHub Actions to integrate CI/CD pipeline components like Terraform, Trivy, MegaLinter, and ServiceNow
· Utilize tools like GitHub, GitHub Actions, ArgoCD, and other CI/CD technologies for pipeline management
· Develop infrastructure design patterns and deploy infrastructure as code with tools like Terraform and Ansible
· Design and implement solutions for Cloud Service Providers (CSPs) such as AWS, GCP, and Azure, and for on-premises systems
· Manage container registries like Artifactory, GHCR, and CSP registries, and support base container images
· Support Kubernetes clusters, including AKS, GKE, EKS, ROSA, and Rancher
· Collaborate with development and operations teams to meet pipeline requirements
· Automate security processes, such as vulnerability scanning, testing, and monitoring
· Troubleshoot incidents, implement changes, and provide escalation support for operations
· Mentor team members and document processes and procedures
· Prioritize and schedule tasks per Postal leadership direction

Required Skills and Experience:
· 8-12 years of experience in technology or software development
· Expertise in CI/CD engineering, including infrastructure as code and pipeline automation
· Experience with tools like Terraform, Ansible, GitHub, ArgoCD, and Kubernetes
· Strong understanding of cloud services (AWS, GCP, Azure) and on-premises systems
· Familiarity with container technologies, registries, and CI/CD security practices

Certifications (any one, active):
· GitHub Certification
· Cloud Certification (AWS, GCP, or Azure)

Educational Requirements: A degree from an accredited college/university in the applicable field is required. If the degree is not in the applicable field, an additional four years of related experience is required.

Additional Provisions:
· Must be able to obtain a Position of Public Trust clearance
· Must pass client-mandated clearance processes, including drug screening, criminal history check, and credit check
· Continued employment is contingent on receiving sensitive clearance if given an interim clearance
· Must be a U.S. citizen or have permanent resident status (Green Card)
· Candidate must have resided in the U.S. for the past five years
· No more than six months of travel outside the U.S. within the last five years (military service excluded)
    $81k-104k yearly est. 17h ago
  • Application Engineer- Automation and Drives

    KEB America 3.4company rating

    Data Engineer Job In Shakopee, MN

Job Description: Application Engineer - Automation and Drives

The KEB team is growing again! This position is for engineers who want to work directly with customers to solve problems while gaining technical sales experience. This role involves the application of automation software, PLCs, Variable Frequency Drives (VFDs), and AC motors to automated systems in collaboration with potential OEM and system integrator customers. Additionally, you will gain technical sales experience by providing sales quotes and forecasts to our customers to further grow and expand our business. This position may require some travel within the USA and Canada (approx. up to 20% of the time).

Why KEB America? KEB America is a German-based, mid-sized industrial automation company serving the North American market (USA, Canada, and Mexico). KEB specializes in highly sophisticated technical industries such as elevators, medical, packaging, theatre, and plastics. With a recent building expansion, full manufacturing, engineering, R&D, and support capabilities are available at the Shakopee, MN location. KEB America manufactures configurable industrial automation products like electromagnetic clutches and brakes, integral gearmotors, drives, HMIs, and more. With decades of experience working directly with OEMs, KEB has extensive knowledge and can solve the most challenging automated applications.
Requirements: Engineering requirements of the job will include: · Write relevant technical documentation & manuals · Draft new product specifications and engineering work orders · Visit customer sites for product start-ups · Size, select, and recommend KEB products to customers · Offer technical support and troubleshooting over the phone Technical sales requirements of the job will include: · Provide excellent customer support · Write relevant technical documentation & manuals · Create product quotations · Create sales presentations and other marketing material · Share application knowledge through writing and speaking · Provide product training in-house and at customer sites For more information on who we are and what we do, visit our website: ***********************************
    $70k-94k yearly est. 34d ago
  • DevOps Engineer

    Vail Resorts 4.0company rating

    Data Engineer Job In Minneota, MN

Job Description Our mission is to create the Experience of a Lifetime for our employees, so they can, in turn, create the Experience of a Lifetime for our guests. We own and operate the most renowned destination resorts in the world as well as regional and local ski areas outside major cities, and connect them all through one unrivaled network. We are looking for ambitious leaders, innovators and creators to join our talented team. If you’re ready to pursue your fullest potential, we want to get to know you! Candidates for year-round positions are reviewed on a rolling basis. Applications will be accepted up to 90 days after the posting date, or until the position is filled (whichever is first). Job Summary: Do you enjoy working cross-functionally, automating everything, and driving continuous improvement? Are you looking to join a group of like-minded collaborators driving transformations and delivering value for the team, our partners, and the organization? If you answered yes, then a role as a DevOps Infrastructure Engineer on our Application Infrastructure team may be right for you. Our DevOps engineers provide application support for numerous web applications, systems, and integrations, supporting various IT application development teams and business representatives on industry-leading projects, enhancements, and maintenance. Job Specifications: Outlet: Corporate The budgeted range is $103,596 - $126,646 + annual bonus. 
Actual pay will be adjusted based on experience Shift & Schedule Availability: Full Time, Year Round Other Specifics: Hybrid - Remote Job Responsibilities: Works with development teams and business representatives as part of the delivery team to support current business objectives Discusses, analyzes, reviews, and resolves usability issues in conjunction with development teams Recommends solutions by defining architecture, functional capabilities, system security and environmental fit Implements approved modifications to systems, integrations or software Adheres to and supports regulatory compliance and security standards (PCI, SOX, CIS, CCPA) Reviews, plans, implements and communicates code deployments across the enterprise Maintains and upgrades fleets of systems to meet compliance, performance and feature requirements Configures monitoring/alerting and responds to alerts as required by SLAs Collaborates on researching, planning, and implementing new technologies to support the overall infrastructure group and the application teams' SDLCs Works with other engineers and developers to deliver POCs of proposed solutions Maintains web application firewalls, content delivery networks, container infrastructure, application gateways, and load balancers Facilitates application pen testing, vulnerability remediation and implementation of compensating controls System administration of application servers (Microsoft and Red Hat) Performs software upgrades Performs vulnerability remediation and patching Provides occasional after-hours support and holiday coverage as part of an on-call rotation Creates and maintains documents, scripts and configurations on a collaborative platform Assists development teams in establishing CI/CD pipelines and other DevOps operations Automates processes via scripting and infrastructure as code Performs other related duties as assigned Job Skills: Excellent collaborative, analytical and problem-solving capabilities Systems administration in Windows 
and Linux (hardening, configuration, and patching) Thorough understanding of hosting web applications (IIS, .NET, Sitecore, Umbraco, Node.js) Experience with the following: Containers (Kubernetes, Docker) PKI management and certificate provisioning Network load balancers (F5, Azure, Traefik) Caching, CDNs, WAFs, Cloud Hosting (Redis, Azure, Akamai, F5) CI/CD and Configuration Management tools (Bamboo, GitHub, Octopus, Puppet, ProGet, Azure DevOps, Ansible, Terraform) Familiarity with the following desired: Regulatory compliance (PCI, SOX, CCPA, GDPR) Networking and firewall technology Application troubleshooting Security operations Agile and code deployment methodologies PowerShell, Bash shell, or Perl scripting Consuming APIs Job Experience: Bachelor’s degree in IS, Engineering, Computing, or related field required; or equivalent work experience. 3+ years of experience with the hard skills listed above supporting an enterprise technology environment Business Analyst, Development, and/or Project Management experience a plus The expected Total Compensation for this role is $103,596 - $126,646 + annual bonus. Individual compensation decisions are based on a variety of factors. Job Benefits Ski/Mountain Perks! Free passes for employees, employee discounted lift tickets for friends and family AND free ski lessons MORE employee discounts on lodging, food, gear, and mountain shuttles 401(k) Retirement Plan Employee Assistance Program Excellent training and professional development Full Time roles are eligible for the above, plus: Health Insurance; Medical Insurance, Dental Insurance, and Vision Insurance plans (for eligible seasonal employees after working 500 hours) Free ski passes for dependents Critical Illness and Accident plans Vail Resorts offers a ‘Hybrid’ work environment where employees living within 50 miles of the Broomfield office work on-site Tuesday, Wednesday, Thursday and have flexibility to work off-site on Mondays and Fridays. 
Employees living outside of a commutable distance can work remotely from British Columbia, Washington D.C., and the 16 U.S. states* in which we currently operate. This includes: California, Colorado, Indiana, Michigan, Minnesota, Missouri, New Hampshire, New York, Nevada, Ohio, Pennsylvania, Utah, Vermont, Washington State, Wisconsin, and Wyoming. Please note that the ability to work in person or off-site, and the particulars related to such work, are subject to change at any time; and, accordingly, the Company reserves the right to change its policies and/or require in-person/in-office work or off-site work at any time in its sole discretion. Vail Resorts is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status or any other status protected by applicable law. Requisition ID 506072 Reference Date: 11/18/2024 Job Code Function: APP
    $103.6k-126.6k yearly 11d ago
  • Senior Software Engineer-OpenGL

    Raas Infotek 4.1company rating

    Data Engineer Job In Saint Paul, MN

Title: Senior Software Engineer - OpenGL Below are the customer's expectations: Main Focus on OpenGL Implementation: The primary requirement is strong OpenGL experience, including optimizing the application. Algorithm and Data Structure Optimization: Experience optimizing algorithms and data structures, with some experience assessing the feasibility of image renderings. C++ Fundamentals: A strong grasp of C++ fundamentals is required. High-Level Qt Experience Job Description: Mandatory Skills Expertise in computational geometry algorithms Expertise in developing multithreaded real-time applications Excellent analytical and mathematical skills Strong experience in design and implementation of cutting-edge graphics techniques and detailed knowledge of graphics hardware, including OpenGL shader language, CUDA, Nvidia GPU programming, OpenGL 2D/3D texture mapping, CPU/GPU performance profiling and characterization, and other general stream programming techniques Hands-on experience in Object-Oriented Design, C++, and Qt Programming experience on the Linux platform Strong verbal and written communication skills, with the ability to communicate effectively at multiple levels in the organization. Desired Skills Experience in the gaming domain using high-end GPU systems Experience creating and managing requirements and translating them into effective architectures and software designs An understanding of the requirements for, and experience in, medical device development Experience in software development through the full product life cycle Thanks & Regards, Trayambkeshwer Dwivedi (Trayam), Sr. Technical Recruiter Raas infotek corporation 262 Chapman road, Suite 105A, Newark, DE-19702 Direct number: ********** | 132 Text Now: ************** Email: **************************************
    $90k-117k yearly est. 17h ago
  • Senior React Developer

    Spectraforce 4.5company rating

    Data Engineer Job In Saint Paul, MN

Title: React Developer Duration: 12 months - temp to hire. We are looking for a strong, hands-on leader with experience working in a globally dispersed team. We are looking for collaborative and creative associates who thrive in a team environment, creating solutions for challenging customer-driven business needs. Work on the inception phase of projects, including collecting requirements, suggesting methodologies and technologies, and release planning. Active researcher, keeping up with the latest trends in software development. Influence and strengthen the development culture of the team. Inspire, mentor, and encourage globally dispersed developers to apply industry best practices. Design, implement, and support our mobile solutions and applications. You need to have: 7-10 years of developing software using object-oriented design and implementation. 7-10 years of mobile or web browser-based development experience. 7-10 years of JavaScript and/or JavaScript using the React framework. Experience with developing mobile or web-based UI. Web Development - CSS, HTML, XML/XSLT, Object Oriented Methodologies - OOA, OOD, OOP, Design Patterns. Foundational understanding of AWS/Azure Cloud services and solutions. Ability to communicate effectively with others, both written and orally, with good interpersonal skills. The ideal candidate would have one or more of the following: Mobile provisioning and development for iOS and Android. User Interface Design - User-centered design, GUI & web page design, prototyping, usability testing. Experience working with cross-functional teams as well as globally dispersed teams a plus. Good problem-solving skills Experience building highly reliable applications requiring minimal support and maintenance. Experience with RESTful API or GraphQL Experience with TypeScript or other typed programming languages Knowledge of JavaScript ecosystem tools, such as webpack, npm, etc. 
Bachelor of Science degree in Computer Science, Computer Engineering, Electrical Engineering, or equivalent experience. AWS Certified Cloud Practitioner/Microsoft Certified Azure Fundamentals is a plus. Experience with VM and build tools (Cocoapods, Gradle, Babel, XCode, Android Studio) Experience using build and deployment tools (GitLab, Jenkins or AWS Build/Deploy/Amplify) About Us: Established in 2004, SPECTRAFORCE is one of the largest and fastest-growing diversity-owned staffing firms in the US. The growth of our company is a direct result of our global client service delivery model that is powered by our state-of-the-art A.I. proprietary talent acquisition platform, robust ISO 9001:2015/ISO 27001 certified processes, and strong and passionate client engaged teams. We have built our business by providing talent and project-based solutions, including Contingent, Permanent, and Statement of Work (SOW) services to over 140 clients in the US, Canada, Puerto Rico, Costa Rica, and India. Key industries that we service include Technology, Financial Services, Life Sciences, Healthcare, Telecom, Retail, Utilities and Transportation. SPECTRAFORCE is built on a concept of “human connection,” defined by our branding attitude of NEWJOBPHORIA , which is the excitement of bringing joy and freedom to the work lifestyle so our people and clients can reach their highest potential. Learn more at: *************************** Benefits: SPECTRAFORCE offers ACA compliant health benefits as well as dental, vision, accident, critical illness, voluntary life, and hospital indemnity insurances to eligible employees. Additional benefits offered to eligible employees include commuter benefits, 401K plan with matching, and a referral bonus program. SPECTRAFORCE provides unpaid leave as well as paid sick leave when required by law. 
Equal Opportunity Employer: SPECTRAFORCE is an equal opportunity employer and does not discriminate against any employee or applicant for employment because of race, religion, color, sex, national origin, age, sexual orientation, gender identity, genetic information, disability or veteran status, or any other category protected by applicable federal, state, or local laws. Please contact Human Resources at ******************** if you require reasonable accommodation. California Applicant Notice: SPECTRAFORCE is committed to complying with the California Privacy Rights Act (“CPRA”) effective January 1, 2023; and all data privacy laws in the jurisdictions in which it recruits and hires employees. A Notice to California Job Applicants Regarding the Collection of Personal Information can be located on our website. Applicants with disabilities may access this notice in an alternative format by contacting *********************. LA County, CA Applicant Notice: If you are selected for this position with SPECTRAFORCE, your offer is contingent upon the satisfactory completion of several requirements, including but not limited to, a criminal background check. We consider qualified applicants with arrest or conviction records for employment in accordance with all local ordinances and state laws, including the Los Angeles County Fair Chance Ordinance for Employers (FCO) and the California Fair Chance Act (FCA). The background check assessment will consider whether a criminal history could reasonably have a direct, adverse impact on the job-related safety, security, trust, regulatory compliance, or suitability for this role. Such findings may result in withdrawal of a conditional job offer.
    $87k-113k yearly est. 7d ago
  • Developmental Engineer

    United States Air Force

    Data Engineer Job In Duluth, MN

ADVANCING OUR OPERATIONS In order for us to complete our missions, our technology simply cannot fail. Covering a wide range of specialties, from aeronautical and computer systems to flight test and mechanical, Developmental Engineers provide advanced skill and knowledge in their particular specialties. Responsible for everything from planning to implementation of their projects, these experts are essential to the success of operations all over the world. REQUIREMENTS You must meet several requirements before joining the Air Force. These concern your background, overall health and other standards set by the Air Force, Department of Defense and federal law. Minimum Education Bachelor's degree in engineering related to one of the following specialties: aerospace, aeronautical, astronautical, computer, electrical, electronics, communication or mechanical Qualifications Completion of the Defense Acquisition University Fundamentals of Systems Acquisition Management course or Acquisition Fundamentals course Completion of the Air Force Flight Test Engineer course or comparable Minimum of 24 months of experience in a qualified position, or a master's degree in a specified discipline and 12 months' experience, or a Doctor of Philosophy degree in a specified discipline Completion of Officer Training School (OTS), Air Force Academy (AFA) or Air Force Reserve Officer Training Corps (AFROTC) Must be at least 18 years of age and have not reached your 42nd birthday
    $65k-86k yearly est. 13d ago


What are the top employers for Data Engineer in MN?

Top 10 Data Engineer companies in MN

  1. UnitedHealth Group

  2. Mayo Clinic

  3. Robert Half

  4. Cognite

  5. GSK

  6. The Travelers Companies

  7. SAS Holdings

  8. TechDigital

  9. IP Services

  10. Deloitte
