Data engineer jobs in West Des Moines, IA - 476 jobs
Pentangle Tech Services | P5 Group
Data engineer job in Johnston, IA
Johnston, IA - candidates must live within a 50-mile radius of the location; onsite Tuesday/Wednesday/Thursday each week is required.
Project Scope and Brief Description:
Work at the intersection of plant cell biology and applied AI to build, productionize, and maintain computer vision pipelines that accelerate Doubled Haploid (DH) breeding in Biotechnology. The contractor will contribute to end‑to‑end imaging and analytics, from microscopy microspore detection to macroscopic structure assessment and plantlet characterization, supporting decisions that reduce cycle time and cost in DH programs. Solutions will be developed primarily in Python, integrated with our repositories and workflow tooling, and aligned with Biotech strategy initiatives.
Responsibilities:
Design & deliver deep learning-based CV models for microscopy and macroscopic assays (detection, segmentation, classification) with measurable accuracy, robustness, and throughput.
Build production‑ready pipelines in Python (data ingest, preprocessing, augmentation, inference, batch processing), integrated with GitLab repos and experiment tracking; ensure reproducibility and documentation.
Implement hyperspectral analysis workflows (band selection, normalization, feature extraction, model training).
Harmonize imaging acquisition with analysis by collaborating with biology teams to standardize microscopy/RGB/hyperspectral capture and file formats (e.g., FIJI/ImageJ for z‑stacks; autoscale practices).
Quantify model performance (precision/recall, F1, ROC/AUC, calibration) and write clear reports/posters for DH sessions; support fact‑checking in presentations.
Operationalize at scale: batch processing of tens of thousands of structures/images; optimize inference (e.g., torch.compile, mixed precision) and monitor resource usage.
Partner with DH stakeholders (biotech & breeding, Genome Technology Discovery, Data Science) to align deliverables with deployment milestones.
Maintain IP & data stewardship practices consistent with internal strategy; avoid disclosure of confidential protocols while enabling model re‑use.
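The performance-quantification work described above (precision/recall, F1) reduces to a few counts per class; as an illustrative, dependency-free sketch (the function and variable names are hypothetical, not any internal tooling):

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Compute precision, recall, and F1 for binary labels, counting by hand."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}
```

In practice these numbers would come from a library such as scikit-learn or torchmetrics; the hand-rolled version just makes the definitions concrete.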
Skills / Experience:
Must‑Have
4-6 years hands‑on in computer vision with Python (PyTorch/TensorFlow), including detection/segmentation/classification for scientific or industrial imaging.
Proven ability to productionize models: Git/GitLab, code reviews, CI/CD basics, experiment tracking (MLflow or equivalent), reproducible data/experiments, and clear documentation.
Experience with microscopy image processing, multi‑page TIFFs, z‑stacks, autoscale/normalization, and image quality challenges.
Familiarity with hyperspectral or multispectral imaging pipelines (preprocessing, dimensionality reduction, modeling) applied to plant or biological materials.
Track record of measurable model performance reporting and communicating results via posters/presentations for technical audiences.
Nice‑to‑Have
Vision Transformers (ViT) and modern YOLO workflows for microscopy/macroscopic tasks; comfort with inference tooling.
Experience optimizing inference (e.g., torch.compile, mixed precision) and scaling batch workflows.
Domain familiarity with Biotech breeding workflows.
Collaboration with discovery and strategy teams; ability to work across biology, engineering, and data science groups.
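The batch-scaling work mentioned above is often just careful chunking around the model call; a minimal stdlib sketch (the `batched` helper is illustrative, and the torch calls appear only as comments since this posting's exact stack is not specified):

```python
def batched(items, batch_size):
    """Yield successive fixed-size batches from a sequence (e.g., image paths)."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# In a real inference loop one might prepare the model once, outside the loop:
#   model = torch.compile(model)        # graph-compile for faster inference
#   with torch.autocast("cuda"):        # mixed-precision forward passes
#       outputs = model(batch_tensor)
# (shown as comments only; the sketch itself sticks to the standard library)
```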
Soft Skills
Strong stakeholder communication and the ability to translate biology & process constraints into CV requirements; comfortable triaging and prioritizing rapidly in active programs.
Ownership mindset around documentation, reproducibility, and IP‑aware sharing.
Curious and learning mindset.
Technical leadership experience.
$64k-88k yearly est. 3d ago
Data Scientist, Product Analytics
Meta 4.8
Data engineer job in Des Moines, IA
As a Data Scientist at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Oculus). By applying your technical skills, analytical mindset, and product intuition to one of the richest data sets in the world, you will help define the experiences we build for billions of people and hundreds of millions of businesses around the world. You will collaborate on a wide array of product and business problems with a wide range of cross-functional partners across Product, Engineering, Research, Data Engineering, Marketing, Sales, Finance and others. You will use data and analysis to identify and solve product development's biggest challenges. You will influence product strategy and investment decisions with data, be focused on impact, and collaborate with other teams. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
Product leadership: You will use data to shape product development, quantify new opportunities, identify upcoming challenges, and ensure the products we build bring value to people, businesses, and Meta. You will help your partner teams prioritize what to build, set goals, and understand their product's ecosystem.
Analytics: You will guide teams using data and insights. You will focus on developing hypotheses and employ a varied toolkit of rigorous analytical approaches, different methodologies, frameworks, and technical approaches to test them.
Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
**Required Skills:**
Data Scientist, Product Analytics Responsibilities:
1. Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches
2. Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop strategies for our products that serve billions of people and hundreds of millions of businesses
3. Identify and measure success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends
4. Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations
5. Partner with Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions
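The goal-setting and experiment-measurement responsibilities above often rest on a standard two-proportion z-test for comparing variants; a minimal stdlib sketch (the function name and the counts in the usage note are invented for illustration):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in conversion rates between variants."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided
    return z, p_value
```

For example, 120/1000 conversions in variant A versus 150/1000 in variant B yields a p-value just under 0.05, right at the conventional significance boundary.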
**Minimum Qualifications:**
6. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
7. Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent
8. 4+ years of work experience in analytics, data querying languages such as SQL, scripting languages such as Python, and/or statistical/mathematical software such as R (minimum of 2 years with a Ph.D.)
9. 4+ years of experience solving analytical problems using quantitative approaches, understanding ecosystems, user behaviors, and long-term product trends, and leading data-driven projects from definition to execution (including defining metrics, experiment design, and communicating actionable insights)
**Preferred Qualifications:**
10. Master's or Ph.D. Degree in a quantitative field
**Public Compensation:**
$147,000/year to $208,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
$147k-208k yearly 60d+ ago
Data Scientist, Privacy
Datavant
Data engineer job in Des Moines, IA
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research that keeps us industry leaders in this area, and in stimulating discussions on re-identification risk. You will be supported in developing and consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high quality reports which summarise your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ Seeks to understand real-world data in context rather than considering it in the abstract
+ Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in the language
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
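The sampling and resampling methods mentioned above can be illustrated with a percentile bootstrap, a common way to attach a confidence interval to a statistic without distributional assumptions; a minimal stdlib sketch (names, defaults, and data are illustrative only):

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a sample statistic."""
    rng = random.Random(seed)  # fixed seed so the interval is reproducible
    boots = sorted(
        stat([rng.choice(sample) for _ in range(len(sample))])  # resample w/ replacement
        for _ in range(n_boot)
    )
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```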
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
$104k-130k yearly 15d ago
Imaging Data Scientist
Insight Global
Data engineer job in Johnston, IA
As an Imaging Data Scientist for one of our Agricultural Sciences customers, you will work at the intersection of plant cell biology and applied AI to build, productionize, and maintain computer vision pipelines that accelerate Doubled Haploid (DH) breeding in Biotechnology. You will contribute to end‑to‑end imaging and analytics, from microscopy microspore detection to macroscopic structure assessment and plantlet characterization, supporting decisions that reduce cycle time and cost in DH programs. Solutions will be developed primarily in Python, integrated with our repositories and workflow tooling, and aligned with Biotech strategy initiatives. Day-to-day responsibilities include:
Design & deliver deep learning-based CV models for microscopy and macroscopic assays (detection, segmentation, classification) with measurable accuracy, robustness, and throughput.
Build production‑ready pipelines in Python (data ingest, preprocessing, augmentation, inference, batch processing), integrated with GitLab repos and experiment tracking; ensure reproducibility and documentation.
Implement hyperspectral analysis workflows (band selection, normalization, feature extraction, model training).
Harmonize imaging acquisition with analysis by collaborating with biology teams to standardize microscopy/RGB/hyperspectral capture and file formats (e.g., FIJI/ImageJ for z‑stacks; autoscale practices).
Quantify model performance (precision/recall, F1, ROC/AUC, calibration) and write clear reports/posters for DH sessions; support fact‑checking in presentations.
Operationalize at scale: batch processing of tens of thousands of structures/images; optimize inference (e.g., torch.compile, mixed precision) and monitor resource usage.
Partner with DH stakeholders (biotech & breeding, Genome Technology Discovery, Data Science) to align deliverables with deployment milestones.
Maintain IP & data stewardship practices consistent with internal strategy; avoid disclosure of confidential protocols while enabling model re‑use.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
4-6 years hands‑on in computer vision with Python (PyTorch/TensorFlow), including detection/segmentation/classification for scientific or industrial imaging.
Proven ability to productionize models: Git/GitLab, code reviews, CI/CD basics, experiment tracking (MLflow or equivalent), reproducible data/experiments, and clear documentation.
Experience with microscopy image processing, multi‑page TIFFs, z‑stacks, autoscale/normalization, and image quality challenges.
Familiarity with hyperspectral or multispectral imaging pipelines (preprocessing, dimensionality reduction, modeling) applied to plant or biological materials.
Track record of measurable model performance reporting and communicating results via posters/presentations for technical audiences.
Stakeholder communication skills and willingness to take ownership of project work.
Vision Transformers (ViT) and modern YOLO workflows for microscopy/macroscopic tasks; comfort with inference tooling.
Experience optimizing inference (e.g., torch.compile, mixed precision) and scaling batch workflows.
Domain familiarity with Biotech breeding workflows.
Collaboration with discovery and strategy teams; ability to work across biology, engineering, and data science groups.
$64k-88k yearly est. 6d ago
Data Scientist - Property/Casualty
Farm Bureau Financial Services 4.5
Data engineer job in West Des Moines, IA
Farm Bureau is looking for a strong, data-driven professional to join our team and help us live out our mission of protecting livelihoods and futures. In this role, you will perform statistical analysis as part of the predictive modeling team and assist in the delivery of pricing models and competitive research. Candidates must be local to the Des Moines area or willing to relocate, as at least three days per week in the office are required. Candidates must also have predictive modeling or actuarial experience.
Who We Are: At Farm Bureau Financial Services, we make insurance simple so our client/members can feel confident knowing their family, home, cars and other property are protected. We value a culture where integrity, teamwork, passion, service, leadership and accountability are at the heart of every decision we make and every action we take. We're proud of our more than 80-year commitment to protecting the livelihoods and futures of our client/members and creating an atmosphere where our employees thrive.
What You'll Do:
* Develop, implement, and document predictive models
* Interpret results to audiences of various technical knowledge
* Test, validate, and correct / clean data as needed
* Collaborate with managing actuaries on the pricing teams to deliver results
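The predictive-modeling work described above would be done in practice with SQL, R, or Emblem; purely as a language-neutral illustration of the core idea, here is a one-predictor least-squares fit in Python (the function and data are invented for the example):

```python
def fit_ols(xs, ys):
    """Least-squares slope and intercept for a single-predictor linear model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)                      # variance term
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))    # covariance term
    slope = sxy / sxx
    return slope, my - slope * mx
```

Real pricing models are typically GLMs over many rating factors rather than a single-variable fit, but the estimation principle is the same.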
What It Takes to Join Our Team:
* College degree plus 3 years of predictive modeling experience required.
* Property Casualty experience preferred.
* Must have strong skills with SQL, R, Emblem or other predictive modeling software.
* Be team focused and be able to work in a collaborative work environment.
* Strong communication and presentation skills.
What We Offer You: When you're on our team, you get more than a great paycheck. You'll hear about career development and educational opportunities. We offer an enhanced 401K with a match, low-cost health, dental, and vision benefits, and life and disability insurance options. We also offer paid time off, including holidays and volunteer time, and teams who know how to have fun. Add to that an onsite wellness facility with fitness classes and programs, a daycare center, a cafeteria, and for many positions, even consideration for a hybrid work arrangement. Farm Bureau....where the grass really IS greener!
Work Authorization/Sponsorship
Applicants must be currently authorized to work in the United States on a full-time basis. We are not able to sponsor now or in the future, or take over sponsorship of, an employment visa or work authorization for this role. For example, we are not able to sponsor OPT status.
$65k-89k yearly est. 13d ago
Data Engineer
LCS Senior Living
Data engineer job in Des Moines, IA
This position is responsible for designing, developing, and maintaining robust data solutions to support business operations and decision-making processes. This role involves working with cutting-edge technologies, including Microsoft Azure Cloud, Snowflake, and Power BI, to build and manage data lakes, data warehouses, and reporting systems. The Data Engineer ensures data accuracy, system reliability, and high-quality deliverables by performing thorough quality assurance testing and providing user support. Collaboration with IT and business teams is essential to identify opportunities for operational efficiency and align data solutions with organizational goals. This role reports to the Director, Data Analytics.
Experience is Everything.
At LCS, experience is everything. We provide you the opportunity to use your talents in a progressive, growing organization that makes a positive difference in the lives of the seniors we serve. If you are seeking an organization that gives back, you'll love working here. Our principles and hospitality promises define our company culture. LCS employees can be found participating in volunteer activities, getting involved in our committees or collaborating with team members in our innovative workspace. You'll find several opportunities to grow as a professional, serve the community, and enhance the lives of seniors.
What You'll Do:
* Design, develop, and maintain scalable data lakes, data warehouses, and data storage solutions.
* Create and optimize ETL/ELT pipelines for efficient data processing and integration.
* Implement data models to support reporting, analytics, and operational workflows.
* Build, manage, and enhance Power BI dashboards and reports for actionable insights.
* Collaborate with business stakeholders to gather reporting requirements and ensure data accuracy.
* Leverage Microsoft Azure Cloud services (e.g., Data Factory, Synapse Analytics, Data Lake Storage) for data integration and processing.
* Utilize Snowflake to manage and optimize data warehouse performance.
* Perform thorough QA testing to ensure data accuracy, system reliability, and high performance.
* Provide user support and troubleshoot issues with data systems, pipelines, and reporting tools.
* Work closely with cross-functional teams to align data solutions with organizational objectives.
* Identify and implement operational efficiencies in data workflows and system designs.
* Ensure compliance with data governance, privacy, and security standards.
* Follow best practices for version control, documentation, and CI/CD processes.
* Stay updated on emerging technologies and trends to enhance data strategies and tools.
* Recommend improvements to existing data processes and tools to drive innovation.
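The ETL/ELT pipelines above follow the same extract-transform-load shape regardless of stack; a minimal sketch using the stdlib sqlite3 module as a stand-in for the Snowflake/Azure targets (the table and column names are invented for illustration):

```python
import csv
import io
import sqlite3

def run_etl(csv_text, conn):
    """Minimal ETL: extract rows from CSV, transform types, load into a table."""
    rows = csv.DictReader(io.StringIO(csv_text))                        # extract
    cleaned = [(r["community"], float(r["occupancy"])) for r in rows]   # transform
    conn.execute("CREATE TABLE IF NOT EXISTS occupancy (community TEXT, rate REAL)")
    conn.executemany("INSERT INTO occupancy VALUES (?, ?)", cleaned)    # load
    conn.commit()
    return len(cleaned)
```

In the stack named in the posting, the extract and load steps would be handled by Azure Data Factory pipelines landing data in Snowflake, with Power BI reading from the warehouse.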
What We're Looking For:
* Required qualifications
* Bachelor's degree in Computer Science, MIS, or equivalent work experience.
* 2-4 years of experience in developing, implementing, and supporting enterprise solutions.
* Experience with database concepts, advanced SQL, and reporting/Business Intelligence software.
* Expertise in data architecture, data modeling, and building scalable data solutions.
* Proficiency with Microsoft Azure Cloud services (e.g., Azure Data Factory, Synapse Analytics, Data Lake Storage).
* Strong experience with Snowflake for data warehousing and optimization.
* Advanced skills in Power BI for creating reports, dashboards, and data visualizations.
* Proficient in SQL and scripting languages like Python for data integration and analysis.
* Solid understanding of ETL/ELT pipelines and data integration workflows.
* Strong problem-solving skills and attention to detail for troubleshooting data systems.
* Excellent communication and collaboration abilities to work effectively across teams.
Why Join Us?
* Industry Leader.
* Inclusive & collaborative culture.
* Top Workplace USA.
* Top Workplace Iowa.
* Charity and community involvement.
* Outstanding advancement opportunities.
* Ongoing career development.
Benefits
Competitive pay, great benefits and vacation time. We are an equal opportunity employer with benefits including medical, dental, life insurance, disability, 401(K) with company match and paid parental leave.
Our Commitment
LCS creates living experiences that enhance the lives of seniors. You'll see this commitment in our people. They're talented, dedicated professionals who truly care about residents, with each conducting his or her work with integrity, honesty and transparency according to the principles of LCS. We strive to help every community succeed-strengthening available resources, establishing proven practices that lead to long-term growth and value for those living in, working for and affiliated with the community. Check us out on our website: *************************
Additional Information
Travel frequency: 0-10%
Estimated Salary: $113,000 - $141,000
The actual title and salary will be determined after careful consideration of a wide range of factors, including your skills, qualifications, and experience.
A POST-OFFER BACKGROUND CHECK, INCLUDING REFERENCES IS REQUIRED.
LCS IS AN EQUAL OPPORTUNITY EMPLOYER.
$113k-141k yearly 5d ago
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation 4.3
Data engineer job in Des Moines, IA
**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data, and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement, and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology and possess the ability to create change and be the facilitator for this transformation. They will have experience tailoring software designs and developing and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k but will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review, vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior and demonstrated team-building and development skills to harness powerful teams
+ Ability to communicate effectively with different levels of seniority within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourages and maintains a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Flexibility in scheduling which may include nights, weekends, and holidays
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
$135k yearly 60d+ ago
Sr Data Warehouse Lakehouse Developer
Lumen 3.4
Data engineer job in Des Moines, IA
Lumen connects the world. We are igniting business growth by connecting people, data and applications - quickly, securely, and effortlessly. Together, we are building a culture and company from the people up - committed to teamwork, trust and transparency. People power progress.
We're looking for top-tier talent and offer the flexibility you need to thrive and deliver lasting impact. Join us as we digitally connect the world and shape the future.
**The Role**
We are seeking a Senior Data Warehouse/Lakehouse Developer to design, build, and optimize enterprise data solutions. This role combines advanced development expertise with strong analytical skills to translate business requirements into scalable, high-performance data systems. You will work closely with architects, product owners, and scrum teams, provide technical leadership, and ensure best practices in data engineering and testing.
**Location**
This is a work-from-home position available from any US-based location. You must be a US citizen or permanent resident (green card holder) to be considered.
**The Main Responsibilities**
**Design & Development**
+ Develop and maintain ETL/ELT processes for Data Warehouse and Lakehouse environments.
+ Create and optimize complex SQL queries, stored procedures, and data transformations.
+ Build and enhance source-to-target mapping documents.
+ Assist with UAT build and data loading for User Acceptance Testing.
+ Estimate levels of effort (LOEs) for analysis, design, development, and testing tasks.
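As a rough illustration of the source-to-target mapping work described above, the sketch below renames source columns per a mapping document and applies one transformation rule. All column names and the business rule are hypothetical, not from any actual Lumen system.

```python
# Minimal ETL sketch: apply a source-to-target mapping, then one
# transformation step, before the load stage. Names are illustrative.
SOURCE_TO_TARGET = {
    "cust_id": "customer_id",
    "ord_dt": "order_date",
    "amt": "order_amount_usd",
}

def transform(rows):
    """Rename source columns per the mapping and drop unmapped fields."""
    out = []
    for row in rows:
        mapped = {SOURCE_TO_TARGET[k]: v for k, v in row.items()
                  if k in SOURCE_TO_TARGET}
        # Hypothetical business rule: amounts arrive in cents upstream.
        mapped["order_amount_usd"] = mapped["order_amount_usd"] / 100
        out.append(mapped)
    return out

rows = [{"cust_id": 1, "ord_dt": "2024-01-05", "amt": 2599, "junk": "x"}]
print(transform(rows))
```

In practice the mapping would be driven by the source-to-target document rather than hard-coded, but the shape of the step is the same.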
**Technical Leadership**
+ Provide technical leadership and mentorship to team members.
+ Collaborate with architects, system engineers, and product owners to understand and detail business/system requirements and logical/physical data models.
+ Participate in and consult on integrated application and regression testing.
+ Conduct training sessions for system operators, programmers, and end users.
**Analytical Expertise**
+ Analyze programming requests to ensure seamless integration with current applications.
+ Perform data analysis and mapping to ensure accuracy and consistency.
+ Generate test plans and test cases for quality assurance.
+ Research and evaluate problems, recommend solutions, and implement decisions.
**Continuous Improvement**
+ Monitor and optimize data pipelines for performance and reliability.
+ Stay current with emerging technologies and recommend improvements to architecture and processes.
+ Adapt to changing priorities and aggressive project timelines while managing multiple complex projects.
**What We Look For in a Candidate**
**Technical Skills**
+ Proficiency in SQL and at least one programming language (Python, Java, Scala).
+ Experience with ETL tools (Informatica, Kafka) and Lakehouse technologies (Azure Data Factory, PySpark).
+ Familiarity with databases (Databricks, Oracle, SQL Server).
+ Knowledge of modeling tools (Visio, ERwin, UML) and data analysis tools (TOAD, Oracle SQL Developer, DBeaver).
+ Strong understanding of data warehousing concepts and Lakehouse architecture.
**Analytical & Problem-Solving**
+ Ability to translate business requirements into technical solutions.
+ Strong troubleshooting and performance tuning skills.
+ Demonstrated organizational, oral, and written communication skills.
**Experience**
+ 6+ years of experience with a Bachelor's degree OR 4+ years with a Master's degree.
+ Proven ability to lead technical teams and manage projects.
+ Experience in applications development and systems analysis.
**Preferred Qualifications**
+ Project management experience.
+ Familiarity with CI/CD pipelines and version control (Git).
+ Exposure to big data frameworks (Spark, Hadoop) and cloud ecosystems (Azure, AWS, GCP).
**Compensation**
This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience and other relevant factors.
Location Based Pay Ranges
$82,969 - $110,625 in these states: AL AR AZ FL GA IA ID IN KS KY LA ME MO MS MT ND NE NM OH OK PA SC SD TN UT VT WI WV WY
$87,117 - $116,156 in these states: CO HI MI MN NC NH NV OR RI
$91,266 - $121,688 in these states: AK CA CT DC DE IL MA MD NJ NY TX VA WA
Lumen offers a comprehensive package featuring a broad range of Health, Life, Voluntary Lifestyle benefits and other perks that enhance your physical, mental, emotional and financial wellbeing. We're able to answer any additional questions you may have about our bonus structure (short-term incentives, long-term incentives and/or sales compensation) as you move through the selection process.
Learn more about Lumen's:
Benefits (****************************************************
Bonus Structure
\#LI-Remote
\#LI-PS
Requisition #: 340407
**Background Screening**
If you are selected for a position, there will be a background screen, which may include checks for criminal records and/or motor vehicle reports and/or drug screening, depending on the position requirements. For more information on these checks, please refer to the Post Offer section of our FAQ page (************************************* . Job-related concerns identified during the background screening may disqualify you from the new position or your current role. Background results will be evaluated on a case-by-case basis.
Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
**Equal Employment Opportunities**
We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, gender expression, marital status, family status, pregnancy, or other legally protected status (collectively, "protected statuses"). We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training.
**Disclaimer**
The job responsibilities described above indicate the general nature and level of work performed by employees within this classification. It is not intended to include a comprehensive inventory of all duties and responsibilities for this job. Job duties and responsibilities are subject to change based on evolving business needs and conditions.
In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information.
Please be advised that Lumen does not require any form of payment from job applicants during the recruitment process. All legitimate job openings will be posted on our official website or communicated through official company email addresses. If you encounter any job offers that request payment in exchange for employment at Lumen, they are not for employment with us, but may relate to another company with a similar name.
$91.3k-121.7k yearly 46d ago
Data Engineer II
Ncmic
Data engineer job in Clive, IA
Job Purpose:
Responsible for creating and modifying moderate to complex Extract, Transform, and Load (ETL) processes. Ensure reliability, stability, and performance of ETL environment. Design, develop, troubleshoot, and maintain ETL processes.
Essential Functions:
1. Performs design, development, debugging, and maintenance of ETL processes. Identifies, understands, and translates development requirements into technical solutions. Monitors ETL processes to ensure reliability and data availability.
2. Performs unit testing and regression testing as needed to ensure ETL processes function according to requirements. Relies on experience and judgment to plan and accomplish goals and to resolve problems and technical issues.
3. Assists in the development of standards and procedures for the management, design, and maintenance of the ETL environment.
4. Maintains knowledge of ETL tools and processes, including changes in technology and their potential impact on the department, and makes appropriate recommendations.
5. Performs other duties as assigned.
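The unit and regression testing mentioned above can be sketched as isolating one ETL transformation and pinning its behavior with small test cases. The `normalize_state` function and its rules are purely illustrative, not part of any actual NCMIC process.

```python
# Sketch: unit-testing an ETL transformation in isolation, so regressions
# are caught before the process runs against production data.
def normalize_state(value):
    """Map free-form state entries to two-letter codes (sample rules only)."""
    aliases = {"iowa": "IA", "ia": "IA", "ill.": "IL", "illinois": "IL"}
    key = value.strip().lower()
    return aliases.get(key, value.strip().upper()[:2])

# Unit tests: each case pins down expected behavior for one input class.
assert normalize_state("Iowa") == "IA"        # full name
assert normalize_state(" ia ") == "IA"        # whitespace + abbreviation
assert normalize_state("Ill.") == "IL"        # punctuation variant
```

The same cases rerun as a regression suite whenever the transformation changes.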
Requirements:
Education: Bachelor's degree in computer science, management information systems or related field or equivalent experience.
Experience:
3+ years of experience in ETL development with Talend, SSIS, Informatica, Python, or similar ETL tool (Talend preferred).
Strong understanding of relational databases, with a preference for expertise in MS-SQL.
Proficiency in the fundamentals of data pipelining, ELT/ETL processes, and the overall data lifecycle, complemented by strong problem-solving and analytical skills to gather and interpret data, identify trends, and translate findings into efficient data workflows and solutions.
Knowledge of agile SDLC methodologies.
Strong verbal, written, and interpersonal skills required, including ability to convey technical issues to non-technical audience.
Mental Demands: Ability to gain understanding of tools, technologies, languages and techniques as required for assigned environments. Ability to research and solve development and application problems with little assistance. Ability to focus on tasks for extended periods of time. Must be flexible and have the ability to work with a variety of tasks and employees. Ability to plan, organize, be detail and deadline oriented and maintain a high accuracy rate.
Physical Demands: Continuous sitting for extended periods of time, some standing, walking, bending and reaching. Frequent use of fingers and hands to manipulate computer, telephone and other office equipment. Ability to be able to look and concentrate at a computer screen/monitor for extended periods of time.
$70k-94k yearly est. 60d+ ago
Data Scientist (Python, Power BI, Databricks, SQL, data modeling, R, JavaScript, MongoDB)
Ccg Business Solutions 4.2
Data engineer job in Urbandale, IA
CCG Talent Management is not only a business solutions company but also a company that believes success starts with the individual. CCG Business Solutions has been consulting and providing talent placement services since 2007. Our team understands the principles of connecting purpose to business. We are currently recruiting for a Data Scientist (Python, Power BI, Databricks, SQL, data modeling, R, JavaScript, MongoDB).
Job Description
As a Data Scientist, you will join a client team leveraging petabyte-scale datasets for advanced analytics and model building to enable intelligent, automated equipment and improved decisions by farmers. The client team partners with product managers and data engineers to design, scale, and deliver full-stack data science solutions. You will join a passionate team making a difference by applying innovative technology to solve some of the world's biggest problems.
Responsibilities
Communicate your findings and methodologies with impact to stakeholders from a variety of backgrounds.
Work with high-resolution machine and agronomic data in the development and testing of predictive models.
Develop and deliver production-ready machine learning approaches to yield insights and recommendations from precision agriculture data.
Define, quantify, and analyze Key Performance Indicators that define successful customer outcomes.
Work closely with the Data Engineering teams to ensure data is stored efficiently and can support the required analytics.
Qualifications
Demonstrated competency in developing production-ready models in an object-oriented programming language such as Python.
Demonstrated competency in using data-access technologies such as SQL, Spark, Databricks, BigQuery, MongoDB, etc.
Experience with Visualization tools such as Tableau, PowerBI, DataStudio, etc.
Experience with Data Modeling techniques such as Normalization, data quality, and coverage assessment, attribute analysis, performance management, etc.
Experience building machine learning models such as Regression, supervised learning, unsupervised learning, probabilistic inference, natural language modeling, etc.
Excellent communication skills. Able to effectively lead meetings, document work for reproduction, write persuasively, communicate proofs-of-concept, and effectively take notes.
Additional Qualifications
Additional experience with other languages such as R, JavaScript, Scala, etc.
Examples of professional work such as publications, patents, a portfolio of relevant project work, etc.
Familiarity with Distributed Datasets
Experienced with a variety of data structures such as time series, geo-tagged, text, structured, and unstructured.
Experience with simulations such as Monte Carlo simulation, Gibbs sampling, etc.
Experience with model validation, measuring model bias, measuring model drift, etc.
Experience collaborating with stakeholders from disciplines such as Product, Sales, Finance, etc.
Ability to communicate complex analytical insights in a manner that is understandable by non-technical audiences.
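Among the qualifications above, Monte Carlo simulation is the most compact to illustrate. The classic toy example below estimates pi by random sampling; it is a generic sketch, not drawn from the client's actual work.

```python
# Minimal Monte Carlo sketch: estimate pi by sampling points in the unit
# square and counting how many land inside the quarter circle.
import random

def estimate_pi(n_samples, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples

print(estimate_pi(100_000))  # converges toward 3.14159... as n grows
```

The same sample-and-aggregate pattern underlies techniques like Gibbs sampling, with smarter proposals in place of uniform draws.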
Additional Information
Salary: $55.00 - $58.00 + Bonus + Relocation
All your information will be kept confidential according to EEO guidelines.
$70k-101k yearly est. 1d ago
Azure Data Engineer
CapB Infotek
Data engineer job in Des Moines, IA
CapB is a global leader on IT Solutions and Managed Services. Our R&D is focused on providing cutting edge products and solutions across Digital Transformations from Cloud, AI/ML, IOT, Blockchain to MDM/PIM, Supply chain, ERP, CRM, HRMS and Integration solutions. For our growing needs we need consultants who can work with us on salaried or contract basis. We provide industry standard benefits, and an environment for LEARNING & Growth.
For one of our ongoing projects we are looking for an Azure Data Engineer. The position is based out of Des Moines, IA. Locals are preferred, but the work can be done remotely for the time being this year.
Responsibilities:
• Create functional design specifications, Azure reference architectures, and assist with other project deliverables as needed.
• Design and Develop Platform as a Service (PaaS) Solutions using different Azure Services
• Create a data factory, orchestrate data processing activities in a data-driven workflow, monitor and manage the data factory, move, transform and analyze data
• Design complex enterprise data solutions that utilize Azure Data Factory
• Create migration plans to move legacy SSIS packages into Azure Data Factory
• Build conceptual and logical data models
• Design and implement big data real-time and batch processing solutions
• Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, as well as cloud-based), distributed / elastic environments, and downstream applications and/or self-service solutions.
• Develop and document mechanisms for deployment, monitoring and maintenance
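The full and incremental loads mentioned in this posting typically rest on a high-water-mark pattern: persist the latest modified timestamp seen, then pull only newer rows on the next run. The sketch below shows the core logic in plain Python; in Azure Data Factory this is usually expressed as a watermark lookup plus a filtered copy activity. Field names are illustrative.

```python
# High-water-mark sketch behind incremental loading: keep only rows
# modified after the last watermark, and advance the watermark.
def incremental_extract(rows, watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    new_rows = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "modified": "2024-01-01T00:00:00"},
    {"id": 2, "modified": "2024-01-02T00:00:00"},
    {"id": 3, "modified": "2024-01-03T00:00:00"},
]
batch, wm = incremental_extract(source, "2024-01-01T12:00:00")
print([r["id"] for r in batch], wm)  # only rows 2 and 3 are newer
```

ISO-8601 timestamps compare correctly as strings, which keeps the sketch dependency-free; a production pipeline would use real datetime or database types.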
Skills and Experience:
• Bachelor's degree or higher in Computer Science Engineering/ Information Technology, Information Systems
• 3+ years' experience with the Microsoft Cloud Data Platform: Azure Data Factory, Azure Databricks, Python, Scala, Spark SQL, SQL Data Warehouse
• 3+ years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, data lake solutions
• Expertise with SQL, database design/structures, ETL/ELT design patterns, and DataMart structures (star, snowflake schemas, etc.)
• Functional knowledge of programming, scripting, and data science languages such as JavaScript, PowerShell, Python, Bash, SQL, .NET, Java, PHP, Ruby, Perl, C++, R, etc.
• Creation of descriptive, predictive, and prescriptive analytics solutions using Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, HDP, Spark, Databricks, MapReduce, Pig, Hive, Tez, SSAS, Watson Analytics, SPSS
• Experience in Azure Data Factory (ADF) creating multiple pipelines and activities using Azure for full and incremental data loads into Azure Data Lake Store and Azure SQL DW
• Experience with Azure Data Lake Storage and working with Parquet files and partitions
• Experience managing Microsoft Azure environments with VM's, VNETS, Subnets, NSG's, Resource Groups, etc.
• Experience in Creation & Configuration of Azure Resources & RBAC
• Experience with Git/Azure DevOps
• Azure certification would be desired
• Must have an ability to communicate clearly and be a team player
$70k-94k yearly est. 60d+ ago
Sr. Data Engineer (ETL Developer)
Fidelity & Guaranty Life 4.5
Data engineer job in Des Moines, IA
FGL Holdings-the F&G family of insurance companies-is committed to helping Americans prepare for and live comfortably in their retirement. Through its subsidiaries, F&G is a leading provider of annuity and life insurance products. For nearly 60 years, we have offered annuity and life insurance products to those who are seeking safety, protection and income solutions to meet their needs.
At F&G, we believe our culture is what makes our company great. In 2019, we received a Top Workplace award, which we credit to our employees' shared cultural values: Collaborative, Authentic, Dynamic and Empowered. We believe that by embracing these values, we will continue to build and strengthen the company, while being a great place to work. We recruit talented and committed individuals to join our team, and we provide opportunities for personal and professional growth.
The Sr. Data Engineer will support the design, development, implementation and maintenance of complex, mission-critical Informatica, SQL-based systems supporting F&G's insurance operations. As a senior team member, the senior data engineer will assume lead roles in design and development, oversee the work of all junior developers, and participate in the strategic planning around future development of the F&G data environment.
The Company has made a significant investment in information technology and relies heavily on data interfaces from multiple off-site source systems. Solid communication and problem-solving skills are required.
Organization
This position reports directly to the Director, Data Engineering & Administration and has significant interaction with members of the IT organization, Third Party Administrators, group managers, and departmental analysts throughout the organization. In addition, interaction with PMO Project Managers for prioritization and reporting, and relationship management with the business, will be critical and ongoing.
Duties and Responsibilities
Develop Informatica code to support existing and future EDS deployments
Performance tune existing and future Informatica code to ensure all SLAs are met
Develop and support team of on-shore and off-shore Informatica developers to deliver exceptional quality and meet all project deadlines
Perform relational database analysis, modeling, and design of complex systems
Create detailed technical design documents in accordance with business requirements
Develop complex programs/queries to support transactional processing and regulatory reporting utilizing SQL and Informatica
Develop and perform detailed unit, quality assurance and regression tests to validate the readiness of internal developed code for production
Create detailed deployment plans for use in the migration of code from staging to production environments and provide deployment guides to host provider for deployments
Work with Infrastructure and other IT teams to implement complete solution
Create clear and effective Status reports as required
Perform impact analysis for interface/system changes affecting the applications and data environment
Work closely with Data Management team members to translate business needs into technical solutions
Assist Data Management Manager in developing estimates for project and maintenance work
Monitor/ensure acceptable levels of system performance, integrity and security
Support standards for system architecture, code quality and collaborative team development
Attend routine departmental meetings to support communication around development best practices, participate in change control discussions, review code, and provide technical instruction to colleagues
Partner with external TPAs and consultants to collaborate on large scale development efforts and enforce F&G standards for integration and data exchange
Attend conferences, developer forums, and training opportunities to ensure current technology trends are understood and applied within the F&G environment
Experience and Education Requirements
Bachelor's degree (preferred emphasis in computer science or MIS) or equivalent experiences
Senior to expert level design/development, debugging ability with Informatica Power Center (including version 10.1)
Senior to expert level ability to optimize Informatica and SQL jobs through performance tuning
Minimum 5 years of experience supporting ETL and production data operations (file processing, data distribution, etc.), including debugging, addressing production issues, and performing root cause analysis
Expert level experience in designing and building large applications utilizing SQL Server
Experience in Windows batch scripting, scheduling jobs using job scheduling tools (e.g., JAMS), and working with data marts and other data warehousing practices
Thorough understanding of the software development life cycle and experience working with geographically distributed teams (offshore, offsite, etc.)
Ability to use SQL development tools such as SQL Navigator and Toad as well as maintain code in source code control systems
Knowledge of proper database normalization, indexing, transaction protection and locking is essential
Preferred Skills
Preferred to have experience in supporting DTCC and data transfers from / to external organizations and internal systems using EFT (Electronic File Transfer)
Working knowledge of Informatica Data Quality, Business Glossary, Metadata Manager
Experience with database design/modeling tools such as Erwin
Skills and Abilities
Strong technical documentation ability
Familiar with SSIS, Python and other ETL frameworks preferred
Previous experience with Tableau, operational reporting is a plus
Must have a teamwork focused attitude and be skilled at building relationships within IT organizations and across business functions
Life/Annuity insurance industry experience strongly preferred
Excellent oral and written communication skills
Knowledge of data integrity protocols and security requirements and techniques
Strong time management and organizational skills to enable productivity in a fast-paced, dynamic development environment
Strong verbal communication skills and a demonstrated ability to work effectively in team-based development projects
Physical Demands and Work Environment
Must be able to work in a fast-paced team environment and handle multiple projects and assignments under tight deadlines
Must demonstrate willingness to work flexible hours as needed to accommodate business needs and deliverables
Must be able to sit at a computer for extended periods of time
#LI-JS1
#INDHP
$84k-110k yearly est. Auto-Apply 60d+ ago
Data Engineer
Rogers Freels & Associates Inc.
Data engineer job in Johnston, IA
Job Description
RFA Engineering (*************** supports industry-leading clients through the full software development lifecycle to build cutting-edge precision agriculture, machine guidance, vehicle automation and autonomy applications. We are seeking passionate, talented engineers to work on exciting projects using the latest tools and technologies including robotics, computer-vision, machine learning, IoT, cloud computing, and much more. Collaborate with a team of industry experts onsite at our client's world-class engineering center and contribute to developing innovative solutions that drive sustainable agriculture practices.
This is a full-time position with a full benefit package listed below that includes opportunities for professional growth, direct hire by our customers, and additional opportunities within our own organization.
Senior Data Engineer
As a Data Engineer, you will enable engineers, analysts, and data scientists to more effectively access, explore, and leverage enterprise data to generate insights, build models, and make data-driven decisions. This role focuses on designing scalable cloud-based data solutions, developing high-quality datasets, and creating data products that empower individual contributors to work independently with complex, high-volume data.
A key component of this role involves collaborating with embedded and digital engineering teams to define, capture, and analyze system performance, UI, and health metrics. You will help translate raw signals into actionable insights through analytics, dashboards, and monitoring tools.
Responsibilities
Design, build, and maintain cloud-based data pipelines and data products that support analytics, machine learning, and exploratory data analysis
Enable self-service data access by developing well-structured, documented, and discoverable datasets for individual contributors
Partner with embedded and digital engineering teams to define required performance, UI, and system health metrics and ensure appropriate data capture
Perform analytics and data aggregation to translate raw signals into meaningful insights and visualizations
Develop and maintain production-ready Python code and data workflows using SQL, PySpark, Databricks, and related technologies
Manage and optimize storage of diverse data types, including images, raster data, parquet files, time-series data, geo-tagged data, text, and other unstructured data
Build dashboards and interactive data applications using tools such as Tableau, Plotly Dash, or similar web-based visualization frameworks
Develop alerting and monitoring solutions (e.g., automated email notifications) to identify system performance issues or data pipeline failures
Collaborate in code reviews, documentation, and best-practice development to ensure maintainable and reproducible solutions
Manage multiple projects, priorities, and milestones while maintaining a strong sense of ownership and accountability
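The alerting and monitoring responsibility above usually starts with a simple threshold check on pipeline metrics, with notification (e.g., email) wired up separately. This is a generic sketch; the metric names and limits are hypothetical.

```python
# Threshold-based health check: compare pipeline metrics against limits
# and return alert messages for any breach. An empty list means healthy.
def check_pipeline_health(metrics, max_latency_s=300, min_row_count=1):
    """Return a list of alert messages; empty means healthy."""
    alerts = []
    if metrics["latency_s"] > max_latency_s:
        alerts.append(
            f"latency {metrics['latency_s']}s exceeds {max_latency_s}s"
        )
    if metrics["rows_loaded"] < min_row_count:
        alerts.append("no rows loaded; upstream feed may be down")
    return alerts

print(check_pipeline_health({"latency_s": 420, "rows_loaded": 0}))
```

A scheduler would run this after each pipeline run and hand any non-empty result to the notification channel of choice.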
Requirements
Bachelor's degree or higher in Computer Science, Software Engineering, Data Engineering, or a related technical field, or equivalent professional experience
Demonstrated experience designing and implementing cloud-based data solutions in production environments
Strong proficiency in Python for data engineering and analytics applications
Experience working with data access and processing technologies such as SQL, PySpark, Databricks, Postgres, and MongoDB
Experience building and maintaining datasets at scale across multiple data formats and structures
Familiarity with vehicle or embedded system data, including CAN signals
Excellent written and verbal communication skills, including the ability to lead meetings, clearly document work, communicate proof-of-concepts, and collaborate across teams
Desired Attributes
Experience developing dashboards and visual analytics solutions using tools such as Tableau, Plotly, or similar platforms
Familiarity with spatial data and visualization tools such as Folium or Plotly
Experience implementing system monitoring, alerting, and health-tracking solutions
Strong analytical and problem-solving skills with the ability to debug complex data and system issues
Ability to work effectively in a self-directed environment with minimal oversight
Proven ability to manage multiple schedules, deliverables, and stakeholders simultaneously
Knowledge of off-highway, agricultural, construction, or industrial equipment data systems is a plus
Visa sponsorship is NOT available for this position.
Salary Range: $80,000-$120,000/year: Commensurate with experience
About RFA Engineering
RFA Engineering has provided product development and engineering services to industry-leading customers since 1943. Our primary focus is the development of off-highway equipment including agricultural, construction, mining, recreational, industrial, and special machines. Our work includes concept development, product design, documentation, problem-solving, simulation, optimization, and testing of components, systems and complete machines. Our engineering staff is located at our Engineering Center in Minneapolis, branch office in Dubuque, IA, and at numerous customer sites throughout the U.S.
Competitive Benefits
Health and Dental Insurance
TelaDoc Healthiest You
Supplemental Vision Insurance
Company Paid Life Insurance
Company Paid Long-Term Disability
Short-term Disability
Retirement Savings Account (Traditional 401k & Roth 401k)
Flexible Spending Plan Dependent Care
HSA for Medical Expenses
Bonus Plan (Exempt Employees Only)
Paid Time Off (PTO)
Paid Holidays
Bereavement Leave
Employee Assistance Programs (EAP)
Education Assistance
Equal Opportunity and Veteran Friendly
$80k-120k yearly 12d ago
Data Engineer
Holmes Murphy 4.1
Data engineer job in West Des Moines, IA
We are looking to add a Data Engineer to join our Information Technology team in West Des Moines, IA. Offering a forward-thinking, innovative, and vibrant company culture, along with the opportunity to share your unique potential, there really is no place like Holmes!
Essential Responsibilities:
Design and Implement Complex Pipelines: Design, build, and optimize ETL/ELT workflows from multiple sources using Azure Data Factory, Alteryx, Fabric, dbt, SQL, and other tools, enhancing performance and reliability.
Advanced Data Transformations: Implement transformations, aggregations, and custom logic to meet business requirements.
Data Modeling and Storage Optimization: Contribute to data modeling efforts, recommending storage solutions and structures that support analytical requirements.
Pipeline Monitoring and Optimization: Proactively monitor pipelines, perform root cause analysis for performance issues, and implement improvements.
Mentorship and Guidance: Provide guidance to junior engineers on best practices in data extraction, transformation, and loading.
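The custom transformation and aggregation logic described above can be sketched as a roll-up with one business rule applied along the way. Field names and the voided-transaction rule are illustrative, not from Holmes Murphy's actual workflows.

```python
# Sketch of a custom aggregation step: roll up transaction rows by client
# while applying business logic during the pass.
from collections import defaultdict

def aggregate_by_client(rows):
    totals = defaultdict(float)
    for row in rows:
        # Hypothetical business rule: exclude voided transactions.
        if not row.get("voided"):
            totals[row["client"]] += row["premium"]
    return dict(totals)

rows = [
    {"client": "A", "premium": 100.0, "voided": False},
    {"client": "A", "premium": 50.0, "voided": True},
    {"client": "B", "premium": 75.0, "voided": False},
]
print(aggregate_by_client(rows))  # {'A': 100.0, 'B': 75.0}
```

In a tool like dbt the same roll-up would be a `GROUP BY` model with the rule in a `WHERE` clause; the Python form just makes the logic explicit.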
Qualifications:
Education: Bachelor's degree in computer science, data science, technology, information systems or engineering preferred.
Experience: 3-5 years of professional experience in data, analytics, or platform engineering or related work-related experience strongly preferred.
Skills: Experience building ETL/ELT dataflows using tools like dbt, Alteryx, Azure Data Factory, or Microsoft Fabric. Skilled in working with diverse data sources (structured and unstructured), including delimited, fixed width, databases, and blob storage. Familiarity with APIs or RPA for third-party data collection; Salesforce experience is a plus. Knowledge of source control, cloud data storage, and platforms such as Snowflake or Azure preferred.
Technical Competencies: Possesses strong business and technology knowledge to understand business needs, make informed decisions, and deliver effective technology solutions, including related processes and procedures. Demonstrates excellent problem-solving skills to efficiently identify issues, determine root causes, and propose and implement effective solutions or improvements.
Here's a little bit about us:
In addition to being great at what you do, we place a high emphasis on building a best-in-class culture. We do this through empowering employees to build trust through honest and caring actions, ensuring clear and constructive communication, establishing meaningful client relationships that support their unique potential, and contributing to the organization's success by effectively influencing and uplifting team members.
Benefits: In addition to core benefits like health, dental and vision, also enjoy benefits such as:
Paid Parental Leave and supportive New Parent Benefits - We know being a working parent is hard, and we want to support our employees in this journey!
Company paid continuing Education & Tuition Reimbursement - We support those who want to develop and grow.
401k Profit Sharing - Each year, Holmes Murphy makes a lump sum contribution to every full-time employee's 401k. This means, even if you're not in a position to set money aside for the future at any point in time, Holmes Murphy will do it on your behalf! We are forward-thinking and want to be sure your future is cared for.
Generous time off practices in addition to paid holidays - Yes, we actually encourage employees to use their time off, and they do. After all, you can't be at your best for our clients if you're not at your best for yourself first.
Supportive of community efforts with paid Volunteer time off and employee matching gifts to charities that are important to you - Through our Holmes Murphy Foundation, we offer several vehicles where you can make an impact and care for those around you.
DE&I programs - Holmes Murphy is committed to celebrating every employee's unique diversity, equity, and inclusion (DE&I) experience with us. Not only do we offer all employees a paid Diversity Day time off option, but we also have a Chief Diversity Officer on hand, as well as a DE&I project team, committee, and interest group. You will have the opportunity to take part in those if you wish!
Consistent merit increase and promotion opportunities - Annually, employees are reviewed for merit increases and promotion opportunities because we believe growth is important - not only with your financial wellbeing, but also your career wellbeing.
Discretionary bonus opportunity - Yes, there is an annual opportunity to make more money. Who doesn't love that?!
Holmes Murphy & Associates is an Equal Opportunity Employer.
$77k-102k yearly est. 11d ago
Azure Data Engineer - 6013916
Accenture 4.7
Data engineer job in Des Moines, IA
Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists.
As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges.
You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing and/or eliminating the demands to travel.
Job Description:
Join our dynamic team and embark on a journey where you will be empowered to perform independently and grow into an SME. Active participation and contribution in team discussions will be key as you help provide solutions to work-related problems. Let's work together to achieve greatness!
Responsibilities:
* Create new data pipelines leveraging existing data ingestion frameworks and tools
* Orchestrate data pipelines using the Azure Data Factory service.
* Develop/Enhance data transformations based on the requirements to parse, transform and load data into Enterprise Data Lake, Delta Lake, Enterprise DWH (Synapse Analytics)
* Perform unit testing, coordinate integration testing and UAT, and create HLD/DD/runbooks for the data pipelines
* Configure compute and DQ rules; perform maintenance and performance tuning/optimization
Qualification
Basic Qualifications:
* Minimum of 3 years of work experience with one or more of the following: Databricks Data Engineering, DLT, Azure Data Factory, SQL, PySpark, Synapse Dedicated SQL Pool, Azure DevOps, Python
Preferred Qualifications:
* Azure Function Apps
* Azure Logic Apps
* Precisely & Cosmos DB
* Advanced proficiency in PySpark.
* Advanced proficiency in Microsoft Azure Databricks, Azure DevOps, Databricks Delta Live Tables and Azure Data Factory.
* Bachelor's or Associate's degree
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below. We accept applications on an on-going basis and there is no fixed deadline to apply.
Information on benefits is here.
Role Location:
California - $47.69 - $57.69
Cleveland - $47.69 - $57.69
Colorado - $47.69 - $57.69
District of Columbia - $47.69 - $57.69
Illinois - $47.69 - $57.69
Minnesota - $47.69 - $57.69
Maryland - $47.69 - $57.69
Massachusetts - $47.69 - $57.69
New York/New Jersey - $47.69 - $57.69
Washington - $47.69 - $57.69
$64k-82k yearly est. 3d ago
Data Engineer-Healthcare Analytics (Full-Time)
The Iowa Clinic, P.C 4.6
Data engineer job in West Des Moines, IA
Looking for a career where you love what you do and who you do it with? You're in the right place. Healthcare here is different - we're locally owned and led by our physicians, and all decisions are always made right here in Central Iowa. By working at The Iowa Clinic, you'll get to make a difference while seeing a difference in our workplace. Because as one clinic dedicated to exceptional care, we're committed to exceeding expectations, showing compassion and collaborating to provide the kind of care most of us got into this business to deliver in the first place.
Think you've got what it takes to join our TIC team? Keep reading…
A day in the life…
Wondering what a day in the life of a Data Engineer - Healthcare Analytics at The Iowa Clinic might look like?
We are seeking a skilled and motivated Data Engineer to join our growing analytics team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure that empower our organization to make data-driven decisions. You'll collaborate closely with data analysts, data scientists, and business stakeholders to ensure data is accessible, reliable, and optimized for performance.
Key Responsibilities
* Develop and maintain robust ETL/ELT pipelines using modern data engineering tools and frameworks.
* Design and implement data models and architectures that support analytics and reporting needs.
* Ensure data quality, integrity, and security across all systems.
* Collaborate with cross-functional teams to understand data requirements and deliver solutions.
* Monitor and optimize data workflows for performance and scalability.
* Maintain documentation for data processes, systems, and architecture.
NOTE: Candidates must have valid U.S. work authorization and will not require employer sponsorship now or in the future. We do not provide sponsorship.
Qualifications
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 3+ years of experience in data engineering or a similar role.
* Demonstrated knowledge and practical use of health information standards
* Strong understanding of analytical methods, tools, statistics and data management
* Proficiency in SQL and Python (or other scripting languages).
* Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
* Familiarity with tools like Airflow, dbt, Kafka, or Spark is a plus.
* Strong understanding of data modeling, data governance, and best practices in data architecture.
Preferred Skills
* Experience working in Agile environments.
* Knowledge of CI/CD pipelines and DevOps practices.
* Ability to communicate technical concepts to non-technical stakeholders.
* Passion for building scalable and efficient data systems.
Know someone else who might be a great fit for this role? Share it with them!
What's in it for you
* One of the best 401(k) programs in central Iowa, including employer match and profit sharing
* Employee incentives to share in the Clinic's success
* Generous PTO accruals and paid holidays
* Health, dental and vision insurance
* Quarterly volunteer opportunities through a variety of local nonprofits
* Training and development programs
* Opportunities to have fun with your colleagues, including TIC night at the Iowa Cubs, employee appreciation tailgate party, Adventureland day, State Fair tickets, annual holiday party, drive-in movie night… we could go on and on
* Monthly departmental celebrations, jeans days and clinic-wide competitions
* Employee rewards and recognition program
* Health and wellness program with up to $350/year in incentives
* Employee feedback surveys
* All employee meetings, team huddles and transparent communication
$80k-104k yearly est. 60d+ ago
Senior Data Engineer
Berkley 4.3
Data engineer job in Urbandale, IA
Company Details
Berkley Technology Services (BTS) is the dynamic technology solution for W. R. Berkley Corporation, a Fortune 500 Commercial Lines Insurance Company. With key locations in Urbandale, IA and Wilmington, DE, BTS provides innovative and customer-focused IT solutions to the majority of WRBC's 60+ operating units across the globe. BTS's wide reach ensures that ideas and opinions are considered at every level of the organization to guarantee we find the best solutions possible.
Driven by a commitment to collaboration, BTS acts as consultants to our customers and Operating Units by providing comprehensive solutions that not only address the challenge at hand, but proactively plan for the “What's Next” in our industry and beyond.
With a culture centered on innovation and entrepreneurial spirit, BTS stands as a community of technology leaders with eyes toward the future -- leaders who truly care about growing not only their team members, but themselves, and take pride in their employees who shine. BTS offers endless ways to get involved and have the chance to grow your career into a wide range of roles you'd never known existed. Come join us as we push forward into the future of industry leading technological solutions.
Berkley Technology Services: Right Team, Right Technology, Simple and Secure.
Responsibilities
This Sr. Data Engineer will provide data development at an advanced complexity level, such as ETL, cube creation, ad hoc and management reporting, and dashboard and data extract creation. This individual will work within a team environment that provides data resource development and support for several companies. They will be responsible for analyzing, designing, and coding solutions for rapidly growing companies supporting the property & casualty insurance industry.
Requires an in-depth understanding of insurance-related reporting and insurance company operations
Demonstrates a robust understanding of all the business data processes/processing, system interfaces, and where/how that data is used, and the related data structures
Design and implementation of ETL process, data structure and the analytical/operational reporting environment
Can design significant new system functionality with a consideration of performance, stability, and supportability
Thorough understanding of industry best practices. Includes but not limited to modeling, workflow and presentation
Design and implement data extracts both inbound and outbound for internal and external sources
Requires strong organizational and communication skills
Demonstrates problem-solving skills that span the application, middleware, and infrastructure levels - thorough upstream/downstream impact analysis
Assists in defining standards and design patterns/paradigms for data processes and/or structures for development within a team
Provides guidance on BTS development standards and quality expectations to BTS or company resources entering the job family
Demonstrates understanding of data processes and/or structures
Will be required to communicate directly with employees at all levels up to the senior level, within company and client companies
Will provide mentorship to others
Will develop sphere of influence with other teams
Will be required to communicate and coordinate within the team
May be responsible for on-call rotation
Some travel required up to 20%
Qualifications
10+ years of single-system / single-technology knowledge
10+ years of SQL experience (queries, stored procedures, functions)
Must have demonstrated the capability of meeting the key accountabilities, or have the ability to learn/perform them
A self-motivated individual with a passion for success
Needs to be able to determine how changes impact customer and other systems
Excellent communication and organizational skills
Ability to work in a fast-paced team environment
Ability to quickly adapt and learn new technologies
Ability to work independently
Strong customer and business focus
Behavioral Core Competencies
Technically Astute
Managing Information
Customer Service Oriented
Business Knowledge
Influential
Conceptual Thinking
Bachelor's degree with emphasis in related field or equivalent experience.
The Company is an equal employment opportunity employer.
Sponsorship Details: Sponsorship not offered for this role
$73k-98k yearly est. 36d ago
Data Architect Consultant
Intermountain Health 3.9
Data engineer job in Des Moines, IA
We're looking for a technical, highly collaborative Data Architect - Consultant who can bridge strategy, engineering, and delivery. This role is responsible for defining the problem to solve, shaping solution approaches, and leading projects end‑to‑end with strong facilitation and execution skills. You'll play a key role in maturing our code review process, establishing data architecture best practices, and elevating the quality and consistency of our delivery. The ideal candidate combines hands-on technical fluency with exceptional communication, enabling them to partner closely with engineers, guide architectural decisions, and drive continuous improvement across teams.
**Essential Functions**
Lead the design, implementation, and management of complex data infrastructures across multiple clinical and enterprise projects.
Apply deep expertise in relational databases, cloud technologies, and big data tools to deliver scalable, efficient, and secure data solutions.
Manage multiple complex projects simultaneously, ensuring alignment with clinical program objectives and organizational priorities.
Provide leadership and direction on enterprise-level data integration strategies.
Mentor junior and senior team members, fostering technical growth through structured guidance and code reviews.
Collaborate with cross-functional teams and stakeholders to ensure high-quality data architectures and pipelines across cloud and on-premise systems.
**Skills**
+ Leadership & Project Management - Ability to lead teams and manage multiple complex initiatives concurrently.
+ Mentorship & Code Review - Skilled in guiding junior team members and enforcing best practices through structured reviews.
+ Collaboration & Stakeholder Management - Strong interpersonal skills to work effectively with clinical program leaders and technical teams.
+ Data Architecture & Design - Expertise in designing scalable and secure data solutions.
+ Cloud Infrastructure & Data Solutions - Proficiency in AWS, Azure, or similar platforms.
+ ETL/ELT Development & Data Integration - Building and optimizing data pipelines.
+ Database & Performance Optimization - Ensuring high availability and efficiency.
+ Data Modeling & Documentation - Creating clear, maintainable models and technical documentation.
+ Data Governance & Security - Implementing compliance and security best practices.
+ Coding/Programming - Strong programming skills for data engineering and architecture.
Minimum Qualifications:
+ Expert proficiency with SQL and extensive experience with traditional RDBMS (e.g., Oracle, SQL Server, PostgreSQL).
+ Extensive experience with cloud platforms such as AWS, Azure, or Google Cloud Platform for data architecture and storage solutions.
+ Mastery of programming languages such as Python and PySpark for data engineering tasks.
+ In-depth knowledge of ETL/ELT processes and tools, including both traditional (e.g., SSIS, Informatica) and cloud-native solutions (e.g., Azure Data Factory, Databricks).
+ Outstanding communication skills for collaborating with stakeholders and teams.
+ Expert understanding of Product Management, Project Management, or Program Management philosophies and methodologies, and capable of applying them to data architecture projects to ensure alignment with business goals and efficient execution.
+ Demonstrated ability to stay updated on industry trends and advancements.
+ Proven experience in providing mentorship and guidance to junior and senior architects.
Preferred Qualifications:
+ A Master's degree in an analytics-related field such as information systems, data science / analytics, statistics, computer science, or mathematics, and 4 years of experience.
+ or
+ 8 years of professional experience in analytics role in an analytics-related field such as statistics, mathematics, information systems, computer science, data science / analytics.
+ or
+ Bachelor's degree in an analytics-related field such as information systems, data science / analytics, statistics, computer science, or mathematics, with 6 years of experience
+ Experience with Databricks, Apache Spark, and Delta Lake for real-time and batch data processing.
+ Proficiency in data streaming technologies such as Kafka, AWS Kinesis, or Azure Event Hubs.
+ Experience working with APIs to retrieve and integrate data from external systems.
+ Experience developing APIs to provide data as a product.
+ Familiarity with CI/CD pipelines for data engineering workflows.
+ Knowledge of data governance frameworks and compliance standards (e.g., GDPR, HIPAA).
+ Experience in a healthcare environment
+ Familiarity with business intelligence tools such as Tableau, Power BI, or Looker for delivering insights from data architectures
Remain sitting or standing for long periods of time to perform work on a computer, telephone, or other equipment.
**Location:**
Lake Park Building
**Work City:**
West Valley City
**Work State:**
Utah
**Scheduled Weekly Hours:**
40
The hourly range for this position is listed below. Actual hourly rate dependent upon experience.
$60.06 - $94.57
We care about your well-being - mind, body, and spirit - which is why we provide our caregivers a generous benefits package that covers a wide range of programs to foster a sustainable culture of wellness that encompasses living healthy, happy, secure, connected, and engaged.
Learn more about our comprehensive benefits package here.
Intermountain Health is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
At Intermountain Health, we use the artificial intelligence ("AI") platform, HiredScore to improve your job application experience. HiredScore helps match your skills and experiences to the best jobs for you. While HiredScore assists in reviewing applications, all final decisions are made by Intermountain personnel to ensure fairness. We protect your privacy and follow strict data protection rules. Your information is safe and used only for recruitment. Thank you for considering a career with us and experiencing our AI-enhanced recruitment process.
All positions subject to close without notice.
$72k-95k yearly est. 5d ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Data engineer job in Des Moines, IA
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g. Mathematics, Statistics, Operations Research), or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Masters or Ph.D. Degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
$98k-128k yearly est. 60d+ ago
Senior Data Engineer
Berkley 4.3
Data engineer job in Urbandale, IA
Company Details
Berkley Technology Services (BTS) is the dynamic technology solution for W. R. Berkley Corporation, a Fortune 500 Commercial Lines Insurance Company. With key locations in Urbandale, IA and Wilmington, DE, BTS provides innovative and customer-focused IT solutions to the majority of WRBC's 60+ operating units across the globe. BTS's wide reach ensures that ideas and opinions are considered at every level of the organization to guarantee we find the best solutions possible.
Driven by a commitment to collaboration, BTS acts as consultants to our customers and Operating Units by providing comprehensive solutions that not only address the challenge at hand, but proactively plan for the “What's Next” in our industry and beyond.
With a culture centered on innovation and entrepreneurial spirit, BTS stands as a community of technology leaders with eyes toward the future -- leaders who truly care about growing not only their team members, but themselves, and take pride in their employees who shine. BTS offers endless ways to get involved and have the chance to grow your career into a wide range of roles you'd never known existed. Come join us as we push forward into the future of industry leading technological solutions.
Berkley Technology Services: Right Team, Right Technology, Simple and Secure.
Responsibilities
Join Our Innovative Insurance Data Team!
We're looking for a Senior Data Engineer with a deep understanding of insurance reporting and company operations. If you have a knack for mastering business data processes, system interfaces, and data structures, this is the role for you!
In this position, you'll be tackling advanced tasks like ETL, cube creation, ad-hoc and management reporting, and crafting eye-catching dashboards and data extracts. You'll be part of a vibrant team that supports multiple companies, providing top-notch data resources and development.
Your mission includes analyzing, designing, and coding innovative solutions for fast-growing companies in the Property & Casualty insurance industry. Get ready to make a big impact and have some fun along the way!
Design & Implement: Create and manage ETL processes, data structures, and both analytical and operational reporting environments.
Innovate: Develop new system functionalities with a focus on performance, stability, and supportability.
Lead by Example: Apply industry best practices in modeling, workflow, and presentation.
Extract & Integrate: Design and implement data extracts for both internal and external sources.
Organize & Communicate: Utilize your strong organizational and communication skills to keep everything running smoothly.
Solve Problems: Tackle challenges across application, middleware, and infrastructure levels with thorough impact analysis.
Set Standards: Help define standards and design patterns for data processes and structures within the team.
Guide & Mentor: Provide guidance on development standards and quality expectations, and mentor new team members.
Collaborate: Communicate effectively with employees at all levels, both within the company and with clients.
Influence: Develop a sphere of influence with other teams and foster a collaborative environment.
Be On Call: Participate in on-call rotations to ensure smooth operations.
Travel: Enjoy occasional travel, up to 20%.
Qualifications
SQL Maestro: 7+ years of experience with SQL and ETL processes, including crafting queries, stored procedures, jobs, functions, synonyms, and aliases. Experience with SQL Environments and variables, version control, and execution plans. Extensive experience with ETL tools (SSIS or similar) and optimizing performance.
Proven Performer: Demonstrated ability to meet key accountabilities or the drive to learn and excel in them.
Passionate Go-Getter: A self-motivated individual with an unyielding passion for success.
Impact Analyst: Skilled at understanding how changes affect customers and other systems.
Communication Pro: Excellent communication and organizational skills to keep everything on track.
Team Player and Mentor: Thrives in a fast-paced team environment and loves to train and mentor junior staff on best practices and enhancing solution designs.
Tech Enthusiast: Quick to adapt and eager to learn new technologies.
Independent Worker: Capable of working independently with minimal supervision.
Customer Champion: Strong focus on customer and business needs.
Bachelor's degree in Computer Science, Information Technology, Information Systems, or a related discipline. Equivalent experience and/or alternative qualifications will be considered.
Behavioral Core Competencies
Technically Astute
Managing Information
Customer Service Oriented
Business Knowledge
Influential
Conceptual Thinking
The Company is an equal employment opportunity employer.
How much does a data engineer earn in West Des Moines, IA?
The average data engineer in West Des Moines, IA earns between $62,000 and $107,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in West Des Moines, IA
$81,000
What are the biggest employers of Data Engineers in West Des Moines, IA?
The biggest employers of Data Engineers in West Des Moines, IA are: