Johnston, IA - candidates must live within a 50-mile radius of the location; onsite attendance is required Tuesday/Wednesday/Thursday each week.
Project Scope and Brief Description:
Work at the intersection of plant cell biology and applied AI to build, productionize, and maintain computer vision pipelines that accelerate Doubled Haploid (DH) breeding in Biotechnology. The contractor will contribute to end-to-end imaging and analytics, from microscopy microspore detection to macroscopic structure assessment and plantlet characterization, supporting decisions that reduce cycle time and cost in DH programs. Solutions will be developed primarily in Python, integrated with our repositories and workflow tooling, and aligned with Biotech strategy initiatives.
Responsibilities:
Design & deliver deep learning-based CV models for microscopy and macroscopic assays (detection, segmentation, classification) with measurable accuracy, robustness, and throughput.
Build production‑ready pipelines in Python (data ingest, preprocessing, augmentation, inference, batch processing), integrated with GitLab repos and experiment tracking; ensure reproducibility and documentation.
Implement hyperspectral analysis workflows (band selection, normalization, feature extraction, model training).
Harmonize imaging acquisition with analysis by collaborating with biology teams to standardize microscopy/RGB/hyperspectral capture and file formats (e.g., FIJI/ImageJ for z‑stacks; autoscale practices).
Quantify model performance (precision/recall, F1, ROC/AUC, calibration) and write clear reports/posters for DH sessions; support fact‑checking in presentations.
Operationalize at scale: batch processing of tens of thousands of structures/images; optimize inference (e.g., torch.compile, mixed precision) and monitor resource usage.
Partner with DH stakeholders (biotech & breeding, Genome Technology Discovery, Data Science) to align deliverables with deployment milestones.
Maintain IP & data stewardship practices consistent with internal strategy; avoid disclosure of confidential protocols while enabling model re‑use.
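The performance quantification named above (precision/recall, F1) can be sketched without any framework; a minimal plain-Python illustration (production pipelines would more likely use scikit-learn or torchmetrics):

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Compute precision, recall, and F1 for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative detections scored against ground truth
p, r, f1 = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
print(round(p, 3), round(r, 3), round(f1, 3))  # 0.667 0.667 0.667
```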
Skills / Experience:
Must‑Have
4-6 years of hands-on computer vision experience with Python (PyTorch/TensorFlow), including detection/segmentation/classification for scientific or industrial imaging.
Proven ability to productionize models: Git/GitLab, code reviews, CI/CD basics, experiment tracking (MLflow or equivalent), reproducible data/experiments, and clear documentation.
Experience with microscopy image processing, multi‑page TIFFs, z‑stacks, autoscale/normalization, and image quality challenges.
Familiarity with hyperspectral or multispectral imaging pipelines (preprocessing, dimensionality reduction, modeling) applied to plant or biological materials.
Track record of measurable model performance reporting and communicating results via posters/presentations for technical audiences.
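As one illustration of z-stack handling, a maximum-intensity projection collapses a stack of focal planes into a single 2-D frame. The sketch below uses nested lists for clarity; a real pipeline would read multi-page TIFFs with a library such as tifffile and operate on NumPy arrays:

```python
def max_projection(zstack):
    """Collapse a z-stack (list of 2-D frames) into one frame by per-pixel max."""
    depth = len(zstack)
    rows, cols = len(zstack[0]), len(zstack[0][0])
    return [[max(zstack[z][r][c] for z in range(depth)) for c in range(cols)]
            for r in range(rows)]

# Two 2x2 frames at different focal depths (toy values)
stack = [[[1, 5], [0, 2]],
         [[3, 4], [7, 1]]]
print(max_projection(stack))  # [[3, 5], [7, 2]]
```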
Nice‑to‑Have
Vision Transformers (ViT) and modern YOLO workflows for microscopy/macroscopic tasks; comfort with inference tooling.
Experience optimizing inference (e.g., torch.compile, mixed precision) and scaling batch workflows.
Domain familiarity with Biotech breeding workflows.
Collaboration with discovery and strategy teams; ability to work across biology, engineering, and data science groups.
Soft Skills
Strong stakeholder communication and the ability to translate biology & process constraints into CV requirements; comfortable triaging and prioritizing rapidly in active programs.
Ownership mindset around documentation, reproducibility, and IP‑aware sharing.
Curious, continuous-learning mindset.
Technical leadership experience.
$64k-88k yearly est.
Data Scientist, Product Analytics
Meta
Des Moines, IA
As a Data Scientist at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Oculus). By applying your technical skills, analytical mindset, and product intuition to one of the richest data sets in the world, you will help define the experiences we build for billions of people and hundreds of millions of businesses around the world. You will collaborate on a wide array of product and business problems with a wide range of cross-functional partners across Product, Engineering, Research, Data Engineering, Marketing, Sales, Finance and others. You will use data and analysis to identify and solve product development's biggest challenges. You will influence product strategy and investment decisions with data, be focused on impact, and collaborate with other teams. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

Product leadership: You will use data to shape product development, quantify new opportunities, identify upcoming challenges, and ensure the products we build bring value to people, businesses, and Meta. You will help your partner teams prioritize what to build, set goals, and understand their product's ecosystem.

Analytics: You will guide teams using data and insights. You will focus on developing hypotheses and employ a varied toolkit of rigorous analytical approaches, different methodologies, frameworks, and technical approaches to test them.

Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
**Required Skills:**
Data Scientist, Product Analytics Responsibilities:
1. Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches
2. Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop strategies for our products that serve billions of people and hundreds of millions of businesses
3. Identify and measure success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends
4. Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations
5. Partner with Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions
**Minimum Qualifications:**
6. Bachelor's degree in Computer Science, Computer Engineering, Mathematics, Statistics, a relevant technical field, or equivalent practical experience
8. 4+ years of work experience in analytics, data querying languages such as SQL, scripting languages such as Python, and/or statistical/mathematical software such as R (minimum of 2 years with a Ph.D.)
9. 4+ years of experience solving analytical problems using quantitative approaches, understanding ecosystems, user behaviors and long-term product trends, and leading data-driven projects from definition to execution (including defining metrics, experiment design, and communicating actionable insights)
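A hedged sketch of the kind of experiment analysis this role involves: a pooled two-proportion z-test comparing conversion rates between two variants. The counts here are illustrative; in practice this would run against logged experiment data:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided pooled z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Variant A: 120/1000 conversions; variant B: 150/1000 (made-up numbers)
z, p = two_proportion_ztest(120, 1000, 150, 1000)
print(f"z={z:.2f}, p={p:.4f}")
```

In production, a library routine such as `statsmodels.stats.proportion.proportions_ztest` would typically replace the hand-rolled math.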
**Preferred Qualifications:**
10. Master's or Ph.D. Degree in a quantitative field
**Public Compensation:**
$147,000/year to $208,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
Data Scientist, Privacy
Datavant
Des Moines, IA
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub you will play a crucial role in ensuring that privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high-quality reports that summarize your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk
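One established way to reason about re-identification risk is k-anonymity: the size of the smallest group of records sharing the same quasi-identifier values. A minimal sketch (the field names are illustrative, not an actual schema):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the chosen quasi-identifier columns."""
    groups = Counter(tuple(rec[q] for q in quasi_identifiers) for rec in records)
    return min(groups.values())

# Toy records: a group of size 1 means a potentially re-identifiable patient
patients = [
    {"zip": "50309", "age_band": "40-49", "diagnosis": "A"},
    {"zip": "50309", "age_band": "40-49", "diagnosis": "B"},
    {"zip": "50310", "age_band": "30-39", "diagnosis": "C"},
]
print(k_anonymity(patients, ["zip", "age_band"]))  # 1
```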
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ A drive to understand real-world data in context rather than considering it in abstraction
+ Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop deeper expertise in its language
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
Data Scientist - Property/Casualty
Farm Bureau Financial Services
West Des Moines, IA
Farm Bureau is looking for a strong data-driven professional to join our team and help us live out our mission of protecting livelihoods and futures. In this role, you will perform statistical analysis as part of the predictive modeling team and assist in the delivery of pricing models and competitive research. Candidates must be local to the Des Moines area or willing to relocate, as at least three days per week in the office are required. Candidates must also have predictive modeling or actuarial experience.
Who We Are: At Farm Bureau Financial Services, we make insurance simple so our client/members can feel confident knowing their family, home, cars and other property are protected. We value a culture where integrity, teamwork, passion, service, leadership and accountability are at the heart of every decision we make and every action we take. We're proud of our more than 80-year commitment to protecting the livelihoods and futures of our client/members and creating an atmosphere where our employees thrive.
What You'll Do:
* Develop, implement, and document predictive models
* Interpret results to audiences of various technical knowledge
* Test, validate, and correct / clean data as needed
* Collaborate with managing actuaries on the pricing teams to deliver results
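The pricing work above can be illustrated at a toy scale with one-way claim-frequency relativities by rating-factor level relative to a base level. Real pricing models would be GLMs built in R or Emblem; the factor levels and counts here are made up:

```python
def one_way_relativities(exposures, claims, base_level):
    """Claim-frequency relativities by rating-factor level vs. a base level."""
    freq = {lvl: claims[lvl] / exposures[lvl] for lvl in exposures}
    return {lvl: freq[lvl] / freq[base_level] for lvl in freq}

# Hypothetical territory factor: earned exposures and claim counts
exposures = {"urban": 5000, "rural": 8000}
claims = {"urban": 400, "rural": 320}
print(one_way_relativities(exposures, claims, base_level="rural"))
```

A one-way view like this ignores correlations between rating factors, which is exactly why multivariate GLMs are the standard in property/casualty pricing.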
What It Takes to Join Our Team:
* College degree plus 3 years of predictive modeling experience required.
* Property Casualty experience preferred.
* Must have strong skills with SQL, R, Emblem or other predictive modeling software.
* Be team focused and be able to work in a collaborative work environment.
* Strong communication and presentation skills.
What We Offer You: When you're on our team, you get more than a great paycheck. You'll hear about career development and educational opportunities. We offer an enhanced 401K with a match, low-cost health, dental, and vision benefits, and life and disability insurance options. We also offer paid time off, including holidays and volunteer time, and teams who know how to have fun. Add to that an onsite wellness facility with fitness classes and programs, a daycare center, a cafeteria, and for many positions, even consideration for a hybrid work arrangement. Farm Bureau... where the grass really IS greener!
Work Authorization/Sponsorship
Applicants must be currently authorized to work in the United States on a full-time basis. We are not able to sponsor now or in the future, or take over sponsorship of, an employment visa or work authorization for this role. For example, we are not able to sponsor OPT status.
$65k-89k yearly est.
Data Engineer
LCS Senior Living
Des Moines, IA
This position is responsible for designing, developing, and maintaining robust data solutions to support business operations and decision-making processes. This role involves working with cutting-edge technologies, including Microsoft Azure Cloud, Snowflake, and Power BI, to build and manage data lakes, data warehouses, and reporting systems. The Data Engineer ensures data accuracy, system reliability, and high-quality deliverables by performing thorough quality assurance testing and providing user support. Collaboration with IT and business teams is essential to identify opportunities for operational efficiency and align data solutions with organizational goals. This role reports to the Director, Data Analytics.
Experience is Everything.
At LCS, experience is everything. We provide you the opportunity to use your talents in a progressive, growing organization that makes a positive difference in the lives of the seniors we serve. If you are seeking an organization that gives back, you'll love working here. Our principles and hospitality promises define our company culture. LCS employees can be found participating in volunteer activities, getting involved in our committees or collaborating with team members in our innovative workspace. You'll find several opportunities to grow as a professional, serve the community, and enhance the lives of seniors.
What You'll Do:
* Design, develop, and maintain scalable data lakes, data warehouses, and data storage solutions.
* Create and optimize ETL/ELT pipelines for efficient data processing and integration.
* Implement data models to support reporting, analytics, and operational workflows.
* Build, manage, and enhance Power BI dashboards and reports for actionable insights.
* Collaborate with business stakeholders to gather reporting requirements and ensure data accuracy.
* Leverage Microsoft Azure Cloud services (e.g., Data Factory, Synapse Analytics, Data Lake Storage) for data integration and processing.
* Utilize Snowflake to manage and optimize data warehouse performance.
* Perform thorough QA testing to ensure data accuracy, system reliability, and high performance.
* Provide user support and troubleshoot issues with data systems, pipelines, and reporting tools.
* Work closely with cross-functional teams to align data solutions with organizational objectives.
* Identify and implement operational efficiencies in data workflows and system designs.
* Ensure compliance with data governance, privacy, and security standards.
* Follow best practices for version control, documentation, and CI/CD processes.
* Stay updated on emerging technologies and trends to enhance data strategies and tools.
* Recommend improvements to existing data processes and tools to drive innovation.
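The ETL/ELT responsibilities above can be sketched at a toy scale: extract rows from CSV, cast types, and aggregate into a load-ready summary. The column names are invented for illustration; a production pipeline would run through Azure Data Factory and land in Snowflake:

```python
import csv, io

RAW = """community,month,occupancy
Oakview,2024-01,0.91
Oakview,2024-02,0.93
Maple Grove,2024-01,0.88
"""

def run_pipeline(raw_csv):
    """Extract rows, cast types, and aggregate average occupancy per community."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))   # extract
    for row in rows:                                    # transform: type casting
        row["occupancy"] = float(row["occupancy"])
    totals = {}
    for row in rows:                                    # aggregate (load-ready)
        totals.setdefault(row["community"], []).append(row["occupancy"])
    return {name: round(sum(v) / len(v), 3) for name, v in totals.items()}

print(run_pipeline(RAW))  # {'Oakview': 0.92, 'Maple Grove': 0.88}
```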
What We're Looking For:
* Required qualifications
* Bachelor's degree in Computer Science, MIS, or equivalent work experience.
* 2-4 years of experience in developing, implementing, and supporting enterprise solutions.
* Experience with database concepts, advanced SQL, and reporting/Business Intelligence software.
* Expertise in data architecture, data modeling, and building scalable data solutions.
* Proficiency with Microsoft Azure Cloud services (e.g., Azure Data Factory, Synapse Analytics, Data Lake Storage).
* Strong experience with Snowflake for data warehousing and optimization.
* Advanced skills in Power BI for creating reports, dashboards, and data visualizations.
* Proficient in SQL and scripting languages like Python for data integration and analysis.
* Solid understanding of ETL/ELT pipelines and data integration workflows.
* Strong problem-solving skills and attention to detail for troubleshooting data systems.
* Excellent communication and collaboration abilities to work effectively across teams.
Why Join Us?
* Industry Leader.
* Inclusive & collaborative culture.
* Top Workplace USA.
* Top Workplace Iowa.
* Charity and community involvement.
* Outstanding advancement opportunities.
* Ongoing career development.
Benefits
Competitive pay, great benefits and vacation time. We are an equal opportunity employer with benefits including medical, dental, life insurance, disability, 401(K) with company match and paid parental leave.
Our Commitment
LCS creates living experiences that enhance the lives of seniors. You'll see this commitment in our people. They're talented, dedicated professionals who truly care about residents, with each conducting his or her work with integrity, honesty and transparency according to the principles of LCS. We strive to help every community succeed-strengthening available resources, establishing proven practices that lead to long-term growth and value for those living in, working for and affiliated with the community. Check us out on our website: *************************
Additional Information
Travel frequency: 0-10%
Estimated Salary: $113,000 - 141,000
The actual title and salary will be determined after careful consideration of a wide range of factors, including your skills, qualifications, experience, and other relevant factors.
A POST-OFFER BACKGROUND CHECK, INCLUDING REFERENCES IS REQUIRED.
LCS IS AN EQUAL OPPORTUNITY EMPLOYER.
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation
Des Moines, IA
**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data, and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement, and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology and possess the ability to create change and be the facilitator for this transformation. They will have experience tailoring software design and developing and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k but will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review and vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior, demonstrated team-building and development skills to harness powerful teams
+ Ability to communicate effectively with different levels of seniority within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures, follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourages and maintains a 'best practice sharing' culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Flexibility in scheduling which may include nights, weekends, and holidays
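A small example of the kind of scripted automation mentioned above: flagging files in a landing directory whose transfers appear stalled. The directory layout and file names are hypothetical and not tied to Sterling File Gateway's actual structure:

```python
import os, tempfile, time
from pathlib import Path

def stalled_transfers(landing_dir, max_age_seconds):
    """Return names of files whose last modification is older than the threshold,
    a common signal that a partner file transfer has stalled."""
    cutoff = time.time() - max_age_seconds
    return sorted(p.name for p in Path(landing_dir).iterdir()
                  if p.is_file() and p.stat().st_mtime < cutoff)

# Demo against a temporary directory: one fresh file, one "stalled" file
with tempfile.TemporaryDirectory() as d:
    Path(d, "orders.edi").write_text("ISA")
    stale = Path(d, "invoices.edi")
    stale.write_text("ISA")
    os.utime(stale, (time.time() - 7200, time.time() - 7200))  # mtime 2h ago
    print(stalled_transfers(d, max_age_seconds=3600))  # ['invoices.edi']
```

In a real deployment this check would run on a schedule (cron, Ansible, or Control Center monitoring) and raise an alert rather than print.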
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
Principal Data Scientist
Maximus
Des Moines, IA
Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutions to capture new work).
This position requires occasional travel to the DC area for client meetings.
U.S. citizenship is required for this position due to government contract requirements.
Essential Duties and Responsibilities:
- Dive deep into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
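As a sketch of pulling objective insights from operational data, a simple z-score rule can flag anomalous daily volumes; the data and threshold below are illustrative, and real monitoring would use more robust methods:

```python
import statistics

def flag_anomalies(series, threshold=3.0):
    """Flag points more than `threshold` sample standard deviations from the mean."""
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    return [x for x in series if abs(x - mu) / sigma > threshold]

# Hypothetical daily transaction volumes with one spike
daily_volumes = [100, 102, 98, 101, 99, 103, 97, 100, 250]
print(flag_anomalies(daily_volumes, threshold=2.0))  # [250]
```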
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned, and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments, and their applicability for use in production/operational environments.
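Whichever of these areas a project touches (CV, NLP, LLMs, or classical ML), evaluating a classifier comes down to the same standard metrics; a self-contained precision/recall/F1 sketch with toy confusion counts:

```python
# Toy confusion counts for a binary classifier (illustrative numbers only).
def prf1(tp, fp, fn):
    """Precision, recall, and F1 from true-positive/false-positive/false-negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

p, r, f = prf1(tp=80, fp=20, fn=40)
# p == 0.8, r == 2/3, f == 8/11 (~0.727)
```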
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional programming experience (e.g., Python, R).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or bachelor's degree in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage #LI-Remote
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
Senior Data Engineer
Insight Global
Data engineer job in West Des Moines, IA
Insight Global is looking for a Technical Lead to guide and mentor a team of data engineers and developers responsible for designing and implementing scalable data and integration solutions. This role requires strong leadership in data analysis, Snowflake, SQL, and Kafka, with a focus on enabling high-quality data flow and seamless integration across enterprise systems. They will spend their day gathering requirements from the Product Owner and Scrum Master, working with the right engineers to find data, and building out scalable pipelines. They will participate in building and recommending new and improved architecture and bring ideas to the organization. They will be the face of the Enterprise Data pipelines and interact with the business and multiple teams. 50% of this role will still be hands-on development and building. The annual salary for this role is between $125,000-135,000/yr.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
12 years of IT development experience
Lead or Architect experience
Hands on experience with Snowflake
Experience with Kafka, SQL, or Azure Event Hubs for streaming
DBT or DataOps skills
Python or Terraform
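Scalable pipelines of the kind this lead would own usually hinge on incremental loads; a toy watermark pattern in plain Python stands in for the Snowflake/Kafka stack (all names and data are hypothetical):

```python
# Hypothetical incremental-load pattern: each run picks up only rows newer
# than the watermark left by the previous run.
def incremental_batch(rows, watermark):
    """rows: iterable of (event_time, payload). Returns (new rows, new watermark)."""
    fresh = sorted((r for r in rows if r[0] > watermark), key=lambda r: r[0])
    new_watermark = fresh[-1][0] if fresh else watermark
    return fresh, new_watermark

source = [(1, "a"), (2, "b"), (3, "c")]
batch1, wm = incremental_batch(source, watermark=0)   # first run: everything
source.append((4, "d"))                               # new data arrives
batch2, wm = incremental_batch(source, wm)            # second run: only (4, "d")
```

In a real Snowflake/Kafka deployment the watermark would live in a control table or consumer offset rather than a local variable, but the contract is the same.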
Sr. Data Engineer (ETL Developer)
Fidelity & Guaranty Life
Data engineer job in Des Moines, IA
FGL Holdings-the F&G family of insurance companies-is committed to helping Americans prepare for and live comfortably in their retirement. Through its subsidiaries, F&G is a leading provider of annuity and life insurance products. For nearly 60 years, we have offered annuity and life insurance products to those who are seeking safety, protection and income solutions to meet their needs.
At F&G, we believe our culture is what makes our company great. In 2019, we received a Top Workplace award, which we credit to our employees' shared cultural values: Collaborative, Authentic, Dynamic and Empowered. We believe that by embracing these values, we will continue to build and strengthen the company, while being a great place to work. We recruit talented and committed individuals to join our team, and we provide opportunities for personal and professional growth.
The Sr. Data Engineer will support the design, development, implementation, and maintenance of complex, mission-critical Informatica and SQL-based systems supporting F&G's insurance operations. As a senior team member, the senior data engineer will assume lead roles in design and development, oversee the work of junior developers, and participate in strategic planning around future development of the F&G data environment.
The Company has made a significant investment in information technology and relies heavily on data interfaces from multiple off-site source systems. Solid communication and problem-solving skills are required.
Organization
This position reports directly to the Director, Data Engineering & Administration and has significant interaction with members of the IT organization, Third Party Administrators, group managers, and departmental analysts throughout the organization. In addition, interaction with PMO Project Managers for prioritization and reporting, and relationship management with the business, will be critical and ongoing.
Duties and Responsibilities
Develop Informatica code to support existing and future EDS deployments
Performance tune existing and future Informatica code to ensure all SLAs are met
Develop and support a team of onshore and offshore Informatica developers to deliver exceptional quality and meet all project deadlines
Perform relational database analysis, modeling, and design of complex systems
Create detailed technical design documents in accordance with business requirements
Develop complex programs/queries to support transactional processing and regulatory reporting utilizing SQL and Informatica
Develop and perform detailed unit, quality assurance and regression tests to validate the readiness of internal developed code for production
Create detailed deployment plans for use in the migration of code from staging to production environments and provide deployment guides to host provider for deployments
Work with Infrastructure and other IT teams to implement complete solution
Create clear and effective Status reports as required
Perform impact analysis for interface/system changes affecting the applications and data environment
Work closely with Data Management team members to translate business needs into technical solutions
Assist Data Management Manager in developing estimates for project and maintenance work
Monitor/ensure acceptable levels of system performance, integrity and security
Support standards for system architecture, code quality and collaborative team development
Attend routine departmental meetings to support communication around development best practices, participate in change control discussions, review code, and provide technical instruction to colleagues
Partner with external TPAs and consultants to collaborate on large scale development efforts and enforce F&G standards for integration and data exchange
Attend conferences, developer forums, and training opportunities to ensure current technology trends are understood and applied within the F&G environment
Experience and Education Requirements
Bachelor's degree (preferred emphasis in computer science or MIS) or equivalent experiences
Senior- to expert-level design, development, and debugging ability with Informatica PowerCenter (including version 10.1)
Senior- to expert-level ability to optimize Informatica and SQL jobs through performance tuning
Minimum 5 years of experience supporting ETL and production data operations (file processing, data distribution, etc.), including debugging, addressing production issues, and performing Root Cause Analysis
Expert-level experience in designing and building large applications utilizing SQL Server
Experience in Windows batch scripting and scheduling jobs with job scheduling tools (e.g., JAMS), plus Data Marts and other Data Warehousing practices
Thorough understanding of the software development life cycle and experience working with geographically distributed teams (offshore, offsite, etc.)
Ability to use SQL development tools such as SQL Navigator and Toad as well as maintain code in source code control systems
Knowledge of proper database normalization, indexing, transaction protection and locking is essential
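The transaction-protection requirement above can be illustrated with Python's stdlib sqlite3 as a stand-in for SQL Server (the schema is invented): a failure mid-update must leave the data untouched.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policy (id INTEGER PRIMARY KEY, premium REAL)")
conn.execute("CREATE INDEX idx_policy_premium ON policy(premium)")  # indexing
conn.execute("INSERT INTO policy VALUES (1, 100.0)")
conn.commit()

# Transaction protection: an error mid-update rolls the whole change back.
try:
    with conn:  # context manager commits on success, rolls back on exception
        conn.execute("UPDATE policy SET premium = premium * 1.1")
        raise RuntimeError("simulated failure before commit")
except RuntimeError:
    pass

premium = conn.execute("SELECT premium FROM policy WHERE id = 1").fetchone()[0]
# premium is still 100.0 - the partial update was rolled back
```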
Preferred Skills
Experience supporting DTCC and data transfers to/from external organizations and internal systems using EFT (Electronic File Transfer)
Working knowledge of Informatica Data Quality, Business Glossary, Metadata Manager
Experience with database design/modeling tools such as Erwin
Skills and Abilities
Strong technical documentation ability
Familiar with SSIS, Python and other ETL frameworks preferred
Previous experience with Tableau, operational reporting is a plus
Must have a teamwork focused attitude and be skilled at building relationships within IT organizations and across business functions
Life/Annuity insurance industry experience strongly preferred
Excellent oral and written communication skills
Knowledge of data integrity protocols and security requirements and techniques
Strong time management and organizational skills to enable productivity in a fast-paced, dynamic development environment
Strong verbal communication skills and a demonstrated ability to work effectively in team-based development projects
Physical Demands and Work Environment
Must be able to work in a fast-paced team environment and handle multiple projects and assignments under tight deadlines
Must demonstrate willingness to work flexible hours as needed to accommodate business needs and deliverables
Must be able to sit at a computer for extended periods of time
#LI-JS1
#INDHP
Data Engineer-Healthcare Analytics (Full-Time)
The Iowa Clinic, P.C.
Data engineer job in West Des Moines, IA
Looking for a career where you love what you do and who you do it with? You're in the right place. Healthcare here is different - we're locally owned and led by our physicians, and all decisions are always made right here in Central Iowa. By working at The Iowa Clinic, you'll get to make a difference while seeing a difference in our workplace. Because as one clinic dedicated to exceptional care, we're committed to exceeding expectations, showing compassion and collaborating to provide the kind of care most of us got into this business to deliver in the first place.
Think you've got what it takes to join our TIC team? Keep reading…
A day in the life…
Wondering what a day in the life of a Data Engineer - Healthcare Analytics at The Iowa Clinic might look like?
We are seeking a skilled and motivated Data Engineer to join our growing analytics team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure that empower our organization to make data-driven decisions. You'll collaborate closely with data analysts, data scientists, and business stakeholders to ensure data is accessible, reliable, and optimized for performance.
Key Responsibilities
* Develop and maintain robust ETL/ELT pipelines using modern data engineering tools and frameworks.
* Design and implement data models and architectures that support analytics and reporting needs.
* Ensure data quality, integrity, and security across all systems.
* Collaborate with cross-functional teams to understand data requirements and deliver solutions.
* Monitor and optimize data workflows for performance and scalability.
* Maintain documentation for data processes, systems, and architecture.
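The data-quality responsibility above often starts as a row-level gate like this sketch; the field names are invented for illustration, not an actual clinic schema:

```python
# Hypothetical quality gate for a patient-visit feed.
REQUIRED = {"visit_id", "patient_id", "visit_date"}

def validate(row):
    """Return a list of quality problems; an empty list means the row passes."""
    problems = [f"missing {f}" for f in REQUIRED if not row.get(f)]
    if row.get("visit_date", "") > "2100-01-01":
        problems.append("visit_date implausibly far in the future")
    return problems

good = {"visit_id": "V1", "patient_id": "P9", "visit_date": "2024-05-01"}
bad = {"visit_id": "V2", "visit_date": "2999-01-01"}
# validate(good) == []; validate(bad) flags the missing patient_id
# and the implausible date
```

In a pipeline, failed rows would typically be routed to a quarantine table with their problem list rather than silently dropped.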
NOTE: Candidates must have valid U.S. work authorization and will not require employer sponsorship now or in the future. We do not provide sponsorship.
Qualifications
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* 3+ years of experience in data engineering or a similar role.
* Demonstrated knowledge and practical use of health information standards
* Strong understanding of analytical methods, tools, statistics and data management
* Proficiency in SQL and Python (or other scripting languages).
* Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
* Familiarity with tools like Airflow, dbt, Kafka, or Spark is a plus.
* Strong understanding of data modeling, data governance, and best practices in data architecture.
Preferred Skills
* Experience working in Agile environments.
* Knowledge of CI/CD pipelines and DevOps practices.
* Ability to communicate technical concepts to non-technical stakeholders.
* Passion for building scalable and efficient data systems.
Know someone else who might be a great fit for this role? Share it with them!
What's in it for you
* One of the best 401(k) programs in central Iowa, including employer match and profit sharing
* Employee incentives to share in the Clinic's success
* Generous PTO accruals
* and paid holidays
* Health, dental and vision insurance
* Quarterly volunteer opportunities through a variety of local nonprofits
* Training and development programs
* Opportunities to have fun with your colleagues, including TIC night at the Iowa Cubs, employee appreciation tailgate party, Adventureland day, State Fair tickets, annual holiday party, drive-in movie night… we could go on and on
* Monthly departmental celebrations, jeans days and clinic-wide competitions
* Employee rewards and recognition program
* Health and wellness program with up to $350/year in incentives
* Employee feedback surveys
* All employee meetings, team huddles and transparent communication
Data Engineer
Rogers Freels & Associates Inc.
Data engineer job in Johnston, IA
Job Description
RFA Engineering (***************) supports industry-leading clients through the full software development lifecycle to build cutting-edge precision agriculture, machine guidance, vehicle automation, and autonomy applications. We are seeking passionate, talented engineers to work on exciting projects using the latest tools and technologies including robotics, computer vision, machine learning, IoT, cloud computing, and much more. Collaborate with a team of industry experts onsite at our client's world-class engineering center and contribute to developing innovative solutions that drive sustainable agriculture practices.
This is a full-time position with a full benefit package listed below that includes opportunities for professional growth, direct hire by our customers, and additional opportunities within our own organization.
Senior Data Engineer
As a Data Engineer, you will enable engineers, analysts, and data scientists to more effectively access, explore, and leverage enterprise data to generate insights, build models, and make data-driven decisions. This role focuses on designing scalable cloud-based data solutions, developing high-quality datasets, and creating data products that empower individual contributors to work independently with complex, high-volume data.
A key component of this role involves collaborating with embedded and digital engineering teams to define, capture, and analyze system performance, UI, and health metrics. You will help translate raw signals into actionable insights through analytics, dashboards, and monitoring tools.
Responsibilities
Design, build, and maintain cloud-based data pipelines and data products that support analytics, machine learning, and exploratory data analysis
Enable self-service data access by developing well-structured, documented, and discoverable datasets for individual contributors
Partner with embedded and digital engineering teams to define required performance, UI, and system health metrics and ensure appropriate data capture
Perform analytics and data aggregation to translate raw signals into meaningful insights and visualizations
Develop and maintain production-ready Python code and data workflows using SQL, PySpark, Databricks, and related technologies
Manage and optimize storage of diverse data types, including images, raster data, parquet files, time-series data, geo-tagged data, text, and other unstructured data
Build dashboards and interactive data applications using tools such as Tableau, Plotly Dash, or similar web-based visualization frameworks
Develop alerting and monitoring solutions (e.g., automated email notifications) to identify system performance issues or data pipeline failures
Collaborate in code reviews, documentation, and best-practice development to ensure maintainable and reproducible solutions
Manage multiple projects, priorities, and milestones while maintaining a strong sense of ownership and accountability
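The alerting-and-monitoring responsibility above can start as a simple failure-rate rule; a minimal sketch (the notification itself, e.g. an automated email, is omitted, and the thresholds are arbitrary):

```python
# Hypothetical alerting rule: flag a pipeline when failures exceed a
# tolerated rate over its recent run history.
def needs_alert(run_history, max_failure_rate=0.2, min_runs=5):
    """run_history: list of bools, True = run succeeded."""
    if len(run_history) < min_runs:
        return False  # not enough evidence yet
    failures = run_history.count(False)
    return failures / len(run_history) > max_failure_rate

healthy = [True] * 9 + [False]                 # 10% failures -> no alert
degraded = [True, False, True, False, False]   # 60% failures -> alert
```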
Requirements
Bachelor's degree or higher in Computer Science, Software Engineering, Data Engineering, or a related technical field, or equivalent professional experience
Demonstrated experience designing and implementing cloud-based data solutions in production environments
Strong proficiency in Python for data engineering and analytics applications
Experience working with data access and processing technologies such as SQL, PySpark, Databricks, Postgres, and MongoDB
Experience building and maintaining datasets at scale across multiple data formats and structures
Familiarity with vehicle or embedded system data, including CAN signals
Excellent written and verbal communication skills, including the ability to lead meetings, clearly document work, communicate proof-of-concepts, and collaborate across teams
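For the CAN-signal familiarity noted above, extracting one signal from a raw frame reduces to bit arithmetic; a toy decode follows (the layout and scaling are invented for illustration, not taken from any real DBC definition):

```python
# Toy decode of one little-endian CAN signal from an 8-byte frame:
# start bit, length in bits, then physical value = raw * scale + offset.
def decode_signal(frame, start_bit, length, scale, offset):
    raw = int.from_bytes(frame, "little")            # whole frame as one integer
    value = (raw >> start_bit) & ((1 << length) - 1) # mask out the signal bits
    return value * scale + offset

# A 16-bit engine-speed-style signal at bit 0 with 0.125 rpm/bit:
frame = (6400).to_bytes(2, "little") + bytes(6)      # pad to 8 bytes
rpm = decode_signal(frame, start_bit=0, length=16, scale=0.125, offset=0)
# 6400 * 0.125 == 800.0
```

Real decoding would also handle byte order variants and signed signals, but the shift-and-mask core is the same.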
Desired Attributes
Experience developing dashboards and visual analytics solutions using tools such as Tableau, Plotly, or similar platforms
Familiarity with spatial data and visualization tools such as Folium or Plotly
Experience implementing system monitoring, alerting, and health-tracking solutions
Strong analytical and problem-solving skills with the ability to debug complex data and system issues
Ability to work effectively in a self-directed environment with minimal oversight
Proven ability to manage multiple schedules, deliverables, and stakeholders simultaneously
Knowledge of off-highway, agricultural, construction, or industrial equipment data systems is a plus
Visa sponsorship is NOT available for this position.
Salary Range: $80,000-$120,000/year, commensurate with experience
About RFA Engineering
RFA Engineering has provided product development and engineering services to industry-leading customers since 1943. Our primary focus is the development of off-highway equipment including agricultural, construction, mining, recreational, industrial, and special machines. Our work includes concept development, product design, documentation, problem-solving, simulation, optimization, and testing of components, systems and complete machines. Our engineering staff is located at our Engineering Center in Minneapolis, branch office in Dubuque, IA, and at numerous customer sites throughout the U.S.
Competitive Benefits
Health and Dental Insurance
TelaDoc Healthiest You
Supplemental Vision Insurance
Company Paid Life Insurance
Company Paid Long-Term Disability
Short-term Disability
Retirement Savings Account (Traditional 401k & Roth 401k)
Flexible Spending Plan Dependent Care
HSA for Medical Expenses
Bonus Plan (Exempt Employees Only)
Paid Time Off (PTO)
Paid Holidays
Bereavement Leave
Employee Assistance Programs (EAP)
Education Assistance
Equal Opportunity and Veteran Friendly
Data Scientist, Supply Chain Digitalization
Emerson
Data engineer job in Marshalltown, IA
At Emerson, we pride ourselves on our industry-leading control valve portfolio, featuring top brands like Fisher, Sempell, and Crosby. These brands power the world through a wide range of applications across various industries, from refining and hydrogen to nuclear power. By joining our Supply Chain Digitalization team, you'll become part of an almost 150-year legacy that began with William Fisher in 1880 and help us take the next steps into the future by harnessing statistical analytics and emerging technologies to deliver products to customers faster and with greater certainty. We're looking for someone who is eager to take these next steps with us and make a significant impact on our operations.
If this sounds like the perfect opportunity for you, we encourage you to apply and become a vital part of our team. Together, we can continue to drive innovation and excellence in the industry. Relocation assistance available.
We look forward to hearing from you!
In This Role, Your Responsibilities Will Be:
Use statistics, programming, modeling, and related analytics to extract actionable insights from complex operations and supply chain data.
Develop ‘proof of concept' models for advanced supply chain and operations tools to optimize cost and performance.
Support implementation of ‘proof of concept' models into enterprise operations, through testing, validation, planning, and cross functional collaboration with stakeholders.
Support management decision-making with data-driven analytics.
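A 'proof of concept' supply chain model can be as small as a reorder-point formula; a sketch with illustrative numbers (here z = 1.65 approximates a 95% service level, and demand is treated as normally distributed and independent across days):

```python
import math

# Classic reorder point: expected demand during lead time plus safety stock.
def reorder_point(daily_demand_mean, daily_demand_std, lead_time_days, z=1.65):
    demand_during_lead = daily_demand_mean * lead_time_days
    safety_stock = z * daily_demand_std * math.sqrt(lead_time_days)
    return demand_during_lead + safety_stock

rop = reorder_point(daily_demand_mean=40, daily_demand_std=8, lead_time_days=9)
# 40*9 + 1.65*8*3 = 360 + 39.6 = 399.6 units
```

A production version would estimate the demand mean and variance from historical order data rather than hard-coding them.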
Who You Are:
You break down objectives into appropriate initiatives and actions. You persist in accomplishing objectives despite obstacles and setbacks. You step up to conflicts, seeing them as opportunities. You delegate and distribute assignments and decisions appropriately. You create milestones to rally support behind the program.
For This Role, You Will Need:
Bachelor's degree and 3+ years in data science, statistics, or a related field, or a PhD
Legal Authorization to work in the United States - sponsorship will not be provided for this role
Preferred Qualifications That Set You Apart:
Advanced Degree in related field preferred
Experience in supply chain, operations or related disciplines
Experience with Power BI, Python, Oracle, machine learning models, ML model training algorithms, agentic AI, and similar
Our Culture & Commitment to You:
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives-because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together are key to driving growth and delivering business results.
We recognize the importance of employee wellbeing. We prioritize providing flexible, competitive benefits plans to meet you and your family's physical, mental, financial, and social needs. We provide a variety of medical insurance plans, with dental and vision coverage, Employee Assistance Program, 401(k), tuition reimbursement, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave.
#LI-AN1
Azure Data Engineer - 6013916
Accenture
Data engineer job in Des Moines, IA
Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists.
As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges.
You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing and/or eliminating the demands to travel.
Job Description:
Join our dynamic team and embark on a journey where you will be empowered to perform independently and grow into an SME. Active participation and contribution in team discussions will be key as you provide solutions to work-related problems. Let's work together to achieve greatness!
Responsibilities:
+ Create new data pipelines leveraging existing data ingestion frameworks, tools
+ Orchestrate data pipelines using the Azure Data Factory service.
+ Develop/enhance data transformations based on requirements to parse, transform, and load data into the Enterprise Data Lake, Delta Lake, and Enterprise DWH (Synapse Analytics)
+ Perform unit testing and coordinate integration testing and UAT
+ Create HLD/DD/runbooks for the data pipelines
+ Configure compute and DQ rules; perform maintenance, performance tuning, and optimization
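The parse-transform-load responsibility reduces to a simple shape; sketched here in plain Python rather than PySpark/ADF, with a toy record format (real work would target the lake/Synapse sinks named above):

```python
# Toy parse -> transform -> load, the skeleton of the ADF/Databricks work.
def parse(line):
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

def transform(rec):
    # Normalize currency to integer cents to avoid float comparisons downstream.
    rec["amount_usd_cents"] = round(rec["amount"] * 100)
    return rec

lake = []  # stand-in for the Delta Lake sink
for line in ["A-1,19.99", "A-2,5.00"]:
    lake.append(transform(parse(line)))
# lake[0]["amount_usd_cents"] == 1999
```

In PySpark the same three steps become a schema-on-read load, a column expression, and a Delta write, but the data contract is identical.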
Basic Qualifications:
+ Minimum of 3 years of work experience with one or more of the following: DatabricksDataEngineering, DLT, Azure Data Factory, SQL, PySpark, Synapse Dedicated SQL Pool, Azure DevOps, Python
Preferred Qualifications:
+ Azure Function Apps
+ Azure Logic Apps
+ Precisely & COSMOS DB
+ Advanced proficiency in PySpark.
+ Advanced proficiency in Microsoft Azure Databricks, Azure DevOps, Databricks Delta Live Tables and Azure Data Factory.
+ Bachelor's or Associate's degree
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below. We accept applications on an on-going basis and there is no fixed deadline to apply.
Information on benefits is here. (************************************************************)
Role Location:
California - $47.69 - $57.69
Cleveland - $47.69 - $57.69
Colorado - $47.69 - $57.69
District of Columbia - $47.69 - $57.69
Illinois - $47.69 - $57.69
Minnesota - $47.69 - $57.69
Maryland - $47.69 - $57.69
Massachusetts - $47.69 - $57.69
New York/New Jersey - $47.69 - $57.69
Washington - $47.69 - $57.69
Requesting an Accommodation
Accenture is committed to providing equal employment opportunities for persons with disabilities or religious observances, including reasonable accommodation when needed. If you are hired by Accenture and require accommodation to perform the essential functions of your role, you will be asked to participate in our reasonable accommodation process. Accommodations made to facilitate the recruiting process are not a guarantee of future or continued accommodations once hired.
If you would like to be considered for employment opportunities with Accenture and have accommodation needs such as for a disability or religious observance, please call us toll free at **************** or send us an email or speak with your recruiter.
Equal Employment Opportunity Statement
We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities.
For details, view a copy of the Accenture Equal Opportunity Statement (********************************************************************************************************************************************)
Accenture is an EEO and Affirmative Action Employer of Veterans/Individuals with Disabilities.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Other Employment Statements
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States.
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process. Further, at Accenture a criminal conviction history is not an absolute bar to employment.
The Company will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Additionally, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the Company's legal duty to furnish information.
California requires additional notifications for applicants and employees. If you are a California resident, live in or plan to work from Los Angeles County upon being hired for this position, please click here for additional important information.
Please read Accenture's Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
$64k-82k yearly est. 9d ago
Azure Data Engineer
CapB Infotek
Data engineer job in Des Moines, IA
CapB is a global leader in IT Solutions and Managed Services. Our R&D is focused on providing cutting-edge products and solutions across Digital Transformations, from Cloud, AI/ML, IoT, and Blockchain to MDM/PIM, Supply Chain, ERP, CRM, HRMS, and Integration solutions. For our growing needs we are looking for consultants who can work with us on a salaried or contract basis. We provide industry-standard benefits and an environment for learning and growth.
For one of our ongoing projects we are looking for an Azure Data Engineer. The position is based out of Des Moines, IA. Local candidates are preferred, but the work can be done remotely for the time being this year.
Responsibilities:
• Create functional design specifications, Azure reference architectures, and assist with other project deliverables as needed.
• Design and Develop Platform as a Service (PaaS) Solutions using different Azure Services
• Create a data factory, orchestrate data processing activities in a data-driven workflow, monitor and manage the data factory, move, transform and analyze data
• Design complex enterprise data solutions that utilize Azure Data Factory; create migration plans to move legacy SSIS packages into Azure Data Factory
• Build conceptual and logical data models
• Design and implement big data real-time and batch processing solutions
• Design, build, and scale data pipelines across a variety of source systems and streams (internal, third-party, as well as cloud-based), distributed / elastic environments, and downstream applications and/or self-service solutions.
• Develop and document mechanisms for deployment, monitoring and maintenance
Skills and Experience:
• Bachelor's degree or higher in Computer Science/Engineering, Information Technology, or Information Systems
• 3+ years of experience with the Microsoft Cloud Data Platform: Azure Data Factory, Azure Databricks, Python, Scala, Spark SQL, SQL Data Warehouse
• 3+ years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, data lake solutions
• Expertise with SQL, database design/structures, ETL/ELT design patterns, and DataMart structures (star, snowflake schemas, etc.)
• Functional knowledge of programming, scripting, and data science languages such as JavaScript, PowerShell, Python, Bash, SQL, .NET, Java, PHP, Ruby, Perl, C++, R, etc.
• Creation of descriptive, predictive, and prescriptive analytics solutions using Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, HDP, Spark, Databricks, MapReduce, Pig, Hive, Tez, SSAS, Watson Analytics, SPSS
• Experience in Azure Data Factory (ADF), creating multiple pipelines and activities for full and incremental data loads into Azure Data Lake Store and Azure SQL DW
• Experience with Azure Data Lake Storage and working with Parquet files and partitions
• Experience managing Microsoft Azure environments with VMs, VNets, subnets, NSGs, resource groups, etc.
• Experience in creation and configuration of Azure resources and RBAC
• Experience with Git/Azure DevOps
• Azure certification would be desired
• Must be able to communicate clearly and be a team player
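The ADF bullet above mentions full and incremental data loads. The usual pattern is a watermark column: load only rows modified since the last recorded high-water mark, then advance it. A minimal sketch of that pattern, using `sqlite3` as a stand-in for Azure SQL DW / Data Lake targets; all table and column names here are illustrative, not from any real system:

```python
# Watermark-based incremental load, sketched with sqlite3 as a stand-in
# for the Azure source and sink. Names are illustrative only.
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, modified_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 10.0, "2024-01-01"),
    (2, 25.0, "2024-01-02"),
    (3, 40.0, "2024-01-03"),
])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders (id INTEGER, amount REAL, modified_at TEXT)")
tgt.execute("CREATE TABLE watermark (last_loaded TEXT)")
tgt.execute("INSERT INTO watermark VALUES ('1970-01-01')")

def incremental_load():
    """Copy only rows modified after the stored watermark, then advance it."""
    (last,) = tgt.execute("SELECT last_loaded FROM watermark").fetchone()
    rows = src.execute(
        "SELECT id, amount, modified_at FROM orders WHERE modified_at > ?",
        (last,),
    ).fetchall()
    if rows:
        tgt.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
        tgt.execute("UPDATE watermark SET last_loaded = ?",
                    (max(r[2] for r in rows),))
    return len(rows)

print(incremental_load())  # first run is a full load: all 3 rows
src.execute("INSERT INTO orders VALUES (4, 5.0, '2024-01-04')")
print(incremental_load())  # second run picks up only the new row
```

In ADF the same idea is expressed with a Lookup activity reading the watermark, a Copy activity filtered on it, and a final update of the watermark store.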
$70k-94k yearly est. 60d+ ago
Data Engineer
Holmes Murphy 4.1
Data engineer job in West Des Moines, IA
We are looking to add a Data Engineer to join our Information Technology team in West Des Moines, IA. Offering a forward-thinking, innovative, and vibrant company culture, along with the opportunity to share your unique potential, there really is no place like Holmes!
Essential Responsibilities:
Design and Implement Complex Pipelines: Design, build, and optimize ETL/ELT workflows from multiple sources using Azure Data Factory, Alteryx, Fabric, dbt, SQL, and other tools, enhancing performance and reliability.
Advanced Data Transformations: Implement transformations, aggregations, and custom logic to meet business requirements.
Data Modeling and Storage Optimization: Contribute to data modeling efforts, recommending storage solutions and structures that support analytical requirements.
Pipeline Monitoring and Optimization: Proactively monitor pipelines, perform root cause analysis for performance issues, and implement improvements.
Mentorship and Guidance: Provide guidance to junior engineers on best practices in data extraction, transformation, and loading.
Qualifications:
Education: Bachelor's degree in computer science, data science, technology, information systems or engineering preferred.
Experience: 3-5 years of professional experience in data, analytics, or platform engineering, or related experience, strongly preferred.
Skills: Experience building ETL/ELT dataflows using tools like dbt, Alteryx, Azure Data Factory, or Microsoft Fabric. Skilled in working with diverse data sources (structured and unstructured), including delimited, fixed width, databases, and blob storage. Familiarity with APIs or RPA for third-party data collection; Salesforce experience is a plus. Knowledge of source control, cloud data storage, and platforms such as Snowflake or Azure preferred.
Technical Competencies: Possesses strong business and technology knowledge to understand business needs, make informed decisions, and deliver effective technology solutions, including related processes and procedures. Demonstrates excellent problem-solving skills to efficiently identify issues, determine root causes, and propose and implement effective solutions or improvements.
Here's a little bit about us:
In addition to being great at what you do, we place a high emphasis on building a best-in-class culture. We do this through empowering employees to build trust through honest and caring actions, ensuring clear and constructive communication, establishing meaningful client relationships that support their unique potential, and contributing to the organization's success by effectively influencing and uplifting team members.
Benefits: In addition to core benefits like health, dental, and vision, you'll also enjoy benefits such as:
Paid Parental Leave and supportive New Parent Benefits - We know being a working parent is hard, and we want to support our employees in this journey!
Company paid continuing Education & Tuition Reimbursement - We support those who want to develop and grow.
401k Profit Sharing - Each year, Holmes Murphy makes a lump sum contribution to every full-time employee's 401k. This means, even if you're not in a position to set money aside for the future at any point in time, Holmes Murphy will do it on your behalf! We are forward-thinking and want to be sure your future is cared for.
Generous time off practices in addition to paid holidays - Yes, we actually encourage employees to use their time off, and they do. After all, you can't be at your best for our clients if you're not at your best for yourself first.
Supportive of community efforts with paid Volunteer time off and employee matching gifts to charities that are important to you - Through our Holmes Murphy Foundation, we offer several vehicles where you can make an impact and care for those around you.
DE&I programs - Holmes Murphy is committed to celebrating every employee's unique diversity, equity, and inclusion (DE&I) experience with us. Not only do we offer all employees a paid Diversity Day time off option, but we also have a Chief Diversity Officer on hand, as well as a DE&I project team, committee, and interest group. You will have the opportunity to take part in those if you wish!
Consistent merit increase and promotion opportunities - Annually, employees are reviewed for merit increases and promotion opportunities because we believe growth is important - not only with your financial wellbeing, but also your career wellbeing.
Discretionary bonus opportunity - Yes, there is an annual opportunity to make more money. Who doesn't love that?!
Holmes Murphy & Associates is an Equal Opportunity Employer.
#LI-GH1
$77k-102k yearly est. 19d ago
Data Scientist (Python, Power BI, Databricks, SQL, data modeling, R, JavaScript, MongoDB)
Ccg Business Solutions 4.2
Data engineer job in Urbandale, IA
CCG Talent Management is not only a business solutions company but a company that believes success starts with the individual. CCG Business Solutions has been consulting and providing talent placement services since 2007. Our team understands the principles of connecting purpose to business. We are currently recruiting for a Data Scientist (Python, Power BI, Databricks, SQL, data modeling, R, JavaScript, MongoDB).
Job Description
As a Data Scientist, you will join a client team leveraging petabyte-scale datasets for advanced analytics and model building to enable intelligent, automated equipment and improved decisions by farmers. The client team partners with product managers and data engineers to design, scale, and deliver full-stack data science solutions. You will join a passionate team making a difference by applying innovative technology to solve some of the world's biggest problems.
Responsibilities
Communicate your findings and methodologies with impact to stakeholders from a variety of backgrounds.
Work with high-resolution machine and agronomic data in the development and testing of predictive models.
Develop and deliver production-ready machine learning approaches to yield insights and recommendations from precision agriculture data
Define, quantify, and analyze Key Performance Indicators that define successful customer outcomes.
Work closely with the Data Engineering teams to ensure data is stored efficiently and can support the required analytics.
Qualifications
Demonstrated competency in developing production-ready models in an object-oriented programming language such as Python.
Demonstrated competency in using data-access technologies such as SQL, Spark, Databricks, BigQuery, MongoDB, etc.
Experience with Visualization tools such as Tableau, PowerBI, DataStudio, etc.
Experience with Data Modeling techniques such as Normalization, data quality, and coverage assessment, attribute analysis, performance management, etc.
Experience building machine learning models such as Regression, supervised learning, unsupervised learning, probabilistic inference, natural language modeling, etc.
Excellent communication skills. Able to effectively lead meetings, document work for reproduction, write persuasively, communicate proofs-of-concept, and effectively take notes.
Additional Qualifications
Additional experience with other languages such as R, JavaScript, Scala, etc.
Examples of professional work such as publications, patents, a portfolio of relevant project work, etc.
Familiarity with Distributed Datasets
Experienced with a variety of data structures such as time series, geo-tagged, text, structured, and unstructured.
Experience with simulations such as Monte Carlo simulation, Gibbs sampling, etc.
Experience with model validation, measuring model bias, measuring model drift, etc.
Experience collaborating with stakeholders from disciplines such as Product, Sales, Finance, etc.
Ability to communicate complex analytical insights in a manner that is understandable by non-technical audiences.
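The qualifications above mention simulation experience such as Monte Carlo methods. As a minimal, self-contained illustration of the technique (not tied to any client system or agronomic dataset), here is the classic Monte Carlo estimate of π: sample random points in the unit square and count how many fall inside the quarter circle.

```python
# Minimal Monte Carlo simulation: estimate pi by sampling random points
# in the unit square and counting those inside the quarter circle.
import random

def estimate_pi(n, seed=42):
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    inside = sum(
        1 for _ in range(n)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    # The quarter circle covers pi/4 of the unit square.
    return 4 * inside / n

print(round(estimate_pi(100_000), 2))
```

The same draw-sample-aggregate loop generalizes to the yield and demand simulations the role would involve; only the sampling distribution and the statistic change.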
Additional Information
Salary: $55.00 - $58.00 + Bonus + Relocation
All your information will be kept confidential according to EEO guidelines.
$70k-101k yearly est. 1d ago
Senior Data Engineer
Berkley 4.3
Data engineer job in Urbandale, IA
Company Details
Berkley Technology Services (BTS) is the dynamic technology solution for W. R. Berkley Corporation, a Fortune 500 Commercial Lines Insurance Company. With key locations in Urbandale, IA and Wilmington, DE, BTS provides innovative and customer-focused IT solutions to the majority of WRBC's 60+ operating units across the globe. BTS's wide reach ensures that ideas and opinions are considered at every level of the organization to guarantee we find the best solutions possible.
Driven by a commitment to collaboration, BTS acts as consultants to our customers and Operating Units by providing comprehensive solutions that not only address the challenge at hand, but proactively plan for the “What's Next” in our industry and beyond.
With a culture centered on innovation and entrepreneurial spirit, BTS stands as a community of technology leaders with eyes toward the future -- leaders who truly care about growing not only their team members, but themselves, and take pride in their employees who shine. BTS offers endless ways to get involved and have the chance to grow your career into a wide range of roles you'd never known existed. Come join us as we push forward into the future of industry leading technological solutions.
Berkley Technology Services: Right Team, Right Technology, Simple and Secure.
Responsibilities
Join Our Innovative Insurance Data Team!
We're looking for a Senior Data Engineer with a deep understanding of insurance reporting and company operations. If you have a knack for mastering business data processes, system interfaces, and data structures, this is the role for you!
In this position, you'll be tackling advanced tasks like ETL, cube creation, ad-hoc and management reporting, and crafting eye-catching dashboards and data extracts. You'll be part of a vibrant team that supports multiple companies, providing top-notch data resources and development.
Your mission includes analyzing, designing, and coding innovative solutions for fast-growing companies in the Property & Casualty insurance industry. Get ready to make a big impact and have some fun along the way!
Design & Implement: Create and manage ETL processes, data structures, and both analytical and operational reporting environments.
Innovate: Develop new system functionalities with a focus on performance, stability, and supportability.
Lead by Example: Apply industry best practices in modeling, workflow, and presentation.
Extract & Integrate: Design and implement data extracts for both internal and external sources.
Organize & Communicate: Utilize your strong organizational and communication skills to keep everything running smoothly.
Solve Problems: Tackle challenges across application, middleware, and infrastructure levels with thorough impact analysis.
Set Standards: Help define standards and design patterns for data processes and structures within the team.
Guide & Mentor: Provide guidance on development standards and quality expectations, and mentor new team members.
Collaborate: Communicate effectively with employees at all levels, both within the company and with clients.
Influence: Develop a sphere of influence with other teams and foster a collaborative environment.
Be On Call: Participate in on-call rotations to ensure smooth operations.
Travel: Enjoy occasional travel, up to 20%.
Qualifications
SQL Maestro: 7+ years of experience with SQL and ETL processes, including crafting queries, stored procedures, jobs, functions, synonyms, and aliases. Experience with SQL Environments and variables, version control, and execution plans. Extensive experience with ETL tools (SSIS or similar) and optimizing performance.
Proven Performer: Demonstrated ability to meet key accountabilities or the drive to learn and excel in them.
Passionate Go-Getter: A self-motivated individual with an unyielding passion for success.
Impact Analyst: Skilled at understanding how changes affect customers and other systems.
Communication Pro: Excellent communication and organizational skills to keep everything on track.
Team Player and Mentor: Thrives in a fast-paced team environment and loves to train and mentor junior staff on best practices and enhancing solution designs.
Tech Enthusiast: Quick to adapt and eager to learn new technologies.
Independent Worker: Capable of working independently with minimal supervision.
Customer Champion: Strong focus on customer and business needs.
Bachelor's degree in Computer Science, Information Technology, Information Systems, or a related discipline. Equivalent experience and/or alternative qualifications will be considered.
Behavioral Core Competencies
Technically Astute
Managing Information
Customer Service Oriented
Business Knowledge
Influential
Conceptual Thinking
The Company is an equal employment opportunity employer.
$73k-98k yearly est. 60d+ ago
Data Analytics Engineer
Kuder Inc. 4.0
Data engineer job in Adel, IA
The Analytics Engineer designs and manages the end-to-end analytics stack: designing and operating our data models and pipelines, and building dashboards that power decision-making across the company. The role works closely with software engineers on infrastructure and with business leaders on metrics and insights.
The Analytics Engineer plays a key role in advancing the company's data strategy by developing and optimizing enterprise-level reporting solutions, and driving data-informed decision-making across the organization. This role partners directly with internal client teams to deeply understand business needs and translate them into actionable insights, scalable dashboards, and improved data processes.
This role also contributes to defining best practices, strengthening data governance, and shaping the evolution of Kuder's data-driven culture. This role requires strong analytical rigor, stakeholder leadership, and high-level technical proficiency.
Location
Adel or remote
Note: this position is not eligible for employment visa sponsorship.
Essential Job Functions
Lead the design, development, and optimization of advanced reports, dashboards, and visualizations using Power BI or other BI tools.
Design, build, and maintain reliable data models and transformations in the warehouse (e.g., dbt or SQL-based modeling).
Implement and monitor ELT/ETL pipelines from product databases and third-party tools, with support from the software engineering team for infrastructure where needed.
Partner with business stakeholders to translate questions into data solutions and provide exploratory analysis when required.
Present insights, recommendations, and narratives to leadership teams and internal stakeholders in clear, compelling ways.
Develop, maintain, and champion data definitions, business rules, and data quality standards across the organization.
Identify and drive both process and data improvements, proposing scalable solutions and collaborating cross-functionally to implement them.
Perform other duties as assigned.
Requirements
Bachelor's degree required (Computer Science, Engineering, Statistics, Data Science, Economics or related discipline).
6+ years of experience in data analysis, data engineering, or similar roles
3+ years of hands-on experience with SQL and relational databases.
Experience designing and delivering advanced reports, dashboards, and analytical products.
Advanced SQL and data modeling (dimensional models, star schemas, warehouse concepts) are the most critical skills.
Experience with a cloud data warehouse (Snowflake, BigQuery, Redshift, etc.), plus some exposure to ETL/ELT tools and dbt or similar transformation frameworks.
Exceptional communication skills, both written and verbal, with demonstrated success presenting to cross-functional stakeholders and leadership.
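Since the requirements above single out dimensional modeling and star schemas, a toy example may help make the pattern concrete: one fact table keyed to one dimension table, queried with the join-and-aggregate shape typical of BI workloads. This sketch uses `sqlite3` for portability; all table and column names are illustrative.

```python
# Toy star schema: a sales fact table joined to a product dimension.
# sqlite3 stands in for a cloud warehouse; names are illustrative only.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name TEXT,
    category TEXT
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_date TEXT,
    amount REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (1, '2024-01-05', 9.99), (2, '2024-01-05', 19.99),
                              (1, '2024-01-06', 9.99);
""")

# Typical analytical query: aggregate the fact table by a dimension attribute.
rows = db.execute("""
    SELECT p.category, ROUND(SUM(f.amount), 2) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 39.97)]
```

In a real warehouse the same structure would carry many dimensions (date, customer, channel), but every dashboard query reduces to this fact-join-dimension, group-by-attribute shape.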
Skills and Abilities
Strong initiative, analytical thinking, problem-solving, and organizational skills required.
Some experience with data pipelines (e.g., Airflow, Fivetran, custom Python/ELT) and comfort collaborating with software engineers on deployment and infrastructure.
Ability to independently manage and prioritize multiple projects while meeting deadlines and delivering high-quality results.
Ability to act as a generalist: comfortable switching between low-level schema design, pipeline debugging, and high-level stakeholder conversations.
Comfortable with ambiguity and able to lead projects with minimal direction; proactive in identifying opportunities for improvement.
Experience working in small teams where flexibility, adaptability, and a broad skill set are essential.
Ability to quickly adapt to new tools, systems, and processes.
Strong commitment to continuous learning and applying new technologies and techniques to improve data practices.
Professional: Ability to follow Kuder's culture and values:
Attitude is Everything - We believe we have more potential to accomplish goals, develop resiliency, and make improvements when we choose to lead with a positive attitude.
Create Partnerships - We create genuine, flexible, and long-term partnerships that cultivate collaboration and support for achieving common goals.
Deliver Success - We drive results and reach our goals with passion, urgency, and a commitment to excellence. We are accountable and encouraging as we collectively celebrate our victories and turn setbacks into progress.
Foster Innovation - We promote innovation and welcome ideas. We are curious, we listen, and we take action to elevate and improve how we deliver reliable solutions.
Thrive Together - We invest in an authentic environment where our team is motivated, supported, and successful. We respect all voices and experiences as we work together for meaningful growth.
This job description is not intended to be all-inclusive. Employee may perform other related duties as negotiated to meet the ongoing needs of the organization.
$75k-103k yearly est. 5d ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Data engineer job in Des Moines, IA
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g. Mathematics, Statistics, Operations Research), or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Master's or Ph.D. degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
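The minimum qualifications above call for experimentation and causal inference experience. The basic building block of product experimentation is the two-proportion z-test comparing conversion rates between a control and a treatment group. A pure-stdlib sketch (this is a generic textbook test, not any internal tooling; the counts are made up for illustration):

```python
# Two-proportion z-test for an A/B experiment, using only the stdlib.
# Group sizes and conversion counts below are illustrative.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: the conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)        # rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 120/1000 converted; treatment: 160/1000 converted.
z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=160, n_b=1000)
print(round(z, 2), p < 0.05)  # prints: 2.58 True
```

Production experimentation stacks add sequential testing, variance reduction, and guardrail metrics on top, but the hypothesis-test core is the same.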
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$98k-128k yearly est. 60d+ ago
Data Scientist, Supply Chain Digitalization
Emerson 4.5
Data engineer job in Marshalltown, IA
At Emerson, we pride ourselves on our industry-leading control valve portfolio, featuring top brands like Fisher, Sempell, and Crosby. These brands power the world through a wide range of applications across various industries, from refining and hydrogen to nuclear power. By joining our Supply Chain Digitalization team, you'll become part of an almost 150-year legacy that began with William Fisher in 1880 and help us take the next steps into the future by harnessing statistical analytics and emerging technologies to deliver products to customers faster and with greater certainty. We're looking for someone who is eager to take these next steps with us and make a significant impact on our operations.
If this sounds like the perfect opportunity for you, we encourage you to apply and become a vital part of our team. Together, we can continue to drive innovation and excellence in the industry. Relocation assistance available.
We look forward to hearing from you!
In This Role, Your Responsibilities Will Be:
Use statistics, programming, modeling, and related analytics to extract actionable insights from complex operations and supply chain data.
Develop ‘proof of concept' models for advanced supply chain and operations tools to optimize cost and performance.
Support implementation of ‘proof of concept’ models into enterprise operations, through testing, validation, planning, and cross-functional collaboration with stakeholders.
Support management decision-making with data-driven analytics.
Who You Are:
You break down objectives into appropriate initiatives and actions. You persist in accomplishing objectives despite obstacles and setbacks. You step up to conflicts, seeing them as opportunities. You delegate and distribute assignments and decisions appropriately. You create milestones to rally support behind the program.
For This Role, You Will Need:
Bachelor's degree and 3+ years of experience in data science, statistics, or a related field, or a PhD
Legal Authorization to work in the United States - sponsorship will not be provided for this role
Preferred Qualifications That Set You Apart:
Advanced Degree in related field preferred
Experience in supply chain, operations or related disciplines
Experience with Power BI, Python, Oracle, machine learning models, ML model training algorithms, agentic AI, and similar
Our Culture & Commitment to You:
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives-because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together are key to driving growth and delivering business results.
We recognize the importance of employee wellbeing. We prioritize providing flexible, competitive benefits plans to meet you and your family's physical, mental, financial, and social needs. We provide a variety of medical insurance plans, with dental and vision coverage, Employee Assistance Program, 401(k), tuition reimbursement, employee resource groups, recognition, and much more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave.
#LI-AN1
The average data engineer in Ames, IA earns between $62,000 and $107,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Ames, IA
$81,000
What are the biggest employers of Data Engineers in Ames, IA?
The biggest employers of Data Engineers in Ames, IA are: