Senior data scientist jobs in Independence, MO - 57 jobs
Data Scientist (Mid-Senior Level)
Jack Henry & Associates Inc. 4.6
Senior data scientist job in Lenexa, KS
At Jack Henry, we deliver technology solutions that are digitally transforming and empowering community banks and credit unions to provide enhanced and streamlined user experiences to their customers and members. Our best-in-class products are just the start as we lay the groundwork for the future of digital banking and payments. We hope you'll join us. We can't do it without you.
The Data Hub team is central to this mission. We are responsible for building and maintaining Data Hub, our strategic data marketplace powered by Google BigQuery. This marketplace enables us to provide our customers with a rich, curated data set serving as the foundation for our next-generation analytics and AI/ML capabilities.
We are seeking a talented data scientist to join our team! This individual will assist with evaluating and broadening our data infrastructure, collaborate across product groups and departments internally, and help us turn customer data into transformative insight. The growth potential is limitless, particularly as our machine learning capacity grows. The ideal candidate is knowledgeable, creative, and collaborative, and can communicate how data can make a big impact.
This is a remote position, but candidates must live within approximately a 70-mile radius of our office locations in: Allen, TX; Birmingham, AL; Cedar Falls, IA; Charlotte, NC; Lenexa, KS; Louisville, KY; and Springfield/Monett, MO. All positions, regardless of location, may require an onsite interview or in-person onboarding to verify your identity.
Salary range for this position is $90,000-170,000, depending on candidate experience and geographic location.
This position is ineligible for immigration sponsorship and support. Please do not apply if you will need immigration support now or in the future (e.g., H-1B, STEM OPT training plans).
What you'll be responsible for:
* Performs and may lead data onboarding, integration, and curation activities to transform raw enterprise data into a valuable, organized asset.
* Designs, develops, and implements robust data models within Looker and BigQuery, specifically for our JH Insights application.
* Applies rigorous data validation techniques and quality checks to ensure the accuracy, consistency, and reliability of data.
* Transforms raw data into meaningful features suitable for both analytical dashboards and the development of future AI/ML models.
* Prepares and cleans complex datasets for machine learning initiatives, including data normalization, outlier detection, and feature selection.
* Performs exploratory data analysis to uncover trends, patterns, and anomalies, translating complex findings into clear, actionable insights.
* Collaborates cross-functionally with data engineering, product management, and business analysts to define requirements and deliver solutions.
* Creates and maintains comprehensive technical documentation for data definitions, transformations, and models.
* May lead data-driven projects, mentor less experienced team members, and help shape technical direction.
* Other duties as assigned.
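The data validation and quality-check duties above can be sketched minimally; all column names, rules, and sample rows below are illustrative assumptions for this posting, not Jack Henry's actual schema:

```python
# Hypothetical sketch of row-level data-quality checks; the "account_id" and
# "balance" fields and the rules applied to them are illustrative only.
def validate(rows: list[dict]) -> list[str]:
    """Return a list of human-readable data-quality violations."""
    errors = []
    for i, row in enumerate(rows):
        if row.get("account_id") is None:
            errors.append(f"row {i}: missing account_id")
        balance = row.get("balance")
        if balance is not None and balance < 0:
            errors.append(f"row {i}: negative balance {balance}")
    return errors

sample = [
    {"account_id": "A1", "balance": 120.0},
    {"account_id": None, "balance": 50.0},
    {"account_id": "A3", "balance": -10.0},
]
print(validate(sample))  # flags the missing id and the negative balance
```

In practice checks like these would run inside the warehouse (e.g., as SQL assertions in BigQuery) rather than in application code, but the logic is the same.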
What you'll need to have:
* Minimum of 4 years of progressive experience in a data science or data analyst role.
* Minimum of 2 years of experience with Python and SQL in a data-related capacity.
* Experience with any or all of Snowflake, BigQuery, Microsoft Fabric, or Redshift for data warehousing.
* Experience with Dataflow, Airbyte, Airflow, or dbt for data movement.
* Experience collaborating with various stakeholders such as product management, data engineering, and business analysts.
* Experience designing and implementing complex data models in cloud data warehouses.
* Understanding of machine learning concepts, algorithms, and their practical application.
* The ability to lead projects, work independently on complex issues, and translate business problems into data-driven solutions.
* Ability to travel up to 10% for business meetings and conferences.
What would be nice for you to have:
* Master's degree or Ph.D. in a quantitative field.
* Experience with big data processing frameworks and technologies.
* Experience with Looker or similar tools for data exploration, dashboard creation, and LookML development.
* Experience with data visualization tools like Tableau or Power BI.
* Excellent problem-solving skills and a strong sense of ownership over your deliverables.
* Experience in Finance, Banking, or Credit Unions.
* Experience delivering formal presentations.
If you got this far, we hope you're feeling excited about this opportunity. Even if you don't feel you meet every single requirement on this posting, we still encourage you to apply. We're eager to meet motivated people who align with Jack Henry's mission and can contribute to our company in a variety of ways.
Why Jack Henry?
At Jack Henry, we pride ourselves on our motto: 'Do the right thing, do whatever it takes, and have fun.' We recognize the value of our associates and believe much of our company's strength and success depends on their well-being.
We demonstrate our commitment by offering outstanding benefit programs that support the physical, mental, and financial well-being of our people.
Culture of Commitment
Ask our associates why they love Jack Henry, and many will tell you it is because our culture is exceptional. We do great things together. Rising to meet challenges and seeking opportunities is part of who we are as an organization. Our culture has helped us stay strong through challenging times and we credit our dedicated associates for our success. Visit our Corporate Responsibility site to learn more about our culture and commitment to our people, customers, community, environment, and shareholders.
Equal Employment Opportunity
At Jack Henry, we know we are better together. We value, respect, and protect the uniqueness each of us brings. Innovation flourishes by including all voices and makes our business - and our society - stronger. Jack Henry is an equal opportunity employer and we are committed to providing equal opportunity in all of our employment practices, including selection, hiring, performance management, promotion, transfer, compensation, benefits, education, training, social, and recreational activities to all persons regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, genetic information, pregnancy, marital status, sex, gender, gender identity, gender expression, age, sexual orientation, and military and veteran status, or any other protected status protected by local, state or federal law.
No one will be subject to, and Jack Henry prohibits, any form of discipline, reprisal, intimidation, or retaliation for good faith reports or complaints of discrimination of any kind, pursuing any discrimination claim, or cooperating in related investigations.
Full corporate job descriptions may be requested at any time during the interview process.
Business Data Scientist
SouthLaw, P.C. 3.6
Senior data scientist job in Overland Park, KS
Benefits
401(k) with Matching Up to 3%, Over a 3-Year Vesting Period
Medical
Dental - 100% Base Rate Paid by the Firm
Vision - 100% Base Rate Paid by the Firm
Life Insurance
Long- and Short-Term Disability
PTO
Job Purpose
The Business Data Scientist plays a key role in improving the efficiency and effectiveness of the Firm. The position is responsible for the ongoing analysis of Firm workflow, task, and financial data obtained from a variety of sources. This involves developing and preparing information, reconciliations, and reports for departments and management.
Job Duties & Responsibilities
Subject matter expert for the Case Management data system
Build and maintain Key Performance Indicators (KPIs), dashboards, reports, and data-related products in a supportable and extensible way using organizationally accepted tools and methods
Identify patterns and trends in data sets to support process improvement efforts to maintain or meet Firm goals or opportunities
Analyze data reports for anomalies, accuracy, and applicability, and discuss findings with department leaders and the BPM and QA teams
Provide reporting and support as needed for data-intensive projects throughout implementation
Monitor progress and maintain reporting to ensure post-go-live adherence to expectations
Review & analyze data to answer financial & technical questions
Qualifications
Minimum of 3 years of experience in data or business analysis
Minimum of 3 years of experience with Microsoft SQL, Excel, and Power BI
Ability to express complex analytical concepts effectively, both verbally and in writing
Proven analytics skills, including mining, evaluation, analysis, and visualization.
Demonstrated ability to analyze data in a variety of formats and to provide recommendations and support in establishing expectations
Demonstrated effectiveness in working with multi-functional teams in a technical production workflow environment
Superior attention to detail to synthesize data from multiple sources
Ability to multitask without sacrificing the quality and accuracy of the analytical data captured.
Eligible to work in the United States
Working Conditions
This job operates in a professional office environment. This role routinely uses standard office equipment such as computers, phones, photocopiers, filing cabinets, and fax machines.
Physical Requirements
The ability to work in an office environment, in front of a computer, while typing documents, responding to emails, managing calendars and answering phones.
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at times with or without notice.
SouthLaw, P.C. is an equal opportunity employer and does not discriminate against otherwise qualified applicants on the basis of race, color, creed, religion, ancestry, age, sex, marital status, national origin, disability or handicap, veteran status, or any other characteristic protected by law.
Principal Data Scientist
Maximus 4.3
Senior data scientist job in Kansas City, KS
Description & Requirements
Maximus has an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator, supporting both the enterprise and our clients. We are looking for an accomplished, hands-on individual contributor and team player to be part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This is a remote position.
Essential Duties and Responsibilities:
- Dive deep into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Essential Duties and Responsibilities:
- Develop, collaborate, and advance the applied and responsible use of AI, ML, simulation, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements (required skills that align with contract LCAT, verifiable, and measurable):
- 10+ years of relevant Software Development + AI / ML / DS experience
- Professional Programming experience (e.g. Python, R, etc.)
- Experience with AI / Machine Learning
- Experience working as a contributor on a team
- Experience leading AI/DS/or Analytics teams
- Experience mentoring Junior Staff
- Experience with Modeling and Simulation
- Experience with program management
Preferred Skills and Qualifications:
- Master's degree in a quantitative discipline (Math, Operations Research, Computer Science, etc.)
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors
- Ability to leverage statistics to identify true signals from noise or clutter
- Experience working as an individual contributor in AI or modeling and simulation
- Use of state-of-the-art technology to solve operational problems in AI, Machine Learning, or Modeling and Simulation spheres
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions
- Use and development of program automation, CI/CD, DevSecOps, and Agile
- Experience managing technical teams delivering technical solutions for clients.
- Experience working with optimization problems like scheduling
- Experience with Data Analytics and Visualizations
- Cloud certifications (AWS, Azure, or GCP)
- 10+ yrs of related experience in AI, advanced analytics, computer science, or software development
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,640.00
Maximum Salary: $234,960.00
Data Scientist - AI & BI
Sunlighten 3.9
Senior data scientist job in Leawood, KS
Job Description
**Please Note: This position is open only to candidates authorized to work in the U.S. without the need for current or future visa sponsorship. Additionally, this position is based in the Kansas City area, and we are only considering candidates who reside locally.**
At Sunlighten, we're not just about infrared saunas; we're on a mission to improve lives through innovative health and wellness solutions. As a global leader in infrared sauna therapy, we are rapidly expanding and need a talented Data Scientist, AI & BI to drive measurable impact across Sales, Marketing, CX, and Operations through applied ML/LLMs, experimentation, and analytics. This is an AI-first role: you will own evaluation, monitoring, and continuous improvement for AI agents and RAG experiences, and partner with our AI Applications Engineer to productionize safe, reliable workflows that are accurate, secure, and ROI-positive. You will also lead core BI data science work (forecasting, scoring, experimentation) and step into BI analytics/dashboarding as needed to ensure business priorities ship end to end.
Celebrating 25 years of innovation, Sunlighten has grown from its Kansas City roots to establish a global footprint, including expansion into the UK. With the global wellness market projected to reach $7 trillion in 2026, we are proud to be part of this dynamic and holistic shift. As leaders in light science and longevity, we create innovative solutions that help customers lead vibrant, active lifestyles.
Duties/Responsibilities:
LLM / Agent Quality (Applied AI)
Define evaluation strategy for LLM/RAG/agents: groundedness, helpfulness, safety, regression tests, and release gates.
Build and maintain “golden sets” and rubric-based scoring for agent behavior across key use cases.
Establish monitoring for agent outcomes: quality, latency, cost, drift, user feedback, and business KPIs.
Partner with the AI Applications Engineer on prompt strategy, retrieval patterns, tool-use behavior, and safe fallbacks.
Run red team/adversarial testing and coordinate mitigations for unsafe or ungrounded behavior.
Ensure privacy/security by design: PII minimization, RBAC/least privilege, secrets via Key Vault/1Password, auditable deletions (≤7 days where applicable).
Define human-in-the-loop workflows when needed (sampling, review queues, labeling guidelines, escalation paths).
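The "golden sets" and rubric-based scoring described above can be sketched minimally. Every question, rubric phrase, and threshold here is a hypothetical illustration, not part of Sunlighten's actual evaluation suite:

```python
# Hypothetical golden set: each entry pairs a question with rubric phrases a
# correct agent answer must mention. Questions and phrases are invented.
GOLDEN_SET = [
    {"question": "What is the return policy?",
     "must_mention": ["30 days", "receipt"]},
    {"question": "Do saunas ship internationally?",
     "must_mention": ["shipping"]},
]

def rubric_score(answer: str, must_mention: list[str]) -> float:
    """Fraction of required rubric phrases present in the answer (case-insensitive)."""
    hits = sum(1 for phrase in must_mention if phrase.lower() in answer.lower())
    return hits / len(must_mention)

# A release gate might require the average score over the golden set to clear
# a threshold before a new prompt or model version ships.
answers = ["Returns are accepted within 30 days with a receipt.",
           "Yes, international shipping is available."]
scores = [rubric_score(a, g["must_mention"]) for a, g in zip(answers, GOLDEN_SET)]
print(sum(scores) / len(scores))  # 1.0: every rubric phrase was found
```

Real evaluation suites typically combine keyword rubrics like this with model-graded scoring and human review queues; the value of the golden set is that it makes regressions detectable before release.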
LLMOps / Governance (Production Readiness)
Own an LLM release process: prompt and model versioning, offline evals, staging, canary, and rollback.
Maintain documentation for production AI: evaluation reports, model/prompt “cards,” known failure modes, and mitigation playbooks.
Implement automated regression checks (pre/post deploy) to prevent quality/safety backslides.
Define incident response expectations for agent issues: triage, root cause analysis, corrective actions, and follow-up measurement.
BI + Applied ML (Core Data Science)
Partner with stakeholders (Sales/Marketing/CX/Ops) to convert questions into testable plans, success metrics, and decision ready recommendations.
Own predictive modeling for BI priorities: lead/opportunity scoring, demand planning, end-to-end forecasting, and product/website models as needed.
Design and run experiments (A/B, holdouts, quasi experimental when needed): power, guardrails, instrumentation, readouts.
Define business + model metrics; build golden labels/holdouts; quantify ROI and operationalize decision thresholds.
Feature engineering across Salesforce, NetSuite, Five9, Marketing Cloud, Shopify, GA4, and product telemetry; collaborate with Data Engineering to productionize in Microsoft Fabric.
Translate modeling outputs into operational workflows (e.g., Salesforce scoring, routing, prioritization, dashboards, and alerts).
This is an AI-first role, but you're expected to pitch in on BI/analytics work when priorities demand it (metric definitions, semantic model alignment, dashboards, and executive readouts) to ensure outcomes land, not just models.
Improve data clarity: metric definitions, data quality checks, lineage notes, and stakeholder enablement.
Other duties as discussed and assigned.
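The "power" step in the experiment-design duty above has a standard concrete form: before an A/B test runs, compute how many users each arm needs to detect the smallest lift worth acting on. The conversion rates below are hypothetical, not Sunlighten figures; this is the textbook two-proportion z-test sample-size formula:

```python
import math
from statistics import NormalDist

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Required sample size per arm for a two-sided, two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for significance level
    z_beta = NormalDist().inv_cdf(power)           # critical value for desired power
    pooled_var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * pooled_var / (p1 - p2) ** 2)

# Detecting a lift from a 4% to a 5% conversion rate at 80% power:
print(n_per_arm(0.04, 0.05))  # → 6743 users per arm under these assumptions
```

The practical point of running this before launch is the guardrail the posting mentions: if traffic can't reach the required sample size in a reasonable window, the test is redesigned (larger minimum effect, longer run, or a holdout design) rather than read out underpowered.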
Requirements
2-6 years of enterprise-level experience in applied data science or analytics with stakeholder-facing delivery.
Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or Operations Research (or equivalent practical experience); a portfolio, GitHub, or examples of shipped work preferred.
Strong proficiency in Python (pandas/sklearn) and SQL, with solid statistical and experimental foundations (forecasting, power analysis, common tests).
Experience shipping models or analytics into production business workflows (e.g., CRM scoring, operational forecasting, dashboards).
Familiarity with LLM concepts (prompting, retrieval, evals) and a quality-first mindset.
Working knowledge of MLOps/LLMOps and Git-based workflows, including versioning, automated eval/regression testing, monitoring/alerting, documentation, and rollback strategies.
Nice to Have (Preferred Experience)
Experience with modern data and BI platforms such as Microsoft Fabric (Lakehouse, Warehouse, Notebooks, Pipelines) and Power BI semantic models, including basic DAX familiarity.
Domain experience with customer, finance, or marketing systems (e.g., Salesforce, NetSuite, Five9), and familiarity with digital platforms such as Marketing Cloud, Shopify, and Google Analytics (GA4).
Hands-on exposure to LLM and agent tooling, including frameworks or ecosystems like OpenAI Agents SDK, Microsoft AI Foundry/Copilot, LangChain/LangGraph, or LlamaIndex, along with an understanding of evaluation, observability, and cost controls.
Experience working with production data infrastructure and telemetry, including databases such as ClickHouse, Postgres, or SQL Server, observability tools like Grafana or Datadog, and evaluation practices such as golden datasets, rubric scoring, pairwise testing, or human-in-the-loop review processes.
Benefits
Opportunity to work in a collaborative and innovative environment.
Career growth opportunities in a market leading and rapidly growing wellness technology company.
Competitive Paid Time Off Policy + Paid Holidays + Floating Holidays.
Fully Equipped Fitness Center On-Site.
Lunch Program featuring a James Beard Award-winning Chef.
Health (HSA & FSA Options), Dental, and Vision Insurance.
401(k) with company contributions.
Profit Sharing.
Life and Short-Term Disability Insurance.
Professional Development and Tuition Reimbursement.
Associate Discounts on Saunas, Spa Products and Day Spa Services.
Sunlighten provides equal employment opportunity. Discrimination of any type will not be tolerated. Sunlighten is an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status or any other characteristic protected by state, federal, or local law.
Mathematician
Department of The Air Force
Senior data scientist job in Whiteman Air Force Base, MO
Click on "Learn more about this agency" button below for IMPORTANT additional information. This is a Direct Hire Solicitation. This public notice is to gather applications that may or may not result in a referral or selection.
Overview
Accepting applications
Open & closing dates
12/22/2025 to 12/21/2026
Salary: $63,795 to $164,301 per year
Pay scale & grade: GS 11 - 15
Locations
Few vacancies in the following locations:
Eielson AFB, AK
Elmendorf AFB, AK
Fort Richardson, AK
Maxwell AFB, AL
Little Rock AFB, AR
Davis Monthan AFB, AZ
Luke AFB, AZ
Beale AFB, CA
Edwards AFB, CA
El Segundo, CA
Los Angeles, CA
March AFB, CA
Travis AFB, CA
Vandenberg AFB, CA
Air Force Academy, CO
Buckley AFB, CO
Cheyenne Mountain AFB, CO
Colorado Springs, CO
Peterson AFB, CO
Schriever AFB, CO
Joint Base Anacostia-Bolling, DC
Dover AFB, DE
Cape Canaveral, FL
Cape Canaveral AFS, FL
Eglin AFB, FL
Homestead AFB, FL
Hurlburt Field, FL
MacDill AFB, FL
Patrick AFB, FL
Tyndall AFB, FL
Dobbins AFB, GA
Moody AFB, GA
Robins AFB, GA
Hickam AFB, HI
Mountain Home AFB, ID
Scott AFB, IL
Grissom AFB, IN
Barksdale AFB, LA
Hanscom AFB, MA
Westover Air Reserve Base, MA
Andrews AFB, MD
Linthicum Heights, MD
Selfridge ANG Base, MI
Whiteman AFB, MO
Columbus AFB, MS
Keesler AFB, MS
Malmstrom AFB, MT
Grand Forks, ND
Minot AFB, ND
Offutt AFB, NE
New Boston, NH
McGuire AFB, NJ
Cannon AFB, NM
Holloman AFB, NM
Kirtland AFB, NM
Nellis AFB, NV
Niagara Falls, NY
Wright-Patterson AFB, OH
Youngstown, OH
Altus AFB, OK
Tinker AFB, OK
Vance AFB, OK
Charleston, SC
Shaw AFB, SC
Arnold AFB, TN
Dyess AFB, TX
Fort Sam Houston, TX
Goodfellow AFB, TX
Kelly AFB, TX
Lackland AFB, TX
Laughlin AFB, TX
Randolph AFB, TX
Hill AFB, UT
Alexandria, VA
Arlington, VA
Dahlgren, VA
Fort Eustis, VA
Langley AFB, VA
Pentagon, Arlington, VA
Fairchild AFB, WA
McChord AFB, WA
Warren AFB, WY
Remote job: No
Telework eligible: No
Travel Required: Occasional travel - You may be expected to travel for this position.
Relocation expenses reimbursed: No
Appointment type: Multiple
Work schedule: Full-time
Service: Competitive
Promotion potential
15
Job family (Series)
* 1520 Mathematics
Supervisory status: No
Security clearance: Secret
Drug test: No
Position sensitivity and risk: Noncritical-Sensitive (NCS)/Moderate Risk
Trust determination process
* Suitability/Fitness
Financial disclosure: No
Bargaining unit status: No
Announcement number AFPC-STEMDHA-12*********** Control number 852876400
This job is open to
The public
U.S. Citizens, Nationals or those who owe allegiance to the U.S.
Clarification from the agency
This public notice is to gather applications that may or may not result in a referral or selection.
Duties
* Duties and responsibilities vary and may increase according to grade level
* Plan and carry out the collection and analysis of information, data and standards used by the assigned organization to perform mathematics tasks.
* Carry out special projects designed to facilitate the full use of the intern's training and development.
* Keep abreast of emerging technologies and professional developments to remain current in the field and to apply them to work assignments.
* Perform scientific work in the field of mathematics in support of conventional projects, or portions of larger projects and programs under the guidance of a senior mathematician or scientist.
Requirements
Conditions of employment
* Please read this Public Notice in its entirety prior to submitting your application for consideration.
* U.S. Citizenship is required
* Males must be registered for Selective Service, see ***********
* Total salary varies depending on location of position
* If authorized, PCS will be paid IAW JTR and AF Regulations. If receiving an authorized PCS, you may be subject to completing/signing a CONUS agreement. More information on PCS requirements, may be found at: *****************************************
* Recruitment incentives may be authorized
* Position may be subject to random drug testing
* Employee may be required to work other than normal duty hours, to include evenings, weekends and/or holidays
* Shift work and emergency overtime may be required
* Employee must maintain current certifications
* A security clearance may be required
* Disclosure of Political Appointments is required
Qualifications
In order to qualify, you must meet the specialized experience requirements described in the Office of Personnel Management (OPM) Qualification Standards for General Schedule Positions, Group Coverage Qualification Standard for Professional and Scientific Positions.
BASIC REQUIREMENT OR INDIVIDUAL OCCUPATIONAL REQUIREMENT:
Degree: mathematics; or the equivalent of a major that included at least 24 semester hours in mathematics.
OR
Combination of education and experience - courses equivalent to a major in mathematics (including at least 24 semester hours in mathematics), as shown in A above, plus appropriate experience or additional education.
SPECIALIZED EXPERIENCE: In addition to meeting the basic requirement above, to qualify for this position you must also meet the qualification requirements listed below.
GS-11: One year of specialized experience, equivalent to the GS-09 level, that equipped the applicant with the particular knowledge, skills, and abilities to perform successfully the duties of the position, and that is typically in or related to the work of the position to be filled. Your experience includes performing the full range of activities required for the scientific area under study and applying specialized methods and techniques of the required field of mathematics; gathering, analyzing, and processing data and information to support work efforts.
GS-12: One year of specialized experience, equivalent to the GS-11 level, that equipped the applicant with the particular knowledge, skills, and abilities to perform successfully the duties of the position, and that is typically in or related to the work of the position to be filled. Your experience includes directing, coordinating, and reviewing the work of other members of the team; planning, organizing, and conducting operational and performance tests and technical evaluations of test results.
GS-13: One year of specialized experience, equivalent to the GS-12 level, that equipped the applicant with the particular knowledge, skills, and abilities to perform successfully the duties of the position, and that is typically in or related to the work of the position to be filled. Your experience includes advising and providing technical assistance to scientific, engineering, mathematical, and managerial personnel on the mathematical aspects of range safety work; reviewing scientific and mathematical studies and papers and converting results into practical recommendations and procedures.
GS-14: One year of specialized experience, equivalent to the GS-13 level, that equipped the applicant with the particular knowledge, skills, and abilities to perform successfully the duties of the position, and that is typically in or related to the work of the position to be filled. Your experience includes formulating, conducting, organizing, and directing highly creative and independent research programs; defining and selecting basic problems for investigation based on needed improvements to the state of the art in mathematical modeling; providing contractual technical direction to programs of considerable scope and complexity, with nominal, broad oversight.
GS-15: One year of specialized experience, equivalent to the GS-14 level, that equipped the applicant with the particular knowledge, skills, and abilities to perform successfully the duties of the position, and that is typically in or related to the work of the position to be filled. Your experience includes providing operational and scientific expertise to explore basic research initiatives for a new generation; overseeing policy development, plans, and operations in critical research initiatives and advanced technologies for future precision capabilities on a continuing or project basis; identifying and recommending critical research initiatives and establishing goals toward future precision capabilities.
KNOWLEDGE, SKILLS AND ABILITIES (KSAs):
* Knowledge of mathematics policies, theories, principles and concepts and familiarity with other scientific and engineering disciplines to carry out conventional scientific processes.
* Knowledge of mathematics and related disciplines to design, develop and adapt scientific methods and techniques.
* Knowledge of analytical techniques to develop, simulate, and analyze data.
* Ability to communicate effectively both orally and in writing.
* Knowledge of computer operations and standard software applications.
PART-TIME OR UNPAID EXPERIENCE: Credit will be given for appropriate unpaid and/or part-time work. You must clearly identify the duties and responsibilities in each position held and the total number of hours per week.
VOLUNTEER WORK EXPERIENCE: Refers to paid and unpaid experience, including volunteer work done through National Service Programs (e.g., Peace Corps, AmeriCorps) and other organizations (e.g., professional, philanthropic, religious, spiritual, community, student, and social). Volunteer work helps build critical competencies, knowledge, and skills that can provide valuable training and experience that translates directly to paid employment. You will receive credit for all qualifying experience, including volunteer experience.
Education
IF USING EDUCATION TO QUALIFY: If the position has a positive degree requirement or education forms the basis for qualifications, you MUST submit transcripts with the application. Official transcripts are not required at the time of application; however, if the position has a positive degree requirement, or you are qualifying based on education alone or in combination with experience, transcripts must be verified prior to appointment. Education must be accredited by an accrediting institution recognized by the U.S. Department of Education.
FOREIGN EDUCATION: Education completed in foreign colleges or universities may be used to meet the requirements. You must show proof that the education credentials have been deemed at least equivalent to those gained in a conventional U.S. education program. It is your responsibility to provide such evidence when applying.
Additional information
For Direct Hire (DHA) Positions:
This is a Direct Hire Public Notice. Under this recruitment procedure, applications will be accepted for each location/installation identified in this Public Notice, and selections are made for vacancies as they occur. There may or may not be actual/projected vacancies at the time you submit your application.
Interagency Career Transition Assistance Program (ICTAP): For information on
$63.8k-164.3k yearly 35d ago
Data Scientist - Retail Pricing
Capitol Federal Savings Bank 4.4
Senior data scientist job in Overland Park, KS
We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. This role sits at the intersection of analytics, finance, and product strategy, transforming data into pricing intelligence that supports smarter, faster business decisions.
Will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing, will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities.
Collaborating across product, finance, and executive teams, will translate complex analytical findings into clear business recommendations that drive strategic action. Will also contribute to enhancing our analytics infrastructure - improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making.
Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration
CapFed is an equal opportunity employer.
$66k-82k yearly est. Auto-Apply 50d ago
Lead Data Engineer
Launch Potato
Senior data scientist job in Kansas City, KS
WHO ARE WE?
Launch Potato is a profitable digital media company that reaches 30M+ monthly visitors through brands such as FinanceBuzz, All About Cookies, and OnlyInYourState.
As The Discovery and Conversion Company, our mission is to connect consumers with the world's leading brands through data-driven content and technology.
Headquartered in South Florida with a remote-first team spanning over 15 countries, we've built a high-growth, high-performance culture where speed, ownership, and measurable impact drive success.
WHY JOIN US?
At Launch Potato, you'll accelerate your career by owning outcomes, moving fast, and driving impact with a global team of high-performers.
BASE SALARY: $150,000 to $190,000 per year
MUST HAVE:
5+ years of experience in data engineering within fast-paced, cloud-native environments
Deep expertise in Python, SQL, Docker, and AWS (S3, Glue, Kinesis, Athena/Presto)
Experience building and managing scalable ETL pipelines and data lake infrastructure
Familiarity with distributed systems, Spark, and data quality best practices
Strong cross-functional collaboration skills to support BI, analytics, and engineering teams
EXPERIENCE: 5+ years of data engineering experience in an AWS-based environment where data powers decision-making across product, marketing, and operations.
YOUR ROLE
Lead scalable data engineering efforts that empower cross-functional teams with reliable, timely, and actionable data, ensuring Launch Potato's analytics and business intelligence infrastructure fuels strategic growth.
OUTCOMES
Build and optimize scalable, efficient ETL and data lake processes that proactively catch issues before they impact the business
Own the ingestion, modeling, and transformation of structured and unstructured data to support reporting and analysis across all business units
Partner closely with BI and Analytics to deliver clean, query-ready datasets that improve user acquisition, engagement, and revenue growth
Maintain and enhance database monitoring, anomaly detection, and quality assurance workflows
Serve as the internal point of contact for reporting infrastructure, delivering ad hoc data analyses and driving consistent data integrity
Drive adoption and advancement of Looker dashboards by ensuring robust and scalable backend data support
Contribute to the future of Launch Potato's data team by mentoring peers and shaping a high-performance, quality-first engineering culture
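The monitoring and anomaly-detection outcomes above can be sketched with a minimal example. This is not Launch Potato's actual tooling; the function name, threshold, and sample counts are our own. The idea is a simple z-score check on a pipeline health metric, such as daily ingested row counts:

```python
from statistics import mean, stdev

def is_anomalous(history, latest, threshold=3.0):
    """Flag `latest` if it deviates from `history` by more than
    `threshold` standard deviations (a simple z-score check)."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

# Daily row counts for a hypothetical ingestion table.
counts = [10_120, 9_980, 10_240, 10_060, 10_190, 9_930]
print(is_anomalous(counts, 10_105))  # a typical day
print(is_anomalous(counts, 1_450))   # looks like a failed batch
```

In practice, a check like this would run after each batch load and page the on-call engineer before stakeholders notice a broken dashboard, which is the "catch issues before they impact the business" outcome described above.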
COMPETENCIES
Data Engineering Mastery: Demonstrates technical excellence in building data pipelines, troubleshooting distributed systems, and scaling infrastructure using AWS and open-source tools
Cross-Functional Collaboration: Communicates clearly across technical and non-technical teams, translating business needs into robust data solutions
Proactive Ownership: Operates with a strong sense of accountability; identifies and solves data issues independently and efficiently
Quality-Driven Execution: Holds a high bar for data accuracy, auditability, and documentation throughout all systems and workflows
Strategic Thinking: Anticipates how data infrastructure impacts wider company OKRs and proactively suggests improvements and innovations
Growth Mindset: Seeks out opportunities to elevate team capabilities, mentor others, and stay ahead of evolving best practices in data engineering
TOTAL COMPENSATION
Base salary is set according to market rates for the nearest major metro and varies based on Launch Potato's Levels Framework. Your compensation package includes a base salary, profit-sharing bonus, and competitive benefits. Launch Potato is a performance-driven company, which means once you are hired, future increases will be based on company and personal performance, not annual cost of living adjustments.
Want to accelerate your career? Apply now!
Since day one, we've been committed to having a diverse, inclusive team and culture. We are proud to be an Equal Employment Opportunity company. We value diversity, equity, and inclusion.
We do not discriminate based on race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
$150k-190k yearly Auto-Apply 2d ago
Data Engineer
PDS Inc., LLC 3.8
Senior data scientist job in Overland Park, KS
The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Data Engineering & Integration
Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools.
Automate data imports, transformations, and loads from multiple sources (on-premise, SaaS, APIs, and cloud).
Optimize and monitor data workflows for reliability, performance, and cost efficiency.
Implement and maintain data quality, validation, and error-handling frameworks.
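The "data quality, validation, and error-handling frameworks" duty above can be illustrated with a tiny rules-based validator. This is a generic sketch, not PDS's actual framework; the field names and rules are invented for illustration:

```python
def validate(record, rules):
    """Return a list of (field, rule_name) failures for one record."""
    return [(field, name)
            for field, checks in rules.items()
            for name, check in checks
            if not check(record.get(field))]

# Hypothetical rules for an insurance policy feed.
rules = {
    "policy_id": [("required", lambda v: bool(v))],
    "premium":   [("non_negative", lambda v: isinstance(v, (int, float)) and v >= 0)],
    "state":     [("two_letter", lambda v: isinstance(v, str) and len(v) == 2)],
}

good = {"policy_id": "P-100", "premium": 250.0, "state": "KS"}
bad  = {"policy_id": "",      "premium": -5,    "state": "Kansas"}
print(validate(good, rules))  # []
print(validate(bad, rules))
```

A pipeline built on a pattern like this can route failing records to an error table for review instead of silently loading them, which is what error-handling frameworks in ETL tools such as Azure Data Factory or SSIS are configured to do.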
Data Analysis & Reporting
Develop and maintain reporting databases, views, and semantic models for business intelligence solutions.
Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs.
Perform ad-hoc data exploration and statistical analysis to support business initiatives.
Collaboration & Governance
Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements.
Maintain data integrity, enforce governance standards, and promote best practices in data stewardship.
Support data security and compliance initiatives in coordination with IT and business teams.
Continuous Improvement
Stay current with emerging data technologies and analytics practices.
Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.
QUALIFICATIONS
Required:
Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database.
Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools.
Proficiency in building BI solutions using Power BI and/or SSRS.
Strong data modeling and relational database design skills.
Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections).
Ability to translate business goals into data requirements and technical solutions.
Excellent communication and collaboration skills.
Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
Preferred:
Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks).
Familiarity with version control tools (Git, Azure DevOps) and Agile development practices.
Exposure to Python or PowerShell for data transformation or automation.
Experience integrating data from insurance or financial systems.
Compensation: $120-129K
This position is 3 days onsite/hybrid located in Overland Park, KS
We look forward to reviewing your application. We encourage everyone to apply, even if you don't check every box of what is required.
PDSINC, LLC is an Equal Opportunity Employer.
$120k-129k yearly 58d ago
Google Cloud Data & AI Engineer
Slalom 4.6
Senior data scientist job in Kansas City, MO
Who You'll Work With
As a modern technology company, our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.
You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant or Principal at Slalom, you will be a part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients.
What You'll Do
* Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub and more.
* Implement cloud-based data solutions for data ingestion, transformation, and storage; and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs.
* Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem.
* Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps).
* Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud.
* Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients.
* Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices.
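The ingestion-and-transformation work described above follows the same composable-transform idea that services like Dataflow are built on. As a conceptual sketch only (pure Python, not a Google Cloud API; the step names are ours):

```python
from collections import defaultdict

def run_pipeline(records, steps):
    """Apply a list of transform steps to an iterable of records."""
    for step in steps:
        records = step(records)
    return records

# Illustrative steps: parse raw rows, filter, then aggregate.
def parse(rows):
    for line in rows:
        user, amount = line.split(",")
        yield {"user": user, "amount": float(amount)}

def keep_positive(rows):
    return (r for r in rows if r["amount"] > 0)

def total_by_user(rows):
    totals = defaultdict(float)
    for r in rows:
        totals[r["user"]] += r["amount"]
    return totals

raw = ["alice,10.0", "bob,-2.5", "alice,5.0", "bob,4.0"]
print(dict(run_pipeline(raw, [parse, keep_positive, total_by_user])))
# {'alice': 15.0, 'bob': 4.0}
```

Real Dataflow pipelines express the same chain of transforms over unbounded streams with managed scaling; the design value is that each step stays small, testable, and reusable.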
What You'll Bring
* Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.).
* Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance.
* Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML.
* Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment.
* Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts.
* Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects.
* Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously.
* Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe.
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time.
East Bay, San Francisco, Silicon Valley:
* Consultant $114,000-$171,000
* Senior Consultant: $131,000-$196,500
* Principal: $145,000-$217,500
San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC:
* Consultant $105,000-$157,500
* Senior Consultant: $120,000-$180,000
* Principal: $133,000-$199,500
All other locations:
* Consultant: $96,000-$144,000
* Senior Consultant: $110,000-$165,000
* Principal: $122,000-$183,000
We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
We are accepting applications until the role is filled.
$145k-217.5k yearly Easy Apply 7d ago
Data Engineer III
Spring Venture Group 3.9
Senior data scientist job in Kansas City, MO
Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day.
Job Description
This role is primarily remote for candidates in the Kansas City or surrounding areas, with occasional visits to the office; candidates must CURRENTLY be in the Kansas City area.
We are unable to sponsor for this role; this includes international students.
OVERVIEW
The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high quality datasets that enable our business stakeholders and world-class Analytics department to make data-informed decisions. Data engineers, combining software engineering and database engineering, serve as a primary resource for expertise in writing scripts and SQL queries, monitoring our database stability, and assisting with data governance to ensure availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered, and to be a leader and a mentor.
ESSENTIAL DUTIES
The essential duties for this role include, but are not limited to:
* Serve as a primary advisor to Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks.
* Build advanced data pipelines utilizing the medallion architecture to create high quality single source of truth data sources in Snowflake
* Architect replacements of current Data Management systems with respect to all aspects of data governance
* Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores.
* Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves.
* Actively participate as a leader in regular team meetings, listening and assisting others at every opportunity for growth and development.
* Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores.
* Take ownership (both individually and as part of a team) of services and applications
* Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements
* Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests
* Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
* Work with Project Managers, Solution Architects, and Software Development teams to build solutions for Company Initiatives on time, on budget, and on value.
* Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity.
* Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business.
* Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle, from 8am to 9pm.
* Follow and embrace procedures of both the Data Management team and SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance.
* Support after hours and weekend releases from our internal Software Development teams.
* Actively participate in code review and weekly technicals with another more senior engineer or manager.
* Assist departments with time-critical SQL execution and debug database performance problems.
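As a toy illustration of the medallion layering mentioned in the duties above (bronze, silver, gold), here is a sketch using Python's built-in sqlite3 standing in for Snowflake; the table names, sample rows, and cleaning rules are invented:

```python
import sqlite3

# sqlite3 stands in for Snowflake here; the layering idea is the same.
con = sqlite3.connect(":memory:")
con.executescript("""
    -- Bronze: raw ingested rows, kept as-is (duplicates and bad values included).
    CREATE TABLE bronze_leads (lead_id TEXT, state TEXT, age TEXT);
    INSERT INTO bronze_leads VALUES
        ('L1', 'mo', '67'), ('L1', 'mo', '67'),  -- duplicate row
        ('L2', 'KS', '72'), ('L3', 'ks', 'n/a'); -- unparseable age
    -- Silver: deduplicated, typed, standardized.
    CREATE VIEW silver_leads AS
        SELECT DISTINCT lead_id, UPPER(state) AS state, CAST(age AS INTEGER) AS age
        FROM bronze_leads
        WHERE age GLOB '[0-9]*';
    -- Gold: business-level aggregate, the "single source of truth" for reporting.
    CREATE VIEW gold_leads_by_state AS
        SELECT state, COUNT(*) AS leads, AVG(age) AS avg_age
        FROM silver_leads GROUP BY state;
""")
print(con.execute("SELECT * FROM gold_leads_by_state ORDER BY state").fetchall())
# [('KS', 1, 72.0), ('MO', 1, 67.0)]
```

The point of the pattern is that raw data is never destroyed: bronze preserves everything as ingested, while silver and gold are reproducible derivations that downstream stakeholders can trust.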
ROLE COMPETENCIES
The competencies for this role include, but are not limited to:
* Emotional Intelligence
* Drive for Results
* Continuous Improvement
* Communication
* Strategic Thinking
* Teamwork and Collaboration
Qualifications
POSITION REQUIREMENTS
The requirements to fulfill this position are as follows:
* Bachelor's degree in Computer Science, or a related technical field.
* 4-7 years of practical production work in Data Engineering.
* Expertise in the Python programming language.
* Expertise in Snowflake.
* Expertise in SQL, databases, and query optimization.
* Must have experience with a major cloud provider such as AWS, Azure, or GCP.
* Advanced at reading code independently and understanding its intent.
* Advanced at writing readable, modifiable code that solves business problems.
* Ability to construct reliable and robust data pipelines to support both scheduled and event based workflows.
* Working directly with stakeholders to create solutions.
* Mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact.
Additional Information
Benefits:
The Company offers the following benefits for this position, subject to applicable eligibility requirements:
* Competitive Compensation
* Medical, Dental and vision benefits after a short waiting period
* 401(k) matching program
* Life Insurance, and Short-term and Long-term Disability Insurance
* Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan
* Generous paid time off (PTO) program starting off at 15 days your first year
* 15 paid Holidays (includes holiday break between Christmas and New Years)
* 15 days of Paid Parental Leave
* Annual Volunteer Time Off (VTO) and a donation matching program
* Employee Assistance Program (EAP) - health and well-being on and off the job
* Rewards and Recognition
* Diverse, inclusive and welcoming culture
* Training program and ongoing support throughout your Spring Venture Group career
Security Responsibilities:
* Operating in alignment with policies and standards
* Reporting security incidents
* Completing assigned training
* Protecting assigned organizational assets
Spring Venture Group is an Equal Opportunity Employer
$75k-98k yearly est. 2d ago
Sr Data Architect
Tata Consulting Services 4.3
Senior data scientist job in Kansas City, MO
Must Have Technical/Functional Skills
* 8+ years in data engineering/architecture; at least 1 year in AI/ML solutions is good to have.
* Expertise in cloud data platforms, Spark, streaming, and data modeling.
* Experience with MLOps, model deployment, and monitoring is good to have.
* Strong communication and leadership skills; ability to run workshops.
* Proficiency in Python, PySpark, and SQL.
Roles & Responsibilities
* Facilitate design workshops and translate business needs into data/AI architecture.
* Define data flow, integration patterns, and data models across multiple sources.
* Design and implement data pipelines, ML pipelines, and MLOps practices.
* Establish governance, security, and observability for data and models.
* Lead and mentor engineers; ensure delivery excellence and coding standards.
* Stay hands-on with Spark/Python/PySpark, orchestration tools, and model deployment.
Salary Range: $110,000 to $125,000 per year
$110k-125k yearly 23d ago
Senior Data Engineer
Care It Services 4.3
Senior data scientist job in Overland Park, KS
The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making, designing, developing, and maintaining data pipelines, data warehouses, and other data-related infrastructure. This role works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions.
Key Responsibilities:
Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards to drive business insights.
Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives.
Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness. This includes identifying bottlenecks, tuning queries, and scaling infrastructure as needed.
Automate data ingestion, processing, and validation tasks to ensure data quality and consistency.
Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations.
Contribute to the development of the organization's overall data strategy.
Conduct code reviews and contribute to the establishment of coding standards and best practices.
Required Qualifications:
Bachelor's degree in a relevant field or equivalent professional experience.
4-6 years of hands-on experience in data engineering.
Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB.
Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services.
Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift.
Programming skills in Python or JavaScript.
Proficiency with BI tools such as Sisense, Power BI, or Tableau.
Preferred Qualifications:
Direct experience with Google Cloud Platform (GCP).
Knowledge of CI/CD pipelines, including tools like Docker and Terraform.
Background in the healthcare industry.
Familiarity with modern data integration tools such as DBT, Matillion, and Airbyte.
Compensation: $125,000.00 per year
Who We Are
CARE ITS is a certified woman-owned and operated minority company (certified as a WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. Care ITS was established in 2010. Since then, we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally, with our headquarters in Plainsboro, NJ, and focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
$125k yearly Auto-Apply 60d+ ago
Data Engineer
Lockton 4.5
Senior data scientist job in Overland Park, KS
Lockton Affinity, located in Overland Park, KS is searching for a Data Engineer. At Lockton Affinity, information technology plays a vital role in delivering exceptional business results. The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions. Lockton values Associates who are proactive, collaborative, and motivated to make a measurable impact.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Data Engineering & Integration
* Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools.
* Automate data imports, transformations, and loads from multiple sources (on-premise, SaaS, APIs, and cloud).
* Optimize and monitor data workflows for reliability, performance, and cost efficiency.
* Implement and maintain data quality, validation, and error-handling frameworks.
Data Analysis & Reporting
* Develop and maintain reporting databases, views, and semantic models for business intelligence solutions.
* Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs.
* Perform ad-hoc data exploration and statistical analysis to support business initiatives.
Collaboration & Governance
* Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements.
* Maintain data integrity, enforce governance standards, and promote best practices in data stewardship.
* Support data security and compliance initiatives in coordination with IT and business teams.
Continuous Improvement
* Stay current with emerging data technologies and analytics practices.
* Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.
$77k-101k yearly est. 13d ago
Staff Data Engineer
Artera
Senior data scientist job in Kansas City, MO
Our Mission: Make healthcare #1 in customer service. What We Deliver: Artera, a SaaS leader in digital health, transforms patient experience with AI-powered virtual agents (voice and text) for every step of the patient journey. Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Artera's virtual agents support front desk staff to improve patient access including self-scheduling, intake, forms, billing and more. Whether augmenting a team or unleashing a fully autonomous digital workforce, Artera offers multiple virtual agent options to meet healthcare organizations where they are in their AI journey. Artera helps support 2B communications in 109 languages across voice, text and web. A decade of healthcare expertise, powered by AI.
Our Impact: Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Hear from our CEO, Guillaume de Zwirek, about why we are standing at the edge of the biggest technological shift in healthcare's history!
Our award-winning culture: Since founding in 2015, Artera has consistently been recognized for its innovative technology and business growth, and has been named a top place to work. Examples of these accolades include: Inc. 5000 Fastest Growing Private Companies (2020, 2021, 2022, 2023, 2024); Deloitte Technology Fast 500 (2021, 2022, 2023, 2024, 2025); Built In Best Companies to Work For (2021, 2022, 2023, 2024, 2025, 2026). Artera has also been recognized by Forbes as one of “America's Best Startup Employers,” by Newsweek as one of the “World's Best Digital Health Companies,” and named one of the top “44 Startups to Bet Your Career on in 2024” by Business Insider.
SUMMARY We are seeking a highly skilled and motivated Staff Data Engineer to join our team at Artera. This role is critical to maintaining and improving our data infrastructure, ensuring that our data pipelines are robust, efficient, and capable of delivering high-quality data to both internal and external stakeholders. As a key player in our data team, you will have the opportunity to make strategic decisions about the tools we use, how we organize our data, and the best methods for orchestrating and optimizing our data processes.
Your contributions will be essential to ensuring the uninterrupted flow of data across our platform, supporting the analytics needs of our clients and internal teams. This role is for you if you are passionate about data, problem-solving, and continuous improvement, and want to take the lead on investigating and implementing solutions to enhance our data infrastructure.
Responsibilities
Continuous Enhancement: Maintain and elevate Artera's data infrastructure, ensuring peak performance and dependability.
Strategic Leadership: Drive the decision-making process for the selection and implementation of data tools and technologies
Streamlining: Design and refine data pipelines to ensure smooth and efficient data flow.
Troubleshooting: Manage the daily operations of the Artera platform, swiftly identifying and resolving data-related challenges.
Cross-Functional Synergy: Partner with cross-functional teams to develop new data requirements and refine existing processes
Guidance: Provide mentorship to junior engineers, supporting their growth and assisting with complex projects
Collaborative Innovation: Contribute to ongoing platform improvements, ensuring a culture of continuous innovation.
Knowledge Expansion: Stay informed on industry trends and best practices in data infrastructure and cloud technologies.
Dependability: Guarantee consistent data delivery to customers and stakeholders, adhering to or surpassing service level agreements.
Oversight: Monitor and sustain the data infrastructure, covering areas like recalls, message delivery, and reporting functions.
Proactiveness: Proactively improve the stability and performance of the architecture for team implementations.
Requirements
Bachelor's Degree in STEM preferred (additional experience is also accepted in lieu of a degree)
Proven experience with Kubernetes and Cloud infrastructure (AWS preferred)
Strong proficiency in Python and SQL for data processing and automation.
Expertise in orchestration and containerization tools such as Dagster and Docker.
Understanding of performance optimization and cost-effectiveness in Snowflake.
Ability to work effectively in a collaborative, cross-functional environment.
Strong problem-solving skills with a proactive and solution-oriented mindset.
Experience with event sourced and microservice architecture
Experienced working with asynchronous requests in large scale applications
Commitment to testing best practices
Experience in Large-scale data architecture
Demonstrated ability to build and maintain complex data pipelines and data flows.
Bonus Experience
Knowledge of DBT & Meltano
Security Requirements
This engineering role contributes to a secure, federally compliant platform. Candidates must be eligible for a government background check and operate within strict code management, access, and documentation standards. Security-conscious development and participation in compliance practices are core to the role.
OUR APPROACH TO WORK LOCATION
Artera has hybrid office locations in Santa Barbara, CA, and Philadelphia (Wayne), PA, where team members typically come in three days a week. Specific frequency can vary depending on your team's needs, manager expectations, and/or role responsibilities.
In addition to our U.S. office locations, we are intentionally building geographically concentrated teams in several key metropolitan areas, which we call our “Hiring Hubs.” We are currently hiring remote candidates located within the following hiring hubs: Boston Metro Area, MA; Chicago Metro Area, IL; Denver Metro Area, CO; Kansas City Metro Area (KS/MO); Los Angeles Metro Area, CA; San Francisco / Bay Area, CA; and Seattle Metro Area, WA.
This hub-based model helps us cultivate strong local connections and team cohesion, even in a distributed environment.
To be eligible for employment at Artera, candidates must reside in one of our hybrid office cities or one of the designated hiring hubs. Specific roles may call out location preferences when relevant.
As our hubs grow, we may establish local offices to further enhance in-person connection and collaboration. While there are no current plans in place, should an office open in your area, we anticipate implementing a hybrid model. Any future attendance expectations would be developed thoughtfully, considering factors like typical commute times and access to public transit, to ensure they are fair and practical for the local team.
WORKING AT ARTERA
Company benefits - Full health benefits (medical, dental, and vision), flexible spending accounts, company-paid life insurance, company-paid short-term & long-term disability, company equity, voluntary benefits, 401(k), and more!
Career development - Manager development cohorts and employee development funds.
Generous time off - Company holidays, Winter & Summer break, and flexible time off.
Employee Resource Groups (ERGs) - We believe that everyone should belong at their workplace. Our ERGs are available for identifying employees or allies to join.
EQUAL EMPLOYMENT OPPORTUNITY (EEO) STATEMENT
Artera is an Equal Opportunity Employer and is committed to fair and equitable hiring practices. All hiring decisions at Artera are based on strategic business needs, job requirements, and individual qualifications. All candidates are considered without regard to race, color, religion, gender, sexuality, national origin, age, disability, genetics, or any other protected status.
Artera is committed to providing employees with a work environment free of discrimination and harassment; Artera will not tolerate discrimination or harassment of any kind.
Artera provides reasonable accommodations for applicants and employees in compliance with state and federal laws. If you need an accommodation, please reach out to ************.
DATA PRIVACY
Artera values your privacy. By submitting your application, you consent to the processing of the personal information provided in conjunction with your application. For more information, please refer to our Privacy Policy.
SECURITY REQUIREMENTS
All employees are responsible for protecting the confidentiality, integrity, and availability of the organization's systems and data, including safeguarding Artera's sensitive information such as Personally Identifiable Information (PII) and Protected Health Information (PHI). Those with specific security or privacy responsibilities must ensure compliance with organizational policies, regulatory requirements, and applicable standards and frameworks by implementing safeguards, monitoring for threats, reporting incidents, and addressing data-handling risks or breaches.
$72k-96k yearly est. Auto-Apply 23d ago
Google Senior Data Engineer
Accenture 4.7
Senior data scientist job in Overland Park, KS
We Are
Accenture is a premier Google Cloud partner helping organizations modernize data ecosystems, build real-time analytics capabilities, and responsibly scale AI. As part of Accenture Cloud First and the Accenture Google Business Group (AGBG), we deliver solutions leveraging Google Cloud's Data & AI platform, including BigQuery, Looker, Vertex AI, Gemini Foundation Models, and Gemini Enterprise.
You Are
A hands-on engineer with foundational experience in Data Engineering, Analytics, or Machine Learning, now building deep expertise in Google Cloud Platform (GCP). You are eager to apply technical skills, learn advanced Data & AI patterns, and support delivery teams in designing and implementing modern data and AI solutions.
You're comfortable working directly with clients, supporting senior architects, and contributing to end-to-end project execution.
The Work (What You Will Do)
As a GCP Senior Data Engineer, you will help deliver data modernization, analytics, and AI solutions on GCP. You will support architecture design, build data pipelines and models, perform analysis, and contribute to technical implementations under guidance from senior team members.
1. Hands-On Technical Delivery
+ Build data pipelines, ETL/ELT processes, and integrations using GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
+ Assist with data modeling, performance tuning, and query optimization in BigQuery.
+ Implement data ingestion patterns for batch and streaming data sources.
+ Support development of dashboards and analytics products using Looker or Looker Studio.
2. Support Agentic AI & ML Solution Development
+ Assist in developing ML models and AI solutions using Vertex AI, Gemini Foundation Models, Gemini Enterprise, and model APIs & embeddings.
+ Implement ML pipelines and help establish MLOps processes (monitoring, retraining, deployment).
+ Support prompt engineering, embeddings, and retrieval-augmented generation (RAG) experimentation.
+ Contribute to model testing, validation, and documentation.
3. Requirements Gathering & Client Collaboration
+ Participate in client workshops to understand data needs, use cases, and technical requirements.
+ Help translate functional requirements into technical tasks and implementation plans.
+ Communicate progress, blockers, and insights to project leads and client stakeholders.
4. Data Governance, Quality & Security Support
+ Implement metadata management, data quality checks, and lineage tracking using GCP tools (Dataplex, IAM).
+ Follow best practices for security, identity management, and compliance.
+ Support operational processes for data validation, testing, and monitoring.
5. Continuous Learning & Team Support
+ Learn and apply GCP Data & AI best practices across architectural patterns, engineering standards, and AI frameworks.
+ Collaborate closely with senior data engineers, ML engineers, and architects.
+ Contribute to internal accelerators, documentation, and reusable components.
+ Stay current with GCP releases, Gemini model updates, and modern engineering practices.
Travel may be required for this role. The amount of travel will vary from 0 to 100% depending on business need and client requirements.
Here's what you need
+ Minimum of 5 years of hands-on experience in Data Engineering, Data Analytics, ML Engineering, or related fields.
+ Minimum of 4 years of practical experience with Google Cloud Platform.
+ Minimum of 5 years of experience with SQL, data modeling, and building data pipelines.
+ Minimum of 3 years of experience with Python, AI, or GenAI tools (Vertex AI preferred).
+ Bachelor's degree or equivalent (minimum 12 years) work experience. (If Associate's Degree, must have minimum 6 years work experience)
Bonus point if you have
+ Experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, and Looker.
+ Exposure to AI/ML development or experimentation with Vertex AI, Gemini models, embeddings, or RAG patterns.
+ Hands-on experience with CI/CD, Git, or cloud-native engineering practices.
+ Google Cloud certifications (Associate Cloud Engineer or Professional Data Engineer).
+ Experience working in agile delivery environments.
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below. We anticipate this job posting will be posted on 01/24/2026 and remain open for at least 3 days.
Accenture offers a market competitive suite of benefits including medical, dental, vision, life, and long-term disability coverage, a 401(k) plan, bonus opportunities, paid holidays, and paid time off. See more information on our benefits here:
U.S. Employee Benefits | Accenture (*******************************************************
Role Location Annual Salary Range
California $94,400 to $266,300
Cleveland $87,400 to $213,000
Colorado $94,400 to $230,000
District of Columbia $100,500 to $245,000
Illinois $87,400 to $230,000
Maryland $94,400 to $230,000
Massachusetts $94,400 to $245,000
Minnesota $94,400 to $230,000
New York $87,400 to $266,300
New Jersey $100,500 to $266,300
Washington $100,500 to $245,000
Requesting an Accommodation
Accenture is committed to providing equal employment opportunities for persons with disabilities or religious observances, including reasonable accommodation when needed. If you are hired by Accenture and require accommodation to perform the essential functions of your role, you will be asked to participate in our reasonable accommodation process. Accommodations made to facilitate the recruiting process are not a guarantee of future or continued accommodations once hired.
If you would like to be considered for employment opportunities with Accenture and have accommodation needs such as for a disability or religious observance, please call us toll free at **************** or send us an email or speak with your recruiter.
Equal Employment Opportunity Statement
We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities.
For details, view a copy of the Accenture Equal Opportunity Statement (********************************************************************************************************************************************
Accenture is an EEO and Affirmative Action Employer of Veterans/Individuals with Disabilities.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Other Employment Statements
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States.
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process. Further, at Accenture a criminal conviction history is not an absolute bar to employment.
The Company will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Additionally, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the Company's legal duty to furnish information.
California requires additional notifications for applicants and employees. If you are a California resident, live in or plan to work from Los Angeles County upon being hired for this position, please click here for additional important information.
Please read Accenture's Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
$63k-83k yearly est. 4d ago
Sr. Data Engineer
Oakwood Systems Group Inc. 3.5
Senior data scientist job in Kansas City, MO
Oakwood Systems Group is a leading provider of transformative cloud services, application development, data & analytics, and managed services. Recognized as a Microsoft Solutions Partner, we pride ourselves on hiring top talent and fostering a collaborative and innovative environment. Our employees thrive in a culture that values learning, cutting-edge technology, and meaningful projects.
Position Overview:
As a Data Engineer, you will design, build, and optimize reusable, scalable data pipelines with robust error handling and reporting. Your work will enable advanced analytics and data-driven decision-making for our clients across various industries. This role offers the opportunity to work with state-of-the-art cloud platforms like Microsoft Fabric, as well as industry-leading tools and technologies.
Key Responsibilities:
* Develop, maintain, and optimize reusable, scalable data pipelines that include robust error handling and logging mechanisms.
* Implement automated notifications and error reporting using Azure Logic Apps and other tools.
* Collaborate with cross-functional teams to ensure data pipelines align with business objectives and meet performance requirements.
* Ensure data quality by implementing and maintaining frameworks.
* Design and implement data models and ETL/ELT processes that enable high-performance analytics solutions.
* Support the integration and transformation of diverse data sources into unified and consumable formats.
* Stay up-to-date with the latest advancements in Azure data services, including Microsoft Fabric, to deliver innovative solutions.
Required Skills:
* 3-5 years of proven experience as a Data Engineer or in a similar role.
* Proficiency in designing and implementing data pipelines with tools such as Azure Data Factory and Azure Synapse, preferably within Microsoft Fabric.
* Strong programming skills in Python (preferred), Spark, Java, or C#.
* Advanced SQL expertise (T-SQL, PL/SQL) for database design and query optimization.
* Hands-on experience with cloud data platforms, particularly Azure (Databricks, Synapse, Data Factory, Logic Apps).
* Familiarity with data quality frameworks.
Preferred Experience:
* Proficiency with Microsoft Fabric for analytics and data processing.
* Advanced knowledge of modern data architectures (Medallion, DataVault, Lambda/Kappa)
* Knowledge of business intelligence tools like Power BI or Tableau.
* Azure certifications (e.g., Fabric Analytics, Fabric Data Engineer, Power BI Data Analyst).
* Solid Experience with DevOps practices using Azure DevOps or Git.
Why Join Oakwood?
* Competitive compensation with bonus potential.
* Comprehensive benefits, including health, dental, and vision coverage.
* Paid time off for vacation, holiday and illness.
* Certification support, training programs, and professional growth opportunities.
* Flexible work environment with remote and hybrid options for employees living within 50 miles of St. Louis or Kansas City.
* Sponsorship and relocation are not available now or in the future.
$69k-95k yearly est. Auto-Apply 22d ago
Senior Data Engineer
Berkley 4.3
Senior data scientist job in Overland Park, KS
Company Details
Intrepid Direct Insurance (IDI) is a rapidly growing direct-to-consumer property and casualty insurance company. A member of the W. R. Berkley Corporation, a Fortune 500 company rated A+ (Superior) by A.M. Best, Intrepid Direct's vision is to make life better for business. The insurance industry has not evolved with innovation like other major industries; we're here to change that. We are making life better for our customers, shareholders, and team members by leveraging data and technology as insurance experts for our targeted customers. You will be part of a highly collaborative team of talented and focused professionals. Join a group that enjoys working together, trusts each other, and takes pride in our hard-earned success.
***************************
The Company is an equal employment opportunity employer.
Responsibilities
Intrepid Direct Insurance is looking for an experienced Senior Data Engineer to mentor, orchestrate, implement, and monitor the data flowing through our organization. This opportunity will have a direct influence on how data is made available to our business units, as well as our customers. You'll primarily be working with our operations and engineering teams to create and enhance data pipelines, conform and enrich data, and deliver information to business users. Learn the ins and outs of what we do so that you can focus on improving the availability and quality of the data we use to service our customers.
Key functions include but are not limited to:
Assist with long-term strategic planning for modern data warehousing needs.
Contribute to data modeling exercises and the buildout of our data warehouse.
Monitor, support, and analyze existing pipelines and recommend performance and process improvements to address gaps in existing processes.
Automate manual processes owned by data team.
Troubleshoot and remediate ingestion and reporting related issues.
Design and build new pipelines to ingest data from additional disparate sources.
Responsible for the accuracy and availability of data in our data warehouse.
Collaborate with a multi-disciplinary team to develop data-driven solutions that align with our business and technical needs.
Create and deploy reports as needed.
Assist with cataloging and classifying existing data sets.
Participate in peer reviews with emphasis on continuous improvement.
Respond to regulatory requests for information.
Assumes other tasks and duties as assigned by management.
Mentor team members and advise on best practices.
Qualifications
Bachelor's degree in Mathematics, Statistics, Computer Science, or equivalent experience.
6+ years of relevant data engineering experience.
Analytical thinker with experience working in a fast-paced, startup environment.
Technical expertise with Microsoft SQL Server.
Familiarity with ETL tools and concepts.
Hands-on experience with database design and data modeling, preferably with experience in the Data Vault methodology.
Experience supporting and troubleshooting SSIS packages.
Experience consuming event-based data through APIs or queues.
Experience in Agile software development.
Experience with insurance data highly desired.
Detail oriented, with solid organizational and problem-solving skills.
Strong written, visual, and verbal communication skills.
Team oriented with a strong willingness to serve others in an agile startup environment.
Flexible in assuming new responsibilities as they arise.
Experience with Power BI desired.
Additional Company Details
We do not accept unsolicited resumes from third-party recruiting agencies or firms.
The actual salary for this position will be determined by a number of factors, including the scope, complexity, and location of the role; the skills, education, training, credentials, and experience of the candidate; and other conditions of employment.
Sponsorship Details
Sponsorship is not offered for this role.
$72k-98k yearly est. Auto-Apply 60d+ ago
Sr. Data Engineer
Quest Analytics
Senior data scientist job in Overland Park, KS
At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Senior Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data.
The Senior Data Engineer will help modernize and scale our data environment. This person will play a key role in transforming these workflows into automated, cloud-based pipelines using Azure Data Factory, Databricks, and modern data platforms. If you are looking for a high-impact opportunity to shape how data flows across the business, APPLY TODAY!
What you'll do:
Identify, design, and implement internal process improvements (e.g., automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability).
Transform manual SQL/SSMS/stored procedure workflows into automated pipelines using Azure Data Factory.
Write clean, reusable, and efficient code in Python (and optionally C# or Scala).
Leverage distributed data tools such as Spark and Databricks for large-scale processing.
Review project objectives to determine and implement the most suitable technologies.
Apply best practice standards for development, build, and deployment automation.
Manage day-to-day operations of the data infrastructure and support engineers and analysts with data investigations.
Monitor and report on data pipeline tasks, collaborating with teams to resolve issues quickly.
Partner with internal teams to analyze current processes and identify efficiency opportunities.
Participate in training and mentoring programs as assigned or required.
Uphold Quest Analytics values and contribute to a positive company culture.
Respond professionally and promptly to client and internal requests.
Perform other duties as assigned.
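As a rough illustration of the first two duties above, turning a manual, multi-step workflow into an automated pipeline can be as simple as expressing the steps as an ordered list of functions. The step names and logic below are invented for the sketch, not Quest Analytics code:

```python
# Sketch: a manual "run query, clean results, dedupe" routine expressed as
# a declarative pipeline that can be scheduled and monitored as one unit.

def extract(source):
    """Stand-in for pulling rows from a source system."""
    return list(source)

def clean(rows):
    """Normalize strings, dropping blanks and nulls."""
    return [r.strip().lower() for r in rows if r and r.strip()]

def deduplicate(rows):
    """Remove duplicates while preserving first-seen order."""
    seen, out = set(), []
    for r in rows:
        if r not in seen:
            seen.add(r)
            out.append(r)
    return out

PIPELINE = [extract, clean, deduplicate]

def run(source, steps=PIPELINE):
    data = source
    for step in steps:
        data = step(data)
    return data
```

In practice each step would map to an Azure Data Factory activity or a Databricks task, with the orchestration handled by the platform rather than a Python loop.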
What it requires:
Bachelor's Degree in Computer Science or equivalent education/experience.
3-5 years of experience with ETL, data operations, and troubleshooting, preferably in Healthcare data.
Strong SQL development skills (SSMS, stored procedures, and optimization).
Proficiency in Python, C#, or Scala (experience with pandas and NumPy is a plus).
Solid understanding of the Azure ecosystem, especially Azure Data Factory and Azure Data Lake Storage (ADLS).
Hands-on experience with Azure Data Factory and ADLS.
Familiarity with Spark, Databricks, and data modeling techniques.
Experience working with both relational databases (e.g., SQL Server) and NoSQL databases (e.g., MongoDB).
Self-motivated, strong problem-solver, and thrives in fast-paced environments.
Excellent troubleshooting, listening, and analytical skills.
Customer-focused mindset with a collaborative, team-oriented approach.
We are not currently engaging with outside agencies on this role. Visa sponsorship is not available at this time.
What you'll appreciate:
• Workplace flexibility - you choose between remote, hybrid, or in-office
• Company-paid employee medical, dental, and vision
• Competitive salary and success-sharing bonus
• Flexible vacation with no cap, plus sick time and holidays
• An entrepreneurial culture that won't limit you to a job description
• Being listened to, valued, appreciated, and having your contributions rewarded
• Enjoying your work each day with a great group of people
Apply TODAY! careers.questanalytics.com
About Quest Analytics
For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here.
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.
Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discriminations or harassment.
Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence *********************
NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with outside agencies. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable.
$69k-92k yearly est. Auto-Apply 60d+ ago
Senior Data Engineer
Velocity Staff
Senior data scientist job in Overland Park, KS
Velocity Staff, Inc. is working with our client in the Overland Park, KS area to identify a senior-level Data Engineer to join their Data Services Team. The right candidate will apply their expertise in data warehousing, data pipeline creation and support, and analytical reporting; they will be responsible for gathering and analyzing data from several internal and external sources, designing a cloud-focused data platform for analytics and business intelligence, and reliably providing data to our analysts. This role requires a significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams.
Responsibilities
Work with data architects to understand current data models and build pipelines for data ingestion and transformation.
Design, build, and maintain a framework for pipeline observation and monitoring, focusing on reliability and performance of jobs.
Surface data integration errors to the proper teams, ensuring timely processing of new data.
Provide technical consultation for other team members on best practices for automation, monitoring, and deployments.
Provide technical consultation for the team on "infrastructure as code" best practices: building deployment processes utilizing technologies such as Terraform or AWS CloudFormation.
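The monitoring responsibilities above (observing pipeline runs and surfacing integration errors to the proper teams) can be sketched in a few lines of Python. This is an illustrative sketch only; the `JobRun` record and its fields are hypothetical stand-ins, not the API of any particular scheduler or orchestration tool.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical record of one pipeline job run; the field names are
# illustrative, not taken from any specific scheduler's API.
@dataclass
class JobRun:
    job_name: str
    status: str          # "succeeded", "failed", or "running"
    started_at: datetime

def find_problem_runs(runs, now, max_runtime=timedelta(hours=2)):
    """Surface failed runs and runs exceeding an expected runtime."""
    problems = []
    for run in runs:
        if run.status == "failed":
            problems.append((run.job_name, "failed"))
        elif run.status == "running" and now - run.started_at > max_runtime:
            problems.append((run.job_name, "stalled"))
    return problems
```

In practice, the returned problem list would be routed to the owning team (e.g., via an alerting channel) so new data keeps processing on time.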
Qualifications
Bachelor's degree in computer science, data science, or a related technical field, or equivalent practical experience
Proven experience with relational and NoSQL databases (e.g. Postgres, Redshift, MongoDB, Elasticsearch)
Experience building and maintaining AWS-based data pipelines; technologies currently utilized include AWS Lambda, Docker/ECS, and MSK
Mid/senior-level development utilizing Python (Pandas/NumPy, Boto3, SimpleSalesforce)
Experience with version control (git) and peer code reviews
Enthusiasm for working directly with customer teams (Business units and internal IT)
Preferred but not required qualifications include:
Experience with data processing and analytics using AWS Glue or Apache Spark
Hands-on experience building data-lake style infrastructures using streaming data set technologies (particularly with Apache Kafka)
Experience with data processing using Parquet and Avro
Experience developing, maintaining, and deploying Python packages
Experience with Kafka and the Kafka Connect ecosystem.
Familiarity with data visualization techniques using tools such as Grafana, Power BI, AWS QuickSight, and Excel.
Not ready to apply? Connect with us to learn about future opportunities.
$69k-92k yearly est.
Sr. Data Engineer
Wellsky
Senior data scientist job in Overland Park, KS
The Sr. Data Engineer is responsible for designing, building, and maintaining scalable data pipelines, infrastructure, and data visualization to support our data-driven decision-making processes. The scope of this job includes collaborating with data scientists, analysts, solutions partners, and other stakeholders to increase data quality, accessibility, and security.
Key Responsibilities:
Create, develop, maintain, and leverage data pipelines, data models, reporting, and dashboards to solve business problems or assigned tasks in coordination with multiple teams.
Develop and maintain data solutions to enable new or existing solutions and BI development for the data warehouse and data visualization solutions.
Enhance and document workflows for complex data-related challenges by identifying patterns, trends, and anomalies in the data, and optimize data processing and storage to promote data quality through automation.
Develop an understanding of different WellSky solutions and become a subject matter expert in the healthcare domain.
Identify opportunities in code reviews and contribute to the development of best practices and coding standards.
Adhere to WellSky core values, ensure PHI data security, and adapt to changing BI technologies.
Leverage AI tools and platforms as an integral part of daily responsibilities to enhance decision-making, streamline workflows, and drive data-informed outcomes.
Perform other job duties as assigned.
Required Qualifications:
Bachelor's degree or relevant work experience
4-6 years of relevant work experience
Strong SQL skills with hands-on experience developing and supporting data pipelines and data warehouse solutions
Experience with SQL Server data integration and ETL development (including SSIS or equivalent)
Experience with cloud-based data platforms and services, preferably within GCP environments
Preferred Qualifications:
Experience with Snowflake data warehouse development and management
Experience with DBT for ELT/ETL processes, data modeling, testing, and documentation
GCP expertise, especially with Cloud SQL, BigQuery, and DataStream
Experience with data replication tools such as HVR (or similar CDC/replication technologies)
Experience supporting BI/reporting environments such as Tableau through well-modeled, performant datasets
Healthcare industry experience
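The SQL and ETL qualifications above center on one workflow: extracting raw rows, transforming them with SQL, and loading a modeled table that BI tools like Tableau can read. A minimal sketch of that pattern using Python's standard-library sqlite3 module as a stand-in for a SQL Server or BigQuery warehouse; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse environment.
conn = sqlite3.connect(":memory:")

# Extract/load: raw source rows land in a staging table.
conn.execute("CREATE TABLE raw_visits (patient_id TEXT, visit_date TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO raw_visits VALUES (?, ?, ?)",
    [("p1", "2024-01-02", 120.0),
     ("p1", "2024-01-09", 80.0),
     ("p2", "2024-01-03", 200.0)],
)

# Transform: aggregate raw rows into a modeled reporting table, the kind
# of performant dataset a BI tool would query directly.
conn.execute("""
    CREATE TABLE patient_costs AS
    SELECT patient_id, COUNT(*) AS visits, SUM(cost) AS total_cost
    FROM raw_visits
    GROUP BY patient_id
""")
rows = conn.execute("SELECT * FROM patient_costs ORDER BY patient_id").fetchall()
```

Tools like SSIS or DBT orchestrate, test, and document many such transformations; the SELECT-into-a-modeled-table step itself is the same idea.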
Job Expectations:
Willing to work additional or irregular hours as needed
Must work in accordance with applicable security policies and procedures to safeguard company and client information
Must be able to sit and view a computer screen for extended periods of time
#LI-TC1
#LI-Onsite
WellSky is where independent thinking and collaboration come together to create an authentic culture. We thrive on innovation, inclusiveness, and cohesive perspectives. At WellSky you can make a difference.
WellSky provides equal employment opportunities to all people without regard to race, color, national origin, ancestry, citizenship, age, religion, gender, sex, sexual orientation, gender identity, gender expression, marital status, pregnancy, physical or mental disability, protected medical condition, genetic information, military service, veteran status, or any other status or characteristic protected by law. WellSky is proud to be a drug-free workplace.
Applicants for U.S.-based positions with WellSky must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Certain client-facing positions may be required to comply with applicable requirements, such as immunizations and occupational health mandates.
Here are some of the exciting benefits full-time teammates are eligible to receive at WellSky:
Excellent medical, dental, and vision benefits
Mental health benefits through TelaDoc
Prescription drug coverage
Generous paid time off, plus 13 paid holidays
Paid parental leave
100% vested 401(k) retirement plans
Educational assistance up to $2,500 per year
How much does a senior data scientist earn in Independence, MO?
The average senior data scientist in Independence, MO earns between $66,000 and $121,000 annually. This compares to the national average senior data scientist range of $90,000 to $170,000.
Average senior data scientist salary in Independence, MO