Principal Data Scientist
Data scientist job in Kansas City, KS
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator, supporting both the enterprise and our clients. We are looking for an accomplished, hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutions to capture new work).
This position requires occasional travel to the DC area for client meetings.
Essential Duties and Responsibilities:
- Dive deep into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate on, and advance the applied and responsible use of AI, ML, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned, and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional programming experience (e.g., Python, R).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or bachelor's degree in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
Actuary (Kansas City Metro)
Data scientist job in Overland Park, KS
Employment Type: Full-time, In-Office
Salary: $120,000 - $160,000/year + equity
Steadily is hiring an Actuary in the Kansas City Metro area who is the best at what they do. You'll be surrounded by team members who are also the best at what they do, which will make you even better! This is a full-time, in-office position based in Overland Park, KS.
As an Actuary, you will:
Be responsible for the growth and profitability of our DP products nationwide.
Develop industry-leading approaches to rating, exposure management, and capital modeling.
Develop pricing of new products and enhance pricing of existing products
Expand our product into new geographies.
Analyze the drivers of profitability including loss ratios, actuarial indications, frequency/severity trends, retention and other data for all products, books, and channels.
Manage rate filings across multiple states to ensure we are priced to achieve the required return across all products and segments.
Develop and implement robust data-driven action plans to continually improve performance.
Drive innovation by finding new and different ways to price better than the traditional solutions.
Your Background
Experienced: You've been a high achiever in insurance for four years or more. You have experience managing property products (preferably personal lines - HO or DP). You understand the current US property market.
Education: You are nearing actuarial accreditation with the Casualty Actuarial Society (CAS), having passed a minimum of 5 actuarial exams.
Builder: You have a builder's mindset and can take projects and products from inception to launch and beyond. You have a bias towards action.
Digital: Your tech savvy is exceptional. You have strong product sense and are a master of learning and implementing new software and processes. Your technical and analytical skills are top notch.
Hungry: You want to make the leap into an earlier-stage tech company to rapidly accelerate your growth. You want to roll up your sleeves and hustle - you are not looking for a traditional 9-5 job.
Compensation and Benefits
Compensation: $120k - $160k salary + equity in the company
3 weeks PTO plus six federal holidays
Health Insurance including Medical, Dental, Vision, Life, Disability, HSA, FSA
401K
Free snacks & regular team lunches
Locations
Overland Park, KS
Relocation assistance available for out of state candidates
Steadily is building a workplace environment of team members who are passionate and excited to be together in person. Our office is located in Overland Park, KS and is key to our fast-paced growth trajectory.
Why Join Steadily
Good company. Our founders have three successful startups under their belt and have recruited a stellar team to match.
Top compensation. We pay at the top of the Kansas City market (see comp).
Growth opportunity: We're an early-stage, fast-growing company where you'll wear a lot of hats and shape product decisions.
Strong backing. We're growing fast, we manage over $20 billion in risk, and we're exceptionally well-funded.
Culture: Steadily boasts a unique culture that our teammates love. We call it like we see it and we're nothing if not candid. Plus, we love to have a good time. Check out our culture deck to learn what we're all about.
Awards: We've been recognized both locally and nationally as a top place to work. We were named a Top 2025 Startup in Newsweek, winner of Austin Business Journal's Best Places to Work in 2025, recognized in Investopedia's Best Landlord Insurance Companies, ranked No. 6 on Inc's list of Fastest Growing Regional Companies, 44th on Forbes' 2025 Best Startup Employers list, and 63rd on the prestigious Inc 5000 Fastest Growing Companies list.
P&C Actuary
Data scientist job in Kansas City, MO
This role helps support Lockton's goal of providing clear, actionable insights for our clients. These insights are used to facilitate informed decision-making and deliver improved outcomes. The role is responsible for creating tailored solutions for individual clients as well as enhancing the Analytics group's brand, workflows, and capabilities.
Job Responsibilities:
* Collaborate with internal and external stakeholders to understand needs, identify opportunities, and develop strategic solutions to clients' Property and Casualty challenges
* Work with various tools (Excel, VBA, Power BI, R, Python, SQL, etc.) to create sophisticated and scalable analytic tools and processes
* Leverage Lockton's data to answer ad hoc questions for internal and external audiences
* Create materials and trainings to enhance internal understanding of Analytics' capabilities and contribute to white papers / research articles to advance Lockton Analytics' brand in the market
* Remain informed on industry developments and effectively communicate the implications both internally and externally
* Work with analysts and other stakeholders to identify new or improved analytic solutions that solve business needs
* Communicate and explain complex analytic concepts to clients and internal associates
* Work with developers to embed actuarial methodologies in new proprietary Lockton applications
* Compile, analyze, interpret, and present data trends to guide decision-making
* Prepare a variety of analytic outputs based on client or prospect data, including loss projection and stratification, collateral analysis, simulation models, predictive models, claim dashboards, and other ad hoc reports
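To ground the simulation-model deliverable above, here is a minimal sketch of a frequency/severity aggregate-loss simulation in Python. The Poisson and lognormal parameters are invented for illustration and are not drawn from any Lockton methodology:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def simulate_annual_losses(n_sims: int = 10_000,
                           claim_rate: float = 12.0,  # expected claims/year (assumed)
                           sev_mu: float = 9.5,       # lognormal severity params (assumed)
                           sev_sigma: float = 1.4) -> np.ndarray:
    """Aggregate annual losses: Poisson claim counts, lognormal severities."""
    counts = rng.poisson(claim_rate, size=n_sims)
    return np.array([rng.lognormal(sev_mu, sev_sigma, size=n).sum() for n in counts])

losses = simulate_annual_losses()
print(f"Expected annual loss: {losses.mean():,.0f}")
print(f"95th percentile, a common collateral benchmark: {np.percentile(losses, 95):,.0f}")
```

In practice the frequency and severity distributions would be fit to client loss runs, with retentions and limits applied per layer.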
Data Scientist - Retail Pricing
Data scientist job in Overland Park, KS
We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. This role sits at the intersection of analytics, finance, and product strategy - transforming data into pricing intelligence that supports smarter, faster business decisions.
Will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing, will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities.
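As a rough illustration of the elasticity analysis mentioned above, a common starting point is a log-log regression of volume on rate, where the fitted slope is the elasticity estimate. The figures below are invented for the sketch:

```python
import numpy as np

# Hypothetical observations: offered deposit APY (%) vs. monthly inflows ($MM)
apy = np.array([0.50, 0.75, 1.00, 1.50, 2.00, 3.00])
inflows = np.array([1.0, 1.6, 2.1, 3.4, 4.9, 7.8])

# Elasticity = d ln(inflows) / d ln(apy), estimated by least squares on logs
elasticity, intercept = np.polyfit(np.log(apy), np.log(inflows), 1)
print(f"Estimated deposit-rate elasticity: {elasticity:.2f}")

# Scenario test: projected inflows if the offered APY moves to 2.5%
projected = np.exp(intercept) * 2.5 ** elasticity
print(f"Projected inflows at 2.5% APY: ${projected:.1f}MM")
```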
Collaborating across product, finance, and executive teams, will translate complex analytical findings into clear business recommendations that drive strategic action. Will also contribute to enhancing our analytics infrastructure - improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making.
Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration
CapFed is an equal opportunity employer.
Data Engineer
Data scientist job in Overland Park, KS
Description
The Data Engineer role within the Cloud Payments Reporting team is responsible for developing, deploying, and maintaining data solutions. Primary responsibilities include querying databases, maintaining data transformation pipelines through ETL tools and systems into data warehouses, and providing and supporting data visualizations, dashboards, and ad hoc and customer-deliverable reports.
Responsibilities
Provide primary support for existing databases, automated jobs, and reports, including monitoring notifications, performing root cause analysis, communicating findings, and resolving issues.
Serve as the subject matter expert for payments reports, databases, and processes.
Ensure data and report integrity and accuracy through thorough testing and validation.
Build analytical tools to utilize the data, providing actionable insight into key business performance metrics including operational efficiency.
Implement, support, and optimize ETL pipelines, data aggregation processes, and reports using various tools and technologies.
Collaborate with operational leaders and teams to understand reporting and data usage across the business and provide efficient solutions.
Participate in recurring meetings with working groups and management teams to discuss operational improvements.
Work with stakeholders including data, design, product, and executive teams to support their data infrastructure needs while assisting with data-related technical issues.
Handle tasks on your own, adjust to new deadlines, and adapt to changing priorities.
Design, develop and implement special projects, based on business needs.
Perform other job-related duties and responsibilities as assigned.
Qualifications
Five or more years of experience with Oracle, MySQL, Power BI / QuickSight, and Excel.
Thorough knowledge of SQL, relational databases and data modeling principles.
Proficiency in programming languages such as PL/SQL, Bash, PowerShell and Python.
Exceptional problem-solving, analytical, and critical thinking skills.
Excellent interpersonal and communication skills, with the ability to consult with stakeholders to facilitate requirements gathering, troubleshooting, and solution validation.
Detail-oriented with the ability to understand the bigger picture.
Ability to communicate complex quantitative analysis clearly.
Strong organizational skills, including multi-tasking and teamwork.
Self-motivated and task-oriented, with an aptitude for complex problem solving.
Experience with AWS, Jenkins and SnapLogic is a plus.
Experience with data streaming, API calls (SOAP and REST), database replication, and real-time processing is a plus.
Experience with Atlassian JIRA and Confluence is a plus.
Data Analytics Intern
Data scientist job in Overland Park, KS
Internship Description
Propio Language Services is a provider of the highest quality interpretation, translation, and localization services. Our people take pride in every resource we offer, and our users always have access to the best technology, support, and experience. We are driven by our passion for innovation, growth, and connecting people. If you believe in the transformative power of technology-driven solutions and meaningful communication, Propio could be the ideal place for you.
Propio's Summer Internship Program is an eight-week experience that offers students the opportunity to engage in real-world client work while receiving mentorship from industry-leading professionals. As an intern, you'll make meaningful contributions from day one because we believe you are the future of our business.
Program Benefits:
Onsite experience that promotes hands-on learning, team engagement, and a deeper understanding of our company culture
Competitive hourly pay
One-on-one mentorship with experienced professionals
Ongoing learning and development
Networking opportunities and social events with peers and professionals
Potential for full-time employment upon graduation
Position Overview
We are seeking a motivated and ambitious Data Analytics Intern to join our team. This internship provides hands-on experience in various data analysis functions, including report and dashboard creation, exploratory analysis, and results reporting. In this role, you will gain exposure to the full suite of business analytics and data reporting and contribute to meaningful projects in a fast-paced environment.
Responsibilities:
Assist in collecting, cleaning, and organizing raw data from multiple sources for analysis
Perform exploratory data analysis to identify trends, patterns, and outliers
Support the creation of dashboards, reports, and data visualizations to communicate insights
Collaborate with internal teams to understand data needs and deliver solutions
Help maintain data integrity and accuracy across systems and tools
Use tools such as Excel, SQL, Python, Tableau, and Power BI to support analytics projects
Contribute to the development of predictive models and statistical analyses under supervision
Document processes, methodologies, and findings to support future reference and repeatability
Stay current on data analytics trends, tools, and best practices to support continuous improvement
Requirements
Qualifications:
Currently pursuing a degree in Business Analytics, Data Science, Statistics, Management Information Systems or a related field
Minimum of a 3.0 GPA strongly preferred
Active involvement in campus, community, or other volunteer activities and/or organizations preferred
Strong written and verbal communication skills
High level of confidentiality and professionalism
Excellent attention to detail and organizational skills
Basic knowledge of Microsoft Office Suite (Word, Excel, Outlook, PowerPoint)
Interest in learning and contributing to a variety of [Department] functions
Prior internship or office experience a plus, but not required
Candidates must be legally authorized to work in the United States on a full-time basis without requiring future sponsorship for employment visa status
What You'll Gain
Exposure to real-world data analysis practices and systems
Experience working with cross-functional teams
Mentorship and support from industry leading data analysis professionals
Opportunities to make meaningful contributions to organizational projects
A stronger understanding of career paths within data and business analytics
Finance, Strategy & Analytics - Data Science Intern
Data scientist job in Kansas City, MO
Data Science Intern
DEPARTMENT: Finance, Strategy & Analytics
REPORTS TO: Senior Data Scientist
STATUS: Intern, Hourly
The Data Science Intern will play a key role in supporting the Kansas City Chiefs' Finance, Strategy & Analytics team. You will help enhance our data infrastructure by assisting with potential migration efforts to a modern data platform, supporting updates to connected Plotly Dash applications, and validating data pipelines as needed. You will also contribute to real-time insights, data visualization, and advanced analytics that help transform data into meaningful, actionable decisions across the organization.
JOB RESPONSIBILITIES
Support the team in evaluating and potentially transitioning data workflows from PostgreSQL to a modern data platform.
Assist in restructuring, optimizing, and validating datasets and queries across evolving data environments.
Help update and maintain Plotly Dash applications, ensuring stable data connections during platform transitions (see the sketch after this list).
Maintain and enhance reports, dashboards, and analytical tools used by executives and department leaders.
Develop or improve internal applications that support real-time, data-driven decision making.
Assist with analytical projects such as retention modeling, customer segmentation, dynamic pricing, and financial forecasting.
Document data processes, sources, and architectural changes as infrastructure evolves.
Work with internal teams and vendors to ensure data accuracy, timeliness, and reliability.
Support ad-hoc analysis requests and identify opportunities for new projects or process improvements.
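For context on the Plotly Dash item above, a minimal app that reads from PostgreSQL and renders a single chart looks roughly like the sketch below. The connection string and table name are hypothetical placeholders, not the team's actual schema:

```python
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html
from sqlalchemy import create_engine

# Hypothetical DSN; real credentials would come from a secrets store
engine = create_engine("postgresql://user:password@localhost:5432/analytics")

# Pull a small aggregate so the page loads quickly
df = pd.read_sql(
    "SELECT game_date, tickets_sold FROM ticket_sales ORDER BY game_date",
    engine,
)

app = Dash(__name__)
app.layout = html.Div([
    html.H3("Ticket Sales by Game"),
    dcc.Graph(figure=px.line(df, x="game_date", y="tickets_sold")),
])

if __name__ == "__main__":
    app.run(debug=True)
```

During a platform migration, only the engine URL and query need to change; the layout code stays intact.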
JOB REQUIREMENTS
Required Qualifications
Proficiency in Python and familiarity with commonly used analytics and data engineering libraries.
Strong SQL skills and understanding of relational database concepts.
Familiarity with PySpark or an interest in learning distributed computing frameworks.
Interest in modern data platforms, cloud technologies, or large-scale data processing environments.
Ability to troubleshoot API integrations and application data connections.
Ability to combine data from multiple sources and validate data accuracy.
Strong analytical and problem-solving skills.
Ability to collaborate effectively with teammates and stakeholders across the organization.
Ability to work flexible hours, including evenings, weekends, and holidays as needed.
Familiarity with Microsoft Office tools (Excel, PowerPoint, Dynamics CRM).
Preferred Qualifications
0-1 years of experience in a related analytics, engineering, or technical role.
Experience with Plotly Dash (enterprise or open source) or other application frameworks.
Experience with PySpark, Spark SQL, or cloud-based data processing tools such as Databricks, Snowflake, or similar platforms.
Experience with HTML/CSS and JavaScript.
Pursuing or holding a degree in Mathematics, Statistics, Computer Science, Engineering, or a related field.
Experience with Tableau or Power BI.
Experience with Git or other version control systems.
Building Science Intern - Summer 2026
Data scientist job in Kansas City, MO
We at BranchPattern are seeking a motivated and detail-oriented Building Science Summer Intern to join our team for Summer 2026. This is a hands-on opportunity to gain experience in the fast-growing field of building science, with a focus on green building consulting, energy analytics, and building commissioning. You will work closely with our multidisciplinary team of engineers, architects, and consultants to assist in evaluating building performance, optimizing energy efficiency, and implementing sustainable design practices.
Key Responsibilities:
Assist in building performance analysis using simulation software to assess energy efficiency, daylighting, thermal comfort, and other sustainability metrics.
Support the commissioning process by participating in system verification, functional performance testing, and documentation reviews.
Conduct research on sustainable building materials, energy codes, and emerging green technologies to support project recommendations.
Assist in the development of reports and presentations for clients, summarizing building performance findings and sustainability strategies.
Participate in site visits to assess building systems, including HVAC, lighting, and envelope performance, and support in preparing commissioning plans and checklists.
Collaborate with team members on LEED, WELL, and other green building certification projects, helping with documentation and compliance tracking.
Analyze and interpret building performance data to assist in identifying energy-saving opportunities and recommending solutions.
Minimum Qualifications:
Currently pursuing a Bachelor's or Master's degree in Architectural Engineering, Mechanical Engineering, or Architecture.
Strong interest in building science, consulting, and sustainability.
Strong analytical and problem-solving skills, with attention to detail.
Excellent communication and teamwork skills, with the ability to work collaboratively in a consulting environment.
Proficiency in Microsoft Office Suite (Excel, Word, PowerPoint).
Desired Qualifications:
Basic knowledge of building systems, including HVAC, lighting, and envelope design.
Familiarity with building performance simulation tools (e.g., EnergyPlus, IESVE, or similar) is a plus.
Experience with AutoCAD or Revit is a plus.
BranchPattern is a great place to work!
BranchPattern is an award-winning firm built by engineers and building scientists who genuinely care about improving people's lives. We are advocates for building a better environment and community. Our employees are progressive, passionate, and innovative change agents that strive to deliver high-quality projects. If you are ready to make a difference and improve the lives of our communities, apply today.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, or national origin.
Data Engineer
Data scientist job in Overland Park, KS
The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Data Engineering & Integration
Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools.
Automate data imports, transformations, and loads from multiple sources (on-premise, SaaS, APIs, and cloud).
Optimize and monitor data workflows for reliability, performance, and cost efficiency.
Implement and maintain data quality, validation, and error-handling frameworks.
Data Analysis & Reporting
Develop and maintain reporting databases, views, and semantic models for business intelligence solutions.
Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs.
Perform ad-hoc data exploration and statistical analysis to support business initiatives.
Collaboration & Governance
Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements.
Maintain data integrity, enforce governance standards, and promote best practices in data stewardship.
Support data security and compliance initiatives in coordination with IT and business teams.
Continuous Improvement
Stay current with emerging data technologies and analytics practices.
Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.
QUALIFICATIONS
Required:
Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database.
Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools.
Proficiency in building BI solutions using Power BI and/or SSRS.
Strong data modeling and relational database design skills.
Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections).
Ability to translate business goals into data requirements and technical solutions.
Excellent communication and collaboration skills.
Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
Preferred:
Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks).
Familiarity with version control tools (Git, Azure DevOps) and Agile development practices.
Exposure to Python or PowerShell for data transformation or automation.
Experience integrating data from insurance or financial systems.
Compensation: $120K-$129K
This position is hybrid (3 days onsite) in Overland Park, KS.
We look forward to reviewing your application. We encourage everyone to apply, even if you don't check every box.
PDSINC, LLC is an Equal Opportunity Employer.
Data Engineer III
Data scientist job in Kansas City, MO
Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day.
Job Description
This person has the opportunity to work primarily remotely in the Kansas City or surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area.
We are unable to sponsor for this role; this includes international students.
OVERVIEW
The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high-quality datasets that enable our business stakeholders and world-class Analytics department to make data-informed decisions. Data engineers, combining Software Engineering and Database Engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance, ensuring availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor.
ESSENTIAL DUTIES
The essential duties for this role include, but are not limited to:
Serve as a primary advisor to Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks.
Build advanced data pipelines utilizing the medallion architecture to create high-quality, single-source-of-truth data sources in Snowflake (see the sketch after this list)
Architect replacements of current Data Management systems with respect to all aspects of data governance
Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores.
Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves.
Actively participate as a leader in regular team meetings, listening and ensuring that one is assisting others at every chance for growth and development.
Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores.
Take ownership (both individually and as part of a team) of services and applications
Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements
Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests
Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
Work with Project Managers, Solution Architects, and Software Development teams to build solutions for Company Initiatives on time, on budget, and on value.
Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity.
Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business.
Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle from 8am-9pm.
Follow and embrace procedures of both the Data Management team and SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance.
Support after hours and weekend releases from our internal Software Development teams.
Actively participate in code review and weekly technicals with another more senior engineer or manager.
Assist departments with time-critical SQL execution and debug database performance problems.
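To make the medallion-architecture duty above concrete, the sketch below walks a bronze → silver → gold refresh in Snowflake via Python. Table names and connection parameters are illustrative assumptions, not SVG's actual schema:

```python
import snowflake.connector

STEPS = [
    # Bronze: land raw records as-is for auditability
    "CREATE TABLE IF NOT EXISTS bronze_events LIKE raw_events",
    "INSERT INTO bronze_events SELECT * FROM raw_events",
    # Silver: dedupe and type-cast into a clean, conformed layer
    """CREATE OR REPLACE TABLE silver_events AS
       SELECT DISTINCT event_id, TRY_TO_TIMESTAMP(event_ts) AS event_ts, payload
       FROM bronze_events
       WHERE event_id IS NOT NULL""",
    # Gold: business-level aggregate analysts treat as the source of truth
    """CREATE OR REPLACE TABLE gold_daily_counts AS
       SELECT event_ts::DATE AS event_date, COUNT(*) AS events
       FROM silver_events
       GROUP BY 1""",
]

# Placeholder credentials; in production these come from a secrets manager
conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()
try:
    for sql in STEPS:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```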
ROLE COMPETENCIES
The competencies for this role include, but are not limited to:
Emotional Intelligence
Drive for Results
Continuous Improvement
Communication
Strategic Thinking
Teamwork and Collaboration
Qualifications
POSITION REQUIREMENTS
The requirements to fulfill this position are as follows:
Bachelor's degree in Computer Science, or a related technical field.
4-7 years of practical production work in Data Engineering.
Expertise in the Python programming language.
Expertise in Snowflake.
Expertise in SQL, databases, and query optimization.
Must have experience in a large cloud provider such as AWS, Azure, GCP.
Advanced at reading code independently and understanding its intent.
Advanced at writing readable, modifiable code that solves business problems.
Ability to construct reliable and robust data pipelines to support both scheduled and event based workflows.
Working directly with stakeholders to create solutions.
Mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact.
Additional Information
Benefits:
The Company offers the following benefits for this position, subject to applicable eligibility requirements:
Competitive Compensation
Medical, Dental and vision benefits after a short waiting period
401(k) matching program
Life Insurance, and Short-term and Long-term Disability Insurance
Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan
Generous paid time off (PTO) program starting off at 15 days your first year
15 paid Holidays (includes holiday break between Christmas and New Year's)
10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave
Annual Volunteer Time Off (VTO) and a donation matching program
Employee Assistance Program (EAP) - health and well-being on and off the job
Rewards and Recognition
Diverse, inclusive and welcoming culture
Training program and ongoing support throughout your Spring Venture Group career
Security Responsibilities:
Operating in alignment with policies and standards
Reporting Security Incidents
Completing assigned training
Protecting assigned organizational assets
Spring Venture Group is an Equal Opportunity Employer
Senior Data Engineer
Data scientist job in Overland Park, KS
The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making. Designs, develops, and maintains data pipelines, data warehouses, and other data-related infrastructure. This role works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions.
Key Responsibilities:
Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards to drive business insights.
Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives.
Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness. This includes identifying bottlenecks, tuning queries, and scaling infrastructure as needed.
Automate data ingestion, processing, and validation tasks to ensure data quality and consistency.
Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations.
Contribute to the development of the organization's overall data strategy.
Conduct code reviews and contribute to the establishment of coding standards and best practices.
Required Qualifications:
Bachelor's degree in a relevant field or equivalent professional experience.
4-6 years of hands-on experience in data engineering.
Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB.
Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services.
Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift.
Programming skills in Python or JavaScript.
Proficiency with BI tools such as Sisense, Power BI, or Tableau.
Preferred Qualifications:
Direct experience with Google Cloud Platform (GCP).
Knowledge of CI/CD pipelines, including tools like Docker and Terraform.
Background in the healthcare industry.
Familiarity with modern data integration tools such as DBT, Matillion, and Airbyte.
Compensation: $125,000.00 per year
Who We Are
CARE ITS is a certified woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. CARE ITS was established in 2010. Since then we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally with our headquarters in Plainsboro, NJ, with focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
Data Engineer
Data scientist job in Overland Park, KS
At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data.
The Data Engineer will run daily operations of the data infrastructure, automate and optimize our data operations and data pipeline architecture while ensuring active monitoring and troubleshooting. This hire will also support other engineers and analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. APPLY TODAY!
What you'll do:
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Review project objectives and determine the best technology for implementation. Implement best practice standards for development, build and deployment automation.
Run daily operations of the data infrastructure and support other engineers and analysts on data investigations and operations.
Monitor and report on all data pipeline tasks while working with appropriate teams to take corrective action quickly, in case of any issues.
Work with internal teams to understand current processes and areas for efficiency gains.
Write well-abstracted, reusable, and efficient code.
Participate in the training and/or mentoring programs as assigned or required.
Adhere to the Quest Analytics Values and support a positive company culture.
Respond to the needs and requests of clients and Quest Analytics management and staff in a professional and expedient manner.
What it requires:
Bachelor's degree in computer science or related field.
3 years of work experience with ETL, data operations and troubleshooting, preferably in healthcare data.
Proficiency with Azure ecosystems, specifically in Azure Data Factory and ADLS.
Strong proficiency in Python for scripting, automation, and data processing.
Advanced SQL skills for query optimization and data manipulation.
Experience with distributed data pipeline tools like Apache Spark, Databricks, etc.
Working knowledge of database modeling, schema design, and data governance best practices.
Working knowledge of libraries such as pandas and NumPy.
Self-motivated and able to work in a fast-paced, deadline-oriented environment.
Excellent troubleshooting, listening, and problem-solving skills.
Proven ability to solve complex issues.
Customer focused.
What you'll appreciate:
• Workplace flexibility - you choose between remote, hybrid or in-office
• Company paid employee medical, dental and vision
• Competitive salary and success sharing bonus
• Flexible vacation with no cap, plus sick time and holidays
• An entrepreneurial culture that won't limit you to a job description
• Being listened to, valued, appreciated -- and having your contributions rewarded
• Enjoying your work each day with a great group of people
Apply TODAY! careers.questanalytics.com
About Quest Analytics
For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. Visa sponsorship is not available at this time.
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.
Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discriminations or harassment.
Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence, *********************
NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with additional outside agencies. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Senior Data Engineer
Data scientist job in Overland Park, KS
Velocity Staff, Inc. is working with our client located in the Overland Park, KS area to identify a senior-level Data Engineer to join their Data Services team. The right candidate will utilize their expertise in data warehousing, data pipeline creation/support, and analytical reporting, and will be responsible for gathering and analyzing data from several internal and external sources and designing a cloud-focused data platform for analytics and business intelligence that reliably provides data to our analysts. This role requires a significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams.
Responsibilities
Work with data architects to understand current data models and to build pipelines for data ingestion and transformation.
Design, build, and maintain a framework for pipeline observation and monitoring, focusing on reliability and performance of jobs (see the sketch after this list).
Surface data integration errors to the proper teams, ensuring timely processing of new data.
Provide technical consultation for other team members on best practices for automation, monitoring, and deployments.
Provide technical consultation for the team with “infrastructure as code” best practices: building deployment processes utilizing technologies such as Terraform or AWS Cloud Formation.
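As a hedged illustration of the pipeline observation item above, one lightweight pattern is to wrap each pipeline task so timing and failures are logged and surfaced to the owning team. The task name and body below are invented placeholders:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def observed(task_name: str):
    """Wrap a pipeline task with timing and failure logging so errors
    surface to the proper team instead of failing silently."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
                log.info("%s succeeded in %.1fs", task_name, time.monotonic() - start)
                return result
            except Exception:
                log.exception("%s failed after %.1fs", task_name, time.monotonic() - start)
                raise  # re-raise so the scheduler marks the job failed
        return wrapper
    return decorator

@observed("ingest_orders")
def ingest_orders():
    # Placeholder for a real ingestion step
    time.sleep(0.1)

ingest_orders()
```

In a real deployment the log lines would feed whatever alerting stack is in place (e.g., CloudWatch or Grafana) rather than stdout.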
Qualifications
Bachelor's degree in computer science, data science or related technical field, or equivalent practical experience
Proven experience with relational and NoSQL databases (e.g. Postgres, Redshift, MongoDB, Elasticsearch)
Experience building and maintaining AWS based data pipelines: Technologies currently utilized include AWS Lambda, Docker / ECS, MSK
Mid/senior-level development utilizing Python (Pandas/NumPy, Boto3, SimpleSalesforce)
Experience with version control (git) and peer code reviews
Enthusiasm for working directly with customer teams (Business units and internal IT)
Preferred but not required qualifications include:
Experience with data processing and analytics using AWS Glue or Apache Spark
Hands-on experience building data-lake style infrastructures using streaming data set technologies (particularly with Apache Kafka)
Experience with data processing using Parquet and Avro
Experience developing, maintaining, and deploying Python packages
Experience with Kafka and the Kafka Connect ecosystem.
Familiarity with data visualization techniques using tools such as Grafana, Power BI, Amazon QuickSight, and Excel.
Not ready to apply? Connect with us to learn about future opportunities.
Data Engineer II
Data scientist job in Leawood, KS
Full-time
Description
27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards.
We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions.
Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.
Your Role:
Participate in the design and implementation of scalable, secure, and high-performance data architectures.
Develop and maintain conceptual, logical, and physical data models.
Work closely with architects to define standards for data integration, quality, and governance.
Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
Support cloud-based data strategies including data warehousing, pipelines, and real-time processing.
Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads.
Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling (see the sketch after this list).
Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.
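As a rough illustration of the preprocessing and real-time inference item above, a scikit-learn Pipeline bundles feature engineering with the model so the exact same transformations run at training time and at scoring time. The columns and data here are invented:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical training frame
df = pd.DataFrame({
    "amount":  [120.0, 15.5, 300.2, 42.0],
    "channel": ["web", "store", "web", "phone"],
    "churned": [0, 1, 0, 1],
})

# Feature engineering: scale numerics, one-hot encode categoricals
prep = ColumnTransformer([
    ("num", StandardScaler(), ["amount"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["channel"]),
])
model = Pipeline([("prep", prep), ("clf", LogisticRegression())])
model.fit(df[["amount", "channel"]], df["churned"])

# Real-time inference: score one incoming record through the same pipeline
new_record = pd.DataFrame({"amount": [80.0], "channel": ["web"]})
print(model.predict_proba(new_record)[:, 1])
```

The same packaging approach carries over to managed platforms like the SageMaker, Azure ML, or Databricks deployments mentioned above.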
Requirements
What You Bring:
BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or related field.
2 - 4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
2 - 4 years of experience writing .NET code or other OOP languages in an Agile environment.
Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members.
Proficient technical skills in: Spark, Scala, C#, PySpark, Data Lake, Delta Lake, Relational and NoSQL Databases, AWS Glue and Azure Synapse
Experience with SQL, ETL/ELT, and data modeling.
Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with data lake.
Knowledge of data governance, security, and compliance frameworks.
Ability to context switch and work on a variety of projects over specified periods of time.
Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.
Ways to Stand Out:
Certifications: AWS Solution Architect, Azure Data Engineer, Databricks Data Engineer
Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
Hands-on experience with big data tools (Spark, Kafka).
Modern data warehouses (Snowflake, Redshift, BigQuery).
Familiarity with machine learning pipelines and real-time analytics.
Strong communication skills and ability to influence stakeholders.
Prior experience implementing enterprise data governance frameworks.
Experience in a client-facing role, working directly with clients from multiple levels of the organization; often presenting and documenting client environment suggestions and improvements.
Why 27G?:
Four-time award winner of Best Place to Work by the Kansas City Business Journal.
A casual and fun small business work environment.
Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
Dedicated time for learning, development, research, and certifications.
Principal Data Engineer
Data scientist job in Lenexa, KS
About the Role: weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission.
You'll be the technical lead for everything data - building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently.
What You'll Do:
Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity
Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable)
Enable data-driven decision-making across product, engineering, and business teams
Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling)
Ensure data quality, observability, governance, and security across all systems
Serve as the subject matter expert on data systems, operating as a senior IC without a team initially
What You Bring:
6+ years of experience in data engineering, ideally within a startup or high-growth environment
Proven ability to independently design, implement, and manage scalable data architectures
Deep experience working with large datasets, ideally from IoT sources or other high-volume systems
Proficiency with modern data tools and languages (e.g., TypeScript, Node.js, SQL)
Strong cloud experience, ideally with Microsoft Azure (but AWS or GCP also acceptable)
A business-focused mindset with the ability to connect technical work to strategic outcomes
Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms.
Excellent communication and collaboration skills across technical and non-technical teams
Bonus Points For:
Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.)
Familiarity with BI tools and self-service analytics platforms
Background in system performance monitoring and observability tools
Why weavix
Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers, the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest assets: people.
It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself/your family, and invest in your future.
Perks and Benefits
Competitive Compensation
Employee Equity Stock Program
Competitive Benefits Package including: Medical, Dental, Vision, Life, and Disability Insurance
401(k) Retirement Plan + Company Match
Flexible Spending & Health Savings Accounts
Paid Holidays
Flexible Time Off
Employee Assistance Program (EAP)
Other exciting company benefits
About Us
weavix, the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms the enterprise by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results. weavix is the single source of truth for both workers and executives.
Our mission is simple: to connect every disconnected worker through disruptive technology.
How do you want to make your impact?
For more information about us, visit weavix.com.
Equal Employment Opportunity (EEO) Statement
weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce.
We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment.
Americans with Disabilities Act (ADA) Statement
weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************.
E-Verify Notice
Notice: weavix participates in the E-Verify program to confirm employment eligibility as required by law.
Auto-ApplyPrincipal Data Engineer
Data scientist job in Lenexa, KS
Job Description
About the Role: weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission.
You'll be the technical lead for everything data: building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently.
What You'll Do:
Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity (a minimal sketch follows this list)
Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable)
Enable data-driven decision-making across product, engineering, and business teams
Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling)
Ensure data quality, observability, governance, and security across all systems
Serve as the subject matter expert on data systems, operating as a senior IC without a team initially
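For illustration, here is a minimal Python sketch of the kind of pipeline stage this role would own: validating raw IoT messages and grouping them into batches before they reach a storage sink. Every name in it (Reading, deviceId, the batch size) is an invented assumption for the example, not weavix's actual code.

```python
# Hypothetical sketch, not weavix code: validate raw IoT messages and
# group them into fixed-size batches before they reach a storage sink.
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional

@dataclass
class Reading:
    device_id: str
    metric: str
    value: float
    ts: int  # epoch milliseconds

def validate(raw: dict) -> Optional[Reading]:
    """Drop malformed messages rather than failing the whole batch."""
    try:
        return Reading(str(raw["deviceId"]), str(raw["metric"]),
                       float(raw["value"]), int(raw["ts"]))
    except (KeyError, TypeError, ValueError):
        return None

def batches(messages: Iterable[dict], size: int = 500) -> Iterator[list[Reading]]:
    """Yield lists of validated readings, `size` at a time."""
    buf: list[Reading] = []
    for raw in messages:
        reading = validate(raw)
        if reading is not None:
            buf.append(reading)
        if len(buf) >= size:
            yield buf
            buf = []
    if buf:
        yield buf  # flush the final partial batch
```

Dropping malformed messages at the edge keeps a single bad payload from failing an entire batch, which matters at IoT volumes.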
What You Bring:
6+ years of experience in data engineering, ideally within a startup or high-growth environment
Proven ability to independently design, implement, and manage scalable data architectures
Deep experience working with large datasets, ideally from IoT sources or other high-volume systems
Proficiency with modern data tools and languages (e.g., TypeScript, Node.js, SQL)
Strong cloud experience, ideally with Microsoft Azure (but AWS or GCP also acceptable)
A business-focused mindset with the ability to connect technical work to strategic outcomes
Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms.
Excellent communication and collaboration skills across technical and non-technical teams
Bonus Points For:
Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.)
Familiarity with BI tools and self-service analytics platforms
Background in system performance monitoring and observability tools
Why weavix
Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers, the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest asset: people.
It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself/your family, and invest in your future.
Perks and Benefits
Competitive Compensation
Employee Equity Stock Program
Competitive Benefits Package including: Medical, Dental, Vision, Life, and Disability Insurance
401(k) Retirement Plan + Company Match
Flexible Spending & Health Savings Accounts
Paid Holidays
Flexible Time Off
Employee Assistance Program (EAP)
Other exciting company benefits
About Us
weavix, the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms the enterprise by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results. weavix is the single source of truth for both workers and executives.
Our mission is simple: to connect every disconnected worker through disruptive technology.
How do you want to make your impact?
For more information about us, visit weavix.com.
Equal Employment Opportunity (EEO) Statement
weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce.
We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment.
Americans with Disabilities Act (ADA) Statement
weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************.
E-Verify Notice
Notice: weavix participates in the E-Verify program to confirm employment eligibility as required by law.
Senior Data Engineer
Data scientist job in Overland Park, KS
Company Details
Intrepid Direct Insurance (IDI) is a rapidly growing direct-to-consumer property and casualty insurance company and a member of the W. R. Berkley Corporation, a Fortune 500 company rated A+ (Superior) by A.M. Best. Intrepid Direct's vision is to make life better for business. The insurance industry has not evolved with innovation like other major industries; we're here to change that. We are making life better for our customers, shareholders, and team members by leveraging data and technology as insurance experts for our targeted customers. You will be part of a highly collaborative team of talented and focused professionals. Join a group that enjoys working together, trusts each other, and takes pride in our hard-earned success.
***************************
The Company is an equal employment opportunity employer.
Responsibilities
Intrepid Direct Insurance is looking for an experienced Senior Data Engineer to mentor, orchestrate, implement, and monitor the data flowing through our organization. This role will have a direct influence on how data is made available to our business units, as well as our customers. You'll primarily be working with our operations and engineering teams to create and enhance data pipelines, conform and enrich data, and deliver information to business users. Learn the ins and outs of what we do so that you can focus on improving the availability and quality of the data we use to service our customers.
Key functions include but are not limited to:
Assist with long-term strategic planning for modern data warehousing needs.
Contribute to data modeling exercises and the buildout of our data warehouse.
Monitor, support, and analyze existing pipelines and recommend performance and process improvements to address gaps in existing process.
Automate manual processes owned by data team.
Troubleshoot and remediate ingestion and reporting related issues.
Design and build new pipelines to ingest data from additional disparate sources (see the sketch after this list).
Responsible for the accuracy and availability of data in our data warehouse.
Collaborate with a multi-disciplinary team to develop data-driven solutions that align with our business and technical needs.
Create and deploy reports as needed.
Assist with cataloging and classifying existing data sets.
Participate in peer reviews with emphasis on continuous improvement.
Respond to regulatory requests for information.
Assumes other tasks and duties as assigned by management.
Mentor team members and advise on best practices.
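As one hedged illustration of the "ingest from disparate sources" duty above, the Python sketch below polls a hypothetical events API and returns records ready for warehouse staging. The URL scheme, token, and response shape are all assumptions, not Intrepid Direct's actual integration.

```python
# Illustration only: poll a hypothetical events API for new records to
# stage in the warehouse. URL scheme, token, and response shape are
# assumptions for the example.
import requests

def fetch_events(base_url: str, token: str, since: str) -> list[dict]:
    """Return events created after the `since` timestamp."""
    resp = requests.get(
        f"{base_url}/events",
        params={"since": since},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()  # surface failures to pipeline monitoring
    return resp.json()["events"]

# Example (hypothetical endpoint):
# rows = fetch_events("https://api.example.com/v1", "token", "2024-01-01T00:00:00Z")
```

The same pattern applies to queue consumers: fetch, validate, then land rows idempotently in a staging table so reruns do not duplicate data.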
Qualifications
Bachelor's degree in Mathematics, Statistics, Computer Science, or equivalent experience.
6+ years of relevant data engineering experience.
Analytical thinker with experience working in a fast-paced, startup environment.
Technical expertise with Microsoft SQL Server.
Familiarity with ETL tools and concepts.
Hands-on experience with database design and data modeling, preferably with the Data Vault methodology.
Experience supporting and troubleshooting SSIS packages.
Experience consuming event-based data through APIs or queues.
Experience in Agile software development.
Experience with insurance data highly desired.
Detail oriented with strong organizational and problem-solving skills.
Strong written, visual, and verbal communication skills.
Team oriented with a strong willingness to serve others in an agile startup environment.
Flexible in assuming new responsibilities as they arise.
Experience with Power BI desired.
Additional Company Details
We do not accept unsolicited resumes from third-party recruiting agencies or firms.
The actual salary for this position will be determined by a number of factors, including the scope, complexity, and location of the role; the skills, education, training, credentials, and experience of the candidate; and other conditions of employment.
Sponsorship Details
Sponsorship is not offered for this role.
Data Engineer, Mid-Level (Leawood, KS & Arlington, VA)
Data scientist job in Leawood, KS
Become Part of a Meaningful Mission
Torch.AI is a defense-focused AI-software company on a mission to become the leading provider of critical data infrastructure for U.S. Defense and National Security. We deliver advanced AI and data software capabilities directly to customer mission owners to meet flexible, user-defined specifications and enable a decision advantage for the warfighter. We're passionate about solving complex problems that improve national security, support our warfighters, and protect our nation. Join us in our mission to help organizations Unlock Human Potential.
The U.S. defense and national security industry offers an unparalleled opportunity to contribute to the safety and well-being of the nation while engaging with cutting-edge technologies. As a vital sector that shapes global stability, it offers a dynamic environment to tackle complex challenges across multidisciplinary domains. With substantial investment in innovation, the industry is at the forefront of developing AI, autonomous systems, and advanced national security solutions, each founded on the premise that information is the new battlefield. If this type of work is of interest, we'd love to hear from you.
The Environment: Unlock Your Potential
As a Data Engineer at Torch.AI, you will be at the forefront of building software that scales across Torch.AI's platform capabilities. Your software will be deployed across an array of operational and research & development efforts for mission-critical customer programs and projects.
Each of our customers requires unique technical solutions to enable an asymmetric advantage on the battlefield. Torch.AI's patented software helps remove common obstacles such as manual-intensive data processing, parsing, and analysis, thereby reducing cognitive burden for the warfighter. Our end-to-end data processing, orchestration, and fusion platform supports a wide variety of military use cases, domains, operations, and echelons. Customers enjoy enterprise-grade capabilities that meet specialized needs.
Torch.AI encourages company-wide collaboration to share context, skills, and expertise across a variety of tools, technologies, and development practices. You'll work autonomously while driving coordinated, collaborative decisions across cross-functional teams comprised of defense and national security experts, veterans, business leaders, and experienced software engineers. Your code will advance back-end data orchestration and graph-compute capabilities to deliver elegant data and intelligence products. You will have the opportunity to harden and scale existing platform capabilities, tools, and technologies, while also working to innovate and introduce new iterative capabilities and features which benefit our company and customers.
Successful candidates thrive in a fast-paced, entrepreneurial, and mission-driven environment. We hire brilliant patriots. You'll be encouraged to think creatively, challenge conventional thinking, and identify alternative approaches for delivering value to customers across complex problem sets. Your day-to-day will vary, adapting to the requirements of our customers and the technical needs of respective use cases. One day, you may be supporting the development of a new proof of capability concept for a new customer program; another you may be focused on optimizing system performance to help scale a production deployment; the next you may be working directly with customers to understand their requirements with deep intellectual curiosity.
Our flat operating model puts every employee at the forefront of our customers' missions.
We value customer intimacy, unique perspectives, and dedication to delivering lasting impact and results. You'll have the opportunity to work on the frontlines of major customer programs and influence lasting success for Torch.AI and your teammates.
You'll have the opportunity to gain experience across a wide range of projects and tasks, from designing and demonstrating early capabilities and prototypes to deploying large-scale mission systems.
You'll contribute directly to Torch.AI's continued position as a market leader for data infrastructure AI and compete against multi-billion-dollar incumbents and high-tech AI companies.
Responsibilities
Design, build, and maintain scalable data pipelines using tools like Apache NiFi, Airflow, or equivalent orchestration systems.
Work with structured and semi-structured data using SQL and NoSQL systems (e.g., PostgreSQL, MongoDB, Elasticsearch, Neo4j).
Develop services and integrations using Java (primary) and optionally Python for ETL workflows and data transformation.
Integrate data from internal and external REST APIs; handle data format translation (e.g., JSON, Parquet, Avro). A small format-translation sketch follows this list.
Optimize data flows for reliability and performance, and support large-scale batch and streaming data jobs.
Implement and document ETL mappings, schemas, and transformation logic aligned with mission use cases.
Collaborate with software, DevOps, and AI teams to support downstream data science and ML workflows.
Use Git-based workflows and participate in CI/CD processes for data infrastructure deployments.
Contribute to application specifications, data quality checks, and internal documentation.
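To make the format-translation duty concrete, here is a hedged Python sketch that converts a JSON Lines file to Parquet with pyarrow. It is a generic example with invented paths, not Torch.AI's pipeline code.

```python
# Hedged example of format translation (JSON Lines -> Parquet), one small
# piece of the ETL work described above; paths and data are invented.
import json
import pyarrow as pa
import pyarrow.parquet as pq

def json_lines_to_parquet(src_path: str, dst_path: str) -> None:
    """Read newline-delimited JSON records and write a Parquet file."""
    with open(src_path, encoding="utf-8") as f:
        records = [json.loads(line) for line in f if line.strip()]
    table = pa.Table.from_pylist(records)  # schema inferred from the records
    pq.write_table(table, dst_path, compression="snappy")

# json_lines_to_parquet("events.jsonl", "events.parquet")
```

Columnar Parquet output typically compresses well and reads far faster downstream than row-oriented JSON, which is why this translation step is so common in mission data pipelines.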
What We Value
B.S. degree in Computer Science, Technology, Engineering, or a relevant field.
4-6 years of experience in data engineering, backend software engineering, or data integration roles.
Strong experience with Java development in data pipeline or ETL contexts; Python is a plus.
Proficiency with SQL and NoSQL databases, including query optimization and large dataset processing.
Familiarity with data integration tools such as Apache NiFi, Airflow, or comparable platforms.
Knowledge of RESTful API interactions, JSON parsing, and schema transformations.
Exposure to cloud environments (especially AWS: S3, EC2, Lambda) and distributed systems.
Comfortable with Git-based version control and Agile team practices.
Industry experience, preferably within the defense industry and/or intelligence community or related sectors, a plus.
Capability to work collaboratively in interdisciplinary teams.
Awareness of ethical considerations and responsible AI practices.
Excellent problem-solving skills, attention to detail, and ability to thrive in a fast-paced, collaborative environment.
Experience with data messaging and streaming technologies (e.g., Kafka) (nice to have, not required).
Understanding of IAM/security concepts in data environments (e.g., role-based access, encryption) (nice to have, not required).
Exposure to data modeling, time-series data analysis, or graph databases (e.g., Neo4j) (nice to have, not required).
Familiarity with Spark or other distributed processing frameworks (nice to have, not required).
Security Clearance
We are hiring for multiple positions for each role. Some roles require a Secret, Top Secret, or Top Secret/SCI security clearance on Day 1. If you do not currently hold a clearance but are interested in this role and believe you are eligible for a clearance, we still encourage you to submit an application.
Work Location
We are hiring for roles at our headquarters in Leawood, KS and remotely in the Arlington, VA region. Candidates in the Arlington, VA region may work remotely while not on customer site. Candidates in the Leawood, KS region may require some limited travel to customer sites (
Incentives
Equity: All employees are eligible to participate in the company equity incentive program within their first 12 months of employment. We are proud that 100% of our employees are equity-owning partners at Torch.AI.
Competitive salary and annual performance bonus opportunities.
Unlimited PTO.
11 paid holidays each year.
Incredible professional development and learning opportunities in a fast-paced high-tech environment and exciting industry.
Weekly in-office catering in our Leawood HQ.
Benefits
Torch.AI values employee well-being and, in turn, offers exceptional benefits options which greatly exceed regional and national averages for similar companies.
401k Plan
Torch.AI offers a 401k plan through John Hancock. While the company does not offer employee matching, we offer 3% profit sharing for all employees who elect to participate in the 401k plan. Profit sharing is calculated based on company performance at the end of each calendar year and distributed to 401k accounts at the start of each calendar year.
Medical
Three medical options: PPO, HSA, and TRICARE.
Torch.AI's HSA contribution is 250%-350% higher than average employer contributions in Kansas City and Arlington regions.
Only ~18% of employers offer TRICARE Supplement plans.
Spending Accounts
Above-market employer funding and flexibility.
HSA: Triple-tax advantage
FSA: $50-$3,300 annual contribution, $660 rollover
Dependent Care FSA: $100-$5,000, pre-tax savings on child/dependent care.
Dental
High Plan annual maximum is ~2.6x higher than the national average.
High Renaissance Plan: $5,000 annual max, 50% ortho up to $1,000.
Low Renaissance Plan: $1,000 annual max, no ortho.
Vision
Frame allowance is 25-35% higher than typical employer-sponsored plans.
Vision through Renaissance with VSP Choice network: $0 exams, lenses covered in full, and $180 frame allowance
Life Insurance
Employer-paid coverage at 1x base salary, plus additional voluntary options for employees and spouses; most employers cover only $50k of basic life on average.
Disability & Illness
Torch.AI ranks in the top 10% of regional employers for disability benefits.
Short-Term Disability (employer paid): 60% income, up to $2,000/week
Long-Term Disability (employer paid): 60% income, up to $5,000/month
Voluntary Benefits
Robust voluntary plans offer direct cash payout flexibility and wellness incentives.
Accidental Insurance
Critical Illness
Hospital Indemnity
Commuter Benefits: up to $300/month tax-free for transit/parking
Torch.AI is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, protected veteran status, or status as an individual with a disability.
These positions are being reviewed and filled on a rolling basis, and multiple openings may be available for each role.
JOB CODE: 1000108
Principal Data Scientist
Data scientist job in Kansas City, MO
Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This position requires occasional travel to the DC area for client meetings.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments, and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional programming experience (e.g., Python, R).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or BS in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter (a toy sketch appears after this list).
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers, including the ability to scope customer needs, identify existing technologies, and develop custom software solutions to fill gaps in available off-the-shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
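As a toy illustration of the "signals from noise" skill in the list above, the Python sketch below flags samples that deviate from a rolling baseline by more than a few standard deviations. The window size and threshold are arbitrary assumptions, not a Maximus method.

```python
# Toy sketch: flag samples that deviate from a rolling baseline by more
# than k standard deviations. Window and threshold are assumptions.
import numpy as np

def detect_signal(x: np.ndarray, window: int = 50, k: float = 3.0) -> np.ndarray:
    """Boolean mask of samples exceeding the local noise floor."""
    mask = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        baseline = x[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(x[i] - mu) > k * sigma:
            mask[i] = True
    return mask

# Example: Gaussian noise plus one injected spike
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 500)
x[400] += 8.0
print(np.flatnonzero(detect_signal(x)))  # expected to include index 400
```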
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
Corporate Treasury Data & Risk Analytics
Data scientist job in Overland Park, KS
We are seeking a driven and analytically minded professional to join our Corporate Treasury team. This individual will play a key role supporting asset/liability management, liquidity management, budgeting & forecasting, data analytics, and performance analysis/reporting.
In this role, you will work closely with senior and executive leadership to deliver strategic financial insights, optimize business performance, support and influence decision-making, uncover data-driven stories, and challenge existing processes with fresh, innovative thinking.
Essential Duties & Responsibilities
Responsibilities will be tailored to the experience and skillset of the selected candidate and may include:
* Developing and enhancing financial models and simulations
* Supporting forecasting, liquidity, and ALM analytics
* Conducting "what-if" scenario analysis and presenting actionable insights
* Building dashboards, reporting tools, and performance summaries
* Driving or contributing to process improvement initiatives
* Collaborating cross-functionally with senior leaders across the organization
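To ground the scenario-analysis duty in the list above, here is a deliberately simplified Python sketch that applies parallel rate shocks to an invented balance sheet and compares projected net interest income. Every balance, rate, and scenario is an assumption for illustration only.

```python
# Deliberately simplified "what-if" rate-shock sketch; every balance,
# rate, and scenario below is invented for illustration.
SCENARIOS_BPS = [-100, 0, 100, 200]  # parallel shocks in basis points

assets = {"loans": (500_000_000, 0.065), "securities": (200_000_000, 0.042)}
liabilities = {"deposits": (600_000_000, 0.021), "borrowings": (50_000_000, 0.048)}

def projected_nii(shock_bps: int) -> float:
    """Net interest income if all rates move by the same parallel shock."""
    shock = shock_bps / 10_000
    income = sum(bal * (rate + shock) for bal, rate in assets.values())
    expense = sum(bal * (rate + shock) for bal, rate in liabilities.values())
    return income - expense

for bps in SCENARIOS_BPS:
    print(f"{bps:+5d} bps -> projected NII ${projected_nii(bps):,.0f}")
```

A real ALM model would add repricing schedules, deposit betas, and prepayment behavior; the point here is only the shape of a "what-if" loop over scenarios.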
Experience & Knowledge
* Financial modeling and earnings simulation experience using risk/performance management tools
* Designing and developing mathematical or statistical models to support strategic decision-making and risk management
* Experience running scenario analysis and synthesizing insights for executive audiences
* Familiarity with financial asset/liability instruments, market instruments, and their interactions
* Experience with Funds Transfer Pricing (FTP) and capital allocation is a plus
* Demonstrated success driving effective process improvements
Education
* Bachelor's degree in Accounting, Finance, or a related field required
CapFed is an equal opportunity employer.