
Senior data scientist jobs in Independence, MO

- 54 jobs
  • Senior Data Scientist, Specialist Senior - SFL Scientific

    Deloitte (4.7 company rating)

    Senior data scientist job in Kansas City, MO

    + Senior Data Scientist, Specialist Senior - SFL Scientific

    Our Deloitte Strategy & Transactions team helps guide clients through their most critical moments and transformational initiatives. From strategy to execution, this team delivers integrated, end-to-end support and advisory services covering valuation modeling, cost optimization, restructuring, business design and transformation, infrastructure and real estate, mergers and acquisitions (M&A), and sustainability. Work alongside clients every step of the way, helping them navigate new challenges, avoid financial pitfalls, and provide practical solutions at every stage of their journey: before, during, and after any major transformational projects or transactions.

    SFL Scientific, a Deloitte Business, is a U.S.-based data science consulting firm specializing in building industry-specific artificial intelligence (AI) technologies. We are hiring a Senior Data Scientist to collaborate directly with clients to design and develop novel projects and solutions. Join a rapidly growing team of professionals working to build a world-class data science practice focused on solving complex R&D problems. Recruiting for this role ends on 12/31/2025.

    Work You'll Do

    As a Senior Data Scientist at SFL Scientific, a Deloitte Business, you will define data strategy, drive technical development, and help us create the next generation of tools, products, and AI services. You will work closely with clients to understand their data sets, strategy, and operational requirements in order to drive exploratory analysis and design long-term solutions. Working with a team of interdisciplinary data scientists, engineers, architects, and consultants, our work includes novel areas such as cancer detection, drug discovery, optimizing population health and clinical trials, autonomous systems and edge AI, agentic solutions, and consumer product innovation. Join us to expand your technical career through the lens of consulting and work on novel projects and use cases to expand your data science & AI skills.

    + Guide clients with high autonomy in AI strategy and development, including understanding organizational needs, performing exploratory data analysis, building and validating models, and deploying models into production
    + Lead client initiatives to deliver AI/ML solutions, including providing thought leadership, long-term maintenance, and AI strategy objectives
    + Research and implement novel machine learning approaches, including advancing state-of-the-art training, solution design, network design, and hardware optimization
    + Validate AI models and algorithms via code reviews, unit tests, and integration tests
    + Support prioritization of project performance and model development, and ensure AI solutions are delivered to maximize business impact and new initiatives
    + Collaborate with data engineers, data scientists, project managers, and business teams to make sure delivery and presentations align with business objectives

    The Team

    Our Strategy offering architects bold strategies to achieve business and mission goals, enabling growth, competitive advantage, technology modernization, and continuous digital and AI transformation. SFL Scientific, a Deloitte Business, is a data science professional services practice focused on strategy, technology, and solving business challenges with artificial intelligence (AI). The team has a proven track record serving large, market-leading organizations in the private and public sectors, successfully delivering high-quality, novel, and complex projects, and offering deep domain and scientific capabilities. Made up of experienced AI strategists, data scientists, and AI engineers, they serve as trusted advisors to executives, helping them understand and evaluate new and essential areas for AI investment and identify unique opportunities to transform their businesses.

    Qualifications:

    + Master's or Ph.D. in a relevant STEM field (Data Science, Computer Science, Engineering, Physics, Mathematics, etc.)
    + 3+ years of experience in AI/ML algorithm development using core data science languages and frameworks (Python, PyTorch, etc.) and data analysis (NLP, time-series analysis, computer vision)
    + 3+ years of experience and a proven track record applying traditional ML and deep learning techniques (CNNs, RNNs, GANs) across real-world projects, including model tuning and performance validation in production environments
    + 3+ years of experience deploying and optimizing ML models using tools like Kubernetes, Docker, TensorRT/Triton, RAPIDS, Kubeflow, and MLflow
    + 3+ years of experience leveraging cloud environments (AWS, Azure, or GCP) to deploy AI/ML workloads
    + Live within commuting distance of one of Deloitte's consulting offices
    + Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve
    + Limited immigration sponsorship may be available

    Preferred:

    + 2+ years of experience working in a client-facing, consulting environment
    + 2+ years of experience leading project/client engagement teams in the execution of complex AI data science solutions
    + 1+ year of experience with LLM/GenAI use cases and developing RAG solutions, tools, and services (e.g., LangChain, LangGraph, MCP, etc.)
    + 1+ year of experience with AWS SageMaker or AWS ML Studio

    The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $107,600 to $198,400. You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.

    Information for applicants with a need for accommodation: ************************************************************************************************************

    #MonitorDeloitte #DeloitteJobs #StrategyConsulting #DeloitteStrategy #Strategy26 #SFL26

    All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
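    The posting above calls for validating models "via code reviews, unit tests, and integration tests." As a rough illustration of what such a gate can look like (this is not Deloitte's actual process; the function names, data, and tolerance are invented), a regression test can compare a candidate model's accuracy against the current production baseline before promotion:

```python
# Hypothetical model-validation gate: block deployment if a candidate
# model's accuracy falls below the production baseline minus a tolerance.
# All names and thresholds here are illustrative, not from the posting.

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the labels."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def validate_candidate(y_true, candidate_preds, baseline_accuracy, tolerance=0.01):
    """Pass only if the candidate is within `tolerance` of the baseline."""
    cand_acc = accuracy(y_true, candidate_preds)
    return cand_acc >= baseline_accuracy - tolerance

# Example: a candidate scoring 9/10 against a 0.88 baseline passes.
labels = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
preds  = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]  # one mistake out of ten
print(validate_candidate(labels, preds, baseline_accuracy=0.88))  # True
```

    In practice this kind of check would run in CI against a held-out evaluation set, alongside integration tests of the serving path.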
    $107.6k-198.4k yearly 60d+ ago
  • Principal Data Scientist

    Maximus (4.3 company rating)

    Senior data scientist job in Kansas City, MO

    Description & Requirements

    We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator, supporting both the enterprise and our clients. We are looking for an accomplished, hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work). This position requires occasional travel to the DC area for client meetings.

    Essential Duties and Responsibilities:
    - Make deep dives into the data, pulling out objective insights for business leaders.
    - Initiate, craft, and lead advanced analyses of operational data.
    - Provide a strong voice for the importance of data-driven decision making.
    - Provide expertise to others in data wrangling and analysis.
    - Convert complex data into visually appealing presentations.
    - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
    - Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
    - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
    - Utilize various languages for scripting and write SQL queries.
    - Serve as the primary point of contact for data and analytical usage across multiple projects.
    - Guide operational partners on product performance and solution improvement/maturity options.
    - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
    - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
    - Mentor more junior data analysts/data scientists as needed.
    - Apply a strategic approach to lead projects from start to finish.

    Job-Specific Duties:
    - Develop, collaborate, and advance the applied and responsible use of AI, ML, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
    - Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned, and firm intellectual capital.
    - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from computer vision, natural language processing, LLMs, and classical machine learning.
    - Contribute to the development of mathematically rigorous process improvement procedures.
    - Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.

    Minimum Requirements:
    - Bachelor's degree in a related field required.
    - 10-12 years of relevant professional experience required.
    - 10+ years of relevant software development + AI/ML/DS experience.
    - Professional programming experience (e.g., Python, R, etc.).
    - Experience in two of the following: computer vision, natural language processing, deep learning, and/or classical ML.
    - Experience with API programming.
    - Experience with Linux.
    - Experience with statistics.
    - Experience with classical machine learning.
    - Experience working as a contributor on a team.

    Preferred Skills and Qualifications:
    - Master's or BS in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science, etc.).
    - Experience developing machine learning or signal processing algorithms:
      - Ability to leverage mathematical principles to model new and novel behaviors.
      - Ability to leverage statistics to identify true signals from noise or clutter.
    - Experience working as an individual contributor in AI.
    - Use of state-of-the-art technology to solve operational problems in AI and machine learning.
    - Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
    - Ability to design custom solutions in the AI and advanced analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
    - Ability to build reference implementations of operational AI & advanced analytics processing solutions.

    Background Investigations:
    - IRS MBI - Eligibility

    #techjobs #VeteransPage

    EEO Statement: Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected characteristics.

    Pay Transparency: Maximus compensation is based on various factors, including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays, and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.

    Accommodations: Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.

    Minimum Salary: $156,740.00
    Maximum Salary: $234,960.00
    $66k-92k yearly est. Easy Apply 7d ago
  • Data Scientist

    Sunlighten (3.9 company rating)

    Senior data scientist job in Leawood, KS

    Job Description

    **Please Note: This position is open only to candidates authorized to work in the U.S. without the need for current or future visa sponsorship. Additionally, this position is based in the Kansas City area, and we are only considering candidates who reside locally.**

    At Sunlighten, we're not just about infrared saunas; we're on a mission to improve lives through innovative health and wellness solutions. As a global leader in infrared sauna therapy with a 25-year legacy of innovation, we've grown from our Kansas City roots to establish a global footprint. With the wellness market projected to reach $7 trillion by 2026, we're proud to lead the way in light science and longevity. We're rapidly expanding our BI, AI, and Automation team and are on the hunt for a curious, proactive, and methodical Data Scientist - Forecasting, Experimentation & Scoring to take us to the next level. Based in the Kansas City metro area, we offer the best of both worlds: the growth of a global company with the close-knit culture of a local business.

    Sunlighten's Data Scientist will drive measurable impact across Sales, Marketing, CX, and Operations through analytics and ML. You'll lead forecasting, lead/opportunity scoring, what-if analysis, and rigorous experimentation so leaders can make confident decisions.

    Duties/Responsibilities:
    - Partner with stakeholders (e.g., CSO, Marketing) to shape analytical questions into testable plans; frame success metrics and hypotheses; deliver what-if analysis, statistical testing, and clear recommendations.
    - Design & run experiments (e.g., changing how we serve chat leads): define treatment/control, power, and guardrails; implement tracking (Salesforce/Service Cloud/Marketing Cloud/GA4); and build Power BI readouts.
    - Own ML for BI priorities: improve Lead Score & Opp Score; demand planning; end-to-end forecasting; support website/product-specific models.
    - Define business and model metrics; build golden labels/holdouts; quantify ROI.
    - Feature engineering from Salesforce/NetSuite/Five9/Marketing Cloud/Shopify/telemetry; partner with DE to productionize in Fabric.
    - Implement offline/online evaluation, drift monitoring, experiment tracking, reproducible notebooks, and monthly goal snapshotting.
    - Wear multiple hats: when needed, take an analysis/ML project end-to-end and collaborate with devs to surface results in products or lightweight UIs.
    - Other duties as discussed and assigned.

    Requirements:
    - This position is open only to candidates authorized to work in the U.S. without the need for current or future visa sponsorship. Additionally, this position is based in the Kansas City area, and we are only considering candidates who reside locally.
    - 2-5 years of enterprise experience in applied data science/analytics.
    - 2-5 years of experience developing and shipping predictive models and delivering stakeholder-facing analyses.
    - Expertise in Python (pandas, scikit-learn/LightGBM); SQL; forecasting (Prophet/ARIMA/XGB-based); causal/A-B testing (t-tests, proportion tests, chi-square); power analysis.
    - Expertise with Power BI for experiment/forecast dashboards; Microsoft Fabric for data/feature pipelines; CI/CD with Git.
    - Excellent communication skills: concise narratives, assumptions, and trade-offs; clear documentation.
    - Bachelor's or Master's degree in Data Science/Computer Science/Stats/OR, or equivalent experience; portfolio/GitHub preferred.

    Benefits:
    - Opportunity to work in a collaborative and innovative environment.
    - Career growth opportunities in a market-leading and rapidly growing wellness technology company.
    - Competitive Paid Time Off Policy + Paid Holidays + Floating Holidays.
    - Fully Equipped Fitness Center On-Site.
    - Lunch Program featuring a James Beard Award-winning chef.
    - Health (HSA & FSA Options), Dental, and Vision Insurance.
    - 401(k) with company contributions.
    - Profit Sharing.
    - Life and Short-Term Disability Insurance.
    - Professional Development and Tuition Reimbursement.
    - Associate Discounts on Saunas, Spa Products, and Day Spa Services.

    Sunlighten provides equal employment opportunity. Discrimination of any type will not be tolerated. Sunlighten is an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status or any other characteristic protected by state, federal, or local law.
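    The "proportion tests" named in the requirements above refer to comparisons of conversion rates between experiment arms. As a minimal pure-Python sketch (the conversion counts below are invented for illustration), a two-sided two-proportion z-test with a pooled standard error looks like this:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_a/n_a: conversions and sample size in the control arm;
    conv_b/n_b: same for the treatment arm. Uses the pooled rate
    for the standard error, as in the classic two-proportion test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (computed with erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative: 10% control conversion vs. 13% treatment, 1000 leads each.
z, p = two_proportion_ztest(100, 1000, 130, 1000)
print(round(z, 2), round(p, 4))
```

    In production one would typically reach for `statsmodels.stats.proportion.proportions_ztest` rather than hand-rolling this, and pair it with an up-front power analysis to size the arms.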
    $62k-86k yearly est. 19d ago
  • Data Engineer

    Tyler Technologies (4.3 company rating)

    Senior data scientist job in Overland Park, KS

    Description

    The Data Engineer role within the Cloud Payments Reporting team is responsible for developing, deploying, and maintaining data solutions. Primary responsibilities include querying databases; maintaining data transformation pipelines through ETL tools and systems into data warehouses; and providing and supporting data visualizations, dashboards, and ad hoc and customer-deliverable reports.

    Responsibilities
    - Provide primary support for existing databases, automated jobs, and reports, including monitoring notifications, performing root cause analysis, communicating findings, and resolving issues.
    - Serve as subject matter expert for payments reports, databases, and processes.
    - Ensure data and report integrity and accuracy through thorough testing and validation.
    - Build analytical tools to utilize the data, providing actionable insight into key business performance metrics, including operational efficiency.
    - Implement, support, and optimize ETL pipelines, data aggregation processes, and reports using various tools and technologies.
    - Collaborate with operational leaders and teams to understand reporting and data usage across the business and provide efficient solutions.
    - Participate in recurring meetings with working groups and management teams to discuss operational improvements.
    - Work with stakeholders, including data, design, product, and executive teams, to support their data infrastructure needs while assisting with data-related technical issues.
    - Handle tasks on your own, adjust to new deadlines, and adapt to changing priorities.
    - Design, develop, and implement special projects based on business needs.
    - Perform other job-related duties and responsibilities as assigned.

    Qualifications
    - Five or more years of experience with Oracle, MySQL, Power BI / QuickSight, and Excel.
    - Thorough knowledge of SQL, relational databases, and data modeling principles.
    - Proficiency in programming languages such as PL/SQL, Bash, PowerShell, and Python.
    - Exceptional problem-solving, analytical, and critical thinking skills.
    - Excellent interpersonal and communication skills, with the ability to consult with stakeholders to facilitate requirements gathering, troubleshooting, and solution validation.
    - Detail-oriented, with the ability to understand the bigger picture.
    - Ability to communicate complex quantitative analysis clearly.
    - Strong organizational skills, including multi-tasking and teamwork.
    - Self-motivated, task-oriented, and an aptitude for complex problem solving.
    - Experience with AWS, Jenkins, and SnapLogic is a plus.
    - Experience with data streaming, API calls (SOAP and REST), database replication, and real-time processing is a plus.
    - Experience with Atlassian JIRA and Confluence is a plus.
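    One common form of the "testing and validation" responsibility in this role is reconciling a source extract against the warehouse load before a report ships. A hypothetical sketch (the record shape, column name, and tolerance are invented, not from the posting):

```python
# Hypothetical load-reconciliation check: compare row counts exactly and
# monetary totals within a relative tolerance between the source extract
# and the rows loaded into the warehouse.

def reconcile(source_rows, loaded_rows, amount_col="amount", tolerance=0.005):
    """Return (ok, detail) for a source-vs-load reconciliation."""
    if len(source_rows) != len(loaded_rows):
        return False, "row count mismatch"
    src_total = sum(r[amount_col] for r in source_rows)
    dst_total = sum(r[amount_col] for r in loaded_rows)
    if src_total == 0:
        return (dst_total == 0), "zero-total check"
    drift = abs(src_total - dst_total) / abs(src_total)
    return drift <= tolerance, f"total drift {drift:.4%}"

# Illustrative usage: identical extracts reconcile cleanly.
src = [{"amount": 10.0}, {"amount": 20.0}]
ok, detail = reconcile(src, list(src))
print(ok, detail)
```

    In a real payments pipeline this would run as an automated post-load job, with failures feeding the monitoring notifications the posting mentions.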
    $69k-84k yearly est. Auto-Apply 60d+ ago
  • Data Scientist - Retail Pricing

    Capitol Federal Savings Bank (4.4 company rating)

    Senior data scientist job in Overland Park, KS

    We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. This role sits at the intersection of analytics, finance, and product strategy, transforming data into pricing intelligence that supports smarter, faster business decisions. The Data Scientist will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing, this role will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities. Collaborating across product, finance, and executive teams, the Data Scientist will translate complex analytical findings into clear business recommendations that drive strategic action, and will also contribute to enhancing our analytics infrastructure, improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making.

    Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration

    CapFed is an equal opportunity employer.
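    The elasticity analysis this posting mentions is often estimated as the slope of a log-log regression of volume on price: a slope of -1.5 means a 1% price increase is associated with roughly a 1.5% volume decrease. A minimal sketch with made-up numbers (real bank pricing models would control for many more factors than price alone):

```python
import math

def log_log_elasticity(prices, volumes):
    """OLS slope of log(volume) on log(price): a point estimate of price elasticity."""
    xs = [math.log(p) for p in prices]
    ys = [math.log(v) for v in volumes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Illustrative data generated from volume = 100 * price^(-1.5),
# so the recovered elasticity should be exactly -1.5.
prices = [1.0, 1.1, 1.2, 1.3]
volumes = [100 * p ** -1.5 for p in prices]
print(round(log_log_elasticity(prices, volumes), 4))  # -1.5
```

    The same estimate feeds scenario testing: given an elasticity, one can project volume (and hence revenue) under candidate rate changes before committing to them.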
    $66k-82k yearly est. Auto-Apply 6d ago
  • Data Engineer

    PDS Inc., LLC (3.8 company rating)

    Senior data scientist job in Overland Park, KS

    The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions.

    ESSENTIAL DUTIES AND RESPONSIBILITIES

    Data Engineering & Integration
    - Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools.
    - Automate data imports, transformations, and loads from multiple sources (on-premises, SaaS, APIs, and cloud).
    - Optimize and monitor data workflows for reliability, performance, and cost efficiency.
    - Implement and maintain data quality, validation, and error-handling frameworks.

    Data Analysis & Reporting
    - Develop and maintain reporting databases, views, and semantic models for business intelligence solutions.
    - Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs.
    - Perform ad-hoc data exploration and statistical analysis to support business initiatives.

    Collaboration & Governance
    - Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements.
    - Maintain data integrity, enforce governance standards, and promote best practices in data stewardship.
    - Support data security and compliance initiatives in coordination with IT and business teams.

    Continuous Improvement
    - Stay current with emerging data technologies and analytics practices.
    - Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.

    QUALIFICATIONS

    Required:
    - Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database.
    - Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools.
    - Proficiency in building BI solutions using Power BI and/or SSRS.
    - Strong data modeling and relational database design skills.
    - Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections).
    - Ability to translate business goals into data requirements and technical solutions.
    - Excellent communication and collaboration skills.
    - Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

    Preferred:
    - Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks).
    - Familiarity with version control tools (Git, Azure DevOps) and Agile development practices.
    - Exposure to Python or PowerShell for data transformation or automation.
    - Experience integrating data from insurance or financial systems.

    Compensation: $120-129K. This position is hybrid, 3 days onsite, located in Overland Park, KS.

    We look forward to reviewing your application. We encourage everyone to apply - even if every box isn't checked for what you are looking for or what is required. PDSINC, LLC is an Equal Opportunity Employer.
    $120k-129k yearly 14d ago
  • Data Engineer III

    Spring Venture Group (3.9 company rating)

    Senior data scientist job in Kansas City, MO

    Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day. Job Description This person has the opportunity to work primarily remote in the Kansas City or surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area. We are unable to sponsor for this role, this includes international students. OVERVIEW The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high quality datasets that enable our business stakeholders and world-class Analytics department to make data informed decisions. Data engineers, combining Software Engineering and Database Engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance ensuring availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor. ESSENTIAL DUTIES The essential duties for this role include, but are not limited to: * Serve as a primary advisor to Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks. 
* Build advanced data pipelines utilizing the medallion architecture to create high quality single source of truth data sources in Snowflake * Architect replacements of current Data Management systems with respect to all aspects of data governance * Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores. * Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves. * Actively participate as a leader in regular team meetings, listening and ensuring that one is assisting others at every chance for growth and development. * Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores. * Take ownership (both individually and as part of a team) of services and applications * Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements * Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests * Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code * Work with Project Managers, Solution Architects, and Software Development teams to build solutions for Company Initiatives on time, on budget, and on value. * Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity. * Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business. * Ensure 99.95% uptime of our company's services monitoring data anomalies, batch failures, and our support chat for one week per team cycle from 8am-9pm. 
* Follow and embrace procedures of both the Data Management team and SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance. * Support after hours and weekend releases from our internal Software Development teams. * Actively participate in code review and weekly technicals with another more senior engineer or manager. * Assist departments with time-critical SQL execution and debug database performance problems. ROLE COMPETENCIES The competencies for this role include, but are not limited to: * Emotional Intelligence * Drive for Results * Continuous Improvement * Communication * Strategic Thinking * Teamwork and Collaboration Qualifications POSITION REQUIREMENTS The requirements to fulfill this position are as follows: * Bachelor's degree in Computer Science, or a related technical field. * 4-7 years of practical production work in Data Engineering. * Expertise of the Python programming language. * Expertise of Snowflake * Expertise of SQL, databases, & query optimization. * Must have experience in a large cloud provider such as AWS, Azure, GCP. * Advanced at reading code independently and understanding its intent. * Advanced at writing readable, modifiable code that solves business problems. * Ability to construct reliable and robust data pipelines to support both scheduled and event based workflows. * Working directly with stakeholders to create solutions. * Mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact. 
Additional Information

Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements:
* Competitive compensation
* Medical, dental, and vision benefits after a short waiting period
* 401(k) matching program
* Life insurance, and short-term and long-term disability insurance
* Optional enrollment includes HSA/FSA, AD&D, spousal/dependent life insurance, Travel Assist, and a legal plan
* Generous paid time off (PTO) program starting at 15 days in your first year
* 15 paid holidays (includes a holiday break between Christmas and New Year's)
* 10 days of paid parental leave and 5 days of paid birth recovery leave
* Annual Volunteer Time Off (VTO) and a donation matching program
* Employee Assistance Program (EAP) - health and well-being on and off the job
* Rewards and recognition
* Diverse, inclusive, and welcoming culture
* Training program and ongoing support throughout your Spring Venture Group career

Security Responsibilities:
* Operating in alignment with policies and standards
* Reporting security incidents
* Completing assigned training
* Protecting assigned organizational assets

Spring Venture Group is an Equal Opportunity Employer.
    $75k-98k yearly est. 36d ago
  • Senior Data Engineer

    Care It Services 4.3company rating

    Senior data scientist job in Overland Park, KS

    The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making. The role designs, develops, and maintains data pipelines, data warehouses, and other data-related infrastructure, and works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions.

    Key Responsibilities:
    * Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards, to drive business insights.
    * Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives.
    * Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness, including identifying bottlenecks, tuning queries, and scaling infrastructure as needed.
    * Automate data ingestion, processing, and validation tasks to ensure data quality and consistency.
    * Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations.
    * Contribute to the development of the organization's overall data strategy.
    * Conduct code reviews and contribute to the establishment of coding standards and best practices.

    Required Qualifications:
    * Bachelor's degree in a relevant field or equivalent professional experience.
    * 4-6 years of hands-on experience in data engineering.
    * Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB.
    * Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services.
    * Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift.
    * Programming skills in Python or JavaScript.
    * Proficiency with BI tools such as Sisense, Power BI, or Tableau.

    Preferred Qualifications:
    * Direct experience with Google Cloud Platform (GCP).
    * Knowledge of CI/CD pipelines, including tools like Docker and Terraform.
    * Background in the healthcare industry.
    * Familiarity with modern data integration tools such as DBT, Matillion, and Airbyte.

    Compensation: $125,000.00 per year

    Who We Are
    CARE ITS is a certified woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. CARE ITS was established in 2010. Since then, we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally, with our headquarters in Plainsboro, NJ, and focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
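The ingestion-validation responsibility listed above (automating validation tasks to ensure data quality and consistency) can be sketched as a simple batch check that splits a load into accepted rows and rejects with reasons. The record shape, field names, and rules here are hypothetical examples, not an actual CARE ITS schema.

```python
def validate_records(records, required_fields):
    """Split incoming records into valid rows and rejects with reasons."""
    valid, rejects = [], []
    for rec in records:
        # A field is considered missing if absent, None, or an empty string.
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        if missing:
            rejects.append({"record": rec, "reason": f"missing: {missing}"})
        else:
            valid.append(rec)
    return valid, rejects

# Hypothetical healthcare-flavored batch: one clean row, two rejects.
batch = [
    {"patient_id": "p1", "visit_date": "2024-01-05"},
    {"patient_id": "", "visit_date": "2024-01-06"},
    {"patient_id": "p3"},
]
valid, rejects = validate_records(batch, ["patient_id", "visit_date"])
```

In practice this kind of check would run inside the pipeline (e.g., as a DBT test or a pipeline step) with rejects routed to a quarantine table for review rather than dropped.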
    $125k yearly Auto-Apply 60d+ ago
  • Azure Data Engineer (Python/SQL) - 6013914

    Accenture 4.7company rating

    Senior data scientist job in Overland Park, KS

    Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and DiversityInc's Top 50 Companies for Diversity lists. As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for Accenture's clients, you will work with a highly skilled, diverse network of people across Accenture businesses who are using the latest emerging technologies to address today's biggest business challenges. You will receive competitive rewards and access to benefits programs and world-class learning resources. Accenture Flex employees work in their local metro area onsite at the project, significantly reducing or eliminating the demands of travel.

    Job Description: Join our dynamic team and embark on a journey where you will be empowered to work independently and become a subject matter expert (SME). Active participation and contribution in team discussions will be key as you provide solutions to work-related problems. Let's work together to achieve greatness!

    Responsibilities:
    + Create new data pipelines leveraging existing data ingestion frameworks and tools.
    + Orchestrate data pipelines using the Azure Data Factory service.
    + Develop and enhance data transformations based on requirements to parse, transform, and load data into the Enterprise Data Lake, Delta Lake, and enterprise DWH (Synapse Analytics).
    + Perform unit testing; coordinate integration testing and UAT.
    + Create HLD/DD/runbooks for the data pipelines.
    + Configure compute and DQ rules; perform maintenance, performance tuning, and optimization.

    Basic Qualifications:
    + Minimum of 3 years of work experience with one or more of the following: Databricks Data Engineering, DLT, Azure Data Factory, SQL, PySpark, Synapse Dedicated SQL Pool, Azure DevOps, Python

    Preferred Qualifications:
    + Azure Function Apps
    + Azure Logic Apps
    + Precisely & COSMOS DB
    + Advanced proficiency in PySpark.
    + Advanced proficiency in Microsoft Azure Databricks, Azure DevOps, Databricks Delta Live Tables, and Azure Data Factory.
    + Bachelor's or Associate's degree

    Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below. We accept applications on an ongoing basis and there is no fixed deadline to apply. Information on benefits is here. (************************************************************

    Role Location:
    California - $47.69 - $57.69
    Cleveland - $47.69 - $57.69
    Colorado - $47.69 - $57.69
    District of Columbia - $47.69 - $57.69
    Illinois - $47.69 - $57.69
    Minnesota - $47.69 - $57.69
    Maryland - $47.69 - $57.69
    Massachusetts - $47.69 - $57.69
    New York/New Jersey - $47.69 - $57.69
    Washington - $47.69 - $57.69

    Requesting an Accommodation
    Accenture is committed to providing equal employment opportunities for persons with disabilities or religious observances, including reasonable accommodation when needed.
If you are hired by Accenture and require accommodation to perform the essential functions of your role, you will be asked to participate in our reasonable accommodation process. Accommodations made to facilitate the recruiting process are not a guarantee of future or continued accommodations once hired. If you would like to be considered for employment opportunities with Accenture and have accommodation needs such as for a disability or religious observance, please call us toll free at **************** or send us an email or speak with your recruiter. Equal Employment Opportunity Statement We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities. For details, view a copy of the Accenture Equal Opportunity Statement (******************************************************************************************************************************************** Accenture is an EEO and Affirmative Action Employer of Veterans/Individuals with Disabilities. Accenture is committed to providing veteran employment opportunities to our service men and women. Other Employment Statements Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States. Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration. 
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process. Further, at Accenture a criminal conviction history is not an absolute bar to employment. The Company will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Additionally, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the Company's legal duty to furnish information. California requires additional notifications for applicants and employees. If you are a California resident, live in or plan to work from Los Angeles County upon being hired for this position, please click here for additional important information. Please read Accenture's Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
    $63k-83k yearly est. 2d ago
  • Data Engineer II

    27Global

    Senior data scientist job in Leawood, KS

    Full-time Description
    27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards. We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions. Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.

    Your Role:
    * Participate in the design and implementation of scalable, secure, and high-performance data architectures.
    * Develop and maintain conceptual, logical, and physical data models.
    * Work closely with architects to define standards for data integration, quality, and governance.
    * Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
    * Support cloud-based data strategies including data warehousing, pipelines, and real-time processing.
    * Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads.
    * Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling.
    * Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
    * Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.
    Requirements
    What You Bring:
    * BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or a related field.
    * 2-4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
    * 2-4 years of experience writing .NET code or other OOP languages in an Agile environment.
    * Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members.
    * Proficient technical skills in: Spark, Scala, C#, PySpark, Data Lake, Delta Lake, relational and NoSQL databases, AWS Glue, and Azure Synapse.
    * Experience with SQL, ETL/ELT, and data modeling.
    * Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with a data lake.
    * Knowledge of data governance, security, and compliance frameworks.
    * Ability to context switch and work on a variety of projects over specified periods of time.
    * Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
    * Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
    * Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.

    Ways to Stand Out:
    * Certifications: AWS Solution Architect, Azure Data Engineer, Databricks Data Engineer
    * Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
    * Hands-on experience with big data tools (Spark, Kafka).
    * Modern data warehouses (Snowflake, Redshift, BigQuery).
    * Familiarity with machine learning pipelines and real-time analytics.
    * Strong communication skills and ability to influence stakeholders.
    * Prior experience implementing enterprise data governance frameworks.
    * Experience in a client-facing role, working directly with clients at multiple levels of the organization; often presenting and documenting suggestions and improvements for client environments.

    Why 27G?:
    * Four-time award winner of Best Place to Work by the Kansas City Business Journal.
    * A casual and fun small-business work environment.
    * Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
    * Dedicated time for learning, development, research, and certifications.
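The data-preprocessing and feature-engineering work mentioned in the role description can be sketched in plain Python: turning a raw event into model-ready features. The field names and the specific features below are hypothetical illustrations; production work of this kind would more likely run in PySpark or pandas.

```python
from datetime import datetime

def engineer_features(event):
    """Derive simple model features from a raw event record."""
    ts = datetime.fromisoformat(event["timestamp"])
    return {
        "hour_of_day": ts.hour,                # captures time-of-day usage patterns
        "is_weekend": ts.weekday() >= 5,       # weekday(): Monday=0 ... Sunday=6
        "amount_digits": len(str(int(event["amount"]))),  # crude order-of-magnitude bucket
    }

# Hypothetical raw event; 2024-06-15 is a Saturday.
features = engineer_features({"timestamp": "2024-06-15T14:30:00", "amount": 1250.0})
```

The same per-record function could be applied at scale as a Spark UDF or a pandas `apply`, with real-time inference calling it on each incoming event before scoring.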
    $69k-92k yearly est. 60d+ ago
  • Principal Data Engineer

    Weavix

    Senior data scientist job in Lenexa, KS

    About the Role:
    weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission. You'll be the technical lead for everything data - building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently.

    What You'll Do:
    * Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity
    * Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable)
    * Enable data-driven decision-making across product, engineering, and business teams
    * Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling)
    * Ensure data quality, observability, governance, and security across all systems
    * Serve as the subject matter expert on data systems, operating as a senior IC without a team initially

    What You Bring:
    * 6+ years of experience in data engineering, ideally within a startup or high-growth environment
    * Proven ability to independently design, implement, and manage scalable data architectures
    * Deep experience working with large datasets, ideally from IoT sources or other high-volume systems
    * Proficiency with modern data tools and languages (e.g., TypeScript, Node.js, SQL)
    * Strong cloud experience, ideally with Microsoft Azure (but AWS or GCP also acceptable)
    * A business-focused mindset with the ability to connect technical work to strategic outcomes
    * Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms
    * Excellent communication and collaboration skills across technical and non-technical teams

    Bonus Points For:
    * Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.)
    * Familiarity with BI tools and self-service analytics platforms
    * Background in system performance monitoring and observability tools

    Why weavix
    Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers - the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest asset: people. It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself and your family, and invest in your future.

    Perks and Benefits
    * Competitive Compensation
    * Employee Equity Stock Program
    * Competitive Benefits Package including: Medical, Dental, Vision, Life, and Disability Insurance
    * 401(k) Retirement Plan + Company Match
    * Flexible Spending & Health Savings Accounts
    * Paid Holidays
    * Flexible Time Off
    * Employee Assistance Program (EAP)
    * Other exciting company benefits

    About Us
    weavix, the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms the enterprise by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results.
weavix is the single source of truth for both workers and executives. Our mission is simple: to connect every disconnected worker through disruptive technology. How do you want to make your impact? For more information about us, visit weavix.com. Equal Employment Opportunity (EEO) Statement weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment. Americans with Disabilities Act (ADA) Statement weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************. E-Verify Notice Notice: weavix participates in the E-Verify program to confirm employment eligibility as required by law.
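The large-scale IoT ingestion described in this role often reduces to windowed aggregation of device readings before they reach analytics. A minimal sketch, assuming a hypothetical `(device_id, epoch_seconds, value)` event shape (not an actual weavix schema):

```python
from collections import defaultdict

def window_events(events, window_seconds=60):
    """Group (device_id, epoch_seconds, value) readings into per-device,
    per-window averages using tumbling windows."""
    buckets = defaultdict(list)
    for device_id, ts, value in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        buckets[(device_id, window_start)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

# Hypothetical telemetry: two radios reporting signal readings.
readings = [
    ("radio-1", 0, 10.0),
    ("radio-1", 30, 20.0),
    ("radio-1", 61, 40.0),
    ("radio-2", 5, 7.0),
]
averages = window_events(readings)
```

A production system would do the same bucketing in a streaming engine (Kafka Streams, Azure Stream Analytics, etc.) rather than in memory, but the windowing logic is the same.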
    $69k-92k yearly est. Auto-Apply 21d ago
  • Senior Data Engineer

    Berkley 4.3company rating

    Senior data scientist job in Overland Park, KS

    Company Details
    Intrepid Direct Insurance (IDI) is a rapidly growing direct-to-consumer property and casualty insurance company. A member of the W. R. Berkley Corporation, a Fortune 500 company rated A+ (Superior) by A.M. Best, Intrepid Direct's vision is to make life better for business. The insurance industry has not evolved with innovation like other major industries. We're here to change that. We are making life better for our customers, shareholders, and our team members by leveraging data and technology as insurance experts for our targeted customers. You will be part of a highly collaborative team of talented and focused professionals. Join a group that enjoys working together, trusts each other, and takes pride in our hard-earned success. *************************** The Company is an equal employment opportunity employer.

    Responsibilities
    Intrepid Direct Insurance is looking for an experienced Senior Data Engineer to mentor, orchestrate, implement, and monitor the data flowing through our organization. This opportunity will have a direct influence on how data is made available to our business units, as well as our customers. You'll primarily be working with our operations and engineering teams to create and enhance data pipelines, conform and enrich data, and deliver information to business users. Learn the ins and outs of what we do so that you can focus on improving the availability and quality of the data we use to service our customers.

    Key functions include but are not limited to:
    * Assist with long-term strategic planning for modern data warehousing needs.
    * Contribute to data modeling exercises and the buildout of our data warehouse.
    * Monitor, support, and analyze existing pipelines and recommend performance and process improvements to address gaps in existing processes.
    * Automate manual processes owned by the data team.
    * Troubleshoot and remediate ingestion- and reporting-related issues.
    * Design and build new pipelines to ingest data from additional disparate sources.
    * Take responsibility for the accuracy and availability of data in our data warehouse.
    * Collaborate with a multi-disciplinary team to develop data-driven solutions that align with our business and technical needs.
    * Create and deploy reports as needed.
    * Assist with cataloging and classifying existing data sets.
    * Participate in peer reviews with an emphasis on continuous improvement.
    * Respond to regulatory requests for information.
    * Assume other tasks and duties as assigned by management.
    * Mentor team members and advise on best practices.

    Qualifications
    * Bachelor's degree in Mathematics, Statistics, Computer Science, or equivalent experience.
    * 6+ years of relevant data engineering experience.
    * Analytical thinker with experience working in a fast-paced, startup environment.
    * Technical expertise with Microsoft SQL Server.
    * Familiarity with ETL tools and concepts.
    * Hands-on experience with database design and data modeling, preferably with the Data Vault methodology.
    * Experience supporting and troubleshooting SSIS packages.
    * Experience consuming event-based data through APIs or queues.
    * Experience in Agile software development.
    * Experience with insurance data highly desired.
    * Detail oriented with solid organizational and problem-solving skills.
    * Strong written, visual, and verbal communication skills.
    * Team oriented with a strong willingness to serve others in an agile startup environment.
    * Flexible in assuming new responsibilities as they arise.
    * Experience with Power BI desired.

    Additional Company Details
    We do not accept unsolicited resumes from third-party recruiting agencies or firms. The actual salary for this position will be determined by a number of factors, including the scope, complexity and location of the role; the skills, education, training, credentials and experience of the candidate; and other conditions of employment.

    Sponsorship Details
    Sponsorship not offered for this role.
    $72k-98k yearly est. Auto-Apply 60d+ ago
  • Data Engineer

    Quest Analytics

    Senior data scientist job in Overland Park, KS

    At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data. The Data Engineer will run daily operations of the data infrastructure and automate and optimize our data operations and data pipeline architecture while ensuring active monitoring and troubleshooting. This hire will also support other engineers and analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. APPLY TODAY!

    What you'll do:
    * Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
    * Review project objectives and determine the best technology for implementation. Implement best-practice standards for development, build, and deployment automation.
    * Run daily operations of the data infrastructure and support other engineers and analysts on data investigations and operations.
    * Monitor and report on all data pipeline tasks while working with appropriate teams to take corrective action quickly in case of any issues.
    * Work with internal teams to understand current processes and areas for efficiency gains.
    * Write well-abstracted, reusable, and efficient code.
    * Participate in training and/or mentoring programs as assigned or required.
    * Adhere to the Quest Analytics values and support a positive company culture.
    * Respond to the needs and requests of clients and Quest Analytics management and staff in a professional and expedient manner.

    What it requires:
    * Bachelor's degree in computer science or a related field.
    * 3 years of work experience with ETL, data operations, and troubleshooting, preferably in healthcare data.
    * Proficiency with Azure ecosystems, specifically Azure Data Factory and ADLS.
    * Strong proficiency in Python for scripting, automation, and data processing.
    * Advanced SQL skills for query optimization and data manipulation.
    * Experience with distributed data pipeline tools like Apache Spark, Databricks, etc.
    * Working knowledge of database modeling (schema design and data governance best practices).
    * Working knowledge of libraries like pandas, NumPy, etc.
    * Self-motivated and able to work in a fast-paced, deadline-oriented environment.
    * Excellent troubleshooting, listening, and problem-solving skills.
    * Proven ability to solve complex issues.
    * Customer focused.

    What you'll appreciate:
    * Workplace flexibility - you choose between remote, hybrid, or in-office
    * Company-paid employee medical, dental, and vision
    * Competitive salary and success-sharing bonus
    * Flexible vacation with no cap, plus sick time and holidays
    * An entrepreneurial culture that won't limit you to a job description
    * Being listened to, valued, appreciated - and having your contributions rewarded
    * Enjoying your work each day with a great group of people

    Apply TODAY! careers.questanalytics.com

    About Quest Analytics
    For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. Visa sponsorship is not available at this time.
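The pipeline-monitoring duty above (taking corrective action quickly when tasks fail) is commonly implemented as retry-with-backoff before escalating to an alert. A minimal sketch with hypothetical task and function names:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run `task`; on failure, back off exponentially and retry.
    Re-raises the final failure so it can surface as an alert."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # out of retries: let monitoring/alerting take over
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky ingestion task: fails twice, then succeeds.
calls = {"n": 0}

def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

result = run_with_retries(flaky_ingest)
```

Orchestrators such as Azure Data Factory provide retry policies like this as built-in activity settings; the sketch just shows the underlying pattern.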
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.

Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation, gender identity or expression, or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discrimination or harassment. Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence, [email protected].

NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not working with additional outside agencies at this time. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable.

We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
    $69k-92k yearly est. 47d ago
  • Senior Data Engineer

    Velocity Staff

    Senior data scientist job in Overland Park, KS

Velocity Staff, Inc. is working with our client located in the Overland Park, KS area to identify a senior-level Data Engineer to join their Data Services Team. The right candidate will utilize their expertise in data warehousing, data pipeline creation/support, and analytical reporting, and will be responsible for gathering and analyzing data from several internal and external sources and designing a cloud-focused data platform for analytics and business intelligence that reliably provides data to our analysts. This role requires a significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams.

Responsibilities
Work with data architects to understand current data models and to build pipelines for data ingestion and transformation.
Design, build, and maintain a framework for pipeline observation and monitoring, focusing on the reliability and performance of jobs.
Surface data integration errors to the proper teams, ensuring timely processing of new data.
Provide technical consultation for other team members on best practices for automation, monitoring, and deployments.
Provide technical consultation for the team with "infrastructure as code" best practices: building deployment processes utilizing technologies such as Terraform or AWS CloudFormation.

Qualifications
Bachelor's degree in computer science, data science, or a related technical field, or equivalent practical experience
Proven experience with relational and NoSQL databases (e.g., Postgres, Redshift, MongoDB, Elasticsearch)
Experience building and maintaining AWS-based data pipelines; technologies currently utilized include AWS Lambda, Docker / ECS, and MSK
Mid/senior-level development utilizing Python (Pandas/NumPy, Boto3, SimpleSalesforce)
Experience with version control (git) and peer code reviews
Enthusiasm for working directly with customer teams (business units and internal IT)

Preferred but not required qualifications include:
Experience with data processing and analytics using AWS Glue or Apache Spark
Hands-on experience building data-lake-style infrastructures using streaming data set technologies (particularly Apache Kafka)
Experience with data processing using Parquet and Avro
Experience developing, maintaining, and deploying Python packages
Experience with Kafka and the Kafka Connect ecosystem
Familiarity with data visualization techniques using tools such as Grafana, Power BI, AWS QuickSight, and Excel

Not ready to apply? Connect with us to learn about future opportunities.
    $69k-92k yearly est. Auto-Apply 16d ago
  • Data Engineer, Mid-Level (Leawood, KS & Arlington, VA)

Torch.AI

    Senior data scientist job in Leawood, KS

    Become Part of a Meaningful Mission Torch.AI is a defense-focused AI-software company on a mission to become the leading provider of critical data infrastructure for U.S. Defense and National Security. We deliver advanced AI and data software capabilities directly to customer mission owners to meet flexible, user-defined specifications and enable a decision advantage for the warfighter. We're passionate about solving complex problems that improve national security, support our warfighters, and protect our nation. Join us in our mission to help organizations Unlock Human Potential. The U.S. defense and national security industry offers an unparalleled opportunity to contribute to the safety and well-being of the nation while engaging with cutting-edge technologies. As a vital sector that shapes global stability, it offers a dynamic environment to tackle complex challenges across multidisciplinary domains. With substantial investment in innovation, the industry is at the forefront of developing AI, autonomous systems, and advanced national security solutions, each founded on the premise that information is the new battlefield. If this type of work is of interest, we'd love to hear from you. The Environment: Unlock Your Potential As a Data Engineer at Torch.AI, you will be at the forefront of building software that scales across Torch.AI's platform capabilities. Your software will be deployed across an array of operational and research & development efforts for mission-critical customer programs and projects. Each of our customers requires unique technical solutions to enable an asymmetric advantage on the battlefield. Torch.AI's patented software helps remove common obstacles such as manual-intensive data processing, parsing, and analysis, thereby reducing cognitive burden for the warfighter. Our end-to-end data processing, orchestration, and fusion platform supports a wide variety of military use cases, domains, operations, and echelons. 
Customers enjoy enterprise-grade capabilities that meet specialized needs. Torch.AI encourages company-wide collaboration to share context, skills, and expertise across a variety of tools, technologies, and development practices. You'll work autonomously while driving coordinated, collaborative decisions across cross-functional teams composed of defense and national security experts, veterans, business leaders, and experienced software engineers. Your code will advance back-end data orchestration and graph-compute capabilities to deliver elegant data and intelligence products. You will have the opportunity to harden and scale existing platform capabilities, tools, and technologies, while also working to innovate and introduce new iterative capabilities and features that benefit our company and customers. Successful candidates thrive in a fast-paced, entrepreneurial, and mission-driven environment. We hire brilliant patriots. You'll be encouraged to think creatively, challenge conventional thinking, and identify alternative approaches for delivering value to customers across complex problem sets. Your day-to-day will vary, adapting to the requirements of our customers and the technical needs of respective use cases. One day, you may be supporting the development of a new proof-of-capability concept for a new customer program; another day, you may be focused on optimizing system performance to help scale a production deployment; the next, you may be working directly with customers to understand their requirements with deep intellectual curiosity. Our flat operating model puts every employee at the forefront of our customers' missions. We value customer intimacy, unique perspectives, and dedication to delivering lasting impact and results. You'll have the opportunity to work on the frontlines of major customer programs and influence lasting success for Torch.AI and your teammates.
You'll have the opportunity to gain experience across a wide range of projects and tasks, from designing and demonstrating early capabilities and prototypes to deploying large-scale mission systems. You'll contribute directly to Torch.AI's continued position as a market leader for data infrastructure AI and compete against multi-billion-dollar incumbents and high-tech AI companies.

Responsibilities
Design, build, and maintain scalable data pipelines using tools like Apache NiFi, Airflow, or equivalent orchestration systems.
Work with structured and semi-structured data using SQL and NoSQL systems (e.g., PostgreSQL, MongoDB, Elasticsearch, Neo4j).
Develop services and integrations using Java (primary) and optionally Python for ETL workflows and data transformation.
Integrate data from internal and external REST APIs; handle data format translation (e.g., JSON, Parquet, Avro).
Optimize data flows for reliability and performance, and support large-scale batch and streaming data jobs.
Implement and document ETL mappings, schemas, and transformation logic aligned with mission use cases.
Collaborate with software, DevOps, and AI teams to support downstream data science and ML workflows.
Use Git-based workflows and participate in CI/CD processes for data infrastructure deployments.
Contribute to application specifications, data quality checks, and internal documentation.

What We Value
B.S. degree in Computer Science, Technology, Engineering, or a relevant field.
4-6 years of experience in data engineering, backend software engineering, or data integration roles.
Strong experience with Java development in data pipeline or ETL contexts; Python is a plus.
Proficiency with SQL and NoSQL databases, including query optimization and large dataset processing.
Familiarity with data integration tools such as Apache NiFi, Airflow, or comparable platforms.
Knowledge of RESTful API interactions, JSON parsing, and schema transformations.
Exposure to cloud environments (especially AWS: S3, EC2, Lambda) and distributed systems.
Comfortable with Git-based version control and Agile team practices.
Industry experience, preferably within the defense industry and/or intelligence community or related sectors, a plus.
Capability to work collaboratively in interdisciplinary teams.
Awareness of ethical considerations and responsible AI practices.
Excellent problem-solving skills, attention to detail, and ability to thrive in a fast-paced, collaborative environment.
Experience with data messaging and streaming technologies (e.g., Kafka) (nice to have, not required).
Understanding of IAM/security concepts in data environments (e.g., role-based access, encryption) (nice to have, not required).
Exposure to data modeling, time-series data analysis, or graph databases (e.g., Neo4j) (nice to have, not required).
Familiarity with Spark or other distributed processing frameworks (nice to have, not required).

Security Clearance
We are hiring for multiple positions for each role. Some roles require a Secret, Top Secret, or Top Secret/SCI security clearance on Day 1. If you do not currently hold a clearance but are interested in this role and believe you are eligible for a clearance, we still encourage you to submit an application.

Work Location
We are hiring for roles at our headquarters in Leawood, KS and remotely in the Arlington, VA region. Candidates in the Arlington, VA region may work remotely while not on customer site. Candidates in the Leawood, KS region may require some limited travel to customer sites.

Incentives
Equity: All employees are eligible to participate in the company equity incentive program within their first 12 months of employment. We are proud that 100% of our employees are equity-owning partners at Torch.AI.
Competitive salary and annual performance bonus opportunities.
Unlimited PTO.
11 paid holidays each year.
Incredible professional development and learning opportunities in a fast-paced, high-tech environment and exciting industry.
Weekly in-office catering in our Leawood HQ.

Benefits
Torch.AI values employee well-being and, in turn, offers exceptional benefits options that greatly exceed regional and national averages for similar companies.

401k Plan
Torch.AI offers a 401k plan through John Hancock. While the company does not offer employee matching, we offer 3% profit sharing for all employees who elect to participate in the 401k plan. Profit sharing is calculated based on company performance at the end of each calendar year and distributed to 401k accounts at the start of each calendar year.

Medical
Three medical options: PPO, HSA, and TRICARE. Torch.AI's HSA contribution is 250%-350% higher than average employer contributions in the Kansas City and Arlington regions. Only ~18% of employers offer TRICARE Supplement plans.

Spending Accounts
Above-market employer funding and flexibility.
HSA: Triple-tax advantage
FSA: $50-$3,300 annual contribution, $660 rollover
Dependent Care FSA: $100-$5,000, pre-tax savings on child/dependent care

Dental
High Plan annual maximum is ~2.6x higher than the national average.
High Renaissance Plan: $5,000 annual max, 50% ortho up to $1,000.
Low Renaissance Plan: $1,000 annual max, no ortho.

Vision
Frame allowance is 25-35% higher than typical employer-sponsored plans.
Vision through Renaissance with VSP Choice network: $0 exams, lenses covered in full, and $180 frame allowance.

Life Insurance
Employer-paid 1x base salary and additional voluntary options for employees and spouses, compared to most employers who only cover $50k basic life on average.
Disability & Illness
Torch.AI ranks in the top 10% of regional employers for disability benefits.
Short-Term Disability (employer paid): 60% income, up to $2,000/week
Long-Term Disability (employer paid): 60% income, up to $5,000/month

Voluntary Benefits
Robust voluntary plans offer direct cash payout flexibility and wellness incentives.
Accidental Insurance
Critical Illness
Hospital Indemnity
Commuter Benefits: up to $300/month tax-free for transit/parking

Torch.AI is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, protected veteran status, or status as an individual with a disability. These positions are being reviewed and filled on a rolling basis, and multiple openings may be available for each role.

JOB CODE: 1000108
    $50k yearly 60d+ ago
  • Principal Data Scientist

    Maximus 4.3company rating

    Senior data scientist job in Kansas City, KS

Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator, supporting both the enterprise and our clients. We are looking for an accomplished, hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work). This position requires occasional travel to the DC area for client meetings.

Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries.
- Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.

Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML, and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned, and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs, and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.

Minimum Requirements
- Bachelor's degree in a related field required.
- 10-12 years of relevant professional experience required.

Job-Specific Minimum Requirements:
- 10+ years of relevant software development + AI/ML/DS experience.
- Professional programming experience (e.g., Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with statistics.
- Experience with classical machine learning.
- Experience working as a contributor on a team.

Preferred Skills and Qualifications:
- Master's or B.S. in a quantitative discipline (e.g., Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and machine learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and advanced analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
- Ability to build reference implementations of operational AI & advanced analytics processing solutions.

Background Investigations:
- IRS MBI - Eligibility

#techjobs #VeteransPage

EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected characteristics.

Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards.
Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays, and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.

Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.

Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
    $64k-89k yearly est. Easy Apply 7d ago
  • Data Engineer III

    Spring Venture Group 3.9company rating

    Senior data scientist job in Kansas City, MO

Who We Are:
Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement, we help thousands of seniors across the country navigate the complex world of Medicare every day.

Job Description
This person has the opportunity to work primarily remote in the Kansas City or surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area. We are unable to sponsor for this role; this includes international students.

OVERVIEW
The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high-quality datasets that enable our business stakeholders and world-class Analytics department to make data-informed decisions. Data engineers, combining software engineering and database engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance, ensuring availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor.

ESSENTIAL DUTIES
The essential duties for this role include, but are not limited to:
Serve as a primary advisor to the Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks.
Build advanced data pipelines utilizing the medallion architecture to create high-quality, single-source-of-truth data sources in Snowflake.
Architect replacements of current Data Management systems with respect to all aspects of data governance.
Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores.
Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves.
Actively participate as a leader in regular team meetings, listening and ensuring that one is assisting others at every chance for growth and development.
Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores.
Take ownership (both individually and as part of a team) of services and applications.
Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements.
Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests.
Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
Work with Project Managers, Solution Architects, and Software Development teams to build solutions for company initiatives on time, on budget, and on value.
Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity.
Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business.
Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle from 8am-9pm.
Follow and embrace procedures of both the Data Management team and the SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance.
Support after-hours and weekend releases from our internal Software Development teams.
Actively participate in code review and weekly technicals with another more senior engineer or manager.
Assist departments with time-critical SQL execution and debug database performance problems.

ROLE COMPETENCIES
The competencies for this role include, but are not limited to:
Emotional Intelligence
Drive for Results
Continuous Improvement
Communication
Strategic Thinking
Teamwork and Collaboration

Qualifications
POSITION REQUIREMENTS
The requirements to fulfill this position are as follows:
Bachelor's degree in Computer Science or a related technical field.
4-7 years of practical production work in Data Engineering.
Expertise in the Python programming language.
Expertise in Snowflake.
Expertise in SQL, databases, and query optimization.
Experience with a large cloud provider such as AWS, Azure, or GCP.
Advanced at reading code independently and understanding its intent.
Advanced at writing readable, modifiable code that solves business problems.
Ability to construct reliable and robust data pipelines to support both scheduled and event-based workflows.
Experience working directly with stakeholders to create solutions.
Experience mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact.
Additional Information
Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements:
Competitive compensation
Medical, dental, and vision benefits after a short waiting period
401(k) matching program
Life insurance, and short-term and long-term disability insurance
Optional enrollment includes HSA/FSA, AD&D, spousal/dependent life insurance, Travel Assist, and Legal Plan
Generous paid time off (PTO) program starting at 15 days your first year
15 paid holidays (includes holiday break between Christmas and New Year's)
10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave
Annual Volunteer Time Off (VTO) and a donation matching program
Employee Assistance Program (EAP) - health and well-being on and off the job
Rewards and recognition
Diverse, inclusive, and welcoming culture
Training program and ongoing support throughout your Spring Venture Group career

Security Responsibilities:
Operating in alignment with policies and standards
Reporting security incidents
Completing assigned training
Protecting assigned organizational assets

Spring Venture Group is an Equal Opportunity Employer
    $75k-98k yearly est. 3h ago
  • Data Engineer II

    27Global

    Senior data scientist job in Leawood, KS

Job Description:
27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards. We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions. Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.

Your Role:
Participate in the design and implementation of scalable, secure, and high-performance data architectures.
Develop and maintain conceptual, logical, and physical data models.
Work closely with architects to define standards for data integration, quality, and governance.
Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
Support cloud-based data strategies, including data warehousing, pipelines, and real-time processing.
Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads.
Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling.
Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.
Requirements:
What You Bring:
BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or a related field.
2-4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
2-4 years of experience writing .NET code or other OOP languages in an Agile environment.
Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members.
Proficient technical skills in: Spark, Scala, C#, PySpark, Data Lake, Delta Lake, relational and NoSQL databases, AWS Glue, and Azure Synapse.
Experience with SQL, ETL/ELT, and data modeling.
Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with a data lake.
Knowledge of data governance, security, and compliance frameworks.
Ability to context switch and work on a variety of projects over specified periods of time.
Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.

Ways to Stand Out:
Certifications: AWS Solutions Architect, Azure Data Engineer, Databricks Data Engineer
Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
Hands-on experience with big data tools (Spark, Kafka).
Modern data warehouses (Snowflake, Redshift, BigQuery).
Familiarity with machine learning pipelines and real-time analytics.
Strong communication skills and ability to influence stakeholders.
Prior experience implementing enterprise data governance frameworks.
- Experience in a client-facing role, working directly with clients at multiple levels of the organization, often presenting and documenting suggestions and improvements to the client environment.

Why 27G?
- Four-time award winner of Best Place to Work by the Kansas City Business Journal.
- A casual and fun small-business work environment.
- Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
- Dedicated time for learning, development, research, and certifications.
    $69k-92k yearly est. 29d ago
  • Sr. Data Engineer

    Quest Analytics

    Senior data scientist job in Overland Park, KS

    At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Senior Data Engineer with experience building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data. The Senior Data Engineer will help modernize and scale our data environment, playing a key role in transforming existing manual workflows into automated, cloud-based pipelines using Azure Data Factory, Databricks, and modern data platforms. If you are looking for a high-impact opportunity to shape how data flows across the business, APPLY TODAY!

What you'll do:
- Identify, design, and implement internal process improvements (e.g., automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability).
- Transform manual SQL/SSMS/stored-procedure workflows into automated pipelines using Azure Data Factory.
- Write clean, reusable, and efficient code in Python (and optionally C# or Scala).
- Leverage distributed data tools such as Spark and Databricks for large-scale processing.
- Review project objectives to determine and implement the most suitable technologies.
- Apply best-practice standards for development, build, and deployment automation.
- Manage day-to-day operations of the data infrastructure and support engineers and analysts with data investigations.
- Monitor and report on data pipeline tasks, collaborating with teams to resolve issues quickly.
- Partner with internal teams to analyze current processes and identify efficiency opportunities.
- Participate in training and mentoring programs as assigned or required.
- Uphold Quest Analytics values and contribute to a positive company culture.
- Respond professionally and promptly to client and internal requests.
- Perform other duties as assigned.
What it requires:
- Bachelor's degree in Computer Science or equivalent education/experience.
- 3-5 years of experience with ETL, data operations, and troubleshooting, preferably with healthcare data.
- Strong SQL development skills (SSMS, stored procedures, and optimization).
- Proficiency in Python, C#, or Scala (experience with pandas and NumPy is a plus).
- Solid understanding of the Azure ecosystem, especially Azure Data Factory and Azure Data Lake Storage (ADLS).
- Hands-on experience with Azure Data Factory and ADLS.
- Familiarity with Spark, Databricks, and data modeling techniques.
- Experience working with both relational databases (e.g., SQL Server) and NoSQL databases (e.g., MongoDB).
- Self-motivated, a strong problem-solver, and thrives in fast-paced environments.
- Excellent troubleshooting, listening, and analytical skills.
- Customer-focused mindset with a collaborative, team-oriented approach.

We are not currently engaging with outside agencies on this role. Visa sponsorship is not available at this time.

What you'll appreciate:
- Workplace flexibility - you choose between remote, hybrid, or in-office
- Company-paid employee medical, dental, and vision
- Competitive salary and success-sharing bonus
- Flexible vacation with no cap, plus sick time and holidays
- An entrepreneurial culture that won't limit you to a job description
- Being listened to, valued, appreciated, and having your contributions rewarded
- Enjoying your work each day with a great group of people

Apply TODAY! careers.questanalytics.com

About Quest Analytics: For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here.
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.

Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation, gender identity or expression, or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discrimination or harassment. Applicants must be legally authorized to work in the United States; verification of employment eligibility will be required at the time of hire. Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence, *********************

NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with outside agencies. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable.
    $69k-92k yearly est. 60d+ ago

Learn more about senior data scientist jobs

How much does a senior data scientist earn in Independence, MO?

Senior data scientists in Independence, MO typically earn between $66,000 and $121,000 annually. This compares to the national senior data scientist range of $90,000 to $170,000.

Average senior data scientist salary in Independence, MO

$89,000