
Data engineer jobs in Lawrence, KS - 620 jobs

  • Data Scientist, Analytics (Technical Leadership)

    Meta 4.8 company rating

    Data engineer job in Topeka, KS

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

    **Required Skills:** Data Scientist, Analytics (Technical Leadership)

    **Responsibilities:**
    1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
    2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
    3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
    4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
    5. Build long-term vision and strategy for programs and products
    6. Collaborate with executives to define and develop data platforms and instrumentation
    7. Effectively communicate insights and recommendations to stakeholders
    8. Define success metrics, forecast changes, and set team goals
    9. Support developing roadmaps and coordinate analytics efforts across teams

    **Minimum Qualifications:**
    1. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g., Mathematics, Statistics, Operations Research), or equivalent practical experience
    2. 5+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab)
    3. 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization, and strategy development
    4. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics, and Finance
    5. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
    6. Experience communicating complex technical topics in a clear, precise, and actionable manner

    **Preferred Qualifications:**
    1. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
    2. Master's or Ph.D. degree in a quantitative field
    3. Bachelor's degree in an analytical or scientific field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research)
    4. 10+ years of experience doing complex quantitative analysis in product analytics

    **Public Compensation:** $210,000/year to $281,000/year + bonus + equity + benefits

    **Industry:** Internet

    **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer.
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
    $210k-281k yearly 60d+ ago
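The Meta listing above emphasizes experimentation and causal inference. As a purely illustrative sketch (not Meta's tooling), the snippet below runs a two-sample proportion z-test of the kind used to read out a simple A/B experiment; the metric and all counts are invented.

```python
# Illustrative only: a two-sample proportion z-test for an A/B readout.
# The conversion counts and exposure counts below are invented.
from statsmodels.stats.proportion import proportions_ztest

conversions = [1320, 1415]   # control, treatment successes (hypothetical)
exposures = [24000, 24100]   # control, treatment sample sizes (hypothetical)

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the treatment's rate differs from control;
# in practice you would also examine effect size, power, and guardrail metrics.
```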

  • Data Scientist, Privacy

    Datavant

    Data engineer job in Topeka, KS

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care. By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.

    As part of the Privacy Science team within Privacy Hub you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.

    **You Will:**
    + Critically analyze large health datasets using standard and bespoke software libraries
    + Discuss your findings and progress with internal and external stakeholders
    + Produce high quality reports which summarise your findings
    + Contribute to research activities as we explore novel and established sources of re-identification risk

    **What You Will Bring to the Table:**
    + Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
    + A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
    + Seeks to understand real-world data in context rather than consider it in abstraction
    + Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in its language
    + Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
    + Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
    + Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
    + An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
    + Familiarity with Amazon Web Services cloud-based storage and computing facilities

    **Bonus Points If You Have:**
    + Experience creating documents using LaTeX
    + Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
    + Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.

    #LI-BC1

    We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.

    At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $104,000-$130,000 USD.

    To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship.

    Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement and the Know Your Rights notice, and explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.

    At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.

    Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request. Requests for reasonable accommodations will be reviewed on a case-by-case basis.

    For more information about how we collect and use your data, please review our Privacy Policy.
    $104k-130k yearly 22d ago
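The Datavant role above centers on re-identification risk in de-identified health data. As an illustrative sketch only (not Datavant's methodology), the pandas snippet below counts quasi-identifier equivalence classes, a standard first look at k-anonymity; the file and column names are hypothetical.

```python
# Illustrative only: a first-pass k-anonymity check on a de-identified extract.
# The file name and quasi-identifier columns are hypothetical.
import pandas as pd

df = pd.read_csv("deidentified_claims.csv")
quasi_identifiers = ["birth_year", "zip3", "sex"]

# Size of each equivalence class: records sharing the same quasi-identifier combination.
class_sizes = df.groupby(quasi_identifiers).size()

k_min = class_sizes.min()                # smallest class drives worst-case risk
small_classes = (class_sizes < 5).sum()  # classes that fail a k >= 5 threshold
print(f"minimum k: {k_min}; classes with k < 5: {small_classes}")
```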
  • AWS Data Migration Consultant

    Slalom 4.6 company rating

    Data engineer job in Kansas City, MO

    Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment. Who You'll Work With As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments. What You'll Do * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters). * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools. * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques. * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud. * Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS. * Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards. * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK. * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools. * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms. What You'll Bring * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2. * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2. * Hands-on experience with AWS database services (RDS, EC2-hosted databases). * Strong understanding of HA/DR solutions and cloud database design patterns. * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions. * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity. * Strong troubleshooting and analytical skills to resolve complex database and performance issues. * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders. Nice to Have * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional. 
* Experience with NoSQL databases or hybrid data architectures. * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau). * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate). * Experience with DB2 on-premise or cloud-hosted environments. About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay range in the following locations: Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, New Jersey, for Consultant level is $105,000-147,000 and for Senior Consultant level it is $120,000-$169,000 and for Principal level it is $133,000-$187,000. In all other markets, the target base salary pay range for Consultant level is $96,000-$135,000 and for Senior Consultant level it is $110,000-$155,000 and for Principal level it is $122,000-$172,000. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 11d ago
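The Slalom listing above revolves around AWS DMS-driven database migrations. The sketch below shows, in rough outline and not as Slalom's process, how a full-load-plus-CDC replication task might be started with boto3; every ARN, identifier, and schema name is a placeholder.

```python
# Illustrative only: starting a full-load + CDC migration task with AWS DMS.
# All ARNs, identifiers, and the schema name are placeholders.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-sales-schema",
        "object-locator": {"schema-name": "SALES", "table-name": "%"},
        "rule-action": "include",
    }]
}

response = dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-rds-sales",
    SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SOURCE",
    TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INSTANCE",
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(table_mappings),
)
print(response["ReplicationTask"]["Status"])
```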
  • Jr Data Scientist - AI

    Spring Venture Group 3.9 company rating

    Data engineer job in Kansas City, MO

    Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day.

    Job Description Overview: At SVG, we are a leader in driving business solutions using analytics and data. The Data Science team operates at the very heart of this culture, constantly investigating how machine learning can optimize business, automate processes, improve KPIs, and forecast trends. The Data Science team is ideal for those who like to be where business and science intersect. They collaborate with stakeholders across SVG to identify business problems, and then apply data science skills to optimize objectives and directly impact key performance indicators.

    About the role: As a Junior Data Scientist on the DS team you will receive exposure to all facets of data science at SVG and be given self-development and skills training opportunities. You will work as part of a team to solve business-related problems using data-driven techniques. You will help transform our data into tangible business value by performing statistical analysis, contributing to production services, and communicating the outcomes across business verticals.

    * This is a hybrid role 2-3 days a week in our downtown office, so you must currently be in Kansas City. We are unable to sponsor for this role, so we cannot consider candidates on EAD/OPT. Also, no third parties.

    Responsibilities: Learn about and improve existing AI and machine learning systems, or build new ones altogether. Train and deploy great models, then create tangible value with them. Develop lead engagement strategies that utilize model outputs. Train a close rate model and apply it to customer outreach. Derive actionable insights from transcripts using LLMs. Test a hypothesis and utilize the data to drive the way forward. Run thorough A/B tests that show and validate measurable improvements. Conduct analyses of business topics and present actionable insights to team leadership. Build Tableau dashboards to investigate business questions and analyze ML systems. Begin writing production-level machine learning and engineering code. Create automated solutions that solve operational pain points and reduce manual work. Develop understanding of and utilize our tech stack - including MySQL, Python, AWS (esp. SageMaker), Git, FastAPI, and LangChain. Share ownership of production workflows, in collaboration with senior team members. Demonstrate consistent initiative and follow-through. Work through every step of the Machine Learning Lifecycle; take a problem definition all the way through to a deployed and tracked production model. Build trust throughout the organization. Perform other duties and responsibilities as assigned.

    Qualifications: Bachelor's degree in an analytical or technical field, or 2+ years of relevant work experience. Experience analyzing data and using it to drive decisions. Demonstrated problem-solving skills. Passionate about mathematical and statistical applications in business. Strong sense of accountability and integrity with exceptional written and verbal communication skills. Adaptable, acting with appropriate urgency when needed. General understanding of SQL and relational databases. Fluency in at least one programming language, preferably Python. Familiarity with basic machine learning concepts. Interest in AI and LLMs for engineering and analysis tasks. A strong work ethic combined with self-awareness and a commitment to continuous improvement. Demonstrated ability to receive and incorporate feedback effectively.

    Preferred: Graduate coursework in Data Science, Statistics, or similar. Experience using version control systems and performing code reviews. Familiarity with owning 24/7 operational systems. Experience in a highly productive work environment where ambiguity is the norm.

    Additional Information - Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements:
    * Competitive compensation
    * Medical, dental and vision benefits after a short waiting period
    * 401(k) matching program
    * Life insurance, and short-term and long-term disability insurance
    * Optional enrollment includes HSA/FSA, AD&D, spousal/dependent life insurance, Travel Assist and Legal Plan
    * Generous paid time off (PTO) program starting at 15 days your first year
    * 15 paid holidays (includes holiday break between Christmas and New Year's)
    * 10 days of paid parental leave and 5 days of paid birth recovery leave
    * Annual Volunteer Time Off (VTO) and a donation matching program
    * Employee Assistance Program (EAP) - health and well-being on and off the job
    * Rewards and recognition
    * Diverse, inclusive and welcoming culture
    * Training program and ongoing support throughout your Spring Venture Group career

    Security Responsibilities: Operating in alignment with policies and standards. Reporting security incidents. Completing assigned training. Protecting assigned organizational assets.

    Spring Venture Group is an Equal Opportunity Employer
    $68k-87k yearly est. 1d ago
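The Spring Venture Group posting above mentions training a close rate model and applying it to customer outreach. A minimal, hypothetical sketch of that kind of propensity model with scikit-learn follows; the CSV, feature names, and label column are invented for illustration.

```python
# Illustrative only: a minimal close-rate (propensity) model with scikit-learn.
# The CSV, feature names, and "closed" label are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

leads = pd.read_csv("leads.csv")
features = ["age", "num_prior_calls", "inbound"]

X_train, X_test, y_train, y_test = train_test_split(
    leads[features], leads["closed"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]   # predicted close probability per lead
print("holdout AUC:", roc_auc_score(y_test, scores))
# Ranked scores like these could prioritize which leads get outreach first.
```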
  • Principal Data Scientist

    Maximus 4.3 company rating

    Data engineer job in Kansas City, MO

    Description & Requirements: We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work). This position requires occasional travel to the DC area for client meetings. U.S. citizenship is required for this position due to government contract requirements.

    Essential Duties and Responsibilities:
    - Make deep dives into the data, pulling out objective insights for business leaders.
    - Initiate, craft, and lead advanced analyses of operational data.
    - Provide a strong voice for the importance of data-driven decision making.
    - Provide expertise to others in data wrangling and analysis.
    - Convert complex data into visually appealing presentations.
    - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
    - Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
    - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
    - Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
    - Guide operational partners on product performance and solution improvement/maturity options.
    - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
    - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
    - Mentor more junior data analysts/data scientists as needed.
    - Apply a strategic approach to lead projects from start to finish.

    Job-Specific Minimum Requirements:
    - Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
    - Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
    - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
    - Contribute to the development of mathematically rigorous process improvement procedures.
    - Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements - Bachelor's degree in related field required. - 10-12 years of relevant professional experience required. Job-Specific Minimum Requirements: - 10+ years of relevant Software Development + AI / ML / DS experience. - Professional Programming experience (e.g. Python, R, etc.). - Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML. - Experience with API programming. - Experience with Linux. - Experience with Statistics. - Experience with Classical Machine Learning. - Experience working as a contributor on a team. Preferred Skills and Qualifications: - Masters or BS in quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.). - Experience developing machine learning or signal processing algorithms: - Ability to leverage mathematical principles to model new and novel behaviors. - Ability to leverage statistics to identify true signals from noise or clutter. - Experience working as an individual contributor in AI. - Use of state-of-the-art technology to solve operational problems in AI and Machine Learning. - Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles. - Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions. - Ability to build reference implementations of operational AI & Advanced Analytics processing solutions. Background Investigations: - IRS MBI - Eligibility #techjobs #VeteransPage #LI-Remote EEO Statement Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics. Pay Transparency Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances. Accommodations Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process-including accessing job postings, completing assessments, or participating in interviews,-please contact People Operations at **************************. Minimum Salary $ 156,740.00 Maximum Salary $ 234,960.00
    $66k-92k yearly est. Easy Apply 7d ago
  • Sr Data Engineer, Palantir

    The Hertz Corporation 4.3 company rating

    Data engineer job in Topeka, KS

    **A Day in the Life:** We are seeking a talented **Sr Data Engineer, Palantir (experience required)** to join our Strategic Data & Analytics team working on Hertz's strategic applications and initiatives. This role will work in multi-disciplinary teams rapidly building high-value products that directly impact our financial performance and customer experience. You'll build cloud-native, large-scale, employee facing software using modern technologies including React, Python, Java, AWS, and Palantir Foundry. The ideal candidate will have strong development skills across the full stack, a growth mindset, and a passion for building software at a sustainable pace in a highly productive engineering culture. Experience with Palantir Foundry is highly preferred but not required - we're looking for engineers who are eager to learn and committed to engineering excellence. We expect the starting salary to be around $135k but will be commensurate with experience. **What You'll Do:** Day-to-Day Responsibilities + Work in balanced teams consisting of Product Managers, Product Designers, and engineers + Test first - We strive for Test-Driven Development (TDD) for all production code + CI (Continuous Integration) everything - Automation is core to our development process + Architect user-facing interfaces and design functions that help users visualize and interact with their data + Contribute to both frontend and backend codebases to enhance and develop projects + Build software at a sustainable pace to ensure longevity, reliability, and higher quality output Frontend Development + Design and develop responsive, intuitive user interfaces using React and modern JavaScript/TypeScript + Build reusable component libraries and implement best practices for frontend architecture + Generate UX/UI designs (no dedicated UX/UI designers on team) with considerations for usability and efficiency + Optimize applications for maximum speed, scalability, and accessibility + Develop large-scale, web and mobile software utilizing appropriate technologies for use by our employees Backend Development + Develop and maintain RESTful APIs and backend services using Python or Java + Design and implement data models and database schemas + Deploy to cloud environments (primarily AWS) + Integrate with third-party services and APIs + Write clean, maintainable, and well-documented code Palantir Foundry Development (Highly Preferred) + Build custom applications and integrations within the Palantir Foundry platform + Develop Ontology-based applications leveraging object types, link types, and actions + Create data pipelines and transformations using Python transforms + Implement custom widgets and user experiences using the Foundry SDK + Design and build functions that assist users to visualize and interact with their data Product Development & Delivery + Research problems and break them into deliverable parts + Work with a Lean mindset and deliver value quickly + Participate in all stages of the product development and deployment lifecycle + Conduct code reviews and provide constructive feedback to team members + Work with product managers and stakeholders to define requirements and deliverables + Contribute to architectural decisions and technical documentation **What We're Looking For:** + Experience with Palantir Foundry platform, required + 5+ years in web front-end or mobile development + Bachelor's or Master's degree in Computer Science or other related field, preferred + Strong proficiency in React, JavaScript/TypeScript, 
HTML, and CSS for web front-end development + Strong knowledge of one or more Object Oriented Programming or Functional Programming languages such as JavaScript, Typescript, Java, Python, or Kotlin + Experience with RESTful API design and development + Experience deploying to cloud environments (AWS preferred) + Understanding of version control systems, particularly GitHub + Experience with relational and/or NoSQL databases + Familiarity with modern frontend build tools and package managers (e.g., Webpack, npm, yarn) + Experience with React, including React Native for mobile app development, preferred + Experience in Android or iOS development, preferred + Experience with data visualization libraries (e.g., D3.js, Plotly, Chart.js), preferred + Familiarity with CI/CD pipelines and DevOps practices, preferred + Experience with Spring framework, preferred + Working knowledge of Lean, User Centered Design, and Agile methodologies + Strong communication skills and ability to collaborate effectively across teams + Growth mindset - Aptitude and willingness to learn new technologies + Empathy - Kindness and empathy when building software for end users + Pride - Takes pride in engineering excellence and quality craftsmanship + Customer obsession - Obsessed with the end user experience of products + Strong problem-solving skills and attention to detail + Ability to work independently and as part of a balanced, multi-disciplinary team + Self-motivated with a passion for continuous learning and improvement **What You'll Get:** + Up to 40% off the base rate of any standard Hertz Rental + Paid Time Off + Medical, Dental & Vision plan options + Retirement programs, including 401(k) employer matching + Paid Parental Leave & Adoption Assistance + Employee Assistance Program for employees & family + Educational Reimbursement & Discounts + Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness + Perks & Discounts -Theme Park Tickets, Gym Discounts & more The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world. **US EEO STATEMENT** At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran
    $135k yearly 54d ago
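The Hertz listing above calls for Palantir Foundry pipeline work using Python transforms. The sketch below follows the publicly documented Foundry transforms API in broad strokes; dataset paths and column names are placeholders, and the real project setup depends on the Foundry repository configuration.

```python
# Illustrative only: a Foundry Python transform in the style of the publicly
# documented transforms API. Dataset paths and columns are placeholders.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Company/pipelines/clean/reservations"),
    raw=Input("/Company/pipelines/raw/reservations"),
)
def clean_reservations(raw):
    # Keep completed reservations and derive a rental-duration column.
    return (
        raw.filter(F.col("status") == "COMPLETED")
           .withColumn(
               "rental_days",
               F.datediff(F.col("return_date"), F.col("pickup_date")),
           )
    )
```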
  • Data Engineer

    PDS Inc., LLC 3.8 company rating

    Data engineer job in Overland Park, KS

    The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions. ESSENTIAL DUTIES AND RESPONSIBILITIES Data Engineering & Integration Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools. Automate data imports, transformations, and loads from multiple sources (on-premise, SaaS, APIs, and cloud). Optimize and monitor data workflows for reliability, performance, and cost efficiency. Implement and maintain data quality, validation, and error-handling frameworks. Data Analysis & Reporting Develop and maintain reporting databases, views, and semantic models for business intelligence solutions. Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs. Perform ad-hoc data exploration and statistical analysis to support business initiatives. Collaboration & Governance Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements. Maintain data integrity, enforce governance standards, and promote best practices in data stewardship. Support data security and compliance initiatives in coordination with IT and business teams. Continuous Improvement Stay current with emerging data technologies and analytics practices. Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery. QUALIFICATIONS Required: Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database. Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools. Proficiency in building BI solutions using Power BI and/or SSRS. Strong data modeling and relational database design skills. Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections). Ability to translate business goals into data requirements and technical solutions. Excellent communication and collaboration skills. Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience). Preferred: Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks). Familiarity with version control tools (Git, Azure DevOps) and Agile development practices. Exposure to Python or PowerShell for data transformation or automation. Experience integrating data from insurance or financial systems. Compensation: $120-129K This position is 3 days onsite/hybrid located in Overland Park, KS We look forward to reviewing your application. We encourage everyone to apply - even if every box isn't checked for what you are looking for or what is required. PDSINC, LLC is an Equal Opportunity Employer.
    $120k-129k yearly 56d ago
  • Business Data Scientist

    SouthLaw, P.C. 3.6 company rating

    Data engineer job in Overland Park, KS

    Benefits:
    * 401k with matching up to 3%, over a 3-year vesting period
    * Medical
    * Dental - 100% base rate paid by the Firm
    * Vision - 100% base rate paid by the Firm
    * Life insurance
    * Long- and short-term disability
    * PTO

    Job Purpose: The Business Data Scientist plays a key part in improving the efficiency and effectiveness of the Firm. The position is responsible for the ongoing analysis of Firm workflow, task and financial data, obtained from a variety of different sources. This involves development and preparation of information, reconciliations and reports for departments and management.

    Job Duties & Responsibilities:
    * Serve as subject matter expert for the Case Management data system
    * Build and maintain Key Performance Indicators (KPIs), dashboards, reports, and data-related products in a supportable and extensible way using organizationally accepted tools and methods
    * Identify patterns and trends in data sets to support process improvement efforts to maintain or meet Firm goals or opportunities
    * Analyze results of data reports for anomalies, accuracy, and applicability, and work with department leaders, BPM and QA teams for discussion
    * Provide reporting and support as needed for data-intensive projects throughout implementation
    * Monitor progress and maintain reporting to ensure post go-live adherence to expectations
    * Review and analyze data to answer financial and technical questions

    Qualifications:
    * Minimum of 3 years of experience in data or business analysis
    * Minimum of 3 years of experience with Microsoft SQL, Excel, and Power BI
    * Ability to express complex analytical concepts effectively, both verbally and in writing
    * Proven analytics skills, including mining, evaluation, analysis, and visualization
    * Demonstrated ability to analyze data in a variety of formats and to provide recommendations and support in establishing expectations
    * Demonstrated effectiveness in working with multi-functional teams in a technical production workflow environment
    * Superior attention to detail to synthesize data from multiple sources
    * Ability to multitask without sacrificing the quality and accuracy of the analytical data captured
    * Eligible to work in the United States

    Working Conditions: This job operates in a professional office environment. This role routinely uses standard office equipment such as computers, phones, photocopiers, filing cabinets, and fax machines.

    Physical Requirements: The ability to work in an office environment, in front of a computer, while typing documents, responding to emails, managing calendars and answering phones.

    Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at times with or without notice. SouthLaw, P.C. is an equal opportunity employer and does not discriminate against otherwise qualified applicants on the basis of race, color, creed, religion, ancestry, age, sex, marital status, national origin, disability or handicap, veteran status, or any other characteristic protected by law.
    $72k-96k yearly est. Auto-Apply 20d ago
  • Senior Data Engineer

    Care IT Services 4.3 company rating

    Data engineer job in Overland Park, KS

    The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making, and will design, develop, and maintain data pipelines, data warehouses, and other data-related infrastructure. The role works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions.

    Key Responsibilities:
    * Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards to drive business insights.
    * Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives.
    * Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness. This includes identifying bottlenecks, tuning queries, and scaling infrastructure as needed.
    * Automate data ingestion, processing, and validation tasks to ensure data quality and consistency.
    * Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations.
    * Contribute to the development of the organization's overall data strategy.
    * Conduct code reviews and contribute to the establishment of coding standards and best practices.

    Required Qualifications:
    * Bachelor's degree in a relevant field or equivalent professional experience.
    * 4-6 years of hands-on experience in data engineering.
    * Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB.
    * Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services.
    * Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift.
    * Programming skills in Python or JavaScript.
    * Proficiency with BI tools such as Sisense, Power BI, or Tableau.

    Preferred Qualifications:
    * Direct experience with Google Cloud Platform (GCP).
    * Knowledge of CI/CD pipelines, including tools like Docker and Terraform.
    * Background in the healthcare industry.
    * Familiarity with modern data integration tools such as DBT, Matillion, and Airbyte.

    Compensation: $125,000.00 per year

    Who We Are: CARE ITS is a certified Woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are World Class IT Professionals, helping clients achieve their goals. Care ITS was established in 2010. Since then we have successfully executed several projects with our expert team of professionals with more than 20 years of experience each. We operate globally with our headquarters in Plainsboro, NJ, with focused specialization in Salesforce, Guidewire and AWS. We provide expert solutions to our customers in various business domains.
    $125k yearly Auto-Apply 60d+ ago
  • Sr. Data Engineer

    Quest Analytics

    Data engineer job in Overland Park, KS

    At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Senior Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data. The Senior Data Engineer will help modernize and scale our data environment. This person will play a key role in transforming these workflows into automated, cloud-based pipelines using Azure Data Factory, Databricks, and modern data platforms. If you are looking for a high-impact opportunity to shape how data flows across the business, APPLY TODAY!

    What you'll do:
    * Identify, design, and implement internal process improvements (e.g., automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability).
    * Transform manual SQL/SSMS/stored procedure workflows into automated pipelines using Azure Data Factory.
    * Write clean, reusable, and efficient code in Python (and optionally C# or Scala).
    * Leverage distributed data tools such as Spark and Databricks for large-scale processing.
    * Review project objectives to determine and implement the most suitable technologies.
    * Apply best practice standards for development, build, and deployment automation.
    * Manage day-to-day operations of the data infrastructure and support engineers and analysts with data investigations.
    * Monitor and report on data pipeline tasks, collaborating with teams to resolve issues quickly.
    * Partner with internal teams to analyze current processes and identify efficiency opportunities.
    * Participate in training and mentoring programs as assigned or required.
    * Uphold Quest Analytics values and contribute to a positive company culture.
    * Respond professionally and promptly to client and internal requests.
    * Perform other duties as assigned.

    What it requires:
    * Bachelor's Degree in Computer Science or equivalent education/experience.
    * 3-5 years of experience with ETL, data operations, and troubleshooting, preferably in healthcare data.
    * Strong SQL development skills (SSMS, stored procedures, and optimization).
    * Proficiency in Python, C#, or Scala (experience with pandas and NumPy is a plus).
    * Solid understanding of the Azure ecosystem, especially Azure Data Factory and Azure Data Lake Storage (ADLS).
    * Hands-on experience with Azure Data Factory and ADLS.
    * Familiarity with Spark, Databricks, and data modeling techniques.
    * Experience working with both relational databases (e.g., SQL Server) and NoSQL databases (e.g., MongoDB).
    * Self-motivated, strong problem-solver, and thrives in fast-paced environments.
    * Excellent troubleshooting, listening, and analytical skills.
    * Customer-focused mindset with a collaborative, team-oriented approach.

    We are not currently engaging with outside agencies on this role. Visa sponsorship is not available at this time.

    What you'll appreciate:
    * Workplace flexibility - you choose between remote, hybrid or in-office
    * Company paid employee medical, dental and vision
    * Competitive salary and success sharing bonus
    * Flexible vacation with no cap, plus sick time and holidays
    * An entrepreneurial culture that won't limit you to a job description
    * Being listened to, valued, appreciated -- and having your contributions rewarded
    * Enjoying your work each day with a great group of people

    Apply TODAY! careers.questanalytics.com

    About Quest Analytics: For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. Visa sponsorship is not available at this time.

    Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.

    Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discrimination or harassment. Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence *********************

    NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with additional outside agencies at this time. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate and unavailable.
    $69k-92k yearly est. Auto-Apply 60d+ ago
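The Quest Analytics role above is about converting manual SQL/stored-procedure refreshes into automated Azure Data Factory and Databricks pipelines. As a hedged illustration of the Databricks half of that pattern, the PySpark snippet below deduplicates a feed to the latest record per key and writes a Delta table; the ADLS paths and columns are placeholders.

```python
# Illustrative only: a Databricks/PySpark job that keeps the latest record per
# provider, replacing a manual stored-procedure refresh. Paths and columns are
# placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("provider_refresh").getOrCreate()

raw = spark.read.parquet("abfss://raw@exampleacct.dfs.core.windows.net/providers/")

# Rank rows so the most recently updated record per provider comes first.
latest_first = Window.partitionBy("provider_id").orderBy(F.col("updated_at").desc())
current = (
    raw.withColumn("rn", F.row_number().over(latest_first))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

current.write.mode("overwrite").format("delta").save(
    "abfss://curated@exampleacct.dfs.core.windows.net/providers_current/"
)
```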
  • Senior Data Engineer

    Velocity Staff

    Data engineer job in Overland Park, KS

    Velocity Staff, Inc. is working with our client located in the Overland Park, KS area to identify a Senior Level Data Engineer to join their Data Services Team. The right candidate will utilize their expertise in data warehousing, data pipeline creation/support and analytical reporting and be responsible for gathering and analyzing data from several internal and external sources, designing a cloud-focused data platform for analytics and business intelligence, reliably providing data to our analysts. This role requires significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams. Responsibilities Work with Data architects to understand current data models, to build pipelines for data ingestion and transformation. Design, build, and maintain a framework for pipeline observation and monitoring, focusing on reliability and performance of jobs. Surface data integration errors to the proper teams, ensuring timely processing of new data. Provide technical consultation for other team members on best practices for automation, monitoring, and deployments. Provide technical consultation for the team with “infrastructure as code” best practices: building deployment processes utilizing technologies such as Terraform or AWS Cloud Formation. Qualifications Bachelor's degree in computer science, data science or related technical field, or equivalent practical experience Proven experience with relational and NoSQL databases (e.g. Postgres, Redshift, MongoDB, Elasticsearch) Experience building and maintaining AWS based data pipelines: Technologies currently utilized include AWS Lambda, Docker / ECS, MSK Mid/Senior level development utilizing Python: (Pandas/Numpy, Boto3, SimpleSalesforce) Experience with version control (git) and peer code reviews Enthusiasm for working directly with customer teams (Business units and internal IT) Preferred but not required qualifications include: Experience with data processing and analytics using AWS Glue or Apache Spark Hands-on experience building data-lake style infrastructures using streaming data set technologies (particularly with Apache Kafka) Experience data processing using Parquet and Avro Experience developing, maintaining, and deploying Python packages Experience with Kafka and the Kafka Connect ecosystem. Familiarity with data visualization techniques using tools such as Grafana, PowerBI, AWS Quick Sight, and Excel. Not ready to apply? Connect with us to learn about future opportunities.
    $69k-92k yearly est. Auto-Apply 59d ago
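The Velocity Staff listing above highlights pipeline observation and monitoring on AWS (Lambda, boto3). A small, hypothetical sketch of that idea follows: a Lambda handler that measures data freshness under an S3 landing prefix and publishes it as a CloudWatch metric; bucket, prefix, and metric names are made up.

```python
# Illustrative only: a Lambda handler that reports data freshness for a landing
# prefix as a CloudWatch metric. Bucket, prefix, and metric names are made up.
import datetime
import boto3

s3 = boto3.client("s3")
cloudwatch = boto3.client("cloudwatch")


def handler(event, context):
    resp = s3.list_objects_v2(Bucket="example-data-lake", Prefix="landing/orders/")
    contents = resp.get("Contents", [])
    if not contents:
        return {"lag_minutes": None}   # nothing has landed yet

    newest = max(obj["LastModified"] for obj in contents)
    lag_minutes = (
        datetime.datetime.now(datetime.timezone.utc) - newest
    ).total_seconds() / 60

    cloudwatch.put_metric_data(
        Namespace="DataPipelines",
        MetricData=[{
            "MetricName": "OrdersLandingLagMinutes",
            "Value": lag_minutes,
            "Unit": "None",
        }],
    )
    return {"lag_minutes": round(lag_minutes, 1)}
```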
  • Sr. Data Engineer

    Wellsky

    Data engineer job in Overland Park, KS

    The Sr Data Engineer is responsible for designing, building, and maintaining scalable data pipelines, infrastructure, and data visualization to support our data-driven decision-making processes. The scope of this job includes collaborating with data scientists, analysts, solutions partners, and other stakeholders to increase data quality, accessibility, and security.

    Key Responsibilities:
    * Create, develop, maintain, and leverage data pipelines, data models, reporting, and dashboards to solve business problems or assigned tasks in coordination with multiple teams.
    * Develop and maintain data solutions to enable new or existing solutions and BI development for the data warehouse and data visualization solutions.
    * Enhance and document workflows for complex data-related challenges by identifying patterns, trends, and anomalies in the data, and find ways to optimize data processing and storage to promote data quality through automation.
    * Develop an understanding of different WellSky solutions and become a subject matter expert in the healthcare domain.
    * Identify opportunities in code reviews and contribute to the development of best practices and coding standards.
    * Adhere to WellSky core values, ensure PHI data security, and adapt to changing BI technologies.
    * Leverage AI tools and platforms as an integral part of daily responsibilities to enhance decision-making, streamline workflows, and drive data-informed outcomes.
    * Perform other job duties as assigned.

    Required Qualifications:
    * Bachelor's degree or relevant work experience
    * 4-6 years of relevant work experience
    * Strong SQL skills with hands-on experience developing and supporting data pipelines and data warehouse solutions
    * Experience with SQL Server data integration and ETL development (including SSIS or equivalent)
    * Experience with cloud-based data platforms and services, preferably within GCP environments

    Preferred Qualifications:
    * Experience with Snowflake data warehouse development and management
    * Experience with DBT for ELT/ETL processes, data modeling, testing, and documentation
    * GCP expertise, especially with Cloud SQL, BigQuery, and DataStream
    * Experience with data replication tools such as HVR (or similar CDC/replication technologies)
    * Experience supporting BI/reporting environments such as Tableau through well-modeled, performant datasets
    * Healthcare industry experience

    Job Expectations:
    * Willing to work additional or irregular hours as needed
    * Must work in accordance with applicable security policies and procedures to safeguard company and client information
    * Must be able to sit and view a computer screen for extended periods of time

    #LI-TC1 #LI-Onsite

    WellSky is where independent thinking and collaboration come together to create an authentic culture. We thrive on innovation, inclusiveness, and cohesive perspectives. At WellSky you can make a difference. WellSky provides equal employment opportunities to all people without regard to race, color, national origin, ancestry, citizenship, age, religion, gender, sex, sexual orientation, gender identity, gender expression, marital status, pregnancy, physical or mental disability, protected medical condition, genetic information, military service, veteran status, or any other status or characteristic protected by law. WellSky is proud to be a drug-free workplace. Applicants for U.S.-based positions with WellSky must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
Certain client-facing positions may be required to comply with applicable requirements, such as immunizations and occupational health mandates. Here are some of the exciting benefits full-time teammates are eligible to receive at WellSky: Excellent medical, dental, and vision benefits Mental health benefits through TelaDoc Prescription drug coverage Generous paid time off, plus 13 paid holidays Paid parental leave 100% vested 401(K) retirement plans Educational assistance up to $2500 per year
    $69k-92k yearly est. Auto-Apply 19d ago
  • Data Engineer

    Lockton 4.5company rating

    Data engineer job in Overland Park, KS

    Lockton Affinity, located in Overland Park, KS, is searching for a Data Engineer. At Lockton Affinity, information technology plays a vital role in delivering exceptional business results. The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions. Lockton values Associates who are proactive, collaborative, and motivated to make a measurable impact. ESSENTIAL DUTIES AND RESPONSIBILITIES Data Engineering & Integration * Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools. * Automate data imports, transformations, and loads from multiple sources (on-premises, SaaS, APIs, and cloud). * Optimize and monitor data workflows for reliability, performance, and cost efficiency. * Implement and maintain data quality, validation, and error-handling frameworks. Data Analysis & Reporting * Develop and maintain reporting databases, views, and semantic models for business intelligence solutions. * Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs. * Perform ad-hoc data exploration and statistical analysis to support business initiatives. Collaboration & Governance * Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements. * Maintain data integrity, enforce governance standards, and promote best practices in data stewardship. * Support data security and compliance initiatives in coordination with IT and business teams. Continuous Improvement * Stay current with emerging data technologies and analytics practices. * Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.
    $77k-101k yearly est. 11d ago
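
The Lockton listing above calls for data quality, validation, and error-handling frameworks around its ETL/ELT loads. As a rough illustration of what such a check can look like (a minimal sketch, not Lockton's actual tooling; the column names, rules, and sample data are assumptions), here is a small Python validation gate that could run before a batch is loaded downstream:

```python
# Illustrative only: a tiny data-quality gate for a hypothetical "policies" extract.
# Column names, rules, and sample values are assumptions, not Lockton's actual schema.
import pandas as pd

REQUIRED_COLUMNS = {"policy_id", "effective_date", "premium"}

def validate_policies(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality problems; an empty list means the batch passes."""
    problems = []
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        # Structural failure: skip row-level checks if required columns are absent.
        return [f"missing columns: {sorted(missing)}"]
    if df["policy_id"].duplicated().any():
        problems.append("duplicate policy_id values found")
    if df["premium"].lt(0).any():
        problems.append("negative premium values found")
    if pd.to_datetime(df["effective_date"], errors="coerce").isna().any():
        problems.append("unparseable effective_date values found")
    return problems

if __name__ == "__main__":
    batch = pd.DataFrame({
        "policy_id": [1, 2, 2],
        "effective_date": ["2024-01-01", "not-a-date", "2024-02-01"],
        "premium": [100.0, -5.0, 50.0],
    })
    issues = validate_policies(batch)
    if issues:
        # A real pipeline might fail the load here or divert bad rows to a quarantine table.
        raise ValueError("data-quality check failed: " + "; ".join(issues))
```

In an Azure Data Factory or SSIS pipeline, an equivalent check would typically run as a pre-load step that fails the run or routes rejected rows for review.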
  • Staff Data Engineer

    Artera

    Data engineer job in Kansas City, MO

    Our Mission: Make healthcare #1 in customer service. What We Deliver: Artera, a SaaS leader in digital health, transforms patient experience with AI-powered virtual agents (voice and text) for every step of the patient journey. Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Artera's virtual agents support front desk staff to improve patient access including self-scheduling, intake, forms, billing and more. Whether augmenting a team or unleashing a fully autonomous digital workforce, Artera offers multiple virtual agent options to meet healthcare organizations where they are in their AI journey. Artera helps support 2B communications in 109 languages across voice, text and web. A decade of healthcare expertise, powered by AI. Our Impact: Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Hear from our CEO, Guillaume de Zwirek, about why we are standing at the edge of the biggest technological shift in healthcare's history! Our award-winning culture: Since founding in 2015, Artera has consistently been recognized for its innovative technology and business growth, and named a top place to work. Examples of these accolades include: Inc. 5000 Fastest Growing Private Companies (2020, 2021, 2022, 2023, 2024); Deloitte Technology Fast 500 (2021, 2022, 2023, 2024, 2025); Built In Best Companies to Work For (2021, 2022, 2023, 2024, 2025, 2026). Artera has also been recognized by Forbes as one of “America's Best Startup Employers,” Newsweek as one of the “World's Best Digital Health Companies,” and named one of the top “44 Startups to Bet your Career on in 2024” by Business Insider. SUMMARY We are seeking a highly skilled and motivated Staff Data Engineer to join our team at Artera. This role is critical to maintaining and improving our data infrastructure, ensuring that our data pipelines are robust, efficient, and capable of delivering high-quality data to both internal and external stakeholders. As a key player in our data team, you will have the opportunity to make strategic decisions about the tools we use, how we organize our data, and the best methods for orchestrating and optimizing our data processes. Your contributions will be essential to ensuring the uninterrupted flow of data across our platform, supporting the analytics needs of our clients and internal teams. If you are passionate about data, problem-solving, and continuous improvement, you will also have the opportunity to take the lead on investigating and implementing solutions to enhance our data infrastructure. RESPONSIBILITIES Continuous Enhancement: Maintain and elevate Artera's data infrastructure, ensuring peak performance and dependability. Strategic Leadership: Drive the decision-making process for the selection and implementation of data tools and technologies. Streamlining: Design and refine data pipelines to ensure smooth and efficient data flow. Troubleshooting: Manage the daily operations of the Artera platform, swiftly identifying and resolving data-related challenges. Cross-Functional Synergy: Partner with cross-functional teams to develop new data requirements and refine existing processes. Guidance: Provide mentorship to junior engineers, supporting their growth and assisting with complex projects. Collaborative Innovation: Contribute to ongoing platform improvements, ensuring a culture of continuous innovation.
Knowledge Expansion: Stay informed on industry trends and best practices in data infrastructure and cloud technologies. Dependability: Guarantee consistent data delivery to customers and stakeholders, adhering to or surpassing service level agreements. Oversight: Monitor and sustain the data infrastructure, covering areas like recalls, message delivery, and reporting functions. Proactiveness: Improve stability and performance of architecture for team implementations. Requirements Bachelor's Degree in STEM preferred *additional experience is also accepted in lieu of a degree Proven experience with Kubernetes and Cloud infrastructure (AWS preferred) Strong proficiency in Python and SQL for data processing and automation. Expertise in orchestration tools such as Airflow and Docker. Understanding of performance optimization and cost-effectiveness in Snowflake. Ability to work effectively in a collaborative, cross-functional environment. Strong problem-solving skills with a proactive and solution-oriented mindset. Experience with event-sourced and microservice architectures Experience working with asynchronous requests in large-scale applications Commitment to testing best practices Experience in large-scale data architecture Demonstrated ability to build and maintain complex data pipelines and data flows. Bonus Experience Knowledge of DBT & Meltano Security Requirements: This engineering role contributes to a secure, federally compliant platform. Candidates must be eligible for a government background check and operate within strict code management, access, and documentation standards. Security-conscious development and participation in compliance practices are core to the role. OUR APPROACH TO WORK LOCATION Artera has hybrid office locations in Santa Barbara, CA, and Philadelphia (Wayne), PA, where team members typically come in three days a week. Specific frequency can vary depending on your team's needs, manager expectations and/or role responsibilities. In addition to our U.S. office locations, we are intentionally building geographically concentrated teams in several key metropolitan areas, which we call our “Hiring Hubs.” We are currently hiring remote candidates located within the following hiring hubs: Boston Metro Area, MA; Chicago Metro Area, IL; Denver Metro Area, CO; Kansas City Metro Area (KS/MO); Los Angeles Metro Area, CA; San Francisco / Bay Area, CA; Seattle Metro Area, WA. This hub-based model helps us cultivate strong local connections and team cohesion, even in a distributed environment. To be eligible for employment at Artera, candidates must reside in one of our hybrid office cities or one of the designated hiring hubs. Specific roles may call out location preferences when relevant. As our hubs grow, we may establish local offices to further enhance in-person connection and collaboration. While there are no current plans in place, should an office open in your area, we anticipate implementing a hybrid model. Any future attendance expectations would be developed thoughtfully, considering factors like typical commute times and access to public transit, to ensure they are fair and practical for the local team. WORKING AT ARTERA Company benefits - Full health benefits (medical, dental, and vision), flexible spending accounts, company paid life insurance, company paid short-term & long-term disability, company equity, voluntary benefits, 401(k) and more!
Career development - Manager development cohorts, employee development funds Generous time off - Company holidays, Winter & Summer break, and flexible time off Employee Resource Groups (ERGs) - We believe that everyone should belong at their workplace. Our ERGs are available for identifying employees or allies to join. EQUAL EMPLOYMENT OPPORTUNITY (EEO) STATEMENT Artera is an Equal Opportunity Employer and is committed to fair and equitable hiring practices. All hiring decisions at Artera are based on strategic business needs, job requirements and individual qualifications. All candidates are considered without regard to race, color, religion, gender, sexuality, national origin, age, disability, genetics or any other protected status. Artera is committed to providing employees with a work environment free of discrimination and harassment; Artera will not tolerate discrimination or harassment of any kind. Artera provides reasonable accommodations for applicants and employees in compliance with state and federal laws. If you need an accommodation, please reach out to ************. DATA PRIVACY Artera values your privacy. By submitting your application, you consent to the processing of your personal information provided in conjunction with your application. For more information, please refer to our Privacy Policy. SECURITY REQUIREMENTS All employees are responsible for protecting the confidentiality, integrity, and availability of the organization's systems and data, including safeguarding Artera's sensitive information such as Personally Identifiable Information (PII) and Protected Health Information (PHI). Those with specific security or privacy responsibilities must ensure compliance with organizational policies, regulatory requirements, and applicable standards and frameworks by implementing safeguards, monitoring for threats, reporting incidents, and addressing data handling risks or breaches.
    $72k-96k yearly est. Auto-Apply 21d ago
  • IT Data & Analytics Engineer - Office of Judicial Administration

    Kansas Judicial Branch

    Data engineer job in Topeka, KS

    K0076172 IT Data & Analytics Engineer, Pay Grade 49, $82,299.89 annually. Overview: The Data Systems Engineer is a key technical contributor within the Kansas Judicial Branch's newly established Data, Analytics & AI team. This role supports the development, modernization, and daily operation of the Branch's data environment, including: Modern data platform engineering (Azure, Databricks, ADLS, ADF) Data modeling and ingestion pipelines SQL and database administration (light-to-moderate) Dashboard and analytics enablement Operational support for data processes across the Branch Applied AI and automation initiatives This is a hybrid, cross-disciplinary position, ideal for someone who enjoys learning new technologies, solving complex data problems, and working across infrastructure, application development, security, and business units. This role supports both technical delivery and data governance foundation efforts and will work closely with internal stakeholders and agency leadership to deliver high-impact data solutions that improve judicial operations statewide. This position is expected to work on site at the Kansas Judicial Center. Key Responsibilities Data Engineering & Platform Work (Core) Build, test, and maintain data ingestion pipelines using Azure Data Factory, Databricks, SQL, and Python. Support medallion architecture workflows (Bronze → Silver → Gold) for structured and unstructured data. Assist with configuration and administration of Azure Data Lake Storage, Key Vault, ADF, Databricks, and related services. Maintain metadata tables, pipeline configuration, data quality rules, and automated monitoring scripts. Database & SQL Responsibilities Support and troubleshoot SQL Server environments related to analytics workloads, including stored procedures, ETL queries, schema updates, and performance tuning when needed. Assist with replication-based ingestion processes from legacy case management systems. Monitor extraction processes, troubleshoot data integrity or latency issues, and coordinate fixes. Analytics & Reporting Support Prepare curated datasets for dashboard developers and analysts. Build semantic models, business logic, and KPIs in SQL, Python, or Power BI. Assist with dashboard UAT, quality review, and deployment processes. AI & Automation Enablement Assist the IT Solution Architect in developing, training and supporting AI agents, chatbots, or retrieval workflows on Databricks (training provided). Support vectorization tasks, document processing, or metadata preparation for AI projects. Maintain internal model registry assets and contribute to responsible AI practices. Governance & Process Support Support data quality, metadata, glossary, and classification work in collaboration with business units. Contribute to Service Agreements (SAs), process documentation, SOPs, and governance deliverables. Participate in backlog reviews, sprint planning, and DAOSC working sessions. Cross-Team Collaboration Work with Architecture, Infrastructure, Security, Application, PMO, and eCourt teams to ensure reliable data movement and access. Provide technical assistance to stakeholders, SMEs, and leadership when data issues arise. Help refine intake, triage, documentation, and incident response for data services. Other Duties as Assigned This position will evolve as the Branch matures its data program and adopts new technologies. Required Education: Minimum of a high school diploma, GED, or technical program in information technology, computer science, data analytics, or relevant field.
Education may be substituted with relevant professional experience. Required Experience: Candidates should have experience in at least two of the categories below and willingness to learn the others: Data & Platform Engineering Experience with SQL, data modeling, ETL/ELT solutions, or pipeline development. Familiarity with cloud concepts (Azure preferred). Programming / Automation Hands-on experience with one or more: Python, PowerShell, SQL, or similar scripting languages. Analytics Experience preparing datasets or supporting dashboards in tools like Power BI, Tableau, or similar. Experience building KPIs, metrics, or calculations. Database Work Experience with relational databases (SQL Server, Oracle, Postgres, etc.). Knowledge of indexing, querying, and performance tuning fundamentals. Systems / Infrastructure Understanding of identity, networking, storage, or DevOps concepts helpful. Soft Skills Ability to communicate with technical and non-technical staff. Ability to learn new tools and adapt quickly. Ability to work both independently and collaboratively. Preferred Qualifications (Not required - candidates are encouraged to apply even without these.) Experience with Azure Data Factory, Databricks, or cloud analytics platforms. Experience with version control (Git), CI/CD (Azure DevOps), or infrastructure-as-code. Experience with data governance, metadata, data quality, or cataloging. Experience with AI/ML tooling (MLflow, vector search, RAG frameworks). Experience in government, courts, or public-sector data environments. Certifications in Azure data/AI (DP-900, DP-203, AI-900, etc.) or equivalent practical experience. Applications Accepted Until Filled. The Americans with Disabilities Act ensures your right to reasonable accommodations during the employment process. A request for accommodation will not affect your opportunities for employment with the Judicial Branch. If you wish to request an ADA accommodation, please contact [email protected] or by TDD through the Kansas Relay Center at ************ or 711. THE KANSAS JUDICIAL BRANCH IS AN EEO / AA EMPLOYER
    $82.3k yearly Auto-Apply 54d ago
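
The Kansas Judicial Branch posting above references medallion-style workflows (Bronze → Silver → Gold) on Databricks. For readers unfamiliar with the pattern, the sketch below shows what a Bronze-to-Silver step can look like in PySpark; the table names, columns, and cleanup rules are hypothetical and are not drawn from the Branch's actual pipelines.

```python
# Illustrative Bronze -> Silver step in the medallion pattern (hypothetical tables/columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_example").getOrCreate()

# Bronze: raw, append-only ingested records (assumed table name).
bronze = spark.table("bronze.case_events_raw")

# Silver: cleaned, de-duplicated, typed records ready for analytics.
silver = (
    bronze
    .dropDuplicates(["event_id"])                              # remove replayed records
    .withColumn("event_date", F.to_date("event_timestamp"))    # enforce types
    .withColumn("county", F.upper(F.trim(F.col("county"))))    # normalize a key attribute
    .filter(F.col("case_number").isNotNull())                  # drop rows failing a basic rule
)

# Write as a managed Delta table; Gold-layer models would aggregate from here.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.case_events")
```

The general idea is that Bronze holds raw ingested records as-is, Silver holds cleaned and conformed records, and Gold holds aggregated, business-ready models.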
  • Data Scientist - Retail Pricing

    Capitol Federal Savings Bank 4.4company rating

    Data engineer job in Topeka, KS

    We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. This role sits at the intersection of analytics, finance, and product strategy - transforming data into pricing intelligence that supports smarter, faster business decisions. Will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing, will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities. Collaborating across product, finance, and executive teams, will translate complex analytical findings into clear business recommendations that drive strategic action. Will also contribute to enhancing our analytics infrastructure - improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making. Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration CapFed is an equal opportunity employer.
    $66k-82k yearly est. Auto-Apply 48d ago
  • Jr Data Scientist - AI

    Spring Venture Group 3.9company rating

    Data engineer job in Kansas City, MO

    Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day. Job Description Overview At SVG, we are a leader in driving business solutions using analytics and data. The Data Science team operates at the very heart of this culture, constantly investigating how machine learning can optimize business, automate processes, improve KPIs, and forecast trends. The Data Science team is ideal for those who like to be where business and science intersect. They collaborate with stakeholders across SVG to identify business problems, and then apply data science skills to optimize objectives and directly impact key performance indicators. About the role As a Junior Data Scientist on the DS team, you will receive exposure to all facets of data science at SVG and be given self-development and skills training opportunities. You will work as part of a team to solve business-related problems using data-driven techniques. You will help transform our data into tangible business value by performing statistical analysis, contributing to production services, and communicating the outcomes across business verticals. * This is a hybrid role 2-3 days a week in our downtown office, so you must currently be in Kansas City. We are unable to sponsor for this role, so we cannot consider candidates with EAD/OPT. Also, no third parties. Responsibilities Learn about and improve existing AI and machine learning systems, or build new ones altogether. Train and deploy great models, then create tangible value with them. Develop lead engagement strategies that utilize model outputs. Train a close rate model and apply it to customer outreach. Derive actionable insights from transcripts using LLMs. Test a hypothesis and utilize the data to drive the way forward. Run thorough A/B tests that show and validate measurable improvements. Conduct analyses of business topics and present actionable insights to team leadership. Build Tableau dashboards to investigate business questions and analyze ML systems. Begin writing production-level machine learning and engineering code. Create automated solutions that solve operational pain-points and reduce manual work. Develop understanding of and utilize our tech stack - including MySQL, Python, AWS (esp. SageMaker), Git, FastAPI, and LangChain. Share ownership of production workflows, in collaboration with senior team members. Demonstrate consistent initiative and follow-through. Work through every step of the Machine Learning Lifecycle; take a problem definition all the way through to a deployed and tracked production model. Build trust throughout the organization. Perform other duties and responsibilities as assigned. Qualifications Bachelor's degree in an analytical or technical field, or 2+ years of relevant work experience. Experience analyzing data and using it to drive decisions. Demonstrated problem-solving skills. Passionate about mathematical and statistical applications in business. Strong sense of accountability and integrity with exceptional written and verbal communication skills.
Adaptable, acting with appropriate urgency when needed. General understanding of SQL and relational databases. Fluency in at least one programming language, preferably Python. Familiarity with basic machine learning concepts. Interest in AI and LLMs for engineering and analysis tasks. A strong work ethic combined with self-awareness and a commitment to continuous improvement. Demonstrated ability to receive and incorporate feedback effectively. Preferred Graduate coursework in Data Science, Statistics, or similar. Experience using version control systems and performing code reviews. Familiarity with owning 24/7 operational systems. Experience in a highly productive work environment where ambiguity is the norm. Additional Information Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: Competitive Compensation Medical, dental, and vision benefits after a short waiting period 401(k) matching program Life Insurance, and Short-term and Long-term Disability Insurance Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan Generous paid time off (PTO) program starting off at 15 days your first year 15 paid Holidays (includes holiday break between Christmas and New Year's) 10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave Annual Volunteer Time Off (VTO) and a donation matching program Employee Assistance Program (EAP) - health and well-being on and off the job Rewards and Recognition Diverse, inclusive and welcoming culture Training program and ongoing support throughout your Spring Venture Group career Security Responsibilities: Operating in alignment with policies and standards Reporting Security Incidents Completing assigned training Protecting assigned organizational assets Spring Venture Group is an Equal Opportunity Employer
    $68k-87k yearly est. 14d ago
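
The Spring Venture Group listing above mentions running A/B tests that validate measurable improvements. As a generic illustration of that kind of analysis (a sketch only, with made-up numbers, not SVG's methodology), here is a two-proportion z-test on hypothetical conversion counts:

```python
# Illustrative two-proportion z-test for an A/B test on conversion rate.
# The counts below are made-up example numbers, not real campaign data.
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))                           # two-sided tail probability
    return z, p_value

if __name__ == "__main__":
    # Variant B converts 260/2000 vs. control A at 220/2000 (hypothetical).
    z, p = two_proportion_ztest(conv_a=220, n_a=2000, conv_b=260, n_b=2000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # a small p-value suggests the lift is unlikely to be chance
```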
  • Principal Data Scientist

    Maximus 4.3company rating

    Data engineer job in Kansas City, KS

    Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work). This position requires occasional travel to the DC area for client meetings. U.S. citizenship is required for this position due to government contract requirements. Essential Duties and Responsibilities: - Make deep dives into the data, pulling out objective insights for business leaders. - Initiate, craft, and lead advanced analyses of operational data. - Provide a strong voice for the importance of data-driven decision making. - Provide expertise to others in data wrangling and analysis. - Convert complex data into visually appealing presentations. - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners. - Understand the importance of automation and look to implement and initiate automated solutions where appropriate. - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects. - Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects. - Guide operational partners on product performance and solution improvement/maturity options. - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization. - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages. - Mentor more junior data analysts/data scientists as needed. - Apply a strategic approach to lead projects from start to finish. Job-Specific Minimum Requirements: - Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation. - Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital. - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning. - Contribute to the development of mathematically rigorous process improvement procedures. - Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements - Bachelor's degree in a related field required. - 10-12 years of relevant professional experience required. Job-Specific Minimum Requirements: - 10+ years of relevant Software Development + AI / ML / DS experience. - Professional Programming experience (e.g. Python, R, etc.). - Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML. - Experience with API programming. - Experience with Linux. - Experience with Statistics. - Experience with Classical Machine Learning. - Experience working as a contributor on a team. Preferred Skills and Qualifications: - Master's or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.). - Experience developing machine learning or signal processing algorithms: - Ability to leverage mathematical principles to model new and novel behaviors. - Ability to leverage statistics to identify true signals from noise or clutter. - Experience working as an individual contributor in AI. - Use of state-of-the-art technology to solve operational problems in AI and Machine Learning. - Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles. - Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions. - Ability to build reference implementations of operational AI & Advanced Analytics processing solutions. Background Investigations: - IRS MBI - Eligibility #techjobs #VeteransPage #LI-Remote EEO Statement Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics. Pay Transparency Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances. Accommodations Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************. Minimum Salary: $156,740.00 Maximum Salary: $234,960.00
    $64k-89k yearly est. Easy Apply 7d ago
  • Google Cloud Data & AI Engineer

    Slalom 4.6company rating

    Data engineer job in Kansas City, MO

    Who You'll Work With As a modern technology company, our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant or Principal at Slalom, you will be a part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients. What You'll Do * Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, and more. * Implement cloud-based data solutions for data ingestion, transformation, and storage; and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs. * Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem. * Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps). * Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud. * Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients. * Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices. What You'll Bring * Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.). * Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance. * Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML. * Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment. * Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts. * Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects. * Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously. * Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe. About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this position, the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time. East Bay, San Francisco, Silicon Valley: * Consultant: $114,000-$171,000 * Senior Consultant: $131,000-$196,500 * Principal: $145,000-$217,500 San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC: * Consultant: $105,000-$157,500 * Senior Consultant: $120,000-$180,000 * Principal: $133,000-$199,500 All other locations: * Consultant: $96,000-$144,000 * Senior Consultant: $110,000-$165,000 * Principal: $122,000-$183,000 We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************. EEO and Accommodations Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We are accepting applications until the role is filled.
    $145k-217.5k yearly Easy Apply 5d ago
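
The Slalom listing above centers on Google Cloud data services such as BigQuery. As a small, generic example of the day-to-day work implied (assuming the google-cloud-bigquery Python client, application-default credentials, and a hypothetical project, dataset, and table), the sketch below runs a parameterized aggregation query:

```python
# Illustrative BigQuery query from Python. Assumes `pip install google-cloud-bigquery`
# and application-default credentials; the project, dataset, and table are hypothetical.
from google.cloud import bigquery

def daily_event_counts(project_id: str, start_date: str) -> None:
    client = bigquery.Client(project=project_id)
    sql = """
        SELECT DATE(event_timestamp) AS event_day, COUNT(*) AS events
        FROM `my_project.analytics.events`   -- hypothetical table
        WHERE DATE(event_timestamp) >= @start_date
        GROUP BY event_day
        ORDER BY event_day
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("start_date", "DATE", start_date)]
    )
    for row in client.query(sql, job_config=job_config).result():
        print(row["event_day"], row["events"])

if __name__ == "__main__":
    daily_event_counts("my_project", "2024-01-01")
```

Parameterized queries like this avoid string concatenation and keep the SQL reusable across date ranges.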
  • Data Scientist - Retail Pricing

    Capitol Federal Savings Bank 4.4company rating

    Data engineer job in Overland Park, KS

    We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. This role sits at the intersection of analytics, finance, and product strategy - transforming data into pricing intelligence that supports smarter, faster business decisions. Will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing, will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities. Collaborating across product, finance, and executive teams, will translate complex analytical findings into clear business recommendations that drive strategic action. Will also contribute to enhancing our analytics infrastructure - improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making. Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration CapFed is an equal opportunity employer.
    $66k-82k yearly est. Auto-Apply 48d ago

Learn more about data engineer jobs

How much does a data engineer earn in Lawrence, KS?

The average data engineer in Lawrence, KS earns between $60,000 and $105,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Lawrence, KS

$79,000
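
The page does not explain how the $79,000 figure relates to the $60,000-$105,000 range quoted above. One reading that happens to line up with the numbers is a geometric mean of the range endpoints; the quick check below is purely illustrative and assumes that interpretation rather than documenting the site's actual methodology.

```python
# Hypothetical consistency check: is $79,000 roughly the geometric mean of the quoted range?
# This assumes a methodology the page does not actually state.
from math import sqrt

low, high = 60_000, 105_000
geometric_mean = sqrt(low * high)   # about 79,373
arithmetic_mean = (low + high) / 2  # 82,500

print(f"geometric mean:  ${geometric_mean:,.0f}")   # close to the quoted $79,000
print(f"arithmetic mean: ${arithmetic_mean:,.0f}")  # noticeably higher than $79,000
```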