
Data engineer jobs in Overland Park, KS

- 263 jobs
  • Data Engineer

    Tyler Technologies (4.3 company rating)

    Data engineer job in Overland Park, KS

    Description
    The Data Engineer role within the Cloud Payments Reporting team is responsible for developing, deploying, and maintaining data solutions. Primary responsibilities include querying databases, maintaining data transformation pipelines through ETL tools into data warehouses, and providing and supporting data visualizations, dashboards, and ad hoc and customer-deliverable reports.

    Responsibilities
    * Provide primary support for existing databases, automated jobs, and reports, including monitoring notifications, performing root cause analysis, communicating findings, and resolving issues.
    * Serve as subject matter expert for payments reports, databases, and processes.
    * Ensure data and report integrity and accuracy through thorough testing and validation.
    * Build analytical tools that use the data, providing actionable insight into key business performance metrics, including operational efficiency.
    * Implement, support, and optimize ETL pipelines, data aggregation processes, and reports using various tools and technologies.
    * Collaborate with operational leaders and teams to understand reporting and data usage across the business and provide efficient solutions.
    * Participate in recurring meetings with working groups and management teams to discuss operational improvements.
    * Work with stakeholders, including data, design, product, and executive teams, to support their data infrastructure needs and assist with data-related technical issues.
    * Handle tasks independently, adjust to new deadlines, and adapt to changing priorities.
    * Design, develop, and implement special projects based on business needs.
    * Perform other job-related duties and responsibilities as assigned.

    Qualifications
    * Five or more years of experience with Oracle, MySQL, Power BI / QuickSight, and Excel.
    * Thorough knowledge of SQL, relational databases, and data modeling principles.
    * Proficiency in programming languages such as PL/SQL, Bash, PowerShell, and Python.
    * Exceptional problem-solving, analytical, and critical thinking skills.
    * Excellent interpersonal and communication skills, with the ability to consult with stakeholders to facilitate requirements gathering, troubleshooting, and solution validation.
    * Detail-oriented, with the ability to understand the bigger picture.
    * Ability to communicate complex quantitative analysis clearly.
    * Strong organizational skills, including multitasking and teamwork.
    * Self-motivated, task-oriented, and adept at complex problem solving.
    * Experience with AWS, Jenkins, and SnapLogic is a plus.
    * Data streaming, API calls (SOAP and REST), database replication, and real-time processing are a plus.
    * Experience with Atlassian JIRA and Confluence is a plus.
    $69k-84k yearly est. 60d+ ago
  • Data Scientist / Data Architect / Data Governance Lead

    Bluebird Network (3.8 company rating)

    Data engineer job in Kansas City, MO

    KEY RESPONSIBILITIES:

    Data Governance, Strategy, and Architecture
    * Define and drive the organization's overall data strategy, roadmap, and architecture vision, including the data and AI architecture and the design of scalable data lakes, data warehouses, and data fabric architectures.
    * Establish and enforce data governance policies and standards to ensure data quality, consistency, and compliance with all relevant regulations (e.g., GDPR, CCPA). Lead the implementation of a comprehensive data governance framework, including data quality management, data lineage tracking, and master data management (MDM). Collaborate with data owners and stewards across business units to establish clear roles, responsibilities, and accountability for data assets.
    * Establish clear rules and policies governing the responsible use of data within AI and ML models, including documentation of data lineage for model training. Design data infrastructure optimized for AI workloads, including data pipelines for machine learning models, and architect solutions for large language models (LLMs). Develop bias mitigation strategies to ensure diverse and representative datasets that prevent AI biases, and architect monitoring systems for model drift.
    * Evaluate, recommend, and select appropriate data management technologies, including cloud platforms (e.g., AWS, Azure, GCP), storage solutions, and governance tools.
    * Architect complex data integration patterns to connect disparate data sources across the organization, ensuring seamless data flow and a unified data view.

    Data Security and Privacy
    * Design and implement a robust data security architecture to protect sensitive data from unauthorized access, breaches, and corruption.
    * Develop security protocols, such as encryption, access controls (IAM), and masking techniques, to safeguard data in transit and at rest.
    * Conduct regular security audits and vulnerability testing to identify gaps in the security architecture and develop remediation plans.
    * Ensure the data architecture and its supporting systems are compliant with internal policies and external data protection regulations.

    Data Modeling and Management
    * Design and maintain conceptual, logical, and physical data models for transactional and analytical systems.
    * Oversee the development of database schemas, metadata management, and data cataloging efforts to improve data discoverability and understanding.
    * Define and standardize data architecture components, including storage solutions (data lakes, warehouses, etc.), data pipelines, and integration patterns.
    * Evaluate and recommend new data technologies, tools, and platforms that align with the organization's strategic needs.

    Data Classification
    * Design and implement a robust data security architecture, including controls for access management, encryption, and data masking, to protect sensitive information.
    * Create and manage an organization-wide data classification scheme based on data sensitivity and importance (e.g., public, internal, confidential, restricted).
    * Implement technical controls and processes to automatically classify and tag data assets, ensuring proper handling and security.
    * Collaborate with business and legal teams to define and apply data classification rules consistently.

    Team Collaboration and Leadership
    * Provide technical guidance and mentorship to data engineers, analysts, developers, and other IT teams on best practices for data management and security.
    * Work closely with business stakeholders to understand their data requirements and translate them into effective architectural solutions.
    * Foster a data-centric culture across the organization, promoting awareness and understanding of data governance principles.
    ABOUT THE COMPANY:
    Bluebird Fiber is a premier fiber telecommunications provider of internet, data transport, and other services to carriers, businesses, schools, hospitals, and other enterprises in the Midwest. To learn more, please visit bluebirdfiber.com. Join an amazing team of telecommunications professionals! Bluebird is a dynamic, growing company in need of a Data Architect to be part of a collaborative team. This is a full-time, benefit-eligible position in our Kansas City office. All of us at Bluebird work hard to meet objectives for the organization and live the mission and values of this growing company to meet a common goal. Check out this video that highlights our amazing company culture.

    JOB SUMMARY:
    We are seeking a highly skilled and strategic Data Architect to lead our data governance, security, and management initiatives. This senior role will be responsible for designing and implementing the organization's enterprise data architecture, ensuring that our data is secure, reliable, and accessible for business-critical functions. The ideal candidate is a proactive leader who can define data strategy, enforce best practices, and collaborate with cross-functional teams to align our data ecosystem with business goals.

    REQUIRED QUALIFICATIONS:
    * Bachelor's or master's degree in Computer Science, Information Technology, or a related technical field.
    * 10+ years of hands-on experience in data architecture, data modeling, and data governance, with a proven track record of designing and implementing complex data ecosystems. Experience working in regulated industries is a plus.
    * Proven experience (8+ years) designing and implementing enterprise-level data architectures.
    * Extensive experience with data modeling, data warehousing, and modern data platforms (e.g., cloud environments like AWS, Azure, or GCP).
    * Deep expertise in data modeling, data warehousing, database technologies (SQL, NoSQL), big data technologies (e.g., Spark), and modern cloud platforms (e.g., AWS, Azure, GCP).
    * Deep expertise in data governance and security principles, including regulatory compliance frameworks.
    * Strong knowledge of how to structure data for machine learning and AI workloads, including experience with MLOps platforms.
    * Hands-on experience with data classification and data cataloging tools (e.g., Collibra, Alation).
    * Excellent communication, interpersonal, and leadership skills, with the ability to influence and build consensus across the organization.

    PREFERRED QUALIFICATIONS:
    * Professional certifications in data architecture, data governance, or cloud platforms.
    * Experience with big data technologies (e.g., Hadoop, Spark).
    * Familiarity with data integration and ETL/ELT frameworks.
    $67k-94k yearly est. 54d ago
  • Senior Data Engineer

    Care It Services (4.3 company rating)

    Data engineer job in Overland Park, KS

    The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making: designing, developing, and maintaining data pipelines, data warehouses, and other data-related infrastructure. The role works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions.

    Key Responsibilities:
    * Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards, to drive business insights.
    * Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives.
    * Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness, including identifying bottlenecks, tuning queries, and scaling infrastructure as needed.
    * Automate data ingestion, processing, and validation tasks to ensure data quality and consistency.
    * Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations.
    * Contribute to the development of the organization's overall data strategy.
    * Conduct code reviews and contribute to the establishment of coding standards and best practices.

    Required Qualifications:
    * Bachelor's degree in a relevant field or equivalent professional experience.
    * 4-6 years of hands-on experience in data engineering.
    * Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB.
    * Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services.
    * Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift.
    * Programming skills in Python or JavaScript.
    * Proficiency with BI tools such as Sisense, Power BI, or Tableau.

    Preferred Qualifications:
    * Direct experience with Google Cloud Platform (GCP).
    * Knowledge of CI/CD pipelines, including tools like Docker and Terraform.
    * Background in the healthcare industry.
    * Familiarity with modern data integration tools such as DBT, Matillion, and Airbyte.

    Compensation: $125,000.00 per year

    Who We Are
    CARE ITS is a certified woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. Care ITS was established in 2010. Since then, we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally, with our headquarters in Plainsboro, NJ, and focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
    $125k yearly 60d+ ago
  • Data Engineer

    PDS Inc., LLC (3.8 company rating)

    Data engineer job in Overland Park, KS

    The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions.

    ESSENTIAL DUTIES AND RESPONSIBILITIES

    Data Engineering & Integration
    * Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools.
    * Automate data imports, transformations, and loads from multiple sources (on-premises, SaaS, APIs, and cloud).
    * Optimize and monitor data workflows for reliability, performance, and cost efficiency.
    * Implement and maintain data quality, validation, and error-handling frameworks.

    Data Analysis & Reporting
    * Develop and maintain reporting databases, views, and semantic models for business intelligence solutions.
    * Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs.
    * Perform ad hoc data exploration and statistical analysis to support business initiatives.

    Collaboration & Governance
    * Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements.
    * Maintain data integrity, enforce governance standards, and promote best practices in data stewardship.
    * Support data security and compliance initiatives in coordination with IT and business teams.

    Continuous Improvement
    * Stay current with emerging data technologies and analytics practices.
    * Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.

    QUALIFICATIONS

    Required:
    * Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database.
    * Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools.
    * Proficiency in building BI solutions using Power BI and/or SSRS.
    * Strong data modeling and relational database design skills.
    * Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections).
    * Ability to translate business goals into data requirements and technical solutions.
    * Excellent communication and collaboration skills.
    * Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).

    Preferred:
    * Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks).
    * Familiarity with version control tools (Git, Azure DevOps) and Agile development practices.
    * Exposure to Python or PowerShell for data transformation or automation.
    * Experience integrating data from insurance or financial systems.

    Compensation: $120-129K

    This position is hybrid (3 days onsite), located in Overland Park, KS. We look forward to reviewing your application. We encourage everyone to apply, even if every box isn't checked for what you are looking for or what is required. PDSINC, LLC is an Equal Opportunity Employer.
    $120k-129k yearly 18d ago
  • Data Architect

    TEKsystems (4.4 company rating)

    Data engineer job in Kansas City, MO

    A company is seeking a highly skilled and strategic Data Architect to lead their data governance, security, and management initiatives. This senior role will be responsible for designing and implementing the organization's enterprise data architecture, ensuring that their data is secure, reliable, and accessible for business-critical functions. The ideal candidate is a proactive leader who can define data strategy, enforce best practices, and collaborate with cross-functional teams to align their data ecosystem with business goals.

    Skills
    Data architecture, data modeling, data warehouse, AWS, Azure, cloud, Lambda, artificial intelligence

    Top Skills Details
    Data architecture, data modeling, data warehouse, AWS, Azure, cloud

    Additional Skills & Qualifications
    * Bachelor's or master's degree in Computer Science, Information Technology, or a related technical field.
    * 10+ years of hands-on experience in data architecture, data modeling, and data governance, with a proven track record of designing and implementing complex data ecosystems. Experience working in regulated industries is a plus.
    * Proven experience (8+ years) designing and implementing enterprise-level data architectures.
    * Extensive experience with data modeling, data warehousing, and modern data platforms (e.g., cloud environments like AWS, Azure, or GCP).
    * Deep expertise in data modeling, data warehousing, database technologies (SQL, NoSQL), big data technologies (e.g., Spark), and modern cloud platforms (e.g., AWS, Azure, GCP).
    * Deep expertise in data governance and security principles, including regulatory compliance frameworks.
    * Strong knowledge of how to structure data for machine learning and AI workloads, including experience with MLOps platforms.
    * Hands-on experience with data classification and data cataloging tools (e.g., Collibra, Alation).
    * Excellent communication, interpersonal, and leadership skills, with the ability to influence and build consensus across the organization.
    * Professional certifications in data architecture, data governance, or cloud platforms preferred.
    * Experience with big data technologies (e.g., Hadoop, Spark) preferred.
    * Familiarity with data integration and ETL/ELT frameworks preferred.
    * Highly motivated self-starter with a strong sense of duty.
    * Continual learner, willing to attend workshops, seminars, etc., to maintain skills.
    * Mature critical thinking, analytical, and problem-solving skills, with the ability to troubleshoot and devise a course of corrective action.
    * Highly organized and efficient, with the ability to multitask and prioritize tasks appropriately.

    Experience Level: Expert

    Job Type & Location
    This is a contract-to-hire position based out of Kansas City, MO.

    Pay and Benefits
    The pay range for this position is $70.00 - $85.00/hr. Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections, plan, or program terms. If eligible, the benefits available for this temporary role may include the following:
    * Medical, dental & vision
    * Critical Illness, Accident, and Hospital
    * 401(k) Retirement Plan - pre-tax and Roth post-tax contributions available
    * Life Insurance (Voluntary Life & AD&D for the employee and dependents)
    * Short- and long-term disability
    * Health Savings Account (HSA)
    * Transportation benefits
    * Employee Assistance Program
    * Time Off/Leave (PTO, vacation, or sick leave)

    Workplace Type
    This is a fully onsite position in Kansas City, MO.

    Application Deadline
    This position is anticipated to close on Dec 22, 2025.

    About TEKsystems:
    We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe, and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at TEKsystems.com. The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information, or any characteristic protected by law.
    $70-85 hourly 5d ago
  • Principal Data Engineer

    Weavix Inc.

    Data engineer job in Lenexa, KS

    Job Description

    About the Role:
    weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission. You'll be the technical lead for everything data: building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently.

    What You'll Do:
    * Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity
    * Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable)
    * Enable data-driven decision-making across product, engineering, and business teams
    * Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling)
    * Ensure data quality, observability, governance, and security across all systems
    * Serve as the subject matter expert on data systems, operating as a senior IC without a team initially

    What You Bring:
    * 6+ years of experience in data engineering, ideally within a startup or high-growth environment
    * Proven ability to independently design, implement, and manage scalable data architectures
    * Deep experience working with large datasets, ideally from IoT sources or other high-volume systems
    * Proficiency with modern data tools and languages (e.g., TypeScript, Node.js, SQL)
    * Strong cloud experience, ideally with Microsoft Azure (AWS or GCP also acceptable)
    * A business-focused mindset with the ability to connect technical work to strategic outcomes
    * Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms
    * Excellent communication and collaboration skills across technical and non-technical teams

    Bonus Points For:
    * Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.)
    * Familiarity with BI tools and self-service analytics platforms
    * Background in system performance monitoring and observability tools

    Why weavix
    Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers, the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest asset: people. It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself and your family, and invest in your future.

    Perks and Benefits
    * Competitive compensation
    * Employee Equity Stock Program
    * Competitive benefits package, including medical, dental, vision, life, and disability insurance
    * 401(k) Retirement Plan + company match
    * Flexible Spending & Health Savings Accounts
    * Paid holidays
    * Flexible time off
    * Employee Assistance Program (EAP)
    * Other exciting company benefits

    About Us
    weavix, the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms enterprise by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results. weavix is the single source of truth for both workers and executives. Our mission is simple: to connect every disconnected worker through disruptive technology. How do you want to make your impact? For more information about us, visit weavix.com.

    Equal Employment Opportunity (EEO) Statement
    weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment.

    Americans with Disabilities Act (ADA) Statement
    weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************.

    E-Verify Notice
    weavix participates in the E-Verify program to confirm employment eligibility as required by law.
    $69k-92k yearly est. 25d ago
  • Data Engineer II

    27Global

    Data engineer job in Leawood, KS

    Full-time

    Description
    27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards. We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions. Joining 27Global as a Data Engineer is an exciting, high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.

    Your Role:
    * Participate in the design and implementation of scalable, secure, and high-performance data architectures.
    * Develop and maintain conceptual, logical, and physical data models.
    * Work closely with architects to define standards for data integration, quality, and governance.
    * Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
    * Support cloud-based data strategies, including data warehousing, pipelines, and real-time processing.
    * Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads.
    * Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling.
    * Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
    * Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.

    Requirements

    What You Bring:
    * BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or a related field.
    * 2-4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
    * 2-4 years of experience writing .NET code or other OOP languages in an Agile environment.
    * Demonstrated leadership skills, with the ability to collaborate with and lead onshore and offshore team members.
    * Proficient technical skills in Spark, Scala, C#, PySpark, Data Lake, Delta Lake, relational and NoSQL databases, AWS Glue, and Azure Synapse.
    * Experience with SQL, ETL/ELT, and data modeling.
    * Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with a data lake.
    * Knowledge of data governance, security, and compliance frameworks.
    * Ability to context switch and work on a variety of projects over specified periods of time.
    * Ability to work at the 27Global office in Leawood, KS, with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
    * Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
    * Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.

    Ways to Stand Out:
    * Certifications: AWS Solutions Architect, Azure Data Engineer, Databricks Data Engineer.
    * Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
    * Hands-on experience with big data tools (Spark, Kafka).
    * Modern data warehouses (Snowflake, Redshift, BigQuery).
    * Familiarity with machine learning pipelines and real-time analytics.
    * Strong communication skills and ability to influence stakeholders.
    * Prior experience implementing enterprise data governance frameworks.
    * Experience in a client-facing role, working directly with clients at multiple levels of the organization, often presenting and documenting suggestions and improvements to client environments.

    Why 27G?
    * Four-time award winner of Best Place to Work by the Kansas City Business Journal.
    * A casual and fun small-business work environment.
    * Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
    * Dedicated time for learning, development, research, and certifications.
    $69k-92k yearly est. 60d+ ago
  • Data Engineer III

    Spring Venture Group (3.9 company rating)

    Data engineer job in Kansas City, MO

    Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day. Job Description This person has the opportunity to work primarily remote in the Kansas City or surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area. We are unable to sponsor for this role, this includes international students. OVERVIEW The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high quality datasets that enable our business stakeholders and world-class Analytics department to make data informed decisions. Data engineers, combining Software Engineering and Database Engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance ensuring availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor. ESSENTIAL DUTIES The essential duties for this role include, but are not limited to: Serve as a primary advisor to Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks. 
* Build advanced data pipelines utilizing the medallion architecture to create high-quality, single-source-of-truth data sources in Snowflake.
* Architect replacements of current Data Management systems with respect to all aspects of data governance.
* Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores.
* Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves.
* Actively participate as a leader in regular team meetings, listening and assisting others at every chance for growth and development.
* Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores.
* Take ownership (both individually and as part of a team) of services and applications.
* Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements.
* Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests.
* Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
* Work with Project Managers, Solution Architects, and Software Development teams to build solutions for company initiatives on time, on budget, and on value.
* Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity.
* Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business.
* Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle from 8am-9pm.
* Follow and embrace procedures of both the Data Management team and the SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance.
* Support after-hours and weekend releases from our internal Software Development teams.
* Actively participate in code review and weekly technicals with a more senior engineer or manager.
* Assist departments with time-critical SQL execution and debug database performance problems.

ROLE COMPETENCIES: The competencies for this role include, but are not limited to: Emotional Intelligence, Drive for Results, Continuous Improvement, Communication, Strategic Thinking, and Teamwork and Collaboration.

Qualifications
POSITION REQUIREMENTS: The requirements to fulfill this position are as follows:
* Bachelor's degree in Computer Science or a related technical field.
* 4-7 years of practical production work in Data Engineering.
* Expertise in the Python programming language.
* Expertise in Snowflake.
* Expertise in SQL, databases, and query optimization.
* Experience with a large cloud provider such as AWS, Azure, or GCP.
* Advanced at reading code independently and understanding its intent.
* Advanced at writing readable, modifiable code that solves business problems.
* Ability to construct reliable and robust data pipelines to support both scheduled and event-based workflows.
* Experience working directly with stakeholders to create solutions.
* Experience mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact.
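The medallion (bronze/silver/gold) layering named in the duties above can be made concrete with a small, self-contained sketch. The snippet below simulates the pattern with Python's built-in sqlite3 standing in for Snowflake; every table name, column, and sample value is invented for illustration, not taken from Spring Venture Group's actual schema.

```python
import sqlite3

# Illustrative bronze/silver/gold "medallion" layering, simulated with
# sqlite3 in place of Snowflake. All names and values are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: raw records land as-is, duplicates and all.
cur.execute("CREATE TABLE bronze_leads (lead_id INTEGER, state TEXT, premium REAL)")
cur.executemany(
    "INSERT INTO bronze_leads VALUES (?, ?, ?)",
    [(1, "KS", 120.0), (1, "KS", 120.0), (2, "MO", 95.5), (3, "KS", 110.0)],
)

# Silver: deduplicated, cleaned single source of truth.
cur.execute("""
    CREATE TABLE silver_leads AS
    SELECT DISTINCT lead_id, state, premium FROM bronze_leads
""")

# Gold: business-level aggregate ready for analytics.
cur.execute("""
    CREATE TABLE gold_premium_by_state AS
    SELECT state, COUNT(*) AS leads, ROUND(AVG(premium), 2) AS avg_premium
    FROM silver_leads GROUP BY state
""")

for row in cur.execute("SELECT * FROM gold_premium_by_state ORDER BY state"):
    print(row)
```

Each layer is just a table derived from the previous one: bronze holds raw, possibly duplicated records; silver is the deduplicated single source of truth; gold holds the business-facing aggregate.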
Additional Information
Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements:
* Competitive compensation
* Medical, dental, and vision benefits after a short waiting period
* 401(k) matching program
* Life insurance, and short-term and long-term disability insurance
* Optional enrollment includes HSA/FSA, AD&D, spousal/dependent life insurance, Travel Assist, and a legal plan
* Generous paid time off (PTO) program starting at 15 days your first year
* 15 paid holidays (includes a holiday break between Christmas and New Year's)
* 10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave
* Annual Volunteer Time Off (VTO) and a donation matching program
* Employee Assistance Program (EAP) - health and well-being on and off the job
* Rewards and recognition
* Diverse, inclusive, and welcoming culture
* Training program and ongoing support throughout your Spring Venture Group career

Security Responsibilities:
* Operating in alignment with policies and standards
* Reporting security incidents
* Completing assigned training
* Protecting assigned organizational assets

Spring Venture Group is an Equal Opportunity Employer
    $75k-98k yearly est. 9h ago
  • Data Developer

    Veracity Consulting

    Data engineer job in Overland Park, KS

    Veracity Consulting, Inc. is an Information Technology Solutions Provider. We offer our clients value-added expertise in the development and use of Information Technology to expand and improve their organization's business processes. Currently, we are searching for a Data Developer to join our team in Overland Park, KS. Our team is instinctively curious. It's just how we're wired. That means empowering our people to see the big picture - to cut through the noise so we're not just treating the symptoms but finding the cure. Founded in 2006, Veracity is a team of problem-solvers and truth-tellers who deliver customized IT solutions for our customers. We bridge the gap between business and technology while always staying transparent and authentic - simply doing the right thing.

RESPONSIBILITIES:
* Write queries against various database management systems
* Manage tasks and time effectively

QUALIFICATIONS:
* Familiarity with at least one database management system such as SQL Server, MySQL, or MariaDB
* Data visualization work with Power BI, Tableau, or similar
* Strong SQL development (T-SQL; PL/SQL a plus)
* Python or another scripting language (C#, JavaScript, etc.)
* Understanding of data flow and the ETL process
* Regular and predictable attendance

To be considered an applicant for a position, you must: (1) complete the application in full; (2) apply for a specific, available position; and (3) meet all stated minimum qualifications. Applications that are incomplete or are submitted for "any" position will not be considered. Applications are good for 90 days. If you are not selected within 90 days of submission and remain interested in a position, you must submit a new application. Veracity Consulting provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran, and any other characteristics protected by law.
In addition to federal law requirements, Veracity Consulting complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. No 3rd parties, please.
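For the "data flow and the ETL process" item above, a minimal extract-transform-load pass can be sketched in pure Python; the CSV feed, cleanup rules, and table name are hypothetical and chosen only to keep the example self-contained.

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: the raw feed, transforms, and target table are
# invented for illustration.
raw = "order_id,amount\n1001, 25.00\n1002, 40.50\n1002, 40.50\n"

# Extract: parse the raw CSV feed.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types, trim whitespace, drop duplicate order ids.
seen, clean = set(), []
for r in rows:
    oid = int(r["order_id"])
    if oid in seen:
        continue
    seen.add(oid)
    clean.append((oid, float(r["amount"].strip())))

# Load: write the cleaned rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 65.5
```

Real pipelines swap the in-memory pieces for source systems and a warehouse, but the extract/transform/load shape stays the same.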
    $72k-95k yearly est. 60d+ ago
  • Data Engineer

    Quest Analytics

    Data engineer job in Overland Park, KS

    At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data. The Data Engineer will run daily operations of the data infrastructure and automate and optimize our data operations and data pipeline architecture, while ensuring active monitoring and troubleshooting. This hire will also support other engineers and analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. APPLY TODAY!

What you'll do:
* Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
* Review project objectives and determine the best technology for implementation. Implement best-practice standards for development, build, and deployment automation.
* Run daily operations of the data infrastructure and support other engineers and analysts on data investigations and operations.
* Monitor and report on all data pipeline tasks, working with the appropriate teams to take corrective action quickly in case of any issues.
* Work with internal teams to understand current processes and areas for efficiency gains.
* Write well-abstracted, reusable, and efficient code.
* Participate in training and/or mentoring programs as assigned or required.
* Adhere to the Quest Analytics Values and support a positive company culture.
* Respond to the needs and requests of clients and Quest Analytics management and staff in a professional and expedient manner.

What it requires:
* Bachelor's degree in computer science or a related field.
* 3 years of work experience with ETL, data operations, and troubleshooting, preferably in healthcare data.
* Proficiency with Azure ecosystems, specifically Azure Data Factory and ADLS.
* Strong proficiency in Python for scripting, automation, and data processing.
* Advanced SQL skills for query optimization and data manipulation.
* Experience with distributed data pipeline tools like Apache Spark, Databricks, etc.
* Working knowledge of database modeling (schema design) and data governance best practices.
* Working knowledge of libraries like pandas, NumPy, etc.
* Self-motivated and able to work in a fast-paced, deadline-oriented environment.
* Excellent troubleshooting, listening, and problem-solving skills.
* Proven ability to solve complex issues.
* Customer focused.

What you'll appreciate:
* Workplace flexibility - you choose between remote, hybrid, or in-office
* Company-paid employee medical, dental, and vision
* Competitive salary and success-sharing bonus
* Flexible vacation with no cap, plus sick time and holidays
* An entrepreneurial culture that won't limit you to a job description
* Being listened to, valued, appreciated - and having your contributions rewarded
* Enjoying your work each day with a great group of people

Apply TODAY! careers.questanalytics.com

About Quest Analytics: For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. Visa sponsorship is not available at this time.
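The "monitor and report on all data pipeline tasks ... take corrective action quickly" duty above typically reduces to retry-with-alerting wrappers around each task. Below is a minimal standard-library sketch; the flaky task, names, and retry thresholds are hypothetical, not Quest Analytics' actual implementation.

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

def run_with_retries(task, name, attempts=3, delay=0.1):
    """Run a pipeline task, logging each failure and retrying before escalating."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            logging.warning("task %s failed (attempt %d/%d): %s",
                            name, attempt, attempts, exc)
            time.sleep(delay)
    logging.error("task %s exhausted retries; escalate to on-call", name)
    raise RuntimeError(f"{name} failed after {attempts} attempts")

# Hypothetical flaky task: fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return 42

print(run_with_retries(flaky_extract, "extract"))  # 42
```

In production the `logging` calls would feed whatever alerting backend the team uses; the control flow (bounded retries, then escalation) is the part the duty describes.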
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming. Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression, or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discrimination or harassment. Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence, [email protected]. NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not working with outside agencies at this time. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable. We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
    $69k-92k yearly est. 51d ago
  • Senior Data Engineer

    Velocity Staff

    Data engineer job in Overland Park, KS

    Velocity Staff, Inc. is working with our client located in the Overland Park, KS area to identify a senior-level Data Engineer to join their Data Services team. The right candidate will apply their expertise in data warehousing, data pipeline creation/support, and analytical reporting, and will be responsible for gathering and analyzing data from several internal and external sources and designing a cloud-focused data platform for analytics and business intelligence that reliably provides data to our analysts. This role requires a significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams.

Responsibilities
* Work with data architects to understand current data models and build pipelines for data ingestion and transformation.
* Design, build, and maintain a framework for pipeline observation and monitoring, focusing on the reliability and performance of jobs.
* Surface data integration errors to the proper teams, ensuring timely processing of new data.
* Provide technical consultation for other team members on best practices for automation, monitoring, and deployments.
* Provide technical consultation for the team on "infrastructure as code" best practices: building deployment processes utilizing technologies such as Terraform or AWS CloudFormation.

Qualifications
* Bachelor's degree in computer science, data science, or a related technical field, or equivalent practical experience
* Proven experience with relational and NoSQL databases (e.g. Postgres, Redshift, MongoDB, Elasticsearch)
* Experience building and maintaining AWS-based data pipelines; technologies currently utilized include AWS Lambda, Docker/ECS, and MSK
* Mid/senior-level development utilizing Python (pandas/NumPy, Boto3, SimpleSalesforce)
* Experience with version control (git) and peer code reviews
* Enthusiasm for working directly with customer teams (business units and internal IT)

Preferred but not required qualifications include:
* Experience with data processing and analytics using AWS Glue or Apache Spark
* Hands-on experience building data-lake-style infrastructures using streaming data technologies (particularly Apache Kafka)
* Experience with data processing using Parquet and Avro
* Experience developing, maintaining, and deploying Python packages
* Experience with Kafka and the Kafka Connect ecosystem
* Familiarity with data visualization techniques using tools such as Grafana, Power BI, Amazon QuickSight, and Excel

Not ready to apply? Connect with us to learn about future opportunities.
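The "framework for pipeline observation and monitoring" responsibility above can be illustrated with a small structured-events sketch: each pipeline step emits a status/duration event that a monitoring backend (for example, the Grafana dashboards the posting mentions) could consume. The step names and event schema here are invented for the example.

```python
import json
import time
from contextlib import contextmanager

# Collected observability events; a real pipeline would ship these to a
# metrics/monitoring backend instead of a list.
events = []

@contextmanager
def observed_step(name):
    """Wrap a pipeline step, recording its status and duration."""
    start = time.perf_counter()
    status = "error"
    try:
        yield
        status = "ok"
    finally:
        events.append({"step": name, "status": status,
                       "duration_s": round(time.perf_counter() - start, 4)})

# Hypothetical two-step pipeline.
with observed_step("ingest"):
    records = [{"id": 1}, {"id": 2}]

with observed_step("transform"):
    records = [{**r, "valid": True} for r in records]

print(json.dumps(events, indent=2))
```

A failing step still emits an event (with `status: "error"`) before the exception propagates, which is what lets integration errors be surfaced to the proper teams.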
    $69k-92k yearly est. Auto-Apply 20d ago
  • Corporate Treasury Data & Risk Analytics

    Capitol Federal Savings Bank (4.4 company rating)

    Data engineer job in Overland Park, KS

    We are seeking a driven and analytically minded professional to join our Corporate Treasury team. This individual will play a key role supporting asset/liability management, liquidity management, budgeting & forecasting, data analytics, and performance analysis/reporting. In this role, you will work closely with senior and executive leadership to deliver strategic financial insights, optimize business performance, support and influence decision-making, uncover data-driven stories, and challenge existing processes with fresh, innovative thinking. Essential Duties & Responsibilities Responsibilities will be tailored to the experience and skillset of the selected candidate and may include: * Developing and enhancing financial models and simulations * Supporting forecasting, liquidity, and ALM analytics * Conducting "what-if" scenario analysis and presenting actionable insights * Building dashboards, reporting tools, and performance summaries * Driving or contributing to process improvement initiatives * Collaborating cross-functionally with senior leaders across the organization Experience & Knowledge * Financial modeling and earnings simulation experience using risk/performance management tools * Designing and developing mathematical or statistical models to support strategic decision-making and risk management * Experience running scenario analysis and synthesizing insights for executive audiences * Familiarity with financial asset/liability instruments, market instruments, and their interactions * Experience with Funds Transfer Pricing (FTP) and capital allocation is a plus * Demonstrated success driving effective process improvements Education * Bachelor's degree in Accounting, Finance, or a related field required CapFed is an equal opportunity employer.
    $62k-76k yearly est. Auto-Apply 29d ago
  • Salesforce Data 360 Architect

    Slalom (4.6 company rating)

    Data engineer job in Kansas City, MO

    Who You'll Work With In our Salesforce business, we help our clients bring the most impactful customer experiences to life and we do that in a way that makes our clients the hero of their transformation story. We are passionate about and dedicated to building a diverse and inclusive team, recognizing that diverse team members who are celebrated for bringing their authentic selves to their work build solutions that reach more diverse populations in innovative and impactful ways. Our team is comprised of customer strategy experts, Salesforce-certified experts across all Salesforce capabilities, industry experts, organizational and cultural change consultants, and project delivery leaders. As the 3rd largest Salesforce partner globally and in North America, we are committed to growing and developing our Salesforce talent, offering continued growth opportunities, and exposing our people to meaningful work that aligns to their personal and professional goals. We're looking for individuals who have experience implementing Salesforce Data Cloud or similar platforms and are passionate about customer data. The ideal candidate has a desire for continuous professional growth and can deliver complex, end-to-end Data Cloud implementations from strategy and design, through to data ingestion, segment creation, and activation; all while working alongside both our clients and other delivery disciplines. Our Global Salesforce team is looking to add a passionate Principal or Senior Principal to take on the role of Data Cloud Architect within our Salesforce practice. 
What You'll Do: Responsible for business requirements gathering, architecture design, data ingestion and modeling, identity resolution setup, calculated insight configuration, segment creation and activation, end-user training, and support procedures Lead technical conversations with both business and technical client teams; translate those outcomes into well-architected solutions that best utilize Salesforce Data Cloud and the wider Salesforce ecosystem Ability to direct technical teams, both internal and client-side Provide subject matter expertise as warranted via customer needs and business demands Build lasting relationships with key client stakeholders and sponsors Collaborate with digital specialists across disciplines to innovate and build premier solutions Participate in compiling industry research, thought leadership and proposal materials for business development activities Experience with scoping client work Experience with hyperscale data platforms (ex: Snowflake), robust database modeling and data governance is a plus. What You'll Bring: Have been part of at least one Salesforce Data Cloud implementation Familiarity with Salesforce's technical architecture: APIs, Standard and Custom Objects, APEX. Proficient with ANSI SQL and supported functions in Salesforce Data Cloud Strong proficiency toward presenting complex business and technical concepts using visualization aids Ability to conceptualize and craft sophisticated wireframes, workflows, and diagrams Strong understanding of data management concepts, including data quality, data distribution, data modeling and data governance Detailed understanding of the fundamentals of digital marketing and complementary Salesforce products that organizations may use to run their business. Experience defining strategy, developing requirements, and implementing practical business solutions. 
Experience in delivering projects using Agile-based methodologies Salesforce Data Cloud certification preferred Additional Salesforce certifications like Administrator are a plus Strong interpersonal skills Bachelor's degree in a related field preferred, but not required Open to travel (up to 50%) About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this role, we are hiring at the following levels and salary ranges: East Bay, San Francisco, Silicon Valley: Principal: $145,000-$225,000 San Diego, Los Angeles, Orange County, Seattle, Boston, Houston, New Jersey, New York City, Washington DC, Westchester: Principal: $133,000-$206,000 All other locations: Principal: $122,000-$189,000 In addition, individuals may be eligible for an annual discretionary bonus. 
Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time. We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************. We will accept applications until December 31, 2025 or until the position is filled. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team or contact ****************************** if you require accommodations during the interview process.
    $145k-225k yearly Easy Apply 53d ago
  • Junior Data Engineer (Leawood, KS & Arlington, VA)

    Torch.AI

    Data engineer job in Leawood, KS

    Become Part of a Meaningful Mission Torch.AI is a defense-focused AI-software company on a mission to become the leading provider of critical data infrastructure for U.S. Defense and National Security. We deliver advanced AI and data software capabilities directly to customer mission owners to meet flexible, user-defined specifications and enable a decision advantage for the warfighter. We're passionate about solving complex problems that improve national security, support our warfighters, and protect our nation. Join us in our mission to help organizations Unlock Human Potential. The U.S. defense and national security industry offers an unparalleled opportunity to contribute to the safety and well-being of the nation while engaging with cutting-edge technologies. As a vital sector that shapes global stability, it offers a dynamic environment to tackle complex challenges across multidisciplinary domains. With substantial investment in innovation, the industry is at the forefront of developing AI, autonomous systems, and advanced national security solutions, each founded on the premise that information is the new battlefield. If this type of work is of interest, we'd love to hear from you. The Environment: Unlock Your Potential As a Junior Data Engineer at Torch.AI, you will be at the forefront of building software that scales across Torch.AI's platform capabilities. Your software will be deployed across an array of operational and research & development efforts for mission-critical customer programs and projects. Each of our customers requires unique technical solutions to enable an asymmetric advantage on the battlefield. Torch.AI's patented software helps remove common obstacles such as manual-intensive data processing, parsing, and analysis, thereby reducing cognitive burden for the warfighter. Our end-to-end data processing, orchestration, and fusion platform supports a wide variety of military use cases, domains, operations, and echelons. 
Customers enjoy enterprise-grade capabilities that meet specialized needs. Torch.AI encourages company-wide collaboration to share context, skills, and expertise across a variety of tools, technologies, and development practices. You'll work autonomously while driving coordinated, collaborative decisions across cross-functional teams comprised of defense and national security experts, veterans, business leaders, and experienced software engineers. Your code will advance back-end data orchestration and graph-compute capabilities to deliver elegant data and intelligence products. You will have the opportunity to harden and scale existing platform capabilities, tools, and technologies, while also working to innovate and introduce new iterative capabilities and features which benefit our company and customers. Successful candidates thrive in a fast-paced, entrepreneurial, and mission-driven environment. We hire brilliant patriots. You'll be encouraged to think creatively, challenge conventional thinking, and identify alternative approaches for delivering value to customers across complex problem sets. Your day-to-day will vary, adapting to the requirements of our customers and the technical needs of respective use cases. One day, you may be supporting the development of a new proof of capability concept for a new customer program; another you may be focused on optimizing system performance to help scale a production deployment; the next you may be working directly with customers to understand their requirements with deep intellectual curiosity. Our flat operating model puts every employee at the forefront of our customers' missions. We value customer intimacy, unique perspectives, and dedication to delivering lasting impact and results. You'll have the opportunity to work on the frontlines of major customer programs and influence lasting success for Torch.AI and your teammates. 
You'll have the opportunity to gain experience across a wide range of projects and tasks, from designing and demonstrating early capabilities and prototypes to deploying large-scale mission systems. You'll contribute directly to Torch.AI's continued position as a market leader for data infrastructure AI and compete against multi-billion-dollar incumbents and high-tech AI companies. Responsibilities Assist in developing and maintaining data pipelines to ingest, transform, and store data from diverse sources. Learn and contribute to version control, testing, and CI/CD best practices. Document application behavior and pipeline logic using tools like Confluence and GitHub. Collaborate with senior engineers to support ETL jobs, monitor data flows, and resolve data quality issues. Participate in API integration and the consumption of RESTful services. Translate data between formats such as JSON, CSV, Parquet, and Avro. Work with SQL and NoSQL databases to query, structure, and analyze data. Write scripts and services (primarily in Java or Python) for data parsing, enrichment, and movement across systems. What We Value B.S. degree in Computer Science or a related field. 1+ years of experience in software engineering, data engineering, or a related field (internships and academic projects count). Proficiency with Java (or another object-oriented language); familiarity with Python is a plus. Basic understanding of databases (e.g., PostgreSQL, MongoDB, Elasticsearch) and data querying using SQL. Exposure to data formats like JSON and experience with basic data transformation tasks. Familiarity with version control systems (Git) and a willingness to learn CI/CD workflows. Interest in pursuing a career in the defense industry and/or intelligence community. Entrepreneurial mindset. Awareness of ethical considerations and responsible AI practices. Capability to work collaboratively in interdisciplinary teams. 
Excellent problem-solving skills, attention to detail, and ability to thrive in a fast-paced, collaborative environment. Exposure to data orchestration or integration tools (e.g., Apache NiFi, Airflow) (not required, nice to have). Familiarity with data messaging or streaming platforms (e.g., Kafka) (not required, nice to have). Understanding of ETL design principles and data validation techniques (not required, nice to have). Interest in security-focused data applications, including identity and access management (IAM) systems (not required, nice to have). Security Clearance We are hiring for multiple positions for each role. Some roles require a Secret, Top Secret, or Top Secret/SCI Security Clearance on Day 1. If you do not currently hold a clearance but are interested in this role and believe you are eligible for a clearance, we still encourage you to submit an application. Work Location We are hiring for roles at our headquarters in Leawood, KS and remotely in the Arlington, VA region. Candidates in the Arlington, VA region may work remotely while not on customer site. Candidates in the Leawood, KS region may require some limited travel to customer sites. Incentives Equity: All employees are eligible to participate in the company equity incentive program within their first 12 months of employment. We are proud that 100% of our employees are equity-owning partners at Torch.AI. Competitive salary and annual performance bonus opportunities. Unlimited PTO. 11 paid holidays each year. Incredible professional development and learning opportunities in a fast-paced high-tech environment and exciting industry. Weekly in-office catering in our Leawood HQ. Benefits Torch.AI values employee well-being and, in turn, offers exceptional benefits options which greatly exceed regional and national averages for similar companies. 401k Plan Torch.AI offers a 401k plan through John Hancock. 
While the company does not offer employee matching, we offer 3% profit sharing for all employees who elect to participate in the 401k plan. Profit sharing is calculated based on company performance at the end of each calendar year and distributed to 401k accounts at the start of each calendar year. Medical Three medical options: PPO, HSA, and TRICARE. Torch.AI's HSA contribution is 250%-350% higher than average employer contributions in the Kansas City and Arlington regions. Only ~18% of employers offer TRICARE Supplement plans. Spending Accounts Above-market employer funding and flexibility. HSA: Triple-tax advantage. FSA: $50-$3,300 annual contribution, $660 rollover. Dependent Care FSA: $100-$5,000, pre-tax savings on child/dependent care. Dental High Plan annual maximum is ~2.6x higher than the national average. High Renaissance Plan: $5,000 annual max, 50% ortho up to $1,000. Low Renaissance Plan: $1,000 annual max, no ortho. Vision Frame allowance is 25-35% higher than typical employer-sponsored plans. Vision through Renaissance with VSP Choice network: $0 exams, lenses covered in full, and $180 frame allowance. Life Insurance Employer-paid 1x base salary and additional voluntary options for employees and spouses, compared to most employers who only cover $50k basic life on average. Disability & Illness Torch.AI ranks in the top 10% of regional employers for disability benefits. Short-Term Disability (employer paid): 60% income, up to $2,000/week. Long-Term Disability (employer paid): 60% income, up to $5,000/month. Voluntary Benefits Robust voluntary plans offer direct cash payout flexibility and wellness incentives. Accidental Insurance. Critical Illness. Hospital Indemnity. Commuter Benefits: up to $300/month tax-free for transit/parking. Torch.AI is an Equal Opportunity/Affirmative Action Employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, national origin, protected veteran status or status as an individual with a disability. These positions are being reviewed and filled on a rolling basis, and multiple openings may be available for each role. JOB CODE: 1000107
    $50k yearly 60d+ ago
  • Data Analytics Engineer

    Emprise Bank 4.5company rating

    Data engineer job in Kansas City, MO

At Emprise Bank, everything we do is focused on empowering people to thrive. We proudly work to provide an extraordinary customer experience to help our customers achieve their goals. We are currently seeking a Data Analytics Engineer to join our team in Wichita, KS or Kansas City, MO. As a Data Analytics Engineer, you'll be responsible for administrative, technical, and professional work within the technology department. This role will work on-site in Wichita, KS with hybrid scheduling. For candidates in the Kansas City metro area, the role will be remote. A successful candidate will have: * Confident and articulate communication skills * Initiative and a strong work ethic * A strategic mindset * A demonstrated ability to make sense of complex and sometimes contradictory information to effectively solve problems * Strong attention to detail * The ability to work both independently and collaboratively * An understanding of and commitment to our values Essential functions of the role: * Demonstrate a strong understanding of privacy and security principles, particularly as they pertain to data pipelines and datasets * Develop, test, and maintain data pipelines supporting business intelligence functions * Collaborate with data analysts to create and optimize datasets for data analysis and reporting * Maintain documentation of data pipelines, workflows, and data dictionaries for internal reference and ensure it is aligned with data governance protocols * Develop code for the business logic of operational pipelines, ensuring that data processing aligns with business requirements * Serve as a liaison for data analysts and data engineers, facilitating communication and collaboration between the two positions to ensure that data needs are met efficiently * Implement robust data quality checks and transformation processes to ensure data accuracy and consistency * Other duties as assigned within the scope of the role Requirements * Bachelor's degree in a quantitative field * 
Experience with data transformation tooling * Experience with BI tools (Tableau, Power BI, QlikView, etc.) * Proficiency with Python and SQL is required * Strong communication skills and the ability to work with business teams to define metrics, elicit requirements, and explain technical issues to non-technical associates * Proficiency in PySpark is preferred * Experience in Azure Cloud is preferred * Experience with SQL Server databases is preferred * Familiarity with medallion data architecture is preferred * Experience with Data Factory and Databricks is preferred Benefits In addition to a competitive salary and benefits, Emprise offers professional growth, a rewarding and challenging environment, opportunities to be involved in our communities, and a culture of integrity, passion, and success. We also offer shift differential pay for bilingual candidates! At Emprise Bank, empowering people to thrive means having an all-inclusive culture that honors our commitment to all dimensions of diversity in our workforce and embraces inclusion of all people. People of color, women, LGBTQIA+, veterans, and persons with disabilities are encouraged to apply. To learn more, please visit our website at ******************** Emprise Bank is an EEO/AA/ADA/Veteran Employer/Member FDIC/Drug Free Workplace. Emprise Bank participates in E-Verify and will provide your Form I-9 to the federal government to confirm authorization to work in the United States.
    $88k-107k yearly est. 3d ago
  • Sr. Systems Analyst

    Right Talent Right Now

    Data engineer job in Overland Park, KS

Sr. Systems Analyst, 1-year contract, Kansas City, MO. Primary Job Responsibilities • Identify & evaluate technology/infrastructure project (server hardware & software upgrades, application migrations, application retirements, platform changes, user workstation hardware & software deployments) impacts through a variety of techniques including technology research, interviews, document and system analysis, workshops, surveys, user shadowing, business process descriptions, use cases, scenarios, business analysis, and/or task and workflow analysis. • Critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, abstract up from low-level information to a general understanding, and distinguish user requests from the underlying true needs. • Utilize prior experience with enterprise-wide requirements definition and management systems and methodologies. • Successfully support multiple project initiatives simultaneously, along with business case evaluation and creation for upcoming projects. • Strong analytical skills required, including a thorough understanding of how to interpret project scope and translate it into application and operational requirements. • Excellent verbal and written communication skills and the ability to interact professionally with a diverse group of executives, managers, and subject matter experts within both business and technical teams. • Develop requirements specifications according to standard templates, using natural language. • Collaborate with developers and subject matter experts to establish the technical vision and analyze tradeoffs between usability and performance needs. • Be the liaison between the business units, technology teams, and support teams. • Strong knowledge of systems, interfaces, and environments. • Ability to assess and understand technology changes. 
- Planning and Monitoring (10%) - Effectively apply methodologies and enforce project process and standards. Responsible for meeting project deadlines for planned analysis activities. Effectively communicate relevant project information to project team and superiors. - Elicitation (20%) - Elicit and define requirements (Functional Specifications) using standard analysis techniques. Conduct interviews with users to identify possible business impacts of upgrade, analyze existing procedures and application/interface/environment documentation, and evaluate possible interface and environment impacts of change. - Analysis (50%) - Act as business and technical representative for requirements while leading discovery sessions with technical (development and testing) teams. Document requirements for use by technical teams for creating design documentation and test cases. Consult with business on project scope and requirements as needed. - Communication and Management (20%) - Act as a liaison between the business and technical IT teams. Translate information from IT to the business in easily understood terms. Partner with IT to accomplish goals and provide ongoing communication to various business stakeholders on project status and issues. 
Qualifications • 6+ years of experience with technology/infrastructure projects • 4+ years of experience documenting or consuming technical project requirements • High School diploma required • This position requires incumbents to regularly sit at a desk and operate standard office equipment, such as a computer and phone • Employee is regularly required to move equipment, up to 10 pounds, such as a computer and/or phone, to facilitate in-person meetings • Specific vision abilities required by this job include close vision and the ability to adjust focus • Must be able to talk and hear • Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions Preferred Experience • Bachelor's degree in Computer Information Systems or Computer Science • Excellent verbal and written communication skills • Excellent listener • Solid analytical skills required, including a thorough understanding of how to interpret needs and translate them into technical requirements • Extensive experience with corporate technology and infrastructure • Ability to effectively and efficiently plan and lead meetings and complete necessary follow-up • Ability to understand and manage to approved project scope • Critical thinking skills - using logic and reason to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems • Independent thinker - requires guiding oneself with little direct supervision and depending on oneself to get things done • Experience working in a matrix organization for successful project delivery • Experience working on projects with integrated systems and the related complexity of requirements traceability between systems • ITIL and SDLC knowledge. Additional Information All your information will be kept confidential according to EEO guidelines.
    $79k-103k yearly est. 60d+ ago
  • Data Engineer II

    27Global

    Data engineer job in Leawood, KS

Job Description: 27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards. We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions. Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation. Your Role: Participate in the design and implementation of scalable, secure, and high-performance data architectures. Develop and maintain conceptual, logical, and physical data models. Work closely with architects to define standards for data integration, quality, and governance. Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs. Support cloud-based data strategies including data warehousing, pipelines, and real-time processing. Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads. Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling. Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks. Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends. 
Requirements: What You Bring: BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or related field. 2 - 4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production. 2 - 4 years of experience writing .Net code or other OOP languages in an Agile environment. Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members. Proficient technical skills in: Spark, Scala, C#, PySpark, Data Lake, Delta Lake, Relational and NoSQL Databases, AWS Glue and Azure Synapse Experience with SQL, ETL/ELT, and data modeling. Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with data lake. Knowledge of data governance, security, and compliance frameworks. Ability to context switch and work on a variety of projects over specified periods of time. Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices. Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less. Legal authorization to work in the United States and the ability to prove eligibility at the time of hire. Ways to Stand Out: Certifications: AWS Solution Architect, Azure Data Engineer, Databricks Data Engineer Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics. Hands-on experience with big data tools (Spark, Kafka). Modern data warehouses (Snowflake, Redshift, BigQuery). Familiarity with machine learning pipelines and real-time analytics. Strong communication skills and ability to influence stakeholders. Prior experience implementing enterprise data governance frameworks. 
Experience in a client-facing role, working directly with clients from multiple levels of the organization; often presenting and documenting client environment suggestions and improvements. Why 27G?: Four-time award winner of Best Place to Work by the Kansas City Business Journal. A casual and fun small business work environment. Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential. Dedicated time for learning, development, research, and certifications.
    $69k-92k yearly est. 3d ago
  • Data Engineer

    Quest Analytics

    Data engineer job in Overland Park, KS

At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data. The Data Engineer will run daily operations of the data infrastructure, automate and optimize our data operations and data pipeline architecture while ensuring active monitoring and troubleshooting. This hire will also support other engineers and analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. APPLY TODAY! What you'll do: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Review project objectives and determine the best technology for implementation. Implement best practice standards for development, build, and deployment automation. Run daily operations of the data infrastructure and support other engineers and analysts on data investigations and operations. Monitor and report on all data pipeline tasks while working with appropriate teams to take corrective action quickly in case of any issues. Work with internal teams to understand current processes and areas for efficiency gains. Write well-abstracted, reusable, and efficient code. Participate in training and/or mentoring programs as assigned or required. Adhere to the Quest Analytics values and support a positive company culture. Respond to the needs and requests of clients and Quest Analytics management and staff in a professional and expedient manner. What it requires: Bachelor's degree in computer science or a related field. 
3 years of work experience with ETL, data operations, and troubleshooting, preferably in healthcare data. Proficiency with the Azure ecosystem, specifically Azure Data Factory and ADLS. Strong proficiency in Python for scripting, automation, and data processing. Advanced SQL skills for query optimization and data manipulation. Experience with distributed data pipeline tools like Apache Spark, Databricks, etc. Working knowledge of database modeling (schema design and data governance best practices). Working knowledge of libraries like Pandas, NumPy, etc. Self-motivated and able to work in a fast-paced, deadline-oriented environment. Excellent troubleshooting, listening, and problem-solving skills. Proven ability to solve complex issues. Customer focused. What you'll appreciate: • Workplace flexibility - you choose between remote, hybrid, or in-office • Company-paid employee medical, dental, and vision • Competitive salary and success sharing bonus • Flexible vacation with no cap, plus sick time and holidays • An entrepreneurial culture that won't limit you to a job description • Being listened to, valued, appreciated -- and having your contributions rewarded • Enjoying your work each day with a great group of people Apply TODAY! careers.questanalytics.com About Quest Analytics: For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. Visa sponsorship is not available at this time. 
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming. Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation, gender identity or expression, or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discrimination or harassment. Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence, ********************* NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with outside agencies at this time. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable. We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
    $69k-92k yearly est. 21d ago
  • Data Scientist - Retail Pricing

    Capitol Federal Savings Bank 4.4company rating

    Data engineer job in Overland Park, KS

    We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. This role sits at the intersection of analytics, finance, and product strategy - transforming data into pricing intelligence that supports smarter, faster business decisions. Will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing, will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities. Collaborating across product, finance, and executive teams, will translate complex analytical findings into clear business recommendations that drive strategic action. Will also contribute to enhancing our analytics infrastructure - improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making. Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration CapFed is an equal opportunity employer.
    $66k-82k yearly est. Auto-Apply 10d ago
  • Adobe Real-Time Customer Data Platform (RT-CDP) Architect

    Slalom 4.6company rating

    Data engineer job in Kansas City, MO

Who You'll Work With The Adobe team drives strategic direction and solution enablement in support of Marketing Teams. We accelerate innovation and learning, and advance sales and delivery excellence with high-caliber Marketing Technology solutions including Adobe technology expertise. Our focus is four go-to-market solution areas: Experience and Content Management with a focus on Content Supply Chain and Digital Asset Management; Personalized Insights and Engagement with a focus on Analytics, Customer Data Platforms, and Journey Orchestration; Digital Commerce with a focus on Experience Led Commerce and Product Information Management; and Marketing Operations and Workflow with a focus on resource management, reporting, and approvals of the content and data required to run Personalization and Campaigns at Scale. We are seeking a talented Adobe RT-CDP Architect to join our team as a senior consultant or principal. This is a client-facing role that involves close collaboration with both technical and non-technical stakeholders. 
What You'll Do Implement, configure, and enable Adobe Customer Data Platform (CDP) Provide the technical design and data architecture for configuring RT-CDP to meet clients' business goals Responsible for understanding business problems and capturing client requirements by leading effective conversations with business and technical client teams Interpret how best to apply the out-of-the-box product to provide a solution, including finding alternative approaches that best leverage the platform Provide analytics domain expertise, consultation, and troubleshooting Learn new platforms, new capabilities, and new clouds to stay on top of the ever-growing product ecosystem (CJA, AJO, Marketo) What You'll Bring Expertise in configuration, implementation, and integration of the RT-CDP product without significant help from others Knowledge of, and experience with, RT-CDP B2C, B2B, and/or B2P Knowledge of how RT-CDP works with other Adobe Experience Platform products Experience implementing and driving success with RT-CDP for enterprise clients in an architecture role Proficiency with manipulating, structuring, and merging data from different data sources and understanding of typical data sources within an enterprise environment Knowledge of how graph stitching, profile merge rules, profile collapsing, and householding concepts work in RT-CDP Ability to translate business rules into technical requirements and implement those requirements Proficiency with data transformation, API-based integrations, and JavaScript tagging Experience working with SQL, R, and/or Python preferred Enterprise experience designing multi-solution architecture Strong communication skills and a passion for learning new technologies and platform capabilities Ability to build strong relationships with clients and understand them from a business and strategic perspective Occasional travel as needed by client About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring 
more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this role, we are hiring at the following levels and salary ranges: East Bay, San Francisco, Silicon Valley: Senior Consultant: $131,000-$203,000 Principal: $145,000-$225,000 San Diego, Los Angeles, Orange County, Seattle, Boston, Houston, New Jersey, New York City, Washington DC, Westchester: Senior Consultant: $120,000-$186,000 Principal: $133,000-$206,000 All other locations: Senior Consultant: $110,000-$171,000 Principal: $122,000-$189,000 In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time. We will accept applicants until December 12, or until the position is filled. 
We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team or contact ****************************** if you require accommodations during the interview process. #LI-KM
    $145k-225k yearly Easy Apply 25d ago

Learn more about data engineer jobs

How much does a data engineer earn in Overland Park, KS?

The average data engineer in Overland Park, KS earns between $60,000 and $105,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Overland Park, KS

$79,000

What are the biggest employers of Data Engineers in Overland Park, KS?

The biggest employers of Data Engineers in Overland Park, KS are:
  1. Quest Analytics
  2. 27Global
  3. Tyler Technologies
  4. W. R. Berkley
  5. Conexess Group
  6. PDS
  7. CARE
  8. Advanced Technologies Group
  9. SelectQuote Insurance Services
  10. Kforce