Data engineer jobs in Overland Park, KS

- 275 jobs
  • NMC_000344 - SAN Engineer (US Citizen Only)

    New Millenium Consulting (3.7 company rating)

    Data engineer job in Kansas City, MO

    One of our clients in Kansas City is urgently looking for a SAN Engineer. Scope: The customer needs support services to reconfigure a Dell Unity SAN, including installation of new drives, RAID configuration, encryption, and connection to a VMware cluster as traditional datastores. The customer wants to treat them like traditional VMware datastores, not vSAN. The array needs the new drives installed, RAID configured, and the entire array encrypted. They have 6 SSDs and 11 10,000 rpm traditional hard drives. Note from the client: There are currently 4 600 GB drives in the unit. They have 6 960 GB drives (p/n SDFSU76EXB02T) and 11 1.8 TB drives (p/n 1XJ233-031) that need to be added to the storage unit and configured. They believe they want to attach the array to their 3-node ESXi cluster for allocation as a vSAN datastore or datastores. Must-Have skills: experience installing and configuring SANs; experience with Dell Unity SAN; expertise with datastores. VMware configuration experience is a plus.
    $73k-102k yearly est. 5d ago
  • Software Engineer

    Inceed (4.1 company rating)

    Data engineer job in Kansas City, MO

    Software Engineer Compensation: $90,000 - $100,000 annually, depending on experience Inceed has partnered with a great company to help find a skilled Software Engineer to join their team! Join a dynamic team in a collaborative environment, where innovation meets creativity. This opportunity arises as the team is expanding due to the retirement of a valued developer. Work in an environment and contribute to exciting new projects and legacy product updates. Key Responsibilities & Duties: Collaborate with QA and UX teams to meet business needs Provide technical support and troubleshoot current systems Develop new product features using coding standards Participate in code reviews and knowledge sharing Create and maintain documentation on system architecture Required Qualifications & Experience: 3+ years of experience in back-end services and API development Proficiency in C#, .NET Core, and Node.js Experience with relational databases like SQL Server or MySQL Nice to Have Skills & Experience: Experience with cloud development (Azure, AWS) Containerization experience with Docker and Kubernetes Other Information: Work in a hybrid environment, with Mondays and Fridays from home Collaborative team with a flexible and innovative work culture Potential to work on diverse development projects If you are interested in learning more about the Software Engineer opportunity, please submit your resume for consideration. Our client is unable to provide sponsorship at this time. We are Inceed, a staffing direct placement firm who believes in the possibility of something better. Our mission is simple: We're here to help every person, whether client, candidate, or employee, find and secure what's better for them. Inceed is an equal opportunity employer. Inceed prohibits discrimination and harassment of any type and affords equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law. #IND
    $90k-100k yearly 1d ago
  • Senior Dotnet Developer

    Vaiticka Solution

    Data engineer job in Kansas City, MO

    Client is looking for an innovative and modernization-minded engineer to re-envision our entire client experience. This person will join a collaborative and design-driven team, with talent and tenure to thrive on. They will assist in leading the development, implementation, and management of technology-based business solutions. This person will design software applications to meet both functional and technical requirements for the client experience team at a high level. The ideal engineer will possess the ability to prioritize well, communicate clearly, have a consistent track record of delivery and excellent software engineering skills. They will be able to adapt to new tech and methodologies, as needed. Duties: Participate in all phases of the SDLC, including requirements analysis, design, development, testing, deployment, and maintenance. Develop dynamic and responsive user interfaces using Angular, TypeScript, HTML, CSS, and related front-end frameworks and libraries. Design and develop robust and scalable back-end services and APIs using C#, ASP.NET MVC, .NET Core, and Web API. Integrate front-end applications with back-end APIs. Work with SQL Server databases to design schemas, write queries, and manage data. Write clean, well-documented, and testable code. Perform code reviews, refactor code, and ensure adherence to coding standards and best practices. Implement unit and integration tests to ensure software quality. Provide technical guidance and mentorship to junior developers, sharing knowledge and promoting best practices. Percentage of time spent on duties will be as follows: Software development including database design, solution architecture, and project planning - 80% Production and incident support - 20% Required Skills: 9+ years extensive experience in .NET development, particularly with MVC, Angular, and C#. Proficiency in Angular and related front-end technologies (TypeScript, HTML, CSS, JavaScript). Strong understanding: of object-oriented programming (OOP) principles and design patterns. Extensive experience with database systems and SQL. Familiarity with Git version control system. Knowledge of Azure cloud platforms.
    $77k-101k yearly est. 2d ago
  • Data Engineer

    Tyler Technologies (4.3 company rating)

    Data engineer job in Overland Park, KS

    Description The Data Engineer role within the Cloud Payments Reporting team is responsible for developing, deploying, and maintaining data solutions. Primary responsibilities will include querying databases, maintaining data transformation pipelines through ETL tools and systems into data warehouses, and providing and supporting data visualizations, dashboards, and ad hoc and customer-deliverable reports. Responsibilities Provide primary support for existing databases, automated jobs, and reports, including monitoring notifications, performing root cause analysis, communicating findings, and resolving issues. Serve as the Subject Matter Expert for payments reports, databases, and processes. Ensure data and report integrity and accuracy through thorough testing and validation. Build analytical tools to utilize the data, providing actionable insight into key business performance metrics including operational efficiency. Implement, support, and optimize ETL pipelines, data aggregation processes, and reports using various tools and technologies. Collaborate with operational leaders and teams to understand reporting and data usage across the business and provide efficient solutions. Participate in recurring meetings with working groups and management teams to discuss operational improvements. Work with stakeholders including data, design, product, and executive teams to support their data infrastructure needs while assisting with data-related technical issues. Handle tasks on your own, adjust to new deadlines, and adapt to changing priorities. Design, develop, and implement special projects based on business needs. Perform other job-related duties and responsibilities as assigned. Qualifications Five or more years of experience with Oracle, MySQL, Power BI / QuickSight, and Excel. Thorough knowledge of SQL, relational databases and data modeling principles. Proficiency in programming languages such as PL/SQL, Bash, PowerShell and Python. Exceptional problem-solving, analytical, and critical thinking skills. Excellent interpersonal and communication skills, with the ability to consult with stakeholders to facilitate requirements gathering, troubleshooting, and solution validation. Detail-oriented with the ability to understand the bigger picture. Ability to communicate complex quantitative analysis clearly. Strong organizational skills, including multi-tasking and teamwork. Self-motivated and task-oriented, with an aptitude for complex problem solving. Experience with AWS, Jenkins and SnapLogic is a plus. Experience with data streaming, API calls (SOAP and REST), database replication, and real-time processing is a plus. Experience with Atlassian JIRA and Confluence is a plus.
    $69k-84k yearly est. Auto-Apply 60d+ ago
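
The Tyler Technologies listing above centers on keeping reports and warehouse loads trustworthy (monitoring automated jobs, validating data and report integrity). As a minimal, hypothetical sketch of that kind of check, not Tyler's actual code, the snippet below compares row counts between a source table and its warehouse copy over any DB-API 2.0 connection; the table and connection names are assumptions.

```python
# Hypothetical reconciliation check between a source table and its warehouse copy.
# Works with any DB-API 2.0 connection (python-oracledb, mysql-connector, etc.);
# table names are invented for illustration.

def row_count(conn, table: str) -> int:
    """Return COUNT(*) for a table via a DB-API 2.0 connection."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

def reconcile(source_conn, warehouse_conn, source_table: str, target_table: str) -> bool:
    """Compare row counts between source and warehouse as a basic integrity check."""
    src = row_count(source_conn, source_table)
    tgt = row_count(warehouse_conn, target_table)
    if src != tgt:
        print(f"MISMATCH: {source_table}={src} rows, {target_table}={tgt} rows")
        return False
    print(f"OK: {src} rows in both {source_table} and {target_table}")
    return True
```

A check like this would typically run as a scheduled job, with a mismatch feeding the kind of monitoring notifications the posting mentions.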
  • Senior Data Engineer

    Care It Services (4.3 company rating)

    Data engineer job in Overland Park, KS

    The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making. The role designs, develops, and maintains data pipelines, data warehouses, and other data-related infrastructure, and works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions. Key Responsibilities: Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards to drive business insights. Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives. Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness. This includes identifying bottlenecks, tuning queries, and scaling infrastructure as needed. Automate data ingestion, processing, and validation tasks to ensure data quality and consistency. Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations. Contribute to the development of the organization's overall data strategy. Conduct code reviews and contribute to the establishment of coding standards and best practices. Required Qualifications: Bachelor's degree in a relevant field or equivalent professional experience. 4-6 years of hands-on experience in data engineering. Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB. Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services. Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift. Programming skills in Python or JavaScript. Proficiency with BI tools such as Sisense, Power BI, or Tableau. Preferred Qualifications: Direct experience with Google Cloud Platform (GCP). Knowledge of CI/CD pipelines, including tools like Docker and Terraform. Background in the healthcare industry. Familiarity with modern data integration tools such as DBT, Matillion, and Airbyte. Compensation: $125,000.00 per year Who We Are: CARE ITS is a certified woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. Care ITS was established in 2010. Since then, we have successfully executed several projects with our expert team of professionals with more than 20 years of experience each. We operate globally, with our headquarters in Plainsboro, NJ, and focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
    $125k yearly Auto-Apply 60d+ ago
  • Data Engineer

    PDS Inc., LLC (3.8 company rating)

    Data engineer job in Overland Park, KS

    The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions. ESSENTIAL DUTIES AND RESPONSIBILITIES Data Engineering & Integration Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools. Automate data imports, transformations, and loads from multiple sources (on-premise, SaaS, APIs, and cloud). Optimize and monitor data workflows for reliability, performance, and cost efficiency. Implement and maintain data quality, validation, and error-handling frameworks. Data Analysis & Reporting Develop and maintain reporting databases, views, and semantic models for business intelligence solutions. Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs. Perform ad-hoc data exploration and statistical analysis to support business initiatives. Collaboration & Governance Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements. Maintain data integrity, enforce governance standards, and promote best practices in data stewardship. Support data security and compliance initiatives in coordination with IT and business teams. Continuous Improvement Stay current with emerging data technologies and analytics practices. Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery. QUALIFICATIONS Required: Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database. Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools. Proficiency in building BI solutions using Power BI and/or SSRS. Strong data modeling and relational database design skills. Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections). Ability to translate business goals into data requirements and technical solutions. Excellent communication and collaboration skills. Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience). Preferred: Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks). Familiarity with version control tools (Git, Azure DevOps) and Agile development practices. Exposure to Python or PowerShell for data transformation or automation. Experience integrating data from insurance or financial systems. Compensation: $120-129K This position is 3 days onsite/hybrid located in Overland Park, KS We look forward to reviewing your application. We encourage everyone to apply - even if every box isn't checked for what you are looking for or what is required. PDSINC, LLC is an Equal Opportunity Employer.
    $120k-129k yearly 17d ago
  • Data Architect

    Teksystems (4.4 company rating)

    Data engineer job in Kansas City, MO

    A company is seeking a highly skilled and strategic Data Architect to lead their data governance, security, and management initiatives. This senior role will be responsible for designing and implementing the organization's enterprise data architecture, ensuring that their data is secure, reliable, and accessible for business-critical functions. The ideal candidate is a proactive leader who can define data strategy, enforce best practices, and collaborate with cross-functional teams to align their data ecosystem with business goals. Skills data architecture, data modeling, data warehouse, aws, azure, cloud, lambda, artificial intelligence Top Skills Details data architecture,data modeling,data warehouse,aws,azure,cloud Additional Skills & Qualifications Bachelor's or master's degree in Computer Science, Information Technology, or a related technical field. 10+ years of hands-on experience in data architecture, data modeling, and data governance, with a proven track record of designing and implementing complex data ecosystems. Experience working in regulated industries is a plus. Proven experience (8+ years) designing and implementing enterprise-level data architectures. Extensive experience with data modeling, data warehousing, and modern data platforms (e.g., cloud environments like AWS, Azure, or GCP). Deep expertise in data modeling, data warehousing, database technologies (SQL, NoSQL), big data technologies (e.g., Spark), and modern cloud platforms (e.g., AWS, Azure, GCP). Deep expertise in data governance and security principles, including regulatory compliance frameworks. Strong knowledge of how to structure data for machine learning and AI workloads, including experience with MLOps platforms. Hands-on experience with data classification and data cataloging tools (e.g., Collibra, Alation). Excellent communication, interpersonal, and leadership skills, with the ability to influence and build consensus across the organization. Professional certifications in data architecture, data governance, or cloud platforms preferred. Experience with big data technologies (e.g., Hadoop, Spark) preferred. Familiarity with data integration and ETL/ELT frameworks preferred. Excellent communication, interpersonal, and leadership skills, with the ability to influence and build consensus across the organization. Highly motivated, self-starter with a strong sense of duty Continual learner, willing to attend workshops, seminars, etc. to maintain skills Mature critical thinking, analytical, and problem-solving skills with the ability to troubleshoot and devise a course of corrective action Highly organized and efficient with the ability to multitask, prioritizes tasks appropriately Experience Level Expert Level Job Type & Location This is a Contract to Hire position based out of Kansas City, MO. Pay and Benefits The pay range for this position is $70.00 - $85.00/hr. Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections, plan, or program terms. 
If eligible, the benefits available for this temporary role may include the following: - Medical, dental & vision - Critical Illness, Accident, and Hospital - 401(k) Retirement Plan - Pre-tax and Roth post-tax contributions available - Life Insurance (Voluntary Life & AD&D for the employee and dependents) - Short and long-term disability - Health Spending Account (HSA) - Transportation benefits - Employee Assistance Program - Time Off/Leave (PTO, Vacation or Sick Leave) Workplace Type This is a fully onsite position in Kansas City, MO. Application Deadline This position is anticipated to close on Dec 22, 2025. About TEKsystems: We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company. The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law. About TEKsystems and TEKsystems Global Services We're a leading provider of business and technology services. We accelerate business transformation for our customers. Our expertise in strategy, design, execution and operations unlocks business value through a range of solutions. We're a team of 80,000 strong, working with over 6,000 customers, including 80% of the Fortune 500 across North America, Europe and Asia, who partner with us for our scale, full-stack capabilities and speed. We're strategic thinkers, hands-on collaborators, helping customers capitalize on change and master the momentum of technology. We're building tomorrow by delivering business outcomes and making positive impacts in our global communities. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at TEKsystems.com. The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
    $70-85 hourly 4d ago
  • Principal Data Engineer

    Weavix

    Data engineer job in Lenexa, KS

    About the Role: weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission. You'll be the technical lead for everything data - building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently. What You'll Do: Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable) Enable data-driven decision-making across product, engineering, and business teams Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling) Ensure data quality, observability, governance, and security across all systems Serve as the subject matter expert on data systems, operating as a senior IC without a team initially What You Bring: 6+ years of experience in data engineering, ideally within a startup or high-growth environment Proven ability to independently design, implement, and manage scalable data architectures Deep experience working with large datasets, ideally from IoT sources or other high-volume systems Proficiency with modern data tools and languages (e.g., Typescript, NodeJS, SQL, etc.) Strong cloud experience, ideally with Microsoft Azure (but AWS or GCP also acceptable) A business-focused mindset with the ability to connect technical work to strategic outcomes Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms. Excellent communication and collaboration skills across technical and non-technical teams Bonus Points For: Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.) Familiarity with BI tools and self-service analytics platforms Background in system performance monitoring and observability tools Why weavix Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers-the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest assets: people. It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself/your family, and invest in your future. Perks and Benefits Competitive Compensation Employee Equity Stock Program Competitive Benefits Package including: Medical, Dental, Vision, Life, and Disability Insurance 401(k) Retirement Plan + Company Match Flexible Spending & Health Savings Accounts Paid Holidays Flexible Time Off Employee Assistance Program (EAP) Other exciting company benefits About Us weavix , the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. 
Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms enterprise by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results. weavix is the single source of truth for both workers and executives. Our mission is simple: to connect every disconnected worker through disruptive technology. How do you want to make your impact? For more information about us, visit weavix.com. Equal Employment Opportunity (EEO) Statement weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce. We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment. Americans with Disabilities Act (ADA) Statement weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************. E-Verify Notice Notice: weavix participates in the E-Verify program to confirm employment eligibility as required by law.
    $69k-92k yearly est. Auto-Apply 24d ago
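
The weavix role above is about ingesting and aggregating high-volume IoT device events. The toy pandas sketch below shows one common first step, rolling raw events up into per-device, per-minute counts; the schema (device_id, event_type, ts) is invented for illustration and is not weavix's data model.

```python
# Toy illustration of IoT-event aggregation: per-device, per-minute counts.
# Column names and sample values are assumptions, not a real schema.
import pandas as pd

events = pd.DataFrame(
    {
        "device_id": ["w-001", "w-001", "w-002", "w-002"],
        "event_type": ["ptt_start", "ptt_end", "geofence_enter", "battery_low"],
        "ts": pd.to_datetime(
            ["2024-01-01 09:00:05", "2024-01-01 09:00:40",
             "2024-01-01 09:01:10", "2024-01-01 09:01:30"]
        ),
    }
)

# Count events per device in one-minute windows, a typical pre-aggregation
# step before loading results into a warehouse or dashboard.
per_minute = (
    events.set_index("ts")
    .groupby("device_id")
    .resample("1min")
    .size()
    .rename("event_count")
    .reset_index()
)
print(per_minute)
```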
  • Data Engineer II

    27Global

    Data engineer job in Leawood, KS

    Description: 27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards. We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions. Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation. Your Role: Participate in the design and implementation of scalable, secure, and high-performance data architectures. Develop and maintain conceptual, logical, and physical data models. Work closely with architects to define standards for data integration, quality, and governance. Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs. Support cloud-based data strategies including data warehousing, pipelines, and real-time processing. Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads. Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling. Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks. Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends. Requirements: What You Bring: BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or related field. 2 - 4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production. 2 - 4 years of experience writing .NET code or other OOP languages in an Agile environment. Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members. Proficient technical skills in: Spark, Scala, C#, PySpark, Data Lake, Delta Lake, Relational and NoSQL Databases, AWS Glue, and Azure Synapse. Experience with SQL, ETL/ELT, and data modeling. Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with data lake. Knowledge of data governance, security, and compliance frameworks. Ability to context switch and work on a variety of projects over specified periods of time. Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices. Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less. Legal authorization to work in the United States and the ability to prove eligibility at the time of hire. Ways to Stand Out: Certifications: AWS Solution Architect, Azure Data Engineer, Databricks Data Engineer. Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics. Hands-on experience with big data tools (Spark, Kafka). Modern data warehouses (Snowflake, Redshift, BigQuery). 
Familiarity with machine learning pipelines and real-time analytics. Strong communication skills and ability to influence stakeholders. Prior experience implementing enterprise data governance frameworks. Experience in a client-facing role, working directly with clients from multiple levels of the organization; often presenting and documenting client environment suggestions and improvements. Why 27G?: Four-time award winner of Best Place to Work by the Kansas City Business Journal. A casual and fun small business work environment. Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential. Dedicated time for learning, development, research, and certifications.
    $69k-92k yearly est. 2d ago
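
The 27Global posting above lists PySpark and Delta Lake among its core skills. The sketch below is a minimal, assumed example of that pattern, not 27Global's codebase: read raw JSON from a landing zone, clean and type it, and persist it as a partitioned Delta table. It presumes a Spark session with the Delta Lake package available (for example on Databricks); the paths and column names are hypothetical.

```python
# Minimal PySpark + Delta Lake sketch: raw landing-zone JSON -> cleaned Delta table.
# Assumes the Delta Lake package is configured on the cluster; all names are made up.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Raw landing-zone files (bronze layer).
bronze = spark.read.json("s3://example-lake/bronze/orders/")

# Basic cleanup and typing (silver layer): drop rows without a key,
# cast amounts, and deduplicate on the business key.
silver = (
    bronze.dropna(subset=["order_id"])
    .withColumn("order_total", F.col("order_total").cast("decimal(12,2)"))
    .dropDuplicates(["order_id"])
)

# Persist as a Delta table partitioned by order date.
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-lake/silver/orders/")
)
```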
  • Data Engineer III

    Spring Venture Group (3.9 company rating)

    Data engineer job in Kansas City, MO

    Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day. Job Description This person has the opportunity to work primarily remote in the Kansas City or surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area. We are unable to sponsor for this role, this includes international students. OVERVIEW The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high quality datasets that enable our business stakeholders and world-class Analytics department to make data informed decisions. Data engineers, combining Software Engineering and Database Engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance ensuring availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor. ESSENTIAL DUTIES The essential duties for this role include, but are not limited to: Serve as a primary advisor to Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks. Build advanced data pipelines utilizing the medallion architecture to create high quality single source of truth data sources in Snowflake Architect replacements of current Data Management systems with respect to all aspects of data governance Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores. Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves. Actively participate as a leader in regular team meetings, listening and ensuring that one is assisting others at every chance for growth and development. Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores. Take ownership (both individually and as part of a team) of services and applications Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code Work with Project Managers, Solution Architects, and Software Development teams to build solutions for Company Initiatives on time, on budget, and on value. 
Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity. Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business. Ensure 99.95% uptime of our company's services monitoring data anomalies, batch failures, and our support chat for one week per team cycle from 8am-9pm. Follow and embrace procedures of both the Data Management team and SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance. Support after hours and weekend releases from our internal Software Development teams. Actively participate in code review and weekly technicals with another more senior engineer or manager. Assist departments with time-critical SQL execution and debug database performance problems. ROLE COMPETENCIES The competencies for this role include, but are not limited to: Emotional Intelligence Drive for Results Continuous Improvement Communication Strategic Thinking Teamwork and Collaboration Qualifications POSITION REQUIREMENTS The requirements to fulfill this position are as follows: Bachelor's degree in Computer Science, or a related technical field. 4-7 years of practical production work in Data Engineering. Expertise of the Python programming language. Expertise of Snowflake Expertise of SQL, databases, & query optimization. Must have experience in a large cloud provider such as AWS, Azure, GCP. Advanced at reading code independently and understanding its intent. Advanced at writing readable, modifiable code that solves business problems. Ability to construct reliable and robust data pipelines to support both scheduled and event based workflows. Working directly with stakeholders to create solutions. Mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact. Additional Information Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: Competitive Compensation Medical, Dental and vision benefits after a short waiting period 401(k) matching program Life Insurance, and Short-term and Long-term Disability Insurance Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan Generous paid time off (PTO) program starting off at 15 days your first year 15 paid Holidays (includes holiday break between Christmas and New Years) 10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave Annual Volunteer Time Off (VTO) and a donation matching program Employee Assistance Program (EAP) - health and well-being on and off the job Rewards and Recognition Diverse, inclusive and welcoming culture Training program and ongoing support throughout your Venture Spring Venture Group career Security Responsibilities: Operating in alignment with policies and standards Reporting Security Incidents Completing assigned training Protecting assigned organizational assets Spring Venture Group is an Equal Opportunity Employer
    $75k-98k yearly est. 9h ago
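
The Spring Venture Group role above calls for medallion-style pipelines producing single-source-of-truth datasets in Snowflake. As a hedged illustration only, the snippet below uses the snowflake-connector-python package to promote a deduplicated bronze table into a silver table; the database, schema, table, and column names are invented, and credentials are read from the environment.

```python
# Sketch of a bronze-to-silver step in a medallion-style Snowflake pipeline.
# Uses snowflake-connector-python; all object names here are hypothetical.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)

# Keep only the latest copy of each (lead_id, event_ts) row from the raw
# bronze layer when building the curated silver table.
SILVER_LOAD = """
CREATE OR REPLACE TABLE SILVER.LEAD_EVENTS AS
SELECT lead_id, event_type, event_ts
FROM BRONZE.LEAD_EVENTS_RAW
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY lead_id, event_ts ORDER BY loaded_at DESC
) = 1
"""

try:
    conn.cursor().execute(SILVER_LOAD)
finally:
    conn.close()
```

In practice a step like this would be scheduled and monitored as part of the pipeline ownership the posting describes, rather than run ad hoc.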
  • Data Developer

    Veracity Consulting

    Data engineer job in Overland Park, KS

    Veracity Consulting, Inc. is an Information Technology Solutions Provider. We offer our clients value-added expertise in the development and use of Information Technology to expand and improve their organization's business processes. Currently, we are searching for a Data Developer to join our team in Overland Park, KS. Our team is instinctively curious. It's just how we're wired. That means empowering our people to see the big picture, to cut through the noise so we're not just treating the symptoms but finding the cure. Founded in 2006, Veracity is a team of problem-solvers and truth-tellers who deliver customized IT solutions for our customers. We bridge the gap between business and technology while always staying transparent and authentic, simply doing the right thing. RESPONSIBILITIES: Write queries against various database management systems. Manage tasks and time effectively. QUALIFICATIONS: Familiarity with at least one database management system such as SQL Server, MySQL, or MariaDB. Data visualization work with Power BI, Tableau, or similar. Strong SQL development (T-SQL; PL/SQL a plus). Python or other scripting language (C#, JavaScript, etc.). Understanding of data flow and the ETL process. Regular and predictable attendance. To be considered an applicant for a position, you must: (1) complete the application in full; (2) apply for a specific, available position; and (3) meet all stated minimum qualifications. Applications that are incomplete or are submitted for "any" position will not be considered. Applications are good for 90 days. If you are not selected within 90 days of submission, and remain interested in a position, you must submit a new application. Veracity Consulting provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or status as a protected veteran and any other characteristics protected by law. In addition to federal law requirements, Veracity Consulting complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. No 3rd parties, please.
    $72k-95k yearly est. 60d+ ago
  • Data Engineer

    Quest Analytics

    Data engineer job in Overland Park, KS

    At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data. The Data Engineer will run daily operations of the data infrastructure, automate and optimize our data operations and data pipeline architecture while ensuring active monitoring and troubleshooting. This hire will also support other engineers and analysts on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. APPLY TODAY! What you'll do: * Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. * Review project objectives and determine the best technology for implementation. Implement best practice standards for development, build and deployment automation. * Run daily operations of the data infrastructure and support other engineers and analysts on data investigations and operations. * Monitor and report on all data pipeline tasks while working with appropriate teams to take corrective action quickly, in case of any issues. * Work with internal teams to understand current process and areas for efficiency gains * Write well-abstracted, reusable, and efficient code. * Participate in the training and/or mentoring programs as assigned or required. * Adheres to the Quest Analytics Values and supports a positive company's culture. * Responds to the needs and requests of clients and Quest Analytics management and staff in a professional and expedient manner. What it requires: * Bachelor's degree in computer science or related field. * 3 years of work experience with ETL, data operations and troubleshooting, preferably in healthcare data. * Proficiency with Azure ecosystems, specifically in Azure Data Factory and ADLS. * Strong proficiency in Python for scripting, automation, and data processing. * Advanced SQL skills for query optimization and data manipulation. * Experience with distributed data pipeline tools like Apache Spark, Databricks, etc. * Working knowledge of database modeling (schema design, and data governance best practices.) * Working knowledge of libraries like Pandas, numpy, etc. * Self-motivated and able to work in a fast paced, deadline-oriented environment * Excellent troubleshooting, listening, and problem-solving skills. * Proven ability to solve complex issues. * Customer focused. What you'll appreciate: * Workplace flexibility - you choose between remote, hybrid or in-office * Company paid employee medical, dental and vision * Competitive salary and success sharing bonus * Flexible vacation with no cap, plus sick time and holidays * An entrepreneurial culture that won't limit you to a job description * Being listened to, valued, appreciated -- and having your contributions rewarded * Enjoying your work each day with a great group of people Apply TODAY! careers.questanalytics.com About Quest Analytics For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. 
Visa sponsorship is not available at this time. Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming. Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discriminations or harassment. Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence [email protected] NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with additional outside agencies at this time. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate and unavailable We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
    $69k-92k yearly est. 50d ago
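
The Quest Analytics role above emphasizes data operations, monitoring, and pandas-based processing of healthcare data. The sketch below is an illustrative pre-load quality check, not Quest's pipeline: it validates a provider extract for required columns, malformed NPIs, and duplicates before the data moves downstream. The file path and column names are assumptions.

```python
# Illustrative pre-load data-quality check with pandas.
# Column names ("npi", "provider_name", ...) are hypothetical.
import pandas as pd

REQUIRED_COLUMNS = {"npi", "provider_name", "specialty", "zip_code"}

def validate_extract(path: str) -> pd.DataFrame:
    """Load a provider extract, report quality issues, and return cleaned rows."""
    df = pd.read_csv(path, dtype={"npi": str, "zip_code": str})

    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Extract is missing required columns: {sorted(missing)}")

    # Flag rows that would break downstream joins or reports:
    # NPIs that are not exactly 10 digits, and duplicated NPIs.
    bad_npi = ~df["npi"].str.fullmatch(r"\d{10}", na=False)
    dupes = df.duplicated(subset=["npi"], keep=False)
    print(f"{bad_npi.sum()} rows with invalid NPI, {dupes.sum()} rows with duplicated NPIs")

    return df[~bad_npi].drop_duplicates(subset=["npi"])
```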
  • Senior Data Engineer

    Velocity Staff

    Data engineer job in Overland Park, KS

    Velocity Staff, Inc. is working with our client located in the Overland Park, KS area to identify a Senior Level Data Engineer to join their Data Services Team. The right candidate will utilize their expertise in data warehousing, data pipeline creation/support and analytical reporting and be responsible for gathering and analyzing data from several internal and external sources, designing a cloud-focused data platform for analytics and business intelligence, reliably providing data to our analysts. This role requires significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams. Responsibilities Work with Data architects to understand current data models, to build pipelines for data ingestion and transformation. Design, build, and maintain a framework for pipeline observation and monitoring, focusing on reliability and performance of jobs. Surface data integration errors to the proper teams, ensuring timely processing of new data. Provide technical consultation for other team members on best practices for automation, monitoring, and deployments. Provide technical consultation for the team with “infrastructure as code” best practices: building deployment processes utilizing technologies such as Terraform or AWS Cloud Formation. Qualifications Bachelor's degree in computer science, data science or related technical field, or equivalent practical experience Proven experience with relational and NoSQL databases (e.g. Postgres, Redshift, MongoDB, Elasticsearch) Experience building and maintaining AWS based data pipelines: Technologies currently utilized include AWS Lambda, Docker / ECS, MSK Mid/Senior level development utilizing Python: (Pandas/Numpy, Boto3, SimpleSalesforce) Experience with version control (git) and peer code reviews Enthusiasm for working directly with customer teams (Business units and internal IT) Preferred but not required qualifications include: Experience with data processing and analytics using AWS Glue or Apache Spark Hands-on experience building data-lake style infrastructures using streaming data set technologies (particularly with Apache Kafka) Experience data processing using Parquet and Avro Experience developing, maintaining, and deploying Python packages Experience with Kafka and the Kafka Connect ecosystem. Familiarity with data visualization techniques using tools such as Grafana, PowerBI, AWS Quick Sight, and Excel. Not ready to apply? Connect with us to learn about future opportunities.
    $69k-92k yearly est. Auto-Apply 19d ago
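
The Velocity Staff listing above describes AWS-based pipelines built on Lambda, Python (Pandas, Boto3), and Parquet. The handler below is a rough sketch of that pattern under stated assumptions, not the client's implementation: it reacts to S3 object-created events, reads the new CSV with pandas, and writes a Parquet copy to a staging bucket. The bucket names and prefixes are made up, and it presumes pandas and pyarrow are packaged with the function.

```python
# Rough sketch of an S3-triggered AWS Lambda ingestion step.
# Bucket names and key layout are hypothetical; requires pandas + pyarrow in the package.
import io
import boto3
import pandas as pd

s3 = boto3.client("s3")
STAGING_BUCKET = "example-staging-bucket"  # hypothetical

def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Pull the newly landed CSV and apply a light transform.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(body))
        df["ingested_from"] = key

        # Write the result to the staging bucket as Parquet for downstream jobs.
        buf = io.BytesIO()
        df.to_parquet(buf, index=False)
        s3.put_object(Bucket=STAGING_BUCKET, Key=f"staged/{key}.parquet", Body=buf.getvalue())

    return {"processed": len(records)}
```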
  • Corporate Treasury Data & Risk Analytics

    Capitol Federal Savings Bank (4.4 company rating)

    Data engineer job in Overland Park, KS

    We are seeking a driven and analytically minded professional to join our Corporate Treasury team. This individual will play a key role supporting asset/liability management, liquidity management, budgeting & forecasting, data analytics, and performance analysis/reporting. In this role, you will work closely with senior and executive leadership to deliver strategic financial insights, optimize business performance, support and influence decision-making, uncover data-driven stories, and challenge existing processes with fresh, innovative thinking. Essential Duties & Responsibilities Responsibilities will be tailored to the experience and skillset of the selected candidate and may include: * Developing and enhancing financial models and simulations * Supporting forecasting, liquidity, and ALM analytics * Conducting "what-if" scenario analysis and presenting actionable insights * Building dashboards, reporting tools, and performance summaries * Driving or contributing to process improvement initiatives * Collaborating cross-functionally with senior leaders across the organization Experience & Knowledge * Financial modeling and earnings simulation experience using risk/performance management tools * Designing and developing mathematical or statistical models to support strategic decision-making and risk management * Experience running scenario analysis and synthesizing insights for executive audiences * Familiarity with financial asset/liability instruments, market instruments, and their interactions * Experience with Funds Transfer Pricing (FTP) and capital allocation is a plus * Demonstrated success driving effective process improvements Education * Bachelor's degree in Accounting, Finance, or a related field required CapFed is an equal opportunity employer.
    $62k-76k yearly est. Auto-Apply 28d ago
  • Adobe Real-Time Customer Data Platform (RT-CDP) Architect

    Slalom (4.6 company rating)

    Data engineer job in Kansas City, MO

    Who You'll Work With
    The Adobe team drives strategic direction and solution enablement in support of marketing teams. We accelerate innovation and learning and advance sales and delivery excellence with high-caliber marketing technology solutions, including Adobe technology expertise. Our focus is four go-to-market solution areas: Experience and Content Management, with a focus on Content Supply Chain and Digital Asset Management; Personalized Insights and Engagement, with a focus on Analytics, Customer Data Platforms, and Journey Orchestration; Digital Commerce, with a focus on Experience-Led Commerce and Product Information Management; and Marketing Operations and Workflow, with a focus on resource management, reporting, and approvals of the content and data required to run personalization and campaigns at scale.
    We are seeking a talented Adobe RT-CDP Architect to join our team as a senior consultant or principal. This is a client-facing role that involves close collaboration with both technical and non-technical stakeholders.
    What You'll Do
    * Implement, configure, and enable Adobe Customer Data Platform (CDP)
    * Provide the technical design and data architecture for configuring RT-CDP to meet clients' business goals
    * Understand business problems and capture client requirements by leading effective conversations with business and technical client teams
    * Interpret how to best apply the out-of-the-box product to provide a solution, including finding alternative approaches that best leverage the platform
    * Provide analytics domain expertise, consultation, and troubleshooting
    * Learn new platforms, capabilities, and clouds to stay on top of the ever-growing product ecosystem (CJA, AJO, Marketo)
    What You'll Bring
    * Expertise in configuring, implementing, and integrating the RT-CDP product without significant help from others
    * Knowledge of, and experience with, RT-CDP B2C, B2B, and/or B2P
    * Knowledge of how RT-CDP works with other Adobe Experience Platform products
    * Experience implementing and driving success with RT-CDP for enterprise clients in an architecture role
    * Proficiency with manipulating, structuring, and merging data from different data sources, and an understanding of typical data sources within an enterprise environment
    * Knowledge of how graph stitching, profile merge rules, profile collapsing, and householding concepts work in RT-CDP (a conceptual identity-stitching sketch follows this listing)
    * Ability to translate business rules into technical requirements and to implement those requirements
    * Proficiency with data transformation, API-based integrations, and JavaScript tagging
    * Experience working with SQL, R, and/or Python preferred
    * Enterprise experience designing multi-solution architectures
    * Strong communication skills and a passion for learning new technologies and platform capabilities
    * Ability to build strong relationships with clients and understand them from a business and strategic perspective
    * Occasional travel as needed by the client
    About Us
    Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
    Compensation and Benefits
    Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, and vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
    Slalom is committed to fair and equitable compensation practices. For this role, we are hiring at the following levels and salary ranges:
    * East Bay, San Francisco, Silicon Valley: Senior Consultant $131,000-$203,000; Principal $145,000-$225,000
    * San Diego, Los Angeles, Orange County, Seattle, Boston, Houston, New Jersey, New York City, Washington DC, Westchester: Senior Consultant $120,000-$186,000; Principal $133,000-$206,000
    * All other locations: Senior Consultant $110,000-$171,000; Principal $122,000-$189,000
    In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary range is subject to change and may be modified at any time. We will accept applications until December 12, or until the position is filled. We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************.
    EEO and Accommodations
    Slalom is an equal opportunity employer and is committed to attracting, developing, and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team or contact ****************************** if you require accommodations during the interview process. #LI-KM
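    Editor's note, for illustration only: the "graph stitching" and profile-merge concepts named in this listing boil down to linking records that share an identifier (email, cookie, CRM ID) into a single customer profile. The toy sketch below shows that idea with a simple union-find in Python; the field names and records are hypothetical, and this is not Adobe's RT-CDP implementation or API. In the product itself, this behavior is configured through identity namespaces and profile merge policies rather than hand-written code.
```python
# Conceptual sketch only: a toy illustration of "graph stitching", i.e. linking
# records that share an identifier (email, cookie, CRM ID) into one customer
# profile. This is NOT Adobe's RT-CDP implementation or API; the field names
# and records below are hypothetical.
from collections import defaultdict

def stitch_profiles(records):
    """Merge records into profiles: records sharing any identity value end up together."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Link every identity within a record, so shared identities connect records.
    for rec in records:
        ids = [f"{k}:{v}" for k, v in rec.items() if v]
        for other in ids[1:]:
            union(ids[0], other)

    # Group records under the root of their first identity.
    profiles = defaultdict(list)
    for rec in records:
        ids = [f"{k}:{v}" for k, v in rec.items() if v]
        if ids:
            profiles[find(ids[0])].append(rec)
    return list(profiles.values())

if __name__ == "__main__":
    events = [
        {"email": "a@example.com", "cookie": "c1"},
        {"cookie": "c1", "crm_id": "42"},   # stitched to the record above via c1
        {"email": "b@example.com"},         # remains a separate profile
    ]
    for profile in stitch_profiles(events):
        print(profile)
```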
    $145k-225k yearly Easy Apply 24d ago
  • Data Analytics Engineer

    Emprise Bank 4.5company rating

    Data engineer job in Kansas City, MO

    At Emprise Bank, everything we do is focused on empowering people to thrive. We proudly work to provide an extraordinary customer experience to help our customers achieve their goals. We are currently seeking a Data Analytics Engineer to join our team in Wichita, KS or Kansas City, MO. As a Data Analytics Engineer, you'll be responsible for administrative, technical, and professional work within the technology department. This role will work on-site in Wichita, KS with hybrid scheduling. For candidates in the Kansas City metro area, the role will be remote.
    A successful candidate will have:
    * Confident and articulate communication skills
    * Initiative and a strong work ethic
    * A strategic mindset
    * A demonstrated ability to make sense of complex and sometimes contradictory information to effectively solve problems
    * Strong attention to detail
    * The ability to work both independently and collaboratively
    * An understanding of and commitment to our values
    Essential functions of the role:
    * Demonstrate a strong understanding of privacy and security principles, particularly as they pertain to data pipelines and datasets
    * Develop, test, and maintain data pipelines supporting business intelligence functions
    * Collaborate with data analysts to create and optimize datasets for data analysis and reporting
    * Maintain documentation of data pipelines, workflows, and data dictionaries for internal reference and ensure it is aligned with data governance protocols
    * Develop code for the business logic of operational pipelines, ensuring that data processing aligns with business requirements
    * Serve as a liaison between data analysts and data engineers, facilitating communication and collaboration between the two positions to ensure that data needs are met efficiently
    * Implement robust data quality checks and transformation processes to ensure data accuracy and consistency (a brief PySpark sketch follows this listing)
    * Other duties as assigned within the scope of the role
    Requirements
    * Bachelor's degree in a quantitative field
    * Experience with data transformation tooling
    * Experience with BI tools (Tableau, Power BI, QlikView, etc.)
    * Proficiency with Python and SQL is required
    * Strong communication skills and the ability to work with business teams to define metrics, elicit requirements, and explain technical issues to non-technical associates
    * Proficiency in PySpark is preferred
    * Experience in Azure cloud is preferred
    * Experience with SQL Server databases is preferred
    * Familiarity with medallion data architecture is preferred
    * Experience with Data Factory and Databricks is preferred
    Benefits
    In addition to a competitive salary and benefits, Emprise offers professional growth, a rewarding and challenging environment, opportunities to be involved in our communities, and a culture of integrity, passion, and success. We also offer shift differential pay for bilingual candidates! At Emprise Bank, empowering people to thrive means having an all-inclusive culture that honors our commitment to all dimensions of diversity in our workforce and embraces inclusion of all people. People of color, women, LGBTQIA+, veterans, and persons with disabilities are encouraged to apply. To learn more, please visit our website at ********************
    Emprise Bank is an EEO/AA/ADA/Veteran Employer/Member FDIC/Drug Free Workplace. Emprise Bank participates in E-Verify and will provide your Form I-9 to the federal government to confirm authorization to work in the United States.
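    Editor's note, for illustration only: a minimal sketch of the bronze-to-silver quality step this listing describes, assuming a medallion-style layout on a Spark environment such as Databricks. The table names, columns, and the 5% rejection threshold are hypothetical, not Emprise Bank's actual pipeline.
```python
# Illustrative sketch only: a minimal bronze-to-silver quality step in a
# medallion-style layout. Table names, columns, and the 5% rejection threshold
# are hypothetical; assumes a Spark environment (e.g., Databricks) where the
# bronze table already exists.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.table("bronze.transactions")          # hypothetical source table
n_bronze = bronze.count()

silver = (
    bronze
    .filter(F.col("account_id").isNotNull() & F.col("amount").isNotNull())  # reject incomplete rows
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))            # enforce a consistent type
    .dropDuplicates(["transaction_id"])                                     # de-duplicate on the business key
)
n_silver = silver.count()

# Simple quality gate: fail the run if too many rows were rejected.
if n_bronze > 0 and (n_bronze - n_silver) / n_bronze > 0.05:
    raise ValueError(f"Data quality check failed: {n_bronze - n_silver} of {n_bronze} rows rejected")

silver.write.mode("overwrite").saveAsTable("silver.transactions")           # hypothetical target table
```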
    $88k-107k yearly est. 2d ago
  • Sr. Systems Analyst

    Right Talent Right Now

    Data engineer job in Overland Park, KS

    Sr. Systems Analyst, 1-year contract, Kansas City, MO.
    Primary Job Responsibilities
    • Identify and evaluate technology/infrastructure project impacts (server hardware and software upgrades, application migrations, application retirements, platform changes, user workstation hardware and software deployments) through a variety of techniques, including technology research, interviews, document and system analysis, workshops, surveys, user shadowing, business process descriptions, use cases, scenarios, business analysis, and/or task and workflow analysis.
    • Critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, abstract up from low-level information to a general understanding, and distinguish user requests from the underlying true needs.
    • Utilize prior experience with enterprise-wide requirements definition and management systems and methodologies.
    • Successfully support multiple project initiatives simultaneously, along with business case evaluation and creation for upcoming projects.
    • Apply strong analytical skills, including a thorough understanding of how to interpret project scope and translate it into application and operational requirements.
    • Demonstrate excellent verbal and written communication skills and the ability to interact professionally with a diverse group of executives, managers, and subject matter experts within both business and technical teams.
    • Develop requirements specifications according to standard templates, using natural language.
    • Collaborate with developers and subject matter experts to establish the technical vision and analyze tradeoffs between usability and performance needs.
    • Be the liaison between the business units, technology teams, and support teams.
    • Maintain strong knowledge of systems, interfaces, and environments.
    • Assess and understand technology changes.
    - Planning and Monitoring (10%): Effectively apply methodologies and enforce project process and standards. Responsible for meeting project deadlines for planned analysis activities. Effectively communicate relevant project information to the project team and superiors.
    - Elicitation (20%): Elicit and define requirements (functional specifications) using standard analysis techniques. Conduct interviews with users to identify possible business impacts of an upgrade, analyze existing procedures and application/interface/environment documentation, and evaluate possible interface and environment impacts of change.
    - Analysis (50%): Act as the business and technical representative for requirements while leading discovery sessions with technical (development and testing) teams. Document requirements for use by technical teams in creating design documentation and test cases. Consult with the business on project scope and requirements as needed.
    - Communication and Management (20%): Act as a liaison between the business and technical IT teams. Translate information from IT to the business in easily understood terms. Partner with IT to accomplish goals and provide ongoing communication to various business stakeholders on project status and issues.
    Qualifications
    • 6+ years of experience with technology/infrastructure projects
    • 4+ years of experience documenting or consuming technical project requirements
    • High school diploma required
    • This position requires incumbents to regularly sit at a desk and operate standard office equipment, such as a computer and phone
    • Employee is regularly required to move equipment, up to 10 pounds, to facilitate in-person meetings from computer and/or phone
    • Specific vision abilities required by this job include close vision and the ability to adjust focus
    • Must be able to talk and hear
    • Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions
    Preferred Experience
    • Bachelor's degree in Computer Information Systems or Computer Science
    • Excellent verbal and written communication skills
    • Excellent listener
    • Solid analytical skills, including a thorough understanding of how to interpret needs and translate them into technical requirements
    • Extensive experience with corporate technology and infrastructure
    • Ability to effectively and efficiently plan and lead meetings and complete necessary follow-up
    • Ability to understand and manage to approved project scope
    • Critical thinking skills: using logic and reason to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems
    • Independent thinker: able to guide oneself with little direct supervision and depend on oneself to get things done
    • Experience working in a matrix organization for successful project delivery
    • Experience working on projects with integrated systems and the related complexity of requirements traceability between systems
    • ITIL and SDLC knowledge
    Additional Information
    All your information will be kept confidential according to EEO guidelines.
    $79k-103k yearly est. 60d+ ago
  • Data Engineer II

    27Global

    Data engineer job in Leawood, KS

    Full-time Description
    27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards. We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions. Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.
    Your Role:
    * Participate in the design and implementation of scalable, secure, and high-performance data architectures.
    * Develop and maintain conceptual, logical, and physical data models.
    * Work closely with architects to define standards for data integration, quality, and governance.
    * Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
    * Support cloud-based data strategies including data warehousing, pipelines, and real-time processing.
    * Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads (a brief Delta Lake upsert sketch follows this listing).
    * Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling.
    * Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
    * Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.
    Requirements
    What You Bring:
    * BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or a related field.
    * 2-4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
    * 2-4 years of experience writing .NET code or other OOP languages in an Agile environment.
    * Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members.
    * Proficient technical skills in Spark, Scala, C#, PySpark, Data Lake, Delta Lake, relational and NoSQL databases, AWS Glue, and Azure Synapse.
    * Experience with SQL, ETL/ELT, and data modeling.
    * Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with a data lake.
    * Knowledge of data governance, security, and compliance frameworks.
    * Ability to context switch and work on a variety of projects over specified periods of time.
    * Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
    * Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
    * Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.
    Ways to Stand Out:
    * Certifications: AWS Solutions Architect, Azure Data Engineer, Databricks Data Engineer.
    * Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
    * Hands-on experience with big data tools (Spark, Kafka).
    * Modern data warehouses (Snowflake, Redshift, BigQuery).
    * Familiarity with machine learning pipelines and real-time analytics.
    * Strong communication skills and ability to influence stakeholders.
    * Prior experience implementing enterprise data governance frameworks.
    * Experience in a client-facing role, working directly with clients at multiple levels of the organization, often presenting and documenting suggestions and improvements for the client environment.
    Why 27G?:
    * Four-time award winner of Best Place to Work by the Kansas City Business Journal.
    * A casual and fun small business work environment.
    * Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
    * Dedicated time for learning, development, research, and certifications.
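    Editor's note, for illustration only: one common pattern behind the ETL/ELT and Delta Lake work named in this listing is an incremental upsert (merge) into a Delta table. The sketch below assumes Spark with the delta-spark package configured and uses hypothetical paths and keys; it is not 27Global's actual pipeline.
```python
# Illustrative sketch only: an incremental upsert into a Delta Lake table.
# Paths, table layout, and keys are hypothetical; assumes Spark with the
# delta-spark package available.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("delta_upsert")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

updates = spark.read.parquet("s3://example-bucket/staging/customers/")      # hypothetical staging data
target = DeltaTable.forPath(spark, "s3://example-bucket/delta/customers")   # hypothetical Delta table

(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")  # match on the business key
    .whenMatchedUpdateAll()      # update rows for existing customers
    .whenNotMatchedInsertAll()   # insert rows for new customers
    .execute()
)
```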
    $69k-92k yearly est. 60d+ ago
  • Data Scientist - Retail Pricing

    Capitol Federal Savings Bank 4.4company rating

    Data engineer job in Overland Park, KS

    We are looking for a Data Scientist! This position will play a key role in shaping data-driven strategies that directly influence the bank's profitability, customer value, and market competitiveness. The role sits at the intersection of analytics, finance, and product strategy, transforming data into pricing intelligence that supports smarter, faster business decisions. The Data Scientist will design and implement advanced pricing and profitability models for retail banking products, leveraging internal performance metrics, market benchmarks, and third-party data sources. Through predictive modeling, elasticity analysis, and scenario testing (a small illustrative sketch follows this listing), the role will help the organization optimize deposit and loan pricing, forecast financial outcomes, and identify growth opportunities. Collaborating across product, finance, and executive teams, the Data Scientist will translate complex analytical findings into clear business recommendations that drive strategic action. The role will also contribute to enhancing our analytics infrastructure by improving data pipelines, model governance, and reporting capabilities to strengthen enterprise-wide decision-making.
    Core Expertise: Pricing strategy · Profitability modeling · Financial forecasting · Machine learning · SQL · Python · R · Data visualization · Strategic analytics · Cross-functional collaboration
    CapFed is an equal opportunity employer.
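    Editor's note, for illustration only: elasticity analysis of the kind this listing describes is often approximated with a log-log regression, where the slope is the rate elasticity. The numbers below are invented and the single-variable model is deliberately simplistic; this is a sketch of the technique, not the bank's model.
```python
# Illustrative sketch only: deposit-rate elasticity via a log-log regression.
# The observations are made up; a real model would use the bank's own history
# and control for many more factors.
import numpy as np

# Hypothetical observations: offered APY (%) and resulting deposit balances ($M).
rate = np.array([0.50, 0.75, 1.00, 1.50, 2.00, 2.50])
balances = np.array([110.0, 118.0, 127.0, 140.0, 155.0, 168.0])

# In a log-log specification, the fitted slope is the elasticity of balances
# with respect to the offered rate.
slope, intercept = np.polyfit(np.log(rate), np.log(balances), deg=1)
print(f"Estimated rate elasticity of deposits: {slope:.2f}")

# Scenario test: predicted balances if the offered rate moves to 1.25%.
predicted = np.exp(intercept + slope * np.log(1.25))
print(f"Predicted balances at 1.25% APY: ${predicted:.1f}M")
```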
    $66k-82k yearly est. Auto-Apply 9d ago

Learn more about data engineer jobs

How much does a data engineer earn in Overland Park, KS?

The average data engineer in Overland Park, KS earns between $60,000 and $105,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Overland Park, KS

$79,000
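
As a quick sanity check on the figures above, the $79,000 average is very close to the geometric mean of the $60,000-$105,000 range; whether that is actually how the average is computed is an assumption, since the page does not state its methodology.
```python
# Assumption: the listed average may be a geometric mean of the range endpoints.
low, high = 60_000, 105_000
print(round((low * high) ** 0.5))  # 79373 -- rounds to the listed $79,000
print(round((low + high) / 2))     # 82500 -- the simple midpoint, which does not match
```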

What are the biggest employers of Data Engineers in Overland Park, KS?

The biggest employers of Data Engineers in Overland Park, KS are:
  1. Quest Analytics
  2. 27Global
  3. Tyler Technologies
  4. W. R. Berkley
  5. Conexess Group
  6. PDS
  7. CARE
  8. Advanced Technologies Group
  9. SelectQuote Insurance Services
  10. Kforce