Data Engineer
Data engineer job in Overland Park, KS
Description
The Data Engineer role within the Cloud Payments Reporting team is responsible for developing, deploying, and maintaining data solutions. Primary responsibilities include querying databases, maintaining data transformation pipelines that move data through ETL tools and systems into data warehouses, and providing and supporting data visualizations, dashboards, and ad hoc and customer-deliverable reports.
Responsibilities
Provide primary support for existing databases, automated jobs, and reports, including monitoring notifications, performing root cause analysis, communicating findings, and resolving issues.
Serve as the Subject Matter Expert for payments reports, databases, and processes.
Ensure data and report integrity and accuracy through thorough testing and validation.
Build analytical tools to utilize the data, providing actionable insight into key business performance metrics including operational efficiency.
Implement, support, and optimize ETL pipelines, data aggregation processes, and reports using various tools and technologies (a minimal sketch follows this list).
Collaborate with operational leaders and teams to understand reporting and data usage across the business and provide efficient solutions.
Participate in recurring meetings with working groups and management teams to discuss operational improvements.
Work with stakeholders including data, design, product, and executive teams to support their data infrastructure needs while assisting with data-related technical issues.
Handle tasks on your own, adjust to new deadlines, and adapt to changing priorities.
Design, develop, and implement special projects based on business needs.
Perform other job-related duties and responsibilities as assigned.
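For context, a pipeline step of the kind described in this list might look like the sketch below. This is a minimal, hedged illustration only, assuming the PyMySQL package; the hosts, credentials, and table names are hypothetical, and a production job would add logging, batching, and error handling.

```python
# Minimal ETL sketch, assuming PyMySQL; all connection details and
# table names are hypothetical placeholders, not from the posting.
import pymysql

src = pymysql.connect(host="source-db.example.com", user="etl",
                      password="...", database="payments")
dst = pymysql.connect(host="warehouse.example.com", user="etl",
                      password="...", database="reporting")

with src.cursor() as read_cur, dst.cursor() as write_cur:
    # Extract: pull the last day's settled transactions.
    read_cur.execute(
        "SELECT txn_id, merchant_id, amount, settled_at "
        "FROM transactions WHERE settled_at >= CURDATE() - INTERVAL 1 DAY"
    )
    rows = read_cur.fetchall()

    # Transform: normalize amounts to integer cents for consistent aggregation.
    cleaned = [(t, m, round(a * 100), s) for (t, m, a, s) in rows]

    # Load: upsert into the reporting table that dashboards read from
    # (REPLACE INTO assumes txn_id is the primary key).
    write_cur.executemany(
        "REPLACE INTO payment_facts (txn_id, merchant_id, amount_cents, settled_at) "
        "VALUES (%s, %s, %s, %s)",
        cleaned,
    )
dst.commit()
```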
Qualifications
Five or more years of experience with Oracle, MySQL, Power BI / QuickSight, and Excel.
Thorough knowledge of SQL, relational databases and data modeling principles.
Proficiency in programming languages such as PL/SQL, Bash, PowerShell and Python.
Exceptional problem-solving, analytical, and critical thinking skills.
Excellent interpersonal and communication skills, with the ability to consult with stakeholders to facilitate requirements gathering, troubleshooting, and solution validation.
Detail-oriented with the ability to understand the bigger picture.
Ability to communicate complex quantitative analysis clearly.
Strong organizational skills, including multi-tasking and teamwork.
Self-motivated, task oriented and an aptitude for complex problem solving.
Experience with AWS, Jenkins and SnapLogic is a plus.
Experience with data streaming, API calls (SOAP and REST), database replication, and real-time processing is a plus.
Experience with Atlassian JIRA and Confluence is a plus.
Data Scientist / Data Architect / Data Governance Lead
Data engineer job in Kansas City, MO
KEY RESPONSIBILITIES:
Data Governance, Strategy and Architecture
* Define and drive the organization's overall data strategy, architecture vision, and roadmap, including the data and AI architecture and the design of scalable data lakes, data warehouses, and data fabric architectures.
* Establish and enforce data governance policies and standards to ensure data quality, consistency, and compliance with all relevant regulations (e.g., GDPR, CCPA).
* Lead the implementation of a comprehensive data governance framework, including data quality management, data lineage tracking, and master data management (MDM).
* Collaborate with data owners and stewards across business units to establish clear roles, responsibilities, and accountability for data assets.
* Establish clear rules and policies governing the responsible use of data within AI and ML models, including documentation of data lineage for model training.
* Design data infrastructure optimized for AI workloads, including data pipelines for machine learning models, and architect solutions for large language models (LLMs).
* Develop bias mitigation strategies that ensure diverse and representative datasets to prevent AI bias, and architect monitoring systems for model drift.
* Evaluate, recommend, and select appropriate data management technologies, including cloud platforms (e.g., AWS, Azure, GCP), storage solutions, and governance tools.
* Architect complex data integration patterns to connect disparate data sources across the organization, ensuring seamless data flow and a unified data view.
Data Security and Privacy
* Design and implement a robust data security architecture to protect sensitive data from unauthorized access, breaches, and corruption.
* Develop security protocols, such as encryption, access controls (IAM), and masking techniques to safeguard data in transit and at rest.
* Conduct regular security audits and vulnerability testing to identify gaps in security architecture and develop remediation plans.
* Ensure the data architecture and its supporting systems are compliant with internal policies and external data protection regulations.
Data Modeling and Management
* Design and maintain conceptual, logical, and physical data models for transactional and analytical systems.
* Oversee the development of database schemas, metadata management, and data cataloging efforts to improve data discoverability and understanding.
* Define and standardize data architecture components, including storage solutions (data lakes, warehouses, etc.), data pipelines, and integration patterns.
* Evaluate and recommend new data technologies, tools, and platforms that align with the organization's strategic needs.
Data Classification
* Design and implement a robust data security architecture, including controls for access management, encryption, and data masking to protect sensitive information.
* Create and manage an organization-wide data classification scheme based on data sensitivity and importance (e.g., public, internal, confidential, restricted).
* Implement technical controls and processes to automatically classify and tag data assets, ensuring proper handling and security.
* Collaborate with business and legal teams to define and apply data classification rules consistently.
Team Collaboration and Leadership
* Provide technical guidance and mentorship to data engineers, analysts, developers, and other IT teams on best practices for data management and security.
* Work closely with business stakeholders to understand their data requirements and translate them into effective architectural solutions.
* Foster a data-centric culture across the organization, promoting awareness and understanding of data governance principles.
ABOUT THE COMPANY:
Bluebird Fiber is a premier fiber telecommunications provider of internet, data transport, and other services to carriers, businesses, schools, hospitals, and other enterprises in the Midwest. To learn more, please visit bluebirdfiber.com.
Join an amazing team of telecommunication professionals! Bluebird is a dynamic, growing company in need of a Data Architect to be a part of a collaborative team. This is a full-time, benefit-eligible position in our Kansas City office. All of us at Bluebird work hard to meet objectives for the organization and live the mission and values of this growing company to meet a common goal.
JOB SUMMARY:
We are seeking a highly skilled and strategic Data Architect to lead our data governance, security, and management initiatives. This senior role will be responsible for designing and implementing the organization's enterprise data architecture, ensuring that our data is secure, reliable, and accessible for business-critical functions. The ideal candidate is a proactive leader who can define data strategy, enforce best practices, and collaborate with cross-functional teams to align our data ecosystem with business goals.
REQUIRED QUALIFICATIONS:
* Bachelor's or master's degree in Computer Science, Information Technology, or a related technical field.
* 10+ years of hands-on experience in data architecture, data modeling, and data governance, including 8+ years designing and implementing complex, enterprise-level data ecosystems. Experience working in regulated industries is a plus.
* Deep expertise in data modeling, data warehousing, database technologies (SQL, NoSQL), big data technologies (e.g., Spark), and modern cloud platforms (e.g., AWS, Azure, GCP).
* Deep expertise in data governance and security principles, including regulatory compliance frameworks.
* Strong knowledge of how to structure data for machine learning and AI workloads, including experience with MLOps platforms.
* Hands-on experience with data classification and data cataloging tools (e.g., Collibra, Alation).
* Excellent communication, interpersonal, and leadership skills, with the ability to influence and build consensus across the organization.
PREFERRED QUALIFICATIONS:
* Professional certifications in data architecture, data governance, or cloud platforms.
* Experience with big data technologies (e.g., Hadoop, Spark).
* Familiarity with data integration and ETL/ELT frameworks.
Senior Data Engineer
Data engineer job in Kansas City, MO
About Us
American Century Investments is a leading global asset manager with over 65 years of experience helping a broad base of clients achieve their financial goals. Our expertise spans global equities and fixed income, multi-asset strategies, ETFs, and private investments.
Privately controlled and independent, we focus solely on investment management. But there's an unexpected side to us, too. Each year we direct over 40% of our profits, more than $2 billion since 2000, to the Stowers Institute for Medical Research. Our ongoing financial support drives the Institute's breakthrough work and mission of defeating life-threatening diseases like cancer and Alzheimer's. So, the better we do for our clients, the more we can do for everyone.
All 1,400 of us across the globe are inspired every day by the unique difference our hard work can make in so many lives. It shows in the curiosity we bring to every initiative, the deep relationships we build with our clients, and the way we treat each other in the hallway. If you're excited to learn more about us, we can't wait to learn more about you.
Role Summary
We are seeking a Senior Data Engineer with expertise in AWS, Snowflake, Python, and SQL to join our Enterprise Data Technology team. In this role, you will lead the design, development, and maintenance of advanced data applications, contributing to the evolution of our AWS-based Data Lake. You'll provide technical leadership across complex programming and integration initiatives, collaborating closely with Database Administrators, Data Architects, and Data Scientists to ensure optimized, scalable solutions.
This hybrid position will be based out of our Kansas City, MO office.
This position is not eligible for visa sponsorship. Applicants must be authorized to work in the U.S. without visa sponsorship, now or in the future.
How You Will Make an Impact
Design, build, and maintain enterprise data ecosystems, including ingestion, storage, organization, and interface layers.
Analyze business and technical requirements for data systems and applications; ensure alignment with IT policies and development standards.
Identify and implement process improvements such as automation, optimized data delivery, and scalable infrastructure redesign.
Research emerging data technologies and integration frameworks to continuously enhance engineering practices.
Lead in an Agile environment and contribute to project scoping, estimation, and scheduling.
Mentor junior developers through coaching and hands-on guidance.
What You Bring to the Team (Required)
Bachelor's degree in Computer Science, MIS, or related field, or equivalent work experience.
Strong experience in business, technical, and application design.
5+ years of experience in:
Object-oriented or functional programming (Java, Python).
AWS services (CloudFormation, S3, Lambda, IAM, EMR, KMS).
Building data pipelines (ETL, ELT, EL/TL, DaaS, Data Lake, ODS).
Working with RDBMS technologies (MySQL, PostgreSQL, MSSQL).
SDLC and Agile methodologies.
Messaging/integration tools (Kafka, SQS).
Proven ability to lead and mentor cross-functional teams across business units, vendors, and geographies.
Experience deploying and supporting production cloud services, including serverless architecture, orchestration, infrastructure-as-code (IaC), and security in AWS.
Effective communicator with the ability to collaborate with diverse stakeholders including Data Architects, Database Administrators, and Data Scientists.
Demonstrates the American Century Investments Winning Behaviors: Client Focused, Courageous and Accountable, Collaborative, Curious and Adaptable, Competitively Driven.
Additional Assets (Preferred)
Experience with architectural patterns such as event-driven design, APIs, microservices, and stateless systems.
Familiarity with advanced analytics tools and languages (R, Python, Java, SAS, etc).
Experience with ETL tools (Informatica, T-SQL, SSIS, Matillion).
1+ years of experience with data visualization tools (Tableau, QuickSight, Power BI).
Experience working with financial product datasets and in SSAE 16 or similar controlled environments.
DevOps experience with Git, build automation, artifact/package management, and deployment pipelines.
Strong background in data modeling, schema design, and dimensional modeling.
Test automation design and implementation experience.
AWS or TOGAF certification preferred.
Experience with additional AWS services (EC2, Kinesis, ECS, ELB) is a plus.
The above statements are not intended to be a complete list of all responsibilities, duties, and skills required.
What We Offer
Competitive compensation package with bonus plan
Generous PTO and competitive benefits
401k with 5% company match plus annual performance-based discretionary contribution
Tuition reimbursement, formal mentorship program, live and online learning
Learn more about our benefits and perks.
Employees are required to be in the office on a scheduled frequency. Adherence to this schedule is essential to fulfilling the expectations of the role.
American Century Investments is committed to complying with the Americans with Disabilities Act and all other applicable Equal Employment Opportunity laws and regulations. As such, American Century strives to provide a reasonable accommodation to any qualified individual under the ADA to perform essential job functions.
We encourage people of all backgrounds to join us on our mission. If you require reasonable accommodation for any aspect of the recruitment process, please send a request to HR-Talent_*******************************. All requests for accommodation will be addressed as confidentially as practicable.
American Century Investments believes all individuals are entitled to equal employment opportunity and advancement opportunities without regard to race, religious creed, color, sex, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, gender, gender identity, gender expression, age for individuals forty years of age and older, military and veteran status, sexual orientation, and any other basis protected by applicable federal, state and local laws. ACI does not discriminate or adopt any policy that discriminates against an individual or any group of individuals on any of these bases.
#LI-Hybrid
American Century Proprietary Holdings, Inc. All rights reserved.
Data Engineer
Data engineer job in Overland Park, KS
The Data Engineer is a key contributor in advancing the firm's data strategy and analytics ecosystem, transforming raw data into actionable insights that drive business decisions. This role requires a technically strong, curious professional committed to continuous learning and innovation. The ideal candidate combines analytical acumen with data engineering skills to ensure reliable, efficient, and scalable data pipelines and reporting solutions.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Data Engineering & Integration
Design, build, and maintain data pipelines and integrations using Azure Data Factory, SSIS, or equivalent ETL/ELT tools.
Automate data imports, transformations, and loads from multiple sources (on-premise, SaaS, APIs, and cloud).
Optimize and monitor data workflows for reliability, performance, and cost efficiency.
Implement and maintain data quality, validation, and error-handling frameworks (a minimal sketch follows below).
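As a rough illustration of the kind of validation framework described above, the sketch below uses pandas to run simple completeness, uniqueness, and validity checks; the file path, column names, and rules are hypothetical, not taken from the posting.

```python
# Minimal data-validation sketch using pandas; column names, the
# staging path, and the rules themselves are hypothetical examples.
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures."""
    errors = []
    if df["policy_id"].isna().any():        # completeness check
        errors.append("policy_id contains nulls")
    if df["policy_id"].duplicated().any():  # uniqueness check
        errors.append("policy_id contains duplicates")
    if (df["premium"] < 0).any():           # validity check
        errors.append("premium contains negative values")
    return errors

df = pd.read_csv("staging/policies.csv")    # hypothetical extract
problems = validate(df)
if problems:
    # A production pipeline would route this to alerting or quarantine
    # rather than simply raising.
    raise ValueError("; ".join(problems))
```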
Data Analysis & Reporting
Develop and maintain reporting databases, views, and semantic models for business intelligence solutions.
Design and publish dashboards and visualizations in Power BI and SSRS, ensuring alignment with business KPIs.
Perform ad-hoc data exploration and statistical analysis to support business initiatives.
Collaboration & Governance
Partner with stakeholders across marketing, underwriting, operations, and IT to define analytical and data integration requirements.
Maintain data integrity, enforce governance standards, and promote best practices in data stewardship.
Support data security and compliance initiatives in coordination with IT and business teams.
Continuous Improvement
Stay current with emerging data technologies and analytics practices.
Recommend tools, processes, or automation improvements to enhance data accessibility and insight delivery.
QUALIFICATIONS
Required:
Strong SQL development skills and experience with Microsoft SQL Server and Azure SQL Database.
Hands-on experience with data import, transformation, and integration using Azure Data Factory, SSIS, or similar tools.
Proficiency in building BI solutions using Power BI and/or SSRS.
Strong data modeling and relational database design skills.
Proficiency in Microsoft Excel (advanced formulas, pivot tables, external data connections).
Ability to translate business goals into data requirements and technical solutions.
Excellent communication and collaboration skills.
Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
Preferred:
Experience with cloud-based data platforms (Azure Data Lake, Synapse Analytics, Databricks).
Familiarity with version control tools (Git, Azure DevOps) and Agile development practices.
Exposure to Python or PowerShell for data transformation or automation.
Experience integrating data from insurance or financial systems.
Compensation: $120K-$129K
This position is 3 days onsite/hybrid located in Overland Park, KS
We look forward to reviewing your application. We encourage everyone to apply, even if you don't check every box for what is required or what you are looking for.
PDSINC, LLC is an Equal Opportunity Employer.
Senior Data Engineer
Data engineer job in Overland Park, KS
The Senior Data Engineer will be responsible for building and maintaining the data infrastructure that powers the organization's data-driven decision-making, designing, developing, and maintaining data pipelines, data warehouses, and other data-related infrastructure. This role works closely with data scientists, analysts, and other stakeholders to understand their data needs and translate them into robust and scalable solutions.
Key Responsibilities:
Build, maintain, and optimize data pipelines, including ELT processes, data models, reports, and dashboards to drive business insights.
Develop and implement data solutions for enterprise data warehouses and business intelligence (BI) initiatives.
Continuously monitor and optimize data pipelines for performance, reliability, and cost-effectiveness. This includes identifying bottlenecks, tuning queries, and scaling infrastructure as needed.
Automate data ingestion, processing, and validation tasks to ensure data quality and consistency.
Implement data governance policies and procedures to ensure data quality, consistency, and compliance with relevant regulations.
Contribute to the development of the organization's overall data strategy.
Conduct code reviews and contribute to the establishment of coding standards and best practices.
Required Qualifications:
Bachelor's degree in a relevant field or equivalent professional experience.
4-6 years of hands-on experience in data engineering.
Strong expertise in SQL and NoSQL databases, including PostgreSQL, DynamoDB, and MongoDB.
Experience working with cloud platforms such as GCP, Azure, or AWS and their associated data services.
Practical knowledge of data warehouses like BigQuery, Snowflake, and Redshift.
Programming skills in Python or JavaScript.
Proficiency with BI tools such as Sisense, Power BI, or Tableau.
Preferred Qualifications:
Direct experience with Google Cloud Platform (GCP).
Knowledge of CI/CD pipelines, including tools like Docker and Terraform.
Background in the healthcare industry.
Familiarity with modern data integration tools such as dbt, Matillion, and Airbyte.
Compensation: $125,000.00 per year
Who We Are
CARE ITS is a certified woman-owned and operated minority company (certified as a WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. CARE ITS was established in 2010. Since then, we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally, with headquarters in Plainsboro, NJ, and focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
Software Engineer, Data Pipeline Engineering
Data engineer job in Kansas City, MO
Building trusted markets - powered by our people. At Cboe Global Markets, we inspire our people to solve complex challenges together because what we do matters. We provide the financial infrastructure that powers the global economy. As a leading provider of market infrastructure and tradable products, Cboe delivers cutting-edge trading, clearing and investment solutions to market participants around the world.
We're building inclusive ways to support professional and personal development while strengthening the trust we've earned as a global market leader. Our teams are empowered to share ideas, actively pursue them and bring on a challenge. As champions of internal mobility and access to opportunity, we encourage our people to "go for it" and equip our managers with the training to coach their teams to the next level. Our Associate Resource Groups champion diversity, equity and inclusion, giving employees a safe space to network, share ideas and create opportunities.
Sound like the place for you? Join us!
Please note: The person selected for this role will need to be willing to go on call 2 to 3 times per year for one week at a time for 24/7 support.
Data Pipeline Engineering Team, What We Do
We design, develop, and deploy data-driven batch and streaming systems supporting 11 major North American exchange markets that trade over $1 trillion in notional securities daily. We store every order, cancel, and execution with care. We architect systems managing hundreds of billions of events per day. We optimize for scalability, resiliency, and performance.
We are a data processing focused software engineering team. We live and breathe automation. We are analytical thinkers. If it's broken, we fix it. If it needs refactoring, we refactor it. If it's hard to test, we make it testable. We are pragmatic. We ship code weekly. We're looking for like-minded individuals to join us.
Required Skills:
* Bachelor's degree or equivalent in Computer Science or related field preferred.
* 2+ years of experience as Software Engineer.
* Strong knowledge of software engineering data structures and algorithms.
* Experience with Python or strong desire to learn Python.
* Experience with Java or strong desire to learn Java.
* Experience with Postgres or similar relational databases.
* Experience with Snowflake, AWS and cloud-based services desirable.
* Experience with Kafka streaming platform or strong desire to learn about Kafka.
* Experience with Linux systems, shell scripting, command line proficiency and network concepts.
* Self-directed and self-motivated.
Benefits and Perks
We value the total wellbeing of our people - including health, financial, personal and social wellness. We believe standard benefits like health insurance and fair pay are a given at any organization. Still, you should know we offer:
* Fair and competitive salary and incentive compensation packages with an upside for overachievement
* Generous paid time off, including vacation, personal days, sick days and annual community service days
* Flexible, hybrid work environment
* Health, dental and vision benefits, including access to telemedicine and mental health services
* 2:1 401(k) match, up to 8% match immediately upon hire
* Discounted Employee Stock Purchase Plan
* Tax Savings Accounts for health, dependent and transportation
* Employee referral bonus program
* Volunteer opportunities to help you give back to your communities
Some of our employees' favorite benefits and perks include:
* Complimentary lunch, snacks and coffee in any Cboe office
* Paid Tuition assistance and education opportunities
* Generous charitable giving company match
* Paid parental leave and fertility benefits
* On-site gyms and discounts to other fitness centers
More About Cboe
We're reimagining the future of the workplace by focusing on what matters most, our people. Our journey is an inclusive one. We're investing deeply in leadership programs and career development initiatives that ensure everyone has an equal chance to succeed. We celebrate the diversity in our communities, inside and out, and welcome new perspectives with equity, inclusion and belonging.
We work with purpose, solving problems with ingenuity, collaboration, and a lot of passion. We're an engaged and excited team connecting markets across borders and embracing growth in all its forms to achieve incredible outcomes.
Learn more about life at Cboe on our website and LinkedIn.
Equal Employment Opportunity
We're proud to be an equal opportunity employer - and celebrate our employees' differences, including race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, and Veteran status.
This position is not eligible for visa sponsorship. Candidates must be legally authorized to work in the United States without the need for employer sponsorship now or in the future.
Our pay ranges are determined by a number of factors, including, but not limited to, role, experience, level, and location. The national new hire base pay range for this job in the United States is $97,750-$120,750. This range represents the minimum and maximum base pay the company expects to offer for new hires working in the position full time. If you live in one of the following areas or if you work in a Cboe office in the following areas, the range may be higher according to the geographic differentials listed below:
US Geographic Differentials:
* 110%: Austin TX, Chicago IL, Denver CO, San Diego CA
* 115%: Los Angeles CA, Seattle WA
* 120%: Boston MA, Washington DC
* 125%: New York City NY
* 130%: San Francisco CA
Within the range, individual pay is determined by a number of factors, including, but not limited to, work location, job-related skills, experience, and relevant education or training. In addition to base pay, our total rewards program includes an annual variable pay program and benefits including healthcare (medical, dental and vision), 401(k) with a generous company match, life and disability insurance, paid time off, market-leading tuition assistance, and much more! Your recruiter will provide more details about the total compensation package, including variable pay and benefits, during the hiring process. For further information on our total rewards program, visit TOTAL REWARDS @CBOE.
Any communication from Cboe regarding this position will only come from a Cboe recruiter who has *********** email or via LinkedIn Recruiter. Cboe does not use any other third party communication tools for recruiting purposes.
Data Engineer III
Data engineer job in Kansas City, MO
Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day.
Job Description
This person has the opportunity to work primarily remotely in Kansas City or the surrounding areas, making occasional visits to the office, but must CURRENTLY be in the Kansas City area.
We are unable to sponsor for this role; this includes international students.
OVERVIEW
The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high quality datasets that enable our business stakeholders and world-class Analytics department to make data informed decisions. Data engineers, combining Software Engineering and Database Engineering, serve as a primary resource for expertise with writing scripts and SQL queries, monitoring our database stability, and assisting with data governance ensuring availability for business-critical systems. The DE III works with a team of engineers of varying levels to design, develop, test, and maintain software applications and programs. The DE III will be expected to work independently when needed to solve the most complex problems encountered. They will be expected to be a leader and a mentor.
ESSENTIAL DUTIES
The essential duties for this role include, but are not limited to:
Serve as a primary advisor to Data Engineering Manager to identify and bring attention to opportunities for technical improvements, reduction of technical debt, or automation of repeated tasks.
Build advanced data pipelines utilizing the medallion architecture to create high-quality, single-source-of-truth data sources in Snowflake (a minimal sketch follows this list).
Architect replacements of current Data Management systems with respect to all aspects of data governance.
Design advanced services with multiple data pipelines to securely and appropriately store company assets in our enterprise data stores.
Technically advise any member of the data engineering department, providing direction when multiple paths forward present themselves.
Actively participate as a leader in regular team meetings, listening and ensuring that one is assisting others at every chance for growth and development.
Write advanced ETL/ELT scripts where appropriate to integrate data of various formats into enterprise data stores.
Take ownership (both individually and as part of a team) of services and applications.
Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements.
Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their requests.
Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
Work with Project Managers, Solution Architects, and Software Development teams to build solutions for Company Initiatives on time, on budget, and on value.
Independently architect solutions to problems of high complexity, and advise junior and mid-level engineers on problems of medium complexity.
Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business.
Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle, from 8am-9pm.
Follow and embrace procedures of both the Data Management team and SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance.
Support after hours and weekend releases from our internal Software Development teams.
Actively participate in code review and weekly technicals with another more senior engineer or manager.
Assist departments with time-critical SQL execution and debug database performance problems.
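As a rough illustration of the medallion pattern referenced in the duties above, the sketch below layers bronze (raw), silver (cleaned), and gold (aggregated) tables in Snowflake via the snowflake-connector-python package. The account, stage, file format, database, and table names are all hypothetical, and a JSON file format on the stage is assumed; this is a sketch of the pattern, not the team's actual pipeline.

```python
# Minimal medallion-layering sketch against Snowflake; every identifier
# below (account, stage, file format, schemas, tables) is hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="SVG_DE", password="...", account="example-account",
    warehouse="ETL_WH", database="ANALYTICS",
)

LAYERS = [
    # Bronze: land raw JSON payloads as-is from a stage.
    """CREATE TABLE IF NOT EXISTS BRONZE.ORDERS_RAW AS
       SELECT $1 AS payload, CURRENT_TIMESTAMP() AS loaded_at
       FROM @RAW_STAGE/orders/ (FILE_FORMAT => 'JSON_FMT')""",
    # Silver: parse, deduplicate, and type the raw payloads.
    """CREATE OR REPLACE TABLE SILVER.ORDERS AS
       SELECT DISTINCT payload:id::NUMBER     AS order_id,
                       payload:total::FLOAT   AS total,
                       payload:ts::TIMESTAMP  AS ordered_at
       FROM BRONZE.ORDERS_RAW""",
    # Gold: the business-ready, single-source-of-truth aggregate.
    """CREATE OR REPLACE TABLE GOLD.DAILY_ORDER_TOTALS AS
       SELECT DATE(ordered_at) AS order_date, SUM(total) AS revenue
       FROM SILVER.ORDERS
       GROUP BY 1""",
]

cur = conn.cursor()
try:
    for stmt in LAYERS:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```

The design point of the pattern is that each layer is reproducible from the one below it, so downstream consumers always read from gold while raw history is preserved in bronze.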
ROLE COMPETENCIES
The competencies for this role include, but are not limited to:
Emotional Intelligence
Drive for Results
Continuous Improvement
Communication
Strategic Thinking
Teamwork and Collaboration
Qualifications
POSITION REQUIREMENTS
The requirements to fulfill this position are as follows:
Bachelor's degree in Computer Science, or a related technical field.
4-7 years of practical production work in Data Engineering.
Expertise in the Python programming language.
Expertise in Snowflake.
Expertise in SQL, databases, and query optimization.
Must have experience with a large cloud provider such as AWS, Azure, or GCP.
Advanced at reading code independently and understanding its intent.
Advanced at writing readable, modifiable code that solves business problems.
Ability to construct reliable and robust data pipelines to support both scheduled and event-based workflows.
Working directly with stakeholders to create solutions.
Mentoring junior and mid-level engineers on best practices in programming, query optimization, and business tact.
Additional Information
Benefits:
The Company offers the following benefits for this position, subject to applicable eligibility requirements:
Competitive Compensation
Medical, Dental and vision benefits after a short waiting period
401(k) matching program
Life Insurance, and Short-term and Long-term Disability Insurance
Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan
Generous paid time off (PTO) program starting off at 15 days your first year
15 paid holidays (includes a holiday break between Christmas and New Year's)
10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave
Annual Volunteer Time Off (VTO) and a donation matching program
Employee Assistance Program (EAP) - health and well-being on and off the job
Rewards and Recognition
Diverse, inclusive and welcoming culture
Training program and ongoing support throughout your Spring Venture Group career
Security Responsibilities:
Operating in alignment with policies and standards
Reporting security incidents
Completing assigned training
Protecting assigned organizational assets
Spring Venture Group is an Equal Opportunity Employer
Senior Data Engineer
Data engineer job in Overland Park, KS
Velocity Staff, Inc. is working with our client located in the Overland Park, KS area to identify a Senior Level Data Engineer to join their Data Services Team. The right candidate will utilize their expertise in data warehousing, data pipeline creation/support and analytical reporting and be responsible for gathering and analyzing data from several internal and external sources, designing a cloud-focused data platform for analytics and business intelligence, reliably providing data to our analysts. This role requires significant understanding of data mining and analytical techniques. An ideal candidate will have strong technical capabilities, business acumen, and the ability to work effectively with cross-functional teams.
Responsibilities
Work with Data architects to understand current data models, to build pipelines for data ingestion and transformation.
Design, build, and maintain a framework for pipeline observation and monitoring, focusing on the reliability and performance of jobs (see the sketch after this list).
Surface data integration errors to the proper teams, ensuring timely processing of new data.
Provide technical consultation for other team members on best practices for automation, monitoring, and deployments.
Provide technical consultation for the team on “infrastructure as code” best practices: building deployment processes utilizing technologies such as Terraform or AWS CloudFormation.
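As an illustration of the pipeline observation described above, the sketch below publishes per-job success and duration metrics to Amazon CloudWatch with boto3; the namespace, metric names, and job name are hypothetical, and a real framework would layer alarms and dashboards on top of these metrics.

```python
# Minimal pipeline-observation hook using boto3/CloudWatch; the
# namespace, metric names, and decorated job are hypothetical.
import time
import boto3

cloudwatch = boto3.client("cloudwatch")

def observed(job_name):
    """Decorator publishing duration and success/failure metrics for a job."""
    def wrap(fn):
        def run(*args, **kwargs):
            start = time.monotonic()
            ok = 1.0
            try:
                return fn(*args, **kwargs)
            except Exception:
                ok = 0.0
                raise  # surface the error so the owning team is notified
            finally:
                cloudwatch.put_metric_data(
                    Namespace="DataServices/Pipelines",  # hypothetical namespace
                    MetricData=[
                        {"MetricName": "JobSuccess", "Value": ok,
                         "Dimensions": [{"Name": "Job", "Value": job_name}]},
                        {"MetricName": "JobDurationSeconds",
                         "Value": time.monotonic() - start,
                         "Dimensions": [{"Name": "Job", "Value": job_name}]},
                    ],
                )
        return run
    return wrap

@observed("salesforce_ingest")  # hypothetical job name
def ingest():
    ...  # pipeline body goes here
```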
Qualifications
Bachelor's degree in computer science, data science or related technical field, or equivalent practical experience
Proven experience with relational and NoSQL databases (e.g. Postgres, Redshift, MongoDB, Elasticsearch)
Experience building and maintaining AWS based data pipelines: Technologies currently utilized include AWS Lambda, Docker / ECS, MSK
Mid/senior-level development utilizing Python (pandas/NumPy, Boto3, SimpleSalesforce)
Experience with version control (git) and peer code reviews
Enthusiasm for working directly with customer teams (Business units and internal IT)
Preferred but not required qualifications include:
Experience with data processing and analytics using AWS Glue or Apache Spark
Hands-on experience building data-lake style infrastructures using streaming data set technologies (particularly with Apache Kafka)
Experience with data processing using Parquet and Avro
Experience developing, maintaining, and deploying Python packages
Experience with Kafka and the Kafka Connect ecosystem.
Familiarity with data visualization techniques using tools such as Grafana, Power BI, Amazon QuickSight, and Excel.
Not ready to apply? Connect with us to learn about future opportunities.
Senior Data Engineer - SQL (Onsite), Overland Park, KS
Data engineer job in Overland Park, KS
Netsmart is seeking a Sr. Data Engineer with expertise in Big Data and advanced ETL development. This role will focus on building and optimizing complex data pipelines using SSIS and SQL, while also leveraging cloud-native technologies such as AWS Glue and Spark. The Senior Data Engineer will ensure accurate, timely, and automated data collection to power analytics and automation tools, while driving scalability and process improvements. Strong experience with Python or Scala is preferred, along with the ability to evaluate and implement emerging data solutions that enhance data integration, governance, and delivery.
As a key member of the team, you will be responsible for maintaining smooth operations of large-scale data automation environments, ensuring data accuracy, and enabling downstream analytics and automation tools. This position requires both hands-on technical ability and the vision to recommend and implement emerging technologies that enhance data integration, governance, and delivery.
This position is not available for visa sponsorship.
This position is not eligible for relocation assistance.
Responsibilities
* Identify and leverage new tools, processes and methodologies to interpret data and analyze results using statistical techniques on an ad hoc and recurring basis
* Identify methodologies and build tools to acquire data from primary or secondary data sources and maintain databases/data systems
* Identify, analyze, and interpret trends or patterns in complex data sets; create engaging and insightful data visualizations that optimize statistical efficiency and quality
* Proactively troubleshoot data quality issues by identifying the root cause of data discrepancies and determining and implementing recommendations for resolution
* Prioritize business and information needs
* Identify, define and quantify process improvement opportunities
This position is located in our Overland Park, KS location.
Qualifications
Required
* Bachelor's degree in Mathematics, Computer Science, Statistics, Economics, Information Management or related field
* At least 4 years of data analysis work experience including data management, transformation and visualization
* Experience with reporting packages (Microsoft SSRS, SAP Crystal Reports, etc.) and industry leading BI tools such as Tableau, Qlik, or Power BI
* Experience with SQL and RDBMS databases
* Programming languages and coding technologies such as Java, .NET, Python, XML, JavaScript
* Experience analyzing large datasets from ETL frameworks
* Experience using statistical packages for analyzing datasets (Excel, SPSS, SAS etc.)
* Experience with relational and NoSQL database systems
Preferred
* Work experience in the healthcare industry, including clinical, financial and operational data
* Experience working with AI/ML
* Experience with AWS is a nice to have
* Experience with languages such as Python, Java, Ruby, C++, Perl, SQL, Hive, Scala, Spark, and Kafka
Netsmart is proud to be an equal opportunity workplace and is an affirmative action employer, providing equal employment and advancement opportunities to all individuals. We celebrate diversity and are committed to creating an inclusive environment for all associates. All employment decisions at Netsmart, including but not limited to recruiting, hiring, promotion and transfer, are based on performance, qualifications, abilities, education and experience. Netsmart does not discriminate in employment opportunities or practices based on race, color, religion, sex (including pregnancy), sexual orientation, gender identity or expression, national origin, age, physical or mental disability, past or present military service, or any other status protected by the laws or regulations in the locations where we operate.
Netsmart desires to provide a healthy and safe workplace and, as a government contractor, Netsmart is committed to maintaining a drug-free workplace in accordance with applicable federal law. Pursuant to Netsmart policy, all post-offer candidates are required to successfully complete a pre-employment background check, including a drug screen, which is provided at Netsmart's sole expense. In the event a candidate tests positive for a controlled substance, Netsmart will rescind the offer of employment unless the individual can provide proof of valid prescription to Netsmart's third party screening provider.
If you are located in a state which grants you the right to receive information on salary range, pay scale, description of benefits, or other compensation for this position, please use this form to request the details to which you may be legally entitled.
All applicants for employment must be legally authorized to work in the United States. Netsmart does not provide work visa sponsorship for this position.
Netsmart's Job Applicant Privacy Notice may be found here.
Data Engineer
Data engineer job in Overland Park, KS
At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data.
The Data Engineer will run daily operations of the data infrastructure, automating and optimizing our data operations and data pipeline architecture while ensuring active monitoring and troubleshooting. This hire will also support other engineers and analysts on data initiatives and will ensure that optimal data delivery architecture is applied consistently across ongoing projects. APPLY TODAY!
What you'll do:
* Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
* Review project objectives and determine the best technology for implementation. Implement best practice standards for development, build and deployment automation.
* Run daily operations of the data infrastructure and support other engineers and analysts on data investigations and operations.
* Monitor and report on all data pipeline tasks while working with appropriate teams to take corrective action quickly, in case of any issues.
* Work with internal teams to understand current process and areas for efficiency gains
* Write well-abstracted, reusable, and efficient code.
* Participate in the training and/or mentoring programs as assigned or required.
* Adhere to the Quest Analytics values and support a positive company culture.
* Respond to the needs and requests of clients and Quest Analytics management and staff in a professional and expedient manner.
What it requires:
* Bachelor's degree in computer science or related field.
* 3 years of work experience with ETL, data operations and troubleshooting, preferably in healthcare data.
* Proficiency with Azure ecosystems, specifically in Azure Data Factory and ADLS.
* Strong proficiency in Python for scripting, automation, and data processing.
* Advanced SQL skills for query optimization and data manipulation.
* Experience with distributed data pipeline tools like Apache Spark, Databricks, etc.
* Working knowledge of database modeling, schema design, and data governance best practices.
* Working knowledge of libraries like pandas, NumPy, etc.
* Self-motivated and able to work in a fast-paced, deadline-oriented environment.
* Excellent troubleshooting, listening, and problem-solving skills.
* Proven ability to solve complex issues.
* Customer focused.
What you'll appreciate:
* Workplace flexibility - you choose between remote, hybrid or in-office
* Company paid employee medical, dental and vision
* Competitive salary and success sharing bonus
* Flexible vacation with no cap, plus sick time and holidays
* An entrepreneurial culture that won't limit you to a job description
* Being listened to, valued, and appreciated, and having your contributions rewarded
* Enjoying your work each day with a great group of people
Apply TODAY!
careers.questanalytics.com
About Quest Analytics
For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here.
Visa sponsorship is not available at this time.
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.
Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discriminations or harassment.
Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence, [email protected]
NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with additional outside agencies at this time.
Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, and unavailable.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Data Engineer II
Data engineer job in Leawood, KS
Full-time
Description
27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards.
We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions.
Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.
Your Role:
Participate in the design and implementation of scalable, secure, and high-performance data architectures.
Develop and maintain conceptual, logical, and physical data models.
Work closely with architects to define standards for data integration, quality, and governance.
Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
Support cloud-based data strategies including data warehousing, pipelines, and real-time processing.
Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads.
Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling (see the sketch after this list).
Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.
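As a rough sketch of the preprocessing and feature-engineering work described in the list above, the PySpark example below derives simple frequency, monetary, and recency features. The S3 paths and column names are hypothetical, and writing in Delta format assumes the Delta Lake package is configured on the cluster; it is an illustration of the technique, not 27Global's actual pipeline.

```python
# Minimal PySpark feature-engineering sketch; source/target paths and
# column names are hypothetical, and Delta output assumes delta-spark
# is available on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-prep").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # hypothetical source

features = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("customer_id")
    .agg(
        F.count("*").alias("event_count"),      # frequency feature
        F.avg("amount").alias("avg_amount"),    # monetary feature
        F.max("event_date").alias("last_seen"), # recency basis
    )
    .withColumn("days_since_seen",
                F.datediff(F.current_date(), F.col("last_seen")))
)

features.write.mode("overwrite").format("delta").save(
    "s3://example-bucket/features/"  # hypothetical feature store path
)
```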
Requirements
What You Bring:
BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or related field.
2-4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
2-4 years of experience writing code in .NET or other OOP languages in an Agile environment.
Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members.
Proficient technical skills in Spark, Scala, C#, PySpark, Data Lake, Delta Lake, relational and NoSQL databases, AWS Glue, and Azure Synapse
Experience with SQL, ETL/ELT, and data modeling.
Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with data lake.
Knowledge of data governance, security, and compliance frameworks.
Ability to context switch and work on a variety of projects over specified periods of time.
Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.
Ways to Stand Out:
Certifications: AWS Solution Architect, Azure Data Engineer, Databricks Data Engineer
Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
Hands-on experience with big data tools (Spark, Kafka).
Modern data warehouses (Snowflake, Redshift, BigQuery).
Familiarity with machine learning pipelines and real-time analytics.
Strong communication skills and ability to influence stakeholders.
Prior experience implementing enterprise data governance frameworks.
Experience in a client-facing role, working directly with clients from multiple levels of the organization; often presenting and documenting client environment suggestions and improvements.
Why 27G?:
Four-time award winner of Best Place to Work by the Kansas City Business Journal.
A casual and fun small business work environment.
Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
Dedicated time for learning, development, research, and certifications.
Principal Data Engineer
Data engineer job in Lenexa, KS
Job Description
About the Role: weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission.
You'll be the technical lead for everything data: building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently.
What You'll Do:
Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity (see the sketch after this list)
Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable)
Enable data-driven decision-making across product, engineering, and business teams
Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling)
Ensure data quality, observability, governance, and security across all systems
Serve as the subject matter expert on data systems, operating as a senior IC without a team initially
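As a rough sketch of the IoT event ingestion described in the list above, the example below consumes device telemetry from Kafka using the kafka-python package. The topic, broker, consumer group, and field names are hypothetical, and the actual platform may well be built on the Node.js/TypeScript stack named in the requirements; this only illustrates the shape of such a consumer.

```python
# Minimal IoT event-ingestion consumer using kafka-python; the topic,
# broker address, group id, and event fields are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "device-telemetry",                          # hypothetical topic
    bootstrap_servers=["broker.example.com:9092"],
    group_id="data-platform-ingest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Route malformed events aside rather than failing the whole stream.
    if "device_id" not in event or "ts" not in event:
        print("dead-letter:", event)
        continue
    # A real pipeline would write to cloud storage or a warehouse table here.
    print(event["device_id"], event["ts"], event.get("battery"))
```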
What You Bring:
6+ years of experience in data engineering, ideally within a startup or high-growth environment
Proven ability to independently design, implement, and manage scalable data architectures
Deep experience working with large datasets, ideally from IoT sources or other high-volume systems
Proficiency with modern data tools and languages (e.g., TypeScript, Node.js, SQL)
Strong cloud experience, ideally with Microsoft Azure (but AWS or GCP also acceptable)
A business-focused mindset with the ability to connect technical work to strategic outcomes
Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms.
Excellent communication and collaboration skills across technical and non-technical teams
Bonus Points For:
Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.)
Familiarity with BI tools and self-service analytics platforms
Background in system performance monitoring and observability tools
Why weavix
Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers, the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest asset: people.
It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself/your family, and invest in your future.
Perks and Benefits
Competitive Compensation
Employee Equity Stock Program
Competitive benefits package including medical, dental, vision, life, and disability insurance
401(k) Retirement Plan + Company Match
Flexible Spending & Health Savings Accounts
Paid Holidays
Flexible Time Off
Employee Assistance Program (EAP)
Other exciting company benefits
About Us
weavix, the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms the enterprise by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results. weavix is the single source of truth for both workers and executives.
Our mission is simple: to connect every disconnected worker through disruptive technology.
How do you want to make your impact?
For more information about us, visit weavix.com.
Equal Employment Opportunity (EEO) Statement
weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce.
We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment.
Americans with Disabilities Act (ADA) Statement
weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************.
E-Verify Notice
Notice: weavix participates in the E-Verify program to confirm employment eligibility as required by law.
Principal Data Engineer
Data engineer job in Lenexa, KS
About the Role: weavix is seeking a hands-on, business-minded Senior or Principal Data Engineer to architect and own our data infrastructure from the ground up. This is a unique opportunity to shape the future of data at a high-growth startup where IoT, scale, and performance are core to our mission.
You'll be the technical lead for everything data - building pipelines, architecting systems, and working cross-functionally to extract insights that power customer growth, analyze user behavior, and improve system reliability and performance. This is a highly autonomous role, perfect for someone with startup experience who enjoys solving complex problems independently.
What You'll Do:
Architect, build, and maintain scalable data systems and pipelines to ingest and process large-scale data from IoT devices and user activity
Own the design and implementation of our cloud-based data platform (Microsoft Azure strongly preferred; GCP or AWS also acceptable)
Enable data-driven decision-making across product, engineering, and business teams
Create a data architecture that supports both operational and analytical use cases (growth analytics, performance monitoring, system scaling)
Ensure data quality, observability, governance, and security across all systems
Serve as the subject matter expert on data systems, operating as a senior IC without a team initially
What You Bring:
6+ years of experience in data engineering, ideally within a startup or high-growth environment
Proven ability to independently design, implement, and manage scalable data architectures
Deep experience working with large datasets, ideally from IoT sources or other high-volume systems
Proficiency with modern data tools and languages (e.g., TypeScript, Node.js, SQL)
Strong cloud experience, ideally with Microsoft Azure (but AWS or GCP also acceptable)
A business-focused mindset with the ability to connect technical work to strategic outcomes
Experience with New Relic, Metabase, Postgres, Grafana, Azure Storage, MongoDB, or other storage, database, graphing, or alerting platforms.
Excellent communication and collaboration skills across technical and non-technical teams
Bonus Points For:
Experience with event-driven or real-time data systems (Kafka, Kinesis, etc.)
Familiarity with BI tools and self-service analytics platforms
Background in system performance monitoring and observability tools
Why weavix
Being a part of the weavix team is being a part of something bigger. We value the innovators and the risk-takers, the ones who love a challenge. Through our shared values and dedication to our mission to Connect every Disconnected Worker, we're reshaping the future of work to focus on this world's greatest asset: people.
It's truly amazing what happy, engaged team members can achieve. Our ever-evolving list of benefits means you'll be able to achieve work/life balance, perform impactful work, grow in your role, look after yourself/your family, and invest in your future.
Perks and Benefits
Competitive Compensation
Employee Equity Stock Program
Competitive Benefits Package including: Medical, Dental, Vision, Life, and Disability Insurance
401(k) Retirement Plan + Company Match
Flexible Spending & Health Savings Accounts
Paid Holidays
Flexible Time Off
Employee Assistance Program (EAP)
Other exciting company benefits
About Us
weavix, the Internet of Workers platform, revolutionizes frontline communication and productivity at scale. Since its founding, weavix has shaped the future of work by introducing innovative methods to better connect and enable the frontline workforce. weavix transforms enterprises by providing data-driven insights into facilities and teams to maximize productivity and achieve breakthrough results. weavix is the single source of truth for both workers and executives.
Our mission is simple: to connect every disconnected worker through disruptive technology.
How do you want to make your impact?
For more information about us, visit weavix.com.
Equal Employment Opportunity (EEO) Statement
weavix is an Equal Opportunity Employer. At weavix, diversity fuels innovation. We are dedicated to fostering an inclusive environment where every team member is empowered to contribute to our mission of connecting the disconnected workforce.
We do not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, genetic information, or any other legally protected characteristic. All qualified applicants will receive consideration for employment.
Americans with Disabilities Act (ADA) Statement
weavix is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need assistance or an accommodation during the application process due to a disability, you may contact us at *************.
E-Verify Notice
Notice: weavix participates in the E-Verify program to confirm employment eligibility as required by law.
Auto-Apply
Slalom Flex (Project Based) - Java Data Engineer
Data engineer job in Kansas City, MO
About the Role: We are seeking a highly skilled and motivated Data Engineer to join our team as an individual contributor. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support our data-driven initiatives. You will work closely with cross-functional teams to ensure data availability, quality, and performance across the organization.
About Us
Slalom is a purpose-led, global business and technology consulting company. From strategy to implementation, our approach is fiercely human. In six countries and 43 markets, we deeply understand our customers, and their customers, to deliver practical, end-to-end solutions that drive meaningful impact. Backed by close partnerships with over 400 leading technology providers, our 10,000+ strong team helps people and organizations dream bigger, move faster, and build better tomorrows for all. We're honored to be consistently recognized as a great place to work, including being one of Fortune's 100 Best Companies to Work For seven years running. Learn more at slalom.com.
Key Responsibilities:
* Design, develop, and maintain robust data pipelines using Java and Python.
* Build and optimize data workflows on AWS using services such as EMR, Glue, Lambda, and NoSQL databases.
* Leverage open-source frameworks to enhance data processing capabilities and performance.
* Collaborate with data scientists, analysts, and other engineers to deliver high-quality data solutions.
* Participate in Agile development practices, including sprint planning, stand-ups, and retrospectives.
* Ensure data integrity, security, and compliance with internal and external standards.
Required Qualifications:
* 5+ years of hands-on experience in software development using Java (Spring Boot) and Python.
* 1+ years of experience working with AWS services including EMR, Glue, Lambda, and NoSQL databases.
* 3+ years of experience working with open-source data processing frameworks (e.g., Apache Spark, Kafka, Airflow).
* 2+ years of experience in Agile software development environments.
* Strong problem-solving skills and the ability to work independently in a fast-paced environment.
* Excellent communication and collaboration skills.
Preferred Qualifications:
* Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform, CloudFormation).
* Familiarity with data governance and data quality best practices.
* Exposure to data lake and data warehouse architectures.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements.
Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
Senior Data Engineer
Data engineer job in Overland Park, KS
Company Details
Intrepid Direct Insurance (IDI) is a rapidly growing direct-to-consumer property and casualty insurance company. A member of the W. R. Berkley Corporation, a Fortune 500 company rated A+ (Superior) by A.M. Best, Intrepid Direct's vision is to make life better for business. The insurance industry has not evolved with innovation like other major industries. We're here to change that. We are making life better for our customers, shareholders, and our team members by leveraging data and technology as insurance experts for our targeted customers. You will be part of a highly collaborative team of talented and focused professionals. Join a group that enjoys working together, trusts each other, and takes pride in our hard-earned success.
The Company is an equal employment opportunity employer.
Responsibilities
Intrepid Direct Insurance is looking for an experienced Senior Data Engineer to mentor, orchestrate, implement, and monitor the data flowing through our organization. This opportunity will have a direct influence on how data is made available to our business units, as well as our customers. You'll primarily be working with our operations and engineering teams to create and enhance data pipelines, conform and enrich data, and deliver information to business users. Learn the ins and outs of what we do so that you can focus on improving the availability and quality of the data we use to service our customers.
Key functions include but are not limited to:
Assist with long-term strategic planning for modern data warehousing needs.
Contribute to data modeling exercises and the buildout of our data warehouse.
Monitor, support, and analyze existing pipelines and recommend performance and process improvements to address gaps in existing processes.
Automate manual processes owned by data team.
Troubleshoot and remediate ingestion and reporting related issues.
Design and build new pipelines to ingest data from additional disparate sources.
Responsible for the accuracy and availability of data in our data warehouse.
Collaborate with a multi-disciplinary team to develop data-driven solutions that align with our business and technical needs.
Create and deploy reports as needed.
Assist with cataloging and classifying existing data sets.
Participate in peer reviews with emphasis on continuous improvement.
Respond to regulatory requests for information.
Assume other tasks and duties as assigned by management.
Mentor team members and advise on best practices.
Qualifications
Bachelor's degree in Mathematics, Statistics, Computer Science, or equivalent experience.
6+ years of relevant data engineering experience.
Analytical thinker with experience working in a fast-paced, startup environment.
Technical expertise with Microsoft SQL Server.
Familiarity with ETL tools and concepts.
Hands-on experience with database design and data modeling, preferably with the Data Vault methodology.
Experience supporting and troubleshooting SSIS packages.
Experience consuming event-based data through APIs or queues.
Experience in Agile software development.
Experience with insurance data highly desired.
Detail-oriented, with solid organizational and problem-solving skills.
Strong written, visual, and verbal communication skills.
Team oriented with a strong willingness to serve others in an agile startup environment.
Flexible in assuming new responsibilities as they arise.
Experience with Power BI desired.
Additional Company Details
We do not accept unsolicited resumes from third-party recruiting agencies or firms.
The actual salary for this position will be determined by a number of factors, including the scope, complexity, and location of the role; the skills, education, training, credentials, and experience of the candidate; and other conditions of employment.
Sponsorship Details
Sponsorship is not offered for this role.
Auto-Apply
Corporate Treasury Data & Risk Analytics
Data engineer job in Overland Park, KS
We are seeking a driven and analytically minded professional to join our Corporate Treasury team. This individual will play a key role supporting asset/liability management, liquidity management, budgeting & forecasting, data analytics, and performance analysis/reporting.
In this role, you will work closely with senior and executive leadership to deliver strategic financial insights, optimize business performance, support and influence decision-making, uncover data-driven stories, and challenge existing processes with fresh, innovative thinking.
Essential Duties & Responsibilities
Responsibilities will be tailored to the experience and skillset of the selected candidate and may include:
* Developing and enhancing financial models and simulations
* Supporting forecasting, liquidity, and ALM analytics
* Conducting "what-if" scenario analysis and presenting actionable insights
* Building dashboards, reporting tools, and performance summaries
* Driving or contributing to process improvement initiatives
* Collaborating cross-functionally with senior leaders across the organization
Experience & Knowledge
* Financial modeling and earnings simulation experience using risk/performance management tools
* Designing and developing mathematical or statistical models to support strategic decision-making and risk management
* Experience running scenario analysis and synthesizing insights for executive audiences
* Familiarity with financial asset/liability instruments, market instruments, and their interactions
* Experience with Funds Transfer Pricing (FTP) and capital allocation is a plus
* Demonstrated success driving effective process improvements
Education
* Bachelor's degree in Accounting, Finance, or a related field required
CapFed is an equal opportunity employer.
Auto-Apply
Data Engineer II
Data engineer job in Kansas City, MO
Who We Are: Spring Venture Group is a leading digital direct-to-consumer sales and marketing company with product offerings focused on the senior market. We specialize in distributing Medicare Supplement, Medicare Advantage, and related products via our family of brands and dedicated team of licensed insurance agents. Powered by our unique technologies that combine sophisticated marketing, comparison shopping, sales execution, and customer engagement - we help thousands of seniors across the country navigate the complex world of Medicare every day.
Job Description
OVERVIEW
The Data Management team is responsible for all things data at Spring Venture Group. Most importantly, our team is responsible for constructing high-quality datasets that enable our business stakeholders and world-class Analytics department to make data-informed decisions. Data engineers, combining software engineering and database engineering, serve as a primary resource for expertise in writing scripts and SQL queries, monitoring our database stability, and assisting with data governance to ensure availability for business-critical systems. The DE II works with a team of engineers of varying levels to design, develop, test, and maintain data applications and programs. The DE II will be expected to work independently when needed to solve moderately complex problems.
We are unable to sponsor for this role at any time, which includes OPT, EAD, and C2C candidates. You must also CURRENTLY be in the Kansas City area.
REPORTS TO
The Data Engineer II reports to the Manager of Data Management in the Technology Department.
ESSENTIAL DUTIES
The essential duties for this role include, but are not limited to:
Implement changes to existing Data Management systems with respect to data integrity, documentation, and reporting
Write advanced Extract, Transform, and Load (ETL) scripts to integrate data of various formats into enterprise data stores.
Write complex SQL queries, scripts, and stored procedures to reliably and consistently modify data throughout our organization according to business requirements.
Collaborate directly and independently with stakeholders to build familiarity, fully understand their needs, and create custom, modular, and reliable solutions to resolve their problems
Work with Project Managers, Solution Architects, and Software Development teams to produce architected solutions for Company Initiatives on time, on budget, and on value.
Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
Collaborate with more senior engineers on problems of high complexity, architect solutions to problems of medium complexity, and advise newer engineers on problems of standard complexity.
Create data pipelines using appropriate and applicable technologies from Amazon Web Services (AWS) to serve the specific needs of the business.
Ensure 99.95% uptime of our company's services by monitoring data anomalies, batch failures, and our support chat for one week per team cycle, from 8am to 9pm.
Follow and embrace procedures of both the Data Management team and SVG Software Development Life Cycle (SDLC), including obtaining and retaining IT Security Admin III clearance.
Support after hours and weekend releases from our internal Software Development teams.
Actively participate in code review and weekly technicals with another more senior engineer or manager.
Assist departments with time-critical SQL execution and debug database performance problems.
Qualifications
ROLE COMPETENCIES
The competencies for this role include, but are not limited to:
Emotional Intelligence
Drive for Results
Continuous Improvement
Communication
Strategic Thinking
Teamwork and Collaboration
POSITION REQUIREMENTS
Bachelor's degree in Computer Science, or a related technical field.
2-4 years of practical production work in Data Engineering.
Advanced at reading code independently and understanding its intent.
Advanced at writing readable, modifiable code that solves business problems.
Ability to construct reliable and robust data pipelines to support both scheduled and event-based workflows.
Experience working directly with stakeholders to create solutions.
Strong knowledge of the Python programming language.
Strong understanding of SQL, databases, & query optimization.
Additional Information
Benefits:
The Company offers the following benefits for this position, subject to applicable eligibility requirements:
Competitive Compensation
Medical, dental, and vision benefits after a short waiting period
401(k) matching program
Life Insurance, and Short-term and Long-term Disability Insurance
Optional enrollment includes HSA/FSA, AD&D, Spousal/Dependent Life Insurance, Travel Assist and Legal Plan
Generous paid time off (PTO) program starting off at 15 days your first year
15 paid holidays (includes a holiday break between Christmas and New Year's)
10 days of Paid Parental Leave and 5 days of Paid Birth Recovery Leave
Annual Volunteer Time Off (VTO) and a donation matching program
Employee Assistance Program (EAP) - health and well-being on and off the job
Rewards and Recognition
Diverse, inclusive and welcoming culture
Training program and ongoing support throughout your Spring Venture Group career
Security Responsibilities:
Operating in alignment with policies and standards
Reporting security incidents
Completing assigned training
Protecting assigned organizational assets
Spring Venture Group is an Equal Opportunity Employer
Sr. Data Engineer
Data engineer job in Overland Park, KS
At Quest Analytics, our mission is to make healthcare more accessible for all Americans. As part of our team, you'll work in an innovative, collaborative, challenging, and flexible environment that supports your personal growth every day. We are looking for a talented and motivated Senior Data Engineer with experience in building scalable infrastructure, implementing automation, and enabling cross-functional teams with reliable and accessible data.
The Senior Data Engineer will help modernize and scale our data environment. This person will play a key role in transforming existing manual workflows into automated, cloud-based pipelines using Azure Data Factory, Databricks, and modern data platforms. If you are looking for a high-impact opportunity to shape how data flows across the business, APPLY TODAY!
What you'll do:
Identify, design, and implement internal process improvements (e.g., automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability).
Transform manual SQL/SSMS/stored procedure workflows into automated pipelines using Azure Data Factory.
Write clean, reusable, and efficient code in Python (and optionally C# or Scala).
Leverage distributed data tools such as Spark and Databricks for large-scale processing.
Review project objectives to determine and implement the most suitable technologies.
Apply best practice standards for development, build, and deployment automation.
Manage day-to-day operations of the data infrastructure and support engineers and analysts with data investigations.
Monitor and report on data pipeline tasks, collaborating with teams to resolve issues quickly.
Partner with internal teams to analyze current processes and identify efficiency opportunities.
Participate in training and mentoring programs as assigned or required.
Uphold Quest Analytics values and contribute to a positive company culture.
Respond professionally and promptly to client and internal requests.
Perform other duties as assigned.
What it requires:
Bachelor's Degree in Computer Science or equivalent education/experience.
3-5 years of experience with ETL, data operations, and troubleshooting, preferably in Healthcare data.
Strong SQL development skills (SSMS, stored procedures, and optimization).
Proficiency in Python, C#, or Scala (experience with pandas and NumPy is a plus).
Solid understanding of the Azure ecosystem, especially Azure Data Factory and Azure Data Lake Storage (ADLS).
Hands-on experience with Azure Data Factory and ADLS.
Familiarity with Spark, Databricks, and data modeling techniques.
Experience working with both relational databases (e.g., SQL Server) and NoSQL databases (e.g., MongoDB).
Self-motivated, a strong problem solver, and able to thrive in fast-paced environments.
Excellent troubleshooting, listening, and analytical skills.
Customer-focused mindset with a collaborative, team-oriented approach.
We are not currently engaging with outside agencies on this role. Visa sponsorship is not available at this time.
What you'll appreciate:
Workplace flexibility - you choose between remote, hybrid, or in-office
Company-paid employee medical, dental, and vision
Competitive salary and success sharing bonus
Flexible vacation with no cap, plus sick time and holidays
An entrepreneurial culture that won't limit you to a job description
Being listened to, valued, appreciated, and having your contributions rewarded
Enjoying your work each day with a great group of people
Apply TODAY! careers.questanalytics.com
About Quest Analytics
For more than 20 years, we've been improving provider network management one groundbreaking innovation at a time. 90% of America's health plans use our tools, including the eight largest in the nation. Achieve your personal quest to build a great career here. Visa sponsorship is not available at this time.
Preferred work locations are within one of the following states: Alabama, Arizona, Arkansas, Colorado, Connecticut, Delaware, Florida, Georgia, Idaho, Illinois (outside of Chicago proper), Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, West Virginia, Wisconsin, or Wyoming.
Quest Analytics provides equal employment opportunities to all people without regard to race, color, religion, sex, national origin, ancestry, marital status, veteran status, age, disability, sexual orientation or gender identity or expression, or any other legally protected category. We are committed to creating and maintaining a workforce environment that is free from any form of discrimination or harassment.
Applicants must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire.
Persons with disabilities who anticipate needing accommodations for any part of the application process may contact, in confidence *********************
NOTE: Staffing agencies, headhunters, recruiters, and/or placement agencies, please do not contact our hiring managers directly. We are not currently working with additional outside agencies at this time. Any job posting displayed on websites other than questanalytics.com or jobs.lever.co/questanalytics/ may be out of date, inaccurate, or unavailable.
Auto-Apply
Data Engineer II
Data engineer job in Leawood, KS
Job Description
27Global is a rapidly growing company in the dynamic industry of software, cloud, and data engineering. We pride ourselves on the quality of the services we deliver, the clients we serve, and the strength of our culture. Our commitment to our employees is evidenced by our five Best Places to Work awards.
We're looking for a Data Engineer to join our team! You'll be responsible for contributing to the design and development of enterprise data solutions that support analytics, business intelligence, and scalable applications. You'll work closely with data and software architects, consultants, and other engineers to deliver data models, integration strategies, and governance practices that empower clients' data-driven decisions.
Joining 27Global as a Data Engineer is an exciting high-growth opportunity offering a competitive base salary, performance bonuses, and variable compensation.
Your Role:
Participate in the design and implementation of scalable, secure, and high-performance data architectures.
Develop and maintain conceptual, logical, and physical data models.
Work closely with architects to define standards for data integration, quality, and governance.
Collaborate with engineers, analysts, and business stakeholders to align data solutions with organizational needs.
Support cloud-based data strategies including data warehousing, pipelines, and real-time processing.
Design and optimize data pipelines that support AI, machine learning, and advanced analytics workloads.
Implement data preprocessing, feature engineering, and real-time inference capabilities for predictive modeling.
Integrate AI/ML models into production environments using tools such as AWS SageMaker, Azure Machine Learning, or Databricks.
Assess, learn, and apply emerging data technologies and frameworks to enhance solutions and stay current with industry trends.
Requirements:
What You Bring:
BA/BS/Master's degree in Computer Science, Information Systems, Data Science, or related field.
2 - 4 years of experience in data architecture, data engineering, or related roles delivering scalable architecture solutions from design to production.
2 - 4 years of experience writing .NET code or other OOP languages in an Agile environment.
Demonstrated leadership skills with the ability to collaborate with and lead on-shore and off-shore team members.
Proficient technical skills in: Spark, Scala, C#, PySpark, Data Lake, Delta Lake, relational and NoSQL databases, AWS Glue, and Azure Synapse
Experience with SQL, ETL/ELT, and data modeling.
Experience with cloud platforms (AWS, Azure, GCP) and implementing modern data platforms with data lake.
Knowledge of data governance, security, and compliance frameworks.
Ability to context switch and work on a variety of projects over specified periods of time.
Ability to work at the 27Global office in Leawood, KS with hybrid work flexibility after 90 days, and occasionally onsite at client offices.
Flexibility to occasionally travel to client sites may be required, typically 1 week per quarter or less.
Legal authorization to work in the United States and the ability to prove eligibility at the time of hire.
Ways to Stand Out:
Certifications: AWS Solution Architect, Azure Data Engineer, Databricks Data Engineer
Hands-on experience with Databricks for building and optimizing scalable data pipelines, Delta Lake, and Spark-based analytics.
Hands-on experience with big data tools (Spark, Kafka).
Modern data warehouses (Snowflake, Redshift, BigQuery).
Familiarity with machine learning pipelines and real-time analytics.
Strong communication skills and ability to influence stakeholders.
Prior experience implementing enterprise data governance frameworks.
Experience in a client-facing role, working directly with clients from multiple levels of the organization; often presenting and documenting client environment suggestions and improvements.
Why 27G?:
Four-time award winner of Best Place to Work by the Kansas City Business Journal.
A casual and fun small business work environment.
Competitive compensation, benefits, time off, profit sharing, and quarterly bonus potential.
Dedicated time for learning, development, research, and certifications.
Adobe Real-Time Customer Data Platform (RT-CDP) Architect
Data engineer job in Kansas City, MO
Who You'll Work With
The Adobe team drives strategic direction and solution enablement in support of marketing teams. We accelerate innovation and learning, and advance sales and delivery excellence, with high-caliber marketing technology solutions including Adobe technology expertise. We focus on four go-to-market solution areas: Experience and Content Management, with a focus on Content Supply Chain and Digital Asset Management; Personalized Insights and Engagement, with a focus on Analytics, Customer Data Platforms, and Journey Orchestration; Digital Commerce, with a focus on Experience-Led Commerce and Product Information Management; and Marketing Operations and Workflow, with a focus on resource management, reporting, and approvals of the content and data required to run personalization and campaigns at scale.
We are seeking a talented Adobe RT-CDP Architect to join our team as a senior consultant or principal. This is a client-facing role that involves close collaboration with both technical and non-technical stakeholders.
What You'll Do
* Implement, configure, and enable Adobe Customer Data Platform (CDP)
* Provide the technical design and data architecture for configuring RT-CDP to meet clients' business goals
* Responsible for understanding business problems and capturing client requirements by leading effective conversations with business and technical client teams
* Determine how best to apply the out-of-the-box product to provide a solution, including finding alternative approaches that best leverage the platform
* Provide analytics domain expertise, consultation, and troubleshooting
* Learn new platforms, new capabilities, and new clouds to stay on top of the ever-growing product ecosystem (CJA, AJO, Marketo)
What You'll Bring
* Expertise in configuration, implementation, and integration of the RT-CDP product without significant help from others
* Knowledge of, and experience with RT-CDP B2C, B2B and/or B2P
* Knowledge of how RT-CDP works with other Adobe Experience Platform products
* Experience implementing and driving success with RT-CDP for enterprise clients in an architecture role
* Proficient with manipulating, structuring, and merging data from different data sources and understanding of typical data sources within an enterprise environment
* Knowledge of how graph stitching, profile merge rules, profile collapsing, and householding concepts work in RT-CDP
* Ability to translate business rules into technical requirements and implementation of those requirements
* Proficient with data transformation, API-based integrations and JavaScript tagging
* Experience working with SQL, R, and/or Python preferred
* Enterprise experience designing multi-solution architecture
* Strong communication skills and a passion for learning new technologies and platform capabilities
* Build strong relationships with clients and understand them from a business and strategic perspective
* Occasional travel as needed by client
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this role, we are hiring at the following levels and targeted base salary ranges: the targeted base salary range for a Senior Consultant for this position is $110,000 to $203,000, and the targeted base salary range for a Principal is $122,000 to $225,000. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary range is subject to change and may be modified at any time.
We will accept applicants until December 12th, 2025, or until the position is filled.
We are committed to pay transparency and compliance with applicable laws. If you have questions or concerns about the pay range or other compensation information in this posting, please contact us at: ********************.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team or contact ****************************** if you require accommodations during the interview process.
#LI-KM
Easy Apply