Data Engineer
Data engineer job in Chicago, IL
Scaylor helps mid-market and enterprise companies make sense of their data. Most teams past $20M ARR are stuck with fragmented systems, old databases, and hundreds of spreadsheets that never quite line up. We build the pipelines that clean, normalize, and connect that data so it can actually be used.
Our platform handles everything from parsing financial models and reports to mapping tables across legacy databases and modern APIs. The goal is simple: give companies a single source of truth they can trust.
We're a small team of four - one backend engineer, one frontend engineer, and two founders. We're looking for our fifth teammate to help us scale the engine behind it all.
⸻
The Role
You'll work across data engineering and full-stack development, helping us build reliable data infrastructure that powers real workflows. You'll touch everything from ingestion and transformation pipelines to the APIs and dashboards that surface insights to clients.
You'll work directly with the founding team and help make technical decisions that define the next version of Scaylor's core platform.
⸻
What You'll Work On
• Build data pipelines that extract, clean, and standardize information from Excel files, PDFs, APIs, and legacy databases (see the sketch after this list)
• Design schemas and transformation logic for structured and semi-structured data
• Develop and maintain backend APIs (Python/FastAPI or Node/Express) for data access and analytics
• Help connect backend services to our frontend dashboards (React, Node.js, or similar)
• Set up and maintain AWS infrastructure (Lambda, S3, ECS, CloudFormation)
• Collaborate with clients to understand their data problems and design workflows that fix them
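For a sense of the work, a minimal sketch of an extract-and-normalize step over a spreadsheet; the file name and column mappings are hypothetical, not Scaylor's actual pipeline:

```python
# A minimal sketch of an extract-and-normalize step (hypothetical file and column names).
import pandas as pd

# Assumed mapping from messy legacy spreadsheet headers to canonical names.
COLUMN_MAP = {"Acct #": "account_id", "Rev (USD)": "revenue_usd"}

def normalize_sheet(path: str) -> pd.DataFrame:
    df = pd.read_excel(path, sheet_name=0)          # requires openpyxl for .xlsx files
    df = df.rename(columns=COLUMN_MAP)
    df["account_id"] = df["account_id"].astype(str).str.strip()
    df["revenue_usd"] = pd.to_numeric(df["revenue_usd"], errors="coerce")
    return df.dropna(subset=["account_id"])

# clean = normalize_sheet("q3_financials.xlsx")     # then load into the warehouse
```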
⸻
You'd Be Great Here If You
• Have 3-6 years of experience in data engineering, backend, or full-stack roles
• Write clean, maintainable code in Python and JavaScript
• Understand ETL, data normalization, and schema mapping
• Have experience with SQL and working with legacy databases or systems
• Are comfortable managing cloud services and debugging data pipelines
• Enjoy solving messy data problems and care about building things that last
⸻
Nice to Have
• Familiarity with GCP or SQL databases
• Understanding of enterprise data flows (ERP, CRM, or financial systems)
• Experience building and deploying containers (Docker, GitHub Actions, CI/CD)
• Interest in lightweight ML or LLM-assisted data transformation
⸻
Why Join Scaylor
• Be one of the first five team members shaping the product and the company
• Work directly with the founders and help define Scaylor's technical direction
• Build infrastructure that solves real problems for real companies
• Earn meaningful equity and have a say in how the company grows
⸻
Compensation
• $130k - $150k, with raises based on set revenue triggers
• 0.4% equity
• Relocation to Chicago, IL required
Data Engineer
Data engineer job in Chicago, IL
Data Engineer - Build the Data Engine Behind AI Execution - Starting Salary $150,000
You'll be part architect, part systems designer, part execution partner - someone who thrives at the intersection of engineering precision, scalability, and impact.
As the builder behind the AI data platform, you'll turn raw, fragmented data into powerful, reliable systems that feed intelligent products. You'll shape how data flows, how it scales, and how it powers decision-making across AI, analytics, and product teams.
Your work won't be behind the scenes - it will be the foundation of everything we build.
You'll be joining a company built for builders. Our model combines AI consulting, venture building, and company creation into one execution flywheel. Here, you won't just build data pipelines - you'll build the platforms that power real products and real companies.
You know that feeling when a data system scales cleanly under real-world pressure, when latency drops below target, when complexity turns into clarity - and everything just flows? That's exactly what you'll build here.
Ready to engineer the platform that powers AI execution? Let's talk.
No up-to-date resume required.
Data Architect
Data engineer job in Oak Brook, IL
GeoWealth is a Chicago-based fintech firm that offers an award-winning digital advisory platform, including Turnkey Asset Management Platform ("TAMP") capabilities. We deliver a comprehensive and fully integrated wealth management technology platform to professionals in the financial services industry.
OPPORTUNITY:
We're looking for a Data Architect to join our Engineering Team. In this role, you will oversee the overall data architecture, helping us deliver our best-in-class solutions to our customers. This role will be key in organizing, designing, and leading our team through well-designed data architecture. If you love architecting complex systems, delivering customer-focused software, and leading data architecture design, this role is for you.
RESPONSIBILITIES:
Own data architecture and oversee data implementation
Set coding/implementation standards
Lead our data warehouse design
Deliver performant, maintainable, and quality software in collaboration with our teams.
Improve our database design to reduce replication and increase performance
Partner with other architects and engineers to produce better designed systems
SKILLS, KNOWLEDGE, AND EXPERIENCE:
5+ years of experience as a Data Architect or in an equivalent role
Bachelor's degree in computer science or an equivalent degree
Hands-on experience with Oracle
Experience designing and implementing a data warehouse
Experience with the following is preferred but not required: designing and building monolithic and distributed systems, Postgres, Logi Symphony, PowerBI, Java and JIRA/Confluence
COMPANY CULTURE & PERKS - HIGHLIGHTS:
Investing in Your Growth
Casual work environment with fun, hard-working, and open-minded coworkers
Competitive salary with opportunity for performance-based annual bonus
Opportunities to up-skill, explore new responsibilities, and network across departments
Defined and undefined career pathways allowing you to grow your own way
Work/Life Balance
Flexible PTO and work schedule to ensure our team balances work and life
Hybrid work schedule
Maternity and paternity leave
Taking Care of Your Future
Medical, Dental, and Vision, Disability insurance
Free access to Spring Health, a comprehensive mental health solution
401(k) with company match and a broad selection of investments
Voluntary insurance: short-term disability, long-term disability, and life insurance
FSA and transit benefits for employees that contribute pre-tax dollars
Other Fun Stuff
Free on-site gym and parking
Weekly catered lunches in the office, plus monthly happy hours
Stocked kitchen with snacks and drinks
GeoWealth was recognized as a "Best Place to Work" by Purpose Jobs in 2025, 2024, and 2022
GeoWealth was recognized as a "Best Place to Work" by Built In in 2024, 2023, and 2022
SALARY RANGE:
Starting at $170,000-$220,000 + Benefits + Opportunity for Performance Bonus
This is an estimated range based on circumstances at the time of posting; it may change based on a combination of factors, including but not limited to skills, experience, education, market factors, geographical location, budget, and demand.
Senior Data Engineer
Data engineer job in Chicago, IL
This role requires visa-independent candidates.
Note: OPT, CPT, and H1B visa holders cannot be considered at this time.
Design, develop, and maintain scalable ETL pipelines using AWS Glue (see the sketch after this list)
Collaborate with data engineers and analysts to understand data requirements
Build and manage data extraction, transformation, and loading processes
Optimize and troubleshoot existing Glue jobs and workflows
Ensure data quality, integrity, and security throughout the ETL process
Integrate AWS Glue with other AWS services like S3, Lambda, Redshift, and Step Functions
Maintain documentation of data workflows and processes
Stay updated with the latest AWS tools and best practices
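As referenced in the responsibilities above, a skeleton AWS Glue PySpark job; the catalog database, table, column mappings, and S3 path are hypothetical placeholders:

```python
# Skeleton AWS Glue PySpark job (catalog names, mappings, and S3 path are placeholders).
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, remap columns, write Parquet to S3.
src = glue_context.create_dynamic_frame.from_catalog(database="raw", table_name="orders")
mapped = ApplyMapping.apply(
    frame=src,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amt", "double", "amount", "double")],
)
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```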
Required Skills
Strong hands-on experience with AWS Glue, PySpark, and Python
Proficiency in SQL and working with structured/unstructured data (JSON, CSV, Parquet)
Experience with data warehousing concepts and tools
Familiarity with CI/CD pipelines, Terraform, and scripting (PowerShell, Bash)
Solid understanding of data modeling, data integration, and data management
Exposure to AWS Batch, Step Functions, and Data Catalogs
Big Data Consultant
Data engineer job in Chicago, IL
Job Title: Big Data Engineer
Employment Type: W2 Contract
Detailed Job Description:
We are seeking a skilled and experienced Big Data Platform Engineer with 7+ years of experience and a strong background in both development and administration of big data ecosystems. The ideal candidate will be responsible for designing, building, maintaining, and optimizing scalable data platforms that support advanced analytics, machine learning, and real-time data processing.
Key Responsibilities:
Platform Engineering & Administration:
• Install, configure, and manage big data tools such as Hadoop, Spark, Kafka, Hive, HBase, and others.
• Monitor cluster performance, troubleshoot issues, and ensure high availability and reliability.
• Implement security policies, access controls, and data governance practices.
• Manage upgrades, patches, and capacity planning for big data infrastructure.
Development & Data Engineering:
• Design and develop scalable data pipelines using tools like Apache Spark, Flink, NiFi, or Airflow (see the sketch after this list).
• Build ETL/ELT workflows to ingest, transform, and load data from various sources.
• Optimize data storage and retrieval for performance and cost-efficiency.
• Collaborate with data scientists and analysts to support model deployment and data exploration.
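For illustration, a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and lands Parquet; the broker, topic, schema, and paths are hypothetical, and the spark-sql-kafka connector package must be on the classpath:

```python
# Sketch: consume a Kafka topic with Spark Structured Streaming (names/paths assumed).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()
schema = StructType([StructField("event_id", StringType()),
                     StructField("value", DoubleType())])

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers bytes; cast to string and parse the assumed JSON payload.
parsed = stream.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

query = (parsed.writeStream.format("parquet")
         .option("path", "/data/events")
         .option("checkpointLocation", "/chk/events")   # required for fault tolerance
         .start())
query.awaitTermination()
```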
Data Engineer
Data engineer job in Chicago, IL
The Data Engineer will design, build, and optimize the data pipelines and models that support the firm's evolving research, analytics, and systematic portfolio construction environment. This role is central to enabling data-driven investment processes, including quantitative research, AI/ML capabilities, and front-office automation.
Candidates must have deep expertise with Snowflake, strong SQL skills, and experience integrating diverse datasets used across investment organizations. The role is highly collaborative and requires comfort working in an iterative, fast-moving environment where data needs evolve rapidly based on stakeholder input.
Responsibilities
Design, build, and enhance ETL/ELT pipelines in Snowflake, ensuring high performance, reliability, and scalability (see the sketch after this list).
Integrate internal and external datasets, including pricing, research content, economic releases, market data, and security reference data.
Support real-time or near-real-time data flows where needed (e.g., pricing, indicative quotes, market-sensitive inputs).
Collaborate closely with Product Leads, Quant Developers, and UI/UX teams to ensure data structures meet the requirements of research workflows, analytical models, and user-facing applications.
Partner with front-office stakeholders to rapidly iterate on evolving analytical and data needs.
Implement data validation, monitoring, and quality frameworks to ensure accuracy and reliability across critical datasets.
Translate prototype pipelines into production-ready workflows with appropriate documentation, standards, and controls.
Contribute to data modeling standards, metadata frameworks, and data governance practices across the platform.
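The requirements below call out Snowflake tasks and streams; a minimal sketch of an incremental stream-plus-task load driven through the Snowflake Python connector. Connection details, table, stream, and task names are hypothetical:

```python
# Sketch: incremental ELT in Snowflake using a stream + task (all names hypothetical).
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...",
                                   warehouse="ETL_WH", database="RESEARCH", schema="STAGING")
cur = conn.cursor()

# A stream captures new rows landing in the raw pricing table.
cur.execute("CREATE STREAM IF NOT EXISTS pricing_stream ON TABLE raw_pricing")

# A scheduled task merges stream contents into the curated table when data is present.
cur.execute("""
CREATE TASK IF NOT EXISTS merge_pricing
  WAREHOUSE = ETL_WH
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('PRICING_STREAM')
AS
  MERGE INTO curated_pricing t
  USING pricing_stream s
    ON t.security_id = s.security_id AND t.price_date = s.price_date
  WHEN MATCHED THEN UPDATE SET t.price = s.price
  WHEN NOT MATCHED THEN INSERT (security_id, price_date, price)
       VALUES (s.security_id, s.price_date, s.price)
""")
cur.execute("ALTER TASK merge_pricing RESUME")   # tasks are created suspended
```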
Requirements
10+ years of data engineering experience within investment management, financial technology, or similar data-intensive environments.
Expert-level SQL, including complex queries, schema design, and performance optimization.
Deep hands-on experience with Snowflake, including advanced features such as tasks, streams, performance tuning, and secure data sharing.
Strong Python capabilities for ETL/ELT development, data processing, and workflow automation.
Experience integrating APIs and working with structured, semi-structured, and unstructured datasets.
Familiarity with NLP or AI/ML-oriented datasets (e.g., textual research content, PDFs) is a plus.
Experience with Domino or willingness to work within a Domino-based model environment.
Working knowledge of investment data structures (holdings, benchmarks, pricing, exposures) is highly preferred.
Ability to thrive in a rapid prototyping environment with evolving requirements and close partnership with front-office teams.
Data Engineer
Data engineer job in Chicago, IL
Job Title: Data Engineer - Workflow Automation
Employment Type: Contract to Hire or Full-Time
Department: Project Scion / Information Management Solutions
Key Responsibilities:
Design, build, and manage workflows using Automic or similar tools such as Autosys, Apache Airflow, or Cybermation (see the Airflow sketch after this list).
Orchestrate workflows across multi-cloud ecosystems (AWS, Azure, Snowflake, Databricks, Redshift).
Monitor and troubleshoot workflow execution, ensuring high availability, reliability, and performance.
Administer and maintain workflow platforms.
Collaborate with architecture and infrastructure teams to align workflows with cloud strategies.
Support migrations, upgrades, and workflow optimization efforts.
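A minimal sketch of a scheduled workflow in Apache Airflow, one of the orchestration tools named above; the DAG id and task logic are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`):

```python
# Sketch: a two-step scheduled workflow in Apache Airflow (task logic is hypothetical).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source system")

def load():
    print("write to warehouse")

with DAG(dag_id="nightly_workflow",
         start_date=datetime(2024, 1, 1),
         schedule="0 2 * * *",      # run daily at 02:00
         catchup=False) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2                        # load runs only after extract succeeds
```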
Required Skills:
5+ years of IT experience managing production-grade systems.
Hands-on experience with Automic or similar enterprise workflow automation tools.
Strong analytical and problem-solving skills.
Good communication and documentation skills.
Familiarity with cloud platforms and technologies (e.g., AWS, Azure, Snowflake, Databricks).
Scripting proficiency (e.g., Shell, Python).
Ability to manage workflows across hybrid environments and optimize performance.
Experience managing production operations & support activities
Preferred Skills:
Experience with CI/CD pipeline integration.
Knowledge of cloud-native orchestration tools
Exposure to monitoring and alerting systems.
Snowflake Data Engineer
Data engineer job in Chicago, IL
Join a dynamic team focused on building innovative data solutions that drive strategic insights for the business. This is an opportunity to leverage your expertise in Snowflake, ETL processes, and data integration.
Key Responsibilities
Develop Snowflake-based data models to support enterprise-level reporting.
Design and implement batch ETL pipelines for efficient data ingestion from legacy systems (see the sketch after this list).
Collaborate with stakeholders to gather and understand data requirements.
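A minimal sketch of one such batch load from a legacy extract into Snowflake, using the Python connector's write_pandas helper; connection details, the file, and the table name are assumptions, and auto_create_table requires a reasonably recent connector version:

```python
# Sketch: batch ingestion from a legacy CSV extract into Snowflake (names assumed).
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

conn = snowflake.connector.connect(account="...", user="...", password="...",
                                   database="ANALYTICS", schema="STAGING",
                                   warehouse="LOAD_WH")

df = pd.read_csv("legacy_export.csv")            # nightly extract from the legacy system
df.columns = [c.upper() for c in df.columns]     # unquoted Snowflake identifiers are upper case

success, _, nrows, _ = write_pandas(conn, df, table_name="STG_ORDERS",
                                    auto_create_table=True)
print(f"loaded={success} rows={nrows}")
```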
Required Qualifications
Hands-on experience with Snowflake for data modeling and schema design.
Proven track record in developing ETL pipelines and understanding transformation logic.
Solid SQL skills to perform complex data transformations and optimization.
If you are passionate about building cutting-edge data solutions and want to make a significant impact, we would love to see your application!
#11290
Data Engineer
Data engineer job in Itasca, IL
Primary Location: Itasca, IL (Hybrid in Chicago's Northwest Suburbs)
2 Days In-Office, 3 Days WFH
TYPE: Direct Hire / Permanent Role
Must be a US citizen or green card holder.
The Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and integrations that support data analytics and business intelligence across the organization. This role is essential to ensuring high-quality data delivery, optimizing performance, and enabling effective decision-making through reliable data solutions.
What You Bring to the Role (Ideal Experience)
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
5+ years of experience as a Data Engineer.
3+ years of experience with the following:
Building and supporting data lakehouse architectures using Delta Lake and change data feeds (see the sketch after this list).
Working with PySpark and Python, with strong Object-Oriented Programming (OOP) experience to extend existing frameworks.
Designing data warehouse table architecture such as star schema or Kimball method.
Writing and maintaining versioned Python wheel packages to manage dependencies and distribute code.
Creating and managing CI/CD pipelines, especially using Azure DevOps for Microsoft Fabric-related assets.
Experience establishing scalable and maintainable data integrations and pipelines in Databricks environments.
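A minimal sketch of reading a Delta Lake change data feed in PySpark, as referenced above; the table names and starting version are hypothetical, and CDF must already be enabled on the source table:

```python
# Sketch: incremental processing from a Delta change data feed (names/versions assumed).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

changes = (spark.read.format("delta")
           .option("readChangeFeed", "true")
           .option("startingVersion", 101)       # first commit not yet processed
           .table("lakehouse.silver.customers"))

# _change_type distinguishes insert / update_preimage / update_postimage / delete rows.
upserts = changes.filter(changes["_change_type"].isin("insert", "update_postimage"))
upserts.write.mode("append").saveAsTable("lakehouse.gold.customer_updates")
```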
Nice to Haves
Hands-on experience implementing data solutions using Microsoft Fabric.
Experience with machine learning/ML and data science tools.
Knowledge of data governance and security best practices.
Experience in a larger IT environment with 3,000+ users and multiple domains.
Current industry certifications from Microsoft cloud/data platforms or equivalent certifications. One or more of the following is preferred:
Microsoft Certified: Fabric Data Engineer Associate
Microsoft Certified: Azure Data Scientist Associate
Microsoft Certified: Azure Data Fundamentals
Google Professional Data Engineer
Certified Data Management Professional (CDMP)
IBM Certified Data Architect - Big Data
What You'll Do (Skills Used in this Position)
Design and develop scalable data pipelines to collect, process, and store large volumes of structured and unstructured data.
Extend and enhance existing OOP-based frameworks developed in Python and PySpark.
Partner with data scientists and analysts to define requirements and design robust data analytics solutions.
Ensure data quality and integrity through data cleansing, validation, and automated testing procedures.
Develop and maintain technical documentation, including requirements, design specifications, and test plans.
Implement and manage data integrations from multiple internal and external sources.
Optimize data workflows to improve performance, reliability, and reduce cloud consumption.
Monitor, troubleshoot, and resolve data pipeline issues to ensure consistent data delivery.
Establish and manage CI/CD pipelines and release processes, particularly using Azure DevOps for Microsoft Fabric.
Provide technical leadership and coordination for global development and support teams.
Participate in creating a safe and healthy workplace by adhering to organizational safety protocols.
Support additional projects and initiatives as assigned by management.
Senior Data Architect
Data engineer job in Oak Brook, IL
We are seeking a highly skilled and strategic Senior Data Solution Architect to join our IT Enterprise Data Warehouse team. This role is responsible for designing and implementing scalable, secure, and high-performing data solutions that bridge business needs with technical execution. Design solutions for provisioning data to our cloud data platform using ingestion, transformation, and semantic layer techniques. Additionally, this position provides technical thought leadership and guidance to ensure that data platforms and pipelines effectively support ODS, analytics, reporting, and AI initiatives across the organization.
Key Responsibilities:
Architecture & Design:
Design end-to-end data architecture solutions including operational data stores, data warehouses, and real-time data pipelines.
Define standards and best practices for data modeling, integration, and governance.
Evaluate and recommend tools, platforms, and frameworks for data management and analytics.
Collaboration & Leadership:
Partner with business stakeholders, data engineers, data analysts, and other IT teams to translate business requirements into technical solutions.
Lead architecture reviews and provide technical guidance to development teams.
Advocate for data quality, security, and compliance across all data initiatives.
Implementation & Optimization:
Oversee the implementation of data solutions, ensuring scalability, performance, and reliability.
Optimize data workflows and storage strategies for cost and performance efficiency.
Monitor and troubleshoot data systems, ensuring high availability and integrity.
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related field.
7+ years of experience in data architecture, data engineering, or related roles.
Strong expertise in cloud platforms (e.g., Azure, AWS, GCP) and modern data stack tools (e.g., Snowflake, Databricks).
Proficiency in SQL, Python, and data modeling techniques (e.g., Data Vault 2.0).
Experience with ETL/ELT tools, APIs, and real-time streaming technologies (e.g., dbt, Coalesce, SSIS, Datastage, Kafka, Spark).
Familiarity with data governance, security, and compliance frameworks
Preferred Qualifications:
Certifications in cloud architecture or data engineering (e.g., SnowPro Advanced: Architect).
Strong communication and stakeholder management skills.
Why Join Us?
Work on cutting-edge data platforms and technologies.
Collaborate with cross-functional teams to drive data-driven decision-making.
Be part of a culture that values innovation, continuous learning, and impact.
** This is a full-time, W2 position with Hub Group - We are NOT able to provide sponsorship at this time **
Salary: $135,000 - $175,000/year base salary
+ bonus eligibility
This is an estimated range based on circumstances at the time of posting; it may change based on a combination of factors, including but not limited to skills, experience, education, market factors, geographical location, budget, and demand.
Benefits
We offer a comprehensive benefits plan including:
Medical
Dental
Vision
Flexible Spending Account (FSA)
Employee Assistance Program (EAP)
Life & AD&D Insurance
Disability
Paid Time Off
Paid Holidays
BEWARE OF FRAUD!
Hub Group has become aware of online recruiting-related scams in which individuals who are not affiliated with or authorized by Hub Group are using Hub Group's name in fraudulent emails, job postings, or social media messages. In light of these scams, please bear the following in mind:
Hub Group will never solicit money or credit card information in connection with a Hub Group job application.
Hub Group does not communicate with candidates via online chatrooms such as Signal or Discord using email accounts such as Gmail or Hotmail.
Hub Group job postings are posted on our career site: ********************************
About Us
Hub Group is the premier, customer-centric supply chain company offering comprehensive transportation and logistics management solutions. Keeping our customers' needs in focus, Hub Group designs, continually optimizes and applies industry-leading technology to our customers' supply chains for better service, greater efficiency and total visibility. As an award-winning, publicly traded company (NASDAQ: HUBG) with $4 billion in revenue, our 6,000 employees and drivers across the globe are always in pursuit of "The Way Ahead" - a commitment to service, integrity and innovation. We believe the way you do something is just as important as what you do. For more information, visit ****************
Sr. Data Engineer - PERM - MUST BE LOCAL
Data engineer job in Naperville, IL
Resource 1 is in need of a Sr. Data Engineer for a full-time/permanent position with our client in Naperville, IL. Candidates must be local to Illinois, as a future hybrid onsite schedule in Naperville is expected. Our client is an employee-owned company with excellent benefits, growth opportunities, and a profit-sharing bonus.
This position is focused on building modern data pipelines, integrations and back-end data solutions. Selected individual will work within cross-functional Agile teams, collaborating with product owners, business analysts and other engineers to design and deliver data solutions that power business insights and AI products.
Responsibilities:
Design and develop scalable data pipelines for ingestion, transformation and integration using AWS services (see the sketch after this list).
Pull data from PostgreSQL and SQL Server to migrate to AWS.
Create and modify jobs in AWS and modify logic in SQL Server.
Create SQL queries, stored procedures and functions in PostgreSQL and RedShift.
Provide input on data modeling and schema design as needed.
Manage infrastructure through infrastructure-as-code templates (Serverless Framework), supporting new data products and services in AWS.
Support inbound/outbound data flows, including APIs, S3 replication and secured data.
Assist with data visualization/reporting as needed.
Follow an Agile development methodology, with regular workshops and standup meetings, working in two-week sprints.
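A minimal sketch of one such AWS-side load step, copying an S3 extract into Redshift via psycopg2; the cluster endpoint, IAM role, bucket, and table names are hypothetical:

```python
# Sketch: load an S3 extract into Redshift with COPY (endpoint/role/paths hypothetical).
import psycopg2

conn = psycopg2.connect(host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
                        port=5439, dbname="analytics", user="etl_user", password="...")

# The connection context manager commits the transaction on clean exit.
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY staging.orders
        FROM 's3://my-bucket/exports/orders/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS PARQUET
    """)
```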
Qualifications:
5+ years of data engineering experience.
Experience with AWS and its associated array of offerings (Glue, Redshift, Athena, S3, Lambda, Spectrum).
Strong experience with SQL, Python and PySpark.
A background in supply chain, logistics or distribution would be a plus.
Experience with Power BI is a plus.
Data Architect - Pharma
Data engineer job in Chicago, IL
MathCo
Role - Data/AI Engineering Manager
Onsite - Chicago - 4 days in office (Mandatory)
Industry - Pharma (Mandatory)
As platform architect/owner, you will:
Lead the end-to-end architecture, lifecycle, and governance of the AI/Analytics platform, defining standards, reusable components, and integration patterns.
Partner with AI/Data architects to enable scalable model deployment and enhance agentic orchestration.
Translate business needs into platform features, manage onboarding, documentation, and cross-functional collaboration for platform adoption.
Oversee infrastructure-as-code, CI/CD, observability, and containerized environments to ensure reliability and scalability.
Evaluate complex technical proposals and develop actionable platform roadmaps and architecture recommendations
Stay current on key AI platform developments and assess their impact on architecture and client strategy
Coach others, recognize their strengths, and encourage them to take ownership of their personal development
Skills Required
Experience in designing, architecting, or managing distributed data and AI platforms in cloud environments (AWS, Azure, or GCP)
Proven ability to carry out complex Proof of Concept (POC), pilot projects, and limited production rollouts for AI use-cases, focusing on developing new or improved techniques and procedures.
Strong skills in pipeline/workflow optimization and data processing frameworks to evaluate architectural choices
Years of Experience
Minimum of 8 years of relevant experience, preferably with a consulting background and experience with Pharma clients.
Distinguished Data Engineer - Card Data
Data engineer job in Chicago, IL
Distinguished Data Engineers are individual contributors who strive to be diverse in thought so we can fully visualize the problem space. At Capital One, we believe diversity of thought strengthens our ability to influence, collaborate and provide the most innovative solutions across organizational boundaries. Distinguished Engineers will significantly impact our trajectory and devise clear roadmaps to deliver next generation technology solutions.
About the Team: Capital One is seeking a Distinguished Data Engineer to work in our Credit Card Technology Data Engineering Team and build the future of financial services. We are a fast-paced, mission-driven group responsible for managing and leveraging petabytes of sensitive, real-time and batch data that powers everything from fraud detection models and personalized reward systems to regulatory compliance reporting. As a leader in Data Engineering, you won't just move data; you'll architect high-availability systems that directly influence millions of customer experiences and secure billions in transactions daily. You'll own critical data domains end-to-end, working cross-functionally with ML Scientists, Product Managers, and Business Analysts to solve complex, high-stakes problems with cutting-edge cloud technologies (like Snowflake, Kafka, and AWS). If you thrive on technical challenges, demand data integrity, and want your work to have a clear, measurable impact on the bank's core profitability and security, this is your team.
This leader must have the ability to attract and recruit the industry's best talent, and simultaneously have the technical chops to ensure that we build compelling, customer-oriented solutions in an iterative methodology. Success in the role requires an innovative mind, a proven track record of delivering next generation software and data products, rigorous analytical skills, and a passion for delivering customer value through automation, machine learning and predictive analytics.
Our Distinguished Engineers Are:
Deep technical experts and thought leaders that help accelerate adoption of the very best engineering practices, while maintaining knowledge on industry innovations, trends and practices
Visionaries, collaborating on Capital One's toughest issues, to deliver on business needs that directly impact the lives of our customers and associates
Role models and mentors, helping to coach and strengthen the technical expertise and know-how of our engineering and product community
Evangelists, both internally and externally, helping to elevate the Distinguished Engineering community and establish themselves as a go-to resource on given technologies and technology-enabled capabilities
Responsibilities:
Build awareness, increase knowledge and drive adoption of modern technologies, sharing consumer and engineering benefits to gain buy-in
Strike the right balance between lending expertise and providing an inclusive environment where others' ideas can be heard and championed; leverage expertise to grow skills in the broader Capital One team
Promote a culture of engineering excellence, using opportunities to reuse and innersource solutions where possible
Effectively communicate with and influence key stakeholders across the enterprise, at all levels of the organization
Operate as a trusted advisor for a specific technology, platform or capability domain, helping to shape use cases and implementation in a unified manner
Lead the way in creating next-generation talent for Tech, mentoring internal talent and actively recruiting external talent to bolster Capital One's Tech talent
Basic Qualifications:
Bachelor's Degree
At least 7 years of experience in data engineering
At least 3 years of experience in data architecture
At least 2 years of experience building applications in AWS
Preferred Qualifications:
Master's Degree
9+ years of experience in data engineering
3+ years of data modeling experience
2+ years of experience with ontology standards for defining a domain
2+ years of experience using Python, SQL or Scala
1+ year of experience deploying machine learning models
3+ years of experience implementing big data processing solutions on AWS
Capital One will consider sponsoring a new qualified applicant for employment authorization for this position
The minimum and maximum full-time annual salaries for this role are listed below, by location. Please note that this salary information is solely for candidates hired to perform work within one of these locations, and refers to the amount Capital One is willing to pay at the time of this posting. Salaries for part-time roles will be prorated based upon the agreed upon number of hours to be regularly worked.
Chicago, IL: $239,900 - $273,800 for Distinguished Data Engineer
McLean, VA: $263,900 - $301,200 for Distinguished Data Engineer
New York, NY: $287,800 - $328,500 for Distinguished Data Engineer
Richmond, VA: $239,900 - $273,800 for Distinguished Data Engineer
San Francisco, CA: $287,800 - $328,500 for Distinguished Data Engineer
Candidates hired to work in other locations will be subject to the pay range associated with that location, and the actual annualized salary amount offered to any candidate at the time of hire will be reflected solely in the candidate's offer letter.
This role is also eligible to earn performance based incentive compensation, which may include cash bonus(es) and/or long term incentives (LTI). Incentives could be discretionary or non discretionary depending on the plan.
Capital One offers a comprehensive, competitive, and inclusive set of health, financial and other benefits that support your total well-being. Learn more at the Capital One Careers website . Eligibility varies based on full or part-time status, exempt or non-exempt status, and management level.
This role is expected to accept applications for a minimum of 5 business days. No agencies, please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections ; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1- or via email at . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.
For technical support or questions about Capital One's recruiting process, please send an email to
Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site.
Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Senior Back End Developer - Distributed Systems (C# or Golang)
Data engineer job in Chicago, IL
Our client, a fast-growing organization developing secure, scalable technologies for next-generation AI applications, is seeking a Backend Engineer to join their core platform team.
In this role, you'll help build and refine the foundational services that power authentication, observability, data flows, and high-availability systems across a distributed ecosystem. This is an opportunity to work on complex backend challenges while shaping the infrastructure that supports mission-critical applications.
What You'll Do
Develop, enhance, and support backend services that form the foundation of the platform.
Build and maintain core authentication and authorization capabilities.
Apply principles of Domain-Driven Design to guide how services and components evolve over time.
Architect, extend, and support event-sourced systems to ensure durable, consistent operations at scale.
Participate in API design and integration efforts across internal and external stakeholders.
Implement and support messaging frameworks (e.g., NATS) to enable reliable service-to-service communication (see the sketch after this list).
Maintain and improve observability tooling, including metrics, tracing, and logging, to ensure healthy system performance.
Work closely with infrastructure, DevOps, and engineering teams to ensure robust, secure, and maintainable operations.
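For flavor, a minimal publish/subscribe sketch against NATS, shown here in Python with the nats-py client for brevity even though this team primarily works in C#, Go, or Rust; the subject name and payload are hypothetical:

```python
# Sketch: NATS publish/subscribe with nats-py (server URL, subject, payload assumed).
import asyncio
import nats

async def main():
    nc = await nats.connect("nats://localhost:4222")

    async def on_order(msg):
        print(f"received {msg.subject}: {msg.data.decode()}")

    await nc.subscribe("orders.created", cb=on_order)
    await nc.publish("orders.created", b'{"order_id": 42}')
    await nc.flush()      # ensure the publish reaches the server
    await nc.drain()      # process pending messages, then close cleanly

asyncio.run(main())
```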
What You Bring
3-6+ years of experience as a backend engineer.
Strong knowledge of distributed systems and microservices.
Proficiency in at least one modern backend programming language (C#, Go, Rust, etc.).
Practical experience with IAM concepts and authentication/authorization frameworks.
Exposure to event-sourcing patterns, DDD, and common messaging systems (e.g., NATS, Kafka, SNS, RabbitMQ).
Familiarity with Redis or similar in-memory caching technologies.
Experience working with observability tools such as Prometheus, Jaeger, ELK, or Application Insights.
Understanding of cloud-native environments and deployment workflows (AWS, Azure, or GCP).
Why This Role Is Compelling
You'll contribute directly to a foundational platform used across an entire organization, impacting performance, reliability, and security at every layer. If you enjoy solving distributed-system challenges and working on complex, high-scale backend services, this is a strong match.
#BackendEngineering #DistributedSystems #PlatformEngineering #CloudNative #SoftwareJobs
Lead DevOps Engineer
Data engineer job in Chicago, IL
Qorali is seeking a Lead DevOps Engineer to drive the evolution of our cloud and automation strategy. In this role, you'll own the design and delivery of enterprise-scale cloud infrastructure, lead mission-critical DevOps initiatives, and mentor engineers across the organization.
We're looking for a hands-on technical leader with deep expertise in AWS, Kubernetes, CI/CD pipelines, Terraform, and Kafka: someone who thrives on solving complex challenges and setting best practices for scalable, secure, and resilient systems.
Key Responsibilities
Architect and implement highly available, automated cloud solutions on AWS.
Build and optimize CI/CD pipelines to accelerate software delivery.
Design, deploy, and manage containerized workloads with Kubernetes.
Lead Kafka platform operations to support real-time, high-throughput applications.
Champion infrastructure-as-code with Terraform, driving automation and repeatability.
Provide technical leadership, mentoring, and serve as escalation point for critical issues.
Collaborate with development, security, and operations teams to deliver end-to-end DevOps solutions.
Qualifications
7+ years of experience in DevOps, cloud engineering, or infrastructure automation.
Proven expertise in AWS, Kubernetes, Terraform, CI/CD (Jenkins/GitHub Actions), Python and Kafka.
Experience with configuration management (Ansible, Puppet, or Chef).
Strong understanding of cloud security, compliance frameworks (CIS, NIST), and high-availability design.
Demonstrated leadership experience, guiding technical teams and influencing DevOps best practices.
Compensation & Benefits
$150-180k base salary + 15% bonus
22+ days PTO
Health, vision, dental & life insurance
6% 401k matching
Location: Hybrid, Chicago or Dallas
Director of Automation and SDET
Data engineer job in Chicago, IL
***This position is bonus eligible***
Prestigious Financial Institution is currently seeking a Director of Automation and SDET with AI/ML experience. The candidate will be responsible for defining, driving, and scaling enterprise-wide test automation and quality engineering practices. This role will architect and implement advanced automation solutions across applications, data, and platforms, enable adoption of best practices, and establish the governance, metrics, and tools. The role combines technical expertise with strong leadership and stakeholder collaboration skills to deliver next-generation automation infused with AI capabilities.
Responsibilities:
Define and execute the enterprise automation strategy aligned with business and technology modernization goals.
Drive adoption of automation in all phases of testing, including automated regression and smoke tests, to improve quality and accelerate testing (see the sketch after this list).
Implement automated quality gates, pre/post deployment checks and shift-left testing.
Architect scalable, reusable automation frameworks covering UI, API, microservices, data pipelines, Kafka/event driven systems, batch jobs, reports and databases.
Define standards for BDD, contract testing, CI/CD integration, synthetic data generation and environment-agnostic test automation.
Establish tagging and traceability across automation framework, Jira, Confluence, Test management tools, CI/CD pipelines and Splunk.
Introduce and scale synthetic test data management, environment/service virtualization for complex integration testing.
Envision and implement AI/ML and Generative AI infused solutions for test case generation, test data generation, automation script generation and quality insights.
Build quality engineering and automation center of excellence to drive training, reusable asset libraries and knowledge management artifacts.
Partner with development, product, DevOps and Platform Engineering leaders to embed automation into all stages of SDLC.
Define, monitor and report KPIs/OKRs for automation outcomes to executives and product owners.
Drive compliance with industry standards, regulatory requirements, and audit readiness across automation and QE practices.
Manages a team of people managers, individual contributors, and consultants/contractors
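A minimal sketch of the kind of automated API smoke test described above, using pytest and requests; the base URL, endpoints, response fields, and the `smoke` marker are all hypothetical:

```python
# Sketch: automated API smoke tests with pytest + requests (endpoints/fields assumed).
import pytest
import requests

BASE_URL = "https://api.example.internal"   # hypothetical service under test

@pytest.mark.smoke
def test_health_endpoint_is_up():
    resp = requests.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"

@pytest.mark.smoke
def test_create_and_fetch_round_trip():
    created = requests.post(f"{BASE_URL}/orders", json={"sku": "ABC", "qty": 1}, timeout=5)
    assert created.status_code == 201
    fetched = requests.get(f"{BASE_URL}/orders/{created.json()['id']}", timeout=5)
    assert fetched.status_code == 200
```

Tagging tests with markers like `smoke` lets a CI quality gate run just that subset (`pytest -m smoke`) as a fast pre/post-deployment check.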
Qualifications:
Minimum fifteen (15) years of IT experience, with ten (10)+ years in test automation.
Proven track record of leading enterprise-scale automation initiatives in complex, distributed environments (microservices, cloud, batch applications, data, MQ, Kafka event driven systems).
Hands-on experience with service virtualization, synthetic test data management.
Strong hands-on expertise in testing and test automation tools and frameworks including Jira, BDD, Selenium, Cucumber, REST-assured, JMeter, Playwright.
Strong programming experience in Java and Python.
Deep understanding of DevOps, CI/CD pipelines (Jenkins, Harness, GitHub), cloud platforms (AWS, Azure) and containerized environments (Kubernetes, Docker).
Experience with Kafka/event-driven testing, large data set validations
Experience with Agile development processes for enterprise software solutions
Strong background in metrics-driven quality reporting and risk-based decision making.
Strong organizational leadership skills
Ability to manage multiple, competing priorities and make decisions quickly
Knowledgeable about industry trends, best practices, and change management
Strong communication skills with the ability to communicate and interact with a variety of internal/external customers, coworkers, and Executive Management
Strong work ethic, hands-on, detail oriented with a customer service mentality
Team player, self-driven, motivated, and able to work under pressure
Results-oriented and demonstrated record of developing initiatives that impact productivity
Technical Skills:
Proficiency with modern quality engineering tools including Jira, Jenkins, automation frameworks, test management tools.
Software QA methodologies (requirements analysis, test planning, functional testing, usability testing, performance testing, etc.)
Familiarity with AI/ML/GenAI Solutions in QE.
Utilizing best practices in software engineering, software test automation, test management tools, and defect tracking software
Past/current experience of 3+ years working on a large-scale cloud-native project. Experience with cloud technologies and migrations using a public cloud vendor, preferably using cloud foundational services like AWS's VPCs, Security Groups, and EC2.
Education and/or Experience:
BS degree in Computer Science or Information Systems Management or a similar technical field
10+ years of experience in Quality Assurance space preferably on complex systems and large programs.
DevOps Cloud Engineer
Data engineer job in Chicago, IL
Duties: You will be responsible for:
(1) Designing, deploying, securing, and managing enterprise cloud and hybrid infrastructure across compute, storage, database, networking, and security domains using services within Amazon Web Services (including EC2, Lambda, S3, RDS, VPC, IAM, and related technologies)
(2) Implementing and maintaining Infrastructure as Code (IaC) using tools such as GitHub, Pulumi, or AWS CloudFormation to automate provisioning, configuration, and lifecycle management (see the sketch after this list)
(3) Continuously evaluating and optimizing AWS environments to ensure performance, availability, scalability, cost efficiency, and operational stability
(4) Designing, building, and maintaining CI/CD pipelines using GitHub Actions, AWS CodePipeline, or Jenkins, including integration of automated testing, security scanning, and compliance checks (e.g., Orca Security or similar tools)
(5) Leveraging automation and AI-based tools to strengthen the efficiency and intelligence of CI/CD and DevOps processes
(6) Implementing security best practices across identity and access management, network architecture, encryption, monitoring, logging, and incident response in coordination with the Information Security team
(7) Supporting vulnerability management, incident response, remediation, and follow-up to ensure secure and compliant cloud operations
(8) Setting up and maintaining monitoring, logging, alerting, and SIEM integrations using platforms such as AWS CloudWatch, LogicMonitor, Splunk, or Orca Security
(9) Troubleshooting infrastructure, networking, and deployment issues across hybrid environments and participating in weekly on-call rotation for production support
(10) Managing Windows and Linux patching, BC/DR capabilities, and policy governance using AWS Systems Manager, Cloud Custodian, and related tooling
(11) Collaborating with developers, system administrators, engineers, and business stakeholders to design and deliver reliable and secure cloud solutions
(12) Evaluating, recommending, and implementing new tools, frameworks, and automation opportunities to enhance performance, availability, security, and operational maturity
(13) Documenting system standards, architecture diagrams, operating procedures, and best practices to ensure alignment, maintainability, and operational excellence
(14) Contributing to a culture of collaboration, agility, innovation, continuous improvement, and cross-team partnership
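The duties above call out Pulumi with Python for IaC; a minimal sketch under that assumption, where the resource names and policy are hypothetical rather than this employer's actual stack:

```python
# Sketch: Infrastructure as Code with Pulumi's Python SDK (names/policy hypothetical).
import json
import pulumi
import pulumi_aws as aws

# Versioned S3 bucket for application logs.
logs = aws.s3.Bucket("app-logs",
                     versioning=aws.s3.BucketVersioningArgs(enabled=True))

# Least-privilege IAM policy scoped to that bucket, built from the bucket's ARN output.
policy = aws.iam.Policy("logs-writer",
    policy=logs.arn.apply(lambda arn: json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow",
                       "Action": "s3:PutObject",
                       "Resource": f"{arn}/*"}],
    })))

pulumi.export("bucket_name", logs.id)   # surfaced by `pulumi up` / `pulumi stack output`
```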
Required:
****Critical Note: This is NOT a traditional DevOps Cloud Engineer role, and traditional DevOps Cloud Engineers should not invest time in applying. The requirements for consideration are shared below this critical note, but to provide important and essential insight before you apply:
All applicants must have hands-on experience, at some point in their professional careers, with foundational or traditional (non-cloud) IT infrastructure, e.g., actual non-cloud system administration, network engineering/administration, or firewalls/security. Candidates with a background building, administering, engineering, supporting, or operating on-premises or hybrid IT infrastructures who grew into the DevOps space are highly preferred over pure cloud-only candidates.
Required:
A completed and verifiable Bachelor's degree in Computer Science, Information Systems, or a related STEM field is required.
Must have 3 or more years of professional DevOps and Cloud Engineering experience, with prior experience as a Systems Engineer, Systems Administrator, or Network Engineer and later experience in DevOps practices, cloud automation, and modern infrastructure. Both components of this requirement are an absolute must-have.
Must have strong, hands-on expertise with AWS compute, storage, networking, database, serverless, and security services, including EC2, Lambda, S3, RDS, CloudFormation, VPC, IAM, and container services such as ECS/EKS.
Must have experience building and managing Infrastructure as Code using Pulumi, Terraform, AWS CloudFormation, and scripting languages such as Python, Bash, or Node.js.
Must have hands-on experience administering and developing CI/CD pipelines using GitHub Actions, AWS CodeCommit/CodePipeline, or equivalent automation platforms.
Must have working knowledge of networking technologies including routing, switching, VPNs, firewalls, and network security principles, along with experience managing hybrid connectivity.
Must have familiarity with IAM, SIEM, SASE, and the integration of security within CI/CD pipelines.
Must have experience with monitoring and observability tools such as AWS CloudWatch, LogicMonitor, Splunk, Orca Security, or similar enterprise platforms.
Must demonstrate strong communication skills, the ability to work closely with peers and stakeholders, and the ability to operate effectively in a fast-paced, dynamic environment.
Pluses: AWS certifications such as AWS Certified Solutions Architect - Associate or AWS Certified DevOps Engineer - Associate. Experience in financial services or other regulated industries. Experience supporting governance, compliance, or cloud security programs.
Azure Cloud & DevOps Engineer
Data engineer job in Chicago, IL
Azure Cloud & DevOps Engineer
Chicago, IL | Hybrid | Full-Time
At Sprocket Sports, we are currently seeking an Azure Cloud & DevOps Engineer to join our team. The ideal candidate has a passion for youth sports and for managing a best-in-class software platform that will be used by thousands of youth sports club administrators, coaches, parents, and players.
About Sprocket
Sprocket Sports is a fast-growing technology company based in Chicago and a national leader in the youth sports space. Our software and services help clubs streamline operations, reduce costs, and grow faster, so they can focus on what really matters: kids playing sports. We're also proud to be a certified Great Place to Work 2024, with a culture that balances high standards, accountability, and fun.
What You'll Do
As an experienced DevOps / cloud engineer you will help us scale and maintain a high-performing, reliable, and cost-effective cloud infrastructure. As an Azure Cloud Engineer, you will be the backbone of our cloud infrastructure, ensuring our platform is always available, fast, and secure for our users. You will manage our resources in Microsoft Azure, focusing heavily on performance optimization, cost control, and proactive system health monitoring. This role is perfect for someone passionate about cloud technology, DevOps principles, and continuous improvement. In this role you will interact with our software engineers, product managers and occasionally with operational stakeholders. We are seeking individuals who like to think creatively and have a passion for continually improving the platform.
Responsibilities:
Core Azure Cloud Management
Resource & Cost Optimization:
Manage, provision, and maintain our complete suite of Azure resources (e.g., App Services, Azure Functions, AKS, VMs).
Proactively manage and reduce cloud costs by identifying and implementing efficiencies in resource utilization and recommending right-sizing strategies.
Security and Compliance:
Ensure security best practices are implemented across all Azure services, including network segmentation, access control (IAM), and patching.
Performance & Reliability Engineering (SRE Focus)
System Health and Monitoring:
Ongoing monitoring of application and system performance using Azure and DataDog to detect and diagnose issues before they impact users (see the sketch after this section).
Review system logs, metrics, and tracing data to identify areas of concern, bottlenecks, and opportunities for performance tuning.
Performance Testing:
Lead efforts to conduct load testing and performance testing on the system.
Database Performance Tuning:
Review and optimize SQL performance by analyzing query plans, identifying slow-running queries, and recommending improvements (indexing, schema changes, stored procedures).
Manage and monitor our Azure SQL Database resources for optimal health and throughput.
Incident Response: Participate in on-call rotation to provide 24/7 support for critical infrastructure incidents and drive root cause analysis (RCA).
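In the spirit of the monitoring work above, a minimal sketch of pushing a custom metric to Datadog from Python using the legacy datadog client library; the API keys, metric name, value, and tags are placeholders:

```python
# Sketch: emit a custom metric to Datadog (keys, metric name, and tags are placeholders).
import time
from datadog import initialize, api

initialize(api_key="...", app_key="...")

api.Metric.send(
    metric="sprocket.api.response_ms",       # hypothetical metric name
    points=[(time.time(), 182.0)],           # (timestamp, value) pairs
    tags=["env:prod", "service:registration"],
)
```

Custom metrics like this can then back dashboards and monitors that alert before users feel a slowdown.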
DevOps Automation
Infrastructure as Code (IaC):
Implement Infrastructure-as-Code (ARM, Bicep, or Terraform) to maintain consistent, auditable deployments.
Continuous Integration / Continuous Delivery (CI/CD):
Work closely with the development team to automate and streamline deployment pipelines (CI/CD) using Azure DevOps, ensuring fast and reliable releases.
Configuration Management: Implement and manage configuration for applications and infrastructure.
What We're Looking For:
Bachelor's degree in a Computer Science or related field.
3+ years of professional experience in Cloud Engineering, DevOps, or a similar role, with a strong focus on Microsoft Azure.
Deep hands-on experience with core Azure services and strong networking fundamentals.
Solid experience with monitoring and observability platforms, specifically DataDog.
Scripting proficiency in PowerShell.
Demonstrated ability to analyze and optimize relational database performance (SQL/T-SQL).
Strong problem-solving skills.
Strong communication and interpersonal skills; ability to analytically defend design decisions and take feedback without ego.
Strong attention to detail and accountability.
Why Join Us?
Certified Great Place to Work 2024
Mission-driven team with a big vision
Fast-growing startup with room to grow
Competitive salary + equity
401(k) with company match
Comprehensive medical and dental
A culture built on Higher Standards, Greater Accountability, and More Fun
Azure Devops Engineer
Data engineer job in Chicago, IL
· Set up CI/CD pipelines to support automated deployment of resources to Cloud environments, all at a medium to high level of complexity
· This is a hands-on role that develops and supports build and release automation pipelines. You will be part of the team that will deploy a highly available full software stack in public/private clouds
· Remediate gaps and support the automation requirements of continuous integration and continuous deployment
· Identify and develop metrics and dashboards to monitor adoption and maturity of DevOps
· Experience in Docker/Containerization and Kubernetes
· Ability to contribute to architecture discussions around technology controls and their implementation in a DevOps/Cloud environment
· Work collaboratively with architecture, security and other engineers to estimate, design, code, deploy and support working software / technology components
· Foster the adoption of DevSecOps culture and capabilities across Agile product delivery teams
· Embed "shift-left" security practices using tools like Checkmarx, SonarQube, and Prisma Cloud
· Work in an Agile/Scrum environment; planning, estimating, and completing tasks on time
· Liaison with Agile Delivery Process teams to support necessary configurations/setup in Azure DevOps (ADO) for Agile ceremonies
· Champion a modern SDLC by leading the consistent application of the redesigned SDLC framework, aligning with Agile, DevSecOps, and platform standards
· Work with development and support teams to design improved deployment, provisioning and integration workflows, ensure environment stability and identify areas and plans for improvement
· Contribute to new technology, vendor package and tool road mapping, evaluation and introduction
· Ensure compliance with Performance, Security, Availability, and Recoverability standards and policies, and provide monitoring recommendations for tasks of low to medium level of complexity
· 5+ years of demonstrable software engineering and DevOps experience
· 5+ years working in a SCRUM/Agile software development environment
· Experience deploying and administering continuous integration tools such as Azure DevOps is a must
· Experience with infrastructure cloud tools such as Terraform, Docker, and Aspire
· Experience with automated testing solutions for unit, integration and system testing
· Bachelor's Degree or equivalent experience; Computer Science or related field preferred
· Strong cloud engineering experience, primarily with Azure and AWS
· Experience working with Terraform, Ansible, and/or Chef for infrastructure automation and configuration
· Experience with Docker and Kubernetes on platforms such as AWS ECS and AWS EKS
· Experience with programming languages such as Python, PowerShell, and C++ is a plus
· Experience with APM, monitoring and logging tools such as Datadog, SolarWinds, CloudWatch and Splunk
· Experience with SQL databases such as MySQL; NoSQL databases like AWS DynamoDB and MongoDB; and graph databases such as Neo4j and AWS Neptune
· Experience with project management and workflow tools and concepts such as Jira, Agile, Scrum/Kanban, etc.
· Proficiency in cross-platform scripting languages and build tools (Python, ANT, Artifactory, MSBuild, NuGet)
· Proficiency in OOP software development using C# or similar languages
· Ability to define scalable and secure CI/CD pipelines
· Understanding of deployment strategies using Docker and Podman for containerization
· Experience with pair programming using GitHub Copilot
· Strong communication/presentation skills and ability to explain standards, processes, and cloud architecture to team and management
Senior Python Developer
Data engineer job in Chicago, IL
Design & build production-grade services and APIs (FastAPI / Django / Flask) using clean, well-tested Python (see the sketch after this list).
Architect scalable systems (microservices, event-driven patterns, async I/O, caching) with high availability and failover.
Data & storage: model schemas; write efficient SQL; integrate with Postgres/MySQL and Redis caching.
Performance & reliability: profile (cProfile, py-spy), tune hot paths, apply back-pressure, circuit breakers, retries, and idempotency.
Security & compliance: enforce authN/Z, secrets management, secure coding, dependency hygiene (SCA), and data protection.
DevOps & quality: code reviews, automated testing (pytest), static typing (mypy/pyright), linting, and CI/CD best practices.
Cloud & platform: containerize services; deploy to Kubernetes or serverless (Azure Functions); manage IaC (Terraform).
Observability: instrument with OpenTelemetry; create actionable dashboards/alerts.
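A minimal sketch of the FastAPI style implied above, with typed Pydantic models and an in-memory dict standing in for the Postgres/Redis storage from the stack; the routes and schema are hypothetical:

```python
# Sketch: a minimal typed FastAPI service (routes and schema are hypothetical).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Order(BaseModel):
    sku: str
    qty: int

_ORDERS: dict[int, Order] = {}   # stand-in for Postgres/Redis

@app.post("/orders/{order_id}", status_code=201)
async def create_order(order_id: int, order: Order) -> Order:
    _ORDERS[order_id] = order
    return order

@app.get("/orders/{order_id}")
async def get_order(order_id: int) -> Order:
    if order_id not in _ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return _ORDERS[order_id]

# Run locally with: uvicorn app:app --reload
```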
Minimum qualifications
10+ years of professional software engineering with Python in production.
Deep expertise with at least one Python web framework (FastAPI, Django, or Flask) and modern async programming.
Strong CS fundamentals: algorithms, data structures, and concurrency
Proven experience designing distributed systems and event-driven architectures.
Solid SQL/ORM experience (SQLAlchemy/Django ORM) and schema design.
Mastery of testing (unit/integration/contract), CI/CD (GitHub/Azure DevOps), and release strategies.
Hands-on with cloud (Azure), containers, Kubernetes, and infrastructure automation.
Excellent communication; ability to lead cross-functional initiatives.
Nice to have:
Security background: OAuth2/OIDC, Key Vault/Secrets Manager, threat modeling, SDLC governance.
FinTech domain experience
Front-end familiarity (React) for API-consumer alignment.