
Data engineer jobs in Atlanta, GA

- 1,606 jobs
  • Data Architect with low latency

    New York Technology Partners 4.7 company rating

    Data engineer job in Atlanta, GA

    Role: Data Architect with low latency
    Duration: Long Term Contract

    We're seeking a seasoned Lead Software Engineer to architect, build, and scale real-time data processing platforms that power event-driven applications and analytics. You'll lead the design of streaming microservices, govern data quality and lineage, and mentor engineers while partnering with product, platform, and security stakeholders to deliver resilient, low-latency systems.

    Responsibilities:
    • Own design and delivery of high-throughput, low-latency streaming solutions using technologies like Confluent Kafka, Apache Flink, Hazelcast, Kafka Streams, Kafka Connect, and Schema Registry.
    • Design and implement microservices and event-driven systems with robust ETL/ELT pipelines for real-time ingestion, enrichment, and delivery.
    • Establish distributed caching and in-memory data grid patterns (e.g., Redis, Hazelcast) to optimize read/write performance and session/state management.
    • Define and operationalize event gateways/event grids for event routing, fan-out, and reliable delivery.
    • Lead data governance initiatives: standards for metadata, lineage, classifications, retention, access controls, and compliance (PII/PCI/SOX/GDPR as applicable).
    • Drive CI/CD best practices (pipelines, automated testing, progressive delivery) to enable safe, frequent releases; champion DevSecOps and "shift-left" testing.
    • Set SLOs/SLAs, track observability (tracing, metrics, logs), and optimize performance at scale (throughput, backpressure, state, checkpointing).
    • Work with Security, Platform, and Cloud teams on networking, IAM, secrets, certificates, and cost optimization.
    • Mentor engineers, conduct design reviews, and enforce coding standards and reliability patterns.
    • Guide the platform and delivery roadmap.

    Required Qualifications:
    • 10+ years in software engineering; 5+ years designing large-scale real-time or event-driven platforms.
    • Expert with Confluent Kafka (brokers, partitions, consumer groups, Schema Registry, Kafka Connect), Flink (DataStream/Table API, stateful ops, checkpointing), Hazelcast, and/or Kafka Streams.
    • Strong in ETL/ELT design, streaming joins/windows, exactly-once semantics, and idempotent processing.
    • Experience with microservices (Java/Python), REST/gRPC, protobuf/Avro, and contract-first development.
    • Hands-on with distributed caching and in-memory data grids; performance tuning and eviction strategies.
    • Cloud experience in one or more cloud platforms (Azure/AWS/GCP); containers, Docker, Kubernetes.
    • Experience in production-grade CI/CD (Jenkins, Bamboo, Harness, or similar) and Infrastructure as Code (Terraform/Helm).
    • Robust observability (Prometheus/Grafana/OpenTelemetry, Splunk/ELK, or similar) and resilience patterns (circuit breakers, retries, DLQs).
    • Practical data governance: metadata catalogs, lineage, encryption, RBAC.
    • Excellent communication; ability to lead design, influence stakeholders, and guide cross-functional delivery.
    • Core competencies: Architectural Thinking, Systems Design, Operational Excellence, Security & Compliance, Team Leadership, Stakeholder Management.

    Nice to Have:
    • Experience with CDC, Kafka Connect custom connectors, Flink SQL, Beam.
    • Streaming ML or feature-store integration (online/offline consistency).
    • Multi-region / disaster-recovery design for streaming platforms.
    • Experience with zero-downtime migrations, blue/green, and canary deployments.
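    The "exactly-once semantics and idempotent processing" requirement above boils down to making re-delivered events harmless: a consumer keyed by a unique event ID can skip anything already applied. A minimal, framework-free Python sketch of that pattern (all names here are invented for illustration, not from any Kafka client API):

    ```python
    class IdempotentProcessor:
        """Applies each event at most once, keyed by event_id."""

        def __init__(self):
            self._seen = set()   # in production this would be durable state
            self.total = 0

        def process(self, event_id, amount):
            if event_id in self._seen:
                return False     # duplicate delivery: ignore
            self._seen.add(event_id)
            self.total += amount
            return True


    p = IdempotentProcessor()
    for eid, amt in [("evt-1", 10), ("evt-2", 5), ("evt-1", 10)]:  # evt-1 redelivered
        p.process(eid, amt)
    ```

    After the loop, `p.total` is 15, not 25: the at-least-once redelivery of `evt-1` had no effect, which is the observable property "exactly-once" systems aim for.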
    $93k-121k yearly est. 16h ago
  • Data Analytics & AI Delivery Lead

    Futran Solutions 3.9 company rating

    Data engineer job in Alpharetta, GA

    Hello, hope you are doing well.

    Job Title: Data Analytics & AI Delivery Lead
    Experience: 10+ years

    Essential Duties and Responsibilities:
    We are seeking a highly motivated and experienced Delivery Leader to oversee the execution of strategic Analytics and AI projects. This individual will be responsible for ensuring on-time, high-quality delivery of initiatives across data and analytics programs, managing project organization, and serving as a critical bridge between business stakeholders and technical teams. This role requires strong delivery expertise across agile and waterfall methodologies, deep experience in stakeholder engagement, and familiarity with the Azure ecosystem, including tools like Synapse, Microsoft Fabric, Power BI, and Azure DevOps. The ideal candidate brings a track record of leading complex, cross-functional data initiatives with confidence, clarity, and consistency.

    Key Responsibilities:
    • Lead end-to-end delivery of Analytics and AI projects, from initiation through implementation and transition to operations.
    • Partner closely with business stakeholders to align project goals with strategic objectives and ensure value realization.
    • Drive project planning activities including scoping, scheduling, resource planning, risk management, and reporting.
    • Facilitate agile ceremonies or stage-gate processes based on project needs (Scrum or Waterfall).
    • Serve as a liaison between technical teams and business functions to translate requirements and remove roadblocks.
    • Ensure strong governance and compliance with delivery standards, documentation, and stakeholder communication.
    • Track key metrics such as scope, budget, timeline, and benefit realization to ensure successful project outcomes.
    • Contribute to the continuous improvement of delivery practices within the analytics and AI domain.

    Your Qualifications
    Required:
    • Bachelor's degree in Information Systems, Computer Science, Engineering, or a related field (Master's preferred).
    • 8+ years of experience in project/program management with a focus on data, analytics, or AI initiatives.
    • Demonstrated ability to manage complex, multi-disciplinary programs using Agile, Waterfall, and hybrid methodologies.
    • Experience delivering projects involving Azure Data Services, including Synapse Analytics, Power BI, Microsoft Fabric, and Azure DevOps.
    • Strong stakeholder management skills, with a proven ability to influence, communicate, and collaborate at all levels of the organization.
    • Knowledge of data governance, data modeling, and AI/ML deployment frameworks is a plus.
    • PMP, Scrum Master, or SAFe certification is advantageous.

    Preferred:
    • Strong analytical mindset with the ability to understand and interpret technical details.
    • Exceptional organizational skills and attention to detail.
    • Comfortable managing ambiguity and driving clarity in evolving project environments.
    • Proactive problem-solver with a focus on outcomes and team empowerment.
    • Familiarity with tools like Jira, Azure Boards, or similar project tracking platforms.

    Thanks & Regards,
    Mittapalli Lalith Kumar
    Senior Technical Recruiter
    ***************************
    $93k-127k yearly est. 16h ago
  • ETL Databricks Data Engineer

    Capgemini 4.5 company rating

    Data engineer job in Atlanta, GA

    We are seeking an ETL Databricks Data Engineer to join our team and help build robust, scalable data solutions. This role involves designing and maintaining data pipelines, optimizing ETL processes, and collaborating with cross-functional teams to ensure data integrity and accessibility.

    Key Responsibilities:
    • Design, develop, and maintain scalable data pipelines and ETL processes using Databricks.
    • Create and optimize Python scripts for data transformation, automation, and integration tasks.
    • Develop and fine-tune SQL queries for data extraction, transformation, and loading.
    • Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
    • Ensure data integrity, security, and compliance with organizational standards.
    • Participate in code reviews and contribute to best practices in data engineering.

    Required Skills & Qualifications:
    • 5 years of professional experience in data engineering or related roles.
    • Strong proficiency in Databricks (including Spark-based data processing).
    • Advanced programming skills in Python.
    • Expertise in SQL for querying and data modeling.
    • Familiarity with Azure Cloud and Azure Data Factory (ADF).
    • Understanding of ETL frameworks, data governance, and performance tuning.
    • Knowledge of CI/CD practices and version control tools (e.g., Git).
    • Exposure to BI tools such as Power BI or Tableau for data visualization.

    Life at Capgemini: Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
    • Flexible work
    • Healthcare including dental, vision, mental health, and well-being programs
    • Financial well-being programs such as 401(k) and Employee Share Ownership Plan
    • Paid time off and paid holidays
    • Paid parental leave
    • Family building benefits like adoption assistance, surrogacy, and cryopreservation
    • Social well-being benefits like subsidized back-up child/elder care and tutoring
    • Mentoring, coaching, and learning programs
    • Employee Resource Groups
    • Disaster relief

    Disclaimer: Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status, or any other characteristic protected by law. This is a general description of the duties, responsibilities, and qualifications required for this position. Physical, mental, sensory, or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship. Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact. Click the following link for more information on your rights as an Applicant: **************************************************************************
    $77k-99k yearly est. 2d ago
  • Data Architect

    Agile Resources, Inc.

    Data engineer job in Atlanta, GA

    Note: Initial 100% onsite required for the first six months.
    Employment Type: Permanent / Direct Hire / Full-time
    Salary: Up to $180,000 (depending on experience) + bonus

    The Role: We're seeking a highly skilled and hands-on Data Architect to lead the design, implementation, and ongoing evolution of our enterprise-grade data systems. This role is crucial for building scalable, secure, and intelligent data infrastructure that supports core analytics, operational excellence, and future AI initiatives. Success requires a seasoned technologist who can seamlessly integrate cloud-native services with traditional data warehousing to create a modern, unified data platform.

    What You'll Do:
    • Architecture & Strategy: Lead the design and implementation of modern data platforms, including Data Lakes, Data Warehouses, and Lakehouse architectures, to enable a single source of truth for the enterprise.
    • Data Modeling & Integration: Architect unified data models that support both modular monoliths and microservices-based platforms. Design and optimize high-volume, low-latency streaming/batch ETL/ELT pipelines.
    • Technical Leadership: Drive the technical execution across the entire data lifecycle. Build and optimize core data processing scripts using Spark and Python.
    • Governance & Quality: Define and enforce standards for data governance, metadata management, and data observability across distributed systems. Implement automated data lineage tracking, schema evolution, and data quality monitoring.
    • Cloud Infrastructure: Configure and manage cloud-native data services, including core data storage and event ingestion infrastructure.

    Required Experience:
    • Experience: 10+ years of proven experience in enterprise data architecture and engineering.
    • Core Platform Expertise: Strong, hands-on experience with the Azure Data Ecosystem, including Azure Data Lake Storage (ADLS), Azure Synapse Analytics (or equivalent cloud DW), and Azure Purview (or equivalent data catalog).
    • Processing: Deep expertise in Databricks (or Apache Spark) for ETL/ELT pipeline implementation, using Delta Lake and SQL Server (or equivalent RDBMS).
    • Coding & Scripting: Strong proficiency in Python, Spark, and advanced SQL.
    • Data Governance: Hands-on experience implementing data lineage tracking and data quality monitoring (e.g., using Great Expectations or dbt).

    Preferred Skills:
    • Semantic Technologies: Hands-on experience developing ontology frameworks using OWL, RDF, and SPARQL to enable semantic interoperability.
    • Advanced AI Data: Experience integrating structured/unstructured data into Knowledge Graphs and Vector Databases.
    • Streaming/Telemetry: Experience developing and maintaining semantic telemetry pipelines using services like Azure Event Hubs or Kafka.
    • Emerging Concepts: Exposure to linked data ecosystems, data mesh, or data fabric concepts.
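    The data-quality monitoring this posting mentions (tools like Great Expectations or dbt) amounts to declarative checks evaluated against a dataset. A dependency-free Python sketch of two common checks; the function names are illustrative, not from either library:

    ```python
    def check_nulls(rows, column, max_null_rate=0.0):
        """True if the fraction of missing values in `column` is within tolerance."""
        nulls = sum(1 for r in rows if r.get(column) is None)
        return (nulls / len(rows)) <= max_null_rate

    def check_unique(rows, column):
        """True if every value in `column` is distinct (a primary-key style check)."""
        values = [r[column] for r in rows]
        return len(values) == len(set(values))


    rows = [{"id": 1, "name": "alpha"}, {"id": 2, "name": None}]
    ```

    Here `check_unique(rows, "id")` passes while `check_nulls(rows, "name")` fails at the default zero tolerance; real tools add scheduling, reporting, and lineage on top of exactly this kind of predicate.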
    $180k yearly 1d ago
  • SAP Data Engineer

    IDR, Inc. 4.3 company rating

    Data engineer job in Atlanta, GA

    IDR is seeking an SAP Data Engineer to join one of our top clients for an opportunity in Atlanta, GA. This role involves designing, building, and optimizing data pipelines and architecture to support advanced analytics and business intelligence in a dynamic enterprise environment.

    Position Overview for the SAP Data Engineer:
    • Develop and optimize ETL/ELT pipelines from SAP sources and other enterprise data sources
    • Design, build, and maintain data architecture supporting analytics and BI initiatives
    • Deep knowledge of SAP technologies including BW/4HANA, S/4HANA, ECC, BusinessObjects, and SAC
    • Experience with Celonis data engineering and enterprise DataOps practices
    • Collaborate with cross-functional teams to ensure data governance, security, and performance

    Requirements for the SAP Data Engineer:
    • 5-8+ years of hands-on experience in Data Engineering with SAP BW/4HANA, SAP ECC/S4, BusinessObjects, SAC, and Celonis
    • Strong knowledge of data integration techniques such as ODP, SLT, ABAP, SQL, and CDS views
    • Experience developing and customizing BI extraction processes using ABAP
    • Familiarity with SAP Datasphere, hybrid architecture, and SAP BPC
    • Understanding of process excellence frameworks, Celonis EMS, DataOps, CI/CD practices, Snowflake, and Tableau

    What's in it for you?
    • Competitive compensation package
    • Full benefits: medical, vision, dental, and more!
    • Opportunity to get in with an industry-leading organization

    Why IDR?
    • 25+ years of proven industry experience in 4 major markets
    • Employee Stock Ownership Program
    • Dedicated Engagement Manager who is committed to you and your success
    • Medical, dental, vision, and life insurance
    • ClearlyRated's Best of Staffing Client and Talent Award winner 12 years in a row
    $74k-99k yearly est. 1d ago
  • ML Engineer with Timeseries data experience

    Techstar Group 3.7 company rating

    Data engineer job in Atlanta, GA

    Role: ML Engineer with time-series data experience
    Hybrid in Atlanta, GA (locals preferred)
    $58/hr on C2C; any visa

    • Model Development: Design, build, train, and optimize ML/DL models for time-series forecasting, prediction, anomaly detection, and causal inference.
    • Data Pipelines: Create robust data pipelines for collection, preprocessing, feature engineering, and labeling of large-scale time-series data.
    • Scalable Systems: Architect and implement scalable AI/ML infrastructure and MLOps pipelines (CI/CD, monitoring) for production deployment.
    • Collaboration: Work with data engineers, software developers, and domain experts to integrate AI solutions.
    • Performance: Monitor, troubleshoot, and optimize model performance, ensuring robustness and real-world applicability.
    • Languages & Frameworks: Good understanding of the AWS framework, Python (Pandas, NumPy), PyTorch, TensorFlow, Scikit-learn, PySpark.
    • ML/DL Expertise: Strong grasp of time-series models (ARIMA, Prophet, deep learning), anomaly detection, and predictive analytics.
    • Data Handling: Experience with large datasets, feature engineering, and scalable data processing.
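    For the anomaly-detection side of a role like this, a z-score rule over the series is the usual baseline tried before ARIMA or Prophet. A minimal sketch using only the standard library (thresholds and data are illustrative):

    ```python
    import statistics

    def zscore_anomalies(series, threshold=3.0):
        """Flag indices whose value deviates from the series mean by more than
        `threshold` population standard deviations."""
        mean = statistics.fmean(series)
        stdev = statistics.pstdev(series)
        if stdev == 0:
            return []  # constant series: nothing can be anomalous
        return [i for i, x in enumerate(series) if abs(x - mean) / stdev > threshold]


    readings = [1, 1, 1, 1, 1, 1, 1, 1, 1, 100]   # one obvious spike at index 9
    spikes = zscore_anomalies(readings, threshold=2.5)
    ```

    A global z-score ignores trend and seasonality, which is exactly why production systems move on to rolling windows or model-based residuals; but it is a useful sanity check and a common first feature.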
    $58 hourly 4d ago
  • Data Engineer

    A2C 4.7 company rating

    Data engineer job in Alpharetta, GA

    5 days onsite in Alpharetta, GA

    Skills required:
    • Python
    • Data pipelines
    • Data analysis
    • Data modeling
    • Solid cloud experience (must have)
    • AI/ML
    • Strong problem-solving skills
    • Strong communication skills

    A problem solver with the ability to analyze and research complex issues and propose actionable solutions and/or strategies. Solid understanding of and hands-on experience with major cloud platforms. Experience in designing and implementing data pipelines. Must have experience with at least one of GCP, AWS, or Azure, and must have the drive to learn GCP.
    $77k-106k yearly est. 2d ago
  • Lead Data Engineer - Palantir Foundry

    Smurfit Westrock

    Data engineer job in Atlanta, GA

    Our technology organization is transforming how we work at WestRock. We align with our businesses to deliver innovative solutions that:
    • Address specific business challenges, integrate processes, and create great experiences
    • Connect our work to shared goals that propel WestRock forward in the Digital Age
    • Imagine how technology can advance the way we work by using disruptive technology

    We are looking for forward-thinking technologists who can accelerate our focus areas, such as building stronger foundational technology capabilities, reducing complexity, employing digital transformation concepts, and leveraging disruptive technology. As a Lead Data Engineer, you will play a pivotal role in building and scaling modern data infrastructure that powers decision-making across production, supply chain, and operations. You will help define and analyze business requirements for enterprise-scale reports, evaluate business use cases for data engineering problems, and help design and develop processing solutions with cloud-based ETL technologies.

    How you will impact WestRock:
    • Architect and implement scalable data pipelines using Palantir Foundry (pipelines, workshops, ontology) to unify and transform operational data.
    • Design and develop robust data workflows using Python, Apache Airflow, and Apache Spark to support real-time and batch processing needs.
    • Build and deploy solutions on cloud platforms (AWS or Azure), ensuring high availability, security, and performance.
    • Collaborate with data scientists, analysts, and operations teams to deliver actionable insights and operational tooling.
    • Define and enforce data engineering best practices, including CI/CD automation, version control (Git), and testing strategies.
    • Mentor junior developers, conduct code reviews, and help shape the technical roadmap for the data platform.

    What you need to succeed:
    • Education: Bachelor's degree in computer science or similar
    • At least 6 years of strong data engineering experience
    • Hands-on experience with Palantir Foundry, including pipelines, ontology modeling, and workshop development
    • Strong programming skills in Python or Java, with experience building and maintaining production-grade data pipelines
    • Proficiency in Apache Airflow and Apache Spark for workflow orchestration and large-scale data processing
    • Proven experience deploying data solutions on AWS or Azure, with a strong understanding of cloud-native services
    • Familiarity with Git for version control and CI/CD pipelines for automated testing and deployment
    • Demonstrated ability to mentor junior engineers, lead projects, and work independently in a fast-paced environment
    • Good communication skills, with the ability to collaborate effectively across technical and non-technical teams
    • Good analytical and troubleshooting abilities

    What we offer:
    • Corporate culture based on integrity, respect, accountability, and excellence
    • Comprehensive training with numerous learning and development opportunities
    • An attractive salary reflecting skills, competencies, and potential
    • A career with a global packaging company where Sustainability, Safety, and Inclusion are business drivers and foundational elements of the daily work
    $75k-100k yearly est. 16h ago
  • Lead Azure Databricks Engineer

    Syren

    Data engineer job in Atlanta, GA

    Note: Individual Contractors (W2/1099) are encouraged to apply. Visa sponsorship is not available for this role at this time.

    An Azure Data Engineer is responsible for designing, implementing, and maintaining the data infrastructure within an organization. They collaborate with both business and IT teams to understand stakeholders' needs and unlock the full potential of data. They create conceptual and logical data models, analyze structural requirements, and ensure efficient database solutions.

    Must-Have Skills:
    • Experience migrating from other platforms to Databricks
    • Proficiency in Databricks and Azure Cloud, including Databricks Asset Bundles; a holistic vision of the data strategy
    • Proficiency in data streaming and data modeling
    • Experience architecting at least two large-scale big data projects
    • Strong understanding of data scaling and its complexities
    • Data archiving and purging mechanisms

    Job Requirements:
    • Degree in computer science or equivalent preferred
    • Demonstrable experience in architecture, design, implementation, and/or support of highly distributed applications with Azure cloud and Databricks
    • 10+ years of hands-on experience with data modeling, database design, data mining, and segmentation techniques
    • Working knowledge of and experience with cloud architectures (e.g., SaaS, PaaS, IaaS) and the ability to address the unique security considerations of secure cloud computing
    • Should have architected solutions for cloud environments such as Microsoft Azure and/or GCP
    • Experience with debugging and performance tuning in distributed environments
    • Strong analytical skills with the ability to collect, organize, analyze, and broadcast significant amounts of information with attention to detail and accuracy
    • Experience dealing with structured and unstructured data
    • Must have Python and PySpark experience
    • Experience in ML and/or graph analysis is a plus
    $75k-100k yearly est. 3d ago
  • Technical Data Architect

    Oldcastle Infrastructure 4.3 company rating

    Data engineer job in Atlanta, GA

    Exempt

    Oldcastle Infrastructure™, a CRH company, is the leading provider of utility infrastructure solutions for the water, energy, and communications markets throughout North America. We're more than just a manufacturer of precast concrete, polymer concrete, or plastic products. We're a trusted and strategic partner to engineers, contractors, distributors, specifiers, and more. With our network of more than 80 manufacturing facilities and more than 4,000 employees, we're leading the industry with innovation and a safety-first mindset.

    Job Summary
    Oldcastle Infrastructure (OI), as part of CRH's Infrastructure Products Group (IPG), is a global manufacturing leader of utility infrastructure products. Our goal is to be the most efficient producer of engineered systems and our customers' strategic partner of choice. A crucial part of OI's journey is the investment in new digital tools, including a new ERP. With a modern, common platform, OI will unlock the benefits of its scale, deliver a better customer experience, and build a foundation for continuous process improvement. The Technical Data Architect is a senior role accountable for defining, governing, and delivering the data architecture strategy required to migrate enterprise data from legacy systems into SAP S/4HANA and Salesforce CPQ. This role ensures that data models, migration approaches, and governance structures support end-to-end business processes and regulatory compliance, while delivering high-quality, reconciled, and auditable data into the template. The architect will partner with the business data management team, program management office, functional process owners, and system integrators to ensure a seamless transition with minimal disruption to operations.

    Job Location
    This role will work hybrid out of our office in the Sandy Springs, GA area.

    Job Responsibilities
    Data Architecture & Modeling:
    • Design target SAP S/4HANA data models and mapping rules from legacy systems.
    • Validate functional data alignment for Finance (FI/CO), Sales & Distribution (SD), Materials Management (MM), and Production Planning (PP).
    • Leverage the CRH IPG Data Dictionary, data management, and ETL migration tools to support the cleansing and data migration processes.
    • Provide technical capabilities to support data quality and data reconciliations for master data subjects.

    ERP Data Migration:
    • Collaborate with the business master data team on the legacy data migration by supporting the technical requirements for customers, vendors, BOMs, products, and other master data subjects.
    • Define extraction, transformation, load, and reconciliation processes with automation where possible.

    Master Data Management:
    • Partner with the business master data team to align on the governance model, ownership, and ongoing stewardship processes for core data subjects.
    • Define and support the data migration testing strategy, including mock loads, trial conversions, and dress rehearsals.
    • Partner with the business master data team and users for validation and sign-off at each migration stage.
    • Design cutover sequencing for data loads, ensuring minimal downtime.
    • Coordinate with functional leads and the PMO on entry/exit criteria and contingency planning for go-live events related to data quality readiness.

    Job Requirements
    • 5-8+ years of experience working in data architecture in the manufacturing industry
    • Proven track record in delivering large-scale data migrations (CPQ, OTC, Finance, Supply Chain, Manufacturing P2P)
    • Hands-on experience with ETL/migration tools (SAP Data Services, Informatica, etc.)
    • Strong knowledge of data governance, master data management, and audit/compliance processes
    • Process improvement knowledge gained while working in an organization undergoing a significant operational culture shift
    • Creation and improvement of processes that demonstrate ease of doing business internally and externally
    • Development and implementation of process adherence and data quality adoption metrics
    • Comfortable operating in an environment of ambiguity and fast change
    • Strong interpersonal and organizational influencing skills
    • Ability to communicate in a simple, articulate, thoughtful manner to varying audience levels
    • Innovative spirit to work cross-functionally in developing improvement ideas
    • A pleasant, likeable manner while accomplishing challenging results
    • Bachelor's degree in computer science or a related technical discipline
    • SAP Technical Certifications in Master Data/Data Services/MDG (preferred)
    • PMP Certification (preferred)

    What CRH Offers You
    • Highly competitive base pay
    • Comprehensive medical, dental, and disability benefits programs
    • Group retirement savings program
    • Health and wellness programs
    • An inclusive culture that values opportunity for growth, development, and internal promotion

    About CRH
    CRH has a long and proud heritage. We are a collection of hundreds of family businesses, regional companies, and large enterprises that together form the CRH family. CRH operates in a decentralized, diversified structure that allows you to work in a small company environment while having the career opportunities of a large international organization. If you're up for a rewarding challenge, we invite you to take the first step and apply today! Once you click apply now, you will be brought to our official employment application. Please complete your online profile and it will be sent to the hiring manager. Our system allows you to view and track your status 24 hours a day. Thank you for your interest!

    Oldcastle Infrastructure, a CRH Company, is an Affirmative Action and Equal Opportunity Employer. EOE/Vet/Disability. CRH is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, status as a protected veteran, or any other characteristic protected under applicable federal, state, or local law.
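    The "reconciled, auditable" migration work this posting describes usually starts with control-total checks: comparing record counts and summed amounts between the legacy extract and the migrated load before sign-off. An illustrative Python sketch with hypothetical field names (real SAP migrations would run this per data object and currency):

    ```python
    def reconcile(source_rows, target_rows, amount_field="amount"):
        """Compare row counts and summed control totals between a legacy
        extract and a migrated load; return a list of discrepancies."""
        issues = []
        if len(source_rows) != len(target_rows):
            issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
        src_total = sum(r[amount_field] for r in source_rows)
        tgt_total = sum(r[amount_field] for r in target_rows)
        if src_total != tgt_total:
            issues.append(f"control total mismatch: {src_total} vs {tgt_total}")
        return issues


    legacy = [{"amount": 10}, {"amount": 5}]
    migrated_ok = [{"amount": 10}, {"amount": 5}]
    migrated_bad = [{"amount": 10}]   # one record dropped in load
    ```

    An empty result is the "green" reconciliation state; each mock load and dress rehearsal mentioned above would gate on it before cutover.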
    $84k-113k yearly est. 1d ago
  • Data Engineer

    Synechron 4.4 company rating

    Data engineer job in Alpharetta, GA

At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, serving an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs, we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications, and more. Over the last 20+ years, our company has been honored with multiple employer awards recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+ across 58 offices in 21 countries within key global markets.

Our Challenge
Join our data-driven enterprise and lead the design of scalable, high-performance big data solutions. You will craft architectures that handle vast volumes of data, optimize pipeline performance, and incorporate advanced governance and AI-powered processing to unlock actionable insights.

Additional Information
The base salary for this position varies based on geography and other factors. In accordance with law, the base salary for this role if filled within Alpharetta, GA is $120K - 125K/year plus benefits (see below).

The Role
Responsibilities:
• Design, build, and maintain scalable big data architectures supporting enterprise analytics and operational needs.
• Develop, implement, and optimize data pipelines using Apache Airflow, Databricks, and other relevant technologies to ensure reliable data flow and process automation.
• Manage and enhance data workflows for batch and real-time processing, ensuring efficiency and scalability.
• Collaborate with data scientists, analysts, and business stakeholders to translate requirements into robust data solutions.
• Implement data governance, security, and compliance best practices to protect sensitive information.
• Explore integrating AI/ML techniques into data pipelines, leveraging Databricks and other AI tools for predictive analytics and automation.
• Develop monitoring dashboards and alert systems to ensure pipeline health and performance.
• Stay current with emerging big data and cloud technologies, recommending best practices to improve system performance and scalability.

Requirements:
• 5+ years of proven experience in big data architecture design, including distributed storage and processing frameworks such as Hadoop, Spark, and Databricks.
• Strong expertise in performance tuning for large-scale data systems.
• Hands-on experience with Apache Airflow for workflow orchestration.
• Proficiency in SQL for managing and querying large databases.
• Extensive experience with Python for scripting, automation, and data processing workflows.
• Experience working with cloud platforms (Azure, AWS, or GCP) preferred.

Preferred, but not required:
• Deep understanding of data governance and security frameworks to safeguard sensitive data.
• Experience with integrating AI/ML models into data pipelines using Databricks MLflow or similar tools.
• Knowledge of containerization (Docker, Kubernetes) is a plus.

We offer:
• A highly competitive compensation and benefits package.
• A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
• 10 days of paid annual leave (plus sick leave and national holidays).
• Maternity & paternity leave plans.
• A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
• Retirement savings plans.
• A higher education certification policy.
• Commuter benefits (varies by region).
• Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
• On-demand Udemy for Business for all Synechron employees, with free access to more than 5,000 curated courses.
• Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups.
• Cutting-edge projects at the world's leading tier-one banks, financial institutions, and insurance firms.
• A flat and approachable organization.
• A truly diverse, fun-loving, and global work culture.
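The workflow-orchestration skill this posting calls out (Airflow-style DAGs of dependent extract/transform/load tasks) boils down to running tasks in dependency order. A minimal standard-library sketch of that idea, assuming toy task names and data rather than Airflow's actual API:

```python
from graphlib import TopologicalSorter

# Shared state the tasks write into (Airflow would use XCom or external storage).
results = {}

def extract():
    results["extract"] = [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

def transform():
    # Enrich each extracted row; here just doubling a value for illustration.
    results["transform"] = [
        {**row, "value": row["value"] * 2} for row in results["extract"]
    ]

def load():
    # "Load" step records how many rows reached the target.
    results["load"] = len(results["transform"])

tasks = {"extract": extract, "transform": transform, "load": load}
# Edges mirror an Airflow-style dependency chain: extract >> transform >> load.
deps = {"transform": {"extract"}, "load": {"transform"}}

# static_order() yields each task only after all of its predecessors.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results["load"])  # prints 2: both rows flowed through the pipeline
```

In a real Airflow deployment each callable would be a task or operator with retries, scheduling, and backfills handled by the scheduler; the dependency-ordering logic shown here is the core concept.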
    $120k-125k yearly 3d ago
  • Senior Data Architect

    Visionaire Partners 4.1company rating

    Data engineer job in Atlanta, GA

Long-term opportunity with a rapidly growing company!

RESPONSIBILITIES:
• Own end-to-end data architecture for enterprise SaaS platforms, including both OLTP and analytical serving layers.
• Design and operate solutions across Azure SQL DB/MI, Azure Databricks with Delta Lake, ADLS Gen2, and Synapse Analytics / Microsoft Fabric.
• Partner with analytics teams on Power BI semantic models, including performance optimization and row-level security (RLS).
• Define and implement Information Lifecycle Management (ILM): hot/warm/cold tiers, 2-year OLTP retention, archive/nearline, and a BI mirror that enables rich analytics without impacting production workloads.
• Engineer ERP/SAP financial interfaces for idempotency, reconciliation, and traceability; design rollback/de-dup strategies and financial journal integrity controls.
• Govern schema evolution/DbVersions to prevent cross-customer regressions while achieving performance gains.
• Establish data SLOs (freshness, latency, correctness) mapped to customer SLAs; instrument monitoring/alerting and drive continuous improvement.

This is a direct-hire opportunity in Atlanta. Work onsite the first 5-6 months, then transition to a hybrid schedule of 3 days in the office with 2 days remote (flex days).

REQUIRED SKILLS:
• 10+ years of experience in data or database engineering.
• 5-8+ years owning data or database architecture for customer-facing SaaS or analytics platforms at enterprise scale.
• Proven experience operating at multi-terabyte scale (5+ TB) with measurable improvements in performance, reliability, and cost.
• Strong expertise with Azure data technologies.
• Advanced SQL skills, including query optimization, indexing, partitioning, CDC, caching, and schema versioning.
• Experience designing audit-ready, SLA-driven data platforms.
• Strong background in ERP/SAP data integrations, particularly financial data.
• Bachelor's degree.

PREFERRED SKILLS:
• Power BI performance modeling (RLS, composite models, incremental refresh, DAX optimization).
• Modular monolith/microservices experience.
• Semantic tech (ontology/knowledge graphs), vector stores, and agentic AI orchestration experience.

Must be authorized to work in the US. Sponsorship is not available.
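The ILM scheme this posting describes (hot/warm tiers inside OLTP with a 2-year retention window, then archive/nearline) reduces to an age-based routing rule. A sketch of that rule, where the 90-day "hot" cutoff is an assumed illustrative value, not something stated in the posting:

```python
from datetime import date, timedelta

# Tier cutoffs: 90-day hot window is an assumption for illustration;
# the 2-year OLTP retention comes from the posting's ILM description.
HOT_DAYS = 90
OLTP_RETENTION_DAYS = 2 * 365

def storage_tier(record_date: date, today: date) -> str:
    """Route a record to a storage tier by age, per the ILM policy."""
    age_days = (today - record_date).days
    if age_days <= HOT_DAYS:
        return "hot"      # stays in OLTP, fully indexed for fast access
    if age_days <= OLTP_RETENTION_DAYS:
        return "warm"     # still in OLTP; candidate for partition pruning
    return "archive"      # moved nearline; queryable through the BI mirror

today = date(2025, 1, 1)
print(storage_tier(today - timedelta(days=30), today))    # hot
print(storage_tier(today - timedelta(days=400), today))   # warm
print(storage_tier(today - timedelta(days=1000), today))  # archive
```

In production this decision would typically drive partition switching or Delta Lake table moves on a schedule, so the hot OLTP tables stay small while archived data remains queryable from the BI mirror.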
    $100k-137k yearly est. 4d ago
  • Senior Data Architect

    Mtech Systems 3.5company rating

    Data engineer job in Dunwoody, GA

At MTech Systems, our company mission is to increase yield in protein production to help feed the growing world population without compromising animal welfare or damaging the planet. We aim to create software that delivers real-time data to the entire supply chain, giving producers better insight into what is happening on their farms and what they can do to responsibly improve production.

MTech Systems is a prominent provider of tools for managing performance in Live Animal Protein Production. For over 30 years, MTech Systems has provided cutting-edge enterprise data solutions for all aspects of the live poultry operations cycle. We provide our customers with solutions in Business Intelligence, Live Production Accounting, Production Planning, and Remote Data Management, all through an integrated system. Our applications can currently be found running businesses on six continents in over 50 countries. MTech has built an international reputation for equipping our customers with the power to utilize comprehensive data to maximize profitability. With over 250 employees globally, MTech Systems currently has main offices in Mexico, the United States, and Brazil, with additional resources in key markets around the world. MTech Systems USA's headquarters is based in Atlanta, Georgia and has approximately 90 team members in a casual, collaborative environment. Our work culture here is based on a commitment to helping our clients feed the world, resulting in a flexible and rewarding atmosphere. We are committed to maintaining a work culture that enhances collaboration, provides robust development tools, offers training programs, and allows for direct access to senior and executive management.

Job Summary
MTech builds customer-facing SaaS & analytics products used by global enterprise customers. You will own the database/data platform architecture that powers these products, driving performance, reliability, auditability, and cost efficiency at multi-tenant, multi-terabyte scale. Success is measured in hard outcomes: fewer P1s/support tickets, faster queries, bullet-proof ERP/SAP integrations, SLO compliance tied to SLAs, and audit-ready evidence.

Responsibilities and Duties

Architecture & Design
• Own the end-to-end data architecture for enterprise SaaS (OLTP + analytical serving), including Azure SQL/MI, Databricks/Delta Lake, ADLS, Synapse/Fabric, and collaboration on Power BI semantic models (RLS, performance).
• Define and implement Information Lifecycle Management (ILM): hot/warm/cold tiers, 2-year OLTP retention, archive/nearline, and a BI mirror that enables rich analytics without impacting production workloads.
• Engineer ERP/SAP financial interfaces for idempotency, reconciliation, and traceability; design rollback/de-dup strategies and financial journal integrity controls.
• Govern schema evolution/DbVersions to prevent cross-customer regressions while achieving performance gains.
• Establish data SLOs (freshness, latency, correctness) mapped to customer SLAs; instrument monitoring/alerting and drive continuous improvement.

Operations & Observability
• Build observability for pipelines and interfaces (logs/metrics/traces, lineage, data quality gates) and correlate application telemetry (e.g., Stackify/Retrace) with DB performance for rapid root-cause analysis.
• Create incident playbooks (reprocess, reconcile, rollback) and drive MTTR down across data incidents.

Collaboration & Leadership
• Lead the DBA/DB engineering function (standards, reviews, capacity planning, HA/DR, on-call, performance/availability SLOs) and mentor data engineers.
• Partner with Product/Projects/BI to shape domain models that meet demanding customer reporting (e.g., Tyson Matrix) and planning needs without compromising OLTP.

Required Qualifications
• 15+ years in data/database engineering; 5-8+ years owning data/DB architecture for customer-facing SaaS/analytics at enterprise scale.
• Proven results at multi-terabyte scale (≥5 TB) with measurable improvements (P1 reduction, MTTR, query latency, cost/performance).
• Expertise in Azure SQL/MI, Databricks/Delta Lake, ADLS, Synapse/Fabric; deep SQL, partitioning/indexing, query plans, CDC, caching, schema versioning.
• Audit & SLA readiness: implemented controls/evidence to satisfy SOC 1 Type 2 (or equivalent) and run environments to SLOs linked to SLAs.
• ERP/SAP data interface craftsmanship: idempotent, reconciled, observable financial integrations.
• ILM/archival + BI mirror design for queryable archives/analytics without OLTP impact.

Preferred Skills
• Power BI performance modeling (RLS, composite models, incremental refresh, DAX optimization).
• Modular monolith/microservices experience (plus, not required).
• Semantic tech (ontology/knowledge graphs), vector stores, and agentic AI orchestration experience (advantage, not required).

EEO Statement
Integrated into our shared values is MTech's commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. MTech aims to maintain a global inclusive workplace where every person is regarded fairly, appreciated for their uniqueness, advanced according to their accomplishments, and encouraged to fulfill their highest potential. We believe in understanding and respecting differences among all people. Every individual at MTech has an ongoing responsibility to respect and support a globally diverse environment.
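The "idempotent, reconciled" financial interface pattern this role emphasizes means a replayed ERP delivery must not double-post journal lines. A minimal sketch of the de-duplication side of that pattern; the key fields (doc_id, line_no) are assumed for illustration, not the actual interface schema:

```python
def ingest(journal, seen_keys, lines):
    """Apply journal lines at most once, keyed on (doc_id, line_no)."""
    applied = 0
    for line in lines:
        key = (line["doc_id"], line["line_no"])
        if key in seen_keys:   # replayed delivery: skip, stay idempotent
            continue
        seen_keys.add(key)
        journal.append(line)
        applied += 1
    return applied

journal, seen = [], set()
batch = [
    {"doc_id": "INV-1", "line_no": 1, "amount": 100.0},
    {"doc_id": "INV-1", "line_no": 2, "amount": -100.0},
]
print(ingest(journal, seen, batch))  # prints 2: first delivery applies both lines
print(ingest(journal, seen, batch))  # prints 0: a replay is a no-op
# Reconciliation check: the posted journal lines balance to zero.
assert sum(line["amount"] for line in journal) == 0.0
```

In a real integration the seen-key set would live in durable storage (e.g., a keyed staging table with a unique constraint) so idempotency survives restarts, and the balance check would feed a reconciliation report rather than an assert.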
    $92k-123k yearly est. 1d ago
  • People Analytics Engineer

    The Clorox Company 4.6company rating

    Data engineer job in Alpharetta, GA

Clorox is the place that's committed to growth, for our people and our brands. Guided by our purpose and values, and with people at the center of everything we do, we believe every one of us can make a positive impact on consumers, communities, and teammates. Join our team. #CloroxIsThePlace

Your role at Clorox:
We are seeking a People Analytics Engineer with strong expertise in Business Intelligence and Data Engineering to join our People Analytics & Insights team. This role will design, develop, and maintain scalable data solutions that empower Clorox leaders and business partners to make data-driven decisions. You will play a key role in transforming complex workforce data into actionable insights through innovative products.

In this role, you will:
• Design and develop advanced dashboards and reports to visualize people metrics, workforce trends, and predictive analytics.
• Build and maintain robust data models, ensuring accuracy, consistency, and usability across all analytics solutions.
• Integrate and manage data pipelines from multiple sources (HRIS, ATS, engagement platforms, etc.) to deliver timely and reliable insights.
• Collaborate with cross-functional stakeholders to translate complex business questions into clear, data-driven solutions.
• Provide actionable insights on workforce planning, talent acquisition, retention, and diversity to inform strategic initiatives.
• Implement and uphold data governance standards, including security roles and compliance protocols, to maintain data integrity and confidentiality.

What we look for:
• 4+ years in BI development, preferably in an HR or People function.
• Bachelor's degree preferred, or equivalent job experience.
• Expertise in SQL; comfortable with at least one programming language (e.g., Python, R).
• Experience using a data warehouse (Snowflake or Databricks preferred).
• Proven dashboard development experience (Power BI required).
• Familiarity with HR systems (Workday, SAP SuccessFactors, etc.) is a plus.
• Knowledge of data governance and security best practices.
• Strong analytical thinking and problem-solving ability.
• Excellent communication skills to explain technical concepts to non-technical stakeholders.

Workplace type: 3 days in the office, 2 days working from home.

Our values-based culture connects to our purpose and empowers people to be their best, professionally and personally. We serve a diverse consumer base, which is why we believe teams that reflect our consumers bring fresh perspectives, drive innovation, and help us stay attuned to the world around us. That's why we foster an inclusive culture where every person can feel respected, valued, and fully able to participate, and ultimately able to thrive. Learn more.

[U.S.] Additional Information:
At Clorox, we champion people to be well and thrive, starting with our own people. To help make this possible, we offer comprehensive, competitive benefits that prioritize all aspects of wellbeing and provide flexibility for our teammates' unique needs. This includes robust health plans, a market-leading 401(k) program with a company match, flexible time off benefits (including half-day summer Fridays depending on location), inclusive fertility/adoption benefits, and more. We are committed to fair and equitable pay and are transparent with current and future teammates about our full salary ranges. We use broad salary ranges that reflect the competitive market for similar jobs and provide sufficient opportunity for growth as you gain experience and expand responsibilities, while also allowing for differentiation based on performance. Based on the breadth of our ranges, most new hires will start at Clorox in the first half of the applicable range. Your starting pay will depend on job-related factors, including relevant skills, knowledge, experience, and location. The applicable salary range for every role in the U.S. is based on your work location and is aligned to one of three zones according to the cost of labor in your area.
• Zone A: $72,400 - $132,500
• Zone B: $66,400 - $121,500
• Zone C: $60,300 - $110,400
All ranges are subject to change in the future. Your recruiter can share more about the specific salary range for your location during the hiring process. This job is also eligible for participation in Clorox's incentive plans, subject to the terms of the applicable plan documents and policies.

To all recruitment agencies: Clorox (and its brand families) does not accept agency resumes. Please do not forward resumes to Clorox employees, including any members of our leadership team. Clorox is not responsible for any fees related to unsolicited resumes.
    $72.4k-132.5k yearly 3d ago
  • Software Engineer

    Insight Global

    Data engineer job in Atlanta, GA

DevOps Software Engineer
Type: Hybrid in Atlanta, GA - Onsite 4x a month
Duration: 6-month contract to hire

About the Role
The DevOps Software Engineer works within the client's Software Development group as part of an Agile Scrum or Kanban team. In this role, you will design, code, test, automate, and support high-performing software and delivery pipelines. You serve as a technical expert on the systems you build and maintain, collaborating with teammates, business partners, and stakeholders to deliver scalable solutions aligned with the client's technical vision.

Software Engineering Responsibilities
• Build, maintain, and optimize CI/CD pipelines.
• Manage branching and release strategies (feature, release, hotfix, etc.).
• Implement automation across build, deployment, and operational processes.
• Design, develop, test, and document applications based on business requirements.
• Build standalone and multi-tiered applications using modern development practices.
• Deliver web and desktop applications for assigned projects.

Required Qualifications
• 3+ years of experience in software development (coding, debugging, testing, troubleshooting).
• Hands-on experience building and maintaining CI/CD pipelines.
• Experience managing branching and release strategies.
• Strong focus on automation across development and operational workflows.
• Azure DevOps, GitHub, Git, Visual Studio, or similar tooling.
• Web Services, JSON, XML, CSS, HTML.
• C#, JavaScript, SQL.

Contract/Contract-to-Hire Roles:
Compensation: $40/hr to $50/hr. Exact compensation may vary based on several factors, including skills, experience, and education. Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.

Conversion Salary: $95,000 - $115,000
    $95k-115k yearly 3d ago
  • Java Software Engineer

    Holistic Partners, Inc.

    Data engineer job in Atlanta, GA

Must Have Technical/Functional Skills
Java, J2EE, Angular 17, Spring Boot, Microservices, JavaScript, CSS, JSON, REST/SOAP Web Services, Hibernate, Maven, PostgreSQL, XML

Roles & Responsibilities
• Development and enhancement per change requests, and fixing defects independently.
• Coordinating with business analysts and cross-functional teams.
• Hands-on experience with Java, J2EE, Angular 17, Spring Boot, Microservices, JavaScript, CSS, JSON, REST/SOAP Web Services, Hibernate, Maven, and PostgreSQL.
• Proven analytical abilities and experience generating process documentation and reports.
• Excellent communication skills, with an ability to translate data into actionable insights.

Keywords: Java, J2EE, Angular 17, Spring Boot, Microservices, JavaScript, CSS, JSON, Web Services
    $64k-85k yearly est. 16h ago
  • UI/UX Engineer

    Charter Global 4.0company rating

    Data engineer job in Atlanta, GA

📅 Contract: 6+ Months

About the Role:
We're seeking a UI/UX Engineer to design and develop intuitive, user-centric web interfaces that deliver exceptional digital experiences. This role blends creative design, front-end development, and data-driven insights to optimize usability and performance.

Key Responsibilities:
• Design and implement responsive, accessible web interfaces using modern frameworks (React, Angular, Vue).
• Collaborate with business and technical teams to translate requirements into functional UI/UX solutions.
• Utilize analytics tools (Google Analytics, Tag Manager, Hotjar) to track user behavior and improve engagement.
• Conduct A/B testing and generate insights for UX enhancements and conversion optimization.
• Ensure compliance with accessibility standards (WCAG) and SEO best practices.

Requirements:
• 5+ years in UI/UX design and front-end development.
• Proficiency in HTML5, CSS3, JavaScript, and design tools (Figma, Adobe XD, Sketch).
• Experience with CMS platforms, version control (Git), and Agile methodologies.
• Strong understanding of analytics, accessibility, and user-centered design principles.
• Bachelor's degree in Design, HCI, or a related field.

Regards,
Ashish Lal | Talent Acquisition Manager
Charter Global Inc | *****************************
Email: *************************
LinkedIn: ASHISH K LAL | LinkedIn
One Glenlake Parkway | Suite 525 | Atlanta, GA 30328
    $67k-85k yearly est. 1d ago
  • Senior Angular Developer

    Firstpro 360 4.5company rating

    Data engineer job in Norcross, GA

About the Role
We're seeking a Senior Angular Developer to lead the migration of our applications from Angular 14 to Angular 19+ and architect state management solutions using NgRx.

Key Responsibilities
• Lead Angular 14 to 19+ migration initiatives.
• Design and implement NgRx state management (Store, Effects, Entity).
• Build scalable, performant enterprise Angular applications.
• Mentor developers on Angular best practices and reactive programming.
• Conduct code reviews and establish coding standards.

Required Skills
• 5+ years of Angular experience (versions 2+).
• Hands-on experience migrating Angular applications across major versions.
• Expert knowledge of NgRx state management patterns.
• Strong proficiency in TypeScript and RxJS.
• Experience with modern Angular features (standalone components, signals).
• Unit and E2E testing experience (Jest, Cypress, or similar).

Nice to Have
• Angular 19 feature experience.
• Angular Material or component library expertise.
• Server-side rendering (Angular Universal).
• Micro-frontend architecture knowledge.
    $91k-118k yearly est. 4d ago
  • Senior Lead Software Engineer

    Apps Accelerator

    Data engineer job in Alpharetta, GA

Senior Lead Software Engineer / Founding Software Engineer
Company: Apps Accelerator
Type: Full-time / Founding Role

About Us
At Apps Accelerator, we're building an AI-first venture studio: a place where great ideas, intelligent engineering, and entrepreneurial energy collide. Based in Alpharetta, GA, we're designing and launching the next generation of AI-powered web and mobile companies that move fast, scale smart, and redefine what's possible. We're not another dev shop. We're builders of intelligent ventures, taking ideas from whiteboard to revenue with speed, precision, and creativity.

The Role
We're looking for a Founding Software Engineer: someone who doesn't just want to write code but wants to build companies. You'll work side-by-side with the founder to design, develop, and launch AI-first products from concept to MVP. You'll experiment, iterate, and scale, using the full Microsoft and open AI ecosystem, from React and React Native to LLM integrations, automation, and intelligent systems. This role is perfect for someone who's half engineer, half entrepreneur: someone who thrives on building 0 → 1, learning fast, and shipping faster.

What You'll Do
• Lead the development of AI-first MVPs using React, React Native, Node.js, and modern backend frameworks.
• Integrate AI-assisted tools (e.g., GitHub Copilot, OpenAI API, LangChain) to accelerate development.
• Architect scalable, cloud-based infrastructure on Azure or AWS.
• Partner with design and strategy to bring ideas to life, fast.
• Collaborate on product vision, roadmap, and technical direction for multiple ventures.
• Establish the foundation for engineering culture, standards, and best practices.
• Grow with the studio, mentoring new engineers as the venture portfolio expands.

What We're Looking For
• 5+ years of experience in full-stack development (React, React Native, Node.js, TypeScript).
• Strong grasp of AI tools and frameworks: LLMs, APIs, and automation workflows.
• Startup or entrepreneurial experience: you've shipped products or launched something from scratch.
• A builder's mindset: you care about speed, quality, and iteration.
• Curiosity and grit. You love solving problems no one's solved before.

Bonus Skills
• Experience with LangChain, Semantic Kernel, or LlamaIndex.
• Familiarity with MLOps or deploying AI models in production.
• Passion for UX/UI design and creating intuitive interfaces.
• Interest in becoming a co-founder or equity partner in future ventures.

Why Join Us
• Ground-floor opportunity in an AI venture studio that's redefining how companies are built.
• Hybrid work model with in-person collaboration in Alpharetta, GA.
• Competitive base + equity participation in future ventures.
• Freedom to experiment, innovate, and build products that matter.
• Work directly with the founder; no red tape, just real impact.
    $83k-109k yearly est. 3d ago
  • Senior Developer

    Talent Software Services 3.6company rating

    Data engineer job in Alpharetta, GA

Are you an experienced Senior Developer with a desire to excel? If so, then Talent Software Services may have the job for you! Our client is seeking an experienced Senior Developer to work in Columbia, SC.

Primary Responsibilities/Accountabilities:
• Responsible for analysis, design, programming, and implementation of the most complex application tasks and projects in the area.
• Devise feasible, logical procedures to resolve business problems through the use of computer resources.
• Formulate scope and objectives through research to develop or modify complex systems.
• Provide technical direction and support in the development and support of business systems software and procedures.
• Design, code, test, and debug the most complex application programs.
• Provide expertise regarding the integration of applications across the business.
• Conceive, design, and implement structures and programs for business systems software.
• Act as an internal consultant, advocate, mentor, and change agent, providing expertise and technical guidance on complex projects.
• Work closely with customers, business analysts, and team members to determine business requirements that drive the analysis and design of quality technical solutions.
• Ensure solutions are aligned with business and IT strategies and comply with the organization's architectural standards.
• Provide design recommendations based on long-term IT organization strategy.
• Make recommendations toward the development of new code or reuse of existing code.
• Perform analysis, design, programming, and implementation on systems and procedures to solve complex business or scientific problems.
• Develop enterprise-level applications and custom integration solutions.
• Evaluate complex interrelationships in the immediate programming area to determine how changes in one program will affect another related area.
• Develop programming and development standards.
• Devise new sources of data and develop new approaches and techniques.
• Participate in the full systems life cycle; responsible for designing, coding, testing, implementing, maintaining, and supporting application software that is delivered on time and within budget.
• Provide guidance to lower-level programmers/analysts.
• Lead, plan, organize, and/or coordinate complex projects or phases of large projects.
• Determine and resolve problems with other systems analysts, programmers, and systems users.
• Test designed programs, verify logic, perform any necessary debugging, and write the documentation.
• Lead design, development, and deployment of cloud-native applications.
• Review and approve code merges.
• Monitor and manage CI/CD pipelines (Jenkins, GitHub Actions, Azure DevOps).
• Build and integrate solutions using AWS cloud services.
• Ensure alignment with architectural standards and business goals.
• Collaborate with cloud architects, security teams, and DevOps teams.
• Implement and enforce best practices for performance, security, and reliability.
• Troubleshoot deployment, performance, and integration issues.
• Mentor junior developers and provide technical leadership.

Qualifications:
• 8 years of experience in application development, systems testing, or other job-related IT experience.
• Full Stack Development: Java / Java EE ecosystems, JavaScript frameworks.
• Agile & Medicare experience.
• Cloud Platforms: hands-on experience with AWS or Azure; lift-and-shift or re-platforming project experience.
• Cloud Architecture & Reliability: design and develop resilient, high-availability, fault-tolerant systems; experience following a Cloud Well-Architected Framework.
• AWS Services (hands-on): EC2, Lambda, RDS, Aurora, S3, VPC, IAM, EKS / ECS (Fargate), DynamoDB, API Gateway, CloudFormation, CloudWatch, CloudTrail, Secrets Manager.
• Logging, Monitoring & Tracing: Splunk, CloudWatch, AWS X-Ray.
• Infrastructure as Code (IaC): Terraform, ARM Templates, CloudFormation.
• Security & Compliance: NIST 800-53, FedRAMP.
• CI/CD & DevSecOps Tools: Jenkins, GitHub Actions, AWS CodePipeline, Azure DevOps, Selenium, SonarQube, Snyk.
• Accessibility Standards: Section 508, WCAG remediation familiarity.

Preferred:
• Serverless architecture expertise.
• Cloud security best practices for cloud-native development.
• Cloud certifications (AWS, Azure).
• Containerization & orchestration: Docker, ECS / Fargate, Kubernetes.
• Secure coding practices.
• Identity & Access Management (IAM).
• Compliance experience in regulated environments: HIPAA, FedRAMP.
    $86k-112k yearly est. 3d ago

Learn more about data engineer jobs

How much does a data engineer earn in Atlanta, GA?

The average data engineer in Atlanta, GA earns between $65,000 and $114,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Atlanta, GA

$86,000
