GIS Data Analyst
Data analyst job in Atlanta, GA
We are seeking a highly skilled GIS Data Analyst to support the creation, maintenance, and quality assurance of critical geospatial infrastructure data for enterprise-level Engineering initiatives. This role plays a key part in ensuring the accuracy and reliability of GIS data used across the organization, including compliance-related programs and operational analytics.
The ideal candidate is a hands-on GIS professional with strong analytical skills, advanced geospatial editing experience, and the ability to translate field data into accurate digital representations. PTC (Positive Train Control) and rail experience are preferred but not required.
Key Responsibilities
Create, modify, and quality-check geospatial infrastructure data for engineering and business operations
Utilize GIS tools to ensure accurate topology and track geometry representation
Convert field-collected spatial data into a validated digital rail network aligned with organizational standards
Review, approve, and promote data change sets submitted by GIS/CAD technicians
Conduct regular inventory analysis, including track mileage, asset counts, and spatial measurements (see the sketch after this list)
Collaborate with engineering sub-groups and business partners to support enterprise GIS initiatives
Contribute to the preparation, assembly, and deployment of geospatial data to support compliance programs and corporate systems
Support continuous improvement by recommending cost-saving initiatives leveraging GIS technologies
Assist senior GIS staff in additional GIS responsibilities as needed
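For a flavor of the inventory analysis mentioned above, here is a minimal sketch, assuming GeoPandas rather than the ESRI stack the posting prefers; the dataset name, projection choice, and subdivision attribute are hypothetical.

```python
# Hedged sketch of a track-mileage inventory, assuming GeoPandas.
# File name, CRS, and attribute names below are hypothetical.
import geopandas as gpd

tracks = gpd.read_file("track_centerlines.shp")  # hypothetical dataset
tracks = tracks.to_crs(epsg=26916)               # UTM zone 16N (meters), covers Atlanta

tracks["miles"] = tracks.geometry.length / 1609.344  # meters -> miles
print("Total track mileage:", round(tracks["miles"].sum(), 2))
print(tracks.groupby("subdivision")["miles"].sum())  # per-subdivision rollup
```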
Required Skills & Qualifications
Advanced proficiency with GIS software, ideally ESRI tools (ArcGIS Pro, Desktop, geodatabase editing, topology management)
Strong analytical, problem-solving, and data quality assurance capabilities
Ability to interpret engineering drawings, field data, and spatial reference materials
Familiarity working with infrastructure or utility network datasets
Excellent communication and collaboration skills
Bachelor's degree required in GIS, Computer Science, Software Engineering, IT, Geography, or a related field
Preferred Qualifications (Nice to Have)
Exposure to railroad infrastructure or linear transportation networks
Experience supporting Positive Train Control (PTC) data models or compliance initiatives
Working knowledge of CAD-to-GIS workflows
Experience with Enterprise GIS deployments in large-scale organizations
Soft Skills
Detail-oriented data stewardship mindset
Ability to make informed decisions and manage competing priorities
Strong teamwork and communication in a technical environment
Data Analyst or Business Analyst
Data analyst job in Alpharetta, GA
The ideal candidate will use their passion for data and analytics to provide insights to the business covering a range of topics. They will be responsible for conducting both recurring and ad hoc analysis for business users.
Responsibilities
Understand the day-to-day issues our business faces that can be better understood with data
Compile and analyze data related to those issues
Develop clear visualizations to convey complicated data in a straightforward fashion
Qualifications
Bachelor's or Master's degree in Statistics or Applied Mathematics or equivalent experience
1-2 years of data analysis experience
Proficient in SQL
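As an illustration of the recurring and ad hoc analysis this role covers, a simple SQL rollup might look like the sketch below; sqlite3 stands in for whatever database the business uses, and the table and column names are hypothetical.

```python
# Illustrative only: a recurring ad hoc analysis via SQL from Python.
# Database, table, and columns are hypothetical assumptions.
import sqlite3

conn = sqlite3.connect("business.db")  # hypothetical database file
rows = conn.execute("""
    SELECT region, COUNT(*) AS orders, AVG(order_total) AS avg_total
    FROM orders
    WHERE order_date >= DATE('now', '-30 days')
    GROUP BY region
    ORDER BY orders DESC
""").fetchall()
for region, orders, avg_total in rows:
    print(f"{region}: {orders} orders, avg ${avg_total:.2f}")
conn.close()
```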
Data Architect with low latency
Data analyst job in Atlanta, GA
Role: Data Architect with low latency
Duration: Long-Term Contract
We're seeking a seasoned Lead Software Engineer to architect, build, and scale real-time data processing platforms that power event-driven applications and analytics. You'll lead the design of streaming microservices, govern data quality and lineage, and mentor engineers while partnering with product, platform, and security stakeholders to deliver resilient, low-latency systems.
Responsibilities:
• Own design & delivery of high-throughput, low-latency streaming solutions using technologies like Confluent Kafka, Apache Flink, Hazelcast, Kafka Streams, Kafka Connect, and Schema Registry.
• Design and implement microservices and event-driven systems with robust ETL/ELT pipelines for real-time ingestion, enrichment, and delivery (see the consumer sketch after this list).
• Establish distributed caching and in-memory data grid patterns (e.g., Redis, Hazelcast) to optimize read/write performance and session/state management.
• Define and operationalize event gateways / event grids for event routing, fan-out, and reliable delivery.
• Lead data governance initiatives: standards for metadata, lineage, classifications, retention, access controls, and compliance (PII/PCI/SOX/GDPR as applicable).
• Drive CI/CD best practices (pipelines, automated testing, progressive delivery) to enable safe, frequent releases; champion DevSecOps and “shift-left” testing.
• Set SLOs/SLAs, track observability (tracing, metrics, logs), and optimize performance at scale (throughput, backpressure, state, checkpointing).
• Work with Security, Platform, and Cloud teams on networking, IAM, secrets, certificates, and cost optimization.
• Mentor engineers, conduct design reviews, and enforce coding standards and reliability patterns.
• Guide platform and delivery roadmap
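To make the idempotent-processing responsibility concrete, here is a minimal sketch of a manually committed, de-duplicating Kafka consumer using the confluent-kafka Python client. The broker address, topic, group id, and the in-memory de-dup set are illustrative assumptions, not details from the posting; a production design would use a durable store.

```python
# Minimal sketch: idempotent, at-least-once Kafka consumer (confluent-kafka).
# Broker, topic, group id, and the in-memory de-dup set are hypothetical.
from confluent_kafka import Consumer

def process(payload: bytes) -> None:
    print(payload)  # placeholder for the real enrichment/delivery step

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "enrichment-service",       # hypothetical consumer group
    "enable.auto.commit": False,            # commit only after processing
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])              # hypothetical topic

seen = set()  # stand-in for a durable de-dup store (e.g., Redis)

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        if msg.key() not in seen:           # idempotency: skip replayed keys
            process(msg.value())
            seen.add(msg.key())
        consumer.commit(message=msg)        # at-least-once: commit after work
finally:
    consumer.close()
```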
Required Qualifications:
• 10+ years in software engineering; 5+ years designing large-scale real-time or event-driven platforms.
• Expert with Confluent Kafka (brokers, partitions, consumer groups, Schema Registry, Kafka Connect), Flink (DataStream/Table API, stateful ops, checkpointing), Hazelcast, and/or Kafka Streams.
• Strong in ETL/ELT design, streaming joins/windows, exactly-once semantics, and idempotent processing.
• Experience with microservices (Java/Python), REST/gRPC, protobuf/Avro, and contract-first development.
• Hands-on with distributed caching and in-memory data grids; performance tuning and eviction strategies.
• Cloud experience with one or more platforms (Azure/AWS/GCP); containers, Docker, Kubernetes.
• Experience in production-grade CI/CD (Jenkins, Bamboo, Harness or similar), Infrastructure as Code (Terraform/Helm).
• Robust observability (Prometheus/Grafana/OpenTelemetry, Splunk/ELK or similar), and resilience patterns (circuit breakers, retries, DLQs).
• Practical data governance: metadata catalogs, lineage, encryption, RBAC.
• Excellent communication; ability to lead design, influence stakeholders, and guide cross-functional delivery.
• Core competencies include Architectural Thinking, Systems Design, Operational Excellence, Security & Compliance, Team Leadership, and Stakeholder Management.
Nice to Have:
• Experience with CDC, Kafka Connect custom connectors, Flink SQL, Beam.
• Streaming ML or feature stores integration (online/offline consistency).
• Multi region / disaster recovery for streaming platforms.
• Experience with zero-downtime migrations, blue/green, and canary deployments.
Information Technology Business Analyst
Data analyst job in Atlanta, GA
We are building a team to drive a merger & acquisition! Please apply today if you are looking to make an impact and join a like-minded group of individuals.
Sr Business Analyst
Duration: Long-term/extending contract
Pay: Targeting between $50-55/hour
Location: 3-4 days onsite in ATL
The Senior Business Analyst will play a critical role in bridging our business needs with the entire technical solutions team. This position is responsible for analyzing complex business processes, gathering requirements, and delivering actionable insights that drive strategic decisions and operational improvements.
Core Responsibilities:
o Business Analysis & Strategy: Evaluate business strategies and identify opportunities for improvement in efficiency, productivity, and profitability.
o Requirements Gathering: Collect, document, and manage business and functional requirements throughout the project lifecycle.
o Solution Design: Translate business needs into technical specifications and collaborate with IT teams to design and implement solutions.
o Stakeholder Management: Act as a liaison between business units and technical teams, ensuring alignment and clear communication.
o Process Improvement: Conduct gap analysis, benchmarking, and recommend process optimization strategies.
o Testing & Implementation: Support testing phases, validate solutions, and assist in successful rollouts.
o Reporting & Documentation: Prepare detailed reports, dashboards, and presentations for senior leadership.
Must Haves:
5+ years of experience in an IT Business Analyst role
Ability to write complex BRDs, TRDs, and Agile artifacts
Experience with SaaS/software migrations preferred
Executive presence and ability to speak to non-technical audiences
Plusses:
Experience with M&A (mergers and acquisitions)
Experience with Smartsheet
Data Architect
Data analyst job in Atlanta, GA
Note: Initial 100% onsite required for the first six months.
Employment Type: Permanent / Direct Hire / Full-time
Salary: Up to $180,000 (depending on experience) + bonus
The Role:
We're seeking a highly skilled and hands-on Data Architect to lead the design, implementation, and ongoing evolution of our enterprise-grade data systems. This role is crucial for building scalable, secure, and intelligent data infrastructure that supports core analytics, operational excellence, and future AI initiatives. Success requires a seasoned technologist who can seamlessly integrate cloud-native services with traditional data warehousing to create a modern, unified data platform.
What You'll Do:
Architecture & Strategy: Lead the design and implementation of modern data platforms, including Data Lakes, Data Warehouses, and Lakehouse architectures, to enable a single source of truth for the enterprise.
Data Modeling & Integration: Architect unified data models that support both modular monoliths and microservices-based platforms. Design and optimize high-volume, low-latency streaming/batch ETL/ELT pipelines.
Technical Leadership: Drive the technical execution across the entire data lifecycle. Build and optimize core data processing scripts using Spark and Python.
Governance & Quality: Define and enforce standards for data governance, metadata management, and data observability across distributed systems. Implement automated data lineage tracking, schema evolution, and data quality monitoring.
Cloud Infrastructure: Configure and manage cloud-native data services, including core data storage and event ingestion infrastructure.
Required Experience:
Experience: 10+ years of proven experience in enterprise data architecture and engineering.
Core Platform Expertise: Strong, hands-on experience with the Azure Data Ecosystem including Azure Data Lake Storage (ADLS), Azure Synapse Analytics (or equivalent cloud DW), and Azure Purview (or equivalent data catalog).
Processing: Deep expertise in Databricks (or Apache Spark) for ETL/ELT pipeline implementation, using Delta Lake and SQL Server (or equivalent RDBMS).
Coding & Scripting: Strong proficiency in Python, Spark, and advanced SQL.
Data Governance: Hands-on experience implementing data lineage tracking and data quality monitoring (e.g., using Great Expectations or dbt).
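As a concrete (and purely hypothetical) example of the Spark-based ETL with data quality monitoring called for above, here is a minimal PySpark sketch; the ADLS paths, column names, and 5% rejection threshold are assumptions, not details of the client's platform.

```python
# Hedged sketch: PySpark ETL step with a simple data-quality gate.
# Paths and columns are hypothetical; Delta write assumes delta-spark is configured.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical ADLS landing path; any Spark-readable source works the same way.
raw = spark.read.json("abfss://landing@lake.dfs.core.windows.net/orders/")

clean = (
    raw.dropDuplicates(["order_id"])                   # hypothetical key column
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)                    # simple quality rule
)

# Quality gate: fail loudly if too many rows were rejected.
total, kept = raw.count(), clean.count()
if total and (total - kept) / total > 0.05:
    raise ValueError(f"Quality gate failed: {total - kept} of {total} rows rejected")

clean.write.format("delta").mode("append").save(
    "abfss://curated@lake.dfs.core.windows.net/orders/"  # hypothetical Delta target
)
```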
Preferred Skills:
Semantic Technologies: Hands-on experience developing ontology frameworks using OWL, RDF, and SPARQL to enable semantic interoperability.
Advanced AI Data: Experience integrating structured/unstructured data into Knowledge Graphs and Vector Databases.
Streaming/Telemetry: Experience developing and maintaining semantic telemetry pipelines using services like Azure Event Hubs or Kafka.
Emerging Concepts: Exposure to linked data ecosystems, data mesh, or data fabric concepts.
Technical Research Analyst (467230)
Data analyst job in Atlanta, GA
IDR is seeking a Technical Research Analyst to join one of our top clients in Atlanta, GA. This role offers a unique opportunity to delve into the convergence of data and power within the electrical industry. If you are eager to be part of a dynamic organization and thrive in a collaborative, team-oriented environment, we encourage you to apply today!
Position Overview & Responsibilities for the Technical Research Analyst:
Collaborate with leading corporations, educational institutions, and research centers on R&D projects.
Conduct in-depth analysis of emerging technological trends in the convergence of data and power.
Develop comprehensive documentation for research findings, technology evaluations, and market studies.
Present research outcomes and technology roadmaps to executive teams, ensuring alignment with strategic goals.
Required Skills for Technical Research Analyst:
3-5+ years of experience in Power, Electrical, or Industrial Engineering, with a focus on the electrical industry.
Bachelor's degree in Business, IT, or a related field; advanced degrees preferred.
Strong analytical skills to interpret complex data and trends.
Experience with AI and ML in power or utility systems.
Excellent interpersonal skills for effective collaboration and presentation.
What's in it for you?
Competitive compensation package and full benefits: medical, vision, dental, and more!
Opportunity to get in with an industry leading organization
Close-knit and team-oriented culture
ServiceNow Agentic AI / Virtual Agent (VA) Business Analyst
Data analyst job in Atlanta, GA
The ServiceNow Agentic AI / Virtual Agent (VA) Business Analyst will serve as the primary liaison between procurement stakeholders, finance teams, and technical developers to design, implement, and optimize Agentic AI / Virtual Agent (VA) processes on the ServiceNow platform. This role focuses on gathering requirements, mapping workflows, and ensuring seamless integration of Agentic AI / Virtual Agent (VA) within the enterprise service management ecosystem.
This role will participate in the design of Virtual Agent experiences and lead integrations between ServiceNow and enterprise telephony/CTI platforms. You will drive discovery workshops, translate business goals into measurable outcomes, and partner with architects and developers to deliver Agentic AI capabilities (Now Assist, VA Skills, conversational flows) that reduce effort and improve CX/EX.
Key Responsibilities
• Requirements Gathering & Analysis:
o Collaborate with procurement, finance, and IT teams to capture functional and technical requirements for Agentic AI / Virtual Agent (VA) workflows.
o Translate business needs into detailed user stories and acceptance criteria.
• Process Design & Optimization:
o Document and optimize end-to-end Source-to-Pay processes, including sourcing, requisitioning, purchase orders, invoicing, and supplier management.
o Ensure alignment with organizational policies and compliance standards.
• ServiceNow Platform Expertise:
o Work with architects and developers to configure and customize ServiceNow Procurement and Financial modules.
o Validate workflows, integrations, and reporting dashboards against business requirements.
• Stakeholder Management:
o Facilitate workshops, demos, and UAT sessions with business and technical teams.
o Act as a trusted advisor for procurement transformation initiatives.
• Data & Reporting:
o Define reporting requirements for spend analysis, supplier performance, and compliance.
o Support CMDB and CSDM alignment for procurement-related services.
• Governance & Compliance:
o Ensure adherence to financial regulations, procurement policies, and audit requirements.
o Support risk management and continuous improvement initiatives.
Required Skills & Qualifications
• Strong understanding of Agentic AI / Virtual Agent (VA) processes and procurement best practices.
• Experience with related ServiceNow modules.
• Familiarity with ITIL v4 and enterprise service management principles.
• Excellent business analysis skills: requirements gathering, process mapping, documentation.
• Strong communication and stakeholder engagement skills.
• Certifications:
o ServiceNow Certified System Administrator (CSA)
o Certified Implementation Specialist certifications
o Agentic AI / Virtual Agent (VA) related certifications
o ITIL v4 Foundation
• Experience:
o 5+ years in Agentic AI / Virtual Agent (VA) roles.
o 3+ years working with ServiceNow or similar platforms in a BA capacity.
o Experience in large-scale enterprise environments.
Business Analyst
Data analyst job in Alpharetta, GA
Care Logistics is seeking a detail-oriented, creative, and collaborative Business Analyst to join our Integrated Solutions team. The Business Analyst plays an integral role in the enterprise-wide operational transformation and technology implementation initiatives across our healthcare client base. This role is responsible for gaining an understanding of the client's current operations, assisting in designing the ideal future state, and defining and managing the integration needed between Care Logistics software and client systems. The Business Analyst serves as a liaison between operations and technology teams to ensure implemented solutions deliver desired outcomes for both clients and the company. With an understanding of the desired future state, the capabilities of Care Logistics technology, and familiarity with hospital EMR systems, the Business Analyst is responsible for designing how to best leverage each technology investment and how data is exchanged between the various systems to achieve efficient, patient-centered workflows. Internally at Care Logistics, the Business Analyst serves as a subject matter expert for product management, providing business process and integration knowledge to inform future products. The ideal candidate will have experience with HL7 messaging and healthcare IT systems, and will thrive in a fast-paced, client-focused environment.
ESSENTIAL RESPONSIBILITIES:
Collaborate with cross-functional teams to gather business requirements, identify opportunities and translate them into technical specifications and actionable insights.
Ability to grasp clinical processes and translate them into system requirements for technology teams.
Analyze data from EMR, clinical and scheduling systems to identify additional functionality opportunities, and integration needs.
Critically evaluate gathered information from multiple sources, reconcile conflicts, relate high level information to details, and distinguish user requests from underlying business problems/needs.
Serve as a technical advisor to the organization's industrial engineers, healthcare operations executives, and other internal roles.
Participate in and co-lead virtual and in-person integration interviews, discussions, and project meetings with both internal and external teams.
Help to design future state processes that align business requirements with the capability of the client's system and technology.
Develop and execute interface test plans to validate integration processing and system response.
Participate in go-live support and post-implementation optimization.
Develop and maintain internal and external documentation.
Provide input into developing and modifying client and Care Logistics systems to meet client needs.
Collaborate with Client Services to ensure realization of client goals and estimated ROI.
Other duties as assigned.
QUALIFICATIONS - EDUCATION, WORK EXPERIENCE, CERTIFICATIONS:
REQUIRED
Bachelor's degree in Engineering, Information Technology or equivalent combination of education and experience
3-5 years of business analyst or related experience
Outstanding analytical skills with the ability to critically evaluate the information gathered from multiple sources, reconcile conflicts, relate high-level information to details, and distinguish user requests from the underlying true business problem/needs.
Ability to identify inefficiencies, propose solutions, and evaluate outcomes effectively.
Above average observational skills to recognize opportunities, collect data and validate information
Proficiency with Microsoft products such as Outlook, Word, Excel and PowerPoint.
Comfort in leading discussions, facilitating interviews, and presenting findings to diverse audiences.
PREFERRED
Working knowledge of HL7 preferred (see the parsing sketch after this list).
Experience with or knowledge of hospital processes is strongly desired.
Experience with Electronic Health Record (EHR) platforms, including ADT, clinical, and ordering processes, preferred.
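Since HL7 messaging comes up in this role, here is a minimal, illustrative sketch of pulling a few fields from an HL7 v2 ADT message using only the Python standard library; the sample message is fabricated for the example, and real feeds vary by sender and version.

```python
# Illustrative HL7 v2 field extraction with the standard library only.
# The sample ADT message below is fabricated; segments are '\r'-delimited
# and fields are '|'-delimited per the HL7 v2 encoding rules.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|EMR|HOSP|CL|CL|202501011200||ADT^A01|MSG0001|P|2.5",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F",
    "PV1|1|I|ICU^101^A",
])

segments = {line.split("|")[0]: line.split("|") for line in SAMPLE_ADT.split("\r")}
event = segments["MSH"][8]              # MSH-9 message type, e.g., ADT^A01 (admit)
mrn = segments["PID"][3].split("^")[0]  # PID-3: medical record number
unit = segments["PV1"][3].split("^")[0] # PV1-3: assigned patient location
print(f"Event {event}: patient MRN {mrn} admitted to {unit}")
```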
KNOWLEDGE, SKILLS, AND ABILITIES:
Adaptability:
Effectively copes with rapidly changing information and a fast-paced environment.
Ability to maintain confidentiality and use discretion.
Time Management:
Strong organizational and quality management skills with ability to handle multiple competing tasks and priorities.
Customer Service:
An ability to actively seek to understand the needs of others and provide a positive experience that addresses their needs, questions, and concerns.
Ability to form a team bond and enhance team performance.
Strong interpersonal skills with ability to effectively and tactfully communicate, in both written and verbal form, with a diverse group of stakeholders (prospects, clients, hospital executives, nurses, client development staff, product management staff, and development staff).
TRAVEL REQUIREMENTS & WORKING CONDITIONS:
30-70% travel is required.
Technical Data Architect
Data analyst job in Atlanta, GA
Exempt
Oldcastle Infrastructure™, a CRH company, is the leading provider of utility infrastructure solutions for the water, energy, and communications markets throughout North America. We're more than just a manufacturer of precast concrete, polymer concrete, or plastic products. We're a trusted and strategic partner to engineers, contractors, distributors, specifiers, and more. With our network of more than 80 manufacturing facilities and more than 4,000 employees, we're leading the industry with innovation and a safety-first mindset.
Job Summary
Oldcastle Infrastructure (OI), as part of CRH's Infrastructure Products Group (IPG), is a global manufacturing leader of utility infrastructure products. Our goal is to be the most efficient producer of engineered systems and our customers' strategic partner of choice. A crucial part of OI's journey is the investment in new digital tools including a new ERP. With a modern, common platform, OI will unlock the benefits of its scale, deliver a better customer experience, and build a foundation for continuous process improvement.
The Technical Data Architect is a senior role accountable for defining, governing, and delivering the data architecture strategy required to migrate enterprise data from legacy systems into SAP S/4HANA and Salesforce CPQ. This role ensures that data models, migration approaches, and governance structures support end-to-end business processes and regulatory compliance, while delivering high-quality, reconciled, and auditable data into the template. The architect will partner with the business data management team, program management office, functional process owners, and system integrators to ensure a seamless transition with minimal disruption to operations.
Job Location
This role will work hybrid out of our office in the Sandy Springs, GA area.
Job Responsibilities
Data Architecture Modeling
Design target SAP S/4HANA data models and mapping rules from legacy systems. Validate functional data alignment for Finance (FI/CO), Sales & Distribution (SD), Materials Management (MM) and Production Planning (PP).
Leverage CRH IPG Data Dictionary, Data Management and ETL migration tools to support the cleansing and data migration processes.
Provide technical capabilities to support data quality and data reconciliation for master data subjects.
ERP Data Migration
Collaborate with the business Master Data team on the legacy data migration by supporting the technical requirements for Customers, Vendors, BOMs, Products and other master data subjects.
Define extraction, transformation, load, and reconciliation processes with automation where possible.
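As an illustration of automated load reconciliation, the sketch below compares a legacy extract against a post-load extract; pandas stands in for the actual migration tooling (SAP Data Services, Informatica, etc.), and the file and column names are hypothetical.

```python
# Hedged sketch of automated migration reconciliation checks.
# File names and key columns are hypothetical assumptions.
import pandas as pd

legacy = pd.read_csv("legacy_customers.csv")  # extract from the legacy system
loaded = pd.read_csv("s4_customers.csv")      # post-load extract from S/4HANA

checks = {
    "row_count": (len(legacy), len(loaded)),
    "credit_limit_sum": (legacy["credit_limit"].sum(), loaded["credit_limit"].sum()),
}
missing = set(legacy["customer_id"]) - set(loaded["customer_id"])

for name, (src, tgt) in checks.items():
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{name}: source={src} target={tgt} [{status}]")
print(f"keys missing in target: {len(missing)}")
```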
Master Data Management
Partner with the Business Master Data team to align on the governance model, ownership, and ongoing stewardship processes for core data subjects.
Define and support the data migration testing strategy, including mock loads, trial conversions, and dress rehearsals. Partner with business master data team and users for the validation and sign-off at each migration stage.
Design cutover sequencing for data loads, ensuring minimal downtime. Coordinate with functional leads and the PMO on the entry/exit criteria and contingency planning for go-live events related to data quality readiness.
Job Requirements
5-8+ years of experience working in Data Architecture in the manufacturing industry
Proven track record in delivering large-scale data migrations (CPQ, OTC, Finance, Supply Chain, Manufacturing P2P).
Hands-on experience with ETL/migration tools (SAP Data Services, Informatica, etc.).
Strong knowledge of data governance, master data management, and audit/compliance processes.
Process improvement knowledge gained while working in an organization undergoing a significant operational culture shift
Creation and improvement of processes that demonstrate ease of doing business internally and externally
Development and implementation of process adherence and data quality adoption metrics
Comfortable operating in environment of ambiguity and fast change
Strong interpersonal and organizational influencing skills
Ability to communicate in a simple, articulate, thoughtful manner to varying audience levels
Innovative spirit to work cross-functionally in developing improvement ideas
A pleasant, likeable manner while accomplishing challenging results
Bachelor's degree in computer science or technical related discipline
SAP Technical Certifications in Master Data/Data Services/MDG (preferred)
PMP Certification (preferred)
What CRH Offers You
Highly competitive base pay
Comprehensive medical, dental and disability benefits programs
Group retirement savings program
Health and wellness programs
An inclusive culture that values opportunity for growth, development, and internal promotion
About CRH
CRH has a long and proud heritage. We are a collection of hundreds of family businesses, regional companies and large enterprises that together form the CRH family. CRH operates in a decentralized, diversified structure that allows you to work in a small company environment while having the career opportunities of a large international organization.
If you're up for a rewarding challenge, we invite you to take the first step and apply today! Once you click apply now, you will be brought to our official employment application. Please complete your online profile and it will be sent to the hiring manager. Our system allows you to view and track your status 24 hours a day. Thank you for your interest!
Oldcastle Infrastructure, a CRH Company, is an Affirmative Action and Equal Opportunity Employer.
EOE/Vet/Disability
CRH is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, status as a protected veteran or any other characteristic protected under applicable federal, state, or local law.
Workday Financial Configuration Analyst
Data analyst job in Atlanta, GA
TRC's client, an industry leader in supply chain software, has a direct-hire opportunity for a senior-level Workday Financials Analyst!
They are looking for a Senior Enterprise Analyst who can bridge the gap between business needs and Workday Financials. This person will gather requirements, configure workflows, support UAT, troubleshoot issues, and serve as the first point of contact for Finance & Accounting users.
What You'll Do
• Meet with business partners to understand requirements, document processes, and translate needs into user stories and test scripts.
• Perform hands-on configuration in Workday Financials, Revenue Management, and Procurement.
• Create custom reports, calculated fields, and perform data loads using EIBs.
• Troubleshoot Workday Financials issues and guide users through resolutions.
• Support testing cycles (UAT), including test scripting, execution, and user support.
• Create and maintain process flows, diagrams, and documentation.
• Assist with new feature rollouts, enhancements, and process improvements.
• Provide user training and act as the first-line support contact for system questions.
What We're Looking For
• 5+ years of business analysis experience in Finance/Accounting environments (Q2C, P2P, OTC, RTR).
• 5+ years supporting Workday Financials, including configuration and reporting.
• Experience creating Workday custom reports, calculated fields, and performing EIB data loads.
• Strong documentation skills (Visio, process mapping, requirements).
• Experience supporting UAT and writing test scripts.
• Familiarity with Jira, Confluence, SharePoint, and IT ticketing systems (ServiceNow, Jira, etc.).
• Strong communication skills and the ability to work with both technical and business teams.
• Ability to work independently, manage your own workload, and adapt to changing priorities.
Strategic Business Analyst
Data analyst job in Lawrenceville, GA
Our client, a local government administration organization, is actively looking for a Strategic Business Analyst to join their Business Strategic Service team in Lawrenceville, GA!
This role is fully onsite so local candidates are required.
***This is an ongoing contract with an initial 1-year duration***
This person will assist in initiative discovery, research, analysis, and the creation of technology solicitations such as Requests for Proposals (RFPs), as well as the facilitation and management of business and functional requirements prior to project creation.
Responsibilities
Lead business requirements gathering sessions with a diverse set of stakeholders to generate detailed solicitation requirements and content using pre-created templates.
Facilitate business requirements gathering and author the artifacts from that effort such as Business Requirements Documents (BRDs).
Create technology Request for Proposal content in prescribed templates that include extensive Word documents and accompanying Excel Workbooks.
Perform assigned primary and secondary research in support of business cases, technology research, and other Business Relationship Management deliverables.
Create reports, analysis, and visualizations to support technology selections.
Create technology inventories, business architectures, process swim lanes (as-is and future state), and technology comparisons.
Assist BRMs in the creation of project hand-off documentation using pre-created templates.
Author business cases in support of new technology that includes financial analysis.
Facilitate customer journey mapping sessions with customer departments.
Describe and validate solutions that meet business needs, goals, and objectives.
Required Skills & Experience
Bachelor's degree in computer science, business administration, or related field.
ITIL and/or Business Analysis certifications in requirements gathering are a plus.
Prior work experience at or for technology or government solutions providers or customers.
8+ years' experience facilitating and producing business requirements.
5+ years' experience authoring technology solicitation content such as RFPs.
Experience facilitating customer-facing requirements gathering and/or journey mapping.
Experience authoring business cases.
Ability to create compelling presentations and reports.
Proficient in MS Office Suite (Word, Excel, PowerPoint).
Experience in Visio, MS Project, and SharePoint.
Senior Data Architect
Data analyst job in Atlanta, GA
Long-term opportunity with a rapidly growing company!
RESPONSIBILITIES:
Own end-to-end data architecture for enterprise SaaS platforms, including both OLTP and analytical serving layers
Design and operate solutions across Azure SQL DB/MI, Azure Databricks with Delta Lake, ADLS Gen2, Synapse Analytics / Microsoft Fabric
Partner with analytics teams on Power BI semantic models, including performance optimization and row-level security (RLS)
Define and implement Information Lifecycle Management (ILM): hot/warm/cold tiers, 2-year OLTP retention, archive/nearline, and a BI mirror that enables rich analytics without impacting production workloads.
Engineer ERP/SAP financial interfaces for idempotency, reconciliation, and traceability; design rollback/de-dup strategies and financial journal integrity controls.
Govern schema evolution/DbVersions to prevent cross-customer regressions while achieving performance gains
Establish data SLOs (freshness, latency, correctness) mapped to customer SLAs; instrument monitoring/alerting and drive continuous improvement.
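As a hedged sketch of how a freshness SLO like the one above might be instrumented, the following checks a watermark column against a target; the DSN, table, and 15-minute threshold are assumptions, and pyodbc plus a print statement stand in for the real connection and alerting stack.

```python
# Hedged sketch of a data-freshness SLO check.
# DSN, table, watermark column, and threshold are hypothetical.
import datetime as dt
import pyodbc

SLO_MINUTES = 15  # hypothetical freshness target mapped to a customer SLA

conn = pyodbc.connect("DSN=azuresql")  # assumed ODBC data source
row = conn.execute(
    "SELECT MAX(loaded_at) FROM dbo.fact_orders"  # hypothetical watermark column
).fetchone()

lag = dt.datetime.utcnow() - row[0]
if lag > dt.timedelta(minutes=SLO_MINUTES):
    # In production this would page via the alerting stack; print stands in.
    print(f"SLO BREACH: fact_orders is {lag} behind (target {SLO_MINUTES}m)")
else:
    print(f"OK: freshness lag {lag}")
```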
This is a direct-hire opportunity in Atlanta. Work onsite the first 5-6 months, then transition to a hybrid schedule of 3 days in the office with 2 days remote (flex days).
REQUIRED SKILLS:
10+ years of experience in data or database engineering
5-8+ years owning data or database architecture for customer-facing SaaS or analytics platforms at enterprise scale
Proven experience operating at multi-terabyte scale (5+ TB) with measurable improvements in performance, reliability, and cost
Strong expertise with Azure data technologies
Advanced SQL skills, including query optimization, indexing, partitioning, CDC, caching, and schema versioning
Experience designing audit-ready, SLA-driven data platforms
Strong background in ERP/SAP data integrations, particularly financial data
Bachelor's degree
PREFERRED SKILLS:
Power BI performance modeling (RLS, composite models, incremental refresh, DAX optimization).
Modular monolith/microservices experience
Semantic tech (ontology/knowledge graphs), vector stores, and agentic AI orchestration experience
Must be authorized to work in the US. Sponsorships are not available.
Senior Data Architect
Data analyst job in Dunwoody, GA
At MTech Systems, our company mission is to increase yield in protein production to help feed the growing world population without compromising animal welfare or damaging the planet. We aim to create software that delivers real-time data to the entire supply chain that allows producers to get better insight into what is happening on their farms and what they can do to responsibly improve production.
MTech Systems is a prominent provider of tools for managing performance in Live Animal Protein Production. For over 30 years, MTech Systems has provided cutting-edge enterprise data solutions for all aspects of the live poultry operations cycle. We provide our customers with solutions in Business Intelligence, Live Production Accounting, Production Planning, and Remote Data Management-all through an integrated system. Our applications can currently be found running businesses on six continents in over 50 countries. MTech has built an international reputation for equipping our customers with the power to utilize comprehensive data to maximize profitability.
With over 250 employees globally, MTech Systems currently has main offices in Mexico, the United States, and Brazil, with additional resources in key markets around the world. MTech Systems USA's headquarters is based in Atlanta, Georgia and has approximately 90 team members in a casual, collaborative environment. Our work culture here is based on a commitment to helping our clients feed the world, resulting in a flexible and rewarding atmosphere. We are committed to maintaining a work culture that enhances collaboration, provides robust development tools, offers training programs, and allows for direct access to senior and executive management.
Job Summary
MTech builds customer-facing SaaS & analytics products used by global enterprise customers. You will own the database/data platform architecture that powers these products, driving performance, reliability, auditability, and cost efficiency at multi-tenant, multi-terabyte scale. Success is measured in hard outcomes: fewer P1s/support tickets, faster queries, bullet-proof ERP/SAP integrations, SLO compliance tied to SLAs, and audit-ready evidence.
Responsibilities and Duties
Architecture & Design
Own the end-to-end data architecture for enterprise SaaS (OLTP + analytical serving), including Azure SQL/MI, Databricks/Delta Lake, ADLS, Synapse/Fabric, and collaboration on Power BI semantic models (RLS, performance).
Define and implement Information Lifecycle Management (ILM): hot/warm/cold tiers, 2-year OLTP retention, archive/nearline, and a BI mirror that enables rich analytics without impacting production workloads.
Engineer ERP/SAP financial interfaces for idempotency, reconciliation, and traceability; design rollback/de-dup strategies and financial journal integrity controls.
Govern schema evolution/DbVersions to prevent cross-customer regressions while achieving performance gains.
Establish data SLOs (freshness, latency, correctness) mapped to customer SLAs; instrument monitoring/alerting and drive continuous improvement.
Operations & Observability
Build observability for pipelines and interfaces (logs/metrics/traces, lineage, data quality gates) and correlate application telemetry (e.g., Stackify/Retrace) with DB performance for rapid root-cause analysis.
Create incident playbooks (reprocess, reconcile, rollback) and drive MTTR down across data incidents.
Collaboration & Leadership
Lead the DBA/DB engineering function (standards, reviews, capacity planning, HA/DR, on-call, performance/availability SLOs) and mentor data engineers.
Partner with Product/Projects/BI to shape domain models that meet demanding customer reporting (e.g., Tyson Matrix) and planning needs without compromising OLTP.
Required Qualifications
15+ years in data/database engineering; 5-8+ years owning data/DB architecture for customer-facing SaaS/analytics at enterprise scale.
Proven results at multi-terabyte scale (≥5 TB) with measurable improvements (P1 reduction, MTTR, query latency, cost/performance).
Expertise in Azure SQL/MI, Databricks/Delta Lake, ADLS, Synapse/Fabric; deep SQL, partitioning/indexing, query plans, CDC, caching, schema versioning.
Audit & SLA readiness: implemented controls/evidence to satisfy SOC 1 Type 2 (or equivalent) and run environments to SLOs linked to SLAs.
ERP/SAP data interface craftsmanship: idempotent, reconciled, observable financial integrations.
ILM/Archival + BI mirror design for queryable archives/analytics without OLTP impact.
Preferred Skills
Power BI performance modeling (RLS, composite models, incremental refresh, DAX optimization).
Modular monolith/microservices experience (plus, not required).
Semantic tech (ontology/knowledge graphs), vector stores, and agentic AI orchestration experience (advantage, not required).
EEO Statement
Integrated into our shared values is MTech's commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. MTech aims to maintain a global inclusive workplace where every person is regarded fairly, appreciated for their uniqueness, advanced according to their accomplishments, and encouraged to fulfill their highest potential. We believe in understanding and respecting differences among all people. Every individual at MTech has an ongoing responsibility to respect and support a globally diverse environment.
ML Engineer with Timeseries data experience
Data analyst job in Atlanta, GA
Role: ML Engineer with Timeseries data experience
Hybrid in Atlanta, GA (locals preferred)
$58/hr on C2C, Any Visa
Model Development: Design, build, train, and optimize ML/DL models for time-series forecasting, prediction, anomaly detection, and causal inference.
Data Pipelines: Create robust data pipelines for collection, preprocessing, feature engineering, and labeling of large-scale time-series data.
Scalable Systems: Architect and implement scalable AI/ML infrastructure and MLOps pipelines (CI/CD, monitoring) for production deployment.
Collaboration: Work with data engineers, software developers, and domain experts to integrate AI solutions.
Performance: Monitor, troubleshoot, and optimize model performance, ensuring robustness and real-world applicability.
Languages & Frameworks: Good understanding of AWS Framework, Python (Pandas, NumPy), PyTorch, TensorFlow, Scikit-learn, PySpark.
ML/DL Expertise: Strong grasp of time-series models (ARIMA, Prophet, deep learning), anomaly detection, and predictive analytics (see the sketch after this list).
Data Handling: Experience with large datasets, feature engineering, and scalable data processing.
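For a flavor of the time-series modeling named above, here is a minimal ARIMA sketch with statsmodels on a synthetic series; the (1, 1, 1) order is arbitrary for illustration, and order selection, backtesting, and validation are omitted.

```python
# Minimal time-series forecasting sketch (ARIMA via statsmodels).
# The synthetic trend-plus-noise series below is for illustration only.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(0)  # reproducible synthetic data
rng = pd.date_range("2024-01-01", periods=200, freq="D")
y = pd.Series(
    10 + 0.05 * np.arange(200) + np.random.normal(0, 1, 200),
    index=rng,
)

model = ARIMA(y, order=(1, 1, 1)).fit()  # (p, d, q) chosen only for the sketch
forecast = model.forecast(steps=7)       # 7-day-ahead point forecast
print(forecast)
```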
W2 Opportunity // GCP Data Engineer // Atlanta, GA
Data analyst job in Atlanta, GA
Job Description: GCP Data Engineer
Rate: $50/hr. on W2 (No C2C)
We are seeking a highly skilled GCP Data Engineer to design, build, and optimize cloud-native data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate has strong experience with Python, BigQuery, Cloud Data Fusion, and core GCP services such as Cloud Composer, Cloud Storage, Cloud Functions, and Pub/Sub. This role requires a strong foundation in data warehousing concepts and scalable data engineering practices.
Responsibilities
Design, develop, and maintain robust ETL/ELT pipelines on Google Cloud Platform.
Build and optimize data workflows using Cloud Data Fusion, BigQuery, and Cloud Composer.
Write efficient and maintainable Python code to support data ingestion, transformation, and automation.
Develop optimized BigQuery SQL for analytics, reporting, and large-scale data modeling (see the sketch after this list).
Utilize GCP services such as Cloud Storage, Pub/Sub, and Cloud Functions to build event-driven and scalable data solutions.
Ensure data quality, governance, and reliability across all pipelines.
Collaborate with cross-functional teams to deliver clean, trusted, production-ready datasets.
Monitor, troubleshoot, and resolve performance issues in cloud data pipelines and workflows.
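As a sketch of the BigQuery work described above, the following uses the official google-cloud-bigquery client; the project, dataset, and column names are hypothetical, and credentials are assumed to come from the environment.

```python
# Hedged sketch of a BigQuery aggregation step.
# Project, dataset, table, and columns are hypothetical assumptions.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT store_id,
           DATE(order_ts) AS order_date,
           SUM(amount)    AS daily_sales
    FROM `my-project.sales.orders`
    GROUP BY store_id, order_date
"""
job = client.query(sql)    # runs asynchronously in BigQuery
for row in job.result():   # blocks until the job completes
    print(row.store_id, row.order_date, row.daily_sales)
```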
Must-Have Skills
Strong experience with GCP BigQuery (data modeling, SQL development, performance tuning).
Proficiency in Python for data engineering and pipeline automation.
Hands-on experience with Cloud Data Fusion for ETL/ELT development.
Working experience with key GCP services:
Cloud Composer
Cloud Storage
Cloud Functions
Pub/Sub
Strong understanding of data warehousing concepts, star/snowflake schemas, and best practices.
Solid understanding of cloud data architecture and distributed processing.
Good-to-Have Skills
Experience with Vertex AI for ML pipeline integration or model deployment.
Familiarity with Dataproc (Spark/Hadoop) for large-scale processing.
Knowledge of CI/CD workflows, Git, and DevOps best practices.
Experience with Cloud Logging/Monitoring tools.
Lead Azure Databrick Engineer
Data analyst job in Atlanta, GA
***Individual contractors (W2/1099) are encouraged to apply. Visa sponsorship is not available for this role at this time.***
An Azure Data Engineer is responsible for designing, implementing, and maintaining the data infrastructure within an organization. They collaborate with both business and IT teams to understand stakeholders' needs and unlock the full potential of data. They create conceptual and logical data models, analyze structural requirements, and ensure efficient database solutions.
Must Have Skills:
Experience migrating from other platforms to Databricks
Proficiency in Databricks and Azure Cloud, Databricks Asset Bundles, and a holistic vision of the data strategy
Proficiency in data streaming and data modeling (see the streaming sketch after this list)
Experience in architecting at least two large-scale big data projects
Strong understanding of data scaling and its complexities
Experience with data archiving and purging mechanisms
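A minimal Structured Streaming sketch of the streaming-plus-Delta pattern these skills describe, assuming a Databricks environment where Delta Lake is configured; the table paths and column names are hypothetical.

```python
# Hedged sketch: PySpark Structured Streaming with windowed aggregation to Delta.
# Paths and columns are hypothetical; Delta support is assumed (as on Databricks).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

events = (
    spark.readStream.format("delta")
         .load("/mnt/bronze/events")  # hypothetical bronze table
)

hourly = (
    events.withWatermark("event_ts", "1 hour")  # bound state for late data
          .groupBy(F.window("event_ts", "1 hour"), "device_id")
          .agg(F.count("*").alias("event_count"))
)

(hourly.writeStream.format("delta")
       .outputMode("append")          # append works with watermarked windows
       .option("checkpointLocation", "/mnt/chk/events_hourly")
       .start("/mnt/silver/events_hourly"))
```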
Job Requirements
• Degree in computer science or equivalent preferred
• Demonstrable experience in architecture, design, implementation, and/or support of highly distributed applications with Azure cloud and Databricks.
• 10+ years of hands-on experience with data modeling, database design, data mining, and segmentation techniques.
• Working knowledge and experience with "Cloud Architectures" (e.g., SaaS, PaaS, IaaS) and the ability to address the unique security considerations of secure Cloud computing
• Should have architected solutions for Cloud environments such as Microsoft Azure and/or GCP
• Experience with debugging and performance tuning in distributed environments
• Strong analytical skills with the ability to collect, organize, analyze, and broadcast significant amounts of information with attention to detail and accuracy
• Experience dealing with structured and unstructured data.
• Must have Python and PySpark experience.
• Experience in ML or/and graph analysis is a plus
Data Engineer - OrcaWorks AI
Data analyst job in Atlanta, GA
Experience Level: Entry-level (Master's preferred)
About OrcaWorks AI
At OrcaWorks AI, we're building next-generation AI systems that empower businesses to make data-driven decisions with intelligence and speed. We're seeking passionate Data Engineers who love solving real-world data challenges and want to be part of a growing team building cutting-edge AI infrastructure.
Key Responsibilities
Design, develop, and maintain data pipelines using tools like Airbyte and Prefect to feed AI and machine learning models (see the flow sketch after this list).
Integrate data from multiple structured and unstructured sources into unified and queryable layers using ElasticSearch or Vespa.
Implement data validation, transformation, and storage solutions using modern ETL frameworks.
Collaborate with AI, LLM, and data science teams to ensure reliable and optimized data flow for model training.
Support database management, SQLModel, and data governance practices across services.
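To illustrate the Prefect orchestration mentioned above, here is a minimal flow sketch (Prefect 2.x assumed); the task bodies are placeholders for the real Airbyte syncs, transformations, and ElasticSearch/Vespa index loads.

```python
# Hedged sketch of a Prefect 2.x ETL flow; task bodies are placeholders.
from prefect import flow, task

@task(retries=2)
def extract() -> list[dict]:
    return [{"id": 1, "text": "hello"}]  # stand-in for an Airbyte sync

@task
def transform(records: list[dict]) -> list[dict]:
    return [r | {"text": r["text"].upper()} for r in records]

@task
def load(records: list[dict]) -> None:
    print(f"indexed {len(records)} records")  # stand-in for an ES/Vespa load

@flow
def pipeline():
    load(transform(extract()))

if __name__ == "__main__":
    pipeline()
```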
Required Skills & Qualifications
Master's degree (or Bachelor's with equivalent experience) in Computer Science, Information Systems, or Data Engineering.
Proficiency in Python and SQL; experience with PySpark or equivalent ETL frameworks.
Hands-on experience with Airbyte, Prefect, and DBT.
Familiarity with search and indexing systems like Vespa or ElasticSearch.
Knowledge of cloud data platforms (AWS, GCP, or Azure) and API integration.
Strong understanding of data security and applied AI workflows.
Lead Data Engineer - Palantir Foundry
Data analyst job in Atlanta, GA
Our technology organization is transforming how we work at WestRock. We align with our businesses to deliver innovative solutions that:
Address specific business challenges, integrate processes, and create great experiences
Connect our work to shared goals that propel WestRock forward in the Digital Age
Imagine how technology can advance the way we work by using disruptive technology
We are looking for forward thinking technologists that can accelerate our focus areas such as building stronger foundational technology capabilities, reducing complexity, employing digital transformation concepts, and leveraging disruptive technology.
As a Lead Data Engineer, you will play a pivotal role in building and scaling modern data infrastructure that powers decision-making across production, supply chain, and operations. You will help define and analyze business requirements for enterprise-scale reports, evaluate business use cases for data engineering problems, and help design and develop processing solutions with cloud-based ETL technologies.
How you will impact WestRock:
Architect and implement scalable data pipelines using Palantir Foundry (pipelines, workshops, ontology) to unify and transform operational data.
Design and develop robust data workflows using Python, Apache Airflow, and Apache Spark to support real-time and batch processing needs (see the DAG sketch after this list).
Build and deploy solutions on cloud platforms (AWS or Azure), ensuring high availability, security, and performance.
Collaborate with data scientists, analysts, and operations teams to deliver actionable insights and operational tooling.
Define and enforce data engineering best practices, including CI/CD automation, version control (Git), and testing strategies.
Mentor junior developers, conduct code reviews, and help shape the technical roadmap for the data platform.
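As an illustration of the Airflow orchestration described above, a minimal TaskFlow-style DAG sketch follows (Airflow 2.4+ assumed for the schedule parameter); the task bodies, paths, and DAG name are placeholders, and Foundry-specific steps are represented generically.

```python
# Hedged sketch of an Airflow TaskFlow DAG; names and paths are hypothetical.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def operations_refresh():

    @task
    def extract() -> str:
        return "s3://raw/operations/latest"  # stand-in for a source extract

    @task
    def transform(path: str) -> str:
        # In practice this might submit a Spark job or a Foundry transform.
        return path.replace("raw", "curated")

    @task
    def publish(path: str) -> None:
        print(f"published {path}")

    publish(transform(extract()))

operations_refresh()
```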
What you need to succeed:
Education: Bachelor's degree in computer science or similar
At least 6 years of strong Data Engineering experience
Hands-on experience with Palantir Foundry, including pipelines, ontology modeling, and workshop development.
Strong programming skills in Python or Java, with experience building and maintaining production-grade data pipelines.
Proficiency in Apache Airflow and Apache Spark for workflow orchestration and large-scale data processing.
Proven experience deploying data solutions on AWS or Azure, with strong understanding of cloud-native services.
Familiarity with Git for version control and CI/CD pipelines for automated testing and deployment.
Demonstrated ability to mentor junior engineers, lead projects, and work independently in a fast-paced environment.
Good communication skills, with the ability to collaborate effectively across technical and non-technical teams.
Good analytical and troubleshooting abilities.
What we offer:
Corporate culture based on integrity, respect, accountability and excellence
Comprehensive training with numerous learning and development opportunities
An attractive salary reflecting skills, competencies and potential
A career with a global packaging company where Sustainability, Safety and Inclusion are business drivers and foundational elements of the daily work.
Data Engineer
Data analyst job in Alpharetta, GA
5 days onsite in Alpharetta, GA
Skills required:
Python
Data Pipeline
Data Analysis
Data Modeling
Must have solid Cloud experience
AI/ML
Strong problem-solving skills
Strong communication skills
A problem solver with the ability to analyze and research complex issues and propose actionable solutions and strategies.
Solid understanding and hands-on experience with major cloud platforms.
Experience in designing and implementing data pipelines.
Must have experience with one of the following: GCP, AWS, or Azure; must have the drive to learn GCP.
Data Engineer (Mid-Level)
Data analyst job in Alpharetta, GA
We are looking for a skilled Data Engineer to design, build, and maintain scalable data pipelines and data platforms. The ideal candidate will have strong hands-on experience in SQL, Python, and big data technologies, and will work closely with analytics, data science, and business teams to enable data-driven decision-making.
Key Responsibilities
Design, develop, and optimize end-to-end data pipelines for large-scale data processing
Build and maintain ETL/ELT workflows using Python and SQL
Work with big data frameworks such as Apache Spark and Hive
Orchestrate and monitor data workflows using Apache Airflow
Ensure data quality, reliability, and performance across pipelines
Collaborate with cross-functional teams to understand data requirements
Troubleshoot and optimize data processing jobs and queries
Follow best practices for data engineering, security, and scalability
Mandatory Skills
Strong experience in data pipeline development
Excellent hands-on expertise in SQL
Proficiency in Python for data processing and automation
Experience with Spark, Hive, and Airflow
Solid understanding of data warehousing and ETL concepts
Preferred Skills
Experience with cloud data platforms such as AWS, GCP, or Azure
Exposure to NoSQL databases (MongoDB, Cassandra, DynamoDB, etc.)
Familiarity with distributed systems and large-scale data processing
Knowledge of CI/CD practices for data pipelines
Nice to Have
Experience working in Agile/Scrum environments
Understanding of data governance and data security practices
What We Offer
Opportunity to work on large-scale, impactful data systems
Collaborative and growth-focused work culture