
Data engineer jobs in Sierra Vista, AZ

- 1,088 jobs
  • Data Engineer/Architect

    GTN Technical Staffing (3.8 company rating)

    Data engineer job in Phoenix, AZ

    Principal Data Engineer / Data Architect

    HIGHLIGHTS
    - Contract to Hire
    - Hourly/Salary: BOE
    - Residency Status: US Citizen or Green Card Holder ONLY

    The Principal Data Engineer / Data Architect will provide hands-on technical services to support the enhancement, stabilization, and ongoing development of a public-sector financial system operating in a full-stack application environment. This role serves as the primary technical resource responsible for data architecture, data engineering, and data integration activities associated with the system. In addition to supporting application enhancements, the contractor will assist the organization in establishing foundational data management practices, including standards, tools, and processes appropriate for a public-sector environment. The role requires a high degree of independent technical judgment and the ability to work effectively within an iterative delivery framework.

    Scope of Services:

    A. Data Engineering and Architecture Support
    - Design, implement, and maintain relational data models supporting transactional and reporting use cases.
    - Develop, optimize, and maintain SQL-based data objects, including tables, views, stored procedures, and indexes (see the sketch after this listing).
    - Support data access and integration within a full-stack application environment.
    - Design and implement data interfaces and services to support system integrations.
    - Perform data performance tuning, diagnostics, and optimization activities.
    - Support data validation, reconciliation, and auditability requirements.

    B. Application Enhancement and Delivery Support
    - Support enhancement and feature development efforts for an existing production system.
    - Participate in backlog refinement, technical design, and implementation activities related to data components.
    - Provide production support and issue resolution related to data quality, performance, or integrity.
    - Support testing, deployment, and post-release stabilization activities.

    C. Data Management Practice Enablement
    - Assist in defining and documenting data management standards and best practices.
    - Develop templates, patterns, and reference implementations to support consistent data engineering practices.
    - Support the establishment of data governance and stewardship concepts appropriate to organizational maturity.
    - Provide documentation and knowledge transfer to support long-term sustainability.

    Required Qualifications:

    Technical Experience
    - Minimum of eight (8) years of hands-on experience in data engineering and/or data architecture roles.
    - Advanced proficiency with relational database platforms, including data modeling, query optimization, and performance tuning.
    - Experience supporting full-stack application environments, including integration with application code.
    - Experience designing and supporting data integrations and application programming interfaces (APIs).
    - Demonstrated experience supporting production financial or transactional systems.

    Professional Experience
    - Experience supporting post-implementation or post-go-live enhancement efforts.
    - Ability to operate independently as a senior technical resource with minimal supervision.
    - Experience working in public-sector or regulated environments.
    - Strong analytical, troubleshooting, and documentation skills.

    Preferred Qualifications:
    - Experience supporting public-sector financial, revenue, or accounting systems.
    - Experience with cloud or hybrid data architectures.
    - Familiarity with data governance, data quality, or master data concepts.
    - Experience working in agile or iterative delivery environments.

    "We are GTN - The Go To Network"
    $91k-129k yearly est. 2d ago
  • Senior Data Engineer

    Addison Group (4.6 company rating)

    Data engineer job in Phoenix, AZ

    Job Title: Sr. Data Engineer
    Job Type: Full Time
    Compensation: $130,000 - $150,000 D.O.E.; eligible for medical, dental, vision, and life insurance coverage, and PTO

    ROLE OVERVIEW
    The Senior Data Engineer is responsible for designing, building, and maintaining scalable data platforms that support analytics, reporting, and advanced data-driven initiatives. This is a hands-on engineering role focused on developing reliable, high-performing data solutions while contributing to architectural standards, data quality, and governance practices. The ideal candidate has strong experience with modern data architectures, data modeling, and pipeline development, and is comfortable collaborating across technical and business teams to deliver trusted, production-ready datasets.

    KEY RESPONSIBILITIES
    - Design and maintain data models across analytical and operational use cases to support reporting and advanced analytics.
    - Build and manage data pipelines that ingest, transform, and deliver structured and unstructured data at scale.
    - Contribute to data governance practices, including data quality controls, metadata management, lineage, and stewardship.
    - Develop and maintain cloud-based data platforms, including data lakes, analytical stores, and curated datasets.
    - Implement and optimize batch and near-real-time data ingestion and transformation processes.
    - Support data migration and modernization efforts while ensuring accuracy, performance, and reliability.
    - Partner with analytics, engineering, and business teams to understand data needs and deliver high-quality solutions.
    - Enable reporting and visualization use cases by providing clean, well-structured datasets for downstream tools.
    - Apply security, privacy, and compliance best practices throughout the data lifecycle.
    - Establish standards for performance tuning, scalability, reliability, and maintainability of data solutions.
    - Implement automation, testing, and deployment practices to improve data pipeline quality and consistency.

    QUALIFICATIONS
    - Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent professional experience.
    - 5+ years of experience in data engineering or related roles.
    - Strong hands-on experience with: data modeling, schema design, and pipeline development; cloud-based data platforms and services; data ingestion, transformation, and optimization techniques.
    - Familiarity with modern data architecture patterns, including lakehouse-style designs and governance frameworks.
    - Experience supporting analytics, reporting, and data science use cases.
    - Proficiency in one or more programming languages commonly used in data engineering (e.g., Python, SQL, or similar).
    - Solid understanding of data structures, performance optimization, and scalable system design.
    - Experience integrating data from APIs and distributed systems.
    - Exposure to CI/CD practices and automated testing for data workflows.
    - Familiarity with streaming or event-driven data processing concepts preferred.
    - Experience working in Agile or iterative delivery environments.
    - Strong communication skills with the ability to document solutions and collaborate across teams.
    $130k-150k yearly 3d ago
  • Data Engineer

    CBTS (4.9 company rating)

    Data engineer job in Phoenix, AZ

    CBTS is a leading IT Solutions Provider with an exceptional track record of delivering results to its clients. With over 27 years of experience, time-tested business acumen, and a unique vendor-neutral approach that ensures unbiased solution design, CBTS is one of the world's leading IT Solution Providers. We "Design, Build and Operate" complex, best-of-breed data and data center solutions for highly recognizable customers.

    Role Overview
    This is a long-term contract role that is hybrid (3 days in the office). Our client is seeking an experienced Senior Data Engineer with strong hands-on expertise in Quantexa, Scala, Google Cloud Platform (GCP), and ETL development. This role will focus on building and configuring batch and real-time entity resolution solutions that support large-scale data processing and analytics initiatives. The ideal candidate has practical experience developing, configuring, and deploying Quantexa modules, working closely with data engineering and architecture teams to deliver high-quality, scalable solutions.

    Key Responsibilities
    - Design, develop, and configure Quantexa modules for batch and real-time entity resolution (a generic illustration follows this listing)
    - Build and maintain scalable data pipelines using Scala and ETL frameworks
    - Develop and optimize data processing solutions on Google Cloud Platform
    - Collaborate with data architects, engineers, and business stakeholders to translate requirements into technical solutions
    - Ensure performance, data quality, and reliability across large-scale datasets
    - Support deployment, testing, and troubleshooting of Quantexa-based solutions
    - Follow best practices for data engineering, security, and compliance in enterprise environments

    Required Qualifications
    - 6 to 9 years of overall experience in data engineering or related roles
    - Strong hands-on experience with Quantexa, including batch and real-time entity resolution modules
    - Proficiency in Scala for data processing and application development
    - Experience building ETL pipelines and large-scale data integrations
    - Hands-on experience with Google Cloud Platform (GCP)
    - Strong understanding of data modeling, data quality, and performance optimization

    Preferred Qualifications
    - Quantexa Certification
    - Experience working in large enterprise or financial services environments
    - Familiarity with real-time streaming and event-driven architectures
    - Strong communication skills and ability to work cross-functionally
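Quantexa's modules are proprietary and configured in Scala, so no attempt is made here to show its actual API. The sketch below only illustrates the underlying idea the posting names, batch entity resolution: a cheap blocking key to limit comparisons, then a fuzzy match within each block. All records and the threshold are invented.

```python
# Generic batch entity resolution: blocking + fuzzy matching.
# NOT Quantexa's API -- just the concept, in stdlib Python.
from collections import defaultdict
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Jon A. Smith", "postcode": "85001"},
    {"id": 2, "name": "John A Smith", "postcode": "85001"},
    {"id": 3, "name": "Dana Lee",     "postcode": "85004"},
]

# Blocking: only compare records that share a cheap key (here, postcode),
# so pairwise comparison doesn't explode on large datasets.
blocks = defaultdict(list)
for rec in records:
    blocks[rec["postcode"]].append(rec)

def similar(a: str, b: str) -> float:
    """Crude name similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

matches = [
    (x["id"], y["id"])
    for block in blocks.values()
    for i, x in enumerate(block)
    for y in block[i + 1:]
    if similar(x["name"], y["name"]) >= 0.85
]
print(matches)  # [(1, 2)] -- records 1 and 2 resolve to the same entity
```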
    $71k-107k yearly est. 1d ago
  • Data Engineer

    Cornerstone Technology Talent Services (3.2 company rating)

    Data engineer job in Phoenix, AZ

    Data Engineer / Data Architect (Contract-to-Hire)
    Phoenix, AZ (On-site)

    CornerStone Technology Talent Services is partnering with a public-sector organization on an upcoming opportunity for an experienced Data Engineer / Data Architect. This role is a backfill for a senior team member planning to retire at the end of 2025 and supports a mission-critical financial system. The contractor will work within an established team structure and report directly to departmental leadership. This is a hands-on, senior-level technical role focused on data architecture, data engineering, and data integration, with an emphasis on building, optimizing, and maturing foundational data practices.

    Summary:
    The Data Engineer / Data Architect will serve as a senior technical resource supporting a large-scale public-sector financial system. This individual will be responsible for designing, enhancing, and maintaining data architectures and pipelines that support system performance, scalability, and long-term data governance. The role includes hands-on development, system optimization, and collaboration with internal stakeholders to support ongoing enhancements and future-state data strategy.

    Key Responsibilities:
    - Design, implement, and maintain scalable data architectures supporting enterprise financial systems
    - Develop and optimize data pipelines, ETL/ELT processes, and system integrations
    - Support performance tuning, data quality, and data reliability initiatives
    - Establish and enhance data management standards, documentation, and best practices
    - Collaborate with application teams, business stakeholders, and technical leadership on system enhancements
    - Troubleshoot complex data-related issues and deliver sustainable technical solutions
    - Contribute to architectural planning, modernization efforts, and technical roadmaps

    Required Qualifications:
    - Senior-level experience in data engineering and/or data architecture roles
    - Proven experience designing and supporting enterprise-scale data architectures
    - Hands-on experience building and maintaining data pipelines and integrations
    - Experience supporting financial systems or highly regulated environments
    - Strong problem-solving skills with the ability to operate independently in a hands-on role
    - Strong communication skills and ability to collaborate effectively within established teams

    Engagement Details:
    - Contract Duration: 6 months with potential for permanent hire
    - Location: Downtown Phoenix, AZ
    - Work Model: 100% on-site, in-office
    - Start Timing: Targeting an ASAP start at the beginning of 2026
    - Interview Process: Structured interview process consistent with prior engagements

    This opportunity offers the chance to step into a stable, well-established environment, work closely with experienced leadership, and make a meaningful impact on a critical public-sector system.
    $92k-129k yearly est. 2d ago
  • Pyspark Data Engineer | Only USC and Green Card

    Ampstek

    Data engineer job in Phoenix, AZ

    Pyspark Data Engineer
    Duration: 06+ Months
    **Only US Citizen and Green Card Required**

    Job Details:

    Must Have Skills
    - PySpark, Python developer
    - Hands-on knowledge of PySpark, Hadoop, Python (see the sketch after this listing)
    - GitHub; backend API integration knowledge (JSON, REST)

    Nice to Have Skills
    - Working closely with the client
    - Good communication

    Detailed Job Description
    - Looking for a subcontractor for a PySpark/Python data engineer requirement.
    - Client communication skill set for the Amex account.

    Minimum years of experience: 6 years
    Certifications Needed: No (good to have GCP certification)

    Top 3 responsibilities you would expect the subcontractor to shoulder and execute:
    - Individual contributor
    - Strong development experience and leading a dev module
    - Work with the client directly

    Thank You
    Aakash Dubey
    ************************
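As a minimal sketch of the posting's core skill combination, PySpark plus REST/JSON backend integration: fetch records from a REST endpoint and aggregate them in a Spark DataFrame. The URL, field names, and payload shape are invented; it assumes pyspark and requests are installed.

```python
# Sketch: land records from a REST endpoint in a Spark DataFrame and
# aggregate them. Endpoint and fields are invented for illustration.
import requests
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("rest-ingest-demo").getOrCreate()

resp = requests.get("https://api.example.com/v1/transactions", timeout=30)
resp.raise_for_status()
payload = resp.json()  # assumed shape: [{"account": "a1", "amount": 12.5}, ...]

df = spark.createDataFrame(
    [(rec["account"], float(rec["amount"])) for rec in payload],
    ["account", "amount"],
)

summary = df.groupBy("account").agg(
    F.sum("amount").alias("total"),
    F.count("*").alias("n_txns"),
)
summary.show()
```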
    $80k-111k yearly est. 2d ago
  • Data Governance Engineer

    Centraprise

    Data engineer job in Phoenix, AZ

    Job Title: Data Governance Engineer
    Phoenix, AZ - Complete Onsite
    Full-Time Permanent
    Experience Required - 6+ Years

    Must Have Technical/Functional Skills
    - Understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) and prior experience.
    - 2 - 5 years of Data Quality Management experience.
    - Intermediate competency in SQL & Python or a related programming language.
    - Strong familiarity with data architecture and/or data modeling concepts.
    - 2 - 5 years of experience with Agile or SAFe project methodologies.

    Roles & Responsibilities
    - Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention, Data Quality, Data Movement, Data Stewardship, Data Protection, and Data Sharing, among others.
    - Identify data quality issues, perform root-cause analysis of data quality issues, and drive remediation of audit and regulatory feedback.
    - Develop a deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for the business.
    - Responsible for holistic platform data quality monitoring, including but not limited to critical data elements.
    - Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
    - Influence and contribute to strategic improvements to data assessment processes and analytical tools.
    - Responsible for monitoring data quality issues, communicating issues, and driving resolution.
    - Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, and technology teams.
    - Subject matter expertise on multiple platforms.
    - Responsible for partnering with the Data Steward Manager in developing and managing the data compliance roadmap.

    Generic Managerial Skills, If any
    - Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.
    - Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals; accountable for bringing in ideas, information, suggestions, and expertise from others outside and inside the immediate team.
    - Communication: Influences and holds others accountable and has the ability to convince others. Identifies the specific data governance requirements and is able to communicate clearly and in a compelling way.
    $80k-111k yearly est. 2d ago
  • Data Engineer

    Mastek

    Data engineer job in Phoenix, AZ

    Hi, we have a job opportunity for a Data Engineer Analyst role.

    Data Analyst / Data Engineer
    Expectations: Our project is data analysis heavy, and we are looking for someone who can grasp business functionality and translate that into working technical solutions.
    Job location: Phoenix, Arizona
    Type: Hybrid model (3 days a week in office)

    Job Description: Data Analyst / Data Engineer (6+ years relevant experience with required skill set)

    Summary: We are seeking a Data Analyst Engineer with a minimum of 6 years in data engineering, data analysis, and data design. The ideal candidate will have strong hands-on expertise in Python and relational databases such as Postgres, SQL Server, or MySQL, and a good understanding of data modeling theory and normalization forms.

    Required Skills (with screening questions):
    - 6+ years of experience in data engineering, data analysis, and data design. (What was your approach to data analysis in your previous/current role, and what methods or techniques did you use to extract insights from large datasets?)
    - Good proficiency in Python.
    - Formal training or education in data modeling. (If so, please provide details about the course, program, or certification you completed, including when you received it.)
    - Strong experience with relational databases: Postgres, SQL Server, or MySQL. (What are the essential factors that contribute to a project's success, and how do you plan to leverage your skills and expertise to ensure our project meets its objectives?)
    - Expertise in writing complex SQL queries and optimizing database performance.
    - Solid understanding of data modeling theory and normalization forms.
    - Good communicator with the ability to articulate business problems for technical solutions.

    Key Responsibilities:
    - Analyze complex datasets to derive actionable insights and support business decisions.
    - Model data solutions for high performance and reliability.
    - Work extensively with Python for data processing and automation.
    - Develop and optimize SQL queries for Postgres, SQL Server, or MySQL databases.
    - Ensure data integrity, security, and compliance across all data solutions.
    - Collaborate with cross-functional teams to understand data requirements and deliver solutions.
    - Communicate effectively with stakeholders and articulate business problems to drive technical solutions.

    Secondary Skills:
    - Experience deploying applications in Kubernetes.
    - API development using FastAPI or Django.
    - Familiarity with containerization (Docker) and CI/CD tools.

    Regards,
    Suhas Gharge
    $80k-111k yearly est. 2d ago
  • Data Engineer

    Stelvio Inc.

    Data engineer job in Phoenix, AZ

    Hybrid - 2-3 days on site in Phoenix, AZ

    We're looking for a Data Engineer to help build the cloud-native data pipelines that power critical insights across our organization. You'll work with modern technologies, solve real-world data challenges, and support analytics and reporting systems that drive smarter decision-making in the transportation space.

    What You'll Do
    - Build and maintain data pipelines using Databricks, Azure Data Factory, and Microsoft Fabric
    - Implement incremental and real-time ingestion using medallion architecture (see the sketch after this listing)
    - Develop and optimize complex SQL and Python transformations
    - Support legacy platforms (SSIS, SQL Server) while contributing to modernization efforts
    - Troubleshoot data quality and integration issues
    - Participate in proof-of-concepts and recommend technical solutions

    What You Bring
    - 5+ years designing and building data solutions
    - Strong SQL and Python skills
    - Experience with ETL pipelines and Data Lake architecture
    - Ability to collaborate and adapt in a fast-moving environment
    - Preferred: Azure services, cloud ETL tools, Power BI/Tableau, event-driven systems, NoSQL databases
    - Bonus: Experience with Data Science or Machine Learning

    Benefits
    Medical, dental, and vision from day one · PTO & holidays · 401(k) with match · Lifestyle account · Tuition reimbursement · Voluntary benefits · Employee Assistance Program · Well-being & culture programs · Professional development support
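For readers unfamiliar with the medallion pattern this posting names, here is one incremental bronze-to-silver hop as a sketch. Paths, columns, and the watermark logic are invented; a production version on Databricks would more likely use Delta Lake (MERGE, Auto Loader) than plain Parquet appends.

```python
# Sketch of one incremental bronze -> silver hop in a medallion pipeline.
# Paths, columns, and the watermark are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw events exactly as they landed, duplicates and bad rows included.
bronze = spark.read.parquet("/lake/bronze/trips")

# Incremental window: only rows newer than the last successful silver load.
# In practice the watermark would come from a control table, not a literal.
last_load = "2025-01-01T00:00:00"
new_rows = bronze.filter(F.col("ingested_at") > F.lit(last_load))

# Silver: de-duplicated, filtered, and typed for downstream consumers.
silver = (
    new_rows
    .dropDuplicates(["trip_id"])
    .filter(F.col("trip_distance_km") > 0)
    .withColumn("trip_date", F.to_date("started_at"))
)
silver.write.mode("append").parquet("/lake/silver/trips")
```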
    $80k-111k yearly est. 2d ago
  • Data Engineer

    Echelix

    Data engineer job in Phoenix, AZ

    Echelix is a leading AI consulting company helping businesses design, build, and scale intelligent systems. We partner with organizations to make artificial intelligence practical, powerful, and easy to adopt. Our team blends deep technical skill with real-world business sense to deliver AI that drives measurable results.

    The Role
    We're looking for a Senior Data Engineer to architect, optimize, and manage database systems that power AI-driven solutions and enterprise applications. You'll lead the design of scalable, secure, and high-performance data infrastructure across cloud platforms, ensuring our clients' data foundations are built for the future. This role is ideal for database professionals who have evolved beyond traditional DBA work into cloud-native architectures, API-driven data access layers, and modern DevOps practices. You'll work with cutting-edge technologies like GraphQL, Hasura, and managed cloud databases while mentoring engineers on data architecture best practices.

    What You'll Do
    - Design, tune, and manage PostgreSQL, SQL Server, and cloud-managed databases (AWS RDS/Aurora, Azure SQL Database/Cosmos DB)
    - Architect and implement GraphQL APIs using Hasura or equivalent technologies for real-time data access (see the sketch after this listing)
    - Lead cloud database migrations and deployments across AWS and Azure environments
    - Automate database CI/CD pipelines using tools like GitHub Actions, Azure DevOps, or AWS CodePipeline
    - Develop and maintain data access layers and APIs that integrate with AI and application workloads
    - Monitor, secure, and optimize database performance using cloud-native tools (AWS CloudWatch, Azure Monitor, Datadog)
    - Implement database security best practices including encryption, access controls, and compliance requirements
    - Mentor engineers on database design, data modeling, and architecture best practices

    Requirements
    - 5+ years of experience designing and managing production database systems
    - Deep expertise in PostgreSQL and SQL Server, including performance tuning and query optimization
    - Hands-on experience with cloud database services (AWS RDS, Aurora, Azure SQL Database, Azure Cosmos DB)
    - Experience with GraphQL and API development, preferably with Hasura or similar platforms
    - Strong background in database CI/CD automation and Infrastructure as Code (Terraform, CloudFormation, Bicep)
    - Proficiency in scripting languages (Python, Bash) for automation and tooling
    - Solid understanding of data modeling, schema design, and database normalization
    - Strong communication and mentoring skills
    - US citizen and must reside in the United States

    Nice to Have
    - Experience with NoSQL databases (MongoDB, DynamoDB, Redis)
    - Knowledge of data streaming platforms (Kafka, AWS Kinesis, Azure Event Hubs)
    - Experience with data warehousing solutions (Snowflake, Redshift, Azure Synapse)
    - Background in AI/ML data pipelines and feature stores
    - Relevant certifications (AWS Database Specialty, Azure Database Administrator, PostgreSQL Professional)

    Why Join Echelix
    You'll join a fast-moving team that's shaping how AI connects people and data. We value curiosity, precision, and practical innovation. You'll work on real projects with real impact, not just proofs of concept.
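For context on the Hasura bullet above: Hasura auto-generates a GraphQL API over Postgres tables, which clients hit with a plain HTTP POST. The sketch below shows the client side only; the host, admin secret, and `customers` table are invented, while `/v1/graphql` and the `x-hasura-admin-secret` header are Hasura's standard endpoint and auth header.

```python
# Client-side sketch of querying a Hasura-generated GraphQL API.
# Host, secret, and the `customers` table are invented assumptions.
import requests

query = """
query RecentCustomers($n: Int!) {
  customers(limit: $n, order_by: {created_at: desc}) {
    id
    name
    created_at
  }
}
"""

resp = requests.post(
    "https://hasura.example.com/v1/graphql",
    json={"query": query, "variables": {"n": 5}},
    headers={"x-hasura-admin-secret": "<admin-secret>"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["data"]["customers"])
```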
    $80k-111k yearly est. 4d ago
  • ORACLE CLOUD DATA ENGINEER

    Wise Skulls

    Data engineer job in Phoenix, AZ

    Hiring: Oracle Cloud Data Engineer / Technology Lead

    We're looking for a hands-on Oracle Cloud Data Engineer (Technology Lead) to drive OCI-based data engineering and Power BI analytics initiatives. This role combines technical leadership with active development in a high-impact data program.

    Location: Phoenix, AZ (Hybrid)
    Duration: 6+ Months (Contract)
    Work Authorization: USC & Green Card holders ONLY (strict requirement)

    Job Summary
    This role focuses on building scalable data pipelines on Oracle Cloud Infrastructure (OCI) while leading Power BI dashboard and reporting development. You'll apply Medallion Architecture, enforce data governance, and collaborate closely with business stakeholders. Utility industry experience is a strong plus.

    Must-Have (Non-Negotiable) Skills
    - 8-10 years of experience in Data Engineering & Business Intelligence
    - 3+ years of hands-on OCI experience
    - Strong expertise in OCI Data Services, including: OCI Data Integration, OCI Data Flow, OCI Streaming, Autonomous Data Warehouse, Oracle Exadata, OCI Object Storage
    - Hands-on experience with Medallion Architecture (Bronze, Silver, Gold layers)
    - Power BI expertise: dashboards, reports, DAX, Power Query, data modeling, RLS
    - Strong coding skills in SQL, PL/SQL, Python
    - Experience with Terraform, Ansible, and CI/CD pipelines
    - Bachelor's or Master's degree in a related field
    - Power BI Certification - Required
    - Hands-on development is mandatory

    Key Responsibilities
    - Design and implement secure, scalable OCI data pipelines
    - Lead Power BI dashboard and reporting development
    - Build inbound/outbound integration patterns (APIs, files, streaming)
    - Implement Audit, Balance, and Control (ABC) frameworks
    - Ensure data quality, governance, lineage, and monitoring
    - Mentor engineers and BI developers
    - Drive agile delivery and stakeholder collaboration

    📩 Interested? Apply now or DM us to explore this opportunity!
    You can share resumes at ********************* or call us at *****************
    $80k-111k yearly est. 5d ago
  • Data Engineer (GIS)

    Impact Technology Recruiting (4.5 company rating)

    Data engineer job in Scottsdale, AZ

    About the Role
    We're partnering with a large, operations-focused organization to hire a Data Scientist (GIS) to support analytics initiatives within their operations function. This role applies geospatial data and advanced analytics to help improve operational efficiency, service reliability, and planning decisions. The work is highly analytical and engineering-focused, with models built directly in Snowflake and used as inputs into downstream optimization and planning systems.

    What You'll Work On

    Geospatial Modeling & Time Estimation
    - Develop data-driven models to estimate operational timing across different service and facility interactions
    - Leverage GPS data and geofencing techniques to understand behavior across locations
    - Incorporate contextual variables such as: geography and location characteristics; customer and service attributes; site complexity and external conditions (e.g., weather, time-based patterns)
    - Produce reliable, explainable time estimates that support planning and decision-making

    Facility & Location Analytics
    - Model turnaround and processing time across different types of locations
    - Analyze performance variability based on operational and environmental factors
    - Apply polygon- and radius-based geofencing to capture location-specific behavior
    - Quantify how conditions impact operational flow and timing outcomes

    Technical Environment
    - Primary development and modeling in Snowflake; build and engineer transformations and analytical processes directly in Snowflake
    - Modeling approaches may include: percentile-based time estimates; aggregations such as averages and medians by service and location attributes (see the sketch after this listing)
    - Data sources include: latitude/longitude data, high-frequency GPS signals, and location and facility reference data

    What We're Looking For
    - Strong hands-on experience with Snowflake
    - Advanced SQL skills
    - Python for analytics and data engineering
    - Solid understanding of core GIS concepts, including spatial joins, polygons, and geofencing
    - Experience with traditional GIS tools (e.g., ArcGIS) is a plus, but this is not a cartography or visualization-focused role
    - Background in geospatial data engineering and modeling is key

    Interview Process
    Two one-hour video interviews
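A stdlib-only sketch of the two techniques this posting names, radius-based geofencing on GPS points and percentile-based time estimates. In the role itself this logic would live in Snowflake SQL (functions along the lines of ST_DWITHIN and PERCENTILE_CONT); the coordinates, radius, and dwell samples below are invented.

```python
# Sketch: radius geofencing on GPS pings, then a percentile-based
# dwell-time estimate. All data values are invented for illustration.
import math
from statistics import quantiles

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

FACILITY = (33.4484, -112.0740)  # facility centroid (lat, lon)
RADIUS_M = 150                   # radius-based geofence

def inside(lat, lon):
    return haversine_m(lat, lon, *FACILITY) <= RADIUS_M

pings = [
    (33.4486, -112.0741),  # ~25 m from the centroid -> inside
    (33.4475, -112.0745),  # ~110 m -> inside
    (33.4600, -112.1000),  # well over a kilometer -> outside
]
print(sum(inside(*p) for p in pings), "of", len(pings), "pings in the fence")

# Dwell minutes per visit (spans of consecutive in-fence pings). Percentiles
# are robust to outliers like the 41-minute visit below.
dwell_minutes = [12.0, 14.5, 13.2, 41.0, 15.1, 16.3, 12.8, 14.0]
p25, p50, p75 = quantiles(dwell_minutes, n=4)
print(f"median dwell ~ {p50:.1f} min (IQR {p25:.1f}-{p75:.1f})")
```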
    $90k-128k yearly est. 3d ago
  • Data Governance Engineer

    Tata Consultancy Services (4.3 company rating)

    Data engineer job in Phoenix, AZ

    Role: Data Governance Engineer
    Experience Required - 6+ Years

    Must Have Technical/Functional Skills
    - Understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) and prior experience.
    - 2 - 5 years of Data Quality Management experience.
    - Intermediate competency in SQL & Python or a related programming language.
    - Strong familiarity with data architecture and/or data modeling concepts.
    - 2 - 5 years of experience with Agile or SAFe project methodologies.

    Roles & Responsibilities
    - Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention, Data Quality, Data Movement, Data Stewardship, Data Protection, and Data Sharing, among others.
    - Identify data quality issues, perform root-cause analysis of data quality issues, and drive remediation of audit and regulatory feedback.
    - Develop a deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for the business.
    - Responsible for holistic platform data quality monitoring, including but not limited to critical data elements.
    - Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
    - Influence and contribute to strategic improvements to data assessment processes and analytical tools.
    - Responsible for monitoring data quality issues, communicating issues, and driving resolution.
    - Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, and technology teams.
    - Subject matter expertise on multiple platforms.
    - Responsible for partnering with the Data Steward Manager in developing and managing the data compliance roadmap.

    Generic Managerial Skills, If any
    - Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.
    - Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals; accountable for bringing in ideas, information, suggestions, and expertise from others outside and inside the immediate team.
    - Communication: Influences and holds others accountable and has the ability to convince others. Identifies the specific data governance requirements and is able to communicate clearly and in a compelling way.

    Interested candidates, please share your updated resume at *******************

    Salary Range - $100,000 to $120,000 per year

    TCS Employee Benefits Summary
    - Discretionary Annual Incentive
    - Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans
    - Family Support: Maternal & Parental Leaves
    - Insurance Options: Auto & Home Insurance, Identity Theft Protection
    - Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement
    - Time Off: Vacation, Time Off, Sick Leave & Holidays
    - Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing
    $100k-120k yearly 4d ago
  • Data Architect-Only W2-No Engineers please

    Tech One It (3.9 company rating)

    Data engineer job in Phoenix, AZ

    Role: Principal Data Engineer / Data Architect (Public Sector Financial Systems)
    Type: Contract to hire

    1. Role Summary (High Level)
    This individual will serve as a senior technical resource supporting a public-sector financial system, with a strong focus on data architecture, data engineering, and data integration. The role is hands-on and will support ongoing system enhancements, performance optimization, and the establishment of foundational data management practices.

    2. Scope of Services

    A. Data Engineering and Architecture Support
    - Design, implement, and maintain relational data models supporting transactional and reporting use cases.
    - Develop, optimize, and maintain SQL-based data objects, including tables, views, stored procedures, and indexes.
    - Support data access and integration within a full-stack application environment.
    - Design and implement data interfaces and services to support system integrations.
    - Perform data performance tuning, diagnostics, and optimization activities.
    - Support data validation, reconciliation, and auditability requirements.

    B. Application Enhancement and Delivery Support
    - Support enhancement and feature development efforts for an existing production system.
    - Participate in backlog refinement, technical design, and implementation activities related to data components.
    - Provide production support and issue resolution related to data quality, performance, or integrity.
    - Support testing, deployment, and post-release stabilization activities.

    C. Data Management Practice Enablement
    - Assist in defining and documenting data management standards and best practices.
    - Develop templates, patterns, and reference implementations to support consistent data engineering practices.
    - Support the establishment of data governance and stewardship concepts appropriate to organizational maturity.
    - Provide documentation and knowledge transfer to support long-term sustainability.

    3. Required Qualifications

    Technical Experience
    - Minimum of eight (8) years of hands-on experience in data engineering and/or data architecture roles.
    - Advanced proficiency with relational database platforms, including data modeling, query optimization, and performance tuning.
    - Experience supporting full-stack application environments, including integration with application code.
    - Experience designing and supporting data integrations and application programming interfaces (APIs).
    - Demonstrated experience supporting production financial or transactional systems.

    Professional Experience
    - Experience supporting post-implementation or post-go-live enhancement efforts.
    - Ability to operate independently as a senior technical resource with minimal supervision.
    - Experience working in public-sector or regulated environments.
    - Strong analytical, troubleshooting, and documentation skills.

    4. Preferred Qualifications
    - Experience supporting public-sector financial, revenue, or accounting systems.
    - Experience with cloud or hybrid data architectures.
    - Familiarity with data governance, data quality, or master data concepts.
    - Experience working in agile or iterative delivery environments.
    $104k-143k yearly est. 1d ago
  • Data Architect

    Akkodis

    Data engineer job in Phoenix, AZ

    Akkodis is seeking a Data Architect local to Phoenix, AZ who can come onsite 3 days a week. If you are interested, please apply!

    JOB TITLE: Data Architect
    EMPLOYMENT TYPE: 24+ month contract | 3 days/week on site
    PAY: $80 - $96/hr

    Responsibilities
    - ETL design and development for enterprise data solutions.
    - Design and build databases, data warehouses, and strategies for data acquisition, archiving, and recovery.
    - Review new data sources for compliance with standards.
    - Provide technical leadership, set standards, and mentor junior team members.
    - Collaborate with business stakeholders to translate requirements into scalable solutions.
    - Guide teams on Azure data tools (Data Factory, Synapse, Data Lake, Databricks).
    - Establish best practices for database design, data integration, and data governance.
    - Ensure solutions are secure, high-performing, and easy to support.

    Essential Skills & Experience
    - Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
    - 10+ years with Microsoft SQL technologies.
    - 3+ years with cloud-based solutions (Azure preferred).
    - Strong knowledge of ETL, data modeling, and data warehousing.
    - Experience with source control, change/release management, and documentation.
    - Excellent communication and leadership skills.

    Preferred
    - Retail or grocery industry experience.
    - Familiarity with Power BI and MDM principles.

    Work Schedule
    Hybrid: 3 days onsite in Phoenix, AZ; 2 days remote.

    "Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits and 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State or local law; and Holiday pay upon meeting eligibility criteria. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client."
    $92k-128k yearly est. 4d ago
  • Data Architect

    Sibitalent Corp

    Data engineer job in Phoenix, AZ

    Hi, hope you are doing well.

    IMMEDIATE INTERVIEW - Principal Data Engineer / Data Architect in Phoenix, Arizona (NEED LOCAL CANDIDATE) - OPEN FOR W2 OR SELF-CORP CANDIDATES ONLY

    Please find the job details below, and kindly reply if you're interested in learning more about this job.

    Job Title: Principal Data Engineer / Data Architect
    Location: Phoenix, Arizona (NEED LOCAL CANDIDATE)

    Principal Data Engineer / Data Architect (Public Sector Financial Systems)

    1. Role Overview
    The Principal Data Engineer / Data Architect will provide hands-on technical services to support the enhancement, stabilization, and ongoing development of a public-sector financial system operating in a full-stack application environment. This role serves as the primary technical resource responsible for data architecture, data engineering, and data integration activities associated with the system. In addition to supporting application enhancements, the contractor will assist the organization in establishing foundational data management practices, including standards, tools, and processes appropriate for a public-sector environment. The role requires a high degree of independent technical judgment and the ability to work effectively within an iterative delivery framework.

    2. Scope of Services

    A. Data Engineering and Architecture Support
    - Design, implement, and maintain relational data models supporting transactional and reporting use cases.
    - Develop, optimize, and maintain SQL-based data objects, including tables, views, stored procedures, and indexes.
    - Support data access and integration within a full-stack application environment.
    - Design and implement data interfaces and services to support system integrations.
    - Perform data performance tuning, diagnostics, and optimization activities.
    - Support data validation, reconciliation, and auditability requirements.

    B. Application Enhancement and Delivery Support
    - Support enhancement and feature development efforts for an existing production system.
    - Participate in backlog refinement, technical design, and implementation activities related to data components.
    - Provide production support and issue resolution related to data quality, performance, or integrity.
    - Support testing, deployment, and post-release stabilization activities.

    C. Data Management Practice Enablement
    - Assist in defining and documenting data management standards and best practices.
    - Develop templates, patterns, and reference implementations to support consistent data engineering practices.
    - Support the establishment of data governance and stewardship concepts appropriate to organizational maturity.
    - Provide documentation and knowledge transfer to support long-term sustainability.

    3. Required Qualifications

    Technical Experience
    - Minimum of eight (8) years of hands-on experience in data engineering and/or data architecture roles.
    - Advanced proficiency with relational database platforms, including data modeling, query optimization, and performance tuning.
    - Experience supporting full-stack application environments, including integration with application code.
    - Experience designing and supporting data integrations and application programming interfaces (APIs).
    - Demonstrated experience supporting production financial or transactional systems.

    Professional Experience
    - Experience supporting post-implementation or post-go-live enhancement efforts.
    - Ability to operate independently as a senior technical resource with minimal supervision.
    - Experience working in public-sector or regulated environments.
    - Strong analytical, troubleshooting, and documentation skills.

    4. Preferred Qualifications
    - Experience supporting public-sector financial, revenue, or accounting systems.
    - Experience with cloud or hybrid data architectures.
    - Familiarity with data governance, data quality, or master data concepts.
    - Experience working in agile or iterative delivery environments.
    $92k-128k yearly est. 3d ago
  • Data Architect

    Saxon Global (3.6 company rating)

    Data engineer job in Phoenix, AZ

    The Senior Data Engineer & Test in Phoenix (85029) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

    Key Responsibilities
    1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
    2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (see the sketch after this listing).
    3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
    4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
    5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
    6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
    7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
    8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
    9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
    10. Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.

    General Requirements (includes all of the above skills, plus the following)
    - Minimum of 10+ years overall IT experience
    - Experienced in waterfall, iterative, and agile methodologies

    Technical Requirements
    1. Hands-on Data Engineering: Minimum 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
    2. Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
    3. CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.
    4. Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
    5. Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
    6. Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
    7. Unix/Linux: Strong command-line skills in Unix-like environments.
    8. SQL: Solid understanding of SQL for data ingestion and analysis.
    9. Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
    10. Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software.
    11. Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.
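A minimal Airflow DAG of the shape this posting describes: a daily two-task pipeline with an explicit dependency. The dag_id and task bodies are placeholders; real tasks would invoke PySpark jobs.

```python
# Minimal Airflow DAG: daily extract -> transform. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling raw batch...")

def transform():
    print("running PySpark transform...")

with DAG(
    dag_id="daily_batch_demo",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # Explicit dependency: transform runs only after extract succeeds.
    t_extract >> t_transform
```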
    $84k-115k yearly est. 2d ago
  • Azure Data Architect

    Sogeti (4.7 company rating)

    Data engineer job in Phoenix, AZ

    We are seeking an experienced Azure Data Architect to design, build, and govern scalable, secure, and high-performance data platforms on Microsoft Azure. The architect will define enterprise data architecture, lead cloud data modernization initiatives, and enable analytics, reporting, and AI/ML workloads.

    Key Responsibilities:

    Architecture & Design
    - Design end-to-end Azure data architectures including ingestion, storage, processing, analytics, and consumption layers
    - Define Lakehouse, Data Warehouse, and Medallion (Bronze/Silver/Gold) architectures
    - Architect solutions using Azure Fabric, Azure Synapse, Azure Data Factory, ADLS Gen2, Azure SQL, and Snowflake (if applicable)

    Data Engineering & Integration
    - Design scalable ingestion pipelines for batch and streaming data
    - Implement data transformations using Spark, SQL, PySpark, and Dataflows
    - Optimize data models for Power BI, reporting, and advanced analytics

    Governance, Security & Compliance
    - Define data governance, lineage, and metadata strategy
    - Implement security, RBAC, encryption, and PII controls
    - Ensure compliance with enterprise and regulatory standards (SOX, GDPR, CCAR, FR Y-14, etc., where applicable)

    Analytics & BI Enablement
    - Design semantic models and enterprise datasets for Power BI
    - Optimize performance for large-scale analytical workloads
    - Enable self-service analytics while maintaining governance

    Cloud & DevOps
    - Implement CI/CD pipelines for data solutions
    - Define monitoring, logging, and cost optimization (FinOps)
    - Collaborate with DevOps and platform teams

    Leadership & Collaboration
    - Act as a technical leader and advisor to engineering and business teams
    - Review designs, mentor data engineers, and enforce best practices
    - Partner with stakeholders to translate business needs into data solutions

    Required Skills & Experience:

    Technical Skills
    - Strong expertise in Microsoft Azure Data Services: Azure Fabric (Lakehouse, Warehouse, Notebooks), Azure Data Factory / Synapse Pipelines, ADLS Gen2, Azure SQL / SQL Server
    - Strong SQL and data modeling skills
    - Experience with Power BI (semantic models, performance tuning)
    - Hands-on with Spark / PySpark
    - Understanding of Snowflake or Databricks

    Architecture & Concepts
    - Data Warehousing & Lakehouse architecture
    - Dimensional modeling (Star/Snowflake schemas)
    - Data quality, lineage, and observability
    - Streaming vs. batch architecture

    Experience
    - 10+ years in data engineering / analytics
    - 4-6+ years designing cloud data platforms
    - Experience in enterprise-scale implementations

    Certifications & Preferred
    - Microsoft certifications (Azure Data Engineer / Architect)
    - Experience with AI/ML enablement on Azure
    - Domain exposure: Banking, Regulatory Reporting, Real Estate, Healthcare
    - Experience with Purview / Microsoft Fabric governance

    Soft Skills
    - Strong communication with business and technical stakeholders
    - Ownership mindset and problem-solving ability
    - Ability to influence architecture decisions
    - Mentorship and leadership capabilities

    Life at Sogeti - We support all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
    - Flexible work
    - Healthcare including dental, vision, mental health, and well-being programs
    - Financial well-being programs such as 401(k) (matched 150% up to 6%) and Employee Share Ownership Plan
    - 100% company-paid mobile phone plan
    - 3 weeks Personal Time Off (PTO) and 7 paid holidays
    - Paid parental leave
    - Family-building benefits like adoption assistance, surrogacy, and cryopreservation
    - Social well-being benefits like subsidized back-up child/elder care and tutoring
    - Mentoring, coaching, and learning programs
    - Continuing Education: $5,250 annual tuition reimbursement plus access to over 20,000 online courses and certifications through Capgemini University, as well as Coursera and Degreed
    - Programs for counseling, support, health and fitness perks, auto discounts, and much, much more!
    - Employee Resource Groups
    - Disaster Relief

    Disclaimer - Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.

    Please be aware that Capgemini may capture your image (video or screenshot) during the interview process, and that image may be used for verification, including during the hiring and onboarding process.

    This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever it is necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodation does not pose an undue hardship. Capgemini is committed to providing reasonable accommodation during our recruitment process. If you need assistance or accommodation, please reach out to your recruiting contact.

    Click the following link for more information on your rights as an Applicant: **************************************************************************

    Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by Capgemini.
    $90k-131k yearly est. 3d ago
  • Java Software Engineer

    Tekskills Inc. (4.2 company rating)

    Data engineer job in Phoenix, AZ

    Job Title: Java Developer
    Duration: 12 Months

    Must Have Skills:
    - Good knowledge of Java
    - Strong communication skills
    - Should be able to work independently

    Detailed Job Description:
    Java/J2EE full-stack developer with financial or banking domain experience. Should be very fluent in communication and able to work on their own without hand-holding. Should be completely hands-on.

    Responsibilities:
    - Good knowledge of Java
    - Strong communication skills
    - Should be able to work independently
    $76k-107k yearly est. 5d ago
  • Backend Software Engineer

    Waferwire Cloud Technologies

    Data engineer job in Phoenix, AZ

    Job Title: Backend Software Engineer - FinTech
    Duration: Long-term

    About WCT
    WCT is a global talent solutions partner committed to delivering high-impact technology and engineering talent to some of the world's most innovative companies. As a WCT employee, you'll be part of a dynamic, growth-oriented culture that values collaboration, continuous learning, and excellence in execution.

    Job Description:
    The Digital Banking and Payments team is responsible for our Card Payments Processing and Digital Bank products. We are currently building platforms responsible for ACH, Wire, Bill Pay, Zelle, Debit, Checking, and Savings applications that allow teams to create products at scale and allow digital channels to deliver compelling experiences in a dramatically faster fashion. We're building a next-generation orchestration and automation platform to support thousands of business processes across the enterprise. This isn't about coding one-off automations - it's about creating the platform, APIs, and orchestration primitives that other developer teams will use to define, run, and scale their business processes. As the Lead Backend Engineer, you'll be hands-on in Kotlin every day, designing platform services, integrating workflow orchestration frameworks, and enabling GenAI-powered intent parsing and decision nodes. You need to understand complexity, tradeoffs, and scaling.

    Responsibilities
    - Design and build backend platform services in Kotlin for ingestion, orchestration, RBAC, monitoring, and developer tooling.
    - Implement and optimize workflow orchestration frameworks (e.g., Temporal - preferred, Cadence, Camunda); see the sketch after this listing.
    - Provide scalable APIs and abstractions that empower other teams to build workflows on the platform.
    - Integrate GenAI/NLP pipelines for intent parsing, process matching, and intelligent decisioning.
    - Champion developer experience (DX) through tooling, CI/CD improvements, and observability.
    - Mentor backend engineers, lead design reviews, and guide technical decisions.
    - Collaborate closely with frontend, product, and process analysts to ensure platform adoption and impact.

    Qualifications:
    - Bachelor's degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).
    - 10+ years of backend engineering experience, with strong Kotlin/JVM expertise.
    - Proven track record building platforms, frameworks, or orchestration services (not just applications).
    - Hands-on experience with workflow orchestration systems (Temporal, Cadence, Camunda, or similar).
    - Excellent knowledge of graph algorithms.
    - Deep knowledge of distributed systems, API design, and event-driven architectures.
    - Practical experience integrating GenAI/NLP into backend systems.
    - Experience with RBAC/security models in multi-tenant or enterprise environments.
    - Strong bias for action, ability to thrive in lean teams inside large organizations, and passion for delivering value with quality.
    - Natural mentor with excellent communication skills; collaborates across functions and knows when to push back.

    Compensation / Salary Range:
    The typical pay range for this role is USD $80,000/yearly - $100,000/yearly. Factors that may affect pay within or outside of this range may include but are not limited to geography/market, skills, education, experience, and other qualifications of the successful candidate.

    Benefits: Medical, dental, vision, life, PTO, holidays, 401(k) benefits and ancillaries may be available for eligible WCT employees and may vary depending on the nature of your employment.

    WCT will accept applications and process offers for these roles until the role is filled.

    Equal Employment Opportunity Declaration: WCT is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances.
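The posting prefers Temporal and the team codes in Kotlin; as a sketch of the orchestration primitive it describes (a workflow delegating a GenAI-style intent-parsing step to an activity), here is the equivalent shape in Temporal's Python SDK. It assumes `pip install temporalio` plus a running Temporal server and worker; all names and the toy intent logic are invented.

```python
# Sketch in Temporal's Python SDK; the Kotlin SDK has an analogous shape.
# Names and the placeholder intent logic are invented.
from datetime import timedelta

from temporalio import activity, workflow

@activity.defn
async def parse_intent(text: str) -> str:
    # Placeholder for a GenAI/NLP call that maps free text to a known
    # business process.
    return "open_account" if "account" in text.lower() else "unknown"

@workflow.defn
class IntakeWorkflow:
    @workflow.run
    async def run(self, request_text: str) -> str:
        # Activities get retries, timeouts, and durability from the platform;
        # the workflow itself stays deterministic.
        return await workflow.execute_activity(
            parse_intent,
            request_text,
            start_to_close_timeout=timedelta(seconds=30),
        )
```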
    $80k-100k yearly 3d ago
  • Java Software Engineer

    Coreai Consulting

    Data engineer job in Phoenix, AZ

    We are seeking a skilled back-end Senior Java Developer to join our development team. In this role, you will be responsible for designing, building, and maintaining the server-side logic, databases, and APIs of scalable web applications. The ideal candidate will have a strong background in Java development, excellent problem-solving abilities, and a passion for delivering high-performance back-end solutions.

    Qualifications
    - 6+ years of hands-on experience in Java development.
    - Back-end API development, full-stack development, and software development skills.
    - Strong knowledge of the Spring or Spring Boot framework for building back-end services.
    - Strong knowledge of microservices architecture and development.
    - Experience with RESTful API design and development.
    - Proficiency in working with databases (e.g., MySQL, PostgreSQL, Cassandra, Couchbase, and MongoDB).
    - Experience with monitoring tools such as Micrometer, Prometheus, Elastic, Kibana, Grafana, and Splunk.
    - Experience with cloud platforms (AWS or GCP).
    - Familiarity with version control tools like Git.
    - Knowledge of security best practices in back-end development.
    - Experience with Agile methodologies and working in a collaborative, fast-paced environment.
    - Understanding of containerization tools such as Docker or Kubernetes.
    - Good communication and teamwork skills are a must.
    $71k-98k yearly est. 4d ago

Learn more about data engineer jobs

How much does a data engineer earn in Sierra Vista, AZ?

The average data engineer in Sierra Vista, AZ earns between $68,000 and $126,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
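The page does not say how the $93,000 average is derived from the $68,000-$126,000 range. As a quick check on the page's own numbers, $93,000 is close to the geometric mean of the bounds rather than their midpoint:

```python
# Arithmetic check only -- the site's actual methodology is not stated.
low, high = 68_000, 126_000
print((low + high) / 2)     # midpoint: 97000.0
print((low * high) ** 0.5)  # geometric mean: ~92563.5
```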

Average data engineer salary in Sierra Vista, AZ

$93,000