Requirements engineer jobs in Mesa, AZ

- 548 jobs
  • Java Backend Engineer

    Hays 4.8 company rating

    Requirements engineer job in Phoenix, AZ

    Java Backend Developer (Vert.X & Spark good to have). We're looking for a strong Java engineer with experience in backend development and web technologies; Vert.X and Apache Spark experience is a plus. Key skills: Java, web technologies, Vert.X & Spark (nice to have), a team player with an Agile mindset. Hybrid work (3 days onsite).
    $90k-119k yearly est. 3d ago
  • ServiceNow IRM Engineer

    Henderson Scott

    Requirements engineer job in Phoenix, AZ

    ServiceNow IRM (Integrated Risk Management) Engineer. Circa $160,000. Experience Required: 5+ Years. Must-Have Technical/Functional Skills: ServiceNow IRM module implementation experience is a must. Must have worked on ServiceNow IRM CMDB and CSDM capabilities. Good to have: JavaScript, Angular. Preferred Qualifications/Certifications: ServiceNow Certified System Administrator (CSA). ServiceNow Certified Implementation Specialist - Risk and Compliance (GRC CIS). ServiceNow Certified Application Developer (CAD). Roles & Responsibilities: We are seeking a highly skilled and experienced ServiceNow IRM Developer to join our team. In this role, the associate will be instrumental in designing, developing, implementing, and maintaining robust Integrated Risk Management (IRM) solutions on the ServiceNow platform. The associate will work closely with business stakeholders, risk managers, compliance officers, and audit teams to translate business requirements into technical solutions that enhance our organization's risk posture and ensure regulatory adherence. Benefits: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing. Please note: we cannot offer sponsorship. If you feel this is a good fit and would like to find out more, I look forward to receiving your application.
    $160k yearly 3d ago
  • Quantexa Engineer

    Saxon Global 3.6 company rating

    Requirements engineer job in Phoenix, AZ

    Hello, greetings of the day! We have an exciting job opportunity for the position of Quantexa Engineer with one of our esteemed clients. Based on your experience, I believe this could be a great fit for you. If you're interested, please share your latest resume with me at ************************ Also, if you know someone who might be a good fit and is currently exploring opportunities, referrals are highly appreciated! Job Title: Quantexa Engineer. Location: Phoenix, AZ; New York, NY; Sunrise, FL - Day 1 onsite. Duration: Contract role. Skills: Scala; GCP; ETL - Big Data / Data Warehousing; Quantexa Certified. Job Summary: Good hands-on knowledge of development and configuration of Quantexa (***************** modules for batch and real-time entity resolution. Quantexa-certified professionals.
    $83k-106k yearly est. 4d ago
  • Azure Cloud Engineer

    Agreeya Solutions 4.3 company rating

    Requirements engineer job in Phoenix, AZ

    AgreeYa is a global systems integrator seeking an experienced Azure Cloud Engineer to join our growing team. Join our dynamic team as a Level 2 Azure Cloud Engineer, where you will be the go-to technical resource driving the reliability, performance, and security of enterprise-scale networks and cloud environments. You'll troubleshoot complex issues, lead system upgrades, perform proactive health checks, and collaborate closely with infrastructure, security, and operations teams to keep mission-critical systems running smoothly. Job Responsibilities: Design, build, and maintain Azure-based cloud infrastructure, including VMs, VNets, storage, load balancers, and other core cloud services. Implement Infrastructure-as-Code (IaC) using Terraform, Bicep, or ARM templates. Migrate on-premises workloads and applications to Azure with minimal downtime. Build and maintain CI/CD pipelines using Azure DevOps, GitHub Actions, or similar tools. Automate operational tasks using PowerShell, Bash, or Python. Implement and support Azure DevOps pipelines, artifacts, and release management processes. Required Skills & Experience: Experience working as a Cloud Engineer or in a similar role, with a strong focus on Microsoft Azure services. Experience with automation and scripting using PowerShell, Bash, or Python. Solid experience with Azure DevOps pipelines, CI/CD workflows, repositories, artifacts, and release management. Preferred Skills & Experience: AZ-104: Azure Administrator Associate. AZ-305: Azure Solutions Architect Expert. AZ-400: DevOps Engineer Expert. Education Required: Bachelor's degree in Computer Science, Information Technology, Engineering, or equivalent practical experience.
    $80k-103k yearly est. 3d ago
  • Quantexa Engineer

    Coelate Technologies

    Requirements engineer job in Phoenix, AZ

    Skills: Scala; GCP; ETL - Big Data / Data Warehousing; Quantexa Certified. Responsibilities: Good hands-on knowledge of development and configuration of Quantexa modules for batch and real-time entity resolution. Quantexa-certified professionals. Required Skills: Skills related to Quantexa development and configuration.
    $76k-106k yearly est. 4d ago
  • Azure Cloud Engineer

    Recurring Decimal

    Requirements engineer job in Phoenix, AZ

    This is not a DevOps engineer role, but a Cloud Engineer role. Skills & Qualifications: Required: Advanced Azure certifications. Ten-plus years of experience designing, building, and implementing distributed cloud architecture within Azure environments (APIM, APIOps, Cosmos DB, Event Hub, AKS, etc.). Terraform experience at a 7-8/10 level is a must, not limited to existing templates: building complex cloud infrastructure from scratch, writing modules, following best practices, managing state files, etc. Understanding of Linux fundamentals; Windows welcome. Understanding of security scan tools (e.g., Azure Defender, container scanning, API security, perimeter scans) is much desired. Experience with the full DevOps cycle spanning cloud infrastructure, cloud security, observability, CI/CD, secure CI/CD, ProdOps, etc. Experience designing and implementing large-scale platforms with high resiliency, availability, and reliability using public cloud infrastructure. Experience implementing security architecture and best practices in cloud infrastructure. Strong verbal and written skills, with the ability to communicate clearly.
    $76k-106k yearly est. 2d ago
  • AEM Engineer

    Cozen Technology Solutions Inc.

    Requirements engineer job in Phoenix, AZ

    We are seeking a skilled Adobe Engineer with deep, hands-on experience in Adobe Experience Platform (AEP) to join our growing team in Phoenix, AZ. The ideal candidate will have strong technical expertise in Adobe Journey Optimizer (AJO) and Adobe Real-Time Customer Data Platform (RTCDP), along with a solid understanding of data integration, customer journeys, and personalized experience orchestration. Qualifications: Professional experience in the Adobe platform. Adobe Certified Expert in Adobe Experience Platform or related Adobe products. Familiarity with data privacy and compliance regulations (GDPR, CCPA). Experience in Agile development environments. Preferred Skills: Hands-on experience with the core components of Adobe Experience Platform, including AJO and RTCDP. Strong understanding of customer data platforms (CDPs), real-time segmentation, and journey orchestration. Experience integrating first-party, second-party, and third-party data into AEP. Proficiency in working with JSON, REST APIs, and data ingestion pipelines. Knowledge of Adobe Tags, Adobe Analytics, and Adobe Target is a plus. Ability to translate business requirements into scalable technical solutions. Excellent communication and collaboration skills.
    $76k-106k yearly est. 4d ago
  • GCP engineer with Bigquery, Pyspark

    Tata Consultancy Services 4.3 company rating

    Requirements engineer job in Phoenix, AZ

    Job Title: GCP Engineer with BigQuery, PySpark. Experience Required: 7+ Years. Must-Have Technical/Functional Skills: GCP Engineer with BigQuery, PySpark, and Python experience. Roles & Responsibilities: 6+ years of professional experience with at least 4+ years of GCP data engineering experience. Experience working on GCP application migration for large enterprises. Hands-on experience with Google Cloud Platform (GCP). Extensive experience with ETL/ELT tools and data transformation frameworks. Working knowledge of data storage solutions like BigQuery or Cloud SQL. Solid skills in data orchestration tools like Airflow or Cloud Workflows. Familiarity with Agile development methods. Hands-on experience with Spark, Python, and PySpark APIs. Knowledge of various shell scripting tools. Salary Range: $90,000 to $120,000 per year. Interested candidates, please share your updated resume to ******************* TCS Employee Benefits Summary: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
    $90k-120k yearly 4d ago
  • Senior Data Engineer

    Addison Group 4.6 company rating

    Requirements engineer job in Phoenix, AZ

    Job Title: Sr. Data Engineer. Job Type: Full Time. Compensation: $130,000 - $150,000 D.O.E.; eligible for medical, dental, vision, and life insurance coverage, and PTO. Senior Data Engineer. ROLE OVERVIEW: The Senior Data Engineer is responsible for designing, building, and maintaining scalable data platforms that support analytics, reporting, and advanced data-driven initiatives. This is a hands-on engineering role focused on developing reliable, high-performing data solutions while contributing to architectural standards, data quality, and governance practices. The ideal candidate has strong experience with modern data architectures, data modeling, and pipeline development, and is comfortable collaborating across technical and business teams to deliver trusted, production-ready datasets. KEY RESPONSIBILITIES: Design and maintain data models across analytical and operational use cases to support reporting and advanced analytics. Build and manage data pipelines that ingest, transform, and deliver structured and unstructured data at scale. Contribute to data governance practices, including data quality controls, metadata management, lineage, and stewardship. Develop and maintain cloud-based data platforms, including data lakes, analytical stores, and curated datasets. Implement and optimize batch and near-real-time data ingestion and transformation processes. Support data migration and modernization efforts while ensuring accuracy, performance, and reliability. Partner with analytics, engineering, and business teams to understand data needs and deliver high-quality solutions. Enable reporting and visualization use cases by providing clean, well-structured datasets for downstream tools. Apply security, privacy, and compliance best practices throughout the data lifecycle. Establish standards for performance tuning, scalability, reliability, and maintainability of data solutions. Implement automation, testing, and deployment practices to improve data pipeline quality and consistency. QUALIFICATIONS: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent professional experience. 5+ years of experience in data engineering or related roles. Strong hands-on experience with data modeling, schema design, and pipeline development; cloud-based data platforms and services; and data ingestion, transformation, and optimization techniques. Familiarity with modern data architecture patterns, including lakehouse-style designs and governance frameworks. Experience supporting analytics, reporting, and data science use cases. Proficiency in one or more programming languages commonly used in data engineering (e.g., Python, SQL, or similar). Solid understanding of data structures, performance optimization, and scalable system design. Experience integrating data from APIs and distributed systems. Exposure to CI/CD practices and automated testing for data workflows. Familiarity with streaming or event-driven data processing concepts preferred. Experience working in Agile or iterative delivery environments. Strong communication skills with the ability to document solutions and collaborate across teams.
    $130k-150k yearly 4d ago
  • Data Engineer

    Interactive Resources-IR 4.2 company rating

    Requirements engineer job in Tempe, AZ

    About the Role: We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions. What We're Looking For: 8+ years designing and delivering scalable data pipelines in modern data platforms. Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery. Ability to lead cross-functional initiatives in matrixed teams. Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning. Hands-on experience with Azure, Snowflake, and Databricks, including system integrations. Key Responsibilities: Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform. Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD. Use Apache Airflow and similar tools for workflow automation and orchestration. Work with financial or regulated datasets while ensuring strong compliance and governance. Drive best practices in data quality, lineage, cataloging, and metadata management. Primary Technical Skills: Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks. Design efficient Delta Lake models for reliability and performance. Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing. Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables. Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems. Automate ingestion and workflows using Python and REST APIs. Support downstream analytics for BI, data science, and application workloads. Write optimized SQL/T-SQL queries, stored procedures, and curated datasets. Automate DevOps workflows, testing pipelines, and workspace configurations. Additional Skills: Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions. CI/CD: Azure DevOps. Orchestration: Apache Airflow (plus). Streaming: Delta Live Tables. MDM: Profisee (nice-to-have). Databases: SQL Server, Cosmos DB. Soft Skills: Strong analytical and problem-solving mindset. Excellent communication and cross-team collaboration. Detail-oriented with a high sense of ownership and accountability.
    $92k-122k yearly est. 2d ago
  • Sr Bigdata engineer

    E-Solutions 4.5 company rating

    Requirements engineer job in Scottsdale, AZ

    Sr Bigdata Developer, Scottsdale, AZ. Must have: 10-12 years of experience. Strong experience in Scala, Spark, Hive SQL, Hadoop, and Kafka. Proficiency in Hive and SQL optimization. Understanding of distributed systems and big data architecture. Knowledge of streaming frameworks (Spark Streaming, Kafka Streams). Good to have: Aerospike experience.
    $103k-144k yearly est. 5d ago
  • Data Engineer (GIS)

    Impact Technology Recruiting 4.5 company rating

    Requirements engineer job in Scottsdale, AZ

    About the Role: We're partnering with a large, operations-focused organization to hire a Data Scientist (GIS) to support analytics initiatives within their operations function. This role applies geospatial data and advanced analytics to help improve operational efficiency, service reliability, and planning decisions. The work is highly analytical and engineering-focused, with models built directly in Snowflake and used as inputs into downstream optimization and planning systems. What You'll Work On: Geospatial Modeling & Time Estimation: Develop data-driven models to estimate operational timing across different service and facility interactions. Leverage GPS data and geofencing techniques to understand behavior across locations. Incorporate contextual variables such as geography and location characteristics, customer and service attributes, and site complexity and external conditions (e.g., weather, time-based patterns). Produce reliable, explainable time estimates that support planning and decision-making. Facility & Location Analytics: Model turnaround and processing time across different types of locations. Analyze performance variability based on operational and environmental factors. Apply polygon- and radius-based geofencing to capture location-specific behavior. Quantify how conditions impact operational flow and timing outcomes. Technical Environment: Primary development and modeling in Snowflake; build and engineer transformations and analytical processes directly in Snowflake. Modeling approaches may include percentile-based time estimates and aggregations such as averages and medians by service and location attributes. Data sources include latitude/longitude data, high-frequency GPS signals, and location and facility reference data. What We're Looking For: Strong hands-on experience with Snowflake. Advanced SQL skills. Python for analytics and data engineering. Solid understanding of core GIS concepts, including spatial joins, polygons, and geofencing. Experience with traditional GIS tools (e.g., ArcGIS) is a plus, but this is not a cartography or visualization-focused role. A background in geospatial data engineering and modeling is key. Interview Process: Two one-hour video interviews.
    $90k-128k yearly est. 4d ago
  • DevOps Engineer

    Tech One It 3.9 company rating

    Requirements engineer job in Scottsdale, AZ

    Overall Purpose: This position designs, develops, tests, and maintains infrastructure as code, CI/CD patterns, configuration management, and containerized product applications, providing technical leadership and hands-on support for internal systems. Essential Functions: Design, develop, document, test, and debug new and existing configuration management patterns and infrastructure as code. Design, create, and maintain comprehensive policies and technical documentation of best practices for all implemented system configurations, ensuring efficient planning and execution. Perform requirements analysis and design a model for infrastructure and application flow. Conduct design meetings and analyze user needs to determine technical requirements. Write technical specifications (based on conceptual design and business requirements). Identify and evaluate new technologies for implementation. Recommend and implement changes to existing hardware and operating system infrastructure, including patches, users, file systems, and kernel parameters. Seek out and implement new technologies to continually simplify the environment while improving security and performance. Analyze results, failures, and bugs to determine the causes of errors, and tune the automation pipeline to fix problems and achieve the desired outcome. Diagnose and resolve hardware-related server problems (failed disks, network cards, CPU, memory, etc.), act as an escalation point to troubleshoot hardware and operating system problems, and suggest possible performance tuning. Consult with end users to prototype, refine, test, and debug programs to meet needs. Proactively monitor the health of environments, fix any issues, and improve performance. Coach and mentor staff on team policies, procedures, use cases, and best patterns. Support and maintain products and add new features. Participate in and follow change management processes for change implementation.
    Support the company's commitment to risk management and protecting the integrity and confidentiality of systems and data. For Kubernetes Focus Only: Design/implement a container orchestration platform in a hybrid cloud environment. Ensure that the container orchestration platform is regularly maintained and released to production without downtime. For Cloud Focus Only: Lead infrastructure-as-code projects, designing APIs and building tools to be used by engineering teams for reliable and repeatable cloud deployments. Implement abstractions to simplify the complexities of cloud providers (AWS), open-source technologies (Kubernetes), and internal EWS infrastructure. Obsess about the usability of the systems you build, allowing engineers to have an intuitive and predictable experience working with infrastructure at scale. Troubleshoot complex infrastructure problems, often spanning multiple layers of the stack and requiring work with multiple teams. Experience designing cloud infrastructure for robustness, security, and observability. Expertise in infrastructure-as-code tools such as Terraform, Ansible, and continuous deployment pipelines. Expertise in AWS foundations, including compute, networking, storage, observability, and security. Experience automating AWS services using Terraform and Ansible.
    Experience in highly scalable distributed datacenter or cloud computing systems (AWS, Azure, VM). Strong knowledge of AWS services (EC2, IAM, ELB, Route53, S3, Lambda, CloudFormation, DynamoDB). Experience architecting Kubernetes-based systems. Container orchestration: Kubernetes, TKGi, EKS, ECS. Proficient with using and debugging networks, DNS, HTTP, TLS, load balancing, build systems, Linux, and Docker. Experience building CI/CD pipelines. Experience building and scaling workflow pipelines. Experience in data center operations, monitoring, alerting, and notifications. Minimum Qualifications: Education and/or experience typically obtained through completion of a Bachelor's Degree in Computer Science or equivalent certifications. Minimum of 7 years of related experience. Demonstrated prior DevOps, software engineering, or related experience. Ability to work on multiple projects and a general understanding of software environments and network topologies. Able to facilitate technical design sessions. Minimum of 3 years of experience in modern application design patterns. Solid understanding of an iterative software development process. Ability to use Linux administration command-line programs and create/edit scripts. Knowledge of one or more of the tools: Chef, Ansible, Puppet. Knowledge of one or more of: IaC, containerization, and orchestration (Terraform, Docker & Kubernetes). Experienced with security and encryption protocols. Knowledge of one of the cloud infrastructure providers: AWS, GCP, or Azure. Must be able to work different schedules as part of an on-call rotation. Background check and drug screen. Preferred: Certification in Terraform, AWS, and Kubernetes. AWS, Azure (and/or other cloud-based) certification(s) strongly preferred. Interviews: 3 virtual interviews, then 1 final onsite. Start Date: Jan/early Feb.
    $92k-119k yearly est. 3d ago
  • Data Engineer

    Stelvio Inc.

    Requirements engineer job in Phoenix, AZ

    Hybrid - 2-3 days on site in Phoenix, AZ. We're looking for a Data Engineer to help build the cloud-native data pipelines that power critical insights across our organization. You'll work with modern technologies, solve real-world data challenges, and support analytics and reporting systems that drive smarter decision-making in the transportation space. What You'll Do: Build and maintain data pipelines using Databricks, Azure Data Factory, and Microsoft Fabric. Implement incremental and real-time ingestion using the medallion architecture. Develop and optimize complex SQL and Python transformations. Support legacy platforms (SSIS, SQL Server) while contributing to modernization efforts. Troubleshoot data quality and integration issues. Participate in proofs of concept and recommend technical solutions. What You Bring: 5+ years designing and building data solutions. Strong SQL and Python skills. Experience with ETL pipelines and data lake architecture. Ability to collaborate and adapt in a fast-moving environment. Preferred: Azure services, cloud ETL tools, Power BI/Tableau, event-driven systems, NoSQL databases. Bonus: Experience with Data Science or Machine Learning. Benefits: Medical, dental, and vision from day one · PTO & holidays · 401(k) with match · Lifestyle account · Tuition reimbursement · Voluntary benefits · Employee Assistance Program · Well-being & culture programs · Professional development support
    $80k-111k yearly est. 3d ago
  • Senior Data Engineer (PySpark / Python) (Only USC or GC on W2)

    Ampstek

    Requirements engineer job in Phoenix, AZ

    Job Title: Senior Data Engineer (PySpark / Python). Employment Type: Contract. Must-Have Skills: PySpark, Python development, data engineering. Hands-on knowledge of PySpark, Hadoop, Python, and GitHub. Backend API integration knowledge (JSON, REST). Certifications Needed: No (GCP certification good to have). Top 3 responsibilities you would expect the subcontractor to shoulder and execute: Individual contributor. Strong development experience and leading a dev module. Work with the client directly.
    $80k-111k yearly est. 4d ago
  • Data Governance Engineer

    Centraprise

    Requirements engineer job in Phoenix, AZ

    Job Title: Data Governance Engineer. Phoenix, AZ - Complete Onsite. Full-Time Permanent. Experience Required: 6+ Years. Must-Have Technical/Functional Skills: Understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) and prior experience. 2-5 years of Data Quality Management experience. Intermediate competency in SQL and Python or a related programming language. Strong familiarity with data architecture and/or data modeling concepts. 2-5 years of experience with Agile or SAFe project methodologies. Roles & Responsibilities: Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention, Data Quality, Data Movement, Data Stewardship, Data Protection, and Data Sharing, among others. Identify data quality issues, perform root-cause analysis of data quality issues, and drive remediation of audit and regulatory feedback. Develop a deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for the business. Responsible for holistic platform data quality monitoring, including but not limited to critical data elements. Collaborate with and influence product managers to ensure all new use cases are managed according to policies. Influence and contribute to strategic improvements to data assessment processes and analytical tools. Responsible for monitoring data quality issues, communicating issues, and driving resolution. Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, and technology teams. Subject matter expertise on multiple platforms. Responsible for partnering with the Data Steward Manager in developing and managing the data compliance roadmap.
    Generic Managerial Skills, if any: Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments and recommends tailored solutions. Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals; accountable for bringing in ideas, information, suggestions, and expertise from others outside and inside the immediate team. Communication: Influences and holds others accountable and has the ability to convince others. Identifies specific data governance requirements and is able to communicate clearly and in a compelling way.
    $80k-111k yearly est. 3d ago
  • AI Data Engineer

    Echelix

    Requirements engineer job in Phoenix, AZ

    Echelix is a leading AI consulting company helping businesses design, build, and scale intelligent systems. We partner with organizations to make artificial intelligence practical, powerful, and easy to adopt. Our team blends deep technical skill with real-world business sense to deliver AI that drives measurable results. The Role We're looking for a Senior Data Engineer to architect, optimize, and manage database systems that power AI-driven solutions and enterprise applications. You'll lead the design of scalable, secure, and high-performance data infrastructure across cloud platforms, ensuring our clients' data foundations are built for the future. This role is ideal for database professionals who have evolved beyond traditional DBA work into cloud-native architectures, API-driven data access layers, and modern DevOps practices. You'll work with cutting-edge technologies like GraphQL, Hasura, and managed cloud databases while mentoring engineers on data architecture best practices. 
    What You'll Do: Design, tune, and manage PostgreSQL, SQL Server, and cloud-managed databases (AWS RDS/Aurora, Azure SQL Database/Cosmos DB). Architect and implement GraphQL APIs using Hasura or equivalent technologies for real-time data access. Lead cloud database migrations and deployments across AWS and Azure environments. Automate database CI/CD pipelines using tools like GitHub Actions, Azure DevOps, or AWS CodePipeline. Develop and maintain data access layers and APIs that integrate with AI and application workloads. Monitor, secure, and optimize database performance using cloud-native tools (AWS CloudWatch, Azure Monitor, Datadog). Implement database security best practices, including encryption, access controls, and compliance requirements. Mentor engineers on database design, data modeling, and architecture best practices. Requirements: 5+ years of experience designing and managing production database systems. Deep expertise in PostgreSQL and SQL Server, including performance tuning and query optimization. Hands-on experience with cloud database services (AWS RDS, Aurora, Azure SQL Database, Azure Cosmos DB). Experience with GraphQL and API development, preferably with Hasura or similar platforms. Strong background in database CI/CD automation and Infrastructure as Code (Terraform, CloudFormation, Bicep). Proficiency in scripting languages (Python, Bash) for automation and tooling. Solid understanding of data modeling, schema design, and database normalization. Strong communication and mentoring skills. US citizen and must reside in the United States. Nice to Have: Experience with NoSQL databases (MongoDB, DynamoDB, Redis). Knowledge of data streaming platforms (Kafka, AWS Kinesis, Azure Event Hubs). Experience with data warehousing solutions (Snowflake, Redshift, Azure Synapse). Background in AI/ML data pipelines and feature stores. Relevant certifications (AWS Database Specialty, Azure Database Administrator, PostgreSQL Professional). Why Join Echelix: You'll join a fast-moving team that's shaping how AI connects people and data. We value curiosity, precision, and practical innovation. You'll work on real projects with real impact, not just proofs of concept.
    $80k-111k yearly est. 22h ago
  • Data Engineer

    Mastek

    Requirements engineer job in Phoenix, AZ

Hi, we have a job opportunity for a Data Analyst / Data Engineer role.
Expectations: Our project is data-analysis heavy, and we are looking for someone who can grasp business functionality and translate it into working technical solutions.
Job location: Phoenix, Arizona
Type: Hybrid model (3 days a week in office)
Job Description: Data Analyst / Data Engineer (6+ years of relevant experience with the required skill set)
Summary: We are seeking a Data Analyst Engineer with a minimum of 6 years in data engineering, data analysis, and data design. The ideal candidate will have strong hands-on expertise in Python and relational databases such as Postgres, SQL Server, or MySQL, along with a good understanding of data modeling theory and normalization forms.
Required Skills:
6+ years of experience in data engineering, data analysis, and data design.
Good proficiency in Python.
Strong experience with relational databases: Postgres, SQL Server, or MySQL.
Expertise in writing complex SQL queries and optimizing database performance.
Solid understanding of data modeling theory and normalization forms.
Good communicator with the ability to articulate business problems for technical solutions.
Screening Questions:
Describe your approach to data analysis in your previous/current role, and the methods or techniques you used to extract insights from large datasets.
Do you have any formal training or education in data modeling? If so, please provide details about the course, program, or certification you completed, including when you received it.
What are the essential factors that contribute to a project's success, and how do you plan to leverage your skills and expertise to ensure our project meets its objectives?
Key Responsibilities:
Analyze complex datasets to derive actionable insights and support business decisions.
Model data solutions for high performance and reliability.
Work extensively with Python for data processing and automation.
Develop and optimize SQL queries for Postgres, SQL Server, or MySQL databases.
Ensure data integrity, security, and compliance across all data solutions.
Collaborate with cross-functional teams to understand data requirements and deliver solutions.
Communicate effectively with stakeholders and articulate business problems to drive technical solutions.
Secondary Skills:
Experience deploying applications in Kubernetes.
API development using FastAPI or Django.
Familiarity with containerization (Docker) and CI/CD tools.
Regards, Suhas Gharge
    $80k-111k yearly est. 3d ago
  • ORACLE CLOUD DATA ENGINEER

    Wise Skulls

    Requirements engineer job in Phoenix, AZ

Hiring: Oracle Cloud Data Engineer / Technology Lead
We're looking for a hands-on Oracle Cloud Data Engineer (Technology Lead) to drive OCI-based data engineering and Power BI analytics initiatives. This role combines technical leadership with active development in a high-impact data program.
Location: Phoenix, AZ (Hybrid)
Duration: 6+ Months (Contract)
Work Authorization: USC & Green Card holders ONLY (Strict Requirement)
Job Summary
This role focuses on building scalable data pipelines on Oracle Cloud Infrastructure (OCI) while leading Power BI dashboard and reporting development. You'll apply Medallion Architecture, enforce data governance, and collaborate closely with business stakeholders. Utility industry experience is a strong plus.
Must-Have (Non-Negotiable) Skills
8-10 years of experience in Data Engineering & Business Intelligence.
3+ years of hands-on OCI experience.
Strong expertise in OCI Data Services, including OCI Data Integration, OCI Data Flow, OCI Streaming, Autonomous Data Warehouse, Oracle Exadata, and OCI Object Storage.
Hands-on experience with Medallion Architecture (Bronze, Silver, Gold layers).
Power BI expertise: dashboards, reports, DAX, Power Query, data modeling, RLS.
Strong coding skills in SQL, PL/SQL, and Python.
Experience with Terraform, Ansible, and CI/CD pipelines.
Bachelor's or Master's degree in a related field.
Power BI certification (required).
Hands-on development is mandatory.
Key Responsibilities
Design and implement secure, scalable OCI data pipelines.
Lead Power BI dashboard and reporting development.
Build inbound/outbound integration patterns (APIs, files, streaming).
Implement Audit, Balance, and Control (ABC) frameworks.
Ensure data quality, governance, lineage, and monitoring.
Mentor engineers and BI developers.
Drive agile delivery and stakeholder collaboration.
📩 Interested? Apply now or DM us to explore this opportunity! You can share resumes at ********************* or call us on *****************
    $80k-111k yearly est. 1d ago
  • DevOps Engineer

    The Judge Group 4.7company rating

    Requirements engineer job in Chandler, AZ

Build Tools: Proficiency in build automation tools such as Make, Maven, Gradle, or Ant.
Continuous Integration/Continuous Deployment (CI/CD): Experience with CI/CD tools like Jenkins or GitLab CI.
Version Control Systems: Strong knowledge of version control systems, particularly Git, including branching strategies and workflows.
Scripting Languages: Proficiency in scripting languages such as Bash, Python, or Ruby for automating build processes.
Containerization: Familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes.
Static and Dynamic Analysis Tools: Understanding of tools for code quality and security analysis (e.g., SonarQube, Valgrind).
Programming Languages: Knowledge of programming languages relevant to the projects (e.g., C/C++, Python).
Preferred Qualifications
Experience in managing large data sets.
Parallel Computing: Familiarity with parallel programming models like MPI (Message Passing Interface), OpenMP, and CUDA for GPU-based computing.
Performance Optimization: Skills in profiling and optimizing code for better performance on HPC systems (e.g., using tools like Gprof, Valgrind, or Intel VTune).
Storage Architecture Knowledge: Understanding of file systems such as Lustre, GPFS, or HDFS, and strategies for efficient data storage and retrieval in HPC environments.
Distributed Computing Tools: Familiarity with frameworks such as Hadoop, Spark, or Dask for handling distributed datasets.
Education and Experience
A bachelor's degree in Computer Science, Software Engineering, or a related field.
Proven experience in software build management, DevOps, or continuous integration roles (typically 3+ years).
    $87k-115k yearly est. 22h ago

Learn more about requirements engineer jobs

How much does a requirements engineer earn in Mesa, AZ?

The average requirements engineer in Mesa, AZ earns between $65,000 and $123,000 annually. This compares to the national average requirements engineer range of $62,000 to $120,000.

Average requirements engineer salary in Mesa, AZ

$90,000

What are the biggest employers of Requirements Engineers in Mesa, AZ?

The biggest employers of Requirements Engineers in Mesa, AZ are:
  1. Exyte Group
  2. Stantec
  3. Boeing
  4. Bank of America
  5. Deloitte
  6. Highgate Hotels
  7. Jeppesen
  8. City of Mesa
  9. Career Mentors
  10. Champions Funding LLC