
Data analyst jobs in Gilbert, AZ

- 682 jobs
  • Data Analyst

    Intraedge (3.9 company rating)

    Data analyst job in Phoenix, AZ

    6+ years of experience in data sourcing, financial data analytics, or regulatory reporting. Prior experience in large banking or financial institutions preferred.

    - Regulatory Reporting Expertise: Strong understanding of US regulatory reports, including Basel, Y-9C, 2052a, and Call Reports, and hands-on experience supporting data sourcing for these reports by working with finance and reporting teams.
    - Data Sourcing & Profiling: Proven experience identifying required data elements across enterprise data warehouses, plus the ability to profile large datasets, assess data quality, and identify missing or inconsistent attributes.
    - SQL and BigQuery Proficiency: Advanced SQL skills to query and validate data across multiple platforms. Strong experience with Google BigQuery or similar cloud-based analytics tools.
    - Source-to-Target (S2T) Mapping Documentation: Experience creating detailed S2T mapping documents, including data hops, transformation logic, and attribute lineage. Ability to define defaulting logic for data gaps and align mappings to regulatory reporting needs.
    - Stakeholder Collaboration: Experience working with finance product owners, technology, data stewards, and governance teams to finalize source tables and mappings. Skilled at coordinating approvals and documenting decisions around data usage.
    - Data Controls and Governance Awareness: Understanding of data control points, validation rules, and audit readiness. Experience contributing to the design of automated controls for completeness and accuracy in reporting data.
    - Documentation and Communication: Ability to clearly document data sourcing logic, assumptions, and business rules. Strong written and verbal communication skills for walkthroughs and stakeholder updates.
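
    For context, here is a minimal sketch of the kind of data-profiling check this posting describes, using the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical placeholders, not this employer's actual systems.

```python
# Data-profiling sketch for BigQuery; needs `pip install google-cloud-bigquery`.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-bank-project")  # hypothetical project

# Profile one candidate source column (row count, null rate, distinct values):
# the kind of check used to flag missing or inconsistent attributes.
query = """
    SELECT
        COUNT(*)                          AS total_rows,
        COUNTIF(counterparty_id IS NULL)  AS null_rows,
        COUNT(DISTINCT counterparty_id)   AS distinct_values
    FROM `my-bank-project.finance_dw.exposures`
"""

for row in client.query(query).result():
    null_rate = row.null_rows / row.total_rows if row.total_rows else 0.0
    print(f"rows={row.total_rows}, null_rate={null_rate:.2%}, distinct={row.distinct_values}")
```
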
    $63k-83k yearly est. 3d ago
  • Data Analyst

    Mi-Case

    Data analyst job in Phoenix, AZ

    Full Time, Hybrid (3 days a week on site)

    About Us: Join us at Mi-Case, where we're at the forefront of developing innovative public safety products and solutions. We take pride in delivering fully integrated software and exceptional client support, making a real impact in communities. As part of our team, you'll collaborate with passionate, talented colleagues and industry experts who are deeply committed to solving the unique challenges faced by our clients in the public safety sector. Together, we're replacing outdated systems with cutting-edge, mobile-ready solutions that empower our clients to enhance public safety and achieve their goals.

    Job Description: We are seeking a skilled Data Analyst to analyze legacy data sources and Mi-Case data requirements, and to develop detailed data mapping specifications. In this role, strong communication skills are essential for effectively conveying data requirements, identifying source data locations, and defining conversion rules. Proficiency in SQL and data analysis is required, as well as the ability to query data and provide thorough analysis. As a detail-oriented professional, you will be responsible for documenting element-level data conversion specifications and communicating progress and requirements clearly to key stakeholders, including the Project Manager, client system Subject Matter Experts, application developers, data engineers, and quality analysts. You will focus on gathering, analyzing, and mapping data for migration purposes, ensuring data quality and integrity throughout the process. Your contributions will be vital to the successful transition of data from legacy systems to new platforms.

    Role:
    - Collaborate with Mi-Case Application Developers, Legacy System Subject Matter Experts (SMEs), and stakeholders to gather, document, and analyze data migration requirements.
    - Analyze and map data structures from legacy systems to target platforms, making certain that data mapping specifications and transformation rules are precise.
    - Document the steps for data transformation, cleansing, and validation to uphold high standards of data quality and integrity.
    - Track migration requirements, mapping rules, and progress through tools like DevOps, keeping stakeholders fully informed.
    - Serve as the main point of contact for clients regarding data migration queries, delivering clear and consistent communication to manage expectations and keep the process aligned.
    - Log migration errors and any records that weren't transferred, and investigate these issues to resolve them for future migration runs.
    - Offer ongoing support to Project Managers, SMEs, developers, data engineers, and quality analysts by regularly communicating migration progress and updates, and by troubleshooting any issues that arise.

    Minimum Years of Experience: 7

    Education and Skills:
    - 4+ years of experience in data analysis or a related field, with a focus on data migration projects.
    - Bachelor's degree in Computer Science, Information Technology, or a related field.
    - Experience with data analysis, mapping, and transformation processes.
    - Proficient in SQL and database querying for data extraction and validation.
    - Excellent communication skills and ability to work collaboratively with cross-functional teams.
    - Detail-oriented with a strong commitment to data quality.
    - Strong analytical and problem-solving skills.
    - Experience with data migration tools and methodologies (preferred).

    Domain: Department of Corrections Technologies (Offender Management System)

    Nice to Have:
    - Knowledge of data governance and compliance best practices.
    - Familiarity with ETL tools.
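
    To illustrate the element-level validation work described above, a minimal sketch of a source-to-target row-count reconciliation using SQLAlchemy; the connection URLs, table names, and driver choices are hypothetical placeholders, not Mi-Case's actual stack.

```python
# Source-to-target reconciliation sketch; needs `pip install sqlalchemy` plus
# database drivers. Connection URLs, tables, and schemas are hypothetical.
from sqlalchemy import create_engine, text

legacy = create_engine("mssql+pyodbc://legacy_dsn")           # hypothetical legacy system
target = create_engine("postgresql://user:pw@host/mi_case")   # hypothetical target platform

def row_count(engine, table: str) -> int:
    """Count rows in a table: a first-pass completeness check after a migration run."""
    with engine.connect() as conn:
        return conn.execute(text(f"SELECT COUNT(*) FROM {table}")).scalar_one()

source_rows = row_count(legacy, "dbo.offender")
target_rows = row_count(target, "public.offender")

# Log the delta so unmigrated records can be investigated before the next run.
print(f"source={source_rows}, target={target_rows}, missing={source_rows - target_rows}")
```
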
    $53k-78k yearly est. 2d ago
  • Business Analyst

    Impact Technology Recruiting (4.5 company rating)

    Data analyst job in Scottsdale, AZ

    Title: Business Analyst
    Duration: 12-14 Months+
    Must Have Skills: Order to Cash, Business Analysis

    POSITION SUMMARY: The Business Analyst is responsible for supporting the Order to Cash reporting workstream as we approach System Integration Testing (SIT) through User Acceptance Testing (UAT), Training, and Deployment phases. This role will primarily focus on testing activities, including development of test scenarios, test cases, and test steps, while serving as a liaison between the business and IT department. The Business Analyst ensures that testing requirements and results are clearly documented, communicated, and validated to support the successful implementation of the Order to Cash system.

    PRINCIPAL RESPONSIBILITIES:

    Testing Support
    - Develop comprehensive test scenarios, test cases, and detailed test steps for the Order to Cash reporting workstream
    - Execute test cases during SIT and support business users during SIT & UAT execution
    - Document and track defects, working closely with development teams to ensure timely resolution
    - Validate fixes and conduct regression testing as needed
    - Prepare test summary reports and communicate testing progress to stakeholders

    Implementation Support
    - Support training material development and delivery for end-users
    - Assist with cutover planning and execution activities
    - Provide post-implementation support to address user queries and issues

    Business Analysis
    - Coordinate with business stakeholders to validate that test scenarios cover all critical business processes
    - Create and maintain detailed documentation of testing requirements and results
    - Facilitate communication between technical teams and business users during testing cycles
    - Support quality assurance efforts using data analysis/profiling during pre- and post-implementation reviews
    - Collaborate with technical teams to ensure reporting solutions meet business requirements within Oracle ERP
    $65k-94k yearly est. 2d ago
  • Data Scientist

    The Intersect Group (4.2 company rating)

    Data analyst job in Phoenix, AZ

    We are seeking a Data Scientist to support advanced analytics and machine learning initiatives across the organization. This role involves working with large, complex datasets to uncover insights, validate data integrity, and build predictive models. A key focus will be developing and refining machine learning models that leverage sales and operational data to optimize pricing strategies at the store level.

    Day-to-Day Responsibilities
    - Compare and validate numbers across multiple data systems
    - Investigate discrepancies and understand how metrics are derived
    - Perform data science and data analysis tasks
    - Build and maintain AI/ML models using Python
    - Interpret model results, fine-tune algorithms, and iterate based on findings
    - Validate and reconcile data from different sources to ensure accuracy
    - Work with sales and production data to produce item-level pricing recommendations
    - Support ongoing development of a new data warehouse and create queries as needed
    - Review Power BI dashboards (Power BI expertise not required)
    - Contribute to both ML-focused work and general data science responsibilities
    - Improve and refine an existing ML pricing model already in production

    Qualifications
    - Strong proficiency with MS SQL Server
    - Experience creating and deploying machine learning models in Python
    - Ability to interpret, evaluate, and fine-tune model outputs
    - Experience validating and reconciling data across systems
    - Strong foundation in machine learning, data modeling, and backend data operations
    - Familiarity with querying and working within evolving data environments
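
    As a rough illustration of the item-level pricing modeling this role describes, a minimal scikit-learn sketch; the feature names, target column, and data file are hypothetical placeholders.

```python
# Item-level pricing model sketch; needs `pip install scikit-learn pandas`.
# Feature names, target column, and the CSV extract are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("store_item_sales.csv")  # hypothetical sales/production extract
features = ["store_id", "item_cost", "units_sold_4wk", "competitor_price"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["realized_price"], test_size=0.2, random_state=42
)

model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

# The "interpret and fine-tune" loop: validate outputs before recommendations ship.
preds = model.predict(X_test)
print(f"MAE: {mean_absolute_error(y_test, preds):.2f}")
```
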
    $76k-109k yearly est. 2d ago
  • Business Analyst (Clinical Operations & EMR Systems)

    Prosum (4.4 company rating)

    Data analyst job in Scottsdale, AZ

    Business Analyst - Clinical Operations & EMR Systems
    Term: Contract to Hire

    The Business Analyst will play a key role in supporting clinical operations by evaluating, optimizing, and documenting workflows related to Electronic Medical Record (EMR) systems. This role collaborates with clinical staff, IT teams, and operational leaders to gather and translate business requirements into functional specifications that support efficient, compliant, and patient-focused care delivery. The ideal candidate has hands-on experience in clinical site operations, EMR workflows, and healthcare/medical environments.

    Key Responsibilities

    Requirements Gathering & Documentation
    - Conduct detailed interviews, workflow observations, and process mapping sessions with clinicians, front-office staff, and operational stakeholders.
    - Elicit, analyze, and document business, functional, and technical requirements for EMR enhancements, new features, integrations, and workflow changes.
    - Translate complex clinical workflow needs into clear user stories, use cases, requirements documents, and acceptance criteria.
    - Maintain traceability of requirements through the project lifecycle.

    Clinical Operations & EMR Support
    - Analyze current clinical workflows within the EMR to identify inefficiencies, gaps, compliance risks, and opportunities for optimization.
    - Support EMR configuration, testing, validation, and implementation activities, ensuring alignment with clinical best practices and regulatory requirements.
    - Work with end-users to troubleshoot workflow issues and propose process or system improvements.
    - Collaborate with clinical leadership to ensure EMR workflows support patient safety, quality reporting, and operational objectives.

    Project Coordination & Stakeholder Engagement
    - Serve as the liaison between clinical departments, IT teams, EMR vendors, and administrative leadership.
    - Support project planning, prioritization, and resource coordination for EMR upgrades, workflow redesigns, and clinical system projects.
    - Communicate project updates, risks, and dependencies to stakeholders in a clear and timely manner.
    - Facilitate cross-functional meetings, gather feedback, and ensure alignment across all project phases.

    Data Analysis & Reporting
    - Analyze clinical and operational data to identify trends, workflow gaps, and process improvement opportunities.
    - Assist in developing dashboards, metrics, or reports to support clinical operations and decision-making.
    - Validate EMR data quality and support compliance with healthcare regulations such as HIPAA and quality reporting requirements.

    Qualifications

    Required
    - Bachelor's degree in Business, Healthcare Administration, Information Systems, or a related field.
    - 5+ years of experience as a Business Analyst in a healthcare or medical environment.
    - Hands-on experience with EMR/EHR systems.
    - Experience working directly in or with clinical site operations (front office, back office, clinical workflows, scheduling, intake, documentation, orders, billing, etc.).
    - Strong skills in requirements elicitation, documentation, process mapping, and workflow analysis.
    - Knowledge of healthcare regulations, clinical terminology, and patient care workflows.
    - Excellent communication, facilitation, and stakeholder-management abilities.

    Preferred
    - Experience with EMR implementations, upgrades, or optimization initiatives.
    - Prior involvement in quality improvement, population health, or clinical analytics.
    - Familiarity with Agile methodologies and tools such as Jira, Azure DevOps, or similar.

    "This position does not offer sponsorship. Candidates must be legally authorized to work in the United States without sponsorship now or in the future."
    $69k-98k yearly est. 3d ago
  • IAM Business Systems Analyst

    Robert Half (4.5 company rating)

    Data analyst job in Phoenix, AZ

    The Business Systems Analyst will manage day-to-day operations of the Customer Identity Access Management (CIAM) platform, ensuring compliance with defined policies and procedures related to user and application management. In this role, you will play a pivotal part in bridging business objectives with technology solutions focused on identity and access management, and drive CIAM capabilities that directly support the bank's digital strategy and customer trust objectives. Working collaboratively with stakeholders, development teams, and vendors, you will help define, implement, and optimize CIAM initiatives that enhance user experiences and operational efficiency.

    Responsibilities:
    - Collaborate with business partners and stakeholders to gather, analyze, and document business requirements aligned with strategic goals and regulatory standards.
    - Act as the Subject Matter Expert for CIAM systems, providing guidance on solution design, configuration, and process improvements.
    - Translate business needs into detailed system requirements, user stories, acceptance criteria, and technical documentation to support effective development and testing.
    - Support the entire solution delivery lifecycle, including requirements gathering, process mapping, documentation, testing, deployment, and post-launch support.
    - Manage and maintain accurate documentation for CIAM applications, ensuring accessibility for business and technical teams.
    - Oversee application configuration changes, ensuring adherence to SDLC and Change Management protocols.
    - Coordinate with internal IT teams, external vendors, and implementation partners to resolve issues and deliver integrated CIAM solutions.
    - Monitor application performance, facilitate incident and problem management, and conduct Root Cause Analysis to drive issue resolution.

    What you'll need:
    - 8+ years of related experience. Bachelor's degree in a related field required. Previous leadership experience preferred.
    - Advanced knowledge of general Financial Services or Banking preferred.
    - Advanced knowledge of applicable regulatory and legal compliance obligations, rules and regulations, and industry standards and practices.
    - Advanced-to-expert process and data analysis skills within one or two domains or functional areas, with deep critical thinking skills.
    - Experience working as an SME across one or two domains or functional areas.
    - Advanced speaking and writing communication skills.
    - Demonstrated expertise in the implementation of CIAM or IAM solutions, including integration of external platforms and overseeing end-to-end deployment.
    - Proven ability to translate complex business needs into comprehensive system requirements and articulate application configurations that meet both operational and regulatory standards.
    - Experience administering user lifecycle processes, including onboarding, offboarding, and access reviews.
    - Familiarity with application onboarding and policy enforcement for identity and access.
    - Skill in maintaining compliance with organizational policies and regulatory security standards.
    - Ability to use CIAM platform tools and portals to manage configurations, monitor logs, and handle support issues.
    - Strong documentation and process-adherence abilities.
    - Excellent communication skills, with experience collaborating effectively across all levels of internal teams (business units, IT operations, etc.) and with external vendors, sales representatives, technology partners, and implementation consultants.
    - Deep understanding and hands-on application of Software Development Life Cycle (SDLC) methodologies and change management protocols.
    - Demonstrated experience with problem and incident management processes, including leading Root Cause Analysis and resolving high-impact issues.
    - Strong knowledge of data integrity, security, and privacy best practices, with a focus on maintaining high standards across banking platforms.
    - Advanced proficiency in collaboration and workstream management tools, such as Azure DevOps (ADO) and Confluence, for project coordination, tracking, and documentation.
    - Excellent skills in problem recognition, attention to detail, prioritization, and proactively driving process improvements.
    - Hands-on experience with Agile methodologies, including adapting to evolving requirements and supporting iterative development cycles.
    $59k-92k yearly est. 4d ago
  • Data Architect

    Akkodis

    Data analyst job in Phoenix, AZ

    Akkodis is seeking a Data Architect local to Phoenix, AZ who can come onsite 3 days a week. If you are interested, please apply!

    JOB TITLE: Data Architect
    EMPLOYMENT TYPE: 24+ month contract | 3 days/week on site
    PAY: $80 - $96/hr

    Responsibilities:
    - ETL design and development for enterprise data solutions.
    - Design and build databases, data warehouses, and strategies for data acquisition, archiving, and recovery.
    - Review new data sources for compliance with standards.
    - Provide technical leadership, set standards, and mentor junior team members.
    - Collaborate with business stakeholders to translate requirements into scalable solutions.
    - Guide teams on Azure data tools (Data Factory, Synapse, Data Lake, Databricks).
    - Establish best practices for database design, data integration, and data governance.
    - Ensure solutions are secure, high-performing, and easy to support.

    Essential Skills & Experience:
    - Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
    - 10+ years with Microsoft SQL technologies.
    - 3+ years with cloud-based solutions (Azure preferred).
    - Strong knowledge of ETL, data modeling, and data warehousing.
    - Experience with source control, change/release management, and documentation.
    - Excellent communication and leadership skills.

    Preferred:
    - Retail or grocery industry experience.
    - Familiarity with Power BI and MDM principles.

    Work Schedule: Hybrid, 3 days onsite in Phoenix, AZ; 2 days remote.

    "Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits and 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State or local law; and Holiday pay upon meeting eligibility criteria. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client."
    $92k-128k yearly est. 5d ago
  • Data Architect

    Saxon Global (3.6 company rating)

    Data analyst job in Phoenix, AZ

    The Senior Data Engineer & Test in Phoenix (85029) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

    Key Responsibilities
    1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
    2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability.
    3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
    4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
    5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
    6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
    7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
    8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
    9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
    10. Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.

    In addition to all of the skills above:
    - Minimum of 10+ years overall IT experience
    - Experienced in waterfall, iterative, and agile methodologies

    Technical Requirements:
    1. Hands-on Data Engineering: Minimum 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
    2. Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
    3. CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.
    4. Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
    5. Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
    6. Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
    7. Unix/Linux: Strong command-line skills in Unix-like environments.
    8. SQL: Solid understanding of SQL for data ingestion and analysis.
    9. Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
    10. Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software.
    11. Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.
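
    To illustrate the Airflow orchestration pattern named above, a minimal DAG sketch; the DAG id, schedule, and spark-submit paths are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch for a daily PySpark batch job; needs apache-airflow.
# DAG id, schedule, and job paths are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Each step submits a PySpark job; the script paths are placeholders.
    extract = BashOperator(
        task_id="extract",
        bash_command="spark-submit /jobs/extract_sales.py",
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="spark-submit /jobs/transform_sales.py",
    )
    # Orchestration: transform runs only after extract succeeds.
    extract >> transform
```
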
    $84k-115k yearly est. 3d ago
  • Senior Data Engineer

    Addison Group (4.6 company rating)

    Data analyst job in Phoenix, AZ

    Job Title: Sr. Data Engineer
    Job Type: Full Time
    Compensation: $130,000 - $150,000 D.O.E.; eligible for medical, dental, vision, and life insurance coverage, and PTO

    ROLE OVERVIEW
    The Senior Data Engineer is responsible for designing, building, and maintaining scalable data platforms that support analytics, reporting, and advanced data-driven initiatives. This is a hands-on engineering role focused on developing reliable, high-performing data solutions while contributing to architectural standards, data quality, and governance practices. The ideal candidate has strong experience with modern data architectures, data modeling, and pipeline development, and is comfortable collaborating across technical and business teams to deliver trusted, production-ready datasets.

    KEY RESPONSIBILITIES
    - Design and maintain data models across analytical and operational use cases to support reporting and advanced analytics.
    - Build and manage data pipelines that ingest, transform, and deliver structured and unstructured data at scale.
    - Contribute to data governance practices, including data quality controls, metadata management, lineage, and stewardship.
    - Develop and maintain cloud-based data platforms, including data lakes, analytical stores, and curated datasets.
    - Implement and optimize batch and near-real-time data ingestion and transformation processes.
    - Support data migration and modernization efforts while ensuring accuracy, performance, and reliability.
    - Partner with analytics, engineering, and business teams to understand data needs and deliver high-quality solutions.
    - Enable reporting and visualization use cases by providing clean, well-structured datasets for downstream tools.
    - Apply security, privacy, and compliance best practices throughout the data lifecycle.
    - Establish standards for performance tuning, scalability, reliability, and maintainability of data solutions.
    - Implement automation, testing, and deployment practices to improve data pipeline quality and consistency.

    QUALIFICATIONS
    - Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent professional experience.
    - 5+ years of experience in data engineering or related roles.
    - Strong hands-on experience with data modeling, schema design, and pipeline development; cloud-based data platforms and services; and data ingestion, transformation, and optimization techniques.
    - Familiarity with modern data architecture patterns, including lakehouse-style designs and governance frameworks.
    - Experience supporting analytics, reporting, and data science use cases.
    - Proficiency in one or more programming languages commonly used in data engineering (e.g., Python, SQL, or similar).
    - Solid understanding of data structures, performance optimization, and scalable system design.
    - Experience integrating data from APIs and distributed systems.
    - Exposure to CI/CD practices and automated testing for data workflows.
    - Familiarity with streaming or event-driven data processing concepts preferred.
    - Experience working in Agile or iterative delivery environments.
    - Strong communication skills with the ability to document solutions and collaborate across teams.
    $130k-150k yearly 4d ago
  • Data Engineer

    Interactive Resources-IR (4.2 company rating)

    Data analyst job in Tempe, AZ

    About the Role
    We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.

    What We're Looking For
    - 8+ years designing and delivering scalable data pipelines in modern data platforms
    - Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
    - Ability to lead cross-functional initiatives in matrixed teams
    - Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
    - Hands-on experience with Azure, Snowflake, and Databricks, including system integrations

    Key Responsibilities
    - Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
    - Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
    - Use Apache Airflow and similar tools for workflow automation and orchestration
    - Work with financial or regulated datasets while ensuring strong compliance and governance
    - Drive best practices in data quality, lineage, cataloging, and metadata management

    Primary Technical Skills
    - Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
    - Design efficient Delta Lake models for reliability and performance
    - Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
    - Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
    - Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
    - Automate ingestion and workflows using Python and REST APIs
    - Support downstream analytics for BI, data science, and application workloads
    - Write optimized SQL/T-SQL queries, stored procedures, and curated datasets
    - Automate DevOps workflows, testing pipelines, and workspace configurations

    Additional Skills
    - Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
    - CI/CD: Azure DevOps
    - Orchestration: Apache Airflow (plus)
    - Streaming: Delta Live Tables
    - MDM: Profisee (nice-to-have)
    - Databases: SQL Server, Cosmos DB

    Soft Skills
    - Strong analytical and problem-solving mindset
    - Excellent communication and cross-team collaboration
    - Detail-oriented with a high sense of ownership and accountability
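
    As an illustration of the bronze-to-silver medallion step this posting references, a minimal PySpark/Delta Lake sketch; the lake paths and column names are hypothetical placeholders, and a Delta-enabled Spark session (as on Databricks) is assumed.

```python
# Bronze-to-silver medallion step sketch; assumes a Spark session with Delta
# Lake available (as on Databricks). Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw ingested events, kept as-is for replayability.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Silver: deduplicated, filtered, and typed records ready for analytics.
silver = (
    bronze.dropDuplicates(["order_id"])
    .filter(F.col("order_ts").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")
```
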
    $92k-122k yearly est. 2d ago
  • Life Actuarial Solutions Analyst Senior

    USAA (4.7 company rating)

    Data analyst job in Phoenix, AZ

    Why USAA? At USAA, our mission is to empower our members to achieve financial security through highly competitive products, exceptional service, and trusted advice. We seek to be the #1 choice for the military community and their families. Embrace a fulfilling career at USAA, where our core values of honesty, integrity, loyalty, and service define how we treat each other and our members. Be part of what truly makes us special and impactful.

    The Opportunity: We are seeking a dedicated Life Actuarial Solutions Analyst Senior to join the Life Company's Modeling Operations Team. The Life Modeling Operations Team is a diverse team that supports the complex life actuarial modeling ecosystem, which consumes data from multiple sources across USAA to support actuarial functions. Your role also supports Life/Annuity/Health actuarial work through one or more of the following activities: data extraction, data transformation, validation and analysis, and system functionality oversight and integration. You will be responsible for providing technical and analytical solutions for one or more of the following functions: pricing and product development, experience studies, actuarial assumption reviews, reserve calculations, financial reporting, asset liability management, or competitive analysis.

    This role is remote eligible in the continental U.S. with occasional business travel. However, individuals residing within a 60-mile radius of a USAA office will be expected to work on-site four days per week.

    What you'll do:
    - Independently extract, integrate, and transform data from a multitude of sources, and identify new sources where needed.
    - Reconcile and validate the accuracy and reasonability of actuarial or financial information.
    - Prepare reports, reserve estimates, journal entries, financial statements, industry surveys, and/or special studies; analyze data and recommend solutions.
    - Develop comprehensive and innovative solutions that impact productivity and improve actuarial tools and processes.
    - Resolve unique and complex issues and navigate obstacles to deliver work product. Develop cost-benefit analyses.
    - Provide insight to management on issues and serve as a resource to team members on escalated issues of an unusual nature.
    - Lead projects related to actuarial solutions, including automation, IT projects, or product development initiatives. Oversee the requirement development process through testing and implementation.
    - Demonstrate in-depth understanding to identify and resolve issues or potential defects.
    - Maintain processes, procedures, and tools, and ensure all regulatory requirements and internal controls are adhered to. Work with business partners to understand key regulatory implications that impact processes, and develop processes to comply with new or changing regulations as needed.
    - Respond to audit requests and oversee coordination of responses to internal and external audits, such as Department of Insurance examinations, as well as other audit reports.
    - Anticipate and analyze trends or deviations from forecast, plan, or other projections. Present recommendations and communicate solutions to business partners and management in a clear, concise, logical, and organized manner.
    - Ensure risks associated with business activities are effectively identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures.

    What you have:
    - Bachelor's degree; OR 4 years of related experience (in addition to the minimum years of experience required) may be substituted in lieu of a degree.
    - 6 or more years of technical experience as an analyst or other relevant technical work experience.

    What sets you apart:
    - Bachelor's degree in mathematics, computer science, statistics, economics, finance, actuarial science, or another similar quantitative field
    - Experience with SQL or similar programming languages
    - Experience working in IT for a life insurance company
    - Experience supporting projects for actuarial or modeling functions
    - Excellent verbal and written communication skills, with the ability to tailor content for varying audiences
    - Strong aptitude for problem solving and technology
    - Quick learner, self-starter, and able to work well autonomously and with others
    - US military experience through military service or as a military spouse/domestic partner

    Compensation range: The salary range for this position is $93,770 - $168,790.

    USAA does not provide visa sponsorship for this role. Please do not apply for this role if at any time (now or in the future) you will need immigration support (i.e., H-1B, TN, STEM OPT Training Plans, etc.).

    Compensation: USAA has an effective process for assessing market data and establishing ranges to ensure we remain competitive. You are paid within the salary range based on your experience and market data of the position. The actual salary for this role may vary by location. Employees may be eligible for pay incentives based on overall corporate and individual performance and at the discretion of the USAA Board of Directors. The above description reflects the details considered necessary to describe the principal functions of the job and should not be construed as a detailed description of all the work requirements that may be performed in the job.

    Benefits: At USAA our employees enjoy best-in-class benefits to support their physical, financial, and emotional wellness. These benefits include comprehensive medical, dental, and vision plans, 401(k), pension, life insurance, parental benefits, adoption assistance, a paid time off program with paid holidays plus 16 paid volunteer hours, and various wellness programs. Additionally, our career path planning and continuing education assist employees with their professional goals. For more details on our outstanding benefits, visit our benefits page on USAAjobs.com.

    Applications for this position are accepted on an ongoing basis; this posting will remain open until the position is filled. Interested candidates are encouraged to apply the same day they view this posting. USAA is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
    $93.8k-168.8k yearly 1d ago
  • Sr. Big Data Engineer

    E-Solutions (4.5 company rating)

    Data analyst job in Scottsdale, AZ

    Sr. Big Data Developer, Scottsdale, AZ

    Must have:
    - 10-12 years of experience
    - Strong experience in Scala, Spark, Hive SQL, Hadoop, and Kafka
    - Proficiency in Hive and SQL optimization
    - Understanding of distributed systems and big data architecture
    - Knowledge of streaming frameworks (Spark Streaming, Kafka Streams)

    Good to have: Aerospike experience
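
    The role itself is Scala-focused, but as an illustration of the Spark-plus-Kafka streaming pattern it names, here is a minimal PySpark Structured Streaming sketch; the broker address and topic name are hypothetical placeholders, and the console sink stands in for a real target.

```python
# Spark Structured Streaming from Kafka sketch; needs the spark-sql-kafka
# package on the classpath. Broker and topic names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "txn-events")                 # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast before parsing downstream.
parsed = events.select(F.col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream
    .format("console")       # stand-in sink; production would target Hive/Delta
    .outputMode("append")
    .start()
)
query.awaitTermination()
```
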
    $103k-144k yearly est. 5d ago
  • Data Governance Engineer

    Tata Consultancy Services (4.3 company rating)

    Data analyst job in Phoenix, AZ

    Role: Data Governance Engineer
    Experience Required: 6+ years

    Must Have Technical/Functional Skills
    - Understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) and prior experience.
    - 2-5 years of Data Quality Management experience.
    - Intermediate competency in SQL and Python or a related programming language.
    - Strong familiarity with data architecture and/or data modeling concepts.
    - 2-5 years of experience with Agile or SAFe project methodologies.

    Roles & Responsibilities
    - Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention, Data Quality, Data Movement, Data Stewardship, Data Protection, and Data Sharing, among others.
    - Identify data quality issues, perform root-cause analysis of data quality issues, and drive remediation of audit and regulatory feedback.
    - Develop a deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for the business.
    - Responsible for holistic platform data quality monitoring, including but not limited to critical data elements.
    - Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
    - Influence and contribute to strategic improvements to data assessment processes and analytical tools.
    - Responsible for monitoring data quality issues, communicating issues, and driving resolution.
    - Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, and technology teams.
    - Subject matter expertise on multiple platforms.
    - Partner with the Data Steward Manager in developing and managing the data compliance roadmap.

    Generic Managerial Skills
    - Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments and recommends tailored solutions.
    - Leverages Team (Collaboration): Coordinates efforts within and across teams to deliver goals; accountable for bringing in ideas, information, suggestions, and expertise from others outside and inside the immediate team.
    - Communication: Influences and holds others accountable, with the ability to convince others. Identifies specific data governance requirements and is able to communicate clearly and in a compelling way.

    Interested candidates, please share your updated resume at *******************.

    Salary Range: $100,000 to $120,000 per year

    TCS Employee Benefits Summary: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
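
    As an illustration of the critical-data-element quality monitoring this role describes, a minimal pandas sketch; the file name, element list, and completeness threshold are hypothetical placeholders.

```python
# Data-quality monitoring sketch for critical data elements; needs pandas.
# File name, element list, and threshold are hypothetical placeholders.
import pandas as pd

CRITICAL_ELEMENTS = ["customer_id", "account_open_date", "risk_rating"]
COMPLETENESS_THRESHOLD = 0.99  # minimum share of non-null values per element

df = pd.read_parquet("platform_extract.parquet")  # hypothetical platform extract

issues = []
for col in CRITICAL_ELEMENTS:
    completeness = df[col].notna().mean()
    if completeness < COMPLETENESS_THRESHOLD:
        issues.append((col, completeness))

# Surface failures so they can be logged, communicated, and driven to resolution.
for col, score in issues:
    print(f"DQ issue: {col} completeness {score:.2%} below {COMPLETENESS_THRESHOLD:.0%}")
```
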
    $100k-120k yearly 5d ago
  • AI Data Engineer

    Echelix

    Data analyst job in Phoenix, AZ

    Echelix is a leading AI consulting company helping businesses design, build, and scale intelligent systems. We partner with organizations to make artificial intelligence practical, powerful, and easy to adopt. Our team blends deep technical skill with real-world business sense to deliver AI that drives measurable results.

    The Role
    We're looking for a Senior Data Engineer to architect, optimize, and manage database systems that power AI-driven solutions and enterprise applications. You'll lead the design of scalable, secure, and high-performance data infrastructure across cloud platforms, ensuring our clients' data foundations are built for the future. This role is ideal for database professionals who have evolved beyond traditional DBA work into cloud-native architectures, API-driven data access layers, and modern DevOps practices. You'll work with cutting-edge technologies like GraphQL, Hasura, and managed cloud databases while mentoring engineers on data architecture best practices.

    What You'll Do
    - Design, tune, and manage PostgreSQL, SQL Server, and cloud-managed databases (AWS RDS/Aurora, Azure SQL Database/Cosmos DB)
    - Architect and implement GraphQL APIs using Hasura or equivalent technologies for real-time data access
    - Lead cloud database migrations and deployments across AWS and Azure environments
    - Automate database CI/CD pipelines using tools like GitHub Actions, Azure DevOps, or AWS CodePipeline
    - Develop and maintain data access layers and APIs that integrate with AI and application workloads
    - Monitor, secure, and optimize database performance using cloud-native tools (AWS CloudWatch, Azure Monitor, Datadog)
    - Implement database security best practices, including encryption, access controls, and compliance requirements
    - Mentor engineers on database design, data modeling, and architecture best practices

    Requirements
    - 5+ years of experience designing and managing production database systems
    - Deep expertise in PostgreSQL and SQL Server, including performance tuning and query optimization
    - Hands-on experience with cloud database services (AWS RDS, Aurora, Azure SQL Database, Azure Cosmos DB)
    - Experience with GraphQL and API development, preferably with Hasura or similar platforms
    - Strong background in database CI/CD automation and Infrastructure as Code (Terraform, CloudFormation, Bicep)
    - Proficiency in scripting languages (Python, Bash) for automation and tooling
    - Solid understanding of data modeling, schema design, and database normalization
    - Strong communication and mentoring skills
    - US citizen and must reside in the United States

    Nice to Have
    - Experience with NoSQL databases (MongoDB, DynamoDB, Redis)
    - Knowledge of data streaming platforms (Kafka, AWS Kinesis, Azure Event Hubs)
    - Experience with data warehousing solutions (Snowflake, Redshift, Azure Synapse)
    - Background in AI/ML data pipelines and feature stores
    - Relevant certifications (AWS Database Specialty, Azure Database Administrator, PostgreSQL Professional)

    Why Join Echelix
    You'll join a fast-moving team that's shaping how AI connects people and data. We value curiosity, precision, and practical innovation. You'll work on real projects with real impact, not just proofs of concept.
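
    To illustrate the Hasura-style GraphQL data access layer mentioned above, a minimal Python sketch; the endpoint URL, admin secret, and table/field names are hypothetical placeholders. Hasura exposes queries at /v1/graphql and authenticates admin calls via the x-hasura-admin-secret header.

```python
# GraphQL query against a Hasura endpoint; needs `pip install requests`.
# Endpoint URL, secret, and table/field names are hypothetical placeholders.
import requests

HASURA_URL = "https://example.hasura.app/v1/graphql"  # hypothetical endpoint

query = """
query RecentUsers($limit: Int!) {
  users(order_by: {created_at: desc}, limit: $limit) {
    id
    email
  }
}
"""

resp = requests.post(
    HASURA_URL,
    json={"query": query, "variables": {"limit": 10}},
    headers={"x-hasura-admin-secret": "REDACTED"},  # placeholder credential
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["data"]["users"])
```
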
    $80k-111k yearly est. 5d ago
  • PySpark Data Engineer | Only USC and Green Card

    Ampstek

    Data analyst job in Phoenix, AZ

    PySpark Data Engineer
    Duration: 6+ months
    Only US Citizens and Green Card holders

    Job Details:

    Must Have Skills
    - PySpark and Python development
    - Hands-on knowledge of PySpark, Hadoop, and Python
    - GitHub; backend API integration knowledge (JSON, REST)

    Nice to Have Skills
    - Experience working closely with clients
    - Good communication

    Detailed Job Description
    - Looking for a subcontractor for a PySpark/Python Data Engineer requirement.
    - Client communication skills for the Amex account.

    Minimum years of experience: 6
    Certifications needed: No (GCP certification good to have)

    Top 3 responsibilities expected of the subcontractor:
    - Individual contributor
    - Strong development experience and leading a dev module
    - Work with the client directly

    Thank You
    Aakash Dubey
    ************************
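
    As an illustration of the backend REST/JSON integration skills listed above, a minimal sketch that pulls a JSON payload into a PySpark DataFrame; the endpoint URL and payload shape are hypothetical placeholders.

```python
# REST API to Spark DataFrame sketch; needs pyspark and requests.
# The endpoint URL and JSON shape are hypothetical placeholders.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("api-ingest-sketch").getOrCreate()

resp = requests.get("https://api.example.com/v1/transactions", timeout=10)
resp.raise_for_status()
records = resp.json()  # assumed: a JSON array of flat objects

# Parallelize the payload into a DataFrame for downstream transformations.
df = spark.createDataFrame(records)
df.printSchema()
df.show(5)
```
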
    $80k-111k yearly est. 3d ago
  • Data Governance Engineer

    Centraprise

    Data analyst job in Phoenix, AZ

    Job Title: Data Governance Engineer
    Location: Phoenix, AZ (complete onsite)
    Type: Full-Time Permanent
    Experience Required: 6+ years

    Must Have Technical/Functional Skills
    - Understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.) and prior experience.
    - 2-5 years of Data Quality Management experience.
    - Intermediate competency in SQL and Python or a related programming language.
    - Strong familiarity with data architecture and/or data modeling concepts.
    - 2-5 years of experience with Agile or SAFe project methodologies.

    Roles & Responsibilities
    - Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention, Data Quality, Data Movement, Data Stewardship, Data Protection, and Data Sharing, among others.
    - Identify data quality issues, perform root-cause analysis of data quality issues, and drive remediation of audit and regulatory feedback.
    - Develop a deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for the business.
    - Responsible for holistic platform data quality monitoring, including but not limited to critical data elements.
    - Collaborate with and influence product managers to ensure all new use cases are managed according to policies.
    - Influence and contribute to strategic improvements to data assessment processes and analytical tools.
    - Responsible for monitoring data quality issues, communicating issues, and driving resolution.
    - Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, and technology teams.
    - Subject matter expertise on multiple platforms.
    - Partner with the Data Steward Manager in developing and managing the data compliance roadmap.

    Generic Managerial Skills
    - Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments and recommends tailored solutions.
    - Leverages Team (Collaboration): Coordinates efforts within and across teams to deliver goals; accountable for bringing in ideas, information, suggestions, and expertise from others outside and inside the immediate team.
    - Communication: Influences and holds others accountable, with the ability to convince others. Identifies specific data governance requirements and is able to communicate clearly and in a compelling way.
    $80k-111k yearly est. 3d ago
  • Data Engineer

    Mastek

    Data analyst job in Phoenix, AZ

    Hi, we have a job opportunity for a Data Engineer Analyst role.

    Data Analyst / Data Engineer Expectations: Our project is data analysis heavy, and we are looking for someone who can grasp business functionality and translate it into working technical solutions.

    Job location: Phoenix, Arizona
    Type: Hybrid model (3 days a week in office)

    Job Description: Data Analyst / Data Engineer (6+ years relevant experience with the required skill set)

    Summary: We are seeking a Data Analyst Engineer with a minimum of 6 years in data engineering, data analysis, and data design. The ideal candidate will have strong hands-on expertise in Python and relational databases such as Postgres, SQL Server, or MySQL, along with a good understanding of data modeling theory and normalization forms.

    Required Skills:
    - 6+ years of experience in data engineering, data analysis, and data design
    - Good proficiency in Python
    - Strong experience with relational databases: Postgres, SQL Server, or MySQL
    - Expertise in writing complex SQL queries and optimizing database performance
    - Solid understanding of data modeling theory and normalization forms
    - Good communicator with the ability to articulate business problems for technical solutions

    Screening questions:
    - What was your approach to data analysis in your previous/current role, and what methods or techniques did you use to extract insights from large datasets?
    - Do you have any formal training or education in data modeling? If so, please provide details about the course, program, or certification you completed, including when you received it.
    - What are the essential factors that contribute to a project's success, and how do you plan to leverage your skills and expertise to ensure our project meets its objectives?

    Key Responsibilities:
    - Analyze complex datasets to derive actionable insights and support business decisions.
    - Model data solutions for high performance and reliability.
    - Work extensively with Python for data processing and automation.
    - Develop and optimize SQL queries for Postgres, SQL Server, or MySQL databases.
    - Ensure data integrity, security, and compliance across all data solutions.
    - Collaborate with cross-functional teams to understand data requirements and deliver solutions.
    - Communicate effectively with stakeholders and articulate business problems to drive technical solutions.

    Secondary Skills:
    - Experience deploying applications in Kubernetes
    - API development using FastAPI or Django
    - Familiarity with containerization (Docker) and CI/CD tools

    Regards,
    Suhas Gharge
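
    As an illustration of the complex-SQL-from-Python work this posting asks about, a minimal psycopg2 sketch using a window function against Postgres; the connection parameters, table, and columns are hypothetical placeholders.

```python
# Complex-SQL-from-Python sketch for Postgres; needs `pip install psycopg2-binary`.
# Connection parameters, table, and columns are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb", user="analyst", password="pw")

# Window function: rank each customer's orders by recency, keep the latest one.
sql = """
    SELECT customer_id, order_id, order_total
    FROM (
        SELECT customer_id, order_id, order_total,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY ordered_at DESC) AS rn
        FROM orders
    ) ranked
    WHERE rn = 1
"""

with conn, conn.cursor() as cur:
    cur.execute(sql)
    for customer_id, order_id, order_total in cur.fetchmany(10):
        print(customer_id, order_id, order_total)
conn.close()
```
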
    $80k-111k yearly est. 3d ago
  • ORACLE CLOUD DATA ENGINEER

    Wise Skulls

    Data analyst job in Phoenix, AZ

    Hiring: Oracle Cloud Data Engineer / Technology Lead

    We're looking for a hands-on Oracle Cloud Data Engineer (Technology Lead) to drive OCI-based data engineering and Power BI analytics initiatives. This role combines technical leadership with active development in a high-impact data program.

    Location: Phoenix, AZ (Hybrid)
    Duration: 6+ months (contract)
    Work Authorization: USC & Green Card holders ONLY (strict requirement)

    Job Summary
    This role focuses on building scalable data pipelines on Oracle Cloud Infrastructure (OCI) while leading Power BI dashboard and reporting development. You'll apply Medallion Architecture, enforce data governance, and collaborate closely with business stakeholders. Utility industry experience is a strong plus.

    Must-Have (Non-Negotiable) Skills
    - 8-10 years of experience in Data Engineering & Business Intelligence
    - 3+ years of hands-on OCI experience
    - Strong expertise in OCI Data Services, including OCI Data Integration, OCI Data Flow, OCI Streaming, Autonomous Data Warehouse, Oracle Exadata, and OCI Object Storage
    - Hands-on experience with Medallion Architecture (Bronze, Silver, Gold layers)
    - Power BI expertise: dashboards, reports, DAX, Power Query, data modeling, RLS
    - Strong coding skills in SQL, PL/SQL, and Python
    - Experience with Terraform, Ansible, and CI/CD pipelines
    - Bachelor's or Master's degree in a related field
    - Power BI certification (required)
    - Hands-on development is mandatory

    Key Responsibilities
    - Design and implement secure, scalable OCI data pipelines
    - Lead Power BI dashboard and reporting development
    - Build inbound/outbound integration patterns (APIs, files, streaming)
    - Implement Audit, Balance, and Control (ABC) frameworks
    - Ensure data quality, governance, lineage, and monitoring
    - Mentor engineers and BI developers
    - Drive agile delivery and stakeholder collaboration

    📩 Interested? Apply now or DM us to explore this opportunity! You can share resumes at ********************* or call us at *****************.
    $80k-111k yearly est. 1d ago
  • Data Engineer

    Stelvio Inc.

    Data analyst job in Phoenix, AZ

    Hybrid: 2-3 days on site in Phoenix, AZ

    We're looking for a Data Engineer to help build the cloud-native data pipelines that power critical insights across our organization. You'll work with modern technologies, solve real-world data challenges, and support analytics and reporting systems that drive smarter decision-making in the transportation space.

    What You'll Do
    - Build and maintain data pipelines using Databricks, Azure Data Factory, and Microsoft Fabric
    - Implement incremental and real-time ingestion using medallion architecture
    - Develop and optimize complex SQL and Python transformations
    - Support legacy platforms (SSIS, SQL Server) while contributing to modernization efforts
    - Troubleshoot data quality and integration issues
    - Participate in proof-of-concepts and recommend technical solutions

    What You Bring
    - 5+ years designing and building data solutions
    - Strong SQL and Python skills
    - Experience with ETL pipelines and Data Lake architecture
    - Ability to collaborate and adapt in a fast-moving environment
    - Preferred: Azure services, cloud ETL tools, Power BI/Tableau, event-driven systems, NoSQL databases
    - Bonus: Experience with Data Science or Machine Learning

    Benefits
    Medical, dental, and vision from day one · PTO & holidays · 401(k) with match · Lifestyle account · Tuition reimbursement · Voluntary benefits · Employee Assistance Program · Well-being & culture programs · Professional development support
    $80k-111k yearly est. 3d ago
  • Cash Analyst

    Insight Global

    Data analyst job in Phoenix, AZ

    Cash Posting Analyst
    Shift: Full-Time, M-F, 7:00am to 3:30pm AZ time
    Pay Rate: $19-25/hr
    Interview Process: one hour-long virtual interview with Revenue Cycle leadership
    Duration: 6-month contract to hire

    REQUIRED
    - Minimum 3 years of healthcare revenue cycle experience, including but not limited to Front End (pre-registration, insurance verification), Mid-Cycle (charge capture, coding), or Back End (collections, reporting, and audit); ideally 12 months of professional experience with Accounts Receivable (AR)/Back End: claims, denials, payments, or adjustments
    - 1 year of experience in Accounting, via either professional experience or a relevant degree in Accounting
    - Ability to describe Provider-Level Adjustments (PLB) and reconciliation processes: what a PLB is and why it is used in healthcare
    - Strong background in cash posting operations, including the differences between manual and 835 electronic posting
    - Moderate Excel experience

    PREFERRED
    - Multi-national, billion-dollar health systems with 100+ locations
    - Google Sheets experience

    Day to day:
    Insight Global is looking for a Cash Posting Analyst to join one of the nation's largest nonprofit hospital systems and play a key role in driving revenue cycle accuracy and efficiency. In this role, you'll post and reconcile payments, adjustments, and denials, manage billing and refunds, and perform variance analysis to resolve discrepancies. You'll ensure accounts are balanced, collaborate with billing and finance teams, and support audits and month-end close. Strong technical and analytical skills are essential, along with expertise in payer rules and remittance formats. This position offers minimal direct interaction (20-30%) and the opportunity to work independently while contributing to a mission-driven organization.
    $19-25 hourly 5d ago

Learn more about data analyst jobs

How much does a data analyst earn in Gilbert, AZ?

The average data analyst in Gilbert, AZ earns between $45,000 and $92,000 annually. This compares to the national average data analyst range of $53,000 to $103,000.

Average data analyst salary in Gilbert, AZ

$64,000

What are the biggest employers of Data Analysts in Gilbert, AZ?

The biggest employers of Data Analysts in Gilbert, AZ are:
  1. CenExel
  2. Tailstorm Health
  3. Workoo Technologies