
Data engineer jobs in Casa Grande, AZ - 1,507 jobs

All
Data Engineer
Data Consultant
Data Scientist
Hadoop Developer
Analytical Data Miner
Data Modeler
Software Systems Engineer
Devops Engineer
  • Senior Data Modeler

    Harnham

    Data engineer job in Phoenix, AZ

    Hybrid: 3-4 days onsite | Salary: $130,000 - $150,000 base

    A large, operationally complex organization is undergoing a major modernization of its data platform and is building a new, cloud-native analytics foundation from the ground up. This is a greenfield opportunity for a senior-level data modeler to establish best practices, influence architecture, and help shape how data is organized and used across the business. This role sits at the center of a multi-year transformation focused on modern analytics, scalable data products, and strong collaboration between data and business teams.

    What You'll Be Working On
    - Designing and implementing enterprise data models across conceptual, logical, and physical layers
    - Establishing Medallion architecture patterns and reusable modeling assets
    - Building dimensional and semantic models that support analytics and reporting
    - Partnering closely with domain experts and functional leaders to translate business needs into data structures
    - Collaborating with data engineers to align models with ELT pipelines and analytics frameworks
    - Helping define modeling standards and upskilling senior engineers in modern data modeling practices
    - Contributing hands-on to data engineering work where needed (SQL, transformations, optimization)
    - Proactively identifying analytics opportunities and recommending data structures to support them

    This role is roughly 40% data modeling, 30% hands-on engineering, and 30% cross-functional collaboration.

    Must-Have Experience
    - Strong, hands-on experience with data modeling (dimensional, canonical, semantic)
    - Deep understanding of Medallion architecture
    - Advanced SQL and experience working with a modern cloud data warehouse
    - Experience with dbt for transformations and modeling
    - Hands-on experience in cloud-native data environments (AWS preferred)
    - Ability to work directly with business stakeholders and explain technical concepts clearly
    - Experience collaborating closely with data engineers on execution

    Nice to Have
    - Python experience
    - Familiarity with Informatica or reverse-engineering legacy data models
    - Exposure to streaming or near-real-time data pipelines
    - Experience with visualization tools (tool choice is flexible)

    Who Will Thrive in This Role
    - A senior individual contributor who enjoys building from scratch
    - Someone who can act as a modeling expert and mentor in an organization formalizing this practice
    - Comfortable working in ambiguity and taking initiative
    - Strong communicator who enjoys partnering with both technical and non-technical teams
    - Equally comfortable discussing business concepts and physical data models

    Why This Role Is Unique
    - Greenfield data modeling initiative with real influence
    - Opportunity to define standards that will be used across the organization
    - Work on large-scale, real-world operational and analytical data
    - High visibility within a growing data organization
    - Flexible work setup for individual contributors

    If you're excited about shaping a modern data foundation and want to be the person who defines how data is modeled, understood, and used, this is a rare opportunity to make a lasting impact.
    $130k-150k yearly 5d ago

  • Systems Software Engineer

    Sunbelt Controls (3.3 company rating)

    Data engineer job in Phoenix, AZ

    Now Hiring: Systems Software Engineer II
    📍 Phoenix, Arizona | 💰 $108,000 - $135,000 per year

    🏢 About the Role
    We're looking for an experienced Systems Software Engineer II to join Sunbelt Controls, a leading provider of Building Automation System (BAS) solutions across the Western U.S. In this role, you'll develop and program databases, create custom graphics, and integrate control systems for smart buildings. You'll also support project startups, commissioning, and troubleshooting - working closely with project managers and engineers to deliver high-quality, energy-efficient building automation solutions. If you have a passion for technology, problem-solving, and helping create intelligent building systems, this opportunity is for you.

    ⚙️ What You'll Do
    - Design and program BAS control system databases and graphics for assigned projects.
    - Lead the startup, commissioning, and troubleshooting of control systems.
    - Work with networked systems and diagnose LAN/WAN connectivity issues.
    - Perform pre-functional and functional system testing, including LEED and Title 24 requirements.
    - Manage project documentation, including as-builts and commissioning records.
    - Coordinate with project teams, subcontractors, and clients for smooth execution.
    - Mentor and support junior Systems Software Engineers.

    🧠 What We're Looking For
    - 2-5 years of experience in Building Automation Systems or a related field.
    - Associate's degree in a technical field (Bachelor's in Mechanical or Electrical Engineering preferred).
    - Proficiency in MS Office, Windows, and basic TCP/IP networking.
    - Strong organizational skills and the ability to manage multiple priorities.
    - Excellent communication and customer-service skills.
    - Valid Arizona driver's license.

    💎 Why You'll Love Working With Us
    At Sunbelt Controls, we don't just build smart buildings - we build smart careers. As a 100% employee-owned company (ESOP), we offer a supportive, growth-oriented environment where innovation and teamwork thrive.

    What we offer:
    - Competitive salary: $108K - $135K, based on experience
    - Employee-owned company culture with a family-oriented feel
    - Comprehensive health, dental, and vision coverage
    - Paid time off, holidays, and 401(k)/retirement plan
    - Professional growth, mentorship, and ongoing learning opportunities
    - Veteran-friendly employer & Equal Opportunity workplace

    🌍 About Sunbelt Controls
    Sunbelt Controls is a premier BAS solutions provider serving clients across multiple industries, including data centers, healthcare, education, biotech, and commercial real estate. We specialize in smart building technology, system retrofits, analytics, and energy efficiency - helping clients reduce operational costs and achieve sustainable performance.

    👉 Apply today to join a team that's shaping the future of intelligent buildings.
    #Sunbelt #BuildingAutomation #SystemsEngineer #HVACControls #BASCareers
    $108k-135k yearly 2d ago
  • IT Cloud DevOps Engineer

    Avesis

    Data engineer job in Phoenix, AZ

    Join us for an exciting career with the leading provider of supplemental benefits!

    Our Promise
    Through skill-building, leadership development and philanthropic opportunities, we provide opportunities to build communities and grow your career, surrounded by diverse colleagues with high ethical standards.

    The IT Cloud DevOps Engineer is responsible for leading the design, deployment and optimization of cloud-based IT infrastructure and Azure CI/CD pipelines in support of our business applications. This role collaborates with IT and leadership to ensure that IT infrastructure aligns with strategic goals and regulatory requirements. The IT Cloud DevOps Engineer serves as a strategic partner to technologists and technical leaders within the IT Business Analysis and Application Support teams. This key role is responsible for the design, deployment, and management of cloud-based infrastructure via Azure CI/CD pipelines across both Microsoft Azure and Amazon Web Services (AWS) and supports application teams to ensure seamless delivery of cloud-based services. The ideal candidate will have expertise with Azure DevOps, Terraform, multi-cloud networking fundamentals, cloud deployments, and cloud-based applications in a fast-paced environment.

    Competencies:

    Functional:
    - Design, build, and maintain cloud infrastructure using Azure, Terraform, and Azure DevOps pipelines.
    - Support multi-cloud environments (primarily Azure with some AWS workloads).
    - Implement Infrastructure as Code (IaC) best practices for scalable and secure deployments.
    - Monitor and maintain cloud networking, including VNETs, VPNs and API workloads.
    - Collaborate with application and operations teams to support deployments and troubleshoot issues.
    - Implement and manage CI/CD workflows, including version control, build automation, and deployment pipelines.
    - Ensure systems adhere to security, compliance, and performance standards.
    - Participate in and/or provide operational support for critical services.

    Core:
    - Technical Proficiency: Solid understanding of cloud computing concepts, specifically Azure. Expertise in IT application deployment, administration and support.
    - Analytical Skills: Able to identify opportunities for improvement and implement tasks to drive positive change. Expert at analyzing complex business requirements and translating them into technical solutions. Exceptional problem-solving skills with attention to detail.
    - Communication Skills: Effective verbal and written communication skills. Ability to collaborate with diverse stakeholders, including technical and non-technical personnel. Able to lead developers in best practices for source code management and CI/CD pipelines.
    - Regulatory Knowledge: Understanding of healthcare regulations such as HIPAA and CMS. Ability to ensure compliance with application configurations. Understanding of underlying cybersecurity standards such as NIST.
    - Organizational Skills: A multi-tasker who can manage multiple tasks and projects simultaneously. Strong time management and prioritization skills.

    Behavioral:
    - Collegiality: builds strong relationships company-wide; approachable and helpful; able to mentor and support team growth.
    - Initiative: readiness to lead or take action to achieve goals.
    - Communicative: ability to relay issues, concepts, and ideas to others easily, both orally and in writing.
    - Member-focused: going above and beyond to make our members feel seen, valued, and appreciated.
    - Detail-oriented and thorough: manages and completes details of assignments with little oversight.
    - Flexible and responsive: manages new demands, changes, and situations.
    - Critical thinking: effectively troubleshoots complex issues, problem-solves and multi-tasks.
    - Integrity & responsibility: acts with a clear sense of ownership for actions and decisions, and keeps information confidential when required.
    - Collaborative: ability to represent your own interests while being fair to those representing other or competing ideas in search of a workable solution for all parties.

    Minimum Qualifications:
    - Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). High school diploma required.
    - 3-5 years of experience in cloud engineering or DevOps roles.
    - Strong expertise with Microsoft Azure (IaaS, PaaS, AKS, App Services).
    - Strong expertise with Azure DevOps (Repos, Pipelines, Boards).
    - Strong expertise with Terraform (IaC automation).
    - Basic to intermediate knowledge of AWS (EC2, S3, IAM, VPCs).
    - Strong understanding of networking principles (DNS, routing, firewalls, VPNs, etc.).
    - Experience with scripting (PowerShell, Bash, or Python).
    - Familiarity with monitoring tools (Azure Monitor, Log Analytics, Application Insights).
    - Excellent communication and problem-solving skills.
    - Understanding of healthcare regulations such as HIPAA and CMS and cybersecurity standards like NIST.

    Preferred Qualifications and Certifications:
    - Microsoft Certified: Azure Administrator Associate (AZ-104)
    - HashiCorp Certified: Terraform Associate
    - AWS Certified Cloud Practitioner (optional)
    - Experience with application deployment and administration in a healthcare setting.

    At Avesis, we strive to design equitable and competitive compensation programs. Base pay within the range is ultimately determined by a candidate's skills, expertise, or experience. In the United States, we have three geographic pay zones. For this role, our current pay ranges for new hires in each zone are:
    - Zone A: $81,650.00-$136,090.00
    - Zone B: $89,060.00-$148,440.00
    - Zone C: $95,840.00-$159,730.00

    FLSA Status: Salary/Exempt. This role may also be eligible for benefits, bonuses, and commission. Please visit Avesis Pay Zones for more information on which locations are included in each of our geographic pay zones. However, please confirm the zone for your specific location with your recruiter.
    We Offer
    - Meaningful and challenging work opportunities to accelerate innovation in a secure and compliant way.
    - Competitive compensation package.
    - Excellent medical, dental, supplemental health, life and vision coverage for you and your dependents with no wait period.
    - Life and disability insurance.
    - A great 401(k) with company match.
    - Tuition assistance, paid parental leave and backup family care.
    - Dynamic, modern work environments that promote collaboration and creativity to develop and empower talent.
    - Flexible time off, dress code, and work location policies to balance your work and life in the ways that suit you best.
    - Employee Resource Groups that advocate for inclusion and diversity in all that we do.
    - Social responsibility in all aspects of our work. We volunteer within our local communities, create educational alliances with colleges, and drive a variety of initiatives in sustainability.

    How To Stay Safe
    Avesis is aware of fraudulent activity by individuals falsely representing themselves as Avesis recruiters. In some instances, these individuals may even contact applicants with a job offer letter, ask applicants to make purchases (i.e., a laptop or gift cards) from a designated vendor, have applicants fill out W-2 forms, or ask that applicants ship or send packages of goods to the company. Avesis would never make such requests to applicants at any time throughout our job application process. We also would never ask applicants for personal information, such as passport numbers, bank account numbers, or social security numbers, during our process. Our recruitment process takes place by phone and via trusted business communication platforms (i.e., Zoom, Webex, Microsoft Teams, etc.). Any emails from Avesis recruiters will come from a verified email address ending in @ Avsiscom. We urge all applicants to exercise caution. If something feels off about your interactions, we encourage you to suspend or cease communications.

    If you are unsure of the legitimacy of a communication you have received, please reach out to . To learn more about protecting yourself from fraudulent activity, please refer to this article link (articles/how-avoid-scam). If you believe you were a victim of fraudulent activity, please contact your local authorities or file a complaint (Link: #/) with the Federal Trade Commission. Avesis is not responsible for any claims, losses, damages, or expenses resulting from unaffiliated individuals of the company or their fraudulent activity.

    Equal Employment Opportunity
    At Avesis, We See You. We celebrate differences and are building a culture of inclusivity and diversity. We are proud to be an Equal Employment Opportunity employer that considers all qualified applicants and does not discriminate against any person based on ancestry, age, citizenship, color, creed, disability, familial status, gender, gender expression, gender identity, marital status, military or veteran status, national origin, race, religion, sexual orientation, or any other characteristic. At Avesis, we believe that, to operate at the peak of excellence, our workforce needs to represent a rich mixture of diverse people, all focused on providing a world-class experience for our clients. We focus on recruiting, training and retaining those individuals that share similar goals. Come Dare to be Different at Avesis, where We See You!
    $95.8k-159.7k yearly 2d ago
  • Data Scientist, Analytics (Technical Leadership)

    Meta (4.8 company rating)

    Data engineer job in Phoenix, AZ

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

    **Responsibilities:**
    1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
    2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
    3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
    4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
    5. Build long-term vision and strategy for programs and products
    6. Collaborate with executives to define and develop data platforms and instrumentation
    7. Effectively communicate insights and recommendations to stakeholders
    8. Define success metrics, forecast changes, and set team goals
    9. Support developing roadmaps and coordinate analytics efforts across teams

    **Minimum Qualifications:**
    - Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g. Mathematics, Statistics, Operations Research), or equivalent practical experience
    - 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
    - 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
    - Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
    - Experience with predictive modeling, machine learning, and experimentation/causal inference methods
    - Experience communicating complex technical topics in a clear, precise, and actionable manner

    **Preferred Qualifications:**
    - 10+ years of experience communicating the results of analyses to leadership teams to influence strategy
    - Masters or Ph.D. degree in a quantitative field
    - Bachelor's degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
    - 10+ years of experience doing complex quantitative analysis in product analytics

    **Public Compensation:** $210,000/year to $281,000/year + bonus + equity + benefits

    **Industry:** Internet

    **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law.

    Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
    $210k-281k yearly 60d+ ago
  • AWS Data Migration Consultant

    Slalom (4.6 company rating)

    Data engineer job in Phoenix, AZ

    Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment.

    Who You'll Work With
    As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.

    What You'll Do
    * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
    * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
    * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
    * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
    * Implement high-availability and disaster recovery (HA/DR) strategies including Always On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS.
    * Ensure security best practices are followed, including IAM-based access control, encryption, and compliance with industry standards.
    * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
    * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
    * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.

    What You'll Bring
    * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
    * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
    * Hands-on experience with AWS database services (RDS, EC2-hosted databases).
    * Strong understanding of HA/DR solutions and cloud database design patterns.
    * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
    * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
    * Strong troubleshooting and analytical skills to resolve complex database and performance issues.
    * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.

    Nice to Have
    * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
    * Experience with NoSQL databases or hybrid data architectures.
    * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
    * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
    * Experience with DB2 on-premise or cloud-hosted environments.
    About Us
    Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

    Compensation and Benefits
    Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.

    Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay ranges in the following locations - Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, New Jersey - are $105,000-$147,000 for the Consultant level, $120,000-$169,000 for the Senior Consultant level, and $133,000-$187,000 for the Principal level. In all other markets, the target base salary pay range is $96,000-$135,000 for the Consultant level, $110,000-$155,000 for the Senior Consultant level, and $122,000-$172,000 for the Principal level. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.

    EEO and Accommodations
    Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 5d ago
  • Data Scientist, Privacy

    Datavant

    Data engineer job in Phoenix, AZ

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.

    By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.

    As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.

    **You Will:**
    + Critically analyze large health datasets using standard and bespoke software libraries
    + Discuss your findings and progress with internal and external stakeholders
    + Produce high quality reports which summarise your findings
    + Contribute to research activities as we explore novel and established sources of re-identification risk

    **What You Will Bring to the Table:**
    + Excellent communication skills
    + Meticulous attention to detail in the production of comprehensive, well-presented reports
    + A good understanding of statistical probability distributions, bias, error and power, as well as sampling and resampling methods
    + A drive to understand real-world data in context rather than consider it in abstraction
    + Familiarity or proficiency with programmable data analysis software, R or Python, and the desire to develop expertise in its language
    + Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
    + Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
    + Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
    + An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
    + Familiarity with Amazon Web Services cloud-based storage and computing facilities

    **Bonus Points If You Have:**
    + Experience creating documents using LaTeX
    + Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
    + Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.

    \#LI-BC1

    We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive.
We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $104,000-$130,000 USD To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship. Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . 
Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay. At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way. Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis. For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
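The sampling and resampling methods this posting asks for can be illustrated with a percentile bootstrap, sketched here in stdlib-only Python (the data values and their interpretation are invented for the example):

```python
import random
from statistics import mean

def bootstrap_ci(sample, stat=mean, n_boot=2000, alpha=0.05, seed=7):
    """Percentile bootstrap confidence interval for a sample statistic."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    stats = sorted(
        stat([rng.choice(sample) for _ in sample]) for _ in range(n_boot)
    )
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Invented sample, e.g. days from record receipt to de-identification.
data = [12, 15, 14, 10, 18, 16, 13, 11, 17, 14]
low, high = bootstrap_ci(data)  # 95% CI around the sample mean of 14.0
```

The same resampling idea generalizes to any statistic (median, proportion, model score) by swapping the `stat` argument.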
    $104k-130k yearly 16d ago
  • JAVA / Hadoop Developer ( Strong Java Background)

    Trg 4.6company rating

    Data engineer job in Phoenix, AZ

Job Role: JAVA / Hadoop Developer (Strong Java Background) Job Type: Full Time/Permanent Salary: Market + Benefits + Relocation Assistance Required Skills: Core Java, Distributed Systems, Design Patterns, Messaging (sound knowledge of Big Data / Hadoop technologies) Job description: Should have sound knowledge of Big Data/Hadoop. Should have strong hands-on experience of a minimum of 5+ years in Java/J2EE, in open source, data-intensive, distributed environments. Should have worked on open source products; contribution towards them would be an added advantage. In-depth, applied knowledge of various Java, J2EE and EAI patterns. Interested candidates can reach me at the contact details below: Thanks & Regards Utkarsh Agarwal ************ ext 995 Additional Information If you are comfortable with the position and location then please reply at the earliest with your updated resume and the following details, or call me back on my number. Full Name: Email: Skype id: Contact Nos.: Current Location: Open to relocate: Start Availability: Work Permit: Flexible time for INTERVIEW: Current Company: Current Rate: Expected Rate: Total IT Experience [Years]: Total US Experience [Years]: Key Skill Set: Best time to call: 2 Slots for phone interview: In case you are not interested, I would be very grateful if you could pass this position to colleagues or friends who might be interested. All your information will be kept confidential according to EEO guidelines.
    $81k-104k yearly est. 2d ago
  • Data Engineer/Developer (Snowflake)

    Cayuse Shared Services

    Data engineer job in Phoenix, AZ

    Cayuse Commercial Services (CCS) delivers fresh solutions to business challenges in the technology and business services environment. Services available are application development, business process outsourcing, data services, and professional services. Cayuse helps clients to achieve impactful outcomes such as improved efficiency, reduced cost, increased profitability and accelerated time to market. LOCATION: Phoenix, AZ (Hybrid) EMPLOYEE TYPE: Contractor (1099/C2C) Responsibilities We are seeking a skilled Snowflake Developer to join our team. The ideal candidate will have extensive experience in Snowflake development, strong SQL expertise, and a solid understanding of data warehousing principles. This role involves working with ETL processes, cloud platforms (AWS or Azure), and analytics tools to build scalable and efficient data solutions. If you have a problem-solving mindset and can perform under pressure, we'd love to hear from you! Key Responsibilities: Design, develop, and optimize Snowflake data solutions. Implement and maintain data warehousing best practices. Write and optimize complex SQL queries for data extraction and transformation. Work with ETL processes to integrate data from multiple sources. Collaborate with teams to deploy data solutions on AWS or Azure. Utilize analytics tools to derive insights from data. Troubleshoot and resolve performance issues related to data processing. Ensure data security, quality, and compliance with industry standards. Qualifications Required Skills & Experience: 4+ years of experience working with Snowflake. Strong understanding of data warehousing principles. Advanced SQL expertise, including query optimization. Hands-on experience with ETL processes and data integration. Proficiency in cloud platforms like AWS or Azure. Familiarity with analytics tools for reporting and insights. Strong problem-solving skills and ability to work in fast-paced environments. Ability to work under pressure and meet deadlines. 
Preferred Qualifications: Experience with Python, Spark, or other data processing frameworks. Knowledge of data governance and security best practices. Previous experience in big data processing and performance tuning. Cayuse is an Equal Opportunity Employer. All employment decisions are based on merit, qualifications, skills, and abilities. All qualified applicants will receive consideration for employment in accordance with any applicable federal, state, or local law. Pay Range USD $75.00 - USD $85.00 /Hr.
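The MERGE-style upsert at the heart of Snowflake ETL responsibilities like these can be sketched in plain Python, independent of any warehouse (the row shapes and column names below are invented for illustration):

```python
def merge_upsert(target, staged, key="id"):
    """Simulate a warehouse MERGE: update matched rows, insert new ones.

    target and staged are lists of dicts; returns a new merged list.
    """
    merged = {row[key]: dict(row) for row in target}
    for row in staged:
        merged.setdefault(row[key], {}).update(row)  # update or insert
    return sorted(merged.values(), key=lambda r: r[key])

# Staged data overwrites one existing row and adds a brand-new one.
target = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
staged = [{"id": 2, "amount": 25}, {"id": 3, "amount": 30}]
result = merge_upsert(target, staged)
```

In Snowflake itself the equivalent would be a `MERGE INTO ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT` statement; the sketch just makes the matched/not-matched semantics concrete.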
    $75-85 hourly Auto-Apply 60d+ ago
  • Data Scientist

    Mater Dei Catholic High School 3.8company rating

    Data engineer job in Phoenix, AZ

Data Scientist (open to US applicants only; fully remote within the US) Our creative team is committed to using data to reach findings that inspire fresh ideas. For us, data is far more than numbers: it is the key to revealing hidden opportunities, overcoming tough obstacles, and charting the industry's future. We want a data-driven scientist who can explore datasets, identify emerging trends, and apply their knowledge to genuinely change things. If you relish the chance to spin stories out of raw data, join our team! What you'll do: Examine challenging data like a detective, looking for the trends, connections, and patterns that can support corporate strategy, and exercise critical thinking to investigate closely and develop sound answers. Create and refine machine learning models capable of forecasting customer trends and behavior, contributing significantly to business decision-making. Work collaboratively with teams in marketing, engineering, and product to understand corporate goals and create data-driven answers to real problems. Set up automated data pipelines: building efficient, adaptable pipelines that move data automatically improves its availability and value. Share knowledge: present complex findings clearly and engagingly to technical teams and non-technical stakeholders, so that every piece of data is easy to understand. Keep ahead of the curve by continually learning new approaches, tools, and algorithms, keeping us at the forefront of innovation in data practice. Find and resolve discrepancies to guarantee that the data used for analysis is accurate, high-quality, and uncompromised. 
What we're looking for (US applicants only): A minimum of two to three years of relevant professional experience in data science, machine learning, or a similar discipline. Statistical techniques, data analysis, and model development come naturally to you. Technical mastery: you are fluent in programming languages such as Python, R, and SQL, and comfortable with machine learning libraries including PyTorch, TensorFlow, or scikit-learn. Using Tableau, Power BI, and Matplotlib, you are adept at presenting challenging data in visually appealing, intelligible ways. You review raw data closely with an analytical eye for potential insights, and constantly asking "why" about the figures sharpens your critical thinking. Your creative approach turns facts into strategic advantage and generates original answers to problems. You enjoy collaborating with others, whether developing fresh ideas or presenting your results to non-technical stakeholders. Painstaking attention to detail keeps the data you handle accurate and of the best quality. Why you'll be happy on our team: Working remotely in the US gives you freedom in balancing work and personal life. Join a team that celebrates individuality and creativity, with room to launch fresh ideas and take on leadership responsibilities. Professional development: we provide many chances to pick up new skills, broaden your knowledge in an ever-changing profession, and advance in your career. We offer competitive pay, a full benefits package, and wellness incentives to support your health and happiness. You will directly influence the course of our company; your efforts will help shape its future. How to apply: Ready to harness the power of data and change things? We would love to hear from you. 
Include a short cover letter outlining your interest in data science and how your background fits the position. Note: Only US applicants are eligible for this post.
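The train-then-predict loop this posting describes can be sketched stdlib-only with a nearest-centroid classifier; in practice the libraries the posting names (scikit-learn, PyTorch, TensorFlow) would replace this toy, and all data below is invented:

```python
from statistics import mean

def fit_centroids(X, y):
    """Train step: compute per-class feature means (the simplest model)."""
    labels = sorted(set(y))
    return {c: [mean(x[i] for x, yy in zip(X, y) if yy == c)
                for i in range(len(X[0]))] for c in labels}

def predict(centroids, x):
    """Predict step: assign x to the class with the nearest centroid."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, centroids[c]))
    return min(centroids, key=sq_dist)

# Two invented clusters of 2-D feature vectors with string labels.
X = [[1.0, 1.0], [1.2, 0.9], [5.0, 5.1], [4.8, 5.3]]
y = ["low", "low", "high", "high"]
model = fit_centroids(X, y)
```

The shape of the workflow (fit on labeled examples, predict on new points) is the same one scikit-learn formalizes as `fit`/`predict`.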
    $85k-116k yearly est. 60d+ ago
  • Business Intelligence & Data Engineering - Phoenix

    Meadows of Wickenburg 4.0company rating

    Data engineer job in Phoenix, AZ

This is where you change your story… At Meadows we understand that new directions for career advancement and improvement can be scary, but we are excited to offer you the possibility of a rewarding new chapter with us! Come join us in transforming lives! Who are we? Meadows Behavioral Health is a leader in the behavioral health industry. We offer a range of specialized programs including residential, outpatient and virtual treatment. We provide care for drug and alcohol addiction, trauma, sexual addiction, behavioral health conditions, and co-occurring disorders. We offer state-of-the-art care including neurofeedback and other services. Our evidence-based approach is rooted in decades of clinical experience, with more than 45 years in the field. Our approach is different and success stories from our patients are the proof. Who are you? Are you compassionate and innovative, with a passion to make an impact? Are you looking to get your foot in the door with a company that will believe in your abilities and train you to advance? 75% of our current top-level executive staff are organic internal promotions from within. We might be a perfect fit for you! *Please note - this is a hybrid position based out of our North Phoenix corporate offices. Please only apply if you live within a commutable distance to our location* Position Summary: The Data Engineer will play a crucial role in building, maintaining, and enhancing business intelligence solutions, focusing on data modeling, report building, and data analysis within cloud-based platforms, mainly Microsoft Azure. This role involves designing and optimizing data pipelines using Azure Data Factory, managing data lakes and data warehouses using Microsoft Azure Synapse, and creating meaningful reports and dashboards within Tableau. 
The Data Engineer will be responsible for designing, coding, and supporting functional intra-departmental data procedures and process flows, with additional focus on cross-departmental data integration procedures through ETL process builds. The Data Engineer will also be responsible for performing ad hoc SQL queries to extract data for analysis within the requesting department, ensuring the integrity and accessibility of critical business data to support decision-making. This is not an analyst role; it is focused mainly on the technical "to-do" of making data available to the business in a form in which it can be analyzed. The Data Engineer needs to be comfortable building all aspects of the data pipeline, from source through storage to presentation, rather than focusing on analyzing what the data means to the business. Essential Job Functions: Design, develop, and maintain ETL/ELT pipelines using Microsoft Azure Data Factory to move, transform, and load data from various sources into Azure Data Lakes and Azure Synapse. Primary focus on enhancement of data integration, normalization, and standardization for business intelligence and reporting. Build, manage, and monitor data pipelines and ETL processes along with documentation and usage statistics/KPIs to improve processing performance, database capacity issues, data replication, and other distributed data issues. Responsible for designing and building interactive dashboards, alerts, and reports on both a recurring and ad-hoc basis with our BI tools and platforms, primarily Tableau. Develops reports as needed against multiple types of data sources. Design and implement data models for use in Tableau for reporting and dashboarding. Ensure data integrity, quality, and consistency in Tableau reports and dashboards. Work closely with business stakeholders to understand their data needs and meet them via ad hoc data extraction. 
Write and execute queries against the data warehouse or other sources in support of these ad hoc data requests and quick reporting needs. The sources include both SQL and NoSQL databases. Implement security and access controls for databases and data pipelines to ensure compliance with security standards and protect sensitive data. Maintain proper documentation of data access, database schemas, and transformation logic for transparency and governance. Collaborate with the Director of Infrastructure on cloud data architecture and integration strategies, ensuring that Azure Data Lakes and Azure Synapse are optimized for performance and scalability. Creates and fosters a collaborative, friendly, and supportive environment in which information technology users can raise their requirements and projects and troubleshoot their application and data-related problems. Qualifications Education, Skills, and Experience Requirements: Bachelor's degree in computer science, data science, information technology, or related field, or equivalent education and experience. Relevant certification in cloud platforms, data analytics, or business intelligence is a plus (e.g., Microsoft Certified: Azure Data Engineer Associate, Tableau Desktop Specialist). At least 2 years of experience in data modeling and report building, specifically using Tableau for dashboard and report creation. Direct experience with Microsoft Azure Data Factory, Azure Data Lakes, and Azure Synapse for building, maintaining, and optimizing data pipelines and data integration solutions. Proficiency in SQL for querying relational and non-relational data sources, including experience with ad hoc queries for quick data extraction and reporting. Experience in developing, maintaining, and enhancing ETL/ELT processes for data transformation and loading in cloud-based environments. Strong understanding of data warehousing concepts, data modeling techniques, and best practices for cloud data architecture. 
Proficiency in scripting languages like Python or R is preferred, particularly for data manipulation and analysis. Experience with other AI/ML-based analytics tools or other reporting platforms is a plus, but not required. Examples include Crystal Reports, Power BI, Salesforce CRM Reporting, IBM Cognos, etc. We are a Drug Free Company. All positions are designated as "Safety Sensitive" positions and, in light of our company mission, the Company does not employ medical marijuana cardholders. Following an offer of employment, and prior to reporting to work, all applicants will be required to submit to and pass a substance abuse screen. The Meadows is an equal opportunity employer committed to diversity and inclusion in the workplace. Qualified candidates will receive consideration without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status or any other factor protected by applicable federal, state or local laws. The Meadows provides reasonable accommodations to individuals with disabilities; if you need a reasonable accommodation at any time during the employment process, please reach out.
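The ad hoc SQL extraction work this role centers on can be sketched with Python's built-in sqlite3 module standing in for the warehouse (the schema and figures below are invented for illustration, not Meadows data):

```python
import sqlite3

# In-memory stand-in for the warehouse; table and values are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE admissions (program TEXT, patients INTEGER)")
con.executemany("INSERT INTO admissions VALUES (?, ?)",
                [("residential", 40), ("outpatient", 25), ("virtual", 15)])

# The kind of quick, parameterized rollup a department might request
# before the result is handed to Tableau for dashboarding.
rows = con.execute(
    "SELECT program, patients FROM admissions "
    "WHERE patients >= ? ORDER BY patients DESC", (20,)
).fetchall()
```

Parameterized placeholders (`?`) rather than string formatting keep ad hoc queries safe to reuse with different thresholds.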
    $98k-124k yearly est. 6d ago
  • Sr Data Engineer (MFT - IBM Sterling)

    The Hertz Corporation 4.3company rating

    Data engineer job in Phoenix, AZ

**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment. The ideal candidate will have a passion for technology and the ability to create change and be the facilitator for this transformation. They will have experience tailoring software design, and developing and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met. We expect the starting salary to be around $135k but will be commensurate with experience. **What You'll Do:** TECHNICAL LEADERSHIP + Communication with internal and external business users on Sterling Integrator mappings + Making changes to existing partner integrations to meet internal and external requirements + Design, develop and implement solutions based on standards and processes that establish consistency across enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives. + Diagnose and troubleshoot complex issues, restore services and perform root cause analysis. + Facilitate the review and vetting of these designs with the architecture governance bodies, as required. 
+ Be aware of all aspects of security related to the Sterling environment and integrations INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING + Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows. TEAMWORK & COMMUNICATION + Superior, demonstrated team building and development skills to build strong teams + Ability to communicate effectively with different levels of seniority within the organization + Provide timely updates so that progress against each individual incident can be tracked as required + Write and review high quality technical documentation CONTROL & AUDIT + Ensures their workstation and all processes and procedures follow organization standards CONTINUOUS IMPROVEMENT + Encourages and maintains a 'best practice sharing' culture, always striving to find ways to improve service and change mindsets. **What We're Looking For:** + Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required + 5+ years of IT experience + 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred) + 3+ years' experience with scripting to enable automation of standard activities (examples: Ansible, Python, Bash, Java) + Strong interpersonal and communication skills, with Agile/Scrum experience. + Strong problem solving and critical thinking skills with a proven record of identifying and diagnosing problems and solving complex problems with simple, logical solutions, along with the ability to develop custom setups. + Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels. + Travel, transportation, or hospitality experience preferred + Experience designing application data models for mobile or web applications preferred + Excellent written and verbal communication skills. 
+ Flexibility in scheduling which may include nights, weekends, and holidays **What You'll Get:** + Up to 40% off the base rate of any standard Hertz Rental + Paid Time Off + Medical, Dental & Vision plan options + Retirement programs, including 401(k) employer matching + Paid Parental Leave & Adoption Assistance + Employee Assistance Program for employees & family + Educational Reimbursement & Discounts + Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness + Perks & Discounts -Theme Park Tickets, Gym Discounts & more The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world. **US EEO STATEMENT** At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran
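The core guarantee of managed file transfer, integrity-checked delivery, can be sketched with the standard library; this is a generic illustration of the idea, not Sterling-specific behavior:

```python
import hashlib
import shutil
from pathlib import Path

def transfer_verified(src: Path, dst: Path) -> str:
    """Copy src to dst and confirm the bytes arrived intact via SHA-256.

    Returns the hex digest on success; raises on a checksum mismatch,
    which is the signal an MFT platform would use to retry or alert.
    """
    def digest(p: Path) -> str:
        return hashlib.sha256(p.read_bytes()).hexdigest()

    expected = digest(src)
    shutil.copy2(src, dst)  # copy contents and metadata
    if digest(dst) != expected:
        raise OSError(f"checksum mismatch transferring {src} -> {dst}")
    return expected
```

Real MFT stacks layer partner onboarding, routing, and secure protocols (SFTP, AS2) on top, but every delivery ultimately reduces to this verify-after-write step.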
    $135k yearly 60d+ ago
  • Big Data Consultant

    Career Guidant

    Data engineer job in Phoenix, AZ

Career Guidant, an internationally acclaimed, trusted, multi-faceted organisation, counts Information Technology Custom Learning Services for Enterprises, Lateral Staffing Solutions, Information Technology Development & Consulting, Infrastructure & Facility Management Services and Technical Content Development among its core competencies. Our experienced professionals bring a wealth of industry knowledge to each client and operate in a manner that produces superior quality and outstanding results. Career Guidant's proven and tested methodologies ensure that client satisfaction remains the primary objective. We are committed to our core values of Client Satisfaction, Professionalism, Teamwork, Respect, and Integrity. With its large network of delivery centres, support offices and partners across India, Asia Pacific, the Middle East, the Far East, Europe and the USA, Career Guidant is committed to rendering the best service to each client, working closely to ensure their operations continue to run smoothly. Our Mission: "To build customer satisfaction, and strive to provide the complete Information Technology solution you need to stay ahead of your competition." Please contact us if you have any queries about our services. Job Description • At least 5 years of design and development experience in Big Data, Java or data warehousing related technologies • At least 3 years of hands-on design and development experience with Big Data related technologies - Pig, Hive, MapReduce, HDFS, HBase, YARN, Spark, Oozie, Java and shell scripting • Should be a strong communicator and be able to work independently with minimum involvement from client SMEs • Should be able to work in a team in a diverse/multiple-stakeholder environment Qualifications • Bachelor's degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education. Additional Information Note: No OPT or H1 for this position. Client: Infosys
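The MapReduce model named in the skill list above can be sketched in a few lines of Python: a map phase emits key/value pairs, a shuffle sorts them by key, and a reduce phase aggregates per key (the example documents are invented):

```python
from itertools import groupby

def map_phase(docs):
    """Map: emit a (word, 1) pair for every word in every document."""
    return [(word, 1) for doc in docs for word in doc.split()]

def reduce_phase(pairs):
    """Shuffle + reduce: sort pairs by key, then sum counts per key."""
    shuffled = sorted(pairs)  # stand-in for the framework's shuffle/sort
    return {key: sum(count for _, count in group)
            for key, group in groupby(shuffled, key=lambda kv: kv[0])}

counts = reduce_phase(map_phase(["big data big", "data pipelines"]))
```

Hadoop distributes exactly these phases across machines; Hive and Pig compile queries and scripts down to the same map/shuffle/reduce plan.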
    $73k-100k yearly est. 2d ago
  • Data Engineer (Contractor)

    Glint Tech Solutions 4.5company rating

    Data engineer job in Phoenix, AZ

We are seeking an experienced Senior Data Engineer with a strong background in building, deploying, and supporting data ingestion and batch applications on Google Cloud. The ideal candidate will have extensive experience with BigQuery, Cloud Storage, Dataproc, and Cloud Composer/Airflow, along with proficiency in SQL, Python, PySpark, and Hive. You will work collaboratively with cross-functional teams to design and implement robust data solutions that drive business intelligence and analytics. Key Responsibilities: Design, build, and maintain scalable data pipelines for data ingestion and processing on Google Cloud Platform (GCP). Leverage BigQuery, Cloud Storage, Dataproc, and Cloud Composer/Airflow to create and optimize data workflows and batch processing applications. Develop and optimize complex SQL queries for data transformation and analysis. Write and maintain production-level Python scripts and PySpark jobs for data processing and analysis. Implement data governance and ensure data quality, accuracy, and security throughout the data lifecycle. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver actionable insights. Monitor and troubleshoot data pipelines, ensuring timely data delivery and performance optimization. Stay up-to-date with emerging technologies and best practices in big data engineering and analytics. Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. 5+ years of experience in data engineering or a related role, with a focus on big data technologies. Strong proficiency in Google Cloud services, particularly BigQuery, Cloud Storage, Dataproc, and Cloud Composer/Airflow. Expertise in SQL and experience with data manipulation in Hive and other big data frameworks. Solid programming skills in Python, with hands-on experience in PySpark. Familiarity with Hadoop and Spark ecosystems. 
Strong analytical and problem-solving skills with attention to detail. Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
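The Cloud Composer/Airflow-style orchestration this role describes reduces to running tasks in dependency order; here is a minimal sketch with the stdlib `graphlib` module (Python 3.9+; the task names are invented):

```python
from graphlib import TopologicalSorter

# Invented pipeline DAG: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG wires ingest -> transform -> load steps.
dag = {
    "clean": {"ingest"},
    "validate": {"ingest"},
    "load": {"clean", "validate"},
}
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and distributed workers on top, but the scheduling core is this same topological ordering of a task graph.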
    $91k-128k yearly est. 60d+ ago
  • Big Data Engineer

    Practice Xpert Inc. 3.7company rating

    Data engineer job in Phoenix, AZ

TekWissen provides a unique portfolio of innovative capabilities that seamlessly combines client insights, strategy, design, software engineering and systems integration. Our tightly integrated offerings are tailored to each client's requirements and span the services spectrum from Application Development/Maintenance and Testing and IT Consulting & Staffing for IT Infrastructure Management through strategic consulting and industry-oriented business process services. Job Description 1. Very strong server-side Java experience, especially in open source, data-intensive, distributed environments. 2. Should have worked on open source products; contribution towards them would be an added advantage. 3. In-depth, applied knowledge of various Java, J2EE and EAI patterns. 4. Has implemented complex projects dealing with considerable data sizes (GB/PB) and high complexity. 5. Well versed in various architectural concepts (multi-tenancy, SOA, SCA, etc.) and NFRs (performance, scalability, monitoring, etc.). 6. Good understanding of algorithms, data structures, and performance optimization techniques. 7. Knowledge of database principles, SQL, and experience working with large databases beyond just data access. 8. Exposure to the complete SDLC and PDLC. 9. Capable of working as an individual contributor and within a team. 10. Self-starter and resourceful personality with the ability to manage pressure situations. 11. Should have experience/knowledge of working with batch processing/real-time systems using various open source technologies like Solr, Hadoop, NoSQL DBs, Storm, Kafka, etc. Role & Responsibilities • Implementation of various solutions arising out of large-scale data processing (GBs/PBs) over various NoSQL, Hadoop and MPP based products • Active participation in various architecture and design calls with big data customers. • Working with senior architects and providing implementation details to offshore teams. 
• Conducting sessions/ writing whitepapers/ Case Studies pertaining to BigData • Responsible for Timely and quality deliveries. • Fulfill organization responsibilities - Sharing knowledge and experience within the other groups in the org., conducting various technical sessions and trainings Additional Information Thanks & Regards, Aravind Jakku aravind.j-tekwissen.com ************
    $90k-127k yearly est. 2d ago
  • Cloud Analytics in Big Data/Hadoop

    Sonsoft

    Data engineer job in Phoenix, AZ

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of software development, software consultancy, and information-technology-enabled services.

    Job Description
    • At least 4+ years of overall design and development experience
    • At least 4+ years working in the data management domain with big data, Oracle, SQL, or other database solutions
    • At least 2+ years of experience with big data and analytics solutions: Hadoop, MapReduce, Pig, Hive, Spark, Storm, Amazon Kinesis, AWS EMR, AWS Redshift, DynamoDB, Azure HDInsight, Azure Cortana Analytics, Azure Data Lake, and other technologies
    • Development experience in Java, Python, or Scala
    • Experience with cloud technologies preferred (AWS or Azure)
    • Experience with DevOps technologies preferred
    • Understanding of market and technology trends
    • Analytical and problem-solving skills

    Qualifications
    • Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
    • At least 4 years of experience with information technology.

    Additional Information
    U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
    Note: This is a full-time, permanent job opportunity. Only US citizens, green card holders, GC-EAD, H4-EAD, and L2-EAD may apply. No OPT-EAD, TN visa, or H1B consultants, please. Please mention your visa status in your email or resume.
    $78k-107k yearly est. 60d+ ago
  • Data Scientist

    Isolved HCM

    Data engineer job in Phoenix, AZ

    Summary/objective
    We are seeking a highly skilled Data Scientist to focus on building and deploying predictive models that identify customer churn risk and upsell opportunities. This role will play a key part in driving revenue growth and retention strategies by leveraging advanced machine learning, statistical modeling, and large-scale data capabilities within Databricks.

    Why Join Us?
    * Be at the forefront of using Databricks AI/ML capabilities to solve real-world business challenges.
    * Directly influence customer retention and revenue growth through applied data science.
    * Work in a collaborative environment where experimentation and innovation are encouraged.

    Core Job Duties:
    Model Development
    * Design, develop, and deploy predictive models for customer churn and upsell propensity using Databricks ML capabilities.
    * Evaluate and compare algorithms (e.g., logistic regression, gradient boosting, random forest, deep learning) to optimize predictive performance.
    * Incorporate feature engineering pipelines that leverage customer behavior, transaction history, and product usage data.

    Data Engineering & Pipeline Ownership
    * Build and maintain scalable data pipelines in Databricks (using PySpark, Delta Lake, and MLflow) to enable reliable model training and scoring.
    * Collaborate with data engineers to ensure proper data ingestion, transformation, and governance.

    Experimentation & Validation
    * Conduct A/B tests and backtesting to validate model effectiveness.
    * Apply techniques for model monitoring, drift detection, and retraining in production.

    Business Impact & Storytelling
    * Translate complex analytical outputs into clear recommendations for business stakeholders.
    * Partner with Product and Customer Success teams to design strategies that reduce churn, increase upsell, and improve customer retention KPIs.

    Minimum Qualifications:
    * Master's or PhD in Data Science, Statistics, Computer Science, or a related field (or equivalent industry experience).
    * 3+ years of experience building predictive models in a production environment.
    * Strong proficiency in Python (pandas, scikit-learn, PySpark) and SQL.
    * Demonstrated expertise using Databricks for:
      * Data manipulation and distributed processing with PySpark.
      * Building and managing models with MLflow.
      * Leveraging Delta Lake for efficient data storage and retrieval.
      * Implementing scalable ML pipelines within Databricks' ML Runtime.
    * Experience with feature engineering for behavioral and transactional datasets.
    * Strong understanding of customer lifecycle analytics, including churn modeling and upsell/recommendation systems.
    * Ability to communicate results and influence decision-making across technical and non-technical teams.

    Preferred Qualifications:
    * Experience with cloud platforms (Azure Databricks, AWS, or GCP).
    * Familiarity with Unity Catalog for data governance and security.
    * Knowledge of deep learning frameworks (TensorFlow, PyTorch) within Databricks.
    * Exposure to MLOps best practices (CI/CD for ML, model versioning, monitoring).
    * Background in SaaS, subscription-based businesses, or customer analytics.

    Physical Demands
    Prolonged periods of sitting at a desk and working on a computer. Must be able to lift up to 15 pounds.
    Travel Required: Limited
    Work Authorization: Employees must be legally authorized to work in the United States.
    FLSA Classification: Exempt
    Location: Any
    Effective Date: 9/16/2025

    About isolved
    isolved is a provider of human capital management (HCM) solutions that help organizations recruit, retain, and elevate their workforce. More than 195,000 employers and 8 million employees rely on isolved's software and services to streamline human resource (HR) operations and deliver employee experiences that matter. isolved People Cloud is a unified yet modular HCM platform with built-in artificial intelligence (AI) and analytics that connects HR, payroll, benefits, and workforce and talent management into a single solution that drives better business outcomes. Through the Sidekick Advantage, isolved also provides expert guidance, embedded services, and an engaged community that empowers People Heroes to grow their companies and careers. Learn more at *******************
    isolved is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status. isolved is a progressive and open-minded meritocracy. If you are smart and good at what you do, come as you are. Visit ************************** for more information regarding our incredible culture and focus on our employee experience. Visit ************************* for a comprehensive list of our employee total rewards offerings.
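    The churn-propensity modeling this posting describes can be sketched in miniature. The following is an illustrative, standard-library-only example of fitting a logistic-regression churn model and scoring two customer profiles; the features, synthetic labels, and training settings are all assumptions for demonstration, not isolved's actual pipeline (which, per the posting, runs on Databricks with PySpark and MLflow):

    ```python
    import math
    import random

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def train_logistic(X, y, lr=0.1, epochs=500):
        """Fit logistic-regression weights and bias with plain stochastic gradient descent."""
        n_features = len(X[0])
        w = [0.0] * n_features
        b = 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
                err = p - yi  # gradient of log loss w.r.t. the logit
                for j in range(n_features):
                    w[j] -= lr * err * xi[j]
                b -= lr * err
        return w, b

    def predict_proba(w, b, xi):
        """Churn probability for one customer's feature vector."""
        return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)

    # Hypothetical features, scaled to [0, 1]:
    #   [inactivity (days since last login), support-ticket volume]
    random.seed(0)
    X, y = [], []
    for _ in range(200):
        inactive = random.random()
        tickets = random.random()
        # Synthetic label: inactive customers who file many tickets tend to churn.
        X.append([inactive, tickets])
        y.append(1 if inactive + tickets > 1.0 else 0)

    w, b = train_logistic(X, y)
    risk_high = predict_proba(w, b, [0.9, 0.9])  # long-inactive, many tickets
    risk_low = predict_proba(w, b, [0.1, 0.1])   # recently active, few tickets
    ```

    In a Databricks setting, the same idea would scale out via a PySpark feature pipeline and be tracked/registered through MLflow, with the evaluate-and-compare step the posting mentions (gradient boosting, random forest, etc.) run against this kind of baseline.
    
    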
    $74k-107k yearly est. 2d ago
  • Data Scientist

    V15P1Talonnn

    Data engineer job in Arizona City, AZ

    Java Full Stack Developer (Job Code: J2EE)
    • 3 to 10 years of experience developing web-based applications with Java/J2EE technologies
    • Knowledge of RDBMS and NoSQL data stores and polyglot persistence (Oracle, MongoDB, etc.)
    • Knowledge of event sourcing and distributed messaging systems (Kafka, RabbitMQ)
    • AngularJS, React, Backbone, or other client-side MVC experience
    • Experience with JavaScript build tools and dependency management (npm, bower, grunt, gulp)
    • Experience creating responsive designs (Bootstrap, mobile, etc.)
    • Experience with unit and automation testing (Jasmine, Protractor, JUnit)
    • Expert knowledge of build tools and dependency management (Gradle, Maven)
    • Knowledge of Domain-Driven Design concepts and microservices
    • Participation in software design and development using a modern Java and web technology stack
    • Proficiency in Spring Boot and Angular
    • Sound understanding of microservices architecture
    • Good understanding of event-driven architecture
    • Experience building web services (REST/SOAP)
    • Experience writing JUnit tests; experience with TDD is good to have
    • Expertise in developing highly responsive web applications using Angular 4 or above
    • Good knowledge of HTML/HTML5/CSS, JavaScript/AJAX, and XML
    • Good understanding of SQL and relational databases, as well as NoSQL databases
    • Familiarity with design patterns; able to design small- to medium-complexity modules independently
    • Experience with Agile or similar development methodologies, including TDD, Scrum, and Kanban
    • Experience with a versioning system (e.g., CVS/SVN/Git)
    • Strong verbal communication, cross-group collaboration skills, and analytical, structured, strategic thinking
    • Great interpersonal skills, cultural awareness, and belief in teamwork
    • Collaboration with product owners, stakeholders, and potentially globally distributed teams
    • Ability to work cross-functionally in an Agile environment
    • Excellent problem-solving, organizational, and analytical skills

    Qualification: BE / B.Tech / MCA / ME / M.Tech
    ****************************
    $73k-105k yearly est. 60d+ ago
  • Data Engineer Training

    Agap Technologies

    Data engineer job in Mesa, AZ

    We at Agap Technologies Inc. help our clients build successful businesses by enabling them to synergize state-of-the-art technology with exceptional talent. We offer a full suite of IT solutions and services, from custom software development to staffing. Our multidisciplinary team of experts in areas like data analysis, automation, personnel development and management, and project management helps us offer a unique set of tech-driven solutions and services that allow our clients to achieve their business objectives in the most efficient way possible. Our technical competence is further built upon by our high standards of professionalism, diligence, and ethics, which have enabled us to deliver top-notch services to our clients and exceed expectations every time.

    Job Description
    Our Training Features:
    · You will receive top-quality instruction that is renowned in online IT training.
    · Trainees receive immediate responses to any training-related queries, technical or otherwise. We advise our trainees not to wait until the next class to seek answers to any technical issue.
    · Training sessions are conducted by real-time instructors with real-time examples.
    · Every training session is recorded and posted to the batch after each weekend class.
    · We are offering online training on data engineering.
    · OPT STEM extension: guidance and support for applying for the 24-month OPT STEM extension.
    · OPT employment letters: help with drafting and obtaining OPT employment letters that meet USCIS requirements.
    · We provide training in the technology of your choice.
    · Good online-training virtual classroom environment.
    · Highly qualified and experienced trainers.
    · Professional environment.
    · Special interview training.
    · Training for skill enhancement.
    · Study material and lab material provided.
    · E-Verified company.

    If you are interested, or if you know anyone looking for a change, please feel free to call or email me with details or questions. I look forward to seeing resumes from you or your known and highly recommended candidates.

    Thanks

    Additional Information
    All your information will be kept confidential according to EEO guidelines.
    $80k-111k yearly est. 2d ago
  • Big Data Engineer

    Eateam

    Data engineer job in Phoenix, AZ

    6 to 12 months of AWS experience; should have EMR, Lambda, S3, and EC2
    Should have strong hands-on experience with big data technologies
    Must be strong in Hive, Spark, NoSQL, and AWS
    Strong experience in Java or Scala
    Must be strong in AWS technologies (AWS Certified)

    Additional Information
    Regards,
    Ria
    $80k-111k yearly est. 2d ago

Learn more about data engineer jobs

How much does a data engineer earn in Casa Grande, AZ?

The average data engineer in Casa Grande, AZ earns between $69,000 and $128,000 annually. This compares to the national range for data engineers of $80,000 to $149,000.

Average data engineer salary in Casa Grande, AZ

$94,000

What are the biggest employers of Data Engineers in Casa Grande, AZ?

The biggest employers of Data Engineers in Casa Grande, AZ are:
  1. Lucid Motors