Engineer 1
Requirements engineer job in Chandler, AZ
Compensation Type: Hourly
Highgate Hotels:
Highgate is a leading real estate investment and hospitality management company with over $15 billion of assets under management and a global portfolio of more than 400 hotels spanning North America, Europe, the Caribbean, and Latin America.
With a 30-year track record as an innovator in the hospitality industry, this forward-thinking company provides expert guidance through all stages of the property cycle, from planning and development through recapitalization or disposition. Highgate continues to demonstrate success in developing a diverse portfolio of bespoke lifestyle hotel brands, legacy brands, and independent hotels and resorts, featuring contemporary programming and digital acumen. The company utilizes industry-leading revenue management tools that efficiently identify and predict evolving market dynamics to drive outperformance and maximize asset value.
With an executive team of seasoned hospitality leaders and corporate offices worldwide, Highgate is a trusted partner for top ownership groups and major hotel brands.
Location: Hampton Inn Chandler, 7333 W. Detroit Street, Chandler, AZ 85226
Overview:
The Engineer, Level 1, is responsible for ensuring that the property is maintained in the best possible condition at all times with the least amount of inconvenience to customers and employees.
Responsibilities:
Make repairs to hotel air conditioning system: change filters, clean coils, replace motors.
Perform preventive maintenance on all equipment (e.g., boilers, chillers, HVAC (Heating, Ventilation, and Air Conditioning), electrical, etc.).
Take required readings on equipment.
Test cooling tower and record readings.
Replace and program televisions as needed.
Replace light switches, receptacles, light bulbs and fixtures.
Perform furniture repair.
Replace and repair pumps.
Perform plumbing repairs (e.g., clearing clogged drains, repairing copper pipe, changing washers and handles, replacing drain fittings, etc.).
Understand and be able to read blueprints and wiring diagrams.
Trace and repair all types of water lines.
Troubleshoot and repair kitchen equipment.
Maintain repair and preventive maintenance records.
Perform and maintain work to local, state and Federal codes.
Test, clean and repair swimming pools and spas.
Paint designated areas.
Repair and finish sheet rock.
Repair all types of wall coverings.
Repair and program hotel electronic lock system.
Qualifications:
High School diploma or equivalent and/or experience in a hotel or a related field required.
At least one year of progressive experience in a hotel or related field preferred.
Trade school and/or College course work in related field helpful.
Flexible and long hours are sometimes required.
Heavy work: exerting in excess of 100 pounds of force occasionally, in excess of 50 pounds of force frequently, and/or in excess of 20 pounds of force constantly to move objects.
Must be able to communicate effectively, both verbally and in writing, with all levels of employees and guests in an attentive, friendly, courteous, and service-oriented manner.
Must be effective at listening to, understanding, and clarifying concerns raised by employees and guests.
Must be able to multitask and prioritize departmental functions to meet deadlines.
Approach all encounters with guests and employees in an attentive, friendly, courteous and service-oriented manner.
Attend all hotel required meetings and trainings.
Maintain regular attendance in compliance with Highgate Hotel Standards, as required by scheduling, which will vary according to the needs of the hotel.
Maintain high standards of personal appearance and grooming, which includes wearing the proper uniform and nametag.
Comply with Highgate Hotel Standards and regulations to encourage safe and efficient hotel operations.
Maximize efforts towards productivity, identify problem areas and assist in implementing solutions.
Must be effective in handling problems, including anticipating, preventing, identifying and solving problems as necessary.
Must be able to understand and apply complex information, data, etc. from various sources to meet appropriate objectives.
Must be able to cross-train in other hotel related areas.
Must be able to maintain confidentiality of information.
Must be able to show initiative, including anticipating guest or operational needs.
Perform other duties as requested by management.
Java Backend Engineer
Requirements engineer job in Phoenix, AZ
Java Backend Developer (Vert.x & Spark - Good to Have)
We're looking for a strong Java engineer with experience in backend development and web technologies. Vert.x and Apache Spark experience is a plus.
Key Skills:
Java, web technologies
Vert.x & Spark (nice to have)
Team player, Agile mindset
Hybrid work (3 days onsite)
Frontend Engineer (React & Next.js) - Salt Lake City, UT
Requirements engineer job in Salt Lake City, UT
Frontend Engineer
We're looking for a Front-End Engineer with exceptional React and Next.js expertise to help us expand our banking platform. You'll transform wireframes and designs into elegant, high-performance interfaces and collaborate closely with backend engineers to deliver seamless user experiences.
What You'll Do
Implement responsive, accessible, and pixel-perfect UI using React, Next.js, TypeScript, HTML5, and CSS.
Collaborate with backend teams to integrate APIs and ensure smooth data flow.
Optimize performance for complex, interactive features and rich forms.
Write unit tests and E2E tests to maintain quality and prevent regressions.
Contribute to CI/CD pipelines and advocate for best practices in front-end development.
Participate in architectural discussions.
What We're Looking For
Professional experience building modern web applications with React and Next.js.
Strong foundation in TypeScript, HTML5, and CSS.
Experience with state management (Redux, Context API) and component libraries.
Familiarity with AWS is a plus.
Bonus: Experience with GraphQL, Tailwind CSS, or microservices architecture.
Proven ability to work in agile teams and communicate effectively.
Although we have a global team, we would prefer to find someone local to Utah and available to spend some time at our Base Camp in downtown Salt Lake City.
What Sets You Apart
You've led teams or projects and know how to balance technical excellence with collaboration.
You're passionate about building scalable, maintainable front-end architectures.
You embrace testing and automation as part of your development DNA.
You stay ahead of trends in React and modern front-end ecosystems.
You thrive in environments where innovation and speed matter, and you make others better by sharing knowledge.
Azure Cloud Engineer
Requirements engineer job in Phoenix, AZ
AgreeYa is a global Systems Integrator seeking an experienced Azure Cloud Engineer to join our growing team. Join our dynamic team as a Level 2 Azure Cloud Engineer, where you will be the go-to technical resource driving the reliability, performance, and security of enterprise-scale networks and cloud environments. You'll troubleshoot complex issues, lead system upgrades, perform proactive health checks, and collaborate closely with infrastructure, security, and operations teams to keep mission-critical systems running smoothly.
Job Responsibilities
Design, build, and maintain Azure-based cloud infrastructure, including VMs, VNets, storage, load balancers, and other core cloud services.
Implement Infrastructure-as-Code (IaC) using Terraform, Bicep, or ARM templates.
Migrate on-premises workloads and applications to Azure with minimal downtime.
Build and maintain CI/CD pipelines using Azure DevOps, GitHub Actions, or similar tools.
Automate operational tasks using PowerShell, Bash, or Python.
Implement and support Azure DevOps pipelines, artifacts, and release management processes.
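Operational automation in Python, as the responsibilities above mention, often starts with small triage scripts. The sketch below is a hedged illustration that summarizes failures from an exported operations log; the log shape and field names are invented for the example and are not Azure's actual activity-log schema:

```python
import json
from collections import Counter

# Hypothetical exported operations log. Real Azure activity logs use a
# different, richer schema; these field names are invented for the sketch.
LOG = json.loads("""[
  {"resource": "vm-web-01", "operation": "restart", "status": "Succeeded"},
  {"resource": "vm-web-02", "operation": "restart", "status": "Failed"},
  {"resource": "vm-db-01",  "operation": "backup",  "status": "Failed"}
]""")

def failures_by_operation(entries):
    """Count failed entries per operation so on-call engineers can triage."""
    return Counter(e["operation"] for e in entries if e["status"] == "Failed")

for op, count in sorted(failures_by_operation(LOG).items()):
    print(f"{op}: {count} failed")
```

The same summarization could be wired into a scheduled pipeline task; the point is only that scripting here means small, inspectable tools, not a framework.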
Required Skills & Experience
Experience working as a Cloud Engineer or similar role, with a strong focus on Microsoft Azure services.
Experience with automation and scripting using PowerShell, Bash, or Python.
Solid experience with Azure DevOps pipelines, CI/CD workflows, repositories, artifacts, and release management.
Preferred Skills & Experience:
AZ-104: Azure Administrator Associate
AZ-305: Azure Solutions Architect Expert
AZ-400: DevOps Engineer Expert
Education Required:
Bachelor's degree in Computer Science, Information Technology, Engineering, or equivalent practical experience.
PLM Engineer III
Requirements engineer job in Reno, NV
PLM Engineer III @ Reno, NV
In this role, the PLM Engineer III will serve as a technical expert in the areas of infrastructure related to manufacturing information systems. The engineer will identify methods and architect solutions to provide a high level of service to end-users of the application(s). This role will help build out, maintain, and troubleshoot a rapidly expanding application system and its infrastructure. The Engineer will work closely with other engineers, analysts, and leadership to obtain regular feedback on design and execution. Additionally, the Engineer will provide periodic operational support and maintenance of the manufacturing systems.
Essential Duties:
• Responsible for the maintenance, optimization, troubleshooting, upgrades, installation, and documentation of infrastructure related to Product Lifecycle Management (PLM) systems, including PLM database management, project management systems, and CAD and PDM systems
• Administration of new and existing technologies, including the design, documentation, installation, configuration, maintenance, integration, and testing of related systems
• Collaborates with business customers on hardware, middleware, and application requirements.
• Works with vendors, engineers, analysts, and leadership to ensure the integrity of databases, application software, and data
• Performs regular system monitoring, verifying the integrity, health and availability of all systems and key processes, reviewing application and system logs, and ensuring completion of scheduled jobs
• Leads problem-solving efforts regarding system and service matters and if necessary, facilitates support with outside personnel, organizations, and/or vendors
• Monitors tech-industry trends and brings new ideas and innovative solutions
• Develops new deliverables as needed based on operational requirements, following best practices for documentation and delivery
• Analyzes data to drive decision making and recommendations.
• Configures and deploys devices and peripherals as necessary
• Resolves on-site issues, problems, and questions assigned from a ticketing system
Qualifications: Required and/or Preferred
Education:
• Basic: Bachelor's degree in Information Systems, or similar discipline, and/or equivalent related work experience
• Preferred: PLM Certification
Essential Qualifications:
• Minimum of 4 years of hands-on experience in a factory or enterprise setting
• Minimum of 2 years of hands-on experience with PLM
• Technical expertise with PLM systems such as CAD, PDM, ERP, PMO
• Excellent communication (oral and written) skills, including the ability to explain and present technical information
• Effectively train and advise users on information technology issues
• Prepare written documentation in a clear and concise style
• Demonstrated ability to work in a fast-paced, flexible environment and take the initiative to learn new tools and concepts quickly
• Creative thinker with attention to detail, strong analytical, multitasking, and interpersonal skills
• Approachable and adaptable
• Ability to work extended hours or outside of normal business hours, including weekends as needed
• Skills in MS Office (Word, PowerPoint, Excel, Outlook)
• Must have working-level knowledge of the English language, including reading, writing, and speaking English
Preferred Qualifications:
• Prior experience in a factory environment desirable
• Technical experience with database management & language (SQL, Oracle)
• Able to work independently and in a team environment, as well as with cross-functional groups
• Resilient, self-motivated, and able to work well under pressure
GCP Engineer with BigQuery, PySpark
Requirements engineer job in Phoenix, AZ
Job Title: GCP Engineer with BigQuery, PySpark
Experience Required - 7+ Years
Must Have Technical/Functional Skills
GCP Engineer with BigQuery, PySpark, and Python experience
Roles & Responsibilities
· 6+ years of professional experience, with at least 4+ years of GCP Data Engineer experience
· Experience working on GCP application migration for large enterprises
· Hands-on experience with Google Cloud Platform (GCP)
· Extensive experience with ETL/ELT tools and data transformation frameworks
· Working knowledge of data storage solutions such as BigQuery or Cloud SQL
· Solid skills in data orchestration tools such as Airflow or Cloud Workflows
· Familiarity with Agile development methods
· Hands-on experience with Spark, Python, and PySpark APIs
· Knowledge of various shell scripting tools
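As a toy illustration of the shell-scripting knowledge the list above closes with, the following sketch counts data rows in CSV extracts; the file layout is an assumption for the example, not something from the posting:

```shell
#!/bin/sh
# Count data rows (excluding the header) in each CSV extract in the
# current directory -- a typical lightweight pipeline sanity check.
for f in *.csv; do
  [ -e "$f" ] || continue          # no CSVs present: skip the literal glob
  rows=$(($(wc -l < "$f") - 1))    # subtract the header line
  echo "$f: $rows rows"
done
```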
Salary Range - $90,000 to $120,000 per year
Interested candidates, please share your updated resume at *******************
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Time Off, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
Azure Cloud Engineer
Requirements engineer job in Phoenix, AZ
This is not a DevOps engineer role, but a Cloud Engineer role.
Skills & Qualifications:
Required: Advanced Azure certifications
Ten plus years of experience in designing, building, and implementing distributed cloud architecture within Azure cloud environments - APIM, APIOps, Cosmos DB, Event Hub, AKS, etc.
Terraform experience at a 7-8 out of 10 level is a must - not limited to existing templates, but writing Terraform from scratch.
Experience building complex cloud infrastructure - including the ability to write modules, follow best practices, and manage state files - is a must.
Understanding of Linux fundamentals; Windows experience welcome.
Understanding of security scan tools (e.g. Azure Defender, Container Scan, API security, perimeter scan, etc.) is much desired.
Experience with the full DevOps cycle spanning Cloud Infrastructure, Cloud Security, Observability, CI/CD, Secure CI/CD, ProdOps, etc.
Experience designing and implementing large-scale platforms with high resiliency, availability, and reliability using public cloud infrastructure.
Experience implementing security architecture and best practices in cloud infrastructure.
Strong verbal and written skills; ability to communicate clearly.
iSeries Engineer
Requirements engineer job in Las Vegas, NV
Taurean Consulting Group is a 100% Woman-Owned IT Staffing and Project Solutions company built on deep relationships. With over 25 years of experience in Technology Staffing, we match candidates to the culture of an organization as well as required skill sets.
Our client is seeking an iSeries Power System Engineer to join their team in Las Vegas, NV. This is an On-Site, Direct Hire role.
Successful candidates excel at and enjoy:
Provide support and subject matter expertise for installations, configurations, maintenance and support of iSeries / Power Systems with regard to physical hardware, OS and virtualized environments to ensure proper performance and up-time.
Resolve escalated complex service requests and incidents assigned to the team through the Help Desk Ticketing System as related to iSeries / Power Systems activities.
Interact with vendors on support agreements (scope development), ticket management and issue escalations.
Participate in projects to improve systems, processes and service delivery quality within the department.
Your previous experience includes:
5 years' experience in iSeries / Power Systems Administration, including OS V7R1+ Management.
5 years' significant experience managing iSeries / Power Systems infrastructure, including HMC, VIOS, Virtualization & External or Internal Storage (LPAR) and Patch Management (PTFs).
Minimum 5 years' experience with High Availability applications, including iTera configuration & administration.
5 years' experience with iSeries / Power Systems access, utility, security & auditing applications including IBM Client Access, Director, Navigator, BRMS, iSecurity, Help Systems, abs Message or related applications.
Salary: $80k - $100K
Does this sound like the job for you? If so, please apply today! Let's do this!
Not sure this is a fit? We can help! Contact us at ************ to speak with one of our consultants about your career path!
Java Google Query engineer
Requirements engineer job in Phoenix, AZ
Role : JAVA GCP Engineer
Type : Fulltime
Please Apply : ************************************************************** Id=535855&company=Atos&st=CE3C6B0E5BD8B3334CD9AFDC18C3A53009E33FC3
Job description :
Required Skills & Experience
8+ years of hands-on experience in Java backend development.
Strong proficiency in Core Java, J2EE, and microservices architecture.
Hands-on experience with Spring, Spring Boot, and RESTful web services.
Solid understanding of Object-Oriented Design Patterns and software engineering best practices.
Hands-on experience with GCP BigQuery - including querying, data modeling, and performance tuning
Good understanding of microservices architecture.
DevOps Engineer
Requirements engineer job in Lehi, UT
DevOps Engineer 3
Job Details
DevOps Engineer 3 (Contract)
Duration: 1/12/2026 to 5/08/2026
Team: Campaign Managed Cloud and Fleet Operations
Key Responsibilities:
Provide deep technical troubleshooting for escalated issues that involve Adobe Campaign's most technically complex or large-scale customers;
Use troubleshooting, monitoring, and reporting tools to analyze the root cause of serious and impactful technical issues, and build stable and sustainable solutions and improvements;
Work closely with customer care, internal escalation teams, product management, and engineering to seek solutions for customers and drive ownership of tasks toward completion;
Drive and improve the whole lifecycle of operational readiness from inception and design, through deployment, operation and refinement;
Develop tools, operational enhancements, and automated solutions that enable self-service configuration changes, speed deployments and improve monitoring in support of business-critical customer facing SaaS applications and environments;
Ensure proper monitoring and metrics are being built into the applications before going to production.
Required Skills & Qualifications:
Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or related field.
Full Stack troubleshooting experience including networking, operating system (Debian, CentOS), Apache, HA Proxy, Nginx, RDBMS
Experience leveraging monitoring tools such as Splunk, New Relic, Nagios for troubleshooting
Experience with AWS and/or Azure stack - particularly in the areas of networking (VPCs, security groups), VMs (EC2), databases (RDS), load balancing (ELB, ALB)
Excellent information management practices, such as detailed documentation, usage of wikis, and other collaboration tools
Ability to scope project work, estimate effort and then break down work into sub-tasks.
Experience developing applications in one or more of the following: Python, Java or Go.
Strong comprehension of continuous integration and continuous deployment methodologies.
Excellent written and verbal communication skills, demonstrating the ability to effectively convey technical information to both technical and non-technical audiences.
Compensation:
$60.15 per hour.
#36552720
Data Engineer
Requirements engineer job in Tempe, AZ
About the Role
We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.
What We're Looking For
8+ years designing and delivering scalable data pipelines in modern data platforms
Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
Ability to lead cross-functional initiatives in matrixed teams
Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
Hands-on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
Use Apache Airflow and similar tools for workflow automation and orchestration
Work with financial or regulated datasets while ensuring strong compliance and governance
Drive best practices in data quality, lineage, cataloging, and metadata management
Primary Technical Skills
Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
Design efficient Delta Lake models for reliability and performance
Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
Automate ingestion and workflows using Python and REST APIs
Support downstream analytics for BI, data science, and application workloads
Write optimized SQL/T-SQL queries, stored procedures, and curated datasets
Automate DevOps workflows, testing pipelines, and workspace configurations
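The pipeline duties listed above reduce, at their core, to extract-transform-load steps. Below is a minimal, illustrative sketch of that pattern using only the Python standard library; the account data, table name, and column names are invented for the example, and sqlite3 merely stands in for a warehouse such as Databricks or Snowflake:

```python
import csv
import io
import sqlite3

# Toy "source extract": in a real pipeline this would come from an API,
# file drop, or streaming source rather than an inline string.
RAW = """account_id,txn_date,amount
A1,2024-01-05,120.50
A1,2024-01-09,-30.00
A2,2024-01-07,75.25
"""

def extract(raw: str):
    """Parse the raw CSV into dictionaries (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Cast types and aggregate per account (the 'transform' step)."""
    totals = {}
    for row in rows:
        totals[row["account_id"]] = totals.get(row["account_id"], 0.0) + float(row["amount"])
    return totals

def load(totals, conn):
    """Write curated results to a warehouse table (the 'load' step)."""
    conn.execute("CREATE TABLE account_totals (account_id TEXT PRIMARY KEY, total REAL)")
    conn.executemany("INSERT INTO account_totals VALUES (?, ?)", totals.items())
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT total FROM account_totals WHERE account_id = 'A1'").fetchone()[0])
```

In a real Lakehouse pipeline the same three steps would be expressed with PySpark DataFrames, Delta Lake writes, or warehouse SQL rather than sqlite3; the structure is what carries over.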
Additional Skills
Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
CI/CD: Azure DevOps
Orchestration: Apache Airflow (plus)
Streaming: Delta Live Tables
MDM: Profisee (nice-to-have)
Databases: SQL Server, Cosmos DB
Soft Skills
Strong analytical and problem-solving mindset
Excellent communication and cross-team collaboration
Detail-oriented with a high sense of ownership and accountability
Senior Data Engineer
Requirements engineer job in Las Vegas, NV
We are looking for a Senior Developer/Analyst to support high-impact initiatives in the financial services domain. The ideal candidate will leverage strong data and business analysis skills to deliver actionable insights and build robust data solutions in a fast-paced Agile environment.
Responsibilities:
• Design and maintain process flows, data models and analytic solutions using SAS, SQL, and Python/PySpark.
• Deliver high-quality dashboards and visualizations using Tableau or similar tools.
• Collaborate with cross-functional stakeholders to understand business needs and translate them into analytical solutions.
• Follow Agile and data development best practices for sprint-based delivery.
• Prepare documentation and ensure knowledge transfer for ongoing support.
Required Qualifications:
• Bachelor's degree in a quantitative or technology-related discipline.
• 5+ years of experience in data analysis and process modeling within the financial services sector.
• Hands-on experience with SAS, SQL, and working knowledge of Python/PySpark.
• Strong dashboarding and data storytelling skills using Tableau, Power BI, or similar tools.
• Excellent communication skills and ability to work independently as a contractor.
Preferred:
• Experience in credit card processing and familiarity with risk/fraud data is a strong plus.
• Experience with Snowflake or similar modern data platforms is a plus.
ETL Data Engineer
Requirements engineer job in Salt Lake City, UT
Role: ETL Data Engineer
Employment Type: Full-time
Experience: 8+ Years
We are seeking an ETL Data Engineer with strong experience in building and supporting large-scale data pipelines. The role involves designing, developing, and optimizing ETL processes using tools like DataStage, SQL, Python, and Spark. You will work closely with architects, engineers, and business teams to create efficient data solutions. The job includes troubleshooting issues, improving performance, and handling data migration and transformation tasks. You will also support Test, QA, and Production environments while ensuring smooth deployments. Strong skills in databases, scripting, and version control are essential for this position.
Responsibilities
Collaborate with architects, engineers, analysts, and business teams to develop and deliver enterprise-level data platforms that support data-driven solutions.
Apply strong analytical, organizational, and problem-solving skills to design and implement technical solutions based on business requirements.
Develop, test, and optimize software components for data platforms, improving performance and efficiency.
Troubleshoot technical issues, identify root causes, and recommend effective solutions.
Work closely with data operations teams to deploy updates into production environments.
Provide support across Test, QA, and Production environments and perform additional tasks as needed.
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or a related discipline.
Strong experience in Data Warehousing, Operational Data Stores, ETL tools, and data management technologies.
8+ years of hands-on expertise in ETL (IBM DataStage), SQL, UNIX/Linux scripting, and Big Data distributed systems.
4+ years of experience with Teradata (Vantage), SQL Server, Greenplum, Hive, and delimited text data sources.
3+ years of experience with Python programming, orchestration tools, and ETL pipeline development using Python/Pandas.
Deep understanding of data migration, data analysis, data transformation, large-volume ETL processing, database modeling, and SQL performance tuning.
Experience creating DDL scripts, stored procedures, and database functions.
Practical experience with Git for version control and release processes.
Familiarity with Spark framework, including RDDs using Python or Scala.
DevOps Engineer
Requirements engineer job in Scottsdale, AZ
Overall Purpose
This position designs, develops, tests, and maintains infrastructure as code, CI/CD patterns, configuration management, and containerized product applications, providing technical leadership and hands-on support for internal systems.
Essential Functions
Design, develop, document, test and debug new and existing Configuration management patterns and infrastructure as code.
Design, create and maintain comprehensive policies and technical documentation of best practices for all implemented system configurations ensuring efficient planning and execution.
Perform requirements analysis and design a model for Infrastructure and application flow.
Conduct design meetings and analyze user needs to determine technical requirements.
Write technical specifications (based on conceptual design and business requirements).
Identify and evaluate new technologies for implementation. Recommend and implement changes to existing hardware and operating system infrastructure including patches, users, file systems and kernel parameters. Seek out and implement new technologies to continually simplify the environment while improving security and performance.
Analyze results, failures, and bugs to determine the causes of errors, and tune the automation pipeline to fix the problems and achieve the desired outcome.
Diagnose and resolve hardware related server problems (failed disks, network cards, CPU, memory, etc.) and act as escalation point to troubleshoot hardware and operating system problems and suggest possible performance tuning.
Consult with end user to prototype, refine, test, and debug programs to meet needs.
Proactively monitor the health of environments, fix any issues, and improve environment performance.
Coach and mentor staff on team policies, procedures, use cases and best patterns.
Support and maintain products and add new features.
Participate in and follow change management processes for change implementation.
Support the company's commitment to risk management and protecting the integrity and confidentiality of systems and data.
For Kubernetes Focus Only:
Design/Implement container orchestration platform in a hybrid cloud environment.
Ensure that container orchestration platform is regularly maintained and released to production without downtime.
For Cloud Focus Only:
Lead infrastructure-as-code projects, designing APIs and building tools to be used by engineering teams for reliable and repeatable cloud deployments
Implement abstractions to simplify the complexities of cloud providers (AWS), open-source technologies (Kubernetes), and internal EWS infrastructure
Obsess about the usability of the systems you build, allowing engineers to have an intuitive and predictable experience working with infrastructure at scale
Troubleshoot complex infrastructure problems, often spanning multiple layers of the stack and requiring work with multiple teams
Experience designing cloud infrastructure for robustness, security, and observability
Expertise in infrastructure-as-code tools such as Terraform, Ansible, and continuous deployment pipelines
Expertise in AWS foundations, including compute, networking, storage, observability, and security. Experience automating AWS services using Terraform and Ansible. Experience with highly scalable distributed datacenter or cloud computing systems (AWS, Azure, VM)
Strong knowledge of AWS services (EC2, IAM, ELB, Route53, S3, Lambda, Cloud Formation, DynamoDB)
Experience architecting Kubernetes based systems
Container orchestration - Kubernetes, TKGi, EKS, ECS
Proficient with using and debugging networks, DNS, HTTP, TLS, load-balancing, build systems, Linux, and Docker
Experience in building CI/CD pipelines
Experience building and scaling Workflow pipelines
Experience in data center operations, monitoring, alerting and notifications
Minimum Qualifications
Education and/or experience typically obtained through completion of a Bachelor's Degree in Computer Science or equivalent certifications.
Minimum of 7 years of related experience.
Demonstrated prior DevOps, software engineering or related experience.
Ability to work on multiple projects and general understanding of software environments and network topologies
Able to facilitate technical design sessions
Minimum of 3 years of experience in modern application design patterns
Solid understanding of an iterative software development process
Ability to use Linux administration command line programs and create/edit scripts
Knowledge of one or more of the tools - Chef, Ansible, Puppet.
Knowledge of one or more IaC, containerization, and orchestration tools (Terraform, Docker & Kubernetes)
Experienced with security and encryption protocols.
Knowledge of one of the cloud infrastructure providers - AWS, GCP and Azure
Must be able to work different schedules as part of an on-call rotation.
Background check and drug screen required.
Preferred
Certification in Terraform, AWS, and Kubernetes
AWS, Azure (and/ or other cloud-based) certification(s) strongly preferred
Interviews: 3 virtual interviews then 1 final onsite.
Start Date: Jan/early Feb.
Data Engineer
Requirements engineer job in Scottsdale, AZ
📍 Scottsdale, AZ (Hybrid, 3 days a week in office)
About the Opportunity
A leading renewable energy organization is seeking a Data Engineer to join its high-growth Performance Engineering team. This is an exceptional role for someone who wants to work at the intersection of data engineering, analytics, and clean energy, supporting a portfolio of utility-scale solar, energy-storage, and solar-plus-storage assets across the U.S.
If you thrive in an environment focused on teamwork, continuous improvement, and driving real operational impact, this role offers both challenge and meaningful purpose.
What You'll Do
As an Associate Data Engineer, you'll help optimize the performance of a large fleet of renewable energy assets by designing and maintaining modern data architectures. Your work will turn vast amounts of operational data into actionable insights for engineering and asset management teams.
Key responsibilities include:
Build and maintain scalable data pipelines using Snowflake or Databricks
Integrate large, diverse datasets from performance systems, CMMS platforms, and drone inspection imagery
Analyze asset performance data to detect underperformance, quantify energy losses, and support predictive maintenance modeling
Manage the full data lifecycle from ingestion (S3) to processing, analysis, and visualization
Evaluate and improve systems, processes, and workflows across engineering teams
Develop metadata documentation and support strong data governance practices
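As a miniature illustration of the underperformance-detection work described above, the sketch below flags assets whose metered output falls below an assumed performance-ratio threshold and quantifies the energy loss. The threshold, field names, and `flag_underperformance` helper are illustrative assumptions, not the employer's actual method:

```python
# Illustrative sketch: flag solar assets whose actual energy output falls
# below an assumed performance-ratio threshold and quantify the loss.
from dataclasses import dataclass

@dataclass
class AssetReading:
    asset_id: str
    expected_mwh: float  # modeled energy for the period
    actual_mwh: float    # metered energy for the period

def flag_underperformance(readings, threshold=0.95):
    """Return (asset_id, performance_ratio, loss_mwh) tuples for
    assets below the assumed performance-ratio threshold."""
    flagged = []
    for r in readings:
        ratio = r.actual_mwh / r.expected_mwh
        if ratio < threshold:
            flagged.append((r.asset_id, round(ratio, 3),
                            round(r.expected_mwh - r.actual_mwh, 2)))
    return flagged

readings = [
    AssetReading("site-a", 100.0, 98.0),  # ratio 0.98 -> within threshold
    AssetReading("site-b", 100.0, 90.0),  # ratio 0.90 -> flagged
]
print(flag_underperformance(readings))  # → [('site-b', 0.9, 10.0)]
```

In practice this logic would run over Snowflake or Databricks tables rather than in-memory records, but the core comparison is the same.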
What We're Looking For
Bachelor's degree in Data Science, Computer Science, Engineering, Statistics, or a related quantitative field
3-4 years of experience in a data-focused role
Strong hands-on expertise with Snowflake or Databricks, plus cloud experience with AWS (S3, EC2, Glue, SageMaker)
Experience with Apache Spark for distributed computing (highly preferred)
Expert-level SQL and strong Python skills (Pandas, NumPy)
Experience in statistical modeling, ML, and mathematical modeling
Experience working with aerial or geospatial imagery (OpenCV, Scikit-image, GeoPandas, PyTorch, TensorFlow)
Ability to collaborate effectively, take ownership, and drive process improvements
Strong communication skills and the ability to align technical work with business goals
Why You'll Love Working Here
This organization invests heavily in the well-being, growth, and success of its team members. You can expect:
Flexible, hybrid work environment
Generous PTO
401(k) with 6% company match
Tuition reimbursement
Paid parental & caregiver leave
Inspiring, mission-driven culture
Strong opportunities for professional growth and development
Data Engineer
Requirements engineer job in Phoenix, AZ
Hybrid - 2-3 days on site
Phoenix, AZ
We're looking for a Data Engineer to help build the cloud-native data pipelines that power critical insights across our organization. You'll work with modern technologies, solve real-world data challenges, and support analytics and reporting systems that drive smarter decision-making in the transportation space.
What You'll Do
Build and maintain data pipelines using Databricks, Azure Data Factory, and Microsoft Fabric
Implement incremental and real-time ingestion using medallion architecture
Develop and optimize complex SQL and Python transformations
Support legacy platforms (SSIS, SQL Server) while contributing to modernization efforts
Troubleshoot data quality and integration issues
Participate in proof-of-concepts and recommend technical solutions
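The incremental-ingestion responsibility above can be sketched in plain Python; watermark-based loading is the core idea behind a medallion bronze layer. The `updated_at` field and `incremental_load` helper are assumptions for illustration, not this organization's actual pipeline:

```python
# Illustrative sketch of incremental (watermark-based) ingestion into a
# bronze layer: append only rows newer than the last processed watermark.
def incremental_load(source_rows, bronze, watermark):
    """Append rows with updated_at past the watermark; return the new watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    bronze.extend(new_rows)
    return max((r["updated_at"] for r in new_rows), default=watermark)

bronze = []
batch1 = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
wm = incremental_load(batch1, bronze, watermark=0)   # both rows loaded
batch2 = [{"id": 2, "updated_at": 20}, {"id": 3, "updated_at": 30}]
wm = incremental_load(batch2, bronze, watermark=wm)  # only id 3 is new
print(len(bronze), wm)  # → 3 30
```

In Databricks or Azure Data Factory the same pattern is expressed with change-data-feed or trigger-based loads, but the watermark comparison is the common thread.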
What You Bring
5+ years designing and building data solutions
Strong SQL and Python skills
Experience with ETL pipelines and Data Lake architecture
Ability to collaborate and adapt in a fast-moving environment
Preferred: Azure services, cloud ETL tools, Power BI/Tableau, event-driven systems, NoSQL databases
Bonus: Experience with Data Science or Machine Learning
Benefits
Medical, dental, and vision from day one · PTO & holidays · 401(k) with match · Lifestyle account · Tuition reimbursement · Voluntary benefits · Employee Assistance Program · Well-being & culture programs · Professional development support
Senior Data Engineer
Requirements engineer job in Phoenix, AZ
As the Senior Data Engineer, you will help build, maintain, and optimize the data infrastructure that powers our decision-making and product development. You'll work with modern tools like Snowflake, Metabase, Mage, Airbyte, and MySQL to enable data visualization, data mining, and efficient access to high-quality insights across our GradGuard ecosystem. This is a key opportunity for someone with around five years of experience who's passionate about turning data into impact.
This position is based in Phoenix, AZ.
Challenges You'll Focus On:
Design, build, and maintain scalable data pipelines and architectures using Mage (or similar orchestrators) and Airbyte for ELT processes.
Ensure efficient and reliable data ingestion, transformation, and loading into Snowflake.
Perform data mining and exploratory data analysis to uncover trends, patterns, and business opportunities.
Ensure the quality, consistency, and reliability of the underlying data.
Promote best practices and quality standards for the data engineering team.
Partner with Data Science, Business Intelligence, and Product teams to define data needs and ensure data infrastructure supports strategic initiatives.
Optimize SQL queries and data models for performance and scalability.
Contribute to improving data standards, documentation, and governance across all data systems.
Help ensure compliance with data security, privacy, and regulatory requirements in the insurance domain.
The person we're looking for has a proven, successful background with:
5+ years of experience as a Data Engineer, Data Analyst, or similar role.
Experience leading and mentoring a data engineering team.
Proficient with SQL for data transformation, querying, and performance optimization.
Proficiency with Python or other languages like Java, JavaScript, and/or Scala.
Proficiency in connecting with APIs for data loading.
Hands-on experience with:
Snowflake (Data Warehousing). Beyond basic SQL, you must understand Snowflake's unique architecture and features, including data warehouse design, performance optimization, and data loading. Knowledge of advanced Snowflake features is a nice-to-have.
Mage (Data Pipeline Orchestration), or experience with other orchestration tools.
Airbyte (ELT Processes) or experience with ELT tools.
Metabase (Data Visualization & Dashboards), or familiarity with other BI/visualization tools.
Comfort with data modeling, ETL/ELT best practices, and cloud-based data architectures (preferably AWS).
Excellent problem-solving skills, attention to detail, and ability to work cross-functionally.
Prior experience working in an insurance, fintech, or highly regulated industry is a plus.
Beyond a fulfilling and challenging role, you'll get:
A competitive salary.
Opportunity to enroll in comprehensive health, dental, and vision insurance. We pay 100% of employee premiums and 75% of your family's premiums.
A lifestyle spending account where you can receive up to $400 in reimbursements for wellness activities.
401(k) retirement plan with company matching up to 5% of compensation deferred. Employee and employer contributions are 100% vested.
Student loan and education assistance, after one year of employment at GradGuard. We're learners and embrace education.
Unlimited PTO after completing the 30-day introductory period. Plus, 12 paid holidays and paid parental leave.
About GradGuard
As the leader in college tuition and renters insurance, GradGuard serves more than 1.7 million students across 1,900+ institutions.
Our national technology platform embeds innovative insurance protections into the enrollment processes of over 650 institutional partners, empowering schools to increase college completion rates and reduce the financial impact of preventable losses.
GradGuard supports College Life Protected, a social purpose entity that promotes research, professional development, and best practices that strengthen campus communities, families, society and the economic competitiveness of our nation.
Recognized as one of the Top 100 Financial Technology Companies of 2024 by The Financial Technology Report, a RISE Internship Award winner, and a Phoenix Business Journal Best Places to Work finalist, GradGuard remains committed to innovation, excellence, and supporting students and families.
Hear from our students, families, and partners: **********************************
Those who succeed at our company:
Make it happen by turning challenges into opportunities.
Do the right thing even when it's difficult.
Demand excellence from yourself and others.
Learn for life and stay curious.
Enjoy the journey, not just the results.
The above just so happen to be our core values. These values are at the heart of our mission to educate and protect students from the risks of college life, empowering us to create meaningful experiences and make a positive impact.
GradGuard is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Data Engineer
Requirements engineer job in Phoenix, AZ
Hi,
We have a job opportunity for a Data Engineer / Analyst role.
Data Analyst / Data Engineer
Expectations: Our project is data analysis heavy, and we are looking for someone who can grasp business functionality and translate that into working technical solutions.
Job location: Phoenix, Arizona.
Type: Hybrid model (3 days a week in office)
Job Description: Data Analyst / Data Engineer (6+ years of relevant experience with the required skill set)
Summary:
We are seeking a Data Analyst Engineer with a minimum of 6 years of experience in data engineering, data analysis, and data design. The ideal candidate will have strong hands-on expertise in Python and relational databases such as Postgres, SQL Server, or MySQL, along with a good understanding of data modeling theory and normalization forms.
Required Skills:
6+ years of experience in data engineering, data analysis, and data design
A demonstrated approach to data analysis in previous/current roles, including the methods and techniques used to extract insights from large datasets
Good proficiency in Python
Formal training or education in data modeling (candidates should provide details of the course, program, or certification completed, including when it was received)
Strong experience with relational databases: Postgres, SQL Server, or MySQL.
A clear understanding of the factors essential to a project's success, and a plan to leverage your skills and expertise to ensure the project meets its objectives
Expertise in writing complex SQL queries and optimizing database performance
Solid understanding of data modeling theory and normalization forms.
Good communicator with the ability to articulate business problems for technical solutions.
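The data modeling and SQL skills above can be illustrated in miniature with a third-normal-form decomposition: customer attributes are stored once, keyed by the customer, rather than repeated on every order row. The table and column names are assumptions chosen for illustration:

```python
# Illustrative sketch of normalized (3NF) design: the customer's city
# depends only on the customer key, so it lives in its own table, and a
# join reassembles the denormalized view without storing it redundantly.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount REAL
);
INSERT INTO customers VALUES (1, 'Acme', 'Phoenix'), (2, 'Globex', 'Tempe');
INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 75.0), (12, 2, 40.0);
""")
cur.execute("""
SELECT c.city, SUM(o.amount)
FROM orders o JOIN customers c ON o.customer_id = c.customer_id
GROUP BY c.city ORDER BY c.city
""")
print(cur.fetchall())  # → [('Phoenix', 325.0), ('Tempe', 40.0)]
```

The same decomposition reasoning applies on Postgres, SQL Server, or MySQL; sqlite3 is used here only so the sketch is self-contained.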
Key Responsibilities:
Analyze complex datasets to derive actionable insights and support business decisions.
Model data solutions for high performance and reliability.
Work extensively with Python for data processing and automation.
Develop and optimize SQL queries for Postgres, SQL Server, or MySQL databases.
Ensure data integrity, security, and compliance across all data solutions.
Collaborate with cross-functional teams to understand data requirements and deliver solutions.
Communicate effectively with stakeholders and articulate business problems to drive technical solutions.
Secondary Skills:
Experience deploying applications in Kubernetes.
API development using FastAPI or Django.
Familiarity with containerization (Docker) and CI/CD tools.
Regards,
Suhas Gharge
DevOps Engineer
Requirements engineer job in Chandler, AZ
Build Tools: Proficiency in build automation tools such as Make, Maven, Gradle, or Ant.
Continuous Integration/Continuous Deployment (CI/CD): Experience with CI/CD tools like Jenkins or GitLab CI.
Version Control Systems: Strong knowledge of version control systems, particularly Git, including branching strategies and workflows.
Scripting Languages: Proficiency in scripting languages such as Bash, Python, or Ruby for automating build processes.
Containerization: Familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes.
Static and Dynamic Analysis Tools: Understanding of tools for code quality and security analysis (e.g., SonarQube, Valgrind).
Programming Languages: Knowledge of programming languages relevant to the projects (e.g., C/C++, Python).
Preferred Qualifications
Experience in managing large data sets.
Parallel Computing: Familiarity with parallel programming models like MPI (Message Passing Interface), OpenMP, and CUDA for GPU-based computing.
Performance Optimization: Skills in profiling and optimizing code for better performance on HPC systems (e.g., using tools like Gprof, Valgrind, or Intel VTune).
Storage Architecture Knowledge: Understanding file systems such as Lustre, GPFS, or HDFS and strategies for efficient data storage and retrieval in HPC environments.
Distributed Computing Tools: Familiarity with frameworks such as Hadoop, Spark, or Dask for handling distributed datasets.
Education and Experience
A bachelor's degree in Computer Science, Software Engineering, or a related field.
Experience: Proven experience in software build management, DevOps, or continuous integration roles (typically 3+ years).
Java Software Engineer
Requirements engineer job in Phoenix, AZ
Job Title : Java Developer
Duration : 12 Months
Must Have Skills:
Good knowledge of Java
Strong communication skills
Ability to work independently
Detailed Job Description:
Java/J2EE full-stack developer with financial or banking domain experience.
Should be fluent in communication and able to work on their own without hand-holding.
Should be completely hands-on.