
Requirements engineer jobs in Wylie, TX

- 1,288 jobs
  • SailPoint Engineer

    Pyramid Consulting, Inc. (4.1 company rating)

    Requirements engineer job in Roanoke, TX

    Immediate need for a talented SailPoint Engineer. This is a 12+ month contract opportunity with long-term potential, located in Westlake, TX (Hybrid). Please review the job description below and contact me ASAP if you are interested. Job ID: 25-95045. Pay Range: $65 - $70/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
    Key Requirements and Technology Experience:
    Must-have skills:
    - SailPoint IdentityIQ platform: Lifecycle Manager, Certifications, Roles, Joiner/Mover/Leaver events
    - Connectors: Active Directory, JDBC, SCIM 2.0, Azure Active Directory
    - Programming: Java, BeanShell/JavaScript, Angular, SQL
    - XML, JSON, REST, SQL, and web/application servers like Tomcat
    - APIs (REST, SCIM) leveraging Java-based development
    - Building, installing, and testing IIQ using Services Standard Build/Deployment (SSB/SSD)
    You have:
    - B.S. in Computer Science (preferred), Engineering/Mathematics, or comparable
    - 8+ years of experience building and developing with the SailPoint IdentityIQ product
    - Expertise onboarding applications, Lifecycle events, Certifications, and Roles in SailPoint IdentityIQ
    - Expertise onboarding applications with connectors like Active Directory, JDBC, SCIM 2.0, and Azure Active Directory
    - Hands-on experience with automation and pipeline implementation (testing, Continuous Integration/Continuous Delivery pipelines)
    - Experience installing, patching, and upgrading IIQ
    - Experience analyzing and troubleshooting issues in various components of IIQ
    - Experience with Application Lifecycle Management tools such as Git, Maven, Jenkins, Artifactory, Veracode, Sonar
    - Experience working in an agile environment (Scrum and Kanban)
    - Strong engineering skills and experience developing maintainable, scalable multi-tiered applications
    - A maniacal focus on automation, always looking to reduce waste and improve efficiency
    - Enjoyment of communicating with and learning the business behind the application
    Our client is a leader in the banking industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
    Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
    $65-70 hourly 1d ago
  • AI/ML Engineer - Full Time

    Maveric Systems Limited

    Requirements engineer job in Irving, TX

    Job Title: AI/ML Engineer. Job Type: Full Time.
    Key Responsibilities:
    - Model Development: Design, build, and optimize machine learning models for predictive analytics, classification, recommendation systems, and NLP.
    - Data Processing: Collect, clean, and preprocess large datasets from various sources for training and evaluation.
    - Deployment: Implement and deploy ML models into production environments using frameworks like TensorFlow, PyTorch, or Scikit-learn.
    - Performance Monitoring: Continuously monitor and improve model accuracy, efficiency, and scalability.
    - Collaboration: Work closely with data engineers, software developers, and product teams to integrate AI solutions into applications.
    - Research & Innovation: Stay updated with the latest advancements in AI/ML and apply cutting-edge techniques to business challenges.
    - Documentation: Maintain clear documentation of models, processes, and workflows.
    Required Skills & Qualifications:
    - Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
    - Strong proficiency in Python, R, or Java.
    - Hands-on experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn).
    - Knowledge of data structures, algorithms, and software engineering principles.
    - Experience with cloud platforms (AWS, Azure, GCP) and MLOps tools.
    - Familiarity with big data technologies (Spark, Hadoop) is a plus.
    - Excellent problem-solving and analytical skills.
    Preferred Qualifications:
    - Experience in Natural Language Processing (NLP), Computer Vision, or Deep Learning.
    - Understanding of model interpretability and ethical AI practices.
    - Prior experience deploying models in production environments.
    $70k-96k yearly est. 1d ago
  • AI/ML Engineer

    Apexon

    Requirements engineer job in Dallas, TX

    About the Role: Apexon is seeking an experienced AI/ML Engineer with strong expertise in LLM development, MLOps, and building scalable GenAI solutions. You will design, build, and operationalize AI/ML systems that support enterprise clients across healthcare, BFSI, retail, and digital transformation engagements. The ideal candidate has hands-on experience building end-to-end machine learning pipelines, optimizing large language model workflows, and deploying secure ML systems in production environments.
    Responsibilities:
    LLM & AI Solution Development
    - Build, fine-tune, evaluate, and optimize Large Language Models (LLMs) for client-specific use cases such as document intelligence, chatbot automation, code generation, and workflow orchestration.
    - Develop RAG (Retrieval-Augmented Generation) pipelines using enterprise knowledge bases.
    - Implement prompt engineering, guardrails, hallucination reduction strategies, and safety frameworks.
    - Work with transformer-based architectures (GPT, LLaMA, Mistral, Falcon, etc.) and develop optimized model variants for low-latency and cost-efficient inference.
    Machine Learning Engineering
    - Develop scalable ML systems including feature pipelines, training jobs, and batch/real-time inference services.
    - Build and automate training, validation, and monitoring workflows for predictive and GenAI models.
    - Perform offline evaluation, A/B testing, performance benchmarking, and business KPI tracking.
    MLOps & Platform Engineering
    - Build and maintain end-to-end MLOps pipelines using AWS SageMaker, Databricks, MLflow, Kubernetes, Docker, Terraform, and Airflow.
    - Manage CI/CD pipelines for model deployment, versioning, reproducibility, and governance.
    - Implement enterprise-grade model monitoring (data drift, performance, cost, safety).
    - Maintain infrastructure for vector stores, embeddings pipelines, feature stores, and inference endpoints.
    Data Engineering & Infrastructure
    - Build data pipelines for structured and unstructured data using Snowflake, S3, Kafka, Delta Lake, and Spark (PySpark).
    - Work on data ingestion, transformation, quality checks, cataloging, and secure storage.
    - Ensure all systems adhere to Apexon and client-specific security, IAM, and compliance standards.
    Cross-Functional Collaboration
    - Partner with product managers, data engineers, cloud architects, and QA teams.
    - Translate business requirements into scalable AI/ML solutions.
    - Ensure model explainability, governance documentation, and compliance adherence.
    Basic Qualifications:
    - Bachelor's or Master's degree in Computer Science, Engineering, AI/ML, Data Science, or a related field.
    - 4+ years of experience in AI/ML engineering, including 1+ years working with LLMs/GenAI.
    - Strong experience with Python, Transformers, PyTorch/TensorFlow, and NLP frameworks.
    - Hands-on expertise with MLOps platforms: SageMaker, MLflow, Databricks, Kubernetes, Docker.
    - Strong SQL and data engineering experience (Snowflake, S3, Spark, Kafka).
    Preferred Qualifications:
    - Experience implementing Generative AI solutions for enterprise clients.
    - Expertise in distributed training, quantization, optimization, and GPU acceleration.
    - Experience with vector databases (Pinecone, Weaviate, FAISS), RAG frameworks (LangChain, LlamaIndex), and monitoring tools (Prometheus, Grafana, CloudWatch).
    - Understanding of model governance, fairness evaluation, and client compliance frameworks.
    $70k-96k yearly est. 4d ago
  • Google Cloud ML Engineer - Vertex AI

    HR Pundits Inc.

    Requirements engineer job in Dallas, TX

    Day 1 onsite in Dallas, TX. Contract.
    Required Qualifications:
    - 6+ years in software development (4+ in NLP/NLU for chat).
    - Advanced Python skills; experience with ML/NLP libraries (Hugging Face, TensorFlow, PyTorch).
    - Proven success building conversational agents with Vertex AI and Dialogflow CX/ES.
    - Proficiency in GCP services and cloud-native architectures.
    - Solid MLOps understanding (Docker, Kubernetes, CI/CD, Git).
    - Experience with Agent Assist functionality is a plus.
    $70k-96k yearly est. 1d ago
  • CyberArk Engineer

    Talent 360 Solutions

    Requirements engineer job in Frisco, TX

    You will be responsible for the delivery and buildout of a Privileged Access ecosystem and will apply comprehensive knowledge of privileged access security controls to the completion of complex assignments. You will identify and recommend changes in procedures, processes, and scope of delivery. This position reports to the Director of Privileged Access Engineering.
    What you will do:
    - Troubleshoot complex heterogeneous environments related to privileged access technologies through server log and network traffic analysis, leaning on experience with troubleshooting and analysis techniques and tools.
    - Understand the taxonomy of privileges on named or shared privileged accounts.
    - Incorporate cybersecurity best practices for technology governance over privileged account lifecycles.
    - Develop PAM (CyberArk) connection components and plugins as needed, utilizing various scripting tools (PowerShell, Python) and REST APIs.
    - Develop regular reporting and be accountable for deliverables.
    - Perform disaster resiliency tests and discovery audits, and present findings to management to ensure the security and integrity of the systems.
    What you will need to have:
    - 8+ years' experience in IT.
    - 5+ years' experience in cybersecurity.
    - 3+ years' experience in implementation, integration, and operations of privileged access technologies (CyberArk and all its components).
    - 3+ years' experience in systems and network administration (Windows, Unix/Linux, network devices) and good knowledge of PKI, authentication tools and protocols (such as SAML, RADIUS, PING), and MFA.
    - 2+ years' experience with privileged access controls in Unix and Windows environments.
    - 2+ years' experience with the broader IAM ecosystem of directories, identity management, and access management controls.
    - 1+ years' experience in a senior technical role (with a deep understanding of the product) with IAM/PAM products such as CyberArk and its components.
    - Bachelor's degree in computer science or a relevant field, or an equivalent combination of education, work, and/or military experience.
    What would be great to have:
    - 2+ years' experience onboarding and managing privileged credentials across Windows, Linux/Unix, databases, networking devices, and other platforms.
    - 2+ years' experience in development/scripting (shell, PowerShell, Python), utilizing REST API methods and other current tools, including AI, to assist in automation activities such as provisioning of vault components and accounts and implementing access controls.
    - 1+ years' experience developing technical solutions related to PAM and presenting them to management.
    - 1+ years' experience interfacing with Corporate Audit and External Audit functions for regulatory compliance.
    - Cybersecurity certifications such as CISA, CISSP, and CyberArk certifications (CDE, Sentry, Defender).
    $69k-96k yearly est. 1d ago
  • SRE Engineer with Azure AI

    Tekgence Inc.

    Requirements engineer job in Plano, TX

    Overall experience: 8 to 10+ years performing production support for mission-critical, high-performance applications (Customer Care, Retail, and eCommerce customer/agent-facing application experience preferred).
    - Experience using Docker, Kubernetes, and Microsoft Azure Cloud; Unix, networking, and troubleshooting knowledge.
    - Experience with application and infrastructure performance monitoring tools like Dynatrace.
    - Experience with application log analytics tools like Elastic, and with visualization tools like Kibana and Grafana; EFK stack experience preferred.
    - Creation of dashboards on Dynatrace, ELK, and Grafana.
    - Debugging Java logs and microservices logs.
    - Experience with relational and NoSQL databases like Oracle and Cassandra.
    - Experience with Site Reliability Engineering preferred.
    Generative AI and workflow automation skills:
    - Demonstrated experience leveraging AI-driven tools for automating end-to-end operational workflows.
    - Demonstrated experience using text-generative and code-generative AI models.
    - Automation, GenAI, and agentic workflow technical skills, including the GPT-4o LLM, advanced LangGraph and LangChain, Google Dialogflow (Api.ai), Google Vertex, Databricks, Spark, and Snowflake.
    $70k-96k yearly est. 2d ago
  • Endpoint Engineer

    Divergeit

    Requirements engineer job in Plano, TX

    About Us: At DivergeIT, we're recognized for our innovative approach to IT consulting and managed services. We help organizations navigate digital transformation by delivering tailored technology strategies that drive business growth. We're seeking a highly skilled and motivated Service Engineer to join our dynamic team and further our commitment to excellence.
    Why Join DivergeIT? At DivergeIT, we offer a collaborative and innovative work environment where your expertise will be valued and your contributions will make a tangible impact. We are committed to supporting your professional growth through continuous learning opportunities and certifications.
    Position Overview: As an Endpoint Engineer, you will be responsible for the provisioning, configuration, and lifecycle management of Windows-based endpoints, including both physical workstations and Azure Virtual Desktop (AVD) environments. This position requires extensive experience with Microsoft Intune, SCCM, Patch My PC, 1E, and Autopilot, along with proficiency in creating, deploying, and maintaining Windows images. You will collaborate closely with the client's IT security team and work alongside our dedicated service desk to ensure optimal endpoint performance and security compliance.
    Key Responsibilities:
    - Design, create, and maintain standardized Windows images for physical workstations and Azure Virtual Desktop (AVD) using tools such as SCCM, Microsoft Deployment Toolkit (MDT), and Intune.
    - Manage and automate device provisioning and deployment through Windows Autopilot.
    - Administer patching, software deployment, and update compliance using Patch My PC, SCCM, and Intune.
    - Utilize 1E tools (e.g., Nomad, Tachyon) to support remote management, compliance, and endpoint performance monitoring.
    - Collaborate with the client's IT security team to implement and maintain endpoint security baselines and compliance standards.
    - Provide escalation support to our dedicated service desk and help drive resolution of complex endpoint issues.
    - Maintain up-to-date documentation for image creation, deployment processes, and system configurations.
    - Monitor and optimize AVD performance, scaling, and configuration consistency.
    - Stay informed about changes in Microsoft endpoint management tools and provide recommendations for improvements or modernization efforts.
    Required Qualifications:
    - 3+ years of experience in an endpoint management or systems engineering role, preferably in an MSP or enterprise IT environment.
    - Expertise in creating and managing Windows images for both physical endpoints and AVD environments.
    - Strong understanding of Windows 10/11 OS deployment, device provisioning, group policy, and compliance management.
    - Experience working collaboratively with IT security and help desk teams.
    - Excellent troubleshooting, documentation, and communication skills.
    - Proficiency with Microsoft Intune (Endpoint Manager), System Center Configuration Manager (SCCM), Windows Autopilot, Patch My PC, and 1E Nomad/Tachyon.
    Preferred Qualifications:
    - Microsoft certifications (e.g., MD-102, MS-101, AZ-140).
    - Experience with hybrid environments and Azure AD Join/Hybrid Join.
    - Familiarity with AVD scaling plans, FSLogix, and host pool image management.
    - Scripting knowledge (PowerShell) for automation of endpoint and imaging tasks.
    - Exposure to Zero Trust security models and conditional access policies.
    $70k-96k yearly est. 3d ago
  • HP NonStop Engineer (W2)

    PTR Global

    Requirements engineer job in Plano, TX

    Open to: Jersey City; Tampa, FL; Columbus, OH; Plano, TX. W2 hiring. Onsite from day one.
    We are seeking an experienced HP NonStop Engineer to support, enhance, and automate operations on mission-critical HPE NonStop systems. The ideal candidate will have strong hands-on experience with NonStop environments and a passion for automating manual operational and system tasks using modern scripting and configuration management tools.
    Skills: Python, Ansible, Java, HPE NonStop TACL, Prognosis
    $70k-96k yearly est. 3d ago
  • Kubernetes Engineer

    Tata Consultancy Services (4.3 company rating)

    Requirements engineer job in Plano, TX

    Hands-on experience with Kubernetes engineering and development.
    - Minimum 5-7+ years of experience working with hybrid infrastructure architectures.
    - Experience analyzing the architecture of on-prem infrastructure for applications (network, storage, processing, backup/DR, etc.).
    - Strong understanding of infrastructure capacity planning, monitoring, and upgrades; IaC automation using Terraform and Ansible; CI/CD using Jenkins/GitHub Actions.
    - Experience working with engineering teams to define best practices and processes as appropriate to support the entire infrastructure lifecycle (Plan, Build, Deploy, and Operate), such as automating lifecycle activities: self-service, orchestration and provisioning, configuration management.
    - Experience defining infrastructure direction and driving continuous improvement, including design and standardization of processes and methodologies.
    - Experience assessing feasibility, complexity, and scope of new capabilities and solutions.
    Base Salary Range: $100,000 - $110,000 per annum
    TCS Employee Benefits Summary: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
    $100k-110k yearly 3d ago
  • Backend Engineer (Distributed Systems and Kubernetes)

    Arcus Search (3.9 company rating)

    Requirements engineer job in Dallas, TX

    Software Engineer - Batch Compute (Kubernetes / HPC) | Dallas (Hybrid) | 💼 Full-time
    A leading, well-funded quantitative research and technology firm is looking for a Software Engineer to join a team building and running a large-scale, high-performance batch compute platform. You'll be working on modern Kubernetes-based infrastructure that powers complex research and ML workloads at serious scale, including contributions to a well-known open-source scheduling project used for multi-cluster batch computing.
    What you'll be doing:
    - Building and developing backend services, primarily in Go (Python, C++, C# backgrounds are fine)
    - Working on large-scale batch scheduling and distributed systems on Kubernetes
    - Operating and improving HPC-style workloads, CI/CD pipelines, and Linux-based platforms
    - Optimising data flows across systems using tools like PostgreSQL
    - Debugging and improving performance across infrastructure, networking, and software layers
    What they're looking for:
    - Strong software engineering background with an interest in Kubernetes and batch workloads
    - Experience with Kubernetes internals (controllers, operators, schedulers)
    - Exposure to HPC, job schedulers, or DAG-based workflows
    - Familiarity with cloud platforms (ideally AWS), observability tooling, and event-driven systems
    Why it's worth a look:
    - Market-leading compensation plus bonus
    - Hybrid setup from a brand-new Dallas office
    - Strong work/life balance and excellent benefits
    - Generous relocation support if needed
    - The chance to work at genuine scale on technically hard problems
    If you're interested (or know someone who might be), drop me a message and I'm happy to share more details anonymously.
    $75k-106k yearly est. 4d ago
  • Autodesk Vault Engineer - CDC5695559

    Compunnel Inc. (4.4 company rating)

    Requirements engineer job in Plano, TX

    Autodesk Vault Upgrade:
    - Lead planning, execution, and validation of the Autodesk Vault upgrade.
    - Collaborate with engineering, CAD, and IT teams to support the Vault upgrade roadmap.
    - Ensure robust data migration, backup, and recovery strategies.
    - Conduct required validation after the upgrade.
    - Document upgrade procedures and train end users as needed.
    - Coordinate with Autodesk support for unresolved upgrade-related issues.
    SCCM Package Deployment:
    - Validate that SCCM packages work as expected.
    - Investigate/resolve any installation failures after package installation.
    - Monitor deployment success rates and troubleshoot issues.
    - Track and resolve user-reported bugs or regressions introduced during the upgrade.
    - Support rollback or contingency plans if critical issues arise.
    - Manage Vault user roles, permissions, and access controls.
    - Support CAD teams with Vault-related workflows.
    - Manually install/update SCCM for some applications.
    $77k-99k yearly est. 4d ago
  • Senior Data Engineer

    Longbridge (3.6 company rating)

    Requirements engineer job in Dallas, TX

    About Us: Longbridge Securities, founded in March 2019 and headquartered in Singapore, is a next-generation online brokerage platform. Established by a team of seasoned finance professionals and technical experts from leading global firms, we are committed to advancing financial technology innovation. Our mission is to empower every investor by offering enhanced financial opportunities.
    What You'll Do: As part of our global expansion, we're seeking a Data Engineer to design and build batch/real-time data warehouses and maintain the data platforms that power trading and research for the US market. You'll work on data pipelines, APIs, storage systems, and quality monitoring to ensure reliable, scalable, and efficient data services.
    Responsibilities:
    - Design and build batch/real-time data warehouses to support US market growth
    - Develop efficient ETL pipelines to optimize data processing performance and ensure data quality/stability
    - Build a unified data middleware layer to reduce business data development costs and improve service reusability
    - Collaborate with business teams to identify core metrics and data requirements, delivering actionable data solutions
    - Discover data insights through collaboration with business owners
    - Maintain and develop enterprise data platforms for the US market
    Qualifications:
    - 7+ years of data engineering experience with a proven track record in data platform/data warehouse projects
    - Proficient in the Hadoop ecosystem (Hive, Kafka, Spark, Flink), Trino, SQL, and at least one programming language (Python/Java/Scala)
    - Solid understanding of data warehouse modeling (dimensional modeling, star/snowflake schemas) and ETL performance optimization
    - Familiarity with AWS/cloud platforms and experience with Docker and Kubernetes
    - Experience with open-source data platform development; familiar with at least one relational database (MySQL/PostgreSQL)
    - Strong cross-department collaboration skills to translate business requirements into technical solutions
    - Bachelor's degree or higher in Computer Science, Data Science, Statistics, or related fields
    - Comfortable working in a fast-moving fintech/tech startup environment
    - Proficiency in Mandarin and English at the business communication level for international team collaboration
    Bonus Point: Experience with DolphinScheduler and SeaTunnel is a plus
    $83k-116k yearly est. 6d ago
  • GCP Data Engineer

    Dexian

    Requirements engineer job in Dallas, TX

    MUST BE USC or Green Card; no vendors. GCP Data Engineer/Lead. Onsite.
    Required Qualifications:
    - 9+ years of hands-on experience with data warehousing
    - 9+ years of hands-on ETL (e.g., Informatica/DataStage) experience
    - 9+ years of Teradata hands-on experience
    - 9+ years working in a cross-functional environment
    - 3+ years of hands-on experience with BigQuery and GCP
    - 3+ years of hands-on experience with Google Cloud Platform services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage
    - 3+ years of hands-on experience building modern data pipelines on the GCP platform
    - 3+ years of experience with query optimization, data structures, transformation, metadata, dependency, and workload management
    - 3+ years of experience with SQL and NoSQL
    - 3+ years of experience in data engineering with a focus on microservices-based data solutions
    - 3+ years of containerization (Docker, Kubernetes) and CI/CD for data pipelines
    - 3+ years of experience with Python (or a comparable scripting language)
    - 3+ years of experience with big data and cloud architecture
    - 3+ years of experience with deployment/scaling of apps in containerized environments (Kubernetes)
    - Excellent oral and written communication skills; ability to interact effectively with all levels within the organization
    - Working knowledge of AGILE/SDLC methodology
    - Excellent analytical and problem-solving skills
    - Ability to interact and work effectively with technical and non-technical levels within the organization
    - Ability to drive clarity of purpose and goals during release and planning activities
    - Excellent organizational skills, including the ability to prioritize tasks efficiently with a high level of attention to detail
    Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
    $76k-103k yearly est. 2d ago
  • Senior Data Engineer (USC AND GC ONLY)

    Wise Skulls

    Requirements engineer job in Richardson, TX

    Now Hiring: Senior Data Engineer (GCP / Big Data / ETL). Duration: 6 months (possible extension).
    We're seeking an experienced Senior Data Engineer with deep expertise in data warehousing, ETL, big data, and modern GCP-based data pipelines. This role is ideal for someone who thrives in cross-functional environments and can architect, optimize, and scale enterprise-level data solutions on the cloud.
    Must-Have Skills (Non-Negotiable):
    - 9+ years in data engineering and data warehousing
    - 9+ years of hands-on ETL experience (Informatica, DataStage, etc.)
    - 9+ years working with Teradata
    - 3+ years hands-on with GCP and BigQuery
    - Experience with Dataflow, Pub/Sub, Cloud Storage, and modern GCP data pipelines
    - Strong background in query optimization, data structures, metadata, and workload management
    - Experience delivering microservices-based data solutions
    - Proficiency in big data and cloud architecture
    - 3+ years with SQL and NoSQL
    - 3+ years with Python or similar scripting languages
    - 3+ years with Docker, Kubernetes, and CI/CD for data pipelines
    - Expertise in deploying and scaling apps in containerized environments (K8s)
    - Strong communication, analytical thinking, and ability to collaborate across technical and non-technical teams
    - Familiarity with AGILE/SDLC methodologies
    Key Responsibilities:
    - Build, enhance, and optimize modern data pipelines on GCP
    - Implement scalable ETL frameworks, data structures, and workflow dependency management
    - Architect and tune BigQuery datasets, queries, and storage layers
    - Collaborate with cross-functional teams to define data requirements and support business objectives
    - Lead efforts in containerized deployments, CI/CD integrations, and performance optimization
    - Drive clarity in project goals, timelines, and deliverables during Agile planning sessions
    📩 Interested? Apply now or DM us to explore this opportunity! You can share resumes at ********************* OR call us at *****************
    $76k-103k yearly est. 3d ago
  • Azure Data Engineer Sr

    Resolve Tech Solutions (4.4 company rating)

    Requirements engineer job in Irving, TX

    - Minimum 7 years of relevant work experience in data engineering, with at least 2 years in data modeling.
    - Strong technical foundation in Python and SQL, and experience with cloud platforms (Azure).
    - Deep understanding of data engineering fundamentals, including database architecture and design; extract, transform, and load (ETL) processes; data lakes; data warehousing; and both batch and streaming technologies.
    - Experience with data orchestration tools (e.g., Airflow), data processing frameworks (e.g., Spark, Databricks), and data visualization tools (e.g., Tableau, Power BI).
    - Proven ability to lead a team of engineers, fostering a collaborative and high-performing environment.
    $76k-100k yearly est. 1d ago
  • Data Engineer

    Anblicks (4.5 company rating)

    Requirements engineer job in Dallas, TX

    Must be local to TX. Data Engineer - SQL, Python and PySpark Expert (Onsite - Dallas, TX). Data Engineer with strong proficiency in SQL, Python, and PySpark to support high-performance data pipelines and analytics initiatives. This role will focus on scalable data processing, transformation, and integration efforts that enable business insights, regulatory compliance, and operational efficiency.
    Key Responsibilities:
    • Design, develop, and optimize ETL/ELT pipelines using SQL, Python, and PySpark for large-scale data environments
    • Implement scalable data processing workflows in distributed data platforms (e.g., Hadoop, Databricks, or Spark environments)
    • Partner with business stakeholders to understand and model mortgage lifecycle data (origination, underwriting, servicing, foreclosure, etc.)
    • Create and maintain data marts, views, and reusable data components to support downstream reporting and analytics
    • Ensure data quality, consistency, security, and lineage across all stages of data processing
    • Assist in data migration and modernization efforts to cloud-based data warehouses (e.g., Snowflake, Azure Synapse, GCP BigQuery)
    • Document data flows, logic, and transformation rules
    • Troubleshoot performance and quality issues in batch and real-time pipelines
    • Support compliance-related reporting (e.g., HMDA, CFPB)
    Required Qualifications:
    • 6+ years of experience in data engineering or data development
    • Advanced expertise in SQL (joins, CTEs, optimization, partitioning, etc.)
    • Strong hands-on skills in Python for scripting, data wrangling, and automation
    • Proficient in PySpark for building distributed data pipelines and processing large volumes of structured/unstructured data
    • Experience working with mortgage banking data sets; domain knowledge is highly preferred
    • Strong understanding of data modeling (dimensional, normalized, star schema)
    • Experience with cloud-based platforms (e.g., Azure Databricks, AWS EMR, GCP Dataproc)
    • Familiarity with ETL tools and orchestration frameworks (e.g., Airflow, ADF, dbt)
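The "CTEs and star schema" qualifications above can be shown concretely. The sketch below is a hedged illustration, not this employer's code: sqlite3 stands in for the warehouse, and the mortgage-flavored table names (`dim_loan`, `fact_payment`) are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_loan (loan_id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE fact_payment (loan_id INTEGER, amount REAL);
    INSERT INTO dim_loan VALUES (1, 'servicing'), (2, 'foreclosure');
    INSERT INTO fact_payment VALUES (1, 100.0), (1, 50.0), (2, 25.0);
""")

# A CTE pre-aggregates the fact table before joining to the dimension,
# a common pattern in star-schema reporting queries.
query = """
WITH payment_totals AS (
    SELECT loan_id, SUM(amount) AS total_paid
    FROM fact_payment
    GROUP BY loan_id
)
SELECT d.status, t.total_paid
FROM dim_loan d
JOIN payment_totals t ON t.loan_id = d.loan_id
ORDER BY d.status
"""
rows = conn.execute(query).fetchall()
```

The same query shape runs unchanged on Spark SQL or Snowflake; aggregating in the CTE before the join keeps the join input small, which matters at warehouse scale.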
    $75k-102k yearly est. 2d ago
  • DevOps Engineer

    Hexaware Technologies 4.2 company rating

    Requirements engineer job in Plano, TX

    Primary Skills: Jenkins, Groovy, Python, CloudFormation, Terraform, AWS. This position seeks a motivated and enthusiastic DevOps Engineer to join the team and assist in managing and maintaining the CI/CD framework and its tools. The engineer will be responsible for implementing and maintaining scalable infrastructure, automating CI/CD pipelines, ensuring system reliability, and fostering collaboration between development and operations teams. This role requires a good understanding of DevOps practices, cloud platforms, containerization, and configuration management.
    Responsibilities:
    • Design, build, and maintain robust CI/CD pipelines to automate software delivery from code commit to deployment.
    • Design, develop, and maintain infrastructure using Terraform and automation tools to provision, manage, and scale cloud resources efficiently and consistently.
    • Configure and maintain CI/CD tools in cloud environments, ensuring optimal performance, scalability, and cost efficiency.
    • Work with containerization technologies like Docker and orchestration tools such as Kubernetes for efficient application deployment and scaling.
    • Monitor system performance, troubleshoot production issues, and implement proactive solutions to ensure high availability and reliability.
    • Collaborate closely with development, QA, and operations teams to streamline workflows, identify areas for improvement, and promote a culture of shared responsibility.
    • Implement security best practices throughout the software development lifecycle, integrating security tools and processes into CI/CD pipelines.
    • Develop and maintain comprehensive documentation for infrastructure, processes, and deployment procedures.
    • Stay up to date with emerging DevOps tools, technologies, and industry best practices, continuously evaluating and recommending improvements.
    Skills:
    • 5-6 years of experience as a DevOps Engineer or in a similar role.
    • Strong expertise in Terraform for infrastructure automation is mandatory.
    • Adequate experience designing, implementing, and maintaining CI/CD pipelines using tools like Jenkins, Harness, GitHub Actions, GitHub cloud, SonarQube, Veracode, and Qualys.
    • Awareness of CI/CD tools in the market and their capabilities.
    • Good experience with scripting languages (e.g., Python, Groovy, Java).
    • Good understanding of cloud technologies, containerization technologies (Docker), and container orchestration tools (Kubernetes).
    • Experience onboarding applications into a CI/CD framework.
    • Experience with monitoring and logging tools.
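The pipeline behavior this role automates (run stages in order, stop on the first failure, skip everything after it) is simple to state in code. A real Jenkins pipeline would be written in Groovy; the Python sketch below only illustrates the control flow, and the stage names are hypothetical.

```python
def run_pipeline(stages):
    """Run named (name, step) stages in order, stopping at the first
    failure - the basic control flow a CI/CD pipeline automates."""
    results = []
    for name, step in stages:
        try:
            step()
            results.append((name, "SUCCESS"))
        except Exception:
            results.append((name, "FAILURE"))
            break  # later stages are skipped, as in a failed build
    return results

def failing_scan():
    # Stand-in for a security scan (e.g., SonarQube/Veracode) that fails
    raise RuntimeError("scan failed")

results = run_pipeline([
    ("build", lambda: None),
    ("test", lambda: None),
    ("scan", failing_scan),
    ("deploy", lambda: None),  # never reached after the scan failure
])
```

Jenkins, Harness, and GitHub Actions all implement this same sequencing; the tools differ mainly in how stages are declared and where they execute.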
    $74k-93k yearly est. 1d ago
  • Azure Data Engineer (Databricks Certified, with Data Factory)

    Cloudingest

    Requirements engineer job in Irving, TX

    Azure Data Engineer with Data Factory, Databricks certified. Onsite 3 days a week; can be based out of Irving, TX or Houston, TX. Rate is $45 W2.
    $76k-103k yearly est. 2d ago
  • Data Engineer

    Ledelsea

    Requirements engineer job in Irving, TX

    W2 Contract-to-Hire Role with Monthly Travel to the Dallas, Texas Area. We are looking for a highly skilled and independent Data Engineer to support our analytics and data science teams, as well as external client data needs. This role involves writing and optimizing complex SQL queries, generating client-specific data extracts, and building scalable ETL pipelines using Azure Data Factory. The ideal candidate will have a strong foundation in data engineering, with a collaborative mindset and the ability to work across teams and systems.
    Duties/Responsibilities:
    • Develop and optimize complex SQL queries to support internal analytics and external client data requests.
    • Generate custom data lists and extracts based on client specifications and business rules.
    • Design, build, and maintain efficient ETL pipelines using Azure Data Factory.
    • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
    • Work with Salesforce data; familiarity with SOQL is preferred but not required.
    • Support Power BI reporting through basic data modeling and integration.
    • Assist in implementing MLOps practices for model deployment and monitoring.
    • Use Python for data manipulation, automation, and integration tasks.
    • Ensure data quality, consistency, and security across all workflows and systems.
    Required Skills/Abilities/Attributes:
    • 5+ years of experience in data engineering or a related field.
    • Strong proficiency in SQL, including query optimization and performance tuning.
    • Experience with Azure Data Factory, including git repositories and pipeline deployment.
    • Ability to translate client requirements into accurate and timely data outputs.
    • Working knowledge of Python for data-related tasks.
    • Strong problem-solving skills and ability to work independently.
    • Excellent communication and documentation skills.
    Preferred Skills/Experience:
    • Previous knowledge of building pipelines for ML models.
    • Extensive experience creating/managing stored procedures and functions in MS SQL Server.
    • 2+ years of experience in cloud architecture (Azure, AWS, etc.).
    • Experience with code management systems (Azure DevOps).
    • 2+ years of reporting design and management (Power BI preferred).
    • Ability to influence others through the articulation of ideas, concepts, benefits, etc.
    Education and Experience:
    • Bachelor's degree in a computer science field or applicable business experience.
    • Minimum 3 years of experience in a Data Engineering role.
    • Healthcare experience preferred.
    Physical Requirements:
    • Prolonged periods sitting at a desk and working on a computer.
    • Ability to lift 20 lbs.
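The "custom data lists and extracts based on client specifications" duty above amounts to filtering and projecting source data per a per-client rule set. The sketch below illustrates that idea with the Python standard library only; the column names and the spec format are invented for the example, not this employer's schema.

```python
import csv
import io

def client_extract(source_csv, spec):
    """Filter and project source rows according to a client spec dict -
    a simplified stand-in for generating client-specific extracts."""
    reader = csv.DictReader(io.StringIO(source_csv))
    return [
        {col: row[col] for col in spec["columns"]}   # project requested columns
        for row in reader
        if row[spec["filter_column"]] == spec["filter_value"]  # apply business rule
    ]

# Hypothetical source data and client specification
source = "member_id,state,plan\n1,TX,gold\n2,CA,silver\n3,TX,silver\n"
extract = client_extract(source, {
    "columns": ["member_id", "plan"],
    "filter_column": "state",
    "filter_value": "TX",
})
```

In practice the same filter/project logic would live in a parameterized SQL query or an Azure Data Factory pipeline, with the client spec supplied as pipeline parameters.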
    $76k-103k yearly est. 2d ago
  • GCP Data Engineer

    Methodhub

    Requirements engineer job in Fort Worth, TX

    Job Title: GCP Data Engineer. Employment Type: W2/CTH. Client: Direct.
    We are seeking a highly skilled Data Engineer with strong expertise in Python, SQL, and Google Cloud Platform (GCP) services. The ideal candidate will have 6-8 years of hands-on experience in building and maintaining scalable data pipelines, working with APIs, and leveraging GCP tools such as BigQuery, Cloud Composer, and Dataflow.
    Core Responsibilities:
    • Design, build, and maintain scalable data pipelines to support analytics and business operations.
    • Develop and optimize ETL processes for structured and unstructured data.
    • Work with BigQuery, Cloud Composer, and other GCP services to manage data workflows.
    • Collaborate with data analysts and business teams to ensure data availability and quality.
    • Integrate data from multiple sources using APIs and custom scripts.
    • Monitor and troubleshoot pipeline performance and reliability.
    Technical Skills:
    • Strong proficiency in Python and SQL.
    • Experience with data pipeline development and ETL frameworks.
    GCP Expertise:
    • Hands-on experience with BigQuery, Cloud Composer, and Dataflow.
    Additional Requirements:
    • Familiarity with workflow orchestration tools and cloud-based data architecture.
    • Strong problem-solving and analytical skills.
    • Excellent communication and collaboration abilities.
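One common step behind "integrate data from multiple sources using APIs" and BigQuery is converting API responses into newline-delimited JSON, one of the formats BigQuery accepts for batch loads. The sketch below shows that conversion with the standard library only (no GCP client); the record fields are hypothetical, and a real job would hand the result to a BigQuery load job.

```python
import json

def to_ndjson(api_records, required_fields):
    """Convert API response records to newline-delimited JSON, dropping
    records with missing required fields before the warehouse load."""
    lines = []
    for rec in api_records:
        if all(rec.get(f) is not None for f in required_fields):
            lines.append(json.dumps(
                {f: rec[f] for f in required_fields}, sort_keys=True))
    return "\n".join(lines)

# Hypothetical API payload: the second record is incomplete and dropped
payload = [
    {"id": 1, "event": "click", "ts": "2024-01-01"},
    {"id": 2, "event": None, "ts": "2024-01-02"},
]
ndjson = to_ndjson(payload, ["id", "event", "ts"])
```

Validating before the load keeps bad records out of the warehouse and makes pipeline failures visible at the integration boundary rather than downstream in analytics.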
    $76k-104k yearly est. 4d ago

Learn more about requirements engineer jobs

How much does a requirements engineer earn in Wylie, TX?

The average requirements engineer in Wylie, TX earns between $60,000 and $110,000 annually. This compares to the national average requirements engineer range of $62,000 to $120,000.

Average requirements engineer salary in Wylie, TX

$82,000

What are the biggest employers of Requirements Engineers in Wylie, TX?

The biggest employers of Requirements Engineers in Wylie, TX are:
  1. Tata Group
  2. Google via Artech Information Systems
  3. Ascentt
  4. PepsiCo
  5. Career Mentors, LLC
  6. Zone It Solutions
  7. TeleWorld Solutions
  8. Lorven Technologies
  9. NTT Europe Ltd
  10. Fisher Investments
Job type you want
Full Time
Part Time
Internship
Temporary