
Data engineer jobs in Carrollton, TX

- 2,263 jobs
  • Engineer IV

    Marriott International, Inc. (4.6 company rating)

    Data engineer job in Dallas, TX

    Additional Information: Job Number 25190111 | Job Category: Engineering & Facilities | Location: The Ritz-Carlton Dallas, 2121 McKinney Avenue, Dallas, Texas, United States, 75201 | Schedule: Full Time | Located Remotely? No | Type: Non-Management. Respond and attend to guest repair requests. Communicate with guests/customers to resolve maintenance issues. Perform preventive maintenance on tools and equipment, including cleaning and lubrication. Visually inspect tools, equipment, or machines. Carry equipment (e.g., tools, radio). Identify, locate, and operate all shut-off valves for equipment and utility shut-offs for buildings. Maintain maintenance inventory and requisition parts and supplies as needed. Ensure that each day's activities and any problems that occur are communicated to the other shifts using approved communication programs and standards. Display thorough knowledge of building systems, emergency response, and building documentation, including reading standard blueprints and electrical schematics concerning plumbing and HVAC. Display advanced engineering operations skills and general mechanical ability. Display professional journeyman-level expertise in at least three of the following areas, with basic skills in the remaining: air conditioning and refrigeration, electrical, plumbing, carpentry and finish skills, mechanical, general building management, pneumatic/electronic systems and controls, and/or energy conservation. Display solid knowledge and skill in the safe use of hand and power tools and other materials required to perform repair and maintenance tasks. Safely perform highly complex repairs of the physical property, electrical, plumbing and mechanical equipment, air conditioners, refrigeration and pool heaters - ensuring all methods, materials and practices meet company standards and local and national codes - with little or no supervision.
Perform routine inspections of the entire property, noting safety hazards, lack of illumination, and down equipment (such as ice makers, fans, extractors, pumps), and take immediate corrective action. Inspect and repair all mechanical equipment, including, but not limited to, appliances, HVAC, electrical and plumbing components, and the diagnosis and repair of boilers, pumps, and related components. Use the Lockout/Tagout system before performing any maintenance work. Display thorough knowledge of maintenance contracts and vendors. Display advanced knowledge of engineering computer programs related to preventative maintenance, energy management, and other systems, including devices that interact with such programs. Perform advanced troubleshooting of hotel Mechanical, Electrical, and Plumbing (MEP) systems. Display the ability to train and mentor other engineers (e.g., Engineers I, II, and III) as necessary, supervise work in progress, and act in a supervisory role in the absence of supervisors and/or management. Display the ability to perform Engineer on Duty responsibilities, including readings and rounds. Follow all company safety and security policies and procedures; report any maintenance problems, safety hazards, accidents, or injuries; complete safety training and certifications; and properly store flammable materials. Ensure uniform and personal appearances are clean and professional, maintain confidentiality of proprietary information, and protect company assets. Welcome and acknowledge all guests according to company standards, anticipate and address guests' service needs, assist individuals with disabilities, and thank guests with genuine appreciation. Adhere to quality expectations and standards. Develop and maintain positive working relationships with others, support the team to reach common goals, and listen and respond appropriately to the concerns of other employees. Speak with others using clear and professional language.
Move, lift, carry, push, pull, and place objects weighing less than or equal to 50 pounds without assistance and heavier lifting or movement tasks with assistance. Move up and down stairs, service ramps, and/or ladders. Reach overhead and below the knees, including bending, twisting, pulling, and stooping. Enter and locate work-related information using computers and/or point of sale systems. Perform other reasonable job duties as requested. PREFERRED QUALIFICATIONS Education: High school diploma or G.E.D. equivalent. Certificate in two-year technical diploma program for HVAC/refrigeration. Related Work Experience: Extensive experience and training in general maintenance (advanced repairs), electrical or refrigeration, exterior and interior surface preparation and painting. At least 2 years of hotel engineering/maintenance experience. Supervisory Experience: No supervisory experience. REQUIRED QUALIFICATIONS License or Certification: Valid Driver's License License or certification in refrigeration or electrical (earned, or currently working towards receiving) Universal Chlorofluorocarbon (CFC) certification Must meet applicable state and federal certification and/or licensing requirements. At Marriott International, we are dedicated to being an equal opportunity employer, welcoming all and providing access to opportunity. We actively foster an environment where the unique backgrounds of our associates are valued and celebrated. Our greatest strength lies in the rich blend of culture, talent, and experiences of our associates. We are committed to non-discrimination on any protected basis, including disability, veteran status, or other basis protected by applicable law. At more than 100 award-winning properties worldwide, The Ritz-Carlton Ladies and Gentlemen create experiences so exceptional that long after a guest stays with us, the experience stays with them. 
Attracting the world's top hospitality professionals who curate lifelong memories, we believe that everyone succeeds when they are empowered to be creative, thoughtful and compassionate. Every day, we set the standard for rare and special luxury service the world over and pride ourselves on delivering excellence in the care and comfort of our guests. Your role will be to ensure that the “Gold Standards” of The Ritz-Carlton are delivered graciously and thoughtfully every day. The Gold Standards are the foundation of The Ritz-Carlton and are what guide us each day to be better than the next. It is this foundation, and our belief that our culture drives success, by which The Ritz-Carlton has earned the reputation as a global brand leader in luxury hospitality. As part of our team, you will learn and exemplify the Gold Standards, such as our Employee Promise, Credo and our Service Values. And our promise to you is that we offer the chance to be proud of the work you do and who you work with. In joining The Ritz-Carlton, you join a portfolio of brands with Marriott International. Be where you can do your best work, begin your purpose, belong to an amazing global team, and become the best version of you.
    $32k-55k yearly est. 5d ago
  • Senior Data Governance Consultant (Informatica)

    Paradigm Technology (4.2 company rating)

    Data engineer job in Plano, TX

    Senior Data Governance Consultant (Informatica). About Paradigm - Intelligence Amplified: Paradigm is a strategic consulting firm that turns vision into tangible results. For over 30 years, we've helped Fortune 500 and high-growth organizations accelerate business outcomes across data, cloud, and AI. From strategy through execution, we empower clients to make smarter decisions, move faster, and maximize return on their technology investments. What sets us apart isn't just what we do, it's how we do it. Driven by a clear mission and values rooted in integrity, excellence, and collaboration, we deliver work that creates lasting impact. At Paradigm, your ideas are heard, your growth is prioritized, and your contributions make a difference.

    Summary: We are seeking a Senior Data Governance Consultant to lead and enhance data governance capabilities across a financial services organization. The Senior Data Governance Consultant will collaborate closely with business, risk, compliance, technology, and data management teams to define data standards, strengthen data controls, and drive a culture of data accountability and stewardship. The ideal candidate will have deep experience in developing and implementing data governance frameworks, data policies, and control mechanisms that ensure compliance, consistency, and trust in enterprise data assets. Hands-on experience with Informatica, including Master Data Management (MDM) or Informatica Data Management Cloud (IDMC), is preferred. This position is remote, with occasional travel to Plano, TX.

    Responsibilities:
    - Data Governance Frameworks: Design, implement, and enhance data governance frameworks aligned with regulatory expectations (e.g., BCBS 239, GDPR, CCPA, DORA) and internal control standards
    - Policy & Standards Development: Develop, maintain, and operationalize data policies, standards, and procedures that govern data quality, metadata management, data lineage, and data ownership
    - Control Design & Implementation: Define and embed data control frameworks across data lifecycle processes to ensure data integrity, accuracy, completeness, and timeliness
    - Risk & Compliance Alignment: Work with risk and compliance teams to identify data-related risks and ensure appropriate mitigation and monitoring controls are in place
    - Stakeholder Engagement: Partner with data owners, stewards, and business leaders to promote governance practices and drive adoption of governance tools and processes
    - Data Quality Management: Define and monitor data quality metrics and KPIs, establishing escalation and remediation procedures for data quality issues
    - Metadata & Lineage: Support metadata and data lineage initiatives to increase transparency and enable traceability across systems and processes
    - Reporting & Governance Committees: Prepare materials and reporting for data governance forums, risk committees, and senior management updates
    - Change Management & Training: Develop communication and training materials to embed a governance culture and ensure consistent understanding across the organization

    Required Qualifications:
    - 7+ years of experience in data governance, data management, or data risk roles within financial services (banking, insurance, or asset management preferred)
    - Strong knowledge of data policy development, data standards, and control frameworks
    - Proven experience aligning data governance initiatives with regulatory and compliance requirements
    - Familiarity with Informatica data governance and metadata tools
    - Excellent communication skills with the ability to influence senior stakeholders and translate technical concepts into business language
    - Deep understanding of data management principles (DAMA-DMBOK, DCAM, or equivalent frameworks)
    - Bachelor's or Master's degree in Information Management, Data Science, Computer Science, Business, or a related field

    Preferred Qualifications:
    - Hands-on experience with Informatica, including Master Data Management (MDM) or Informatica Data Management Cloud (IDMC)
    - Experience with data risk management or data control testing
    - Knowledge of financial regulatory frameworks (e.g., Basel, MiFID II, Solvency II, BCBS 239)
    - Certifications such as Informatica, CDMP, or DCAM
    - Background in consulting or large-scale data transformation programs

    Key Competencies: Strategic and analytical thinking; a strong governance and control mindset; excellent stakeholder and relationship management; the ability to drive organizational change and embed a governance culture; attention to detail with a pragmatic approach.

    Why Join Paradigm? At Paradigm, integrity drives innovation. You'll collaborate with curious, dedicated teammates, solving complex problems and unlocking immense data value for leading organizations. If you seek a place where your voice is heard, growth is supported, and your work creates lasting business value, you belong at Paradigm. Learn more at ********************

    Policy Disclosure: Paradigm maintains a strict drug-free workplace policy. All offers of employment are contingent upon successfully passing a standard 5-panel drug screen. Please note that a positive test result for any prohibited substance, including marijuana, will result in disqualification from employment, regardless of state laws permitting its use. This policy applies consistently across all positions and locations.
    $76k-107k yearly est. 5d ago
  • Data Scientist 2

    Cullerton Group

    Data engineer job in Dallas, TX

    Cullerton Group has a new opportunity for a Data Scientist 2. The work will be done onsite full-time, with flexibility for candidates located in Illinois (Mossville) or Texas (Dallas) depending on business needs. This is a long-term 12-month position that can lead to permanent employment with our client. Compensation is up to $58.72/hr + full benefits (vision, dental, health insurance, 401k, and holiday pay).

    Job Summary: Cullerton Group is seeking a motivated and analytical Data Scientist to support strategic sourcing and cost management initiatives through advanced data analytics and reporting. This role focuses on developing insights from complex datasets to guide decision-making, improve cost visibility, and support category strategy execution. The ideal candidate will collaborate with cross-functional teams, apply statistical and analytical methods, and contribute independently to analytics-driven projects that deliver measurable business value.

    Key Responsibilities:
    - Develop and maintain scorecards, dashboards, and reports by consolidating data from multiple enterprise sources
    - Perform data collection, validation, and analysis to support strategic sourcing and cost savings initiatives
    - Apply statistical analysis and modeling techniques to identify trends, risks, and optimization opportunities
    - Support monthly and recurring reporting processes, including cost tracking and performance metrics
    - Collaborate with category teams, strategy leaders, and peers to translate analytics into actionable insights

    Required Qualifications:
    - Bachelor's degree in a quantitative field such as Data Science, Statistics, Engineering, Computer Science, Economics, Mathematics, or similar (or a Master's degree in lieu of experience)
    - 3-5 years of professional experience performing quantitative analysis (internships accepted)
    - Proficiency with analytics and data visualization tools, including Power BI
    - Strong problem-solving skills with the ability to communicate insights clearly to technical and non-technical audiences

    Preferred Qualifications:
    - Experience with advanced statistical methods (regression, hypothesis testing, ANOVA, statistical process control)
    - Practical exposure to machine learning techniques such as clustering, logistic regression, random forests, or similar models
    - Experience with cloud platforms (AWS, Azure, or Google Cloud)
    - Familiarity with procurement, sourcing, cost management, or manufacturing-related analytics
    - Strong initiative, collaboration skills, and commitment to continuous learning in analytics

    Why This Role? This position offers the opportunity to work on high-impact analytics projects that directly support sourcing strategy, cost optimization, and operational decision-making. You will collaborate with diverse teams, gain exposure to large-scale enterprise data, and contribute to meaningful initiatives that drive measurable business outcomes. Cullerton Group provides a professional consulting environment with growth potential, strong client partnerships, and long-term career opportunities.
    $58.7 hourly 2d ago
  • Data Engineer

    Robert Half (4.5 company rating)

    Data engineer job in Dallas, TX

    We are seeking a highly experienced Senior Data Engineer with deep expertise in modern data engineering frameworks and cloud-native architectures, primarily on AWS. This role focuses on designing, building, and optimizing scalable data pipelines and distributed systems. You will collaborate cross-functionally to deliver secure, high-quality data solutions that drive business decisions. Key Responsibilities Design & Build: Develop and maintain scalable, highly available AWS-based data pipelines, specializing in EKS/ECS containerized workloads and services like Glue, EMR, and Lake Formation. Orchestration: Implement automated data ingestion, transformation, and workflow orchestration using Airflow, NiFi, and AWS Step Functions. Real-time: Architect and implement real-time streaming solutions with Kafka, MSK, and Flink. Data Lake & Storage: Architect secure S3 data storage and govern data lakes using Lake Formation and Glue Data Catalog. Optimization: Optimize distributed processing solutions (Databricks, Spark, Hadoop) and troubleshoot performance across cloud-native systems. Governance: Ensure robust data quality, security, and governance via IAM, Lake Formation controls, and automated validations. Mentorship: Mentor junior team members and foster technical excellence. Requirements Experience: 7+ years in data engineering; strong hands-on experience designing cloud data pipelines. AWS Expertise: Deep proficiency in EKS, ECS, S3, Lake Formation, Glue, EMR, IAM, and MSK. Core Tools: Strong experience with Kafka, Airflow, NiFi, Databricks, Spark, Hadoop, and Flink. Coding: Proficiency in Python, Scala, or Java for building data pipelines and automation. Databases: Strong SQL skills and experience with relational/NoSQL databases (e.g., Redshift, DynamoDB). Cloud-Native Skills: Strong knowledge of Kubernetes, containerization, and CI/CD pipelines. Education: Bachelor's degree in Computer Science or related field.
    $86k-121k yearly est. 2d ago
  • Data Scientist (F2F Interview)

    GBIT (Global Bridge Infotech Inc.)

    Data engineer job in Dallas, TX

    W2 Contract Dallas, TX (Onsite) We are seeking an experienced Data Scientist to join our team in Dallas, Texas. The ideal candidate will have a strong foundation in machine learning, data modeling, and statistical analysis, with the ability to transform complex datasets into clear, actionable insights that drive business impact. Key Responsibilities Develop, implement, and optimize machine learning models to support business objectives. Perform exploratory data analysis, feature engineering, and predictive modeling. Translate analytical findings into meaningful recommendations for technical and non-technical stakeholders. Collaborate with cross-functional teams to identify data-driven opportunities and improve decision-making. Build scalable data pipelines and maintain robust analytical workflows. Communicate insights through reports, dashboards, and data visualizations. Qualifications Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related field. Proven experience working with machine learning algorithms and statistical modeling techniques. Proficiency in Python or R, along with hands-on experience using libraries such as Pandas, NumPy, Scikit-learn, or TensorFlow. Strong SQL skills and familiarity with relational or NoSQL databases. Experience with data visualization tools (e.g., Tableau, Power BI, matplotlib). Excellent problem-solving, communication, and collaboration skills.
    $69k-97k yearly est. 4d ago
  • Data Modeler

    People Consultancy Services (PCS)

    Data engineer job in Plano, TX

    Plano, TX - nearby candidates only. W2 candidates.

    Must Have:
    - 5+ years of experience with data modeling, warehousing, analysis, and data profiling, with the ability to identify trends and anomalies in the data
    - Experience with AWS technologies like S3, AWS Glue, EMR, and IAM roles/permissions
    - Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, SparkSQL, Scala)
    - Experience working with relational databases such as Teradata, and handling both structured and unstructured datasets
    - Data modeling tools (any of Erwin, Power Designer, ER Studio)

    Preferred / Ideal to Have:
    - Proficiency in Python
    - Experience with NoSQL / non-relational databases and data stores (e.g., object storage, document or key-value stores, graph databases, column-family databases)
    - Experience with Snowflake and Databricks
    $79k-108k yearly est. 2d ago
  • Senior Data Engineer

    Longbridge (3.6 company rating)

    Data engineer job in Dallas, TX

    About Us: Longbridge Securities, founded in March 2019 and headquartered in Singapore, is a next-generation online brokerage platform. Established by a team of seasoned finance professionals and technical experts from leading global firms, we are committed to advancing financial technology innovation. Our mission is to empower every investor by offering enhanced financial opportunities.

    What You'll Do: As part of our global expansion, we're seeking a Data Engineer to design and build batch/real-time data warehouses and maintain data platforms that power trading and research for the US market. You'll work on data pipelines, APIs, storage systems, and quality monitoring to ensure reliable, scalable, and efficient data services.

    Responsibilities:
    - Design and build batch/real-time data warehouses to support US market growth
    - Develop efficient ETL pipelines to optimize data processing performance and ensure data quality/stability
    - Build a unified data middleware layer to reduce business data development costs and improve service reusability
    - Collaborate with business teams to identify core metrics and data requirements, delivering actionable data solutions
    - Discover data insights through collaboration with business owners
    - Maintain and develop enterprise data platforms for the US market

    Qualifications:
    - 7+ years of data engineering experience with a proven track record in data platform/data warehouse projects
    - Proficient in the Hadoop ecosystem (Hive, Kafka, Spark, Flink), Trino, SQL, and at least one programming language (Python/Java/Scala)
    - Solid understanding of data warehouse modeling (dimensional modeling, star/snowflake schemas) and ETL performance optimization
    - Familiarity with AWS/cloud platforms and experience with Docker and Kubernetes
    - Experience with open-source data platform development; familiar with at least one relational database (MySQL/PostgreSQL)
    - Strong cross-department collaboration skills to translate business requirements into technical solutions
    - Bachelor's degree or higher in Computer Science, Data Science, Statistics, or related fields
    - Comfortable working in a fast-moving fintech/tech startup environment
    - Proficiency in Mandarin and English at the business communication level for international team collaboration

    Bonus Point: Experience with DolphinScheduler and SeaTunnel is a plus.
    $83k-116k yearly est. 2d ago
  • Data Engineer

    IDR, Inc. (4.3 company rating)

    Data engineer job in Coppell, TX

    IDR is seeking a Data Engineer to join one of our top clients for an opportunity in Coppell, TX. This role involves designing, building, and maintaining enterprise-grade data architectures, with a focus on cloud-based data engineering, analytics, and machine learning applications. The company operates within the technology and data services industry, providing innovative solutions to large-scale clients. Position Overview for the Data Engineer: Develop and maintain scalable data pipelines utilizing Databricks and Azure environments. Design data models and optimize ETL/ELT processes for large datasets. Collaborate with cross-functional teams to implement data solutions supporting analytics, BI, and ML projects. Ensure data quality, availability, and performance across enterprise systems. Automate workflows and implement CI/CD pipelines to improve data deployment processes. Requirements for the Data Engineer: 8-10 years of experience on modern data platforms with a strong background in cloud-based data engineering. Strong expertise in Databricks (PySpark/Scala, Delta Lake, Unity Catalog). Hands-on experience with Azure (AWS/GCP also acceptable if very strong in Databricks). Advanced SQL skills and strong experience with data modeling, ETL/ELT development, and data orchestration. Experience with CI/CD (Azure DevOps, GitHub Actions, Terraform, etc.). What's in it for you? Competitive compensation package. Full benefits: medical, vision, dental, and more! Opportunity to get in with an industry-leading organization. Why IDR? 25+ years of proven industry experience in 4 major markets. Employee Stock Ownership Program. Dedicated Engagement Manager who is committed to you and your success. Medical, dental, vision, and life insurance. ClearlyRated's Best of Staffing Client and Talent Award winner 12 years in a row.
    $75k-103k yearly est. 4d ago
  • Data Engineer

    Ledelsea

    Data engineer job in Irving, TX

    W2 Contract-to-Hire Role with Monthly Travel to the Dallas, Texas Area. We are looking for a highly skilled and independent Data Engineer to support our analytics and data science teams, as well as external client data needs. This role involves writing and optimizing complex SQL queries, generating client-specific data extracts, and building scalable ETL pipelines using Azure Data Factory. The ideal candidate will have a strong foundation in data engineering, with a collaborative mindset and the ability to work across teams and systems.

    Duties/Responsibilities:
    - Develop and optimize complex SQL queries to support internal analytics and external client data requests.
    - Generate custom data lists and extracts based on client specifications and business rules.
    - Design, build, and maintain efficient ETL pipelines using Azure Data Factory.
    - Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
    - Work with Salesforce data; familiarity with SOQL is preferred but not required.
    - Support Power BI reporting through basic data modeling and integration.
    - Assist in implementing MLOps practices for model deployment and monitoring.
    - Use Python for data manipulation, automation, and integration tasks.
    - Ensure data quality, consistency, and security across all workflows and systems.

    Required Skills/Abilities/Attributes:
    - 5+ years of experience in data engineering or a related field.
    - Strong proficiency in SQL, including query optimization and performance tuning.
    - Experience with Azure Data Factory, including git repositories and pipeline deployment.
    - Ability to translate client requirements into accurate and timely data outputs.
    - Working knowledge of Python for data-related tasks.
    - Strong problem-solving skills and ability to work independently.
    - Excellent communication and documentation skills.

    Preferred Skills/Experience:
    - Previous knowledge of building pipelines for ML models.
    - Extensive experience creating/managing stored procedures and functions in MS SQL Server.
    - 2+ years of experience in cloud architecture (Azure, AWS, etc.).
    - Experience with 'code management' systems (Azure DevOps).
    - 2+ years of reporting design and management (Power BI preferred).
    - Ability to influence others through the articulation of ideas, concepts, benefits, etc.

    Education and Experience: Bachelor's degree in a computer science field or applicable business experience. Minimum 3 years of experience in a Data Engineering role. Healthcare experience preferred.

    Physical Requirements: Prolonged periods sitting at a desk and working on a computer. Ability to lift 20 lbs.
    $76k-103k yearly est. 4d ago
  • Data Engineer

    Aaratech

    Data engineer job in Dallas, TX

    Data Engineer (3-4 Years Experience). Remote / On-site (based on client needs). Employment Type: Full-time (Contract or Contract-to-Hire). Experience Level: Mid-level (3-4 years). Company: Aaratech Inc 🛑 Eligibility: Open to U.S. Citizens and Green Card holders only. We do not offer visa sponsorship. 🔍 About Aaratech Inc: Aaratech Inc is a specialized IT consulting and staffing company that places elite engineering talent into high-impact roles at leading U.S. organizations. We focus on modern technologies across cloud, data, and software disciplines. Our client engagements offer long-term stability, competitive compensation, and the opportunity to work on cutting-edge data projects. 🎯 Position Overview: We are seeking a Data Engineer with 3-4 years of experience to join a client-facing role focused on building and maintaining scalable data pipelines, robust data models, and modern data warehousing solutions. You'll work with a variety of tools and frameworks, including Apache Spark, Snowflake, and Python, to deliver clean, reliable, and timely data for advanced analytics and reporting.
    🛠️ Key Responsibilities:
    - Design and develop scalable data pipelines to support batch and real-time processing
    - Implement efficient Extract, Transform, Load (ETL) processes using tools like Apache Spark and dbt
    - Develop and optimize queries using SQL for data analysis and warehousing
    - Build and maintain data warehousing solutions using platforms like Snowflake or BigQuery
    - Collaborate with business and technical teams to gather requirements and create accurate data models
    - Write reusable and maintainable code in Python for data ingestion, processing, and automation
    - Ensure end-to-end data processing integrity, scalability, and performance
    - Follow best practices for data governance, security, and compliance

    ✅ Required Skills & Experience:
    - 3-4 years of experience in Data Engineering or a similar role
    - Strong proficiency in SQL and Python
    - Experience with Extract, Transform, Load (ETL) frameworks and building data pipelines
    - Solid understanding of data warehousing concepts and architecture
    - Hands-on experience with Snowflake, Apache Spark, or similar big data technologies
    - Proven experience in data modeling and data schema design
    - Exposure to data processing frameworks and performance optimization techniques
    - Familiarity with cloud platforms like AWS, GCP, or Azure

    ⭐ Nice to Have:
    - Experience with streaming data pipelines (e.g., Kafka, Kinesis)
    - Exposure to CI/CD practices in data development
    - Prior work in consulting or multi-client environments
    - Understanding of data quality frameworks and monitoring strategies
    $76k-103k yearly est. 2d ago
  • Lead Data Engineer

    Sotalent

    Data engineer job in Plano, TX

    Job Title: Senior Lead Data Engineer Type: Full Time Our client is seeking a hands-on Senior Data Engineer to guide development teams in building scalable, cloud-native solutions. This leadership role combines technical expertise with mentoring, driving innovation across Agile teams to deliver high-impact applications using modern full-stack technologies. About the Role You'll lead developers focused on machine learning, microservices, and full-stack systems while collaborating with product managers to create robust, performant solutions. Stay ahead of tech trends, experiment with emerging tools, and contribute to engineering communities through mentoring and knowledge sharing. Key Responsibilities Design, develop, test, deploy, and support full-stack solutions across Agile teams. Lead engineering teams specializing in ML, distributed microservices, and cloud systems. Build with Java, Scala, Python, RDBMS/NoSQL databases, and cloud data warehouses like Redshift/Snowflake. Partner with product managers to deliver cloud-based applications powering exceptional user experiences. Perform unit testing and code reviews for rigorous design, clean code, and peak performance. Experiment with new technologies and mentor peers in internal/external tech communities. What You Bring Required Bachelor's degree in Computer Science, Engineering, or related field. 6+ years in application development. 2+ years with big data technologies. 1+ year with cloud platforms (AWS, Azure, Google Cloud). Preferred Master's degree. 9+ years in app dev (Python, SQL, Scala, Java). 4+ years public cloud, real-time streaming, NoSQL (MongoDB/Cassandra), data warehousing. 5+ years distributed tools (Hadoop, Spark, Kafka, etc.). 4+ years UNIX/Linux/shell scripting; 2+ years Agile practices.
    $76k-103k yearly est. 2d ago
  • Data Architect

    Ascendion

    Data engineer job in Plano, TX

Ascendion is a full-service digital engineering solutions company. We make and manage software platforms and products that power growth and deliver captivating experiences to consumers and employees. Our engineering, cloud, data, experience design, and talent solution capabilities accelerate transformation and impact for enterprise clients. Headquartered in New Jersey, our workforce of 6,000+ Ascenders delivers solutions from around the globe. Ascendion is built differently to engineer the next.

Ascendion | Engineering to elevate life

We have a culture built on opportunity, inclusion, and a spirit of partnership. Come, change the world with us:
- Build the coolest tech for the world's leading brands
- Solve complex problems - and learn new skills
- Experience the power of transforming digital engineering for Fortune 500 clients
- Master your craft with leading training programs and hands-on experience

Experience a community of change makers! Join a culture of high-performing innovators with endless ideas and a passion for tech. Our culture is the fabric of our company, and it is what makes us unique and diverse. The way we share ideas, learning, experiences, successes, and joy allows everyone to be their best at Ascendion.

Title: Data Architect
Duration: Contract to hire
Location: Plano, TX

Required Qualifications:
- Knowledge of the finance domain and regulatory reporting (GL) - supporting data modeling within a financial team that regularly works with AR/GL, etc.
- Strong data warehousing (DW) skills
- Strong data modeling skills
- Experience developing and implementing data frameworks, reference models, and data models (operational and analytical)
- Hands-on expertise leading the definition, design, and deployment planning of data management strategies
- Hands-on expertise in data modeling for operational and warehouse systems
- Understanding of business systems analysis, and of process and data artifacts within an enterprise architecture framework
- Experience implementing information security and data access control across all forms of data stores (RDBMS, unstructured, file stores, etc.)
- Very good understanding of the underlying hardware architecture (processing and storage) used by data stores (RDBMS and non-RDBMS)
- Strong in the design and development of stored procedures
- Knowledge of Control-M is a plus

Salary Range: The salary for this position is between $140k - $160k annually. Factors which may affect pay within this range may include geography/market, skills, education, experience, and other qualifications of the successful candidate.

Benefits: The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance; dental insurance; vision insurance; 401(k) retirement plan; long-term disability insurance; short-term disability insurance; 5 personal days accrued each calendar year (the paid time off benefits meet the paid sick and safe time laws that pertain to the city/state); 10-15 days of paid vacation time; 6 paid holidays and 1 floating holiday per calendar year; Ascendion Learning Management System. This position is eligible for commissions in accordance with the terms of the Company's plan. Commissions for this position are estimated to be based on individual performance. Additionally, this role is also eligible for a bonus based on the achievement of mutually agreed KRAs.

Want to change the world? Let us know. Tell us about your experiences, education, and ambitions. Bring your knowledge, unique viewpoint, and creativity to the table. Let's talk!
    $140k-160k yearly 2d ago
  • Data Engineer

Anblicks

    Data engineer job in Dallas, TX

Must be local to TX.

Data Engineer - SQL, Python, and PySpark Expert (Onsite - Dallas, TX)

We are seeking a Data Engineer with strong proficiency in SQL, Python, and PySpark to support high-performance data pipelines and analytics initiatives. This role will focus on scalable data processing, transformation, and integration efforts that enable business insights, regulatory compliance, and operational efficiency.

Key Responsibilities
- Design, develop, and optimize ETL/ELT pipelines using SQL, Python, and PySpark for large-scale data environments
- Implement scalable data processing workflows in distributed data platforms (e.g., Hadoop, Databricks, or Spark environments)
- Partner with business stakeholders to understand and model mortgage lifecycle data (origination, underwriting, servicing, foreclosure, etc.)
- Create and maintain data marts, views, and reusable data components to support downstream reporting and analytics
- Ensure data quality, consistency, security, and lineage across all stages of data processing
- Assist in data migration and modernization efforts to cloud-based data warehouses (e.g., Snowflake, Azure Synapse, GCP BigQuery)
- Document data flows, logic, and transformation rules
- Troubleshoot performance and quality issues in batch and real-time pipelines
- Support compliance-related reporting (e.g., HMDA, CFPB)

Required Qualifications
- 6+ years of experience in data engineering or data development
- Advanced expertise in SQL (joins, CTEs, optimization, partitioning, etc.)
- Strong hands-on skills in Python for scripting, data wrangling, and automation
- Proficiency in PySpark for building distributed data pipelines and processing large volumes of structured/unstructured data
- Experience working with mortgage banking data sets and domain knowledge highly preferred
- Strong understanding of data modeling (dimensional, normalized, star schema)
- Experience with cloud-based platforms (e.g., Azure Databricks, AWS EMR, GCP Dataproc)
- Familiarity with ETL tools and orchestration frameworks (e.g., Airflow, ADF, dbt)
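The advanced SQL this listing calls out (joins, CTEs, aggregation) can be sketched in miniature with Python's built-in sqlite3 module. The loan table, columns, and values below are invented for illustration only, not taken from any client schema:

```python
import sqlite3

# Hypothetical loan-servicing table; names and figures are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (loan_id INTEGER, state TEXT, balance REAL);
    INSERT INTO loans VALUES (1, 'TX', 250000), (2, 'TX', 180000), (3, 'OK', 90000);
""")

# A CTE pre-aggregates per-state balances, then the outer query filters --
# the kind of CTE fluency the qualifications describe.
rows = conn.execute("""
    WITH state_totals AS (
        SELECT state, SUM(balance) AS total, COUNT(*) AS n
        FROM loans
        GROUP BY state
    )
    SELECT state, total, n FROM state_totals WHERE total > 100000
""").fetchall()
print(rows)  # [('TX', 430000.0, 2)]
```

The same CTE-then-filter shape scales up directly to warehouse SQL on Snowflake or Synapse.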
    $75k-102k yearly est. 3d ago
  • GCP Data Engineer

    Methodhub

    Data engineer job in Fort Worth, TX

Job Title: GCP Data Engineer
Employment Type: W2/CTH
Client: Direct

We are seeking a highly skilled Data Engineer with strong expertise in Python, SQL, and Google Cloud Platform (GCP) services. The ideal candidate will have 6-8 years of hands-on experience in building and maintaining scalable data pipelines, working with APIs, and leveraging GCP tools such as BigQuery, Cloud Composer, and Dataflow.

Core Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics and business operations
- Develop and optimize ETL processes for structured and unstructured data
- Work with BigQuery, Cloud Composer, and other GCP services to manage data workflows
- Collaborate with data analysts and business teams to ensure data availability and quality
- Integrate data from multiple sources using APIs and custom scripts
- Monitor and troubleshoot pipeline performance and reliability

Technical Skills:
- Strong proficiency in Python and SQL
- Experience with data pipeline development and ETL frameworks

GCP Expertise:
- Hands-on experience with BigQuery, Cloud Composer, and Dataflow

Additional Requirements:
- Familiarity with workflow orchestration tools and cloud-based data architecture
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
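Cloud Composer, named above, is GCP's managed Airflow, and its core job is running tasks in dependency order. As a rough stand-in that needs no GCP account, the same topological-ordering idea can be shown with Python's stdlib graphlib; the task names and dependency graph here are hypothetical, not from any real pipeline:

```python
from graphlib import TopologicalSorter

# Toy stand-in for a Composer/Airflow DAG: each task maps to the set of
# tasks that must finish before it runs (names are invented).
tasks = {
    "extract_api": set(),
    "load_raw_to_bq": {"extract_api"},
    "transform_bq": {"load_raw_to_bq"},
    "publish_report": {"transform_bq"},
}

# static_order() yields a valid execution order respecting all dependencies.
order = list(TopologicalSorter(tasks).static_order())
print(order)  # ['extract_api', 'load_raw_to_bq', 'transform_bq', 'publish_report']
```

A real Composer DAG declares the same dependencies with operators and `>>` chaining; the scheduling principle is identical.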
    $76k-104k yearly est. 5d ago
  • Data Engineer

    Beaconfire Inc.

    Data engineer job in Dallas, TX

Junior Data Engineer

DESCRIPTION: BeaconFire is based in Central NJ, specializing in Software Development, Web Development, and Business Intelligence; we are looking for candidates who are good communicators and self-motivated. You will play a key role in building, maintaining, and operating integrations, reporting pipelines, and data transformation systems.

Qualifications:
- Passion for data and a deep desire to learn
- Master's Degree in Computer Science/Information Technology, Data Analytics/Data Science, or a related discipline
- Intermediate Python; experience in data processing (NumPy, pandas, etc.) is a plus
- Experience with relational databases (SQL Server, Oracle, MySQL, etc.)
- Strong written and verbal communication skills
- Ability to work both independently and as part of a team

Responsibilities:
- Collaborate with the analytics team to find reliable data solutions to meet business needs
- Design and implement scalable ETL or ELT processes to support the business demand for data
- Perform data extraction, manipulation, and production from database tables
- Build utilities, user-defined functions, and frameworks to better enable data flow patterns
- Build and incorporate automated unit tests; participate in integration testing efforts
- Work with teams to resolve operational and performance issues
- Work with architecture/engineering leads and other teams to ensure quality solutions are implemented and engineering best practices are defined and adhered to

Compensation: $65,000.00 to $80,000.00/year

BeaconFire is an E-Verified company. Work visa sponsorship is available.
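As a toy illustration of the extract-transform-load pattern these responsibilities describe, here is a minimal, stdlib-only Python sketch; the CSV feed and field names are made up and stand in for whatever source system a real pipeline would read:

```python
import csv
import io

# Hypothetical source feed (extract stage would normally pull this
# from a database table or API, per the responsibilities above).
raw = "id,amount\n1,10.5\n2,3.25\n2,1.75\n"

def run_etl(text):
    """Extract rows from CSV text, cast types, and aggregate per id."""
    totals = {}
    for row in csv.DictReader(io.StringIO(text)):          # extract
        rid = int(row["id"])                               # transform: type cast
        totals[rid] = totals.get(rid, 0.0) + float(row["amount"])  # load/aggregate
    return totals

print(run_etl(raw))  # {1: 10.5, 2: 5.0}
```

The same three stages reappear at scale in the SQL Server/Oracle pipelines the posting mentions; only the extract source and load target change.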
    $65k-80k yearly 4d ago
  • Data Architect

KPI Partners

    Data engineer job in Plano, TX

KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are leaders in data engineering on Azure, AWS, Google, Snowflake, and Databricks. Founded in 2006, KPI has over 400 consultants and has successfully delivered over 1,000 projects to our clients. We are looking for skilled data engineers who want to work with the best team in data engineering.

Title: Senior Data Architect
Location: Plano, TX (Hybrid)
Job Type: Contract - 6 Months
Key Skills: SQL, PySpark, Databricks, and Azure Cloud

Key Note: Looking for a Data Architect who is hands-on with SQL, PySpark, Databricks, and Azure Cloud.

About the Role: We are seeking a highly skilled and experienced Senior Data Architect to join our dynamic team at KPI, working on challenging, multi-year data transformation projects. This is an excellent opportunity for a talented data engineer to play a key role in building innovative data solutions using Azure native services and related technologies. If you are passionate about working with large-scale data systems and enjoy solving complex engineering problems, this role is for you.

Key Responsibilities:
- Data Engineering: Design, develop, and implement data pipelines and solutions using PySpark, SQL, and related technologies
- Collaboration: Work closely with cross-functional teams to understand business requirements and translate them into robust data solutions
- Data Warehousing: Design and implement data warehousing solutions, ensuring scalability, performance, and reliability
- Continuous Learning: Stay up to date with modern technologies and trends in data engineering and apply them to improve our data platform
- Mentorship: Provide guidance and mentorship to junior data engineers, ensuring best practices in coding, design, and development

Must-Have Skills & Qualifications:
- Minimum 12+ years of overall experience in the IT industry
- 4+ years of experience in data engineering, with a strong background in building large-scale data solutions
- 4+ years of hands-on experience developing and implementing data pipelines on the Azure stack (Azure, ADF, Databricks, Functions)
- Proven expertise in SQL for querying, manipulating, and analyzing large datasets
- Strong knowledge of ETL processes and data warehousing fundamentals
- Self-motivated and independent, with a "let's get this done" mindset and the ability to thrive in a fast-paced, dynamic environment

Good-to-Have Skills:
- Databricks Certification is a plus
- Data modeling; Azure Architect Certification
    $88k-123k yearly est. 5d ago
  • GCP Data Engineer

Infosys

    Data engineer job in Richardson, TX

Infosys is seeking a Google Cloud Platform (GCP) data engineer with experience in GitHub and Python. In this role, you will enable digital transformation for our clients in a global delivery model, research technologies independently, recommend appropriate solutions, and contribute to technology-specific best practices and standards. You will be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.

Required Qualifications:
- Candidate must be located within commuting distance of Richardson, TX or be willing to relocate to the area. This position may require travel in the US.
- Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
- At least 4 years of Information Technology experience.
- Experience working with GCP data engineering technologies: Dataflow/Airflow, Pub/Sub, Kafka, Dataproc/Hadoop, and BigQuery.
- ETL development experience with a strong SQL background, plus languages and frameworks such as Python/R, Scala, Java, Hive, Spark, and Kafka.
- Strong knowledge of Python program development to build reusable frameworks and enhance existing frameworks.
- Application build experience with core GCP services like Dataproc, GKE, and Composer.
- Deep understanding of GCP IAM and GitHub; must have done IAM setup.
- Knowledge of CI/CD pipelines using Terraform in Git.

Preferred Qualifications:
- Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery data sets in the ingestion and transformation layers.
- Experience in relational modeling, dimensional modeling, and modeling of unstructured data.
- Knowledge of Airflow DAG creation, execution, and monitoring.
- Good understanding of Agile software development frameworks.
- Ability to work in teams in a diverse, multi-stakeholder environment comprising business and technology teams.
- Experience and desire to work in a global delivery environment.
    $72k-91k yearly est. 5d ago
  • Azure Data Architect

    Ltimindtree

    Data engineer job in Dallas, TX

About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************

Job Title: Data Architect
Work Location: Dallas, Texas

Job Description:
The ideal candidate will have a good understanding of big data technologies, data engineering, and cloud computing for DWH projects, with a focus on Azure Databricks.
- Work closely with business stakeholders and other IT teams to understand requirements and define the scope for engagements with reasonable timelines
- Ensure proper documentation of architecture processes while ensuring compliance with security and governance standards
- Ensure the team follows best practices in terms of code quality, data security, and scalability
- Stay updated with the latest developments in Databricks and associated technologies to drive innovation

Requirements:
- 12 years of experience, along with 5 years of data analytics project experience
- Experience with Azure Databricks notebook development and Delta Lake
- Good understanding of Azure services like Azure Data Lake, Azure Synapse, Azure Data Factory, and Fabric
- Experience with ETL/ELT processes, data warehousing, and building data lakes
- SQL skills and familiarity with NoSQL databases
- Experience with CI/CD pipelines and version control systems like Git

Soft Skills:
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
- Strong problem-solving skills and a proactive approach to identifying and resolving issues
- Leadership skills, with the ability to manage and mentor a team of data engineers

Additional Experience:
- Power BI for dashboarding and reporting
- Microsoft Fabric for analytics and integration tasks
- Spark Streaming for processing real-time data streams
- Over 12 years of IT experience, including 4 years specializing in developing data ingestion and transformation pipelines using Databricks, Synapse notebooks, and Azure Data Factory
- Good understanding of different domains/industries with respect to data analytics and DWH projects
- Should be good in Excel and PowerPoint
- Good understanding of and experience with Delta tables, Delta Lake, and Azure Data Lake Storage Gen2
- Experience in building and optimizing query layers using Databricks SQL
- Familiarity with modern CI/CD practices, especially in the context of Databricks and cloud-native solutions

Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):

Benefits and Perks:
- Comprehensive medical plan covering medical, dental, and vision
- Short-term and long-term disability coverage
- 401(k) plan with company match
- Life insurance
- Vacation time, sick leave, and paid holidays
- Paid paternity and maternity leave

The range displayed on each job posting reflects the minimum and maximum salary target for the position across all US locations. Within the range, individual pay is determined by work location and job level and additional factors including job-related skills, experience, and relevant education or training. Depending on the position offered, other forms of compensation may be provided as part of overall compensation, such as an annual performance-based bonus, sales incentive pay, and other forms of bonus or variable compensation.

Disclaimer: The compensation and benefits information provided herein is accurate as of the date of this posting. LTIMindtree is an equal opportunity employer committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth, or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
    $87k-119k yearly est. 5d ago
  • Senior Data Architect

    Akkodis

    Data engineer job in Dallas, TX

Akkodis is seeking a Senior Data Architect for a contract with a client located in Dallas, TX (hybrid).

Pay Range: $80/hr - $90/hr; the rate may be negotiable based on experience, education, geographic location, and other factors.

Must have: Oracle Exadata, ETL, Informatica, GoldenGate, and MDM.

Job Description: Primary responsibilities of the Senior Data Architect include designing and managing data architectural solutions for multiple environments, including but not limited to Data Warehouse, ODS, and Data Replication/ETL Data Management initiatives. The candidate will be in an expert role and will work closely with Business, DBA, ETL, and Data Management teams, providing solutions for complex data-related initiatives. This individual will also be responsible for developing and managing Data Governance and Master Data Management solutions. This candidate must have good technical and communication skills coupled with the ability to mentor effectively.

Responsibilities
- Establish policies, procedures, and guidelines regarding all aspects of Data Governance
- Ensure data decisions are consistent and best practices are adhered to
- Ensure data standardization definitions, the data dictionary, and data lineage are kept up to date and accessible
- Work with ETL, Replication, and DBA teams to determine best practices as they relate to data transformations, data movement, and derivations
- Work with support teams to ensure consistent and proactive support methodologies are in place for all aspects of data movement and data transformation
- Work with and mentor Data Architects and Data Analysts to ensure best practices are adhered to for database design and data management
- Assist in overall architectural solutions, including but not limited to Data Warehouse, ODS, and Data Replication/ETL Data Management initiatives
- Work with the business teams and Enterprise Architecture team to ensure the best architectural solutions from a data perspective
- Create a strategic roadmap for MDM implementation
- Implement a Master Data Management tool
- Establish policies, procedures, and guidelines regarding all aspects of Master Data Management
- Ensure architectural rules and the design of the MDM process are documented and best practices are adhered to

Qualifications
- 5+ years of Data Architecture experience, including OLTP, Data Warehouse, and Big Data
- 5+ years of Solution Architecture experience
- 5+ years of MDM experience
- 5+ years of Data Governance experience, with working knowledge of best practices
- Extensive working knowledge of all aspects of data movement and processing, including middleware, ETL, API, OLAP, and best practices for data tracking
- Good communication skills
- Self-motivated
- Capable of presenting to all levels of audiences
- Works well in a team environment

If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, please contact Anirudh Srivastava at ************ or ***********************************.

Equal Opportunity Employer/Veterans/Disabled

Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits, and a 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria.

Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client. To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit ******************************************
    $80 hourly 1d ago
  • Lead GCP Data Engineer/Architect

Programmers.Io

    Data engineer job in Richardson, TX

We are seeking a highly experienced Lead GCP Data Engineer to design, build, and optimize scalable data engineering solutions on Google Cloud Platform. The ideal candidate will take ownership of building robust data pipelines, ensuring best practices, and leading engineering teams to deliver high-quality data solutions for analytics, reporting, and business operations.

Key Responsibilities
- Lead the design, development, and deployment of data pipelines and data integration workflows on GCP
- Build and optimize data ingestion, transformation, and storage using tools such as Dataflow, Dataproc, Pub/Sub, Composer, BigQuery, Cloud Storage, and Cloud Functions
- Collaborate with data architects, analysts, and business teams to translate requirements into technical solutions
- Develop and maintain ETL/ELT frameworks, ensuring scalability, performance, and reliability
- Implement and enforce best practices around data quality, data validation, metadata management, and documentation
- Conduct performance tuning for BigQuery, Dataflow, Spark jobs, and data pipelines
- Drive cost optimization strategies for GCP data workloads
- Ensure compliance with data security, governance, and access control policies
- Provide technical leadership, mentoring, and code reviews for the data engineering team
- Contribute to architecture discussions and technology strategy for cloud data platforms
    $84k-118k yearly est. 1d ago

Learn more about data engineer jobs

How much does a data engineer earn in Carrollton, TX?

The average data engineer in Carrollton, TX earns between $66,000 and $118,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Carrollton, TX

$88,000

What are the biggest employers of Data Engineers in Carrollton, TX?

The biggest employers of Data Engineers in Carrollton, TX are:
  1. U.S. Bank
  2. Acosta
  3. Capgemini
  4. Huntington National Bank
  5. Lennar
  6. Fusion Health
  7. Soho Dragon
  8. Select Medical
  9. Michaels Stores
  10. The PNC Financial Services Group