Mobile Engineering
Data engineer job in Austin, TX
JLL empowers you to shape a brighter way.
Our people at JLL and JLL Technologies are shaping the future of real estate for a better world by combining world-class services, advisory and technology for our clients. We are committed to hiring the best, most talented people and empowering them to thrive, grow meaningful careers and find a place where they belong. Whether you've got deep experience in commercial real estate, skilled trades or technology, or you're looking to apply your relevant experience to a new industry, join our team as we help shape a brighter way forward.
Mobile Engineering - JLL
What this job involves: This position focuses on the hands-on performance of ongoing preventive maintenance and repair work orders across multiple facility locations. You will maintain, operate, and repair building systems including HVAC, electrical, plumbing, and other critical infrastructure components. This mobile role requires you to travel between assigned buildings, conduct facility inspections, respond to emergencies, and ensure all systems operate efficiently to support client occupancy and satisfaction across JLL's building portfolio.
What your day-to-day will look like:
• Perform ongoing preventive maintenance and repair work orders on facility mechanical, electrical and other installed systems, equipment, and components.
• Maintain, operate, and repair all HVAC systems and associated equipment, electrical distribution equipment, plumbing systems, building interior/exterior repair, and related grounds.
• Conduct assigned facility inspections and due diligence efforts, reporting conditions that impact client occupancy and operations.
• Respond effectively to all emergencies and after-hours building activities as required.
• Prepare and submit summary reports to management listing conditions found during assigned work and recommend corrective actions.
• Study and maintain familiarity with building automation systems, fire/life safety systems, and other building-related equipment.
• Maintain compliance with all safety procedures, recognize hazards, and propose elimination methods while adhering to State, County, or City Ordinances, Codes, and Laws.
Required Qualifications:
• Valid state driver's license and Universal CFC Certification.
• Minimum four years of technical experience in all aspects of building engineering with strong background in packaged and split HVAC units, plumbing, and electrical systems.
• Physical ability to lift up to 80 lbs and climb ladders up to 30 ft.
• Ability to read schematics and technical drawings.
• Availability for on-call duties and overtime as required.
• Must pass background, drug/alcohol, and MVR screening process.
Preferred Qualifications:
• Experience with building automation systems and fire/life safety systems.
• Knowledge of CMMS systems such as Corrigo for work order management.
• Strong troubleshooting and problem-solving abilities across multiple building systems.
• Experience working in commercial building environments.
• Commitment to ongoing safety training and professional development.
Location: Mobile position covering Austin, TX and surrounding area.
Work Shift: Standard business hours with on-call availability
#HVACjobs
This position does not provide visa sponsorship. Candidates must be authorized to work in the United States without employer sponsorship.
Location:
On-site - Austin, TX
If this job description resonates with you, we encourage you to apply, even if you don't meet all the requirements. We're interested in getting to know you and what you bring to the table!
Personalized benefits that support personal well-being and growth:
JLL recognizes the impact that the workplace can have on your wellness, so we offer a supportive culture and comprehensive benefits package that prioritizes mental, physical and emotional health. Some of these benefits may include:
401(k) plan with matching company contributions
Comprehensive Medical, Dental & Vision Care
Paid parental leave at 100% of salary
Paid Time Off and Company Holidays
Early access to earned wages through Daily Pay
JLL Privacy Notice
Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and retain it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely.
For more information about how JLL processes your personal data, please view our Candidate Privacy Statement.
For additional details please see our career site pages for each country.
For candidates in the United States, please see a full copy of our Equal Employment Opportunity policy here.
Jones Lang LaSalle (“JLL”) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process - including the online application and/or overall selection process - you may email us at ******************. This email is only to request an accommodation. Please direct any other general recruiting inquiries to our Contact Us page > I want to work for JLL.
Accepting applications on an ongoing basis until a candidate is identified.
Senior Data Engineer
Data engineer job in Austin, TX
We are looking for a seasoned Azure Data Engineer to design, build, and optimize secure, scalable, and high-performance data solutions within the Microsoft Azure ecosystem. This will be a multi-year contract worked FULLY ONSITE in Austin, TX.
The ideal candidate brings deep technical expertise in data architecture, ETL/ELT engineering, data integration, and governance, along with hands-on experience in MDM, API Management, Lakehouse architectures, and data mesh or data hub frameworks. This position combines strategic architectural planning with practical, hands-on implementation, empowering cross-functional teams to leverage data as a key organizational asset.
Key Responsibilities
1. Data Architecture & Strategy
Design and deploy end-to-end Azure data platforms using Azure Data Lake, Azure Synapse Analytics, Azure Databricks, and Azure SQL Database.
Build and implement Lakehouse and medallion (Bronze/Silver/Gold) architectures for scalable and modular data processing (a brief sketch follows this section).
Define and support data mesh and data hub patterns to promote domain-driven design and federated governance.
Establish standards for conceptual, logical, and physical data modeling across data warehouse and data lake environments.
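For orientation on the medallion pattern referenced above, here is a minimal PySpark sketch of a Bronze/Silver/Gold flow; the storage path, schema names, and columns are illustrative assumptions, not specifics of this role.

```python
# Minimal medallion-layer sketch (illustrative; paths, schemas, and columns are assumptions).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw files as-is, preserving source fidelity.
bronze = spark.read.json("abfss://raw@exampleaccount.dfs.core.windows.net/orders/")
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: deduplicate and conform types.
silver = (spark.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts")))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: aggregate into a business-ready model for reporting.
gold = (spark.table("silver.orders")
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("lifetime_value")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_value")
```

In practice each layer would typically be parameterized and orchestrated through Data Factory or Databricks Workflows rather than run as a single script.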
2. Data Integration & Pipeline Development
Develop and maintain ETL/ELT pipelines using Azure Data Factory, Synapse Pipelines, and Databricks for both batch and streaming workloads.
Integrate diverse data sources (on-prem, cloud, SaaS, APIs) into a unified Azure data environment.
Optimize pipelines for cost-effectiveness, performance, and scalability.
3. Master Data Management (MDM) & Data Governance
Implement MDM solutions using Azure-native or third-party platforms (e.g., Profisee, Informatica, Semarchy).
Define and manage data governance, metadata, and data quality frameworks.
Partner with business teams to align data standards and maintain data integrity across domains.
4. API Management & Integration
Build and manage APIs for data access, transformation, and system integration using Azure API Management and Logic Apps.
Design secure, reliable data services for internal and external consumers.
Automate workflows and system integrations using Azure Functions, Logic Apps, and Power Automate.
5. Database & Platform Administration
Perform core DBA tasks, including performance tuning, query optimization, indexing, and backup/recovery for Azure SQL and Synapse.
Monitor and optimize cost, performance, and scalability across Azure data services.
Implement CI/CD and Infrastructure-as-Code (IaC) solutions using Azure DevOps, Terraform, or Bicep.
6. Collaboration & Leadership
Work closely with data scientists, analysts, business stakeholders, and application teams to deliver high-value data solutions.
Mentor junior engineers and define best practices for coding, data modeling, and solution design.
Contribute to enterprise-wide data strategy and roadmap development.
Required Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related fields.
5+ years of hands-on experience in Azure-based data engineering and architecture.
Strong proficiency with the following:
Azure Data Factory, Azure Synapse, Azure Databricks, Azure Data Lake Storage Gen2
SQL, Python, PySpark, PowerShell
Azure API Management and Logic Apps
Solid understanding of data modeling approaches (3NF, dimensional modeling, Data Vault, star/snowflake schemas).
Proven experience with Lakehouse/medallion architectures and data mesh/data hub designs.
Familiarity with MDM concepts, data governance frameworks, and metadata management.
Experience with automation, data-focused CI/CD, and IaC.
Thorough understanding of Azure security, RBAC, Key Vault, and core networking principles.
What We Offer
Competitive compensation and benefits package
Luna Data Solutions, Inc. (LDS) provides equal employment opportunities to all employees. All applicants will be considered for employment. LDS prohibits discrimination and harassment of any type regarding age, race, color, religion, sexual orientation, gender identity, sex, national origin, genetics, protected veteran status, and/or disability status.
Senior Data Retention & Protection Consultant: Disaster Recovery
Data engineer job in Dallas, TX
Technology Recovery Services provides subject matter expertise and direction on complex IT disaster recovery projects/initiatives and supports IT disaster recovery technical planning, coordination and service maturity working across IT, business resilience, risk management, regulatory and compliance.
Summary of Essential Functions:
Govern disaster recovery plans and procedures for critical business applications and infrastructure.
Create, update, and publish disaster recovery related policies, procedures, and guidelines.
Ensure annual updates and validations of DR policies and procedures to maintain readiness and resilience.
Maintain up-to-date knowledge of disaster recovery and business continuity best practices.
Perform regular disaster recovery testing, including simulation exercises, incident response simulations, tabletop exercises, and actual failover drills to validate procedures and identify improvements.
Train staff and educate employees on disaster recovery processes, their roles during incidents, and adherence to disaster recovery policies.
Coordinate technology response to natural disasters and aircraft accidents.
Qualifications:
Strong knowledge of Air vault and ransomware recovery technologies
Proven ability to build, cultivate, and promote strong relationships with internal customers at all levels of the organization, as well as with Technology counterparts, business partners, and external groups
Proficiency in handling operational issues effectively and understanding escalation, communication, and crisis management
Demonstrated call control and situation management skills in fast-paced, highly dynamic situations
Knowledge of basic IT and Airline Ecosystems
Understand SLAs, engagement processes, and the urgency needed to engage teams during critical situations
Ability to understand and explain interconnected application functionality in a complex environment and share knowledge with peers
Customer-centric attitude and the ability to focus on providing best-in-class service for customers and stakeholders
Ability to execute with a high level of operational urgency while remaining calm, working closely with a team and stakeholders during a critical situation, and applying project management skills
Ability to present to C-level executives with outstanding communication skills
Ability to lead a large group of up to 200 people, including support, development, leaders, and executives, on a single call
Ability to effectively triage: detect and distinguish symptom vs. cause, and capture key data from various sources, systems, and people
Knowledge of business strategies and priorities
Excellent communication and stakeholder engagement skills.
Required:
3+ years of similar or related experience in fields such as Disaster Recovery, Business Continuity, and Enterprise Operational Resilience.
Working knowledge of Disaster Recovery professional practices, including Business Impact Analysis, disaster recovery plans (DRP), redundancy and failover mechanisms, DR-related regulatory requirements, and Business Continuity Plan exercises and audits.
Ability to motivate, influence, and train others.
Strong analytical skills and problem-solving skills using data analysis tools including Alteryx and Tableau.
Ability to communicate technical and operational issues clearly to both technical and nontechnical audiences.
Data Engineer
Data engineer job in Houston, TX
We are looking for a talented and motivated Python Data Engineer to help expand our data assets in support of our analytical capabilities in a full-time role. This role will have the opportunity to interface directly with our traders, analysts, researchers, and data scientists to drive out requirements and deliver a wide range of data-related needs.
What you will do:
- Translate business requirements into technical deliveries. Drive out requirements for data ingestion and access
- Maintain the cleanliness of our Python codebase, while adhering to existing designs and coding conventions as much as possible
- Contribute to our developer tools and Python ETL toolkit, including standardization and consolidation of core functionality
- Efficiently coordinate with the rest of our team in different locations
Qualifications
- 6+ years of enterprise-level coding experience with Python
- Computer Science, MIS or related degree
- Familiarity with Pandas and NumPy packages
- Experience with Data Engineering and building data pipelines
- Experience scraping websites with Requests, Beautiful Soup, Selenium, etc. (see the brief sketch after this list)
- Strong understanding of object-oriented design, design patterns, SOA architectures
- Proficient understanding of peer-reviewing, code versioning, and bug/issue tracking tools.
- Strong communication skills
- Familiarity with containerization solutions like Docker and Kubernetes is a plus
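As a rough, self-contained illustration of the Requests + Beautiful Soup + Pandas workflow named in the qualifications above, here is a minimal sketch; the URL and table layout are hypothetical.

```python
# Hypothetical example: scrape an HTML table into a Pandas DataFrame.
import pandas as pd
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/prices"  # placeholder URL, not a real data source

resp = requests.get(URL, timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = []
for tr in soup.select("table#prices tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) == 2:
        rows.append({"symbol": cells[0], "price": float(cells[1])})

df = pd.DataFrame(rows)
print(df.describe())
```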
Applied Data Scientist/ Data Science Engineer
Data engineer job in Austin, TX
Role: Applied Data Scientist/ Data Science Engineer
Years of experience: 8+
Job type: Full-time
Job Responsibilities:
You will be part of a team that innovates and collaborates with internal stakeholders to deliver world-class solutions with a customer first mentality. This group is passionate about the data science field and is motivated to find opportunity in, and develop solutions for, evolving challenges.
You will:
Solve business and customer issues utilizing AI/ML - Mandatory
Build prototypes and scalable AI/ML solutions that will be integrated into software products
Collaborate with software engineers, business stakeholders and product owners in an Agile environment
Have complete ownership of model outcomes and drive continuous improvement
Essential Requirements:
Strong coding skills in Python and SQL - Mandatory
Machine Learning knowledge (Deep Learning, Information Retrieval (RAG), GenAI, Classification, Forecasting, Regression, etc. on large datasets) with experience in ML model deployment (a minimal classification sketch follows this list)
Ability to work with internal stakeholders to translate business questions into quantitative problem statements
Ability to effectively communicate data science progress to non-technical internal stakeholders
Ability to lead a team of data scientists is a plus
Experience with Big Data technologies and/or software development is a plus
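As a small, self-contained illustration of the classification work referenced above, the sketch below trains a model on synthetic data with scikit-learn; it is not a description of the team's actual modeling stack.

```python
# Minimal classification sketch on synthetic data (all names and parameters are illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```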
Senior Data Governance Consultant (Informatica)
Data engineer job in Plano, TX
Senior Data Governance Consultant (Informatica)
About Paradigm - Intelligence Amplified
Paradigm is a strategic consulting firm that turns vision into tangible results. For over 30 years, we've helped Fortune 500 and high-growth organizations accelerate business outcomes across data, cloud, and AI. From strategy through execution, we empower clients to make smarter decisions, move faster, and maximize return on their technology investments. What sets us apart isn't just what we do, it's how we do it. Driven by a clear mission and values rooted in integrity, excellence, and collaboration, we deliver work that creates lasting impact. At Paradigm, your ideas are heard, your growth is prioritized, and your contributions make a difference.
Summary:
We are seeking a Senior Data Governance Consultant to lead and enhance data governance capabilities across a financial services organization
The Senior Data Governance Consultant will collaborate closely with business, risk, compliance, technology, and data management teams to define data standards, strengthen data controls, and drive a culture of data accountability and stewardship
The ideal candidate will have deep experience in developing and implementing data governance frameworks, data policies, and control mechanisms that ensure compliance, consistency, and trust in enterprise data assets
Hands-on experience with Informatica, including Master Data Management (MDM) or Informatica Data Management Cloud (IDMC), is preferred
This position is Remote, with occasional travel to Plano, TX
Responsibilities:
Data Governance Frameworks:
Design, implement, and enhance data governance frameworks aligned with regulatory expectations (e.g., BCBS 239, GDPR, CCPA, DORA) and internal control standards
Policy & Standards Development:
Develop, maintain, and operationalize data policies, standards, and procedures that govern data quality, metadata management, data lineage, and data ownership
Control Design & Implementation:
Define and embed data control frameworks across data lifecycle processes to ensure data integrity, accuracy, completeness, and timeliness
Risk & Compliance Alignment:
Work with risk and compliance teams to identify data-related risks and ensure appropriate mitigation and monitoring controls are in place
Stakeholder Engagement:
Partner with data owners, stewards, and business leaders to promote governance practices and drive adoption of governance tools and processes
Data Quality Management:
Define and monitor data quality metrics and KPIs, establishing escalation and remediation procedures for data quality issues
Metadata & Lineage:
Support metadata and data lineage initiatives to increase transparency and enable traceability across systems and processes
Reporting & Governance Committees:
Prepare materials and reporting for data governance forums, risk committees, and senior management updates
Change Management & Training:
Develop communication and training materials to embed governance culture and ensure consistent understanding across the organization
Required Qualifications:
7+ years of experience in data governance, data management, or data risk roles within financial services (banking, insurance, or asset management preferred)
Strong knowledge of data policy development, data standards, and control frameworks
Proven experience aligning data governance initiatives with regulatory and compliance requirements
Familiarity with Informatica data governance and metadata tools
Excellent communication skills with the ability to influence senior stakeholders and translate technical concepts into business language
Deep understanding of data management principles (DAMA-DMBOK, DCAM, or equivalent frameworks)
Bachelor's or Master's Degree in Information Management, Data Science, Computer Science, Business, or related field
Preferred Qualifications:
Hands-on experience with Informatica, including Master Data Management (MDM) or Informatica Data Management Cloud (IDMC), is preferred
Experience with data risk management or data control testing
Knowledge of financial regulatory frameworks (e.g., Basel, MiFID II, Solvency II, BCBS 239)
Certifications, such as Informatica, CDMP, or DCAM
Background in consulting or large-scale data transformation programs
Key Competencies:
Strategic and analytical thinking
Strong governance and control mindset
Excellent stakeholder and relationship management
Ability to drive organizational change and embed governance culture
Attention to detail with a pragmatic approach
Why Join Paradigm
At Paradigm, integrity drives innovation. You'll collaborate with curious, dedicated teammates, solving complex problems and unlocking immense data value for leading organizations. If you seek a place where your voice is heard, growth is supported, and your work creates lasting business value, you belong at Paradigm.
Learn more at ********************
Policy Disclosure:
Paradigm maintains a strict drug-free workplace policy. All offers of employment are contingent upon successfully passing a standard 5-panel drug screen. Please note that a positive test result for any prohibited substance, including marijuana, will result in disqualification from employment, regardless of state laws permitting its use. This policy applies consistently across all positions and locations.
Data Engineer
Data engineer job in Austin, TX
About the Role
We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.
What We're Looking For
8+ years designing and delivering scalable data pipelines in modern data platforms
Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
Ability to lead cross-functional initiatives in matrixed teams
Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
Hands-on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
Use Apache Airflow and similar tools for workflow automation and orchestration (a minimal DAG sketch follows this list)
Work with financial or regulated datasets while ensuring strong compliance and governance
Drive best practices in data quality, lineage, cataloging, and metadata management
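A minimal Apache Airflow DAG sketch for the orchestration responsibility above; the DAG id, schedule, and tasks are placeholders, and the Airflow 2.4+ API is assumed.

```python
# Placeholder Airflow DAG: two Python tasks run daily (assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source system")


def load():
    print("write curated data to the lakehouse")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```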
Primary Technical Skills
Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
Design efficient Delta Lake models for reliability and performance
Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables (see the DLT sketch after this list)
Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
Automate ingestion and workflows using Python and REST APIs
Support downstream analytics for BI, data science, and application workloads
Write optimized SQL/T-SQL queries, stored procedures, and curated datasets
Automate DevOps workflows, testing pipelines, and workspace configurations
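As a brief illustration of the Delta Live Tables item above, here is a minimal DLT sketch; it only runs inside a Databricks DLT pipeline (where `spark` is provided by the runtime), and the landing path, table names, and columns are made up.

```python
# Minimal Delta Live Tables sketch (executes only inside a Databricks DLT pipeline).
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw events incrementally ingested from a hypothetical landing path.")
def bronze_events():
    # Auto Loader (cloudFiles) picks up new JSON files as they arrive.
    return (
        spark.readStream.format("cloudFiles")  # `spark` is provided by the DLT runtime
        .option("cloudFiles.format", "json")
        .load("/Volumes/example_catalog/landing/events/")
    )


@dlt.table(comment="Cleaned events with typed timestamps and duplicates removed.")
def silver_events():
    return (
        dlt.read_stream("bronze_events")
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .dropDuplicates(["event_id"])
    )
```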
Additional Skills
Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
CI/CD: Azure DevOps
Orchestration: Apache Airflow (plus)
Streaming: Delta Live Tables
MDM: Profisee (nice-to-have)
Databases: SQL Server, Cosmos DB
Soft Skills
Strong analytical and problem-solving mindset
Excellent communication and cross-team collaboration
Detail-oriented with a high sense of ownership and accountability
Data Scientist (F2F Interview)
Data engineer job in Dallas, TX
W2 Contract
Dallas, TX (Onsite)
We are seeking an experienced Data Scientist to join our team in Dallas, Texas. The ideal candidate will have a strong foundation in machine learning, data modeling, and statistical analysis, with the ability to transform complex datasets into clear, actionable insights that drive business impact.
Key Responsibilities
Develop, implement, and optimize machine learning models to support business objectives.
Perform exploratory data analysis, feature engineering, and predictive modeling.
Translate analytical findings into meaningful recommendations for technical and non-technical stakeholders.
Collaborate with cross-functional teams to identify data-driven opportunities and improve decision-making.
Build scalable data pipelines and maintain robust analytical workflows.
Communicate insights through reports, dashboards, and data visualizations.
Qualifications
Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related field.
Proven experience working with machine learning algorithms and statistical modeling techniques.
Proficiency in Python or R, along with hands-on experience using libraries such as Pandas, NumPy, Scikit-learn, or TensorFlow.
Strong SQL skills and familiarity with relational or NoSQL databases.
Experience with data visualization tools (e.g., Tableau, Power BI, matplotlib).
Excellent problem-solving, communication, and collaboration skills.
Data Engineer
Data engineer job in Houston, TX
Job Title: Senior Software Engineer / Quant Developer (JG4 Level)
Duration: Long-term contract with possibility of extension
The Senior Data Engineer will design and build robust data foundations and end-to-end data solutions to enable the business to maximize value from data. This role plays a critical part in fostering a data-driven culture across both IT and business stakeholder communities. The Senior Data Engineer will act as a subject matter expert (SME), lead solution design and delivery, mentor junior engineers, and translate Data Strategy and Vision into scalable, high-quality IT solutions.
Key Responsibilities
Design, build, and maintain enterprise-grade data foundations and end-to-end data solutions.
Serve as a subject matter expert in data engineering, data modeling, and solution architecture.
Translate business data strategy and vision into scalable technical solutions.
Mentor and guide junior data engineers and contribute to continuous capability building.
Drive the rollout and adoption of Data Foundation initiatives across the business.
Coordinate change management, incident management, and problem management processes.
Present insights, reports, and technical findings to key stakeholders.
Drive implementation efficiency across pilots and future projects to reduce cost, accelerate delivery, and maximize business value.
Actively contribute to community initiatives such as Centers of Excellence (CoE) and Communities of Practice (CoP).
Collaborate effectively with both technical teams and business leaders.
Key Characteristics
Highly curious technology expert with a continuous learning mindset.
Strong data-domain expertise with deep technical focus.
Excellent communicator who can engage both technical and non-technical stakeholders.
Trusted advisor to leadership and cross-functional teams.
Strong driver of execution, quality, and delivery excellence.
Mandatory Skills
Cloud Platforms: AWS, Azure, SAP - Expert Level
ELT: Expert Level
Data Modeling: Expert Level
Data Integration & Ingestion
Data Manipulation & Processing
DevOps & Version Control: GitHub, GitHub Actions, Azure DevOps
Data & Analytics Tools: Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest
Optional / Nice-to-Have Skills
Experience leading projects or running a Scrum team.
Experience with BPC and Planning.
Exposure to external technical ecosystems.
Documentation using MkDocs.
Azure Data Engineer Sr
Data engineer job in Irving, TX
Minimum 7 years of relevant work experience in data engineering, with at least 2 years in data modeling.
Strong technical foundation in Python and SQL, and experience with cloud platforms (Azure).
Deep understanding of data engineering fundamentals, including database architecture and design, Extract, transform and load (ETL) processes, data lakes, data warehousing, and both batch and streaming technologies.
Experience with data orchestration tools (e.g., Airflow), data processing frameworks (e.g., Spark, Databricks), and data visualization tools (e.g., Tableau, Power BI).
Proven ability to lead a team of engineers, fostering a collaborative and high-performing environment.
Azure Data Engineer
Data engineer job in Irving, TX
Our client is seeking an Azure Data Engineer to join their team! This position is located in Irving, Texas. THIS ROLE REQUIRES AN ONSITE INTERVIEW IN IRVING, please only apply if you are local and available to interview onsite.
Duties:
Lead the design, architecture, and implementation of key data initiatives and platform capabilities
Optimize existing data workflows and systems to improve performance and cost-efficiency, identifying opportunities and guiding teams to implement solutions
Lead and mentor a team of 2-5 data engineers, providing guidance on technical best practices, career development, and initiative execution
Contribute to the development of data engineering standards, processes, and documentation, promoting consistency and maintainability across teams while enabling business stakeholders
Desired Skills/Experience:
Bachelor's degree or equivalent in Computer Science, Mathematics, Software Engineering, Management Information Systems, etc.
5+ years of relevant work experience in data engineering
Strong technical skills in SQL, PySpark/Python, Azure, and Databricks
Deep understanding of data engineering fundamentals, including database architecture and design, ETL, etc.
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position starts at $140,000-145,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
Data Analytics Engineer
Data engineer job in Houston, TX
Title: Data Analytics Engineer
Type: 6 Month Contract (Full-time is possible after contract period)
Schedule: Hybrid (3-4 days onsite)
Sector: Oil & Gas
Overview: You will be instrumental in developing and maintaining data models while delivering insightful analyses of maintenance operations, including uptime/downtime, work order metrics, and asset health.
Key Responsibilities:
Aggregate and transform raw data from systems such as CMMS, ERP, and SCADA into refined datasets and actionable reports/visualizations using tools like SQL, Python, Power BI, and/or Spotfire (a small illustration follows this list).
Own the creation and maintenance of dashboards for preventative and predictive maintenance.
Collaborate cross-functionally to identify data requirements, key performance indicators (KPIs), and reporting gaps.
Ensure high data quality through rigorous testing, validation, and documentation.
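As an illustration of turning raw work-order data into maintenance KPIs, here is a small Pandas sketch; the CSV layout and column names are assumptions about a generic CMMS export, not the client's actual systems.

```python
# Illustrative work-order KPI aggregation (column names are assumptions about a CMMS export).
import pandas as pd

work_orders = pd.read_csv("work_orders.csv", parse_dates=["opened_at", "closed_at"])

# Hours from work-order open to close.
work_orders["hours_to_close"] = (
    (work_orders["closed_at"] - work_orders["opened_at"]).dt.total_seconds() / 3600
)

# Monthly KPIs per asset: order volume, mean time to close, and total downtime.
kpis = (
    work_orders.groupby(["asset_id", pd.Grouper(key="opened_at", freq="MS")])
    .agg(
        orders=("work_order_id", "count"),
        mean_hours_to_close=("hours_to_close", "mean"),
        downtime_hours=("downtime_hours", "sum"),
    )
    .reset_index()
)

# The resulting table could feed a Power BI or Spotfire dashboard.
kpis.to_csv("maintenance_kpis.csv", index=False)
```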
Qualifications and Skills:
Bachelor's degree required.
Proficiency in Python and SQL is essential.
Knowledge of API rules and protocols.
Experience organizing development workflows using GitHub.
Familiarity with Machine Learning is a plus.
Preference for candidates with experience in water midstream/infrastructure or Oil & Gas sectors.
Expertise in dashboard creation using tools like Tableau, Spotfire, Excel, or Power BI.
Ability to clearly communicate technical concepts to non-technical stakeholders.
Strong organizational skills and a customer-service mindset.
Capability to work independently or collaboratively with minimal supervision.
Exceptional analytical and problem-solving skills, with a strategic approach to prioritization.
Ability to analyze data, situations, and processes to make informed decisions or resolve issues, with regular communication to management.
Excellent written and verbal communication skills.
On-premise Data Engineer (Python, SQL, Databases)
Data engineer job in Houston, TX
The interview process consists of 3 rounds:
Teams virtual tech interview with senior developers
Karat screening
In person interview with managers and directors
5+ years of experience in data engineering, SQL and NoSQL databases (Oracle, SQL Server, Postgres, DB2, Elasticsearch, MongoDB), and advanced Python skills.
Advanced application development experience implementing business logic using SQL procedures and NoSQL utilities
Experience with design and development of scalable and performant processes
Expert in Python development and FastAPI microservices (be prepared to discuss which IDEs and tools you used to code and test)
Development experience with real-time, user-interactive applications that communicate between the UI and the database (be prepared to discuss which protocols/data formats were used between the database and the UI)
Python Data Engineer
Data engineer job in Houston, TX
Job Title: Python Data Engineer
Experience & Skills
5+ years in Data Engineering with strong SQL and NoSQL database skills:
Databases: Oracle, SQL Server, Postgres, DB2, Elasticsearch, MongoDB
Advanced Python development and FastAPI microservices experience (a minimal sketch appears after this list)
Application development experience implementing business logic via SQL stored procedures and NoSQL utilities
Experience designing scalable and performant processes:
Must provide metrics: transactions/day, largest DB table size, concurrent users, API response times
Real-time interactive applications with UI-to-database communication:
Must explain protocols and data formats used (e.g., JSON, REST, WebSockets)
Experience using LLM models, coding agents, and testing agents:
Provide specific examples of problem-solving
Ability to handle support and development simultaneously:
Detail daily split between support and development, ticketing system usage, or direct user interaction
Bachelor's degree in Computer Science or relevant major
Strong analytic skills, AI tool usage, multitasking, self-management, and direct collaboration with business users
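As a minimal sketch of the FastAPI-plus-database pattern called out above, the example below exposes one JSON endpoint backed by SQLAlchemy; the connection string, table, and fields are hypothetical. It would typically be served with an ASGI server such as uvicorn (`uvicorn main:app`).

```python
# Minimal FastAPI service returning JSON from a SQL table (all identifiers are placeholders).
import sqlalchemy as sa
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()
engine = sa.create_engine("postgresql+psycopg2://user:password@localhost/exampledb")


class Order(BaseModel):
    order_id: int
    status: str


@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: int) -> Order:
    with engine.connect() as conn:
        row = conn.execute(
            sa.text("SELECT order_id, status FROM orders WHERE order_id = :oid"),
            {"oid": order_id},
        ).first()
    if row is None:
        raise HTTPException(status_code=404, detail="order not found")
    return Order(order_id=row.order_id, status=row.status)
```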
Not a Good Fit
Experience limited to ETL / backend processes / data transfer between databases
Experience only on cloud platforms (Azure, AWS, GCP) without SQL/NoSQL + Python expertise
Dexian stands at the forefront of Talent + Technology solutions with a presence spanning more than 70 locations worldwide and a team exceeding 10,000 professionals. As one of the largest technology and professional staffing companies and one of the largest minority-owned staffing companies in the United States, Dexian combines over 30 years of industry expertise with cutting-edge technologies to deliver comprehensive global services and support.
Dexian connects the right talent and the right technology with the right organizations to deliver trajectory-changing results that help everyone achieve their ambitions and goals. To learn more, please visit ********************
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Lead Data Engineer (Databricks, DLT (Delta Live Tables))
Data engineer job in Houston, TX
Relevant experience should be more than 8-9 years. Strong proficiency in Databricks, the DLT (Delta Live Tables) framework, and PySpark is required, along with excellent communication skills.
Thanks
Rakesh Pathak | Senior Technical Recruiter
Phone: ************
*************************| ***************
GCP Data Engineer
Data engineer job in Fort Worth, TX
Job Title: GCP Data Engineer
Employment Type: W2/CTH
Client: Direct
We are seeking a highly skilled Data Engineer with strong expertise in Python, SQL, and Google Cloud Platform (GCP) services. The ideal candidate will have 6-8 years of hands-on experience in building and maintaining scalable data pipelines, working with APIs, and leveraging GCP tools such as BigQuery, Cloud Composer, and Dataflow.
Core Responsibilities:
• Design, build, and maintain scalable data pipelines to support analytics and business operations.
• Develop and optimize ETL processes for structured and unstructured data.
• Work with BigQuery, Cloud Composer, and other GCP services to manage data workflows (see the example after this list).
• Collaborate with data analysts and business teams to ensure data availability and quality.
• Integrate data from multiple sources using APIs and custom scripts.
• Monitor and troubleshoot pipeline performance and reliability.
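To illustrate the BigQuery-centric pipeline work above, here is a small sketch using the google-cloud-bigquery client; the project, bucket, dataset, and column names are placeholders. In Cloud Composer the same steps would usually be wrapped in Airflow operators.

```python
# Illustrative load-and-verify step with the BigQuery client (all resource names are placeholders).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Append a newline-delimited JSON extract from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/extracts/orders_2024-01-01.json",
    "example-project.staging.orders",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,
    ),
)
load_job.result()  # block until the load completes

# Simple row-count check on today's load.
query = """
    SELECT COUNT(*) AS row_count
    FROM `example-project.staging.orders`
    WHERE DATE(ingested_at) = CURRENT_DATE()
"""
for row in client.query(query).result():
    print(f"rows loaded today: {row.row_count}")
```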
• Technical Skills:
o Strong proficiency in Python and SQL.
o Experience with data pipeline development and ETL frameworks.
• GCP Expertise:
o Hands-on experience with BigQuery, Cloud Composer, and Dataflow.
• Additional Requirements:
o Familiarity with workflow orchestration tools and cloud-based data architecture.
o Strong problem-solving and analytical skills.
o Excellent communication and collaboration abilities.
Data Engineer
Data engineer job in Temple, TX
SeAH Superalloy Technologies is building a world-class manufacturing facility in Temple, Texas, producing aerospace-grade nickel-based superalloys for investment casting and additive manufacturing. As part of SeAH Group's $150M U.S. greenfield investment, we're shaping the future of advanced manufacturing and establishing strong partnerships with industry leaders, suppliers, and communities.
Position Summary
We are seeking a highly skilled and proactive Data Engineer to lead and support the development and optimization of our analytics infrastructure. This role will focus on building scalable, secure, and maintainable data pipelines across enterprise systems like ERP, MES, SCADA, and WMS. The ideal candidate has a strong technical foundation in data engineering, exceptional problem-solving skills, and experience in both on-prem and cloud environments. This role will also involve the development of dashboards, visualization tools, and predictive analytics for use across operations, engineering, and executive leadership.
Key Responsibilities
Data Engineering & Pipeline Development:
Design, build, and maintain robust, fault-tolerant data pipelines and ingestion workflows.
Lead integration of key enterprise systems (ERP, MES, CMMS, SCADA, WMS).
Optimize pipelines for performance, scalability, and long-term maintainability.
Clean, transform, and augment raw industrial data to ensure accuracy and analytical value.
System Integration & API Management:
Develop and maintain RESTful API connectivity for cross-platform communication (a brief illustration follows this subsection).
Work with structured and semi-structured data formats (SQL, CSV, PLC logs, etc.).
Translate complex business requirements into scalable data architecture.
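As a rough sketch of the REST-to-database ingestion described in this subsection, the example below pulls JSON readings from a hypothetical endpoint and lands them in a SQL table; SQLite is used only to keep the example self-contained, and the endpoint and column names are assumptions.

```python
# Hypothetical ingestion step: pull equipment readings from a REST endpoint into a SQL table.
import pandas as pd
import requests
import sqlalchemy as sa

API_URL = "https://mes.example.internal/api/v1/readings"   # placeholder endpoint
engine = sa.create_engine("sqlite:///example_staging.db")   # stand-in for SQL Server/Postgres/etc.

resp = requests.get(API_URL, params={"since": "2024-01-01T00:00:00Z"}, timeout=60)
resp.raise_for_status()

# Flatten the JSON payload (assumed to be a list of records) into a tabular frame.
readings = pd.json_normalize(resp.json())

# Light cleanup before landing the data.
readings["recorded_at"] = pd.to_datetime(readings["recorded_at"], utc=True)
readings = readings.dropna(subset=["asset_id", "recorded_at"])

readings.to_sql("equipment_readings", engine, if_exists="append", index=False)
```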
Visualization & Reporting:
Create and maintain dashboards and reports using Power BI or similar tools.
Automate report generation for predictive analytics, anomaly detection, and performance insights.
Collaborate with stakeholders to customize visual outputs and provide decision-ready insights.
Data Collection, Governance & Security:
Implement ETL processes and ensure proper data governance protocols.
Conduct quality checks, monitor ingestion workflows, and enforce secure data handling practices.
Perform backups and manage version control for code and reports.
Collaboration & Agile Operations:
Participate in agile team meetings, code reviews, and sprint planning.
Support internal teams with technical troubleshooting and training.
Gather requirements directly from stakeholders to refine data strategies.
Qualifications
Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
5+ years of professional experience in data engineering, analytics, or a related technical role.
Strong experience with REST APIs, microservices, and data pipeline orchestration.
Proficient in SQL and scripting languages (Python, Bash, PowerShell).
Experience with data warehousing, ETL design, and industrial datasets.
Familiarity with on-prem and cloud environments.
Excellent analytical, communication, and problem-solving skills.
Preferred/Bonus Skills
Experience integrating data from PLCs or industrial protocols.
Familiarity with Power BI, MES, or CMMS tools.
Experience applying cybersecurity standards to data infrastructure.
Knowledge of manufacturing environments, especially in metals or high-spec industries.
Data Engineer
Data engineer job in Austin, TX
We are seeking a Data Engineer to join a dynamic Agile team and support the build and enhancement of a large-scale data integration hub. This role requires hands-on experience in data acquisition, ETL automation, SQL development, and performance analytics.
What You'll Do
✔ Lead technical work within Agile development teams
✔ Automate ETL processes using Informatica Power Center / IICS
✔ Develop complex Oracle/Snowflake SQL scripts & views
✔ Integrate data from multiple sources (Oracle, SQL Server, Excel, Access, PDF)
✔ Support CI/CD and deployment processes
✔ Produce technical documentation, diagrams & mockups
✔ Collaborate with architects, engineers & business stakeholders
✔ Participate in Sprint ceremonies & requirements sessions
✔ Ensure data quality, validation & accuracy
Must Have Experience
✅ 8+ years:
Informatica Power Center / IICS
ETL workflow development
SQL development (Oracle/Snowflake)
Data warehousing & analytics
Technical documentation (Visio/Erwin, MS Office, MS Project)
Data Engineer
Data engineer job in Dallas, TX
Junior Data Engineer
DESCRIPTION: BeaconFire is based in Central NJ, specializing in Software Development, Web Development, and Business Intelligence; we are looking for candidates who are good communicators and self-motivated. You will play a key role in building, maintaining, and operating integrations, reporting pipelines, and data transformation systems.
Qualifications:
Passion for data and a deep desire to learn.
Master's Degree in Computer Science/Information Technology, Data Analytics/Data Science, or related discipline.
Intermediate Python; experience in data processing (NumPy, Pandas, etc.) is a plus.
Experience with relational databases (SQL Server, Oracle, MySQL, etc.)
Strong written and verbal communication skills.
Ability to work both independently and as part of a team.
Responsibilities:
Collaborate with the analytics team to find reliable data solutions to meet the business needs.
Design and implement scalable ETL or ELT processes to support the business demand for data.
Perform data extraction, manipulation, and production from database tables.
Build utilities, user-defined functions, and frameworks to better enable data flow patterns.
Build and incorporate automated unit tests and participate in integration testing efforts (a small example follows this list).
Work with teams to resolve operational & performance issues.
Work with architecture/engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to.
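As a tiny example of the automated unit testing mentioned in the responsibilities above, the sketch below tests a made-up transformation function with pytest; the function and column names are illustrative only.

```python
# Illustrative pytest unit test for a small data transformation (names are made up).
import pandas as pd


def normalize_emails(df: pd.DataFrame) -> pd.DataFrame:
    """Lower-case and strip whitespace from the email column."""
    out = df.copy()
    out["email"] = out["email"].str.strip().str.lower()
    return out


def test_normalize_emails():
    raw = pd.DataFrame({"email": ["  Alice@Example.COM ", "bob@example.com"]})
    result = normalize_emails(raw)
    assert result["email"].tolist() == ["alice@example.com", "bob@example.com"]
```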
Compensation: $65,000.00 to $80,000.00 /year
BeaconFire is an e-verified company. Work visa sponsorship is available.
Python Data Engineer - THADC5693417
Data engineer job in Houston, TX
Must Haves:
Strong proficiency in Python; 5+ years' experience.
Expertise in FastAPI and microservices architecture and coding
Linking Python-based apps with SQL and NoSQL databases
Deployments on Docker and Kubernetes, plus monitoring tools
Experience with automated testing and test-driven development
Git source control, GitHub Actions, CI/CD, VS Code, and Copilot
Expertise in both on-prem SQL databases (Oracle, SQL Server, Postgres, DB2) and NoSQL databases
Working knowledge of data warehousing and ETL
Able to explain the business functionality of the projects/applications they have worked on
Ability to multitask and work on multiple projects simultaneously.
NO CLOUD - they are on-prem
Day to Day:
Insight Global is looking for a Python Data Engineer for one of our largest oil and gas clients in Downtown Houston, TX. This person will be responsible for building Python-based integrations between back-end SQL and NoSQL databases, architecting and coding FastAPI microservices, and performing testing on back-office applications. The ideal candidate will have experience developing applications using Python and microservices and implementing complex business functionality in Python.