Data engineer jobs in Santa Clarita, CA

- 1,669 jobs
  • Data Scientist

    Stand 8 Technology Consulting

    Data engineer job in Long Beach, CA

    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including internationally in Mexico and India. We are seeking a highly analytical and technically skilled Data Scientist to transform complex, multi-source data into unified, actionable insights used for executive reporting and decision-making. This role requires expertise in business intelligence design, data modeling, metadata management, data integrity validation, and the development of dashboards, reports, and analytics used across operational and strategic environments. The ideal candidate thrives in a fast-paced environment, demonstrates strong investigative skills, and collaborates effectively with technical teams, business stakeholders, and leadership.

    Essential Duties & Responsibilities
    • Participate across the full solution lifecycle: business case, planning, design, development, testing, migration, and production support.
    • Analyze large and complex datasets with accuracy and attention to detail.
    • Collaborate with users to develop effective metadata and data relationships.
    • Identify reporting and dashboard requirements across business units.
    • Determine strategic placement of business logic within ETL or metadata models.
    • Build enterprise data warehouse metadata/semantic models.
    • Design and develop unified dashboards, reports, and data extractions from multiple data sources.
    • Develop and execute testing methodologies for reports and metadata models.
    • Document BI architecture, data lineage, and project report requirements.
    • Provide technical specifications and data definitions to support the enterprise data dictionary.
    • Apply analytical skills and data science techniques to understand business processes, financial calculations, data flows, and application interactions.
    • Identify and implement improvements, workarounds, or alternative solutions related to ETL processes, ensuring integrity and timeliness.
    • Create UI components or portal elements (e.g., SharePoint) for dynamic or interactive stakeholder reporting.
    • Download and process SQL database information to build Power BI or Tableau reports (including cybersecurity awareness campaigns).
    • Utilize SQL, Python, R, or similar languages for data analysis and modeling.
    • Support process optimization through advanced modeling.

    Required Knowledge & Attributes
    • Highly self-motivated, with strong organizational skills and the ability to manage multiple verbal and written assignments.
    • Experience collaborating across organizational boundaries for data sourcing and usage.
    • Analytical understanding of business processes, forecasting, capacity planning, and data governance.
    • Proficient with BI tools (Power BI, Tableau, PBIRS, SSRS, SSAS).
    • Strong Microsoft Office skills (Word, Excel, Visio, PowerPoint).
    • High attention to detail and accuracy.
    • Ability to work independently, demonstrate ownership, and ensure high-quality outcomes.
    • Strong communication, interpersonal, and stakeholder engagement skills.
    • Deep understanding that data integrity and consistency are essential for adoption and trust.
    • Ability to shift priorities and adapt within fast-paced environments.

    Required Education & Experience
    • Bachelor's degree in Computer Science, Mathematics, or Statistics (or equivalent experience).
    • 3+ years of BI development experience.
    • 3+ years with Power BI and supporting Microsoft stack tools (SharePoint 2019, PBIRS/SSRS, Excel 2019/2021).
    • 3+ years of experience with SDLC/project lifecycle processes.
    • 3+ years of experience with data warehousing methodologies (ETL, data modeling).
    • 3+ years of VBA experience in Excel and Access.
    • Strong ability to write SQL queries and work with SQL Server 2017-2022.
    • Experience with BI tools, including PBIRS, SSRS, SSAS, and Tableau.
    • Strong analytical skills in business processes, financial modeling, forecasting, and data flows.
    • Critical thinking and problem-solving capabilities.
    • Experience producing high-quality technical documentation and presentations.
    • Excellent communication and presentation skills, with the ability to explain insights to leadership and business teams.

    Benefits
    • Medical coverage and Health Savings Account (HSA) through Anthem
    • Dental/Vision/various ancillary coverages through Unum
    • 401(k) retirement savings plan
    • Paid-time-off options
    • Company-paid Employee Assistance Program (EAP)
    • Discount programs through ADP WorkforceNow

    Additional Details
    The base range for this contract position is $73-$83 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.

    About Us
    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally, with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees. Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY. Check out more at ************** and reach out today to explore opportunities to grow together! By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
    $73-83 hourly 4d ago
  • Senior Data Engineer - Snowflake / ETL (Onsite)

    CGS Business Solutions 4.7company rating

    Data engineer job in Beverly Hills, CA

    CGS Business Solutions is committed to helping you, as an esteemed IT professional, find the next right step in your career. We match professionals like you to rewarding consulting or full-time opportunities in your area of expertise. We are currently seeking technical professionals searching for challenging and rewarding jobs for the following opportunity:

    Summary
    CGS is hiring a Senior Data Engineer to serve as a core member of the Platform team. This is a high-impact role responsible for advancing our foundational data infrastructure. Your primary mission will be to build key components of our Policy Journal, the central source of truth for all policy, commission, and client accounting data. You'll work closely with the Lead Data Engineer and business stakeholders to translate complex requirements into scalable data models and reliable pipelines that power analytics and operational decision-making for agents, managers, and leadership. This role blends greenfield engineering, strategic modernization, and a strong focus on delivering trusted, high-quality data products.

    Overview
    • Build the Policy Journal: design and implement the master data architecture unifying policy, commission, and accounting data from sources like IVANS and Applied EPIC to create the platform's "gold record."
    • Ensure Data Reliability: define and implement data quality checks, monitoring, and alerting to guarantee accuracy, consistency, and timeliness across pipelines, while contributing to best practices in governance.
    • Build the Analytics Foundation: enhance and scale our analytics stack (Snowflake, dbt, Airflow), transforming raw data into clean, performant dimensional models for BI and operational insights.
    • Modernize Legacy ETL: refactor our existing Java + SQL (PostgreSQL) ETL system; diagnose duplication and performance issues, rewrite critical components in Python, and migrate orchestration to Airflow.
    • Implement Data Quality Frameworks: develop automated testing and validation frameworks aligned with our QA strategy to ensure accuracy, completeness, and integrity across pipelines.
    • Collaborate on Architecture & Design: partner with product and business stakeholders to deeply understand requirements and design scalable, maintainable data solutions.

    Ideal Experience
    • 5+ years of experience building and operating production-grade data pipelines.
    • Expert-level proficiency in Python and SQL.
    • Hands-on experience with the modern data stack (Snowflake/Redshift, Airflow, dbt, etc.).
    • Strong understanding of AWS data services (S3, Glue, Lambda, RDS).
    • Experience working with insurance or insurtech data (policies, commissions, claims, etc.).
    • Proven ability to design robust data models (e.g., dimensional modeling) for analytics.
    • Pragmatic problem-solver capable of analyzing and refactoring complex legacy systems (ability to read Java/Hibernate is a strong plus, but no new Java coding is required).
    • Excellent communicator comfortable working with both technical and non-technical stakeholders.

    Huge Plus!
    • Direct experience with Agency Management Systems (Applied EPIC, NowCerts, EZLynx, etc.)
    • Familiarity with carrier data formats (ACORD XML, IVANS AL3)
    • Experience with BI tools (Tableau, Looker, Power BI)

    About CGS Business Solutions:
    CGS specializes in IT business solutions, staffing, and consulting services, with a strong focus on IT Applications, Network Infrastructure, Information Security, and Engineering. CGS is an INC 5000 company and is honored to be selected as one of the Best IT Recruitment Firms in California. After five consecutive Fastest Growing Company titles, CGS continues to break into new markets across the USA. Companies are counting on CGS to attract and help retain these resource pools in order to gain a competitive advantage in rapidly changing business environments.
    $99k-136k yearly est. 3d ago
  • Principal Data Scientist

    Hiretalent-Staffing & Recruiting Firm

    Data engineer job in Alhambra, CA

    The Principal Data Scientist works to establish a comprehensive Data Science Program to advance data-driven decision-making, streamline operations, and fully leverage modern platforms, including Databricks or similar, to meet increasing demand for predictive analytics and AI solutions. The Principal Data Scientist will guide program development, provide training and mentorship to junior members of the team, accelerate adoption of advanced analytics, and build internal capacity through structured mentorship.

    The Principal Data Scientist will possess:
    • Exceptional communication abilities, both verbal and written, with a strong customer service mindset and the ability to translate complex concepts into clear, actionable insights.
    • Strong analytical and business acumen, including foundational experience with regression, association analysis, outlier detection, and core data analysis principles.
    • Working knowledge of database design and organization, with the ability to partner effectively with Data Management and Data Engineering teams.
    • Outstanding time management and organizational skills, with demonstrated success managing multiple priorities and deliverables in parallel.
    • A highly collaborative work style, coupled with the ability to operate independently, maintain focus, and drive projects forward with minimal oversight.
    • A meticulous approach to quality, ensuring accuracy, reliability, and consistency in all deliverables.
    • Proven mentorship capabilities, including the ability to guide, coach, and upskill junior data scientists and analysts.

    Required Experience
    • 5+ years of professional experience leading data science initiatives, including developing machine learning models, statistical analyses, and end-to-end data science workflows in production environments.
    • 3+ years of experience working with Databricks and similar cloud-based analytics platforms, including notebook development, feature engineering, ML model training, and workflow orchestration.
    • 3+ years of experience applying advanced analytics and predictive modeling (e.g., regression, classification, clustering, forecasting, natural language processing).
    • 2+ years of experience implementing MLOps practices, such as model versioning, CI/CD for ML, MLflow, automated pipelines, and model performance monitoring.
    • 2+ years of experience collaborating with data engineering teams to design data pipelines, optimize data transformations, and implement Lakehouse or data warehouse architectures (e.g., Databricks, Snowflake, SQL-based platforms).
    • 2+ years of experience mentoring or supervising junior data scientists or analysts, including code reviews, training, and structured skill development.
    • 2+ years of experience with Python and SQL programming, using data sources such as SQL Server, Oracle, PostgreSQL, or similar relational databases.
    • 1+ year of experience operationalizing analytics within enterprise governance frameworks, partnering with Data Management, Security, and IT to ensure compliance, reproducibility, and best practices.

    Education
    This classification requires possession of a Master's degree or higher in Data Science, Statistics, Computer Science, or a closely related field. Additional qualifying professional experience may be substituted for the required education on a year-for-year basis.

    At least one of the following industry-recognized certifications in data science or cloud analytics is required and may not be substituted with additional experience:
    • Microsoft Azure Data Scientist Associate (DP-100)
    • Databricks Certified Data Scientist or Machine Learning Professional
    • AWS Machine Learning Specialty
    • Google Professional Data Engineer
    • or equivalent advanced analytics certifications
    $97k-141k yearly est. 1d ago
  • Senior Data Engineer

    Robert Half 4.5company rating

    Data engineer job in Los Angeles, CA

    Robert Half is partnering with a well-known brand seeking an experienced Data Engineer with Databricks experience. Working alongside data scientists and software developers, your work will directly impact dynamic pricing strategies by ensuring the availability, accuracy, and scalability of data systems. This position is full time with full benefits and 3 days onsite in the Woodland Hills, CA area.

    Responsibilities:
    • Design, build, and maintain scalable data pipelines for dynamic pricing models.
    • Collaborate with data scientists to prepare data for model training, validation, and deployment.
    • Develop and optimize ETL processes to ensure data quality and reliability.
    • Monitor and troubleshoot data workflows for continuous integration and performance.
    • Partner with software engineers to embed data solutions into product architecture.
    • Ensure compliance with data governance, privacy, and security standards.
    • Translate stakeholder requirements into technical specifications.
    • Document processes and contribute to data engineering best practices.

    Requirements:
    • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
    • 4+ years of experience in data engineering, data warehousing, and big data technologies.
    • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
    • Must have experience in Databricks.
    • Experience working within an Azure, AWS, or GCP environment.
    • Familiarity with big data tools like Spark, Hadoop, or Databricks.
    • Experience with real-time data pipeline tools.
    • Experience with Python.
    $116k-165k yearly est. 2d ago
  • Senior Data Engineer

    Akube

    Data engineer job in Glendale, CA

    City: Glendale, CA
    Onsite/Hybrid/Remote: Hybrid (3 days a week onsite, Friday remote)
    Duration: 12 months
    Rate Range: Up to $85/hr on W2, depending on experience (no C2C, 1099, or sub-contract)
    Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B

    Must Have:
    • 5+ years Data Engineering
    • Airflow
    • Spark DataFrame API
    • Databricks
    • SQL
    • API integration
    • AWS
    • Python, Java, or Scala

    Responsibilities:
    • Maintain, update, and expand Core Data platform pipelines.
    • Build tools for data discovery, lineage, governance, and privacy.
    • Partner with engineering and cross-functional teams to deliver scalable solutions.
    • Use Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS to build and optimize workflows.
    • Support platform standards, best practices, and documentation.
    • Ensure data quality, reliability, and SLA adherence across datasets.
    • Participate in Agile ceremonies and continuous process improvement.
    • Work with internal customers to understand needs and prioritize enhancements.
    • Maintain detailed documentation that supports governance and quality.

    Qualifications:
    • 5+ years in data engineering with large-scale pipelines.
    • Strong SQL and one major programming language (Python, Java, or Scala).
    • Production experience with Spark and Databricks.
    • Experience ingesting and interacting with API data sources.
    • Hands-on Airflow orchestration experience.
    • Experience developing APIs with GraphQL.
    • Strong AWS knowledge and infrastructure-as-code familiarity.
    • Understanding of OLTP vs. OLAP, data modeling, and data warehousing.
    • Strong problem-solving and algorithmic skills.
    • Clear written and verbal communication.
    • Agile/Scrum experience.
    • Bachelor's degree in a STEM field or equivalent industry experience.
    $85 hourly 4d ago
  • Lead Data Engineer - (Automotive exp)

    Intelliswift-An LTTS Company

    Data engineer job in Torrance, CA

    Role: Sr. Technical Lead
    Duration: 12+ month contract

    Daily Tasks Performed:
    • Lead the design, development, and deployment of a scalable, secure, and high-performance CDP SaaS product.
    • Architect solutions that integrate with various data sources, APIs, and third-party platforms.
    • Design, develop, and optimize complex SQL queries for data extraction, transformation, and analysis.
    • Build and maintain workflow pipelines using Digdag, integrating with data platforms such as Treasure Data, AWS, or other cloud services.
    • Automate ETL processes and schedule tasks using Digdag's YAML-based workflow definitions.
    • Implement data quality checks, logging, and alerting mechanisms within workflows.
    • Leverage AWS services (e.g., S3, Lambda, Athena) where applicable to enhance data processing and storage capabilities.
    • Ensure best practices in software engineering, including code reviews, testing, CI/CD, and documentation.
    • Oversee data privacy, security, and compliance initiatives (e.g., GDPR, CCPA).
    • Ensure adherence to security, compliance, and data governance requirements.
    • Oversee development of real-time and batch data processing systems.
    • Collaborate with cross-functional teams, including data analysts, product managers, and software engineers, to translate business requirements into technical solutions.
    • Collaborate with stakeholders to define technical requirements, align technical solutions with business goals, and deliver product features.
    • Mentor and guide developers, fostering a culture of technical excellence and continuous improvement.
    • Troubleshoot complex technical issues and provide hands-on support as needed.
    • Monitor, troubleshoot, and improve data workflows for performance, reliability, and cost efficiency.
    • Optimize system performance, scalability, and cost efficiency.

    What this person will be working on:
    As the Senior Technical Lead for our Customer Data Platform (CDP), the candidate will define the technical strategy, architecture, and execution of the platform. They will lead the design and delivery of scalable, secure, and high-performing solutions that enable unified customer data management, advanced analytics, and personalized experiences. This role demands deep technical expertise, strong leadership, and a solid understanding of data platforms and modern cloud technologies. It is a pivotal position that supports the CDP vision by mentoring team members and delivering solutions that empower our customers to unify, analyze, and activate their data.

    Position Success Criteria (Desired) - 'WANTS'
    • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
    • 8+ years of software development experience, with at least 3+ years in a technical leadership role.
    • Proven experience building and scaling SaaS products, preferably in customer data, marketing technology, or analytics domains.
    • Extensive hands-on experience with Presto, Hive, and Python.
    • Strong proficiency in writing complex SQL queries for data extraction, transformation, and analysis.
    • Familiarity with AWS data services such as S3, Athena, Glue, and Lambda.
    • Deep understanding of data modeling, ETL pipelines, workflow orchestration, and both real-time and batch data processing.
    • Experience ensuring data privacy, security, and compliance in SaaS environments.
    • Knowledge of Customer Data Platforms (CDPs), CDP concepts, and integration with CRM, marketing, and analytics tools.
    • Excellent communication, leadership, and project management skills.
    • Experience working with Agile methodologies and DevOps practices.
    • Ability to thrive in a fast-paced, agile environment.
    • Collaborative mindset with a proactive approach to problem-solving.
    • Stays current with industry trends and emerging technologies relevant to SaaS and customer data platforms.
    $100k-141k yearly est. 2d ago
  • Lead Data Scientist

    TPI Global Solutions 4.6company rating

    Data engineer job in Alhambra, CA

    Role: Principal Data Scientist
    Duration: 12+ month contract

    The Principal Data Scientist works to establish a comprehensive Data Science Program to advance data-driven decision-making, streamline operations, and fully leverage modern platforms, including Databricks or similar, to meet increasing demand for predictive analytics and AI solutions. The Principal Data Scientist will guide program development, provide training and mentorship to junior members of the team, accelerate adoption of advanced analytics, and build internal capacity through structured mentorship.

    The Principal Data Scientist will possess:
    • Exceptional communication abilities, both verbal and written, with a strong customer service mindset and the ability to translate complex concepts into clear, actionable insights.
    • Strong analytical and business acumen, including foundational experience with regression, association analysis, outlier detection, and core data analysis principles.
    • Working knowledge of database design and organization, with the ability to partner effectively with Data Management and Data Engineering teams.
    • Outstanding time management and organizational skills, with demonstrated success managing multiple priorities and deliverables in parallel.
    • A highly collaborative work style, coupled with the ability to operate independently, maintain focus, and drive projects forward with minimal oversight.
    • A meticulous approach to quality, ensuring accuracy, reliability, and consistency in all deliverables.
    • Proven mentorship capabilities, including the ability to guide, coach, and upskill junior data scientists and analysts.

    Required Experience
    • Five (5)+ years of professional experience leading data science initiatives, including developing machine learning models, statistical analyses, and end-to-end data science workflows in production environments.
    • Three (3)+ years of experience working with Databricks and similar cloud-based analytics platforms, including notebook development, feature engineering, ML model training, and workflow orchestration.
    • Three (3)+ years of experience applying advanced analytics and predictive modeling (e.g., regression, classification, clustering, forecasting, natural language processing).
    • Two (2)+ years of experience implementing MLOps practices, such as model versioning, CI/CD for ML, MLflow, automated pipelines, and model performance monitoring.
    • Two (2)+ years of experience collaborating with data engineering teams to design data pipelines, optimize data transformations, and implement Lakehouse or data warehouse architectures (e.g., Databricks, Snowflake, SQL-based platforms).
    • Two (2)+ years of experience mentoring or supervising junior data scientists or analysts, including code reviews, training, and structured skill development.
    • Two (2)+ years of experience with Python and SQL programming, using data sources such as SQL Server, Oracle, PostgreSQL, or similar relational databases.
    • One (1)+ year of experience operationalizing analytics within enterprise governance frameworks, partnering with Data Management, Security, and IT to ensure compliance, reproducibility, and best practices.

    Education
    This classification requires possession of a Master's degree or higher in Data Science, Statistics, Computer Science, or a closely related field. Additional qualifying professional experience may be substituted for the required education on a year-for-year basis.

    At least one of the following industry-recognized certifications in data science or cloud analytics is required and may not be substituted with additional experience:
    • Microsoft Azure Data Scientist Associate (DP-100)
    • Databricks Certified Data Scientist or Machine Learning Professional
    • AWS Machine Learning Specialty
    • Google Professional Data Engineer
    • or equivalent advanced analytics certifications

    Additional Information
    • California resident candidates only.
    • This position is HYBRID (2 days onsite, 2 days telework).
    • Interviews will be conducted via Microsoft Teams.
    • The work schedule follows a 4/40 (10-hour days, Monday-Thursday), with the specific shift determined by the program manager. Shifts may range between 7:15 a.m. and 6:00 p.m.
    $90k-125k yearly est. 2d ago
  • Data Engineer (AWS Redshift, BI, Python, ETL)

    Prosum 4.4company rating

    Data engineer job in Manhattan Beach, CA

    We are seeking a skilled Data Engineer with strong experience in business intelligence (BI) and data warehouse development to join our team. In this role, you will design, build, and optimize data pipelines and warehouse architectures that support analytics, reporting, and data-driven decision-making. You will work closely with analysts, data scientists, and business stakeholders to ensure reliable, scalable, and high-quality data solutions.

    Responsibilities:
    • Develop and maintain ETL/ELT pipelines for ingesting, transforming, and delivering data.
    • Design and enhance data warehouse models (star/snowflake schemas) and BI datasets.
    • Optimize data workflows for performance, scalability, and reliability.
    • Collaborate with BI teams to support dashboards, reporting, and analytics needs.
    • Ensure data quality, governance, and documentation across all solutions.

    Qualifications:
    • Proven experience with data engineering tools (SQL, Python, ETL frameworks).
    • Strong understanding of BI concepts, reporting tools, and dimensional modeling.
    • Hands-on experience with cloud data platforms (e.g., AWS, Azure, GCP) is a plus.
    • Excellent problem-solving skills and ability to work in a cross-functional environment.
    $99k-139k yearly est. 3d ago
  • Senior Data Architect - SoCal only (No C2C)

    JSG (Johnson Service Group, Inc.)

    Data engineer job in Calabasas, CA

    JSG is seeking a Senior Data Solutions Architect for a client in Woodland Hills, CA. This position is remote, and our client is looking for local candidates based in Southern California. The Senior Data Solutions Engineer will design, scale, and optimize the company's enterprise data platform. This role will build and maintain cloud-native data pipelines, lakehouse/warehouse architectures, and multi-system integrations that support Finance, CRM, Operations, Marketing, and guest experience analytics. The engineer will focus on building secure, scalable, and cost-efficient systems while applying modern DevOps, ETL, and cloud engineering practices, and requires a strong technologist with hands-on expertise across data pipelines, orchestration, governance, and cloud infrastructure.

    Key Responsibilities
    • Design, build, and maintain ELT/ETL pipelines across Snowflake, Databricks, Microsoft Fabric Gen 2, Azure Synapse Analytics, and legacy SQL/Oracle platforms.
    • Implement medallion/lakehouse architecture, CDC pipelines, and streaming ingestion frameworks.
    • Leverage Python (90%) and SQL (10%) for data processing, orchestration, and automation.
    • Manage AWS and Azure multi-account environments, enforcing MFA, IAM policies, and governance.
    • Build serverless architectures (AWS Lambda, Azure Functions, EventBridge, SQS, Step Functions) for event-driven data flows.
    • Integrate infrastructure with CI/CD pipelines (GitHub Actions, Azure DevOps, MWAA/Airflow, dbt) for automated testing and deployments.
    • Deploy infrastructure as code using Terraform and Azure DevOps for reproducible, version-controlled environments.
    • Implement observability and monitoring frameworks (Datadog, Prometheus, Grafana, Kibana, Azure Monitor) to ensure system reliability, performance, and cost efficiency.
    • Collaborate with stakeholders to deliver secure, scalable, and cost-efficient data solutions.

    A background in finance or consumer-facing industries is preferred.
    Salary: $160K-$175K
    JSG offers medical, dental, vision, life insurance options, short-term disability, 401(k), weekly pay, and more. Johnson Service Group (JSG) is an Equal Opportunity Employer. JSG provides equal employment opportunities to all applicants and employees without regard to race, color, religion, sex, age, sexual orientation, gender identity, national origin, disability, marital status, protected veteran status, or any other characteristic protected by law.
    $160k-175k yearly 3d ago
  • Senior Data Architect

    Ispace, Inc.

    Data engineer job in Torrance, CA

    Title: Senior Data Architect
    Duration: 12 months
    Pay rate: $90 per hour on W2

    Daily Tasks Performed:
    • Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards.
    • Manages senior business stakeholders to secure strong engagement and ensures that the delivery of the project aligns with longer-term strategic roadmaps.
    • Leads and participates in the peer review and quality assurance of project architectural artifacts.
    • Defines and manages standards, guidelines, and processes to ensure data quality.
    • Works with IT teams, business analysts, and data analytics teams to understand data consumers' needs and develop solutions.
    • Evaluates and recommends emerging technologies for data management, storage, and analytics.
    • Establishes and maintains governance frameworks for team and vendor partners to ensure the effectiveness of architecture services.

    What this person will be working on:
    • Understand data confidentiality, security, and compliance needs, and apply data protection rules, including data sharing, filtering, and fencing at the storage, compute, and consumption layers.
    • Design data protection solutions at the database, table, column, and API level based on enterprise-standard data security, privacy, and architecture principles and reference architectures.
    • Design the structure and layout of data systems, including databases, warehouses, and lakes.
    • Select and implement database management systems that meet the organization's needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures.
    • Deliver exceptional business value by enhancing data pipeline performance, ensuring timely orchestration, and upholding data governance.

    Position Success Criteria (Desired) - 'WANTS'
    • A bachelor's degree in computer science, data science, engineering, or a related field.
    • At least 10 years of relevant experience in the design and implementation of data models (Erwin) for enterprise data warehouse initiatives.
    • Experience leading projects involving cloud data lakes, data warehousing, data modeling, and data analysis.
    • Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS), real-time data distribution (Kinesis, Kafka, Dataflow), and modern data warehouse tools (Redshift).
    • Experience with various database platforms, including DB2, MS SQL Server, PostgreSQL, MongoDB, etc.
    • Understanding of entity-relationship modeling, metadata systems, and data security and quality tools and techniques.
    • Ability to design traditional relational, analytics, data lake, and lakehouse architectures based on business needs.
    • Experience with business intelligence tools and technologies such as Informatica, Power BI, and Tableau.
    • Exceptional communication and presentation skills.
    • Strong analytical and problem-solving skills.
    • Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders.
    • Ability to guide solution design and architecture to meet business needs.

    If you're interested in the above role, please send your updated resume to *******************************.
    $90 hourly 1d ago
  • Software Engineer (Java/TypeScript/Kotlin)

    Optomi

    Data engineer job in Burbank, CA

    Optomi in partnership with one of our top clients is seeking a highly skilled Software Engineer with strong experience in building applications and shared services, REST APIs, and cloud-native solutions. In this role, you will contribute to the development of the Studio's media platforms and B2B applications that support content fulfillment across the Studio's global supply chain. The ideal candidate will bring strong AWS expertise, proficiency in modern programming languages, and the ability to work cross-functionally within a collaborative engineering environment. What the Right Candidate Will Enjoy! Contributing to high-visibility media platforms and content supply chain applications. Building scalable, reusable B2B REST APIs used across multiple business units. Hands-on development with TypeScript, Java, Kotlin, or JavaScript. Working extensively with AWS serverless tools, including Lambda and API Gateway. Solving complex engineering challenges involving identity and access management. Participating in a structured, multi-stage interview process that values both technical and collaborative skills. Collaborating with engineers, product owners, security teams, and infrastructure partners. Delivering features in an Agile environment with opportunities for continuous learning. Expanding skillsets across cloud services, API design, and distributed systems. Experience of the Right Candidate: 3+ years of industry experience in software engineering. STEM degree. Strong focus on application development and shared services. Extensive experience with AWS tools and technologies, especially serverless computing and API Gateway. Strong proficiency in TypeScript, Java, Kotlin, or JavaScript (TypeScript/Java preferred). Solid understanding of REST API design principles and software engineering best practices. Strong communication and problem-solving skills. Ability to collaborate effectively within cross-functional teams. Experience with databases and identity & access management concepts a plus. Comfortable participating in coding assessments and system design interviews. Responsibilities of the Right Candidate: Collaborate on the design, development, and deployment of scalable, high-quality software solutions. Build, enhance, and maintain API-driven shared services for Studio media platforms. Leverage AWS tools and serverless technologies to architect reliable, cloud-native applications. Partner closely with product owners, security teams, and other engineering groups to deliver on requirements. Participate in Agile ceremonies, estimating work, prioritizing tasks, and delivering iteratively. Apply and uphold best practices in coding standards, architecture, and system reliability. Contribute to identity and access management services and reusable B2B REST APIs. Conduct testing and ensure high-quality deployments across the platform. Actively stay current with emerging technologies, industry trends, and engineering best practices. Support continuous improvement efforts for development processes, tooling, and automation.
    $105k-146k yearly est. 3d ago
  • Azure Cloud Engineer (Jr/Mid) - (Locals only)

    Maxonic Inc.

    Data engineer job in Los Angeles, CA

    Job Title: Cloud Team Charter Job Type: Contract to Hire Work Schedule: Hybrid (3 days onsite, 2 days remote) Rate: $60/hr, based on experience. Responsibilities: Cloud Team Charter/Scope - 2 resources (1 Sr and 1 Mid/Jr). Operate and maintain Cloud Foundation Services, such as: Azure Policies, Backup Engineering and Enforcement, Logging Standards and Enforcement, Antivirus and Malware Enforcement, Azure service/resource lifecycle management (including retirement of resources), and Tagging enforcement. Infrastructure Security: Ownership of Defender reporting as it relates to Infrastructure. Collaboration with Cyber Security and App teams to generate necessary reports for Infrastructure security review. Actively monitoring and remediating infrastructure vulnerabilities with the App Team. Coordinate with the App team to address infrastructure vulnerabilities. Drive continuous improvement in Cloud Security by tracking/maintaining infrastructure vulnerabilities through Azure Security Center. Cloud Support: PaaS DB support. Support for Cloud Networking (L2), working with the Network team as needed. Developer support in the Cloud. Support for the CMDB team to track Cloud assets. L4 Cloud support for the enterprise. About Maxonic: Since 2002, Maxonic has been at the forefront of connecting candidate strengths to client challenges. Our award-winning, dedicated team of recruiting professionals is specialized by technology, listens well, and will seek to find a position that meets the long-term career needs of our candidates. We take pride in the over 10,000 candidates that we have placed and the repeat business that we earn from our satisfied clients. Interested in Applying? Please apply with your most current resume. Feel free to contact Jhankar Chanda (******************* / ************) for more details.
    $60 hourly 5d ago
  • Thermal Engineer

    BCforward

    Data engineer job in Los Angeles, CA

    BCforward is seeking a highly motivated and experienced Thermal Engineer to join our dynamic team in Los Angeles, CA. Thermal Engineer Work Arrangement: 100% Onsite. Duration: 3+ months (possible extension). Basic qualifications: Bachelor's degree in mechanical engineering or a related discipline, or equivalent experience. 3+ years of experience with Thermal Desktop (SINDA) and/or Ansys thermal analysis tools (Icepak, Fluent, Mechanical, etc.). 2+ years of experience with test planning, test set-up (thermocouple and heater installation), operating DAQs and power supplies, results correlation, and system verification for production. Experience in documentation and writing test reports. Preferred qualifications: Experience with avionics thermal design and analysis. CAD skills (NX or SolidWorks). Experience with interpreting and correlating test data to thermal models. Specific tasks that this individual will support are as follows: Provide design inputs to the mechanical and electrical teams. Complete thermal modeling analysis using Ansys analysis tools. Develop comprehensive thermal analysis documentation. Develop thermal testing and qualification plans. Conduct thermal testing and model validation. Optical Communications Terminal v2 (OCT v2) (0.5 FTE - 12 months): Perform trade studies to select the architecture of the electronic enclosures. Preliminary thermal analysis of printed circuit board assemblies. Collaborate with the design team to develop preliminary designs. Define and execute development testing.
    $93k-126k yearly est. 2d ago
  • Sr. Software Engineer (NO H1B OR C2C) - Major Entertainment Company

    Techlink Resources, Inc.

    Data engineer job in Los Angeles, CA

    Senior Software Engineer - Ad Platform Machine Learning We're looking for a Senior Software Engineer to join our Ad Platform Decisioning & Machine Learning Platform team. Our mission is to power the Company's advertising ecosystem with advanced machine learning, AI-driven decisioning, and high-performance backend systems. We build end-to-end solutions that span machine learning, large-scale data processing, experimentation platforms, and microservices-all to improve ad relevance, performance, and efficiency. If you're passionate about ML technologies, backend engineering, and solving complex problems in a fast-moving environment, this is an exciting opportunity to make a direct impact on next-generation ad decisioning systems. What You'll Do Build next-generation experimentation platforms for ad decisioning and large-scale A/B testing Develop simulation platforms that apply state-of-the-art ML and optimization techniques to improve ad performance Design and implement scalable approaches for large-scale data analysis Work closely with researchers to productize cutting-edge ML innovations Architect distributed systems with a focus on performance, scalability, and flexibility Champion engineering best practices including CI/CD, design patterns, automated testing, and strong code quality Contribute to all phases of the software lifecycle-design, experimentation, implementation, and testing Partner with product managers, program managers, SDETs, and researchers in a collaborative and innovative environment Basic Qualifications 4+ years of professional programming and software design experience (Java, Python, Scala, etc.) 
Experience building highly available, scalable microservices. Strong understanding of system architecture and application design. Knowledge of big data technologies and large-scale data processing. Passion for understanding the ad business and driving innovation. Enthusiastic about technology and comfortable working across disciplines. Preferred Qualifications: Domain knowledge in digital advertising. Familiarity with AI/ML technologies and common ML tech stacks. Experience with big data and workflow tools such as Airflow or Databricks. Education: Bachelor's degree plus 5+ years of relevant industry experience. Role Scope: You'll support ongoing initiatives across the ad platform, including building new experimentation and simulation systems used for online A/B testing. Media industry experience is not required. Technical Environment: Java & Spring Boot for backend microservices. AWS as the primary cloud environment. Python & Scala for data pipelines running on Spark and Airflow. Candidates should be strong in either backend microservices or data pipeline development and open to learning the other. API development experience is required. Interview Process: Round 1: Technical & coding evaluation (1 hour). Round 2: Technical + behavioral interview (1 hour). Candidates are assessed on technical strength and eagerness to learn.
    $113k-148k yearly est. 1d ago
  • Senior Software Engineer

    PTR Global

    Data engineer job in Culver City, CA

    We're looking for highly capable Senior Software Engineers to join a fast-paced, start-up-like environment within an exciting well-known brand. You'll be instrumental in building modern web applications using React and Node.js, with a strong focus on delivering scalable, high-performance APIs and robust data management via relational databases. This role is perfect for someone who thrives in ownership-driven environments, collaborates well with cross-functional teams, and can deliver high-quality code under tight timelines. Key Responsibilities: Design, build, and maintain modern single-page web applications using React (JavaScript/TypeScript) Develop backend APIs and services using Node.js with TypeScript, JavaScript, and Java Collaborate with team members, product managers, and designers to deliver new features and improvements Integrate with and manage relational databases to ensure data accuracy, performance, and scalability Write clean, maintainable code and contribute to peer reviews and documentation Participate in agile processes including sprint planning, standups, and retrospectives Troubleshoot, debug, and optimize application performance Communicate effectively across technical and non-technical teams Technical Skills: Required Proficient in TypeScript and JavaScript Strong hands-on experience building single-page applications, ideally React Strong understanding of relational databases such as PostgreSQL or MySQL Familiarity with Git and collaborative version control workflows Experience with Agile development tools (e.g., Jira, Confluence) Nice to Have Familiarity with cloud services, CI/CD, or containerization Working knowledge of Java for backend development Experience building and maintaining Node.js backend APIs Soft Skills Required: Self-motivated and able to work independently as well as collaboratively Strong problem-solving and debugging skills Excellent verbal and written communication Willingness to learn quickly and adapt to new 
technologies. Detail-oriented with a passion for clean, readable, and efficient code. Comfortable working in fast-paced, evolving environments. Resourceful. Education Required: B.S. in Computer Science.
    $109k-150k yearly est. 5d ago
  • Software Engineer - Python

    S3 Connections LLC

    Data engineer job in Pasadena, CA

    Role: Python Developer Skills: Python, PySpark, Databricks. Key Responsibilities: Design, develop, and maintain backend services and high-performance data pipelines using Python and PySpark. Architect and implement scalable APIs and microservices to support business-critical applications and integrations. Design and optimize PostgreSQL data models and queries for performance and reliability. Collaborate with cross-functional teams, including data engineers, architects, and DevOps, to ensure system robustness and scalability. Lead technical design discussions, mentor junior engineers, and enforce best practices in backend development. Participate in performance tuning, debugging, and production support. Integrate with external systems such as OpenText (experience helpful but not required). Required Skills and Experience: 7+ years of professional software development experience, with a focus on backend systems. Expert-level proficiency in Python and PySpark for backend and data processing workloads. Experience with Databricks. Strong understanding of backend architecture, distributed systems, and API design. Experience with PostgreSQL or other relational databases. Familiarity with cloud-based environments (AWS, Azure, or GCP) and CI/CD pipelines. Strong problem-solving skills and ability to work independently on complex projects.
    $103k-141k yearly est. 1d ago
  • Plumbing Engineer

    K2D Consulting Engineers

    Data engineer job in Marina del Rey, CA

    We are currently seeking a Plumbing Engineer to join our team in Marina del Rey, California. SUMMARY: This position is responsible for managing and performing tests on various materials and equipment, maintaining knowledge of all product specifications, and ensuring adherence to all required standards by performing the following duties. DUTIES AND RESPONSIBILITIES: Build long-term customer relationships with existing and potential customers. Effectively manage plumbing design projects by satisfying clients' needs and meeting budget expectations and project schedules. Provide support during construction phases. Perform other related duties as assigned by management. SUPERVISORY RESPONSIBILITIES: Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws. QUALIFICATIONS: Bachelor's degree (BA) from a four-year college or university in Mechanical Engineering, completed coursework in Plumbing, one to two years of related experience and/or training, or an equivalent combination of education and experience. Certificates, licenses, and registrations required: LEED Certification is a plus. Computer skills required: Experienced at using a computer; knowledge of MS Word, MS Excel, AutoCAD, and Revit is a plus. Other skills required: 5 years of experience minimum; individuals should have recent experience working for a consulting engineering or engineering/architectural firm designing plumbing systems. Experience in the following preferred: Residential, Commercial, Multi-Family, Restaurants. Strong interpersonal skills and experience in maintaining strong client relationships are required. Ability to communicate effectively with people through oral presentations and written communications. Ability to motivate multiple-discipline project teams in meeting clients' needs in a timely manner and meeting budget objectives.
    $87k-124k yearly est. 60d+ ago
  • ETL/Informatica Architect

    Scadea Solutions

    Data engineer job in Los Angeles, CA

    Reach me at venkat.ram AT scadea DOT net. Hello! We have an immediate full-time opportunity for an ETL/Informatica Architect in Los Angeles, CA. Please find the details below. If this interests you, please share your updated resume at the earliest or call me at ************ Ext. 2235. Visa: USC/GC. Desired skills and responsibilities: 8-10 years of real DW/BI experience. Good understanding of ETL architecture. Good understanding of dimensional modeling. Should have experience with ETL design work. Should have worked in the Banking domain. Understand the business and customer needs to introduce technical solutions. Analyze business requirements/processes and system integration points to design and develop end-to-end BI solutions. Strong experience in developing Informatica ETL. Good knowledge of performance tuning and enforcing repository best practices. Translate user requirements into technical solutions. Design and develop metadata models to meet user analytic requirements. Design data warehouse tables, including source-to-Client mappings. Sound understanding of and experience with data modelling, including multi-dimensional modelling, ETL, and metadata development. Qualifications: BACHELOR'S DEGREE. Additional Information. Required Skills: ETL, DW BI, Informatica, Banking domain
    $101k-137k yearly est. 60d+ ago
  • Sr Gameplay Engineer

    The Walt Disney Company

    Data engineer job in Burbank, CA

    This is not a remote role. You must be in the local area or willing to relocate. Join Our Team The Office of Technology Enablement is a strategic organization designed to establish Disney as a progressive, innovative, and responsible leader in important and fast-moving areas of technology such as artificial intelligence and mixed reality. The organization coordinates efforts in AI/ML and mixed reality across each of the Walt Disney Company's business segments to enable value and drive business outcomes. Overview As a Senior Software Engineer (Unreal), you will help build foundational prototypes and systems that enable Disney's AI & XR Studio to accelerate creative innovation across The Walt Disney Company. Your focus will be on developing interactive, real-time experiences for XR with a downstream focus on scalable tools and experiences that teams can extend and adapt into broader production efforts. This role operates at the intersection of engineering and storytelling, providing foundational tools and experiences that empower partner teams to move faster and bolder with emerging technologies. You will own the technical direction and optimize stakeholder satisfaction for a collection of internally developed products in addition to architecting and delivering new software solutions, all while leveraging your expertise in Unreal and its accompanying toolset. You work closely with a cross-functional team of partners, including software, test, and automation engineers, as well as creative teams, to develop and improve product performance. Responsibilities Prototype and iterate on immersive XR experiences with a focus on emotional impact and user engagement. Collaborate with creative teams to translate narrative and design goals into interactive features. Provide technical leadership across multiple prototypes and tools, focusing on building scalable foundations that downstream teams can expand upon, while maintaining strong momentum and project health. 
Provide guidance and support to team members to ensure alignment with creative vision and business goals. Conduct rigorous testing to identify and fix any bugs or issues that may affect product performance. Act as a technical owner for net-new tools or experiences, collaborating with stakeholders to define requirements, success metrics, and clear pathways for downstream teams to build upon studio-developed work. Build clear, modular documentation that translates business and creative inputs into technical deliverables, allowing for smooth handoffs and scalability. Collaborate closely with internal engineering teams and enterprise partners to deliver robust solutions designed to be adapted, scaled, or integrated into downstream experiences. Mentor and develop junior team members. Help author interactive development pipelines and real-time performance optimization. Design, develop, and maintain high-quality game code, features, and systems in collaboration with the development team. Implement and optimize interactive mechanics, graphics rendering, AI algorithms, and networking systems to ensure smooth performance across various platforms. Contribute to technical design discussions, identifying opportunities for innovation and improvement in game development processes. Collaborate with cross-functional teams, including design, art, and production, to integrate technical requirements and deliver high-quality games on time. Conduct code reviews, provide feedback, and mentor junior engineers to promote best practices and ensure code quality and efficiency. Stay current on industry trends, emerging technologies, and best practices in XR interactive development to continuously improve technical skills and knowledge. Must have the following experience: 5+ years of experience developing high-impact software solutions working with Unreal. Demonstrated experience building interactive applications in Unreal for XR, console, or PC. 
Experience with Blueprints and C++ in Unreal for gameplay and interaction systems. Portfolio or demo reel showcasing interactive XR or real-time experiences you created. Strong understanding of real-time rendering, input systems, and performance optimization in Unreal. Proficiency in programming languages commonly used in development, such as C++, C#, or Java. Strong problem-solving skills, attention to detail, and the ability to work effectively in a fast-paced, collaborative environment. Excellent communication and interpersonal skills, with the ability to work well with cross-functional teams and contribute to technical discussions and decisions. Strong track record of interfacing with business and creative stakeholders to define requirements, propose solutions and support products through the prototype life. Passion for mentoring others and developing junior engineers. Bachelor's degree in Computer Science, Software Engineering, or a related field. #DISNEYTECH The hiring range for this position in Los Angeles, CA is $138,900.00-$186,200.00. The base pay actually offered will take into account internal equity and also may vary depending on the candidate's geographic region, job-related knowledge, skills, and experience among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered. Job Posting Segment: TWDSTECH Job Posting Primary Business: Technology Enablement Primary Job Posting Category: Software Engineer Employment Type: Full time Primary City, State, Region, Postal Code: Burbank, CA, USA Alternate City, State, Region, Postal Code: Date Posted: 2025-05-21
    $138.9k-186.2k yearly 60d+ ago
  • Gameplay Engineer

    Motocol

    Data engineer job in Glendale, CA

    An ideal candidate has mobile Unity game development experience, C#, and a passion for games, mobile or any type. Samples or examples of work - can be from a prototype, in college, etc. Must be able to work collaboratively in a team environment and be willing to learn every day. The ideal candidate is also driven to make games. This position is part technical and part designing for the user. The Client is seeking a talented Gameplay Engineer (Designer Programmer) to join its team of passionate gamers. Ability to drive gameplay aspects of the game development process. Work effectively as part of a multi-disciplinary team of engineers, artists, designers, and producers on mobile game projects. Design and build fun features in a high-quality way for our mobile games, and participate in weekly sprints, keeping the team updated with your progress. Work with systems and tools used in professional software development, such as source control, issue tracking, and others. Required Qualifications: 2+ years' experience developing mobile Unity games. Production-level ability in one game programming language, such as C#. Working knowledge of development tools and environments such as Unity. Passion for developing player feedback/game feel. Game design skills - especially mechanics design, prototyping concepts, and systems design. Game programming skills including 2D and 3D math, physics, FX, and UI systems. Strong collaboration skills. A passion for games. Preferred Qualifications: Working prototypes of concepts that showcase a rapid iteration process. Experience building and publishing Unity iOS and Android apps or games. Experience with C++, Objective-C, Java, or other coding languages. Additional Information: All your information will be kept confidential according to EEO guidelines. Hourly bill rate: $25.00. No background check. Duration: 10 months.
    $25 hourly 17h ago

Learn more about data engineer jobs

How much does a data engineer earn in Santa Clarita, CA?

The average data engineer in Santa Clarita, CA earns between $87,000 and $168,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Santa Clarita, CA

$121,000