
Data engineer jobs in Lawndale, CA

- 2,068 jobs
  • Data Scientist

    Stand 8 Technology Consulting

    Data engineer job in Long Beach, CA

    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including internationally in Mexico and India. We are seeking a highly analytical and technically skilled Data Scientist to transform complex, multi-source data into unified, actionable insights used for executive reporting and decision-making. This role requires expertise in business intelligence design, data modeling, metadata management, data integrity validation, and the development of dashboards, reports, and analytics used across operational and strategic environments. The ideal candidate thrives in a fast-paced environment, demonstrates strong investigative skills, and collaborates effectively with technical teams, business stakeholders, and leadership.

    Essential Duties & Responsibilities
    - Participate across the full solution lifecycle: business case, planning, design, development, testing, migration, and production support.
    - Analyze large and complex datasets with accuracy and attention to detail.
    - Collaborate with users to develop effective metadata and data relationships.
    - Identify reporting and dashboard requirements across business units.
    - Determine strategic placement of business logic within ETL or metadata models.
    - Build enterprise data warehouse metadata/semantic models.
    - Design and develop unified dashboards, reports, and data extractions from multiple data sources.
    - Develop and execute testing methodologies for reports and metadata models.
    - Document BI architecture, data lineage, and project report requirements.
    - Provide technical specifications and data definitions to support the enterprise data dictionary.
    - Apply analytical skills and data science techniques to understand business processes, financial calculations, data flows, and application interactions.
    - Identify and implement improvements, workarounds, or alternative solutions related to ETL processes, ensuring integrity and timeliness.
    - Create UI components or portal elements (e.g., SharePoint) for dynamic or interactive stakeholder reporting.
    - Download and process SQL database information to build Power BI or Tableau reports (including cybersecurity awareness campaigns).
    - Utilize SQL, Python, R, or similar languages for data analysis and modeling.
    - Support process optimization through advanced modeling.

    Required Knowledge & Attributes
    - Highly self-motivated, with strong organizational skills and the ability to manage multiple verbal and written assignments.
    - Experience collaborating across organizational boundaries for data sourcing and usage.
    - Analytical understanding of business processes, forecasting, capacity planning, and data governance.
    - Proficient with BI tools (Power BI, Tableau, PBIRS, SSRS, SSAS).
    - Strong Microsoft Office skills (Word, Excel, Visio, PowerPoint).
    - High attention to detail and accuracy.
    - Ability to work independently, demonstrate ownership, and ensure high-quality outcomes.
    - Strong communication, interpersonal, and stakeholder engagement skills.
    - Deep understanding that data integrity and consistency are essential for adoption and trust.
    - Ability to shift priorities and adapt within fast-paced environments.

    Required Education & Experience
    - Bachelor's degree in Computer Science, Mathematics, or Statistics (or equivalent experience).
    - 3+ years of BI development experience.
    - 3+ years with Power BI and supporting Microsoft stack tools (SharePoint 2019, PBIRS/SSRS, Excel 2019/2021).
    - 3+ years of experience with SDLC/project lifecycle processes.
    - 3+ years of experience with data warehousing methodologies (ETL, data modeling).
    - 3+ years of VBA experience in Excel and Access.
    - Strong ability to write SQL queries and work with SQL Server 2017-2022.
    - Experience with BI tools including PBIRS, SSRS, SSAS, and Tableau.
    - Strong analytical skills in business processes, financial modeling, forecasting, and data flows.
    - Critical thinking and problem-solving capabilities.
    - Experience producing high-quality technical documentation and presentations.
    - Excellent communication and presentation skills, with the ability to explain insights to leadership and business teams.

    Benefits
    - Medical coverage and Health Savings Account (HSA) through Anthem
    - Dental/vision/various ancillary coverages through Unum
    - 401(k) retirement savings plan
    - Paid-time-off options
    - Company-paid Employee Assistance Program (EAP)
    - Discount programs through ADP WorkforceNow

    Additional Details
    The base range for this contract position is $73 - $83 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.

    About Us
    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally, with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees. Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY. Check out more at ************** and reach out today to explore opportunities to grow together! By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
    $73-83 hourly 5d ago
  • Principal Data Scientist

    Hiretalent-Staffing & Recruiting Firm

    Data engineer job in Alhambra, CA

    The Principal Data Scientist works to establish a comprehensive Data Science Program to advance data-driven decision-making, streamline operations, and fully leverage modern platforms, including Databricks or similar, to meet increasing demand for predictive analytics and AI solutions. The Principal Data Scientist will guide program development, provide training and mentorship to junior members of the team, accelerate adoption of advanced analytics, and build internal capacity through structured mentorship.

    The Principal Data Scientist will possess:
    - Exceptional communication abilities, both verbal and written, with a strong customer service mindset and the ability to translate complex concepts into clear, actionable insights.
    - Strong analytical and business acumen, including foundational experience with regression, association analysis, outlier detection, and core data analysis principles.
    - Working knowledge of database design and organization, with the ability to partner effectively with Data Management and Data Engineering teams.
    - Outstanding time management and organizational skills, with demonstrated success managing multiple priorities and deliverables in parallel.
    - A highly collaborative work style, coupled with the ability to operate independently, maintain focus, and drive projects forward with minimal oversight.
    - A meticulous approach to quality, ensuring accuracy, reliability, and consistency in all deliverables.
    - Proven mentorship capabilities, including the ability to guide, coach, and upskill junior data scientists and analysts.

    Experience:
    - 5+ years of professional experience leading data science initiatives, including developing machine learning models, statistical analyses, and end-to-end data science workflows in production environments.
    - 3+ years of experience working with Databricks and similar cloud-based analytics platforms, including notebook development, feature engineering, ML model training, and workflow orchestration.
    - 3+ years of experience applying advanced analytics and predictive modeling (e.g., regression, classification, clustering, forecasting, natural language processing).
    - 2+ years of experience implementing MLOps practices, such as model versioning, CI/CD for ML, MLflow, automated pipelines, and model performance monitoring.
    - 2+ years of experience collaborating with data engineering teams to design data pipelines, optimize data transformations, and implement Lakehouse or data warehouse architectures (e.g., Databricks, Snowflake, SQL-based platforms).
    - 2+ years of experience mentoring or supervising junior data scientists or analysts, including code reviews, training, and structured skill development.
    - 2+ years of experience with Python and SQL programming, using data sources such as SQL Server, Oracle, PostgreSQL, or similar relational databases.
    - 1+ year of experience operationalizing analytics within enterprise governance frameworks, partnering with Data Management, Security, and IT to ensure compliance, reproducibility, and best practices.

    Education: This classification requires possession of a Master's degree or higher in Data Science, Statistics, Computer Science, or a closely related field. Additional qualifying professional experience may be substituted for the required education on a year-for-year basis.

    Certification: At least one of the following industry-recognized certifications in data science or cloud analytics:
    - Microsoft Azure Data Scientist Associate (DP-100)
    - Databricks Certified Data Scientist or Machine Learning Professional
    - AWS Machine Learning Specialty
    - Google Professional Data Engineer
    - or equivalent advanced analytics certifications.
    The certification is required and may not be substituted with additional experience.
    $97k-141k yearly est. 2d ago
  • Data Engineer

    Robert Half Recruiting

    Data engineer job in Culver City, CA

    Robert Half is partnering with a well-known high-tech company seeking an experienced Data Engineer with strong Python and SQL skills. The primary duties involve managing the complete data lifecycle and utilizing extensive datasets across marketing, software, and web platforms. This position is full-time with full benefits and 3 days onsite in the Culver City area.

    Responsibilities:
    - 4+ years of professional experience, ideally in a combination of data engineering and business intelligence.
    - Working heavily with SQL and programming in Python.
    - Ownership mindset to oversee the entire data lifecycle, including collection, extraction, and cleansing processes.
    - Building reports and data visualizations to help advance the business.
    - Leveraging industry-standard tools for data integration, such as Talend.
    - Working extensively within cloud-based ecosystems such as AWS and GCP.

    Requirements:
    - Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
    - 5+ years of experience in data engineering, data warehousing, and big data technologies.
    - Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server) and NoSQL technologies.
    - Experience working within GCP and AWS environments.
    - Experience with real-time data pipeline tools.
    - Hands-on expertise with Google Cloud services, including BigQuery.
    - Deep knowledge of SQL, including dimension tables, and experience in Python programming.
    $116k-165k yearly est. 3d ago
  • Senior Data Engineer

    Akube

    Data engineer job in Calabasas, CA

    City: Calabasas, CA / Las Vegas, NV
    Onsite/Hybrid/Remote: Onsite 4 days a week
    Duration: 6 months, contract to hire
    Rate Range: $85/hr W2
    Work Authorization: GC, USC only

    Must Have: Databricks, Python, Azure, API development, ETL pipelines, DevOps and CI/CD

    Responsibilities:
    - Design and build scalable batch and real-time data pipelines.
    - Develop data ingestion, processing, and analytical workflows.
    - Build data products and intelligent APIs.
    - Ensure data quality, reliability, and performance.
    - Collaborate with cross-functional teams to translate business needs into data solutions.
    - Support cloud-based data architecture for BI and AI/ML use cases.
    - Participate in code reviews and CI/CD practices.

    Qualifications:
    - Bachelor's degree in a related technical field required.
    - 8+ years of data engineering experience.
    - Strong experience with Databricks, Spark, and cloud platforms.
    - Proficiency in Python and SQL.
    - Hands-on experience with Azure data services.
    - Experience with REST APIs and DevOps practices.
    - Agile development experience.
    $85 hourly 3d ago
  • Data Engineer (AWS Redshift, BI, Python, ETL)

    Prosum

    Data engineer job in Manhattan Beach, CA

    We are seeking a skilled Data Engineer with strong experience in business intelligence (BI) and data warehouse development to join our team. In this role, you will design, build, and optimize data pipelines and warehouse architectures that support analytics, reporting, and data-driven decision-making. You will work closely with analysts, data scientists, and business stakeholders to ensure reliable, scalable, and high-quality data solutions.

    Responsibilities:
    - Develop and maintain ETL/ELT pipelines for ingesting, transforming, and delivering data.
    - Design and enhance data warehouse models (star/snowflake schemas) and BI datasets.
    - Optimize data workflows for performance, scalability, and reliability.
    - Collaborate with BI teams to support dashboards, reporting, and analytics needs.
    - Ensure data quality, governance, and documentation across all solutions.

    Qualifications:
    - Proven experience with data engineering tools (SQL, Python, ETL frameworks).
    - Strong understanding of BI concepts, reporting tools, and dimensional modeling.
    - Hands-on experience with cloud data platforms (e.g., AWS, Azure, GCP) is a plus.
    - Excellent problem-solving skills and ability to work in a cross-functional environment.
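The star/snowflake modeling this posting asks for can be sketched in a few lines: a fact table holds measures plus foreign keys, and dimension tables hold the descriptive attributes used for rollups. The tables and column names below are illustrative, not taken from the posting.

```python
# Minimal star-schema sketch: a sales fact table joined to its dimensions.
# All table and column names are hypothetical.

dim_date = {1: {"date": "2024-01-01", "quarter": "Q1"},
            2: {"date": "2024-04-01", "quarter": "Q2"}}

dim_product = {10: {"name": "widget", "category": "hardware"},
               11: {"name": "gizmo", "category": "hardware"}}

fact_sales = [  # one row per sale: foreign keys + a measure
    {"date_key": 1, "product_key": 10, "amount": 120.0},
    {"date_key": 1, "product_key": 11, "amount": 80.0},
    {"date_key": 2, "product_key": 10, "amount": 200.0},
]

def revenue_by_quarter(facts, dates):
    """Aggregate a fact measure along a dimension attribute (a BI-style rollup)."""
    totals = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0.0) + row["amount"]
    return totals

print(revenue_by_quarter(fact_sales, dim_date))  # {'Q1': 200.0, 'Q2': 200.0}
```

In a real warehouse the same rollup would be a SQL `GROUP BY` over the joined fact and dimension tables; the in-memory version just makes the key/attribute split visible.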
    $99k-139k yearly est. 4d ago
  • Data Analytics Engineer

    Archwest Capital

    Data engineer job in Irvine, CA

    We are seeking a Data Analytics Engineer to join our team who will serve as a hybrid Database Administrator, Data Engineer, and Data Analyst, responsible for managing core data infrastructure, developing and maintaining ETL pipelines, and delivering high-quality analytics and visual insights to executive stakeholders. This role bridges technical execution with business intelligence, ensuring that data across Salesforce, financial, and operational systems is accurate, accessible, and strategically presented.

    Essential Functions
    - Database Administration: Oversee and maintain database servers, ensuring performance, reliability, and security. Manage user access, backups, and data recovery processes while optimizing queries and database operations.
    - Data Engineering (ELT): Design, build, and maintain robust ELT pipelines (SQL/dbt or equivalent) to extract, transform, and load data across Salesforce, financial, and operational sources. Ensure data lineage, integrity, and governance throughout all workflows.
    - Data Modeling & Governance: Design scalable data models and maintain a governed semantic layer and KPI catalog aligned with business objectives. Define data quality checks, SLAs, and lineage standards to reconcile analytics with finance source-of-truth systems.
    - Analytics & Reporting: Develop and manage executive-facing Tableau dashboards and visualizations covering key lending and operational metrics, including pipeline conversion, production, credit quality, delinquency/charge-offs, DSCR, and LTV distributions.
    - Presentation & Insights: Translate complex datasets into clear, compelling stories and presentations for leadership and cross-functional teams. Communicate findings through visual reports and executive summaries to drive strategic decisions.
    - Collaboration & Integration: Partner with Finance, Capital Markets, and Operations to refine KPIs and perform ad-hoc analyses. Collaborate with Engineering to align analytical and operational data, manage integrations, and support system scalability.
    - Enablement & Training: Conduct training sessions, create documentation, and host data office hours to promote data literacy and empower business users across the organization.

    Competencies & Skills
    - Advanced SQL proficiency with strong data modeling, query optimization, and database administration experience (PostgreSQL, MySQL, or equivalent).
    - Hands-on experience managing and maintaining database servers and optimizing performance.
    - Proficiency with ETL/ELT frameworks (dbt, Airflow, or similar) and cloud data stacks (AWS/Azure/GCP).
    - Strong Tableau skills: parameters, LODs, row-level security, executive-level dashboard design, and storytelling through data.
    - Experience with Salesforce data structures and ingestion methods.
    - Proven ability to communicate and present technical data insights to executive and non-technical stakeholders.
    - Solid understanding of lending/financial analytics (pipeline conversion, delinquency, DSCR, LTV).
    - Working knowledge of Python for analytics tasks, cohort analysis, and variance reporting.
    - Familiarity with version control (Git), CI/CD for analytics, and data governance frameworks.
    - Excellent organizational, documentation, and communication skills with a strong sense of ownership and follow-through.

    Education & Experience
    - Bachelor's degree in Computer Science, Engineering, Information Technology, Data Analytics, or a related field.
    - 3+ years of experience in data analytics, data engineering, or database administration roles.
    - Experience supporting executive-level reporting and maintaining database infrastructure in a fast-paced environment.
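The lending metrics this role reports on (delinquency, LTV) reduce to simple portfolio arithmetic. A minimal sketch over hypothetical loan records, with field names invented for illustration:

```python
# Hypothetical loan records; field names are illustrative, not from the posting.
loans = [
    {"balance": 200_000, "property_value": 250_000, "days_past_due": 0},
    {"balance": 150_000, "property_value": 300_000, "days_past_due": 45},
    {"balance": 100_000, "property_value": 125_000, "days_past_due": 0},
]

def delinquency_rate(loans, threshold_days=30):
    """Share of loans past due beyond the threshold."""
    late = sum(1 for loan in loans if loan["days_past_due"] > threshold_days)
    return late / len(loans)

def average_ltv(loans):
    """Mean loan-to-value ratio across the portfolio."""
    return sum(loan["balance"] / loan["property_value"] for loan in loans) / len(loans)

print(round(delinquency_rate(loans), 3))  # 0.333
print(round(average_ltv(loans), 3))       # 0.7
```

In practice these would be governed KPI definitions in the semantic layer (e.g., whether "delinquent" means 30+ or 60+ days past due), which is exactly why the posting stresses a KPI catalog reconciled with finance source-of-truth systems.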
    $99k-139k yearly est. 5d ago
  • Data Engineer

    RSM Solutions, Inc.

    Data engineer job in Irvine, CA

    Thank you for stopping by to take a look at the Data Integration Engineer role I posted here on LinkedIn; I appreciate it. If you have read my postings in the past, you will recognize how I write job descriptions. If you are new, allow me to introduce myself. My name is Tom Welke. I am Partner & VP at RSM Solutions, Inc. I have been recruiting technical talent for more than 23 years and have been in the tech space since the 1990s. Because of this, I actually write JDs myself...no AI, no 'bots', just a real live human. I realized a while back that looking for work is about as fun as a root canal with no anesthesia...especially now. So, rather than saying 'must work well with others' and 'team mindset', I do away with that kind of nonsense and just tell it like it is.

    As with every role I work on, social fit is almost as important as technical fit. For this one, technical fit is very important, but we also have some social fit characteristics that matter. This is the kind of place that requires people to dive in and learn. The hiring manager for this one is a very dear friend of mine, and he said something interesting to me not long ago: if you aren't spending at least an hour a day learning something new, you really are doing yourself a disservice. This is that classic environment where no one says 'this is not my job', so the ability to jump in and help is needed for success in this role.

    This role is being done onsite in Irvine, California. I prefer working with candidates that are already local to the area. If you need to relocate, that is fine, but there are no relocation dollars available. I can only work with US Citizens or Green Card Holders for this role. I cannot work with H1, OPT, EAD, F1, H4, or anyone that is not already a US Citizen or Green Card Holder.

    The Data Engineer role is similar to the Data Integration role I posted. However, this one is more Ops-focused, with the orchestration of deployment and MLflow, including orchestrating and using data on the clusters and managing how the models are performing. This role focuses on coding and configuring on the ML side of the house. You will be designing, automating, and observing end-to-end data pipelines that feed this client's Kubeflow-driven machine learning platform, ensuring models are trained, deployed, and monitored on trustworthy, well-governed data. You will build batch/stream workflows, wire them into Azure DevOps CI/CD, and surface real-time health metrics in Prometheus + Grafana dashboards to guarantee data availability. The role bridges Data Engineering and MLOps, so data scientists can focus on experimentation while the business sees rapid, reliable predictive insight.

    Here are some of the main responsibilities:
    - Design and implement batch and streaming pipelines in Apache Spark running on Kubernetes and Kubeflow Pipelines to hydrate feature stores and training datasets.
    - Build high-throughput ETL/ELT jobs with SSIS, SSAS, and T-SQL against MS SQL Server, applying Data Vault style modeling patterns for auditability.
    - Integrate source control, build, and release automation using GitHub Actions and Azure DevOps for every pipeline component.
    - Instrument pipelines with Prometheus exporters and visualize SLA, latency, and error budget metrics to enable proactive alerting.
    - Create automated data quality and schema drift checks; surface anomalies to support a rapid incident response process.
    - Use MLflow Tracking and Model Registry to version artifacts, parameters, and metrics for reproducible experiments and safe rollbacks.
    - Work with data scientists to automate model retraining and deployment triggers within Kubeflow based on data freshness or concept drift signals.
    - Develop PowerShell and .NET utilities to orchestrate job dependencies, manage secrets, and publish telemetry to Azure Monitor.
    - Optimize Spark and SQL workloads through indexing, partitioning, and cluster sizing strategies, benchmarking performance in CI pipelines.
    - Document lineage, ownership, and retention policies; ensure pipelines conform to PCI/SOX and internal data governance standards.
    - Mentor engineers on best practices in containerized data engineering and MLOps.

    Here is what we are seeking:
    - At least 6 years of experience building data pipelines in Spark or equivalent.
    - At least 2 years deploying workloads on Kubernetes/Kubeflow.
    - At least 2 years of experience with MLflow or similar experiment-tracking tools.
    - At least 6 years of experience in T-SQL and Python/Scala for Spark.
    - At least 6 years of PowerShell/.NET scripting.
    - At least 6 years of experience with GitHub, Azure DevOps, Prometheus, Grafana, and SSIS/SSAS.
    - Kubernetes CKA/CKAD, Azure Data Engineer (DP-203), or MLOps-focused certifications (e.g., Kubeflow or MLflow) would be great to see.
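The "automated data quality and schema drift checks" this role builds can be as simple as validating each incoming batch against an expected column contract before it reaches training pipelines. The contract and field names below are assumptions for illustration, not from the posting:

```python
# Sketch of a schema-drift check: compare each incoming row's columns and
# types against an assumed contract and report findings for alerting.
EXPECTED_SCHEMA = {"user_id": int, "amount": float, "ts": str}  # hypothetical contract

def detect_schema_drift(rows, expected=EXPECTED_SCHEMA):
    """Return a list of human-readable drift findings (empty list = healthy batch)."""
    findings = []
    for i, row in enumerate(rows):
        missing = expected.keys() - row.keys()
        extra = row.keys() - expected.keys()
        if missing:
            findings.append(f"row {i}: missing columns {sorted(missing)}")
        if extra:
            findings.append(f"row {i}: unexpected columns {sorted(extra)}")
        for col, typ in expected.items():
            if col in row and not isinstance(row[col], typ):
                findings.append(
                    f"row {i}: {col} is {type(row[col]).__name__}, expected {typ.__name__}")
    return findings

batch = [{"user_id": 1, "amount": 9.99, "ts": "2024-01-01"},
         {"user_id": "2", "amount": 5.00, "ts": "2024-01-01", "promo": "x"}]
for finding in detect_schema_drift(batch):
    print(finding)
```

In the stack this posting describes, the findings list would be exported as Prometheus metrics or pushed to an alerting channel rather than printed; the check itself would typically run as a step in the Spark/Kubeflow pipeline.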
    $111k-166k yearly est. 3d ago
  • Senior Data Architect - SoCal only (No C2C)

    JSG (Johnson Service Group, Inc.)

    Data engineer job in Calabasas, CA

    JSG is seeking a Senior Data Solutions Architect for a client in Woodland Hills, CA. This position is remote, and our client is looking for local candidates based in Southern California. The Senior Data Solutions Engineer will design, scale, and optimize the company's enterprise data platform. This role will build and maintain cloud-native data pipelines, lakehouse/warehouse architectures, and multi-system integrations that support Finance, CRM, Operations, Marketing, and guest experience analytics. The engineer will focus on building secure, scalable, and cost-efficient systems while applying modern DevOps, ETL, and cloud engineering practices, and requires a strong technologist with hands-on expertise across data pipelines, orchestration, governance, and cloud infrastructure.

    Key Responsibilities
    - Design, build, and maintain ELT/ETL pipelines across Snowflake, Databricks, Microsoft Fabric Gen 2, Azure Synapse Analytics, and legacy SQL/Oracle platforms.
    - Implement medallion/lakehouse architecture, CDC pipelines, and streaming ingestion frameworks.
    - Leverage Python (90%) and SQL (10%) for data processing, orchestration, and automation.
    - Manage AWS and Azure multi-account environments, enforcing MFA, IAM policies, and governance.
    - Build serverless architectures (AWS Lambda, Azure Functions, EventBridge, SQS, Step Functions) for event-driven data flows.
    - Integrate infrastructure with CI/CD pipelines (GitHub Actions, Azure DevOps, MWAA/Airflow, dbt) for automated testing and deployments.
    - Deploy infrastructure as code using Terraform and Azure DevOps for reproducible, version-controlled environments.
    - Implement observability and monitoring frameworks (Datadog, Prometheus, Grafana, Kibana, Azure Monitor) to ensure system reliability, performance, and cost efficiency.
    - Collaborate with stakeholders to deliver secure, scalable, and cost-efficient data solutions.

    A background in Finance or consumer-facing industries is preferred.

    Salary: $160K-$175K

    JSG offers medical, dental, vision, life insurance options, short-term disability, 401(k), weekly pay, and more. Johnson Service Group (JSG) is an Equal Opportunity Employer. JSG provides equal employment opportunities to all applicants and employees without regard to race, color, religion, sex, age, sexual orientation, gender identity, national origin, disability, marital status, protected veteran status, or any other characteristic protected by law.
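The event-driven serverless flows this posting names (Lambda fed by SQS and similar) usually center on a small handler invoked per batch of queue records. A minimal Lambda-style sketch; the event shape and field names are assumed for illustration, not from the posting:

```python
# Sketch of an event-driven ingestion step: a Lambda-style handler that parses
# SQS-shaped records, skips bad rows, and returns a batch summary.
import json

def handler(event, context=None):
    """Process queue records: parse JSON bodies, tally amounts, count failures."""
    processed, failed, total = 0, 0, 0.0
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            total += float(body["amount"])  # "amount" is a hypothetical field
            processed += 1
        except (KeyError, ValueError):
            failed += 1  # in a real pipeline these rows would go to a dead-letter queue
    return {"processed": processed, "failed": failed, "total_amount": total}

sample_event = {"Records": [
    {"body": '{"amount": "19.50"}'},
    {"body": '{"amount": "0.50"}'},
    {"body": 'not json'},
]}
print(handler(sample_event))  # {'processed': 2, 'failed': 1, 'total_amount': 20.0}
```

Deployed on AWS, the same function body would be wired to an SQS trigger (the `Records`/`body` layout mirrors the SQS event shape), with the summary emitted to monitoring rather than returned to a caller.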
    $160k-175k yearly 4d ago
  • Senior Data Architect (AI/ML)

    Neurealm

    Data engineer job in Santa Ana, CA

    Senior AI Solution Architect - Data & AI Platforms (GCP)

    Experience: 10-15+ years overall (with 5+ years in AI/Data Architecture roles)

    We are seeking a highly skilled AI Solution Architect with deep expertise in Data Architecture, AI/ML platforms, and Generative AI solutions to design and deliver scalable, secure, and enterprise-grade data and AI solutions on Google Cloud Platform (GCP). The ideal candidate will have strong hands-on experience across data lakehouse architectures, modern BI platforms, ML/MLOps, Conversational Analytics, Generative AI, and Agentic AI frameworks, and will work closely with business, data engineering, and AI teams to drive end-to-end AI-led transformation.

    Key Responsibilities

    Data & Platform Architecture
    - Design and own end-to-end data architectures, including ingestion, processing, storage, governance, and consumption layers.
    - Architect modern data lakehouse platforms using GCP services (e.g., BigQuery, Dataproc, Cloud Storage).
    - Define scalable data platforms supporting batch, streaming, and real-time analytics.
    - Establish data governance, metadata management, data quality, lineage, and security frameworks.

    AI, ML & MLOps Architecture
    - Design ML/AI architectures supporting model training, deployment, monitoring, and lifecycle management.
    - Define and implement MLOps frameworks (CI/CD for ML, feature stores, model registries, observability).
    - Collaborate with data scientists to productionize ML models at scale.
    - Evaluate and recommend ML frameworks, tools, and best practices.

    Generative AI & Agentic AI
    - Architect and implement Generative AI solutions using LLMs (e.g., text, code, embeddings, multimodal use cases).
    - Design Conversational Analytics and AI-powered BI solutions.
    - Build and evaluate Agentic AI platforms, including autonomous agents, orchestration frameworks, and tool integrations.
    - Lead solution evaluations, PoCs, and vendor/tool assessments for GenAI and agent-based systems.

    Business Intelligence & Analytics
    - Design modern BI and analytics platforms enabling self-service analytics and AI-driven insights.
    - Integrate BI tools with data lakehouse and AI layers.
    - Enable semantic layers, metrics definitions, and governed analytics.

    Cloud & GCP Leadership
    - Lead architecture and solution design on Google Cloud Platform (GCP).
    - Utilize GCP services such as BigQuery, Vertex AI, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Looker, and IAM.
    - Ensure architectures follow best practices for security, scalability, performance, and cost optimization.

    Stakeholder & Technical Leadership
    - Partner with business leaders to translate business requirements into AI-driven solutions.
    - Lead technical design reviews and architecture governance.
    - Mentor engineers, architects, and data scientists.
    - Create architecture blueprints, reference architectures, and technical documentation.

    Required Skills & Qualifications

    Core Technical Skills
    - Strong experience in Data Architecture and data platforms.
    - Hands-on expertise in data lakehouse architectures.
    - Deep understanding of end-to-end data management.
    - Experience with modern BI platforms and analytics ecosystems.
    - Strong background in AI/ML architecture and MLOps.
    - Proven experience in Conversational Analytics and Generative AI.
    - Hands-on exposure to Agentic AI platforms, frameworks, and evaluations.
    - Strong expertise in Google Cloud Platform (GCP).

    Tools & Technologies (preferred)
    - GCP: BigQuery, Vertex AI, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Looker
    - AI/ML: TensorFlow, PyTorch, scikit-learn, LLM frameworks
    - MLOps: CI/CD, feature stores, model registries, monitoring tools
    - Data: SQL, Python, Spark, Kafka
    - BI: Looker, Tableau, Power BI (or equivalent)

    Preferred Qualifications
    - Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
    - GCP Professional certifications (e.g., Professional Data Engineer, Professional ML Engineer, Cloud Architect).
    - Experience working in large-scale enterprise or consulting environments.
    - Strong communication and stakeholder management skills.
    $115k-158k yearly est. 3d ago
  • Sr. Developer eCommerce Systems

    Pacsun

    Data engineer job in Anaheim, CA

    Join the Pacsun Community. Co-created in Los Angeles, Pacsun inspires the next generation of youth, building community at the intersection of fashion, music, art and sport. Pacsun is a leading lifestyle brand offering an exclusive collection of the most relevant brands and styles such as adidas, Brandy Melville, Essentials Fear of God, our own brands, and many more. Our Pacsun community believes in and understands the importance of using our voice, platform, and resources to inspire and bring about positive development. Through our PacCares program, we are committed to our responsibility in using our platform to drive change and take action on the issues important to our community. Learn more here: LinkedIn - Our Community

    About the Job: Pacsun's IT eCommerce team uses AI and innovative technologies to enhance customer experience and improve operational efficiency. As a key member of the team, the Senior eCommerce Developer contributes to the architecture, development, and optimization of the company's digital commerce experiences. This role is responsible for both back-end and front-end development on Salesforce Commerce Cloud (SFCC), ensuring high-performance, secure, and accessible storefronts with robust system integration in the eCommerce ecosystem. The Senior eCommerce Developer will lead end-to-end delivery of new features, mentor junior developers and the off-shore team, and collaborate closely with UX, product, QA, and business teams to create compelling online experiences that drive revenue and customer loyalty. This role will work on the full stack of Pacsun's Salesforce Commerce Cloud, mobile app, AI initiatives, and system integrations, supporting Commerce, Loyalty, CRM, OMS, and other eCommerce platforms.

    A day in the life, what you'll be doing:

    Back-End Development & Integration
    - Design, build, and maintain SFCC server-side components, including controllers, pipelines, cartridges, and custom business logic.
    - Develop and manage robust APIs that connect SFCC with tax engines, payment processors, fraud management services, and the order management system.
    - Ensure reliable data synchronization between SFCC and external platforms such as CRM, Loyalty, OMS, ERP, and analytics systems.
    - Optimize database models, caching strategies, and performance tuning to support high transaction volumes and peak traffic periods.

    Checkout & Transaction Optimization
    - Own the end-to-end checkout experience, ensuring seamless, secure, and performant workflows from cart to order confirmation.
    - Integrate payment gateways and fraud protections to deliver accurate pricing and effortless transactions.
    - Collaborate with UX and product teams to identify friction points in the checkout process and implement improvements that boost conversion and customer satisfaction.

    Tax, Shipping & OMS Integration
    - Implement and maintain integrations with third-party tax services to handle complex jurisdictional tax rules.
    - Connect SFCC to shipping providers and fulfillment platforms to provide real-time shipping options and tracking.
    - Build and support integrations with the order management system to ensure accurate order routing, inventory updates, and status synchronization.

    AI & Innovation Support
    - Partner with data science and innovation teams to embed AI-driven personalization, recommendation, and search solutions into the platform.
    - Develop integration points for machine-learning models and real-time personalization engines, ensuring data security and compliance.
    - Prototype and implement new technologies that enhance the customer experience and streamline operations.

    Technical Leadership & Collaboration
    - Lead code reviews, define back-end architecture standards, and mentor less experienced developers on integration patterns and best practices.
    - Participate in IT management and technical teams to develop and deploy processes that ensure rapid, reliable releases.
    - Work closely with product, UX, QA, and DevOps teams to define requirements, plan sprints, and deliver high-quality software on schedule.

    What it takes to Join:
    - 8+ years of experience in web development, with at least 5 years focused on Salesforce Commerce Cloud and SFRA.
    - Deep knowledge of modern front-end technologies (HTML5, CSS3/SCSS, JavaScript, React or similar frameworks) and back-end development (Node.js, Java, or equivalent).
    - Hands-on experience with SFCC OCAPI/SCAPI, cartridge development, API integrations, and Business Manager configurations.
    - Proven track record integrating third-party services (payments, tax, shipping, CRM, loyalty, analytics) and implementing secure, scalable solutions.
    - Familiarity with Agile methodologies, version control (Git), and CI/CD pipelines.
    - Strong understanding of web performance optimization, SEO, and accessibility standards.
    - Ability to lead discussions, mentor teammates, and collaborate with technical teams.
    - Bachelor's degree in Computer Science, Information Systems, or a related field; Salesforce B2C Commerce Developer certification is preferred.
    - Salesforce Commerce Cloud SFRA certified developer is preferred.
    - Proven ability to excel in fast-growing, dynamic business environments with competing priorities, with a positive, solution-oriented mindset.
    - Excellent analytical and problem-solving skills.

    Salary Range: $149,000 - $159,000

    Pac Perks:
    - Dog-friendly office environment
    - On-site cafe
    - On-site gym
    - $1,000 referral incentive program
    - Generous associate discount of 30-50% off merchandise online and in-stores
    - Competitive long-term and short-term incentive program
    - Immediate 100% vested 401K contributions and employer match
    - Calm Premium access for all employees
    - Employee perks throughout the year

    Physical Requirements: The physical demands described here are representative of those that are required by an associate to successfully perform the essential functions of this job. While performing the duties of this job, the associate is regularly required to talk or hear. The associate is frequently required to sit, stand, walk, use hands to finger, handle, or feel, and reach with hands and arms. Specific vision abilities required by this job include close vision, distance vision, depth perception, and the ability to adjust focus. Ability to work in an open environment with fluctuating temperatures and standard lighting. Ability to work on a computer and mobile phone for multiple hours, with frequent interruptions. Required to travel in elevators or stairwells to attend meetings and engage with associates on multiple floors throughout the building. Hotel, airplane, and car travel may be required.

    Position Type/Expected Hours of Work: This is a full-time position. As a national retailer, occasional evening and/or weekend work may be required during periods of high volume. This role operates in a professional office environment and routinely uses standard office equipment.

    Other Considerations: Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the associate for this job. Duties, responsibilities, and activities may change at any time with or without notice. Reasonable accommodations may be made to qualified individuals with disabilities to enable them to perform the essential functions of the role. Equal Opportunity Employer: This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.
    $149k-159k yearly 1d ago
  • Software Developer Engineer in Test

    LHH 4.3 company rating

    Data engineer job in Los Angeles, CA

    Job Title: Software Developer Engineer in Test Reports To: Sr. Manager, Quality Engineering About the role We are looking for a Software Developer Engineer in Test to join our Platform Engineering & Playback team. As an Automation SDET, you'll be responsible for designing and building scalable test automation frameworks that ensure the integrity and quality of our streaming platform. You'll work across teams to validate video playback, API reliability, cross-device compatibility, and more, ultimately helping us deliver uninterrupted entertainment to a global audience. Key Responsibilities: Architect and develop robust, reusable automated test frameworks for APIs, UI, and video playback components Validate streaming application workflows across web, mobile, smart TVs, and OTT devices Automate testing for adaptive bitrate streaming, playback metrics, and buffering scenarios Architect a solution for testing TV and OTT device workflows. Integrate automated tests with CI/CD pipelines to ensure continuous delivery Write clear, concise, and comprehensive test plans and test cases Work closely with developers and QA to ensure high-quality test coverage Participate in code reviews and provide feedback on testability and design Champion quality engineering practices within the development teams Mentor QA engineers on automation strategies and best practices Required Qualifications Bachelor's degree in Computer Science, Engineering, or equivalent experience 2+ years of experience in test automation, ideally in media or streaming environments Proficiency in one or more programming languages (e.g., Java, Python, JavaScript, C#) Experience developing test frameworks and reusable testing libraries Experience with test automation frameworks (e.g., Selenium, Cypress, Playwright, TestNG, JUnit) Solid understanding of HTTP, REST APIs, and API testing tools (e.g., Postman, REST Assured) Experience with version control (Git), CI/CD tools (e.g., Jenkins, GitHub Actions), and 
build systems Excellent debugging, problem-solving, and communication skills Desired Qualifications Experience with cloud platforms (AWS, Azure, or GCP) Exposure to OTT platforms or smart TV development environments Experience testing cross-platform apps (iOS, Android, Roku, Fire TV, etc.) Familiarity with streaming protocols (HLS, DASH) and media playback components Benefit offerings include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, EAP program, commuter benefits, and 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria. Equal Opportunity Employer/Veterans/Disabled To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to ******************************************* The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable: • The California Fair Chance Act • Los Angeles City Fair Chance Ordinance • Los Angeles County Fair Chance Ordinance for Employers • San Francisco Fair Chance Ordinance
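The "reusable testing libraries" this posting asks for can be illustrated with a minimal sketch. The schema shape, field names, and the playback-session payload below are invented for illustration; they are not from any real streaming platform's API.

```python
# Hedged sketch: a tiny reusable response checker of the kind an SDET
# might factor into a shared testing library. Field names and schema
# are illustrative assumptions, not a real platform's API.

def check_response(payload: dict, schema: dict) -> list[str]:
    """Return a list of problems found; an empty list means the payload passes."""
    problems = []
    for field, expected_type in schema.items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            problems.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(payload[field]).__name__}"
            )
    return problems

# Example: validate a hypothetical playback-session response.
schema = {"session_id": str, "bitrate_kbps": int, "buffering_events": int}
good = {"session_id": "abc", "bitrate_kbps": 4500, "buffering_events": 0}
bad = {"session_id": "abc", "bitrate_kbps": "high"}

print(check_response(good, schema))  # []
print(check_response(bad, schema))
```

Collecting all problems instead of failing on the first one is what makes a helper like this pleasant to reuse across API, UI, and playback suites.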
    $115k-154k yearly est. 3d ago
  • Sr. Software Engineer (NO H1B OR C2C) - Major Entertainment Company

    Techlink Resources, Inc. 4.5 company rating

    Data engineer job in Los Angeles, CA

    Senior Software Engineer - Ad Platform Machine Learning We're looking for a Senior Software Engineer to join our Ad Platform Decisioning & Machine Learning Platform team. Our mission is to power the Company's advertising ecosystem with advanced machine learning, AI-driven decisioning, and high-performance backend systems. We build end-to-end solutions that span machine learning, large-scale data processing, experimentation platforms, and microservices, all to improve ad relevance, performance, and efficiency. If you're passionate about ML technologies, backend engineering, and solving complex problems in a fast-moving environment, this is an exciting opportunity to make a direct impact on next-generation ad decisioning systems. What You'll Do Build next-generation experimentation platforms for ad decisioning and large-scale A/B testing Develop simulation platforms that apply state-of-the-art ML and optimization techniques to improve ad performance Design and implement scalable approaches for large-scale data analysis Work closely with researchers to productize cutting-edge ML innovations Architect distributed systems with a focus on performance, scalability, and flexibility Champion engineering best practices including CI/CD, design patterns, automated testing, and strong code quality Contribute to all phases of the software lifecycle: design, experimentation, implementation, and testing Partner with product managers, program managers, SDETs, and researchers in a collaborative and innovative environment Basic Qualifications 4+ years of professional programming and software design experience (Java, Python, Scala, etc.) 
Experience building highly available, scalable microservices Strong understanding of system architecture and application design Knowledge of big data technologies and large-scale data processing Passion for understanding the ad business and driving innovation Enthusiastic about technology and comfortable working across disciplines Preferred Qualifications Domain knowledge in digital advertising Familiarity with AI/ML technologies and common ML tech stacks Experience with big data and workflow tools such as Airflow or Databricks Education Bachelor's degree plus 5+ years of relevant industry experience Role Scope You'll support ongoing initiatives across the ad platform, including building new experimentation and simulation systems used for online A/B testing. Media industry experience is not required. Technical Environment Java & Spring Boot for backend microservices AWS as the primary cloud environment Python & Scala for data pipelines running on Spark and Airflow Candidates should be strong in either backend microservices or data pipeline development and open to learning the other. API development experience is required Interview Process Round 1: Technical & coding evaluation (1 hour) Round 2: Technical + behavioral interview (1 hour) Candidates are assessed on technical strength and eagerness to learn.
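A common building block behind the large-scale A/B testing this posting describes is deterministic, hash-based user bucketing. This is a generic sketch of the technique, not this team's actual system; the experiment name and split percentage are illustrative assumptions.

```python
import hashlib

# Hedged sketch of deterministic experiment bucketing, a standard
# pattern in A/B testing platforms. Names and split are illustrative.

def assign_variant(user_id: str, experiment: str, treatment_pct: int = 50) -> str:
    """Hash (experiment, user) into a 0-99 bucket and map it to a variant.

    Hashing makes the assignment sticky: the same user always lands in
    the same bucket for a given experiment, with no storage required,
    while different experiments split the population independently.
    """
    key = f"{experiment}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 100
    return "treatment" if bucket < treatment_pct else "control"

# Stickiness: repeated calls for the same user and experiment agree.
assert assign_variant("user-42", "ad_ranker_v2") == assign_variant("user-42", "ad_ranker_v2")
print(assign_variant("user-42", "ad_ranker_v2"))
```

Keying the hash on the experiment name as well as the user ID is the detail that keeps concurrent experiments statistically independent of one another.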
    $113k-148k yearly est. 2d ago
  • Workday Integration Senior Developer

    Tata Consultancy Services 4.3 company rating

    Data engineer job in Culver City, CA

    Must Have Technical/Functional Skills Design, development, testing and deployment of Workday integrations using EIB, Core Connector, SSK, XSLT 3.0 and Workday Studio. Experience in designing and developing complex reports as required for some integrations as well as testing and support. Roles & Responsibilities Designing, developing, and maintaining: Workday dashboards, apps, reports, and integrations Testing and troubleshooting: Workday integrations Collaborating with stakeholders: Define business requirements and pain points with stakeholders in finance, accounting, payroll, and legal Contributing to data and analytics strategy: Contribute to and execute on the data and analytics strategy for human resources Integrating Workday: Integrate Workday with other technologies and vendor systems Building collateral: Participate in knowledge capture sessions and help build HP delivery collateral Adding new features: Contribute functional expertise to the Workday product team by adding new features and workflows Minimum 7+ years of experience with Workday Integrations required, which includes APIs, EIBs, PECI, PICOF, Workday Studio, Core Connector for Worker and other areas such as Procure to Pay, Accounting 5+ years in Workday Financials implementation and integration. 3+ years in a lead role managing integration projects. Sound understanding of one or more functional modules in Workday is preferred. Certifications in Integration Core, CCTPP and Studio are a plus. Excellent communication skills, both written and verbal. Ability to effectively present information to internal and external associates Demonstrated ability to organize and prioritize projects in a fast-paced and deadline-oriented business environment Generic Managerial Skills 3+ years in a lead role managing integration projects Salary Range $120,000 - $160,000 a year
    $120k-160k yearly 2d ago
  • ServiceNow CMDB Engineer

    Summit Tech Partners 3.5 company rating

    Data engineer job in Irvine, CA

    Employment Type: Full-Time, Direct Hire (W2 Only - No sponsorship available) About the Role We're seeking a skilled and driven ServiceNow CMDB Engineer to join our team in Irvine, CA. This is a hands-on, onsite role focused on designing, implementing, and maintaining a robust Configuration Management Database (CMDB) aligned with ServiceNow's Common Service Data Model (CSDM). You'll play a critical role in enhancing IT operations, asset management, and service delivery across the enterprise. Responsibilities Architect, configure, and maintain the ServiceNow CMDB to support ITOM and ITAM initiatives Implement and optimize CSDM frameworks to ensure data integrity and alignment with business services Collaborate with cross-functional teams to define CI classes, relationships, and lifecycle processes Develop and enforce CMDB governance, data quality standards, and reconciliation rules Integrate CMDB with discovery tools and external data sources Support audits, compliance, and reporting requirements related to ITIL processes Troubleshoot and resolve CMDB-related issues and performance bottlenecks Qualifications 3+ years of hands-on experience with ServiceNow CMDB and CSDM implementation Strong understanding of ITIL practices and ITOM/ITAM modules Proven ability to manage CI lifecycle and maintain data accuracy Experience with ServiceNow Discovery, Service Mapping, and integrations ServiceNow Certified System Administrator (CSA) or higher certifications preferred Excellent communication and documentation skills Must be authorized to work in the U.S. without sponsorship Perks & Benefits Competitive compensation package Collaborative and innovative work environment Opportunity to work with cutting-edge ServiceNow technologies
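The CMDB governance and data-quality work above can be illustrated with a toy reconciliation rule: flag configuration items (CIs) that participate in no relationship. The CI names, class labels, and relationship model are simplified illustrations, not ServiceNow's actual schema.

```python
# Hedged sketch of a CMDB data-quality rule: find "orphaned" CIs that
# never appear in any relationship. Names and classes are invented.

cis = {
    "web-01": {"class": "cmdb_ci_server"},
    "db-01": {"class": "cmdb_ci_database"},
    "old-vm": {"class": "cmdb_ci_server"},
}
# (parent, child) relationship pairs, e.g. a service "depends on" a CI.
relationships = [
    ("checkout-service", "web-01"),
    ("checkout-service", "db-01"),
]

def orphaned_cis(cis, relationships):
    """Return CI names that appear in no relationship, sorted for stable output."""
    related = {ci for pair in relationships for ci in pair}
    return sorted(name for name in cis if name not in related)

print(orphaned_cis(cis, relationships))  # ['old-vm']
```

A real implementation would query CI and relationship tables and feed results into reconciliation dashboards, but the governance idea is the same: every CI should trace to a business service.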
    $88k-121k yearly est. 5d ago
  • Software Engineer

    Plug 3.8 company rating

    Data engineer job in Santa Monica, CA

    Plug is the only wholesale platform built exclusively for used electric vehicles. Designed for dealers and commercial consignors, Plug combines EV-specific data, systems and expertise to bring clarity and confidence to the wholesale buying and selling process. With the addition of Trade Desk™, dealers can quickly receive cash offers or list EV trade-ins directly into the auction, removing friction and maximizing returns. By replacing outdated wholesale methods with tools tailored to EVs, Plug empowers dealers to make faster and more profitable decisions with a partner they can trust. For more information, visit ***************** The Opportunity This is an on site role in Santa Monica, CA. We are looking for a Software Engineer to join our growing team! A full-stack software engineer who will report directly to our CTO, and who will own entire customer-facing products. We're building systems like multi-modal AI-enabled data onramps for EVs, near-real time API connectivity to the vehicles, and pricing intelligence tooling. As a member of the team you'll help lay the technical and product foundation for our growing business. We're building a culture that cares about collaboration, encourages intellectual honesty, celebrates technical excellence, and is driven by careful attention to detail and planning for the future. We believe diversity of perspective and experience are key to building great technology and a thriving team. Sound cool? Let's work together. Key Responsibilities Collaborate with colleagues and be a strong voice in product design sessions, architecture discussions, and code reviews. Design, implement, test, debug, and document work on new and existing software features and products, ensuring they meet business, quality, and operational needs. Write clear, efficient, and scalable code with an eye towards flexibility and maintainability. 
Take ownership of features and products, and support their planning and development by understanding the ultimate goal and evaluating effort, risk, and priority in an agile environment. Own and contribute to team productivity and process improvements. Use and develop APIs to create integrations between Plug and 3rd party platforms. Be an integral part of a close team of developers; this is an opportunity to help shape a nascent team culture. The ideal candidate will be a high-growth individual able to grow their career as the team grows. Qualifications 4-6 years of hands-on experience developing technical solutions Advanced understanding of web application technologies, both backend and frontend as well as relational databases. Familiarity with Cloud PaaS deployments. Familiarity with TypeScript or any other modern typed language. Familiarity with and positive disposition toward code generation AI tooling. Strong analytical and quantitative skills. Strong verbal and written communication skills with a focus on conciseness. A self-directed drive to deliver end-to-end solutions with measurable goals and results. Understanding and accepting of the ever-changing controlled chaos that is an early startup, and willing to work within that chaos to improve processes and outcomes. Experience balancing contending priorities and collaborating with colleagues to reach workable compromises. A proven track record of gaining trust and respect by consistently demonstrating sound critical-thinking and a risk-adjusted bias toward action. You pride yourself on having excellent reliability and integrity. Extraordinary grit; smart, creative, and persistent personality. Authorized to work in the US for any employer. Having worked in automotive or EV systems is a plus. Compensation and Benefits Annual Salary: 130K - 150K Equity: TBD Benefits: Health, vision, and dental insurance. Lunch stipend. Parking. This full-time position is based in Santa Monica, CA. 
We welcome candidates from all locations to apply, provided they are willing to relocate for the role. Relocation assistance will not be provided for successful candidates. Sponsorship not available at this time. Plug is an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. And if you do, you suck.
    $108k-148k yearly est. 2d ago
  • Senior Software Engineer - Full Stack & DevOps

    Beacon Healthcare Systems 4.5 company rating

    Data engineer job in Huntington Beach, CA

    We're seeking a Senior Software Engineer who thrives at the intersection of application development and DevOps. You'll design, build, and deploy scalable SaaS solutions for Medicare and Medicaid health plans, while also contributing to the automation, reliability, and security of our development lifecycle. This role is central to delivering high-quality features for our Compliance, Appeals & Grievances, and Universe Scrubber products. Key Responsibilities: · Application Development Design and implement backend services, APIs, and user interfaces using modern frameworks and cloud-native architecture. Ensure performance, scalability, and maintainability across the stack. · DevOps Integration Collaborate with infrastructure and DevOps teams to build and maintain CI/CD pipelines, automate deployments, and optimize environment provisioning across development, QA, and production. · Cloud-Native Engineering Develop and deploy applications on AWS, leveraging services like Lambda, ECS, RDS, and S3. Ensure solutions are secure, resilient, and compliant with healthcare regulations. · Quality & Compliance Write clean, testable code and participate in peer reviews, unit testing, and performance tuning. Ensure all software adheres to CMS, HIPAA, and internal compliance standards. · AI-Enabled Features Support integration of AI/ML capabilities into product workflows, such as intelligent routing of grievances or automated compliance checks. · Mentorship & Collaboration Provide technical guidance to junior engineers and collaborate with cross-functional teams to translate healthcare business needs into technical solutions. 
Qualifications: Bachelor's degree in computer science or related field 5+ years of experience in software development, with exposure to DevOps practices Proficiency in languages such as Java, Python, or C#, and experience with cloud platforms (preferably AWS) Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions), infrastructure-as-code (e.g., Terraform, Ansible), and containerization (e.g., Docker, Kubernetes) Understanding of healthcare data formats (EDI, HL7, FHIR) and regulatory frameworks
    $112k-147k yearly est. 2d ago
  • DevOps Engineer

    Sonata Software

    Data engineer job in Westlake Village, CA

    In today's market, there is a unique duality in technology adoption. On one side, extreme focus on cost containment by clients, and on the other, deep motivation to modernize their Digital storefronts to attract more consumers and B2B customers. As a leading Modernization Engineering company, we aim to deliver modernization-driven hypergrowth for our clients based on the deep differentiation we have created in Modernization Engineering, powered by our Lightening suite and 16-step Platformation™ playbook. In addition, we bring agility and systems thinking to accelerate time to market for our clients. Headquartered in Bengaluru, India, Sonata has a strong global presence, including key regions in the US, UK, Europe, APAC, and ANZ. We are a trusted partner of world-leading companies in BFSI (Banking, Financial Services, and Insurance), HLS (Healthcare and Lifesciences), TMT (Telecom, Media, and Technology), Retail & CPG, and Manufacturing space. Our bouquet of Modernization Engineering Services cuts across Cloud, Data, Dynamics, Contact Centers, and around newer technologies like Generative AI, MS Fabric, and other modernization platforms. Job Role: Sr. DevOps Engineer, Platforms Work Location: Westlake Village, CA (5 Days Onsite) Duration: Contract to Hire Job Description: Responsibilities: Design, implement, and manage scalable and resilient infrastructure on AWS. Architect and maintain Windows/Linux based environments, ensuring seamless integration with cloud platforms. Develop and maintain infrastructure-as-code (IaC) using both AWS CloudFormation/CDK and Terraform. Develop and maintain Configuration Management for Windows servers using Chef. Design, build, and optimize CI/CD pipelines using GitLab CI/CD for .NET applications. Implement and enforce security best practices across the infrastructure and deployment processes. Collaborate closely with development teams to understand their needs and provide DevOps expertise. 
Troubleshoot and resolve infrastructure and application deployment issues. Implement and manage monitoring and logging solutions to ensure system visibility and proactive issue detection. Clearly and concisely contribute to the development and documentation of DevOps standards and best practices. Stay up-to-date with the latest industry trends and technologies in cloud computing, DevOps, and security. Provide mentorship and guidance to junior team members. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 5+ years of experience in a DevOps or Site Reliability Engineering (SRE) role. Extensive hands-on experience with Amazon Web Services (AWS) Solid understanding of Windows/Linux Server administration and integration with cloud environments. Proven experience with infrastructure-as-code tools, specifically AWS CDK and Terraform. Strong experience designing and implementing CI/CD pipelines using GitLab CI/CD. Experience deploying and managing .NET applications in cloud environments. Deep understanding of security best practices and their implementation in cloud infrastructure and CI/CD pipelines. Solid understanding of networking principles (TCP/IP, DNS, load balancing, firewalls) in cloud environments. Experience with monitoring and logging tools (e.g., New Relic, CloudWatch, Cloud Logging, Prometheus). Strong scripting skills (e.g., PowerShell, Python, Ruby, Bash). Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills. Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus. Relevant AWS and/or GCP certifications are a plus. 
Experience with the configuration management tool Chef Preferred Qualifications Knowledge of and a strong understanding of PowerShell and Python scripting Strong background with AWS EC2 features and Services (Autoscaling and WarmPools) Understanding of Windows server build process using tools like Chocolatey for packages and Packer for AMI/Image generation. Extensive hands-on experience with Amazon Web Services (AWS) Why join Sonata Software? At Sonata, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you will not be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next. Sonata Software is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity, age, religion, disability, sexual orientation, veteran status, marital status, or any other characteristics protected by law.
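The infrastructure-as-code responsibility above boils down to generating infrastructure definitions programmatically instead of hand-editing them. As a minimal, hedged sketch of that idea, the function below emits a CloudFormation-style template for a single S3 bucket; the logical ID and properties are illustrative, and a real stack would be authored with AWS CDK or Terraform as the posting describes.

```python
import json

# Hedged sketch of infrastructure-as-code: build a minimal
# CloudFormation-style template as a Python data structure, so it can
# be parameterized, reviewed, and diffed like any other code.

def make_bucket_template(bucket_logical_id: str, versioned: bool) -> dict:
    """Return a template dict declaring one S3 bucket (illustrative only)."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            bucket_logical_id: {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "VersioningConfiguration": {
                        "Status": "Enabled" if versioned else "Suspended"
                    }
                },
            }
        },
    }

template = make_bucket_template("ArtifactBucket", versioned=True)
print(json.dumps(template, indent=2))
```

The payoff of this style is that the same function can stamp out consistent resources across development, QA, and production instead of three drifting hand-written copies.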
    $101k-137k yearly est. 3d ago
  • DevOps Engineer

    Akkodis

    Data engineer job in Westlake Village, CA

    At Akkodis, we use our insight, knowledge, and global resources to make exceptional connections every day. With 60 branch offices located strategically throughout North America, we are positioned perfectly to deliver the industry's top talent to each of our clients. Clients choose Akkodis as their workforce partner to solve staffing challenges that range from locating hard-to-find niche talent to completing quick-fill demands. Akkodis is seeking a DevOps Engineer with a Westlake Village, CA-based client to join their team. JOB TITLE: DevOps Engineer EMPLOYMENT TYPE/DURATION: Contract role - 6 months + (possible conversion to FTE) COMPENSATION: Pay rate $60 - $62.50/hour LOCATION DETAILS: On-Site Mon-Fri 8am-5pm PST, Westlake Village, CA Top Required Skills: 3+ years of experience in a DevOps or Site Reliability Engineering (SRE) role. Extensive hands-on experience with Amazon Web Services (AWS) Solid understanding of Windows/Linux Server administration and integration with cloud environments. Proven experience with infrastructure-as-code (IaC) tools, specifically Terraform (OpenTofu) and AWS CDK. Strong experience designing and implementing CI/CD pipelines using GitLab CI/CD. Experience deploying and managing .NET applications in cloud environments. We're looking for an experienced, forward-thinking engineer to strengthen our DevOps capabilities across AWS and Windows/Linux environments. In this role, you'll drive the design and evolution of scalable, secure, and automated infrastructure to support our Infrastructure and Application stack. You'll work closely with development teams to streamline CI/CD pipelines, embed security best practices, and champion infrastructure-as-code. If you're passionate about automation, cloud-native patterns, and making systems run smarter and faster, we want to hear from you. Design, implement, and manage scalable and resilient infrastructure on AWS. 
Architect and maintain Windows/Linux based environments, ensuring seamless integration with cloud platforms. Develop and maintain infrastructure-as-code (IaC) using both AWS CloudFormation/CDK and Terraform. Develop and maintain Configuration Management for Windows servers using Chef. Design, build, and optimize CI/CD pipelines using GitLab CI/CD for .NET applications. Implement and enforce security best practices across the infrastructure and deployment processes. Collaborate closely with development teams to understand their needs and provide DevOps expertise. Troubleshoot and resolve infrastructure and application deployment issues. Implement and manage monitoring and logging solutions to ensure system visibility and proactive issue detection. Contribute clearly and concisely to the development and documentation of DevOps standards and best practices. Stay up to date with the latest industry trends and technologies in cloud computing, DevOps, and security. Provide mentorship and guidance to junior team members. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 3+ years of experience in a DevOps or Site Reliability Engineering (SRE) role. Extensive hands-on experience with Amazon Web Services (AWS) Solid understanding of Windows/Linux Server administration and integration with cloud environments. Proven experience with infrastructure-as-code (IaC) tools, specifically Terraform (OpenTofu) and AWS CDK. Strong experience designing and implementing CI/CD pipelines using GitLab CI/CD. Experience deploying and managing .NET applications in cloud environments. Deep understanding of security best practices and their implementation in cloud infrastructure and CI/CD pipelines. Solid understanding of networking principles (TCP/IP, DNS, load balancing, firewalls) in cloud environments. Experience with monitoring and logging tools (e.g., New Relic, CloudWatch, Cloud Logging, Prometheus). 
Strong scripting skills (e.g., Python, Ruby, PowerShell, Bash). Experience with the configuration management tool Chef Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills. Preferred Qualifications Experience with containerization & orchestration technologies (e.g., Docker, Kubernetes) is a plus. Relevant AWS and/or GCP certifications are a plus. Strong understanding of PowerShell and Python scripting Strong background with AWS EC2 features and Services (Autoscaling and WarmPools) Understanding of Windows server build process using tools like Chocolatey for packages and Packer for AMI/Image generation. Solid experience with the Windows server operating system and server tools such as IIS. If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis go to **************** If you have questions about the position, please contact Dana More at ************************** Equal Opportunity Employer/Veterans/Disabled Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits and 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit ********************************************** The Company will consider qualified applicants with arrest and conviction record.
    $60-62.5 hourly 2d ago
  • Senior Gameplay Engineer

    The Walt Disney Company 4.6 company rating

    Data engineer job in Burbank, CA

    This is not a remote role. You must be in the local area or be willing to relocate. About the Role The Office of Technology Enablement (OTE), a division of The Walt Disney Company, is embarking on a mission to build the future of XR experiences across all Disney segments. Working with top-class industry talent, this role is perfect for the accomplished game engineer looking to create something magical - from focused R&D projects to working with different business units on new ways to experience the characters and stories you love. OTE is looking for a uniquely talented Senior Gameplay Engineer to join us on this ambitious and exciting initiative. If you are an experienced game programmer with a love of Disney properties, you'll want to check out this opportunity! This role will report to the Director of Development. What you Will Do: Leverage your knowledge to help implement a collection of interactive experiences based on Disney's robust portfolio of characters and worlds. Be an active, hands-on participant in the process, directly writing code and working daily with design/production/art to establish and achieve goals for each interactive experience. A significant portion of this work will involve implementation using Unreal. This role will require a willingness and ability to operate within the limitations of that ecosystem and grow with it as the functionality matures. Empower designers by serving as their main support avenue during the development process. Find creative ways to overcome hardware limitations, maintaining a positive outlook along the way. Work closely with other members of the engineering and creative teams to ensure that implementation quality is maintained. Be an advocate of stability and flexibility. Champion Disney and team values. Maintain a 'guest-first' mentality by being an advocate for the audience experience. Serve as a key team member of a growing XR development team at Disney. 
Required Qualifications & Skills
  • 5 years of experience developing console/PC/mobile games or other digital interactive entertainment.
  • Experience with Unreal Engine 4/5+ at the native (C++) level.
  • Participated in the creation and release of a commercial product, in a hands-on programming role.
  • Was one of the main authors of a major gameplay system.
  • Served as a programmer during the prototype phase of a project; understands the difference in requirements/goals between prototyping and production.
  • Understands and implements the following concepts at production quality:
    • C++ code (Performance Impact, Memory Management, Inheritance, etc.)
    • Client/Server architecture (Replication, Client-side Prediction, Movement Syncing, etc.)
    • Game Mathematics (Linear Algebra, Vector Math, Kinematic Physics, Collision, etc.)
  • Ability to guide engineers.
  • A Bachelor's degree in Computer Science or an equivalent combination of education and experience.

Preferred Qualifications
  • Shipped products for Apple Vision Pro, Meta Quest, or other VR platforms.
  • Graphics Programming, mobile experience, and familiarity with Online Services are all bonuses.
  • Experience developing and publishing Unreal content.
  • A broad understanding of XR experiences and the various devices available on the market.

#DISNEYTECH

The hiring range for this position in Burbank, CA is $141,900.00-$190,300.00 per year. The base pay actually offered will take into account internal equity and may vary depending on the candidate's geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.
Job Posting Segment: TWDSTECH Job Posting Primary Business: Technology Enablement Primary Job Posting Category: Games-Engineering Employment Type: Full time Primary City, State, Region, Postal Code: Burbank, CA, USA Alternate City, State, Region, Postal Code: Date Posted: 2025-12-20
    $141.9k-190.3k yearly 6d ago
  • Azure & Microsoft Fabric Data Engineer & Architect

    Stand 8 Technology Consulting

    Data engineer job in Los Angeles, CA

    STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, and Atlanta, and internationally in Mexico and India. Our global solutions team is seeking an Azure & Microsoft Fabric Data Engineer/Architect to support and lead our Media & Entertainment client in building a next-generation financial data platform. We're looking for someone who can contribute strategically while also having the hands-on skill to provide subject matter expertise. In this role, you'll design and build enterprise-level data models, lead data migration efforts into Azure, and develop cutting-edge data processing pipelines using Microsoft Fabric. If you thrive at the intersection of architecture and hands-on engineering, and want to help shape a modern financial system with complex upstream data processing - this is the opportunity for you! This is a hybrid position requiring 3-4 days per week onsite at a Burbank / Studio City adjacent location. We are setting up interviews immediately and look forward to hearing from you!

Responsibilities
  • Architect, design, and hands-on develop end-to-end data solutions using Azure Data Services and Microsoft Fabric.
  • Build and maintain complex data models in Azure SQL, Lakehouse, and Fabric environments that support advanced financial calculations.
  • Lead and execute data migration efforts from multiple upstream and legacy systems into Azure and Fabric.
  • Develop, optimize, and maintain ETL/ELT pipelines using Microsoft Fabric Data Pipelines, Data Factory, and Azure engineering tools.
  • Perform hands-on SQL development, including stored procedures, query optimization, performance tuning, and data transformation logic.
  • Partner with finance, engineering, and product stakeholders to translate requirements into scalable, maintainable data solutions.
  • Ensure data quality, lineage, profiling, and governance across ingestion and transformation layers.
  • Tune and optimize Azure SQL databases and Fabric Lakehouse environments for performance and cost efficiency.
  • Troubleshoot data processing and pipeline issues to maintain stability and reliability.
  • Document architecture, data flows, engineering standards, and best practices.

Qualifications
  • Expert, hands-on experience with Azure Data Services (Azure SQL, Data Factory, Data Lake Storage, Synapse, Azure Storage).
  • Deep working knowledge of Microsoft Fabric, including Data Engineering workloads, Lakehouse, Fabric SQL, Pipelines, and governance.
  • Strong experience designing and building data models within Azure SQL and Fabric architectures.
  • Proven track record delivering large-scale data migrations into Azure environments.
  • Advanced proficiency in SQL/T-SQL, including stored procedures, indexing, and performance tuning.
  • Demonstrated success building and optimizing ETL/ELT pipelines for complex financial or multi-source datasets.
  • Understanding of financial systems, data structures, and complex calculation logic.
  • Excellent communication and documentation skills, with the ability to collaborate across technical and business teams.

Additional Details
The base range for this contract position is $70-85 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.

Benefits
  • Medical coverage and Health Savings Account (HSA) through Anthem
  • Dental/Vision/Various Ancillary coverages through Unum
  • 401(k) retirement savings plan
  • Company-paid Employee Assistance Program (EAP)
  • Discount programs through ADP WorkforceNow

About Us
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally, with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees. Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY. Check out more at ************** and reach out today to explore opportunities to grow together! By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
    $70-85 hourly 5d ago

Learn more about data engineer jobs

How much does a data engineer earn in Lawndale, CA?

The average data engineer in Lawndale, CA earns between $86,000 and $164,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Lawndale, CA

$119,000

What are the biggest employers of Data Engineers in Lawndale, CA?

The biggest employers of Data Engineers in Lawndale, CA are:
  1. Robert Half
  2. Ernst & Young
  3. Internet Brands
  4. Sony Pictures
  5. Contact Government Services, LLC
  6. EY Studio+ Nederland
  7. WMSN FOX 47 News, Madison
  8. Salted
  9. Slalom
  10. StubHub