Data Scientist
Data engineer job in Long Beach, CA
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including international offices in Mexico and India. We are seeking a highly analytical and technically skilled Data Scientist to transform complex, multi-source data into unified, actionable insights used for executive reporting and decision-making.
This role requires expertise in business intelligence design, data modeling, metadata management, data integrity validation, and the development of dashboards, reports, and analytics used across operational and strategic environments.
The ideal candidate thrives in a fast-paced environment, demonstrates strong investigative skills, and can collaborate effectively with technical teams, business stakeholders, and leadership.
Essential Duties & Responsibilities
As a Data Scientist, participate across the full solution lifecycle: business case, planning, design, development, testing, migration, and production support.
Analyze large and complex datasets with accuracy and attention to detail.
Collaborate with users to develop effective metadata and data relationships.
Identify reporting and dashboard requirements across business units.
Determine strategic placement of business logic within ETL or metadata models.
Build enterprise data warehouse metadata/semantic models.
Design and develop unified dashboards, reports, and data extractions from multiple data sources.
Develop and execute testing methodologies for reports and metadata models.
Document BI architecture, data lineage, and project report requirements.
Provide technical specifications and data definitions to support the enterprise data dictionary.
Apply analytical skills and Data Science techniques to understand business processes, financial calculations, data flows, and application interactions.
Identify and implement improvements, workarounds, or alternative solutions related to ETL processes, ensuring integrity and timeliness.
Create UI components or portal elements (e.g., SharePoint) for dynamic or interactive stakeholder reporting.
As a Data Scientist, extract and process SQL database information to build Power BI or Tableau reports (including reports for cybersecurity awareness campaigns).
Utilize SQL, Python, R, or similar languages for data analysis and modeling.
Support process optimization through advanced modeling, leveraging experience as a Data Scientist where needed.
Required Knowledge & Attributes
Highly self-motivated with strong organizational skills and the ability to manage multiple verbal and written assignments.
Experience collaborating across organizational boundaries for data sourcing and usage.
Analytical understanding of business processes, forecasting, capacity planning, and data governance.
Proficient with BI tools (Power BI, Tableau, PBIRS, SSRS, SSAS).
Strong Microsoft Office skills (Word, Excel, Visio, PowerPoint).
High attention to detail and accuracy.
Ability to work independently, demonstrate ownership, and ensure high-quality outcomes.
Strong communication, interpersonal, and stakeholder engagement skills.
Deep understanding that data integrity and consistency are essential for adoption and trust.
Ability to shift priorities and adapt within fast-paced environments.
Required Education & Experience
Bachelor's degree in Computer Science, Mathematics, or Statistics (or equivalent experience).
3+ years of BI development experience.
3+ years with Power BI and supporting Microsoft stack tools (SharePoint 2019, PBIRS/SSRS, Excel 2019/2021).
3+ years of experience with SDLC/project lifecycle processes.
3+ years of experience with data warehousing methodologies (ETL, Data Modeling).
3+ years of VBA experience in Excel and Access.
Strong ability to write SQL queries and work with SQL Server 2017-2022.
Experience with BI tools including PBIRS, SSRS, SSAS, Tableau.
Strong analytical skills in business processes, financial modeling, forecasting, and data flow understanding.
Critical thinking and problem-solving capabilities.
Experience producing high-quality technical documentation and presentations.
Excellent communication and presentation skills, with the ability to explain insights to leadership and business teams.
Benefits
Medical coverage and Health Savings Account (HSA) through Anthem
Dental/Vision/Various Ancillary coverages through Unum
401(k) retirement savings plan
Paid-time-off options
Company-paid Employee Assistance Program (EAP)
Discount programs through ADP WorkforceNow
Additional Details
The base range for this contract position is $73 - $83 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.
About Us
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees.
Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY.
Check out more at ************** and reach out today to explore opportunities to grow together!
By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
Principal Data Scientist
Data engineer job in Alhambra, CA
The Principal Data Scientist works to establish a comprehensive Data Science Program to advance data-driven decision-making, streamline operations, and fully leverage modern platforms such as Databricks to meet increasing demand for predictive analytics and AI solutions.
The Principal Data Scientist will guide program development, provide training and mentorship to junior members of the team, accelerate adoption of advanced analytics, and build internal capacity through structured mentorship.
The Principal Data Scientist will possess:
Exceptional communication abilities, both verbal and written, with a strong customer service mindset and the ability to translate complex concepts into clear, actionable insights.
Strong analytical and business acumen, including foundational experience with regression, association analysis, outlier detection, and core data analysis principles.
Working knowledge of database design and organization, with the ability to partner effectively with Data Management and Data Engineering teams.
Outstanding time management and organizational skills, with demonstrated success managing multiple priorities and deliverables in parallel.
A highly collaborative work style, coupled with the ability to operate independently, maintain focus, and drive projects forward with minimal oversight.
A meticulous approach to quality, ensuring accuracy, reliability, and consistency in all deliverables.
Proven mentorship capabilities, including the ability to guide, coach, and upskill junior data scientists and analysts.
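As a hedged illustration of one technique named above, a minimal z-score outlier check might look like the following sketch (the function name, threshold, and data are hypothetical):

```python
from statistics import mean, stdev

def zscore_outliers(values, threshold=2.0):
    """Return values more than `threshold` sample standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    if s == 0:
        return []
    return [v for v in values if abs(v - m) / s > threshold]

# Hypothetical daily metric values with one obvious anomaly
daily_counts = [10, 11, 9, 10, 12, 10, 11, 100]
print(zscore_outliers(daily_counts))
```

Points beyond the chosen number of sample standard deviations from the mean are flagged; production work would typically use more robust methods (e.g., median-based scores).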
5+ years of professional experience leading data science initiatives, including developing machine learning models, statistical analyses, and end-to-end data science workflows in production environments.
3+ years of experience working with Databricks and similar cloud-based analytics platforms, including notebook development, feature engineering, ML model training, and workflow orchestration.
3+ years of experience applying advanced analytics and predictive modeling (e.g., regression, classification, clustering, forecasting, natural language processing).
2+ years of experience implementing MLOps practices, such as model versioning, CI/CD for ML, MLflow, automated pipelines, and model performance monitoring.
2+ years of experience collaborating with data engineering teams to design data pipelines, optimize data transformations, and implement Lakehouse or data warehouse architectures (e.g., Databricks, Snowflake, SQL-based platforms).
2+ years of experience mentoring or supervising junior data scientists or analysts, including code reviews, training, and structured skill development.
2+ years of experience with Python and SQL programming, using data sources such as SQL Server, Oracle, PostgreSQL, or similar relational databases.
1+ year of experience operationalizing analytics within enterprise governance frameworks, partnering with Data Management, Security, and IT to ensure compliance, reproducibility, and best practices.
Education:
This classification requires possession of a Master's degree or higher in Data Science, Statistics, Computer Science, or a closely related field. Additional qualifying professional experience may be substituted for the required education on a year-for-year basis.
At least one of the following industry-recognized certifications in data science or cloud analytics:
• Microsoft Azure Data Scientist Associate (DP-100)
• Databricks Certified Data Scientist or Machine Learning Professional
• AWS Machine Learning Specialty
• Google Professional Data Engineer
• or equivalent advanced analytics certifications
The certification is required and may not be substituted with additional experience.
Senior Data Engineer
Data engineer job in Los Angeles, CA
Robert Half is partnering with a well-known brand seeking an experienced Data Engineer with Databricks experience. Working alongside data scientists and software developers, your work will directly impact dynamic pricing strategies by ensuring the availability, accuracy, and scalability of data systems. This position is full-time with full benefits and 3 days onsite in the Woodland Hills, CA area.
Responsibilities:
Design, build, and maintain scalable data pipelines for dynamic pricing models.
Collaborate with data scientists to prepare data for model training, validation, and deployment.
Develop and optimize ETL processes to ensure data quality and reliability.
Monitor and troubleshoot data workflows for continuous integration and performance.
Partner with software engineers to embed data solutions into product architecture.
Ensure compliance with data governance, privacy, and security standards.
Translate stakeholder requirements into technical specifications.
Document processes and contribute to data engineering best practices.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
4+ years of experience in data engineering, data warehousing, and big data technologies.
Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
Must have experience in Databricks.
Experience working within Azure or AWS or GCP environment.
Familiarity with big data tools such as Spark or Hadoop.
Experience with real-time data pipeline tools.
Experience with Python.
Senior Data Engineer
Data engineer job in Glendale, CA
City: Glendale, CA
Onsite/ Hybrid/ Remote: Hybrid (3 days a week onsite, Friday - Remote)
Duration: 12 months
Rate Range: Up to $85/hr on W2, depending on experience (no C2C, 1099, or sub-contract)
Work Authorization: GC, USC, All valid EADs except OPT, CPT, H1B
Must Have:
⢠5+ years Data Engineering
⢠Airflow
⢠Spark DataFrame API
⢠Databricks
⢠SQL
⢠API integration
⢠AWS
⢠Python or Java or Scala
Responsibilities:
⢠Maintain, update, and expand Core Data platform pipelines.
⢠Build tools for data discovery, lineage, governance, and privacy.
⢠Partner with engineering and cross-functional teams to deliver scalable solutions.
⢠Use Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS to build and optimize workflows.
⢠Support platform standards, best practices, and documentation.
⢠Ensure data quality, reliability, and SLA adherence across datasets.
⢠Participate in Agile ceremonies and continuous process improvement.
⢠Work with internal customers to understand needs and prioritize enhancements.
⢠Maintain detailed documentation that supports governance and quality.
Qualifications:
⢠5+ years in data engineering with large-scale pipelines.
⢠Strong SQL and one major programming language (Python, Java, or Scala).
⢠Production experience with Spark and Databricks.
⢠Experience ingesting and interacting with API data sources.
⢠Hands-on Airflow orchestration experience.
⢠Experience developing APIs with GraphQL.
⢠Strong AWS knowledge and infrastructure-as-code familiarity.
⢠Understanding of OLTP vs OLAP, data modeling, and data warehousing.
⢠Strong problem-solving and algorithmic skills.
⢠Clear written and verbal communication.
⢠Agile/Scrum experience.
⢠Bachelor's degree in a STEM field or equivalent industry experience.
Lead Data Engineer - (Automotive exp)
Data engineer job in Torrance, CA
Role: Sr Technical Lead
Duration: 12+ Month Contract
Daily Tasks Performed:
Lead the design, development, and deployment of a scalable, secure, and high-performance CDP SaaS product.
Architect solutions that integrate with various data sources, APIs, and third-party platforms.
Design, develop, and optimize complex SQL queries for data extraction, transformation, and analysis.
Build and maintain workflow pipelines using Digdag, integrating with data platforms such as Treasure Data, AWS, or other cloud services.
Automate ETL processes and schedule tasks using Digdag's YAML-based workflow definitions.
Implement data quality checks, logging, and alerting mechanisms within workflows.
Leverage AWS services (e.g., S3, Lambda, Athena) where applicable to enhance data processing and storage capabilities.
Ensure best practices in software engineering, including code reviews, testing, CI/CD, and documentation.
Oversee data privacy, security, and compliance initiatives (e.g., GDPR, CCPA).
Ensure adherence to security, compliance, and data governance requirements.
Oversee development of real-time and batch data processing systems.
Collaborate with cross-functional teams, including data analysts, product managers, and software engineers, to translate business requirements into technical solutions.
Collaborate with stakeholders to define technical requirements, aligning technical solutions with business goals and delivering product features.
Mentor and guide developers, fostering a culture of technical excellence and continuous improvement.
Troubleshoot complex technical issues and provide hands-on support as needed.
Monitor, troubleshoot, and improve data workflows for performance, reliability, and cost-efficiency as needed.
Optimize system performance, scalability, and cost efficiency.
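The Digdag responsibilities above can be sketched as a minimal workflow definition; the file name, script paths, and database name below are hypothetical, not part of the actual role:

```yaml
# daily_etl.dig -- hypothetical Digdag workflow
timezone: America/Los_Angeles

schedule:
  daily>: 02:00:00          # run the pipeline nightly

+extract:
  sh>: python scripts/extract.py      # hypothetical extraction script

+transform:
  td>: queries/transform.sql          # Treasure Data query operator
  database: analytics                 # hypothetical database name

+quality_check:
  sh>: python scripts/validate_counts.py   # data quality gate

_error:
  sh>: python scripts/send_alert.py   # Digdag's hook for alerting on any task failure
```

Tasks are prefixed with `+` and run in order; the `_error` task provides the alerting mechanism the posting describes.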
What this person will be working on:
As the Senior Technical Lead for our Customer Data Platform (CDP), the candidate will define the technical strategy, architecture, and execution of the platform. They will lead the design and delivery of scalable, secure, and high-performing solutions that enable unified customer data management, advanced analytics, and personalized experiences. This role demands deep technical expertise, strong leadership, and a solid understanding of data platforms and modern cloud technologies. It is a pivotal position that supports the CDP vision by mentoring team members and delivering solutions that empower our customers to unify, analyze, and activate their data.
Position Success Criteria (Desired) - 'WANTS'
Bachelor's or Master's degree in Computer Science, Engineering, or related field.
8+ years of software development experience, including 3+ years in a technical leadership role.
Proven experience building and scaling SaaS products, preferably in customer data, marketing technology, or analytics domains.
Extensive hands-on experience with Presto, Hive, and Python.
Strong proficiency in writing complex SQL queries for data extraction, transformation, and analysis.
Familiarity with AWS data services such as S3, Athena, Glue, and Lambda.
Deep understanding of data modeling, ETL pipelines, workflow orchestration, and both real-time and batch data processing.
Experience ensuring data privacy, security, and compliance in SaaS environments.
Knowledge of Customer Data Platforms (CDPs), CDP concepts, and integration with CRM, marketing, and analytics tools.
Excellent communication, leadership, and project management skills.
Experience working with Agile methodologies and DevOps practices.
Ability to thrive in a fast-paced, agile environment.
Collaborative mindset with a proactive approach to problem-solving.
A commitment to staying current with industry trends and emerging technologies relevant to SaaS and customer data platforms.
Data Engineer (AWS Redshift, BI, Python, ETL)
Data engineer job in Manhattan Beach, CA
We are seeking a skilled Data Engineer with strong experience in business intelligence (BI) and data warehouse development to join our team. In this role, you will design, build, and optimize data pipelines and warehouse architectures that support analytics, reporting, and data-driven decision-making. You will work closely with analysts, data scientists, and business stakeholders to ensure reliable, scalable, and high-quality data solutions.
Responsibilities:
Develop and maintain ETL/ELT pipelines for ingesting, transforming, and delivering data.
Design and enhance data warehouse models (star/snowflake schemas) and BI datasets.
Optimize data workflows for performance, scalability, and reliability.
Collaborate with BI teams to support dashboards, reporting, and analytics needs.
Ensure data quality, governance, and documentation across all solutions.
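An ETL/ELT step of the kind described above can be sketched in Python; the table names and schema are hypothetical, with SQLite standing in for the warehouse:

```python
import sqlite3

def run_etl(conn):
    """Minimal extract-transform-load pass over a hypothetical orders source."""
    # Extract: pull raw rows from the (hypothetical) source table
    rows = conn.execute("SELECT order_id, amount_cents FROM raw_orders").fetchall()
    # Transform: drop invalid rows and convert cents to dollars
    clean = [(oid, cents / 100.0) for oid, cents in rows
             if cents is not None and cents >= 0]
    # Load: write into a warehouse-style fact table
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders (order_id INTEGER, amount_usd REAL)")
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)
```

Real pipelines would add logging, incremental loads, and data-quality checks, but the extract-transform-load shape is the same.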
Qualifications:
Proven experience with data engineering tools (SQL, Python, ETL frameworks).
Strong understanding of BI concepts, reporting tools, and dimensional modeling.
Hands-on experience with cloud data platforms (e.g., AWS, Azure, GCP) is a plus.
Excellent problem-solving skills and ability to work in a cross-functional environment.
Data Analytics Engineer
Data engineer job in Irvine, CA
We are seeking a Data Analytics Engineer to join our team, serving as a hybrid Database Administrator, Data Engineer, and Data Analyst responsible for managing core data infrastructure, developing and maintaining ETL pipelines, and delivering high-quality analytics and visual insights to executive stakeholders. This role bridges technical execution with business intelligence, ensuring that data across Salesforce, financial, and operational systems is accurate, accessible, and strategically presented.
Essential Functions
Database Administration: Oversee and maintain database servers, ensuring performance, reliability, and security. Manage user access, backups, and data recovery processes while optimizing queries and database operations.
Data Engineering (ELT): Design, build, and maintain robust ELT pipelines (SQL/DBT or equivalent) to extract, transform, and load data across Salesforce, financial, and operational sources. Ensure data lineage, integrity, and governance throughout all workflows.
Data Modeling & Governance: Design scalable data models and maintain a governed semantic layer and KPI catalog aligned with business objectives. Define data quality checks, SLAs, and lineage standards to reconcile analytics with finance source-of-truth systems.
Analytics & Reporting: Develop and manage executive-facing Tableau dashboards and visualizations covering key lending and operational metrics - including pipeline conversion, production, credit quality, delinquency/charge-offs, DSCR, and LTV distributions.
Presentation & Insights: Translate complex datasets into clear, compelling stories and presentations for leadership and cross-functional teams. Communicate findings through visual reports and executive summaries to drive strategic decisions.
Collaboration & Integration: Partner with Finance, Capital Markets, and Operations to refine KPIs and perform ad-hoc analyses. Collaborate with Engineering to align analytical and operational data, manage integrations, and support system scalability.
Enablement & Training: Conduct training sessions, create documentation, and host data office hours to promote data literacy and empower business users across the organization.
Competencies & Skills
Advanced SQL proficiency with strong data modeling, query optimization, and database administration experience (PostgreSQL, MySQL, or equivalent).
Hands-on experience managing and maintaining database servers and optimizing performance.
Proficiency with ETL/ELT frameworks (DBT, Airflow, or similar) and cloud data stacks (AWS/Azure/GCP).
Strong Tableau skills - parameters, LODs, row-level security, executive-level dashboard design, and storytelling through data.
Experience with Salesforce data structures and ingestion methods.
Proven ability to communicate and present technical data insights to executive and non-technical stakeholders.
Solid understanding of lending/financial analytics (pipeline conversion, delinquency, DSCR, LTV).
Working knowledge of Python for analytics tasks, cohort analysis, and variance reporting.
Familiarity with version control (Git), CI/CD for analytics, and data governance frameworks.
Excellent organizational, documentation, and communication skills with a strong sense of ownership and follow-through.
Education & Experience
Bachelor's degree in Computer Science, Engineering, Information Technology, Data Analytics, or a related field.
3+ years of experience in data analytics, data engineering, or database administration roles.
Experience supporting executive-level reporting and maintaining database infrastructure in a fast-paced environment.
Snowflake/AWS Data Engineer
Data engineer job in Irvine, CA
Sr. Data Engineer
Full Time Direct Hire Job
Hybrid with work location-Irvine, CA.
The Senior Data Engineer will help design and build a modern data platform that supports enterprise analytics, integrations, and AI/ML initiatives. This role focuses on developing scalable data pipelines, modernizing the enterprise data warehouse, and enabling self-service analytics across the organization.
Key Responsibilities
⢠Build and maintain scalable data pipelines using Snowflake, dbt, and Fivetran.
⢠Design and optimize enterprise data models for performance and scalability.
⢠Support data cataloging, lineage, quality, and compliance efforts.
⢠Translate business and analytics requirements into reliable data solutions.
⢠Use AWS (primarily S3) for storage, integration, and platform reliability.
⢠Perform other data engineering tasks as needed.
Required Qualifications
⢠Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field.
⢠5+ years of data engineering experience.
⢠Hands-on expertise with Snowflake, dbt, and Fivetran.
⢠Strong background in data warehousing, dimensional modeling, and SQL.
⢠Experience with AWS (S3) and data governance tools such as Alation or Atlan.
⢠Proficiency in Python for scripting and automation.
⢠Experience with streaming technologies (Kafka, Kinesis, Flink) a plus.
⢠Knowledge of data security and compliance best practices.
⢠Exposure to AI/ML workflows and modern BI tools like Power BI, Tableau, or Looker.
⢠Ability to mentor junior engineers.
Skills
⢠Snowflake
⢠dbt
⢠Fivetran
⢠Data modeling and warehousing
⢠AWS
⢠Data governance
⢠SQL
⢠Python
⢠Strong communication and cross-functional collaboration
⢠Interest in emerging data and AI technologies
Data Engineer
Data engineer job in Irvine, CA
Job Title: Data Engineer
Duration: Direct-Hire Opportunity
We are looking for a Data Engineer who is hands-on, collaborative, and experienced with Microsoft SQL Server, Snowflake, AWS RDS, and MySQL. The ideal candidate has a strong background in data warehousing, data lakes, ETL pipelines, and business intelligence tools.
This role plays a key part in executing data strategy - driving optimization, reliability, and scalable BI capabilities across the organization. It's an excellent opportunity for a data professional who wants to influence architectural direction, contribute technical expertise, and grow within a data-driven company focused on innovation.
Key Responsibilities
Design, develop, and maintain SQL Server and Snowflake data warehouses and data lakes, focusing on performance, governance, and security.
Manage and optimize database solutions within Snowflake, SQL Server, MySQL, and AWS RDS.
Build and enhance ETL pipelines using tools such as Snowpipe, DBT, Boomi, SSIS, and Azure Data Factory.
Utilize data tools such as SSMS, Profiler, Query Store, and Redgate for performance tuning and troubleshooting.
Perform database administration tasks, including backup, restore, and monitoring.
Collaborate with Business Intelligence Developers and Business Analysts on enterprise data projects.
Ensure database integrity, compliance, and adherence to best practices in data security.
Configure and manage data integration and BI tools such as Power BI, Tableau, Power Automate, and scripting languages (Python, R).
Qualifications
Proficiency with Microsoft SQL Server, including advanced T-SQL development and optimization.
7+ years working as a SQL Server Developer/Administrator, with experience in relational and object-oriented databases.
2+ years of experience with Snowflake data warehouse and data lake solutions.
Experience developing pipelines and reporting solutions using Power BI, SSRS, SSIS, Azure Data Factory, or DBT.
Scripting and automation experience using Python, PowerShell, or R.
Familiarity with data integration and analytics tools such as Boomi, Redshift, or Databricks (a plus).
Excellent communication, problem-solving, and organizational skills.
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
Technical Skills
SQL Server / Snowflake / MySQL / AWS RDS
ETL Development (Snowpipe, SSIS, Azure Data Factory, DBT)
BI Tools (Power BI, Tableau)
Python, R, PowerShell
Data Governance & Security Best Practices
Determining compensation for this role (and others) at Vaco/Highspring depends upon a wide array of factors, including but not limited to the individual's skill sets, experience and training, licensure and certifications, office location and other geographic considerations, as well as other business and organizational needs. As required by local law in geographies that mandate salary range disclosure, the salary range for the role is noted in this job posting. The individual may also be eligible for discretionary bonuses and can participate in medical, dental, and vision benefits as well as the company's 401(k) retirement plan. Additional disclaimer: Unless otherwise noted in the job description, the position Vaco/Highspring is filling is occupied. Please note, however, that Vaco/Highspring is regularly asked to provide talent to other organizations. By submitting to this position, you are agreeing to be included in our talent pool for future hiring for similarly qualified positions. Submissions to this position are subject to the use of AI to perform preliminary candidate screenings focused on ensuring minimum job requirements noted in the position are satisfied. Candidates advancing beyond this initial phase will be assessed by Vaco/Highspring recruiters and hiring managers. Vaco/Highspring does not have knowledge of the tools used by its clients in making final hiring decisions and cannot opine on their use of AI products.
Solution Architect - Data Architect AI
Data engineer job in Santa Ana, CA
Required Skills & Qualifications
Strong experience in enterprise data warehousing, BI architecture, and large-scale analytics delivery
Deep understanding of data modeling for analytics (dimensional & semantic layer design)
Hands-on expertise with ETL/ELT patterns and orchestration
Strong knowledge of data governance, metadata management, security models, and IAM for data platforms
Experience with BI tools (preferably Looker or equivalent enterprise-grade platforms)
Awareness of AI/ML techniques for insight automation, anomaly/outlier detection
Strong understanding of DevOps concepts for data: CI/CD, code-based delivery, versioning
Proficiency in cloud-based data analytics platforms (GCP experience highly advantageous; BigQuery familiarity a plus)
Excellent stakeholder management, business communication, and requirement discovery skills
Ability to operate in an onsite-offshore model, driving clarity and collaboration
Sr. Developer eCommerce Systems
Data engineer job in Anaheim, CA
Join the Pacsun Community
Co-created in Los Angeles, Pacsun inspires the next generation of youth, building community at the intersection of fashion, music, art and sport. Pacsun is a leading lifestyle brand offering an exclusive collection of the most relevant brands and styles such as adidas, Brandy Melville, Essentials Fear of God, our own brands, and many more.
Our Pacsun community believes in and understands the importance of using our voice, platform, and resources to inspire and bring about positive development. Through our PacCares program, we are committed to our responsibility in using our platform to drive change and take action on the issues important to our community. Join the Pacsun Community.
Learn more here: LinkedIn- Our Community
About the Job:
Pacsun's IT eCommerce team uses AI and innovative technologies to enhance customer experience and improve operational efficiency. As a key member of the team, the Senior eCommerce Developer contributes to the architecture, development, and optimization of the company's digital commerce experiences.
This role is responsible for both back-end and front-end development on Salesforce Commerce Cloud (SFCC), ensuring high-performance, secure, and accessible storefronts with robust system integration across the eCommerce ecosystem. The Senior eCommerce Developer will lead end-to-end delivery of new features, mentor junior developers and the offshore team, and collaborate closely with UX, product, QA, and business teams to create compelling online experiences that drive revenue and customer loyalty.
This role will work on the full stack of Pacsun's Salesforce Commerce Cloud, mobile app, AI initiatives and system integrations, supporting Commerce, Loyalty, CRM, OMS, and other eCommerce platforms.
A day in the life, what you'll be doing:
Back-End Development & Integration
Design, build, and maintain SFCC server-side components, including controllers, pipelines, cartridges, and custom business logic.
Develop and manage robust APIs that connect SFCC with tax engines, payment processors, fraud management services and the order management system.
Ensure reliable data synchronization between SFCC and external platforms such as CRM, Loyalty, OMS, ERP and analytics systems.
Optimize database models, caching strategies and performance tuning to support high transaction volumes and peak traffic periods.
Checkout & Transaction Optimization
Own the end-to-end checkout experience, ensuring seamless, secure, and performant workflows from cart to order confirmation.
Integrate payment gateways and fraud protections to deliver accurate pricing and effortless transactions.
Collaborate with UX and product teams to identify friction points in the checkout process and implement improvements that boost conversion and customer satisfaction.
Tax, Shipping & OMS Integration
Implement and maintain integrations with third-party tax services to handle complex jurisdictional tax rules.
Connect SFCC to shipping providers and fulfillment platforms to provide real-time shipping options and tracking.
Build and support integrations with the order management system to ensure accurate order routing, inventory updates and status synchronization.
AI & Innovation Support
Partner with data science and innovation teams to embed AI-driven personalization, recommendation, and search solutions into the platform.
Develop integration points for machine-learning models and real-time personalization engines, ensuring data security and compliance.
Prototype and implement new technologies that enhance the customer experience and streamline operations.
Technical Leadership & Collaboration
Lead code reviews, define backend architecture standards and mentor less experienced developers on integration patterns and best practices.
Partner with IT management and technical teams to develop and deploy processes that ensure rapid, reliable releases.
Work closely with product, UX, QA and DevOps teams to define requirements, plan sprints and deliver high-quality software on schedule.
What it takes to Join:
8+ years of experience in web development and at least 5 years focused on Salesforce Commerce Cloud and SFRA.
Deep knowledge of modern front-end technologies (HTML5, CSS3/SCSS, JavaScript, React or similar frameworks) and back-end development (Node.js, Java or equivalent).
Hands-on experience with SFCC OCAPI/SCAPI, cartridge development, API integrations and Business Manager configurations.
Proven track record integrating thirdāparty services (payments, tax, shipping, CRM, loyalty, analytics) and implementing secure, scalable solutions.
Familiarity with Agile methodologies, version control (Git) and CI/CD pipelines.
Strong understanding of web performance optimization, SEO and accessibility standards.
Ability to lead discussions, mentor teammates and collaborate with technical teams.
Bachelor's degree in Computer Science, Information Systems or related field; Salesforce B2C Commerce Developer certification is preferred.
Salesforce Commerce Cloud SFRA certified developer is preferred.
Proven ability to excel in fast-growing, dynamic business environments with competing priorities, with a positive, solution-oriented mindset.
Excellent analytical and problem-solving skills.
Salary Range: $149,000 - $159,000
Pac Perks:
Dog friendly office environment
On-site Cafe
On-site Gym
$1,000 referral incentive program
Generous associate discount of 30-50% off merchandise online and in-stores
Competitive long term and short-term incentive program
Immediate 100% vested 401K contributions and employer match
Calm Premium access for all employees
Employee perks throughout the year
Physical Requirements:
The physical demands described here are representative of those that are required by an associate to successfully perform the essential functions of this job.
While performing the duties of this job, the associate is regularly required to talk or hear. The associate is frequently required to sit; stand; walk; use hands to finger, handle or feel; as well as reach with hands and arms.
Specific vision abilities required by this job include close vision, distance vision, depth perception and ability to adjust focus.
Ability to work in open environment with fluctuating temperatures and standard lighting.
Ability to work on computer and mobile phone for multiple hours; with frequent interruptions.
Required to travel in elevator or stairwells to attend meetings and engage with associates on multiple floors throughout building.
Hotel, Airplane, and Car Travel may be required.
Position Type/Expected Hours of Work:
This is a full-time position. As a National Retailer, occasional evening and/or weekend work may be required during periods of high volume. This role operates in a professional office environment and routinely uses standard office equipment.
Other Considerations:
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the associate for this job. Duties, responsibilities and activities may change at any time with or without notice. Reasonable accommodations may be made to qualified individuals with disabilities to enable them to perform the essential functions of the role.
Equal Opportunity Employer
This employer is required to notify all applicants of their rights pursuant to federal employment laws.
For further information, please review the Know Your Rights notice from the Department of Labor.
ServiceNow CMDB Engineer
Data engineer job in Irvine, CA
Employment Type: Full-Time, Direct Hire (W2 Only - No sponsorship available)
About the Role
We're seeking a skilled and driven ServiceNow CMDB Engineer to join our team in Irvine, CA. This is a hands-on, onsite role focused on designing, implementing, and maintaining a robust Configuration Management Database (CMDB) aligned with ServiceNow's Common Service Data Model (CSDM). You'll play a critical role in enhancing IT operations, asset management, and service delivery across the enterprise.
Responsibilities
Architect, configure, and maintain the ServiceNow CMDB to support ITOM and ITAM initiatives
Implement and optimize CSDM frameworks to ensure data integrity and alignment with business services
Collaborate with cross-functional teams to define CI classes, relationships, and lifecycle processes
Develop and enforce CMDB governance, data quality standards, and reconciliation rules
Integrate CMDB with discovery tools and external data sources
Support audits, compliance, and reporting requirements related to ITIL processes
Troubleshoot and resolve CMDB-related issues and performance bottlenecks
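The reconciliation work in the responsibilities above can be illustrated with a minimal sketch: when the same configuration item (CI) arrives from multiple discovery sources, keep the record from the most trusted source. The source precedence and field names here are assumptions for illustration; in ServiceNow itself this is handled by the Identification and Reconciliation Engine, not hand-written code.

```python
# Hypothetical CMDB reconciliation rule: deduplicate CI records by serial
# number, preferring the most authoritative discovery source. The precedence
# order and dictionary keys are illustrative assumptions.

SOURCE_PRECEDENCE = {"ServiceNow Discovery": 0, "SCCM": 1, "Manual": 2}

def reconcile(cis):
    """Return one winning CI record per serial number.

    Lower precedence rank wins; unknown sources rank last (99) so they
    never displace a record from a recognized source.
    """
    best = {}
    for ci in cis:
        key = ci["serial_number"]
        rank = SOURCE_PRECEDENCE.get(ci["source"], 99)
        current = best.get(key)
        if current is None or rank < SOURCE_PRECEDENCE.get(current["source"], 99):
            best[key] = ci
    return best
```

The design choice worth noting is that reconciliation is deterministic: given the same inputs, the same source always wins, which keeps the CMDB auditable.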
Qualifications
3+ years of hands-on experience with ServiceNow CMDB and CSDM implementation
Strong understanding of ITIL practices and ITOM/ITAM modules
Proven ability to manage CI lifecycle and maintain data accuracy
Experience with ServiceNow Discovery, Service Mapping, and integrations
ServiceNow Certified System Administrator (CSA) or higher certifications preferred
Excellent communication and documentation skills
Must be authorized to work in the U.S. without sponsorship
Perks & Benefits
Competitive compensation package
Collaborative and innovative work environment
Opportunity to work with cutting-edge ServiceNow technologies
Azure Cloud Engineer (Jr/Mid) - (Locals only)
Data engineer job in Los Angeles, CA
Job Title: Cloud Team Charter
Job Type: Contract to Hire
Work Schedule: Hybrid (3 days onsite, 2 days remote)
Rate: $60/hr, based on experience
Responsibilities:
Cloud Team Charter/Scope: 2 resources (1 Sr and 1 Mid/Jr)
Operate and maintain Cloud Foundation Services, such as:
Azure Policies
Backup Engineering and Enforcement
Logging Standard and Enforcement
AntiVirus and Malware Enforcement
Azure service/resources life cycle management, including retirement of resources
Tagging enforcement
Infrastructure Security
Ownership of Defender reporting as it relates to Infrastructure.
Collaboration with Cyber Security and App team to generate necessary reports for Infrastructure security review.
Actively monitor and remediate infrastructure vulnerabilities in coordination with the App team.
Drive continuous improvement in Cloud Security by tracking/maintaining infrastructure vulnerabilities through Azure Security Center.
Cloud Support:
PaaS DB support
Support for Cloud Networking (L2) and work with the Network team as needed
Developer support in the Cloud.
Support for the CMDB team to track the Cloud assets.
L4 Cloud support for the enterprise.
About Maxonic:
Since 2002, Maxonic has been at the forefront of connecting candidate strengths to client challenges. Our award-winning, dedicated team of recruiting professionals is specialized by technology, great at listening, and will seek to find a position that meets the long-term career needs of our candidates. We take pride in the over 10,000 candidates that we have placed, and the repeat business that we earn from our satisfied clients.
Interested in Applying?
Please apply with your most current resume. Feel free to contact
Jhankar Chanda (******************* / ************ ) for more details.
Snowflake DBT Engineer - CDC5697451
Data engineer job in Irvine, CA
Key Responsibilities
Design, develop, and maintain ELT pipelines using Snowflake and DBT
Build and optimize data models in Snowflake to support analytics and reporting
Implement modular, testable SQL transformations using DBT
Integrate DBT workflows into CI/CD pipelines and manage infrastructure as code using Terraform
Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions
Optimize Snowflake performance through clustering, partitioning, indexing, and materialized views
Automate data ingestion and transformation workflows using Airflow or similar orchestration tools
Ensure data quality, governance, and security across pipelines
Troubleshoot and resolve performance bottlenecks and data issues
Maintain documentation for data architecture, pipelines, and operational procedures
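The data-quality responsibility above can be sketched as a pre-publish gate of the kind a pipeline might run before promoting a model. The column names and thresholds are illustrative; in DBT itself these checks would normally be declared as schema tests (`not_null`, `unique`) in YAML rather than written by hand.

```python
# Minimal sketch of a data-quality gate for a pipeline batch.
# Thresholds and column names are assumptions for illustration.

def quality_check(rows, required, min_rows=1):
    """Return a list of human-readable failures; an empty list means pass.

    Checks two things: the batch meets a minimum row count, and every
    required column is fully populated (no None values).
    """
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            failures.append(f"column '{col}' has {nulls} null value(s)")
    return failures
```

Returning a list of failures rather than raising on the first one lets the orchestrator log every problem in a batch in a single run.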
Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
10 years of experience in data engineering, with at least 3 years focused on Snowflake and DBT
Strong proficiency in SQL and Python
Experience with cloud platforms (AWS, GCP, or Azure)
Familiarity with Git, CI/CD, and Infrastructure as Code tools (Terraform, CloudFormation)
Knowledge of data modeling (star schema, normalization) and ELT best practices
Senior Software Engineer
Data engineer job in Irvine, CA
The Sr. Software Engineer will be responsible for the design and implementation of new software applications, and for the maintenance and enhancement of various software products and solutions. They assist in the successful execution of projects with minimal direction and guidance.
What You'll Be Doing
Spend 90% of your time actively designing and coding in support of the immediate team. 10% of your time will be spent researching new technology, coaching, and mentoring other engineers.
Act as a senior member of the development team, providing feedback and training where necessary, and ensure that technical initiatives align with organizational goals, working closely with Principal Engineers and Development Managers.
As a Full Stack Engineer assigned to the product/project, ensure performance, maintainability, and functional requirements are met from design and development through testing, rollout, and support.
Work with cross-engineering staff, collaborating on hardware and system monitoring requirements to ensure expected performance and reliability of the application / system developed.
Proactively communicate and work to mitigate changes to project timelines, degradation in performance of applications, troubleshooting / problem solving production issues.
Education
The Ideal Candidate:
Bachelor's degree in Computer Science or Engineering, or equivalent industry experience
Experience
A minimum of 6 years of professional software development experience in business process automation applications.
A minimum of 5 years' experience in .NET, C#, Windows tools and languages, as well as modern web frameworks (Angular via TypeScript, React, Vue)
Understanding of data repository models is a must. Understanding of SQL and NoSQL is preferred.
Understanding of Agile methodologies, Domain Driven Design, Test/Behavior Driven Design, Event Driven via Asynchronous messaging approaches, microservice architecture.
Preferred Experience
ASP.NET, WCF, Web Services, NServiceBus, Azure Cloud, Infrastructure as Code (IaC)
DevOps experience as a full stack developer owning the Software Development Lifecycle.
Strong understanding and experience writing unit and integration tests for all code produced.
Specialized Skills
Can effectively lead technical initiatives and collaborative design/requirements meetings, gathering the necessary information for software development.
Ownership and accountability mindset, with strong decision-making, communication, and analytical skills that help in partnering with Product Owners and cross-functional teams.
Leadership in project execution and delivery. Must be an excellent team player with the ability to handle stressful situations.
The individual has deep expertise in their chosen technology stack and a broader knowledge of various programming languages, frameworks, and tools.
Brings a wealth of experience and a nuanced understanding of the specific domain, enabling insightful decisions and innovative problem-solving.
Ability to break up larger projects into individual pieces, assess complexity of each piece, and balance the work amongst team members.
Ability to work in fast paced / flexible environment that practices SAFe / Agile based SDLC.
Sets high standards for behavior and performance, models the values and principles of the organization, and inspires others through action.
Practices Test Driven Design leveraging unit tests, mocks, and data factories.
Experience with event driven design and microservice architecture best practices.
Possesses a strong sense of interpersonal awareness, has a bias for action, builds trust, is technically deep, and has good judgment.
Pay Range: $111k - $165k
The specific compensation for this position will be determined by a number of factors, including the scope, complexity and location of the role as well as the cost of labor in the market; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. Our full-time consultants have access to benefits including medical, dental, vision as well as 401K contributions.
Senior Software Engineer - Full Stack & DevOps
Data engineer job in Huntington Beach, CA
We're seeking a Senior Software Engineer who thrives at the intersection of application development and DevOps. You'll design, build, and deploy scalable SaaS solutions for Medicare and Medicaid health plans, while also contributing to the automation, reliability, and security of our development lifecycle. This role is central to delivering high-quality features for our Compliance, Appeals & Grievances, and Universe Scrubber products.
Key Responsibilities:
Application Development
Design and implement backend services, APIs, and user interfaces using modern frameworks and cloud-native architecture. Ensure performance, scalability, and maintainability across the stack.
DevOps Integration
Collaborate with infrastructure and DevOps teams to build and maintain CI/CD pipelines, automate deployments, and optimize environment provisioning across development, QA, and production.
Cloud-Native Engineering
Develop and deploy applications on AWS, leveraging services like Lambda, ECS, RDS, and S3. Ensure solutions are secure, resilient, and compliant with healthcare regulations.
Quality & Compliance
Write clean, testable code and participate in peer reviews, unit testing, and performance tuning. Ensure all software adheres to CMS, HIPAA, and internal compliance standards.
AI-Enabled Features
Support integration of AI/ML capabilities into product workflows, such as intelligent routing of grievances or automated compliance checks.
Mentorship & Collaboration
Provide technical guidance to junior engineers and collaborate with cross-functional teams to translate healthcare business needs into technical solutions.
Qualifications:
Bachelor's degree in Computer Science or a related field
5+ years of experience in software development, with exposure to DevOps practices
Proficiency in languages such as Java, Python, or C#, and experience with cloud platforms (preferably AWS)
Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions), infrastructure-as-code (e.g., Terraform, Ansible), and containerization (e.g., Docker, Kubernetes)
Understanding of healthcare data formats (EDI, HL7, FHIR) and regulatory frameworks
Sr. Software Engineer (NO H1B OR C2C) - Major Entertainment Company
Data engineer job in Los Angeles, CA
Senior Software Engineer - Ad Platform Machine Learning
We're looking for a Senior Software Engineer to join our Ad Platform Decisioning & Machine Learning Platform team. Our mission is to power the Company's advertising ecosystem with advanced machine learning, AI-driven decisioning, and high-performance backend systems. We build end-to-end solutions that span machine learning, large-scale data processing, experimentation platforms, and microservices, all to improve ad relevance, performance, and efficiency.
If you're passionate about ML technologies, backend engineering, and solving complex problems in a fast-moving environment, this is an exciting opportunity to make a direct impact on next-generation ad decisioning systems.
What You'll Do
Build next-generation experimentation platforms for ad decisioning and large-scale A/B testing
Develop simulation platforms that apply state-of-the-art ML and optimization techniques to improve ad performance
Design and implement scalable approaches for large-scale data analysis
Work closely with researchers to productize cutting-edge ML innovations
Architect distributed systems with a focus on performance, scalability, and flexibility
Champion engineering best practices including CI/CD, design patterns, automated testing, and strong code quality
Contribute to all phases of the software lifecycle: design, experimentation, implementation, and testing
Partner with product managers, program managers, SDETs, and researchers in a collaborative and innovative environment
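The large-scale A/B testing mentioned above rests on a standard statistical comparison; a hedged sketch is the two-proportion z-test on click-through rates below. Real experimentation platforms layer on sequential testing, variance reduction, and multiple-comparison corrections; this shows only the core calculation, and the function name is our own.

```python
import math

# Illustrative two-proportion z-test for an ad A/B test: is variant B's
# click-through rate significantly different from variant A's?
# |z| > 1.96 corresponds to significance at the 5% level (two-sided).

def ab_z_score(clicks_a, views_a, clicks_b, views_b):
    """Z-score for the difference in conversion rate between variants A and B."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se
```

For example, 100 clicks on 10,000 views versus 150 clicks on 10,000 views yields a z-score above 1.96, so the lift would be flagged as significant.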
Basic Qualifications
4+ years of professional programming and software design experience (Java, Python, Scala, etc.)
Experience building highly available, scalable microservices
Strong understanding of system architecture and application design
Knowledge of big data technologies and large-scale data processing
Passion for understanding the ad business and driving innovation
Enthusiastic about technology and comfortable working across disciplines
Preferred Qualifications
Domain knowledge in digital advertising
Familiarity with AI/ML technologies and common ML tech stacks
Experience with big data and workflow tools such as Airflow or Databricks
Education
Bachelor's degree plus 5+ years of relevant industry experience
Role Scope
You'll support ongoing initiatives across the ad platform, including building new experimentation and simulation systems used for online A/B testing. Media industry experience is not required.
Technical Environment
Java & Spring Boot for backend microservices
AWS as the primary cloud environment
Python & Scala for data pipelines running on Spark and Airflow
Candidates should be strong in either backend microservices or data pipeline development and open to learning the other
API development experience is required
Interview Process
Round 1: Technical & coding evaluation (1 hour)
Round 2: Technical + behavioral interview (1 hour)
Candidates are assessed on technical strength and eagerness to learn.
Software Engineer (Java/TypeScript/Kotlin)
Data engineer job in Burbank, CA
Optomi in partnership with one of our top clients is seeking a highly skilled Software Engineer with strong experience in building application and shared services, REST APIs, and cloud-native solutions. In this role, you will contribute to the development of the Studio's media platforms and B2B applications that support content fulfillment across the Studio's global supply chain. The ideal candidate will bring strong AWS expertise, proficiency in modern programming languages, and the ability to work cross-functionally within a collaborative engineering environment.
What the Right Candidate Will Enjoy!
Contributing to high-visibility media platforms and content supply chain applications
Building scalable, reusable B2B REST APIs used across multiple business units
Hands-on development with TypeScript, Java, Kotlin, or JavaScript
Working extensively with AWS serverless tools, including Lambda and API Gateway
Solving complex engineering challenges involving identity and access management
Participating in a structured, multi-stage interview process that values both technical and collaborative skills
Collaborating with engineers, product owners, security teams, and infrastructure partners
Delivering features in an Agile environment with opportunities for continuous learning
Expanding skillsets across cloud services, API design, and distributed systems
Experience of the Right Candidate:
3+ years of industry experience in software engineering
STEM Degree
Strong focus on application development and shared services
Extensive experience with AWS tools and technologies, especially serverless computing and API Gateway
Strong proficiency in TypeScript, Java, Kotlin, or JavaScript (TypeScript/Java preferred)
Solid understanding of REST API design principles and software engineering best practices
Strong communication and problem-solving skills
Ability to collaborate effectively within cross-functional teams
Experience with databases and identity & access management concepts a plus
Comfortable participating in coding assessments and system design interviews
Responsibilities of the Right Candidate:
Collaborate on the design, development, and deployment of scalable, high-quality software solutions
Build, enhance, and maintain API-driven shared services for Studio media platforms
Leverage AWS tools and serverless technologies to architect reliable, cloud-native applications
Partner closely with product owners, security teams, and other engineering groups to deliver on requirements
Participate in Agile ceremonies: estimating work, prioritizing tasks, and delivering iteratively
Apply and uphold best practices in coding standards, architecture, and system reliability
Contribute to identity and access management services and reusable B2B REST APIs
Conduct testing and ensure high-quality deployments across the platform
Actively stay current with emerging technologies, industry trends, and engineering best practices
Support continuous improvement efforts for development processes, tooling, and automation
Software Engineer
Data engineer job in Santa Monica, CA
Plug is the only wholesale platform built exclusively for used electric vehicles. Designed for dealers and commercial consignors, Plug combines EV-specific data, systems and expertise to bring clarity and confidence to the wholesale buying and selling process. With the addition of Trade Desk™, dealers can quickly receive cash offers or list EV trade-ins directly into the auction, removing friction and maximizing returns. By replacing outdated wholesale methods with tools tailored to EVs, Plug empowers dealers to make faster and more profitable decisions with a partner they can trust. For more information, visit *****************
The Opportunity
This is an on site role in Santa Monica, CA.
We are looking for a Software Engineer to join our growing team! A full-stack software engineer who will report directly to our CTO, and who will own entire customer-facing products. We're building systems like multi-modal AI-enabled data onramps for EVs, near-real-time API connectivity to the vehicles, and pricing intelligence tooling.
As a member of the team you'll help lay the technical and product foundation for our growing business. We're building a culture that cares about collaboration, encourages intellectual honesty, celebrates technical excellence, and is driven by careful attention to detail and planning for the future. We believe diversity of perspective and experience are key to building great technology and a thriving team. Sound cool? Let's work together.
Key Responsibilities
Collaborate with colleagues and be a strong voice in product design sessions, architecture discussions, and code reviews.
Design, implement, test, debug, and document work on new and existing software features and products, ensuring they meet business, quality, and operational needs.
Write clear, efficient, and scalable code with an eye towards flexibility and maintainability.
Take ownership of features and products, and support their planning and development by understanding the ultimate goal and evaluating effort, risk, and priority in an agile environment.
Own and contribute to team productivity and process improvements.
Use and develop APIs to create integrations between Plug and 3rd party platforms.
Be an integral part of a close team of developers; this is an opportunity to help shape a nascent team culture. The ideal candidate will be a high-growth individual able to grow their career as the team grows.
Qualifications
4-6 years of hands-on experience developing technical solutions
Advanced understanding of web application technologies, both backend and frontend as well as relational databases.
Familiarity with Cloud PaaS deployments.
Familiarity with TypeScript or any other modern typed language.
Familiarity with and positive disposition toward code generation AI tooling.
Strong analytical and quantitative skills.
Strong verbal and written communication skills with a focus on conciseness.
A self-directed drive to deliver end-to-end solutions with measurable goals and results.
Understanding and accepting of the ever-changing controlled chaos that is an early startup, and willing to work within that chaos to improve processes and outcomes.
Experience balancing contending priorities and collaborating with colleagues to reach workable compromises.
A proven track record of gaining trust and respect by consistently demonstrating sound critical-thinking and a risk-adjusted bias toward action.
You pride yourself on having excellent reliability and integrity.
Extraordinary grit; smart, creative, and persistent personality.
Authorized to work in the US for any employer.
Having worked in automotive or EV systems is a plus.
Compensation and Benefits
Annual Salary: $130K - $150K
Equity: TBD
Benefits: Health, vision, and dental insurance. Lunch stipend. Parking.
This full-time position is based in Santa Monica, CA. We welcome candidates from all locations to apply, provided they are willing to relocate for the role. Relocation assistance will not be provided for successful candidates. Sponsorship not available at this time.
Plug is an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. And if you do, you suck.
Plumbing Engineer
Data engineer job in Marina del Rey, CA
We are currently seeking a Plumbing Engineer to join our team in Marina Del Rey, California. SUMMARY: This position is responsible for managing and performing tests on various materials and equipment, maintaining knowledge of all product specifications, and ensuring adherence to all required standards by performing the following duties.
DUTIES AND RESPONSIBILITIES:
Build long term customer relationships with existing and potential customers.
Effectively manage Plumbing and design projects by satisfying clients' needs, meeting budget expectations and project schedules.
Provide support during construction phases.
Performs other related duties as assigned by management.
SUPERVISORY RESPONSIBILITIES:
Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws.
QUALIFICATIONS:
Bachelor's Degree (BA) from a four-year college or university in Mechanical Engineering, or completed course work in Plumbing, or one to two years of related experience and/or training, or an equivalent combination of education and experience.
Certificates, licenses and registrations required: LEED Certification is a plus.
Computer skills required: Experienced at using a computer; knowledge of MS Word, MS Excel, AutoCAD, and Revit is a plus.
Other skills required:
5 years of experience minimum; individuals should have recent experience working for a consulting engineering or engineering/architectural firm designing plumbing systems.
Experience in the following preferred:
Residential
Commercial
Multi-Family
Restaurants
Strong interpersonal skills and experience in maintaining strong client relationships are required.
Ability to communicate effectively with people through oral presentations and written communications.
Ability to motivate multiple-discipline project teams to meet clients' needs in a timely manner and meet budget objectives.