Assistant Data Analyst
Data analyst job in El Segundo, CA
About the Company
Step into a high-impact Assistant Data Analyst role supporting a fast-growing consumer products business. You will work with data from more than 30 international markets and multiple business models, transforming complex datasets into strategic insights that leaders rely on. If you enjoy solving challenging problems, building powerful reports, and influencing decisions with data, this role gives you a chance to make your work visible at a global level.
About the Role
You will sit at the intersection of data, operations, and strategy, partnering with regional teams and senior stakeholders to ensure data is accurate, timely, and meaningful. This opportunity is ideal for someone who enjoys both the technical and business sides of analytics and wants to expand their experience working in an international environment.
Responsibilities
Support and help oversee the collection and processing of data from 30+ countries, ensuring accuracy, consistency, and on-time delivery across all international channels.
Conduct comprehensive analysis on global sales and inventory data to develop recommendations for inventory optimization and improved business performance.
Collaborate with data analytics teams and regional stakeholders to ensure required data is captured within agreed timelines and defined standards.
Facilitate seamless data flow between systems and business users, enhancing accessibility, usability, and reliability of datasets.
Act as a liaison between the central data organization and various regional and business partners, clearly communicating data requirements, expectations, and deliverables.
Leverage advanced Excel functionality, including Power Pivot and data models, to build reports, dashboards, and analytical tools.
Independently manage projects of moderate complexity and provide business-focused support on larger, cross-functional data initiatives.
Prepare and deliver regular reports, insights, and strategic recommendations to senior leadership and other key stakeholders.
Ensure data quality controls are followed and contribute to continuous improvement of international data management processes.
Qualifications
Bachelor's degree in Business Analytics, International Business, Data Analytics, or a closely related field.
MBA or a Master's degree in Analytics or a related discipline preferred.
4-6 years of experience in data analysis and data management within a global or multi-region business environment.
Prior experience working with international data sources and stakeholders across varied business models.
Demonstrated track record of using data to support strategic business decision-making.
Required Skills
Advanced experience working with data models and complex data structures, particularly in large, multi-country environments.
Programming experience with Python for data processing, automation, and analysis tasks.
Comfortable working with large, complex datasets drawn from multiple business models and international sources.
Strong analytical capabilities with advanced proficiency in Power BI, Power Pivot, and Microsoft Excel.
Solid understanding of data management concepts and how they support broader business objectives.
Proven ability to interpret data and convert findings into clear, actionable business recommendations.
Effective project management skills, including planning, prioritizing, and executing moderately complex data projects.
Knowledge of international business structures (joint ventures, subsidiaries, distributors) and their differing data requirements.
Excellent written and verbal communication skills, including the ability to present complex analytical insights to non-technical stakeholders.
Ability to thrive in a dynamic, global environment and manage competing priorities.
Willingness to accommodate meetings and calls across multiple time zones as needed.
Preferred Skills
Advanced Microsoft Excel, including Power Pivot, data models, pivot tables, advanced formulas, and complex spreadsheet design.
Power BI (or similar BI tools) for building dashboards, visualizations, KPIs, and self-service reporting solutions.
Python for data manipulation, automation scripts, and analytical workflows.
Experience working with large data extracts from ERP, CRM, or data warehouse systems.
Strong proficiency with standard office productivity tools (Outlook, PowerPoint, Word) to support communication and presentation of analytics.
Pay range and compensation package
Pay Rate: $35-$40 per hour
Note: Must be able to work onsite Monday through Friday, 40 hours per week.
Equal Opportunity Statement
We are committed to diversity and inclusivity in our hiring practices.
Data Quality Analyst
Data analyst job in Pomona, CA
Bachelor's degree in Business, Finance, Accounting, Statistics, or a related field, or an equivalent combination of education, training, and experience. The candidate must possess the demonstrated ability to perform moderately complex quantitative analysis, with the ability to gather, document, analyze, and draw conclusions from data and information. The candidate will possess five to seven years of experience in the field of analysis. Responsible for developing strategies that support business needs for efficient and effective use of best practices. Identify requirements, approach, solutions, costs, risks, and options to address business needs. Lead implementation activities of an initiative, application, or feature set. Responsible for project documentation including definition, requirements, conversion, testing, implementation, and training. Able to prioritize initiatives by assessing business value for effort and developing options. Collaborates with vendors to understand product direction, release strategy, and timeframes. Maintain a broad knowledge of current and emerging state-of-the-art computer/network systems technologies, architectures, and products.
Education Requirement:
High School Diploma or equivalent
Day-to-Day Responsibilities/Workload:
Perform detailed analysis on large amounts of contractor personnel data to ensure accuracy and identify discrepancies. Facilitate the cleanup of those discrepancies. Support the migration of different business areas into the Field & Contractor Oversight program. This includes identification and assessment of business needs, development and assignment of provisioning job templates, and verification that all changes are made in a timely and accurate manner. In addition, review and assess cybersecurity and phishing program data to identify trends and support the Sr. Advisor in report-outs with vendors and senior leadership.
Required Skills/Attributes:
Strong advanced Microsoft Excel skills; focus on customer service and user experience; communication skills across multiple mediums (email, Teams, phone, in-person meetings, etc.); experience working with and performing analysis on large data sets with multiple data attributes; demonstrated experience managing multiple assignments and strong time management skills.
Data Analyst - Payroll
Data analyst job in Rosemead, CA
Trident Consulting is seeking a Data Analyst for one of our clients in Rosemead, CA (hybrid), a global leader in business and technology services.
Role: Data analyst
Duration: Contract
Rate: $18-23/Hr
Day-to-Day Responsibilities/Workload
Data Collection & Integration: Gather and consolidate data from diverse sources (SAP, SuccessFactors), including databases, spreadsheets, and other systems, ensuring accuracy and completeness.
Data Analysis & Reporting: Utilize Power Query and other analytical tools to create clear, insightful reports and summaries that effectively communicate findings to non-technical stakeholders.
Client Support & Issue Resolution: Respond to client inquiries through a shared inbox, providing timely and professional assistance. Troubleshoot and resolve issues related to payroll and expense data with attention to detail and accuracy.
Process Improvement: Identify opportunities to streamline data workflows and enhance reporting efficiency through automation and best practices.
Required Skills/Attributes
Advanced Excel, customer service skills, and a team-player mindset.
Desired Skills/Attributes
SAP/SuccessFactors knowledge; Power Query
About Trident:
Trident Consulting is a premier IT staffing firm providing high-impact workforce solutions to Fortune 500 and mid-market clients. Since 2005, we've specialized in sourcing elite technology and engineering talent for contract, direct hire, and managed services roles. Our expertise spans cloud, AI/ML, cybersecurity, and data analytics, supported by a 3M+ candidate database and a 78% fill ratio. With a highly engaged leadership team and a reputation for delivering hard-to-fill, niche talent, we help organizations build agile, high-performing teams that drive innovation and business success.
Some of our recent awards include:
Trailblazer Women Award 2025 by Consulate General of India in San Francisco.
Ranked as the #1 Women Owned Business Enterprise in the large category by ITServe.
Received the TechServe Excellence award.
Consistently ranked in the Inc. 5000 list of fastest-growing private companies in America.
Recognized in the SF Business Times as one of the Largest Bay Area BIPOC/Minority-Owned Businesses in 2022.
Data Scientist
Data analyst job in Long Beach, CA
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including internationally in Mexico and India. We are seeking a highly analytical and technically skilled Data Scientist to transform complex, multi-source data into unified, actionable insights used for executive reporting and decision-making.
This role requires expertise in business intelligence design, data modeling, metadata management, data integrity validation, and the development of dashboards, reports, and analytics used across operational and strategic environments.
The ideal candidate thrives in a fast-paced environment, demonstrates strong investigative skills, and can collaborate effectively with technical teams, business stakeholders, and leadership.
Essential Duties & Responsibilities
As a Data Scientist, participate across the full solution lifecycle: business case, planning, design, development, testing, migration, and production support.
Analyze large and complex datasets with accuracy and attention to detail.
Collaborate with users to develop effective metadata and data relationships.
Identify reporting and dashboard requirements across business units.
Determine strategic placement of business logic within ETL or metadata models.
Build enterprise data warehouse metadata/semantic models.
Design and develop unified dashboards, reports, and data extractions from multiple data sources.
Develop and execute testing methodologies for reports and metadata models.
Document BI architecture, data lineage, and project report requirements.
Provide technical specifications and data definitions to support the enterprise data dictionary.
Apply analytical skills and Data Science techniques to understand business processes, financial calculations, data flows, and application interactions.
Identify and implement improvements, workarounds, or alternative solutions related to ETL processes, ensuring integrity and timeliness.
Create UI components or portal elements (e.g., SharePoint) for dynamic or interactive stakeholder reporting.
As a Data Scientist, download and process SQL database information to build Power BI or Tableau reports (including cybersecurity awareness campaigns).
Utilize SQL, Python, R, or similar languages for data analysis and modeling.
Support process optimization through advanced modeling, leveraging experience as a Data Scientist where needed.
Required Knowledge & Attributes
Highly self-motivated with strong organizational skills and ability to manage multiple verbal and written assignments.
Experience collaborating across organizational boundaries for data sourcing and usage.
Analytical understanding of business processes, forecasting, capacity planning, and data governance.
Proficient with BI tools (Power BI, Tableau, PBIRS, SSRS, SSAS).
Strong Microsoft Office skills (Word, Excel, Visio, PowerPoint).
High attention to detail and accuracy.
Ability to work independently, demonstrate ownership, and ensure high-quality outcomes.
Strong communication, interpersonal, and stakeholder engagement skills.
Deep understanding that data integrity and consistency are essential for adoption and trust.
Ability to shift priorities and adapt within fast-paced environments.
Required Education & Experience
Bachelor's degree in Computer Science, Mathematics, or Statistics (or equivalent experience).
3+ years of BI development experience.
3+ years with Power BI and supporting Microsoft stack tools (SharePoint 2019, PBIRS/SSRS, Excel 2019/2021).
3+ years of experience with SDLC/project lifecycle processes.
3+ years of experience with data warehousing methodologies (ETL, Data Modeling).
3+ years of VBA experience in Excel and Access.
Strong ability to write SQL queries and work with SQL Server 2017-2022.
Experience with BI tools including PBIRS, SSRS, SSAS, Tableau.
Strong analytical skills in business processes, financial modeling, forecasting, and data flow understanding.
Critical thinking and problem-solving capabilities.
Experience producing high-quality technical documentation and presentations.
Excellent communication and presentation skills, with the ability to explain insights to leadership and business teams.
Benefits
Medical coverage and Health Savings Account (HSA) through Anthem
Dental/Vision/Various Ancillary coverages through Unum
401(k) retirement savings plan
Paid-time-off options
Company-paid Employee Assistance Program (EAP)
Discount programs through ADP WorkforceNow
Additional Details
The base range for this contract position is $73 - $83 per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.
About Us
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees.
Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY.
Check out more at ************** and reach out today to explore opportunities to grow together!
By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
Data Scientist
Data analyst job in Alhambra, CA
Title: Principal Data Scientist
Duration: 12 Months Contract
Additional Information
California Resident Candidates Only. This position is HYBRID (2 days onsite, 2 days telework). Interviews will be conducted via Microsoft Teams. The work schedule follows a 4/40 (10-hour days, Monday-Thursday), with the specific shift determined by the program manager. Shifts may range between 7:15 a.m. and 6:00 p.m.
Job description:
The Principal Data Scientist works to establish a comprehensive Data Science Program to advance data-driven decision-making, streamline operations, and fully leverage modern platforms, including Databricks or similar, to meet increasing demand for predictive analytics and AI solutions. The Principal Data Scientist will guide program development, provide training and mentorship to junior members of the team, accelerate adoption of advanced analytics, and build internal capacity through structured mentorship. The Principal Data Scientist will possess:
Exceptional communication abilities, both verbal and written, with a strong customer service mindset and the ability to translate complex concepts into clear, actionable insights.
Strong analytical and business acumen, including foundational experience with regression, association analysis, outlier detection, and core data analysis principles.
Working knowledge of database design and organization, with the ability to partner effectively with Data Management and Data Engineering teams.
Outstanding time management and organizational skills, with demonstrated success managing multiple priorities and deliverables in parallel.
A highly collaborative work style, coupled with the ability to operate independently, maintain focus, and drive projects forward with minimal oversight.
A meticulous approach to quality, ensuring accuracy, reliability, and consistency in all deliverables.
Proven mentorship capabilities, including the ability to guide, coach, and upskill junior data scientists and analysts.
Experience Required:
Five (5)+ years of professional experience leading data science initiatives, including developing machine learning models, statistical analyses, and end-to-end data science workflows in production environments.
Three (3)+ years of experience working with Databricks and similar cloud-based analytics platforms, including notebook development, feature engineering, ML model training, and workflow orchestration.
Three (3)+ years of experience applying advanced analytics and predictive modeling (e.g., regression, classification, clustering, forecasting, natural language processing).
Two (2)+ years of experience implementing MLOps practices, such as model versioning, CI/CD for ML, MLflow, automated pipelines, and model performance monitoring.
Two (2)+ years of experience collaborating with data engineering teams to design data pipelines, optimize data transformations, and implement Lakehouse or data warehouse architectures (e.g., Databricks, Snowflake, SQL-based platforms).
Two (2)+ years of experience mentoring or supervising junior data scientists or analysts, including code reviews, training, and structured skill development.
Two (2)+ years of experience with Python and SQL programming, using data sources such as SQL Server, Oracle, PostgreSQL, or similar relational databases.
One (1)+ year of experience operationalizing analytics within enterprise governance frameworks, partnering with Data Management, Security, and IT to ensure compliance, reproducibility, and best practices.
Education Required & certifications:
This classification requires possession of a Master's degree or higher in Data Science, Statistics, Computer Science, or a closely related field. Additional qualifying professional experience may be substituted for the required education on a year-for-year basis. At least one industry-recognized certification in data science or cloud analytics is required, such as:
Microsoft Azure Data Scientist Associate (DP-100)
Databricks Certified Data Scientist or Machine Learning Professional
AWS Machine Learning Specialty
Google Professional Data Engineer, or equivalent advanced analytics certifications. The certification is required and may not be substituted with additional experience.
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ************************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Name: T Saketh Ram Sharma
Email: *****************************
Internal Id: 25-54101
Technical Business Analyst (Media / Entertainment) - 140432
Data analyst job in Culver City, CA
Technical Business Analyst
Onsite in Culver City, CA / Sunnyvale, CA from Day 1 (client prefers local candidates)
Hybrid Schedule: 3 Onsite Days (Tue, Wed, Thur) & 2 Remote Days (Mon, Fri)
Long term contract
Direct client opportunity
No mid-layer / implementation partners are involved
Job Summary:
Client is seeking a detail-oriented and technically proficient Technical Business Analyst (Media) - Contractor to join our Content Engineering team. In this role, you will be a critical bridge between our engineering teams and the diverse range of stakeholders they support across Client's media ecosystem. You will be responsible for gathering, documenting, and translating complex technical requirements into clear, actionable specifications that enable our engineers to deliver innovative and reliable solutions. The ideal candidate will possess a strong technical aptitude, exceptional communication and interpersonal skills, and a passion for enabling creativity and efficiency in a fast-paced media environment. This role requires the ability to effectively communicate technical concepts to diverse audiences and build consensus around solutions.
Key Responsibilities:
Workflow Analysis & Optimization: Collaborate with media production teams (video editors, audio engineers, VFX artists, live events) to thoroughly document their specific network requirements, pain points, and desired outcomes. Translate these requirements into clear and actionable specifications for engineering team implementation.
Client Networking Team Collaboration & Advocacy: Act as a key liaison between Media teams and Engineering teams, building strong working relationships to champion the unique technical requirements of media teams and ensure that their needs are met. This includes effectively communicating documented requirements, proactively resolving conflicts, and fostering a spirit of partnership.
Requirements Elicitation: Conduct thorough interviews and workshops with stakeholders (networking engineers, storage engineers, media asset management engineers, file transfer services engineers, camera robotics operators, etc.) to elicit detailed requirements for new and existing systems.
Documentation & Specification: Create comprehensive and well-organized documentation of technical requirements, including use cases, user stories, data flows, and system diagrams.
Technical Translation: Translate complex technical concepts into clear and concise language that is easily understood by both technical and non-technical audiences.
Prioritization & Roadmapping: Work with engineering leads to prioritize requirements and contribute to the development of product roadmaps.
Process Improvement: Identify opportunities to improve the requirements gathering and documentation processes, streamlining workflows and increasing efficiency.
Market Research & Analysis: Conduct market research and analysis to identify emerging trends and technologies that could benefit Client's media engineering efforts.
User Acceptance Testing (UAT): Assist with the planning and execution of user acceptance testing to ensure that delivered solutions meet the documented requirements.
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
3+ years of experience in a business analyst, product analyst, or technical writing role.
Strong technical aptitude and the ability to quickly learn and understand complex technical concepts.
Exceptional communication, interpersonal, and relationship-building skills, with the ability to effectively communicate technical concepts to diverse audiences and build consensus around solutions.
Proven ability to document complex technical requirements in a clear and concise manner.
Strong understanding of service uptime, redundancy, and failover mechanisms in a broadcast environment.
Experience with requirements management tools and methodologies.
Familiarity with media production workflows and technologies a plus.
Passion for enabling creativity and innovation.
Preferred Qualifications:
Experience working in a media and entertainment environment.
Knowledge of networking, storage, or media asset management systems.
Experience with agile development methodologies.
Experience with user interface (UI) design principles.
Skill Sets:
Relating to Others - Is Required
Communication Skill - Is Required
Agile Project Management - Is Required
Requirements Gathering - Is Required
Media and Entertainment - Is Required
Pay Range: $70/hr - $75/hr
The specific compensation for this position will be determined by a number of factors, including the scope, complexity and location of the role as well as the cost of labor in the market; the skills, education, training, credentials and experience of the candidate; and other conditions of employment. Our full-time consultants have access to benefits including medical, dental, vision and 401K contributions as well as any other PTO, sick leave, and other benefits mandated by applicable state or localities where you reside or work.
Senior Analyst, Client Insights & Business Intelligence
Data analyst job in Irvine, CA
Irvine, CA (on-site)
RIS Rx (pronounced “RISE”) is a healthcare technology organization with a strong imprint in the patient access and affordability space. RIS Rx has quickly become an industry leader in delivering impactful solutions to stakeholders across the healthcare continuum. RIS Rx is proud to offer an immersive service portfolio to help address common access barriers. We don't believe in a “one size fits all” approach to our service offerings. Our philosophy is to bring forward innovation, value and service to everything that we do. This approach has allowed us to have the opportunity to serve countless patients to help produce better treatment outcomes and an overall improved quality of life. Here at RIS Rx, we invite our partners and colleagues to “Rise Up” with us to bring accessible healthcare and solutions for all.
Role Overview
The Senior Analyst plays a critical role on the Client Success & Program Delivery team, leading advanced analytics, supporting strategic client initiatives, and optimizing program performance. This role is ideal for professionals with 4-7+ years of relevant experience who possess strong technical skills, can manage independent analytical workstreams, and can confidently contribute to client-facing discussions.
You will serve as an analytical subject matter expert - owning data quality, designing performance dashboards, delivering insights, and strengthening operational processes.
Key Responsibilities
Client Program Performance & Analytics
Own recurring program reporting and dashboard development, ensuring accuracy, timeliness, and insight depth.
Perform advanced data analysis using SQL and BI tools to identify trends, performance drivers, and program risks.
Convert analytical findings into actionable insights and recommendations for client and internal stakeholders.
Support business cases, renewal packages, opportunity sizing, financial modeling, and program optimization efforts.
Program Delivery & Client Engagement
Serve as the analytical lead during program onboarding, implementation, and ongoing delivery.
Prepare client-ready presentations, QBR materials, and monthly/quarterly performance summaries.
Participate in client meetings to share insights, troubleshoot issues, and support transparency in program execution.
Operational Excellence & Process Improvement
Enhance and standardize SOPs, playbooks, and workflow documentation for scalability and efficiency.
Develop repeatable analytical templates, data models, and automation that strengthen program reliability and reduce manual work.
Partner with cross-functional teams to improve data pipelines, reporting infrastructure, and performance monitoring processes.
Cross-Functional Collaboration
Collaborate with Product, Technology, Operations, and Analytics teams to align program needs with technical capabilities and enhancements.
Conduct market, payer, and competitive research to inform strategic initiatives and platform evolution.
Stay current on trends in healthcare access, affordability programs, specialty pharmacy, and benefit design.
Education & Experience
Bachelor's degree in Business, Economics, Finance, Data Analytics, Healthcare Administration, Information Systems, or a related field.
4-7 years of experience in analytics, consulting, client success, healthcare operations, program delivery, or a similar function.
Technical Requirements
Proficiency in SQL for data extraction, transformation, joining tables, and validating large datasets.
Advanced Excel skills, including pivot tables, advanced formulas, data modeling, and large dataset manipulation.
Experience with data visualization / BI tools such as Power BI, Tableau, Looker, Qlik, or similar.
Ability to build automated dashboards, KPI scorecards, and data refresh pipelines.
Familiarity with Python or R for data analysis is a plus (not required but preferred).
Experience working with relational databases, data warehouses, or cloud analytics environments (e.g., Snowflake, BigQuery, Redshift) preferred.
Professional Skills
Strong analytical mindset with the ability to interpret complex data and tell a clear, compelling story.
Excellent verbal and written communication skills; able to translate data into client-ready insights.
Highly organized, detail-oriented, and able to manage multiple concurrent priorities.
Comfortable working in a fast-paced, evolving environment with cross-functional collaboration.
Experience in healthcare, market access, specialty pharmacy, or patient support programs is highly preferred.
Sr. Data Analyst / Engineer (Microsoft Fabric & PowerBI)
Data analyst job in Santa Ana, CA
XP3R is a fast-growing data and technology consulting firm that helps organizations unlock the power of data, analytics, AI, and technology to make smarter and faster decisions. We partner with clients to turn complex data into actionable insights that drive better decisions and measurable results, and encapsulate data and insights into custom tooling for interactivity and visibility.
What makes us different is how we work; we combine structure with curiosity, and strategy with execution. We move fast and focus on value-adding outcomes, and we don't back down from any challenge. Our team is made up of builders, problem-solvers, and lifelong learners who love the challenge of transforming ambiguity into clarity. At XP3R, you'll have the freedom to experiment, the support to grow, and the opportunity to make an impact that's visible from day one. We're a small, ambitious team, and every hire has a direct impact on how we grow.
The Role
We are looking for a Senior Data Analyst + Engineer who thrives at the intersection of business strategy and technical execution. This is not a "back-office dashboard role": you'll be engaging directly with clients, advising on business strategy, and then building the technical solutions that bring it to life. This role requires a deep fundamental understanding of data, data modeling, and data architecture. You will be expected to design and build with Microsoft Fabric and Power BI, and to both consult and deliver. You'll lead the full lifecycle: data sourcing, modeling, pipelines, dashboards, and business alignment.
What You'll Do
Partner with clients to understand business needs and translate them into technical solutions.
Design and implement data models by combining data + business knowledge that align with client requirements and scale effectively.
Lead data sourcing and requirements gathering, including working with APIs, exploratory analysis, and integrating disparate data systems.
Build and maintain Fabric ETL pipelines (Dataflow Gen2, PySpark notebooks, data pipelines).
Develop and optimize Power BI dashboards, including advanced DAX, calculation groups, and data visualization best practices.
Create dashboard mockups and prototypes to guide client conversations before implementation.
Design & stand up systems to enforce data validations and QA/QC in Microsoft Fabric.
Collaborate with leadership to define solution architecture, and ideally take the lead on designing architecture yourself.
Serve as a trusted advisor to clients, bridging technical expertise with business insights.
Mentor junior team members and raise the standard of technical excellence across XP3R.
What We're Looking For
At XP3R, we look for people who blend technical mastery with strategic insight. You're someone who can move seamlessly between data modeling and client conversations, transforming complex challenges into clear, scalable solutions. We value leaders who stay curious, think critically, and take ownership of outcomes from concept to delivery. The right person will combine technical depth with business presence:
Technically Excellent - You are rooted in the technical and have a knack for figuring things out to drive implementation and delivery of work product. Your experience allows you to build high-quality product fast; exposure to SQL, Python, R, Power BI, or similar tools is required.
Owns the Work - You take initiative, explore solutions, and learn new tools without waiting for direction. You take ownership of work & hold yourself accountable for delivery.
Entrepreneurial by Nature - You're motivated by challenge and want to help build something meaningful, not just maintain it.
Quick to Learn - You adapt fast, connect ideas quickly, and enjoy turning new knowledge into action. You excel at switching between different contexts & tasks and pick up skills as you go.
Collaborative & Reliable - You elevate the people around you through communication, structure, and a sense of shared purpose.
Salary Range: $100,000 - $220,000 (DOE)
Principal Data Scientist
Data analyst job in Alhambra, CA
The Principal Data Scientist works to establish a comprehensive Data Science Program to advance data-driven decision-making, streamline operations, and fully leverage modern platforms such as Databricks to meet the increasing demand for predictive analytics and AI solutions.
The Principal Data Scientist will guide program development, provide training and mentorship to junior members of the team, accelerate adoption of advanced analytics, and build internal capacity through structured mentorship.
The Principal Data Scientist will possess:
Exceptional communication abilities, both verbal and written, with a strong customer service mindset and the ability to translate complex concepts into clear, actionable insights.
Strong analytical and business acumen, including foundational experience with regression, association analysis, outlier detection, and core data analysis principles.
Working knowledge of database design and organization, with the ability to partner effectively with Data Management and Data Engineering teams.
Outstanding time management and organizational skills, with demonstrated success managing multiple priorities and deliverables in parallel.
A highly collaborative work style, coupled with the ability to operate independently, maintain focus, and drive projects forward with minimal oversight.
A meticulous approach to quality, ensuring accuracy, reliability, and consistency in all deliverables.
Proven mentorship capabilities, including the ability to guide, coach, and upskill junior data scientists and analysts.
5+ years of professional experience leading data science initiatives, including developing machine learning models, statistical analyses, and end-to-end data science workflows in production environments.
3+ years of experience working with Databricks and similar cloud-based analytics platforms, including notebook development, feature engineering, ML model training, and workflow orchestration.
3+ years of experience applying advanced analytics and predictive modeling (e.g., regression, classification, clustering, forecasting, natural language processing).
2+ years of experience implementing MLOps practices, such as model versioning, CI/CD for ML, MLflow, automated pipelines, and model performance monitoring.
2+ years of experience collaborating with data engineering teams to design data pipelines, optimize data transformations, and implement Lakehouse or data warehouse architectures (e.g., Databricks, Snowflake, SQL-based platforms).
2+ years of experience mentoring or supervising junior data scientists or analysts, including code reviews, training, and structured skill development.
2+ years of experience with Python and SQL programming, using data sources such as SQL Server, Oracle, PostgreSQL, or similar relational databases.
1+ year of experience operationalizing analytics within enterprise governance frameworks, partnering with Data Management, Security, and IT to ensure compliance, reproducibility, and best practices.
Education:
This classification requires possession of a Master's degree or higher in Data Science, Statistics, Computer Science, or a closely related field. Additional qualifying professional experience may be substituted for the required education on a year-for-year basis.
At least one of the following industry-recognized certifications in data science or cloud analytics is required and may not be substituted with additional experience:
• Microsoft Azure Data Scientist Associate (DP-100)
• Databricks Certified Data Scientist or Machine Learning Professional
• AWS Machine Learning Specialty
• Google Professional Data Engineer
• Equivalent advanced analytics certifications
Retail Business Analyst
Data analyst job in Los Angeles, CA
About the Company
POP MART (09992.HK), founded in 2010, is a leading global company in the trend culture and entertainment industry. Centered around IP, POP MART has built a comprehensive platform for creative incubation and IP operation, empowering global creators while delivering exciting products, services, and immersive entertainment experiences to consumers.
POP MART identifies and nurtures emerging artists and designers worldwide, creating popular character IPs through a well-established IP development and operation system. Its portfolio includes iconic IPs such as MOLLY, SKULLPANDA, DIMOO, THE MONSTERS, and Hirono. By launching art toys and derivative products based on these IPs, POP MART continues to lead trends in consumer culture.
As of the end of 2024, POP MART operates over 500 physical stores and more than 2,300 Robo Shops across 30+ countries and regions. Through multiple cross-border e-commerce platforms, the company has reached audiences in over 90 countries and regions, bringing joy to young consumers around the world.
About the Role
We are seeking a commercially-minded and collaborative Retail Business Analyst to serve as a key partner to our channel. This role goes beyond reporting: you will be the analytical engine that drives decision-making at the leadership level. Your primary mission is to uncover insights that reveal the health of our business, identify risks and opportunities, and spearhead solutions through deep cross-functional collaboration. You will transform raw data into actionable strategies that directly impact our top and bottom line.
What You Will Achieve
Generate daily/weekly/monthly commercial performance reports (sales, margin, inventory, sell-through) for execs; highlight key trends & deviations, and analyze core KPIs (sell-through rate, ATP, full-price sell, channel productivity) to assess business health.
Conduct deep-dive analyses of performance issues (regional sales decline, category underperformance, channel conflict) to identify root causes; build forward-looking models/forecasts/scenario plans to support strategic planning.
Act as the primary analytics partner for Sales Ops, Merchandising, and Supply Chain teams; translate insights into actionable recommendations and own end-to-end problem-solving (discovery → solution → implementation → impact measurement).
Lead data-driven business reviews to drive decisions; coordinate cross-functionally to align on data definitions & goals, bridge technical and commercial teams, and champion a data-centric culture across the organization.
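The KPIs named above follow conventional retail definitions. As a hedged illustration only (hypothetical function names and figures, not POP MART's internal formulas), two of them can be sketched in Python:

```python
# Hypothetical illustration of two common retail KPIs.
# These are the conventional textbook definitions, not any company's internal ones.

def sell_through_rate(units_sold: int, beginning_inventory: int) -> float:
    """Share of the period's starting inventory that was sold."""
    if beginning_inventory <= 0:
        raise ValueError("beginning inventory must be positive")
    return units_sold / beginning_inventory

def full_price_sell_mix(full_price_units: int, total_units_sold: int) -> float:
    """Share of units sold at full price (i.e., without markdown)."""
    if total_units_sold <= 0:
        raise ValueError("total units sold must be positive")
    return full_price_units / total_units_sold

if __name__ == "__main__":
    # A store starts the month with 400 units, sells 280, of which 210 at full price.
    st = sell_through_rate(280, 400)      # 0.70
    fp = full_price_sell_mix(210, 280)    # 0.75
    print(f"sell-through: {st:.0%}, full-price mix: {fp:.0%}")
```

In practice an analyst would compute these per SKU, store, and week from the sales and inventory feeds, then track deviations against targets.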
What You Will Need
2+ years of experience as a Business Analyst, Commercial Analyst, or similar role in a fast-paced retail, DTC, or CPG environment.
Must-have Skills:
Advanced Analytical Proficiency: Expert in Excel/Google/Lark Sheets; strong experience with data visualization tools (e.g., Tableau, Power BI, Looker).
Business Acumen: Deep understanding of retail/commercial metrics and P&L drivers. You ask “why” behind the numbers.
Proactive Problem-Solver: A proven track record of identifying business problems through data and driving solutions to implementation.
Exceptional Communication & Influence: Ability to simplify complex data into executive-level stories and persuade stakeholders to act.
Collaborative Driver: Excellent at project management and coordinating across teams (Sales Ops, Merch, Finance, Logistics) without direct authority.
Preferred Skills:
Experience with SQL for data extraction and manipulation.
Familiarity with planning or ERP systems (e.g., SAP, Netsuite).
Chinese language skills are a plus.
What We Offer
Market-competitive packages: we provide a 401k, health insurance, PTO, paid sick leave, family leave, and more.
Opportunities to learn and lead: we provide on-the-job training to ensure employees are equipped with the most up-to-date skill sets and knowledge
Career development: we work with you to advance your career through short-term assignments, new experiences, and more.
*POP MART is committed to equal pay initiatives and will not ask candidates for their current or past salary.
**As an Equal Opportunity Employer, POP MART does not discriminate against applicants or employees because of race, color, creed, religion, sex, national origin, veteran status, disability, age, citizenship, marital or domestic/civil partnership status, sexual orientation, gender identity or expression or because of any other status or condition protected by applicable federal, state or local law.
Database Analyst
Data analyst job in Pomona, CA
AVID Technical Resources is seeking an Analyst to support our client's database project. Must be located in or near Pomona, CA or Monmouth, OR.
Required Skills:
Data retrieval and automation
Oracle Database knowledge
SQL and data modeling experience
Power BI, Power Automate, Power Apps, Microsoft Office Suite
Azure / AWS is a bonus!
Excellent written and oral communication skills
Oracle Agile PLM 9.3.X Business Analyst
Data analyst job in Irvine, CA
Must Have Technical/Functional Skills:
Collaborate closely with business users to understand their needs and translate them into clear technical and functional requirements.
Conduct meetings and workshops with stakeholders confidently and effectively.
Analyze, document, and optimize PLM/PDM business processes across engineering, manufacturing, and supply chain functions.
Configure and support Oracle Agile PLM 9.3.6 modules, primarily Product Collaboration (PC) and Portfolio & Program Management (PPM).
Assist in the development and implementation of business systems and processes aligned with PLM strategy.
Ensure alignment with SAP S/4 HANA ERP system and support integration efforts.
Utilize Azure DevOps and Jira for ticketing, tracking, and collaboration.
Collaborate with cross-functional teams; occasional travel between sites may be required.
7+ years of experience as a Business Analyst with a focus on Oracle Agile PLM 9.3.6, including configuration responsibilities.
Strong understanding of PLM/PDM concepts, workflows, and best practices.
Proven experience in the semiconductor or high-tech manufacturing industries.
Hands-on experience with Agile PLM modules such as Product Collaboration (PC) and Portfolio & Program Management (PPM).
Familiarity with SAP S/4 HANA ERP system and its integration with PLM platforms.
Proficiency in Azure DevOps and Jira ticketing systems.
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Time Off, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
Imaging Application Analyst (RIS, Radiology PACS, Cardiology PACS, Voice Dictation, Advanced Post Processing, RadOnc EMR, EKG/EEG, GI)
Data analyst job in Monterey Park, CA
Sr. Imaging Applications Analyst
Salary Range: $130k to $150k
The Imaging Applications Analyst, Senior is responsible for the design, implementation, validation, and support of multiple imaging applications (RIS, Radiology PACS, Cardiology PACS, Voice Dictation, Advanced Post Processing, RadOnc EMR, EKG/EEG, GI) and related ancillary systems.
The Imaging Applications Analyst ensures the design, configuration, integration, and user experience of these imaging applications meet business and clinical objectives. The Imaging Applications Analyst consults with organizational clinicians, staff, and vendors of the imaging applications and is responsible for the configuration, testing, problem identification, issue resolution, and on-going support of the assigned applications, including new implementations and upgrades.
Accountabilities:
Troubleshoot, configure, validate, upgrade, and support enterprise-wide clinical imaging applications.
Provide off-hour on-call support for issues and apply expertise and independent judgment for full resolution.
Work on assigned projects both independently and as part of a team.
Lead or assist large-scale development and implementation projects, including complex, inter-departmental projects and operational initiatives.
Design and lead implementations, upgrades, and solution conversions in support of Keck clinical imaging applications
Assist in ensuring stability and functionality of the assigned applications.
Provide escalation support, troubleshooting and root-cause analysis of issues
Assist in identifying and trouble-shooting application issues, including isolating problems, recommending appropriate solutions and implementing solutions. Work closely with all IS teams to maintain Clinical Imaging Applications that are compliant with organizational standards and policies.
Regularly meet with users, vendors, and IT staff to develop and modify system specifications, and take responsibility for the timely resolution or escalation of problems within the imaging application environment.
Work on assigned projects both independently and as part of a team and apply expertise and independent judgment for full resolution.
Responsible for support/testing of HL7 integration between Cerner EMR and all Imaging applications ensuring data integrity of integrated solutions.
Maintain expertise in Imaging Systems functionality and site/system workflows, working directly with clinicians and staff to understand clinical workflows and reported issues.
Coach and mentor less experienced team members.
Provide after-hours and weekend support where necessary for a 24x7 system availability model.
Minimum Education
Bachelor's degree in Computer Science, Healthcare Science, Finance, Business, or a related field required.
In lieu of a bachelor's degree, additional 4 years of experience are required.
Minimum Experience
Minimum 5 years of experience with design, configuration, maintenance, troubleshooting, upgrading, testing, and supporting clinical imaging applications (i.e., Cerner RadNet, FujiPACS/CV, PowerScribe 360, Varian Aria, Natus Xltek, Provation) or the equivalent combination of experience and education that would demonstrate the capability to successfully perform the essential functions of this position.
Working level knowledge of DICOM, HL7 and IHE.
Working level knowledge of Imaging Modalities (XR, US, MRI, CT, Nuclear, Mammo, EKG).
Extensive experience in managing, implementing, and supporting a diverse range of Cardiology IT applications and systems, including Fuji Synapse Cardiovascular PACS, GE/Merge Hemo, Epiphany ECG management, and Cerner RadNet / RIS.
Skilled in custom template building, focusing on creating standardized, efficient, and clinically relevant templates that streamline reporting and data capture.
Comprehensive knowledge of DICOM imaging protocols and standards, including image acquisition, archiving, retrieval, and seamless integration with Fuji CV PACS and Cerner EHR.
Possesses a foundational clinical background in cardiology, enabling a deep understanding of cardiac workflows, diagnostic procedures, and data requirements for IT system development and optimization.
This is a senior position with the expectation of mentoring other team members and leading through projects independently. A strong team-oriented attitude is critical.
Local resource preferred (on-site for the first six months then hybrid schedule).
Data Analytics Engineer
Data analyst job in Irvine, CA
We are seeking a Data Analytics Engineer to join our team who serves as a hybrid Database Administrator, Data Engineer, and Data Analyst, responsible for managing core data infrastructure, developing and maintaining ETL pipelines, and delivering high-quality analytics and visual insights to executive stakeholders. This role bridges technical execution with business intelligence, ensuring that data across Salesforce, financial, and operational systems is accurate, accessible, and strategically presented.
Essential Functions
Database Administration: Oversee and maintain database servers, ensuring performance, reliability, and security. Manage user access, backups, and data recovery processes while optimizing queries and database operations.
Data Engineering (ELT): Design, build, and maintain robust ELT pipelines (SQL/DBT or equivalent) to extract, transform, and load data across Salesforce, financial, and operational sources. Ensure data lineage, integrity, and governance throughout all workflows.
Data Modeling & Governance: Design scalable data models and maintain a governed semantic layer and KPI catalog aligned with business objectives. Define data quality checks, SLAs, and lineage standards to reconcile analytics with finance source-of-truth systems.
Analytics & Reporting: Develop and manage executive-facing Tableau dashboards and visualizations covering key lending and operational metrics, including pipeline conversion, production, credit quality, delinquency/charge-offs, DSCR, and LTV distributions.
Presentation & Insights: Translate complex datasets into clear, compelling stories and presentations for leadership and cross-functional teams. Communicate findings through visual reports and executive summaries to drive strategic decisions.
Collaboration & Integration: Partner with Finance, Capital Markets, and Operations to refine KPIs and perform ad-hoc analyses. Collaborate with Engineering to align analytical and operational data, manage integrations, and support system scalability.
Enablement & Training: Conduct training sessions, create documentation, and host data office hours to promote data literacy and empower business users across the organization.
Competencies & Skills
Advanced SQL proficiency with strong data modeling, query optimization, and database administration experience (PostgreSQL, MySQL, or equivalent).
Hands-on experience managing and maintaining database servers and optimizing performance.
Proficiency with ETL/ELT frameworks (DBT, Airflow, or similar) and cloud data stacks (AWS/Azure/GCP).
Strong Tableau skills: parameters, LODs, row-level security, executive-level dashboard design, and storytelling through data.
Experience with Salesforce data structures and ingestion methods.
Proven ability to communicate and present technical data insights to executive and non-technical stakeholders.
Solid understanding of lending/financial analytics (pipeline conversion, delinquency, DSCR, LTV).
Working knowledge of Python for analytics tasks, cohort analysis, and variance reporting.
Familiarity with version control (Git), CI/CD for analytics, and data governance frameworks.
Excellent organizational, documentation, and communication skills with a strong sense of ownership and follow-through.
Education & Experience
Bachelor's degree in Computer Science, Engineering, Information Technology, Data Analytics, or a related field.
3+ years of experience in data analytics, data engineering, or database administration roles.
Experience supporting executive-level reporting and maintaining database infrastructure in a fast-paced environment.
Big Data Engineer
Data analyst job in Santa Monica, CA
Our client is seeking a Big Data Engineer to join their team! This position is located in Santa Monica, California.
Design and build core components of a large-scale data platform for both real-time and batch processing, owning key features of big data applications that evolve with business needs
Develop next-generation, cloud-based big data infrastructure supporting batch and streaming workloads, with continuous improvements to performance, scalability, reliability, and availability
Champion engineering excellence, promoting best practices such as design patterns, CI/CD, thorough code reviews, and automated testing
Drive innovation, contributing new ideas and applying cutting-edge technologies to deliver impactful solutions
Participate in the full software development lifecycle, including system design, experimentation, implementation, deployment, and testing
Collaborate closely with program managers, product managers, SDETs, and researchers in an open, agile, and highly innovative environment
Desired Skills/Experience:
Bachelor's degree in a STEM field: Science, Technology, Engineering, or Mathematics
5+ years of relevant professional experience
4+ years of professional software development experience using Java, Scala, Python, or similar programming languages
3+ years of hands-on big data development experience with technologies such as Spark, Flink, SingleStore, Kafka, NiFi, and AWS big data tools
Strong understanding of system and application design, architecture principles, and distributed system fundamentals
Proven experience building highly available, scalable, and production-grade services
Genuine passion for technology, with the ability to work across interdisciplinary areas and adopt new tools or approaches
Experience processing massive datasets at the petabyte scale
Proficiency with cloud infrastructure and DevOps tools, such as Terraform, Kubernetes (K8s), Spinnaker, IAM, and ALB
Hands-on experience with modern data warehousing and analytics platforms, including ClickHouse, Druid, Snowflake, Impala, Presto, Kinesis, and more
Familiarity with common web development frameworks, such as Spring Boot, React.js, Vue.js, or Angular
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position is between $52.00 and $75.00 per hour. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
Content Security Policy and Governance Analyst
Data analyst job in Glendale, CA
City: Glendale, CA
Onsite/ Hybrid/ Remote: Onsite (4 days a week)
Duration: 12 months
Rate Range: Up to $92.50/hr on W2 depending on experience (no C2C, 1099, or sub-contract)
Work Authorization: GC, USC, All valid EADs except OPT, CPT, H1B
Must Have:
• Information security background
• Hands on experience writing and maintaining security or compliance policies
• Experience supporting security audits or compliance assessments
• Project management experience on security or compliance initiatives
• WordPress experience for content publishing and site updates
• Microsoft 365, including advanced Word and Excel, and PowerPoint (or Keynote)
• Experience with collaboration and workflow tools such as Confluence, Jira and ServiceNow
Responsibilities:
• Support the studio content security team in securing content from early development through delivery by driving policy, audit and governance work.
• Draft, update and maintain security and content security policies, standards and guidance that translate complex security controls into clear, user friendly documentation for creative and production stakeholders.
• Act as a liaison between internal audit, studio business units and technology teams to plan, coordinate and track content security related audits and assessments.
• Help ensure audits remain in scope, add value and align with content security objectives, including tracking findings, owners and action plans through closure.
• Partner with legal and security stakeholders to define and refine security requirements and terms for agreements, and translate security needs into clear input for legal language.
• Maintain and publish policy and guidance on the team's WordPress based site, including updates, edits and new content.
• Manage and track policy, audit and stakeholder deliverables using project management practices and tools.
• Use ServiceNow and similar tools to log, route and track stakeholder requests and build transparency and metrics around team workload.
• Collaborate with application, cloud and AI security partners to align policies and requirements, including emerging areas such as AI usage, watermarking and related controls.
• Prepare summaries, decks and documentation for leadership, internal partners and stakeholders, including executive ready overviews of complex topics.
• Contribute to continuous improvement of content security processes, requirements and communication.
Qualifications:
• Prior experience in information security, content security, security governance, risk and compliance or a closely related security discipline.
• Proven experience writing policies, standards, procedures or similar documentation from inception through publication.
• Strong analytical and structured thinking with high attention to detail, balanced with the ability to apply contextual and flexible judgment to real world business needs.
• Demonstrated ability to quickly learn a complex business environment and become an effective contributor.
• Strong project management skills, including managing multi stakeholder work, tracking dependencies and driving deliverables to completion.
• Ability to work both independently and collaboratively, with sound judgment on when to escalate, when to seek input and when to execute autonomously.
• Excellent written and verbal communication skills, including the ability to condense complex topics into clear, concise executive summaries.
• Comfortable operating in a fast paced environment with shifting priorities, and able to stay productive when priorities or direction change.
• Highly organized, able to manage multiple initiatives in parallel and maintain progress in a matrixed environment.
• Experience working with WordPress, Microsoft 365 (Word, Excel, PowerPoint), and collaboration tools such as Confluence, Jira and ServiceNow.
• Media and entertainment or studio production experience preferred but not required, provided the candidate can demonstrate the ability to learn a new industry quickly.
• Legal training or experience working closely with legal teams on security or contractual language is a plus, not a requirement.
• Interest in or exposure to AI related security and policy topics is a plus.
• HS Diploma required; additional education in information security, law, business, communications or related fields is a plus.
Lead Data Engineer - (Automotive exp)
Data analyst job in Torrance, CA
Role: Sr Technical Lead
Duration: 12+ Month Contract
Daily Tasks Performed:
Lead the design, development, and deployment of a scalable, secure, and high-performance CDP SaaS product.
Architect solutions that integrate with various data sources, APIs, and third-party platforms.
Design, develop, and optimize complex SQL queries for data extraction, transformation, and analysis
Build and maintain workflow pipelines using Digdag, integrating with data platforms such as Treasure Data, AWS, or other cloud services
Automate ETL processes and schedule tasks using Digdag's YAML-based workflow definitions
Implement data quality checks, logging, and alerting mechanisms within workflows
Leverage AWS services (e.g., S3, Lambda, Athena) where applicable to enhance data processing and storage capabilities
Ensure best practices in software engineering, including code reviews, testing, CI/CD, and documentation.
Oversee data privacy, security, and compliance initiatives (e.g., GDPR, CCPA).
Ensure adherence to security, compliance, and data governance requirements.
Oversee development of real-time and batch data processing systems.
Collaborate with cross-functional teams including data analysts, product managers, and software engineers to translate business requirements into technical solutions
Collaborate with the stakeholders to define technical requirements to align technical solutions with business goals and deliver product features.
Mentor and guide developers, fostering a culture of technical excellence and continuous improvement.
Troubleshoot complex technical issues and provide hands-on support as needed.
Monitor, troubleshoot, and improve data workflows for performance, reliability, and cost-efficiency as needed
Optimize system performance, scalability, and cost efficiency.
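To make the data quality and alerting duties above concrete, here is a minimal sketch of the kind of check that could run as a Digdag `py>` task inside one of these workflows. The function name, thresholds, and logging setup are hypothetical placeholders, not the client's actual implementation:

```python
# Minimal sketch of a data quality check suitable for a Digdag `py>` task.
# Function name, threshold, and alerting behavior are hypothetical placeholders.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq_check")

def check_row_counts(rows: int, min_expected: int) -> bool:
    """Flag a load that produced suspiciously few rows.

    Returns False on failure; a production task would typically raise an
    exception so Digdag marks the step failed, and fire an alert.
    """
    if rows < min_expected:
        log.error("quality check failed: %d rows < expected %d", rows, min_expected)
        return False
    log.info("quality check passed: %d rows", rows)
    return True

if __name__ == "__main__":
    # Simulate validating a daily load against a minimum-volume threshold.
    print(check_row_counts(12_000, min_expected=10_000))
```

A real workflow would wire the failure path into the alerting mechanism (email, Slack, PagerDuty, or similar) rather than just returning a boolean.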
What this person will be working on:
As the Senior Technical Lead for our Customer Data Platform (CDP), the candidate will define the technical strategy, architecture, and execution of the platform. They will lead the design and delivery of scalable, secure, and high-performing solutions that enable unified customer data management, advanced analytics, and personalized experiences. This role demands deep technical expertise, strong leadership, and a solid understanding of data platforms and modern cloud technologies. It is a pivotal position that supports the CDP vision by mentoring team members and delivering solutions that empower our customers to unify, analyze, and activate their data.
Position Success Criteria (Desired) - 'WANTS'
Bachelor's or Master's degree in Computer Science, Engineering, or related field.
8+ years of software development experience, with at least 3+ years in a technical leadership role.
Proven experience building and scaling SaaS products, preferably in customer data, marketing technology, or analytics domains
Extensive hands-on experience with Presto, Hive, and Python
Strong proficiency in writing complex SQL queries for data extraction, transformation, and analysis
Familiarity with AWS data services such as S3, Athena, Glue, and Lambda
Deep understanding of data modeling, ETL pipelines, workflow orchestration, and both real-time and batch data processing
Experience ensuring data privacy, security, and compliance in SaaS environments
Knowledge of Customer Data Platforms (CDPs), CDP concepts, and integration with CRM, marketing, and analytics tools
Excellent communication, leadership, and project management skills
Experience working with Agile methodologies and DevOps practices
Ability to thrive in a fast-paced, agile environment
Collaborative mindset with a proactive approach to problem-solving
Stay current with industry trends and emerging technologies relevant to SaaS and customer data platforms.
Data Engineer
Data analyst job in Irvine, CA
Thank you for stopping by to take a look at the Data Engineer role I posted here on LinkedIn; I appreciate it.
If you have read my postings in the past, you will recognize how I write job descriptions. If you are new, allow me to introduce myself. My name is Tom Welke. I am Partner & VP at RSM Solutions, Inc; I have been recruiting technical talent for more than 23 years and have been in the tech space since the 1990s. Due to this, I actually write JDs myself: no AI, no 'bots', just a real live human. I realized a while back that looking for work is about as fun as a root canal with no anesthesia, especially now. So, rather than saying 'must work well with others' and 'team mindset', I do away with that kind of nonsense and just tell it like it is.
So, as with every role I work on, social fit is almost as important as technical fit. For this one, technical fit is very, very important, but we also have some social fit characteristics that matter. This is the kind of place that requires people to dive in and learn. The hiring manager for this one is actually a very dear friend of mine, and he said something interesting to me not long ago: if you aren't spending at least an hour a day learning something new, you really are doing yourself a disservice. This is that classic environment where no one says 'this is not my job', so the ability to jump in and help is needed for success in this role.
This role is being done onsite in Irvine, California. I prefer working with candidates that are already local to the area. If you need to relocate, that is fine, but there are no relocation dollars available.
I can only work with US Citizens or Green Card Holders for this role. I cannot work with H1, OPT, EAD, F1, H4, or anyone that is not already a US Citizen or Green Card Holder for this role.
The Data Engineer role is similar to the Data Integration role I posted. However, this one is more Ops focused: it centers on orchestrating deployment and MLflow, orchestrating and using data on the clusters, and managing how the models are performing. This role focuses on coding and configuring on the ML side of the house.
You will be designing, automating, and observing end-to-end data pipelines that feed this client's Kubeflow-driven machine learning platform, ensuring models are trained, deployed, and monitored on trustworthy, well-governed data. You will build batch and streaming workflows, wire them into Azure DevOps CI/CD, and surface real-time health metrics in Prometheus + Grafana dashboards to guarantee data availability. The role bridges Data Engineering and MLOps, so data scientists can focus on experimentation while the business gets rapid, reliable predictive insight.
Here are some of the main responsibilities:
Design and implement batch and streaming pipelines in Apache Spark running on Kubernetes and Kubeflow Pipelines to hydrate feature stores and training datasets.
Build high-throughput ETL/ELT jobs with SSIS, SSAS, and T-SQL against MS SQL Server, applying Data Vault-style modeling patterns for auditability.
Integrate source control, build, and release automation using GitHub Actions and Azure DevOps for every pipeline component.
Instrument pipelines with Prometheus exporters and visualize SLA, latency, and error budget metrics to enable proactive alerting.
Create automated data quality and schema drift checks; surface anomalies to support a rapid incident response process.
Use MLflow Tracking and Model Registry to version artifacts, parameters, and metrics for reproducible experiments and safe rollbacks.
Work with data scientists to automate model retraining and deployment triggers within Kubeflow based on data freshness or concept drift signals.
Develop PowerShell and .NET utilities to orchestrate job dependencies, manage secrets, and publish telemetry to Azure Monitor.
Optimize Spark and SQL workloads through indexing, partitioning, and cluster sizing strategies, benchmarking performance in CI pipelines.
Document lineage, ownership, and retention policies; ensure pipelines conform to PCI/SOX and internal data governance standards.
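One of the responsibilities above, automated schema drift checks, can be sketched in plain Python. This is a hypothetical, framework-free illustration (the column names and expected schema are invented for the example; the client's actual checks would run against Spark/SQL Server data):

```python
# Hypothetical sketch of a schema drift check: compare an incoming
# batch's columns and types against an expected schema and report
# anomalies before the data reaches downstream training pipelines.

EXPECTED_SCHEMA = {"customer_id": "int", "event_ts": "str", "amount": "float"}

def detect_schema_drift(expected, actual):
    """Return a list of human-readable drift findings (empty if none)."""
    findings = []
    for col, typ in expected.items():
        if col not in actual:
            findings.append(f"missing column: {col}")
        elif actual[col] != typ:
            findings.append(f"type change on {col}: {typ} -> {actual[col]}")
    for col in actual:
        if col not in expected:
            findings.append(f"unexpected column: {col}")
    return findings

# An incoming batch where 'amount' arrived as a string and a new
# 'channel' column appeared.
incoming = {"customer_id": "int", "event_ts": "str", "amount": "str", "channel": "str"}
for finding in detect_schema_drift(EXPECTED_SCHEMA, incoming):
    print(finding)
```

In a real pipeline, findings like these would be published as metrics or alerts (for example, via a Prometheus exporter) to drive the rapid incident response the role describes.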
Here is what we are seeking:
At least 6 years of experience building data pipelines in Spark or equivalent.
At least 2 years deploying workloads on Kubernetes/Kubeflow.
At least 2 years of experience with MLflow or similar experiment‑tracking tools.
At least 6 years of experience in T‑SQL, Python/Scala for Spark.
At least 6 years of PowerShell/.NET scripting.
At least 6 years of experience with GitHub, Azure DevOps, Prometheus, Grafana, and SSIS/SSAS.
Kubernetes CKA/CKAD, Azure Data Engineer (DP‑203), or MLOps‑focused certifications (e.g., Kubeflow or MLflow) would be great to see.
Willingness to mentor engineers on best practices in containerized data engineering and MLOps.
Oracle Agile PLM 9.3.X Business Analyst
Data analyst job in Irvine, CA
Base salary range: Market Rate
Experience Range: 7+ years
Must-Have Technical/Functional Skills: Oracle Agile PLM 9.3.6 Business Analyst; semiconductor industry domain
Collaborate closely with business users to understand their needs and translate them into clear technical and functional requirements.
Conduct meetings and workshops with stakeholders confidently and effectively.
Analyze, document, and optimize PLM/PDM business processes across engineering, manufacturing, and supply chain functions.
Configure and support Oracle Agile PLM 9.3.6 modules, primarily Product Collaboration (PC) and Portfolio & Program Management (PPM).
Assist in the development and implementation of business systems and processes aligned with PLM strategy.
Define test plans and participate in testing activities including unit testing, integration testing, and system testing.
Support data migration, validation, and cleansing efforts during system transitions.
Provide day-to-day support and troubleshooting for Agile PLM users.
Ensure alignment with SAP S/4 HANA ERP system and support integration efforts.
Utilize Azure DevOps and Jira for ticketing, tracking, and collaboration.
Collaborate with cross-functional teams; occasional travel between sites may be required.
7+ years of experience as a Business Analyst with a focus on Oracle Agile PLM 9.3.6, including configuration responsibilities.
Strong understanding of PLM/PDM concepts, workflows, and best practices.
Proven experience in the semiconductor or high-tech manufacturing industries.
Hands-on experience with Agile PLM modules such as Product Collaboration (PC) and Portfolio & Program Management (PPM).
Familiarity with SAP S/4 HANA ERP system and its integration with PLM platforms.
Proficiency in Azure DevOps and Jira ticketing systems.
Excellent communication and stakeholder engagement skills.
Ability to work independently in a fast-paced, onsite environment
Generic Managerial Skills, if any
Should be able to lead and drive operational issues independently.
Senior Data Engineer
Data analyst job in Glendale, CA
Our client is seeking a Senior Data Engineer to join their team! This position is located in Glendale, California.
Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
Build tools and services to support data discovery, lineage, governance, and privacy
Collaborate with other software and data engineers and cross-functional teams
Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS
Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
Contribute to developing and documenting internal and external standards and best practices for pipeline configurations, naming conventions, and more
Ensure high operational efficiency and quality of Core Data platform datasets to meet SLAs and ensure reliability and accuracy for stakeholders in Engineering, Data Science, Operations, and Analytics
Participate in agile and scrum ceremonies to collaborate and refine team processes
Engage with customers to build relationships, understand needs, and prioritize both innovative solutions and incremental platform improvements
Maintain detailed documentation of work and changes to support data quality and data governance requirements
Desired Skills/Experience:
5+ years of data engineering experience developing large data pipelines
Proficiency in at least one major programming language, such as Python, Java, or Scala
Strong SQL skills and the ability to create queries to analyze complex datasets
Hands-on production experience with distributed processing systems such as Spark
Experience interacting with and ingesting data efficiently from API data sources
Experience coding with the Spark DataFrame API to create data engineering workflows in Databricks
Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
Experience developing APIs with GraphQL
Deep understanding of AWS or other cloud providers, as well as infrastructure-as-code
Familiarity with data modeling techniques and data warehousing best practices
Strong algorithmic problem-solving skills
Excellent written and verbal communication skills
Advanced understanding of OLTP versus OLAP environments
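As a small illustration of the SQL analysis skills listed above, here is the kind of aggregate query the role calls for, using Python's built-in sqlite3 as a stand-in for a warehouse. The table and column names are invented for the example; the client's actual environment (Databricks, Delta Lake) would differ:

```python
import sqlite3

# Illustrative only: a tiny in-memory table standing in for a
# warehouse dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "NA", 10.0), (2, "NA", 30.0), (3, "EU", 5.0), (1, "NA", 20.0)],
)

# Revenue and distinct users per region, highest-revenue region first.
rows = conn.execute(
    """
    SELECT region,
           COUNT(DISTINCT user_id) AS users,
           SUM(amount)             AS revenue
    FROM events
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()
for region, users, revenue in rows:
    print(region, users, revenue)
```

The same GROUP BY / aggregate pattern translates directly to Spark SQL or the Spark DataFrame API mentioned in the requirements.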
Benefits:
Medical, Dental, & Vision Insurance Plans
Employee-Owned Profit Sharing (ESOP)
401K offered
The approximate pay range for this position is between $51.00 and $73.00. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.