Senior QA Analyst with Financial Transaction Experience
Senior data analyst job in California
About the role
Cognizant is looking to hire an experienced Senior QA Analyst with Financial Transaction experience. As a Senior QA Analyst, you will make an impact by supporting and encouraging Junior Test Analysts to communicate directly with business and development partners as they gain confidence.
Work model: Hybrid
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring 3-4 days a week in a client or Cognizant office in Torrance, CA. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
The working arrangements for this role are accurate as of the date of posting. This may change based on the project you're engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.
What you need to have to be considered
• 5-8 years of experience in Testing.
• Strong knowledge of financial transaction flows.
What you'll be doing in this role
• Act as the primary point of contact for the project
• Act as the primary coordination point with different partner teams and streams
• Act as the primary escalation point for any testing-related concerns
• Support and encourage Test Analysts to communicate directly with business and development partners as they gain confidence
• Understand the business processes and key business metrics
• Provide regular status updates to management
Salary and Other Compensation:
Applications will be accepted until December 01, 2025.
The annual salary for this position is between $53,477 and $92,500, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
Benefits:
Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
• Medical/Dental/Vision/Life Insurance
• Paid holidays plus Paid Time Off
• 401(k) plan and contributions
• Long-term/Short-term Disability
• Paid Parental Leave
• Employee Stock Purchase Plan
Cognizant will only consider applicants for this position who are legally authorized to work in the United States without company sponsorship.
*Please note, this role is not able to offer visa transfer or sponsorship now or in the future*
4-H Data Systems Analyst 3 - Davis, CA, Job ID 82838
Senior data analyst job in Davis, CA
Under the direction and supervision of the Statewide 4-H Director, the 4-H Data Systems Analyst applies advanced analytical concepts, organizational objectives, and database integration principles to assist with the management and development of the statewide 4-H enrollment and reporting system. This role involves analyzing extensive and multi-layered processes and problems; developing identified online system needs and solutions; collaborating to ensure all new and updated enrollment system processes will improve efficiency of the University of California 4-H (CA 4-H) Youth Development Program's enrollment system.
The incumbent provides subject-matter expertise to inform enrollment system design, data integrity, reporting, training, and compliance across related platforms used in CA 4-H. This includes serving as the primary liaison with vendors, county offices, statewide staff, and external partners to ensure the enrollment system and related tools meet program, policy, and compliance requirements.
The position is responsible for designing data methodologies, developing statewide enrollment reporting frameworks, and analyzing program participation trends to inform organizational decision-making. The analyst also leads requirements gathering and analysis to translate statewide operational, programmatic, and policy needs into technical specifications.
The 4-H Data Systems Analyst participates in the development of enrollment system training, resources, and system enhancements. The role requires the ability to manage multiple, high-level projects, anticipate and adapt to organizational needs, and deliver innovative, data-driven solutions that increase efficiency, compliance, and program effectiveness across CA 4-H. This position independently applies advanced data systems concepts to resolve complex issues and shape statewide system functions. The position also collaborates with the 4-H Policy Analyst to ensure that all applicable UC, state, federal, and 4-H policy changes are integrated into the enrollment system. The 4-H Data Analyst also collaborates on policy-based issues impacting the UC 4-H enrollment system, UC ANR digital enterprise system, and the national 4-H network for data management and enrollment reporting.
This position is a career appointment that is 100% fixed.
The home department is CA 4-H. While this position normally is based in Davis, CA, this position is eligible for hybrid flexible work arrangements for applicants living in the State of California at this time. Please note that hybrid flexible work arrangements are subject to change by the University.
Pay Scale: $81,500.00/year to $115,800.00/year
Job Posting Close Date:
This job is open until filled. The first application review date will be 12/16/2025.
Key Responsibilities:
40%
Statewide Data System Coordination and Support:
Provides strategic oversight and management of the statewide 4-H enrollment database and related systems, ensuring data integrity, compliance, and security.
Participates in the design and oversees implementation of system features, integrations, and workflows to increase efficiency and effectiveness of program operations.
Assists with the development of statewide methodologies for extracting, validating, and reporting data, ensuring alignment with UC, state, and federal reporting requirements.
Serves as primary liaison to vendors and developers, advocating California's system needs and ensuring successful system enhancements and problem resolution.
Ensures consistent application of data governance and quality assurance practices across all statewide enrollment data workflows.
Collaborates with Statewide 4-H Director, 4-H Policy Analyst and others to anticipate and interpret applicable policy changes (UC, state, federal and 4-H) and integrates them into enrollment system design and user processes.
20%
Data Analysis, Reporting, and Policy Support:
Designs and delivers advanced reporting dashboards, data visualizations, and analyses to support statewide monitoring, compliance, and decision-making.
Conducts complex analyses of program participation and system usage, identifying trends, gaps, and opportunities to inform leadership decisions.
Leads requirements gathering and analysis to translate statewide operational, programmatic, and policy needs into technical specifications and system configurations.
Serves as subject matter expert in translating program and policy requirements into actionable enrollment system processes.
30%
Training, Communication, & Statewide Support:
Assists with the design and implementation of statewide training programs, guidance materials, and communication strategies for all 4-H data system users, including county staff, volunteers, and families.
Delivers advanced, multi-platform trainings (virtual and in-person), ensuring consistent statewide understanding and compliance.
Coaches and advises county-level staff on complex system and policy questions, providing advanced-level troubleshooting and guidance.
Represents California 4-H in national peer groups and committees related to enrollment and data systems, sharing best practices and advocating for program needs.
10%
Additional Systems & Financial Reporting System:
Provides secondary technical support for additional online 4-H systems, including the statewide financial reporting platform, as needed.
Advises on future CA 4-H enrollment system technology adoption, integration, and system expansion opportunities to strengthen program operations. Reviews enrollment system functions for increased efficiencies in enrollment procedures and overall data collection and use.
Provides subject-matter expertise to evaluate system functionality and recommend improvements to support statewide operational efficiency.
Requirements:
Bachelor's degree in a related field and extensive professional experience in data systems management, reporting, and analysis, or equivalent combination of education and experience
Demonstrated expertise in database design, system implementation, and data security/integrity practices, including handling complex and sensitive data.
Thorough knowledge of data visualization and reporting tools; ability to design dashboards and decision-support tools for executive audiences.
Strong analytical, problem-solving, collaboration, and decision-making skills; ability to resolve highly complex issues requiring evaluation of multiple factors, both independently and collaboratively.
Excellent written and verbal communication skills; ability to communicate technical concepts to diverse audiences.
Ability to anticipate organizational needs, translate policy into operational procedures, and recommend strategic improvements.
Demonstrated strong proficiency using Microsoft Office, Zoom, Google Workspace applications, Box, and similar collaboration and communication software tools.
Preferred Skills:
Master's degree in a related field and significant professional experience in data systems management, reporting, and analysis, and/or equivalent combination of education and experience.
Knowledge of Cooperative Extension.
4-H knowledge of program delivery, including delivery modes.
Experience managing vendor relationships and system development projects.
Coding knowledge and experience
Fluency in Spanish
Special Conditions of Employment:
Must possess a valid California Driver's License to drive a County or University vehicle. Ability and means to travel on a flexible schedule as needed; proof of liability damage insurance on the vehicle used is required. Job-related travel will be reimbursed according to University policies.
Travel including travel outside normal business hours may be requested.
The University reserves the right to make employment contingent upon successful completion of the background check. This is a designated position requiring a background check and may require fingerprinting due to the nature of the job responsibilities. UC ANR does hire people with conviction histories and reviews information received in the context of the job responsibilities.
As of January 1, 2014, ANR is a smoke- and tobacco-free environment in which smoking, the use of smokeless tobacco products, and the use of unregulated nicotine products (e-cigarettes), is strictly prohibited.
As a condition of employment, you will be required to comply with the University of California Policy on Vaccination Programs (https://apptrkr.com/get_redirect.php?id=6769020), as may be amended or revised from time to time. Federal, state, or local public health directives may impose additional requirements.
Employees must exercise the utmost discretion in managing sensitive information learned in the course of performing their duties. Sensitive information includes, but is not limited to, employee and student records, health and patient records, financial data, strategic plans, proprietary information, and any other sensitive or non-public information learned during the course and scope of employment. Employees are expected to understand that sensitive information should be shared on a limited basis, to actively take steps to limit access to sensitive information to individuals who have a legitimate business need to know, to ensure that sensitive information is properly safeguarded, and to follow all organizational policies and laws on data protection and privacy. This includes secure handling of physical and digital records and proper usage of IT systems to prevent data leaks. The unauthorized or improper disclosure of confidential work-related information obtained from any source on any work-related matter is a violation of these expectations.
Misconduct Disclosure Requirement: As a condition of employment, the final candidate who accepts a conditional offer of employment will be required to disclose if they have been subject to any final administrative or judicial decisions within the last seven years determining that they committed any misconduct; received notice of any allegations or are currently the subject of any administrative or disciplinary proceedings involving misconduct; have left a position after receiving notice of allegations or while under investigation in an administrative or disciplinary proceeding involving misconduct; or have filed an appeal of a finding of misconduct with a previous employer.
a. "Misconduct" means any violation of the policies or laws governing conduct at the applicant's previous place of employment, including, but not limited to, violations of policies or laws prohibiting sexual harassment, sexual assault, or other forms of harassment, discrimination, dishonesty, or unethical conduct, as defined by the employer. For reference, below are UC's policies addressing some forms of misconduct:
UC Sexual Violence and Sexual Harassment Policy
UC Anti-Discrimination Policy
Abusive Conduct in the Workplace
To apply, please visit: https://careerspub.universityofcalifornia.edu/psc/ucanr/EMPLOYEE/HRMS/c/HRS_HRAM_FL.HRS_CG_SEARCH_FL.GBL?Page=HRS_APP_JBPST_FL&JobOpeningId=82838&PostingSeq=1&SiteId=17&languageCd=ENG&FOCUS=Applicant
Sr. Data Analyst / Engineer (Microsoft Fabric & PowerBI)
Senior data analyst job in Santa Ana, CA
XP3R is a fast-growing data and technology consulting firm that helps organizations unlock the power of data, analytics, AI, and technology to make smarter & faster decisions. We partner with clients to turn complex data into actionable insights that drive better decisions and measurable results, and encapsulate data & insights into custom-tooling for interactivity and visibility.
What makes us different is how we work; we combine structure with curiosity, and strategy with execution. We move fast and focus on value-adding outcomes, and we don't back down from any challenge. Our team is made up of builders, problem-solvers, and lifelong learners who love the challenge of transforming ambiguity into clarity. At XP3R, you'll have the freedom to experiment, the support to grow, and the opportunity to make an impact that's visible from day one. We're a small, ambitious team, and every hire has a direct impact on how we grow.
The Role
We are looking for a Senior Data Analyst + Engineer who thrives at the intersection of business strategy and technical execution. This is not a “back-office dashboard role”; you'll be engaging directly with clients, advising on business strategy, and then building the technical solutions that bring it to life. This role requires a deep fundamental understanding of data, data modeling, and data architecture. You will be expected to design & build with Microsoft Fabric and Power BI and to both consult and deliver. You'll lead the full lifecycle: data sourcing, modeling, pipelines, dashboards, and business alignment.
What You'll Do
Partner with clients to understand business needs and translate them into technical solutions.
Design and implement data models by combining data + business knowledge that align with client requirements and scale effectively.
Lead data sourcing and requirements gathering, including working with APIs, exploratory analysis, and integrating disparate data systems.
Build and maintain Fabric ETL pipelines (Dataflow Gen2, PySpark notebooks, data pipelines).
Develop and optimize Power BI dashboards, including advanced DAX, calculation groups, and data visualization best practices.
Create dashboard mockups and prototypes to guide client conversations before implementation.
Design & stand up systems to enforce data validations and QA/QC in Microsoft Fabric (a brief sketch follows this list).
Collaborate with leadership to define solution architecture - and ideally, take the lead on designing architecture yourself.
Serve as a trusted advisor to clients, bridging technical expertise with business insights.
Mentor junior team members and raise the standard of technical excellence across XP3R.
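To make the data-validation item above concrete, here is a minimal, illustrative sketch (not drawn from the posting) of a pre-publish quality check as it might appear in a Fabric PySpark notebook; the table and column names (silver_sales, order_id, net_amount) are hypothetical placeholders:

from pyspark.sql import SparkSession, functions as F

# In a Microsoft Fabric notebook a SparkSession is provided automatically;
# getOrCreate() reuses it (or builds one when the script runs elsewhere).
spark = SparkSession.builder.getOrCreate()

df = spark.read.table("silver_sales")  # hypothetical curated lakehouse table

# Simple quality checks run before the table is exposed to Power BI.
checks = {
    "table is empty": df.count() == 0,
    "null order ids": df.filter(F.col("order_id").isNull()).count() > 0,
    "negative net amounts": df.filter(F.col("net_amount") < 0).count() > 0,
    "duplicate order ids": df.count() != df.select("order_id").distinct().count(),
}

failures = [name for name, failed in checks.items() if failed]
if failures:
    # Failing the notebook keeps bad data out of downstream dashboards.
    raise ValueError("Data quality checks failed: " + ", ".join(failures))
print("All data quality checks passed.")

In practice a check like this could be wired as a notebook step in a Fabric data pipeline so a failed validation halts the run before the dataset refresh.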
What We're Looking For
At XP3R, we look for people who blend technical mastery with strategic insight. You're someone who can move seamlessly between data modeling and client conversations, transforming complex challenges into clear, scalable solutions. We value leaders who stay curious, think critically, and take ownership of outcomes from concept to delivery. The right person will combine technical depth with business presence:
Technically Excellent - You are rooted in the technical, and have a knack for figuring things out to drive implementation and delivery of work product. Your experience allows you to build high-quality product fast; exposure to SQL, Python, R, Power BI, or similar tools is required.
Owns the Work - You take initiative, explore solutions, and learn new tools without waiting for direction. You take ownership of work & hold yourself accountable for delivery.
Entrepreneurial by Nature - You're motivated by challenge and want to help build something meaningful, not just maintain it.
Quick to Learn - You adapt fast, connect ideas quickly, and enjoy turning new knowledge into action. You excel at switching between different contexts & tasks and pick up skills as you go.
Collaborative & Reliable - You elevate the people around you through communication, structure, and a sense of shared purpose
Salary Range: $100,000 - $220,000 (DOE)
Data Modeler
Senior data analyst job in Sacramento, CA
We are seeking a senior, hands-on Data Analyst / Data Modeler with strong communication skills for a long-term hybrid assignment with our Sacramento, California-based client. This position requires you to be able to work on-site in Sacramento, California, on Mondays and Wednesdays each week. Candidates must currently live within 60 miles of Sacramento, CA.
Requirements:
Senior, hands-on data modeler with strong communication skills.
Expert-level command of the ER/Studio Data Architect modeling application
Strong ability to articulate data modeling principles and gather requirements from non-technical business stakeholders
Excellent presentation skills for different audiences (business and technical), ranging from senior-level leadership to operational staff, with no supervision required
Ability to translate business and functional requirements into technical requirements for technical team members.
Candidates must be able to demonstrate recent, direct, hands-on practical experience in the areas identified, with specific examples.
Qualifications:
Mandatory:
Minimum of ten (10) years of demonstrable experience in the data management space, with at least 5 years specializing in database design and at least 5 years in data modeling.
Minimum of five (5) years of experience as a data analyst or in other quantitative analysis or related disciplines, such as a researcher or data engineer, supportive of key duties/responsibilities identified above.
Minimum of five (5) years of demonstrated experience with ER/Studio data modeling application
Minimum of five (5) years of relevant experience in relational data modeling and dimensional data modeling, statistical analysis, and machine learning, supportive of key duties/responsibilities identified above.
Excellent communication and collaboration skills to work effectively with stakeholders and team members.
At least 2 years of experience working on Star, Snowflake, and/or Hybrid schemas
Oracle/ODI/OCI/ADW experience required
Desired:
At least 2 years of experience working on Oracle Autonomous Data Warehouse (ADW), specifically installed in an OCI environment.
Expert-level Kimball Dimensional Data Modeling experience (a brief star-schema sketch follows at the end of this list)
Expert-level experience developing in Oracle SQL Developer or ER/Studio Data Architect for Oracle.
Ability to develop and perform Extract, Transform, and Load (ETL) activities using Oracle tools and PL/SQL, with at least 2 years of experience. Ability to provide technical leadership for an Oracle data warehouse team, including but not limited to ETL, requirements solicitation, DBA, data warehouse administration, and data analysis on a hands-on basis.
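As a purely illustrative sketch of the star-schema and dimensional-modeling skills listed above, the snippet below creates one fact table and two dimensions through python-oracledb; the connection details and all table and column names are hypothetical placeholders, not details of the client's environment:

import oracledb

# Placeholder credentials and DSN; a real assignment would use the client's ADW/OCI connection.
conn = oracledb.connect(user="modeler", password="***", dsn="adw_high")
cur = conn.cursor()

# Date dimension
cur.execute("""
    CREATE TABLE dim_date (
        date_key      NUMBER PRIMARY KEY,
        calendar_date DATE NOT NULL,
        fiscal_period VARCHAR2(10)
    )""")

# Product dimension
cur.execute("""
    CREATE TABLE dim_product (
        product_key  NUMBER PRIMARY KEY,
        product_name VARCHAR2(100),
        category     VARCHAR2(50)
    )""")

# Fact table at the grain of one sale line, keyed to both dimensions (a classic star).
cur.execute("""
    CREATE TABLE fact_sales (
        date_key    NUMBER NOT NULL REFERENCES dim_date (date_key),
        product_key NUMBER NOT NULL REFERENCES dim_product (product_key),
        quantity    NUMBER,
        net_amount  NUMBER(12, 2)
    )""")

conn.close()

A snowflake variant would normalize dim_product further (for example, a separate category table), while a hybrid schema mixes the two approaches.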
Business Intelligence Analyst - Tableau
Senior data analyst job in Santa Rosa, CA
About the Role
We are seeking a Tableau Report Developer to join our Data & Analytics team in San Francisco. This role is critical to building and maintaining high-quality business reporting that drives decision-making across our retail brands. You will work closely with stakeholders in finance, operations, merchandising, and leadership to deliver insights that directly impact growth.
THIS IS A HYBRID POSITION BASED IN OUR SAN FRANCISCO OFFICE. CANDIDATES MUST BE ABLE TO COMMUTE TO THE OFFICE 3 DAYS A WEEK.
Responsibilities
● Design, develop, and maintain Tableau dashboards and reports that provide actionable insights to business teams.
● Translate business questions into effective data visualizations and reporting solutions.
● Partner with stakeholders to understand requirements, gather feedback, and refine reporting deliverables.
● Perform data analysis to validate trends, identify anomalies, and ensure accuracy of reporting.
● Work with the data engineering team to improve data pipelines and ensure reliable data availability.
● Provide ad-hoc reporting support for retail, e-commerce, and cross-functional business partners.
Requirements
● 3+ years of professional experience developing Tableau dashboards and reports.
● Strong background in data analysis and business reporting.
● Excellent ability to engage with business stakeholders, translating needs into technical solutions.
● Experience in retail or e-commerce analytics highly preferred.
● Solid SQL skills and familiarity with cloud-based data warehouses (e.g., Snowflake, Domo).
● Strong communication and collaboration skills.
Business Intelligence Analyst
Senior data analyst job in Fremont, CA
HCLTech is looking for a highly talented and self-motivated Business Intelligence Analyst to join us in advancing the technological world through innovation and creativity.
Job Title: Business Intelligence Analyst
Position Type: Full Time
Location: Onsite
Role Overview
Mandatory skills - MySQL, Python, Tableau, and simulation tool experience
Strong ability in translating business requirements and needs into analytic solutions, within multiple areas in IT and with various stakeholders, including key leaders and managers.
• Leverage data to understand in depth IT business processes, identify areas of opportunity for process improvement.
• Write queries, analyze, visualize, and provide analytics on data to build reporting solutions to support various company initiatives. E.g., build rich and dynamic dashboards using Tableau.
• Support project development life cycles through data modeling, reporting and analytics.
• Participate in the on-going development of the business intelligence and data warehousing functions within the wider organization.
• Create training materials to guide business users on how to use dashboards.
• Participate in the creation and support of development standards and best practices.
• Explore and recommend emerging technologies and techniques to support/enhance BI landscape components.
• Automate solutions where appropriate.
Skills
• At least 4-6 years of business intelligence and data warehouse experience.
• At least 2 years of experience with ANSI SQL, Presto, Hive, or MySQL.
• At least 1 year of experience with Tableau.
• Prefer a candidate with scripting experience (Python, R, JavaScript, PHP, Perl, Ruby, etc.)
• Prefer a candidate with experience building and maintaining pipelines
• Knowledge of ETL processes and designs.
Pay and Benefits
Pay Range Minimum: $59,000 per year
Pay Range Maximum: $109,000 per year
HCLTech is an equal opportunity employer, committed to providing equal employment opportunities to all applicants and employees regardless of race, religion, sex, color, age, national origin, pregnancy, sexual orientation, physical disability or genetic information, military or veteran status, or any other protected classification, in accordance with federal, state, and/or local law. Should any applicant have concerns about discrimination in the hiring process, they should provide a detailed report of those concerns to ****************** for investigation.
A candidate's pay within the range will depend on their skills, experience, education, and other factors permitted by law. This role may also be eligible for performance-based bonuses subject to company policies. In addition, this role is eligible for the following benefits subject to company policies: medical, dental, vision, pharmacy, life, accidental death & dismemberment, and disability insurance; employee assistance program; 401(k) retirement plan; 10 days of paid time off per year (some positions are eligible for need-based leave with no designated number of leave days per year); and 10 paid holidays per year
How You'll Grow
At HCLTech, we offer continuous opportunities for you to find your spark and grow with us. We want you to be happy and satisfied with your role and to really learn what type of work sparks your brilliance the best. Throughout your time with us, we offer transparent communication with senior-level employees, learning and career development programs at every level, and opportunities to experiment in different roles or even pivot industries. We believe that you should be in control of your career with unlimited opportunities to find the role that fits you best.
Business Analyst
Senior data analyst job in Santa Clara, CA
Analyzing current business processes, workflows and procedures to identify areas for improvement.
Developing and implementing optimized processes and procedures to enhance efficiency, productivity and customer satisfaction.
Collaborating with stakeholders to design and execute new Quote to Cash products aligned with organizational goals.
Monitoring and evaluating the effectiveness of implemented features, measuring key performance indicators and making necessary adjustments for continuous improvement.
Staying updated on industry trends, emerging technologies, and process improvement methodologies.
Responsibilities:
Help drive the roadmap for business process automation related to subscription functions; Quote to Cash.
Gain cross functional alignment across product, sales ops, finance, legal, operations, services etc.
Plans, implements and monitors business process changes for projects
Assists in making business decisions relating to system implementation, modification, maintenance, etc.
Analyze complex business problems relating to end-to-end processes involving CPQ, Subscription Management, Entitlement Management, ERP.
Collaborate closely with IT Architects and product management to define and implement effective and efficient solutions to help scale the business.
Assume a hands-on role in testing solutions, identifying issues, validating solution meets specification and communicate to team members.
Presents analyses, solutions and business cases to senior management
Coordinates with cross-functional team to develop business process requirements
Experience:
10+ years of experience in a process analyst role, including substantial experience in Enterprise Applications (e.g., eCommerce, Salesforce, ERP, Zuora, etc.)
Strong knowledge in enterprise CRM, CPQ, Order to Cash, SaaS / Cloud applications, with an enthusiasm to gain and apply new knowledge.
Familiarity with process mapping and modeling techniques.
Good understanding of business processes including CPQ, eCommerce, Contract, Subscription, ERP, Entitlement and Fulfillment systems.
Strong analytical skills with the ability to collect, study and interpret complex data.
Ability to manage and perform multiple complex tasks as part of the daily work assignment.
Systems / Tools:
Google Workspace (Gmail, Sheets, Docs, Slides, Calendar)
MSFT Office Application Suite
SFDC CPQ, Contracts and CRM modules
Strong communication and presentation skills to effectively communicate process changes and recommendations to stakeholders at all levels.
Education:
Bachelor's degree in business administration, management or a related field.
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ************************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter's email id *******************************
ID: # 25-54457
Business System Analyst
Senior data analyst job in Sunnyvale, CA
ServiceNow HRSD (HR Service Delivery) BSA
We are seeking a skilled HRSD Business Systems Analyst (BSA) to partner closely with the ServiceNow HRSD Architect, who acts as the bridge between HR business stakeholders and IT/ServiceNow development teams. The HRSD BSA will gather and analyze requirements, collaborate on designing HRSD solutions, and ensure the successful delivery of HR service management initiatives such as Configurable Workspace, Now Assist with Case & Knowledge Management, Employee Center Pro, and Workday integrations.
In addition to project delivery, the BSA will support Business-As-Usual (BAU) activities, including minor enhancements, defect fixes, incident triage, and process improvements to ensure the ongoing stability and efficiency of HRSD services.
Key Responsibilities
Requirements Gathering & Analysis
Collaborate with HR stakeholders / People Systems team to elicit, document, and prioritize requirements for HRSD modules, with a focus on Configurable Workspace, Now Assist, Case & Knowledge Management, and Employee Center Pro.
Identify gaps in current HR processes and recommend solutions leveraging ServiceNow HRSD capabilities.
Solution Design & Collaboration
Partner with the ServiceNow HRSD Architect and development teams to design scalable, efficient HRSD solutions.
Translate business needs into functional specifications, user stories, acceptance criteria, and process flows.
Project Delivery Support
Assist in sprint planning, backlog grooming, and prioritization of stories with HR business stakeholders.
Support SIT (System Integration Testing) and UAT (User Acceptance Testing), including test case preparation, execution, and validation for both new functionality and upgrade initiatives.
Configurable Workspace Implementation
Support design, testing, and rollout of HR Agent Workspaces (Configurable), ensuring workflows are intuitive, efficient, and aligned with HR operational needs.
Now Assist Implementation
Collaborate on configuring and optimizing Now Assist for HR fulfillers to improve case handling efficiency and knowledge recommendations.
Business-As-Usual (BAU) Support
Support ongoing HRSD operations, including minor enhancements, defect fixes, and incident triage.
Monitor and ensure adherence to SLAs for incident resolution and defect turnaround.
Recommend process improvements and automation opportunities to reduce manual effort and improve HR service efficiency.
Integration & Data Support
Work with integration teams to support Workday HCM and other HR systems integrations.
Ensure data accuracy and consistency across HRSD modules and integrated systems.
Stakeholder Communication
Act as a liaison between HR, IT, and development teams to ensure alignment on requirements, timelines, and priorities.
Facilitate workshops, demos, and training sessions as needed to ensure adoption of HRSD solutions.
Continuous Improvement
Identify opportunities to enhance HRSD processes, tools, and agent experiences.
Stay updated on ServiceNow HRSD best practices, new releases, and emerging capabilities relevant to HR fulfillers.
Required Skills & Qualifications
• 6+ years of experience as a Business Systems Analyst or similar role, preferably in ServiceNow HRSD environments.
• Strong understanding of HR processes such as HR case resolution, knowledge management, and Employee Relations
• Hands-on experience with Configurable Workspace, Now Assist, Case & Knowledge Management, and Employee Center Pro.
• Familiarity with Workday HCM or other HR system integrations.
• Understanding of ServiceNow platform concepts, data model, tables, workflows, and security model.
• Ability to gather, analyze, and document business requirements, functional specifications, user stories, and acceptance criteria.
• Strong analytical and problem-solving skills, with attention to detail.
• Experience supporting Business-As-Usual (BAU) operations, including defect triage, minor enhancements, and incident resolution.
• Experience working in Agile environments, including sprint planning, backlog management, and stakeholder prioritization.
• Ability to collaborate effectively with HR stakeholders, IT teams, developers, and architects.
• Strong facilitation skills for workshops, demos, and training sessions.
• Excellent written and verbal communication skills, with the ability to translate technical concepts to business users and vice versa.
• Ability to influence and negotiate priorities with multiple stakeholders.
• Knowledge of ServiceNow upgrade processes and best practices.
• Experience with reporting and analytics within ServiceNow HRSD is a plus.
Soft Skills
• Strong analytical and problem-solving abilities.
• Ability to work independently and as part of a team.
• Attention to detail and a commitment to quality.
Preferred Certifications
• ServiceNow Certified System Administrator (CSA)
• ServiceNow HRSD Implementation Specialist
• Agile or Scrum certification (CSM/PSM)
Advisory Azure Data Architect
Senior data analyst job in San Diego, CA
EdgeX builds infrastructure-free IoT systems for real-time asset tracking and environmental monitoring, primarily in healthcare and industrial environments. Our sensors and platform operate entirely independent of facility IT, enabling instant deployment and insights. As part of our platform evolution, we have implemented Microsoft Azure Data Explorer (ADX) to enhance scalability, performance, and long-term maintainability across our data ecosystem.
Role
We are seeking an Advisory Azure Data Architect with deep expertise in IoT, time-series data, and observability platforms to provide expert oversight and validation as we transition more services and reporting workloads to ADX. This is a part-time, advisory engagement - ideal for a seasoned architect who can dedicate a few hours per week to help ensure that our ADX implementation, data migration strategy, and observability stack are aligned with best practices for scalability, performance, and cost efficiency.
Key Responsibilities
• Act as a strategic advisor to our engineering and infrastructure teams as we integrate additional systems and reports with ADX.
• Review and validate existing ADX configurations, ingestion pipelines, and data models.
• Provide guidance on data migration strategies, retention policies, and performance optimization.
• Ensure proper observability and monitoring are in place using tools such as Datadog, Azure Monitor, and related platforms.
• Offer best practices for managing and scaling telemetry and IoT data workloads.
• Participate in periodic architecture reviews and provide actionable feedback to technical leadership.
What We're Looking For
• Proven experience designing and optimizing Azure Data Explorer (ADX) environments at scale.
• Strong background in IoT, time-series data, and streaming architectures.
• Hands-on familiarity with observability tools such as Datadog, Azure Monitor, Grafana, or Prometheus.
• Understanding of related Azure services (Event Hubs, IoT Hub, Data Factory, Synapse, Log Analytics).
• Excellent communication and advisory skills; able to collaborate effectively with both technical and leadership teams.
• Prior consulting or fractional advisory experience preferred.
Engagement Details
• Part-time, advisory role (approx. 2-5 hours per week).
• Flexible schedule; remote candidates welcome.
• Focused on mid-to-late project validation and optimization, not initial architecture design.
Snowflake Data Architect
Senior data analyst job in Santa Clara, CA
Senior Snowflake Data Engineer (Contract | Long-Term)
We're partnering with an enterprise data platform team on a long-term initiative where Snowflake is the primary cloud data warehouse supporting analytics and reporting at scale. This role is ideal for someone whose core strength is Snowflake, with some experience working alongside Databricks in modern data ecosystems.
What you'll be doing
Building and maintaining ELT pipelines primarily in Snowflake
Writing, optimizing, and troubleshooting complex Snowflake SQL
Managing Snowflake objects: virtual warehouses, schemas, streams, tasks, and secure views (see the sketch after this list)
Supporting performance tuning, cost optimization, and warehouse sizing
Collaborating with analytics and business teams to deliver trusted datasets
Integrating upstream or adjacent processing from Databricks where applicable
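As a hedged illustration of the stream/task work referenced above, the sketch below uses the snowflake-connector-python package to set up a change-capture stream and a scheduled task; the account settings and all object names (orders_raw, orders_stream, orders_silver, transform_wh) are hypothetical, not details of the engagement:

import snowflake.connector

# Placeholder connection; a real engagement would use the client's account, role, and secrets handling.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="***",
    warehouse="transform_wh",
    database="analytics",
    schema="staging",
)
cur = conn.cursor()

# Stream that captures inserts and updates on the raw table.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE orders_raw")

# Scheduled task that folds captured changes into the curated table.
cur.execute("""
    CREATE OR REPLACE TASK refresh_orders_silver
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
      MERGE INTO orders_silver t
      USING orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
""")

# Tasks are created suspended; resuming starts the schedule.
cur.execute("ALTER TASK refresh_orders_silver RESUME")
conn.close()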
What we're looking for
Strong, hands-on Snowflake Data Engineering experience (primary platform)
Advanced SQL expertise within Snowflake
Experience designing ELT pipelines and analytical data models
Working knowledge of Databricks / Spark in a production environment
Understanding of Snowflake governance, security, and cost controls
Nice to have
dbt experience
Experience supporting enterprise analytics or reporting teams
Exposure to cloud-based data platforms in large-scale environments
Engagement details
Contract (long-term)
Competitive hourly rate
Remote or hybrid (US-based)
This role is best suited for engineers who go deep in Snowflake and can collaborate across platforms when Databricks is part of the stack.
Data Architect
Senior data analyst job in Emeryville, CA
Role: Data Architect (SAP Data experience)
Reporting to the VP, Development & Data, you will build and lead a team responsible for leveraging data-driven insights and advanced analytics to optimize decision-making, improve operational efficiency, and drive strategic business value across the organization.
You will guide the design and implementation of Grocery Outlet's data science, AI, and data governance programs, ensuring robust data management, governance practices, and the development of innovative AI-enabled solutions that support key business areas such as merchandising, supply chain, marketing, and operations.
You will collaborate closely with senior stakeholders across Business Intelligence, Enterprise Data Architecture, and Business Solutions to align data and analytics investments with enterprise goals.
You are a visionary, strategic leader who sees data and AI as powerful enablers of business success. You combine deep technical expertise in data science, AI, and analytics with strong business acumen and excellent stakeholder communication. You are passionate about turning data into actionable insights that move the needle.
Requirements
Experience & Expertise
• 12+ years of experience in data science, analytics, or data governance roles
• 5+ years leading data science, AI, or enterprise analytics functions at a senior level
• Proven track record of successfully delivering AI and analytics solutions that drive measurable business impact
• Deep knowledge of machine learning techniques, predictive modeling, statistical analysis, and data visualization tools
• Strong understanding of data governance frameworks, data privacy, security, and regulatory compliance (e.g., CCPA, GDPR)
• Experience building and scaling analytics and data teams within retail, consumer products, or supply chain organizations
• Bachelor's degree in Computer Science, Data Science, Engineering, or a related field required; Master's or advanced degree strongly preferred
• Experience with SAP Data models and familiarity with various functions within SAP will be a big plus.
Leadership & Skills
• Exceptional executive communication and stakeholder engagement capabilities
• Collaborative and inclusive leader who builds strong relationships across business and technology functions
• Skilled at translating complex data concepts into clear business insights and strategic recommendations
• Metrics-driven and outcome-focused; disciplined in demonstrating the ROI of analytics investments
• Strong team-builder and mentor with a proven ability to attract, grow, and retain analytics talent
• Innovative thinker who proactively explores new techniques, tools, and industry trends
Job responsibilities
Strategic Leadership & Vision
• Define and execute comprehensive data science, data governance and AI strategy aligned with corporate priorities
• Act as an influential advisor to executive leaders and business stakeholders on leveraging AI and data-driven insights
• Champion the responsible and ethical use of AI and data, ensuring that all initiatives balance business value with transparency, security, and regulatory compliance
• Lead the development and execution of Grocery Outlet's enterprise data governance framework
AI & Data Science Development
• Lead the vision and roadmap for advancing enterprise data capabilities and scaling AI and advanced analytics across core business areas
• Lead enterprise data development initiatives to improve data quality, cleanse and standardize master data
• Drive close collaboration across the data team, business data organization, IT, and functional stakeholders to accelerate prototyping, testing, and deployment of analytics solutions
• Stay current on emerging AI techniques, platforms, and trends; introduce innovation and best practices
Data Governance & Management
• Establish and enforce standards, policies, and procedures for data quality, accuracy, security, privacy, and compliance
• Oversee the design and implementation of data lineage and master data management (MDM) practices
• Collaborate closely with Enterprise Data Architect to ensure data availability, integrity, and accessibility across the enterprise
• Define and publish KPIs and metrics for data governance effectiveness and maturity
People & Team Leadership
• Recruit, build, and lead high-performing teams of data & AI engineers, and data governance specialists
• Foster a culture of collaboration, accountability, and continuous learning within your teams
• Provide coaching and professional development to drive growth and career progression
• Manage relationships and performance with external vendors, consultants, and analytics partners
Performance & Value Realization
• Measure, track, and communicate the business impact and value of AI and data initiatives
• Develop clear success criteria, metrics, and dashboards for analytics-driven outcomes
• Partner with Finance and the PMO to quantify and articulate the ROI of investments in AI, data science, and governance
• Ensure transparency, timely delivery, and alignment of analytics projects with organizational goals
AWS Data Architect
Senior data analyst job in San Francisco, CA
Fractal is a strategic AI partner to Fortune 500 companies with a vision to power every human decision in the enterprise. Fractal is building a world where individual choices, freedom, and diversity are the greatest assets; an ecosystem where human imagination is at the heart of every decision. Where no possibility is written off, only challenged to get better. We believe that a true Fractalite is the one who empowers imagination with intelligence. Fractal has been featured as a Great Place to Work by The Economic Times in partnership with the Great Place to Work Institute and recognized as a 'Cool Vendor' and a 'Vendor to Watch' by Gartner.
Please visit Fractal | Intelligence for Imagination for more information about Fractal.
Fractal is looking for a proactive and driven AWS Lead Data Architect/Engineer to join our cloud and data tech team. In this role, you will work on designing the system architecture and solution, ensuring the platform is scalable while performant, and creating automated data pipelines.
Responsibilities:
Design & Architecture of Scalable Data Platforms
Design, develop, and maintain large-scale data processing architectures on the Databricks Lakehouse Platform to support business needs
Architect multi-layer data models including Bronze (raw), Silver (cleansed), and Gold (curated) layers for various domains (e.g., Retail Execution, Digital Commerce, Logistics, Category Management); a brief sketch follows this section.
Leverage Delta Lake, Unity Catalog, and advanced features of Databricks for governed data sharing, versioning, and reproducibility.
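The medallion layering above can be summarized with a minimal, assumption-laden PySpark sketch; the landing path, the bronze/silver/gold schemas (assumed to already exist), and the column names are hypothetical placeholders rather than details of any client environment:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime in practice

# Bronze: land the raw feed as-is, stamped with ingestion metadata.
bronze = (
    spark.read.json("/mnt/landing/pos_sales/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.mode("append").format("delta").saveAsTable("bronze.raw_pos_sales")

# Silver: cleanse and conform the bronze data.
silver = (
    spark.read.table("bronze.raw_pos_sales")
    .dropDuplicates(["transaction_id"])
    .withColumn("sale_date", F.to_date("sale_ts"))
    .filter(F.col("store_id").isNotNull())
)
silver.write.mode("overwrite").format("delta").saveAsTable("silver.pos_sales")

# Gold: curated aggregate ready for reporting and BI tools.
gold = (
    silver.groupBy("sale_date", "channel")
    .agg(
        F.sum("net_amount").alias("net_sales"),
        F.countDistinct("transaction_id").alias("orders"),
    )
)
gold.write.mode("overwrite").format("delta").saveAsTable("gold.daily_channel_sales")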
Client & Business Stakeholder Engagement
Partner with business stakeholders to translate functional requirements into scalable technical solutions.
Conduct architecture workshops and solutioning sessions with enterprise IT and business teams to define data-driven use cases
Data Pipeline Development & Collaboration
Collaborate with data engineers and data scientists to develop end-to-end pipelines using Python, PySpark, SQL
Enable data ingestion from diverse sources such as ERP (SAP), POS data, Syndicated Data, CRM, e-commerce platforms, and third-party datasets.
Performance, Scalability, and Reliability
Optimize Spark jobs for performance tuning, cost efficiency, and scalability by configuring appropriate cluster sizing, caching, and query optimization techniques.
Implement monitoring and alerting using Databricks Observability, Ganglia, Cloud-native tools
Security, Compliance & Governance
Design secure architectures using Unity Catalog, role-based access control (RBAC), encryption, token-based access, and data lineage tools to meet compliance policies.
Establish data governance practices including Data Fitness Index, Quality Scores, SLA Monitoring, and Metadata Cataloging.
Adoption of AI Copilots & Agentic Development
Utilize GitHub Copilot, Databricks Assistant, and other AI code agents for
Writing PySpark, SQL, and Python code snippets for data engineering and ML tasks.
Generating documentation and test cases to accelerate pipeline development.
Interactive debugging and iterative code optimization within notebooks.
Advocate for agentic AI workflows that use specialized agents for
Data profiling and schema inference.
Automated testing and validation.
Innovation and Continuous Learning
Stay abreast of emerging trends in Lakehouse architectures, Generative AI, and cloud-native tooling.
Evaluate and pilot new features from Databricks releases and partner integrations for modern data stack improvements.
Requirements:
Bachelor's or master's degree in computer science, Information Technology, or a related field.
8-12 years of hands-on experience in data engineering, with at least 5+ years on Python and Apache Spark.
Expertise in building high-throughput, low-latency ETL/ELT pipelines on AWS/Azure/GCP using Python, PySpark, SQL.
Excellent hands-on experience with workload automation tools such as Airflow, Prefect, etc.
Familiarity with building dynamic ingestion frameworks from structured/unstructured data sources including APIs, flat files, RDBMS, and cloud storage
Experience designing Lakehouse architectures with bronze, silver, gold layering.
Strong understanding of data modelling concepts, star/snowflake schemas, dimensional modelling, and modern cloud-based data warehousing.
Experience with designing Data marts using Cloud data warehouses and integrating with BI tools (Power BI, Tableau, etc.).
Experience with CI/CD pipelines using tools such as AWS CodeCommit, Azure DevOps, and GitHub Actions.
Knowledge of infrastructure-as-code (Terraform, ARM templates) for provisioning platform resources
In-depth experience with AWS Cloud services such as Glue, S3, Redshift, etc.
Strong understanding of data privacy, access controls, and governance best practices.
Experience working with RBAC, tokenization, and data classification frameworks
Excellent communication skills for stakeholder interaction, solution presentations, and team coordination.
Proven experience leading or mentoring global, cross-functional teams across multiple time zones and engagements.
Ability to work independently in agile or hybrid delivery models, while guiding junior engineers and ensuring solution quality
Must be able to work in PST time zone.
Pay:
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Fractal, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is: $150k - $180k. In addition, you may be eligible for a discretionary bonus for the current performance period.
Benefits:
As a full-time employee of the company or as an hourly employee working more than 30 hours per week, you will be eligible to participate in the health, dental, vision, life insurance, and disability plans in accordance with the plan documents, which may be amended from time to time. You will be eligible for benefits on the first day of employment with the Company. In addition, you are eligible to participate in the Company 401(k) Plan after 30 days of employment, in accordance with the applicable plan terms. The Company provides for 11 paid holidays and 12 weeks of Parental Leave. We also follow a “free time” PTO policy, allowing you the flexibility to take the time needed for either sick time or vacation.
Fractal provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
Database Analyst
Senior data analyst job in Pomona, CA
AVID Technical Resources is seeking an Analyst to support our client's database project. Must be located in or near Pomona, CA or Monmouth, OR.
Required Skills:
Data retrieval and automation
Oracle Database knowledge
SQL and data modeling experience
Power BI, Power Automate, Power Apps, and the Microsoft Office suite
Azure / AWS is a bonus!
Excellent written and oral communication skills
Imaging Application Analyst (RIS, Radiology PACS, Cardiology PACS, Voice Dictation, Advanced Post Processing, RadOnc EMR, EKG/EEG, GI)
Senior data analyst job in Monterey Park, CA
Sr. Imaging Applications Analyst
Salary Range: $130k to $150k
The Imaging Applications Analyst, Senior is responsible for the design, implementation, validation, and support of multiple imaging applications (RIS, Radiology PACS, Cardiology PACS, Voice Dictation, Advanced Post Processing, RadOnc EMR, EKG/EEG, GI) and related ancillary systems.
The Imaging Applications Analyst ensures that the design, configuration, integration, and user experience of these imaging applications meet business and clinical objectives. The Imaging Applications Analyst consults with organizational clinicians, staff, and vendors of the imaging applications and is responsible for the configuration, testing, problem identification, issue resolution, and ongoing support of the assigned applications, including new implementations and upgrades.
Accountabilities:
Troubleshoot, configure, validate, upgrade, and support enterprise-wide clinical imaging applications.
Provide off-hour on-call support for issues and apply expertise and independent judgment for full resolution.
Work on assigned projects both independently and as part of a team.
Lead/assist large-scale development and implementation projects, including complex, inter-departmental projects and operational initiatives.
Design and lead implementations, upgrades, and solution conversions in support of Keck clinical imaging applications
Assist in ensuring stability and functionality of the assigned applications.
Provide escalation support, troubleshooting and root-cause analysis of issues
Assist in identifying and troubleshooting application issues, including isolating problems, recommending appropriate solutions, and implementing solutions. Work closely with all IS teams to maintain Clinical Imaging Applications that are compliant with organizational standards and policies.
Regularly meet with users, vendors, and IT staff to develop/modify system specifications, and ensure the timely resolution or escalation of problems within the imaging application environment.
Work on assigned projects both independently and as part of a team and apply expertise and independent judgment for full resolution.
Responsible for support/testing of HL7 integration between Cerner EMR and all Imaging applications ensuring data integrity of integrated solutions.
Maintain expertise in Imaging Systems functionality and site/system workflows working directly with clinicians and staff to understand clinical workflows and reported issues
Coaches and mentors less experienced team members
Provide after-hours and weekend support where necessary for a 24x7 system availability model.
Minimum Education
Bachelor's degree in Computer Science, Healthcare Science, Finance, Business, or a related field required.
In lieu of a bachelor's degree, an additional 4 years of experience is required.
Minimum Experience
Minimum 5 years of experience with design, configuration, maintenance, troubleshooting, upgrading, testing, and supporting clinical imaging applications (i.e., Cerner RadNet, FujiPACS/CV, PowerScribe 360, Varian Aria, Natus Xltek, Provation) or the equivalent combination of experience and education that would demonstrate the capability to successfully perform the essential functions of this position.
Working level knowledge of DICOM, HL7 and IHE.
Working level knowledge of Imaging Modalities (XR, US, MRI, CT, Nuclear, Mammo, EKG).
Extensive experience in managing, implementing, and supporting a diverse range of Cardiology IT applications and systems, including Fuji Synapse Cardiovascular PACS, GE/Merge Hemo, Epiphany ECG management, and Cerner RadNet / RIS.
Skilled in custom template building, focusing on creating standardized, efficient, and clinically relevant templates that streamline reporting and data capture.
Comprehensive knowledge of DICOM imaging protocols and standards, including image acquisition, archiving, retrieval, and seamless integration with Fuji CV PACS and Cerner EHR.
Possesses a foundational clinical background in cardiology, enabling a deep understanding of cardiac workflows, diagnostic procedures, and data requirements for IT system development and optimization.
This is a senior position with the expectation of mentoring other team members and leading through projects independently. A strong team-oriented attitude is critical.
Local resource preferred (on-site for the first six months then hybrid schedule).
Epic Wisdom Applications Analyst
Senior data analyst job in Oakland, CA
Role: Epic Wisdom Applications Analyst
Client: Delta Dental
Work Authorization: U.S. Citizens, Green Card Holders, and those authorized to work in the U.S. for any employer will be considered.
We are seeking a skilled and experienced Application Analyst with expertise in Epic Wisdom to join our dynamic team. This Epic Applications Analyst plays a critical role in the implementation and installation of Epic applications, module upgrades, new components, and modifications. This position is responsible for workflow configurations, go-live support, and ongoing client support. You are also responsible for configuring application settings and validating configurations. This position is an expert role in Epic Practice Management system products. As an Application Analyst, you will collaborate with cross-functional teams, analyze business processes, and implement solutions to enhance the efficiency and effectiveness of our Epic Wisdom applications.
Qualifications:
Bachelor's degree in Information Technology, Computer Science, or a related field.
Minimum of 5 years of experience as an Application Analyst, with a focus on Epic Wisdom.
Epic Wisdom certification required (or Epic Ambulatory certification plus Wisdom project experience with a path to certification).
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Experience with healthcare information systems in a clinical setting is a plus.
Key Competencies
Leading, coordinating, and facilitating effective meetings
Strong communication skills
User focused design and data visualization skills
Proficiency and demonstrated experience with analytics tools
Familiarity with Epic applications
Understanding of system build, end user workflows, and patterns of use
Responsibilities:
Epic Wisdom Implementation:
Lead the end-to-end setup and implementation of the Epic Wisdom system, ensuring alignment with organizational goals and industry best practices.
Collaborate closely with project stakeholders, including clinical staff, IT professionals, and business analysts, to gather requirements and customize the system accordingly.
Familiarity with dental practice workflows, particularly how clinical users interact with scheduling and billing teams and with patient experiences within the organization.
Configuration and Customization:
Configure Epic Wisdom modules to meet the specific needs and workflows of our healthcare organization.
Implement customizations and enhancements to optimize system performance and usability.
Training and Knowledge Transfer:
Develop and deliver comprehensive training programs for end-users, ensuring a smooth transition to the new Epic Wisdom system.
Facilitate knowledge transfer sessions to internal teams, empowering them to effectively manage and maintain the system post-implementation.
Collaboration and Communication:
Work closely with the IT team to integrate Epic Wisdom with existing infrastructure and other healthcare information systems.
Provide regular updates on implementation progress, addressing challenges and ensuring effective communication with all stakeholders.
Epic Wisdom System Support:
Provide day-to-day support for the Epic Wisdom application, addressing end-user issues, troubleshooting problems, and ensuring system stability.
Collaborate with technical teams to resolve any infrastructure-related issues affecting the Epic Wisdom system.
System Optimization:
Analyze and evaluate existing Epic Wisdom workflows, identifying areas for improvement and optimization.
Work closely with end-users and stakeholders to understand their requirements and implement enhancements to streamline processes.
Implementation and Upgrades:
Participate in the planning and execution of Epic Wisdom system upgrades, ensuring a smooth transition and minimal disruption to operations.
Collaborate with the IT team to implement new modules and features within the Epic Wisdom application.
Training and Documentation:
Develop and deliver training programs for end-users on Epic Wisdom functionalities and best practices.
Create and maintain comprehensive documentation for Epic Wisdom configurations, workflows, and customizations.
Collaboration and Communication:
Collaborate with other IT professionals, clinical staff, and business analysts to gather requirements and translate them into actionable plans for the Epic Wisdom system.
Communicate effectively with all stakeholders, providing updates on project statuses, issue resolutions, and system improvements.
Benefits:
Medical/Dental/Vision Benefits Offered
401k
Life Insurance plans
Employee Health and Wellness program
Xcelligen Inc is a Veteran friendly employer and provides equal employment opportunity (EEO) to all employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability status, genetic information, marital status, ancestry, protected veteran status, or any other characteristic protected by applicable federal, state, and local laws.
Business Intelligence Analyst - Tableau
Senior data analyst- job in Fremont, CA
About the Role
We are seeking a Tableau Report Developer to join our Data & Analytics team in San Francisco. This role is critical to building and maintaining high-quality business reporting that drives decision-making across our retail brands. You will work closely with stakeholders in finance, operations, merchandising, and leadership to deliver insights that directly impact growth.
THIS IS A HYBRID POSITION BASED IN OUR SAN FRANCISCO OFFICE. CANDIDATES MUST BE ABLE TO COMMUTE TO THE OFFICE 3 DAYS A WEEK.
Responsibilities
● Design, develop, and maintain Tableau dashboards and reports that provide actionable insights to business teams.
● Translate business questions into effective data visualizations and reporting solutions.
● Partner with stakeholders to understand requirements, gather feedback, and refine reporting deliverables.
● Perform data analysis to validate trends, identify anomalies, and ensure accuracy of reporting (a small validation sketch follows this list).
● Work with the data engineering team to improve data pipelines and ensure reliable data availability.
● Provide ad-hoc reporting support for retail, e-commerce, and cross-functional business partners.
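As a rough sketch of the validation step noted in the responsibilities above, the Python snippet below flags daily values that deviate sharply from the mean before they feed a Tableau dashboard; the column names, the sample data, and the 3-sigma threshold are illustrative assumptions, not a production rule.

# Minimal pre-dashboard validation pass (column names, sample data, and
# the 3-sigma threshold are assumptions).
import pandas as pd

def flag_anomalies(df: pd.DataFrame, value_col: str = "net_sales", z_threshold: float = 3.0) -> pd.DataFrame:
    """Flag rows whose daily value deviates more than z_threshold standard deviations from the mean."""
    mean, std = df[value_col].mean(), df[value_col].std()
    out = df.copy()
    out["z_score"] = (out[value_col] - mean) / std
    out["is_anomaly"] = out["z_score"].abs() > z_threshold
    return out

if __name__ == "__main__":
    daily = pd.DataFrame({
        "date": pd.date_range("2025-01-01", periods=15, freq="D"),
        "net_sales": [1000.0 + (i % 5) * 10 for i in range(14)] + [10450.0],
    })
    print(flag_anomalies(daily)[["date", "net_sales", "is_anomaly"]])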
Requirements
● 3+ years of professional experience developing Tableau dashboards and reports.
● Strong background in data analysis and business reporting.
● Excellent ability to engage with business stakeholders, translating needs into technical solutions.
● Experience in retail or e-commerce analytics highly preferred.
● Solid SQL skills and familiarity with cloud-based data warehouses (e.g., Snowflake, Domo).
● Strong communication and collaboration skills.
Business Analyst
Senior data analyst- job in Gold River, CA
Assists in the research and assessment of business goals, objectives and needs to align information technology solutions with business initiatives for multiple, less complex accounts.
Serves as the liaison between technical personnel and business area for multiple accounts.
Essential Job Functions:
Assists in planning and designing business processes; assists in formulating recommendations to improve and support business activities.
Assists in analyzing and documenting client's business requirements and processes; communicates these requirements to technical personnel by constructing basic conceptual data and process models, including data dictionaries and volume estimates.
Assists in creating basic test scenarios to be used in testing the business applications in order to verify that client requirements are incorporated into the system design (an illustrative scenario sketch follows this list).
Assists in developing and modifying systems requirements documentation to meet client needs.
Participates in meetings with clients to gather and document requirements and explore potential solutions.
Executes systems tests from existing test plans.
Assists in analyzing test results in various phases.
Participates in technical reviews and inspections to verify that the intent of change is carried through each phase of the project.
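For illustration only, the snippet below shows one way a basic test scenario can be written in a table-driven, runnable form so that each scenario traces back to a requirement; the requirement IDs, the toy discount rule, and the expected values are hypothetical.

# Illustrative table-driven test scenarios (scenario names, inputs, and the
# discount rule under test are hypothetical; real scenarios come from client requirements).
def quoted_price(list_price: float, customer_tier: str) -> float:
    """Toy business rule: gold-tier customers receive a 10% discount."""
    return round(list_price * (0.9 if customer_tier == "gold" else 1.0), 2)

SCENARIOS = [
    # (scenario id, list price, customer tier, expected quoted price)
    ("REQ-101 gold discount applied", 200.00, "gold", 180.00),
    ("REQ-102 standard tier pays list price", 200.00, "standard", 200.00),
]

def run_scenarios() -> None:
    for name, price, tier, expected in SCENARIOS:
        actual = quoted_price(price, tier)
        status = "PASS" if actual == expected else f"FAIL (got {actual})"
        print(f"{name}: {status}")

if __name__ == "__main__":
    run_scenarios()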
Basic Qualifications:
Bachelor's degree or equivalent combination of education and experience
Bachelor's degree in business administration, information systems, or related field preferred
Three or more years of business analysis experience.
Experience working with the interface of information technology with functional groups within an organization.
Experience working with business processes and re-engineering.
Experience working with computer programming concepts and basic programming languages.
Other Qualifications:
Interpersonal skills to interact with customers and team members
Communication skills
Analytical and problem-solving skills
Presentation skills to communicate with management and customers
Personal computer and business solutions software skills
Ability to work in a team environment with multiple team members and the ability to multitask
Key Skills:
Business Analysis, Business processes, Fluent Spanish speaker
Education:
Bachelor's degree or equivalent combination of education and experience
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ***********************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Name: Roushan
Email: **********************************
Internal Id: 25-54594
AWS Data Architect
Senior data analyst- job in San Jose, CA
Fractal is a strategic AI partner to Fortune 500 companies with a vision to power every human decision in the enterprise. Fractal is building a world where individual choices, freedom, and diversity are the greatest assets; an ecosystem where human imagination is at the heart of every decision. Where no possibility is written off, only challenged to get better. We believe that a true Fractalite is the one who empowers imagination with intelligence. Fractal has been featured as a Great Place to Work by The Economic Times in partnership with the Great Place to Work Institute and recognized as a 'Cool Vendor' and a 'Vendor to Watch' by Gartner.
Please visit Fractal | Intelligence for Imagination for more information about Fractal.
Fractal is looking for a proactive and driven AWS Lead Data Architect/Engineer to join our cloud and data tech team. In this role, you will design the system architecture and solution, ensure the platform is scalable and performant, and create automated data pipelines.
Responsibilities:
Design & Architecture of Scalable Data Platforms
Design, develop, and maintain large-scale data processing architectures on the Databricks Lakehouse Platform to support business needs
Architect multi-layer data models including Bronze (raw), Silver (cleansed), and Gold (curated) layers for various domains (e.g., Retail Execution, Digital Commerce, Logistics, Category Management).
Leverage Delta Lake, Unity Catalog, and advanced features of Databricks for governed data sharing, versioning, and reproducibility.
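As a small sketch of the bronze-to-silver promotion implied by the layering above, the snippet below reads a raw Delta table, applies simple cleansing, and writes a curated table; the catalog, table, and column names and the cleansing rules are assumptions, and it presumes a Databricks environment where `spark` is already available.

# Sketch of a bronze -> silver promotion on Databricks (table and column names,
# the catalog/schema, and the cleansing rules are illustrative assumptions).
from pyspark.sql import functions as F

BRONZE_TABLE = "main.retail.bronze_pos_transactions"   # raw, as-landed data
SILVER_TABLE = "main.retail.silver_pos_transactions"   # cleansed, conformed data

def promote_bronze_to_silver(spark):
    bronze = spark.read.table(BRONZE_TABLE)
    silver = (
        bronze
        .dropDuplicates(["transaction_id"])                  # de-duplicate on the business key
        .filter(F.col("transaction_amount").isNotNull())     # drop incomplete rows
        .withColumn("transaction_date", F.to_date("transaction_ts"))
        .withColumn("ingested_at", F.current_timestamp())
    )
    (silver.write
        .format("delta")
        .mode("overwrite")
        .option("overwriteSchema", "true")
        .saveAsTable(SILVER_TABLE))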
Client & Business Stakeholder Engagement
Partner with business stakeholders to translate functional requirements into scalable technical solutions.
Conduct architecture workshops and solutioning sessions with enterprise IT and business teams to define data-driven use cases
Data Pipeline Development & Collaboration
Collaborate with data engineers and data scientists to develop end-to-end pipelines using Python, PySpark, SQL
Enable data ingestion from diverse sources such as ERP (SAP), POS data, Syndicated Data, CRM, e-commerce platforms, and third-party datasets.
Performance, Scalability, and Reliability
Optimize Spark jobs for performance tuning, cost efficiency, and scalability by configuring appropriate cluster sizing, caching, and query optimization techniques.
Implement monitoring and alerting using Databricks observability features, Ganglia, and cloud-native tools.
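The snippet below illustrates a few representative tuning levers mentioned above (adaptive query execution, shuffle partitioning, broadcast joins, caching); the configuration values and table names are illustrative rather than recommendations, and on Databricks the session and cluster sizing are normally managed by the cluster or job configuration.

# Representative Spark tuning knobs (values and table names are illustrative).
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder
    .appName("tuning-sketch")
    .config("spark.sql.adaptive.enabled", "true")                      # adaptive query execution
    .config("spark.sql.shuffle.partitions", "200")                     # shuffle parallelism
    .config("spark.sql.autoBroadcastJoinThreshold", str(64 * 1024 * 1024))
    .getOrCreate()
)

facts = spark.read.table("main.retail.silver_pos_transactions")
stores = spark.read.table("main.retail.dim_store")                     # small dimension table

# Hint that the small dimension should be broadcast instead of shuffled,
# and cache the joined frame if several downstream reports reuse it.
joined = facts.join(broadcast(stores), "store_id", "left").cache()
print(joined.count())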
Security, Compliance & Governance
Design secure architectures using Unity Catalog, role-based access control (RBAC), encryption, token-based access, and data lineage tools to meet compliance policies.
Establish data governance practices including Data Fitness Index, Quality Scores, SLA Monitoring, and Metadata Cataloging.
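A minimal sketch of role-based access expressed as Unity Catalog SQL and run from a notebook or job follows; the catalog, schema, table, and group names are placeholders, and it assumes the executing principal has the authority to grant these privileges.

# Illustrative Unity Catalog grants executed from a Databricks notebook/job
# (object and group names are placeholders; assumes `spark` is defined).
GRANTS = [
    "GRANT USE CATALOG ON CATALOG main TO `analytics_readers`",
    "GRANT USE SCHEMA ON SCHEMA main.retail TO `analytics_readers`",
    "GRANT SELECT ON TABLE main.retail.silver_pos_transactions TO `analytics_readers`",
]

def apply_grants(spark):
    for statement in GRANTS:
        spark.sql(statement)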
Adoption of AI Copilots & Agentic Development
Utilize GitHub Copilot, Databricks Assistant, and other AI code agents for:
Writing PySpark, SQL, and Python code snippets for data engineering and ML tasks.
Generating documentation and test cases to accelerate pipeline development.
Interactive debugging and iterative code optimization within notebooks.
Advocate for agentic AI workflows that use specialized agents for:
Data profiling and schema inference.
Automated testing and validation.
Innovation and Continuous Learning
Stay abreast of emerging trends in Lakehouse architectures, Generative AI, and cloud-native tooling.
Evaluate and pilot new features from Databricks releases and partner integrations for modern data stack improvements.
Requirements:
Bachelor's or master's degree in computer science, Information Technology, or a related field.
8-12 years of hands-on experience in data engineering, with at least 5 years on Python and Apache Spark.
Expertise in building high-throughput, low-latency ETL/ELT pipelines on AWS/Azure/GCP using Python, PySpark, SQL.
Excellent hands-on experience with workload automation tools such as Airflow and Prefect (a minimal DAG sketch follows this requirements list).
Familiarity with building dynamic ingestion frameworks from structured/unstructured data sources including APIs, flat files, RDBMS, and cloud storage
Experience designing Lakehouse architectures with bronze, silver, gold layering.
Strong understanding of data modelling concepts, star/snowflake schemas, dimensional modelling, and modern cloud-based data warehousing.
Experience with designing Data marts using Cloud data warehouses and integrating with BI tools (Power BI, Tableau, etc.).
Experience building CI/CD pipelines using tools such as AWS CodeCommit, Azure DevOps, and GitHub Actions.
Knowledge of infrastructure-as-code (Terraform, ARM templates) for provisioning platform resources
In-depth experience with AWS cloud services such as Glue, S3, and Redshift.
Strong understanding of data privacy, access controls, and governance best practices.
Experience working with RBAC, tokenization, and data classification frameworks
Excellent communication skills for stakeholder interaction, solution presentations, and team coordination.
Proven experience leading or mentoring global, cross-functional teams across multiple time zones and engagements.
Ability to work independently in agile or hybrid delivery models, while guiding junior engineers and ensuring solution quality
Must be able to work in PST time zone.
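As referenced in the requirements above, a minimal Airflow DAG sketch is shown below; the DAG id, schedule, and task body are placeholders, and it assumes Airflow 2.4 or later, where the `schedule` argument is available.

# Minimal Airflow DAG sketch (DAG id, schedule, and task body are placeholders).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_silver_tables():
    # Placeholder for the real pipeline call (e.g., triggering a Databricks job).
    print("refreshing silver tables")

with DAG(
    dag_id="daily_lakehouse_refresh",
    start_date=datetime(2025, 1, 1),
    schedule="0 6 * * *",   # 06:00 daily
    catchup=False,
) as dag:
    refresh = PythonOperator(
        task_id="refresh_silver_tables",
        python_callable=refresh_silver_tables,
    )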
Pay:
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Fractal, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is: $150k - $180k. In addition, you may be eligible for a discretionary bonus for the current performance period.
Benefits:
As a full-time employee of the company or as an hourly employee working more than 30 hours per week, you will be eligible to participate in the health, dental, vision, life insurance, and disability plans in accordance with the plan documents, which may be amended from time to time. You will be eligible for benefits on the first day of employment with the Company. In addition, you are eligible to participate in the Company 401(k) Plan after 30 days of employment, in accordance with the applicable plan terms. The Company provides for 11 paid holidays and 12 weeks of Parental Leave. We also follow a “free time” PTO policy, allowing you the flexibility to take the time needed for either sick time or vacation.
Fractal provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.