Revenue Integrity, Managing Consultant
Data consultant job at Guidehouse
Job Family:
Finance & Accounting Consulting
Travel Required:
Up to 75%
Clearance Required:
None
What You Will Do:
Guidehouse Managers are strategic leaders and trusted advisors who drive client success through vision, execution, and mentorship. A successful Manager is a self-motivated problem solver who thrives on delivering meaningful results, leading teams, and managing complex client engagements. In this role, you'll lead revenue integrity consulting projects, guide clients through operational transformation, and ensure the delivery of measurable financial outcomes.
Position Summary
We are seeking a Manager to lead client engagements focused on Revenue Integrity, charge capture, charge reconciliation, and Epic EHR optimization. The Manager will leverage a data-driven approach to evaluate the full spectrum of a client's revenue integrity operations - identifying trends, benchmarking performance, and isolating root causes of revenue leakage or inefficiency.
Using these insights, the Manager will design and oversee implementation of comprehensive recommendations that may address people, process, and technology improvements. This may include redefining governance structures, enhancing staff roles and training, streamlining charge workflows, or optimizing Epic system functionality. The Manager will ensure that recommendations are both strategic and practical, aligning operational execution with organizational objectives and measurable outcomes across both hospital and professional charging environments.
What You Will Need:
Bachelor's degree
5+ years of revenue cycle management experience, focused on optimization and performance improvement
5+ years of management consulting, consulting-like, professional services, and/or project management experience
Demonstrated ability to present to large groups, both externally and within the practice
Proven success in driving operational process improvement and change management for revenue cycle optimization projects within hospitals and/or health systems
Intermediate to advanced data manipulation and analytical skills using Excel
What Would Be Nice To Have:
Master's degree or higher in Business Administration, Health Care Administration, or Clinical Administration
Certifications in Epic
AI technology experience in Healthcare Revenue Cycle
The annual salary range for this position is $102,000.00-$170,000.00. Compensation decisions depend on a wide range of factors, including but not limited to skill sets, experience and training, security clearances, licensure and certifications, and other business and organizational needs.
What We Offer:
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.
Benefits include:
Medical, Rx, Dental & Vision Insurance
Personal and Family Sick Time & Company Paid Holidays
Position may be eligible for a discretionary variable incentive bonus
Parental Leave and Adoption Assistance
401(k) Retirement Plan
Basic Life & Supplemental Life
Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
Short-Term & Long-Term Disability
Student Loan PayDown
Tuition Reimbursement, Personal Development & Learning Opportunities
Skills Development & Certifications
Employee Referral Program
Corporate Sponsored Events & Community Outreach
Emergency Back-Up Childcare Program
Mobility Stipend
About Guidehouse
Guidehouse is an Equal Opportunity Employer - Protected Veterans, Individuals with Disabilities, or any other basis protected by law, ordinance, or regulation.
Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco.
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation.
All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process.
If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties.
Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
Oracle Data Analyst (Exadata)
Dallas, TX
6+ month contract, Downtown Dallas, TX (Onsite)
Primary responsibilities of the Senior Data Analyst include supporting and analyzing data anomalies across multiple environments, including but not limited to Data Warehouse, ODS, and Data Replication/ETL Data Management initiatives. The candidate will be in a supporting role and will work closely with the Business, DBA, ETL, and Data Management teams, providing analysis and support for complex data-related initiatives. This individual will also be responsible for assisting in the initial setup and ongoing documentation/configuration related to Data Governance and Master Data Management solutions. This candidate must have a passion for data, along with good SQL, analytical, and communication skills.
Responsibilities
Investigate and analyze data anomalies and data issues reported by the business
Work with ETL, Replication and DBA teams to determine data transformations, data movement and derivations and document accordingly
Work with support teams to ensure consistent and proactive support methodologies are adhered to for all aspects of data movements and data transformations
Assist in break/fix and production validation as it relates to data derivations, replication, and structures (a sketch of this kind of check follows this list)
Assist in configuration and on-going setup of Data Virtualization and Master Data Management tools
Assist in keeping documentation up to date as it relates to Data Standardization definitions, Data Dictionary and Data Lineage
Gather information from various sources and interpret patterns and trends
Ability to work in a team-oriented, fast-paced agile environment managing multiple priorities
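Much of this anomaly and break/fix work reduces to disciplined comparison queries across environments. As a rough sketch only (the `oracledb` connection, table names, and `load_date` column are illustrative assumptions, not details from this posting), a daily row-count reconciliation between a source table and its replicated copy might look like:

```python
# Hypothetical reconciliation probe: compare daily row counts between a
# source table and its replica, printing any dates that drift apart.
import oracledb  # Oracle's Python driver

COUNT_SQL = """
    SELECT TRUNC(load_date) AS load_day, COUNT(*) AS row_count
    FROM {table}
    GROUP BY TRUNC(load_date)
"""  # {table} should come from a trusted, hard-coded list only

def daily_counts(conn, table):
    with conn.cursor() as cur:
        cur.execute(COUNT_SQL.format(table=table))
        return dict(cur.fetchall())  # {date: count}

def reconcile(conn, source="ODS.ORDERS", replica="DW.ORDERS"):
    src, rep = daily_counts(conn, source), daily_counts(conn, replica)
    for day in sorted(set(src) | set(rep)):
        if src.get(day, 0) != rep.get(day, 0):
            print(f"{day}: source={src.get(day, 0)}, replica={rep.get(day, 0)}")
```

In practice a check like this would be scheduled and its output routed to the support teams named above; the value is in making the comparison repeatable rather than ad hoc.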
Qualifications
4+ years of experience working in OLTP, Data Warehouse and Big Data databases
4+ years of experience working with Oracle Exadata
4+ years in a Data Analyst role
2+ years writing medium to complex stored procedures (a plus)
Ability to collaborate effectively and work as part of a team
Extensive background in writing complex queries
Extensive working knowledge of all aspects of Data Movement and Processing, including ETL, API, OLAP and best practices for data tracking
Denodo experience a plus
Master Data Management experience a plus
Big Data experience a plus (Hadoop, MongoDB)
Postgres and cloud experience a plus
Estimated Min Rate: $57.40
Estimated Max Rate: $82.00
What's In It for You?
We welcome you to be a part of one of the largest and most legendary global staffing companies. Yoh's network of client companies has been employing professionals like you for over 65 years in the U.S., UK, and Canada. Join Yoh's extensive talent community to gain access to our vast network of opportunities, including this exclusive one. Benefit eligibility is in accordance with applicable laws and client requirements. Benefits include:
Medical, Prescription, Dental & Vision Benefits (for employees working 20+ hours per week)
Health Savings Account (HSA) (for employees working 20+ hours per week)
Life & Disability Insurance (for employees working 20+ hours per week)
MetLife Voluntary Benefits
Employee Assistance Program (EAP)
401K Retirement Savings Plan
Direct Deposit & weekly epayroll
Referral Bonus Programs
Certification and training opportunities
Note: Any pay ranges displayed are estimations. Actual pay is determined by an applicant's experience, technical expertise, and other qualifications as listed in the job description. All qualified applicants are welcome to apply.
Yoh, a Day & Zimmermann company, is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Visit ************************************************ to contact us if you are an individual with a disability and require accommodation in the application process.
For California applicants, qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. All of the material job duties described in this posting are job duties for which a criminal history may have a direct, adverse, and negative relationship potentially resulting in the withdrawal of a conditional offer of employment.
It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
By applying and submitting your resume, you authorize Yoh to review and reformat your resume to meet Yoh's hiring clients' preferences. To learn more about Yoh's privacy practices, please see our Candidate Privacy Notice: **********************************
Business Data Analyst
New York, NY
Technical Data / Business Analyst - Hybrid in NYC (3 Days Onsite)
Contract Role via Russell Tobin
Pay: $75-$90/hour (W2), depending on experience
Second-round interview is onsite - no exceptions
Russell Tobin is partnering with a leading global financial services firm to hire an experienced Technical Data / Business Analyst for a contract opportunity based in New York City. This role sits within a high-visibility Risk & Compliance technology group and offers a chance to work with top-tier professionals in a collaborative, innovation-driven environment.
What We Offer
Opportunity to work with leading professionals in a firm that values both individual contribution and teamwork
Modern, collaboration-focused office environment
Flexible hybrid schedule (3 days onsite per week)
Challenging, rewarding work with room for continuous learning and advancement
A technology-forward environment focused on solving complex business problems
Team Overview
You will join the Non-Financial Risk Technology (NFRT) organization, which provides operational controls, surveillance capabilities, and technology solutions to strengthen enterprise resilience. The team supports critical business functions including business continuity, records management, due diligence, and compliance monitoring.
This specific role is aligned with the Wealth Management Monitoring Department, supporting Risk & Compliance teams with tools and applications used to monitor Financial Advisor adherence to firm policies.
Position Overview
The Technical Data / Business Analyst will play a key role in business analysis, project management, and data analysis efforts related to model development, requirements documentation, UAT, and product support. The ideal candidate is analytical, detail-oriented, proactive, and thrives in a fast-paced, team-oriented environment.
Key Responsibilities
Elicit, analyze, and create business requirements documentation (Agile user stories, proof-of-concept models, supporting documentation)
Analyze datasets to identify gaps and coordinate with cross-functional teams to resolve issues
Work closely with development teams to ensure technical solutions meet business expectations
Conduct Behavior-Driven Development (BDD) and write/execute User Acceptance Testing (UAT) plans
Create and manage project plans; escalate risks and delays as needed
Lead cross-functional initiatives, manage stakeholder communication, and oversee project risks
Provide mentorship to junior team members
Required Qualifications
Bachelor's degree in Computer Science or related field
8+ years of experience as a Business Analyst in global organizations
Strong Data Analysis skills with hands-on SQL and relational database experience
Solid understanding of software development lifecycle (SDLC) and Agile methodologies
Excellent written and verbal communication skills
Ability to collaborate across global IT and business teams
Self-driven, creative problem solver capable of owning deliverables
Comfortable in fast-paced, high-pressure environments
Strong customer service orientation and ability to navigate complex situations
Background in trade lifecycles and asset classes (Equities, Fixed Income, Options, Futures)
Knowledge of compliance surveillance platforms such as Actimize, MANTAS, SunGard, SMARTS, and exposure to models (wash trades, spoofing, insider trading, etc.)
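For candidates less familiar with surveillance models, the core ideas are simple even though production platforms (Actimize, SMARTS, etc.) are far more sophisticated. Purely as a toy illustration (column names and thresholds are assumptions; this is not any vendor's algorithm), a naive wash-trade screen in pandas might pair opposite-side executions in the same account and symbol within a short window:

```python
# Naive wash-trade screen: flag buy/sell pairs in the same account and
# symbol, executed within `window` of each other at nearly identical prices.
import pandas as pd

def flag_wash_trades(trades: pd.DataFrame, window="5min", price_tol=0.001):
    # expects columns: exec_time (datetime), account, symbol, side, price
    t = trades.sort_values("exec_time")
    buys, sells = t[t.side == "BUY"], t[t.side == "SELL"]
    pairs = pd.merge_asof(
        buys, sells, on="exec_time", by=["account", "symbol"],
        tolerance=pd.Timedelta(window), direction="nearest",
        suffixes=("_buy", "_sell"),
    ).dropna(subset=["price_sell"])  # keep only matched pairs
    near_identical = (
        (pairs.price_buy - pairs.price_sell).abs() <= pairs.price_buy * price_tol
    )
    return pairs[near_identical]
```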
Desired Skills
Experience with IT Project Management and SDLC (Waterfall/Agile)
Experience with Behavior-Driven Development (BDD)
Background supporting medium-to-large scale development projects
Familiarity with trades data, positions data, and reference data
About the Client
Our client is a leading global financial services institution and a market leader in investment banking, securities, investment management, and wealth management. They are committed to fostering an inclusive workplace where individuals from diverse backgrounds can thrive and grow.
Russell Tobin / Pride Global offers eligible employees comprehensive healthcare coverage (medical, dental, and vision plans), supplemental coverage (accident insurance, critical illness insurance, and hospital indemnity), a 401(k) retirement savings plan, life & disability insurance, an employee assistance program, identity theft protection, legal support, auto and home insurance, pet insurance, and employee discounts with select vendors.
Junior Data Analyst
Columbus, OH
12 Month Contract-to-Hire
Columbus, OH
$28/hr
Our healthcare services client is seeking a driven entry-level Data Analyst to join their Financial Planning and Analysis team. In this role, you will support key financial analytics initiatives that guide decision-making for internal business partners. This is an excellent opportunity to gain exposure to senior leadership, strengthen your technical toolkit, and accelerate your career through our Elevate Program, which offers structured training and development.
Qualifications:
Bachelor's degree in Data Analytics, MIS, CIS, or a related field
Hands-on SQL experience
Hands-on Python experience
Experience with data querying and analysis
Proficiency with Tableau or other data visualization tools
Strong written and verbal communication skills
Job Responsibilities:
Support financial planning and analysis activities alongside the analytics team
Attend and contribute to project intake and requirements meetings
Analyze data to identify trends and deliver actionable insights to partners (a minimal example follows this list)
Communicate findings with business stakeholders, senior leadership, and analytics teams
Collaborate closely with senior Data Analysts on high-impact projects
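For a flavor of the day-to-day work, here is a minimal sketch only (the warehouse DSN, table, and columns are hypothetical) of a month-over-month trend pull:

```python
# Pull monthly totals from a (hypothetical) Postgres warehouse table and
# flag month-over-month swings above 10% for follow-up with partners.
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://user:pass@warehouse/finance")  # placeholder DSN
spend = pd.read_sql(
    "SELECT date_trunc('month', posted_at) AS month, SUM(amount) AS total "
    "FROM gl_transactions GROUP BY 1 ORDER BY 1",
    engine,
)
spend["mom_change"] = spend["total"].pct_change()
print(spend[spend["mom_change"].abs() > 0.10])
```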
Why Should You Apply?
Receive mentorship and support from experienced team members
Access tailored technical training and professional growth through our Elevate Program
Build your career with a Fortune 15 organization known for investing in early-career talent
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.
Financial Data Analyst
Alpharetta, GA
Ready to build the future with AI?
At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges.
If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Financial Data Analyst in Alpharetta, GA.
Role : Financial Data Analyst
Location: Alpharetta, GA 30005 (3 days per week in office)
Hiring Type: Full-time with Genpact + benefits
Responsibilities
Define and execute the product roadmap for AI tooling and data integration initiatives, driving products from concept to launch in a fast-paced, Agile environment.
Translate business needs and product strategy into detailed requirements and user stories.
Collaborate with engineering, data, and AI/ML teams to design and implement data connectors that enable seamless access to internal and external financial datasets (a minimal connector sketch follows this list).
Partner with data engineering teams to ensure reliable data ingestion, transformation, and availability for analytics and AI models.
Evaluate and work to onboard new data sources, ensuring accuracy, consistency, and completeness of fundamental and financial data.
Continuously assess opportunities to enhance data coverage, connectivity, and usability within AI and analytics platforms.
Monitor and analyze product performance post-launch to drive ongoing optimization and inform future investments.
Facilitate alignment across stakeholders, including engineering, research, analytics, and business partners, ensuring clear communication and prioritization.
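To make the data-connector responsibility concrete, here is a deliberately minimal sketch; the endpoint, pagination scheme, response shape, and bearer token are hypothetical placeholders, not a real vendor API or Genpact's design:

```python
# Minimal REST connector: page through a (hypothetical) fundamentals
# endpoint and land raw JSON for downstream transformation and validation.
import json
import pathlib
import requests

def pull_fundamentals(base_url, token, out_dir="raw/fundamentals"):
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"
    page = 1
    while True:
        resp = session.get(f"{base_url}/fundamentals",
                           params={"page": page}, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        if not payload.get("results"):  # assumed response shape
            break
        (out / f"page_{page}.json").write_text(json.dumps(payload))
        page += 1
```

Landing raw payloads before transformation keeps ingestion replayable, which matters for the accuracy, consistency, and completeness checks the role calls for.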
Minimum qualifications
Bachelor's degree in Computer Science, Finance, or related discipline. MBA/Master's Degree desired.
5+ years of experience in a similar role
Strong understanding of fundamental and financial datasets, including company financials, market data, and research data.
Proven experience in data integration, particularly using APIs, data connectors, or ETL frameworks to enable AI or analytics use cases.
Familiarity with AI/ML data pipelines, model lifecycle, and related tooling.
Experience working with cross-functional teams in an Agile environment.
Strong analytical, problem-solving, and communication skills with the ability to translate complex concepts into actionable insights.
Prior experience in financial services, investment banking, or research domains.
Excellent organizational and stakeholder management abilities with a track record of delivering data-driven products.
Preferred qualifications
Deep understanding of Python, SQL, or similar scripting languages
Knowledge of cloud data platforms (AWS, GCP, or Azure) and modern data architectures (data lakes, warehouses, streaming)
Familiarity with AI/ML platforms
Understanding of data governance, metadata management, and data security best practices in financial environments.
Experience with API standards (REST, GraphQL) and data integration frameworks.
Demonstrated ability to partner with engineering and data science teams to operationalize AI initiatives.
Why join Genpact?
• Lead AI-first transformation - Build and scale AI solutions that redefine industries
• Make an impact - Drive change for global enterprises and solve business challenges that matter
• Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
• Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
• Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build
• Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up.
Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Junior Data Analyst
Daytona Beach, FL
We're looking for a detail-oriented Junior Data Analyst to support survey administration and data management for university-wide projects. This role involves working with institutional databases, preparing reports, and ensuring data accuracy to help drive informed decisions.
Key Responsibilities
Administer and process internal and external surveys.
Collect, clean, and organize data from student, faculty, and staff records (a minimal example follows this list).
Maintain accurate logs of reports and deadlines.
Prepare dashboards and assist with KPI tracking.
Collaborate with departments to ensure data consistency and integrity.
Support the development of forms, questionnaires, and institutional reports.
Assist with special projects and events as needed.
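A minimal example of the cleaning step, with a hypothetical file and columns (institutional data would typically live in Jenzabar or a similar system rather than a flat CSV):

```python
# Standardize survey responses and drop incomplete or duplicate records
# before they feed dashboards and KPI tracking.
import pandas as pd

raw = pd.read_csv("fall_student_survey.csv")  # placeholder extract
clean = (
    raw.dropna(subset=["respondent_id", "q1_satisfaction"])
       .assign(q1_satisfaction=lambda d: d["q1_satisfaction"].str.strip().str.lower())
       .drop_duplicates("respondent_id")
)
clean.to_csv("fall_student_survey_clean.csv", index=False)
```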
Required Skills
Strong analytical and data management skills.
Proficiency in databases, spreadsheets, and reporting tools.
Familiarity with data visualization best practices.
Ability to work independently and meet deadlines.
Excellent communication and organizational skills.
High attention to detail and confidentiality.
Preferred Experience
Knowledge of survey administration and data collection techniques.
Experience with educational software (e.g., Jenzabar) is a plus.
Bachelor's degree in a related field.
GIS Data Analyst
Atlanta, GA
Tremendous opportunity for a GIS Analyst to join a stable company making big strides in its industry. This role is focused on the engineering side of the business: you will gain hands-on experience in a critical technical role, creating and supporting essential feature data.
You will answer ongoing questions and inquiries, generate new geospatial data, and provide flexible GIS services. Regular inventory analysis showing counts, mileage, and measurements of the system is anticipated. You will also directly perform geospatial data preparation and assembly tasks. This technical role requires a broad geographic understanding, the ability to correctly interpret geometrics and feature placement information received from the field, and a general understanding of the installation and maintenance activities undertaken by the Engineering Department sub-groups.
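As a rough illustration of that inventory analysis (the layer path, CRS, and field names are assumptions, and an enterprise shop would likely use the Esri stack rather than GeoPandas), a counts-and-mileage summary might look like:

```python
# Track-inventory summary: feature counts and mileage per subdivision.
import geopandas as gpd

tracks = gpd.read_file("data/track_centerlines.shp").to_crs(epsg=26916)  # UTM 16N, meters
tracks["miles"] = tracks.geometry.length / 1609.344
summary = tracks.groupby("subdivision").agg(
    segments=("miles", "size"),
    total_miles=("miles", "sum"),
)
print(summary.round(2))
```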
Responsibilities:
Perform and be responsible for the creation, modification, and quality control of essential geospatial infrastructure data for company-wide use in a variety of critical applications, including the analysis and assembly of geospatial data used to support PTC compliance.
Apply practical knowledge to ensure field-generated spatial data from various sources is properly converted into a functioning theoretical digital network compliant with the database and data model standards established for company systems.
Authorized to perform quality control checks, accept, and promote geospatial data change sets that are produced by the Geospatial Data Group's CADD technicians.
Utilize a variety of GIS software tools to perform topology corrections and make geometric modifications as part of the data quality review and acceptance process.
Work collaboratively with other involved groups and departments to fully apply the Geospatial Data Group's technical skill sets toward Enterprise GIS goals.
Assist the Sr. Geospatial Data Analyst with assorted GIS responsibilities as required per business needs.
Leverage the company's investment in geospatial technology to generate value and identify cost savings using GIS technology.
This is a 12-month contract position working out of our office in Midtown Atlanta four days a week, with one day remote. Our new office is state-of-the-art, with many amenities (gym, coffee shop, cafeteria, etc.) and paid parking. This is an excellent opportunity to work within an enterprise environment while enjoying an outstanding work-life balance.
REQUIRED:
3+ years of experience working with data in a GIS environment
Strong communication skills, presenting and working across organizational departments
Experience with ESRI Suite (ArcGIS)
Experience with data editing and spatial analysis
Bachelor's degree in GIS or a similar field (computer science, software engineering, IT, etc.)
PREFERRED:
TFS
Esri JavaScript API
ArcGIS Online and ArcGIS Pro
Experience with relational databases
Experience with database queries (basic queries)
SQL
Must be authorized to work in the U.S.; sponsorship is not available
Data Analyst
Warren, MI
The main function of a Data Analyst is to coordinate changes to computer databases, then test and implement those changes, applying knowledge of database management systems.
Job Responsibilities:
• Work with senior management, technical, and client teams to determine data requirements, business data implementation approaches, and best practices for advanced data manipulation, storage, and analysis strategies
• Write and code logical and physical database descriptions and specify identifiers of database to management system or direct others in coding descriptions
• Design, implement, automate and maintain large scale enterprise data ETL processes
• Modify existing databases and database management systems and/or direct programmers and analysts to make changes
• Test programs or databases, correct errors and make necessary modifications
Qualifications:
• Bachelor's degree in a technical field such as computer science, computer engineering or related field required
• 2-4 years of applicable experience required
• Experience with database technologies
• Knowledge of the ETL process
• Knowledge of at least one scripting language
• Strong written and oral communication skills
• Strong troubleshooting and problem solving skills
• Demonstrated history of success
• Desire to be working with data and helping businesses make better data driven decisions
Oracle Financial Reporting & Analytics Consultant
San Jose, CA
Title: Oracle Financial Reporting & Analytics Consultant
The Oracle Financial Reporting & Analytics Consultant will lead the design and development of reporting solutions across OTBI, BI Publisher, FR Studio, and Narrative Reporting. This role will focus on creating scalable, repeatable reporting packages and close-automation capabilities for Finance and Accounting stakeholders.
Core Skills & Experience
9-10 years of hands-on experience across the Oracle Financial reporting stack.
Strong expertise with:
OTBI (Oracle Transactional Business Intelligence)
BI Publisher
FR Studio (Financial Reporting Studio)
Narrative Reporting
Proven ability to build automated GL/Close dashboards, reconciliations, and operational reporting solutions.
Strong SQL, data modeling, and data quality skills.
Solid understanding of data lineage and flows across ERP → EPM → Data Warehouse environments.
Strong controls mindset, with focus on accuracy, governance, audit readiness, and repeatability.
Responsibilities
Architect and develop financial and operational reporting across OTBI, BIP, FR, and Narrative Reporting.
Build standard and automated reporting packages to streamline period close activities.
Design reconciliation and variance analysis frameworks using Oracle reporting tools (a rough illustration follows this list).
Partner with Finance, Accounting, and FP&A teams to translate business needs into scalable reporting solutions.
Ensure reporting solutions meet data governance, security, and compliance standards.
Optimize performance of reporting processes and automate manual workflows whenever possible.
Support integrations and data flows between ERP, EPM, and downstream analytics platforms.
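As a rough illustration of the reconciliation and variance work (the schema, DSN, and materiality threshold are assumptions, not a client configuration), a period-over-period GL variance pull of the kind these packages automate might be:

```python
# Period-over-period GL variance: pull balances for two periods, pivot,
# and flag accounts whose movement exceeds an illustrative threshold.
import pandas as pd
import sqlalchemy as sa

engine = sa.create_engine("oracle+oracledb://user:pass@erp-host/?service_name=FIN")  # placeholder
sql = sa.text("""
    SELECT account, period, SUM(amount) AS balance
    FROM gl_balances
    WHERE period IN (:curr, :prior)
    GROUP BY account, period
""")
bal = pd.read_sql(sql, engine, params={"curr": "2024-06", "prior": "2024-05"})
pivot = bal.pivot(index="account", columns="period", values="balance").fillna(0)
pivot["variance"] = pivot["2024-06"] - pivot["2024-05"]
print(pivot[pivot["variance"].abs() > 10_000])  # illustrative materiality threshold
```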
About Trident: Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include: Inc. 5000 fastest-growing private companies in America (2022, 2021, 2020); SF Business Times 100 fastest-growing private companies in the Bay Area (2022, 2021).
Lead Data Architect
San Jose, CA
Fractal is looking for a proactive and driven AWS Lead Data Architect/Engineer to join our cloud and data tech team. In this role, you will work on designing the system architecture and solution, ensuring the platform is scalable while performant, and creating automated data pipelines.
Responsibilities:
Design & Architecture of Scalable Data Platforms
Design, develop, and maintain large-scale data processing architectures on the Databricks Lakehouse Platform to support business needs
Architect multi-layer data models including Bronze (raw), Silver (cleansed), and Gold (curated) layers for various domains (e.g., Retail Execution, Digital Commerce, Logistics, Category Management); a minimal sketch follows this subsection.
Leverage Delta Lake, Unity Catalog, and advanced features of Databricks for governed data sharing, versioning, and reproducibility.
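A minimal sketch of the Bronze-to-Silver hop on Databricks (table names, the dedup key, and the validity rule are illustrative assumptions):

```python
# Bronze -> Silver: take raw POS sales as landed, apply basic validity and
# deduplication, and publish a cleansed Delta table for downstream layers.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("retail.bronze_pos_sales")  # raw, as-landed
silver = (
    bronze
    .filter(F.col("transaction_id").isNotNull())   # basic validity rule
    .withColumn("sale_date", F.to_date("sale_ts"))
    .dropDuplicates(["transaction_id"])            # keeps reloads idempotent
)
silver.write.format("delta").mode("overwrite").saveAsTable("retail.silver_pos_sales")
```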
Client & Business Stakeholder Engagement
Partner with business stakeholders to translate functional requirements into scalable technical solutions.
Conduct architecture workshops and solutioning sessions with enterprise IT and business teams to define data-driven use cases
Data Pipeline Development & Collaboration
Collaborate with data engineers and data scientists to develop end-to-end pipelines using Python, PySpark, SQL
Enable data ingestion from diverse sources such as ERP (SAP), POS data, Syndicated Data, CRM, e-commerce platforms, and third-party datasets.
Performance, Scalability, and Reliability
Optimize Spark jobs for performance tuning, cost efficiency, and scalability by configuring appropriate cluster sizing, caching, and query optimization techniques.
Implement monitoring and alerting using Databricks Observability, Ganglia, Cloud-native tools
Security, Compliance & Governance
Design secure architectures using Unity Catalog, role-based access control (RBAC), encryption, token-based access, and data lineage tools to meet compliance policies.
Establish data governance practices including Data Fitness Index, Quality Scores, SLA Monitoring, and Metadata Cataloging.
Adoption of AI Copilots & Agentic Development
Utilize GitHub Copilot, Databricks Assistant, and other AI code agents for:
Writing PySpark, SQL, and Python code snippets for data engineering and ML tasks.
Generating documentation and test cases to accelerate pipeline development.
Interactive debugging and iterative code optimization within notebooks.
Advocate for agentic AI workflows that use specialized agents for:
Data profiling and schema inference.
Automated testing and validation.
Innovation and Continuous Learning
Stay abreast of emerging trends in Lakehouse architectures, Generative AI, and cloud-native tooling.
Evaluate and pilot new features from Databricks releases and partner integrations for modern data stack improvements.
Requirements:
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
8-12 years of hands-on experience in data engineering, with at least 5+ years on Python and Apache Spark.
Expertise in building high-throughput, low-latency ETL/ELT pipelines on AWS/Azure/GCP using Python, PySpark, SQL.
Excellent hands-on experience with workload automation tools such as Airflow, Prefect, etc.
Familiarity with building dynamic ingestion frameworks from structured/unstructured data sources including APIs, flat files, RDBMS, and cloud storage
Experience designing Lakehouse architectures with bronze, silver, gold layering.
Strong understanding of data modelling concepts, star/snowflake schemas, dimensional modelling, and modern cloud-based data warehousing.
Experience with designing Data marts using Cloud data warehouses and integrating with BI tools (Power BI, Tableau, etc.).
Experience with CI/CD pipelines using tools such as AWS CodeCommit, Azure DevOps, and GitHub Actions.
Knowledge of infrastructure-as-code (Terraform, ARM templates) for provisioning platform resources
In-depth experience with AWS Cloud services such as Glue, S3, Redshift, etc.
Strong understanding of data privacy, access controls, and governance best practices.
Experience working with RBAC, tokenization, and data classification frameworks
Excellent communication skills for stakeholder interaction, solution presentations, and team coordination.
Proven experience leading or mentoring global, cross-functional teams across multiple time zones and engagements.
Ability to work independently in agile or hybrid delivery models, while guiding junior engineers and ensuring solution quality
Must be able to work in PST time zone.
Oracle Financial Reporting & Analytics Consultant
Santa Rosa, CA
Title: Oracle Financial Reporting & Analytics Consultant
The Oracle Financial Reporting & Analytics Consultant will lead the design and development of reporting solutions across OTBI, BI Publisher, FR Studio, and Narrative Reporting. This role will focus on creating scalable, repeatable reporting packages and close-automation capabilities for Finance and Accounting stakeholders.
Core Skills & Experience
9-10 years of hands-on experience across the Oracle Financial reporting stack.
Strong expertise with:
OTBI (Oracle Transactional Business Intelligence)
BI Publisher
FR Studio (Financial Reporting Studio)
Narrative Reporting
Proven ability to build automated GL/Close dashboards, reconciliations, and operational reporting solutions.
Strong SQL, data modeling, and data quality skills.
Solid understanding of data lineage and flows across ERP → EPM → Data Warehouse environments.
Strong controls mindset, with focus on accuracy, governance, audit readiness, and repeatability.
Responsibilities
Architect and develop financial and operational reporting across OTBI, BIP, FR, and Narrative Reporting.
Build standard and automated reporting packages to streamline period close activities.
Design reconciliation and variance analysis frameworks using Oracle reporting tools.
Partner with Finance, Accounting, and FP&A teams to translate business needs into scalable reporting solutions.
Ensure reporting solutions meet data governance, security, and compliance standards.
Optimize performance of reporting processes and automate manual workflows whenever possible.
Support integrations and data flows between ERP, EPM, and downstream analytics platforms.
About Trident: Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include: Inc. 5000 fastest-growing private companies in America (2022, 2021, 2020); SF Business Times 100 fastest-growing private companies in the Bay Area (2022, 2021).
Lead Data Architect
Fremont, CA
Fractal is looking for a proactive and driven AWS Lead Data Architect/Engineer to join our cloud and data tech team. In this role, you will work on designing the system architecture and solution, ensuring the platform is scalable while performant, and creating automated data pipelines.
Responsibilities:
Design & Architecture of Scalable Data Platforms
Design, develop, and maintain large-scale data processing architectures on the Databricks Lakehouse Platform to support business needs
Architect multi-layer data models including Bronze (raw), Silver (cleansed), and Gold (curated) layers for various domains (e.g., Retail Execution, Digital Commerce, Logistics, Category Management).
Leverage Delta Lake, Unity Catalog, and advanced features of Databricks for governed data sharing, versioning, and reproducibility.
Client & Business Stakeholder Engagement
Partner with business stakeholders to translate functional requirements into scalable technical solutions.
Conduct architecture workshops and solutioning sessions with enterprise IT and business teams to define data-driven use cases
Data Pipeline Development & Collaboration
Collaborate with data engineers and data scientists to develop end-to-end pipelines using Python, PySpark, SQL
Enable data ingestion from diverse sources such as ERP (SAP), POS data, Syndicated Data, CRM, e-commerce platforms, and third-party datasets.
Performance, Scalability, and Reliability
Optimize Spark jobs for performance tuning, cost efficiency, and scalability by configuring appropriate cluster sizing, caching, and query optimization techniques.
Implement monitoring and alerting using Databricks Observability, Ganglia, Cloud-native tools
Security, Compliance & Governance
Design secure architectures using Unity Catalog, role-based access control (RBAC), encryption, token-based access, and data lineage tools to meet compliance policies.
Establish data governance practices including Data Fitness Index, Quality Scores, SLA Monitoring, and Metadata Cataloging.
Adoption of AI Copilots & Agentic Development
Utilize GitHub Copilot, Databricks Assistant, and other AI code agents for:
Writing PySpark, SQL, and Python code snippets for data engineering and ML tasks.
Generating documentation and test cases to accelerate pipeline development.
Interactive debugging and iterative code optimization within notebooks.
Advocate for agentic AI workflows that use specialized agents for:
Data profiling and schema inference.
Automated testing and validation.
Innovation and Continuous Learning
Stay abreast of emerging trends in Lakehouse architectures, Generative AI, and cloud-native tooling.
Evaluate and pilot new features from Databricks releases and partner integrations for modern data stack improvements.
Requirements:
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
8-12 years of hands-on experience in data engineering, with at least 5+ years on Python and Apache Spark.
Expertise in building high-throughput, low-latency ETL/ELT pipelines on AWS/Azure/GCP using Python, PySpark, SQL.
Excellent hands-on experience with workload automation tools such as Airflow, Prefect, etc.
Familiarity with building dynamic ingestion frameworks from structured/unstructured data sources including APIs, flat files, RDBMS, and cloud storage
Experience designing Lakehouse architectures with bronze, silver, gold layering.
Strong understanding of data modelling concepts, star/snowflake schemas, dimensional modelling, and modern cloud-based data warehousing.
Experience with designing Data marts using Cloud data warehouses and integrating with BI tools (Power BI, Tableau, etc.).
Experience with CI/CD pipelines using tools such as AWS CodeCommit, Azure DevOps, and GitHub Actions.
Knowledge of infrastructure-as-code (Terraform, ARM templates) for provisioning platform resources
In-depth experience with AWS Cloud services such as Glue, S3, Redshift, etc.
Strong understanding of data privacy, access controls, and governance best practices.
Experience working with RBAC, tokenization, and data classification frameworks
Excellent communication skills for stakeholder interaction, solution presentations, and team coordination.
Proven experience leading or mentoring global, cross-functional teams across multiple time zones and engagements.
Ability to work independently in agile or hybrid delivery models, while guiding junior engineers and ensuring solution quality
Must be able to work in PST time zone.
Oracle Financial Reporting & Analytics Consultant
San Francisco, CA
Title: Oracle Financial Reporting & Analytics Consultant
The Oracle Financial Reporting & Analytics Consultant will lead the design and development of reporting solutions across OTBI, BI Publisher, FR Studio, and Narrative Reporting. This role will focus on creating scalable, repeatable reporting packages and close-automation capabilities for Finance and Accounting stakeholders.
Core Skills & Experience
9-10 years of hands-on experience across the Oracle Financial reporting stack.
Strong expertise with:
OTBI (Oracle Transactional Business Intelligence)
BI Publisher
FR Studio (Financial Reporting Studio)
Narrative Reporting
Proven ability to build automated GL/Close dashboards, reconciliations, and operational reporting solutions.
Strong SQL, data modeling, and data quality skills.
Solid understanding of data lineage and flows across ERP → EPM → Data Warehouse environments.
Strong controls mindset, with focus on accuracy, governance, audit readiness, and repeatability.
Responsibilities
Architect and develop financial and operational reporting across OTBI, BIP, FR, and Narrative Reporting.
Build standard and automated reporting packages to streamline period close activities.
Design reconciliation and variance analysis frameworks using Oracle reporting tools.
Partner with Finance, Accounting, and FP&A teams to translate business needs into scalable reporting solutions.
Ensure reporting solutions meet data governance, security, and compliance standards.
Optimize performance of reporting processes and automate manual workflows whenever possible.
Support integrations and data flows between ERP, EPM, and downstream analytics platforms.
About Trident: Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include: Inc. 5000 fastest-growing private companies in America (2022, 2021, 2020); SF Business Times 100 fastest-growing private companies in the Bay Area (2022, 2021).
Oracle Financial Reporting & Analytics Consultant
San Mateo, CA
Title: Oracle Financial Reporting & Analytics Consultant
The Oracle Financial Reporting & Analytics Consultant will lead the design and development of reporting solutions across OTBI, BI Publisher, FR Studio, and Narrative Reporting. This role will focus on creating scalable, repeatable reporting packages and close-automation capabilities for Finance and Accounting stakeholders.
Core Skills & Experience
9-10 years of hands-on experience across the Oracle Financial reporting stack.
Strong expertise with:
OTBI (Oracle Transactional Business Intelligence)
BI Publisher
FR Studio (Financial Reporting Studio)
Narrative Reporting
Proven ability to build automated GL/Close dashboards, reconciliations, and operational reporting solutions.
Strong SQL, data modeling, and data quality skills.
Solid understanding of data lineage and flows across ERP → EPM → Data Warehouse environments.
Strong controls mindset, with focus on accuracy, governance, audit readiness, and repeatability.
Responsibilities
Architect and develop financial and operational reporting across OTBI, BIP, FR, and Narrative Reporting.
Build standard and automated reporting packages to streamline period close activities.
Design reconciliation and variance analysis frameworks using Oracle reporting tools.
Partner with Finance, Accounting, and FP&A teams to translate business needs into scalable reporting solutions.
Ensure reporting solutions meet data governance, security, and compliance standards.
Optimize performance of reporting processes and automate manual workflows whenever possible.
Support integrations and data flows between ERP, EPM, and downstream analytics platforms.
About Trident: Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include: Inc. 5000 fastest-growing private companies in America (2022, 2021, 2020); SF Business Times 100 fastest-growing private companies in the Bay Area (2022, 2021).
Oracle Financial Reporting & Analytics Consultant
Fremont, CA
Title: Oracle Financial Reporting & Analytics Consultant
The Oracle Financial Reporting & Analytics Consultant will lead the design and development of reporting solutions across OTBI, BI Publisher, FR Studio, and Narrative Reporting. This role will focus on creating scalable, repeatable reporting packages and close-automation capabilities for Finance and Accounting stakeholders.
Core Skills & Experience
9-10 years of hands-on experience across the Oracle Financial reporting stack.
Strong expertise with:
OTBI (Oracle Transactional Business Intelligence)
BI Publisher
FR Studio (Financial Reporting Studio)
Narrative Reporting
Proven ability to build automated GL/Close dashboards, reconciliations, and operational reporting solutions.
Strong SQL, data modeling, and data quality skills.
Solid understanding of data lineage and flows across ERP → EPM → Data Warehouse environments.
Strong controls mindset, with focus on accuracy, governance, audit readiness, and repeatability.
Responsibilities
Architect and develop financial and operational reporting across OTBI, BIP, FR, and Narrative Reporting.
Build standard and automated reporting packages to streamline period close activities.
Design reconciliation and variance analysis frameworks using Oracle reporting tools.
Partner with Finance, Accounting, and FP&A teams to translate business needs into scalable reporting solutions.
Ensure reporting solutions meet data governance, security, and compliance standards.
Optimize performance of reporting processes and automate manual workflows whenever possible.
Support integrations and data flows between ERP, EPM, and downstream analytics platforms.
About Trident: Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include: Inc. 5000 fastest-growing private companies in America (2022, 2021, 2020); SF Business Times 100 fastest-growing private companies in the Bay Area (2022, 2021).
Data Scientist
Phoenix, AZ
We are seeking a Data Scientist to support advanced analytics and machine learning initiatives across the organization. This role involves working with large, complex datasets to uncover insights, validate data integrity, and build predictive models. A key focus will be developing and refining machine learning models that leverage sales and operational data to optimize pricing strategies at the store level.
Day-to-Day Responsibilities
Compare and validate numbers across multiple data systems
Investigate discrepancies and understand how metrics are derived
Perform data science and data analysis tasks
Build and maintain AI/ML models using Python
Interpret model results, fine-tune algorithms, and iterate based on findings
Validate and reconcile data from different sources to ensure accuracy
Work with sales and production data to produce item-level pricing recommendations
Support ongoing development of a new data warehouse and create queries as needed
Review Power BI dashboards (Power BI expertise not required)
Contribute to both ML-focused work and general data science responsibilities
Improve and refine an existing ML pricing model already in production
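To make the modeling work concrete, purely as an illustration (the columns and CSV extract are assumptions, not the client's schema or production model), a baseline demand model underneath item-level pricing might look like:

```python
# Baseline price-response model: predict weekly units sold from price and
# item/store features; a pricing engine would search such a model for
# revenue-maximizing prices per store and item.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

sales = pd.read_csv("store_item_weeks.csv")  # hypothetical extract
X = sales[["price", "cost", "store_id", "item_id", "week_of_year"]]  # numeric-coded IDs assumed
y = sales["units_sold"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```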
Qualifications
Strong proficiency with MS SQL Server
Experience creating and deploying machine learning models in Python
Ability to interpret, evaluate, and fine-tune model outputs
Experience validating and reconciling data across systems
Strong foundation in machine learning, data modeling, and backend data operations
Familiarity with querying and working with evolving data environments
Data Analytics Consultant
New York, NY
This role serves as the key liaison between business teams, Central & Local IS&T, and the Data & Analytics team to scope, coordinate, and manage data analytics initiatives. The position is responsible for user story creation, analytics project coordination, data quality oversight, and supporting commercial reporting tools.
Key Responsibilities
Application & Project Management
Serve as the primary point of contact for business teams on analytics requests, user story development, data quality concerns, and commercial reporting applications.
Scope and coordinate new analytics requests in partnership with business and IS&T teams.
Create, track, and manage JIRA tickets for new data requirements through the full development lifecycle.
Work with Data & Analytics and Central IS&T teams to ensure data quality across the GCP data platform; coordinate issue resolution with Central and AMS teams.
Maintain project timelines and overall project plans for assigned initiatives.
Coordinate project tasks across business users and various IS&T teams.
Support user acceptance testing (UAT) for commercial reporting tools and ensure new application deployments do not negatively impact reporting.
Participate in functional and integration testing.
Represent local teams in global analytics and IS&T communities to ensure alignment with broader strategic direction.
Data & Technical Responsibilities
General understanding of GCP BigQuery and Microsoft SQL to query and analyze data.
Assess whether required data exists in GCP; identify gaps and initiate requests when needed.
Monitor data quality and collaborate with IS&T teams on remediation.
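A small sketch of the kind of data-quality probe this involves, using the google-cloud-bigquery client (the dataset, table, and columns are placeholders):

```python
# Daily data-quality probe: null-rate and volume for a commercial table,
# the sort of check used to catch platform issues before business users do.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials
sql = """
    SELECT DATE(ingested_at) AS day,
           COUNTIF(customer_id IS NULL) AS null_ids,
           COUNT(*) AS n_rows
    FROM `commercial.sales_orders`
    GROUP BY day
    ORDER BY day DESC
    LIMIT 14
"""
for row in client.query(sql).result():
    print(row.day, row.null_ids, row.n_rows)
```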
Skills & Qualifications
Required
3-5 years of progressively responsible IS&T experience.
Proven success as a project manager or business analyst.
Strong understanding of analytics tool architecture and capabilities.
Excellent written and verbal communication skills.
Strong ability to meet deadlines, manage milestones, and synthesize status updates.
Highly organized, able to prioritize, multitask, and work independently.
Comfortable engaging with executive-level stakeholders.
Proficiency with Microsoft Office.
Ability to thrive in a fast-paced, global, matrixed environment.
Some Knowledge / Growth Areas
PowerBI
SQL databases
GCP
Microsoft tools
Experience organizing AMS support teams
Retail industry experience (preferred)
Preferred
Effective negotiation skills
Prior consulting experience
Key Competencies
Independence and accountability
Relationship building and collaboration
Adaptability
Self-motivation and stress management
Ability to influence without authority
Comfort with changing expectations
Lead Data Architect
Tempe, AZ
We are seeking a Lead Data Architect to drive the design and implementation of our enterprise data architecture with a focus on Azure Data Lake, Databricks, and Lakehouse architecture. This role will serve as the data design authority, ensuring alignment with enterprise standards while enabling business value through scalable, high-quality data solutions.
The ideal candidate will have a proven track record in financial services or wealth management, deep expertise in data modeling and MDM (e.g., Profisee), and experience architecting cloud-native data platforms that support analytics, AI/ML, and regulatory/compliance requirements.
Key Responsibilities
Define and own the enterprise data architecture strategy, standards, and patterns.
Lead the design and implementation of Azure-based Lakehouse architecture leveraging Azure Data Lake, Databricks, Delta Lake, and related services.
Serve as the data design authority, governing data models, integration patterns, metadata management, and data quality standards.
Architect and implement Master Data Management (MDM) solutions, preferably with Profisee (a toy survivorship example follows this list).
Collaborate with stakeholders, engineers, and analysts to translate business requirements into scalable architecture and data models.
Ensure alignment with data governance, security, and compliance frameworks.
Provide technical leadership in data design, ETL/ELT best practices, and performance optimization.
Partner with enterprise and solution architects to integrate data architecture with application and cloud strategies.
Mentor and guide data engineers and modelers, fostering a culture of engineering and architecture excellence.
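For a flavor of the MDM design problem, here is a toy survivorship rule in pandas (not Profisee itself; the records and the "most recently verified wins" rule are invented for illustration):

```python
# Toy survivorship: collapse duplicate client records from multiple source
# systems into one golden record, preferring the most recently verified.
import pandas as pd

clients = pd.DataFrame({
    "client_key": ["C1", "C1", "C2"],
    "source": ["CRM", "custody", "CRM"],
    "email": ["a@new.com", "a@old.com", "b@x.com"],
    "verified_at": pd.to_datetime(["2024-05-01", "2023-11-12", "2024-01-09"]),
})
golden = (
    clients.sort_values("verified_at", ascending=False)
           .drop_duplicates("client_key", keep="first")
)
print(golden)
```

Real MDM platforms layer matching, stewardship workflows, and lineage on top of rules like this; the sketch shows only the survivorship idea.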
Required Qualifications
10+ years of experience in data architecture, data engineering, or related fields, with 5+ years in a lead/architect capacity.
Strong expertise in Azure Data Lake, Databricks, Delta Lake, and Lakehouse architecture.
Hands-on experience architecting and implementing MDM solutions (Profisee strongly preferred).
Deep knowledge of data modeling (conceptual, logical, physical) and metadata management.
Experience as a data design authority across enterprise programs.
Strong understanding of financial services data domains (clients, accounts, portfolios, products, transactions) and regulatory needs.
Proficiency in SQL, Python, Spark, and modern ELT/ETL tools.
Familiarity with data governance, lineage, cataloging, and data quality tools.
Excellent communication and leadership skills to engage with senior business and technology stakeholders.
Preferred Qualifications
Experience with real-time data streaming (Kafka, Event Hub).
Exposure to BI/Analytics platforms (Power BI, Tableau) integrated with Lakehouse.
Knowledge of data security and privacy frameworks in financial services.
Cloud certification in Microsoft Azure Data Engineering/Architecture.
Benefits
Comprehensive health, vision, and dental coverage.
401(k) plans plus a variety of voluntary plans such as legal services, insurance, and more.
👉 If you're a data architecture leader who thrives on building scalable, cloud-native data platforms and want to make an impact in financial services, we'd love to connect.
Data Engineer
Austin, TX
About the Role
We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.
What We're Looking For
8+ years designing and delivering scalable data pipelines in modern data platforms
Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
Ability to lead cross-functional initiatives in matrixed teams
Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
Hands-on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
Use Apache Airflow and similar tools for workflow automation and orchestration (a sample DAG follows this list)
Work with financial or regulated datasets while ensuring strong compliance and governance
Drive best practices in data quality, lineage, cataloging, and metadata management
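To illustrate the orchestration bullet above, a minimal Airflow DAG can trigger an existing Databricks job on a schedule. The DAG id, cron expression, connection name, and job_id are all assumptions, and the operator comes from the apache-airflow-providers-databricks package.

```python
# Minimal Airflow DAG that triggers a pre-existing Databricks job nightly.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="nightly_lakehouse_refresh",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                 # 02:00 daily (Airflow 2.4+ keyword)
    catchup=False,
) as dag:
    run_etl = DatabricksRunNowOperator(
        task_id="run_databricks_etl",
        databricks_conn_id="databricks_default",  # configured in Airflow connections
        job_id=12345,                             # hypothetical Databricks job ID
    )
```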
Primary Technical Skills
Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
Design efficient Delta Lake models for reliability and performance
Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
Automate ingestion and workflows using Python and REST APIs (see the ingestion sketch after this list)
Support downstream analytics for BI, data science, and application workloads
Write optimized SQL/T-SQL queries and stored procedures, and deliver curated datasets
Automate DevOps workflows, testing pipelines, and workspace configurations
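As one example of the API automation bullet above, the sketch below pages through a hypothetical REST endpoint with requests and appends the results to a Delta table; the URL, response envelope, and table name are all assumptions.

```python
# Illustrative REST-to-Delta ingestion; endpoint and schema are hypothetical.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def fetch_page(url: str, page: int) -> list:
    """Pull one page of records from the (hypothetical) source API."""
    resp = requests.get(url, params={"page": page}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])  # assumed response envelope

records, page = [], 1
while True:
    batch = fetch_page("https://api.example.com/v1/positions", page)
    if not batch:
        break
    records.extend(batch)
    page += 1

if records:
    # Schema is inferred from the dicts here; pin it explicitly in production.
    df = spark.createDataFrame(records)
    df.write.format("delta").mode("append").saveAsTable("bronze.positions")
```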
Additional Skills
Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
CI/CD: Azure DevOps
Orchestration: Apache Airflow (a plus)
Streaming: Delta Live Tables
MDM: Profisee (nice-to-have)
Databases: SQL Server, Cosmos DB
Soft Skills
Strong analytical and problem-solving mindset
Excellent communication and cross-team collaboration
Detail-oriented with a high sense of ownership and accountability
Revenue Integrity, Senior Consultant
Data consultant job at Guidehouse
Job Family:
Finance & Accounting Consulting
Travel Required:
Up to 75%+
Clearance Required:
None
What You Will Do:
At Guidehouse, successful consultants are self-starters who thrive in dynamic environments, proactively identify client needs, and deliver impactful solutions. They combine analytical rigor with practical execution, communicate effectively across diverse teams, and manage multiple priorities with minimal supervision. As a Senior Consultant in Revenue Integrity, you'll leverage your expertise to improve financial performance, ensure compliance, and optimize charge capture processes for both hospital and professional billing environments.
Position Summary:
We are seeking a Senior Consultant with deep experience in Revenue Integrity, charge capture, and charge reconciliation, along with a strong working knowledge of Epic EHR. The Senior Consultant will partner with healthcare clients to assess current-state charge capture operations, utilize data to uncover patterns and trends that reveal areas of financial or operational opportunity, and develop actionable, evidence-based recommendations.
These recommendations may span people, process, and technology domains - from redesigning workflows and refining staff roles to enhancing system configuration and reporting capabilities. The Senior Consultant will play a pivotal role in transforming data insights into operational improvements that drive measurable results, ensuring alignment with regulatory requirements and best practices across both hospital and professional charging environments.
What You Will Need:
Bachelor's degree
3+ years of revenue cycle management experience, focused on optimization and performance improvement
3+ years of management consulting/consulting-like, professional services, and/or project management experience
Demonstrated ability to present to large groups, both externally and within the practice
Proven success in driving operational process improvement and change management for revenue cycle optimization projects within hospitals and/or health systems
Intermediate to advanced data manipulation and analytical skills using Excel
What Would Be Nice To Have:
Master's degree or higher in Business Administration, Health Care Administration, Clinical Administration
Certifications in Epic
AI technology experience in Healthcare Revenue Cycle
The annual salary range for this position is $89,000.00-$148,000.00. Compensation decisions depend on a wide range of factors, including but not limited to skill sets, experience and training, security clearances, licensure and certifications, and other business and organizational needs.
What We Offer:
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.
Benefits include:
Medical, Rx, Dental & Vision Insurance
Personal and Family Sick Time & Company Paid Holidays
Position may be eligible for a discretionary variable incentive bonus
Parental Leave and Adoption Assistance
401(k) Retirement Plan
Basic Life & Supplemental Life
Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
Short-Term & Long-Term Disability
Student Loan PayDown
Tuition Reimbursement, Personal Development & Learning Opportunities
Skills Development & Certifications
Employee Referral Program
Corporate Sponsored Events & Community Outreach
Emergency Back-Up Childcare Program
Mobility Stipend
About Guidehouse
Guidehouse is an Equal Opportunity Employer-Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation.
Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco.
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation.
All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process.
If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties.
Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.