Data engineer jobs in Rancho Cordova, CA - 411 jobs
Lead Data Architect
R Systems 4.5
Data engineer job in Sacramento, CA
Role: Lead Data Architect (NoSQL/AWS)
We are seeking an experienced Data Architect with deep expertise in NoSQL database platforms, specifically MongoDB and/or Apache Cassandra, deployed and managed on AWS Cloud. This role will lead the design, architecture, and modernization of highly scalable, distributed data platforms supporting mission-critical, high-throughput applications.
Key Responsibilities
Lead the architecture, design, and implementation of NoSQL data platforms using MongoDB and/or Cassandra on AWS Cloud.
Design highly available, fault-tolerant, and horizontally scalable data architectures for large-scale, low-latency workloads.
Architect cloud-native data solutions leveraging AWS services such as EC2, EKS, S3, IAM, CloudWatch, and networking components.
Define data modeling strategies optimized for NoSQL systems, including schema design, partitioning, indexing, and replication.
Establish performance tuning, capacity planning, backup, disaster recovery, and high-availability strategies for NoSQL databases.
Lead database modernization initiatives, including migration from relational or legacy platforms to NoSQL architectures.
Define and enforce data security, encryption, access control, and compliance standards across cloud-hosted data platforms.
Partner with application, DevOps, and security teams to ensure end-to-end architectural alignment.
Develop architecture standards, reference designs, and best practices for NoSQL and cloud-based data platforms.
Provide technical leadership, design reviews, and guidance to engineering teams.
Required Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8+ years of experience in data architecture, database engineering, or distributed data platform roles.
Strong hands-on experience with MongoDB and/or Apache Cassandra, including cluster design and operations.
Proven experience hosting and managing NoSQL databases on AWS Cloud.
Deep understanding of NoSQL data modeling, consistency models, sharding, replication, and CAP trade-offs.
Experience with AWS infrastructure services (EC2, VPC, IAM, S3, CloudWatch).
Strong knowledge of database performance tuning, scalability, and resiliency patterns.
Experience designing secure, compliant data platforms in regulated or enterprise environments.
Excellent communication and stakeholder collaboration skills.
Preferred Qualifications
Experience with MongoDB Atlas on AWS or Amazon Keyspaces (Cassandra-compatible).
Hands-on exposure to Kubernetes (EKS) and containerized database deployments.
Knowledge of event-driven architectures and streaming platforms (Kafka, Kinesis).
AWS certifications (Solutions Architect, Data Analytics, or Database Specialty).
Experience supporting public sector, government, or highly regulated environments.
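The partitioning, replication, and CAP concepts this role centers on can be illustrated with a small consistent-hashing sketch: keys map onto a hash ring, and each key is owned by a replication factor's worth of distinct nodes. This is a generic Python illustration under assumed node names and an MD5 ring, not MongoDB's or Cassandra's actual partitioner.

```python
import bisect
import hashlib

RF = 3  # assumed replication factor for the example

def _token(value: str) -> int:
    # Hash a string to a position on the ring (128-bit MD5, illustrative).
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class Ring:
    def __init__(self, nodes):
        # Sort nodes by their token so we can binary-search the ring.
        self._ring = sorted((_token(n), n) for n in nodes)
        self._tokens = [t for t, _ in self._ring]

    def replicas(self, key: str):
        """Walk clockwise from the key's token, collecting RF distinct nodes."""
        start = bisect.bisect(self._tokens, _token(key)) % len(self._ring)
        owners = []
        i = start
        while len(owners) < min(RF, len(self._ring)):
            node = self._ring[i][1]
            if node not in owners:
                owners.append(node)
            i = (i + 1) % len(self._ring)
        return owners

ring = Ring(["node-a", "node-b", "node-c", "node-d"])
print(ring.replicas("order:12345"))  # three distinct owner nodes
```

Because ownership depends only on the key's hash, adding or removing a node remaps only the keys adjacent to it on the ring, which is the property that makes this scheme attractive for horizontally scaled stores.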
$106k-144k yearly est. 4d ago
Sr. Principal Software Developer - Cloud Storage
Oracle 4.6
Data engineer job in Sacramento, CA
At Oracle Cloud Infrastructure (OCI), we build the future of the cloud for Enterprises as a diverse team of fellow creators and inventors. Oracle's Cloud Infrastructure team is building Infrastructure-as-a-Service technologies that operate at a high scale in a broadly distributed multi-tenant cloud environment. We act with the speed and attitude of a start-up, with the scale and customer focus of the leading enterprise software company in the world.
Are you interested in working on Paxos, IO path availability and performance, replication, and data and metadata consistency checking? The data plane team is building a new storage layer supporting a low-latency, high-throughput storage system in 100+ regions.
Our customers run their businesses on our cloud, and our mission is to provide them with industry leading compute, storage, networking, database, security, and an ever expanding set of foundational cloud-based services. As part of this effort, the Object Storage Service team is looking for hands-on engineers with expertise and passion in solving difficult problems in distributed systems, large scale storage, and scaling services to meet future growth. If this is you, you can be part of the team that drives the best in class Object Storage Service into the next phase of its development. These are exciting times for the service - we are growing fast, and delivering innovative, enterprise-class features to satisfy customer workloads.
As a technical leader, you will own the software design and development for major components and features of the Object Storage Service. You should be able to design complex systems, be a strong programmer and a distributed systems generalist, able to dive deep into any part of the stack and low-level systems, as well as design broad distributed system interactions. You should value simplicity and scale, work comfortably in a collaborative, agile environment, and be excited to learn.
**This position is located in Seattle or Santa Clara. No remote. Relocation assistance is available.**
**Responsibilities**
As a Consulting Member of Technical Staff, you will be called upon to lead major projects and have significant participation in design and architecture. You will be expected to act as a technical leader on your team and demonstrate core values for other more junior engineers. You should be both a rock-solid coder and a distributed systems generalist, able to dive deep into any part of the stack and low-level systems, as well as design broad distributed system interactions. You should value simplicity and scale, work comfortably in a collaborative, agile environment, and be excited to learn.
To succeed with these responsibilities will require:
+ Bachelors or Masters in Computer Science, Computer Engineering, or related field.
+ 15+ years of experience delivering storage systems.
+ Cloud experience is a plus.
+ Proven experience with a major Object Oriented Programming language such as Java or C++.
+ Strong knowledge of data structures, algorithms, operating systems, and distributed systems fundamentals.
+ Very strong knowledge of databases, storage and distributed persistence technologies.
+ Strong troubleshooting and performance tuning skills.
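The replication and consistency-checking themes in this role rest on quorum overlap: with N replicas, writing to W of them and reading from R guarantees the read intersects the latest write whenever R + W > N. A brute-force check of that property (toy replica counts, not OCI's actual protocol):

```python
from itertools import combinations

def read_after_write_safe(n, w, r):
    """True iff every write-set of size w intersects every read-set of size r."""
    nodes = range(n)
    return all(set(ws) & set(rs)
               for ws in combinations(nodes, w)
               for rs in combinations(nodes, r))

# With N=3: a 2-node write quorum plus a 2-node read quorum must overlap,
# while 1-node writes and 1-node reads can miss each other entirely.
print(read_after_write_safe(3, 2, 2))  # True:  2 + 2 > 3
print(read_after_write_safe(3, 1, 1))  # False: 1 + 1 <= 3
```

The exhaustive check mirrors the pigeonhole argument behind R + W > N: two subsets whose sizes sum to more than N cannot be disjoint.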
Disclaimer:
**Certain US customer or client-facing roles may be required to comply with applicable requirements, such as immunization and occupational health mandates.**
**Range and benefit information provided in this posting is specific to the stated locations only.**
US: Hiring Range in USD from: $96,800 to $251,600 per annum. May be eligible for bonus, equity, and compensation deferral.
Oracle maintains broad salary ranges for its roles in order to account for variations in knowledge, skills, experience, market conditions and locations, as well as reflect Oracle's differing products, industries and lines of business.
Candidates are typically placed into the range based on the preceding factors as well as internal peer equity.
Oracle US offers a comprehensive benefits package which includes the following:
1. Medical, dental, and vision insurance, including expert medical opinion
2. Short term disability and long term disability
3. Life insurance and AD&D
4. Supplemental life insurance (Employee/Spouse/Child)
5. Health care and dependent care Flexible Spending Accounts
6. Pre-tax commuter and parking benefits
7. 401(k) Savings and Investment Plan with company match
8. Paid time off: Flexible Vacation is provided to all eligible employees assigned to a salaried (non-overtime eligible) position. Accrued Vacation is provided to all other employees eligible for vacation benefits. For employees working at least 35 hours per week, the vacation accrual rate is 13 days annually for the first three years of employment and 18 days annually for subsequent years of employment. Vacation accrual is prorated for employees working between 20 and 34 hours per week. Employees working fewer than 20 hours per week are not eligible for vacation.
9. 11 paid holidays
10. Paid sick leave: 72 hours of paid sick leave upon date of hire. Refreshes each calendar year. Unused balance will carry over each year up to a maximum cap of 112 hours.
11. Paid parental leave
12. Adoption assistance
13. Employee Stock Purchase Plan
14. Financial planning and group legal
15. Voluntary benefits including auto, homeowner and pet insurance
The role will generally accept applications for at least three calendar days from the posting date or as long as the job remains posted.
Career Level - IC5
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_************* or by calling *************** in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
$96.8k-251.6k yearly 8d ago
Senior Frontend Developer
Capitol Tech Solutions 3.6
Data engineer job in Sacramento, CA
Senior Front-End Developer
About Us:
Capitol Tech Solutions (CTS) is a leading digital transformation company specializing in software development, website design, and data-driven solutions. We partner with both public and private sector clients to deliver innovative, accessible, and user-centered digital experiences. At CTS, we provide top-tier technology solutions tailored to meet the diverse needs of our clients. We foster a collaborative and innovative work environment where team members are encouraged to grow, contribute, and achieve their full potential.
Primary Responsibilities:
We are seeking a highly skilled and creative Senior Front-End Developer to join our dynamic team. In this role, you will be responsible for designing and implementing user-friendly interfaces for web applications. You will leverage your expertise in front-end technologies to create responsive, accessible, and visually appealing user experiences. This includes developing interactive elements, such as navigation menus, buttons, and layouts, optimized for both desktop and mobile platforms.
You will collaborate closely with the Director of Software Development, designers, front-end and back-end developers, project managers, and business analysts to ensure that all projects meet client requirements and are delivered on time and in accordance with specifications.
Lead the design and development of intuitive user interfaces using HTML, CSS, and JavaScript, front-end frameworks such as Svelte, and .NET technologies such as C#.
Participate in Agile development processes, including sprint planning, reviews, and retrospectives.
Communicate directly with clients to gather requirements, provide updates, and give technical guidance.
Create and translate wireframes, storyboards, user flows, and designs into high-quality code.
Design and implement universal UI solutions that focus on performance, scalability, and accessibility, guaranteeing smooth user experiences across all platforms.
Conduct user research and qualitative analysis to guide design choices and enhance usability.
Conduct comprehensive testing to verify that interfaces meet design and functionality requirements.
Troubleshoot and resolve UI-related issues and bugs.
Document the technical aspects of the project for future reference and debugging.
Lead and mentor junior developers and contribute to code reviews and best practices.
Keep up to date with UI/UX trends, platform updates, and security practices, then incorporate them into your development workflows.
Qualifications:
Bachelor's degree in computer science, software engineering, or a related field.
Proven experience in UI development with a strong project portfolio.
Proficient in front-end technologies (HTML, CSS, JavaScript), C#, and the .NET framework.
7+ years of experience as a programmer/analyst in a .NET environment.
7+ years of experience using front-end frameworks/libraries, with preference for Svelte, React, Vue, and/or Angular.
5+ years of experience in digital design, user research, qualitative analysis, and interaction design.
4+ years of experience developing web application UI/UX compliant with WCAG 2.0 standards.
4+ years of experience working in an Agile team environment.
Familiarity with front-end frameworks (e.g., React, Angular, or Vue.js).
Experience with design tools such as Figma, Adobe Cloud, or Sketch.
Strong communication and collaboration skills.
Salary & Benefits:
Hourly: $48.00-$52.08
Full-time employment includes flexible personal time off, nine paid holidays per year, a 401(k) plan with employer matching, and comprehensive health insurance packages covering medical, dental, and vision care.
$48-52.1 hourly 3d ago
Data Engineer
Recology 4.5
Data engineer job in Sacramento, CA
Role
Under limited general direction, develops and maintains the data lake repository, data warehouse, and data mapping infrastructure, with the ultimate goal of making data accessible so that the organization can use it to evaluate and optimize its performance. Requires demonstrated expertise in designing, developing, and maintaining scalable data pipelines and ETL processes to support enterprise data needs.
This is a hybrid position, with three days per week in-office and the rest remote.
Essential Responsibilities
* Creates and maintains optimal data pipeline architecture.
* Assembles large, complex data sets that meet functional / non-functional business requirements.
* Performs unit testing and mock data generation.
* Uses relational data best practices, including referential integrity and query optimization (Oracle, MySQL, SQL Server).
* Participates in agile software planning and development activities, including daily standups, user story and task organization and grooming activities, and effort estimation.
* Analyzes business and technical requirements to develop documentation, designs, code, and tests.
* Maintains source control hygiene (branch protection, git-flow).
* Maintains CI/CD pipelines covering build, test, and automated deployment (e.g., Jenkins, Travis, or similar DevOps tooling).
* Other duties as assigned.
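The pipeline and ETL responsibilities above can be sketched in miniature: extract raw records, transform them (type casting and filtering), and load them into a relational store for aggregation. The table, columns, and sample data here are hypothetical, not Recology's actual schema.

```python
import csv
import io
import sqlite3

# Extract: a small in-memory CSV stands in for a real source system.
raw = io.StringIO("route_id,tons\n12,3.5\n12,4.0\n7,2.25\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pickups (route_id INTEGER, tons REAL)")

# Transform: parse rows, cast types, and drop non-positive weights.
rows = [(int(r["route_id"]), float(r["tons"]))
        for r in csv.DictReader(raw) if float(r["tons"]) > 0]

# Load, then aggregate tonnage per route.
conn.executemany("INSERT INTO pickups VALUES (?, ?)", rows)
totals = dict(conn.execute(
    "SELECT route_id, SUM(tons) FROM pickups GROUP BY route_id"))
print(totals)  # per-route totals
```

A production version would add unit tests against mock data and referential-integrity checks, as the responsibilities above describe, but the extract/transform/load shape is the same.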
Qualifications
* 5+ years of experience in a data engineer role.
* Significant experience working with relational database technologies such as Oracle & SQL Server.
* Advanced SQL knowledge and query-authoring experience with relational databases, as well as working familiarity with a variety of databases.
* Outstanding analytical, quantitative, problem-solving, and programming skills.
* Demonstrated expertise designing, developing and maintaining scalable data pipelines and ETL processes to support enterprise data needs.
* Full-stack development experience.
* Willingness to jump in, learn new tech, and work effectively across the full stack.
* Experience reviewing and executing plans and performance tuning.
* Experience working with common languages and frameworks such as C# or Java.
* In-depth understanding of modern application design principles, DevOps & Microservices.
* Relevant certifications from cloud providers (e.g., AWS Certified Data Analytics, Azure Data Engineering, etc.).
* Strong written and verbal communication skills, including ability to convey proposed solutions through natural language, diagrams, and design patterns.
* Ability to work both independently and collaboratively.
* High school diploma or GED required.
* Bachelor's degree preferred.
Recology Offers
* An ecologically innovative company that finds and mentors people committed to protecting the environment and sustaining our communities.
* The largest employee-owned resource recovery company in the industry with terrific benefits to help you prosper.
* A creative and caring culture that values community, diversity, altruism, accountability, collaboration, and learning by doing.
* An inspired company mission driven to use and return resources to their best and highest use through the practice of the 4R's: Reduce, Re-use, Recycle, and Recologize.
* Distinct professional challenges to connect with, care for, and grow community that sees a world without waste.
Recology Benefits May Include
* Paid time off and paid holidays.
* Health and wellness benefits including medical, dental, and vision.
* Retirement plans (Employee Stock Ownership Plan, 401(k) with match).
* Annual wellness incentives.
* Employee Assistance Program (EAP).
* Educational assistance.
* Commuting benefits.
* Employee referral program.
Supplemental Information
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of this job; and pursuant to applicable law, we will consider for employment qualified applicants with criminal records. It is important that you provide accurate information on the job application; inaccurate information may cause delays in the processing of your application and/or may disqualify you as a candidate.
Recology is an equal opportunity employer committed to supporting an inclusive work environment where employees are valued, heard, and provided development opportunities. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, citizenship, disability, protected veteran status, or any other basis that is prohibited by law.
This description is not intended and should not be construed to be an exhaustive list of all responsibilities, skills, effort, work conditions, and benefits associated with the job.
$126k-170k yearly est. 18d ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Data engineer job in Sacramento, CA
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Masters or Ph.D. Degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
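The experimentation and causal-inference experience listed above can be illustrated with a bare-bones permutation test on a synthetic A/B comparison. The conversion data, group sizes, and trial count are made up for the example; this is the generic technique, not Meta's tooling.

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Synthetic binary outcomes (e.g., converted / did not convert).
control = [0, 1, 0, 0, 1, 0, 1, 0, 0, 0]
treatment = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(treatment) - mean(control)  # observed lift

# Permutation test: shuffle group labels and count how often a difference
# at least as large as the observed one arises by chance.
pooled = control + treatment
TRIALS = 2000
count = 0
for _ in range(TRIALS):
    random.shuffle(pooled)
    diff = mean(pooled[:len(treatment)]) - mean(pooled[len(treatment):])
    if diff >= observed:
        count += 1
p_value = count / TRIALS

print(observed, p_value)  # effect size and one-sided p-value
```

Permutation tests make no distributional assumptions, which is why they are a common sanity check alongside parametric tests in product experimentation.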
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$210k-281k yearly 60d+ ago
Principal Data Scientist
Maximus 4.3
Data engineer job in Sacramento, CA
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This position requires occasional travel to the DC area for client meetings.
U.S. citizenship is required for this position due to government contract requirements.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or Bachelor's degree in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage #LI-Remote
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary
$156,740.00
Maximum Salary
$234,960.00
$115k-161k yearly est. Easy Apply 7d ago
Data Scientist II
The Gap 4.4
Data engineer job in Folsom, CA
About the Role
The Forecasting Team at Gap Inc. applies data analysis and machine learning techniques to drive business benefits for Gap Inc. and its brands. The team's focus is to shape the company's inventory management strategy through advanced data science and forecasting techniques. The successful candidate will lead the development of advanced forecasting models across various business functions, time horizons, and product hierarchies.
Areas of expertise include forecasting, time series, predictive modeling, supply chain analytics, and inventory management. You will support the team to build and deploy data and predictive analytics capabilities, in partnership with GapTech, PDM, Central Marketing & business partners across our brands.
What You'll Do
Build, validate, and maintain AI (machine learning (ML)/deep learning) models; diagnose and optimize model performance; and develop statistical models and analyses for ad hoc, business-focused analysis.
Develop software programs, algorithms and automated processes that cleanse, integrate, and evaluate large data sets from multiple disparate sources.
Manipulate large amounts of data across a diverse set of subject areas, collaborating with other data scientists and data engineers to prepare data pipelines for various modeling protocols.
Deliver sound, data-backed recommendations tied to business results, industry insights, and overall Gap Inc. ecosystem of technology, platform, and resources.
Communicate compelling, data-driven recommendations as well as potential trade-offs, backed by data analysis and/or model outputs to influence leaders' and stakeholders' decisions.
Build networks across the organization and partners to anticipate leader requests and influence data-driven decision making.
Guide discussions and empower more junior team members to identify the best solutions.
Who You Are
Experience in developing advanced algorithms using machine learning (ML), statistical, and optimization methods to enhance various business components in the retail sector.
Hands-on experience with forecasting models, running simulations of what-if analysis, and prescriptive analytics.
Experience with time series analysis, predictive modeling, hierarchical Bayesian, causal ML, and transformer-based algorithms.
Experience with creating business impact in supply chain, merchandise, inventory planning, or vendor management using advanced forecasting techniques.
Experience working directly with cross-functional teams such as product management, engineering, and business partners.
Advanced proficiency in modern analytics tools and languages such as Python, R, Spark, SQL.
Advanced proficiency using SQL for efficient manipulation of large datasets in on-premises and cloud distributed computing environments, such as Azure.
Ability to work both at a detailed level as well as to summarize findings and extrapolate knowledge to make strong recommendations for change.
Ability to collaborate with cross functional teams and influence product and analytics roadmap, with a demonstrated proficiency in relationship building.
Ability to assess relatively complex situations and analyze data to make judgments and recommend solutions.
Required
BS with 7+ years of experience (or MS with 5+ years) in Data Science, Computer Science, Machine Learning, Applied Mathematics, or equivalent quantitative field.
People mentoring experience, ability to work independently on large scale projects.
Proven ability to lead teams in solving unstructured technical problems to achieve business impact.
Full stack experience across analytics, data science, machine learning, and data engineering.
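The forecasting work this role centers on can be illustrated with a minimal seasonal-naive baseline — the simple benchmark that more advanced models (hierarchical Bayesian, transformer-based) are typically measured against. This is a generic sketch, not Gap Inc.'s actual methodology; the sales figures and season length are invented.

```python
# Seasonal-naive forecast: predict each future point with the value
# observed exactly one season earlier. A standard baseline in retail
# demand forecasting.

def seasonal_naive_forecast(history, season_length, horizon):
    """Forecast `horizon` steps ahead by repeating the last full season."""
    if len(history) < season_length:
        raise ValueError("need at least one full season of history")
    last_season = history[-season_length:]
    return [last_season[h % season_length] for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error, a common forecast-accuracy metric."""
    return sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / len(actual) * 100

# Two years of quarterly unit sales with a repeating seasonal pattern.
sales = [100, 120, 140, 200, 110, 125, 150, 210]
print(seasonal_naive_forecast(sales, season_length=4, horizon=4))
# → [110, 125, 150, 210]
```

A candidate model earns its complexity only when its error (e.g., MAPE) beats this baseline on held-out periods.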
$123k-171k yearly est. Auto-Apply 53d ago
Data Conversion Lead
Vsimplifyit
Data engineer job in Sacramento, CA
Job Name: Data Conversion Lead for a cloud solution
Company: VsimplifyIT Consulting
Job City: Sacramento
Job State: California
Job Country: USA
Job Description:
VsimplifyIT Consulting is a leading consulting company that helps organizations of all sizes achieve their business goals through innovative technology solutions. We are currently seeking a highly experienced and qualified Data Conversion Lead to join our team.
The Data Conversion Lead is responsible for leading and managing a team of data conversion specialists in the design, development, and implementation of data conversion solutions. The ideal candidate will have 5+ years of experience in data conversion, 3+ years of experience leading a team of data conversion specialists, and experience with a variety of data conversion tools and technologies.
Essential Duties and Responsibilities:
Lead and manage a team of data conversion specialists in the design, development, and implementation of data conversion solutions
Work with clients to understand their data conversion needs and develop a customized solution
Develop and implement data conversion strategies and methodologies
Ensure the quality and accuracy of all data conversions
Monitor and report on the progress of data conversion projects
Qualifications:
5+ years of experience in data conversion
3+ years of experience leading a team of data conversion specialists
Experience with a variety of data conversion tools and technologies
Strong analytical and problem-solving skills
Excellent communication and interpersonal skills
IMPORTANT: APPLY ONLY IF ALL YOUR PROJECTS MEET THE FOLLOWING:
a) At least 600 internal authorized users.
b) Cloud hosted public facing web portal with at least 10,000 external users.
c) A minimum of 2 legacy systems replaced by a single cloud hosted solution.
d) Development, integration, and configuration effort that occurred for a duration of one year or longer.
e) Total project cost is greater than $10 Million ($10,000,000).
f) Required data conversion, external system interface development, and integration with other applications.
Why Work for VSIMPLIFYIT Consulting?
VsimplifyIT Consulting is a leading provider of IT consulting services. We offer our clients a wide range of services, including project management, system development, and integration. We are committed to providing our clients with the highest quality services and helping them to achieve their business goals.
We are also a great place to work!
If you are a highly experienced Data Conversion Lead who meets all the above requirements, we encourage you to apply today!
$130k-188k yearly est. 60d+ ago
Data Scientist, Privacy
Datavant
Data engineer job in Sacramento, CA
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high-quality reports which summarize your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ A drive to understand real-world data in context rather than considering it in abstraction.
+ Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in the language
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
\#LI-BC1
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
$104k-130k yearly 22d ago
Senior Data Modeler
Estaffllc
Data engineer job in Sacramento, CA
We are seeking a Senior Data Modeler in the Sacramento, California area. This position requires on-site work in Sacramento, California, on Mondays and Wednesdays each week. Candidates must currently live within 60 miles of Sacramento, CA; applicants outside this radius will not be considered.
Key Duties/Responsibilities:
Performs business and systems analysis and documentation
Develops conceptual, logical, and physical relational data models for the enterprise data warehouse
Experience with large data warehouse implementation projects
Performs data modeling in relational and dimensional models
Develops a physical data model and/or works with the architect to develop a physical data model
Ability to expertly develop Data Facts and Dimensions in the EDW
Provides documentation to support the Kimball Dimensional Data Modeling Framework, as necessary
Visualizes and designs the enterprise data management framework, specifying processes used to plan, acquire, maintain, use, archive, retrieve, control, and purge data
Documents data flow diagrams in existing and future reports to use as input in report design and optimization
Develops requirements specifications
Develops design specifications
Performs data analysis/predictive data modeling
Mentors and educates team members on best practices and industry standards
Mandatory Requirements:
* Minimum of ten (10) years demonstrable experience in the data management space, with at least 5 years specializing in database design and at least 5 years in data modeling.
* Senior, hands-on Data Modeler with strong communication skills.
* Expert-level command of the ER/Studio Data Architect modeling application
* Must have Oracle Data Integrator (ODI) experience.
* At least 2 years' experience working in Oracle Autonomous Data Warehouse (ADW), specifically installed in an OCI environment
* Strong ability to articulate data modeling principles and gather requirements from non-technical business stakeholders
* Minimum of five (5) years of relevant experience in relational data modeling and dimensional data modeling, statistical analysis, and machine learning, supportive of key duties/responsibilities identified above.
* Minimum of five (5) years of experience as a data analyst or in other quantitative analysis or related disciplines, such as researcher or data engineer, supportive of key duties/responsibilities identified above.
* Excellent presentation skills to different (business and technical) audiences, ranging from Senior-level leadership to operational staff, with no supervision
* At least 2 years of experience working on Star, Snowflake, and/or Hybrid schemas
* Ability to translate business and functional requirements into technical requirements for technical team members.
* Candidates must be able to demonstrate direct, hands-on, recent practical experience in the identified areas, with specific examples.
* Required to have State/Government experience
Desired:
* Expert-level Kimball Dimensional Data Modeling experience
* Expert-level experience developing in Oracle SQL Developer or ER/Studio Data Architect for Oracle.
* Ability to develop and perform Extract, Transform, and Load (ETL) activities using Oracle tools and PL/SQL with at least 2 years of experience.
* Ability to perform technical leadership of an Oracle data warehouse team, including but not limited to ETL, requirements solicitation, DBA, data warehouse administration, and data analysis on a hands-on basis.
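The Kimball-style facts and dimensions this role calls for can be sketched as a minimal star schema. The table and column names below are purely illustrative (not from any actual state data warehouse), and in-memory SQLite stands in for Oracle ADW; the modeling pattern is the same.

```python
import sqlite3

# Minimal Kimball star schema: one fact table joined to two dimension
# tables on surrogate keys, with an additive measure in the fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
        full_date TEXT, fiscal_quarter TEXT
    );
    CREATE TABLE dim_program (
        program_key INTEGER PRIMARY KEY,
        program_name TEXT
    );
    CREATE TABLE fact_case (
        date_key INTEGER REFERENCES dim_date(date_key),
        program_key INTEGER REFERENCES dim_program(program_key),
        cases_opened INTEGER            -- additive measure
    );
""")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 'Q3')")
cur.execute("INSERT INTO dim_program VALUES (1, 'Medi-Cal')")
cur.execute("INSERT INTO fact_case VALUES (20240115, 1, 42)")

# Typical dimensional query: slice the additive measure by attributes
# drawn from the surrounding dimensions.
cur.execute("""
    SELECT p.program_name, d.fiscal_quarter, SUM(f.cases_opened)
    FROM fact_case f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_program p ON f.program_key = p.program_key
    GROUP BY p.program_name, d.fiscal_quarter
""")
print(cur.fetchall())   # → [('Medi-Cal', 'Q3', 42)]
```

The surrogate-key joins and GROUP BY over dimension attributes are exactly the access pattern a star (or snowflake) schema is optimized for.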
$109k-155k yearly est. 11d ago
Senior Data Engineer, Product Analytics
Datarobot 4.2
Data engineer job in Sacramento, CA
DataRobot delivers AI that maximizes impact and minimizes business risk. Our platform and applications integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business - today and in the future.
**About DataRobot**
DataRobot delivers the industry-leading agentic AI applications and platform that maximize impact and minimize risk for your business. DataRobot's enterprise AI platform democratizes data science with end-to-end automation for building, deploying, and managing machine learning models. This platform maximizes business value by delivering AI at scale and continuously optimizing performance over time. The company's proven combination of cutting-edge software and world-class AI implementation, training, and support services empowers any organization, regardless of size, industry, or resources, to drive better business outcomes with AI.
**About the role:**
As a technical driver and hands-on expert, the Senior Data Engineer will shape our end-to-end data strategy and guide the team's technical execution. This role is responsible for building scalable Data Warehouse and Lakehouse solutions on Snowflake, championing the ELT paradigm, and ensuring robust data governance and cost optimization. We are looking for a seasoned engineer who combines deep technical mastery with a passion for mentoring others to build and influence high-impact, data-driven solutions.
**Key Responsibilities:**
+ Architect and deliver scalable, reliable data warehouses, analytics platforms, and integration solutions; this role is critical in supporting our internal AI strategy.
+ Partner with Product Manager, Analytics to shape our project roadmap and lead its implementation. Collaborate with and mentor cross-functional teams to design and execute sophisticated data software solutions that elevate business performance and align to coding standards and architecture.
+ Develop, deploy, and support analytic data products, such as data marts, ETL jobs (extract/transform/load), and functions (in Python/SQL/DBT), in a cloud data warehouse environment using Snowflake, Stitch/Fivetran/Airflow, and AWS services (e.g., EC2, Lambda, Kinesis).
+ Navigate various data sources and efficiently locate data in a complex data ecosystem.
+ Work closely with data analysts and data scientists to build models and metrics that support their analytics needs, including data modeling enhancements driven by upstream data changes.
+ Instrument telemetry capture and data pipelines for various environments to provide product usage visibility.
+ Maintain and support deployed ETL pipelines and ensure data quality.
+ Develop monitoring and alerting systems to provide visibility into the health of data infrastructure, cloud applications, and data pipelines.
+ Partner with the IT enterprise applications and engineering teams on integration efforts between systems that impact Data & Analytics.
+ Work with R&D to answer complex technical questions about product analytics and corresponding data structure.
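A toy version of the telemetry-to-metrics flow described in these responsibilities is sketched below. The event names and fields are invented; in a real pipeline the events would land in Snowflake via Fivetran/Airflow and the aggregation would live in a DBT model, but the transform logic is the same.

```python
from collections import defaultdict
from datetime import datetime

# Toy ELT step: raw product-telemetry events -> daily active users
# per feature. Stands in for a DBT model over a Snowflake events table.

raw_events = [  # "extract": events as they might land from an event tracker
    {"user": "u1", "feature": "deploy",  "ts": "2024-05-01T09:00:00"},
    {"user": "u2", "feature": "deploy",  "ts": "2024-05-01T11:30:00"},
    {"user": "u1", "feature": "deploy",  "ts": "2024-05-01T15:00:00"},
    {"user": "u1", "feature": "monitor", "ts": "2024-05-02T08:00:00"},
]

def daily_feature_dau(events):
    """"Transform": count distinct users per (day, feature)."""
    seen = defaultdict(set)
    for e in events:
        day = datetime.fromisoformat(e["ts"]).date().isoformat()
        seen[(day, e["feature"])].add(e["user"])
    return {key: len(users) for key, users in sorted(seen.items())}

print(daily_feature_dau(raw_events))
# → {('2024-05-01', 'deploy'): 2, ('2024-05-02', 'monitor'): 1}
```

Note the deduplication by user set: u1's repeated "deploy" events on the same day count once, which is the distinction between raw event counts and the usage-visibility metrics the role asks for.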
**Knowledge, Skills, and Abilities:**
+ 5-7 years of experience in a data engineering or data analyst role.
+ Experience building and maintaining product analytics pipelines, including the implementation of event tracking (e.g., Snowplow) and the integration of behavioral data into Snowflake from platforms like Amplitude.
+ Strong understanding of data warehousing concepts, working experience with relational databases (Snowflake, Redshift, Postgres, etc.), and SQL.
+ Experience working with cloud providers like AWS, Azure, GCP, etc.
+ Solid programming foundations and proficiency in data-related languages like Python, Scala, and R.
+ Experience with DevOps workflows and tools like DBT, GitHub, Airflow, etc.
+ Experience with an infrastructure-as-code tool such as Terraform or CloudFormation
+ Excellent communication skills. Ability to effectively communicate with both technical and non-technical audiences
+ Knowledge of real-time stream technologies like AWS Firehose, Spark, etc.
+ Highly collaborative in working with teammates and stakeholders
**Nice to Have:**
+ AWS cloud certification is a plus
+ BA/BS preferred in a technical or engineering field
**Compensation Statement**
The U.S. annual base salary range for this full-time position is between $180,000 and $210,000 USD/year. Actual offers may be higher or lower than this range based on various factors, including (but not limited to) the candidate's work location, job-related skills, experience, and education.
The talent and dedication of our employees are at the core of DataRobot's journey to be an iconic company. We strive to attract and retain the best talent by providing competitive pay and benefits with our employees' well-being at the core. Here's what your benefits package may include depending on your location and local legal requirements: Medical, Dental & Vision Insurance, Flexible Time Off Program, Paid Holidays, Paid Parental Leave, Global Employee Assistance Program (EAP) and more!
**DataRobot Operating Principles:**
+ Wow Our Customers
+ Set High Standards
+ Be Better Than Yesterday
+ Be Rigorous
+ Assume Positive Intent
+ Have the Tough Conversations
+ Be Better Together
+ Debate, Decide, Commit
+ Deliver Results
+ Overcommunicate
Research shows that many women only apply to jobs when they meet 100% of the qualifications while many men apply to jobs when they meet 60%. **At DataRobot we encourage ALL candidates, especially women, people of color, LGBTQ+ identifying people, differently abled, and other people from marginalized groups to apply to our jobs, even if you do not check every box.** We'd love to have a conversation with you and see if you might be a great fit.
DataRobot is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. DataRobot is committed to working with and providing reasonable accommodations to applicants with physical and mental disabilities. Please see the United States Department of Labor's EEO poster and EEO poster supplement for additional information.
All applicant data submitted is handled in accordance with our Applicant Privacy Policy (*************************************************** .
DataRobot delivers AI that maximizes impact and minimizes business risk. Our AI applications and platform integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business - today and in the future. For more information, visit our website (************************* and connect with us on LinkedIn (******************************************** .
**_DataRobot has become aware of scams involving false offers of DataRobot employment. The scams and false offers use imposter websites, email addresses, text messages, and other fraudulent means. None of these offers are legitimate, and DataRobot's recruiting process never involves conducting interviews via instant messages, nor requires candidates to purchase products or services, or to process payments on our behalf._** **_Please note that DataRobot does not ask for money in its recruitment process._** **_DataRobot is committed to providing a safe and secure environment for all job applicants. We encourage all job seekers to be vigilant and protect themselves against recruitment scams by verifying the legitimacy of any job offer before providing personal information or paying any_** **_fees. Communication_** **_from our company will be sent from a verified email address using the @_** **_datarobot.com_** **_email domain. If you receive any suspicious emails or messages claiming to be from DataRobot, please do not respond._**
**_Thank you for your interest in DataRobot, and we look forward to receiving your application through our official channels._**
Don't see the dream job you are looking for? Drop off your contact information and resume and we will reach out to you if we find the perfect fit!
$180k-210k yearly 19d ago
Google Senior Data Engineer
Accenture 4.7
Data engineer job in Sacramento, CA
We Are
Accenture is a premier Google Cloud partner helping organizations modernize data ecosystems, build real-time analytics capabilities, and responsibly scale AI. As part of Accenture Cloud First and the Accenture Google Business Group (AGBG), we deliver solutions leveraging Google Cloud's Data & AI platform, including BigQuery, Looker, Vertex AI, Gemini Foundation Models, and Gemini Enterprise.
You Are
A hands-on engineer with foundational experience in Data Engineering, Analytics, or Machine Learning, now building deep expertise in Google Cloud Platform (GCP). You are eager to apply technical skills, learn advanced Data & AI patterns, and support delivery teams in designing and implementing modern data and AI solutions.
You're comfortable working directly with clients, supporting senior architects, and contributing to end-to-end project execution.
The Work (What You Will Do)
As a GCP Senior Data Engineer, you will help deliver data modernization, analytics, and AI solutions on GCP. You will support architecture design, build data pipelines and models, perform analysis, and contribute to technical implementations under guidance from senior team members.
1. Hands-On Technical Delivery
+ Build data pipelines, ETL/ELT processes, and integrations using GCP services such as: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage
+ Assist with data modeling, performance tuning, and query optimization in BigQuery.
+ Implement data ingestion patterns for batch and streaming data sources.
+ Support development of dashboards and analytics products using Looker or Looker Studio.
2. Support Agentic AI & ML Solution Development
+ Assist in developing ML models and AI solutions using: Vertex AI, Gemini Foundation Models, Gemini Enterprise, Model APIs & Embeddings
+ Implement ML pipelines and help establish MLOps processes (monitoring, retraining, deployment).
+ Support prompt engineering, embeddings, and retrieval-augmented generation (RAG) experimentation.
+ Contribute to model testing, validation, and documentation.
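At its core, the RAG experimentation mentioned above reduces to embedding a query and retrieving the nearest documents before handing them to an LLM as context. A dependency-free sketch with tiny hand-made vectors standing in for real model embeddings (a production pipeline would call a Vertex AI embedding model instead):

```python
import math

# Core retrieval step of RAG: rank documents by cosine similarity of
# their embeddings to the query embedding; the top hits become the
# context passed to the LLM. Vectors here are illustrative stand-ins.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, docs, top_k=1):
    """Return the top_k (score, doc_id) pairs, best match first."""
    scored = sorted(((cosine(query_vec, vec), doc_id)
                     for doc_id, vec in docs.items()), reverse=True)
    return scored[:top_k]

docs = {
    "billing_faq":   [0.9, 0.1, 0.0],
    "api_guide":     [0.1, 0.9, 0.2],
    "release_notes": [0.0, 0.2, 0.9],
}
query = [0.2, 0.8, 0.1]   # pretend embedding of "how do I call the API?"
print(retrieve(query, docs, top_k=2))
```

Swapping the toy vectors for real embeddings and the dict for a vector index (e.g., Vertex AI Vector Search) turns this sketch into the retrieval half of a RAG pipeline; prompt construction over the returned documents is the other half.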
3. Requirements Gathering & Client Collaboration
+ Participate in client workshops to understand data needs, use cases, and technical requirements.
+ Help translate functional requirements into technical tasks and implementation plans.
+ Communicate progress, blockers, and insights to project leads and client stakeholders.
4. Data Governance, Quality & Security Support
+ Implement metadata management, data quality checks, and lineage tracking using GCP tools (Dataplex, IAM).
+ Follow best practices for security, identity management, and compliance.
+ Support operational processes for data validation, testing, and monitoring.
5. Continuous Learning & Team Support
+ Learn and apply GCP Data & AI best practices across architectural patterns, engineering standards, and AI frameworks.
+ Collaborate closely with senior data engineers, ML engineers, and architects.
+ Contribute to internal accelerators, documentation, and reusable components.
+ Stay current with GCP releases, Gemini model updates, and modern engineering practices.
Travel may be required for this role. The amount of travel will vary from 0 to 100% depending on business need and client requirements.
Here's what you need
+ Minimum of 5 years of hands-on experience in Data Engineering, Data Analytics, ML Engineering, or related fields.
+ Minimum of 4 years of practical experience with Google Cloud Platform.
+ Minimum of 5 years of experience with SQL, data modeling, and building data pipelines.
+ Minimum of 3 years of experience with Python, AI, or GenAI tools (Vertex AI preferred).
+ Bachelor's degree or equivalent (minimum 12 years) work experience. (If Associate's Degree, must have minimum 6 years work experience)
Bonus point if you have
+ Experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, and Looker.
+ Exposure to AI/ML development or experimentation with Vertex AI, Gemini models, embeddings, or RAG patterns.
+ Hands-on experience with CI/CD, Git, or cloud-native engineering practices.
+ Google Cloud certifications (Associate Cloud Engineer or Professional Data Engineer).
+ Experience working in agile delivery environments.
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired, as set forth below. We anticipate this job posting will be posted on 01/24/2026 and open for at least 3 days.
Accenture offers a market competitive suite of benefits including medical, dental, vision, life, and long-term disability coverage, a 401(k) plan, bonus opportunities, paid holidays, and paid time off. See more information on our benefits here:
U.S. Employee Benefits | Accenture (*******************************************************
Role Location Annual Salary Range
California $94,400 to $266,300
Cleveland $87,400 to $213,000
Colorado $94,400 to $230,000
District of Columbia $100,500 to $245,000
Illinois $87,400 to $230,000
Maryland $94,400 to $230,000
Massachusetts $94,400 to $245,000
Minnesota $94,400 to $230,000
New York $87,400 to $266,300
New Jersey $100,500 to $266,300
Washington $100,500 to $245,000
Requesting an Accommodation
Accenture is committed to providing equal employment opportunities for persons with disabilities or religious observances, including reasonable accommodation when needed. If you are hired by Accenture and require accommodation to perform the essential functions of your role, you will be asked to participate in our reasonable accommodation process. Accommodations made to facilitate the recruiting process are not a guarantee of future or continued accommodations once hired.
If you would like to be considered for employment opportunities with Accenture and have accommodation needs such as for a disability or religious observance, please call us toll free at **************** or send us an email or speak with your recruiter.
Equal Employment Opportunity Statement
We believe that no one should be discriminated against because of their differences. All employment decisions shall be made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by federal, state, or local law. Our rich diversity makes us more innovative, more competitive, and more creative, which helps us better serve our clients and our communities.
For details, view a copy of the Accenture Equal Opportunity Statement (********************************************************************************************************************************************
Accenture is an EEO and Affirmative Action Employer of Veterans/Individuals with Disabilities.
Accenture is committed to providing veteran employment opportunities to our service men and women.
Other Employment Statements
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States.
Candidates who are currently employed by a client of Accenture or an affiliated Accenture business may not be eligible for consideration.
Job candidates will not be obligated to disclose sealed or expunged records of conviction or arrest as part of the hiring process. Further, at Accenture a criminal conviction history is not an absolute bar to employment.
The Company will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Additionally, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the Company's legal duty to furnish information.
California requires additional notifications for applicants and employees. If you are a California resident, live in or plan to work from Los Angeles County upon being hired for this position, please click here for additional important information.
Please read Accenture's Recruiting and Hiring Statement for more information on how we process your data during the Recruiting and Hiring process.
$100.5k-266.3k yearly 2d ago
Data Engineer
Kuvare
Data engineer job in Rosemont, CA
About the role
The Kuvare Data Engineer is responsible for the retrieval, storage, and distribution of data across varied platforms and data stores, including modern ETL/ELT pipelines such as Azure Data Factory, as well as modern models such as data mesh and event streaming architectures like Azure Event Hub or Kafka. The individual will rely on broad experience in data modeling, including relational and dimensional models as well as semi-structured and unstructured data forms such as JSON, YAML, and PDF. This role has a direct business impact by providing rapid enhancements to the data platform in support of new business relationships, new consumer products, etc.
What you'll do
· Research architectural trends, design methods, and emerging technologies, in adherence to and in support of company standards, helping Kuvare develop and maintain a nimble technology platform for adapting to new business opportunities.
· Work with Business SMEs and end users - including external business partners and vendors - to translate business requirements into technical specifications, including technical designs, data flow diagrams, and other technical artifacts.
· Create data pipelines to ingest, update, or transport data to and from a variety of data stores, from external data files to internal dimensional or relational models, leveraging a broad toolset including SFTP, Azure Data Factory, Azure Event Hub, SQL Stored Procedures, etc.
· Leverage excellent SQL skills to build structured data repositories: tables, views, indexes, constraints, etc.
· Perform tests and evaluations to ensure reliability, data security, privacy, and integrity.
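The repository-building duty above (tables, views, indexes, constraints) can be sketched in miniature. This is an illustrative example only, using Python's built-in sqlite3 rather than the posting's Microsoft SQL Server/Azure stack; the `dim_policy` table and its columns are hypothetical names chosen for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical dimensional-model objects: a dimension table with
# constraints, a supporting index, and a reporting view.
cur.executescript("""
CREATE TABLE dim_policy (
    policy_key   INTEGER PRIMARY KEY,
    policy_no    TEXT NOT NULL UNIQUE,
    product_line TEXT NOT NULL,
    issue_date   TEXT
);
CREATE INDEX ix_dim_policy_product ON dim_policy (product_line);
CREATE VIEW v_policies_by_product AS
    SELECT product_line, COUNT(*) AS n_policies
    FROM dim_policy
    GROUP BY product_line;
""")

cur.execute("INSERT INTO dim_policy (policy_no, product_line) VALUES (?, ?)",
            ("P-1001", "annuity"))
cur.execute("INSERT INTO dim_policy (policy_no, product_line) VALUES (?, ?)",
            ("P-1002", "annuity"))
print(cur.execute("SELECT * FROM v_policies_by_product").fetchall())
```

The same pattern — constrained base tables plus views for consumers — carries over directly to T-SQL on SQL Server.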
· Other Responsibilities
· Ability and willingness to travel occasionally, such as for monthly meetings at the local office and an annual IT summit at a Kuvare hub office, which might be Baton Rouge, Cedar Rapids, or Chicago.
· Occasional evening and weekend work to meet deadlines.
· Other duties and responsibilities as assigned.
Qualifications
· Bachelor's degree in computer science, information systems, computer engineering or related field; or an equivalent combination of education and experience.
· Extensive experience with Microsoft SQL Server and the Azure environment.
· Expertise in relational and dimensional database modeling, data development and administration.
· Extensive experience with ETL/ELT pipelines including any combination of modern tools such as SSIS, Azure Data Factory, Function Apps, Logic Apps, etc.
· Analytic and problem-solving skills and experience.
· Excellent written and oral communication skills.
$108k-155k yearly est. 11d ago
DATA SCIENTIST
Department of The Air Force
Data engineer job in Fairfield, CA
The PALACE Acquire Program offers you a permanent position upon completion of your formal training plan. As a Palace Acquire Intern you will experience both personal and professional growth while dealing effectively and ethically with change, complexity, and problem solving. The program offers a 3-year formal training plan with yearly salary increases. Promotions and salary increases are based upon your successful performance and supervisory approval.
Overview
Accepting applications
Open & closing dates
09/29/2025 to 09/28/2026
Salary $49,960 to $99,314 per year
Total salary varies depending on location of position
Pay scale & grade GS 7 - 9
Locations
Gunter AFB, AL
Few vacancies
Maxwell AFB, AL
Few vacancies
Davis Monthan AFB, AZ
Few vacancies
Edwards AFB, CA
Few vacancies
Los Angeles, CA
Few vacancies
Travis AFB, CA
Few vacancies
Vandenberg AFB, CA
Few vacancies
Air Force Academy, CO
Few vacancies
Buckley AFB, CO
Few vacancies
Cheyenne Mountain AFB, CO
Few vacancies
Peterson AFB, CO
Few vacancies
Schriever AFB, CO
Few vacancies
Joint Base Anacostia-Bolling, DC
Few vacancies
Cape Canaveral AFS, FL
Few vacancies
Eglin AFB, FL
Few vacancies
Hurlburt Field, FL
Few vacancies
MacDill AFB, FL
Few vacancies
Patrick AFB, FL
Few vacancies
Tyndall AFB, FL
Few vacancies
Robins AFB, GA
Few vacancies
Hickam AFB, HI
Few vacancies
Barksdale AFB, LA
Few vacancies
Hanscom AFB, MA
Few vacancies
Natick, MA
Few vacancies
Aberdeen Proving Ground, MD
Few vacancies
Andrews AFB, MD
Few vacancies
White Oak, MD
Few vacancies
Offutt AFB, NE
Few vacancies
Holloman AFB, NM
Few vacancies
Kirtland AFB, NM
Few vacancies
Nellis AFB, NV
Few vacancies
Rome, NY
Few vacancies
Heath, OH
Few vacancies
Wright-Patterson AFB, OH
Few vacancies
Tinker AFB, OK
Few vacancies
Arnold AFB, TN
Few vacancies
Dyess AFB, TX
Few vacancies
Fort Sam Houston, TX
Few vacancies
Goodfellow AFB, TX
Few vacancies
Lackland AFB, TX
Few vacancies
Randolph AFB, TX
Few vacancies
Hill AFB, UT
Few vacancies
Arlington, VA
Few vacancies
Dahlgren, VA
Few vacancies
Langley AFB, VA
Few vacancies
Pentagon, Arlington, VA
Few vacancies
Fairchild AFB, WA
Few vacancies
Warren AFB, WY
Few vacancies
Remote job: No
Telework eligible: No
Travel Required: Occasional travel - You may be expected to travel for this position.
Relocation expenses reimbursed: No
Appointment type: Internships
Work schedule: Full-time
Service: Competitive
Promotion potential
13
Job family (Series)
* 1560 Data Science Series
Supervisory status: No
Security clearance: Secret
Drug test: No
Position sensitivity and risk: Noncritical-Sensitive (NCS)/Moderate Risk
Trust determination process
* Suitability/Fitness
Financial disclosure: No
Bargaining unit status: No
Announcement number: K-26-DHA-12804858-AKK
Control number: 846709300
This job is open to
The public
U.S. Citizens, Nationals or those who owe allegiance to the U.S.
Students
Current students enrolled in an accredited high school, college or graduate institution.
Recent graduates
Individuals who have graduated from an accredited educational institute or certificate program within the last 2 years or 6 years for Veterans.
Clarification from the agency
This public notice is to gather applications that may or may not result in a referral or selection.
Duties
1. Performs developmental assignments in support of projects assigned to higher level analysts. Performs minor phases of a larger assignment or work of moderate difficulty where procedures are established, and a number of specific guidelines exist. Applies the various steps of accepted data science procedures to search for information and perform well precedented work.
2. Performs general operations and assignments for portions of a project or study consisting of a series of interrelated tasks or problems. The employee applies judgment in the independent application of methods and techniques previously learned. The employee locates and selects the most appropriate guidelines and modifies to address unusual situations.
3. Participates in special initiatives, studies, and projects. Performs special research tasks designed to utilize and enhance knowledge of work processes and techniques. Works with higher graded specialists in planning and conducting special initiatives, studies, and projects. Assists in preparing reports and briefings outlining study findings and recommendations.
4. Prepares correspondence and other documentation. Drafts or prepares a variety of documents to include newsletter items,
responses to routine inquiries, reports, letters, and other related documents.
Requirements
Conditions of employment
* Employee must maintain current certifications
* Successful completion of all training and regulatory requirements as identified in the applicable training plan
* Must meet suitability for federal employment
* Direct Deposit: All federal employees are required to have direct deposit
* Please read this Public Notice in its entirety prior to submitting your application for consideration.
* Males must be registered for Selective Service, see ***********
* A security clearance may be required. This position may require a secret, top-secret or special sensitive clearance.
* If authorized, PCS will be paid IAW JTR and AF Regulations. If receiving an authorized PCS, you may be subject to completing/signing a CONUS agreement. More information on PCS requirements, may be found at: *****************************************
* Position may be subject to random drug testing
* U.S. Citizenship Required
* Disclosure of Political Appointments
* Student Loan Repayment may be authorized
* Recruitment Incentive may be authorized for this position
* Total salary varies depending on location of position
* You will be required to serve a one year probationary period
* Grade Point Average - 2.95 or higher out of a possible 4.0
* Mobility - you may be required to relocate during or after completion of your training
* Work may occasionally require travel away from the normal duty station on military or commercial aircraft
Qualifications
BASIC REQUIREMENT OR INDIVIDUAL OCCUPATIONAL REQUIREMENT:
Degree: Mathematics, statistics, computer science, data science or field directly related to the position. The degree must be in a major field of study (at least at the baccalaureate level) that is appropriate for the position.
You may qualify if you meet one of the following:
1. GS-7: You must have completed or will complete a 4-year course of study leading to a bachelor's from an accredited institution AND must have documented Superior Academic Achievement (SAA) at the undergraduate level in the following:
a) Grade Point Average 2.95 or higher out of a possible 4.0 as recorded on your official transcript or as computed based on 4 years of education or as computed based on courses completed during the final 2 years of curriculum; OR 3.45 or higher out of a possible 4.0 based on the average of the required courses completed in your major field or the required courses in your major field completed during the final 2 years of your curriculum.
2. GS-9: You must have completed 2 years of progressively higher-level graduate education leading to a master's degree or equivalent graduate degree:
a) Grade Point Average - 2.95 or higher out of a possible 4.0 as recorded on your official transcript or as computed based on 4 years of education or as computed based on courses completed during the final 2 years of curriculum; OR 3.45 or higher out of a possible 4.0 based on the average of the required courses completed in your major field or the required courses in your major field completed during the final 2 years of your curriculum. If more than 10 percent of total undergraduate credit hours are non-graded, i.e. pass/fail, CLEP, CCAF, DANTES, military credit, etc. you cannot qualify based on GPA.
KNOWLEDGE, SKILLS AND ABILITIES (KSAs): Your qualifications will be evaluated on the basis of your level of knowledge, skills, abilities and/or competencies in the following areas:
1. Professional knowledge of basic principles, concepts, and practices of data science to apply scientific methods and techniques to analyze systems, processes, and/or operational problems and procedures.
2. Knowledge of mathematics and analysis to perform minor phases of a larger assignment and prepare reports, documentation, and correspondence to communicate factual and procedural information clearly.
3. Skill in applying basic principles, concepts, and practices of the occupation sufficient to perform routine to difficult but well precedented assignments in data science analysis.
4. Ability to analyze, interpret, and apply data science rules and procedures in a variety of situations and recommend solutions to senior analysts.
5. Ability to analyze problems to identify significant factors, gather pertinent data, and recognize solutions.
6. Ability to plan and organize work and confer with co-workers effectively.
PART-TIME OR UNPAID EXPERIENCE: Credit will be given for appropriate unpaid and or part-time work. You must clearly identify the duties and responsibilities in each position held and the total number of hours per week.
VOLUNTEER WORK EXPERIENCE: Refers to paid and unpaid experience, including volunteer work done through National Service Programs (i.e., Peace Corps, AmeriCorps) and other organizations (e.g., professional; philanthropic; religious; spiritual; community; student and social). Volunteer work helps build critical competencies, knowledge and skills that can provide valuable training and experience that translates directly to paid employment. You will receive credit for all qualifying experience, including volunteer experience.
Education
IF USING EDUCATION TO QUALIFY: If the position has a positive degree requirement or education forms the basis for qualifications, you MUST submit transcripts with the application. Official transcripts are not required at the time of application; however, if the position has a positive degree requirement, or you are qualifying based on education alone or in combination with experience, transcripts must be verified prior to appointment. Education must be accredited by an accrediting institution recognized by the U.S. Department of Education. Click here to check accreditation.
FOREIGN EDUCATION: Education completed in foreign colleges or universities may be used to meet the requirements. You must show proof that the education credentials have been deemed at least equivalent to those gained in a conventional U.S. education program. It is your responsibility to provide such evidence when applying.
Additional information
For DHA Positions:
These positions are being filled under Direct-Hire Authority for the Department of Defense for Post-Secondary Students and Recent Graduates. The Secretary of the Air Force has delegated authority by the Office of the Secretary of Defense to directly appoint qualified post-secondary students and recent graduates directly into competitive service positions; these positions may be professional or administrative occupations and are located Air Force-Wide. Positions may be filled as permanent or term with a full-time or part-time work schedule. Pay will vary by geographic location.
* The term "Current post-secondary student" means a person who is currently enrolled in, and in good academic standing at a full-time program at an institution of higher education; and is making satisfactory progress toward receipt of a baccalaureate or graduate degree; and has completed at least one year of the program.
* The term "recent graduate" means a person who was awarded a degree by an institution of higher education not more than two years before the date of the appointment of such person, except in the case of a person who has completed a period of obligated service in a uniform service of more than four years.
Selective Service: Males born after 12-31-59 must be registered or exempt from Selective Service. For additional information, click here.
Direct Deposit: All federal employees are required to have direct deposit.
If you are unable to apply online, view the following link for information regarding Alternate Application. The Vacancy ID is
If you have questions regarding this announcement and have hearing or speech difficulties click here.
Tax Law Impact for PCS: On 22-Dec-2017, Public Law 115-97 - the "Tax Cuts and Jobs Act of 2017" suspended qualified moving expense deductions along with the exclusion for employer reimbursements and payments of moving expenses effective 01-Jan-2018 for tax years 2018 through 2025. The law made taxable certain reimbursements and other payments, including driving mileage, airfare and lodging expenses, en-route travel to the new duty station, and temporary storage of those items. The Federal Travel Regulation Bulletin (FTR) 18-05 issued by General Services Administration (GSA) has authorized agencies to use the Withholding Tax Allowance (WTA) and Relocation Income Tax Allowance (RITA) to pay for "substantially all" of the increased tax liability resulting from the "2018 Tax Cuts and Jobs Act" for certain eligible individuals. For additional information on WTA/RITA allowances and eligibilities please click here. Subsequently, FTR Bulletin 20-04 issued by GSA, provides further information regarding NDAA FY2020, Public Law 116-92, and the expansion of eligibility beyond "transferred" for WTA/RITA allowances. For additional information, please click here.
Candidates should be committed to improving the efficiency of the Federal government, passionate about the ideals of our American republic, and committed to upholding the rule of law and the United States Constitution.
Benefits
A career with the U.S. government provides employees with a comprehensive benefits package. As a federal employee, you and your family will have access to a range of benefits that are designed to make your federal career very rewarding. Opens in a new window Learn more about federal benefits.
Review our benefits
Eligibility for benefits depends on the type of position you hold and whether your position is full-time, part-time or intermittent. Contact the hiring agency for more information on the specific benefits offered.
How you will be evaluated
You will be evaluated for this job based on how well you meet the qualifications above.
For DHA Positions:
These positions are being filled under Direct-Hire Authority for the DoD for Post-Secondary Students and Recent Graduates. The Secretary of the Air Force has delegated authority by the Office of the Secretary of Defense to directly appoint qualified students and recent graduates directly into competitive service positions; positions may be professional or administrative occupations and located Air Force-Wide. Positions may be filled as permanent/term with a full-time/part-time work schedule. Pay will vary by geographic location.
* The term "Current post-secondary student" means a person who is currently enrolled and in good academic standing at a full-time program at an institution of higher education; and is progressing toward a baccalaureate or graduate degree; and has completed at least 1 year of the program.
* The term "recent graduate" means a person awarded a degree by an institution of higher education not more than 2 years before the date of the appointment of such person, except in the case of a person who has completed a period of obligated service in a uniform service of more than 4 years.
Your latest resume will be used to determine your qualifications.
Your application package (resume, supporting documents, and responses to the questionnaire) will be used to determine your eligibility, qualifications, and quality ranking for this position. Please follow all instructions carefully. Errors or omissions may affect your rating or consideration for employment.
Your responses to the questionnaire may be compared to the documents you submit. The documents you submit must support your responses to the online questionnaire. If your application contradicts or does not support your questionnaire responses, you will receive a rating of "not qualified" or "insufficient information" and you will not receive further consideration for this job.
Applicants who disqualify themselves will not be evaluated further.
Required Documents
The following documents are required and must be provided with your application for this Public Notice. Applicants who do not submit required documentation to determine eligibility and qualifications will be eliminated from consideration. Other documents may be required based on the eligibility/eligibilities you are claiming. Click here to view the AF Civilian Employment Eligibility Guide and the required documents you must submit to substantiate the eligibilities you are claiming.
* Online Application - Questionnaire
* Resume: Your resume may NOT exceed two pages, and the font size should not be smaller than 10 pts. You will not be considered for this vacancy if your resume is illegible/unreadable. Additional information on resume requirements can be located under "
$50k-99.3k yearly 33d ago
Sr Data Engineer
Insight Global
Data engineer job in Sacramento, CA
Data Pipeline Development
Design and implement complex, scalable ETL pipelines using Azure Fabric, Data Factory, and Synapse.
Build and maintain transformation pipelines and data flows in Azure Fabric.
Source data from diverse systems including APIs, legacy systems, and mainframes (nice to have).
Automate data ingestion, transformation, and validation processes using PySpark and Python.
Maintain source control hygiene and CI/CD pipelines (e.g., Azure DevOps, Jenkins).
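The ingest-validate-transform automation described above can be sketched as a minimal pure-Python example. In the posting's stack this logic would typically run as PySpark inside Azure Fabric or Data Factory; the record fields (`id`, `loaded_at`) are hypothetical and chosen only for illustration.

```python
from datetime import date

def validate(record):
    """Return a list of validation errors for one raw record (empty = valid)."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    try:
        date.fromisoformat(record.get("loaded_at", ""))
    except ValueError:
        errors.append("bad loaded_at")
    return errors

def transform(record):
    """Normalize a validated record for the warehouse layer."""
    return {"id": record["id"].strip().upper(),
            "loaded_at": record["loaded_at"]}

raw = [{"id": " a1 ", "loaded_at": "2024-05-01"},
       {"id": "", "loaded_at": "not-a-date"}]

clean = [transform(r) for r in raw if not validate(r)]
rejected = [r for r in raw if validate(r)]
print(clean)           # normalized valid rows
print(len(rejected))   # rows routed to an error/quarantine path
```

Separating validation from transformation like this is what makes the pipeline steps independently testable and automatable in CI/CD.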
Database Design & Optimization
Design and maintain relational and NoSQL databases (SQL Server, Oracle, etc.).
Ensure referential integrity, indexing, and query optimization for performance.
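The indexing and query-optimization duty above can be demonstrated concretely. A hedged sketch using Python's built-in sqlite3 (the posting's databases are SQL Server and Oracle, where the equivalent tools are execution plans); the `orders` table is a hypothetical example, and the exact plan wording varies by SQLite version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# Without a supporting index, the lookup must scan the whole table...
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

# ...after adding one, the optimizer switches to an index search.
cur.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

print(plan_before[0][-1])  # table scan
print(plan_after[0][-1])   # index search
```

Reading plans before and after an index change is the same workflow a senior engineer applies with `SET SHOWPLAN` on SQL Server or `EXPLAIN PLAN` on Oracle.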
Data Infrastructure Management
Manage data warehouses, data lakes, and other storage solutions on Azure.
Monitor system performance, ensure data security, and maintain compliance.
Data Modeling & Governance
Develop and maintain logical and physical data models.
Implement data governance policies and ensure data quality standards.
Collaboration & Agile Development
Work closely with business and technical teams to gather requirements and deliver solutions.
Participate in agile ceremonies, sprint planning, and code reviews.
Provide technical guidance and mentorship to team members.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************.To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
Proven experience with Azure Data Services, especially Azure Fabric for data flows and transformation pipelines.
Strong proficiency in SQL, Python, and PySpark.
Experience with data warehousing, ETL/ELT, and data modeling.
Familiarity with CI/CD, DevOps, and microservices architecture.
Experience with relational (SQL Server, Oracle) and NoSQL databases.
Strong analytical and problem-solving skills.
Excellent communication skills, both written and verbal.
Experience integrating data from mainframes or other legacy systems.
Familiarity with mock data generation, data validation frameworks, and data quality tools.
Exposure to data visualization and BI tools for delivering insights
$108k-155k yearly est. 11d ago
Big Data Engineer
Jobsbridge
Data engineer job in Sacramento, CA
Primary Responsibilities
Develop web based user interfaces for analysis of operational and business data
Create algorithmic data models
Manage data sources and feeds for web based user interfaces
Ability to deliver on deadline driven projects
Work within a standards driven development approach
Collaborate effectively with team members
General Experience with the following:
Familiarity with Data Analytics / Data Science practices
Familiarity with Data Quality and Master Data Management practices
Effective communication skills, both written and verbal
Create technical design/specification documentation
Experience with formal source control tools/methods
Education and Experience
Prefer bachelor's degree or above in Computer Science or related field
5+ years of development experience
4+ years of web development with AJAX skills
4+ years of database development experience
2+ years of Big Data experience: Hadoop, HBase, Cassandra, MongoDB, etc.
Database administration experience
Skills and Knowledge
Development/Management experience with most of the following:
HTML, Javascript, XML
PERL/PHP
Adobe FLEX
Oracle/MS SQL experience
Informatica product suite
Java Web Services
Qualifications
Big Data, HTML, Javascript, XML, Oracle/MS SQL, Data Analytics, Data Science
Additional Information
All your information will be kept confidential according to EEO guidelines.
$108k-155k yearly est. 60d+ ago
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation 4.3
Data engineer job in Sacramento, CA
**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, Control Center. This position requires an expert level knowledge of these technologies. You'll provide third-level support for core hardware, software, data and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology and the ability to create change, serving as the facilitator for this transformation. They will have experience designing, developing, and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k but will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review, vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior & demonstrated team building & development skills to harness powerful teams
+ Ability to communicate effectively with different levels of leadership within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourages and maintains a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem-solving and critical-thinking skills with a proven record of identifying and diagnosing problems and solving complex problems with simple, logical solutions, including the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Prefer Travel, transportation, or hospitality experience
+ Prefer experience with designing application data models for mobile or web applications
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling which may include nights, weekends, and holidays
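The scripting-automation expectation above (Python is one of the examples named) often means small housekeeping tools around the file-transfer environment. A minimal hedged sketch, not Sterling-specific: flagging inbound files that have sat untouched too long, a common check for stuck transfers. The directory layout, file names, and threshold are all hypothetical.

```python
import os
import tempfile
import time

def stale_files(inbox, max_age_seconds):
    """Return names of files in `inbox` older than `max_age_seconds` —
    a typical MFT housekeeping check for transfers that never completed."""
    now = time.time()
    return [name for name in sorted(os.listdir(inbox))
            if now - os.path.getmtime(os.path.join(inbox, name)) > max_age_seconds]

# Simulate an inbox with one fresh file and one backdated "stuck" file.
inbox = tempfile.mkdtemp()
open(os.path.join(inbox, "fresh.dat"), "w").close()
stuck = os.path.join(inbox, "stuck.dat")
open(stuck, "w").close()
os.utime(stuck, (time.time() - 3600, time.time() - 3600))  # backdate one hour

print(stale_files(inbox, max_age_seconds=900))
```

In practice such a check would feed an alert in Control Center or a ticketing system rather than printing to stdout.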
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
$135k yearly 60d+ ago
PeopleSoft Data Conversion Lead
Telligentech
Data engineer job in Sacramento, CA
Role: PeopleSoft Data Conversion Lead
Duration: 12+ months
Role Description:
• Responsible for design, data mapping, and conversion approach for all legacy data feeds.
• Define the conversion approach, which includes the methods and industry standards.
• Manage relationship with client counterpart, and transfer knowledge accordingly.
• Work with client departments to prepare for conversion testing and deployment.
• Develop data process flows that align with business processes.
• Define the data requirements and structure for the application.
• Model and design the application data structure, storage and integration.
Required Experience:
• Two (2) years of experience serving in the capacity of Data Conversion Lead in at least one (1) project implementing PeopleSoft Financials software.
• One (1) year of experience serving in the capacity of Data Conversion Lead for at least one (1) public sector IT project for at least 7,500 users (user IDs).
Preferred Experience:
Strong California State Government experience
Skills:
• Data Conversion & Migration
• PeopleSoft Financials
• PeopleSoft-Public Sector
• Tech Delivery Architecture
• Technology Architecture Design
Additional Information
Thanks & Regards, Rahul | Telligen Tech, INC | Phone: ************/************ | Fax: ************ | Email: ********************** | *********************** |
$106k-150k yearly est. Easy Apply 60d+ ago
Data Architect
Paradigminfotech
Data engineer job in Sacramento, CA
Paradigm Infotech, Inc. is a global IT solutions provider focused on delivering customer value through high-quality processes and cost-efficient solutions. Paradigm has been one of the trendsetters in global delivery practices with our client-centric model for customer management and delivery.
JOB DETAILS:
Position: Data Architect
Location: Sacramento, CA
Duration: 3 months with possible chances of extension
JOB DESCRIPTION:
Required Skills:
• Minimum 7+ years of experience in data modeling, including conceptual, logical, physical, warehouse, and data mart models.
• Experience working with stakeholders to define user and data requirements for information exchange between systems.
• Extensive experience in architecture design.
• Experienced and efficient in HANA data modeling and view creation in conjunction with MSTR reporting; able to resolve reporting performance issues using HANA modeling and tuning techniques.
• Experience implementing data standards between systems
• Experience with common big data technologies and SAP HANA would be a plus
• Good communication and presentation skills
• A good team player
Thanks & Regards
ASHOK KUMAR
Sr. Lead / IT Recruiter
Paradigm Infotech
Call : ************
Additional Information
Mandatory Details:
Full Name as per SSN:
Total Experience:
US Experience:
$116k-164k yearly est. 60d+ ago
Slalom Flex (Project Based)- Microsoft Power Platform Data Architect
Slalom 4.6
Data engineer job in Sacramento, CA
Data Architect / Microsoft Power Platform Architect (6-Month Project)
Duration: 6 months (with potential extension)
Start: Immediate
Employment Type: W2
About Us
Slalom is a purpose-led, global business and technology consulting company. From strategy to implementation, our approach is fiercely human. In six+ countries and 43+ markets, we deeply understand our customers (and their customers) to deliver practical, end-to-end solutions that drive meaningful impact. Backed by close partnerships with over 400 leading technology providers, our 10,000+ strong team helps people and organizations dream bigger, move faster, and build better tomorrows for all. We're honored to be consistently recognized as a great place to work, including being one of Fortune's 100 Best Companies to Work For seven years running. Learn more at slalom.com.
Overview
We are seeking a Microsoft Power Platform Architect with strong Data Architecture expertise to support an in‑flight initiative focused on building a pricing application. This role will provide immediate capacity to an existing team of six platform developers, performing a full architectural review of the current solution and optimizing an established design spanning Power Platform (front end) and Databricks/Snowflake (backend).
The ideal candidate brings hands-on experience across Power Apps, Power Automate, and modern data engineering, and is comfortable stepping into a partially built solution to strengthen quality, stability, and performance.
Key Responsibilities
Architectural & Technical Leadership
* Perform a full end‑to‑end architectural review of the existing pricing application across Power Platform, Databricks, and Snowflake.
* Optimize the already-defined architecture, identifying gaps, constraints, and opportunities for improvement.
* Provide expert guidance on best practices for Power Platform, data modeling, integration design, and automation patterns.
Development & Optimization
* Enhance and optimize Power Apps, Power Automate flows, and integrations with Databricks and Snowflake.
* Implement improvements to ensure scalability, reliability, and performance of the pricing solution.
* Review and refine backend data pipelines, SQL logic, and data transformations as needed.
Quality Engineering & Delivery Support
* Build and execute end‑to‑end testing strategies, including functional, integration, and regression testing.
* Perform bug resolution and troubleshoot issues across both front-end and back-end layers.
* Collaborate with the existing team of six developers to ensure smooth delivery and alignment with solution architecture.
Cross‑Functional Collaboration
* Work closely with product owners, developers, and data engineers to validate technical decisions.
* Communicate architectural recommendations clearly to both technical and business stakeholders.
* Support Agile ceremonies and sprint planning as needed.
Required Skills & Experience
* Power Automate (expert level)
* Hands-on experience building and optimizing Power Apps
* Experience managing and troubleshooting Power Platform flows
* Strong understanding of Databricks and Snowflake data architecture
* Ability to conduct thorough architectural reviews and produce actionable recommendations
* Strong SQL skills for debugging, optimization, and backend analysis
* Experience working on mid‑to‑large scale applications within the Power Platform ecosystem
Preferred / Nice‑to‑Have Skills
* Experience building custom connectors or plug-ins in C#
* Agile delivery experience
* Power BI (report development or model optimization)
* Background in pricing, financial modeling, or workflow-driven applications
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for well-being-related expenses.
Slalom is committed to fair and equitable compensation practices. For this position, the base salary pay range is $80/hr to $100/hr. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements.
Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
How much does a data engineer earn in Rancho Cordova, CA?
The average data engineer in Rancho Cordova, CA earns between $92,000 and $181,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Rancho Cordova, CA
$129,000
What are the biggest employers of Data Engineers in Rancho Cordova, CA?
The biggest employers of Data Engineers in Rancho Cordova, CA are: