Role: Lead Data Architect (NoSQL/AWS)
We are seeking an experienced Data Architect with deep expertise in NoSQL database platforms, specifically MongoDB and/or Apache Cassandra, deployed and managed on AWS Cloud. This role will lead the design, architecture, and modernization of highly scalable, distributed data platforms supporting mission-critical, high-throughput applications.
Key Responsibilities
Lead the architecture, design, and implementation of NoSQL data platforms using MongoDB and/or Cassandra on AWS Cloud.
Design highly available, fault-tolerant, and horizontally scalable data architectures for large-scale, low-latency workloads.
Architect cloud-native data solutions leveraging AWS services such as EC2, EKS, S3, IAM, CloudWatch, and networking components.
Define data modeling strategies optimized for NoSQL systems, including schema design, partitioning, indexing, and replication (see the sketch after this list).
Establish performance tuning, capacity planning, backup, disaster recovery, and high-availability strategies for NoSQL databases.
Lead database modernization initiatives, including migration from relational or legacy platforms to NoSQL architectures.
Define and enforce data security, encryption, access control, and compliance standards across cloud-hosted data platforms.
Partner with application, DevOps, and security teams to ensure end-to-end architectural alignment.
Develop architecture standards, reference designs, and best practices for NoSQL and cloud-based data platforms.
Provide technical leadership, design reviews, and guidance to engineering teams.
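As an illustration of the schema design, partitioning, indexing, and replication concerns listed above, here is a minimal sketch using Python and pymongo against a hypothetical sharded MongoDB cluster; the database, collection, field names, and connection string are assumptions for illustration, not a prescribed design.

```python
from pymongo import MongoClient, ASCENDING, DESCENDING
from pymongo.write_concern import WriteConcern

# Connect to a MongoDB deployment (connection string is a placeholder;
# sharding commands require a mongos endpoint on a sharded cluster).
client = MongoClient("mongodb://localhost:27017")

# Enable sharding on a hypothetical application database, then shard the
# events collection on a hashed customer_id so writes spread evenly across shards.
client.admin.command("enableSharding", "appdb")
client.admin.command(
    "shardCollection", "appdb.events", key={"customer_id": "hashed"}
)

db = client["appdb"]

# Compound index supporting the dominant query pattern:
# "latest events for a given customer".
db.events.create_index([("customer_id", ASCENDING), ("event_time", DESCENDING)])

# Require acknowledgement from a majority of replica-set members so a
# confirmed write survives a primary failover.
durable_events = db.get_collection(
    "events", write_concern=WriteConcern(w="majority")
)
durable_events.insert_one({"customer_id": 42, "event_time": "2024-01-01T00:00:00Z"})
```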
Required Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8+ years of experience in data architecture, database engineering, or distributed data platform roles.
Strong hands-on experience with MongoDB and/or Apache Cassandra, including cluster design and operations.
Proven experience hosting and managing NoSQL databases on AWS Cloud.
Deep understanding of NoSQL data modeling, consistency models, sharding, replication, and CAP trade-offs.
Experience with AWS infrastructure services (EC2, VPC, IAM, S3, CloudWatch).
Strong knowledge of database performance tuning, scalability, and resiliency patterns.
Experience designing secure, compliant data platforms in regulated or enterprise environments.
Excellent communication and stakeholder collaboration skills.
Preferred Qualifications
Experience with MongoDB Atlas on AWS or Amazon Keyspaces (Cassandra-compatible).
Hands-on exposure to Kubernetes (EKS) and containerized database deployments.
Knowledge of event-driven architectures and streaming platforms (Kafka, Kinesis).
AWS certifications (Solutions Architect, Data Analytics, or Database Specialty).
Experience supporting public sector, government, or highly regulated environments.
$106k-144k yearly est. 1d ago
SCCM Endpoint Engineer (LARGELY REMOTE/NO C2C)
Amerit Consulting 4.0
Data engineer job in Stockton, CA
Our client, a Medical Center facility under the aegis of a California Public Ivy university and one of the largest health delivery systems in California, seeks an accomplished SCCM Endpoint Engineer.
________________________________________
NOTE: THIS IS A LARGELY REMOTE ROLE & ONLY W2 CANDIDATES/NO C2C/1099
*** Candidate must be authorized to work in USA without requiring sponsorship ***
Position: SCCM Endpoint Engineer (Job Id - # 3167240)
Location: Los Angeles CA 90024 (Hybrid-99% Remote/1% onsite)
Duration: 10 months + Strong Possibility of Extension
_________________________________________________________
The candidate will travel onsite to learn/view the client's setup and come onsite as needed for team building or vendor engagements. Onsite requirements are about 2-3 visits per year.
____________________________________________________
Required skills and experience:
Ability to monitor and report on statuses of endpoints utilizing SCCM/MECM & Intune.
Understanding of Networking and Active Directory.
Advanced knowledge of Microsoft Windows 10, Mac OS, Intune, Autopilot, SCCM/MECM, JAMF, and other endpoint management solutions
Advanced knowledge of ISS Microsoft Office products (O365, Office 2016, Outlook, Exchange and OWA).
Understanding of project plans, presentations, procedures, diagrams, and other technical documentation.
Understanding of Networking protocols and standards: DNS, DHCP, WINS and TCP/IP, etc.
Ability to work independently with minimal supervision as well as in a team environment.
Ability to follow escalation procedure within the TSD Team and under the ISS umbrella.
Establish standards and procedures for best practices, enabling commitments to established SLAs.
Ability to research and test new technologies and processes.
Demonstrate ability to develop creative solutions to complex problems.
Understanding of various Desktop Management Systems such as anti-virus software, patch management, full disk encryption, SSO/Tap-Badge (Imprivata) software and software delivery.
Ability to prioritize, organize, and execute work assignments.
Ability to communicate the status of various systems to management, leadership and/or support personnel.
Ability to skillfully react to a fluid and constantly changing work environment.
Ability to train, delegate and review the work of staff members.
Advanced knowledge of ticketing systems (ServiceNow).
Strong technical abilities with excellent communication and interpersonal skills.
Advanced knowledge of cloud computing (Azure, Intune, Autopilot, DaaS, Box, OneDrive).
Advanced knowledge of standard desktop imaging and upgrade procedures; SCCM/MECM/MDT, Intune, OSD, PXE, thin vs thick images.
Advanced knowledge of VPN remote software and RDP setup.
Advanced knowledge of Windows and Citrix based printing.
Understand ITIL overview and tier structure support using ticket tracking system.
Advanced knowledge of Apple OSX and iOS operating systems and platforms.
Advanced knowledge of virtualization technologies (Citrix XenApp, XenDesktop, VMWare, Azure Virtual Desktop, Windows 365, Amazon Workspaces).
Advanced knowledge of IT Security applications (Cisco AMP, Aruba OnGuard, DUO, FireEye, Windows Defender, Windows BitLocker, Checkpoint Encryption and USB allowlisting).
___________________________________________
Bhupesh Khurana
Lead Technical Recruiter
Email - *****************************
Company Overview:
Amerit Consulting is an extremely fast-growing staffing and consulting firm. Amerit Consulting was founded in 2002 to provide consulting, temporary staffing, direct hire, and payrolling services to Fortune 500 companies nationally, as well as small to mid-sized organizations on a local & regional level. Currently, Amerit has over 2,000 employees in 47 states. We develop and implement solutions that help our clients operate more efficiently, deliver greater customer satisfaction, and see a positive impact on their bottom line. We create value by bringing together the right people to achieve results. Our clients and employees say they choose to work with Amerit because of how we work with them - with service that exceeds their expectations and a personal commitment to their success. Our deep expertise in human capital management has fueled our expansion into direct hire placements, temporary staffing, contract placements, and additional staffing and consulting services that propel our clients' businesses forward.
Amerit Consulting provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Applicants with criminal histories are considered in a manner that is consistent with local, state, and federal laws.
$103k-153k yearly est. 2d ago
BIOPHARMACEUTICAL - C&Q ENGINEER
MMR Consulting
Data engineer job in Vacaville, CA
Previous Pharmaceutical/Biotech experience is mandatory for this role.
MMR Consulting is an engineering and consulting firm specializing in the pharmaceutical and biotechnology industries. Its services include Engineering, Project Management, and other Consulting services.
MMR Consulting has offices in Canada, USA, and Australia.
This is an outstanding opportunity to join our growing team, where the successful candidate will work with a group of engineers and specialists involved in project management, commissioning and qualification, of equipment, systems and facilities.
This is a Bioprocess C&Q Engineer role focused on the commissioning, qualification, and startup of upstream and downstream bioprocess systems/equipment in the biopharmaceutical industry, as well as process equipment in the pharma/biotech industries.
The work will require working out of the client's facilities in Vacaville, California.
Responsibilities
Provide technical guidance on the commissioning, qualification, and start-up of various equipment and facilities used in life science manufacturing, such as bioreactors, tanks, CIP, buffer and media prep, chromatography, TFF, washers & autoclaves, etc.
Lead the development of key qualification deliverables during the project lifecycle to ensure the project is well defined and the action plan to test the system is applicable and relevant.
Lead qualification processes throughout the project lifecycle such as VPP, Risk Assessments, RTM, DQ, FAT, SAT, IQ, OQ and PQ as appropriate to ensure timely completion and to ensure all quality and engineering specifications are met.
Prepare protocols, execute protocols, summarize data, resolve deviations, prepare final reports.
Experience with C&Q of process equipment, utilities, facilities is an asset. Thermal Validation experience is an asset.
Coordinate meetings with cross-functional departments to drive project progress, facilitate decisions, and provide updates.
Engage other departments, as required, for design reviews and decisions.
Travel may be occasionally required for meetings with clients, equipment fabrication vendors or Factory Acceptance Testing (FATs).
Work may require occasional support over shutdowns or extended hours, specifically during installation and commissioning / validation phases.
Client management (maintain key client relationships in support of business development and pursuit of new work), project scheduling/budgeting, coordination of client and MMR resources for effective project delivery, supporting business development (providing technical support to sales as required for proposals/opportunities), presenting at industry conferences/publishing papers, etc.
Visit construction and installation sites following all site safety requirements.
Other duties as assigned by client, and/or MMR, based on workload and project requirements.
Qualifications
3-6+ years of experience in commissioning, qualification, or validation of various systems within the pharmaceutical/biotech industry.
Engineering or Science degree, preferably in Mechanical, Electrical, Chemical, Biochemical, Electromechanical or a related discipline.
Excellent written and spoken English is required, including the preparation of technical documents in English.
Knowledge of requirements for cGMP operations, including SOPs, change controls, and validation.
Experience with developing and executing validation projects. Familiarity with risk-based Commissioning & Qualification approaches, such as ASTM E-2500 or ISPE ICQ, is considered an asset but not required.
Experience with commissioning and qualification of biotech process equipment (upstream, downstream, or both) is required, covering some, but not necessarily all, of the following: fermentation, bioreactors, and downstream purification processes (chromatography, TFF, UF).
Experience with commissioning & qualification of process control systems (e.g., PCS, SCADA, historians) and building automation systems (e.g., Siemens Insight/Desigo, JCI Metasys) is considered an asset.
Experience with Qualification or Validation of clean utilities, ISO clean rooms, and Thermal Validation is considered an asset.
Experience with preparation and execution of URSs, DQs, RTMs, Risk Assessments, CPPs, VPPs, FATs, SATs, IOQs, NCRs, and Final Reports.
Ability to lift 50 lbs.
Ability to handle multiple projects and work in a fast-paced environment.
Strong multi-tasking skills.
Salary range: $80,000-$120,000, based on experience.
Equal Employment Opportunity and Reasonable Accommodations
MMR Consulting is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our hiring decisions are based on merit, qualifications, and business needs. We are committed to working with and providing reasonable accommodations to individuals with disabilities globally. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the application or interview process, please let us know the nature of your request.
$95k-138k yearly est. 1d ago
Senior Frontend Developer
Capitol Tech Solutions 3.6
Data engineer job in Sacramento, CA
Senior Front-End Developer
About Us:
Capitol Tech Solutions (CTS) is a leading digital transformation company specializing in software development, website design, and data-driven solutions. We partner with both public and private sector clients to deliver innovative, accessible, and user-centered digital experiences. At CTS, we provide top-tier technology solutions tailored to meet the diverse needs of our clients. We foster a collaborative and innovative work environment where team members are encouraged to grow, contribute, and achieve their full potential.
Primary Responsibilities:
We are seeking a highly skilled and creative Senior Front-End Developer to join our dynamic team. In this role, you will be responsible for designing and implementing user-friendly interfaces for web applications. You will leverage your expertise in front-end technologies to create responsive, accessible, and visually appealing user experiences. This includes developing interactive elements, such as navigation menus, buttons, and layouts, optimized for both desktop and mobile platforms.
You will collaborate closely with the Director of Software Development, designers, front-end and back-end developers, project managers, and business analysts to ensure that all projects meet client requirements and are delivered on time and in accordance with specifications.
Lead the design and development of intuitive user interfaces using HTML, CSS, JavaScript, front-end frameworks such as Svelte, and .NET technologies such as C#.
Participate in Agile development processes, including sprint planning, reviews, and retrospectives.
Communicate directly with clients to gather requirements, provide updates, and give technical guidance.
Create and translate wireframes, storyboards, user flows, and designs into high-quality code.
Design and implement universal UI solutions that focus on performance, scalability, and accessibility, guaranteeing smooth user experiences across all platforms.
Conduct user research and qualitative analysis to guide design choices and enhance usability.
Conduct comprehensive testing to verify that interfaces meet design and functionality requirements.
Troubleshoot and resolve UI-related issues and bugs.
Document the technical aspects of the project for future reference and debugging.
Lead and mentor junior developers and contribute to code reviews and best practices.
Keep up to date with UI/UX trends, platform updates, and security practices, then incorporate them into your development workflows.
Qualifications:
Bachelor's degree in computer science, software engineering, or a related field.
Proven experience in UI development with a strong project portfolio.
Proficient in front-end technologies (HTML, CSS, JavaScript), C#, and the .NET framework.
7+ years of experience as a programmer/analyst in a .NET environment.
7+ years of experience using front-end frameworks/libraries, with a preference for Svelte, React, Vue, and/or Angular.
5+ years of experience in digital design, user research, qualitative analysis, and interaction design.
4+ years of experience developing web application UI/UX compliant with WCAG 2.0 standards.
4+ years of experience working in an Agile team environment.
Familiarity with front-end frameworks (e.g., React, Angular, or Vue.js).
Experience with design tools such as Figma, Adobe Cloud, or Sketch.
Strong communication and collaboration skills.
Salary & Benefits:
Hourly: $48.00-$52.08
Full-time employment includes flexible personal time off, nine paid holidays per year, a 401(k) plan with employer matching, and comprehensive health insurance packages covering medical, dental, and vision care.
$48-52.1 hourly 5d ago
Data Engineer
Recology 4.5
Data engineer job in Sacramento, CA
Role
Under limited general direction, develops and maintains the data lake repository, data warehouse, and data mapping infrastructure, with the ultimate goal of making data accessible so that the organization can use it to evaluate and optimize performance. Requires demonstrated expertise designing, developing, and maintaining scalable data pipelines and ETL processes to support enterprise data needs.
This is a hybrid position, with three days per week in-office and the rest remote.
Essential Responsibilities
* Creates and maintains optimal data pipeline architecture.
* Assembles large, complex data sets that meet functional / non-functional business requirements.
* Performs unit testing and mock data generation (a minimal sketch follows this list).
* Uses relational data best practices, including referential integrity and query optimization (Oracle, MySQL, SQL Server).
* Participates in agile software planning and development activities, including daily standups, user story and task organization and grooming activities, and effort estimation.
* Analyzes business and technical requirements to develop documentation, designs, code, and tests.
* Maintains source control hygiene (branch protection, git-flow).
* Implements CI/CD pipelines covering build, test, and automated deployment (with tools like Jenkins, Travis, or DevOps platforms).
* Other duties as assigned.
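To make the pipeline, unit testing, and mock data generation bullets above concrete, here is a minimal pandas sketch of a cleansing step with a self-contained test; the table layout, column names, and rules are assumptions for illustration only, not Recology's actual schema.

```python
import pandas as pd

def clean_tonnage(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize a raw collections extract: types, deduplication, and basic validation."""
    cleaned = (
        raw.dropna(subset=["route_id", "pickup_date"])
           .assign(
               pickup_date=lambda df: pd.to_datetime(df["pickup_date"]),
               tons=lambda df: df["tons"].clip(lower=0),
           )
           .drop_duplicates(subset=["route_id", "pickup_date"])
    )
    return cleaned.reset_index(drop=True)

def test_clean_tonnage_removes_duplicates_and_bad_rows():
    # Mock data covering a duplicate row, a negative weight, and a missing key.
    mock = pd.DataFrame({
        "route_id":    ["R1", "R1", "R2", None],
        "pickup_date": ["2024-05-01", "2024-05-01", "2024-05-01", "2024-05-02"],
        "tons":        [12.5, 12.5, -3.0, 8.0],
    })
    out = clean_tonnage(mock)
    assert len(out) == 2                    # duplicate and null-key rows dropped
    assert (out["tons"] >= 0).all()         # negative weights clipped to zero

test_clean_tonnage_removes_duplicates_and_bad_rows()
```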
Qualifications
* 5+ years of experience in a data engineer role.
* Significant experience working with relational database technologies such as Oracle & SQL Server.
* Advanced working knowledge of SQL and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
* Outstanding analytical, quantitative, problem-solving, and programming skills.
* Demonstrated expertise designing, developing and maintaining scalable data pipelines and ETL processes to support enterprise data needs.
* Full-stack development experience.
* Willingness to jump in, learn new tech, and work effectively across full stack.
* Experience reviewing query execution plans and performing performance tuning.
* Experience working with common languages and frameworks such as C# or Java.
* In-depth understanding of modern application design principles, DevOps & Microservices.
* Relevant certifications from cloud providers (e.g., AWS Certified Data Analytics, Azure Data Engineering).
* Strong written and verbal communication skills, including ability to convey proposed solutions through natural language, diagrams, and design patterns.
* Ability to work both independently and collaboratively.
* High school diploma or GED required.
* Bachelor's degree preferred.
Recology Offers
* An ecologically innovative company that finds and mentors people committed to protecting the environment and sustaining our communities.
* The largest employee-owned resource recovery company in the industry with terrific benefits to help you prosper.
* A creative and caring culture that values community, diversity, altruism, accountability, collaboration, and learning by doing.
* An inspired company mission driven to use and return resources to their best and highest use through the practice of the 4R's: Reduce, Re-use, Recycle, and Recologize.
* Distinct professional challenges to connect with, care for, and grow community that sees a world without waste.
Recology Benefits May Include
* Paid time off and paid holidays.
* Health and wellness benefits including medical, dental, and vision.
* Retirement plans (Employee Stock Ownership Plan, 401(k) with match).
* Annual wellness incentives.
* Employee Assistance Program (EAP).
* Educational assistance.
* Commuting benefits.
* Employee referral program.
Supplemental Information
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of this job; and pursuant to applicable law, we will consider for employment qualified applicants with criminal records. It is important that you provide accurate information on the job application, inaccurate information may cause delays in the processing of your application and/or may disqualify you as a candidate.
Recology is an equal opportunity employer committed to supporting an inclusive work environment where employees are valued, heard, and provided development opportunities. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, citizenship, disability, protected veteran status, or any other basis that is prohibited by law.
This description is not intended and should not be construed to be an exhaustive list of all responsibilities, skills, effort, work conditions, and benefits associated with the job.
Apple is a place where extraordinary people gather to do their best work. Just be ready to dream big! The people here at Apple don't just build products - they build the kind of wonder that's revolutionized entire industries. It's the diversity of those people and their ideas that encourages the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here! At Apple, creative ideas have a way of becoming wonderful products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish.
Essential Functions
Resolve technical and non-technical issues through tickets, from understanding user problems to giving solutions and documenting root cause analysis. Effectively communicate the progress of each ticket to both technical and non-technical customers. Identify and guide teams to understand users' issues and help teams come up with solutions to address the concern. Collaborate with cross-functional teams to fix issues like bugs and data quality problems. Ensure customer success through timely issue resolution, providing positive customer impact and experience. In this role you will be asked to lead through ambiguous tickets and issues on a big data system, working with cross-functional teams on Apple's manufacturing data platform. The expectation is that you will be able to independently drive ownership of issues across all leadership levels of the business and provide guidance and direction when you may not have all the facts or data points. Teams across the organization rely on the solutions we provide to make critical decisions, so system availability is paramount to the success of many groups throughout the company; the role requires the ability to work weekends and holidays as needed.
Customer-oriented mindset with a focus on delivering quality support and ensuring high customer satisfaction.
Interact with end users to understand, clarify, and document issues, as well as to provide updates on issue resolution progress.
Excellent communication skills to convey technical information to both technical and non-technical stakeholders.
Conducts research to understand how the organization functions and where it can improve; analyzes this information to form a hypothesis of organizational weaknesses and how to fix them.
Prepares reports with Excel or similar tools.
L2/L3 support on large enterprise applications with advanced troubleshooting and root cause analysis.
Stays current with the latest trends in manufacturing and operations across industries and applies them to existing business models.
Travels to different job locations as required to maximize business knowledge and user feedback.
Analyzes and interprets data to support the business in decision-making, ensuring data accuracy and integrity.
Performs requirements gathering for overall improvement of the platform.
Creates documentation for SOPs and process flows.
Collaborates with the testing team to ensure that testing efforts align with business requirements; participates in user acceptance testing (UAT) and facilitates user feedback.
Works with members of their own team to offer different ideas.
Clarifies strategic and operational problems, successes, and failures with management.
Implements strategies for gathering, reviewing, and analyzing data to identify trends and patterns.
Python scripting experience. Working knowledge of the Linux operating system and Unix/shell scripting. Experience with Tableau (or other similar reporting tools) for data visualization and reporting.
5+ years of experience working in L2/L3 support on large enterprise applications with advanced troubleshooting and root cause analysis across multiple technologies. Exposure to cloud technologies. Programming experience writing complex SQL to extract data from large databases (a toy sketch follows).
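As a hedged illustration of the Python and SQL skills listed above, the toy sketch below runs an aggregate query against an in-memory SQLite database standing in for a much larger production store; the table and column names are invented and this is not Apple's tooling.

```python
import sqlite3

# Toy in-memory database standing in for a much larger production store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tickets (id INTEGER PRIMARY KEY, status TEXT, component TEXT, opened_at TEXT);
    INSERT INTO tickets VALUES
        (1, 'open',     'ingest',    '2024-06-01'),
        (2, 'resolved', 'ingest',    '2024-06-02'),
        (3, 'open',     'reporting', '2024-06-03');
""")

# Aggregate open issues by component, the kind of query behind a daily triage report.
query = """
    SELECT component, COUNT(*) AS open_tickets
    FROM tickets
    WHERE status = 'open'
    GROUP BY component
    ORDER BY open_tickets DESC;
"""
for component, open_tickets in conn.execute(query):
    print(f"{component}: {open_tickets} open")
conn.close()
```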
$153k-197k yearly est. 60d+ ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Data engineer job in Sacramento, CA
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies (a toy example follows this list)
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
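As a hedged illustration of the experimentation work in item 2 above, the snippet below runs a two-proportion z-test on invented A/B conversion counts using statsmodels; the numbers and variant names are assumptions and this does not represent Meta's internal tooling.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest, proportion_confint

# Hypothetical experiment: conversions and exposures per variant.
conversions = np.array([1_150, 1_290])   # control, treatment
exposures   = np.array([50_000, 50_000])

# Two-sided z-test for a difference in conversion rate.
z_stat, p_value = proportions_ztest(conversions, exposures)

# 95% confidence intervals per variant for reporting.
ci_low, ci_high = proportion_confint(conversions, exposures, alpha=0.05, method="wilson")

lift = conversions[1] / exposures[1] - conversions[0] / exposures[0]
print(f"absolute lift: {lift:.4%}, z = {z_stat:.2f}, p = {p_value:.4f}")
for name, lo, hi in zip(["control", "treatment"], ci_low, ci_high):
    print(f"{name}: 95% CI [{lo:.4%}, {hi:.4%}]")
```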
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Master's or Ph.D. degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
$210k-281k yearly 60d+ ago
Data Scientist II
The Gap 4.4
Data engineer job in Folsom, CA
About the Role
The Forecasting Team at Gap Inc. applies data analysis and machine learning techniques to drive business benefits for Gap Inc. and its brands. The team's focus is to shape the company's inventory management strategy through advanced data science and forecasting techniques. The successful candidate will lead the development of advanced forecasting models across various business functions, time horizons, and product hierarchies.
Areas of expertise include forecasting, time series, predictive modeling, supply chain analytics, and inventory management. You will support the team to build and deploy data and predictive analytics capabilities, in partnership with GapTech, PDM, Central Marketing & business partners across our brands.
What You'll Do
Build, validate, and maintain AI (machine learning / deep learning) models; diagnose and optimize model performance; and develop statistical models and analyses for ad hoc, business-focused analysis (a minimal forecasting sketch follows this list).
Develop software programs, algorithms and automated processes that cleanse, integrate, and evaluate large data sets from multiple disparate sources.
Manipulate large amounts of data across a diverse set of subject areas, collaborating with other data scientists and data engineers to prepare data pipelines for various modeling protocols.
Deliver sound, data-backed recommendations tied to business results, industry insights, and overall Gap Inc. ecosystem of technology, platform, and resources.
Communicate compelling, data-driven recommendations as well as potential trade-offs, backed by data analysis and/or model outputs to influence leaders' and stakeholders' decisions.
Build networks across the organization and partners to anticipate leader requests and influence data-driven decision making.
Guide discussions and empower more junior team members to identify the best solutions.
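As a rough illustration of the forecasting workflow described above, the sketch below compares a seasonal-naive baseline with a Holt-Winters model on a synthetic weekly demand series using statsmodels; the data, seasonality, and holdout length are assumptions for illustration, not Gap Inc.'s models.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic weekly units-sold series: three years, mild trend plus yearly seasonality.
rng = np.random.default_rng(0)
t = np.arange(156)
weeks = pd.date_range("2022-01-02", periods=156, freq="W")
demand = 200 + 0.5 * t + 30 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 10, 156)
series = pd.Series(demand, index=weeks)

train, test = series[:-13], series[-13:]        # hold out the last quarter

# Baseline: seasonal naive (repeat the value from 52 weeks earlier).
seasonal_naive = series.shift(52)[-13:]

# Holt-Winters additive model as a simple learned alternative.
model = ExponentialSmoothing(
    train, trend="add", seasonal="add", seasonal_periods=52
).fit()
forecast = model.forecast(13)

def mape(actual, predicted):
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)

print(f"Seasonal naive MAPE: {mape(test, seasonal_naive):.1f}%")
print(f"Holt-Winters MAPE:   {mape(test, forecast):.1f}%")
```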
Who You Are
Experience in developing advanced algorithms using machine learning (ML), statistical, and optimization methods to enhance various business components in the retail sector.
Hands-on experience with forecasting models, running simulations of what-if analysis, and prescriptive analytics.
Experience with time series analysis, predictive modeling, hierarchical Bayesian, causal ML, and transformer-based algorithms.
Experience with creating business impact in supply chain, merchandise, inventory planning, or vendor management using advanced forecasting techniques.
Experience working directly with cross-functional teams such as product management, engineers, business partners.
Advanced proficiency in modern analytics tools and languages such as Python, R, Spark, SQL.
Advanced proficiency using SQL for efficient manipulation of large datasets in on-prem and cloud distributed computing environments, such as Azure.
Ability to work both at a detailed level as well as to summarize findings and extrapolate knowledge to make strong recommendations for change.
Ability to collaborate with cross functional teams and influence product and analytics roadmap, with a demonstrated proficiency in relationship building.
Ability to assess relatively complex situations and analyze data to make judgments and recommend solutions.
Required
BS with 7+ years of experience (or MS with 5+ years) in Data Science, Computer Science, Machine Learning, Applied Mathematics, or equivalent quantitative field.
People mentoring experience, ability to work independently on large scale projects.
Proven ability to lead teams in solving unstructured technical problems to achieve business impact.
Full-stack experience across analytics, data science, machine learning, and data engineering.
$123k-171k yearly est. 45d ago
Data Scientist, Privacy
Datavant
Data engineer job in Sacramento, CA
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub you will play a crucial role in ensuring that privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high-quality reports that summarize your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk (a toy example follows this list)
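To make the re-identification risk analysis concrete, here is a minimal, hypothetical k-anonymity style check over quasi-identifiers using pandas; the column names, threshold, and toy records are assumptions and do not reflect Datavant's actual methodology.

```python
import pandas as pd

def k_anonymity_report(df: pd.DataFrame, quasi_identifiers: list[str], k: int = 5) -> pd.DataFrame:
    """Count records per quasi-identifier combination and flag groups smaller than k."""
    group_sizes = (
        df.groupby(quasi_identifiers, dropna=False)
          .size()
          .reset_index(name="group_size")
    )
    group_sizes["at_risk"] = group_sizes["group_size"] < k
    return group_sizes

# Hypothetical toy dataset; real analyses run on much larger health datasets.
records = pd.DataFrame({
    "zip3":       ["941", "941", "956", "956", "956"],
    "birth_year": [1980, 1980, 1975, 1975, 1990],
    "sex":        ["F", "F", "M", "M", "F"],
})

report = k_anonymity_report(records, ["zip3", "birth_year", "sex"], k=2)
print(report)  # groups with group_size < 2 are unique and easiest to re-identify
```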
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ Seeks to understand real-world data in context rather than consider it in abstraction.
+ Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in that language
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
\#LI-BC1
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
$104k-130k yearly 14d ago
Educator Preparation Data Scientist
CSU 3.8
Data engineer job in Sacramento, CA
Chancellor's Office Statement
Join our team at the California State University, Office of the Chancellor, and make a difference in providing access to higher education. We are currently seeking experienced candidates for the position of Educator Preparation Data Scientist. The CSU Chancellor's Office, located on the waterfront adjacent to the Aquarium of the Pacific in downtown Long Beach, is the headquarters for the nation's largest and most diverse system of higher education. The CSU Chancellor's Office offers a premium benefit package that includes outstanding vacation, health, and dental plans; a fee waiver education program; membership in the California Public Employees Retirement System (PERS); and 15 paid holidays a year.
Salary
The anticipated salary hiring range is up to $104,136 per year, commensurate with qualifications and experience.
Classification
Administrator II
Position Information
The California State University, Office of the Chancellor, is seeking an Educator Preparation Data Scientist to be responsible for managing, developing, and maintaining the Center's educator preparation data systems and dashboards. The Data Scientist performs strategic data analyses to inform continuous improvement efforts in CSU's educator preparation programs and supports campus data needs in ways that enhance the system's ability to recruit, prepare, develop, and retain outstanding teachers for California schools.
The data initiatives overseen in this role directly advance the Chancellor's Office strategic plan, "CSU Forward: Thriving Students, Thriving University, Thriving California," by providing actionable evidence about CSU educator preparation programs. These initiatives support:
-Public transparency and informed strategic planning and policy development;
-Campus accreditation and compliance with state and federal requirements;
-Evaluation of state, federal, and privately funded grants and initiatives;
-Systemwide accountability, including the establishment of improvement goals and priorities;
-Effective communication of the impact and value of university-based educator preparation programs; and
-Statewide policy discussions related to educator preparation reform.
THIS POSITION IS LOCATED AT THE CSU SACRAMENTO CAMPUS.
This position is approved for telecommuting (two days telecommuting, three days in office (in-person)) with onsite work at the main headquarters located in Sacramento, California.
Responsibilities
Under the general direction of the Director, Educator Quality Center, the Educator Preparation Data Scientist will perform duties as outlined below:
Data Systems Project Management
-Design, develop, and manage systems and processes for collecting, extracting, loading, and integrating high-priority credential program and student data into the EdQ operational data store.
-Lead the development of dashboards and reporting tools that support CSU educator preparation programs.
-Collaborate with a wide range of stakeholders (including campus and Chancellor's Office staff, internal IT and IR&A teams, and external partners such as state and national education agencies, funding organizations, consultants, and vendors) to ensure data systems meet strategic and operational needs.
Strategic Data Analysis
-Conduct strategic data analyses to generate valid and reliable evidence that supports continuous improvement in CSU educator preparation programs and addresses California's educator workforce needs.
-Translate analytical findings into actionable insights and communicate them clearly through data visualizations, storytelling, and presentations tailored to non-technical audiences.
-Evaluate the impact of key initiatives using quantitative evidence and support data-informed decision-making across the CSU system.
Support Effective Use of Data
-Develop and refine systems and processes that improve the quality, consistency, and usability of existing educator preparation metrics administered by EdQ.
-Define and monitor key performance indicators (KPIs) in collaboration with educator preparation faculty and practitioners to reflect their goals and values (a toy calculation follows this list).
-Promote the standardization and adoption of shared data tools, software platforms, and protocols across CSU educator preparation programs.
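As a toy illustration of KPI monitoring on educator preparation data, the sketch below computes a completion-rate metric by campus with pandas; the dataset, columns, and metric definition are assumptions for illustration, not the EdQ Center's actual data model.

```python
import pandas as pd

# Hypothetical program-completion extract; real EdQ data structures will differ.
candidates = pd.DataFrame({
    "campus":      ["Sacramento", "Sacramento", "Fresno", "Fresno", "Chico"],
    "cohort_year": [2023, 2023, 2023, 2023, 2023],
    "completed":   [True, False, True, True, True],
})

# KPI: completion rate by campus and cohort year.
completion_rate = (
    candidates.groupby(["campus", "cohort_year"])["completed"]
              .mean()
              .rename("completion_rate")
              .reset_index()
)
print(completion_rate)
```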
Qualifications
This position requires:
-Demonstrated interest in improving educational outcomes, particularly in educator preparation.
-Master's degree or higher in a technical, computational, or quantitative field (e.g., Data Science, Statistics, Computer Science, Economics, Educational Measurement, or related discipline).
-A minimum of four years of professional experience, including at least one year of hands-on experience in data science, analytics, or applied research.
-Proven ability to deliver data products, tools, or original research, preferably in an educational or public-sector context, using large or complex datasets.
-Strong quantitative skills, including experience with statistical methods used in education research.
-Proficiency with one or more statistical software tools or programming languages (e.g., R, Python, Stata).
-Working knowledge of SQL, relational databases, and data visualization tools (e.g., Tableau, Power BI).
-Experience with data mining, exploration, and visualization techniques.
-Familiarity with data systems design, development, and management.
-Strong project management skills and ability to work collaboratively with cross-functional teams.
-Experience administering surveys using platforms such as Qualtrics, SurveyMonkey, or QuestionPro.
-Experience working with sensitive or confidential data in a secure computing environment.
Preferred Qualifications
-Experience with ETL (Extract, Transform, Load) processes and data warehousing solutions.
-Proficiency with version control systems such as Git.
-Familiarity with data governance practices, especially in educational or public-sector environments.
-Experience using AI or machine learning tools (e.g., OpenAI) to enhance data analysis, automation, or reporting workflows.
-Knowledge of educator preparation policy, accreditation processes, or workforce analytics.
Application Period
Priority consideration will be given to candidates who apply by December 2, 2025. Applications will be accepted until the job posting is removed.
How To Apply
Please click "Apply Now" to complete the California State University, Chancellor's Office online employment application.
Equal Employment Opportunity
Consistent with California law and federal civil rights laws, the CSU provides equal opportunity in education and employment without unlawful discrimination or preferential treatment based on race, sex, color, ethnicity, or national origin. Reasonable accommodations will be provided for qualified applicants with disabilities who self-disclose by contacting the Senior Human Resources Manager at **************.
Title IX
Please view the Notice of Non-Discrimination on the Basis of Gender or Sex and Contact Information for Title IX Coordinator at: *********************************
E-Verify
This position requires new hire employment verification to be processed through the E-Verify program administered by the Department of Homeland Security, U.S. Citizenship and Immigration Services (DHS USCIS), in partnership with the Social Security Administration (SSA).
If hired, you will be required to furnish proof that you are legally authorized to work in the United States. The CSU Chancellor's Office is not a sponsoring agency for staff and Management positions (i.e., H1-B VISAS).
COVID19 Vaccination Policy
Per the CSU COVID-19 Vaccination Policy, it is strongly recommended that all Chancellor's Office employees who are accessing office and campus facilities follow COVID-19 vaccine recommendations adopted by the U.S. Centers for Disease Control and Prevention (CDC) and the California Department of Public Health (CDPH) applicable to their age, medical condition, and other relevant indications.
Mandated Reporter Per CANRA
The person holding this position is considered a 'mandated reporter' under the California Child Abuse and Neglect Reporting Act and is required to comply with the requirements set forth in CSU Executive Order 1083 as a condition of employment.
CSU Out of State Employment Policy
California State University, Office of the Chancellor, as part of the CSU system, is a State of California Employer. As such, the University requires all employees upon date of hire to reside in the State of California. As of January 1, 2022, the CSU Out-of-State Employment Policy prohibits the hiring of employees to perform CSU-related work outside the state of California.
Background
The Chancellor's Office policy requires that the selected candidate successfully complete a full background check (including a criminal records check) prior to assuming this position.
$104.1k yearly 60d+ ago
Senior Data Engineer, Product Analytics
Datarobot 4.2
Data engineer job in Sacramento, CA
DataRobot delivers AI that maximizes impact and minimizes business risk. Our platform and applications integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business - today and in the future.
**About DataRobot**
DataRobot delivers the industry-leading agentic AI applications and platform that maximize impact and minimize risk for your business. DataRobot's enterprise AI platform democratizes data science with end-to-end automation for building, deploying, and managing machine learning models. This platform maximizes business value by delivering AI at scale and continuously optimizing performance over time. The company's proven combination of cutting-edge software and world-class AI implementation, training, and support services empowers any organization, regardless of size, industry, or resources, to drive better business outcomes with AI.
**About the role:**
As a technical driver and hands-on expert, the Senior Data Engineer will shape our end-to-end data strategy and guide the team's technical execution. This role is responsible for building scalable Data Warehouse and Lakehouse solutions on Snowflake, championing the ELT paradigm, and ensuring robust data governance and cost optimization. We are looking for a seasoned engineer who combines deep technical mastery with a passion for mentoring others to build and influence high-impact, data-driven solutions.
**Key Responsibilities:**
+ Architect and deliver scalable, reliable data warehouses, analytics platforms, and integration solutions; this is a critical role in supporting our internal AI strategy.
+ Partner with Product Manager, Analytics to shape our project roadmap and lead its implementation. Collaborate with and mentor cross-functional teams to design and execute sophisticated data software solutions that elevate business performance and align to coding standards and architecture.
+ Develop, deploy, and support analytic data products, such as data marts, ETL jobs (extract/transform/load), and functions (in Python/SQL/DBT) in a cloud data warehouse environment using Snowflake, Stitch/Fivetran/Airflow, and AWS services (e.g., EC2, Lambda, Kinesis); a minimal sketch follows this list.
+ Navigate various data sources and efficiently locate data in a complex data ecosystem.
+ Work closely with data analysts and data scientists to build models and metrics that support their analytics needs, including data modeling enhancements driven by upstream data changes.
+ Instrument telemetry capture and data pipelines for various environments to provide product usage visibility.
+ Maintain and support deployed ETL pipelines and ensure data quality.
+ Develop monitoring and alerting systems to provide visibility into the health of data infrastructure, cloud applications, and data pipelines.
+ Partner with the IT enterprise applications and engineering teams on integration efforts between systems that impact data & analytics.
+ Work with R&D to answer complex technical questions about product analytics and corresponding data structure.
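As a minimal sketch of the ELT pattern described in the responsibilities above (load raw data into Snowflake on a schedule, transform later in SQL/DBT), the example below defines a single-task Airflow DAG that issues a Snowflake COPY INTO; the DAG id, stage, table, and credentials are placeholders, and this is not DataRobot's actual pipeline.

```python
from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator

def load_events_to_snowflake() -> None:
    """Copy staged product-usage events into a raw table (ELT: transform later in DBT/SQL)."""
    conn = snowflake.connector.connect(
        account="my_account",      # placeholders; real values would come from a secrets backend
        user="loader",
        password="***",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        conn.cursor().execute(
            "COPY INTO RAW.PRODUCT_EVENTS FROM @RAW.EVENTS_STAGE FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        conn.close()

with DAG(
    dag_id="product_usage_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="load_events", python_callable=load_events_to_snowflake)
```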
**Knowledge, Skills, and Abilities:**
+ 5-7 years of experience in a data engineering or data analyst role.
+ Experience building and maintaining product analytics pipelines, including the implementation of event tracking (e.g., Snowplow) and the integration of behavioral data into Snowflake from platforms like Amplitude.
+ Strong understanding of data warehousing concepts, working experience with relational databases (Snowflake, Redshift, Postgres, etc.), and SQL.
+ Experience working with cloud providers like AWS, Azure, GCP, etc.
+ Solid programming foundations and proficiency in data-related languages like Python, Scala, and R.
+ Experience with DevOps workflows and tools like DBT, GitHub, Airflow, etc.
+ Experience with an infrastructure-as-code tool such as Terraform or CloudFormation.
+ Excellent communication skills and the ability to effectively communicate with both technical and non-technical audiences.
+ Knowledge of real-time stream technologies like AWS Firehose, Spark, etc.
+ Highly collaborative in working with teammates and stakeholders.
**Nice to Have:**
+ AWS cloud certification is a plus
+ BA/BS preferred in a technical or engineering field
**Compensation Statement**
The U.S. annual base salary range for this full-time position is between $180,000 and $210,000 USD/year. Actual offers may be higher or lower than this range based on various factors, including (but not limited to) the candidate's work location, job-related skills, experience, and education.
The talent and dedication of our employees are at the core of DataRobot's journey to be an iconic company. We strive to attract and retain the best talent by providing competitive pay and benefits with our employees' well-being at the core. Here's what your benefits package may include depending on your location and local legal requirements: Medical, Dental & Vision Insurance, Flexible Time Off Program, Paid Holidays, Paid Parental Leave, Global Employee Assistance Program (EAP) and more!
**DataRobot Operating Principles:**
+ Wow Our Customers
+ Set High Standards
+ Be Better Than Yesterday
+ Be Rigorous
+ Assume Positive Intent
+ Have the Tough Conversations
+ Be Better Together
+ Debate, Decide, Commit
+ Deliver Results
+ Overcommunicate
Research shows that many women only apply to jobs when they meet 100% of the qualifications while many men apply to jobs when they meet 60%. **At DataRobot we encourage ALL candidates, especially women, people of color, LGBTQ+ identifying people, differently abled, and other people from marginalized groups to apply to our jobs, even if you do not check every box.** We'd love to have a conversation with you and see if you might be a great fit.
DataRobot is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. DataRobot is committed to working with and providing reasonable accommodations to applicants with physical and mental disabilities. Please see the United States Department of Labor's EEO poster and EEO poster supplement for additional information.
All applicant data submitted is handled in accordance with our Applicant Privacy Policy (*************************************************** .
DataRobot delivers AI that maximizes impact and minimizes business risk. Our AI applications and platform integrate into core business processes so teams can develop, deliver, and govern AI at scale. DataRobot empowers practitioners to deliver predictive and generative AI, and enables leaders to secure their AI assets. Organizations worldwide rely on DataRobot for AI that makes sense for their business - today and in the future. For more information, visit our website (************************* and connect with us on LinkedIn (******************************************** .
**_DataRobot has become aware of scams involving false offers of DataRobot employment. The scams and false offers use imposter websites, email addresses, text messages, and other fraudulent means. None of these offers are legitimate, and DataRobot's recruiting process never involves conducting interviews via instant messages, nor requires candidates to purchase products or services, or to process payments on our behalf. Please note that DataRobot does not ask for money in its recruitment process. DataRobot is committed to providing a safe and secure environment for all job applicants. We encourage all job seekers to be vigilant and protect themselves against recruitment scams by verifying the legitimacy of any job offer before providing personal information or paying any fees. Communication from our company will be sent from a verified email address using the @datarobot.com email domain. If you receive any suspicious emails or messages claiming to be from DataRobot, please do not respond._**
**_Thank you for your interest in DataRobot, and we look forward to receiving your application through our official channels._**
$180k-210k yearly 11d ago
Data Engineering
360-Tsg
Data engineer job in Rosemont, CA
We are once again working with an exciting organization; they are looking for a Data Engineer to join their team. Your responsibilities will involve maintaining and enhancing the Registry's data acquisition, integration, and ETL pipelines in support of both operational and business intelligence data stores. The incumbent is responsible for applying diverse data cleansing and transformation techniques as well as the ongoing management and monitoring of all Registry databases. This includes addressing issues pertaining to the ongoing operations and optimization of the data environment, including performance, reliability, logging, scalability, etc. This position will also provide support for the company's database systems, warehouse, marts, and supporting applications.
You will also lead the effort to develop a unified enterprise data model for the organization. You will design, develop, and maintain high-performance data platforms on premises and in Microsoft Azure cloud-based environments, including leading the development of a data warehouse environment to support the Registry's business intelligence roadmap. You will champion efforts to ensure that the organization's business intelligence applications remain relevant for internal business groups by actively participating in strategy and project planning discussions. You will work collaboratively with Registry participants and internal support teams to identify and implement changes that improve system performance and the user experience.
You will design, develop, and maintain ETL pipelines using SSIS that standardize raw data from multiple data sources and optimize both the operational and dimensional/star-schema data models needed by transactional systems and business intelligence applications (a minimal sketch of this load pattern follows the responsibilities below). You will gather business and functional requirements and translate them into robust, scalable, operable solutions that align with an overall data architecture; extract, transform, and load data to and from various data sources including relational databases, NoSQL databases, web services, and flat files; and produce technical documents such as ER diagrams, table schemas, data lineage, and API documentation.
Provide leadership and oversight on database architectural design for existing systems, including database administration, performance monitoring, and troubleshooting. Provide complex analysis; conceptualize, design, implement, and develop solutions for critical data-centric projects. Perform data flow, system, and data analysis, and develop meaningful, useful presentations of data in downstream applications. Plan and implement standards, define/code conformed global and reusable objects, and perform complex database design and data repository modeling.
Monitor ETL processes, system audits, dashboard reporting, and presentation-layer functioning and performance. Proactively identify and implement procedures that resolve performance and/or data reporting issues. Support the optimal performance of the organization's data and BI systems. Monitor database performance, provide optimization recommendations, and implement those recommendations. Follow the release cycles and deliver task assignments, defect corrections, change requests, and enhancements on time. Troubleshoot and solve technical problems. Perform other responsibilities as assigned by management.
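In this role SSIS handles the orchestration, but the underlying load pattern is a staged upsert into dimension and fact tables. Below is a minimal Python sketch of that pattern using pyodbc against SQL Server; the server, database, and table/column names are illustrative assumptions, not the Registry's actual schema.

```python
# Minimal staging-to-star-schema load pattern (illustrative only).
# Assumes a SQL Server database with hypothetical staging, dimension, and fact tables.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=registry_dw;Trusted_Connection=yes"
)
cur = conn.cursor()

# Upsert a dimension from staged source data (hypothetical table names).
cur.execute("""
    MERGE dim_participant AS tgt
    USING stg_participant AS src
        ON tgt.participant_id = src.participant_id
    WHEN MATCHED THEN
        UPDATE SET tgt.participant_name = src.participant_name,
                   tgt.region = src.region
    WHEN NOT MATCHED THEN
        INSERT (participant_id, participant_name, region)
        VALUES (src.participant_id, src.participant_name, src.region);
""")

# Load the fact table by joining staged transactions to the dimension's natural key.
cur.execute("""
    INSERT INTO fact_registration (participant_key, registration_date, amount)
    SELECT d.participant_key, s.registration_date, s.amount
    FROM stg_registration AS s
    JOIN dim_participant AS d ON d.participant_id = s.participant_id;
""")

conn.commit()
conn.close()
```

The MERGE keeps dimension reloads idempotent, which is the same property the production SSIS packages would typically be designed to guarantee.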
Requirements
Relevant work experience in a data engineering role leveraging SQL and SSIS, including the design and support of ETL routines that import data from multiple data sources
Data warehousing experience including the design, development, and ongoing support of star or snowflake data schemas to support business intelligence applications
Database administration or database development experience in a SQL or MySQL environment; knowledge of Microsoft technology stack; background in Azure Infrastructure as a Service environment desired
Experience working with both structured and unstructured data
Demonstrated understanding of Business Intelligence and data solutions including cubes, data warehouse, data marts, and supporting schema types (star, snowflake, etc.)
Data modeling experience in building logical and physical data models
Applied knowledge of Microsoft Security/Authentication concepts (Active Directory, IIS, Windows OS)
Strong technical planning skills with the ability to prioritize and multitask across a number of work streams
Polished presentation skills; experience creating and presenting findings to executive level staff
Strong written, verbal and interpersonal communication skills, with an ability to communicate ideas and solutions effectively
Must be highly collaborative with the ability to manage and motivate project teams and meet deliverables
Ability to build strong stakeholder relationships and translate complex technical concepts to non-technical stakeholders
Knowledge of SSRS or PowerBI is a plus
"}}],"is Mobile":false,"iframe":"true","job Type":"Contract to Hire","apply Name":"Apply Now","zsoid":"61675964","FontFamily":"PuviRegular","job OtherDetails":[{"field Label":"Work Experience","uitype":2,"value":"5+ years"},{"field Label":"Industry","uitype":2,"value":"Health Care"},{"field Label":"Location","uitype":1,"value":"Rosemont"},{"field Label":"Job Opening ID","uitype":111,"value":"ZR_131_JOB"},{"field Label":"City","uitype":1,"value":"Rosemont"},{"field Label":"State\/Province","uitype":1,"value":"IL"},{"field Label":"Zip\/Postal Code","uitype":1,"value":"60018"}],"header Name":"DataEngineering","widget Id":"**********00072311","is JobBoard":"false","user Id":"**********00096003","attach Arr":[],"custom Template":"5","is CandidateLoginEnabled":false,"job Id":"**********06403248","FontSize":"15","location":"Rosemont","embedsource":"CareerSite","indeed CallBackUrl":"https:\/\/recruit.zoho.com\/recruit\/JBApplyAuth.do"}
$108k-155k yearly est. 60d+ ago
Data Engineer
Kuvare
Data engineer job in Rosemont, CA
About the role
The Kuvare Data Engineer is responsible for the retrieval, storage, and distribution of data across varied platforms and data stores, including modern ETL/ELT pipelines such as Azure Data Factory, as well as modern models such as data mesh and event streaming architectures like Azure Event Hub or Kafka. The individual will rely on broad experience in data modeling, including relational and dimensional models as well as semi-structured and unstructured data forms such as JSON, YAML, and PDF. This role has a direct business impact by providing rapid enhancements to the data platform in support of new business relationships, new consumer products, etc.
What you'll do
· Research architectural trends, design methods, and emerging technologies, in adherence to and in support of company standards, helping Kuvare develop and maintain a nimble technology platform for adapting to new business opportunities.
· Work with Business SMEs and end users - including external business partners and vendors - to translate business requirements into technical specifications, including technical designs, data flow diagrams, and other technical artifacts.
· Create data pipelines to ingest, update, or transport data to/from a variety of data stores, from external data files to internal dimensional or relational models, leveraging a broad toolset including SFTP, Azure Data Factory, Azure Event Hub, SQL stored procedures, etc. (a brief sketch follows this list).
· Leverage excellent SQL skills to build structured data repositories: tables, views, indexes, constraints, etc.
· Perform tests and evaluations to ensure reliability, data security, privacy, and integrity.
Other Responsibilities
· Ability and willingness to travel occasionally, such as for monthly meetings at the local office and an annual IT summit at a Kuvare hub office, which might be Baton Rouge, Cedar Rapids, or Chicago.
· Occasional evening and weekend work to meet deadlines.
· Other duties and responsibilities as assigned.
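As one concrete illustration of the event-streaming side of the toolset above, here is a minimal Python consumer sketch using the azure-eventhub SDK. The connection string, hub name, and event shape are assumptions for illustration, not Kuvare's actual configuration.

```python
# Minimal event-streaming ingestion sketch (illustrative; connection string,
# hub name, and event shape are assumed placeholders).
import json
from azure.eventhub import EventHubConsumerClient

CONN_STR = "<event-hub-namespace-connection-string>"   # assumed placeholder
EVENTHUB_NAME = "policy-events"                        # hypothetical hub name

def on_event(partition_context, event):
    # Each event body is assumed to be a JSON document destined for a staging table.
    record = json.loads(event.body_as_str())
    print(f"partition {partition_context.partition_id}: staged record {record.get('id')}")
    # Record progress; in production a checkpoint store backs this call.
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = read from the beginning
```

In a real pipeline the handler would land each event in a staging table or raw zone rather than printing it.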
Qualifications
· Bachelor's degree in computer science, information systems, computer engineering or related field; or an equivalent combination of education and experience.
· Extensive experience with Microsoft SQL Server and the Azure environment.
· Expertise in relational and dimensional database modeling, data development and administration.
· Extensive experience with ETL/ELT pipelines including any combination of modern tools such as SSIS, Azure Data Factory, Function Apps, Logic Apps, etc.
· Analytic and problem-solving skills and experience.
· Excellent written and oral communication skills.
$108k-155k yearly est. 3d ago
Sr Data Engineer
Insight Global
Data engineer job in Sacramento, CA
Data Pipeline Development
Design and implement complex, scalable ETL pipelines using Azure Fabric, Data Factory, and Synapse. Build and maintain transformation pipelines and data flows in Azure Fabric. Source data from diverse systems including APIs, legacy systems, and mainframes (nice to have).
Automate data ingestion, transformation, and validation processes using PySpark and Python (a brief sketch follows below).
Maintain source control hygiene and CI/CD pipelines (e.g., Azure DevOps, Jenkins).
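A minimal PySpark sketch of the ingestion-and-validation step described above; the lake paths, column names, and validation rules are assumptions rather than the client's actual pipeline.

```python
# Minimal PySpark ingestion-and-validation sketch (illustrative; paths, schema,
# and rules are assumed placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_ingest").getOrCreate()

# Ingest raw files from a landing zone (hypothetical path).
raw = spark.read.option("header", True).csv("abfss://landing@datalake.dfs.core.windows.net/claims/")

# Basic transformation and validation: type casting, null checks, de-duplication.
clean = (
    raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .filter(F.col("claim_id").isNotNull())
       .dropDuplicates(["claim_id"])
)

# Quarantine rows that fail validation instead of silently dropping them.
rejects = raw.filter(F.col("claim_id").isNull())
rejects.write.mode("append").parquet("abfss://quarantine@datalake.dfs.core.windows.net/claims/")

# Persist the curated output for downstream warehouse / BI consumers.
clean.write.mode("overwrite").parquet("abfss://curated@datalake.dfs.core.windows.net/claims/")
```

Quarantining failed rows rather than discarding them keeps the validation step auditable.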
Database Design & Optimization
Design and maintain relational and NoSQL databases (SQL Server, Oracle, etc.).
Ensure referential integrity, indexing, and query optimization for performance.
Data Infrastructure Management
Manage data warehouses, data lakes, and other storage solutions on Azure.
Monitor system performance, ensure data security, and maintain compliance.
Data Modeling & Governance
Develop and maintain logical and physical data models.
Implement data governance policies and ensure data quality standards.
Collaboration & Agile Development
Work closely with business and technical teams to gather requirements and deliver solutions.
Participate in agile ceremonies, sprint planning, and code reviews.
Provide technical guidance and mentorship to team members.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************.To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
Proven experience with Azure Data Services, especially Azure Fabric for data flows and transformation pipelines.
Strong proficiency in SQL, Python, and PySpark.
Experience with data warehousing, ETL/ELT, and data modeling.
Familiarity with CI/CD, DevOps, and microservices architecture.
Experience with relational (SQL Server, Oracle) and NoSQL databases.
Strong analytical and problem-solving skills.
Excellent communication skills, both written and verbal.
Experience integrating data from mainframes or other legacy systems.
Familiarity with mock data generation, data validation frameworks, and data quality tools.
Exposure to data visualization and BI tools for delivering insights
$108k-155k yearly est. 3d ago
Big Data Engineer
Jobsbridge
Data engineer job in Sacramento, CA
Primary Responsibilities
Develop web-based user interfaces for analysis of operational and business data
Create algorithmic data models
Manage data sources and feeds for web-based user interfaces
Ability to deliver on deadline-driven projects
Work within a standards-driven development approach
Collaborate effectively with team members
General Experience with the following:
Familiarity with Data Analytics / Data Science practices
Familiarity with Data Quality and Master Data Management practices
Effective communication skills, both written and verbal
Create technical design/specification documentation
Experience with formal source control tools/methods
Education and Experience
Prefer bachelor's degree or above in Computer Science or related field
5+ years of development experience
4+ years of web development with AJAX skills
4+ years of database development experience
2+ years of Big Data experience: Hadoop, HBase, Cassandra, MongoDB, etc.
Database administration experience
Skills and Knowledge
Development/Management experience with most of the following:
HTML, Javascript, XML
PERL/PHP
Adobe FLEX
Oracle/MS SQL experience
Informatica product suite
Java Web Services
Qualifications
Big Data, HTML, Javascript, XML, Oracle/MS SQL, Data Analytics, Data Science
Additional Information
All your information will be kept confidential according to EEO guidelines.
$108k-155k yearly est. 1h ago
Educator Preparation Data Scientist
California State University System 4.2
Data engineer job in Sacramento, CA
Responsibilities
Under the general direction of the Director, Educator Quality Center, the Educator Preparation Data Scientist will perform duties as outlined below:
Data Systems Project Management
* Design, develop, and manage systems and processes for collecting, extracting, loading, and integrating high-priority credential program and student data into the EdQ operational data store.
* Lead the development of dashboards and reporting tools that support CSU educator preparation programs.
* Collaborate with a wide range of stakeholders, including campus and Chancellor's Office staff, internal IT and IR&A teams, and external partners such as state and national education agencies, funding organizations, consultants, and vendors, to ensure data systems meet strategic and operational needs.
Strategic Data Analysis
* Conduct strategic data analyses to generate valid and reliable evidence that supports continuous improvement in CSU educator preparation programs and addresses California's educator workforce needs.
* Translate analytical findings into actionable insights and communicate them clearly through data visualizations, storytelling, and presentations tailored to non-technical audiences.
* Evaluate the impact of key initiatives using quantitative evidence and support data-informed decision-making across the CSU system.
Support Effective Use of Data
* Develop and refine systems and processes that improve the quality, consistency, and usability of existing educator preparation metrics administered by EdQ.
* Define and monitor key performance indicators (KPIs) in collaboration with educator preparation faculty and practitioners to reflect their goals and values (a brief sketch follows this list).
* Promote the standardization and adoption of shared data tools, software platforms, and protocols across CSU educator preparation programs.
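Much of the KPI monitoring described above reduces to aggregating program-level metrics out of the operational data store. Below is a minimal pandas sketch under that assumption; the campuses, columns, and completion-rate metric are illustrative placeholders, not actual EdQ data.

```python
# Minimal KPI aggregation sketch (illustrative; the table, columns, and the
# completion-rate KPI itself are assumed placeholders, not EdQ's actual metrics).
import pandas as pd

# Assume candidate-level records extracted from the operational data store.
candidates = pd.DataFrame({
    "campus": ["Sacramento", "Sacramento", "Fresno", "Fresno"],
    "program": ["Multiple Subject", "Single Subject", "Multiple Subject", "Single Subject"],
    "enrolled": [120, 85, 90, 60],
    "completed": [104, 70, 81, 45],
})

# KPI: program completion rate by campus and program.
kpi = (
    candidates.assign(completion_rate=lambda d: d["completed"] / d["enrolled"])
              .groupby(["campus", "program"], as_index=False)["completion_rate"]
              .mean()
              .round(3)
)
print(kpi)
```

In practice the input frame would come from a SQL extract or warehouse view rather than an inline literal.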
Qualifications
This position requires:
* Demonstrated interest in improving educational outcomes, particularly in educator preparation.
* Master's degree or higher in a technical, computational, or quantitative field (e.g., Data Science, Statistics, Computer Science, Economics, Educational Measurement, or related discipline).
* A minimum of four years of professional experience, including at least one year of hands-on experience in data science, analytics, or applied research.
* Proven ability to deliver data products, tools, or original research, preferably in an educational or public-sector context, using large or complex datasets.
* Strong quantitative skills, including experience with statistical methods used in education research.
* Proficiency with one or more statistical software tools or programming languages (e.g., R, Python, Stata).
* Working knowledge of SQL, relational databases, and data visualization tools (e.g., Tableau, Power BI).
* Experience with data mining, exploration, and visualization techniques.
* Familiarity with data systems design, development, and management.
* Strong project management skills and ability to work collaboratively with cross-functional teams.
* Experience administering surveys using platforms such as Qualtrics, SurveyMonkey, or QuestionPro.
* Experience working with sensitive or confidential data in a secure computing environment.
Preferred Qualifications
* Experience with ETL (Extract, Transform, Load) processes and data warehousing solutions.
* Proficiency with version control systems such as Git.
* Familiarity with data governance practices, especially in educational or public-sector environments.
* Experience using AI or machine learning tools (e.g., OpenAI) to enhance data analysis, automation, or reporting workflows.
* Knowledge of educator preparation policy, accreditation processes, or workforce analytics.
Application Period
Priority consideration will be given to candidates who apply by December 2, 2025. Applications will be accepted until the job posting is removed.
How To Apply
Please click "Apply Now" to complete the California State University, Chancellor's Office online employment application.
Equal Employment Opportunity
Consistent with California law and federal civil rights laws, the CSU provides equal opportunity in education and employment without unlawful discrimination or preferential treatment based on race, sex, color, ethnicity, or national origin. Reasonable accommodations will be provided for qualified applicants with disabilities who self-disclose by contacting the Senior Human Resources Manager at **************.
Title IX
Please view the Notice of Non-Discrimination on the Basis of Gender or Sex and Contact Information for Title IX Coordinator at: *********************************
E-Verify
This position requires new hire employment verification to be processed through the E-Verify program administered by the Department of Homeland Security, U.S. Citizenship and Immigration Services (DHS/USCIS), in partnership with the Social Security Administration (SSA).
If hired, you will be required to furnish proof that you are legally authorized to work in the United States. The CSU Chancellor's Office is not a sponsoring agency for staff and Management positions (i.e., H1-B VISAS).
COVID-19 Vaccination Policy
Per the CSU COVID-19 Vaccination Policy, it is strongly recommended that all Chancellor's Office employees who are accessing office and campus facilities follow COVID-19 vaccine recommendations adopted by the U.S. Centers for Disease Control and Prevention (CDC) and the California Department of Public Health (CDPH) applicable to their age, medical condition, and other relevant indications.
Mandated Reporter Per CANRA
The person holding this position is considered a 'mandated reporter' under the California Child Abuse and Neglect Reporting Act and is required to comply with the requirements set forth in CSU Executive Order 1083 as a condition of employment.
CSU Out of State Employment Policy
California State University, Office of the Chancellor, as part of the CSU system, is a State of California Employer. As such, the University requires all employees upon date of hire to reside in the State of California. As of January 1, 2022, the CSU Out-of-State Employment Policy prohibits the hiring of employees to perform CSU-related work outside the state of California.
Background
The Chancellor's Office policy requires that the selected candidate successfully complete a full background check (including a criminal records check) prior to assuming this position.
Advertised: Nov 18 2025 Pacific Standard Time
Applications close:
$94k-133k yearly est. 60d+ ago
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation 4.3
Data engineer job in Sacramento, CA
**A Day in the Life:**
The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data, and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement, and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology and possess the ability to create change and facilitate this transformation. They will have experience tailoring software designs and developing and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k, but it will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review and vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior & demonstrated team building & development skills to harness powerful teams
+ Ability to communicate effectively with different levels of leadership within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensure their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourage and maintain a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Prefer Travel, transportation, or hospitality experience
+ Prefer experience with designing application data models for mobile or web applications
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling which may include nights, weekends, and holidays
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
$135k yearly 60d+ ago
Senior Data Engineer
Peoplefinders
Data engineer job in Sacramento, CA
PeopleFinders.com is the premier online service for consumers to locate, contact, and verify people and businesses. Over the past couple of decades, the Company has quietly become one of the largest owners of public records data in the country, distributing its products over a vast network of websites.
Role Overview
The Senior Data Engineer is responsible for leading the design, development, and optimization of scalable, reliable data pipelines and data models that support analytics, reporting, and data-driven decision-making across the organization. This role works closely with Product, Engineering, Marketing, Finance, and Analytics teams to transform complex, multi-source data into high-quality, trusted datasets.
As a senior individual contributor, this role provides technical leadership, sets data engineering standards, and helps shape the evolution of the organization's data platform while ensuring performance, reliability, and scalability.
Salary - $110K - $125K
Essential Duties & Responsibilities
Design, build, and own scalable ETL/ELT pipelines ingesting data from third-party platforms such as HubSpot, Chargebee, Google Analytics, and other SaaS sources (a brief sketch follows this list)
Lead data modeling and transformation efforts within Snowflake to support analytics, BI, and downstream consumers
Architect and optimize data workflows for performance, scalability, and cost efficiency
Integrate and manage data from operational databases including MongoDB and SQL Server
Partner with analytics and business teams to enable high-quality dashboards and reporting in Tableau, Domo, and Power BI
Establish and enforce best practices for data quality, validation, testing, and observability
Monitor, troubleshoot, and resolve complex data pipeline and production issues
Provide technical guidance and mentorship to Data Engineers and Analytics Engineers
Collaborate with Engineering, Product, Data, and Operations teams to align data solutions with business priorities
Contribute to data platform roadmap planning, tooling selection, and architectural decisions
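As one concrete illustration of the SaaS-to-Snowflake pipelines noted above, here is a minimal incremental-load sketch using the Snowflake Python connector. The account, credentials, and table names are hypothetical, not PeopleFinders' actual environment.

```python
# Minimal incremental-load sketch into Snowflake (illustrative; account, schema,
# and table names are assumed placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-west-2",   # assumed placeholder
    user="etl_service",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Assume a fresh extract (e.g., a HubSpot contacts export) has been staged;
# merge it into the curated model so reruns stay idempotent.
cur.execute("""
    MERGE INTO ANALYTICS.CORE.DIM_CONTACT AS tgt
    USING ANALYTICS.STAGING.HUBSPOT_CONTACTS AS src
        ON tgt.contact_id = src.contact_id
    WHEN MATCHED AND src.updated_at > tgt.updated_at THEN UPDATE SET
        email = src.email,
        lifecycle_stage = src.lifecycle_stage,
        updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (contact_id, email, lifecycle_stage, updated_at)
        VALUES (src.contact_id, src.email, src.lifecycle_stage, src.updated_at)
""")

conn.commit()
cur.close()
conn.close()
```

Because the MERGE is idempotent, a failed load can simply be retried without creating duplicates.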
Qualifications
Bachelor's degree in Computer Science, Engineering, Data Science, or equivalent practical experience
6+ years of experience in Data Engineering, Analytics Engineering, or related roles
Advanced SQL skills and extensive experience working with large datasets in cloud data warehouses
Deep hands-on experience with Snowflake, MongoDB, SQL Server and other database technologies
Proven experience designing, building, and maintaining complex ETL/ELT pipelines
Experience supporting BI and analytics platforms such as Tableau, Domo, and Power BI
Strong understanding of data modeling, dimensional modeling, and analytics engineering best practices
Ability to lead technical initiatives and communicate complex concepts to technical and non-technical stakeholders
Preferred Experience
Experience with modern data stack tools such as dbt, Airflow, Fivetran, or Stitch
Experience designing semantic models or enterprise datasets optimized for Power BI
Experience supporting marketing, subscription, revenue, or product analytics use cases
Familiarity with APIs, event-driven data, and semi-structured data formats
Experience optimizing cloud data warehouse performance and cost at scale
Exposure to data governance, security, privacy, or compliance-aware environments
Experience acting as a technical lead or senior individual contributor within a data team
Benefits
Flexible PTO
6% 401K match
100% of the cost paid for the employee for medical, dental and vision
60% of the cost paid for dependents for medical, dental and vision
$110k-125k yearly Auto-Apply 12d ago
Senior Data Engineer
Enformion 4.1
Data engineer job in Sacramento, CA
Enformion is a dynamic and innovative data and analytics company that assists digital marketplaces in fraud prevention, risk management, seamless user onboarding, and fostering trust between shoppers and merchants. Our AI-powered solutions leverage extensive data intelligence and advanced behavioral analysis, enabling continuous monitoring for emerging risk indicators.
Who We Want
Do you live for working on challenging Big Data problems at a massive scale? Are you the type that intimately knows the ins and outs of Big data development with the expert knowledge and experience to push your hardware to the limits? If yes, then we want you.
We are looking for a Senior Data Engineer to help our engineering team build a modern data processing platform using Spark, EMR, and other relational and NoSQL databases. We are investing resources into setting up a more flexible and scalable data infrastructure to support the addition of new data sets and improve overall data quality. The ideal candidate will be excited to join a small company with a startup mindset that moves quickly on a constant flow of ideas, and will be able to weed through the maze of big data tools and potential approaches to find the best possible solution and architecture.
Salary - $110K - $125K
Responsibilities
Implement and maintain big data platform and infrastructure
Develop, optimize and tune MySQL stored procedures, scripts, and indexes
Develop Hive schemas and scripts, Spark jobs using PySpark and Scala, and UDFs in Java
Design, develop and maintain automated, complex, and efficient ETL processes to do batch records-matching of multiple large-scale datasets, including supporting documentation
Develop and maintain pipelines using Airflow or other tools to monitor, debug, and analyze data pipelines (a brief DAG sketch follows this list)
Troubleshoot Hadoop cluster and query issues, evaluate query plans, and optimize schemas and queries
Strong interpersonal skills to resolve problems in a professional manner, lead working groups, and negotiate consensus
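Below is a minimal Airflow DAG sketch for the kind of batch record-matching pipeline described above, assuming Airflow 2.4 or later. The DAG id, task names, and stubbed callables are illustrative assumptions, not Enformion's actual jobs.

```python
# Minimal Airflow DAG sketch for a batch record-matching pipeline (illustrative;
# DAG id, task names, and the stubbed steps are assumed placeholders).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull the day's source files from the landing bucket")

def match_records(**context):
    print("run the Spark record-matching job against the staged datasets")

def load(**context):
    print("publish matched records to the serving tables")

with DAG(
    dag_id="record_matching_batch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_match = PythonOperator(task_id="match_records", python_callable=match_records)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_match >> t_load
```

In a real pipeline the matching task would submit the Spark job (for example via a Spark or EMR operator) rather than printing a message.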
Qualifications & Skills
BS, MS, or PhD in Computer Science or related field
5+ years of experience in languages such as Java, Scala, PySpark, Perl, shell scripting, and Python
Working knowledge of Hadoop ecosystem applications (MapReduce, YARN, Pig, HBase, Hive, Spark, and more!)
Strong Experience working with data pipelines in multi-terabyte data warehouses. Experience in dealing with performance and scalability issues
Strong SQL (MySQL, Hive, etc.) and NoSQL (MongoDB, HBase, etc.) skills, including writing complex queries and performance tuning
Knowledge of data modeling, partitioning, indexing, and architectural database design.
Experience using source code and version control systems such as Git
Experience with continuous build and test processes using tools such as GitLab, SBT, Postman, etc.
Experience with Search Engines, Name/Address Matching, or Linux text processing
Preferred:
Knowledge of cluster configuration, Hadoop administration, and performance tuning is a huge plus.
Distributed computing principles and experience in big data technologies including performance tuning
Machine Learning
Location
Remote
$110k-125k yearly Auto-Apply 18d ago
Senior Dotnet Developer
R Systems 4.5
Data engineer job in Sacramento, CA
** Only Local Sacramento, CA Candidates are encouraged to apply **
** Healthcare & Public Sector Experience is Preferred **
Senior .NET Full Stack Developer (Public Sector, Healthcare, AWS)
We are seeking an experienced Senior .NET Full Stack Developer with strong expertise in designing and building secure, scalable applications using Microsoft .NET technologies. The ideal candidate should have hands-on experience working in Public Sector and Healthcare environments, along with solid proficiency in AWS cloud services.
Key Responsibilities
Design, develop, and maintain full-stack applications using .NET Core/.NET 6+, C#, Web API, Entity Framework, and SQL Server.
Develop front-end interfaces using Angular/React, JavaScript/TypeScript, HTML5, CSS3, and modern UI frameworks.
Architect and implement RESTful APIs, microservices, and enterprise-grade integrations.
Build and deploy cloud-native applications on AWS, using services such as:
AWS Lambda, API Gateway, ECS/EKS, EC2
S3, RDS, DynamoDB
Cognito, IAM
SQS/SNS, CloudWatch, CloudFormation
Implement secure authentication and authorization mechanisms (OAuth2, JWT, SSO).
Work with cross-functional teams including Product Owners, Business Analysts, and QA.
Participate in all stages of the SDLC: requirements, design, coding, testing, and deployment.
Perform code reviews and mentor junior developers.
Troubleshoot and optimize application performance, scalability, and reliability.
Required Skills & Experience
7-10+ years of hands-on experience in .NET / C# development.
Strong experience with .NET Core/.NET 5+, Web API, MVC, Microservices.
Proficient in front-end development with React/Angular.
Strong database skills with SQL Server, query optimization, and stored procedures.
Solid experience with the AWS ecosystem, including:
Serverless (Lambda), Containerization (ECS/EKS), Storage (S3), Databases (RDS/DynamoDB)
API Gateway, CloudWatch, IAM, VPC, SQS/SNS
CI/CD using AWS CodePipeline, CodeBuild, GitHub Actions, or equivalent
Experience in Public Sector and Healthcare domains, with understanding of:
Government security standards and compliance-driven development
Familiarity with containerization (Docker), Infrastructure as Code (CloudFormation/Terraform), unit testing frameworks.
Good to Have
Experience with Kubernetes (EKS).
Experience with reporting/analytics tools.
AWS Certifications (Developer Associate, Solutions Architect Associate/Professional).
Experience with NoSQL databases (MongoDB, DynamoDB).
The average data engineer in Folsom, CA earns between $92,000 and $181,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Folsom, CA
$129,000
What are the biggest employers of Data Engineers in Folsom, CA?
The biggest employers of Data Engineers in Folsom, CA are: