Luxoft is looking for a Senior Data Engineer to develop a new application that investors and investment committees will use to review their portfolio data, tailored to specific user groups.
Responsibilities:
• Work with complex data structures and provide innovative solutions to complex data delivery requirements
• Evaluate new and alternative data sources and new integration techniques
• Contribute to data models and designs for the data warehouse
• Establish standards for documentation and ensure your team adheres to those standards
• Influence and develop a thorough understanding of standards and best practices used by your team
Mandatory Skills Description:
• Seasoned data engineer with hands-on AWS experience conducting end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, DBT, Redshift, RDS, etc.
• Extensive Python API design experience, preferably FastAPI (see the sketch after the skills lists below)
• Strong SQL knowledge
Nice-to-Have Skills Description:
- Pyspark
- Databricks
- ETL design
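As an illustration of the FastAPI skill called out above, here is a minimal, hypothetical sketch of a portfolio-data endpoint of the kind this role describes. The route, model fields, and the in-memory stub standing in for a Redshift/RDS query are assumptions for illustration only, not part of the posting.

    # Minimal FastAPI sketch (illustrative; names and data are hypothetical)
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI()

    class Position(BaseModel):
        ticker: str
        quantity: float
        market_value: float

    # In practice this lookup would query Redshift/RDS; here it is stubbed in memory.
    PORTFOLIOS = {"fund-1": [Position(ticker="AAPL", quantity=100, market_value=19500.0)]}

    @app.get("/portfolios/{portfolio_id}/positions", response_model=list[Position])
    def get_positions(portfolio_id: str):
        positions = PORTFOLIOS.get(portfolio_id)
        if positions is None:
            raise HTTPException(status_code=404, detail="portfolio not found")
        return positions

Such an app would typically be served with an ASGI server (e.g., uvicorn) and backed by real warehouse queries rather than the stub above.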
$99k-139k yearly est. 4d ago
SCCM Endpoint Engineer (LARGELY REMOTE/NO C2C)
Amerit Consulting 4.0
Data engineer job in Riverside, CA
Our client, a Medical Center facility under the aegis of a California Public Ivy university and one of the largest health delivery systems in California, seeks an accomplished SCCM Endpoint Engineer.
________________________________________
NOTE - THIS IS A LARGELY REMOTE ROLE & ONLY W2 CANDIDATES/NO C2C/1099
*** Candidate must be authorized to work in USA without requiring sponsorship ***
Position: SCCM Endpoint Engineer (Job Id - # 3167240)
Location: Los Angeles CA 90024 (Hybrid-99% Remote/1% onsite)
Duration: 10 months + Strong Possibility of Extension
_________________________________________________________
The candidate will travel onsite to learn and view the client's setup and will come onsite as needed for team building or vendor engagements. Onsite requirements are about 2-3 visits per year.
____________________________________________________
Required skills and experience:
Ability to monitor and report on statuses of endpoints utilizing SCCM/MECM & Intune.
Understanding of Networking and Active Directory.
Advanced knowledge of Microsoft Windows 10, Mac OS, Intune, Autopilot, SCCM/MECM, JAMF, and other endpoint management solutions
Advanced knowledge of ISS Microsoft Office products (O365, Office 2016, Outlook, Exchange and OWA).
Understanding of project plans, presentations, procedures, diagrams, and other technical documentation.
Understanding of Networking protocols and standards: DNS, DHCP, WINS and TCP/IP, etc.
Ability to work independently with minimal supervision as well as in a team environment.
Ability to follow escalation procedure within the TSD Team and under the ISS umbrella.
Establish standards and procedures for best practices, enabling commitments to established SLAs.
Ability to research and test new technologies and processes.
Demonstrate ability to develop creative solutions to complex problems.
Understanding of various Desktop Management Systems such as anti-virus software, patch management, full disk encryption, SSO/Tap-Badge (Imprivata) software and software delivery.
Ability to prioritize, organize, and execute work assignments.
Ability to communicate the status of various systems to management, leadership and/or support personnel.
Ability to skillfully react to a fluid and constantly changing work environment.
Ability to train, delegate and review the work of staff members.
Advanced knowledge of ticketing systems (ServiceNow).
Strong technical abilities with excellent communication and interpersonal skills.
Advanced knowledge of cloud computing (Azure, Intune, Autopilot, DaaS, Box, OneDrive).
Advanced knowledge of standard desktop imaging and upgrade procedures; SCCM/MECM/MDT, Intune, OSD, PXE, thin vs thick images.
Advanced knowledge of VPN remote software and RDP setup.
Advanced knowledge of Windows and Citrix based printing.
Understanding of ITIL and tier-structure support using a ticket tracking system.
Advanced knowledge of Apple OSX and iOS operating systems and platforms.
Advanced knowledge of virtualization technologies (Citrix XenApp, XenDesktop, VMWare, Azure Virtual Desktop, Windows 365, Amazon Workspaces).
Advanced knowledge of IT Security applications (Cisco AMP, Aruba OnGuard, DUO, FireEye, Windows Defender, Windows BitLocker, Checkpoint Encryption and USB allowlisting).
___________________________________________
Bhupesh Khurana
Lead Technical Recruiter
Email - *****************************
Company Overview:
Amerit Consulting is an extremely fast-growing staffing and consulting firm. Amerit Consulting was founded in 2002 to provide consulting, temporary staffing, direct hire, and payrolling services to Fortune 500 companies nationally, as well as small to mid-sized organizations on a local & regional level. Currently, Amerit has over 2,000 employees in 47 states. We develop and implement solutions that help our clients operate more efficiently, deliver greater customer satisfaction, and see a positive impact on their bottom line. We create value by bringing together the right people to achieve results. Our clients and employees say they choose to work with Amerit because of how we work with them - with service that exceeds their expectations and a personal commitment to their success. Our deep expertise in human capital management has fueled our expansion into direct hire placements, temporary staffing, contract placements, and additional staffing and consulting services that propel our clients' businesses forward.
Amerit Consulting provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Applicants with criminal histories are considered in a manner that is consistent with local, state, and federal laws.
$92k-135k yearly est. 2d ago
Data Scientist
Opendoor Technologies Inc.
Data engineer job in Ontario, CA
Data Scientist, Pricing
Opendoor is transforming one of the largest, most complex markets in the world - residential real estate - using data at massive scale. Every pricing signal we generate directly impacts how we value homes, how we manage risk, and how efficiently capital moves through our marketplace. The work is highly leveraged: the quality of our pricing decisions influences conversion, margins, customer trust, and the company's financial performance.
We are looking for mid to senior level Data Scientists. In this role, you will be a core driver of how Opendoor prices real estate at scale. You'll operate at the intersection of economics, machine learning, experimentation, and product strategy - tackling ambiguity, shaping the pricing roadmap, and building models/analyses that materially move the business. Your insights will influence how we evaluate millions of dollars of housing inventory - and directly shape outcomes for our customers, our balance sheet, and the health of our marketplace.
What You'll Do
* Build and maintain pricing metrics, dashboards, and frameworks.
* Run experiments and causal analyses to measure impact and drive decisions (see the sketch after this list).
* Develop predictive + statistical models that improve pricing accuracy.
* Partner closely with Product, Engineering, and Operations teams to influence roadmap and model deployment.
* Deliver insights and narratives that inform executive strategy.
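As a toy illustration of the experiment analysis mentioned above, here is a minimal sketch of a two-sample comparison on simulated conversion data. The metric, rates, and sample sizes are made up for illustration; real pricing experiments would involve far more careful design and causal-inference tooling.

    # Toy A/B comparison on simulated conversion outcomes (illustrative only)
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    control = rng.binomial(1, 0.30, size=5000)    # hypothetical baseline pricing policy
    treatment = rng.binomial(1, 0.32, size=5000)  # hypothetical adjusted pricing policy

    lift = treatment.mean() - control.mean()
    t_stat, p_value = stats.ttest_ind(treatment, control)
    print(f"lift={lift:.4f}, t={t_stat:.2f}, p={p_value:.4f}")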
Skills & Qualifications
* Deep statistical reasoning: hypothesis design, experimental design, causal inference, and ability to distinguish signal vs noise.
* Proven end-to-end ML ownership: data acquisition, feature engineering, model development, validation, deployment, and ongoing monitoring.
* Strong SQL + Python proficiency; comfortable working with production data pipelines and modern ML tooling (e.g., Spark, Airflow, Ray, SageMaker, Vertex, etc.).
* Demonstrated ability to translate complex analytical findings into clear business recommendations and influence cross-functional decision-making.
* Experience working with ill-defined problems and driving clarity on problem definition, success metrics, and realistic tradeoffs.
* High data-quality bar: disciplined approach to validation, bias analysis, and making decisions rooted in evidence vs intuition.
* Effective communicator - able to tell the story behind the model to both highly technical and non-technical audiences.
Base salary range for this role varies. Generally, the base salary range is $135,000 - $199,000 CAN annually + RSUs + bonus + ESPP + additional employee benefits (medical/dental/vision, life insurance, unlimited PTO, 401K).
JR 9200
#LI-KC1
At Opendoor, our mission is to tilt the world in favor of homeowners and those who aim to become one. Homeownership matters. It's how people build wealth, stability, and community. It's how families put down roots, how neighborhoods strengthen, how the future gets built. We're building the modern system of homeownership, giving people the freedom to buy and sell on their own terms. We've built an end-to-end online experience that has already helped thousands of people, and we're just getting started.
$135k-199k yearly Auto-Apply 54d ago
Data Scientist
Prime Talent Recruiting
Data engineer job in Irvine, CA
Data Scientist - User Insights
Salary $150k -$180k
Orange County - Hybrid
We're looking for a Data Scientist to develop and deploy machine learning models that drive personalization and recommendation insights across SaaS and mobile app platforms. Candidates must be experienced in applying machine learning to real-world problems, not just theoretical and mathematical concepts.
Over 5 years of experience designing, building, and deploying large-scale ML systems, with a preference for expertise in recommender systems, predictive modeling, or ad-tech.
Proven track record of successfully bringing machine learning solutions to market.
You need to deliver data in a way that engineers can easily use and integrate into their systems.
What you'll do:
Design, train, and deploy scoring and recommendation models that connect users with relevant offers and personalized experiences in an application.
Practical experience working with Generative AI and large language model workflows.
Collaborate with engineering and product teams to develop scalable machine learning systems that align with product goals.
Successfully moved several machine learning models from prototype to production, delivering real business results.
Skilled in Python and common libraries such as Pandas, PyTorch, and TensorFlow.
Strong SQL knowledge and experience working with big data platforms like Spark and Snowflake.
Experience working hands-on with Generative AI and large language model workflows, including prompt engineering and fine-tuning.
Experience with ML Ops on cloud platforms such as AWS, including container tools like Docker and Kubernetes, and workflow tools like Airflow and Kubeflow.
Experienced in integrating model services with scalable, version-controlled APIs for reliable deployment and maintenance.
Perks:
Medical, dental, and vision
Wellness reimbursements
Unlimited PTO
Equity opportunity
401(k)
$150k-180k yearly 60d+ ago
Data Scientist
Kia USA
Data engineer job in Irvine, CA
At Kia, we're creating award-winning products and redefining what value means in the automotive industry. It takes a special group of individuals to do what we do, and we do it together. Our culture is fast-paced, collaborative, and innovative. Our people thrive on thinking differently and challenging the status quo. We are creating something special here, a culture of learning and opportunity, where you can help Kia achieve big things and most importantly, feel passionate and connected to your work every day.
Kia provides team members with competitive benefits including premium paid medical, dental and vision coverage for you and your dependents, 401(k) plan matching of 100% up to 6% of the salary deferral, and paid time off. Kia also offers company lease and purchase programs, company-wide holiday shutdown, paid volunteer hours, and premium lifestyle amenities at our corporate campus in Irvine, California.
Status
Exempt
General Summary
The Data Scientist will play an important role in executing data analysis for Kia North America's regional subsidiaries (KUS/KCA/KaGA/KMX). A future-driven automotive company, Kia has access to vast and diverse datasets and is excited to fill this position with an individual that can derive business improvements and insights from this data. Strong applicants for this role will have statistics, machine learning, and computer science skills to leverage high-performance compute clusters as well as perform reproducible data analysis at scale. With these requirements in mind, our mindset is that data, analytics, automation, and responsible AI can revolutionize our many lines of business.
Essential Duties and Responsibilities
1st Priority - 30%
Data wrangling and analysis
* Assess the accuracy of new data sources
* Understand the relationship between the data and the business process
* Preprocess structured and unstructured data
* Analyze large amounts of data to discover trends and patterns
* Build prediction and classification models
* Coordinate with different functional teams for feature engineering
2nd Priority - 30%
Assess, visualize, and improve analysis
* Test and continuously improve the accuracy of statistical and machine learning models
* Present information using Python notebooks and/or dashboards
* Simplify and explain complex statistics in an intuitive manner
* Continuously monitor and validate production analysis results
3rd Priority - 20%
Collaborate with IT Team to deploy analysis results
* Build REST APIs for data and analysis result consumption
* Assist the IT system developers to deploy analysis as a service
4th Priority - 20%
Clear documentation, source code management, and reproducible analysis
* Use git within GitLab
* Create virtual environments to isolate project dependencies and requirements
* Track model performance and hyperparameter configurations
* Track data versioning
Qualifications/Education
Education:
* Bachelor's degree or equivalent experience in a related field of technology required
* Master's degree in analytics, data science, or computer science preferred
Job Requirement
Overall Related Experience:
* Experience querying databases and using programming languages such as Python and SQL
* Experience using the Hadoop ecosystem (Hadoop, Hive, Impala, Spark, etc.)
* Experience using statistics, machine learning and deep learning algorithms
* Experience publishing results to stakeholders through dashboards (e.g. Power BI, MicroStrategy, Tableau)
Directly Related Experience:
* 3+ years of experience in data science preferred
Specialized Skills and Knowledge Required
* Proficiency in Python and SQL
* Knowledge of a variety of machine learning techniques including deep learning
* Knowledge of advanced statistical techniques
* Knowledge and experience with natural language processing (NLP)
* Experience with common Python libraries for data analysis such as Pandas and NumPy
* Experience with visualization libraries such as Matplotlib, Seaborn, Plotly, Bokeh and plotnine
* Experience developing and evaluating statistical and machine learning models using libraries such as statsmodels and scikit-learn
* Experience with deep learning frameworks such as PyTorch and TensorFlow
* Experience with big data processing tools: Hadoop ecosystem, Spark etc.
* Strong data-driven problem-solving skills
* Excellent written and verbal communication skills to coordinate across teams
Competencies
* Care for People
* Chase Excellence Every Day
* Dare to Push Boundaries
* Empower People to Act
* Move Further Together
Pay Range
$82,382 - $110,272
Pay will be based on several variables that are unique to each candidate, including but not limited to, job-related skills, experience, relevant education or training, etc.
Equal Employment Opportunities
KUS provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, ancestry, national origin, sex, including pregnancy and childbirth and related medical conditions, gender, gender identity, gender expression, age, legally protected physical disability or mental disability, legally protected medical condition, marital status, sexual orientation, family care or medical leave status, protected veteran or military status, genetic information or any other characteristic protected by applicable law. KUS complies with applicable law governing non-discrimination in employment in every location in which KUS has offices. The KUS EEO policy applies to all areas of employment, including recruitment, hiring, training, promotion, compensation, benefits, discipline, termination and all other privileges, terms and conditions of employment.
Disclaimer: The above information on this job description has been designed to indicate the general nature and level of work performed by employees within this classification and for this position. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job.
$82.4k-110.3k yearly 22d ago
Data & Applied M/L Engineer
Gatekeeper Systems 3.3
Data engineer job in Lake Forest, CA
At Gatekeeper Systems, we're revolutionizing retail loss prevention and customer safety through a powerful combination of physical deterrents and cutting-edge technology, including AI, computer vision, and facial recognition. As a global leader with over 25 years of industry excellence and a growing, diverse team of 500 employees across offices in North America, Europe, Australia, and Asia, we're driven by innovation, integrity, and impact. Join us and be part of a mission-focused team that's making a real difference in the future of retail, providing innovative solutions and services that redefine industry standards.
POSITION SUMMARY:
WHAT WE OFFER…
Join the team at Gatekeeper Systems and watch your career grow! We offer competitive compensation and benefits packages that include:
Attractive Total Compensation Package, including annual bonus
Comprehensive healthcare benefits including medical, dental, and vision coverage; Life/ADD/LTD insurance; FSA/HSA options.
401(k) Plan with Employer Match
Generous Paid Time Off (PTO) policy
Observance of 11 paid company holidays
Various Employee Engagement Events
Exciting Growth Opportunities
Positive Company Culture
This role bridges the gap between core data engineering and practical machine learning applications. Primarily, you will be a data platform engineer responsible for owning the core data pipelines, data models, and quality controls that power Gatekeeper analytics and future data products.
Secondarily, you will drive the production lifecycle of our shopping cart computer-vision feature. You will orchestrate the data workflows that interface with our Machine Learned models to ensure accurate cart classification, while leveraging the FaceFirst ML team for deeper capacity. You will collaborate with BI Analysts, software engineers, and product teams to transform raw data into actionable insights.
ESSENTIAL JOB FUNCTIONS; but not limited to:
Data Platform, Pipelines, & Quality (Primary Focus)
Pipeline Design & Operation: Design, build, and operate scalable ELT/ETL pipelines that ingest data from IoT/smart-cart telemetry, video events, operational systems, and external partners into our cloud data lake/warehouse.
Infrastructure Management: Build and maintain robust data infrastructure, including databases (SQL and NoSQL), data warehouses, and data integration solutions.
Data Modeling: Establish canonical data models and definitions (schemas, event taxonomy, metrics) so teams can trust and reuse the same data across products, BI, and analytics.
Data Quality Assurance: Own data quality end-to-end by implementing validation rules, automated tests, anomaly detection, and monitoring/alerting to prevent and quickly detect regressions (see the sketch after this list).
Consistency & Governance: Drive data consistency improvements across systems (naming, identifiers, timestamps, joins, deduplication) and document data contract expectations with producing teams.
Root Cause Analysis: Troubleshoot pipeline and data issues, perform root-cause analysis, and implement durable fixes that improve reliability and reduce operational load.
Collaboration & Analytics: Partner with BI Analysts and Product teams to create curated datasets and self-serve analytics foundations (e.g., marts/semantic layer), as well as support internally facing dashboards to communicate system health.
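To make the data-quality responsibility above concrete, here is a minimal sketch of batch validation rules over cart telemetry. The column names, allowed labels, and sample batch are hypothetical; a production setup would typically use a dedicated data-quality framework with monitoring and alerting rather than a bare function.

    # Minimal data-quality check for a batch of cart telemetry (hypothetical schema)
    import pandas as pd

    def validate_cart_events(df: pd.DataFrame) -> list[str]:
        failures = []
        if df["event_id"].isna().any():
            failures.append("null event_id")
        if df["event_id"].duplicated().any():
            failures.append("duplicate event_id")
        if not df["timestamp"].is_monotonic_increasing:
            failures.append("timestamps out of order")
        if (~df["classification"].isin({"empty", "loaded", "unknown"})).any():
            failures.append("unexpected classification label")
        return failures

    batch = pd.DataFrame({
        "event_id": [1, 2, 2],
        "timestamp": pd.to_datetime(["2024-01-01 10:00", "2024-01-01 10:01", "2024-01-01 10:02"]),
        "classification": ["empty", "loaded", "full"],
    })
    print(validate_cart_events(batch))  # ['duplicate event_id', 'unexpected classification label']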
Applied ML Ownership - Smart Exit Cart-Empty Classifier (Secondary Focus)
Lifecycle Management: Own the production lifecycle for the cart classification capability, including data collection/labeling workflows, evaluation, threshold tuning, and safe release/rollback processes.
Pipeline Implementation: Implement and optimize machine learning pipelines, from feature engineering and model training to deployment and monitoring in production.
Evaluation & Monitoring: Build and maintain an evaluation harness (offline metrics + repeatable test sets) and ongoing monitoring (accuracy drift, data drift, false positive/negative analysis).
Cross-Team Collaboration: Collaborate with the FaceFirst ML team to incorporate improvements (model updates, feature changes) while keeping Gatekeeper's production integration stable.
Integration: Work with software engineers to ensure the classifier integrates cleanly into the product workflow with robust telemetry, logging, and operational runbooks.
QUALIFICATION REQUIREMENTS
The requirements listed below are representative of the knowledge, skill and/or ability required.
Exemplifies professionalism in all aspects of day-to-day duties and responsibilities.
Self-aware and open to learning about personal effectiveness in the workplace.
Exhibits a positive attitude toward the vision, policies, and goals of Gatekeeper Systems.
Constantly strives to improve performance and effectiveness of the team and the company.
EDUCATION AND/OR EXPERIENCE
Work Experience: 5+ years
Core Engineering: Strong experience building and operating production ELT/ETL pipelines and data warehouses.
Programming: Fluency in SQL and Python (or similar) for data transformation, validation, and automation.
Cloud Platforms: Experience with cloud data platforms (Azure and/or GCP), including object storage, security/access controls, and cost-aware design.
Tooling: Hands-on experience with orchestration and transformation tooling (e.g., Airflow/Prefect) and batch processing frameworks (e.g., Spark/Databricks).
Quality Practices: Practical experience implementing data quality practices (tests, monitoring/alerting, lineage/documentation) and improving data consistency across systems.
Operations: Collaborate with operational teams to identify, diagnose, and remediate in-field system issues.
Bachelor's Degree in Computer Science, Software Engineering, Information Systems, Mathematics, Statistics, or a related technical field.
SALARY RANGE
$150,000 to $175,000; 5% AIP
PHYSICAL DEMANDS
The physical demands described here are representative of those that must be met by a team member to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Repetitive motions and routine use of standard office equipment such as computers, telephones, copiers/scanners and filing cabinets.
Ability to see, speak, walk, hear, stand, use of hand/fingers to handle or feel; climb stairs, stoop, carry/lifting up to 50 lbs.
Ability to sit at a desk.
Specific vision abilities required include close vision, color vision, peripheral vision, depth perception, and the ability to adjust focus.
Regularly utilizes manual dexterity to put parts or pieces together quickly and accurately.
DISCLAIMER
This Job Description is a general overview of the requirements for the position. It is not designed to contain, nor should it be interpreted as being all inclusive of every task which may be assigned or required. It is subject to change, in alignment with company/department needs and priorities.
Gatekeeper Systems, Inc., is an equal opportunity employer. We are committed to developing a diverse workforce and cultivating an inclusive environment. We value diversity and believe that we are strengthened by the differences in our experiences, thinking, culture, and background. We strongly encourage applications from candidates who demonstrate that they can contribute to this goal. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status or any protected basis.
$150k-175k yearly 6d ago
Data Engineer
MBK Real Estate 4.2
Data engineer job in Irvine, CA
MBK Real Estate (MBK) is an industry leader in real estate operations and development. Through its subsidiaries, MBK Rental Living, MBK Industrial Properties and MBK Senior Living, MBK Real Estate is renowned for building award-winning new home and apartment communities, state-of-art distribution facilities and for its reputation in providing exceptional high standards of service throughout its senior living communities.
MBK is a privately held real estate development firm with roots in the business dating back to 1996. MBK's extensive activities through its operating divisions, along with the size and scale of our development and building activities represent the continuation and commitment to American enterprise.
Job Description
MBK Real Estate is hiring a Data Engineer to join our team at our Home Office in Irvine, CA!
This role will be hybrid with 3 days in office and 2 days remote.
Job Summary:
We are seeking a highly skilled Data Engineer with strong experience in Microsoft Azure, Microsoft Fabric, SQL, Python, and Power BI to help design, build, and optimize our cloud-based data ecosystem. This role will work closely with our data/business intelligence team to ensure scalable, reliable, and high-quality data solutions that support our enterprise data strategy.
Supervisory Responsibilities:
N/A
Duties & Responsibilities:
Design, build, and maintain cloud-native data pipelines using Azure Synapse and Microsoft Fabric.
Develop and optimize ETL/ELT workflows for ingestion, transformation, and processing of large-scale datasets.
Create and manage data models, warehouses, and lakehouse environments within Microsoft Fabric.
Write high-performance SQL for data transformation, report automation, modeling, and analytics.
Build reusable Python components for data processing, automation, and orchestration (see the sketch after this list).
Implement best practices for data quality, governance, observability, security, and performance tuning.
Collaborate with data analysts and business teams to translate requirements into scalable solutions.
Monitor, troubleshoot, and improve pipelines to ensure high availability and reliability.
Contribute to cloud architecture, tooling selection, performance tuning, and overall platform modernization efforts.
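As an illustration of the reusable Python components mentioned above, here is a minimal PySpark sketch of the kind of transformation that might run in a Fabric or Synapse notebook. The table and column names are hypothetical and not part of the posting.

    # Reusable dedup step for a lakehouse table (illustrative; names are hypothetical)
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.getOrCreate()

    def latest_per_key(df, key_col: str, ts_col: str):
        """Keep only the most recent row per business key."""
        w = Window.partitionBy(key_col).orderBy(F.col(ts_col).desc())
        return (df.withColumn("_rn", F.row_number().over(w))
                  .filter(F.col("_rn") == 1)
                  .drop("_rn"))

    raw = spark.read.table("raw_leasing_events")                  # hypothetical source table
    curated = latest_per_key(raw, key_col="lease_id", ts_col="updated_at")
    curated.write.mode("overwrite").saveAsTable("curated_leasing_events")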
Salary: $120,000-$130,000 + Bonus
**Candidates must be legally authorized to work in the United States. This position does not offer sponsorship now or in the future
**We are not able to consider relocation candidates, and candidates will need to reside within a reasonably commutable distance for consideration
Education Requirements:
A bachelor's degree in a computer science-related field is strongly preferred.
Experience Requirements (in years):
5+ years of professional experience as a Data Engineer or in a similar role.
Strong hands-on experience (3+ years) with Azure cloud services (e.g., Azure Synapse, Azure Data Lake Storage, Azure Functions preferred).
Practical experience (1+ years) working with Microsoft Fabric (Data Factory, Lakehouse, Pipelines, Notebooks, and Power BI integration).
Advanced proficiency (3+ years) in SQL (query optimization, performance tuning, complex joins, stored procedures).
Strong Python programming skills (2+ years) for data processing, automation, and scripting.
Required Competencies/Licenses/Certifications:
Experience building scalable, production-grade data pipelines for structured and unstructured data.
Understanding of data modeling, data warehousing concepts, and ELT/ETL patterns.
Familiarity with CI/CD, Git-based workflows, and modern DevOps practices.
Microsoft Suite competency.
Strong problem-solving and analytical mindset.
Excellent communication skills and ability to partner across technical and business teams.
Detail-oriented, organized, and able to manage multiple priorities.
Growth mindset and eagerness to work with emerging Azure/Fabric technologies.
Preferred Qualifications
Experience with Delta Lake, parquet, or similar big data file formats.
Familiarity with Power BI, semantic models, and Fabric integrated analytics.
Knowledge of Spark, Databricks, or distributed data processing.
Exposure to data governance frameworks, metadata management, and lineage tools.
Azure and/or Fabric certifications (e.g., DP-203 Azure Data Engineer Associate, DP-700 Fabric Data Engineer Associate).
Physical Demands & Work Environment:
Must be mobile and able to perform the physical requirements of the job, bending, kneeling, stooping, pushing, pulling and repetitive motion.
Ability to sit and work at a computer for long periods of time.
Able to move intermittently throughout the workday and between divisions.
We offer a rich benefits package comprising of the following: competitive salaries with opportunities for growth; 401(k) retirement plan with up to 4% employer matching; comprehensive industry leading medical, dental and vision insurance; company-provided life, disability and AD&D insurance; flexible spending accounts, generous paid time off including vacation and sick time, holidays, and bereavement leave; and a variety of programs including leadership development, training, and personal coaching; education loan assistance and scholarships; daily living, financial and legal services; childcare and eldercare assistance; employee discounts; and health and wellness resources that include virtual yoga, mindfulness, and financial readiness for employees and their family members.
If you are ready to meet the challenges of this critical role, we want to hear from you!
MBK is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, disability, age, sexual orientation, gender identity, national origin, veteran status, genetic information, or other protected reason. Our company is committed to providing access, equal opportunity and reasonable accommodation for qualifying individuals in employment, its services, programs, and activities. To request reasonable accommodation, contact *************************.
Regulatory Disclosures for Senior Living Communities with Medicaid Residents: An “Excluded Party” is a person that the federal or state government found not eligible to provide care and services in a facility that receives Medicare or Medicaid funding. If employed at one of our senior living communities that receives Medicare or Medicaid funding, team members must not be considered an “Excluded Party” as defined by the U.S. Department of Health and Human Services, any state Medicaid Programs, and any additional federal and state government contract programs. If, as a team member, you learn that you are an Excluded Party at any time, you must present your Excluded Party notice letter to your supervisor immediately.
Other Regulatory Requirements: If employed at one of our senior living communities, team members must continually comply with certain laws and regulations that impact the company, including, but not limited to, as applicable, state licensing regulations, the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Resident Rights as defined by the U.S. Department of Health and Human Services, and any other federal or state laws relating to team members' professional licenses.
HIPAA Disclosure:
All Team Members, prior to commencing employment and once employed, must not be considered an “Excluded Party” as defined by the Medicare and state Medicaid Programs as well as other federal and state government contract programs. If, as an associate, you learn you are an Excluded Party, you must present your Excluded Party notice letter to your supervisor immediately. An Excluded Party is a person that the federal or state government found not eligible to provide care and services in a Community that receives Medicare or Medicaid funding. In addition, at all times during your employment, all associates must be in compliance with certain laws and regulations that affect the company, including but not limited to Resident Rights, HIPAA, state licensing regulations, and those laws relating to an associate's professional license.
$122k-175k yearly est. Auto-Apply 8d ago
Staff Data Engineer - Large Driving Model, Autonomy
Rivian 4.1
Data engineer job in Irvine, CA
About Rivian
Rivian is on a mission to keep the world adventurous forever. This goes for the emissions-free Electric Adventure Vehicles we build, and the curious, courageous souls we seek to attract. As a company, we constantly challenge what's possible, never simply accepting what has always been done. We reframe old problems, seek new solutions and operate comfortably in areas that are unknown. Our backgrounds are diverse, but our team shares a love of the outdoors and a desire to protect it for future generations.
Role Summary
We are seeking a Staff Data Engineer to drive the advancement of our Large Driving Model (LDM). The successful candidate will be responsible for the end-to-end data strategy that fuels our models, from mining and curation to experimental collaboration. You will build the pipelines that surface high-value examples, refine evaluation benchmarks, and partner closely with modeling teams to accelerate model performance. Your work will directly influence the intelligence and safety of our autonomous capabilities, serving as a critical pillar in our mission. If you are an independent technical leader, highly analytical, and passionate about data-centric AI, we would like to talk with you!
Responsibilities
Build scalable data mining pipelines to extract rare and critical driving scenarios from fleet data.
Curate and analyze large-scale training datasets to improve the robustness and generalization of the Large Driving Model.
Design and enhance evaluation frameworks to rigorously measure model performance against complex real-world baselines.
Collaborate closely with modeling engineers to define data requirements and iterate on rapid experimental loops.
Identify trends, anomalies, and data distribution shifts that impact model training and validation.
Serve as a key technical authority within the department, offering guidance on current and future data infrastructure and strategy.
Navigate complex tradeoffs between data quality, scale, and compute budgets, ensuring architectural decisions align with company priorities.
Empower fellow engineers by crafting easily extendable and collaborative designs and code.
Display the ability to influence and foster consensus across data and modeling teams, even during challenging technical discussions.
Qualifications
B.S. or M.S. in Computer Science, Data Science, Robotics, Engineering, or a related field.
8+ years of experience as a Data Engineer or Machine Learning Engineer, preferably within the autonomous driving, robotics, or LDM space.
Demonstrated potential or experience in technical leadership (TLM), capable of mentoring engineers and managing project roadmaps.
Demonstrated ability to set direction and guide the team towards it.
Expert-level knowledge and experience in Python, SQL, and distributed data processing frameworks.
Solid understanding of data-centric AI, dataset curation strategies, and model evaluation methodologies.
Experience with large-scale sensor data (LiDAR, Camera) or complex robotics systems.
Excellent problem-solving skills and attention to detail.
Capable of working in a fast-paced development environment.
Good team player with excellent communication skills that can build team consensus.
Passionately motivated to take ideas from experimental validation to a verified product.
Hands-on approach: Proactively identifies and fills in gaps where needed.
Pay Disclosure
Salary Range for California Based Applicants: $228,000.00 - $285,000.00 (actual compensation will be determined based on experience, location, and other factors permitted by law).
Benefits Summary: Rivian provides robust medical/Rx, dental and vision insurance packages for full-time employees, their spouse or domestic partner, and children up to age 26. Coverage is effective on the first day of employment.
Equal Opportunity
Rivian is an equal opportunity employer and complies with all applicable federal, state, and local fair employment practices laws. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, ancestry, sex, sexual orientation, gender, gender expression, gender identity, genetic information or characteristics, physical or mental disability, marital/domestic partner status, age, military/veteran status, medical condition, or any other characteristic protected by law. Rivian is committed to ensuring that our hiring process is accessible for persons with disabilities. If you have a disability or limitation, such as those covered by the Americans with Disabilities Act, that requires accommodations to assist you in the search and application process, please email us at candidateaccommodations@rivian.com.
Candidate Data Privacy
Rivian may collect, use and disclose your personal information or personal data (within the meaning of the applicable data protection laws) when you apply for employment and/or participate in our recruitment processes ("Candidate Personal Data"). This data includes contact, demographic, communications, educational, professional, employment, social media/website, network/device, recruiting system usage/interaction, security and preference information. Rivian may use your Candidate Personal Data for the purposes of (i) tracking interactions with our recruiting system; (ii) carrying out, analyzing and improving our application and recruitment process, including assessing you and your application and conducting employment, background and reference checks; (iii) establishing an employment relationship or entering into an employment contract with you; (iv) complying with our legal, regulatory and corporate governance obligations; (v) recordkeeping; (vi) ensuring network and information security and preventing fraud; and (vii) as otherwise required or permitted by applicable law. Rivian may share your Candidate Personal Data with (i) internal personnel who have a need to know such information in order to perform their duties, including individuals on our People Team, Finance, Legal, and the team(s) with the position(s) for which you are applying; (ii) Rivian affiliates; and (iii) Rivian's service providers, including providers of background checks, staffing services, and cloud services. Rivian may transfer or store internationally your Candidate Personal Data, including to or in the United States, Canada, the United Kingdom, and the European Union and in the cloud, and this data may be subject to the laws and accessible to the courts, law enforcement and national security authorities of such jurisdictions.
Please note that we are currently not accepting applications from third party application services.
$228k-285k yearly 32d ago
AI Data Scientist
Vets Hired
Data engineer job in Camp Pendleton South, CA
We are seeking an AI Data Scientist to support defense-related testing, engineering, integration, and sustainment activities for advanced C5ISR systems. This role focuses on data integration, advanced analytics, and the development and deployment of secure AI/ML solutions in both real-world and theoretical environments. The position supports decision-making, experimentation, and operational effectiveness through scalable and portable data-driven solutions.
Key Responsibilities
Conduct data integration across multiple sources and systems
Investigate and develop flexible analytics solutions, including visualization, model building, and decision-support capabilities
Support the secure, rapid development, training, evaluation, debugging, and deployment of AI models
Develop AI/ML solutions applicable to distributed, vehicle-mounted, and on-premises environments
Apply data science methodologies, standards, and frameworks to operational and test environments
Required Skills & Qualifications
Ability to apply data science expertise using established processes, methodologies, and frameworks
Experience developing and applying data science tools in integrated test and/or operational environments
Proficiency with statistical programming languages such as Python, R, and SQL
Experience with programming languages such as C, C++, Java, or JavaScript
Experience designing, working with, and maintaining data architectures
Hands-on experience with AI/ML techniques, including large language models, clustering, decision trees, and neural networks
Strong understanding of advanced statistical techniques, including regression, probability distributions, and statistical testing
Active Secret security clearance
Preferred Qualifications
Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, Computer Engineering, or a related field
Knowledge of statistical analysis and data mining techniques
Three or more years of experience with distributed data and computing tools
Three or more years of experience developing and applying advanced machine learning and statistical models, including regression, simulation, modeling, clustering, and neural networks
Experience visualizing data and presenting analytical results to technical and non-technical stakeholders
Working Place: Camp Pendleton, California, United States
Company: 2026 Jan 29th Virtual Fair - Auria Space
$93k-135k yearly est. 5d ago
Sr Data Engineer
Invitrogen Holdings
Data engineer job in Carlsbad, CA
As part of the Thermo Fisher Scientific team, you'll find meaningful work that impacts the world. Join us in bringing our Mission to life daily to help customers make the world healthier, cleaner, and safer. We provide resources to support your career goals and develop solutions for global challenges like environmental protection, food safety, and cancer cures.
How You Will Get Here:
Proven experience (5+ years) in data engineering with a strong understanding of modern data architecture, pipelines, and operations.
5 years of demonstrated expertise in Databricks and Spark, Python, Apache, and SQL, including pipeline creation, automation, and monitoring
5 or more years of hands-on experience crafting and administering relational database solutions, including AWS Delta Lake, Oracle, and AWS Redshift
5 or more years of experience handling and managing AWS products, including Spark, Glue, Kafka, Elasticsearch, Lambda, S3, Redshift, and others
Strong problem-solving skills with debugging, performance tuning, and AI/ML model deployment in cloud environments
Prior experience with tools like Power BI, Cognos, SQL Server, and Oracle is a strong plus
Deep understanding of data modeling, metadata management, and data governance frameworks
Experience building, scheduling, and monitoring data workflows using Apache Airflow
5 years of solid experience in DevOps/DataOps or equivalent roles, version control platforms (such as GitHub), and continuous integration and delivery pipelines
Demonstrated experience in leading engineering projects and managing project lifecycles
A self-starter with the drive and ability to deliver complex solutions rapidly
Communicate effectively with technical and non-technical personnel in oral and written form
Ensure stable and timely data pipelines that support period-end and quarter-end financial close processes, enabling accurate reporting and reconciliations
Monitor, solve, and optimize finance workflows during close cycles to guarantee data completeness, performance, and SLA compliance
Non-Technical Qualifications:
Effective leadership abilities to build technical choices and steer teams towards data-driven enterprise solutions
Outstanding interpersonal skills to translate sophisticated technical and AI concepts into business-aligned solutions
Demonstrable ability to collaborate with team members across IT, business, and AI/analytics departments
Passion for continuous learning, innovation, and staying ahead of evolving cloud, AI, and data technologies
Compensation and Benefits
The salary range estimated for this position based in California is $103,100.00-$154,700.00.
This position may also be eligible to receive a variable annual bonus based on company, team, and/or individual performance results in accordance with company policy. We offer a comprehensive Total Rewards package that our U.S. colleagues and their families can count on, which includes:
A choice of national medical and dental plans, and a national vision plan, including health incentive programs
Employee assistance and family support programs, including commuter benefits and tuition reimbursement
At least 120 hours paid time off (PTO), 10 paid holidays annually, paid parental leave (3 weeks for bonding and 8 weeks for caregiver leave), accident and life insurance, and short- and long-term disability in accordance with company policy
Retirement and savings programs, such as our competitive 401(k) U.S. retirement savings plan
Employees' Stock Purchase Plan (ESPP) offers eligible colleagues the opportunity to purchase company stock at a discount
For more information on our benefits, please visit: *****************************************************
$103.1k-154.7k yearly Auto-Apply 22d ago
Principal Data Engineer
Americor
Data engineer job in Irvine, CA
Americor is a leading provider of debt relief solutions for people of all backgrounds. We offer various services to help our clients achieve financial freedom, including debt consolidation loans, debt settlement, and credit repair. Our dedication to others sets us apart - not only as a company but as a community of employees who support each other's personal and professional growth. We have been recognized as a ‘Top Place to Work' and ‘Best Company' for our outstanding service and commitment to excellence.
We are currently seeking a Principal/Lead Analytics Engineer to join our rapidly growing team. The specific job title and offer will align with the qualifications of the candidate.
*Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment Visa at this time
Compensation: $120,000-$150,000 annually, dependent on experience.
Location: Irvine, CA Hybrid (In-office every Tuesday).
Schedule: Monday-Friday, with weekend on-call availability required as needed.
Responsibilities:
Own the bridge between data engineering, data science, and business intelligence for data analytics.
Build and maintain ETL/ELT processes using Airflow, dbt, and custom Python scripts (see the sketch after this list).
Design and implement scalable, robust, and efficient data pipelines.
Demonstrate and ensure best practices for data security, data validation and data quality are in place and maintained.
Stay current with the latest trends and developments in data engineering; champion internal adoption and change management of the latest practices.
Other duties as assigned.
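As a sketch of the Airflow + dbt orchestration described above, here is a minimal DAG with a Python extract step followed by a dbt run. The DAG id, schedule, and dbt project path are assumptions for illustration, and Airflow 2.4+ syntax is assumed for the schedule argument.

    # Minimal Airflow DAG: Python extract step, then dbt run (illustrative only)
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def extract_to_s3(**_):
        # Placeholder: pull a batch from the source system and land it in S3.
        print("extracted batch")

    with DAG(
        dag_id="daily_elt",                     # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                      # use schedule_interval on older Airflow versions
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
        dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
        extract >> dbt_run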
Requirements:
7+ years of relevant experience.
Advanced proficiency with Python.
Advanced proficiency with SQL.
Experience with the dbt Core CLI preferred.
Advanced skills with database architecture and data pipelining.
Experience utilizing cloud services to develop ETL/ELT pipelines; AWS Lambda, S3, etc., preferred.
Experience using orchestrator, scheduling, and monitoring tools; Airflow and cron preferred.
Experience leveraging git for codebase source control; experience developing GitHub/Bitbucket CI/CD pipelines preferred.
Experience with Docker and docker-compose preferred.
Proven ability to position oneself to advise and teach across a data organization.
Excellent verbal, written, and interpersonal communication skills.
Education:
Bachelor's degree in a quantitative field or equivalent experience.
Master's degree preferred.
Company Benefits:
Ongoing training and development
Opportunity for career advancement
Medical
Dental
Vision
Company Paid Group Life / AD&D Insurance
7 Paid Holidays and 2 Floating Holiday Days to use at will
Paid Time Off
Flexible Spending/HSA
Employee Assistance Program (EAP)
401(k) match
Referral Program
Americor is proud to be an Equal Opportunity Employer. Americor does not discriminate based on race, color, gender, disability, veteran, military status, religion, age, creed, national origin, sexual identity or expression, sexual orientation, marital status, genetic information, or any other basis prohibited by local, state, or federal law.
* Note to Agencies: Americor Funding, Inc. (the “Company”) has an internal recruiting department. Americor Funding Inc. may supplement that internal capability from time to time with assistance from temporary staffing agencies, placement services, and professional recruiters (“Agency”). Agencies are hereby specifically directed NOT to contact Americor Funding Inc. employees directly in an attempt to present candidates. The Company's policy is for the internal recruiting team or other authorized personnel to present ALL candidates to hiring managers. Any unsolicited resumes sent to Americor Funding Inc. from a third party, such as an Agency, including unsolicited resumes sent to a Company mailing address, fax machine, or email address, directly to Company employees, or to the resume database, will be considered Company property. Americor Funding Inc. will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Americor Funding Inc. will consider any candidate for whom an Agency has submitted an unsolicited resume to have been referred by the Agency free of any charges or fees.
$120k-150k yearly Auto-Apply 40d ago
Senior Big Data Engineer
TP-Link Systems 3.9
Data engineer job in Irvine, CA
ABOUT US:
Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world's top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people's lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.
We believe technology changes the world for the better! At TP-Link Systems Inc, we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology.
Embracing professionalism, innovation, excellence, and simplicity, we aim to assist our clients in achieving remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle.
KEY RESPONSIBILITIES
Develop and maintain the Big Data Platform by performing data cleansing, data warehouse modeling, and report development on large datasets. Collaborate with cross-functional teams to provide actionable insights for decision-making.
Manage the operation and administration of the Big Data Platform, including system deployment, task scheduling, proactive monitoring, and alerting to ensure stability and security.
Handle data collection and integration tasks, including ETL development, data de-identification, and managing data security.
Provide support for other departments by processing data, writing queries, developing solutions, performing statistical analysis, and generating reports.
Troubleshoot and resolve critical issues, conduct fault diagnosis, and optimize system performance.
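Purely as an illustration of the data cleansing and warehouse modeling described in this list, a minimal PySpark sketch might look like the following; the bucket paths, column names, and schema are hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("device_event_cleansing").getOrCreate()

# Hypothetical raw event export; the path and fields are placeholders.
raw = spark.read.json("s3a://example-bucket/raw/device_events/")

cleansed = (
    raw
    .dropDuplicates(["event_id"])                        # drop replayed events
    .filter(F.col("device_id").isNotNull())              # discard records missing a key
    .withColumn("event_ts", F.to_timestamp("event_ts"))  # normalize timestamp strings
    .withColumn("event_date", F.to_date("event_ts"))     # partition key for the warehouse layer
)

# Write a partitioned, query-friendly copy for downstream warehouse modeling and reporting.
cleansed.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/clean/device_events/"
)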
Requirements
REQUIRED QUALIFICATIONS
Bachelor's degree or higher in Computer Science or a related field, with at least three years of experience maintaining a Big Data platform.
Strong understanding of Big Data technologies such as Hadoop, Flink, Spark, Hive, HBase, and Airflow, with proven expertise in Big Data development and performance optimization.
Familiarity with Big Data OLAP tools like Kylin, Impala, and ClickHouse, as well as experience in data warehouse design, data modeling, and report generation.
Proficiency in Linux development environments and Python programming.
Excellent communication, collaboration, and teamwork skills, with a proactive attitude and a strong sense of responsibility.
PREFERRED QUALIFICATIONS
Experience with cloud-based deployments, particularly AWS EMR, with familiarity in other cloud platforms being a plus.
Proficiency in additional languages such as Java or Scala is a plus.
Benefits
Salary Range: $150,000 - $180,000
Free snacks and drinks, and provided lunch on Fridays
Fully paid medical, dental, and vision insurance (partial coverage for dependents)
Contributions to 401k funds
Bi-annual reviews and annual pay increases
Health and wellness benefits, including free gym membership
Quarterly team-building events
At TP-Link Systems Inc., we are continually searching for ambitious individuals who are passionate about their work. We believe that diversity fuels innovation, collaboration, and drives our entrepreneurial spirit. As a global company, we highly value diverse perspectives and are committed to cultivating an environment where all voices are heard, respected, and valued. We are dedicated to providing equal employment opportunities to all employees and applicants, and we prohibit discrimination and harassment of any kind based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Beyond compliance, we strive to create a supportive and growth-oriented workplace for everyone. If you share our passion and connection to this mission, we welcome you to apply and join us in building a vibrant and inclusive team at TP-Link Systems Inc.
Please, no third-party agency inquiries, and we are unable to offer visa sponsorships at this time.
$150k-180k yearly Auto-Apply 60d+ ago
Data Engineer
JBA International 4.1
Data engineer job in Irvine, CA
This job is onsite, based in Irvine, CA.
Essential Duties and Responsibilities
Develop and maintain SQL Server and Snowflake Data Warehouses and Data Lakes with a focus on data governance, security, and performance optimization.
Manage and optimize database solutions within Snowflake, SQL Server, MySQL, and AWS RDS environments.
Build and optimize ETL pipeline processes using Snowpipe, DBT, Boomi, SSIS, and/or Azure Data Factory (see the sketch after this list).
Proficiency in Data Tools such as SSMS, Profiler, Query Store, Redgate, etc.
Perform light operational tasks, including database backup & restore.
Collaborate and support organizational data development projects, working closely with the Business Intelligence and Business Analysts Teams.
Help ensure and maintain database security, integrity, and compliance following the industry's best practices.
Configure and manage data integration platforms and business intelligence tools, ideally including Power BI, Tableau, Power Automate, and scripting languages like Python or R.
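As a rough illustration of the Snowpipe-style ingestion mentioned in the ETL bullet above, the sketch below manually runs the equivalent COPY INTO from an external stage using the snowflake-connector-python package; the account, stage, and table names are invented for the example, and credentials would normally come from a secrets manager rather than being hard-coded.

import snowflake.connector

# Connection details are placeholders for illustration only.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_service",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Manual equivalent of a Snowpipe load: copy newly landed files from an
    # external stage into a staging table.
    cur.execute("""
        COPY INTO STAGING.LOAN_EVENTS
        FROM @LOAN_EVENTS_STAGE
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
    # Simple downstream transformation into a reporting table.
    cur.execute("""
        INSERT INTO ANALYTICS.MARTS.DAILY_LOAN_VOLUME
        SELECT event_date, COUNT(*) AS loan_count
        FROM STAGING.LOAN_EVENTS
        GROUP BY event_date
    """)
finally:
    conn.close()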
Qualifications
7+ years of hands-on experience as a Microsoft SQL Server Developer or Administrator, including developing relational and object-oriented databases.
2+ years of hands-on experience working with Snowflake data warehouses and data lake environments.
Proven mastery of Microsoft SQL Server, including T-SQL.
Hands-on experience developing databases, pipelines, and reporting solutions using BI Reporting & ETL tools such as Power BI, SSRS, SSIS, Azure Data Factory, DBT and other similar industry tool sets.
Proficiency with scripting and automation, including Python, PowerShell, or R.
Experience with, and knowledge of, data integration and analytic tools such as Boomi, Redshift, or Databricks is desirable.
Excellent communication and organizational skills.
Education and/or Experience
Bachelor's or Master's degree in Computer Science, Information Systems, Data Sciences, or equivalent experience.
Language Skills
Ability to read and comprehend simple instructions, short correspondence, and memos. Ability to write simple correspondence. Ability to effectively present information in one-on-one and small group situations to customers, clients, and other employees of the organization.
Mathematical Skills
Ability to add, subtract, multiply, and divide in all units of measure, using whole numbers, common fractions, and decimals. Ability to compute rate, ratio, and percentage and to draw and interpret bar graphs.
Reasoning Ability
Ability to apply common sense understanding to carry out detailed instructions. Ability to deal with problems involving a few concrete variables in standardized situations.
Computer Skills
Advanced knowledge of Microsoft Office applications (Word, Excel, Outlook, etc).
Physical Demands
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions.
While performing the duties of this Job, the employee is regularly required to use hands to finger, handle, or feel; reach with hands and arms and talk or hear. Employees are frequently required to walk and sit. The employee is occasionally required to stand. The employee must occasionally lift and/or move up to 25 pounds. Specific vision abilities required by this job include close vision, distance vision, color vision, peripheral vision, depth perception and ability to adjust focus. Always practice good judgment and refer to the safety guidelines.
Work Environment
The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. Always practice good judgment and refer to the safety guidelines.
The noise level in the work environment is usually moderate.
Salary:
$125k-$135k
Benefits
Our benefit offerings include: Medical, dental, vision, wellness programs, EAP counseling services, FSA & HSA, company sponsored life insurance for employee, voluntary life insurance for employee, spouse and child, AD&D Insurance, voluntary short-term and long-term disability, critical illness insurance, accident insurance, legal assistance, pet insurance, living will and trust preparation discounts, commuter program, annual walking challenge, employee appreciation events and monthly sales awards.
At Acra Lending, we were founded with a clear purpose: to specialize in alternative income loan products that help borrowers qualify for our flexible loan programs. We operate in 39 States, out of nine offices in Arizona, California, Florida, Georgia, Hawaii, Missouri, Nevada, Texas, and Utah along with a growing team of remote professionals across the country. Throughout the mortgage industry, Acra Lending is widely known for responsible lending practices, product innovation and operational efficiency. The foundation of our Company is built on helping our customers by providing Non-QM mortgage solutions for today's borrower in residential and commercial properties across America.
Our Leadership team will encourage you to grow, make time to have fun, and work together to make great things happen. We embrace the strengths and values of each team member. We believe in having diverse perspectives where everyone is included, to serve customers from all walks of life. We reward our employees with a competitive salary and a variety of benefits to help our team members reach their health, retirement, and professional goals along with an exceptional 401k match program. We look forward to meeting you!
$125k-135k yearly 60d+ ago
Data Scientist II
Esri 4.4
Data engineer job in Redlands, CA
Are you passionate about changing the world through machine learning and location intelligence? Join Esri as we leverage the IoT revolution and explosive growth of location data to help organizations extract advanced intelligence, predict significant events, and automate work processes using AI and machine learning.
We're seeking a Data Scientist II who combines technical expertise with strong interpersonal skills to collaborate directly with customers, understand their unique challenges, and develop data-driven solutions that integrate seamlessly with GIS workflows. While prior GIS experience is not required, you'll work closely with GIS experts and gain valuable knowledge in spatial data analytics.
We serve customers across diverse domains including natural resources, defense, commercial industries, public transportation, utilities, and governmental entities in over 160 countries.
Responsibilities
Consult closely with customers to understand their needs and serve as Esri's sole representative
Scope projects, communicate uncertainties and risks, and accurately estimate delivery times
Develop and pitch data science solutions that map business problems to advanced analytics approaches
Serve as a subject matter expert for internal stakeholders
Write production-level code for data analysis and process automation
Build high-quality analytics systems employing techniques from data mining, statistics, and machine learning
Perform feature engineering, model selection, and hyperparameter optimization to achieve high predictive accuracy (see the sketch after this list)
Deploy models to production environments and assist customers in troubleshooting implementations
Implement best practices for geospatial machine learning and develop reusable technical components for demonstrations and rapid prototyping
Design and implement generative AI solutions and agentic systems to enhance customer workflows and automate complex geospatial tasks
Stay updated with the latest trends in machine learning, deep learning, and AI, and incorporate them into project delivery
Quickly become proficient in unfamiliar fields to address diverse customer needs
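To make the model selection and hyperparameter optimization bullet above concrete, here is a minimal scikit-learn sketch on synthetic data; the features, model, and parameter grid are illustrative stand-ins, not Esri's actual workflow.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for customer data; a real project would use engineered
# spatial features (distances, densities, zonal statistics, and so on).
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Model selection and hyperparameter optimization via cross-validated grid search.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10, 20]},
    cv=5,
    scoring="accuracy",
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out accuracy:", accuracy_score(y_test, search.best_estimator_.predict(X_test)))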
Requirements
2+ years of practical machine learning experience or relevant academic/lab work
Hands-on relevant experience in generative AI, large language models, and agentic systems
Exemplify tenacity, determination, creativity, curiosity, and independence in problem-solving
Ability to learn new concepts quickly and implement new technologies rapidly
Flexibility to adapt to diverse customer solutions in a rapidly evolving technical landscape
Experience working with non-technical stakeholders in multiple domains
Proficiency in building and optimizing supervised and unsupervised machine learning models, including deep learning techniques
Fundamental understanding of mathematical and machine learning concepts
Applied statistics experience
Expertise in developing machine learning solutions, preferably in Python
Proficiency in data extraction, transformation, and loading between various systems
Ability to produce compelling data visualizations to convey results
Strong communication skills, including the ability to explain technical concepts to non-technical audiences
Capability to manage multiple projects simultaneously
Bachelor's degree in mathematics, statistics, computer science, physics, or a related field
Recommended Qualifications
Master's degree or PhD in mathematics, statistics, computer science, physics, or a related field
Experience with spatial and GIS concepts, preferably using Esri software
Familiarity with Git, ML libraries such as PyTorch and TensorFlow, and transformer-based large language models (LLMs)
Experience handling large-scale batch/streaming data with big data tools like Apache Spark
Experience with cloud services (AWS, Azure, and more)
#LI-MN1
$81k-109k yearly est. Auto-Apply 6d ago
Lead Gameplay Engineer
Tencent 4.5
Data engineer job in Irvine, CA
About the Hiring Team
Welcome to Lightspeed LA, the first U.S.-based, AAA game development studio for Tencent Games. Lightspeed LA is focused on building open-world games that emphasize an ambitious living world, sophisticated player-driven gameplay, and mature, thoughtful storytelling. We are developing the first in a series of genre-defining titles.
Why join Lightspeed LA?
For us, it's not only about the exciting, new AAA open world game we're developing but also about team values and studio culture. We foster an open and transparent environment where everyone feels comfortable providing constructive feedback about all aspects of our games.
● We encourage the showing of work that's still in progress.
● We don't attack or question unfinished work, we celebrate its promise.
● We have a No Blame Culture where we blame the problem, not the person.
This is a safe place to fail, and we don't view failure as a negative, but as the natural result of pushing boundaries and trying new things. We create together. We face challenges together. We succeed together.
What the Role Entails
Work with and lead a gameplay engineering team to drive development of innovative gameplay content, systems, and experiences
Work with design, production, and engineering leadership to manage the gameplay schedule and backlog
Implement, debug and optimize gameplay systems in C/C++ and other languages
Author lean, high-performance, maintainable code
Cultivate strong cross-disciplinary working relationships based on honesty and transparency among various departments within the studio
Leadership and Management:
Provide technical leadership across multiple gameplay systems and features
Mentor, develop, and learn from engineering colleagues
Be a leadership resource for engineers across the entire gameplay team
Help build and maintain an open and energetic team culture
Conduct regular one-on-one meetings with team members to provide guidance, feedback, and support their professional development
Participate in the recruitment and onboarding of new team members, ensuring a diverse and skilled team that aligns with project requirements
Facilitate team-building activities to promote a positive and collaborative working environment
Who We Look For
Experience working in Unreal Engine
Passion and enthusiasm for creating fun and engaging gameplay through iteration and innovation
Experience bringing multiple large-scale game titles to completion on PC and consoles
7+ years of professional game programming experience
Experience managing direct reports
Experience performing in a senior or principal software engineering role or equivalent
Ability to collaborate with other engineers, designers, artists and other game developers
Ability to work well in a fluid, changing environment as creative challenges evolve
Deep knowledge of gameplay systems involving AI, animation, physics, and character-related systems from engine to game
Expert in C++, debugging, and 3D mathematics
Enjoy working closely with creative and technical team members
Value lean, simple, efficient, well-tested code
Strong written and verbal communication skills
Preferred:
Experience with game console hardware
Understanding of game asset pipelines
Why Join Us?
Location State(s)
US-California-Irvine
The expected base pay range for this position in the location(s) listed above is $116,200.00 to $343,600.00 per year. Actual pay may vary depending on job-related knowledge, skills, and experience. Employees hired for this position may be eligible for a sign on payment, relocation package, and restricted stock units, which will be evaluated on a case-by-case basis. Subject to the terms and conditions of the plans in effect, hired applicants are also eligible for medical, dental, vision, life and disability benefits, and participation in the Company's 401(k) plan. The Employee is also eligible for up to 15 to 25 days of vacation per year (depending on the employee's tenure), up to 13 days of holidays throughout the calendar year, and up to 10 days of paid sick leave per year. Your benefits may be adjusted to reflect your location, employment status, duration of employment with the company, and position level. Benefits may also be pro-rated for those who start working during the calendar year.
Equal Employment Opportunity at Tencent
As an equal opportunity employer, we firmly believe that diverse voices fuel our innovation and allow us to better serve our users and the community. We foster an environment where every employee of Tencent feels supported and inspired to achieve individual and common goals.
$102k-150k yearly est. Auto-Apply 60d+ ago
Data Engineer II
Playstation 4.8
Data engineer job in Aliso Viejo, CA
Why PlayStation?
PlayStation isn't just the Best Place to Play - it's also the Best Place to Work. Today, we're recognized as a global leader in entertainment, producing the PlayStation family of products and services, including PlayStation 5, PlayStation 4, PlayStation VR, PlayStation Plus, acclaimed PlayStation software titles from PlayStation Studios, and more.
PlayStation also strives to create an inclusive environment that empowers employees and embraces diversity. We welcome and encourage everyone who has a passion and curiosity for innovation, technology, and play to explore our open positions and join our growing global team.
The PlayStation brand falls under Sony Interactive Entertainment, a wholly-owned subsidiary of Sony Group Corporation.
Role description
The Data Engineering Team fuels real-time, intelligent data insights for PlayStation's Cloud Streaming offering. In this role, you'll collaborate closely with teams across PlayStation to deliver a scalable and reliable data platform that drives a data-first culture and outlook.
As a Data Engineer II, you will develop and maintain server-side applications for shaping, managing, and transforming data on a large, geographically distributed infrastructure for PlayStation. Your work will include developing JVM-based applications using Kotlin, Scala, or Java, Flink jobs, and APIs to provide real-time data and insights across the Cloud Streaming ecosystem and ensure seamless data flow throughout our data pipeline. You bring passion and expertise to continuously improve the value that our data platform provides to data producers and consumers across the organization, providing tooling, standard methodologies, and state-of-the-art technology to empower data-driven decisions across the company. You are self-motivated and thrive in a distributed team environment - comfortable taking initiative, handling your own work, and collaborating effectively across time zones and communication channels, embodying our one-team mantra!
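The role description above mentions Flink stream-processing jobs built primarily in JVM languages; purely for illustration (and keeping to Python like the other sketches in this document), a minimal stateless PyFlink job of the kind described might look like this, with the event shape and bitrate threshold invented for the example.

from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(1)

# Bounded toy source standing in for a Kafka topic of streaming-session events:
# (session_id, bitrate_kbps). Field names and values are illustrative only.
sessions = env.from_collection([
    ("a1", 3500),
    ("a2", 1200),
    ("a3", 4800),
])

# Stateless transformation: tag sessions whose bitrate falls below a threshold.
flagged = sessions.map(lambda s: (s[0], s[1], s[1] < 2000))
flagged.print()

env.execute("session_quality_check")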
Responsibilities
Design and implement middleware and backend systems capable of handling high throughput and load
Develop and maintain stream processing jobs using Flink (stateful / stateless), and implement scalable ETL jobs that enable our partners and customers to glean intelligent insight
Optimize performance, reliability, and monitoring of the data pipeline stack
Support migration efforts of our on-premises Kafka to new cloud technologies
Perform in-depth analysis of functional or business requirements
Contribute to architectural decisions for PlayStation's cloud streaming data pipeline
Liaise with product domain teams to help turn data into information, and information into business value
Required Qualifications
Bachelor's or Master's degree in Computer Science or a related field or equivalent practical experience
3+ years of experience in software development and backend engineering developing and maintaining high throughput and concurrent systems with high availability and fault tolerance
Proficient in JVM Languages (Kotlin, Scala, Java) and Python
Experience with Apache Kafka, Flink, Spark, and SQL
Knowledge of software paradigms like functional programming, object oriented programming, TDD
Knowledge of web technologies including REST, JSON, gRPC, and WebSockets
Expertise in data formats (JSON, Avro, Protobuf) as well as unstructured and semi-structured data
Experience with ETL, stream processing, real-time pipelines, and batch processing
Familiarity with software development tools and processes like Git and CI/CD
Track record of monitoring and analyzing system performance, isolating issues or bottlenecks that could impact reliability, performance, and scalability
Self-motivated, strong interpersonal and communication skills; ability to work with geographically remote teams
Experience with Amazon Web Services at enterprise scale including, but not limited to, OpenSearch, MSK, and EKS is a plus
Experience with Kubernetes, Docker, and cloud deployment technologies is a plus
#LI-RG1
Please refer to our Candidate Privacy Notice for more information about how we process your personal information, and your data protection rights.
At SIE, we consider several factors when setting each role's base pay range, including the competitive benchmarking data for the market and geographic location.
Please note that the base pay range may vary in line with our hybrid working policy and individual base pay will be determined based on job-related factors which may include knowledge, skills, experience, and location.
In addition, this role is eligible for SIE's top-tier benefits package that includes medical, dental, vision, matching 401(k), paid time off, wellness program and coveted employee discounts for Sony products. This role also may be eligible for a bonus package.
The estimated base pay range for this role is listed below.
$150,100-$225,100 USD
Equal Opportunity Statement:
Sony is an Equal Opportunity Employer. All persons will receive consideration for employment without regard to gender (including gender identity, gender expression and gender reassignment), race (including colour, nationality, ethnic or national origin), religion or belief, marital or civil partnership status, disability, age, sexual orientation, pregnancy, maternity or parental status, trade union membership or membership in any other legally protected category.
We strive to create an inclusive environment, empower employees and embrace diversity. We encourage everyone to respond.
PlayStation is a Fair Chance employer and qualified applicants with arrest and conviction records will be considered for employment.
$150.1k-225.1k yearly Auto-Apply 11d ago
Senior Gameplay Engineer
Lightspeed La 4.6
Data engineer job in Irvine, CA
About the Hiring Team
Welcome to Lightspeed LA, the first U.S.-based, AAA game development studio for Tencent Games. Lightspeed LA is focused on building open-world games that emphasize an ambitious living world, sophisticated player-driven gameplay, and mature, thoughtful storytelling. We are developing the first in a series of genre-defining titles.
Why join Lightspeed LA?
For us, it's not only about the exciting, new AAA open world game we're developing but also about team values and studio culture. We foster an open and transparent environment where everyone feels comfortable providing constructive feedback about all aspects of our games.
● We encourage the showing of work that's still in progress.
● We don't attack or question unfinished work, we celebrate its promise.
● We have a No Blame Culture where we blame the problem, not the person.
This is a safe place to fail, and we don't view failure as a negative, but as the natural result of pushing boundaries and trying new things. We create together. We face challenges together. We succeed together.
What the Role Entails
Lightspeed LA is seeking a talented and enthusiastic senior gameplay engineer to join our new studio in sunny Irvine, California. The ideal candidate will have a strong passion for games and solving ambitious technical and creative problems. Join us and help build a team and culture of mutual respect and passion that pushes boundaries while having lots of fun!
Responsibilities:
Design, implement, and support quality gameplay systems and tools
Collaborate directly and proactively with engineers, designers, artists, and QA to explore and identify needs and opportunities
Cultivate strong cross-disciplinary working relationships based on honesty and transparency
Author lean, high-performance, maintainable code
Actively profile and optimize the game's codebase
Participate in the testing and documentation of game systems and tools
Mentor, develop, and learn from colleagues
Help build and maintain an open and energetic team culture
Who We Look For
Passion and enthusiasm for creating fun and engaging gameplay through iteration and innovation
5+ years of professional game programming experience
Experience bringing a large-scale game title to completion on PC and consoles
Expert in C++, debugging, and 3D math
Enjoy working closely with creative and technical team members
Strong written and verbal communication skills
Value lean, simple, efficient, well-tested code
Preferred:
Experience working with either AAA proprietary engines or third-party engines like Unreal and Unity
Experience with game console hardware
Understanding of game asset pipelines
Deep knowledge of animation, physics, AI, and/or networking
Why Join Us?
Location State(s)
US-California-Irvine
The expected base pay range for this position in the location(s) listed above is $116,200.00 to $269,100.00 per year. Actual pay may vary depending on job-related knowledge, skills, and experience. Employees hired for this position may be eligible for a sign on payment, relocation package, and restricted stock units, which will be evaluated on a case-by-case basis. Subject to the terms and conditions of the plans in effect, hired applicants are also eligible for medical, dental, vision, life and disability benefits, and participation in the Company's 401(k) plan. The Employee is also eligible for up to 15 to 25 days of vacation per year (depending on the employee's tenure), up to 13 days of holidays throughout the calendar year, and up to 10 days of paid sick leave per year. Your benefits may be adjusted to reflect your location, employment status, duration of employment with the company, and position level. Benefits may also be pro-rated for those who start working during the calendar year.
Equal Employment Opportunity at Tencent
As an equal opportunity employer, we firmly believe that diverse voices fuel our innovation and allow us to better serve our users and the community. We foster an environment where every employee of Tencent feels supported and inspired to achieve individual and common goals.
$82k-121k yearly est. Auto-Apply 60d+ ago
Data Engineer
Luxoft
Data engineer job in Irvine, CA
Project description
Luxoft is looking for a Senior Data Engineer for development of a new application to be used by investors and investment committees to review their portfolio data, tailored to specific user groups.
Responsibilities
Work with complex data structures and provide innovative solutions for complex data delivery requirements
Evaluate new and alternative data sources and new integration techniques
Contribute to data models and designs for the data warehouse
Establish standards for documentation and ensure your team adheres to those standards
Influence and develop a thorough understanding of standards and best practices used by your team
Skills
Must have
Seasoned data engineer with hands-on experience in AWS conducting end-to-end data analysis and data pipeline build-out using Python, Glue, S3, Airflow, DBT, Redshift, RDS, etc.
Extensive Python API design experience, preferably FastAPI (see the sketch after this skills section)
Strong SQL knowledge
Nice to have
PySpark
Databricks
ETL design
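As a hedged illustration of the FastAPI requirement above, a minimal portfolio-data endpoint could look like the following; the route, model fields, and in-memory store are hypothetical, standing in for data that would really be served from Redshift or RDS.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Holding(BaseModel):
    ticker: str
    quantity: float
    market_value: float

# Hypothetical in-memory store standing in for warehouse-backed portfolio data.
PORTFOLIOS = {
    "growth-fund": [Holding(ticker="ACME", quantity=100, market_value=12500.0)],
}

@app.get("/portfolios/{portfolio_id}/holdings", response_model=list[Holding])
def get_holdings(portfolio_id: str) -> list[Holding]:
    # Return the holdings for the requested portfolio (empty list if unknown).
    return PORTFOLIOS.get(portfolio_id, [])

Such a sketch would typically be run locally with something like "uvicorn main:app --reload", assuming the file is saved as main.py.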
$99k-139k yearly est. 1d ago
Data & Applied M/L Engineer
Gatekeeper Systems 3.3
Data engineer job in Irvine, CA
At Gatekeeper Systems, we're revolutionizing retail loss prevention and customer safety through a powerful combination of physical deterrents and cutting-edge technology, including AI, computer vision, and facial recognition. As a global leader with over 25 years of industry excellence and a growing, diverse team of 500 employees across offices in North America, Europe, Australia, and Asia, we're driven by innovation, integrity, and impact. Join us and be part of a mission-focused team that's making a real difference in the future of retail, providing innovative solutions and services that redefine industry standards.
POSITION SUMMARY:
WHAT WE OFFER…
Join the team at Gatekeeper Systems and watch your career grow! We offer competitive compensation and benefits packages that include:
Attractive Total Compensation Package, including annual bonus
Comprehensive healthcare benefits including medical, dental, and vision coverage; Life/ADD/LTD insurance; FSA/HSA options.
401(k) Plan with Employer Match
Generous Paid Time Off (PTO) policy
Observance of 11 paid company holidays
Various Employee Engagement Events
Exciting Growth Opportunities
Positive Company Culture
This role bridges the gap between core data engineering and practical machine learning applications. Primarily, you will be a data platform engineer responsible for owning core data pipelines, data models, and quality controls that power Gatekeeper analytics and future data products.
Secondarily, you will drive the production lifecycle of our shopping cart computer-vision feature. You will orchestrate the data workflows that interface with our machine learning models to ensure accurate cart classification, while leveraging the FaceFirst ML team for deeper capacity. You will collaborate with BI Analysts, software engineers, and product teams to transform raw data into actionable insights.
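The orchestration described in this summary could, for example, be expressed as an Airflow DAG along the lines of the sketch below (Airflow is named later under Tooling); the task names, schedule, and placeholder task bodies are invented for illustration only.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Task bodies are placeholders; a real workflow would pull cart telemetry,
# call the classifier, and publish accuracy metrics.
def extract_cart_events():
    print("extract new cart-exit events from the warehouse")

def score_events():
    print("send events to the cart-empty classifier and store predictions")

def publish_metrics():
    print("aggregate prediction-quality metrics for monitoring dashboards")

with DAG(
    dag_id="cart_classification_workflow",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_cart_events", python_callable=extract_cart_events)
    score = PythonOperator(task_id="score_events", python_callable=score_events)
    publish = PythonOperator(task_id="publish_metrics", python_callable=publish_metrics)

    extract >> score >> publish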
ESSENTIAL JOB FUNCTIONS (including, but not limited to):
Data Platform, Pipelines, & Quality (Primary Focus)
Pipeline Design & Operation: Design, build, and operate scalable ELT/ETL pipelines that ingest data from IoT/smart-cart telemetry, video events, operational systems, and external partners into our cloud data lake/warehouse.
Infrastructure Management: Build and maintain robust data infrastructure, including databases (SQL and NoSQL), data warehouses, and data integration solutions.
Data Modeling: Establish canonical data models and definitions (schemas, event taxonomy, metrics) so teams can trust and reuse the same data across products, BI, and analytics.
Data Quality Assurance: Own data quality end-to-end by implementing validation rules, automated tests, anomaly detection, and monitoring/alerting to prevent and quickly detect regressions.
Consistency & Governance: Drive data consistency improvements across systems (naming, identifiers, timestamps, joins, deduplication) and document data contract expectations with producing teams.
Root Cause Analysis: Troubleshoot pipeline and data issues, perform root-cause analysis, and implement durable fixes that improve reliability and reduce operational load.
Collaboration & Analytics: Partner with BI Analysts and Product teams to create curated datasets and self-serve analytics foundations (e.g., marts/semantic layer), as well as support internally facing dashboards to communicate system health.
Applied ML Ownership - Smart Exit Cart-Empty Classifier (Secondary Focus)
Lifecycle Management: Own the production lifecycle for the cart classification capability, including data collection/labeling workflows, evaluation, threshold tuning, and safe release/rollback processes.
Pipeline Implementation: Implement and optimize machine learning pipelines, from feature engineering and model training to deployment and monitoring in production.
Evaluation & Monitoring: Build and maintain an evaluation harness (offline metrics + repeatable test sets) and ongoing monitoring (accuracy drift, data drift, false positive/negative analysis).
Cross-Team Collaboration: Collaborate with the FaceFirst ML team to incorporate improvements (model updates, feature changes) while keeping Gatekeeper's production integration stable.
Integration: Work with software engineers to ensure the classifier integrates cleanly into the product workflow with robust telemetry, logging, and operational runbooks.
QUALIFICATION REQUIREMENTS
The requirements listed below are representative of the knowledge, skill and/or ability required.
Exemplifies professionalism in all aspects of day-to-day duties and responsibilities.
Self-aware and open to learning about personal effectiveness in the workplace.
Exhibits a positive attitude toward the vision, policies, and goals of Gatekeeper Systems.
Constantly strives to improve performance and effectiveness of the team and the company.
EDUCATION AND/OR EXPERIENCE
Work Experience: 5+ years
Core Engineering: Strong experience building and operating production ELT/ETL pipelines and data warehouses.
Programming: Fluency in SQL and Python (or similar) for data transformation, validation, and automation.
Cloud Platforms: Experience with cloud data platforms (Azure and/or GCP), including object storage, security/access controls, and cost-aware design.
Tooling: Hands-on experience with orchestration and transformation tooling (e.g., Airflow/Prefect) and batch processing frameworks (e.g., Spark/Databricks).
Quality Practices: Practical experience implementing data quality practices (tests, monitoring/alerting, lineage/documentation) and improving data consistency across systems.
Operations: Collaborate with operational teams to identify, diagnose, and remediate in-field system issues.
Bachelor's Degree in Computer Science, Software Engineering, Information Systems, Mathematics, Statistics, or a related technical field.
SALARY RANGE
$150,000 to $175,000; 5% AIP
PHYSICAL DEMANDS
The physical demands described here are representative of those that must be met by a team member to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Repetitive motions and routine use of standard office equipment such as computers, telephones, copiers/scanners and filing cabinets.
Ability to see, speak, walk, hear, and stand; use hands/fingers to handle or feel; climb stairs; stoop; and carry/lift up to 50 lbs.
Ability to sit at a desk.
Specific vision abilities required include close vision, color vision, peripheral vision, depth perception, and the ability to adjust focus.
Regularly utilizes manual dexterity to put parts or pieces together quickly and accurately.
DISCLAIMER
This Job Description is a general overview of the requirements for the position. It is not designed to contain, nor should it be interpreted as being all inclusive of every task which may be assigned or required. It is subject to change, in alignment with company/department needs and priorities.
Gatekeeper Systems, Inc., is an equal opportunity employer. We are committed to developing a diverse workforce and cultivating an inclusive environment. We value diversity and believe that we are strengthened by the differences in our experiences, thinking, culture, and background. We strongly encourage applications from candidates who demonstrate that they can contribute to this goal. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status or any protected basis.
$150k-175k yearly Auto-Apply 4d ago
Lead Gameplay Engineer
Lightspeed La 4.6
Data engineer job in Irvine, CA
About the Hiring Team
Welcome to Lightspeed LA, the first U.S.-based, AAA game development studio for Tencent Games. Lightspeed LA is focused on building open-world games that emphasize an ambitious living world, sophisticated player-driven gameplay, and mature, thoughtful storytelling. We are developing the first in a series of genre-defining titles.
Why join Lightspeed LA?
For us, it's not only about the exciting, new AAA open world game we're developing but also about team values and studio culture. We foster an open and transparent environment where everyone feels comfortable providing constructive feedback about all aspects of our games.
● We encourage the showing of work that's still in progress.
● We don't attack or question unfinished work, we celebrate its promise.
● We have a No Blame Culture where we blame the problem, not the person.
This is a safe place to fail, and we don't view failure as a negative, but as the natural result of pushing boundaries and trying new things. We create together. We face challenges together. We succeed together.
What the Role Entails
Work with and lead a gameplay engineering team to drive development of innovative gameplay content, systems, and experiences
Work with design, production, and engineering leadership to manage the gameplay schedule and backlog
Implement, debug and optimize gameplay systems in C/C++ and other languages
Author lean, high-performance, maintainable code
Cultivate strong cross-disciplinary working relationships based on honesty and transparency among various departments within the studio
Leadership and Management:
Provide technical leadership across multiple gameplay systems and features
Mentor, develop, and learn from engineering colleagues
Be a leadership resource for engineers across the entire gameplay team
Help build and maintain an open and energetic team culture
Conduct regular one-on-one meetings with team members to provide guidance, feedback, and support their professional development
Participate in the recruitment and onboarding of new team members, ensuring a diverse and skilled team that aligns with project requirements
Facilitate team-building activities to promote a positive and collaborative working environment
Who We Look For
Experience working in Unreal Engine
Passion and enthusiasm for creating fun and engaging gameplay through iteration and innovation
Experience bringing multiple large-scale game titles to completion on PC and consoles
7+ years of professional game programming experience
Experience managing direct reports
Experience performing in a senior or principal software engineering role or equivalent
Ability to collaborate with other engineers, designers, artists and other game developers
Ability to work well in a fluid, changing environment as creative challenges evolve
Deep knowledge of gameplay systems involving AI, animation, physics, and character-related systems from engine to game
Expert in C++, debugging, and 3D mathematics
Enjoy working closely with creative and technical team members
Value lean, simple, efficient, well-tested code
Strong written and verbal communication skills
Preferred:
Experience with game console hardware
Understanding of game asset pipelines
Why Join Us?
Location State(s)
US-California-Irvine
The expected base pay range for this position in the location(s) listed above is $116,200.00 to $343,600.00 per year. Actual pay may vary depending on job-related knowledge, skills, and experience. Employees hired for this position may be eligible for a sign on payment, relocation package, and restricted stock units, which will be evaluated on a case-by-case basis. Subject to the terms and conditions of the plans in effect, hired applicants are also eligible for medical, dental, vision, life and disability benefits, and participation in the Company's 401(k) plan. The Employee is also eligible for up to 15 to 25 days of vacation per year (depending on the employee's tenure), up to 13 days of holidays throughout the calendar year, and up to 10 days of paid sick leave per year. Your benefits may be adjusted to reflect your location, employment status, duration of employment with the company, and position level. Benefits may also be pro-rated for those who start working during the calendar year.
Equal Employment Opportunity at Tencent
As an equal opportunity employer, we firmly believe that diverse voices fuel our innovation and allow us to better serve our users and the community. We foster an environment where every employee of Tencent feels supported and inspired to achieve individual and common goals.
The average data engineer in Hemet, CA earns between $85,000 and $161,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.