
Data engineer jobs in Santa Barbara, CA - 97 jobs

  • SCCM Endpoint Engineer (LARGELY REMOTE/NO C2C)

    Amerit Consulting (4.0 company rating)

    Data engineer job in Oxnard, CA

    Our client, a Medical Center facility under the aegis of a California Public Ivy university and one of the largest health delivery systems in California, seeks an accomplished SCCM Endpoint Engineer.

    NOTE: This is a largely remote role, and only W2 candidates are eligible (no C2C/1099). *** Candidate must be authorized to work in the USA without requiring sponsorship ***

    Position: SCCM Endpoint Engineer (Job Id # 3167240)
    Location: Los Angeles, CA 90024 (Hybrid - 99% remote / 1% onsite)
    Duration: 10 months + strong possibility of extension

    The candidate will travel onsite to learn and view the client's setup and will come onsite as needed for team building or vendor engagements. Onsite visits are required about 2-3 times per year.

    Required skills and experience: Ability to monitor and report on statuses of endpoints utilizing SCCM/MECM and Intune. Understanding of networking and Active Directory. Advanced knowledge of Microsoft Windows 10, Mac OS, Intune, Autopilot, SCCM/MECM, JAMF, and other endpoint management solutions. Advanced knowledge of ISS Microsoft Office products (O365, Office 2016, Outlook, Exchange and OWA). Understanding of project plans, presentations, procedures, diagrams, and other technical documentation. Understanding of networking protocols and standards: DNS, DHCP, WINS, TCP/IP, etc. Ability to work independently with minimal supervision as well as in a team environment. Ability to follow escalation procedures within the TSD Team and under the ISS umbrella. Ability to establish standards and procedures for best practices, enabling commitments to established SLAs. Ability to research and test new technologies and processes. Demonstrated ability to develop creative solutions to complex problems. Understanding of various desktop management systems such as anti-virus software, patch management, full disk encryption, SSO/tap-badge (Imprivata) software, and software delivery. Ability to prioritize, organize, and execute work assignments. Ability to communicate the status of various systems to management, leadership and/or support personnel. Ability to skillfully react to a fluid and constantly changing work environment. Ability to train, delegate, and review the work of staff members. Advanced knowledge of ticketing systems (ServiceNow). Strong technical abilities with excellent communication and interpersonal skills. Advanced knowledge of cloud computing (Azure, Intune, Autopilot, DaaS, Box, OneDrive). Advanced knowledge of standard desktop imaging and upgrade procedures: SCCM/MECM/MDT, Intune, OSD, PXE, thin vs. thick images. Advanced knowledge of VPN remote software and RDP setup. Advanced knowledge of Windows and Citrix based printing. Understanding of the ITIL overview and tier-structure support using a ticket tracking system. Advanced knowledge of Apple OSX and iOS operating systems and platforms. Advanced knowledge of virtualization technologies (Citrix XenApp, XenDesktop, VMware, Azure Virtual Desktop, Windows 365, Amazon Workspaces). Advanced knowledge of IT security applications (Cisco AMP, Aruba OnGuard, DUO, FireEye, Windows Defender, Windows BitLocker, Checkpoint Encryption, and USB allowlisting).

    Bhupesh Khurana, Lead Technical Recruiter, Email - *****************************

    Company Overview: Amerit Consulting is an extremely fast-growing staffing and consulting firm.
Amerit Consulting was founded in 2002 to provide consulting, temporary staffing, direct hire, and payrolling services to Fortune 500 companies nationally, as well as to small and mid-sized organizations on a local and regional level. Currently, Amerit has over 2,000 employees in 47 states. We develop and implement solutions that help our clients operate more efficiently, deliver greater customer satisfaction, and see a positive impact on their bottom line. We create value by bringing together the right people to achieve results. Our clients and employees say they choose to work with Amerit because of how we work with them - with service that exceeds their expectations and a personal commitment to their success. Our deep expertise in human capital management has fueled our expansion into direct hire placements, temporary staffing, contract placements, and additional staffing and consulting services that propel our clients' businesses forward. Amerit Consulting provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. Applicants with criminal histories are considered in a manner that is consistent with local, state, and federal laws.
    $94k-138k yearly est. 3d ago
  • Data Engineer

    KBR (4.7 company rating)

    Data engineer job in Camarillo, CA

    Title: Data Engineer. Belong. Connect. Grow. with KBR!

    KBR's National Security Solutions team provides high-end engineering and advanced technology solutions to our customers in the intelligence and national security communities. In this position, your work will have a profound impact on the country's most critical role - protecting our national security.

    Why Join Us? Innovative Projects: KBR's work is at the forefront of engineering, logistics, operations, science, program management, mission IT and cybersecurity solutions. Collaborative Environment: Be part of a dynamic team that thrives on collaboration and innovation, fostering a supportive and intellectually stimulating workplace. Impactful Work: Your contributions will be pivotal in designing and optimizing defense systems that ensure national security and shape the future of space defense. Come join the ITEA award-winning TRMC BDKM team and be a part of the team responsible for revolutionizing how analysis is performed across the entire Department of Defense!

    Key Responsibilities: As a Data Engineer, you will be a critical part of the team responsible for enabling the development of data-driven decision analysis products through the innovative application and promotion of novel methods from data science, machine learning, and operations research to provide robust and flexible testing and evaluation capabilities to support DoD modernization. Analytic Experience: Candidate will be a part of the technical team responsible for providing analytic consulting services, supporting analytic workflow and product development and testing, promoting the user adoption of methods and best practices from data science, conducting applied methods projects, and supporting the creation of analysis-ready data. Onsite Support: Candidate will be the face of the CHEETAS Team and will be responsible for ensuring stakeholders have the analytical tools, data products, and reports they need to make insightful recommendations based on data-driven analysis. Stakeholder Assistance: Candidate will directly assist both analyst/technical and non-analyst/non-technical stakeholders with the analysis of DoD datasets, demonstrating the 'art of the possible' to stakeholders and VIPs with insights gained from analysis of DoD Test and Evaluation (T&E) data. Communication: Must effectively communicate at both a programmatic and technical level. Although you may potentially be the only team member physically on-site, you will not be alone; you will have support from other data science team members as well as the software engineering and system administration teams. Technical Support: Candidate will be responsible for running and operating CHEETAS (and other tools); demonstrating these tools to stakeholders and VIPs; conveying analysis results; adapting internally developed tools, notebooks, and reports to meet emerging needs; gathering use cases, requirements, gaps, and needs from stakeholders and, for larger development items, providing that information as feature requests or bug reports to the CHEETAS development team; and performing impromptu hands-on training sessions with end users and potentially troubleshooting problems from within closed networks without internet access (with support from distributed team members). Independent Work: Candidate must be self-motivated and capable of working independently with little supervision or direct tasking.
Work Environment: Location: Onsite; Honolulu, HI. Travel Requirements: This position will require travel of 25%, with potential surge to 50%, to support end users located at various DoD ranges and labs located across the US (including Alaska and Hawaii). When not supporting a site, this position can work remotely or from a nearby KBR office (if available and desired). Working Hours: Standard. Although you may potentially be the only team member physically on-site providing support, you will not be alone.

Basic Qualifications: Security Clearance: An active or current TS/SCI clearance is required. Education: A degree in operations research, engineering, applied math, statistics, computer science, or information technology, with 15+ years of experience within DoD preferred. Candidates with 10-15 years of DoD experience will be considered on a case-by-case basis. Entry level candidates will not be considered. Technical Experience: Previous experience must include five (5) years of hands-on experience in big data analytics and five (5) years of hands-on experience with object-oriented and functional languages (e.g., Python, R, C++, C#, Java, Scala, etc.). Data Experience: Experience in dealing with imperfections in data. Experience should demonstrate competency in key concepts from software engineering, computer programming, statistical analysis, data mining algorithms, machine learning, and modeling sufficient to inform technical choices and infrastructure configuration. Data Analytics: Proven analytical skills and experience in preparing and handling large volumes of data for ETL processes. Experience should include working with teams in the development and interpretation of the results of analytic products with DoD-specific data types. Big Data Infrastructure: Experience in the installation, configuration, and use of big data infrastructure (Spark, Trino, Hadoop, Hive, Neo4J, JanusGraph, HBase, MS SQL Server with Polybase, and VMware, as examples). Experience in implementing data visualization solutions.

Qualifications Required: Experience using scripting languages (Python and R) to process, analyze, and visualize data. Experience using notebooks (Jupyter Notebooks and RMarkdown) to create reproducible and explainable products. Experience using interactive visualization tools (RShiny, py Shiny, Dash) to create interactive analytics. Experience generating and presenting reports, visualizations, and findings to customers. Experience building and optimizing 'big data' data pipelines, architectures, and data sets. Experience cleaning and preparing time series and geospatial data for analysis. Experience working with Windows, Linux, and containers. Experience querying databases using SQL and working with and configuring distributed storage and computing environments to conduct analysis (Spark, Trino, Hadoop, Hive, Neo4J, JanusGraph, MongoDB, Accumulo, and HBase, as examples). Experience working with code repositories in a collaborative team. Ability to make insightful recommendations based on data-driven analysis and customer interactions. Ability to effectively communicate both orally and in writing with customers and teammates. Ability to speak and present findings in front of large technical and non-technical groups. Ability to create documentation and repeatable procedures to enable reproducible research. Ability to create training and educational content for novice end users on the use of tools and novel analytic methods.
Ability to solve problems, debug, and troubleshoot while under pressure and time constraints is required. Should be self-motivated to design, develop, enhance, reengineer, or integrate software applications to improve the quality of data outputs available for end users. Ability to work closely with data scientists to develop and subsequently implement the best technical design and approach for new analytical products. Strong analytical skills related to working with both structured and unstructured datasets. Excellent programming, testing, debugging, and problem-solving skills. Experience designing, building, and maintaining both new and existing data systems and solutions. Understanding of ETL processes, how they function, and experience implementing ETL processes is required. Knowledge of message queuing, stream processing, and extracting value from large disparate datasets. Knowledge of software design patterns and Agile development methodologies is required. Knowledge of methods from operations research, statistical and machine learning, data science, and computer science sufficient to select appropriate methods to enable data preparation and computing architecture configuration to implement these approaches. Knowledge of computer programming concepts, data structures, and storage architecture, to include relational and non-relational databases, distributed computing frameworks, and modeling and simulation experimentation, sufficient to select appropriate methods to enable data preparation and computing architecture configuration to implement these approaches.

Scheduled Weekly Hours: 40. Basic Compensation: $119,900 - $179,800. The offered rate will be based on the selected candidate's knowledge, skills, abilities and/or experience and in consideration of internal parity. Additional Compensation: KBR may offer bonuses, commissions, or other forms of compensation to certain job titles or levels, per internal policy or contractual designation. Additional compensation may be in the form of a sign-on bonus, relocation benefits, short-term incentives, long-term incentives, or discretionary payments for exceptional performance.

Ready to Make a Difference? If you're excited about making a significant impact in the field of space defense and working on projects that matter, we encourage you to apply and join our team at KBR. Let's shape the future together.

KBR Benefits: KBR offers a selection of competitive lifestyle benefits which could include a 401K plan with company match, medical, dental, vision, life insurance, AD&D, flexible spending account, disability, paid time off, or a flexible work schedule. We support career advancement through professional training and development.

Belong, Connect and Grow at KBR: At KBR, we are passionate about our people and our Zero Harm culture. These inform all that we do and are at the heart of our commitment to, and ongoing journey toward, being a People First company. That commitment is central to our team of teams philosophy and fosters an environment where everyone can Belong, Connect and Grow. We Deliver - Together. KBR is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, disability, sex, sexual orientation, gender identity or expression, age, national origin, veteran status, genetic information, union status and/or beliefs, or any other characteristic protected by federal, state, or local law.
    $119.9k-179.8k yearly 60d+ ago
  • Data Scientist

    Del Rey Systems & Technology, Inc. (4.3 company rating)

    Data engineer job in Oxnard, CA

    Data Scientist II (2 positions). STATUS: Contingency - Announcement of Award Imminent. SSC: Active Secret Security Clearance (required). SALARY: Please see labor category posted below.

    SUMMARY: The Naval Surface Warfare Center, Port Hueneme Division (NSWC PHD) is part of the larger Naval Sea Systems Command. The NSWC PHD mission is to provide research, development, test and evaluation, and in-service engineering and logistics support to the U.S. Navy, other military services, and government agencies. Its focus areas include combat systems, unmanned systems, surface ship systems, and information systems.

    LABOR CATEGORIES: All positions require an Active Secret Clearance and experience in DoD. Data Scientist II - $105,872.00. Desired Education: Bachelor's degree in a related technical field. Desired Experience: Three (3) years' experience with software integration or testing, including analyzing and implementing test plans and scripts. Experience with frequent scripting language use, such as Python and R, and using packages commonly used in data science applications or advanced analytics. Experience with data science, data mining, statistics, or graph algorithms to support analytics objectives.

    COMPANY OVERVIEW: DEL REY Systems & Technology, Inc. (DEL REY) is a small Veteran-owned defense contractor founded in 1995 and headquartered in San Diego, California. We are an equal opportunity employer and believe in recruiting and developing the very best professionals in the field. Although our corporate office is in California, we have employees supporting our customers from coast to coast and in many states in between. For employment consideration, please submit your resume to this posting in MS-Word and let us know the position for which you are applying. DEL REY is proud to offer competitive compensation and a comprehensive benefit package. Employee benefits include both Traditional 401k and Roth retirement accounts; medical, dental, vision, FSA, vacation, sick leave, basic term life insurance, an Employee Assistance Program, and voluntary supplemental insurance. DEL REY complies with applicable Federal civil rights laws and does not discriminate. We welcome all applicants, as we are always looking for skilled employees possessing a desire to join and contribute to an employee-focused company committed to sustaining superior customer satisfaction. For employment consideration, please respond to the job board where we have our posting or to our Career Page and reference the position which you are seeking.

    DISCLAIMER: The information in this job description indicates the general nature of the opportunity. It should not be construed as a complete or final description. *** Time-Sensitive - Apply if interested ***
    $105.9k yearly 4d ago
  • Data Engineer

    Evidation (4.2 company rating)

    Data engineer job in Santa Barbara, CA

    Our small data science team is changing the way healthcare decisions are made. We're taking advantage of new technologies and the latest in statistics and machine learning to rapidly analyze new kinds of healthcare data. We have one-of-a-kind access to health and behavior datasets, and we're using it to change the world for the better. Our distributed data processing and analysis pipeline is built on top of technologies like Spark, Hadoop, Elasticsearch, Python, and Scala, and we need brilliant engineers to help us scale it and adapt it to new challenges. This is the perfect job if you have a software engineering background and want to get involved in healthcare and big data engineering. We have offices in Menlo Park, CA and Santa Barbara, CA and a Surf Air membership so you can rove between the offices. This position is in Santa Barbara, CA.

    Job Description - Responsibilities: Own our data ETL pipelines. Investigate, prototype, and validate new analytics systems. Optimize our analytics systems for performance, cost, and availability. Design and implement new data analytics systems using top-notch software engineering practices. Work closely with our Data Scientists to deliver dashboards, insights, and data pipelines to clients.

    Qualifications - Requirements: Background in quantitative fields (Computer Science, Engineering, Math, Physics). You're a strong analytical thinker and coder. Think ACM-ICPC, IOI, and IPSC. You're a great software engineer who writes modular, extensible, well-designed systems. You have solid experience with Python and SQL. You value learning, personal growth, teamwork, and excelling at what you do.

    Additional Information: All your information will be kept confidential according to EEO guidelines.
    $114k-161k yearly est. 18h ago
  • Real-Time Data Engineer II

    AppFolio (4.6 company rating)

    Data engineer job in Santa Barbara, CA

    Description. What we're looking for: As a member of the Data Platform Engineering team, the Real-Time Data Engineer II will work collaboratively to develop an infrastructure that ingests data from disparate sources, processes it in real time, and routes it to various target storages and applications, while providing access to high-quality data to users ranging from application developers interested in specific events, to data analysts keen on business intelligence, to data scientists training ML models. At AppFolio, we paddle as one. We ride and make waves together, with a relentless focus on building great products for the way our customers work and live today - and tomorrow. AppFolio is a destination organization where careers are made and accelerated. Here, innovation is a team sport.

    Your impact: Design, build, and operate next-generation data pipeline infrastructure based on Apache Kafka and its ecosystem. Improve data architecture, quality, discoverability, and access policies to enable and enforce data governance. Collaborate with engineers, data analysts, and scientists to ensure that our data infrastructure meets the SLOs of our data-intensive customers. Develop techniques for monitoring the completeness, correctness, and reliability of our data sets. Leverage agile practices, and encourage collaboration, prioritization, and urgency to develop at a rapid pace. Research, share, and recommend new technologies and trends.

    Qualifications: You have hands-on experience with using Apache Kafka in production and have a passion for building a reliable, scalable, and fault-tolerant infrastructure. You have industry experience working with real-time transformation technologies such as Apache Flink. You have worked with a variety of data sources, including change data capture systems, data streaming, and event sourcing in production. You have hands-on experience with data warehouse technology. You embrace the platform-first approach to build standard solutions and self-serve capabilities for engineering teams. You want to work with a high degree of autonomy, while at the same time working on initiatives of high importance to the company. You care about work-life balance and want your company to care about it too; you'll put in the extra hour when needed but won't let it become a habit.

    Must have: 3+ years of experience with Apache Kafka, Kafka Connect, and its ecosystem. 2+ years of experience in stream processing technologies, such as Apache Flink. 2+ years of experience with AWS primitives (IAM, VPC, S3, MSK, EKS, etc.). 3+ years of experience working with programming languages like Python or Ruby. Excellent SQL skills with working knowledge of query optimization. 2+ years of experience working with Infrastructure as Code, configuration management, and monitoring tools. Bachelor's in Computer Science or other quantitative fields.

    Nice to have: Experience with Debezium connectors. Experience with large-scale Data Lakes and Lakehouses, especially with Apache Iceberg, is a plus. Experience with distributed SQL query engines, such as Trino, is a plus. Experience with containers and container orchestration tools; Docker and Kubernetes experience is desirable. Data science skills for analyzing data and communicating with ML engineers are a plus.
Compensation & Benefits: The base salary/hourly wage that we reasonably expect to pay for this role is $104,000 to $130,000. The actual base salary/hourly wage for this role will be determined by a variety of factors, including but not limited to the candidate's skills, education, experience, etc. Please note that base pay is one important aspect of a compelling Total Rewards package. The base pay range indicated here does not include any additional benefits or bonuses that you may be eligible for based on your role and/or employment type. Regular full-time employees are eligible for benefits - see here. #LI-KB1

About AppFolio: AppFolio is the technology leader powering the future of the real estate industry. Our innovative platform and trusted partnership enable our customers to connect communities, increase operational efficiency, and grow their business. For more information about AppFolio, visit appfolio.com.

Why AppFolio: Grow | We enable a culture of high performance, where delivering results is recognized by opportunities for growth and compelling total rewards. Our challenging and meaningful work drives the growth of our business, and of ourselves. Learn | We partner with you to realize your potential by investing in you from the start. We're cultivating a team of big thinkers through coaching and mentorship with our best-in-class leaders, and giving you the time and tools to develop your skills. Impact | We are creating a world where living in, investing in, managing, and supporting communities feels magical and effortless, freeing people to thrive. We do this by innovating with purpose while cultivating a culture of impact. We learn as much from each other as we do from our customers and our communities. Connect | We excel at hybrid work by fostering an environment that feels flexible, personal, and connected, no matter where we are. We create space to fuel innovation and collaboration, and we come together to celebrate, connect, and succeed. Paddle as One. Learn more at appfolio.com/company/careers

Statement of Equal Opportunity: At AppFolio, we value diversity in backgrounds and perspectives and depend on it to drive our innovative culture. That's why we're a proud Equal Opportunity Employer, and we believe that our products, our teams, and our business are stronger because of it. This means that no matter what race, color, religion, sex, sexual orientation, gender identification, national origin, age, marital status, ancestry, physical or mental disability, or veteran status, you're always welcome at AppFolio.
    $104k-130k yearly 60d+ ago
  • Data Engineer

    Metrosys

    Data engineer job in Santa Barbara, CA

    MetroSys is seeking an experienced Data Engineer with a strong background in Python and Microsoft Azure environments. The ideal candidate will have at least 5 years of experience in building and optimizing data pipelines, managing data storage solutions, and integrating systems through APIs. This role will focus on developing robust data pipelines and warehouse solutions to support enterprise-level data initiatives.

    Key Responsibilities: Design, develop, and maintain scalable data pipelines to support analytics, reporting, and operational workloads. Build and optimize data storage solutions in Microsoft Azure, including Azure Data Lake and related services. Integrate third-party and internal systems using APIs for data ingestion and synchronization. Collaborate with data architects, analysts, and business stakeholders to ensure data solutions meet requirements. Implement best practices for data governance, quality, and security across the pipeline lifecycle. Monitor, troubleshoot, and improve pipeline performance and reliability.

    Qualifications: 5+ years of hands-on experience as a Data Engineer or similar role. Strong proficiency with Python for data manipulation, automation, and pipeline development. Proven experience in Microsoft Azure data services (Azure Data Factory, Data Lake, Synapse, SQL Database). Solid understanding of data warehouse concepts and storage optimization techniques. Experience designing and consuming APIs for system integration. Strong problem-solving skills and ability to work independently in a remote environment.

    Preferred Skills: Knowledge of cloud security practices and compliance requirements. Familiarity with CI/CD pipelines for data workflows. Experience with large-scale enterprise data projects.
    $103k-146k yearly est. 60d+ ago
  • Staff Data Engineer

    Artera

    Data engineer job in Santa Barbara, CA

    Our Mission: Make healthcare #1 in customer service.

    What We Deliver: Artera, a SaaS leader in digital health, transforms patient experience with AI-powered virtual agents (voice and text) for every step of the patient journey. Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Artera's virtual agents support front desk staff to improve patient access, including self-scheduling, intake, forms, billing and more. Whether augmenting a team or unleashing a fully autonomous digital workforce, Artera offers multiple virtual agent options to meet healthcare organizations where they are in their AI journey. Artera helps support 2B communications in 109 languages across voice, text and web. A decade of healthcare expertise, powered by AI.

    Our Impact: Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Hear from our CEO, Guillaume de Zwirek, about why we are standing at the edge of the biggest technological shift in healthcare's history!

    Our award-winning culture: Since its founding in 2015, Artera has consistently been recognized for its innovative technology and business growth, and named a top place to work. Examples of these accolades include: Inc. 5000 Fastest Growing Private Companies (2020, 2021, 2022, 2023, 2024); Deloitte Technology Fast 500 (2021, 2022, 2023, 2024, 2025); Built In Best Companies to Work For (2021, 2022, 2023, 2024, 2025, 2026). Artera has also been recognized by Forbes as one of "America's Best Startup Employers," by Newsweek as one of the "World's Best Digital Health Companies," and named one of the top "44 Startups to Bet your Career on in 2024" by Business Insider.

    SUMMARY: We are seeking a highly skilled and motivated Staff Data Engineer to join our team at Artera. This role is critical to maintaining and improving our data infrastructure, ensuring that our data pipelines are robust, efficient, and capable of delivering high-quality data to both internal and external stakeholders. As a key player in our data team, you will have the opportunity to make strategic decisions about the tools we use, how we organize our data, and the best methods for orchestrating and optimizing our data processes. Your contributions will be essential to ensuring the uninterrupted flow of data across our platform, supporting the analytics needs of our clients and internal teams. If you are passionate about data, problem-solving, and continuous improvement, this is an opportunity to take the lead on investigating and implementing solutions to enhance our data infrastructure.

    RESPONSIBILITIES: Continuous Enhancement: Maintain and elevate Artera's data infrastructure, ensuring peak performance and dependability. Strategic Leadership: Drive the decision-making process for the selection and implementation of data tools and technologies. Streamlining: Design and refine data pipelines to ensure smooth and efficient data flow. Troubleshooting: Manage the daily operations of the Artera platform, swiftly identifying and resolving data-related challenges. Cross-Functional Synergy: Partner with cross-functional teams to develop new data requirements and refine existing processes. Guidance: Provide mentorship to junior engineers, supporting their growth and assisting with complex projects. Collaborative Innovation: Contribute to ongoing platform improvements, ensuring a culture of continuous innovation.
Knowledge Expansion: Stay informed on industry trends and best practices in data infrastructure and cloud technologies. Dependability: Guarantee consistent data delivery to customers and stakeholders, adhering to or surpassing service level agreements. Oversight: Monitor and sustain the data infrastructure, covering areas like recalls, message delivery, and reporting functions. Proactiveness: Improve the stability and performance of the architecture for team implementations.

Requirements: Bachelor's Degree in STEM preferred (additional experience is also accepted in lieu of a degree). Proven experience with Kubernetes and cloud infrastructure (AWS preferred). Strong proficiency in Python and SQL for data processing and automation. Expertise in orchestration tools such as Airflow and Docker. Understanding of performance optimization and cost-effectiveness in Snowflake. Ability to work effectively in a collaborative, cross-functional environment. Strong problem-solving skills with a proactive and solution-oriented mindset. Experience with event-sourced and microservice architecture. Experience working with asynchronous requests in large-scale applications. Commitment to testing best practices. Experience in large-scale data architecture. Demonstrated ability to build and maintain complex data pipelines and data flows.

Bonus Experience: Knowledge of DBT & Meltano.

The compensation for this role will be based on level of experience and the geographic tier in which you are located. This position also comes with equity and a variety of benefits.

Security Requirements: This engineering role contributes to a secure, federally compliant platform. Candidates must be eligible for a government background check and operate within strict code management, access, and documentation standards. Security-conscious development and participation in compliance practices are core to the role.

OUR APPROACH TO WORK LOCATION: Artera has hybrid office locations in Santa Barbara, CA, and Philadelphia (Wayne), PA, where team members typically come in three days a week. Specific frequency can vary depending on your team's needs, manager expectations and/or role responsibilities. In addition to our U.S. office locations, we are intentionally building geographically concentrated teams in several key metropolitan areas, which we call our "Hiring Hubs." We are currently hiring remote candidates located within the following hiring hubs:
- Boston Metro Area, MA
- Chicago Metro Area, IL
- Denver Metro Area, CO
- Kansas City Metro Area (KS/MO)
- Los Angeles Metro Area, CA
- San Francisco / Bay Area, CA
- Seattle Metro Area, WA
This hub-based model helps us cultivate strong local connections and team cohesion, even in a distributed environment. To be eligible for employment at Artera, candidates must reside in one of our hybrid office cities or one of the designated hiring hubs. Specific roles may call out location preferences when relevant. As our hubs grow, we may establish local offices to further enhance in-person connection and collaboration. While there are no current plans in place, should an office open in your area, we anticipate implementing a hybrid model. Any future attendance expectations would be developed thoughtfully, considering factors like typical commute times and access to public transit, to ensure they are fair and practical for the local team.
WORKING AT ARTERA: Company benefits - full health benefits (medical, dental, and vision), flexible spending accounts, company-paid life insurance, company-paid short-term & long-term disability, company equity, voluntary benefits, 401(k) and more! Career development - manager development cohorts, employee development funds. Generous time off - company holidays, Winter & Summer break, and flexible time off. Employee Resource Groups (ERGs) - we believe that everyone should belong at their workplace; our ERGs are available for identifying employees or allies to join.

EQUAL EMPLOYMENT OPPORTUNITY (EEO) STATEMENT: Artera is an Equal Opportunity Employer and is committed to fair and equitable hiring practices. All hiring decisions at Artera are based on strategic business needs, job requirements and individual qualifications. All candidates are considered without regard to race, color, religion, gender, sexuality, national origin, age, disability, genetics or any other protected status. Artera is committed to providing employees with a work environment free of discrimination and harassment; Artera will not tolerate discrimination or harassment of any kind. Artera provides reasonable accommodations for applicants and employees in compliance with state and federal laws. If you need an accommodation, please reach out to ************.

DATA PRIVACY: Artera values your privacy. By submitting your application, you consent to the processing of your personal information provided in conjunction with your application. For more information, please refer to our Privacy Policy.

SECURITY REQUIREMENTS: All employees are responsible for protecting the confidentiality, integrity, and availability of the organization's systems and data, including safeguarding Artera's sensitive information such as Personally Identifiable Information (PII) and Protected Health Information (PHI). Those with specific security or privacy responsibilities must ensure compliance with organizational policies, regulatory requirements, and applicable standards and frameworks by implementing safeguards, monitoring for threats, reporting incidents, and addressing data handling risks or breaches. We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
    $103k-146k yearly est. 14d ago
  • Data Engineer

    City of Oxnard, CA (4.3 company rating)

    Data engineer job in Oxnard, CA

    This recruitment is open until filled. Early submissions are encouraged as applications will be reviewed on a regular and ongoing basis. The City of Oxnard is seeking a skilled and proactive Data Engineer to support our modernization efforts and optimize data workflows across our 300+ applications. This role will be responsible for designing, developing, and maintaining ETL pipelines, data transformations, and legacy systems integrations to ensure seamless data flow between city systems and third-party vendors. You will work closely with integration specialists, database administrators, and business stakeholders to enhance accessibility, reporting, and usability of critical data. This role is essential in monitoring and maintaining the data health of the city and aiding decision making through analysis and visualization. Therefore, the ideal candidate will have experience in BI and in working with stakeholders to create dashboards and reports.

    WHAT YOU'LL DO:
    * Maintain disparate datasets through ETL pipelines, improve data accessibility, ensure data integrity, and drive and enforce security compliance.
    * Prepare strategies for modernizing legacy systems through database migrations or hybrid integrations.
    * Collaborate with database administrators and integration specialists to streamline workflows and optimize performance.
    * Work with system administrators to ensure data security and access control best practices.
    * Support BI initiatives by structuring data and building dashboards for analysts.
    * Help train and support business users to use BI tools effectively.
    * Maintain technical documentation for data flows, integrations, and system dependencies.
    * Analyze data trends, discrepancies, and vendor requests to guide informed decisions.
    * Participate in evaluating and recommending software tools and platforms for data work.

    Payroll Title/Classification: Business Systems Analyst, Senior

    WORK SCHEDULE: The normal work week is Monday through Friday, 8:00 am-6:00 pm, with every other Friday off. This position may be required to be on an on-call (stand-by) rotation, and you may be required to be available to work additional hours as needed to respond to workload needs. The City does not offer hybrid or remote work. Please note: The Information Technology Department supports public safety personnel, including the Police Department, on a 24-hour, 7-day-per-week schedule; therefore, the candidate may be required to be on call on a rotating basis, subject to callback. As part of the selection process, applicants will be required to successfully complete a thorough background investigation, which may include a polygraph exam. This class specification represents only the core areas of responsibilities; specific position assignments will vary depending on the needs of the Department.
    * Design, build, and maintain ETL pipelines to support data exchange between systems.
    * Ensure data consistency, integrity, and compliance with governance regulations such as CJIS.
    * Develop integration solutions using APIs, web services, and direct database connections.
    * Optimize and transform legacy and modern data for use across applications.
    * Collaborate with technical teams to structure and optimize databases for accessibility.
    * Implement observability mechanisms to ensure critical data workflows and integrations are traceable, auditable, and monitored for failures or anomalies.
    * Perform root cause analysis of data errors, failures, and bottlenecks.
* Write documentation for databases, workflows, custom integrations, and reports.
* Conduct unit and integration testing for pipelines and transformations.
* Research, evaluate, and recommend software tools, platforms, and third-party solutions to meet business and technical requirements.
* Work closely with business leaders, analysts, and department heads to gather requirements and ensure BI solutions align with business objectives.
* Build dashboards and interactive reports, empowering end-users to access and explore data independently.
* Work cross-functionally with stakeholders and non-technical business users.
* Other assigned duties as the role may require.

The following are the minimum qualifications necessary for entry into the classification:

Education:
* Bachelor's degree in Computer Science, Business Administration, Information Technology, or a related field.

Experience:
* Minimum of 5 years of hands-on experience in Data Engineering, Integration Engineering, Data Analysis, or a similar field.
* Proven experience with custom reporting, dashboards, or BI tools (Power BI, Tableau, Qlik, or similar).
* Strong proficiency in SQL, Python, or other scripting/ETL languages.
* Strong understanding of database systems (MS SQL Server, MySQL, or similar), including database design and optimization.
* Hands-on experience with ETL processes and tools (SSIS, Talend, Apache Airflow, or similar).
* Knowledge of data modeling, warehousing, and integration protocols (APIs, SFTP, message queues, XML/JSON data exchanges).

Highly Desirable Experience, Qualifications and/or Certifications:
* Local government or public sector experience, especially public safety (police and fire).
* Experience with version control systems (e.g., Git or TFS) and collaborative development workflows (e.g., GitFlow).
* Previous experience in a hybrid role combining technical and business analysis responsibilities.
* Knowledge of cloud-based data storage and processing (AWS, Azure, etc.).

Licensing/Certifications:
* Valid CA Class C Driver's License

Other Requirements:
* Must be able to speak and understand English to effectively communicate with fellow employees, customers, and vendors.

APPLICATION PROCESS:
* Submit NEOGOV/Government Jobs on-line application.
* Complete and submit responses to the supplemental questions, if required.
* Upload resume, cover letter, proof of degree (transcript), or other requested documents.
Your application may be rejected as incomplete if you do not include the relevant information in the online application and include the information only on the resume. Applications and/or Supplemental Questionnaires that state "see my resume" or "see my personnel file" are considered incomplete and will not be accepted. Cover letters and/or optional resumes are not accepted in lieu of a completed application. The list of qualified candidates established from this recruitment may be used to fill other full-time, part-time, and temporary assignments. There is currently one (1) full-time vacancy within the Information Technology Department. Selected candidate(s) must pass a thorough background investigation.

UNION MEMBERSHIP: Positions in this classification are represented by the Oxnard Mid Manager's Association (OMMA).

NOTE: For most positions, the City of Oxnard relies on office automation (Microsoft Office/Google) and web-based enabled tools; therefore, candidates must be proficient and comfortable with computer use to perform functions associated with on-going work.
Regular and reliable attendance, effective communication skills, and development of effective working relationships are requirements of all positions. Employees are required to participate in the City's direct deposit plan and are paid on a bi-weekly basis. This position requires a 12 month probationary period. Pursuant to California Government Code Section 3100, all public employees are required to serve as disaster service workers subject to such disaster service activities as may be assigned to them. EQUAL OPPORTUNITY: The City of Oxnard is an Equal Opportunity Employer and welcomes applications from all qualified applicants. We do not discriminate on the basis of race, color, religion, sex, national origin, age, marital status, medical condition, disability or sexual orientation. REASONABLE ACCOMMODATION: The City of Oxnard makes reasonable accommodation for individuals/people with disabilities. If you believe you require special arrangements to participate in the testing process, you must inform the Human Resources Department in writing no later than the filing date. Applicants who request such accommodation must document their request with an explanation of the type and extent of accommodation required. LEGAL REQUIREMENT: On the first day of employment, new employees must provide proof of citizenship or documentation of legal right to work in the United States in compliance with the Immigration Reform and Control Act of 1986, as amended. The City participates in E-Verify and will provide the federal government with your form I-9 information to confirm that you are authorized to work in the U.S. If E-Verify cannot confirm that you are authorized to work, this employer is required to give you written instructions and an opportunity to contact Department of Homeland Security (DHS) or Social Security Administration (SSA) so you can begin to resolve the issue before the employer can take any action against you, including terminating your employment. Employers can only use E-Verify once you have accepted a job offer and completed the Form I-9. For more information on E-Verify, please contact DHS. ************ dhs.gov/e-verify If you have any questions regarding this recruitment, please contact Ashley Costello at **************************. NOTE: The provisions of this bulletin do not constitute an expressed or implied contract. Any provision contained in this bulletin may be modified or revoked without notice.
    $70k-88k yearly est. 60d+ ago
  • Software Development Engineers

    JBA International (4.1 company rating)

    Data engineer job in Santa Barbara, CA

    Basic Qualifications: 1+ years of experience contributing to the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems. 2+ years of non-internship professional software development experience. Programming experience with at least one modern language such as Java, C++, or C#, including object-oriented design.

    Preferred Qualifications: Experience developing multi-tenant software applications. Experience with AWS technology (e.g. EC2, DynamoDB, S3). Proficiency in relational databases. Proficiency with search technologies (ElasticSearch/Lucene). Proficiency building large-scale, high-availability web services. Security-minded and comfortable working in Linux.
    $111k-148k yearly est. 60d+ ago
  • Software Engineer

    Toyon Research (4.1 company rating)

    Data engineer job in Goleta, CA

    Requirements: Candidates with a Bachelor's or advanced degree in computer science or engineering are encouraged to apply. The ideal candidate will have the following experience, skills, and ambition: excellent thinking and problem-solving skills; ability to work with others to conceptualize and collaboratively solve problems; strong math skills, particularly for implementation of optimization and search methods; experience with coding and testing numerical methods; object-oriented software development in C++, C#, Java, JavaScript, Python, or other languages; experience designing, developing, and performing analysis with geographic information systems (such as ArcGIS); ability to configure and maintain virtual machines and host servers; understanding of application development for cluster computing platforms; and an existing TS clearance with SCI eligibility (preferred).

    WE OFFER AN EXCEPTIONAL EMPLOYEE BENEFITS PACKAGE! Competitive industry pay, 100% employer-paid medical insurance premium, HSA with employer contributions, dental and vision coverage options, company-funded 401(k) and profit sharing plans, Employee Stock Ownership Plan (ESOP), life and disability insurance, and paid parental leave.

    The annual pay range for the GIS Software Engineer position is $90,000 to $140,000. The posted pay range values provide the candidate with guidance on annual base compensation for the position, at a full-time level of effort, exclusive of overtime, bonus, and benefits-related compensation, over a range of qualifications that may fit hiring objectives. Toyon Research Corporation will consider the individual candidate's education, work experience, applicable knowledge, skills, and training, among other factors, when preparing an offer of employment.

    Pay Transparency Nondiscrimination Provision. Equal Opportunity Employer, including Disability and Veterans. Applicant Privacy Notice. Learn more about our company in our latest video, We are Toyon. Ref #2544-G
    $90k-140k yearly 60d+ ago
  • Software Engineer - Santa Barbara, CA

    Pearly

    Data engineer job in Santa Barbara, CA

    The Role We are looking for a Santa Barbara, CA based Software Engineer to develop platform capabilities that position Pearly as the leading payments layer in the dental industry. Everything we build at Pearly operates at scale from day one - from user-facing workflows to high-throughput payment, notification, and data sync systems. You will be shipping capabilities that impact millions of users. In addition to application code, you will work on the development infrastructure that allows our organization to scale safely - including automated testing, monitoring, and internal tooling. The common denominator across every project you work on will be a deep sense of craft - both as an individual and as a collective team. At Pearly you will drive product excellence that enables us to be the winners in our space. Our Stack Pearly operates on a hyper-modern stack that is optimized for developer experience and scalability. You'll be releasing code that is high-impact, simple, and safe from day one; working predominantly in Typescript, GraphQL, and PostgreSQL. Our tooling leverages the best of open source and cloud-based capabilities, including Serverless Functions, GitHub Actions, Docker, and Playwright. We put a lot of thought into how to make shipping code at Pearly as fun and low-friction as possible, and we're constantly iterating on our stack/tooling to stay at the leading edge - including leveraging the latest AI tools like Claude Code and Cursor. What You'll Do Own the full development cycle: You'll be involved in every step of development, from product ideation, to specification, through to implementation and release to Production. Define the platform : You will research and make judgement calls on fundamental decisions regarding data modeling, architecture, infrastructure, and tooling. Collaborate effectively: We operate as a flat, mutually supportive, highly connected team where frequent input and collaboration is expected and best ideas always win. Build the path forward: You'll be entrusted to build today what we need for tomorrow - we believe in creating generalized frameworks that anticipate future needs as a core development principle. Qualifications Sense of Craft. You take pride in and are consistently honing your technical and creative abilities. You seek out opportunities to introduce simplicity, and enjoy composing solutions to business requirements in a way that is both innovative and effective. 2+ Years Deploying to Prod: You have an understanding of the stakes involved in deploying live systems. You exhibit exceptional attention to detail, and are prepared to operate in a security and accuracy sensitive domain. Command of the Full Stack. You are comfortable working in any layer of modern web application architecture - from Data Persistence to Business Logic to the User Interface. The ability to learn quickly and deeply is more important than experience with any particular technology, although a strong understanding of relational data modeling is a must. Willing to Roll Up Your Sleeves. Taking the initiative is second nature to you. You will operate with a high degree of autonomy, take projects to the finish line, and own the outcome. Based in Santa Barbara, CA. You will work at Pearly's office in downtown Santa Barbara, CA with the opportunity to periodically work remotely. 
Benefits: Competitive salary, equity, and healthcare benefits. Meeting-light culture. Work with an A+ smart and passionate team. Flexible vacation/time-off policy. Opportunity to make your mark at an accelerating company with great product-market fit.
    $98k-139k yearly est. 28d ago
  • Mid-Level Software Engineer (Local Preferred)

    Jobsbridge

    Data engineer job in Santa Barbara, CA

    The Mid-Level Software Engineer will be responsible for developing incremental changes, bug fixes, and larger features for our SaaS ecommerce platform, which serves the store-front, payment, digital fulfillment, and post-order needs of digital retailers worldwide. Develop and test new features, incremental changes, and bug fixes to a production-ready state using agile development processes. Contribute to architectural designs. Define and implement automated tests for new development. Participate in the code review process. Develop skills with new technologies.

    Qualifications: 3+ years of Java / JavaScript / Node experience. 2+ years of AWS experience, including EC2, S3, RDS, and DynamoDB; experience with Lambda and Redshift is a plus. 3+ years of experience developing software services using RESTful APIs. Ability to automate tasks through scripting. Understanding of agile development processes. Bachelor's degree in Computer Science or equivalent experience.

    Qualifications: AWS experience; Java / JavaScript / Node experience.

    Additional Information: Multiple openings.
    $98k-139k yearly est. 18h ago
  • Lead, Full Time - Balboa Building

    The Gap (4.4 company rating)

    Data engineer job in Santa Barbara, CA

    About Banana Republic Banana Republic is a storyteller's brand, outfitting the modern explorer with high-quality, expertly crafted collections made to inspire and enrich life's journeys. Founded in 1978 in San Francisco, we continue to evolve our heritage of exploration through thoughtfully designed apparel and accessories that blend timeless style with exceptional craftsmanship. Our team is made up of passionate, curious storytellers - creators and visionaries who seek out what's next and bring it to life through elevated design, immersive experiences, and a shared spirit of creativity and innovation. About the Role In this role, you will support the store leadership team by performing functional tasks as assigned. You will act as a role model to employees to support selling behaviors and the execution of tasks in specific areas of expertise. You will focus on leading processes, executing tasks, and maintaining productivity to ensure goals are met. Through collaboration with your leadership team, your goal is to role model and teach your team and drive behaviors to deliver a best-in-class customer experience. What You'll Do * Consistently treat all customers and employees with respect and contribute to a positive work environment. * Promote customer loyalty by educating customers about our loyalty programs. * All leads are expected to become experts of the brand's selling behaviors by role modeling these behaviors with every customer who walks through our doors and allowing us to provide an exceptional customer experience. * Support sales leader during (non-peak) hours, with the customer as the primary focus * Support the store leadership team by collaborating effectively with employees and ensuring work tasks are completed in a timely and efficient manner * Build and share expertise in the product lifecycle * Support completion of work before or after the store operating hours, inclusive of opening and/or closing checklists * Leverage omni-channel to deliver a frictionless customer experience. * Ensure all compliance standards are met. Who You Are * You embody Gap Inc's Purpose, Mission, Vision, Values and Behaviors * Provides clear and direct communication of expectations. * Ability to utilize technology effectively to engage with customers and team to meet goals * Demonstrate interest and initiative towards continuous improvement and growth * Agreeable to work a flexible schedule to meet the needs of the business, including holiday, evening, overnight and weekend shifts. * Able to maneuver around the sales floor, stockroom and office and can lift up to 30 pounds. Benefits at Banana Republic * Merchandise discount for our brands: 50% off regular-priced merchandise at Old Navy, Gap, Banana Republic and Athleta, and 30% off at Outlet for all employees. * One of the most competitive Paid Time Off plans in the industry.* * Employees can take up to five "on the clock" hours each month to volunteer at a charity of their choice.* * Extensive 401(k) plan with company matching for contributions up to four percent of an employee's base pay.* * Employee stock purchase plan.* * Medical, dental, vision and life insurance.* * See more of the benefits we offer. * For eligible employees Gap Inc. is an equal-opportunity employer and is committed to providing a workplace free from harassment and discrimination. We are committed to recruiting, hiring, training and promoting qualified people of all backgrounds, and make all employment decisions without regard to any protected status. 
We have received numerous awards for our long-held commitment to equality and will continue to foster a diverse and inclusive environment of belonging. In 2022, we were recognized by Forbes as one of the World's Best Employers and one of the Best Employers for Diversity. Hourly Range: $16.60 - $20.75 USD Employee pay will vary based on factors such as qualifications, experience, skill level, competencies and work location. We will meet minimum wage or minimum of the pay range (whichever is higher) based on city, county and state requirements.
    $16.6-20.8 hourly 60d+ ago
  • Software Engineer, Platform - Oxnard, USA

    Speechify

    Data engineer job in Oxnard, CA

    The mission of Speechify is to make sure that reading is never a barrier to learning. Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android app, Mac app, Chrome extension, and web app. Google recently named Speechify the Chrome Extension of the Year, and Apple named Speechify its 2025 Design Award winner for Inclusivity. Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. They include frontend and backend engineers, AI research scientists, and others who come from Amazon, Microsoft, and Google, from leading PhD programs like Stanford's, from high-growth startups like Stripe, Vercel, and Bolt, and many who have founded their own companies. Overview The responsibilities of our Platform team include building and maintaining all backend services, including, but not limited to, payments, analytics, subscriptions, new products, text-to-speech, and external APIs. This is a key role and ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users. We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount. Our interview process involves several technical interviews, and we aim to complete them within one week. What You'll Do Design, develop, and maintain robust APIs, including the public TTS API and internal APIs like Payment, Subscription, Auth, and Consumption Tracking, ensuring they meet business and scalability requirements Oversee the full backend API landscape, enhancing and optimizing for performance and maintainability Collaborate on B2B solutions, focusing on customization and integration needs for enterprise clients Work closely with cross-functional teams to align backend architecture with overall product strategy and user experience An Ideal Candidate Should Have Proven experience in backend development: TS/Node (required) Direct experience with GCP and knowledge of AWS, Azure, or other cloud providers Efficiency in ideation and implementation, prioritizing tasks based on urgency and impact Preferred: Experience with Docker and containerized deployments Preferred: Proficiency in deploying high-availability applications on Kubernetes What We Offer A dynamic environment where your contributions shape the company and its products A team that values innovation, intuition, and drive Autonomy, fostering focus and creativity The opportunity to have a significant impact in a revolutionary industry Competitive compensation, a welcoming atmosphere, and a commitment to an exceptional asynchronous work culture The privilege of working on a product that changes lives, particularly for those with learning differences like dyslexia, ADD, and more An active role at the intersection of artificial intelligence and audio - a rapidly evolving tech domain The United States-based salary range for this role is: 140,000-200,000 USD/Year + Bonus + Stock, depending on experience Think you're a good fit for this job? Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn. Not looking but know someone who would make a great fit? Refer them! Speechify is committed to a diverse and inclusive workplace. Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
    $96k-136k yearly est. Auto-Apply 2d ago
  • Software Engineer

    Vsolvit

    Data engineer job in Oxnard, CA

    ***Remote/Hybrid Opportunity*** VSolvit LLC is seeking an experienced Mid-Level Full Stack Software Engineer to join our dynamic and growing team supporting our U.S. Navy client. In this role, you will play a pivotal part in designing, building, and maintaining scalable web applications and APIs, leveraging cloud-native technologies. You will be responsible for the full lifecycle of software development, from conception to deployment, including designing system architecture, implementing solutions, and optimizing performance. This is a hands-on developer role with the opportunity to grow your software development skills in a growing and technically diverse technology company. As with any position, additional expectations exist. Some of these include, but are not limited to, adhering to normal working hours, meeting deadlines, following company policies as outlined by the Employee Handbook, communicating regularly with assigned supervisor(s), staying focused on the assigned tasks, and completing other tasks as assigned. Responsibilities Experience with common JavaScript libraries (Angular, React, jQuery, Backbone, Redux, etc.) Ability to program in Java, JavaScript, C, C++, C# Knowledge of common data structures and algorithms Desire and ability to pick up new technologies quickly Displays a passion for what you do and a drive to improve Strong problem-solving and software triage skills with the ability to work cross-functionally in a fast-paced and rapidly changing work environment Analyzes the technical and business requirements to develop a systems solution Provides system software support for both front-end and back-end Develops documentation for applications and code Works with users and application stakeholders to verify and validate requirements Knowledge of writing packages/procedures for Java-facing applications Writes SQL queries, database triggers, PL/SQL, and packages according to the business requirements Basic Qualifications Bachelor's degree in Computer Science, technology, or a related field of study 2+ years' experience in Java, JavaScript, C, C++, C#, and Angular 2+ years' experience in SQL development, preferably Oracle 19c or higher 2+ years' working as part of a software development team 2+ years' performing technical documentation; design documents and system design documents 2+ years' experience analyzing technical specifications Ability to pay attention to detail A strong desire to learn and accept new challenges within a collaborative team context Must have the ability to obtain and maintain a CompTIA Security+ certification Must be a U.S. Citizen Must be able to obtain and maintain a Secret clearance If applicable: If you are or have been recently employed by the U.S. government, a post-employment ethics letter will be required if employment with VSolvit is offered Preferred Qualifications 2+ years' experience working with Java, ASP.NET with MVC, Visual C#, SQL, JavaScript, jQuery, and web services frameworks 2+ years' experience developing in a cloud-based environment, e.g., Azure Experience using standard SDLC methodologies 2+ years' experience in Agile development Knowledge of fundamental enterprise application development practices and architecture preferred Good interpersonal and communication skills a plus Experience using Microsoft Office tools, understanding workflows and requirements/design documentation Knowledge of RDBMS, i.e. 
Oracle 19c or higher, or SQL Server 2014 or higher Knowledge of Postgres, in particular an understanding of Postgres as both a relational store and an object store Experience working under CMMI Level 3 standards Company Summary Join the VSolvit Team! Founded in 2006, VSolvit (pronounced 'We Solve It') is a technology services provider that specializes in cybersecurity, cloud computing, geographic information systems (GIS), business intelligence (BI) systems, data warehousing, engineering services, and custom database and application development. VSolvit is an award-winning WOSB, CA CDB, MBE, WBE, and CMMI Level 3 certified company. We offer a customizable health benefits program that best meets the needs of our employees. Offerings may include: medical, dental, and vision insurance, life insurance, long- and short-term disability and other insurance products, Health Savings Account, Flexible Spending Account, 401K Retirement Plan options, Tuition Reimbursement, and assorted voluntary benefits. Our goal is to grow together and enjoy the work that we do as a team. VSolvit LLC is an Equal Opportunity/Affirmative Action employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, national origin, protected veteran status, or disability status.
    $96k-136k yearly est. Auto-Apply 60d+ ago
  • Data Engineer

    KBR 4.7 company rating

    Data engineer job in Camarillo, CA

    Title: Data Engineer Belong. Connect. Grow. with KBR! KBR's National Security Solutions team provides high-end engineering and advanced technology solutions to our customers in the intelligence and national security communities. In this position, your work will have a profound impact on the country's most critical role - protecting our national security. Why Join Us? * Innovative Projects: KBR's work is at the forefront of engineering, logistics, operations, science, program management, mission IT and cybersecurity solutions. * Collaborative Environment: Be part of a dynamic team that thrives on collaboration and innovation, fostering a supportive and intellectually stimulating workplace. * Impactful Work: Your contributions will be pivotal in designing and optimizing defense systems that ensure national security and shape the future of space defense. Come join the ITEA award-winning TRMC BDKM team and be a part of the team responsible for revolutionizing how analysis is performed across the entire Department of Defense! Key Responsibilities: As a Data Engineer, you will be a critical part of the team that is responsible for enabling the development of data-driven decision analysis products through the innovative application, and promotion, of novel methods from data science, machine learning, and operations research to provide robust and flexible testing and evaluation capabilities to support DoD modernization. * Analytic Experience: Candidate will be a part of the technical team responsible for providing analytic consulting services, supporting analytic workflow and product development and testing, promoting user adoption of methods and best practices from data science, conducting applied methods projects, and supporting the creation of analysis-ready data. * Onsite Support: Candidate will be the face of the CHEETAS Team and will be responsible for ensuring stakeholders have the analytical tools, data products, and reports they need to make insightful recommendations based on data-driven analysis. * Stakeholder Assistance: Candidate will directly assist both analyst/technical and non-analyst/non-technical stakeholders with the analysis of DoD datasets, demonstrating the 'art of the possible' to stakeholders and VIPs with insights gained from analysis of DoD Test and Evaluation (T&E) data. * Communication: Must effectively communicate at both a programmatic and technical level. Although you may be the only team member physically on-site providing support, you will not be alone: you will have support from other data science team members as well as the software engineering and system administration teams. * Technical Support: Candidate will be responsible for running and operating CHEETAS (and other tools); demonstrating these tools to stakeholders & VIPs; conveying analysis results; adapting internally developed tools, notebooks, and reports to meet emerging needs; gathering use cases, requirements, gaps, and needs from stakeholders and, for larger development items, providing that information as feature requests or bug reports to the CHEETAS development team; and performing impromptu hands-on training sessions with end users and potentially troubleshooting problems from within closed networks without internet access (with support from distributed team members). * Independent Work: Candidate must be self-motivated and capable of working independently with little supervision or direct tasking. 
Work Environment: * Location: Onsite; Honolulu, HI * Travel Requirements: This position will require travel of 25%, with potential surge to 50%, to support end users located at various DoD ranges & labs across the US (including Alaska and Hawaii). When not supporting a site, this position can work remotely or from a nearby KBR office (if available and desired). * Working Hours: Standard. Although you may be the only team member physically on-site providing support, you will not be alone. Basic Qualifications: * Security Clearance: Active or current TS/SCI clearance is required * Education: A degree in operations research, engineering, applied math, statistics, computer science, or information technology, with 15+ years of experience within DoD preferred. Candidates with 10-15 years of DoD experience will be considered on a case-by-case basis. Entry-level candidates will not be considered. * Technical Experience: Previous experience must include five (5) years of hands-on experience in big data analytics and five (5) years of hands-on experience with object-oriented and functional languages (e.g., Python, R, C++, C#, Java, Scala, etc.). * Data Experience: Experience in dealing with imperfections in data. Experience should demonstrate competency in key concepts from software engineering, computer programming, statistical analysis, data mining algorithms, machine learning, and modeling sufficient to inform technical choices and infrastructure configuration. * Data Analytics: Proven analytical skills and experience in preparing and handling large volumes of data for ETL processes. Experience should include working with teams on the development and interpretation of the results of analytic products with DoD-specific data types. * Big Data Infrastructure: Experience in the installation, configuration, and use of big data infrastructure (Spark, Trino, Hadoop, Hive, Neo4J, JanusGraph, HBase, MS SQL Server with PolyBase, and VMWare, as examples). Experience in implementing data visualization solutions. Qualifications Required: * Experience using scripting languages (Python and R) to process, analyze, and visualize data (see the brief sketch at the end of this listing). * Experience using notebooks (Jupyter Notebooks and RMarkdown) to create reproducible and explainable products. * Experience using interactive visualization tools (RShiny, py Shiny, Dash) to create interactive analytics. * Experience generating and presenting reports, visualizations, and findings to customers. * Experience building and optimizing 'big data' data pipelines, architectures, and data sets. * Experience cleaning and preparing time series and geospatial data for analysis. * Experience working with Windows, Linux, and containers. * Experience querying databases using SQL and working with and configuring distributed storage and computing environments to conduct analysis (Spark, Trino, Hadoop, Hive, Neo4J, JanusGraph, MongoDB, Accumulo, and HBase, as examples). * Experience working with code repositories in a collaborative team. * Ability to make insightful recommendations based on data-driven analysis and customer interactions. * Ability to effectively communicate both orally and in writing with customers and teammates. * Ability to speak and present findings in front of large technical and non-technical groups. * Ability to create documentation and repeatable procedures to enable reproducible research. * Ability to create training and educational content for novice end users on the use of tools and novel analytic methods. 
* Ability to solve problems, debug, and troubleshoot while under pressure and time constraints is required. * Should be self-motivated to design, develop, enhance, reengineer, or integrate software applications to improve the quality of data outputs available for end users. * Ability to work closely with data scientists to develop and subsequently implement the best technical design and approach for new analytical products. * Strong analytical skills related to working with both structured and unstructured datasets. * Excellent programming, testing, debugging, and problem-solving skills. * Experience designing, building, and maintaining both new and existing data systems and solutions. * Understanding of ETL processes and how they function, and experience implementing ETL processes, is required. * Knowledge of message queuing, stream processing, and extracting value from large disparate datasets. * Knowledge of software design patterns and Agile development methodologies is required. * Knowledge of methods from operations research, statistics and machine learning, data science, and computer science sufficient to select appropriate methods and to configure data preparation and computing architecture to implement them. * Knowledge of computer programming concepts, data structures, and storage architecture, to include relational and non-relational databases, distributed computing frameworks, and modeling and simulation experimentation, sufficient to select appropriate methods and to configure data preparation and computing architecture to implement them. Scheduled Weekly Hours: 40 Basic Compensation: $119,900 - $179,800 The offered rate will be based on the selected candidate's knowledge, skills, abilities, and/or experience and in consideration of internal parity. Additional Compensation: KBR may offer bonuses, commissions, or other forms of compensation to certain job titles or levels, per internal policy or contractual designation. Additional compensation may be in the form of a sign-on bonus, relocation benefits, short-term incentives, long-term incentives, or discretionary payments for exceptional performance. Ready to Make a Difference? If you're excited about making a significant impact in the field of space defense and working on projects that matter, we encourage you to apply and join our team at KBR. Let's shape the future together. KBR Benefits KBR offers a selection of competitive lifestyle benefits which could include a 401K plan with company match, medical, dental, vision, life insurance, AD&D, flexible spending account, disability, paid time off, or a flexible work schedule. We support career advancement through professional training and development. Belong, Connect and Grow at KBR At KBR, we are passionate about our people and our Zero Harm culture. These inform all that we do and are at the heart of our commitment to, and ongoing journey toward, being a People First company. That commitment is central to our team of teams philosophy and fosters an environment where everyone can Belong, Connect and Grow. We Deliver - Together. KBR is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, disability, sex, sexual orientation, gender identity or expression, age, national origin, veteran status, genetic information, union status and/or beliefs, or any other characteristic protected by federal, state, or local law.
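    The sketch referenced in the qualifications above is shown here. It is a minimal, hypothetical illustration of the kind of Python data preparation this role describes - not a TRMC or CHEETAS artifact - and the file name, column names, and resampling choices are assumptions made for the example.

        # Clean an irregular time series before analysis (illustrative only).
        import pandas as pd

        # Hypothetical raw telemetry export; column names are placeholder assumptions.
        raw = pd.read_csv("telemetry_sample.csv", parse_dates=["timestamp"])

        cleaned = (
            raw.dropna(subset=["timestamp", "sensor_value"])  # drop unusable rows
               .drop_duplicates(subset=["timestamp"])         # remove repeated records
               .sort_values("timestamp")
               .set_index("timestamp")
        )

        # Resample onto a regular one-second grid and interpolate short gaps so
        # downstream analysis sees an evenly spaced series.
        regular = cleaned["sensor_value"].resample("1s").mean().interpolate(limit=5)

        print(regular.describe())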
    $119.9k-179.8k yearly Auto-Apply 14d ago
  • Data Engineer

    Evidation 4.2 company rating

    Data engineer job in Santa Barbara, CA

    Our small data science team is changing the way healthcare decisions are made. We're taking advantage of new technologies and the latest in statistics and machine learning to rapidly analyze new kinds of healthcare data. We have one-of-a-kind access to health and behavior datasets, and we're using it to change the world for the better. Our distributed data processing and analysis pipeline is built on top of technologies like Spark, Hadoop, Elasticsearch, Python, and Scala, and we need brilliant engineers to help us scale it and adapt it to new challenges. This is the perfect job if you have a software engineering background and want to get involved in healthcare and big data engineering. We have offices in Menlo Park, CA and Santa Barbara, CA and a Surf Air membership so you can rove between the offices. This position is in Santa Barbara, CA. Job Description Responsibilities: Own our data ETL pipelines Investigate, prototype, and validate new analytics systems Optimize our analytics systems for performance, cost, and availability Design and implement new data analytics systems using top-notch software engineering practices Work closely with our Data Scientists to deliver dashboards, insights, and data pipelines to clients Qualifications Requirements: Background in quantitative fields (Computer Science, Engineering, Math, Physics) You're a strong analytical thinker and coder. Think ACM-ICPC, IOI, and IPSC. You're a great software engineer who writes modular, extensible, well-designed systems. You have solid experience with Python and SQL. You value learning, personal growth, teamwork, and excelling at what you do. Additional Information All your information will be kept confidential according to EEO guidelines.
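    As a rough, hypothetical illustration of the ETL pipeline work described above (this is not Evidation's code; the bucket paths, event schema, and aggregation are assumptions made for the sketch), a minimal PySpark job that cleans raw behavior events and writes a daily aggregate might look like this:

        # Minimal PySpark ETL sketch: read raw events, clean them, write a daily aggregate.
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("daily-step-aggregate").getOrCreate()

        # Hypothetical raw event source.
        raw = spark.read.json("s3://example-bucket/raw/activity_events/")

        cleaned = (
            raw.filter(F.col("user_id").isNotNull() & (F.col("step_count") >= 0))
               .withColumn("event_date", F.to_date("event_timestamp"))
        )

        daily = (
            cleaned.groupBy("user_id", "event_date")
                   .agg(F.sum("step_count").alias("total_steps"))
        )

        # Hypothetical curated target, partitioned for downstream analysis.
        daily.write.mode("overwrite").partitionBy("event_date").parquet(
            "s3://example-bucket/curated/daily_steps/"
        )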
    $114k-161k yearly est. 60d+ ago
  • Real-Time Data Engineer II

    Appfolio 4.6 company rating

    Data engineer job in Santa Barbara, CA

    What we're looking for: As a member of the Data Platform Engineering team, the Real-Time Data Engineer II will work collaboratively to develop infrastructure that ingests data from disparate sources, processes it in real time, and routes it to various target storage systems and applications, while providing access to high-quality data to users ranging from application developers interested in specific events, to data analysts keen on business intelligence, to data scientists training ML models. At AppFolio, we paddle as one. We ride and make waves together, with a relentless focus on building great products for the way our customers work and live today - and tomorrow. AppFolio is a destination organization where careers are made and accelerated. Here, innovation is a team sport. Your impact: * Design, build, and operate next-generation data pipeline infrastructure based on Apache Kafka and its ecosystem (a brief illustrative sketch follows at the end of this listing) * Improve data architecture, quality, discoverability, and access policies to enable and enforce data governance * Collaborate with engineers, data analysts, and scientists to ensure that our data infrastructure meets the SLOs of our data-intensive customers * Develop techniques for monitoring the completeness, correctness, and reliability of our data sets * Leverage agile practices, encourage collaboration, prioritization, and urgency to develop at a rapid pace * Research, share, and recommend new technologies and trends Qualifications * You have hands-on experience using Apache Kafka in production and a passion for building reliable, scalable, and fault-tolerant infrastructure. * You have industry experience working with real-time transformation technologies such as Apache Flink * You have worked with a variety of data sources, including change data capture systems, data streaming, and event sourcing in production. * You have hands-on experience with data warehouse technology. * You embrace a platform-first approach to build standard solutions and self-serve capabilities for engineering teams * You want to work with a high degree of autonomy, while at the same time working on initiatives of high importance to the company. * You care about work-life balance and want your company to care about it too; you'll put in the extra hour when needed but won't let it become a habit. Must have * 3+ years of experience with Apache Kafka, Kafka Connect, and their ecosystem * 2+ years of experience in stream processing technologies, such as Apache Flink * 2+ years of experience with AWS primitives (IAM, VPC, S3, MSK, EKS, etc.) * 3+ years of experience working with programming languages like Python or Ruby * Excellent SQL skills with working knowledge of query optimization * 2+ years of experience working with Infrastructure as Code, configuration management, and monitoring tools. * Bachelor's in Computer Science or other quantitative fields. Nice to have * Experience with Debezium connectors * Experience with large-scale data lakes and lakehouses, especially Apache Iceberg, is a plus * Experience with distributed SQL query engines, such as Trino, is a plus * Experience with containers and container orchestration tools; Docker and Kubernetes experience is desirable. * Data science skills for analyzing data and communicating with ML engineers are a plus. Compensation & Benefits The base salary/hourly wage that we reasonably expect to pay for this role is: $104,000 to $130,000. 
The actual base salary/hourly wage for this role will be determined by a variety of factors, including but not limited to: the candidate's skills, education, experience, etc. Please note that base pay is one important aspect of a compelling Total Rewards package. The base pay range indicated here does not include any additional benefits or bonuses that you may be eligible for based on your role and/or employment type. Regular full-time employees are eligible for benefits - see here. #LI-KB1 About AppFolio AppFolio is the technology leader powering the future of the real estate industry. Our innovative platform and trusted partnership enable our customers to connect communities, increase operational efficiency, and grow their business. For more information about AppFolio, visit appfolio.com. Why AppFolio Grow | We enable a culture of high performance, where delivering results is recognized with opportunities for growth and compelling total rewards. Our challenging and meaningful work drives the growth of our business, and of ourselves. Learn | We partner with you to realize your potential by investing in you from the start. We're cultivating a team of big thinkers through coaching and mentorship with our best-in-class leaders, and giving you the time and tools to develop your skills. Impact | We are creating a world where living in, investing in, managing, and supporting communities feels magical and effortless, freeing people to thrive. We do this by innovating with purpose while cultivating a culture of impact. We learn as much from each other as we do from our customers and our communities. Connect | We excel at hybrid work by fostering an environment that feels flexible, personal, and connected, no matter where we are. We create space to fuel innovation and collaboration, and we come together to celebrate, connect, and succeed. Paddle as One. Learn more at appfolio.com/company/careers Statement of Equal Opportunity At AppFolio, we value diversity in backgrounds and perspectives and depend on it to drive our innovative culture. That's why we're a proud Equal Opportunity Employer, and we believe that our products, our teams, and our business are stronger because of it. This means that no matter what race, color, religion, sex, sexual orientation, gender identification, national origin, age, marital status, ancestry, physical or mental disability, or veteran status, you're always welcome at AppFolio. By submitting this form, I acknowledge I have reviewed AppFolio's Privacy Policy.
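    The sketch referenced in the listing above: a minimal, hypothetical Python consumer that validates events from one Kafka topic and routes them to downstream topics. It is not AppFolio's pipeline - a production setup would more likely lean on Kafka Connect, Flink jobs, and schema-registry-backed serialization - and the broker address, topic names, and event fields are assumptions made for illustration.

        # Validate events from a source topic and route good/bad records onward.
        import json
        from kafka import KafkaConsumer, KafkaProducer  # kafka-python

        consumer = KafkaConsumer(
            "property.events.raw",                 # hypothetical source topic
            bootstrap_servers="localhost:9092",
            group_id="event-router",
            value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        )
        producer = KafkaProducer(
            bootstrap_servers="localhost:9092",
            value_serializer=lambda d: json.dumps(d).encode("utf-8"),
        )

        REQUIRED_FIELDS = {"event_id", "event_type", "occurred_at"}

        for message in consumer:
            event = message.value
            if REQUIRED_FIELDS.issubset(event):
                producer.send("property.events.clean", event)       # routed for analytics
            else:
                producer.send("property.events.deadletter", event)  # quarantined for review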
    $104k-130k yearly Auto-Apply 30d ago
  • Staff Data Engineer

    Artera

    Data engineer job in Santa Barbara, CA

    Our Mission: Make healthcare #1 in customer service. What We Deliver: Artera, a SaaS leader in digital health, transforms patient experience with AI-powered virtual agents (voice and text) for every step of the patient journey. Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Artera's virtual agents support front desk staff to improve patient access including self-scheduling, intake, forms, billing and more. Whether augmenting a team or unleashing a fully autonomous digital workforce, Artera offers multiple virtual agent options to meet healthcare organizations where they are in their AI journey. Artera helps support 2B communications in 109 languages across voice, text and web. A decade of healthcare expertise, powered by AI. Our Impact: Trusted by 1,000+ provider organizations - including specialty groups, FQHCs, large IDNs and federal agencies - engaging 100 million patients annually. Hear from our CEO, Guillaume de Zwirek, about why we are standing at the edge of the biggest technological shift in healthcare's history! Our award-winning culture: Since founding in 2015, Artera has consistently been recognized for its innovative technology and business growth, and named a top place to work. Examples of these accolades include: Inc. 5000 Fastest Growing Private Companies (2020, 2021, 2022, 2023, 2024); Deloitte Technology Fast 500 (2021, 2022, 2023, 2024, 2025); Built In Best Companies to Work For (2021, 2022, 2023, 2024, 2025, 2026). Artera has also been recognized by Forbes as one of “America's Best Startup Employers,” Newsweek as one of the “World's Best Digital Health Companies,” and named one of the top “44 Startups to Bet your Career on in 2024” by Business Insider. SUMMARY We are seeking a highly skilled and motivated Staff Data Engineer to join our team at Artera. This role is critical to maintaining and improving our data infrastructure, ensuring that our data pipelines are robust, efficient, and capable of delivering high-quality data to both internal and external stakeholders. As a key player in our data team, you will have the opportunity to make strategic decisions about the tools we use, how we organize our data, and the best methods for orchestrating and optimizing our data processes. Your contributions will be essential to ensuring the uninterrupted flow of data across our platform, supporting the analytics needs of our clients and internal teams. If you are passionate about data, problem-solving, and continuous improvement, and want to take the lead on investigating and implementing solutions to enhance our data infrastructure, this role is for you. RESPONSIBILITIES Continuous Enhancement: Maintain and elevate Artera's data infrastructure, ensuring peak performance and dependability. Strategic Leadership: Drive the decision-making process for the selection and implementation of data tools and technologies. Streamlining: Design and refine data pipelines to ensure smooth and efficient data flow. Troubleshooting: Manage the daily operations of the Artera platform, swiftly identifying and resolving data-related challenges. Cross-Functional Synergy: Partner with cross-functional teams to develop new data requirements and refine existing processes. Guidance: Provide mentorship to junior engineers, supporting their growth and assisting with complex projects. Collaborative Innovation: Contribute to ongoing platform improvements, ensuring a culture of continuous innovation. 
Knowledge Expansion: Stay informed on industry trends and best practices in data infrastructure and cloud technologies. Dependability: Guarantee consistent data delivery to customers and stakeholders, adhering to or surpassing service level agreements. Oversight: Monitor and sustain the data infrastructure, covering areas like recalls, message delivery, and reporting functions. Proactiveness: Improve the stability and performance of the architecture that team implementations depend on. Requirements Bachelor's Degree in a STEM field preferred (additional experience is also accepted in lieu of a degree) Proven experience with Kubernetes and cloud infrastructure (AWS preferred) Strong proficiency in Python and SQL for data processing and automation. Expertise in orchestration tools such as Airflow and Docker (a brief illustrative sketch follows at the end of this listing). Understanding of performance optimization and cost-effectiveness in Snowflake. Ability to work effectively in a collaborative, cross-functional environment. Strong problem-solving skills with a proactive and solution-oriented mindset. Experience with event-sourced and microservice architectures. Experience working with asynchronous requests in large-scale applications. Commitment to testing best practices. Experience in large-scale data architecture. Demonstrated ability to build and maintain complex data pipelines and data flows. Bonus Experience Knowledge of DBT & Meltano Security Requirements: This engineering role contributes to a secure, federally compliant platform. Candidates must be eligible for a government background check and operate within strict code management, access, and documentation standards. Security-conscious development and participation in compliance practices are core to the role. OUR APPROACH TO WORK LOCATION Artera has hybrid office locations in Santa Barbara, CA, and Philadelphia (Wayne), PA, where team members typically come in three days a week. Specific frequency can vary depending on your team's needs, manager expectations, and/or role responsibilities. In addition to our U.S. office locations, we are intentionally building geographically concentrated teams in several key metropolitan areas, which we call our “Hiring Hubs.” We are currently hiring remote candidates located within the following hiring hubs: Boston Metro Area, MA; Chicago Metro Area, IL; Denver Metro Area, CO; Kansas City Metro Area (KS/MO); Los Angeles Metro Area, CA; San Francisco / Bay Area, CA; and Seattle Metro Area, WA. This hub-based model helps us cultivate strong local connections and team cohesion, even in a distributed environment. To be eligible for employment at Artera, candidates must reside in one of our hybrid office cities or one of the designated hiring hubs. Specific roles may call out location preferences when relevant. As our hubs grow, we may establish local offices to further enhance in-person connection and collaboration. While there are no current plans in place, should an office open in your area, we anticipate implementing a hybrid model. Any future attendance expectations would be developed thoughtfully, considering factors like typical commute times and access to public transit, to ensure they are fair and practical for the local team. WORKING AT ARTERA Company benefits - Full health benefits (medical, dental, and vision), flexible spending accounts, company paid life insurance, company paid short-term & long-term disability, company equity, voluntary benefits, 401(k) and more! 
Career development - Manager development cohorts, employee development funds Generous time off - Company holidays, Winter & Summer break, and flexible time off Employee Resource Groups (ERGs) - We believe that everyone should belong at their workplace. Our ERGs are available for identifying employees or allies to join. EQUAL EMPLOYMENT OPPORTUNITY (EEO) STATEMENT Artera is an Equal Opportunity Employer and is committed to fair and equitable hiring practices. All hiring decisions at Artera are based on strategic business needs, job requirements, and individual qualifications. All candidates are considered without regard to race, color, religion, gender, sexuality, national origin, age, disability, genetics, or any other protected status. Artera is committed to providing employees with a work environment free of discrimination and harassment; Artera will not tolerate discrimination or harassment of any kind. Artera provides reasonable accommodations for applicants and employees in compliance with state and federal laws. If you need an accommodation, please reach out to ************. DATA PRIVACY Artera values your privacy. By submitting your application, you consent to the processing of your personal information provided in conjunction with your application. For more information please refer to our Privacy Policy. SECURITY REQUIREMENTS All employees are responsible for protecting the confidentiality, integrity, and availability of the organization's systems and data, including safeguarding Artera's sensitive information such as Personally Identifiable Information (PII) and Protected Health Information (PHI). Those with specific security or privacy responsibilities must ensure compliance with organizational policies, regulatory requirements, and applicable standards and frameworks by implementing safeguards, monitoring for threats, reporting incidents, and addressing data handling risks or breaches.
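    The sketch referenced in the requirements above: a minimal, hypothetical Airflow DAG in Python showing the general shape of an orchestrated nightly extract-and-load job. It is not one of Artera's pipelines; the DAG id, schedule, and task bodies are placeholder assumptions.

        # Nightly extract-and-load DAG (illustrative placeholder logic only).
        from datetime import datetime

        from airflow import DAG
        from airflow.operators.python import PythonOperator


        def extract_messages(**context):
            # Placeholder: pull the previous day's message-delivery records.
            print("extracting records for", context["ds"])


        def load_to_warehouse(**context):
            # Placeholder: load the extracted records into the analytics warehouse.
            print("loading records for", context["ds"])


        with DAG(
            dag_id="nightly_message_delivery_load",
            start_date=datetime(2024, 1, 1),
            schedule="@daily",
            catchup=False,
        ) as dag:
            extract = PythonOperator(task_id="extract", python_callable=extract_messages)
            load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
            extract >> load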
    $103k-146k yearly est. Auto-Apply 14d ago
  • Lead, Part Time - Balboa Building

    The Gap 4.4 company rating

    Data engineer job in Santa Barbara, CA

    About Banana Republic Banana Republic is a storyteller's brand, outfitting the modern explorer with high-quality, expertly crafted collections made to inspire and enrich life's journeys. Founded in 1978 in San Francisco, we continue to evolve our heritage of exploration through thoughtfully designed apparel and accessories that blend timeless style with exceptional craftsmanship. Our team is made up of passionate, curious storytellers - creators and visionaries who seek out what's next and bring it to life through elevated design, immersive experiences, and a shared spirit of creativity and innovation. About the Role In this role, you will support the store leadership team by performing functional tasks as assigned. You will act as a role model to employees to support selling behaviors and the execution of tasks in specific areas of expertise. You will focus on leading processes, executing tasks, and maintaining productivity to ensure goals are met. Through collaboration with your leadership team, your goal is to role model and teach your team and drive behaviors to deliver a best-in-class customer experience. What You'll Do * Consistently treat all customers and employees with respect and contribute to a positive work environment. * Promote customer loyalty by educating customers about our loyalty programs. * All leads are expected to become experts of the brand's selling behaviors by role modeling these behaviors with every customer who walks through our doors and allowing us to provide an exceptional customer experience. * Support sales leader during (non-peak) hours, with the customer as the primary focus * Support the store leadership team by collaborating effectively with employees and ensuring work tasks are completed in a timely and efficient manner * Build and share expertise in the product lifecycle * Support completion of work before or after the store operating hours, inclusive of opening and/or closing checklists * Leverage omni-channel to deliver a frictionless customer experience. * Ensure all compliance standards are met. Who You Are * You embody Gap Inc's Purpose, Mission, Vision, Values and Behaviors * Provides clear and direct communication of expectations. * Ability to utilize technology effectively to engage with customers and team to meet goals * Demonstrate interest and initiative towards continuous improvement and growth * Agreeable to work a flexible schedule to meet the needs of the business, including holiday, evening, overnight and weekend shifts. * Able to maneuver around the sales floor, stockroom and office and can lift up to 30 pounds. Benefits at Banana Republic * Merchandise discount for our brands: 50% off regular-priced merchandise at Old Navy, Gap, Banana Republic and Athleta, and 30% off at Outlet for all employees. * One of the most competitive Paid Time Off plans in the industry.* * Employees can take up to five "on the clock" hours each month to volunteer at a charity of their choice.* * Extensive 401(k) plan with company matching for contributions up to four percent of an employee's base pay.* * Employee stock purchase plan.* * Medical, dental, vision and life insurance.* * See more of the benefits we offer. * For eligible employees Gap Inc. is an equal-opportunity employer and is committed to providing a workplace free from harassment and discrimination. We are committed to recruiting, hiring, training and promoting qualified people of all backgrounds, and make all employment decisions without regard to any protected status. 
We have received numerous awards for our long-held commitment to equality and will continue to foster a diverse and inclusive environment of belonging. In 2022, we were recognized by Forbes as one of the World's Best Employers and one of the Best Employers for Diversity. Hourly Range: $16.60 - $20.75 USD Employee pay will vary based on factors such as qualifications, experience, skill level, competencies and work location. We will meet minimum wage or minimum of the pay range (whichever is higher) based on city, county and state requirements.
    $16.6-20.8 hourly 42d ago

Learn more about data engineer jobs

How much does a data engineer earn in Santa Barbara, CA?

The average data engineer in Santa Barbara, CA earns between $88,000 and $171,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Santa Barbara, CA

$123,000

What are the biggest employers of Data Engineers in Santa Barbara, CA?

The biggest employers of Data Engineers in Santa Barbara, CA are:
  1. AppFolio
  2. Evidation Health
  3. Artera
  4. Metrosys