Analytics Engineer
Requirements engineer job in Los Angeles, CA
Proper Hospitality is seeking a visionary Analytics Engineer to help build the future of data across our growing portfolio.
You will be the foundational data architect for Proper's next-generation hospitality intelligence platform, joining as the founding member of the data engineering team. Your mission is to design, build, and own the semantic modeling layer, identity-resolution framework, and core data infrastructure that powers analytics, personalization, membership logic, and AI-driven operations across all Proper properties. This is a hands-on role with direct ownership of data modeling, governance, and data quality, with influence over technical direction, vendor management, and cross-functional alignment.
You will collaborate with data engineering vendors, AI/ML engineering vendors and internal business leaders across operations, marketing, revenue management and sales to ensure our data infrastructure is accurate, scalable, governed, and actionable. You will be responsible for portfolio-wide hotel performance analytics, trend identification, and decision-support for Operations, Finance, Revenue Management, and senior executives.
Key Responsibilities
Data Architecture & Modeling
Design and own the company-wide dimensional modeling strategy using dbt (data build tool)
Create and maintain clean, well-documented, version-controlled models for core domains (PMS, POS, spa/wellness, membership, digital, etc.)
Establish and enforce naming conventions, data contracts, lineage, and schema governance
Identity Resolution & Guest Graph
Architect and maintain the Proper guest identity graph, unifying data across all systems into a single, accurate guest profile
Develop deterministic and heuristic matching rules; iterate on feature extraction, merging logic, and identity quality metrics
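For illustration only, here is a minimal Python sketch of the deterministic side of such matching rules; the record fields, normalization, and rules are hypothetical stand-ins, not Proper's actual logic:

```python
import re
from dataclasses import dataclass

@dataclass
class GuestRecord:
    source: str               # e.g., "pms", "pos", "spa" (hypothetical source tags)
    email: str | None
    phone: str | None
    last_name: str | None

def normalize_phone(phone: str | None) -> str | None:
    """Keep digits only so formatting differences don't block a match."""
    if not phone:
        return None
    digits = re.sub(r"\D", "", phone)
    return digits[-10:] if len(digits) >= 10 else None

def deterministic_match(a: GuestRecord, b: GuestRecord) -> bool:
    """Rule 1: exact email. Rule 2: normalized phone plus last name."""
    if a.email and b.email and a.email.lower() == b.email.lower():
        return True
    pa, pb = normalize_phone(a.phone), normalize_phone(b.phone)
    if pa and pa == pb and a.last_name and b.last_name:
        return a.last_name.lower() == b.last_name.lower()
    return False
```

In practice, deterministic rules like these usually run first, with heuristic or probabilistic scoring reserved for the records they leave unmatched.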
Data Quality & Reliability
Implement robust data validation, monitoring, and alerting frameworks to ensure completeness, accuracy, and timeliness across all pipelines
Partner with contractors to ensure staging layers ingest data consistently and reliably
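As a hedged sketch of what such a framework's checks might look like (the table fields, thresholds, and alert wording here are invented for the example):

```python
from datetime import datetime, timedelta, timezone

def check_pipeline_health(rows: list[dict], expected_min_rows: int,
                          max_staleness: timedelta) -> list[str]:
    """Return human-readable alerts; an empty list means the feed looks healthy."""
    alerts = []
    if len(rows) < expected_min_rows:
        alerts.append(f"completeness: {len(rows)} rows, expected >= {expected_min_rows}")
    if rows:
        newest = max(r["loaded_at"] for r in rows)   # tz-aware datetimes assumed
        if datetime.now(timezone.utc) - newest > max_staleness:
            alerts.append(f"timeliness: newest row loaded at {newest.isoformat()}")
    missing = sum(1 for r in rows if r.get("guest_id") is None)
    if missing:
        alerts.append(f"accuracy: {missing} rows missing guest_id")
    return alerts
```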
Business-Facing Metrics & Semantic Layer
Define and maintain authoritative metric definitions (LTV, ADR, occupancy, conversion, channel attribution, membership value, churn); the standard room-metric arithmetic is sketched after this list
Build accessible data marts and semantic layers that can serve BI tools, CRM systems, and AI services
Design metrics and data visualizations with dashboarding tools like Tableau, Sigma, and Mode
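The room metrics named above have standard industry definitions; a minimal Python sketch (the function names are illustrative, the arithmetic is standard):

```python
def adr(room_revenue: float, rooms_sold: int) -> float:
    """Average Daily Rate: room revenue per occupied room night."""
    return room_revenue / rooms_sold if rooms_sold else 0.0

def occupancy(rooms_sold: int, rooms_available: int) -> float:
    """Share of available room nights that were sold."""
    return rooms_sold / rooms_available if rooms_available else 0.0

def revpar(room_revenue: float, rooms_available: int) -> float:
    """Revenue per available room; equals adr(...) * occupancy(...)."""
    return room_revenue / rooms_available if rooms_available else 0.0

# e.g., $30,000 of room revenue, 150 of 200 rooms sold:
# adr -> 200.0, occupancy -> 0.75, revpar -> 150.0
```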
Cross-Functional Collaboration
Work closely with operations, revenue management, marketing, and the executive team to understand data needs and translate them into scalable models
Provide technical guidance and enforce standards with third-party engineering vendors
Champion high data-integrity standards across functions to increase reusability, readability, and standardization
Hotel Performance Analytics
Build recurring analytical frameworks and dashboards for property-level and portfolio-level insights (occupancy, ADR, RevPAR, segmentation mix, pickup behavior, channel performance, cost-per-room, labor productivity, F&B covers, check averages, menu engineering)
Detect structural trends and operational inefficiencies by analyzing PMS, POS, labor, spa, digital, and membership datasets
Partner with property and cluster leadership to interpret trends, validate root causes, and tie data outputs to operational actions
Build forecasting models for occupancy, F&B demand, spa utilization, labor, and revenue (a baseline sketch follows this list)
Produce executive-level performance briefs that combine data engineering rigor with applied hospitality interpretation
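As a baseline for the forecasting work above, a seasonal-naive sketch; real models would be more sophisticated, and the occupancy numbers below are made up:

```python
def seasonal_naive_forecast(history: list[float], season_length: int = 7,
                            horizon: int = 7) -> list[float]:
    """Project each future day from the same weekday in the last full season."""
    if len(history) < season_length:
        raise ValueError("need at least one full season of history")
    last_season = history[-season_length:]
    return [last_season[i % season_length] for i in range(horizon)]

# Two weeks of made-up daily occupancy rates, Monday-first:
occupancy_history = [0.72, 0.75, 0.80, 0.86, 0.93, 0.95, 0.88,
                     0.70, 0.74, 0.81, 0.85, 0.94, 0.96, 0.90]
print(seasonal_naive_forecast(occupancy_history))  # forecast for the next 7 days
```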
AI/ML Enablement
Create and maintain feature tables for predictive models (propensity, demand forecasting, churn, LTV)
Support experimentation and real-time personalization use cases by providing clean features and stable data sources
Documentation & Governance
Maintain comprehensive documentation of all datasets, lineage, assumptions, and transformations
Own data governance, security, privacy compliance, and access controls in coordination with leadership
Qualifications
Required
4-7+ years of hands-on experience in analytics engineering, data engineering, or modern data stack architecture
Expert-level SQL
Deep experience with dbt, dimensional modeling, and analytics engineering best practices
Strong understanding of cloud data warehouses (Snowflake, BigQuery, or Databricks)
Experience building and validating ETL/ELT pipelines and working with raw staging layers
Strong understanding of data quality frameworks, testing, lineage, and documentation
Demonstrated ability to unify data across disparate systems and design customer 360 profiles
Proven ability to translate raw data into actionable insights for operators, leaders, and executives
Preferred
Experience in hospitality, retail, wellness, or membership-based businesses
Familiarity with reverse-ETL tools (Hightouch, Census)
Experience with event streaming (Kafka, Pub/Sub) and real-time architecture
Exposure to Python for data modeling or feature engineering
Understanding of marketing automation platforms (Klaviyo, Salesforce, Braze)
Strong data privacy and governance understanding (GDPR/CCPA)
Success in the First 90 Days Looks Like
Proper-wide data modeling standards defined and documented
Unified guest identity graph MVP created and validated on core systems
dbt project structured, version-controlled, and integrated with CI/CD
Vendor pipelines reviewed, documented, and aligned with governance
First wave of clean, tested metric tables delivered to stakeholders
Proper's first set of high-value feature tables ready for AI/ML use cases
Delivery of first hotel performance analytics suite roadmap (occupancy, ADR, RevPAR, segmentation, labor, F&B) with recommended actions
Salary
$155,000-185,000
Proper Perks & Benefits
Compensation & Recognition
Competitive Salary + Bonus: Rewarding exceptional talent and performance across all levels.
Recognition Programs: Celebrating achievements big and small through company-wide appreciation and milestone rewards.
Annual Performance Reviews: Regular opportunities for feedback, growth, and advancement.
Culture of Growth & Belonging
Culture of Growth: A collaborative, design-forward environment that values creativity, intelligence, and curiosity - where learning and excellence are a daily practice.
Guided Skills Development: Access to training, leadership programs, mentorship, and cross-property mobility to encourage achievement and discovery.
Diversity, Equity, Inclusion & Belonging: We honor individuality while fostering a culture of respect and belonging across all teams.
Community Engagement: Opportunities to give back through local volunteerism, sustainability, and charitable partnerships.
Health & Wellness
Comprehensive Health Coverage: Medical, dental, and vision plans through Aetna, designed to fit a range of personal and family needs.
Wellness Access: Company-subsidized memberships with Equinox and ClassPass, plus wellbeing workshops and mental health resources.
Employee Assistance Program (EAP): Confidential support for emotional wellbeing, financial planning, and life management through Unum.
Time Off & Flexibility
Paid Time Off: Flexible PTO plus 11 paid holidays each year for corporate team members.
Paid Parental Leave: Paid time off for eligible employees welcoming a new child through birth, adoption, or foster placement.
Flexible Work Practices: Hybrid schedules for eligible roles and an emphasis on work-life balance.
Financial Wellbeing & Core Protections
401(k) Program: Company match of 50% of employee deferrals, up to the first 4% of eligible compensation.
Employer-Paid Life & Disability Insurance: Core protections with optional additional coverage.
Financial Education: Access to planning tools and workshops to support long-term stability and growth.
Lifestyle & Travel Perks
Hotel Stay Benefits: 75% off BAR (floor of $100) across the Proper portfolio.
Design Hotels Partnership: 50% off participating Marriott Design Hotels.
Dining Discounts: 75% off food & beverage at all Proper Hospitality outlets.
Lifestyle Perks: Complimentary or subsidized parking, cell phone reimbursement, and exclusive hospitality and retail discounts.
Why Join Proper Hospitality
At Proper, we build experiences that move people - and that begins with the team behind them. As a best-in-class employer, we're committed to creating one of the Best Places to Work in hospitality by nurturing a culture where creativity, excellence, and humanity thrive together.
Everything we do is grounded in the belief that hospitality is more than a profession - it's an opportunity to care for others and make lives better. Guided by the Pillars of Proper, we show up with warmth and authenticity (Care Proper), strive for excellence in everything we do (Achieve Proper), think creatively and resourcefully (Imagine Proper), and take pride in the style and culture that make us who we are (Present Proper).
We believe our people are our greatest strength, and we invest deeply in their wellbeing, growth, and sense of belonging. From comprehensive benefits to meaningful development programs, Proper is designed to help you build a career, and a life, that feels as inspiring as the experiences we create for our guests.
Our Commitment: Building the Best Place to Work
Our Best Place to Work initiative is a living commitment - a continuous investment in our people, our culture, and our purpose. We listen, learn, and evolve together to create an environment where everyone feels empowered to imagine boldly, achieve confidently, care deeply, and present themselves authentically.
At Proper, joining the team means more than finding a job - it means joining a community that believes in building beautiful experiences together, for our guests and for one another.
Thermal Engineer
Requirements engineer job in Los Angeles, CA
BCforward is seeking a highly motivated and experienced Thermal Engineer to join our dynamic team in Los Angeles, CA.
Thermal Engineer
Work Arrangement: (100% Onsite)
Duration: 3+ months (Possible Extension).
Basic qualifications
Bachelor's degree in mechanical engineering or related discipline, or equivalent experience.
3+ years of experience with Thermal Desktop (SINDA) and/or Ansys thermal analysis tools (Icepak, Fluent, Mechanical, etc.)
2+ years of experience with test planning, test set-up (thermocouple and heater installation), operating DAQs and power supplies, results correlation, and system verification for production
Experience in documentation and writing test reports.
Preferred qualifications
Experience with avionics thermal design and analysis.
CAD skills (NX or Solidworks).
Experience with interpreting and correlating test data to thermal models.
Specific tasks that this individual will support are as follows:
Provide design inputs to the mechanical and electrical teams.
Complete thermal modeling analysis using Ansys analysis tools.
Develop comprehensive thermal analysis documentation
Develop thermal testing and qualification plan.
Conduct thermal testing and model validation.
Perform trade studies to select architecture of the electronic enclosures.
Perform preliminary thermal analysis of printed circuit board assemblies.
Collaborate with design team to develop preliminary designs.
Define and execute development testing.
ServiceNow CMDB Engineer
Requirements engineer job in Irvine, CA
Employment Type: Full-Time, Direct Hire (W2 Only - No sponsorship available)
About the Role
We're seeking a skilled and driven ServiceNow CMDB Engineer to join our team in Irving, TX. This is a hands-on, onsite role focused on designing, implementing, and maintaining a robust Configuration Management Database (CMDB) aligned with ServiceNow's Common Service Data Model (CSDM). You'll play a critical role in enhancing IT operations, asset management, and service delivery across the enterprise.
Responsibilities
Architect, configure, and maintain the ServiceNow CMDB to support ITOM and ITAM initiatives
Implement and optimize CSDM frameworks to ensure data integrity and alignment with business services
Collaborate with cross-functional teams to define CI classes, relationships, and lifecycle processes
Develop and enforce CMDB governance, data quality standards, and reconciliation rules
Integrate CMDB with discovery tools and external data sources
Support audits, compliance, and reporting requirements related to ITIL processes
Troubleshoot and resolve CMDB-related issues and performance bottlenecks
Qualifications
3+ years of hands-on experience with ServiceNow CMDB and CSDM implementation
Strong understanding of ITIL practices and ITOM/ITAM modules
Proven ability to manage CI lifecycle and maintain data accuracy
Experience with ServiceNow Discovery, Service Mapping, and integrations
ServiceNow Certified System Administrator (CSA) or higher certifications preferred
Excellent communication and documentation skills
Must be authorized to work in the U.S. without sponsorship
Perks & Benefits
Competitive compensation package
Collaborative and innovative work environment
Opportunity to work with cutting-edge ServiceNow technologies
Snowflake DBT Engineer - CDC5697451
Requirements engineer job in Irvine, CA
Key Responsibilities
Design, develop, and maintain ELT pipelines using Snowflake and dbt
Build and optimize data models in Snowflake to support analytics and reporting
Implement modular, testable SQL transformations using dbt
Integrate dbt workflows into CI/CD pipelines and manage infrastructure as code using Terraform
Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions
Optimize Snowflake performance through clustering, partitioning, indexing, and materialized views
Automate data ingestion and transformation workflows using Airflow or similar orchestration tools
Ensure data quality, governance, and security across pipelines
Troubleshoot and resolve performance bottlenecks and data issues
Maintain documentation for data architecture, pipelines, and operational procedures
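To make the orchestration items above concrete, here is a minimal Airflow DAG that runs and then tests a dbt project; the DAG id, schedule, and project path are hypothetical:

```python
from datetime import datetime

from airflow import DAG                       # Airflow 2.x assumed
from airflow.operators.bash import BashOperator

# The DAG id, schedule, and project path below are hypothetical.
with DAG(
    dag_id="snowflake_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",                     # daily at 06:00 UTC
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test                       # test models only after they build
```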
Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
10 years of experience in data engineering, with at least 3 years focused on Snowflake and dbt
Strong proficiency in SQL and Python
Experience with cloud platforms (AWS, GCP, or Azure)
Familiarity with Git, CI/CD, and Infrastructure-as-Code tools (Terraform, CloudFormation)
Knowledge of data modeling (star schema, normalization) and ELT best practices
Senior Data Engineer
Requirements engineer job in Calabasas, CA
City: Calabasas, CA/ Las Vegas, NV
Onsite/ Hybrid/ Remote: Onsite 4 days a week
Duration: 6 months Contract to Hire
Rate Range: $85/hr W2
Work Authorization: GC, USC Only
Must Have:
Databricks
Python
Azure
API development
ETL pipelines
DevOps and CI/CD
Responsibilities:
Design and build scalable batch and real-time data pipelines.
Develop data ingestion, processing, and analytical workflows.
Build data products and intelligent APIs.
Ensure data quality, reliability, and performance.
Collaborate with cross-functional teams to translate business needs into data solutions.
Support cloud-based data architecture for BI and AI/ML use cases.
Participate in code reviews and CI/CD practices.
Qualifications:
Bachelor's degree in a related technical field required.
8+ years of data engineering experience.
Strong experience with Databricks, Spark, and cloud platforms.
Proficiency in Python and SQL.
Hands-on experience with Azure data services.
Experience with REST APIs and DevOps practices.
Agile development experience.
Azure & Microsoft Fabric Data Engineer & Architect
Requirements engineer job in Los Angeles, CA
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, including internationally in Mexico and India. Our global solutions team is seeking an Azure & Microsoft Fabric Data Engineer/Architect to support and lead our Media & Entertainment client in building a next-generation financial data platform. We're looking for someone who can contribute strategically and who has the hands-on skills to provide subject-matter expertise. In this role, you'll design and build enterprise-level data models, lead data migration efforts into Azure, and develop cutting-edge data processing pipelines using Microsoft Fabric. If you thrive at the intersection of architecture and hands-on engineering, and want to help shape a modern financial system with complex upstream data processing - this is the opportunity for you!
This project will require the person to work onsite at the Burbank / Studio City adjacent location 3 days per week. We are setting up interviews immediately and look forward to hearing from you!
This is a hybrid position requiring 3-4 days per week onsite.
Responsibilities
Architect, design, and hands-on develop end-to-end data solutions using Azure Data Services and Microsoft Fabric.
Build and maintain complex data models in Azure SQL, Lakehouse, and Fabric environments that support advanced financial calculations.
Lead and execute data migration efforts from multiple upstream and legacy systems into Azure and Fabric.
Develop, optimize, and maintain ETL/ELT pipelines using Microsoft Fabric Data Pipelines, Data Factory, and Azure engineering tools.
Perform hands-on SQL development, including stored procedures, query optimization, performance tuning, and data transformation logic.
Partner with finance, engineering, and product stakeholders to translate requirements into scalable, maintainable data solutions.
Ensure data quality, lineage, profiling, and governance across ingestion and transformation layers.
Tune and optimize Azure SQL databases and Fabric Lakehouse environments for performance and cost efficiency.
Troubleshoot data processing and pipeline issues to maintain stability and reliability.
Document architecture, data flows, engineering standards, and best practices.
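As a small illustration of the hands-on SQL development described above, a Python sketch that runs a parameterized T-SQL aggregate against Azure SQL via pyodbc; the server, table, and column names are invented for the example:

```python
import pyodbc

# Hypothetical server, database, and schema; credentials would come from a vault.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=finance;"
    "UID=etl_user;PWD=<from-key-vault>"
)

# Parameterized T-SQL keeps query plans cacheable and avoids injection.
sql = """
SELECT title_id, SUM(gross_amount) AS gross
FROM dbo.fact_participation
WHERE statement_period = ?
GROUP BY title_id
ORDER BY gross DESC;
"""
cur = conn.cursor()
cur.execute(sql, ("2024-Q4",))
for title_id, gross in cur.fetchmany(10):     # top 10 titles for the period
    print(title_id, gross)
conn.close()
```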
Qualifications
Expert, hands-on experience with Azure Data Services (Azure SQL, Data Factory, Data Lake Storage, Synapse, Azure Storage).
Deep working knowledge of Microsoft Fabric, including Data Engineering workloads, Lakehouse, Fabric SQL, Pipelines, and governance.
Strong experience designing and building data models within Azure SQL and Fabric architectures.
Proven track record delivering large-scale data migrations into Azure environments.
Advanced proficiency in SQL/T-SQL, including stored procedures, indexing, and performance tuning.
Demonstrated success building and optimizing ETL/ELT pipelines for complex financial or multi-source datasets.
Understanding of financial systems, data structures, and complex calculation logic.
Excellent communication and documentation skills with the ability to collaborate across technical and business teams.
Additional Details
The base range for this contract position is $70-85/per hour, depending on experience. Our pay ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hires of this position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Qualified applicants with arrest or conviction records will be considered.
Benefits
Medical coverage and Health Savings Account (HSA) through Anthem
Dental/Vision/Various Ancillary coverages through Unum
401(k) retirement savings plan
Company-paid Employee Assistance Program (EAP)
Discount programs through ADP WorkforceNow
About Us
STAND 8 provides end-to-end IT solutions to enterprise partners across the United States and globally with offices in Los Angeles, Atlanta, New York, Mexico, Japan, India, and more. STAND 8 focuses on the "bleeding edge" of technology and leverages automation, process, marketing, and over fifteen years of success and growth to provide a world-class experience for our customers, partners, and employees. Our mission is to impact the world positively by creating success through PEOPLE, PROCESS, and TECHNOLOGY.
Check out more at ************** and reach out today to explore opportunities to grow together!
By applying to this position, your data will be processed in accordance with the STAND 8 Privacy Policy.
Senior Data Engineer
Requirements engineer job in Los Angeles, CA
Robert Half is partnering with a well-known brand seeking an experienced Data Engineer with Databricks experience. Working alongside data scientists and software developers, your work will directly impact dynamic pricing strategies by ensuring the availability, accuracy, and scalability of data systems. This position is full time with full benefits and 3 days onsite in the Woodland Hills, CA area.
Responsibilities:
Design, build, and maintain scalable data pipelines for dynamic pricing models.
Collaborate with data scientists to prepare data for model training, validation, and deployment.
Develop and optimize ETL processes to ensure data quality and reliability.
Monitor and troubleshoot data workflows for continuous integration and performance.
Partner with software engineers to embed data solutions into product architecture.
Ensure compliance with data governance, privacy, and security standards.
Translate stakeholder requirements into technical specifications.
Document processes and contribute to data engineering best practices.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
4+ years of experience in data engineering, data warehousing, and big data technologies.
Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
Must have experience in Databricks.
Experience working within Azure or AWS or GCP environment.
Familiarity with big data tools like Spark, Hadoop, or Databricks.
Experience in real-time data pipeline tools.
Experienced with Python
Snowflake/AWS Data Engineer
Requirements engineer job in Irvine, CA
Sr. Data Engineer
Full Time Direct Hire Job
Hybrid, with work location in Irvine, CA.
The Senior Data Engineer will help design and build a modern data platform that supports enterprise analytics, integrations, and AI/ML initiatives. This role focuses on developing scalable data pipelines, modernizing the enterprise data warehouse, and enabling self-service analytics across the organization.
Key Responsibilities
• Build and maintain scalable data pipelines using Snowflake, dbt, and Fivetran.
• Design and optimize enterprise data models for performance and scalability.
• Support data cataloging, lineage, quality, and compliance efforts.
• Translate business and analytics requirements into reliable data solutions.
• Use AWS (primarily S3) for storage, integration, and platform reliability.
• Perform other data engineering tasks as needed.
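For a flavor of the pipeline work described above, a minimal sketch that loads staged S3 files into Snowflake with the Python connector; the account, stage, and table names are placeholders, not this employer's actual objects:

```python
import snowflake.connector

# Account, warehouse, stage, and table names are placeholders.
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="loader", password="<from-secrets-manager>",
    warehouse="LOAD_WH", database="RAW", schema="SALES",
)
cur = conn.cursor()
try:
    # Pull newly landed S3 files (exposed via an external stage) into a raw table.
    cur.execute("""
        COPY INTO raw.sales.orders
        FROM @raw.sales.s3_orders_stage
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())        # per-file load results
finally:
    cur.close()
    conn.close()
```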
Required Qualifications
• Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field.
• 5+ years of data engineering experience.
• Hands-on expertise with Snowflake, dbt, and Fivetran.
• Strong background in data warehousing, dimensional modeling, and SQL.
• Experience with AWS (S3) and data governance tools such as Alation or Atlan.
• Proficiency in Python for scripting and automation.
• Experience with streaming technologies (Kafka, Kinesis, Flink) a plus.
• Knowledge of data security and compliance best practices.
• Exposure to AI/ML workflows and modern BI tools like Power BI, Tableau, or Looker.
• Ability to mentor junior engineers.
Skills
• Snowflake
• dbt
• Fivetran
• Data modeling and warehousing
• AWS
• Data governance
• SQL
• Python
• Strong communication and cross-functional collaboration
• Interest in emerging data and AI technologies
Data Analytics Engineer
Requirements engineer job in Irvine, CA
We are seeking a Data Analytics Engineer to join our team. This hybrid role serves as Database Administrator, Data Engineer, and Data Analyst, responsible for managing core data infrastructure, developing and maintaining ETL pipelines, and delivering high-quality analytics and visual insights to executive stakeholders. The role bridges technical execution with business intelligence, ensuring that data across Salesforce, financial, and operational systems is accurate, accessible, and strategically presented.
Essential Functions
Database Administration: Oversee and maintain database servers, ensuring performance, reliability, and security. Manage user access, backups, and data recovery processes while optimizing queries and database operations.
Data Engineering (ELT): Design, build, and maintain robust ELT pipelines (SQL/DBT or equivalent) to extract, transform, and load data across Salesforce, financial, and operational sources. Ensure data lineage, integrity, and governance throughout all workflows.
Data Modeling & Governance: Design scalable data models and maintain a governed semantic layer and KPI catalog aligned with business objectives. Define data quality checks, SLAs, and lineage standards to reconcile analytics with finance source-of-truth systems.
Analytics & Reporting: Develop and manage executive-facing Tableau dashboards and visualizations covering key lending and operational metrics - including pipeline conversion, production, credit quality, delinquency/charge-offs, DSCR, and LTV distributions.
Presentation & Insights: Translate complex datasets into clear, compelling stories and presentations for leadership and cross-functional teams. Communicate findings through visual reports and executive summaries to drive strategic decisions.
Collaboration & Integration: Partner with Finance, Capital Markets, and Operations to refine KPIs and perform ad-hoc analyses. Collaborate with Engineering to align analytical and operational data, manage integrations, and support system scalability.
Enablement & Training: Conduct training sessions, create documentation, and host data office hours to promote data literacy and empower business users across the organization.
Competencies & Skills
Advanced SQL proficiency with strong data modeling, query optimization, and database administration experience (PostgreSQL, MySQL, or equivalent).
Hands-on experience managing and maintaining database servers and optimizing performance.
Proficiency with ETL/ELT frameworks (DBT, Airflow, or similar) and cloud data stacks (AWS/Azure/GCP).
Strong Tableau skills - parameters, LODs, row-level security, executive-level dashboard design, and storytelling through data.
Experience with Salesforce data structures and ingestion methods.
Proven ability to communicate and present technical data insights to executive and non-technical stakeholders.
Solid understanding of lending/financial analytics (pipeline conversion, delinquency, DSCR, LTV).
Working knowledge of Python for analytics tasks, cohort analysis, and variance reporting.
Familiarity with version control (Git), CI/CD for analytics, and data governance frameworks.
Excellent organizational, documentation, and communication skills with a strong sense of ownership and follow-through.
Education & Experience
Bachelor's degree in Computer Science, Engineering, Information Technology, Data Analytics, or a related field.
3+ years of experience in data analytics, data engineering, or database administration roles.
Experience supporting executive-level reporting and maintaining database infrastructure in a fast-paced environment.
Data Engineer
Requirements engineer job in Irvine, CA
Job Title: Data Engineer
Duration: Direct-Hire Opportunity
We are looking for a Data Engineer who is hands-on, collaborative, and experienced with Microsoft SQL Server, Snowflake, AWS RDS, and MySQL. The ideal candidate has a strong background in data warehousing, data lakes, ETL pipelines, and business intelligence tools.
This role plays a key part in executing data strategy - driving optimization, reliability, and scalable BI capabilities across the organization. It's an excellent opportunity for a data professional who wants to influence architectural direction, contribute technical expertise, and grow within a data-driven company focused on innovation.
Key Responsibilities
Design, develop, and maintain SQL Server and Snowflake data warehouses and data lakes, focusing on performance, governance, and security.
Manage and optimize database solutions within Snowflake, SQL Server, MySQL, and AWS RDS.
Build and enhance ETL pipelines using tools such as Snowpipe, DBT, Boomi, SSIS, and Azure Data Factory.
Utilize data tools such as SSMS, Profiler, Query Store, and Redgate for performance tuning and troubleshooting.
Perform database administration tasks, including backup, restore, and monitoring.
Collaborate with Business Intelligence Developers and Business Analysts on enterprise data projects.
Ensure database integrity, compliance, and adherence to best practices in data security.
Configure and manage data integration and BI tools such as Power BI, Tableau, Power Automate, and scripting languages (Python, R).
Qualifications
Proficiency with Microsoft SQL Server, including advanced T-SQL development and optimization.
7+ years working as a SQL Server Developer/Administrator, with experience in relational and object-oriented databases.
2+ years of experience with Snowflake data warehouse and data lake solutions.
Experience developing pipelines and reporting solutions using Power BI, SSRS, SSIS, Azure Data Factory, or DBT.
Scripting and automation experience using Python, PowerShell, or R.
Familiarity with data integration and analytics tools such as Boomi, Redshift, or Databricks (a plus).
Excellent communication, problem-solving, and organizational skills.
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
Technical Skills
SQL Server / Snowflake / MySQL / AWS RDS
ETL Development (Snowpipe, SSIS, Azure Data Factory, DBT)
BI Tools (Power BI, Tableau)
Python, R, PowerShell
Data Governance & Security Best Practices
Determining compensation for this role (and others) at Vaco/Highspring depends upon a wide array of factors including but not limited to the individual's skill sets, experience and training, licensure and certifications, office location and other geographic considerations, as well as other business and organizational needs. With that said, as required by local law in geographies that require salary range disclosure, Vaco/Highspring notes the salary range for the role is noted in this job posting. The individual may also be eligible for discretionary bonuses, and can participate in medical, dental, and vision benefits as well as the company's 401(k) retirement plan. Additional disclaimer: Unless otherwise noted in the job description, the position Vaco/Highspring is filling is occupied. Please note, however, that Vaco/Highspring is regularly asked to provide talent to other organizations. By submitting to this position, you are agreeing to be included in our talent pool for future hiring for similarly qualified positions. Submissions to this position are subject to the use of AI to perform preliminary candidate screenings, focused on ensuring minimum job requirements noted in the position are satisfied. Candidates advancing beyond this initial phase will be assessed by Vaco/Highspring recruiters and hiring managers. Vaco/Highspring does not have knowledge of the tools used by its clients in making final hiring decisions and cannot opine on their use of AI products.
Data Engineer
Requirements engineer job in Irvine, CA
Thank you for stopping by to take a look at the Data Engineer role I posted here on LinkedIn; I appreciate it.
If you have read my postings in the past, you will recognize how I write job descriptions. If you are new, allow me to introduce myself. My name is Tom Welke. I am Partner & VP at RSM Solutions, Inc. I have been recruiting technical talent for more than 23 years and have been in the tech space since the 1990s. Because of this, I actually write JDs myself... no AI, no 'bots', just a real live human. I realized a while back that looking for work is about as fun as a root canal with no anesthesia... especially now. So, rather than saying 'must work well with others' and 'team mindset', I do away with that kind of nonsense and just tell it like it is.
So, as with every role I work on, social fit is almost as important as technical fit. For this one, technical fit is very, very important. But we also have some social fit characteristics that matter. This is the kind of place that requires people to dive in and learn. The hiring manager for this one is actually a very dear friend of mine. He said something interesting to me not long ago: if you aren't spending at least an hour a day learning something new, you really are doing yourself a disservice. This is that classic environment where no one says 'this is not my job', so the ability to jump in and help is needed for success in this role.
This role is being done onsite in Irvine, California. I prefer working with candidates that are already local to the area. If you need to relocate, that is fine, but there are no relocation dollars available.
I can only work with US Citizens or Green Card Holders for this role. I cannot work with H1, OPT, EAD, F1, H4, or anyone that is not already a US Citizen or Green Card Holder for this role.
The Data Engineer role is similar to the Data Integration role I posted. However, this one is more Ops focused: orchestrating deployments and MLflow, orchestrating and using data on the clusters, and managing how the models are performing. This role focuses on coding and configuring on the ML side of the house.
You will be designing, automating, and observing end-to-end data pipelines that feed this client's Kubeflow-driven machine learning platform, ensuring models are trained, deployed, and monitored on trustworthy, well-governed data. You will build batch/stream workflows, wire them into Azure DevOps CI/CD, and surface real-time health metrics in Prometheus + Grafana dashboards to guarantee data availability. The role bridges Data Engineering and MLOps, allowing data scientists to focus on experimentation while the business sees rapid, reliable predictive insight.
Here are some of the main responsibilities:
Design and implement batch and streaming pipelines in Apache Spark running on Kubernetes and Kubeflow Pipelines to hydrate feature stores and training datasets.
Build high-throughput ETL/ELT jobs with SSIS, SSAS, and T-SQL against MS SQL Server, applying Data Vault-style modeling patterns for auditability.
Integrate source control, build, and release automation using GitHub Actions and Azure DevOps for every pipeline component.
Instrument pipelines with Prometheus exporters and visualize SLA, latency, and error budget metrics to enable proactive alerting.
Create automated data quality and schema drift checks; surface anomalies to support a rapid incident response process.
Use MLflow Tracking and Model Registry to version artifacts, parameters, and metrics for reproducible experiments and safe rollbacks.
Work with data scientists to automate model retraining and deployment triggers within Kubeflow based on data freshness or concept drift signals.
Develop PowerShell and .NET utilities to orchestrate job dependencies, manage secrets, and publish telemetry to Azure Monitor.
Optimize Spark and SQL workloads through indexing, partitioning, and cluster sizing strategies, benchmarking performance in CI pipelines.
Document lineage, ownership, and retention policies; ensure pipelines conform to PCI/SOX and internal data governance standards.
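As one concrete example of the Prometheus instrumentation mentioned above, a minimal exporter sketch using prometheus_client; the metric names, labels, and port are hypothetical, not this client's conventions:

```python
import time
from prometheus_client import Counter, Gauge, start_http_server

# Metric and label names are hypothetical; real ones would follow team conventions.
ROWS_PROCESSED = Counter("pipeline_rows_processed_total",
                         "Rows processed by the pipeline", ["source"])
LAST_SUCCESS = Gauge("pipeline_last_success_timestamp_seconds",
                     "Unix time of the last successful run", ["source"])

def run_batch(source: str, rows: list[dict]) -> None:
    # ... transform and load `rows` here ...
    ROWS_PROCESSED.labels(source=source).inc(len(rows))
    LAST_SUCCESS.labels(source=source).set(time.time())

if __name__ == "__main__":
    start_http_server(9108)      # Prometheus scrapes this port
    run_batch("pos", [{"id": 1}, {"id": 2}])
    time.sleep(60)               # keep the endpoint alive long enough to scrape
```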
Here is what we are seeking:
At least 6 years of experience building data pipelines in Spark or equivalent.
At least 2 years deploying workloads on Kubernetes/Kubeflow.
At least 2 years of experience with MLflow or similar experiment-tracking tools.
At least 6 years of experience in T-SQL and Python/Scala for Spark.
At least 6 years of PowerShell/.NET scripting.
At least 6 years of experience with GitHub, Azure DevOps, Prometheus, Grafana, and SSIS/SSAS.
Kubernetes CKA/CKAD, Azure Data Engineer (DP-203), or MLOps-focused certifications (e.g., Kubeflow or MLflow) would be great to see.
A willingness to mentor engineers on best practices in containerized data engineering and MLOps.
Senior Data Engineer - Snowflake / ETL (Onsite)
Requirements engineer job in Beverly Hills, CA
CGS Business Solutions is committed to helping you, as an esteemed IT Professional, find the next right step in your career. We match professionals like you to rewarding consulting or full-time opportunities in your area of expertise. We are currently seeking Technical Professionals who are searching for challenging and rewarding jobs for the following opportunity:
Summary
CGS is hiring for a Senior Data Engineer to serve as a core member of the Platform team. This is a high-impact role responsible for advancing our foundational data infrastructure.
Your primary mission will be to build key components of our Policy Journal - the central source of truth for all policy, commission, and client accounting data. You'll work closely with the Lead Data Engineer and business stakeholders to translate complex requirements into scalable data models and reliable pipelines that power analytics and operational decision-making for agents, managers, and leadership.
This role blends greenfield engineering, strategic modernization, and a strong focus on delivering trusted, high-quality data products.
Overview
• Build the Policy Journal - Design and implement the master data architecture unifying policy, commission, and accounting data from sources like IVANS and Applied EPIC to create the platform's “gold record.”
• Ensure Data Reliability - Define and implement data quality checks, monitoring, and alerting to guarantee accuracy, consistency, and timeliness across pipelines - while contributing to best practices in governance.
• Build the Analytics Foundation - Enhance and scale our analytics stack (Snowflake, dbt, Airflow), transforming raw data into clean, performant dimensional models for BI and operational insights.
• Modernize Legacy ETL - Refactor our existing Java + SQL (PostgreSQL) ETL system - diagnose duplication and performance issues, rewrite critical components in Python, and migrate orchestration to Airflow.
• Implement Data Quality Frameworks - Develop automated testing and validation frameworks aligned with our QA strategy to ensure accuracy, completeness, and integrity across pipelines.
• Collaborate on Architecture & Design - Partner with product and business stakeholders to deeply understand requirements and design scalable, maintainable data solutions.
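To illustrate the kind of duplication fix this modernization might involve, a hedged Python/PostgreSQL sketch; the policy_journal table and its columns are invented stand-ins for the real schema:

```python
import psycopg2

# `policy_journal` and its columns are invented stand-ins for the real schema.
DEDUPE_SQL = """
DELETE FROM policy_journal pj
USING (
    SELECT ctid,
           ROW_NUMBER() OVER (
               PARTITION BY policy_id, txn_id
               ORDER BY loaded_at DESC
           ) AS rn
    FROM policy_journal
) d
WHERE pj.ctid = d.ctid
  AND d.rn > 1;
"""

with psycopg2.connect("dbname=policies user=etl") as conn:
    with conn.cursor() as cur:
        cur.execute(DEDUPE_SQL)          # keep only the newest copy of each txn
        print(f"removed {cur.rowcount} duplicate rows")
```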
Ideal Experience
• 5+ years of experience building and operating production-grade data pipelines.
• Expert-level proficiency in Python and SQL.
• Hands-on experience with the modern data stack - Snowflake/Redshift, Airflow, dbt, etc.
• Strong understanding of AWS data services (S3, Glue, Lambda, RDS).
• Experience working with insurance or insurtech data (policies, commissions, claims, etc.).
• Proven ability to design robust data models (e.g., dimensional modeling) for analytics.
• Pragmatic problem-solver capable of analyzing and refactoring complex legacy systems (ability to read Java/Hibernate is a strong plus - but no new Java coding required).
• Excellent communicator comfortable working with both technical and non-technical stakeholders.
Huge Plus!
• Direct experience with Agency Management Systems (Applied EPIC, Nowcerts, EZLynx, etc.)
• Familiarity with carrier data formats (ACORD XML, IVANS AL3)
• Experience with BI tools (Tableau, Looker, Power BI)
About CGS Business Solutions: CGS specializes in IT business solutions, staffing, and consulting services, with a strong focus on IT Applications, Network Infrastructure, Information Security, and Engineering. CGS is an INC 5000 company and is honored to be selected as one of the Best IT Recruitment Firms in California. After five consecutive Fastest Growing Company titles, CGS continues to break into new markets across the USA. Companies are counting on CGS to attract and help retain these resource pools in order to gain a competitive advantage in rapidly changing business environments.
Data Engineer (AWS Redshift, BI, Python, ETL)
Requirements engineer job in Manhattan Beach, CA
We are seeking a skilled Data Engineer with strong experience in business intelligence (BI) and data warehouse development to join our team. In this role, you will design, build, and optimize data pipelines and warehouse architectures that support analytics, reporting, and data-driven decision-making. You will work closely with analysts, data scientists, and business stakeholders to ensure reliable, scalable, and high-quality data solutions.
Responsibilities:
Develop and maintain ETL/ELT pipelines for ingesting, transforming, and delivering data.
Design and enhance data warehouse models (star/snowflake schemas) and BI datasets.
Optimize data workflows for performance, scalability, and reliability.
Collaborate with BI teams to support dashboards, reporting, and analytics needs.
Ensure data quality, governance, and documentation across all solutions.
Qualifications:
Proven experience with data engineering tools (SQL, Python, ETL frameworks).
Strong understanding of BI concepts, reporting tools, and dimensional modeling.
Hands-on experience with cloud data platforms (e.g., AWS, Azure, GCP) is a plus.
Excellent problem-solving skills and ability to work in a cross-functional environment.
Plumbing Engineer
Requirements engineer job in Marina del Rey, CA
We are currently seeking a Plumbing Engineer to join our team in Marina del Rey, California. SUMMARY: This position is responsible for managing and performing tests on various materials and equipment, maintaining knowledge of all product specifications, and ensuring adherence to all required standards by performing the following duties.
DUTIES AND RESPONSIBILITIES:
Build long term customer relationships with existing and potential customers.
Effectively manage Plumbing and design projects by satisfying clients' needs, meeting budget expectations and project schedules.
Provide support during construction phases.
Performs other related duties as assigned by management.
SUPERVISORY RESPONSIBILITIES:
Carries out supervisory responsibilities in accordance with the organization's policies and applicable laws.
QUALIFICATIONS:
Bachelor's Degree (BA) from a four-year college or university in Mechanical Engineering, or completed coursework in Plumbing, or one to two years of related experience and/or training, or an equivalent combination of education and experience.
Certificates, licenses and registrations required: LEED Certification is a plus.
Computer skills required: Experienced at using a computer; knowledge of MS Word, MS Excel, AutoCAD, and Revit is a plus.
Other skills required:
5 years of experience minimum; individuals should have recent experience working for a consulting engineering or engineering/architectural firm designing plumbing systems.
Experience in the following preferred:
Residential
Commercial
Multi-Family
Restaurants
Strong interpersonal skills and experience in maintaining strong client relationships are required.
Ability to communicate effectively with people through oral presentations and written communications.
Ability to motivate multi-discipline project teams in meeting clients' needs in a timely manner and meeting budget objectives.
DevOps Engineer
Requirements engineer job in Westlake Village, CA
In today's market, there is a unique duality in technology adoption. On one side, extreme focus on cost containment by clients, and on the other, deep motivation to modernize their Digital storefronts to attract more consumers and B2B customers.
As a leading Modernization Engineering company, we aim to deliver modernization-driven hypergrowth for our clients based on the deep differentiation we have created in Modernization Engineering, powered by our Lightening suite and 16-step Platformation™ playbook. In addition, we bring agility and systems thinking to accelerate time to market for our clients.
Headquartered in Bengaluru, India, Sonata has a strong global presence, including key regions in the US, UK, Europe, APAC, and ANZ. We are a trusted partner of world-leading companies in BFSI (Banking, Financial Services, and Insurance), HLS (Healthcare and Lifesciences), TMT (Telecom, Media, and Technology), Retail & CPG, and Manufacturing space. Our bouquet of Modernization Engineering Services cuts across Cloud, Data, Dynamics, Contact Centers, and around newer technologies like Generative AI, MS Fabric, and other modernization platforms.
Job Role : Sr. DevOps Engineer, platforms
Work Location: Westlake Village, CA (5 Days Onsite)
Duration: Contract to Hire
Job Description:
Responsibilities:
Design, implement, and manage scalable and resilient infrastructure on AWS.
Architect and maintain Windows/Linux based environments, ensuring seamless integration with cloud platforms.
Develop and maintain infrastructure-as-code (IaC) using both AWS CloudFormation/CDK and Terraform.
Develop and maintain Configuration Management for Windows servers using Chef.
Design, build, and optimize CI/CD pipelines using GitLab CI/CD for .NET applications.
Implement and enforce security best practices across the infrastructure and deployment processes.
Collaborate closely with development teams to understand their needs and provide DevOps expertise.
Troubleshoot and resolve infrastructure and application deployment issues.
Implement and manage monitoring and logging solutions to ensure system visibility and proactive issue detection.
Clearly and concisely contribute to the development and documentation of DevOps standards and best practices.
Stay up-to-date with the latest industry trends and technologies in cloud computing, DevOps, and security.
Provide mentorship and guidance to junior team members.
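As a taste of the IaC work above, a minimal AWS CDK (Python) app that provisions a hardened S3 bucket; the stack and bucket are hypothetical examples, not the client's actual infrastructure:

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class ArtifactStack(Stack):
    """Hypothetical bucket for CI/CD build artifacts (CDK v2, Python)."""
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self, "ArtifactBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,   # keep artifacts on stack teardown
        )

app = App()
ArtifactStack(app, "artifact-stack")
app.synth()
```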
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
5+ years of experience in a DevOps or Site Reliability Engineering (SRE) role.
Extensive hands-on experience with Amazon Web Services (AWS)
Solid understanding of Windows/Linux Server administration and integration with cloud environments.
Proven experience with infrastructure-as-code tools, specifically AWS CDK and Terraform.
Strong experience designing and implementing CI/CD pipelines using GitLab CI/CD.
Experience deploying and managing .NET applications in cloud environments.
Deep understanding of security best practices and their implementation in cloud infrastructure and CI/CD pipelines.
Solid understanding of networking principles (TCP/IP, DNS, load balancing, firewalls) in cloud environments.
Experience with monitoring and logging tools (e.g., NewRelic, CloudWatch, Cloud Logging, Prometheus).
Strong scripting skills (e.g., PowerShell, Python, Ruby, Bash).
Excellent problem-solving and troubleshooting skills.
Strong communication and collaboration skills.
Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
Relevant AWS and/or GCP certifications are a plus.
Experience with the configuration management tool Chef
Preferred Qualifications
Strong understanding of PowerShell and Python scripting
Strong background with AWS EC2 features and services (Auto Scaling and Warm Pools)
Understanding of the Windows Server build process using tools like Chocolatey for packages and Packer for AMI/image generation.
Why join Sonata Software?
At Sonata, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible and to build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you will not be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
Sonata Software is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity, age, religion, disability, sexual orientation, veteran status, marital status, or any other characteristics protected by law.
Software Developer Engineer in Test
Requirements engineer job in Los Angeles, CA
Job Title: Software Developer Engineer in Test
Reports To: Sr Manager, Quality Engineering
About the role
We are looking for a Software Developer Engineer in Test to join our Platform Engineering & Playback team. As an Automation SDET, you'll be responsible for designing and building scalable test automation frameworks that ensure the integrity and quality of our streaming platform. You'll work across teams to validate video playback, API reliability, cross-device compatibility, and more, ultimately helping us deliver uninterrupted entertainment to a global audience.
Key Responsibilities:
Architect and develop robust, reusable automated test frameworks for APIs, UI, and video playback components
Validate streaming application workflows across web, mobile, smart TVs, and OTT devices
Automate testing for adaptive bitrate streaming, playback metrics, and buffering scenarios
Architect a solution for testing TV and OTT device workflows.
Integrate automated tests with CI/CD pipelines to ensure continuous delivery
Write clear, concise, and comprehensive test plans and test cases
Work closely with developers and QA to ensure high-quality test coverage
Participate in code reviews and provide feedback on testability and design
Champion quality engineering practices within the development teams
Mentor QA engineers on automation strategies and best practices
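To make the API-automation responsibilities concrete, a minimal pytest + requests sketch; the endpoint and response shape are invented for illustration:

```python
import pytest
import requests

BASE_URL = "https://api.example.com"   # hypothetical playback-service endpoint

@pytest.fixture
def session():
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    return s

def test_manifest_lists_playable_streams(session):
    resp = session.get(f"{BASE_URL}/v1/titles/123/manifest", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    assert body["streams"], "expected at least one stream"
    # Every rendition should advertise a bitrate and a resolution.
    for stream in body["streams"]:
        assert stream["bitrate_kbps"] > 0
        assert "resolution" in stream
```

In a real framework, the base URL and fixtures would live in shared configuration so the same suite can run in CI against multiple environments.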
Required Qualifications
Bachelor's degree in Computer Science, Engineering, or equivalent experience
2+ years of experience in test automation, ideally in media or streaming environments
Proficiency in one or more programming languages (e.g., Java, Python, JavaScript, C#)
Experience developing test frameworks and reusable testing libraries
Experience with test automation frameworks (e.g., Selenium, Cypress, Playwright, TestNG, JUnit)
Solid understanding of HTTP, REST APIs, and API testing tools (e.g., Postman, REST Assured)
Experience with version control (Git), CI/CD tools (e.g., Jenkins, GitHub Actions), and build systems
Excellent debugging, problem-solving, and communication skills
Desired Qualifications
Experience with cloud platforms (AWS, Azure, or GCP)
Exposure to OTT platforms or smart TV development environments
Experience testing cross-platform apps (iOS, Android, Roku, Fire TV, etc.)
Familiarity with streaming protocols (HLS, DASH) and media playback components
Benefit offerings include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, EAP program, commuter benefits, and a 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria.
Equal Opportunity Employer/Veterans/Disabled
To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to *******************************************
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
• The California Fair Chance Act
• Los Angeles City Fair Chance Ordinance
• Los Angeles County Fair Chance Ordinance for Employers
• San Francisco Fair Chance Ordinance
Aerospace System Engineer II
Requirements engineer job in Irvine, CA
L·Garde is a full-service design, development, manufacturing, and qual-test supplier to Tier 1 primes and government agencies. We provide systems engineering and skilled technicians to make your Skunk Works-type project a reality. With over 50 years of aerospace expertise, our deployable systems test the limits of what's possible in the harshest of environments in space, on the moon, and even on other planets.
If you're an engineer who thrives on teamwork, clear communication, and seeing your work translate into cutting-edge aerospace solutions, we'd love to talk to you.
A Day in the Life:
We're looking for a Systems Engineer who is passionate about solving complex challenges in aerospace and enjoys working closely with others to make big ideas a reality. In this role, you'll help transform mission requirements into fully engineered space systems, balancing technical performance, schedule, and cost. You'll collaborate across disciplines - design, test, integration, and program management - to ensure our spacecraft and payload systems meet the highest standards of innovation and reliability.
Key Responsibilities:
Lead systems engineering activities across the project lifecycle, from concept through delivery.
Develop and maintain system requirements, CONOPS, ICDs, and risk matrices.
Support Verification & Validation (V&V) efforts and create and maintain Model Based Systems Engineering (MBSE) models.
Partner with engineers, technicians, suppliers, and customers to resolve issues and ensure requirements are met.
Write and review test plans, procedures, and reports; analyze and post-process test data.
Contribute to design trade studies and product development planning.
Participate in major design reviews (SRR, PDR, CDR, TRR) and customer meetings.
Support proposal writing for advanced aerospace concepts.
Maintain a safe, clean, and organized work area by following 5S and safety guidelines.
Who You Are:
You have a Bachelor's degree in engineering, science, or related technical field.
2-4 years of satellite systems engineering experience with DoD, NASA, or commercial space programs.
At least 2 years in management, project leadership, or team leadership roles.
Proficiency with requirements tracking and management.
Proficiency with Model-Based Systems Engineering and requirements tracking tools such as CAMEO and DOORS is a plus; Systems Engineers will be expected to complete training for these tools within the first year.
Hands-on experience with hardware/software interfaces, aerospace drawings, and GD&T standards.
Exposure to SolidWorks CAD, FEA, MATLAB, Thermal Desktop, CFD (STAR-CCM+), or LabVIEW preferred
The ability to obtain a U.S. Security Clearance for which the U.S. Government requires U.S. Citizenship.
Top Secret Security Clearance a plus.
Excellent written and verbal communication skills.
Strong interpersonal skills with the ability to collaborate across all levels of the organization.
Detail-oriented, organized, and adaptable in a fast-paced environment.
Strong problem-solving mindset and passion for working in a team-driven culture.
What We Offer:
Be at the forefront of aerospace innovation by working on cutting-edge aerospace technologies.
Opportunity to wear multiple hats and grow your skill set.
Collaborative and inclusive work culture where your contributions are highly valued.
Competitive salary
Top-Tier Benefits: 100% of premiums for both employees and dependents covered by the company - Medical, Dental, Vision
Flexible Spending Account
Retirement and Company Match
Company-Sponsored Life and LTD Insurance
Generous Paid Time Off Policy with up to 4 weeks in the first year.
Robust Paid Holiday Schedule
Pay range: $110,000.00 - $145,000.00 per year
Join our team as an Aerospace Systems Engineer II and contribute to the advancement of aerospace innovation by taking on diverse, impactful projects in a collaborative environment, where your contributions are valued and your growth is fostered through hands-on experience.
L·Garde is an equal opportunity employer, including individuals with disabilities and veterans, and participates in the E-Verify Program.
Software Engineer (Java/TypeScript/Kotlin)
Requirements engineer job in Burbank, CA
Optomi, in partnership with one of our top clients, is seeking a highly skilled Software Engineer with strong experience building applications and shared services, REST APIs, and cloud-native solutions. In this role, you will contribute to the development of the Studio's media platforms and B2B applications that support content fulfillment across the Studio's global supply chain. The ideal candidate will bring strong AWS expertise, proficiency in modern programming languages, and the ability to work cross-functionally within a collaborative engineering environment.
What the Right Candidate Will Enjoy!
Contributing to high-visibility media platforms and content supply chain applications
Building scalable, reusable B2B REST APIs used across multiple business units
Hands-on development with TypeScript, Java, Kotlin, or JavaScript
Working extensively with AWS serverless tools, including Lambda and API Gateway (a minimal handler sketch follows this list)
Solving complex engineering challenges involving identity and access management
Participating in a structured, multi-stage interview process that values both technical and collaborative skills
Collaborating with engineers, product owners, security teams, and infrastructure partners
Delivering features in an Agile environment with opportunities for continuous learning
Expanding skillsets across cloud services, API design, and distributed systems
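To make the serverless pattern above concrete, here is a minimal TypeScript sketch of a REST endpoint served by AWS Lambda behind API Gateway. The route, the media-asset shape, and the in-memory lookup are invented for illustration; they are not the Studio's actual services, which would typically back an endpoint like this with a real data store such as DynamoDB.

```typescript
// Hypothetical GET /assets/{id} endpoint behind API Gateway (not the Studio's API).
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

interface MediaAsset {
  id: string;
  title: string;
}

// Stand-in for a real data-store lookup (e.g. DynamoDB).
const assets: Record<string, MediaAsset> = {
  abc123: { id: "abc123", title: "Example Title" },
};

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // API Gateway passes the {id} path segment in event.pathParameters.
  const id = event.pathParameters?.id;
  const asset = id ? assets[id] : undefined;

  if (!asset) {
    // REST convention: explicit status codes with JSON error bodies.
    return { statusCode: 404, body: JSON.stringify({ error: "asset not found" }) };
  }
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(asset),
  };
};
```

In a real deployment, API Gateway routes requests to this handler and an IAM policy or token authorizer gates access, which is where the identity and access management work mentioned elsewhere in this posting comes in.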
Experience of the Right Candidate:
3+ years of industry experience in software engineering
STEM Degree
Strong focus on application development and shared services
Extensive experience with AWS tools and technologies, especially serverless computing and API Gateway
Strong proficiency in TypeScript, Java, Kotlin, or JavaScript (TypeScript/Java preferred)
Solid understanding of REST API design principles and software engineering best practices
Strong communication and problem-solving skills
Ability to collaborate effectively within cross-functional teams
Experience with databases and identity & access management concepts a plus
Comfortable participating in coding assessments and system design interviews
Responsibilities of the Right Candidate:
Collaborate on the design, development, and deployment of scalable, high-quality software solutions
Build, enhance, and maintain API-driven shared services for Studio media platforms
Leverage AWS tools and serverless technologies to architect reliable, cloud-native applications
Partner closely with product owners, security teams, and other engineering groups to deliver on requirements
Participate in Agile ceremonies: estimating work, prioritizing tasks, and delivering iteratively
Apply and uphold best practices in coding standards, architecture, and system reliability
Contribute to identity and access management services and reusable B2B REST APIs
Conduct testing and ensure high-quality deployments across the platform
Actively stay current with emerging technologies, industry trends, and engineering best practices
Support continuous improvement efforts for development processes, tooling, and automation
System Engineer (Managed Service Provider)
Requirements engineer job in Costa Mesa, CA
We are a long-established Southern California Managed Service Provider supporting SMB clients across Los Angeles and Orange County with proactive IT, cybersecurity, cloud solutions, and hands-on guidance. Our team is known for strong client relationships and clear communication, and we take a steady, service-first approach to solving problems the right way.
We are hiring a Tier 3 Systems Engineer to be the L3 escalation point and technical backstop for complex issues across diverse client environments. This role requires previous MSP experience and is ideal for someone who enjoys deep troubleshooting, ownership, and helping reduce repeat issues by getting to root cause. Expect about 75 percent escalations and 25 percent project work tied to recurring client needs.
What You Will Do
• Own Tier 3 escalations across servers, networking, virtualization, and Microsoft 365
• Troubleshoot deeply and drive root cause fixes
• Handle SonicWall, VLAN, NAT, and site-to-site VPN work
• Support Windows Server AD, GPO, DNS, and DHCP
• Support VMware ESXi/vSphere and Hyper-V
• Lead Microsoft 365 escalations and hardening
• Document clearly and communicate client-ready updates
What You Bring
• 5+ years of MSP experience supporting multiple client environments
• Strong troubleshooting and escalation ownership
• SonicWall experience plus strong VLAN and VPN skills
• Windows Server 2012 through 2022
• VMware and/or Hyper-V
• Microsoft 365 and Intune fundamentals
• Azure and Entra ID security configuration
• ConnectWise Command and ConnectWise Manage preferred
Location, Pay, and Benefits
• $95,000 to $105,000 DOE
• Hybrid after onboarding
• Medical, dental, vision
• 401(k) with 3% company match
• PTO and sick time plus paid holidays
• Mileage reimbursement
Descent Systems Engineer
Requirements engineer job in Torrance, CA
In Orbit envisions a world where our most critical resources are accessible when we need them the most. Today, In Orbit is on a mission to provide the most resilient and autonomous cargo delivery solutions for regions suffering from conflict and natural disasters.
Descent Systems Engineer:
In Orbit is looking for a Descent Systems Engineer eager to join a diverse and dynamic team developing solutions for cargo delivery where traditional aircraft and drones fail.
As a Descent Systems Engineer at In Orbit, you will work on the design, development, and testing of advanced parachutes and decelerator systems. You will work with other engineers to integrate decelerator subsystems into the vehicle. The ideal candidate for this position will have experience manufacturing and testing parachute systems, a solid foundation in aerodynamic and mechanical design principles, and flight-test experience.
Responsibilities:
Lead the development of parafoils, reefing systems, and other decelerator components.
Develop fabrication and manufacturing processes including material selection, patterning, sewing, rigging, and hardware integration.
Plan and conduct flight tests, including drop tests, high-altitude balloon tests, and captive-carry deployments.
Support the development of test plans, procedures, and instrumentation requirements to verify system performance.
Collaborate closely with mechanical, avionics, and software teams on vehicle-level integration.
Own documentation and configuration management for parachute assemblies, manufacturing specifications, and test reports.
Basic Qualifications:
Bachelor's degree in Aerospace Engineering or a similar curriculum.
Strong understanding of aerodynamics, drag modeling, reefing techniques, and the dynamic behavior of decelerators (see the worked descent-rate example after this list).
Experience with reefing line cutting systems or multi-stage deployment mechanisms
Experience conducting ground and flight tests for decelerator systems, including test planning, instrument integration, data analysis, and anomaly investigation.
Expertise with textile materials (e.g., F-111, S-P fabric, Kevlar, Dyneema).
Ability to work hands-on with sewing machines and ground test fixtures.
Solid teamwork and relationship-building skills, with the ability to communicate difficult technical problems and solutions clearly to other engineering disciplines.
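As a back-of-the-envelope illustration of the drag modeling called out above, the TypeScript sketch below solves the steady-descent drag balance (weight equals drag) for the descent rate. The mass, drag coefficient, and canopy area are placeholder numbers, not In Orbit vehicle parameters.

```typescript
// Steady-state descent rate from the drag balance m*g = 0.5 * rho * v^2 * Cd * S.
// All numbers below are placeholders, not In Orbit vehicle parameters.

const G = 9.81;    // gravitational acceleration, m/s^2
const RHO = 1.225; // sea-level air density, kg/m^3

function descentRate(massKg: number, cd: number, areaM2: number): number {
  // Solve m*g = 0.5 * rho * v^2 * Cd * S for v.
  return Math.sqrt((2 * massKg * G) / (RHO * cd * areaM2));
}

// Example: a 50 kg payload under a canopy with Cd ~ 0.8 and 25 m^2 of area.
console.log(`${descentRate(50, 0.8, 25).toFixed(1)} m/s at sea level`); // ~6.3 m/s
```

Reefing staging changes the effective Cd*S over time, so evaluating the same balance per deployment stage gives a first cut at the descent rate at each stage.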
Preferred Experience and Skills:
Experience with guided parachute systems.
Familiarity with FAA coordination for flight testing in and out of controlled airspace.
Experience with pattern design tools such as SpaceCAD, Lectra Modaris, or similar.
Additional Requirements:
Willing to work extended hours as needed
Able to stand for extended periods of time
Able to occasionally travel (~25%) and support off-site testing.
ITAR Requirements:
To conform to U.S. Government space technology export regulations, including the International Traffic in Arms Regulations (ITAR), you must be a U.S. citizen, lawful permanent resident of the U.S., protected individual as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State.