
Data engineer jobs in Spring Valley, NV - 187 jobs

  • Metrology Engineer

    Techsource (3.7 company rating)

    Data engineer job in North Las Vegas, NV

    Your Opportunity: Join a national-level science and security project. As a Metrology Engineer supporting the Scorpius flash X-ray accelerator, you'll work with a multi-laboratory team that includes Sandia National Laboratories, Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and the Nevada National Security Site. This team is installing and commissioning a one-of-a-kind accelerator that will remain permanently in Nevada and directly supports the U.S. nuclear security mission. What You Will Do: Support precision alignment and measurement of accelerator components and subsystems. Perform hands-on metrology using laser trackers, profilers, scanners, articulating arms, and CMMs. Independently set up, operate, and troubleshoot laser trackers and associated tooling in the field. Assist with setup, data collection, and post-processing to verify requirements and tolerances. Use metrology software (e.g., SpatialAnalyzer, PolyWorks, Verisurf, or similar) for data acquisition, alignment, analysis, and reporting. Work closely with mechanical, electrical, and accelerator engineers during installation and commissioning. Apply GD&T to real hardware in a complex, large-scale system. Contribute to metrology planning, tooling selection, setups, and measurement strategies. Interpret tracker data, coordinate systems, uncertainty budgets, and alignment results to support installation decisions. What You Will Need: 2-6 years of hands-on metrology or precision measurement experience. Bachelor's degree in engineering or equivalent practical R&D experience. Direct experience setting up and operating laser trackers in a field or installation environment. Working experience with SpatialAnalyzer or similar large-scale metrology software. Working knowledge of GD&T and measurement accuracy/limitations. Comfortable working onsite near North Las Vegas in Mercury, Nevada, on an active installation project. U.S. citizen able to obtain and maintain a Department of Energy (DOE) Q clearance. Nice to Have: Experience with accelerator systems, pulsed-power systems, or large scientific hardware. Familiarity with Keyence laser profilers or similar systems. Exposure to ASME Y14.5 or ASME B89 standards. Experience designing or using custom metrology fixtures, tooling, or alignment aids. Advanced use of SpatialAnalyzer features such as bundle adjustments, transformations, and automated workflows. Work Location & Contract: Onsite at the Nevada National Security Site (NNSS). 2-year contract role supporting installation and commissioning (1099 vs. FTE optional). Relocation assistance provided. Occasional coordination with national laboratory partners.
    $81k-116k yearly est. 2d ago

  • Databricks Data Engineer - Manager - Consulting - Miami

    EY (4.7 company rating)

    Data engineer job in Las Vegas, NV

    At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. **Technology - Data and Decision Science - Data Engineering - Manager** We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills. **The opportunity:** In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include: + Understanding and analyzing business requirements to translate them into technical requirements. + Designing, building, and operating scalable data architecture and modeling solutions. + Staying up to date with the latest trends and emerging technologies to maintain a competitive edge. **Key Responsibilities:** As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including: + Leading workstream delivery and ensuring quality in all processes. + Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services. + Implementing resource plans and budgets while managing engagement economics. This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. 
You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs. **Skills and attributes for success:** To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact: + Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP). + Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices. + Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement. + Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value. + Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance. + Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services. + Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress. + Manage client relationships and expectations, ensuring high levels of satisfaction and engagement. + Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics. + Strong analytical and problem-solving abilities. + Excellent communication skills, with the ability to convey complex information clearly. + Proven experience in managing and delivering projects effectively. + Ability to build and manage relationships with clients and stakeholders. **To qualify for the role, you must have:** + Bachelor's degree in computer science, Engineering, or a related field required; Master's degree preferred. 
+ Typically, no less than 4 - 6 years relevant experience in data engineering, with a focus on cloud data solutions and analytics. + Proven expertise in Databricks and experience with Spark for big data processing. + Strong background in data architecture and design, with experience in building complex cloud analytics solutions. + Experience in leading and managing teams, with a focus on mentoring and developing talent. + Strong programming skills in languages such as Python, Scala, or SQL. + Excellent problem-solving skills and the ability to work independently and as part of a team. + Strong communication and interpersonal skills, with a focus on client management. **Required Expertise for Managerial Role:** + **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision. + **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope. + **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively. + **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption. + **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies. + **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes. + **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients. **Large-Scale Implementation Programs:** 1. 
**Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities. 2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing. 3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting. **Ideally, you'll also have:** + Experience with advanced data analytics tools and techniques. + Familiarity with machine learning concepts and applications. + Knowledge of industry trends and best practices in data engineering. + Familiarity with cloud platforms (AWS, Azure, GCP) and their data services. + Knowledge of data governance and compliance standards. + Experience with machine learning frameworks and tools. **What we look for:** We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you. FY26NATAID **What we offer you** At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more . 
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. + Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year. + Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. **Are you ready to shape your future with confidence? Apply today.** EY accepts applications for this position on an on-going basis. For those living in California, please click here for additional information. EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities. **EY | Building a better working world** EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. 
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories. EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law. EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
    $150.7k-261.6k yearly 45d ago
  • Graduate Data Scientist

    William Hill Sportsbook

    Data engineer job in Las Vegas, NV

    About the Role Join our Sportsbook team as a Graduate Data Scientist, where you'll gain hands-on experience working at the intersection of quantitative modeling and trading operations. You'll help support the development and monitoring of predictive models that power real-time odds and trading decisions, while also participating in trading workflows for pre-match and in-play sports betting markets. This is an excellent opportunity for an analytically minded graduate with a passion for sports, data, and betting markets to learn from an experienced team and grow into a hybrid quant/trading role. This is an on-site position at our Las Vegas, NV office. Key Responsibilities Assist in building, testing, and maintaining statistical models used to price sportsbook markets (e.g., moneyline, spreads, totals, props). Conduct exploratory data analysis and feature engineering using historical sports data and real-time feeds. Analyze performance metrics like margin, P&L, and customer behavior to improve pricing accuracy and model calibration. Work closely with Trading, Product, and Engineering teams to support model improvements and ensure seamless trading execution. Stay up-to-date with sports analytics, statistical methods, and advanced machine learning; evaluate new methods to keep our models best-in-class. Support traders with pricing, risk management, and market monitoring during live events. Monitor odds movement, betting patterns, and liabilities to help ensure market efficiency. Contribute to setting pre-match and in-play prices using internal tools and models. Qualifications Bachelor's or Master's degree in a quantitative field such as Statistics, Mathematics, Computer Science, Engineering, or similar. Solid foundation in probability and statistics, with an interest in applying these to real-world problems. Strong programming skills in Python (especially with data-focused libraries like pandas and NumPy).
    Passion for sports and interest in sportsbook mechanics or financial markets. Strong attention to detail, especially when working with live data and pricing environments. Willingness to work flexible hours during major sporting events as needed. Nice to Have Familiarity with sports betting odds and common market types (e.g., spread, totals, player props). Exposure to sports analytics or participation in data competitions (e.g., Kaggle, university research). Experience with data visualization tools or dashboarding. Prior experience in a sportsbook, trading desk, or similar high-tempo environment.
    $80k-116k yearly est. 39d ago
  • Lead Data Scientist

    Streamline Media Group (4.4 company rating)

    Data engineer job in Las Vegas, NV

    Job Responsibilities: Manage the data scientist and data specialist team. Identify efficient sources for quality data collection. Lead data collection and data mining processes. Ensure and guarantee data integrity. Analyze data and interpret data problems. Plan, prioritize, and streamline all planned data projects. Develop a proper analytics system according to the requirements. Analyze and test the performance of the products. Build reports on the visualization and performance of the products. Keep implementing new techniques and models. Align data projects with the goals of the company. Job Skills: Bachelor's degree in Data Science, Computer Science, or other related fields. Proven experience in Data Science and other related fields. Good understanding of data management and visualization techniques. Expertise in statistical data analysis and predictive data modeling. Good technical and coding knowledge of Python, R, MATLAB, SQL, and other databases. Outstanding communication skills. Inspiring leadership qualities and organizational skills.
    $91k-128k yearly est. 60d+ ago
  • Data Engineer

    KBR (4.7 company rating)

    Data engineer job in Nellis Air Force Base, NV

    Title: Data Engineer Belong. Connect. Grow. with KBR! KBR's National Security Solutions team provides high-end engineering and advanced technology solutions to our customers in the intelligence and national security communities. In this position, your work will have a profound impact on the country's most critical role - protecting our national security. Why Join Us? Innovative Projects: KBR's work is at the forefront of engineering, logistics, operations, science, program management, mission IT and cybersecurity solutions. Collaborative Environment: Be part of a dynamic team that thrives on collaboration and innovation, fostering a supportive and intellectually stimulating workplace. Impactful Work: Your contributions will be pivotal in designing and optimizing defense systems that ensure national security and shape the future of space defense. Come join the ITEA award winning TRMC BDKM team and be a part of the team responsible for revolutionizing how analysis is performed across the entire Department of Defense! Key Responsibilities: As a Data Engineer, you will be a critical part of the team that is responsible for enabling the development of data-driven decision analysis products through the innovative application, and promotion, of novel methods from data science, machine learning, and operations research to provide robust and flexible testing and evaluation capabilities to support DoD modernization. Analytic Experience: Candidate will be a part of the technical team responsible for providing analytic consulting services, supporting analytic workflow and product development and testing, promoting the user adoption of methods and best practices from data science, conducting applied methods projects, and supporting the creation of analysis-ready data. 
    Onsite Support: Candidate will be the face of the CHEETAS Team and will be responsible for ensuring stakeholders have the analytical tools, data products, and reports they need to make insightful recommendations based on data-driven analysis. Stakeholder Assistance: Candidate will directly assist both analyst/technical and non-analyst/non-technical stakeholders with the analysis of DoD datasets, demonstrating the 'art of the possible' to stakeholders and VIPs with insights gained from analysis of DoD Test and Evaluation (T&E) data. Communication: Must effectively communicate at both a programmatic and technical level. Although you may be the only team member physically on-site providing support, you will not be alone. You will have support from other data science team members as well as the software engineering and system administration teams. Technical Support: Candidate will be responsible for running and operating CHEETAS (and other tools); demonstrating these tools to stakeholders and VIPs; conveying analysis results; adapting internally developed tools, notebooks, and reports to meet emerging needs; gathering use cases, requirements, gaps, and needs from stakeholders and, for larger development items, providing that information as feature requests or bug reports to the CHEETAS development team; and performing impromptu hands-on training sessions with end users, potentially troubleshooting problems from within closed networks without internet access (with support from distributed team members). Independent Work: Candidate must be self-motivated and capable of working independently with little supervision or direct tasking. Work Environment: Location: Onsite. Travel Requirements: This position will require travel of 25% with potential surge to 50% to support end users located at various DoD ranges and labs across the US (including Alaska and Hawaii).
    When not supporting a site, this position can work remotely or from a nearby KBR office (if available and desired). Working Hours: Standard. Basic Qualifications: Security Clearance: Active or current TS/SCI clearance is required. Education: A degree in operations research, engineering, applied math, statistics, computer science, or information technology, with 15+ years of experience within the DoD preferred. Candidates with 10-15 years of DoD experience will be considered on a case-by-case basis. Entry-level candidates will not be considered. Technical Experience: Previous experience must include five (5) years of hands-on experience in big data analytics and five (5) years of hands-on experience with object-oriented and functional languages (e.g., Python, R, C++, C#, Java, Scala, etc.). Data Experience: Experience in dealing with imperfections in data. Experience should demonstrate competency in key concepts from software engineering, computer programming, statistical analysis, data mining algorithms, machine learning, and modeling sufficient to inform technical choices and infrastructure configuration. Data Analytics: Proven analytical skills and experience in preparing and handling large volumes of data for ETL processes. Experience should include working with teams in the development and interpretation of the results of analytic products with DoD-specific data types. Big Data Infrastructure: Experience in the installation, configuration, and use of big data infrastructure (Spark, Trino, Hadoop, Hive, Neo4j, JanusGraph, HBase, MS SQL Server with PolyBase, VMware as examples). Experience in implementing data visualization solutions. Qualifications Required: Experience using scripting languages (Python and R) to process, analyze, and visualize data. Experience using notebooks (Jupyter Notebooks and R Markdown) to create reproducible and explainable products.
    Experience using interactive visualization tools (R Shiny, Shiny for Python, Dash) to create interactive analytics. Experience generating and presenting reports, visualizations, and findings to customers. Experience building and optimizing 'big data' pipelines, architectures, and data sets. Experience cleaning and preparing time series and geospatial data for analysis. Experience working with Windows, Linux, and containers. Experience querying databases using SQL and working with and configuring distributed storage and computing environments to conduct analysis (Spark, Trino, Hadoop, Hive, Neo4j, JanusGraph, MongoDB, Accumulo, HBase as examples). Experience working with code repositories in a collaborative team. Ability to make insightful recommendations based on data-driven analysis and customer interactions. Ability to effectively communicate both orally and in writing with customers and teammates. Ability to speak and present findings in front of large technical and non-technical groups. Ability to create documentation and repeatable procedures to enable reproducible research. Ability to create training and educational content for novice end users on the use of tools and novel analytic methods. Ability to solve problems, debug, and troubleshoot while under pressure and time constraints is required. Should be self-motivated to design, develop, enhance, reengineer, or integrate software applications to improve the quality of data outputs available for end users. Ability to work closely with data scientists to develop and subsequently implement the best technical design and approach for new analytical products. Strong analytical skills related to working with both structured and unstructured datasets. Excellent programming, testing, debugging, and problem-solving skills. Experience designing, building, and maintaining both new and existing data systems and solutions. Understanding of ETL processes, how they function, and experience implementing ETL processes is required.
Knowledge of message queuing, stream processing and extracting value from large disparate datasets. Knowledge of software design patterns and Agile Development methodologies is required. Knowledge of methods from operations research, statistical and machine learning, data science, and computer science is sufficient to select appropriate methods to enable data preparation and computing architecture configuration to implement these approaches. Knowledge of computer programming concepts, data structures and storage architecture, to include relational and non-relational databases, distributed computing frameworks, and modeling and simulation experimentation sufficient to select appropriate methods to enable data preparation and computing architecture configuration to implement these approaches. Ready to Make a Difference? If you're excited about making a significant impact in the field of space defense and working on projects that matter, we encourage you to apply and join our team at KBR. Let's shape the future together. KBR Benefits KBR offers a selection of competitive lifestyle benefits which could include 401K plan with company match, medical, dental, vision, life insurance, AD&D, flexible spending account, disability, paid time off, or flexible work schedule. We support career advancement through professional training and development. Belong, Connect and Grow at KBR At KBR, we are passionate about our people and our Zero Harm culture. These inform all that we do and are at the heart of our commitment to, and ongoing journey toward being a People First company. That commitment is central to our team of team's philosophy and fosters an environment where everyone can Belong, Connect and Grow. We Deliver - Together. KBR is an equal opportunity employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, disability, sex, sexual orientation, gender identity or expression, age, national origin, veteran status, genetic information, union status and/or beliefs, or any other characteristic protected by federal, state, or local law.
    $92k-132k yearly est. 60d+ ago
  • Data Modeling

    5 Star Recruitment (3.8 company rating)

    Data engineer job in Las Vegas, NV

    7+ years of experience in Data Modeling & Data Analysis; should have excellent communication and leadership skills. Financial domain knowledge is a plus. Should have a strong understanding of cloud data warehouses like Snowflake and data modeling skills using Data Vault 2.0. Should have an in-depth understanding of, or have executed, newer concepts like the Data Lakehouse, and should have worked on big data sources/tools such as Hive, S3, Trino, Hue, etc. Should have done complex (100+ entities/tables) data modeling programs in both on-prem and cloud environments. Experience in designing normalized, denormalized, relational, dimensional, star, and snowflake schemas for cloud, big data, and on-prem databases using any one of these data modeling tools (Erwin, ER/Studio, Toad Data Modeler, etc.). Extensive data modeling experience in the same or similar big data and cloud databases such as Hive, Redshift, and Snowflake, and in on-prem databases such as Oracle, SQL Server, and DB2. Should have experience working daily with product managers, project managers, business users, application development team members, DBA teams, and the Data Governance team to analyze requirements and design, develop, and deploy technical solutions. Should have implemented Bill Inmon and Ralph Kimball methodologies to design data warehouses. Should have an in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Master Data Management (MDM), Data Quality, Data Lineage, Data Dictionary, Data Mapping, Data Policy, and Data Governance.
    $101k-143k yearly est. 60d+ ago
  • AWS Data Migration Consultant

    Slalom (4.6 company rating)

    Data engineer job in Las Vegas, NV

    Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment. Who You'll Work With As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments. What You'll Do * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters). * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools. * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques. * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud. * Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS. 
* Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards. * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK. * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools. * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms. What You'll Bring * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2. * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2. * Hands-on experience with AWS database services (RDS, EC2-hosted databases). * Strong understanding of HA/DR solutions and cloud database design patterns. * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions. * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity. * Strong troubleshooting and analytical skills to resolve complex database and performance issues. * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders. Nice to Have * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional. * Experience with NoSQL databases or hybrid data architectures. * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau). * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate). * Experience with DB2 on-premise or cloud-hosted environments. 
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.

Slalom is committed to fair and equitable compensation practices. For this position, in Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, and New Jersey, the target base salary pay range is $105,000-$147,000 for the Consultant level, $120,000-$169,000 for the Senior Consultant level, and $133,000-$187,000 for the Principal level. In all other markets, the target base salary pay range is $96,000-$135,000 for the Consultant level, $110,000-$155,000 for the Senior Consultant level, and $122,000-$172,000 for the Principal level. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors.
The salary pay range is subject to change and may be modified at any time.

EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 3d ago
  • Data Platform Engineer

    Abnormal 4.5company rating

    Data engineer job in Las Vegas, NV

About the Role
You'll build, operate, and improve the "engine room" of our data platform: reusable ingestion frameworks, reliability systems, and pipelines across structured, semi-structured, and unstructured sources. You'll own ETL/ELT workflows (Fivetran and AWS Glue) and develop stable, observable, and cost-efficient pipelines that power analytics and AI. Your work prevents incidents before they happen through great design, guardrails, and monitoring, translating technical reliability into a seamless experience for downstream customers.

Who you are
Pragmatic builder who writes clear SQL/Python and leaves systems more reliable than you found them.
Infrastructure-minded engineer comfortable with Python, IaC, orchestration, and Snowflake administration.
Customer-centric and fundamentals-first; you translate reliability into a delightful data consumer experience.
Velocity-oriented: you deliver "good today" increments, measure impact, and iterate toward excellence.
Owner mindset: you proactively drive outcomes, communicate trade-offs, and follow through on commitments.
Intellectually honest: you share clear, candid updates and invite feedback to improve systems.
Security-first with sound judgment around PII/PHI, least privilege, and secret management.
Collaborative partner who can explain technical topics to both engineers and non-technical stakeholders.
Naturally curious; you thrive in ambiguity and seek to solve business problems with pragmatic solutions.
A self-starter who takes ownership of outcomes and iterates quickly to add value fast.
Always balancing excellence with velocity, knowing when good enough today beats perfect next week.

What you will do
Develop reusable ingestion frameworks (Python/Airflow/AWS Glue) for APIs and unstructured sources beyond Fivetran, handling various data formats (JSON, Parquet, etc.).
Own the end-to-end Medallion (bronze/silver/gold) architecture for core domains, ensuring robust lineage and metadata across diverse data sources.
Implement data observability (native tests, alerts, lineage hooks); lead incident management and root-cause analysis (RCA) for data.
Help standardize reusable "paved-road" patterns (e.g., CI templates, ingestion operators) to improve developer productivity.
Prepare datasets for AI/LLM use cases (feature stores, embeddings/RAG prep).

Must Haves
3-5+ years of data engineering with strong Python and SQL; hands-on Spark/PySpark (ideally via AWS Glue).
Deep experience in AWS (S3, IAM, Lambda, CloudWatch) running secure, observable data workloads.
Proficiency operating Snowflake (warehouse sizing, RBAC, resource monitors, clustering/partitioning).
Proven governance/security patterns: masking policies, row-level security, and auditability.
Orchestration experience (Airflow/MWAA) and event/file/API ingestion beyond managed connectors.
CI/CD for data with GitHub Actions; test/promotion workflows; secrets and PII handling.
Solid grasp of Medallion architecture, dimensional modeling (star schema), and data quality frameworks.
Ownership of incident management and RCA with measurable reduction in MTTR.

Nice to Have
Familiarity with BI tools (Sigma, Looker, Tableau) for downstream troubleshooting and enablement.
Experience with iPaaS/automation (e.g., Workato) and reverse ETL patterns.
Data observability tools (e.g., Monte Carlo) and open standards like OpenLineage.
IaC for data infrastructure (Terraform) and environment provisioning.
Experience with Parquet/S3/Iceberg lakehouse patterns and event/data contracts.
Fivetran administration and ELT operations.
Experience contributing to paved-road standards (templates, operators, codegen).
Exposure to feature stores or embeddings/RAG pipelines supporting AI products.
#LI-EM4
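The bronze/silver/gold flow described in this posting can be illustrated with a tiny, dependency-free sketch. The field names and cleaning rules here are hypothetical, and a real pipeline would run in Spark/Glue with proper lineage; the point is the shape of a bronze-to-silver promotion with a quarantine path as an observability guardrail:

```python
import json
from datetime import datetime, timezone

def to_silver(bronze_records):
    """Promote raw JSON strings (bronze) to typed, deduplicated rows (silver).

    Bad records are quarantined rather than dropped silently; the count of
    quarantined rows is the kind of signal that feeds data-quality alerts.
    """
    silver, quarantine, seen = [], [], set()
    for raw in bronze_records:
        try:
            rec = json.loads(raw)
            key = rec["id"]
            if key in seen:          # dedupe on primary key
                continue
            seen.add(key)
            silver.append({
                "id": int(key),
                "amount": float(rec["amount"]),
                "ingested_at": datetime.now(timezone.utc).isoformat(),
            })
        except (KeyError, ValueError, json.JSONDecodeError):
            quarantine.append(raw)
    return silver, quarantine

bronze = ['{"id": 1, "amount": "9.5"}', 'not json', '{"id": 1, "amount": "9.5"}']
rows, bad = to_silver(bronze)
print(len(rows), len(bad))   # 1 clean row, 1 quarantined, 1 duplicate skipped
```

A gold layer would then aggregate the silver rows into business-facing tables.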
    $92k-132k yearly est. Auto-Apply 3d ago
  • Data Engineer - Senior Manager

    PwC 4.8company rating

    Data engineer job in Las Vegas, NV

Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Manager

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Growing as a strategic advisor, you leverage your influence, expertise, and network to deliver quality results. You motivate and coach others, coming together to solve complex problems. As you increase in autonomy, you apply sound judgment, recognising when to take action and when to escalate. You are expected to solve through complexity, ask thoughtful questions, and clearly communicate how things fit together. Your ability to develop and sustain high performing, diverse, and inclusive teams, and your commitment to excellence, contributes to the success of our Firm. Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
* Craft and convey clear, impactful and engaging messages that tell a holistic story.
* Apply systems thinking to identify underlying problems and/or opportunities.
* Validate outcomes with clients, share alternative perspectives, and act on client feedback.
* Direct the team through complexity, demonstrating composure through ambiguous, challenging and uncertain situations.
* Deepen and evolve your expertise with a focus on staying relevant.
* Initiate open and honest coaching conversations at all levels.
* Make difficult decisions and take action to resolve issues hindering team effectiveness.
* Model and reinforce professional and technical standards (e.g., refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

As part of the Data and Analytics Engineering team, you design and implement thorough data architecture strategies that meet current and future business needs. As a Senior Manager you lead large projects, innovate processes, and maintain operational excellence while interacting with clients at a strategic level to drive project success. You will also be responsible for developing and documenting data models, data flow diagrams, and data architecture guidelines, maintaining compliance with data governance and data security policies, and collaborating with business stakeholders to translate their data requirements into technical solutions.

Responsibilities
* Design and implement thorough data architecture strategies
* Lead large-scale projects and innovate processes
* Maintain operational excellence and client interactions
* Develop and document data models and data flow diagrams
* Adhere to data governance and security policies
* Collaborate with business stakeholders to translate data requirements into technical solutions
* Drive project success through strategic advising and problem-solving
* Foster a diverse and inclusive team environment

What You Must Have
* Bachelor's Degree
* 8 years of experience

What Sets You Apart
* Certification in cloud platforms (e.g., AWS Solutions Architect, AWS Data Engineer, Google Professional Cloud Architect, GCP Data Engineer, Microsoft Azure Solutions Architect, Azure Data Engineer Associate, Snowflake Core, Snowflake Architect, Databricks Data Engineer Associate) is a plus
* Designing and implementing thorough data architecture strategies
* Developing and documenting data models and data flow diagrams
* Maintaining data architecture compliance with governance and security policies
* Collaborating with stakeholders to translate data requirements into solutions
* Evaluating and recommending new data technologies and tools
* Leading data strategy engagements providing thought leadership
* Developing leading practices for Data Engineering, Data Science, and Data Governance
* Architecting and implementing cloud-based solutions meeting industry standards

Travel Requirements: Up to 60%
Job Posting End Date
Learn more about how we work: **************************
PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: ***********************************
As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or any other status protected by law. For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, the San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all. Applications will be accepted until the position is filled or the posting is removed, unless otherwise set forth on the following webpage.
Please visit this link for information about anticipated application deadlines: *************************************** The salary range for this position is: $124,000 - $280,000. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications and location, and applicable employment laws. All hired individuals are eligible for an annual discretionary bonus. PwC offers a wide range of benefits, including medical, dental, vision, 401k, holiday pay, vacation, personal and family sick leave, and more. To view our benefits at a glance, please visit the following link: ***********************************
    $69k-98k yearly est. Auto-Apply 8d ago
  • Senior Data Engineer

    4Rahlp1 American Homes 4 Rent, L.P

    Data engineer job in Las Vegas, NV

Since 2012, we've grown to become one of the leading single-family rental companies and homebuilders in the country, recently recognized as a top employer by Fortune and Great Place To Work. At AMH, our goal is to simplify the experience of leasing a home through professional management and maintenance support, so our residents can focus on what really matters to them, wherever they are in life.

The Senior Data Engineer is responsible for designing, building, and managing the data platform and tools to allow for the efficient processing and analysis of large data sets. Develops and maintains scalable data pipelines, ensures data quality, and deploys machine learning models to production. Collaborates with business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization. Builds real-time and batch pipelines to handle large volumes of data efficiently. Collaborates with cross-functional teams and translates business requirements into scalable data solutions.

Responsibilities:
Design, develop, and maintain real-time and batch data pipelines to process and analyze large volumes of data.
Design and develop programs and tools to support ingestion, curation, and provisioning of complex first-party and third-party data for analytics, reporting, and data science.
Design and develop advanced data products and intelligent APIs.
Monitor system performance by performing regular tests; troubleshoot and integrate new features.
Lead analysis of data and design of the data architecture to support BI, AI/ML, and data products.
Design and implement data platform architecture to meet the organization's analytical requirements.
Ensure solution designs address operational requirements such as scalability, maintainability, extensibility, flexibility, and integrity.
Provide technical leadership and mentorship to team members.
Lead peer development and code reviews with a focus on test-driven development and Continuous Integration/Continuous Deployment (CI/CD).

Requirements:
Bachelor's degree in computer science, information systems, data science, management information systems, mathematics, physics, engineering, statistics, economics, and/or a related field required; Master's degree in one of these fields preferred.
Minimum of eight (8) years of experience as a data engineer with full-stack capabilities.
Minimum of ten (10) years of experience in programming.
Minimum of five (5) years with cloud technologies such as Azure, AWS, or Google Cloud.
Strong SQL knowledge.
Experience in ML and ML pipelines a plus.
Experience in real-time integration, developing intelligent apps and data products.
Proficiency in Python and experience with CI/CD practices.
Strong background in IaaS platforms and infrastructure.
Hands-on experience with Databricks, Spark, Fabric, or similar technologies.
Experience in Agile methodologies.
Hands-on experience in the design and development of data pipelines and data products.
Experience in developing data ingestion, data processing, and analytical pipelines for big data, NoSQL, and data warehouse solutions.
Hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
Extensive experience in big data technologies such as Apache Spark and streaming technologies such as Kafka, Event Hub, etc.
Extensive experience in designing data applications in a cloud environment.
Intermediate experience in RESTful APIs, messaging systems, and AWS or Microsoft Azure.
Extensive experience in data architecture and data modeling.
Expert in data analysis and data quality frameworks.
Knowledge of BI tools such as Power BI and Tableau.
May occasionally work evenings and/or weekends.

Compensation
The anticipated pay range for this position is $121,116.00 to $151,395.00 annually. Actual starting base pay within this range will depend on factors including geographic location, education, training, skills, and relevant experience.

Additional Compensation
This position is eligible to receive a discretionary annual bonus.

Perks and Benefits
Employees have the opportunity to participate in medical, dental and vision insurance; flexible spending accounts and/or health savings accounts; dependent savings accounts; 401(k) with company matching contributions; employee stock purchase plan; and a tuition reimbursement program. The Company provides 9 paid holidays per year, and, upon hire, new employees will accrue paid time off (PTO) at a rate of 0.0577 hours of PTO per hour worked, up to a maximum of 120 hours per year.

CA Privacy Notice: To learn more about what information we collect when you apply for a job, and how we use that information, please see our CA Job Applicant Privacy Notice found at ************************************** #LI-PH1
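The real-time and batch pipeline pattern mentioned in this posting can be sketched in a few lines of stdlib Python. This is not AMH's stack (which centers on Azure and Databricks); the event shape and batch size are hypothetical, and the sketch only shows the core micro-batching idea behind batch ingestion of a streaming feed:

```python
from itertools import islice

def micro_batches(stream, size):
    """Group a (possibly unbounded) iterator of events into fixed-size batches,
    the core pattern behind batch ingestion of a real-time feed."""
    it = iter(stream)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def pipeline(events):
    """Toy pipeline stage: aggregate each micro-batch before loading."""
    for batch in micro_batches(events, size=3):
        yield sum(e["value"] for e in batch)

events = [{"value": v} for v in range(7)]   # 7 events -> batches of 3, 3, 1
print(list(pipeline(events)))               # [0+1+2, 3+4+5, 6] -> [3, 12, 6]
```

In a production system the aggregation step would be replaced by a write to a warehouse or lakehouse table, with the batch boundary also serving as the checkpoint/commit boundary.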
    $121.1k-151.4k yearly Auto-Apply 60d+ ago
  • DATA SCIENTIST

    Department of The Air Force

    Data engineer job in Nellis Air Force Base, NV

The PALACE Acquire Program offers you a permanent position upon completion of your formal training plan. As a Palace Acquire Intern you will experience both personal and professional growth while dealing effectively and ethically with change, complexity, and problem solving. The program offers a 3-year formal training plan with yearly salary increases. Promotions and salary increases are based upon your successful performance and supervisory approval.

Overview
Accepting applications. Open & closing dates: 09/29/2025 to 09/28/2026. Salary: $49,960 - $99,314 per year (total salary varies depending on location of position). Pay scale & grade: GS 7 - 9.
Locations (few vacancies at each): Gunter AFB, AL; Maxwell AFB, AL; Davis Monthan AFB, AZ; Edwards AFB, CA; Los Angeles, CA; Travis AFB, CA; Vandenberg AFB, CA; Air Force Academy, CO; Buckley AFB, CO; Cheyenne Mountain AFB, CO; Peterson AFB, CO; Schriever AFB, CO; Joint Base Anacostia-Bolling, DC; Cape Canaveral AFS, FL; Eglin AFB, FL; Hurlburt Field, FL; MacDill AFB, FL; Patrick AFB, FL; Tyndall AFB, FL; Robins AFB, GA; Hickam AFB, HI; Barksdale AFB, LA; Hanscom AFB, MA; Natick, MA; Aberdeen Proving Ground, MD; Andrews AFB, MD; White Oak, MD; Offutt AFB, NE; Holloman AFB, NM; Kirtland AFB, NM; Nellis AFB, NV; Rome, NY; Heath, OH; Wright-Patterson AFB, OH; Tinker AFB, OK; Arnold AFB, TN; Dyess AFB, TX; Fort Sam Houston, TX; Goodfellow AFB, TX; Lackland AFB, TX; Randolph AFB, TX; Hill AFB, UT; Arlington, VA; Dahlgren, VA; Langley AFB, VA; Pentagon, Arlington, VA; Fairchild AFB, WA; Warren AFB, WY.
Remote job: No. Telework eligible: No. Travel required: Occasional travel - you may be expected to travel for this position. Relocation expenses reimbursed: No. Appointment type: Internships. Work schedule: Full-time. Service: Competitive. Promotion potential: 13. Job family (Series): 1560 Data Science Series. Supervisory status: No. Security clearance: Secret. Drug test: No. Position sensitivity and risk: Noncritical-Sensitive (NCS)/Moderate Risk. Trust determination process: Suitability/Fitness. Financial disclosure: No. Bargaining unit status: No. Announcement number: K-26-DHA-12804858-AKK. Control number: 846709300.

This job is open to: The public (U.S. Citizens, Nationals or those who owe allegiance to the U.S.); Students (current students enrolled in an accredited high school, college or graduate institution); Recent graduates (individuals who have graduated from an accredited educational institute or certificate program within the last 2 years, or 6 years for Veterans). Clarification from the agency: This public notice is to gather applications that may or may not result in a referral or selection.

Duties
1. Performs developmental assignments in support of projects assigned to higher level analysts.
Performs minor phases of a larger assignment or work of moderate difficulty where procedures are established and a number of specific guidelines exist. Applies the various steps of accepted data science procedures to search for information and perform well-precedented work.
2. Performs general operations and assignments for portions of a project or study consisting of a series of interrelated tasks or problems. The employee applies judgment in the independent application of methods and techniques previously learned. The employee locates and selects the most appropriate guidelines and modifies them to address unusual situations.
3. Participates in special initiatives, studies, and projects. Performs special research tasks designed to utilize and enhance knowledge of work processes and techniques. Works with higher graded specialists in planning and conducting special initiatives, studies, and projects. Assists in preparing reports and briefings outlining study findings and recommendations.
4. Prepares correspondence and other documentation. Drafts or prepares a variety of documents to include newsletter items, responses to routine inquiries, reports, letters, and other related documents.

Requirements
Conditions of employment:
* Employee must maintain current certifications
* Successful completion of all training and regulatory requirements as identified in the applicable training plan
* Must meet suitability for federal employment
* Direct Deposit: All federal employees are required to have direct deposit
* Please read this Public Notice in its entirety prior to submitting your application for consideration.
* Males must be registered for Selective Service, see ***********
* A security clearance may be required. This position may require a secret, top-secret or special sensitive clearance.
* If authorized, PCS will be paid IAW JTR and AF Regulations. If receiving an authorized PCS, you may be subject to completing/signing a CONUS agreement.
More information on PCS requirements may be found at: *****************************************
* Position may be subject to random drug testing
* U.S. Citizenship Required
* Disclosure of Political Appointments
* Student Loan Repayment may be authorized
* Recruitment Incentive may be authorized for this position
* Total salary varies depending on location of position
* You will be required to serve a one-year probationary period
* Grade Point Average - 2.95 or higher out of a possible 4.0
* Mobility - you may be required to relocate during or after completion of your training
* Work may occasionally require travel away from the normal duty station on military or commercial aircraft

Qualifications
BASIC REQUIREMENT OR INDIVIDUAL OCCUPATIONAL REQUIREMENT: Degree: Mathematics, statistics, computer science, data science or field directly related to the position. The degree must be in a major field of study (at least at the baccalaureate level) that is appropriate for the position. You may qualify if you meet one of the following: 1. GS-7: You must have completed or will complete a 4-year course of study leading to a bachelor's degree from an accredited institution AND must have documented Superior Academic Achievement (SAA) at the undergraduate level in the following: a) Grade Point Average 2.95 or higher out of a possible 4.0 as recorded on your official transcript or as computed based on 4 years of education or as computed based on courses completed during the final 2 years of curriculum; OR 3.45 or higher out of a possible 4.0 based on the average of the required courses completed in your major field or the required courses in your major field completed during the final 2 years of your curriculum. 2.
GS-9: You must have completed 2 years of progressively higher-level graduate education leading to a master's degree or equivalent graduate degree: a) Grade Point Average - 2.95 or higher out of a possible 4.0 as recorded on your official transcript or as computed based on 4 years of education or as computed based on courses completed during the final 2 years of curriculum; OR 3.45 or higher out of a possible 4.0 based on the average of the required courses completed in your major field or the required courses in your major field completed during the final 2 years of your curriculum. If more than 10 percent of total undergraduate credit hours are non-graded, i.e. pass/fail, CLEP, CCAF, DANTES, military credit, etc. you cannot qualify based on GPA. KNOWLEDGE, SKILLS AND ABILITIES (KSAs): Your qualifications will be evaluated on the basis of your level of knowledge, skills, abilities and/or competencies in the following areas: 1. Professional knowledge of basic principles, concepts, and practices of data science to apply scientific methods and techniques to analyze systems, processes, and/or operational problems and procedures. 2. Knowledge of mathematics and analysis to perform minor phases of a larger assignment and prepare reports, documentation, and correspondence to communicate factual and procedural information clearly. 3. Skill in applying basic principles, concepts, and practices of the occupation sufficient to perform routine to difficult but well precedented assignments in data science analysis. 4. Ability to analyze, interpret, and apply data science rules and procedures in a variety of situations and recommend solutions to senior analysts. 5. Ability to analyze problems to identify significant factors, gather pertinent data, and recognize solutions. 6. Ability to plan and organize work and confer with co-workers effectively. PART-TIME OR UNPAID EXPERIENCE: Credit will be given for appropriate unpaid and or part-time work. 
You must clearly identify the duties and responsibilities in each position held and the total number of hours per week. VOLUNTEER WORK EXPERIENCE: Refers to paid and unpaid experience, including volunteer work done through National Service Programs (i.e., Peace Corps, AmeriCorps) and other organizations (e.g., professional; philanthropic; religious; spiritual; community; student and social). Volunteer work helps build critical competencies, knowledge and skills that can provide valuable training and experience that translates directly to paid employment. You will receive credit for all qualifying experience, including volunteer experience. Education IF USING EDUCATION TO QUALIFY: If the position has a positive degree requirement or education forms the basis for qualifications, you MUST submit transcripts with the application. Official transcripts are not required at the time of application; however, if the position has a positive degree requirement, qualifying based on education alone or in combination with experience, transcripts must be verified prior to appointment. Education must be accredited by an accrediting institution recognized by the U.S. Department of Education. Click here to check accreditation. FOREIGN EDUCATION: Education completed in foreign colleges or universities may be used to meet the requirements. You must show proof that the education credentials have been deemed to be at least equivalent to those gained in a conventional U.S. education program. It is your responsibility to provide such evidence when applying. Additional information For DHA Positions: These positions are being filled under Direct-Hire Authority for the Department of Defense for Post-Secondary Students and Recent Graduates.
The Secretary of the Air Force has been delegated authority by the Office of the Secretary of Defense to appoint qualified post-secondary students and recent graduates directly into competitive service positions; these positions may be professional or administrative occupations and are located Air Force-wide. Positions may be filled as permanent or term with a full-time or part-time work schedule. Pay will vary by geographic location.
* The term "Current post-secondary student" means a person who is currently enrolled in, and in good academic standing at, a full-time program at an institution of higher education; and is making satisfactory progress toward receipt of a baccalaureate or graduate degree; and has completed at least one year of the program.
* The term "recent graduate" means a person who was awarded a degree by an institution of higher education not more than two years before the date of the appointment of such person, except in the case of a person who has completed a period of obligated service in a uniform service of more than four years.
Selective Service: Males born after 12-31-59 must be registered or exempt from Selective Service. For additional information, click here. Direct Deposit: All federal employees are required to have direct deposit. If you are unable to apply online, view the following link for information regarding Alternate Application. The Vacancy ID is If you have questions regarding this announcement and have hearing or speech difficulties click here. Tax Law Impact for PCS: On 22-Dec-2017, Public Law 115-97 - the "Tax Cuts and Jobs Act of 2017" suspended qualified moving expense deductions along with the exclusion for employer reimbursements and payments of moving expenses effective 01-Jan-2018 for tax years 2018 through 2025. The law made taxable certain reimbursements and other payments, including driving mileage, airfare and lodging expenses, en-route travel to the new duty station, and temporary storage of those items.
The Federal Travel Regulation (FTR) Bulletin 18-05, issued by the General Services Administration (GSA), has authorized agencies to use the Withholding Tax Allowance (WTA) and Relocation Income Tax Allowance (RITA) to pay for "substantially all" of the increased tax liability resulting from the "2018 Tax Cuts and Jobs Act" for certain eligible individuals. For additional information on WTA/RITA allowances and eligibilities, please click here. Subsequently, FTR Bulletin 20-04, issued by GSA, provides further information regarding NDAA FY2020, Public Law 116-92, and the expansion of eligibility beyond "transferred" for WTA/RITA allowances. For additional information, please click here.

Candidates should be committed to improving the efficiency of the Federal government, passionate about the ideals of our American republic, and committed to upholding the rule of law and the United States Constitution.

Benefits

A career with the U.S. government provides employees with a comprehensive benefits package. As a federal employee, you and your family will have access to a range of benefits that are designed to make your federal career very rewarding. Eligibility for benefits depends on the type of position you hold and whether your position is full-time, part-time, or intermittent. Contact the hiring agency for more information on the specific benefits offered.

How you will be evaluated

You will be evaluated for this job based on how well you meet the qualifications above.

For DHA Positions: These positions are being filled under Direct-Hire Authority for the DoD for Post-Secondary Students and Recent Graduates.
Your latest resume will be used to determine your qualifications. Your application package (resume, supporting documents, and responses to the questionnaire) will be used to determine your eligibility, qualifications, and quality ranking for this position. Please follow all instructions carefully. Errors or omissions may affect your rating or consideration for employment. Your responses to the questionnaire may be compared to the documents you submit. The documents you submit must support your responses to the online questionnaire. If your application contradicts or does not support your questionnaire responses, you will receive a rating of "not qualified" or "insufficient information" and you will not receive further consideration for this job. Applicants who disqualify themselves will not be evaluated further.
Required Documents

The following documents are required and must be provided with your application for this Public Notice. Applicants who do not submit the required documentation to determine eligibility and qualifications will be eliminated from consideration. Other documents may be required based on the eligibility/eligibilities you are claiming. Click here to view the AF Civilian Employment Eligibility Guide and the required documents you must submit to substantiate the eligibilities you are claiming.

* Online Application - Questionnaire
* Resume: Your resume may NOT exceed two pages, and the font size should not be smaller than 10 pts. You will not be considered for this vacancy if your resume is illegible/unreadable. Additional information on resume requirements can be located under "
    $50k-99.3k yearly 25d ago
  • Data Engineer | Performance Management

    Swickard Auto Group

    Data engineer job in Las Vegas, NV

    Corporate Office | Enterprise Analytics | Turning Data into Decisions | Veterans encouraged to apply

    This position will support the data analytics and reporting requests of our growing Performance Management team within Swickard Auto Group. The Performance Management team is tasked with delivering insights and analytics that help drive bottom-line impact and improve the customer experience across the entire organization. The insights generated from the data this position builds and maintains will be critical to corporate- and retail-level decision-making. The ideal candidate will have advanced data engineering skills, an analytical mindset, great attention to detail, the ability to manage projects independently, and the ability to craft data solutions that are both expedient and scalable. The candidate must be comfortable working in a small team with limited structure and have an entrepreneurial mindset.

    Duties and Responsibilities
    Create and maintain relevant data architecture documentation
    Build automated data pipelines using Azure Data Factory, Azure Functions, and Python
    Develop and manage Data Vault 2.0 models (Hubs, Links, Satellites) across all dealership domains
    Integrate data from multiple systems (APIs, SFTP, webhooks, ERPs, DMS systems) into the enterprise data platform
    Create and maintain cloud-based data architecture across bronze, silver, and gold layers
    Optimize SQL Server data models and ETL performance, including indexing, stored procedures, and pipeline efficiency
    Develop production-quality Python ETL modules for ingestion, cleaning, validation, and transformation
    Implement DevOps practices with Git, CI/CD, and YAML-based deployment pipelines

    Qualifications
    4+ years of related experience in a data engineering role
    Bachelor's Degree (accredited school or equivalent technical bootcamp) with emphasis in Data Engineering, Data Analytics, Mathematics, or Statistics
    Proficiency in SQL and Python required; Node.js experience a plus.
    Experience with Data Vault modeling, Azure Data Factory, Azure Function Apps, and cloud-based ETL preferred
    Strong proficiency in SQL
    Strong proficiency in Python
    Strong proficiency in API infrastructure
    Proficiency in cloud data architecture (Azure strongly preferred)
    Proficiency in Azure Function Apps and cloud-based ETL
    Strong experience in Data Vault modeling
    Strong experience in Azure Data Factory
    Experience developing and maintaining API, webhook, and SFTP integrations in Python
    Experience with YAML-based DevOps pipeline management
    Experience designing enterprise-grade database architecture
    Strong attention to detail and a high degree of accuracy
    Ability to communicate well, both verbally and in writing, with all levels of the organization
    Strong analytical skills
    Solid understanding of data sources, data organization, and storage

    The ideal candidate will also have:
    Experience automating processes and data analysis using Python
    Master's Degree in Data Analytics or a similar field

    What You'll Receive
    Aggressive salary based on experience
    Medical, dental, and vision insurance
    Paid time off and holidays
    401(k) plan
    Career growth opportunities within a fast-scaling organization
    A seat at the table - your work will be used by senior leadership

    Why Swickard
    Data is central to how we run the business - not an afterthought
    Leaders who value clarity, accountability, and follow-through
    Opportunity to build systems that scale across brands, states, and teams
    A culture that pairs high standards with genuine respect for people

    About Us
    We were founded in 2014 by Jeff Swickard in Wilsonville, OR. We're a hospitality company that happens to sell cars, parts, and service. We are a team. Everyone plays a role in our success. Culture: We want to be our customers' favorite place to purchase, lease, or service their vehicle, and we want to be your favorite place to work!
    Highline Brands: Swickard has positioned itself as a leader in highline brands such as Mercedes-Benz, BMW, Volvo, Porsche, Lexus, Audi, Land Rover, and more. We are consistently ranked as one of the fastest-growing dealership groups in the US by Automotive News. Swickard Auto Group is a hospitality company that happens to sell cars. With 40+ rooftops, 20+ brands, and thousands of employees, we rely on disciplined systems and transparent performance management to deliver consistent results for our guests and our teams. Our question is simple: How can we do this better? Data helps us answer it - every day.
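The Swickard posting above centers on Data Vault 2.0 modeling, where a hub is keyed by a deterministic hash of its business key so that loads stay idempotent no matter how many source systems feed it. A rough, hypothetical sketch of that idea in Python (the VIN example and the `hash_key`/`load_hub` names are illustrative, not Swickard's actual implementation):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Derive a deterministic hub hash key from one or more business keys."""
    # Normalize before hashing so 'abc' and ' ABC ' land on the same hub row.
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def load_hub(hub: dict, business_key: str, record_source: str) -> str:
    """Insert the business key into the hub only if it is not already present."""
    hk = hash_key(business_key)
    if hk not in hub:
        hub[hk] = {"business_key": business_key, "record_source": record_source}
    return hk

# Two loads of the same VIN from different systems produce one hub row.
hub_vehicle = {}
load_hub(hub_vehicle, "1HGCM82633A004352", "dms")
load_hub(hub_vehicle, "1HGCM82633A004352", "crm")
```

The same hash keys would then link satellites and links back to the hub, which is what makes the pattern workable across the DMS, CRM, and ERP feeds the posting mentions.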
    $81k-115k yearly est. 28d ago
  • Sr Data Engineer

    Fusion HCR

    Data engineer job in Las Vegas, NV

    Senior Data Engineer
    Work Arrangement: 100% Onsite - 5 days per week (Required)
    Employment Type: Direct Hire
    Industry: Property Management / Real Estate Technology
    Client: Fusion client - large, national property management organization

    Position Summary
    A Fusion client in the property management space is seeking a Senior Data Engineer to play a key role in building, scaling, and optimizing a modern Databricks-first data platform. This role is heavily hands-on and focused on designing and improving Spark-based data pipelines using Databricks, Python, and SQL in a cloud environment. The Senior Data Engineer will lead large-scale data initiatives, including onboarding new data sources into the data warehouse, building real-time and batch data pipelines, and significantly improving pipeline performance and reliability. This role partners closely with engineering, analytics, and business teams to deliver scalable data solutions that support analytics, BI, and future AI/ML use cases.

    Critical Skill Priorities (In Order of Importance)
    Hands-on Databricks experience (Required)
    Strong Python scripting and SQL (daily, hands-on use)
    Apache Spark for cloud data loading and transformation
    Large-scale data initiatives (new source ingestion, platform expansion)
    Real-time and streaming data pipelines
    Pipeline performance tuning and optimization

    Key Responsibilities
    Design, build, and maintain real-time and batch data pipelines using Databricks and Spark
    Develop Python- and Spark-based processes for cloud data ingestion, transformation, and loading
    Lead large data initiatives such as bringing new internal and external data sources into the data warehouse, supporting streaming and near-real-time data use cases, and improving pipeline speed, scalability, and reliability
    Design and evolve data architecture supporting analytics, BI, and future AI/ML initiatives
    Collaborate with cross-functional teams to translate business requirements into scalable data solutions
    Monitor pipeline health, troubleshoot data issues, and improve system performance
    Participate in code reviews and promote best practices around testing, CI/CD, and maintainable data pipelines
    Contribute to the design and development of data products and data services consumed across the organization

    Required Qualifications
    Bachelor's degree required in Computer Science, Data Science, Engineering, Information Systems, Mathematics, Statistics, or a related field
    Candidates with fewer years of experience may be considered only if degree requirements are met
    5+ years of hands-on experience as a Data Engineer
    Strong, hands-on experience with Databricks and Apache Spark
    Strong proficiency in Python scripting for data processing and pipeline development
    Advanced SQL skills for analytics, transformations, and troubleshooting
    Experience building and supporting cloud-based data pipelines
    Experience working with large-scale data platforms and warehouses
    Strong troubleshooting and problem-solving skills

    Preferred / Nice-to-Have Qualifications
    Experience with Snowflake (may be considered in place of some Databricks experience)
    Experience with streaming technologies (Spark Streaming, Kafka, Event Hub, etc.)
    Experience optimizing and tuning data pipelines for performance and scalability
    Experience with CI/CD practices in data engineering environments
    Familiarity with BI tools such as Power BI or Tableau
    Experience working in Agile development environments
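The pipeline work this posting describes spans batch and near-real-time loads, and both usually hinge on tracking a high-watermark so each run only picks up rows that changed since the last one. A minimal, framework-free sketch of that idea (the `updated_at` field and `incremental_extract` helper are invented for illustration; in the role itself this logic would live inside Databricks/Spark jobs):

```python
from datetime import datetime

def incremental_extract(rows, watermark):
    """Return only rows modified after the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    # If nothing is fresh, the watermark stays where it was.
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 3)},
]
batch, wm = incremental_extract(source, datetime(2024, 1, 2))
# Only id=2 is newer than the watermark; wm advances to 2024-01-03.
```

Persisting `wm` between runs (in a control table or checkpoint) is what turns a full reload into the incremental ingestion the posting calls "onboarding new data sources" at scale.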
    $81k-115k yearly est. 28d ago
  • Data Engineer

    Tekgence

    Data engineer job in Las Vegas, NV

    Role: Data Engineer
    Project Type: Long-Term Contract
    Client: Atos Syntel

    JOB DESCRIPTION:
    Must have hands-on experience and knowledge of the Big Data ecosystem
    Must have hands-on experience in Apache PySpark
    Must have hands-on experience in Java (specifically Spring Boot web services)
    Proficient in writing Python code
    Good SQL knowledge for Hive queries
    Docker & Kubernetes: good to have, not a must
    $81k-115k yearly est. 60d+ ago
  • Data Analytics Engineer

    VBG (Veteran Benefits Guide)

    Data engineer job in Las Vegas, NV

    Job Description:

    Who we are: Veteran Benefits Guide (VBG) was founded by a former United States Marine with the goal of ensuring that Veterans receive accurate disability benefits in a timely manner. Since it was founded, VBG has guided more than 35,000 Veterans through the complicated Veterans Affairs (VA) disability claims process. As a company founded by a Veteran and staffed by many Veterans and families of Veterans, VBG is committed to advocating for policies that protect the rights and interests of former service members.

    Who we're looking for: The Data Analytics Engineer is responsible for transforming raw and staged data into trusted, well-modeled, analytics-ready datasets that empower reporting, dashboards, and data-driven decision-making across the organization. This role bridges the gap between engineering and analysis, ensuring data is clean, consistent, connected, and optimized for use by Analysts, BI Developers, and business teams. You will work closely with Data Engineers (who ingest data), BI Developers (who build dashboards), and Analysts (who generate insights) to build the semantic layer of the warehouse. You will own data modeling, cleansing, deduplication, and constructing unified datasets that bring together information from systems such as Salesforce, NetSuite, Google, and internal applications.

    This position is open to candidates located in the following states: Arizona (AZ), California (CA), Washington (WA), Nevada (NV), Utah (UT), Illinois (IL), Ohio (OH), New Jersey (NJ), Virginia (VA), North Carolina (NC), and Florida (FL).

    Essential Functions: Reasonable accommodation may be provided to enable individuals with disabilities to perform essential functions.

    Data Modeling & Transformation
    Build, maintain, and optimize curated data models using SQL, dbt, or similar transformation tools
    Create dimensional models (fact/dimension) and semantic layers to support reporting and advanced analytics
    Construct unified datasets that bring together cross-system information (e.g., Salesforce, NetSuite, Google Ads)

    Data Quality & Reliability
    Profile, validate, and cleanse data to eliminate duplicates, missing fields, and inconsistencies
    Implement automated data tests to ensure accuracy, completeness, and referential integrity
    Investigate and resolve issues flagged by Analysts when metrics do not match or data looks incorrect

    Warehouse Optimization & Governance
    Partner with DBAs and Data Engineers to ensure performant warehouse structures and optimized queries
    Adhere to and help define data governance, documentation standards, and semantic-layer best practices
    Maintain version-controlled analytics codebases using Git or similar workflows

    Collaboration & Stakeholder Support
    Work closely with Analysts to understand their data needs and translate them into robust models
    Support BI Developers by providing clean, reliable datasets that power dashboards and reports
    Communicate issues, improvements, and data model changes clearly to technical and non-technical audiences

    Success Measures (KPIs)
    Reduction in analyst time spent cleaning and prepping data (target: 40-60% reduction)
    Decrease in recurring data mismatches or report inconsistencies
    Increased adoption of curated datasets by Analysts and BI Developers
    Faster turnaround time for new data model requests and enhancements
    High data quality scores and reduction in manual remediation efforts

    Requirements
    Qualifications or competencies:
    Advanced SQL skills (window functions, CTEs, performance tuning)
    Experience with transformation frameworks (dbt strongly preferred)
    Strong understanding of data warehousing concepts: star schema, snowflake schema, fact/dimension modeling
    Familiarity with cloud warehouses (Snowflake, BigQuery, Redshift, Synapse)
    Ability to troubleshoot mismatched metrics, broken joins, or duplicated data
    Preferred Skills
    Snowflake SnowPro Core Certification
    Snowflake SnowPro Advanced Data Engineer Certification
    dbt Analytics Engineer Certification
    AWS Data Engineer - Associate Certification
    AWS Solutions Architect - Associate Certification
    Experience with Python or R for data validation or automation scripts
    Knowledge of BI tools (Power BI, Tableau, Looker) and how they interact with semantic layers
    Familiarity with CI/CD for analytics code and version control (Git)
    Exposure to data governance, cataloging, and documentation tools

    Education and previous work experience:
    Bachelor's degree in Data Analytics, Computer Science, Information Systems, or a related field
    3-5+ years of experience in analytics engineering, BI development, or data modeling

    EEO: Veteran Benefits Guide provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, national origin, ancestry, physical disability, mental disability, medical condition, marital status, sex (including pregnancy, childbirth, breastfeeding, or related medical conditions), gender (including gender identity and gender expression), genetic characteristic, sexual orientation, registered domestic partner status, age, military or veteran status, hairstyle or hair texture, reproductive health decision making, or any other characteristic protected by federal, state, or local laws.
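The deduplication and advanced-SQL work this posting calls out (window functions, CTEs) is most often expressed as a keep-the-latest-row-per-key staging query. A minimal sketch using Python's bundled sqlite3 driver, which supports window functions when built against SQLite 3.25 or newer (the `raw_accounts` table and its columns are invented for illustration, not VBG's schema):

```python
import sqlite3

# In-memory warehouse stand-in with duplicate source rows for one account.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE raw_accounts (account_id TEXT, email TEXT, loaded_at TEXT);
INSERT INTO raw_accounts VALUES
  ('A1', 'old@example.com', '2024-01-01'),
  ('A1', 'new@example.com', '2024-02-01'),
  ('A2', 'b@example.com',   '2024-01-15');
""")

# Keep only the latest row per account_id: the usual CTE + ROW_NUMBER()
# dedup pattern found in dbt staging models.
dedup_sql = """
WITH ranked AS (
  SELECT *,
         ROW_NUMBER() OVER (
           PARTITION BY account_id ORDER BY loaded_at DESC
         ) AS rn
  FROM raw_accounts
)
SELECT account_id, email FROM ranked WHERE rn = 1 ORDER BY account_id;
"""
rows = con.execute(dedup_sql).fetchall()
# rows == [('A1', 'new@example.com'), ('A2', 'b@example.com')]
```

The same query shape carries over almost verbatim to Snowflake, BigQuery, or Redshift, which is why it tends to anchor the "curated dataset" layer the posting describes.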
    $81k-115k yearly est. 3d ago
  • Data Architect / Eng.

    International Market Centers 4.6company rating

    Data engineer job in Las Vegas, NV

    We at Andmore are seeking a technically skilled and business-aware Data Architect to lead the development of scalable data infrastructure that powers B2B marketing analytics and decision-making. This role requires deep expertise in Snowflake and the ability to collaborate with marketing technology teams, front-end developers, and CRM specialists to deliver data solutions that support customer engagement and internal operations.

    Key Responsibilities:
    * Design and optimize data pipelines and models using DBT and Snowflake, tailored to B2B marketing use cases.
    * Design and implement logical and physical data models (e.g., dimensional modeling) to represent analytics use cases.
    * Lead technical discussions with front-end developers (e.g., Power BI specialists), marketing analysts, and CRM engineers to ensure data structures support reporting and analytics needs.
    * Collaborate with marketing operations and CRM teams (Microsoft Dynamics) to integrate and harmonize data across platforms, enabling unified customer views and actionable insights.
    * Translate functional marketing requirements into technical specifications by asking targeted questions and proposing scalable data architecture solutions.
    * Implement data governance and metadata management using tools like Snowflake or Microsoft Purview, ensuring compliance and transparency.
    * Support self-service analytics and dashboarding through Power BI, enabling marketing teams to explore campaign performance and audience behavior.
    * Monitor and troubleshoot data workflows, ensuring high availability, data quality, and performance across marketing data assets.
    * Contribute to the evolution of the marketing data stack, including experimentation with new tools and architecture patterns.

    Required Qualifications:
    * Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
    * 7+ years of experience in data engineering, with a strong focus on cloud-based data platforms and data modeling.
    * Proven expertise in Snowflake.
    * Strong SQL skills and experience with dimensional modeling for marketing and activity data.
    * Experience working with marketing data sources such as CRM, web analytics, and campaign management platforms.
    * Excellent communication skills, with a focus on technical collaboration across teams.
    * Ability to design and implement scalable data architecture solutions.

    Recommended Skills & Tools:
    * Familiarity with Snowflake or Microsoft Purview for data governance and cataloging.
    * Experience integrating data from Microsoft Dynamics and other B2B marketing platforms.
    * Experience with DBT for data transformations.
    * Working knowledge of Power BI for dashboarding and reporting.
    * Understanding of data privacy, compliance (e.g., GDPR), and security best practices.
    * Strategic thinker with a proactive approach to problem-solving.
    * Strong interpersonal and technical collaboration skills.
    * Passion for continuous learning and staying current with data engineering trends.
    $100k-145k yearly est. 60d+ ago
  • Sr. Data Engineer

    Slickdeals 4.1company rating

    Data engineer job in Las Vegas, NV

    We believe shopping should feel like winning. That's why 10 million people come to Slickdeals to swap tips, upvote the best finds, and share the thrill of a great deal. Together, our community has saved more than $10 billion over the past 26 years. We're profitable, passionate, and in the middle of an exciting evolution: transforming from the internet's most trusted deal forum into the go-to daily shopping destination. If you thrive in a fast-moving, creative environment where ideas turn into impact quickly, you'll fit right in.

    The Purpose: We're seeking a seasoned Senior Data Engineer to join our high-impact team at Slickdeals. This role will inherit and evolve a mature data ecosystem built over 3+ years, spanning Databricks, dbt, Airflow, AWS, Tableau, and AtScale. You'll be responsible for maintaining and modernizing core pipelines, enabling analytics and reporting, and supporting cost-conscious, scalable data infrastructure.

    What You'll Do:
    Own and maintain ETL/ELT pipelines using dbt, Airflow, and Databricks
    Develop and optimize data models in AtScale to support BI tools like Tableau
    Collaborate with Analytics, Product, and Engineering teams to deliver reliable, timely data
    Monitor and troubleshoot data workflows, ensuring high availability and performance
    Support cloud infrastructure in AWS, including S3, Kafka, EC2, Lambda, and IAM policies
    Contribute to cost optimization efforts across data storage, compute, and tooling
    Document systems, processes, and tribal knowledge for continuity and onboarding
    Participate in code reviews, architecture discussions, and team rituals

    What We're Looking For:
    Required Experience:
    BS/BA/BE degree in a quantitative area such as mathematics, statistics, economics, computer science, engineering, or equivalent experience.
    8+ years of experience in data engineering or analytics engineering
    Strong proficiency in SQL, Python, and dbt
    Hands-on experience with Databricks, Airflow, and AWS
    Familiarity with semantic modeling tools
    Experience building dashboards and supporting BI teams using Tableau
    Understanding of data governance, security, and compliance best practices
    Excellent communication and written documentation skills
    Comfortable working in a fast-paced, collaborative environment
    Always curious and a continuous learner

    Preferred Experience:
    Experience with cost monitoring tools or FinOps practices
    Familiarity with vendor integrations and API-based data sharing
    Exposure to AtScale, Tableau, or other modern data platforms
    Passion for mentoring and knowledge sharing

    With your application, kindly attach a cover letter that outlines your greatest achievement. Please share what you built, how you measured success, and your role in the result. Please note: We are unable to sponsor visas at this time. Candidates must be authorized to work in the U.S. without current or future visa sponsorship or transfer.

    LOCATION: Las Vegas, NV. Hybrid schedule visiting our Las Vegas office three days a week (Tues-Thurs).

    Slickdeals Compensation, Benefits, Perks:
    The expected base pay for this role is between $122,000 - $150,000. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Exact compensation will be discussed during the interview process and tailored to the candidate's qualifications.
    Competitive base salary and annual bonus
    Competitive paid time off in addition to holiday time off
    A variety of healthcare insurance plans to give you the best care for your needs
    401K matching above the industry standard
    Professional Development Reimbursement Program

    Work Authorization
    Candidates must be eligible to work in the United States.
Slickdeals is an Equal Opportunity Employer; employment is governed on the basis of merit, competence and qualifications and will not be influenced in any manner by race, color, religion, gender (including pregnancy, childbirth, or related medical conditions), national origin/ethnicity, veteran status, disability status, age, sexual orientation, gender identity, marital status, mental or physical disability or any other protected status. Slickdeals will consider qualified applicants with criminal histories consistent with the "Ban the Box" legislation. We may access publicly available information as part of your application. Slickdeals participates in E-Verify. For more information, please refer to E-Verify Participation and Right to Work. Slickdeals does not accept unsolicited resumes from agencies and is not responsible for related fees.
    $122k-150k yearly Auto-Apply 21d ago
  • Senior Data Science Engineer, Football

    Draftkings 4.0company rating

    Data engineer job in Las Vegas, NV

    At DraftKings, AI is becoming an integral part of both our present and future, powering how work gets done today, guiding smarter decisions, and sparking bold ideas. It's transforming how we enhance customer experiences, streamline operations, and unlock new possibilities. Our teams are energized by innovation and readily embrace emerging technology. We're not waiting for the future to arrive. We're shaping it, one bold step at a time. To those who see AI as a driver of progress, come build the future together. The Crown Is Yours Our Sports Modeling team comprises sports modeling experts and data science technologists, coming together to develop innovative products that deliver incremental value across our Sportsbook platform. As a Senior Data Scientist on the Sports Modeling team, you will develop models and data-driven solutions that enhance the Sportsbook experience for our users. In this role, you will work on implementing advanced sports models, refining data assets, and ensuring seamless integration into applications. What you'll do as a Senior Data Scientist, Football Create statistical and machine learning models and integrate them into data science applications. Collect and engineer sports data assets to assist in model development. Implement the sports models and pricing engines in Python. Create automatic tests to ensure model and pricing engine accuracy. Collaborate closely with Trading, Product, Engineering, and QA teams to move projects from ideation to deployment. Test data flows and model integration in a larger business context. Coach and support more junior data scientists within the team. What you'll bring Demonstrated passion for sports and a strong understanding of relevant leagues and their dynamics. A college degree in Statistics, Data Science, Mathematics, Computer Science, Engineering, or another related field. Proficiency in Python, with experience building statistical or machine learning models across various sports. 
Solid grasp of data science principles, statistical modeling techniques, and object-oriented programming concepts. Familiarity with tools and practices such as Kubernetes, Kafka, version control, and MLOps principles. Self-motivation and eagerness to expand knowledge and understanding of Sportsbook products and related technologies. Join Our Team We're a publicly traded (NASDAQ: DKNG) technology company headquartered in Boston. As a regulated gaming company, you may be required to obtain a gaming license issued by the appropriate state agency as a condition of employment. Don't worry, we'll guide you through the process if this is relevant to your role. The US base salary range for this full-time position is 120,800.00 USD - 151,000.00 USD, plus bonus, equity, and benefits as applicable. Our ranges are determined by role, level, and location. The compensation information displayed on each job posting reflects the range for new hire pay rates for the position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Your recruiter can share more about the specific pay range and how that was determined during the hiring process. It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
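The sports models this DraftKings role builds are often introduced through a simple independent-Poisson goal model, where an expected-goal rate for each side is converted into win/draw/loss probabilities that a pricing engine can then turn into odds. A toy sketch of that idea (the rates 1.6 and 1.1 are arbitrary illustration values; real Sportsbook models are far richer than this):

```python
from math import exp, factorial

def poisson(k: int, lam: float) -> float:
    """P(X = k) for a Poisson random variable with mean lam."""
    return lam ** k * exp(-lam) / factorial(k)

def outcome_probs(lam_home: float, lam_away: float, max_goals: int = 10):
    """Home-win / draw / away-win probabilities under independent Poisson scoring."""
    home = draw = away = 0.0
    for h in range(max_goals + 1):
        for a in range(max_goals + 1):
            p = poisson(h, lam_home) * poisson(a, lam_away)
            if h > a:
                home += p
            elif h == a:
                draw += p
            else:
                away += p
    return home, draw, away

# Home side expected to score 1.6 goals, away side 1.1.
home, draw, away = outcome_probs(1.6, 1.1)
```

Truncating at ten goals per side leaves only a negligible tail, so the three probabilities sum to essentially one; a pricing engine would then apply margin on top of these fair probabilities before quoting odds.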
    $85k-120k yearly est. Auto-Apply 60d+ ago
  • Databricks Data Engineer - Manager - Consulting - Location Open

    EY 4.7company rating

    Data engineer job in Las Vegas, NV

    At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

**Technology - Data and Decision Science - Data Engineering - Manager**

We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills.

**The opportunity:**

In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include:

+ Understanding and analyzing business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.

**Key Responsibilities:**

As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including:

+ Leading workstream delivery and ensuring quality in all processes.
+ Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services.
+ Implementing resource plans and budgets while managing engagement economics.

This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs.

**Skills and attributes for success:**

To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact:

+ Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices.
+ Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
+ Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value.
+ Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance.
+ Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics.
+ Strong analytical and problem-solving abilities.
+ Excellent communication skills, with the ability to convey complex information clearly.
+ Proven experience in managing and delivering projects effectively.
+ Ability to build and manage relationships with clients and stakeholders.

**To qualify for the role, you must have:**

+ Bachelor's degree in Computer Science, Engineering, or a related field required; Master's degree preferred.
+ Typically, no less than 4-6 years of relevant experience in data engineering, with a focus on cloud data solutions and analytics.
+ Proven expertise in Databricks and experience with Spark for big data processing.
+ Strong background in data architecture and design, with experience in building complex cloud analytics solutions.
+ Experience in leading and managing teams, with a focus on mentoring and developing talent.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.

**Required Expertise for Managerial Role:**

+ **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision.
+ **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.

**Large-Scale Implementation Programs:**

1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities.
2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing.
3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting.

**Ideally, you'll also have:**

+ Experience with advanced data analytics tools and techniques.
+ Familiarity with machine learning concepts and applications.
+ Knowledge of industry trends and best practices in data engineering.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.

**What we look for:**

We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you. FY26NATAID

**What we offer you**

At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.

+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for the New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client-serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.

**Are you ready to shape your future with confidence? Apply today.**

EY accepts applications for this position on an on-going basis. For those living in California, please click here for additional information. EY focuses on high ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.

**EY | Building a better working world**

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law. EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
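As a rough illustration of the ingest-transform-store pipeline work this role centers on, here is a minimal sketch in plain Python. The sample data, column names, and data-quality rules are hypothetical; in a Databricks job, the same steps would typically be expressed with `spark.read`, DataFrame transformations, and a Delta table write rather than standard-library code.

```python
import csv
import io
import json

# Hypothetical raw extract from a source system (e.g., a POS export).
RAW_CSV = """order_id,amount,currency
1001,25.00,USD
1002,13.50,usd
1003,,USD
"""

def ingest(raw: str) -> list[dict]:
    """Parse raw rows; on Databricks this would be spark.read.csv(...)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Cleanse and normalize: drop rows missing an amount, upper-case currency."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # illustrative data-quality rule: reject incomplete records
        out.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return out

def store(rows: list[dict]) -> str:
    """Serialize for the storage layer; a real job would write a Delta table."""
    return "\n".join(json.dumps(r) for r in rows)

clean = transform(ingest(RAW_CSV))
print(store(clean))
```

The three functions mirror the ingestion, transformation, and storage stages named in the posting; governance rules (like the missing-amount filter) live in the transform stage so rejected records never reach storage.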
    $150.7k-261.6k yearly 60d+ ago
  • Google Cloud Data & AI Engineer

    Slalom 4.6 company rating

    Data engineer job in Las Vegas, NV

    Who You'll Work With

As a modern technology company, our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant or Principal at Slalom, you will be a part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients.

What You'll Do

* Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub and more.
* Implement cloud-based data solutions for data ingestion, transformation, and storage, and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs.
* Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem.
* Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps).
* Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud.
* Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients.
* Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices.

What You'll Bring

* Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.).
* Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance.
* Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML.
* Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment.
* Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts.
* Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects.
* Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously.
* Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe.

About Us

Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

Compensation and Benefits

Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.

Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time.

East Bay, San Francisco, Silicon Valley:
* Consultant: $114,000-$171,000
* Senior Consultant: $131,000-$196,500
* Principal: $145,000-$217,500

San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC:
* Consultant: $105,000-$157,500
* Senior Consultant: $120,000-$180,000
* Principal: $133,000-$199,500

All other locations:
* Consultant: $96,000-$144,000
* Senior Consultant: $110,000-$165,000
* Principal: $122,000-$183,000

EEO and Accommodations

Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.

We are accepting applications until 12/31. #LI-FB1
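The streaming ingestion-and-processing work this role describes (e.g., Pub/Sub events flowing through Dataflow into BigQuery) can be sketched with a minimal pure-Python stand-in. The event data and window size below are hypothetical; an actual Dataflow/Beam pipeline would express the same grouping with `WindowInto(FixedWindows(60))` followed by a per-key combine.

```python
from collections import defaultdict

# Hypothetical event stream: (timestamp_seconds, user_id) pairs,
# as might be pulled from a Pub/Sub subscription.
EVENTS = [
    (0, "a"), (12, "b"), (35, "a"),   # first minute
    (61, "a"), (75, "c"),             # second minute
]

def tumbling_window_counts(events, window_seconds=60):
    """Group events into fixed (tumbling) windows and count per window --
    the core aggregation step a streaming pipeline performs before
    writing results to a sink such as a BigQuery table."""
    counts = defaultdict(int)
    for ts, _user in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(sorted(counts.items()))

print(tumbling_window_counts(EVENTS))  # -> {0: 3, 60: 2}
```

Tumbling windows are the simplest windowing strategy; real streaming jobs also have to handle late-arriving data, which is where Beam's watermark and trigger machinery comes in.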
    $145k-217.5k yearly 29d ago

Learn more about data engineer jobs

How much does a data engineer earn in Spring Valley, NV?

The average data engineer in Spring Valley, NV earns between $70,000 and $134,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Spring Valley, NV

$97,000
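As a sanity check on the figure above (our own calculation, not the source's stated methodology): $97,000 is close to the geometric mean of the quoted $70,000-$134,000 range, while the arithmetic midpoint would be $102,000.

```python
# Compare the quoted $97,000 average against two candidate midpoints
# of the $70,000-$134,000 range.
low, high = 70_000, 134_000
geometric_mean = (low * high) ** 0.5   # sqrt(70000 * 134000)
arithmetic_mean = (low + high) / 2

print(round(geometric_mean))   # 96850, close to the quoted $97,000
print(round(arithmetic_mean))  # 102000
```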

What are the biggest employers of Data Engineers in Spring Valley, NV?

The biggest employers of Data Engineers in Spring Valley, NV are:
  1. Ernst & Young
  2. Abnormal Security
  3. Slickdeals
  4. DraftKings at Casino Queen
  5. PwC
  6. Slalom
  7. American Homes 4 Rent, L.P.
  8. Anywhere Real Estate
  9. Fusion HCR
  10. Swickard Auto Group