Post job

Data engineer jobs in North Las Vegas, NV - 189 jobs

All
Data Engineer
Data Scientist
Lead Building Engineer
Data Consultant
Data Architect
Data Modeler
  • Databricks Data Engineer - Manager - Consulting - Miami

    EY 4.7 company rating

    Data engineer job in Las Vegas, NV

    At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. **Technology - Data and Decision Science - Data Engineering - Manager** We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills. **The opportunity:** In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include: + Understanding and analyzing business requirements to translate them into technical requirements. + Designing, building, and operating scalable data architecture and modeling solutions. + Staying up to date with the latest trends and emerging technologies to maintain a competitive edge. **Key Responsibilities:** As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including: + Leading workstream delivery and ensuring quality in all processes. + Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services. + Implementing resource plans and budgets while managing engagement economics. This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. 
You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs. **Skills and attributes for success:** To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact: + Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP). + Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices. + Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement. + Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value. + Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance. + Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services. + Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress. + Manage client relationships and expectations, ensuring high levels of satisfaction and engagement. + Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics. + Strong analytical and problem-solving abilities. + Excellent communication skills, with the ability to convey complex information clearly. + Proven experience in managing and delivering projects effectively. + Ability to build and manage relationships with clients and stakeholders. **To qualify for the role, you must have:** + Bachelor's degree in computer science, Engineering, or a related field required; Master's degree preferred. 
+ Typically no less than 4-6 years of relevant experience in data engineering, with a focus on cloud data solutions and analytics. + Proven expertise in Databricks and experience with Spark for big data processing. + Strong background in data architecture and design, with experience in building complex cloud analytics solutions. + Experience in leading and managing teams, with a focus on mentoring and developing talent. + Strong programming skills in languages such as Python, Scala, or SQL. + Excellent problem-solving skills and the ability to work independently and as part of a team. + Strong communication and interpersonal skills, with a focus on client management. **Required Expertise for Managerial Role:** + **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision. + **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope. + **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively. + **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption. + **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies. + **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes. + **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients. **Large-Scale Implementation Programs:** 1. 
**Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities. 2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing. 3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting. **Ideally, you'll also have:** + Experience with advanced data analytics tools and techniques. + Familiarity with machine learning concepts and applications. + Knowledge of industry trends and best practices in data engineering. + Familiarity with cloud platforms (AWS, Azure, GCP) and their data services. + Knowledge of data governance and compliance standards. + Experience with machine learning frameworks and tools. **What we look for:** We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you. FY26NATAID **What we offer you** At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more . 
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options. + Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year. + Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. **Are you ready to shape your future with confidence? Apply today.** EY accepts applications for this position on an on-going basis. For those living in California, please click here for additional information. EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities. **EY | Building a better working world** EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. 
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories. EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law. EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY's Talent Shared Services Team (TSS) or email the TSS at ************************** .
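The posting above centers on end-to-end pipeline development — ingestion, transformation, and storage — on Databricks. As a rough illustration of that kind of raw-to-curated flow (a minimal, Spark-free sketch with hypothetical names and schema, not EY's actual stack):

```python
# Hypothetical sketch of a raw -> cleaned -> reporting pipeline of the kind
# the posting describes; source names, fields, and rules are illustrative.

def bronze_ingest(raw_rows):
    """Land raw source records as-is, tagging each with lineage metadata."""
    return [dict(row, _source="pos_system") for row in raw_rows]

def silver_clean(bronze_rows):
    """Standardize types and drop records that fail basic quality checks."""
    cleaned = []
    for row in bronze_rows:
        try:
            cleaned.append({
                "store": row["store"].strip().upper(),
                "amount": float(row["amount"]),
                "_source": row["_source"],
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad records instead
    return cleaned

def gold_sales_by_store(silver_rows):
    """Aggregate cleaned rows into a reporting-ready metric."""
    totals = {}
    for row in silver_rows:
        totals[row["store"]] = totals.get(row["store"], 0.0) + row["amount"]
    return totals

raw = [{"store": " nyc ", "amount": "10.50"},
       {"store": "LAX", "amount": "4.25"},
       {"store": "LAX", "amount": "oops"}]   # bad record, dropped at cleaning
print(gold_sales_by_store(silver_clean(bronze_ingest(raw))))
# {'NYC': 10.5, 'LAX': 4.25}
```

On Databricks these three stages would typically be Spark DataFrame transformations over Delta tables rather than Python dicts; the staged structure is the point here.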
    $150.7k-261.6k yearly 53d ago
  • Principal Data Scientist

    Maximus 4.3 company rating

    Data engineer job in Las Vegas, NV

    Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work.) This position requires occasional travel to the DC area for client meetings. U.S. citizenship is required for this position due to government contract requirements. Essential Duties and Responsibilities: - Make deep dives into the data, pulling out objective insights for business leaders. - Initiate, craft, and lead advanced analyses of operational data. - Provide a strong voice for the importance of data-driven decision making. - Provide expertise to others in data wrangling and analysis. - Convert complex data into visually appealing presentations. - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners. - Understand the importance of automation and look to implement and initiate automated solutions where appropriate. - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects. 
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects. - Guide operational partners on product performance and solution improvement/maturity options. - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization. - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages. - Mentor more junior data analysts/data scientists as needed. - Apply a strategic approach to lead projects from start to finish. Job-Specific Minimum Requirements: - Develop, collaborate on, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation. - Contribute to and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital. - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning. - Contribute to the development of mathematically rigorous process improvement procedures. - Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments. Minimum Requirements - Bachelor's degree in a related field required. - 10-12 years of relevant professional experience required. Job-Specific Minimum Requirements: - 10+ years of relevant Software Development + AI / ML / DS experience. - Professional Programming experience (e.g. Python, R, etc.). - Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML. 
- Experience with API programming. - Experience with Linux. - Experience with Statistics. - Experience with Classical Machine Learning. - Experience working as a contributor on a team. Preferred Skills and Qualifications: - Master's or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.). - Experience developing machine learning or signal processing algorithms: - Ability to leverage mathematical principles to model new and novel behaviors. - Ability to leverage statistics to identify true signals from noise or clutter. - Experience working as an individual contributor in AI. - Use of state-of-the-art technology to solve operational problems in AI and Machine Learning. - Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles. - Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions. - Ability to build reference implementations of operational AI & Advanced Analytics processing solutions. Background Investigations: - IRS MBI - Eligibility #techjobs #VeteransPage #LI-Remote EEO Statement Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics. Pay Transparency Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. 
Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances. Accommodations Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************. Minimum Salary: $156,740.00 Maximum Salary: $234,960.00
    $86k-122k yearly est. Easy Apply 7d ago
  • Graduate Data Scientist

    William Hill Sportsbook

    Data engineer job in Las Vegas, NV

    About the Role Join our Sportsbook team as a Graduate Data Scientist, where you'll gain hands-on experience working at the intersection of quantitative modeling and trading operations. You'll help support the development and monitoring of predictive models that power real-time odds and trading decisions, while also participating in trading workflows for pre-match and in-play sports betting markets. This is an excellent opportunity for an analytically minded graduate with a passion for sports, data, and betting markets to learn from an experienced team and grow into a hybrid quant/trading role. This is an on-site position at our Las Vegas, NV office. Key Responsibilities Assist in building, testing, and maintaining statistical models used to price sportsbook markets (e.g., moneyline, spreads, totals, props). Conduct exploratory data analysis and feature engineering using historical sports data and real-time feeds. Analyze performance metrics like margin, P&L, and customer behavior to improve pricing accuracy and model calibration. Work closely with Trading, Product, and Engineering teams to support model improvements and ensure seamless trading execution. Stay up-to-date with sports analytics, statistical methods, and advanced machine learning; evaluate new methods to keep our models best-in-class. Support traders with pricing, risk management, and market monitoring during live events. Monitor odds movement, betting patterns, and liabilities to help ensure market efficiency. Contribute to setting pre-match and in-play prices using internal tools and models. Qualifications Bachelor's or Master's degree in a quantitative field such as Statistics, Mathematics, Computer Science, Engineering, or similar. Solid foundation in probability and statistics, with an interest in applying these to real-world problems. Strong programming skills in Python (especially with data-focused libraries like pandas and NumPy). 
Passion for sports and interest in sportsbook mechanics or financial markets. Strong attention to detail, especially when working with live data and pricing environments. Willingness to work flexible hours during major sporting events as needed. Nice to Have Familiarity with sports betting odds and common market types (e.g., spread, totals, player props). Exposure to sports analytics or participation in data competitions (e.g., Kaggle, university research). Experience with data visualization tools or dashboarding. Prior experience in a sportsbook, trading desk, or similar high-tempo environment.
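The core task this role supports — turning model win probabilities into moneyline prices with a built-in margin — can be sketched roughly as follows. This is a generic illustration, not William Hill's pricing method; the proportional-overround approach and the 4.5% default margin are assumptions:

```python
def price_market(probabilities, margin=0.045):
    """Convert model win probabilities into American moneyline prices,
    applying a proportional overround (the bookmaker's margin)."""
    total = sum(probabilities)
    prices = []
    for p in probabilities:
        fair = p / total                   # normalize model output to sum to 1
        implied = fair * (1 + margin)      # inflate so implied probs exceed 100%
        decimal = 1 / implied              # margin-adjusted decimal odds
        if decimal >= 2.0:                 # underdog: positive American odds
            american = round((decimal - 1) * 100)
        else:                              # favorite: negative American odds
            american = round(-100 / (decimal - 1))
        prices.append(american)
    return prices

# A 60/40 two-way market at a 4.5% margin:
print(price_market([0.60, 0.40]))   # [-168, 139]
```

Note that at a 50/50 split both sides price below even money (around -109 at this margin), which is exactly where the book's edge comes from.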
    $80k-116k yearly est. Auto-Apply 47d ago
  • Lead Data Scientist

    Streamline Media Group 4.4 company rating

    Data engineer job in Las Vegas, NV

    Job Responsibilities: Manage the data scientist and data specialist team. Identify efficient sources for quality data collection. Lead data collection and data mining processes. Ensure and guarantee data integrity. Analyze data and interpret data problems. Plan, prioritize, and streamline all planned data projects. Develop a proper analytics system according to the requirements. Analyze and test the performance of the products. Build reports on the visualization and performance of the products. Keep implementing new techniques and models. Align data projects with the goals of the company. Job Skills: Bachelor's degree in Data Science, Computer Science, or other related fields. Proven experience in Data Science and other related fields. Good understanding of data management and visualization techniques. Expertise in statistical data analysis and predictive data modeling. Good technical and coding knowledge of Python, R, MATLAB, SQL, and other databases. Outstanding communication skills. Inspiring leadership qualities and organizational skills.
    $91k-128k yearly est. 60d+ ago
  • Data Modeling

    5 Star Recruitment 3.8 company rating

    Data engineer job in Las Vegas, NV

    7+ years of experience in Data Modeling & Data Analysis, with excellent communication and leadership skills. Financial domain knowledge is a plus. Should have a strong understanding of cloud data warehouses like Snowflake and data modeling skills using Data Vault 2.0. Should have an in-depth understanding of, or have executed, newer concepts like the Data Lakehouse, and should have worked with Big Data sources/tools such as Hive, S3, Trino, HUE, etc. Should have delivered complex (100+ entities/tables) Data Modeling programs in both on-prem and cloud environments. Experience in designing normalized, denormalized, relational, dimensional, star, and snowflake schemas for cloud, big data, and on-prem databases using any one of these data modeling tools (Erwin, ER/Studio, Toad Data Modeler, etc.). Extensive Data Modeling experience in the same or similar big data and cloud databases such as Hive, Redshift, and Snowflake, and on-prem databases such as Oracle, SQL Server, and DB2. Should have experience working daily with product managers, project managers, business users, application development team members, DBA teams, and the Data Governance team to analyze requirements and to design, develop, and deploy technical solutions. Should have implemented Bill Inmon and Ralph Kimball methodologies to design data warehouses. Should have an in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Master Data Management (MDM), Data Quality, Data Lineage, Data Dictionary, Data Mapping, Data Policy, and Data Governance.
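As a rough illustration of the dimensional (star-schema) modeling this posting asks for — a fact table at a fixed grain rolled up through a dimension — here is a minimal sketch; table and column names are hypothetical, not from any client model:

```python
# Star-schema join at a daily per-store grain: the fact table carries only
# keys and measures, while descriptive attributes live in the dimension.

dim_store = {                       # dimension: one row per store
    1: {"store_name": "Downtown", "region": "West"},
    2: {"store_name": "Airport",  "region": "East"},
}

fact_sales = [                      # fact: one row per store per day
    {"store_key": 1, "date": "2024-01-01", "revenue": 1200.0},
    {"store_key": 2, "date": "2024-01-01", "revenue":  800.0},
    {"store_key": 1, "date": "2024-01-02", "revenue":  950.0},
]

def revenue_by_region(facts, stores):
    """Roll the fact table up through the dimension to a regional metric."""
    totals = {}
    for row in facts:
        region = stores[row["store_key"]]["region"]
        totals[region] = totals.get(region, 0.0) + row["revenue"]
    return totals

print(revenue_by_region(fact_sales, dim_store))   # {'West': 2150.0, 'East': 800.0}
```

In a real warehouse the same rollup is a `JOIN` plus `GROUP BY` in SQL; keeping the grain explicit (store x day here) is what makes the metric unambiguous.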
    $101k-143k yearly est. 60d+ ago
  • AWS Data Migration Consultant

    Slalom 4.6 company rating

    Data engineer job in Las Vegas, NV

    Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment. Who You'll Work With As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments. What You'll Do * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters). * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools. * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques. * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud. * Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS. 
* Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards. * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK. * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools. * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms. What You'll Bring * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2. * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2. * Hands-on experience with AWS database services (RDS, EC2-hosted databases). * Strong understanding of HA/DR solutions and cloud database design patterns. * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions. * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity. * Strong troubleshooting and analytical skills to resolve complex database and performance issues. * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders. Nice to Have * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional. * Experience with NoSQL databases or hybrid data architectures. * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau). * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate). * Experience with DB2 on-premise or cloud-hosted environments. 
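One concrete artifact of the DMS-based migrations described above is the table-mapping document supplied to a replication task. A minimal sketch of building one in Python — the JSON rule shape follows AWS DMS's documented selection-rule format, while the schema and table names are placeholders:

```python
import json

def selection_rules(schema, tables):
    """Build a DMS table-mapping document that includes the given tables."""
    rules = [{
        "rule-type": "selection",
        "rule-id": str(i + 1),
        "rule-name": f"include-{table}",
        "object-locator": {"schema-name": schema, "table-name": table},
        "rule-action": "include",
    } for i, table in enumerate(tables)]
    return json.dumps({"rules": rules}, indent=2)

mapping = selection_rules("SALES", ["ORDERS", "CUSTOMERS"])
print(mapping)
# This string would be passed as the TableMappings parameter to
# create_replication_task on a boto3 DMS client, alongside the source and
# target endpoint ARNs (omitted here).
```

Wildcards (`"table-name": "%"`) and transformation rules (renames, schema moves) follow the same rule-list structure.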
About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay range in the following locations: Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, New Jersey, for Consultant level is $105,000-147,000 and for Senior Consultant level it is $120,000-$169,000 and for Principal level it is $133,000-$187,000. In all other markets, the target base salary pay range for Consultant level is $96,000-$135,000 and for Senior Consultant level it is $110,000-$155,000 and for Principal level it is $122,000-$172,000. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. 
The salary pay range is subject to change and may be modified at any time. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 11d ago
  • Data Platform Engineer

    Abnormal 4.5 company rating

    Data engineer job in Las Vegas, NV

    **About the Role**

You'll build, operate, and evolve the end-to-end data platform that powers analytics, automation, and AI use cases. This is a hands-on role spanning cloud infrastructure, ingestion/ETL, and data modeling across a Medallion (bronze/silver/gold) architecture. You'll partner directly with stakeholders to turn messy source data into trusted datasets, metrics, and data products.

**Who you are:**
+ Pragmatic Builder: You write clear SQL/Python, ship durable systems, and leave pipelines more reliable than you found them.
+ Data-Savvy Generalist: You're comfortable moving up and down the stack (cloud, pipelines, warehousing, and BI) and picking the right tool for the job.
+ Fundamentals-First & Customer-Centric: You apply strong data modeling principles and optimize the analyst/stakeholder experience through consistent semantics and trustworthy reporting.
+ Low-Ego, High-Ownership Teammate: You take responsibility for outcomes, seek feedback openly, and will roll up your sleeves to move work across the finish line.
+ High-Energy Communicator: You're comfortable presenting, facilitating discussions, and getting in front of stakeholders to drive clarity and alignment.
+ Self-Starter: You unblock yourself, drive decisions, and follow through on commitments; you bring a strong work ethic and invest in continuous learning.

**What you will do:**
+ Ingestion & ETL: Build reusable ingestion and ETL frameworks (Python and Spark) for APIs, databases, and un/semi-structured sources; handle JSON/Parquet and evolving schemas.
+ Medallion Architecture: Own and evolve Medallion layers (bronze/silver/gold) for key domains with clear lineage, metadata, and ownership.
+ Data Modeling & Marts: Design dimensional models and gold marts for core business metrics; ensure consistent grain and definitions.
+ Analytics Enablement: Maintain semantic layers and partner on BI dashboards (Sigma or similar) so metrics are certified and self-serve.
+ Reliability & Observability: Implement tests, freshness/volume monitoring, alerting, and runbooks; perform incident response and root-cause analysis (RCA) for data issues.
+ Warehouse & Performance: Administer and tune the cloud data warehouse (Snowflake or similar): compute sizing, permissions, query performance, and cost controls.
+ Standardization & Automation: Build paved-road patterns (templates, operators, CI checks) and automate repetitive tasks to boost developer productivity.
+ AI Readiness: Prepare curated datasets for AI/ML/LLM use cases (feature sets, embeddings prep) with appropriate governance.

**Must Haves:**
+ 3-5+ years of hands-on data engineering experience; strong SQL and Python; experience building data pipelines end-to-end in production.
+ Strong cloud fundamentals (AWS preferred; other major clouds acceptable): object storage, IAM concepts, logging/monitoring, and managed compute.
+ Experience building and operating production ETL pipelines with reliability basics: retries, backfills, idempotency, incremental processing patterns (e.g., SCDs, late-arriving data), and clear operational ownership (docs/runbooks).
+ Solid understanding of Medallion/layered architecture concepts (bronze/silver/gold or equivalent) and experience working within each layer.
+ Strong data modeling fundamentals (dimensional modeling/star schema): can define grain, build facts/dimensions, and support consistent metrics.
+ Working experience in a modern cloud data warehouse (Snowflake or similar): can write performant SQL and understand core warehouse concepts.
+ Hands-on dbt experience: building and maintaining models, writing core tests (freshness/uniqueness/RI), and contributing to documentation; ability to work in an established dbt project.
+ Experience with analytics/BI tooling (Sigma, Looker, Tableau, etc.) and semantic layer concepts; ability to support stakeholders and troubleshoot issues end-to-end.

**Nice to Have:**
+ Snowflake administration depth: warehouse sizing and cost management, advanced performance tuning, clustering strategies, and designing RBAC models.
+ Advanced governance & security patterns: masking policies, row-level security, and least-privilege frameworks as a primary implementer/owner.
+ Strong Spark/PySpark proficiency: deep tuning/optimization and large-scale transformations.
+ dbt "platform-level" ownership: CI/CD-based deployments, environment/promotion workflows, advanced macros/packages, and leading large refactors or establishing standards from scratch.
+ Orchestration: Airflow/MWAA DAG design patterns, backfill strategies at scale, dependency management, and operational hardening.
+ Sigma-specific depth: semantic layer/metrics layer architecture in Sigma, advanced dashboard standards, and organization-wide "certified metrics" rollout.
+ Automation/iPaaS experience: Workato (or similar) for business integrations and operational workflows.
+ Infrastructure-as-code: Terraform (or similar) for data/cloud infrastructure provisioning, environment management, and safe change rollout.
+ Data observability & lineage tooling: OpenLineage/Monte Carlo-style patterns, automated lineage hooks, anomaly detection systems.
+ Lakehouse/unstructured patterns: Parquet/Iceberg, event/data contracts, and advanced handling of semi/unstructured sources.
+ AI/ML/LLM data workflows: feature stores, embeddings/RAG prep, and privacy-aware governance.

#LI-EM4
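The must-haves above name incremental processing patterns such as SCDs and late-arriving data. As a rough illustration of what a Type 2 slowly changing dimension merge does, here is a minimal plain-Python sketch; the record layout, the single tracked `value` column, and the `scd2_merge` helper are hypothetical, not taken from the posting:

```python
def scd2_merge(current_rows, incoming_rows, key="customer_id", as_of="2024-01-01"):
    """Illustrative Type 2 SCD merge: changed keys close out the old version
    and open a new current one; unchanged rows pass through; unseen keys
    arrive as brand-new current rows."""
    incoming = {r[key]: r for r in incoming_rows}
    existing_keys = {r[key] for r in current_rows}
    merged = []
    for row in current_rows:
        new = incoming.get(row[key])
        if row["is_current"] and new is not None and new["value"] != row["value"]:
            # Close out the superseded version...
            merged.append({**row, "is_current": False, "valid_to": as_of})
            # ...and open the new current version.
            merged.append({key: row[key], "value": new["value"],
                           "is_current": True, "valid_from": as_of, "valid_to": None})
        else:
            merged.append(row)
    # Keys never seen before become new current rows.
    for k, new in incoming.items():
        if k not in existing_keys:
            merged.append({**new, "is_current": True,
                           "valid_from": as_of, "valid_to": None})
    return merged
```

In a real pipeline this logic would more likely live in a dbt snapshot or a warehouse `MERGE`, keyed on a hash of all tracked columns rather than one field.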
    $92k-132k yearly est. 11d ago
  • Senior Data Engineer

    American Homes 4 Rent, L.P.

    Data engineer job in Las Vegas, NV

    Since 2012, we've grown to become one of the leading single-family rental companies and homebuilders in the country, recently recognized as a top employer by Fortune and Great Place To Work. At AMH, our goal is to simplify the experience of leasing a home through professional management and maintenance support, so our residents can focus on what really matters to them, wherever they are in life.

The Senior Data Engineer is responsible for designing, building, and managing the data platform and tools that allow for efficient processing and analysis of large data sets. Develops and maintains scalable data pipelines, ensures data quality, and deploys machine learning models to production. Collaborates with business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization. Builds real-time and batch pipelines to handle large volumes of data efficiently. Collaborates with cross-functional teams and translates business requirements into scalable data solutions.

**Responsibilities:**
+ Design, develop, and maintain real-time or batch data pipelines to process and analyze large volumes of data.
+ Design and develop programs and tools to support ingestion, curation, and provisioning of complex first-party and third-party data for analytics, reporting, and data science.
+ Design and develop advanced data products and intelligent APIs.
+ Monitor system performance by performing regular tests; troubleshoot and integrate new features.
+ Lead the analysis of data and the design of data architecture to support BI, AI/ML, and data products.
+ Design and implement data platform architecture to meet the organization's analytical requirements.
+ Ensure solution designs address operational requirements such as scalability, maintainability, extensibility, flexibility, and integrity.
+ Provide technical leadership and mentorship to team members.
+ Lead peer development and code reviews with a focus on test-driven development and continuous integration/continuous delivery (CI/CD).

**Requirements:**
+ Bachelor's degree in computer science, information systems, data science, management information systems, mathematics, physics, engineering, statistics, economics, or a related field required; a master's degree in one of these fields preferred.
+ Minimum of eight (8) years of experience as a data engineer with full-stack capabilities.
+ Minimum of ten (10) years of experience in programming.
+ Minimum of five (5) years with cloud technologies such as Azure, AWS, or Google.
+ Strong SQL knowledge.
+ Experience in ML and ML pipelines a plus.
+ Experience in real-time integration and in developing intelligent apps and data products.
+ Proficiency in Python and experience with CI/CD practices.
+ Strong background in IaaS platforms and infrastructure.
+ Hands-on experience with Databricks, Spark, Fabric, or similar technologies.
+ Experience with Agile methodologies.
+ Hands-on experience in the design and development of data pipelines and data products.
+ Experience developing data ingestion, data processing, and analytical pipelines for big data, NoSQL, and data warehouse solutions.
+ Hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
+ Extensive experience with big data technologies such as Apache Spark and streaming technologies such as Kafka, Event Hub, etc.
+ Extensive experience designing data applications in a cloud environment.
+ Intermediate experience with RESTful APIs, messaging systems, and AWS or Microsoft Azure.
+ Extensive experience in data architecture and data modeling.
+ Expert in data analysis and data quality frameworks.
+ Knowledgeable in BI tools such as Power BI and Tableau.
+ May occasionally work evenings and/or weekends.

**Compensation:** The anticipated pay range/scale for this position is $121,116.00 to $151,395.00 annually. Actual starting base pay within this range will depend on factors including geographic location, education, training, skills, and relevant experience.

**Additional Compensation:** This position is eligible to receive a discretionary annual bonus.

**Perks and Benefits:** Employees have the opportunity to participate in medical, dental, and vision insurance; flexible spending accounts and/or health savings accounts; dependent savings accounts; 401(k) with company matching contributions; an employee stock purchase plan; and a tuition reimbursement program. The Company provides 9 paid holidays per year, and, upon hire, new employees will accrue paid time off (PTO) at a rate of 0.0577 hours of PTO per hour worked, up to a maximum of 120 hours per year.

CA Privacy Notice: To learn more about what information we collect when you apply for a job, and how we use that information, please see our CA Job Applicant Privacy Notice found at ************************************** #LI-PH1
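The listing's emphasis on test-driven development and CI/CD for data pipelines can be illustrated with a minimal data-quality gate of the kind a CI job might run before promoting a batch; the field names and the `check_batch` helper are invented for the example, not part of the posting:

```python
def check_batch(rows, required=("home_id", "rent_amount")):
    """Illustrative data-quality gate: returns a list of human-readable
    violations (empty list means the batch passes). Checks required fields,
    primary-key uniqueness, and non-negative amounts."""
    violations = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Required fields must be present and non-null.
        for field in required:
            if row.get(field) is None:
                violations.append(f"row {i}: missing {field}")
        # Primary key must be unique within the batch.
        hid = row.get("home_id")
        if hid in seen_ids:
            violations.append(f"row {i}: duplicate home_id {hid}")
        seen_ids.add(hid)
        # Business rule: rent can't be negative.
        amt = row.get("rent_amount")
        if isinstance(amt, (int, float)) and amt < 0:
            violations.append(f"row {i}: negative rent_amount {amt}")
    return violations
```

A CI step would fail the build whenever the returned list is non-empty, the same contract dbt tests or Great Expectations suites provide at larger scale.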
    $121.1k-151.4k yearly 60d+ ago
  • Data Engineer | Performance Management

    Swickard Auto Group

    Data engineer job in Las Vegas, NV

    Corporate Office | Enterprise Analytics | Turning Data into Decisions | Veterans encouraged to apply

This position will support the data analytics and reporting requests of our growing Performance Management team within Swickard Auto Group. The Performance Management team is tasked with delivering insights and analytics that help drive bottom-line impact and improve the customer experience across the entire organization. The insights generated through the data work this position produces and maintains will be critical to corporate- and retail-level decision-making. The ideal candidate will have advanced data engineering skills, an analytical mindset, great attention to detail, the ability to manage projects independently, and the ability to craft data solutions that are both expedient and scalable. The candidate must be comfortable working on a small team with limited structure and have an entrepreneurial mindset.

**Duties and Responsibilities:**
+ Create and maintain relevant data architecture documentation.
+ Build automated data pipelines using Azure Data Factory, Azure Functions, and Python.
+ Develop and manage Data Vault 2.0 models (Hubs, Links, Satellites) across all dealership domains.
+ Integrate data from multiple systems (APIs, SFTP, webhooks, ERPs, DMS systems) into the enterprise data platform.
+ Create and maintain cloud-based data architecture across bronze, silver, and gold layers.
+ Optimize SQL Server data models and ETL performance, including indexing, stored procedures, and pipeline efficiency.
+ Develop production-quality Python ETL modules for ingestion, cleaning, validation, and transformation.
+ Implement DevOps practices with Git, CI/CD, and YAML-based deployment pipelines.

**Qualifications:**
+ 4+ years of related experience in a data engineering role.
+ Bachelor's degree (accredited school or equivalent technical bootcamp) with an emphasis in data engineering, data analytics, mathematics, or statistics.
+ Strong proficiency in SQL and Python required; Node.js experience a plus.
+ Strong experience with Data Vault modeling and Azure Data Factory; experience with Azure Function Apps and cloud-based ETL preferred.
+ Strong proficiency in API infrastructure; experience developing and maintaining API, webhook, and SFTP integrations in Python.
+ Proficiency in cloud data architecture (Azure strongly preferred).
+ Experience with YAML-based DevOps pipeline management.
+ Experience designing enterprise-grade database architecture.
+ Strong attention to detail and a high degree of accuracy.
+ Ability to communicate well, both verbally and in writing, with all levels of the organization.
+ Strong analytical skills.
+ Solid understanding of data sources, data organization, and storage.

The ideal candidate will also have:
+ Experience automating processes and data analysis using Python.
+ Master's degree in data analytics or a similar field.

**What You'll Receive:**
+ Aggressive salary based on experience.
+ Medical, dental, and vision insurance.
+ Paid time off and holidays.
+ 401(k) plan.
+ Career growth opportunities within a fast-scaling organization.
+ A seat at the table: your work will be used by senior leadership.

**Why Swickard:**
+ Data is central to how we run the business, not an afterthought.
+ Leaders who value clarity, accountability, and follow-through.
+ Opportunity to build systems that scale across brands, states, and teams.
+ A culture that pairs high standards with genuine respect for people.

**About Us:** We were founded in 2014 by Jeff Swickard in Wilsonville, OR. We're a hospitality company that happens to sell cars, parts, and service. We are a team; everyone plays a role in our success.

Culture: We want to be our customers' favorite place to purchase, lease, or service their vehicle, and we want to be your favorite place to work!

Highline Brands: Swickard has positioned itself as a leader in highline brands such as Mercedes-Benz, BMW, Volvo, Porsche, Lexus, Audi, Land Rover, and more. We are consistently ranked as one of the fastest-growing dealership groups in the US by Automotive News. With 40+ rooftops, 20+ brands, and thousands of employees, we rely on disciplined systems and transparent performance management to deliver consistent results for our guests and our teams. Our question is simple: How can we do this better? Data helps us answer it, every day.
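The Data Vault 2.0 models this role manages (Hubs, Links, Satellites) are conventionally keyed by deterministic hashes of business keys, so the same entity always maps to the same surrogate key. A minimal sketch of that convention; the `hash_key` helper, the VIN, and the store code are hypothetical examples:

```python
import hashlib

def hash_key(*business_keys):
    """Deterministic hub/link hash key (a common Data Vault 2.0 convention).
    Keys are trimmed, upper-cased, and joined with a delimiter before
    hashing so the same entity always yields the same key."""
    normalized = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# A hub row for a vehicle, keyed on its VIN:
hub_vehicle = {"vehicle_hk": hash_key("1FTEW1E50LFA12345"),
               "vin": "1FTEW1E50LFA12345"}

# A link row tying the vehicle to a dealership (multi-part business key):
link_sale = {"sale_hk": hash_key("1FTEW1E50LFA12345", "STORE-042")}
```

Production Data Vault implementations often prefer SHA-256 over MD5 and record the normalization rules in the model documentation, since any change to them changes every key.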
    $81k-115k yearly est. 36d ago
  • Data Engineer 3

    BHE

    Data engineer job in Las Vegas, NV

    **Basic Purpose**

Performs design, implementation, and maintenance tasks associated with multi-user, multi-platform database technologies. Recommends improvements to client accessibility of corporate information. Provides support to IT&T staff and clients in delivering solutions to technical database-related problems, changes, or projects. Recommends, establishes, and coordinates database standards. Analyzes business processes and develops the database to best support the Enterprise's needs. Optimizes databases for efficient operation. Coordinates data updates from internal and external sources. Researches and recommends technology-based solutions to business challenges.

**Essential Education, Skills, and Environment**

Education and Work Experience: Bachelor's degree from an accredited school in computer science, information technology, or a related field, plus five years of related experience in database administration, systems analysis, or data engineering. Candidates who do not possess a bachelor's degree must have a minimum of 9 years of experience in database administration, systems analysis, or data engineering.

Specialized Knowledge and Skills:
+ Demonstrated knowledge of backup and recovery, disaster recovery, high availability, security, infrastructure, performance tuning, lifecycle, data management, and data movement.
+ Effective oral and written communication skills, fostering an atmosphere of teamwork, high productivity, and accuracy.
+ Consistently delivers requirements and meets timelines.
+ Advanced analytical and problem-solving skills.
+ Ability to prioritize and handle multiple tasks and projects concurrently.

Equipment and Applications: PCs; word processing, spreadsheet, and database software.

Work Environment and Physical Demands: General office environment. No special physical demands required.

**Essential Duties and Responsibilities**
+ Assists IT&T staff and clients with advanced database technical issues.
+ Recommends and implements methods to maximize database performance.
+ Participates in advanced installation, maintenance, administration, application, and product troubleshooting.
+ Monitors and enforces all compliance requirements for the area of responsibility.
+ Develops and coordinates the implementation of standards, in-house education, database backup/recovery procedures, and language interfaces.
+ Assists clients in accessing corporate information and recommends improvements to that accessibility.
+ Provides support to IT&T staff and clients in delivering solutions to technical database-related problems, changes, or projects.
+ Recommends, establishes, and coordinates database standards.
+ Optimizes databases for efficient operation.
+ Analyzes business processes; develops the database to best support the Enterprise's needs.
+ Coordinates data updates from internal and external sources.
+ Researches and recommends technology-based solutions to business challenges.
+ Participates in large and intermediate project efforts in a lead role.
+ Performs all functions of a Data Base Analyst II and provides technical direction to less experienced analysts.
+ Ensures all compliance aspects of the position are known and followed; understands and complies with all policies, codes, and regulations applicable to the position and company.
+ Performs related duties as assigned.
    $81k-115k yearly est. 1d ago
  • Data Engineer

    Tekgence

    Data engineer job in Las Vegas, NV

    Role: Data Engineer
Project Type: Long-Term Contract
Client: Atos Syntel

Job Description:
+ Must have hands-on experience and knowledge of the Big Data ecosystem.
+ Must have hands-on experience with Apache PySpark.
+ Must have hands-on experience in Java (specifically Spring Boot web services).
+ Proficient in writing Python code.
+ Good SQL knowledge for Hive queries.
+ Docker & Kubernetes: good to have, not a must.
    $81k-115k yearly est. 60d+ ago
  • Data Analytics Engineer

    VBG (Veteran Benefits Guide)

    Data engineer job in Las Vegas, NV

    What is VBG: Veteran Benefits Guide has been proud to serve our nation's service members for more than 10 years. Founded by a U.S. Marine Corps Veteran, VBG assists Veterans through the challenging VA claims process to efficiently secure their hard-earned benefits. Now operating with more than 225 team members nationwide, VBG has helped over 55,000 Veterans through the VA claims process. The company is dedicated to honoring service and supporting the Veteran community through ongoing advocacy, community partnerships, and meaningful opportunities within its workforce.

Who we're looking for: The Data Analytics Engineer is responsible for transforming raw and staged data into trusted, well-modeled, and analytics-ready datasets that empower reporting, dashboards, and data-driven decision-making across the organization. This role bridges the gap between engineering and analysis, ensuring data is clean, consistent, connected, and optimized for use by Analysts, BI Developers, and business teams. You will work closely with Data Engineers (who ingest data), BI Developers (who build dashboards), and Analysts (who generate insights) to build the semantic layer of the warehouse. You will own data modeling, cleansing, deduplication, and constructing unified datasets that bring together information from systems such as Salesforce, NetSuite, Google, and internal applications.

This position is open to candidates located in the following states: Arizona (AZ), California (CA), Washington (WA), Nevada (NV), Utah (UT), Illinois (IL), Ohio (OH), New Jersey (NJ), Virginia (VA), North Carolina (NC), and Florida (FL).

Essential Functions: Reasonable accommodation may be provided to enable individuals with disabilities to perform essential functions.

**Data Modeling & Transformation:**
+ Build, maintain, and optimize curated data models using SQL, dbt, or similar transformation tools.
+ Create dimensional models (fact/dimension) and semantic layers to support reporting and advanced analytics.
+ Construct unified datasets that bring together cross-system information (e.g., Salesforce, NetSuite, Google Ads).

**Data Quality & Reliability:**
+ Profile, validate, and cleanse data to eliminate duplicates, missing fields, and inconsistencies.
+ Implement automated data tests to ensure accuracy, completeness, and referential integrity.
+ Investigate and resolve issues flagged by Analysts when metrics do not match or data looks incorrect.

**Warehouse Optimization & Governance:**
+ Partner with DBAs and Data Engineers to ensure performant warehouse structures and optimized queries.
+ Adhere to and help define data governance, documentation standards, and semantic layer best practices.
+ Maintain version-controlled analytics codebases using Git or similar workflows.

**Collaboration & Stakeholder Support:**
+ Work closely with Analysts to understand their data needs and translate them into robust models.
+ Support BI Developers by providing clean, reliable datasets that power dashboards and reports.
+ Communicate issues, improvements, and data model changes clearly to technical and non-technical audiences.

**Success Measures (KPIs):**
+ Reduction in analyst time spent cleaning and prepping data (target: 40-60% reduction).
+ Decrease in recurring data mismatches or report inconsistencies.
+ Increased adoption of curated datasets by Analysts and BI Developers.
+ Faster turnaround time for new data model requests and enhancements.
+ High data quality scores and a reduction in manual remediation efforts.

**Requirements:**

Qualifications and competencies:
+ Advanced SQL skills (window functions, CTEs, performance tuning).
+ Experience with transformation frameworks (dbt strongly preferred).
+ Strong understanding of data warehousing concepts: star schema, snowflake schema, fact/dimension modeling.
+ Familiarity with cloud warehouses (Snowflake, BigQuery, Redshift, Synapse).
+ Ability to troubleshoot mismatched metrics, broken joins, or duplicated data.

Preferred skills:
+ Snowflake SnowPro Core Certification
+ Snowflake SnowPro Advanced Data Engineer Certification
+ dbt Analytics Engineer Certification
+ AWS Data Engineer - Associate Certification
+ AWS Solutions Architect - Associate Certification
+ Experience with Python or R for data validation or automation scripts.
+ Knowledge of BI tools (Power BI, Tableau, Looker) and how they interact with semantic layers.
+ Familiarity with CI/CD for analytics code and version control (Git).
+ Exposure to data governance, cataloging, and documentation tools.

Education and previous work experience:
+ Bachelor's degree in Data Analytics, Computer Science, Information Systems, or a related field.
+ 3-5+ years of experience in analytics engineering, BI development, or data modeling.

EEO: Veteran Benefits Guide provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, national origin, ancestry, physical disability, mental disability, medical condition, marital status, sex (including pregnancy, childbirth, breastfeeding, or related medical conditions), gender (including gender identity and gender expression), genetic characteristics, sexual orientation, registered domestic partner status, age, military or veteran status, hairstyle or hair texture, reproductive health decision-making, or any other characteristic protected by federal, state, or local laws.
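The deduplication work this role owns is often expressed in warehouse SQL as `ROW_NUMBER() OVER (PARTITION BY key ORDER BY updated_at DESC) = 1`. Here is a small Python sketch of the same idea; the field names and the `latest_per_key` helper are illustrative, not from the posting:

```python
def latest_per_key(rows, key="account_id", order_by="updated_at"):
    """Keep only the most recent record per key: a Python analogue of
    SQL's ROW_NUMBER() OVER (PARTITION BY key ORDER BY order_by DESC) = 1."""
    best = {}
    for row in rows:
        k = row[key]
        # Replace the kept row whenever a newer version of the key arrives.
        if k not in best or row[order_by] > best[k][order_by]:
            best[k] = row
    return list(best.values())
```

ISO-8601 timestamp strings sort lexicographically, so string comparison on `updated_at` works without parsing; with mixed formats you would parse to `datetime` first.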
    $81k-115k yearly est. 10d ago
  • Sr Data Engineer

    Fusion HCR

    Data engineer job in Las Vegas, NV

    Senior Data Engineer

Work Arrangement: 100% Onsite, 5 days per week (Required)
Employment Type: Direct Hire
Industry: Property Management / Real Estate Technology
Client: Fusion client, a large, national property management organization

**Position Summary**

A Fusion client in the property management space is seeking a Senior Data Engineer to play a key role in building, scaling, and optimizing a modern Databricks-first data platform. This role is heavily hands-on and focused on designing and improving Spark-based data pipelines using Databricks, Python, and SQL in a cloud environment. The Senior Data Engineer will lead large-scale data initiatives, including onboarding new data sources into the data warehouse, building real-time and batch data pipelines, and significantly improving pipeline performance and reliability. This role partners closely with engineering, analytics, and business teams to deliver scalable data solutions that support analytics, BI, and future AI/ML use cases.

**Critical Skill Priorities (In Order of Importance):**
+ Hands-on Databricks experience (Required)
+ Strong Python scripting and SQL (daily, hands-on use)
+ Apache Spark for cloud data loading and transformation
+ Large-scale data initiatives (new source ingestion, platform expansion)
+ Real-time and streaming data pipelines
+ Pipeline performance tuning and optimization

**Key Responsibilities:**
+ Design, build, and maintain real-time and batch data pipelines using Databricks and Spark
+ Develop Python- and Spark-based processes for cloud data ingestion, transformation, and loading
+ Lead large data initiatives such as bringing new internal and external data sources into the data warehouse, supporting streaming and near-real-time data use cases, and improving pipeline speed, scalability, and reliability
+ Design and evolve data architecture supporting analytics, BI, and future AI/ML initiatives
+ Collaborate with cross-functional teams to translate business requirements into scalable data solutions
+ Monitor pipeline health, troubleshoot data issues, and improve system performance
+ Participate in code reviews and promote best practices around testing, CI/CD, and maintainable data pipelines
+ Contribute to the design and development of data products and data services consumed across the organization

**Required Qualifications:**
+ Bachelor's degree required in Computer Science, Data Science, Engineering, Information Systems, Mathematics, Statistics, or a related field; candidates with fewer years of experience may be considered only if degree requirements are met
+ 5+ years of hands-on experience as a Data Engineer
+ Strong, hands-on experience with Databricks and Apache Spark
+ Strong proficiency in Python scripting for data processing and pipeline development
+ Advanced SQL skills for analytics, transformations, and troubleshooting
+ Experience building and supporting cloud-based data pipelines
+ Experience working with large-scale data platforms and warehouses
+ Strong troubleshooting and problem-solving skills

**Preferred / Nice-to-Have Qualifications:**
+ Experience with Snowflake (may be considered in place of some Databricks experience)
+ Experience with streaming technologies (Spark Streaming, Kafka, Event Hub, etc.)
+ Experience optimizing and tuning data pipelines for performance and scalability
+ Experience with CI/CD practices in data engineering environments
+ Familiarity with BI tools such as Power BI or Tableau
+ Experience working in Agile development environments
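The pipeline reliability work this role emphasizes often starts with simple patterns such as retrying transient failures with exponential backoff before paging anyone. A minimal sketch; the `with_retries` wrapper is illustrative, not the client's code:

```python
import time

def with_retries(task, attempts=3, base_delay=0.01):
    """Run a pipeline step, retrying transient failures with exponential
    backoff; the final failure is re-raised for the orchestrator to handle."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == attempts:
                raise  # Out of retries: surface the error.
            # Back off 1x, 2x, 4x, ... the base delay between attempts.
            time.sleep(base_delay * 2 ** (attempt - 1))
```

Orchestrators like Airflow and Databricks Workflows provide this behavior as task-level `retries`/retry-delay settings; an in-code wrapper like this is mainly useful for finer-grained steps inside a task.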
    $81k-115k yearly est. 36d ago
  • Sr. Data Engineer

    Slickdeals 4.1 company rating

    Data engineer job in Las Vegas, NV

    We believe shopping should feel like winning. That's why 10 million people come to Slickdeals to swap tips, upvote the best finds, and share the thrill of a great deal. Together, our community has saved more than $10 billion over the past 26 years. We're profitable, passionate, and in the middle of an exciting evolution: transforming from the internet's most trusted deal forum into the go-to daily shopping destination. If you thrive in a fast-moving, creative environment where ideas turn into impact fast, you'll fit right in.

**The Purpose:** We're seeking a seasoned Senior Data Engineer to join our high-impact team at Slickdeals. This role will inherit and evolve a mature data ecosystem built over 3+ years, spanning Databricks, dbt, Airflow, AWS, Tableau, and AtScale. You'll be responsible for maintaining and modernizing core pipelines, enabling analytics and reporting, and supporting cost-conscious, scalable data infrastructure.

**What You'll Do:**
+ Own and maintain ETL/ELT pipelines using dbt, Airflow, and Databricks
+ Develop and optimize data models in AtScale to support BI tools like Tableau
+ Collaborate with Analytics, Product, and Engineering teams to deliver reliable, timely data
+ Monitor and troubleshoot data workflows, ensuring high availability and performance
+ Support cloud infrastructure in AWS, including S3, Kafka, EC2, Lambda, and IAM policies
+ Contribute to cost optimization efforts across data storage, compute, and tooling
+ Document systems, processes, and tribal knowledge for continuity and onboarding
+ Participate in code reviews, architecture discussions, and team rituals

**What We're Looking For:**

Required Experience:
+ BS/BA/BE degree in a quantitative area such as mathematics, statistics, economics, computer science, engineering, or equivalent experience
+ 8+ years of experience in data engineering or analytics engineering
+ Strong proficiency in SQL, Python, and dbt
+ Hands-on experience with Databricks, Airflow, and AWS
+ Familiarity with semantic modeling tools
+ Experience building dashboards and supporting BI teams using Tableau
+ Understanding of data governance, security, and compliance best practices
+ Excellent communication and written documentation skills
+ Comfortable working in a fast-paced, collaborative environment
+ Always curious and a continuous learner

Preferred Experience:
+ Experience with cost monitoring tools or FinOps practices
+ Familiarity with vendor integrations and API-based data sharing
+ Exposure to AtScale, Tableau, or other modern data platforms
+ Passion for mentoring and knowledge sharing

With your application, kindly attach a cover letter that outlines your greatest achievement. Please share what you built, how you measured success, and your role in the result. Please note: We are unable to sponsor visas at this time. Candidates must be authorized to work in the U.S. without current or future visa sponsorship or transfer.

LOCATION: Las Vegas, NV. Hybrid schedule, visiting our Las Vegas office three days a week (Tues-Thurs).

**Slickdeals Compensation, Benefits, Perks:** The expected base pay for this role is between $122,000 and $150,000. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Exact compensation will be discussed during the interview process and tailored to the candidate's qualifications.
+ Competitive base salary and annual bonus
+ Competitive paid time off in addition to holiday time off
+ A variety of healthcare insurance plans to give you the best care for your needs
+ 401(k) matching above the industry standard
+ Professional Development Reimbursement Program

Work Authorization: Candidates must be eligible to work in the United States. Slickdeals is an Equal Opportunity Employer; employment is governed on the basis of merit, competence, and qualifications and will not be influenced in any manner by race, color, religion, gender (including pregnancy, childbirth, or related medical conditions), national origin/ethnicity, veteran status, disability status, age, sexual orientation, gender identity, marital status, mental or physical disability, or any other protected status. Slickdeals will consider qualified applicants with criminal histories consistent with "Ban the Box" legislation. We may access publicly available information as part of your application. Slickdeals participates in E-Verify. For more information, please refer to E-Verify Participation and Right to Work. Slickdeals does not accept unsolicited resumes from agencies and is not responsible for related fees.
    $122k-150k yearly 60d+ ago
  • Data Analytics Engineer

    VBG

    Data engineer job in Las Vegas, NV

    What is VBG: Veteran Benefits Guide has been proud to serve our nation's service members for more than 10 years. Founded by a U.S. Marine Corps Veteran, VBG assists Veterans through the challenging VA claims process to efficiently secure their hard-earned benefits. Now operating with more than 225 team members nationwide, VBG has helped over 55,000 Veterans through the VA claims process. The company is dedicated to honoring service and supporting the Veteran community through ongoing advocacy, community partnerships, and meaningful opportunities within its workforce.

    Who we're looking for: The Data Analytics Engineer is responsible for transforming raw and staged data into trusted, well-modeled, and analytics-ready datasets that empower reporting, dashboards, and data-driven decision-making across the organization. This role bridges the gap between engineering and analysis, ensuring data is clean, consistent, connected, and optimized for use by Analysts, BI Developers, and business teams. You will work closely with Data Engineers (who ingest data), BI Developers (who build dashboards), and Analysts (who generate insights) to build the semantic layer of the warehouse. You will own data modeling, cleansing, deduplication, and constructing unified datasets that bring together information from systems such as Salesforce, NetSuite, Google, and internal applications.

    This position is open to candidates located in the following states: Arizona (AZ), California (CA), Washington (WA), Nevada (NV), Utah (UT), Illinois (IL), Ohio (OH), New Jersey (NJ), Virginia (VA), North Carolina (NC), and Florida (FL).

    Essential Functions: Reasonable accommodation may be provided to enable individuals with disabilities to perform essential functions.

    Data Modeling & Transformation
    + Build, maintain, and optimize curated data models using SQL, dbt, or similar transformation tools.
    + Create dimensional models (fact/dimension) and semantic layers to support reporting and advanced analytics.
    + Construct unified datasets that bring together cross-system information (e.g., Salesforce, NetSuite, Google Ads).

    Data Quality & Reliability
    + Profile, validate, and cleanse data to eliminate duplicates, missing fields, and inconsistencies.
    + Implement automated data tests to ensure accuracy, completeness, and referential integrity.
    + Investigate and resolve issues flagged by Analysts when metrics do not match or data looks incorrect.

    Warehouse Optimization & Governance
    + Partner with DBAs and Data Engineers to ensure performant warehouse structures and optimized queries.
    + Adhere to and help define data governance, documentation standards, and semantic layer best practices.
    + Maintain version-controlled analytics codebases using Git or similar workflows.

    Collaboration & Stakeholder Support
    + Work closely with Analysts to understand their data needs and translate them into robust models.
    + Support BI Developers by providing clean, reliable datasets that power dashboards and reports.
    + Communicate issues, improvements, and data model changes clearly to technical and non-technical audiences.

    Success Measures (KPIs)
    + Reduction in analyst time spent cleaning and prepping data (target: 40-60% reduction).
    + Decrease in recurring data mismatches or report inconsistencies.
    + Increased adoption of curated datasets by Analysts and BI Developers.
    + Faster turnaround time for new data model requests and enhancements.
    + High data quality scores and reduction in manual remediation efforts.

    Requirements
    Qualifications or competencies:
    + Advanced SQL skills (window functions, CTEs, performance tuning).
    + Experience with transformation frameworks (dbt strongly preferred).
    + Strong understanding of data warehousing concepts: star schema, snowflake schema, fact/dimension modeling.
    + Familiarity with cloud warehouses (Snowflake, BigQuery, Redshift, Synapse).
    + Ability to troubleshoot mismatched metrics, broken joins, or duplicated data.
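The window-function deduplication skill this posting asks for has a standard pattern: rank rows per business key and keep the latest. A minimal sketch follows, run here against SQLite (3.25+) so it is self-contained; the table and column names are hypothetical.

```python
"""Sketch of one dedup pattern: keep the most recent record per
business key using ROW_NUMBER() in a CTE. Names are illustrative."""
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (email TEXT, name TEXT, loaded_at TEXT)")
con.executemany(
    "INSERT INTO contacts VALUES (?, ?, ?)",
    [
        ("a@x.com", "Ann (stale)", "2024-01-01"),
        ("a@x.com", "Ann", "2024-06-01"),
        ("b@x.com", "Bob", "2024-03-01"),
    ],
)

rows = con.execute("""
    WITH ranked AS (
        SELECT email, name,
               ROW_NUMBER() OVER (
                   PARTITION BY email ORDER BY loaded_at DESC
               ) AS rn
        FROM contacts
    )
    SELECT email, name FROM ranked WHERE rn = 1 ORDER BY email
""").fetchall()
print(rows)  # [('a@x.com', 'Ann'), ('b@x.com', 'Bob')]
```

The same CTE shape works unchanged on Snowflake, BigQuery, and Redshift, which is why it is a common interview and day-one task for analytics engineers.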
    Preferred Skills
    + Snowflake SnowPro Core Certification
    + Snowflake SnowPro Advanced Data Engineer Certification
    + dbt Analytics Engineer Certification
    + AWS Data Engineer - Associate Certification
    + AWS Solutions Architect - Associate Certification
    + Experience with Python or R for data validation or automation scripts.
    + Knowledge of BI tools (Power BI, Tableau, Looker) and how they interact with semantic layers.
    + Familiarity with CI/CD for analytics code and version control (Git).
    + Exposure to data governance, cataloging, and documentation tools.

    Education and previous work experience:
    + Bachelor's degree in Data Analytics, Computer Science, Information Systems, or related field.
    + 3-5+ years of experience in analytics engineering, BI development, or data modeling.

    EEO: Veteran Benefits Guide provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, national origin, ancestry, physical disability, mental disability, medical condition, marital status, sex (including pregnancy, childbirth, breastfeeding or related medical conditions), gender (including gender identity and gender expression), genetic characteristic, sexual orientation, registered domestic partner status, age, military or veteran status, hairstyle or hair texture, reproductive health decision making, or any other characteristic protected by federal, state, or local laws.

    Salary Description: $90,100.00
    $90.1k yearly 13d ago
  • Senior Data Science Engineer, Football

    DraftKings (4.0 company rating)

    Data engineer job in Las Vegas, NV

    At DraftKings, AI is becoming an integral part of both our present and future, powering how work gets done today, guiding smarter decisions, and sparking bold ideas. It's transforming how we enhance customer experiences, streamline operations, and unlock new possibilities. Our teams are energized by innovation and readily embrace emerging technology. We're not waiting for the future to arrive. We're shaping it, one bold step at a time. To those who see AI as a driver of progress, come build the future together.

    The Crown Is Yours

    Our Sports Modeling team comprises sports modeling experts and data science technologists, coming together to develop innovative products that deliver incremental value across our Sportsbook platform. As a Senior Data Scientist on the Sports Modeling team, you will develop models and data-driven solutions that enhance the Sportsbook experience for our users. In this role, you will work on implementing advanced sports models, refining data assets, and ensuring seamless integration into applications.

    What you'll do as a Senior Data Scientist, Football
    + Create statistical and machine learning models and integrate them into data science applications.
    + Collect and engineer sports data assets to assist in model development.
    + Implement the sports models and pricing engines in Python.
    + Create automated tests to ensure model and pricing engine accuracy.
    + Collaborate closely with Trading, Product, Engineering, and QA teams to move projects from ideation to deployment.
    + Test data flows and model integration in a larger business context.
    + Coach and support more junior data scientists within the team.

    What you'll bring
    + Demonstrated passion for sports and a strong understanding of relevant leagues and their dynamics.
    + A college degree in Statistics, Data Science, Mathematics, Computer Science, Engineering, or another related field.
    + Proficiency in Python, with experience building statistical or machine learning models across various sports.
    + Solid grasp of data science principles, statistical modeling techniques, and object-oriented programming concepts.
    + Familiarity with tools and practices such as Kubernetes, Kafka, version control, and MLOps principles.
    + Self-motivation and eagerness to expand knowledge and understanding of Sportsbook products and related technologies.

    Join Our Team

    We're a publicly traded (NASDAQ: DKNG) technology company headquartered in Boston. As a regulated gaming company, you may be required to obtain a gaming license issued by the appropriate state agency as a condition of employment. Don't worry, we'll guide you through the process if this is relevant to your role. The US base salary range for this full-time position is $120,800.00 - $151,000.00 USD, plus bonus, equity, and benefits as applicable. Our ranges are determined by role, level, and location. The compensation information displayed on each job posting reflects the range for new hire pay rates for the position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Your recruiter can share more about the specific pay range and how that was determined during the hiring process. It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
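The "statistical models and pricing engines in Python" work described in this posting often starts from something like an independent-Poisson scoring model. The sketch below is a toy illustration only, with made-up rates; real sportsbook pricing engines are far richer than this.

```python
"""Hedged toy example: pricing a home-win market from an
independent-Poisson scoring model. All rates are illustrative."""
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson random variable with mean lam."""
    return lam**k * exp(-lam) / factorial(k)

def win_probability(lam_home, lam_away, max_goals=15):
    """P(home total > away total), truncating the tail at max_goals."""
    p = 0.0
    for h in range(max_goals + 1):
        for a in range(max_goals + 1):
            if h > a:
                p += poisson_pmf(h, lam_home) * poisson_pmf(a, lam_away)
    return p

p_home = win_probability(2.1, 1.3)
print(round(p_home, 3))
```

A fair decimal price for this market would then be `1 / p_home` before margin; automated tests like the posting mentions typically assert symmetry and monotonicity properties of exactly this kind of function.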
    $85k-120k yearly est. 60d+ ago
  • Lead Building Engineer

    Lincoln Property Company (4.4 company rating)

    Data engineer job in Las Vegas, NV

    The Lead Building Engineer leads and oversees the day-to-day operations and maintenance of building systems, ensuring efficient and safe building performance while supervising engineering staff and contractors.

    Essential Duties and Responsibilities:
    + Monitor the operating condition of all HVAC, plumbing, and electrical equipment.
    + Coordinate preventive maintenance and emergency repairs for the building.
    + Oversee service contractors while ensuring compliance with building standards.
    + Supervise and mentor junior Engineers and Technicians; assist in setting staff performance goals and evaluating progress.
    + Collaborate with tenants, contractors, and property managers to ensure reliable building operations and resolve tenant issues.
    + As assigned, work with vendors to identify scopes of work and oversee compliance with contract terms and quality control.
    + Conduct regular building inspections and proactively address deficiencies.
    + Manage preventive maintenance schedules and ensure they are executed effectively.
    + Record equipment readings and ensure assigned building systems are operating according to standards.
    + Follow company safety protocols and wear appropriate PPE; ensure compliance with OSHA, NEC, NFC, and NFPA regulations; report safety hazards; participate in safety inspections; ensure all staff follow safety protocols and comply with building codes and regulations; oversee safety training and maintain compliance logs.
    + Communicate verbally and in writing with teammates, the leadership team, vendors, tenants, and client employees in a professional manner, keeping the appropriate customer(s) informed/updated as needed.
    + Perform other duties as assigned.
    Qualifications:
    + HS Diploma or GED required
    + Five to seven years of industry-related experience or commensurate certification/trade experience
    + At least one year of formal or informal supervisory, training, and/or mentoring experience
    + CFC Certification preferred, or willingness to obtain as requested; possess engineering and trade licenses required according to local, state, or national requirements
    + In-depth knowledge of Microsoft Office products, energy management software systems, CMMS, and other building operational platforms
    + Familiarity with fire/life safety equipment and procedures
    + Proven track record of delivering excellent internal and external customer service; ability to successfully interact and communicate with tenants, vendors, and contractors as well as teammates and the leadership team
    + Familiarity with blueprints and code requirements
    + Ability to work after hours, weekends, holidays, and during emergency situations as necessary to meet the needs of the client
    + Competent knowledge of the use and care of tools
    + Ability to read and write English in order to understand manuals and procedures, and to write reports
    + In-depth knowledge of building automation systems (BAS), HVAC, and electrical controls, with skill in troubleshooting complex mechanical and electrical systems

    Physical Requirements: Ability to stand, walk, climb ladders, and lift up to 50 pounds; perform physically demanding tasks such as stooping, crouching, and kneeling.

    To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

    This position is 100% in-office. The role requires working in the office during standard business hours. Remote work or telecommuting is not an option for this position.
    #IND123

    Pay Range: $65,000-$85,000 USD

    About Lincoln Property Company
    Lincoln Property Company (“Lincoln”) is one of the largest private real estate firms in the United States. Offering a fully integrated platform of real estate services and innovative solutions to owners, investors, lenders and occupiers, Lincoln supports the entire real estate lifecycle across asset types, including office, multifamily, life science, retail, industrial, data center, production studio, healthcare, government, universities, and mixed-use properties, throughout the United States, United Kingdom, and Europe. Lincoln's combined management and leasing portfolio on behalf of institutional clients includes more than 680 million square feet of commercial space. For more information, visit: ************

    All job offers are contingent on completion of a background check and proof of eligibility to work in the United States. By submitting your information or resume in response to this opportunity, you acknowledge that your personal information will be handled in accordance with Lincoln Property Company's privacy policy. Lincoln Property Company does not accept unsolicited resumes from third-party recruiters unless they were contractually engaged by Lincoln Property Company to provide candidates for a specified opening. Any such employment agency, person or entity that submits an unsolicited resume does so with the acknowledgement and agreement that Lincoln Property Company will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person or entity. At this time, we are not working with any agencies.
    $65k-85k yearly 1d ago
  • Lead Building Engineer

    Lincoln x Gatski

    Data engineer job in Las Vegas, NV

    Job Description

    The Lead Building Engineer leads and oversees the day-to-day operations and maintenance of building systems, ensuring efficient and safe building performance while supervising engineering staff and contractors.

    Essential Duties and Responsibilities:
    + Monitor the operating condition of all HVAC, plumbing, and electrical equipment.
    + Coordinate preventive maintenance and emergency repairs for the building.
    + Oversee service contractors while ensuring compliance with building standards.
    + Supervise and mentor junior Engineers and Technicians; assist in setting staff performance goals and evaluating progress.
    + Collaborate with tenants, contractors, and property managers to ensure reliable building operations and resolve tenant issues.
    + As assigned, work with vendors to identify scopes of work and oversee compliance with contract terms and quality control.
    + Conduct regular building inspections and proactively address deficiencies.
    + Manage preventive maintenance schedules and ensure they are executed effectively.
    + Record equipment readings and ensure assigned building systems are operating according to standards.
    + Follow company safety protocols and wear appropriate PPE; ensure compliance with OSHA, NEC, NFC, and NFPA regulations; report safety hazards; participate in safety inspections; ensure all staff follow safety protocols and comply with building codes and regulations; oversee safety training and maintain compliance logs.
    + Communicate verbally and in writing with teammates, the leadership team, vendors, tenants, and client employees in a professional manner, keeping the appropriate customer(s) informed/updated as needed.
    + Perform other duties as assigned.
    Qualifications:
    + HS Diploma or GED required
    + Five to seven years of industry-related experience or commensurate certification/trade experience
    + At least one year of formal or informal supervisory, training, and/or mentoring experience
    + CFC Certification preferred, or willingness to obtain as requested; possess engineering and trade licenses required according to local, state, or national requirements
    + In-depth knowledge of Microsoft Office products, energy management software systems, CMMS, and other building operational platforms
    + Familiarity with fire/life safety equipment and procedures
    + Proven track record of delivering excellent internal and external customer service; ability to successfully interact and communicate with tenants, vendors, and contractors as well as teammates and the leadership team
    + Familiarity with blueprints and code requirements
    + Ability to work after hours, weekends, holidays, and during emergency situations as necessary to meet the needs of the client
    + Competent knowledge of the use and care of tools
    + Ability to read and write English in order to understand manuals and procedures, and to write reports
    + In-depth knowledge of building automation systems (BAS), HVAC, and electrical controls, with skill in troubleshooting complex mechanical and electrical systems

    Physical Requirements: Ability to stand, walk, climb ladders, and lift up to 50 pounds; perform physically demanding tasks such as stooping, crouching, and kneeling.

    To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

    This position is 100% in-office. The role requires working in the office during standard business hours. Remote work or telecommuting is not an option for this position.
    #IND123

    Pay Range: $65,000-$85,000 USD

    Gatski Commercial & Lincoln Property Company Unite
    Gatski Commercial has officially joined forces with Lincoln Property Company, a nationally recognized leader in commercial real estate and property management: ******************** This partnership expands the combined property management and leasing portfolio in Southern Nevada to over 20 million SF. With Lincoln's extensive resources and expertise, our collaboration will continue to deliver Gatski's expert services across various sectors including office, retail, industrial and multifamily properties. Most importantly, Lincoln shares Gatski's core commitment to exceptional service, making this a natural and exciting evolution for our organization. This union not only signifies Gatski's commitment to growth but also reinforces Lincoln's foothold in key markets by utilizing Gatski's local knowledge and strong relationships. Together, we will set new benchmarks for the evolving Las Vegas market. Today a new era begins for Lincoln Gatski Las Vegas!

    Read more here: Lincoln Property Company and Gatski Commercial Unite to Expand Operations in Las Vegas; Las Vegas real estate firm Gatski acquired by Lincoln Property of Dallas | Business

    Lincoln x Gatski does not accept unsolicited resumes from third-party recruiters unless they were contractually engaged by Lincoln x Gatski to provide candidates for a specified opening. Any such employment agency, person or entity that submits an unsolicited resume does so with the acknowledgement and agreement that Lincoln x Gatski will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person or entity. At this time, we are not working with any agencies.
    $65k-85k yearly 5d ago
  • Google Cloud Data & AI Engineer

    Slalom (4.6 company rating)

    Data engineer job in Las Vegas, NV

    Who You'll Work With

    As a modern technology company, our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant, or Principal at Slalom, you will be a part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients.

    What You'll Do
    * Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, and more.
    * Implement cloud-based data solutions for data ingestion, transformation, and storage, and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs.
    * Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem.
    * Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps).
    * Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud.
    * Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients.
    * Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices.

    What You'll Bring
    * Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.).
    * Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance.
    * Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML.
    * Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment.
    * Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts.
    * Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects.
    * Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously.
    * Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
    * Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe.

    About Us

    Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life.
    We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

    Compensation and Benefits

    Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.

    Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time.

    East Bay, San Francisco, Silicon Valley:
    * Consultant: $114,000-$171,000
    * Senior Consultant: $131,000-$196,500
    * Principal: $145,000-$217,500

    San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC:
    * Consultant: $105,000-$157,500
    * Senior Consultant: $120,000-$180,000
    * Principal: $133,000-$199,500

    All other locations:
    * Consultant: $96,000-$144,000
    * Senior Consultant: $110,000-$165,000
    * Principal: $122,000-$183,000

    EEO and Accommodations

    Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace.
    All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We are accepting applications until 12/31. #LI-FB1
    $145k-217.5k yearly 37d ago

Learn more about data engineer jobs

How much does a data engineer earn in North Las Vegas, NV?

The average data engineer in North Las Vegas, NV earns between $69,000 and $134,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in North Las Vegas, NV

$96,000