Data Engineer jobs at The Walt Disney Company - 1324 jobs

  • Lead Software Engineer - AI Infrastructure & Tooling

    The Walt Disney Company (4.6 company rating)

    Data engineer job at The Walt Disney Company

    Technology is at the heart of Disney's past, present, and future. Disney Entertainment and ESPN Product & Technology is a global organization of engineers, product developers, designers, technologists, data scientists, and more - all working to build and advance the technological backbone for Disney's media business globally. The DEEP&T team marries technology with creativity to build world-class products, enhance storytelling, and drive velocity, innovation, and scalability for our businesses. We are Storytellers and Innovators. Creators and Builders. Entertainers and Engineers. We work with every part of The Walt Disney Company's media portfolio to advance the technological foundation and consumer media touch points serving millions of people around the world.

    Here are a few reasons why we think you'd love working here:
    1. Building the future of Disney's media: Our Technologists are designing and building the products and platforms that will power our media, advertising, and distribution businesses for years to come.
    2. Reach, Scale & Impact: More than ever, Disney's technology and products serve as a signature doorway for fans' connections with the company's brands and stories. Disney+. Hulu. ESPN. ABC. ABC News…and many more. These products and brands - and the unmatched stories, storytellers, and events they carry - matter to millions of people globally.
    3. Innovation: We develop and implement groundbreaking products and techniques that shape industry norms and solve complex and distinctive technical problems.

    Product Engineering is a unified team responsible for the engineering of Disney Entertainment & ESPN digital and streaming products and platforms. This includes product and media engineering, quality assurance, as well as engineering behind personalization, commerce, lifecycle, and identity. We are committed to a diverse and inclusive workplace. The Walt Disney Company is an equal opportunity employer and does not discriminate based on race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.

    Job Summary: As Lead Software Engineer within DEEP&T, you will touch the lives of Disney fans around the world, working on innovative digital products and platforms from Disney, ESPN, National Geographic, ABC, Marvel, Star Wars and more. You will be a leader on the Quality Engineering team, whose primary goal is continually improving the quality of the products that Disney is known for while decreasing time-to-market and enabling innovation. This role makes you a stakeholder in every digital property developed and released by the DEEP&T group, as well as the platforms and enterprise applications that power them. As an experienced software engineering leader who has a passion for leading engineers that drive quality and productivity into world-class applications and services, you will lead a small group of talented and driven software engineers dedicated to building test automation, tools, reports, and services that enable product delivery teams to deliver all Disney digital applications with the quality and reliability expected by our guests.
    Responsibilities:
    * Owning and contributing to the architecture, development and maintenance of AI/ML frameworks and models that support test automation and tooling across multiple products
    * Providing strategic input, defining and contributing towards Objectives and Key Results, and partnering within and across the organization to achieve them
    * Leading and mentoring junior developers, providing meaningful insight, code reviews and technical direction across the team
    * Developing and supporting high quality code to build and enhance test automation, quality infrastructure, and innovative proof-of-concepts
    * Leading communication for projects and overall team status within and across the organization
    * Keeping up with industry trends and new publicly available technologies, coming up with innovative ideas and solutions for complex technical problems
    * Serving as an advanced resource for other engineers on the team, training and helping others to contribute meaningful and impactful features and capabilities
    * Being an active member of the larger Quality Engineering team, helping increase productivity and impact through innovation, curiosity, and thoughtful debate

    Basic Qualifications:
    * 7 years of relevant software development experience, including designing, deploying and integrating AI/ML models into existing systems
    * Bachelor's degree or the foreign equivalent in Computer Science or a closely related field or equivalent years of professional experience
    * Progressive experience in a software development (SDET or SE) occupation, including developing unit tests and/or automated testing of front end and backend services
    * Proficient in object-oriented design and expertise with Python, Java and/or Node/JavaScript
    * Experience leading teams of software engineers and/or SDETs
    * Experience working with high-performing teams using Agile and Lean methodologies
    * Experience in modern design patterns and techniques, deriving and gathering quality KPI's to give insight into product's health and progress

    Preferred Qualifications:
    * Experience shipping production Python, Flask, Django, React or Node.js applications
    * Experience training custom AI/ML models, validating LLM/LVLM performance and accuracy
    * Experience with developing and deploying applications in cloud platforms (e.g. AWS) and optimizing cost efficiency
    * Experience with Kafka, Amazon SQS, SageMaker, and Kinesis
    * Experience with Docker, Kubernetes, Spinnaker, and continuous integration/delivery systems
    * Experience with testcase management (e.g. JIRA Xray), code management (e.g. git, SonarQube) and data visualization (e.g. Grafana, Data Dog) tools

    The hiring range for this position in Los Angeles, CA is $155,700.00 to $208,700.00 per year. The base pay actually offered will take into account internal equity and also may vary depending on the candidate's geographic region, job-related knowledge, skills, and experience among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.

    Job Posting Segment: PE - Sports, News & Entertainment, Enablement
    Job Posting Primary Business: PE - Sports, News & Entertainment, Enablement - News & Entertainment Engineering
    Primary Job Posting Category: Software Engineer
    Employment Type: Full time
    Primary City, State, Region, Postal Code: Remote Worker Location, USA
    Alternate City, State, Region, Postal Code: USA - CA - 2500 Broadway Street
    Date Posted: 2025-07-28
    $155.7k-208.7k yearly Auto-Apply 44d ago
  • GCP Data Engineer

    E-Solutions (4.5 company rating)

    Hartford, CT jobs

    Role: GCP Data Engineer. Must-have skills: 7+ years of experience with GCP, Python, PySpark, and SQL; GCP services including BigQuery, Dataproc, Pub/Sub, and Dataflow. The client conducts a CoderPad interview, so strong Python coding skills are required.
    $110k-154k yearly est. 1d ago
  • Software Engineer

    OYF (Own Your Future) Staffing (4.5 company rating)

    New York, NY jobs

    Founding Engineer
    We are looking for a Founding Engineer with 5+ years of experience to join our team and build the core operating system for private neurology practices. This person will be highly entrepreneurial, excited about working in a fast-paced startup environment, and passionate about making a significant impact on neurological care in the U.S.

    What will you be doing?
    * Design, build, and maintain scalable B2B workflows for private practices.
    * Architect the technical foundation, deciding between in-house builds and integrating best-in-class tools.
    * Develop and maintain data infrastructure to generate actionable insights from clinical and operational data.
    * Work directly with customers onsite to iterate on the platform and implement improvements based on real-world feedback.
    * Define engineering standards, set the team culture, and help hire and mentor future engineers.

    Tech Stack: Node, Typescript, React, SQL, AWS

    Candidate Profile
    * Seniority: 5+ years of experience in backend software engineering, building B2B workflows, with Node/Typescript/React, SQL, and AWS.
    * Work Experience: Startup experience, ideally 0 to 1 builds. Longevity and promotions at previous companies. Experience at a successful, scaling tech company.
    * Hard Skills: Building B2B workflows from 0 to 1. Node/Typescript/React, SQL, and AWS experience.
    * Soft Skills: Highly entrepreneurial, eager to learn startup operations.
    * Miscellaneous: Work five days a week in New York City. Excited about the healthcare mission.

    About us
    At our client, we're on a mission to bring back the private practice - the way healthcare used to be. We believe that when physicians have full autonomy over how they treat their patients, they're able to deliver better, more effective care. Our vision is to empower neurologists to launch, own, and operate their own practices, and in doing so, radically transform the future of neurological care in the US.
    Industry: Healthcare, Software Development
    Office Locations: New York City, New York

    About the team
    * Small, founder-led, high-ownership team: Two co-founders work extremely closely with early customers and expect engineers to own problems end to end with very little hand-holding. Mistakes are owned as a team, not individually.
    * Intense but transparent startup culture: In office five days a week in NYC, fast-paced and demanding, with an expectation of hard work and startup hours. Focused, mission driven, and execution oriented.
    * Low-ego, high-agency environment: Direct communication, no politics, no individual hero culture. People are expected to raise their hand when blocked, collaborate deeply, and care about the mission of improving patient outcomes.

    Benefits
    Fully paid health, dental and vision insurance, generous matching contributions to employee FSA/HSA, 401(k) with matching contributions, a two-week company-wide winter break, plus additional paid time off.
    $94k-131k yearly est. 3d ago
  • Principal, Data Engineer

    Gemini (4.9 company rating)

    New York jobs

    About the Company Gemini is a global crypto and Web3 platform founded by Cameron and Tyler Winklevoss in 2014, offering a wide range of simple, reliable, and secure crypto products and services to individuals and institutions in over 70 countries. Our mission is to unlock the next era of financial, creative, and personal freedom by providing trusted access to the decentralized future. We envision a world where crypto reshapes the global financial system, internet, and money to create greater choice, independence, and opportunity for all - bridging traditional finance with the emerging cryptoeconomy in a way that is more open, fair, and secure. As a publicly traded company, Gemini is poised to accelerate this vision with greater scale, reach, and impact. The Department: Data At Gemini, our Data Team is the engine that powers insight, innovation, and trust across the company. We bring together world-class data engineers, platform engineers, machine learning engineers, analytics engineers, and data scientists - all working in harmony to transform raw information into secure, reliable, and actionable intelligence. From building scalable pipelines and platforms, to enabling cutting-edge machine learning, to ensuring governance and cost efficiency, we deliver the foundation for smarter decisions and breakthrough products. We thrive at the intersection of crypto, technology, and finance, and we're united by a shared mission: to unlock the full potential of Gemini's data to drive growth, efficiency, and customer impact. The Role: Principal, Data Engineer The Data Engineering Team owns the ingestion and transformation of data from production databases, streams, and external data sources into our data warehouse. As a Principal Data Engineer, you will set the technical direction for how data is modeled, processed, and delivered across the organization. You will partner closely with product, analytics, ML, finance, operations, and engineering teams to move, transform, and model data reliably, with observability, resilience, and agility. You'll lead by example through design excellence, mentoring, and technical leadership, ensuring our data architecture is scalable, governed, and ready for the next generation of analytics and machine learning at Gemini. This is a senior individual contributor role - highly technical, strategic, and cross-functional - where you'll influence the design of data systems that underpin key decisions and customer-facing products across Gemini. This role is required to be in person twice a week at either our New York City, NY or San Francisco, CA office. 
    Responsibilities:
    * Define and drive the long-term vision for data architecture, modeling, and transformation at Gemini
    * Establish standards for data reliability, observability, and quality across all pipelines and data products using languages and frameworks such as Python, SQL, Spark, Flink, Beam, or equivalents
    * Partner with Staff and Senior Data Engineers, Platform Engineers, and Analytics Engineers to unify how data is produced, stored, and consumed
    * Lead large-scale design initiatives that span multiple teams and systems, ensuring maintainability, performance, and security
    * Partner with data scientists, ML engineers, analysts, and product teams to understand data requirements, define SLAs, and deliver coherent data products that others can self-serve
    * Establish data quality, validation, observability, and monitoring frameworks (data auditing, alerting, anomaly detection, data lineage)
    * Investigate and resolve complex production issues: root cause analysis, performance bottlenecks, data integrity, fault tolerance
    * Mentor and guide more junior and mid-level data engineers: lead code reviews, design reviews, and best-practice evangelism
    * Help recruit and onboard new talent, shaping the future of Gemini's data engineering discipline
    * Stay up to date on new tools, technologies, and patterns in the data and cloud space, bringing proposals and proof-of-concepts when appropriate
    * Document data flows, data dictionaries, architecture patterns, and operational runbooks

    Minimum Qualifications:
    * 10+ years of experience in data engineering (or similar) roles
    * Strong experience in ETL/ELT pipeline design, implementation, and optimization
    * Deep expertise in Python and SQL writing production-quality, maintainable, testable code
    * Experience with large-scale data warehouses (e.g. Databricks, BigQuery, Snowflake)
    * Solid grounding in software engineering fundamentals, data structures, and systems thinking
    * Hands-on experience in data modeling (dimensional modeling, normalization, schema design)
    * Experience building systems with real-time or streaming data (e.g. Kafka, Kinesis, Flink, Spark Streaming), and familiarity with CDC frameworks
    * Experience with orchestration / workflow frameworks (e.g. Airflow)
    * Familiarity with data governance, lineage, metadata, cataloging, and data quality practices
    * Strong cross-functional communication skills; ability to translate between technical and non-technical stakeholders
    * Proven experience in recruiting, mentoring, leading design discussions, and influencing data-engineering best practices across teams

    Preferred Qualifications:
    * Experience with crypto, financial services, trading, markets, or exchange systems
    * Experience with blockchain, crypto, Web3 data - e.g. blocks, transactions, contract calls, token transfers, UTXO/account models, on-chain indexing, chain APIs, etc.
    * Experience with infrastructure as code, containerization, and CI/CD pipelines
    * Hands-on experience managing and optimizing Databricks on AWS

    It Pays to Work Here
    The compensation & benefits package for this role includes:
    * Competitive starting salary
    * A discretionary annual bonus
    * Long-term incentive in the form of a new hire equity grant
    * Comprehensive health plans
    * 401K with company matching
    * Paid Parental Leave
    * Flexible time off

    Salary Range: The base salary range for this role is between $192,500 - $275,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package.
When determining a candidate's compensation, we consider a number of factors including skillset, experience, job scope, and current market data. In the United States, we offer a hybrid work approach at our hub offices, balancing the benefits of in-person collaboration with the flexibility of remote work. Expectations may vary by location and role, so candidates are encouraged to connect with their recruiter to learn more about the specific policy for the role. Employees who do not live near one of our hubs are part of our remote workforce. At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know. #LI-ES1
    $192.5k-275k yearly Auto-Apply 60d+ ago
  • Staff Data Engineer

    Gemini (4.9 company rating)

    Seattle, WA jobs

    About the Company Gemini is a global crypto and Web3 platform founded by Cameron and Tyler Winklevoss in 2014, offering a wide range of simple, reliable, and secure crypto products and services to individuals and institutions in over 70 countries. Our mission is to unlock the next era of financial, creative, and personal freedom by providing trusted access to the decentralized future. We envision a world where crypto reshapes the global financial system, internet, and money to create greater choice, independence, and opportunity for all - bridging traditional finance with the emerging cryptoeconomy in a way that is more open, fair, and secure. As a publicly traded company, Gemini is poised to accelerate this vision with greater scale, reach, and impact. The Department: Data At Gemini, our Data Team is the engine that powers insight, innovation, and trust across the company. We bring together world-class data engineers, platform engineers, machine learning engineers, analytics engineers, and data scientists - all working in harmony to transform raw information into secure, reliable, and actionable intelligence. From building scalable pipelines and platforms, to enabling cutting-edge machine learning, to ensuring governance and cost efficiency, we deliver the foundation for smarter decisions and breakthrough products. We thrive at the intersection of crypto, technology, and finance, and we're united by a shared mission: to unlock the full potential of Gemini's data to drive growth, efficiency, and customer impact. The Role: Staff Data Engineer The Data team is responsible for designing and operating the data infrastructure that powers insight, reporting, analytics, and machine learning across the business. As a Staff Data Engineer, you will lead architectural initiatives, mentor others, and build high-scale systems that impact the entire organization. You will partner closely with product, analytics, ML, finance, operations, and engineering teams to move, transform, and model data reliably, with observability, resilience, and agility. This role is required to be in person twice a week at either our San Francisco, CA or New York City, NY office. Responsibilities: * Lead the architecture, design, and implementation of data infrastructure and pipelines, spanning both batch and real-time / streaming workloads * Build and maintain scalable, efficient, and reliable ETL/ELT pipelines using languages and frameworks such as Python, SQL, Spark, Flink, Beam, or equivalents * Work on real-time or near-real-time data solutions (e.g. 
CDC, streaming, micro-batch) for use cases that require timely data * Partner with data scientists, ML engineers, analysts, and product teams to understand data requirements, define SLAs, and deliver coherent data products that others can self-serve * Establish data quality, validation, observability, and monitoring frameworks (data auditing, alerting, anomaly detection, data lineage) * Investigate and resolve complex production issues: root cause analysis, performance bottlenecks, data integrity, fault tolerance * Mentor and guide more junior and mid-level data engineers: lead code reviews, design reviews, and best-practice evangelism * Stay up to date on new tools, technologies, and patterns in the data and cloud space, bringing proposals and proof-of-concepts when appropriate * Document data flows, data dictionaries, architecture patterns, and operational runbooks Minimum Qualifications: * 8+ years of experience in data engineering (or similar) roles * Strong experience in ETL/ELT pipeline design, implementation, and optimization * Deep expertise in Python and SQL writing production-quality, maintainable, testable code * Experience with large-scale data warehouses (e.g. Databricks, BigQuery, Snowflake) * Solid grounding in software engineering fundamentals, data structures, and systems thinking * Hands-on experience in data modeling (dimensional modeling, normalization, schema design) * Experience building systems with real-time or streaming data (e.g. Kafka, Kinesis, Flink, Spark Streaming), and familiarity with CDC frameworks * Experience with orchestration / workflow frameworks (e.g. Airflow) * Familiarity with data governance, lineage, metadata, cataloging, and data quality practices * Strong cross-functional communication skills; ability to translate between technical and non-technical stakeholders * Proven experience in mentoring, leading design discussions, and influencing data-engineering best practices across teams Preferred Qualifications: * Experience with crypto, financial services, trading, markets, or exchange systems * Experience with blockchain, crypto, Web3 data - e.g. blocks, transactions, contract calls, token transfers, UTXO/account models, on-chain indexing, chain APIs, etc. * Experience with infrastructure as code, containerization, and CI/CD pipelines * Hands-on experience managing and optimizing Databricks on AWS It Pays to Work Here The compensation & benefits package for this role includes: * Competitive starting salary * A discretionary annual bonus * Long-term incentive in the form of a new hire equity grant * Comprehensive health plans * 401K with company matching * Paid Parental Leave * Flexible time off Salary Range: The base salary range for this role is between $168,000 - $240,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate's compensation, we consider a number of factors including skillset, experience, job scope, and current market data. In the United States, we offer a hybrid work approach at our hub offices, balancing the benefits of in-person collaboration with the flexibility of remote work. Expectations may vary by location and role, so candidates are encouraged to connect with their recruiter to learn more about the specific policy for the role. Employees who do not live near one of our hubs are part of our remote workforce. 
At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know. #LI-ES1
    $168k-240k yearly Auto-Apply 60d+ ago
  • Staff Data Platform Engineer

    Gemini (4.9 company rating)

    New York jobs

    About the Company Gemini is a global crypto and Web3 platform founded by Cameron and Tyler Winklevoss in 2014, offering a wide range of simple, reliable, and secure crypto products and services to individuals and institutions in over 70 countries. Our mission is to unlock the next era of financial, creative, and personal freedom by providing trusted access to the decentralized future. We envision a world where crypto reshapes the global financial system, internet, and money to create greater choice, independence, and opportunity for all - bridging traditional finance with the emerging cryptoeconomy in a way that is more open, fair, and secure. As a publicly traded company, Gemini is poised to accelerate this vision with greater scale, reach, and impact. The Department: Data At Gemini, our Data team is the engine that powers insight, innovation, and trust across the company. We bring together world-class data engineers, platform engineers, machine learning engineers, analytics engineers, and data scientists - all working in harmony to transform raw information into secure, reliable, and actionable intelligence. From building scalable pipelines and platforms, to enabling cutting-edge machine learning, to ensuring governance and cost efficiency, we deliver the foundation for smarter decisions and breakthrough products. We thrive at the intersection of crypto, technology, and finance, and we're united by a shared mission: to unlock the full potential of Gemini's data to drive growth, efficiency, and customer impact. The Role: Staff Data Platform Engineer The Data Platform Engineering team provides the foundation upon which all analytics, ML, and data-driven products are built. As a Staff Data Platform Engineer, you will own and evolve our data warehouse infrastructure (Databricks, AWS resources, storage layers, orchestration tools, and security). You will lead efforts in cost optimization, governance, access management, and tooling - ensuring that our data platform is scalable, reliable, secure, and cost-efficient. This is a senior IC role where you will influence architectural direction across the data org, mentor other engineers, and drive best practices in platform engineering and governance. This role is required to be in person twice a week at either our New York City, NY or San Francisco, CA office. 
    Responsibilities:
    * Own the design, build, and operation of data infrastructure, including Databricks & AWS
    * Define standards for storage, compute, orchestration, and metadata management to support analytics, data engineering, and ML use cases
    * Lead architecture and design for cross-team data platform initiatives, ensuring scalability, resiliency, and performance
    * Monitor & forecast data platform infrastructure and costs, proactively identifying opportunities for workload optimizations
    * Partner with security, compliance, and legal teams to meet regulatory requirements and enforce data governance policies
    * Implement and manage fine-grained access management, role-based access control (RBAC), attribute-based access control (ABAC), and row/column-level security
    * Drive adoption of metadata management, catalogs, and lineage tooling
    * Build reusable tooling and frameworks for data engineers and analysts (e.g., templated pipelines, schema migration tools, observability dashboards)
    * Partner with SREs / DevOps to harden environments and implement disaster recovery
    * Serve as a technical leader and subject matter expert across the data org, influencing standards and best practices
    * Mentor senior data platform engineers, conduct design and code reviews, and guide cross-functional platform initiatives

    Minimum Qualifications:
    * 8+ years of experience in data engineering, platform engineering, or infrastructure engineering, with at least 3+ years focused on platform-level responsibilities
    * Strong hands-on experience with Databricks (clusters, jobs, Delta Lake, Unity Catalog) and AWS data services (S3, Glue, EMR, Athena, IAM)
    * Deep knowledge of data warehouse and lakehouse architectures (Airflow, ETL, Delta, Parquet)
    * Strong programming experience with Python, SQL, and infrastructure-as-code frameworks
    * Proven track record in cost optimization, spend visibility, and scaling cloud-based data workloads
    * Experience with data governance, security, and access management (RBAC/ABAC, encryption, auditing)
    * Strong understanding of reliability, monitoring, and observability for distributed data systems
    * Demonstrated ability to influence cross-team architecture and mentor other engineers

    Preferred Qualifications:
    * Familiarity with blockchain / crypto / Web3 data (e.g., running indexing infrastructure, managing chain APIs, or operating large-scale node infrastructure)
    * Experience with enterprise data catalog and governance platforms
    * Experience with advanced access control frameworks (row-level, column-level, tokenized access)
    * Exposure to ML platform components (feature stores, model registries) or AI/LLM workloads (vector databases, embedding pipelines)
    * Knowledge of networking, VPC design, and data replication patterns
    * Experience managing compliance-heavy data environments (SOX, HIPAA, PCI DSS, GDPR, etc.)
    * Strong performance tuning skills at both infrastructure and query level

    It Pays to Work Here
    The compensation & benefits package for this role includes:
    * Competitive starting salary
    * A discretionary annual bonus
    * Long-term incentive in the form of a new hire equity grant
    * Comprehensive health plans
    * 401K with company matching
    * Paid Parental Leave
    * Flexible time off

    Salary Range: The base salary range for this role is between $168,000 - $240,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate's compensation, we consider a number of factors including skillset, experience, job scope, and current market data.
In the United States, we offer a hybrid work approach at our hub offices, balancing the benefits of in-person collaboration with the flexibility of remote work. Expectations may vary by location and role, so candidates are encouraged to connect with their recruiter to learn more about the specific policy for the role. Employees who do not live near one of our hubs are part of our remote workforce. At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know. #LI-PA1
    $168k-240k yearly Auto-Apply 60d+ ago
  • Senior Data Engineer

    Rain (3.7 company rating)

    New York, NY jobs

    About the Company At Rain, we're rebuilding the global financial pipes money flows through. Our infrastructure makes stablecoins usable in the real world by powering credit card transactions, cross-border payments, B2B purchases, remittances, and more. We partner with fintechs, neobanks, and institutions to help them launch solutions that are global, inclusive, and efficient. If you're curious, bold, and excited to help shape a borderless financial system, we'd love to talk. Our Ethos Operating at the epicenter of stablecoin innovation means moving fast and thinking globally. Our team reflects the diverse, international audiences we serve. We hire people who stay agile as the tide ebbs and flows, fix what's broken without waiting, chase trends before they peak, and remember to have fun through it all. About the Role We're looking for Rain's first dedicated Data Engineer-a hands-on builder who will architect the ingestion, pipelines, and infrastructure that power our data ecosystem. As Rain scales to millions of end users across payments, card programs, and blockchain rails, your systems will ensure every team has access to timely, accurate, and trustworthy data. Reporting to the CTO and partnering closely with Analytics, Operations, Product, Compliance, and BD, you will own Rain's data pipelines end-to-end: from ingesting raw transaction and blockchain data, to orchestrating transformations, to ensuring quality, observability, and reliability. This role is foundational-you'll be shaping the data backbone that underpins analytics, customer reporting, operational tooling, and on-chain integrations. If you love building in fast-moving environments, care deeply about data quality, and want to own core infrastructure at the heart of a modern fintech, this role is for you. 
    What you'll do
    * Design, build, and maintain Rain's core data pipelines, including ingestion from payments processors, card issuers, blockchain nodes, internal services, and third-party APIs
    * Own orchestration and workflow management, implementing Airflow, Dagster, or similar tools to ensure reliable, observable, and scalable data processing
    * Architect and manage Rain's data warehouse (Snowflake, BigQuery, or Redshift), driving performance, cost optimization, partitioning, and access patterns
    * Develop high-quality ELT/ETL transformations to structure raw logs, transactions, ledgers, and on-chain events into clean, production-grade datasets
    * Implement data quality frameworks and observability (tests, data contracts, freshness checks, lineage) to ensure every dataset is trustworthy
    * Partner closely with backend engineers to instrument new events, define data contracts, and improve telemetry across Rain's infrastructure
    * Support Analytics and cross-functional teams by delivering well-modeled, well-documented tables that power dashboards, ROI analyses, customer reporting, and key business metrics
    * Own data reliability at scale, leading root-cause investigations, reducing pipeline failures, and building monitoring and alerting systems
    * Evaluate and integrate new tools across ingestion, enrichment, observability, and developer experience - raising the bar on performance and maintainability
    * Help set the long-term technical direction for Rain's data platform as we scale across new products, regions, and chains

    What we're looking for
    * Data infrastructure builder - You thrive in early-stage environments, owning pipelines and platforms end-to-end and choosing simplicity without sacrificing reliability
    * Expert data engineer - Strong Python and SQL fundamentals, with real experience building production-grade ETL/ELT
    * Workflow & orchestration fluent - Hands-on experience with Airflow, Dagster, Prefect, or similar systems
    * Warehouse & modeling savvy - Comfortable designing schemas, optimizing performance, and operating modern cloud warehouses (Snowflake, BigQuery, Redshift)
    * Quality-obsessed - You care deeply about data integrity, testing, lineage, and observability
    * Systems thinker - You see data as a platform; you design for reliability, scale, and future users
    * Collaborator - You work well with backend engineers, analytics engineers, and cross-functional stakeholders to define requirements and deliver outcomes
    * Experienced - 5-7+ years in data engineering roles, ideally within fintech, payments, B2B SaaS, or infrastructure-heavy startups

    Nice to have, but not mandatory
    * Experience ingesting and processing payment data, transaction logs, or ledger systems
    * Exposure to smart contracts, blockchain data structures, or on-chain event ingestion
    * Experience building data tooling for compliance, risk, or regulated environments
    * Familiarity with dbt and/or semantic modeling to support analytics layers
    * Prior experience standing up data platforms from 0→1 at early-stage companies

    Things that enable a fulfilling, healthy, and happy experience at Rain:
    * Unlimited time off 🌴 Unlimited vacation can be daunting, so we require Rainmakers to take at least 10 days off.
    * Flexible working ☕ We support a flexible workplace. If you feel comfortable at home, please work from home. If you'd like to work with others in an office, feel free to come in. We want everyone to be able to work in the environment in which they are their most confident and productive selves. New Rainmakers will receive a stipend to create a comfortable home environment.
    * Easy to access benefits 🧠 For US Rainmakers, we offer comprehensive health, dental and vision plans for you and your dependents, as well as a 100% company subsidized life insurance plan.
    * Retirement goals 💡 Plan for the future with confidence. We offer a 401(k) with a 4% company match.
    * Equity plan 📦 We offer every Rainmaker an equity option plan so we can all benefit from our success.
    * Rain Cards 🌧️ We want Rainmakers to be knowledgeable about our core products and services. To support this mission, we issue a card for our team to use for testing.
    * Health and Wellness 📚 High performance begins from within. Rainmakers are welcome to use their card for eligible health and wellness spending like gym memberships/fitness classes, massages, acupuncture - whatever recharges you!
    * Team summits ✨ Summits play an important role at Rain! Time spent together helps us get to know each other, strengthen our relationships, and build a common destiny. Expect team and company off-sites both domestically and internationally.
    $102k-148k yearly est. Auto-Apply 4d ago
  • Sr Data Engineer (FAST, AVOD)

    Sony Pictures Entertainment (4.8 company rating)

    Culver City, CA jobs

    Sony Pictures Entertainment, a division of Sony Corporation, is a global creative entertainment company built on a foundation of technology, storytelling, and innovation. Sony Pictures Core is a streaming service that operates on BRAVIA TVs and PlayStation consoles. We are seeking a strategic and analytical Senior Data Engineer to oversee the development and implementation of analytical datasets that drive decision-making across the organization and help shape the future of our DTC AVOD and FAST platforms- ultimately influencing the vision, launch, and growth of Sony Pictures Core. This role is a part of the broader ISA team but is dedicated and embedded in the day to day of the Sony Pictures Core AVOD/FAST business. The ideal candidate has a strong blend of technical expertise, business acumen, and leadership skills. Key Responsibilities: + Lead the design and development of data pipelines related to streaming video analytics to create high quality, analytics-ready datasets used by data scientists and data visualization teams in self-service analytics tools, and to power business-critical intelligence by the insights & strategy team. + Translate complex business requirements into the design of robust engineering and operational processes. + Assess source data fitness and understand the data collection methodologies for the AVOD / SVOD product portfolio. Anticipate source data quality challenges and lead solutioning to ensure downstream tools receive highly accurate, reliable data. + Contribute to fast-paced agile teams to build new analytical tools. Source and harmonize data from disparate systems, collaborating closely with product, data science, and data visualization teams. + Advocate for best practices in analytics engineering, promoting advanced use of tools like dbt, Airflow, and other relevant technologies to ensure scalability, reliability, and performance. + Work closely with Product teams to define data requirements and ensure data architecture and pipelines support both new and existing data product roadmaps. + Agile and Collaborative Mindset: Effectively collaborate with non-technical stakeholders to understand business requirements and with technical partners across multiple disciplines to design elegant solutions. Deliver business impact within a fast-paced, agile environment. + Data Governance: Ensure data accuracy, integrity, and security across systems. Partner with application engineers, IT teams, and Infosec on data architecture and governance initiatives. Qualifications: + Bachelor's or Master's degree in Data Science, Computer Science, Business Analytics, or related fields; advanced degrees are a plus. + 7+ years in data modelling, analytics engineering, data engineering, or related technical roles in a production environment. Hands-on experience with AVOD/SVOD streaming, content management or video ad management is a plus. + Proficiency with modern cloud data platforms (e.g., Snowflake, GCP/BigQuery, AWS/Redshift). + Expert-level SQL & Python skills required. Experience building transformation pipelines in dbt is a major plus. + Hands-on experience with modern orchestration tools like Airflow. + Familiarity with CI/CD workflows, version control, and automated testing frameworks to meet data quality standards. + Background in entertainment, with a deep understanding of the TV and streaming landscape across linear, digital, AVOD, FAST, SVOD, and emerging platforms. + Strong problem-solving skills and attention to detail. 
+ Excellent communication and stakeholder management skills. Sony Pictures Entertainment is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, national origin, disability, veteran status, age, sexual orientation, gender identity, or other protected characteristics. To request an accommodation for purposes of participating in the hiring process, you may contact us at SPE_Accommodation_Assistance@spe.sony.com.
    $127k-168k yearly est. 39d ago
  • Sr Data Engineer (FAST, AVOD)

    Sony Pictures (4.8 company rating)

    Culver City, CA jobs

    Sony Pictures Entertainment, a division of Sony Corporation, is a global creative entertainment company built on a foundation of technology, storytelling, and innovation. Sony Pictures Core is a streaming service that operates on BRAVIA TVs and PlayStation consoles. We are seeking a strategic and analytical Senior Data Engineer to oversee the development and implementation of analytical datasets that drive decision-making across the organization and help shape the future of our DTC AVOD and FAST platforms- ultimately influencing the vision, launch, and growth of Sony Pictures Core. This role is a part of the broader ISA team but is dedicated and embedded in the day to day of the Sony Pictures Core AVOD/FAST business. The ideal candidate has a strong blend of technical expertise, business acumen, and leadership skills. Key Responsibilities: * Lead the design and development of data pipelines related to streaming video analytics to create high quality, analytics-ready datasets used by data scientists and data visualization teams in self-service analytics tools, and to power business-critical intelligence by the insights & strategy team. * Translate complex business requirements into the design of robust engineering and operational processes. * Assess source data fitness and understand the data collection methodologies for the AVOD / SVOD product portfolio. Anticipate source data quality challenges and lead solutioning to ensure downstream tools receive highly accurate, reliable data. * Contribute to fast-paced agile teams to build new analytical tools. Source and harmonize data from disparate systems, collaborating closely with product, data science, and data visualization teams. * Advocate for best practices in analytics engineering, promoting advanced use of tools like dbt, Airflow, and other relevant technologies to ensure scalability, reliability, and performance. * Work closely with Product teams to define data requirements and ensure data architecture and pipelines support both new and existing data product roadmaps. * Agile and Collaborative Mindset: Effectively collaborate with non-technical stakeholders to understand business requirements and with technical partners across multiple disciplines to design elegant solutions. Deliver business impact within a fast-paced, agile environment. * Data Governance: Ensure data accuracy, integrity, and security across systems. Partner with application engineers, IT teams, and Infosec on data architecture and governance initiatives. Qualifications: * Bachelor's or Master's degree in Data Science, Computer Science, Business Analytics, or related fields; advanced degrees are a plus. * 7+ years in data modelling, analytics engineering, data engineering, or related technical roles in a production environment. Hands-on experience with AVOD/SVOD streaming, content management or video ad management is a plus. * Proficiency with modern cloud data platforms (e.g., Snowflake, GCP/BigQuery, AWS/Redshift). * Expert-level SQL & Python skills required. Experience building transformation pipelines in dbt is a major plus. * Hands-on experience with modern orchestration tools like Airflow. * Familiarity with CI/CD workflows, version control, and automated testing frameworks to meet data quality standards. * Background in entertainment, with a deep understanding of the TV and streaming landscape across linear, digital, AVOD, FAST, SVOD, and emerging platforms. * Strong problem-solving skills and attention to detail. 
* Excellent communication and stakeholder management skills. The anticipated base salary for this position is $130K to $170K. This role may also qualify for annual incentive and/or comprehensive benefits. The actual base salary offered will depend on a variety of factors, including without limitation, the qualifications of the individual applicant for the position, years of relevant experience, level of education attained, certifications or other professional licenses held, and if applicable, the location of the position. Sony Pictures Entertainment is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, national origin, disability, veteran status, age, sexual orientation, gender identity, or other protected characteristics. SPE will consider qualified applicants with arrest or conviction records in accordance with applicable law. To request an accommodation for purposes of participating in the hiring process, you may contact us at SPE_Accommodation_Assistance@spe.sony.com.
    $130k-170k yearly Auto-Apply 40d ago
  • Staff Data Platform Engineer

    Gemini (4.9 company rating)

    New York jobs

    About the Company Gemini is a global crypto and Web3 platform founded by Cameron and Tyler Winklevoss in 2014, offering a wide range of simple, reliable, and secure crypto products and services to individuals and institutions in over 70 countries. Our mission is to unlock the next era of financial, creative, and personal freedom by providing trusted access to the decentralized future. We envision a world where crypto reshapes the global financial system, internet, and money to create greater choice, independence, and opportunity for all - bridging traditional finance with the emerging cryptoeconomy in a way that is more open, fair, and secure. As a publicly traded company, Gemini is poised to accelerate this vision with greater scale, reach, and impact. The Department: Platform Our Platform organization's purpose is to enable Gemini to scale effectively and empower our engineering teams to focus on building innovative financial products and experiences for individuals around the world. Platform focuses on building a scalable and secure foundations platform, enabling Engineering to deploy, validate, and operate their services in production, improving service resiliency, increasing organizational efficiency by reducing operational toil, and increasing system efficiency through architectural evolution. The Platform team engages directly with our other engineering teams to onboard them onto our platform systems, reviewing and recommending design and architectural decisions, and guiding our engineering teams on how to implement the tooling provided by the larger Platform organization required to ensure systems can scale and react to changing conditions, with continuous improvement loops. The Role: Staff Data Platform Engineer As a Staff Data Platform Engineer on the Data/Database Engineering Team, you'll be instrumental in leading the building, scaling, and maintenance of our data infrastructure with a focus on architecture, reliability, availability, and performance. This role will work closely with both data engineering and product engineering teams, providing a robust infrastructure foundation that enables them to build, maintain, and scale data-driven products and solutions. An immediate priority will be implementing advanced scaling strategies for our relational database systems to support a highly scalable infrastructure. This role also requires a strong commitment to uptime and incident response, including participation in an on-call rotation. You'll bring expertise in database technologies (relational, columnar, document, key-value, and unstructured) and familiarity with core data infrastructure components like message queues, ETL pipelines, and real-time processing tools to support a resilient, high-performing data platform. This role is required to be in person twice a week at either our New York City, NY or San Francisco, CA office. Responsibilities: Database Scaling and Optimization: Design and implement scaling strategies for relational systems to ensure they meet the high availability and scalability needs of data and product engineering teams. Availability and Uptime Management: Proactively monitor and optimize database systems to meet stringent uptime requirements. Participate in an on-call rotation to respond to incidents, troubleshoot issues, and restore service promptly during disruptions. 
Architect and Optimize Database Infrastructure: Manage a variety of database technologies, balancing tradeoffs across relational, columnar, document, key-value, and unstructured data solutions, providing a foundation for data warehousing and supporting data-driven product needs. Integration with Data Engineering and Product Pipelines: Collaborate with data and product engineering teams to implement and optimize data pipelines, including message queues (e.g., Kafka), ETL workflows, and real-time processing, ensuring efficient and reliable data movement. Infrastructure Automation and Reliability: Utilize infrastructure as code (IaC) to automate deployment, scaling, and maintenance, creating a consistent, reliable environment that supports high availability and deployment efficiency for both data and product teams. Performance Tuning and Incident Response: Conduct performance tuning, establish monitoring and alerting, and address potential issues quickly to ensure a responsive platform that meets the needs of all engineering workloads. Documentation and Knowledge Sharing: Document processes, including scaling strategies, monitoring setups, and best practices, to support alignment with engineering requirements and ensure smooth handoffs in on-call situations. Qualifications: Deep expertise in data and storage technologies, including RDBMS (e.g., Postgres), NoSQL, and other database types (e.g., columnar, document, key-value, and unstructured), Object Storage (S3), with a strong understanding of tradeoffs and use cases for each. Demonstrated experience with advanced database scaling strategies for relational systems. Strong knowledge of high-availability architectures and proficiency with monitoring tools to support uptime and incident response. Experience with cloud-based database and data processing platforms, such as Amazon Aurora, Databricks, AWS RDS, Redshift, BigQuery, Snowflake, and managed services like AWS EMR and Google Cloud Dataflow. Hands-on experience with modern data transport and streaming platforms such as Kafka, Kinesis, and Pulsar, including building and operating real-time data pipelines. Familiarity with traditional ETL workflows, scheduled batch pipelines, and message queuing systems (e.g., RabbitMQ, SQS), and how they integrate with streaming architectures. Strong programming skills (e.g., Python, Bash, SQL) and experience with CI/CD practices. Experience in an on-call rotation and handling incident response. Excellent communication and collaboration skills, with a proven ability to work effectively with data and product engineering teams. It Pays to Work Here The compensation & benefits package for this role includes: Competitive starting salary A discretionary annual bonus Long-term incentive in the form of a new hire equity grant Comprehensive health plans 401K with company matching Paid Parental Leave Flexible time off Salary Range: The base salary range for this role is between $168,000 - $240,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate's compensation, we consider a number of factors including skillset, experience, job scope, and current market data. In the United States, we offer a hybrid work approach at our hub offices, balancing the benefits of in-person collaboration with the flexibility of remote work. 
Expectations may vary by location and role, so candidates are encouraged to connect with their recruiter to learn more about the specific policy for the role. Employees who do not live near one of our hubs are part of our remote workforce. At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know. #LI-ES1
    $168k-240k yearly Auto-Apply 60d+ ago
  • Data Engineer III

    Horizon Media 4.8company rating

    New York, NY jobs

    Who We Are Horizon Media, founded in 1989 by Bill Koenigsberg, is recognized as one of the most innovative marketing and advertising firms. We are headquartered in New York City, with offices in Los Angeles and Toronto. A leader in driving business solutions for marketers, Horizon is known for its highly personal approach to client service. Renowned for its incredible culture, Horizon is consistently named to all the prestigious annual Best Places to Work lists published by Fortune, AdAge, Crain's New York Business and Los Angeles Business Journal. Together we are building a place of belonging. At Horizon, we understand the value that different perspectives can bring to our clients and culture, so we strive for an environment where our employees feel welcomed, safe and empowered. We value YOU and believe that your authentic voice and unique perspective allows us to create a more rewarding culture, and experience, together. Our simple recipe for success? We hire talented people (thinkers, doers, dreamers, makers), challenge them and give them every opportunity to grow. Job Summary As an Analytics Engineering Architect, you will be responsible for leading the design and implementation of enterprise-wide analytics solutions that transform raw data into actionable business insights. This role combines deep expertise in data architecture, visualization development, and analytics engineering to create scalable solutions that enable data-driven decision-making across the organization. You will establish standards for data transformation, testing, and documentation while partnering with leadership to align analytics initiatives with business objectives. What You'll Do (40%) Design and Implement Enterprise Analytics Architect and oversee implementation of enterprise-wide data modeling strategies in Snowflake Define and maintain dbt modeling standards and best practices across the organization Design scalable solutions for complex analytical problems spanning multiple data domains Lead the development and implementation of reusable analytics frameworks and components Drive implementation of advanced performance optimization strategies for large-scale data transformations (30%) Technical Leadership and Innovation Partner with leadership to align analytics initiatives with business objectives Drive adoption of modern engineering tools with emphasis on dbt development practices Establish best practices for engineering across the organization Evaluate and recommend new technologies and approaches to improve capabilities Lead technical discussions and decision-making with executives, bridging engineering teams and stakeholders (15%) Visualization Development and Strategy Lead the design and implementation of enterprise-wide visualization solutions Establish visualization standards and best practices across the organization Architect scalable and performant dashboard solutions Optimize data models and queries for visualization performance Guide the development of interactive analytics applications Ensure visualizations effectively communicate business insights (15%) Team Leadership and Mentorship Mentor and guide junior analytics engineers in dbt development and analytics engineering best practices Review and approve technical designs and perform code reviews Foster a culture of innovation and continuous improvement within the team Lead knowledge sharing initiatives and internal training programs Qualifications: Bachelor's degree in computer science, Statistics, Mathematics, or related field 10+ years 
of experience in analytics, data engineering, or related field Experience with dbt with expertise in: Package development and dependency management Custom macro development Testing framework implementation CI/CD pipeline integration Expert-level knowledge of data modeling, dimensional design, and analytics engineering best practices Advanced proficiency in SQL and Python Proven experience leading and mentoring technical teams Excellence in stakeholder management and technical communication Track record of successfully delivering enterprise-scale solutions Proven expertise in data visualization development, including: Experience building enterprise-scale dashboards Strong understanding of data visualization best practices Expertise in visual design principles and data storytelling Experience with multiple visualization platforms Minimum 3 years of experience with Snowflake, including: Performance optimization and query tuning Security and access control implementation Resource monitoring and cost optimization Data warehouse architecture design Preferred skills: Prior experience with marketing and advertising data Experience with Streamlit for building data visualizations & applications Knowledge of other modern data visualization frameworks (D3.js, Plotly, etc.) Familiarity with machine learning visualization techniques Building pipelines in support of chat applications Publishing applications to Snowflake's marketplace #LI-KG1 #LI-HYBRID #HM Horizon Media is proud to be an equal opportunity workplace. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. Salary Range $180,000.00 - $230,000.00 A successful applicant's actual base salary may vary based on factors such as individual's skill sets, experience, training, education, licensure/certifications, and qualifications for the role. As an organization, we take an aptitude and competency-based hiring approach. We provide a competitive total rewards package including a discretionary bonus and a variety of benefits including health insurance coverage, life and disability insurance, retirement savings plans, company paid holidays and unlimited paid time off (PTO), mental health and wellness resources, pet insurance, childcare resources, identity theft insurance, fertility assistance programs, and fitness reimbursement.
    $180k-230k yearly Auto-Apply 60d+ ago
  • Principal Data Engineer

    Horizon Media 4.8company rating

    New York, NY jobs

    Horizon Media, founded in 1989 by Bill Koenigsberg, is recognized as one of the most innovative marketing and advertising firms. We are headquartered in New York City, with offices in Los Angeles and Toronto. A leader in driving business solutions for marketers, Horizon is known for its highly personal approach to client service. Renowned for its incredible culture, Horizon is consistently named to all the prestigious annual Best Places to Work lists published by Fortune, AdAge, Crain's New York Business and Los Angeles Business Journal. Together we are building a place of belonging. At Horizon, we understand the value that different perspectives can bring to our clients and culture, so we strive for an environment where our employees feel welcomed, safe and empowered. We value YOU and believe that your authentic voice and unique perspective allow us to create a more rewarding culture, and experience, together. Our simple recipe for success? We hire talented people (thinkers, doers, dreamers, makers), challenge them and give them every opportunity to grow. Overview Horizon Media is seeking a hands-on Principal Data Engineer to join our growing data engineering team. Reporting to the VP of Architecture & Data Engineering, you will play a critical role in building and maintaining robust, scalable, and efficient data pipelines that power analytics, reporting, and data-driven decision-making across the organization. This role is deeply embedded in our modern data stack, leveraging tools like Snowflake, dbt Cloud, Fivetran, and Python. You will work closely with analysts, data scientists, and engineering peers to ensure our data systems are fast, reliable, and ready to meet business demands. Key Responsibilities Build and maintain end-to-end data pipelines using dbt, Snowflake, and Python Develop modular, testable, and well-documented SQL-based data models in dbt Integrate data from multiple sources using Fivetran and custom ingestion scripts Write and optimize complex SQL queries for performance and scalability Implement data quality checks, monitoring, and alerting to ensure pipeline reliability Collaborate with cross-functional teams to understand data needs and deliver clean, well-structured datasets Support production workflows through CI/CD pipelines and version control (Git) Troubleshoot pipeline failures and performance issues quickly and effectively Contribute to documentation and improve internal standards and tooling Qualifications 6-10 years of hands-on experience in data engineering or a similar role Strong experience working with Snowflake and dbt in production environments Proficient in SQL (advanced level) and Python for data transformation and automation Experience working with Fivetran or similar data ingestion platforms Familiarity with CI/CD tools and Git-based development workflows Understanding of data modeling principles (e.g., dimensional models, star/snowflake schemas) Experience working in cloud environments, preferably AWS Strong attention to detail, with a passion for clean, maintainable data systems Excellent problem-solving skills and a collaborative mindset Nice to Have Experience with orchestration tools like Airflow or Dagster Familiarity with data testing frameworks (e.g., dbt tests, Great Expectations) Exposure to media, advertising, or marketing data Technical Skills dbt Cloud, Snowflake, Fivetran SQL, Python Git, CI/CD workflows AWS (S3, Lambda, etc.) #LI-KG1 #LI-HYBRID #HM Horizon Media is proud to be an equal opportunity workplace.
We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. Salary Range $180,000.00 - $230,000.00 A successful applicant's actual base salary may vary based on factors such as individual's skill sets, experience, training, education, licensure/certifications, and qualifications for the role. As an organization, we take an aptitude and competency-based hiring approach. We provide a competitive total rewards package including a discretionary bonus and a variety of benefits including health insurance coverage, life and disability insurance, retirement savings plans, company paid holidays and unlimited paid time off (PTO), mental health and wellness resources, pet insurance, childcare resources, identity theft insurance, fertility assistance programs, and fitness reimbursement.
    $180k-230k yearly Auto-Apply 60d+ ago
  • Senior Data Engineer

    Resonate 4.2company rating

    Washington jobs

    Resonate is a leading provider of high-quality, AI-powered consumer data, intelligence, and technology, empowering marketers to create a more personalized world that increases customer acquisition and lifetime value. Our SaaS platform, Ignite, and our Data-as-a-Service (DaaS) offerings provide unparalleled insights into consumer motivations, values, and behaviors, enabling our clients to connect with their target audiences in more meaningful and effective ways. We are a dynamic and fast-growing company seeking passionate and innovative individuals to join our team! We're looking for a Senior Data Engineer who combines deep technical expertise with strong communication skills and a proactive mindset. You'll design, build, and optimize complex data pipelines in a cost-conscious environment while actively collaborating with your squad. This role is ideal for engineers who love sharing their work and navigating ambiguity, and who understand that “it runs” is not the same as “it runs efficiently.” Thriving here means taking ownership, balancing innovation with operational excellence, and proactively seeking out opportunities for optimization. Key Responsibilities Design and develop high-performance data pipelines using Scala/Spark on AWS EMR. Build new features and data products while maintaining operational excellence. Optimize pipelines for both performance and cost efficiency. Debug complex distributed system issues using appropriate tools and methodologies. Implement comprehensive monitoring and observability from day one. Write efficient Snowflake/Snowpark procedures across multiple languages. Balance new development with continuous improvement of existing systems. Qualifications & Experience Requirements Technical Requirements: 5+ years of hands-on experience with Apache Spark using Scala exclusively (no PySpark). Proven track record of optimizing large-scale data pipelines for performance and cost. Strong AWS EMR experience, including fleet management and instance optimization. Proficiency in AWS Step Functions. Deep understanding of distributed computing principles and resource management. Experience debugging and tuning multi-terabyte daily workloads. Comfort working across Scala, Python, and SQL as needed. Experience with probabilistic data structures for high-cardinality data processing. Critical Skills: Advanced troubleshooting abilities in distributed systems. Strong understanding of data skew mitigation strategies. Metrics-first mindset: measuring before and after optimization. Root cause analysis expertise, preventing issues rather than just fixing them. Mindset Requirements: Cost-conscious, treating company money like your own. Balance between innovation and operational excellence. Proactive in optimization, seeing inefficiencies as opportunities. Communicate early and often, especially around risks and blockers. Ability to interpret needs beyond stated requirements. Preferred Qualifications Experience designing large-scale, observable, and maintainable systems. Proven ability to balance building new capabilities with perfecting existing systems. Strong track record of mentoring and knowledge sharing. Background in cost/performance tradeoff decision-making. Passion for environments where initiative and ownership are rewarded. Resonate Attributes At Resonate, we care about more than your experience on paper. We look for teammates who bring the right mindset and ways of working to help us do great work, together.
The Resonators who thrive here embody these qualities: Proactive Problem-Solving: You spot challenges early and take initiative to solve them. Ownership: You take initiative, follow through, and hold yourself accountable. Collaboration: You value working with others and building strong, respectful relationships. Adaptability: You stay steady through change and adjust when needed. Growth Mindset: You're open to learning, feedback, and trying new things. Customer Focus: You keep the end user in mind and aim for impact. Clear Communication: You express ideas simply and listen to understand. Integrity & Empathy: You act with honesty and consider the people around you. Strategic Thinking: You focus on what matters most and work with purpose. Drive: You show up motivated and ready to move things forward. Readiness: You bring the skills to contribute meaningfully from day one. These attributes are the foundation of how we work and grow together! Benefits Besides the opportunity to work alongside smart, fun, and hard-working colleagues at Resonate, you'll also enjoy uncapped growth potential, the chance to have a meaningful impact on the company, and the opportunity to work on cutting-edge, AI-powered marketing data and identity solutions that are forever changing the industry. From a competitive 401(k) match to an Open PTO policy that encourages real time off, we're committed to creating an environment where our team members can thrive both inside and outside of work. Our comprehensive benefits package is designed to meet the diverse needs of our employees and their families. A full list of benefits is shared with candidates following their initial conversation with our HR team, so you'll have a clear understanding of the support and resources available to you as part of the Resonate team. Location At Resonate, we take a remote-first approach to work, offering a flexible environment that empowers our team to collaborate seamlessly across different locations. While we embrace remote work, we also encourage thoughtful and intentional in-person collaboration to foster connection and teamwork when needed. Whether you're working from home or joining us in one of our state-of-the-art offices, you'll have the tools and resources you need to succeed. Resonate is headquartered in Reston, VA, with offices in New York City and Washington, D.C. Our EEO Statement: Resonate is an equal opportunity employer that is committed to diversity and inclusion in the workplace. We prohibit discrimination and harassment of any kind based on race, color, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other protected characteristic as outlined by federal, state, or local laws. Find out more about our story at *****************
    $108k-152k yearly est. Auto-Apply 60d+ ago
  • Data Engineer

    Cortina Solutions 3.4company rating

    Huntsville, AL jobs

    The Data Engineer is responsible for designing, building, and maintaining data pipelines and infrastructure to collect, store, and process raw data into a usable format for analysis. The candidate will be responsible for converting business and functional requirements into complex reports, data visualizations and dashboards. You will consult with the client in the development of intuitive and user-friendly data visualization dashboards and applications. Other tasks will include importing data into a visualization engine from various external sources and the development of other web-based query applications. Job Requirements: Must be a U.S. Citizen Must have a bachelor's degree in accounting, finance, information technology, logistics, or business management Must have at least 3 years of experience Experience with SQL Experience with Python Must hold an active DoD Secret Level security clearance or equivalent As our team members work on government sites, all potential candidates are subject to a background screening that fully complies with the Fair Credit Reporting Act.
    $80k-111k yearly est. 60d+ ago
  • Business Intelligence & Data Engineer

    Sentinel 3.8company rating

    Phoenix, AZ jobs

    Responsibilities We are seeking a hands-on Data Engineer to manage and expand our enterprise business intelligence and data platform. This role will be responsible for running and enhancing the full BI stack, with a primary focus on Tableau development and administration, and a secondary focus on Azure-based data engineering. The ideal candidate will be comfortable owning solutions end to end, from data ingestion through presentation, and working independently within a small, agile technology team. This role requires strong communication skills to collaborate effectively with both technical and non-technical stakeholders, including executives. This is a hybrid role at our client location in Phoenix, AZ. Qualifications Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field, or equivalent practical experience Minimum 2 years of experience in data modeling and report building, specifically using Tableau for dashboard and report creation Relevant certifications in cloud platforms, data analytics, or business intelligence are a plus (e.g., Microsoft Certified: Azure Data Engineer Associate, Tableau Desktop Specialist) Must have experience with Tableau (dashboard creation, report development, data modeling). Proficiency in SQL for querying relational and non-relational data sources Experience with cloud-based data environments, preferably Microsoft Azure Experience in developing, maintaining, and enhancing ETL/ELT processes for data transformation and loading in cloud-based environments Strong understanding of data warehousing concepts, data modeling techniques, and best practices for cloud data architecture Proficiency in scripting languages like Python or R is preferred, particularly for data manipulation and analysis Ability to work independently and manage projects from start to finish Strong communication skills with both technical and non-technical audiences The candidate must have a car, as this position requires travel between locations and the transportation of equipment A valid driver's license and proof of vehicle insurance will be required Legally authorized to work in the US without sponsorship Must demonstrate a “can-do” attitude We focus on candidates that display our “ACE” factor - Attitude, Compassion, and Enthusiasm to deliver quality solutions with exceptional customer service. What you get: We offer competitive pay, medical, dental, vision, 401K and more. Overview MOTIVATED…..make IT happen! Sentinel Technologies, Inc. has been rated a top workplace every year since 2012! About Us: Sentinel delivers solutions that can efficiently address a range of IT needs - from security, to communications, to systems & networks, to software applications, to cloud and managed services; all of which include our staffing solutions for our clients. Since 1982, Sentinel has grown from providing technology maintenance services to our current standing as one of the leading IT services and solutions providers in the US. We have aligned with many of today's global technology leaders including Cisco, Dell, VMware and Microsoft. Sentinel services customers both nationally and internationally with primary support operating centers in Downers Grove (HQ), Chicago, and Springfield, IL; Phoenix, AZ; Lansing, and Grand Rapids, MI; Milwaukee, WI; and Denver, CO. If you are MOTIVATED… you can make IT happen at Sentinel.
Our commitment to our employees is to create a work environment that encourages creativity, an entrepreneurial spirit, fosters growth through certification and hands-on training, and values a team-oriented culture with rewards based on impact! If you share our passion about what technology can do and want to be part of a top workplace environment - we'd like to have you join our team. Learn more at ************************* As part of Sentinel's employment process, candidates will be required to complete a background check. Only those who meet the minimum requirements will be contacted. No phone calls please. Sentinel is proud to be an equal opportunity employer including disability and veterans. In accordance with Title VII and state regulations, all qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, marital status, pregnancy, genetics, disability, military, veteran status or any other basis protected by law. If you are an individual with a disability and need assistance in applying for a position, please contact ************************. The “Know Your Rights” Poster is available here ******************************************************************************************** Sentinel EEO Policy Statement is available here. **************************************** JFNDNP
    $93k-119k yearly est. Auto-Apply 2d ago
  • Sr. Data Engineer

    Slickdeals 4.1company rating

    Las Vegas, NV jobs

    We believe shopping should feel like winning. That's why 10 million people come to Slickdeals to swap tips, upvote the best finds, and share the thrill of a great deal. Together, our community has saved more than $10 billion over the past 26 years. We're profitable, passionate, and in the middle of an exciting evolution, transforming from the internet's most trusted deal forum into the go-to daily shopping destination. If you thrive in a fast-moving, creative environment where ideas turn into impact fast, you'll fit right in. The Purpose: We're seeking a seasoned Senior Data Engineer to join our high-impact team at Slickdeals. This role will inherit and evolve a mature data ecosystem built over 3+ years, spanning Databricks, dbt, Airflow, AWS, Tableau, and AtScale. You'll be responsible for maintaining and modernizing core pipelines, enabling analytics and reporting, and supporting cost-conscious, scalable data infrastructure. What You'll Do: Own and maintain ETL/ELT pipelines using dbt, Airflow, and Databricks Develop and optimize data models in AtScale to support BI tools like Tableau Collaborate with Analytics, Product, and Engineering teams to deliver reliable, timely data Monitor and troubleshoot data workflows, ensuring high availability and performance Support cloud infrastructure in AWS, including S3, Kafka, EC2, Lambda, and IAM policies Contribute to cost optimization efforts across data storage, compute, and tooling Document systems, processes, and tribal knowledge for continuity and onboarding Participate in code reviews, architecture discussions, and team rituals What We're Looking For: Required Experience: BS/BA/BE degree in a quantitative area such as mathematics, statistics, economics, computer science, engineering, or equivalent experience. 8+ years of experience in data engineering or analytics engineering Strong proficiency in SQL, Python, and dbt Hands-on experience with Databricks, Airflow, and AWS Familiarity with semantic modeling tools Experience building dashboards and supporting BI teams using Tableau Understanding of data governance, security, and compliance best practices Excellent communication and written documentation skills Comfortable working in a fast-paced, collaborative environment Always curious and a continuous learner Preferred Experience: Experience with cost monitoring tools or FinOps practices Familiarity with vendor integrations and API-based data sharing Exposure to AtScale, Tableau, or other modern data platforms Passion for mentoring and knowledge sharing With your application, kindly attach a cover letter that outlines your greatest achievement. Please share what you built, how you measured success, and your role in the result. Please note: We are unable to sponsor visas at this time. Candidates must be authorized to work in the U.S. without current or future visa sponsorship or transfer. LOCATION: Las Vegas, NV Hybrid schedule visiting our Las Vegas office three days a week (Tues-Thurs). Slickdeals Compensation, Benefits, Perks: The expected base pay for this role is between $122,000 - $150,000. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Exact compensation will be discussed during the interview process and tailored to the candidate's qualifications.
Competitive base salary and annual bonus Competitive paid time off in addition to holiday time off A variety of healthcare insurance plans to give you the best care for your needs 401K matching above the industry standard Professional Development Reimbursement Program Work Authorization Candidates must be eligible to work in the United States. Slickdeals is an Equal Opportunity Employer; employment is governed on the basis of merit, competence and qualifications and will not be influenced in any manner by race, color, religion, gender (including pregnancy, childbirth, or related medical conditions), national origin/ethnicity, veteran status, disability status, age, sexual orientation, gender identity, marital status, mental or physical disability or any other protected status. Slickdeals will consider qualified applicants with criminal histories consistent with the "Ban the Box" legislation. We may access publicly available information as part of your application. Slickdeals participates in E-Verify. For more information, please refer to E-Verify Participation and Right to Work. Slickdeals does not accept unsolicited resumes from agencies and is not responsible for related fees.
    $122k-150k yearly Auto-Apply 21d ago
  • Senior Data Engineer

    Outfront Media 4.7company rating

    New York, NY jobs

    About OUTFRONT We are one of North America's most innovative media companies. We leverage the power of creative excellence, unbeatable locations and smart audience data to change the game for advertisers. Our purpose as a company is to help people, places and businesses grow stronger. To do this, we make meaningful connections between brands and people when they are outside of their homes through one of the largest and most diverse sets of out-of-home assets including billboards, transit and mobile displays across the U.S. We connect diverse audiences across over 150 markets and conduct our business considering all our stakeholders, from clients and employees, to the communities where we operate. We are committed to creating a diverse and inclusive work environment that promotes the growth of our people. Come join our industry-leading team! What We Offer OUTFRONT offers a comprehensive benefits program including: Medical, Dental, Vision (including same and opposite-sex domestic partners) HSA and FSA plans, Family Benefits, Pet Benefits 401(k) Plan with an Employer Match Paid Time Off, Commuter Benefits, Educational Assistance Robust Diversity, Equity and Inclusion program including 7 Employee Resource Groups (ERGs) Job Summary At OUTFRONT Media, a leader in Digital Out of Home (DOOH) advertising, we're blending cutting-edge technology with our nationwide network of digital billboards and transit displays to create innovative solutions that push the boundaries of what's possible. We're looking for a passionate and experienced Senior Data Engineer to join our growing Data & Analytics team. If you're excited about building high-performance data pipelines, solving complex problems, and collaborating with both internal and external teams, we want to hear from you! As a Senior Data Engineer, you'll leverage your expertise in cloud technologies, data architecture, and modern data warehouses to design and implement scalable, high-impact solutions that drive real business results. You'll partner with leadership to influence strategic decisions, ensuring our data systems evolve alongside the business to meet future needs. In this role, you'll be the go-to expert for all things cloud and data, playing a key part in shaping innovative, customer-focused solutions. If you're a multi-tasker with strong attention to detail, and you're ready to make a significant impact in a fast-paced, collaborative environment, let's work together to build the future of advertising technology! What You'll Do Build and maintain high-performance data pipelines that enable deeper analysis, reporting, and insights across the business. Design and implement large-scale, data structures that support analytics and data science needs, optimizing for speed, scalability, and reliability. Create seamless data ingestion processes (both real-time and batch) using modern ETL/ELT practices, leveraging cloud technologies and big data tools to ensure smooth data flow. Collaborate with business teams to understand needs, and translate them into flexible, scalable data solutions that grow with the company. Work alongside engineers to establish best practices for data system creation, ensuring data quality, integrity, and proper documentation throughout. Continuously improve reporting and analysis by automating processes, streamlining workflows and empowering teams with self-service data tools for easy access to insights. 
Partner with senior leadership to stay aligned on business objectives, provide regular updates, and make data-driven decisions to drive progress. Be the subject matter expert in your area, understanding the ad tech landscape and identifying ways to leverage OUTFRONT's unique position in the market. Take the initiative to identify, troubleshoot, and resolve challenges that could hinder progress on strategic goals, ensuring the team remains on track to meet both technical and business objectives. Experience 7+ years of experience designing and implementing data warehouse solutions, focusing on scalability and performance. 7+ years of experience writing complex SQL queries and developing robust ETL/ELT pipelines to transform and integrate data. 7+ years of programming experience in languages like Python, focusing on building scalable and efficient solutions. Extensive experience in building and optimizing APIs, containerization, and orchestration with Kubernetes to ensure scalable, high-performance data systems. Hands-on experience with Cloud data warehouses (ideally with Snowflake, Google BigQuery, or AWS Redshift). Proficient in cloud technologies such as GCP or AWS. Experience leading and collaborating with agile teams, utilizing tools like Jira and Confluence, to drive user-centered and iterative development processes. Technical Expertise Expertise in creating reports and dashboards using tools like Tableau, Looker, or Power BI to enable insightful decision-making. Deep knowledge of identity management, data privacy, audience targeting, cross-device graphs, and related technologies. Solid understanding of digital advertising technology, including ad serving, analytics, DSP, DMP, SSP, QA, and targeting. Soft Skills Strong cross-functional collaboration skills, with experience working in complex organizations and aligning teams on data product decisions and trade-offs. Exceptional communication skills, capable of breaking down complex technical concepts and clearly communicating them to stakeholders. Passionate about working with large datasets, skilled at spotting trends and deriving actionable insights to drive business decisions. A quick learner, adept at grasping both business and technical needs to deliver impactful solutions. Education Bachelor of Engineering in Computer Science or equivalent preferred. The salary range for this role is $165,000-$185,000 per year. Compensation is determined during our interview process by assessing a candidate's experience and skills relative to internal peers and market benchmarks evaluated for the scope and responsibilities of the position. Please note that the foregoing compensation information is a good-faith assessment associated with this position only and is provided pursuant to the New York City Salary Transparency Law. To all Recruitment Agencies: OUTFRONT Media LLC does not accept agency and unsolicited resumes. Please do not forward resumes to our OUTFRONT Media employees or any other company location. OUTFRONT Media is not responsible for any fees related to unsolicited resumes. OUTFRONT Media Is An Equal Opportunity Employer All applicants shall receive equal consideration without regard to race, color, religion, gender, marital status, gender identity or expression, sexual orientation, national origin, age, veteran status or disability. Please refer to the OUTFRONT Media Affirmative Action policy statement.
    $165k-185k yearly Auto-Apply 60d+ ago
  • Senior Data Engineer, 1

    Meredith 4.4company rating

    Day, NY jobs

    | Major goals and objectives and location requirements People Inc. is a leading digital media company that owns and operates a portfolio of highly respected brands across various verticals, including lifestyle, health, finance, and more. With a commitment to providing high-quality content and innovative digital experiences, People Inc. reaches millions of users globally and continues to drive growth and engagement across its platforms. People Inc. is looking for a Senior Data Engineer with strong Python and SQL skills to join the Data Operations team. The successful candidate will help build our data integration pipelines that feed into our data lakes and warehouses while maintaining data quality and integrity in our data stores. We are looking for someone who is a great team player but can also work independently. This person will also work closely with key stakeholders and understand and implement business requirements and see to it that data deliverables are met. Remote or Hybrid 3x a month In-office Expectations: This position offers remote work flexibility; however, if you reside within a commutable distance to one of our offices in New York, Des Moines, Birmingham, Los Angeles, Chicago, or Seattle, the expectation is to work from the office three times per month About The Positions Contributions: Weight % Accountabilities, Actions and Expected Measurable Results 60% You will enhance our systems by building new data integration pipelines and adding new data to our data lakes and warehouses while continuously optimizing them. You will work with internal team members as well as stakeholders to scope out business requirements and see data deliverables through to the end where they will be used via our Looker platform. You will continuously look for ways to improve our data transformations and data consumption processes so that our systems are running efficiently, and our customers are able to use and analyze our data quickly and effectively. 40% You will champion coding standards and best practices by actively participating in code reviews, and working to improve our internal tools and build process. You will work to ensure the security and stability of our infrastructure in a multi-cloud environment. You will collaborate with our Analytics engineers to ensure data integrity and the quality of our data deliverables. The Role's Minimum Qualifications and Job Requirements Education: Degree in a quantitative field, such as computer science, statistics, mathematics, engineering, data science, or equivalent experience. Experience: A minimum of 3+ years of experience in building and optimizing data pipelines with Python. You have experience writing complex SQL queries to analyze data. You have commendable experience with at least one cloud service platform (GCP and AWS preferred). You've worked with data at scale using Apache Spark, Beam or a similar framework. You're familiar with data streaming architectures using technologies like Pub/Sub and Apache Kafka. You are eager to learn about new tech stacks, big data technologies, data pipelining architectures, etc. and propose your findings to the team to try and optimize our systems. Specific Knowledge, Skills, Certifications and Abilities: Strong Python and SQL skills. Experience with Google Cloud Platform is a plus. % Travel Required (Approximate) : 0% It is the policy of People Inc. 
to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, the Company will provide reasonable accommodations for qualified individuals with disabilities. Accommodation requests can be made by emailing *************. The Company participates in the federal E-Verify program to confirm the identity and employment authorization of all newly hired employees. For further information about the E-Verify program, please click here: ********************************** Pay Range Salary: New York: $140,000.00 - $170,000.00 The pay range above represents the anticipated low and high end of the pay range for this position and may change in the future. Actual pay may vary and may be above or below the range based on various factors including but not limited to work location, experience, and performance. The range listed is just one component of People Inc's total compensation package for employees. Other compensation may include annual bonuses, and short- and long-term incentives. In addition, People Inc. provides to employees (and their eligible family members) a variety of benefits, including medical, dental, vision, prescription drug coverage, unlimited paid time off (PTO), adoption or surrogate assistance, donation matching, tuition reimbursement, basic life insurance, basic accidental death & dismemberment, supplemental life insurance, supplemental accident insurance, commuter benefits, short term and long term disability, health savings and flexible spending accounts, family care benefits, a generous 401K savings plan with a company match program, 10-12 paid holidays annually, and generous paid parental leave (birthing and non-birthing parents), all of which may vary depending on the specific nature of your employment with People Inc. and your work location. We also offer voluntary benefits such as pet insurance, accident, critical and hospital indemnity health insurance coverage, life and disability insurance. #NMG#
    $140k-170k yearly Auto-Apply 60d+ ago
  • Senior Data Engineer

    Outfront Media 4.7company rating

    Day, NY jobs

    About OUTFRONT We are one of North America's most innovative media companies. We leverage the power of creative excellence, unbeatable locations and smart audience data to change the game for advertisers. Our purpose as a company is to help people, places and businesses grow stronger. To do this, we make meaningful connections between brands and people when they are outside of their homes through one of the largest and most diverse sets of out-of-home assets including billboards, transit and mobile displays across the U.S. We connect diverse audiences across over 150 markets and conduct our business considering all our stakeholders, from clients and employees, to the communities where we operate. We are committed to creating a diverse and inclusive work environment that promotes the growth of our people. Come join our industry-leading team! What We Offer OUTFRONT offers a comprehensive benefits program including: Medical, Dental, Vision (including same and opposite-sex domestic partners) HSA and FSA plans, Family Benefits, Pet Benefits 401(k) Plan with an Employer Match Paid Time Off, Commuter Benefits, Educational Assistance Robust Diversity, Equity and Inclusion program including 7 Employee Resource Groups (ERGs) Job Summary At OUTFRONT Media, a leader in Digital Out of Home (DOOH) advertising, we're blending cutting-edge technology with our nationwide network of digital billboards and transit displays to create innovative solutions that push the boundaries of what's possible. We're looking for a passionate and experienced Senior Data Engineer to join our growing Data & Analytics team. If you're excited about building high-performance data pipelines, solving complex problems, and collaborating with both internal and external teams, we want to hear from you! As a Senior Data Engineer, you'll leverage your expertise in cloud technologies, data architecture, and modern data warehouses to design and implement scalable, high-impact solutions that drive real business results. You'll partner with leadership to influence strategic decisions, ensuring our data systems evolve alongside the business to meet future needs. In this role, you'll be the go-to expert for all things cloud and data, playing a key part in shaping innovative, customer-focused solutions. If you're a multi-tasker with strong attention to detail, and you're ready to make a significant impact in a fast-paced, collaborative environment, let's work together to build the future of advertising technology! What You'll Do Build and maintain high-performance data pipelines that enable deeper analysis, reporting, and insights across the business. Design and implement large-scale, data structures that support analytics and data science needs, optimizing for speed, scalability, and reliability. Create seamless data ingestion processes (both real-time and batch) using modern ETL/ELT practices, leveraging cloud technologies and big data tools to ensure smooth data flow. Collaborate with business teams to understand needs, and translate them into flexible, scalable data solutions that grow with the company. Work alongside engineers to establish best practices for data system creation, ensuring data quality, integrity, and proper documentation throughout. Continuously improve reporting and analysis by automating processes, streamlining workflows and empowering teams with self-service data tools for easy access to insights. 
Partner with senior leadership to stay aligned on business objectives, provide regular updates, and make data-driven decisions to drive progress. Be the subject matter expert in your area, understanding the ad tech landscape and identifying ways to leverage OUTFRONT's unique position in the market. Take the initiative to identify, troubleshoot, and resolve challenges that could hinder progress on strategic goals, ensuring the team remains on track to meet both technical and business objectives. Experience 7+ years of experience designing and implementing data warehouse solutions, focusing on scalability and performance. 7+ years of experience writing complex SQL queries and developing robust ETL/ELT pipelines to transform and integrate data. 7+ years of programming experience in languages like Python, focusing on building scalable and efficient solutions. Extensive experience in building and optimizing APIs, containerization, and orchestration with Kubernetes to ensure scalable, high-performance data systems. Hands-on experience with Cloud data warehouses (ideally with Snowflake, Google BigQuery, or AWS Redshift). Proficient in cloud technologies such as GCP or AWS. Experience leading and collaborating with agile teams, utilizing tools like Jira and Confluence, to drive user-centered and iterative development processes. Technical Expertise Expertise in creating reports and dashboards using tools like Tableau, Looker, or Power BI to enable insightful decision-making. Deep knowledge of identity management, data privacy, audience targeting, cross-device graphs, and related technologies. Solid understanding of digital advertising technology, including ad serving, analytics, DSP, DMP, SSP, QA, and targeting. Soft Skills Strong cross-functional collaboration skills, with experience working in complex organizations and aligning teams on data product decisions and trade-offs. Exceptional communication skills, capable of breaking down complex technical concepts and clearly communicating them to stakeholders. Passionate about working with large datasets, skilled at spotting trends and deriving actionable insights to drive business decisions. A quick learner, adept at grasping both business and technical needs to deliver impactful solutions. Education Bachelor of Engineering in Computer Science or equivalent preferred. The salary range for this role is $165,000-$185,000 per year. Compensation is determined during our interview process by assessing a candidate's experience and skills relative to internal peers and market benchmarks evaluated for the scope and responsibilities of the position. Please note that the foregoing compensation information is a good-faith assessment associated with this position only and is provided pursuant to the New York City Salary Transparency Law. To all Recruitment Agencies: OUTFRONT Media LLC does not accept agency and unsolicited resumes. Please do not forward resumes to our OUTFRONT Media employees or any other company location. OUTFRONT Media is not responsible for any fees related to unsolicited resumes. OUTFRONT Media Is An Equal Opportunity Employer All applicants shall receive equal consideration without regard to race, color, religion, gender, marital status, gender identity or expression, sexual orientation, national origin, age, veteran status or disability. Please refer to the OUTFRONT Media Affirmative Action policy statement.
    $165k-185k yearly Auto-Apply 14d ago
