Data Engineer jobs at DICK'S Sporting Goods - 1634 jobs

  • Lead Data Scientist, Search (REMOTE)

    Dick's Sporting Goods · 4.3 company rating

    At DICK'S Sporting Goods, we believe in how positively sports can change lives. On our team, everyone plays a critical role in creating confidence and excitement by personally equipping all athletes to achieve their dreams. We are committed to creating an inclusive and diverse workforce, reflecting the communities we serve. If you are ready to make a difference as part of the world's greatest sports team, apply to join our team today!

    OVERVIEW: Founded in 1948, DICK'S Sporting Goods started as a bait-and-tackle shop in Binghamton, NY and has since expanded into a leading omnichannel retailer with more than 850 locations representing our multiple brands: DICK'S, House of Sport, Golf Galaxy, Public Lands, Going Going Gone, and more. Over the years, our relentless focus on inspiring, supporting and equipping athletes and outdoor enthusiasts to achieve their dreams has allowed us to become the $13B company we are today. We are investing in our future as we embark on a journey from being the best sports retailer in the world to becoming the best sports company in the world. We aim to build the ultimate athlete data set that will power our tools and platforms for the most personalized athlete experiences. Join us as we transform our technology, data and analytics to build next-gen tools and platforms for our athletes and teammates.

    About the Position: Are you a passionate technologist with experience in AI, machine learning, data science and analysis? Are you looking for an opportunity to drive enterprise impact and shape the future of a leading sports retailer with $12B+ in revenue and 800+ physical stores? Do you enjoy working with a highly skilled team of machine learning engineers and scientists, co-creating enterprise-grade AI capabilities? As the Lead Data Scientist, Search, you will be a key technical leader in our athlete and teammate transformation, which aims to deliver a best-in-class customer and teammate search experience through advanced intelligent decisioning tools with AI/GenAI and machine learning at their core. This is an exceptional opportunity to transform the way we deliver omnichannel search by building foundational AI/GenAI capabilities and to do career-defining work in the space.

    Job Purpose: This role requires an emerging technical leader and subject matter expert with strong experience in traditional machine learning algorithms and a deep understanding of the cutting-edge, state-of-the-art (SOTA) AI/GenAI methods used in search engines. As a technical leader, you will influence critical enterprise technical strategies both in the machine learning/AI space and in neighboring integration spaces such as frontend, search engineering, DSP, and backend data systems. You will partner with product, business, and engineering leads to design search systems for future growth and scale, and help them understand the art of the possible in search experiences with AI technology through deep technical design.

    Responsibilities:
    • Lead the design and implementation of advanced algorithms that improve multi-modal search performance, including embeddings, graph learning, deep learning and large language models (LLMs), by refining query understanding, retrieval, ranking and whole-page optimization.
    • Developing & Optimizing Ranking Algorithms: Design and deploy ranking algorithms that go beyond keyword matching to consider content quality, user experience and various site-level factors in determining the order of search results.
    • Search Personalization: Develop and implement AI/ML-driven search personalization algorithms that learn from user behavior and preferences to deliver tailored search results based on user metadata such as location and past site behavior.
    • Expertise in Data Systems: Collaborate with data and search engineers to build robust data pipelines for collecting, transforming and managing the massive datasets required for efficient operation of search systems, including the unique challenges of integrating and synchronizing multimodal data streams.
    • Search Architecture Design & Optimization: Collaborate with software engineers and architects to design and optimize the underlying infrastructure supporting the search engine.
    • Performance Monitoring & Optimization: Develop and implement tools and processes to monitor the performance of search systems in real time, detecting anomalies, ensuring data quality, and addressing technical issues to maintain the efficiency and effectiveness of the search system.
    • Experimentation & A/B Testing: Collaborate with analytics teams to design, conduct and analyze controlled experiments that evaluate search performance, yielding deeper insights that help validate hypotheses about the impact of search algorithms.
    • Research & Development of Emerging Technologies: Stay current with the latest advancements in AI, ML and search technologies and explore opportunities to incorporate these innovations into search systems.
    • Mentor junior scientists and machine learning engineers in their technical work and help establish a deep technical community invested in improving search experiences.
    • Work collaboratively across functions to drive scalable, robust technical solutions and be part of the engineering technical leader community.

    Job Requirements:
    • 6+ years of experience in the field, with at least 2-3 years as the main technical lead on related projects (strongly preferred)
    • Experience with varied search systems, such as Elastic and SOLR, with a deep understanding of their underlying infrastructure and customization capabilities
    • Experience as the technical lead of multiple projects at the same time, responsible for delivery and business metrics
    • Experience in API engineering, including designing, developing and maintaining APIs
    • Extensive experience with common machine learning and deep learning frameworks such as TensorFlow, PyTorch, OpenAI, and LangChain
    • Expert understanding of Python and other common languages
    • Expert-level experience with big data technologies including, but not limited to, Spark, Kafka, and distributed systems computing
    • Experience in an Agile working environment with at least one related project management tool (Azure DevOps, Jira, etc.)
    • Previous experience mentoring, training, and developing junior members of the team through technical influence
    • Experience with software engineering principles as they relate to machine learning systems
    • Comfortable presenting results to, and influencing, senior and executive leadership on strategic technical decisions from the lens of science
    • Deep understanding of SOTA machine learning models in the overall search space; a bonus for specific experience with cutting-edge multi-modal systems
    • Brings a collaborative, problem-solving and growth mindset to all interactions, with a strong focus on delivery

    QUALIFICATIONS:
    Education: Master's Degree or equivalent level in a quantitative field such as computer science, engineering, physics, or mathematics preferred.
    General Experience: Substantial general work experience together with comprehensive job-related experience in own area of expertise to a fully competent level (over 6 years to 10 years).

    #LI-FD1

    VIRTUAL REQUIREMENTS: At DICK'S, we thrive on innovation and authenticity. That said, to protect the integrity and security of our hiring process, we ask that candidates do not use AI tools (like ChatGPT or others) during interviews or assessments. To ensure a smooth and secure experience, please note the following: Cameras must be on during all virtual interviews. AI tools are not permitted to be used by the candidate during any part of the interview process. Offers are contingent upon a satisfactory background check, which may include ID verification. If you have any questions or need accommodations, we're here to help. Thanks for helping us keep the process fair and secure for everyone!

    Targeted Pay Range: $95,200.00 - $158,800.00. This is part of a competitive total rewards package that could include other components such as incentive, equity and benefits. Individual pay is determined by a number of factors including experience, location, internal pay equity, and other relevant business considerations. We review all teammate pay regularly to ensure competitive and equitable pay. DICK'S Sporting Goods complies with all state paid leave requirements. We also offer a generous suite of benefits. To learn more, visit *********************************
    $95.2k-158.8k yearly Auto-Apply 36d ago
  • Senior Software Engineer, Loyalty - Multiple Openings (REMOTE)

    Dick's Sporting Goods · 4.3 company rating

    At DICK'S Sporting Goods, we believe in how positively sports can change lives. On our team, everyone plays a critical role in creating confidence and excitement by personally equipping all athletes to achieve their dreams. We are committed to creating an inclusive and diverse workforce, reflecting the communities we serve. If you are ready to make a difference as part of the world's greatest sports team, apply to join our team today!

    OVERVIEW: At DICK'S Sporting Goods, we take a people-centric approach to everything we do. Our Athletes, how we refer to customers, and our Teammates, how we refer to our employees, are at the center of every decision we make so that we can provide transformational experiences online, in store, and in sport. When you join Technology at DICK'S Sporting Goods, you're joining a true team that wins together. We help our Athletes and fellow Teammates better their best by innovating solutions to interesting business problems and empowering every Technology Teammate to be an innovator. And, while we work remotely from all over the United States, we provide virtual and in-person events for the team to hang out, from virtual escape rooms to cheering on the Pittsburgh Pirates at beautiful PNC Park.

    JOB PURPOSE: As a Senior Software Engineer, you'll play a crucial role in enhancing our Athlete experience through sophisticated and impactful technology solutions. You'll be responsible for designing and implementing high-quality software that drives loyalty and engagement, while also mentoring and guiding less experienced engineers to achieve their full potential. Your expertise will help shape the technical direction of the Loyalty program, ensuring our innovations are both effective and scalable. You'll have the chance to solve complex problems and contribute to a culture of continuous improvement. Your contributions will directly impact how Athletes interact with our brand, creating meaningful and memorable experiences.

    In this role, you will:
    • Build and enable a best-in-class Loyalty platform
    • Consider the Athlete experience as a central part of designs and solutions
    • Collaborate with product managers, designers, architects, and other engineers
    • Provide data and visibility into the performance and health of processes and services
    • Deliver complex tasks to production, working independently when required, but usually through coordination with your team members
    • Support and mentor peers in an ad-hoc manner
    • Take the initiative to drive projects
    • Learn how the work of the team impacts other areas of the business

    Desired Skills & Experience:
    • Willingness to continuously learn, experiment, and innovate
    • Experience with modern practices including microservices, test-driven development, CI/CD, application scalability, and reliability
    • Understanding of cloud-native and event-based architecture: Microsoft Azure, Kubernetes, Tanzu Application Platform, and Kafka
    • Knowledge of Java, Spring Boot, and Angular
    • Experience with relational and document databases

    RESPONSIBILITIES:
    • Software Development: Drive development of existing software and contribute to development of new software by analyzing and identifying areas for modification and improvement. Develop software that is fast, secure and reliable to meet defined requirements.
    • Software Maintenance: Monitor, identify, and correct more complex software defects to maintain fully functioning software, leveraging the support and skill of more junior teammates.
    • Design and Conceptualization: Produce multiple concepts and prototypes to design digital products/services.
    • Technical Developments Recommendation: Research and suggest ways to optimize solutions to better meet user and/or business, performance, and quality needs.
    • Software Roadmap: Drive the maintenance road map to facilitate software development and ensure the development work is prioritized in line with business requirements.
    • Faults Diagnosis and Correction: Find root cause and resolution to limit and address issues promptly.
    • Work Scheduling and Allocation: Assign short-term work schedules to a team based on storyboarding/backlog to achieve expectations while following established timelines.
    • Ongoing Learning and Development: Develop own and more junior team members' capabilities by participating in assessment and development planning activities as well as formal and informal training and coaching; gain or maintain external professional accreditation where relevant to improve performance and fulfill personal potential. Maintain an understanding of relevant technology, external regulation, and industry best practices through ongoing education, attending conferences, and reading specialist media.
    • Program/Portfolio Management Support: Contribute to work within an established program management plan to achieve specific goals.
    • Technical Persistence Layer/Legacy Database Design/Development: Guide and deliver the design and distribution of basic database resources and provide physical modeling and design services to tune database solutions for optimum performance.
    • Functional/Technical Requirements: Support the collection of functional requirements using document analysis and workflow analysis to express the requirements in terms of target user roles and goals.

    BEHAVIORAL COMPETENCIES:
    • Tech Savvy: Anticipates and adopts innovations in business-building digital and technology applications. For example, investigates technologies to learn some cutting-edge best practices. Uses digital/social media to benefit the team and add value to the work being done; understands how to avoid misuse of these tools.
    • Courage: Steps up to address difficult issues, saying what needs to be said. For example, shares own ideas and points of view openly, regardless of potential criticism or risk; shows conviction when faced with adversity and challenges; raises difficult topics to be sure they are addressed.
    • Decision Quality: Makes good and timely decisions that keep the organization moving forward. For example, knows when to act independently and when to escalate issues. Integrates various inputs, decision criteria, and trade-offs to make effective decisions. Typically makes good independent decisions.
    • Action Oriented: Takes on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm. For example, takes timely action on important or difficult issues. Identifies and pursues new opportunities that benefit the organization.
    • Collaborates: Builds partnerships and works collaboratively with others to meet shared objectives. For example, readily involves others to accomplish goals; stays in touch and shares information; discourages "us versus them" thinking; shows appreciation for others' ideas and input.
    • Instills Trust: Gains the confidence and trust of others through honesty, integrity, and authenticity. For example, demonstrates integrity, upholding professional codes of conduct. Instills trust by following through on agreements and commitments despite competing priorities and by being honest and straightforward.
    • Customer Focus: Builds strong customer relationships and delivers customer-centric solutions. For example, keeps in contact with customers to ensure problems are resolved, or to improve customer service. Studies customer feedback and emerging customer needs and uses these to determine some creative new ideas.

    QUALIFICATIONS:
    Education: Bachelor's degree or equivalent level preferred.
    General Experience: Experience enables job holder to deal with the majority of situations and to advise others (over 3 years to 6 years).
    Managerial Experience: Basic experience of coordinating the work of others (4 to 6 months).

    #LI-JN1

    VIRTUAL REQUIREMENTS: At DICK'S, we thrive on innovation and authenticity. That said, to protect the integrity and security of our hiring process, we ask that candidates do not use AI tools (like ChatGPT or others) during interviews or assessments. To ensure a smooth and secure experience, please note the following: Cameras must be on during all virtual interviews. AI tools are not permitted to be used by the candidate during any part of the interview process. Offers are contingent upon a satisfactory background check, which may include ID verification. If you have any questions or need accommodations, we're here to help. Thanks for helping us keep the process fair and secure for everyone!

    Targeted Pay Range: $83,000.00 - $138,200.00. This is part of a competitive total rewards package that could include other components such as incentive, equity and benefits. Individual pay is determined by a number of factors including experience, location, internal pay equity, and other relevant business considerations. We review all teammate pay regularly to ensure competitive and equitable pay. DICK'S Sporting Goods complies with all state paid leave requirements. We also offer a generous suite of benefits. To learn more, visit *********************************
    $83k-138.2k yearly Auto-Apply 12d ago
  • Lead, Data Scientist

    Petco · 4.1 company rating

    San Antonio, TX

    Create a healthier, brighter future for pets, pet parents and people! If you want to make a real difference, create an exciting career path, feel welcome to be your whole self and nurture your wellbeing, Petco is the place for you. Our core values capture that spirit as we work to improve lives by doing what's right for pets, people and our planet:
    • We love all pets like our own
    • We're the future of the pet industry
    • We're here to improve lives
    • We drive outstanding results together
    • We're welcome as we are

    Petco is a category-defining health and wellness company focused on improving the lives of pets, pet parents and Petco partners. We are 29,000 strong and operate 1,500+ pet care centers in the U.S., Mexico and Puerto Rico, including 250+ Vetco Total Care hospitals, hundreds of preventive care clinics and eight distribution centers. We're focused on purpose-driven work, and strongly believe what's good for pets, people and our planet is good for Petco.

    Position Purpose: As Lead Data Scientist on Petco's Enterprise Analytics and Data Science team, you will spearhead the development and deployment of scalable, production-grade machine learning models that enable personalized membership and digital experiences. You will partner closely with stakeholders on the membership and digital teams to plan and execute data science initiatives that support customer engagement, loyalty growth, and digital performance.

    Essential Job Functions: The incumbent must be able to perform all of the following duties and responsibilities with or without a reasonable accommodation.
    • Build, maintain, optimize, and productionize machine learning models and advanced algorithms that enhance membership and digital customer experiences.
    • Generate and test hypotheses and analyze and interpret results of experiments.
    • Communicate insights and recommendations through clear written reports and verbal presentations to audiences of varying technical sophistication.
    • Improve upon existing methodologies by developing new data sources, testing model enhancements, and refining model parameters.
    • Define requirements and collaborate with engineering teams to develop analytic capabilities, platforms, and pipelines that enable scalable modeling.
    • Develop and monitor KPIs across multiple business areas, ensuring accurate tracking and performance measurement.
    • Provide leadership, including guiding and training, to more junior data scientists.

    Education and Experience:
    • MS or PhD in Statistics, Math, Engineering, Economics, or a related quantitative field.
    • 8+ years of experience in business analytics and data science.
    • Previous experience working with machine learning/deep learning on real-world problems in a corporate environment is a must.
    • Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
    • Proficiency with SQL and Python required; proficiency in R is a strong plus.
    • Ability to explain and present complicated/advanced analytical methodology and results to non-technical audiences.
    • Advanced quantitative modeling, statistical analysis, and critical thinking skills.
    • Related experience in a retail organization is a strong plus.

    #CORP

    Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. The pay ranges outlined below are presented in accordance with state-specific regulations. These ranges may differ in other areas and could be subject to variation based on regulatory minimum wage requirements. Actual pay rates will depend on factors such as position, location, level of experience, and applicable state or local minimum wage laws. If the regulatory minimum wage exceeds the minimum indicated in the pay range below, the regulatory minimum wage will be the minimum rate applied.

    Salary Range: $142,100.00 - $213,100.00 (hourly or salary range will be reflected above). For a more detailed overview of Petco Total Rewards, including health and financial benefits, 401K, incentives, and PTO - see ********************************************

    Petco Animal Supplies, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, protected veteran status, or any other protected classification.
    $142.1k-213.1k yearly 3d ago
  • Senior Gameplay Engineer

    Disney Experiences · 3.9 company rating

    Glendale, CA

    About the Role: Disney Digital Entertainment is on a mission to create the 'digital front door' for The Walt Disney Company, bringing all of the magic of Disney together into a new interactive universe. We are building an expert development team that will build game experiences to herald the next generation of Disney to the world. Working with top-class industry talent, this role is perfect for the accomplished game engineer looking to create something epic: collaborating with an incredible group of game developers focusing on individual experiences to build a wonderfully rich and cohesive product that is truly "Disney". We are looking for a uniquely talented Senior Gameplay Engineer to join us on this journey! If you are an experienced game programmer with a love of Disney/Pixar, Star Wars and Marvel properties, you'll want to check out this opportunity! This is a remote role and will report to the Director, Gameplay Engineering.

    What You Will Do:
    • Leverage your experience and knowledge to help implement several interactive games and experiences based on Disney's robust portfolio of characters and worlds, including Disney/Pixar, Marvel and Star Wars.
    • Be an active, hands-on participant in the process, directly writing code and working daily with design/production/art to establish and achieve goals for each game experience. A significant portion of this work will involve implementation using UEFN (Unreal Editor for Fortnite) and Verse. This role will require a willingness and ability to operate within the limitations of that ecosystem and grow with it as the functionality matures. There may be additional work in the UE5 (Unreal Engine 5) environment; however, the primary responsibility will still be within UEFN/Verse.
    • Empower designers by serving as their main support avenue during the game construction process. Find creative ways to overcome limitations, maintaining a positive outlook along the way.
    • Work closely with other members of the engineering team to ensure that implementation quality is maintained. Be an advocate of stability and flexibility.
    • Champion Disney and team values. Maintain a 'guest-first' mentality by being an advocate for the player experience.
    • Serve as a key member of a growing game development team at Disney.

    Required Qualifications & Skills:
    • 7 years of experience developing console/PC/mobile games or other digital interactive entertainment.
    • Experience with Unreal Engine 4/5+ at the native (C++) level.
    • Participated in the creation and release of a major product in a hands-on programming role: was one of the main authors of a significant gameplay system; served as a programmer during the prototype phase of a project; understands the difference in requirements/goals between prototyping and production.
    • Understands and implements the following concepts at a production-quality, AAA level: C++ code (performance impact, memory management, inheritance, etc.); client/server architecture (replication, client-side prediction, movement syncing, etc.); game mathematics (linear algebra, vector math, kinematic physics, collision, etc.).
    • Able to mentor and guide junior engineers.
    • A Bachelor's degree in Computer Science or an equivalent combination of education and experience.

    Preferred Qualifications:
    • Experience with UEFN/Verse, at least at the hobbyist level.
    • Publishing and supporting live UEFN content.
    • Graphics programming, mobile experience, and familiarity with online services are all bonuses.
    • A broad, current understanding of Fortnite and the various devices that are available for UGC (User Generated Content).
    • Experience developing and supporting products in the AAA space.

    Additional Information: There will be a technical/skill assessment during this hiring process. Disney offers a rewards package to help you live your best life. This includes health and savings benefits, educational opportunities, and special extras that only Disney can provide. Learn more about our benefits and perks at ***************************************

    #LI-REQ #DXMedia #DCPJobs #LI-Remote

    The hiring range for this remote position is $141,900 to $190,300 per year, which factors in various geographic regions. The base pay actually offered will take into account internal equity and also may vary depending on the candidate's geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.
    $141.9k-190.3k yearly Auto-Apply 2d ago
  • Data Analytics Engineer

    Toca Football · 3.2 company rating

    Remote

    In order to be considered for this role, after clicking "Apply Now" above and being redirected, you must fully complete the application process on the follow-up screen.

    About TOCA Soccer: At TOCA, we are passionate about people and the power of sport. We believe in creating an environment that becomes the "third home" for our guests: where they learn, where they live, and where TOCA becomes the place where they play. Whether they're kicking a soccer ball for the first time, focused on finding their best, or rediscovering their passion for the game, we are here to support and guide them every step along the way. Everyone deserves the opportunity to experience the joy and fulfillment that sports can bring, regardless of background and skill level. Our ultimate goal is to create a consistent and amazing experience for everyone who interacts with TOCA, whether it is our dedicated team members or esteemed guests.

    What makes a TOCA Teammate? We value an individual that seeks to:
    • Play Hard
    • Care Deeply
    • Grow Together
    • Strive for Excellence
    • Create Awesome Experiences

    Why you'll love being a part of the TOCA team: You'll have full access to our TOCA Treats, which include (but are not limited to!):
    • Competitive Pay & On-Demand Pay
    • Part-Time, Flexible Scheduling
    • Career Growth & Development
    • Employee Assistance Program
    • Active & Fit Membership
    • Benefits Hub Discount Marketplace

    So many TOCA Perks we can't name all of them, but we'll try: 4 TOCA Training Sessions, 50% Off Classes, Free Pick/League Play, 1 Free Birthday Party, Food and Beverage Discount, and 2 Free Packages to share with your squad! Whew!

    Job Highlights:
    • Job Title: Data Analytics Engineer
    • Location: Remote
    • Reports To: Vice President of Data & Commercial Analytics
    • Employment Type: Full-Time

    Position Overview: We are hiring a Data Analytics Engineer to join our Data Operations team at TOCA and help scale our analytics platform. This is a hands-on role for someone who enjoys working close to raw data, building well-designed data models, and enabling the business to make better decisions through trusted reporting. You will own core analytics models, integrate AI functionality, partner closely with business stakeholders, and help define best practices for how data is transformed, modeled, and consumed at TOCA. This role is ideal for someone who thrives in an environment where direction is often evolving and enjoys solving open-ended problems creatively.

    Your Game Plan:

    On the Data Field: Analytics Engineering & Modeling (50%)
    • Design and build analytics data models in Snowflake using SQL and dbt
    • Transform raw source data into clean, well-structured analytics models using star schema best practices
    • Build and maintain fact and dimension tables across base, staging, and mart layers
    • Own critical analytics datasets used for executive reporting and business decision-making
    • Ensure data quality, consistency, and reliability across all reporting models

    Team Captain: Cross-Functional Partnership (30%)
    • Partner with stakeholders across Operations, Marketing, Finance, Product, and People teams to translate business questions into analytics solutions
    • Build and maintain reporting datasets used by Sigma, Qlik, and other BI tools
    • Communicate findings and insights to both technical and non-technical audiences
    • Support onboarding of new data sources and model them into the analytics layer

    Off the Field: Platform Growth & AI Enablement (10%)
    • Drive the ongoing integration and optimization of AI features across products, processes, and workflows
    • Contribute to evolving analytics standards, modeling conventions, and best practices
    • Help define how data is transformed, modeled, and consumed across TOCA

    Facility & Culture MVP (10%)
    • Develop and maintain dbt projects using GitHub for version control, pull requests, and code review
    • Participate in and lead code reviews to ensure quality, performance, and maintainability
    • Bring ownership, organization, and a strong work ethic to a fast-paced, evolving environment
    • Thrive in a fully remote, globally distributed team

    What You Bring to the Pitch:

    Required:
    • 4+ years of experience in analytics engineering, data analytics, or business intelligence
    • Advanced SQL skills and strong experience working in Snowflake or similar cloud data warehouses
    • Strong experience building analytics models using dbt
    • Experience working in GitHub-based workflows, including pull requests and code reviews
    • Strong understanding of star schema design, fact and dimension modeling, and analytics layering
    • Strong experience building dashboards and reporting assets in Sigma or similar BI tools such as Qlik or Tableau
    • Experience designing reporting layers that directly support executive and business-facing dashboards
    • Strong written communication skills for documenting work for technical and non-technical audiences
    • Comfort working with a fully remote, globally distributed company

    Preferred Skills:
    • Experience working with other data technologies: HubSpot, Google Analytics
    • Experience ingesting new data sources using Fivetran, Snowflake data shares, or APIs
    • Experience implementing AI strategies and AI integrations
    $95k-138k yearly est. 5d ago
  • ETL / Informatica Architect

    Atria Group · 4.2 company rating

    Palo Alto, CA

    Required Skills:
    • 8+ years of overall Informatica ETL development experience
    • At least 2 years of experience as an Informatica ETL Architect
    • Mastery of data warehousing concepts; candidates should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects
    • Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects
    • Lead technical requirements and technical and data architectures for data warehouse projects
    • Ensure compliance with metadata standards for the data warehouse
    • Strong data analysis, data modeling and ETL architecture skills
    • Able to configure the dev, test, and production environments and promotion processes
    • Platform is Oracle 11g, Informatica 9.0.1
    • Strong business and communication skills

    Additional Information:
    Duration: 3+ months (strong possibility of extension)
    Hire Type: Contract, C2C or 1099
    Rate: DOE
    Visa: H1, GC or USC only!
    Travel Covered: No. Prefer local.
    Apply today!
    $119k-159k yearly est. 60d+ ago
  • Snowflake Data Engineer

    Coach · 4.8 company rating

    New York

    We believe that difference sparks brilliance, so we welcome people and ideas from everywhere to join us in stretching what's possible. At Tapestry, being true to yourself is core to who we are. When each of us brings our individuality to our collective ambition, our creativity is unleashed. This global house of brands - Coach, Kate Spade New York, Stuart Weitzman - was built by unconventional entrepreneurs and unexpected solutions, so when we say we believe in dreams, we mean we believe in making them happen. We're always on a journey to becoming our best, but you can count on this: Here, your voice is valued, your ambitions are supported, and your work is recognized. A member of the Tapestry family, we are part of a global house of brands that has unwavering optimism and is committed to being innovative and wholly inclusive. Visit Our People page to learn more about Tapestry's commitment to equity, inclusion, and diversity.

    Primary Purpose: The ideal candidate is an experienced Data Engineer with a strong background in Snowflake, SQL, and cloud-based data solutions. They are a self-motivated, independent problem-solver who is eager to learn new skills and adapt to changing technologies. Collaboration, performance optimization, and a commitment to maintaining data security and compliance are critical for success in this role.

    The successful individual will leverage their proficiency in Data Engineering to...
    • Develop and manage data models, pipelines, and transformations.
    • Demonstrate proficiency in SQL and Bash scripting.
    • Leverage 5+ years of experience in data engineering or related roles.
    • Collaborate with Data Engineering, Product Engineering, and Product Management teams to align with the product roadmap.
    • Effectively document and communicate technical solutions to diverse audiences.
    • Demonstrate a strong ability to work independently, take ownership of tasks, and drive them to completion.
    • Show a proactive approach to learning new technologies, tools and skills as needed.

    The accomplished individual will...
    • Design, implement, and manage Snowflake Data Warehouse solutions.
    • Create and optimize Snowflake schema designs for performance and scalability.
    • Utilize Snowflake features such as data sharing, cloning and time travel.
    • Analyze and optimize Snowflake compute resources (e.g. virtual warehouses).
    • Optimize queries, indexes and storage for improved performance.
    • Maintain data security and ensure compliance with regulations.
    • Knowledge of SOC 2 compliance standards (preferred).
    • Integrate Snowflake with AWS services, including S3 and Glue (preferred).

    An outstanding professional will have...
    • A Bachelor's degree in Computer Science, Information Systems, or a related field.
    • Snowflake certification (preferred).
    • AWS certification (preferred).
    • Experience with Agile methodologies and the modern software development lifecycle.
    • Familiarity with Python and/or Java for Snowflake-related automation (preferred).
    • Familiarity with Node.js for API development (preferred).

    Our Competencies for All Employees:
    • Courage: Doesn't hold back anything that needs to be said; provides current, direct, complete, and "actionable" positive and corrective feedback to others; lets people know where they stand; faces up to people problems on any person or situation (not including direct reports) quickly and directly; is not afraid to take negative action when necessary.
    • Creativity: Comes up with a lot of new and unique ideas; easily makes connections among previously unrelated notions; tends to be seen as original and value-added in brainstorming settings.
    • Customer Focus: Is dedicated to meeting the expectations and requirements of internal and external customers; gets first-hand customer information and uses it for improvements in products and services; acts with customers in mind; establishes and maintains effective relationships with customers and gains their trust and respect.
    • Dealing with Ambiguity: Can effectively cope with change; can shift gears comfortably; can decide and act without having the total picture; isn't upset when things are up in the air; doesn't have to finish things before moving on; can comfortably handle risk and uncertainty.
    • Drive for Results: Can be counted on to exceed goals successfully; is constantly and consistently one of the top performers; very bottom-line oriented; steadfastly pushes self and others for results.
    • Interpersonal Savvy: Relates well to all kinds of people, up, down, and sideways, inside and outside the organization; builds appropriate rapport; builds constructive and effective relationships; uses diplomacy and tact; can defuse even high-tension situations comfortably.
    • Learning on the Fly: Learns quickly when facing new problems; a relentless and versatile learner; open to change; analyzes both successes and failures for clues to improvement; experiments and will try anything to find solutions; enjoys the challenge of unfamiliar tasks; quickly grasps the essence and the underlying structure of anything.

    Our Competencies for All People Managers:
    • Strategic Agility: Sees ahead clearly; can anticipate future consequences and trends accurately; has broad knowledge and perspective; is future oriented; can articulately paint credible pictures and visions of possibilities and likelihoods; can create competitive and breakthrough strategies and plans.
    • Developing Direct Reports and Others: Provides challenging and stretching tasks and assignments; holds frequent development discussions; is aware of each person's career goals; constructs compelling development plans and executes them; pushes people to accept developmental moves; will take on those who need help and further development; cooperates with the developmental system in the organization; is a people builder.
    • Building Effective Teams: Blends people into teams when needed; creates strong morale and spirit in his/her team; shares wins and successes; fosters open dialogue; lets people finish and be responsible for their work; defines success in terms of the whole team; creates a feeling of belonging in the team.

    Tapestry, Inc. is an equal opportunity and affirmative action employer and we pride ourselves on hiring and developing the best people. All employment decisions (including recruitment, hiring, promotion, compensation, transfer, training, discipline and termination) are based on the applicant's or employee's qualifications as they relate to the requirements of the position under consideration. These decisions are made without regard to age, sex, sexual orientation, gender identity, genetic characteristics, race, color, creed, religion, ethnicity, national origin, alienage, citizenship, disability, marital status, military status, pregnancy, or any other legally-recognized protected basis prohibited by applicable law.

    Americans with Disabilities Act (ADA): Tapestry, Inc. will provide applicants and employees with reasonable accommodation for disabilities or religious beliefs. If you require reasonable accommodation to complete the application process, please contact Tapestry People Services at ************** or ******************************

    #LI-HYBRID

    Visit Tapestry, Inc. at ************************

    Work Setup: Hybrid Flex (in office 1 to 3 days a week)

    BASE PAY RANGE: $140,000.00 TO $170,000.00 Annually
    $140k-170k yearly 60d+ ago
  • Data Engineer - Kafka

    Delhaize America · 4.6 company rating

    Salisbury, NC

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

    Primary Purpose: The Data Engineer II contributes to expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will contribute to our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping, data pipelines, and data modeling to data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will learn to optimize or even re-design our company's data architecture to support our next generation of products and data initiatives. They can take on smaller projects from start to finish, work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors, and trace issues to their source. They develop solutions to a variety of problems of moderate scope and complexity.

    Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.

    Duties & Responsibilities:
    * Solves simple to moderate application errors and resolves application problems, following up promptly with all appropriate customers and IT personnel.
    * Reviews and contributes to QA test plans and supports the QA team during test execution.
    * Participates in developing streaming data applications (Kafka), data transformations and data pipelines.
    * Ensures change control and change management procedures are followed within the program/project as they relate to requirements.
    * Interprets requirement documents and contributes to creating functional design documents as part of the data development life cycle.
    * Documents all phases of work, including gathering requirements, architectural diagrams, and other program technical specifications, using current specified design standards for new or revised solutions.
    * Relates information from various sources to draw logical conclusions.
    * Conducts unit testing on data streams.
    * Conducts data lineage and impact analysis as part of the change management process.
    * Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and new data sources.
    * Creates source-to-target data mappings for data pipelines and integration activities.
    * Assists in identifying the impact of proposed application development/enhancement projects.
    * Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess the scope and impact of business needs.
    * Implements and maintains data governance policies and procedures to ensure data quality, security and compliance.
    * Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed.

    Qualifications:
    * Bachelor's Degree in Computer Science or a technical field; equivalent training/certifications/experience will be considered
    * 3 or more years of equivalent experience in a relevant job or field of technology
    * Experience with Kafka
    * Experience with Databricks or similar

    Preferred Qualifications:
    * Master's Degree in a relevant field of study preferred; additional training or certifications in a relevant field of study preferred
    * Experience in Agile teams and/or a Product/Platform-based operating model
    * Experience in retail or grocery preferred

    #DICEJobs #LI-hybrid #LI-SS1

    Salary Range: $101,360 - $152,040. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.

    At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $101.4k-152k yearly 56d ago
  • Data Engineer (Senior to Staff level)

    Sanity · 4.1 company rating

    Atlanta, GA

    At Sanity, we're building the future of AI-powered Content Operations. Our AI Content Operating System gives teams the freedom to model, create, and automate content the way their business works, accelerating digital development and supercharging content operations efficiency. Companies like Linear, Figma, Cursor, Riot Games, Anthropic, and Morningbrew are using Sanity to power and automate their content operations.

    About the role: We are seeking a talented and experienced Data Engineer (Senior to Staff level) to join our growing data team at a pivotal time in our development. As a key member of our data engineering team, you'll help scale and evolve our data infrastructure to ensure Sanity can make better data-driven decisions. This is an opportunity to work on mission-critical data systems that power our B2B SaaS platform. You'll improve our data pipelines, optimize data models, and strengthen our analytics capabilities using modern tools like Airflow, AirByte, BigQuery, DBT, and RudderStack. Working closely with engineers, analysts, and business stakeholders across US and European time zones, you'll help foster a data-driven culture by making data more accessible, reliable, and actionable across the organization. If you're passionate about solving complex data challenges, have experience scaling data infrastructure in B2B environments, and want to make a significant impact at a fast-growing company, we want to talk to you. This role offers the perfect blend of technical depth and strategic influence, allowing you to shape how Sanity leverages data to drive business success.

    What you'll be doing:

    Data Infrastructure & ETL Development
    • Design, develop, and maintain scalable ETL/ELT pipelines to ensure data is efficiently processed, transformed, and made available across the company.
    • Collaborate with engineering teams to implement and scale product telemetry across our product surfaces.
    • Develop and maintain data models in BigQuery that balance performance, cost, and usability.
    • Establish best practices for data ingestion, transformation, and orchestration, ensuring reliability and efficiency.
    • Orchestrate data workflows to reduce manual effort, improve efficiency, and maintain high data quality standards.

    Collaboration & Cross-Team Partnerships
    • Work closely with data analysts, engineers, and other internal stakeholders to understand their data needs and design robust pipelines that support data-driven decision-making.
    • Build scalable and flexible data solutions that address both current business requirements and future growth needs.
    • Partner with engineering, growth, and product teams to enhance data accessibility and usability.

    Data Observability & Reliability
    • Build and maintain comprehensive monitoring, alerting, and logging systems for all data pipelines and infrastructure.
    • Implement SLAs/SLOs for critical data pipelines and establish incident response procedures.
    • Develop data quality monitoring that proactively detects anomalies, schema changes, and data freshness issues.
    • Create dashboards and alerting systems that provide visibility into pipeline health, data lineage, and system performance.
    • Debug and troubleshoot data issues efficiently using observability tools and data lineage tracking.

    Continuous Improvement & Scalability
    • Monitor and optimize data pipeline performance and costs as data volumes grow.
    • Implement and maintain data quality frameworks and testing practices.
    • Contribute to the evolution of our data infrastructure through careful evaluation of new tools and technologies.
    • Help establish data engineering best practices that scale with our growing business needs.

    This may be you:
    • Remote in Europe or North America (East Coast/ET)
    • 4+ years of experience building data pipelines at scale, with deep expertise in SQL, Python, and Node.js/TypeScript for data engineering workflows
    • Proactive mindset with attention to detail, particularly in maintaining comprehensive documentation and data lineage
    • Strong communication skills with demonstrated ability to collaborate effectively across US and European time zones
    • Production experience with workflow orchestration tools like Airflow, and customer data platforms like RudderStack, ideally in a B2B SaaS environment
    • Proven experience integrating and maintaining data flows with CRM systems like Salesforce, Marketo, or HubSpot
    • Track record of building reliable data infrastructure that supports rapid business growth and evolving analytics needs
    • Experience implementing data quality frameworks and monitoring systems to ensure reliable data delivery to stakeholders

    Nice to have:
    • Experience with product analytics tools like Amplitude, Mixpanel, or PostHog
    • Experience with Google Cloud Platform and BigQuery

    What we can offer:
    • A highly skilled, inspiring, and supportive team
    • A positive, flexible, and trust-based work environment that encourages long-term professional and personal growth
    • A global, multi-culturally diverse group of colleagues and customers
    • Comprehensive health plans and perks
    • A healthy work-life balance that accommodates individual and family needs
    • A competitive stock options program and location-based salary

    Who we are: Sanity.io is a modern, flexible content operating system that replaces rigid legacy content management systems. One of our big differentiators is treating content as data so that it can be stored in a single source of truth, but seamlessly adapted and personalized for any channel without extra effort. Forward-thinking companies choose Sanity because they can create tailored content authoring experiences, customized workflows, and content models that reflect their business. Sanity recently raised an $85m Series C led by GP Bullhound and is also backed by leading investors like ICONIQ Growth, Threshold Ventures, Heavybit and Shopify, as well as founders of companies like Vercel, WPEngine, Twitter, Mux, Netlify and Heroku. This funding round has put Sanity in a strong position for accelerated growth in the coming years. You can only build a great company with a great culture. Sanity is a 200+ person company with highly committed and ambitious people. We are pioneers, we exist for our customers, we are "hel ved" (a Norwegian idiom for being solid and genuine), and we love type two fun! Read more about our values here!

    Sanity.io pledges to be an organization that reflects the globally diverse audience that our product serves. We believe that in addition to hiring the best talent, a diversity of perspectives, ideas, and cultures leads to the creation of better products and services. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, or gender identity.
    $88k-122k yearly est. Auto-Apply 60d+ ago
  • Data Engineer

    Range · 3.7 company rating

    McLean, VA

    Range is creating AI-powered solutions to eliminate financial complexity for our members. We're transforming wealth management through the perfect blend of cutting-edge technology and human expertise. We're obsessed with member experience! We've built an integrated platform that tackles the full spectrum of financial needs (investments, taxes, retirement planning, and estate management), all unified in one intuitive system. Backed by Scale, Google's Gradient Ventures, and Cathay Innovations, we're in hyper-growth mode and looking for exceptional talent to join our starting lineup. We recently raised $60M in our Series C funding and want builders to help scale the company. Every Ranger at this stage is shaping our culture and way of life, from former CEOs and startup founders to experts from leading hedge funds and tech companies. If you're ready to build something that truly matters in financial services, bring your talent to Range. Here, you'll make a genuine impact on how people manage their financial lives while working alongside a team that celebrates wins, makes big decisions, and blazes new trails together.

    About the role: As a Data Engineer at Range, you'll play a central role in building and scaling the data infrastructure that powers our analytics, product insights, and customer experiences. You will design, develop, and maintain robust data pipelines and platforms that ensure data is accurate, secure, and available for both real-time and batch analytics. You'll collaborate closely with product, analytics, data science, and engineering teams to turn raw data into reliable, scalable information that drives business decisions and fuels growth. This role is ideal for someone who thrives on solving technical data challenges in a fast-moving fintech environment.

    We're excited to hire this role at Range's headquarters in McLean, VA. All of our positions follow an in-office schedule Monday through Friday, allowing you to collaborate directly with your team. If you're not currently based in the area, but love what you see, let's discuss relocation as part of your journey to joining us.

    What you'll do with us:
    • Create, maintain, and optimize scalable data pipelines and ETL/ELT workflows to support analytics and product initiatives.
    • Integrate data from various internal and external sources, ensuring seamless ingestion, transformation, and delivery.
    • Help shape our data architecture, including data warehouse/schema design, storage solutions, and workflow orchestration.
    • Optimize queries and storage performance for efficiency at scale, especially as data volume grows with our customer base.
    • Implement data quality checks, monitoring systems, and troubleshooting tools to ensure data reliability and accuracy.
    • Work closely with data scientists, analysts, and cross-functional partners to understand data needs and deliver timely solutions.
    • Maintain clear documentation for pipelines, architecture decisions, and standards to support team alignment and onboarding.

    What will set you apart:
    • 5+ years of experience in Data Engineering
    • BS or BA in Computer Science, Engineering, Statistics, Economics, Mathematics, or another quantitative discipline from a top-tier university
    • Strong proficiency in Python and SQL for building and optimizing data workflows
    • Hands-on experience with cloud data platforms and toolchains (e.g., AWS, GCP, Snowflake, BigQuery, Redshift)
    • Familiarity with data pipeline orchestration tools such as Airflow, dbt, Prefect, or similar
    • Experience with data modeling, schema design, and data warehousing concepts in large-scale environments
    • Strong analytical mindset with excellent problem-solving skills, especially in ambiguous or evolving environments
    • Comfortable working collaboratively across teams and translating business needs into technical solutions
    • Experience working in fintech or other consumer-focused, high-growth technology startup environments

    Benefits:
    • Health & Wellness: 100% employer-covered medical insurance for employees (75% for dependents), plus dental and vision coverage
    • 401(k): Retirement savings program to support your future
    • Paid Time Off: Dedicated time to reset and recharge, plus most federal holidays
    • Parental Leave: Comprehensive leave policy for growing families
    • Meals: Select meals covered throughout the week
    • Fitness: Monthly movement stipend
    • Equity & Career Growth: Early exercise eligibility and a strong focus on professional development
    • Annual Compensation Reviews: Salary and equity refreshes based on performance
    • Boomerang Program: After two years at Range, you can take time away to start your own company. We'll hold your spot for 6 months and pause your equity vesting, which resumes if you return

    Range is proud to be an equal opportunity workplace. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. As a company, we are committed to designing products, building a culture, and supporting a team that reflects the diverse population we serve.
    $91k-128k yearly est. Auto-Apply 41d ago
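    The posting above asks for orchestrated ETL/ELT workflows with data quality checks, naming Airflow among the candidate tools. Below is a minimal sketch of such a pipeline, assuming Airflow 2.x; the DAG name, task logic, and quality rule are hypothetical placeholders, not anything from the posting itself.

# A minimal extract -> validate/transform -> load DAG (illustrative only).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull raw records from a source system; stubbed here with inline data.
    rows = [{"user_id": 1, "amount": 42.0}, {"user_id": 2, "amount": 17.5}]
    context["ti"].xcom_push(key="raw_rows", value=rows)

def transform_and_check(**context):
    rows = context["ti"].xcom_pull(key="raw_rows", task_ids="extract")
    cleaned = [r for r in rows if r["amount"] >= 0]  # basic validation rule
    if not cleaned:  # simple data quality gate: fail the task, alerting on-call
        raise ValueError("quality check failed: no valid rows")
    context["ti"].xcom_push(key="clean_rows", value=cleaned)

def load(**context):
    rows = context["ti"].xcom_pull(key="clean_rows", task_ids="transform_and_check")
    print(f"would load {len(rows)} rows into the warehouse")  # stub load step

with DAG(
    dag_id="member_events_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform_and_check", python_callable=transform_and_check)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3

    Failing the quality-gate task (rather than silently dropping data) is what makes the "monitoring systems" requirement above actionable: a failed task is visible in the scheduler and can page someone.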
  • Data Engineer - Kafka

    Delhaize America 4.6 company rating

    Quincy, MA jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

    Primary Purpose: The Data Engineer II contributes to expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will contribute to our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping and data pipelines to data modeling and, finally, data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will learn to optimize or even re-design our company's data architecture to support our next generation of products and data initiatives. They can take on smaller projects from start to finish, work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors, and trace issues to their source. They develop solutions to a variety of problems of moderate scope and complexity.

    Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.

    Duties & Responsibilities:
    * Solves simple to moderate application errors and resolves application problems, following up promptly with all appropriate customers and IT personnel.
    * Reviews and contributes to QA test plans and supports the QA team during test execution.
    * Participates in developing streaming data applications (Kafka), data transformation and data pipelines.
    * Ensures change control and change management procedures are followed within the program/project as they relate to requirements.
    * Interprets requirement documents and contributes to creating functional design documents as a part of the data development life cycle.
    * Documents all phases of work, including gathering requirements, architectural diagrams, and other program technical specifications, using current specified design standards for new or revised solutions.
    * Relates information from various sources to draw logical conclusions.
    * Conducts unit testing on data streams.
    * Conducts data lineage and impact analysis as a part of the change management process.
    * Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and new data sources.
    * Creates source-to-target data mappings for data pipelines and integration activities.
    * Assists in identifying the impact of proposed application development/enhancement projects.
    * Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess the scope and impact of business needs.
    * Implements and maintains data governance policies and procedures to ensure data quality, security and compliance.
    * Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed.

    Qualifications:
    * Bachelor's Degree in Computer Science or a technical field; equivalent trainings/certifications/experience will be considered
    * 3 or more years of equivalent experience in a relevant job or field of technology
    * Experience with Kafka
    * Experience with Databricks or similar

    Preferred Qualifications:
    * Master's Degree in a relevant field of study; additional trainings or certifications in a relevant field of study
    * Experience in Agile teams and/or a Product/Platform-based operating model
    * Experience in retail or grocery

    #DICEJobs #LI-hybrid #LI-SS1

    Salary Range: $101,360 - $152,040. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.

    At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $101.4k-152k yearly 56d ago
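    The Kafka streaming work this listing describes usually begins with a consumer loop feeding a transform step. A minimal sketch in Python follows, assuming the confluent-kafka client library; the broker address, consumer group, and topic name are placeholders.

# Minimal Kafka consumer loop for a streaming data pipeline (illustrative).
from confluent_kafka import Consumer

conf = {
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "pos-transactions-etl",      # placeholder consumer group
    "auto.offset.reset": "earliest",
}
consumer = Consumer(conf)
consumer.subscribe(["store-transactions"])   # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue                          # no message within the timeout
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # The transform/enrich step of the pipeline would go here
        # (parse, validate, map source fields to target schema).
        print(f"offset {msg.offset()}: {msg.value().decode('utf-8')}")
finally:
    consumer.close()                          # commits offsets and leaves the group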
  • Senior Data Engineer

    Smart Circle International 4.1 company rating

    Newport Beach, CA jobs

    The Senior Data Automation Engineer will help enhance operational efficiency and business insight by designing, developing, and managing automated data processes across Salesforce, SQL Server, Snowflake, and Tableau. This role bridges technical expertise with business strategy - optimizing data pipelines, integrating enterprise systems, and enabling secure, scalable, and reliable automation. This role creates and maintains critical datasets and self-service reporting tools that empower teams with accurate insights to enable better decision making. The ideal candidate is confident serving as a cross-functional expert who will collaborate with stakeholders to streamline workflows, safeguard data quality, and ensure compliance. They bring technical depth and problem-solving agility, with a strong commitment to advancing their data architecture and automation skills.

    The Basics:
    * Location: Remote (WFH)
    * Compensation: $125,000-$145,000/year, commensurate with experience
    * Travel: Up to 10%
    * Reports To: Salesforce and Tableau Developer

    What You'll Do:
    Salesforce Platform:
    * Customize Salesforce data model and automations to align with business requirements
    * Manage data governance and integrate external systems
    * Support users, optimize performance, and ensure compliance
    Tableau and Database Administration:
    * Design dashboards with advanced calculations and manage data layers and security
    * Administer access controls and tune performance
    * Develop scalable databases in SQL Server and Snowflake
    * Optimize SQL scripts, support reporting, and maintain ETL pipelines
    Enterprise Scheduling & Automation:
    * Design and schedule automations and workflows
    * Monitor and optimize job performance and implement alerting mechanisms
    * Maintain documentation and security protocols
    * Collaborate with stakeholders to enhance automation reliability and effectiveness

    Qualifications and Expertise:
    * Proven expertise with Salesforce, Tableau, Microsoft SQL Server, Azure SQL, Azure Data Factory, and Snowflake
    * Demonstrated ability to design and manage automated data pipelines across multi-platform environments
    * Strong experience developing datasets, procedures, and functions in SQL-based systems (Microsoft SQL Server, Snowflake)
    * Hands-on experience creating self-service, sales-facing reports in Tableau
    * Solid understanding of data governance, system integration (APIs, middleware), and automation best practices
    * Skilled at collaborating across technical and business teams to deliver scalable data solutions
    * Bachelor's degree in Computer Science, Business, or related field
    * 3+ years of recent, relevant experience in data engineering and analytics
    * Salesforce or Snowflake certifications (preferred)
    * Experience with Python integration and notebooks (preferred)

    Smart Circle International is a leading broker of outsourced sales and customer acquisition services. We help clients and independently owned and operated sales companies grow together through versatile in-person marketing and sales campaigns inside retailers, businesses, and through door-to-door canvassing. Our corporate offices are in Newport Beach and Toronto. Visit smartcircle.com to learn more!
    Total Rewards: Full-time positions qualify for a benefits package that includes vacation, sick leave, paid holidays, medical (with an HSA plan option), dental, vision and company-paid Basic Life insurance, opportunity to enroll in Voluntary Life plans, Employee Assistance Program, 401K with employer match, employee referral program, home office stipend and opportunities for team building, growth, and development. Team members have on-demand access to an LMS with a variety of courses to further their professional and personal development.

    Equal Opportunity Employer: We believe in equal opportunity. Each team member is recruited, employed, evaluated, and considered for promotion without regard to race, color, national origin, age, sex, disability status, or any other protected characteristic under state or federal law. We will not tolerate discrimination or harassment based on any protected characteristic and expect all team members to treat others with dignity and respect.

    In accordance with the applicable law, the following represents a good faith estimate of the minimum and maximum compensation range for this position: the estimated annual compensation range for this role is $125k - $145k/year. The compensation range reflects the Company's reasonable expectation at the time of posting. Exact compensation for this role will be determined based on permissible, non-discriminatory factors such as candidates' qualifications, skills, and experience.

    DISCLOSURE TO JOB APPLICANTS PURSUANT TO THE CALIFORNIA CONSUMER PRIVACY ACT: As part of your job application and our evaluation of your candidacy, we collect, receive, maintain, and use your personal information, which as used herein, means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with you or your household. The following is the personal information we may collect as part of the application process: Personal contact details, such as name, title, address, telephone number, and email address; and Application information, such as your qualifications, skills, education, references, and other information that may be in your resume, cover letter, and materials you provide to us when applying for employment. We collect your personal information to evaluate your job application and candidacy for employment, to check your eligibility to work in the country in which you have applied, for background checks, and to comply with employment and other laws. If you become employed by us, we will notify you of additional categories of personal information that we collect, receive, and maintain for business purposes.

    Qualifications - What You Bring:
    * A minimum of 5 years of strong hands-on experience with Salesforce, Tableau, Microsoft SQL Server, and Snowflake
    * Proven experience delivering self-service, sales-facing reports in Tableau
    * Demonstrated ability to manage data pipelines in multi-platform environments
    * Deep understanding of data governance, system integration, and automation
    * Excellent communication and collaboration skills across technical and non-technical teams
    * Experience with system integration tools and strategies (e.g., APIs, middleware)
    * Salesforce Platform and Administration certifications (preferred)
    * Snowflake certifications (preferred)
    * Strong leadership, communication, and problem-solving abilities in a sales operations environment

    Minimum Qualifications:
    * Bachelor's degree in Business Administration, Computer Science, or a related field
    * At least 5 years of recent experience developing datasets, procedures, and functions in Microsoft SQL Server and Snowflake

    Our Commitment: Smart Circle International is passionate about creating an inclusive workplace that promotes and values diversity and celebrates differences. We are committed to creating an environment that fosters growth opportunities for all team members. Wherever practical, Smart Circle International wants team members in the position that best suits their unique abilities, interests, and skills, as well as our business needs. We strongly believe that bringing our team members' diverse backgrounds, cultures, and perspectives together is the best way to serve our clients and the independent sales companies with which we work side by side.
    $125k-145k yearly 17d ago
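    This role centers on automated data movement between SQL Server and Snowflake. Below is a small sketch of one such job in Python, assuming pyodbc for the SQL Server extract and snowflake-connector-python's pandas helpers for the load; every connection string, credential, and table name is a placeholder.

# Extract from SQL Server, lightly transform, load into Snowflake (illustrative).
import pandas as pd
import pyodbc
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract (connection string is a placeholder).
src = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=sqlserver.example.com;"
    "DATABASE=sales;UID=etl_user;PWD=secret"
)
df = pd.read_sql("SELECT account_id, region, amount FROM dbo.daily_sales", src)

# Transform: normalize region codes before loading.
df["REGION"] = df.pop("region").str.upper().str.strip()

# Load (account and credentials are placeholders).
con = snowflake.connector.connect(
    account="xy12345", user="ETL_USER", password="secret",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
success, _, nrows, _ = write_pandas(
    con, df, table_name="DAILY_SALES", auto_create_table=True
)
print(f"loaded={success} rows={nrows}")

    In practice a job like this would run under the enterprise scheduler the posting mentions, with failures raising alerts rather than printing.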
  • Data Engineer - Kafka

    Delhaize America 4.6 company rating

    Chicago, IL jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

    Primary Purpose: The Data Engineer II contributes to expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will contribute to our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping and data pipelines to data modeling and, finally, data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will learn to optimize or even re-design our company's data architecture to support our next generation of products and data initiatives. They can take on smaller projects from start to finish, work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors, and trace issues to their source. They develop solutions to a variety of problems of moderate scope and complexity.

    Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.

    Duties & Responsibilities:
    * Solves simple to moderate application errors and resolves application problems, following up promptly with all appropriate customers and IT personnel.
    * Reviews and contributes to QA test plans and supports the QA team during test execution.
    * Participates in developing streaming data applications (Kafka), data transformation and data pipelines.
    * Ensures change control and change management procedures are followed within the program/project as they relate to requirements.
    * Interprets requirement documents and contributes to creating functional design documents as a part of the data development life cycle.
    * Documents all phases of work, including gathering requirements, architectural diagrams, and other program technical specifications, using current specified design standards for new or revised solutions.
    * Relates information from various sources to draw logical conclusions.
    * Conducts unit testing on data streams (see the sketch after this listing).
    * Conducts data lineage and impact analysis as a part of the change management process.
    * Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and new data sources.
    * Creates source-to-target data mappings for data pipelines and integration activities.
    * Assists in identifying the impact of proposed application development/enhancement projects.
    * Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess the scope and impact of business needs.
    * Implements and maintains data governance policies and procedures to ensure data quality, security and compliance.
    * Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed.

    Qualifications:
    * Bachelor's Degree in Computer Science or a technical field; equivalent trainings/certifications/experience will be considered
    * 3 or more years of equivalent experience in a relevant job or field of technology
    * Experience with Kafka
    * Experience with Databricks or similar

    Preferred Qualifications:
    * Master's Degree in a relevant field of study; additional trainings or certifications in a relevant field of study
    * Experience in Agile teams and/or a Product/Platform-based operating model
    * Experience in retail or grocery

    #DICEJobs #LI-hybrid #LI-SS1

    Salary Range: $101,360 - $152,040. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.

    At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $101.4k-152k yearly 56d ago
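    The duties above include unit testing on data streams. A common pattern is to keep the per-record transform a pure function so it can be tested without a running broker; the function, fields, and validation rule below are illustrative, not from the posting.

# A pure per-record transform plus a pytest-style unit test (illustrative).
import json

def enrich_transaction(raw: bytes) -> dict:
    """Parse a raw POS message and derive a line total; raises on bad input."""
    rec = json.loads(raw)
    if rec["quantity"] < 0:
        raise ValueError("negative quantity")
    rec["total"] = round(rec["quantity"] * rec["unit_price"], 2)
    return rec

def test_enrich_transaction():
    # Exercise the transform exactly as the consumer loop would call it.
    msg = json.dumps({"sku": "123", "quantity": 3, "unit_price": 2.5}).encode()
    out = enrich_transaction(msg)
    assert out["total"] == 7.5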
  • Staff Data Engineer

    Market America 4.5 company rating

    Miami Beach, FL jobs

    Market America, a product brokerage and Internet marketing company that specializes in One-to-One Marketing, is seeking an experienced Staff Data Engineer for our IT team. As a senior member of the Data Engineering team, you will have an important role in helping millions of customers on our SHOP.COM and Market America Worldwide multi-country and multi-language global eCommerce websites intelligently find what they want within different categories, merchant offers, products and taxonomy. We have thousands of 3rd-party affiliates and feeds which go through ETL and data ingestion pipelines before being ingested into our search systems. We have multiple orchestration pipelines supporting various types of data for products, store offers, analytics, customer behavioral profiles, segments, logs and much more. If you are passionate about data engineering - processing millions of records and building ETL processes for products, stores, customers and analytics - this is a highly visible role that will provide you the opportunity to make a huge impact on our business and a difference to millions of customers worldwide. The data engineering team processes large amounts of data that we import and collect. The team works to enrich content, pricing integration, taxonomy assignments and algorithms, category classifier nodes and machine learning integration within the pipeline.

    Key Responsibilities:
    * Must have a minimum of 10-12 years of hands-on development experience implementing batch and event-driven applications using Java, Kafka, Spark, Scala, PySpark and Python
    * Experience with Apache Kafka and Connectors, Java and Spring Boot for building event-driven services, and Python for building ML pipelines
    * Develop data pipelines responsible for ingesting large amounts of different kinds of data from various sources
    * Help evolve data architecture and work on next-generation real-time pipeline algorithms and architecture, in addition to supporting and maintaining current pipelines and legacy systems
    * Write code and develop worker nodes for business logic, ETL and orchestration processes
    * Develop algorithms for better attribution rules and category classifiers
    * Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive search, discovery, and recommendations
    * Work closely with architects, engineers, data analysts, data scientists, contractors/consultants and project managers to assess project requirements and to design, develop and support data ingestion and API services
    * Work with Data Scientists to build feature engineering pipelines and integrate machine learning models during the content enrichment process
    * Able to influence priorities, working with various partners including engineers, the project management office and leadership
    * Mentor junior team members, define architecture, review code, develop hands-on and deliver the work in sprint cycles
    * Participate in design discussions with Architects and other team members for the design of new systems and re-engineering of components of existing systems
    * Wear the Architect hat when required to bring new ideas to the table, thought leadership and forward thinking
    * Take a holistic approach to building solutions by thinking about the big picture and the overall solution
    * Work on moving away from legacy systems into next-generation architecture
    * Take complete ownership from requirements, solution design, development and production launch through post-launch production support; participate in code reviews and regular on-call rotations
    * Desire to apply the best solutions in the industry, apply correct design patterns during development, and learn best practices and data engineering tools and technologies

    Required Skills & Experience:
    * BS or MS in Computer Science (or related field) with 10+ years of hands-on software development experience working on large-scale data processing pipelines
    * Must-have skills are Apache Spark, Scala and PySpark, with 2-4 years of experience building production-grade batch pipelines that handle large volumes of data
    * At least 4+ years of experience in Java and APIs/microservices
    * At least 2+ years of experience in Python
    * 2+ years of experience understanding and writing complex SQL and stored procedures for processing raw data, ETL and data validation, using databases such as SQL Server, Redis and other NoSQL DBs
    * Knowledge of Big Data technologies, Hadoop, HDFS
    * Expertise building event-driven pipelines with Kafka, Java/Spark, Apache Flink
    * Expertise with the Amazon AWS stack, such as EMR, EC2, S3
    * Experience working with APIs to collect and ingest data, as well as building APIs for business logic
    * Experience setting up, maintaining, and debugging production systems and infrastructure
    * Experience building fault-tolerant and resilient systems
    * Experience building worker nodes; knowledge of REST principles and data engineering design patterns
    * In-depth knowledge of Java, Spring Boot, Spark, Scala, PySpark, Python, orchestration tools, ESB, SQL, stored procedures, Docker, RESTful web services, Kubernetes, CI/CD, observability techniques, Kafka, release processes, caching strategies, versioning, B&D, Bitbucket/Git and the AWS Cloud ecosystem, NoSQL databases, Hazelcast
    * Strong software development, architecture diagramming, problem-solving and debugging skills
    * Phenomenal communication and influencing skills

    Nice to Have:
    * Exposure to Machine Learning (ML) and LLM models, using AI during coding, building with AI
    * Knowledge of Elastic APM, the ELK stack and search technologies such as Elasticsearch/Solr
    * Some experience with workflow orchestration tools such as Airflow or Apache NiFi

    Market America offers competitive salary and generous benefits, including health, dental, vision, life, short and long-term disability insurance, a 401(k) retirement plan with company match, and an on-site health clinic. Qualified candidates should apply online. This position will work remotely, based from our Miami, FL offices. Sorry, we are NOT able to sponsor for this position. Market America is proud to be an equal opportunity employer.

    Market America | SHOP.COM is changing the way people shop and changing the economic paradigm so anyone can become financially independent by creating their own economy and converting their spending into earning with the Shopping Annuity.

    ABOUT MARKET AMERICA, INC. & SHOP.COM: Market America Worldwide | SHOP.COM is a global e-commerce and digital marketing company that specializes in one-to-one marketing and is the creator of the Shopping Annuity. Its mission is to provide a robust business system for entrepreneurs, while providing consumers a better way to shop. Headquartered in Greensboro, North Carolina, and with eight sites around the globe, including the U.S., Market America Worldwide was founded in 1992 by Founder, Chairman & CEO JR Ridinger. Through the company's primary, award-winning shopping website, SHOP.COM, consumers have access to millions of products, including Market America Worldwide exclusive brands and thousands of top retail brands. Further, SHOP.COM ranks 19th in Newsweek magazine's 2021 Best Online Shops, No. 52 in Digital Commerce 360's (formerly Internet Retailer) 2021 Top 1,000 Online Marketplaces, No. 79 in Digital Commerce 360's 2021 Top 1,000 Online Retailers and No. 11 in the 2021 Digital Commerce 360 Primary Merchandise Category Top 500. The company is also a two-time winner of the Better Business Bureau's Torch Award for Marketplace Ethics and was ranked No. 15 in The Business North Carolina Top 125 Private Companies for 2021. By combining Market America Worldwide's entrepreneurial business model with SHOP.COM's powerful comparative shopping engine, Cashback program, Hot Deals, ShopBuddy, Express Pay checkout, social shopping integration and countless other features, the company has become the ultimate online shopping destination. For more information about Market America Worldwide: MarketAmerica.com For more information on SHOP.COM, please visit: SHOP.COM
    $81k-102k yearly est. 14d ago
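    The posting above emphasizes Spark batch pipelines that ingest third-party product feeds for search. Here is a minimal PySpark sketch of that kind of feed-ingestion step; the S3 paths, column names, and price-band rule are hypothetical.

# Ingest a raw affiliate feed, normalize it, and write Parquet (illustrative).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("affiliate-feed-ingest").getOrCreate()

# Read a raw third-party product feed (path and schema are placeholders).
raw = spark.read.json("s3://feeds/affiliates/2024-06-01/*.json")

# Normalize and enrich: drop bad prices, trim titles, and derive a price band
# that a downstream taxonomy/classification step could use.
products = (
    raw.filter(F.col("price") > 0)
       .withColumn("title", F.trim(F.col("title")))
       .withColumn(
           "price_band",
           F.when(F.col("price") < 25, "low")
            .when(F.col("price") < 100, "mid")
            .otherwise("high"),
       )
)

# Write partitioned Parquet for the search-ingestion stage to pick up.
products.write.mode("overwrite").partitionBy("price_band").parquet(
    "s3://warehouse/products/enriched/"
)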
  • Retail Data Engineer

    Racetrac 4.4 company rating

    Atlanta, GA jobs

    The Retail Data Engineer plays a critical role in managing the flow, transformation, and integrity of scan-level data within our retail data ecosystem. This role ensures that raw transactional data such as point-of-sale (POS), promotional, loyalty, and product data is clean, consistent, and fit for analysis. This individual will collaborate closely with merchandising, marketing, operations, and analytics teams to deliver trusted data that powers key business decisions in a dynamic retail environment.

    What You'll Do:
    * Works with business teams (e.g., Category Management, Marketing, Supply Chain) to define and refine data needs.
    * Identifies gaps or ambiguities in retail scan data (e.g., barcode inconsistencies, vendor mappings).
    * Translates complex retail requirements into technical specifications for data ingestion and transformation.
    * Develops, schedules, and optimizes ETL/ELT processes to ingest large volumes of scan data (e.g., from POS, ERP, loyalty programs).
    * Applies robust transformation logic to normalize data across vendors, stores, and systems.
    * Works with both structured and semi-structured retail datasets (CSV, JSON, EDI, etc.).
    * Implements data validation, reconciliation, and anomaly detection for incoming retail data feeds.
    * Designs and maintains audit trails and data lineage for scan data.
    * Investigates and resolves data discrepancies in collaboration with store systems, IT, and vendors.
    * Conducts exploratory data analysis to uncover trends, seasonality, anomalies, and root causes.
    * Supports retail performance reporting, promotional effectiveness, and vendor analytics.
    * Provides clear documentation and logic traceability for analysts and business users.
    * Collaborates with cross-functional teams such as merchandising, inventory, loyalty, and finance to support retail KPIs and data insights.
    * Acts as a data subject matter expert for scan and transaction data.
    * Provides guidance on best practices for data usage and transformation in retail contexts.

    What We're Looking For:
    * Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related field.
    * 3-5 years of experience in data engineering, ideally in a retail environment.
    * Experience working with scan-level data from large retail chains or CPG vendors.
    * Familiarity with retail ERP systems (e.g., SAP, Oracle), merchandising tools, and vendor data feeds.
    * Expert-level SQL and experience working with retail schema structures (e.g., SKUs, UPCs, store IDs).
    * Proficient in data pipeline and orchestration tools such as dbt, Airflow, Fivetran, or Apache Spark.
    * Experience with cloud-based data platforms (Snowflake, Google BigQuery, Azure Synapse, AWS Redshift).
    * Familiarity with retail concepts such as POS systems, promotional pricing, markdowns, units vs. dollars sold, sell-through rates, and planogram compliance.
    * Understanding of master data management (MDM) for products, stores, and vendors.
    * Experience with data profiling and quality frameworks (e.g., Great Expectations, Soda, Monte Carlo).

    Qualifications: All qualified applicants will receive consideration for employment with RaceTrac without regard to their race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status, or any other characteristic protected by local, state, or federal laws, rules, or regulations.
    $85k-107k yearly est. Auto-Apply 60d+ ago
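    Validation and anomaly detection on incoming scan feeds, as described above, often starts with simple rule-based checks before heavier frameworks like Great Expectations are introduced. A small pandas sketch follows; the file name, column names, and thresholds are illustrative.

# Flag suspect rows in a daily scan-data feed before it lands in the warehouse.
import pandas as pd

feed = pd.read_csv("daily_scan_feed.csv", dtype={"upc": str, "store_id": str})

issues = {}
# UPC-A barcodes are 12 digits; anything else needs vendor-mapping review.
issues["bad_upc"] = feed[~feed["upc"].str.fullmatch(r"\d{12}", na=False)]
# Negative unit counts usually indicate returns routed into the wrong feed.
issues["negative_units"] = feed[feed["units"] < 0]
# Implausible implied unit price (threshold is an illustrative cutoff).
issues["price_outlier"] = feed[
    (feed["units"] > 0) & ((feed["sales_dollars"] / feed["units"]) > 10_000)
]

for name, rows in issues.items():
    if not rows.empty:
        print(f"{name}: {len(rows)} rows flagged for review")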
  • Data Engineer

    Crystal Management 4.3 company rating

    Vienna, VA jobs

    CMiT is seeking an experienced Data Engineer to help lead the analysis, documentation, and migration of a ~2TB Oracle database environment to PostgreSQL. This role involves mapping complex data dependencies across multiple Oracle schemas, moving from legacy data centers to PostgreSQL RDS in AWS environments. This role offers the opportunity to play a pivotal part in a mission-critical database modernization initiative, working with enterprise-scale data systems while building expertise in both traditional and cloud-native database technologies.

    Responsibilities:
    * Perform comprehensive analysis of Oracle database structures across multiple schemas and systems
    * Map table relationships, constraints, and dependencies within the ~2TB environment
    * Document data flows, lineage, and integration points with legacy systems
    * Create detailed migration documentation and data dictionaries in Confluence
    * Design optimized PostgreSQL target schemas for performance and maintainability
    * Develop migration strategies for complex Oracle features (PL/SQL, packages, materialized views)
    * Create and execute data migration scripts with comprehensive validation procedures
    * Implement hybrid migration approaches for legacy data center and AWS environments
    * Design data synchronization strategies between on-premises and cloud environments
    * Implement secure data transfer mechanisms and connectivity solutions
    * Ensure data consistency and integrity across distributed PostgreSQL deployments

    Qualifications:
    Education/Certification Required: A minimum of a Bachelor's Degree. Experience or education requirements may be met through an equivalent number of combined years of education or experience.

    Required Qualifications:
    * 5+ years of Oracle database administration and development experience
    * 3+ years of hands-on PostgreSQL experience with advanced features and performance tuning
    * Experience with large-scale database migrations (multi-TB environments)
    * Expert-level SQL skills across both Oracle and PostgreSQL platforms
    * Proficiency in scripting languages (Python, Bash, PowerShell) for automation
    * Experience with database migration tools (AWS DMS, ora2pg, pgloader)
    * Knowledge of ETL/ELT tools and data pipeline frameworks
    * Hands-on experience with AWS database services (RDS, Aurora PostgreSQL)
    * Comfortable with a strong culture of documentation
    * Proficiency with Confluence for technical documentation, knowledge management and sharing
    * Ability to create clear data flow diagrams and system architecture documentation
    * Experience with data lineage tools and dependency mapping
    * Proficiency with diagramming tools (Visio, PowerPoint, etc.)
    * Proven track record of successful Oracle to PostgreSQL migrations
    * Experience with legacy enterprise systems and data integration patterns
    * Understanding of compliance and audit requirements for data migrations
    * Strong collaboration skills with cross-functional teams, with the ability to ensure client needs and expectations are met or exceeded
    * Comfortable working in a remote environment
    * Demonstrates a passion for solving complex software challenges and enjoys collaborative teamwork

    Preferred Qualifications:
    * AWS certifications (Database Specialty, Solutions Architect)
    * PostgreSQL and/or Oracle certifications or equivalent expertise
    * Experience with data governance and master data management
    * Knowledge of performance monitoring tools (CloudWatch, pgAdmin, Oracle Enterprise Manager)
    * Ability to obtain DHS/TSA clearance

    Clearance Required: Active TSA clearance required
    $74k-104k yearly est. 17d ago
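    The role above calls for migration scripts with "comprehensive validation procedures." One of the simplest and most common checks is table-by-table row-count reconciliation between source and target; a sketch follows, assuming the python-oracledb and psycopg2 drivers. Hosts, credentials, and the table list are placeholders.

# Post-migration validation: compare Oracle and PostgreSQL row counts (illustrative).
import oracledb
import psycopg2

# Hardcoded table list keeps the f-string SQL safe from injection here.
TABLES = ["customers", "orders", "order_items"]

ora = oracledb.connect(user="app", password="secret", dsn="legacy-db:1521/ORCL")
pg = psycopg2.connect(host="mydb.rds.amazonaws.com", dbname="app",
                      user="app", password="secret")

for table in TABLES:
    with ora.cursor() as oc, pg.cursor() as pc:
        oc.execute(f"SELECT COUNT(*) FROM {table}")
        pc.execute(f"SELECT COUNT(*) FROM {table}")
        src, dst = oc.fetchone()[0], pc.fetchone()[0]
        status = "OK" if src == dst else "MISMATCH"
        print(f"{table}: oracle={src} postgres={dst} {status}")

    Row counts alone won't catch value-level corruption, so real validation usually adds per-column checksums or sampled row comparisons on top of this.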
  • Data Engineer

    Schwarz Partners 3.9 company rating

    Carmel, IN jobs

    Schwarz Partners has an exciting opportunity available for a Data Engineer in Carmel, IN! Data Engineers build data pipelines that transform raw, unstructured data into formats that can be used for analysis. They are responsible for creating and maintaining the analytics infrastructure that enables almost every other data function. This includes architectures such as databases, servers, and large-scale processing systems. A Data Engineer uses different technologies to collect and map an organization's data landscape to help decision-makers find cost savings and optimization opportunities. In addition, Data Engineers use this data to display trends in collected analytics information, encouraging transparency with stakeholders.

    Schwarz Partners is one of the largest independent manufacturers of corrugated sheets and packaging materials in the U.S. Through our family of companies, we continuously build and strengthen our capabilities. You'll find our products wherever goods are packaged, shipped, and sold-from innovative retail packaging to colorful in-store displays at pharmacies and grocers. You also may have spotted our trucks on the highway. Schwarz Partners is built around the idea that independence and innovation go hand in hand. Our structure allows us to adapt to change quickly, get new ideas off the ground, and thrive in the marketplace. Our people are empowered to tap into their talents, build their skills, and grow with us.

    ESSENTIAL JOB FUNCTIONS FOR THIS POSITION:
    * Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud technologies.
    * Assemble large, complex sets of data that meet non-functional and functional business requirements.
    * Identify, design, and implement internal data-related process improvements.
    * Work with stakeholders, including data, design, product and executive teams, and assist them with data-related technical issues.
    * Conduct configuration and design of applications to better leverage enterprise data.
    * Prepare data for prescriptive and predictive modeling.
    * Use effective communication to work with application vendors.
    * Assist in the creation and quality assurance review of design documents and test results to ensure all project requirements are satisfied.
    * Advise on and implement improvements to data warehousing and data workflow architecture.
    * Think outside the box and come up with improvement and efficiency opportunities to streamline business and operational workflows.
    * Document high-level business workflows and transform them into low-level technical requirements.
    * Analyze complex information sets and communicate that information in a clear, well-thought-out and well-laid-out manner.
    * Communicate at varying levels of detail (30,000 ft. view, 10,000 ft. view, granular level) and produce corresponding documentation at varying levels of abstraction.
    * Be an advocate for best practices and continued learning.
    * Communicate with business stakeholders on the status of projects/issues.
    * Prioritize and multi-task between duties at any given time.
    * Solid communication and interpersonal skills.
    * Comply with company policies and procedures and all applicable laws and regulations.
    * General DBA work as needed.
    * Maintain and troubleshoot existing ETL processes.
    * Create and maintain BI reports.
    * Additional duties as assigned.

    REQUIRED EDUCATION / EXPERIENCE: Bachelor's degree in Computer Science or 4+ years' experience in a related field.

    PREFERRED EDUCATION / EXPERIENCE:
    * Experience developing data workflows.
    * Ability to perform prescriptive and predictive modeling.

    REQUIRED SKILLS:
    * Demonstrated experience with SQL in a large database environment.
    * Direct experience utilizing SQL to develop queries or profile data.
    * Experience in quantitative and qualitative analysis of data.
    * Experienced-level skills in Systems Analysis.
    * Experienced-level skills in Systems Engineering.
    * Ability to function as a self-starter.

    REQUIRED MICROSOFT FABRIC SKILLS:
    * Strong grasp of OneLake concepts: lakehouses vs. warehouses, shortcuts, mirroring, item/workspace structure.
    * Hands-on with Delta Lake (Parquet, Delta tables, partitioning, V-ordering, Z-ordering, vacuum retention).
    * Understanding of Direct Lake, Import, and DirectQuery trade-offs and when to use each.
    * Experience designing star schemas and modern medallion architectures (bronze/silver/gold); see the sketch after this listing.
    * Spark/PySpark notebooks (jobs, clusters, caching, optimization, broadcast joins).
    * Data Factory in Fabric (Pipelines): activities, triggers, parameterization, error handling/retries.
    * Dataflows Gen2 (Power Query/M) for ELT, incremental refresh, and reusable transformations.
    * Building/optimizing semantic models; DAX (measures, calculation groups, aggregations).
    * Ability to multi-task, think on his/her feet and react, apply attention to detail and follow-up, and work effectively and collegially with management staff and end users.
    $73k-99k yearly est. 16d ago
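    As referenced in the Fabric skills list above, a medallion architecture promotes raw bronze data into cleaned silver tables. Here is a notebook-style PySpark sketch of one bronze-to-silver step on Delta tables (in Fabric Spark notebooks a `spark` session is already provided); the table and column names are hypothetical.

# Bronze -> silver promotion in a medallion lakehouse (illustrative).
from pyspark.sql import functions as F

# Read the raw bronze Delta table (name is a placeholder).
bronze = spark.table("bronze_sales_raw")

# Silver rules: de-duplicate on the business key, drop rows missing a date,
# cast types, and derive the net amount used by gold-layer reporting.
silver = (
    bronze.dropDuplicates(["order_id"])
          .filter(F.col("order_date").isNotNull())
          .withColumn("order_date", F.to_date("order_date"))
          .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
)

# Persist as a partitioned Delta table for downstream semantic models.
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .saveAsTable("silver_sales"))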
  • Data Scientist

    Nextwave Resources 4.4 company rating

    Durham, NC jobs

    Temp Data Scientist - Boston, MA or NC, NH, RI, TX, or CO (18-month contract - probable extension or permanent conversion)

    Notes:
    * Master's Degree required
    * Python development to build time series models
    * Run SQL queries
    * Linux OS administration
    * Any web development experience preferred
    * Experience working with Artificial Intelligence, machine learning algorithms, neural networks, decision trees, modeling, Cloud Machine Learning, time series analysis and robotic process automation

    Description: We are seeking a hands-on, experienced data scientist with financial services industry experience. As part of a small, nimble team, the associate's key differentiating abilities will be exceptional analytical skills and an ability to conceive of and develop differentiated products for the benefit of customers. Absolutely critical is the associate's ability to carry an initiative from idea through to execution.

    * 5+ years' experience in information security/technology risk management for large-scale, complex IT infrastructures and distributed environments, or an equivalent combination of related training and experience
    * Analytic skills: in addition to the core regression, classification and time series skills that accompany the data science role, experience with next best action (NBA) prediction, multi-armed bandits, online learning, A/B testing, and experimentation methods is preferred
    * Natural programmer, with proven industry experience in statistics and data modeling
    * Experience with one or more of the following tools/frameworks: Python, scikit-learn, nltk, pandas, numpy, R, PySpark, Scala, SQL/big data tools, TensorFlow, PyTorch, etc.
    * Education: at least one advanced degree (Master's or PhD level) in a technical or mathematically oriented discipline, e.g., coursework or experience in fields such as statistics, machine learning, computer science, applied mathematics, econometrics, engineering, etc.
    * Extensive experience in written and oral communications/presentations, and the ability to produce a variety of business documents (business requirements, technical specs, slide presentations, etc.) that demonstrate command of language, clarity of thought, and orderliness of presentation

    We are looking for an expert quantitative developer to advance the research and development of AI/ML methods as components in the delivery of creative investment management technology solutions. You will have experience combining multivariate statistical modeling, predictive machine learning methods and open-source approaches to Cloud computing and Big Data.
    $70k-100k yearly est. 60d+ ago
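    The posting above pairs Python time series modeling with SQL data pulls. Here is a minimal statsmodels sketch of fitting and forecasting such a model; the series is synthetic, standing in for data that would normally come from a SQL query.

# Fit a simple ARIMA model to a daily series and forecast two weeks ahead.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic random-walk-like daily series (placeholder for real query output).
rng = np.random.default_rng(0)
idx = pd.date_range("2023-01-01", periods=365, freq="D")
y = pd.Series(100 + np.cumsum(rng.normal(0, 1, size=365)), index=idx)

# order=(p, d, q): one autoregressive term, one difference, one MA term.
model = ARIMA(y, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=14)
print(forecast.head())

    In practice, model selection (choosing p, d, q, or a seasonal variant) would be driven by diagnostics and backtesting rather than fixed up front.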
