
Data Engineer jobs at Foot Locker - 1495 jobs

  • Lead Data Engineer

    The Friedkin Group (4.8 company rating)

    Houston, TX

    Living Our Values
    All associates are guided by Our Values. Our Values are the unifying foundation of our companies. We strive to ensure that every decision we make and every action we take demonstrates Our Values. We believe that putting Our Values into practice creates lasting benefits for all of our associates, shareholders, and the communities in which we live.

    Why Join Us
    Career Growth: Advance your career with opportunities for leadership and personal development. Culture of Excellence: Be part of a supportive team that values your input and encourages innovation. Competitive Benefits: Enjoy a comprehensive benefits package that looks after both your professional and personal needs.

    Total Rewards
    Our Total Rewards package underscores our commitment to recognizing your contributions. We offer a competitive and fair compensation structure that includes base pay and performance-based rewards. Compensation is based on skill set, experience, qualifications, and job-related requirements. Our comprehensive benefits package includes medical, dental, and vision insurance, wellness programs, retirement plans, and generous paid leave. Discover more about what we offer by visiting our Benefits page.

    A Day In The Life
    As a Lead Data Engineer within the Trailblazer initiative, you will play a crucial role in architecting, implementing, and managing robust, scalable data infrastructure. This position demands a blend of systems engineering, data integration, and data analytics skills to enhance TFG's data capabilities, supporting advanced analytics, machine learning projects, and real-time data processing needs. The ideal candidate brings deep expertise in Lakehouse design principles, including layered Medallion Architecture patterns (Bronze, Silver, Gold), to drive scalable and governed data solutions. This is also a highly visible leadership role that will represent the data engineering function and lead the Data Management Community of Practice across TFG.
As a Lead Data Engineer you will:
• Design and implement scalable and reliable data pipelines to ingest, process, and store diverse data at scale, using technologies such as Apache Spark, Hadoop, and Kafka.
• Work within cloud environments like AWS or Azure to leverage services including but not limited to EC2, RDS, S3, Lambda, and Azure Data Lake for efficient data handling and processing.
• Architect and operationalize data pipelines following Medallion Architecture best practices within a Lakehouse framework, ensuring data quality, lineage, and usability across Bronze, Silver, and Gold layers.
• Develop and optimize data models and storage solutions (Databricks, Data Lakehouses) to support operational and analytical applications, ensuring data quality and accessibility.
• Utilize ETL tools and frameworks (e.g., Apache Airflow, Fivetran) to automate data workflows, ensuring efficient data integration and timely availability of data for analytics.
• Lead the Data Management Community of Practice, serving as the primary facilitator, coordinator, and spokesperson. Drive knowledge sharing, establish best practices, and represent the data engineering discipline across TFG to both technical and business audiences.
• Collaborate closely with data scientists, providing the data infrastructure and tools needed for complex analytical models, leveraging Python or R for data processing scripts.
• Ensure compliance with data governance and security policies, implementing best practices in data encryption, masking, and access controls within a cloud environment.
• Monitor and troubleshoot data pipelines and databases for performance issues, applying tuning techniques to optimize data access and throughput.
• Stay abreast of emerging technologies and methodologies in data engineering, advocating for and implementing improvements to the data ecosystem.
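The Bronze/Silver/Gold layering referenced above can be illustrated with a toy, dependency-free Python sketch. A real implementation would use Spark/Delta tables in Databricks; the record fields and source name here are invented for the example:

```python
# Toy sketch of Medallion-style layering: raw records land in Bronze,
# are cleaned and deduplicated into Silver, and aggregated into Gold.
# Pure-Python stand-in for what would be Spark/Delta tables in practice.

def to_bronze(raw_events):
    """Bronze: store raw input as-is, tagged with ingestion metadata."""
    return [{"raw": e, "source": "orders_api"} for e in raw_events]

def to_silver(bronze):
    """Silver: validate and deduplicate on the business key."""
    seen, silver = set(), []
    for row in bronze:
        e = row["raw"]
        if e.get("order_id") is None or e.get("amount") is None:
            continue  # drop records failing data-quality checks
        if e["order_id"] in seen:
            continue  # deduplicate on order_id
        seen.add(e["order_id"])
        silver.append({"order_id": e["order_id"], "amount": float(e["amount"])})
    return silver

def to_gold(silver):
    """Gold: business-level aggregate ready for analytics."""
    return {"order_count": len(silver),
            "total_amount": sum(r["amount"] for r in silver)}

raw = [{"order_id": 1, "amount": "10.5"},
       {"order_id": 1, "amount": "10.5"},   # duplicate
       {"order_id": 2, "amount": None}]     # fails quality check
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)   # {'order_count': 1, 'total_amount': 10.5}
```

The point of the pattern is that each layer is reproducible from the one below it: Bronze is never mutated, so Silver and Gold can be rebuilt when the cleaning or aggregation rules change.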
What We Need From You
• Bachelor's Degree in Computer Science, MIS, or other business discipline and 10+ years of experience in data engineering, with a proven track record in designing and operating large-scale data pipelines and architectures; or Master's Degree in Computer Science, MIS, or other business discipline and 5+ years of experience in data engineering, with a proven track record in designing and operating large-scale data pipelines and architectures (Required)
• Demonstrated experience designing and implementing Medallion Architecture in a Databricks Lakehouse environment, including layer transitions, data quality enforcement, and optimization strategies (Required)
• Expertise in developing ETL/ELT workflows (Required)
• Comprehensive knowledge of platforms and services like Databricks, Dataiku, and AWS native data offerings (Required)
• Solid experience with big data technologies (Apache Spark, Hadoop, Kafka) and cloud services (AWS, Azure) related to data processing and storage (Required)
• Strong experience in AWS cloud services, with hands-on experience in integrating cloud storage and compute services with Databricks (Required)
• Proficient in SQL and programming languages relevant to data engineering (Python, Java, Scala) (Required)
• Hands-on RDBMS experience (data modeling, analysis, programming, stored procedures) (Required)
• Familiarity with machine learning model deployment and management practices (a plus)
• Strong executive presence and communication skills, with a proven ability to lead communities of practice, deliver presentations to senior leadership, and build alignment across technical and business stakeholders (Required)
• AWS Certified Solutions Architect (Preferred)
• Databricks Certified Associate Developer for Apache Spark (Preferred)
• DAMA CDMP or other relevant certifications (Preferred)

Physical and Environmental Requirements
The physical requirements described here are representative of those that must be met by an associate to successfully perform the essential functions of the job. While performing the duties of the job, the associate is required on a daily basis to analyze and interpret data, communicate, and remain in a stationary position for a significant amount of the work day, and frequently access, input, and retrieve information from the computer and other office productivity devices. The associate is regularly required to move about the office and around the corporate campus. The associate must frequently move up to 10 pounds and occasionally move up to 25 pounds.

Travel Requirements
20%. The associate is occasionally required to travel to other sites, including out-of-state, where applicable, for business.

Join Us
The Friedkin Group and its affiliates are committed to ensuring equal employment opportunities, including providing reasonable accommodations to individuals with disabilities. If you have a disability and would like to request an accommodation, please contact us at . We celebrate diversity and are committed to creating an inclusive environment for all associates. We are seeking candidates legally authorized to work in the United States, without sponsorship. #LI-TW1
    $82k-114k yearly est. 1d ago

  • Engineer I, Southdale - Full Time

    Macy's (4.5 company rating)

    Minneapolis, MN

    Be part of an amazing story
    Macy's is more than just a store. We're a story. One that's captured the hearts and minds of America for more than 160 years. A story about innovations and traditions...about inspiring stores and irresistible products...about the excitement of the Macy's 4th of July Fireworks, and the wonder of the Thanksgiving Day Parade. We've been part of memorable moments and milestones for countless customers and colleagues. Those stories are part of what makes this such a special place to work.

    Job Overview
    The Engineer I maintains the physical structure and equipment of the store, ensuring they remain in good working order. This role performs preventative maintenance as well as emergency, corrective, and routine repairs on electrical, mechanical, fire/life safety, plumbing, and HVAC systems. The Engineer I communicates effectively with the Chief Engineer and Store Management to coordinate maintenance efforts and address facility needs.

    What You Will Do
    • Maintain HVAC, electrical, mechanical, plumbing, and fire/life safety systems and equipment to ensure optimal efficiency.
    • Perform emergency, corrective, and routine repairs on HVAC, electrical, mechanical, plumbing, and fire/life safety systems.
    • Conduct preventive maintenance on equipment and systems according to the scheduled plan.
    • Keep detailed records of inspections, preventive maintenance, and repairs to ensure compliance with state and federal regulations and local fire marshal requirements.
    • Work flexible hours, including days, evenings, nights, weekends, and during emergencies, as directed by the Chief Engineer or District Facilities Manager.
    • Assist with special projects as assigned.
    • Perform additional duties as needed.
    • Follow shortage prevention programs and procedures.
    • Complete tasks efficiently as directed by the Supervisor.
    • Maintain regular, dependable attendance and punctuality.
Foster an environment of acceptance and respect that strengthens relationships and ensures authentic connections with colleagues, customers, and communities. In addition to the essential duties mentioned above, other duties may be assigned.

Skills You Will Need
Technical Expertise: Strong knowledge of electrical, plumbing, mechanical, and fire/life safety systems, with a preference for HVAC experience
Preventative & Corrective Maintenance: Ability to diagnose issues and perform repairs, preventive maintenance, and emergency fixes on various building systems
Regulatory Compliance: Understanding of state, federal, and local regulations, including fire marshal requirements and EPA standards (certification preferred)
Safety Awareness: Commitment to maintaining a safe work environment for themselves and others, following all safety protocols and procedures
Problem-Solving: Ability to troubleshoot issues efficiently and implement effective solutions
Record Keeping: Strong attention to detail in maintaining records of inspections, maintenance, and repairs
Communication Skills: Effective written and verbal communication, with the ability to interpret technical documents such as safety rules, operation manuals, and procedural guidelines
Team Collaboration: Ability to work independently and coordinate with the Chief Engineer, Store Management, and other maintenance staff as needed

Who You Are
Candidates with a High School diploma or equivalent are encouraged to apply. Minimum 1 year of prior experience. This position requires heavy lifting, constant moving, remaining in a stationary position, and reaching with arms and hands. Involves remaining in a stationary position for at least two consecutive hours, lifting at least 30 lbs., stooping, kneeling, crouching, and climbing ladders. May involve reaching above eye level. Requires close vision, color vision, depth perception, and focus adjustment. Able to work a flexible schedule, including days, evenings, weekends, holidays, and during emergencies, based on department and company needs.

What we can offer you
Join a team where work is as rewarding as it is fun! We offer a dynamic, inclusive environment with competitive pay and benefits. Enjoy comprehensive health and wellness coverage and a 401(k) match to invest in your future. Prioritize your well-being with paid time off and eight paid holidays. Grow your career with continuous learning and leadership development. Plus, build community by joining one of our Colleague Resource Groups and make a difference through our volunteer opportunities. Some additional benefits we offer include: merchandise discounts, performance-based incentives, an annual merit review, and an Employee Assistance Program with mental health counseling and legal/financial advice. Access the full menu of benefits offerings here.

About Us
This is a great time to join Macy's! Whether you're helping a customer find the perfect gift, streamlining operations in one of our distribution centers, enhancing our online shopping experience, buying in-style and on-trend merchandise to outfit our customers, or designing a balloon for the Thanksgiving Day Parade, we offer unique opportunities to be part of some of the most memorable moments in people's lives. Join us and help write the next chapter in our story - apply today! This job description is not all-inclusive and may not apply to colleagues covered by a collective bargaining agreement. Macy's Inc. reserves the right to amend this job description at any time. Macy's Inc. is an Equal Opportunity Employer, committed to an inclusive work environment. STORES00 This position may be eligible for performance-based incentives/bonuses.
Benefits include 401k, medical/vision/dental/life/disability insurance options, PTO accruals, holidays, and more. Eligibility requirements may apply based on location, job level, classification, and length of employment. Additional benefit details are available at macysJOBS.com.
    $99k-122k yearly est. 1d ago
  • ETL / Informatica Architect

    Atria Group (4.2 company rating)

    Palo Alto, CA

    Required Skills:
    • 8+ years of overall Informatica ETL development experience
    • At least 2 years of experience as an Informatica ETL Architect
    • Mastery of data warehousing concepts; the candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects
    • Ability to develop a technical work plan, assign work, and coordinate across multiple developers and projects
    • Lead technical requirements and technical and data architectures for data warehouse projects
    • Ensure compliance with metadata standards for the data warehouse
    • Strong data analysis, data modeling, and ETL architecture skills
    • Able to configure the dev, test, and production environments and promotion processes
    • Platform is Oracle 11g, Informatica 9.0.1
    • Strong business and communication skills

    Additional Information
    Duration: 3+ months (strong possibility of extension)
    Hire Type: Contract, C2C or 1099
    Rate: DOE
    Visa: H1, GC or USC only
    Travel Covered: No. Prefer local. Apply today!
    $119k-159k yearly est. 1d ago
  • Data Engineer IV

    Delhaize America (4.6 company rating)

    Salisbury, NC

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more. Primary Purpose: The Data Engineer IV will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will drive our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping, data pipelines, and data modeling to data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will proactively optimize or even assist with the design of our company's data architecture to support our next generation of products and data initiatives. They will work on various, ambiguous, and complex issues, where analysis of situations or data requires an in-depth evaluation of variable factors and a thorough understanding of the business strategy, operations, markets, and key stakeholders. Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis. 
Duties and Responsibilities
* Creates and approves source-to-target data mappings for data pipelines and integration activities
* Identifies and resolves the impact of proposed application development/enhancement projects
* Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess the scope and impact of business needs
* Works with business users to design, develop, test, and implement business intelligence solutions in the Data & Analytics Platform
* Assists in assessing alternative software solutions for workability and technical feasibility
* Works closely with business teams in every stage, from gathering requirements to closure of the project
* Works with key stakeholders to create requirements for each business solution, ensuring the requirements are developed in concert with and agreed upon by the business partner and that the solution delivers the agreed-upon business needs
* Implements and maintains data governance policies and procedures to ensure data quality, security, and compliance
* Initiates and leads workstreams for improving ways of working in teams, eliminating technical debt, and optimizing business processes
* Acts as a subject matter expert on technology solutions
* Responsible for coaching, mentoring, and leading junior engineers
* Consistently delivers very high-quality work in cross-functional environments; seen as an expert and technical leader among peers
* Has a balanced approach to team empowerment and leadership, driving successful brainstorming sessions and solid technical conclusions to work approaches while modeling self-accountability, continuous development, and team contributions
* Remains up to date on emerging technologies and best practices in data engineering; recommends new approaches to improving data infrastructure
* Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed

Qualifications
* Bachelor's Degree in Computer Science, CIS, or a related field (or equivalent related work experience)
* 8 or more years of equivalent experience in a relevant job or field of technology
* Experience in an advanced role or technical capacity, leading teams directly or indirectly
* Experience directly responsible for guiding, training, or onboarding team members in relevant technologies, capabilities, or skills
* Experience with some or all of the following: Databricks, Azure Data Factory, Azure DevOps, TSQL, Python, PySpark, PowerBI, Data Warehouse Concepts

Preferred Qualifications
* Master's Degree in a relevant field of study; additional training or certifications in a relevant field of study
* 3 or more years of experience in Agile teams and a Product/Platform-based operating model
* 3 or more years of experience in leading teams or advancing technical capability in teams
* Experience in retail or grocery preferred

#DICEJobs #LI-hybrid #LI-SS1
Salary Range: $146,960 - $220,440. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.
At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. 
We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $147k-220.4k yearly 11d ago
  • Snowflake Data Engineer

    Coach (4.8 company rating)

    New York

    We believe that difference sparks brilliance, so we welcome people and ideas from everywhere to join us in stretching what's possible. At Tapestry, being true to yourself is core to who we are. When each of us brings our individuality to our collective ambition, our creativity is unleashed. This global house of brands - Coach, Kate Spade New York, Stuart Weitzman - was built by unconventional entrepreneurs and unexpected solutions, so when we say we believe in dreams, we mean we believe in making them happen. We're always on a journey to becoming our best, but you can count on this: Here, your voice is valued, your ambitions are supported, and your work is recognized. A member of the Tapestry family, we are part of a global house of brands that has unwavering optimism and is committed to being innovative and wholly inclusive. Visit Our People page to learn more about Tapestry's commitment to equity, inclusion, and diversity. Primary Purpose: The ideal candidate is an experienced Data Engineer with a strong background in Snowflake, SQL, and cloud-based data solutions. They are a self-motivated, independent problem-solver who is eager to learn new skills and adapt to changing technologies. Collaboration, performance optimization, and a commitment to maintaining data security and compliance are critical for success in this role. The successful individual will leverage their proficiency in Data Engineering to... Develop and manage data models, pipelines, and transformations. Demonstrate proficiency in SQL and BASH scripting. Leverage 5+ years of experience in data engineering or related roles. Collaborate with Data Engineering, Product Engineering, and Product Management teams to align with the product roadmap. Effectively document and communicate technical solutions to diverse audiences. Demonstrate a strong ability to work independently, take ownership of tasks and drive them to completion. 
Show a proactive approach to learning new technologies, tools, and skills as needed.

The accomplished individual will...
Design, implement, and manage Snowflake Data Warehouse solutions.
Be proficient in creating and optimizing Snowflake schema design.
Optimize Snowflake schema designs for performance and scalability.
Utilize Snowflake features such as data sharing, cloning, and time travel.
Analyze and optimize Snowflake compute resources (e.g., virtual warehouses).
Optimize queries, indexes, and storage for improved performance.
Maintain data security and ensure compliance with regulations.
Knowledge of SOC 2 compliance standards (preferred).
Integrate Snowflake with AWS services, including S3 and Glue (preferred).

An outstanding professional will have...
Bachelor's degree in Computer Science, Information Systems, or a related field.
Snowflake certification (preferred).
AWS certification (preferred).
Experience with Agile methodologies and the modern software development lifecycle.
Familiarity with Python and/or Java for Snowflake-related automation (preferred).
Familiarity with Node.js for API development (preferred).

Our Competencies for All Employees
Courage: Doesn't hold back anything that needs to be said; provides current, direct, complete, and “actionable” positive and corrective feedback to others; lets people know where they stand; faces up to people problems on any person or situation (not including direct reports) quickly and directly; is not afraid to take negative action when necessary.
Creativity: Comes up with a lot of new and unique ideas; easily makes connections among previously unrelated notions; tends to be seen as original and value-added in brainstorming settings. 
Customer Focus: Is dedicated to meeting the expectations and requirements of internal and external customers; gets first-hand customer information and uses it for improvements in products and services; acts with customers in mind; establishes and maintains effective relationships with customers and gains their trust and respect. Dealing with Ambiguity: Can effectively cope with change; can shift gears comfortably; can decide and act without having the total picture; isn't upset when things are up in the air; doesn't have to finish things before moving on; can comfortably handle risk and uncertainty. Drive for Results: Can be counted on to exceed goals successfully; is constantly and consistently one of the top performers; very bottom-line oriented; steadfastly pushes self and others for results. Interpersonal Savvy: Relates well to all kinds of people, up, down, and sideways, inside and outside the organization; builds appropriate rapport; builds constructive and effective relationships; uses diplomacy and tact; can diffuse even high-tension situations comfortably. Learning on the Fly: Learns quickly when facing new problems; a relentless and versatile learner; open to change; analyzes both successes and failures for clues to improvement; experiments and will try anything to find solutions; enjoys the challenge of unfamiliar tasks; quickly grasps the essence and the underlying structure of anything. Our Competencies for All People Managers Strategic Agility: Sees ahead clearly; can anticipate future consequences and trends accurately; has broad knowledge and perspective; is future oriented; can articulately paint credible pictures and visions of possibilities and likelihoods; can create competitive and breakthrough strategies and plans. 
Developing Direct Reports and Others: Provides challenging and stretching tasks and assignments; holds frequent development discussions; is aware of each person's career goals; constructs compelling development plans and executes them; pushes people to accept developmental moves; will take on those who need help and further development; cooperates with the developmental system in the organization; is a people builder. Building Effective Teams: Blends people into teams when needed; creates strong morale and spirit in his/her team; shares wins and successes; fosters open dialogue; lets people finish and be responsible for their work; defines success in terms of the whole team; creates a feeling of belonging in the team. Tapestry, Inc. is an equal opportunity and affirmative action employer and we pride ourselves on hiring and developing the best people. All employment decisions (including recruitment, hiring, promotion, compensation, transfer, training, discipline and termination) are based on the applicant's or employee's qualifications as they relate to the requirements of the position under consideration. These decisions are made without regard to age, sex, sexual orientation, gender identity, genetic characteristics, race, color, creed, religion, ethnicity, national origin, alienage, citizenship, disability, marital status, military status, pregnancy, or any other legally-recognized protected basis prohibited by applicable law. Americans with Disabilities Act (ADA) Tapestry, Inc. will provide applicants and employees with reasonable accommodation for disabilities or religious beliefs. If you require reasonable accommodation to complete the application process, please contact Tapestry People Services at ************** or ****************************** LI-HYBRID Visit Tapestry, Inc. at ************************ Work Setup Hybrid Flex (in office 1 to 3 days a week) BASE PAY RANGE $140,000.00 TO $170,000.00 Annually Click Here - U.S Corporate Compensation & Benefit
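The Snowflake features this listing calls out, zero-copy cloning and time travel, are expressed in ordinary Snowflake SQL (`CREATE TABLE ... CLONE` and the `AT(...)` clause). A minimal sketch that builds such statements; the table names are hypothetical:

```python
# Sketch: building Snowflake zero-copy CLONE and time-travel statements.
# CLONE and AT(OFFSET => ...) are standard Snowflake SQL; the table
# names below are made up for illustration.

def clone_table(source: str, target: str) -> str:
    """Zero-copy clone: the new table shares the source's micro-partitions."""
    return f"CREATE TABLE {target} CLONE {source}"

def select_as_of(table: str, minutes_ago: int) -> str:
    """Time travel: query the table as it looked N minutes ago."""
    return f"SELECT * FROM {table} AT(OFFSET => -60 * {minutes_ago})"

print(clone_table("sales", "sales_backup"))
# CREATE TABLE sales_backup CLONE sales
print(select_as_of("sales", 30))
# SELECT * FROM sales AT(OFFSET => -60 * 30)
```

Because a clone shares storage with its source until either side changes, cloning a large table for a dev or test environment is effectively instantaneous, which is a common reason the feature appears in role requirements like these.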
    $140k-170k yearly 60d+ ago
  • Principal Data Engineer

    Bayer Crop Science (4.5 company rating)

    Seattle, WA

    At Bayer we're visionaries, driven to solve the world's toughest challenges and striving for a world where 'Health for all Hunger for none' is no longer a dream, but a real possibility. We're doing it with energy, curiosity and sheer dedication, always learning from unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible'. There are so many reasons to join us. If you're hungry to build a varied and meaningful career in a community of brilliant and diverse minds to make a real difference, there's only one choice. Principal Data Engineer Principal Data Engineer for Seattle, WA for digital agricultural solutions provider to architect, build & launch new data models that provide intuitive analytics to business users; build & leverage state-of-the-art data access & alignment systems; ensure the best possible data is utilized by the models used to help farmers understand the impacts of decisions on their farming operations; rapidly prototype new data engineering solutions that support data science & analytics; & provide expert advice & education in the usage & interpretation of data systems. 
Requires Master's in C.S., Computer or Software Engineering or closely-related technical field & 2 yrs experience designing & developing scalable & robust data architecture plans and/or systems for agronomic machine & geospatial data; creating cloud-native data pipelines to build data lakehouses using Apache Iceberg & Apache Spark; processing large scale datasets to enable batch processing & data analysis using ECS & RDS for PostgreSQL; performing root cause analysis & optimizing data transformations & algorithms by generating time & memory profiling reports; packaging & deploying data algorithms to production using Airflow, Docker and Gitlab CI/CD practices to support geospatial data analytics & visualization; collaborate with cross-functional teams, including product teams & data scientists, defining data requirements & ensure alignment with business objectives; enhance petabyte scale data models in SQL using indexing & query optimization, to support efficient data retrieval & processing in large-scale databases; developing data processing scripts, ETL processes, & automation tasks using Python programming; processing & extracting features from geospatial shapefiles using GeoPandas, Fiona & Shapely frameworks. Will also accept Bachelor's in said fields & 5 years progressive post-Bachelor's stated experience. Telecommuting permitted from home office location within reasonable commuting distance of Seattle, WA up to 2 days per week. Salary Range: Employees can expect to be paid a salary between $151,000.00 to $175,000.00. Additional compensation may include a bonus or commission (if relevant). Additional benefits include health care, vision, dental, retirement, PTO, sick leave, etc. The offered salary may vary within this range based on an applicant's location, market data/ranges, an applicant's skills and prior relevant experience, certain degrees and certifications, and other relevant factors. 
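The geospatial processing this listing describes (extracting features from shapefiles with GeoPandas, Fiona, and Shapely) ultimately reduces to polygon math. As a self-contained illustration using only the standard library, the shoelace formula computes a field boundary's area from its vertices; in a real pipeline this is what Shapely's `Polygon.area` provides, and the coordinates below are a made-up rectangular field:

```python
# Shoelace formula: area of a simple polygon from its (x, y) vertices.
# Stdlib stand-in for what Shapely's Polygon(...).area computes.

def polygon_area(vertices):
    """Accumulate the signed shoelace sum, then return the absolute area."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the ring
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

field = [(0, 0), (100, 0), (100, 50), (0, 50)]   # 100 x 50 rectangle
print(polygon_area(field))   # 5000.0
```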
Mail resume to Jill Martin, Bayer Research and Development Services LLC, 800 N. Lindbergh Blvd., E2NE, St. Louis, MO 63167 or email resume to careers_************. Include the reference code below with your resume.

YOUR APPLICATION
Bayer offers a wide variety of competitive compensation and benefits programs. If you meet the requirements of this unique opportunity and want to impact our mission, 'Science for a better life', we encourage you to apply now. Be part of something bigger. Be you. Be Bayer. To all recruitment agencies: Bayer does not accept unsolicited third-party resumes. Bayer is an Equal Opportunity Employer/Disabled/Veterans. Bayer is committed to providing access and reasonable accommodations in its application process for individuals with disabilities and encourages applicants with disabilities to request any needed accommodation(s) using the contact information below. Bayer is an E-Verify Employer. Location: United States : Washington : Seattle Division: Enabling Functions Reference Code: 858550 Contact Us Email: careers_************
    $151k-175k yearly Easy Apply 6d ago
  • Data Engineer IV

    Delhaize America 4.6 company rating

    Quincy, MA jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Primary Purpose
The Data Engineer IV will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will drive our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping, data pipelines, and data modeling to data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will proactively optimize, or even assist with the design of, our company's data architecture to support our next generation of products and data initiatives. They will work on various, ambiguous, and complex issues, where analysis of situations or data requires an in-depth evaluation of variable factors and a thorough understanding of the business strategy, operations, markets, and key stakeholders. Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.
Duties and Responsibilities
* Creates and approves source-to-target data mappings for data pipelines and integration activities
* Identifies and resolves the impact of proposed application development/enhancement projects
* Performs data profiling and process analysis to understand key source systems, and uses knowledge of application features and functions to assess the scope and impact of business needs
* Works with business users to design, develop, test, and implement business intelligence solutions in the Data & Analytics Platform
* Assists in assessing alternative software solutions for workability and technical feasibility
* Works closely with business teams in every stage, from gathering requirements to project closure
* Works with key stakeholders to create requirements for each business solution, ensuring the requirements are developed in concert with and agreed upon by the business partner and that the solution delivers the agreed-upon business needs
* Implements and maintains data governance policies and procedures to ensure data quality, security, and compliance
* Initiates and leads workstreams for improving ways of working in teams, eliminating technical debt, and optimizing business processes
* Acts as a subject matter expert on technology solutions
* Coaches, mentors, and leads junior engineers
* Consistently delivers very high-quality work in cross-functional environments; seen as an expert and technical leader among peers
* Takes a balanced approach to team empowerment and leadership, driving successful brainstorming sessions and solid technical conclusions while modeling self-accountability, continuous development, and team contributions
* Remains up to date on emerging technologies and best practices in data engineering, and recommends new approaches to improving data infrastructure
* Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed

Qualifications
* Bachelor's degree in Computer Science, CIS, or a related field (or equivalent related work experience)
* 8 or more years of equivalent experience in a relevant job or field of technology
* Experience in an advanced role or technical capacity, leading teams directly or indirectly
* Experience directly responsible for guiding, training, or onboarding team members in relevant technologies, capabilities, or skills
* Experience with some or all of the following: Databricks, Azure Data Factory, Azure DevOps, T-SQL, Python, PySpark, Power BI, data warehouse concepts

Preferred Qualifications
* Master's degree in a relevant field of study; additional trainings or certifications in a relevant field of study
* 3 or more years of experience in Agile teams and a product/platform-based operating model
* 3 or more years of experience in leading teams or advancing technical capability in teams
* Experience in retail or grocery preferred

#DICEJobs #LI-hybrid #LI-SS1 Salary Range: $146,960 - $220,440. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws. At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership, and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization.
We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $147k-220.4k yearly 6d ago
  • Data Engineer IV

    Delhaize America 4.6 company rating

    Chicago, IL jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Primary Purpose
The Data Engineer IV will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will drive our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping, data pipelines, and data modeling to data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will proactively optimize, or even assist with the design of, our company's data architecture to support our next generation of products and data initiatives. They will work on various, ambiguous, and complex issues, where analysis of situations or data requires an in-depth evaluation of variable factors and a thorough understanding of the business strategy, operations, markets, and key stakeholders. Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.
Duties and Responsibilities
* Creates and approves source-to-target data mappings for data pipelines and integration activities
* Identifies and resolves the impact of proposed application development/enhancement projects
* Performs data profiling and process analysis to understand key source systems, and uses knowledge of application features and functions to assess the scope and impact of business needs
* Works with business users to design, develop, test, and implement business intelligence solutions in the Data & Analytics Platform
* Assists in assessing alternative software solutions for workability and technical feasibility
* Works closely with business teams in every stage, from gathering requirements to project closure
* Works with key stakeholders to create requirements for each business solution, ensuring the requirements are developed in concert with and agreed upon by the business partner and that the solution delivers the agreed-upon business needs
* Implements and maintains data governance policies and procedures to ensure data quality, security, and compliance
* Initiates and leads workstreams for improving ways of working in teams, eliminating technical debt, and optimizing business processes
* Acts as a subject matter expert on technology solutions
* Coaches, mentors, and leads junior engineers
* Consistently delivers very high-quality work in cross-functional environments; seen as an expert and technical leader among peers
* Takes a balanced approach to team empowerment and leadership, driving successful brainstorming sessions and solid technical conclusions while modeling self-accountability, continuous development, and team contributions
* Remains up to date on emerging technologies and best practices in data engineering, and recommends new approaches to improving data infrastructure
* Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed

Qualifications
* Bachelor's degree in Computer Science, CIS, or a related field (or equivalent related work experience)
* 8 or more years of equivalent experience in a relevant job or field of technology
* Experience in an advanced role or technical capacity, leading teams directly or indirectly
* Experience directly responsible for guiding, training, or onboarding team members in relevant technologies, capabilities, or skills
* Experience with some or all of the following: Databricks, Azure Data Factory, Azure DevOps, T-SQL, Python, PySpark, Power BI, data warehouse concepts

Preferred Qualifications
* Master's degree in a relevant field of study; additional trainings or certifications in a relevant field of study
* 3 or more years of experience in Agile teams and a product/platform-based operating model
* 3 or more years of experience in leading teams or advancing technical capability in teams
* Experience in retail or grocery preferred

#DICEJobs #LI-hybrid #LI-SS1 Salary Range: $146,960 - $220,440. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws. At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership, and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization.
We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $147k-220.4k yearly 11d ago
  • Data Engineer - Kafka

    Delhaize America 4.6 company rating

    Salisbury, NC jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Primary Purpose:
The Data Engineer II contributes to expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They will contribute to our data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping, data pipelines, and data modeling to data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will learn to optimize or even re-design our company's data architecture to support our next generation of products and data initiatives. They can take on smaller projects from start to finish, work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors, and trace issues to their source. They develop solutions to a variety of problems of moderate scope and complexity. Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.

Duties & Responsibilities:
* Solves simple to moderate application errors and resolves application problems, following up promptly with all appropriate customers and IT personnel
* Reviews and contributes to QA test plans and supports the QA team during test execution
* Participates in developing streaming data applications (Kafka), data transformations, and data pipelines
* Ensures change control and change management procedures are followed within the program/project as they relate to requirements
* Interprets requirement documents and contributes to creating functional design documents as part of the data development life cycle
* Documents all phases of work, including requirements gathering, architectural diagrams, and other program technical specifications, using current specified design standards for new or revised solutions
* Relates information from various sources to draw logical conclusions
* Conducts unit testing on data streams
* Conducts data lineage and impact analysis as part of the change management process
* Conducts data analysis (SQL, Excel, data discovery, etc.) on legacy systems and new data sources
* Creates source-to-target data mappings for data pipelines and integration activities
* Assists in identifying the impact of proposed application development/enhancement projects
* Performs data profiling and process analysis to understand key source systems, and uses knowledge of application features and functions to assess the scope and impact of business needs
* Implements and maintains data governance policies and procedures to ensure data quality, security, and compliance
* Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed

Qualifications:
* Bachelor's degree in Computer Science or a technical field; equivalent trainings/certifications/experience will be considered
* 3 or more years of equivalent experience in a relevant job or field of technology

Preferred Qualifications:
* Master's degree in a relevant field of study preferred; additional trainings or certifications in a relevant field of study preferred
* Experience in Agile teams and/or a product/platform-based operating model
* Experience in retail or grocery preferred
* Experience with Kafka

#DICEJobs #LI-hybrid #LI-SS1 Salary Range: $101,360 - $152,040. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws. At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership, and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $101.4k-152k yearly 49d ago
  • Data Engineer (Senior to Staff level)

    Sanity 4.1 company rating

    Atlanta, GA jobs

    At Sanity, we're building the future of AI-powered Content Operations. Our AI Content Operating System gives teams the freedom to model, create, and automate content the way their business works, accelerating digital development and supercharging content operations efficiency. Companies like Linear, Figma, Cursor, Riot Games, Anthropic, and Morning Brew are using Sanity to power and automate their content operations.

About the role
We are seeking a talented and experienced Data Engineer (Senior to Staff level) to join our growing data team at a pivotal time in our development. As a key member of our data engineering team, you'll help scale and evolve our data infrastructure to ensure Sanity can make better data-driven decisions. This is an opportunity to work on mission-critical data systems that power our B2B SaaS platform. You'll improve our data pipelines, optimize data models, and strengthen our analytics capabilities using modern tools like Airflow, Airbyte, BigQuery, dbt, and RudderStack. Working closely with engineers, analysts, and business stakeholders across US and European time zones, you'll help foster a data-driven culture by making data more accessible, reliable, and actionable across the organization. If you're passionate about solving complex data challenges, have experience scaling data infrastructure in B2B environments, and want to make a significant impact at a fast-growing company, we want to talk to you. This role offers the perfect blend of technical depth and strategic influence, allowing you to shape how Sanity leverages data to drive business success.

What you'll be doing
Data Infrastructure & ETL Development
* Design, develop, and maintain scalable ETL/ELT pipelines to ensure data is efficiently processed, transformed, and made available across the company
* Collaborate with engineering teams to implement and scale product telemetry across our product surfaces
* Develop and maintain data models in BigQuery that balance performance, cost, and usability
* Establish best practices for data ingestion, transformation, and orchestration, ensuring reliability and efficiency
* Orchestrate data workflows to reduce manual effort, improve efficiency, and maintain high data quality standards

Collaboration & Cross-Team Partnerships
* Work closely with data analysts, engineers, and other internal stakeholders to understand their data needs and design robust pipelines that support data-driven decision-making
* Build scalable and flexible data solutions that address both current business requirements and future growth needs
* Partner with engineering, growth, and product teams to enhance data accessibility and usability

Data Observability & Reliability
* Build and maintain comprehensive monitoring, alerting, and logging systems for all data pipelines and infrastructure
* Implement SLAs/SLOs for critical data pipelines and establish incident response procedures
* Develop data quality monitoring that proactively detects anomalies, schema changes, and data freshness issues
* Create dashboards and alerting systems that provide visibility into pipeline health, data lineage, and system performance
* Debug and troubleshoot data issues efficiently using observability tools and data lineage tracking

Continuous Improvement & Scalability
* Monitor and optimize data pipeline performance and costs as data volumes grow
* Implement and maintain data quality frameworks and testing practices
* Contribute to the evolution of our data infrastructure through careful evaluation of new tools and technologies
* Help establish data engineering best practices that scale with our growing business needs

This may be you
* Remote in Europe or North America (East Coast/ET)
* 4+ years of experience building data pipelines at scale, with deep expertise in SQL, Python, and Node.js/TypeScript for data engineering workflows
* Proactive mindset with attention to detail, particularly in maintaining comprehensive documentation and data lineage
* Strong communication skills with demonstrated ability to collaborate effectively across US and European time zones
* Production experience with workflow orchestration tools like Airflow, and customer data platforms like RudderStack, ideally in a B2B SaaS environment
* Proven experience integrating and maintaining data flows with CRM systems like Salesforce, Marketo, or HubSpot
* Track record of building reliable data infrastructure that supports rapid business growth and evolving analytics needs
* Experience implementing data quality frameworks and monitoring systems to ensure reliable data delivery to stakeholders

Nice to have:
* Experience with product analytics tools like Amplitude, Mixpanel, or PostHog
* Experience with Google Cloud Platform and BigQuery

What we can offer
* A highly skilled, inspiring, and supportive team
* A positive, flexible, and trust-based work environment that encourages long-term professional and personal growth
* A global, multi-culturally diverse group of colleagues and customers
* Comprehensive health plans and perks
* A healthy work-life balance that accommodates individual and family needs
* Competitive stock options program and location-based salary

Who we are
Sanity.io is a modern, flexible content operating system that replaces rigid legacy content management systems. One of our big differentiators is treating content as data so that it can be stored in a single source of truth, but seamlessly adapted and personalized for any channel without extra effort. Forward-thinking companies choose Sanity because they can create tailored content authoring experiences, customized workflows, and content models that reflect their business. Sanity recently raised an $85M Series C led by GP Bullhound and is also backed by leading investors like ICONIQ Growth, Threshold Ventures, Heavybit and Shopify, as well as founders of companies like Vercel, WPEngine, Twitter, Mux, Netlify and Heroku.
This funding round has put Sanity in a strong position for accelerated growth in the coming years. You can only build a great company with a great culture. Sanity is a 200+ person company with highly committed and ambitious people. We are pioneers, we exist for our customers, we are hel ved, and we love type two fun! Read more about our values here! Sanity.io pledges to be an organization that reflects the globally diverse audience that our product serves. We believe that in addition to hiring the best talent, a diversity of perspectives, ideas, and cultures leads to the creation of better products and services. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, or gender identity.
    $88k-122k yearly est. Auto-Apply 60d+ ago
  • Data Engineer

    Range 3.7 company rating

    McLean, VA jobs

    Range is creating AI-powered solutions to eliminate financial complexity for our members. We're transforming wealth management through the perfect blend of cutting-edge technology and human expertise. We're obsessed with member experience! We've built an integrated platform that tackles the full spectrum of financial needs (investments, taxes, retirement planning, and estate management) all unified in one intuitive system. Backed by Scale, Google's Gradient Ventures, and Cathay Innovations, we're in hyper-growth mode and looking for exceptional talent to join our starting lineup. We recently raised $60M in our Series C funding and want builders to help scale the company. Every Ranger at this stage is shaping our culture and way of life, from former CEOs and startup founders to experts from leading hedge funds and tech companies. If you're ready to build something that truly matters in financial services, bring your talent to Range. Here, you'll make a genuine impact on how people manage their financial lives while working alongside a team that celebrates wins, makes big decisions, and blazes new trails together.

About the role
As a Data Engineer at Range, you'll play a central role in building and scaling the data infrastructure that powers our analytics, product insights, and customer experiences. You will design, develop, and maintain robust data pipelines and platforms that ensure data is accurate, secure, and available for both real-time and batch analytics. You'll collaborate closely with product, analytics, data science, and engineering teams to turn raw data into reliable, scalable information that drives business decisions and fuels growth. This role is ideal for someone who thrives on solving technical data challenges in a fast-moving fintech environment. We're excited to hire this role at Range's headquarters in McLean, VA. All of our positions follow an in-office schedule Monday through Friday, allowing you to collaborate directly with your team. If you're not currently based in the area, but love what you see, let's discuss relocation as part of your journey to joining us.

What you'll do with us
* Create, maintain, and optimize scalable data pipelines and ETL/ELT workflows to support analytics and product initiatives
* Integrate data from various internal and external sources, ensuring seamless ingestion, transformation, and delivery
* Help shape our data architecture, including data warehouse/schema design, storage solutions, and workflow orchestration
* Optimize queries and storage performance for efficiency at scale, especially as data volume grows with our customer base
* Implement data quality checks, monitoring systems, and troubleshooting tools to ensure data reliability and accuracy
* Work closely with data scientists, analysts, and cross-functional partners to understand data needs and deliver timely solutions
* Maintain clear documentation for pipelines, architecture decisions, and standards to support team alignment and onboarding

What will set you apart
* 5+ years of experience in data engineering
* BS or BA in Computer Science, Engineering, Statistics, Economics, Mathematics, or another quantitative discipline from a top-tier university
* Strong proficiency in Python and SQL for building and optimizing data workflows
* Hands-on experience with cloud data platforms and toolchains (e.g., AWS, GCP, Snowflake, BigQuery, Redshift)
* Familiarity with data pipeline orchestration tools such as Airflow, dbt, Prefect, or similar
* Experience with data modeling, schema design, and data warehousing concepts in large-scale environments
* Strong analytical mindset with excellent problem-solving skills, especially in ambiguous or evolving environments
* Comfortable working collaboratively across teams and translating business needs into technical solutions
* Experience working in fintech or other consumer-focused, high-growth technology startup environments

Benefits
* Health & Wellness: 100% employer-covered medical insurance for employees (75% for dependents), plus dental and vision coverage
* 401(k): Retirement savings program to support your future
* Paid Time Off: Dedicated time to reset and recharge, plus most federal holidays
* Parental Leave: Comprehensive leave policy for growing families
* Meals: Select meals covered throughout the week
* Fitness: Monthly movement stipend
* Equity & Career Growth: Early exercise eligibility and a strong focus on professional development
* Annual Compensation Reviews: Salary and equity refreshes based on performance
* Boomerang Program: After two years at Range, you can take time away to start your own company. We'll hold your spot for 6 months and pause your equity vesting, which resumes if you return

Range is proud to be an equal opportunity workplace. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. As a company, we are committed to designing products, building a culture, and supporting a team that reflects the diverse population we serve.
    $91k-128k yearly est. Auto-Apply 34d ago
  • Data Engineer - Kafka

    Delhaize America 4.6 company rating

    Quincy, MA jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Primary Purpose: The Data Engineer II contributes to expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They contribute to our data initiatives and ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping and data pipelines to data modeling and, finally, data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will learn to optimize or even re-design our company's data architecture to support our next generation of products and data initiatives. They can take on smaller projects from start to finish, work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors, and trace issues to their source. They develop solutions to a variety of problems of moderate scope and complexity. Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.

Duties & Responsibilities:
* Solves simple to moderate application errors and resolves application problems, following up promptly with all appropriate customers and IT personnel.
* Reviews and contributes to QA test plans and supports the QA team during test execution.
* Participates in developing streaming data applications (Kafka), data transformations, and data pipelines.
* Ensures change control and change management procedures are followed within the program/project as they relate to requirements.
* Interprets requirement documents and contributes to creating functional design documents as part of the data development life cycle.
* Documents all phases of work, including gathered requirements, architectural diagrams, and other program technical specifications, using current specified design standards for new or revised solutions.
* Relates information from various sources to draw logical conclusions.
* Conducts unit testing on data streams.
* Conducts data lineage and impact analysis as part of the change management process.
* Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and new data sources.
* Creates source-to-target data mappings for data pipelines and integration activities.
* Assists in identifying the impact of proposed application development/enhancement projects.
* Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess the scope and impact of business needs.
* Implements and maintains data governance policies and procedures to ensure data quality, security, and compliance.
* Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed.

Qualifications:
* Bachelor's Degree in Computer Science or a technical field; equivalent trainings/certifications/experience will be considered.
* 3 or more years of equivalent experience in a relevant job or field of technology.

Preferred Qualifications:
* Master's Degree in a relevant field of study; additional trainings or certifications in a relevant field of study preferred.
* Experience in Agile teams and/or a Product/Platform-based operating model.
* Experience in retail or grocery preferred.
* Experience with Kafka.

#DICEJobs #LI-hybrid #LI-SS1

Salary Range: $101,360 - $152,040. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.

At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership, and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
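The posting's emphasis on streaming transformations and "unit testing on data streams" can be illustrated with a small sketch. This is not from the posting; the record shape and field names (`store_id`, `upc`, `qty`) are invented for illustration, and in practice the function would sit behind a Kafka consumer.

```python
# Illustrative sketch of a per-record transformation of the kind used in a
# Kafka-based streaming pipeline. The record shape is hypothetical.

def transform_pos_record(record: dict) -> dict:
    """Normalize a raw point-of-sale record before publishing it downstream."""
    qty = int(record["qty"])
    return {
        "store_id": str(record["store_id"]).zfill(4),  # pad store IDs to 4 digits
        "upc": str(record["upc"]).strip(),             # UPCs as trimmed strings
        "qty": qty,                                    # quantities as integers
        "valid": qty > 0,                              # flag non-positive quantities
    }

if __name__ == "__main__":
    # Keeping the transformation a pure function makes unit testing on data
    # streams straightforward: feed in a record, assert on the output.
    print(transform_pos_record({"store_id": 7, "upc": " 012345678905 ", "qty": "3"}))
```

Because the logic is isolated from the Kafka consumer itself, it can be unit tested without a broker.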
    $101.4k-152k yearly 49d ago
  • Data Engineer - Kafka

    Delhaize America 4.6 company rating

    Chicago, IL jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Primary Purpose: The Data Engineer II contributes to expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. They contribute to our data initiatives and ensure optimal data delivery architecture is consistent throughout ongoing projects. They engage through the entire lifecycle of a project, from data mapping and data pipelines to data modeling and, finally, data consumption. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. They will learn to optimize or even re-design our company's data architecture to support our next generation of products and data initiatives. They can take on smaller projects from start to finish, work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors, and trace issues to their source. They develop solutions to a variety of problems of moderate scope and complexity. Our flexible/hybrid work schedule includes 3 in-person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA. Applicants must be currently authorized to work in the United States on a full-time basis.

Duties & Responsibilities:
* Solves simple to moderate application errors and resolves application problems, following up promptly with all appropriate customers and IT personnel.
* Reviews and contributes to QA test plans and supports the QA team during test execution.
* Participates in developing streaming data applications (Kafka), data transformations, and data pipelines.
* Ensures change control and change management procedures are followed within the program/project as they relate to requirements.
* Interprets requirement documents and contributes to creating functional design documents as part of the data development life cycle.
* Documents all phases of work, including gathered requirements, architectural diagrams, and other program technical specifications, using current specified design standards for new or revised solutions.
* Relates information from various sources to draw logical conclusions.
* Conducts unit testing on data streams.
* Conducts data lineage and impact analysis as part of the change management process.
* Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and new data sources.
* Creates source-to-target data mappings for data pipelines and integration activities.
* Assists in identifying the impact of proposed application development/enhancement projects.
* Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess the scope and impact of business needs.
* Implements and maintains data governance policies and procedures to ensure data quality, security, and compliance.
* Ensures operational stability of a 24/7/365 grocery retail environment by providing technical support, system monitoring, and issue resolution, which may be required during off-hours, weekends, and holidays as needed.

Qualifications:
* Bachelor's Degree in Computer Science or a technical field; equivalent trainings/certifications/experience will be considered.
* 3 or more years of equivalent experience in a relevant job or field of technology.

Preferred Qualifications:
* Master's Degree in a relevant field of study; additional trainings or certifications in a relevant field of study preferred.
* Experience in Agile teams and/or a Product/Platform-based operating model.
* Experience in retail or grocery preferred.
* Experience with Kafka.

#DICEJobs #LI-hybrid #LI-SS1

Salary Range: $101,360 - $152,040. Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.

At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership, and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
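The "source-to-target data mappings" duty above can be sketched as follows. This is purely illustrative and not from the posting; the column names on both sides are invented.

```python
# Illustrative source-to-target mapping of the kind the duties describe.
# All column names are hypothetical.

SOURCE_TO_TARGET = {
    "str_nbr": "store_id",      # legacy store number -> canonical store id
    "item_cd": "product_code",  # legacy item code    -> canonical product code
    "sls_amt": "sales_amount",  # legacy sales field  -> canonical sales amount
}

def apply_mapping(row: dict, mapping: dict = SOURCE_TO_TARGET) -> dict:
    """Rename source columns to their target names, dropping unmapped columns."""
    return {target: row[source] for source, target in mapping.items() if source in row}

if __name__ == "__main__":
    legacy = {"str_nbr": "0042", "item_cd": "A17", "sls_amt": 19.99, "junk": None}
    print(apply_mapping(legacy))
```

Keeping the mapping as data rather than code makes it easy to document for lineage and impact analysis.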
    $101.4k-152k yearly 49d ago
  • Staff Data Engineer

    Market America 4.5 company rating

    Miami Beach, FL jobs

    Market America, a product brokerage and Internet marketing company that specializes in One-to-One Marketing, is seeking an experienced Staff Data Engineer for our IT team. As a senior member of the Data Engineering team, you will have an important role in helping millions of customers on our SHOP.COM and Market America Worldwide multi-country and multi-language global eCommerce websites intelligently find what they want within different categories, merchant offers, products, and taxonomy. We have thousands of 3rd-party affiliates/feeds which go through ETL and data ingestion pipelines before being ingested into our search systems. We have multiple orchestration pipelines supporting various types of data for products, store offers, analytics, customer behavioral profiles, segments, logs, and much more. If you are passionate about data engineering, processing millions of records, and ETL processes for products, stores, customers, and analytics, this is a highly visible role that will provide you the opportunity to make a huge impact in our business and a difference to millions of customers worldwide. The data engineering team processes large amounts of data that we import and collect. The team works to enrich content, pricing integration, taxonomy assignments and algorithms, category classifier nodes, and machine learning integration within the pipeline.

Key Responsibilities:
* Implement batch and event-driven applications using Java, Kafka, Spark, Scala, PySpark, and Python (a minimum of 10-12 years of hands-on development experience is required)
* Experience with Apache Kafka and Connectors; Java and Spring Boot for building event-driven services; Python for building ML pipelines
* Develop data pipelines responsible for ingesting large amounts of different kinds of data from various sources
* Help evolve the data architecture and work on next-generation real-time pipeline algorithms and architecture, in addition to supporting and maintaining current pipelines and legacy systems
* Write code and develop worker nodes for business logic, ETL, and orchestration processes
* Develop algorithms for better attribution rules and category classifiers
* Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive search, discovery, and recommendations
* Work closely with architects, engineers, data analysts, data scientists, contractors/consultants, and project managers to assess project requirements and to design, develop, and support data ingestions and API services
* Work with Data Scientists to build feature engineering pipelines and integrate machine learning models during the content enrichment process
* Influence priorities, working with various partners including engineers, the project management office, and leadership
* Mentor junior team members, define architecture, review code, develop hands-on, and deliver work within the sprint cycle
* Participate in design discussions with Architects and other team members on the design of new systems and the re-engineering of components of existing systems
* Wear the Architect hat when required: bring new ideas to the table, thought leadership, and forward thinking
* Take a holistic approach to building solutions by thinking about the big picture and the overall solution
* Work on moving away from legacy systems to a next-generation architecture
* Take complete ownership from requirements, solution design, and development through production launch and post-launch production support; participate in code reviews and regular on-call rotations
* Apply the best solutions in the industry, apply correct design patterns during development, and learn best practices and data engineering tools and technologies

Required Skills & Experience:
* BS or MS in Computer Science (or a related field) with 10+ years of hands-on software development experience working on large-scale data processing pipelines
* Must-have skills: Apache Spark, Scala, and PySpark, with 2-4 years of experience building production-grade batch pipelines that handle large volumes of data
* At least 4+ years of experience in Java and APIs/microservices
* At least 2+ years of experience in Python
* 2+ years of experience understanding and writing complex SQL and stored procedures for processing raw data, ETL, and data validation, using databases such as SQL Server, Redis, and other NoSQL DBs
* Knowledge of big data technologies, Hadoop, HDFS
* Expertise in building event-driven pipelines with Kafka, Java/Spark, Apache Flink
* Expertise with the Amazon AWS stack, such as EMR, EC2, S3
* Experience working with APIs to collect and ingest data, as well as building APIs for business logic
* Experience setting up, maintaining, and debugging production systems and infrastructure
* Experience building fault-tolerant and resilient systems
* Experience building worker nodes; knowledge of REST principles and data engineering design patterns
* In-depth knowledge of Java, Spring Boot, Spark, Scala, PySpark, Python, orchestration tools, ESB, SQL, stored procedures, Docker, RESTful web services, Kubernetes, CI/CD, observability techniques, Kafka, release processes, caching strategies, versioning, B&D, Bitbucket/Git, the AWS cloud ecosystem, NoSQL databases, Hazelcast
* Strong software development, architecture diagramming, problem-solving, and debugging skills
* Phenomenal communication and influencing skills

Nice to Have:
* Exposure to Machine Learning (ML), LLM models, using AI during coding, building with AI
* Knowledge of Elastic APM, the ELK stack, and search technologies such as Elasticsearch/Solr
* Some experience with workflow orchestration tools such as Airflow or Apache NiFi

Market America offers competitive salary and generous benefits, including health, dental, vision, life, short and long-term disability insurance, a 401(k) retirement plan with company match, and an on-site health clinic. Qualified candidates should apply online. This position will work remotely based from our Miami, FL offices. Sorry, we are NOT able to sponsor for this position. Market America is proud to be an equal opportunity employer.

Market America | SHOP.COM is changing the way people shop and changing the economic paradigm so anyone can become financially independent by creating their own economy and converting their spending into earning with the Shopping Annuity.

ABOUT MARKET AMERICA, INC. & SHOP.COM: Market America Worldwide | SHOP.COM is a global e-commerce and digital marketing company that specializes in one-to-one marketing and is the creator of the Shopping Annuity. Its mission is to provide a robust business system for entrepreneurs, while providing consumers a better way to shop. Headquartered in Greensboro, North Carolina, and with eight sites around the globe, including the U.S., Market America Worldwide was founded in 1992 by Founder, Chairman & CEO JR Ridinger. Through the company's primary, award-winning shopping website, SHOP.COM, consumers have access to millions of products, including Market America Worldwide exclusive brands and thousands of top retail brands. Further, SHOP.COM ranks 19th in Newsweek magazine's 2021 Best Online Shops, No. 52 in Digital Commerce 360's (formerly Internet Retailer) 2021 Top 1,000 Online Marketplaces, No. 79 in Digital Commerce 360's 2021 Top 1,000 Online Retailers, and No. 11 in the 2021 Digital Commerce 360 Primary Merchandise Category Top 500. The company is also a two-time winner of the Better Business Bureau's Torch Award for Marketplace Ethics and was ranked No. 15 in The Business North Carolina Top 125 Private Companies for 2021. By combining Market America Worldwide's entrepreneurial business model with SHOP.COM's powerful comparative shopping engine, Cashback program, Hot Deals, ShopBuddy, Express Pay checkout, social shopping integration, and countless other features, the company has become the ultimate online shopping destination. For more information about Market America Worldwide: MarketAmerica.com For more information on SHOP.COM, please visit: SHOP.COM
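The "fault-tolerant and resilient system" requirement in the listing above often comes down to patterns like retry with exponential backoff around pipeline workers. A minimal sketch, with an invented worker function purely for illustration:

```python
# Minimal retry-with-backoff sketch for a pipeline worker. The worker
# function and its failure mode below are invented for illustration.
import time

def call_with_retries(fn, *args, attempts: int = 3, base_delay: float = 0.01):
    """Call fn, retrying on exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn(*args)
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, ...

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky_worker(x):
        calls["n"] += 1
        if calls["n"] < 3:  # fail the first two attempts
            raise ConnectionError("transient upstream error")
        return x * 2
    print(call_with_retries(flaky_worker, 21))  # succeeds on the third attempt
```

Real pipelines would add jitter, bounded total retry time, and distinguish retryable from non-retryable errors.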
    $81k-102k yearly est. 7d ago
  • Retail Data Engineer

    RaceTrac 4.4 company rating

    Atlanta, GA jobs

    The Retail Data Engineer plays a critical role in managing the flow, transformation, and integrity of scan-level data within our retail data ecosystem. This role ensures that raw transactional data, such as point-of-sale (POS), promotional, loyalty, and product data, is clean, consistent, and fit for analysis. This individual will collaborate closely with merchandising, marketing, operations, and analytics teams to deliver trusted data that powers key business decisions in a dynamic retail environment.

What You'll Do:
* Works with business teams (e.g., Category Management, Marketing, Supply Chain) to define and refine data needs.
* Identifies gaps or ambiguities in retail scan data (e.g., barcode inconsistencies, vendor mappings).
* Translates complex retail requirements into technical specifications for data ingestion and transformation.
* Develops, schedules, and optimizes ETL/ELT processes to ingest large volumes of scan data (e.g., from POS, ERP, and loyalty programs).
* Applies robust transformation logic to normalize data across vendors, stores, and systems.
* Works with both structured and semi-structured retail datasets (CSV, JSON, EDI, etc.).
* Implements data validation, reconciliation, and anomaly detection for incoming retail data feeds.
* Designs and maintains audit trails and data lineage for scan data.
* Investigates and resolves data discrepancies in collaboration with store systems, IT, and vendors.
* Conducts exploratory data analysis to uncover trends, seasonality, anomalies, and root causes.
* Supports retail performance reporting, promotional effectiveness, and vendor analytics.
* Provides clear documentation and logic traceability for analysts and business users.
* Collaborates with cross-functional teams such as merchandising, inventory, loyalty, and finance to support retail KPIs and data insights.
* Acts as a data subject matter expert for scan and transaction data.
* Provides guidance on best practices for data usage and transformation in retail contexts.

What We're Looking For:
* Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
* 3-5 years of experience in data engineering, ideally in a retail environment.
* Experience working with scan-level data from large retail chains or CPG vendors.
* Familiarity with retail ERP systems (e.g., SAP, Oracle), merchandising tools, and vendor data feeds.
* Expert-level SQL and experience working with retail schema structures (e.g., SKUs, UPCs, store IDs).
* Proficiency with data pipeline and orchestration tools such as dbt, Airflow, Fivetran, or Apache Spark.
* Experience with cloud-based data platforms (Snowflake, Google BigQuery, Azure Synapse, AWS Redshift).
* Familiarity with retail concepts such as POS systems, promotional pricing, markdowns, units vs. dollars sold, sell-through rates, and planogram compliance.
* Understanding of master data management (MDM) for products, stores, and vendors.
* Experience with data profiling and quality frameworks (e.g., Great Expectations, Soda, Monte Carlo).

All qualified applicants will receive consideration for employment with RaceTrac without regard to their race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status, or any other characteristic protected by local, state, or federal laws, rules, or regulations.
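The "data validation, reconciliation, and anomaly detection" duty described above can be sketched with a simple per-row validation pass. The field names, checks, and row shape here are invented for illustration and are not from the posting:

```python
# Illustrative validation pass over daily scan-data rows. Field names
# (upc, units, is_return) and the specific checks are hypothetical.

def validate_scan_rows(rows: list) -> dict:
    """Split rows into clean and flagged, attaching the problems found."""
    clean, flagged = [], []
    for row in rows:
        problems = []
        if not row.get("upc"):
            problems.append("missing_upc")  # barcode inconsistency check
        if row.get("units", 0) < 0 and not row.get("is_return"):
            problems.append("negative_units_without_return")
        (flagged if problems else clean).append({**row, "problems": problems})
    return {"clean": clean, "flagged": flagged}

if __name__ == "__main__":
    rows = [
        {"upc": "012345678905", "units": 2, "is_return": False},
        {"upc": "", "units": 1, "is_return": False},               # missing UPC
        {"upc": "004900005044", "units": -1, "is_return": False},  # bad negative
    ]
    result = validate_scan_rows(rows)
    print(len(result["clean"]), len(result["flagged"]))  # 1 2
```

In production, checks like these are usually expressed declaratively in a framework such as Great Expectations or Soda rather than hand-coded.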
    $85k-107k yearly est. 60d+ ago
  • Data Engineer

    Schwarz Partners 3.9 company rating

    Carmel, IN jobs

    Schwarz Partners has an exciting opportunity available for a Data Engineer in Carmel, IN! Data Engineers build data pipelines that transform raw, unstructured data into formats that can be used for analysis. They are responsible for creating and maintaining the analytics infrastructure that enables almost every other data function. This includes architectures such as databases, servers, and large-scale processing systems. A Data Engineer uses different technologies to collect and map an organization's data landscape to help decision-makers find cost savings and optimization opportunities. In addition, data Engineers use this data to display trends in collected analytics information, encouraging transparency with stakeholders. Schwarz Partners is one of the largest independent manufacturers of corrugated sheets and packaging materials in the U.S. Through our family of companies, we continuously build and strengthen our capabilities. You'll find our products wherever goods are packaged, shipped, and sold-from innovative retail packaging to colorful in-store displays at pharmacies and grocers. You also may have spotted our trucks on the highway. Schwarz Partners is built around the idea that independence and innovation go hand in hand. Our structure allows us to adapt to change quickly, get new ideas off the ground, and thrive in the marketplace. Our people are empowered to tap into their talents, build their skills, and grow with us. ESSENTIAL JOB FUNCTIONS FOR THIS POSITION: Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud technologies. Assemble large, complex sets of data that meet non-functional and functional business requirements. Identify, design, and implement internal data-related process improvements. Working with stakeholders including data, design, product and executive teams and assisting them with data-related technical issues. 
* Conduct configuration and design of applications to better leverage the enterprise.
* Prepare data for prescriptive and predictive modeling.
* Communicate effectively with application vendors.
* Assist in the creation and quality-assurance review of design documents and test results to ensure all project requirements are satisfied.
* Advise on and implement improvements to data warehousing and data workflow architecture.
* Think outside the box to identify improvement and efficiency opportunities that streamline business and operational workflows.
* Document high-level business workflows and transform them into low-level technical requirements.
* Analyze complex information sets and communicate that information in a clear, well-organized manner.
* Communicate at varying levels of detail (30,000 ft. view, 10,000 ft. view, granular level) and produce corresponding documentation at varying levels of abstraction.
* Be an advocate for best practices and continued learning.
* Communicate with business stakeholders on the status of projects and issues.
* Prioritize and multi-task between duties at any given time.
* Demonstrate solid communication and interpersonal skills.
* Comply with company policies and procedures and all applicable laws and regulations.
* Perform general DBA work as needed.
* Maintain and troubleshoot existing ETL processes.
* Create and maintain BI reports.
* Additional duties as assigned.

REQUIRED EDUCATION / EXPERIENCE: Bachelor's degree in Computer Science or 4+ years' experience in a related field.

PREFERRED EDUCATION / EXPERIENCE:
* Experience developing data workflows.
* Ability to perform prescriptive and predictive modeling.

REQUIRED SKILLS:
* Demonstrated experience with SQL in a large database environment.
* Direct experience using SQL to develop queries or profile data.
* Experience in quantitative and qualitative analysis of data.
* Experienced-level skills in systems analysis and systems engineering.
* Ability to function as a self-starter.

REQUIRED MICROSOFT FABRIC SKILLS:
* Strong grasp of OneLake concepts: lakehouses vs. warehouses, shortcuts, mirroring, item/workspace structure.
* Hands-on with Delta Lake (Parquet, Delta tables, partitioning, V-Order, Z-ordering, VACUUM retention).
* Understanding of Direct Lake, Import, and DirectQuery trade-offs and when to use each.
* Experience designing star schemas and modern medallion architectures (bronze/silver/gold).
* Spark/PySpark notebooks (jobs, clusters, caching, optimization, broadcast joins).
* Data Factory in Fabric (pipelines): activities, triggers, parameterization, error handling/retries.
* Dataflows Gen2 (Power Query/M) for ELT, incremental refresh, and reusable transformations.
* Building and optimizing semantic models; DAX (measures, calculation groups, aggregations).
* Ability to multi-task, think on your feet and react, apply attention to detail and follow up, and work effectively and collegially with management staff and end users.
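The medallion architecture named in the skills above layers raw, cleaned, and business-ready data. A minimal, illustrative bronze → silver → gold flow can be sketched in plain SQL; this sketch uses SQLite and hypothetical table names (`bronze_sales`, `silver_sales`, `gold_sales_by_region`) purely as stand-ins for Delta tables in a Fabric lakehouse:

```python
import sqlite3

# Illustrative medallion-style flow (bronze -> silver -> gold) using SQLite.
# Table and column names are hypothetical; in Fabric this logic would run
# against Delta tables in a lakehouse, not an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: land raw records as-is, including duplicates and bad rows.
cur.execute("CREATE TABLE bronze_sales (order_id TEXT, amount REAL, region TEXT)")
cur.executemany(
    "INSERT INTO bronze_sales VALUES (?, ?, ?)",
    [("A1", 100.0, "TX"), ("A1", 100.0, "TX"),  # duplicate row
     ("A2", None, "TX"),                        # bad record: missing amount
     ("A3", 250.0, "CA")],
)

# Silver: deduplicate and drop records that fail basic quality checks.
cur.execute("""
    CREATE TABLE silver_sales AS
    SELECT DISTINCT order_id, amount, region
    FROM bronze_sales
    WHERE amount IS NOT NULL
""")

# Gold: business-level aggregate ready for reporting/semantic models.
cur.execute("""
    CREATE TABLE gold_sales_by_region AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS orders
    FROM silver_sales
    GROUP BY region
""")

rows = cur.execute(
    "SELECT region, total_amount, orders FROM gold_sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('CA', 250.0, 1), ('TX', 100.0, 1)]
```

The design point is that each layer is reproducible from the one below it: bronze preserves the raw feed, silver encodes the quality rules, and gold carries only report-ready aggregates.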
    $73k-99k yearly est. 9d ago
  • Data Platform Engineer Co-op - Fall 2026

    Delhaize America 4.6company rating

    Quincy, MA jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Co-op Program Overview: Get an insider view of the fast-changing grocery retail industry while developing relevant business, technical and leadership skills geared toward enhancing your career. This paid Co-op experience is an opportunity to help drive business results in an environment designed to promote and reward diversity, innovation and leadership. Our mission is to create impactful early talent programs that provide cohorts with meaningful project work, learning and development sessions, and mentorship opportunities. Applicants must be currently enrolled in a bachelor's or master's degree program, must be currently authorized to work in the United States on a full-time basis, and must be available from July 13, 2026 through December 4, 2026. We have a hybrid work environment that requires a minimum of three days a week in the office. Please submit your resume including your cumulative GPA. Transcripts may be requested at a future date.

* Approximate 6-month Co-op session with competitive pay
* Impactful project work to develop your skills/knowledge
* Career assistance & mentoring in obtaining full-time positions within ADUSA
* Leadership speaker sessions and development activities
* One-on-one mentoring in your area of interest
* Involvement in group community service events
* Networking and professional engagement opportunities
* Access to online career development tools and resources
* Opportunity to present project work to company leaders and gain executive visibility

Department/Position Description: Are you passionate about data? Do you like solving complex problems? Then come and join our Business Intelligence & Analytics team at Ahold Delhaize USA, a startup environment in the middle of a big corporate office. We are looking for creative, out-of-the-box problem solvers. Our Data & Analytics teams specialize in delivering a next-generation data platform and information-driven reporting solutions, and work to align technology solutions with business objectives. This development position exposes associates to Cloud technologies and Business Intelligence tools including an enterprise data lake, large-scale databases, streaming technologies, ETL/ELT, and reporting tools. This position is responsible for the technical design and development of data products, translating requirements into data pipelines and reporting solutions, and user acceptance testing to deliver effective solutions.

Qualifications:
* Must be enrolled in a BS/BA, MS, or PhD program or a recent graduate in a related field
* Genuine excitement and passion for analyzing complex datasets and converting them into the information/insights that drive business decisions at all levels of the organization
* Basic understanding of fundamentals in computer science and programming
* Understanding of business intelligence solutions utilizing advanced Excel skills and BI tools in a data warehouse or data mart environment
* Excellent communication skills
* Must be able to adapt quickly to change without being afraid to take on new responsibilities in a fast-paced team environment while being proactive and action-oriented
* Experience writing, analyzing, and troubleshooting SQL code in relational database management systems
* Experience with any one of the programming languages Java, Scala, or Python
* Exposure to ETL/ELT, data warehouse, and dimensional modeling concepts
* Ability to propose analytical strategies and solutions as business needs arise
* Exposure to big data technologies
* Experience manipulating high-volume, high-dimensionality data from varying sources to highlight patterns, anomalies, relationships, and trends
* Ability to work on data pipelines
* Excellent diagnostic, debugging, and troubleshooting skills

Skills: Computer programming, Data Management, Data Analysis, Data/Software Engineering, Teamwork, Good Communication, Technical Writing, Gen AI + MLOps

Individual cohort pay rates vary based on location, academic year, and position. ME/NC/PA/SC Salary Range: $20.90 - $35.70. IL/MA/MD Salary Range: $22.80 - $37.30. #LI-Hybrid #LI-CW1

At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
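The dimensional-modeling concepts listed in the qualifications above come down to joining a fact table to its dimension tables and aggregating by dimension attributes. A minimal star-schema sketch, with hypothetical `fact_sales` and `dim_product` tables and SQLite standing in for a real warehouse, could look like:

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to one dimension table.
# Table and column names are illustrative, not from any real warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Produce"), (2, "Bakery")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 3, 6.0), (1, 1, 2.0), (2, 2, 9.0)])

# Typical BI query shape: aggregate fact measures by a dimension attribute.
result = cur.execute("""
    SELECT d.category, SUM(f.qty) AS units, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(result)  # [('Bakery', 2, 9.0), ('Produce', 4, 8.0)]
```

Keeping measures in the narrow fact table and descriptive attributes in the dimensions is what lets the same facts be sliced by any attribute without duplicating data.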
    $20.9-35.7 hourly 60d+ ago
  • Data Engineer (AI) Co-op - Fall 2026

    Delhaize America 4.6company rating

    Chicago, IL jobs

    Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands - Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E-commerce, Technology and more.

Co-op Program Overview: Get an insider view of the fast-changing grocery retail industry while developing relevant business, technical and leadership skills geared toward enhancing your career. This paid Co-op experience is an opportunity to help drive business results in an environment designed to promote and reward diversity, innovation and leadership. Our mission is to create impactful early talent programs that provide cohorts with meaningful project work, learning and development sessions, and mentorship opportunities. Applicants must be currently enrolled in a bachelor's or master's degree program, must be currently authorized to work in the United States on a full-time basis, and must be available from July 13, 2026 through December 4, 2026. We have a hybrid work environment that requires a minimum of three days a week in the office. Please submit your resume including your cumulative GPA. Transcripts may be requested at a future date.

* Approximate 6-month Co-op session with competitive pay
* Impactful project work to develop your skills/knowledge
* Career assistance & mentoring in obtaining full-time positions within ADUSA
* Leadership speaker sessions and development activities
* One-on-one mentoring in your area of interest
* Involvement in group community service events
* Networking and professional engagement opportunities
* Access to online career development tools and resources
* Opportunity to present project work to company leaders and gain executive visibility

Department/Position Description: The Data Engineering team is primarily responsible for all web data feeds, including but not limited to pricing, fulfillment, and data warehouse operational feeds. The Co-op will use new AI frameworks and technologies, building QA AI automation with data pipelines based on the newer Microsoft Fabric platform on Azure.

Qualifications:
* Working towards a degree in Computer Science, Data Analytics, and/or Engineering
* Exposure and training with Azure AI, ChatGPT AI, or experience developing AI agents
* Experience writing APIs or interactive code with AI
* SQL experience
* Python is a plus
* Previous Co-op or Internship experience is a plus
* Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams

Individual cohort pay rates vary based on location, academic year, and position. ME/NC/PA/SC Salary Range: $20.90 - $35.70. IL/MA/MD Salary Range: $22.80 - $37.30. #LI-Hybrid #LI-CW1

At Ahold Delhaize USA, we provide services to one of the largest portfolios of grocery companies in the nation, and we're actively seeking top talent. Our team shares a common motivation to drive change, take ownership and enable our brands to better care for their customers. We thrive on supporting great local grocery brands and their strategies. Our associates are the heartbeat of our organization. We are committed to offering a welcoming work environment where all associates can succeed and thrive. Guided by our values of courage, care, teamwork, integrity (and even a little humor), we are dedicated to being a great place to work. We believe in collaboration, curiosity, and continuous learning in all that we think, create and do. While building a culture where personal and professional growth are just as important as business growth, we invest in our people, empowering them to learn, grow and deliver at all levels of the business.
    $20.9-35.7 hourly 60d+ ago
