
Data Engineer jobs at The J.M. Smucker Co. - 1172 jobs

  • Senior Developer, IS Commercial Operations

    The J. M. Smucker Company 4.8 company rating

    Data engineer job at The J.M. Smucker Co.

    Your Opportunity as the Senior Developer, IS Commercial Operations: The Smucker Information Services (IS) department enables technology solutions for capabilities that help our business perform, transform, and grow. The Senior Developer realizes this purpose by coordinating, designing, building and supporting projects and/or applications in support of our Concept to Commercialization Operations business. Projects/applications may include PLM (Oracle P4P), MDM (Syndigo), MDH (Database), Digital Asset Management (The CART by BYNDER), Commercialization Portfolio Management (Accolade), Commercialization Project Management (Pega), and/or Green Coffee (Custom). This role will be heavily involved in solution delivery and support to address an integrated technology landscape that continues to evolve toward cloud-based tools. This hands-on technical position requires proven development skills, excellent communication, curiosity to solve problems and a willingness to learn new skills under moderate guidance within a collaborative team environment. The candidate must be willing to deeply understand business processes, take on requirements/functional analysis and/or Agile Project Management as a small percentage of the role, and should seek to build meaningful partnerships/relationships with business counterparts. Location: Orrville, OH (close proximity to Cleveland/Akron). Work Arrangements: Hybrid - onsite a minimum of 9 days a month, primarily during core weeks as determined by the Company; may be more as business needs require. In this role you will: Deliver solutions - Under moderate guidance, design, build and support software applications, integrations and other related technologies that meet business requirements, factor in supportability and balance cost versus benefit. May lead technical activities for smaller projects. Provide input to work plans and estimates based on experience with development activities. Author technical specifications for moderately complex solutions. May own full solution design for smaller projects. Follow documented standards for development, code promotion, and change management. Instruct junior resources as appropriate. Own the design, construction and execution of technical solution testing, including unit, automation, integration and performance tests. Independently troubleshoot and resolve defects. When applicable, participate in mock cutover exercises to prevent disruption and issues once live. Support and maintain existing solutions - Provide troubleshooting and fixes for complex issues, driving understanding of root cause and prioritization. Follow defined support paths and incident management processes to meet Service Level Agreements (SLAs). Develop knowledge base and Standard Operating Procedures (SOPs) for technical support plans. Execute activities to support ongoing maintenance and periodic releases of software. Strengthen development capabilities - Contribute to standardized code solutions and automation opportunities. Learn new tools and apply modern IT concepts to support the ongoing shift toward cloud-based technologies. The Right Place for You: We are bold, kind, strive to do the right thing, play to win, and believe in a strong community that thrives together. Our culture is rooted in our Basic Beliefs, and we believe in supporting every employee by meeting their physical, emotional, and financial needs. 
What we are looking for: Minimum Requirements: 3+ years of work experience as a developer, with either a Bachelor's degree in a STEM discipline or specialized training in Information Technology. Experience with multiple phases of the software development lifecycle (SDLC) and formal delivery methodologies/frameworks (Traditional/Waterfall, Agile, DevOps-GitHub preferred). Experience working on teams with assignment due dates or service level agreements (SLAs) to support customer needs. Knowledge and ability to build solutions based on business requirements. C#, Python, Java or equivalent language experience SQL or PL/SQL experience Knowledge of data structures, algorithms, formats and integration methods. Knowledge of enterprise toolsets for integration, reporting, process orchestration and/or scheduling. Ability to unit test, troubleshoot and debug developed code, tuning for performance or other optimization/scalability objectives. Experience with IT service/work management systems (for incidents, problems and work requests) and code management processes i.e., Service Now, Microsoft ADO, JIRA Experience in requirements analysis/documentation Additional skills and experience that we think would make someone successful in this role: OOP (Object Oriented Programming) System architecture patterns Windows/Unix scripting knowledge Web development (JavaScript, HTML, CSS) Low-Code/No-Code experience (Pega preferred) Experience with data integration and ETL tools, particularly Informatica Intelligent Cloud Service (IICS) or SnapLogic iPaaS. Experience developing solutions on major cloud service provider platforms (AWS, Azure, GCP). Experience with Software-as-a-Service (SaaS) implementations within an integrated enterprise environment. Experience leading Agile Ceremonies or Scrum Master/Iteration Lead certification. Experience in the Consumer-Packaged Goods (CPG) industry. Learn more about working at Smucker: Helping our Employees Thrive Delivering on Our Purpose Our Continued Commitment to Ensuring a Workplace for All Follow us on LinkedIn #LI-Hybrid
    $85k-106k yearly est. Auto-Apply 57d ago

  • FPGA Engineer

    Lincoln Electric 4.6 company rating

    Euclid, OH jobs

    Lincoln Electric is the world leader in the engineering, design, and manufacturing of advanced arc welding solutions, automated joining, assembly and cutting systems, and plasma and oxy-fuel cutting equipment, and has a leading global position in brazing and soldering alloys. Lincoln is recognized as the Welding Expert™ for its leading materials science, software development, automation engineering, and application expertise, which advance customers' fabrication capabilities to help them build a better world. Headquartered in Cleveland, Ohio, Lincoln Electric is a $4.2B publicly traded company (NASDAQ:LECO) with over 12,000 employees around the world and operations in 71 manufacturing and automation system integration locations across 21 countries, and it maintains a worldwide network of distributors and sales offices serving customers in over 160 countries. Location: Euclid - 22801. Employment Status: Salary Full-Time. Function: Engineering. Pay Grade and Range: US10-E-31; Level III - Min: $105,560 - Mid: $124,188; Level IV - Min: $133,043 - Mid: $156,521. Bonus Plan: AIP. AIP Target Bonus: Level III - 10%; Level IV - 15%. Purpose: Lincoln Electric is seeking a highly capable FPGA (Field-Programmable Gate Array) Design Engineer to join our R&D team. This role will focus on the architecture, design, and implementation of FPGA-based systems for embedded and high-performance applications. The ideal candidate will have deep experience with VHDL, timing analysis and closure, and integration of FPGA logic with ARM-based processing systems via AXI and other interconnect protocols. Familiarity with AMD (Xilinx), Intel (Altera), or Microchip (Microsemi) FPGA platforms is essential. Duties and Responsibilities: FPGA Architecture & Design - Develop and maintain VHDL-based designs for control, signal processing, and communication subsystems. Architect modular and reusable IP blocks for integration into complex FPGA systems. Collaborate with hardware and software engineers to define functional requirements and partition logic between hardware, firmware, and software. Timing Analysis & Closure - Perform static timing analysis and achieve timing closure across multiple clock domains. Optimize designs for performance, area, and power using synthesis and place-and-route tools. Debug timing violations and implement constraints using industry-standard tools. Processor Interfacing & System Integration - Design and implement AXI-based interfaces to ARM processors and other embedded subsystems. Integrate FPGA logic with SoC platforms and manage data flow between programmable logic and software. Support development of device drivers and firmware for FPGA-accelerated functions. Verification & Validation - Develop testbenches and simulation environments using VHDL. Perform functional and formal verification of FPGA designs. Support hardware bring-up and lab testing using logic analyzers, oscilloscopes, and JTAG tools. Cross-Functional Collaboration - Work closely with embedded software, hardware, and systems teams to ensure seamless integration. Participate in design reviews and contribute to system-level architecture decisions. Document design specifications, test results, and performance metrics. Innovation & Continuous Improvement - Stay current with FPGA technologies, high-level synthesis, and hardware acceleration trends. Evaluate new tools, platforms, and methodologies to improve design efficiency and reliability. 
Basic Requirements Bachelor's degree in Electrical Engineering, Computer Engineering, or related field. Level III: 5+ years of relevant experience. Works independently: receives minimal guidance. May lead projects or project steps within a broader project or have accountability for ongoing activities or objectives. Level IV: 8+ years of relevant experience. Recognized as an expert in own area within the organization. Works independently, with guidance in only the most complex situations. 3+ years of experience in FPGA design and development using VHDL. Proficiency with AMD/Xilinx, Intel/Altera, and/or Microchip/Microsemi FPGA platforms. Strong understanding of timing analysis, constraints, and closure techniques. Experience with AXI interconnects and integration with ARM-based processing systems. Familiarity with simulation and verification tools such as VUnit or Vivado Simulator. Hands-on experience with lab equipment such as oscilloscopes and logic analyzers. Excellent problem-solving skills and ability to work in cross-functional teams. Strong written and verbal communication skills. Preferred Requirements Experience with high-speed interfaces (e.g. PCI, Ethernet, DDR). Knowledge of High-Level Synthesis tools. Familiarity with embedded Linux and device driver development. Understanding of security implications within FPGA-based embedded systems. Experience with FPGA-based control systems and digital signal processing. Lincoln Electric is an Equal Opportunity Employer. We are committed to promoting equal employment opportunity for applicants, without regard to their race, color, national origin, religion, sex (including pregnancy, childbirth, or related medical conditions, including, but not limited to, lactation), sexual orientation, gender identity, age, veteran status, disability, genetic information, and any other category protected by federal, state, or local law.
    $124.2k-156.5k yearly 1d ago
  • Senior Data Warehouse & BI Developer

    Ariat International 4.7 company rating

    San Leandro, CA jobs

    About the Role We're looking for a Senior Data Warehouse & BI Developer to join our Data & Analytics team and help shape the future of Ariat's enterprise data ecosystem. You'll design and build data solutions that power decision-making across the company, from eCommerce to finance and operations. In this role, you'll take ownership of data modeling, and BI reporting using Cognos and Tableau, and contribute to the development of SAP HANA Calculation Views. If you're passionate about data architecture, visualization, and collaboration - and love learning new tools - this role is for you. You'll Make a Difference By Designing and maintaining Ariat's enterprise data warehouse and reporting architecture. Developing and optimizing Cognos reports for business users. Collaborating with the SAP HANA team to develop and enhance Calculation Views. Translating business needs into technical data models and actionable insights. Ensuring data quality through validation, testing, and governance practices. Partnering with teams across the business to improve data literacy and reporting capabilities. Staying current with modern BI and data technologies to continuously evolve Ariat's analytics stack. About You 7+ years of hands-on experience in BI and Data Warehouse development. Advanced skills in Cognos (Framework Manager, Report Studio). Strong SQL skills and experience with data modeling (star schemas, dimensional modeling). Experience building and maintaining ETL processes. Excellent analytical and communication skills. A collaborative, learning-oriented mindset. Experience developing SAP HANA Calculation Views preferred Experience with Tableau (Desktop, Server) preferred Knowledge of cloud data warehouses (Snowflake, BigQuery, etc.). Background in retail or eCommerce analytics. Familiarity with Agile/Scrum methodologies. About Ariat Ariat is an innovative, outdoor global brand with roots in equestrian performance. We develop high-quality footwear and apparel for people who ride, work, and play outdoors, and care about performance, quality, comfort, and style. The salary range for this position is $120,000 - $150,000 per year. The salary is determined by the education, experience, knowledge, skills, and abilities of the applicant, internal equity, and alignment with market data for geographic locations. Ariat in good faith believes that this posted compensation range is accurate for this role at this location at the time of this posting. This range may be modified in the future. Ariat's holistic benefits package for full-time team members includes (but is not limited to): Medical, dental, vision, and life insurance options Expanded wellness and mental health benefits Paid time off (PTO), paid holidays, and paid volunteer days 401(k) with company match Bonus incentive plans Team member discount on Ariat merchandise Note: Availability of benefits may be subject to location & employment type and may have certain eligibility requirements. Ariat reserves the right to alter these benefits in whole or in part at any time without advance notice. Ariat will consider qualified applicants, including those with criminal histories, in a manner consistent with state and local laws. Ariat is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis protected under federal, state, or local law. Ariat is committed to providing reasonable accommodations to candidates with disabilities. 
If you need an accommodation during the application process, email *************************. Please see our Employment Candidate Privacy Policy at ********************* to learn more about how we collect, use, retain and disclose Personal Information. Please note that Ariat does not accept unsolicited resumes from recruiters or employment agencies. In the absence of a signed Agreement, Ariat will not consider or agree to payment of any referral compensation or recruiter/agency placement fee. In the event a recruiter or agency submits a resume or candidate without a previously signed Agreement, Ariat explicitly reserves the right to pursue and hire those candidate(s) without any financial obligation to the recruiter or agency. Any unsolicited resumes, including those submitted directly to hiring managers, are deemed to be the property of Ariat.
    $120k-150k yearly 5d ago
  • Software Engineer

    Plug 3.8 company rating

    Santa Monica, CA jobs

    Plug is the only wholesale platform built exclusively for used electric vehicles. Designed for dealers and commercial consignors, Plug combines EV-specific data, systems and expertise to bring clarity and confidence to the wholesale buying and selling process. With the addition of Trade Desk™, dealers can quickly receive cash offers or list EV trade-ins directly into the auction, removing friction and maximizing returns. By replacing outdated wholesale methods with tools tailored to EVs, Plug empowers dealers to make faster and more profitable decisions with a partner they can trust. For more information, visit ***************** The Opportunity This is an on site role in Santa Monica, CA. We are looking for a Software Engineer to join our growing team! A full-stack software engineer who will report directly to our CTO, and who will own entire customer-facing products. We're building systems like multi-modal AI-enabled data onramps for EVs, near-real time API connectivity to the vehicles, and pricing intelligence tooling. As a member of the team you'll help lay the technical and product foundation for our growing business. We're building a culture that cares about collaboration, encourages intellectual honesty, celebrates technical excellence, and is driven by careful attention to detail and planning for the future. We believe diversity of perspective and experience are key to building great technology and a thriving team. Sound cool? Let's work together. Key Responsibilities Collaborate with colleagues and be a strong voice in product design sessions, architecture discussions, and code reviews. Design, implement, test, debug, and document work on new and existing software features and products, ensuring they meet business, quality, and operational needs. Write clear, efficient, and scalable code with an eye towards flexibility and maintainability. Take ownership of features and products, and support their planning and development by understanding the ultimate goal and evaluating effort, risk, and priority in an agile environment. Own and contribute to team productivity and process improvements. Use and develop APIs to create integrations between Plug and 3rd party platforms. Be an integral part of a close team of developers; this is an opportunity to help shape a nascent team culture. The ideal candidate will be a high-growth individual able to grow their career as the team grows. Qualifications 4-6 years of hands-on experience developing technical solutions Advanced understanding of web application technologies, both backend and frontend as well as relational databases. Familiarity with Cloud PaaS deployments. Familiarity with TypeScript or any other modern typed language. Familiarity with and positive disposition toward code generation AI tooling. Strong analytical and quantitative skills. Strong verbal and written communication skills with a focus on conciseness. A self-directed drive to deliver end-to-end solutions with measurable goals and results. Understanding and accepting of the ever-changing controlled chaos that is an early startup, and willing to work within that chaos to improve processes and outcomes. Experience balancing contending priorities and collaborating with colleagues to reach workable compromises. A proven track record of gaining trust and respect by consistently demonstrating sound critical-thinking and a risk-adjusted bias toward action. You pride yourself on having excellent reliability and integrity. 
Extraordinary grit; smart, creative, and persistent personality. Authorized to work in the US for any employer. Having worked in automotive or EV systems is a plus. Compensation and Benefits Annual Salary: 130K - 150K Equity: TBD Benefits: Health, vision, and dental insurance. Lunch stipend. Parking. This full-time position is based in Santa Monica, CA. We welcome candidates from all locations to apply, provided they are willing to relocate for the role. Relocation assistance will not be provided for successful candidates. Sponsorship not available at this time. Plug is an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. And if you do, you suck.
    $108k-148k yearly est. 2d ago
  • Systems Software Engineer

    Sunbelt Controls 3.3 company rating

    Denver, CO jobs

    Now Hiring: Systems Software Engineer II 📍 Denver, Colorado | 💰 $108,000 - $135,000 per year 🏢 About the Role We're looking for an experienced Systems Software Engineer II to join Sunbelt Controls, a leading provider of Building Automation System (BAS) solutions across the Western U.S. In this role, you'll develop and program databases, create custom graphics, and integrate control systems for smart buildings. You'll also support project startups, commissioning, and troubleshooting - working closely with project managers and engineers to deliver high-quality, energy-efficient building automation solutions. If you have a passion for technology, problem-solving, and helping create intelligent building systems, this opportunity is for you. ⚙️ What You'll Do Design and program BAS control system databases and graphics for assigned projects. Lead the startup, commissioning, and troubleshooting of control systems. Work with networked systems and diagnose LAN/WAN connectivity issues. Perform pre-functional and functional system testing, including LEED and Title 24 requirements. Manage project documentation, including as-builts and commissioning records. Coordinate with project teams, subcontractors, and clients for smooth execution. Mentor and support junior Systems Software Engineers. 🧠 What We're Looking For 2-5 years of experience in Building Automation Systems or a related field. Associate's degree in a technical field (Bachelor's in Mechanical or Electrical Engineering preferred). Proficiency in MS Office, Windows, and basic TCP/IP networking. Strong organizational skills and the ability to manage multiple priorities. Excellent communication and customer-service skills. Valid Colorado driver's license. 💎 Why You'll Love Working With Us At Sunbelt Controls, we don't just build smart buildings - we build smart careers. As a 100% employee-owned company (ESOP), we offer a supportive, growth-oriented environment where innovation and teamwork thrive. What we offer: Competitive salary: $108K - $135K, based on experience Employee-owned company culture with a family-oriented feel Comprehensive health, dental, and vision coverage Paid time off, holidays, and 401(k)/retirement plan Professional growth, mentorship, and ongoing learning opportunities Veteran-friendly employer & Equal Opportunity workplace 🌍 About Sunbelt Controls Sunbelt Controls is a premier BAS solutions provider serving clients across multiple industries, including data centers, healthcare, education, biotech, and commercial real estate. We specialize in smart building technology, system retrofits, analytics, and energy efficiency - helping clients reduce operational costs and achieve sustainable performance. 👉 Apply today to join a team that's shaping the future of intelligent buildings. #Sunbelt #BuildingAutomation #SystemsEngineer #HVACControls #BASCareers
    $108k-135k yearly 1d ago
  • Systems Software Engineer

    Sunbelt Controls 3.3 company rating

    Pleasanton, CA jobs

    Now Hiring: Systems Software Engineer II 📍 Pleasanton, CA | 💰 $108,000 - $135,000 per year 🏢 About the Role We're looking for an experienced Systems Software Engineer II to join Sunbelt Controls, a leading provider of Building Automation System (BAS) solutions across the Western U.S. In this role, you'll develop and program databases, create custom graphics, and integrate control systems for smart buildings. You'll also support project startups, commissioning, and troubleshooting - working closely with project managers and engineers to deliver high-quality, energy-efficient building automation solutions. If you have a passion for technology, problem-solving, and helping create intelligent building systems, this opportunity is for you. ⚙️ What You'll Do Design and program BAS control system databases and graphics for assigned projects. Lead the startup, commissioning, and troubleshooting of control systems. Work with networked systems and diagnose LAN/WAN connectivity issues. Perform pre-functional and functional system testing, including LEED and Title 24 requirements. Manage project documentation, including as-builts and commissioning records. Coordinate with project teams, subcontractors, and clients for smooth execution. Mentor and support junior Systems Software Engineers. 🧠 What We're Looking For 2-5 years of experience in Building Automation Systems or a related field. Associate's degree in a technical field (Bachelor's in Mechanical or Electrical Engineering preferred). Proficiency in MS Office, Windows, and basic TCP/IP networking. Strong organizational skills and the ability to manage multiple priorities. Excellent communication and customer-service skills. Valid California driver's license. 💎 Why You'll Love Working With Us At Sunbelt Controls, we don't just build smart buildings - we build smart careers. As a 100% employee-owned company (ESOP), we offer a supportive, growth-oriented environment where innovation and teamwork thrive. What we offer: Competitive salary: $108K - $135K, based on experience Employee-owned company culture with a family-oriented feel Comprehensive health, dental, and vision coverage Paid time off, holidays, and 401(k)/retirement plan Professional growth, mentorship, and ongoing learning opportunities Veteran-friendly employer & Equal Opportunity workplace 🌍 About Sunbelt Controls Sunbelt Controls is a premier BAS solutions provider serving clients across multiple industries, including data centers, healthcare, education, biotech, and commercial real estate. We specialize in smart building technology, system retrofits, analytics, and energy efficiency - helping clients reduce operational costs and achieve sustainable performance. 👉 Apply today to join a team that's shaping the future of intelligent buildings. #Sunbelt #BuildingAutomation #SystemsEngineer #HVACControls #BASCareers
    $108k-135k yearly 1d ago
  • Senior Data Engineer

    Samsara 4.7 company rating

    Remote

    Who we are Samsara (NYSE: IOT) is the pioneer of the Connected Operations™ Cloud, which is a platform that enables organizations that depend on physical operations to harness Internet of Things (IoT) data to develop actionable insights and improve their operations. At Samsara, we are helping improve the safety, efficiency and sustainability of the physical operations that power our global economy. Representing more than 40% of global GDP, these industries are the infrastructure of our planet, including agriculture, construction, field services, transportation, and manufacturing - and we are excited to help digitally transform their operations at scale. Working at Samsara means you'll help define the future of physical operations and be on a team that's shaping an exciting array of product solutions, including Video-Based Safety, Vehicle Telematics, Apps and Driver Workflows, and Equipment Monitoring. As part of a recently public company, you'll have the autonomy and support to make an impact as we build for the long term. About the role: We are looking for a Data Engineer to join our team. Data and Analytics is a critical team within Business Technology. Our mission is to enable integrated data layers for all of Samsara and Samsara customers with the insights, tools, infrastructure and consultation to make data driven decisions. We are a growing team that loves all things data! The team will be composed of data engineers, architects, analysts and data scientists. We are passionate about leveraging world class data and analytics to deliver a great customer experience. Our team promotes an agile, collaborative, supportive environment where diverse thinking, innovative design, and experimentation is welcomed and encouraged. This role is open to candidates residing in the US and Canada except San Francisco Bay Area and New York Metro City. You should apply if: You want to impact the industries that run our world: Your efforts will result in real-world impact-helping to keep the lights on, get food into grocery stores, reduce emissions, and most importantly, ensure workers return home safely. You are the architect of your own career: If you put in the work, this role won't be your last at Samsara. We set up our employees for success and have built a culture that encourages rapid career development, countless opportunities to experiment and master your craft in a hyper growth environment. You're energized by our opportunity: The vision we have to digitize large sectors of the global economy requires your full focus and best efforts to bring forth creative, ambitious ideas for our customers. You want to be with the best: At Samsara, we win together, celebrate together and support each other. You will be surrounded by a high-caliber team that will encourage you to do your best. In this role, you will: Develop and maintain E2E data pipelines, backend ingestion and participate in the build of Samsara's Data Platform to enable advanced automation and analytics. Work with data from a variety of sources including but not limited to: CRM data, Product data, Marketing data, Order flow data, Support ticket volume data. Manage critical data pipelines to enable our growth initiatives and advanced analytics. Facilitate data integration and transformation requirements for moving data between applications; ensuring interoperability of applications with data layers and data lake. Develop and improve the current data architecture, data quality, monitoring, observability and data availability. 
    Write data transformations in SQL/Python to generate data products consumed by customer systems and the Analytics, Marketing Operations, and Sales Operations teams. Champion, role model, and embed Samsara's cultural principles (Focus on Customer Success, Build for the Long Term, Adopt a Growth Mindset, Be Inclusive, Win as a Team) as we scale globally and across new offices. Minimum requirements for the role: A Bachelor's degree in computer science, data engineering, data science, information technology, or an equivalent engineering program. 8+ years of work experience as a Data Engineer. 3+ years of experience in building/maintaining large-scale, production-grade, end-to-end data pipelines, including data modeling. Experience with modern cloud-based data-lake and data-warehousing technology stacks, and familiarity with typical data-engineering tools, ETL/ELT, and data-warehousing processes and best practices. Experience with leading end-to-end projects, including being the central point of contact for stakeholders. Provide mentorship to junior team members, and provide technical guidance, training, and knowledge-sharing across teams. Engage directly with internal cross-functional stakeholders to understand their data needs and design scalable solutions. Experience with the following: 3+ years in Python, SQL. Exposure to ETL tools such as Fivetran, DBT or equivalent. API: Exposure to Python-based API frameworks for data pipelines. RDBMS: MySQL, AWS RDS/Aurora MySQL, PostgreSQL, Oracle, MS SQL Server or equivalent. Cloud: AWS, Azure and/or GCP. Data warehouse: Databricks, Google BigQuery, AWS Redshift, Snowflake or equivalent. An ideal candidate also has: Comfort in working with business customers to gather requirements and gain a deep understanding of varied datasets. Familiarity working with Spark/PySpark and Terraform. A self-starter, motivated, responsible, innovative and technology-driven approach, performing well both solo and as a team member. A proactive problem-solving mindset and good communication as well as project management skills to relay findings and solutions across technical and non-technical audiences. Logging and Monitoring: One or more of Splunk, DataDog, AWS CloudWatch or equivalent. AWS Serverless: AWS API Gateway, Lambda, S3, SNS, SQS, SecretsManager. The range of annual base salary for full-time employees for this position is below. Please note that base pay offered may vary depending on factors including your city of residence, job-related knowledge, skills, and experience. $117,215-$203,895 USD. At Samsara, we welcome everyone regardless of their background. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender, gender identity, sexual orientation, protected veteran status, disability, age, and other characteristics protected by law. We depend on the unique approaches of our team members to help us solve complex problems and want to ensure that Samsara is a place where people from all backgrounds can make an impact. Benefits Full-time employees receive a competitive total compensation package along with employee-led remote and flexible working, health benefits, and much, much more. Take a look at our Benefits site to learn more. Accommodations Samsara is an inclusive work environment, and we are committed to ensuring equal opportunity in employment for qualified persons with disabilities. 
Please email ********************************** or click here if you require any reasonable accommodations throughout the recruiting process. Flexible Working At Samsara, we embrace a flexible working model that caters to the diverse needs of our teams. Our offices are open for those who prefer to work in-person and we also support remote work where it aligns with our operational requirements. For certain positions, being close to one of our offices or within a specific geographic area is important to facilitate collaboration, access to resources, or alignment with our service regions. In these cases, the job description will clearly indicate any working location requirements. Our goal is to ensure that all members of our team can contribute effectively, whether they are working on-site, in a hybrid model, or fully remotely. All offers of employment are contingent upon an individual's ability to secure and maintain the legal right to work at the company and in the specified work location, if applicable. Fraudulent Employment Offers Samsara is aware of scams involving fake job interviews and offers. Please know we do not charge fees to applicants at any stage of the hiring process. Official communication about your application will only come from emails ending in ‘@samsara.com' or ‘@us-greenhouse-mail.io'. For more information regarding fraudulent employment offers, please visit our blog post here.
    $117.2k-203.9k yearly Auto-Apply 1d ago
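
The Samsara listing above describes writing SQL/Python transformations that turn raw operational records into data products for Analytics, Marketing Operations, and Sales Operations. As a rough, hypothetical illustration of that kind of work (the table names, columns, and metric definition below are invented for the sketch, not Samsara's), a transformation might run SQL from Python against a warehouse to materialize a weekly ticket-volume data product:

```python
import sqlite3

# Hypothetical sketch: aggregate raw support tickets into a weekly
# "ticket volume per account" data product. An in-memory SQLite database
# stands in for a real warehouse connection.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_support_tickets (
        ticket_id   INTEGER PRIMARY KEY,
        account_id  TEXT NOT NULL,
        opened_at   TEXT NOT NULL   -- ISO-8601 date
    );
    INSERT INTO raw_support_tickets VALUES
        (1, 'acme',   '2024-06-03'),
        (2, 'acme',   '2024-06-04'),
        (3, 'globex', '2024-06-05');

    -- The "data product": weekly ticket counts per account, materialized
    -- as a table that downstream teams can query directly.
    CREATE TABLE product_weekly_ticket_volume AS
    SELECT
        account_id,
        strftime('%Y-%W', opened_at) AS iso_week,
        COUNT(*)                     AS ticket_count
    FROM raw_support_tickets
    GROUP BY account_id, iso_week;
""")

for row in conn.execute(
    "SELECT * FROM product_weekly_ticket_volume ORDER BY account_id"
):
    print(row)
```

In a production pipeline the same pattern would typically be expressed as a dbt model or an orchestrated SQL task rather than an ad hoc script.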
  • Senior Data Platform Engineer

    Samsara 4.7 company rating

    Remote

    Who we are Samsara (NYSE: IOT) is the pioneer of the Connected Operations™ Cloud, which is a platform that enables organizations that depend on physical operations to harness Internet of Things (IoT) data to develop actionable insights and improve their operations. At Samsara, we are helping improve the safety, efficiency and sustainability of the physical operations that power our global economy. Representing more than 40% of global GDP, these industries are the infrastructure of our planet, including agriculture, construction, field services, transportation, and manufacturing - and we are excited to help digitally transform their operations at scale. Working at Samsara means you'll help define the future of physical operations and be on a team that's shaping an exciting array of product solutions, including Video-Based Safety, Vehicle Telematics, Apps and Driver Workflows, and Equipment Monitoring. As part of a recently public company, you'll have the autonomy and support to make an impact as we build for the long term. About the role The Integrations, Data Engineering and AI (IDEA) team within Samsara's Business Technology organization is looking for a Senior Data Platform Engineer to join our team. We enable various teams at Samsara to leverage GenAI and data to glean insights and make data-driven decisions by delivering a reliable data platform and trustworthy data and analytics. We are a team that is passionate about leveraging quality data and AI to deliver a great customer experience. Our team promotes an agile, collaborative, supportive environment where diverse thinking, innovative design, and experimentation is welcomed and encouraged. This role is open to candidates residing in the US except Alaska, Austin Metro, Boulder Metro, California, Chicago Metro, Connecticut, Dallas Metro, Denver Metro, Houston Metro, Maryland, Massachusetts, New Jersey, New York, Rhode Island, Seattle Metro, and Washington, D.C. You should apply if: You want to impact the industries that run our world: Your efforts will result in real-world impact-helping to keep the lights on, get food into grocery stores, reduce emissions, and most importantly, ensure workers return home safely. You are the architect of your own career: If you put in the work, this role won't be your last at Samsara. We set up our employees for success and have built a culture that encourages rapid career development, countless opportunities to experiment and master your craft in a hyper growth environment. You're energized by our opportunity: The vision we have to digitize large sectors of the global economy requires your full focus and best efforts to bring forth creative, ambitious ideas for our customers. You want to be with the best: At Samsara, we win together, celebrate together and support each other. You will be surrounded by a high-caliber team that will encourage you to do your best. In this role, you will: Databricks Administer and monitor Databricks workspaces and underlying AWS Infra: develop platform admin tools and capabilities to effectively administer, monitor, track and troubleshoot various platform resources (clusters, catalogs, users, groups, databases, storage and security). Act as the primary Databricks SME, assisting and guiding teams on platform features and best practices. Collaborate with data engineers, analysts, and data scientists to operationalize and optimize workflows and data pipelines. 
Infrastructure & DevOps Oversee the data-engineering dev ecosystem, including dev-tooling, CI/CD pipelines, and monitoring frameworks. Own implementation and management of IaC (Terraform, CloudFormation) and CI/CD automation for Databricks and AWS resources. Ensure security, high availability, disaster recovery, and compliance across data services. Incorporate DevOps best practices including automated testing, deployment, observability, and monitoring. Cross-Functional Collaboration & Enablement Engage directly with internal cross-functional stakeholders to understand their data needs and design scalable solutions. Collaborate with engineers, managers, and vendor representatives in evaluating new features and solutions. Lead rapid prototyping and proof-of-concepts to evaluate for performance, cost, security, scalability, observability. Provide mentorship to junior team members, and provide technical guidance, training, and knowledge-sharing across teams. Minimum requirements for this role: Bachelor's or Master's degree in Computer Science, Software Engineering, Electrical Engineering, Computer Engineering, or a related discipline. 10+ years of experience in a Software Engineering, Platform Engineering, Data Engineering, DevOps/DataOps or similar technical role, including at least 2+ years of experience in a data infrastructure or platform-focused role. Experience building, delivering and administering large-scale production-grade data platforms and services to data engineering, business analyst and data-science teams. Solid knowledge of Databricks features and administration, including Unity Catalog, cluster management, security, conducting troubleshooting and root-cause analysis, and performance optimization. Experience and familiarity with AWS services and Cloud Infrastructure provisioning, management, monitoring, and security (e.g., S3, IAM, RDS, Lambda, API Gateway, VPC, EC2, ECS/EKS). Strong knowledge of SQL and Python, and hands-on data-engineering experience in designing and developing data pipelines and ETL routines from a variety of sources (SaaS corporate systems, APIs, RDBMS). Skilled with change-data capture, incremental and batch loading techniques. Experienced in troubleshooting underlying issues and translating them into technical solutions. Excellent problem-solving, communication, and stakeholder management skills. An ideal candidate also has: Experience delivering and managing Databricks and AWS Infra for 100s of users. Experience as a technical lead. Knowledge of DevOps tools and practices: GitHub/GitLab, CI/CD systems, Terraform, monitoring/logging tools (e.g., Datadog, CloudWatch). Experience implementing robust data governance and security measures to ensure data integrity and compliance. Samsara's Compensation Philosophy: Samsara's compensation program is designed to deliver Total Direct Compensation (based on role, level, and geography) that is at or above market. We do this through our base salary + bonus/variable + restricted stock unit awards (RSUs) for eligible roles. For eligible roles, a new hire RSU award may be awarded at the time of hire, and additional RSU refresh grants may be awarded annually. We pay for performance, and top performers in eligible roles may receive above-market equity refresh awards which allow employees to achieve higher market. The range of annual base salary for full-time employees for this position is below. 
Please note that base pay offered may vary depending on factors including your city of residence, job-related knowledge, skills, and experience. $125,545-$168,800 USD. At Samsara, we welcome everyone regardless of their background. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender, gender identity, sexual orientation, protected veteran status, disability, age, and other characteristics protected by law. We depend on the unique approaches of our team members to help us solve complex problems and want to ensure that Samsara is a place where people from all backgrounds can make an impact. Benefits Full-time employees receive a competitive total compensation package along with employee-led remote and flexible working, health benefits, and much, much more. Take a look at our Benefits site to learn more. Accommodations Samsara is an inclusive work environment, and we are committed to ensuring equal opportunity in employment for qualified persons with disabilities. Please email ********************************** or click here if you require any reasonable accommodations throughout the recruiting process. Flexible Working At Samsara, we embrace a flexible working model that caters to the diverse needs of our teams. Our offices are open for those who prefer to work in-person and we also support remote work where it aligns with our operational requirements. For certain positions, being close to one of our offices or within a specific geographic area is important to facilitate collaboration, access to resources, or alignment with our service regions. In these cases, the job description will clearly indicate any working location requirements. Our goal is to ensure that all members of our team can contribute effectively, whether they are working on-site, in a hybrid model, or fully remotely. All offers of employment are contingent upon an individual's ability to secure and maintain the legal right to work at the company and in the specified work location, if applicable. Fraudulent Employment Offers Samsara is aware of scams involving fake job interviews and offers. Please know we do not charge fees to applicants at any stage of the hiring process. Official communication about your application will only come from emails ending in ‘@samsara.com' or ‘@us-greenhouse-mail.io'. For more information regarding fraudulent employment offers, please visit our blog post here.
    $125.5k-168.8k yearly Auto-Apply 1d ago
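
The platform role above centers on administering and monitoring Databricks workspaces and building small admin tools around them. A minimal sketch of such a tool is shown below, assuming the databricks-sdk Python package and a workspace configured via the SDK's standard environment variables or config profile; the 120-minute auto-termination policy is an invented example, not Samsara's actual standard.

```python
from databricks.sdk import WorkspaceClient

# Assumes credentials are supplied through DATABRICKS_HOST/DATABRICKS_TOKEN
# or a config profile, which the SDK resolves automatically.
w = WorkspaceClient()

# Hypothetical policy: interactive clusters should auto-terminate
# within 120 minutes of inactivity.
MAX_AUTOTERMINATION_MINUTES = 120

for cluster in w.clusters.list():
    # state is an enum on ClusterDetails; .name gives e.g. "RUNNING".
    state = getattr(cluster.state, "name", None)
    autoterm = cluster.autotermination_minutes or 0  # 0 means "never terminate"
    if state == "RUNNING" and (autoterm == 0 or autoterm > MAX_AUTOTERMINATION_MINUTES):
        print(
            f"Cluster {cluster.cluster_name} ({cluster.cluster_id}) is running "
            f"with autotermination_minutes={autoterm}; review against policy."
        )
```

A real admin tool would likely push these findings to a monitoring channel or ticketing system rather than printing them.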
  • Senior Data Engineer, Data Platform

    Otter 4.4 company rating

    Mountain View, CA jobs

    The Opportunity We are looking for a Senior Data Engineer to join our Data Platform team and build the core data foundations that power analytics, experimentation, and decision-making across the company. In this role, you will design and own foundational data models, pipelines, and platforms that enable self-serve analytics and trustworthy insights at scale. You will partner closely with Growth, Product, Sales, Marketing, Data Insights, and Engineering teams as a technical thought leader who helps teams extract meaningful value from data. This is a high-ownership role in a growing company, and you will have a direct impact on how data is collected, modeled, and used to drive the business forward. Your Impact * Build and own foundational data models that enable self-serve analytics and key business metrics * Design, operate, and scale reliable data pipelines and platforms * Partner with stakeholders to translate business needs into trusted data products * Influence data collection and logging standards across production systems * Establish best practices for data modeling, quality, and governance * Implement data quality checks, statistical validation, and anomaly detection * Provide technical leadership that improves data reliability and data literacy We're Looking for Someone Who * Has 5+ years of experience in data engineering * Has a Bachelor's degree in Computer Science or equivalent experience * Is highly hands-on, actively contributing code and driving projects to completion. * Has strong programming skills in Python and SQL * Has expertise with cloud data warehouses such as Snowflake, BigQuery, or Databricks * Has experience building and maintaining ETL/ELT pipelines * Is experienced with workflow orchestration (e.g., Airflow, dbt) * Has strong data modeling skills and understands how to design for analytical workloads * Has experience with cloud platforms (preferably AWS) * Has worked in fast-paced tech, AI or SaaS environments Nice to Haves * Experience with data governance, data catalogs, and data quality frameworks * Familiarity with BI or data visualization tools * Experience with experimentation or A/B testing frameworks * Exposure to ML pipelines, feature engineering, or applied data science * Experience with streaming or near-real-time data systems About Otter.ai We are in the business of shaping the future of work. Our mission is to make conversations more valuable. With over 1B meetings transcribed, Otter.ai is the world's leading tool for meeting transcription, summarization, and collaboration. Using artificial intelligence, Otter generates real-time automated meeting notes, summaries, and other insights from in-person and virtual meetings - turning meetings into accessible, collaborative, and actionable data that can be shared across teams and organizations. The company is backed by early investors in Google, DeepMind, Zoom, and Tesla. Otter.ai is an equal opportunity employer. We proudly celebrate diversity and are committed to building an inclusive and accessible workplace. We provide reasonable accommodations for qualified applicants throughout the hiring process. Accessibility & Accommodations Otter.ai is committed to providing reasonable accommodations for candidates with disabilities in our hiring process. If you need assistance or an accommodation during any stage of the recruitment process, please contact *********** at least 3 business days before your interview. 
    * Otter.ai does not accept unsolicited resumes from 3rd party recruitment agencies without a written agreement in place for permanent placements. Any resume or other candidate information submitted outside of established candidate submission guidelines (including through our website or via email to any Otter.ai employee) and without a written agreement otherwise will be deemed to be our sole property, and no fee will be paid should we hire the candidate. Salary Range: $185,000 to $230,000 USD per year. This salary range represents the low and high end of the estimated salary range for this position. The actual base salary offered for the role is dependent on several factors. Our base salary is just one component of a comprehensive total rewards package.
    $185k-230k yearly 12d ago
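
The Otter.ai role above calls out data quality checks, statistical validation, and anomaly detection on pipeline output. One simple, generic pattern (sketched here with pandas; the column name, baseline window, and threshold are illustrative assumptions, not Otter.ai's implementation) is to compare each day's ingested row count against a trailing baseline and flag large deviations:

```python
import pandas as pd


def flag_volume_anomalies(events: pd.DataFrame, z_threshold: float = 3.0) -> pd.DataFrame:
    """Flag days whose row count deviates sharply from the trailing 7-day baseline.

    `events` is expected to have an `event_date` column, one row per ingested record.
    """
    daily = (
        events.groupby("event_date")
        .size()
        .rename("row_count")
        .sort_index()
        .to_frame()
    )
    baseline_mean = daily["row_count"].rolling(7, min_periods=3).mean().shift(1)
    baseline_std = daily["row_count"].rolling(7, min_periods=3).std().shift(1)
    daily["z_score"] = (daily["row_count"] - baseline_mean) / baseline_std
    daily["is_anomaly"] = daily["z_score"].abs() > z_threshold
    return daily


if __name__ == "__main__":
    # Tiny synthetic example: steady volume, then one day loses most of its records.
    dates = pd.date_range("2024-06-01", periods=10, freq="D").repeat(100).tolist()
    dates += [pd.Timestamp("2024-06-11")] * 5
    report = flag_volume_anomalies(pd.DataFrame({"event_date": dates}))
    print(report[report["is_anomaly"]])
```

The same skeleton extends naturally to null-rate or schema-drift checks by swapping the metric that gets compared against its baseline.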
  • Data Engineer

    Kimball Midwest 4.4 company rating

    Columbus, OH jobs

    Kimball Midwest, a national distributor of maintenance, repair, and operation products, is searching for a Data Engineer to join our IT team! In this role, you would be accountable for the development and maintenance of robust data management platforms, ensuring their efficient design, data accuracy, and secure accessibility. You would also be responsible for establishing and maintaining appropriate models while guaranteeing consistent naming conventions and calculations across all data repositories in alignment with the organization's Business Glossary. As a Kimball Midwest associate, you will experience why we have been recognized as one of the Top Workplaces in Columbus thirteen years in a row! Our sales revenue growth is dynamic, increasing from $1 million in 1983 to over $500 million today. Throughout all our growth we have kept the family-owned and operated culture alive. At Kimball Midwest, you are a name and not a number and we pride ourselves on our unique culture. Responsibilities Develop and maintain data management platforms, including on-premises and cloud data warehouses and data lakes. Solution design and dimensional modeling of data warehouses and analytic models to support reporting, data science, AI and other forms of D&A workloads. ETL and ELT processing, source to target, including advanced SQL transformations. Sourcing from internal and external data sources, including on-premises SQL Servers, Azure SQL Databases, APIs and flat files. Research and understanding of data structures in enterprise systems including Dynamics AX, Dynamics CRM, D365 F&SCM, D365 CE, Korber WMS, custom microservices, web development and legacy systems. Ensure consistency in naming and calculations across all data repositories, in partnership with data governance, development teams and business stakeholders. Implement and maintain security measures to protect organizational data, including access control, secure data transfer, auditing and monitoring. Participation in the on-call support rotation for providing prompt and effective system support and customer service outside of regular business hours. Demonstrate a passion for technology through continued growth in data and cloud related skills, staying abreast of technology trends and sharing knowledge with others. Qualifications 5+ years of experience in a data engineer, data scientist, or related role Bachelor's degree in a computer-related field or equivalent experience Experience with programming languages and tools related to data engineering such as Python, R, Azure Data Factory, Databricks, C#, SQL, Hadoop, Kafka, Spark, and Power BI Microsoft Fabric experience is preferred Additional Information This is a fully on-site position reporting to the office Monday through Friday. We offer a benefits package that includes health, dental and vision insurance, company sponsored life, optional life and disability insurance, Health Savings Accounts and Flexible Spending Accounts, a 401(k) plus match, Tuition Assistance, Paid Parental Leave, Paid Time Off (PTO), a Dress for your Day dress code and paid holidays. Kimball Midwest is an equal opportunity employer that is committed to a program of recruitment of females, minority group members, individuals with disabilities, qualifying veterans and any other classification that is protected by federal, state, or local law. We Participate in E-Verify. Participamos en E-Verify.
    $77k-99k yearly est. Auto-Apply 12d ago
  • Data Engineer

    Kimball Midwest 4.4 company rating

    Columbus, OH jobs

    Kimball Midwest, a national distributor of maintenance, repair, and operation products, is searching for a Data Engineer to join our IT team! We are looking for someone who is driven, passionate about data and technology, and excited about the future of Microsoft Fabric. This role requires a leader who stays up-to-date on technology trends, brings new ideas to the table, and thrives in a collaborative team environment within a Microsoft ecosystem. As a Kimball Midwest associate, you will experience why we have been recognized as one of the Top Workplaces in Columbus thirteen years in a row! Our sales revenue growth is dynamic, increasing from $1 million in 1983 to over $500 million today. Throughout all our growth we have kept the family-owned and operated culture alive. At Kimball Midwest, you are a name and not a number and we pride ourselves on our unique culture. What We Value Passion for data, technology, and continuous learning. Leadership and initiative to drive innovation. Collaboration and teamwork across diverse stakeholders. Curiosity and excitement about emerging technologies, especially Microsoft Fabric. Commitment to delivering high-quality, secure, and scalable data solutions. Key Responsibilities Architect, develop, and maintain modern data platforms, including data warehouses and lakes. Design dimensional models that power reporting, analytics, and advanced insights. Build and optimize ETL/ELT pipelines using advanced SQL and PySpark transformations. Integrate data from diverse sources: SQL Server, Azure SQL, APIs, and flat files. Collaborate on systems such as Dynamics 365 F&SCM and CE, AX, CRM, Infios WMS, custom microservices, and legacy applications. Enable advanced analytics and AI/ML workloads through efficient, scalable data pipelines. Champion consistency in naming conventions and calculations in partnership with data governance and business teams. Implement robust data security measures, including access control, encryption, and auditing. Participate in on-call support rotation for complex tier-2/3 incidents. Stay ahead of industry trends, share knowledge, and proactively introduce innovative solutions. Qualifications 5+ years of experience in data engineering or a related field. Bachelor's degree in Computer Science or equivalent experience. Proven passion for data and technology, with a record of continuous learning and leadership. Expertise in tools and languages: Python, SQL, Azure Synapse Analytics, Microsoft Fabric (Lakehouse, Dataflows), Azure Data Factory, Spark, Databricks, and integration with Power BI and Copilot. Strong understanding of the Microsoft ecosystem and enthusiasm for Microsoft Fabric advancements. Microsoft Fabric experience highly preferred. Additional Information This is a fully on-site position reporting to the Columbus, Ohio office Monday through Friday. We offer a benefits package that includes health, dental and vision insurance, company sponsored life, optional life and disability insurance, Health Savings Accounts and Flexible Spending Accounts, a 401(k) plus match, Tuition Assistance, Paid Parental Leave, Paid Time Off (PTO), a Dress for your Day dress code and paid holidays. Kimball Midwest is an equal opportunity employer that is committed to a program of recruitment of females, minority group members, individuals with disabilities, qualifying veterans and any other classification that is protected by federal, state, or local law. We Participate in E-Verify. Participamos en E-Verify.
    $77k-99k yearly est. Auto-Apply 12d ago
  • Sr Data Engineer

    Jack Link's Protein Snacks 4.5company rating

    Minneapolis, MN jobs

    Running with Sasquatch is more than just a clever marketing campaign. As a Jack Link's team member, Running with Sasquatch means we roll up our buffalo plaid sleeves and do the hard work first. We don't shy away from challenges. In fact, we push hard and take risks. True to our North Woods roots, we're a bunch of ordinary people who accomplish extraordinary things by driving results with innovation, creativity and a clear sense of urgency. Like our awesome protein products, we have an unwavering passion for quality, and you won't find anything artificial here. What you see is what you get…authentic, humble and fun people who Run with Sasquatch! Running with Sasquatch takes a team. We invite you to run with us, succeed with us, and celebrate with us. Most importantly, Feed Your Wild Side with us on our journey to be the dominant global leader of branded protein snacks!

    Jack Link's Protein Snacks is a global leader in snacking and the No. 1 meat snack manufacturer worldwide. Still family-owned and operated with headquarters in Minong, Wisconsin, Jack Link's also has a large corporate hub in Downtown Minneapolis, Minnesota, and operates a total of 11 manufacturing and distribution facilities in four countries. Jack Link's produces high-quality, great-tasting protein snacks that feed the wild sides of consumers around the world. Jack Link's Protein Snacks family of brands includes Jack Link's, LK, World Kitchens Jerky, Bifi and Peperami.

    Job Description
    Are you ready to shape the future of data at Jack Link's? We're looking for a Senior Data Engineer to play a pivotal role in evolving our modern data foundation, enabling the next generation of analytics, automation, and insights. At Jack Link's, we are building an analytics capability that fuels smarter decisions, faster innovation, and stronger business outcomes. We're looking for a Senior Data Engineer to lead the development of scalable, end-to-end data pipelines that power analytics, automation, and external product integrations. This full-stack role spans the entire data lifecycle, from ingestion and transformation to governance and infrastructure. A key focus area is building and maintaining scalable data pipelines from multiple data sources, such as S/4HANA and Datasphere, into Microsoft Fabric. You'll work closely with IT professionals, product owners, business relationship managers, and analytics teams to design and maintain data models, schemas, and tables that support reporting, dashboards, and ML/AI workflows. A strong focus will be on building and maintaining data pipelines, along with data preparation and feature engineering, to ensure data is structured, accessible, and optimized for decision-making. This role collaborates with enterprise data architects, platform engineers, and analytics product owners, and is ideal for someone who thrives in a cross-functional, product-oriented environment.

    Core Responsibilities:
    * Build & Manage Data Pipelines: Design and maintain scalable pipelines for ingesting, transforming, and storing data from SAP and non-SAP sources into Microsoft Fabric.
    * Lead Medallion Architecture Design in Microsoft Fabric: Design and manage bronze, silver, and gold layer structures within Microsoft Fabric to support scalable, governed, and analytics-ready data pipelines (see the sketch after this posting).
    * Develop in Microsoft Fabric: Use tools like Lakehouse, Data Warehouse, and Notebooks to process and transform data efficiently.
    * Model & Prepare Data for Analytics: Build robust data models and perform feature engineering to support reporting, dashboards, and ML/AI use cases.
    * Integrate SAP Systems: Connect SAP Datasphere and S/4HANA analytics into Fabric; orchestrate SAP and non-SAP data flows.
    * Ensure Data Quality & Governance: Implement governance practices, maintain metadata, and ensure data integrity across platforms.
    * Automate with Code: Write clean, efficient Python and SQL for ETL workflows, automation, and API development.
    * Support MLOps & AIOps: Help deploy and monitor analytics models using modern DevOps practices.
    * Collaborate & Mentor: Partner with business, analytics, and IT teams; mentor junior engineers and promote best practices.

    Qualifications
    Required Education: Bachelor's degree in Information Technology, Computer Science, or a related field, or equivalent experience.

    Required Skills & Qualifications:
    * 7+ years of professional experience as a data engineer.
    * Proven experience managing complex data pipelines and lakehouse architectures.
    * Strong expertise in SQL, Python, and Spark.
    * Deep understanding of data modeling, data architecture, and data governance.
    * Familiarity with Microsoft Fabric.
    * Familiarity with SAP data extraction and integration.
    * Strong problem-solving skills and the ability to communicate effectively with both technical and non-technical stakeholders.
    * Comfortable working in a product-oriented analytics team environment.
    * Interest in building the data foundation that enables automation and advanced analytics.

    Preferred Qualifications:
    * CPG manufacturing or adjacent manufacturing and retail company experience a plus.
    * Demonstrated interest and investment in professional development through ongoing education, professional certifications, participation in knowledge networks, etc.
    * Experience with internal analytics product development and external data product integration.
    * Experience with multiple data lake/lakehouse platforms, such as Microsoft Fabric, Microsoft Azure, Databricks, Snowflake, and/or GCP.
    * Knowledge of DevOps practices related to data engineering.
    * Familiarity with CI/CD pipelines and tools for data workflows.

    Additional Information
    JACK LINK'S CORE VALUES: Be Real, Speed Matters, Stewardship, Relationship Driven, Self-Discipline, and Show Awesome Character.

    The salary range for this role is $135,000 - $165,000 (Annually). Actual salaries will vary based on several factors, including but not limited to external market data, internal equity, location, and candidate skill set and experience. Base pay is just one component of Jack Link's Total Rewards package for Team Members. Other rewards may include annual incentive and program-specific awards. Jack Link's provides a variety of benefits to eligible Team Members, including medical, dental and vision benefits, life and disability insurance, 401k participation, paid holidays, and paid time off.

    EQUAL EMPLOYMENT OPPORTUNITY EMPLOYER: Jack Link's provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic that is protected by federal, state or local law.

    E-VERIFY: Jack Link's is a participant in the federal E-Verify program to confirm the identity and employment authorization of all newly hired employees. For information about the E-Verify program, please visit: *************************************** All your information will be kept confidential according to EEO guidelines.
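    The medallion (bronze/silver/gold) responsibility called out above can be sketched in PySpark, which is what Fabric notebooks commonly run. This is a generic illustration under assumed table names, paths, and schemas, not this employer's actual implementation.

```python
# Minimal medallion (bronze/silver/gold) sketch in PySpark. Table names,
# schemas, and the source path are assumptions for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

# Bronze: land raw source data as-is, plus ingestion metadata.
bronze = (
    spark.read.json("/landing/erp/material_movements/")
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.mode("append").format("delta").saveAsTable("bronze.material_movements")

# Silver: cleanse, de-duplicate, and conform types for downstream use.
silver = (
    spark.table("bronze.material_movements")
    .dropDuplicates(["movement_id"])
    .withColumn("posting_date", F.to_date("posting_date"))
    .filter(F.col("quantity").isNotNull())
)
silver.write.mode("overwrite").format("delta").saveAsTable("silver.material_movements")

# Gold: business-ready aggregate that reports and ML features can consume.
gold = (
    silver.groupBy("plant", "posting_date")
    .agg(F.sum("quantity").alias("total_quantity"))
)
gold.write.mode("overwrite").format("delta").saveAsTable("gold.plant_daily_movements")
```

    The layering keeps raw history replayable (bronze), quality rules in one place (silver), and consumer-facing semantics stable (gold), which is what makes the pipelines governed and analytics-ready.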
    $135k-165k yearly 18d ago
  • Sr Data Engineer - SAP and AI Automation

    Jack Link's Protein Snacks 4.5company rating

    Minneapolis, MN jobs

    When it comes to being wild, we know a thing or two. We're not afraid of trying something new or the hard work it takes to make it happen. It's in our DNA. We've turned a family recipe into a new snacking category. And the wilderness into the world's largest meat snack business, that's still proudly family owned and operated. We're a company built by innovators, and are driven to not only satisfy your hunger, but to also feed your journey - whether that journey is on the road, on the run, at the campground, at the playground, in the office or in the moment. It's a journey we share with you. It's the journey forward of our people, of our communities, of our category…with a reverence for quality and an irreverence for the status quo. At Jack Link's Protein Snacks, we see every moment of every day as an opportunity to move forward, to forge new ground. To realize our vision of becoming the World's #1 Protein Snack Company. We never give up. You never give up. Together, we keep going. Are you wild enough to join us?

    Jack Link's Protein Snacks is a global leader in snacking and the No. 1 meat snack manufacturer worldwide. Family-owned and operated with headquarters in Minong, Wisconsin, Jack Link's Protein Snacks also has a large corporate hub in Downtown Minneapolis, Minnesota. The company is made up of over 4,000 passionate team members, across 11 countries, who share an uncompromising commitment to delivering awesome products and feeding the journey of those who move things forward. The Jack Link's Protein Snacks portfolio of brands includes Jack Link's, Lorissa's Kitchen, MATADOR Jerky, BiFi and Peperami.

    Job Description
    JOB SUMMARY
    This role is responsible for delivering high-quality, governed, analytics-ready data products that support enterprise reporting, advanced analytics, and AI-driven use cases. Beyond traditional data engineering, this position requires strategic thinking and strong business acumen. You will collaborate with product owners and business stakeholders to understand operational and commercial processes, translate data and diagnostic requirements into technical specifications, and communicate complex technical concepts back to non-technical teams in clear, accessible language. This role is supported by a strong ecosystem of IT and Analytics experts, including product owners, business relationship managers, data engineers, data architects, BI developers, data scientists, and platform specialists. In addition to SAP data engineering, you will design, develop, and orchestrate AI-powered automation solutions to optimize workflows and processes within SAP environments. Your work will enable advanced analytics, intelligent automation, and seamless integration across enterprise systems.

    KEY RESPONSIBILITIES
    Build & Operate SAP Datasphere Data Pipelines
    * Design and manage enterprise data pipelines within SAP Datasphere.
    * Implement scalable ingestion using ODP, SLT, SDA/SDI, CDS extraction, APIs, and files.
    * Engineer reliable delta handling, orchestration, and reconciliation (see the sketch after this posting).

    Design Enterprise HANA & Analytical Data Models
    * Build HANA-based analytical models and business-ready datasets.
    * Develop semantic layers aligned to Finance, Supply Chain, Manufacturing, and Sales.
    * Optimize models for SAC, Power BI, and downstream platforms.

    Develop in SAP Datasphere & Enterprise HANA
    * Develop transformations using Datasphere flows, SQL, and SQLScript.
    * Manage Datasphere Spaces, compute allocation, access, and lifecycle.
    * Architect virtualization vs. replication strategies.

    Integrate SAP Source Systems
    * Integrate SAP S/4HANA and SAP BW on HANA datasets into Datasphere.
    * Leverage CDS Views, extractors, Open ODS Views, and BW InfoProviders.
    * Design hybrid SAP and non-SAP integration patterns.

    Ensure Data Quality, Governance & Security
    * Implement validation, reconciliation, lineage, and metadata frameworks.
    * Enforce security, PII masking, and regulatory compliance.

    Performance Optimization & Cost Management
    * Tune models, transformation logic, and analytical queries.
    * Monitor performance, storage, and compute efficiency.

    Automate & Operationalize
    * Develop reusable automation frameworks using SQL, scripting, and SAP-specific tools to streamline data workflows and system integrations.
    * Design and implement SAP automation solutions leveraging SAP Joule and AI-powered orchestration to optimize business processes and reduce manual effort.
    * Integrate intelligent agents for predictive and prescriptive automation within SAP environments, enabling proactive issue resolution and process optimization.
    * Support CI/CD pipelines, version control, and operational monitoring to ensure reliability and scalability of automation solutions.

    Enable Advanced Analytics & AI
    * Deliver analytics-ready, feature-engineered datasets.
    * Partner with data science teams to support AI/ML pipelines.

    Collaborate, Communicate & Mentor
    * Work closely with product owners and business stakeholders to translate business requirements and diagnostic issues into technical solutions, and communicate technical concepts back in clear, non-technical language.
    * Partner with SAP functional, architecture, and analytics teams.
    * Mentor junior engineers and promote best practices.
    * Perform additional duties as assigned.

    WORK ENVIRONMENT
    The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. The work environment is a plant/office setting with varying degrees of temperatures and noise levels, with exposure to manufacturing equipment movement and wet/slippery floors. However, the vast majority of work is conducted in a climate-controlled office. Travel may be required.

    PHYSICAL DEMANDS
    The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions. While performing the duties of this job, the employee is regularly required to sit, use hands to finger, handle, or feel, and talk or hear. The employee is occasionally required to stand, walk, and reach with hands and arms. The employee must occasionally lift and/or move up to 25 pounds. Specific vision abilities required by this job include close vision.

    QUALIFICATIONS
    * 7+ years in Data Engineering or SAP Analytics Engineering.
    * 3+ years of primary hands-on SAP Datasphere experience.
    * Strong expertise in Enterprise SAP HANA, SAP BW on HANA, and SQL/SQLScript.
    * Deep understanding of SAP S/4HANA data models and CDS views.
    * Experience in analytical modeling, semantic layers, and governance.
    * Strong communication and product-oriented team collaboration skills.
    * Strategic Thinking & Business Acumen: the ability to understand general operations and commercial processes, work closely with product owners and business stakeholders to translate data and analytics requirements and diagnostic issues into clear technical specifications, and communicate technical concepts back to non-technical teams in an accessible way.

    Preferred Qualifications
    * Experience with SAP Datasphere as a strategic analytics platform.
    * Experience designing, developing, and deploying AI agents in the SAP ecosystem.
    * Integration with SAP Analytics Cloud and enterprise BI platforms.
    * SAP BTP governance and security exposure.
    * Manufacturing, CPG, or supply chain industry experience.
    * CI/CD and Git-based transport automation experience.

    Additional Information
    The salary range for this role is $135,000 - $165,000 (Annually). Actual salaries will vary based on several factors, including but not limited to external market data, internal equity, location, and candidate skill set and experience. Base pay is just one component of Jack Link's Total Rewards package for Team Members. Other rewards may include annual incentive and program-specific awards. Jack Link's provides a variety of benefits to eligible Team Members, including medical, dental and vision benefits, life and disability insurance, 401k participation, paid holidays, and paid time off.

    EQUAL EMPLOYMENT OPPORTUNITY EMPLOYER: Jack Link's provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic that is protected by federal, state or local law.

    E-VERIFY: Jack Link's is a participant in the federal E-Verify program to confirm the identity and employment authorization of all newly hired employees. For information about the E-Verify program, please visit: *************************************** All your information will be kept confidential according to EEO guidelines.
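    The delta-handling and reconciliation bullet above follows a common watermark pattern. The sketch below illustrates that pattern generically in PySpark; it does not use SAP Datasphere's own replication or ODP tooling, and the table and column names are assumptions for illustration.

```python
# Platform-agnostic sketch of watermark-based delta loading with a simple
# reconciliation check. Table names and the watermark column are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta_load_demo").getOrCreate()

target = "analytics.sales_orders"

# 1. Determine the high-water mark already loaded into the target.
last_loaded = (
    spark.table(target).agg(F.max("changed_at").alias("wm")).collect()[0]["wm"]
)

# 2. Pull only records changed since the watermark from the staged extract.
source = spark.table("staging.sales_orders_extract")
delta = source if last_loaded is None else source.filter(F.col("changed_at") > F.lit(last_loaded))

# 3. Append the delta, then reconcile: rows loaded into the target for this
#    window should match the delta row count, otherwise flag the run.
delta.write.mode("append").format("delta").saveAsTable(target)

window_filter = F.lit(True) if last_loaded is None else F.col("changed_at") > F.lit(last_loaded)
expected = delta.count()
actual = spark.table(target).filter(window_filter).count()
if expected != actual:
    raise RuntimeError(f"Reconciliation failed: loaded {actual} rows, expected {expected}")
```

    In a Datasphere or BW landscape, the watermark and reconciliation roles are typically played by delta queues and built-in monitoring; the sketch only shows why each step exists.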
    $165k yearly 10d ago
  • Sr Data Engineer

    E.A. Sween 4.4company rating

    Eden Prairie, MN jobs

    Who We Are
    Since 1955, we have been on a mission To Passionately Feed Millions Daily with High Quality Food People Enjoy! We are a third-generation family-owned and professionally managed organization with a commitment to strategic growth. We continue to be successful because of talented people, just like you, who choose to join our family and call E.A. Sween home. We pride ourselves on fostering a welcoming, respectful, and rewarding culture where employees are encouraged to bring their whole selves to work each and every day. At E.A. Sween, our team members are seen, heard, and appreciated not just for what they do, but for who they are. We hope you'll join us!

    What We're Seeking
    As a Senior Data Engineer, you will lead the design, development, and optimization of our data infrastructure, with a strong focus on integrating and visualizing data using JD Edwards (JDE) and EDI. This hybrid role combines the responsibilities of a Senior Data Engineer with advanced data integration skills, playing a key part in both the strategic direction and hands-on implementation of data systems that support the organization's growing needs. You will work alongside other senior engineers and cross-functional teams to build scalable, reliable, and secure data solutions.

    Compensation: The target salary range for this position is $94,810.51 - $115,000.00 annually, consistent with our internal compensation framework. This position is classified as Grade 16, with a full pay range of $94,810.51 to $150,352.02. You are eligible for an incentive bonus up to 10% of your annual salary, prorated based on your start date. Final pay will be determined by your experience, skills, internal equity, and available budget.

    What You'll Do (Responsibilities)
    * Lead the design and implementation of advanced data pipelines and infrastructure to support the integration of ERP data from JD Edwards and other systems.
    * Collaborate with business intelligence, analytics, and engineering teams to define data requirements and deliver data solutions that drive business value.
    * Oversee the development and optimization of ETL processes, ensuring seamless data flow and integration, particularly with JD Edwards for ERP data.
    * Develop and integrate complex data solutions using EDI to facilitate smooth data exchange across systems.
    * Establish and enforce data governance, security, and quality standards to maintain data integrity and privacy across all platforms.
    * Automate and streamline data workflows, integrating new technologies and data sources into the existing architecture.
    * Perform detailed data profiling, validation, and cleansing to ensure high-quality, consistent data for business intelligence and reporting (see the sketch after this posting).
    * Take initiative to address data-related issues directly and support cross-functional teams when challenges arise.
    * Troubleshoot and resolve data-related issues, offering guidance and mentorship to junior engineers.
    * Actively support team members by stepping in when needed to ensure projects stay on track and deliverables meet business expectations.
    * Lead the design and development of reporting and dashboard solutions using Power BI, ensuring alignment with business needs.
    * Continuously optimize data pipelines for performance, scalability, and reduced latency.
    * Mentor junior data engineers, providing technical expertise, leadership, and support to foster a collaborative and high-performing team environment.
    * Collaborate with data architects and senior engineers to ensure that data infrastructure remains scalable, sustainable, and aligned with business goals.
    * Maintain thorough documentation for data models, processes, and systems, ensuring clarity and continuity.
    * Demonstrate accountability by addressing issues head-on and owning outcomes in your work and within the team.

    What You'll Need (Qualifications)
    * Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field (Master's degree is a plus).
    * At least 5-7 years of experience in data engineering or a related technical role, with strong expertise in data integration and system architecture.
    * Proven experience with JD Edwards, particularly in extracting and transforming ERP data.
    * Strong experience with EDI processes and technologies for seamless data integration.
    * Proficiency in programming languages such as Python, Java, SQL, and other relevant tools for data engineering.
    * Extensive experience with ETL processes, data integration, and pipeline automation.
    * Advanced knowledge of business intelligence tools, especially Power BI, for designing reports and dashboards.
    * Solid understanding of data warehousing, data lakes, and real-time data streaming.
    * Excellent problem-solving, analytical, and troubleshooting skills.
    * Strong communication and interpersonal skills, with the ability to work across teams and manage stakeholder expectations.
    * Demonstrated ability to step in and support team members when needed, as well as independently take ownership of resolving issues.
    * Proven track record of mentoring junior team members and providing leadership within a collaborative environment.
    * Flexibility to adapt to evolving technologies and new business requirements.

    How You'll Find Success at EAS
    * Value People Most of All: Show respect & care, embrace diversity, and empower others.
    * Commit to Safety Everyday: See something, say something, do something; practice safe behavior; and celebrate safety success.
    * Invest in Our Company to Thrive: Share ideas to improve, learn & grow, and embrace change.
    * Think Before Doing And Act Decisively: Make thoughtful decisions, work together to find solutions, and do what's right.
    * Welcome Constructive Straight Talk: Be honest and respectful even when difficult, be open to ideas and feedback, and ask questions to understand.
    * Serve Up Exceptional Experiences: Provide value to customers, take pride in your work, and help others to be successful.
    * Enjoy What You Do!: Have a positive attitude, live the Spirit of E.A. Sween, and celebrate success.
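    The data profiling, validation, and cleansing responsibility above can be illustrated with a short pandas sketch. The extract file, column names, and business rules are hypothetical; a real JDE or EDI feed would have its own layout and rule set.

```python
# Minimal data-profiling and validation sketch with pandas. The file name,
# column names, and validation rules are illustrative assumptions only.
import pandas as pd

orders = pd.read_csv("jde_sales_orders_extract.csv")  # hypothetical ERP extract

# Profile: data types, null rates, and distinct counts per column.
profile = pd.DataFrame({
    "dtype": orders.dtypes.astype(str),
    "null_pct": orders.isna().mean().round(3),
    "distinct": orders.nunique(),
})
print(profile)

# Validate: check a few example business rules before loading downstream.
issues = []
if orders["order_id"].duplicated().any():
    issues.append("duplicate order_id values")
if (orders["order_amount"] < 0).any():
    issues.append("negative order_amount values")
if orders["order_date"].isna().any():
    issues.append("missing order_date values")
if issues:
    print("Validation findings:", "; ".join(issues))

# Cleanse: standardize text fields and drop rows that fail hard rules.
orders["customer_name"] = orders["customer_name"].str.strip().str.upper()
clean = orders.dropna(subset=["order_id", "order_date"]).drop_duplicates(subset=["order_id"])
```

    Profiling first, then validating, then cleansing keeps the rules explicit and repeatable, which is what lets downstream Power BI reports trust the data.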
    $94.8k-115k yearly 21d ago
  • Data Engineer Senior

    Acuity Brands Inc. 4.6company rating

    Atlanta, GA jobs

    Acuity Inc. (NYSE: AYI) is a market-leading industrial technology company. We use technology to solve problems in spaces, light and more things to come. Through our two business segments, Acuity Brands Lighting (ABL) and Acuity Intelligent Spaces (AIS), we design, manufacture, and bring to market products and services that make a valuable difference in people's lives. We achieve growth through the development of innovative new products and services, including lighting, lighting controls, building management solutions, and an audio, video and control platform. We focus on customer outcomes and drive growth and productivity to increase market share and deliver superior returns. We look to aggressively deploy capital to grow the business and to enter attractive new verticals. Acuity Inc. is based in Atlanta, Georgia, with operations across North America, Europe and Asia. The Company is powered by approximately 13,000 dedicated and talented associates. Visit us at ******************

    Team & Position Summary
    Acuity Brands is seeking a Senior Data Engineer to join its Atrius Analytics team. Acuity Brands is driving smarter, safer, and greener outcomes across industries like building automation, HVAC, AV systems, refrigeration, and lighting. By harnessing valuable data within a space, we are powering advanced analytics and AI-driven insights that transform environments and maximize occupant experiences. The Atrius Analytics team is building a scalable, intelligent data platform that ingests and normalizes IoT telemetry from across our diverse verticals. Supporting both batch and real-time processing, this platform provides the foundation for industry-leading cloud applications in building performance and spatial intelligence. We value trust, respect, asynchronous communication, creativity, and customer focus. Our day-to-day is guided by an agile design methodology, with close collaboration among software engineers, firmware engineers, data scientists, and third-party developers. Our team works across Azure, AWS, and GCP IoT services. The ideal candidate is passionate about driving impact through data while developing business, technical, and leadership acumen.

    Primary Responsibilities Include
    As a Data Engineer, you will play a critical role in architecting and developing pipelines that transform raw IoT data into actionable intelligence. You will build robust data models that ensure data quality and consistency, enabling downstream analytics and AI/ML applications that optimize operations, enhance user experiences, and unlock new business opportunities. Some responsibilities include:
    * Design and implement scalable data engineering pipelines for ingesting, transforming, and storing IoT telemetry data using Apache Flink, Apache Spark, and Databricks
    * Build and maintain time-series data solutions using PostgreSQL and TimescaleDB to support high-resolution telemetry analytics (see the sketch after this posting)
    * Integrate data governance frameworks for metadata management, lineage tracking, and compliance within a data lake ecosystem
    *
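    The PostgreSQL/TimescaleDB bullet above can be sketched with a small Python script using psycopg2. The connection string, table definition, and metric names are assumptions for illustration; an actual Atrius telemetry schema would differ.

```python
# Hedged sketch of a TimescaleDB-backed telemetry store using psycopg2.
# The DSN, table name, and schema are assumptions for illustration only.
import psycopg2

conn = psycopg2.connect("dbname=telemetry user=ingest host=localhost")  # hypothetical DSN
conn.autocommit = True
cur = conn.cursor()

# Create a plain table, then promote it to a hypertable partitioned on time.
cur.execute("""
    CREATE TABLE IF NOT EXISTS device_telemetry (
        ts          TIMESTAMPTZ      NOT NULL,
        device_id   TEXT             NOT NULL,
        metric      TEXT             NOT NULL,
        value       DOUBLE PRECISION
    );
""")
cur.execute("SELECT create_hypertable('device_telemetry', 'ts', if_not_exists => TRUE);")

# Insert a reading and run a typical downsampling query with time_bucket().
cur.execute(
    "INSERT INTO device_telemetry (ts, device_id, metric, value) VALUES (now(), %s, %s, %s);",
    ("sensor-001", "zone_temp_c", 21.7),
)
cur.execute("""
    SELECT time_bucket('15 minutes', ts) AS bucket, device_id, avg(value)
    FROM device_telemetry
    WHERE metric = 'zone_temp_c'
    GROUP BY bucket, device_id
    ORDER BY bucket;
""")
for row in cur.fetchall():
    print(row)
```

    Time-partitioned hypertables plus bucketed aggregates are what keep high-resolution telemetry queries fast as the history grows, which is the point of pairing PostgreSQL with TimescaleDB for this workload.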
    $78k-97k yearly est. 35d ago
  • Senior Game Engineer

    Rumble Entertainment 4.1company rating

    San Francisco, CA jobs

    Engineering | Remote

    Rumble Games was founded in 2011 and is headquartered in San Mateo, California. Our fully remote development studio is home to a tight-knit team of professionals whose mission is to create the most engaging game experiences on the planet. We combine the best of AAA games, free-to-play accessibility and blockchain technology. We are passionate about collaboration and iteration to create games that will surprise and delight our players. We emphasize a positive work-life balance to allow our team to develop their best work. Join us!

    Your Mission
    We are looking for a talented Game Engineer to develop gameplay systems for online video games with large-scale deployments. You will work directly with our design and production teams using highly collaborative processes to create amazing products. You will write highly flexible code for prototyping game features and write robust, scalable code once the fun has been found, and you understand the trade-offs between both approaches.

    How You Will Contribute
    * You will collaborate with production, game and engineering teams to devise optimal engineering solutions to gameplay requirements.
    * You will architect and code sophisticated client/server gameplay systems.
    * You will implement software systems with attention to security, reliability, scalability, maintainability and performance.
    * You will innovate and iterate on processes, systems and technology to deliver a world-class gaming experience.
    * You will be a team player: identify and articulate technical and production risks and obstacles, and generate and implement solutions in collaboration with the team.
    * You will mentor other engineers to help develop their skill sets.

    We'd Love To Hear From You, If
    * You have a Bachelor's degree in Computer Science or a related field, or equivalent experience.
    * You have 5+ years of development experience with at least one shipped product.
    * You are fluent in C#, C++, or Java; experience with other languages is a plus.
    * You have Unity experience.
    * You have proven your effectiveness in the delivery of production-quality code for client/server topologies and synchronous multiplayer gameplay.
    * You have a passion for games, DApps, and Web3.
    * You have experience working on and playing RPGs, strategy, and action games.

    Benefits
    Having a happy team that collaborates well is our top priority. We offer exceptional benefits and invest in our team's happiness, wellbeing, and growth.
    * Generous salary, 401k matching, and paid time off.
    * Healthcare, Vision, Dental, & Disability Insurance.
    * Quarterly contribution & discounts for wellness-related activities and programs.
    * Exceptional culture and dedication to our team.

    Send a resume to [email protected] California residents, please click here for our CCPA Employee and Applicant Privacy Notice.
    $105k-157k yearly est. 60d+ ago

Learn more about The J.M. Smucker Co. jobs