
Data engineer jobs in DeKalb, IL

- 352 jobs
  • Engineer, Corporate Decommissioning

    Constellation Energy 4.9 company rating

    Data engineer job in Warrenville, IL

    **Who We Are**

    As the nation's largest producer of clean, carbon-free energy, Constellation is focused on our purpose: accelerating the transition to a carbon-free future. We have been the leader in clean energy production for more than a decade, and we are cultivating a workplace where our employees can grow, thrive, and contribute. Our culture and employee experience make it clear: We are powered by passion and purpose. Together, we're creating healthier communities and a cleaner planet, and our people are the driving force behind our success. At Constellation, you can build a fulfilling career with opportunities to learn, grow and make an impact. By doing our best work and meeting new challenges, we can accomplish great things and help fight climate change. Join us to lead the clean energy future.

    **Total Rewards**

    Constellation offers a wide range of benefits and rewards to help our employees thrive professionally and personally. We provide competitive compensation and benefits that support both employees and their families, helping them prepare for the future. In addition to highly competitive salaries, we offer a bonus program, a 401(k) with company match, an employee stock purchase program, comprehensive medical, dental and vision benefits including a robust wellness program, paid time off for vacation, holidays, and sick days, and much more.

    ***This Engineering role can be filled at the Entry, Mid-level or Senior Engineer level. Please see the minimum qualifications list below for each level.***

    Expected salary range (per year, based on experience, along with a comprehensive benefits package that includes bonus and 401(k)):

    * Entry-Level: $85,000
    * Mid-Level: $90,000 - $111,000
    * Senior Level: $118,000 - $150,000

    **Primary Purpose of Position**

    Performs advanced technical/engineering problem solving in support of nuclear plant operations while acting as a resource and technical expert to engineers. Responsible for technical decisions.
    Possesses excellent knowledge in functional discipline and its practical application and has detailed knowledge of applicable industry codes and regulations.

    **Primary Duties and Accountabilities**

    * Provide in-depth technical expertise to develop, manage and implement engineering analysis, activities and programs
    * Provide technical expertise and consultation through direct involvement and as a subject matter expert when consulted to identify and resolve equipment and system problems
    * Directly fulfill engineering and technical leadership accountability regarding short-term and long-term programs that impact site operations
    * Perform engineering tasks as assigned by supervision, applying engineering principles
    * Accountable for the accuracy, completeness, and timeliness of work, ensuring proper configuration management and assuring that standard design criteria, practices, procedures and codes are used in preparation of plans and specifications
    * Perform independent research, reviews, studies and analyses in support of technical projects and programs
    * Recommend equipment, new concepts and techniques to improve performance, simplify construction, reduce costs, correct design or material flaws, or comply with changes in codes or regulations
    * Perform other job assignments and duties as directed by management or pursuant to company policy, including but not limited to emergency response, departmental coverage, call outs, and support of outage activities in positions outside the department

    **MINIMUM QUALIFICATIONS for Entry-Level E01 Engineer**

    * Bachelor's degree in Engineering (Chemical, Civil/Structural, Electrical, Industrial, Mechanical or Nuclear)
    * Maintain minimum access requirements or unescorted access requirements, as applicable, and favorable medical examination and/or testing in accordance with position duties

    **MINIMUM QUALIFICATIONS for Mid-Level E02 Engineer**

    * Bachelor's degree in Engineering (Chemical, Civil/Structural, Electrical, Industrial, Mechanical or Nuclear) with 2 years of nuclear or related engineering experience
    * Maintain minimum access requirements or unescorted access requirements, as applicable, and favorable medical examination and/or testing in accordance with position duties

    **MINIMUM QUALIFICATIONS for Senior E03 Engineer**

    * Bachelor's degree in Engineering (Chemical, Civil/Structural, Electrical, Industrial, Mechanical or Nuclear) with 5 years of nuclear or related engineering experience
    * Maintain minimum access requirements or unescorted access requirements, as applicable, and favorable medical examination and/or testing in accordance with position duties
    * Decommissioning Cost Estimates experience HIGHLY preferred
    $118k-150k yearly Auto-Apply 1d ago
  • Supply Chain Data Scientist

    Treehouse Foods, Inc. 4.7 company rating

    Data engineer job in Oak Brook, IL

    **Employee Type:** Full time
    **Job Type:** Information Technology Data Management
    **Job Posting Title:** Supply Chain Data Scientist

    **About Us:**

    TreeHouse Foods (NYSE: THS) is a leading manufacturer of private label packaged foods and beverages, operating a network of over 20 production facilities and several corporate offices across the United States and Canada. At TreeHouse Foods, our commitment to excellence extends beyond our products and revolves around our people. We are investing in talent and creating a performance-based culture where employees can do their best work and develop their careers, directly impacting our mission to make high quality, affordable food for our customers, communities, and families. We hope you will consider joining the team and being part of our future. Named one of America's Best Large Employers by Forbes Magazine, we are proud to live by a strong set of values and strive to "Engage and Delight - One Customer at a Time." Guided by our values - **Own It, Commit to Excellence, Be Agile, Speak Up, and Better Together** - we are a diverse team driven by integrity, accountability, and a commitment to exceptional results. We embrace change, prioritize continuous learning, and foster collaboration, transparency, and healthy debate. Together, we set each other up for success to achieve enterprise-wide goals.

    **What You Gain:**

    + Competitive compensation and benefits program with no waiting period - you're eligible from your first day!
    + 401(k) program with 5% employer match and 100% vesting as soon as you enroll.
    + Comprehensive paid time off opportunities, including immediate access to four weeks of vacation, five sick days, parental leave and 11 company holidays (including two floating holidays).
    + Leaders who are invested in supporting your accelerated career growth, plus paid training, tuition reimbursement and a robust educational platform - DevelopU - with more than 10,000 free courses to support you along the way.
    + An inclusive working environment where you can build meaningful work relationships with a diverse group of professionals. Take advantage of opportunities to build on our team-oriented culture, such as joining one of our Employee Resource Groups.
    + Access to our wellness and employee assistance programs.

    **Job Description:**

    **_About the Role:_**

    TreeHouse Foods' Integrated Planning team is looking for a talented and data-driven Supply Chain Data Scientist to enhance our supply chain forecasting capabilities in our Oak Brook, IL corporate office. This role is ideal for a professional who thrives on analytical challenges and is committed to making a tangible impact. We value collaborative problem-solvers who bring a strong analytical mindset and demonstrate curiosity, resilience, and commitment to continuous learning. If you are adaptable, results-oriented, and motivated by cross-functional teamwork, we encourage you to join our team in supporting TreeHouse Foods' mission of delivering quality food products with efficiency and precision.

    **_You'll add value to this role by performing various functions including, but not limited to:_**

    + Analyze large, complex datasets to extract actionable insights, identify trends, and address supply chain challenges.
    + Develop, deploy, and maintain statistical demand planning and inventory optimization models using Blue Yonder or similar platforms across various business units.
    + Present complex data insights in a clear and effective manner to both technical and non-technical stakeholders.
    + Research and implement new data science techniques and technologies to enhance forecasting accuracy and supply chain performance.
    + Ensure analytical insights are seamlessly integrated into business processes and guide model deployment requirements, including adherence to QA standards.
    + Work with both technical and business stakeholders to identify technology-driven opportunities that deliver measurable business value.
    + Other duties as assigned.

    **_Important Details:_**

    + This is a hybrid role on first shift in Oak Brook, IL.
    + The anticipated compensation for this position ranges from $100,800 to $151,200 annually. This is the lowest to highest salary we in good faith believe we would pay for this role at the time of this posting. An employee's position within the salary range will be based on several factors including, but not limited to, specific competencies, relevant education, qualifications, certifications, experience, skills, seniority, geographic location, performance, shift, travel requirements, sales or revenue-based metrics and business or organizational needs. For certain roles, the successful candidate may be eligible for an annual discretionary merit compensation award, bonus and equity pay.

    **_You'll fit right in if you have:_**

    + Bachelor's or advanced degree in Statistics, Data Analytics, Applied Mathematics, or a related quantitative field; advanced degree preferred.
    + 5 or more years of experience in data science or machine learning roles, preferably within private-label CPG organizations.
    + Expertise in SQL and relational databases, and proficiency with Blue Yonder or a similar supply chain platform for statistical demand planning and inventory optimization.
    + Experience with machine learning libraries and frameworks (PyTorch, TensorFlow, NumPy) and data visualization tools (Tableau, Power BI).
    + Strong quantitative, statistical, and data mining knowledge, including experience with techniques like regression, random forests, hierarchical clustering, deep learning, CNNs, and RNNs.
    + A strong desire to explore data-driven methodologies, with a focus on innovation and continuous improvement.

    **Your TreeHouse Foods Career is Just a Click Away!**

    Click on the "Apply" button or go directly to ****************************** to let us know you're ready to join our team!

    _At TreeHouse Foods, we embrace diversity and inclusion for innovation and growth. We are committed to building inclusive teams and an equitable workplace for our employees to bring their true selves to work to help us "Engage and Delight - One Customer at a Time." TreeHouse Foods is an Equal Opportunity Employer that prohibits discrimination or harassment of any type. All qualified applicants are considered for employment without regard to race, color, national origin, age, sex, sexual orientation, gender, gender identity or expression, disability status, protected veteran status, or any other characteristic protected by law. Applicants who require an accommodation to participate in the job application or hiring process should contact disability-accommodations@treehousefoods.com._

    TreeHouse Use Only: #IND1

    TreeHouse Foods is a private label food and beverage leader focused on customer brands and custom products. When customers partner with TreeHouse they can expect access to an industry-leading portfolio, strategic vision, on-trend innovation and insights, world-class supply chain, operational excellence and flexibility, collaborative approaches, and dedicated customer service. Our strategy is to be the leading supplier of private label food and beverage products by providing the best balance of quality and cost to our customers. We engage with retail grocery, food away from home, and industrial and export customers, including most of the leading grocery retailers and foodservice operators in the United States and Canada. Our portfolio includes a variety of shelf-stable, refrigerated, and snack products. Customers can expect comprehensive flavor profiles including natural, organic, and preservative-free ingredients in many categories and packaging formats. TreeHouse Foods is best known for food and beverages produced by our two largest businesses, Bay Valley Foods, LLC (including E.D. Smith and Sturm Foods) and TreeHouse Private Brands.
    With more than 10,000 employees in over 26 plants across the United States and Canada, TreeHouse Foods is based in Oak Brook, Illinois.

    **Recruitment Fraud Alert**

    We want to ensure your career journey with TreeHouse Foods is safe and secure. Scammers may attempt to impersonate our company by sending fake job offers, interview requests, and requests for sensitive documents. If you receive an email claiming to be from us, always verify the sender's email address - it should match our official company domain (@treehousefoods.com) exactly. We will _never_ ask for payment, financial, or personal information and documents as part of our interview process. If you suspect fraudulent activity, please contact us directly by visiting the Contact page on our website (******************************************************). Stay vigilant to protect yourself from recruitment scams.

    **Disability Assistance and EEO Considerations:**

    At TreeHouse Foods, we embrace diversity and inclusion for innovation and growth. We are committed to building inclusive teams and an equitable workplace for our employees to bring their true selves to work to help us "Engage and Delight - One Customer at a Time." TreeHouse Foods is an Equal Opportunity Employer that prohibits discrimination or harassment of any type. All qualified applicants are considered for employment without regard to race, color, national origin, age, sex, sexual orientation, gender, gender identity or expression, disability status, protected veteran status, or any other characteristic protected by law. Applicants who require an accommodation to participate in the job application or hiring process should contact disability-accommodations@treehousefoods.com

    **To all recruitment agencies:** TreeHouse Foods does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, TreeHouse Foods employees, or any company location(s). TreeHouse Foods is not responsible for any fees related to unsolicited resumes/CVs.
    $100.8k-151.2k yearly 25d ago
  • Junior Data Engineer

    Calamos Asset Management, Inc. 4.3 company rating

    Data engineer job in Naperville, IL

    **Summary of the Role**

    Calamos is seeking a Junior Data Engineer who is passionate about data and eager to grow their skills within the financial services industry. You'll join our Data Engineering Team as we continue to modernize our data platform using Databricks, transforming how the firm leverages data as a strategic asset. In this role, you'll participate in the lifecycle of data pipeline projects, from design and development through testing, documentation, and production support. You'll work hands-on with Databricks to build scalable data solutions that power critical investment decisions and operations. As part of a collaborative team, you'll contribute beyond individual projects by participating in design sessions, engaging with stakeholders to understand their needs, and sharing in rotation-based production support duties.

    **Primary Responsibilities**

    * Develop and maintain data pipelines in Databricks following Agile/Scrum methodology
    * Write clean, well-documented code that adheres to team and firm standards
    * Participate in the on-call rotation to monitor overnight data delivery SLAs, troubleshoot failures, and respond to data quality issues
    * Collaborate with architects, engineers, and business stakeholders to understand requirements and deliver solutions
    * Contribute to design sessions, code reviews, and continuous improvement initiatives

    **Preferred Qualifications**

    * Bachelor's degree in Computer Science, Data Engineering, or related field
    * 3+ years of experience developing data pipelines in a modern data platform environment
    * Strong Python development skills with experience in PySpark and Databricks
    * Understanding of medallion architecture and data lakehouse concepts
    * Experience with Azure cloud services (Data Lake Storage, Key Vault, Azure DevOps)
    * Familiarity with Delta Lake and data quality frameworks
    * Experience with modern CI/CD practices and infrastructure as code
    * Strong SQL skills and understanding of data modeling principles
    * Excellent communication skills with ability to translate technical concepts for business stakeholders
    * Financial services industry experience is a plus

    **Compensation Disclosure**

    The compensation for this role takes into account various factors, including work location, individual skill set, relevant experience, and other business needs. The estimated base salary range for this position is $90,000 - $110,000. Additionally, this position is eligible for an annual discretionary bonus. Please note that this is the current estimate of the base salary range intended for this role at the time of posting. The base salary range may be adjusted in the future.

    **Benefits**

    Calamos offers a comprehensive benefits package, including health and welfare benefits (medical, dental, vision, flexible spending accounts, and employer-paid short and long-term disability), as well as retirement benefits (401(k) and profit sharing), paid time off, paid parental leave, and other wellness benefits.
    $90k-110k yearly 26d ago
  • Data Scientist

    Navistar 4.7 company rating

    Data engineer job in Lisle, IL

    We are seeking a driven Data Scientist to join our Analytics team. In this role, you will work on high-impact projects that leverage machine learning, AI, and data engineering techniques focused on the Service Solutions and Commercial organization. This is an excellent opportunity for early-career professionals; you will work closely with experienced data scientists and engineers who will mentor and guide your development. A passion for learning, a collaborative mindset, and a bias toward innovation are key to success in this role.

    **Responsibilities**

    * Support the development of predictive and prescriptive models using machine learning and statistical techniques to address key business problems in electrification and supply chain.
    * Assist in building data pipelines and workflows to ingest, clean, and prepare large datasets from various internal and external sources.
    * Collaborate on the development and deployment of analytics solutions using cloud platforms such as Databricks and Azure ML.
    * Develop clear and compelling data visualizations and dashboards to communicate insights to both technical and non-technical stakeholders.
    * Work within an agile team environment, contributing to code repositories, sprint planning, and solution documentation.
    * Stay up to date with emerging trends in data science, generative AI, and commercial mobility innovation.
    * Participate in innovation initiatives, brainstorming new ways to apply AI and analytics to improve operations and customer experience.
    **Minimum Requirements**

    * Currently pursuing a Bachelor's degree in Computer Science, Statistics, Data Science/Analytics, Management Information Systems, Mathematics, Natural Science, Economics, Engineering or a similar quantitative field, with no experience required, and will obtain the degree prior to the first day of employment, OR
    * Bachelor's degree in Computer Science, Statistics, Data Science/Analytics, Management Information Systems, Mathematics, Natural Science, Economics, Engineering or a similar quantitative field

    **Additional Requirements**

    * Qualified candidates, excluding current employees, must be legally authorized on an unrestricted basis (US Citizen, Legal Permanent Resident, Refugee or Asylee) to be employed in the United States. We do not anticipate providing employment-related work sponsorship for this position (e.g., H-1B status).

    **Desired Skills**

    * Familiarity with SQL and working with structured data.
    * Exposure to cloud-based tools (e.g., Azure, AWS, GCP), particularly for data science or engineering tasks.
    * Understanding of key machine learning concepts such as classification, regression, clustering, or time series forecasting.
    * Strong interest in applying AI and analytics to solve real-world business problems.
    * Comfortable using data visualization tools such as Power BI, matplotlib, seaborn, or Streamlit.
    * Exposure to agile ways of working, including version control (Git) and working in cross-functional teams, is a plus.
    * A growth mindset and willingness to learn from feedback and challenges.

    **Benefits and Compensation**

    We provide a competitive total rewards package which ensures job satisfaction both on and off the job. We offer market-based compensation, health benefits, 401(k) match, tuition assistance, EAP, legal insurance, an employee discount program, and more. For this position, the expected salary range will be commensurate with the candidate's applicable skills, knowledge and experience.
    You can learn more about our comprehensive benefits package at ********************************************

    **Company Overview**

    **ABOUT TRATON**

    With its brands Scania, MAN, International, and Volkswagen Truck & Bus, TRATON SE is the parent and holding company of the TRATON GROUP and one of the world's leading commercial vehicle manufacturers. The Group's product portfolio comprises trucks, buses, and light-duty commercial vehicles. "Transforming Transportation Together. For a sustainable world.": this intention underlines the Company's ambition to have a lasting and sustainable impact on the commercial vehicle business and on the Group's commercial growth.

    **ABOUT INTERNATIONAL**

    From a one-man company built on the world-changing invention of the McCormick reaper in 1831, to the 15,000-person-strong company we are today, few companies can lay claim to a history like International. Based in Lisle, Illinois, International Motors, LLC* creates solutions that deliver greater uptime and productivity to our customers throughout the full operation of our commercial vehicles. We build International trucks and engines and IC Bus school and commercial buses that are as tough and as smart as the people who drive them. We also develop Fleetrite aftermarket parts. In everything we do, our vision is to accelerate the impact of sustainable mobility to create the cleaner, safer world we all deserve. As of 2021, we joined Scania, MAN and Volkswagen Truck & Bus in TRATON GROUP, a global champion of the truck and transport services industry. To learn more, visit **********************

    *International Motors, LLC is d/b/a International Motors USA in Illinois, Missouri, New Jersey, Ohio, Texas, and Utah.

    **EEO Statement**

    We are an Equal Opportunity Employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics. If you are a qualified individual with a disability and require a reasonable accommodation to access the online application system or participate in the interview process due to your disability, please email ********************* to request assistance. Kindly specify the Job Requisition Number / Job Title and Location in your response; otherwise, your request may not be considered.
    $68k-88k yearly est. Auto-Apply 37d ago
  • Lead Data & BI Scientist

    Zurn Elkay Water Solutions

    Data engineer job in Downers Grove, IL

    **The Company**

    Zurn Elkay Water Solutions Corporation is a thriving, values-driven company focused on doing the right things. We're a fast-growing, publicly traded company (NYSE: ZWS), with an enduring reputation for integrity, giving back, and providing an engaging, inclusive environment where careers flourish and grow. Named by Newsweek as One of America's Most Responsible Companies and an Energage USA Top Workplace, at Zurn Elkay Water Solutions Corporation we never forget that our people are at the center of what makes us successful. They are the driving force behind our superior quality, product ingenuity, and exceptional customer experience. Our commitment to our people and their professional development is a recipe for success that has fueled our growth for over 100 years, as one of today's leading international suppliers of plumbing and water delivery solutions. Headquartered in Milwaukee, WI, Zurn Elkay Water Solutions Corporation employs over 2,800 employees worldwide, working from 24 locations across the U.S., China, Canada, Dubai, and Mexico, with sales offices available around the globe. We hope you'll visit our website and learn more about Zurn Elkay at zurnelkay.com. If you're ready to join a company where what you do makes a difference and you have pride in the work you are doing, talk to us about joining the Zurn Elkay Water Solutions Corporation family! If you are a current employee, please navigate here to apply internally.

    **Job Description**

    The Lead Data & BI Scientist is a senior-level role that blends advanced data science capabilities with business intelligence leadership. This position is responsible for driving strategic insight generation, building predictive models, and leading analytics initiatives across departments such as sales, marketing, pricing, manufacturing, logistics, supply chain, and finance.
    The role requires both technical depth and business acumen to ensure that data-driven solutions are aligned with organizational goals and deliver measurable value.

    **Key Accountabilities**

    Strategic Insight & Business Partnership
    * Partner with business leaders to identify high-impact opportunities and form hypotheses.
    * Present findings and recommendations to leadership in a clear, impactful manner.
    * Demonstrate ROI and business value from analytics initiatives.

    Data Science Leadership
    * Define and implement data science processes, tools, and governance frameworks.
    * Mentor junior team members and foster a culture of continuous learning.

    Advanced Analytics & Modeling
    * Design, build and validate predictive models, machine learning algorithms and statistical analyses.
    * Translate complex data into actionable insights for strategic decision-making.

    Technology & Tools
    * Utilize tools such as Tableau, Power BI, OBIEE/OAC, Snowflake, SQL, R, Python, and data catalogs.
    * Stay current with emerging technologies like agentic analytics and AI-driven insights.
    * Evaluate and recommend BI platforms and data science tools.

    Project & Change Management
    * Manage analytics projects from inception to delivery.
    * Provide training and change management support for new tools and processes.
    * Lead the establishment of a data science center of excellence.

    **Qualifications/Requirements**

    * Bachelor's degree required, in a quantitative field such as engineering, mathematics, science, and/or MIS; Master's degree preferred
    * 10+ years of overall work experience
    * 7+ years of experience in data science and statistical analysis
    * Strong understanding of and experience with analytics tools such as Tableau and OBIEE/OAC (or similar tools) for reporting and visualization, Snowflake for data storage, data modeling, data prep or ETL tools, R or Python, SQL, and data catalogs
    * Strong understanding of and experience with multiple statistical and quantitative models and techniques such as (but not limited to) those used for predictive analytics, machine learning, AI, linear models and optimization, clustering, and decision trees
    * Deep experience applying data science to solve problems in at least one of the following areas is required; experience in multiple areas is preferred: marketing, manufacturing, pricing, logistics, sourcing, and sales
    * Strong communication skills (verbal, written) and ability to work effectively with all levels of the organization
    * Working knowledge of and proven experience applying project management tools
    * Strong analytical skills
    * Ability to lead and mentor the work of others
    * High degree of creativity and latitude is expected

    **Capabilities and Success Factors**

    * Decision Quality - Making good and timely decisions that keep the organization moving forward.
    * Manages Complexity - Making sense of complex, high quantity and sometimes contradictory information to effectively solve problems.
    * Plans & Aligns - Planning and prioritizing work to meet commitments aligned with organizational goals.
    * Drives Results - Consistently achieving results, even under tough circumstances.
    * Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.

    **Total Rewards and Benefits**

    * Competitive Salary
    * Medical, Dental, Vision, STD, LTD, AD&D, and Life Insurance
    * Matching 401(k) Contribution
    * Health Savings Account
    * Up to 3 weeks starting Vacation (may increase with tenure)
    * 12 Paid Holidays
    * Annual Bonus Eligibility
    * Educational Reimbursement
    * Matching Gift Program
    * Employee Stock Purchase Plan - purchase company stock at a discount!

    **THIRD PARTY AGENCY:** Any unsolicited submissions received from recruitment agencies will be considered property of Zurn Elkay, and we will not be liable for any fees or obligations related to those submissions.

    Equal Opportunity Employer - Minority/Female/Disability/Veteran
    $70k-97k yearly est. Auto-Apply 60d+ ago
  • Lead Data & BI Scientist

    Zurn Elkay Water Solutions Corporation

    Data engineer job in Downers Grove, IL

    The Company Zurn Elkay Water Solutions Corporation is a thriving, values-driven company focused on doing the right things. We're a fast growing, publicly traded company (NYSE: ZWS), with an enduring reputation for integrity, giving back, and providing an engaging, inclusive environment where careers flourish and grow. Named by Newsweek as One of America's Most Responsible Companies and an Energage USA Top Workplace, at Zurn Elkay Water Solutions Corporation, we never forget that our people are at the center of what makes us successful. They are the driving force behind our superior quality, product ingenuity, and exceptional customer experience. Our commitment to our people and their professional development is a recipe for success that has fueled our growth for over 100 years, as one of today's leading international suppliers of plumbing and water delivery solutions. Headquartered in Milwaukee, WI, Zurn Elkay Water Solutions Corporation employs over 2800 employees worldwide, working from 24 locations across the U.S., China, Canada, Dubai, and Mexico, with sales offices available around the globe. We hope you'll visit our website and learn more about Zurn Elkay at zurnelkay.com. If you're ready to join a company where what you do makes a difference and you have pride in the work you are doing, talk to us about joining the Zurn Elkay Water Solutions Corporation family! If you are a current employee, please navigate here to apply internally. Job Description The Lead Data & BI Scientist is a senior-level role that blends advanced data science capabilities with business intelligence leadership. This position is responsible for driving strategic insight generation, building predictive models, and leading analytics initiatives across departments such as sales, marketing, pricing, manufacturing, logistics, supply chain, and finance. 
The role requires both technical depth and business acumen to ensure that data-driven solutions are aligned with organizational goals and deliver measurable value. Key Accountabilities Strategic Insight & Business Partnership * Partner with business leaders to identify high-impact opportunities and form hypotheses. * Present findings and recommendations to leadership in a clear, impactful manner. * Demonstrate ROI and business value from analytics initiatives. Data Science Leadership * Define and implement data science processes, tools, and governance frameworks. * Mentor junior team members and foster a culture of continuous learning. Advanced Analytics & Modeling * Design, build and validate predictive models, machine learning algorithms and statistical analyses. * Translate complex data into actionable insights for strategic decision-making. Technology & Tools * Utilize tools such as Tableau, Power BI, OBIEE/OAC, Snowflake, SQL, R, Python, and data catalogs. * Stay current with emerging technologies like agentic analytics and AI-driven insights. * Evaluate and recommend BI platforms and data science tools. Project & Change Management * Manage analytics projects from inception to delivery. * Provide training and change management support for new tools and processes. * Lead the establishment of a data science center of excellence. Qualifications/Requirements * Bachelor degree required, in a quantitative field such as engineering, mathematics, science, and/or MIS. Master degree preferred * 10+ years of overall work experience * 7+ years of experience in data science and statistical analysis * Strong understanding and experience with analytics tool such as Tableau and OBIEE/OAC (or similar tools) for reporting and visualization, Snowflake for data storage, data modeling, data prep or ETL tools, R or Python, SQL, and data catalogs. 
* Strong understanding of and experience with multiple statistical and quantitative models and techniques such as (but not limited to) those used for predictive analytics, machine learning, AI, linear models and optimization, clustering, and decision trees. * Deep experience applying data science to solve problems in at least one of the following areas is required. Experience in multiple areas is preferred: marketing, manufacturing, pricing, logistics, sourcing, and sales. * Strong communication (verbal, written) skills, and ability to work effectively with all levels of the organization * Working knowledge of and proven experience applying project management tools * Strong analytical skills * Ability to lead and mentor the work of others * High degree of creativity and latitude is expected Capabilities and Success Factors * Decision Quality - Making good and timely decisions that keep the organization moving forward. * Manages Complexity - Making sense of complex, high-quantity, and sometimes contradictory information to effectively solve problems. * Plans & Aligns - Planning and prioritizing work to meet commitments aligned with organizational goals. * Drives Results - Consistently achieving results, even under tough circumstances. * Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Total Rewards and Benefits * Competitive Salary * Medical, Dental, Vision, STD, LTD, AD&D, and Life Insurance * Matching 401(k) Contribution * Health Savings Account * Up to 3 weeks starting Vacation (may increase with tenure) * 12 Paid Holidays * Annual Bonus Eligibility * Educational Reimbursement * Matching Gift Program * Employee Stock Purchase Plan - purchase company stock at a discount! THIRD PARTY AGENCY: Any unsolicited submissions received from recruitment agencies will be considered property of Zurn Elkay, and we will not be liable for any fees or obligations related to those submissions. 
Equal Opportunity Employer - Minority/Female/Disability/Veteran
    $70k-97k yearly est. Auto-Apply 60d+ ago
  • ETL Conversion Architect

    Pcmi LLC 3.7company rating

    Data engineer job in Park Ridge, IL

    Who We Are PCMI (Policy Claim Management International) is a fast-growing, leading provider of integrated software for Extended Warranty Management and Finance and Insurance (F&I) administration. We are a SaaS company that operates in a fast-paced, entrepreneurial environment. Our 3 teams located in the US, Poland, and Thailand work collaboratively around the clock to build our PCRS platform that automates the full administration lifecycle of all extended warranties, F&I products, and service contracts for our customers. What You'll Do The ETL Conversion Architect will play a critical leadership role in shaping the data conversion strategy for enterprise-scale implementations of PCMI's PCRS platform. Rather than focusing solely on individual project delivery, this role is also responsible for establishing scalable frameworks, validation models, and governance structures that enable the broader Professional Services team to execute conversions with consistency, accuracy, and minimal rework. Acting as the subject matter expert in ETL methodology, this individual will define how legacy data is transformed, validated, and migrated into PCRS-prioritizing transparency, client alignment, and time-to-value. From developing SQL-based source-to-target comparison models to enabling trust-but-verify conversion tracking, this role is essential to building a repeatable, high-confidence conversion process that accelerates client onboarding and elevates delivery quality across the organization. In this role, you will own: Practice-Level Framework Design: Architect and maintain source-to-target validation frameworks that ensure field-level integrity across legacy and PCRS systems. Design standardized SQL Server validation pipelines (leveraging SSIS) to log and store pre/post conversion data, enabling clear traceability and trust-but-verify validation with clients. 
Establish conversion hypothesis protocols-defining expected outcomes from each extract-transformation-load (ETL) step before execution begins. Define and govern control total expectations (e.g., contract count, rate bucket totals, claim payments) across major business objects like Contracts, Claims, and Payments. Automation & Quality Enablement: Build automated validation assets and exception monitoring templates that the broader team can leverage in Excel, SQL Server, or other tooling. Partner with DevOps/Engineering to maintain a centralized SQL repository of validation rules, transformation logic, and data anomaly flags. Provide field-level transformation and mapping patterns for common edge cases (e.g., address concatenation, rate truncation, legacy formatting inconsistencies). Client Alignment & Delivery Readiness: Define standardized conversion preview formats (e.g., before/after field-level reports) to be used in pre-conversion workshops with clients-helping them visualize how legacy data will translate into PCRS before any transformation begins. Serve as a client-facing SME during data onboarding and conversion planning sessions-guiding clients through expected outcomes, resolving mapping discrepancies, and confirming mutual alignment prior to data transformation. Act as a Trusted Advisor for clients throughout the full conversion lifecycle, from pre-conversion planning through post-load validation - ensuring transparency, accuracy, and accountability are maintained across all data migration milestones. Ensure ongoing data quality assurance by defining scalable approaches for post-load reconciliation, control totals, and referential integrity validations - enabling early detection of discrepancies and reducing reliance on UAT as the primary validation checkpoint. Governance & Oversight: Own the conversion data quality strategy for PCRS implementations - including release-specific adjustments based on schema evolution. 
Govern the process for tracking, storing, and surfacing control totals (financial and transactional) across conversions-ensuring full auditability. Define procedures for flagging scope changes or deviations from standard conversions and providing inputs into project change control discussions. Reasonable accommodations may be made to enable individuals with disabilities to perform these essential functions. Supervisory Responsibilities None. What You'll Need to Join Our Team 10+ years of progressive experience in ETL development, data conversion, and SaaS implementation, with a proven track record of designing and delivering scalable, SQL-based data frameworks in complex client environments. Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent hands-on experience with high-volume data conversion projects. Deep expertise in Microsoft SQL Server, including schema design, scripting, data transformation, validation logic, stored procedures, and performance tuning. Proficiency in ETL frameworks and tools (e.g., SSIS, custom SQL-based ETL engines), with strong ability to architect reusable pipelines and enforce field-level mapping logic. Solid understanding of SaaS platforms, especially in highly configurable systems within the following industries: Insurance, F&I, Accounting, Risk Management, or warranty admin platforms. Experience working in Professional Services organizations with a focus on implementation quality, timeline predictability, and cost control. Strong communication skills with demonstrated ability to translate complex technical designs into digestible summary insights for non-technical audiences, including PMs and clients. Background in mentoring technical consultants and conversion engineers, and establishing standards for scalable data practices across client projects. 
Familiarity with Agile or Waterfall delivery methodologies, as well as Change Management, Data Governance, and Test Validation frameworks. Experience collaborating with Product and Engineering to identify platform gaps, inform roadmap priorities, and drive customer outcomes through technical innovation. Required Skills/Abilities Expert-level proficiency in ETL processes, frameworks, and source-to-target data mapping. Strong SQL skills and experience working with SQL Server for complex data transformations. Proficient in managing and querying large datasets across staging and production layers. Familiarity with REST APIs, JSON, XML, and Batch processing for system integrations. Competent in scripting with PowerShell and Python for ETL automation and validation. Strong analytical mindset with a focus on quality, traceability, and reusability of conversions. Excellent communication skills, capable of translating technical insights for non-technical audiences. Highly organized with the ability to manage multiple priorities under tight deadlines. Comfortable in a high-paced, client-facing environment with evolving business needs. Physical Requirements Prolonged periods of sitting at a desk and working on a computer. Must be able to lift up to 15 pounds at times. Travel Requirements Must be able to travel to client meetings or PCMI office; up to 10% Why Work For Us Competitive Compensation from $150,000-$170,000 Annually* Comprehensive Benefit Package** Health, Dental & Vision Insurance Health Savings Account (HSA) Flexible Spending Account (FSA) Short- & Long-Term Disability Insurance Company-paid Long-Term Disability Company-paid Life Insurance Voluntary Life Insurance Voluntary Accident Insurance Employee Assistance Program 401k with generous Company Match Commuter Benefits Paid Time Off accrued per pay period. 
    10 Paid Holidays Paid Parental Leave Annual Bonus Program Professional Development Opportunities Employee Events Wellness Programs Employee Discount Programs Office in Park Ridge, IL - Convenient location to Blue Line *Individual compensation packages are based on various factors unique to each candidate, including skill set, experience, qualifications, and other job-related aspects. **Eligible to enroll on the first day of employment for immediate coverage. Although the role is remote, PCMI can only hire employees in the following states: AL, CT, FL, GA, IL, KY, MO, NH, NC, OH, PA, TX. Note: It is required for this role to be in the Park Ridge, IL office, 2 days per week if the candidate is located in the Chicagoland area.
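The "control total" reconciliation described in this listing (tying out record counts and financial sums between a legacy extract and the loaded target) can be sketched in plain Python. The record fields and numbers below are invented for illustration; a real PCRS conversion would run comparable checks in SQL Server against staged pre/post conversion tables.

```python
def control_totals(rows, amount_field):
    """Compute a (row count, amount sum) control total over dict records."""
    total = sum(r[amount_field] for r in rows)
    return len(rows), round(total, 2)

def reconcile(legacy_rows, target_rows, amount_field="claim_paid"):
    """Compare legacy vs. target control totals.

    Returns a list of discrepancy messages; an empty list means the
    conversion ties out on both count and amount for this field.
    """
    legacy_count, legacy_sum = control_totals(legacy_rows, amount_field)
    target_count, target_sum = control_totals(target_rows, amount_field)
    issues = []
    if legacy_count != target_count:
        issues.append(f"row count mismatch: {legacy_count} vs {target_count}")
    if legacy_sum != target_sum:
        issues.append(f"{amount_field} total mismatch: {legacy_sum} vs {target_sum}")
    return issues
```

The same pattern extends to per-business-object totals (Contracts, Claims, Payments) by running `reconcile` once per object with the appropriate amount field.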
    $150k-170k yearly Auto-Apply 60d+ ago
  • Principal Data Engineer

    Sdevops

    Data engineer job in Rolling Meadows, IL

    Key Responsibilities of Principal Data Engineer: Guide the team towards successful project delivery Provide technical leadership and key decision making Work with and mentor individual team members to meet professional goals Continuous effort to automate and increase team efficiency Maintain high standards of software quality via code review, testing, automation standardization, and tooling Collaborate directly with both developers and business stakeholders Provide estimates and risk-reducing spike stories Assist in collection and documentation of requirements from business Assist in planning deployments and migrations Prepare status updates and run the daily standup Analyze data problems and provide solutions Assess opportunities for improvements and optimization Required Qualifications of the Principal Data Engineer: Master's degree or equivalent work experience Minimum 12 years of experience working with open source databases Minimum 10 years of experience working with ETL and related data pipeline technologies. Highly proficient in open source SQL systems, particularly MySQL and PostgreSQL Proficiency in multiple scripting languages, including Python and Ruby Demonstrable experience with cloud-based ETL tools, including EMR. Expertise with distributed data stores, with demonstrable experience using Redshift and optimizing query performance Deep understanding of data structures and schema design Prior work with the AWS ecosystem (particularly RDS, SQS and SNS, Step Functions, and CDK) Exceptional analytical, organizational, interpersonal, and communication (both oral and written) skills Self-motivated, driven, resourceful, and able to get things done Enjoy working in fast-paced, collaborative, Agile environments Ability to adapt and learn quickly is a fundamental necessity Benefits: 401(k) Dental Insurance Health insurance Health savings account Paid time off Professional development assistance Vision insurance
    $75k-100k yearly est. 60d+ ago
  • Big Data Engineer

    Forhyre

    Data engineer job in Rolling Meadows, IL

    Job Description Looking for an experienced Senior Big Data Developer Experience: 8 - 10 years Requirements Primary / Essential Skills : SPARK with Scala or Python Secondary / Optional Skills : AWS, UNIX & SQL Looking for an experienced Senior Big Data Developer who will be responsible for, Technical leadership in driving solutions and hands on contributions To build new cloud based ingestion, transformation and data movement applications To migrate / modernize legacy data platforms Contribute and assist in translating the requirements to high level and low level solution designs and working program / code Interact with business/IT stakeholders and other involved teams to understand requirements, identify dependencies, and suggest and convincingly present solutions to any or all parties involved. Perform hands-on work to deliver on commitments, coordinating with team members both onsite and offshore and enabling the team to deliver on commitments. To be successful in this role, the candidate must have: Good working knowledge of and strong concepts in the Spark framework Comfort working with any one or more of the scripting languages listed in order of preference: Scala, Python, Unix shell scripting Experience / exposure in AWS Services (EC2, Lambda, S3, RDS) and related cloud technologies Good understanding of the DATA space - Data Integration, Building Data Warehouse solutions
    $75k-100k yearly est. 22d ago
  • Real World Data Scientist, Oncology (Associate Director)

    Astellas Pharma 4.9company rating

    Data engineer job in Northbrook, IL

    Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas! Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at ***************** This position is based in Northbrook, Illinois. Hybrid work from certain states may be permitted in accordance with Astellas' Responsible Flexibility Guidelines. Candidates interested in hybrid work are encouraged to apply. Purpose: We are hiring an experienced real-world data scientist to join our Real-World Data Science (RWDS) team. As an Associate Director of RWDS, you will be an analytic researcher informing and conducting Real World Data (RWD) studies at any time in the drug lifecycle. You will work directly within the RWDS team to execute observational studies for internal and external consumption and partner closely with Development, Medical Affairs, and Pharmacovigilance/Pharmacoepidemiology colleagues in their research. Additionally, you will collaborate closely with others in RWDS, Biostatistics and the broader Quantitative Sciences & Evidence Generation department to enhance our RWD and analytics offerings. RWDS is multidisciplinary and provides RWE strategic input, study design, statistical and programming support to projects. Team members apply their unique knowledge, skills and experience in teams to deliver decision-shaping real-world evidence. 
Essential Job Responsibilities: Provide best-in-class data science support to Astellas drug development programs & marketed products in relation to RWD Design observational studies (primary and/or secondary data) Execute (program and analyze) observational studies using in-house RWD or oversee vendors or other RWDS staff in executing observational studies Write, review, or contribute to key study documents to ensure optimal methodological & statistical presentation. These documents include protocols, analysis plans, table and figure (TLF) specifications, study reports, and publications Ensure efficient planning, execution and reporting of analyses Advise as subject matter expert in specific data access partnerships Represent the company on matters related to RWD analysis at meetings with regulatory authorities, key opinion leaders and similar experts/bodies as needed Contribute to vendor selection with partner functions Participate in the creation and upkeep of best practices, tools/macros, and standards related to methods, data and data analysis at Astellas Collaborate with RWDS and Biostatistics colleagues and cross-functional teams in Development, Medical Affairs and Pharmacovigilance Mentor and guide junior members of the RWD Analytics team
    $75k-104k yearly est. 1d ago
  • Lead ETL Architect (No H1B)

    Sonsoft 3.7company rating

    Data engineer job in Deerfield, IL

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services. As background, this client has been working with us and internal recruiting to fill this role for a LONG time. Previously, the client was looking for someone who could both technically lead the team specific to SAG technologies and also work with the business. Two things have changed: 1) the span of control of the team increased from SAG to include the other technologies listed in the job description, and 2) he has been unsuccessful in finding someone who was both the best technical architect on the team and also had manager qualities. He is now open to lesser capabilities on the technical side, as long as they have the manager qualities. I WOULD TARGET CURRENT/ FORMER SR. MANAGERS/ DIRECTORS OF INTEGRATION/ SOA. PREFERABLY, WERE TECHNICAL AT SOME POINT IN THEIR CAREER, BUT MOVED INTO MANAGEMENT. WITH THIS TARGET, YOU SHOULD BE ABLE TO ACCESS THE APPROPRIATE TALENT AT THIS PAY RATE. Lead ETL Architect The client is seeking a Lead ETL Architect as a key member of their Center of Expertise for Integration Technologies. This consultant will be primarily responsible for leading the demand intake/ management process with the business leaders and other IT groups related to managing the demand for the design, build and ongoing support of new ETL architectures. They also will interact extensively with the team of architects supporting these technologies. 
Expertise in one or more of the following integration technology areas is required: -- ETL - Datastage, AbInitio, Talend (client is moving from AbInitio to Talend as their primary ETL tool) The overall team is responsible for addressing any architecture impacts and defining technology solution architectures focusing on infrastructure, logical and physical layout of the systems and technologies, not limited to hardware, distributed and virtual systems, software, security, data management, storage, backup/recovery, appliances, messaging, networking, workflow, interoperability, flexibility, scalability, monitoring and reliability including fault tolerance and disaster recovery, in collaboration with Enterprise Architect(s). Additionally, responsible for the build-out and ongoing run support of these integration platforms. Skills Set · Possession of fundamental skills of a solution architect with the ability to inquire into and resolve vague requirements · Ability to analyze existing systems through an interview process with technology SMEs. Does not need to be the SME. · Takes a holistic view and communicates it to others so they understand the enterprise view, ensuring coherence of all aspects of the project as an integrated system · Perform gap analysis between the current state and the future state architecture to identify single points of failure, capabilities, capacity, fault tolerance, hours of operation (SLA), change windows. · Strong verbal and written communication with proven skills in facilitating design sessions. · Able to influence, conceptualize, visualize and communicate the target architecture. · Able to communicate complex technical or architecture concepts in a simple manner and can adapt to different audiences · Ability to work independently and market architecture best practices, guidelines, standards, principles, and patterns while working with infrastructure and technology project teams. 
    · Ability to document end-to-end application transaction flows through the enterprise · Ability to document the technology architecture decision process, and if required develop relevant templates · Resolve conflicts within infrastructure and application teams, business units and other architects · Identify opportunities to cut costs without sacrificing the overall business goals. · Ability to estimate the financial impact of solution architecture alternatives / options · Knowledge of all components of an enterprise technical architecture. Additional Information ** U.S. Citizens and those who are authorized to work independently in the United States are encouraged to apply. We are unable to sponsor at this time. Note:- This is a Contract job opportunity for you. Only US Citizens, Green Card Holders, and GC-EAD, H4-EAD, L2-EAD, OPT-EAD & TN-Visa holders can apply. No H1B candidates, please. Please mention your Visa Status in your email or resume. ** All your information will be kept confidential according to EEO guidelines.
    $87k-114k yearly est. 9h ago
  • Data Engineer - Data Products & Delivery (On-site Work Schedule)

    Parts Town 3.4company rating

    Data engineer job in Addison, IL

    at Parts Town See What We're All About As the fastest-growing distributor of restaurant equipment, HVAC and residential appliance parts, we like to do things a little differently. First, you need to understand and demonstrate our Core Values with safety being your first priority. That's key. But we're also looking for unique enthusiasm, high integrity, courage to embrace change…and if you know a few jokes, that puts you on the top of our list! Do you have a genius-level knowledge of original equipment manufacturer parts? If not, no problem! We're more interested in passionate people with fresh ideas from different backgrounds. That's what keeps us at the top of our game. We're proud that our workplace has been recognized for its growth and innovation on the Inc. 5000 list 15 years in a row and the Crain's Fast 50 list ten times. We are honored to be voted by our Chicagoland team as a Chicago Tribune Top Workplace for the last four years. If you're ready to roll up your sleeves, go above and beyond and put your ambition to work, all while having some fun, let's chat - Apply Today! Perks Parts Town Pride - check out our virtual tour and culture! Quarterly profit-sharing bonus Hybrid work schedule Team member appreciation events and recognition programs Volunteer opportunities Monthly IT stipend Casual dress code On-demand pay options: Access your pay as you earn it, to cover unexpected or even everyday expenses All the traditional benefits like health insurance, 401k/401k match, employee assistance programs and time away - don't worry, we've got you covered. The Job at a Glance The Data Engineer - Data Products & Delivery will specialize in turning raw data into business-ready products within GCP. They will design and optimize data models, marts, and semantic layers in BigQuery, enabling analytics, BI, and ML use cases across the enterprise. You will also support downstream systems and APIs that deliver trusted data for operational and AI-driven processes. 
You will play a foundational role in shaping this future - building the pipelines, products, and platforms that power the next generation of digital distribution. A Typical Day Build silver/gold layers in BigQuery, transforming raw data into clean, business-ready models. Design semantic layers using Looker or dbt for consistent business metrics Develop data marts and star schemas optimized for analytics and self-service BI Build APIs and services for data delivery (Cloud Functions, Cloud Run) Partner with analysts, data scientists, and ML engineers to ensure data & AI readiness and support advanced modeling Collaborate with the Data Governance team to embed stewardship, lineage, and metadata into Dataplex and other MDM tooling Support real-time analytics using BigQuery streaming and Pub/Sub as needed. Optimize query performance and cost efficiency in BigQuery Drive adoption of AI/automation by ensuring data models are accessible for predictive and agentic use cases To Land This Opportunity You have 4+ years of experience in data engineering or BI-focused data modeling You have hands-on expertise in BigQuery (partitioning, clustering, performance tuning, cost management) You have strong knowledge of dbt, Looker, and SQL for transformation and semantic modeling You have experience with Cloud Functions, Cloud Run, and APIs for data delivery You're familiar with Pub/Sub and BigQuery streaming for real-time use cases Exposure to ML feature engineering in Vertex AI or similar platforms is a plus You have a strong understanding of data governance frameworks, Dataplex, and metadata management You're an all-star communicator and are proficient in English (both written and verbal) You have a quality, high speed internet connection at home About Your Future Team Our IT team's favorite pastimes include corny jokes, bowling, pool, and good pizza. 
They like vehicles that go really fast, Harry Potter, and coffee…a lot (they'll hear you out on whether Dunkin or Starbucks gets your vote). At Parts Town, we value transparency and are committed to ensuring our team members feel appreciated and supported. We prioritize our positive workplace culture where collaboration, growth, and work-life balance are celebrated. The salary range for this role is $114,300 - $132,715, which is based on factors including but not limited to qualifications, experience, and geographical location. Parts Town is a pay-for-performance company. In addition to base pay, some roles offer a profit-sharing program, and an annual bonus depending on the role. Our comprehensive benefits package includes health, dental and vision insurance, 401(k) with match, employee assistance programs, paid time off, paid sick time off, paid holidays, paid parental leave, and professional development opportunities. Parts Town welcomes diversity and as an equal opportunity employer all qualified applicants will be considered regardless of race, religion, color, national origin, sex, age, sexual orientation, gender identity, disability or protected veteran status. We are an E-Verify employer. For more information, please click on the following links: E-Verify Participation Poster: English | Spanish E-Verify Right to Work Poster: English | Spanish
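The "business-ready model" work this listing describes (denormalizing a raw fact table against a conformed dimension into a star-schema-style gold record) can be loosely illustrated in plain Python. Table and field names here are hypothetical; in practice this transformation would be expressed in BigQuery SQL or a dbt model.

```python
def build_gold(fact_rows, part_dim):
    """Join order facts to a part dimension and derive a revenue measure.

    fact_rows: list of dicts with order_id, part_id, qty, unit_price.
    part_dim: dict keyed by part_id with conformed attributes.
    """
    gold = []
    for f in fact_rows:
        dim = part_dim.get(f["part_id"], {})  # left join: missing keys allowed
        gold.append({
            "order_id": f["order_id"],
            "part_name": dim.get("name", "UNKNOWN"),      # conformed attribute
            "category": dim.get("category", "UNKNOWN"),
            "revenue": round(f["qty"] * f["unit_price"], 2),  # derived measure
        })
    return gold
```

The `"UNKNOWN"` default mirrors a common warehousing convention of routing unmatched dimension keys to a placeholder member rather than dropping fact rows.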
    $114.3k-132.7k yearly Auto-Apply 60d+ ago
  • Hadoop Developer

    Info. Services Inc. 4.2company rating

    Data engineer job in Riverwoods, IL

    Role: Hadoop Developer Duration: Fulltime BGV will be done for the selected candidates. • The Senior/Lead Hadoop Developer is responsible for designing, developing, testing, tuning and building a large-scale data processing system for Data Ingestion and Data products that allow the Client to improve quality, velocity and monetization of our data assets for both Operational Applications and Analytical needs. • Design, develop, validate and deploy the ETL processes • Must have used HADOOP (PIG, HIVE, SQOOP) on HORTONWORKS Distribution. • Responsible for the documentation of all Extract, Transform and Load (ETL) processes • Maintain and enhance ETL code, work with the QA and DBA team to fix performance issues • Collaborate with the Application team to design and develop required ETL processes, performance tune ETL programs/scripts. • Work with business partners to develop business rules and business rule execution • Perform process improvement and re-engineering with an understanding of technical problems and solutions as they relate to the current and future business environment. • Design and develop innovative solutions for demanding business situations. • Help drive cross team design / development via technical leadership / mentoring. Work with Offshore team of developers. • Analyze complex distributed production deployments, and make recommendations to optimize performance Essential skills • Minimum 3 years ETL experience with RDBMS and Big Data strongly preferred; may consider experience with Informatica or Datastage as an alternative. • Minimum 2+ years of experience in creating reports using TABLEAU. 
• Proficiency with HORTONWORKS Hadoop distribution components and custom packages • Proven understanding and related experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume, and/or Map/Reduce • Excellent RDBMS (Oracle, SQL Server) knowledge for development using SQL/PL SQL • Basic UNIX OS and Shell Scripting skills • 6+ years' experience in UNIX and Shell Scripting. • 3+ years' experience in job scheduling tools like AutoSys. • 3+ years' experience in Pig and Hive Queries. • 3+ years' hands-on experience with Oozie. • 3+ years' experience importing and exporting data using Sqoop between HDFS and Relational Database systems, mainframe, and/or Teradata. • Must have 2+ years' experience working with Spark for data manipulation, preparation, cleansing. Please respond with your word resume and requested details: Full Name : Work Authorization: Contact Number : Email ID : Skype ID: Current location: Willing to relocate : Salary : Additional Information All your information will be kept confidential according to EEO guidelines.
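Since the listing above centers on Map/Reduce proficiency, here is the classic word-count pattern sketched in plain Python (no cluster required) to show the map, shuffle/group, and reduce phases. This is a teaching sketch of the programming model, not Hadoop API code.

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) pairs for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: sum counts per key (grouping by key stands in for the
    shuffle step a real Hadoop job performs between map and reduce)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)
```

In a real Hadoop job these two functions would be the mapper and reducer, with the framework handling partitioning, sorting, and distribution across the cluster.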
    $77k-98k yearly est. 9h ago
  • BigData Hadoop Developer

    Jobsbridge

    Data engineer job in Des Plaines, IL

    Jobs Bridge Inc is among the fastest growing IT staffing / professional services organizations with its own job portal. Jobs Bridge works extremely closely with a large number of IT organizations in the most in-demand technology skill sets. Job Description Skill BigData Hadoop Developer Location Des Plaines, IL Total Experience 8 yrs. Max Salary Not Mentioned Employment Type Direct Jobs (Full Time) Domain Any Description GC can apply Level: 7+ years Good understanding of Hadoop and its ecosystem Strong experience in managing, monitoring and troubleshooting Hadoop clusters and environments (IBM BigInsights, HDP). Proficient in MapReduce, Hive, Flume, Sqoop Proficient in debugging Hive, MapReduce, and Sqoop issues Strong Linux administration background and experience in troubleshooting and analyzing Linux and resident application issues Good knowledge of scripting (shell, python) Good analytical skills. Additional Information Multiple Openings for GC/Citizen
    $76k-99k yearly est. 9h ago
  • Data Engineer

    Influur

    Data engineer job in Mundelein, IL

    We're building the world's first viral agent. An AI purpose-built for influencer marketing. Not a tool. Not a platform. An autonomous agent that goes from "I need this campaign to break" to "50 influencers live next week" with minimal human intervention. Why this matters: We have 3 years of proprietary data on what makes influencer content go viral. Warner, Sony, and Universal trust us to break their biggest artists. We're backed by the biggest names in entertainment including Sofia Vergara, Karol G, Tommy Mottola, and Tier 1 VCs like Point72 Ventures. We're not building for hypothetical use cases. We're shipping production AI that drives millions in revenue today. Why we'll win: We have what no one else has: both the data and the distribution. We have proprietary data on thousands of influencers and what makes content go viral, plus direct relationships with the influencers themselves. In the AI era, having both is the moat. Everyone else has one or the other. We have both. Why now: AGI agents will replace middle-level work by 2027. We have a 12 to 18 month window before the market floods with agents. We're building the category-defining social media viral agent right now. What we're looking for: Young talent ready to go all in. We're offering significant equity to people who want to build something that matters. This isn't a job. It's an opportunity to define the future of AI in influencer marketing and own a meaningful piece of it. Your Skillset Strong programming with Python and SQL. Comfortable building from scratch and improving existing code. Expertise in data modeling and warehousing, including dimensional modeling and performance tuning. Experience designing and operating ETL and ELT pipelines with tools like Airflow or Dagster, plus dbt for transformations. Hands-on with batch and streaming systems such as Spark and Kafka, and with Lakehouse or warehouse tech on AWS or GCP. 
Proficiency integrating third-party APIs and datasets, ensuring reliability, lineage, and governance. Familiarity with AI data needs: feature stores, embedding pipelines, vector databases, and feedback loops that close the gap between model and outcome. High standards for code quality, testing, observability, and CI. Comfortable with Docker and modern cloud infra. You're the Type Who Treats data as a product and ships improvements that users feel. Moves fast without breaking trust. You value contracts, schemas, and backward compatibility. Owns problems across the stack, from ingestion to modeling to serving. Communicates clearly with ML engineers, analysts, and business partners. Experiments, measures, and iterates. You set measurable SLAs and keep them green. Sees ambiguity as a chance to design the standard everyone else will follow. Gross salary range What We Offer• Competitive equity in a venture-backed company shaping the future of music influencer marketing.• A seat at the table as we redefine how the most iconic record labels, artists, and brands go viral (think Bad Bunny), with our tech, support, and strategic guidance.• Access to elite tools, AI copilots, and a team that builds daily at top speed.• Hybrid flexibility. We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
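    As a hedged illustration of the ETL/ELT and dimensional-modeling skills this listing names, here is a minimal sketch in plain Python and SQLite; the table and field names are invented for the example, not Influur's actual schema:

```python
# Illustrative ELT step: land raw events, then transform them into a
# small fact table, in the spirit of the pipeline work described above.
import sqlite3

RAW_EVENTS = [
    {"influencer": "a1", "platform": "tiktok", "views": 120_000},
    {"influencer": "a1", "platform": "instagram", "views": 30_000},
    {"influencer": "b2", "platform": "tiktok", "views": 540_000},
]

def run_elt(events):
    con = sqlite3.connect(":memory:")
    # "Load" phase: land raw records untouched.
    con.execute("CREATE TABLE raw_events (influencer TEXT, platform TEXT, views INT)")
    con.executemany(
        "INSERT INTO raw_events VALUES (:influencer, :platform, :views)", events
    )
    # "Transform" phase: aggregate raw events into a per-influencer fact table.
    con.execute("""
        CREATE TABLE fact_reach AS
        SELECT influencer, SUM(views) AS total_views, COUNT(*) AS n_posts
        FROM raw_events GROUP BY influencer
    """)
    return dict(con.execute("SELECT influencer, total_views FROM fact_reach"))
```

    In a production stack of the kind the listing describes, the transform would live in dbt or Spark and be orchestrated by Airflow or Dagster; the shape of the work is the same.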
    $75k-100k yearly est. 22d ago
  • Junior Data Engineer

    Calamos Recruiting

    Data engineer job in Naperville, IL

    Summary of the Role
    Calamos is seeking a Junior Data Engineer who is passionate about data and eager to grow their skills within the financial services industry. You'll join our Data Engineering Team as we continue to modernize our data platform using Databricks, transforming how the firm leverages data as a strategic asset. In this role, you'll participate in the lifecycle of data pipeline projects, from design and development through testing, documentation, and production support. You'll work hands-on with Databricks to build scalable data solutions that power critical investment decisions and operations. As part of a collaborative team, you'll contribute beyond individual projects by participating in design sessions, engaging with stakeholders to understand their needs, and sharing in rotation-based production support duties.

    Primary Responsibilities
    - Develop and maintain data pipelines in Databricks following Agile/Scrum methodology
    - Write clean, well-documented code that adheres to team and firm standards
    - Participate in the on-call rotation to monitor overnight data delivery SLAs, troubleshoot failures, and respond to data quality issues
    - Collaborate with architects, engineers, and business stakeholders to understand requirements and deliver solutions
    - Contribute to design sessions, code reviews, and continuous improvement initiatives

    Preferred Qualifications
    - Bachelor's degree in Computer Science, Data Engineering, or related field
    - 3+ years of experience developing data pipelines in a modern data platform environment
    - Strong Python development skills with experience in PySpark and Databricks
    - Understanding of medallion architecture and data lakehouse concepts
    - Experience with Azure cloud services (Data Lake Storage, Key Vault, Azure DevOps)
    - Familiarity with Delta Lake and data quality frameworks
    - Experience with modern CI/CD practices and infrastructure as code
    - Strong SQL skills and understanding of data modeling principles
    - Excellent communication skills with ability to translate technical concepts for business stakeholders
    - Financial services industry experience is a plus

    Compensation Disclosure
    The compensation for this role takes into account various factors, including work location, individual skill set, relevant experience, and other business needs. The estimated base salary range for this position is $90,000 - $110,000. Additionally, this position is eligible for an annual discretionary bonus. Please note that this is the current estimate of the base salary range intended for this role at the time of posting. The base salary range may be adjusted in the future.

    Benefits
    Calamos offers a comprehensive benefits package, including health and welfare benefits (medical, dental, vision, flexible spending accounts, and employer-paid short and long-term disability), as well as retirement benefits (401(k) and profit sharing), paid time off, paid parental leave, and other wellness benefits.
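    The medallion (bronze/silver/gold) pattern named in the qualifications can be sketched in miniature. This is a hedged illustration in plain Python rather than Databricks/PySpark, and the field names are hypothetical:

```python
# Bronze -> silver -> gold, the layering the listing refers to:
# raw landed data is cleaned into "silver", then aggregated into
# business-ready "gold" tables.

def to_silver(bronze_rows):
    """Bronze -> silver: drop malformed rows and normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("ticker") is None or row.get("price") is None:
            continue  # quarantine malformed records instead of loading them
        silver.append({"ticker": row["ticker"].upper(), "price": float(row["price"])})
    return silver

def to_gold(silver_rows):
    """Silver -> gold: aggregate into a per-ticker average price."""
    by_ticker = {}
    for row in silver_rows:
        by_ticker.setdefault(row["ticker"], []).append(row["price"])
    return {t: sum(prices) / len(prices) for t, prices in by_ticker.items()}
```

    On Databricks each layer would typically be a Delta Lake table with the same clean-then-aggregate contract between layers.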
    $90k-110k yearly 25d ago
  • Data Scientist

    International 4.1company rating

    Data engineer job in Lisle, IL

    We are seeking a driven Data Scientist to join our Analytics team. In this role, you will work on high-impact projects that leverage machine learning, AI, and data engineering techniques focused on the Service Solutions and Commercial organization. This is an excellent opportunity for early-career professionals: you will work closely with experienced data scientists and engineers who will mentor and guide your development. A passion for learning, a collaborative mindset, and a bias toward innovation are key to success in this role.

    Responsibilities
    + Support the development of predictive and prescriptive models using machine learning and statistical techniques to address key business problems in electrification and supply chain.
    + Assist in building data pipelines and workflows to ingest, clean, and prepare large datasets from various internal and external sources.
    + Collaborate on the development and deployment of analytics solutions using cloud platforms such as Databricks and Azure ML.
    + Develop clear and compelling data visualizations and dashboards to communicate insights to both technical and non-technical stakeholders.
    + Work within an agile team environment, contributing to code repositories, sprint planning, and solution documentation.
    + Stay up to date with emerging trends in data science, generative AI, and commercial mobility innovation.
    + Participate in innovation initiatives, brainstorming new ways to apply AI and analytics to improve operations and customer experience.

    Minimum Requirements
    + Currently pursuing a Bachelor's degree in Computer Science, Statistics, Data Science/Analytics, Management Information Systems, Mathematics, Natural Science, Economics, Engineering, or a similar quantitative field, with no experience required, and will obtain the degree prior to the first day of employment; OR
    + Bachelor's degree in Computer Science, Statistics, Data Science/Analytics, Management Information Systems, Mathematics, Natural Science, Economics, Engineering, or a similar quantitative field

    Additional Requirements
    + Qualified candidates, excluding current employees, must be legally authorized on an unrestricted basis (US Citizen, Legal Permanent Resident, Refugee, or Asylee) to be employed in the United States. We do not anticipate providing employment-related work sponsorship for this position (e.g., H-1B status).

    Desired Skills
    + Familiarity with SQL and working with structured data.
    + Exposure to cloud-based tools (e.g., Azure, AWS, GCP), particularly for data science or engineering tasks.
    + Understanding of key machine learning concepts such as classification, regression, clustering, or time series forecasting.
    + Strong interest in applying AI and analytics to solve real-world business problems.
    + Comfortable using data visualization tools such as Power BI, matplotlib, seaborn, or Streamlit.
    + Exposure to agile ways of working, including version control (Git) and working in cross-functional teams, is a plus.
    + A growth mindset and willingness to learn from feedback and challenges.

    Benefits and Compensation
    We provide a competitive total rewards package which ensures job satisfaction both on and off the job. We offer market-based compensation, health benefits, 401(k) match, tuition assistance, EAP, legal insurance, an employee discount program, and more. For this position, the expected salary range will be commensurate with the candidate's applicable skills, knowledge, and experience.
    You can learn more about our comprehensive benefits package at ********************************************

    Company Overview
    ABOUT TRATON
    With its brands Scania, MAN, International, and Volkswagen Truck & Bus, TRATON SE is the parent and holding company of the TRATON GROUP and one of the world's leading commercial vehicle manufacturers. The Group's product portfolio comprises trucks, buses, and light-duty commercial vehicles. "Transforming Transportation Together. For a sustainable world.": this intention underlines the Company's ambition to have a lasting and sustainable impact on the commercial vehicle business and on the Group's commercial growth.

    ABOUT INTERNATIONAL
    From a one-man company built on the world-changing invention of the McCormick reaper in 1831, to the 15,000-person-strong company we are today, few companies can lay claim to a history like International. Based in Lisle, Illinois, International Motors, LLC* creates solutions that deliver greater uptime and productivity to our customers throughout the full operation of our commercial vehicles. We build International trucks and engines and IC Bus school and commercial buses that are as tough and as smart as the people who drive them. We also develop Fleetrite aftermarket parts. In everything we do, our vision is to accelerate the impact of sustainable mobility to create the cleaner, safer world we all deserve. As of 2021, we joined Scania, MAN, and Volkswagen Truck & Bus in the TRATON GROUP, a global champion of the truck and transport services industry. To learn more, visit ********************* (https://*********************/our-company) .

    *International Motors, LLC is d/b/a International Motors USA in Illinois, Missouri, New Jersey, Ohio, Texas, and Utah.

    EEO Statement
    We are an Equal Opportunity Employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics. If you are a qualified individual with a disability and require a reasonable accommodation to access the online application system or participate in the interview process due to your disability, please email ********************* to request assistance. Kindly specify the Job Requisition Number / Job Title and Location in your response; otherwise, your request may not be considered.
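    As a toy illustration of one concept named under Desired Skills (time series forecasting), here is a simple moving-average forecast in plain Python. This is purely illustrative, not International's actual tooling:

```python
# Naive baseline forecast: predict the next value of a series as the
# mean of its most recent `window` observations. Real work would use
# richer models, but this is the simplest member of the family.

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` points."""
    if len(series) < window:
        raise ValueError("series shorter than window")
    recent = series[-window:]
    return sum(recent) / window
```

    Baselines like this are useful as a sanity check: a learned model that cannot beat the moving average is not adding value.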
    $67k-88k yearly est. 60d+ ago
  • Data-Senior Data Engineer-CL

    Endava 4.2company rating

    Data engineer job in Deerfield, IL

    Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change. By combining world-class engineering, industry expertise, and a people-centric mindset, we consult and partner with leading brands from various industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses. From prototype to real-world impact: be part of a global shift by doing work that matters.

    Job Description
    Our data team has expertise across engineering, analysis, architecture, modeling, machine learning, artificial intelligence, and data science. This discipline is responsible for transforming raw data into actionable insights, building robust data infrastructures, and enabling data-driven decision-making and innovation through advanced analytics and predictive modeling.

    Responsibilities:
    - Work closely with the Data Analyst/Data Scientist to understand evolving needs and define the data processing flow or interactive reports.
    - Discuss with stakeholders from other teams to better understand how data flows are used within the existing environment.
    - Propose solutions for the cloud-based architecture and deployment flow.
    - Design and build processes, data transformations, and metadata to meet business requirements and platform needs.
    - Design and propose solutions for the relational and dimensional model based on platform capabilities.
    - Develop, maintain, test, and evaluate big data solutions.
    - Focus on production status and data quality of the data environment.
    - Pioneer initiatives around data quality, integrity, and security.

    Qualifications
    Required:
    - 5+ years of experience in Data Engineering.
    - Proficiency in Apache Spark.
    - Proficiency in Python.
    - Some experience leading IT projects and stakeholder management.
    - Experience implementing ETL/ELT processes and data pipelines.
    - Experience with Snowflake.
    - Strong SQL scripting experience.
    - Background and experience with cloud data technologies and tools. Familiarity with data tools and technologies such as Spark, Hadoop, Apache Beam, Dataproc, or similar; BigQuery, Redshift, or other data warehouse tools; real-time pipelines with Kinesis or Kafka; batch processing; serverless processing.
    - Strong analytic skills related to working with structured and unstructured data.
    - Must be able to work onsite 2-3 days a week.

    Additional Information
    Discover some of the global benefits that empower our people to become the best version of themselves:
    - Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus;
    - Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership;
    - Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platform subscriptions, pass-it-on sessions, workshops, conferences;
    - Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme;
    - Health: Global internal wellbeing programme, access to wellbeing apps;
    - Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.

    Additional Employee Requirements
    - Participation in both internal meetings and external meetings via video calls, as necessary.
    - Ability to go into corporate or client offices to work onsite, as necessary.
    - Prolonged periods of remaining stationary at a desk and working on a computer, as necessary.
    - Ability to bend, kneel, crouch, and reach overhead, as necessary.
    - Hand-eye coordination necessary to operate computers and various pieces of office equipment, as necessary.
    - Vision abilities including close vision, toleration of fluorescent lighting, and adjusting focus, as necessary.
    - For positions that require business travel and/or event attendance, ability to lift 25 lbs, as necessary.
    - For positions that require business travel and/or event attendance, a valid driver's license and acceptable driving record are required, as driving is an essential job function.

    *If requested, reasonable accommodations will be made to enable employees requiring accommodations to perform the essential functions of their jobs, absent undue hardship.

    USA Benefits (full-time roles only; does not apply to contractor positions)
    - Robust healthcare and benefits including medical, dental, vision, disability coverage, and various other benefit options
    - Flexible Spending Accounts (medical, transit, and dependent care)
    - Employer-paid life insurance and AD&D coverage
    - Health Savings Account paired with our low-cost high-deductible medical plan
    - 401(k) Safe Harbor retirement plan with employer match that vests immediately

    At Endava, we're committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives, because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.
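    The "real-time pipelines with Kinesis or Kafka" skill this listing asks for can be sketched, under the assumption of a tumbling-window aggregation, in plain Python; the event shape is invented for the example:

```python
# Stand-in for a streaming aggregation: count events per key inside
# fixed (tumbling) time windows. In production the events would arrive
# from a Kafka topic or Kinesis stream rather than a Python list.
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp_secs, key) events into fixed windows, count per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_secs][key] += 1  # integer division picks the window
    return {w: dict(counts) for w, counts in windows.items()}
```

    The same windowed-count logic maps directly onto Spark Structured Streaming's `groupBy(window(...), key).count()`, which is the kind of tooling the qualifications list.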
    $79k-103k yearly est. 60d+ ago
  • Data Scientist, Generative AI

    Amira Learning 3.8company rating

    Data engineer job in Ohio, IL

    REMOTE / FULL TIME

    Amira Learning accelerates literacy outcomes by delivering the latest reading science and neuroscience with AI. As the leader in third-generation edtech, Amira listens to students read out loud, assesses mastery, helps teachers supplement instruction, and delivers 1:1 tutoring. Validated by independent university and SEA efficacy research, Amira is the only AI literacy platform proven to achieve gains surpassing 1:1 human tutoring, consistently delivering effect sizes over 0.4. Rooted in over thirty years of research, Amira is the first, foremost, and only proven Intelligent Assistant for teachers and AI Reading Tutor for students.

    The platform serves as a school district's Intelligent Growth Engine, driving instructional coherence by unifying assessment, instruction, and tutoring around the chosen curriculum. Unlike any other edtech tool, Amira continuously identifies each student's skill gaps and collaborates with teachers to build lesson plans aligned with district curricula, pulling directly from the district's high-quality instructional materials. Teachers can finally differentiate instruction with evidence and ease, and students get the 1:1 practice they specifically need, whether they are excelling or working below grade level. Trusted by more than 2,000 districts and working in partnership with twelve state education agencies, Amira is helping 3.5 million students worldwide become motivated and masterful readers.

    About this role:
    We are seeking a Data Scientist with expertise in the domains of reading science, education, literacy, and NLP, with practical experience building and utilizing Gen AI (LLM, image, and/or video) models. You will help create Gen AI based apps that will power the most widely used Intelligent Assistant in U.S. schools, already helping more than 2 million children. We are looking for strong, education-focused engineers with a background using the latest generative AI models and experience in areas such as prompt engineering, model evaluation, data processing for training and fine-tuning, model alignment, and human-feedback-based model training.

    Responsibilities include:
    * Design methods, tools, and infrastructure to enable Amira to interact with students and educators in novel ways.
    * Define approaches to content creation that will enable Amira to safely assist students in building their reading skills, including defining internal pipelines to interact with our content team.
    * Contribute to experiments, including designing experimental details and hypothesis testing, writing reusable code, running evaluations, and organizing and presenting results.
    * Work hands-on with large, complex codebases, contributing meaningfully to enhance the capabilities of the machine learning team.
    * Work within a fully distributed (remote) team.
    * Find mechanisms for making the use of Gen AI economically viable given the limited budgets of public schools.

    Who You Are:
    * You have a background in early education, reading science, literacy, and/or NLP.
    * You have at least one year of experience working with LLMs and Gen AI models.
    * You have a degree in computer science or a related technical area.
    * You are a proficient Python programmer.
    * You have created performant machine learning models.
    * You want to continue to be hands-on with LLMs and other Gen AI models over the next few years.
    * You have a desire to be at a Silicon Valley start-up, with the desire and commitment that requires.
    * You enjoy working on a remote, distributed team and are a natural collaborator.
    * You love writing code; creating good products means a lot to you. Working is fun, not a passport to get to the next weekend.

    Qualifications
    * Bachelor's degree and/or relevant experience
    * 1+ years of Gen AI experience, preferably in the Education SaaS industry
    * Ability to operate in a highly efficient manner by multitasking in a fast-paced, goal-oriented environment
    * Exceptional organizational, analytical, and detail-oriented thinking skills
    * Proven track record of meeting/exceeding goals and targets
    * Great interpersonal, written, and oral communication skills
    * Experience working across remote teams

    Amira's Culture
    * Flexibility: We encourage and support you to live and work where you desire. Amira works as a truly distributed team. We worked remotely before COVID and we'll be working remotely after the pandemic is long gone. Our office is Slack. Our coffee room is Zoom. Our team works hard, but we work when we want, where we want.
    * Collaboration: We work together closely, using collaborative tools and periodic face-to-face get-togethers. We believe great software is like movie-making: lots of talented people with very different skills have to band together to build a great experience.
    * Lean & Agile: We believe in ownership and continuous feedback. Yes, we employ Scrum ceremonies. But what we're really after is using data and learning to be better and to do better for our teachers, students, and players.
    * Mission-Driven: What's important to us is helping kids. We're about tangible, measured impact.

    Benefits:
    * Competitive salary
    * Medical, dental, and vision benefits
    * 401(k) with company matching
    * Flexible time off
    * Stock option ownership
    * Cutting-edge work
    * The opportunity to help children around the world reach their full potential

    Commitment to Diversity:
    Amira Learning serves a diverse group of students and educators across the United States and internationally. We believe every student should have access to a high-quality education and that it takes a diverse group of people with a wide range of experiences to develop and deliver a product that meets that goal. We are proud to be an equal opportunity employer. The posted salary range reflects the minimum and maximum base salary the company reasonably expects to pay for this role. Salary ranges are determined by role, level, and location. Individual pay is based on location, job-related skills, experience, and relevant education or training. We are an equal opportunity employer. We do not discriminate on the basis of race, religion, color, ancestry, national origin, sex, sexual orientation, gender identity or expression, age, disability, medical condition, pregnancy, genetic information, marital status, military service, or any other status protected by law.
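    As a hedged sketch of the "model evaluation" work this listing describes, here is a tiny exact-match eval harness; `model` is any callable, and the prompts and reference answers below are made-up examples, not Amira's data:

```python
# Minimal evaluation loop of the kind used to compare prompt or model
# variants: score a model by the fraction of prompts whose output
# exactly matches a reference answer.

def exact_match_score(model, eval_set):
    """Fraction of (prompt, reference) pairs the model answers exactly."""
    hits = sum(1 for prompt, ref in eval_set if model(prompt).strip() == ref)
    return hits / len(eval_set)
```

    Real Gen AI evaluation usually adds fuzzier scorers (rubric grading, LLM-as-judge) on top, but an exact-match baseline like this is the usual starting point.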
    $71k-101k yearly est. 33d ago
  • Real World Data Scientist, Oncology (Associate Director)

    Astellas Pharma, Inc. 4.9company rating

    Data engineer job in Northbrook, IL

    Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas!

    Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at *****************

    This position is based in Northbrook, Illinois. Hybrid work from certain states may be permitted in accordance with Astellas' Responsible Flexibility Guidelines. Candidates interested in hybrid work are encouraged to apply.

    Purpose:
    We are hiring an experienced real-world data scientist to join our Real-World Data Science (RWDS) team. As an Associate Director of RWDS, you will be an analytic researcher informing and conducting Real World Data (RWD) studies at any point in the drug lifecycle. You will work directly within the RWDS team to execute observational studies for internal and external consumption and partner closely with Development, Medical Affairs, and Pharmacovigilance/Pharmacoepidemiology colleagues in their research. Additionally, you will collaborate closely with others in RWDS, Biostatistics, and the broader Quantitative Sciences & Evidence Generation department to enhance our RWD and analytics offerings. RWDS is multidisciplinary and provides RWE strategic input, study design, and statistical and programming support to projects. Team members apply their unique knowledge, skills, and experience in teams to deliver decision-shaping real-world evidence.

    Essential Job Responsibilities:
    * Provide best-in-class data science support to Astellas drug development programs and marketed products in relation to RWD
    * Design observational studies (primary and/or secondary data)
    * Execute (program and analyze) observational studies using in-house RWD, or oversee vendors or other RWDS staff in executing observational studies
    * Write, review, or contribute to key study documents to ensure optimal methodological and statistical presentation; these documents include protocols, analysis plans, table and figure (TLF) specifications, study reports, and publications
    * Ensure efficient planning, execution, and reporting of analyses
    * Advise as subject matter expert in specific data access partnerships
    * Represent the company on matters related to RWD analysis at meetings with regulatory authorities, key opinion leaders, and similar experts/bodies as needed
    * Contribute to vendor selection with partner functions
    * Participate in the creation and upkeep of best practices, tools/macros, and standards related to methods, data, and data analysis at Astellas
    * Collaborate with RWDS and Biostatistics colleagues and cross-functional teams in Development, Medical Affairs, and Pharmacovigilance
    * Mentor and guide junior members of the RWD Analytics team
    $75k-104k yearly est. 25d ago

Learn more about data engineer jobs

How much does a data engineer earn in DeKalb, IL?

The average data engineer in DeKalb, IL earns between $65,000 and $114,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in DeKalb, IL

$86,000
Job type you want
Full Time
Part Time
Internship
Temporary