
Data engineer jobs in Franklin, TN - 60 jobs

All
Data Engineer
Lead Data Technician
Data Scientist
Software Engineer
Data Consultant
Senior Software Engineer
Lead Data Architect
  • Data Scientist, Merchandising Analytics

    Tractor Supply (4.2 company rating)

    Data engineer job in Brentwood, TN

    The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using machine learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC, and will contribute to setting objectives that enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing. The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential, as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.

    Essential Duties and Responsibilities (Min 5%)
    * Work closely with key business partners to fully explore and frame a business question, including objectives, goals, KPIs, decisions the analysis will support, and required timing for deliverables.
    * Extract available and relevant data from internal and external data sources to perform data science solution development.
    * Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
    * Contribute to and assist the team with best practices in data governance, data engineering, and data architecture.
    * Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
    * Maintain broad exposure to the wider AI/machine learning ecosystem and ensure the team is pushing toward optimal solutions.
    * Manage multiple projects simultaneously with limited oversight, including development of new technologies and maintenance of the existing framework.
    * Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
    * Design and execute A/B tests to evaluate the effectiveness of data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans (a minimal sample-size sketch follows this listing).
    * Use proven predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
    * Be a "go-to" person for data analytical needs ranging from data extraction and manipulation to long-term trend analysis, statistical analysis, and modeling techniques. Perform code review and debugging with the team and assist during implementation where necessary.

    Required Qualifications
    Experience: 3-5 years of proven experience as a Data Scientist, preferably in the retail or e-commerce industry. 2+ years of experience in predictive modeling using CRM or transaction information preferred.
    Education: Bachelor's degree in Mathematics, Statistics, or Econometrics; Master's degree preferred. Any combination of education and experience will be considered.
    Professional Certifications: None

    Preferred knowledge, skills or abilities
    * Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
    * Deep expertise in writing and debugging complex SQL queries.
    * Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
    * Proven advanced modeling experience leading data-driven projects from definition to execution, driving and influencing project roadmaps.
    * Experience using Azure, AWS, or another cloud compute platform a plus.
    * Familiarity with visualization tools such as Power BI and Tableau.
    * High aptitude for communicating complex analysis results, both verbally and in writing, to senior and executive leadership.
    * Knowledge of A/B testing methods; capable of designing a controlled test, running it, and providing post-hoc measurement.
    * Proficiency with managing data repositories and version control systems such as Git.
    * Speak, read, and write effectively in the English language.

    Working Conditions
    * Hybrid / flexible working conditions

    Physical Requirements
    * Sitting
    * Standing (not walking)
    * Walking
    * Kneeling/Stooping/Bending
    * Lifting up to 10 pounds

    Disclaimer
    This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/her supervisor.

    Company Info
    At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future. Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part-time new hires gain eligibility for TSC benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement is met, the benefits eligibility date will be the first day of the month following 4 months of continuous service. Please visit this link for more specific information about the benefits and leave policies applicable to the position you're applying for.
    $81k-105k yearly est. 59d ago
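
The A/B-testing duties in the Tractor Supply listing above hinge on sizing a test before launch. Below is a minimal, hypothetical sketch of that calculation in Python, assuming statsmodels is installed; the baseline conversion rate and minimum detectable lift are invented for illustration, not taken from the posting.

```python
# Hypothetical A/B test sizing sketch for a conversion-rate style metric.
# Baseline and expected rates are made-up placeholders.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_rate = 0.04       # assumed current conversion rate
expected_rate = 0.045      # assumed minimum lift worth detecting
effect_size = proportion_effectsize(expected_rate, baseline_rate)

analysis = NormalIndPower()
n_per_arm = analysis.solve_power(
    effect_size=effect_size,
    alpha=0.05,            # two-sided significance level
    power=0.80,            # desired statistical power
    alternative="two-sided",
)
print(f"Units needed per arm: {n_per_arm:.0f}")
```

Swapping in the real baseline metric and the smallest lift worth acting on is the substantive design decision the listing describes.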

  • Enterprise Data Scientist (Rev Cycle & Supply Chain)

    Community Health Systems (4.5 company rating)

    Data engineer job in Franklin, TN

    This role is responsible for leveraging expertise in data analytics, advanced statistical methods, and programming to derive insights from clinical data. As a member of the Enterprise Data Science team, this role will be responsible for analyzing complex clinical datasets, developing data visualizations and dashboards, assisting with data model and/or feature development, and translating insights into actionable recommendations for improving patient care and operational outcomes.

    Responsibilities:
    + Collaborate with cross-functional teams, including clinical leaders, data scientists, and software engineers, to identify data-driven opportunities for enhancing clinical processes and patient care.
    + Utilize cloud-based technologies, such as Google Cloud Platform (GCP), for scalable data processing and analysis.
    + Develop easily consumable dashboards from complex clinical and operational data to provide visualizations, derived from best practices and data consumption theory, that drive healthcare and business performance.
    + Implement best practices for data management, including data quality assessment, data validation, and data governance.
    + Utilize Python programming and associated libraries such as PyTorch, Keras, Pandas, and NumPy to create, train, test, and implement meaningful data science models (a minimal training-loop sketch follows this listing).
    + Lead the analysis of operational data to identify patterns, trends, and correlations relevant to healthcare outcomes.
    + Collaborate with healthcare professionals and domain experts to understand operational needs and design data-driven solutions.
    + Design and conduct experiments, interpret results, and communicate findings to both technical and non-technical stakeholders.

    Requirements:
    + Master's degree in Data Science, Data Analytics, Computer Science, or a related field.
    + Proven experience in analyzing complex healthcare data and building customer-facing dashboards and data visualizations.
    + Proficiency in Python programming and associated libraries.
    + Experience with cloud-based platforms such as Google Cloud Platform (GCP) for data storage, processing, and deployment.
    + Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment.
    + Excellent communication and presentation skills, with the ability to translate technical concepts to non-technical audiences.

    Equal Employment Opportunity
    This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state, and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment; simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
    $83k-108k yearly est. 40d ago
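
The CHS listing above asks for creating, training, and testing models in Python with libraries such as PyTorch, Pandas, and NumPy. The sketch below is a hypothetical illustration only: a tiny PyTorch classifier trained on synthetic data, with every feature, shape, and label invented rather than drawn from any clinical dataset.

```python
# Hypothetical sketch: a minimal PyTorch classifier on synthetic data.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(512, 6)                    # 6 made-up operational features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).float()  # synthetic binary outcome

model = nn.Sequential(nn.Linear(6, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(200):                       # short training loop
    optimizer.zero_grad()
    loss = loss_fn(model(X).squeeze(1), y)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    accuracy = ((model(X).squeeze(1) > 0) == y.bool()).float().mean()
print(f"training accuracy: {accuracy:.2f}")
```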
  • Senior Marketing Data & Activation Engineer (ESP Migration Enablement)

    Corps Team (4.0 company rating)

    Data engineer job in Forest Hills, TN

    Our client, a retail chain of home improvement and agriculture stores, is seeking a Senior Marketing Data & Activation Engineer (ESP Migration Enablement) for a 6+ month contract position in Brentwood, TN. This role is hybrid, onsite 4 days a week.

    Must Haves:
    Core Experience
    5+ years of experience in marketing data engineering or ESP data roles
    Strong SQL skills (joins, deduplication, window functions) a must; other programming languages expected (Python, Apache Spark, HTML, etc.); a minimal deduplication sketch follows this listing
    Hands-on Adobe Campaign Classic experience: custom tables, workflow-based data processing
    Hands-on Zeta experience: audience and event modeling, identity resolution concepts, real-time vs. batch data ingestion

    Integration & Collaboration
    Experience working alongside IT teams on APIs and data feeds
    Understanding of ESP data dependencies and activation impacts
    Ability to translate and resolve technical data issues
    Strong understanding of web and app event data concepts
    Working knowledge of APIs and JSON payloads (conceptual, not development)
    Experience working alongside data science and analytics teams to activate AI/ML-based campaigns
    Experience with event data from clickstream, apps, APIs, etc.
    Ability to build and automate tracking and health reports

    What This Role Is Not Responsible For
    Front-end or back-end web application development
    Managing infrastructure, cloud services, or SLAs
    Owning source systems (CRM, CDP, OMS, POS, etc.)
    Building machine learning models or doing data science work

    Preferred Qualifications:
    Familiarity with Movable Ink data feeds and ESP integrations
    Experience supporting experimentation or incrementality testing
    Strong documentation and data validation practices
    Retail or eCommerce marketing experience
    Reporting or dashboarding experience
    Experience collaborating with data science and analytics teams to productionize AI/ML models and support triggered communications

    Day to Day:
    Seeking a Senior Marketing Data & Activation Engineer to ensure data accuracy, audience integrity, integration stability, and measurement confidence during and after the migration from Adobe Campaign Classic (ACC) to Zeta. This role bridges Marketing and IT by focusing on activation correctness, audience parity, identity resolution, and reporting and measurement alignment, enabling scalable personalization and trusted performance reporting post-migration. These roles focus on marketing activation, automation, and data correctness. They are not intended to replace IT engineering, analytics teams, or vendor professional services.

    Key Responsibilities
    Migration Data Validation & Parity (Phase 1)
    Analyze and map ACC schemas to Zeta data structures
    Validate audience parity between ACC and Zeta during parallel runs
    Investigate and resolve audience count discrepancies
    Ensure suppression, eligibility, and preference logic behaves consistently
    Partner with IT and vendors to troubleshoot data issues affecting activation

    Activation & Identity Governance (Post-Migration)
    Monitor and maintain customer identity resolution across channels
    Ensure opt-in, opt-out, and preference data remains accurate and consistent
    Detect and address audience inflation, decay, or duplication
    Support compliance and deliverability through clean audience hygiene

    Personalization & Trigger Enablement
    Prepare and validate customer attributes and event data for personalization, triggered journeys, and lifecycle messaging
    Build and support clean event modeling and trigger firing logic
    Enable scalable use of behavioral and transactional signals

    Measurement & Reporting Confidence
    Reconcile performance data across Zeta, Adobe Analytics, and Finance/Analytics outputs
    Investigate tracking and attribution discrepancies
    Support executive-ready explanations of performance changes
    Validate event capture and downstream reporting logic

    Experimentation & Test-and-Learn Support
    Support audience split logic and control group design
    Enable clean test vs. control measurement for journeys and triggers
    Assist in defining guardrails to prevent cannibalization or over-messaging

    Pay Rate: $37.93 - $68.97/hour
    $37.9-69 hourly 14d ago
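
The Corps Team listing above calls out joins, deduplication, and window functions as must-have SQL skills. Below is a minimal sketch of a common deduplication pattern (keep the newest profile row per email) run through ROW_NUMBER() on an in-memory SQLite database; the table and column names are invented, and the real work would run against the ESP's own tables.

```python
# Hypothetical window-function deduplication: latest row per email address.
# Requires SQLite 3.25+ (bundled with any recent Python) for window functions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE profiles (email TEXT, updated_at TEXT, opt_in INTEGER);
INSERT INTO profiles VALUES
  ('a@example.com', '2024-01-01', 1),
  ('a@example.com', '2024-03-01', 0),
  ('b@example.com', '2024-02-15', 1);
""")

deduped = conn.execute("""
SELECT email, updated_at, opt_in
FROM (
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY email ORDER BY updated_at DESC) AS rn
  FROM profiles
)
WHERE rn = 1
""").fetchall()

for row in deduped:
    print(row)  # one row per email, newest preference state wins
```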
  • Bigdata / Hadoop Technical Lead

    E*Pro (3.8 company rating)

    Data engineer job in Franklin, TN

    E*Pro Consulting service offerings include contingent staff augmentation of IT professionals, permanent recruiting, and temp-to-hire. In addition, our industry expertise and knowledge within the financial services, insurance, telecom, manufacturing, technology, media and entertainment, pharmaceutical, health care, and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************

    Job Description
    Technical/Functional Skills:
    • Must have at least 1 full-scale Hadoop implementation from DEV to PROD
    • Must have experience in the production deployment process for Big Data projects
    • Must have experience in root cause analysis and troubleshooting of Hadoop applications
    • Must have significant experience in designing solutions using Cloudera Hadoop
    • Must have significant experience with Java MapReduce, Pig, Hive, Sqoop, and Oozie
    • Must have significant experience with Unix shell scripts
    • Exposure to the Healthcare Provider domain

    Roles & Responsibilities:
    • Design solutions. Provide technical expertise in researching, designing, implementing, and maintaining business application solutions.
    • Mentor, guide, and train team members on Big Data
    • Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop
    • Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks
    • Perform complex coding tasks using Java MapReduce, Pig, Hive, and Sqoop
    • Review and certify code written by team members
    • Ensure that established change management and other procedures are adhered to, and help develop needed standards, procedures, and practices
    • Performance tuning with large data sets

    Generic Managerial Skills:
    • Ability to lead the team and plan, track, and manage work performed by team members
    • Ability to work independently and communicate across multiple levels (product owners, executive sponsors, team members)

    Additional Information
    All your information will be kept confidential according to EEO guidelines.
    $91k-120k yearly est. 1d ago
  • Big Data / Hadoop Technical Lead

    Tectammina

    Data engineer job in Franklin, TN

    First IT Solutions provides a professional, cost-effective recruitment solution. We take the time to understand your needs in great detail. With our dedicated and experienced recruitment consultants, our market knowledge, and our emphasis on quality and satisfaction, we pride ourselves on offering the best solution the first time. Our consultants have substantial experience gained over many years placing talented individuals in contract and permanent positions within the technology industry. You can be sure that we understand the process well from your side of the desk. We started First IT to provide top quality service, something that clients complained was lacking at other recruiting and placement firms. At First IT, we strive continually to provide excellent service at all times.

    Job Description
    Technical/Functional Skills:
    Minimum experience required: 8 years
    Must have experience as a tech lead for Big Data projects
    Must have significant experience with architecting and designing solutions using Cloudera Hadoop
    Must have significant experience with Python, Java MapReduce, Pig, Hive, HBase, Oozie, and Sqoop
    Must have significant experience with Unix shell scripts
    Exposure to the Healthcare Provider domain

    Qualifications
    Architect and design solutions. Provide technical expertise in researching, designing, implementing, and maintaining business application solutions.
    Estimate size, effort, and complexity of solutions
    Plan, track, and report project status
    Mentor, guide, and train team members on Big Data
    Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop
    Prepare detailed specifications, diagrams, and other programming structures from which programs are written
    Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks
    Perform complex coding tasks using Python, Java MapReduce, Pig, Hive, HBase, Oozie, and Sqoop
    Review and certify code written by team members
    Ensure that established change management and other procedures are adhered to, and help develop needed standards, procedures, and practices
    Performance tuning with large data sets

    Additional Information
    Duration: Full Time
    Eligibility: GC & US Citizens Only
    Share profiles to ****************************
    Contact: ************
    Keep the subject line with Job Title and Location
    $86k-122k yearly est. Easy Apply 1d ago
  • Data Onboarding Consultant

    Corpay

    Data engineer job in Brentwood, TN

    What We Need
    Corpay is currently looking to hire a Data Onboarding Consultant within our Implementations division. This position falls under our Corporate Payments line of business based out of our Brentwood, TN location. In this role, you will manage critical data activities for Corpay's clients to ensure successful implementation and ongoing client success. This position combines client-facing and internal technical responsibilities. The ideal candidate enjoys working with clients to help them navigate complex data landscapes, is analytical in nature and able to understand non-uniform data sets from various sources, can drive project success by creating deadlines and holding both internal and external parties accountable, and can balance competing priorities. The ideal candidate is a problem solver, a great communicator, and, most importantly, takes ownership of their projects and drives them to success. You will report directly to the Manager of Technical Implementations.

    How We Work
    As a Data Onboarding Consultant you will be expected to work out of our Brentwood, TN office location. Corpay will set you up for success by providing:
    Company-issued equipment
    Assigned workspace in our Brentwood office
    Formal, hands-on training

    Role Responsibilities
    This is a customer-facing role that will serve as the primary point of client contact for all data services from the sales process through implementation
    Work with clients and internal partners to obtain and validate data to be used in data services
    Analyze client data and present findings to improve the results of the data being ingested
    Utilize data cleaning and mapping tools to ingest data into the application (a minimal mapping-and-validation sketch follows this listing)
    Coordinate the scoping, prioritization, delivery, and, where applicable, ongoing maintenance of client data services (one-time data imports, ongoing data integrations)
    First line of defense for triaging issues related to data imports and data integrations
    Work with clients and internal stakeholders to maintain a prioritized queue of data services deliverables
    Contribute to the overall strategy for Implementations

    Qualifications & Skills
    2-5 years' experience in managing or working with data (training/education counts)
    Comfortable communicating complex information in simple terms
    Experience managing projects
    Experience working with large, non-uniform data sets
    Experience working directly with clients and prospects to assess needs and define technical solutions
    Experience with data mapping and BI tools
    While this is not an engineering role, familiarity with engineering tools and practices will be greatly beneficial

    Benefits & Perks
    Medical, dental & vision benefits available the 1st month after hire
    Automatic enrollment into our 401k plan (subject to eligibility requirements)
    Virtual fitness classes offered company-wide
    Robust PTO offerings including major holidays, vacation, sick, personal, & volunteer time
    Employee discounts with major providers (i.e. wireless, gym, car rental, etc.)
    Philanthropic support with both local and national organizations
    Fun culture with company-wide contests and prizes

    Equal Opportunity/Affirmative Action Employer
    Corpay is an Equal Opportunity Employer. Corpay provides equal employment opportunities to all qualified applicants without regard to race, color, gender (including pregnancy), religion, national origin, ancestry, disability, age, sexual orientation, gender identity or expression, marital status, language, genetic information, and/or military status or any other group status protected by federal or local law. If you require reasonable accommodation for the application and/or interview process, please notify a representative of the Human Resources Department. For more information about our commitment to equal employment opportunity and pay transparency, please click the following links: EEOC and Pay Transparency.
    $64k-87k yearly est. 30d ago
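
The Corpay listing above centers on mapping non-uniform client files into a consistent schema and validating them before ingestion. A hypothetical pandas sketch of that kind of cleanup, with invented column names, mapping rules, and checks:

```python
# Hypothetical client-file normalization and validation sketch (pandas 2.x).
import pandas as pd

# A client file whose headers and formats don't match the internal schema.
client_df = pd.DataFrame({
    "Vendor Name ": ["Acme Co", "Globex", None],
    "invoice_amt": ["1,200.50", "980", "45.00"],
    "PaidDate": ["2024-05-01", "05/03/2024", "2024-05-09"],
})

column_map = {"Vendor Name ": "vendor_name",
              "invoice_amt": "amount_usd",
              "PaidDate": "paid_date"}

clean = client_df.rename(columns=column_map)
clean["vendor_name"] = clean["vendor_name"].str.strip()
clean["amount_usd"] = (clean["amount_usd"]
                       .str.replace(",", "", regex=False)
                       .astype(float))
clean["paid_date"] = pd.to_datetime(clean["paid_date"], format="mixed")

# Simple pre-ingestion validation report.
issues = clean[clean["vendor_name"].isna() | (clean["amount_usd"] <= 0)]
print(f"{len(issues)} row(s) need follow-up with the client")
```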
  • Data Engineer - Archimedes

    Navitus (4.7 company rating)

    Data engineer job in Brentwood, TN

    Company
    Archimedes

    About Us
    Archimedes - Transforming the Specialty Drug Benefit - Archimedes is the industry leader in specialty drug management solutions. Founded with the goal of transforming the PBM industry to provide the necessary ingredients for the sustainability of the prescription drug benefit - alignment, value, and transparency - Archimedes achieves superior results for clients by eliminating tightly held PBM conflicts of interest, including drug spread, rebate retention, and pharmacy ownership, and delivering the most rigorous clinical management at the lowest net cost.

    Current associates must use the SSO login option at ************************************ to be considered for internal opportunities.

    Pay Range: USD $0.00 - USD $0.00/Yr.
    STAR Bonus % (At Risk Maximum): 10.00 - Manager, Clinical Mgr, Pharm Supvr, CAE, Sr CAE I
    Work Schedule Description (e.g. M-F 8am to 5pm): Core Business Hours

    Overview
    The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support enterprise analytics, reporting, and operational data flows. This role plays a critical part in enabling data-driven decision-making across Centene and SPM by ensuring the availability, integrity, and performance of data systems. The Data Engineer collaborates with data scientists, analysts, software developers, and business stakeholders to deliver robust ETL solutions, optimize data storage and retrieval, and implement secure, compliant data architectures in cloud and hybrid environments. Operating within a healthcare-focused, compliance-heavy landscape, the Data Engineer ensures that data platforms align with regulatory standards such as HIPAA and SOC 2, while embedding automation and CI/CD practices into daily workflows. The role supports both AWS and Azure environments, leveraging cloud-native services and modern tooling to streamline data ingestion, transformation, and delivery.

    Responsibilities
    Design, develop, and maintain ETL pipelines for structured and unstructured data across cloud and on-prem environments.
    Build and optimize data models, schemas, and storage solutions in SQL Server, PostgreSQL, and cloud-native databases.
    Implement CI/CD workflows for data pipeline deployment and monitoring using tools such as GitHub Actions, Azure DevOps, or Jenkins.
    Develop and maintain data integrations using AWS Glue, Azure Data Factory, Lambda, EventBridge, and other cloud-native services.
    Ensure data quality, lineage, and governance through automated validation, logging, and monitoring frameworks (a minimal validation sketch follows this listing).
    Collaborate with cross-functional teams to gather requirements, design scalable solutions, and support analytics and reporting needs.
    Monitor and troubleshoot data pipeline performance, latency, and failures; implement proactive alerting and remediation strategies.
    Support data security and compliance by enforcing access controls, encryption standards, and audit logging aligned with HIPAA and SOC 2.
    Maintain documentation for data flows, architecture diagrams, and operational procedures.
    Participate in sprint planning, code reviews, and agile ceremonies to support iterative development and continuous improvement.
    Evaluate and integrate new data tools, frameworks, and cloud services to enhance platform capabilities.
    Partner with DevOps and Security teams to ensure infrastructure-as-code and secure deployment practices are followed.
    Participate in, adhere to, and support compliance, people and culture, and learning programs.
    Perform other duties as assigned.

    Qualifications
    Essential Background Requirements:
    Education: Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field required. Master's degree preferred.
    Certifications/Licenses: AWS Certified Data Analytics or Solutions Architect required. Microsoft Certified: Azure Data Engineer Associate required. Certified Data Management Professional (CDMP) required.
    Experience: 5+ years of experience in data engineering, ETL development, or cloud data architecture required. Proven experience with SQL, ETL tools, and CI/CD pipelines required. Hands-on experience with AWS and Azure data services and infrastructure required. Familiarity with data governance, compliance frameworks (HIPAA, SOC 2), and secure data handling practices required. Familiarity with CI/CD pipelines, automated testing, and version control systems required.

    Skills & Technologies:
    Languages & Tools: SQL, Python, Bash, Git, Terraform, PowerShell
    ETL & Orchestration: AWS Glue, Azure Data Factory, Apache Airflow
    CI/CD: GitHub Actions, Azure DevOps, Jenkins
    Cloud Platforms: AWS (S3, Lambda, RDS, Redshift), Azure (Blob Storage, Synapse, Functions)
    Monitoring & Logging: CloudWatch, Azure Monitor, ELK Stack
    Data Governance: Data cataloging, lineage tracking, encryption, and access control

    Location: 5250 Virginia Way Ste 300, Brentwood, TN 37027, US
    $74k-103k yearly est. Auto-Apply 60d+ ago
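
The Archimedes listing above emphasizes automated validation as part of data quality and governance for ETL pipelines. The sketch below shows one hedged way such a check could look in Python with pandas; the claim-extract schema and rules are assumptions, and in practice the check might run inside an AWS Glue job or an Azure Data Factory/Databricks step.

```python
# Hypothetical data-quality gate for an ETL step; schema and rules invented.
import pandas as pd

REQUIRED_COLUMNS = {"claim_id", "member_id", "fill_date", "net_cost"}

def validate_extract(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures for a claims extract."""
    failures = []
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")
    if "claim_id" in df.columns and df["claim_id"].duplicated().any():
        failures.append("duplicate claim_id values found")
    if "net_cost" in df.columns and (df["net_cost"] < 0).any():
        failures.append("negative net_cost values found")
    return failures

extract = pd.DataFrame({
    "claim_id": [1, 2, 2],
    "member_id": ["a", "b", "b"],
    "fill_date": ["2024-06-01"] * 3,
    "net_cost": [12.5, -3.0, 40.0],
})
problems = validate_extract(extract)
print(problems or "extract passed all checks")
```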
  • Data Engineer

    Two95 International (3.9 company rating)

    Data engineer job in Franklin, TN

    Title: Data Engineer
    Type: 6 months (contract to hire)
    Rate: open

    Requirements
    5+ years of developing software using an object-oriented or functional language
    5+ years of SQL
    3+ years working with open source Big Data technology stacks (Apache NiFi, Spark, Kafka, HBase, Hadoop/HDFS, Hive, Drill, Pig, etc.) or commercial open source Big Data distributions (Hortonworks, Cloudera, etc.); a minimal Kafka consumer sketch follows this listing
    3+ years with document databases (e.g. MongoDB, Accumulo, etc.)
    3+ years of experience using Agile development processes (e.g. developing and estimating user stories, sprint planning, sprint retrospectives, etc.)
    2+ years with a distributed version control system (e.g. Git)
    3+ years of experience in cloud-based development and delivery
    Familiarity with distributed computing patterns, techniques, and technologies (e.g. ESB)
    Familiarity with continuous delivery technologies (e.g. Puppet, Chef, Ansible, Docker, Vagrant, etc.)
    Familiarity with build automation and continuous integration tools (e.g. Maven, Jenkins, Bamboo, etc.)
    Familiarity with Agile process management tools (e.g. Atlassian Jira)
    Familiarity with test automation (Selenium, SoapUI, etc.)
    Good software development and object-oriented programming skills
    Strong analytical skills and the ability to work with end users to transform requests into robust solutions
    Excellent oral and written communication skills
    Initiative and self-motivation to work independently on projects

    Benefits
    Note: If interested, please send your updated resume and include your salary requirement along with your contact details and a suitable time when we can reach you. If you know of anyone in your sphere of contacts who would be a perfect match for this job, we would appreciate it if you could forward this posting to them with a copy to us. We look forward to hearing from you at the earliest!
    $75k-103k yearly est. Auto-Apply 60d+ ago
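
The Two95 listing above includes Kafka among its Big Data requirements. As a hedged illustration, the sketch below reads JSON events from a topic with the kafka-python client; the broker address, topic, and event fields are invented, and a real pipeline might use Spark Streaming or NiFi instead.

```python
# Hypothetical Kafka consumer sketch; requires a reachable broker to run.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                               # assumed topic name
    bootstrap_servers="localhost:9092",     # assumed broker address
    group_id="demo-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real job would enrich the event and write it to HDFS, Hive, or HBase.
    print(event.get("order_id"), event.get("amount"))
```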
  • Data Platform Engineer

    Monogram Health (3.7 company rating)

    Data engineer job in Brentwood, TN

    Data Platform Engineer
    The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in data engineering, database modeling, and modern cloud data platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.

    Responsibilities
    Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
    Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
    Build and orchestrate Databricks notebooks and jobs using PySpark, Spark SQL, or Scala Spark (a minimal PySpark sketch follows this listing).
    Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
    Implement Azure Functions, Azure Web Apps, and Application Insights to support microservices and monitor distributed systems.
    Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
    Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
    Ensure data quality, governance, and security across the data lifecycle.
    Collaborate with product managers by estimating technical tasks and deliverables.
    Uphold the mission and values of Monogram Health in all aspects of your role and activities.

    Position Requirements
    A bachelor's degree in computer science, data science, software engineering, or a related field.
    Minimum of five (5) years designing and hands-on developing cloud-based analytics solutions, including a minimum of three (3) years' hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
    Expert-level knowledge of Python or other scripting languages required.
    Proficiency in SQL and other data query languages.
    Understanding of data modeling and schema design principles.
    Ability to work with large datasets and perform data analysis.
    Experience designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
    Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
    Thorough understanding of Azure cloud infrastructure offerings.
    Demonstrated problem-solving and troubleshooting skills.
    Team player with demonstrated written and verbal communication skills.

    Benefits
    Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
    Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
    Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
    Wellness & Growth - Work-life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts

    Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease but all of the chronic conditions that are present, such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders. Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines, including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care, to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counseling on a patient's healthcare options; and assist with daily needs such as access to food, healthy eating, transportation, and financial assistance. Monogram Health is available 24 hours a day, 7 days a week, and on holidays to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
    $75k-103k yearly est. 60d+ ago
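
The Monogram Health listing above involves building Databricks notebooks and jobs with PySpark and Spark SQL. Below is a minimal, hypothetical sketch of that style of work, runnable locally with pyspark installed; the table, columns, and values are invented rather than taken from any Monogram system.

```python
# Hypothetical PySpark/Spark SQL aggregation sketch.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-elt-sketch").getOrCreate()

claims = spark.createDataFrame(
    [("m1", "CKD", 120.0), ("m1", "CHF", 80.0), ("m2", "CKD", 200.0)],
    ["member_id", "condition", "cost"],
)

# Aggregate per-member cost and expose it to Spark SQL consumers.
per_member = (claims.groupBy("member_id")
              .agg(F.sum("cost").alias("total_cost"),
                   F.countDistinct("condition").alias("condition_count")))
per_member.createOrReplaceTempView("member_cost_summary")

spark.sql("""
    SELECT member_id, total_cost
    FROM member_cost_summary
    WHERE condition_count > 1
""").show()
```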
  • Data and AI Lead

    Infosys Ltd. (4.4 company rating)

    Data engineer job in Brentwood, TN

    Infosys is seeking a Data and AI Lead. You will spearhead the development and deployment of cutting-edge AI solutions, designing and building scalable systems that leverage OpenAI APIs, Azure AI Search, vector databases, and Retrieval Augmented Generation (RAG) models to solve complex business challenges and drive exceptional user experiences.

    In this role, you will:
    * Architect and implement end-to-end AI solutions combining OpenAI APIs, Azure AI Search, vector databases (utilizing Azure AI Search capabilities), and RAG models.
    * Design and optimize RAG pipelines to enhance the accuracy and relevance of AI-generated responses (a minimal RAG sketch follows this listing).
    * Manage vector databases within Azure AI Search for efficient semantic search and information retrieval.
    * Develop and maintain robust APIs for seamless integration of AI services across applications and systems.
    * Evaluate AI model performance and implement optimization strategies for continuous improvement.
    * Integrate Azure AI Search with OpenAI services and vector databases to create powerful search and knowledge retrieval systems.

    Required Qualifications:
    * Candidate must be located within commuting distance of Overland Park, KS, Raleigh, NC, Brentwood, TN, Hartford, CT, Phoenix, AZ, Tempe, AZ, Dallas, TX, Richardson, TX, Indianapolis, IN, or Atlanta, GA, or be willing to relocate to the area.
    * Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
    * Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
    * At least 4 years of Information Technology experience.
    * Experience in utilizing OpenAI APIs and services.
    * Experience integrating Azure AI Search with other AI services.
    * Hands-on experience developing and deploying RAG models.
    * Strong programming skills in both Python and Java.
    * Experience designing and building RESTful APIs.

    Preferred Qualifications:
    * Familiarity with cloud platforms, specifically Azure.
    * Good understanding of Agile software development frameworks.
    * Strong communication and analytical skills.
    * Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams.
    * Experience with, and desire to work in, a global delivery environment.

    The job entails sitting as well as working at a computer for extended periods of time. You should be able to communicate by telephone, email, or face to face. Travel may be required as per the job requirements.

    Along with competitive pay, as a full-time Infosys employee you are also eligible for the following benefits:
    * Medical/Dental/Vision/Life Insurance
    * Long-term/Short-term Disability
    * Health and Dependent Care Reimbursement Accounts
    * Insurance (Accident, Critical Illness, Hospital Indemnity, Legal)
    * 401(k) plan and contributions dependent on salary level
    * Paid holidays plus Paid Time Off
    $67k-79k yearly est. 26d ago
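
The Infosys listing above describes RAG pipelines that combine OpenAI APIs with vector search. The sketch below is a hedged, minimal illustration using the OpenAI Python SDK and a tiny in-memory numpy "index" standing in for Azure AI Search; the model names, documents, and question are assumptions, and a production pipeline would query an Azure AI Search vector index instead.

```python
# Hypothetical RAG sketch: embed, retrieve by cosine similarity, then answer.
# Assumes the openai package (v1+) and OPENAI_API_KEY are available.
import numpy as np
from openai import OpenAI

client = OpenAI()

documents = [
    "Store 42 resets its seasonal planogram every February.",
    "Returns over $75 require manager approval.",
]

def embed(texts):
    response = client.embeddings.create(model="text-embedding-3-small",
                                        input=texts)
    return np.array([item.embedding for item in response.data])

doc_vectors = embed(documents)
question = "When do stores reset their seasonal planograms?"
q_vec = embed([question])[0]

# Cosine similarity against the in-memory "index", then ground the answer.
scores = doc_vectors @ q_vec / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec))
context = documents[int(np.argmax(scores))]

answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```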
  • Software Engineer II

    Coats (4.3 company rating)

    Data engineer job in La Vergne, TN

    Job Description
    The Software Engineer II is responsible for the design, development, and maintenance of software applications in support of organizational objectives. This role contributes throughout the software development lifecycle, including requirements analysis, system design, coding, testing, debugging, and ongoing support. The position applies established software engineering practices while developing proficiency in more complex systems and problem domains. This position collaborates with cross-functional stakeholders to ensure solutions meet quality, performance, and reliability standards.

    What You'll Do:
    Build, test, and maintain application software
    Develop responsive, intuitive user interfaces
    Create test plans for features and products
    Develop, debug, and test embedded firmware
    Participate in code reviews
    Write deployment and support documentation

    Qualifications:
    Bachelor's in Electrical/Computer Engineering, Computer Science, or equivalent
    2 to 6 years of experience in a work environment
    C# / .NET / Visual Studio experience
    Object-oriented programming and software design knowledge
    Git and bug-tracking experience (Atlassian a plus)
    Embedded systems firmware development experience
    Able to work independently and cross-functionally

    Preferred:
    Android or Linux experience
    C / C++ proficiency
    .NET MAUI/Xamarin Forms and MVVM design
    Customer-facing UI/UX experience
    Azure or AWS development experience
    Manufacturing environment experience

    The Coats Company is an equal opportunity employer that evaluates qualified applicants without regard to race, color, national origin, religion, ancestry, sex (including pregnancy, childbirth and related medical conditions), age, marital status, disability, veteran status, citizenship status, sexual orientation, gender identity or expression, and other characteristics protected by law.
    $72k-90k yearly est. 14d ago
  • Senior Software Engineer

    Ingram Content Group (4.6 company rating)

    Data engineer job in La Vergne, TN

    Job Description
    Ingram Content Group (ICG) is currently seeking a Senior Software Engineer to join our team in LaVergne, TN (Greater Nashville area). This person delivers enterprise-grade software solutions with high customer impact and leads architecture and development activities with a specialization in at least one major enterprise IT application, one major database platform, and one major operating system. Performs all aspects of the development life cycle. Acts as the senior technical programmer for the assigned enterprise system and/or application of responsibility. Delivers results through independent contributions and through mentoring of junior engineers. This position will be expected to work from the Ingram headquarters 4 days per week.

    Want to help explore and build new ways to deliver content to the world? At Ingram, our Technology team is blazing a trail by providing content distribution services to thousands of publishers with key initiatives around business intelligence, machine learning, continuous integration, and omnichannel. We support diverse people and technology that highlights innovation through SaaS platforms, metadata, cloud, and containerization. Our teams are agile and emphasize authenticity, creativity, and transparency upon a fact-based foundation. The world is reading, and it is our goal to connect as many people as possible to the content they want in the simplest ways. If you are an IT professional who strives to deliver results through collaborative partnerships, understanding what drives business, and enjoys working in a connected culture, we can't wait to meet you!

    The ideal candidate will have the following minimum qualifications:
    Bachelor's degree in computer science or a related field, or directly related year-for-year experience
    6 years' experience in designing, developing, implementing, and supporting enterprise-level IT solutions

    We have a preference for:
    Demonstrated expert experience with appropriate development tools: .NET stack (C#, WinForms, Web API with ASP.NET Core and Entity Framework Core), Kafka, Kubernetes, JavaScript/web front-end technologies, MySQL, SQL Server, Visual Studio, Docker, REST, and JSON technologies
    Knowledge of external technologies within the domain of expertise
    Knowledge of all phases of application systems analysis and programming
    In-depth understanding of the business or function for which the application is designed
    Demonstrated expert experience integrating with MySQL and SQL Server databases
    Knowledge of development source code management using GitHub and Jira
    Knowledge of object-oriented design

    The Senior Software Engineer's key responsibilities are:
    Serves as designer/architect/engineer for at least one major enterprise IT application.
    Leads areas of integration with at least one major operating system (e.g. Unix/Linux/Windows).
    Develops new design patterns, standards, etc. and works with other developers on implementation.
    Performs data modeling and architecture development.
    Reviews and evaluates application workflow and user experience.
    Acts as technical expert and provides application development oversight and involvement for third-party integrations (e.g. Documentum, Adobe, etc.) and database (e.g. MySQL, Oracle, SQL Server) core components.
    Leads and executes testing to ensure the program meets the specified requirements.
    Drives solutions and guides the work of others to provide full application development life cycle support, including specifications, prototypes, development, quality assurance, and deployment.
    Champions innovation and expands sphere of influence through mentoring and guidance.
    Works with the user/customer community, business analysts, and architects to capture system requirements and design.
    Leverages a technical network to collaborate across the organization.

    Hiring Salary Range: $108,000 - $138,000. This range represents the anticipated low and high end of the salary for this position. It will be determined by factors including but not limited to: the applicant's education, experience, knowledge, skills, and abilities, geographic location, as well as internal equity and alignment with market data.

    Additional Information
    Perks/Benefits:
    A highly competitive compensation package with generous benefits beginning the first day of employment for Medical/Prescription Drug plans, HSA, Vision, Dental, and Health Care FSA
    15 vacation days & 12 sick days accrued annually and 3 personal days
    401K match, Life and AD&D, Employee Assistance programs, Group Legal, & more
    Wellness program with access to an onsite gym and basketball court for associates
    Encouraged continued education with our tuition reimbursement program
    Financial and in-kind opportunities to engage with non-profits in your community
    Company match program for United Way donations
    Volunteer opportunities and in-kind drives for non-profits throughout the year
    Take breaks or brainstorm in our game room with ping pong & foosball
    Casual dress code & flexible schedules (per team)

    The world is reading, and Ingram Content Group ("Ingram") connects people with content in all forms. Providing comprehensive services for publishers, retailers, libraries, and educators, Ingram makes these services seamless and accessible through technology, innovation, and creativity. With an expansive global network of offices and facilities, Ingram's services include digital and physical book distribution, print-on-demand, and digital learning. Ingram Content Group is a part of Ingram Industries Inc. and includes Ingram Book Group LLC, Ingram Publisher Services LLC, Lightning Source LLC, Ingram Library Services LLC, Tennessee Book Company LLC, Ingram Content Group UK Ltd. and Ingram Content Group Australia Pty Ltd.

    All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, work-related mental or physical disability, veteran status, sexual orientation, gender identity, or genetic information. EOE-Race/Gender/Veterans/Disabled. We participate in E-Verify. EEO Poster in English. EEO Poster in Spanish.
    $91k-110k yearly est. 28d ago
  • Software Engineer, macOS Core Product - Murfreesboro, USA

    Speechify

    Data engineer job in Murfreesboro, TN

    At Speechify, our mission is to ensure reading is never a barrier to learning. Over 50 million people use Speechify's text-to-speech products, including apps on iOS, Android, macOS, Chrome, and the web, to listen to PDFs, books, docs, and web content faster, smarter, and more joyfully than ever before. Our product has earned recognition from Google (Chrome Extension of the Year) and Apple (App of the Day and a 2025 Inclusivity Design Award) for its impact and accessibility. We're a fully remote, distributed team of engineers, designers, researchers, and product builders from world-class companies like Amazon, Microsoft, Google, Stripe, and more. We move fast, ship often, and love solving real user problems.

    Role Overview
    As a Software Engineer on the macOS team, you'll help build and scale Speechify's core desktop experience for millions of users. You'll own significant parts of our macOS app architecture, ship production-ready code, and collaborate closely with product, design, and engineering teams across the company. This is a key role for someone who thrives in a fast-paced startup environment, enjoys making high-impact product decisions, loves delightful user experiences, and has a passion for accessibility and performance.

    What You'll Do
    Lead key engineering and product decisions for the macOS app.
    Write, test, and ship production-quality code that scales to millions of users.
    Maintain and evolve complex app architecture with a focus on performance and stability.
    Work within a cross-functional team, partnering with designers and PMs to shape features from concept to launch.
    Participate in product planning and roadmap discussions.
    Drive continuous improvement in code quality, CI/CD processes, and development workflows.

    You should have:
    Demonstrated experience shipping macOS (or related desktop) applications used by many customers.
    Strong engineering instincts with a deep focus on user experience.
    A strategic mindset for building great products, not just writing code.
    Ability to work quickly, decide what to build now vs. later, and iterate fast.
    Experience working in remote, distributed teams.

    Technical requirements:
    Swift / SwiftUI (macOS) proficiency.
    Solid understanding of AppKit, macOS frameworks, and desktop-specific UI paradigms.
    Strong understanding of concurrency and asynchronous execution models.
    Familiarity with Bitrise and CI/CD workflows (e.g., Xcode Cloud, GitHub Actions).

    What We Offer:
    Impact & Ownership: Build and influence a product used by millions globally.
    Remote First: Flexible, asynchronous work culture.
    Growth & Leadership: Flat org - leadership is earned by impact, not title.
    Collaborative Environment: Work with smart, passionate engineers and designers.
    Competitive Compensation: Market-aligned salary, bonus, and equity.

    The United States-based salary range for this role is 140,000-200,000 USD/year + bonus + stock, depending on experience.

    Why Join Speechify?
    At Speechify, we ship fast, build for real users, and care deeply about quality and accessibility. You'll work on products that change lives, and your contributions will shape both the product and the company. If this sounds like your kind of challenge, we'd love to hear from you. Apply with your resume and links to your portfolio or GitHub! Think you're a good fit for this job? Tell us more about yourself and why you're interested in the role when you apply. And don't forget to include links to your portfolio and LinkedIn.

    Not looking but know someone who would make a great fit? Refer them! Speechify is committed to a diverse and inclusive workplace. Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
    $61k-82k yearly est. 12d ago
  • Data Scientist, Merchandising Analytics

    Tractor Supply Company 4.2company rating

    Data engineer job in Brentwood, TN

    The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using Machine Learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing. The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role work will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field. **Essential Duties and Responsibilities (Min 5%)** + Work closely with key business partners to fully explore and frame a business question including objectives, goals, KPIs, decisions the analysis will support and required timing for deliverables. + Extracting available and relevant data from internal and external data sources to perform data science solution development. + Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes. + Contribute and assist the team with best practices in data governance, data engineering, and data architecture. + Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems. + Maintain a broad exposure to the wider ecosystem of AI / Machine Learning and ensure our team is pushing toward the most optimal solutions. + Manage multiple projects simultaneously with limited oversight including development of new technologies and maintenance of existing framework. + Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization. + Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans. + Use proven, predictive science to right-size every store with localized plans that balance individual store space allocations with your top-down and bottom-up strategies. + Be a 'Go-To' person for any data analytical needs ranging from data extraction/ manipulations, long term trend analysis, statistical analysis, and modeling techniques. Perform code-review and debug with the team and assist during implementation where necessary. **Required Qualifications** _Experience_ : 3-5 years proven experience as a Data Scientist, preferably in the retail or e-commerce industry. Prefer 2+ years of experience in predictive modeling utilizing CRM or transaction information. _Education_ : Bachelor's Degree in Mathematics, Statistics, or Econometrics. Master's Degree prefered. Any combination of education and experiencce will be considered. _Professional Certifications_ : None **Preferred knowledge, skills or abilities** + Intermediate-advanced in one or more programming language (Python, PySpark, R). 
+ Deep expertise in writing and debugging complex SQL queries. + Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies. + Proven advanced modeling experience in leading data-driven projects from definition to execution, driving and influencing project roadmaps. + Experience using Azure, AWS, or another cloud compute platform a plus. + Familiarity with visualization tools such as Power BI and Tableau. + Must possess high degree of aptitude in communicating both verbally and written complex analysis results to Senior & Executive Leadership. + Knowledge of A/ B testing methods; capable of designing a controlled test design, running the test and providing measurement post-hoc. + Proficiency with managing data repository and version control systems like Git. + Speak, read and write effectively in the English language. **Working Conditions** + Hybrid / Flexible working conditions **Physical Requirements** + Sitting + Standing (not walking) + Walking + Kneeling/Stooping/Bending + Lifting up to 10 pounds **Disclaimer** _This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/ her supervisor._ **Company Info** At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future. Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service. Please visitthis link (********************************************************************** for more specific information about the benefits and leave policies applicable to the position you're applying for. **ALREADY A TEAM MEMBER?** You must apply or refer a friend through our internal portal Click here (************************************************************************** **CONNECTION** Our Mission and Values are more than just words on the wall - they're the one constant in an ever-changing environment and the bedrock on which we build our culture. They're the core of who we are and the foundation of every decision we make. It's not just what we do that sets us apart, but how we do it. Learn More **EMPOWERMENT** We believe in managing your time for business and personal success, which is why we empower our Team Members to lead balanced lives through our benefits and total rewards offerings. For full-time and eligible part-time TSC and Petsense Team Members. We care about what you care about! 
Learn More

**OPPORTUNITY**

A lot of care goes into providing legendary service at Tractor Supply Company, which is why our Team Members are our top priority. Want a career with a clear path for growth? Your Opportunity is Out Here at Tractor Supply and Petsense. Learn More

Join Our Talent Community

**Nearest Major Market:** Nashville
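The duties above call for sizing A/B tests before launch. As a minimal, illustrative sketch of that step (the two-proportion setup, baseline rate, and minimum detectable lift below are invented, not drawn from the listing), the required sample size per variant could be estimated in Python with statsmodels:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Hypothetical inputs: 4% baseline conversion and a 0.5-point minimum detectable lift
baseline = 0.04
minimum_detectable_lift = 0.005

# Cohen's h effect size for a two-proportion comparison
effect_size = proportion_effectsize(baseline, baseline + minimum_detectable_lift)

# Solve for the per-variant sample size at 5% significance and 80% power
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, ratio=1.0
)
print(f"Required sample size per variant: {n_per_variant:,.0f}")
```

Swapping in the real baseline rate and minimum detectable effect for a given merchandising metric yields the traffic the test actually needs; the analysis plan would be fixed alongside these inputs before the test runs.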
    $81k-105k yearly est. 60d+ ago
  • Bigdata / Hadoop Technical Lead

    E Pro Consulting 3.8company rating

    Data engineer job in Franklin, TN

E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************

Job Description

Technical/Functional Skills:

• Must have at least 1 full-scale Hadoop implementation from DEV to PROD
• Must have experience in the production deployment process for Big Data projects
• Must have experience in root cause analysis and troubleshooting of Hadoop applications
• Must have significant experience in designing solutions using Cloudera Hadoop
• Must have significant experience with Java MapReduce, PIG, Hive, Sqoop and Oozie
• Must have significant experience with Unix shell scripts
• Exposure to the Healthcare Provider domain

Roles & Responsibilities:

• Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
• Mentor, guide and train team members on Big Data
• Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop
• Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks
• Perform complex coding tasks using Java MapReduce, PIG, Hive and Sqoop (a brief illustrative sketch follows this listing)
• Review and certify code written by team members
• Ensure that established change management and other procedures are adhered to; help develop needed standards, procedures, and practices
• Performance tuning with large data sets

Generic Managerial Skills:

• Ability to lead the team and plan, track, and manage work performed by team members
• Ability to work independently and communicate across multiple levels (Product owners, Executive sponsors, Team members)

Additional Information

All your information will be kept confidential according to EEO guidelines.
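Since the role centers on hand-written MapReduce jobs, here is a minimal, hypothetical word-count job written for Hadoop Streaming in Python rather than Java; the file name and the invocations in the docstring are illustrative only, not anything prescribed by the listing.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming word count: acts as mapper or reducer over stdin/stdout.

Local test:   cat input.txt | python3 wordcount.py map | sort | python3 wordcount.py reduce
On a cluster: pass this script as -mapper "wordcount.py map" and -reducer "wordcount.py reduce"
              to the hadoop-streaming jar (paths/jar names are illustrative).
"""
import sys


def mapper() -> None:
    # Emit one "word<TAB>1" record per token
    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")


def reducer() -> None:
    # Input arrives sorted by key (by `sort` locally, or by the Hadoop shuffle)
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, int(value)
    if current is not None:
        print(f"{current}\t{count}")


if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```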
    $91k-120k yearly est. 60d+ ago
  • Big Data / Hadoop Technical Lead

    Tectammina

    Data engineer job in Franklin, TN

First IT Solutions provides a professional, cost-effective recruitment solution. We take the time to understand your needs in great detail. With our dedicated and experienced Recruitment Consultants, our market knowledge, and our emphasis on quality and satisfaction, we pride ourselves on offering the best solution the first time. Our consultants have substantial experience gained over many years placing talented individuals in Contract and Permanent positions within the technology industry. You can be sure that we understand the process well from your side of the desk. We started First IT to provide top quality service, something that clients complained was lacking at other recruiting and placement firms. At First IT, we strive continually to provide excellent service at all times.

Job Description

Technical/Functional Skills:

Minimum experience required: 8 years
Must have experience as a Tech Lead for Big Data projects
Must have significant experience with architecting and designing solutions using Cloudera Hadoop
Must have significant experience with Python, Java MapReduce, PIG, Hive, HBase, Oozie and Sqoop
Must have significant experience with Unix shell scripts
Exposure to the Healthcare Provider domain

Qualifications

Architect and design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
Estimate size, effort, and complexity of solutions
Plan, track and report project status
Mentor, guide and train team members on Big Data
Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop
Prepare detailed specifications, diagrams, and other programming structures from which programs are written
Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks
Perform complex coding tasks using Python, Java MapReduce, PIG, Hive, HBase, Oozie and Sqoop
Review and certify code written by team members
Ensure that established change management and other procedures are adhered to; help develop needed standards, procedures, and practices
Performance tuning with large data sets

Additional Information

Duration: Full Time
Eligibility: GC & US Citizens Only
Share the profiles to ****************************
Contact: ************
Keep the subject line with Job Title and Location
    $86k-122k yearly est. Easy Apply 60d+ ago
  • Senior Data Engineer

    Community Health Systems 4.5company rating

    Data engineer job in Franklin, TN

**About CHS:** Community Health Systems is one of the nation's leading healthcare providers. Developing and operating healthcare delivery systems in 35 distinct markets across 14 states, CHS is committed to helping people get well and live healthier. CHS operates 70 affiliated hospitals with more than 10,000 beds and approximately 1,000 other sites of care, including physician practices, urgent care centers, freestanding emergency departments, imaging centers, cancer centers, and ambulatory surgery centers.

**About the Role:** We are seeking an experienced **Data Engineer** to design, build, deploy, and manage our company's data infrastructure. This role is critical to managing and organizing structured and unstructured data across the organization to enable outcomes analysis, insights, compliance, reporting, and business intelligence. This role requires a strong blend of technical expertise, problem-solving abilities, and communication skills. Emphasis will be placed on leadership, mentorship, and ownership within the team.

**Essential Duties and Responsibilities:**

+ Design, build, and maintain robust CI/CD pipelines, including infrastructure as code.
+ Design, build, and maintain scalable and efficient data pipelines using various GCP services.
+ Maintain and refactor complex legacy SQL ETL processes in BigQuery (see the sketch after this listing).
+ Write clean, maintainable, and efficient code for data processing, automation, and API integrations.
+ Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
+ Ensure data quality, integrity, and security across all data platforms.
+ Troubleshoot and resolve data-related issues, performing root cause analysis and implementing corrective actions.
+ Contribute to the continuous improvement of our data engineering practices, tools, and methodologies.
+ Mentor junior team members and provide technical guidance.
+ Lead data initiatives from conception to deployment.

**Our Stack (In Development):**

+ GitHub, GitHub Actions, Terraform, Docker, Helm
+ JVM languages (Kotlin preferred), SQL, Python
+ GCP services: Cloud Composer, Dataproc, Dataplex, BigQuery, BigLake, GKE
+ OSS: Kubernetes, Spark, Flink, Kafka

**Required Education:**

+ Bachelor's Degree or 4 years of equivalent professional experience

**Required Experience:**

+ 5-7 years of professional experience in data engineering or a similar role.
+ Proficiency in SQL, with a deep understanding of relational databases and data warehousing concepts.
+ Expertise in JVM languages or Python for data manipulation, scripting, and automation.
+ Demonstrable experience with cloud services related to data engineering.
+ Strong understanding of ETL/ELT processes, data modeling, and data architecture principles.
+ Excellent problem-solving skills and a strong analytical mindset.
+ Ability to work independently and as part of a collaborative team.
+ Strong communication and interpersonal skills, with the ability to explain complex technical concepts to non-technical stakeholders.
+ Proven leadership potential and a willingness to take ownership of projects.
+ Experience with Agile development methodologies.
+ Expertise with version control systems (e.g., Git, GitHub).

**Preferred Experience:**

+ Experience with distributed data processing frameworks
+ Experience with distributed systems
+ Familiarity with data visualization tools (e.g., Looker, Tableau, Power BI)
+ Experience with data integration and migration within Oracle Cloud Infrastructure (OCI)
+ Familiarity with data structures and extraction methodologies for Oracle Cloud ERP applications (e.g., Financials, HCM, SCM)

Equal Employment Opportunity

This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment; simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
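To make the BigQuery ELT duty concrete, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table names, and query are all hypothetical and are not drawn from CHS systems.

```python
from google.cloud import bigquery

# Hypothetical project and table names, for illustration only
client = bigquery.Client(project="example-project")

query = """
    SELECT facility_id,
           DATE(admitted_at) AS admit_date,
           COUNT(*) AS encounters
    FROM `example-project.raw.encounters`
    GROUP BY facility_id, admit_date
"""

# Write the aggregated result to a reporting table, replacing prior contents
job_config = bigquery.QueryJobConfig(
    destination="example-project.analytics.daily_encounters",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

client.query(query, job_config=job_config).result()  # blocks until the job finishes
print("daily_encounters refreshed")
```

In practice a job like this would typically be scheduled and parameterized from an orchestrator such as Cloud Composer rather than run ad hoc.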
    $88k-107k yearly est. 60d+ ago
  • Data Onboarding Consultant

    Corpay

    Data engineer job in Brentwood, TN

What We Need

Corpay is currently looking to hire a Data Onboarding Consultant within our Implementations division. This position falls under our Corporate Payments line of business based out of our Brentwood, TN location. In this role, you will manage critical data activities for Corpay's clients to ensure successful implementation and ongoing client success. This position combines client-facing and internal technical responsibilities. The ideal candidate enjoys working with clients to help them navigate complex data landscapes, is analytical in nature and able to understand non-uniform data sets from various sources, can drive project success by creating deadlines and holding both internal and external parties accountable, and can balance competing priorities to ensure ultimate success. The ideal candidate is a problem solver, a great communicator, and most importantly takes ownership of their projects and drives them to success. You will report directly to the Manager of Technical Implementations.

How We Work

As a Data Onboarding Consultant you will be expected to work out of our Brentwood, TN office location. Corpay will set you up for success by providing:

Company-issued equipment
Assigned workspace in our Brentwood office
Formal, hands-on training

Role Responsibilities

The responsibilities of the role will include:

This is a customer-facing role that will serve as the primary point of client contact for all data services from the sales process through implementation
Work with clients and internal partners to obtain and validate data to be used in data services
Analyze client data and present findings to improve the results of the data being ingested
Utilize data cleaning and mapping tools to ingest data into the application (see the sketch after this listing)
Coordinate the scoping, prioritization, delivery, and, where applicable, ongoing maintenance of client data services (one-time data imports, ongoing data integrations)
Serve as the first line of defense for triaging issues related to data imports and data integrations
Work with clients and internal stakeholders to maintain a prioritized queue of data services deliverables
Contribute to the overall strategy for Implementations

Qualifications & Skills

2-5 years' experience in managing or working with data (training/education counts)
Comfortable communicating complex information in simple terms
Experience managing projects
Experience working with large, non-uniform data sets
Experience working directly with clients and prospects to assess needs and define technical solutions
Experience with data mapping and BI tools
While this is not an engineering role, familiarity with engineering tools and practices will be greatly beneficial

Benefits & Perks

Medical, Dental & Vision benefits available the 1st month after hire
Automatic enrollment into our 401k plan (subject to eligibility requirements)
Virtual fitness classes offered company-wide
Robust PTO offerings including major holidays, vacation, sick, personal, & volunteer time
Employee discounts with major providers (e.g., wireless, gym, car rental)
Philanthropic support with both local and national organizations
Fun culture with company-wide contests and prizes

Equal Opportunity/Affirmative Action Employer

Corpay is an Equal Opportunity Employer. Corpay provides equal employment opportunities to all employees and applicants without regard to race, color, gender (including pregnancy), religion, national origin, ancestry, disability, age, sexual orientation, gender identity or expression, marital status, language, genetic information, veteran and/or military status or any other group status protected by federal or local law. If you require reasonable accommodation for the application and/or interview process, please notify a representative of the Human Resources Department.

Pay Transparency

This salary range is provided for locations which require such disclosure. Where a position or applicant may fall within a particular wage range depends on a number of factors, including but not limited to skill sets, experience, training, licenses and certifications (if applicable), and other business and organizational needs. The disclosed range has not been adjusted for the applicable geographic markets. At Corpay it is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions depend on the facts and circumstances of each case. The estimate of the minimum and maximum salary range is $63,800-$80,000 per annum. For more information about our commitment to equal employment opportunity and pay transparency, please click the following links: EEOC and Pay Transparency.
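As a rough illustration of the data cleaning and mapping work this role coordinates (the listing itself notes this is not an engineering role; the column names, mapping, and file layout below are invented for the sketch), a pandas-based normalization step might look like:

```python
import pandas as pd

# Hypothetical client-to-standard column mapping, for illustration only
COLUMN_MAP = {
    "Vendor Name": "supplier_name",
    "Inv #": "invoice_number",
    "Amt": "amount_usd",
}
REQUIRED = ["supplier_name", "invoice_number", "amount_usd"]


def normalize(client_file: str) -> pd.DataFrame:
    """Rename a non-uniform client file to a standard schema and do basic cleaning."""
    df = pd.read_csv(client_file)
    df = df.rename(columns=COLUMN_MAP)

    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"Client file is missing required fields: {missing}")

    # Strip stray whitespace and coerce amounts to numeric (bad values become NaN for review)
    df["supplier_name"] = df["supplier_name"].str.strip()
    df["amount_usd"] = pd.to_numeric(df["amount_usd"], errors="coerce")
    return df[REQUIRED]
```

The real onboarding tooling would differ, but the shape of the task (map, validate, flag what cannot be ingested) is the same.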
    $63.8k-80k yearly 10d ago
  • Data Platform Engineer

    Monogram Health Inc. 3.7company rating

    Data engineer job in Brentwood, TN

Job Description

Position: Data Platform Engineer

The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.

Responsibilities

Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark (see the sketch after this listing).
Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
Ensure data quality, governance, and security across the data lifecycle.
Collaborate with product managers by estimating technical tasks and deliverables.
Uphold the mission and values of Monogram Health in all aspects of your role and activities.

Position Requirements

A bachelor's degree in computer science, data science, software engineering or a related field.
Minimum of five (5) years in designing and hands-on development of cloud-based analytics solutions, including a minimum of three (3) years' hands-on work with big data frameworks and tools, such as Apache Kafka and Spark.
Expert-level knowledge of Python or other scripting languages required.
Proficiency in SQL and other data query languages.
Understanding of data modeling and schema design principles.
Ability to work with large datasets and perform data analysis.
Experience designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
Thorough understanding of Azure Cloud Infrastructure offerings.
Demonstrated problem-solving and troubleshooting skills.
Team player with demonstrated written and verbal communication skills.

Benefits

Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
Wellness & Growth - Work-life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts

Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders. Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counseling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
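For flavor on the Databricks/PySpark pipeline work described above, here is a minimal bronze-to-silver style sketch; the paths, column names, and Delta layout are placeholders rather than anything from Monogram Health's platform, and the Delta format assumes a Databricks (or delta-enabled) runtime.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() is a no-op there
spark = SparkSession.builder.appName("claims-ingest-sketch").getOrCreate()

# Hypothetical bronze-layer source (raw Delta table) and silver-layer target
raw = spark.read.format("delta").load("/mnt/bronze/claims")

cleaned = (
    raw.dropDuplicates(["claim_id"])                       # remove duplicate claim records
       .withColumn("service_date", F.to_date("service_date"))  # normalize date typing
       .filter(F.col("claim_amount") > 0)                   # drop obviously invalid rows
)

(cleaned.write.format("delta")
        .mode("overwrite")
        .save("/mnt/silver/claims_clean"))
```

In a production Databricks Job this logic would live in a notebook or wheel, with the paths supplied as job parameters and orchestration handled by Databricks Workflows or Azure Data Factory.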
    $75k-103k yearly est. 1d ago
  • Data Governance and Privacy Lead

    Infosys Ltd. 4.4company rating

    Data engineer job in Brentwood, TN

We're seeking a results-driven Data Governance and Privacy Lead to lead the development and execution of enterprise-wide data management strategies. This is a hands-on leadership role focused on building a trusted data foundation, integrating governance, privacy, and compliance frameworks across multiple business domains. You'll collaborate with senior executives to align data initiatives with strategic objectives, influence enterprise data culture, and implement best-in-class governance technologies.

Key Responsibilities:

* Develop and lead the Data Governance and Stewardship framework across the organization.
* Implement data quality, metadata, and privacy standards aligned with global regulations (GDPR, CCPA, CPPA, COPPA and other US and Canadian privacy laws).
* Oversee data lineage, cataloging, and integration using tools such as Collibra, Alation, Informatica CDGC, Atlan, OvalEDGE and Azure Purview.
* Champion Privacy-by-Design and lead the rollout of data privacy automation via platforms like BigID, OneTrust, WireWheel and Securiti.ai.
* Advise C-level executives on data strategy, compliance, and governance maturity improvements.
* Build and mentor a high-performing team of Data Stewards, Privacy Analysts, and Data Architects.

Required Qualifications:

* Candidate must be located within commuting distance of Overland Park, KS; Raleigh, NC; Brentwood, TN; Hartford, CT; Phoenix, AZ; Tempe, AZ; Dallas, TX; Richardson, TX; Indianapolis, IN; or Atlanta, GA, or be willing to relocate to the area.
* Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
* Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
* At least 4 years of Information Technology experience.
* Experience in Data Management, Data Governance, and Privacy leadership.
* Proven success implementing enterprise governance frameworks (DAMA-DMBOK, DCAM).
* Deep technical expertise across metadata management, data quality, and MDM.
* Experience with enterprise data models (BDW, IIW, HPDM) and frameworks like TOGAF or Zachman.

Preferred Certifications:

* Collibra Ranger / Collibra Expert
* CDMP (Certified Data Management Professional)
* CIPP/US (Privacy Professional)
* TOGAF
* Excellent executive communication and stakeholder management skills

The job entails sitting as well as working at a computer for extended periods of time. Should be able to communicate by telephone, email or face to face. Travel may be required as per the job requirements.

Along with competitive pay, as a full-time Infosys employee you are also eligible for the following benefits:

* Medical/Dental/Vision/Life Insurance
* Long-term/Short-term Disability
* Health and Dependent Care Reimbursement Accounts
* Insurance (Accident, Critical Illness, Hospital Indemnity, Legal)
* 401(k) plan and contributions dependent on salary level
* Paid holidays plus Paid Time Off
    $76k-93k yearly est. 26d ago

Learn more about data engineer jobs

How much does a data engineer earn in Franklin, TN?

The average data engineer in Franklin, TN earns between $62,000 and $108,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Franklin, TN

$81,000

What are the biggest employers of Data Engineers in Franklin, TN?

The biggest employers of Data Engineers in Franklin, TN are:
  1. Monogram Health
  2. TWO95 International
  3. Corps Team / Mom Corps
  4. Community Health Systems
  5. Navitus
  6. LifePoint Health