Senior data scientist jobs in Hendersonville, TN - 22 jobs

All
Senior Data Scientist
Data Engineer
Lead Data Technician
Data Scientist
Senior Data Architect
Lead Data Analyst
Senior Data Analyst
  • Enterprise Data Scientist (Rev Cycle & Supply Chain)

    Community Health Systems 4.5 company rating

    Senior data scientist job in Franklin, TN

    This role is responsible for leveraging expertise in data analytics, advanced statistical methods, and programming to derive insights from clinical data. As a member of the Enterprise Data Science team, this role will be responsible for analyzing complex clinical datasets, developing data visualizations and dashboards, assisting with data model and/or feature development, and translating insights into actionable recommendations for improving patient care and operational outcomes.

    Responsibilities:
    + Collaborate with cross-functional teams, including clinical leaders, data scientists, and software engineers, to identify data-driven opportunities for enhancing clinical processes and patient care.
    + Utilize cloud-based technologies, such as Google Cloud Platform (GCP), for scalable data processing and analysis.
    + Develop easily consumable dashboards from complex clinical and operational data, providing visualizations grounded in best practices and data-consumption theory that drive healthcare and business performance.
    + Implement best practices for data management, including data quality assessment, data validation, and data governance.
    + Utilize Python and associated libraries such as PyTorch, Keras, Pandas, and NumPy to create, train, test, and implement meaningful data science models.
    + Lead the analysis of operational data to identify patterns, trends, and correlations relevant to healthcare outcomes.
    + Collaborate with healthcare professionals and domain experts to understand operational needs and design data-driven solutions.
    + Design and conduct experiments, interpret results, and communicate findings to both technical and non-technical stakeholders.

    Requirements:
    + Master's degree in Data Science, Data Analytics, Computer Science, or a related field.
    + Proven experience in analyzing complex healthcare data and building customer-facing dashboards and data visualizations.
    + Proficiency in Python programming and associated libraries.
    + Experience with cloud-based platforms such as Google Cloud Platform (GCP) for data storage, processing, and deployment.
    + Strong problem-solving skills and the ability to work independently and collaboratively in a fast-paced environment.
    + Excellent communication and presentation skills, with the ability to translate technical concepts to non-technical audiences.

    Equal Employment Opportunity: This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state, and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment, including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment. Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
    $83k-108k yearly est. 20d ago
  • Data Scientist, Merchandising Analytics

    Tractor Supply 4.2 company rating

    Senior data scientist job in Brentwood, TN

    The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using Machine Learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing. The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential, as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.

    Essential Duties and Responsibilities (Min 5%)
    * Work closely with key business partners to fully explore and frame a business question, including objectives, goals, KPIs, decisions the analysis will support, and required timing for deliverables.
    * Extract available and relevant data from internal and external data sources to perform data science solution development.
    * Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
    * Contribute and assist the team with best practices in data governance, data engineering, and data architecture.
    * Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
    * Maintain broad exposure to the wider AI/Machine Learning ecosystem and ensure the team is pushing toward the most optimal solutions.
    * Manage multiple projects simultaneously with limited oversight, including development of new technologies and maintenance of the existing framework.
    * Foster a culture of data-driven decision-making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
    * Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, evaluation metrics, and statistical analysis plans.
    * Use proven predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
    * Be a go-to person for any data analytical needs, ranging from data extraction/manipulation and long-term trend analysis to statistical analysis and modeling techniques. Perform code review and debugging with the team and assist during implementation where necessary.

    Required Qualifications
    Experience: 3-5 years of proven experience as a Data Scientist, preferably in the retail or e-commerce industry. 2+ years of experience in predictive modeling utilizing CRM or transaction information preferred.
    Education: Bachelor's Degree in Mathematics, Statistics, or Econometrics; Master's Degree preferred. Any combination of education and experience will be considered.
    Professional Certifications: None

    Preferred knowledge, skills or abilities
    * Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
    * Deep expertise in writing and debugging complex SQL queries.
    * Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
    * Proven advanced modeling experience in leading data-driven projects from definition to execution, driving and influencing project roadmaps.
    * Experience using Azure, AWS, or another cloud compute platform a plus.
    * Familiarity with visualization tools such as Power BI and Tableau.
    * High degree of aptitude in communicating complex analysis results, both verbally and in writing, to Senior & Executive Leadership.
    * Knowledge of A/B testing methods; capable of designing a controlled test, running the test, and providing post-hoc measurement.
    * Proficiency with managing data repositories and version control systems like Git.
    * Speak, read, and write effectively in the English language.

    Working Conditions
    * Hybrid / Flexible working conditions

    Physical Requirements
    * Sitting
    * Standing (not walking)
    * Walking
    * Kneeling/Stooping/Bending
    * Lifting up to 10 pounds

    Disclaimer: This job description represents an overview of the responsibilities for the above-referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/her supervisor.

    Company Info: At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future. Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time and part-time Team Members. Part-time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service. Please visit this link for more specific information about the benefits and leave policies applicable to the position you're applying for.
    $81k-105k yearly est. 39d ago
  • Big Data / Hadoop Technical Lead

    E Pro Consulting 3.8 company rating

    Senior data scientist job in Franklin, TN

    E*Pro Consulting's service offerings include contingent staff augmentation of IT professionals, permanent recruiting, and temp-to-hire. In addition, our industry expertise and knowledge within the financial services, insurance, telecom, manufacturing, technology, media and entertainment, pharmaceutical, healthcare, and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************

    Job Description
    Technical/Functional Skills:
    • Must have at least 1 full-scale Hadoop implementation from DEV to PROD
    • Must have experience in the production deployment process for Big Data projects
    • Must have experience in root cause analysis and troubleshooting of Hadoop applications
    • Must have significant experience in designing solutions using Cloudera Hadoop
    • Must have significant experience with Java MapReduce, Pig, Hive, Sqoop, and Oozie
    • Must have significant experience with Unix shell scripts
    • Exposure to the Healthcare Provider domain

    Roles & Responsibilities:
    • Design solutions. Provide technical expertise in researching, designing, implementing, and maintaining business application solutions.
    • Mentor, guide, and train team members on Big Data
    • Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
    • Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
    • Perform complex coding tasks using Java MapReduce, Pig, Hive, and Sqoop
    • Review and certify code written by team members
    • Ensure that established change management and other procedures are adhered to, and help develop needed standards, procedures, and practices.
    • Performance tuning with large data sets.

    Generic Managerial Skills:
    • Ability to lead the team and plan, track, and manage work performed by team members
    • Ability to work independently and communicate across multiple levels (product owners, executive sponsors, team members)

    Additional Information
    All your information will be kept confidential according to EEO guidelines.
    $91k-120k yearly est. 60d+ ago
  • Big Data / Hadoop Technical Lead

    Tectammina

    Senior data scientist job in Franklin, TN

    First IT Solutions provides a professional, cost-effective recruitment solution. We take the time to understand your needs in great detail. With our dedicated and experienced recruitment consultants, our market knowledge, and our emphasis on quality and satisfaction, we pride ourselves on offering the best solution the first time. Our consultants have substantial experience gained over many years placing talented individuals in contract and permanent positions within the technology industry. You can be sure that we understand the process well from your side of the desk. We started First IT to provide top-quality service, something that clients complained was lacking at other recruiting and placement firms. At First IT, we strive continually to provide excellent service at all times.

    Job Description
    Technical/Functional Skills:
    Minimum Experience Required: 8 years
    Must have experience as a Tech Lead for Big Data projects
    Must have significant experience with architecting and designing solutions using Cloudera Hadoop
    Must have significant experience with Python, Java MapReduce, Pig, Hive, HBase, Oozie, and Sqoop
    Must have significant experience with Unix shell scripts
    Exposure to the Healthcare Provider domain

    Qualifications
    Architect and design solutions. Provide technical expertise in researching, designing, implementing, and maintaining business application solutions.
    Estimate the size, effort, and complexity of solutions
    Plan, track, and report project status
    Mentor, guide, and train team members on Big Data
    Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
    Prepare detailed specifications, diagrams, and other programming structures from which programs are written.
    Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
    Perform complex coding tasks using Python, Java MapReduce, Pig, Hive, HBase, Oozie, and Sqoop
    Review and certify code written by team members
    Ensure that established change management and other procedures are adhered to, and help develop needed standards, procedures, and practices.
    Performance tuning with large data sets.

    Additional Information
    Duration: Full Time
    Eligibility: GC & US Citizens Only
    Share profiles to ****************************
    Contact: ************
    Keep the subject line with the job title and location
    $86k-122k yearly est. Easy Apply 12h ago
  • Senior Data Architect

    Fortive 4.1 company rating

    Senior data scientist job in Franklin, TN

    Job Title: Sr. Data Architect

    About Censis
    Censis Technologies (*************************) is a global leader in surgical instrument tracking and asset management solutions. At the forefront of healthcare innovation, Censis, the first company to engineer a surgical asset management system that tracks down to the instrument and patient levels, has continually set the standards for the sterile processing industry. From the beginning, Censis has recognized the vital connection between perioperative innovation and efficiency, unparalleled customer care, and improved operational performance. By continuing to invest in technology, ease of integration, education, and support, Censis provides solutions that empower hospitals and healthcare providers to stay compliant and ahead of healthcare's rapidly changing environment. With Censis, you're positioned to start ahead and stay ahead, no matter what the future holds.

    Role Overview
    Censis is seeking a highly experienced and innovative Sr. Data Architect to lead the design and implementation of modern data solutions using Microsoft Fabric, Lakehouse architecture, Power BI, semantic data modeling, and medallion architecture. The ideal candidate will have a strong foundation in data architecture, SQL Server, data pipelines, on-premises data integration using the On-prem Data Gateway (ODG), and semantic layer development. This role demands a strategic thinker with hands-on expertise in building scalable, secure, and high-performance BI ecosystems, leveraging AI-driven development and delivery. In addition to data architecture, the role will be responsible for building and leading an enterprise architecture function, ensuring technology alignment with business strategy and innovation objectives.

    Key Responsibilities

    Enterprise Architecture Leadership
    Define and maintain an enterprise architecture strategy aligning business objectives with technology capabilities, ensuring scalability, reliability, security, and compliance.
    Lead, mentor, and scale a team of solution and enterprise architects, fostering a high-performance culture rooted in architectural excellence, innovation, and collaboration.
    Lead architecture governance and standards, establishing frameworks and best practices and reviewing designs and processes across the application, data, interface, and infrastructure domains.
    Drive cross-functional alignment by collaborating with business, IT, and engineering leaders to ensure technology roadmaps support organizational priorities and innovation.
    Build the Enterprise Architecture team, fostering strong architectural excellence, knowledge sharing, and continuous improvement across the enterprise.
    Evaluate emerging technologies and guide strategic investments in data platforms, AI tools, interfaces, and automation to enhance healthcare efficiency and outcomes.
    Build relationships with partners such as Microsoft, AWS, and service delivery partners to execute on the enterprise architecture vision and strategy.

    Architecture & Design
    Design and implement scalable BI and data architecture using Data Lake or Lakehouse paradigms, including medallion architecture.
    Architect and optimize ELT/ETL pipelines using SQL Server, Dataflows, and data pipelines.
    Integrate on-premises data sources using the On-prem Data Gateway (ODG) with cloud-based solutions.
    Develop a robust semantic layer and underlying data model to bridge technical data and business language, applying data virtualization principles.
    Design and optimize semantic models that represent data in a way that enhances understanding and usability for analytics.

    Development & Implementation
    Build and manage data pipelines, Lakehouses, Data Lakes, and Data Warehouses in Azure or AWS.
    Manage Power BI dashboards with advanced DAX and real-time data integration.
    Implement data governance, security, and compliance best practices.
    Utilize AI-driven development and delivery to enhance solution effectiveness and efficiency.
    Define data quality checks, transformations, and cleansing rules, and work with data engineers to implement them within the semantic layer.
    Apply strong T-SQL knowledge: materialized views, indexing, columnstore, and dimensional data modeling.

    Monitoring & Optimization
    Monitor and optimize data pipeline performance and troubleshoot issues.
    Ensure data quality, lineage, and availability across all reporting layers.
    Maintain comprehensive documentation of architecture, data models, workflows, and semantic layer details.

    Required Skills & Qualifications
    Experience: 12+ years in data architecture, with at least 3-5 years in an enterprise or solution architecture leadership capacity.
    Leadership: Proven experience building and leading cross-functional enterprise architecture teams and influencing enterprise-wide technology direction. Expertise in semantic data modeling and data engineering. Experience with architecture frameworks, best practices, designs, and processes across the application, data, interface, and infrastructure domains. Strong experience with data platforms such as Snowflake, Databricks, Microsoft Fabric, Azure Data Gateway, or Azure Synapse.
    BI Tools: Working knowledge of Power BI; integration with Microsoft Fabric is a plus.
    Data Integration: Experience with interfaces, API design, and third-party systems integration using tools such as Boomi, Azure API Management, and other middleware platforms.
    Data Engineering: Proficient in designing ELT/ETL processes using SQL Server, columnar data formats such as Delta or Parquet, and Fabric pipelines.
    Architecture: Strong understanding of medallion architecture, data virtualization principles, cloud-based data management, and analytics technologies.
    AI: Working knowledge of AI tools to accelerate development, such as GitHub Copilot, Cursor AI, Claude, or similar, is a plus.
    Programming: Expertise with Python, T-SQL, Spark, or other scripting languages for data transformation. Experience with programming languages and platforms such as C#, .NET, Node.js, and Vue.js is preferred.
    Methodologies: Agile/Scrum project delivery experience.
    Communication: Exceptional communication, strategic thinking, and stakeholder management skills; ability to bridge technical and business domains effectively.
    Certifications: Certifications in Azure, Power BI, Microsoft Fabric, and other data or enterprise architecture platforms are a plus.

    Bonus or Equity
    This position is also eligible for a bonus as part of the total compensation package.

    Fortive Corporation Overview
    Fortive's essential technology makes the world safer and more productive. We accelerate transformation across a broad range of applications including environmental, health and safety compliance, industrial condition monitoring, next-generation product design, and healthcare safety solutions. We are a global industrial technology innovator with a startup spirit. Our forward-looking companies lead the way in software-powered workflow solutions, data-driven intelligence, AI-powered automation, and other disruptive technologies. We're a force for progress, working alongside our customers and partners to solve challenges on a global scale, from workplace safety in the most demanding conditions to groundbreaking sustainability solutions. We are a diverse team 10,000 strong, united by a dynamic, inclusive culture and energized by limitless learning and growth. We use the proven Fortive Business System (FBS) to accelerate our positive impact. At Fortive, we believe in you. We believe in your potential: your ability to learn, grow, and make a difference. At Fortive, we believe in us. We believe in the power of people working together to solve problems no one could solve alone. At Fortive, we believe in growth. We're honest about what's working and what isn't, and we never stop improving and innovating. Fortive: For you, for us, for growth.
    We Are an Equal Opportunity Employer
    Fortive Corporation and all Fortive Companies are proud to be equal opportunity employers. We value and encourage diversity and solicit applications from all qualified applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity or expression, or other characteristics protected by law. Fortive and all Fortive Companies are also committed to providing reasonable accommodations for applicants with disabilities. Individuals who need a reasonable accommodation because of a disability for any part of the employment application process should contact us at applyassistance@fortive.com.
    $87k-111k yearly est. Auto-Apply 60d+ ago
  • Data and AI Lead

    Infosys Ltd. 4.4 company rating

    Senior data scientist job in Brentwood, TN

    Infosys is seeking a Data and AI Lead. You will spearhead the development and deployment of cutting-edge AI solutions. You will design and build scalable systems leveraging OpenAI APIs, Azure AI Search, vector databases, and Retrieval-Augmented Generation (RAG) models to solve complex business challenges and drive exceptional user experiences. In this role, you will architect and implement end-to-end AI solutions combining OpenAI APIs, Azure AI Search, vector databases (utilizing Azure AI Search capabilities), and RAG models. Design and optimize RAG pipelines to enhance the accuracy and relevance of AI-generated responses. Manage vector databases within Azure AI Search for efficient semantic search and information retrieval. Develop and maintain robust APIs for seamless integration of AI services across applications and systems. Evaluate AI model performance and implement optimization strategies for continuous improvement. Integrate Azure AI Search with OpenAI services and vector databases to create powerful search and knowledge retrieval systems.

    Required Qualifications:
    * Candidate must be located within commuting distance of Overland Park, KS; Raleigh, NC; Brentwood, TN; Hartford, CT; Phoenix, AZ; Tempe, AZ; Dallas, TX; Richardson, TX; Indianapolis, IN; or Atlanta, GA, or be willing to relocate to the area.
    * Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
    * Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
    * At least 4 years of Information Technology experience.
    * Experience utilizing OpenAI APIs and services.
    * Experience integrating Azure AI Search with other AI services.
    * Hands-on experience developing and deploying RAG models.
    * Strong programming skills in both Python and Java.
    * Experience designing and building RESTful APIs.

    Preferred Qualifications:
    * Familiarity with cloud platforms, specifically Azure.
    * Good understanding of Agile software development frameworks.
    * Strong communication and analytical skills.
    * Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams.
    * Experience with, and desire to work in, a global delivery environment.

    The job entails sitting as well as working at a computer for extended periods of time. Should be able to communicate by telephone, email, or face to face. Travel may be required as per the job requirements.
    $67k-79k yearly est. 6d ago
  • Data Engineer

    Zipliens

    Senior data scientist job in Spring Hill, TN

    We're seeking a Data Engineer to help shape the foundation of Zipliens' growing data ecosystem. Our engineering team supports a diverse set of tools and systems that power lien resolution operations, client transparency, and decision-making across the company. In this role, you'll design and maintain reliable data pipelines, optimize data storage and retrieval, and ensure that our systems deliver accurate, timely, and actionable insights. You'll collaborate closely with data analysts, product owners, and engineers to build scalable data infrastructure and contribute to data quality standards that will support the next generation of Zipliens applications.

    Requirements

    Responsibilities:
    Develop and optimize SQL queries and database schemas for efficient data retrieval and storage.
    Design, develop, and maintain scalable ETL processes.
    Develop and maintain scripts for data processing.
    Ensure scalability and performance optimization of data pipelines and queries.
    Develop and implement data quality checks and monitoring to ensure data accuracy and reliability.
    Contribute to the development and maintenance of data quality standards and best practices.
    Collaborate with data analysts to understand requirements and deliver solutions that enable effective reporting and analytics.
    Design and build reports to provide actionable insights to stakeholders.
    Document data models, ETL processes, and reporting solutions.

    Qualifications:
    Bachelor's degree in Business, Computer Information Systems, Computer Science, or equivalent practical experience.
    4+ years of experience as a Data Engineer, Senior Data Analyst, or similar role.
    Strong proficiency in SQL and experience with relational and cloud-based data storage solutions (e.g., PostgreSQL, MySQL, SQL Server, Snowflake, Redshift, BigQuery).
    Experience with ETL tools and techniques.
    Experience with a general-purpose programming language (Python preferred).
    Familiarity with data warehousing concepts and data modeling principles.
    Understanding of cloud platforms (e.g., AWS, Azure, GCP) and their data services is a plus.
    Strong analytical and problem-solving skills with a focus on data quality, performance, and reliability.
    Collaborative mindset with the ability to communicate effectively with stakeholders.

    This role requires on-site presence at least three days per week (60%) in our Spring Hill, TN office.

    Benefits
    Comprehensive Health Benefits (Medical, Dental, and Vision), including HSA with employer contributions, FSA, and Dependent Care FSA
    Company-Paid Life Insurance and Short-Term Disability
    401(k) Plan with Company Match
    Paid Time Off (Vacation, Sick Leave, and 10 Holidays)
    Paid Parental Leave

    Pay Disclosure: The total base salary range for this role is $89,000 - $120,000 annually, with an opportunity for a discretionary bonus. Final compensation will be determined based on skills and experience.
    $89k-120k yearly Auto-Apply 48d ago
  • Data Engineer

    Two95 International 3.9 company rating

    Senior data scientist job in Franklin, TN

    Title: Data Engineer
    Type: 6 months (contract to hire)
    Rate: open

    Requirements
    5+ years of experience developing software using an object-oriented or functional language
    5+ years of SQL
    3+ years working with open source Big Data technology stacks (Apache NiFi, Spark, Kafka, HBase, Hadoop/HDFS, Hive, Drill, Pig, etc.) or commercial open source Big Data technology stacks (Hortonworks, Cloudera, etc.)
    3+ years with document databases (e.g. MongoDB, Accumulo, etc.)
    3+ years of experience using Agile development processes (e.g. developing and estimating user stories, sprint planning, sprint retrospectives, etc.)
    2+ years with a distributed version control system (e.g. Git)
    3+ years of experience in cloud-based development and delivery
    Familiarity with distributed computing patterns, techniques, and technologies (e.g. ESB)
    Familiarity with continuous delivery technologies (e.g. Puppet, Chef, Ansible, Docker, Vagrant, etc.)
    Familiarity with build automation and continuous integration tools (e.g. Maven, Jenkins, Bamboo, etc.)
    Familiarity with Agile process management tools (e.g. Atlassian Jira)
    Familiarity with test automation (Selenium, SoapUI, etc.)
    Good software development and object-oriented programming skills.
    Strong analytical skills and the ability to work with end users to transform requests into robust solutions.
    Excellent oral and written communication skills.
    Initiative and self-motivation to work independently on projects.

    Benefits
    Note: If interested, please send your updated resume and include your salary requirement along with your contact details and a suitable time when we can reach you. If you know of anyone in your sphere of contacts who would be a perfect match for this job, we would appreciate it if you could forward this posting to them with a copy to us. We look forward to hearing from you at the earliest!
    $75k-103k yearly est. Auto-Apply 60d+ ago
  • Data Engineer - Archimedes

    Navitus 4.7company rating

    Senior data scientist job in Brentwood, TN

    Company: Archimedes

    About Us: Archimedes - Transforming the Specialty Drug Benefit - Archimedes is the industry leader in specialty drug management solutions. Founded with the goal of transforming the PBM industry to provide the necessary ingredients for the sustainability of the prescription drug benefit - alignment, value, and transparency - Archimedes achieves superior results for clients by eliminating tightly held PBM conflicts of interest, including drug spread, rebate retention, and pharmacy ownership, and delivering the most rigorous clinical management at the lowest net cost.

    Current associates must use the SSO login option at ************************************ to be considered for internal opportunities.

    Pay Range: USD $0.00 - USD $0.00 /Yr. STAR Bonus % (At Risk Maximum): 10.00 - Manager, Clinical Mgr, Pharm Supvr, CAE, Sr CAE I

    Work Schedule (e.g. M-F 8am to 5pm): Core Business Hours

    Overview: The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support enterprise analytics, reporting, and operational data flows. This role plays a critical part in enabling data-driven decision-making across Centene and SPM by ensuring the availability, integrity, and performance of data systems. The Data Engineer collaborates with data scientists, analysts, software developers, and business stakeholders to deliver robust ETL solutions, optimize data storage and retrieval, and implement secure, compliant data architectures in cloud and hybrid environments. Operating within a healthcare-focused, compliance-heavy landscape, the Data Engineer ensures that data platforms align with regulatory standards such as HIPAA and SOC 2, while embedding automation and CI/CD practices into daily workflows.
    The role supports both AWS and Azure environments, leveraging cloud-native services and modern tooling to streamline data ingestion, transformation, and delivery.

    Responsibilities
    * Design, develop, and maintain ETL pipelines for structured and unstructured data across cloud and on-prem environments.
    * Build and optimize data models, schemas, and storage solutions in SQL Server, PostgreSQL, and cloud-native databases.
    * Implement CI/CD workflows for data pipeline deployment and monitoring using tools such as GitHub Actions, Azure DevOps, or Jenkins.
    * Develop and maintain data integrations using AWS Glue, Azure Data Factory, Lambda, EventBridge, and other cloud-native services.
    * Ensure data quality, lineage, and governance through automated validation, logging, and monitoring frameworks.
    * Collaborate with cross-functional teams to gather requirements, design scalable solutions, and support analytics and reporting needs.
    * Monitor and troubleshoot data pipeline performance, latency, and failures; implement proactive alerting and remediation strategies.
    * Support data security and compliance by enforcing access controls, encryption standards, and audit logging aligned with HIPAA and SOC 2.
    * Maintain documentation for data flows, architecture diagrams, and operational procedures.
    * Participate in sprint planning, code reviews, and agile ceremonies to support iterative development and continuous improvement.
    * Evaluate and integrate new data tools, frameworks, and cloud services to enhance platform capabilities.
    * Partner with DevOps and Security teams to ensure infrastructure-as-code and secure deployment practices are followed.
    * Participate in, adhere to, and support compliance, people and culture, and learning programs.
    * Perform other duties as assigned.

    Qualifications
    * Education: Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field required. Master's degree preferred.
    * Certifications/Licenses: AWS Certified Data Analytics or Solutions Architect required. Microsoft Certified: Azure Data Engineer Associate required. Certified Data Management Professional (CDMP) required.
    * Experience:
      * 5+ years of experience in data engineering, ETL development, or cloud data architecture required.
      * Proven experience with SQL, ETL tools, and CI/CD pipelines required.
      * Hands-on experience with AWS and Azure data services and infrastructure required.
      * Familiarity with data governance, compliance frameworks (HIPAA, SOC 2), and secure data handling practices required.
      * Familiarity with CI/CD pipelines, automated testing, and version control systems required.
    * Skills & Technologies:
      * Languages & Tools: SQL, Python, Bash, Git, Terraform, PowerShell
      * ETL & Orchestration: AWS Glue, Azure Data Factory, Apache Airflow
      * CI/CD: GitHub Actions, Azure DevOps, Jenkins
      * Cloud Platforms: AWS (S3, Lambda, RDS, Redshift), Azure (Blob Storage, Synapse, Functions)
      * Monitoring & Logging: CloudWatch, Azure Monitor, ELK Stack
      * Data Governance: Data cataloging, lineage tracking, encryption, and access control

    Location: 5250 Virginia Way Ste 300, Brentwood, TN 37027, US
    $74k-103k yearly est. Auto-Apply 60d+ ago
  • Engineer, Data

    Holley Performance

    Senior data scientist job in Bowling Green, KY

    Job Description

    This role focuses on backend development and integrations for building and maintaining enterprise data warehouses and data lakes. The ideal candidate will possess a deep understanding of data architecture, ETL pipelines, and integration technologies, ensuring seamless data flow and accessibility across the organization.

    Key Responsibilities:
    · Design, develop, and maintain scalable backend systems to support data warehousing and data lake initiatives.
    · Build and optimize ETL/ELT processes to extract, transform, and load data from various sources into centralized data repositories.
    · Develop and implement integration solutions for seamless data exchange between systems, applications, and platforms.
    · Collaborate with data architects, analysts, and other stakeholders to define and implement data models, schemas, and storage solutions.
    · Ensure data quality, consistency, and security by implementing best practices and monitoring frameworks.
    · Monitor and troubleshoot data pipelines and systems to ensure high availability and performance.
    · Stay up to date with emerging technologies and trends in data engineering and integration to recommend improvements and innovations.
    · Document technical designs, processes, and standards for the team and stakeholders.

    Qualifications:
    · Bachelor's degree in Computer Science, Engineering, or a related field; equivalent experience considered.
    · Proven experience as a Data Engineer with 5 or more years of experience, or in a similar backend development role.
    · Strong proficiency in programming languages such as Python, Java, or Scala.
    · Hands-on experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Informatica).
    · Extensive knowledge of relational and non-relational databases (e.g., SQL, NoSQL, PostgreSQL, MongoDB).
    · Expertise in building and managing enterprise data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake).
    · Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services.
    · Experience with API integrations and data exchange protocols (e.g., REST, SOAP, JSON, XML).
    · Solid understanding of data governance, security, and compliance standards.
    · Strong analytical and problem-solving skills with attention to detail.
    · Excellent communication and collaboration abilities.

    Preferred Qualifications:
    · Certifications in cloud platforms (AWS Certified Data Analytics, Azure Data Engineer, etc.)
    · Experience with big data technologies (e.g., Apache Hadoop, Spark, Kafka).
    · Knowledge of data visualization tools (e.g., Tableau, Power BI) for supporting downstream analytics.
    · Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Jenkins).

    Please note: Relocation assistance will not be available for this position.
    $70k-94k yearly est. 2d ago
  • Data Platform Engineer

    Monogram Health Inc. 3.7company rating

    Senior data scientist job in Brentwood, TN

    Job Description

    Position: Data Platform Engineer

    The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in data engineering, database modeling, and modern cloud data platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.

    Responsibilities
    * Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
    * Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
    * Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark.
    * Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
    * Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
    * Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
    * Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
    * Ensure data quality, governance, and security across the data lifecycle.
    * Collaborate with product managers by estimating technical tasks and deliverables.
    * Uphold the mission and values of Monogram Health in all aspects of your role and activities.

    Position Requirements
    * A bachelor's degree in computer science, data science, software engineering, or a related field.
    * Minimum of five (5) years of designing and hands-on development in cloud-based analytics solutions, including a minimum of three (3) years of hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
    * Expert-level knowledge of Python or other scripting languages required.
    * Proficiency in SQL and other data query languages.
    * Understanding of data modeling and schema design principles.
    * Ability to work with large datasets and perform data analysis.
    * Experience designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
    * Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
    * Thorough understanding of Azure Cloud Infrastructure offerings.
    * Demonstrated problem-solving and troubleshooting skills.
    * Team player with demonstrated written and verbal communication skills.

    Benefits
    * Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
    * Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
    * Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
    * Wellness & Growth - Work-life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts

    Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex patients, who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease but all of the chronic conditions that are present, such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.

    Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counseling on a patient's healthcare options; and assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
    $75k-103k yearly est. 11d ago
  • Data Engineer

    Lattimore Black Morgan & Cain, PC and Affiliates

    Senior data scientist job in Brentwood, TN

    OPPORTUNITY

    We are seeking an experienced Data Engineer with 2-3+ years of hands-on experience to design, build, and maintain robust data solutions within our Azure/Microsoft-centric technology stack. The ideal candidate will thrive in a collaborative yet independent work environment, delivering high-quality analytics solutions that drive business insights and decision-making.

    SCOPE OF WORK
    * Design and implement scalable data models and pipelines using modern data engineering practices
    * Develop and maintain production-grade analytics solutions within the Azure ecosystem
    * Build comprehensive dashboards and reports using Power BI to support business stakeholders
    * Collaborate with cross-functional teams to translate business requirements into technical solutions
    * Optimize SQL queries and data processing workflows for performance and reliability
    * Support and enhance existing data infrastructure while implementing new analytics capabilities
    * Participate in code reviews and maintain high standards for data quality and documentation
    * Respond quickly to critical data issues and provide solutions under tight deadlines

    IDEAL CANDIDATE PROFILE

    Required Qualifications
    * 2-3+ years of experience in analytics engineering, data engineering, or a similar role
    * Advanced SQL proficiency with experience in complex query optimization and database design
    * Python programming skills for data manipulation, automation, and analytics
    * Power BI expertise including dashboard development, DAX, and data visualization best practices
    * Azure cloud platform experience including Azure Data Factory, Azure SQL, and related services
    * Data modeling experience with dimensional modeling, star/snowflake schemas, and ETL/ELT processes
    * Production environment experience including deployment, monitoring, and maintenance of data systems
    * Strong communication skills with the ability to explain technical concepts to non-technical stakeholders
    * Proven ability to work independently and manage multiple priorities effectively
    * Experience working under pressure with quick turnaround requirements

    Preferred Qualifications
    * Snowflake and/or Databricks experience with modern cloud data platforms
    * Private equity or financial services background - understanding of investment data, portfolio management, or financial reporting
    * Machine Learning experience including model development, deployment, or MLOps practices
    * Experience with additional Azure services (Synapse, Logic Apps, Power Platform)
    * Knowledge of data governance and compliance frameworks
    * Experience with version control (Git) and CI/CD practices
    * Advanced Python libraries experience (pandas, scikit-learn, etc.)
    $70k-94k yearly est. 60d+ ago
  • Oracle Enterprise Data Scientist

    Community Health Systems 4.5company rating

    Senior data scientist job in Franklin, TN

    We are seeking a highly specialized and experienced Enterprise Data Scientist to drive data quality, standardization, and insight generation across our core Oracle operational suite. This role serves as the authoritative expert on translating complex, high-volume data from Oracle Supply Chain Management (SCM), Oracle Procurement, Oracle Revenue Cycle Management (RCM), and Oracle Inventory into actionable business intelligence. The successful candidate will be focused on ensuring absolute data integrity - a critical function in a regulated healthcare environment - and transforming raw transactional data into high-value operational reports, interactive dashboards, and predictive models that optimize cost-per-case, enhance inventory accuracy, and accelerate the revenue cycle.

    **Essential Functions**

    1. Data Validation, Integrity, and Compliance (Critical Focus)
    + Healthcare Data Quality Assurance: Design and implement automated data validation frameworks specific to healthcare operations, ensuring transactional data (e.g., supply usage, procedure charging, contract pricing) is accurate.
    + Compliance Verification: Develop reports and monitoring tools to detect anomalies and discrepancies that could impact regulatory reporting, financial audits (e.g., SOX implications), or compliance with GPO contracts and payer rules.
    + Revenue Leakage Identification: Specifically focus on validating the link between inventory consumption (SCM) and patient billing (RCM) data to prevent charge capture errors, ensuring accurate patient bills and maximizing appropriate reimbursement.
    + Root Cause Analysis: Investigate and diagnose data errors originating in Oracle system configurations (EBS or Fusion), ensuring the integrity of critical data points like item master definitions, vendor codes, and pricing tiers.

    2. Standardized Operational Analytics and Reporting
    + KPI Development (Healthcare Specific): Define, standardize, and institutionalize critical operational metrics across the organization, such as:
      + Inventory Accuracy Rate for Critical Supplies
      + Procurement Compliance Rate (Off-Contract Spend)
      + Days of Supply (DOS) for high-value pharmaceuticals and implants
      + Cost-Per-Case Variance analysis (linking supply cost to procedure type)
      + Claims Denial Rate Analysis linked to operational inputs
    + High-Value Reporting: Develop and maintain standardized operational reports and interactive dashboards (e.g., Tableau, Power BI) focused on optimizing the efficiency and spend within the OR, clinics, and centralized purchasing departments.
    + Executive Insights: Create visually compelling and accurate reports for executive leadership on the overall health and financial performance driven by Oracle system outputs.

    3. Advanced Modeling and Process Optimization
    + Predictive Inventory Modeling: Develop sophisticated models to forecast demand volatility (e.g., flu season spikes, pandemic-related surges) for critical supplies and pharmaceuticals, minimizing shortages and excess waste.
    + Revenue Cycle Modeling: Build predictive models to forecast cash flow, anticipate denials based on procurement/charging patterns, and prioritize RCM work queues based on expected return.
    + Efficiency Optimization: Utilize machine learning techniques to optimize logistics (e.g., warehouse routing, supply replenishment schedules) and procurement processes (e.g., automated purchase order generation based on consumption velocity).

    4. Collaboration and System Expertise
    + Serve as the technical data expert for functional Oracle teams (Finance, Clinical Operations, Materials Management), bridging the gap between business needs and data structure.
    + Document data lineage, metric definitions, and model methodologies to ensure transparency and trust in derived insights across the enterprise.

    **Required Qualifications:**
    + Education: Master's degree in Data Science, Health Informatics, Statistics, Industrial Engineering, or a related quantitative field.
    + Experience: 2+ years of experience in a specialized data science, BI, or analytics role, working within a large healthcare system, hospital, or payer environment.
    + Deep Oracle Domain Expertise (Mandatory): Proven practical experience analyzing, querying, and understanding the complex data models within at least two of the following Oracle applications (EBS or Fusion):
      + Oracle Supply Chain Management (SCM) & Inventory: Specific understanding of item masters, warehouse transactions, and consumption data.
      + Oracle Procurement: Expertise in purchase order data, contract management, and vendor performance metrics.
      + Oracle Revenue Cycle Management (RCM): Understanding of charge capture, billing, and the data linkage to operational inputs.

    **Technical Proficiency:**
    + Expert-level SQL skills for complex database querying, including experience navigating Oracle tables/views.
    + Proficiency in Python or R, with experience in statistical modeling, time series analysis, and machine learning libraries.
    + Experience developing advanced visualizations using industry-leading tools (Tableau, Power BI).
    + Demonstrable experience working with large-scale Enterprise Data Warehouses (EDW) in a regulated environment.

    **Preferred Skills and Attributes**
    + Familiarity with clinical coding standards (CPT, ICD-10) as they relate to procedure costing and RCM data.
    + Understanding of HIPAA, HITECH, and general healthcare data governance standards.
    + Experience with advanced analytics applied to surgical services or procedural areas.
    + Excellent collaboration and communication skills, with the ability to present complex analytical findings to clinical and executive audiences.
    + Certification in Oracle applications or cloud platforms is a plus.
    Equal Employment Opportunity

    This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state, and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment. Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
    $83k-108k yearly est. 60d+ ago
  • Data Scientist, Merchandising Analytics

    Tractor Supply Company 4.2company rating

    Senior data scientist job in Brentwood, TN

    The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using machine learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing. The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential, as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role's work will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.

    **Essential Duties and Responsibilities (Min 5%)**
    + Work closely with key business partners to fully explore and frame a business question, including objectives, goals, KPIs, decisions the analysis will support, and required timing for deliverables.
    + Extract available and relevant data from internal and external data sources to perform data science solution development.
    + Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
    + Contribute to and assist the team with best practices in data governance, data engineering, and data architecture.
    + Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
    + Maintain broad exposure to the wider ecosystem of AI / machine learning and ensure our team is pushing toward the most optimal solutions.
    + Manage multiple projects simultaneously with limited oversight, including development of new technologies and maintenance of the existing framework.
    + Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
    + Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans.
    + Use proven predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
    + Be a 'go-to' person for any data analytical needs, ranging from data extraction/manipulation to long-term trend analysis, statistical analysis, and modeling techniques. Perform code review and debugging with the team and assist during implementation where necessary.

    **Required Qualifications**
    _Experience_: 3-5 years of proven experience as a Data Scientist, preferably in the retail or e-commerce industry. 2+ years of experience in predictive modeling utilizing CRM or transaction information preferred.
    _Education_: Bachelor's Degree in Mathematics, Statistics, or Econometrics. Master's Degree preferred. Any combination of education and experience will be considered.
    _Professional Certifications_: None

    **Preferred knowledge, skills or abilities**
    + Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
    + Deep expertise in writing and debugging complex SQL queries.
    + Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
    + Proven advanced modeling experience leading data-driven projects from definition to execution, driving and influencing project roadmaps.
    + Experience using Azure, AWS, or another cloud compute platform a plus.
    + Familiarity with visualization tools such as Power BI and Tableau.
    + High degree of aptitude in communicating complex analysis results, both verbally and in writing, to Senior & Executive Leadership.
    + Knowledge of A/B testing methods; capable of designing a controlled test, running the test, and providing measurement post hoc.
    + Proficiency with managing data repositories and version control systems like Git.
    + Speak, read, and write effectively in the English language.

    **Working Conditions**
    + Hybrid / flexible working conditions

    **Physical Requirements**
    + Sitting
    + Standing (not walking)
    + Walking
    + Kneeling/Stooping/Bending
    + Lifting up to 10 pounds

    **Disclaimer**
    _This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/her supervisor._

    **Company Info**
    At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future. Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part-time new hires gain eligibility for TSC benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service. Please visit this link (**********************************************************************) for more specific information about the benefits and leave policies applicable to the position you're applying for.

    **ALREADY A TEAM MEMBER?** You must apply or refer a friend through our internal portal: Click here (**************************************************************************)

    **CONNECTION** Our Mission and Values are more than just words on the wall - they're the one constant in an ever-changing environment and the bedrock on which we build our culture. They're the core of who we are and the foundation of every decision we make. It's not just what we do that sets us apart, but how we do it.

    **EMPOWERMENT** We believe in managing your time for business and personal success, which is why we empower our Team Members to lead balanced lives through our benefits and total rewards offerings, for full-time and eligible part-time TSC and Petsense Team Members. We care about what you care about!

    **OPPORTUNITY** A lot of care goes into providing legendary service at Tractor Supply Company, which is why our Team Members are our top priority. Want a career with a clear path for growth? Your Opportunity is Out Here at Tractor Supply and Petsense.

    **Nearest Major Market:** Nashville
    $81k-105k yearly est. 60d+ ago
  • Bigdata / Hadoop Technical Lead

    E*Pro 3.8company rating

    Senior data scientist job in Franklin, TN

    E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ****************** Job Description Technical/Functional Skills: • Must have at least 1 full-scale Hadoop implementation from DEV to PROD • Must have experience in Production Deployment Process for Big Data projects • Must have experience in root cause analysis, trouble-shooting of Hadoop applications • Must have significant experience in designing solutions using Cloudera Hadoop • Must have significant experience with Java MapReduce, PIG, Hive, Sqoop and Oozie • Must have significant experience with Unix Shell Scripts • Exposure to Healthcare Provider domain Roles & Responsibilities: • Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions. • Mentor, guide and train Team members on Big Data • Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop. • Designs, codes, tests, debugs, and documents moderately to highly complex processes, in addition to performing advanced application maintenance tasks. • Perform complex coding tasks using Java Map Reduce, PIG, Hive and Sqoop • Review and certify code written by team members • Ensures that established change management and other procedures are adhered to also help developing needing standards, procedures, and practices. • Performance tuning with large data sets. 
Generic Managerial Skills:
• Ability to lead the team and to plan, track, and manage work performed by team members
• Ability to work independently and communicate across multiple levels (product owners, executive sponsors, team members)

Additional Information: All your information will be kept confidential according to EEO guidelines.
    $91k-120k yearly est. 12h ago
  • Big Data / Hadoop Technical Lead

    Tectammina

    Senior data scientist job in Franklin, TN

First IT Solutions provides a professional, cost-effective recruitment solution. We take the time to understand your needs in great detail. With our dedicated and experienced Recruitment Consultants, our market knowledge, and our emphasis on quality and satisfaction, we pride ourselves on offering the best solution the first time. Our consultants have substantial experience gained over many years placing talented individuals in Contract and Permanent positions within the technology industry. You can be sure that we understand the process well from your side of the desk. We started First IT to provide top quality service, something that clients complained was lacking at other recruiting and placement firms. At First IT, we strive continually to provide excellent service at all times.

Job Description

Technical/Functional Skills:
Minimum Experience Required: 8 years
Must have experience as a Tech Lead for Big Data projects
Must have significant experience with architecting and designing solutions using Cloudera Hadoop
Must have significant experience with Python, Java MapReduce, Pig, Hive, HBase, Oozie and Sqoop
Must have significant experience with Unix shell scripts
Exposure to the Healthcare Provider domain

Qualifications:
Architect and design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
Estimate the size, effort, and complexity of solutions
Plan, track and report project status
Mentor, guide and train team members on Big Data
Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
Prepare detailed specifications, diagrams, and other programming structures from which programs are written.
Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
Perform complex coding tasks using Python, Java MapReduce, Pig, Hive, HBase, Oozie and Sqoop
Review and certify code written by team members
Ensure that established change management and other procedures are adhered to, and help develop needed standards, procedures, and practices.
Performance tuning with large data sets.

Additional Information
Duration: Full Time
Eligibility: GC & US Citizens Only
Share profiles to ****************************
Contact: ************
Keep the subject line with the Job Title and Location
    $86k-122k yearly est. Easy Apply 60d+ ago
  • Senior Data Architect

    Fortive Corporation 4.1company rating

    Senior data scientist job in Franklin, TN

Job Title: Sr. Data Architect

**About Censis**
Censis Technologies (************************* is a global leader in surgical instrument tracking and asset management solutions. At the forefront of healthcare innovation, Censis, the first company to engineer a surgical asset management system that tracks down to the instrument and patient levels, has continually set the standards for the sterile processing industry. From the beginning, Censis has recognized the vital connection between perioperative innovation and efficiency, unparalleled customer care and improved operational performance. By continuing to invest in technology, ease of integration, education and support, Censis provides solutions that empower hospitals and healthcare providers to stay compliant and ahead of healthcare's rapidly changing environment. With Censis, you're positioned to start ahead and stay ahead, no matter what the future holds.

**Role Overview**
Censis is seeking a highly experienced and innovative Sr. Data Architect to lead the design and implementation of modern data solutions using Microsoft Fabric, Lakehouse architecture, Power BI, semantic data modeling, and medallion architecture. The ideal candidate will have a strong foundation in data architecture, SQL Server, data pipelines, on-premises data integration using the On-prem Data Gateway (ODG), and semantic layer development. This role demands a strategic thinker with hands-on expertise in building scalable, secure, and high-performance BI ecosystems, leveraging AI-driven development and delivery. In addition to data architecture, the role will be responsible for building and leading an enterprise architecture function, ensuring technology alignment with business strategy and innovation objectives.

**Key Responsibilities**

**Enterprise Architecture Leadership**
+ Define and maintain an enterprise architecture strategy aligning business objectives with technology capabilities, ensuring scalability, reliability, security, and compliance.
+ Lead, mentor, and scale a team of solution and enterprise architects, fostering a high-performance culture rooted in architectural excellence, innovation, and collaboration.
+ Lead architecture governance and standards, establishing frameworks and best practices, and review designs and processes across the application, data, interface, and infrastructure domains.
+ Drive cross-functional alignment by collaborating with business, IT, and engineering leaders to ensure technology roadmaps support organizational priorities and innovation.
+ Build the Enterprise Architecture team, fostering architectural excellence, knowledge sharing, and continuous improvement across the enterprise.
+ Evaluate emerging technologies and guide strategic investments in data platforms, AI tools, interfaces and automation to enhance healthcare efficiency and outcomes.
+ Build relationships with partners such as Microsoft, AWS and service delivery partners to execute on the enterprise architecture vision and strategy.

**Architecture & Design**
+ Design and implement scalable BI and data architecture using Data Lake or Lakehouse paradigms, including medallion architecture.
+ Architect and optimize ELT/ETL pipelines using SQL Server, Dataflows, and data pipelines.
+ Integrate on-premises data sources with cloud-based solutions using the On-prem Data Gateway (ODG).
+ Develop a robust semantic layer and underlying data model to bridge technical data and business language, applying data virtualization principles.
+ Design and optimize semantic models that represent data in a way that enhances understanding and usability for analytics.

**Development & Implementation**
+ Build and manage data pipelines, Lakehouses, Data Lakes, and Data Warehouses in Azure or AWS.
+ Manage Power BI dashboards with advanced DAX and real-time data integration.
+ Implement data governance, security, and compliance best practices.
+ Utilize AI-driven development and delivery to enhance solution effectiveness and efficiency.
+ Define data quality checks, transformations, and cleansing rules, and work with data engineers to implement them within the semantic layer.
+ Strong T-SQL knowledge, including materialized views, indexing, columnstore, and dimensional data modeling.

**Monitoring & Optimization**
+ Monitor and optimize data pipeline performance and troubleshoot issues.
+ Ensure data quality, lineage, and availability across all reporting layers.
+ Maintain comprehensive documentation of architecture, data models, workflows, and semantic layer details.

**Required Skills & Qualifications**
+ Experience: 12+ years in data architecture, with at least 3-5 years in an enterprise or solution architecture leadership capacity.
+ Leadership: Proven experience building and leading cross-functional enterprise architecture teams and influencing enterprise-wide technology direction.
+ Expertise in semantic data modeling and data engineering.
+ Experience with architecture frameworks, best practices, designs and processes across the application, data, interface, and infrastructure domains.
+ Strong experience with data platforms such as Snowflake, Databricks, Microsoft Fabric, Azure Data Gateway, or Azure Synapse.
+ BI Tools: Working knowledge of Power BI and its integration with Microsoft Fabric is a plus.
+ Data Integration: Experience with interfaces, API design and 3rd-party systems integration using tools such as Boomi, Azure API Management, and other middleware platforms.
+ Data Engineering: Proficient in designing ELT/ETL processes using SQL Server, columnar data formats such as Delta or Parquet, and Fabric Pipelines.
+ Architecture: Strong understanding of medallion architecture, data virtualization principles, cloud-based data management, and analytics technologies.
+ AI: Working knowledge of AI tools to accelerate development, such as GitHub Copilot, Cursor AI, Claude or similar, is a plus.
+ Programming: Expertise with Python, T-SQL, Spark or other scripting languages for data transformation.
+ Experience with programming languages such as C#, the .NET platform, Node.js, and Vue.js is preferred.
+ Methodologies: Agile/Scrum project delivery experience.
+ Communication: Exceptional communication, strategic thinking, and stakeholder management skills; ability to bridge technical and business domains effectively.
+ Certifications: Certifications in Azure, Power BI, Microsoft Fabric and other data or enterprise architecture platforms are a plus.

**Bonus or Equity**
This position is also eligible for a bonus as part of the total compensation package.

**Fortive Corporation Overview**
Fortive's essential technology makes the world safer and more productive. We accelerate transformation across a broad range of applications including environmental, health and safety compliance, industrial condition monitoring, next-generation product design, and healthcare safety solutions. We are a global industrial technology innovator with a startup spirit. Our forward-looking companies lead the way in software-powered workflow solutions, data-driven intelligence, AI-powered automation, and other disruptive technologies. We're a force for progress, working alongside our customers and partners to solve challenges on a global scale, from workplace safety in the most demanding conditions to groundbreaking sustainability solutions. We are a diverse team 10,000 strong, united by a dynamic, inclusive culture and energized by limitless learning and growth. We use the proven Fortive Business System (FBS) to accelerate our positive impact. At Fortive, we believe in you. We believe in your potential - your ability to learn, grow, and make a difference. At Fortive, we believe in us. We believe in the power of people working together to solve problems no one could solve alone. At Fortive, we believe in growth.
We're honest about what's working and what isn't, and we never stop improving and innovating. Fortive: For you, for us, for growth.

We Are an Equal Opportunity Employer. Fortive Corporation and all Fortive Companies are proud to be equal opportunity employers. We value and encourage diversity and solicit applications from all qualified applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity or expression, or other characteristics protected by law. Fortive and all Fortive Companies are also committed to providing reasonable accommodations for applicants with disabilities. Individuals who need a reasonable accommodation because of a disability for any part of the employment application process, please contact us at applyassistance@fortive.com.

**Pay Range**
The salary range for this position (in local currency) is 101,500.00 - 188,500.00
    $87k-111k yearly est. 60d+ ago
  • Engineer, Data

    Holley Performance

    Senior data scientist job in Bowling Green, KY

This role focuses on backend development and integrations for building and maintaining enterprise data warehouses and data lakes. The ideal candidate will possess a deep understanding of data architecture, ETL pipelines, and integration technologies, ensuring seamless data flow and accessibility across the organization.

Key Responsibilities:
· Design, develop, and maintain scalable backend systems to support data warehousing and data lake initiatives.
· Build and optimize ETL/ELT processes to extract, transform, and load data from various sources into centralized data repositories.
· Develop and implement integration solutions for seamless data exchange between systems, applications, and platforms.
· Collaborate with data architects, analysts, and other stakeholders to define and implement data models, schemas, and storage solutions.
· Ensure data quality, consistency, and security by implementing best practices and monitoring frameworks.
· Monitor and troubleshoot data pipelines and systems to ensure high availability and performance.
· Stay up-to-date with emerging technologies and trends in data engineering and integration to recommend improvements and innovations.
· Document technical designs, processes, and standards for the team and stakeholders.

Qualifications:
· Bachelor's degree in Computer Science, Engineering, or a related field; equivalent experience considered.
· Proven experience as a Data Engineer with 5 or more years of experience, or in a similar backend development role.
· Strong proficiency in programming languages such as Python, Java, or Scala.
· Hands-on experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Informatica).
· Extensive knowledge of relational and non-relational databases (e.g., SQL, NoSQL, PostgreSQL, MongoDB).
· Expertise in building and managing enterprise data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake).
· Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services.
· Experience with API integrations and data exchange protocols (e.g., REST, SOAP, JSON, XML).
· Solid understanding of data governance, security, and compliance standards.
· Strong analytical and problem-solving skills with attention to detail.
· Excellent communication and collaboration abilities.

Preferred Qualifications:
· Certifications in cloud platforms (AWS Certified Data Analytics, Azure Data Engineer, etc.)
· Experience with big data technologies (e.g., Apache Hadoop, Spark, Kafka).
· Knowledge of data visualization tools (e.g., Tableau, Power BI) for supporting downstream analytics.
· Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Jenkins).

Please note: Relocation assistance will not be available for this position.
    $70k-94k yearly est. Auto-Apply 60d+ ago
  • Data Platform Engineer

    Monogram Health 3.7company rating

    Senior data scientist job in Brentwood, TN

Data Platform Engineer

The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in data engineering, database modeling, and modern cloud data platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.

Responsibilities
* Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
* Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
* Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark.
* Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
* Implement Azure Functions, Azure Web Apps, and Application Insights to support microservices and monitor distributed systems.
* Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
* Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
* Ensure data quality, governance, and security across the data lifecycle.
* Collaborate with product managers by estimating technical tasks and deliverables.
* Uphold the mission and values of Monogram Health in all aspects of your role and activities.

Position Requirements
* A bachelor's degree in computer science, data science, software engineering or a related field.
* Minimum of five (5) years designing and doing hands-on development of cloud-based analytics solutions, including a minimum of three (3) years' hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
* Expert-level knowledge of Python or other scripting languages required.
* Proficiency in SQL and other data query languages.
* Understanding of data modeling and schema design principles.
* Ability to work with large datasets and perform data analysis.
* Experience designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
* Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
* Thorough understanding of Azure cloud infrastructure offerings.
* Demonstrated problem-solving and troubleshooting skills.
* Team player with demonstrated written and verbal communication skills.

Benefits
* Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
* Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
* Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
* Wellness & Growth - Work-life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts

Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex patients, those who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders. Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counseling on a patient's healthcare options; and assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the healthcare continuum.
    $75k-103k yearly est. 60d+ ago
  • Data Governance and Privacy Lead

    Infosys Ltd. 4.4company rating

    Senior data scientist job in Brentwood, TN

We're seeking a results-driven Data Governance and Privacy Lead to lead the development and execution of enterprise-wide data management strategies. This is a hands-on leadership role focused on building a trusted data foundation, integrating governance, privacy, and compliance frameworks across multiple business domains. You'll collaborate with senior executives to align data initiatives with strategic objectives, influence enterprise data culture, and implement best-in-class governance technologies.

Key Responsibilities:
* Develop and lead the Data Governance and Stewardship framework across the organization.
* Implement data quality, metadata, and privacy standards aligned with global regulations (GDPR, CCPA, CPPA, COPPA and other US and Canadian privacy laws).
* Oversee data lineage, cataloging, and integration using tools such as Collibra, Alation, Informatica CDGC, Atlan, OvalEdge and Azure Purview.
* Champion Privacy-by-Design and lead the rollout of data privacy automation via platforms like BigID, OneTrust, WireWheel and Securiti.ai.
* Advise C-level executives on data strategy, compliance, and governance maturity improvements.
* Build and mentor a high-performing team of Data Stewards, Privacy Analysts, and Data Architects.

Required Qualifications:
* Candidates must be located within commuting distance of Overland Park, KS; Raleigh, NC; Brentwood, TN; Hartford, CT; Phoenix, AZ; Tempe, AZ; Dallas, TX; Richardson, TX; Indianapolis, IN; or Atlanta, GA, or be willing to relocate to the area.
* Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
* Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
* At least 4 years of Information Technology experience
* Experience in Data Management, Data Governance, and Privacy leadership
* Proven success implementing enterprise governance frameworks (DAMA-DMBOK, DCAM)
* Deep technical expertise across metadata management, data quality, and MDM
* Experience with enterprise data models (BDW, IIW, HPDM) and frameworks like TOGAF or Zachman
* Excellent executive communication and stakeholder management skills

Preferred Certifications:
* Collibra Ranger / Collibra Expert
* CDMP (Certified Data Management Professional)
* CIPP/US (Privacy Professional)
* TOGAF

The job entails sitting as well as working at a computer for extended periods of time. Candidates should be able to communicate by telephone, email or face to face. Travel may be required as per the job requirements.
    $65k-81k yearly est. 6d ago

Learn more about senior data scientist jobs

How much does a senior data scientist earn in Hendersonville, TN?

The average senior data scientist in Hendersonville, TN earns between $60,000 and $111,000 annually. This compares to the national average senior data scientist range of $90,000 to $170,000.

Average senior data scientist salary in Hendersonville, TN

$82,000
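The quoted $82,000 average sits close to the geometric mean of the $60,000-$111,000 range above. The site does not state its methodology, so treating the figure as a geometric mean is our assumption; the sketch below just checks how the two common summary statistics compare against the quoted number.

```python
import math

# Estimated senior data scientist salary range for Hendersonville, TN (from the page above)
low, high = 60_000, 111_000

# Two common ways to summarize a range with one figure. The geometric mean
# is pulled less toward the high end than the arithmetic mean.
# (Assumption: the page does not say which method it uses.)
geo_mean = math.sqrt(low * high)
arith_mean = (low + high) / 2

print(round(geo_mean))    # ~81,609 - near the quoted $82,000 average
print(round(arith_mean))  # 85,500 - noticeably higher
```

Whatever the actual method, the quoted local average falls well below the national range of $90,000-$170,000 cited above.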
Job type you want
Full Time
Part Time
Internship
Temporary