Oracle Enterprise Data Scientist
Senior data scientist job in Franklin, TN
We are seeking a highly specialized and experienced Enterprise Data Scientist to drive data quality, standardization, and insight generation across our core Oracle operational suite. This role serves as the authoritative expert on translating complex, high-volume data from Oracle Supply Chain Management (SCM), Oracle Procurement, Oracle Revenue Cycle Management (RCM), and Oracle Inventory into actionable business intelligence.
The successful candidate will be focused on ensuring absolute data integrity, a critical function in a regulated healthcare environment, and on transforming raw transactional data into high-value operational reports, interactive dashboards, and predictive models that optimize cost-per-case, enhance inventory accuracy, and accelerate the revenue cycle.
**Essential Functions**
1. Data Validation, Integrity, and Compliance (Critical Focus)
+ Healthcare Data Quality Assurance: Design and implement automated data validation frameworks specific to healthcare operations, ensuring transactional data (e.g., supply usage, procedure charging, contract pricing) is accurate.
+ Compliance Verification: Develop reports and monitoring tools to detect anomalies and discrepancies that could impact regulatory reporting, financial audits (e.g., SOX implications), or compliance with GPO contracts and payer rules.
+ Revenue Leakage Identification: Specifically focus on validating the link between inventory consumption (SCM) and patient billing (RCM) data to prevent charge capture errors, ensuring accurate patient bills and maximizing appropriate reimbursement (an illustrative reconciliation sketch follows this list).
+ Root Cause Analysis: Investigate and diagnose data errors originating in Oracle system configurations (EBS or Fusion), ensuring the integrity of critical data points like item master definitions, vendor codes, and pricing tiers.
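To make the charge capture validation concrete, here is a minimal sketch in Python/pandas. It assumes two hypothetical flat extracts (scm_consumption.csv and rcm_charges.csv) with illustrative column names rather than actual Oracle SCM/RCM table structures; an anti-join surfaces consumed items that never generated a charge.

```python
import pandas as pd

# Hypothetical extracts; column names are illustrative, not Oracle table definitions.
consumption = pd.read_csv("scm_consumption.csv")   # item_id, case_id, qty_used, unit_cost
charges = pd.read_csv("rcm_charges.csv")           # item_id, case_id, qty_charged

# Left-join consumption to charges on case and item; rows with no match
# (or with a quantity mismatch) are candidate charge capture errors.
recon = consumption.merge(
    charges, on=["case_id", "item_id"], how="left", indicator=True
)
missing_charges = recon[recon["_merge"] == "left_only"]
qty_mismatch = recon[
    (recon["_merge"] == "both") & (recon["qty_used"] != recon["qty_charged"])
]

leakage_estimate = (missing_charges["qty_used"] * missing_charges["unit_cost"]).sum()
print(f"Unbilled consumption lines: {len(missing_charges)}")
print(f"Quantity mismatches: {len(qty_mismatch)}")
print(f"Estimated revenue leakage: ${leakage_estimate:,.2f}")
```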
2. Standardized Operational Analytics and Reporting
+ KPI Development (Healthcare Specific): Define, standardize, and institutionalize critical operational metrics across the organization, such as:
+ Inventory Accuracy Rate for Critical Supplies
+ Procurement Compliance Rate (Off-Contract Spend)
+ Days of Supply (DOS) for high-value pharmaceuticals and implants
+ Cost-Per-Case Variance analysis (linking supply cost to procedure type)
+ Claims Denial Rate Analysis linked to operational inputs
+ High-Value Reporting: Develop and maintain standardized operational reports and interactive dashboards (e.g., Tableau, Power BI) focused on optimizing the efficiency and spend within the OR, Clinics, and centralized purchasing departments.
+ Executive Insights: Create visually compelling and accurate reports for executive leadership on the overall health and financial performance driven by Oracle system outputs.
3. Advanced Modeling and Process Optimization
+ Predictive Inventory Modeling: Develop sophisticated models to forecast demand volatility (e.g., flu season spikes, pandemic-related surges) for critical supplies and pharmaceuticals, minimizing shortages and excess waste.
+ Revenue Cycle Modeling: Build predictive models to forecast cash flow, anticipate denials based on procurement/charging patterns, and prioritize RCM work queues based on expected return.
+ Efficiency Optimization: Utilize machine learning techniques to optimize logistics (e.g., warehouse routing, supply replenishment schedules) and procurement processes (e.g., automated purchase order generation based on consumption velocity).
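As one illustration of consumption-velocity-driven replenishment, the sketch below computes a classical reorder point (expected lead-time demand plus a service-level safety stock). It is a simplified model over synthetic usage data, not a prescription for the actual Oracle-based workflow.

```python
import numpy as np
from scipy.stats import norm

def reorder_point(daily_usage, lead_time_days, service_level=0.95):
    """Reorder point = expected demand over lead time + safety stock."""
    mean_daily = np.mean(daily_usage)
    std_daily = np.std(daily_usage, ddof=1)
    z = norm.ppf(service_level)                       # service-level z-score
    safety_stock = z * std_daily * np.sqrt(lead_time_days)
    return mean_daily * lead_time_days + safety_stock

# Example: a high-value implant consumed ~4/day with some volatility,
# 7-day replenishment lead time (numbers are illustrative only).
usage = np.random.default_rng(0).poisson(4, size=90)
rop = reorder_point(usage, lead_time_days=7)
print(f"Suggest raising a PO when on-hand + on-order drops below {rop:.0f} units")
```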
4. Collaboration and System Expertise
+ Serve as the technical data expert for functional Oracle teams (Finance, Clinical Operations, Materials Management), bridging the gap between business needs and data structure.
+ Document data lineage, metric definitions, and model methodologies to ensure transparency and trust in derived insights across the enterprise.
**Required Qualifications:**
+ Education: Master's degree in Data Science, Health Informatics, Statistics, Industrial Engineering, or a related quantitative field.
+ Experience: 2+ years of experience in a specialized data science, BI, or analytics role, working within a large healthcare system, hospital, or payer environment.
+ Deep Oracle Domain Expertise (Mandatory): Proven practical experience analyzing, querying, and understanding the complex data models within at least two of the following Oracle applications (EBS or Fusion):
+ Oracle Supply Chain Management (SCM) & Inventory: Specific understanding of item masters, warehouse transactions, and consumption data.
+ Oracle Procurement: Expertise in purchase order data, contract management, and vendor performance metrics.
+ Oracle Revenue Cycle Management (RCM): Understanding of charge capture, billing, and the data linkage to operational inputs.
**Technical Proficiency:**
+ Expert-level SQL skills for complex database querying, including experience navigating Oracle tables/views.
+ Proficiency in Python or R, with experience in statistical modeling, time series analysis, and machine learning libraries.
+ Experience developing advanced visualizations using industry-leading tools (Tableau, Power BI).
+ Demonstrable experience working with large-scale Enterprise Data Warehouses (EDW) in a regulated environment.
**Preferred Skills and Attributes:**
+ Familiarity with clinical coding standards (CPT, ICD-10) as they relate to procedure costing and RCM data.
+ Understanding of HIPAA, HITECH, and general healthcare data governance standards.
+ Experience with advanced analytics applied to surgical services or procedural areas.
+ Excellent collaboration and communication skills, with the ability to present complex analytical findings to clinical and executive audiences.
+ Certification in Oracle applications or cloud platforms is a plus.
Equal Employment Opportunity
This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment; Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
Data Scientist, Merchandising Analytics
Senior data scientist job in Brentwood, TN
The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using Machine Learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing.
The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential, as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role will also ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.
Essential Duties and Responsibilities (Min 5%)
* Work closely with key business partners to fully explore and frame a business question including objectives, goals, KPIs, decisions the analysis will support and required timing for deliverables.
* Extract available and relevant data from internal and external data sources to support data science solution development.
* Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
* Contribute to and assist the team with best practices in data governance, data engineering, and data architecture.
* Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
* Maintain a broad exposure to the wider ecosystem of AI / Machine Learning and ensure our team is pushing toward optimal solutions.
* Manage multiple projects simultaneously with limited oversight including development of new technologies and maintenance of existing framework.
* Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
* Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans (a worked sample-size sketch follows this list).
* Use proven, predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
* Be a 'Go-To' person for any data analytical needs ranging from data extraction/manipulation, long-term trend analysis, statistical analysis, and modeling techniques. Perform code reviews and debug with the team, and assist during implementation where necessary.
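For the A/B testing responsibility above, here is a minimal sample-size sketch using the standard two-proportion z-test formula; the baseline and target rates are illustrative placeholders, not TSC metrics.

```python
from scipy.stats import norm

def sample_size_per_arm(p_control, p_treatment, alpha=0.05, power=0.8):
    """Per-arm sample size for a two-sided two-proportion z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
    delta = abs(p_treatment - p_control)
    return (z_alpha + z_beta) ** 2 * variance / delta ** 2

# Example: detect a lift from a 3.0% to a 3.3% conversion-like metric.
n = sample_size_per_arm(0.030, 0.033)
print(f"~{n:,.0f} observations per arm")
```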
Required Qualifications
Experience: 3-5 years of proven experience as a Data Scientist, preferably in the retail or e-commerce industry. 2+ years of experience in predictive modeling utilizing CRM or transaction information is preferred.
Education: Bachelor's Degree in Mathematics, Statistics, or Econometrics; Master's Degree preferred. Any combination of education and experience will be considered.
Professional Certifications: None
Preferred knowledge, skills or abilities
* Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
* Deep expertise in writing and debugging complex SQL queries.
* Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
* Proven advanced modeling experience in leading data-driven projects from definition to execution, driving and influencing project roadmaps.
* Experience using Azure, AWS, or another cloud compute platform a plus.
* Familiarity with visualization tools such as Power BI and Tableau.
* Must possess a high degree of aptitude in communicating complex analysis results, both verbally and in writing, to Senior & Executive Leadership.
* Knowledge of A/B testing methods; capable of designing a controlled test, running the test, and providing post-hoc measurement.
* Proficiency with managing data repositories and version control systems like Git.
* Speak, read and write effectively in the English language.
Working Conditions
* Hybrid / Flexible working conditions
Physical Requirements
* Sitting
* Standing (not walking)
* Walking
* Kneeling/Stooping/Bending
* Lifting up to 10 pounds
Disclaimer
This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/ her supervisor.
Company Info
At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future.
Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service.
Please visit this link for more specific information about the benefits and leave policies applicable to the position you're applying for.
Bigdata / Hadoop Technical Lead
Senior data scientist job in Franklin, TN
E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************
Job Description
Technical/Functional Skills:
• Must have at least 1 full-scale Hadoop implementation from DEV to PROD
• Must have experience in Production Deployment Process for Big Data projects
• Must have experience in root cause analysis, trouble-shooting of Hadoop applications
• Must have significant experience in designing solutions using Cloudera Hadoop
• Must have significant experience with Java MapReduce, PIG, Hive, Sqoop and Oozie
• Must have significant experience with Unix Shell Scripts
• Exposure to Healthcare Provider domain
Roles & Responsibilities:
• Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
• Mentor, guide and train Team members on Big Data
• Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
• Designs, codes, tests, debugs, and documents moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
• Perform complex coding tasks using Java Map Reduce, PIG, Hive and Sqoop
• Review and certify code written by team members
• Ensures that established change management and other procedures are adhered to, and helps develop needed standards, procedures, and practices.
• Performance tuning with large data sets.
Generic Managerial Skills:
• Ability to lead the team and to plan, track, and manage work performed by team members
• Ability to work independently and communicate across multiple levels (Product owners, Executive sponsors, Team members)
Additional Information
All your information will be kept confidential according to EEO guidelines.
Big Data / Hadoop Technical Lead
Senior data scientist job in Franklin, TN
First IT Solutions provides a professional, cost effective recruitment solution. We take the time to understand your needs in great detail. With our dedicated and experienced Recruitment Consultants, our market knowledge, and our emphasis on quality and satisfaction, we pride ourselves on offering the best solution the first time.
Our consultants have substantial experience gained over many years placing talented individuals in Contract and Permanent positions within the technology industry. You can be sure that we understand the process well from your side of the desk. We started First IT to provide top quality service, something that clients complained was lacking at other recruiting and placement firms. At First IT, we strive continually to provide excellent service at all times.
Job Description
Technical/Functional Skills:
Minimum Experience Required: 8 years
Must have experience as a Tech Lead for Big data projects
Must have significant experience with architecting and designing solutions using Cloudera Hadoop
Must have significant experience with Python, Java Map Reduce, PIG, Hive, Hbase, Oozie and Sqoop
Must have significant experience with Unix Shell Scripts
Exposure to Healthcare Provider domain
Qualifications
Architect and Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
Estimate size, effort, complexity of solutions
Plan, track and report project status
Mentor, guide and train Team members on Big Data
Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
Prepare detailed specifications, diagrams, and other programming structures from which programs are written.
Designs, codes, tests, debugs, and documents moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
Perform complex coding tasks using Python, Java Map Reduce, PIG, Hive, Hbase, Oozie and Sqoop
Review and certify code written by team members
Ensures that established change management and other procedures are adhered to, and helps develop needed standards, procedures, and practices.
Performance tuning with large data sets.
Additional Information
Duration: Full Time
Eligibility: GC & US Citizens Only
Share the Profiles to ****************************
Contact:
************
Keep the subject line with Job Title and Location
Senior Data Architect
Senior data scientist job in Franklin, TN
Job Title: Sr. Data Architect
**About Censis**
Censis Technologies (*************************) is a global leader in surgical instrument tracking and asset management solutions. At the forefront of healthcare innovation, Censis, the first company to engineer a surgical asset management system that tracks down to the instrument and patient levels, has continually set the standards for the sterile processing industry. From the beginning, Censis has recognized the vital connection between perioperative innovation and efficiency, unparalleled customer care, and improved operational performance. By continuing to invest in technology, ease of integration, education and support, Censis provides solutions that empower hospitals and healthcare providers to stay compliant and ahead of healthcare's rapidly changing environment. With Censis, you're positioned to start ahead and stay ahead, no matter what the future holds.
Role Overview
Censis is seeking a highly experienced and innovative Sr. Data Architect to lead the design and implementation of modern data solutions using Microsoft Fabric, Lakehouse architecture, Power BI, semantic data modelling, and medallion architecture. The ideal candidate will have a strong foundation in data architecture, SQL Server, data pipelines, on-premises data integration using ODG (On-prem Data Gateway), and semantic layer development. This role demands a strategic thinker with hands-on expertise in building scalable, secure, and high-performance BI ecosystems, leveraging AI-driven development and delivery.
In addition to data architecture, the role will be responsible for building and leading an enterprise architecture function, ensuring technology alignment with business strategy and innovation objectives.
Key Responsibilities
**Enterprise Architecture Leadership**
+ Define and maintain enterprise architecture strategy aligning business objectives with technology capabilities, ensuring scalability, reliability, security, and compliance.
+ Lead, mentor, and scale a team of solution and enterprise architects, fostering a high-performance culture rooted in architectural excellence, innovation, and collaboration.
+ Lead architecture governance and standards, establishing frameworks and best practices and reviewing designs and processes across applications, data, interfaces, and infrastructure domains.
+ Drive cross-functional alignment by collaborating with business, IT, and engineering leaders to ensure technology roadmaps support organizational priorities and innovation.
+ Build the Enterprise Architecture team, fostering strong architectural excellence, knowledge sharing, and continuous improvement across the enterprise.
+ Evaluate emerging technologies and guide strategic investments in data platforms, AI tools, interfaces and automation to enhance healthcare efficiency and outcomes.
+ Build relationships with partners such as Microsoft, AWS and Service delivery partners to execute on the enterprise architecture vision and strategy.
**Architecture & Design**
+ Design and implement scalable BI and data architecture using Data Lake or Lakehouse paradigms, including medallion architecture (a minimal bronze-to-silver sketch follows this list).
+ Architect and optimize ELT/ETL pipelines using SQL Server, Dataflows, and data Pipelines.
+ Integrate on-premises data sources using On-prem Data Gateway (ODG) with cloud-based solutions.
+ Develop a robust semantic layer and underlying data model to bridge technical data and business language, applying data virtualization principles.
+ Design and optimize semantic models that represent data in a way that enhances understanding and usability for analytics.
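As a rough illustration of a medallion-style bronze-to-silver hop, the PySpark sketch below deduplicates and conforms raw events. The lake paths, column names, and Parquet output are placeholders under assumed naming, not the actual Censis or Fabric implementation.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-silver").getOrCreate()

# Bronze: raw landed data, exactly as ingested (path is a placeholder).
bronze = spark.read.parquet("abfss://lake@account.dfs.core.windows.net/bronze/instrument_events/")

# Silver: cleaned, deduplicated, conformed records ready for the semantic layer.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .filter(F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
)

silver.write.mode("overwrite").partitionBy("event_date").parquet(
    "abfss://lake@account.dfs.core.windows.net/silver/instrument_events/"
)
```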
**Development & Implementation**
+ Build and manage data pipelines, Lakehouses, Data Lakes, and Data Warehouses in Azure or AWS.
+ Manage Power BI dashboards with advanced DAX and real-time data integration.
+ Implement data governance, security, and compliance best practices.
+ Utilize AI-driven development and delivery to enhance solution effectiveness and efficiency.
+ Define data quality checks, transformations, and cleansing rules, and work with data engineers to implement them within the semantic layer.
+ Apply strong T-SQL knowledge, including materialized views, indexing, columnstore indexes, and dimensional data modelling.
**Monitoring & Optimization**
+ Monitor and optimize data pipeline performance and troubleshoot issues.
+ Ensure data quality, lineage, and availability across all reporting layers.
+ Maintain comprehensive documentation of architecture, data models, workflows, and semantic layer details.
**Required Skills & Qualifications**
+ Experience: 12+ years in data architecture, with at least 3-5 years in an enterprise or solution architecture leadership capacity.
+ Leadership: Proven experience building and leading cross-functional enterprise architecture teams and influencing enterprise-wide technology direction.
+ Expertise in semantic data modelling and data engineering skills.
+ Experience in architecture frameworks, best practices, designs and processes across applications, data, interfaces, and infrastructure domains.
+ Strong experience with data platforms such as Snowflake, Databricks, Microsoft Fabric, Azure Synapse, or Azure Data Gateway.
+ BI Tools: Working Knowledge in Power BI and integration with Microsoft Fabric is a plus.
+ Data Integration: Experience with interfaces, API design, and 3rd-party systems integration using tools such as Boomi, Azure API Management, and other middleware platforms.
+ Data Engineering: Proficient in designing ELT/ETL processes using SQL Server, columnar data format such as Delta or Parquet and Fabric Pipelines.
+ Architecture: Strong understanding of medallion architecture, data virtualization principles, cloud-based data management, and analytics technologies.
+ AI: Working knowledge of AI tools to accelerate development is a plus, such as GitHub Copilot, Cursor AI, Claude, or similar.
+ Programming: Expertise with Python, T-SQL, Spark or other scripting languages for data transformation.
+ Experience with programming languages and platforms such as C#, .NET, Node.js, and Vue.js is preferred.
+ Methodologies: Agile/Scrum project delivery experience.
+ Communication: Exceptional communication, strategic thinking, and stakeholder management skills; ability to bridge technical and business domains effectively.
+ Certifications: Certifications in Azure, Power BI, Microsoft Fabric, and other Data or Enterprise architecture platforms are a plus.
**Bonus or Equity**
This position is also eligible for bonus as part of the total compensation package.
**Fortive Corporation Overview**
Fortive's essential technology makes the world safer and more productive. We accelerate transformation across a broad range of applications including environmental, health and safety compliance, industrial condition monitoring, next-generation product design, and healthcare safety solutions.
We are a global industrial technology innovator with a startup spirit. Our forward-looking companies lead the way in software-powered workflow solutions, data-driven intelligence, AI-powered automation, and other disruptive technologies. We're a force for progress, working alongside our customers and partners to solve challenges on a global scale, from workplace safety in the most demanding conditions to groundbreaking sustainability solutions.
We are a diverse team 10,000 strong, united by a dynamic, inclusive culture and energized by limitless learning and growth. We use the proven Fortive Business System (FBS) to accelerate our positive impact.
At Fortive, we believe in you. We believe in your potential: your ability to learn, grow, and make a difference.
At Fortive, we believe in us. We believe in the power of people working together to solve problems no one could solve alone.
At Fortive, we believe in growth. We're honest about what's working and what isn't, and we never stop improving and innovating.
Fortive: For you, for us, for growth.
We Are an Equal Opportunity Employer. Fortive Corporation and all Fortive Companies are proud to be equal opportunity employers. We value and encourage diversity and solicit applications from all qualified applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity or expression, or other characteristics protected by law. Fortive and all Fortive Companies are also committed to providing reasonable accommodations for applicants with disabilities. Individuals who need a reasonable accommodation because of a disability for any part of the employment application process, please contact us at applyassistance@fortive.com.
**Pay Range**
The salary range for this position (in local currency) is 101,500.00 - 188,500.00
Data Engineer
Senior data scientist job in Spring Hill, TN
Job Description
We're seeking a Data Engineer to help shape the foundation of Zipliens' growing data ecosystem. Our engineering team supports a diverse set of tools and systems that power lien resolution operations, client transparency, and decision-making across the company. In this role, you'll design and maintain reliable data pipelines, optimize data storage and retrieval, and ensure that our systems deliver accurate, timely, and actionable insights. You'll collaborate closely with data analysts, product owners, and engineers to build scalable data infrastructure and contribute to data quality standards that will support the next generation of Zipliens applications.
Requirements
Responsibilities:
Develop and optimize SQL queries and database schemas for efficient data retrieval and storage.
Design, develop, and maintain scalable ETL processes.
Develop and maintain scripts for data processing.
Ensure scalability and performance optimization of data pipelines and queries.
Develop and implement data quality checks and monitoring to ensure data accuracy and reliability (a minimal sketch follows this list).
Contribute to the development and maintenance of data quality standards and best practices.
Collaborate with data analysts to understand requirements and deliver solutions that enable effective reporting and analytics.
Design and build reports to provide actionable insights to stakeholders.
Document data models, ETL processes, and reporting solutions.
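Below is a minimal sketch of what an automated data quality check might look like in Python/pandas. The column names (lien_id, lien_amount, updated_at) are hypothetical and only illustrate the kinds of null, uniqueness, range, and freshness rules described above.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable data quality failures (empty list = clean)."""
    failures = []
    if df["lien_id"].isna().any():
        failures.append("lien_id contains nulls")
    if df["lien_id"].duplicated().any():
        failures.append("lien_id is not unique")
    if (df["lien_amount"] < 0).any():
        failures.append("negative lien_amount values found")
    # Freshness check: assumes updated_at is stored as timezone-aware UTC.
    if df["updated_at"].max() < pd.Timestamp.now(tz="UTC") - pd.Timedelta(days=2):
        failures.append("no rows updated in the last 2 days (stale feed)")
    return failures

# Tiny illustrative run with made-up records.
sample = pd.DataFrame({
    "lien_id": [1001, 1002, 1002],
    "lien_amount": [2500.0, -40.0, 130.0],
    "updated_at": pd.to_datetime(["2020-01-01", "2020-01-02", "2020-01-03"], utc=True),
})
print(run_quality_checks(sample))
```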
Qualifications:
Bachelor's degree in Business, Computer Information Systems, Computer Science, or equivalent practical experience.
4+ years of experience as a Data Engineer, Senior Data Analyst, or similar role.
Strong proficiency in SQL and experience with relational and cloud-based data storage solutions (e.g., PostgreSQL, MySQL, SQL Server, Snowflake, Redshift, BigQuery).
Experience with ETL tools and techniques.
Experience with a general-purpose programming language (Python preferred).
Familiarity with data warehousing concepts and data modeling principles.
Understanding of cloud platforms (e.g., AWS, Azure, GCP) and their data services is a plus.
Strong analytical and problem-solving skills with a focus on data quality, performance, and reliability.
Collaborative mindset with the ability to communicate effectively with stakeholders.
This role requires on-site presence at least three days per week (60%) in our Spring Hill, TN office.
Benefits
Comprehensive Health Benefits (Medical, Dental, and Vision), including HSA with employer contributions, FSA, and Dependent Care FSA
Company-Paid Life Insurance and Short-Term Disability
401(k) Plan with Company Match
Paid Time Off (Vacation, Sick Leave, and 10 Holidays)
Paid Parental Leave
Pay Disclosure: The total base salary range for this role is $89,000 - $120,000 annually, with an opportunity for a discretionary bonus. Final compensation will be determined based on skills and experience.
Data Engineer
Senior data scientist job in Franklin, TN
Title: Data Engineer
Type: 6 months (contract to hire)
Rate: Open
Requirements
5+ years of experience developing software using an object-oriented or functional language
5+ years of SQL
3+ years working with open source Big Data technology stacks (Apache Nifi, Spark, Kafka, HBase, Hadoop/HDFS, Hive, Drill, Pig, etc.) or commercial open source Big Data technology stacks (Hortonworks, Cloudera, etc.)
3+ years with document databases (e.g. MongoDB, Accumulo, etc.)
3+ years of experience using Agile development processes (e.g. developing and estimating user stories, sprint planning, sprint retrospectives, etc.)
2+ years of distributed version control system (e.g. git)
3+ years of experience in cloud-based development and delivery
Familiarity with distributed computing patterns, techniques, and technologies (e.g. ESB)
Familiarity with continuous delivery technologies (e.g. Puppet, Chef, Ansible, Docker, Vagrant, etc.)
Familiarity with build automation and continuous integration tools (e.g. Maven, Jenkins, Bamboo, etc.)
Familiarity with Agile process management tools (e.g. Atlassian Jira)
Familiarity with test automation (Selenium, SoapUI, etc.)
Good software development and Object Oriented programming skills.
Strong analytical skills and the ability to work with end users to transform requests into robust solutions.
Excellent oral and written communication skills.
Initiative and self-motivation to work independently on projects.
Benefits
Note: If interested, please send your updated resume and include your salary requirement along with your contact details and a suitable time when we can reach you. If you know of anyone in your sphere of contacts who would be a perfect match for this job, we would appreciate it if you could forward this posting to them with a copy to us. We look forward to hearing from you at the earliest!
Data Engineer - Archimedes
Senior data scientist job in Brentwood, TN
Company: Archimedes
About Us: Archimedes - Transforming the Specialty Drug Benefit. Archimedes is the industry leader in specialty drug management solutions. Founded with the goal of transforming the PBM industry to provide the necessary ingredients for the sustainability of the prescription drug benefit (alignment, value, and transparency), Archimedes achieves superior results for clients by eliminating tightly held PBM conflicts of interest, including drug spread, rebate retention, and pharmacy ownership, and by delivering the most rigorous clinical management at the lowest net cost.
Current associates must use the SSO login option at ************************************ to be considered for internal opportunities.
Pay Range: USD $0.00 - USD $0.00 /Yr.
STAR Bonus % (At Risk Maximum): 10.00 - Manager, Clinical Mgr, Pharm Supvr, CAE, Sr CAE I
Work Schedule Description (e.g. M-F 8am to 5pm): Core Business Hours
Overview
The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support enterprise analytics, reporting, and operational data flows. This role plays a critical part in enabling data-driven decision-making across Centene and SPM by ensuring the availability, integrity, and performance of data systems. The Data Engineer collaborates with data scientists, analysts, software developers, and business stakeholders to deliver robust ETL solutions, optimize data storage and retrieval, and implement secure, compliant data architectures in cloud and hybrid environments.
Operating within a healthcare-focused, compliance-heavy landscape, the Data Engineer ensures that data platforms align with regulatory standards such as HIPAA and SOC 2, while embedding automation and CI/CD practices into daily workflows. The role supports both AWS and Azure environments, leveraging cloud-native services and modern tooling to streamline data ingestion, transformation, and delivery.
Responsibilities
Job Responsibilities:
Design, develop, and maintain ETL pipelines for structured and unstructured data across cloud and on-prem environments.
Build and optimize data models, schemas, and storage solutions in SQL Server, PostgreSQL, and cloud-native databases.
Implement CI/CD workflows for data pipeline deployment and monitoring using tools such as GitHub Actions, Azure DevOps, or Jenkins.
Develop and maintain data integrations using AWS Glue, Azure Data Factory, Lambda, EventBridge, and other cloud-native services.
Ensure data quality, lineage, and governance through automated validation, logging, and monitoring frameworks.
Collaborate with cross-functional teams to gather requirements, design scalable solutions, and support analytics and reporting needs.
Monitor and troubleshoot data pipeline performance, latency, and failures; implement proactive alerting and remediation strategies.
Support data security and compliance by enforcing access controls, encryption standards, and audit logging aligned with HIPAA and SOC 2.
Maintain documentation for data flows, architecture diagrams, and operational procedures.
Participate in sprint planning, code reviews, and agile ceremonies to support iterative development and continuous improvement.
Evaluate and integrate new data tools, frameworks, and cloud services to enhance platform capabilities.
Partner with DevOps and Security teams to ensure infrastructure-as-code and secure deployment practices are followed.
Participate in, adhere to, and support compliance, people and culture, and learning programs.
Perform other duties as assigned.
Qualifications
Essential Background Requirements:
Education: Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field required. Master's degree preferred.
Certification/Licenses: AWS Certified Data Analytics or Solutions Architect required. Microsoft Certified: Azure Data Engineer Associate required. Certified Data Management Professional (CDMP) required.
Experience:
5+ years of experience in data engineering, ETL development, or cloud data architecture required.
Proven experience with SQL, ETL tools, and CI/CD pipelines required.
Hands-on experience with AWS and Azure data services and infrastructure required.
Familiarity with data governance, compliance frameworks (HIPAA, SOC 2), and secure data handling practices required.
Familiarity with CI/CD pipelines, automated testing, and version control systems required.
Skills & Technologies:
Languages & Tools: SQL, Python, Bash, Git, Terraform, PowerShell
ETL & Orchestration: AWS Glue, Azure Data Factory, Apache Airflow (a minimal Airflow DAG sketch follows this list)
CI/CD: GitHub Actions, Azure DevOps, Jenkins
Cloud Platforms: AWS (S3, Lambda, RDS, Redshift), Azure (Blob Storage, Synapse, Functions)
Monitoring & Logging: CloudWatch, Azure Monitor, ELK Stack
Data Governance: Data cataloging, lineage tracking, encryption, and access control.
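As a rough sketch of pipeline orchestration with Apache Airflow (listed above), the DAG below wires a placeholder extract step to a placeholder load step. It assumes Airflow 2.x, and the DAG and task names are hypothetical rather than the actual Archimedes pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull the day's extracts from the source system."""


def load():
    """Placeholder: write validated records to the warehouse."""


with DAG(
    dag_id="claims_daily_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task           # run load only after extract succeeds
```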
Location: 5250 Virginia Way, Ste 300, Brentwood, TN 37027, US
Data Platform Engineer
Senior data scientist job in Brentwood, TN
Job Description
Position: Data Platform Engineer
The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.
Responsibilities
Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark.
Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
Ensure data quality, governance, and security across the data lifecycle.
Collaborate with product managers by estimating technical tasks and deliverables.
Uphold the mission and values of Monogram Health in all aspects of your role and activities.
Position Requirements
A bachelor's degree in computer science, data science, software engineering or related field.
Minimum of five (5) years of design and hands-on development experience in cloud-based analytics solutions, including a minimum of three (3) years of hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
Expert level knowledge of Python or other scripting languages required.
Proficiency in SQL and other data query languages.
Understanding of data modeling and schema design principles
Ability to work with large datasets and perform data analysis
Designing and building data integration pipelines using APIs and streaming ingestion methods is desirable (a minimal streaming-ingest sketch follows this list).
Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
Thorough understanding of Azure Cloud Infrastructure offerings.
Demonstrated problem-solving and troubleshooting skills.
Team player with demonstrated written and communication skills.
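As a hedged illustration of streaming ingestion with Spark, the sketch below reads a Kafka topic with Structured Streaming and appends the raw payloads to a landing path. The broker address, topic name, and paths are placeholders, and running it requires the spark-sql-kafka connector package on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-ingest").getOrCreate()

# Read a Kafka topic as a stream; broker and topic names are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "care-events")
    .load()
)

# Kafka delivers key/value as binary, so cast the value to a string before parsing.
payloads = events.select(F.col("value").cast("string").alias("payload"))

# Append raw payloads to a landing path, with checkpointing for fault-tolerant progress tracking.
query = (
    payloads.writeStream.format("parquet")
    .option("path", "/mnt/landing/care_events")
    .option("checkpointLocation", "/mnt/checkpoints/care_events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```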
Benefits
Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
Wellness & Growth - Work life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts
Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counselling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home.
Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
Data Engineer
Senior data scientist job in Brentwood, TN
OPPORTUNITY
We are seeking an experienced Data Engineer with 2-3+ years of hands-on experience to design, build, and maintain robust data solutions within our Azure/Microsoft-centric technology stack. The ideal candidate will thrive in a collaborative yet independent work environment, delivering high-quality analytics solutions that drive business insights and decision-making.
SCOPE OF WORK
* Design and implement scalable data models and pipelines using modern data engineering practices
* Develop and maintain production-grade analytics solutions within Azure ecosystem
* Build comprehensive dashboards and reports using Power BI to support business stakeholders
* Collaborate with cross-functional teams to translate business requirements into technical solutions
* Optimize SQL queries and data processing workflows for performance and reliability
* Support and enhance existing data infrastructure while implementing new analytics capabilities
* Participate in code reviews and maintain high standards for data quality and documentation
* Respond quickly to critical data issues and provide solutions under tight deadlines
IDEAL CANDIDATE PROFILE
Required Qualifications
* 2-3+ years of experience in analytics engineering, data engineering, or similar role
* Advanced SQL proficiency with experience in complex query optimization and database design
* Python programming skills for data manipulation, automation, and analytics
* Power BI expertise including dashboard development, DAX, and data visualization best practices
* Azure cloud platform experience including Azure Data Factory, Azure SQL, and related services
* Data modeling experience with dimensional modeling, star/snowflake schemas, and ETL/ELT processes
* Production environment experience including deployment, monitoring, and maintenance of data systems
* Strong communication skills with ability to explain technical concepts to non-technical stakeholders
* Proven ability to work independently and manage multiple priorities effectively
* Experience working under pressure with quick turnaround requirements
Preferred Qualifications
* Snowflake and/or Databricks experience with modern cloud data platforms
* Private equity or financial services background - understanding of investment data, portfolio management, or financial reporting
* Machine Learning experience including model development, deployment, or MLOps practices
* Experience with additional Azure services (Synapse, Logic Apps, Power Platform)
* Knowledge of data governance and compliance frameworks
* Experience with version control (Git) and CI/CD practices
* Advanced Python libraries experience (pandas, scikit-learn, etc.)
Data Engineer
Senior data scientist job in Goodlettsville, TN
Work Where You Matter At Dollar General, our mission is Serving Others! We value each and every one of our employees. Whether you are looking to launch a new career in one of our many convenient Store locations, Distribution Centers, Store Support Center or with our Private Fleet Team, we are proud to provide a wide range of career opportunities. We are not just a retail company; we are a company that values the unique strengths and perspectives that each individual brings. Your difference truly makes a difference at Dollar General. How would you like to Serve? Join the Dollar General Journey and see how your career can thrive.
Company Overview
The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. This role will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
Job Details
* Advanced SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
* Assemble large, complex data sets that meet functional / non-functional business requirements.
* Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud technologies.
* Build analytics tools that utilize the data pipelines to provide actionable insights into operational efficiency and other key business performance metrics.
* Work with stakeholders including the Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
* Knowledge of programming languages (e.g. Python, Java, C#)
* Hands-on experience with SQL database design
* Great numerical and analytical skills
* Degree in Computer Science, IT, or similar field; a Master's is a plus
* Data engineering certification (e.g. Certified Data Engineer) is a plus
* Experience with big data tools: Hadoop, Spark, Kafka, etc.
* Experience with relational SQL and NoSQL databases, including MongoDB, Postgres, and Cassandra.
* Experience with data pipeline and workflow management tools: Composer, Dataflow, Airflow, etc.
* Experience with Snowflake/Azure or GCP cloud services: GCE, GCS, EMR, RDS, Redshift, etc.
* Experience with stream-processing systems: Google Cloud Dataflow, Apache Storm, Spark-Streaming, etc.
Qualifications
* Degree in information technology or computer science
* BS or MS degree in Computer Science or a related technical field
* 4+ years of Python or Java development experience
* 4+ years of SQL experience (No-SQL experience is a plus)
* 4+ years of experience with schema design and dimensional data modeling
Enterprise Data Scientist (Rev Cycle & Supply Chain)
Senior data scientist job in Franklin, TN
In this role, you will leverage your expertise in data analytics, advanced statistical methods, and programming to derive insights from clinical data. As a member of the Enterprise Data Science team, you will be responsible for analyzing complex clinical datasets, developing data visualizations and dashboards, assisting with data model and/or feature development, and translating insights into actionable recommendations for improving patient care and operational outcomes.
Responsibilities:
+ Collaborate with cross-functional teams including clinical leaders, data scientists, and software engineers to identify data-driven opportunities for enhancing clinical processes and patient care.
+ Utilize cloud-based technologies, such as Google Cloud Platform (GCP), for scalable data processing and analysis.
+ Develop easily consumable dashboards from complex clinical and operational data to provide visualizations derived from best practices and data consumption theory that drive healthcare and business performance.
+ Implement best practices for data management, including data quality assessment, data validation, and data governance.
+ Utilize Python programming and associated libraries such as PyTorch, Keras, Pandas, and NumPy to create, train, test, and implement meaningful data science models (a minimal training-loop sketch follows this list).
+ Lead the analysis of operational data to identify patterns, trends, and correlations relevant to healthcare outcomes.
+ Collaborate with healthcare professionals and domain experts to understand operational needs and design data-driven solutions.
+ Design and conduct experiments, interpret results, and communicate findings to both technical and non-technical stakeholders.
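To illustrate the kind of model-building workflow referenced above, here is a minimal PyTorch training loop on synthetic data; the features, target, and network are purely illustrative and not a clinical model.

```python
import torch
from torch import nn

# Purely illustrative: a tiny classifier over synthetic operational features.
torch.manual_seed(0)
X = torch.randn(512, 12)                     # 512 encounters, 12 engineered features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).float()    # synthetic binary target

model = nn.Sequential(nn.Linear(12, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X).squeeze(1), y)   # logits vs. binary target
    loss.backward()
    optimizer.step()

with torch.no_grad():
    acc = ((model(X).squeeze(1) > 0).float() == y).float().mean()
print(f"Training accuracy on the synthetic sample: {acc:.2%}")
```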
Requirements:
+ Master's degree in Data Science, Data Analytics, Computer Science, or a related field.
+ Proven experience in analyzing complex healthcare data and building customer-facing dashboards and data visualizations.
+ Proficiency in Python programming and associated libraries.
+ Experience with cloud-based platforms such as Google Cloud Platform (GCP) for data storage, processing, and deployment.
+ Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment.
+ Excellent communication and presentation skills with the ability to translate technical concepts to non-technical audiences.
Equal Employment Opportunity
This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment; Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
Data Scientist, Merchandising Analytics
Senior data scientist job in Brentwood, TN
The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using Machine Learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing.
The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role work will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.
**Essential Duties and Responsibilities (Min 5%)**
+ Work closely with key business partners to fully explore and frame a business question including objectives, goals, KPIs, decisions the analysis will support and required timing for deliverables.
+ Extracting available and relevant data from internal and external data sources to perform data science solution development.
+ Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
+ Contribute and assist the team with best practices in data governance, data engineering, and data architecture.
+ Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
+ Maintain broad exposure to the wider AI / Machine Learning ecosystem and ensure the team is pushing toward optimal solutions.
+ Manage multiple projects simultaneously with limited oversight including development of new technologies and maintenance of existing framework.
+ Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
+ Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans (see the sample-size sketch after this list).
+ Use proven predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
+ Be the 'go-to' person for data analytics needs, ranging from data extraction and manipulation to long-term trend analysis, statistical analysis, and modeling techniques. Perform code review and debugging with the team and assist during implementation where necessary.
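As a rough illustration of the sample-size planning called out in the A/B testing bullet above, the sketch below runs a two-proportion power calculation in Python with statsmodels; the baseline conversion rate, expected lift, and power targets are hypothetical assumptions, not TSC figures.

```python
# Hypothetical sample-size calculation for a two-proportion A/B test.
# Baseline conversion rate and expected lift are illustrative assumptions.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_rate = 0.05      # assumed control conversion rate
expected_rate = 0.055     # assumed treatment rate (10% relative lift)

effect_size = proportion_effectsize(expected_rate, baseline_rate)
n_per_arm = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,           # two-sided significance level
    power=0.80,           # desired statistical power
    ratio=1.0,            # equal allocation between control and treatment
)
print(f"Approximate sample size per arm: {n_per_arm:,.0f}")
```

Solving for the per-arm sample size with the effect size, alpha, and power fixed is a standard way to scope a test before launch; the same figures feed directly into the evaluation plan.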
**Required Qualifications**
_Experience_ : 3-5 years of proven experience as a Data Scientist, preferably in the retail or e-commerce industry. 2+ years of experience in predictive modeling using CRM or transaction information preferred.
_Education_ : Bachelor's Degree in Mathematics, Statistics, or Econometrics. Master's Degree preferred. Any combination of education and experience will be considered.
_Professional Certifications_ : None
**Preferred knowledge, skills or abilities**
+ Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
+ Deep expertise in writing and debugging complex SQL queries.
+ Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
+ Proven advanced modeling experience in leading data-driven projects from definition to execution, driving and influencing project roadmaps.
+ Experience using Azure, AWS, or another cloud compute platform is a plus.
+ Familiarity with visualization tools such as Power BI and Tableau.
+ Must possess a high degree of aptitude for communicating complex analysis results, both verbally and in writing, to Senior & Executive Leadership.
+ Knowledge of A/B testing methods; capable of designing a controlled test, running the test, and providing post-hoc measurement.
+ Proficiency with managing data repositories and version control systems such as Git.
+ Speak, read and write effectively in the English language.
**Working Conditions**
+ Hybrid / Flexible working conditions
**Physical Requirements**
+ Sitting
+ Standing (not walking)
+ Walking
+ Kneeling/Stooping/Bending
+ Lifting up to 10 pounds
**Disclaimer**
_This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/ her supervisor._
**Company Info**
At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future.
Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service.
Please visit this link (********************************************************************** for more specific information about the benefits and leave policies applicable to the position you're applying for.
**ALREADY A TEAM MEMBER?**
You must apply or refer a friend through our internal portal
Click here (**************************************************************************
**CONNECTION**
Our Mission and Values are more than just words on the wall - they're the one constant in an ever-changing environment and the bedrock on which we build our culture. They're the core of who we are and the foundation of every decision we make. It's not just what we do that sets us apart, but how we do it.
**EMPOWERMENT**
We believe in managing your time for business and personal success, which is why we empower our Team Members to lead balanced lives through our benefits and total rewards offerings for full-time and eligible part-time TSC and Petsense Team Members. We care about what you care about!
**OPPORTUNITY**
A lot of care goes into providing legendary service at Tractor Supply Company, which is why our Team Members are our top priority. Want a career with a clear path for growth? Your Opportunity is Out Here at Tractor Supply and Petsense.
Bigdata / Hadoop Technical Lead
Senior data scientist job in Franklin, TN
E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting, and Temp-to-Hire. In addition, our industry expertise and knowledge within the financial services, insurance, telecom, manufacturing, technology, media and entertainment, pharmaceutical, healthcare, and service industries ensure our services are customized to meet specific needs. For more details please visit our website ******************
Job Description
Technical/Functional Skills:
• Must have at least 1 full-scale Hadoop implementation from DEV to PROD
• Must have experience in Production Deployment Process for Big Data projects
• Must have experience in root cause analysis, trouble-shooting of Hadoop applications
• Must have significant experience in designing solutions using Cloudera Hadoop
• Must have significant experience with Java MapReduce, PIG, Hive, Sqoop and Oozie
• Must have significant experience with Unix Shell Scripts
• Exposure to Healthcare Provider domain
Roles & Responsibilities:
• Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
• Mentor, guide and train Team members on Big Data
• Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
• Design, code, test, debug, and document moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
• Perform complex coding tasks using Java MapReduce, PIG, Hive, and Sqoop (a minimal MapReduce-style sketch follows this list)
• Review and certify code written by team members
• Ensure that established change management and other procedures are adhered to, and help develop needed standards, procedures, and practices.
• Performance tuning with large data sets.
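To make the MapReduce-style coding tasks above concrete, here is a minimal sketch using Hadoop Streaming with Python rather than the Java MapReduce named in the posting; the tab-delimited record layout and provider-ID field are assumptions for illustration only.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming job (Python in place of Java MapReduce):
# counts records per provider ID from tab-delimited input.
# The record layout (provider ID in column 0) is a hypothetical assumption.
# Typically wired up via the Hadoop Streaming jar with -mapper / -reducer options.
import sys

def mapper():
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            print(f"{fields[0]}\t1")              # emit: provider_id <tab> 1

def reducer():
    current_key, count = None, 0
    for line in sys.stdin:                        # input arrives sorted by key
        key, value = line.rstrip("\n").split("\t", 1)
        if key == current_key:
            count += int(value)
        else:
            if current_key is not None:
                print(f"{current_key}\t{count}")
            current_key, count = key, int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")          # flush the last key

if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    mapper() if mode == "map" else reducer()
```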
Generic Managerial Skills:
• Ability to lead the team and to plan, track, and manage work performed by team members
• Ability to work independently and communicate across multiple levels (Product owners, Executive sponsors, Team members)
Additional Information
All your information will be kept confidential according to EEO guidelines.
Senior Data Architect
Senior data scientist job in Franklin, TN
Job Title: Sr. Data Architect
About Censis
Censis Technologies (************************* is a global leader in surgical instrument tracking and asset management solutions. At the forefront of healthcare innovation, Censis, the first company to engineer a surgical asset management system that tracks down to the instrument and patient levels, has continually set the standards for the sterile processing industry. From the beginning, Censis has recognized the vital connection between perioperative innovation and efficiency, unparalleled customer care and improved operational performance. By continuing to invest in technology, ease of integration, education and support, Censis provides solutions that empower hospitals and healthcare providers to stay compliant and ahead of healthcare's rapidly changing environment. With Censis, you're positioned to start ahead and stay ahead, no matter what the future holds.
Role Overview
Censis is seeking a highly experienced and innovative Sr. Data Architect to lead the design and implementation of modern data solutions using Microsoft Fabric, Lakehouse architecture, Power BI, semantic data modelling, and medallion architecture. The ideal candidate will have a strong foundation in data architecture, SQL Server, data pipelines, on-premises data integration using ODG (On-prem Data Gateway), and semantic layer development. This role demands a strategic thinker with hands-on expertise in building scalable, secure, and high-performance BI ecosystems, leveraging AI-driven development and delivery.
In addition to data architecture, the role will be responsible for building and leading an enterprise architecture function, ensuring technology alignment with business strategy and innovation objectives.
Key Responsibilities
Enterprise Architecture Leadership
Define and maintain enterprise architecture strategy aligning business objectives with technology capabilities, ensuring scalability, reliability, security, and compliance.
Lead, mentor, and scale a team of solution and enterprise architects, fostering a high-performance culture rooted in architectural excellence, innovation, and collaboration.
Lead architecture governance and standards, establishing frameworks and best practices and reviewing designs and processes across applications, data, interfaces, and infrastructure domains.
Drive cross-functional alignment by collaborating with business, IT, and engineering leaders to ensure technology roadmaps support organizational priorities and innovation.
Build the Enterprise Architecture team, fostering strong architectural excellence, knowledge sharing, and continuous improvement across the enterprise.
Evaluate emerging technologies and guide strategic investments in data platforms, AI tools, interfaces and automation to enhance healthcare efficiency and outcomes.
Build relationships with technology and service delivery partners such as Microsoft and AWS to execute on the enterprise architecture vision and strategy.
Architecture & Design
Design and implement scalable BI and data architecture using Data Lake or Lakehouse paradigms, including medallion architecture (a minimal bronze-to-silver sketch follows this subsection).
Architect and optimize ELT/ETL pipelines using SQL Server, Dataflows, and data Pipelines.
Integrate on-premises data sources using On-prem Data Gateway (ODG) with cloud-based solutions.
Develop a robust semantic layer and underlying data model to bridge technical data and business language, applying data virtualization principles.
Design and optimize semantic models that represent data in a way that enhances understanding and usability for analytics.
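As a minimal sketch of the medallion pattern referenced above, the PySpark snippet below promotes raw bronze records to a cleansed silver table on Delta; the paths, table, and column names are hypothetical placeholders, not Censis specifics.

```python
# Minimal bronze -> silver medallion step (PySpark + Delta); paths and
# column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_bronze_to_silver").getOrCreate()

# Bronze: raw instrument-tracking events landed as-is.
bronze = spark.read.format("delta").load("/lakehouse/bronze/instrument_events")

# Silver: cleansed, deduplicated, conformed records ready for the semantic layer.
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())                # basic data quality rule
    .withColumn("event_ts", F.to_timestamp("event_ts"))   # enforce types
    .dropDuplicates(["event_id"])                          # remove replayed events
    .select("event_id", "instrument_id", "facility_id", "event_ts", "status")
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .save("/lakehouse/silver/instrument_events"))
```

The gold layer would then aggregate silver tables into business-facing models that the semantic layer and Power BI datasets sit on top of.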
Development & Implementation
Build and manage data pipelines, Lakehouses, Data Lakes, and Data Warehouses in Azure or AWS.
Manage Power BI dashboards with advanced DAX and real-time data integration.
Implement data governance, security, and compliance best practices.
Utilize AI-driven development and delivery to enhance solution effectiveness and efficiency.
Define data quality checks, transformations, and cleansing rules, and work with data engineers to implement them within the semantic layer.
Apply strong T-SQL knowledge, including materialized views, indexing, columnstore indexes, and dimensional data modelling.
Monitoring & Optimization
Monitor and optimize data pipeline performance and troubleshoot issues.
Ensure data quality, lineage, and availability across all reporting layers.
Maintain comprehensive documentation of architecture, data models, workflows, and semantic layer details.
Required Skills & Qualifications
Experience: 12+ years in data architecture, with at least 3-5 years in an enterprise or solution architecture leadership capacity.
Leadership: Proven experience building and leading cross-functional enterprise architecture teams and influencing enterprise-wide technology direction.
Expertise in semantic data modelling and strong data engineering skills.
Experience in architecture frameworks, best practices, designs and processes across applications, data, interfaces, and infrastructure domains.
Strong experience with data platforms such as Snowflake, Databricks, Microsoft Fabric, Azure Synapse, or Azure Data Gateway.
BI Tools: Working knowledge of Power BI; experience integrating with Microsoft Fabric is a plus.
Data Integration: Experience with interfaces, API designs, and 3rd-party systems integration using tools such as Boomi, Azure API Management, and other middleware platforms.
Data Engineering: Proficient in designing ELT/ETL processes using SQL Server, columnar data format such as Delta or Parquet and Fabric Pipelines.
Architecture: Strong understanding of medallion architecture, data virtualization principles, cloud-based data management, and analytics technologies.
AI: Working knowledge of AI tools to accelerate development is a plus, such as Github Copilot, Cursor AI, Claude or similar.
Programming: Expertise with Python, T-SQL, Spark or other scripting languages for data transformation.
Experience in programming languages such as C#, the .NET platform, Node.js, and Vue.js is preferred.
Methodologies: Agile/Scrum project delivery experience.
Communication: Exceptional communication, strategic thinking, and stakeholder management skills; ability to bridge technical and business domains effectively.
Certifications: Certifications in Azure, Power BI, Microsoft Fabric and other Data or Enterprise architecture platforms are a plus.
Bonus or Equity
This position is also eligible for bonus as part of the total compensation package.
Fortive Corporation Overview
Fortive's essential technology makes the world safer and more productive. We accelerate transformation across a broad range of applications including environmental, health and safety compliance, industrial condition monitoring, next-generation product design, and healthcare safety solutions.
We are a global industrial technology innovator with a startup spirit. Our forward-looking companies lead the way in software-powered workflow solutions, data-driven intelligence, AI-powered automation, and other disruptive technologies. We're a force for progress, working alongside our customers and partners to solve challenges on a global scale, from workplace safety in the most demanding conditions to groundbreaking sustainability solutions.
We are a diverse team 10,000 strong, united by a dynamic, inclusive culture and energized by limitless learning and growth. We use the proven Fortive Business System (FBS) to accelerate our positive impact.
At Fortive, we believe in you. We believe in your potential-your ability to learn, grow, and make a difference.
At Fortive, we believe in us. We believe in the power of people working together to solve problems no one could solve alone.
At Fortive, we believe in growth. We're honest about what's working and what isn't, and we never stop improving and innovating.
Fortive: For you, for us, for growth.
About Censis
Censis, the first company to engineer a surgical asset management system that tracks down to the instrument and patient levels, has continually set the standards for the sterile processing industry. From the beginning, Censis has recognized the vital connection between perioperative innovation and efficiency, unparalleled customer care and improved operational performance. By continuing to invest in technology, ease of integration, education and support, Censis provides solutions that empower hospitals and healthcare providers to stay compliant and ahead of healthcare's rapidly changing environment. With Censis, you're positioned to start ahead and stay ahead, no matter what the future holds.
We Are an Equal Opportunity Employer. Fortive Corporation and all Fortive Companies are proud to be equal opportunity employers. We value and encourage diversity and solicit applications from all qualified applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity or expression, or other characteristics protected by law. Fortive and all Fortive Companies are also committed to providing reasonable accommodations for applicants with disabilities. Individuals who need a reasonable accommodation because of a disability for any part of the employment application process, please contact us at applyassistance@fortive.com.
Data Engineer
Senior data scientist job in Spring Hill, TN
We're seeking a Data Engineer to help shape the foundation of Zipliens' growing data ecosystem. Our engineering team supports a diverse set of tools and systems that power lien resolution operations, client transparency, and decision-making across the company. In this role, you'll design and maintain reliable data pipelines, optimize data storage and retrieval, and ensure that our systems deliver accurate, timely, and actionable insights. You'll collaborate closely with data analysts, product owners, and engineers to build scalable data infrastructure and contribute to data quality standards that will support the next generation of Zipliens applications.
Requirements
Responsibilities:
Develop and optimize SQL queries and database schemas for efficient data retrieval and storage.
Design, develop, and maintain scalable ETL processes.
Develop and maintain scripts for data processing.
Ensure scalability and performance optimization of data pipelines and queries.
Develop and implement data quality checks and monitoring to ensure data accuracy and reliability (a small example follows this list).
Contribute to the development and maintenance of data quality standards and best practices.
Collaborate with data analysts to understand requirements and deliver solutions that enable effective reporting and analytics.
Design and build reports to provide actionable insights to stakeholders.
Document data models, ETL processes, and reporting solutions.
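As one small illustration of the data quality checks mentioned in the list above, the sketch below asserts a few batch-level rules on a pandas DataFrame before it would be loaded downstream; the lien columns and status codes are invented for the example.

```python
# Illustrative data quality checks before loading lien records downstream.
# The columns (lien_id, amount, status) and status codes are hypothetical.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []
    if df["lien_id"].isna().any():
        failures.append("null lien_id values found")
    if df["lien_id"].duplicated().any():
        failures.append("duplicate lien_id values found")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    if not df["status"].isin({"open", "resolved", "disputed"}).all():
        failures.append("unexpected status codes found")
    return failures

if __name__ == "__main__":
    batch = pd.DataFrame(
        {"lien_id": [1, 2, 2],
         "amount": [100.0, -5.0, 30.0],
         "status": ["open", "resolved", "pending"]}
    )
    for problem in run_quality_checks(batch):
        print("DATA QUALITY FAILURE:", problem)
```

In practice the same checks would run inside the ETL pipeline and feed a monitoring alert rather than a print statement.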
Qualifications:
Bachelor's degree in Business, Computer Information Systems, Computer Science, or equivalent practical experience.
4+ years of experience as a Data Engineer, Senior Data Analyst, or similar role.
Strong proficiency in SQL and experience with relational and cloud-based data storage solutions (e.g., PostgreSQL, MySQL, SQL Server, Snowflake, Redshift, BigQuery).
Experience with ETL tools and techniques.
Experience with a general-purpose programming language (Python preferred).
Familiarity with data warehousing concepts and data modeling principles.
Understanding of cloud platforms (e.g., AWS, Azure, GCP) and their data services is a plus.
Strong analytical and problem-solving skills with a focus on data quality, performance, and reliability.
Collaborative mindset with the ability to communicate effectively with stakeholders.
This role requires on-site presence at least three days per week (60%) in our Spring Hill, TN office.
Benefits
Comprehensive Health Benefits (Medical, Dental, and Vision), including HSA with employer contributions, FSA, and Dependent Care FSA
Company-Paid Life Insurance and Short-Term Disability
401(k) Plan with Company Match
Paid Time Off (Vacation, Sick Leave, and 10 Holidays)
Paid Parental Leave
Pay Disclosure: The total base salary range for this role is $89,000 - $120,000 annually, with an opportunity for a discretionary bonus. Final compensation will be determined based on skills and experience.
Data Platform Engineer
Senior data scientist job in Brentwood, TN
Data Platform Engineer
The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.
Responsibilities
Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark (a minimal streaming ingestion sketch follows this list).
Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
Ensure data quality, governance, and security across the data lifecycle.
Collaborate with product managers by estimating technical tasks and deliverables.
Uphold the mission and values of Monogram Health in all aspects of your role and activities.
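To make the PySpark work above concrete, here is a minimal Structured Streaming sketch that ingests events from Kafka into a Delta table, in the spirit of the big data tooling listed under the requirements; the broker address, topic, schema, and paths are assumptions, not Monogram specifics.

```python
# Minimal PySpark Structured Streaming ingestion from Kafka into Delta.
# Broker, topic, schema, and storage paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka_to_delta").getOrCreate()

event_schema = StructType([
    StructField("patient_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", StringType()),
    StructField("recorded_at", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker address
       .option("subscribe", "care-events")                 # assumed topic name
       .load())

# Kafka delivers the payload as bytes; parse the JSON value into columns.
events = (raw
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/chk/care_events")  # required for fault tolerance
         .outputMode("append")
         .start("/lake/bronze/care_events"))
# query.awaitTermination()  # uncomment when running outside a notebook
```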
Position Requirements
A bachelor's degree in computer science, data science, software engineering or related field.
Minimum of five (5) years of designing and hands-on developing cloud-based analytics solutions, including a minimum of three (3) years' hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
Expert level knowledge of Python or other scripting languages required.
Proficiency in SQL and other data query languages.
Understanding of data modeling and schema design principles
Ability to work with large datasets and perform data analysis
Designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
Thorough understanding of Azure Cloud Infrastructure offerings.
Demonstrated problem-solving and troubleshooting skills.
Team player with demonstrated written and verbal communication skills.
Benefits
Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
Wellness & Growth - Work life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts
Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counselling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home.
Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
Senior Data Engineer
Senior data scientist job in Franklin, TN
**About CHS:** Community Health Systems is one of the nation's leading healthcare providers. Developing and operating healthcare delivery systems in 35 distinct markets across 14 states, CHS is committed to helping people get well and live healthier. CHS operates 70 affiliated hospitals with more than 10,000 beds and approximately 1,000 other sites of care, including physician practices, urgent care centers, freestanding emergency departments, imaging centers, cancer centers, and ambulatory surgery centers.
**About the Role:**
We are seeking an experienced **Data Engineer** to design, build, deploy, and manage our company's data infrastructure. This role is critical to managing and organizing structured and unstructured data across the organization to enable outcomes analysis, insights, compliance, reporting, and business intelligence. This role requires a strong blend of technical expertise, problem-solving abilities, and communication skills. Emphasis will be placed on leadership, mentorship, and ownership within the team.
**Essential Duties and Responsibilities:**
+ Design, build, and maintain robust CI/CD pipelines, including infrastructure as code
+ Design, build, and maintain scalable and efficient data pipelines using various GCP services.
+ Maintain and refactor complex legacy SQL ETL processes in BigQuery.
+ Write clean, maintainable, and efficient code for data processing, automation, and API integrations.
+ Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
+ Ensure data quality, integrity, and security across all data platforms.
+ Troubleshoot and resolve data-related issues, performing root cause analysis and implementing corrective actions.
+ Contribute to the continuous improvement of our data engineering practices, tools, and methodologies.
+ Mentor junior team members and provide technical guidance.
+ Lead data initiatives from conception to deployment.
**Our Stack (In Development):**
+ GitHub, GitHub Actions, Terraform, Docker, Helm
+ JVM languages (Kotlin preferred), SQL, Python
+ GCP services: Cloud Composer, Dataproc, Dataplex, BigQuery, BigLake, GKE (see the orchestration sketch after this list)
+ OSS: Kubernetes, Spark, Flink, Kafka
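As a minimal sketch of how pieces of this stack might fit together, the snippet below defines a Cloud Composer (Airflow) DAG that runs one BigQuery transformation step; the DAG name, dataset, and table names are hypothetical, and Airflow 2.4+ syntax is assumed.

```python
# Minimal Cloud Composer (Airflow 2.4+) DAG running one BigQuery transformation.
# DAG id, datasets, and tables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_encounter_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="build_encounter_rollup",
        configuration={
            "query": {
                # Rebuild a simple daily rollup from a raw encounters table.
                "query": """
                    CREATE OR REPLACE TABLE analytics.encounter_daily AS
                    SELECT facility_id,
                           DATE(admit_ts) AS admit_date,
                           COUNT(*) AS encounters
                    FROM raw.encounters
                    GROUP BY facility_id, admit_date
                """,
                "useLegacySql": False,
            }
        },
    )
```

In a real pipeline this task would sit alongside sensors, data quality checks, and Terraform-managed infrastructure, with the DAG itself deployed through the CI/CD pipeline described above.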
**Required Education:**
+ Bachelor's Degree or 4 years equivalent professional experience
**Required Experience:**
+ 5-7 years of professional experience in data engineering or a similar role.
+ Proficiency in SQL, with a deep understanding of relational databases and data warehousing concepts.
+ Expertise in JVM languages or Python for data manipulation, scripting, and automation.
+ Demonstrable experience with cloud services related to data engineering.
+ Strong understanding of ETL/ELT processes, data modeling, and data architecture principles.
+ Excellent problem-solving skills and a strong analytical mindset.
+ Ability to work independently and as part of a collaborative team.
+ Strong communication and interpersonal skills, with the ability to explain complex technical concepts to non-technical stakeholders.
+ Proven leadership potential and a willingness to take ownership of projects.
+ Experience with Agile development methodologies
+ Expertise with version control systems (e.g., Git, GitHub)
**Preferred Experience:**
+ Experience with distributed data processing frameworks
+ Experience with distributed systems
+ Familiarity with data visualization tools (e.g., Looker, Tableau, Power BI)
+ Experience with data integration and migration within Oracle Cloud Infrastructure (OCI)
+ Familiarity with data structures and extraction methodologies for Oracle Cloud ERP applications (e.g., Financials, HCM, SCM)
Equal Employment Opportunity
This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment; Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
Sr. Analyst - Omni Channel Data
Senior data scientist job in Brentwood, TN
This role performs advanced business analysis using various techniques, including statistical analysis, explanatory and predictive modeling, and data mining. It determines best practices and develops actionable insights and recommendations for current business operations and needs. This role works closely with others to determine analytical requirements, including requests for operational and project key metrics.
Essential Duties and Responsibilities (Min 5%)
* Provide detailed analytics to organizational initiatives
* Establish centralized reporting & analytics competence within Omni-channel space
* Analyze large amounts of information to discover trends and patterns
* Identify and develop reporting around key success metrics to measure project performance and critical operations (e.g., Day After Thanksgiving forecasting and modeling)
* Provide thought leadership on how to leverage analytics to improve our everyday operation and future business growth
* Provide weekly, monthly, and quarterly Omni-channel metrics and analysis reporting, including financial reporting, operational reporting, and web analytics reporting
* Present information using data visualization techniques
* Propose solutions and strategies to business challenges
* Identify valuable data sources and automate collection processes
* Filter and "clean" data to locate and report code problems
* Work with management to prioritize business and information needs
* Define and execute audits on key business processes
* Develop operation procedures
* Support ad hoc reporting and analysis requests
* Provide forecasting (e.g., sales, shipment volume broken down by category and location); a minimal forecasting sketch follows this list
* Manage the "backlogs" to grow sustainable reporting and analytics practices
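For the forecasting item above, here is a minimal seasonal forecasting sketch using Holt-Winters exponential smoothing from statsmodels; the weekly shipment series is synthetic and purely illustrative.

```python
# Minimal weekly shipment-volume forecast using Holt-Winters exponential
# smoothing; the input series is synthetic, purely for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Three years of synthetic weekly shipments with yearly seasonality.
weeks = pd.date_range("2022-01-02", periods=156, freq="W")
rng = np.random.default_rng(42)
volume = 1000 + 200 * np.sin(np.arange(156) * 2 * np.pi / 52) + rng.normal(0, 50, 156)
series = pd.Series(volume, index=weeks)

model = ExponentialSmoothing(
    series, trend="add", seasonal="add", seasonal_periods=52
).fit()

forecast = model.forecast(13)   # next quarter, week by week
print(forecast.round(0))
```

A simple baseline like this is often the starting point before layering in category- or location-level models and event effects such as the Day After Thanksgiving peak.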
Required Qualifications
Experience: 5 years' relevant experience
Education: Bachelor's degree from an accredited college or university. Any suitable combination of education and experience will be considered.
Preferred knowledge, skills or abilities
* Experience using business intelligence tools
* Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
* Problem-solving aptitude
* Adept at queries, report writing and presenting findings
* Knowledge of statistical concepts and techniques
* Understanding of machine-learning and operations research
* Experience with business reporting tools such as Business Objects, Tableau, Alteryx, Power BI, Snowflake (SQL) or other reporting & analytical tools
* Experience with web analytics tools such as Adobe Analytics
* Background in eCommerce
* Understanding of and familiarity with web and emerging technologies
* Agile Scrum Process experience
* Experience working with consulting partners
Working Conditions
* Normal office working conditions
Physical Requirements
* Sitting
* Standing (not walking)
* Walking
* Kneeling/Stooping/Bending
* Reaching overhead
* Lifting up to 10 pounds
Disclaimer
This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/ her supervisor.
Company Info
At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future.
Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service.
Please visit this link for more specific information about the benefits and leave policies applicable to the position you're applying for.
Sr. Data Analyst, L3, Information Technology
Senior data scientist job in Franklin, TN
As a Senior Data Analyst for the Information Technology organization, you'll be responsible for identifying, curating, publishing, and visualizing data in a way that is easily interpreted and understood. Successful data analysts have strong SQL skills, a deep understanding of data warehousing and data products, and the ability to quickly understand the structure and relationship of data from a broad range of sources. You will have the opportunity to work with various programming languages, technologies, and both structured and unstructured data.
**A Qualified Candidate:**
+ Is a Lifelong Learner and Passionate about Technology
+ Is Experienced with Business Intelligence (BI) and data visualization and can show examples of past work
+ Is very proficient with SQL, with the ability to demonstrate understanding of various concepts, capabilities, and functions including, but not limited to: joins, grouping, ordering, common table expressions, case functions, and regex (see the example query after this list)
+ Derives joy from tackling complex problems and working through solution tradeoffs
+ Can learn on the fly and fill knowledge gaps on demand
+ Has experience working with a variety of people at various levels
+ Has a strong ability to interpret datasets and identify information, trends, and patterns.
+ Has excellent data management and QA skills - Process Oriented
+ Recognizes business requirements in the context of data visualization and reporting and creates data models to transform raw data into relevant insights
+ Has aptitude for data presentation and ability to transform raw data into meaningful, actionable reports
+ Has experience with Looker / Google Data Studio or similar platforms
+ Has strong exploratory data analysis skills and can translate stakeholder requirements into data products
+ Has excellent communication skills
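To ground the SQL expectations above (joins, CTEs, case functions, regex), here is a small illustrative query executed through the BigQuery Python client mentioned later in the posting; the dataset, tables, and columns are invented for the example.

```python
# Illustrative SQL (CTE + join + CASE + regex) run via the BigQuery client.
# The dataset, tables, and columns are hypothetical placeholders.
from google.cloud import bigquery

SQL = """
WITH recent_tickets AS (
    SELECT ticket_id, assignee_email, opened_at, status
    FROM it_ops.tickets
    WHERE opened_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
)
SELECT
    t.ticket_id,
    u.team,
    CASE WHEN t.status = 'closed' THEN 'done' ELSE 'in_progress' END AS state,
    REGEXP_CONTAINS(t.assignee_email, r'@example\\.com$') AS internal_assignee
FROM recent_tickets AS t
JOIN it_ops.users AS u
  ON u.email = t.assignee_email
ORDER BY t.opened_at DESC
"""

client = bigquery.Client()          # uses application default credentials
for row in client.query(SQL).result():
    print(row.ticket_id, row.team, row.state, row.internal_assignee)
```

Result sets like this typically become the curated views behind Looker Studio dashboards rather than ad hoc printouts.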
**Essential Functions**
+ Provides technical consultation on data product projects by analyzing end-to-end data product requirements and existing business processes to lead the design, development, and implementation of data products.
+ Collaborates with stakeholders to understand data needs and requirements for visualizations taking a "Design Thinking" approach to problem solving and interactive solutioning.
+ Collaborates with stakeholders to define metrics and cultivate data sources to support reporting insights aligned to business goals.
+ Creates dashboards and interactive visualizations that allow users to explore data in meaningful ways.
+ Develops complex SQL queries to combine and transform raw data into datasets needed for metrics and other analytical functions.
+ Provides training and support to end-users on how to interpret and interact with data visualizations.
+ Produces data views, data models, and data flows for varying client demands such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research & exploration.
+ Translates business data stories into a technical story breakdown structure and work estimate so that value and fit for a schedule or sprint can be determined.
+ Collaborates with enterprise teams and other internal organizations on CI/CD best practices, using tools such as JIRA, Jenkins, and Confluence.
+ Implements production processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
+ Practices code management and integration with engineering Git principle and practice repositories.
+ Participates as an expert and learner in team tasks for data analysis, architecture, application design, coding, and testing practices.
**Qualifications:**
+ Required Education: Bachelor's degree in computer science, information systems, cyber security, business, statistics, mathematics, or a related field
+ Preferred Education: Master's degree in computer science, information systems, cyber security, business, statistics, mathematics, or a related field
+ Computer Skills Required:
+ Advanced skills with SQL
+ Experience with Python, JavaScript, CSS, or other languages a plus.
+ Desired experience in: Looker Studio / Google Data Studio, BigQuery
**Required Experience:**
+ 3+ years of experience with developing compelling stories and distinctive visualizations.
+ 3+ years of relevant experience with data quality rules, data management organization/standards, practices and software development.
+ 4+ years of SQL experience.
+ 3+ years of dashboarding / BI tool experience (Looker Studio, PowerBI, Tableau, etc).
+ Experience in statistical analysis, data models, data warehousing, and queries.
+ Data application and practice knowledge.
+ Good problem solving, oral and written communication skills.
+ Strong working knowledge of graphic design or UI design.
+ Preferred Experience:
+ Healthcare/Insurance/financial services industry knowledge
+ Python
+ Javascript
+ CSS
+ Looker Studio / Google Data Studio
Equal Employment Opportunity
This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment; Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.