Senior Software Engineer (Backend)
Data engineer job in La Vergne, TN
Senior Software Engineer Job Description:
This person delivers enterprise-grade software solutions with high customer impact. Leads architecture and development activities with a specialization in at least one major enterprise IT application, one major database platform, and one major operating system. Performs all aspects of the development life cycle. Acts as the senior technical programmer for the assigned enterprise system and/or application of responsibility. Delivers results through independent contributions and through mentoring of junior engineers.
Senior Software Engineer Minimum Qualifications:
Bachelor's degree in computer science or a related field, or directly related year-for-year experience
6 years' experience in designing, developing, implementing, and supporting enterprise level IT solutions
Senior Software Engineer Preferred Skills:
Knowledge of development tools, with demonstrated expert experience in the .NET stack (C#, Web API with ASP.NET Core and Entity Framework Core), Kafka, Kubernetes, JavaScript/web front-end technologies, PostgreSQL, SQL Server, IBM DB2, Visual Studio, VS Code, Docker, REST, JSON, and XML technologies.
Knowledge of Messaging / Enterprise Integration Patterns
Knowledge of external technologies within domain of expertise
Knowledge of all phases of applications systems analysis and programming
Knowledge of, and in-depth understanding of, the business or function for which the application is designed.
Knowledge of Databases with demonstrated expert experience integrating with PostgreSQL, IBM DB2, or SQL Server
Knowledge of development source code management using git
Knowledge of issue management and tracking using JIRA
Knowledge of object-oriented or domain-driven design
Senior Software Engineer Key Responsibilities:
Serves as Designer/Architect/Engineer for at least one major enterprise IT application.
Leads areas of integration with at least one major operating system (e.g. Unix/Linux/Windows).
Develops new design patterns, standards, etc. and works with other developers in implementation.
Performs data modeling and architecture development.
Reviews and evaluates application workflow and user experience.
Acts as technical expert and provides application development oversight and involvement for third-party integrations (e.g., Documentum, Adobe) and database (e.g., MySQL, Oracle, SQL Server) core components.
Leads and executes testing to ensure the program meets the specified requirements.
Drives solutions and guides the work of others to provide full application development life cycle support including specifications, prototypes, development, quality assurance and deployment.
Champions innovation and expands sphere of influence through mentoring and guidance.
Works with user/customer community, business analysts, and architects to capture system requirements and design.
Leverages a technical network to collaborate across the organization
Data Architect
Data engineer job in Franklin, TN
Job Title: Data Architect
Company: Censis
About Censis
Censis Technologies (*************************) is a global leader in surgical instrument tracking and asset management solutions. At the forefront of healthcare innovation, Censis, the first company to engineer a surgical asset management system that tracks down to the instrument and patient levels, has continually set the standards for the sterile processing industry. From the beginning, Censis has recognized the vital connection between perioperative innovation and efficiency, unparalleled customer care, and improved operational performance. By continuing to invest in technology, ease of integration, education, and support, Censis provides solutions that empower hospitals and healthcare providers to stay compliant and ahead of healthcare's rapidly changing environment. With Censis, you're positioned to start ahead and stay ahead, no matter what the future holds.
Role Overview
Censis is seeking a highly experienced and innovative Data Architect to lead the design and implementation of modern data solutions using Microsoft Fabric, Lakehouse architecture, Power BI, semantic data modelling, and medallion architecture. The ideal candidate will have a strong foundation in data architecture, SQL Server, data pipelines, on-premises data integration using ODG (On-prem Data Gateway), and semantic layer development. This role demands a strategic thinker with hands-on expertise in building scalable, secure, and high-performance BI ecosystems, leveraging AI-driven development and delivery.
Key Responsibilities
Architecture & Design
Design and implement scalable BI and data architecture using Data Lake or Lakehouse paradigms, including medallion architecture.
Architect and optimize ELT/ETL pipelines using SQL Server, Dataflows, and Data Pipelines.
Integrate on-premises data sources using On-prem Data Gateway (ODG) with cloud-based solutions.
Develop a robust semantic layer and underlying data model to bridge technical data and business language, applying data virtualization principles.
Design and optimize semantic models that represent data in a way that enhances understanding and usability for analytics.
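The medallion pattern named above refines data through bronze (raw), silver (cleansed), and gold (business-level) layers. As a rough illustration of that flow, here is a minimal Python sketch; the table and field names are invented, and a real Fabric Lakehouse implementation would operate on Delta tables rather than in-memory lists.

```python
# Minimal sketch of a medallion-style refinement flow (bronze -> silver -> gold).
# Field names are hypothetical; a real implementation would read and write
# Lakehouse (Delta) tables instead of Python lists.

def to_silver(bronze_rows):
    """Cleanse raw (bronze) rows: drop records missing a key, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("device_id") is None:
            continue  # data quality rule: reject rows without a key
        silver.append({
            "device_id": str(row["device_id"]).strip(),
            "cycles": int(row.get("cycles", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleansed rows into a business-facing (gold) summary."""
    totals = {}
    for row in silver_rows:
        totals[row["device_id"]] = totals.get(row["device_id"], 0) + row["cycles"]
    return totals

bronze = [
    {"device_id": " A1 ", "cycles": "3"},
    {"device_id": "A1", "cycles": 2},
    {"device_id": None, "cycles": 9},   # rejected at the silver layer
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'A1': 5}
```

The key design point is that each layer only ever reads from the layer below it, so cleansing rules and aggregations can evolve independently.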
Development & Implementation
Build and manage data pipelines, Lakehouses, Data Lakes, and Data Warehouses in Azure or AWS.
Manage Power BI dashboards with advanced DAX and real-time data integration.
Implement data governance, security, and compliance best practices.
Utilize AI-driven development and delivery to enhance solution effectiveness and efficiency.
Define data quality checks, transformations, and cleansing rules, and work with data engineers to implement them within the semantic layer.
Apply strong T-SQL knowledge, including materialized views, indexing, columnstore storage, and dimensional data modelling.
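As a toy illustration of the dimensional modelling called out above, the sketch below rolls a fact table up to a dimension attribute; the table and column names are invented, and real work would be T-SQL over columnstore-indexed warehouse tables rather than Python dictionaries.

```python
# Toy star-schema shape: a fact table of measures joined to a dimension
# table of descriptive attributes. All names are invented for illustration.

dim_product = {
    1: {"product_name": "Widget", "category": "Hardware"},
    2: {"product_name": "Gadget", "category": "Hardware"},
}

fact_sales = [
    {"product_key": 1, "amount": 100.0},
    {"product_key": 2, "amount": 40.0},
    {"product_key": 1, "amount": 60.0},
]

def sales_by_category(facts, dim):
    """Roll fact rows up to a dimension attribute (here, category)."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))  # {'Hardware': 200.0}
```

In a warehouse, the same rollup would be a GROUP BY over the fact-to-dimension join, which is exactly the access pattern columnstore indexes accelerate.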
Leadership & Collaboration
Collaborate with business stakeholders to translate requirements into technical solutions and intuitive business language.
Lead and mentor a team of BI developers, data engineers, analysts, and collaborate with data architects.
Drive adoption of unified data platform, modern BI practices, and semantic modelling methodologies across the organization.
Monitoring & Optimization
Monitor and optimize data pipeline performance and troubleshoot issues.
Ensure data quality, lineage, and availability across all reporting layers.
Maintain comprehensive documentation of architecture, data models, workflows, and semantic layer details.
Required Skills & Qualifications
Experience: 10+ years in Data & Analytics, with expertise in semantic data modelling and strong data engineering skills.
Strong experience with data platforms such as Snowflake, Databricks, Microsoft Fabric, Azure Synapse, or Azure Data Gateway.
BI Tools: Working knowledge of Power BI; experience integrating with Microsoft Fabric is a plus.
Data Integration: Experience with On-prem Data Gateway (ODG) and hybrid data environments.
Data Engineering: Proficient in designing ELT/ETL processes using SQL Server, columnar data formats such as Delta or Parquet, and Fabric Pipelines.
Architecture: Strong understanding of medallion architecture, data virtualization principles, cloud-based data management, and analytics technologies.
Programming: Expertise with Python, T-SQL, Spark, or other scripting languages for data transformation.
Methodologies: Agile/Scrum project delivery experience.
Communication: Strong verbal and written communication skills with the ability to convey complex technical data in simple business language.
Certifications (good to have): Microsoft certifications in Azure, Power BI, or Fabric are a plus.
Bonus or Equity
This position is also eligible for bonus as part of the total compensation package.
Fortive Corporation Overview
Fortive's essential technology makes the world safe and more productive. We accelerate transformation across a broad range of applications including environmental, health and safety compliance, industrial condition monitoring, next-generation product design, and healthcare safety solutions.
We are a global industrial technology innovator with a startup spirit. Our forward-looking companies lead the way in software-powered workflow solutions, data-driven intelligence, AI-powered automation, and other disruptive technologies. We're a force for progress, working alongside our customers and partners to solve challenges on a global scale, from workplace safety in the most demanding conditions to groundbreaking sustainability solutions.
We are a diverse team 10,000 strong, united by a dynamic, inclusive culture and energized by limitless learning and growth. We use the proven Fortive Business System (FBS) to accelerate our positive impact.
At Fortive, we believe in you. We believe in your potential: your ability to learn, grow, and make a difference.
At Fortive, we believe in us. We believe in the power of people working together to solve problems no one could solve alone.
At Fortive, we believe in growth. We're honest about what's working and what isn't, and we never stop improving and innovating.
Fortive: For you, for us, for growth.
We Are an Equal Opportunity Employer. Fortive Corporation and all Fortive Companies are proud to be equal opportunity employers. We value and encourage diversity and solicit applications from all qualified applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity or expression, or other characteristics protected by law. Fortive and all Fortive Companies are also committed to providing reasonable accommodations for applicants with disabilities. Individuals who need a reasonable accommodation because of a disability for any part of the employment application process, please contact us at applyassistance@fortive.com.
Senior Data Engineer
Data engineer job in Franklin, TN
**About CHS:** Community Health Systems is one of the nation's leading healthcare providers. Developing and operating healthcare delivery systems in 35 distinct markets across 14 states, CHS is committed to helping people get well and live healthier. CHS operates 70 affiliated hospitals with more than 10,000 beds and approximately 1,000 other sites of care, including physician practices, urgent care centers, freestanding emergency departments, imaging centers, cancer centers, and ambulatory surgery centers.
**About the Role:**
We are seeking an experienced **Data Engineer** to design, build, deploy, and manage our company's data infrastructure. This role is critical to managing and organizing structured and unstructured data across the organization to enable outcomes analysis, insights, compliance, reporting, and business intelligence. This role requires a strong blend of technical expertise, problem-solving abilities, and communication skills. Emphasis will be placed on leadership, mentorship, and ownership within the team.
**Essential Duties and Responsibilities:**
+ Design, build, and maintain robust CI/CD pipelines, including infrastructure as code
+ Design, build, and maintain scalable and efficient data pipelines using various GCP services.
+ Maintain and refactor complex legacy SQL ETL processes in BigQuery.
+ Write clean, maintainable, and efficient code for data processing, automation, and API integrations.
+ Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
+ Ensure data quality, integrity, and security across all data platforms.
+ Troubleshoot and resolve data-related issues, performing root cause analysis and implementing corrective actions.
+ Contribute to the continuous improvement of our data engineering practices, tools, and methodologies.
+ Mentor junior team members and provide technical guidance.
+ Lead data initiatives from conception to deployment.
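The emphasis above on clean, efficient data-processing code can be illustrated with a generator-based transform that streams records instead of materializing the whole dataset; the record format and field names are invented for the example.

```python
# Sketch of a streaming transform: parse records lazily with a generator so
# only one raw line is held in memory at a time. The 'id,charge' format and
# field names are invented for illustration.

def parse_lines(lines):
    """Yield (patient_id, charge) tuples from 'id,charge' CSV-style lines."""
    for line in lines:
        pid, charge = line.strip().split(",")
        yield pid, float(charge)

def total_charges(records):
    """Accumulate charges per patient without holding raw lines in memory."""
    totals = {}
    for pid, charge in records:
        totals[pid] = totals.get(pid, 0.0) + charge
    return totals

raw = ["p1,100.5", "p2,20.0", "p1,9.5"]
print(total_charges(parse_lines(raw)))  # {'p1': 110.0, 'p2': 20.0}
```

Because `parse_lines` is a generator, the same code works unchanged whether `raw` is a small list or a file handle over millions of lines.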
**Our Stack (In Development):**
+ GitHub, GitHub Actions, Terraform, Docker, Helm
+ JVM languages (Kotlin preferred), SQL, Python
+ GCP services: Cloud Composer, Dataproc, Dataplex, BigQuery, BigLake, GKE
+ OSS: Kubernetes, Spark, Flink, Kafka
**Required Education:**
+ Bachelor's Degree or 4 years equivalent professional experience
**Required Experience:**
+ 5-7 years of professional experience in data engineering or a similar role.
+ Proficiency in SQL, with a deep understanding of relational databases and data warehousing concepts.
+ Expertise in JVM languages or Python for data manipulation, scripting, and automation.
+ Demonstrable experience with cloud services related to data engineering.
+ Strong understanding of ETL/ELT processes, data modeling, and data architecture principles.
+ Excellent problem-solving skills and a strong analytical mindset.
+ Ability to work independently and as part of a collaborative team.
+ Strong communication and interpersonal skills, with the ability to explain complex technical concepts to non-technical stakeholders.
+ Proven leadership potential and a willingness to take ownership of projects.
+ Experience with Agile development methodologies
+ Expertise with version control systems (e.g., Git, GitHub)
**Preferred Experience:**
+ Experience with distributed data processing frameworks
+ Experience with distributed systems
+ Familiarity with data visualization tools (e.g., Looker, Tableau, Power BI)
+ Experience with data integration and migration within Oracle Cloud Infrastructure (OCI)
+ Familiarity with data structures and extraction methodologies for Oracle Cloud ERP applications (e.g., Financials, HCM, SCM)
Equal Employment Opportunity
This organization does not discriminate in any way to deprive any person of employment opportunities or otherwise adversely affect the status of any employee because of race, color, religion, sex, sexual orientation, genetic information, gender identity, national origin, age, disability, citizenship, veteran status, or military or uniformed services, in accordance with all applicable governmental laws and regulations. In addition, the facility complies with all applicable federal, state and local laws governing nondiscrimination in employment. This applies to all terms and conditions of employment including, but not limited to: hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. If you are an applicant with a mental or physical disability who needs a reasonable accommodation for any part of the application or hiring process, contact the director of Human Resources at the facility to which you are seeking employment. Simply go to ************************************************* to obtain the main telephone number of the facility and ask for Human Resources.
Data Scientist, Merchandising Analytics
Data engineer job in Brentwood, TN
The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using Machine Learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing.
The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential, as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.
**Essential Duties and Responsibilities (Min 5%)**
+ Work closely with key business partners to fully explore and frame a business question including objectives, goals, KPIs, decisions the analysis will support and required timing for deliverables.
+ Extract available and relevant data from internal and external data sources to support data science solution development.
+ Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
+ Contribute and assist the team with best practices in data governance, data engineering, and data architecture.
+ Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
+ Maintain a broad exposure to the wider ecosystem of AI / Machine Learning and ensure our team is pushing toward the most optimal solutions.
+ Manage multiple projects simultaneously with limited oversight including development of new technologies and maintenance of existing framework.
+ Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
+ Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans.
+ Use proven predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
+ Be a 'Go-To' person for any data analytical needs, ranging from data extraction/manipulation and long-term trend analysis to statistical analysis and modeling techniques. Perform code review and debugging with the team, and assist during implementation where necessary.
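For the A/B-testing duty above, one standard planning step is a two-proportion sample-size estimate. The sketch below uses the normal approximation with hard-coded z-values for a two-sided alpha of 0.05 and 80% power; the baseline rate and lift are invented numbers for illustration.

```python
import math

# Back-of-envelope sample size per arm for a two-proportion A/B test,
# via the normal approximation: n = (z_a + z_b)^2 * (var1 + var2) / effect^2.
# z-values are hard-coded for alpha = 0.05 (two-sided) and 80% power.

def sample_size_per_arm(p_control, p_treatment, z_alpha=1.96, z_beta=0.84):
    """Approximate subjects needed per arm to detect p_control -> p_treatment."""
    variance = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
    effect = p_treatment - p_control
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# e.g. detecting a 2-point lift on a 10% baseline conversion rate
n = sample_size_per_arm(0.10, 0.12)
print(n)
```

Estimates like this feed directly into the "appropriate sample sizes" piece of a test plan; a production analysis would typically use a statistics library's power calculation rather than hand-coded z-values.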
**Required Qualifications**
_Experience_ : 3-5 years of proven experience as a Data Scientist, preferably in the retail or e-commerce industry. 2+ years of experience in predictive modeling utilizing CRM or transaction information preferred.
_Education_ : Bachelor's Degree in Mathematics, Statistics, or Econometrics; Master's Degree preferred. Any combination of education and experience will be considered.
_Professional Certifications_ : None
**Preferred knowledge, skills or abilities**
+ Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
+ Deep expertise in writing and debugging complex SQL queries.
+ Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
+ Proven advanced modeling experience in leading data-driven projects from definition to execution, driving and influencing project roadmaps.
+ Experience using Azure, AWS, or another cloud compute platform a plus.
+ Familiarity with visualization tools such as Power BI and Tableau.
+ Must possess a high degree of aptitude in communicating complex analysis results, both verbally and in writing, to Senior & Executive Leadership.
+ Knowledge of A/B testing methods; capable of designing a controlled test, running the test, and providing post-hoc measurement.
+ Proficiency with managing data repository and version control systems like Git.
+ Speak, read and write effectively in the English language.
**Working Conditions**
+ Hybrid / Flexible working conditions
**Physical Requirements**
+ Sitting
+ Standing (not walking)
+ Walking
+ Kneeling/Stooping/Bending
+ Lifting up to 10 pounds
**Disclaimer**
_This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/ her supervisor._
**Company Info**
At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future.
Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time Team Members and all part-time Team Members. Part time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service.
Please visit this link (**********************************************************************) for more specific information about the benefits and leave policies applicable to the position you're applying for.
**ALREADY A TEAM MEMBER?**
You must apply or refer a friend through our internal portal
Click here (**************************************************************************)
**CONNECTION**
Our Mission and Values are more than just words on the wall - they're the one constant in an ever-changing environment and the bedrock on which we build our culture. They're the core of who we are and the foundation of every decision we make. It's not just what we do that sets us apart, but how we do it.
Learn More
**EMPOWERMENT**
We believe in managing your time for business and personal success, which is why we empower our Team Members to lead balanced lives through our benefits and total rewards offerings for full-time and eligible part-time TSC and Petsense Team Members. We care about what you care about!
Learn More
**OPPORTUNITY**
A lot of care goes into providing legendary service at Tractor Supply Company, which is why our Team Members are our top priority. Want a career with a clear path for growth? Your Opportunity is Out Here at Tractor Supply and Petsense.
Learn More
Join Our Talent Community
**Nearest Major Market:** Nashville
Data Engineer III
Data engineer job in Brentwood, TN
Job Description
Data Engineer III (Confidential Client). There is NO sponsorship available for this role, and candidates MUST be in Nashville and available to work onsite. Please DO NOT APPLY for this role if you require sponsorship at any time or if you are not CURRENTLY in Nashville, TN.
Overview
A leading organization is seeking an experienced Data Engineer III to join its data and analytics team. This role will focus on designing, building, and enhancing modern data platforms and enterprise data environments. The ideal candidate has a strong background in cloud data engineering, data warehousing, streaming technologies, and enterprise-grade ETL development, with the ability to engage cross-functional stakeholders and drive scalable data solutions.
Responsibilities
Architect, build, and maintain scalable data pipelines and analytics environments using both on-premises and cloud technologies
Translate business and technical requirements into reliable data solutions that support analytics, forecasting, and reporting
Lead end-to-end delivery of data engineering initiatives, including planning, design, development, testing, and deployment
Implement data models, metadata strategies, data quality controls, and data governance best practices
Develop solutions involving real-time streaming, API integrations, and change data capture
Collaborate with cross-functional teams, provide technical mentorship, and document solution designs and standards
Support platform stability, manage tickets and requests, and ensure system availability and performance
Maintain strong data security, compliance, and privacy practices in alignment with regulatory standards
Partner with business stakeholders to understand needs and deliver high-quality data capabilities that enhance decision support
Required Skills
Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
7+ years of professional data engineering experience with increasing responsibility
Hands-on experience building and supporting modern data platforms, data lakes, and data warehouse environments
Strong proficiency in SQL and relational/dimensional data modeling
Experience with leading cloud data tools (Microsoft Azure preferred) and enterprise data warehouse solutions
Proficiency with ETL development using tools such as Talend, Informatica, or similar enterprise-grade platforms
Experience developing streaming and API-driven data integrations
Familiarity with data governance, metadata, data cleansing, and security practices
Proven ability to lead technical projects and collaborate with cross-functional stakeholders
Excellent communication skills and ability to present findings to technical and business leaders
Preferred Qualifications
Experience with Microsoft Azure services such as Data Lake, Data Factory, Synapse, or Fabric
Background with Oracle-based data environments
Knowledge of enterprise-class data security and access control tools
Programming experience with Python or similar languages
Experience supporting highly regulated or compliance-driven environments
How to Apply
For confidential consideration, please submit your resume. Our team will review your background and contact qualified candidates to discuss next steps. We look forward to connecting with you.
Bigdata / Hadoop Technical Lead
Data engineer job in Franklin, TN
E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************
Job Description
Technical/Functional Skills:
• Must have at least 1 full-scale Hadoop implementation from DEV to PROD
• Must have experience in Production Deployment Process for Big Data projects
• Must have experience in root cause analysis and troubleshooting of Hadoop applications
• Must have significant experience in designing solutions using Cloudera Hadoop
• Must have significant experience with Java MapReduce, PIG, Hive, Sqoop and Oozie
• Must have significant experience with Unix Shell Scripts
• Exposure to Healthcare Provider domain
Roles & Responsibilities:
• Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
• Mentor, guide and train Team members on Big Data
• Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
• Designs, codes, tests, debugs, and documents moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
• Perform complex coding tasks using Java Map Reduce, PIG, Hive and Sqoop
• Review and certify code written by team members
• Ensures that established change management and other procedures are adhered to, and helps develop needed standards, procedures, and practices.
• Performance tuning with large data sets.
Generic Managerial Skills:
• Ability to lead the team, plan, track and manage and work performed by team members
• Ability to work independently and communicate across multiple levels (Product owners, Executive sponsors, Team members)
Additional Information
All your information will be kept confidential according to EEO guidelines.
Big Data / Hadoop Technical Lead
Data engineer job in Franklin, TN
First IT Solutions provides a professional, cost effective recruitment solution. We take the time to understand your needs in great detail. With our dedicated and experienced Recruitment Consultants, our market knowledge, and our emphasis on quality and satisfaction, we pride ourselves on offering the best solution the first time.
Our consultants have substantial experience gained over many years placing talented individuals in Contract and Permanent positions within the technology industry. You can be sure that we understand the process well from your side of the desk. We started First IT to provide top quality service, something that clients complained was lacking at other recruiting and placement firms. At First IT, we strive continually to provide excellent service at all times.
Job Description
Technical/Functional Skills:
Minimum experience required: 8 years
Must have experience as a Tech Lead for Big data projects
Must have significant experience with architecting and designing solutions using Cloudera Hadoop
Must have significant experience with Python, Java Map Reduce, PIG, Hive, Hbase, Oozie and Sqoop
Must have significant experience with Unix Shell Scripts
Exposure to Healthcare Provider domain
Qualifications
Architect and Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
Estimate size, effort, complexity of solutions
Plan, track and report project status
Mentor, guide and train Team members on Big Data
Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
Prepare detailed specifications, diagrams, and other programming structures from which programs are written.
Designs, codes, tests, debugs, and documents moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
Perform complex coding tasks using Python, Java Map Reduce, PIG, Hive, Hbase, Oozie and Sqoop
Review and certify code written by team members
Ensures that established change management and other procedures are adhered to, and helps develop needed standards, procedures, and practices.
Performance tuning with large data sets.
Additional Information
Duration: Full Time
Eligibility: GC & US Citizens Only
Share the Profiles to ****************************
Contact:
************
Keep the subject line with Job Title and Location
Data Engineer - Archimedes
Data engineer job in Brentwood, TN
Company: Archimedes
About Us
Archimedes - Transforming the Specialty Drug Benefit. Archimedes is the industry leader in specialty drug management solutions. Founded with the goal of transforming the PBM industry to provide the necessary ingredients for the sustainability of the prescription drug benefit (alignment, value, and transparency), Archimedes achieves superior results for clients by eliminating tightly held PBM conflicts of interest, including drug spread, rebate retention, and pharmacy ownership, and by delivering the most rigorous clinical management at the lowest net cost.
Current associates must use the SSO login option at ************************************ to be considered for internal opportunities.
Pay Range
USD $0.00 - USD $0.00 /Yr.
STAR Bonus % (At Risk Maximum)
10.00 - Manager, Clinical Mgr, Pharm Supvr, CAE, Sr CAE I
Work Schedule Description (e.g. M-F 8am to 5pm)
Core Business Hours
Overview
The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support enterprise analytics, reporting, and operational data flows. This role plays a critical part in enabling data-driven decision-making across Centene and SPM by ensuring the availability, integrity, and performance of data systems. The Data Engineer collaborates with data scientists, analysts, software developers, and business stakeholders to deliver robust ETL solutions, optimize data storage and retrieval, and implement secure, compliant data architectures in cloud and hybrid environments.
Operating within a healthcare-focused, compliance-heavy landscape, the Data Engineer ensures that data platforms align with regulatory standards such as HIPAA and SOC 2, while embedding automation and CI/CD practices into daily workflows. The role supports both AWS and Azure environments, leveraging cloud-native services and modern tooling to streamline data ingestion, transformation, and delivery.
Responsibilities
Job Responsibilities:
* Design, develop, and maintain ETL pipelines for structured and unstructured data across cloud and on-prem environments.
* Build and optimize data models, schemas, and storage solutions in SQL Server, PostgreSQL, and cloud-native databases.
* Implement CI/CD workflows for data pipeline deployment and monitoring using tools such as GitHub Actions, Azure DevOps, or Jenkins.
* Develop and maintain data integrations using AWS Glue, Azure Data Factory, Lambda, EventBridge, and other cloud-native services.
* Ensure data quality, lineage, and governance through automated validation, logging, and monitoring frameworks.
* Collaborate with cross-functional teams to gather requirements, design scalable solutions, and support analytics and reporting needs.
* Monitor and troubleshoot data pipeline performance, latency, and failures; implement proactive alerting and remediation strategies.
* Support data security and compliance by enforcing access controls, encryption standards, and audit logging aligned with HIPAA and SOC 2.
* Maintain documentation for data flows, architecture diagrams, and operational procedures.
* Participate in sprint planning, code reviews, and agile ceremonies to support iterative development and continuous improvement.
* Evaluate and integrate new data tools, frameworks, and cloud services to enhance platform capabilities.
* Partner with DevOps and Security teams to ensure infrastructure-as-code and secure deployment practices are followed.
* Participate in, adhere to, and support compliance, people and culture, and learning programs.
* Perform other duties as assigned.
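The automated validation and logging responsibilities above can be sketched as a small quarantine step inside an ETL job. This is a minimal sketch: the column names and rules are hypothetical stand-ins for whatever the governance catalog actually defines:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl.validation")

# Hypothetical validation rules for an inbound feed; real rules would
# come from the data governance catalog.
RULES = {
    "member_id": lambda v: isinstance(v, str) and len(v) == 9,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows):
    """Split rows into clean and quarantined sets, logging each failure."""
    clean, quarantined = [], []
    for row in rows:
        failures = [col for col, ok in RULES.items() if not ok(row.get(col))]
        if failures:
            log.warning("quarantined row %s: failed %s", row, failures)
            quarantined.append(row)
        else:
            clean.append(row)
    return clean, quarantined

rows = [{"member_id": "123456789", "amount": 10.5},
        {"member_id": "BAD", "amount": -1}]
clean, quarantined = validate(rows)
```

Routing failures to a quarantine table rather than failing the whole load is what keeps downstream analytics available while bad records are investigated.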
Qualifications
Essential Background Requirements:
* Education: Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field required. Master's degree preferred.
* Certification/Licenses: AWS Certified Data Analytics or Solutions Architect required. Microsoft Certified: Azure Data Engineer Associate required. Certified Data Management Professional (CDMP) required.
* Experience:
* 5+ years of experience in data engineering, ETL development, or cloud data architecture required.
* Proven experience with SQL, ETL tools, and CI/CD pipelines required.
* Hands-on experience with AWS and Azure data services and infrastructure required.
* Familiarity with data governance, compliance frameworks (HIPAA, SOC 2), and secure data handling practices required.
* Familiarity with CI/CD pipelines, automated testing, and version control systems required.
* Skills & Technologies:
* Languages & Tools: SQL, Python, Bash, Git, Terraform, PowerShell
* ETL & Orchestration: AWS Glue, Azure Data Factory, Apache Airflow
* CI/CD: GitHub Actions, Azure DevOps, Jenkins
* Cloud Platforms: AWS (S3, Lambda, RDS, Redshift), Azure (Blob Storage, Synapse, Functions)
* Monitoring & Logging: CloudWatch, Azure Monitor, ELK Stack
* Data Governance: Data cataloging, lineage tracking, encryption, and access control.
Location : Address
5250 Virginia Way Ste 300
Location : City
Brentwood
Location : State/Province
TN
Location : Postal Code
37027
Location : Country
US
Data Engineer Based in U.S.A
Data engineer job in Altamont, TN
Job Description
***This position is only for candidates based in Texas or California (U.S.A.)***
At Advancio, we are passionate about technology and its ability to transform the world. We are rapidly expanding and building a company where we serve exceptional businesses, hire top talent, and have a lot of fun doing what we love!
Job Summary:
We are seeking an experienced Data Scientist to extract actionable insights from complex datasets and develop advanced models to drive business growth. The ideal candidate has a deep understanding of machine learning, statistical analysis, and data visualization, with the ability to translate findings into impactful solutions.
What will you do:
Analyze large and diverse datasets to uncover patterns, trends, and actionable insights.
Develop, train, and deploy machine learning models to solve complex business problems.
Collaborate with cross-functional teams to identify opportunities for data-driven improvements.
Build and maintain predictive models, recommendation systems, or optimization algorithms.
Design and implement data preprocessing pipelines, ensuring data quality and accessibility.
Create compelling visualizations and reports to communicate findings effectively to stakeholders.
Stay updated with the latest advancements in data science, machine learning, and AI technologies.
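The preprocessing-pipeline responsibility above can be sketched with two common steps, mean imputation and standardization, in plain Python. The values are hypothetical; real pipelines would use Pandas/NumPy, but the logic is the same:

```python
from statistics import mean, stdev

def impute_missing(values):
    """Replace None with the column mean - a common data-quality step."""
    observed = [v for v in values if v is not None]
    fill = mean(observed)
    return [fill if v is None else v for v in values]

def zscore(values):
    """Standardize to zero mean / unit variance before model training."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

raw = [10.0, None, 14.0, 12.0]
prepared = zscore(impute_missing(raw))
```

Chaining small, testable steps like these is what makes a preprocessing pipeline auditable when model results are questioned.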
Requirements
5+ years of professional experience in data science or related fields.
Advanced English communication skills, both verbal and written.
Proficiency in programming languages such as Python or R, with experience in data manipulation libraries (e.g., Pandas, NumPy).
Expertise in machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
Strong statistical analysis skills and knowledge of algorithms and data structures.
Experience with SQL and working with relational databases.
Familiarity with big data tools (e.g., Spark, Hadoop) and cloud platforms (AWS, Azure, GCP).
Ability to create data visualizations using tools like Tableau, Power BI, or Matplotlib.
Data Platform Engineer
Data engineer job in Brentwood, TN
Data Platform Engineer The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.
Responsibilities
* Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
* Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
* Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark.
* Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
* Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
* Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
* Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
* Ensure data quality, governance, and security across the data lifecycle.
* Collaborate with product managers by estimating technical tasks and deliverables.
* Uphold the mission and values of Monogram Health in all aspects of your role and activities.
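An ELT step of the kind described above (filter and aggregate inside the engine, as a Spark SQL cell in a Databricks notebook would) can be sketched locally with the stdlib `sqlite3` module as a stand-in engine. The table and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount REAL, status TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", [
    ("o1", 120.0, "shipped"),
    ("o2",  80.0, "cancelled"),
    ("o3",  40.0, "shipped"),
])

# ELT: push the transformation into the engine rather than pulling raw
# rows out, exactly as Spark SQL would against a staging table.
total = conn.execute(
    "SELECT SUM(amount) FROM raw_orders WHERE status = 'shipped'"
).fetchone()[0]
```

On Databricks the same query runs distributed over a cluster; the design principle, transform where the data lives, is identical.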
Position Requirements
* A bachelor's degree in computer science, data science, software engineering or related field.
* Minimum of five (5) years in designing and hands-on development in cloud-based analytics solutions, which includes a minimum of three (3) years' hands on work with big data frameworks and tools, such as Apache Kafka and Spark.
* Expert level knowledge of Python or other scripting languages required.
* Proficiency in SQL and other data query languages.
* Understanding of data modeling and schema design principles
* Ability to work with large datasets and perform data analysis
* Designing and building data integration pipelines using API's and Streaming ingestion methods is desirable.
* Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
* Thorough understanding of Azure Cloud Infrastructure offerings.
* Demonstrated problem-solving and troubleshooting skills.
* Team player with demonstrated written and communication skills.
Benefits
* Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
* Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
* Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
* Wellness & Growth - Work life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts
Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counselling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home.
Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
Data Engineer
Data engineer job in Spring Hill, TN
We're seeking a Data Engineer to help shape the foundation of Zipliens' growing data ecosystem. Our engineering team supports a diverse set of tools and systems that power lien resolution operations, client transparency, and decision-making across the company. In this role, you'll design and maintain reliable data pipelines, optimize data storage and retrieval, and ensure that our systems deliver accurate, timely, and actionable insights. You'll collaborate closely with data analysts, product owners, and engineers to build scalable data infrastructure and contribute to data quality standards that will support the next generation of Zipliens applications.
Requirements
Responsibilities:
Develop and optimize SQL queries and database schemas for efficient data retrieval and storage.
Design, develop, and maintain scalable ETL processes.
Develop and maintain scripts for data processing.
Ensure scalability and performance optimization of data pipelines and queries.
Develop and implement data quality checks and monitoring to ensure data accuracy and reliability.
Contribute to the development and maintenance of data quality standards and best practices.
Collaborate with data analysts to understand requirements and deliver solutions that enable effective reporting and analytics.
Design and build reports to provide actionable insights to stakeholders.
Document data models, ETL processes, and reporting solutions.
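The query-optimization responsibility above often comes down to giving the planner an index to seek on instead of scanning. A minimal demonstration with stdlib `sqlite3` (table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE liens (id INTEGER PRIMARY KEY, client TEXT, amount REAL)")
conn.executemany("INSERT INTO liens (client, amount) VALUES (?, ?)",
                 [("acme", 100.0), ("globex", 250.0)])

# Without an index, a lookup by client scans the whole table; adding one
# lets the planner use an index seek instead.
conn.execute("CREATE INDEX idx_liens_client ON liens (client)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT amount FROM liens WHERE client = ?", ("acme",)
).fetchone()
print(plan[-1])  # the plan detail column names the index it will use
```

The same habit, checking the query plan before and after a schema change, carries over directly to PostgreSQL's `EXPLAIN` and the cloud warehouses listed in the qualifications.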
Qualifications:
Bachelor's degree in Business, Computer Information Systems, Computer Science, or equivalent practical experience.
4+ years of experience as a Data Engineer, Senior Data Analyst, or similar role.
Strong proficiency in SQL and experience with relational and cloud-based data storage solutions (e.g., PostgreSQL, MySQL, SQL Server, Snowflake, Redshift, BigQuery).
Experience with ETL tools and techniques.
Experience with a general-purpose programming language (Python preferred).
Familiarity with data warehousing concepts and data modeling principles.
Understanding of cloud platforms (e.g., AWS, Azure, GCP) and their data services is a plus.
Strong analytical and problem-solving skills with a focus on data quality, performance, and reliability.
Collaborative mindset with the ability to communicate effectively with stakeholders.
This role requires on-site presence at least three days per week (60%) in our Spring Hill, TN office.
Benefits
Comprehensive Health Benefits (Medical, Dental, and Vision), including HSA with employer contributions, FSA, and Dependent Care FSA
Company-Paid Life Insurance and Short-Term Disability
401(k) Plan with Company Match
Paid Time Off (Vacation, Sick Leave, and 10 Holidays)
Paid Parental Leave
Pay Disclosure: The total base salary range for this role is $89,000 - $120,000 annually, with an opportunity for a discretionary bonus. Final compensation will be determined based on skills and experience.
Sr. Data Engineer TO
Data engineer job in Tullahoma, TN
Job Description
WE ARE ARCARITHM, and we are changing the world!
If you are ready to grow your career and change the world with us, then join the Arcarithm team!
We are located in beautiful, downtown Huntsville, AL, one of the fastest growing cities in the U.S.! At Arcarithm, we cultivate and foster an environment of integrity, open communication, work life balance, and career development. We are committed to investing in our employees by offering comprehensive health insurance options, a generous 401K plan, competitive salaries, continuous career growth opportunities, flexible schedules including remote work, mentoring and performance incentives.
Arcarithm is currently seeking top talent in the areas of full stack software development, artificial intelligence, optimization, and data analytics. You will work in a dynamic and challenging environment alongside our customers which include Lockheed Martin, General Dynamics, Northrop Grumman, Raytheon, US Army, US Navy, US Air Force, the Missile Defense Agency, and NASA on cutting edge technologies including machine learning, augmented and virtual reality, big data analytics, and more!
We are excited to continue to change and improve the world through innovation and technology!
Contact us today to hear more about Arcarithm and all we offer!
Job Title: Sr. Data Engineer
Job Location: Tullahoma, TN
Arcarithm has an exciting opportunity for a Digital Enterprise Sr. Data Engineer, supporting TOS II, at Arnold Air Force Base, TN. As a Digital Enterprise Sr. Data Engineer, you will work as a member of the Digital Enterprise group (DE) in the Mission Support Branch to lead implementation of data-centered projects to improve the AEDC ground test data infrastructure, facility operations, and business systems.
Must have an active and transferable DoD security clearance with current investigation at the required level. Must be able to maintain the required clearance.
B.S. in Computer Science, Statistics, Mathematics, Engineering or another relevant engineering field from an accredited university program plus a minimum of 5 to 14 years of progressive and relevant experience.
Current U.S. Citizenship is required.
Key Responsibilities:
Develop and implement data-engineering strategies and programs.
Utilize test data sources to optimize data analytics at AEDC and suggest ways in which the insights obtained might inform testing sustainment and operational strategies.
Automate manual processes, optimize data delivery, and re-design infrastructure for greater scalability.
Utilize machine learning tools to select features, create, and optimize classifiers.
Define & develop approaches and demonstrate abilities to mine and analyze data from databases to drive optimization and improvement of data collected at AEDC.
Identify and assess the effectiveness of data engineering, data sources and data gathering techniques.
Utilize applicable experience in "big data" analytics, algorithms, custom data models, and/or machine learning approaches to help extract data that will drive engineering decisions.
Coordinate with different multidisciplinary teams to implement models and monitor outcomes.
Define & develop processes and tools to monitor and analyze model performance and data accuracy.
Define & develop the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and "big data" technologies.
It is a condition of employment to wear company issued PPE (Personal Protective Equipment) in accordance with supervisory direction and company policy.
Identify and complete projects associated with data engineering.
Performs other related duties as required.
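The feature-selection responsibility above can be illustrated with a simple univariate screen: score each candidate feature by its correlation with the target and keep the strong ones. The feature names, values, and threshold here are hypothetical:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation - a simple univariate feature-screening score."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical test-stand features vs. an outcome of interest.
features = {
    "chamber_temp": [1.0, 2.0, 3.0, 4.0],
    "sensor_noise": [5.0, 1.0, 4.0, 2.0],
}
target = [2.0, 4.0, 6.0, 8.0]

# Keep features whose absolute correlation with the target is strong.
selected = [name for name, xs in features.items()
            if abs(pearson(xs, target)) > 0.8]
```

Univariate screens like this are only a first pass; multivariate methods catch interactions a per-feature correlation misses, which is why the posting pairs feature selection with classifier optimization.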
Desired Skills:
Master's or PhD degree in Computer Science, Statistics, Mathematics, Physics, Engineering or another relevant engineering field from an accredited university program.
Experience with development lifecycle methodologies such as Agile DevSecOps.
Experience with data scripting language software like Python, Java, C++, or Ruby and SQL.
Experience with data extraction tools and processes, data ingestion, ETL, data mining, API's and data warehousing.
Demonstrated experience in a data engineering role.
Experience working with and creating data architectures.
Knowledge of a variety of machine learning techniques and their real-world advantages/drawbacks.
Coding knowledge and experience with multiple languages
Arcarithm is an Equal Opportunity employer and all qualified candidates will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, genetic information, citizenship, ancestry, marital status, protected veteran status, disability status or any other status protected by federal, state, or local law. Arcarithm participates in E-Verify.
Lead Building Engineer
Data engineer job in Gallatin, TN
Meta is seeking a data center Lead Building Engineer to join our Data Center Facility Operations team. Our data centers serve as the foundation upon which our software operates to meet the demands of our customers. The Lead Building Engineer will be a part of the Facility Operations team responsible for operating and maintaining critical systems in our data centers. They will be an individual contributor who provides day to day coordination and oversight of work in one ~30MW building. The Lead Building Engineer will report to a Critical Operations Manager and will be expected to maintain a comprehensive understanding of the state of the data center. The candidate will need to be experienced in diverse industries such as electrical generation, electrical distribution, cooling technologies, building control and fire protection systems.
**Required Skills:**
Lead Building Engineer Responsibilities:
1. Oversee the work for a 30MW+ data center to ensure schedule deconfliction, work priority, maintenance compliance, procedural compliance, and team readiness
2. Schedule and supervise vendors/subcontractors during equipment/systems maintenance and service
3. Perform hands-on operations and maintenance which includes all physical and administrative operations tasks, service, and maintenance in accordance with site processes and procedures to ensure the highest levels of uptime, efficiency, and safety without disruption to the business
4. Review all operating procedures, serve as final approval authority on some procedures based on internal standards
5. Support incident management, root cause analysis, and corrective action closure for any operational issues that arise in the data center
6. Achieve and maintain a high-level of technical knowledge regarding data center infrastructure and operations. Successfully complete personnel qualification standards (PQS) training
7. Maintain up to date knowledge of Meta policies and standards and follow standards in all work
8. Provide in-the-field training to Critical Facility Engineers and support the identification of training needs across the team
9. Ensure the accuracy of work orders in the Computerized Maintenance Management System (CMMS)
10. Collaborate with peer Lead Building Engineers to deconflict work across campus and maintain consistency in operations between buildings
11. Provide technical expertise and assistance as required
12. Regularly inspect equipment, buildings, safety routes and grounds to check or identify any abnormal or unsafe conditions or faults
13. Troubleshoot, evaluate and recommend system upgrades
14. Order parts and supplies for maintenance and repairs through internal tooling
15. Escalate issues to facility management appropriately and timely
**Minimum Qualifications:**
Minimum Qualifications:
16. 7+ years experience in electrical, HVAC, mechanical, controls, or other technical maintenance field
17. Associate's Degree in engineering plus 5+ years experience or Bachelor's degree in related field plus 3+ years experience in electrical, HVAC, mechanical, or controls will be considered in lieu of 7+ years experience
18. Proficient with maintenance management program
19. Proficient with computer systems including documents, spreadsheets, and email
20. Regularly walk job site areas of flat and uneven terrain
21. Work at varying heights and from ladders
22. Use hands and fingers
23. Reach/push/pull with hands/arms/shoulders
24. Stoop, kneel, crouch and crawl
25. Lift and/or otherwise move 45 pounds or more
26. Sit or stand at a workstation for extended periods of time
**Preferred Qualifications:**
Preferred Qualifications:
27. Experience in leading a team of maintenance Engineers/Technicians
28. Experience interpreting blueprints/CAD drawings
29. Experience in planning large scale maintenance programs
30. Knowledge of mechanical, electrical, and life safety monitoring and control systems typically used in critical environments
31. Professional affiliations (7x24 Exchange, IFMA, Data Center Pulse, etc.)
32. 5+ years experience in a data center or other Critical Environment (pharma, clean room, medical, power production, etc.)
33. Trade Certification or state licensure in Electrical or Mechanical (HVAC)
**Public Compensation:**
$50.00/hour to $70.19/hour + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
Software Engineer, Android Core Product - Murfreesboro, USA
Data engineer job in Murfreesboro, TN
The mission of Speechify is to make sure that reading is never a barrier to learning.
Over 50 million people use Speechify's text-to-speech products to turn whatever they're reading - PDFs, books, Google Docs, news articles, websites - into audio, so they can read faster, read more, and remember more. Speechify's text-to-speech reading products include its iOS app, Android App, Mac App, Chrome Extension, and Web App. Google recently named Speechify the Chrome Extension of the Year and Apple named Speechify its Design award winner for inclusivity for 2025.
Today, nearly 200 people around the globe work on Speechify in a 100% distributed setting - Speechify has no office. These include frontend and backend engineers, AI research scientists, and others from Amazon, Microsoft, and Google, leading PhD programs like Stanford, high growth startups like Stripe, Vercel, Bolt, and many founders of their own companies.
Overview
With the growth of our Android app, being the most used text-to-speech app in the Play Store, we find the need for a Senior Android Engineer to help us support the new user base as well as work on new and exciting projects to push us forward.
This is a key role, ideal for someone who thinks strategically, enjoys fast-paced environments, is passionate about making product decisions, and has experience building great user experiences that delight users.
We are a flat organization that allows anyone to become a leader by showing excellent technical skills and delivering results consistently and fast. Work ethic, solid communication skills, and obsession with winning are paramount.
Our interview process involves several technical interviews and we aim to complete them within 1 week.
What You'll Do
Owning major features and working closely with our design team - take ownership of features inside the app and become responsible for delivering high-quality features
Shape the future of our Android team
Own, maintain and improve reliability metrics for key features
Participate in discussions across different teams - Product, Design, Engineering
Review pull requests, and support other teammates
Handle critical issues or cope with unexpected challenges
Take ownership of feature releases and provide nightly builds for the QA team
An Ideal Candidate Should Have
5+ years of software engineering experience
Familiarity with Android components
Experience building or contributing to at least one Android app
Product design intuition and user empathy
Drive to push the boundaries of Android UI/UX
Understanding of the importance of tests and how to approach writing tests
Self-drive to improve the app and codebase above and beyond what's outlined in the spec
Rock solid experience with Kotlin, Kotlin Coroutines, Kotlin Flow, Dagger 2, MVVM, Clean Architecture, Background Services, Music Player Service, Android Animations, Jetpack Navigation, JUnit tests
Excellent communication skills
User oriented problem solving approach
Driven with continuous feedback from leaders
Bonus:
Experience building, maintaining, or otherwise contributing to open source projects in Android
Experience with iOS, Web or NodeJS
Technologies we use:
Kotlin
Kotlin Coroutines
Kotlin Flow
Jetpack Navigation
Dagger 2
Room
Custom Views, Canvas & Paint
Jetpack Compose
JUnit
What We offer:
A fast-growing environment where you can help shape the company and product.
An entrepreneurial-minded team that supports risk, intuition, and hustle.
A hands-off management approach so you can focus and do your best work.
An opportunity to make a big impact in a transformative industry.
Competitive salaries, a friendly and laid-back atmosphere, and a commitment to building a great asynchronous culture.
Opportunity to work on a life-changing product that millions of people use.
Build products that directly impact and support people with learning differences like dyslexia, ADD, low vision, concussions, autism, and more.
Work in one of the fastest growing sectors of tech, the intersection of artificial intelligence and audio.
The United States Based Salary range for this role is: 140,000-200,000 USD/Year + Bonus + Stock depending on experience
Think you're a good fit for this job?
Tell us more about yourself and why you're interested in the role when you apply.
And don't forget to include links to your portfolio and LinkedIn.
Not looking but know someone who would make a great fit?
Refer them!
Speechify is committed to a diverse and inclusive workplace.
Speechify does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
Senior Software Engineer
Data engineer job in La Vergne, TN
Senior Software Engineer Job Description:
This person delivers the development and maintenance of our next-generation Order Management System using modern Microservices and Data Mesh architecture. This role requires a deep understanding of distributed systems, cloud and on-prem technologies, and scalable software development within the Manufacturing & Distribution industry. This position will collaborate with cross-functional teams to architect, develop, and optimize enterprise-grade solutions that drive efficiency and innovation in distribution, as well as order fulfillment processes.
Senior Software Engineer Minimum Qualifications:
Bachelor's degree in computer science or related field or directly related year for year experience
6+ years in .NET Core, C#, ASP.NET Core, Web APIs, and front-end frameworks (Angular/React/Blazor).
Strong experience designing and implementing microservices-based architectures.
Senior Software Engineer Preferred Skills:
Experience integrating ERP, WMS, and e-commerce systems is a plus.
Experience working with Order Management Systems in a Manufacturing or Distribution environment.
Knowledge of Data Mesh principles, event-driven architectures, and distributed data systems.
Hands-on experience with cloud platforms (Azure preferred, AWS/GCP is a plus).
Experience with containerization (Docker, Kubernetes) and serverless architectures.
Strong understanding of database technologies (SQL Server, DB2, NoSQL, Redis, Elasticsearch).
Proficiency in CI/CD, DevOps, and Infrastructure as Code (Terraform, Bicep, ARM templates).
Knowledge of GraphQL, gRPC, and API Gateway solutions
Hands-on experience with data lakes or real-time analytics.
Senior Software Engineer Key Responsibilities:
Architecture & Development:
Design and implement scalable .NET-based full-stack solutions using C#, ASP.NET Core, Blazor, Angular, or React.
Architect microservices-based systems, ensuring high availability, resilience, and performance.
Establish a Data Mesh strategy to manage decentralized data ownership and governance across the organization.
Design and optimize databases using SQL Server, PostgreSQL, and NoSQL stores (MongoDB).
Order Management System (OMS):
Lead the development of a modern, cloud-native Order Management System tailored for manufacturing & distribution.
Define APIs, workflows, and integrations with ERP, WMS, and e-commerce platforms.
Ensure real-time order processing, tracking, and fulfillment using event-driven architecture (Kafka, RabbitMQ).
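The event-driven order flow described above decouples producers from consumers via a broker such as Kafka or RabbitMQ. A minimal in-memory stand-in (sketched in Python for brevity, since this posting's stack is .NET; topic and event names are hypothetical):

```python
from collections import defaultdict

class Bus:
    """In-memory stand-in for a message broker topic: producers publish
    events, and every subscribed consumer reacts to them."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.handlers[topic]:
            handler(event)

bus = Bus()
shipments = []

# A fulfillment service reacts to order events instead of being called
# directly by the order service - the services never reference each other.
bus.subscribe("order.placed", lambda e: shipments.append(e["order_id"]))
bus.publish("order.placed", {"order_id": "SO-1001"})
```

With a real broker the publish is durable and asynchronous, so the order service stays available even when fulfillment is down; that decoupling is the point of the pattern.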
DevOps:
Implement CI/CD pipelines using GitHub Actions, Jenkins, Azure DevOps.
Ensure security best practices, including OAuth, JWT, and API Gateway implementations.
Deploy and maintain cloud-native applications on Azure / AWS / GCP.
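The JWT practice mentioned above boils down to a signed, base64url-encoded claims payload. A hand-rolled HS256 sketch using only the stdlib (production code should use a vetted JWT library; the claim and secret here are hypothetical):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """JWTs use unpadded base64url encoding for each segment."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build an HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "order-service"}, b"demo-secret")
```

An API Gateway performs essentially this verification on every request before routing it to a microservice, which is why the posting lists the two together.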
Technical Leadership & Best Practices:
Set coding standards, perform code reviews and mentor engineering teams.
Drive the adoption of modern engineering practices, including Domain-Driven Design (DDD), Test-Driven Development (TDD), and CI/CD.
Work with data engineers to build data pipelines that support analytical and operational workloads.
Apply secure coding practices and follow OWASP guidelines.
Data Scientist, Merchandising Analytics
Data engineer job in Brentwood, TN
The Data Scientist, Merchandising Analytics at Tractor Supply Company will play a key role in leveraging data to address complex business challenges. The role will develop advanced statistical methods using Machine Learning, AI, statistical modeling, and optimization techniques to support the merchandising strategies and broader organizational goals of TSC. Additionally, the role will contribute to setting objectives to enhance the overall data architecture and data governance within the Merchandising team. Key areas of focus within Merchandising will be Space Planning and Pricing.
The Data Scientist will lead cross-functional projects, design and implement predictive models, and promote data-driven decision-making throughout the organization. Strong communication skills are essential, as this role will translate complex analytical results into clear, actionable insights for both technical and non-technical stakeholders. This role will ensure that Tractor Supply Company remains at the forefront of industry trends and emerging technologies in the data science field.
Essential Duties and Responsibilities (Min 5%)
* Work closely with key business partners to fully explore and frame a business question including objectives, goals, KPIs, decisions the analysis will support and required timing for deliverables.
* Extract available and relevant data from internal and external data sources to support data science solution development.
* Develop, maintain, and improve predictive models using R, Python, and Databricks to enhance business knowledge and processes.
* Contribute and assist the team with best practices in data governance, data engineering, and data architecture.
* Identify opportunities for automation and continuous improvement within data pipelines, processes, and systems.
* Maintain broad exposure to the wider ecosystem of AI / Machine Learning and ensure our team is pushing toward optimal solutions.
* Manage multiple projects simultaneously with limited oversight including development of new technologies and maintenance of existing framework.
* Foster a culture of data-driven decision making and collaborate with cross-functional teams and senior leadership to define the long-term vision and goals for data science and engineering within the organization.
* Design and execute A/B tests to evaluate the effectiveness of various data-driven solutions, including designing appropriate sample sizes, metrics for evaluation, and statistical analysis plans.
* Use proven, predictive science to right-size every store with localized plans that balance individual store space allocations with top-down and bottom-up strategies.
* Be a 'Go-To' person for data analytical needs ranging from data extraction/manipulation and long-term trend analysis to statistical analysis and modeling techniques. Perform code reviews and debugging with the team and assist during implementation where necessary.
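The A/B testing duty above typically starts with a power calculation when choosing sample sizes. As a sketch using the standard two-sided, two-proportion z-test formula (stdlib only; the 10% baseline and 12% target conversion rates are illustrative numbers, not TSC metrics):

```python
import math
from statistics import NormalDist

def sample_size_per_arm(p_control, p_treatment, alpha=0.05, power=0.8):
    """Minimum n per arm for a two-sided test of two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # power requirement
    variance = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
    effect = abs(p_treatment - p_control)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# e.g. detecting a lift from a 10% to a 12% conversion rate
n = sample_size_per_arm(0.10, 0.12)
print(n)
```

Larger expected effects shrink the required sample quadratically, which is why framing the business question and the minimum detectable effect up front matters so much.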
Required Qualifications
Experience: 3-5 years proven experience as a Data Scientist, preferably in the retail or e-commerce industry. Prefer 2+ years of experience in predictive modeling utilizing CRM or transaction information.
Education: Bachelor's Degree in Mathematics, Statistics, or Econometrics. Master's Degree preferred. Any combination of education and experience will be considered.
Professional Certifications: None
Preferred knowledge, skills or abilities
* Intermediate to advanced proficiency in one or more programming languages (Python, PySpark, R).
* Deep expertise in writing and debugging complex SQL queries.
* Ability to frame business questions and create an analytics solution using statistical or other advanced analytics methodologies.
* Proven advanced modeling experience in leading data-driven projects from definition to execution, driving and influencing project roadmaps.
* Experience using Azure, AWS, or another cloud compute platform a plus.
* Familiarity with visualization tools such as Power BI and Tableau.
* Must possess a high degree of aptitude in communicating complex analysis results, both verbally and in writing, to Senior & Executive Leadership.
* Knowledge of A/B testing methods; capable of designing a controlled test, running it, and providing post-hoc measurement.
* Proficiency with managing data repositories and version control systems like Git.
* Speak, read and write effectively in the English language.
Working Conditions
* Hybrid / Flexible working conditions
Physical Requirements
* Sitting
* Standing (not walking)
* Walking
* Kneeling/Stooping/Bending
* Lifting up to 10 pounds
Disclaimer
This job description represents an overview of the responsibilities for the above referenced position. It is not intended to represent a comprehensive list of responsibilities. A team member should perform all duties as assigned by his/her supervisor.
Company Info
At Tractor Supply and Petsense by Tractor Supply, our Team Members are the heart of our success. Their dedication, passion, and hard work drive everything we do, and we are committed to supporting them with a comprehensive and accessible total reward package. We understand the evolving needs of our Team Members and their families, and we strive to offer meaningful, competitive, and sustainable benefits that support their well-being today and in the future.
Our benefits extend beyond medical, dental, and vision coverage, including company-paid life and disability insurance, paid parental leave, tuition reimbursement, and family planning resources such as adoption and surrogacy assistance, for all full-time and part-time Team Members. Part-time new hires gain eligibility for TSC Benefits by averaging at least 15 hours per week during their 90-day lookback period. The lookback period starts the first of the month following the date of hire. If the 15-hour requirement was met, the benefits eligibility date will be the first day of the month following 4 months of continuous service.
Please visit this link for more specific information about the benefits and leave policies applicable to the position you're applying for.
Bigdata / Hadoop Technical Lead
Data engineer job in Franklin, TN
E*Pro Consulting service offerings include contingent Staff Augmentation of IT professionals, Permanent Recruiting and Temp-to-Hire. In addition, our industry expertise and knowledge within financial services, Insurance, Telecom, Manufacturing, Technology, Media and Entertainment, Pharmaceutical, Health Care and service industries ensures our services are customized to meet specific needs. For more details please visit our website ******************
Job Description
Technical/Functional Skills:
• Must have at least 1 full-scale Hadoop implementation from DEV to PROD
• Must have experience in Production Deployment Process for Big Data projects
• Must have experience in root cause analysis, trouble-shooting of Hadoop applications
• Must have significant experience in designing solutions using Cloudera Hadoop
• Must have significant experience with Java MapReduce, PIG, Hive, Sqoop and Oozie
• Must have significant experience with Unix Shell Scripts
• Exposure to Healthcare Provider domain
Roles & Responsibilities:
• Design solutions. Provide technical expertise in researching, designing, implementing and maintaining business application solutions.
• Mentor, guide and train Team members on Big Data
• Perform moderately to highly complex development tasks and assignments using Cloudera Hadoop.
• Designs, codes, tests, debugs, and documents moderately to highly complex processes, in addition to performing advanced application maintenance tasks.
• Perform complex coding tasks using Java MapReduce, PIG, Hive, and Sqoop
• Review and certify code written by team members
• Ensures that established change management and other procedures are adhered to; also helps develop needed standards, procedures, and practices.
• Performance tuning with large data sets.
Generic Managerial Skills:
• Ability to lead the team and to plan, track, and manage work performed by team members
• Ability to work independently and communicate across multiple levels (Product owners, Executive sponsors, Team members)
Additional Information
All your information will be kept confidential according to EEO guidelines.
Data Platform Engineer
Data engineer job in Brentwood, TN
Job Description
Position: Data Platform Engineer
The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.
Responsibilities
Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms.
Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark.
Develop and manage data models, data warehousing solutions, and data integration architectures in Azure.
Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems.
Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration.
Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases.
Ensure data quality, governance, and security across the data lifecycle.
Collaborate with product managers by estimating technical tasks and deliverables.
Uphold the mission and values of Monogram Health in all aspects of your role and activities.
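As a shape-of-the-work sketch for the ETL/ELT responsibilities above, the three stages below run on plain Python structures so the example is self-contained; in practice, extract would read from ADLS or a Delta table, transform would be a PySpark job orchestrated by Azure Data Factory or Databricks, and load would write to the warehouse. The column names and region aggregation are invented for illustration.

```python
import csv
import io

# Extract: a CSV string stands in for the raw source that a real pipeline
# would pull from cloud storage or a streaming topic.
RAW = """order_id,amount,region
1,120.50,TN
2,,TN
3,89.99,KY
"""

def extract(source: str):
    """Read raw records into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Cleansing step: drop rows with missing amounts and cast types."""
    clean = []
    for row in rows:
        if row["amount"]:
            clean.append({"order_id": int(row["order_id"]),
                          "amount": float(row["amount"]),
                          "region": row["region"]})
    return clean

def load(rows, sink: dict):
    """Load step: aggregate revenue by region into the 'warehouse' dict."""
    for row in rows:
        sink[row["region"]] = sink.get(row["region"], 0.0) + row["amount"]
    return sink

warehouse = {}
load(transform(extract(RAW)), warehouse)
print(warehouse)  # {'TN': 120.5, 'KY': 89.99}
```

Keeping the stages as separate pure functions mirrors how notebook tasks are chained in a Databricks Job: each stage can be tested, retried, and monitored on its own.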
Position Requirements
A bachelor's degree in computer science, data science, software engineering or related field.
Minimum of five (5) years designing and doing hands-on development of cloud-based analytics solutions, including a minimum of three (3) years' hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
Expert level knowledge of Python or other scripting languages required.
Proficiency in SQL and other data query languages.
Understanding of data modeling and schema design principles.
Ability to work with large datasets and perform data analysis.
Designing and building data integration pipelines using APIs and streaming ingestion methods is desirable.
Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC).
Thorough understanding of Azure Cloud Infrastructure offerings.
Demonstrated problem-solving and troubleshooting skills.
Team player with demonstrated written and communication skills.
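For the API ingestion item in the requirements above, a common pattern is cursor-based pagination wrapped in a generator. The sketch below mocks the HTTP call with a local function (`fetch_page` and its cursor values are invented); a real pipeline would swap in an authenticated HTTP request that passes the cursor as a query parameter.

```python
def fetch_page(cursor):
    """Stand-in for a paginated REST call. Returns (records, next_cursor);
    a next_cursor of None signals the final page."""
    data = {
        0: ([{"id": 1}, {"id": 2}], 2),
        2: ([{"id": 3}], None),
    }
    return data[cursor]

def ingest(start_cursor=0):
    """Generator that walks pages until the API stops returning a cursor,
    so downstream code can stream records without buffering everything."""
    cursor = start_cursor
    while cursor is not None:
        records, cursor = fetch_page(cursor)
        yield from records

ids = [r["id"] for r in ingest()]
print(ids)  # [1, 2, 3]
```

Persisting the last successful cursor between runs is what turns this into incremental (rather than full-reload) ingestion.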
Benefits
Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
Wellness & Growth - Work life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts
Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counseling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home.
Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
Data Engineer
Data engineer job in Spring Hill, TN
Job Description
We're seeking a Data Engineer to help shape the foundation of Zipliens' growing data ecosystem. Our engineering team supports a diverse set of tools and systems that power lien resolution operations, client transparency, and decision-making across the company. In this role, you'll design and maintain reliable data pipelines, optimize data storage and retrieval, and ensure that our systems deliver accurate, timely, and actionable insights. You'll collaborate closely with data analysts, product owners, and engineers to build scalable data infrastructure and contribute to data quality standards that will support the next generation of Zipliens applications.
Requirements
Responsibilities:
Develop and optimize SQL queries and database schemas for efficient data retrieval and storage.
Design, develop, and maintain scalable ETL processes.
Develop and maintain scripts for data processing.
Ensure scalability and performance optimization of data pipelines and queries.
Develop and implement data quality checks and monitoring to ensure data accuracy and reliability.
Contribute to the development and maintenance of data quality standards and best practices.
Collaborate with data analysts to understand requirements and deliver solutions that enable effective reporting and analytics.
Design and build reports to provide actionable insights to stakeholders.
Document data models, ETL processes, and reporting solutions.
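The data quality checks in the responsibilities above often boil down to a small set of batch assertions. This stdlib-only sketch (the field names and rules are illustrative, not Zipliens' actual standards) returns a pass/fail map that could feed a monitoring dashboard or fail a pipeline run.

```python
def run_quality_checks(rows, key="order_id"):
    """Return a dict of check-name -> pass/fail for a batch of records."""
    keys = [r.get(key) for r in rows]
    return {
        # every record must carry the primary key
        "no_missing_key": all(k is not None for k in keys),
        # the primary key must be unique within the batch
        "key_unique": len(keys) == len(set(keys)),
        # business rule: monetary amounts cannot be negative
        "amount_non_negative": all(r.get("amount", 0) >= 0 for r in rows),
    }

batch = [{"order_id": 1, "amount": 50.0},
         {"order_id": 2, "amount": 75.5},
         {"order_id": 2, "amount": -10.0}]  # duplicate key, negative amount
print(run_quality_checks(batch))
# {'no_missing_key': True, 'key_unique': False, 'amount_non_negative': False}
```

In a real pipeline each failed check would be logged with the offending rows so analysts can triage quickly instead of discovering bad data in reports.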
Qualifications:
Bachelor's degree in Business, Computer Information Systems, Computer Science, or equivalent practical experience.
4+ years of experience as a Data Engineer, Senior Data Analyst, or similar role.
Strong proficiency in SQL and experience with relational and cloud-based data storage solutions (e.g., PostgreSQL, MySQL, SQL Server, Snowflake, Redshift, BigQuery).
Experience with ETL tools and techniques.
Experience with a general-purpose programming language (Python preferred).
Familiarity with data warehousing concepts and data modeling principles.
Understanding of cloud platforms (e.g., AWS, Azure, GCP) and their data services is a plus.
Strong analytical and problem-solving skills with a focus on data quality, performance, and reliability.
Collaborative mindset with the ability to communicate effectively with stakeholders.
This role requires on-site presence at least three days per week (60%) in our Spring Hill, TN office.
Benefits
Comprehensive Health Benefits (Medical, Dental, and Vision), including HSA with employer contributions, FSA, and Dependent Care FSA
Company-Paid Life Insurance and Short-Term Disability
401(k) Plan with Company Match
Paid Time Off (Vacation, Sick Leave, and 10 Holidays)
Paid Parental Leave
Pay Disclosure: The total base salary range for this role is $89,000 - $120,000 annually, with an opportunity for a discretionary bonus. Final compensation will be determined based on skills and experience.
Senior Data Architect TO
Data engineer job in Tullahoma, TN
Job Description
WE ARE ARCARITHM, and we are changing the world!
If you are ready to grow your career and change the world with us, then join the Arcarithm team!
We are located in beautiful, downtown Huntsville, AL, one of the fastest growing cities in the U.S.! At Arcarithm, we cultivate and foster an environment of integrity, open communication, work life balance, and career development. We are committed to investing in our employees by offering comprehensive health insurance options, a generous 401K plan, competitive salaries, continuous career growth opportunities, flexible schedules including remote work, mentoring and performance incentives.
Arcarithm is currently seeking top talent in the areas of full stack software development, artificial intelligence, optimization, and data analytics. You will work in a dynamic and challenging environment alongside our customers which include Lockheed Martin, General Dynamics, Northrop Grumman, Raytheon, US Army, US Navy, US Air Force, the Missile Defense Agency, and NASA on cutting edge technologies including machine learning, augmented and virtual reality, big data analytics, and more!
We are excited to continue to change and improve the world through innovation and technology!
Contact us today to hear more about Arcarithm and all we offer!
Job Title: Senior Data Architect
Job Location: Tullahoma, TN
The Sr. Data Architect will be a primary subject matter expert for data within the Digital Enterprise (DE) organization. This organization is responsible for leading the Digital Modernization and Transformation of AEDC. This position is the primary technical counterpart to the Digital Enterprise Manager, helping provide technical input and leadership for AEDC's digital modernization initiatives. The Sr. Data Architect will help shape and lead the development of a Digital Enterprise strategic plan and roadmap. This position is responsible for developing and formalizing data governance, establishing robust policies and procedures to ensure consistent application of the developed data governance standards, and ensuring high data quality standards are followed. The Sr. Data Architect will design and implement a data management program aligned with globally recognized frameworks. This position will work with various mission areas and other stakeholders to implement a data framework optimized for the requirements of the AEDC mission, including ensuring all cybersecurity requirements are met. This role will work with the Digital Enterprise Manager to shape our data strategy, grow and develop a data and analytics team, and work with this team to implement new AI/ML and predictive analytics.
This position is pivotal in aligning data strategies with AEDC's strategic goals, helping various stakeholders understand and leverage the power of data by communicating complex data concepts to a non-technical audience. This position will work closely with the customer's Chief Data Officer and any implemented Digital Engineering committees as assigned. This is a critical position for ensuring the success of the digital future of AEDC.
Job Duties
Technical Advisor for the DE group reporting to the Digital Enterprise Manager.
Liaison between customer's Chief Data Officer and DE group.
Develop and implement a data strategy with roadmap focused on the following areas:
Data Governance - Help establish an AEDC data governance board responsible for setting and enforcing policies and procedures for managing data privacy, security, quality, and compliance.
Data Management - Oversee the collection, storage, integration, and analysis of data across the organization.
Data Architecture - Design and implement data architecture that enables efficient and effective data management.
Data Analytics - Provide insights that enable data-driven decisions.
Data-driven Innovation - Find new ways to use data to create new opportunities.
Cybersecurity - Work with the cybersecurity team to ensure these initiatives are implemented with a secure cyber foundation.
Ensure the organization's data assets are managed, leveraged, and protected to drive business value, in alignment with the customer's Chief Data Officer.
Facilitate and provide inputs and approvals as necessary on business definitions, data quality, data assets, and any changes within their domain.
Work with the Digital Enterprise Manager, stakeholders, and other personnel as necessary to plan and execute Digital Modernization initiatives based on the developed strategic plan.
Collaborate with other functions to ensure communication and value attainment.
Galvanize stakeholders for change management.
It is a condition of employment to wear company issued PPE (Personal Protective Equipment) in accordance with supervisory direction and company policy.
Performs other related duties as required.
Basic Qualifications
B.S. degree in Computer Science, Engineering, Mathematics, Physics, or other related field, with experience in developing data platforms and a minimum of 8 years' progressive experience in a data-related position
Current United States citizenship required.
Preferred Qualifications
Advanced degree in Computer Science, Engineering, Physics, or other related field.
Experience working within Department of Defense or other government organizations.
Strong and proven experience leading and executing an Enterprise Data Strategy, building On-Prem or Cloud based Enterprise Data Platforms and/or Business Intelligence initiative(s).
Experience in large-scale, multi-year transformation and capability building efforts.
Experience in digital transformations and effective digital capability building.
Ability to execute a strategy across a complex enterprise comprised of numerous directorates and functions.
Strong data, AI / data science and technological knowledge, covering recent topics such as data governance, data domains, AI models and techniques, platform, cloud, API, microservices, DevSecOps, engineering, data security/privacy, BI and analytics.
Strong change management leadership; influencing skills with suppliers and customers.
Ability to make tradeoffs in data exchange, usage, partnering, and investment based on a risk-based model.
Analytical ability to develop and implement new concepts and tools.
Well versed in data compliance and regulations.
Active Department of Defense Secret Security Clearance.