
Data engineer jobs in Nashville, TN - 576 jobs

  • DATA SCIENTIST/MACHINE LEARNING (ML) ENGINEER - 74416

    State of Tennessee (4.4 company rating)

    Data engineer job in Nashville, TN

    Executive Service
    DATA SCIENTIST/MACHINE LEARNING (ML) ENGINEER
    Finance & Administration, Strategic Technology Solutions
    Nashville, TN
    Salary: $11,353.00-$14,750.00/monthly ($136,236.00-$177,000.00/yearly)
    Closing Date: 01/27/2026
    This position is designated as Hybrid.

    Background Check: Requires CJIS/FTI fingerprint and name-based checks. This position requires a criminal background check. Therefore, you may be required to provide information about your criminal history in order to be considered for this position.

    Who we are and what we do: The Data Scientist leverages advanced analytics, machine learning, and AI techniques to solve complex business problems. This role involves translating business needs into data-driven solutions, performing exploratory analysis, and developing predictive and generative models. The ideal candidate combines strong technical expertise with business acumen and thrives in collaborative environments.

    How you make a difference in this role: See key responsibilities below.

    Key Responsibilities:
    • Works closely with agency and STS functional, engineering, and technology teams to solve difficult, non-routine analysis problems, applying advanced analytical methods as needed.
    • Conducts end-to-end analysis that includes data gathering and requirements specification, processing, analysis, ongoing deliverables, and presentations.
    • Develops, validates, and deploys machine learning and AI models, including GenAI frameworks.
    • Conducts feature engineering and optimizes algorithms for performance.
    • Communicates insights and recommendations through clear visualizations and presentations.
    • Collaborates with stakeholders, engineers, and MLOps teams to integrate solutions into production.
    • Researches emerging technologies and contributes to internal innovation initiatives.

    Required Skills & Qualifications:
    • Proficiency in Python and data analysis libraries (e.g., Pandas, NumPy, Scikit-learn).
    • Experience with GenAI frameworks and deep learning techniques such as retrieval-augmented generation (RAG), AI agents, and model fine-tuning.
    • Experience with SQL and data manipulation (e.g., Snowflake, Redshift).
    • Strong understanding of statistical modeling, machine learning, and AI concepts.
    • Familiarity with AWS-based AI tools (Bedrock, SageMaker).
    • Ability to communicate technical concepts to non-technical audiences.

    Minimum Qualifications:

    Preferred Skills:
    • Knowledge of data visualization tools (Tableau, Power BI).
    • Familiarity with MLOps practices and version control (Git).
    • Exposure to cloud-based data pipelines and streaming technologies.
    • Ability to write scripts to automate ETL tasks.
    • Actively participates and contributes in brainstorming sessions.
    • Works seamlessly with people from other disciplines: IT engineers, agency SMEs, and business consultants.

    Education & Experience:
    • Bachelor's degree in a quantitative field (Computer Science, Statistics, Mathematics, Engineering); Master's degree preferred.
    • 2+ years of experience in data science, analytics, or related roles.

    Pursuant to the State of Tennessee's Workplace Discrimination and Harassment policy, the State is firmly committed to the principle of fair and equal employment opportunities for its citizens and strives to protect the rights and opportunities of all people to seek, obtain, and hold employment without being subjected to illegal discrimination and harassment in the workplace. It is the State's policy to provide an environment free of discrimination and harassment of an individual because of that person's race, color, national origin, age (40 and over), sex, pregnancy, religion, creed, disability, veteran's status or any other category protected by state and/or federal civil rights laws.
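The retrieval-augmented generation (RAG) technique this posting names reduces, at its core, to a retrieve-then-prompt pattern: fetch the documents most relevant to a question, then prepend them to the prompt sent to a generator. A minimal, dependency-free sketch (the documents and the bag-of-words scoring are invented for illustration, and the LLM call is deliberately stubbed out):

```python
import re
from collections import Counter
from math import sqrt

# Toy document store standing in for an agency knowledge base (invented data).
docs = [
    "Vehicle title replacement requests are handled by the county clerk.",
    "State employees accrue annual leave based on years of service.",
    "The fiscal year for Tennessee state government begins July 1.",
]

def _vec(text):
    """Bag-of-words term counts, a crude stand-in for a real embedding."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def _cosine(a, b):
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query."""
    q = _vec(query)
    return sorted(documents, key=lambda d: _cosine(q, _vec(d)), reverse=True)[:k]

def answer(query, documents):
    """Augment the prompt with retrieved context; the generator call itself
    (e.g. a model behind Bedrock) is omitted here."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(answer("When does the state fiscal year start?", docs))
```

Production systems replace the lexical scorer with vector embeddings and an index, but the control flow is the same.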
    $41k-57k yearly est. 5d ago

  • Senior Algorithm Software Engineer

    Freightwise

    Data engineer job in Nashville, TN

    Hiring: Senior Algorithm / Software Engineer (Transportation Optimization)
    At FreightWise
    Department: Engineering
    Employment Type: Full-Time
    Pay: Based on experience

    Who we are: At FreightWise, we combine agile tech development with deep industry expertise to provide transportation management services and software to small and mid-sized shippers. Founded in 2015, FreightWise has spent 7 years on the Inc. 5000 list and was voted one of the Best Places to Work by the Nashville Business Journal.

    Summary: We are seeking a highly skilled Senior Algorithm / Software Engineer to help design, build, and scale FreightWise's transportation planning and execution systems. This role is central to developing advanced optimization solutions that power real-time freight execution, including load routing and sequencing, capacity planning, tendering workflows, and decision support. The ideal candidate combines production-grade software engineering expertise with hands-on transportation and logistics experience, enabling them to translate complex real-world constraints (such as service windows, carrier capacity, rates, dwell time, and disruptions) into scalable, high-performance systems. This position offers strong technical ownership and leadership opportunities, including mentoring engineers and influencing architecture and optimization strategy across the platform.

    What you will do:
    • Lead the design and evolution of truckload transportation planning and execution solutions, including routing and sequencing, multi-stop optimization, capacity planning, and schedule feasibility analysis
    • Design and apply heuristic and optimization-based approaches to solve large-scale transportation problems under real-world constraints such as hours of service, service windows, carrier capacity and rates, dwell time, and network disruptions
    • Drive architectural decisions that balance solution quality, runtime performance, scalability, and explainability
    • Partner closely with product, platform, data, and operations teams to ensure optimization solutions deliver measurable business and service outcomes
    • Mentor and guide senior and mid-level engineers, setting a high bar for code quality, system design, and applied optimization practices

    What you will need:
    • Bachelor's degree in Computer Science, Mathematics, Operations Research, Engineering, or a related field (Master's or PhD preferred)
    • 7+ years of experience in algorithm development, optimization, or applied data science
    • Proven experience deploying and operating complex optimization solutions in production environments
    • Strong proficiency in Java, Node.js, or similar backend languages
    • Deep understanding of algorithms, data structures, and computational complexity
    • Hands-on experience with optimization techniques such as Linear Programming (LP), Mixed-Integer Programming (MIP), Constraint Programming, and heuristics and metaheuristics
    • Familiarity with Agile development, SaaS architectures, and modern deployment practices
    • Experience with distributed systems, microservices, and cloud platforms (AWS)
    • Experience with transportation routing problems (pickup & delivery, time windows, multi-stop routing), capacity planning, lane optimization, network design, and real-time replanning and exception handling
    • Understanding of logistics constraints including driver hours of service, equipment types, service-level agreements, and cost and rate structures
    • Strong analytical and problem-solving abilities
    • Ability to balance theoretical rigor with practical engineering constraints
    • Clear communication skills for explaining complex algorithms and trade-offs
    • Leadership mindset with a passion for mentoring, collaboration, and building high-performing teams

    What we provide:
    • Insurance: Medical, Dental, Vision, STD, LTD
    • Company-paid Life insurance ($50k)
    • 401(k) with match
    • Paid Time Off and 11 Paid Holidays
    • Hybrid work and flexible schedules
    • Great work environment with opportunity for growth

    E-Verify: FreightWise participates in the federal government's E-Verify program, which confirms employment authorization of all newly hired employees and most existing employees through an electronic database maintained by the Social Security Administration and Department of Homeland Security. For all new hires, the E-Verify process is completed in conjunction with the Form I-9 Employment Eligibility Verification on or before the first day of work. E-Verify is not used as a tool to pre-screen candidates. For up-to-date information on E-Verify, go to ************* and click on 'E-Verify' located near the bottom of the page.
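The routing heuristics this role calls for can start from something as simple as greedy construction. A toy nearest-neighbor sketch (stop names and flat-plane coordinates are invented; a real system would layer time windows, hours of service, and metaheuristic improvement such as tabu search on top of a seed route like this):

```python
from math import dist

# Invented stop coordinates on a flat plane (units arbitrary).
stops = {
    "DC": (0, 0),
    "Memphis": (200, 10),
    "Knoxville": (180, -60),
    "Chattanooga": (130, -80),
}

def nearest_neighbor_route(depot, stops):
    """Greedy construction heuristic: from the current location, always
    drive to the closest unvisited stop. Cheap to compute, and a common
    seed solution for metaheuristics to refine."""
    route, here = [depot], stops[depot]
    remaining = {name: xy for name, xy in stops.items() if name != depot}
    while remaining:
        nxt = min(remaining, key=lambda name: dist(here, remaining[name]))
        route.append(nxt)
        here = remaining.pop(nxt)
    return route

def route_length(route, stops):
    """Total distance over consecutive legs of the route."""
    return sum(dist(stops[a], stops[b]) for a, b in zip(route, route[1:]))

best = nearest_neighbor_route("DC", stops)
print(best, round(route_length(best, stops), 1))
```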
    $78k-102k yearly est. 3d ago
  • Data Scientist, Analytics (Technical Leadership)

    Meta (4.8 company rating)

    Data engineer job in Nashville, TN

    We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

    **Required Skills:** Data Scientist, Analytics (Technical Leadership) Responsibilities:
    1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
    2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
    3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
    4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
    5. Build long-term vision and strategy for programs and products
    6. Collaborate with executives to define and develop data platforms and instrumentation
    7. Effectively communicate insights and recommendations to stakeholders
    8. Define success metrics, forecast changes, and set team goals
    9. Support developing roadmaps and coordinate analytics efforts across teams

    **Minimum Qualifications:**
    10. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g., Mathematics, Statistics, Operations Research), or equivalent practical experience
    11. 5+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab)
    12. 8+ years of work experience leading analytics work in an IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization, and strategy development
    13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics, and Finance
    14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
    15. Experience communicating complex technical topics in a clear, precise, and actionable manner

    **Preferred Qualifications:**
    16. 10+ years of experience communicating the results of analyses to leadership teams to influence strategy
    17. Master's or Ph.D. degree in a quantitative field
    18. Bachelor's degree in an analytical or scientific field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research)
    19. 10+ years of experience doing complex quantitative analysis in product analytics

    **Public Compensation:** $210,000/year to $281,000/year + bonus + equity + benefits

    **Industry:** Internet

    **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law.
Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
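The experimentation experience this posting asks for often starts from a two-proportion z-test on conversion rates between a control and a treatment arm. A self-contained sketch under the usual normal approximation (the traffic and conversion numbers are invented for illustration):

```python
from math import erf, sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between a
    control arm (A) and a treatment arm (B), normal approximation."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided tail
    return z, p_value

# Invented experiment: 5.0% vs 5.6% conversion with 20,000 users per arm.
z, p = two_proportion_ztest(conv_a=1000, n_a=20000, conv_b=1120, n_b=20000)
print(f"z={z:.2f}, p={p:.4f}")
```

Real experimentation platforms add sequential-testing corrections, variance reduction, and guardrail metrics, but this primitive underlies most of them.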
    $210k-281k yearly 60d+ ago
  • Data Scientist, Privacy

    Datavant

    Data engineer job in Nashville, TN

    Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible, and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.

    By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational, and life experiences to realize our bold vision for healthcare.

    As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us industry leaders in this area, and in stimulating discussions on re-identification risk. You will be supported in developing and consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.

    **You Will:**
    + Critically analyze large health datasets using standard and bespoke software libraries
    + Discuss your findings and progress with internal and external stakeholders
    + Produce high-quality reports that summarize your findings
    + Contribute to research activities as we explore novel and established sources of re-identification risk

    **What You Will Bring to the Table:**
    + Excellent communication skills
    + Meticulous attention to detail in the production of comprehensive, well-presented reports
    + A good understanding of statistical probability distributions, bias, error, and power, as well as sampling and resampling methods
    + A drive to understand real-world data in context rather than considering it in the abstract
    + Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in its language
    + Application of scientific methods to practical problems through experimental design, exploratory data analysis, and hypothesis testing to reach robust conclusions
    + Strong time management skills and demonstrable experience of prioritizing work to meet tight deadlines
    + Initiative and the ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
    + An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
    + Familiarity with Amazon Web Services cloud-based storage and computing facilities

    **Bonus Points If You Have:**
    + Experience creating documents using LaTeX
    + Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
    + Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.

    We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive.
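Re-identification risk of the kind Privacy Hub studies is often framed through k-anonymity: the size of the smallest group of records that share the same quasi-identifiers. A minimal Python illustration (the records and the choice of quasi-identifiers are invented for the example):

```python
from collections import Counter

# Toy records whose quasi-identifiers are (age band, 3-digit ZIP, sex).
# All values are invented for illustration.
records = [
    ("30-39", "372", "F"), ("30-39", "372", "F"), ("30-39", "372", "F"),
    ("40-49", "370", "M"), ("40-49", "370", "M"),
    ("60-69", "371", "F"),  # unique combination: an equivalence class of size 1
]

def k_anonymity(records):
    """A dataset is k-anonymous if every quasi-identifier combination is
    shared by at least k records; k is the smallest class size observed."""
    return min(Counter(records).values())

def risky_classes(records, k=2):
    """Quasi-identifier combinations held by fewer than k records,
    i.e. the ones most exposed to re-identification."""
    return [qi for qi, n in Counter(records).items() if n < k]

print(k_anonymity(records), risky_classes(records))
```

In practice the analysis also weighs population uniqueness and attacker knowledge, but counting equivalence classes is the usual starting point.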
    We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services. The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job. The estimated total cash compensation range for this role is: $104,000-$130,000 USD. To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion. This job is not eligible for employment sponsorship. Datavant is committed to a work environment free from job discrimination. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** .
Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay. At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way. Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis. For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
    $104k-130k yearly 13d ago
  • AWS Data Migration Consultant

    Slalom (4.6 company rating)

    Data engineer job in Nashville, TN

    Candidates can live within commutable distance of any Slalom office in the US. We have a hybrid and flexible environment.

    Who You'll Work With: As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.

    We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions. As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.

    What You'll Do
    * Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
    * Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
    * Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
    * Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
    * Implement high-availability and disaster recovery (HA/DR) strategies including Always On, failover clusters, log shipping, and replication, tailored to each RDBMS.
    * Ensure security best practices are followed, including IAM-based access control, encryption, and compliance with industry standards.
    * Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
    * Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
    * Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.

    What You'll Bring
    * 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
    * Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
    * Hands-on experience with AWS database services (RDS, EC2-hosted databases).
    * Strong understanding of HA/DR solutions and cloud database design patterns.
    * Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
    * Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
    * Strong troubleshooting and analytical skills to resolve complex database and performance issues.
    * Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.

    Nice to Have
    * AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
    * Experience with NoSQL databases or hybrid data architectures.
    * Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
    * Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
    * Experience with DB2 in on-premises or cloud-hosted environments.
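DMS migrations of the kind described here are driven by a table-mappings JSON document of selection rules. A small sketch that builds one (the schema and table names are hypothetical; the resulting string is what would be passed as `TableMappings` to a DMS replication task, and the AWS call itself is omitted):

```python
import json

def dms_table_mapping(schema, tables):
    """Build the table-mappings JSON an AWS DMS replication task uses to
    select which tables to migrate (selection-rule structure per the DMS
    documentation; schema/table names here are hypothetical)."""
    rules = [
        {
            "rule-type": "selection",
            "rule-id": str(i),
            "rule-name": f"include-{table}",
            "object-locator": {"schema-name": schema, "table-name": table},
            "rule-action": "include",
        }
        for i, table in enumerate(tables, start=1)
    ]
    return json.dumps({"rules": rules}, indent=2)

mapping = dms_table_mapping("dbo", ["orders", "customers"])
# This string would be supplied as TableMappings when creating the
# replication task via a boto3 DMS client.
print(mapping)
```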
    About Us
    Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

    Compensation and Benefits
    Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.

    Slalom is committed to fair and equitable compensation practices. For this position, in the following locations: Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, and New Jersey, the target base salary pay range is $105,000-$147,000 for the Consultant level, $120,000-$169,000 for the Senior Consultant level, and $133,000-$187,000 for the Principal level. In all other markets, the target base salary pay range is $96,000-$135,000 for the Consultant level, $110,000-$155,000 for the Senior Consultant level, and $122,000-$172,000 for the Principal level. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors.
The salary pay range is subject to change and may be modified at any time. EEO and Accommodations Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We will accept applications until 1/31/2026 or until the positions are filled.
    $133k-187k yearly 2d ago
  • 211018 / Consultant - Data Migration

    Procom Services

    Data engineer job in Nashville, TN

    Procom is a leading provider of professional IT services and staffing to businesses and governments in Canada. With revenues over $500 million, the Branham Group has recognized Procom as the 3rd largest professional services firm in Canada, and it is now the largest "Canadian-Owned" IT staffing/consulting company.

    Procom's areas of staffing expertise include:
    • Application Development
    • Project Management
    • Quality Assurance
    • Business/Systems Analysis
    • Data Warehouse & Business Intelligence
    • Infrastructure & Network Services
    • Risk Management & Compliance
    • Business Continuity & Disaster Recovery
    • Security & Privacy

    Specialties:
    • Contract Staffing (Staff Augmentation)
    • Permanent Placement (Staff Augmentation)
    • ICAP (Contractor Payroll)
    • Flextrack (Vendor Management System)

    Job Description: Assess and execute a migration from Connect Enterprise to Sterling File Gateway.

    Qualifications: Experience with Sterling Commerce application migrations. Well-developed analytical skills and written communication skills. Must have very good written and verbal communication skills.

    Required skills/experience for this role include:
    • Sterling Commerce product migration experience (5 years minimum)
    • Sterling File Gateway experience (5 years minimum)
    • Connect Direct product experience (5 years minimum)
    • Sterling Enterprise experience (5 years minimum)
    • Scripting language experience (Perl, Unix shell scripting, and Windows scripting) (5 years minimum)
    • DOS batch scripting (5 years minimum)
    • System analysis experience (5 years minimum)

    Additional Information: PLEASE NOTE THAT WE ARE NOT ABLE TO WORK WITH CANDIDATES ON H1B VISAS OR CANDIDATES REPRESENTED BY THIRD PARTIES.
    $64k-87k yearly est. 1d ago
  • Senior Data Consultant

    Arctiq

    Data engineer job in Nashville, TN

    We are looking for a Senior Data Consultant to join our Delivery organization. In this role, we are looking for someone with strong communication skills, a results-oriented mindset, and a passion for presenting and teaching. The ideal candidate will play a key role in designing, developing, and implementing data solutions for our clients. As a Senior Data Consultant, you will collaborate with cross-functional teams, analyze data requirements, and provide innovative solutions to address complex business challenges. This role requires a deep understanding of cloud architecture and ETL/ELT processes, and proficiency in various data technologies. Embrace a culture of continuous learning and tech experimentation as we navigate the forefront of emerging technologies.

    Role Responsibilities
    Directly supporting the Data Services Practice, you will be responsible for the core delivery of professional services engagements with our customers. Specifically, you will be responsible for identifying requirements, developing solution architectures, delivering solutions directly or overseeing their technical delivery, and providing post-deployment enablement. You will be required to follow industry standards, delivering top-tier solutions to our valued customers.
    • Assess, recommend, and implement DevOps technology solutions in alignment with client business requirements, ensuring a strategic fit and optimal functionality.
    • Lead clients towards industry best practices in cloud adoption and innovative business engagements, fostering a transformative approach to technology solutions.
    • Develop innovative solutions, architectures, proofs of concept, demo/lab environments, and compelling business cases. Showcase the benefits of digital transformation programs and the adoption of cutting-edge solutions.
    • Work closely with the Project Management Office (PMO) to provide visibility into project delivery scope, timelines, and expectations. Ensure the delivery of high-quality projects on time and within budget.
    • Collaborate with Partner and Marketing teams to develop content, including sales collateral, blog posts, podcasts, and live workshops. Present at company and partner events, contributing to thought leadership in the industry.
    • Provide mentorship and coaching to junior team members, fostering professional development.
    • Actively participate in professional associations, industry events, and community engagements, contributing to intellectual property (IP) development.
    • Keep abreast of industry trends and technology developments by maintaining partner certifications and actively participating in technology events.

    • Cloud Engineering and Architecture: Deep understanding of cloud technologies; ability to deploy cloud infrastructure at scale using leading infrastructure-as-code platforms.
    • Information Gathering: Collaborate with clients to understand their data needs and platform requirements; gather information via client interviews and review of client documentation.
    • Architecture Design: Design scalable and efficient data architecture based on business requirements and industry-leading approaches; develop and maintain conceptual, logical, and physical solution models.
    • Data Integration: Implement data integration solutions to enable seamless flow of information across various systems; work with APIs, web services, and other integration tools to connect disparate data sources.
    • ETL/ELT Development: Design, develop, and optimize ETL processes to extract, transform, and load data from diverse sources into data warehouses or data lakes; ensure data quality and integrity throughout the ETL pipeline.
    • Database Management: Manage and optimize databases for performance, scalability, and reliability; implement best practices for data storage, indexing, and query optimization.
    • Data Security and Compliance: Implement and enforce data security measures to protect sensitive information; ensure compliance with relevant data regulations and standards.
    • Collaboration and Communication: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders; communicate complex technical concepts to non-technical stakeholders in a clear and understandable manner.

    What will you bring to the role?
    • Bilingual French & English is an asset.
    • Bachelor's or Master's degree in computer engineering/science, data engineering/science, or a related field.
    • Demonstrated and applied experience in establishing and delivering complex projects, showcasing a track record of successful implementations.
    • Proven experience as a Data Engineer or in a similar role.
    • Skilled at coding and scripting languages, including but not limited to Python, SQL, YAML, and Terraform.
    • Experience with cloud solution implementation, data integration, ETL development, and database management.
    • Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Databricks, BigQuery, etc.).
    • Familiarity with Big Data technologies such as Hadoop, Spark, and Kafka, and related tools.
    • Proficient use of SCM tools such as Git, GitHub, and GitLab for efficient version control and collaborative development.
    • Experience with automation and configuration management solutions, utilizing tools like Ansible, Terraform, Octopus Deploy, AWS Config, and Azure Automation & Control.

    Desired Certifications: While not required, the following certifications demonstrate valuable knowledge and may strengthen your application:
    • Google Professional Data Engineer
    • Databricks Certified Data Engineer Professional
    • Data Engineering on Microsoft Azure
    • Public Cloud Professional Certifications (Azure, AWS, GCP)
    We also recognize the value of general data, AI, and ML certifications. Don't have these certifications yet? No problem! We're committed to investing in promising talent and providing on-the-job training for motivated individuals.
If you are passionate about data and advanced analytics, thrive in a dynamic environment, and are eager to contribute to innovative solutions, we encourage you to apply for this exciting opportunity. Arctiq is an equal opportunity employer. If you need any accommodations or adjustments throughout the interview process and beyond, please let us know. We celebrate our inclusive work environment and welcome members of all backgrounds and perspectives to apply. We thank you for your interest in joining the Arctiq team! While we welcome all applicants, only those who are selected for an interview will be contacted.
    $64k-87k yearly est. 60d+ ago
  • Data Streaming Consultant (Big Data)

    Ventures Unlimited

    Data engineer job in Nashville, TN

    Title: Data Streaming Consultant (Big Data) Duration: Full Time Job Description: Mandatory Skills: Hadoop, Java, Spark (or Storm or Kafka) Desirable Skills: Data warehousing, SQL Detailed JD: • Take on sprint activities related to real-time streaming requirements (Spark Streaming), Big Data technologies, platform and workflow orchestration, etc. • Perform quick PoCs • Train other team members on big data and real-time technologies. Skills: - Experience working with Hadoop, NoSQL - Spark, Storm, or other streaming/real-time technologies - Java - Preferred: SQL, data warehousing Qualifications: Big Data, Java, Spark Additional Information: Interested candidates, please contact me on "************" Ext 157
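The sprint work above centers on windowed aggregation over an unbounded event stream, which is what Spark Streaming (or Storm/Kafka) handles at scale. As a rough plain-Python sketch of the underlying pattern (illustrative only; not Spark code, and the 60-second window is an assumed parameter):

```python
from collections import deque

class SlidingWindowCounter:
    """Counts events inside a sliding time window: the core pattern
    behind a windowed streaming aggregation (illustrative sketch)."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # event timestamps, oldest first

    def add(self, timestamp):
        self.events.append(timestamp)
        self._evict(timestamp)

    def count(self, now):
        self._evict(now)
        return len(self.events)

    def _evict(self, now):
        # Drop events that have fallen out of the window.
        while self.events and self.events[0] <= now - self.window:
            self.events.popleft()

counter = SlidingWindowCounter(window_seconds=60)
for t in [0, 10, 30, 65, 70]:
    counter.add(t)
print(counter.count(70))  # events at 30, 65, 70 remain -> 3
```

A real Spark Streaming job expresses the same idea declaratively (window and slide durations on a DStream or Structured Streaming query) and distributes the state across executors.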
    $64k-87k yearly est. 60d+ ago
  • Data Onboarding Consultant

    Corpay

    Data engineer job in Brentwood, TN

    What We Need Corpay is currently looking to hire a Data Onboarding Consultant within our Implementations division. This position falls under our Corporate Payments line of business based out of our Brentwood, TN location. In this role, you will manage critical data activities for Corpay's clients to ensure successful implementation and ongoing client success. This position combines client-facing and internal technical responsibilities. The ideal candidate is one who enjoys working with clients to assist them in navigating complex data landscapes, is analytical by nature, allowing them to understand non-uniform data sets from various sources, can drive project success by creating deadlines and holding both internal and external parties accountable to performance, and can balance competing priorities to ensure ultimate success. The ideal candidate is a problem solver, a great communicator, and, most importantly, takes ownership of their projects and drives them to success. You will report directly to the Manager of Technical Implementations. How We Work As a Data Onboarding Consultant, you will be expected to work out of our Brentwood, TN office location. 
Corpay will set you up for success by providing: Company-issued equipment Assigned workspace in our Brentwood office Formal, hands-on training Role Responsibilities The responsibilities of the role will include: This is a customer-facing role: serve as the primary point of client contact for all data services from the sales process through implementation Work with clients and internal partners to obtain and validate data to be used in data services Analyze client data and present findings to improve the results of the data being ingested Utilize data cleaning and mapping tools to ingest data into the application Coordinate the scoping, prioritization, delivery, and, where applicable, ongoing maintenance of client data services (one-time data import, ongoing data integrations) First line of defense for triaging issues related to data imports/data integrations Work with clients and internal stakeholders to maintain a prioritized queue of data services deliverables Contribute to the overall strategy for Implementations Qualifications & Skills 2-5 years' experience in managing or working with data (training/education counts) Comfortable communicating complex information in simple terms Experience managing projects Experience working with large, non-uniform data sets Experience working directly with clients and prospects to assess needs and define technical solutions Experience with data mapping and BI tools While this is not an engineering role, familiarity with engineering tools and practices will be greatly beneficial Benefits & Perks Medical, Dental & Vision benefits available the 1st month after hire Automatic enrollment into our 401k plan (subject to eligibility requirements) Virtual fitness classes offered company-wide Robust PTO offers including major holidays, vacation, sick, personal, & volunteer time Employee discounts with major providers (e.g., wireless, gym, car rental, etc.) 
Philanthropic support with both local and national organizations Fun culture with company-wide contests and prizes Equal Opportunity/Affirmative Action Employer Corpay is an Equal Opportunity Employer. Corpay provides equal employment opportunities to all qualified applicants without regard to race, color, gender (including pregnancy), religion, national origin, ancestry, disability, age, sexual orientation, gender identity or expression, marital status, language, genetic information and/or military status or any other group status protected by federal or local law. If you require reasonable accommodation for the application and/or interview process, please notify a representative of the Human Resources Department. For more information about our commitment to equal employment opportunity and pay transparency, please click the following links: EEOC and Pay Transparency.
    $64k-87k yearly est. 7d ago
  • Sr Data Engineer, Palantir

    The Hertz Corporation 4.3company rating

    Data engineer job in Nashville, TN

    **A Day in the Life:** We are seeking a talented **Sr Data Engineer, Palantir (experience required)** to join our Strategic Data & Analytics team working on Hertz's strategic applications and initiatives. This role will work in multi-disciplinary teams rapidly building high-value products that directly impact our financial performance and customer experience. You'll build cloud-native, large-scale, employee-facing software using modern technologies including React, Python, Java, AWS, and Palantir Foundry. The ideal candidate will have strong development skills across the full stack, a growth mindset, and a passion for building software at a sustainable pace in a highly productive engineering culture. Experience with Palantir Foundry is required; beyond that, we're looking for engineers who are eager to learn and committed to engineering excellence. We expect the starting salary to be around $135k, commensurate with experience. **What You'll Do:** Day-to-Day Responsibilities + Work in balanced teams consisting of Product Managers, Product Designers, and engineers + Test first - We strive for Test-Driven Development (TDD) for all production code + CI (Continuous Integration) everything - Automation is core to our development process + Architect user-facing interfaces and design functions that help users visualize and interact with their data + Contribute to both frontend and backend codebases to enhance and develop projects + Build software at a sustainable pace to ensure longevity, reliability, and higher quality output Frontend Development + Design and develop responsive, intuitive user interfaces using React and modern JavaScript/TypeScript + Build reusable component libraries and implement best practices for frontend architecture + Generate UX/UI designs (no dedicated UX/UI designers on team) with considerations for usability and efficiency + Optimize applications for maximum speed, scalability, and accessibility + Develop large-scale, 
web and mobile software utilizing appropriate technologies for use by our employees Backend Development + Develop and maintain RESTful APIs and backend services using Python or Java + Design and implement data models and database schemas + Deploy to cloud environments (primarily AWS) + Integrate with third-party services and APIs + Write clean, maintainable, and well-documented code Palantir Foundry Development (Highly Preferred) + Build custom applications and integrations within the Palantir Foundry platform + Develop Ontology-based applications leveraging object types, link types, and actions + Create data pipelines and transformations using Python transforms + Implement custom widgets and user experiences using the Foundry SDK + Design and build functions that assist users to visualize and interact with their data Product Development & Delivery + Research problems and break them into deliverable parts + Work with a Lean mindset and deliver value quickly + Participate in all stages of the product development and deployment lifecycle + Conduct code reviews and provide constructive feedback to team members + Work with product managers and stakeholders to define requirements and deliverables + Contribute to architectural decisions and technical documentation **What We're Looking For:** + Experience with Palantir Foundry platform, required + 5+ years in web front-end or mobile development + Bachelor's or Master's degree in Computer Science or other related field, preferred + Strong proficiency in React, JavaScript/TypeScript, HTML, and CSS for web front-end development + Strong knowledge of one or more Object Oriented Programming or Functional Programming languages such as JavaScript, Typescript, Java, Python, or Kotlin + Experience with RESTful API design and development + Experience deploying to cloud environments (AWS preferred) + Understanding of version control systems, particularly GitHub + Experience with relational and/or NoSQL databases + Familiarity with 
modern frontend build tools and package managers (e.g., Webpack, npm, yarn) + Experience with React, including React Native for mobile app development, preferred + Experience in Android or iOS development, preferred + Experience with data visualization libraries (e.g., D3.js, Plotly, Chart.js), preferred + Familiarity with CI/CD pipelines and DevOps practices, preferred + Experience with Spring framework, preferred + Working knowledge of Lean, User Centered Design, and Agile methodologies + Strong communication skills and ability to collaborate effectively across teams + Growth mindset - Aptitude and willingness to learn new technologies + Empathy - Kindness and empathy when building software for end users + Pride - Takes pride in engineering excellence and quality craftsmanship + Customer obsession - Obsessed with the end user experience of products + Strong problem-solving skills and attention to detail + Ability to work independently and as part of a balanced, multi-disciplinary team + Self-motivated with a passion for continuous learning and improvement **What You'll Get:** + Up to 40% off the base rate of any standard Hertz Rental + Paid Time Off + Medical, Dental & Vision plan options + Retirement programs, including 401(k) employer matching + Paid Parental Leave & Adoption Assistance + Employee Assistance Program for employees & family + Educational Reimbursement & Discounts + Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness + Perks & Discounts - Theme Park Tickets, Gym Discounts & more The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, the Caribbean, Latin America, Africa, the Middle East, Asia, Australia, and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world. 
**US EEO STATEMENT** At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company. Individuals are encouraged to apply for positions because of the characteristics that make them unique. EOE, including disability/veteran
    $135k yearly 45d ago
  • Data Engineer with GCP

    Conflux Systems, Inc.

    Data engineer job in Nashville, TN

    Hi Team, please share resumes ASAP. Data Engineers with GCP, with experience in data processing using Dataflow and Dataproc. Remote work: No. Rate: USD $60/hr. Requirements: · Bachelor's degree in computer science, a related technical field, or equivalent experience · 5+ years of experience in Information Technology · Good understanding of best practices and standards for GCP data process design and implementation. · One-plus years of hands-on experience with the GCP platform and experience with many of the following components: Cloud Run, Cloud Functions, Pub/Sub, Bigtable, Firestore, Cloud SQL, Cloud Spanner, JSON, Avro, Parquet, Python, Terraform, BigQuery, Dataflow, Data Fusion, Cloud Composer, Dataproc, CI/CD, Cloud Logging, GitHub · Ability to multitask and balance competing priorities. · Ability to define and utilize best-practice techniques and to impose order in a fast-changing environment. · Strong problem-solving skills. · Strong verbal, written, and interpersonal skills, including a desire to work within a highly matrixed, team-oriented environment. · Experience in the healthcare domain preferred · Hardware/Operating Systems: GCP, distributed, highly scalable processing environments, Linux, UNIX
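Much of the Dataflow and Dataproc experience asked for above comes down to writing per-element transforms over large collections of records - the function a Beam `Map` step or a Spark `map` would apply to each element. A hedged, plain-Python sketch of such a transform, with a hypothetical event schema not taken from the posting:

```python
import json

def normalize_record(line):
    """Parse one raw JSON event and normalize it for loading into a
    warehouse table (hypothetical schema, for illustration only)."""
    rec = json.loads(line)
    return {
        "user_id": str(rec["user_id"]),
        "event": rec.get("event", "unknown").lower(),
        # Store money as integer cents to avoid float drift downstream.
        "amount_cents": int(round(float(rec.get("amount", 0)) * 100)),
    }

raw = [
    '{"user_id": 42, "event": "Purchase", "amount": "19.99"}',
    '{"user_id": 7, "event": "View"}',
]
rows = [normalize_record(line) for line in raw]
print(rows[0]["amount_cents"])  # 1999
```

In an actual Dataflow pipeline this function would run inside a Beam transform (e.g., `beam.Map(normalize_record)`) across workers rather than in a local list comprehension.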
    $60 hourly 56d ago
  • Data Engineer, Data Platform

    Mechanical Licensing Collective

    Data engineer job in Nashville, TN

    Are you interested in joining a purpose-driven company in the music industry? Do you thrive in a collaborative, hybrid work environment? If you do, we would like to get to know you. WORKING AT THE MLC The MLC is committed to excellence, service and transparency. Our culture is collaborative, and our team works in a hybrid environment. On our team, you are respected, valued for your unique strengths and experiences, and empowered to identify and resolve your own challenges. THE ROLE We are looking for a results-oriented and creative Data Engineer to join The MLC Data Platform Team. You will join a growing team working on our Data Platform in a fast-paced, friendly environment. The Data Platform function enables The MLC to obtain intelligence from its ever-expanding dataset. As an Engineer on The MLC Data Platform Team, you will implement and maintain data pipelines, BI, and reporting tools in alignment with the organization's data strategy, as well as ensure continuous stability, high availability, and performance. QUALIFICATIONS Minimum 3 years' development experience using multiple programming languages Demonstrable experience working on large datastores using Big Data techniques Understanding of ETL and orchestration tools such as Prefect, Airflow, and Dagster Experience with AWS and infrastructure as code, ideally with Terraform ESSENTIAL RESPONSIBILITIES AS A DEVELOPER YOU WILL Implement and maintain scalable and reliable data pipelines Be familiar with software development best practices such as Continuous Integration and Continuous Delivery Deploy your code on various platforms AS A MEMBER OF THE TECHNOLOGY TEAM YOU WILL Work in an Agile environment and participate actively in team ceremonies and collaborative planning Be adaptable to change and able to deal with ambiguity Be able to seek resolution when confronted with technical challenges Embrace a data-driven, collaborative, and continuous improvement mindset. 
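The orchestration tools named in the qualifications (Prefect, Airflow, Dagster) all reduce to one core idea: executing pipeline tasks in dependency order. A minimal stdlib sketch of that idea using Python's `graphlib`, with made-up task names for illustration:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract feeds two transforms,
# both of which must finish before the load step runs.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "enrich": {"extract"},
    "load": {"clean", "enrich"},
}

# static_order() yields tasks in a valid dependency order.
order = list(TopologicalSorter(dag).static_order())
print(order)  # 'extract' runs first, 'load' runs last
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks ("clean" and "enrich" here) on top of exactly this topological ordering.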
YOU WILL CHAMPION THE MLC'S CULTURE BY: Embracing The MLC's leadership values and applying The MLC's Guiding Principles to your team's work Being process-oriented, data-driven, and tech-savvy; being collaborative, curious, and open to new ideas Building a diverse and dynamic team; mentoring team members; developing future leaders Inspiring others with your enthusiasm and humility THE MLC IS AN EQUAL OPPORTUNITY EMPLOYER THAT COMMITS TO PURSUING, HIRING, AND CELEBRATING A DIVERSE WORKFORCE AND CREATING AN INCLUSIVE ENVIRONMENT. THE MLC DOES NOT MAKE EMPLOYMENT DECISIONS BASED ON RACE, COLOR, RELIGION OR RELIGIOUS BELIEF, ETHNIC OR NATIONAL ORIGIN, SEX, GENDER, GENDER-IDENTITY, SEXUAL ORIENTATION, MARITAL STATUS, CITIZENSHIP STATUS, DISABILITY, AGE, MILITARY OR VETERAN STATUS, OR ANY OTHER CATEGORY PROTECTED BY LOCAL, STATE, OR FEDERAL LAW. THIS POLICY APPLIES TO ALL TERMS AND CONDITIONS OF EMPLOYMENT, INCLUDING RECRUITING, HIRING, PLACEMENT, PROMOTION, TERMINATION, LAYOFF, TRANSFER, LEAVES OF ABSENCE, AND COMPENSATION.
    $70k-94k yearly est. Auto-Apply 3d ago
  • Data Engineer

    Evidencecare

    Data engineer job in Nashville, TN

    Data Engineer Company Name: EvidenceCare Position Type: Full Time Pay: Salary Reports to: Database Architect EvidenceCare is a fast-paced company scaling its market share with innovative healthcare products. Our products not only address the needs of inefficient healthcare delivery but also disrupt the status quo. EvidenceCare is a unique clinical decision support system (CDSS) because of its EHR-integrated platform that optimizes clinician workflows to deliver better patient care, reduce hospital costs, and capture more revenue. Founded in response to the professional experience of emergency physician Dr. Brian Fengler, the platform provides clinicians with evidence-based care and measurable outcomes. Company Vision: We envision a day when every clinical decision will deliver the right care at the right time. Company Mission: To empower better care decisions. Company Values: Grit, Respect, Innovation, Teamwork, Integrity, and Fun About the Role We're looking for a Data Engineer who combines deep expertise in data infrastructure with strong Python programming skills. You'll design and maintain ETL processes, build applications that perform complex data calculations, and generate reports that drive business decisions. You'll work across our data stack, from ingestion through to our data warehouse, and own the systems that turn millions of rows of raw data into actionable insights. 
What You'll Do Design and build data pipelines that are reliable, performant, and maintainable Develop Python applications for data calculation, transformation, and automated report generation Maintain and enhance existing legacy Python applications while we modernize our platform; this includes bug fixes, performance improvements, and incremental refactoring Work with SQL databases (PostgreSQL, Snowflake) to optimize queries, design schemas, and ensure data integrity Contribute to the design and maintenance of our data warehouse architecture Build and maintain APIs and services that expose data products to internal and external consumers Contribute to data quality frameworks, monitoring, and alerting Participate in architecture decisions and help evolve our data platform Collaborate with Product and Clinical teams to translate business requirements into technical solutions What We're Looking For 3-5 years of experience in data engineering or a similar role Strong proficiency in Python, including experience building production applications (not just scripts) Comfortable working with legacy codebases: you can navigate unfamiliar code, understand its intent, and improve it incrementally without breaking things Solid SQL skills with experience in PostgreSQL, data warehouse platforms, query optimization, and schema design Experience building and maintaining ETL pipelines that process millions of rows reliably Experience with cloud data platforms, preferably AWS and Snowflake Familiarity with data orchestration tools (Airflow, Dagster, or similar) Understanding of software engineering best practices: version control, testing, code review, CI/CD Ability to communicate technical concepts clearly to both technical and non-technical audiences Nice to Have Experience in healthcare or another compliance-sensitive industry Familiarity with reporting frameworks or BI tools Exposure to containerization (Docker, Kubernetes) Experience with event-driven architectures or message 
queues Join Us: If you're passionate about data engineering and want to contribute to innovative healthcare solutions, apply to join our team at EvidenceCare. Benefits: Competitive salary + stock option opportunities Unlimited PTO Company-provided laptop Medical, Dental, Vision, & Life Insurance Benefit Plans Company 401k plan Frequent company and team outings to celebrate wins and life together Professional development opportunities through conferences and online courses
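The data-quality and ETL-reliability work this role describes typically starts with a row-level validation gate before load. A hedged stdlib sketch of that gate (the column names are hypothetical, not EvidenceCare's actual schema):

```python
def validate_rows(rows, required=("patient_id", "encounter_date")):
    """Split rows into loadable and rejected sets: a minimal version of
    the validation gate an ETL pipeline runs before a warehouse load."""
    good, bad = [], []
    for row in rows:
        # A field is missing if the key is absent or the value is falsy.
        missing = [col for col in required if not row.get(col)]
        if missing:
            bad.append((row, missing))  # keep the reason for rejection
        else:
            good.append(row)
    return good, bad

rows = [
    {"patient_id": "p1", "encounter_date": "2024-05-01"},
    {"patient_id": "", "encounter_date": "2024-05-02"},
]
good, bad = validate_rows(rows)
print(len(good), len(bad))  # 1 1
```

Production data-quality frameworks (Great Expectations, dbt tests, and similar) generalize this into declarative checks plus the monitoring and alerting the posting mentions.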
    $70k-94k yearly est. Auto-Apply 1d ago
  • Engineer, Data

    Holley Performance

    Data engineer job in Nashville, TN

    Job Description This role focuses on backend development and integrations for building and maintaining enterprise data warehouses and data lakes. The ideal candidate will possess a deep understanding of data architecture, ETL pipelines, and integration technologies, ensuring seamless data flow and accessibility across the organization. Key Responsibilities: · Design, develop, and maintain scalable backend systems to support data warehousing and data lake initiatives. · Build and optimize ETL/ELT processes to extract, transform, and load data from various sources into centralized data repositories. · Develop and implement integration solutions for seamless data exchange between systems, applications, and platforms. · Collaborate with data architects, analysts, and other stakeholders to define and implement data models, schemas, and storage solutions. · Ensure data quality, consistency, and security by implementing best practices and monitoring frameworks. · Monitor and troubleshoot data pipelines and systems to ensure high availability and performance. · Stay up-to-date with emerging technologies and trends in data engineering and integration to recommend improvements and innovations. · Document technical designs, processes, and standards for the team and stakeholders. Qualifications: · Bachelor's degree in Computer Science, Engineering, or a related field; equivalent experience considered. · Five or more years of proven experience as a Data Engineer or in a similar backend development role. · Strong proficiency in programming languages such as Python, Java, or Scala. · Hands-on experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Informatica, etc.). · Extensive knowledge of relational and non-relational databases (e.g., SQL, NoSQL, PostgreSQL, MongoDB). · Expertise in building and managing enterprise data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake). 
· Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services. · Experience with API integrations and data exchange protocols (e.g., REST, SOAP, JSON, XML). · Solid understanding of data governance, security, and compliance standards. · Strong analytical and problem-solving skills with attention to detail. · Excellent communication and collaboration abilities. Preferred Qualifications: · Certifications in cloud platforms (AWS Certified Data Analytics, Azure Data Engineer, etc.) · Experience with big data technologies (e.g., Apache Hadoop, Spark, Kafka). · Knowledge of data visualization tools (e.g., Tableau, Power BI) for supporting downstream analytics. · Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Jenkins). Please note: Relocation assistance will not be available for this position.
    $70k-94k yearly est. 19d ago
  • Data Platform Engineer

    Monogram Health 3.7company rating

    Data engineer job in Brentwood, TN

    Data Platform Engineer The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in Data Engineering, Database Modeling, and modern Cloud Data Platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions. Responsibilities Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms. Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks. Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark. Develop and manage data models, data warehousing solutions, and data integration architectures in Azure. Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems. Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration. Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases. Ensure data quality, governance, and security across the data lifecycle. Collaborate with product managers by estimating technical tasks and deliverables. Uphold the mission and values of Monogram Health in all aspects of your role and activities. Position Requirements A bachelor's degree in computer science, data science, software engineering, or a related field. A minimum of five (5) years of design and hands-on development experience in cloud-based analytics solutions, including a minimum of three (3) years of hands-on work with big data frameworks and tools, such as Apache Kafka and Spark. 
Expert-level knowledge of Python or other scripting languages is required. Proficiency in SQL and other data query languages. Understanding of data modeling and schema design principles. Ability to work with large datasets and perform data analysis. Designing and building data integration pipelines using APIs and streaming ingestion methods is desirable. Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC). Thorough understanding of Azure Cloud Infrastructure offerings. Demonstrated problem-solving and troubleshooting skills. Team player with demonstrated written and verbal communication skills. Benefits Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave Wellness & Growth - Work life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex of patients who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease, but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders. 
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counselling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
    $75k-103k yearly est. 60d+ ago
  • Azure Data Engineer with Java Experience

    NTT Data North America 4.7company rating

    Data engineer job in Nashville, TN

    NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Azure Data Engineer with Java Experience to join our team in Nashville, Tennessee (US-TN), United States (US). Job Description: Job Title: Azure Data Engineer with Java experience + Location: Nashville, TN (On-site at one of NTT DATA's flagship data delivery centers) + Overview: NTT DATA is seeking a skilled Azure Data Engineer to join our Data Village team. This role involves working on-site in Nashville as part of a dynamic team focused on delivering cutting-edge data solutions. The ideal candidate will have 4-7 years of experience in data engineering, with expertise in Azure Cloud, Azure Functions, SQL/PLSQL, Python, and Power BI. Key Responsibilities: + Data Engineering: Utilize your expertise in Azure data services (Azure Functions, Azure Data Factory, Azure infrastructure, API-based extraction) to build and maintain robust data solutions using SQL/PLSQL (Oracle), Python, and Power BI; strong knowledge of the SDLC is required. Build and manage dozens of data pipelines to source and transform data based on business requirements. Java experience is preferred but not required. + Financial Data Analysis: Apply your knowledge of financial data analysis, risk, and compliance data management to support our financial services customers. + Innovation and Learning: Quickly learn new technologies by applying your current skills, staying ahead of industry trends and advancements. Self-identify the need for new skills to be developed, and adopt new technologies into your skill set within a month. + Client Collaboration: Work closely with financial services clients to build modern data solutions that transform how they leverage data for key business decisions, investment portfolio performance analysis, and risk and compliance management. 
Manage multiple stakeholder groups and their requirements.
+ Team Collaboration: Collaborate within a pod of 4+ data engineers, working toward common objectives in a consultative fashion with clients.
+ Data Movement and Transformation: Use Azure Data Factory, Azure Functions, and SQL (including PL/SQL) for data movement, streaming, and transformation services, ensuring efficient and reliable data workflows.
+ Industry Leadership: Work with a client that is leading the industry in using data to drive business decision optimization and investment management strategies.

Requirements:
+ Experience: 4-7 years of experience in data engineering.
+ Technical Skills: Proficiency in Azure data services (Azure Functions, Azure Data Factory, Azure infrastructure, API-based extraction), plus proficiency in the following areas:
+ Data Warehousing: Strong knowledge of data warehousing concepts.
+ Python: Advanced skills in Python programming for data engineering and data pipelines.
+ Data Integration: Proficiency in integrating data from various sources using API- and file-based extraction.
+ Data Platforms: Strong knowledge of relational databases and data warehouses/data lakes.
+ Data Security: Expertise in securing data in transit and at rest.
+ Data Pipelines: Experience in building and managing data pipelines using Azure Data Factory.
+ Azure: Experience with Azure data services such as Azure Data Factory, Azure Functions, and Azure Data Lake Storage.
+ GitHub: Proficiency in using GitHub for version control.
+ Java: Working-level experience in Java.
+ Power BI: Working-level experience in Power BI.
+ Domain Expertise: Experience in financial data analysis, risk, and compliance data management is nice to have.
+ Learning Agility: Ability to quickly learn new technologies by applying current skills.
+ Adaptability: Ability to adapt to new technologies quickly and efficiently.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services.
We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make ********************** accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at **********************/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
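The "source and transform" pipeline work described in this posting can be sketched in miniature. The following is a hypothetical Python sketch of a file-based extract-transform step; the field names, filter rule, and sample data are all invented for illustration, and a real pipeline of the kind the role describes would run inside Azure Data Factory or an Azure Function rather than as a standalone script.

```python
import csv
import io

# Invented sample export for illustration only.
RAW_CSV = """trade_id,amount,currency
T-001,1250.50,USD
T-002,-40.00,USD
T-003,980.25,EUR
"""

def extract(raw: str) -> list[dict]:
    """Parse a CSV export into row dictionaries (the 'extract' stage)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Keep positive USD rows and cast amounts to floats (the 'transform' stage).

    The business rule here is hypothetical; in practice it would come
    from the requirements the posting says engineers gather from clients.
    """
    out = []
    for row in rows:
        amount = float(row["amount"])
        if row["currency"] == "USD" and amount > 0:
            out.append({"trade_id": row["trade_id"], "amount": amount})
    return out

records = transform(extract(RAW_CSV))
```

In a production setting the same extract/transform split typically maps onto an Azure Data Factory copy activity followed by a transformation step, with the filter rules driven by stakeholder requirements rather than hard-coded.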
    $69k-89k yearly est. 6d ago
  • Data Scientist, Product Analytics

    Meta 4.8company rating

    Data engineer job in Nashville, TN

    As a Data Scientist at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Oculus). By applying your technical skills, analytical mindset, and product intuition to one of the richest data sets in the world, you will help define the experiences we build for billions of people and hundreds of millions of businesses around the world. You will collaborate on a wide array of product and business problems with a wide range of cross-functional partners across Product, Engineering, Research, Data Engineering, Marketing, Sales, Finance and others. You will use data and analysis to identify and solve product development's biggest challenges. You will influence product strategy and investment decisions with data, be focused on impact, and collaborate with other teams. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

Product leadership: You will use data to shape product development, quantify new opportunities, identify upcoming challenges, and ensure the products we build bring value to people, businesses, and Meta. You will help your partner teams prioritize what to build, set goals, and understand their product's ecosystem.

Analytics: You will guide teams using data and insights. You will focus on developing hypotheses and employ a varied toolkit of rigorous analytical approaches, different methodologies, frameworks, and technical approaches to test them.

Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.

**Required Skills:** Data Scientist, Product Analytics Responsibilities: 1.
Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches 2. Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop strategies for our products that serve billions of people and hundreds of millions of businesses 3. Identify and measure success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends 4. Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations 5. Partner with Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions **Minimum Qualifications:** 6. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent practical experience 7. Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent 8. 4+ years of work experience in analytics, data querying languages such as SQL, scripting languages such as Python, and/or statistical/mathematical software such as R (minimum of 2 years with a Ph.D.) 9. 4+ years of experience solving analytical problems using quantitative approaches, understanding ecosystems, user behaviors & long-term product trends, and leading data-driven projects from definition to execution [including defining metrics, experiment design, and communicating actionable insights] **Preferred Qualifications:** 10. Master's or Ph.D. degree in a quantitative field **Public Compensation:** $147,000/year to $208,000/year + bonus + equity + benefits **Industry:** Internet **Equal Opportunity:** Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer.
We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment. Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
    $147k-208k yearly 60d+ ago
  • 211018 / Consultant - Data Migration

    Procom Services

    Data engineer job in Nashville, TN

    Procom is a leading provider of professional IT services and staffing to businesses and governments in Canada. With revenues over $500 million, Procom has been recognized by the Branham Group as the 3rd largest professional services firm in Canada and is now the largest “Canadian-Owned” IT staffing/consulting company. Procom's areas of staffing expertise include: • Application Development • Project Management • Quality Assurance • Business/Systems Analysis • Datawarehouse & Business Intelligence • Infrastructure & Network Services • Risk Management & Compliance • Business Continuity & Disaster Recovery • Security & Privacy

Specialties: • Contract Staffing (Staff Augmentation) • Permanent Placement (Staff Augmentation) • ICAP (Contractor Payroll) • Flextrack (Vendor Management System)

Job Description: Assess and execute a migration from Connect Enterprise to Sterling File Gateway.

Qualifications: Experience with Sterling Commerce application migrations. Well-developed analytical skills. Must have very good written and verbal communication skills.

Required skills/experience for this role include: Sterling Commerce product migration experience (5 years minimum), Sterling File Gateway experience (5 years minimum), Connect Direct product experience (5 years minimum), Sterling Enterprise experience (5 years minimum), scripting language experience (Perl, Unix shell scripting, and Windows scripting) (5 years minimum), DOS batch scripting (5 years minimum), and system analysis experience (5 years minimum).

Additional Information: PLEASE NOTE THAT WE ARE NOT ABLE TO WORK WITH CANDIDATES ON H1B VISAS OR CANDIDATES REPRESENTED BY THIRD PARTIES.
    $64k-87k yearly est. 60d+ ago
  • Data Platform Engineer

    Monogram Health Inc. 3.7company rating

    Data engineer job in Brentwood, TN

    Job Description

Position: Data Platform Engineer

The Data Engineering team is seeking a highly skilled and experienced Data Platform Engineer with expertise in data engineering, database modeling, and modern cloud data platforms. The Data Platform Engineer designs, builds, and maintains scalable and secure data infrastructure, tools, and pipelines to support data analytics, machine learning, and business intelligence initiatives. They will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions.

Responsibilities: Design and implement robust, scalable, and efficient data models and pipelines across cloud-based platforms. Develop, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks. Build and orchestrate Databricks Notebooks and Jobs using PySpark, Spark SQL, or Scala Spark. Develop and manage data models, data warehousing solutions, and data integration architectures in Azure. Implement Azure Functions, Azure WebApps, and Application Insights to support microservices and monitor distributed systems. Configure and manage Databricks clusters, including autoscaling, Photon acceleration, and job orchestration. Collaborate with cross-functional teams to support data-driven decision-making and analytics use cases. Ensure data quality, governance, and security across the data lifecycle. Collaborate with product managers by estimating technical tasks and deliverables. Uphold the mission and values of Monogram Health in all aspects of your role and activities.

Position Requirements: A bachelor's degree in computer science, data science, software engineering, or a related field. Minimum of five (5) years in designing and hands-on development of cloud-based analytics solutions, including a minimum of three (3) years of hands-on work with big data frameworks and tools such as Apache Kafka and Spark.
Expert-level knowledge of Python or other scripting languages required. Proficiency in SQL and other data query languages. Understanding of data modeling and schema design principles. Ability to work with large datasets and perform data analysis. Experience designing and building data integration pipelines using APIs and streaming ingestion methods is desirable. Familiarity with DevOps practices, including automation, CI/CD, and infrastructure as code (IaC). Thorough understanding of Azure cloud infrastructure offerings. Demonstrated problem-solving and troubleshooting skills. Team player with demonstrated written and verbal communication skills.

Benefits
Comprehensive Benefits - Medical, dental, and vision insurance, employee assistance program, employer-paid and voluntary life insurance, disability insurance, plus health and flexible spending accounts
Financial & Retirement Support - Competitive compensation, 401k with employer match, and financial wellness resources
Time Off & Leave - Paid holidays, flexible vacation time/PSSL, and paid parental leave
Wellness & Growth - Work-life assistance resources, physical wellness perks, mental health support, employee referral program, and BenefitHub for employee discounts

Monogram Health is a leading multispecialty provider of in-home, evidence-based care for the most complex patients: those who have multiple chronic conditions. Monogram Health takes a comprehensive and personalized approach to a person's health, treating not only a disease but all of the chronic conditions that are present - such as diabetes, hypertension, chronic kidney disease, heart failure, depression, COPD, and other metabolic disorders.
Monogram Health employs a robust clinical team, leveraging specialists across multiple disciplines including nephrology, cardiology, endocrinology, pulmonology, behavioral health, and palliative care to diagnose and treat health issues; review and prescribe medication; provide guidance, education, and counselling on a patient's healthcare options; as well as assist with daily needs such as access to food, eating healthy, transportation, financial assistance, and more. Monogram Health is available 24 hours a day, 7 days a week, and on holidays, to support and treat patients in their home. Monogram Health's personalized and innovative treatment model is proven to dramatically improve patient outcomes and quality of life while reducing medical costs across the health care continuum.
    $75k-103k yearly est. 28d ago
  • Google Cloud Data & AI Engineer

    Slalom 4.6company rating

    Data engineer job in Nashville, TN

    Who You'll Work With

As a modern technology company, our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant, or Principal at Slalom, you will be part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients.

What You'll Do
* Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub and more.
* Implement cloud-based data solutions for data ingestion, transformation, and storage, and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs.
* Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem.
* Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps).
* Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud.
* Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients.
* Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices.

What You'll Bring
* Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.).
* Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance.
* Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML.
* Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment.
* Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts.
* Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects.
* Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously.
* Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe.

About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life.
We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.

Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time.

East Bay, San Francisco, Silicon Valley:
* Consultant: $114,000-$171,000
* Senior Consultant: $131,000-$196,500
* Principal: $145,000-$217,500

San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC:
* Consultant: $105,000-$157,500
* Senior Consultant: $120,000-$180,000
* Principal: $133,000-$199,500

All other locations:
* Consultant: $96,000-$144,000
* Senior Consultant: $110,000-$165,000
* Principal: $122,000-$183,000

EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applicants with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We are accepting applications until 12/31. #LI-FB1
    $145k-217.5k yearly 28d ago

Learn more about data engineer jobs

How much does a data engineer earn in Nashville, TN?

The average data engineer in Nashville, TN earns between $62,000 and $108,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Nashville, TN

$82,000
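The page does not say how the $82,000 average relates to the $62,000-$108,000 range; it is not the arithmetic midpoint ($85,000). One reading that reproduces the figure is a geometric mean of the two endpoints; the short check below makes that assumption explicit rather than asserting it is the site's actual methodology.

```python
# Assumption for illustration: the published average is the geometric
# mean of the low and high ends of the Nashville range, rounded to the
# nearest thousand. The page itself does not state this.
low, high = 62_000, 108_000

geo_mean = (low * high) ** 0.5          # about 81,829
rounded = round(geo_mean / 1000) * 1000 # 82,000, matching the page
```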
