
Data scientist jobs in The Villages, FL

- 655 jobs
  • Data Architect

    Vlink Inc. (4.0 company rating)

    Data scientist job in Orlando, FL

    Data Architect
    Duration: 6 Months

    Responsible for enterprise-wide data design, balancing optimization of data access with batch loading and resource utilization factors. Knowledgeable in most aspects of designing and constructing data architectures, operational data stores, and data marts. Focuses on enterprise-wide data modeling and database design. Defines data architecture standards, policies, and procedures for the organization, structure, attributes, and nomenclature of data elements, and applies accepted data content standards to technology projects. Responsible for business analysis, data acquisition and access analysis and design, database management system optimization, and recovery and load strategy design and implementation.

    Essential Position Functions:
    - Evaluate and recommend data management processes.
    - Design, prepare, and optimize data pipelines and workflows.
    - Lead implementations of secure, scalable, and reliable Azure solutions.
    - Observe and recommend how to monitor and optimize Azure for performance and cost efficiency.
    - Endorse and foster security best practices, access controls, and compliance standards for all data lake resources.
    - Perform knowledge transfer on troubleshooting and documenting Azure architectures and solutions.

    Skills required:
    - Deep understanding of Azure Synapse Analytics, Azure Data Factory, and related Azure data tools.
    - Expertise in implementing Data Vault 2.0 methodologies using WhereScape automation software.
    - Proficient in designing and optimizing fact and dimension table models.
    - Demonstrated ability to design, develop, and maintain data pipelines and workflows.
    - Strong skills in formulating, reviewing, and optimizing SQL code.
    - Expertise in data collection, storage, accessibility, and quality improvement processes.
    - Proven track record of delivering consumable data using information marts.
    - Excellent communication skills to effectively liaise with technical and non-technical team members.
    - Ability to document designs, procedures, and troubleshooting methods clearly.
    - Proficiency in Python or PowerShell preferred.
    - Bachelor's or Master's degree in Computer Science, Information Systems, or another related field, or equivalent work experience.
    - A minimum of 7 years of experience with large and complex database management systems.
    $86k-111k yearly est. 5d ago
  • Data Analyst

    Brooksource (4.1 company rating)

    Data scientist job in Lake Mary, FL

    Brooksource is looking for a detail-oriented and dedicated individual to support our Specialty Pharmacy Distribution client and their Customer Master domain. This person will be responsible for working with client accounts and updating/maintaining them as needed.

    Responsibilities:
    - Accurately enter and update customer data in the SAP system.
    - Maintain and manage customer master data, ensuring data integrity and consistency.
    - Verify and validate data entries for accuracy and completeness.
    - Collaborate with cross-functional teams to resolve data discrepancies and ensure timely updates.
    - Generate and analyze reports to identify and correct data issues.
    - Assist in the development and implementation of data entry procedures and guidelines.
    - Provide support for data migration and integration projects.
    - Ensure compliance with company policies and data management standards.

    Qualifications:
    - High school diploma or equivalent; additional certification in data entry or a related field is a plus.
    - Proven experience in data entry, preferably within the SAP environment.
    - Familiarity with Customer Master data management.
    - Strong attention to detail and accuracy.
    - Excellent organizational and time management skills.
    - Proficient in Microsoft Office Suite (Excel, Word, Outlook).
    - Ability to work independently and as part of a team.
    - Strong communication skills, both written and verbal.

    Preferred Skills:
    - Experience with SAP modules related to Customer Master data.
    - Knowledge of data governance and data quality principles.
    - Ability to troubleshoot and resolve data-related issues.
    $55k-76k yearly est. 5d ago
  • Senior Data Engineer

    Leon Recruitment (4.2 company rating)

    Data scientist job in Miami, FL

    Sr. Data Engineer
    CLIENT: Fortune 150 Company; Financial Services

    SUMMARY DESCRIPTION:
    The Data Engineer will serve in a strategic role designing and managing the infrastructure that supports data storage, transformation, processing, and retrieval, enabling efficient data analysis and decision-making within the organization. This position is a critical part of the Database and Analytics team responsible for the design, development, and implementation of complex enterprise-level data integration and consumption solutions. It requires a highly technical, self-motivated senior engineer who will work with analysts, architects, and systems engineers to develop solutions based on functional and technical specifications that meet quality and performance requirements. Must have experience with Microsoft Fabric.

    PRIMARY DUTIES AND RESPONSIBILITIES:
    - Utilize experience in ETL tools, with at least 5 years dedicated to Azure Data Factory (ADF), to design, code, implement, and manage multiple parallel data pipelines. Experience with Microsoft Fabric, Pipelines, Mirroring, and Data Flows Gen 2 is required.
    - Apply a deep understanding of data warehousing concepts, including data modeling techniques like star and snowflake schemas, SCD Type 2, Change Data Feeds, and Change Data Capture.
    - Demonstrate hands-on experience with Data Lake Gen 2, Delta Lake, Delta Parquet files, JSON files, and big data storage layers; optimize and maintain big data storage using partitioning, V-Order, OPTIMIZE, VACUUM, and other techniques.
    - Design and optimize medallion data models, warehouses, architectures, schemas, indexing, and partitioning strategies.
    - Collaborate with Business Insights and Analytics teams to understand data requirements and optimize storage for analytical queries.
    - Modernize databases and data warehouses and prepare them for analysis, managing for optimal performance.
    - Design, build, manage, and optimize enterprise data pipelines, ensuring efficient data flow, data integrity, and data quality throughout the process.
    - Automate efficient data acquisition, transformation, and integration from a variety of data sources, including databases, APIs, message queues, and data streams.
    - Competently perform advanced data tasks with minimal supervision, including architecting advanced data solutions, leading and coaching others, and effectively partnering with stakeholders.
    - Interface with other technical and non-technical departments and outside vendors on assigned projects.
    - Under the direction of IT Management, establish standards, policies, and procedures pertaining to data governance, database/data warehouse management, metadata management, security, optimization, and utilization.
    - Ensure data security and privacy by implementing access controls, encryption, and anonymization techniques per data governance and compliance policies.
    - Manage schema drift within ETL processes, ensuring robust and adaptable data integration solutions.
    - Document data pipelines, processes, and architectural designs for future reference and knowledge sharing.
    - Stay informed of the latest trends and technologies in the data engineering field, and evaluate and adopt new tools, frameworks, and platforms (like Microsoft Fabric) to enhance data processing and storage capabilities.
    - When necessary, implement and document schema modifications made to the legacy production environment.
    - Perform any other function required by IT Management for the successful operation of all IT and data services provided to our clients.
    - Be available nights and weekends as needed for system changes and rollouts.

    EDUCATION AND EXPERIENCE REQUIREMENTS:
    - Bachelor's or Master's degree in computer science, information systems, applied mathematics, or a closely related field.
    - Minimum of ten (10) years of full-time employment experience as a data engineer, data architect, or equivalent required.
    - Must have experience with Microsoft Fabric.

    SKILLS:
    - Experience working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures, and integrated datasets using traditional and modern data integration technologies (such as ETL, ELT, MPP, data replication, change data capture, message-oriented data movement, API design, stream data integration, and data virtualization).
    - Experience working with cloud data engineering stacks (specifically Azure and Microsoft Fabric): Data Lake, Synapse, Azure Data Factory, Databricks, Informatica, Data Explorer, etc.
    - Strong, in-depth understanding of database architecture, storage, and administration on the Azure stack.
    - Deep understanding of data architectural approaches, data engineering solutions, and software engineering principles and best practices.
    - Working knowledge and experience with modern BI and ETL tools (Power BI, Power Automate, ADF, SSIS, etc.).
    - Experience utilizing data storage solutions including Azure Blob Storage and ADLS Gen 2.
    - Solid understanding of relational and dimensional database principles and best practices in client/server, thin-client, and cloud computing environments.
    - Advanced working knowledge of T-SQL and SQL Server, including transactions, error handling, security, and maintenance, with experience writing complex stored procedures, views, and user-defined functions as well as complex functions, dynamic SQL, partitions, CDC, CDF, etc.
    - Experience with .NET scripting and an understanding of API integration in a service-oriented architecture.
    - Knowledge of reporting tools, query languages, and semantic models, with specific experience with Power BI.
    - Understanding of and experience with Agile methodology.
    - PowerShell scripting experience desired.
    - Experience with Service Bus, Azure Functions, Event Grids, Event Hubs, or Kafka would be beneficial.

    Working Conditions: Available to work evenings and/or weekends as required. Workdays and hours are Monday through Friday, 8:30 am to 5:30 pm ET.
    $79k-106k yearly est. 3d ago
  • Financial Data Analyst

    Heirloom Fair Legal

    Data scientist job in Palm Beach, FL

    Heirloom Fair Legal is a specialist legal financier, providing financing to law firms, claimants, and their service providers to promote access to justice for consumer and small-business legal claims in the UK. Based in London, with offices in Manchester and Warrington in England, we are expanding to the US with a new Palm Beach, FL office.

    Role Description
    This is a full-time hybrid role for a Financial Data Analyst, located in Palm Beach, FL, with opportunities for remote work. The Financial Data Analyst is responsible for our portfolio and reporting system, which tracks our financings to or via approximately 20 law firms or service providers, covering approximately 350,000 pieces of underlying collateral. We are developing a new system based on SQL and Python with an as-yet unselected ETL/business intelligence layer. This role will lead the design and buildout of the new system, migrate data over, and own the process for data cleansing and ingestion. It will also be responsible for generating regular reporting for HFL's Investment Committee, external investors, and other stakeholders.

    Qualifications
    - Strong analytical skills and proficiency in data analytics
    - Experience with SQL, Python, and ETL or business intelligence tools
    - Proficiency in generating reports using data visualization and similar tools
    - Ability to work independently, balancing multiple priorities and working to deadlines
    - Strong problem-solving skills and critical thinking capabilities
    - Bachelor's degree in Finance, Economics, Data Analytics, or a related field
    - Prior experience in the legal finance or investment industry is a plus
    $49k-73k yearly est. 4d ago
  • Data Architect

    Integris Group (4.0 company rating)

    Data scientist job in Orlando, FL

    (Orlando, FL)

    Business Challenge
    The company is in the midst of an AI transformation, creating exciting opportunities for growth. At the same time, they are leading a Salesforce modernization and integrating the systems and data of their recent acquisition. To support these initiatives, they are bringing in a Senior Data Architect/Engineer to establish enterprise standards for application and data architecture, partnering closely with the Solutions Architect and Tech Leads.

    Role Overview
    The Senior Data Architect/Engineer leads the design, development, and evolution of enterprise data architecture, while contributing directly to the delivery of robust, scalable solutions. This position blends strategy and hands-on engineering, requiring deep expertise in modern data platforms, pipeline development, and cloud-native architecture. You will:
    - Define architectural standards and best practices.
    - Evaluate and implement new tools.
    - Guide enterprise data initiatives.
    - Partner with data product teams, engineers, and business stakeholders to build platforms supporting analytics, reporting, and AI/ML workloads.

    Day-to-Day Responsibilities
    - Lead the design and documentation of scalable data frameworks: data lakes, warehouses, streaming architectures, and Azure-native data platforms.
    - Build and optimize secure, high-performing ETL/ELT pipelines, data APIs, and data models.
    - Develop solutions that support analytics, advanced reporting, and AI/ML use cases.
    - Recommend and standardize modern data tools, frameworks, and architectural practices.
    - Mentor and guide team members, collaborating across business, IT, and architecture groups.
    - Partner with governance teams to ensure data quality, lineage, security, and stewardship.

    Desired Skills & Experience
    - 10+ years of progressive experience in data engineering and architecture.
    - Strong leadership experience, including mentoring small distributed teams (currently 4 people: 2 onshore, 2 offshore; team growing to 6).
    - Deep knowledge of the Azure ecosystem (Data Lake, Synapse, SQL DB, Data Factory, Databricks).
    - Proven expertise with ETL pipelines (including 3rd-party/vendor integrations).
    - Strong SQL and data modeling skills; familiarity with star/snowflake schemas and other approaches.
    - Hands-on experience creating data APIs.
    - Solid understanding of metadata management, governance, security, and data lineage.
    - Programming experience with SQL, Python, and Spark.
    - Familiarity with containerized compute/orchestration frameworks (Docker, Kubernetes) is a plus.
    - Experience with Salesforce data models, MDM tools, and streaming platforms (Kafka, Event Hub) is preferred.
    - Excellent problem-solving, communication, and leadership skills.
    - Education: Bachelor's degree in Computer Science, Information Systems, or a related field (Master's preferred). Azure certifications in Data Engineering or Solution Architecture strongly preferred.

    Essential Duties & Time Allocation
    - Data Architecture Leadership: define enterprise-wide strategies and frameworks (35%)
    - Engineering & Delivery: build and optimize ETL/ELT pipelines, APIs, and models (30%)
    - Tooling & Standards: evaluate new tools and support adoption of modern practices (15%)
    - Mentorship & Collaboration: mentor engineers and align stakeholders (10%)
    - Governance & Quality: embed stewardship, lineage, and security into architecture (10%)
    $84k-119k yearly est. 3d ago
  • Sr Electronic Data Interchange Coordinator

    Ashley Furniture Industries (4.1 company rating)

    Data scientist job in Tampa, FL

    On-Site Locations: Tampa, FL or Arcadia, WI (GC/USC only)

    Senior EDI Coordinator
    Senior EDI Coordinators create new and update existing EDI maps to support the movement of thousands of transactions each day, set up and maintain EDI trading partners, set up and maintain EDI communication configurations, and provide support for a large assortment of EDI transactions with a variety of trading partners.

    Primary Job Functions:
    - Monitor inbound and outbound transaction processing to ensure successful delivery, and take corrective action on transactions that are not successful.
    - Develop and modify EDI translation maps according to Business Requirements Documents and EDI specifications.
    - Perform unit testing and coordinate integrated testing with internal and external parties.
    - Perform map reviews to ensure new maps and map changes comply with requirements and standards.
    - Prepare, maintain, and review documentation, including mapping documents, standard operating procedures, and system documentation.
    - Perform trading partner setup, configuration, and administrative activities.
    - Analyze and troubleshoot connectivity, mapping, and data issues.
    - Provide support to our business partners and external parties.
    - Participate in an after-hours on-call rotation.
    - Set up and maintain EDI communication channels.
    - Provide coaching and mentoring to EDI Coordinators.
    - Suggest EDI best practices and opportunities for improvement.
    - Maintain and update AS2 certificates.
    - Deploy map changes to production.
    - Perform EDI system maintenance and upgrades.

    Job Qualifications:
    - Education: Bachelor's degree in Information Systems, Computer Science, or another related field, or an equivalent combination of education and experience (Required)
    - 5+ years of practical EDI mapping experience, with emphasis on ANSI X12 (Required)
    - Experience working with XML and JSON transactions (Preferred)
    - Experience working with AS2, VAN, and sFTP communications (Preferred)
    - Experience working with AS2 certificates (Preferred)
    - Experience with the Azure DevOps Agile/Scrum platform (Preferred)
    - Experience in large, complex enterprise environments (Preferred)

    Knowledge, Skills and Abilities:
    - Advanced analytical and problem-solving skills
    - Strong attention to detail
    - Excellent written and verbal communication skills
    - Excellent client-facing and interpersonal skills
    - Effective time management and organizational skills
    - Work independently as well as in a team environment
    - Handle multiple projects simultaneously within established time constraints
    - Perform under strong demands in a fast-paced environment
    - Display empathy, understanding, and patience with employees and external customers
    - Respond professionally in situations with difficult employee/vendor/customer issues or inquiries
    - Working knowledge of Continuous Improvement methodologies
    - Strong working knowledge of Microsoft Office Suite
    $56k-76k yearly est. 4d ago
  • Data Analyst

    Demand The Limits Personal Injury Attorneys

    Data scientist job in Boca Raton, FL

    **This role is only available in Boca Raton, FL. Please only apply if you're within a commutable distance to Boca.**

    We are seeking a highly analytical and detail-oriented Data Analyst to join our team. This position offers high visibility and the opportunity to directly impact business strategy, marketing performance, and operational efficiency. Reporting to the firm's owners and managing partners, the Data Analyst will transform raw data into actionable insights that support decision-making across departments including case management, marketing, and operations.

    Key Responsibilities
    · Collect, organize, and analyze data from multiple firm systems (CRM, case management, marketing platforms, and financial systems).
    · Develop and maintain dashboards, reports, and KPIs to monitor performance and identify trends.
    · Present data insights and recommendations to firm leadership in a clear and actionable manner.
    · Partner with management and department leads to identify opportunities for process improvement through data.
    · Support forecasting, budgeting, and strategic planning initiatives.
    · Ensure data accuracy, integrity, and consistency across systems.
    · Assist in developing automation and reporting processes to streamline data workflows.

    Qualifications
    · Bachelor's degree in Data Analytics, Statistics, Business, Finance, or a related field.
    · 2-5 years of experience as a Data Analyst, preferably in a professional services or law firm environment.
    · Proficient in data visualization and reporting tools (e.g., Power BI, Tableau, or similar).
    · Strong proficiency in Excel and comfort with database querying (SQL preferred).
    · Excellent analytical, critical thinking, and problem-solving skills.
    · Strong communication skills: able to translate complex data into clear insights for non-technical stakeholders.
    · Highly organized and detail-oriented, with the ability to manage multiple priorities.

    Why Join Demand The Limits
    · Direct exposure to firm leadership and a meaningful role in shaping data-driven strategy.
    · A collaborative, entrepreneurial culture where your ideas and insights make a tangible impact.
    · Competitive compensation package with discretionary bonuses based on performance.
    · Opportunities for growth as the firm continues to expand.

    Compensation: Salary range: $70,000 - $83,000 annually, commensurate with experience, plus performance-based bonuses.
    Location: Boca Raton, FL (Limited hybrid flexibility may be considered for the right candidate)
    $70k-83k yearly 4d ago
  • Data Analyst

    Kantar Media (4.0 company rating)

    Data scientist job in Miami, FL

    WFA Cross-Media Measurement Initiative
    Miami, FL / On Site

    Kantar Media has won the contract to build a new cross-media measurement panel in the United States, one of the first of its kind in the world. We are looking for forward-thinking, analytical, and detail-oriented professionals to join our team and help transform the way advertisers understand audiences. The Media Division at Kantar are experts in decoding the evolving media landscape. Our Audience Measurement teams are developing innovative ways to quantify how people consume media across platforms, from streaming services to broadcast TV. This new U.S. panel represents the future of measurement, integrating the latest technologies, partnerships with global platforms (including Google and Meta), and cutting-edge data visualization.

    Kantar is the world's leading data, insights, and consulting company. We understand more about how people think, feel, shop, share, vote, and view than anyone else. Combining deep human insight with advanced analytics, Kantar's 25,000 employees in over 100 countries help the world's leading organizations succeed and grow. Nobody knows people better than Kantar. This is your opportunity to be part of something genuinely transformative.

    Job Details
    Our Data Analyst will play a key role in ensuring the accuracy, consistency, and insightfulness of data used across the Cross-Media Measurement initiative. You will clean, structure, and analyze panel and operational data, create dashboards and reports, and provide data-driven insights to stakeholders across Operations, Product, and Client Services. This is a hands-on analytical role that combines strong technical skills with business acumen and storytelling ability. You will help ensure our metrics are not only accurate but actionable, turning raw data into meaningful insights that drive decisions.

    Tasks & Responsibilities
    - Clean, validate, and structure raw panel and operational datasets for reporting and analysis.
    - Develop and maintain Excel dashboards, Power BI reports, and Power Apps solutions that streamline data entry, tracking, and visualization workflows.
    - Build and manage Power Apps used by operations teams for data capture, process automation, and reporting integration.
    - Identify data anomalies, trends, and root causes of issues that impact panel performance or data quality.
    - Partner with Operations and Technology teams to ensure data integrity and proper data flow across systems.
    - Automate routine reporting processes and improve efficiency in data collection and transformation.
    - Support leadership with ad-hoc analysis, visualizations, and performance summaries to inform strategic decisions.
    - Collaborate cross-functionally to design and implement new KPIs and data views aligned with project goals.
    - Ensure compliance with data governance standards and documentation best practices.
    - Contribute to continuous improvement initiatives through data insights and analytics innovation.

    The Skills & Experience Needed
    - A minimum of one year of professional experience using Excel, Power BI, and Power Apps is required.
    - Strong proficiency in Microsoft Excel (pivot tables, advanced formulas, Power Query; VBA desirable).
    - Experience with Power BI: building dashboards, data models, and visual reports from multiple data sources.
    - Hands-on experience with Microsoft Power Apps, including building and maintaining low-code applications, automating workflows, and integrating with Power BI and SharePoint.
    - Proven analytical and problem-solving skills with exceptional attention to detail and accuracy.
    - Solid understanding of data structures, quality assurance, and basic statistical concepts.
    - Ability to interpret complex datasets and communicate findings in a clear, concise, and actionable way.
    - Experience working with large operational or media datasets preferred.
    - Knowledge of SQL, Python, or similar data manipulation tools is a plus (but not required).
    - Strong organizational and time management skills; able to prioritize and deliver under tight deadlines.
    - Excellent collaboration skills; comfortable working cross-functionally in a dynamic, fast-paced environment.
    - Fluent English essential; Spanish desirable.
    $49k-76k yearly est. 1d ago
  • Data Quality Analyst

    Sanford Rose Associates - JFSPartners (4.1 company rating)

    Data scientist job in Orlando, FL

    Sanford Rose Associates - JFSPartners is currently looking for a Data Quality Analyst for a full-time opportunity in Orlando. Qualified candidates will participate in the full data quality lifecycle, from requirement gathering through ongoing support. The candidate selected for this role will develop technical components that meet business/functional requirements or address logged data incidents.

    RESPONSIBILITIES:
    - Develop technical specifications that demonstrate how data quality will be preserved and enforced.
    - Work with the BA team to generate data to power quality dashboards, which allow both data providers and data consumers to monitor data quality.
    - Contribute to business/technical definitions of data objects within the data catalogue.
    - Serve as an SME for multiple data domains.
    - Assist business users in the selection, understanding, and use of data.
    - Perform UAT on data sets as part of data ingestion, egress, transformation, and rule execution.

    REQUIRED TECHNICAL SKILLS:
    - Strong understanding of data structures, data types, and data transformation.
    - Ability to perform complex data mappings, workflows, and sessions.
    - Experience with SQL and other data transformation/analytics tools such as Informatica, Talend, or Alteryx.
    - Expertise in reading, analyzing, and debugging SQL.
    - Experience with, or willingness to learn, data profiling/quality tools such as Collibra, Ataccama, Informatica, or OEDQ.

    At Sanford Rose Associates - JFSPartners, we specialize in Finance & Accounting, Legal, and Information Technology recruitment, dedicated to helping professionals like you discover the perfect career opportunities. With a track record of assisting thousands of professionals nationwide, we are prepared to leverage our expertise on your behalf. Partnering with us means gaining access to serious candidates, minimizing hiring errors, and ensuring top-tier hires, all while navigating the hiring process with confidence. We understand the significance of finding the ideal role and aligning with an organization that shares your values.
    $47k-69k yearly est. 1d ago
  • Lead Data Engineer

    Selby Jennings

    Data scientist job in Tampa, FL

    A leading investment management firm is looking to bring on a Lead Data Engineer to join its team in Tampa, Denver, Memphis, or Southfield. This is an excellent chance to work alongside industry leaders while staying hands-on and helping lead the team.

    Key Responsibilities
    - Project Oversight: Direct end-to-end software development activities, from initial requirements through deployment, ensuring projects meet deadlines and quality standards.
    - Database Engineering: Architect and refine SQL queries, stored procedures, and schema designs to maximize efficiency and scalability within Oracle environments.
    - Performance Tuning: Evaluate system performance and apply strategies to enhance data storage and retrieval processes.
    - Data Processing: Utilize tools like Pandas and Spark for data wrangling, transformation, and analysis.
    - Python Solutions: Develop and maintain Python-based applications and automation workflows.
    - Pipeline Automation: Implement and manage continuous integration and delivery pipelines using Jenkins and similar technologies to optimize build, test, and release cycles.
    - Team Development: Guide and support junior engineers, promoting collaboration and technical growth.
    - Technical Documentation: Create and maintain comprehensive documentation for all development initiatives.

    Core Skills
    - Experience: Over a decade in software engineering, with deep expertise in Python and Oracle database systems.
    - Technical Knowledge: Strong command of SQL, Oracle, Python, Spark, Jenkins, Kubernetes, Pandas, and modern CI/CD practices.
    - Optimization Expertise: Skilled in database tuning and applying best practices for performance.
    - Leadership Ability: Proven track record in managing teams and delivering complex projects.
    - Analytical Strength: Exceptional problem-solving capabilities with a data-centric mindset.
    - Communication: Clear and effective written and verbal communication skills.
    - Education: Bachelor's degree in Computer Science, Engineering, or equivalent professional experience.

    Preferred Qualifications
    - Certifications: Professional credentials in Oracle, Python, Kubernetes, or CI/CD technologies.
    - Agile Background: Hands-on experience with Agile or Scrum frameworks.
    - Cloud Platforms: Familiarity with AWS, Azure, or Google Cloud services.
    $72k-99k yearly est. 1d ago
  • Data Modeling

    Tata Consultancy Services (4.3 company rating)

    Data scientist job in Melbourne, FL

    Must Have Technical/Functional Skills
    • 5+ years of experience in data modeling, data architecture, or a similar role
    • Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, or PostgreSQL
    • Experience with data modeling tools such as Erwin, IBM InfoSphere Data Architect, or similar
    • Ability to communicate complex concepts clearly to diverse audiences

    Roles & Responsibilities
    • Design and develop conceptual, logical, and physical data models that support both operational and analytical needs
    • Collaborate with business stakeholders to gather requirements and translate them into scalable data models
    • Perform data profiling and analysis to understand data quality issues and identify opportunities for improvement
    • Implement best practices for data modeling, including normalization, denormalization, and indexing strategies
    • Lead data architecture discussions and present data modeling solutions to technical and non-technical audiences
    • Mentor and guide junior data modelers and data architects within the team
    • Continuously evaluate data modeling tools and techniques to enhance team efficiency and productivity

    Base Salary Range: $100,000 - $150,000 per annum

    TCS Employee Benefits Summary: Discretionary Annual Incentive. Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. Family Support: Maternal & Parental Leaves. Insurance Options: Auto & Home Insurance, Identity Theft Protection. Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. Time Off: Vacation, Time Off, Sick Leave & Holidays. Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
    $100k-150k yearly 4d ago
  • Data Engineer

    Flybridge Staffing

    Data scientist job in Palm Beach Gardens, FL

    Flybridge Staffing is currently searching for a Data Engineer for a client located in the Palm Beach Gardens area. This is a direct-hire position that will work off a hybrid schedule of 2 days remote. This person will design systems that supply high-performance datasets for advanced analytics. Experience: BA degree and 5+ years of Data Engineering experience Strong experience building ETL data pipelines for on-premises SQL Server 2017 or newer Deep understanding of the development of data pipelines with either SSIS or Python Broad experience with SQL Server, including Columnstore, etc. Extensive experience using SSMS and T-SQL to create and maintain SQL Server tables, views, functions, stored procedures, and user-defined table types. Experience with data modeling, indexes, temporal tables, CLR, and Service Broker. Experience in partitioning tables and indexes, and performance improvement with Query Analyzer. Experience writing C#, PowerShell, and Python. Experience with source control integration with GitHub, Bitbucket, and Azure DevOps. Experience working in an Agile and Kanban SDLC. Experience with cloud-based data management solutions such as Snowflake and Redshift. Experience with Python programming is a plus; libraries such as pandas, NumPy, csv, traceback, json, pyodbc, and math are nice to have. Experience writing design documentation such as ERDs, Data Flow Diagrams, and Process Flow Diagrams. Experience with open-source database engines such as ClickHouse, ArcticDB, and PostgreSQL is a plus. Responsibilities: Collaborate effectively with Stakeholders, Project Managers, Software Engineers, Data Analysts, QA Analysts, DBAs, and Data Engineers. Build and maintain data pipelines based on functional and non-functional requirements. Proactively seek out information and overcome obstacles to deliver projects efficiently. 
Ensure that data pipelines incorporate best practices related to performance, scaling, extensibility, fault tolerance, instrumentation, and maintainability. Ensure that data pipelines are kept simple and not overly engineered. Produce and maintain design and operational documentation. Analyze complex data problems and engineer elegant solutions. ****NO SPONSORSHIP AVAILABLE**** US Citizen, GC, EAD only please. If your background aligns with the above details and you would like to learn more, please submit your resume to jobs@flybridgestaffing.com or on our website, www.flybridgestaffing.com and one of our recruiters will be in touch with you ASAP. Follow us on LinkedIn to keep up with all our latest job openings and referral program.
    $71k-98k yearly est. 4d ago
  • GCP Data Architect with 14+ years (Day 1 onsite)

    M3Bi-A Zensar Company

    Data scientist job in Sunrise, FL

    12-14 years of overall IT experience with expertise in the data landscape - Data Warehouse, Data Lake, etc. Hands-on experience in the Big Data and Hadoop ecosystem; strong skills in SQL, Python, or Spark. Proficient in Data Warehousing concepts and Customer Data Management (Customer 360). Experience in the GCP platform - Dataflow, Dataproc, Kubernetes containers, etc. Expertise in deep data exploration and data analysis. Excellent communication and interpersonal skills.
    $78k-110k yearly est. 3d ago
  • Actuarial Analyst

    Nation Safe Drivers 4.1company rating

    Data scientist job in Boca Raton, FL

    Actuarial Analyst - Nation Safe Drivers (Boca Raton, FL | On-site) Nation Safe Drivers (NSD), a 60-year industry leader in the automotive and financial services sector, is expanding our Actuarial & Analytics team. NSD is proudly employee-centric, offering a fun and collaborative culture, and excellent benefits. Our corporate headquarters is located in the heart of Boca Raton. We are seeking a motivated Actuarial Analyst who is passionate about modeling risk, improving financial outcomes, and helping guide strategic decisions. You will work closely with our Actuary and cross-functional teams to support pricing, reserves, forecasting, and product development. What You'll Do Analyze data, trends, and loss events to assess and forecast financial risk. Build and enhance models for pricing, reserves, premiums, and other actuarial functions. Support development and improvement of insurance and financial products. Ensure compliance with regulatory standards and reporting requirements. Communicate findings clearly to leadership, peers, and regulatory stakeholders. Identify emerging risks and propose strategies that support long-term organizational stability. Collaborate with actuaries, underwriters, analysts, and operational teams. Continuously improve actuarial processes, tools, and methodologies. What We're Looking For Bachelor's degree in Actuarial Science, Mathematics, Statistics, Finance, or related field. Actuarial exam progress; ACAS/FCAS preferred. Strong analytical and statistical modeling skills. Experience with R, Python, SQL, SAS, or similar tools. Knowledge of Prophet, AXIS, Emblem, or similar actuarial software is a plus. Excellent communication skills with the ability to simplify complex findings. Detail-oriented, proactive, and able to thrive in a fast-paced environment. Why Join NSD Excellent Benefits: Health, dental, vision, disability, life, PTO, paid holidays, and pet insurance! Competitive compensation and family-friendly schedule. 
Great Culture: Social events, recognition lunches, celebrations, and a supportive leadership team. Career Growth: NSD has a long-standing reputation for promoting from within. Ready to Advance Your Actuarial Career? If you're excited to work on meaningful actuarial projects while growing with a supportive and innovative company, we'd love to meet you. Apply today!
    $65k-89k yearly est. 3d ago
  • Sr. Data Engineer (SQL+Python+AWS)

    SGS Technologie 3.5company rating

    Data scientist job in Saint Petersburg, FL

    SGS Technologie is looking for a Sr. Data Engineer (SQL + Python + AWS) for a 12+ month contract (potential extension, or may convert to full-time), hybrid in St. Petersburg, FL 33716, with a direct financial client; W2 only, for US Citizens or Green Card holders. Notes from the Hiring Manager: • Setting up Python environments and data structures to support the Data Science/ML team. • No prior Data Science or Machine Learning experience required. • Role involves building new data pipelines and managing file-loading connections. • Strong SQL skills are essential. • Contract-to-hire position. • Hybrid role based in St. Pete, FL (33716) only. Duties: This role involves building and maintaining data pipelines that connect Oracle-based source systems to AWS cloud environments, to provide well-structured data for analysis and machine learning in AWS SageMaker. It includes working closely with data scientists to deliver scalable data workflows as a foundation for predictive modeling and analytics. • Develop and maintain data pipelines to extract, transform, and load data from Oracle databases and other systems into AWS environments (S3, Redshift, Glue, etc.). • Collaborate with data scientists to ensure data is prepared, cleaned, and optimized for SageMaker-based machine learning workloads. • Implement and manage data ingestion frameworks, including batch and streaming pipelines. • Automate and schedule data workflows using AWS Glue, Step Functions, or Airflow. • Develop and maintain data models, schemas, and cataloging processes for discoverability and consistency. • Optimize data processes for performance and cost efficiency. • Implement data quality checks, validation, and governance standards. • Work with DevOps and security teams to comply with RJ standards. Skills: Required: • Strong proficiency with SQL and hands-on experience working with Oracle databases. • Experience designing and implementing ETL/ELT pipelines and data workflows. 
• Hands-on experience with AWS data services, such as S3, Glue, Redshift, Lambda, and IAM. • Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.). • Solid understanding of data modeling, relational databases, and schema design. • Familiarity with version control, CI/CD, and automation practices. • Ability to collaborate with data scientists to align data structures with model and analytics requirements Preferred: • Experience integrating data for use in AWS SageMaker or other ML platforms. • Exposure to MLOps or ML pipeline orchestration. • Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation). • Knowledge of data warehouse design patterns and best practices. • Experience with data orchestration tools (e.g., Apache Airflow, Step Functions). • Working knowledge of Java is a plus. Education: B.S. in Computer Science, MIS or related degree and a minimum of five (5) years of related experience or combination of education, training and experience.
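The transform step in the kind of pipeline this posting describes, an Oracle extract cleaned into a SageMaker-ready dataset, might look like the minimal pandas sketch below. This is an illustration only: the column names (`account_id`, `balance`, `opened_on`) are hypothetical, and the real extraction (pyodbc) and S3 upload (boto3) steps are noted in comments rather than implemented.

```python
import pandas as pd

def clean_for_ml(df: pd.DataFrame) -> pd.DataFrame:
    """Prepare raw Oracle-extract rows for ML training (illustrative columns)."""
    out = df.dropna(subset=["account_id"]).copy()        # key column must be present
    out["balance"] = out["balance"].fillna(0.0).astype(float)
    out["opened_on"] = pd.to_datetime(out["opened_on"])  # normalize types
    # simple derived feature a data scientist might request
    out["tenure_days"] = (pd.Timestamp("2024-01-01") - out["opened_on"]).dt.days
    return out.reset_index(drop=True)

# In the real pipeline, `raw` would come from an Oracle query via pyodbc,
# and the cleaned frame would be written to S3 (e.g., Parquet via boto3/s3fs).
raw = pd.DataFrame({
    "account_id": ["A1", None, "A3"],
    "balance": [100.0, 50.0, None],
    "opened_on": ["2023-01-01", "2023-02-01", "2023-06-01"],
})
ready = clean_for_ml(raw)
```

In production, the same function would run inside a Glue job or Airflow task, with data-quality checks asserting the invariants shown here before the load step.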
    $71k-91k yearly est. 3d ago
  • Data Architect

    Zensar Technologies 4.3company rating

    Data scientist job in Sunrise, FL

    JD: 14+ years of overall IT experience with expertise in the data landscape - Data Warehouse, Data Lake, etc. Hands-on experience in the Big Data and Hadoop ecosystem; strong skills in SQL, Python, or Spark. Proficient in Data Warehousing concepts and Customer Data Management (Customer 360). Experience in the GCP platform - Dataflow, Dataproc, Kubernetes containers, etc. Expertise in deep data exploration and data analysis. Excellent communication and interpersonal skills.
    $77k-102k yearly est. 2d ago
  • Senior Data Engineer

    Toorak Capital Partners

    Data scientist job in Tampa, FL

    Company: Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business-purpose residential, multifamily, and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis. Summary: The Lead Data Engineer will develop and implement high-performance, scalable data solutions that support Toorak's data strategy, and lead data architecture for Toorak Capital. Lead efforts to create an API framework to use data across customer-facing and back-office applications. Establish consistent data standards, reference architectures, patterns, and practices across the organization for both OLTP and OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies. Lead sourcing and synthesis of Data Standardization and Semantics discovery efforts, turning insights into actionable strategies that define the priorities for the team and rally stakeholders to the vision. Lead the data integration and mapping efforts to harmonize data. Champion standards, guidelines, and direction for ontology, data modeling, semantics, and Data Standardization in general at Toorak. 
    Lead strategies and design solutions for a wide variety of use cases like Data Migration (end-to-end ETL processes), database optimization, and data architectural solutions for Analytics Data Projects Required Skills: Designing and maintaining data models, including conceptual, logical, and physical data models 5+ years of experience using NoSQL systems like MongoDB and DynamoDB, relational SQL database systems (PostgreSQL), and Athena 5+ years of experience in data pipeline development, ETL, and processing of structured and unstructured data 5+ years of experience in large-scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar Proficiency in using data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms Experience with BigQuery and SQLMesh (or similar SQL-based cloud platforms). Knowledge of cloud platforms and technologies such as Google Cloud Platform and Amazon Web Services. Strong SQL skills. Experience with API development and frameworks. Knowledge in designing solutions with Data Quality, Data Lineage, and Data Catalogs Strong background in Data Science, Machine Learning, NLP, and text processing of large data sets Experience in one or more of the following would be nice to have: Dataiku, DataRobot, Databricks, UiPath. Using version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation Ability to rapidly comprehend changes to key business processes and their impact on the overall data framework. Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change. Advanced analytical skills. High level of organization and attention to detail. Self-starter attitude with the ability to work independently. Knowledge of legal, compliance, and regulatory issues impacting data. Experience in finance preferred.
    $72k-99k yearly est. 4d ago
  • ML Data Engineer #978695

    Dexian

    Data scientist job in Seffner, FL

    Job Title: Data Engineer - AI/ML Pipelines Work Model: Hybrid Duration: CTH The Data Engineer - AI/ML Pipelines plays a key role in designing, building, and maintaining scalable data infrastructure that powers analytics and machine learning initiatives. This position focuses on developing production-grade data pipelines that support end-to-end ML workflows, from data ingestion and transformation to feature engineering, model deployment, and monitoring. The ideal candidate has hands-on experience working with operational systems such as Warehouse Management Systems (WMS) or ERP platforms, and is comfortable partnering closely with data scientists, ML engineers, and operational stakeholders to deliver high-quality, ML-ready datasets. Key Responsibilities ML-Focused Data Engineering Build, optimize, and maintain data pipelines specifically designed for machine learning workflows. Collaborate with data scientists to develop feature sets, implement data versioning, and support model training, evaluation, and retraining cycles. Participate in initiatives involving feature stores, model input validation, and monitoring of data quality feeding ML systems. Data Integration from Operational Systems Ingest, normalize, and transform data from WMS, ERP, telemetry, and other operational data sources. Model and enhance operational datasets to support real-time analytics and predictive modeling use cases. Pipeline Automation & Orchestration Build automated, reliable, and scalable pipelines using tools such as Azure Data Factory, Airflow, or Databricks Workflows. Ensure data availability, accuracy, and timeliness across both batch and streaming systems. Data Governance & Quality Implement validation frameworks, anomaly detection, and reconciliation processes to ensure high-quality ML inputs. Support metadata management, lineage tracking, and documentation of governed, auditable data flows. 
Cross-Functional Collaboration Work closely with data scientists, ML engineers, software engineers, and business teams to gather requirements and deliver ML-ready datasets. Translate modeling and analytics needs into efficient, scalable data architecture solutions. Documentation & Mentorship Document data flows, data mappings, and pipeline logic in a clear, reproducible format. Provide guidance and mentorship to junior engineers and analysts on ML-focused data engineering best practices. Required Qualifications Technical Skills Strong experience building ML-focused data pipelines, including feature engineering and model lifecycle support. Proficiency in Python, SQL, and modern data transformation tools (dbt, Spark, Delta Lake, or similar). Solid understanding of orchestrators and cloud data platforms (Azure, Databricks, etc.). Familiarity with ML operations tools such as MLflow, TFX, or equivalent frameworks. Hands-on experience working with WMS or operational/logistics data. Experience 5+ years in data engineering, with at least 2 years directly supporting AI/ML applications or teams. Experience designing and maintaining production-grade pipelines in cloud environments. Proven ability to collaborate with data scientists and translate ML requirements into scalable data solutions. Education & Credentials Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related field (Master's preferred). Relevant certifications are a plus (e.g., Azure AI Engineer, Databricks ML, Google Professional Data Engineer). Preferred Qualifications Experience with real-time ingestion using Kafka, Kinesis, Event Hub, or similar. Exposure to MLOps practices and CI/CD for data pipelines. Background in logistics, warehousing, fulfillment, or similar operational domains.
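The validation-framework responsibility described above can be sketched as a small set of rule-based checks on an incoming batch. This is a minimal illustration, not the client's actual framework: the WMS-style column names (`order_id`, `qty`) and the outlier threshold are invented for the example.

```python
import pandas as pd

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Run simple rule-based quality checks on a WMS-style batch (illustrative)."""
    problems = []
    if df["order_id"].isna().any():
        problems.append("null order_id")
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id")
    if (df["qty"] <= 0).any():
        problems.append("non-positive qty")
    # crude anomaly flag: a quantity far above the batch median
    median = df["qty"].median()
    if (df["qty"] > 10 * max(median, 1)).any():
        problems.append("qty outlier")
    return problems

batch = pd.DataFrame({
    "order_id": [1, 2, 2],
    "qty": [5, 3, -1],
})
issues = validate_batch(batch)
```

In a production pipeline, checks like these would typically run as a gate before data reaches feature engineering, with failures routed to reconciliation or alerting rather than silently dropped.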
    $72k-99k yearly est. 5d ago
  • Life Actuary

    USAA 4.7company rating

    Data scientist job in Tampa, FL

    Why USAA? At USAA, our mission is to empower our members to achieve financial security through highly competitive products, exceptional service and trusted advice. We seek to be the #1 choice for the military community and their families. Embrace a fulfilling career at USAA, where our core values - honesty, integrity, loyalty and service - define how we treat each other and our members. Be part of what truly makes us special and impactful. The Opportunity We are seeking a qualified Life Actuary to join our diverse team. The ideal candidate will possess strong risk management skills, with a particular focus on Interest Rate Risk Management and broader financial risk experience. This role requires an individual who has acquired their ASA designation or FSA designation and has a few years of meaningful experience. Key responsibilities will include experience in Asset-Liability Management (ALM), encompassing liquidity management, asset allocation, cashflow matching, and duration targeting. You will also be responsible for conducting asset adequacy testing to ensure the sufficiency of assets to meet future obligations. Experience in product pricing, especially for annuity products. Furthermore, an understanding of Risk-Based Capital (RBC) frameworks and methodologies is required. Proficiency with actuarial software platforms, with a strong preference for AXIS, is highly advantageous. We offer a flexible work environment that requires an individual to be in the office 4 days per week. This position can be based in one of the following locations: San Antonio, TX, Plano, TX, Phoenix, AZ, Colorado Springs, CO, Charlotte, NC, or Tampa, FL. Relocation assistance is not available for this position. What you'll do: Performs complex work assignments using actuarial modeling software driven models for pricing, valuation, and/or risk management. 
Reviews laws and regulations to ensure all processes are compliant; and provides recommendations for improvements and monitors industry communications regarding potential changes to existing laws and regulations. Runs models, generates reports, and presents recommendations and detailed analysis of all model runs to Actuarial Leadership. May make recommendations for model adjustments and improvements, when appropriate. Shares knowledge with team members and serves as a resource to team on raised issues and navigates obstacles to deliver work product. Leads or participates as a key resource on moderately complex projects through concept, planning, execution, and implementation phases with minimal guidance, involving cross functional actuarial areas. Develops exhibits and reports that help explain proposals/findings and provides information in an understandable and usable format for partners. Identifies and provides recommended solutions to business problems independently, often presenting recommendation to leadership. Maintains accurate price level, price structure, data availability and other requirements to achieve profitability and competitive goals. Identifies critical assumptions to monitor and suggest timely remedies to correct or prevent unfavorable trends. Tests impact of assumptions by identifying sources of gain and loss, the appropriate premiums, interest margins, reserves, and cash values for profitability and viability of new and existing products. Advises management on issues and serves as a primary resource for their individual team members on raised issues. Ensures risks associated with business activities are identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures. What you have: Bachelor's degree; OR 4 years of related experience (in addition to the minimum years of experience required) may be substituted in lieu of degree. 
4 years relevant actuarial or analytical experience and attainment of Fellow within the Society of Actuaries; OR 8 years relevant actuarial experience and attainment of Associate within the Society of Actuaries. Experience performing complex work assignments using actuarial modeling software driven models for pricing, valuation, and/or risk management. Experience presenting complex actuarial analysis and recommendations to technical and non-technical audiences. What sets you apart: Asset-Liability Management (ALM): Experience in ALM, including expertise in liquidity management, asset allocation, cashflow matching, and duration targeting. Asset Adequacy Testing: Experience conducting asset adequacy testing to ensure the sufficiency of assets to meet future obligations. Product Pricing: Experience in pricing financial products, with a particular emphasis on annuity products. Risk-Based Capital (RBC): Experience with risk-based capital frameworks and methodologies. Actuarial Software Proficiency: Familiarity with actuarial software platforms. Experience with AXIS is considered a significant advantage. Actuarial Designations: Attainment of Society of Actuaries Associateship (ASA) or Fellowship (FSA). Compensation range: The salary range for this position is: $127,310 - $243,340. USAA does not provide visa sponsorship for this role. Please do not apply for this role if at any time (now or in the future) you will need immigration support (i.e., H-1B, TN, STEM OPT Training Plans, etc.). Compensation: USAA has an effective process for assessing market data and establishing ranges to ensure we remain competitive. You are paid within the salary range based on your experience and market data of the position. The actual salary for this role may vary by location. Employees may be eligible for pay incentives based on overall corporate and individual performance and at the discretion of the USAA Board of Directors. 
The above description reflects the details considered necessary to describe the principal functions of the job and should not be construed as a detailed description of all the work requirements that may be performed in the job. Benefits: At USAA our employees enjoy best-in-class benefits to support their physical, financial, and emotional wellness. These benefits include comprehensive medical, dental and vision plans, 401(k), pension, life insurance, parental benefits, adoption assistance, paid time off program with paid holidays plus 16 paid volunteer hours, and various wellness programs. Additionally, our career path planning and continuing education assists employees with their professional goals. For more details on our outstanding benefits, visit our benefits page on USAAjobs.com. Applications for this position are accepted on an ongoing basis, this posting will remain open until the position is filled. Thus, interested candidates are encouraged to apply the same day they view this posting. USAA is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
    $85k-106k yearly est. 3d ago
  • Data Scientist

    Ubertal 4.0company rating

    Data scientist job in Orlando, FL

    We are passionate people with an expert understanding of the digital consumer, data sciences, global telecom business, and emerging financial services. And we believe that we can make the world a better place. Job Description We are looking for a candidate building a career in Data Science, with experience applying advanced statistics, data mining, and machine learning algorithms to make data-driven predictions using programming languages like Python (including NumPy, pandas, scikit-learn, Matplotlib, Seaborn) and SQL (PostgreSQL). Experience with Elasticsearch, information/document retrieval, and natural language processing is a plus. Experience with various machine learning methods (classification, clustering, natural language processing, ensemble methods, outlier analysis) and the parameters that affect their performance also helps. You will leverage your strong collaboration skills and ability to extract valuable insights from highly complex data sets to ask the right questions and find the right answers. Qualifications · Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.) · At least 2 years of experience in quantitative analytics or data modeling · Some understanding of predictive modeling, machine learning, clustering and classification techniques, and algorithms · Fluency in Python and SQL; JavaScript/HTML/CSS/web development nice to have · Familiarity with data science frameworks and visualization tools (pandas, visualization libraries such as matplotlib and Altair, Jupyter Notebooks) Additional Information Responsibilities · Analyze raw data: assess quality, cleanse, and structure for downstream processing · Design accurate and scalable prediction algorithms · Collaborate with the engineering team to bring analytical prototypes to production · Generate actionable insights for business improvements All your information will be kept confidential according to EEO guidelines.
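The stack this posting names (NumPy, pandas, scikit-learn) supports the classification workflow it describes. A minimal sketch, using synthetic data as a stand-in for a real cleaned dataset, might look like this:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a cleansed, structured dataset
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit a baseline classifier and evaluate on held-out data
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
```

The same pattern (split, fit, score) extends to the clustering, ensemble, and outlier-analysis methods the posting mentions by swapping in the corresponding scikit-learn estimators.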
    $64k-90k yearly est. 60d+ ago

Learn more about data scientist jobs

How much does a data scientist earn in The Villages, FL?

The average data scientist in The Villages, FL earns between $53,000 and $104,000 annually. This compares to the national average data scientist range of $75,000 to $148,000.

Average data scientist salary in The Villages, FL

$74,000
Job type you want
Full Time
Part Time
Internship
Temporary