Data Architect
Data scientist job in Orlando, FL
Duration: 6 Months
Responsible for enterprise-wide data design, balancing optimization of data access against batch loading and resource utilization factors. Knowledgeable in most aspects of designing and constructing data architectures, operational data stores, and data marts. Focuses on enterprise-wide data modeling and database design. Defines data architecture standards, policies, and procedures for the organization, as well as the structure, attributes, and nomenclature of data elements, and applies accepted data content standards to technology projects. Responsible for business analysis; data acquisition and access analysis and design; database management system optimization; and recovery and load strategy design and implementation.
Essential Position Functions:
Evaluate and recommend data management processes.
Design, prepare and optimize data pipelines and workflows.
Lead implementations of secure, scalable, and reliable Azure solutions.
Recommend approaches for monitoring and optimizing Azure for performance and cost efficiency.
Endorse and foster security best practices, access controls, and compliance standards for all data lake resources.
Perform knowledge transfer about troubleshooting and documenting Azure architectures and solutions.
Skills required:
Deep understanding of Azure Synapse Analytics, Azure Data Factory, and related Azure data tools
Expertise in implementing Data Vault 2.0 methodologies using WhereScape automation software (see the hash-key sketch after this list).
Proficient in designing and optimizing fact and dimension table models.
Demonstrated ability to design, develop, and maintain data pipelines and workflows.
Strong skills in formulating, reviewing, and optimizing SQL code.
Expertise in data collection, storage, accessibility, and quality improvement processes.
Proven track record of delivering consumable data using information marts.
Excellent communication skills to effectively liaise with technical and non-technical team members.
Ability to document designs, procedures, and troubleshooting methods clearly.
Proficiency in Python or PowerShell preferred.
Bachelor's or Master's degree in Computer Science, Information Systems, or another related field; or equivalent work experience.
A minimum of 7 years of experience with large and complex database management systems.
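For orientation, here is a minimal sketch of the hash-key convention at the heart of Data Vault 2.0 (plain Python rather than WhereScape-generated code; the business-key fields in the example are hypothetical):

```python
import hashlib

def hub_hash_key(*business_key_parts: str) -> str:
    """Derive a deterministic hub hash key from a business key.

    Data Vault 2.0 commonly hashes the normalized business key
    (MD5 or SHA-1) so the same key resolves to the same hub row
    across source systems and load runs.
    """
    # Normalize (trim, uppercase) and join with a delimiter so that
    # ("AB", "C") and ("A", "BC") cannot collide.
    normalized = "||".join(part.strip().upper() for part in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Example: hub key for a customer identified by source system + customer number
print(hub_hash_key("SAP", "0001047"))
```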
Data Analyst
Data scientist job in Lake Mary, FL
Hybrid - Tues & Wed On-site in Lake Mary, FL
Brooksource is looking for a detail-oriented and dedicated individual to support our Specialty Pharmacy Distribution client's Customer Master domain. This person will be responsible for working with client accounts and updating and maintaining them as needed.
Responsibilities:
Accurately enter and update customer data in the SAP system.
Maintain and manage customer master data, ensuring data integrity and consistency.
Verify and validate data entries for accuracy and completeness.
Collaborate with cross-functional teams to resolve data discrepancies and ensure timely updates.
Generate and analyze reports to identify and correct data issues.
Assist in the development and implementation of data entry procedures and guidelines.
Provide support for data migration and integration projects.
Ensure compliance with company policies and data management standards.
Qualifications:
High school diploma or equivalent; additional certification in data entry or related field is a plus.
Proven experience in data entry, preferably within the SAP environment.
Familiarity with Customer Master data management.
Strong attention to detail and accuracy.
Excellent organizational and time management skills.
Proficient in Microsoft Office Suite (Excel, Word, Outlook).
Ability to work independently and as part of a team.
Strong communication skills, both written and verbal.
Preferred Skills:
Experience with SAP modules related to Customer Master data.
Knowledge of data governance and data quality principles.
Ability to troubleshoot and resolve data-related issues.
Data Architect
Data scientist job in Orlando, FL
Business Challenge
The company is in the midst of an AI transformation, creating exciting opportunities for growth. At the same time, they are leading a Salesforce modernization and integrating the systems and data of their recent acquisition.
To support these initiatives, they are bringing in a Senior Data Architect/Engineer to establish enterprise standards for application and data architecture, partnering closely with the Solutions Architect and Tech Leads.
Role Overview
The Senior Data Architect/Engineer leads the design, development, and evolution of enterprise data architecture, while contributing directly to the delivery of robust, scalable solutions. This position blends strategy and hands-on engineering, requiring deep expertise in modern data platforms, pipeline development, and cloud-native architecture.
You will:
Define architectural standards and best practices.
Evaluate and implement new tools.
Guide enterprise data initiatives.
Partner with data product teams, engineers, and business stakeholders to build platforms supporting analytics, reporting, and AI/ML workloads.
Day-to-Day Responsibilities
Lead the design and documentation of scalable data frameworks: data lakes, warehouses, streaming architectures, and Azure-native data platforms.
Build and optimize secure, high-performing ETL/ELT pipelines, data APIs, and data models.
Develop solutions that support analytics, advanced reporting, and AI/ML use cases.
Recommend and standardize modern data tools, frameworks, and architectural practices.
Mentor and guide team members, collaborating across business, IT, and architecture groups.
Partner with governance teams to ensure data quality, lineage, security, and stewardship.
Desired Skills & Experience
10+ years of progressive experience in Data Engineering and Architecture.
Strong leadership experience, including mentoring small distributed teams (currently 4 people: 2 onshore, 2 offshore; team growing to 6).
Deep knowledge of Azure ecosystem (Data Lake, Synapse, SQL DB, Data Factory, Databricks).
Proven expertise with ETL pipelines (including 3rd-party/vendor integrations).
Strong SQL and data modeling skills; familiarity with star/snowflake schemas and other approaches.
Hands-on experience creating Data APIs (see the sketch after this list).
Solid understanding of metadata management, governance, security, and data lineage.
Programming experience with SQL, Python, Spark.
Familiarity with containerized compute/orchestration frameworks (Docker, Kubernetes) is a plus.
Experience with Salesforce data models, MDM tools, and streaming platforms (Kafka, Event Hub) is preferred.
Excellent problem-solving, communication, and leadership skills.
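For flavor, a minimal sketch of a read-only data API of the kind listed above (FastAPI and SQLAlchemy chosen purely for illustration; the endpoint, table, and connection string are hypothetical placeholders, not this employer's stack):

```python
from fastapi import FastAPI, HTTPException
from sqlalchemy import create_engine, text

app = FastAPI(title="Customer Data API (sketch)")
engine = create_engine("postgresql://user:pass@host/db")  # placeholder DSN

@app.get("/customers/{customer_id}")
def get_customer(customer_id: int) -> dict:
    """Return one customer dimension row as JSON, or 404 if absent."""
    with engine.connect() as conn:
        row = conn.execute(
            text("SELECT id, name, region FROM dim_customer WHERE id = :id"),
            {"id": customer_id},
        ).mappings().first()
    if row is None:
        raise HTTPException(status_code=404, detail="customer not found")
    return dict(row)
```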
Education:
Bachelor's degree in Computer Science, Information Systems, or related field (Master's preferred).
Azure certifications in Data Engineering or Solution Architecture strongly preferred.
Essential Duties & Time Allocation
Data Architecture Leadership - Define enterprise-wide strategies and frameworks (35%)
Engineering & Delivery - Build and optimize ETL/ELT pipelines, APIs, and models (30%)
Tooling & Standards - Evaluate new tools and support adoption of modern practices (15%)
Mentorship & Collaboration - Mentor engineers and align stakeholders (10%)
Governance & Quality - Embed stewardship, lineage, and security into architecture (10%)
Sr Electronic Data Interchange Coordinator
Data scientist job in Tampa, FL
On-Site Locations: Tampa, FL; Arcadia, WI
(GC/USC Only)
Senior EDI Coordinator
Senior EDI Coordinators create new and update existing EDI maps to support the movement of thousands of transactions each day, set up and maintain EDI trading partners, set up and maintain EDI communication configurations, and provide support for a large assortment of EDI transactions with a variety of trading partners.
Primary Job Functions:
Monitor inbound and outbound transaction processing to ensure successful delivery. Take corrective action on those transactions that are not successful.
Develop and modify EDI translation maps according to Business Requirements Documents and EDI Specifications.
Perform unit testing and coordinate integrated testing with internal and external parties.
Perform map reviews to ensure new maps and map changes comply with requirements and standards.
Prepare, maintain, and review documentation. This includes Mapping Documents, Standard Operating Procedures, and System Documentation.
Perform Trading Partner setup, configuration, and administrative activities.
Analyze and troubleshoot connectivity, mapping, and data issues.
Provide support to our business partners and external parties.
Participate in an after-hours on-call rotation.
Setup and maintain EDI communication channels.
Provide coaching and mentoring to EDI Coordinators.
Suggest EDI best practices and opportunities for improvement.
Maintain and update AS2 Certificates.
Deploy map changes to production.
Perform EDI system maintenance and upgrades.
Job Qualifications:
Education:
Bachelor's Degree in Information Systems, Computer Science, or other related fields; or equivalent combination of education and experience, Required
Experience:
5+ years of practical EDI mapping experience, with emphasis on ANSI X12, Required
Experience working with XML and JSON transactions, Preferred
Experience working with AS2, VAN, and SFTP communications, Preferred
Experience working with AS2 Certificates, Preferred
Experience with Azure DevOps Agile/Scrum platform, Preferred
Experience in large, complex enterprise environments, Preferred
Knowledge, Skills and Abilities:
Advanced analytical and problem-solving skills
Strong attention to detail
Excellent written and verbal communication skills
Excellent client facing and interpersonal skills
Effective time management and organizational skills
Work independently as well as in a team environment
Handle multiple projects simultaneously within established time constraints
Perform under strong demands in a fast-paced environment
Display empathy, understanding and patience with employees and external customers
Respond professionally in situations with difficult employee/vendor/customer issues or inquiries
Working knowledge of Continuous Improvement methodologies
Strong working knowledge of Microsoft Office Suite
Production Data Coordinator
Data scientist job in Sarasota, FL
eComSystems is seeking a hyper-detail-oriented Production Coordinator responsible for serving as the team's dedicated data and system architect. This specialized role ensures the seamless integration of all client data into the AdStudio platform. The Coordinator takes the initial brief from the Project Manager and transforms it into a 100% clean, ready-to-design file shell, enabling the Production Artists to focus purely on creative execution.
Key Responsibilities
Data Mastering & Preparation: Own the end-to-end management, manipulation, and upload of all data files (primarily using Microsoft Excel) required for weekly circular production.
System Architecture: Utilize coding tools (e.g., VBA) and advanced Excel functions to cleanse, format, and validate large datasets to ensure compliance with AdStudio's strict import requirements (see the validation sketch after this list).
AdStudio Shell Building: Execute data merges, image uploads, and template application to create the production-ready ad files ("shells") that are handed off to the Production Artists.
Process Integrity: Perform strict internal quality checks on all data imports and shell builds to eliminate errors before they reach the design phase (critical for the high volume of weekly ads).
Asset Management: Coordinate with the Project Manager to ensure all assets are sourced, named correctly, and available in the shared directories for seamless integration.
Workflow Handoff: Formally update the project tracking system to signal the clean handoff of files to the Production Artist team.
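As an illustration of the cleanse-and-validate pass described in the list above (sketched in Python/pandas rather than VBA or Excel; the column names and rules are hypothetical, not AdStudio's actual import spec):

```python
import pandas as pd

# Hypothetical weekly circular columns; real AdStudio import rules will differ.
REQUIRED = ["sku", "description", "price", "image_file"]

def prepare_import(path: str) -> pd.DataFrame:
    """Cleanse, format, and validate a raw data file before shell building."""
    df = pd.read_csv(path, dtype=str)
    # Cleanse: trim stray whitespace and drop fully empty rows.
    df = df.apply(lambda col: col.str.strip() if col.dtype == object else col)
    df = df.dropna(how="all")
    # Validate: every required column must exist and be populated.
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"missing columns: {missing}")
    blank = df[df[REQUIRED].isna().any(axis=1)]
    if not blank.empty:
        raise ValueError(f"{len(blank)} rows have blank required fields")
    # Format: normalize price to two decimals for the import.
    df["price"] = df["price"].astype(float).round(2)
    return df
```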
Qualifications
3+ years of experience in a high-volume production environment, focusing specifically on data manipulation, systems integration, or production coordination.
Expert proficiency in Microsoft Excel is mandatory, including mastery of complex formulas, pivot tables, and data validation techniques. Familiarity with VBA or other scripting languages is highly desirable.
Experience with proprietary software, content management systems (CMS), or data-driven design platforms (AdStudio or similar) is strongly preferred.
Demonstrated high attention to detail, precision, and a proactive approach to troubleshooting data errors.
Exceptional organizational skills with a strong ability to adhere to strict procedural guidelines.
About eCom
eComSystems (“eCom”) provides proprietary ad tech solutions for well-known national brands, clients and retailers, enabling them to do business better, improve efficiencies, and impact the bottom line. For more than 25 years, eCom has been the marketing technology platform of major distributors, retailers, and wholesalers across the US. With a focus on grocery, hardware, building materials, pharmacy, food distribution, and sporting goods channels, eCom's patented omnichannel platform creates, distributes, and manages national promotions through digital, web, social, and print media.
Job Type: Full-time
Benefits:
401(k)
Dental insurance
Health insurance
Life insurance
Paid time off
Vision insurance
Ability to Commute:
Sarasota, FL 34240 (Required)
Pay: $50,000.00 - $60,000.00 per year
Data Quality Analyst
Data scientist job in Orlando, FL
Sanford Rose JFSPartners is currently looking for a Data Quality Analyst for a full-time opportunity in Orlando. Qualified candidates will participate in the full data quality lifecycle, from requirement gathering through ongoing support. The candidate selected for this role will develop technical components that satisfy business/functional requirements or address logged data incidents.
RESPONSIBILITIES:
Develop technical specifications that demonstrate how data quality will be preserved/enforced.
Work with the BA team to generate data to power quality dashboards, which allow both data providers and data consumers to monitor data quality.
Contribute to business/technical definitions of data objects within the data catalogue.
Serve as an SME for multiple data domains. Assist business users in the selection, understanding and use of data.
Perform UAT on data sets as part of data ingestion, egress, transformation and rule execution.
REQUIRED TECHNICAL SKILLS:
Strong understanding of data structures, data types, and data transformation.
Ability to build complex data mappings, workflows, and sessions.
Experience with SQL and other data transformation/analytics tools such as Informatica, Talend, or Alteryx.
Expertise in reading, analyzing, and debugging SQL.
Experience with, or willingness to learn, data profiling/quality tools such as Collibra, Ataccama, Informatica, or OEDQ.
At Sanford Rose Associates - JFSPartners, we specialize in Finance & Accounting, Legal, and Information Technology recruitment, dedicated to helping professionals like you discover the perfect career opportunities. With a track record of assisting thousands of professionals nationwide, we are prepared to leverage our expertise on your behalf. Partnering with us means gaining access to serious candidates, minimizing hiring errors, and ensuring top-tier hires, all while navigating the hiring process with confidence. We understand the significance of finding the ideal role and aligning with an organization that shares your values.
AWS data Architect
Data scientist job in Sunrise, FL
Role: Sr. Data Architect
Duration: Contract
Skills: Hadoop, Big Data, DWH, GCP or AWS
JD:
14+ years of overall IT experience with expertise in the data landscape - Data Warehouse, Data Lake, etc.
Hands-on experience in the Big Data and Hadoop ecosystem; strong skills in SQL, Python, or Spark
Proficient in Data Warehousing concepts and Customer Data Management (Customer 360)
Experience with the GCP platform - Dataflow, Dataproc, Kubernetes containers, etc.
Expertise in deep data exploration and data analysis
Excellent communication and interpersonal skills
Lead Data Engineer
Data scientist job in Tampa, FL
A leading Investment Management Firm is looking to bring on a Lead Data Engineer to join its team in Tampa, Denver, Memphis, or Southfield. This is an excellent chance to work alongside industry leaders while being hands-on and helping lead the team.
Key Responsibilities
Project Oversight: Direct end-to-end software development activities, from initial requirements through deployment, ensuring projects meet deadlines and quality standards.
Database Engineering: Architect and refine SQL queries, stored procedures, and schema designs to maximize efficiency and scalability within Oracle environments.
Performance Tuning: Evaluate system performance and apply strategies to enhance data storage and retrieval processes.
Data Processing: Utilize tools like Pandas and Spark for data wrangling, transformation, and analysis.
Python Solutions: Develop and maintain Python-based applications and automation workflows.
Pipeline Automation: Implement and manage continuous integration and delivery pipelines using Jenkins and similar technologies to optimize build, test, and release cycles.
Team Development: Guide and support junior engineers, promoting collaboration and technical growth.
Technical Documentation: Create and maintain comprehensive documentation for all development initiatives.
Core Skills
Experience: Over a decade in software engineering, with deep expertise in Python and Oracle database systems.
Technical Knowledge: Strong command of SQL, Oracle, Python, Spark, Jenkins, Kubernetes, Pandas, and modern CI/CD practices.
Optimization Expertise: Skilled in database tuning and applying best practices for performance.
Leadership Ability: Proven track record in managing teams and delivering complex projects.
Analytical Strength: Exceptional problem-solving capabilities with a data-centric mindset.
Communication: Clear and effective written and verbal communication skills.
Education: Bachelor's degree in Computer Science, Engineering, or equivalent professional experience.
Preferred Qualifications
Certifications: Professional credentials in Oracle, Python, Kubernetes, or CI/CD technologies.
Agile Background: Hands-on experience with Agile or Scrum frameworks.
Cloud Platforms: Familiarity with AWS, Azure, or Google Cloud services.
Data Modeling
Data scientist job in Melbourne, FL
Must Have Technical/Functional Skills
• 5+ years of experience in data modeling, data architecture, or a similar role
• Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, or PostgreSQL
• Experience with data modeling tools such as Erwin, IBM Infosphere Data Architect, or similar
• Ability to communicate complex concepts clearly to diverse audiences
Roles & Responsibilities
• Design and develop conceptual, logical, and physical data models that support both operational and analytical needs
• Collaborate with business stakeholders to gather requirements and translate them into scalable data models
• Perform data profiling and analysis to understand data quality issues and identify opportunities for improvement
• Implement best practices for data modeling, including normalization, denormalization, and indexing strategies
• Lead data architecture discussions and present data modeling solutions to technical and non-technical audiences
• Mentor and guide junior data modelers and data architects within the team
• Continuously evaluate data modeling tools and techniques to enhance team efficiency and productivity
Base Salary Range: $100,000 - $150,000 per annum
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Time Off, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
Data Engineer
Data scientist job in Palm Beach Gardens, FL
Flybridge Staffing is currently searching for a Data Engineer for a client located in the Palm Beach Gardens area. This is a direct-hire position that will work off a hybrid schedule of 2 days remote. This person will design systems that supply high-performance datasets for advanced analytics.
Experience:
BA degree and 5+ years of Data Engineering experience
Strong experience building ETL data pipelines for on-premises SQL Server 2017 or newer
Deep understanding of the development of data pipelines with either SSIS or Python
Broad experience with SQL Server, including Columnstore, etc.
Extensive experience using SSMS and T-SQL to create and maintain SQL Server tables, views, functions, stored procedures, and user-defined table types.
Experience with data modeling, indexes, temporal tables, CLR, and Service Broker.
Experience in partitioning tables and indexes, and performance improvement with Query Analyzer
Experience writing C#, PowerShell, and Python.
Experience with source control integration with GitHub, BitBucket, and Azure DevOps.
Experience working in an Agile and Kanban SDLC.
Experience with cloud-based data management solutions such as Snowflake, Redshift.
Experience with Python programming is a plus. Libraries such as pandas, NumPy, csv, traceback, json, pyodbc, and math are nice to have.
Experience writing design documentation such as ERDs, Data Flow Diagrams, and Process Flow Diagrams.
Experience with open-source database engines such as ClickHouse, ArcticDB, and PostgreSQL is a plus.
Responsibilities:
Collaborate effectively with Stakeholders, Project Managers, Software Engineers, Data Analysts, QA Analysts, DBAs, and Data Engineers.
Build and maintain data pipelines based on functional and non-functional requirements.
Proactively seek out information and overcome obstacles to deliver projects efficiently.
Ensure that data pipelines incorporate best practices related to performance, scaling, extensibility, fault tolerance, instrumentation, and maintainability.
Ensure that data pipelines are kept simple and not overly engineered.
Produce and maintain design and operational documentation.
Analyze complex data problems and engineer elegant solutions.
****NO SPONSORSHIP AVAILABLE**** US Citizen, GC, EAD only please. If your background aligns with the above details and you would like to learn more, please submit your resume to jobs@flybridgestaffing.com or on our website, www.flybridgestaffing.com and one of our recruiters will be in touch with you ASAP.
Follow us on LinkedIn to keep up with all our latest job openings and referral program.
GCP Data Architect with 14+ years (Day 1 onsite)
Data scientist job in Sunrise, FL
12-14 years of overall IT experience with expertise in the data landscape - Data Warehouse, Data Lake, etc.
Hands-on experience in the Big Data and Hadoop ecosystem; strong skills in SQL, Python, or Spark
Proficient in Data Warehousing concepts and Customer Data Management (Customer 360)
Experience with the GCP platform - Dataflow, Dataproc, Kubernetes containers, etc.
Expertise in deep data exploration and data analysis
Excellent communication and interpersonal skills
Data Modeler - Manager
Data scientist job in Miami, FL
South Miami, FL (Onsite)
Fulltime employment with Big 4
We're seeking an experienced Data Modeling Manager to lead the design and development of enterprise‑grade data models and cloud analytics solutions. This role is ideal for someone with deep expertise in dimensional modeling, strong leadership skills, and hands‑on experience with modern data platforms.
What You'll Do
Lead the design of conceptual, logical, and physical data models using Kimball methodology
Build scalable data architecture and star schemas (fact/dimension tables) for analytics and BI
Define and implement SCD Type 2 and other SCD types (see the sketch after this list)
Create source‑to‑target mappings and data transformation specifications
Collaborate with cross‑functional teams to translate business needs into technical models
Optimize data models for performance and maintain metadata/data dictionaries
Manage and mentor a small team (1-3 resources)
Stay current with industry best practices and emerging data modeling techniques
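As a rough sketch of the SCD Type 2 expire-and-insert pattern named in the list above (pandas used purely for illustration; the table and column names are hypothetical, and production implementations are typically set-based SQL or ETL-tool logic):

```python
import pandas as pd

HIGH_DATE = pd.Timestamp("9999-12-31")

def apply_scd2(dim, incoming, key, attrs, as_of):
    """Minimal SCD Type 2: expire changed current rows, insert new versions."""
    current = dim[dim["is_current"]]
    merged = incoming.merge(current[[key] + attrs], on=key,
                            how="left", suffixes=("", "_cur"))
    # A key needs a new version if it has no current row (brand new)
    # or any tracked attribute differs from the current version.
    changed = merged[[f"{a}_cur" for a in attrs]].isna().all(axis=1)
    for a in attrs:
        changed |= merged[a].ne(merged[f"{a}_cur"])
    keys_to_version = merged.loc[changed, key]

    # 1) Expire superseded current rows.
    expire = dim[key].isin(keys_to_version) & dim["is_current"]
    dim.loc[expire, "effective_to"] = as_of
    dim.loc[expire, "is_current"] = False

    # 2) Insert fresh current versions (surrogate keys omitted for brevity).
    new_rows = incoming[incoming[key].isin(keys_to_version)].copy()
    new_rows["effective_from"] = as_of
    new_rows["effective_to"] = HIGH_DATE
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)
```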
What We're Looking For
10+ years of hands‑on data modeling experience
Strong SQL and dimensional modeling expertise
Experience with major RDBMS or cloud data warehouses
Proficiency with data modeling tools (Erwin, PowerDesigner, ER/Studio, Visio)
Deep understanding of SCD2, surrogate keys, and data warehousing principles
Strong communication, documentation, and stakeholder‑management skills
Cloud knowledge or certification (Azure DP‑203 preferred) is a plus
Spanish or Italian language skills are a bonus
Big Data Architect
Data scientist job in Sunrise, FL
Big Data Architect || Location: Sunrise, FL (Hybrid) || Hire Type: Contract (W2)
Project description
Skills: Hadoop, Big Data, DWH, GCP or AWS
14+ years of overall IT experience with expertise in the data landscape - Data Warehouse, Data Lake, etc.
Hands-on experience in the Big Data and Hadoop ecosystem; strong skills in SQL, Python, or Spark
Proficient in Data Warehousing concepts and Customer Data Management (Customer 360)
Experience with the GCP platform - Dataflow, Dataproc, Kubernetes containers, etc.
Expertise in deep data exploration and data analysis
ML Data Engineer #978695
Data scientist job in Seffner, FL
Job Title: Data Engineer - AI/ML Pipelines
Work Model: Hybrid
Duration: CTH
The Data Engineer - AI/ML Pipelines plays a key role in designing, building, and maintaining scalable data infrastructure that powers analytics and machine learning initiatives. This position focuses on developing production-grade data pipelines that support end-to-end ML workflows, from data ingestion and transformation to feature engineering, model deployment, and monitoring.
The ideal candidate has hands-on experience working with operational systems such as Warehouse Management Systems (WMS) or ERP platforms, and is comfortable partnering closely with data scientists, ML engineers, and operational stakeholders to deliver high-quality, ML-ready datasets.
Key Responsibilities
ML-Focused Data Engineering
Build, optimize, and maintain data pipelines specifically designed for machine learning workflows.
Collaborate with data scientists to develop feature sets, implement data versioning, and support model training, evaluation, and retraining cycles.
Participate in initiatives involving feature stores, model input validation, and monitoring of data quality feeding ML systems.
Data Integration from Operational Systems
Ingest, normalize, and transform data from WMS, ERP, telemetry, and other operational data sources.
Model and enhance operational datasets to support real-time analytics and predictive modeling use cases.
Pipeline Automation & Orchestration
Build automated, reliable, and scalable pipelines using tools such as Azure Data Factory, Airflow, or Databricks Workflows (see the sketch below).
Ensure data availability, accuracy, and timeliness across both batch and streaming systems.
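A minimal Airflow sketch of the kind of batch orchestration described above (the DAG, task names, and callables are hypothetical placeholders; Azure Data Factory or Databricks Workflows would express the same flow in their own constructs):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_wms(): ...      # pull raw WMS extracts into the lake (placeholder)
def build_features(): ...  # transform into ML-ready feature tables (placeholder)
def publish(): ...         # refresh datasets consumed by training jobs (placeholder)

with DAG(
    dag_id="wms_ml_features",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_wms", python_callable=ingest_wms)
    features = PythonOperator(task_id="build_features", python_callable=build_features)
    out = PythonOperator(task_id="publish", python_callable=publish)
    ingest >> features >> out  # linear dependency: ingest, then build, then publish
```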
Data Governance & Quality
Implement validation frameworks, anomaly detection, and reconciliation processes to ensure high-quality ML inputs.
Support metadata management, lineage tracking, and documentation of governed, auditable data flows.
Cross-Functional Collaboration
Work closely with data scientists, ML engineers, software engineers, and business teams to gather requirements and deliver ML-ready datasets.
Translate modeling and analytics needs into efficient, scalable data architecture solutions.
Documentation & Mentorship
Document data flows, data mappings, and pipeline logic in a clear, reproducible format.
Provide guidance and mentorship to junior engineers and analysts on ML-focused data engineering best practices.
Required Qualifications
Technical Skills
Strong experience building ML-focused data pipelines, including feature engineering and model lifecycle support.
Proficiency in Python, SQL, and modern data transformation tools (dbt, Spark, Delta Lake, or similar).
Solid understanding of orchestrators and cloud data platforms (Azure, Databricks, etc.).
Familiarity with ML operations tools such as MLflow, TFX, or equivalent frameworks.
Hands-on experience working with WMS or operational/logistics data.
Experience
5+ years in data engineering, with at least 2 years directly supporting AI/ML applications or teams.
Experience designing and maintaining production-grade pipelines in cloud environments.
Proven ability to collaborate with data scientists and translate ML requirements into scalable data solutions.
Education & Credentials
Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related field (Master's preferred).
Relevant certifications are a plus (e.g., Azure AI Engineer, Databricks ML, Google Professional Data Engineer).
Preferred Qualifications
Experience with real-time ingestion using Kafka, Kinesis, Event Hub, or similar.
Exposure to MLOps practices and CI/CD for data pipelines.
Background in logistics, warehousing, fulfillment, or similar operational domains.
Sr. Data Engineer (SQL+Python+AWS)
Data scientist job in Saint Petersburg, FL
Looking for a Sr. Data Engineer (SQL + Python + AWS) for a 12+ month contract (potential extension or conversion to full-time), hybrid at St. Petersburg, FL 33716, with a direct financial client. W2 only; US Citizens or Green Card holders.
Notes from the Hiring Manager:
• Setting up Python environments and data structures to support the Data Science/ML team.
• No prior Data Science or Machine Learning experience required.
• Role involves building new data pipelines and managing file-loading connections.
• Strong SQL skills are essential.
• Contract-to-hire position.
• Hybrid role based in St. Pete, FL (33716) only.
Duties:
This role involves building and maintaining data pipelines that connect Oracle-based source systems to AWS cloud environments, providing well-structured data for analysis and machine learning in AWS SageMaker.
It includes working closely with data scientists to deliver scalable data workflows as a foundation for predictive modeling and analytics.
• Develop and maintain data pipelines to extract, transform, and load data from Oracle databases and other systems into AWS environments (S3, Redshift, Glue, etc.); a sketch follows this list.
• Collaborate with data scientists to ensure data is prepared, cleaned, and optimized for SageMaker-based machine learning workloads.
• Implement and manage data ingestion frameworks, including batch and streaming pipelines.
• Automate and schedule data workflows using AWS Glue, Step Functions, or Airflow.
• Develop and maintain data models, schemas, and cataloging processes for discoverability and consistency.
• Optimize data processes for performance and cost efficiency.
• Implement data quality checks, validation, and governance standards.
• Work with DevOps and security teams to comply with RJ standards.
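For orientation, a stripped-down sketch of the Oracle-to-S3 extract step described above (connection details, query, bucket, and key are hypothetical placeholders; a production pipeline would add incremental watermarks, retries, and Glue catalog registration):

```python
import boto3
import oracledb  # the required stack also lists pyodbc as an option
import pandas as pd

def extract_to_s3(query: str, bucket: str, key: str) -> None:
    """Pull an Oracle result set and land it in S3 as Parquet."""
    with oracledb.connect(user="etl", password="***", dsn="db-host/ORCL") as conn:
        cur = conn.cursor()
        cur.execute(query)
        cols = [d[0].lower() for d in cur.description]
        df = pd.DataFrame(cur.fetchall(), columns=cols)

    df.to_parquet("/tmp/extract.parquet", index=False)  # SageMaker-friendly format
    boto3.client("s3").upload_file("/tmp/extract.parquet", bucket, key)

# extract_to_s3("SELECT * FROM trades WHERE trade_date = TRUNC(SYSDATE)",
#               "analytics-raw", "trades/latest/trades.parquet")
```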
Skills:
Required:
• Strong proficiency with SQL and hands-on experience working with Oracle databases.
• Experience designing and implementing ETL/ELT pipelines and data workflows.
• Hands-on experience with AWS data services, such as S3, Glue, Redshift, Lambda, and IAM.
• Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.).
• Solid understanding of data modeling, relational databases, and schema design.
• Familiarity with version control, CI/CD, and automation practices.
• Ability to collaborate with data scientists to align data structures with model and analytics requirements
Preferred:
• Experience integrating data for use in AWS SageMaker or other ML platforms.
• Exposure to MLOps or ML pipeline orchestration.
• Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation).
• Knowledge of data warehouse design patterns and best practices.
• Experience with data orchestration tools (e.g., Apache Airflow, Step Functions).
• Working knowledge of Java is a plus.
Education:
B.S. in Computer Science, MIS or related degree and a minimum of five (5) years of related experience or combination of education, training and experience.
Data Architect
Data scientist job in Sunrise, FL
JD:
14+ years of overall IT experience with expertise in the data landscape - Data Warehouse, Data Lake, etc.
Hands-on experience in the Big Data and Hadoop ecosystem; strong skills in SQL, Python, or Spark
Proficient in Data Warehousing concepts and Customer Data Management (Customer 360)
Experience with the GCP platform - Dataflow, Dataproc, Kubernetes containers, etc.
Expertise in deep data exploration and data analysis
Excellent communication and interpersonal skills
Claims Data Engineer
Data scientist job in Plantation, FL
NationsBenefits is recognized as one of the fastest growing companies in America and a Healthcare Fintech provider of supplemental benefits, flex cards, and member engagement solutions. We partner with managed care organizations to provide innovative healthcare solutions that drive growth, improve outcomes, reduce costs, and bring value to their members.
Through our comprehensive suite of innovative supplemental benefits, fintech payment platforms, and member engagement solutions, we help health plans deliver high-quality benefits to their members that address the social determinants of health and improve member health outcomes and satisfaction.
Our compliance-focused infrastructure, proprietary technology systems, and premier service delivery model allow our health plan partners to deliver high-quality, value-based care to millions of members.
We offer a fulfilling work environment that attracts top talent and encourages all associates to contribute to delivering premier service to internal and external customers alike. Our goal is to transform the healthcare industry for the better! We provide career advancement opportunities from within the organization across multiple locations in the US, South America, and India.
Position Summary:
We are seeking a seasoned EDI 837 Claims Data Engineer to design, develop, and maintain data pipelines that process healthcare claims in compliance with HIPAA and ANSI X12 standards. This role requires deep expertise in Electronic Data Interchange (EDI), particularly the 837 transaction set, and will be pivotal in ensuring accurate, timely, and secure claims data exchange across payers, providers, clearinghouses, state agencies, and CMS.
Key Responsibilities
EDI Development & Integration
Design, build, and maintain pipelines for processing 837 healthcare claim transactions (see the parsing sketch below).
Implement and support EDI workflows across multiple trading partners.
Ensure compliance with HIPAA regulations and ANSI X12 standards.
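For a taste of the data format involved, here is a deliberately minimal tokenizer sketch. It assumes the common "*" element separator and "~" segment terminator; production code must read the delimiters from the ISA envelope and use a real X12 parser with HIPAA validation:

```python
def split_segments(raw: str, seg_term: str = "~", elem_sep: str = "*"):
    """Tokenize an X12 interchange into segments of elements."""
    for seg in filter(None, (s.strip() for s in raw.split(seg_term))):
        yield seg.split(elem_sep)

# Tiny fabricated fragment; real 837s carry full ISA/GS envelopes and loops.
sample = "ST*837*0001~BHT*0019*00*12345*20240101*1200*CH~SE*3*0001~"
for elements in split_segments(sample):
    print(elements[0], elements[1:])
# The ST segment's first element ("837") identifies the transaction set.
```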
Data Engineering
Develop ETL processes to transform, validate, and load claims data into enterprise data warehouses.
Optimize data flows for scalability, reliability, and performance.
Collaborate with analysts and stakeholders to ensure claims data accuracy.
Write and optimize SQL queries, stored procedures, and scripts for validation and reporting.
Monitoring & Troubleshooting
Monitor EDI transactions for errors, rejections, and compliance issues.
Troubleshoot and resolve data mapping, translation, and connectivity problems.
Perform root cause analysis and implement corrective actions.
Collaboration
Work closely with business analysts, QA teams, and IT operations to support claims processing.
Partner with healthcare domain experts to align technical solutions with business needs.
Required Skills & Qualifications
5+ years of experience in healthcare data engineering or claims integration.
Strong expertise with EDI 837 transactions and healthcare claims processing.
Proven experience with Medicaid and Medicare data exchanges between state agencies and CMS.
Hands-on experience with Databricks, SSIS, and SQL Server.
Knowledge of HIPAA compliance, CMS reporting requirements, and interoperability standards.
Strong problem-solving skills and ability to work in cross-functional teams.
Excellent communication and documentation skills.
Preferred Skills
Experience with Azure cloud platforms
Familiarity with other EDI transactions (835, 270/271, 276/277).
Exposure to data governance frameworks and security best practices.
Background in data warehousing and healthcare analytics.
Senior Data Engineer
Data scientist job in Tampa, FL
Company:
Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business purpose residential, multifamily, and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.
Summary:
The role of the Lead Data Engineer is to develop and implement high-performance, scalable data solutions to support Toorak's Data Strategy.
Lead Data architecture for Toorak Capital.
Lead efforts to create API framework to use data across customer facing and back office applications.
Establish consistent data standards, reference architectures, patterns, and practices across the organization for OLTP, OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies
Lead sourcing and synthesis of Data Standardization and Semantics discovery efforts, turning insights into actionable strategies that define the team's priorities and rally stakeholders to the vision
Lead the data integration and mapping efforts to harmonize data.
Champion standards, guidelines, and direction for ontology, data modeling, semantics and Data Standardization in general at Toorak.
Lead strategies and design solutions for a wide variety of use cases like Data Migration (end-to-end ETL process), database optimization, and data architectural solutions for Analytics Data Projects
Required Skills:
Designing and maintaining the data models, including conceptual, logical, and physical data models
5+ years of experience using NoSQL systems (MongoDB, DynamoDB), relational SQL database systems (PostgreSQL), and Athena
5+ years of experience on Data Pipeline development, ETL and processing of structured and unstructured data
5+ years of experience in large-scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar (see the sketch after this list)
Proficiency in using data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms
Experience with BigQuery and SQLMesh (or similar SQL-based cloud platforms).
Knowledge of cloud platforms and technologies such as Google Cloud Platform, Amazon Web Services.
Strong SQL skills.
Experience with API development and frameworks.
Knowledge in designing solutions with Data Quality, Data Lineage, and Data Catalogs
Strong background in Data Science, Machine Learning, NLP, Text processing of large data sets
Experience in one or more of the following: Dataiku, DataRobot, Databricks, UiPath would be nice to have.
Using version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation
Ability to rapidly comprehend changes to key business processes and the impact on overall Data framework.
Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change.
Advanced analytical skills.
High level of organization and attention to detail.
Self-starter attitude with the ability to work independently.
Knowledge of legal, compliance, and regulatory issues impacting data.
Experience in finance preferred.
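As a minimal illustration of the Spark-plus-Kafka streaming stack listed above (the broker, topic, and sink paths are hypothetical placeholders):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("loan-events-stream").getOrCreate()

# Read a Kafka topic as an unbounded stream of key/value records.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "loan-events")
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

# Land micro-batches to the lake; the checkpoint tracks progress for recovery.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://lake/raw/loan_events/")
    .option("checkpointLocation", "s3a://lake/_chk/loan_events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```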
Life Actuary
Data scientist job in Tampa, FL
Why USAA?
At USAA, our mission is to empower our members to achieve financial security through highly competitive products, exceptional service and trusted advice. We seek to be the #1 choice for the military community and their families.
Embrace a fulfilling career at USAA, where our core values - honesty, integrity, loyalty and service - define how we treat each other and our members. Be part of what truly makes us special and impactful.
The Opportunity
We are seeking a qualified Life Actuary to join our diverse team. The ideal candidate will possess strong risk management skills, with a particular focus on Interest Rate Risk Management and broader financial risk experience. This role requires an individual who has earned an ASA or FSA designation and has a few years of meaningful experience.
Key responsibilities will draw on experience in Asset-Liability Management (ALM), encompassing liquidity management, asset allocation, cashflow matching, and duration targeting. You will also be responsible for conducting asset adequacy testing to ensure the sufficiency of assets to meet future obligations. Experience in product pricing, especially for annuity products, is expected. Furthermore, an understanding of Risk-Based Capital (RBC) frameworks and methodologies is required. Proficiency with actuarial software platforms, with a strong preference for AXIS, is highly advantageous.
We offer a flexible work environment that requires an individual to be in the office 4 days per week. This position can be based in one of the following locations: San Antonio, TX, Plano, TX, Phoenix, AZ, Colorado Springs, CO, Charlotte, NC, or Tampa, FL.
Relocation assistance is not available for this position.
What you'll do:
Performs complex work assignments using actuarial modeling software driven models for pricing, valuation, and/or risk management.
Reviews laws and regulations to ensure all processes are compliant; and provides recommendations for improvements and monitors industry communications regarding potential changes to existing laws and regulations.
Runs models, generates reports, and presents recommendations and detailed analysis of all model runs to Actuarial Leadership.
May make recommendations for model adjustments and improvements, when appropriate.
Shares knowledge with team members and serves as a resource to the team on escalated issues, navigating obstacles to deliver work product.
Leads or participates as a key resource on moderately complex projects through concept, planning, execution, and implementation phases with minimal guidance, involving cross-functional actuarial areas.
Develops exhibits and reports that help explain proposals/findings and provides information in an understandable and usable format for partners.
Identifies and provides recommended solutions to business problems independently, often presenting recommendation to leadership.
Maintains accurate price level, price structure, data availability and other requirements to achieve profitability and competitive goals.
Identifies critical assumptions to monitor and suggest timely remedies to correct or prevent unfavorable trends.
Tests impact of assumptions by identifying sources of gain and loss, the appropriate premiums, interest margins, reserves, and cash values for profitability and viability of new and existing products.
Advises management on issues and serves as a primary resource for individual team members on escalated issues.
Ensures risks associated with business activities are identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures.
What you have:
Bachelor's degree; OR 4 years of related experience (in addition to the minimum years of experience required) may be substituted in lieu of degree.
4 years relevant actuarial or analytical experience and attainment of Fellow within the Society of Actuaries; OR 8 years relevant actuarial experience and attainment of Associate within the Society of Actuaries.
Experience performing complex work assignments using actuarial modeling software driven models for pricing, valuation, and/or risk management.
Experience presenting complex actuarial analysis and recommendations to technical and non-technical audiences.
What sets you apart:
Asset-Liability Management (ALM): Experience in ALM, including expertise in liquidity management, asset allocation, cashflow matching, and duration targeting.
Asset Adequacy Testing: Experience conducting asset adequacy testing to ensure the sufficiency of assets to meet future obligations.
Product Pricing: Experience in pricing financial products, with a particular emphasis on annuity products.
Risk-Based Capital (RBC): Experience with risk-based capital frameworks and methodologies.
Actuarial Software Proficiency: Familiarity with actuarial software platforms. Experience with AXIS is considered a significant advantage.
Actuarial Designations: Attainment of Society of Actuaries Associateship (ASA) or Fellowship (FSA).
Compensation range: The salary range for this position is: $127,310 - $243,340.
USAA does not provide visa sponsorship for this role. Please do not apply for this role if at any time (now or in the future) you will need immigration support (i.e., H-1B, TN, STEM OPT Training Plans, etc.).
Compensation: USAA has an effective process for assessing market data and establishing ranges to ensure we remain competitive. You are paid within the salary range based on your experience and market data of the position. The actual salary for this role may vary by location.
Employees may be eligible for pay incentives based on overall corporate and individual performance and at the discretion of the USAA Board of Directors.
The above description reflects the details considered necessary to describe the principal functions of the job and should not be construed as a detailed description of all the work requirements that may be performed in the job.
Benefits: At USAA our employees enjoy best-in-class benefits to support their physical, financial, and emotional wellness. These benefits include comprehensive medical, dental and vision plans, 401(k), pension, life insurance, parental benefits, adoption assistance, paid time off program with paid holidays plus 16 paid volunteer hours, and various wellness programs. Additionally, our career path planning and continuing education assists employees with their professional goals.
For more details on our outstanding benefits, visit our benefits page on USAAjobs.com.
Applications for this position are accepted on an ongoing basis; this posting will remain open until the position is filled. Thus, interested candidates are encouraged to apply the same day they view this posting.
USAA is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Data Scientist
Data scientist job in Orlando, FL
We are passionate people with an expert understanding of the digital consumer, data sciences, global telecom business, and emerging financial services. And we believe that we can make the world a better place.
Job Description
We are looking for a candidate building a career in Data Science, with experience applying advanced statistics, data mining, and machine learning algorithms to make data-driven predictions using programming languages like Python (including NumPy, Pandas, scikit-learn, Matplotlib, and Seaborn) and SQL (PostgreSQL). Experience with Elasticsearch, information/document retrieval, and natural language processing is a plus. Experience with various machine learning methods (classification, clustering, natural language processing, ensemble methods, outlier analysis) and the parameters that affect their performance also helps. You will leverage your strong collaboration skills and ability to extract valuable insights from highly complex data sets to ask the right questions and find the right answers.
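As a self-contained illustration of the kind of classification workflow described above (toy synthetic data, not this company's pipeline; random forests stand in for the ensemble methods mentioned):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Toy stand-in for real features; the workflow, not the data, is the point.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

# An ensemble method whose parameters (n_estimators, max_depth)
# materially affect performance, as noted above.
clf = RandomForestClassifier(n_estimators=300, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```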
Qualifications
· Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.)
· At least 2 years of experience in quantitative analytics or data modeling
· Some understanding of predictive modeling, machine learning, clustering and classification techniques, and algorithms
· Fluency in Python and SQL; JavaScript/HTML/CSS/web development is nice to have.
· Familiarity with data science frameworks and visualization tools (Pandas, Matplotlib, Altair, Jupyter Notebooks)
Additional Information
Responsibilities
· Analyze raw data: assessing quality, cleansing, structuring for downstream processing
· Design accurate and scalable prediction algorithms
· Collaborate with engineering team to bring analytical prototypes to production
· Generate actionable insights for business improvements
All your information will be kept confidential according to EEO guidelines.