Shape the Future of Predictive Medicine. One-Year Project. Innovators Wanted!
Are you driven by curiosity, energized by ambiguity, and passionate about transforming healthcare? Dr. Ciara Freeman at the esteemed Moffitt Cancer Center is searching for a bold, entrepreneurial-minded Applied Data Scientist - Regulatory and Statistical for a dynamic, one-year project to help build the first regulatory-grade AI models that predict therapy response in multiple myeloma.
You'll partner with a physician-scientist PI and data-engineering team to prototype, validate, and document predictive models designed for clinical use. This is hands-on translational ML - fast iteration, real impact, auditable results. Your models will form the core of clinically actionable, auditable AI systems that change how we treat cancer.
Ideal Candidate:
Expert in Python (scikit-learn, XGBoost, PyTorch/TensorFlow).
Skilled in survival or clinical modeling, thriving where rigor meets speed.
Startup thinkers with a thirst for discovery
Individuals who thrive in fast-paced, risk-friendly environments
Problem solvers who see challenges as opportunities
Team players eager to see their ideas put into action
Responsibilities:
Develop and validate multimodal survival and risk-stratification models (clinical + omics + imaging).
Collaborate with engineers to define and extract features.
Perform calibration, bias analysis, and explainability (SHAP, PDPs, model cards); a minimal sketch follows this list.
Translate results into clinician-friendly insights and contribute to IP and regulatory filings.
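For orientation, here is a minimal sketch of the calibration and SHAP explainability workflow named above, using a gradient-boosted classifier on synthetic stand-in data (the features, response label, and model settings are illustrative assumptions, not the lab's actual pipeline):

import numpy as np
import shap
import xgboost as xgb
from sklearn.calibration import calibration_curve
from sklearn.model_selection import train_test_split

# Synthetic stand-in for curated clinical + omics features.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)  # hypothetical therapy response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = xgb.XGBClassifier(n_estimators=200, max_depth=3)
model.fit(X_tr, y_tr)

# Calibration: predicted probabilities should track observed response rates.
prob = model.predict_proba(X_te)[:, 1]
frac_pos, mean_pred = calibration_curve(y_te, prob, n_bins=10)
print(np.c_[mean_pred, frac_pos])  # near-diagonal pairs indicate good calibration

# Explainability: per-feature SHAP attributions for audit and model-card reporting.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
print(np.abs(shap_values).mean(axis=0))  # global feature importance

In a regulatory-grade setting, the same artifacts (calibration curves, SHAP summaries) would feed the model cards and audit trail the posting describes.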
Credentials & Qualifications:
Master's degree in Computer Science, Data Science, Biostatistics, or a related quantitative field with seven (7) years of applied statistical or machine learning model development experience in healthcare, biotech, or regulated environments.
Or a PhD with five (5) years of applied statistical or machine learning model development experience in healthcare, biotech, or regulated environments.
Familiarity with Snowflake or modern data-engineering workflows preferred.
Join a project that's not just about data - it's about revolutionizing patient care. Help us bridge the gap between today's personalized medicine and tomorrow's predictive breakthroughs.
If you're ready to take risks, drive results, and change the future of medicine, apply today!
Moffitt Cancer Center proudly stands as a Comprehensive Cancer Center designated by the National Cancer Institute (NCI) in the vibrant city of Tampa, Florida. This dynamic city is an exceptional choice for those seeking exciting opportunities in a rapidly growing metropolitan area. With its flourishing economy and rich cultural diversity, the Tampa Bay region masterfully combines urban elegance with breathtaking natural beauty. Discover why countless individuals have chosen to make Tampa their home and experience firsthand what makes it one of the fastest-growing metropolitan cities in the United States.
$64k-86k yearly est. 3d ago
Senior Data Engineer
Toorak Capital Partners
Data engineer job in Tampa, FL
Company:
Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business-purpose residential, multifamily and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.
Summary:
The Lead Data Engineer will develop and implement high-performance, scalable data solutions in support of Toorak's Data Strategy.
Lead data architecture for Toorak Capital.
Lead efforts to create an API framework for using data across customer-facing and back-office applications.
Establish consistent data standards, reference architectures, patterns, and practices across the organization for OLTP and OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies.
Lead the sourcing and synthesis of data standardization and semantics discovery efforts, turning insights into actionable strategies that define the team's priorities and rally stakeholders to the vision.
Lead data integration and mapping efforts to harmonize data.
Champion standards, guidelines, and direction for ontology, data modeling, semantics, and data standardization in general at Toorak.
Lead strategies and design solutions for a wide variety of use cases, such as data migration (end-to-end ETL), database optimization, and data architecture for analytics projects.
Required Skills:
Designing and maintaining the data models, including conceptual, logical, and physical data models
5+ years of experience with NoSQL systems (MongoDB, DynamoDB), relational SQL databases (PostgreSQL), and Athena
5+ years of experience on Data Pipeline development, ETL and processing of structured and unstructured data
5+ years of experience in large-scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar (a minimal streaming sketch follows this list)
Proficiency in using data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms
Experience with BigQuery and SQLMesh (or a similar SQL-based cloud platform).
Knowledge of cloud platforms and technologies such as Google Cloud Platform, Amazon Web Services.
Strong SQL skills.
Experience with API development and frameworks.
Knowledge in designing solutions with Data Quality, Data Lineage, and Data Catalogs
Strong background in Data Science, Machine Learning, NLP, Text processing of large data sets
Experience with one or more of the following is nice to have: Dataiku, DataRobot, Databricks, UiPath.
Using version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation
Ability to rapidly comprehend changes to key business processes and the impact on overall Data framework.
Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change.
Advanced analytical skills.
High level of organization and attention to detail.
Self-starter attitude with the ability to work independently.
Knowledge of legal, compliance, and regulatory issues impacting data.
Experience in finance preferred.
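As a point of reference for the stream-processing requirement above, here is a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and maintains a running count per key (broker address, topic name, and the console sink are placeholders, and the spark-sql-kafka connector package must be on the classpath):

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("loan-events-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder brokers
    .option("subscribe", "loan-events")                   # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka rows expose key/value as binary; cast to strings before aggregating.
counts = (
    events.select(col("key").cast("string"), col("value").cast("string"))
    .groupBy("key")
    .count()
)

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()

The same pattern ports to Flink or Pulsar; only the source connector and sink change.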
$72k-99k yearly est. 1d ago
Data Scientist (Exploitation Specialist Level-3) - Tampa, FL
Masego
Data engineer job in Tampa, FL
Masego is an award-winning small business that specializes in GEOINT services. As a Service-Disabled Veteran-Owned Small Business (SDVOSB), we recognize and reward your hard work.
Description
We are looking for a Level-3 TS/SCI-cleared Data Scientist to join our team. This role provides automation/collection support to the main team at NGA Washington, so it relies on good communication skills and a baseline knowledge of GEOINT collection and/or automation systems like JEMA.
Minimum Required Qualifications:
At least 5 years of related GEOINT work experience, or 2 years with a relevant Bachelor's degree.
Able to work on client site 40 hours a week (very limited option for telework)
Proficient with Python
Experience with JEMA
Preferred Qualifications:
Experience with multiple intelligence types (SIGINT, OSINT, ELINT, GEOINT, MASINT, HUMINT)
Experience with Brewlytics, ArcPro and/or other geospatial data analysis tools
Knowledge of GEOINT collection and associated NGA/NRO systems
Proficiency with common programming languages including R, SQL, HTML, and JavaScript
Experience analyzing geospatially enabled data
Ability to learn new technologies and adapt to dynamic mission needs
Ability to work collaboratively with a remote team (main gov team is based out of NGA Washington)
Experience providing embedded data science/automation support to analytic teams
Security Clearance Requirement:
Active TS/SCI, with a willingness to take a polygraph test.
Salary: $128,600, based on ability to meet or exceed stated requirements
About Masego
Masego Inc. provides expert Geospatial Intelligence Solutions in addition to Activity Based Intelligence (ABI) and GEOINT instructional services. Masego provides expert-level Geospatial Collection Management; Full Motion Video; Human Geography; Information Technology and Cyber; Technical Writing; and ABI, Agile, and other professional training.
Masego is a Service-Disabled Veteran-Owned Small Business headquartered in Fredericksburg, Virginia. With high-level expertise and decades of experience, coupled with proven project management systems and top-notch client support, Masego enhances the performance capabilities of the Department of Defense and the intelligence community.
Pay and Benefits
We seek to provide and take care of our team members. We currently offer Medical, Dental, Vision, 401k, Generous PTO, and more!
Diversity
Masego, Inc. is an equal opportunity/equal access/affirmative action employer fully committed to achieving a diverse workforce and complies with all applicable Federal and Virginia State laws, regulations, and executive orders regarding nondiscrimination and affirmative action in its programs and activities. Masego, Inc. does not discriminate on the basis of race, color, religion, ethnic or national origin, gender, genetic information, age, disability, sexual orientation, gender identity, gender expression, and veteran's status.
$128.6k yearly Auto-Apply 60d+ ago
Lead Data Engineer
The Walt Disney Company 4.6
Data engineer job in Lake Buena Vista, FL
Disney Decision Science + Integration (DDSI) is a consulting team that supports clients across The Walt Disney Company, including Disney Experiences (Parks & Resorts worldwide, Cruise Line, Consumer Products, etc.), Disney Entertainment (ABC, The Walt Disney Studios, Disney Theatrical, Disney Streaming Services, etc.), ESPN, and Corporate Finance. Key partners to the DDSI organization include Marketing, Finance, Business Development, Research, and Operations. We develop, analyze, and execute strategies that improve the value proposition for our Guests, Cast Members, and Shareholders. The team leverages technology, data analytics, optimization, and statistical and econometric modeling to explore opportunities, shape business decisions and drive business value.
What You Will Do
You will be responsible for planning and leading research and development related to advanced analytic data solutions, leveraging GenAI. You will work with business and technology leaders to understand scope and requirements, business needs, and data from across the Disney company in order to design and deliver the data pipelines necessary for our solutions. You will partner with the Decision Science Products, Decision Science, and client teams on critical projects. Other activities include planning, estimating, design, development, testing, production rollout and sustainment activities. You will also need to consult and collaborate with project team members, lead design reviews, do hands-on development, and communicate with colleagues and leaders.
Required Qualifications & Skills
7+ years overall experience in a data engineering development capacity using multiple environments (Dev, QA, Prod, etc.) and DevOps procedures for code deployment/promotion
Experience with a variety of GenAI models, tools, and concepts
3+ years using, designing and building relational databases (preferably Snowflake or PostgreSQL)
3+ years of experience leading and deploying code using a source control product such as GitLab/GitHub
2+ years of experience with job scheduling software like Apache Airflow, Amazon MWAA, GitLab Runners or UC4
Multiple years of experience with ELT/ETL data pipeline development and maintenance
Multiple years of demonstrated experience and expertise using SQL and Python
Experience using containerization technologies such as Docker or Kubernetes
Knowledgeable on cloud architecture and product offerings, preferably AWS
Understanding of Knowledge Graphs, Data Mesh, and other data sharing platforms
Experience translating project scope and high-level requirements into technical data engineering tasks
Experience defining solutions to sophisticated data engineering problems in support of advanced analytic processes
Experience collaborating with multiple project teams in a fast-paced environment
Experience defining and estimating level of effort for data engineering activities
Experience with project and sprint planning
Ability to communicate technical concepts and solutions to non-technical team members
Experience designing and building data structures to support requirements
Preferred Qualifications
Experience leading development of GenAI based systems including model selection, pipeline orchestration and deployment strategies
Experience defining GenAI architectures
Knowledgeable with Disney Parks attendance, reservations and/or products
Experience with cloud-based technologies, preferably AWS EMR, EC2, and S3
Experience with advanced Snowflake offerings such as Snowpark, Data Exchange, Data Marketplace and Snowpipe
Education
Bachelor's degree in Computer Science, Information Systems, Software, Electrical or Electronics Engineering, or a comparable field of study, and/or equivalent work experience
Master's degree preferred in Computer Science, Information Systems, Software, Electrical or Electronics Engineering, or a comparable field of study, and/or equivalent work experience
#DisneyTech
#DisneyAnalytics
**********************
Job Posting Segment:
Corporate Strategy
Job Posting Primary Business:
Decision Science & Integration
Primary Job Posting Category:
Data Engineering
Employment Type:
Full time
Primary City, State, Region, Postal Code:
Lake Buena Vista, FL, USA
Alternate City, State, Region, Postal Code:
Date Posted:
2025-09-30
$101k-146k yearly est. Auto-Apply 60d+ ago
ETL Architect
Healthplan Services 4.7
Data engineer job in Tampa, FL
HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries.
Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.
Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning and best practices
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling
Preferred Qualifications:
12+ years of work experience as a Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture, data modeling for BI and Analytics data marts, and implementing and supporting production environments
10+ years of experience designing, building and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty and integrity
Conscientious of the Enterprise Data Warehouse release management process; conduct operations-readiness and environment-compatibility reviews of any changes prior to deployment, with strong sensitivity around impact and SLA
Experience with data modeling tools a plus.
Expertise in data warehousing methodologies and best practices required.
Ability to initiate and follow through on complex projects of both short and long term duration required.
Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed.
Proactively recommend improvements to the performance and operability of the data warehouse and reporting environment.
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements
Education Requirements:
Bachelor's or equivalent degree in a business, technical, or related field
Additional Information
All your information will be kept confidential according to EEO guidelines.
$84k-105k yearly est. 60d+ ago
Principal Data Scientist
Maximus 4.3
Data engineer job in Tampa, FL
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work.)
This position requires occasional travel to the DC area for client meetings.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments, and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process - including accessing job postings, completing assessments, or participating in interviews - please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
$64k-92k yearly est. Easy Apply 9d ago
Data Scientist
Redhorse Corporation
Data engineer job in Tampa, FL
About the Organization
Now is a great time to join Redhorse Corporation. We are a solution-driven company delivering data insights and technology solutions to customers with missions critical to U.S. national interests. We're looking for thoughtful, skilled professionals who thrive as trusted partners building technology-agnostic solutions and want to apply their talents supporting customers with difficult and important mission sets.
About the Role
Redhorse Corporation is seeking a highly skilled Data Scientist to join our team supporting the United States Central Command (USCENTCOM) Directorate of Logistics (CCJ4). You will play a critical role in accelerating the delivery of AI-enabled capabilities within the Joint Logistics Common Operating Picture (JLOGCOP), directly impacting USCENTCOM's ability to promote international cooperation, respond to crises, deter aggression, and build resilient logistics capabilities for our partners. This is a high-impact role contributing to national security and global stability. You will be working on a custom build of AI/ML capabilities into the JLOGCOP, leveraging dozens of data feeds to enhance decision-making and accelerate planning for USCENTCOM missions.
Key Responsibilities
Communicate with the client regularly regarding enterprise values and project direction.
Find the intersection between business value and achievable technical work.
Articulate and translate business questions into technical solutions using available DoD data.
Explore datasets to find meaningful entities and relationships.
Create data ingestion and cleaning pipelines.
Develop applications and effective visualizations to communicate insights.
Serve as an ambassador for executive DoD leadership to sponsor data literacy growth across the enterprise.
Required Experience/Clearance
US citizen with a Secret US government clearance. Applicants who are not US Citizens and who do not have a current and active Secret security clearance will not be considered for this role.
Ability to work independently to recommend solutions to the client and as part of a team to accomplish tasks.
Experience with functional programming (Python, R, Scala) and database languages (SQL).
Familiarity using AI/ML tools to support logistics use cases.
Ability to discern which statistical approaches are appropriate for different contexts.
Experience communicating key findings with visualizations.
8+ years of professional experience.
Master's degree in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Desired Experience
Experience with cloud-based development platforms.
Experience with large-scale data processing tools.
Experience with data visualization tools.
Ph.D. in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Equal Opportunity Employer/Veterans/Disabled
Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this site as a result of your disability. You can request reasonable accommodations by contacting Talent Acquisition at ***********************************.
Redhorse Corporation shall, in its discretion, modify or adjust the position to meet Redhorse's changing needs. This job description is not a contract and may be adjusted as deemed appropriate in Redhorse's sole discretion.
Expert Exploitation Specialist/Data Scientist
Culmen International
Data engineer job in Tampa, FL
About the Role
Culmen International is hiring Expert Exploitation Specialist/Data Scientists to provide support on-site at the National Geospatial-Intelligence Agency (NGA) in Tampa, FL.
The National Geospatial-Intelligence Agency (NGA) expects to deliver AOS Metadata Cataloging and Management Services to enhance product and asset management of content, enabling rapid creation of discoverable, modular, web-enabled, and visually enriched Geospatial Intelligence (GEOINT) products for intelligence producers in NGA and across the National System for Geospatial-Intelligence (NSG).
TALENT PIPELINE - Qualified applicants will be contacted as soon as funding for this position is secured.
What You'll Do in Your New Role
The Data Scientist will coordinate with our clients to understand questions and issues involving the client's datasets, then determine the best method and approach to create data-driven solutions within program guidelines. This position will be relied upon as a Subject Matter Expert (SME) and will be expected to lead or assist in developing automated processes, architecting data science solutions and automated workflows, conducting analysis with available tools, remaining adaptable to mission requirements, and identifying patterns to help solve some of the complex problems that face the DoD and Intelligence Community (IC).
Work with large structured/unstructured data in a modeling and analytical environment to define and create streamlined processes for evaluating unique datasets and solving challenging intelligence issues
Lead and participate in the design of solutions and refinement of pre-existing processes
Work with Customer Stakeholders, Program Managers, and Product Owners to translate road map features into components/tasks, estimate timelines, identify resources, suggest solutions, and recognize possible risks
Use exploratory data analysis techniques to identify meaningful relationships, patterns, or trends from complex data
Combine applied mathematics, programming skills, analytical techniques, and data to provide impactful insights for decision makers
Research and implement optimization models, strategies, and methods to inform data management activities and analysis
Apply big data analytic tools to large, diverse sets of data to deliver impactful insights and assessments
Conduct peer reviews to improve quality of workflows, procedures, and methodologies
Help build high-performing teams; mentor team members providing development opportunities to increase their technical skills and knowledge
Required Qualifications
TS/SCI Clearance w/CI Poly Eligible
Minimum of 18 years combined experience (A combination of years of experience & professional certifications/trainings can be used in lieu of a degree)
BS in a related field with graduate-level work
Expert proficiency in Python and other programming languages applicable to automation development.
Demonstrated experience designing and implementing workflow automation systems
Advanced experience with ETL (Extract, Transform, Load) processes for geospatial data
Expertise in integrating disparate systems through API development and implementation
Experience developing and deploying enterprise-scale automation solutions
Knowledge of NGA's Foundation GEOINT products, data types, and delivery methods
Demonstrated experience with database design, implementation, and optimization
Experience with digital media generation systems and automated content delivery platforms
Ability to analyze existing workflows and develop technical solutions to streamline processes
Knowledge of DLA systems and interfaces, particularly MEBS and WebFLIS
Expertise in data quality assurance and validation methodologies
Experience with geospatial data processing, transformation, and delivery automation
Proficiency with ArcGIS tools, GEODEC and ACCORD software systems
Understanding of cartographic principles and standards for CADRG/ECRG products
Strong analytical skills for identifying workflow inefficiencies and implementing solutions
Experience writing technical documentation including SOPs, CONOPS, and system design
Desired Qualifications
Certification(s) in relevant automation technologies or programming languages
Experience with DevOps practices and CI/CD implementation
Knowledge of cloud-based automation solutions and their implementation in government environments
Experience with machine learning applications for GEOINT Workflow optimization
Expertise in data analytics and visualization for workflow performance metrics
Understanding of NGA's enterprise architecture and integration points
Experience implementing RPA (Robotic Process Automation) solutions
Knowledge of secure coding practices and cybersecurity principles
Demonstrated expertise in digital transformation initiatives
Experience mentoring junior staff in automation techniques and best practices
Background in agile development methodologies
Understanding of human-centered design principles for workflow optimization
About the Company
Culmen International is committed to enhancing international safety and security, strengthening homeland defense, advancing humanitarian missions, and optimizing government operations. With experience in over 150 countries, Culmen supports our clients to accomplish critical missions in challenging environments.
Exceptional Medical/Dental/Vision Insurance; premiums for employees are 100% paid by Culmen, and dependent coverage is available at a nominal rate (including same or opposite sex domestic partners)
401k - Vested immediately and 4% match
Life insurance and disability paid by the company
Supplemental Insurance Available
Opportunities for Training and Continuing Education
12 Paid Holidays
To learn more about Culmen International, please visit **************
At Culmen International, we are committed to creating and sustaining a workplace that upholds the principles of Equal Employment Opportunity (EEO). We believe in the importance of fair treatment and equal access to opportunities for all employees and applicants. Our commitment to these principles is unwavering across all our operations worldwide.
$62k-90k yearly est. Auto-Apply 8d ago
Data Platform Engineer- 6 Month Assignment
Maxhealth
Data engineer job in Tampa, FL
MaxHealth is seeking a highly skilled Data Platform Engineer (DBA) to join our data engineering team to modernize and optimize our data pipelines. The Data Platform Engineer (DBA) will be responsible for the management, optimization, modernization, and governance of our enterprise data warehouse. This role requires strong technical expertise in Azure Data Factory (ADF), Fabric, or other similar technologies, SQL Server Integration Services (SSIS), SQL Server, and modern BI platforms (Domo and Power BI).
Location: Hybrid/Tampa
1099 Contractor - Starting salary $110,000 per annum, based on experience
This is a 6-month assignment with the possibility of extension into a permanent position.
Key Responsibilities
Modernization & Optimization
· Partner with IT leadership and data engineering teams to design and implement a modernized, scalable, and cloud-ready data platform.
· Design, develop, and maintain modernized ETL pipelines using Azure Data Factory, Fabric, Databricks and/or SSIS to integrate multiple data sources into the enterprise data warehouse.
· Recommend and implement improvements to ETL processes, storage, indexing strategies, and query performance.
· Explore and adopt automation tools, monitoring systems, and DevOps practices for database operations.
· Support migration strategies from legacy systems to cloud or hybrid architectures.
· Optimize and refine existing ETL processes to improve performance, scalability, and maintainability.
Database Administration & Operations
· Oversee the daily management of the SQL data warehouse, ensuring high availability, security, and optimal performance.
· Implement and maintain backup, recovery, and disaster recovery strategies.
· Monitor database performance and tune queries, indexes, and structures to maximize efficiency.
· Manage user access, roles, and permissions to uphold least-privilege security standards.
· Maintain documentation for database structures, configurations, and procedures.
· Design, build, and maintain ETL/ELT processes across SQL Agent jobs, stored procedures, and SSIS packages, and optimize existing ETL processes for performance and reliability.
Governance & Controls
· Implement data quality checks, error handling, and monitoring to ensure accurate and consistent data flows.
· Adopt DevOps/DataOps practices for automation, monitoring, and repeatable deployments.
· Establish and enforce data management policies, including auditing, logging, and compliance monitoring.
· Ensure adherence to HIPAA, HITRUST, and other healthcare data security requirements.
· Implement robust change management processes for database updates, schema changes, and production deployments.
· Utilize a code repository (Git or similar) to manage, version, and document ETL code.
· Work closely with Finance, Clinical, and Analytics teams to ensure data pipelines and reporting processes are accurate, consistent, and reliable.
Collaboration & Support
· Partner with business intelligence (BI) and analytics teams to ensure data is structured and accessible for actionable reporting.
· Work with application developers to optimize integration with downstream applications and operational systems.
· Provide advanced support for ETL & database-related incidents, outages, and service requests.
· Troubleshoot issues, perform root cause analysis, and implement preventive measures.
Job Qualifications
· Bachelor's degree in Computer Science, Information Systems, or a related field; or equivalent experience.
· 5+ years of experience as a DBA in a large-scale SQL environment
· 5+ years of experience in data engineering or data platform roles with expertise in ETL/ELT pipeline development and orchestration.
· Proven experience managing workloads in cloud-native data management platforms, preferably in Azure (Azure Synapse, Microsoft Fabric, Databricks, Data Factory, or market equivalents such as Snowflake, BigQuery, Redshift, dbt, or Airflow); a minimal orchestration sketch follows this list.
· Strong knowledge of T-SQL, stored procedures, indexing strategies, and performance tuning.
· Experience with ETL processes, data warehouse management, and BI/reporting environments.
· Proven track record of implementing controls, monitoring systems, and compliance frameworks.
· Deep understanding of data security, governance, and regulatory compliance (HIPAA, HITRUST, SOC2, etc.).
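To make the orchestration requirement concrete, here is a minimal sketch of a daily extract-transform-load DAG in recent Apache Airflow 2.x (2.4+); the DAG id, task names, and stub bodies are hypothetical, and the posting's primary stack is Azure Data Factory/Fabric/SSIS, for which Airflow is only one of the named equivalents:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from source systems")        # stub

def transform():
    print("apply quality checks and conforming")  # stub

def load():
    print("merge into the warehouse")             # stub

# Parsed by the Airflow scheduler from the DAGs folder; not run directly.
with DAG(
    dag_id="nightly_warehouse_etl",               # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load            # linear dependency chain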
Preferred Skills
· Experience with cloud database platforms (preferred Azure SQL, Snowflake, etc.).
· Exposure to data warehouse modernization initiatives and architecture redesigns.
· Familiarity with data cataloging and governance platforms and practices (e.g., Purview, Collibra, or Alation).
· Knowledge of automation tools (Ansible, Terraform, Liquibase, etc.) and DevOps practices.
· Familiarity with healthcare data standards (HL7, FHIR, claims/EMR data).
· Professional certifications such as Microsoft Certified: Azure Database Administrator Associate, AWS Certified Database - Specialty, or equivalent.
ABOUT MAXHEALTH
MaxHealth is dedicated to simplifying healthcare and ensuring healthier futures. Founded in 2015, MaxHealth is a leading primary care platform focused on providing high-quality, integrated care to adults and senior patients throughout Florida. We provide care for more than 120,000 patients, most of whom are beneficiaries of government-sponsored healthcare programs like Medicare, or of health plans purchased on the Affordable Care Act exchange marketplace. MaxHealth is a rapidly growing medical practice with more than 50 clinics spread across central and southern Florida. MaxHealth also partners with independent providers who are like-minded and utilizes its platform to help them provide high-quality care. We are customer-centered, compassionate, results-driven, proactive, collaborative, and adaptable in executing our vision to help patients live their best lives. Our mission is to deliver quality care, a simplified experience, and happiness. One patient at a time.
#IND123
$110k yearly 60d+ ago
Senior Data Engineer
Slide Insurance
Data engineer job in Tampa, FL
Slide Insurance - Fun. Innovation Driven. Fueled by Passion, Purpose and Technology.
At Slide, you will not only be part of a successful team, but you will also be a part of our Slide Vibe/award winning culture where collaboration and innovation are expected, recognized and awarded!
What you will be doing:
Design, develop, and optimize robust ETL/ELT pipelines for ingesting data from internal systems (policy, claims, billing) and external sources when applicable.
Evaluate and integrate emerging technologies and tools to improve data engineering capabilities and performance.
Design and manage modern data platforms (e.g., cloud-based data lakes or warehouses) to support real-time and batch processing use cases.
Ensure data quality, integrity, and lineage by implementing rigorous validation and monitoring frameworks.
Collaborate with data scientists, analysts, and business stakeholders to translate insurance-specific needs into scalable data solutions.
Support data governance efforts by developing and maintaining documentation, metadata, and data dictionaries.
Monitor individual and team performance, providing ongoing feedback and coaching to continuously improve results.
Mentor junior engineers and contribute to setting data engineering best practices across the organization.
Serve as the primary liaison between business and IT to ensure quality data and alignment.
Perform other duties as assigned.
What you have:
Education, Experience, and Licensing:
Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
Minimum 6 years' experience in data engineering required.
2+ years of experience with Databricks and related technologies (Azure Data Factory, MS SQL Server, etc.).
2+ years' experience in the financial services industry preferred.
Certificate in SQL BI/EDW preferred.
Qualifications/Skills and Competencies:
Advanced proficiency in Databricks, SQL, Python, and distributed data processing (e.g., Spark, Kafka).
Strong ability in designing data models and working with cloud platforms (e.g., AWS, Azure, GCP).
Strong understanding of insurance data domains: policy, claims, billing, reinsurance, and regulatory reporting.
Familiarity with data governance, privacy, and compliance standards (e.g., NAIC, GDPR).
Effective research, problem solving, analytical, critical thinking, influencing, and relationship management skills necessary.
Ability to excel in a fast-paced environment.
Excellent interpersonal skills with the ability to professionally interact with team members across departments.
Strong written and verbal communication skills with the ability to professionally interact with team members in other departments.
Exceptional time management skills with ability to prioritize tasks and allocate resources efficiently.
Proven ability to be adaptable and flexible; able to adjust to new requirements or unforeseen issues.
Proficient in MSO/365 applications such as Microsoft Teams, SharePoint, Word, Excel, PowerPoint, and Outlook.
Desire to live Slide's Core Values.
What Slide offers to you:
The Slide Vibe - An opportunity to be a part of a fun and innovation-driven culture fueled by Passion, Purpose and Technology! Slide offers many opportunities to collaborate and innovate across the company and departments, as well as get to know other Sliders. From coffee chats, to clubs, to social events - we plan it, so all Sliders feel included and Enjoy their Journey.
Benefits - Created using Slider feedback, Slide offers a comprehensive and affordable benefits package to cover all aspects of health: Physical, Emotional, Financial, Social and Professional. A Lifestyle Spending Account is set up for each Slider and Slide contributes to it monthly for use on any benefit that individually suits you - Health Your Way!
2023, 2024 & 2025 BEST PLACE TO WORK - Tampa Bay Business Journal
2024 & 2025 TOP WORKPLACE - Tampa Bay Times (Local) & 2024 TOP WORKPLACE - USA Today (National)
$72k-99k yearly est. Auto-Apply 60d+ ago
Data Engineer
Contact Government Services, LLC
Data engineer job in Tampa, FL
Data Engineer
Employment Type: Full-Time, Mid-level
Department: Business Intelligence
CGS is seeking a passionate and driven Data Engineer to support a rapidly growing Data Analytics and Business Intelligence platform focused on providing solutions that empower our federal customers with the tools and capabilities needed to turn data into actionable insights. The ideal candidate is a critical thinker and perpetual learner; excited to gain exposure and build skillsets across a range of technologies while solving some of our clients' toughest challenges.
CGS brings motivated, highly skilled, and creative people together to solve the government's most dynamic problems with cutting-edge technology. To carry out our mission, we are seeking candidates who are excited to contribute to government innovation, appreciate collaboration, and can anticipate the needs of others. Here at CGS, we offer an environment in which our employees feel supported, and we encourage professional growth through various learning opportunities.
Skills and attributes for success:
- Complete development efforts across the data pipeline to store, manage, and provision data to data consumers.
- Be an active and collaborating member of an Agile/Scrum team, following all Agile/Scrum best practices.
- Write code to ensure the performance and reliability of data extraction and processing.
- Support continuous process automation for data ingest.
- Achieve technical excellence by advocating for and adhering to lean-agile engineering principles and practices such as API-first design, simple design, continuous integration, version control, and automated testing.
- Work with program management and engineers to implement and document complex and evolving requirements.
- Help cultivate an environment that promotes customer service excellence, innovation, collaboration, and teamwork.
- Collaborate with others as part of a cross-functional team that includes user experience researchers and designers, product managers, engineers, and other functional specialists.
Qualifications:
- Must be a US Citizen.
- Must be able to obtain a Public Trust Clearance.
- 7+ years of IT experience including experience in design, management, and solutioning of large, complex data sets and models.
- Experience developing data pipelines from many sources, from structured and unstructured data sets, in a variety of formats.
- Proficiency in developing ETL processes and performing test and validation steps (a small validation sketch follows this list).
- Proficiency in manipulating data (Python, R, SQL, SAS).
- Strong knowledge of big data analysis and storage tools and technologies.
- Strong understanding of agile principles and the ability to apply them.
- Strong understanding of CI/CD pipelines and the ability to apply them.
- Experience with relational databases, such as PostgreSQL.
- Work comfortably in version control systems, such as Git repositories.
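As a small illustration of the test-and-validation step referenced above, here is a hedged pandas sketch that enforces a few row-level data-quality checks before data is provisioned downstream (the column names and thresholds are hypothetical):

import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures (empty list = pass)."""
    failures = []
    if df["record_id"].duplicated().any():
        failures.append("duplicate record_id values")
    if df["amount"].lt(0).any():
        failures.append("negative amounts present")
    null_rate = df["agency_code"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing codes
        failures.append(f"agency_code null rate {null_rate:.1%} exceeds 1%")
    return failures

df = pd.DataFrame({
    "record_id": [1, 2, 2],
    "amount": [10.0, -5.0, 7.5],
    "agency_code": ["A", None, "B"],
})
print(validate(df))  # all three checks fire on this toy frame

Checks like these typically run as a pipeline stage, failing the load rather than silently propagating bad rows.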
Ideally, you will also have:
- Experience creating and consuming APIs.
- Experience with DHS and knowledge of DHS standards a plus.
- Candidates will be given special consideration for extensive experience with Python.
- Ability to develop visualizations utilizing Tableau or PowerBI.
- Experience developing shell scripts on Linux.
- Demonstrated experience translating business and technical requirements into comprehensive data strategies and analytic solutions.
- Demonstrated ability to communicate across all levels of the organization and communicate technical terms to non-technical audiences.
Our Commitment:
Contact Government Services (CGS) strives to simplify and enhance government bureaucracy through the optimization of human, technical, and financial resources. We combine cutting-edge technology with world-class personnel to deliver customized solutions that fit our client's specific needs. We are committed to solving the most challenging and dynamic problems.
For the past seven years, we've been growing our government-contracting portfolio, and along the way, we've created valuable partnerships by demonstrating a commitment to honesty, professionalism, and quality work.
Here at CGS we value honesty through hard work and self-awareness, professionalism in all we do, and delivering the best quality to our consumers, maintaining those relationships for years to come.
We care about our employees. Therefore, we offer a comprehensive benefits package:
- Health, Dental, and Vision
- Life Insurance
- 401k
- Flexible Spending Account (Health, Dependent Care, and Commuter)
- Paid Time Off and Observance of State/Federal Holidays
Contact Government Services, LLC is an Equal Opportunity Employer. Applicants will be considered without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Join our team and become part of government innovation!
Explore additional job opportunities with CGS on our Job Board: *************************************
For more information about CGS please visit: ************************** or contact: Email: *******************
#CJ
$72k-99k yearly est. Auto-Apply 60d+ ago
Data Engineer
Centurion Consulting Group
Data engineer job in Tampa, FL
Job Description
Centurion is looking for a Data Engineer with an active TS/SCI to work on site in Tampa, FL. As a Data Engineer you will play a critical role in supporting data scientists and analysts by designing, building, and optimizing data pipelines tailored for large-scale text analytics and AI applications. You will collaborate closely with business stakeholders, IT experts, and subject-matter experts to deliver robust data engineering solutions that enable advanced natural language processing (NLP), generative AI, and agentic AI capabilities.
Responsibilities include:
Develop and design data pipelines to support an end-to-end solution.
Develop and maintain artifacts (e.g., schemas, data dictionaries, and transforms) related to ETL processes.
Manage production data within multiple datasets ensuring fault tolerance and redundancy.
Design and develop robust and functional dataflows to support raw data and expected data.
Collaborate with the rest of the data engineering team to design and launch new features, including coordination and documentation of dataflows, capabilities, etc.
Design and develop databases to support multiple user groups with various levels of access to raw and processed data.
Apply knowledge of the SDLC to bring applications from proof of concept to production, including on the TS network.
The Team:
Our AI & Data offering provides a full spectrum of solutions for designing, developing, and operating cutting-edge Data and AI platforms, products, insights, and services. Our offerings help clients innovate, enhance and operate their data, AI, and analytics capabilities, ensuring they can mature and scale effectively.
Required Qualifications:
Bachelor's degree required
Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
Active TS/SCI security clearance required
4+ years of experience working with software platforms and services, such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
4+ years of experience with datastores such as MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, Redis, and graph databases such as Neo4j, Memgraph, or others.
Ability to work on-site in Tampa, FL 5 days per week
Preferred Qualifications:
Familiar with Linux/Unix server environments.
Familiar with common data structures needed to support common machine learning packages such as scikit-learn, NLTK, spaCy, and others.
Familiar with, or a desire to become familiar with, data structures needed to support generative AI pipelines such as vector databases, NER, and RAG (a minimal retrieval sketch follows this list).
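For orientation, here is a minimal sketch of the vector-retrieval pattern underlying RAG pipelines, using plain NumPy cosine similarity in place of a real embedding model and vector database (the toy corpus and hash-based "embedding" are fabrications for illustration only):

import numpy as np

docs = ["pipeline runbook", "schema dictionary", "kafka topology notes"]  # toy corpus

def embed(texts, dim=64):
    # Stand-in embedding: hash characters into a vector, then L2-normalize.
    out = np.zeros((len(texts), dim))
    for i, t in enumerate(texts):
        for j, ch in enumerate(t.encode()):
            out[i, (ch + j) % dim] += 1.0
    return out / np.linalg.norm(out, axis=1, keepdims=True)

index = embed(docs)  # (n_docs, dim) unit vectors

def retrieve(query, k=2):
    q = embed([query])[0]
    scores = index @ q               # cosine similarity on unit vectors
    top = np.argsort(scores)[::-1][:k]
    return [(docs[i], float(scores[i])) for i in top]

print(retrieve("where is the schema documented?"))

A production pipeline swaps in a learned embedding model and a vector database, but the index/score/top-k shape of the problem is the same.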
$72k-99k yearly est. 60d+ ago
Lead Data Engineer
Ascension Federal Services
Data engineer job in Tampa, FL
Lead Data Engineer
Location: Tampa, FL 33601
Clearance: TOP SECRET
Job Description:
The Lead Data Engineer will be responsible for designing, developing, and maintaining the company's data architecture. They will lead a team of data engineers and work closely with data scientists, analysts, and other stakeholders to ensure that data is accurate, reliable, and accessible. The Lead Data Engineer will also be responsible for implementing data security measures and ensuring compliance with data privacy regulations.
Responsibilities:
Design and develop the company's data architecture
Lead a team of data engineers
Collaborate with data scientists, analysts, and other stakeholders to ensure data accuracy and reliability
Implement data security measures and ensure compliance with data privacy regulations
Develop and maintain data pipelines
Optimize data storage and retrieval
Identify and resolve data quality issues
Stay up-to-date with emerging trends and technologies in data engineering
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
10+ years of experience in data engineering
Experience with data modeling, ETL, and data warehousing
Proficiency in SQL and at least one programming language (Python, Java, C#, etc.)
Experience with cloud-based data platforms (AWS, Azure, Google Cloud, etc.)
Strong leadership and communication skills
Ability to work independently and as part of a team
$72k-99k yearly est. 60d+ ago
Tech Lead, Data & Inference Engineer
Catalyst Labs
Data engineer job in Tampa, FL
Job Description
Our Client
A fast-moving, venture-backed advertising technology startup based in San Francisco. They have raised $12 million in funding and are transforming how business-to-business marketers reach their ideal customers. Their identity resolution technology blends business and consumer signals to convert static audience lists into high-match, cross-channel segments without the use of cookies. By transforming first-party and third-party data into precision-targetable audiences across platforms such as Meta, Google and YouTube, they enable marketing teams to reach higher match rates, reduce wasted advertising spend and accelerate pipeline growth. With a strong understanding of how business buyers behave in channels that have traditionally been focused on business-to-consumer activity, they are redefining how business brands scale demand generation and account-based efforts.
About Us
Catalyst Labs is a leading talent agency with a specialized vertical in Applied AI, Machine Learning, and Data Science. We stand out as an agency that's deeply embedded in our clients' recruitment operations.
We collaborate directly with Founders, CTOs, and Heads of AI who are driving the next wave of applied intelligence, from model optimization to productized AI workflows. We take pride in facilitating conversations that align with your technical expertise, creative problem-solving mindset, and long-term growth trajectory in the evolving world of intelligent systems.
Location: San Francisco
Work type: Full Time
Compensation: above market base + bonus + equity
Roles & Responsibilities
Lead the design, development and scaling of an end-to-end data platform, from ingestion to insights, ensuring that data is fast, reliable and ready for business use.
Build and maintain scalable batch and streaming pipelines, transforming diverse data sources and third-party APIs into trusted, low-latency systems.
Take full ownership of reliability, cost and service-level objectives. This includes achieving 99.9% uptime, maintaining minutes-level latency and optimizing cost per terabyte. Conduct root cause analysis and provide long-lasting solutions.
Operate inference pipelines that enhance and enrich data. This includes enrichment, scoring and quality assurance using large language models and retrieval-augmented generation. Manage version control, caching and evaluation loops.
Work across teams to deliver data as a product through clear data contracts, ownership models, lifecycle processes and usage-based decision making.
Guide architectural decisions across the data lake and the entire pipeline stack. Document lineage, trade-offs and reversibility while making practical build-versus-buy decisions.
Scale integration with APIs and internal services while ensuring data consistency, high data quality and support for both real-time and batch-oriented use cases.
Mentor engineers, review code and raise the overall technical standard across teams. Promote data driven best practices throughout the organization.
Qualifications
Bachelor's or Master's degree in Computer Science, Computer Engineering, Electrical Engineering, or Mathematics.
Excellent written and verbal communication; proactive and collaborative mindset.
Comfortable in hybrid or distributed environments with strong ownership and accountability.
A founder-level bias for action: able to identify bottlenecks, automate workflows, and iterate rapidly based on measurable outcomes.
Demonstrated ability to teach, mentor, and document technical decisions and schemas clearly.
Core Experience
6 to 12 years of experience building and scaling production-grade data systems, with deep expertise in data architecture, modeling, and pipeline design.
Expert SQL (query optimization on large datasets) and Python skills.
Hands-on experience with distributed data technologies (Spark, Flink, Kafka) and modern orchestration tools (Airflow, Dagster, Prefect).
Familiarity with dbt, DuckDB, and the modern data stack; experience with IaC, CI/CD, and observability.
Exposure to Kubernetes and cloud infrastructure (AWS, GCP, or Azure).
Bonus: Strong Node.js skills for faster onboarding and system integration.
Previous experience at a high-growth startup (10 to 200 people) or early-stage environment with a strong product mindset.
$72k-99k yearly est. 12d ago
Data Scientist
Tampa Bay Lightning 3.6
Data engineer job in Tampa, FL
In order to be considered for this role, after clicking "Apply Now" above and being redirected, you must fully complete the application process on the follow-up screen.
This position will work with hockey data to fulfill given directives, as well as create new projects and communicate results. This role is part of a team-based environment in the hockey analytics department and is expected to work at Benchmark International Arena. This is a unique opportunity to create impactful solutions in a growth-minded environment and culture. This full-time position reports to the Director of Hockey Analytics and the Associate Director of Hockey Analytics.
We ask that you submit a cover letter for this position. In your cover letter, please discuss prior work or a project you have done that you are most proud of. Please also discuss why you are interested in the position. Additionally, please provide any examples of your open-source projects or public work that you can share.
Essential Duties & Responsibilities:
Build and validate models for hockey metrics using multiple data sources
Create projects to advance our understanding of hockey
Collaborate with other data scientists, data engineers, etc.
Responsible for communicating the results of projects to coaches and management
May be expected to travel a few times a year for work
Game/Event Responsibilities:
Option to attend all home games
Qualifications:
Computer Science, Math, Engineering or Physics degree required
Minimum 3-5 years of experience writing modern Python code
Experience using Docker
Experience using SQL databases
Experience working with time series data
Ability to write geometric and physics-based analyses (see the sketch after this list)
Ability to problem solve and a desire for personal development
Experience working in a team-based environment
Familiarity with Git/VCS
Ability to collaborate with others and take ownership of projects
Ability to interpret results and communicate effectively
Ability to understand and implement research papers related to sports and data science
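As a concrete, purely illustrative example of the geometric analyses mentioned above, the Python sketch below computes shot distance and angle from rink coordinates. It assumes a common NHL convention of center ice at the origin with the goal line at x = 89 ft; the function names are hypothetical.

```python
import math

GOAL_X = 89.0  # ft from center ice to the goal line, a common NHL coordinate convention

def shot_distance(x: float, y: float) -> float:
    """Euclidean distance (ft) from the shot location to the center of the goal mouth."""
    return math.hypot(GOAL_X - x, y)

def shot_angle(x: float, y: float) -> float:
    """Absolute angle (degrees) off the line extending straight out from the goal."""
    return abs(math.degrees(math.atan2(y, GOAL_X - x)))

# Example: a shot from near the right faceoff dot (x = 69, y = -22)
print(round(shot_distance(69, -22), 1))  # ~29.7 ft
print(round(shot_angle(69, -22), 1))     # ~47.7 degrees
```

Features like these are standard building blocks for shot-quality and expected-goals style models.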
Preferred Qualifications:
Graduate Degree in Physics, Math or Engineering
Machine learning concepts and applications
Advanced statistical modeling and a strong grounding in statistics
Familiarity working with sports data
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
$63k-74k yearly est. 49d ago
Data Engineer
Telyrx LLC
Data engineer job in Clearwater, FL
At TelyRx, we're revolutionizing access to essential medications by combining cutting-edge technology with a patient-centered approach. Inspired by our mission of "A Faster Way to Wellness," we strive to simplify healthcare and provide seamless, hassle-free access to the medications people need.
As a Data Engineer, you will be responsible for designing, building, and maintaining the data infrastructure that powers our analytics and business intelligence capabilities. You'll work closely with our analytics team to ensure reliable, scalable, and efficient data pipelines that support marketing attribution, customer analytics, and operational reporting across the organization.
This role offers the opportunity to own critical data infrastructure, work with modern cloud-based tools, and directly impact how we leverage data to improve patient outcomes and business performance.
Key Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines to ingest data from multiple sources including advertising platforms (Google Ads, Facebook Ads, Bing Ads), CRM systems, and operational databases
Manage and optimize our Snowflake data warehouse, including schema design, query performance tuning, and cost optimization
Configure and maintain Fivetran connectors and data integrations, ensuring data quality and timely syncs across all platforms
Develop and maintain data transformation layers using SQL and dbt to create clean, reliable datasets for analytics consumption
Build and manage automated workflows and Snowflake tasks for scheduled data refreshes and reporting (a minimal sketch follows this list)
Partner with the analytics team to understand data requirements and translate them into robust technical solutions
Implement data quality monitoring, alerting, and validation frameworks to ensure accuracy and completeness
Document data models, pipelines, and processes to maintain institutional knowledge
Support HIPAA-compliant data handling practices and maintain appropriate access controls
Troubleshoot data issues and perform root cause analysis when discrepancies arise
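As a hedged sketch of the scheduled-refresh bullet above, the Python snippet below creates and resumes a Snowflake task via the snowflake-connector-python package. The connection parameters, warehouse, CRON schedule, and table names are all hypothetical placeholders, not details from the posting:

```python
import snowflake.connector  # assumes the snowflake-connector-python package is installed

# Hypothetical connection parameters
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="...",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARKETING",
)

create_task = """
CREATE OR REPLACE TASK refresh_ad_spend_daily
  WAREHOUSE = TRANSFORM_WH
  SCHEDULE = 'USING CRON 0 6 * * * America/New_York'
AS
  INSERT INTO ad_spend_daily
  SELECT ad_date, platform, SUM(spend) AS spend
  FROM raw_ad_spend
  GROUP BY ad_date, platform
"""

cur = conn.cursor()
cur.execute(create_task)
cur.execute("ALTER TASK refresh_ad_spend_daily RESUME")  # tasks are created suspended
```

In practice the transformation itself would more likely live in dbt, with the task handling only the schedule.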
Requirements:
Basic Qualifications
B.S. in Computer Science, Data Engineering, Information Systems, or related technical field
5+ years of experience in data engineering, analytics engineering, or related roles
Expert-level SQL skills with experience writing complex queries and optimizing performance
Hands-on experience with cloud data warehouses, preferably Snowflake
Experience with ETL/ELT tools and data integration platforms (Fivetran, Stitch, Airbyte, or similar)
Proficiency in Python for data processing and automation
Experience with data transformation tools such as dbt
Strong understanding of data modeling concepts (dimensional modeling, star schemas)
Familiarity with version control (Git) and CI/CD practices for data pipelines
Experience working with marketing and advertising data (Google Ads, Facebook Ads APIs) is a plus
Strong problem-solving skills and attention to detail
Excellent communication skills and ability to collaborate with non-technical stakeholders
Preferred Qualifications
Experience in healthcare, digital health, or e-commerce industries
Familiarity with HIPAA compliance requirements for data handling
Experience with workflow orchestration tools (Airflow, Dagster, Prefect)
Knowledge of BI tools such as Sigma Computing, Tableau, or Looker
Experience with GA4 data exports and BigQuery
Understanding of marketing attribution models and customer analytics data requirements
Snowflake certifications or equivalent cloud data platform experience
Benefits
Health Coverage: Comprehensive health, dental, and vision insurance
Retirement: 401(k) plan
Time Off: Generous paid time off policy
Career Growth: Opportunities to grow within a rapidly expanding company
Mission-Driven Work: Be part of an organization focused on healthcare accessibility and innovation
If you're eager for the freedom to take ownership of our analytics strategy, develop innovative optimization solutions, and thrive while being given the room to build and lead a world-class analytics organization, this is the perfect opportunity for you. We're looking for someone driven to make an impact through data, take initiative in establishing best practices, and grow alongside our company as we tackle new analytical challenges together.
$73k-99k yearly est. 7d ago
Hadoop Admin / Developer
Us Tech Solutions 4.4
Data engineer job in Tampa, FL
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on demand and total workforce solutions. To know more about US Tech Solutions, please visit our website ************************ We are constantly on the lookout for professionals to fulfill our clients' staffing needs, set the correct expectations, and thereby accelerate the mutual growth of both the individual and the organization.
Keeping the same intent in mind, we would like you to consider the job opening with US Tech Solutions that fits your expertise and skillset.
Job Description
Position: Hadoop Admin / Developer
Duration: 6+ Months / Contract-to-Hire / Full-time
Location: Tampa, FL
Interview: Phone & F2F/Skype
Qualifications
• Advanced knowledge in administration of Hadoop components including HDFS, MapReduce, Hive, YARN, Tez, Flume
• Advanced skills in performance tuning and troubleshooting Hadoop jobs
• Intermediate skills in data ingestion to/from Hadoop
• Knowledge of Greenplum, Informatica, Tableau, SAS desired
• Knowledge in Java desired
Additional Information
Chandra Kumar
************
Chandra at ustechsolutionsinc com
$85k-111k yearly est. 17h ago
Data Engineer - Machine Learning (Marketing Analytics)
PODS 4.0
Data engineer job in Clearwater, FL
At PODS (Portable On Demand Storage), we're not just a leader in the moving and storage industry; we redefined it. Since 1998, we've empowered customers across the U.S. and Canada with flexible, portable solutions that put them in control of their move. Whether it's a local transition or a cross-country journey, our personalized service makes any experience smoother, smarter, and more human.
We're driven by a culture of trust, authenticity, and continuous improvement. Our team is the heartbeat of our success, and together we strive to make each day better than the last. If you're looking for a place where your work matters, your ideas are valued, and your growth is supported, PODS is your next destination.
JOB SUMMARY
The Data Engineer - Machine Learning is responsible for scaling a modern data & AI stack to drive revenue growth, improve customer satisfaction, and optimize resource utilization. As an ML Data Engineer, you will bridge data engineering and ML engineering: build high-quality feature pipelines in Snowflake/Snowpark and Databricks, productionize and operate batch/real-time inference, and establish MLOps/LLMOps practices so models deliver measurable business impact at scale.
Note: This role is required onsite at PODS headquarters in Clearwater, FL. The onsite working schedule is Monday - Thursday onsite with Friday remote.
It is NOT a remote opportunity.
General Benefits & Other Compensation:
* Medical, dental, and vision insurance
* Employer-paid life insurance and disability coverage
* 401(k) retirement plan with employer match
* Paid time off (vacation, sick leave, personal days)
* Paid holidays
* Parental leave / family leave
* Bonus eligibility / incentive pay
* Professional development / training reimbursement
* Employee assistance program (EAP)
* Commuter benefits / transit subsidies (if available)
* Other fringe benefits (e.g. wellness credits)
What you will do:
● Design, build, and operate feature pipelines that transform curated datasets into reusable, governed feature tables in Snowflake (see the sketch after this list)
● Productionize ML models (batch and real-time) with reliable inference jobs/APIs, SLAs, and observability
● Set up processes in Databricks and Snowflake/Snowpark to schedule, monitor, and auto-heal training/inference pipelines
● Collaborate with our Enterprise Data & Analytics (ED&A) team on replicating operational data into Snowflake, enriching it into governed, reusable models/feature tables, and enabling advanced analytics & ML, with Databricks as a core collaboration environment
● Partner with Data Science to optimize models that grow customer base and revenue, improve CX, and optimize resources
● Implement MLOps/LLMOps: experiment tracking, reproducible training, model/asset registry, safe rollout, and automated retraining triggers
● Enforce data governance & security policies and contribute metadata, lineage, and definitions to the ED&A catalog
● Optimize cost/performance across Snowflake/Snowpark and Databricks
● Follow robust and established version control and DevOps practices
● Create clear runbooks and documentation, and share best practices with analytics, data engineering, and product partners
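To illustrate the feature-pipeline bullet above, here is a minimal Snowpark for Python sketch that aggregates a curated table into a reusable feature table. The connection parameters and every table and column name are hypothetical assumptions, not details from the posting:

```python
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

# Hypothetical connection parameters
session = Session.builder.configs({
    "account": "myorg-myaccount", "user": "ml_etl", "password": "...",
    "warehouse": "ML_WH", "database": "ANALYTICS", "schema": "CURATED",
}).create()

orders = session.table("ORDERS")

# Aggregate order history into per-customer features
features = orders.group_by("CUSTOMER_ID").agg(
    F.count("ORDER_ID").alias("ORDER_COUNT"),
    F.avg("ORDER_TOTAL").alias("AVG_ORDER_VALUE"),
    F.max("ORDER_DATE").alias("LAST_ORDER_DATE"),
)

# Persist as a governed, reusable feature table
features.write.mode("overwrite").save_as_table("FEATURES.CUSTOMER_ORDER_FEATURES")
```

Snowpark pushes the aggregation down into Snowflake, so the feature build runs where the data lives; an equivalent job could be orchestrated from Databricks as the posting suggests.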
Also, you will
DELIVER QUALITY RESULTS: Able to deliver top quality service to all customers (internal and external); Able to ensure all details are covered and adhere to company policies; Able to strive to do things right the first time; Able to meet agreed-upon commitments or advises customer when deadlines are jeopardized; Able to define high standards for quality and evaluate products, services, and own performance against those standards
TAKE INITIATIVE: Able to exhibit tendencies to be self-starting and not wait for signals; Able to be proactive and demonstrate readiness and ability to initiate action; Able to take action beyond what is required and volunteers to take on new assignments; Able to complete assignments independently without constant supervision
BE INNOVATIVE / CREATIVE: Able to examine the status quo and consistently look for better ways of doing things; Able to recommend changes based on analyzed needs; Able to develop proper solutions and identify opportunities
BE PROFESSIONAL: Able to project a positive, professional image with both internal and external business contacts; Able to create a positive first impression; Able to gain respect and trust of others through personal image and demeanor
ADVANCED COMPUTER USER: Able to use required software applications to produce correspondence, reports, presentations, electronic communication, and complex spreadsheets including formulas and macros and/or databases. Able to operate general office equipment including company telephone system
What you will need:
* Bachelor's or Master's in CS, Data/ML, or related field (or equivalent experience) required
* 4+ years in data/ML engineering building production-grade pipelines with Python and SQL
* Strong hands-on experience with Snowflake/Snowpark and Databricks; comfort with Tasks & Streams for orchestration
* 2+ years of experience operationalizing models: batch jobs and/or real-time APIs, containerized services, CI/CD, and monitoring
* Solid understanding of data modeling and governance/lineage practices expected by ED&A
It would be nice if you had:
* Familiarity with LLMOps patterns for generative AI applications
* Experience with NLP, call center data, and voice analytics
* Exposure to feature stores, model registries, canary/shadow deploys, and A/B testing frameworks
* Marketing analytics domain familiarity (lead scoring, propensity, LTV, routing/prioritization)
MANAGEMENT & SUPERVISORY RESPONSIBILITIES
* Direct supervisor job title(s) typically include: VP, Marketing Analytics
* Job may require supervising Analytics associates
No Unsolicited Resumes from Third-Party Recruiters
Please note that as per PODS policy, we do not accept unsolicited resumes from third-party recruiters unless such recruiters are engaged to provide candidates for a specified opening and in alignment with our Inclusive Diversity values. Any employment agency, person or entity that submits an unsolicited resume does so with the understanding that PODS will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person, or entity.
DISCLAIMER
The preceding job description has been designed to indicate the general nature of work performed; the level of knowledge and skills typically required; and usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required by employees in this job.
Equal Opportunity, Affirmative Action Employer
PODS Enterprises, LLC is an Equal Opportunity, Affirmative Action Employer. We will not discriminate unlawfully against qualified applicants or employees with respect to any term or condition of employment based on race, color, national origin, ancestry, sex, sexual orientation, age, religion, physical or mental disability, marital status, place of birth, military service status, or other basis protected by law.
$80k-113k yearly est. 42d ago
Data Engineer-Lead - Project Planning and Execution
DPR Construction 4.8
Data engineer job in Tampa, FL
We are a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency across all levels. We are looking for a talented Data Engineer to join our team and contribute to developing robust data solutions that support our business goals.
This role is ideal for someone who enjoys combining technical problem-solving with stakeholder collaboration. You will collaborate with business leaders to understand data needs and work closely with a global engineering team to deliver scalable, timely, and high-quality data solutions that power insights and operations.
Responsibilities
* Own data delivery for specific business verticals by translating stakeholder needs into scalable, reliable, and well-documented data solutions.
* Participate in requirements gathering, technical design reviews, and planning discussions with business and technical teams.
* Partner with the extended data team to define, develop, and maintain shared data models and definitions.
* Design, develop, and maintain robust data pipelines and ETL processes using tools like Azure Data Factory and Python across internal and external systems.
* Proactively manage data quality, error handling, monitoring, and alerting to ensure timely and trustworthy data delivery (a minimal sketch follows this list).
* Perform debugging, application issue resolution, root cause analysis, and assist in proactive/preventive maintenance.
* Support incident resolution and perform root cause analysis for data-related issues.
* Create and maintain both business requirement and technical requirement documentation
* Collaborate with data analysts, business users, and developers to ensure the accuracy and efficiency of data solutions.
* Collaborate with platform and architecture teams to align with best practices and extend shared data engineering patterns.
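As a minimal, generic illustration of the error-handling-and-alerting bullet above (not DPR's actual tooling), this Python sketch wraps a pipeline step so failures are logged and routed to an alert channel. The step and alert callables are hypothetical:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_alerting(step_name, step_fn, alert_fn):
    """Run one pipeline step; log success, and alert plus re-raise on failure."""
    try:
        result = step_fn()
        log.info("step %s succeeded", step_name)
        return result
    except Exception as exc:
        log.error("step %s failed: %s", step_name, exc)
        alert_fn(f"Pipeline step '{step_name}' failed: {exc}")
        raise

# Hypothetical usage:
# run_with_alerting("load_orders", load_orders, send_teams_alert)
```

Re-raising after alerting lets the orchestrator (for example, Azure Data Factory or Airflow) still mark the run as failed and apply its own retry policy.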
Qualifications
* Minimum of 4 years of experience as a Data Engineer, working with cloud platforms (Azure, AWS).
* Proven track record of managing stakeholder expectations and delivering data solutions aligned with business priorities.
* Strong hands-on expertise in Azure Data Factory, Azure Data Lake, Python, and SQL
* Familiarity with cloud storage (Azure, AWS S3) and integration techniques (APIs, webhooks, REST).
* Experience with modern data platforms like Snowflake and Microsoft Fabric.
* Solid understanding of Data Modeling, pipeline orchestration and performance optimization
* Strong problem-solving skills and ability to troubleshoot complex data issues.
* Excellent communication skills, with the ability to work collaboratively in a team environment.
* Familiarity with tools like Power BI for data visualization is a plus.
* Experience working with or coordinating with overseas teams is a strong plus
Preferred Skills
* Knowledge of Airflow or other orchestration tools.
* Experience working with Git-based workflows and CI/CD pipelines
* Experience in the construction industry or a similar field is a plus but not required.
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************
$83k-109k yearly est. Auto-Apply 37d ago
Interoperability Engineer (Workday)
Moffitt Cancer Center 4.9
Data engineer job in Tampa, FL
Highlights
The Workday Interoperability Engineer serves as a senior technical expert responsible for architecting, deploying, and maintaining Workday integrations and interoperability frameworks that support secure, scalable data exchange across HR, finance, clinical, research, and enterprise systems.
Acts as a subject matter expert in Workday integration patterns including Workday Studio, EIBs, RaaS, APIs, event-driven integrations, and streaming/data pipelines.
Owns the design and operational delivery of Workday-centric interoperability initiatives, ensuring reliability and alignment with business outcomes.
Provides mentorship and technical leadership to engineers and analysts, guiding them in best practices for Workday and enterprise integration.
Combines deep Workday integration expertise with an understanding of cross-functional business processes and downstream system dependencies.
The role will also be responsible for developing and maintaining frameworks that support information-exchange needs across clinical systems.
Responsibilities
Hands-on experience building integrations with Workday HCM, Finance, Payroll, Recruiting, or other Workday modules.
Strong understanding of Workday data structures, security groups, calculated fields, and Workday report development, including Report-as-a-Service (RaaS) (a minimal sketch follows this list).
Proficiency in developing integrations using Workday Studio, EIB, Core Connectors, and PECI (Payroll Effective Change Interface).
Translate Workday integration requirements into technical specifications, integration contracts, and design standards.
Ability to gather API requirements, translate them into technical specifications, and produce comprehensive API design documentation (standards, contracts, and specifications).
Hands-on experience implementing application security frameworks, including OAuth2, SAML, OpenID Connect, and JWT.
Experience in API testing strategies - functional, regression, performance, and security testing - using tools such as Postman, SoapUI, JMeter, or equivalents.
Good understanding of firewall and advanced networking concepts to support secure system integrations.
Provide on-call support and keep integration documentation and records up to date.
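As a hedged illustration of pulling data from a Workday custom report via RaaS, here is a minimal Python sketch. The host, tenant, report owner, report name, and credentials are all hypothetical placeholders following the general RaaS URL shape; a real integration would typically use OAuth2 rather than basic auth, as the security items above suggest:

```python
import requests

# Hypothetical endpoint following the general RaaS URL shape:
#   https://<host>/ccx/service/customreport2/<tenant>/<report_owner>/<report_name>?format=json
RAAS_URL = (
    "https://wd5-services1.myworkday.com/ccx/service/customreport2/"
    "acme_tenant/ISU_Integrations/Worker_Roster?format=json"
)

# Basic auth with a hypothetical integration system user (OAuth2 preferred in production)
resp = requests.get(RAAS_URL, auth=("ISU_Integrations@acme_tenant", "secret"), timeout=60)
resp.raise_for_status()

# RaaS JSON responses nest report rows under "Report_Entry"
for row in resp.json().get("Report_Entry", []):
    print(row)
```

Because RaaS exposes a Workday report as a plain web service, the same endpoint can feed downstream systems without a full Studio integration.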
Credentials and Experience:
Bachelor's Degree - field of study: Computer Science, systems analysis, or a related field
Minimum 7 years of experience leading end-to-end integration implementations, with a strong emphasis on Workday and supporting middleware technologies like Cloverleaf and Boomi.
Minimum of 3 years' experience working with cross-functional teams providing expert knowledge for ERP data analysis to design, build, and deploy integrations.
The Ideal Candidate will have the following experience:
Strong hands-on experience developing Workday integrations using Workday Studio, EIBs, Core Connectors, RaaS, and Workday Web Services.
Experience designing and supporting interoperability between Workday and downstream systems
Familiarity with healthcare interoperability concepts and standards such as HL7 or FHIR, especially where Workday interacts with clinical or research environments.
Proficiency with integration platforms such as Boomi and/or Cloverleaf for orchestrating Workday-related data flows.
Experience with EMR systems such as Epic is a plus, particularly when supporting Workday-to-clinical data exchange.
The average data engineer in Tampa, FL earns between $63,000 and $114,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Tampa, FL
$85,000