Data engineer jobs in Town 'n' Country, FL - 564 jobs
All
Data Engineer
Data Scientist
ETL Architect
Hadoop Developer
Senior Data Engineer
Toorak Capital Partners
Data engineer job in Tampa, FL
Company:
Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business purpose residential, multifamily and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.
Summary:
The Lead Data Engineer will develop and implement high-performance, scalable data solutions to support Toorak's Data Strategy.
Lead Data architecture for Toorak Capital.
Lead efforts to create API framework to use data across customer facing and back office applications.
Establish consistent data standards, reference architectures, patterns, and practices across the organization for OLTP, OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies
Lead sourcing and synthesis of Data Standardization and Semantics discovery efforts, turning insights into actionable strategies that define the priorities for the team and rally stakeholders to the vision
Lead the data integration and mapping efforts to harmonize data.
Champion standards, guidelines, and direction for ontology, data modeling, semantics and Data Standardization in general at Toorak.
Lead strategies and design solutions for a wide variety of use cases like Data Migration (end-to-end ETL process), database optimization, and data architectural solutions for Analytics Data Projects
Required Skills:
Designing and maintaining the data models, including conceptual, logical, and physical data models
5+ years of experience with NoSQL systems (MongoDB, DynamoDB), relational SQL database systems (PostgreSQL), and Athena
5+ years of experience on Data Pipeline development, ETL and processing of structured and unstructured data
5+ years of experience in large scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar
Proficiency with data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms
Experience with BigQuery and SQLMesh (or a similar SQL-based cloud platform).
Knowledge of cloud platforms and technologies such as Google Cloud Platform, Amazon Web Services.
Strong SQL skills.
Experience with API development and frameworks.
Knowledge in designing solutions with Data Quality, Data Lineage, and Data Catalogs
Strong background in Data Science, Machine Learning, NLP, Text processing of large data sets
Experience with one or more of the following is nice to have: Dataiku, DataRobot, Databricks, UiPath.
Using version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation
Ability to rapidly comprehend changes to key business processes and the impact on overall Data framework.
Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change.
Advanced analytical skills.
High level of organization and attention to detail.
Self-starter attitude with the ability to work independently.
Knowledge of legal, compliance, and regulatory issues impacting data.
Experience in finance preferred.
$72k-99k yearly est. 4d ago
Data Scientist (Exploitation Specialist Level-3) - Tampa, FL
Masego
Data engineer job in Tampa, FL
Job Description
Masego is an award-winning small business that specializes in GEOINT services. As a Service-Disabled Veteran-Owned Small Business (SDVOSB), we recognize and award your hard work.
Description
We are looking for a Level-3 TS/SCI-cleared Data Scientist to join our team. This role provides automation and collection support to the main team at NGA Washington, so it requires strong communication skills and a baseline knowledge of GEOINT collection and/or automation systems such as JEMA.
Minimum Required Qualifications:
At least 5 years of related GEOINT work experience, or 2 years with a relevant Bachelor's degree.
Able to work on client site 40 hours a week (very limited option for telework)
Proficient with Python
Experience with JEMA
Preferred Qualifications:
Experience with multiple intelligence types (SIGINT, OSINT, ELINT, GEOINT, MASINT, HUMINT)
Experience with Brewlytics, ArcPro and/or other geospatial data analysis tools
Knowledge of GEOINT collection and associated NGA/NRO systems
Proficiency with common programming languages including R, SQL, HTML, and JavaScript
Experience analyzing geospatially enabled data
Ability to learn new technologies and adapt to dynamic mission needs
Ability to work collaboratively with a remote team (main gov team is based out of NGA Washington)
Experience providing embedded data science/automation support to analytic teams
Security Clearance Requirement:
Active TS/SCI, with a willingness to take a polygraph test.
Salary Range: $128,600 based on ability to meet or exceed stated requirements
About Masego
Masego Inc. provides expert Geospatial Intelligence Solutions in addition to Activity Based Intelligence (ABI) and GEOINT instructional services. Masego provides expert-level Geospatial Collection Management, Full Motion Video; Human Geography; Information Technology and Cyber; Technical Writing; and ABI, Agile, and other professional training.
Masego is a Service-Disabled Veteran-Owned Small Business headquartered in Fredericksburg, Virginia. With high-level expertise and decades of experience, coupled with proven project management systems and top-notch client support, Masego enhances the performance capabilities of the Department of Defense and the intelligence community.
Pay and Benefits
We seek to provide and take care of our team members. We currently offer Medical, Dental, Vision, 401k, Generous PTO, and more!
Diversity
Masego, Inc. is an equal opportunity/equal access/affirmative action employer fully committed to achieving a diverse workforce and complies with all applicable Federal and Virginia State laws, regulations, and executive orders regarding nondiscrimination and affirmative action in its programs and activities. Masego, Inc. does not discriminate on the basis of race, color, religion, ethnic or national origin, gender, genetic information, age, disability, sexual orientation, gender identity, gender expression, and veteran's status.
$128.6k yearly 25d ago
Lead Data Engineer
The Walt Disney Company 4.6
Data engineer job in Key Vista, FL
The Disney Decision Science + Integration (DDSI) is a consulting team that supports clients across The Walt Disney Company, including Disney Experiences (Parks & Resorts worldwide, Cruise Line, Consumer Products, etc.), Disney Entertainment (ABC, The Walt Disney Studios, Disney Theatrical, Disney Streaming Services, etc.), ESPN, and Corporate Finance. Key partners to the DDSI organization include Marketing, Finance, Business Development, Research, and Operations. We develop, analyze, and execute strategies and improve the value proposition for our Guests, Cast Members, and Shareholders. The team leverages technology, data analytics, optimization, statistical and econometric modeling to explore opportunities, shape business decisions and drive business value.
What You Will Do
You will be responsible for planning and leading research and development related to advanced analytic data solutions, leveraging GenAI. You will work with business and technology leaders to understand scope and requirements, business needs, and data from across the Disney company in order to design and deliver the data pipelines necessary for our solutions. You will partner with the Decision Science Products, Decision Science, and client teams on critical projects. Other activities include planning, estimating, design, development, testing, production rollout and sustainment activities. You will also need to consult and collaborate with project team members, lead design reviews, do hands-on development, and communicate with colleagues and leaders.
Required Qualifications & Skills
7+ years overall experience in a data engineering development capacity using multiple environments (Dev, QA, Prod, etc.) and DevOps procedures for code deployment/promotion
Experience with a variety of GenAI models, tools, and concepts
3+ years using, designing and building relational databases (preferably Snowflake or PostgreSQL)
3+ years of experience leading and deploying code using a source control product such as GitLab/GitHub
2+ years of experience with job scheduling software like Apache Airflow, Amazon MWAA, GitLab Runners or UC4
Multiple years of experience with ELT/ETL data pipeline development and maintenance
Multiple years of demonstrated experience and expertise using SQL and Python
Experience using containerization technologies such as Docker or Kubernetes
Knowledgeable on cloud architecture and product offerings, preferably AWS
Understanding of Knowledge Graphs, Data Mesh, and other data sharing platforms
Experience translating project scope and high-level requirements into technical data engineering tasks
Experience defining solutions to sophisticated data engineering problems in support of advanced analytic processes
Experience collaborating with multiple project teams in a fast-paced environment
Experience defining and estimating level-of-effort for data engineering activities
Experience with project and sprint planning
Ability to communicate technical concepts and solutions to non-technical team members
Experience designing and building data structures to support requirements
Preferred Qualifications
Experience leading development of GenAI based systems including model selection, pipeline orchestration and deployment strategies
Experience defining GenAI architectures
Knowledgeable with Disney Parks attendance, reservations and/or products
Experience with cloud based technologies, preferably AWS EMR, EC2, and S3
Experience with advanced Snowflake offerings such as Snowpark, Data Exchange, Data Marketplace and Snowpipe
Education
Bachelor's degree in computer science, Information Systems, Software, Electrical or Electronics Engineering, or comparable field of study and/or equivalent work experience
Master's degree preferred in computer science, Information Systems, Software, Electrical or Electronics Engineering, or comparable field of study and/or equivalent work experience
#DisneyTech
#DisneyAnalytics
Job Posting Segment:
Corporate Strategy
Job Posting Primary Business:
Decision Science & Integration
Primary Job Posting Category:
Data Engineering
Employment Type:
Full time
Primary City, State, Region, Postal Code:
Lake Buena Vista, FL, USA
Alternate City, State, Region, Postal Code:
Date Posted:
2025-09-30
$101k-146k yearly est. Auto-Apply 60d+ ago
Data Scientist
Calhoun International 4.7
Data engineer job in Tampa, FL
Join our team at Core One! Our mission is to be at the forefront of devising analytical, operational and technical solutions to our Nation's most complex national security challenges. In order to achieve our mission, Core One values people first! We are committed to recruiting, nurturing, and retaining top talent! We offer a competitive total compensation package that sets us apart from our competition. Core One is a team-oriented, dynamic, and growing company that values exceptional performance!
* This position requires an active TS/SCI clearance.*
Responsibilities:
* Provide data science (DS) and operations research (OR) capabilities on-site for a combatant command Operation Assessment Division.
* Design, develop, and apply a variety of data collection and decision analytics processes and applications, including the employment of mathematical, statistical, and other analytic methods.
* Identify effective, efficient, and innovative technical solutions for meeting Division data and automation requirements, including potential artificial intelligence (AI) and machine learning (ML) solutions.
* Develop automated applications, data visualizations, information displays, decision briefings, and analytic papers, and facilitate senior leadership decisions with analytic products.
* Identify and develop data stream interfaces for authoritative data sources to support assessments and risk analysis.
* Integrate Division functions and products into the Command and Control of the Information Environment (C2IE) system, MAVEN Smart Systems, and/or Advana.
* Build digital solutions using programming applications (e.g., R, R/Shiny, Python) to digitalize and partially or fully automate data collection, analysis, and staff processes while accelerating the rate at which the Division can execute tasks.
* Develop and lead small teams in the development of real-time/near real-time data visualization and analysis methodologies and analytic tools.
* Participate in client operational planning processes in support of Joint planning.
* Support Knowledge Management and Information processes requirements.
Basic Qualifications:
* Possess a Master's Degree, preferably in a related technical field, such as operations research, data science, math, engineering, science, or computer science.
* 5-12 years of combined professional DS/OR experience, with a minimum of 5 years of related DS/OR experience at a Combatant Command staff, Joint or Combined Command Headquarters, or Defense Department equivalent.
* High levels of proficiency using the following applications: R, R-Shiny, Python, Python-Shiny, SQL/PostgreSQL, Microsoft Office applications, and Microsoft SharePoint
* Functional knowledge of MAVEN Smart Systems, C2IE, Advana, AI, ML, Git, and Large Language Models
* Top Secret (TS)/Secure Compartmented Information (SCI) clearance is required. Applicants are subject to a security investigation and need to meet eligibility requirements for access to classified information.
Additional Qualifications:
* Ability to work independently or as the leader or member of a small team in conducting analysis in support of assessments with high visibility, unusual urgency or program criticality; requiring a variety of OR and DS techniques and tools.
* Possession of excellent oral and written communication skills with the ability to communicate, prepare correspondence, and make formal presentations at the 4-Star General Officer/Flag Officer level.
* Ability to develop and support new analytic capabilities as requirements evolve within the command for assessments.
* Knowledge of Joint Warfighting and Combatant Command functions.
Security Clearance:
* Active TS/SCI clearance is required
Core One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
$66k-93k yearly est. 8d ago
ETL Architect
Healthplan Services 4.7
Data engineer job in Tampa, FL
HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.
Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning and best practices
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling
Preferred Qualifications:
12+ years of work experience as Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture, data modeling, BI and Analytics data marts, and implementing and supporting production environments
10+ years of experience designing, building and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty and integrity
Conscientious of the Enterprise Data Warehouse Release management process; conduct operations readiness and environment compatibility review of any changes prior to deployment, with strong sensitivity around impact and SLA
Experience with data modeling tools a plus.
Expert in data warehousing methodologies and best practices required.
Ability to initiate and follow through on complex projects of both short and long term duration required.
Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input and keeps supervisor informed (required).
Proactive recommendations for improving the performance and operability of the data warehouse and reporting environment.
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements
Education Requirements:
Bachelor's or equivalent degree in a business, technical, or related field
Additional Information
All your information will be kept confidential according to EEO guidelines.
$84k-105k yearly est. 1d ago
Principal Data Scientist
Maximus 4.3
Data engineer job in Tampa, FL
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator, supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work).
This position requires occasional travel to the DC area for client meetings.
U.S. citizenship is required for this position due to government contract requirements.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Masters or BS in quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage #LI-Remote
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
$64k-92k yearly est. Easy Apply 7d ago
Associate Data Scientist
Reliaquest 3.5
Data engineer job in Tampa, FL
Why it's worth it:
Are you a passionate Data Scientist with a knack for solving complex problems and a love for innovation? At ReliaQuest, you'll have the opportunity to analyze and interpret complex data sets, driving automation of threat detection and response for one of the world's fastest-growing AI cybersecurity companies. You'll contribute to the creation, testing, and deployment of cutting-edge technology for enterprise customers worldwide. Most importantly, you'll collaborate with some of the brightest minds in the industry and make a direct impact on the growth and success of ReliaQuest. This role offers you the chance to take ownership of projects and work on systems that operate at a significant scale, providing a unique opportunity to see the tangible results of your efforts.
The everyday hustle:
Analyze and interpret complex data sets to identify patterns and trends.
Develop and implement data models and algorithms to enhance our GreyMatter agentic AI security operations platform.
Collaborate with software engineers to integrate data-driven solutions into our products.
Automate data collection and processing to streamline our customers' security operations.
Work on systems that operate at a significant scale, providing a unique opportunity to see the tangible results of your efforts.
Collaborate closely with various business units, both internally and externally, to ensure seamless product usage and maximum potential.
Take ownership of projects and drive them to completion, ensuring high-quality deliverables.
Do you have what it takes?
Completed Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
Passion and experience in data analysis and machine learning using languages and tools such as Python, R, SQL, and TensorFlow.
Appreciation and understanding of data security principles.
Practical or lab experience with data visualization tools like Tableau or Power BI.
Passion and inquisitiveness with cloud platforms such as AWS, GCP, or Azure.
Education or relevant experiences that allow you to work with some of the world's best engineering minds.
Proficiency in English, written and verbal.
$67k-92k yearly est. Auto-Apply 2d ago
Data Scientist
Core One
Data engineer job in Tampa, FL
Join our team at Core One! Our mission is to be at the forefront of devising analytical, operational and technical solutions to our Nation's most complex national security challenges. In order to achieve our mission, Core One values people first! We are committed to recruiting, nurturing, and retaining top talent! We offer a competitive total compensation package that sets us apart from our competition. Core One is a team-oriented, dynamic, and growing company that values exceptional performance!
*This position requires an active TS/SCI clearance.*
Responsibilities:
Provide data science (DS) and operations research (OR) capabilities on-site for a combatant command Operation Assessment Division.
Design, develop, and apply a variety of data collection and decision analytics processes and applications, including the employment of mathematical, statistical, and other analytic methods.
Identify effective, efficient, and innovative technical solutions for meeting Division data and automation requirements, including potential artificial intelligence (AI) and machine learning (ML) solutions.
Develop automated applications, data visualizations, information displays, decision briefings, and analytic papers, and facilitate senior leadership decisions with analytic products.
Identify and develop data stream interfaces for authoritative data sources to support assessments and risk analysis.
Integrate Division functions and products into the Command and Control of the Information Environment (C2IE) system, MAVEN Smart Systems, and/or Advana.
Build digital solutions using programming applications (e.g., R, R/Shiny, Python) to digitalize and partially or fully automate data collection, analysis, and staff processes while accelerating the rate at which the Division can execute tasks.
Develop and lead small teams in the development of real-time/near real-time data visualization and analysis methodologies and analytic tools.
Participate in client operational planning processes in support of Joint planning.
Support Knowledge Management and Information processes requirements.
Basic Qualifications:
Possess a Master's Degree, preferably in a related technical field, such as operations research, data science, math, engineering, science, or computer science.
5-12 years of combined professional DS/OR experience, with a minimum of 5 years of related DS/OR experience at a Combatant Command staff, Joint or Combined Command Headquarters, or Defense Department equivalent.
High levels of proficiency using the following applications: R, R-Shiny, Python, Python-Shiny, SQL/PostgreSQL, Microsoft Office applications, and Microsoft SharePoint
Functional knowledge of MAVEN Smart Systems, C2IE, Advana, AI, ML, Git, and Large Language Models
Top Secret (TS)/Secure Compartmented Information (SCI) clearance is required. Applicants are subject to a security investigation and need to meet eligibility requirements for access to classified information.
Additional Qualifications:
Ability to work independently or as the leader or member of a small team in conducting analysis in support of assessments with high visibility, unusual urgency or program criticality; requiring a variety of OR and DS techniques and tools.
Possession of excellent oral and written communication skills with the ability to communicate, prepare correspondence, and make formal presentations at the 4-Star General Officer/Flag Officer level.
Ability to develop and support new analytic capabilities as requirements evolve within the command for assessments.
Knowledge of Joint Warfighting and Combatant Command functions.
Security Clearance:
Active TS/SCI clearance is required
Core One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
$63k-91k yearly est. 9d ago
Data Scientist
Redhorse Corporation
Data engineer job in Tampa, FL
About the Organization
Now is a great time to join Redhorse Corporation. We are a solution-driven company delivering data insights and technology solutions to customers with missions critical to U.S. national interests. We're looking for thoughtful, skilled professionals who thrive as trusted partners building technology-agnostic solutions and want to apply their talents supporting customers with difficult and important mission sets.
About the Role
Redhorse Corporation is seeking a highly skilled Data Scientist to join our team supporting the United States Central Command (USCENTCOM) Directorate of Logistics (CCJ4). You will play a critical role in accelerating the delivery of AI-enabled capabilities within the Joint Logistics Common Operating Picture (JLOGCOP), directly impacting USCENTCOM's ability to promote international cooperation, respond to crises, deter aggression, and build resilient logistics capabilities for our partners. This is a high-impact role contributing to national security and global stability. You will be working on a custom build of AI/ML capabilities into the JLOGCOP leveraging dozens of data feeds to enhance decision-making and accelerate planning for USCENTCOM missions.
Key Responsibilities
Communicate with the client regularly regarding enterprise values and project direction.
Find the intersection between business value and achievable technical work.
Articulate and translate business questions into technical solutions using available DoD data.
Explore datasets to find meaningful entities and relationships.
Create data ingestion and cleaning pipelines.
Develop applications and effective visualizations to communicate insights.
Serve as an ambassador for executive DoD leadership to sponsor data literacy growth across the enterprise.
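The "data ingestion and cleaning pipelines" duty above can be sketched in miniature. This is a hedged, self-contained illustration, not the program's actual pipeline; the feed layout, field names, and rejection rules are all invented for the example.

```python
import csv
import io

# Hypothetical raw logistics feed: dates and quantities arrive as strings
# and may be missing or malformed.
RAW = """record_id,ship_date,qty
1,2024-01-05,10
2,,7
3,2024-01-06,not_a_number
"""

def ingest(text):
    """Parse the CSV feed into dicts (the extract step)."""
    return list(csv.DictReader(io.StringIO(text)))

def clean(rows):
    """Drop rows with missing dates and coerce qty to int (the cleaning step)."""
    out = []
    for row in rows:
        if not row["ship_date"]:
            continue  # reject: no ship date
        try:
            row["qty"] = int(row["qty"])
        except ValueError:
            continue  # reject: non-numeric quantity
        out.append(row)
    return out

cleaned = clean(ingest(RAW))
print(cleaned)  # only record 1 survives both checks
```

In production the same extract/clean separation lets each stage be tested and monitored independently, which is what makes a pipeline maintainable rather than a one-off script.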
Required Experience/Clearance
US citizen with a Secret US government clearance. Applicants who are not US Citizens and who do not have a current and active Secret security clearance will not be considered for this role.
Ability to work independently to recommend solutions to the client and as part of a team to accomplish tasks.
Experience with functional programming (Python, R, Scala) and database languages (SQL).
Familiarity using AI/ML tools to support logistics use cases.
Ability to discern which statistical approaches are appropriate for different contexts.
Experience communicating key findings with visualizations.
8+ years of professional experience.
Master's degree in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Desired Experience
Experience with cloud-based development platforms.
Experience with large-scale data processing tools.
Experience with data visualization tools.
Ph.D. in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Equal Opportunity Employer/Veterans/Disabled
Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this site as a result of your disability. You can request reasonable accommodations by contacting Talent Acquisition at ***********************************. Redhorse Corporation shall, in its discretion, modify or adjust the position to meet Redhorse's changing needs. This job description is not a contract and may be adjusted as deemed appropriate in Redhorse's sole discretion.
About the Role
Culmen International is hiring Expert Exploitation Specialist/Data Scientists to provide support on-site at the National Geospatial-Intelligence Agency (NGA) in Tampa, FL.
The National Geospatial-Intelligence Agency (NGA) expects to deliver AOS Metadata Cataloging and Management Services to enhance product and asset management of content enabling rapid creation of discoverable, modular, web enabled, and visually enriched Geospatial Intelligence (GEOINT) products for intelligence producers in NGA, across the National System for Geospatial-Intelligence (NSG).
TALENT PIPELINE - Qualified applicants will be contacted as soon as funding for this position is secured.
What You'll Do in Your New Role
The Data Scientist will coordinate with our clients to understand questions and issues involving the client's datasets, then determine the best method and approach to create data-driven solutions within program guidelines. This position will be relied upon as a Subject Matter Expert (SME), and be expected to lead/assist in the development of automated processes, architect data science solutions, automated workflows, conduct analysis, use available tools to analyze data, remain adaptable to mission requirements, and identify patterns to help solve some of the complex problems that face the DoD and Intelligence Community (IC).
Work with large structured/unstructured data in a modeling and analytical environment to define and create streamlined processes for evaluating unique datasets and solving challenging intelligence issues
Lead and participate in the design of solutions and refinement of pre-existing processes
Work with Customer Stakeholders, Program Managers, and Product Owners to translate road map features into components/tasks, estimate timelines, identify resources, suggest solutions, and recognize possible risks
Use exploratory data analysis techniques to identify meaningful relationships, patterns, or trends from complex data
Combine applied mathematics, programming skills, analytical techniques, and data to provide impactful insights for decision makers
Research and implement optimization models, strategies, and methods to inform data management activities and analysis
Apply big data analytic tools to large, diverse sets of data to deliver impactful insights and assessments
Conduct peer reviews to improve quality of workflows, procedures, and methodologies
Help build high-performing teams; mentor team members providing development opportunities to increase their technical skills and knowledge
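The "exploratory data analysis techniques to identify meaningful relationships" duty above amounts to systematically testing fields for association. A minimal sketch, with wholly invented toy data and a hand-rolled Pearson correlation, just to show the shape of the work:

```python
# Toy exploratory pass: measure which fields move together, a first step
# toward finding "meaningful relationships, patterns, or trends".
data = {
    "sorties":   [10, 12, 15, 18, 20],
    "fuel_used": [52, 60, 76, 90, 101],
    "noise":     [3, 9, 1, 7, 4],
}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

for name in ("fuel_used", "noise"):
    print(name, round(pearson(data["sorties"], data[name]), 2))
```

Here `fuel_used` correlates almost perfectly with `sorties` while `noise` does not; in practice the analyst then asks whether a strong association is causal, confounded, or an artifact before it drives any assessment.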
Required Qualifications
TS/SCI Clearance w/CI Poly Eligible
Minimum of 18 years combined experience (A combination of years of experience & professional certifications/trainings can be used in lieu of a degree)
BS in related Field with Graduate level work
Expert proficiency in Python and other programming languages applicable to automation development.
Demonstrated experience designing and implementing workflow automation systems
Advanced experience with ETL (Extract, Transform, Load) processes for geospatial data
Expertise in integrating disparate systems through API development and implementation
Experience developing and deploying enterprise-scale automation solutions
Knowledge of NGA's Foundation GEOINT products, data types, and delivery methods
Demonstrated experience with database design, implementation, and optimization
Experience with digital media generation systems and automated content delivery platforms
Ability to analyze existing workflows and develop technical solutions to streamline processes
Knowledge of DLA systems and interfaces, particularly MEBS and WebFLIS
Expertise in data quality assurance and validation methodologies
Experience with geospatial data processing, transformation, and delivery automation
Proficiency with ArcGIS tools, GEODEC and ACCORD software systems
Understanding of cartographic principles and standards for CADRG/ECRG products
Strong analytical skills for identifying workflow inefficiencies and implementing solutions
Experience writing technical documentation including SOPs, CONOPS, and system design
Desired Qualifications
Certification(s) in relevant automation technologies or programming languages
Experience with DevOps practices and CI/CD implementation
Knowledge of cloud-based automation solutions and their implementation in government environments
Experience with machine learning applications for GEOINT workflow optimization
Expertise in data analytics and visualization for workflow performance metrics
Understanding of NGA's enterprise architecture and integration points
Experience implementing RPA (Robotic Process Automation) solutions
Knowledge of secure coding practices and cybersecurity principles
Demonstrated expertise in digital transformation initiatives
Experience mentoring junior staff in automation techniques and best practices
Background in agile development methodologies
Understanding of human-centered design principles for workflow optimization
About the Company
Culmen International is committed to enhancing international safety and security, strengthening homeland defense, advancing humanitarian missions, and optimizing government operations. With experience in over 150 countries, Culmen supports our clients to accomplish critical missions in challenging environments.
Exceptional Medical/Dental/Vision Insurance, premiums for employees are 100% paid by Culmen,
and dependent coverage is available at a nominal rate (including same or opposite sex domestic partners)
401k - Vested immediately and 4% match
Life insurance and disability paid by the company
Supplemental Insurance Available
Opportunities for Training and Continuing Education
12 Paid Holidays
To learn more about Culmen International, please visit **************
At Culmen International, we are committed to creating and sustaining a workplace that upholds the principles of Equal Employment Opportunity (EEO). We believe in the importance of fair treatment and equal access to opportunities for all employees and applicants. Our commitment to these principles is unwavering across all our operations worldwide.
$62k-90k yearly est. 60d+ ago
Data Governance & Metadata Scientist
Nv5
Data engineer job in Saint Petersburg, FL
NV5 Geospatial is actively recruiting a Data Governance & Metadata Scientist. Strong capabilities in developing, maintaining, and optimizing an outward-facing data catalog integrating geospatial and research layers are required. The Data Governance & Metadata Scientist will be based remotely supporting US Southern Command. US citizenship, along with the ability to successfully pass a basic background check for access to US military bases, is required for employment. While no clearance is required, a Secret or higher clearance is preferred.
Work Setting:
This role offers flexibility in location, with the option to work from any NV5 Regional Office or remotely from home.
Potential travel up to 5-15% of the time
NV5 is a global technology solutions and consulting services company with a workforce of over 4,500 professionals in more than 100 offices worldwide. NV5's continued growth has been spurred through strategic investments in firms with unique capabilities to help current and future customers solve the world's toughest problems. The NV5 family brings together talent across a wide range of markets and fields, including Professional Engineers, Professional Land Surveyors, Architects, Photogrammetrists, GIS Professionals, Software Developers, IT, Project Management Professionals, and more.
At NV5 Geospatial, we are a collaboration of intelligent, innovative thinkers who care for each other, our communities, and the environment. We value both heart and head, the diversity of our people, and their experiences because that is how we continue to grow as leaders in our industry and expand our individual and collective potential.
Responsibilities
Implement data lineage tracking and metadata synchronization to ensure consistency across Databricks, Kubernetes, and research dashboards.
Support ontology-driven decision support systems, mapping structured and unstructured datasets to enhance data interoperability.
Develop automated metadata validation and quality control mechanisms, ensuring research datasets maintain compliance with DoD governance frameworks.
Integrate metadata into platforms and implement tagging policies consistent with program standards.
Utilize GitLab pipelines and CI/CD tools for publishing and indexing routines.
Publish or embed outputs in approved web services for research dashboards intended for external access.
Utilize Azure-native indexing services such as Cognitive Search to implement federated metadata and research product discovery pipelines.
Ensure security boundary compliance.
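The "automated metadata validation and quality control" responsibility above boils down to checking each catalog record against a required schema before it is indexed. A minimal sketch; the required fields, classification values, and record contents here are invented for illustration, not the program's actual governance schema:

```python
# Minimal metadata QC pass (hypothetical schema): every catalog record must
# carry the fields a downstream discovery index depends on.
REQUIRED = {"title", "source_system", "classification", "last_updated"}
ALLOWED_CLASSIFICATIONS = {"UNCLASSIFIED", "CUI"}

def validate(record):
    """Return a list of QC failures for one catalog entry."""
    issues = [f"missing:{f}" for f in sorted(REQUIRED - record.keys())]
    if record.get("classification") not in ALLOWED_CLASSIFICATIONS:
        issues.append("bad:classification")
    return issues

good = {"title": "Ports layer", "source_system": "ArcGIS",
        "classification": "UNCLASSIFIED", "last_updated": "2024-06-01"}
bad = {"title": "Rivers layer", "classification": "SECRET"}

print(validate(good))  # []
print(validate(bad))   # two missing fields plus a disallowed classification
```

Wired into a CI/CD pipeline (the GitLab flow mentioned above), a check like this gates publication: records with a non-empty issue list never reach the outward-facing catalog.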
Qualifications
Minimum Requirements:
Bachelor's degree in Computer Science, Data Engineering, Geographic Information Systems (GIS), or a related field, or five (5) years of equivalent experience in data engineering, full-stack development, and metadata-driven data cataloging.
Demonstrated experience in developing interactive data portals, implementing API-driven search and data exchange, and integrating geospatial data layers into web applications.
Experience working with Databricks, Esri ArcGIS Feature Services, OpenLineage, and metadata management solutions.
Software development skills in Python, JavaScript (React, Angular, Vue), SQL, and RESTful API design.
Proficiency in cloud environments such as AWS, Azure, or Google Cloud, and implementing scalable, data-driven applications.
Ability to manage and prioritize complex project tasks.
Preferred:
Microsoft Certified Azure Data Engineer, AWS Certified Data Analytics Specialty, or Esri Web GIS Developer Certification.
Portuguese or Spanish language skills.
Experience with government IT programs and environments.
Clearance Requirement:
None; an active Secret or TS/SCI clearance is preferred
Please be aware that some of our positions may require the ability to obtain security clearance. Security clearances may only be granted to U.S. citizens. In addition, applicants who accept a conditional offer of employment may be subject to government security investigation(s) and must meet eligibility requirements for access to classified information.
Employment is contingent upon successful completion of a background check and drug screening.
NV5 offers a competitive compensation and benefits package including medical, dental, life insurance, FTO, 401(k) and professional development/advancement opportunities.
NV5 provides equal employment opportunities (EEO) to all applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws. NV5 complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including, but not limited to, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
#LI-Remote
$63k-92k yearly est. 15d ago
Senior Data Engineer Architecture C4ISR
SOSi
Data engineer job in Tampa, FL
Founded in 1989, SOSi is among the largest private, founder-owned technology and services integrators in the defense and government services industry. We deliver tailored solutions, tested leadership, and trusted results to enable national security missions worldwide.
Job Description
Overview
**This position is contingent upon award of contract**
SOS International LLC (SOSi) is seeking a Data Engineer (C4ISR Architecture) to support our customer at MacDill AFB, Florida.
Essential Job Duties
Provide administration of full motion video (FMV) dissemination systems to include Forcepoint Raise the Bar compliant Cross Domain System feed management portals at USCENTCOM HQ and other locations as required.
Engineer, configure, and deploy FMV dissemination solutions in support of USCENTCOM, component, and coalition partner requirements.
Provide subject matter expertise of C4ISR Systems employed by USCENTCOM, components, and coalition partners in order to provide integration of ISR platforms supporting USCENTCOM Operational requirements.
When required, represent USCENTCOM J2 equities to Service and/or CSA Programs of Record such as DISA UVDS, Air Force DGS, NSA CuAS, and NGA MAVEN to ensure interoperability of all C4ISR Systems supporting USCENTCOM.
Provide crisis operations support for FMV dissemination to JWICS, SIPR Rel, BICES, CPN Bi-Lats, CPN-X, TALON, and SEAGULL as required.
Provide Network Engineering, FMV routing, and dissemination in support of integrating coalition and service federated PED Nodes.
Qualifications
Minimum Requirements
Active In-Scope TS/SCI clearance.
Experience providing support during crisis operation for FMV dissemination.
Experience providing network engineering, FMV routing, and dissemination in support of integrating coalition service federated PED nodes.
Preferred Qualifications
Minimum 12 years of experience related to the specific labor category with at least a portion of the experience within the last 2 years.
Master's degree in an area related to the labor category from a college or university accredited by an agency recognized by the U.S. Department of Education; or a Bachelor's degree related to the labor category from a college or university accredited by an agency recognized by the U.S. Department of Education and an additional 5 years of related senior experience, for a total of 17 years, as a substitute for the Master's degree.
Additional Information
Work Environment
Working conditions are normal for an office environment.
Working at SOSi
All interested individuals will receive consideration and will not be discriminated against for any reason.
$72k-99k yearly est. 1d ago
Full Stack Cloud & Data Engineer
Shyftoff
Data engineer job in Tampa, FL
Corp:
ShyftOff's flexible, on-demand contact center platform matches businesses with top CX talent, scaling their contact center operations to meet demand. We're revolutionizing the traditional model by staffing on-demand with top US talent, minus the HR overhead.
Position Summary:
We're hiring our next engineer - someone who's obsessed with data, passionate about systems that 1) work, 2) are performant, and 3) are clean (in that order!), and eager to design, build, and optimize data pipelines that power product growth.
In this role, you'll own the entire data product ecosystem - from how data flows through the platform to how it's surfaced for decision-making. You'll play a key role in designing data systems, enabling data-informed insights, and ensuring our platform scales efficiently.
This role is onsite in Tampa, FL
Duties and Responsibilities:
Build and maintain scalable data models, cloud infrastructure, and end-to-end pipelines that power ShyftOff's platform.
Design, implement, and optimize workflows using Airflow for ETL/ELT processes.
Develop and maintain PostgreSQL databases.
Write clean, maintainable Python code (with a focus on Pandas for data manipulation).
Partner cross-functionally with Sales, Marketing, and Operations to drive data-informed decisions.
Manage integrations between internal systems, ensuring smooth data flow across the business.
Maintain, monitor, and troubleshoot production data systems hosted in AWS (RDS, S3, ECS, Lambda) and GCP (BigQuery, Looker Studio).
Own the data lifecycle, from schema design and ingestion through transformation, validation, and reporting.
Champion reliability, scalability, monitoring, and performance across the data platform.
Contribute ideas, explore new tools/technologies, and take pride in building something foundational.
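The Airflow duty above is, at its core, about declaring task dependencies and letting a scheduler derive a valid execution order. A stripped-down, stdlib-only sketch of that dependency model (the task names are invented; a real Airflow DAG would use `airflow.DAG` and operators, which this deliberately does not attempt):

```python
from graphlib import TopologicalSorter

# Dependency graph in the shape an Airflow DAG formalizes: one extract
# feeds two transforms, and both must finish before the load step runs.
dag = {
    "transform_shifts": {"extract"},
    "transform_agents": {"extract"},
    "load_warehouse":   {"transform_shifts", "transform_agents"},
}

# A scheduler derives a legal execution order from the edges alone.
order = list(TopologicalSorter(dag).static_order())
print(order)  # "extract" is always first, "load_warehouse" always last
```

Airflow adds scheduling, retries, backfills, and monitoring on top, but getting the dependency graph right is the part that determines whether an ETL/ELT pipeline is correct.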
Experience and Qualifications:
Essential:
4+ years of professional experience in Data Engineering or a related backend engineering field.
Strong command of PostgreSQL (schema design, optimization, complex queries).
Proficient in Python and experience creating and maintaining Airflow DAGs.
Hands-on experience with AWS Cloud Services (S3, ECS, RDS, DynamoDB).
Proven ability to design, build, troubleshoot, and maintain robust ETL/ELT pipelines.
Strong understanding of software engineering principles, data modeling, and distributed systems.
Excellent communication skills and ability to collaborate effectively with non-technical teams.
Thrives in fast-paced startup environments where speed and ownership matter.
Desirable:
AWS Certification (Solutions Architect, Data Engineer, or equivalent).
Prior startup experience or experience as an early technical team member.
Strong GitHub presence or portfolio of open-source contributions.
What Sets You Apart:
You're a true data enthusiast: you love clean systems, structured databases, and elegant architecture.
You balance vision and execution: you can architect scalable systems and roll up your sleeves to build them.
You like to work iteratively & quickly on new concepts.
You care deeply about shipping reliable, high-impact code that drives business value.
You move fast and communicate clearly, helping the team stay aligned and productive.
You thrive in a collaborative environment and believe great systems are built through shared context and trust.
You're excited about the opportunity to help shape the future of ShyftOff's data ecosystem.
Benefits:
Competitive salary and equity
Health and wellness benefits
Professional development opportunities
High-impact role with visibility across the company
The chance to help shape the technical culture and data infrastructure of a growing startup
Equal Opportunity Employer:
ShyftOff Corp values diversity and does not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
$72k-99k yearly est. 47d ago
Data Engineer III
Boar's Head Resort 4.3
Data engineer job in Sarasota, FL
Hiring Company: Delicatessen Services Co., LLC
Overview: A Boar's Head Data Engineer III is a key part of the Enterprise Applications team and is responsible for developing and delivering innovative solutions aligned with the organization's goals. The role focuses on maintaining scalable, reliable, consistent, and repeatable systems that efficiently provide data for company-wide analytics and operations. A Data Engineer III must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Essential Skills:
•Application Proficiency & Data Modeling: Expertise in using specific applications (Oracle, Kafka) and the ability to create conceptual and logical data models for complex systems.
•Coding & Technical Writing: Proficiency in one or more programming languages (e.g., SQL, JAVA, Python, R etc.) and the ability to clearly document technical processes, instructions, and project reports.
•Technical Testing & Troubleshooting: Superior skills in testing software or hardware systems for errors or issues and effectively diagnosing and resolving these problems.
Desired Skills:
• Experience with data integration of Anaplan is a significant plus. Ability to support QA technical testing, troubleshoot Anaplan data issues, and create applicable documentation when necessary is a plus.
Job Description: Responsibilities & Specific Duties
Responsibilities
Data Streaming/Data Analytics: Leverages appropriate technologies to provide business solutions, focusing on data streaming and analytics.
Collaboration: Collaborates with various teams and resident experts to resolve data-related technical issues and support data infrastructure needs.
Business Operations: Understands business operations to identify automation opportunities and effective use of reporting & analytics tools for all business areas (e.g., Marketing, Sales, Distribution, Manufacturing, Finance & Human Resources).
Technical Delivery: Assists in managing technical delivery and identifying potential solutions based on technical and business suitability.
Data Governance: Works with knowledge management leads to maintain consistency in Standard Operating Procedures and Work Instructions for Data Governance and Analytics applications.
Support: Supports the implementation of new and expanding applications/processes/projects.
Specific Duties
Streaming/Data Pipelines: Creates and maintains data pipelines using Boar's Head streaming technologies (Kafka, Flink, Clickhouse).
Data Modeling: Analyzes business requirements and potential data sources to define the required data mapping and streaming architecture to assemble complex data sets that meet business requirements.
Automation: Identify, design, and implement internal processes through automation, data delivery optimization, and infrastructure redesign for greater scalability.
Infrastructure: Builds infrastructure for data extraction, transformation, and loading using SQL and AWS 'big data' technologies.
Data Quality/Observability: Ensures end-to-end (source to target data warehouse) data accuracy and integrity either by thorough data validation and testing or using a data observability tool.
Testing Partnership: Collaborates with QA teams for systems testing and user acceptance testing strategy development and validation.
Collaboration: Identifies and communicates business process states, recommending future state options in collaboration with team members (Technical Delivery Managers, other Data Engineers/Developers, and Analysts).
Continuous Improvement: Fosters understanding of software functions, user interactions, limitations, usage scenarios, error handling, and expected success measures (targets, metrics for continuous improvement).
Documentation: Outlines the current state and designs future state process diagrams using Visio or other advanced modeling tools, and provides technical process documentation and data flows, including system/application dependencies across auxiliary systems.
All Other: Other duties and responsibilities may be added at the manager's discretion.
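The streaming-pipeline and data-quality duties above share one pattern: records flow through a validation gate before they are landed in the target. A generator-based sketch of that source-to-target quality check; the record fields and the rejection rule are invented for illustration, and a real pipeline here would run on Kafka/Flink rather than in-process generators:

```python
# Generator-based sketch of a streaming quality gate: each record passes
# a validation stage before it is "landed", mirroring the end-to-end
# (source to target) accuracy check described above.
def source():
    """Stand-in for a Kafka topic: yields records one at a time."""
    yield {"sku": "A1", "cases": 40}
    yield {"sku": "A2", "cases": -3}   # impossible quantity: reject
    yield {"sku": "A3", "cases": 12}

def validate(stream):
    """Pass through only records that satisfy the quality rule."""
    for rec in stream:
        if rec["cases"] >= 0:
            yield rec

landed = list(validate(source()))
print(len(landed))  # 2 of 3 records reach the target
```

Because each stage is a generator, records stream through one at a time rather than being batched, which is the same back-pressure-friendly shape a Kafka consumer/producer chain has.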
Project Management Support
Technical Requirements: Provides technical requirements on assigned projects and assesses level of effort estimates.
PM Assistance: Maintains project focus and assists the Project Manager with schedules, resources and scope; defines change requests for missed technical requirements and/or scope change or other constraints (schedule, resources, and scope/quality).
Agile Tools: Experience with Atlassian project management tools (JIRA, Confluence) is a significant plus.
Education and Experience
Education: Bachelor's degree in a tech-related field such as computer science or software engineering or a related field.
Experience: At least 6 years' experience in a role equivalent to Data Engineer.
Data Science: Familiarity with data streaming (Kafka, Flink or equivalent, Clickhouse or equivalent), wrangling, modeling, and analytics.
Data Quality/Observability: Experience in data quality, data observability, data governance, data integrity, data analytics and validation.
Reporting: Tableau or similar analytics tool knowledge is beneficial.
Topology: Conceptual understanding in technical topology, overarching technical footprints.
Oracle: Working knowledge of Oracle DB, Oracle EBS/ERP, and SQL
Cyber Security: Basic understanding of Cyber Security concepts.
Microsoft Proficiency: Proficiency with the full Microsoft Office Suite, Microsoft Project, Microsoft Visio, and SmartDraw is advantageous.
Location: Sarasota, FL
Time Type: Full time
Department: Management Information Systems
$74k-103k yearly est. 1d ago
Hadoop Admin / Developer
Us Tech Solutions 4.4
Data engineer job in Tampa, FL
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on demand and total workforce solutions. To know more about US Tech Solutions, please visit our website ************************ We are constantly on the lookout for professionals who fulfill the staffing needs of our clients, set the correct expectations, and thus become accelerators in the mutual growth of the individual and the organization.
Keeping the same intent in mind, we would like you to consider the job opening with US Tech Solutions that fits your expertise and skillset.
Job Description
Position: Hadoop Admin / Developer
Duration: 6+ Months / Contract-to-Hire / Full-time
Location: Tampa, FL
Interview: Phone & F2F/Skype
Qualifications
• Advanced knowledge in administration of Hadoop components including HDFS, MapReduce, Hive, YARN, Tez, Flume
• Advanced skills in performance tuning and troubleshooting Hadoop jobs
• Intermediate skills in data ingestion to/from Hadoop
• Knowledge of Greenplum, Informatica, Tableau, SAS desired
• Knowledge in Java desired
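The MapReduce component named in the qualifications above follows a fixed contract: a mapper emits key/value pairs, the framework sorts them by key, and a reducer aggregates each key's group. A hedged, in-process sketch of that contract using the classic word count (Hadoop Streaming would run the two stages as separate processes over stdin/stdout; this wires them together directly for illustration):

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit (word, 1) for every token."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reducer(pairs):
    """Reduce phase: sum counts per key.
    The sort mimics Hadoop's shuffle, which delivers mapper output
    grouped by key before the reduce phase runs."""
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield (key, sum(v for _, v in group))

counts = dict(reducer(mapper(["Hive on YARN", "hive on Tez"])))
print(counts)  # {'hive': 2, 'on': 2, 'tez': 1, 'yarn': 1}
```

Performance tuning a real Hadoop job is largely about the shuffle step this sketch hides: how much data crosses the network between map and reduce, and how evenly keys are distributed across reducers.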
Additional Information
Chandra Kumar
************
Chandra at ustechsolutionsinc com
$85k-111k yearly est. 1d ago
Data Engineer - Machine Learning (Marketing Analytics)
PODS 4.0
Data engineer job in Clearwater, FL
At PODS (Portable On Demand Storage), we're not just a leader in the moving and storage industry, we redefined it. Since 1998, we've empowered customers across the U.S. and Canada with flexible, portable solutions that put customers in control of their move. Whether it's a local transition or a cross-country journey, our personalized service makes any experience smoother, smarter, and more human.
We're driven by a culture of trust, authenticity, and continuous improvement. Our team is the heartbeat of our success, and together we strive to make each day better than the last. If you're looking for a place where your work matters, your ideas are valued, and your growth is supported- PODS is your next destination.
JOB SUMMARY
The Data Engineer - Machine Learning is responsible for scaling a modern data & AI stack to drive revenue growth, improve customer satisfaction, and optimize resource utilization. As an ML Data Engineer, you will bridge data engineering and ML engineering: build high-quality feature pipelines in Snowflake/Snowpark and Databricks, productionize and operate batch/real-time inference, and establish MLOps/LLMOps practices so models deliver measurable business impact at scale.
Note: This role is required onsite at PODS headquarters in Clearwater, FL. The onsite working schedule is Monday - Thursday onsite with Friday remote.
It is NOT a remote opportunity.
General Benefits & Other Compensation:
Medical, dental, and vision insurance
Employer-paid life insurance and disability coverage
401(k) retirement plan with employer match
Paid time off (vacation, sick leave, personal days)
Paid holidays
Parental leave / family leave
Bonus eligibility / incentive pay
Professional development / training reimbursement
Employee assistance program (EAP)
Commuter benefits / transit subsidies (if available)
Other fringe benefits (e.g. wellness credits)
What you will do:
Design, build, and operate feature pipelines that transform curated datasets into reusable, governed feature tables in Snowflake
Productionize ML models (batch and real-time) with reliable inference jobs/APIs, SLAs, and observability
Set up processes in Databricks and Snowflake/Snowpark to schedule, monitor, and auto-heal training/inference pipelines
Collaborate with our Enterprise Data & Analytics (ED&A) team centered on replicating operational data into Snowflake, enriching it into governed, reusable models/feature tables, and enabling advanced analytics & ML, with Databricks as a core collaboration environment
Partner with Data Science to optimize models that grow the customer base and revenue, improve CX, and optimize resources
Implement MLOps/LLMOps: experiment tracking, reproducible training, model/asset registry, safe rollout, and automated retraining triggers
Enforce data governance & security policies and contribute metadata, lineage, and definitions to the ED&A catalog
Optimize cost/performance across Snowflake/Snowpark and Databricks
Follow robust and established version control and DevOps practices
Create clear runbooks and documentation, and share best practices with analytics, data engineering, and product partners
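The feature-pipeline duty above means turning raw events into one governed feature row per entity. A hedged, stdlib-only sketch of that aggregation shape (the event fields and feature names are invented; at PODS scale this would run as a Snowpark or Databricks job over warehouse tables, not in-memory Python):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical quote events, the raw input a feature pipeline consumes.
events = [
    {"customer": "c1", "move_miles": 12,  "quoted": 900},
    {"customer": "c1", "move_miles": 800, "quoted": 3200},
    {"customer": "c2", "move_miles": 5,   "quoted": 450},
]

def build_features(rows):
    """Aggregate raw events into one reusable feature row per customer."""
    grouped = defaultdict(list)
    for r in rows:
        grouped[r["customer"]].append(r)
    return {
        cust: {
            "n_quotes": len(rs),
            "avg_quote": mean(r["quoted"] for r in rs),
            "max_miles": max(r["move_miles"] for r in rs),
        }
        for cust, rs in grouped.items()
    }

features = build_features(events)
print(features["c1"])  # {'n_quotes': 2, 'avg_quote': 2050, 'max_miles': 800}
```

Keeping the feature definitions in one governed table, rather than recomputed ad hoc per model, is what makes them reusable across training and inference and keeps the two consistent.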
Also, you will
DELIVER QUALITY RESULTS: Able to deliver top quality service to all customers (internal and external); Able to ensure all details are covered and adhere to company policies; Able to strive to do things right the first time; Able to meet agreed-upon commitments or advises customer when deadlines are jeopardized; Able to define high standards for quality and evaluate products, services, and own performance against those standards
TAKE INITIATIVE: Able to exhibit tendencies to be self-starting and not wait for signals; Able to be proactive and demonstrate readiness and ability to initiate action; Able to take action beyond what is required and volunteers to take on new assignments; Able to complete assignments independently without constant supervision
BE INNOVATIVE / CREATIVE: Able to examine the status quo and consistently look for better ways of doing things; Able to recommend changes based on analyzed needs; Able to develop proper solutions and identify opportunities
BE PROFESSIONAL: Able to project a positive, professional image with both internal and external business contacts; Able to create a positive first impression; Able to gain respect and trust of others through personal image and demeanor
ADVANCED COMPUTER USER: Able to use required software applications to produce correspondence, reports, presentations, electronic communication, and complex spreadsheets including formulas and macros and/or databases. Able to operate general office equipment including company telephone system
What you will need:
Bachelor's or Master's in CS, Data/ML, or related field (or equivalent experience) required
4+ years in data/ML engineering building production-grade pipelines with Python and SQL
Strong hands-on with Snowflake/Snowpark and Databricks; comfort with Tasks & Streams for orchestration
2+ years of experience optimizing models: batch jobs and/or real-time APIs, containerized services, CI/CD, and monitoring
Solid understanding of data modeling and governance/lineage practices expected by ED&A
It would be nice if you had:
Familiarity with LLMOps patterns for generative AI applications
Experience with NLP, call center data, and voice analytics
Exposure to feature stores, model registries, canary/shadow deploys, and A/B testing frameworks
Marketing analytics domain familiarity (lead scoring, propensity, LTV, routing/prioritization)
MANAGEMENT & SUPERVISORY RESPONSIBILITIES
• Direct supervisor job title(s) typically include: VP, Marketing Analytics
• Job may require supervising Analytics associates
No Unsolicited Resumes from Third-Party Recruiters
Please note that as per PODS policy, we do not accept unsolicited resumes from third-party recruiters unless such recruiters are engaged to provide candidates for a specified opening and in alignment with our Inclusive Diversity values. Any employment agency, person or entity that submits an unsolicited resume does so with the understanding that PODS will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person, or entity.
DISCLAIMER
The preceding job description has been designed to indicate the general nature of work performed; the level of knowledge and skills typically required; and usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required by employees in this job.
Equal Opportunity, Affirmative Action Employer
PODS Enterprises, LLC is an Equal Opportunity, Affirmative Action Employer. We will not discriminate unlawfully against qualified applicants or employees with respect to any term or condition of employment based on race, color, national origin, ancestry, sex, sexual orientation, age, religion, physical or mental disability, marital status, place of birth, military service status, or other basis protected by law.
$80k-113k yearly est. 60d+ ago
Data Engineer-Lead - Project Planning and Execution
DPR Construction 4.8
Data engineer job in Tampa, FL
We are a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency across all levels. We are looking for a talented Data Engineer to join our team and contribute to developing robust data solutions that support our business goals.
This role is ideal for someone who enjoys combining technical problem-solving with stakeholder collaboration. You will collaborate with business leaders to understand data needs and work closely with a global engineering team to deliver scalable, timely, and high-quality data solutions that power insights and operations.
Responsibilities
* Own data delivery for specific business verticals by translating stakeholder needs into scalable, reliable, and well-documented data solutions.
* Participate in requirements gathering, technical design reviews, and planning discussions with business and technical teams.
* Partner with the extended data team to define, develop, and maintain shared data models and definitions.
* Design, develop, and maintain robust data pipelines and ETL processes using tools like Azure Data Factory and Python across internal and external systems.
* Proactively manage data quality, error handling, monitoring, and alerting to ensure timely and trustworthy data delivery.
* Perform debugging, application issue resolution, root cause analysis, and assist in proactive/preventive maintenance.
* Support incident resolution and perform root cause analysis for data-related issues.
* Create and maintain both business requirement and technical requirement documentation
* Collaborate with data analysts, business users, and developers to ensure the accuracy and efficiency of data solutions.
* Collaborate with platform and architecture teams to align with best practices and extend shared data engineering patterns.
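A compact sketch of the pipeline-reliability responsibilities above (robust ETL with retries, data-quality rejection, and logging for monitoring). The step functions, field names, and retry policy are hypothetical stand-ins; a real implementation would live in Azure Data Factory or an orchestrator, with alerting wired to the reject counts.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retry(step, *, retries=3, delay=0.01):
    """Run one pipeline step, retrying transient failures and
    logging each attempt so failures surface in monitoring."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                raise
            time.sleep(delay)

def extract():
    # Stand-in for pulling rows from an internal or external system.
    return [{"project": "p1", "hours": "8"}, {"project": "p2", "hours": "x"}]

def transform(rows):
    """Keep only rows with valid numeric hours; count rejects so
    data-quality alerts can fire when the reject rate spikes."""
    clean, rejects = [], 0
    for r in rows:
        try:
            clean.append({"project": r["project"], "hours": int(r["hours"])})
        except ValueError:
            rejects += 1
    return clean, rejects

rows = run_with_retry(extract)
clean, rejects = transform(rows)
```

Separating retry policy, transformation, and reject counting keeps each concern testable on its own, which is what makes "timely and trustworthy data delivery" verifiable rather than aspirational.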
Qualifications
* Minimum of 4 years of experience as a Data Engineer, working with cloud platforms (Azure, AWS).
* Proven track record of managing stakeholder expectations and delivering data solutions aligned with business priorities.
* Strong hands-on expertise in Azure Data Factory, Azure Data Lake, Python, and SQL
* Familiarity with cloud storage (Azure, AWS S3) and integration techniques (APIs, webhooks, REST).
* Experience with modern data platforms like Snowflake and Microsoft Fabric.
* Solid understanding of Data Modeling, pipeline orchestration and performance optimization
* Strong problem-solving skills and ability to troubleshoot complex data issues.
* Excellent communication skills, with the ability to work collaboratively in a team environment.
* Familiarity with tools like Power BI for data visualization is a plus.
* Experience working with or coordinating with overseas teams is a strong plus
Preferred Skills
* Knowledge of Airflow or other orchestration tools.
* Experience working with Git-based workflows and CI/CD pipelines
* Experience in the construction industry or a similar field is a plus but not required.
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together-by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************
$83k-109k yearly est. 60d+ ago
Data Scientist II - Client Protection
Bank of America Corporation 4.7
Data engineer job in Tampa, FL
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities and shareholders every day.
Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates' physical, emotional, and financial wellness, recognizing and rewarding performance, and how we make an impact in the communities we serve.
Bank of America is committed to an in-office culture with specific requirements for office-based attendance and which allows for an appropriate level of flexibility for our teammates and businesses based on role-specific considerations.
At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!
Job Summary:
This job is responsible for reviewing and interpreting large datasets to uncover revenue generation opportunities and ensuring the development of effective risk management strategies. Key responsibilities include working with lines of business to comprehend problems, utilizing sophisticated analytics and deploying advanced techniques to devise solutions, and presenting recommendations based on findings. Job expectations include demonstrating leadership, resilience, accountability, a disciplined approach, and a commitment to fostering responsible growth for the enterprise.
Client Protection Shared Services - Advanced Analytics is looking for an energetic and inquisitive data scientist to join our team and help us combat financial crime. In this role, you will work on large and complex data science projects involving both relational and graph databases. You will collaborate with internal strategy, technology, product, and policy partners to deploy advanced analytical solutions with the goal of reducing fraud losses, lowering false positive impacts, improving client experience, and ensuring the Bank minimizes its total cost of fraud. Key responsibilities include applying knowledge of multiple business and technical-related topics and independently driving strategic initiatives, large-scale projects, and overall improvements.
Responsibilities:
* Perform graph analytics to find and mitigate densely connected fraud networks
* Assist with the generation, prioritization, and investigation of fraud rings
* Enable business analytics, including data analysis, trend identification, and pattern recognition, using advanced techniques to drive decision making and data driven insights
* Understand end-to-end model development, spanning supervised, unsupervised, and graph-based machine learning solutions, to maximize detection of fraud and capture anomalous behavior
* Manage multiple priorities and ensures quality and timeliness of work deliverables such as data science products, data analysis reports, or data visualizations, while exhibiting the ability to work independently and in a team environment
* Manage relationships with multiple technology teams, development team, and line of business leaders, including alignment of roadmaps, managing projects, and managing risks
* Oversee development, delivery and quality assurance for data science use cases delivered to the production environment and other areas of the line of business
* Support the identification of potential issues and development of controls
* Support execution of large-scale projects, such as platform conversions or new project integrations by conducting advanced reporting and drawing analytical-based insights
* Manage a roadmap of data science use cases that answer business trends based on economic and portfolio conditions, and communicate findings to senior management while working diligently with and leading peers to solve these use cases
* Coach and mentor peers to improve proficiency in a variety of systems and serve as a subject matter expert on multiple business and technical-related topics
* Apply agile practices for project management, solution development, deployment, and maintenance
* Deliver presentations in an engaging and effective manner, through in-person and virtual conversations, that communicate technical concepts and analysis results to a diverse set of internal stakeholders, and develop professional relationships to foster collaboration on work deliverables
* Maintain knowledge of the latest advances in the fields of data science and artificial intelligence to support business analytics
* Engage business and technology senior leaders on reporting of project/deliverable statuses, opportunity identification, and planning efforts
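The graph-analytics responsibility above (finding densely connected fraud networks) reduces, at its simplest, to connected components over an account-linkage graph. The sketch below is a plain-Python illustration, not the Bank's method: the edge sources (shared devices, addresses) and the minimum ring size are hypothetical, and a production system would run this in a graph database such as TigerGraph or Neo4j.

```python
from collections import defaultdict

def fraud_rings(edges, min_size=3):
    """Find connected components (candidate fraud rings) in an
    account-linkage graph given (account, account) edge pairs."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, rings = set(), []
    for node in adj:
        if node in seen:
            continue
        # Iterative depth-first search to collect one component.
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        if len(comp) >= min_size:
            rings.append(sorted(comp))
    return sorted(rings)

# Accounts a1-a3 share links (a dense triangle); b1-b2 are an
# isolated pair below the ring-size threshold.
edges = [("a1", "a2"), ("a2", "a3"), ("a3", "a1"), ("b1", "b2")]
rings = fraud_rings(edges)
```

Real fraud-ring prioritization layers scoring on top of this (edge weights, recency, loss exposure), but component detection is the usual first cut for generating investigable candidates.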
Required Qualifications:
* 4+ years of experience in data and analytics
* 4+ years of experience in data analytics within fraud prevention
* Must be proficient with SQL and one of SAS, Python, or Java
* Must have familiarity with Graph databases (e.g. TigerGraph, Neo4J) and graph query languages
* Problem-solving skills including selection of data and deployment of solutions
* Proven ability to manage projects, exercise thought leadership and work with limited direction on complex problems to achieve project goals while also leading a broader team
* Excellent communication and influencing skills
* Thrives in fast-paced and highly dynamic environment
* Intellectual curiosity and strong urge to figure out the "whys" of a problem and produce creative solutions
* Exposure to model development leveraging supervised and unsupervised machine learning (regression, tree-based algorithms, etc.)
* Expertise in data analytics and technical development lifecycles including having coached junior staff
* Expertise handling and manipulating data across its lifecycle in a variety of formats, sizes, and storage technologies to solve a problem (e.g., structured, semi-structured, unstructured; graph; hadoop; kafka)
Desired Qualifications
* Advanced Quantitative degree (Master's or PhD)
* 7+ years of experience; work in financial services is very helpful, with preference to fraud, credit, cybersecurity, or other heavily quantitative areas
* Understanding of advanced machine learning methodologies including neural networks, graph algorithms, ensemble learning like XGB, and other techniques
* Proficient with Spark, H2O, or similar advanced analytical tools
* Analytical and Innovative Thinking
* Problem Solving and Business Acumen
* Risk and Issue Management, interpreting relevant laws, rules, and regulations
* Data Visualization, Oral and Written Communication, and Presentation Skills
* Experience managing multi-year roadmaps, engaging technical and non-technical stakeholders, and leading large cross-functional formal projects
* Experience influencing mid to senior (executive) level leaders
* Experience managing risk and issue remediation
* Understanding of computer science topics like automation, code versioning, computational complexity, parallel processing, requirements gathering, testing methodologies, and development lifecycle models like Agile
Skills:
* Agile Practices
* Application Development
* DevOps Practices
* Technical Documentation
* Written Communications
* Artificial Intelligence/Machine Learning
* Business Analytics
* Data Visualization
* Presentation Skills
* Risk Management
* Adaptability
* Collaboration
* Consulting
* Networking
* Policies, Procedures, and Guidelines Management
It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
Shift:
1st shift (United States of America)
Hours Per Week:
40
$71k-93k yearly est. 11d ago
Data Engineer - AI/ML
Fintech 4.2
Data engineer job in Tampa, FL
We are seeking a Data Engineer with strong AI/ML expertise to modernize and scale our Business Intelligence (BI) capabilities. This role will design and build data pipelines, deploy machine learning solutions, and operationalize intelligent analytics to drive decision-making across the organization. The ideal candidate blends data engineering best practices with applied machine learning, MLOps, and AI.
Key Responsibilities
Project Responsibility: End-to-end data pipelines and integrations
Technical Competencies:
* Advanced SQL optimization and complex query design
* Kafka streaming applications and connector development
* Databricks workflow development with medallion architecture
* Data governance implementation and compliance
* Performance tuning for large-scale data processing
* Data security and privacy best practices
* Apache NiFi pipeline development for invoice and PO processing
* Integration with purpose-built data stores (Druid, MongoDB, OpenSearch, Postgres)
* Build and maintain end-to-end ML pipelines for training, deployment, and monitoring of models.
* Design and optimize data architectures for large-scale ML workloads
* Explore and implement LLM-based solutions, RAG architectures, and generative AI for business use cases.
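The medallion architecture mentioned in the competencies above organizes pipelines into bronze (raw), silver (cleaned), and gold (aggregated) layers. A minimal sketch in plain Python follows; the invoice/PO fields are hypothetical, and in practice each layer would be a Databricks table with the transforms running as workflow tasks.

```python
def to_silver(bronze):
    """Bronze -> silver: drop malformed invoice rows, normalize types."""
    silver = []
    for r in bronze:
        try:
            silver.append({"po": r["po"].strip().upper(),
                           "amount": float(r["amount"])})
        except (KeyError, ValueError, AttributeError):
            continue  # quarantine candidates in a real pipeline
    return silver

def to_gold(silver):
    """Silver -> gold: aggregate invoice amounts per purchase order."""
    gold = {}
    for r in silver:
        gold[r["po"]] = round(gold.get(r["po"], 0.0) + r["amount"], 2)
    return gold

bronze = [
    {"po": "po-1 ", "amount": "100.50"},
    {"po": "po-1", "amount": "49.50"},
    {"po": "po-2", "amount": "oops"},  # malformed, rejected at silver
]
gold = to_gold(to_silver(bronze))
```

Keeping the raw bronze data untouched means the silver and gold layers can always be rebuilt when cleaning rules change, which is the main operational payoff of the pattern.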
Soft Skills:
* Cross-functional collaboration with product and engineering teams
* Technical mentoring for junior data engineers
* Analytical thinking for complex data problems
* Stakeholder communication for data requirements
* Process improvement and efficiency focus
* Quality mindset for data accuracy and reliability
Vendor Management:
* Direct communication with data platform vendors
* Evaluates vendor tools for specific data use cases
* Provides technical feedback on vendor product roadmaps
* Coordinates with vendors for data integration projects
Qualifications:
Bachelor's/Master's in Computer Science, Data Engineering, Statistics, or related field.
5+ years in data engineering; 2+ years applying ML in production.
Our Benefits:
* Hybrid Work
* Employer Matched 401K
* Company Paid Medical Insurance Option for Employee and Dependent Children
* Company Paid Dental Insurance for Employee
* Company Paid Vision Insurance for Employee
* Company Paid Long and Short-Term Disability
* Company Paid Life and AD&D Insurance
* 18 Paid Vacation Days a Year
* Six Paid Holidays
* Employee Recognition Programs
* Holiday Bonus
* Incentive Compensation
* Community Outreach Opportunities
* Business Casual Dress Code
About Fintech:
Fintech, a pioneering accounts payable (AP) automation solutions provider, has dedicated nearly 35 years to automating invoice processing between retail and hospitality businesses, and their supply chain partners. Backed by leading investors TA Associates and General Atlantic, it stands as a leader in this sector. Its flagship product, PaymentSource, was first built for the alcohol industry to provide invoice payment automation between alcohol distributors and their customers across all 50 states. Today, it is utilized by over 267,000 businesses nationwide for invoice payment and collection associated with all B2B business transactions. This proven platform automates invoice payment, streamlines payment collection, and facilitates comprehensive data capture for over 1.1 million business relationships. Recognizing operational hurdles, Fintech expanded its payment capabilities to include scan-based trading/consignment selling for its vendors and retailers and built an advanced CRM tool with functionality to fortify vendor, supplier, and distributor field execution, addressing diverse profit center challenges. For more information about Fintech and its range of solutions, please visit ****************
Fintech is a Drug-Free Workplace. Fintech is an Equal Opportunity Employer that does not discriminate on the basis of actual or perceived race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other characteristic protected by applicable federal, state, or local laws and ordinances. Fintech's management team is dedicated to this policy with respect to recruitment, hiring, placement, promotion, transfer, training, compensation, benefits, employee activities, access to facilities and programs and general treatment during employment. We E-Verify.
$93k-129k yearly est. 22d ago
ETL Architect
Healthplan Services 4.7
Data engineer job in Tampa, FL
HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries.
Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.
Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning and best practices
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling
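Source-to-target mapping, named in the duties above, is often expressed as a mapping document that drives the ETL code. The sketch below treats the mapping itself as data; the column names and transforms are hypothetical examples, not HPS's actual schema.

```python
# Source-to-target mapping as data: each target column names its
# source field and the transform applied in flight.
MAPPING = {
    "member_id":   ("MBR_ID",   str.strip),
    "plan_code":   ("PLAN_CD",  str.upper),
    "premium_usd": ("PREM_AMT", float),
}

def map_row(source_row):
    """Apply the source-to-target mapping to one source record."""
    return {target: fn(source_row[src])
            for target, (src, fn) in MAPPING.items()}

src = {"MBR_ID": " m-001 ", "PLAN_CD": "hmo", "PREM_AMT": "129.95"}
row = map_row(src)
```

Keeping the mapping in one declarative structure means the design document and the running code cannot drift apart, which simplifies the impact reviews and SLA sensitivity the role calls for.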
Preferred Qualifications:
12+ years of work experience as a Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture, data modeling for BI and Analytics data marts, and implementing and supporting production environments
10+ years of experience designing, building and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty and integrity
Conscientious of the Enterprise Data Warehouse release management process; conduct operations readiness and environment compatibility reviews of any changes prior to deployment, with strong sensitivity around impact and SLA
Experience with data modeling tools a plus
Expertise in data warehousing methodologies and best practices required
Ability to initiate and follow through on complex projects of both short and long term duration required
Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input and keeps supervisor informed
Proactive recommendations for improving the performance and operability of the data warehouse and reporting environment
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements
Education Requirements:
Bachelor's or equivalent degree in a business, technical, or related field
Additional Information
All your information will be kept confidential according to EEO guidelines.
How much does a data engineer earn in Town North Country, FL?
The average data engineer in Town North Country, FL earns between $63,000 and $114,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Town North Country, FL
$85,000
What are the biggest employers of Data Engineers in Town North Country, FL?
The biggest employers of Data Engineers in Town North Country, FL are: