Job Family:
Data Science Consulting
Travel Required:
Up to 10%
Clearance Required:
Ability to Obtain Public Trust
All candidates who meet the minimum qualifications for this opportunity will be reviewed after the application period closes on Friday, January 30. Candidates selected to interview will be notified by Friday, February 6.
What You Will Do:
Many organizations lack a clear view of their data assets, keeping the full value of their data out of reach. Guidehouse delivers end-to-end services, including designing, implementing, and deploying AI solutions and robust data platforms as well as providing advanced analytics and insights. Guidehouse's tailored solutions optimize operations, enhance customer experiences, and drive innovation.
As a Consultant, you will join Guidehouse's AI & Data team - a “horizontal” team dedicated to delivering artificial intelligence, machine learning, and advanced analytics solutions to drive innovation and deliver impactful value across Guidehouse's Public Sector client segments: Defense & Security; Communities, Energy, and Infrastructure; Financial Services; and Health. The AI & Data team has a sub-team within each segment focused on applying cutting-edge technologies and strategies to address the segment's most complex and rapidly evolving challenges across a variety of domains. You'll contribute to high-value initiatives, which may include internal innovation efforts or strategic projects, and gain hands-on experience with modern tools and methodologies. You'll collaborate with experienced professionals, grow your technical capabilities, and help shape data-driven solutions that matter.
Consultants support project teams both on client engagements (on and off-site) and internal projects. Responsibilities will include client and project management, data and information analysis, solution implementation and generation of project deliverables. As a Consultant, a key function of your role will be to support the development and creation of quality deliverables that support essential project workstreams. You will gather and analyze data, identify gaps and trends, and make recommendations related to baseline performance and structure, as well as established best practices and benchmarks.
We encourage career development and hiring for the long term. As a Consultant, you will follow a clearly defined career path and continue to deepen your specialized industry knowledge and consulting skills. As you develop project management skills and leadership abilities, you will have the opportunity to progress to the Senior Consultant level.
What You Will Need:
Minimum Years of Experience: 0 years
Minimum Degree Status: Undergraduate Degree or Graduate Degree (must be enrolled in an accredited undergraduate or graduate degree program through Fall 2025 and graduate by Summer 2026)
Working knowledge of programming languages such as Python, R, and SQL.
Willingness to learn new technical skills.
Ability to work collaboratively with other data scientists and adjacent roles.
Ability to adhere to on-site work schedules in the DC metro area as directed.
Ability to work in the United States without sponsorship now or at any time in the future; students on F-1 or J-1 visas are not eligible to interview or be hired for this position.
Ability to obtain and maintain a Public Trust, Secret, or higher level of federal/government security clearance (US Citizenship is one of the requirements for security clearance).
What Would Be Nice To Have:
Degree Concentration: Technical field of study relevant to AI/ML and data science, such as Computer Science, Data Science, Machine Learning, Artificial Intelligence, Information Science, Information Technology, etc.
Previous internship or work experience
Experience developing data science, predictive models, and AI solutions using tools such as Python or R.
Experience performing data engineering and data wrangling using tools such as Python.
Experience performing data visualization using tools such as Power BI or Tableau.
Strong communication and presentation skills for both technical and non-technical audiences.
Ability to write technical process flows, diagrams, and model documentation.
What We Offer:
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.
Benefits include:
Medical, Rx, Dental & Vision Insurance
Personal and Family Sick Time & Company Paid Holidays
Position may be eligible for a discretionary variable incentive bonus
Parental Leave and Adoption Assistance
401(k) Retirement Plan
Basic Life & Supplemental Life
Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
Short-Term & Long-Term Disability
Student Loan PayDown
Tuition Reimbursement, Personal Development & Learning Opportunities
Skills Development & Certifications
Employee Referral Program
Corporate Sponsored Events & Community Outreach
Emergency Back-Up Childcare Program
Mobility Stipend
About Guidehouse
Guidehouse is an Equal Opportunity Employer-Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation.
Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco.
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at ************** or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation.
All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or ************************. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process.
If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact *************************. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties.
Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
$56k-74k yearly est. 1d ago
Senior Data Engineer
Toorak Capital Partners
Data engineer job in Tampa, FL
Company:
Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business-purpose residential, multifamily, and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.
Summary:
The Lead Data Engineer will develop and implement high-performance, scalable data solutions to support Toorak's Data Strategy.
Lead Data architecture for Toorak Capital.
Lead efforts to create API framework to use data across customer facing and back office applications.
Establish consistent data standards, reference architectures, patterns, and practices across the organization for both OLTP and OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies
Lead sourcing and synthesis of Data Standardization and Semantics discovery efforts, turning insights into actionable strategies that will define the priorities for the team and rally stakeholders to the vision
Lead the data integration and mapping efforts to harmonize data.
Champion standards, guidelines, and direction for ontology, data modeling, semantics and Data Standardization in general at Toorak.
Lead strategies and design solutions for a wide variety of use cases like Data Migration (end-to-end ETL process), database optimization, and data architectural solutions for Analytics Data Projects
Required Skills:
Designing and maintaining the data models, including conceptual, logical, and physical data models
5+ years of experience using NoSQL systems such as MongoDB and DynamoDB, relational SQL database systems (PostgreSQL), and Athena
5+ years of experience on Data Pipeline development, ETL and processing of structured and unstructured data
5+ years of experience in large-scale, real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar (a minimal streaming sketch follows this list)
Proficiency in using data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms
Experience with BigQuery and SQLMesh (or a similar SQL-based cloud platform).
Knowledge of cloud platforms and technologies such as Google Cloud Platform, Amazon Web Services.
Strong SQL skills.
Experience with API development and frameworks.
Knowledge in designing solutions with Data Quality, Data Lineage, and Data Catalogs
Strong background in Data Science, Machine Learning, NLP, Text processing of large data sets
Experience in one or more of the following would be nice to have: Dataiku, DataRobot, Databricks, UiPath.
Using version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation
Ability to rapidly comprehend changes to key business processes and the impact on overall Data framework.
Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change.
Advanced analytical skills.
High level of organization and attention to detail.
Self-starter attitude with the ability to work independently.
Knowledge of legal, compliance, and regulatory issues impacting data.
Experience in finance preferred.
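To make the stream-processing requirement above concrete, here is a minimal sketch of the Spark-plus-Kafka pattern named in this list. It is illustrative only: the brokers, topic, and event schema are invented, and it assumes PySpark with the spark-sql-kafka connector package available.

```python
# Hypothetical sketch: windowed counts over a Kafka stream with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("loan-events").getOrCreate()

# Invented event schema for illustration.
schema = StructType([
    StructField("loan_id", StringType()),
    StructField("loan_type", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical brokers
       .option("subscribe", "loan-events")                   # hypothetical topic
       .load())

# Kafka delivers bytes; decode the JSON payload into typed columns.
events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Tolerate 10 minutes of late data; count loans per type in 5-minute windows.
counts = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(window(col("event_time"), "5 minutes"), col("loan_type"))
          .count())

(counts.writeStream
 .outputMode("update")
 .format("console")
 .start()
 .awaitTermination())
```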
$72k-99k yearly est. 3d ago
Data Scientist (Exploitation Specialist Level-3) - Tampa, FL
Masego
Data engineer job in Tampa, FL
Masego is an award-winning small business that specializes in GEOINT services. As a Service-Disabled Veteran-Owned Small Business (SDVOSB), we recognize and award your hard work.
Description
We are looking for a Level-3 TS/SCI-cleared Data Scientist to join our team. This role provides automation and collection support to the main team at NGA Washington, so it relies on strong communication skills and a baseline knowledge of GEOINT collection and/or automation systems like JEMA.
Minimum Required Qualifications:
At least 5 years of related GEOINT work experience, or 2 years with a relevant Bachelor's degree.
Able to work on client site 40 hours a week (very limited option for telework)
Proficient with Python
Experience with JEMA
Preferred Qualifications:
Experience with multiple intelligence types (SIGINT, OSINT, ELINT, GEOINT, MASINT, HUMINT)
Experience with Brewlytics, ArcPro and/or other geospatial data analysis tools
Knowledge of GEOINT collection and associated NGA/NRO systems
Proficiency with common programming languages including R, SQL, HTML, and JavaScript
Experience analyzing geospatially enabled data
Ability to learn new technologies and adapt to dynamic mission needs
Ability to work collaboratively with a remote team (main gov team is based out of NGA Washington)
Experience providing embedded data science/automation support to analytic teams
Security Clearance Requirement:
Active TS/SCI, with a willingness to take a polygraph test.
Salary Range: $128,600 based on ability to meet or exceed stated requirements
About Masego
Masego Inc. provides expert Geospatial Intelligence Solutions in addition to Activity Based Intelligence (ABI) and GEOINT instructional services. Masego provides expert-level Geospatial Collection Management; Full Motion Video; Human Geography; Information Technology and Cyber; Technical Writing; and ABI, Agile, and other professional training.
Masego is a Service-Disabled Veteran-Owned Small Business headquartered in Fredericksburg, Virginia. With high-level expertise and decades of experience, coupled with proven project management systems and top-notch client support, Masego enhances the performance capabilities of the Department of Defense and the intelligence community.
Pay and Benefits
We seek to provide and take care of our team members. We currently offer Medical, Dental, Vision, 401k, Generous PTO, and more!
Diversity
Masego, Inc. is an equal opportunity/equal access/affirmative action employer fully committed to achieving a diverse workforce and complies with all applicable Federal and Virginia State laws, regulations, and executive orders regarding nondiscrimination and affirmative action in its programs and activities. Masego, Inc. does not discriminate on the basis of race, color, religion, ethnic or national origin, gender, genetic information, age, disability, sexual orientation, gender identity, gender expression, and veteran's status.
$128.6k yearly 60d+ ago
Data Scientist
Calhoun International
Data engineer job in Tampa, FL
Join our team at Core One! Our mission is to be at the forefront of devising analytical, operational and technical solutions to our Nation's most complex national security challenges. In order to achieve our mission, Core One values people first! We are committed to recruiting, nurturing, and retaining top talent! We offer a competitive total compensation package that sets us apart from our competition. Core One is a team-oriented, dynamic, and growing company that values exceptional performance!
*This position requires an active TS/SCI clearance.*
Responsibilities:
* Provide data science (DS) and operations research (OR) capabilities on-site for a combatant command Operation Assessment Division.
* Design, develop, and apply a variety of data collection and decision analytics processes and applications, including the employment of mathematical, statistical, and other analytic methods.
* Identify effective, efficient, and innovative technical solutions for meeting Division data and automation requirements, including potential artificial intelligence (AI) and machine learning (ML) solutions.
* Develop automated applications, data visualizations, information displays, decision briefings, and analytic papers, and facilitate senior leadership decisions with analytic products.
* Identify and develop data stream interfaces for authoritative data sources to support assessments and risk analysis.
* Integrate Division functions and products into the Command and Control of the Information Environment (C2IE) system, MAVEN Smart Systems, and/or Advana.
* Build digital solutions using programming applications (e.g., R, R/Shiny, Python) to digitize and partially or fully automate data collection, analysis, and staff processes while accelerating the rate at which the Division can execute tasks.
* Develop and lead small teams in the development of real-time/near real-time data visualization and analysis methodologies and analytic tools.
* Participate in client operational planning processes in support of Joint planning.
* Support Knowledge Management and Information processes requirements.
Basic Qualifications:
* Possess a Master's Degree, preferably in a related technical field, such as operations research, data science, math, engineering, science, or computer science.
* 5-12 years of combined professional DS/OR experience, with a minimum of 5 years of related DS/OR experience at a Combatant Command staff, Joint or Combined Command Headquarters, or Defense Department equivalent.
* High levels of proficiency using the following applications: R, R-Shiny, Python, Python-Shiny, SQL/PostgreSQL, Microsoft Office applications, and Microsoft SharePoint
* Functional knowledge of MAVEN Smart Systems, C2IE, Advana, AI, ML, Git, and Large Language Models
* Top Secret (TS)/Secure Compartmented Information (SCI) clearance is required. Applicants are subject to a security investigation and need to meet eligibility requirements for access to classified information.
Additional Qualifications:
* Ability to work independently or as the leader or member of a small team in conducting analysis in support of assessments with high visibility, unusual urgency or program criticality; requiring a variety of OR and DS techniques and tools.
* Possession of excellent oral and written communication skills with the ability to communicate, prepare correspondence, and make formal presentations at the 4-Star General Officer/Flag Officer level.
* Ability to develop and support new analytic capabilities as requirements evolve within the command for assessments.
* Knowledge of Joint Warfighting and Combatant Command functions.
Security Clearance:
* Active TS/SCI clearance is required
Core One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
About the Role
Culmen International is hiring Expert Exploitation Specialist/Data Scientists to provide support on-site at the National Geospatial-Intelligence Agency (NGA) in Tampa, FL.
The National Geospatial-Intelligence Agency (NGA) expects to deliver AOS Metadata Cataloging and Management Services to enhance product and asset management of content, enabling rapid creation of discoverable, modular, web-enabled, and visually enriched Geospatial Intelligence (GEOINT) products for intelligence producers in NGA and across the National System for Geospatial-Intelligence (NSG).
TALENT PIPELINE - Qualified applicants will be contacted as soon as funding for this position is secured.
What You'll Do in Your New Role
The Data Scientist will coordinate with our clients to understand questions and issues involving the client's datasets, then determine the best method and approach to create data-driven solutions within program guidelines. This position will be relied upon as a Subject Matter Expert (SME) and will be expected to lead or assist in developing automated processes and workflows, architect data science solutions, conduct analysis with available tools, remain adaptable to mission requirements, and identify patterns to help solve some of the complex problems that face the DoD and Intelligence Community (IC).
Work with large structured/unstructured data in a modeling and analytical environment to define and create streamlined processes for evaluating unique datasets and solving challenging intelligence issues
Lead and participate in the design of solutions and refinement of pre-existing processes
Work with Customer Stakeholders, Program Managers, and Product Owners to translate road map features into components/tasks, estimate timelines, identify resources, suggest solutions, and recognize possible risks
Use exploratory data analysis techniques to identify meaningful relationships, patterns, or trends from complex data
Combine applied mathematics, programming skills, analytical techniques, and data to provide impactful insights for decision makers
Research and implement optimization models, strategies, and methods to inform data management activities and analysis
Apply big data analytic tools to large, diverse sets of data to deliver impactful insights and assessments
Conduct peer reviews to improve quality of workflows, procedures, and methodologies
Help build high-performing teams; mentor team members providing development opportunities to increase their technical skills and knowledge
Required Qualifications
TS/SCI clearance with CI polygraph eligibility
Minimum of 18 years combined experience (A combination of years of experience & professional certifications/trainings can be used in lieu of a degree)
BS in a related field with graduate-level work
Expert proficiency in Python and other programming languages applicable to automation development.
Demonstrated experience designing and implementing workflow automation systems
Advanced experience with ETL (Extract, Transform, Load) processes for geospatial data (see the sketch after this list)
Expertise in integrating disparate systems through API development and implementation
Experience developing and deploying enterprise-scale automation solutions
Knowledge of NGA's Foundation GEOINT products, data types, and delivery methods
Demonstrated experience with database design, implementation, and optimization
Experience with digital media generation systems and automated content delivery platforms
Ability to analyze existing workflows and develop technical solutions to streamline processes
Knowledge of DLA systems and interfaces, particularly MEBS and WebFLIS
Expertise in data quality assurance and validation methodologies
Experience with geospatial data processing, transformation, and delivery automation
Proficiency with ArcGIS tools, GEODEC and ACCORD software systems
Understanding of cartographic principles and standards for CADRG/ECRG products
Strong analytical skills for identifying workflow inefficiencies and implementing solutions
Experience writing technical documentation including SOPs, CONOPS, and system design
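As flagged in the ETL bullet above, here is a small example of one transform step a geospatial ETL flow might include: converting tabular point records into GeoJSON features. The field names and coordinates are invented for illustration; only the Python standard library is used.

```python
# Hypothetical sketch: tabular point records -> GeoJSON FeatureCollection.
import json

records = [
    {"site": "A", "lon": -82.46, "lat": 27.95},
    {"site": "B", "lon": -82.50, "lat": 27.97},
]

features = [
    {
        "type": "Feature",
        # GeoJSON orders coordinates as [longitude, latitude].
        "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
        "properties": {"site": r["site"]},
    }
    for r in records
]

print(json.dumps({"type": "FeatureCollection", "features": features}, indent=2))
```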
Desired Qualifications
Certification(s) in relevant automation technologies or programming languages
Experience with DevOps practices and CI/CD implementation
Knowledge of cloud-based automation solutions and their implementation in government environments
Experience with machine learning applications for GEOINT workflow optimization
Expertise in data analytics and visualization for workflow performance metrics
Understanding of NGA's enterprise architecture and integration points
Experience implementing RPA (Robotic Process Automation) solutions
Knowledge of secure coding practices and cybersecurity principles
Demonstrated expertise in digital transformation initiatives
Experience mentoring junior staff in automation techniques and best practices
Background in agile development methodologies
Understanding of human-centered design principles for workflow optimization
About the Company
Culmen International is committed to enhancing international safety and security, strengthening homeland defense, advancing humanitarian missions, and optimizing government operations. With experience in over 150 countries, Culmen supports our clients to accomplish critical missions in challenging environments.
Exceptional Medical/Dental/Vision Insurance; premiums for employees are 100% paid by Culmen, and dependent coverage is available at a nominal rate (including same or opposite sex domestic partners)
401k - Vested immediately and 4% match
Life insurance and disability paid by the company
Supplemental Insurance Available
Opportunities for Training and Continuing Education
12 Paid Holidays
To learn more about Culmen International, please visit **************
At Culmen International, we are committed to creating and sustaining a workplace that upholds the principles of Equal Employment Opportunity (EEO). We believe in the importance of fair treatment and equal access to opportunities for all employees and applicants. Our commitment to these principles is unwavering across all our operations worldwide.
$62k-90k yearly est. 35d ago
ETL Architect
HealthPlan Services
Data engineer job in Tampa, FL
HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries.
Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.
Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data, and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling
Preferred Qualifications:
12+ years of work experience as a Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture, data modeling for BI and analytics data marts, and implementing and supporting production environments
10+ years of experience designing, building, and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft, and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve the greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty and integrity
Conscientious of the Enterprise Data Warehouse release management process; conducts operations readiness and environment compatibility reviews of any changes prior to deployment, with strong sensitivity around impact and SLAs
Experience with data modeling tools a plus.
Expert in data warehousing methodologies and best practices required.
Ability to initiate and follow through on complex projects of both short- and long-term duration required.
Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed (required).
Proactive recommendations for improving the performance and operability of the data warehouse and reporting environment.
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements
Education Requirements:
Bachelor's or equivalent degree in a business, technical, or related field
Additional Information
All your information will be kept confidential according to EEO guidelines.
$84k-105k yearly est. 60d+ ago
Principal Data Scientist
Maximus
Data engineer job in Tampa, FL
Description & Requirements
We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work.)
This position requires occasional travel to the DC area for client meetings.
U.S. citizenship is required for this position due to government contract requirements.
Essential Duties and Responsibilities:
- Make deep dives into the data, pulling out objective insights for business leaders.
- Initiate, craft, and lead advanced analyses of operational data.
- Provide a strong voice for the importance of data-driven decision making.
- Provide expertise to others in data wrangling and analysis.
- Convert complex data into visually appealing presentations.
- Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners.
- Understand the importance of automation and look to implement and initiate automated solutions where appropriate.
- Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects.
- Utilize various languages for scripting and write SQL queries. Serve as the primary point of contact for data and analytical usage across multiple projects.
- Guide operational partners on product performance and solution improvement/maturity options.
- Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization.
- Learn new skills in advanced analytics/AI/ML tools, techniques, and languages.
- Mentor more junior data analysts/data scientists as needed.
- Apply a strategic approach to lead projects from start to finish.
Job-Specific Minimum Requirements:
- Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
- Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning.
- Contribute to the development of mathematically rigorous process improvement procedures.
- Maintain current knowledge and evaluation of the AI technology landscape and emerging developments and their applicability for use in production/operational environments.
Minimum Requirements
- Bachelor's degree in related field required.
- 10-12 years of relevant professional experience required.
Job-Specific Minimum Requirements:
- 10+ years of relevant Software Development + AI / ML / DS experience.
- Professional Programming experience (e.g. Python, R, etc.).
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming.
- Experience with Linux.
- Experience with Statistics.
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.
Preferred Skills and Qualifications:
- Master's or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
Background Investigations:
- IRS MBI - Eligibility
#techjobs #VeteransPage #LI-Remote
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Accommodations
Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.
Minimum Salary: $156,740.00
Maximum Salary: $234,960.00
$64k-92k yearly est. 6d ago
Data Scientist
Core One
Data engineer job in Tampa, FL
Join our team at Core One! Our mission is to be at the forefront of devising analytical, operational and technical solutions to our Nation's most complex national security challenges. In order to achieve our mission, Core One values people first! We are committed to recruiting, nurturing, and retaining top talent! We offer a competitive total compensation package that sets us apart from our competition. Core One is a team-oriented, dynamic, and growing company that values exceptional performance!
*This position requires an active TS/SCI clearance.*
Responsibilities:
Provide data science (DS) and operations research (OR) capabilities on-site for a combatant command Operation Assessment Division.
Design, develop, and apply a variety of data collection and decision analytics processes and applications, including the employment of mathematical, statistical, and other analytic methods.
Identify effective, efficient, and innovative technical solutions for meeting Division data and automation requirements, including potential artificial intelligence (AI) and machine learning (ML) solutions.
Develop automated applications, data visualizations, information displays, decision briefings, and analytic papers, and facilitate senior leadership decisions with analytic products.
Identify and develop data stream interfaces for authoritative data sources to support assessments and risk analysis.
Integrate Division functions and products into the Command and Control of the Information Environment (C2IE) system, MAVEN Smart Systems, and/or Advana.
Build digital solutions using programming applications (e.g., R, R/Shiny, Python) to digitize and partially or fully automate data collection, analysis, and staff processes while accelerating the rate at which the Division can execute tasks.
Develop and lead small teams in the development of real-time/near real-time data visualization and analysis methodologies and analytic tools.
Participate in client operational planning processes in support of Joint planning.
Support Knowledge Management and Information processes requirements.
Basic Qualifications:
Possess a Master's Degree, preferably in a related technical field, such as operations research, data science, math, engineering, science, or computer science.
5-12 years of combined professional DS/OR experience, with a minimum of 5 years of related DS/OR experience at a Combatant Command staff, Joint or Combined Command Headquarters, or Defense Department equivalent.
High levels of proficiency using the following applications: R, R-Shiny, Python, Python-Shiny, SQL/PostgreSQL, Microsoft Office applications, and Microsoft SharePoint
Functional knowledge of MAVEN Smart Systems, C2IE, Advana, AI, ML, Git, and Large Language Models
Top Secret (TS)/Secure Compartmented Information (SCI) clearance is required. Applicants are subject to a security investigation and need to meet eligibility requirements for access to classified information.
Additional Qualifications:
Ability to work independently or as the leader or member of a small team in conducting analysis in support of assessments with high visibility, unusual urgency or program criticality; requiring a variety of OR and DS techniques and tools.
Possession of excellent oral and written communication skills with the ability to communicate, prepare correspondence, and make formal presentations at the 4-Star General Officer/Flag Officer level.
Ability to develop and support new analytic capabilities as requirements evolve within the command for assessments.
Knowledge of Joint Warfighting and Combatant Command functions.
Security Clearance:
Active TS/SCI clearance is required
Core One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
$63k-91k yearly est. 41d ago
Data Scientist
Redhorse Corporation
Data engineer job in Tampa, FL
About the Organization
Now is a great time to join Redhorse Corporation. We are a solution-driven company delivering data insights and technology solutions to customers with missions critical to U.S. national interests. We're looking for thoughtful, skilled professionals who thrive as trusted partners building technology-agnostic solutions and want to apply their talents supporting customers with difficult and important mission sets.
About the Role
Redhorse Corporation is seeking a highly skilled Data Scientist to join our team supporting the United States Central Command (USCENTCOM) Directorate of Logistics (CCJ4). You will play a critical role in accelerating the delivery of AI-enabled capabilities within the Joint Logistics Common Operating Picture (JLOGCOP), directly impacting USCENTCOM's ability to promote international cooperation, respond to crises, deter aggression, and build resilient logistics capabilities for our partners. This is a high-impact role contributing to national security and global stability. You will be working on a custom build of AI/ML capabilities into the JLOGCOP, leveraging dozens of data feeds to enhance decision-making and accelerate planning for USCENTCOM missions.
Key Responsibilities
Communicate with the client regularly regarding enterprise values and project direction.
Find the intersection between business value and achievable technical work.
Articulate and translate business questions into technical solutions using available DoD data.
Explore datasets to find meaningful entities and relationships.
Create data ingestion and cleaning pipelines (a minimal cleaning sketch follows this list).
Develop applications and effective visualizations to communicate insights.
Serve as an ambassador for executive DoD leadership to sponsor data literacy growth across the enterprise.
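As flagged in the ingestion-and-cleaning bullet above, a minimal sketch of such a step might look like the following. The column names and sample data are invented, and it assumes pandas is installed.

```python
# Hypothetical sketch: normalize headers, drop duplicates and unusable rows.
import pandas as pd

def clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Normalize column names to snake_case.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    # Coerce dates; unparseable values become NaT and are dropped below.
    df["shipment_date"] = pd.to_datetime(df["shipment_date"], errors="coerce")
    return df.dropna(subset=["shipment_id", "shipment_date"])

raw = pd.DataFrame({
    "Shipment ID": ["A1", "A1", "B2", None],
    "Shipment Date": ["2024-01-03", "2024-01-03", "not a date", "2024-01-05"],
})
print(clean(raw))  # only the A1 / 2024-01-03 row survives
```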
Required Experience/Clearance
US citizen with a Secret US government clearance. Applicants who are not US Citizens and who do not have a current and active Secret security clearance will not be considered for this role.
Ability to work independently to recommend solutions to the client and as part of a team to accomplish tasks.
Experience with functional programming (Python, R, Scala) and database languages (SQL).
Familiarity using AI/ML tools to support logistics use cases.
Ability to discern which statistical approaches are appropriate for different contexts.
Experience communicating key findings with visualizations.
8+ years of professional experience.
Master's degree in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Desired Experience
Experience with cloud-based development platforms.
Experience with large-scale data processing tools.
Experience with data visualization tools.
Ph.D. in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Equal Opportunity Employer/Veterans/Disabled
Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this site as a result of your disability. You can request reasonable accommodations by contacting Talent Acquisition at ***********************************.
Redhorse Corporation shall, in its discretion, modify or adjust the position to meet Redhorse's changing needs. This job description is not a contract and may be adjusted as deemed appropriate in Redhorse's sole discretion.
$63k-91k yearly est. 60d+ ago
Data Governance & Metadata Scientist
NV5
Data engineer job in Saint Petersburg, FL
NV5 Geospatial is actively recruiting a Data Governance & Metadata Scientist. Strong capabilities in developing, maintaining, and optimizing an outward-facing data catalog integrating geospatial and research layers are required. The Data Governance & Metadata Scientist will be based remotely supporting US Southern Command. US citizenship, along with the ability to successfully pass a basic background check for access to US military bases, is required for employment. While no clearance is required, a Secret or higher clearance is preferred.
Work Setting:
This role offers flexibility in location, with the option to work from any NV5 Regional Office or remotely from home.
Potential travel up to 5-15% of the time
NV5 is a global technology solutions and consulting services company with a workforce of over 4,500 professionals in more than 100 offices worldwide. NV5's continued growth has been spurred through strategic investments in firms with unique capabilities to help current and future customers solve the world's toughest problems. The NV5 family brings together talent across a wide range of markets and fields, including Professional Engineers, Professional Land Surveyors, Architects, Photogrammetrists, GIS Professionals, Software Developers, IT, Project Management Professionals, and more.
At NV5 Geospatial, we are a collaboration of intelligent, innovative thinkers who care for each other, our communities, and the environment. We value both heart and head, the diversity of our people, and their experiences because that is how we continue to grow as leaders in our industry and expand our individual and collective potential.
Responsibilities
Implement data lineage tracking and metadata synchronization to ensure consistency across Databricks, Kubernetes, and research dashboards.
Support ontology-driven decision support systems, mapping structured and unstructured datasets to enhance data interoperability.
Develop automated metadata validation and quality control mechanisms, ensuring research datasets maintain compliance with DoD governance frameworks (see the sketch after this list).
Integrate metadata into platforms and implement tagging policies consistent with program standards.
Utilize GitLab pipelines and CI/CD tools for publishing and indexing routines.
Publish or embed outputs in approved web services for research dashboards intended for external access.
Utilize Azure-native indexing services such as Cognitive Search to implement federated metadata and research product discovery pipelines.
Ensure security boundary compliance.
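As flagged in the validation bullet above, the core of an automated metadata check can be small. This is a toy sketch: the required fields, accepted markings, and catalog records are invented, not drawn from any DoD framework.

```python
# Hypothetical sketch: flag catalog records with missing or malformed metadata.
REQUIRED_FIELDS = {"title", "source", "classification", "last_updated"}
ACCEPTED_MARKINGS = {"UNCLASSIFIED", "SECRET", "TOP SECRET"}  # illustrative only

def validate(record: dict) -> list:
    """Return a list of human-readable problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if record.get("classification") not in ACCEPTED_MARKINGS:
        problems.append("unrecognized classification marking")
    return problems

catalog = [
    {"title": "basemap", "source": "survey", "classification": "UNCLASSIFIED",
     "last_updated": "2024-06-01"},
    {"title": "untagged layer", "source": "unknown"},
]
for rec in catalog:
    print(rec.get("title"), "->", validate(rec) or "ok")
```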
Qualifications
Minimum Requirements:
Bachelor's degree in Computer Science, Data Engineering, Geographic Information Systems (GIS), or a related field, or five (5) years of equivalent experience in data engineering, full-stack development, and metadata-driven data cataloging.
Demonstrated experience in developing interactive data portals, implementing API-driven search and data exchange, and integrating geospatial data layers into web applications.
Experience working with Databricks, Esri ArcGIS Feature Services, OpenLineage, and metadata management solutions.
Software development skills in Python, JavaScript (React, Angular, Vue), SQL, and RESTful API design.
Proficiency in cloud environments such as AWS, Azure, or Google Cloud, and implementing scalable, data-driven applications.
Ability to manage and prioritize complex project tasks.
Preferred:
Microsoft Certified Azure Data Engineer, AWS Certified Data Analytics Specialty, or Esri Web GIS Developer Certification.
Portuguese or Spanish language skills.
Experience with government IT programs and environments.
Clearance Requirement:
None; Active Secret or TS/SCI preferred
Please be aware that some of our positions may require the ability to obtain security clearance. Security clearances may only be granted to U.S. citizens. In addition, applicants who accept a conditional offer of employment may be subject to government security investigation(s) and must meet eligibility requirements for access to classified information.
Employment is contingent upon successful completion of a background check and drug screening.
NV5 offers a competitive compensation and benefits package including medical, dental, life insurance, FTO, 401(k) and professional development/advancement opportunities.
NV5 provides equal employment opportunities (EEO) to all applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws. NV5 complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including, but not limited to, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
#LI-Remote
$63k-92k yearly est. 13d ago
Data Engineer
Clarity Innovations
Data engineer job in Tampa, FL
Clarity Innovations is a trusted national security partner, dedicated to safeguarding our nation's interests and delivering innovative solutions that empower the Intelligence Community (IC) and Department of Defense (DoD) to transform data into actionable intelligence, ensuring mission success in an evolving world.
Our mission-first software and data engineering platform modernizes data operations, utilizing advanced workflows, CI/CD, and secure DevSecOps practices. We focus on challenges in Information Warfare, Cyber Operations, Operational Security, and Data Structuring, enabling end-to-end solutions that drive operational impact.
We are committed to delivering cutting-edge tools and capabilities that address the most complex national security challenges, empowering our partners to stay ahead of emerging threats and ensuring the success of their critical missions. At Clarity, we are people-focused and set on being a destination employer for top talent, offering an environment where innovation thrives, careers grow, and individuals are valued. Join us as we continue to lead innovation and tackle the most pressing challenges in national security.
Role
This Data Engineer role supports the U.S. Special Operations Command (SOCOM) at MacDill Air Force Base, Florida. Data Engineers in this program support the command through a combination of data governance and data modernization efforts. Data Engineers will be embedded within the SOCOM HQ to provide subject matter expertise and technical capability for integrating complex data used by the command. Data Engineers in this program support SOCOM's initiative to be more capable and effective when leveraging its data resources.
Responsibilities
A data engineer has a deep understanding of performance optimization and data pipelining. In addition to the baseline skills of a data analyst, data engineers can make raw data more useful for the enterprise. Data engineers can create and integrate application programming interfaces (APIs). Their technical skills generally include multiple programming languages and a deep knowledge of SQL database design.
The data engineer role requires more in-depth knowledge of programming for integrating complex models and using advanced software library frameworks to distribute large, clustered data sets. Data engineers collect and arrange data in a form that is useful for analytics. A basic knowledge of machine learning is also required to build efficient and accurate data pipelines that meet the needs of downstream users, such as data scientists, who create the models and analytics that produce insight.
The Data Engineer shall perform the following tasks:
Developing, maintaining, and testing infrastructures for data generation to transform data from various structured and unstructured data sources.
Develop complex queries to ensure accessibility while optimizing the performance of NoSQL and/or big data infrastructure. Create and maintain optimal data pipeline architecture.
Build and maintain the infrastructure to support extraction, transformation, and loading (ETL) of data from a wide variety of data sources (a minimal ETL sketch follows this list). Extract data from multiple data sources, relational SQL and NoSQL databases, and other platform APIs, for data ingestion and integration.
Configure and manage data analytic frameworks and pipelines using databases and tools such as NoSQL, SQL, HDInsight, MongoDB, Cassandra, Neo4j, GraphDB, OrientDB, Spark, Hadoop, Kafka, Hive, and Pig.
Apply distributed systems concepts and principles such as consistency and availability, liveness and safety, durability, reliability, fault-tolerance, consensus algorithms.
Administer cloud computing and CI/CD pipelines, including Azure, Google, and Amazon Web Services (AWS).
Coordinate with stakeholders, including product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs.
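As flagged in the ETL task above, a minimal extract-transform-load job of the kind described might look like the following: pull records from a platform API, normalize them, and upsert them into a relational table. The endpoint, table, and connection details are all hypothetical, and it assumes the requests and psycopg2 packages.

```python
# Hypothetical sketch: API -> transform -> PostgreSQL upsert.
import requests
import psycopg2

API_URL = "https://example.internal/api/v1/records"  # invented endpoint

def extract() -> list:
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()

def transform(records: list) -> list:
    # Keep only rows with an id; normalize names for downstream joins.
    return [
        (r["id"], r["name"].strip().lower(), r.get("score"))
        for r in records
        if r.get("id") is not None
    ]

def load(rows: list) -> None:
    conn = psycopg2.connect(dbname="analytics", user="etl", host="localhost")  # invented
    with conn, conn.cursor() as cur:  # commits on success, rolls back on error
        cur.executemany(
            "INSERT INTO staged_records (id, name, score) VALUES (%s, %s, %s) "
            "ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name, score = EXCLUDED.score",
            rows,
        )
    conn.close()

if __name__ == "__main__":
    load(transform(extract()))
```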
Requirements
A minimum of 1 year of experience is required.
Bachelor's degree in a STEM field with preference towards Computer Science and Software Engineering.
Verifiable work experience with data structures, database management, distributed computing, and API-driven architectures using SQL and NoSQL engines.
A Certified Data Management Professional certification is preferred.
Proficient in modeling frameworks like the Unified Modeling Language (UML), Agile development, and Git operations.
A Top Secret clearance with SCI eligibility is required.
Preferred Qualifications
Familiarity with PowerBI, Foundry, Databricks.
Familiarity with DoD.
We are an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
$72k-99k yearly est. 3d ago
Full Stack Cloud & Data Engineer
Shyftoff
Data engineer job in Tampa, FL
Company:
ShyftOff's flexible, on-demand contact center platform matches businesses with top CX talent, scaling their contact center operations to meet demand. We're revolutionizing the traditional model by staffing on-demand with top US talent, minus the HR overhead.
Position Summary:
We're hiring our next engineer - someone who's obsessed with data, passionate about systems that 1. work, 2. are performant, and 3. are clean (in that order!), and eager to design, build, and optimize data pipelines that power product growth.
In this role, you'll own the entire data product ecosystem - from how data flows through the platform to how it's surfaced for decision-making. You'll play a key role in designing data systems, enabling data-informed insights, and ensuring our platform scales efficiently.
This role is onsite in Tampa, FL
Duties and Responsibilities:
Build and maintain scalable data models, cloud infrastructure, and E2E pipelines that power ShyftOff's platform.
Design, implement, and optimize workflows using Airflow for ETL/ELT processes (a minimal DAG sketch follows this list).
Develop and maintain PostgreSQL databases.
Write clean, maintainable Python code (with a focus on Pandas for data manipulation).
Partner cross-functionally with Sales, Marketing, and Operations to drive data-informed decisions.
Manage integrations between internal systems, ensuring smooth data flow across the business.
Maintain, monitor, and troubleshoot production data systems hosted in AWS (RDS, S3, ECS, Lambda) and GCP (BigQuery, Looker Studio).
Own the data lifecycle: from schema design and ingestion through transformation, validation, and reporting.
Champion reliability, scalability, monitoring, and performance across the data platform.
Contribute ideas, explore new tools/technologies, and take pride in building something foundational.
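As flagged in the Airflow duty above, a minimal DAG for an hourly ETL might look like this sketch. Task bodies are stubs, names are invented, and it assumes Airflow 2.4+ (for the schedule parameter).

```python
# Hypothetical sketch: three-task extract -> transform -> load Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_fn(**context):
    # Stand-in for pulling rows from a source system.
    return [{"agent_id": 1, "hours": 5.5}, {"agent_id": 2, "hours": 0.0}]

def transform_fn(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["hours"] > 0]  # drop empty shifts

def load_fn(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")  # stand-in for a warehouse load

with DAG(
    dag_id="agent_hours_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_fn)
    transform = PythonOperator(task_id="transform", python_callable=transform_fn)
    load = PythonOperator(task_id="load", python_callable=load_fn)

    extract >> transform >> load
```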
Experience and Qualifications:
Essential:
4+ years of professional experience in Data Engineering or a related backend engineering field.
Strong command of PostgreSQL (schema design, optimization, complex queries).
Proficient in Python and experience creating and maintaining Airflow DAGs.
Hands-on experience with AWS Cloud Services (S3, ECS, RDS, DynamoDB).
Proven ability to design, build, troubleshoot, and maintain robust ETL/ELT pipelines.
Strong understanding of software engineering principles, data modeling, and distributed systems.
Excellent communication skills and ability to collaborate effectively with non-technical teams.
Thrives in fast-paced startup environments where speed and ownership matter.
Desirable:
AWS Certification (Solutions Architect, Data Engineer, or equivalent).
Prior startup experience or experience as an early technical team member.
Strong GitHub presence or portfolio of open-source contributions.
What Sets You Apart:
You're a true data enthusiast: you love clean systems, structured databases, and elegant architecture.
You balance vision and execution: you can architect scalable systems and roll up your sleeves to build them.
You like to work iteratively & quickly on new concepts.
You care deeply about shipping reliable, high-impact code that drives business value.
You move fast and communicate clearly, helping the team stay aligned and productive.
You thrive in a collaborative environment and believe great systems are built through shared context and trust.
You're excited about the opportunity to help shape the future of ShyftOff's data ecosystem.
Benefits:
Competitive salary and equity
Health and wellness benefits
Professional development opportunities
High-impact role with visibility across the company
The chance to help shape the technical culture and data infrastructure of a growing startup
Equal Opportunity Employer:
ShyftOff Corp values diversity and does not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
$72k-99k yearly est. 46d ago
Data Engineer
Centurion Consulting Group
Data engineer job in Tampa, FL
Job Description
Centurion is looking for a Data Engineer with an active TS/SCI to work on-site in Tampa, FL. As a Data Engineer, you will play a critical role in supporting data scientists and analysts by designing, building, and optimizing data pipelines tailored for large-scale text analytics and AI applications. You will collaborate closely with business stakeholders, IT experts, and subject-matter experts to deliver robust data engineering solutions that enable advanced natural language processing (NLP), generative AI, and agentic AI capabilities.
Responsibilities include:
Develop and design data pipelines to support an end-to-end solution.
Develop and maintain artifacts (e.g., schemas, data dictionaries, and transforms) related to ETL processes.
Manage production data within multiple datasets, ensuring fault tolerance and redundancy.
Design and develop robust and functional dataflows to support raw data and expected data.
Collaborate with the rest of the data engineering team to design and launch new features, including coordination and documentation of dataflows, capabilities, etc.
Design and develop databases to support multiple user groups with various levels of access to raw and processed data.
Apply knowledge of the SDLC to bring applications from proof of concept to production, including on the TS network.
The Team:
Our AI & Data offering provides a full spectrum of solutions for designing, developing, and operating cutting-edge Data and AI platforms, products, insights, and services. Our offerings help clients innovate, enhance and operate their data, AI, and analytics capabilities, ensuring they can mature and scale effectively.
Required Qualifications:
Bachelor's degree required
Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
Active TS/SCI security clearance required
4+ years of experience working with software platforms and services, such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
4+ years of experience with datastores such as MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, Redis, and graph databases such as Neo4j or Memgraph.
Ability to work on-site in Tampa, FL 5 days per week
Preferred Qualifications:
Familiar with Linux/Unix server environments.
Familiar with common data structures needed to support machine learning packages such as scikit-learn, NLTK, spaCy, and others.
Familiar with, or eager to become familiar with, the data structures needed to support generative AI pipelines, such as vector databases, NER, and RAG.
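As a toy illustration of the retrieval step behind a RAG pipeline: embed documents, then return the nearest ones to a query by cosine similarity. The embed() helper below fabricates vectors (so the ranking here is arbitrary); a real pipeline would use an embedding model and a vector database, but the mechanics are the same.

```python
import hashlib

import numpy as np


def embed(texts: list[str], dim: int = 64) -> np.ndarray:
    # Fake embeddings, deterministically seeded from an MD5 of the text;
    # purely for demonstrating the retrieval mechanics.
    seeds = [int(hashlib.md5(t.encode()).hexdigest(), 16) % (2**32) for t in texts]
    return np.stack([np.random.default_rng(s).standard_normal(dim) for s in seeds])


def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
    doc_vecs = embed(docs)
    q = embed([query])[0]
    # Cosine similarity = dot product over the product of vector norms.
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]


documents = ["intel report A", "logistics memo B", "field summary C"]
print(top_k("supply routes", documents))
```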
$72k-99k yearly est. 60d+ ago
Marketing Data Engineer
American Veterinary Group
Data engineer job in Tampa, FL
Job Description: We're looking for a Marketing Data Engineer to help bridge the gap between marketing strategy and data execution. In this role, you'll design, build, and maintain the data pipelines and models that power marketing analytics, attribution, and reporting. Your work will enable marketing teams to make faster, smarter, and more measurable decisions. This is a hands-on role suited for someone who enjoys working with complex data sources, simplifying messy data, and turning it into trusted, actionable insights.
Key Responsibilities
Design and maintain scalable data pipelines for marketing data sources (e.g., CRM, ad platforms, web analytics, email tools)
Integrate data from platforms such as Braze, Google Analytics, HubSpot, Salesforce, Google Ads, Meta, LinkedIn, and similar tools
Build and maintain end-to-end marketing data pipelines into Snowflake
Model and transform marketing data using dbt, following Analytics Engineering best practices
Create and maintain analytics-ready marts to support marketing performance, attribution, and funnel analysis
Partner with Marketing and FP&A to define KPIs such as CAC, LTV, ROAS, pipeline conversion, and attribution logic (see the sketch after this list)
Support and optimize Sigma Computing dashboards, enabling true self-service analytics
Implement testing, documentation, and freshness monitoring within dbt
Ensure data reliability, scalability, and performance within Snowflake
Standardize naming conventions, metrics definitions, and transformation logic across the marketing domain
Reduce manual reporting by automating data transformations and refresh logic
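To ground the KPI item flagged above, here is a minimal Pandas sketch of CAC and ROAS computed per channel; in this role the same logic would live in dbt models over Snowflake, and all column names and figures are illustrative only.

```python
import pandas as pd

spend = pd.DataFrame({
    "channel": ["google_ads", "meta", "linkedin"],
    "ad_spend": [12000.0, 8000.0, 5000.0],
})
outcomes = pd.DataFrame({
    "channel": ["google_ads", "meta", "linkedin"],
    "new_customers": [120, 64, 25],
    "attributed_revenue": [48000.0, 20000.0, 9000.0],
})

kpis = spend.merge(outcomes, on="channel")
kpis["cac"] = kpis["ad_spend"] / kpis["new_customers"]        # cost to acquire a customer
kpis["roas"] = kpis["attributed_revenue"] / kpis["ad_spend"]  # return on ad spend
print(kpis[["channel", "cac", "roas"]])
```

Encoding the definitions once, in version-controlled transformation code, is what standardizes metric logic across the marketing domain.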
Skills, Knowledge & Expertise
Required:
5+ years of experience in data engineering, analytics engineering, or a similar role
Strong SQL skills and experience working with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift)
Experience building ELT/ETL pipelines using tools such as Fivetran, OpenFlow, Airbyte, dbt, or similar
Familiarity with marketing platforms and metrics (CAC, LTV, funnel conversion, attribution models, ROAS)
Experience supporting BI tools (Sigma, Looker, Tableau, Power BI, etc.)
Strong problem-solving skills and attention to data accuracy
Ability to communicate technical concepts to non-technical stakeholders
A Plus:
Experience with dbt and analytics engineering best practices
Python experience for data transformations or automation
Understanding of multi-touch attribution and marketing measurement frameworks
Experience working in a fast-growing or data-maturing organization
Job Benefits
Here's what you can expect at AVG:
Competitive salary
Flexible work schedules to support work-life balance
Comprehensive group insurance including medical, dental, and vision
Paid Family Leave and Paid Parental Leave (including maternity & paternity)
Paid Time Off
One Life Balance Day for added flexibility
Amazing Pet Discounts to keep your furry family members covered
Retirement Savings Plan with employer match to help build your future
Employee Assistance Program (EAP) for mental health and wellbeing support
Career development resources may include:
Learning and development programs
Tuition reimbursement
A vibrant, inclusive culture
At AVG, you're more than an employee; you are part of a community that values your whole self. We're here to help you grow, thrive, and feel supported every step of the way.
If you're passionate about data, thrive in a collaborative environment, and are ready to make a difference, we'd love to hear from you.
$72k-99k yearly est. 27d ago
Hadoop Admin / Developer
US Tech Solutions 4.4
Data engineer job in Tampa, FL
US Tech Solutions is a global staff augmentation firm providing a wide-range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit our website ************************
We are constantly on the lookout for professionals who can fulfill the staffing needs of our clients, set the correct expectations, and thus become accelerators in the mutual growth of both the individual and the organization.
Keeping the same intent in mind, we would like you to consider the job opening with US Tech Solutions that fits your expertise and skillset.
Job Description
Position: Hadoop Admin / Developer
Duration: 6+ Months / Contract-to-Hire / Full-time
Location: Tampa, FL
Interview: Phone & F2F/Skype
Qualifications
• Advanced knowledge in administration of Hadoop components including HDFS, MapReduce, Hive, YARN, Tez, Flume
• Advanced skills in performance tuning and troubleshooting Hadoop jobs
• Intermediate skills in data ingestion to/from Hadoop
• Knowledge of Greenplum, Informatica, Tableau, SAS desired
• Knowledge in Java desired
Additional Information
Chandra Kumar
************
Chandra at ustechsolutionsinc com
$85k-111k yearly est. 60d+ ago
Data Engineer - Machine Learning (Marketing Analytics)
PODS 4.0
Data engineer job in Clearwater, FL
At PODS (Portable On Demand Storage), we're not just a leader in the moving and storage industry; we redefined it. Since 1998, we've empowered customers across the U.S. and Canada with flexible, portable solutions that put customers in control of their move. Whether it's a local transition or a cross-country journey, our personalized service makes any experience smoother, smarter, and more human.
We're driven by a culture of trust, authenticity, and continuous improvement. Our team is the heartbeat of our success, and together we strive to make each day better than the last. If you're looking for a place where your work matters, your ideas are valued, and your growth is supported, PODS is your next destination.
JOB SUMMARY
The Data Engineer - Machine Learning is responsible for scaling a modern data & AI stack to drive revenue growth, improve customer satisfaction, and optimize resource utilization. As an ML Data Engineer, you will bridge data engineering and ML engineering: build high-quality feature pipelines in Snowflake/Snowpark and Databricks, productionize and operate batch/real-time inference, and establish MLOps/LLMOps practices so models deliver measurable business impact at scale.
Note: This role is onsite at PODS headquarters in Clearwater, FL. The working schedule is Monday through Thursday onsite, with Friday remote.
It is NOT a remote opportunity.
General Benefits & Other Compensation:
* Medical, dental, and vision insurance
* Employer-paid life insurance and disability coverage
* 401(k) retirement plan with employer match
* Paid time off (vacation, sick leave, personal days)
* Paid holidays
* Parental leave / family leave
* Bonus eligibility / incentive pay
* Professional development / training reimbursement
* Employee assistance program (EAP)
* Commuter benefits / transit subsidies (if available)
* Other fringe benefits (e.g. wellness credits)
What you will do:
● Design, build, and operate feature pipelines that transform curated datasets into reusable, governed feature tables in Snowflake (a sketch follows this list)
● Productionize ML models (batch and real‑time) with reliable inference jobs/APIs, SLAs, and observability
● Set up processes in Databricks and Snowflake/Snowpark to schedule, monitor, and auto‑heal training/inference pipelines
● Collaborate with our Enterprise Data & Analytics (ED&A) team, whose work centers on replicating operational data into Snowflake, enriching it into governed, reusable models/feature tables, and enabling advanced analytics & ML, with Databricks as a core collaboration environment
● Partner with Data Science to optimize models that grow customer base and revenue, improve CX, and optimize resources
● Implement MLOps/LLMOps: experiment tracking, reproducible training, model/asset registry, safe rollout, and automated retraining triggers
● Enforce data governance & security policies and contribute metadata, lineage, and definitions to the ED&A catalog
● Optimize cost/performance across Snowflake/Snowpark and Databricks
● Follow robust and established version control and DevOps practices
● Create clear runbooks and documentation, and share best practices with analytics, data engineering, and product partners
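As a rough sketch of the feature-pipeline and batch-scoring work in the first two bullets above: aggregate curated events into a reusable feature table, then score it with a model. Everything here is local Pandas and scikit-learn for readability; in practice the data would sit in Snowflake/Snowpark or Databricks and the model would come from a registry, and all names and numbers are invented.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Curated events (placeholder data standing in for governed Snowflake tables).
events = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "quote_amount": [900, 1100, 400, 700, 650, 720],
    "booked": [1, 1, 0, 0, 1, 1],
})

# Feature pipeline: one governed, reusable feature row per customer.
features = events.groupby("customer_id").agg(
    n_quotes=("quote_amount", "size"),
    avg_quote=("quote_amount", "mean"),
    ever_booked=("booked", "max"),
).reset_index()

# Batch inference: train once, then score the feature table on a schedule.
X, y = features[["n_quotes", "avg_quote"]], features["ever_booked"]
model = LogisticRegression().fit(X, y)
features["propensity"] = model.predict_proba(X)[:, 1]
print(features)
```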
Also, you will
DELIVER QUALITY RESULTS: Able to deliver top quality service to all customers (internal and external); Able to ensure all details are covered and adhere to company policies; Able to strive to do things right the first time; Able to meet agreed-upon commitments or advises customer when deadlines are jeopardized; Able to define high standards for quality and evaluate products, services, and own performance against those standards
TAKE INITIATIVE: Able to exhibit tendencies to be self-starting and not wait for signals; Able to be proactive and demonstrate readiness and ability to initiate action; Able to take action beyond what is required and volunteers to take on new assignments; Able to complete assignments independently without constant supervision
BE INNOVATIVE / CREATIVE: Able to examine the status quo and consistently look for better ways of doing things; Able to recommend changes based on analyzed needs; Able to develop proper solutions and identify opportunities
BE PROFESSIONAL: Able to project a positive, professional image with both internal and external business contacts; Able to create a positive first impression; Able to gain respect and trust of others through personal image and demeanor
ADVANCED COMPUTER USER: Able to use required software applications to produce correspondence, reports, presentations, electronic communication, and complex spreadsheets including formulas and macros and/or databases. Able to operate general office equipment including company telephone system
What you will need:
* Bachelor's or Master's in CS, Data/ML, or related field (or equivalent experience) required
* 4+ years in data/ML engineering building production‑grade pipelines with Python and SQL
* Strong hands‑on with Snowflake/Snowpark and Databricks; comfort with Tasks & Streams for orchestration
* 2+ years of experience optimizing models: batch jobs and/or real‑time APIs, containerized services, CI/CD, and monitoring
* Solid understanding of data modeling and governance/lineage practices expected by ED&A
It would be nice if you had:
* Familiarity with LLMOps patterns for generative AI applications
* Experience with NLP, call center data, and voice analytics
* Exposure to feature stores, model registries, canary/shadow deploys, and A/B testing frameworks
* Marketing analytics domain familiarity (lead scoring, propensity, LTV, routing/prioritization)
MANAGEMENT & SUPERVISORY RESPONSIBILITIES
* Direct supervisor job title(s) typically include: VP, Marketing Analytics
* Job may require supervising Analytics associates
No Unsolicited Resumes from Third-Party Recruiters
Please note that as per PODS policy, we do not accept unsolicited resumes from third-party recruiters unless such recruiters are engaged to provide candidates for a specified opening and in alignment with our Inclusive Diversity values. Any employment agency, person or entity that submits an unsolicited resume does so with the understanding that PODS will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person, or entity.
DISCLAIMER
The preceding job description has been designed to indicate the general nature of work performed; the level of knowledge and skills typically required; and usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required by employees in this job.
Equal Opportunity, Affirmative Action Employer
PODS Enterprises, LLC is an Equal Opportunity, Affirmative Action Employer. We will not discriminate unlawfully against qualified applicants or employees with respect to any term or condition of employment based on race, color, national origin, ancestry, sex, sexual orientation, age, religion, physical or mental disability, marital status, place of birth, military service status, or other basis protected by law.
$80k-113k yearly est. 60d+ ago
Data Engineer - AI/ML
Fintech 4.2
Data engineer job in Tampa, FL
We are seeking a Data Engineer with strong AI/ML expertise to modernize and scale our Business Intelligence (BI) capabilities. This role will design and build data pipelines, deploy machine learning solutions, and operationalize intelligent analytics to drive decision-making across the organization. The ideal candidate blends data engineering best practices with applied machine learning, MLOps, and AI.
Key Responsibilities
Project Responsibility: End-to-end data pipelines and integrations
Technical Competencies:
* Advanced SQL optimization and complex query design
* Kafka streaming applications and connector development (see the sketch after this list)
* Databricks workflow development with medallion architecture
* Data governance implementation and compliance
* Performance tuning for large-scale data processing
* Data security and privacy best practices
* Apache NiFi pipeline development for invoice and PO processing
* Integration with purpose-built data stores (Druid, MongoDB, OpenSearch, Postgres)
* Build and maintain end-to-end ML pipelines for training, deployment, and monitoring of models.
* Design and optimize data architectures for large-scale ML workloads
* Explore and implement LLM-based solutions, RAG architectures, and generative AI for business use cases.
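To illustrate the Kafka item flagged in the list above, here is a minimal bronze-layer ingestion sketch in the medallion style: consume events and persist them as received, with provenance, deferring all transformation to later layers. It assumes the kafka-python client; the topic name, broker address, and sink are hypothetical placeholders.

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "invoices.raw",                      # hypothetical topic
    bootstrap_servers="localhost:9092",  # placeholder broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # Bronze layer: keep the payload as-is and attach provenance only.
    record["_ingested_offset"] = message.offset
    record["_ingested_partition"] = message.partition
    print(record)  # stand-in for an append to the bronze/landing table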
Soft Skills:
* Cross-functional collaboration with product and engineering teams
* Technical mentoring for junior data engineers
* Analytical thinking for complex data problems
* Stakeholder communication for data requirements
* Process improvement and efficiency focus
* Quality mindset for data accuracy and reliability
Vendor Management:
* Direct communication with data platform vendors
* Evaluates vendor tools for specific data use cases
* Provides technical feedback on vendor product roadmaps
* Coordinates with vendors for data integration projects
Qualifications:
Bachelor's/Master's in Computer Science, Data Engineering, Statistics, or related field.
5+ years in data engineering; 2+ years applying ML in production.
Our Benefits:
* Hybrid Work
* Employer Matched 401K
* Company Paid Medical Insurance Option for Employee and Dependent Children
* Company Paid Dental Insurance for Employee
* Company Paid Vision Insurance for Employee
* Company Paid Long and Short-Term Disability
* Company Paid Life and AD&D Insurance
* 18 Paid Vacation Days a Year
* Six Paid Holidays
* Employee Recognition Programs
* Holiday Bonus
* Incentive Compensation
* Community Outreach Opportunities
* Business Casual Dress Code
About Fintech:
Fintech, a pioneering accounts payable (AP) automation solutions provider, has dedicated nearly 35 years to automating invoice processing between retail and hospitality businesses, and their supply chain partners. Backed by leading investors TA Associates and General Atlantic, it stands as a leader in this sector. Its flagship product, PaymentSource, was first built for the alcohol industry to provide invoice payment automation between alcohol distributors and their customers across all 50 states. Today, it is utilized by over 267,000 businesses nationwide for invoice payment and collection associated with all B2B business transactions. This proven platform automates invoice payment, streamlines payment collection, and facilitates comprehensive data capture for over 1.1 million business relationships. Recognizing operational hurdles, Fintech expanded its payment capabilities to include scan-based trading/consignment selling for its vendors and retailers and built an advanced CRM tool with functionality to fortify vendor, supplier, and distributor field execution, addressing diverse profit center challenges. For more information about Fintech and its range of solutions, please visit ****************
Fintech is a Drug-Free Workplace. Fintech is an Equal Opportunity Employer that does not discriminate on the basis of actual or perceived race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other characteristic protected by applicable federal, state, or local laws and ordinances. Fintech's management team is dedicated to this policy with respect to recruitment, hiring, placement, promotion, transfer, training, compensation, benefits, employee activities, access to facilities and programs and general treatment during employment. We E-Verify.
$93k-129k yearly est. 21d ago
Data Engineer-Lead - Project Planning and Execution
DPR Construction 4.8
Data engineer job in Tampa, FL
We are a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency across all levels. We are looking for a talented Data Engineer to join our team and contribute to developing robust data solutions that support our business goals.
This role is ideal for someone who enjoys combining technical problem-solving with stakeholder collaboration. You will collaborate with business leaders to understand data needs and work closely with a global engineering team to deliver scalable, timely, and high-quality data solutions that power insights and operations.
Responsibilities
* Own data delivery for specific business verticals by translating stakeholder needs into scalable, reliable, and well-documented data solutions.
* Participate in requirements gathering, technical design reviews, and planning discussions with business and technical teams.
* Partner with the extended data team to define, develop, and maintain shared data models and definitions.
* Design, develop, and maintain robust data pipelines and ETL processes using tools like Azure Data Factory and Python across internal and external systems.
* Proactively manage data quality, error handling, monitoring, and alerting to ensure timely and trustworthy data delivery (a sketch of this pattern follows the Responsibilities list).
* Perform debugging, application issue resolution, root cause analysis, and assist in proactive/preventive maintenance.
* Support incident resolution and perform root cause analysis for data-related issues.
* Create and maintain both business requirement and technical requirement documentation
* Collaborate with data analysts, business users, and developers to ensure the accuracy and efficiency of data solutions.
* Collaborate with platform and architecture teams to align with best practices and extend shared data engineering patterns.
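To illustrate the error-handling and alerting item flagged above, here is a minimal retry-with-alerting pattern in plain Python; in this stack the orchestration itself would typically live in Azure Data Factory, and alert() is a hypothetical hook (for example, a webhook into a chat channel), not a real API.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)


def alert(message: str) -> None:
    # Placeholder for a real alerting channel (email, webhook, pager).
    logging.error("ALERT: %s", message)


def run_with_retries(step, name: str, attempts: int = 3, backoff_s: float = 5.0):
    """Run a pipeline step, retrying with linear backoff and alerting on exhaustion."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            logging.warning("%s failed (attempt %d/%d): %s", name, attempt, attempts, exc)
            if attempt == attempts:
                alert(f"{name} exhausted retries: {exc}")
                raise
            time.sleep(backoff_s * attempt)


# Usage: wrap an ingestion callable so a failure pages someone instead of
# silently producing stale data downstream.
run_with_retries(lambda: 42, "load_project_costs")
```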
Qualifications
* Minimum of 4 years of experience as a Data Engineer, working with cloud platforms (Azure, AWS).
* Proven track record of managing stakeholder expectations and delivering data solutions aligned with business priorities.
* Strong hands-on expertise in Azure Data Factory, Azure Data Lake, Python, and SQL
* Familiarity with cloud storage (Azure, AWS S3) and integration techniques (APIs, webhooks, REST).
* Experience with modern data platforms like Snowflake and Microsoft Fabric.
* Solid understanding of data modeling, pipeline orchestration, and performance optimization
* Strong problem-solving skills and ability to troubleshoot complex data issues.
* Excellent communication skills, with the ability to work collaboratively in a team environment.
* Familiarity with tools like Power BI for data visualization is a plus.
* Experience working with or coordinating with overseas teams is a strong plus
Preferred Skills
* Knowledge of Airflow or other orchestration tools.
* Experience working with Git-based workflows and CI/CD pipelines
* Experience in the construction industry or a similar field is a plus but not required.
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together-by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************
$83k-109k yearly est. 60d+ ago
Data Scientist II - Client Protection
Bank of America 4.7
Data engineer job in Tampa, FL
Charlotte, North Carolina; Plano, Texas; Boston, Massachusetts; Chandler, Arizona; Tampa, Florida; Chicago, Illinois; Jacksonville, Florida; Newark, Delaware; Phoenix, Arizona
**Job Description:**
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities and shareholders every day.
Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates' physical, emotional, and financial wellness, recognizing and rewarding performance, and how we make an impact in the communities we serve.
Bank of America is committed to an in-office culture with specific requirements for office-based attendance and which allows for an appropriate level of flexibility for our teammates and businesses based on role-specific considerations.
At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!
**Job Summary:**
This job is responsible for reviewing and interpreting large datasets to uncover revenue generation opportunities and ensuring the development of effective risk management strategies. Key responsibilities include working with lines of business to comprehend problems, utilizing sophisticated analytics and deploying advanced techniques to devise solutions, and presenting recommendations based on findings. Job expectations include demonstrating leadership, resilience, accountability, a disciplined approach, and a commitment to fostering responsible growth for the enterprise.
Client Protection Shared Services - Advanced Analytics is looking for an energetic and inquisitive data scientist to join our team and help us combat financial crime. In this role, you will work on large and complex data science projects involving both relational and graph databases. You will collaborate with internal strategy, technology, product, and policy partners to deploy advanced analytical solutions that reduce fraud losses, lower false positive impacts, improve client experience, and ensure the Bank minimizes its total cost of fraud. Key responsibilities include applying knowledge of multiple business and technical-related topics and independently driving strategic initiatives, large-scale projects, and overall improvements.
**Responsibilities:**
+ Perform graph analytics to find and mitigate densely connected fraud networks (see the sketch after this list)
+ Assist with the generation, prioritization, and investigation of fraud rings
+ Enable business analytics, including data analysis, trend identification, and pattern recognition, using advanced techniques to drive decision making and data driven insights
+ Own end-to-end model development work across supervised, unsupervised, and graph-based machine learning solutions to maximize detection of fraud and capture anomalous behavior
+ Manage multiple priorities and ensure quality and timeliness of work deliverables such as data science products, data analysis reports, or data visualizations, while exhibiting the ability to work independently and in a team environment
+ Manage relationships with multiple technology teams, development team, and line of business leaders, including alignment of roadmaps, managing projects, and managing risks
+ Oversee development, delivery and quality assurance for data science use cases delivered to the production environment and other areas of the line of business
+ Support the identification of potential issues and development of controls
+ Support execution of large-scale projects, such as platform conversions or new project integrations by conducting advanced reporting and drawing analytical-based insights
+ Manage a roadmap of data science use cases that answer business trends based on economic and portfolio conditions, and communicate findings to senior management while diligently working with and leading peers to solve these use cases
+ Coach and mentor peers to improve proficiency in a variety of systems and serve as a subject matter expert on multiple business and technical-related topics
+ Apply agile practices for project management, solution development, deployment, and maintenance
+ Deliver presentations in an engaging and effective manner through in-person and virtual conversations that communicates technical concepts and analysis results to a diverse set of internal stakeholders, and develops professional relationships to foster collaboration on work deliverables
+ Maintain knowledge of the latest advances in the fields of data science and artificial intelligence to support business analytics
+ Engage business and technology senior leaders on reporting of project/deliverable statuses, opportunity identification, and planning efforts
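As a toy version of the graph-analytics step flagged in the list above: link accounts that share identifying attributes, then surface small, densely connected components as candidate fraud rings. The accounts, edges, and thresholds are invented; production work would run against a graph database such as TigerGraph or Neo4j rather than in-memory networkx.

```python
import networkx as nx

# An edge means "these two accounts share a device, address, or phone".
edges = [
    ("acct_1", "acct_2"), ("acct_2", "acct_3"), ("acct_1", "acct_3"),
    ("acct_4", "acct_5"),
    ("acct_6", "acct_7"), ("acct_7", "acct_8"), ("acct_6", "acct_8"),
]
G = nx.Graph(edges)

for component in nx.connected_components(G):
    sub = G.subgraph(component)
    # Density near 1.0 means almost every pair is linked: a ring-like pattern.
    if len(sub) >= 3 and nx.density(sub) > 0.8:
        print("candidate ring:", sorted(sub.nodes))
```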
**Required Qualifications:**
+ 4+ years of experience in data and analytics
+ 4+ years of experience in data analytics within fraud prevention
+ Must be proficient with SQL and one of SAS, Python, or Java
+ Must have familiarity with Graph databases (e.g. TigerGraph, Neo4J) and graph query languages
+ Problem-solving skills including selection of data and deployment of solutions
+ Proven ability to manage projects, exercise thought leadership and work with limited direction on complex problems to achieve project goals while also leading a broader team
+ Excellent communication and influencing skills
+ Thrives in fast-paced and highly dynamic environment
+ Intellectual curiosity and strong urge to figure out the "whys" of a problem and produce creative solutions
+ Exposure to model development leveraging supervised and unsupervised machine learning (regression, tree-based algorithms, etc.)
+ Expertise in data analytics and technical development lifecycles including having coached junior staff
+ Expertise handling and manipulating data across its lifecycle in a variety of formats, sizes, and storage technologies to solve a problem (e.g., structured, semi-structured, unstructured; graph; Hadoop; Kafka)
**Desired Qualifications**
+ Advanced Quantitative degree (Master's or PhD)
+ 7+ years of experience; work in financial services is very helpful, with preference to fraud, credit, cybersecurity, or other heavily quantitative areas
+ Understanding of advanced machine learning methodologies including neural networks, graph algorithms, ensemble learning methods like XGBoost, and other techniques
+ Proficient with Spark, H2O, or similar advanced analytical tools
+ Analytical and Innovative Thinking
+ Problem Solving and Business Acumen
+ Risk and Issue Management, interpreting relevant laws, rules, and regulations
+ Data Visualization, Oral and Written Communication, and Presentation Skills
+ Experience managing multi-year roadmaps, engaging technical and non-technical stakeholders, and leading large cross-functional formal projects
+ Experience influencing mid to senior (executive) level leaders
+ Experience managing risk and issue remediation
+ Understanding of computer science topics like automation, code versioning, computational complexity, parallel processing, requirements gathering, testing methodologies, and development lifecycle models like Agile
**Skills:**
+ Agile Practices
+ Application Development
+ DevOps Practices
+ Technical Documentation
+ Written Communications
+ Artificial Intelligence/Machine Learning
+ Business Analytics
+ Data Visualization
+ Presentation Skills
+ Risk Management
+ Adaptability
+ Collaboration
+ Consulting
+ Networking
+ Policies, Procedures, and Guidelines Management
**_It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. _**
**Shift:**
1st shift (United States of America)
**Hours Per Week:**
40
Bank of America and its affiliates consider for employment and hire qualified candidates without regard to race, religious creed, religion, color, sex, sexual orientation, genetic information, gender, gender identity, gender expression, age, national origin, ancestry, citizenship, protected veteran or disability status or any factor prohibited by law, and as such affirms in policy and practice to support and promote the concept of equal employment opportunity, in accordance with all applicable federal, state, provincial and municipal laws. The company also prohibits discrimination on other bases such as medical condition, marital status or any other factor that is irrelevant to the performance of our teammates.
View your **"Know your Rights (************************************************************************************** "** poster.
**View the LA County Fair Chance Ordinance (************************************************************************************************** .**
Bank of America aims to create a workplace free from the dangers and resulting consequences of illegal and illicit drug use and alcohol abuse. Our Drug-Free Workplace and Alcohol Policy ("Policy") establishes requirements to prevent the presence or use of illegal or illicit drugs or unauthorized alcohol on Bank of America premises and to provide a safe work environment.
Bank of America is committed to an in-office culture with specific requirements for office-based attendance and which allows for an appropriate level of flexibility for our teammates and businesses based on role-specific considerations. Should you be offered a role with Bank of America, your hiring manager will provide you with information on the in-office expectations associated with your role. These expectations are subject to change at any time and at the sole discretion of the Company. To the extent you have a disability or sincerely held religious belief for which you believe you need a reasonable accommodation from this requirement, you must seek an accommodation through the Bank's required accommodation request process before your first day of work.
This communication provides information about certain Bank of America benefits. Receipt of this document does not automatically entitle you to benefits offered by Bank of America. Every effort has been made to ensure the accuracy of this communication. However, if there are discrepancies between this communication and the official plan documents, the plan documents will always govern. Bank of America retains the discretion to interpret the terms or language used in any of its communications according to the provisions contained in the plan documents. Bank of America also reserves the right to amend or terminate any benefit plan in its sole discretion at any time for any reason.
$71k-93k yearly est. 10d ago
ETL Architect
HealthPlan Services 4.7
Data engineer job in Tampa, FL
HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.
Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices (a sketch of this idea follows this list)
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data, and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling
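To make the source-to-target mapping duty above concrete, here is a minimal sketch that captures the mapping as data rather than prose, so the ETL job and the design document cannot drift apart; the field names and transforms are invented for the example.

```python
import pandas as pd

# target_column: (source_column, transform) -- the design document as code.
MAPPING = {
    "member_id":   ("MBR_NBR",  lambda s: s.astype(str).str.zfill(9)),
    "plan_code":   ("PLN_CD",   lambda s: s.str.upper().str.strip()),
    "premium_usd": ("PREM_AMT", lambda s: pd.to_numeric(s, errors="coerce")),
}


def apply_mapping(source: pd.DataFrame) -> pd.DataFrame:
    target = pd.DataFrame()
    for tgt_col, (src_col, transform) in MAPPING.items():
        target[tgt_col] = transform(source[src_col])
    return target


raw = pd.DataFrame({"MBR_NBR": [123], "PLN_CD": [" hmo "], "PREM_AMT": ["450.10"]})
print(apply_mapping(raw))
```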
Preferred Qualifications:
12+ years of work experience as a Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture, data modeling for BI and analytics data marts, and implementing and supporting production environments
10+ years of experience designing, building, and implementing BI solutions with modern BI tools such as MicroStrategy, Microsoft, and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present, and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions that empower an organization to generate valuable, actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve the greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty, and integrity
Conscientiousness about the Enterprise Data Warehouse release management process; conducts operations readiness and environment compatibility reviews of changes prior to deployment, with strong sensitivity to impact and SLAs
Experience with data modeling tools a plus
Expertise in data warehousing methodologies and best practices (required)
Ability to initiate and follow through on complex projects of both short- and long-term duration (required)
Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed (required)
Proactively recommends improvements to the performance and operability of the data warehouse and reporting environment
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements
Education Requirements:
Bachelor's or equivalent degree in a business, technical, or related field
Additional Information
All your information will be kept confidential according to EEO guidelines.
How much does a data engineer earn in Sarasota, FL?
The average data engineer in Sarasota, FL earns between $63,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Sarasota, FL
$85,000
What are the biggest employers of Data Engineers in Sarasota, FL?
The biggest employers of Data Engineers in Sarasota, FL are: