
Data engineer jobs in Largo, FL - 577 jobs

  • Senior Data Engineer

    Toorak Capital Partners

    Data engineer job in Tampa, FL

    Company: Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business-purpose residential, multifamily, and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.

    Summary: The Lead Data Engineer will design and implement high-performance, scalable data solutions in support of Toorak's data strategy, and will lead data architecture for Toorak Capital. Responsibilities:
    * Lead efforts to create an API framework that makes data usable across customer-facing and back-office applications.
    * Establish consistent data standards, reference architectures, patterns, and practices across the organization for OLTP and OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies.
    * Lead sourcing and synthesis of data standardization and semantics discovery efforts, turning insights into actionable strategies that define the team's priorities and rally stakeholders to the vision.
    * Lead data integration and mapping efforts to harmonize data.
    * Champion standards, guidelines, and direction for ontology, data modeling, semantics, and data standardization in general at Toorak.
    * Lead strategies and design solutions for a wide variety of use cases, such as data migration (end-to-end ETL), database optimization, and data architecture for analytics projects.

    Required Skills:
    * Designing and maintaining data models, including conceptual, logical, and physical data models.
    * 5+ years of experience using NoSQL systems such as MongoDB and DynamoDB, relational SQL databases (PostgreSQL), and Athena.
    * 5+ years of experience in data pipeline development, ETL, and processing of structured and unstructured data.
    * 5+ years of experience in large-scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure such as Kafka or Pulsar (a minimal sketch of this pattern follows the listing).
    * Proficiency with data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms.
    * Experience with BigQuery and SQLMesh (or a similar SQL-based cloud platform).
    * Knowledge of cloud platforms and technologies such as Google Cloud Platform and Amazon Web Services.
    * Strong SQL skills.
    * Experience with API development and frameworks.
    * Knowledge of designing solutions for data quality, data lineage, and data catalogs.
    * Strong background in data science, machine learning, NLP, and text processing of large data sets.
    * Experience with one or more of Dataiku, DataRobot, Databricks, or UiPath is nice to have.
    * Experience using version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation.
    * Ability to rapidly comprehend changes to key business processes and their impact on the overall data framework.
    * Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change.
    * Advanced analytical skills; high level of organization and attention to detail.
    * Self-starter attitude with the ability to work independently.
    * Knowledge of legal, compliance, and regulatory issues impacting data.
    * Experience in finance preferred.
    $72k-99k yearly est. 2d ago
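    For illustration only (not part of the posting): a minimal sketch of the Kafka-to-Spark Structured Streaming pattern the listing names. The broker address, topic name, payload schema, and sink paths are invented for the example.

        # Minimal sketch: consume a Kafka topic with Spark Structured Streaming
        # and land the parsed events as Parquet. All names are illustrative.
        from pyspark.sql import SparkSession
        from pyspark.sql.functions import from_json, col
        from pyspark.sql.types import StructType, StructField, StringType, DoubleType

        spark = SparkSession.builder.appName("loan-events").getOrCreate()

        # Hypothetical payload schema for a loan-event topic.
        schema = StructType([
            StructField("loan_id", StringType()),
            StructField("event_type", StringType()),
            StructField("amount", DoubleType()),
        ])

        raw = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
               .option("subscribe", "loan-events")                # assumed topic
               .load())

        # Kafka delivers bytes; decode the value column and parse the JSON payload.
        events = (raw
                  .select(from_json(col("value").cast("string"), schema).alias("e"))
                  .select("e.*"))

        query = (events.writeStream
                 .format("parquet")  # sink choice is illustrative
                 .option("path", "/data/loan_events")
                 .option("checkpointLocation", "/chk/loan_events")
                 .start())
        query.awaitTermination()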

  • Interoperability Engineer (Workday)

    Moffitt Cancer Center (4.9 company rating)

    Data engineer job in Tampa, FL

    Highlights:
    * The Workday Interoperability Engineer serves as a senior technical expert responsible for architecting, deploying, and maintaining Workday integrations and interoperability frameworks that support secure, scalable data exchange across HR, finance, clinical, research, and enterprise systems.
    * Acts as a subject matter expert in Workday integration patterns, including Workday Studio, EIBs, RaaS, APIs, event-driven integrations, and streaming/data pipelines.
    * Owns the design and operational delivery of Workday-centric interoperability initiatives, ensuring reliability and alignment with business outcomes.
    * Provides mentorship and technical leadership to engineers and analysts, guiding them in best practices for Workday and enterprise integration.
    * Combines deep Workday integration expertise with an understanding of cross-functional business processes and downstream system dependencies.
    * The role is also responsible for developing and maintaining frameworks that support information-exchange needs across clinical systems.

    Responsibilities:
    * Hands-on experience building integrations with Workday HCM, Finance, Payroll, Recruiting, or other Workday modules.
    * Strong understanding of Workday data structures, security groups, calculated fields, and Workday report development, including RaaS (a minimal sketch of a RaaS call follows the listing).
    * Proficiency in developing integrations using Workday Studio, EIB, Core Connectors, and PECI (Payroll Effective Change Interface).
    * Translate Workday integration requirements into technical specifications, integration contracts, and design standards.
    * Ability to gather API requirements, translate them into technical specifications, and produce comprehensive API design documentation (standards, contracts, and specifications).
    * Hands-on experience implementing application security frameworks, including OAuth2, SAML, OpenID Connect, and JWT.
    * Experience with API testing strategies (functional, regression, performance, and security testing) using tools such as Postman, SoapUI, JMeter, or equivalents.
    * Good understanding of firewall and advanced networking concepts to support secure system integrations.
    * Provide on-call support and keep integration documentation and records up to date.

    Credentials and Experience:
    * Bachelor's degree in computer science, systems analysis, or a related field of study.
    * Minimum 7 years of experience leading end-to-end integration implementations, with a strong emphasis on Workday and supporting middleware technologies such as Cloverleaf and Boomi.
    * Minimum of 3 years' experience working with cross-functional teams, providing expert knowledge of ERP data analysis to design, build, and deploy integrations.

    The ideal candidate will have the following experience:
    * Strong hands-on experience developing Workday integrations using Workday Studio, EIBs, Core Connectors, RaaS, and Workday Web Services.
    * Experience designing and supporting interoperability between Workday and downstream systems.
    * Familiarity with healthcare interoperability concepts and standards such as HL7 or FHIR, especially where Workday interacts with clinical or research environments.
    * Proficiency with integration platforms such as Boomi and/or Cloverleaf for orchestrating Workday-related data flows.
    * Experience with EMR systems such as Epic is a plus, particularly when supporting Workday-to-clinical data exchange.
    $67k-87k yearly est. 1d ago
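    For illustration only (not part of the posting): a minimal sketch of pulling a Workday custom report via RaaS. The URL follows Workday's customreport2 convention, but the host, tenant, report name, fields, and credentials are invented; production integrations typically authenticate with OAuth rather than basic auth.

        # Minimal sketch: fetch a Workday Report-as-a-Service (RaaS) report as JSON.
        import requests

        RAAS_URL = (
            "https://wd5-services.example.workday.com/ccx/service/customreport2/"
            "acme_tenant/integration_user/Worker_Roster"  # hypothetical tenant/report
        )

        resp = requests.get(
            RAAS_URL,
            params={"format": "json"},
            auth=("integration_user@acme_tenant", "app-password"),  # placeholder
            timeout=60,
        )
        resp.raise_for_status()

        # RaaS JSON output wraps report rows in a Report_Entry array.
        for entry in resp.json().get("Report_Entry", []):
            print(entry.get("Employee_ID"), entry.get("Worker"))  # assumed fields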
  • Data Scientist (Exploitation Specialist Level-3) - Tampa, FL

    Masego

    Data engineer job in Tampa, FL

    Masego is an award-winning small business that specializes in GEOINT services. As a Service-Disabled Veteran-Owned Small Business (SDVOSB), we recognize and reward your hard work.

    Description: We are looking for a Level-3, TS/SCI-cleared Data Scientist to join our team. This role provides automation and collection support to the main team at NGA Washington, so it relies on good communication skills and a baseline knowledge of GEOINT collection and/or automation systems such as JEMA.

    Minimum Required Qualifications:
    * At least 5 years of related GEOINT work experience, or 2 years with a relevant Bachelor's degree.
    * Able to work on the client site 40 hours a week (very limited option for telework).
    * Proficient with Python.
    * Experience with JEMA.

    Preferred Qualifications:
    * Experience with multiple intelligence types (SIGINT, OSINT, ELINT, GEOINT, MASINT, HUMINT).
    * Experience with Brewlytics, ArcPro, and/or other geospatial data analysis tools.
    * Knowledge of GEOINT collection and associated NGA/NRO systems.
    * Proficiency with common programming languages, including R, SQL, HTML, and JavaScript.
    * Experience analyzing geospatially enabled data.
    * Ability to learn new technologies and adapt to dynamic mission needs.
    * Ability to work collaboratively with a remote team (the main government team is based out of NGA Washington).
    * Experience providing embedded data science/automation support to analytic teams.

    Security Clearance Requirement: Active TS/SCI, with a willingness to take a polygraph test.

    Salary Range: $128,600, based on ability to meet or exceed stated requirements.

    About Masego: Masego Inc. provides expert geospatial intelligence solutions in addition to Activity Based Intelligence (ABI) and GEOINT instructional services. Masego provides expert-level geospatial collection management; full motion video; human geography; information technology and cyber; technical writing; and ABI, Agile, and other professional training. Masego is a Service-Disabled Veteran-Owned Small Business headquartered in Fredericksburg, Virginia. With high-level expertise and decades of experience, coupled with proven project management systems and top-notch client support, Masego enhances the performance capabilities of the Department of Defense and the intelligence community.

    Pay and Benefits: We seek to provide for and take care of our team members. We currently offer medical, dental, vision, 401k, generous PTO, and more!

    Diversity: Masego, Inc. is an equal opportunity/equal access/affirmative action employer fully committed to achieving a diverse workforce, and complies with all applicable Federal and Virginia State laws, regulations, and executive orders regarding nondiscrimination and affirmative action in its programs and activities. Masego, Inc. does not discriminate on the basis of race, color, religion, ethnic or national origin, gender, genetic information, age, disability, sexual orientation, gender identity, gender expression, or veteran status.
    $128.6k yearly Auto-Apply 60d+ ago
  • ETL Architect

    HealthPlan Services (4.7 company rating)

    Data engineer job in Tampa, FL

    HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform, and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution, and technology services to insurers of individual, small group, voluntary, and association plans, as well as valuable solutions to thousands of brokers and agents nationwide.

    Job Description

    Position: ETL Architect. The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.

    Essential Job Functions and Duties:
    * Develop and maintain ETL jobs for data warehouses/marts.
    * Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices (a minimal sketch of mapping-driven ETL follows the listing).
    * Collaborate with delivery and technical team members on design and development.
    * Collaborate with business partners to understand business processes, underlying data, and reporting needs.
    * Conduct data analysis in support of ETL development and other activities.
    * Assist with data architecture and data modeling.

    Preferred Qualifications:
    * 12+ years of work experience as a Business Intelligence Developer.
    * Work experience with multiple database platforms and BI delivery solutions.
    * 10+ years of experience with end-to-end ETL architecture and data modeling for BI and analytics data marts, implementing and supporting production environments.
    * 10+ years of experience designing, building, and implementing BI solutions with modern BI tools such as MicroStrategy, Microsoft, and Tableau.
    * Experience as a Data Architect.
    * Experience delivering BI solutions with an Agile BI delivery methodology.
    * Ability to communicate, present, and interact comfortably with senior leadership.
    * Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights.
    * Strong team player.
    * Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions.
    * Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve the greatest value.
    * Strong relationship-building and interpersonal skills.
    * Demonstrated self-confidence, honesty, and integrity.
    * Conscientious of the Enterprise Data Warehouse release management process; conducts operations-readiness and environment-compatibility reviews of any changes prior to deployment, with strong sensitivity to impact and SLAs.
    * Experience with data modeling tools a plus.
    * Expertise in data warehousing methodologies and best practices required.
    * Ability to initiate and follow through on complex projects of both short- and long-term duration required.
    * Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps the supervisor informed, as required.
    * Proactively recommends improvements to the performance and operability of the data warehouse and reporting environment.
    * Participates on interdepartmental teams to support organizational goals.
    * Performs other related duties and tasks as assigned.
    * Experience facilitating user sessions and gathering requirements.

    Education Requirements: Bachelor's or equivalent degree in a business, technical, or related field.

    Additional Information: All your information will be kept confidential according to EEO guidelines.
    $84k-105k yearly est. 1d ago
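    For illustration only (not part of the posting): a minimal sketch of driving ETL from a source-to-target mapping document, so the mapping reviewed with the business is the same artifact the job executes. The column names and transformation rules are invented.

        # Minimal sketch: apply a source-to-target mapping as code with pandas.
        import pandas as pd

        # One entry per target column: (source column, transformation rule).
        MAPPING = {
            "member_id":   ("MBR_ID",   str.strip),   # trim whitespace
            "plan_code":   ("PLAN_CD",  str.upper),   # normalize case
            "premium_usd": ("PREM_AMT", float),       # cast to numeric
        }

        def transform(source: pd.DataFrame) -> pd.DataFrame:
            target = pd.DataFrame()
            for tgt_col, (src_col, rule) in MAPPING.items():
                target[tgt_col] = source[src_col].map(rule)
            return target

        src = pd.DataFrame({"MBR_ID": [" 001 "], "PLAN_CD": ["hmo"], "PREM_AMT": ["420.50"]})
        print(transform(src))  # member_id=001, plan_code=HMO, premium_usd=420.5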
  • Lead Graph Data Scientist - Identity Analytics

    USAA (4.7 company rating)

    Data engineer job in Tampa, FL

    **Why USAA?** At USAA, our mission is to empower our members to achieve financial security through highly competitive products, exceptional service and trusted advice. We seek to be the #1 choice for the military community and their families. Embrace a fulfilling career at USAA, where our core values - honesty, integrity, loyalty and service - define how we treat each other and our members. Be part of what truly makes us special and impactful.

    **The Opportunity** We offer a flexible work environment that requires an individual to be **in the office 4 days per week.** This position can be based in one of the following locations: San Antonio, TX, Plano, TX, Phoenix, AZ, Colorado Springs, CO, Charlotte, NC, or Tampa, FL. Relocation assistance is **not** available for this position.

    The **Lead Graph Data Scientist - Identity Analytics** is responsible for the development and implementation of quantitative solutions that improve USAA's ability to detect and prevent identity theft, account takeover, and first-party/synthetic fraud. These solutions range from the development of machine learning models to the broad implementation of solutions such as graph analytics to protect USAA and our members from risks emanating from these threats. Strong candidates will be able to deliver the following work products and processes:

    + Develop and continuously update internal identity theft and authentication models to mitigate fraud losses and negative member experience from fraudulent applications, synthetic fraud, and account takeover attempts
    + Closely partner with the Strategies team, the Director of Fraud Identity Analytics, the Director of Fraud Model Management, and model users on model builds and priorities
    + Partner with Technology and other key collaborators to deploy a Financial Crimes graph database strategy, including vendor selection, business requirements, data needs, and clear use cases spanning financial crimes
    + Deploy graph databases and graph techniques to identify criminal networks engaging in fraud, scams, disputes/claims, and AML, and deliver highly significant benefits (a minimal sketch of graph-based ring detection follows the listing)
    + Generate and prioritize fraud-dense rings to mitigate losses and improve the member experience
    + Identify, and work with Technology to integrate, new data sources for models and graphs to augment predictive power and improve business performance
    + Export insights to decision systems to enable better fraud targeting and model development efforts
    + Drive continuous innovation in modeling efforts, including advanced techniques like graph neural networks
    + Develop and mentor junior staff, establishing a culture of R&D to augment the day-to-day aspects of the job

    **What you'll do:**

    + Gathers, interprets, and manipulates sophisticated structured and unstructured data to enable sophisticated analytical solutions for the business.
    + Leads and conducts sophisticated analytics demonstrating machine learning, simulation, and optimization to deliver business insights and achieve business objectives.
    + Guides the team on selecting the appropriate modeling technique and/or technology with consideration for data limitations, application, and business needs.
    + Develops and deploys models within the Model Development Control (MDC) and Model Risk Management (MRM) framework.
    + Composes and peer-reviews technical documents for knowledge persistence, risk management, and technical review audiences.
    + Partners with business leaders from across the organization to proactively identify business needs, and proposes/recommends analytical and modeling projects to generate business value.
    + Works with business and analytics leaders to prioritize analytics and highly sophisticated modeling problems/research efforts.
    + Leads efforts to build and maintain a robust library of reusable, production-quality algorithms and supporting code, to ensure model development and research efforts are transparent and based on the highest quality data.
    + Assists the team with translating business requests into specific analytical questions, implementing analysis and/or modeling, and communicating outcomes to non-technical business colleagues with a focus on business action and recommendations.
    + Manages project milestones, risks, and impediments. Anticipates potential issues that could limit project success or implementation and escalates as needed.
    + Establishes and maintains standard methodologies for engaging with Data Engineering and IT to deploy production-ready analytical assets consistent with modeling best practices and model risk management standards.
    + Interacts with internal and external peers and management to maintain expertise and awareness of pioneering techniques. Actively seeks opportunities and materials to learn new techniques, technologies, and methodologies.
    + Serves as a mentor to data scientists in modeling, analytics, computer science, business acumen, and other interpersonal skills.
    + Participates in enterprise-level efforts to drive the maintenance and transformation of data science technologies and culture.
    + Ensures risks associated with business activities are effectively identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures.

    **What you have:**

    + Bachelor's degree in mathematics, computer science, statistics, economics, finance, actuarial sciences, science and engineering, or another similar quantitative field; OR 4 years of experience in statistics, mathematics, quantitative analytics, or related experience (in addition to the minimum years of experience required) may be substituted in lieu of a degree.
    + 8 years of experience in predictive analytics or data analysis.
    + 6 years of experience in training and validating statistical, physical, machine learning, and other advanced analytics models.
    + 4 years of experience in one or more dynamic scripted languages (such as Python, R, etc.) for performing statistical analyses and/or building and scoring AI/ML models.
    + Expert ability to write code that is easy to follow, well documented, and commented where necessary to explain logic (high code transparency).
    + Strong experience in querying and preprocessing data from structured and/or unstructured databases using query languages such as SQL, NoSQL, etc.
    + Strong experience in working with structured, semi-structured, and unstructured data files such as delimited numeric data files, JSON/XML files, and/or text documents, images, etc.
    + Excellent demonstrated skill in performing ad-hoc analytics using descriptive, diagnostic, and inferential statistics.
    + Proven ability to assess and articulate the regulatory implications and expectations of distinct modeling efforts.
    + Project management experience that demonstrates the ability to anticipate and appropriately manage project milestones, risks, and impediments. Demonstrated history of appropriately communicating potential issues that could limit project success or implementation.
    + Expert-level experience with the concepts and technologies associated with classical supervised modeling for prediction, such as linear/logistic models, discriminant analysis, support vector machines, decision trees, and ensemble methods such as Random Forests, XGBoost, LightGBM, and CatBoost.
    + Expert-level experience with the concepts and technologies associated with unsupervised modeling, such as k-means clustering, hierarchical/agglomerative clustering, neighbors algorithms, DBSCAN, etc.
    + Demonstrated experience in guiding and mentoring junior technical staff in business interactions and model building.
    + Demonstrated ability to communicate ideas with team members and/or business leaders to convey and present very technical information to an audience that may have little or no understanding of technical concepts in data science.
    + A strong track record of communicating results, insights, and technical solutions to Senior Executive Management (or equivalent).
    + Extensive technical skills, consulting experience, and business savvy to collaborate with all levels and subject areas within the organization.

    **What sets you apart:**

    + US military experience through military service or as a military spouse/domestic partner
    + Graduate degree in a quantitative subject area
    + Over 5 years of experience with model development or other advanced fraud detection algorithms
    + Over 4 years of experience with graph databases and graph solutions
    + Experience in fraud/financial crimes model development

    **Compensation:** The salary range for this position is **$164,780 - $296,610.**

    **USAA does not provide visa sponsorship for this role. Please do not apply for this role if at any time (now or in the future) you will need immigration support (i.e., H-1B, TN, STEM OPT Training Plans, etc.).**

    USAA has an effective process for assessing market data and establishing ranges to ensure we remain competitive. You are paid within the salary range based on your experience and market data for the position. The actual salary for this role may vary by location. Employees may be eligible for pay incentives based on overall corporate and individual performance and at the discretion of the USAA Board of Directors.

    The above description reflects the details considered necessary to describe the principal functions of the job and should not be construed as a detailed description of all the work requirements that may be performed in the job.

    **Benefits:** At USAA our employees enjoy best-in-class benefits to support their physical, financial, and emotional wellness. These benefits include comprehensive medical, dental and vision plans, 401(k), pension, life insurance, parental benefits, adoption assistance, a paid time off program with paid holidays plus 16 paid volunteer hours, and various wellness programs. Additionally, our career path planning and continuing education assist employees with their professional goals. For more details on our outstanding benefits, visit our benefits page on USAAjobs.com.

    _Applications for this position are accepted on an ongoing basis; this posting will remain open until the position is filled. Thus, interested candidates are encouraged to apply the same day they view this posting._

    _USAA is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran._ **If you are an existing USAA employee, please use the internal career site in OneSource to apply.** **Please do not type your first and last name in all caps.**

    **_Find your purpose. Join our mission._** USAA is unlike any other financial services organization. The mission of the association is to facilitate the financial security of its members, associates and their families through provision of a full range of highly competitive financial products and services; in so doing, USAA seeks to be the provider of choice for the military community. We do this by upholding the highest standards and ensuring that our corporate business activities and individual employee conduct reflect good judgment and common sense, and are consistent with our core values of service, loyalty, honesty and integrity. USAA attributes its long-standing success to its most valuable resource: our 35,000 employees. They are the heart and soul of our member-service culture. When you join us, you'll become part of a thriving community committed to going above for those who have gone beyond: the men and women of the U.S. military, their associates and their families. In order to play a role on our team, you don't have to be connected to the military yourself - you just need to share our passion for serving our more than 13 million members.

    USAA is an EEO/AA Employer - applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, disability, genetic information, sexual orientation, gender identity or expression, pregnancy, protected veteran status or other status protected by law. California applicants, please review our HR CCPA - Notice at Collection.
    $67k-85k yearly est. 4d ago
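    For illustration only (not part of the posting): a minimal sketch of the graph technique the listing describes for surfacing fraud rings — link accounts that share an identity attribute (phone, device, address) and read off connected components. The sample edges are fabricated.

        # Minimal sketch: candidate fraud rings as connected components in an
        # identity graph built from shared attributes.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("acct_1", "phone_555"), ("acct_2", "phone_555"),  # shared phone
            ("acct_2", "device_9"),  ("acct_3", "device_9"),   # shared device
            ("acct_4", "addr_17"),                             # isolated account
        ])

        for component in nx.connected_components(G):
            accounts = {n for n in component if n.startswith("acct_")}
            if len(accounts) > 1:  # multiple accounts tied together
                print("candidate ring:", sorted(accounts))
        # -> candidate ring: ['acct_1', 'acct_2', 'acct_3']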
  • Data Scientist

    Redhorse Corporation

    Data engineer job in Tampa, FL

    About the Organization: Now is a great time to join Redhorse Corporation. We are a solution-driven company delivering data insights and technology solutions to customers with missions critical to U.S. national interests. We're looking for thoughtful, skilled professionals who thrive as trusted partners, building technology-agnostic solutions, and who want to apply their talents supporting customers with difficult and important mission sets.

    About the Role: Redhorse Corporation is seeking a highly skilled Data Scientist to join our team supporting the United States Central Command (USCENTCOM) Directorate of Logistics (CCJ4). You will play a critical role in accelerating the delivery of AI-enabled capabilities within the Joint Logistics Common Operating Picture (JLOGCOP), directly impacting USCENTCOM's ability to promote international cooperation, respond to crises, deter aggression, and build resilient logistics capabilities for our partners. This is a high-impact role contributing to national security and global stability. You will be working on a custom build of AI/ML capabilities into the JLOGCOP, leveraging dozens of data feeds to enhance decision-making and accelerate planning for USCENTCOM missions.

    Key Responsibilities:
    * Communicate with the client regularly regarding enterprise values and project direction.
    * Find the intersection between business value and achievable technical work.
    * Articulate and translate business questions into technical solutions using available DoD data.
    * Explore datasets to find meaningful entities and relationships.
    * Create data ingestion and cleaning pipelines (a minimal sketch follows the listing).
    * Develop applications and effective visualizations to communicate insights.
    * Serve as an ambassador for executive DoD leadership to sponsor data literacy growth across the enterprise.

    Required Experience/Clearance:
    * US citizen with a Secret US government clearance. Applicants who are not US citizens and who do not have a current and active Secret security clearance will not be considered for this role.
    * Ability to work independently to recommend solutions to the client, and as part of a team to accomplish tasks.
    * Experience with functional programming (Python, R, Scala) and database languages (SQL).
    * Familiarity using AI/ML tools to support logistics use cases.
    * Ability to discern which statistical approaches are appropriate for different contexts.
    * Experience communicating key findings with visualizations.
    * 8+ years of professional experience.
    * Master's degree in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).

    Desired Experience:
    * Experience with cloud-based development platforms.
    * Experience with large-scale data processing tools.
    * Experience with data visualization tools.
    * Ph.D. in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).

    Equal Opportunity Employer/Veterans/Disabled

    Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this site as a result of your disability. You can request reasonable accommodations by contacting Talent Acquisition at ***********************************

    Redhorse Corporation shall, in its discretion, modify or adjust the position to meet Redhorse's changing needs. This job description is not a contract and may be adjusted as deemed appropriate in Redhorse's sole discretion.
    $63k-91k yearly est. Auto-Apply 60d+ ago
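    For illustration only (not part of the posting): a minimal sketch of an ingestion-and-cleaning pipeline of the kind the listing names. The file name, columns, and cleaning rules are invented.

        # Minimal sketch: ingest a CSV feed and apply basic cleaning steps.
        import pandas as pd

        def clean_shipments(path: str) -> pd.DataFrame:
            df = pd.read_csv(path)
            # Normalize column names to snake_case.
            df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
            # Drop duplicate records and rows missing key fields.
            df = df.drop_duplicates(subset="shipment_id")
            df["ship_date"] = pd.to_datetime(df["ship_date"], errors="coerce")
            return df.dropna(subset=["shipment_id", "ship_date"])

        # df = clean_shipments("shipments.csv")  # hypothetical input file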
  • Data Scientist

    Core One

    Data engineer job in Tampa, FL

    Join our team at Core One! Our mission is to be at the forefront of devising analytical, operational, and technical solutions to our Nation's most complex national security challenges. In order to achieve our mission, Core One values people first! We are committed to recruiting, nurturing, and retaining top talent! We offer a competitive total compensation package that sets us apart from our competition. Core One is a team-oriented, dynamic, and growing company that values exceptional performance!

    *This position requires an active TS/SCI clearance.*

    Responsibilities:
    * Provide data science (DS) and operations research (OR) capabilities on-site for a combatant command Operation Assessment Division.
    * Design, develop, and apply a variety of data collection and decision analytics processes and applications, including the employment of mathematical, statistical, and other analytic methods.
    * Identify effective, efficient, and innovative technical solutions for meeting Division data and automation requirements, including potential artificial intelligence (AI) and machine learning (ML) solutions.
    * Develop automated applications, data visualizations, information displays, decision briefings, and analytic papers, and facilitate senior leadership decisions with analytic products.
    * Identify and develop data stream interfaces for authoritative data sources to support assessments and risk analysis.
    * Integrate Division functions and products into the Command and Control of the Information Environment (C2IE) system, MAVEN Smart Systems, and/or Advana.
    * Build digital solutions using programming applications (e.g., R, R/Shiny, Python) to digitalize and partially or fully automate data collection, analysis, and staff processes while accelerating the rate at which the Division can execute tasks.
    * Develop and lead small teams in the development of real-time/near-real-time data visualization and analysis methodologies and analytic tools.
    * Participate in client operational planning processes in support of Joint planning.
    * Support Knowledge Management and Information process requirements.

    Basic Qualifications:
    * Possess a Master's degree, preferably in a related technical field such as operations research, data science, math, engineering, science, or computer science.
    * 5-12 years of combined professional DS/OR experience, with a minimum of 5 years of related DS/OR experience at a Combatant Command staff, Joint or Combined Command Headquarters, or Defense Department equivalent.
    * High levels of proficiency using the following applications: R, R-Shiny, Python, Python-Shiny, SQL/PostgreSQL, Microsoft Office applications, and Microsoft SharePoint.
    * Functional knowledge of MAVEN Smart Systems, C2IE, Advana, AI, ML, Git, and large language models.
    * Top Secret (TS)/Sensitive Compartmented Information (SCI) clearance is required. Applicants are subject to a security investigation and need to meet eligibility requirements for access to classified information.

    Additional Qualifications:
    * Ability to work independently, or as the leader or member of a small team, in conducting analysis in support of assessments with high visibility, unusual urgency, or program criticality, requiring a variety of OR and DS techniques and tools.
    * Possession of excellent oral and written communication skills, with the ability to communicate, prepare correspondence, and make formal presentations at the 4-Star General Officer/Flag Officer level.
    * Ability to develop and support new analytic capabilities as requirements evolve within the command for assessments.
    * Knowledge of Joint Warfighting and Combatant Command functions.

    Security Clearance: Active TS/SCI clearance is required.

    Core One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, or protected veteran status, and will not be discriminated against on the basis of disability.
    $63k-91k yearly est. Auto-Apply 36d ago
  • Expert Exploitation Specialist/Data Scientist (TS/SCI)

    Culmen International (4.3 company rating)

    Data engineer job in Tampa, FL

    About the Role: Culmen International is hiring Expert Exploitation Specialist/Data Scientists to provide support on-site at the National Geospatial-Intelligence Agency (NGA) in Tampa, FL. NGA expects to deliver AOS Metadata Cataloging and Management Services to enhance product and asset management of content, enabling rapid creation of discoverable, modular, web-enabled, and visually enriched Geospatial Intelligence (GEOINT) products for intelligence producers in NGA and across the National System for Geospatial-Intelligence (NSG). TALENT PIPELINE - qualified applicants will be contacted as soon as funding for this position is secured.

    What You'll Do in Your New Role: The Data Scientist will coordinate with our clients to understand questions and issues involving the client's datasets, then determine the best method and approach to create data-driven solutions within program guidelines. This position will be relied upon as a Subject Matter Expert (SME) and will be expected to lead or assist in the development of automated processes, architect data science solutions and automated workflows, conduct analysis, use available tools to analyze data, remain adaptable to mission requirements, and identify patterns to help solve some of the complex problems that face the DoD and Intelligence Community (IC).
    * Work with large structured/unstructured data in a modeling and analytical environment to define and create streamlined processes for the evaluation of unique datasets and to solve challenging intelligence issues.
    * Lead and participate in the design of solutions and the refinement of pre-existing processes.
    * Work with customer stakeholders, program managers, and product owners to translate roadmap features into components/tasks, estimate timelines, identify resources, suggest solutions, and recognize possible risks.
    * Use exploratory data analysis techniques to identify meaningful relationships, patterns, or trends in complex data.
    * Combine applied mathematics, programming skills, analytical techniques, and data to provide impactful insights for decision makers.
    * Research and implement optimization models, strategies, and methods to inform data management activities and analysis.
    * Apply big data analytic tools to large, diverse sets of data to deliver impactful insights and assessments.
    * Conduct peer reviews to improve the quality of workflows, procedures, and methodologies.
    * Help build high-performing teams; mentor team members, providing development opportunities to increase their technical skills and knowledge.

    Required Qualifications:
    * TS/SCI clearance; CI poly eligible.
    * Minimum of 18 years combined experience (a combination of years of experience and professional certifications/trainings can be used in lieu of a degree).
    * BS in a related field with graduate-level work.
    * Expert proficiency in Python and other programming languages applicable to automation development.
    * Demonstrated experience designing and implementing workflow automation systems.
    * Advanced experience with ETL (Extract, Transform, Load) processes for geospatial data (a minimal sketch follows the listing).
    * Expertise in integrating disparate systems through API development and implementation.
    * Experience developing and deploying enterprise-scale automation solutions.
    * Knowledge of NGA's Foundation GEOINT products, data types, and delivery methods.
    * Demonstrated experience with database design, implementation, and optimization.
    * Experience with digital media generation systems and automated content delivery platforms.
    * Ability to analyze existing workflows and develop technical solutions to streamline processes.
    * Knowledge of DLA systems and interfaces, particularly MEBS and WebFLIS.
    * Expertise in data quality assurance and validation methodologies.
    * Experience with geospatial data processing, transformation, and delivery automation.
    * Proficiency with ArcGIS tools and the GEODEC and ACCORD software systems.
    * Understanding of cartographic principles and standards for CADRG/ECRG products.
    * Strong analytical skills for identifying workflow inefficiencies and implementing solutions.
    * Experience writing technical documentation, including SOPs, CONOPS, and system designs.

    Desired Qualifications:
    * Certification(s) in relevant automation technologies or programming languages.
    * Experience with DevOps practices and CI/CD implementation.
    * Knowledge of cloud-based automation solutions and their implementation in government environments.
    * Experience with machine learning applications for GEOINT workflow optimization.
    * Expertise in data analytics and visualization for workflow performance metrics.
    * Understanding of NGA's enterprise architecture and integration points.
    * Experience implementing RPA (Robotic Process Automation) solutions.
    * Knowledge of secure coding practices and cybersecurity principles.
    * Demonstrated expertise in digital transformation initiatives.
    * Experience mentoring junior staff in automation techniques and best practices.
    * Background in agile development methodologies.
    * Understanding of human-centered design principles for workflow optimization.

    About the Company: Culmen International is committed to enhancing international safety and security, strengthening homeland defense, advancing humanitarian missions, and optimizing government operations. With experience in over 150 countries, Culmen supports our clients in accomplishing critical missions in challenging environments.
    * Exceptional medical/dental/vision insurance; premiums for employees are 100% paid by Culmen, and dependent coverage is available at a nominal rate (including same- or opposite-sex domestic partners).
    * 401k - vested immediately, with a 4% match.
    * Life insurance and disability paid by the company.
    * Supplemental insurance available.
    * Opportunities for training and continuing education.
    * 12 paid holidays.

    To learn more about Culmen International, please visit **************

    At Culmen International, we are committed to creating and sustaining a workplace that upholds the principles of Equal Employment Opportunity (EEO). We believe in the importance of fair treatment and equal access to opportunities for all employees and applicants. Our commitment to these principles is unwavering across all our operations worldwide.
    $62k-90k yearly est. Auto-Apply 30d ago
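    For illustration only (not part of the posting): a minimal sketch of one ETL hop for geospatial data — read, reproject, filter, write — using GeoPandas. The paths, delivery format, and target CRS are invented.

        # Minimal sketch: extract, transform, and deliver a geospatial layer.
        import geopandas as gpd

        def etl_features(src: str, dst: str) -> None:
            gdf = gpd.read_file(src)            # extract (shapefile, GeoJSON, ...)
            gdf = gdf.to_crs(epsg=4326)         # transform to WGS84
            gdf = gdf[gdf.geometry.notna()]     # drop records with no geometry
            gdf.to_file(dst, driver="GeoJSON")  # load to the delivery format

        # etl_features("input_features.shp", "delivery.geojson")  # hypothetical paths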
  • Data Governance & Metadata Scientist

    NV5

    Data engineer job in Saint Petersburg, FL

    NV5 Geospatial is actively recruiting a Data Governance & Metadata Scientist. Strong capabilities in developing, maintaining, and optimizing an outward-facing data catalog integrating geospatial and research layers are required. The Data Governance & Metadata Scientist will be based remotely, supporting US Southern Command. US citizenship, along with the ability to successfully pass a basic background check for access to US military bases, is required for employment. While no clearance is required, a Secret or higher clearance is preferred.

    Work Setting: This role offers flexibility in location, with the option to work from any NV5 regional office or remotely from home. Potential travel up to 5-15% of the time.

    NV5 is a global technology solutions and consulting services company with a workforce of over 4,500 professionals in more than 100 offices worldwide. NV5's continued growth has been spurred through strategic investments in firms with unique capabilities to help current and future customers solve the world's toughest problems. The NV5 family brings together talent across a wide range of markets and fields, including Professional Engineers, Professional Land Surveyors, Architects, Photogrammetrists, GIS Professionals, Software Developers, IT, Project Management Professionals, and more. At NV5 Geospatial, we are a collaboration of intelligent, innovative thinkers who care for each other, our communities, and the environment. We value both heart and head, and the diversity of our people and their experiences, because that is how we continue to grow as leaders in our industry and expand our individual and collective potential.

    Responsibilities:
    * Implement data lineage tracking and metadata synchronization to ensure consistency across Databricks, Kubernetes, and research dashboards.
    * Support ontology-driven decision support systems, mapping structured and unstructured datasets to enhance data interoperability.
    * Develop automated metadata validation and quality control mechanisms, ensuring research datasets maintain compliance with DoD governance frameworks (a minimal sketch follows the listing).
    * Integrate metadata into platforms and implement tagging policies consistent with program standards.
    * Utilize GitLab pipelines and CI/CD tools for publishing and indexing routines.
    * Publish or embed outputs in approved web services for research dashboards intended for external access.
    * Utilize Azure-native indexing services, such as Cognitive Search, to implement federated metadata and research product discovery pipelines.
    * Ensure security boundary compliance.

    Qualifications

    Minimum Requirements:
    * Bachelor's degree in Computer Science, Data Engineering, Geographic Information Systems (GIS), or a related field, or five (5) years of equivalent experience in data engineering, full-stack development, and metadata-driven data cataloging.
    * Demonstrated experience in developing interactive data portals, implementing API-driven search and data exchange, and integrating geospatial data layers into web applications.
    * Experience working with Databricks, Esri ArcGIS Feature Services, OpenLineage, and metadata management solutions.
    * Software development skills in Python, JavaScript (React, Angular, Vue), SQL, and RESTful API design.
    * Proficiency in cloud environments such as AWS, Azure, or Google Cloud, and in implementing scalable, data-driven applications.
    * Ability to manage and prioritize complex project tasks.

    Preferred:
    * Microsoft Certified Azure Data Engineer, AWS Certified Data Analytics Specialty, or Esri Web GIS Developer certification.
    * Portuguese or Spanish language skills.
    * Experience with government IT programs and environments.

    Clearance Requirement: None; active Secret or TS/SCI preferred. Please be aware that some of our positions may require the ability to obtain a security clearance. Security clearances may only be granted to U.S. citizens. In addition, applicants who accept a conditional offer of employment may be subject to government security investigation(s) and must meet eligibility requirements for access to classified information. Employment is contingent upon successful completion of a background check and drug screening.

    NV5 offers a competitive compensation and benefits package including medical, dental, life insurance, FTO, 401(k), and professional development/advancement opportunities. NV5 provides equal employment opportunities (EEO) to all applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran, in accordance with applicable federal, state, and local laws. NV5 complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including, but not limited to, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. #LI-Remote
    $63k-92k yearly est. Auto-Apply 8d ago
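    For illustration only (not part of the posting): a minimal sketch of automated metadata validation — checking catalog records for required, non-empty fields. The field set is invented; a real implementation would encode the program's governance standard.

        # Minimal sketch: validate catalog records against required metadata fields.
        REQUIRED_FIELDS = {"title", "description", "owner", "classification", "updated"}

        def validate_record(record: dict) -> list[str]:
            """Return a list of problems; an empty list means the record passes."""
            problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
            problems += [
                f"empty field: {f}"
                for f in REQUIRED_FIELDS & record.keys()
                if not str(record[f]).strip()
            ]
            return problems

        record = {"title": "Coastal imagery", "owner": "", "classification": "UNCLASS"}
        print(validate_record(record))  # flags missing and empty fields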
  • Data Engineer - Wealth Technology

    Dynasty Financial Partners (3.7 company rating)

    Data engineer job in Saint Petersburg, FL

    Dynasty Financial Partners is seeking a motivated, self-starting individual to join our team and pursue our shared goal of providing exceptional service for our clients. As a Data Engineer, you will serve as a dedicated resource to support our Data Lake and assist on internal and external client solutions/product-based support. You will identify, develop, and implement product and technology-based solutions to increase operational efficiency, improve accuracy, and support platform adoption.

    Please Note: This job is an IN-OFFICE position located in St. Petersburg, FL. If you are unable or unwilling to fulfill this requirement of the position, please refrain from applying.

    Responsibilities
    * Design, develop, and maintain reliable and efficient data pipelines using Azure Data Factory and Azure Databricks (a minimal sketch of one pipeline step follows the listing).
    * Construct and maintain complex data processing systems, ensuring they are scalable, repeatable, and secure.
    * Develop data pipelines that answer critical business questions and solve internal and external reporting needs.
    * Develop and maintain data models and data architecture to support a data lakehouse architecture.
    * Implement automated data validation, testing, and error-handling mechanisms within data pipelines.
    * Stay up to date with the latest Azure services and technologies, as well as industry best practices, to continuously improve data engineering processes.
    * Document all data engineering processes, systems, and designs to ensure knowledge sharing and business continuity.

    Requirements
    * Bachelor's degree in computer science, engineering, or a related field; a master's degree is a plus.
    * Minimum of 3 years of experience as a Data Engineer with a focus on Azure cloud services.
    * Strong experience with Azure Databricks for data processing and analytics.
    * Proficient in Azure Data Factory for creating, scheduling, and orchestrating data workflows.
    * Familiarity with data modeling, SQL, and data warehousing principles.
    * Experience with programming languages such as Python, Scala, or SQL for data manipulation and transformation.
    * Solid understanding of CI/CD practices and version control systems (e.g., Git).
    * Excellent problem-solving skills, attention to detail, and the ability to work in a fast-paced environment.
    * Strong communication and collaboration skills to work effectively with both technical and non-technical team members.

    Relevant Certifications
    * Azure Data Engineer Associate
    * Azure Data Fundamentals
    * Microsoft Power BI

    Preferred, but not Required
    * Knowledge of Data Lake and Lakehouse architecture patterns
    * Experience with Databricks Unity Catalog for data governance and metadata management
    * Work history in the Wealth Management industry is a plus

    BENEFITS
    * Health insurance
    * Dental insurance
    * Vision insurance
    * Retirement plan 401(k)
    * 401(k) matching
    * Paid Time Off
    * FSA/HSA benefits plans
    * Disability benefits
    * Voluntary Life Insurance
    * Basic Life Insurance

    EQUAL EMPLOYMENT OPPORTUNITY
    Dynasty Financial Partners is committed to providing equal employment opportunities and ensuring that all employment-related decisions are made without regard to race, color, sex, age, national origin, religion, physical or mental disability (unrelated to the ability to perform job duties), veteran status, or any other protected status under applicable law.
    $87k-113k yearly est. 10d ago
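    For illustration only (not part of the posting): a minimal sketch of one bronze-to-silver refinement step in a Databricks lakehouse pipeline. The paths and columns are invented; on Databricks the SparkSession is provided as `spark`.

        # Minimal sketch: refine a bronze Delta table into a silver table.
        from pyspark.sql import functions as F

        bronze = spark.read.format("delta").load("/lake/bronze/positions")

        silver = (bronze
                  .dropDuplicates(["account_id", "as_of_date"])       # dedupe
                  .withColumn("as_of_date", F.to_date("as_of_date"))  # typed date
                  .filter(F.col("market_value").isNotNull()))         # basic validation

        (silver.write
         .format("delta")
         .mode("overwrite")
         .save("/lake/silver/positions"))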
  • Data Platform Engineer- 6 Month Assignment

    MaxHealth

    Data engineer job in Tampa, FL

    Job Description: MaxHealth is seeking a highly skilled Data Platform Engineer (DBA) to join our data engineering team to modernize and optimize our data pipelines. The Data Platform Engineer (DBA) will be responsible for the management, optimization, modernization, and governance of our enterprise data warehouse. This role requires strong technical expertise in Azure Data Factory (ADF), Fabric, or other similar technologies; SQL Server Integration Services (SSIS); SQL Server; and modern BI platforms (Domo and Power BI).

    Location: Hybrid/Tampa. 1099 contractor; starting salary $110,000 per annum, based on experience. This is a 6-month assignment with the possibility of extension into a permanent position.

    Key Responsibilities

    Modernization & Optimization
    · Partner with IT leadership and data engineering teams to design and implement a modernized, scalable, and cloud-ready data platform.
    · Design, develop, and maintain modernized ETL pipelines using Azure Data Factory, Fabric, Databricks, and/or SSIS to integrate multiple data sources into the enterprise data warehouse.
    · Recommend and implement improvements to ETL processes, storage, indexing strategies, and query performance.
    · Explore and adopt automation tools, monitoring systems, and DevOps practices for database operations.
    · Support migration strategies from legacy systems to cloud or hybrid architectures.
    · Optimize and refine existing ETL processes to improve performance, scalability, and maintainability.

    Database Administration & Operations
    · Oversee the daily management of the SQL data warehouse, ensuring high availability, security, and optimal performance.
    · Implement and maintain backup, recovery, and disaster recovery strategies.
    · Monitor database performance and tune queries, indexes, and structures to maximize efficiency.
    · Manage user access, roles, and permissions to uphold least-privilege security standards.
    · Maintain documentation for database structures, configurations, and procedures.
    · Design, build, and maintain ETL/ELT processes across SQL Agent jobs, stored procedures, and SSIS packages, and optimize existing ETL processes for performance and reliability.

    Governance & Controls
    · Implement data quality checks, error handling, and monitoring to ensure accurate and consistent data flows (a minimal sketch of one such check follows the listing).
    · Adopt DevOps/DataOps practices for automation, monitoring, and repeatable deployments.
    · Establish and enforce data management policies, including auditing, logging, and compliance monitoring.
    · Ensure adherence to HIPAA, HITRUST, and other healthcare data security requirements.
    · Implement robust change management processes for database updates, schema changes, and production deployments.
    · Utilize a code repository (Git or similar) to manage, version, and document ETL code.
    · Work closely with Finance, Clinical, and Analytics teams to ensure data pipelines and reporting processes are accurate, consistent, and reliable.

    Collaboration & Support
    · Partner with business intelligence (BI) and analytics teams to ensure data is structured and accessible for actionable reporting.
    · Work with application developers to optimize integration with downstream applications and operational systems.
    · Provide advanced support for ETL and database-related incidents, outages, and service requests.
    · Troubleshoot issues, perform root cause analysis, and implement preventive measures.

    Job Qualifications
    · Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent experience.
    · 5+ years of experience as a DBA in a large-scale SQL environment.
    · 5+ years of experience in data engineering or data platform roles, with expertise in ETL/ELT pipeline development and orchestration.
    · Proven experience managing workloads in cloud-native data management platforms, preferably in Azure (Azure Synapse, Microsoft Fabric, Databricks, Data Factory, or market equivalents such as Snowflake, BigQuery, Redshift, dbt, or Airflow).
    · Strong knowledge of T-SQL, stored procedures, indexing strategies, and performance tuning.
    · Experience with ETL processes, data warehouse management, and BI/reporting environments.
    · Proven track record of implementing controls, monitoring systems, and compliance frameworks.
    · Deep understanding of data security, governance, and regulatory compliance (HIPAA, HITRUST, SOC 2, etc.).

    Preferred Skills
    · Experience with cloud database platforms (preferably Azure SQL, Snowflake, etc.).
    · Exposure to data warehouse modernization initiatives and architecture redesigns.
    · Familiarity with data cataloging and governance platforms and practices (e.g., Purview, Collibra, or Alation).
    · Knowledge of automation tools (Ansible, Terraform, Liquibase, etc.) and DevOps practices.
    · Familiarity with healthcare data standards (HL7, FHIR, claims/EMR data).
    · Professional certifications such as Microsoft Certified: Azure Database Administrator Associate, AWS Certified Database - Specialty, or equivalent.

    ABOUT MAXHEALTH
    MaxHealth is dedicated to simplifying healthcare and ensuring healthier futures. Founded in 2015, MaxHealth is a leading primary care platform focused on providing high-quality, integrated care to adult and senior patients throughout Florida. We provide care for more than 120,000 patients, most of whom are beneficiaries of government-sponsored healthcare programs like Medicare, or of health plans purchased on the Affordable Care Act exchange marketplace. MaxHealth is a rapidly growing medical practice with more than 50 clinics spread across central and southern Florida. MaxHealth also partners with independent providers who are like-minded and uses its platform to help them provide high-quality care. We are customer-centered, compassionate, results-driven, proactive, collaborative, and adaptable in executing our vision to help patients live their best lives. Our mission is to deliver quality care, a simplified experience, and happiness. One patient at a time. #IND123

    Job Posted by ApplicantPro
    $110k yearly 26d ago
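    For illustration only (not part of the posting): a minimal sketch of an automated data-quality check against SQL Server via pyodbc. The DSN, table, column, and alert threshold are invented; identifiers are assumed to come from trusted configuration, not user input.

        # Minimal sketch: alert when a key column's null rate exceeds a threshold.
        import pyodbc

        CONN_STR = "DSN=warehouse;Trusted_Connection=yes"  # hypothetical DSN

        def null_rate(table: str, column: str) -> float:
            with pyodbc.connect(CONN_STR) as conn:
                row = conn.execute(
                    f"SELECT AVG(CASE WHEN {column} IS NULL THEN 1.0 ELSE 0.0 END) "
                    f"FROM {table}"
                ).fetchone()
            return float(row[0] or 0.0)

        rate = null_rate("dbo.claims", "member_id")
        if rate > 0.01:  # more than 1% of rows lack a member_id
            print(f"data quality alert: {rate:.2%} null member_id")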
  • Senior Data Engineer Architecture C4ISR

    SOSi

    Data engineer job in Tampa, FL

    Founded in 1989, SOSi is among the largest private, founder-owned technology and services integrators in the defense and government services industry. We deliver tailored solutions, tested leadership, and trusted results to enable national security missions worldwide.

    Job Description

    Overview: **This position is contingent upon award of contract.** SOS International LLC (SOSi) is seeking a Data Engineer (C4ISR Architecture) to support our customer at MacDill AFB, Florida.

    Essential Job Duties:
    * Provide administration of full motion video (FMV) dissemination systems, including Forcepoint Raise the Bar-compliant cross domain system feed management portals at USCENTCOM HQ and other locations as required.
    * Engineer, configure, and deploy FMV dissemination solutions in support of USCENTCOM, component, and coalition partner requirements.
    * Provide subject matter expertise on C4ISR systems employed by USCENTCOM, components, and coalition partners in order to integrate ISR platforms supporting USCENTCOM operational requirements.
    * When required, represent USCENTCOM J2 equities to Service and/or CSA Programs of Record such as DISA UVDS, Air Force DGS, NSA CuAS, and NGA MAVEN to ensure interoperability of all C4ISR systems supporting USCENTCOM.
    * Provide crisis operations support for FMV dissemination to JWICS, SIPR Rel, BICES, CPN Bi-Lats, CPN-X, TALON, and SEAGULL as required.
    * Provide network engineering, FMV routing, and dissemination in support of integrating coalition and service federated PED nodes.

    Qualifications

    Minimum Requirements:
    * Active, in-scope TS/SCI clearance.
    * Experience providing support during crisis operations for FMV dissemination.
    * Experience providing network engineering, FMV routing, and dissemination in support of integrating coalition and service federated PED nodes.

    Preferred Qualifications:
    * Minimum 12 years of experience related to the specific labor category, with at least a portion of the experience within the last 2 years.
    * Master's degree in an area related to the labor category from a college or university accredited by an agency recognized by the U.S. Department of Education; or a Bachelor's degree related to the labor category from a college or university accredited by an agency recognized by the U.S. Department of Education and an additional 5 years of related senior experience, for a total of 17 years, as a substitute for the Master's degree.

    Additional Information

    Work Environment: Working conditions are normal for an office environment.

    Working at SOSi: All interested individuals will receive consideration and will not be discriminated against for any reason.
    $72k-99k yearly est. 1d ago
  • Data Engineer

    Stemboard

    Data engineer job in Tampa, FL

Job Description STEMBoard is a technology solutions company that creates smart systems and software solutions for government and large-scale private sector clients. We are growing fast and need passionate, innovative people who love working with technology and are ready to make an impact.

Here's what you can expect from us: You will work with great people who love what they do: our team includes published authors, patent holders, and internationally renowned engineers. We care about our employees: we invest in professional development and reward creativity. Starting day one, every employee is bonus eligible and receives 20 days of paid leave. We invest in the community: STEMBoard boasts a lively education and outreach program that teaches engineering to the historically underserved.

Principal Duties and Responsibilities: A data engineer has a deep understanding of performance optimization and data pipelining. In addition to the baseline skills of a data analyst, data engineers make raw data more useful for the enterprise. Data engineers create and integrate application programming interfaces (APIs). Their technical skills generally include multiple programming languages and deep knowledge of SQL database design. The data engineer role requires more in-depth programming knowledge for integrating complex models and using advanced software library frameworks to distribute large, clustered data sets. Data engineers collect and arrange data in a form that is useful for analytics. Basic knowledge of machine learning is also required to build efficient and accurate data pipelines that meet the needs of downstream users, such as data scientists, who create the models and analytics that produce insight.

The Data Engineer shall perform the following tasks:
Develop, maintain, and test infrastructures for data generation to transform data from various structured and unstructured data sources.
Develop complex queries to ensure accessibility while optimizing the performance of NoSQL and/or big data infrastructure.
Create and maintain optimal data pipeline architecture.
Build and maintain the infrastructure to support extraction, transformation, and loading (ETL) of data from a wide variety of data sources (a minimal pipeline sketch follows this listing).
Extract data from multiple data sources, relational SQL and NoSQL databases, and other platform APIs, for data ingestion and integration.
Configure and manage data analytic frameworks and pipelines using databases and tools such as NoSQL, SQL, HDInsight, MongoDB, Cassandra, Neo4j, GraphDB, OrientDB, Spark, Hadoop, Kafka, Hive, and Pig.
Apply distributed systems concepts and principles such as consistency and availability, liveness and safety, durability, reliability, fault tolerance, and consensus algorithms.
Administer cloud computing and CI/CD pipelines, including Azure, Google Cloud, and Amazon Web Services (AWS).
Coordinate with stakeholders, including product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs.

Requirements:
Experience: 1+ years of experience with software engineering, data science, or related experience.
Education: Bachelor's degree in STEM, with a preference for Computer Science or Software Engineering.
Tools: Proficient with one or more programming languages (Java, C++, Python, R, etc.).
Security Clearance: minimum TOP SECRET level.
Verifiable work experience with data structures, database management, distributed computing, and API-driven architectures using SQL and NoSQL engines.
Proficient in modeling frameworks and practices such as Unified Modeling Language (UML), Agile development, and Git operations.

Benefits: Healthcare, Vision, and Dental Insurance; 20 Days of PTO; 401K Matching; Training/Certification Reimbursement; Short term/Long term disability; Parental/Maternity Leave; Military Leave; Life Insurance.

STEMBoard is committed to hiring and retaining a diverse workforce. All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation/gender identity, age, or genetic information. Selected applicant will be subject to a background investigation. STEMBoard is an Equal Opportunity/Affirmative Action employer.
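For readers skimming these duties, here is a minimal, hedged sketch of the extract-transform-load pattern the listing describes, in plain Python. The file, table, and column names are hypothetical, and SQLite merely stands in for whatever SQL engine a real pipeline would target.

```python
# Minimal ETL sketch. File, table, and column names are hypothetical;
# SQLite stands in for the real target SQL engine.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Read raw rows from a CSV source."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Normalize types and drop records that fail basic quality checks."""
    out = []
    for r in rows:
        try:
            out.append((r["customer_id"].strip(), float(r["amount"]), r["order_date"]))
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad records to a dead-letter store
    return out

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Append into the target table; a real job would dedupe or MERGE for idempotency."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(customer_id TEXT, amount REAL, order_date TEXT)"
        )
        conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```

A production version would add orchestration, retries, and monitoring, as the pipeline-architecture duties in the listing imply.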
    $72k-99k yearly est. 12d ago
  • Data Engineer

    Silverthorne Advisory Group LLC

    Data engineer job in Tampa, FL

Benefits: 401(k); 401(k) matching; Bonus based on performance; Company parties; Competitive salary; Dental insurance; Health insurance; Opportunity for advancement; Paid time off; Training & development; Vision insurance

Job Description Silverthorne Advisory Group is seeking a skilled and highly motivated Data Engineer to join an exciting and growing opportunity within the Defense sector. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our Advana data infrastructure and systems. Your expertise in ETL, Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks will be essential in ensuring efficient data processing and analysis.

Responsibilities
Design, develop, and implement end-to-end data pipelines, utilizing ETL processes and technologies such as Databricks, Python, Spark, PySpark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks (a short PySpark sketch follows this listing).
Create and optimize data pipelines from scratch, ensuring scalability, reliability, and high-performance processing.
Perform data cleansing, data integration, and data quality assurance activities to maintain the accuracy and integrity of large datasets.
Leverage big data technologies to efficiently process and analyze large datasets, particularly those encountered in a federal agency.
Troubleshoot data-related problems and provide innovative solutions to address complex data challenges.
Implement and enforce data governance policies and procedures, ensuring compliance with regulatory requirements and industry best practices.
Work closely with cross-functional teams to understand data requirements and design optimal data models and architectures.
Collaborate with data scientists, analysts, and stakeholders to provide timely and accurate data insights and support decision-making processes.
Maintain documentation for software applications, workflows, and processes.
Stay updated on emerging trends and advancements in data engineering and recommend suitable tools and technologies for continuous improvement.

Requirements
Top Secret clearance or higher required.
High level of proficiency in ETL processes and demonstrated, hands-on experience with technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks.
Strong problem-solving skills and ability to solve complex data-related issues.
Demonstrated experience working with large datasets and leveraging big data technologies to process and analyze data efficiently.
Understanding of data modeling/visualization, database design principles, and data governance practices.
Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
Detail-oriented mindset with a commitment to delivering high-quality results.

Benefits
Compensation - We carefully consider a wide range of compensation factors, including but not limited to prior experience, skills, expertise, location, and other considerations permitted by law.
Healthcare - We offer Health, Vision, and Dental Plans for our employees and their families.
Retirement Plan - We invest in your future with a competitive 401(k) plan, where we match 100% of your contributions up to your first 6% and give you access to Vanguard Admiral funds.
Paid Time Off - Based on length of service, we offer a generous amount of paid leave.
Bonus System - As you invest in us, we invest in you. We offer bonuses to all employees who meet and exceed goals throughout the year.
Professional Development - Support for career growth through training programs and certifications.
Company Retreats & Team Events - Sponsored trips, team-building activities, and annual conferences related to your skill set.
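The cleansing and pipeline duties above map naturally onto PySpark, one of the listed technologies. Below is a minimal sketch, assuming a hypothetical JSON landing zone and column names; on Databricks the write target would typically be a Delta table rather than plain Parquet.

```python
# Hedged PySpark cleansing sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse-transactions").getOrCreate()

raw = spark.read.json("/data/raw/transactions/")  # hypothetical landing zone

clean = (
    raw.dropDuplicates(["transaction_id"])            # data cleansing
       .filter(F.col("amount").isNotNull())           # basic quality gate
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())     # simple lineage metadata
)

# On Databricks this would usually be a Delta table; plain Parquet keeps
# the sketch portable.
clean.write.mode("overwrite").parquet("/data/curated/transactions/")
```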
    $72k-99k yearly est. 8d ago
  • Azure Data Engineer

    Akkodis

    Data engineer job in Tampa, FL

Akkodis is seeking an Azure Data Engineer for a contract with a client in Tampa, FL. You will design and build scalable data pipelines and architectures on the Azure cloud platform, ensuring high performance and reliability.

Rate Range: $42/hour to $44/hour; the rate may be negotiable based on experience, education, geographic location, and other factors.

Azure Data Engineer job responsibilities include:
* Design, develop, and optimize end-to-end data pipelines and ETL/ELT processes using Azure Data services and frameworks (a hedged ELT sketch follows this listing).
* Build scalable data solutions leveraging Azure Databricks, PySpark, and Snowflake for batch and real-time data processing.
* Develop and maintain data models and schemas in relational and NoSQL databases such as Postgres and MongoDB.
* Write efficient, reusable code in Python and SQL to transform and load data across multiple systems.
* Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
* Monitor and improve data pipeline performance, ensuring reliability, scalability, and compliance with data governance standards.

Required Qualifications:
* Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
* Minimum 5+ years of experience in data engineering with strong expertise in Azure cloud services.
* Hands-on experience with Azure Databricks, PySpark, Snowflake, and building scalable data pipelines and architectures.
* Strong proficiency in Python and SQL, with experience in relational and NoSQL databases (Postgres, MongoDB) and ETL/ELT processes.

If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, feel free to contact me at ****************************.

Pay Details: $42.00 to $44.00 per hour

Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave, including Paid Sick Leave or any other paid leave required by federal, state, or local law, as well as holiday pay where applicable.

Equal Opportunity Employer/Veterans/Disabled. Military connected talent encouraged to apply. To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to ******************************************************

The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
* The California Fair Chance Act
* Los Angeles City Fair Chance Ordinance
* Los Angeles County Fair Chance Ordinance for Employers
* San Francisco Fair Chance Ordinance

Massachusetts Candidates Only: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
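As a hedged illustration of the ELT pattern named in the responsibilities, the sketch below pushes the transform into Snowflake with an idempotent MERGE via the public snowflake-connector-python package; the account, credentials, and table names are hypothetical placeholders, not anything from the client's environment.

```python
# Hedged ELT sketch: stage-then-MERGE inside Snowflake so the transform runs
# in-database. Account, credentials, and table names are hypothetical.
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.orders AS tgt
USING staging.orders_raw AS src
  ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
  tgt.amount = src.amount, tgt.updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
  VALUES (src.order_id, src.amount, CURRENT_TIMESTAMP())
"""

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_service",     # hypothetical
    password="***",
    warehouse="TRANSFORM_WH",
)
try:
    conn.cursor().execute(MERGE_SQL)  # MERGE keeps repeated runs idempotent
finally:
    conn.close()
```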
    $42-44 hourly Easy Apply 11d ago
  • Hadoop Admin / Developer

US Tech Solutions 4.4 company rating

    Data engineer job in Tampa, FL

US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit our website ************************ We are constantly on the lookout for professionals to fulfill the staffing needs of our clients; we set the correct expectations and thus become an accelerator in the mutual growth of the individual and the organization. With that intent in mind, we would like you to consider the job opening with US Tech Solutions that fits your expertise and skill set.

Job Description
Position: Hadoop Admin / Developer
Duration: 6+ Months / Contract-to-Hire / Fulltime
Location: Tampa, FL
Interview: Phone & F2F/Skype

Qualifications
• Advanced knowledge in administration of Hadoop components including HDFS, MapReduce, Hive, YARN, Tez, Flume
• Advanced skills in performance tuning and troubleshooting Hadoop jobs
• Intermediate skills in data ingestion to/from Hadoop (a brief ingestion sketch follows this listing)
• Knowledge of Greenplum, Informatica, Tableau, SAS desired
• Knowledge of Java desired

Additional Information
Chandra Kumar ************ Chandra at ustechsolutionsinc com
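As a hedged sketch of the ingestion skill above, the snippet below loads a raw HDFS file into a partitioned Hive table with PySpark; the paths, database, and table names are hypothetical, and a configured Hive metastore is assumed.

```python
# Hedged ingestion sketch: land a raw HDFS file in a partitioned Hive table.
# Paths, database, and table names are hypothetical; a configured Hive
# metastore is assumed.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hdfs-to-hive-ingest")
    .enableHiveSupport()
    .getOrCreate()
)

# Assumes the CSV carries an event_date column used for partitioning.
df = spark.read.csv("hdfs:///landing/events/2024-01-01.csv", header=True)

(
    df.write
      .mode("append")
      .partitionBy("event_date")  # partition pruning speeds up later Hive queries
      .saveAsTable("raw_db.events")
)
```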
    $85k-111k yearly est. 1d ago
  • Data Engineer-Lead - Project Planning and Execution

DPR Construction 4.8 company rating

    Data engineer job in Tampa, FL

We are a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency across all levels. We are looking for a talented Data Engineer to join our team and contribute to developing robust data solutions that support our business goals. This role is ideal for someone who enjoys combining technical problem-solving with stakeholder collaboration. You will collaborate with business leaders to understand data needs and work closely with a global engineering team to deliver scalable, timely, and high-quality data solutions that power insights and operations.

Responsibilities
* Own data delivery for specific business verticals by translating stakeholder needs into scalable, reliable, and well-documented data solutions.
* Participate in requirements gathering, technical design reviews, and planning discussions with business and technical teams.
* Partner with the extended data team to define, develop, and maintain shared data models and definitions.
* Design, develop, and maintain robust data pipelines and ETL processes using tools like Azure Data Factory and Python across internal and external systems.
* Proactively manage data quality, error handling, monitoring, and alerting to ensure timely and trustworthy data delivery (a minimal quality-gate sketch follows this listing).
* Perform debugging, application issue resolution, and root cause analysis, and assist in proactive/preventive maintenance.
* Support incident resolution and perform root cause analysis for data-related issues.
* Create and maintain both business requirement and technical requirement documentation.
* Collaborate with data analysts, business users, and developers to ensure the accuracy and efficiency of data solutions.
* Collaborate with platform and architecture teams to align with best practices and extend shared data engineering patterns.

Qualifications
* Minimum of 4 years of experience as a Data Engineer, working with cloud platforms (Azure, AWS).
* Proven track record of managing stakeholder expectations and delivering data solutions aligned with business priorities.
* Strong hands-on expertise in Azure Data Factory, Azure Data Lake, Python, and SQL.
* Familiarity with cloud storage (Azure, AWS S3) and integration techniques (APIs, webhooks, REST).
* Experience with modern data platforms like Snowflake and Microsoft Fabric.
* Solid understanding of data modeling, pipeline orchestration, and performance optimization.
* Strong problem-solving skills and ability to troubleshoot complex data issues.
* Excellent communication skills, with the ability to work collaboratively in a team environment.
* Familiarity with tools like Power BI for data visualization is a plus.
* Experience working with or coordinating with overseas teams is a strong plus.

Preferred Skills
* Knowledge of Airflow or other orchestration tools.
* Experience working with Git-based workflows and CI/CD pipelines.
* Experience in the construction industry or a similar field is a plus but not required.

DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education, and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.

Working at DPR, you'll have the chance to try new things, explore unique paths, and shape your future. Here, we build opportunity together, by harnessing our talents, enabling curiosity, and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company, and Newsweek. Explore our open opportunities at ********************
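The monitoring-and-alerting duty above often reduces to simple, loud checks that an orchestrator can act on. Here is a minimal sketch, with SQLite standing in for the warehouse and hypothetical table and column names; raising an exception is what lets a scheduler such as Azure Data Factory or Airflow mark the run failed and alert.

```python
# Minimal quality-gate sketch; table and column names are hypothetical and
# SQLite stands in for the real warehouse.
import sqlite3
from datetime import date, timedelta

def check_table(conn: sqlite3.Connection, table: str) -> None:
    count, max_load = conn.execute(
        f"SELECT COUNT(*), MAX(load_date) FROM {table}"
    ).fetchone()
    if count == 0:
        raise RuntimeError(f"{table}: no rows loaded")
    cutoff = (date.today() - timedelta(days=1)).isoformat()
    if max_load is None or max_load < cutoff:  # ISO date strings compare lexically
        raise RuntimeError(f"{table}: stale, last load {max_load}")

with sqlite3.connect("warehouse.db") as conn:  # hypothetical warehouse stand-in
    check_table(conn, "fact_orders")
```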
    $83k-109k yearly est. Auto-Apply 58d ago
  • Data Engineer - Machine Learning (Marketing Analytics)

PODS Enterprises, LLC 4.0 company rating

    Data engineer job in Clearwater, FL

At PODS (Portable On Demand Storage), we're not just a leader in the moving and storage industry, we redefined it. Since 1998, we've empowered customers across the U.S. and Canada with flexible, portable solutions that put customers in control of their move. Whether it's a local transition or a cross-country journey, our personalized service makes any experience smoother, smarter, and more human. We're driven by a culture of trust, authenticity, and continuous improvement. Our team is the heartbeat of our success, and together we strive to make each day better than the last. If you're looking for a place where your work matters, your ideas are valued, and your growth is supported, PODS is your next destination.

JOB SUMMARY
The Data Engineer - Machine Learning is responsible for scaling a modern data & AI stack to drive revenue growth, improve customer satisfaction, and optimize resource utilization. As an ML Data Engineer, you will bridge data engineering and ML engineering: build high-quality feature pipelines in Snowflake/Snowpark and Databricks, productionize and operate batch/real-time inference, and establish MLOps/LLMOps practices so models deliver measurable business impact at scale.

Note: This role is required onsite at PODS headquarters in Clearwater, FL. The onsite working schedule is Monday - Thursday onsite with Friday remote. It is NOT a remote opportunity.

General Benefits & Other Compensation: Medical, dental, and vision insurance; Employer-paid life insurance and disability coverage; 401(k) retirement plan with employer match; Paid time off (vacation, sick leave, personal days); Paid holidays; Parental leave / family leave; Bonus eligibility / incentive pay; Professional development / training reimbursement; Employee assistance program (EAP); Commuter benefits / transit subsidies (if available); Other fringe benefits (e.g. wellness credits)

What you will do:
● Design, build, and operate feature pipelines that transform curated datasets into reusable, governed feature tables in Snowflake (a minimal Snowpark sketch follows this listing)
● Productionize ML models (batch and real-time) with reliable inference jobs/APIs, SLAs, and observability
● Set up processes in Databricks and Snowflake/Snowpark to schedule, monitor, and auto-heal training/inference pipelines
● Collaborate with our Enterprise Data & Analytics (ED&A) team, whose work centers on replicating operational data into Snowflake, enriching it into governed, reusable models/feature tables, and enabling advanced analytics & ML, with Databricks as a core collaboration environment
● Partner with Data Science to optimize models that grow the customer base and revenue, improve CX, and optimize resources
● Implement MLOps/LLMOps: experiment tracking, reproducible training, model/asset registry, safe rollout, and automated retraining triggers
● Enforce data governance & security policies and contribute metadata, lineage, and definitions to the ED&A catalog
● Optimize cost/performance across Snowflake/Snowpark and Databricks
● Follow robust and established version control and DevOps practices
● Create clear runbooks and documentation, and share best practices with analytics, data engineering, and product partners

Also, you will:
DELIVER QUALITY RESULTS: Able to deliver top quality service to all customers (internal and external); able to ensure all details are covered and adhere to company policies; able to strive to do things right the first time; able to meet agreed-upon commitments or advise the customer when deadlines are jeopardized; able to define high standards for quality and evaluate products, services, and own performance against those standards
TAKE INITIATIVE: Able to be self-starting and not wait for signals; able to be proactive and demonstrate readiness to initiate action; able to take action beyond what is required and volunteer for new assignments; able to complete assignments independently without constant supervision
BE INNOVATIVE / CREATIVE: Able to examine the status quo and consistently look for better ways of doing things; able to recommend changes based on analyzed needs; able to develop proper solutions and identify opportunities
BE PROFESSIONAL: Able to project a positive, professional image with both internal and external business contacts; able to create a positive first impression; able to gain the respect and trust of others through personal image and demeanor
ADVANCED COMPUTER USER: Able to use required software applications to produce correspondence, reports, presentations, electronic communication, and complex spreadsheets, including formulas, macros, and/or databases; able to operate general office equipment, including the company telephone system

What you will need:
Bachelor's or Master's in CS, Data/ML, or a related field (or equivalent experience) required
4+ years in data/ML engineering building production-grade pipelines with Python and SQL
Strong hands-on experience with Snowflake/Snowpark and Databricks; comfort with Tasks & Streams for orchestration
2+ years of experience optimizing models: batch jobs and/or real-time APIs, containerized services, CI/CD, and monitoring
Solid understanding of data modeling and the governance/lineage practices expected by ED&A

It would be nice if you had:
Familiarity with LLMOps patterns for generative AI applications
Experience with NLP, call center data, and voice analytics
Exposure to feature stores, model registries, canary/shadow deploys, and A/B testing frameworks
Marketing analytics domain familiarity (lead scoring, propensity, LTV, routing/prioritization)

MANAGEMENT & SUPERVISORY RESPONSIBILITIES
• Direct supervisor job title(s) typically include: VP, Marketing Analytics
• Job may require supervising Analytics associates

No Unsolicited Resumes from Third-Party Recruiters
Please note that as per PODS policy, we do not accept unsolicited resumes from third-party recruiters unless such recruiters are engaged to provide candidates for a specified opening and in alignment with our Inclusive Diversity values. Any employment agency, person or entity that submits an unsolicited resume does so with the understanding that PODS will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person, or entity.

DISCLAIMER
The preceding job description has been designed to indicate the general nature of work performed; the level of knowledge and skills typically required; and usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required by employees in this job.

Equal Opportunity, Affirmative Action Employer
PODS Enterprises, LLC is an Equal Opportunity, Affirmative Action Employer. We will not discriminate unlawfully against qualified applicants or employees with respect to any term or condition of employment based on race, color, national origin, ancestry, sex, sexual orientation, age, religion, physical or mental disability, marital status, place of birth, military service status, or other basis protected by law.
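To ground the feature-pipeline duty above, here is a minimal Snowpark sketch that aggregates raw orders into a reusable feature table. The connection settings, database objects, and column names are hypothetical placeholders, not PODS's actual environment.

```python
# Hedged Snowpark sketch of a governed feature pipeline; connection settings,
# database objects, and column names are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

session = Session.builder.configs({
    "account": "my_account",   # hypothetical credentials
    "user": "ml_service",
    "password": "***",
    "warehouse": "ML_WH",
}).create()

orders = session.table("raw.sales.orders")

features = orders.group_by("customer_id").agg(
    F.sum("amount").alias("total_spend"),
    F.count("order_id").alias("order_count"),
)

# Persist as a reusable, governed feature table for training and inference.
features.write.save_as_table("analytics.features.customer_orders", mode="overwrite")
```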
    $80k-113k yearly est. 11d ago
  • Data Scientist II - Client Protection

Bank of America Corporation 4.7 company rating

    Data engineer job in Tampa, FL

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities and shareholders every day. Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates' physical, emotional, and financial wellness, recognizing and rewarding performance, and how we make an impact in the communities we serve. Bank of America is committed to an in-office culture with specific requirements for office-based attendance and which allows for an appropriate level of flexibility for our teammates and businesses based on role-specific considerations. At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!

Job Summary: This job is responsible for reviewing and interpreting large datasets to uncover revenue generation opportunities and ensuring the development of effective risk management strategies. Key responsibilities include working with lines of business to comprehend problems, utilizing sophisticated analytics and deploying advanced techniques to devise solutions, and presenting recommendations based on findings. Job expectations include demonstrating leadership, resilience, accountability, a disciplined approach, and a commitment to fostering responsible growth for the enterprise.

Client Protection Shared Services - Advanced Analytics is looking for an energetic and inquisitive data scientist to join our team and help us combat financial crime. In this role, you will be expected to work on large and complex data science projects that entail working with both relational and graph databases. In these projects, you will be expected to collaborate with internal strategy, technology, product, and policy partners to deploy advanced analytical solutions with the goal of reducing fraud losses, lowering false positive impacts, improving client experience, and ensuring the Bank minimizes its total cost of fraud. Key responsibilities include applying knowledge of multiple business and technical-related topics and independently driving strategic initiatives, large-scale projects, and overall improvements.

Responsibilities:
* Perform graph analytics to find and mitigate densely connected fraud networks (a small illustrative sketch follows this listing)
* Assist with the generation, prioritization, and investigation of fraud rings
* Enable business analytics, including data analysis, trend identification, and pattern recognition, using advanced techniques to drive decision making and data-driven insights
* Understand end-to-end model development work, ranging from supervised, unsupervised, and graph-based machine learning solutions, to maximize detection of fraud or capture anomalous behavior
* Manage multiple priorities and ensure quality and timeliness of work deliverables such as data science products, data analysis reports, or data visualizations, while exhibiting the ability to work independently and in a team environment
* Manage relationships with multiple technology teams, development teams, and line of business leaders, including alignment of roadmaps, managing projects, and managing risks
* Oversee development, delivery, and quality assurance for data science use cases delivered to the production environment and other areas of the line of business
* Support the identification of potential issues and development of controls
* Support execution of large-scale projects, such as platform conversions or new project integrations, by conducting advanced reporting and drawing analytical-based insights
* Manage a roadmap of data science use cases to answer business trends based on economic and portfolio conditions and communicate findings to senior management, while diligently working with and leading peers to solve for these use cases
* Coach and mentor peers to improve proficiency in a variety of systems and serve as a subject matter expert on multiple business and technical-related topics
* Apply agile practices for project management, solution development, deployment, and maintenance
* Deliver presentations in an engaging and effective manner, through in-person and virtual conversations, that communicate technical concepts and analysis results to a diverse set of internal stakeholders, and develop professional relationships to foster collaboration on work deliverables
* Maintain knowledge of the latest advances in the fields of data science and artificial intelligence to support business analytics
* Engage business and technology senior leaders on reporting of project/deliverable statuses, opportunity identification, and planning efforts

Required Qualifications:
* 4+ years of experience in data and analytics
* 4+ years of experience in data analytics within fraud prevention
* Must be proficient with SQL and one of SAS, Python, or Java
* Must have familiarity with graph databases (e.g. TigerGraph, Neo4j) and graph query languages
* Problem-solving skills, including selection of data and deployment of solutions
* Proven ability to manage projects, exercise thought leadership, and work with limited direction on complex problems to achieve project goals while also leading a broader team
* Excellent communication and influencing skills
* Thrives in a fast-paced and highly dynamic environment
* Intellectual curiosity and a strong urge to figure out the "whys" of a problem and produce creative solutions
* Exposure to model development leveraging supervised and unsupervised machine learning (regression, tree-based algorithms, etc.)
* Expertise in data analytics and technical development lifecycles, including having coached junior staff
* Expertise handling and manipulating data across its lifecycle in a variety of formats, sizes, and storage technologies to solve a problem (e.g., structured, semi-structured, unstructured; graph; Hadoop; Kafka)

Desired Qualifications:
* Advanced quantitative degree (Master's or PhD)
* 7+ years of experience; work in financial services is very helpful, with preference for fraud, credit, cybersecurity, or other heavily quantitative areas
* Understanding of advanced machine learning methodologies, including neural networks, graph algorithms, ensemble learning like XGB, and other techniques
* Proficient with Spark, H2O, or similar advanced analytical tools
* Analytical and innovative thinking
* Problem solving and business acumen
* Risk and issue management, interpreting relevant laws, rules, and regulations
* Data visualization, oral and written communication, and presentation skills
* Experience managing multi-year roadmaps, engaging technical and non-technical stakeholders, and leading large cross-functional formal projects
* Experience influencing mid to senior (executive) level leaders
* Experience managing risk and issue remediation
* Understanding of computer science topics like automation, code versioning, computational complexity, parallel processing, requirements gathering, testing methodologies, and development lifecycle models like Agile

Skills:
* Agile Practices
* Application Development
* DevOps Practices
* Technical Documentation
* Written Communications
* Artificial Intelligence/Machine Learning
* Business Analytics
* Data Visualization
* Presentation Skills
* Risk Management
* Adaptability
* Collaboration
* Consulting
* Networking
* Policies, Procedures, and Guidelines Management

It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.

Shift: 1st shift (United States of America)
Hours Per Week: 40
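As a small illustration of the graph-analytics responsibility, the sketch below finds densely connected account clusters, the candidate fraud rings the listing describes, from shared-attribute edges using networkx. The edge data and thresholds are hypothetical; a production system would work against a graph database such as those named above.

```python
# Illustrative fraud-ring sketch with networkx; edges and thresholds are
# hypothetical. Edges link accounts that share an attribute (device,
# address, phone number, and so on).
import networkx as nx

edges = [
    ("acct_1", "acct_2"), ("acct_2", "acct_3"), ("acct_1", "acct_3"),
    ("acct_4", "acct_5"),
]

G = nx.Graph(edges)

# Connected components are candidate rings; density separates tightly knit
# clusters from incidental links.
for component in nx.connected_components(G):
    sub = G.subgraph(component)
    if sub.number_of_nodes() >= 3 and nx.density(sub) > 0.6:
        print("candidate fraud ring:", sorted(component))
```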
    $71k-93k yearly est. 4d ago
  • ETL Architect

HealthPlan Services 4.7 company rating

    Data engineer job in Tampa, FL

HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform, and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution, and technology services to insurers of individual, small group, voluntary, and association plans, as well as valuable solutions to thousands of brokers and agents nationwide.

Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.

Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices (a small mapping sketch follows this listing)
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data, and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling

Preferred Qualifications:
12+ years of work experience as a Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture and data modeling for BI and analytics data marts, implementing and supporting production environments
10+ years of experience designing, building, and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft, and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present, and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions to empower an organization to generate valuable, actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty, and integrity
Conscientious about the Enterprise Data Warehouse release management process; conducts operations readiness and environment compatibility reviews of any changes prior to deployment, with strong sensitivity around impact and SLA
Experience with data modeling tools a plus
Expertise in data warehousing methodologies and best practices required
Ability to initiate and follow through on complex projects of both short- and long-term duration required
Ability to work independently, assume responsibility for job development and training, research and resolve questions and problems, request supervisor input, and keep the supervisor informed required
Proactive recommendations for improving the performance and operability of the data warehouse and reporting environment
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements

Education Requirements:
Bachelor's or equivalent degree in a business, technical, or related field

Additional Information
All your information will be kept confidential according to EEO guidelines.
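As a hedged sketch of the source-to-target mapping artifact this role designs, the snippet below captures an STM as data and applies it mechanically in an ETL step; the field names and transforms are hypothetical, chosen only to suggest insurance-flavored data.

```python
# Hedged sketch: a source-to-target mapping captured as data, then applied
# mechanically by an ETL job. Field names and transforms are hypothetical.
SOURCE_TO_TARGET = [
    # (source_field, target_field, transform)
    ("mbr_id",   "member_id",   str.strip),
    ("plan_cd",  "plan_code",   str.upper),
    ("prem_amt", "premium_usd", float),
]

def apply_mapping(source_row: dict) -> dict:
    """Project one source record onto the target schema per the mapping."""
    return {tgt: fn(source_row[src]) for src, tgt, fn in SOURCE_TO_TARGET}

raw = {"mbr_id": " 00123 ", "plan_cd": "hmo-a", "prem_amt": "412.50"}
print(apply_mapping(raw))  # {'member_id': '00123', 'plan_code': 'HMO-A', 'premium_usd': 412.5}
```

Keeping the mapping as data rather than code is what makes the design document auditable, a property the security and best-practice duties above point toward.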
    $84k-105k yearly est. 60d+ ago

Learn more about data engineer jobs

How much does a data engineer earn in Largo, FL?

The average data engineer in Largo, FL earns between $63,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Largo, FL

$85,000

What are the biggest employers of Data Engineers in Largo, FL?

The biggest employers of Data Engineers in Largo, FL are:
  1. PODS
2. Pacemate