
Data engineer jobs in Gulfport, FL

- 521 jobs
  • Data Engineer

    Hays (4.8 company rating)

    Data engineer job in Seffner, FL

    Data Engineer (AI/ML Pipelines) - Contract to Hire - Seffner, FL - $48-67/hr.

    The final salary or hourly wage, as applicable, paid to each candidate/applicant for this position is ultimately dependent on a variety of factors, including, but not limited to, the candidate's/applicant's qualifications, skills, and level of experience as well as the geographical location of the position. Applicants must be legally authorized to work in the United States. Sponsorship not available.

    Our client is seeking a Data Engineer (AI/ML Pipelines) in Seffner, FL.

    Responsibilities: The Data Engineer - AI/ML Pipelines is a pivotal role responsible for architecting, developing, and maintaining scalable data pipelines that support advanced analytics, machine learning, and real-time operational intelligence across enterprise systems. A key requirement for this role is hands-on experience working with Warehouse Management Systems (WMS), including the ability to ingest, normalize, and interpret data from WMS platforms to support business-critical operations. You'll collaborate closely with software engineers, data scientists, and cross-functional teams to drive initiatives in data transformation, governance, and automation. The role offers exposure to cutting-edge cloud-based technologies, direct mentorship, and well-defined pathways for advancement, including growth into senior or lead data engineering roles, specialization in machine learning engineering, or transition into software engineering roles.

    Skills & Requirements:
    - 3+ years of experience in data engineering
    - Hands-on experience with Databricks, cloud-based data processing, and AI/ML workflows
    - Demonstrated ability to independently build production-grade data pipelines
    - Deep knowledge of data modeling, ETL/ELT, and data architecture in cloud environments
    - Proficiency in Python, SQL, and data engineering best practices
    Benefits/Other Compensation: This position is a contract/temporary role where Hays offers you the opportunity to enroll in full medical benefits, dental benefits, vision benefits, 401K and Life Insurance ($20,000 benefit).

    Why Hays? You will be working with a professional recruiter who has intimate knowledge of the industry and market trends. Your Hays recruiter will lead you through a thorough screening process to understand your skills, experience, needs, and drivers. You will also get support on resume writing, interview tips, and career planning, so when there's a position you really want, you're fully prepared to get it. Nervous about an upcoming interview? Unsure how to write a new resume? Visit the Hays Career Advice section to learn top tips to help you stand out from the crowd when job hunting.

    Hays is committed to building a thriving culture of diversity that embraces people with different backgrounds, perspectives, and experiences. We believe that the more inclusive we are, the better we serve our candidates, clients, and employees. We are an equal employment opportunity employer, and we comply with all applicable laws prohibiting discrimination based on race, color, creed, sex (including pregnancy, sexual orientation, or gender identity), age, national origin or ancestry, physical or mental disability, veteran status, marital status, genetic information, HIV-positive status, as well as any other characteristic protected by federal, state, or local law. One of Hays' guiding principles is 'do the right thing'. We also believe that actions speak louder than words. In that regard, we train our staff on ensuring inclusivity throughout the entire recruitment process and counsel our clients on these principles. If you have any questions about Hays or any of our processes, please contact us.

    In accordance with applicable federal, state, and local law protecting qualified individuals with known disabilities, Hays will attempt to reasonably accommodate those individuals unless doing so would create an undue hardship on the company. Any qualified applicant or consultant with a disability who requires an accommodation in order to perform the essential functions of the job should call or text ************. Drug testing may be required; please contact a recruiter for more information.
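The posting's core technical requirement, ingesting and normalizing WMS data for downstream pipelines, can be sketched as follows. This is a minimal illustration only: the field names (`sku`, `qty`, `ts`, `loc`) and record shapes are hypothetical, since real WMS platforms vary widely in schema.

```python
from datetime import datetime

# Hypothetical raw WMS export records; real platforms differ in schema.
RAW_WMS_EVENTS = [
    {"sku": " ABC-123 ", "qty": "5", "ts": "2024-01-15T08:30:00+00:00", "loc": "DOCK-01"},
    {"sku": "abc-123", "qty": "3", "ts": "2024-01-15T09:10:00+00:00", "loc": "dock-01"},
]

def normalize_event(raw: dict) -> dict:
    """Normalize one raw WMS event into a canonical shape: trimmed,
    upper-cased identifiers, typed quantity, parsed timestamp."""
    return {
        "sku": raw["sku"].strip().upper(),
        "quantity": int(raw["qty"]),
        "event_time": datetime.fromisoformat(raw["ts"]),
        "location": raw["loc"].strip().upper(),
    }

events = [normalize_event(e) for e in RAW_WMS_EVENTS]
```

After normalization, both records share the same canonical SKU and location regardless of casing or stray whitespace in the source system.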
    $48-67 hourly 2d ago
  • Data Scientist w/ Observability exp #987999

    Dexian

    Data engineer job in Tampa, FL

    Job Title: AI/ML Engineer - Observability
    Work Type: Contract-to-Hire (CTH)

    We are seeking a highly skilled AI/ML Engineer with strong observability expertise to join our team as a strategic contributor. This role goes beyond hands-on engineering and requires a subject matter expert who can act as a thought partner, helping to design, recommend, and implement AI-driven observability solutions across applications, platforms, and business data. The ideal candidate has experience building end-to-end observability solutions, a strong foundation in automation, and a deep understanding of modern observability principles and tooling.

    Key Responsibilities:
    - Design, build, and enhance AI/ML-driven observability solutions across applications, infrastructure, and business data
    - Act as a subject matter expert in observability, providing guidance and recommendations to engineering and leadership teams
    - Implement and support observability frameworks using OpenTelemetry
    - Develop dashboards, alerts, and analytics using Grafana or similar observability platforms
    - Leverage AI/ML techniques to improve monitoring, anomaly detection, forecasting, and operational insights
    - Build automated, scalable solutions to support end-to-end observability across multiple technologies
    - Collaborate with cross-functional teams to align observability strategy with business and technical goals
    - Evaluate and recommend new tools, platforms, and AI-driven approaches to enhance observability capabilities
    - Ensure observability best practices are consistently applied across systems and environments

    Required Qualifications:
    - Strong experience in AI/ML development and engineering
    - Solid understanding of core observability concepts (metrics, logs, traces)
    - Hands-on experience with OpenTelemetry
    - Experience with observability platforms such as Grafana (preferred), Splunk, Dynatrace, or similar tools
    - Strong background in automation and building innovative, scalable engineering solutions
    - Ability to design and implement end-to-end observability architectures
    - Excellent problem-solving and communication skills

    Preferred Qualifications:
    - Experience recommending or implementing AI-driven observability or monitoring solutions
    - Background working in complex, distributed systems environments
    - Ability to quickly learn and adapt to new observability tools and platforms
    - Experience influencing technical direction and strategy beyond individual contributor tasks
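The anomaly-detection responsibility named in this posting can be illustrated with a deliberately simple z-score baseline over a metric series. This is a toy stand-in, not the ML-driven tooling the role describes; the metric name and threshold are assumptions for the sketch.

```python
import statistics

def detect_anomalies(series, threshold=2.5):
    """Flag indices whose value lies more than `threshold` population
    standard deviations from the mean (a basic z-score test)."""
    mean = statistics.fmean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(series) if abs(x - mean) / stdev > threshold]

# Hypothetical per-minute latency metric with one obvious spike.
latency_ms = [12, 11, 13, 12, 14, 11, 250, 12, 13]
print(detect_anomalies(latency_ms))  # → [6], the 250 ms spike
```

Production observability stacks replace this with streaming models and seasonality-aware baselines, but the contract is the same: metric series in, flagged points out.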
    $63k-91k yearly est. 1d ago
  • Ab Initio / Abinitio Developer - Hadoop

    Ltimindtree

    Data engineer job in Tampa, FL

    LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree, a Larsen & Toubro Group company, combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit *******************

    Position: Ab Initio Developer
    Location: Irving, TX / Tampa, FL
    Duration: Full time

    Required skills:
    - Ab Initio GDE basics: experience working with core Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Sort, Lookup, etc.
    - Conduct>IT: should have knowledge of Plan
    - Advanced Ab Initio: Vector, XML; should have working knowledge of metaprogramming and PDL (must)
    - Big Data/Hadoop: Hive and HDFS experience needed (must)
    - Control Center/Tivoli: should understand a scheduling tool
    - BRE/ACE/Express>IT: working knowledge of Express>IT and EZ graph is a plus
    - Metadata Hub: must have working knowledge of Mhub
    - Oracle/PL/SQL: should have good knowledge of complex SQL
    - Unix/shell scripting: knowledge of Ab Initio air commands and m commands and complex shell scripting
    - Design/automation: should have automation experience; design experience is good to have
    - HLD/LLD documentation: proficient in creating documentation such as project end-user manuals and operations hand-off guides
    - Data warehouse concepts: good understanding of data warehousing concepts
    - Communication skills: good verbal and written communication skills
    - Domain experience: experience in the BFSI domain is preferable
    - Problem solving/management skills: very good problem-solving skills

    Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree ("LTIM"):

    Benefits and Perks:
    - Comprehensive medical plan covering medical, dental, vision
    - Short-term and long-term disability coverage
    - 401(k) plan with company match
    - Life insurance
    - Vacation time, sick leave, paid holidays
    - Paid paternity and maternity leave

    LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, colour, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
    $77k-102k yearly est. 2d ago
  • Applied Data Scientist Regulatory and Statistical

    Moffitt Cancer Center (4.9 company rating)

    Data engineer job in Tampa, FL

    Shape the Future of Predictive Medicine. One-Year Project. Innovators Wanted!

    Are you driven by curiosity, energized by ambiguity, and passionate about transforming healthcare? Dr. Ciara Freeman at the esteemed Moffitt Cancer Center is searching for a bold, entrepreneurial-minded Applied Data Scientist - Regulatory and Statistical for a dynamic, one-year project to help build the first regulatory-grade AI models that predict therapy response in multiple myeloma. You'll partner with a physician-scientist PI and a data-engineering team to prototype, validate, and document predictive models designed for clinical use. This is hands-on translational ML: fast iteration, real impact, auditable results. Your models will form the core of clinically actionable, auditable AI systems that change how we treat cancer.

    Ideal Candidate:
    - Expert in Python (scikit-learn, XGBoost, PyTorch/TensorFlow)
    - Skilled in survival or clinical modeling; thrives where rigor meets speed
    - Startup thinker with a thirst for discovery
    - Thrives in fast-paced, risk-friendly environments
    - Problem solver who sees challenges as opportunities
    - Team player eager to see their ideas put into action

    Responsibilities:
    - Develop and validate multimodal survival and risk-stratification models (clinical + omics + imaging)
    - Collaborate with engineers to define and extract features
    - Perform calibration, bias analysis, and explainability (SHAP, PDPs, model cards)
    - Translate results into clinician-friendly insights and contribute to IP and regulatory filings

    Credentials & Qualifications:
    - Master's degree in Computer Science, Data Science, Biostatistics, or a related quantitative field with seven (7) years of applied statistical or machine learning model development experience in healthcare, biotech, or regulated environments; or
    - PhD with five (5) years of applied statistical or machine learning model development experience in healthcare, biotech, or regulated environments
    - Familiarity with Snowflake or modern data-engineering workflows preferred

    Join a project that's not just about data: it's about revolutionizing patient care. Help us bridge the gap between today's personalized medicine and tomorrow's predictive breakthroughs. If you're ready to take risks, drive results, and change the future of medicine, apply today!

    Moffitt Cancer Center proudly stands as a Comprehensive Cancer Center designated by the National Cancer Institute (NCI) in the vibrant city of Tampa, Florida. This dynamic city is an exceptional choice for those seeking exciting opportunities in a rapidly growing metropolitan area. With its flourishing economy and rich cultural diversity, the Tampa Bay region masterfully combines urban elegance with breathtaking natural beauty. Discover why countless individuals have chosen to make Tampa their home and experience firsthand what makes it one of the fastest-growing metropolitan cities in the United States.
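The survival-modeling work this role centers on starts from estimators like Kaplan-Meier. As an illustrative baseline only (the cohort below is toy data, and real work would use lifelines or scikit-survival rather than hand-rolled code), here is the estimator in plain Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from follow-up data.

    times:  follow-up time per patient (e.g. months)
    events: 1 if the event (e.g. progression) occurred, 0 if censored
    Returns a list of (time, survival_probability) steps at event times.
    """
    data = sorted(zip(times, events))
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        i += sum(1 for tt, _ in data if tt == t)  # skip past this time point
    return curve

# Toy cohort: times in months, 1 = progression observed, 0 = censored.
curve = kaplan_meier([3, 5, 5, 8, 12, 12], [1, 1, 0, 1, 0, 0])
print(curve)
```

Censored patients (event 0) leave the at-risk set without dropping the curve, which is exactly what distinguishes survival analysis from plain event counting.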
    $64k-86k yearly est. 3d ago
  • Data Architect

    Radiant Digital (4.1 company rating)

    Data engineer job in Tampa, FL

    Data Architecture & Modeling:
    - Design and maintain enterprise-level logical, conceptual, and physical data models
    - Define data standards, naming conventions, metadata structures, and modeling best practices
    - Ensure scalability, performance, and alignment of data models with business requirements

    Data Governance & Quality:
    - Implement and enforce data governance principles and policies
    - Define data ownership, stewardship, data lineage, and lifecycle management
    - Lead initiatives to improve data quality, consistency, and compliance

    Enterprise Data Management:
    - Develop enterprise data strategies, including data integration, master data management (MDM), and reference data frameworks
    - Define and oversee the enterprise data architecture blueprint
    - Ensure alignment between business vision and data technology roadmaps
    $83k-118k yearly est. 3d ago
  • Senior Data Engineer

    Toorak Capital Partners

    Data engineer job in Tampa, FL

    Company: Toorak Capital Partners is an integrated correspondent lending and table funding platform that acquires business purpose residential, multifamily and mixed-use loans throughout the U.S. and the United Kingdom. Headquartered in Tampa, FL, Toorak Capital Partners acquires these loans directly from a network of private lenders on a correspondent basis.

    Summary: The Lead Data Engineer develops and implements high-performance, scalable data solutions in support of Toorak's data strategy and leads data architecture for Toorak Capital.

    Responsibilities:
    - Lead efforts to create an API framework for using data across customer-facing and back-office applications
    - Establish consistent data standards, reference architectures, patterns, and practices across the organization for OLTP, OLAP (data warehouse, data lakehouse), MDM, and AI/ML technologies
    - Lead sourcing and synthesis of data standardization and semantics discovery efforts, turning insights into actionable strategies that define the team's priorities and rally stakeholders to the vision
    - Lead data integration and mapping efforts to harmonize data
    - Champion standards, guidelines, and direction for ontology, data modeling, semantics, and data standardization in general at Toorak
    - Lead strategies and design solutions for a wide variety of use cases such as data migration (end-to-end ETL processes), database optimization, and data architecture for analytics projects

    Required Skills:
    - Designing and maintaining data models, including conceptual, logical, and physical data models
    - 5+ years of experience using NoSQL systems like MongoDB and DynamoDB, relational SQL database systems (PostgreSQL), and Athena
    - 5+ years of experience in data pipeline development, ETL, and processing of structured and unstructured data
    - 5+ years of experience in large-scale real-time stream processing using Apache Flink or Apache Spark with messaging infrastructure like Kafka/Pulsar
    - Proficiency with data management tools and platforms, such as data cataloging software, data quality tools, and data governance platforms
    - Experience with BigQuery and SQLMesh (or a similar SQL-based cloud platform)
    - Knowledge of cloud platforms and technologies such as Google Cloud Platform and Amazon Web Services
    - Strong SQL skills
    - Experience with API development and frameworks
    - Knowledge of designing solutions with data quality, data lineage, and data catalogs
    - Strong background in data science, machine learning, NLP, and text processing of large data sets
    - Experience in one or more of Dataiku, DataRobot, Databricks, or UiPath is nice to have
    - Use of version control systems (e.g., Git) to manage changes to data governance policies, procedures, and documentation
    - Ability to rapidly comprehend changes to key business processes and their impact on the overall data framework
    - Flexibility to adjust to multiple demands, shifting priorities, ambiguity, and rapid change
    - Advanced analytical skills
    - High level of organization and attention to detail
    - Self-starter attitude with the ability to work independently
    - Knowledge of legal, compliance, and regulatory issues impacting data
    - Experience in finance preferred
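The real-time stream-processing requirement above (Flink/Spark over Kafka/Pulsar) reduces conceptually to windowed aggregation over keyed events. A toy tumbling-window counter in plain Python makes the idea concrete; the event names are hypothetical, and real pipelines would express this as a Flink/Spark window operator rather than an in-memory dict.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (window_start, key) over fixed, non-overlapping
    (tumbling) time windows of width `window_ms`."""
    windows = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_ms)  # align timestamp to window
        windows[(window_start, key)] += 1
    return dict(windows)

# Hypothetical (timestamp_ms, event_type) pairs from a loan pipeline.
stream = [(1000, "loan_created"), (1500, "loan_created"), (2200, "loan_funded")]
counts = tumbling_window_counts(stream, window_ms=1000)
print(counts)  # → {(1000, 'loan_created'): 2, (2000, 'loan_funded'): 1}
```

The same alignment arithmetic (`ts - ts % width`) is what windowing engines apply per event before shuffling by key.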
    $72k-99k yearly est. 1d ago
  • Senior Angular Developer

    Firstpro 360 (4.5 company rating)

    Data engineer job in Tampa, FL

    We are seeking a Senior Angular Developer to join a high-performing team building a large-scale enterprise platform. In this role, you will work on complex, data-driven interfaces that support real-time analytics and device lifecycle visibility. You will be responsible for developing UI components used by enterprise customers to manage assets, repairs, orders, and workflows. This position offers competitive pay, great benefits, and the opportunity to work with modern technologies.

    Requirements:
    - 6+ years of hands-on Angular development experience (Angular 14+)
    - Strong skills in TypeScript, RxJS, and reactive programming
    - Experience building complex UI components such as dashboards, data tables, and forms
    - Ability to create high-performance interfaces that handle large datasets
    - Experience integrating RESTful APIs and transforming data for frontend use
    - Familiarity with state management patterns (NgRx, Akita, or similar) and responsive design standards
    $88k-116k yearly est. 2d ago
  • Senior VBA Developer

    Minisoft Technologies LLC

    Data engineer job in Tampa, FL

    Job Title: Senior VBA Developer
    Duration: 6+ months
    Interview: Phone/Skype, 2 rounds
    Visa: USC, H4-EAD; LinkedIn required
    Must-have experience: 10+ years

    Role Overview: We're seeking a hands-on Senior VBA Developer with deep T-SQL expertise and the ability to read and write MS Access 2003/VBA. You'll reverse-engineer and stabilize the legacy SnapShot application, design and build modern SQL-centric data flows, and implement integrations that connect the current Access front end (on-prem) + SQL Server back end ERP with the Infor system. The near-term focus is delivering the first integration, followed by a broader modernization program (S4 migration to follow).

    What You'll Do:
    - Reverse-engineer, read, and author MS Access 2003/VBA code; untangle ~100k lines of legacy logic to document workflows and stabilize critical functions
    - Design and implement SQL Server schemas (approx. 500 fields, with ~100 Q&A fields flowing into SnapShot) to support modernized processing
    - Build robust ETL/data ingestion pipelines (e.g., parsing output files and loading them into a SQL DB for downstream processing); automate validation and reconciliation
    - Integrate with Infor: design interfaces/APIs/bridge tables and orchestrations to pass configuration/estimator data between Access/SQL and the Infor ecosystem
    - Maintain on-prem performance and reliability: indexing, query tuning, statistics management, isolation levels, and transactional integrity
    - Establish coding standards, version control, and technical documentation for legacy and modern components; implement unit/integration test harnesses
    - Partner with the business (estimators, dealers) to map legacy rules to modern data structures; convert tribal knowledge into testable specifications
    - Prepare for the S4 migration (start-to-finish planning) by isolating dependencies, defining cutover strategies, and drafting rollback/safety nets
    - Drive secure handling of ERP data (authentication, role-based access, auditing), and plan for multi-year on-prem operation before any cloud move

    Thanks & Regards,
    Jennifer | Sr Technical Recruiter
    Minisoft Technologies LLC
    ************ | *************************
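The "parse output files, load into a SQL DB, reconcile" pipeline step this role describes can be sketched briefly. The sketch uses Python with an in-memory SQLite database standing in for the on-prem SQL Server, and the file layout and column names are invented for illustration (the real SnapShot exports have ~500 fields).

```python
import csv
import io
import sqlite3

# Hypothetical SnapShot-style output file, shown inline for the sketch.
OUTPUT_FILE = "order_id,field_a,field_b\n1001,foo,42\n1002,bar,7\n"

conn = sqlite3.connect(":memory:")  # stand-in for the SQL Server back end
conn.execute(
    "CREATE TABLE snapshot_load (order_id INTEGER, field_a TEXT, field_b INTEGER)"
)

# Parse the output file and type the fields before loading.
rows = [
    (int(r["order_id"]), r["field_a"], int(r["field_b"]))
    for r in csv.DictReader(io.StringIO(OUTPUT_FILE))
]
conn.executemany("INSERT INTO snapshot_load VALUES (?, ?, ?)", rows)

# Automated reconciliation: row count in the DB must match rows parsed.
(db_count,) = conn.execute("SELECT COUNT(*) FROM snapshot_load").fetchone()
assert db_count == len(rows)
```

The reconciliation check at the end is the smallest version of the "automate validation and reconciliation" requirement: every load is verified against the source before downstream processing trusts it.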
    $81k-109k yearly est. 1d ago
  • Senior Dotnet Developer

    Vdart (4.5 company rating)

    Data engineer job in Tampa, FL

    Job Title: .NET Developer with Aveva
    Duration: 6 months, contract to hire

    Skills Needed: Mid-to-senior full-stack .NET Developer in a "maintenance" role; Aveva (formerly Wonderware); SDK. The company is developing a new software platform and is phasing out this .NET/Aveva/SDK/SQL Server platform, so this developer will be maintaining, supporting, and helping phase it out with integration into the new platform.

    We're looking for:
    - A mid-senior .NET engineer who has genuinely used AVEVA/Wonderware + ArchestrA SDK in production, not someone who only used InTouch scripting or Historian browsing
    - Someone comfortable maintaining and stabilizing a legacy SCADA-connected .NET/SQL Server system while assisting with integration into the new platform
    - A developer who understands object models inside AVEVA (Templates, Instances, Extension Attributes, Deployment behavior) and can explain and code against the SDK confidently
    $90k-122k yearly est. 5d ago
  • Data Scientist (Exploitation Specialist Level-3) - Tampa, FL

    Masego

    Data engineer job in Tampa, FL

    Job Description

    Masego is an award-winning small business that specializes in GEOINT services. As a Service-Disabled Veteran-Owned Small Business (SDVOSB), we recognize and reward your hard work.

    Description: We are looking for a Level-3 TS/SCI-cleared Data Scientist to join our team. This role provides automation/collection support to the main team at NGA Washington, so the opportunity relies on good communication skills and a baseline knowledge of GEOINT collection and/or automation systems like JEMA.

    Minimum Required Qualifications:
    - At least 5 years of related GEOINT work experience, or 2 years with a relevant Bachelor's degree
    - Able to work on the client site 40 hours a week (very limited option for telework)
    - Proficient with Python
    - Experience with JEMA

    Preferred Qualifications:
    - Experience with multiple intelligence types (SIGINT, OSINT, ELINT, GEOINT, MASINT, HUMINT)
    - Experience with Brewlytics, ArcPro, and/or other geospatial data analysis tools
    - Knowledge of GEOINT collection and associated NGA/NRO systems
    - Proficiency with common programming languages including R, SQL, HTML, and JavaScript
    - Experience analyzing geospatially enabled data
    - Ability to learn new technologies and adapt to dynamic mission needs
    - Ability to work collaboratively with a remote team (the main government team is based out of NGA Washington)
    - Experience providing embedded data science/automation support to analytic teams

    Security Clearance Requirement: Active TS/SCI, with a willingness to take a polygraph test.

    Salary Range: $128,600, based on ability to meet or exceed stated requirements.

    About Masego: Masego Inc. provides expert Geospatial Intelligence Solutions in addition to Activity Based Intelligence (ABI) and GEOINT instructional services. Masego provides expert-level Geospatial Collection Management; Full Motion Video; Human Geography; Information Technology and Cyber; Technical Writing; and ABI, Agile, and other professional training. Masego is a Service-Disabled Veteran-Owned Small Business headquartered in Fredericksburg, Virginia. With high-level expertise and decades of experience, coupled with proven project management systems and top-notch client support, Masego enhances the performance capabilities of the Department of Defense and the intelligence community.

    Pay and Benefits: We seek to provide for and take care of our team members. We currently offer Medical, Dental, Vision, 401k, Generous PTO, and more!

    Diversity: Masego, Inc. is an equal opportunity/equal access/affirmative action employer fully committed to achieving a diverse workforce and complies with all applicable Federal and Virginia State laws, regulations, and executive orders regarding nondiscrimination and affirmative action in its programs and activities. Masego, Inc. does not discriminate on the basis of race, color, religion, ethnic or national origin, gender, genetic information, age, disability, sexual orientation, gender identity, gender expression, and veteran's status.
    $128.6k yearly 21d ago
  • ETL Architect

    Healthplan Services (4.7 company rating)

    Data engineer job in Tampa, FL

    HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.

    Position: ETL Architect

    The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.

    Essential Job Functions and Duties:
    - Develop and maintain ETL jobs for data warehouses/marts
    - Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices
    - Collaborate with delivery and technical team members on design and development
    - Collaborate with business partners to understand business processes, underlying data, and reporting needs
    - Conduct data analysis in support of ETL development and other activities
    - Assist with data architecture and data modeling

    Preferred Qualifications:
    - 12+ years of work experience as a Business Intelligence Developer
    - Work experience with multiple database platforms and BI delivery solutions
    - 10+ years of experience with end-to-end ETL architecture and data modeling for BI and analytics data marts, implementing and supporting production environments
    - 10+ years of experience designing, building, and implementing BI solutions with modern BI tools like MicroStrategy, Microsoft, and Tableau
    - Experience as a Data Architect
    - Experience delivering BI solutions with an Agile BI delivery methodology
    - Ability to communicate, present, and interact comfortably with senior leadership
    - Demonstrated proficiency implementing self-service solutions that empower an organization to generate valuable, actionable insights
    - Strong team player
    - Ability to understand information quickly, derive insight, synthesize information clearly and concisely, and devise solutions
    - Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve the greatest value
    - Strong relationship-building and interpersonal skills
    - Demonstrated self-confidence, honesty, and integrity
    - Conscientious of the Enterprise Data Warehouse release management process; conduct operations-readiness and environment-compatibility reviews of changes prior to deployment, with strong sensitivity around impact and SLAs
    - Experience with data modeling tools a plus
    - Expertise in data warehousing methodologies and best practices required
    - Ability to initiate and follow through on complex projects of both short and long duration required
    - Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps the supervisor informed
    - Proactively recommends improvements to the performance and operability of the data warehouse and reporting environment
    - Participates on interdepartmental teams to support organizational goals
    - Performs other related duties and tasks as assigned
    - Experience facilitating user sessions and gathering requirements

    Education Requirements: Bachelor's or equivalent degree in a business, technical, or related field.

    Additional Information: All your information will be kept confidential according to EEO guidelines.
    $84k-105k yearly est. 6h ago
  • Principal Data Scientist

    Maximus (4.3 company rating)

    Data engineer job in Tampa, FL

    Description & Requirements We now have an exciting opportunity for a Principal Data Scientist to join the Maximus AI Accelerator supporting both the enterprise and our clients. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team. You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. This role requires strong systems thinking, problem-solving abilities, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes. This position will be focused on strategic company-wide initiatives but will also play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work.) This position requires occasional travel to the DC area for client meetings. Essential Duties and Responsibilities: - Make deep dives into the data, pulling out objective insights for business leaders. - Initiate, craft, and lead advanced analyses of operational data. - Provide a strong voice for the importance of data-driven decision making. - Provide expertise to others in data wrangling and analysis. - Convert complex data into visually appealing presentations. - Develop and deploy advanced methods to analyze operational data and derive meaningful, actionable insights for stakeholders and business development partners. - Understand the importance of automation and look to implement and initiate automated solutions where appropriate. - Initiate and take the lead on AI/ML initiatives as well as develop AI/ML code for projects. - Utilize various languages for scripting and write SQL queries. 
Serve as the primary point of contact for data and analytical usage across multiple projects. - Guide operational partners on product performance and solution improvement/maturity options. - Participate in intra-company data-related initiatives as well as help foster and develop relationships throughout the organization. - Learn new skills in advanced analytics/AI/ML tools, techniques, and languages. - Mentor more junior data analysts/data scientists as needed. - Apply strategic approach to lead projects from start to finish; Job-Specific Minimum Requirements: - Develop, collaborate, and advance the applied and responsible use of AI, ML and data science solutions throughout the enterprise and for our clients by finding the right fit of tools, technologies, processes, and automation to enable effective and efficient solutions for each unique situation. - Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital. - Contribute to efforts across the enterprise to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, LLMs and classical machine learning. - Contribute to the development of mathematically rigorous process improvement procedures. - Maintain current knowledge and evaluation of the AI technology landscape and emerging. developments and their applicability for use in production/operational environments. Minimum Requirements - Bachelor's degree in related field required. - 10-12 years of relevant professional experience required. Job-Specific Minimum Requirements: - 10+ years of relevant Software Development + AI / ML / DS experience. - Professional Programming experience (e.g. Python, R, etc.). - Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML. - Experience with API programming. - Experience with Linux. - Experience with Statistics. 
- Experience with Classical Machine Learning.
- Experience working as a contributor on a team.

Preferred Skills and Qualifications:
- Master's or BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.).
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter.
- Experience working as an individual contributor in AI.
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (standalone and cloud), and software engineering principles.
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off-the-shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.

Background Investigations:
- IRS MBI - Eligibility

#techjobs #VeteransPage

EEO Statement: Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected characteristics.

Pay Transparency: Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards.
Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.

Accommodations: Maximus provides reasonable accommodations to individuals requiring assistance during any phase of the employment process due to a disability, medical condition, or physical or mental impairment. If you require assistance at any stage of the employment process, including accessing job postings, completing assessments, or participating in interviews, please contact People Operations at **************************.

Minimum Salary $156,740.00 Maximum Salary $234,960.00
    $64k-92k yearly est. Easy Apply 3d ago
  • Data Scientist

    Redhorse Corporation

    Data engineer job in Tampa, FL

    About the Organization: Now is a great time to join Redhorse Corporation. We are a solution-driven company delivering data insights and technology solutions to customers with missions critical to U.S. national interests. We're looking for thoughtful, skilled professionals who thrive as trusted partners building technology-agnostic solutions and want to apply their talents supporting customers with difficult and important mission sets.

About the Role: Redhorse Corporation is seeking a highly skilled Data Scientist to join our team supporting the United States Central Command (USCENTCOM) Directorate of Logistics (CCJ4). You will play a critical role in accelerating the delivery of AI-enabled capabilities within the Joint Logistics Common Operating Picture (JLOGCOP), directly impacting USCENTCOM's ability to promote international cooperation, respond to crises, deter aggression, and build resilient logistics capabilities for our partners. This is a high-impact role contributing to national security and global stability. You will be working on a custom build of AI/ML capabilities into the JLOGCOP leveraging dozens of data feeds to enhance decision-making and accelerate planning for USCENTCOM missions.

Key Responsibilities: Communicate with the client regularly regarding enterprise values and project direction. Find the intersection between business value and achievable technical work. Articulate and translate business questions into technical solutions using available DoD data. Explore datasets to find meaningful entities and relationships. Create data ingestion and cleaning pipelines. Develop applications and effective visualizations to communicate insights. Serve as an ambassador for executive DoD leadership to sponsor data literacy growth across the enterprise.

Required Experience/Clearance: US citizen with a Secret US government clearance.
Applicants who are not US Citizens and who do not have a current and active Secret security clearance will not be considered for this role. Ability to work independently to recommend solutions to the client and as part of a team to accomplish tasks. Experience with functional programming (Python, R, Scala) and database languages (SQL). Familiarity using AI/ML tools to support logistics use cases. Ability to discern which statistical approaches are appropriate for different contexts. Experience communicating key findings with visualizations. 8+ years of professional experience. Master's degree in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).

Desired Experience: Experience with cloud-based development platforms. Experience with large-scale data processing tools. Experience with data visualization tools. Ph.D. in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).

Equal Opportunity Employer/Veterans/Disabled

Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this site as a result of your disability. You can request reasonable accommodations by contacting Talent Acquisition at ***********************************

Redhorse Corporation shall, in its discretion, modify or adjust the position to meet Redhorse's changing needs. This job description is not a contract and may be adjusted as deemed appropriate in Redhorse's sole discretion. We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
    $63k-91k yearly est. 10d ago
  • Data Scientist

    Applied Memetics

    Data engineer job in Tampa, FL

    AM LLC is seeking a Data Scientist that will support Operations in the Information Environment (OIE) initiatives under the J39 directorate by leveraging advanced data analysis techniques to extract actionable insights. The candidate will be responsible for analyzing complex data sets, developing predictive models, and providing data-driven recommendations to enhance information campaigns. An active TS/SCI or TS w/SCI eligibility clearance is required. This position is for a forthcoming contract and will be based at the Customer's location in Tampa, FL.

Requirements

Essential Duties and Responsibilities: Conducts analysis of structured and semi-structured data sets, which include opinion research, social media-based big data, and various secondary-source indices. Reviews the methods and means to provide data visualization and quantitative analysis to the team. Develops automated approaches leveraging artificial intelligence/machine learning (AI/ML) and natural language processing (NLP) to streamline the input of operational assessment data into the Command and Control of the Information Environment (C2IE) system. Supports the analytics and assessment team in the construction and modification of research design. Acquires, processes, and integrates impactful data from various commercial and customer sources, including HUMINT, SIGINT, OSINT, and GEOINT. Applies advanced analytical techniques, including machine learning, natural language processing, and deep learning, to identify trends and patterns. Develops predictive models to support influence operations and behavioral targeting. Collaborates with OIE planners, OIE analysts, and Data Engineers to ensure effective data utilization. Visualizes data insights through dashboards and reports for decision-makers. Evaluates the effectiveness of OIE campaigns using data-driven methods. Stays current on emerging data science techniques and tools relevant to OIE.
Provides recommendations to senior leadership on data-driven strategies for OIE. Defines the means to create data visualizations and quantitative analysis to support mission needs. Reviews/analyzes statistical data received from team members, other agencies, and CCMD organic assets. Reviews empirical/quantitative studies conducted by similar organizations and determines reliability, validity, and other strength factors for potential use in theater. Provides data visualization and explains quantitative analysis from the research team / vendors. Responsible for contributing to analytic and assessment projects evaluating the Information Environment (IE) for a variety of Combatant Commands (CCMDs).

Knowledge, Skills, and Abilities: General: Confident in supporting projects and people, and proactive in making independent decisions and taking appropriate action. Strong written, analytical, presentation, and verbal communication skills, with the ability to communicate effectively with all levels of stakeholders, from assistants to senior executives. Strong organizational and time management skills, attention to detail, and the ability to troubleshoot when faced with challenges. Demonstrates growth from feedback and actively seeks ways to improve. Able to liaise effectively with a small internal team and external partners.

Minimum Qualifications: General Requirements: Degree in Data Science, Computer Science, Mathematics, Statistics, or a related field. Experience in data science, with a focus on predictive analytics, machine learning, and statistical modeling. Experience with data visualization tools (Tableau, Power BI, Matplotlib). Experience with natural language processing (NLP) and text analytics. Proficiency in Python, R, SQL, and machine learning frameworks (Scikit-learn, TensorFlow, PyTorch). Familiarity with data security protocols for classified environments.

Preferred Qualifications: Experience supporting Joint Intelligence or U.S.
Geographic Combatant Commands. Experience with advanced analytic tools (Open.IO, Echosec, Babel Street, Scraawl). Experience with geospatial analytic tools (ArcGIS, QGIS). Experience with data manipulation and analysis tools (SPSS, STATA, IBM Watson). Proficiency with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes). Demonstrated experience with Command and Control of the Information Environment (C2IE). Demonstrated experience briefing senior military leadership (O-5 and above). Expertise in OSINT, including social media monitoring and analysis. Ability to conduct complex data queries and scripting for automation.

Security Clearance: Active TS/SCI or TS w/SCI Eligibility clearance is required.

Physical and Environmental Requirements: Employees must have the ability to perform the following physical demands for extended periods of time with or without assistance: This position requires the ability to remain in a stationary position (standing and/or seated) most of the time. Viewing a computer screen/monitor. Utilizing a keyboard.

Levels: Junior: 3+ years Journeyman: 3-10 years Senior: 10+ years SME: 15+ years

Proposed Salary Range: $130,000 - $235,000 Tampa, FL. Please note that actual salaries may vary within the range or be above or below the range based on factors including, but not limited to, education, training, experience, business need, and location. AM LLC is an Equal Opportunity Employer. Our policy is clear: there shall be no discrimination on the basis of age, disability, sex, race, religion or belief, gender reassignment, marriage/civil partnership, pregnancy/maternity, or sexual orientation. We are an inclusive organization and actively promote equality of opportunity for all with the right mix of talent, skills and potential. We welcome all applications from a wide range of candidates. Selection for roles will be based on individual merit alone.

Salary Description: $130,000-$235,000
    $63k-91k yearly est. 60d+ ago
  • 48627 Data Scientist

    Shuvel Digital

    Data engineer job in Tampa, FL

    Data Scientist

What you'll work on: Work with subject matter experts, team leads, and third-party vendors to define new data science prototypes. Build AI solutions - NLP, regression, clustering, embedding, recommendation, retrieval, anomaly detection, LLM. Design, code, test, and document data science microservices - typically in Python, React, Docker. Support the mapping of disparate bulk data sources to a unified database. Create graph traversal queries and analytic pipelines to support analyst use cases. Support transitioning custom pipelines from dev to test to production. Incorporate feedback from leadership and user base. Extract meaningful information from unstructured text such as entities, identifiable attributes, and relationships. Types of text include, but are not limited to, SAR narratives and web-scraped data.

Your areas of expertise: • Python • React • R • Tableau/Qlik • VectorDB • Gremlin/Cypher/graph ML/Neo4j or other experience with graph traversals • Network analytics such as centrality, community detection, link prediction, pattern recognition, blockchain analytics • SQL or other relational database query experience • Graph structured data and analytics • NLP

Required Skills: Data Visualization (Qlik, Tableau), Databases, Python (Programming Language), SQL (Structured Query Language), Anomaly Detection, Data Science, Neo4j, Knowledge Graphs, Natural Language Processing (NLP)

Day-to-day Responsibilities: Interface with the team and clients in group and one-on-one settings to determine program technical development needs and execute on them. Contribute to daily stand-up meetings.

Expected Deliverables: Working visualizations, prototype analytics, documentation

Education: Bachelor's Degree
    $63k-91k yearly est. 60d+ ago
  • Expert Exploitation Specialist/Data Scientist (TS/SCI)

    Culmen International 4.3company rating

    Data engineer job in Tampa, FL

    About the Role: Culmen International is hiring Expert Exploitation Specialist/Data Scientists to provide support on-site at the National Geospatial-Intelligence Agency (NGA) in Tampa, FL. NGA expects to deliver AOS Metadata Cataloging and Management Services to enhance product and asset management of content, enabling rapid creation of discoverable, modular, web-enabled, and visually enriched Geospatial Intelligence (GEOINT) products for intelligence producers in NGA and across the National System for Geospatial-Intelligence (NSG).

TALENT PIPELINE - Qualified applicants will be contacted as soon as funding for this position is secured.

What You'll Do in Your New Role: The Data Scientist will coordinate with our clients to understand questions and issues involving the client's datasets, then determine the best method and approach to create data-driven solutions within program guidelines. This position will be relied upon as a Subject Matter Expert (SME), and be expected to lead/assist in the development of automated processes, architect data science solutions and automated workflows, conduct analysis, use available tools to analyze data, remain adaptable to mission requirements, and identify patterns to help solve some of the complex problems that face the DoD and Intelligence Community (IC).
* Work with large structured / unstructured data in a modeling and analytical environment to define and create streamlined processes in the evaluation of unique datasets and solve challenging intelligence issues
* Lead and participate in the design of solutions and refinement of pre-existing processes
* Work with Customer Stakeholders, Program Managers, and Product Owners to translate road map features into components/tasks, estimate timelines, identify resources, suggest solutions, and recognize possible risks
* Use exploratory data analysis techniques to identify meaningful relationships, patterns, or trends from complex data
* Combine applied mathematics, programming skills, analytical techniques, and data to provide impactful insights for decision makers
* Research and implement optimization models, strategies, and methods to inform data management activities and analysis
* Apply big data analytic tools to large, diverse sets of data to deliver impactful insights and assessments
* Conduct peer reviews to improve quality of workflows, procedures, and methodologies
* Help build high-performing teams; mentor team members providing development opportunities to increase their technical skills and knowledge

Required Qualifications:
* TS/SCI Clearance w/CI Poly Eligible
* Minimum of 18 years combined experience (A combination of years of experience & professional certifications/trainings can be used in lieu of a degree)
* BS in related Field with Graduate level work
* Expert proficiency in Python and other programming languages applicable to automation development.
* Demonstrated experience designing and implementing workflow automation systems
* Advanced experience with ETL (Extract, Transform, Load) processes for geospatial data
* Expertise in integrating disparate systems through API development and implementation
* Experience developing and deploying enterprise-scale automation solutions
* Knowledge of NGA's Foundation GEOINT products, data types, and delivery methods
* Demonstrated experience with database design, implementation, and optimization
* Experience with digital media generation systems and automated content delivery platforms
* Ability to analyze existing workflows and develop technical solutions to streamline processes
* Knowledge of DLA systems and interfaces, particularly MEBS and WebFLIS
* Expertise in data quality assurance and validation methodologies
* Experience with geospatial data processing, transformation, and delivery automation
* Proficiency with ArcGIS tools, GEODEC and ACCORD software systems
* Understanding of cartographic principles and standards for CADRG/ECRG products
* Strong analytical skills for identifying workflow inefficiencies and implementing solutions
* Experience writing technical documentation including SOPs, CONOPS, and system design

Desired Qualifications:
* Certification(s) in relevant automation technologies or programming languages
* Experience with DevOps practices and CI/CD implementation
* Knowledge of cloud-based automation solutions and their implementation in government environments
* Experience with machine learning applications for GEOINT workflow optimization
* Expertise in data analytics and visualization for workflow performance metrics
* Understanding of NGA's enterprise architecture and integration points
* Experience implementing RPA (Robotic Process Automation) solutions
* Knowledge of secure coding practices and cybersecurity principles
* Demonstrated expertise in digital transformation initiatives
* Experience mentoring junior staff in automation techniques and best practices
* Background in agile development methodologies
* Understanding of human-centered design principles for workflow optimization

About the Company: Culmen International is committed to enhancing international safety and security, strengthening homeland defense, advancing humanitarian missions, and optimizing government operations. With experience in over 150 countries, Culmen supports our clients to accomplish critical missions in challenging environments.
* Exceptional Medical/Dental/Vision Insurance, premiums for employees are 100% paid by Culmen, and dependent coverage is available at a nominal rate (including same or opposite sex domestic partners)
* 401k - Vested immediately and 4% match
* Life insurance and disability paid by the company
* Supplemental Insurance Available
* Opportunities for Training and Continuing Education
* 12 Paid Holidays

To learn more about Culmen International, please visit **************

At Culmen International, we are committed to creating and sustaining a workplace that upholds the principles of Equal Employment Opportunity (EEO). We believe in the importance of fair treatment and equal access to opportunities for all employees and applicants. Our commitment to these principles is unwavering across all our operations worldwide.
    $62k-90k yearly est. Auto-Apply 57d ago
  • Data Engineer - Machine Learning (Marketing Analytics)

    PODS 4.0company rating

    Data engineer job in Clearwater, FL

    At PODS (Portable On Demand Storage), we're not just a leader in the moving and storage industry, we redefined it. Since 1998, we've empowered customers across the U.S. and Canada with flexible, portable solutions that put customers in control of their move. Whether it's a local transition or a cross-country journey, our personalized service makes any experience smoother, smarter, and more human. We're driven by a culture of trust, authenticity, and continuous improvement. Our team is the heartbeat of our success, and together we strive to make each day better than the last. If you're looking for a place where your work matters, your ideas are valued, and your growth is supported, PODS is your next destination.

JOB SUMMARY

The Data Engineer - Machine Learning is responsible for scaling a modern data & AI stack to drive revenue growth, improve customer satisfaction, and optimize resource utilization. As an ML Data Engineer, you will bridge data engineering and ML engineering: build high-quality feature pipelines in Snowflake/Snowpark and Databricks, productionize and operate batch/real-time inference, and establish MLOps/LLMOps practices so models deliver measurable business impact at scale.

Note: This role is required onsite at PODS headquarters in Clearwater, FL. The onsite working schedule is Monday - Thursday onsite with Friday remote. It is NOT a remote opportunity.

General Benefits & Other Compensation:
* Medical, dental, and vision insurance
* Employer-paid life insurance and disability coverage
* 401(k) retirement plan with employer match
* Paid time off (vacation, sick leave, personal days)
* Paid holidays
* Parental leave / family leave
* Bonus eligibility / incentive pay
* Professional development / training reimbursement
* Employee assistance program (EAP)
* Commuter benefits / transit subsidies (if available)
* Other fringe benefits (e.g.
wellness credits)

What you will do:
● Design, build, and operate feature pipelines that transform curated datasets into reusable, governed feature tables in Snowflake
● Productionize ML models (batch and real-time) with reliable inference jobs/APIs, SLAs, and observability
● Set up processes in Databricks and Snowflake/Snowpark to schedule, monitor, and auto-heal training/inference pipelines
● Collaborate with our Enterprise Data & Analytics (ED&A) team centered on replicating operational data into Snowflake, enriching it into governed, reusable models/feature tables, and enabling advanced analytics & ML, with Databricks as a core collaboration environment
● Partner with Data Science to optimize models that grow customer base and revenue, improve CX, and optimize resources
● Implement MLOps/LLMOps: experiment tracking, reproducible training, model/asset registry, safe rollout, and automated retraining triggers
● Enforce data governance & security policies and contribute metadata, lineage, and definitions to the ED&A catalog
● Optimize cost/performance across Snowflake/Snowpark and Databricks
● Follow robust and established version control and DevOps practices
● Create clear runbooks and documentation, and share best practices with analytics, data engineering, and product partners

Also, you will:
DELIVER QUALITY RESULTS: Able to deliver top quality service to all customers (internal and external); Able to ensure all details are covered and adhere to company policies; Able to strive to do things right the first time; Able to meet agreed-upon commitments or advise the customer when deadlines are jeopardized; Able to define high standards for quality and evaluate products, services, and own performance against those standards
TAKE INITIATIVE: Able to exhibit tendencies to be self-starting and not wait for signals; Able to be proactive and demonstrate readiness and ability to initiate action; Able to take action beyond what is required and volunteer to take on new
assignments; Able to complete assignments independently without constant supervision
BE INNOVATIVE / CREATIVE: Able to examine the status quo and consistently look for better ways of doing things; Able to recommend changes based on analyzed needs; Able to develop proper solutions and identify opportunities
BE PROFESSIONAL: Able to project a positive, professional image with both internal and external business contacts; Able to create a positive first impression; Able to gain respect and trust of others through personal image and demeanor
ADVANCED COMPUTER USER: Able to use required software applications to produce correspondence, reports, presentations, electronic communication, and complex spreadsheets including formulas and macros and/or databases. Able to operate general office equipment including company telephone system

What you will need:
* Bachelor's or Master's in CS, Data/ML, or related field (or equivalent experience) required
* 4+ years in data/ML engineering building production-grade pipelines with Python and SQL
* Strong hands-on with Snowflake/Snowpark and Databricks; comfort with Tasks & Streams for orchestration
* 2+ years of experience optimizing models: batch jobs and/or real-time APIs, containerized services, CI/CD, and monitoring
* Solid understanding of data modeling and governance/lineage practices expected by ED&A

It would be nice if you had:
* Familiarity with LLMOps patterns for generative AI applications
* Experience with NLP, call center data, and voice analytics
* Exposure to feature stores, model registries, canary/shadow deploys, and A/B testing frameworks
* Marketing analytics domain familiarity (lead scoring, propensity, LTV, routing/prioritization)

MANAGEMENT & SUPERVISORY RESPONSIBILITIES
* Direct supervisor job title(s) typically include: VP, Marketing Analytics
* Job may require supervising Analytics associates

No Unsolicited Resumes from Third-Party Recruiters: Please note that as per PODS policy, we do not accept unsolicited
resumes from third-party recruiters unless such recruiters are engaged to provide candidates for a specified opening and in alignment with our Inclusive Diversity values. Any employment agency, person or entity that submits an unsolicited resume does so with the understanding that PODS will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person, or entity.

DISCLAIMER: The preceding job description has been designed to indicate the general nature of work performed; the level of knowledge and skills typically required; and usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required by employees in this job.

Equal Opportunity, Affirmative Action Employer: PODS Enterprises, LLC is an Equal Opportunity, Affirmative Action Employer. We will not discriminate unlawfully against qualified applicants or employees with respect to any term or condition of employment based on race, color, national origin, ancestry, sex, sexual orientation, age, religion, physical or mental disability, marital status, place of birth, military service status, or other basis protected by law.
    $80k-113k yearly est. 37d ago
  • Data Scientist

    Tampa Bay Lightning 3.6company rating

    Data engineer job in Tampa, FL

    In order to be considered for this role, after clicking "Apply Now" above and being redirected, you must fully complete the application process on the follow-up screen. This position will work with hockey data to fulfill given directives, as well as to create new projects and communicate results. This role will be part of a team-based environment in the hockey analytics department and is expected to work at Benchmark International Arena. Here is a unique opportunity to create impactful solutions in a growth-minded environment and culture. This full-time position will report to the Director of Hockey Analytics and Associate Director of Hockey Analytics. We ask that you submit a cover letter for this position. In your cover letter, please discuss prior work or a project you have done that you are most proud of. Please also discuss why you are interested in the position. Additionally, please provide any examples of your open-source projects or public work that you can share.

Essential Duties & Responsibilities: Build and validate models for hockey metrics using multiple data sources. Create projects to advance our understanding of hockey. Collaborate with other data scientists, data engineers, etc.
Responsible for communicating the results of projects to coaches and management. May be expected to travel a few times a year for work.

Game/Event Responsibilities: Option to attend all home games

Qualifications: Computer Science, Math, Engineering or Physics degree required. Minimum 3-5 years of experience writing modern Python code. Experience using Docker. Experience using SQL databases. Experience working with time series data. Ability to write geometric and physics analyses. Ability to problem solve and a desire for personal development. Experience working in a team-based environment. Familiarity with Git/VCS. Ability to collaborate with others and take ownership of projects. Ability to interpret results and communicate effectively. Ability to understand and implement research papers related to sports and data science.

Preferred Qualifications: Graduate degree in Physics, Math or Engineering. Machine learning concepts and applications. Advanced statistical modelling and knowledge of statistics. Familiarity working with sports data.

We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
    $63k-74k yearly est. 44d ago
  • Interoperability Engineer (Workday)

    Moffitt Cancer Center 4.9company rating

    Data engineer job in Tampa, FL

    Highlights: The Workday Interoperability Engineer serves as a senior technical expert responsible for architecting, deploying, and maintaining Workday integrations and interoperability frameworks that support secure, scalable data exchange across HR, finance, clinical, research, and enterprise systems. Acts as a subject matter expert in Workday integration patterns including Workday Studio, EIBs, RaaS, APIs, event-driven integrations, and streaming/data pipelines. Owns the design and operational delivery of Workday-centric interoperability initiatives, ensuring reliability and alignment with business outcomes. Provides mentorship and technical leadership to engineers and analysts, guiding them in best practices for Workday and enterprise integration. Combines deep Workday integration expertise with an understanding of cross-functional business processes and downstream system dependencies. The role will also be responsible for developing and maintaining frameworks that support information exchange needs across clinical systems.

Responsibilities: Hands-on experience building integrations with Workday HCM, Finance, Payroll, Recruiting, or other Workday modules. Strong understanding of Workday data structures, security groups, calculated fields, and Workday report development (including RaaS). Proficiency in developing integrations using Workday Studio, EIB, Core Connectors, and PECI (Payroll Effective Change Interface). Translate Workday integration requirements into technical specifications, integration contracts, and design standards. Ability to gather API requirements, translate them into technical specifications, and produce comprehensive API design documentation (standards, contracts, and specifications). Hands-on experience implementing application security frameworks, including OAuth2, SAML, OpenID Connect, and JWT.
    - Experience with API testing strategies (functional, regression, performance, and security testing) using tools such as Postman, SoapUI, JMeter, or equivalents.
    - Good understanding of firewall and advanced networking concepts to support secure system integrations.
    - Provide on-call support and keep integration documentation and records up to date.

    Credentials and Experience:
    - Bachelor's degree in computer science, systems analysis, or a related field.
    - Minimum 7 years of experience leading end-to-end integration implementations, with a strong emphasis on Workday and supporting middleware technologies such as Cloverleaf and Boomi.
    - Minimum 3 years of experience working with cross-functional teams, providing expert knowledge of ERP data analysis to design, build, and deploy integrations.

    The ideal candidate will have the following experience:
    - Strong hands-on experience developing Workday integrations using Workday Studio, EIBs, Core Connectors, RaaS, and Workday Web Services.
    - Experience designing and supporting interoperability between Workday and downstream systems.
    - Familiarity with healthcare interoperability concepts and standards such as HL7 or FHIR, especially where Workday interacts with clinical or research environments.
    - Proficiency with integration platforms such as Boomi and/or Cloverleaf for orchestrating Workday-related data flows.
    - Experience with EMR systems such as Epic is a plus, particularly when supporting Workday-to-clinical data exchange.
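The RaaS pattern mentioned above exposes a Workday custom report at a predictable URL and wraps the JSON rows in a top-level `Report_Entry` list. The sketch below illustrates that shape without calling a live tenant; the tenant, report owner, and report names are invented for illustration, and a real integration would add authentication (e.g., OAuth2 or ISU basic auth) on the request.

```python
from urllib.parse import quote

# Hypothetical Workday host -- substitute your tenant's service endpoint.
BASE = "https://wd2-impl-services1.workday.com/ccx/service/customreport2"

def raas_url(tenant: str, owner: str, report: str, fmt: str = "json") -> str:
    """Build the URL for a Workday Report-as-a-Service (RaaS) custom report.

    RaaS exposes a custom report at:
    .../customreport2/<tenant>/<report owner>/<report name>?format=...
    """
    return f"{BASE}/{quote(tenant)}/{quote(owner)}/{quote(report)}?format={fmt}"

def report_rows(payload: dict) -> list:
    """RaaS JSON responses wrap the report rows in a 'Report_Entry' list."""
    return payload.get("Report_Entry", [])

# Parse a truncated RaaS-style response without hitting the network.
sample = {"Report_Entry": [{"Employee_ID": "1001", "Cost_Center": "CC-42"}]}
print(raas_url("acme", "ISU_HR", "INT_Worker_Feed"))
print(len(report_rows(sample)))  # 1
```

Keeping URL construction and payload parsing in small pure functions like this makes the integration testable without a tenant connection.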
    $67k-87k yearly est. 5d ago
  • ETL Architect

    HealthPlan Services 4.7 company rating

    Data engineer job in Tampa, FL

    HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform, and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution, and technology services to insurers of individual, small group, voluntary, and association plans, as well as valuable solutions to thousands of brokers and agents nationwide.

    Job Description

    Position: ETL Architect

    The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.

    Essential Job Functions and Duties:
    - Develop and maintain ETL jobs for data warehouses/marts.
    - Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices.
    - Collaborate with delivery and technical team members on design and development.
    - Collaborate with business partners to understand business processes, underlying data, and reporting needs.
    - Conduct data analysis in support of ETL development and other activities.
    - Assist with data architecture and data modeling.

    Preferred Qualifications:
    - 12+ years of work experience as a Business Intelligence Developer.
    - Work experience with multiple database platforms and BI delivery solutions.
    - 10+ years of experience with end-to-end ETL architecture and data modeling for BI and analytics data marts, including implementing and supporting production environments.
    - 10+ years of experience designing, building, and implementing BI solutions with modern BI tools such as MicroStrategy, Microsoft, and Tableau.
    - Experience as a Data Architect.
    - Experience delivering BI solutions with an Agile BI delivery methodology.
    - Ability to communicate, present, and interact comfortably with senior leadership.
    - Demonstrated proficiency implementing self-service solutions that empower an organization to generate valuable, actionable insights.
    - Strong team player.
    - Ability to absorb information quickly, derive insight, synthesize it clearly and concisely, and devise solutions.
    - Inclination to take initiative, set priorities, take ownership of assigned projects and initiatives, drive for results, and collaborate to achieve the greatest value.
    - Strong relationship-building and interpersonal skills.
    - Demonstrated self-confidence, honesty, and integrity.
    - Conscientious of the Enterprise Data Warehouse release-management process; conduct operations-readiness and environment-compatibility reviews of any changes prior to deployment, with strong sensitivity to impact and SLAs.
    - Experience with data modeling tools a plus.
    - Expertise in data warehousing methodologies and best practices required.
    - Ability to initiate and follow through on complex projects of both short- and long-term duration required.
    - Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps the supervisor informed.
    - Proactively recommends improvements to the performance and operability of the data warehouse and reporting environment.
    - Participate on interdepartmental teams to support organizational goals.
    - Perform other related duties and tasks as assigned.
    - Experience facilitating user sessions and gathering requirements.

    Education Requirements: Bachelor's or equivalent degree in a business, technical, or related field.

    Additional Information: All your information will be kept confidential according to EEO guidelines.
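The source-to-target mapping document described in this listing is typically the contract an ETL job implements: each source column maps to a target column plus a transformation rule. A minimal sketch of driving a transform directly from such a mapping (all column names here are invented for illustration):

```python
# Source-to-target mapping: source column -> (target column, cast/clean rule).
# In practice this would come from the signed-off mapping document.
MAPPING = {
    "MBR_ID":   ("member_id",   str.strip),
    "PLAN_CD":  ("plan_code",   str.upper),
    "PREM_AMT": ("premium_usd", float),
}

def transform(source_row: dict) -> dict:
    """Rename each source column and coerce its value per the mapping."""
    target = {}
    for src_col, (tgt_col, cast) in MAPPING.items():
        target[tgt_col] = cast(source_row[src_col])
    return target

row = {"MBR_ID": " 00123 ", "PLAN_CD": "gold", "PREM_AMT": "412.50"}
print(transform(row))
# {'member_id': '00123', 'plan_code': 'GOLD', 'premium_usd': 412.5}
```

Encoding the mapping as data rather than inline code keeps the ETL job auditable against the design document, which matters for the release-management and operations-readiness reviews the listing mentions.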
    $84k-105k yearly est. 60d+ ago

Learn more about data engineer jobs

How much does a data engineer earn in Gulfport, FL?

The average data engineer in Gulfport, FL earns between $63,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Gulfport, FL

$85,000
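As a sanity check on the figures above: the quoted $85,000 average sits below the arithmetic midpoint of the $63,000-$115,000 range and is close to its geometric mean, a summary that salary aggregators sometimes prefer for right-skewed pay distributions (an assumption about this site's methodology, not a documented fact):

```python
import math

low, high = 63_000, 115_000  # quoted Gulfport, FL range

arith_mean = (low + high) / 2     # midpoint of the range
geo_mean = math.sqrt(low * high)  # geometric mean

print(round(arith_mean))      # 89000
print(round(geo_mean, -3))    # ~85000, near the quoted average
```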

What are the biggest employers of Data Engineers in Gulfport, FL?

The biggest employers of Data Engineers in Gulfport, FL are:
  1. Raymond James Financial
  2. Leons Texas Cuisine