Senior Data Engineer
Data engineer job in Saint Petersburg, FL
Sr. Data Engineer
CLIENT: Fortune 150 Company; Financial Services
SUMMARY DESCRIPTION:
The Data Engineer will serve in a strategic role, designing and managing the infrastructure that supports data storage, transformation, processing, and retrieval, enabling efficient data analysis and decision-making within the organization. This position is a critical part of the Database and Analytics team, which is responsible for the design, development, and implementation of complex enterprise-level data integration and consumption solutions. It requires a highly technical, self-motivated senior engineer who will work with analysts, architects, and systems engineers to develop solutions based on functional and technical specifications that meet quality and performance requirements.
Experience with Microsoft Fabric is required.
PRIMARY DUTIES AND RESPONSIBILITIES:
Utilize experience in ETL tools, with at least 5 years dedicated to Azure Data Factory (ADF), to design, code, implement, and manage multiple parallel data pipelines. Experience with Microsoft Fabric, Pipelines, Mirroring, and Dataflow Gen2 is required.
Apply a deep understanding of data warehousing concepts, including data modeling techniques such as star and snowflake schemas, SCD Type 2, change data feeds, and change data capture. Demonstrate hands-on experience with Data Lake Gen 2, Delta Lake, Delta/Parquet files, JSON files, and big-data storage layers; optimize and maintain big-data storage using partitioning, V-Order, OPTIMIZE, VACUUM, and other techniques.
Design and optimize medallion data models, warehouses, architectures, schemas, indexing, and partitioning strategies.
Collaborate with Business Insights and Analytics teams to understand data requirements and optimize storage for analytical queries.
Modernize databases and data warehouses and prepare them for analysis, managing for optimal performance.
Design, build, manage, and optimize enterprise data pipelines ensuring efficient data flow, data integrity, and data quality throughout the process.
Automate efficient data acquisition, transformation, and integration from a variety of data sources including databases, APIs, message queues, data streams, etc.
Perform advanced data tasks with minimal supervision, including architecting advanced data solutions, leading and coaching others, and effectively partnering with stakeholders.
Interface with other technical and non-technical departments and outside vendors on assigned projects.
Under the direction of IT Management, establish standards, policies, and procedures pertaining to data governance, database/data warehouse management, metadata management, security, optimization, and utilization.
Ensure data security and privacy by implementing access controls, encryption, and anonymization techniques as per data governance and compliance policies.
Manage schema drift within ETL processes, ensuring robust and adaptable data integration solutions.
Document data pipelines, processes, and architectural designs for future reference and knowledge sharing.
Stay informed of latest trends and technologies in the data engineering field, and evaluate and adopt new tools, frameworks, and platforms (like Microsoft Fabric) to enhance data processing and storage capabilities.
When necessary, implement and document schema modifications made to legacy production environment.
Perform any other function required by IT Management for the successful operation of all IT and data services provided to our clients.
Available nights and weekends as needed for system changes and rollouts.
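The SCD Type 2 pattern named in the duties above can be sketched in plain Python. This is a minimal illustration with hypothetical column names (`cust_id`, `city`), not the ADF/Fabric implementation the role would use:

```python
from copy import deepcopy

def apply_scd2(dim_rows, incoming, key, tracked, load_date):
    """Apply a Slowly Changing Dimension Type 2 update: expire the
    current row when a tracked attribute changes, then insert a new
    'current' version stamped with load_date."""
    rows = deepcopy(dim_rows)
    current = {r[key]: r for r in rows if r["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old and all(old[c] == rec[c] for c in tracked):
            continue  # no change: keep the existing current row
        if old:
            old["is_current"] = False   # expire the prior version
            old["end_date"] = load_date
        rows.append({key: rec[key], **{c: rec[c] for c in tracked},
                     "start_date": load_date, "end_date": None,
                     "is_current": True})
    return rows

dim = [{"cust_id": 1, "city": "Tampa", "start_date": "2024-01-01",
        "end_date": None, "is_current": True}]
new = [{"cust_id": 1, "city": "Orlando"}]
updated = apply_scd2(dim, new, "cust_id", ["city"], "2025-06-01")
```

In a Fabric or Databricks environment the same logic is typically expressed as a `MERGE INTO` statement against a Delta table rather than row-by-row Python.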
EDUCATION AND EXPERIENCE REQUIREMENTS:
Bachelor's or Master's degree in computer science, information systems, applied mathematics, or closely related field.
Minimum of ten (10) years full time employment experience as a data engineer, data architect, or equivalent required.
Experience with Microsoft Fabric is required.
SKILLS:
Experience working with large, heterogeneous datasets to build and optimize data pipelines, pipeline architectures, and integrated datasets using traditional and modern data integration technologies (such as ETL, ELT, MPP, data replication, change data capture, message-oriented data movement, API design, stream data integration, and data virtualization)
Experience working with cloud data engineering stacks (specifically Azure and Microsoft Fabric), Data Lake, Synapse, Azure Data Factory, Databricks, Informatica, Data Explorer, etc.
Strong, in-depth understanding of database architecture, storage, and administration utilizing Azure stack.
Deep understanding of Data architectural approaches, Data Engineering Solutions, Software Engineering principles and best practices.
Working knowledge and experience with modern BI and ETL tools (Power BI, Power Automate, ADF, SSIS, etc.)
Experience utilizing data storage solutions including Azure Blob storage, ADLS Gen 2.
Solid understanding of relational and dimensional database principles and best practices in a client/server, thin-client, and cloud computing environment.
Advanced working knowledge of T-SQL and SQL Server, including transactions, error handling, security, and maintenance, with experience writing complex stored procedures, views, user-defined functions, dynamic SQL, partitioning, CDC, CDF, etc.
Experience with .NET scripting and an understanding of API integration in a service-oriented architecture.
Knowledge of reporting tools, query language, semantic models with specific experience with Power BI.
Understanding of and experience with agile methodology.
PowerShell scripting experience desired.
Experience with Service Bus, Azure Functions, Event Grids, Event Hubs, Kafka would be beneficial.
Working Conditions:
Available to work evenings and/or weekends (as required).
Workdays and hours are Monday through Friday 8:30 am to 5:30 pm ET.
Sr. Data Modeler
Data engineer job in Tampa, FL
Role: Sr. Data Modeler
The Senior Database Designer is responsible for building the organization's enterprise data models and database structures. The role is responsible for conceptual, logical, and physical data modeling that supports operational systems, analytical workloads, and harmonized data domains within the enterprise data ecosystem. The position will partner closely with business SMEs, data engineering, governance, and analytics teams to ensure that data structures are documented, standardized, scalable, performant, and aligned to corporate governance policies and integration standards. The successful candidate will bring deep expertise in dimensional and relational modeling, strong proficiency with modern cloud data platforms, and the ability to drive modeling best practices across the organization.
Key Responsibilities
Enterprise Data Modeling and Architecture
• Lead the design and delivery of conceptual, logical, and physical data models for enterprise data domains and data products (operational and analytic).
• Develop harmonized, reusable, and governed data models that support single-source-of-truth design principles.
• Establish and maintain modeling standards, including naming conventions, dimensional modeling patterns, SCD2 strategies, surrogate key methodologies, lineage documentation, and data enrichment frameworks.
• Design models to support high-volume incremental ingestion (CDC), complex history tracking, and auditable data transformations.
• Produce and maintain full metadata and lineage documentation through approved tools (e.g., ER/Studio, Unity Catalog).
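The surrogate key methodology mentioned in the modeling standards above can be illustrated with a small sketch. The natural keys here are hypothetical, and the scheme assumed is simple monotonic integer minting:

```python
import itertools

def surrogate_key_lookup(existing):
    """Return a lookup that maps natural business keys to stable
    integer surrogate keys, minting a new key for unseen keys."""
    counter = itertools.count(max(existing.values(), default=0) + 1)
    def lookup(natural_key):
        if natural_key not in existing:
            existing[natural_key] = next(counter)
        return existing[natural_key]
    return lookup

# Hypothetical policy-number natural keys already in the dimension
keys = {"POL-100": 1, "POL-101": 2}
sk = surrogate_key_lookup(keys)
```

Keeping the surrogate key independent of the business key is what lets an SCD2 dimension hold multiple versions of the same natural key.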
Integration, Data Engineering Enablement, and Delivery
• Create detailed source-to-target mappings aligned to model definitions and business rules to support data engineering development.
• Partner with data pipeline engineering to validate build quality, ensure model fidelity in pipelines, and support UAT and performance testing.
• Contribute to database and datamart design for analytics solutions, including fact and dimension architectures, semantic layers, and data consumption optimization.
Performance, Quality, and Governance
• Validate data model performance characteristics; recommend indexing, partitioning, and clustering strategies for the data platform.
• Collaborate with Data Governance to ensure data definitions, standards, quality rules, and ownership are aligned to enterprise data strategy.
• Design models emphasizing security classification, access permissions, compliance obligations, and auditability.
Stakeholder Engagement
• Serve as a trusted advisor to product owners, business leaders, and analytics users, translating business requirements into data structures that support meaningful insights.
• Communicate tradeoffs and design alternatives when evaluating new use cases or changes to the enterprise model.
• Contribute to roadmap planning for enterprise data domains and long-term architectural evolution.
Qualifications
• Required
o Bachelor's or Master's degree in Computer Science, Information Systems, or a related discipline.
o 7+ years of progressive experience in data modeling, database design, and data architecture.
o Demonstrated expertise with relational and dimensional modeling (3NF and star schema design).
o Proficiency with cloud-based modern data stack environments (Azure preferred; Databricks experience highly valued).
o Strong proficiency with SQL for model validation, profiling, and optimization.
o Experience with data modeling tools such as ER/Studio, ERwin, DB Schema, or equivalent.
o Hands-on experience supporting data warehouses, datamarts, and metadata-driven modeling approaches.
o Experience supporting data ingestion and CDC design patterns and SCD2 data history strategy.
o Strong attention to detail regarding data quality, lineage, governance, and documentation.
o Excellent communication skills with proven ability to clearly articulate design rationale to technical and non-technical audiences.
• Preferred
o Experience in the insurance or financial services industry with knowledge of policy, client, and revenue data structures.
o Familiarity with ETL/ELT orchestration tools (Fivetran, Airflow, MuleSoft) and distributed processing frameworks (Spark).
o Experience with semantic modeling layers (e.g., Tableau semantic layer, dbt metrics, or similar).
o Certification in cloud platforms (Azure Data Engineer, AWS Data Analytics, or equivalent).
Ab Initio Developer - Hadoop
Data engineer job in Tampa, FL
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit *******************
Ab Initio Developer
Location: Irving, TX/ Tampa, FL
Duration: Full time.
Ab Initio GDE basics: Experience working with core Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Sort, Lookup, etc.
Conduct>It: Should have knowledge of Plans.
Advanced Ab Initio: Vectors and XML; must have working knowledge of metaprogramming and PDL.
Big Data/Hadoop: Hive and HDFS experience required.
Control Center/Tivoli: Should understand at least one scheduling tool.
BRE/ACE/Express>It: Working knowledge of Express>It and EZ graphs is a plus.
Metadata Hub: Must have working knowledge of MHub.
Oracle/PL/SQL: Should have good knowledge of complex SQL.
Unix/shell scripting: Knowledge of Ab Initio air and m_ commands and complex shell scripting.
Design/Automation: Automation experience required; design experience is also desirable.
HLD/LLD documentation: Proficient in creating project documentation such as end-user manuals and operations hand-off guides.
Data warehouse concepts: Good understanding of data warehousing concepts.
Communication skills: Good verbal and written communication skills.
Domain experience: Experience in the BFSI domain is preferable.
Problem solving/management skills: Very good problem-solving skills.
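The Rollup behavior listed above (aggregate grouped records after a Sort) can be approximated in Python for illustration; the field names are hypothetical, and this is a sketch of the component's semantics, not Ab Initio code:

```python
from itertools import groupby
from operator import itemgetter

def rollup(records, key, agg_field):
    """Group records by key and sum one field, mirroring what an
    Ab Initio Rollup component does downstream of a Sort."""
    records = sorted(records, key=itemgetter(key))  # Rollup expects sorted input
    return {k: sum(r[agg_field] for r in grp)
            for k, grp in groupby(records, key=itemgetter(key))}

txns = [{"acct": "A", "amt": 10},
        {"acct": "B", "amt": 5},
        {"acct": "A", "amt": 7}]
totals = rollup(txns, "acct", "amt")
```

As with `itertools.groupby`, Rollup only aggregates correctly when its input arrives grouped, which is why it typically follows a Sort or a partition-by-key.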
Benefits/perks listed below may vary depending on the nature of your employment with LTIMindtree (“LTIM”):
Benefits and Perks:
Comprehensive Medical Plan Covering Medical, Dental, Vision
Short Term and Long-Term Disability Coverage
401(k) Plan with Company match
Life Insurance
Vacation Time, Sick Leave, Paid Holidays
Paid Paternity and Maternity Leave
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, colour, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Lead Data Engineer
Data engineer job in Tampa, FL
A leading Investment Management Firm is looking to bring on a Lead Data Engineer to join its team in Tampa, Denver, Memphis, or Southfield. This is an excellent chance to work alongside industry leaders while staying hands-on and helping lead the team.
Key Responsibilities
Project Oversight: Direct end-to-end software development activities, from initial requirements through deployment, ensuring projects meet deadlines and quality standards.
Database Engineering: Architect and refine SQL queries, stored procedures, and schema designs to maximize efficiency and scalability within Oracle environments.
Performance Tuning: Evaluate system performance and apply strategies to enhance data storage and retrieval processes.
Data Processing: Utilize tools like Pandas and Spark for data wrangling, transformation, and analysis.
Python Solutions: Develop and maintain Python-based applications and automation workflows.
Pipeline Automation: Implement and manage continuous integration and delivery pipelines using Jenkins and similar technologies to optimize build, test, and release cycles.
Team Development: Guide and support junior engineers, promoting collaboration and technical growth.
Technical Documentation: Create and maintain comprehensive documentation for all development initiatives.
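The data-wrangling step above would normally be done with Pandas or Spark; as an illustration of the same transform shape (parse, cast, derive, filter), here is a standard-library sketch with hypothetical columns:

```python
import csv
import io

def wrangle(csv_text, min_qty):
    """Parse raw CSV, cast types, derive a column, and filter rows -
    the kind of transformation usually written with Pandas or Spark."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for r in rows:
        r["qty"] = int(r["qty"])          # cast strings to numerics
        r["price"] = float(r["price"])
        r["total"] = r["qty"] * r["price"]  # derived column
    return [r for r in rows if r["qty"] >= min_qty]

raw = "sku,qty,price\nA1,3,2.50\nB2,1,9.99\n"
clean = wrangle(raw, 2)
```

The Pandas equivalent would be a `read_csv` followed by vectorized column arithmetic and a boolean filter; the logic is the same, only the execution model changes.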
Core Skills
Experience: Over a decade in software engineering, with deep expertise in Python and Oracle database systems.
Technical Knowledge: Strong command of SQL, Oracle, Python, Spark, Jenkins, Kubernetes, Pandas, and modern CI/CD practices.
Optimization Expertise: Skilled in database tuning and applying best practices for performance.
Leadership Ability: Proven track record in managing teams and delivering complex projects.
Analytical Strength: Exceptional problem-solving capabilities with a data-centric mindset.
Communication: Clear and effective written and verbal communication skills.
Education: Bachelor's degree in Computer Science, Engineering, or equivalent professional experience.
Preferred Qualifications
Certifications: Professional credentials in Oracle, Python, Kubernetes, or CI/CD technologies.
Agile Background: Hands-on experience with Agile or Scrum frameworks.
Cloud Platforms: Familiarity with AWS, Azure, or Google Cloud services.
Sr. Data Engineer (SQL+Python+AWS)
Data engineer job in Saint Petersburg, FL
Looking for a Sr. Data Engineer (SQL+Python+AWS) for a 12+ month contract (potential extension, or may convert to full-time), hybrid at St. Petersburg, FL 33716, with a direct financial client; W2 only, for US citizens or green card holders.
Notes from the Hiring Manager:
• Setting up Python environments and data structures to support the Data Science/ML team.
• No prior Data Science or Machine Learning experience required.
• Role involves building new data pipelines and managing file-loading connections.
• Strong SQL skills are essential.
• Contract-to-hire position.
• Hybrid role based in St. Pete, FL (33716) only.
Duties:
This role involves building and maintaining data pipelines that connect Oracle-based source systems to AWS cloud environments, providing well-structured data for analysis and machine learning in AWS SageMaker.
It includes working closely with data scientists to deliver scalable data workflows as a foundation for predictive modeling and analytics.
• Develop and maintain data pipelines to extract, transform, and load data from Oracle databases and other systems into AWS environments (S3, Redshift, Glue, etc.).
• Collaborate with data scientists to ensure data is prepared, cleaned, and optimized for SageMaker-based machine learning workloads.
• Implement and manage data ingestion frameworks, including batch and streaming pipelines.
• Automate and schedule data workflows using AWS Glue, Step Functions, or Airflow.
• Develop and maintain data models, schemas, and cataloging processes for discoverability and consistency.
• Optimize data processes for performance and cost efficiency.
• Implement data quality checks, validation, and governance standards.
• Work with DevOps and security teams to comply with RJ standards.
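The extract-transform-load flow described in the duties can be sketched end to end. In this illustration `sqlite3` stands in for the Oracle source and a dict stands in for the S3 sink; both are stand-ins for readability, not the production stack (which would use `pyodbc`/`oracledb` and `boto3`):

```python
import json
import sqlite3

def extract_transform_load(conn, sink):
    """Minimal ETL: extract rows from a relational source, transform
    them into JSON records, and 'load' them into a key/value sink."""
    cur = conn.execute("SELECT id, balance FROM accounts WHERE balance > 0")
    for row_id, balance in cur:
        # key layout mimics an S3 object path
        sink[f"accounts/{row_id}.json"] = json.dumps(
            {"id": row_id, "balance": round(balance, 2)})
    return sink

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 10.5), (2, -3.0)])
s3_stub = extract_transform_load(conn, {})
```

With `boto3` the final step would become an `s3.put_object` call, and the query predicate would usually be pushed down to the source database exactly as shown.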
Skills:
Required:
• Strong proficiency with SQL and hands-on experience working with Oracle databases.
• Experience designing and implementing ETL/ELT pipelines and data workflows.
• Hands-on experience with AWS data services, such as S3, Glue, Redshift, Lambda, and IAM.
• Proficiency in Python for data engineering (pandas, boto3, pyodbc, etc.).
• Solid understanding of data modeling, relational databases, and schema design.
• Familiarity with version control, CI/CD, and automation practices.
• Ability to collaborate with data scientists to align data structures with model and analytics requirements.
Preferred:
• Experience integrating data for use in AWS SageMaker or other ML platforms.
• Exposure to MLOps or ML pipeline orchestration.
• Familiarity with data cataloging and governance tools (AWS Glue Catalog, Lake Formation).
• Knowledge of data warehouse design patterns and best practices.
• Experience with data orchestration tools (e.g., Apache Airflow, Step Functions).
• Working knowledge of Java is a plus.
Education:
B.S. in Computer Science, MIS or related degree and a minimum of five (5) years of related experience or combination of education, training and experience.
ML Data Engineer #978695
Data engineer job in Seffner, FL
Job Title: Data Engineer - AI/ML Pipelines
Work Model: Hybrid
Duration: CTH
The Data Engineer - AI/ML Pipelines plays a key role in designing, building, and maintaining scalable data infrastructure that powers analytics and machine learning initiatives. This position focuses on developing production-grade data pipelines that support end-to-end ML workflows, from data ingestion and transformation to feature engineering, model deployment, and monitoring.
The ideal candidate has hands-on experience working with operational systems such as Warehouse Management Systems (WMS) or ERP platforms, and is comfortable partnering closely with data scientists, ML engineers, and operational stakeholders to deliver high-quality, ML-ready datasets.
Key Responsibilities
ML-Focused Data Engineering
Build, optimize, and maintain data pipelines specifically designed for machine learning workflows.
Collaborate with data scientists to develop feature sets, implement data versioning, and support model training, evaluation, and retraining cycles.
Participate in initiatives involving feature stores, model input validation, and monitoring of data quality feeding ML systems.
Data Integration from Operational Systems
Ingest, normalize, and transform data from WMS, ERP, telemetry, and other operational data sources.
Model and enhance operational datasets to support real-time analytics and predictive modeling use cases.
Pipeline Automation & Orchestration
Build automated, reliable, and scalable pipelines using tools such as Azure Data Factory, Airflow, or Databricks Workflows.
Ensure data availability, accuracy, and timeliness across both batch and streaming systems.
Data Governance & Quality
Implement validation frameworks, anomaly detection, and reconciliation processes to ensure high-quality ML inputs.
Support metadata management, lineage tracking, and documentation of governed, auditable data flows.
Cross-Functional Collaboration
Work closely with data scientists, ML engineers, software engineers, and business teams to gather requirements and deliver ML-ready datasets.
Translate modeling and analytics needs into efficient, scalable data architecture solutions.
Documentation & Mentorship
Document data flows, data mappings, and pipeline logic in a clear, reproducible format.
Provide guidance and mentorship to junior engineers and analysts on ML-focused data engineering best practices.
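The validation frameworks mentioned under Data Governance & Quality can start as simple declarative rule checks. This sketch uses a hypothetical warehouse-telemetry field and range; it flags missing and out-of-range values before rows reach an ML pipeline:

```python
def validate_batch(rows, schema):
    """Run simple data-quality checks on ML input rows: required
    fields must be present and numeric values within expected ranges.
    Returns a list of (row_index, field, problem) tuples."""
    errors = []
    for i, row in enumerate(rows):
        for field, (lo, hi) in schema.items():
            if field not in row or row[field] is None:
                errors.append((i, field, "missing"))
            elif not (lo <= row[field] <= hi):
                errors.append((i, field, "out_of_range"))
    return errors

# Hypothetical rule: a warehouse pick should take between 0s and 1h
schema = {"pick_time_sec": (0, 3600)}
batch = [{"pick_time_sec": 42}, {"pick_time_sec": -5}, {}]
issues = validate_batch(batch, schema)
```

Frameworks such as Great Expectations or dbt tests formalize this same idea, adding scheduling, reporting, and lineage around the rule definitions.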
Required Qualifications
Technical Skills
Strong experience building ML-focused data pipelines, including feature engineering and model lifecycle support.
Proficiency in Python, SQL, and modern data transformation tools (dbt, Spark, Delta Lake, or similar).
Solid understanding of orchestrators and cloud data platforms (Azure, Databricks, etc.).
Familiarity with ML operations tools such as MLflow, TFX, or equivalent frameworks.
Hands-on experience working with WMS or operational/logistics data.
Experience
5+ years in data engineering, with at least 2 years directly supporting AI/ML applications or teams.
Experience designing and maintaining production-grade pipelines in cloud environments.
Proven ability to collaborate with data scientists and translate ML requirements into scalable data solutions.
Education & Credentials
Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related field (Master's preferred).
Relevant certifications are a plus (e.g., Azure AI Engineer, Databricks ML, Google Professional Data Engineer).
Preferred Qualifications
Experience with real-time ingestion using Kafka, Kinesis, Event Hub, or similar.
Exposure to MLOps practices and CI/CD for data pipelines.
Background in logistics, warehousing, fulfillment, or similar operational domains.
Active Directory Engineer (AD CS, Certificate Web Enrollment, NDES and Online Responder)
Data engineer job in Tampa, FL
S3/Strategic Staffing Solutions has an Active Directory Engineering opportunity for a leading utilities client in Tampa, FL. Please review the following if you are interested in joining a leading organization!
Duration: 12 months + possible extension
Pay Rate: $50-60/hr on W2. W2 only; sorry, we cannot do C2C.
Qualifications & Description:
Must have a solid understanding of the AD CS role services like Certificate Web Enrollment, NDES and Online Responder.
We are seeking a highly skilled Senior IT Contractor to lead and manage our enterprise Certificate Management operations, with a strong focus on Microsoft Certificate Management and Active Directory integration. This role is critical to ensuring the security, reliability, and compliance of our digital identity infrastructure.
Key Responsibilities:
Oversee the lifecycle management of digital certificates across the enterprise.
Administer and maintain Microsoft Certificate Services, including deployment, renewal, revocation, and auditing.
Integrate certificate management with Microsoft Active Directory and Group Policy for automated certificate enrollment.
Develop and enforce certificate policies, standards, and procedures.
Monitor certificate expiration and proactively mitigate risks of service disruption.
Collaborate with security, infrastructure, and application teams to support secure communications and authentication.
Troubleshoot certificate-related issues across various platforms and services.
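Proactive expiration monitoring, as described above, reduces to a days-to-expiry sweep over the certificate inventory. This sketch assumes the expiry dates have already been collected (e.g., exported from AD CS); names and dates are hypothetical:

```python
from datetime import date

def expiring_certs(certs, today, warn_days=30):
    """Flag certificates expiring within warn_days so renewals can
    be scheduled before a service disruption occurs."""
    flagged = []
    for name, not_after in certs.items():
        days_left = (not_after - today).days
        if days_left <= warn_days:
            flagged.append((name, days_left))
    return sorted(flagged, key=lambda t: t[1])  # most urgent first

inventory = {"web01": date(2025, 7, 1), "vpn": date(2026, 1, 1)}
alerts = expiring_certs(inventory, today=date(2025, 6, 15))
```

In an AD CS environment the inventory itself would typically come from `certutil` output or PowerShell's PKI module rather than a hand-built dict.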
Sr. Software Engineer (On-Site)
Data engineer job in Saint Petersburg, FL
Compensation: $160,000-$170,000 + 15% bonus
Responsibilities
* Design, develop, and maintain scalable, high-availability applications using Azure services.
* Implement containerized applications using Azure Container Apps and orchestration tools such as Kubernetes.
* Utilize Azure Redis Cache for high-performance data retrieval and caching strategies.
* Develop and optimize SQL Server databases for performance and scalability.
* Design and implement RESTful APIs and integrate microservices to support application functionality.
* Develop front-end interfaces using React, ensuring a seamless and responsive user experience.
* Develop GenAI solutions and architecture
* Collaborate with cross-functional teams to define, design, and ship new features.
* Ensure the best possible performance, quality, and responsiveness of applications.
* Maintain code quality, organization, and automation.
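The Redis caching strategy mentioned in the responsibilities typically follows the cache-aside pattern. This sketch uses an in-memory dict in place of Azure Cache for Redis, so the pattern is visible without a live cache server:

```python
import time

class CacheAside:
    """Cache-aside lookup with a TTL: check the cache first, and on a
    miss (or expiry) load from the source and repopulate the cache."""
    def __init__(self, loader, ttl_seconds=60):
        self.loader, self.ttl = loader, ttl_seconds
        self.store = {}          # key -> (value, expiry timestamp)
        self.hits = self.misses = 0

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self.store.get(key)
        if entry and entry[1] > now:
            self.hits += 1
            return entry[0]
        self.misses += 1                 # miss or expired: reload
        value = self.loader(key)
        self.store[key] = (value, now + self.ttl)
        return value

# Hypothetical loader standing in for a SQL Server query
cache = CacheAside(loader=lambda k: k.upper(), ttl_seconds=60)
first = cache.get("user:1", now=0)   # miss: loads from source
second = cache.get("user:1", now=1)  # hit: served from cache
```

With Redis the dict operations become `GET` and `SETEX` calls; the control flow is identical.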
Requirements
* Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
* Minimum of 5 years of software development experience with a proven track record of successful development efforts.
* Strong experience with Azure cloud services, Azure DevOps, and CI/CD pipelines.
* Expert knowledge of Container Apps and Docker.
* Proficient with Redis and SQL Server, as well as database design and management.
* In-depth experience with building and consuming APIs in microservices architecture.
* Solid understanding of React and modern front-end development practices.
Relevant Certifications
* Microsoft Certified: Azure Developer Associate
* Microsoft Certified: Azure Solutions Architect Expert
* Certified Kubernetes Application Developer (CKAD) or Certified Kubernetes Administrator (CKA)
* Microsoft Certified: Azure Data Engineer Associate
* Microsoft Certified: Azure Database Administrator Associate
* React certification from a recognized provider
Senior Frontend Developer
Data engineer job in Tampa, FL
We are seeking a talented UI Developer with strong Angular experience to join our team in Tampa, FL. The ideal candidate will have a passion for building intuitive, responsive, and scalable user interfaces.
Responsibilities:
Develop and maintain web applications using Angular framework.
Collaborate with UX designers and backend developers to deliver seamless user experiences.
Optimize application performance and ensure cross-browser compatibility.
Write clean, maintainable, and well-documented code.
Required Skills:
5+ years of experience in UI development.
Strong proficiency in Angular (latest versions), TypeScript, HTML5, CSS3, and JavaScript.
Experience with RESTful APIs and integrating front-end with backend services.
Familiarity with responsive design and modern UI/UX principles.
Nice to Have:
Experience with RxJS, NgRx, or similar state management libraries.
Knowledge of Agile methodologies.
Employment Details:
Type: W2
Location: Tampa, FL (Hybrid - 3 days onsite per week)
Data Scientist (Exploitation Specialist Level-3) - Tampa, FL
Data engineer job in Tampa, FL
Job Description
Masego is an award-winning small business that specializes in GEOINT services. As a Service-Disabled Veteran-Owned Small Business (SDVOSB), we recognize and award your hard work.
Description
We are looking for a Level-3 TS/SCI-cleared Data Scientist to join our team. This role provides automation/collection support to the main team at NGA Washington, so it relies on good communication skills and a baseline knowledge of GEOINT collection and/or automation systems such as JEMA.
Minimum Required Qualifications:
At least 5 years of related GEOINT work experience, or 2 years with a relevant Bachelor's degree.
Able to work on client site 40 hours a week (very limited option for telework).
Proficient with Python
Experience with JEMA
Preferred Qualifications:
Experience with multiple intelligence types (SIGINT, OSINT, ELINT, GEOINT, MASINT, HUMINT)
Experience with Brewlytics, ArcPro and/or other geospatial data analysis tools
Knowledge of GEOINT collection and associated NGA/NRO systems
Proficiency with common programming languages including R, SQL, HTML, and JavaScript
Experience analyzing geospatially enabled data
Ability to learn new technologies and adapt to dynamic mission needs
Ability to work collaboratively with a remote team (main gov team is based out of NGA Washington)
Experience providing embedded data science/automation support to analytic teams
Security Clearance Requirement:
Active TS/SCI, with a willingness to take a polygraph test.
Salary: $128,600, based on ability to meet or exceed stated requirements
About Masego
Masego Inc. provides expert Geospatial Intelligence Solutions in addition to Activity Based Intelligence (ABI) and GEOINT instructional services. Masego provides expert-level Geospatial Collection Management, Full Motion Video; Human Geography; Information Technology and Cyber; Technical Writing; and ABI, Agile, and other professional training.
Masego is a Service-Disabled Veteran-Owned Small Business headquartered in Fredericksburg, Virginia. With high-level expertise and decades of experience, coupled with proven project management systems and top-notch client support, Masego enhances the performance capabilities of the Department of Defense and the intelligence community.
Pay and Benefits
We seek to provide and take care of our team members. We currently offer Medical, Dental, Vision, 401k, Generous PTO, and more!
Diversity
Masego, Inc. is an equal opportunity/equal access/affirmative action employer fully committed to achieving a diverse workforce and complies with all applicable Federal and Virginia State laws, regulations, and executive orders regarding nondiscrimination and affirmative action in its programs and activities. Masego, Inc. does not discriminate on the basis of race, color, religion, ethnic or national origin, gender, genetic information, age, disability, sexual orientation, gender identity, gender expression, and veteran's status.
ETL Architect
Data engineer job in Tampa, FL
HealthPlan Services (HPS) is the nation's largest independent provider of sales, benefits administration, retention, reform and technology solutions to the insurance and managed care industries. Headquartered in Tampa, Florida, HPS was founded in 1970 and employs 1,500+ associates. HPS stands at the forefront of the insurance industry, providing exchange connectivity, administration, distribution and technology services to insurers of individual, small group, voluntary and association plans, as well as valuable solutions to thousands of brokers and agents, nationwide.
Job Description
Position: ETL Architect
The ETL Architect will have experience delivering BI solutions with an Agile BI delivery methodology.
Essential Job Functions and Duties:
Develop and maintain ETL jobs for data warehouses/marts
Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and best practices
Collaborate with delivery and technical team members on design and development
Collaborate with business partners to understand business processes, underlying data, and reporting needs
Conduct data analysis in support of ETL development and other activities
Assist with data architecture and data modeling
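Source-to-target mappings like those named in the duties can be expressed as data rather than code, which keeps the design document and the implementation in sync. This sketch uses hypothetical column names:

```python
def apply_mapping(source_row, mapping):
    """Apply a source-to-target mapping spec: each target column is
    produced from a source column plus an optional transform."""
    return {target: (transform(source_row[src]) if transform
                     else source_row[src])
            for target, (src, transform) in mapping.items()}

# Mapping spec: target column -> (source column, transform or None)
mapping = {
    "customer_name": ("cust_nm", str.strip),
    "state_code":    ("st", str.upper),
    "plan_id":       ("plan", None),
}
row = {"cust_nm": " Jane Doe ", "st": "fl", "plan": 1234}
target = apply_mapping(row, mapping)
```

Driving the ETL from a declarative spec like this is what makes the mapping document auditable: reviewers can read the dict without tracing procedural code.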
Preferred Qualifications:
12+ years of work experience as Business Intelligence Developer
Work experience with multiple database platforms and BI delivery solutions
10+ years of experience with end-to-end ETL architecture, data modeling, BI and analytics data marts, and implementing and supporting production environments.
10+ years of experience designing, building and implementing BI solutions with modern BI tools such as MicroStrategy, Microsoft and Tableau
Experience as a Data Architect
Experience delivering BI solutions with an Agile BI delivery methodology
Ability to communicate, present and interact comfortably with senior leadership
Demonstrated proficiency implementing self-service solutions to empower an organization to
generate valuable actionable insights
Strong team player
Ability to understand information quickly, derive insight, synthesize information clearly
and concisely, and devise solutions
Inclination to take initiative, set priorities, take ownership of assigned projects and
initiatives, drive for results, and collaborate to achieve greatest value
Strong relationship-building and interpersonal skills
Demonstrated self-confidence, honesty and integrity
Conscientious of the Enterprise Data Warehouse release management process; conducts operations readiness and environment compatibility reviews of any changes prior to deployment, with strong sensitivity around impact and SLAs
Experience with data modeling tools a plus.
Expertise in data warehousing methodologies and best practices required.
Ability to initiate and follow through on complex projects of both short- and long-term duration required.
Works independently, assumes responsibility for job development and training, researches and resolves questions and problems, requests supervisor input, and keeps supervisor informed.
Proactively recommends improvements to the performance and operability of the data warehouse and reporting environment.
Participate on interdepartmental teams to support organizational goals
Perform other related duties and tasks as assigned
Experience facilitating user sessions and gathering requirements
Experience facilitating user sessions and gathering requirements
Education Requirements:
Bachelor's degree or equivalent in a business, technical, or related field
Additional Information
All your information will be kept confidential according to EEO guidelines.
48627 Data Scientist
Data engineer job in Tampa, FL
Data Scientist
What you'll work on:
Work with subject matter experts, team leads, and third party vendors to define new data science prototypes.
Build AI solutions: NLP, regression, clustering, embeddings, recommendation, retrieval, anomaly detection, LLMs.
Design, code, test, and document data science microservices, typically in Python, React, and Docker.
Support the mapping of disparate bulk data sources to a unified database.
Create graph traversal queries and analytic pipelines to support analyst use cases. Support transitioning custom pipelines from dev to test to production. Incorporate feedback from leadership and the user base.
Extract meaningful information from unstructured text such as entities, identifiable attributes, and relationships. Types of text include but are not limited to, SAR narratives and web scraped data.
Your areas of expertise:
• Python
• React
• R
• Tableau/Qlik
• Vector databases
• Gremlin/Cypher/graph ML/Neo4j or other experience with graph traversals
• Network analytics such as centrality, community detection, link prediction, pattern recognition, blockchain analytics
• SQL or other relational database query experience
• Graph structured data and analytics
• NLP
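Two of the network-analytics primitives named above, degree centrality and common-neighbour link prediction, can be sketched in a few lines of plain Python. The toy edge list is invented; real work at this scale would run in a graph engine such as Neo4j/Gremlin or a library like networkx.

```python
from collections import defaultdict
from itertools import combinations

# Toy undirected graph as an edge list (illustrative data only)
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

n = len(adj)
# Normalized degree centrality: degree / (n - 1)
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

# Common-neighbour score for each non-adjacent pair: a simple link predictor
scores = {
    (u, v): len(adj[u] & adj[v])
    for u, v in combinations(sorted(adj), 2)
    if v not in adj[u]
}
print(centrality["c"])  # the best-connected node
print(scores)           # candidate links ranked by shared neighbours
```

The same two ideas scale up directly: centrality ranks influential nodes, and common-neighbour counts give a cheap baseline before heavier graph-ML link prediction.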
Required Skills: Data Visualization: Qlik
Tableau
Databases
Python (Programming Language)
SQL (Structured Query Language)
Anomaly Detection
Data Science
Data Visualization
Neo4J
Knowledge Graphs
Data Visualization: Tableau
Natural Language Processing (NLP)
Day-to-day Responsibilities: Interface with the team and clients in group and one-on-one settings to determine program technical development needs and execute on them. Contribute to daily stand-up meetings.
Expected Deliverables: Working visualizations, prototype analytics, documentation
Education: Bachelor's Degree
Data Scientist
Data engineer job in Tampa, FL
About the Organization
Now is a great time to join Redhorse Corporation. We are a solution-driven company delivering data insights and technology solutions to customers with missions critical to U.S. national interests. We're looking for thoughtful, skilled professionals who thrive as trusted partners building technology-agnostic solutions and want to apply their talents supporting customers with difficult and important mission sets.
About the Role
Redhorse Corporation is seeking a highly skilled Data Scientist to join our team supporting the United States Central Command (USCENTCOM) Directorate of Logistics (CCJ4). You will play a critical role in accelerating the delivery of AI-enabled capabilities within the Joint Logistics Common Operating Picture (JLOGCOP), directly impacting USCENTCOM's ability to promote international cooperation, respond to crises, deter aggression, and build resilient logistics capabilities for our partners. This is a high-impact role contributing to national security and global stability. You will be working on a custom build of AI/ML capabilities into the JLOGCOP, leveraging dozens of data feeds to enhance decision-making and accelerate planning for USCENTCOM missions.
Key Responsibilities
Communicate with the client regularly regarding enterprise values and project direction.
Find the intersection between business value and achievable technical work.
Articulate and translate business questions into technical solutions using available DoD data.
Explore datasets to find meaningful entities and relationships.
Create data ingestion and cleaning pipelines.
Develop applications and effective visualizations to communicate insights.
Serve as an ambassador for executive DoD leadership to sponsor data literacy growth across the enterprise.
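The ingestion-and-cleaning responsibility above can be sketched as a small validate-then-load step: read raw rows, reject records with missing keys or bad types, and emit clean records plus a rejection count. The field names and sample rows are invented for illustration; a real pipeline would add schema validation, logging, and quarantine of rejected rows.

```python
import csv
import io

# Illustrative raw input (field names are hypothetical)
raw = io.StringIO(
    "unit,supply_level,date\n"
    "alpha,90,2024-01-02\n"
    ",55,2024-01-02\n"                  # missing key field: rejected
    "bravo,not_a_number,2024-01-03\n"   # bad numeric value: rejected
)

clean, rejected = [], 0
for row in csv.DictReader(raw):
    if not row["unit"]:          # reject rows missing the join key
        rejected += 1
        continue
    try:
        row["supply_level"] = int(row["supply_level"])  # normalize types
    except ValueError:
        rejected += 1
        continue
    clean.append(row)

print(len(clean), rejected)  # rows kept vs. rejected
```

Tracking the rejection count alongside the clean output is what makes the pipeline's data quality observable downstream.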
Required Experience/Clearance
US citizen with a Secret US government clearance. Applicants who are not US Citizens and who do not have a current and active Secret security clearance will not be considered for this role.
Ability to work independently to recommend solutions to the client and as part of a team to accomplish tasks.
Experience with functional programming (Python, R, Scala) and database languages (SQL).
Familiarity using AI/ML tools to support logistics use cases.
Ability to discern which statistical approaches are appropriate for different contexts.
Experience communicating key findings with visualizations.
8+ years of professional experience.
Master's degree in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Desired Experience
Experience with cloud-based development platforms.
Experience with large-scale data processing tools.
Experience with data visualization tools.
Ph.D. in a quantitative discipline (Statistics, Computer Science, Physics, Electrical Engineering, etc.).
Equal Opportunity Employer/Veterans/Disabled. Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this site as a result of your disability. You can request reasonable accommodations by contacting Talent Acquisition at ***********************************. Redhorse Corporation shall, in its discretion, modify or adjust the position to meet Redhorse's changing needs. This job description is not a contract and may be adjusted as deemed appropriate in Redhorse's sole discretion.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Data Scientist
Data engineer job in Tampa, FL
AM LLC is seeking a Data Scientist that will support Operations in the Information Environment (OIE) initiatives under the J39 directorate by leveraging advanced data analysis techniques to extract actionable insights. The candidate will be responsible for analyzing complex data sets, developing predictive models, and providing data-driven recommendations to enhance Information campaigns.
An active TS/SCI or TS w/SCI eligibility clearance is required.
This position is for a forthcoming contract and will be based at the Customer's location in Tampa, FL.
Requirements
Essential Duties and Responsibilities
Conducts analysis of structured and semi-structured data sets, which include opinion research, social media-based big data, and various secondary-source indices.
Reviews the methods and means to provide data visualization and quantitative analysis to the team.
Develops automated approaches leveraging artificial intelligence/machine learning (AI/ML) and natural language processing (NLP) to streamline the input of operational assessment data into the Command and Control of the Information Environment (C2IE) system.
Supports the analytics and assessment team in the construction and modification of research design. Acquires, processes, and integrates impactful data from various commercial and customer sources, including HUMINT, SIGINT, OSINT, and GEOINT.
Applies advanced analytical techniques, including machine learning, natural language processing, and deep learning, to identify trends and patterns.
Develops predictive models to support influence operations and behavioral targeting.
Collaborates with OIE planners, OIE analysts, and Data Engineers to ensure effective data utilization. Visualizes data insights through dashboards and reports for decision-makers. Evaluates the effectiveness of OIE campaigns using data-driven methods
Stays current on emerging data science techniques and tools relevant to OIE.
Provides recommendations to senior leadership on data-driven strategies for OIE
Defines the means to create data visualizations and quantitative analysis to support mission needs.
Reviews/analyzes statistical data received from team members, other agencies, and CCMD organic assets.
Reviews empirical/quantitative studies conducted by similar organizations and determines reliability, validity, and other strength factors for potential use in theater.
Provides data visualization and explains quantitative analysis from the research team and vendors.
Responsible for contributing to analytic and assessment projects evaluating the Information Environment (IE) for a variety of Combatant Commands (CCMDs).
Knowledge, Skills, and Abilities:
General:
Confident in supporting projects and people, and proactive in making independent decisions and taking appropriate action.
Strong written, analytical, presentation, and verbal communication skills, with the ability to communicate effectively with all levels of stakeholders, from assistants to senior executives.
Strong organizational and time management skills, attention to detail, and the ability to troubleshoot when faced with challenges.
Demonstrates growth from feedback and actively seeks ways to improve.
Able to liaise effectively with a small internal team and external partners.
Minimum Qualifications:
General Requirements:
Degree in Data Science, Computer Science, Mathematics, Statistics, or a related field
Experience in data science, with a focus on predictive analytics, machine learning, and statistical modeling
Experience with data visualization tools (Tableau, Power BI, Matplotlib)
Experience with natural language processing (NLP) and text analytics
Proficiency in Python, R, SQL, and machine learning frameworks (Scikit-learn, TensorFlow, PyTorch).
Familiarity with data security protocols for classified environments
Preferred Qualifications:
Experience supporting Joint Intelligence or U.S. Geographic Combatant Commands
Experience with advanced analytic tools (Open.IO, Echosec, Babel Street, Scraawl)
Experience with geospatial analytic tools (ArcGIS, QGIS)
Experience with data manipulation and analysis tools (SPSS, STATA, IBM Watson)
Proficiency with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes)
Demonstrated experience with Command and Control of the Information Environment (C2IE)
Demonstrated experience briefing senior military leadership (O-5 and above)
Expertise in OSINT, including social media monitoring and analysis.
Ability to conduct complex data queries and scripting for automation.
Security Clearance: Active TS/SCI or TS w/SCI Eligibility clearance is required.
Physical and Environmental Requirements:
Employees must have the ability to perform the following physical demands for extended periods of time with or without assistance:
This position requires the ability to remain in a stationary position (standing and/or seated) most of the time.
Viewing a computer screen/monitor.
Utilizing a keyboard.
Levels:
Junior: 0-3 years
Journeyman: 3-10 years
Senior: 10+ years
SME: 15+ years
Proposed Salary Range: $130,000 - $235,000 Tampa, FL
Please note that actual salaries may vary within the range or be above or below the range based on factors including, but not limited to, education, training, experience, business need, and location.
AM LLC is an Equal Opportunity Employer. Our policy is clear: there shall be no discrimination on the basis of age, disability, sex, race, religion or belief, gender reassignment, marriage/civil partnership, pregnancy/maternity, or sexual orientation.
We are an inclusive organization and actively promote equality of opportunity for all with the right mix of talent, skills and potential. We welcome all applications from a wide range of candidates. Selection for roles will be based on individual merit alone.
Salary Description $130,000-$235,000
Data Scientist
Data engineer job in Tampa, FL
Dark Wolf Solutions is seeking a highly motivated and experienced Data Scientist to join our team and contribute to the development and optimization of our Data/ML Platform. In this role, you will be instrumental in building, testing, and validating data pipelines that enable the creation and deployment of cutting-edge AI/ML. You will work collaboratively with a team of data engineers and AI/ML specialists to ensure the quality, accessibility, and performance of our data assets for a key government customer based in Tampa, FL. Tasks may include, but are not limited to:
Testing and validating the performance of ML models
Assisting with data acquisition, ingestion, and tagging.
Data exploration and understanding to identify valuable insights.
Feature extraction and analysis to engineer effective AI/ML models.
Data engineering and conditioning to ensure data quality and consistency.
Data labeling to create high-quality training datasets.
Constructing, training, and validating AI/ML datasets.
Required Qualifications:
5+ years of experience performing data science activities to include: data exploration, feature engineering, model building, training, and validation.
2+ years of experience working with Cloud environments (AWS, Azure, GCP).
1+ years of experience using scripting languages such as R or Python for data analysis and model development.
1+ years of experience using Big Data technologies and tools (e.g. Spark, Hadoop, Hive, Cassandra, Druid, Flink, Drill, Trino, NoSQL) for large-scale data processing.
Experience with a variety of open source and commercial AI/ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
Ability to manipulate raw data into effective visualizations and actionable insights. Ability to communicate end-to-end data outcomes visually to both technical and non-technical audiences.
Demonstrates a deep understanding of statistical modeling, machine learning algorithms, and data mining techniques.
Strong understanding of SQL and experience querying relational databases. Comfortable in a project startup environment, demonstrating adaptability and initiative.
US Citizenship and an active TS/SCI clearance
Desired Qualifications:
Master's degree or Ph.D. in Statistics, Mathematics, Computer Science, Engineering, or related field, or the equivalent combination of education, training and experience.
A Data Science Certification OR a cloud certification in AWS, Azure, or GCP focused on data science or AI/ML services.
Previous experience supporting the DoD. Bachelor's Degree in Computer Science, Mathematics, or equivalent technical degree; or in lieu of degree, 3 years of equivalent industry experience.
This position is located in Tampa, FL. The estimated salary range for this position is $130,000.00 - $200,000.00, commensurate on experience and technical skillset.
We are proud to be an EEO/AA employer Minorities/Women/Veterans/Disabled and other protected categories.
In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire.
Provider Contract Data Consultant Sr
Data engineer job in Tampa, FL
Hours: Monday - Friday Travel: This role requires associates to be in-office 1 - 2 days per week, fostering collaboration and connectivity, while providing flexibility to support productivity and work-life balance. This approach combines structured office engagement with the autonomy of virtual work, promoting a dynamic and adaptable workplace. Alternate locations may be considered if candidates reside within a commuting distance from an office.
Position Overview:
Provides the highest level of analytical support to the Cost of Care and/or Provider Contracting organizations. Focuses efforts on lowering claims costs, improving the quality of care, and increasing member and provider network satisfaction. Provides expert advice, analytic and consultative support to Medical Directors and management on cost of care issues. Supports large scale initiatives with high dollar cost savings opportunities. Partners with provider contractors to develop contracting strategy and supports all aspects of the contract negotiation process. Works with multiple provider types including the most complex, high profile providers. Supports a full range of contract arrangements and pricing mechanisms including the most complex contract terms. Works on the most complex, large scale enterprise wide initiatives and acts as project lead. Acts as a strategic partner to management.
How You Will Make an Impact:
* Uses analytic tools to track health risks and compliance and to support the contract negotiation process
* Types of analyses include: performing sophisticated retrospective data analytics; developing complex new models and modifying existing models to create predictive decision-making tools; performing healthcare cost analysis to identify strategies to control costs; projecting cost increases in medical services using analytic techniques for PMPM trending via multiple-variable analysis; preparing complex pre-negotiation analyses to support development of defensible pricing strategies; performing modeling to compare contract scenarios based on member utilization patterns and 'what if' analysis; measuring and evaluating the cost impact of various negotiations; researching the financial profitability/stability and competitive environment of providers to determine the impact of proposed rates; and projecting cost-savings targets based on various analytics
* Identifies cost of care savings opportunities by analyzing practice patterns in relation to office visits, referral practices, and specialty care procedures and recommends policy changes and claim's system changes to pursue cost savings
* Reviews results post-implementation to ensure projected cost savings are realized and recommends modifications as applicable
* Recommends standardized practices to optimize cost of care
* Educates provider contractors on contracting analytics from a financial impact perspective
* May recommend alternative contract language and may go on-site to provider premises during contract negotiations
* Researches provider's financial profitability/stability and competitive environment to determine impact of proposed rates
* Provides on-going analytic and consultative support during complex and the most intense provider negotiations
* Acts as a source of direction, training and guidance for less experienced staff
* Looks for continuous quality improvements and finds better ways to accomplish end results
* Works side by side with their program manager
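The PMPM (per-member-per-month) trending referenced above reduces to a simple calculation: total allowed cost per period divided by enrolled member-months, then a period-over-period ratio. The claims and enrollment figures below are invented purely to show the arithmetic.

```python
from collections import defaultdict

# Illustrative claims: (month, member_id, allowed_amount)
claims = [
    ("2024-01", "m1", 120.0), ("2024-01", "m2", 300.0),
    ("2024-02", "m1", 90.0),  ("2024-02", "m2", 150.0), ("2024-02", "m3", 60.0),
]
members_by_month = {"2024-01": 2, "2024-02": 3}  # enrolled member-months

# Sum allowed cost per month
cost = defaultdict(float)
for month, _, amount in claims:
    cost[month] += amount

# PMPM = total allowed cost / member-months for that period
pmpm = {m: cost[m] / members_by_month[m] for m in members_by_month}

# Month-over-month PMPM trend as a percentage change
trend = pmpm["2024-02"] / pmpm["2024-01"] - 1
print(pmpm, round(trend, 3))
```

In practice the same calculation is run per service category and provider, which is what feeds the pre-negotiation and 'what if' contract modeling described above.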
Required Qualifications:
* Requires a BA/BS degree in Mathematics, Statistics or a related field and a minimum of 7 years of experience in a broad-based analytical, managed care payor or provider environment, as well as in-depth experience in statistical analysis and modeling; or any combination of education and experience which would provide an equivalent background.
Preferred Qualifications:
* Experience providing leadership in evaluating and analyzing complex initiatives strongly preferred
* Government Business Experience strongly preferred
* HEDIS/STARS experience preferred
* Ability to Develop/code in SQL preferred
* SAS/Python Experience preferred
* Data Analysis experience is a must
* Value Based care experience nice to have
Job Level:
Non-Management Exempt
Workshift:
1st Shift (United States of America)
Job Family:
RDA > Health Economics & Cost of Care
Please be advised that Elevance Health only accepts resumes for compensation from agencies that have a signed agreement with Elevance Health. Any unsolicited resumes, including those submitted to hiring managers, are deemed to be the property of Elevance Health.
Who We Are
Elevance Health is a health company dedicated to improving lives and communities - and making healthcare simpler. We are a Fortune 25 company with a longstanding history in the healthcare industry, looking for leaders at all levels of the organization who are passionate about making an impact on our members and the communities we serve.
How We Work
At Elevance Health, we are creating a culture that is designed to advance our strategy but will also lead to personal and professional growth for our associates. Our values and behaviors are the root of our culture. They are how we achieve our strategy, power our business outcomes and drive our shared success - for our consumers, our associates, our communities and our business.
We offer a range of market-competitive total rewards that include merit increases, paid holidays, Paid Time Off, and incentive bonus programs (unless covered by a collective bargaining agreement), medical, dental, vision, short and long term disability benefits, 401(k) +match, stock purchase plan, life insurance, wellness programs and financial education resources, to name a few.
Elevance Health operates in a Hybrid Workforce Strategy. Unless specified as primarily virtual by the hiring manager, associates are required to work at an Elevance Health location at least once per week, and potentially several times per week. Specific requirements and expectations for time onsite will be discussed as part of the hiring process.
The health of our associates and communities is a top priority for Elevance Health. We require all new candidates in certain patient/member-facing roles to become vaccinated against COVID-19 and Influenza. If you are not vaccinated, your offer will be rescinded unless you provide an acceptable explanation. Elevance Health will also follow all relevant federal, state and local laws.
Elevance Health is an Equal Employment Opportunity employer, and all qualified applicants will receive consideration for employment without regard to age, citizenship status, color, creed, disability, ethnicity, genetic information, gender (including gender identity and gender expression), marital status, national origin, race, religion, sex, sexual orientation, veteran status or any other status or condition protected by applicable federal, state, or local laws. Applicants who require accommodation to participate in the job application process may contact ******************************************** for assistance. Qualified applicants with arrest or conviction records will be considered for employment in accordance with all federal, state, and local laws, including, but not limited to, the Los Angeles County Fair Chance Ordinance and the California Fair Chance Act.
Expert Exploitation Specialist/Data Scientist (TS/SCI)
Data engineer job in Tampa, FL
About the Role
Culmen International is hiring Expert Exploitation Specialist/Data Scientists to provide support on-site at the National Geospatial-Intelligence Agency (NGA) in Tampa, FL. NGA expects to deliver AOS Metadata Cataloging and Management Services to enhance product and asset management of content, enabling rapid creation of discoverable, modular, web-enabled, and visually enriched Geospatial Intelligence (GEOINT) products for intelligence producers in NGA and across the National System for Geospatial-Intelligence (NSG).
TALENT PIPELINE - Qualified applicants will be contacted as soon as funding for this position is secured.
What You'll Do in Your New Role
The Data Scientist will coordinate with our clients to understand questions and issues involving the client's datasets, then determine the best method and approach to create data-driven solutions within program guidelines. This position will be relied upon as a Subject Matter Expert (SME), and be expected to lead/assist in the development of automated processes, architect data science solutions, automated workflows, conduct analysis, use available tools to analyze data, remain adaptable to mission requirements, and identify patterns to help solve some of the complex problems that face the DoD and Intelligence Community (IC).
* Work with large structured / unstructured data in a modeling and analytical environment to define and create streamline processes in the evaluation of unique datasets and solve challenging intelligence issues
* Lead and participate in the design of solutions and refinement of pre-existing processes
* Work with Customer Stakeholders, Program Managers, and Product Owners to translate road map features into components/tasks, estimate timelines, identify resources, suggest solutions, and recognize possible risks
* Use exploratory data analysis techniques to identify meaningful relationships, patterns, or trends from complex data
* Combine applied mathematics, programming skills, analytical techniques, and data to provide impactful insights for decision makers
* Research and implement optimization models, strategies, and methods to inform data management activities and analysis
* Apply big data analytic tools to large, diverse sets of data to deliver impactful insights and assessments
* Conduct peer reviews to improve quality of workflows, procedures, and methodologies
* Help build high-performing teams; mentor team members providing development opportunities to increase their technical skills and knowledge
Required Qualifications
* TS/SCI Clearance w/CI Poly Eligible
* Minimum of 18 years combined experience (A combination of years of experience & professional certifications/trainings can be used in lieu of a degree)
* BS in related Field with Graduate level work
* Expert proficiency in Python and other programming languages applicable to automation development.
* Demonstrated experience designing and implementing workflow automation systems
* Advanced experience with ETL (Extract, Transform, Load) processes for geospatial data
* Expertise in integrating disparate systems through API development and implementation
* Experience developing and deploying enterprise-scale automation solutions
* Knowledge of NGA's Foundation GEOINT products, data types, and delivery methods
* Demonstrated experience with database design, implementation, and optimization
* Experience with digital media generation systems and automated content delivery platforms
* Ability to analyze existing workflows and develop technical solutions to streamline processes
* Knowledge of DLA systems and interfaces, particularly MEBS and WebFLIS
* Expertise in data quality assurance and validation methodologies
* Experience with geospatial data processing, transformation, and delivery automation
* Proficiency with ArcGIS tools, GEODEC and ACCORD software systems
* Understanding of cartographic principles and standards for CADRG/ECRG products
* Strong analytical skills for identifying workflow inefficiencies and implementing solutions
* Experience writing technical documentation including SOPs, CONOPS, and system design
Desired Qualifications
* Certification(s) in relevant automation technologies or programming languages
* Experience with DevOps practices and CI/CD implementation
* Knowledge of cloud-based automation solutions and their implementation in government environments
* Experience with machine learning applications for GEOINT Workflow optimization
* Expertise in data analytics and visualization for workflow performance metrics
* Understanding of NGA's enterprise architecture and integration points
* Experience implementing RPA (Robotic Process Automation) solutions
* Knowledge of secure coding practices and cybersecurity principles
* Demonstrated expertise in digital transformation initiatives
* Experience mentoring junior staff in automation techniques and best practices
* Background in agile development methodologies
* Understanding of human-centered design principles for workflow optimization
About the Company
Culmen International is committed to enhancing international safety and security, strengthening homeland defense, advancing humanitarian missions, and optimizing government operations. With experience in over 150 countries, Culmen supports our clients to accomplish critical missions in challenging environments.
* Exceptional Medical/Dental/Vision Insurance, premiums for employees are 100% paid by Culmen,
and dependent coverage is available at a nominal rate (including same or opposite sex domestic partners)
* 401k - Vested immediately and 4% match
* Life insurance and disability paid by the company
* Supplemental Insurance Available
* Opportunities for Training and Continuing Education
* 12 Paid Holidays
To learn more about Culmen International, please visit **************
At Culmen International, we are committed to creating and sustaining a workplace that upholds the principles of Equal Employment Opportunity (EEO). We believe in the importance of fair treatment and equal access to opportunities for all employees and applicants. Our commitment to these principles is unwavering across all our operations worldwide.
Data Engineer - Machine Learning (Marketing Analytics)
Data engineer job in Clearwater, FL
At PODS (Portable On Demand Storage), we're not just a leader in the moving and storage industry, we redefined it. Since 1998, we've empowered customers across the U.S. and Canada with flexible, portable solutions that put customers in control of their move. Whether it's a local transition or a cross-country journey, our personalized service makes any experience smoother, smarter, and more human.
We're driven by a culture of trust, authenticity, and continuous improvement. Our team is the heartbeat of our success, and together we strive to make each day better than the last. If you're looking for a place where your work matters, your ideas are valued, and your growth is supported- PODS is your next destination.
JOB SUMMARY
The Data Engineer - Machine Learning is responsible for scaling a modern data & AI stack to drive revenue growth, improve customer satisfaction, and optimize resource utilization. As an ML Data Engineer, you will bridge data engineering and ML engineering: build high-quality feature pipelines in Snowflake/Snowpark and Databricks, productionize and operate batch/real-time inference, and establish MLOps/LLMOps practices so models deliver measurable business impact at scale.
Note: This role is required onsite at PODS headquarters in Clearwater, FL. The onsite working schedule is Monday - Thursday onsite with Friday remote.
It is NOT a remote opportunity.
General Benefits & Other Compensation:
Medical, dental, and vision insurance
Employer-paid life insurance and disability coverage
401(k) retirement plan with employer match
Paid time off (vacation, sick leave, personal days)
Paid holidays
Parental leave / family leave
Bonus eligibility / incentive pay
Professional development / training reimbursement
Employee assistance program (EAP)
Commuter benefits / transit subsidies (if available)
Other fringe benefits (e.g. wellness credits)
What you will do:
● Design, build, and operate feature pipelines that transform curated datasets into reusable, governed feature tables in Snowflake
● Productionize ML models (batch and real‑time) with reliable inference jobs/APIs, SLAs, and observability
● Set up processes in Databricks and Snowflake/Snowpark to schedule, monitor, and auto-heal training/inference pipelines
● Collaborate with our Enterprise Data & Analytics (ED&A) team centered on replicating operational data into Snowflake, enriching it into governed, reusable models/feature tables, and enabling advanced analytics & ML, with Databricks as a core collaboration environment
● Partner with Data Science to optimize models that grow customer base and revenue, improve CX, and optimize resources
● Implement MLOps/LLMOps: experiment tracking, reproducible training, model/asset registry, safe rollout, and automated retraining triggers
● Enforce data governance & security policies and contribute metadata, lineage, and definitions to the ED&A catalog
● Optimize cost/performance across Snowflake/Snowpark and Databricks
● Follow robust and established version control and DevOps practices
● Create clear runbooks and documentation, and share best practices with analytics, data engineering, and product partners
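To make the feature-pipeline responsibility above concrete, here is a minimal illustrative sketch in plain Python of the kind of batch step that turns curated event rows into a reusable per-customer feature table. In practice this logic would run inside Snowflake/Snowpark or Databricks; the table and column names (`order_events`, `customer_features`) are hypothetical examples, not PODS systems.

```python
# Illustrative only: aggregate raw "curated" events into one governed
# feature row per customer (order count, total spend, recency, AOV).
from collections import defaultdict
from datetime import date

# Stand-in for a curated source table; all rows are invented.
order_events = [
    {"customer_id": "C1", "order_date": date(2024, 1, 5), "amount": 120.0},
    {"customer_id": "C1", "order_date": date(2024, 3, 9), "amount": 80.0},
    {"customer_id": "C2", "order_date": date(2024, 2, 1), "amount": 200.0},
]

def build_customer_features(events):
    """Aggregate event rows into one feature dict per customer."""
    agg = defaultdict(lambda: {"order_count": 0, "total_spend": 0.0, "last_order": None})
    for e in events:
        f = agg[e["customer_id"]]
        f["order_count"] += 1
        f["total_spend"] += e["amount"]
        if f["last_order"] is None or e["order_date"] > f["last_order"]:
            f["last_order"] = e["order_date"]
    # Derived feature: average order value.
    return {
        cid: {**f, "avg_order_value": f["total_spend"] / f["order_count"]}
        for cid, f in agg.items()
    }

customer_features = build_customer_features(order_events)
print(customer_features["C1"]["avg_order_value"])  # 100.0
```

A production version of this step would write `customer_features` back to a governed Snowflake table and be scheduled and monitored as described above.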
Also, you will:
DELIVER QUALITY RESULTS: Able to deliver top quality service to all customers (internal and external); Able to ensure all details are covered and adhere to company policies; Able to strive to do things right the first time; Able to meet agreed-upon commitments or advises customer when deadlines are jeopardized; Able to define high standards for quality and evaluate products, services, and own performance against those standards
TAKE INITIATIVE: Able to exhibit tendencies to be self-starting and not wait for signals; Able to be proactive and demonstrate readiness and ability to initiate action; Able to take action beyond what is required and volunteers to take on new assignments; Able to complete assignments independently without constant supervision
BE INNOVATIVE / CREATIVE: Able to examine the status quo and consistently look for better ways of doing things; Able to recommend changes based on analyzed needs; Able to develop proper solutions and identify opportunities
BE PROFESSIONAL: Able to project a positive, professional image with both internal and external business contacts; Able to create a positive first impression; Able to gain respect and trust of others through personal image and demeanor
ADVANCED COMPUTER USER: Able to use required software applications to produce correspondence, reports, presentations, electronic communication, and complex spreadsheets including formulas and macros and/or databases. Able to operate general office equipment including company telephone system
What you will need:
Bachelor's or Master's in CS, Data/ML, or related field (or equivalent experience) required
4+ years in data/ML engineering building production‑grade pipelines with Python and SQL
Strong hands‑on with Snowflake/Snowpark and Databricks; comfort with Tasks & Streams for orchestration
2+ years of experience productionizing models: batch jobs and/or real-time APIs, containerized services, CI/CD, and monitoring
Solid understanding of data modeling and governance/lineage practices expected by ED&A
It would be nice if you had:
Familiarity with LLMOps patterns for generative AI applications
Experience with NLP, call center data, and voice analytics
Exposure to feature stores, model registries, canary/shadow deploys, and A/B testing frameworks
Marketing analytics domain familiarity (lead scoring, propensity, LTV, routing/prioritization)
MANAGEMENT & SUPERVISORY RESPONSIBILITIES
• Direct supervisor job title(s) typically include: VP, Marketing Analytics
• Job may require supervising Analytics associates
No Unsolicited Resumes from Third-Party Recruiters
Please note that as per PODS policy, we do not accept unsolicited resumes from third-party recruiters unless such recruiters are engaged to provide candidates for a specified opening and in alignment with our Inclusive Diversity values. Any employment agency, person or entity that submits an unsolicited resume does so with the understanding that PODS will have the right to hire that applicant at its discretion without any fee owed to the submitting employment agency, person, or entity.
DISCLAIMER
The preceding job description has been designed to indicate the general nature of work performed; the level of knowledge and skills typically required; and usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required by employees in this job.
Equal Opportunity, Affirmative Action Employer
PODS Enterprises, LLC is an Equal Opportunity, Affirmative Action Employer. We will not discriminate unlawfully against qualified applicants or employees with respect to any term or condition of employment based on race, color, national origin, ancestry, sex, sexual orientation, age, religion, physical or mental disability, marital status, place of birth, military service status, or other basis protected by law.
Endpoint Engineer #975877
Data engineer job in Tampa, FL
Endpoint Security Engineer
Duration: Direct Hire (PERM)
We are seeking an experienced Endpoint Security Engineer to lead the implementation, configuration, and ongoing management of our Netskope Endpoint DLP solution. This role is part of the broader Infrastructure Operations & Security organization and will contribute to a diverse team responsible for maintaining secure, resilient, and compliant systems across the enterprise. The role involves hands-on engineering, operational support, incident response, and continuous improvement of endpoint security capabilities.
The Endpoint Security Engineer will report to the Infrastructure Security leadership team.
Key Responsibilities
Endpoint Security Engineering
Design, implement, and manage endpoint security solutions, specifically Netskope Endpoint DLP.
Integrate endpoint security tools into the broader network, infrastructure, and enterprise security ecosystem.
Ensure endpoint security systems follow industry best practices and organizational standards.
Security Monitoring & Incident Response
Monitor network and endpoint data to detect and respond to security incidents.
Conduct root cause analysis and support correlation of SIEM, firewall, IDS/IPS, and other log sources to identify threat indicators.
Participate in incident response activities and security investigations.
Security Assessment & Compliance
Perform security assessments, vulnerability scans, and risk evaluations to identify and remediate weaknesses.
Contribute to policy documentation, runbooks, procedures, and operational standards.
Follow change management processes and maintain documentation for all system updates.
Collaboration & Continuous Improvement
Work closely with systems engineers, network teams, and security stakeholders to enhance the organization's security posture.
Research, evaluate, and recommend new security technologies or improvements to existing tools.
Provide training and guidance to end-users and technical teams on endpoint security usage and best practices.
Track and report on system health, risks, and program progress using metrics.
Qualifications
5+ years of experience implementing and supporting enterprise-wide Endpoint Detection & Response (EDR) or Endpoint DLP solutions.
5+ years of hands-on experience with Netskope (strongly preferred).
3+ years as a systems engineer in a medium- or large-scale enterprise environment.
Strong background in cybersecurity operations, including:
Log correlation
Threat indicator detection
SIEM, firewall, IDS/IPS tools
Security scanning and architecture
Bachelor's degree in Computer Science, Information Systems, Cybersecurity, or equivalent experience.
Relevant industry certifications a plus (CISSP, SSCP, CISM, SANS, Security+, etc.).
Provider Contract Data Consultant Sr
Data engineer job in Tampa, FL
**Hours:** Monday - Friday
**Travel:** This role requires associates to be in-office 1 - 2 days per week, fostering collaboration and connectivity, while providing flexibility to support productivity and work-life balance. This approach combines structured office engagement with the autonomy of virtual work, promoting a dynamic and adaptable workplace. Alternate locations may be considered if candidates reside within a commuting distance from an office.
**Position Overview:**
Provides the highest level of analytical support to the Cost of Care and/or Provider Contracting organizations. Focuses efforts on lowering claims costs, improving the quality of care, and increasing member and provider network satisfaction. Provides expert advice, analytic and consultative support to Medical Directors and management on cost of care issues. Supports large scale initiatives with high dollar cost savings opportunities. Partners with provider contractors to develop contracting strategy and supports all aspects of the contract negotiation process. Works with multiple provider types including the most complex, high profile providers. Supports a full range of contract arrangements and pricing mechanisms including the most complex contract terms. Works on the most complex, large scale enterprise wide initiatives and acts as project lead. Acts as a strategic partner to management.
**How You Will Make an Impact:**
+ Uses analytic tools to track both health risks and compliance and to support the contract negotiation process
+ Types of analyses include: performing sophisticated retrospective data analytics; developing the most complex new models and modifying existing models to create predictive decision-making tools; performing healthcare cost analysis to identify strategies to control costs; projecting cost increases in medical services using analytic techniques for PMPM trending via multiple-variable analysis; preparing complex pre-negotiation analyses to support development of defensible pricing strategies; performing modeling to compare various contract scenarios based on member utilization patterns and "what if" assumptions; measuring and evaluating the cost impact of various negotiations; researching the financial profitability/stability and competitive environment of providers to determine the impact of proposed rates; and projecting different cost-savings targets based on various analytics
+ Identifies cost of care savings opportunities by analyzing practice patterns in relation to office visits, referral practices, and specialty care procedures and recommends policy changes and claims system changes to pursue cost savings
+ Reviews results post-implementation to ensure projected cost savings are realized and recommends modifications as applicable
+ Recommends standardized practices to optimize cost of care
+ Educates provider contractors on contracting analytics from a financial impact perspective
+ May recommend alternative contract language and may go on-site to provider premises during contract negotiations
+ Provides on-going analytic and consultative support during complex and the most intense provider negotiations
+ Acts as a source of direction, training and guidance for less experienced staff
+ Looks for continuous quality improvements and finds better ways to accomplish end results
+ Works side by side with their program manager
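The PMPM (per member per month) trending named in the analysis types above can be illustrated with a small sketch. All figures here are invented for illustration only, not actual plan data.

```python
# Illustrative PMPM trend calculation: PMPM = total paid claims / member
# months, and trend = year-over-year percentage change in PMPM.

def pmpm(total_paid_claims, member_months):
    """Per-member-per-month cost."""
    return total_paid_claims / member_months

def pmpm_trend(prior_pmpm, current_pmpm):
    """Year-over-year PMPM trend as a fractional change."""
    return (current_pmpm - prior_pmpm) / prior_pmpm

prior = pmpm(total_paid_claims=1_200_000, member_months=3_000)    # 400.0
current = pmpm(total_paid_claims=1_380_000, member_months=3_000)  # 460.0
print(f"PMPM trend: {pmpm_trend(prior, current):.1%}")  # PMPM trend: 15.0%
```

In practice the role layers multiple-variable analysis (utilization, unit cost, mix) on top of this basic ratio to project cost increases.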
**Required Qualifications:**
+ Requires a BA/BS degree in Mathematics, Statistics, or a related field and a minimum of 7 years of experience in a broad-based analytical, managed care payor or provider environment, as well as in-depth experience in statistical analysis and modeling; or any combination of education and experience which would provide an equivalent background.
**Preferred Qualifications:**
+ Experience providing leadership in evaluating and analyzing complex initiatives strongly preferred
+ Government Business Experience strongly preferred
+ HEDIS/STARS experience preferred
+ Ability to develop/code in SQL preferred
+ SAS/Python Experience preferred
+ Data Analysis experience is a must
+ Value Based care experience nice to have
Please be advised that Elevance Health only accepts resumes for compensation from agencies that have a signed agreement with Elevance Health. Any unsolicited resumes, including those submitted to hiring managers, are deemed to be the property of Elevance Health.
Who We Are
Elevance Health is a health company dedicated to improving lives and communities - and making healthcare simpler. We are a Fortune 25 company with a longstanding history in the healthcare industry, looking for leaders at all levels of the organization who are passionate about making an impact on our members and the communities we serve.
How We Work
At Elevance Health, we are creating a culture that is designed to advance our strategy but will also lead to personal and professional growth for our associates. Our values and behaviors are the root of our culture. They are how we achieve our strategy, power our business outcomes and drive our shared success - for our consumers, our associates, our communities and our business.
We offer a range of market-competitive total rewards that include merit increases, paid holidays, Paid Time Off, and incentive bonus programs (unless covered by a collective bargaining agreement), medical, dental, vision, short and long term disability benefits, 401(k) +match, stock purchase plan, life insurance, wellness programs and financial education resources, to name a few.
Elevance Health operates in a Hybrid Workforce Strategy. Unless specified as primarily virtual by the hiring manager, associates are required to work at an Elevance Health location at least once per week, and potentially several times per week. Specific requirements and expectations for time onsite will be discussed as part of the hiring process.
The health of our associates and communities is a top priority for Elevance Health. We require all new candidates in certain patient/member-facing roles to become vaccinated against COVID-19 and Influenza. If you are not vaccinated, your offer will be rescinded unless you provide an acceptable explanation. Elevance Health will also follow all relevant federal, state and local laws.
Elevance Health is an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to age, citizenship status, color, creed, disability, ethnicity, genetic information, gender (including gender identity and gender expression), marital status, national origin, race, religion, sex, sexual orientation, veteran status or any other status or condition protected by applicable federal, state, or local laws. Applicants who require accommodation to participate in the job application process may contact ******************************************** for assistance.
Qualified applicants with arrest or conviction records will be considered for employment in accordance with all federal, state, and local laws, including, but not limited to, the Los Angeles County Fair Chance Ordinance and the California Fair Chance Act.