Data Modeling
Data engineer job in Melbourne, FL
Must Have Technical/Functional Skills
• 5+ years of experience in data modeling, data architecture, or a similar role
• Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, or PostgreSQL
• Experience with data modeling tools such as Erwin, IBM Infosphere Data Architect, or similar
• Ability to communicate complex concepts clearly to diverse audiences
Roles & Responsibilities
• Design and develop conceptual, logical, and physical data models that support both operational and analytical needs
• Collaborate with business stakeholders to gather requirements and translate them into scalable data models
• Perform data profiling and analysis to understand data quality issues and identify opportunities for improvement
• Implement best practices for data modeling, including normalization, denormalization, and indexing strategies
• Lead data architecture discussions and present data modeling solutions to technical and non-technical audiences
• Mentor and guide junior data modelers and data architects within the team
• Continuously evaluate data modeling tools and techniques to enhance team efficiency and productivity
Base Salary Range: $100,000 - $150,000 per annum
TCS Employee Benefits Summary:
Discretionary Annual Incentive.
Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
Family Support: Maternal & Parental Leaves.
Insurance Options: Auto & Home Insurance, Identity Theft Protection.
Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
Time Off: Vacation, Sick Leave & Holidays.
Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
Data Engineer (Mid-Level)
Data engineer job in Orlando, FL
Job Title: Data Engineer (Mid-Level)
Employment Type: 6 months contract to hire
About the Role
We are seeking a highly skilled professional who can bridge the gap between data engineering and data analysis. This role is ideal for someone who thrives on building robust data models and optimizing existing data infrastructure to drive actionable insights. The split is roughly 70% engineering / 30% analysis.
Key Responsibilities
· Design and implement data models for service lines that currently lack structured models.
· Build and maintain scalable ETL pipelines to ensure efficient data flow and transformation.
· Optimize and enhance existing data models and processes for improved performance and reliability.
· Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
· Communicate clearly and effectively with technical and non-technical stakeholders.
Required Skills
· SQL (must-have): Advanced proficiency in writing complex queries and optimizing performance.
· Data Modeling: Strong experience in designing and implementing relational and dimensional models.
· ETL Development: Hands-on experience with data extraction, transformation, and loading processes.
· Alteryx or Azure Data Factory (ADF) for pipeline development.
· Excel proficiency
· Experience with BI tools and data visualization (Power BI, Tableau).
Bachelor's degree required
Thanks and Regards
Ashish Tripathi || US IT Recruiter, KPG99, Inc.
******************| ***************
3240 E State, St Ext | Hamilton, NJ 08
Data Architect
Data engineer job in Orlando, FL
Responsible for providing expertise on decisions and priorities regarding the enterprise's overall systems architecture.
Facilitates the establishment and implementation of standards and guidelines that guide the design of technology solutions, including architecting and implementing solutions requiring integration of multiple platforms, operating systems and applications across the enterprise.
Reviews, advises, and designs standard software and hardware builds, system options, risks, costs vs. benefits, and impact on the enterprise business process and goals.
Develops and documents the framework for integration and implementation for changes to technical standards.
Assists in the development of and manages an architecture governance process. Provides technical guidance to project team areas as appropriate.
Tracks industry trends and maintains knowledge of new technologies to better serve the enterprise's architecture needs.
Essential Position Functions:
Ability to design and implement scalable AI solutions that integrate seamlessly with existing business and IT infrastructure and drive the organization's vision
Leading cross-disciplinary teams to develop AI applications that meet strategic business objectives
Ensuring AI solutions comply with ethical standards and industry regulations
Provide technical direction and leadership for internal software development efforts
Conducts regular code reviews to ensure that the application code is of high quality and meets code standards.
Identify and implement technology solutions to support Research & Development initiatives including data interchange systems, security frameworks, and business intelligence systems
Plan and architect integration layers between internal applications that manage school and student information and external partner systems, including Learning Management Systems (LMS), Student Information System (SIS), Curriculum Services/Courses, and Physical School based systems
Design and develop new software products or major enhancements to existing software
Acts as highest-level technical expert, addressing problems of systems integration, compatibility, and multiple platforms
Performs feasibility analyses on potential future projects and presents findings to management
Work across internal development teams to deliver system performance and architecture improvements
Work with leadership to identify and evaluate new technologies
Exhibit excellent communication skills to earn trust, persuade, motivate, and mentor
Assist in the establishment and maintenance of vendor relationships
Work with leadership to document and present technology roadmaps and strategies
Work together with peers and functional groups to ensure consistent approach and maximum value for development initiatives and business platforms
Produce architectural documentation as required by the Software Development Life Cycle (SDLC) using industry standards; maintain and evolve software architecture documents based on evolving system requirements and industry trends and technologies
Meet professional obligations through efficient work habits such as meeting deadlines, honoring schedules, coordinating resources and meetings in an effective and timely manner, and demonstrating respect for others
Technical Skills:
Bachelor's Degree in Artificial Intelligence, Computer Science, or Information Systems; or equivalent combination of education and relevant work experience.
Industry certifications in software, systems, network, or project management disciplines, preferred
Eight (8) years of software and systems architecture/development experience
Five (5) years of Artificial Intelligence development experience
Development Patterns, Methodologies and Tools
Code Repositories like Git/GitHub/TFS, required
Story Management Applications like Azure DevOps, preferred
Development Frameworks, preferred
DevSecOps / DevOps
Agile Development
Service Oriented Architectures
High-traffic, high-volume distributed system design, required
Experience with Microservices or Service Bus Technologies, preferred
Message Brokers like RabbitMQ, Kafka, or Azure Service Bus, preferred
Data Storage and Persistence
SQL server/MySQL, required
NoSQL databases like MongoDB or Redis, required
Experience with Azure SQL, Azure Cosmos DB, or Azure Key Vault, preferred
Experience with Cloud Storage like Azure, preferred
Container-based Architectures like Docker or Kubernetes, preferred
AI and Machine Learning with tools like Azure Cognitive Search, Azure Machine Learning, Azure Cognitive Services, or Azure Bot Services, preferred
Experience with analytics like Azure Data Lake Storage and Analytics, Azure Databricks or Azure Open Datasets, preferred
Experience in Business Intelligence, Big Data, or analytics, preferred
Experience in Education industry or with education data, preferred
Ability to design large scale, enterprise-wide systems, optimized for performance utilizing appropriate industry technologies and best practices, required
Data Architect
Data engineer job in Orlando, FL
(Orlando, FL)
Business Challenge
The company is in the midst of an AI transformation, creating exciting opportunities for growth. At the same time, they are leading a Salesforce modernization and integrating the systems and data of their recent acquisition.
To support these initiatives, they are bringing in a Senior Data Architect/Engineer to establish enterprise standards for application and data architecture, partnering closely with the Solutions Architect and Tech Leads.
Role Overview
The Senior Data Architect/Engineer leads the design, development, and evolution of enterprise data architecture, while contributing directly to the delivery of robust, scalable solutions. This position blends strategy and hands-on engineering, requiring deep expertise in modern data platforms, pipeline development, and cloud-native architecture.
You will:
Define architectural standards and best practices.
Evaluate and implement new tools.
Guide enterprise data initiatives.
Partner with data product teams, engineers, and business stakeholders to build platforms supporting analytics, reporting, and AI/ML workloads.
Day-to-Day Responsibilities
Lead the design and documentation of scalable data frameworks: data lakes, warehouses, streaming architectures, and Azure-native data platforms.
Build and optimize secure, high-performing ETL/ELT pipelines, data APIs, and data models.
Develop solutions that support analytics, advanced reporting, and AI/ML use cases.
Recommend and standardize modern data tools, frameworks, and architectural practices.
Mentor and guide team members, collaborating across business, IT, and architecture groups.
Partner with governance teams to ensure data quality, lineage, security, and stewardship.
Desired Skills & Experience
10+ years of progressive experience in Data Engineering and Architecture.
Strong leadership experience, including mentoring small distributed teams (currently 4 people: 2 onshore, 2 offshore; team growing to 6).
Deep knowledge of Azure ecosystem (Data Lake, Synapse, SQL DB, Data Factory, Databricks).
Proven expertise with ETL pipelines (including 3rd-party/vendor integrations).
Strong SQL and data modeling skills; familiarity with star/snowflake schemas and other approaches.
Hands-on experience creating Data APIs.
Solid understanding of metadata management, governance, security, and data lineage.
Programming experience with SQL, Python, Spark.
Familiarity with containerized compute/orchestration frameworks (Docker, Kubernetes) is a plus.
Experience with Salesforce data models, MDM tools, and streaming platforms (Kafka, Event Hub) is preferred.
Excellent problem-solving, communication, and leadership skills.
Education:
Bachelor's degree in Computer Science, Information Systems, or related field (Master's preferred).
Azure certifications in Data Engineering or Solution Architecture strongly preferred.
Essential Duties & Time Allocation
Data Architecture Leadership - Define enterprise-wide strategies and frameworks (35%)
Engineering & Delivery - Build and optimize ETL/ELT pipelines, APIs, and models (30%)
Tooling & Standards - Evaluate new tools and support adoption of modern practices (15%)
Mentorship & Collaboration - Mentor engineers and align stakeholders (10%)
Governance & Quality - Embed stewardship, lineage, and security into architecture (10%)
Software Support Engineer
Data engineer job in Orlando, FL
Required:
Highly experienced at troubleshooting difficult technical issues
Strong object-oriented programming skills in Java/JavaScript
Experience working with dynamic HTML components: AJAX, JavaScript, AngularJS, CSS, XML, HTML, XHTML
Working knowledge of the components in a web applications stack
Experience with relational databases such as MySQL
Excellent written and verbal communication skills with the ability to clearly articulate solutions to complex technical problems
Strong personal commitment to quality and customer service
Ability to work with high-value customer administrators and developers
Understanding of basic networking and system administration.
We value a work-life balance while also collaborating through consistent presence in the office
Lead Software Engineer
Data engineer job in Orlando, FL
Candidates will be disqualified if the following criteria are not met:
*PLEASE READ JOB CRITERIA BEFORE APPLYING*
Employment Type: No C2C (Corp-to-Corp) or C2H (Contract-to-Hire) arrangements. W2 contract only. No referral fees will be entertained.
Work Authorization: Must be a U.S. Citizen or Green Card holder.
Location: Must be local to Orlando, FL, or within a one-hour drive of Orlando
Experience:
• Strong programming skills in Python, with additional experience in JavaScript, C, and C++.
• Hands-on experience with AI/ML development, backend tooling, and automation pipelines.
• Proficiency with SQL and database technologies (relational and NoSQL) for data modeling and backend integration.
• Familiarity with CI/CD pipelines, deployment automation, and scripting.
• Understanding of backend architectures, distributed systems, and performance optimization.
• Strong problem-solving and debugging skills.
• Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment. Or equivalent combination of education and experience.
Responsibilities:
• Design, develop, and maintain backend tools, automation scripts, services, and automation pipelines to support AI and data-driven applications.
• Implement monitoring, logging, and database management to ensure performance and reliability.
• Collaborate with teams, document best practices, and maintain security and compliance standards.
• Diagnose and resolve backend, database, and automation issues.
• Stay current with backend, AI, and automation technologies to drive scalability and efficiency.
• Understand and actively participate in Environmental, Health & Safety responsibilities by following established UO policy, procedures, training and team member involvement activities.
• Perform other duties as assigned.
Software Engineer - Living Room Devices (Smart TV)
Data engineer job in Orlando, FL
Salary Range: $95K-$110K base, Direct Hire
A team building and optimizing Smart TV applications is looking for a Software Engineer with strong JavaScript and HTML experience.
This role is perfect for someone who enjoys creating high-performance TV apps and collaborating cross-functionally to deliver great viewing experiences across a range of Smart TV platforms.
What You'll Be Doing
Develop and optimize Smart TV applications using JavaScript, HTML5, and CSS3
Build reliable, user-friendly interfaces that perform well across different TV platforms
Work closely with cross-functional teams to ensure compatibility and quality
Support video playback features, streaming workflows, and device-specific behaviors
Participate in agile development practices, code reviews, and CI/CD pipelines
Mentor junior developers and contribute to best practices
What We're Looking For
Hands-on experience building Smart TV or TV-based applications
Strong skills in JavaScript, HTML5, and CSS3
Familiarity with video playback technologies, DRM, or streaming formats
Experience working in collaborative, agile engineering teams
Comfort working with Git and modern development workflows
Nice to Have
Experience with Roku, Android TV, or similar platforms
Exposure to HLS, DASH, or other streaming protocols
Knowledge of accessibility standards
Experience with analytics or A/B testing tools
Related backgrounds: Application Developer, Front-End Developer, UI/UX Developer
If you've built apps for TVs (or want to!) and enjoy working on consumer-facing technology, we'd love to connect.
Equal Opportunity Employer/Veterans/Disabled
Military connected talent encouraged to apply
To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to ***********************************************
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
The California Fair Chance Act
Los Angeles City Fair Chance Ordinance
Los Angeles County Fair Chance Ordinance for Employers
San Francisco Fair Chance Ordinance
Massachusetts Candidates Only: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
Technical Application Support Engineer
Data engineer job in Orlando, FL
Harvey Nash USA has been engaged to find a talented Technical Support Engineer - Integrations for an Enterprise SaaS/Tech Client.
Integrations handles a mix of issue types that typically center around three broad topics:
• Email Configuration and Maintenance (Not just Outlook App)
• SSO/Authentication (the candidate should know how to configure the technology and how it works internally, NOT just how to assign or grant user permissions).
• Web Services
• Scripting /Rest API
• Database issues
Non-negotiable items that a candidate must have:
• Web Services (SOAP/REST), networking fundamentals, scripting, and familiarity with a programming language
• SSO and authentication, Email protocols and servers
Job Title: Technical Support Engineer - Integrations
Location: Orlando, FL (Hybrid - Flexible: Weekly on Wednesday and Thursday at the office)
Duration: 12 Months Contract
Overview:
• Provides technical support to administrators, technicians, and product support personnel who are diagnosing, troubleshooting, repairing and debugging complex computer systems, complex software, integrations, or networked and/or wireless systems.
• Responds to situations where first-line product support has failed to isolate or fix problems in malfunctioning applications and software. Troubleshoots and diagnoses design, reliability, and maintenance problems or bugs, escalating them to platform engineering/software engineering as needed.
What you will do in this role:
• Be a Customer Advocate providing support to users/administrators of our platform
• Understand our platform, cloud technologies and troubleshooting practices to ensure successful resolution of challenging technical situations
• Resolve technical cases created by customers looking for help to understand or troubleshoot unexpected behaviors or answer technical questions about the Client software and platform.
• Gain an understanding of the Client platform and all core functionality.
• Analyze data with a view to isolate the potential cause of the issue.
• Involve others to accomplish personal and group goals.
A reasonable, good faith estimate of the minimum and maximum hourly wage for this position is $35.71/hr. on W2. Benefits will be available, and details are available at the following links:
Benefits Details: ***********************************************
401K Plan: Our employees work hard, which is why Harvey Nash is proud to contribute to their hard-earned savings with a 401(k) retirement plan that includes a 25% company match on all deferrals. We also offer a Roth 401(k) for even more flexibility. Employees 21 years of age or older, and have completed 3 months of service, are eligible to participate.
Data Scientist
Data engineer job in Orlando, FL
We are passionate people with an expert understanding of the digital consumer, data sciences, global telecom business, and emerging financial services. And we believe that we can make the world a better place.
Job Description
We are looking for a candidate building a career in Data Science, with experience applying advanced statistics, data mining, and machine learning algorithms to make data-driven predictions using languages such as Python (including NumPy, Pandas, Scikit-learn, Matplotlib, Seaborn) and SQL (PostgreSQL). Experience with Elasticsearch, information/document retrieval, and natural language processing is a plus, as is familiarity with various machine learning methods (classification, clustering, natural language processing, ensemble methods, outlier analysis) and the parameters that affect their performance. You will leverage your strong collaboration skills and ability to extract valuable insights from highly complex data sets to ask the right questions and find the right answers.
Qualifications
· Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.)
· At least 2 years of experience in quantitative analytics or data modeling
· Some understanding of predictive modeling, machine-learning, clustering and classification techniques, and algorithms
· Fluency in Python and SQL; JavaScript/HTML/CSS/web development nice to have
· Familiarity with data science frameworks and visualization tools (Pandas, Visualizations (matplotlib, altair, etc), Jupyter Notebooks)
Additional Information
Responsibilities
· Analyze raw data: assessing quality, cleansing, structuring for downstream processing
· Design accurate and scalable prediction algorithms
· Collaborate with engineering team to bring analytical prototypes to production
· Generate actionable insights for business improvements
All your information will be kept confidential according to EEO guidelines.
Data Engineer
Data engineer job in Orlando, FL
Our client, a US education pioneer since 2000, leads in next-generation curriculum and formative assessment, creating interactive web solutions for teachers, students, and parents. They are technology-driven, with many software engineers using best practices and cutting-edge tools.
Our data analytics teams transform, model, and aggregate the data that empowers our customers to make sense of and tell stories with their data. You'll be working with data scientists, data analysts, data engineers, and software engineers to provide clean, accurate, reliable models and metrics for our products.
The estimated salary range for this position is from $120,000 to $170,000 USD per year.
* Build scalable and reliable data pipelines that transform raw learning data into analytics-ready models, supporting accurate reporting and decision-making.
* Create data models that provide clear visibility into learner progress and performance, enabling meaningful comparisons across cohorts while preserving detailed individual insights.
* Implement flexible dimensional models that handle real-world educational changes, such as year-over-year transitions and learner movement between schools or classes.
* Improve deployment and testing processes for data pipelines, reducing data issues and increasing confidence in analytics outputs.
* BS in Computer Science, Data Science, or equivalent experience.
* 5+ years experience in computer, data, and analytics engineering.
* Expertise in SQL and its use in code-based ETL frameworks, preferably dbt, focusing on reuse and efficiency.
* Expertise in ETL/ELT pipelines, analytical data modeling, aggregations, and metrics
* Expertise in dbt and git preferably with automation skills.
* Expertise in analytical modeling architectures, including the Kimball design
* Strong communication skills in writing and conversation, including writing engineering training documentation.
* Fluency in a development language such as Python
* Experience building dashboards, reports, and models in business intelligence tools such as Tableau or Looker
* Expertise with tools we use every day:
* Storage: Snowflake, AWS Storage Services (S3, RDS, Glacier, DynamoDB) and Postgres
* ETL/ELT: Airflow, dbt, Matillion, Fivetran
* BI: Cube.dev, Looker, Tableau
Data Engineer
Data engineer job in Orlando, FL
OUC - The Reliable One is presently seeking a Data Engineer to join the Transformation division. At OUC, we don't just work - we're building a bright future of innovation and transformation for future generations.
This is a hybrid position offering two remote workdays per week. Must be onsite Tuesdays and Thursdays.
We are looking for a highly skilled and innovative professional to structure, organize, and optimize scalable data pipelines, cloud architecture, and business intelligence. This role will translate complex business needs into secure, high-performance data solutions. If you thrive at the intersection of data engineering, cloud technology, and advanced analytics, this is an opportunity to make a measurable impact.
OUC is an industry leader and the second largest municipal utility in Florida committed to innovation, sustainability, and our community. OUC's mission is to provide exceptional value to our customers and community by delivering sustainable and reliable services and solutions.
Join a team of visionary Change Agents, Strategists, and Community Ambassadors who understand the vital role of diverse experiences in powering creativity and industry transformation.
At OUC, each position contributes to the success and achievement of our goals.
The ideal candidate will have:
Bachelor's Degree in Information Technology, Computer Science, Computer Engineering, Management Information Systems, Mathematics, Statistics or related field of study, from an accredited college or university.
Master's Degree in Data Engineering, Information Systems, or related field of study from an accredited college or university (preferred).
Minimum of three (3) years of experience in SQL or Python programming, ETL/ELT processes, data warehousing & cloud, data modeling and architecture, data security and governance, and visualization and reporting.
Experience with cloud infrastructure components as well as self-service analytics platforms.
OUC offers a very competitive compensation and benefits package. Our Total Rewards package includes, to name a few:
Competitive compensation
Low-cost medical, dental, and vision benefits and paid life insurance premiums with no probationary period.
OUC's Hybrid Retirement Program includes a fully-funded cash balance account, defined contribution with employer matching along with a health reimbursement account
Generous paid vacation, holidays, and sick time
Paid parental leave
Educational Assistance Program, to include tuition reimbursement, paid memberships in professional associations, paid conference and training opportunities
Wellness incentives and free access to all on-site OUC fitness facilities
Access to family-oriented recreational areas
Free downtown parking
Hybrid work schedule
Salary Range: $99,981.65 - $124,977.37 annually - commensurate with experience
Location: Gardenia - 3800 Gardenia Ave, Orlando, FL 32839
Please see below the complete job description for this position.
Job Purpose:
Performs ETL/ELT processes, designs data models, and implements multidimensional data architectures to optimize data processing for analytics, reporting, business intelligence, and operational decision-making.
Develops and enhances scalable data pipelines, integrations, and storage solutions across on-premises and cloud environments, ensuring high-performance, reliable data accessibility. Builds and maintains AI/ML-ready data infrastructure, enabling seamless integration of machine learning models and automated analytics workflows. Supports feature engineering, model training, and inference to enhance predictive and prescriptive analytics capabilities. Executes query auditing, code reviews, quality assurance (QA), and data governance to uphold best practices in security, performance, and compliance, ensuring integrity, reliability, and accessibility of data assets.
Translates business requirements into technical designs, optimized data flows, and infrastructure solutions, ensuring alignment with organizational objectives. Collaborates with data scientists, ML engineers, analysts, and IT teams to streamline data workflows, troubleshoot performance bottlenecks, and maintain scalable architectures for structured and unstructured data. Generates effort assessments for data engineering projects, defining Work Breakdown Structures (WBS) to scope and estimate tasks, timelines, and resource requirements. Supports project planning by forecasting development efforts, infrastructure needs, and optimization strategies, ensuring efficiency in execution and delivery.
Primary Functions:
Data Engineering & Integration
Gather and interpret business requirements, translating them into optimized data pipelines, models, and infrastructure solutions that support BI, analytics, and AI-driven decision-making.
Perform data modeling using foundational and modern principles, methodologies, and automation strategies to structure data for predictive analytics, reporting, and AI-driven decision-making.
Implement secure, high-performance integrations between on-premises and cloud-based storage solutions, optimizing data accessibility for BI, analytics, and operational insights.
Develop and maintain scalable ETL/ELT pipelines for efficient data extraction, transformation, and loading, ensuring integration with AI/ML workflows or Data Analytics reporting layer.
Ensure real-time and batch processing capabilities, supporting streaming, and event-driven architectures, while implementing robust data access management solutions to facilitate secure, compliant data accessibility for business intelligence (BI), analytics, and operational decision-making.
Data Warehousing & Cloud Infrastructure
Develop and implement hybrid cloud and on-prem data storage solutions with future-proof scalability and security.
Design and optimize multidimensional data models, ensuring efficient storage and retrieval for AI-assisted data science, analytics and reporting.
Implement star and snowflake schemas, enhancing analytical performance and query optimization.
Utilize data visualization, AI-powered analytics, and business intelligence tools to structure data for strategic decision-making and automation.
Problem-Solving, Query Auditing, Optimization & Quality Assurance (QA)
Review and audit more complex SQL queries and ETL/ELT pipelines, ensuring efficiency, compliance, and cloud and AI-readiness, adhering to best practices and established standards.
Evaluate data engineering scripts and transformation logic, ensuring integrity, scalability, and security of structured and unstructured datasets.
Conduct troubleshooting for advanced data inconsistencies, optimizing high-performance analytics pipelines across hybrid data environments, leveraging cloud tools or AI-enhanced tuning when needed.
Collaborate with data scientists, reporting analysts, programmers, business systems analysts, solution architects, and IT teams to deliver reliable data solutions.
Identify opportunities and recommend automation, AI-driven improvements, cloud efficiency options, and other process and performance improvements in the data engineering process.
General
Define and generate effort assessments for data engineering tasks, establishing work breakdown structures (WBS) to scope, estimate, and track project execution.
Forecast development efforts, infrastructure requirements, and reporting, cloud, and AI/ML data processing needs, ensuring efficient resource allocation.
Support Agile methodologies, aligning engineering workflows with business strategy and evolving project demands.
Apply data governance, AI-augmented security, privacy protocols, and compliance best practices.
Adhere to ethical guidelines and legal regulations governing the collection and use of data, including but not limited to closed-source intelligence, open-source intelligence, and AI-generated insights.
Maintain the security and confidentiality of sensitive information and data.
Perform other duties as assigned.
Technical Requirements:
Programming Languages & Query Optimization:
Expertise in SQL, Python, and Scala, ensuring optimized query performance for structured and unstructured data processing.
Proficiency in two or more of the following languages for diverse technical implementations: JavaScript (Snowpark), DAX, Power Query (M), WDL, Power Fx, C#, Perl, VBA, R, Julia.
Strong understanding of big data frameworks and distributed computing for scalable analytics.
ETL/ELT Tools:
Proficiency in ETL/ELT tools, including Talend, Dataiku, Alteryx, Snowpipe, Azure Data Factory, and SSIS.
Experience in workflow automation using Power Automate and cloud-based integration platforms.
Expertise in real-time and batch processing, supporting streaming data and event-driven architectures.
Data Warehousing & Cloud:
Strong knowledge of cloud-native data solutions, including Snowflake, Databricks, AWS S3, Azure Synapse, and relational databases like Oracle, MySQL.
Proficiency in VM-based deployments using Azure VMs, AWS EC2, Google Compute Engine for scalable data processing.
Familiarity with IBM Framework for structured data modeling, governance, and analytics.
Experience in data lakes, warehouse optimization, and hybrid cloud architectures for scalable analytics.
Data Modeling & Architecture:
Expertise in data modeling methodologies: Inmon, Kimball, Data Vault, ensuring robust analytics solutions.
Proficiency in IBM Framework, Dataiku, Alteryx, Cognos Analytics, Power BI, and DBeaver to structure enterprise-wide data solutions.
Ability to design AI/ML-ready architectures, supporting predictive and prescriptive analytics.
Data Security & Governance:
Strong knowledge of role-based access control (RBAC), encryption strategies, data lineage, and compliance frameworks.
Familiarity with GDPR, CCPA, HIPAA, NIST, BCSI, and security protocols for data governance and regulatory compliance.
Ability to implement industry best practices for secure data management in cloud and VM environments.
Visualization & Reporting:
Proficiency in Power BI, Tableau, and Cognos Analytics, enabling data-driven decision-making through interactive dashboards.
Strong critical thinking and problem-solving capabilities for optimizing insights.
Ability to translate business needs into technical solutions, ensuring structured data workflows.
Business Alignment & Project Estimation:
Ability to effectively communicate insights to non-technical stakeholders and to gather and translate business requirements into scalable data architectures.
Expertise in Work Breakdown Structures (WBS), effort estimation, and resource planning for efficient project execution.
Strong collaboration skills with data scientists, ML engineers, analysts, and IT teams.
Strong critical thinking and problem-solving capabilities.
Business acumen with a curiosity for data-driven decision-making.
Ability to rapidly learn and adapt to evolving analytics and reporting tools.
Ability to interpret and analyze data.
Ability to make arithmetic computations using whole numbers, fractions, and decimals, and compute rates, ratios, and percentages.
Ability to use Microsoft Office Suite (Outlook, Excel, Word, etc.) and standard office equipment (computer, telephone, fax, copier, etc.).
Education/ Certification/ Years of Experience Requirements:
Bachelor's Degree in Information Technology, Computer Science, Computer Engineering, Management Information Systems, Mathematics, Statistics, or a related field of study from an accredited college or university. In lieu of a degree, an equivalent combination of education, certifications, and experience may be substitutable on a 1:1 basis.
Master's Degree in Data Engineering, Information Systems, Computer Science, Mathematics or Statistics, or a related field of study from an accredited college or university (preferred).
Minimum of 3 years of experience in SQL or Python programming, ETL/ELT processes, data warehousing & cloud, data modeling and architecture, data security and governance, and visualization and reporting.
Certifications (Preferred):
SnowPro Advanced: Data Engineer
Databricks Certified Data Engineer
IBM Certified Data Engineer - Cognos Analytics; IBM Data Engineering Professional Certificate (Coursera)
Cloudera Certified Professional (CCP) Data Engineer
Google Cloud Professional Database Engineer
Microsoft Azure Data Engineer Associate
Snowflake SnowPro Certification
SnowPro Associate: Platform Certification; SnowPro Core Certification; SnowPro Specialty: Snowpark; SnowPro Specialty: Native Apps; SnowPro Advanced: Architect
IBM Certified Solution Architect - Cloud Pak for Data
AWS Certified Data Analytics; AWS Certified Solutions Architect - Professional
Microsoft Certified: Azure Solutions Architect Expert
TOGAF 9 Certification
Google Data Engineering Professional Certificate
Oracle Certified Professional or MySQL Database Administrator
Coursera/Udemy Data Engineering Bootcamps
Zachman Certified Enterprise Architect
Working Conditions:
This job is free of disagreeable working conditions and is performed in an office work environment.
Physical Requirements:
This job consists of speaking, hearing, reading, typing, and writing. It requires frequent sitting, occasional standing and walking, and may require lifting up to twenty (20) lbs., bending/stooping, and reaching overhead.
OUC-The Reliable One is an Equal Opportunity Employer who is committed through responsible management policies to recruit, hire, promote, train, transfer, compensate, and administer all other personnel actions without regard to race, color, ethnicity, national origin, age, religion, disability, marital status, sex, sexual orientation, gender identity or expression, genetic information, and any other factor prohibited under applicable federal, state, and local civil rights laws, rules, and regulations.
EOE M/F/Vets/Disabled
Data Scientist
Data engineer job in Orlando, FL
We are passionate people with an expert understanding of the digital consumer, data sciences, the global telecom business, and emerging financial services. And we believe that we can make the world a better place.
Job Description
We are looking for a candidate building a career in data science, with experience applying advanced statistics, data mining, and machine learning algorithms to make data-driven predictions using programming languages like Python (including NumPy, Pandas, Scikit-learn, Matplotlib, and Seaborn) and SQL (PostgreSQL). Experience with Elasticsearch, information/document retrieval, and natural language processing is a plus. Experience with various machine learning methods (classification, clustering, natural language processing, ensemble methods, outlier analysis) and the parameters that affect their performance also helps. You will leverage your strong collaboration skills and ability to extract valuable insights from highly complex data sets to ask the right questions and find the right answers.
Qualifications
· Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.)
· At least 2 years of experience in quantitative analytics or data modeling
· Some understanding of predictive modeling, machine-learning, clustering and classification techniques, and algorithms
· Fluency in Python and SQL; JavaScript/HTML/CSS/web development is nice to have
· Familiarity with data science frameworks and visualization tools (Pandas, visualization libraries such as Matplotlib and Altair, Jupyter Notebooks)
Additional Information
Responsibilities
· Analyze raw data: assessing quality, cleansing, structuring for downstream processing
· Design accurate and scalable prediction algorithms
· Collaborate with engineering team to bring analytical prototypes to production
· Generate actionable insights for business improvements
All your information will be kept confidential according to EEO guidelines.
Data Engineer Consultant
Data engineer job in Orlando, FL
Job Description
About the job: Client's Education Analytics practice is looking for an experienced data engineer with the ability to own the data lifecycle from sourcing through consumption. Our team members are dynamic and able to wear a lot of hats, and this person will need to leverage their breadth of experience within data analysis and engineering. This role will split time between our core product development and consulting projects, where we develop custom data solutions for both K-12 and higher-ed institutions.
You Will
-Source, manage, and prepare complex data for consumption in Power BI visualizations for K-12 & higher-ed data products, including early warnings, live progress monitoring, and graduation & school-grade predictors
-Be comfortable across the data lifecycle and be able to jump into tasks related to front end prototyping and dashboard development as needed
-Contribute to data architecture and strategy for our core education analytics product development.
-Build for the scalability of a turnkey solution.
-Engineer custom data solutions for specific consulting clients, including contributing to data architecture and management.
-Work with cross-functional client stakeholders to gather, analyze, and understand the user requirements
-Act as a liaison to the client stakeholders for the models, analysis or reports created; understand the impact of specific drivers and provide insights.
-Continually refine models and/or reports and analyze results.
-Utilize a core management consulting skill set including business analysis, project management, and process analysis.
-Work with managing director(s) to successfully deliver to clients
-Oversee and direct the work of 2-3 junior team members assisting with troubleshooting of data problems and client requests
You Have
-2-3 years of data engineering experience, preferably with consulting experience OR in an education institution
-Exposure to a variety of areas of data & analytics, including data management, statistical programming, and visualization
-Experience with large enterprise level data environments and working with 1B+ record data sets
-Skills with the Azure tech stack, including: Data Lake, Data Factory, Databricks or Analytics, and Azure Machine Learning
-Reporting & dashboard development experience with Power BI, including SSRS report migration to paginated reports
-Understanding of management consulting functional skills including: business analysis, process improvement, solutions architecture, and project management
Extras we are looking for
-Experience in Education Analytics, either in a research or professional role
-Excellent written and verbal communication skills
-Knowledgeable in back-testing, simulation, and statistical techniques (auto-regression, auto-correlation, and Principal Component Analysis)
-Bachelor's degree in a quantitative major; Graduate degree in data & analytics field preferred
Azure Sr. Data Engineer
Data engineer job in Orlando, FL
Employment Type: FTE
Level: Experienced/Sr. Level
GENERAL DESCRIPTION & RESPONSIBILITIES
As a Sr. Data Engineering Consultant, you will architect, build, and optimize scalable data pipelines, data lakes, warehouses, and analytics platforms in the Microsoft ecosystem. You will collaborate directly with clients to drive successful data modernization and advanced analytics projects while mentoring teams and sharing best practices.
Key Responsibilities:
Architect, develop, and deploy robust, scalable data solutions using Azure Synapse, Databricks, Fabric, Data Factory, and related tools.
Design and optimize ETL/ELT data pipelines with Python, PySpark, and SQL for structured and unstructured data.
Build modern data architectures: data lakes, warehouses, lakehouses, and streaming platforms.
Implement and automate data ingestion, transformation, modeling, and validation within Azure and Databricks.
Lead end-to-end Power BI or Fabric analytics solutions, including semantic modeling, DAX, and data visualization.
Advise on data modernization, governance (Purview, Entra ID), and security best practices aligned with Microsoft frameworks.
Partner with clients and project teams, translate business requirements into actionable architectures, and communicate technical plans to stakeholders.
Mentor peers and guide teams on best practices, technical standards, and continuous learning in the Microsoft/Databricks ecosystem.
Author high-quality project, design, and technical documentation as well as support pre-sales and business development activities as needed.
PLACEMENT CRITERIA & REQUIREMENTS
Bachelor's in Computer Science, Engineering, or related discipline (Master's preferred).
5+ years of hands-on experience designing, building, and deploying data solutions in Microsoft Azure and Databricks.
Strong expertise in Python, SQL, PySpark, and modern ETL data engineering practices.
Advanced knowledge of Databricks, Azure Data Factory, Azure Synapse, Fabric, Azure SQL/Dataverse, and Power BI.
Experience with data modeling, pipeline optimization, performance tuning, and cloud-based storage (OneLake, Data Lake, CosmosDB).
Deep understanding of data governance, security, and compliance frameworks within Microsoft cloud.
Excellent communication, project leadership, and stakeholder management skills.
NON-TECHNICAL SKILLS
Communication
Clear and effective verbal and written communication for both technical and non-technical audiences
Confident presentation skills, including the ability to explain complex concepts to senior clients.
Teamwork & Leadership
Ability to mentor, coach, and lead cross-functional teams in collaborative environments
Ability to foster a problem-solving culture and contribute meaningfully to team objectives.
Client Engagement
Strong sense of responsibility, accountability, and trust-building with client stakeholders.
Ability to manage client expectations regarding scope, timeline, and cost.
Adaptability & Learning
Growth mindset and willingness to continuously improve skills and adopt new technologies.
Comfort in fast-paced consulting environments and ability to quickly learn new tools and concepts.
Organization & Initiative
Results-oriented approach with a reputation for getting things done.
Strong organizational skills, ability to independently drive projects, and proactive identification and mitigation of risks.
CYCLOTRON EXPECTATIONS
These are the traits that we hold in the same regard for all employees, also viewed as performance indicators. Each employee should uphold these attributes.
Communication and Interpersonal Skills - verbal and written, professional, concise, and effective.
Collaboration and Teamwork - share knowledge, build alliances, show respect.
Personal Ownership and Responsibility - take initiative, solve problems, be fully present.
Customer Focus - proactive management of client expectations and needs.
Time Management and Productivity - set goals, commit, record, and communicate.
Cyclotron is an equal opportunity employer; we encourage diversity and promote an inclusive environment where everyone feels respected and valued. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, gender expression, age, marital status, national origin, veteran, or disability status. We would love to learn about what you can add to our diverse team.
Cyclotron is an Equal Opportunity Employer. Cyclotron values diversity, equity and inclusion, and aims to practice DE&I in all that we do.
Data Engineer Senior
Data engineer job in Orlando, FL
Data Engineer
Experience: 10+ years
W2/C2C
Must be a US Citizen
Orlando, FL
Full onsite
Local to FL
Snowflake development and data modeling:
Design, implement, and maintain Snowflake schemas, tables, views, materialized views, and data sharing
Develop robust ELT/ETL transformations using Snowflake SQL, UDFs, and semi-structured data handling
Implement and optimize Snowflake features such as warehouses, clustering, time travel, zero-copy cloning, streams, tasks, and data sharing
Alteryx integration and workflow optimization:
Build and maintain Alteryx workflows that connect to Snowflake as a data source/target
Optimize Alteryx Designer/Server workflows for performance, scalability, and reliability
Collaborate with data engineers to design efficient data pipelines and transformations
Data quality, governance, and security:
Implement data quality checks, data lineage, and metadata management
Apply RBAC, masking policies, encryption, and network policies as appropriate for banking data
Maintain documentation and support regulatory data requirements
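The masking policies mentioned above can be illustrated with a simple sketch. In Snowflake this would typically be a dynamic data masking policy in SQL; the Python version below shows the same role-based logic. The function name, roles, and masking rule are hypothetical examples, not a production control.

```python
# Illustrative role-based column masking for a sensitive banking field.
# Roles, names, and the masking rule are hypothetical; a real deployment
# would use the platform's own masking policies (e.g. Snowflake dynamic data masking).

def mask_account(value: str, role: str) -> str:
    """Return the full value only for privileged roles; otherwise show the last 4 digits."""
    if role in {"auditor", "fraud_ops"}:   # hypothetical privileged roles
        return value
    return "*" * (len(value) - 4) + value[-4:]

print(mask_account("123456789012", "analyst"))  # ********9012
print(mask_account("123456789012", "auditor"))  # 123456789012
```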
Collaboration and delivery:
Translate business requirements into technical designs, data mappings, and specs
Perform code reviews, contribute to design best practices, and ensure changes meet regulatory and security standards
Partner with analytics, BI, and governance teams to fulfill reporting and analytics needs
Testing, deployment, and automation:
Write unit/integration tests for SQL and data pipelines; participate in CI/CD for data assets
Automate routine maintenance and deployment tasks with Python, SQL, or shell scripting
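A unit test for a pipeline transformation, as described above, can be as small as running the logic on a fixture and asserting invariants. The transformation (dedupe-by-key) and its rules here are hypothetical examples of the pattern, not a specific pipeline from this role.

```python
# Sketch of a data-pipeline unit test: run a transformation on a small
# fixture and assert invariants (dedupe correctness, null-key rejection).
# The transformation and its rules are hypothetical examples.

def dedupe_by_key(rows):
    """Keep the last record seen for each key; reject null keys."""
    out = {}
    for r in rows:
        if r["key"] is None:
            raise ValueError("null key")
        out[r["key"]] = r
    return list(out.values())

def test_dedupe_keeps_last():
    fixture = [{"key": "a", "v": 1}, {"key": "a", "v": 2}, {"key": "b", "v": 3}]
    result = dedupe_by_key(fixture)
    assert len(result) == 2
    assert {r["key"]: r["v"] for r in result} == {"a": 2, "b": 3}

test_dedupe_keeps_last()
```

In CI/CD, tests like this run on every change to the pipeline code before deployment, the same way application unit tests gate an application release.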
Incident response and support:
Troubleshoot production issues, perform root-cause analysis, and implement durable fixes
Create runbooks and knowledge base articles for common scenarios
Documentation:
Maintain data dictionaries, lineage, and design documentation for data assets
Required Qualifications
Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience)
10+ years of Snowflake development experience (designing schemas, pipelines, and transformations)
Strong SQL skills with the ability to write complex queries and optimize performance
Practical experience building and maintaining Alteryx workflows that interact with Snowflake
Thorough understanding of Snowflake concepts: warehouses, clustering, time travel, zero-copy cloning, data sharing, streams/tasks, and security features
Experience with data governance, data quality, and metadata management
Familiarity with cloud environments (Snowflake on AWS/Azure/GCP) and cloud storage integrations
Scripting experience (Python, Bash, or similar) for task automation
Excellent problem-solving, communication, and collaboration skills
Ability to work effectively in a remote or distributed team and align with US banking/regulatory workflows
Preferred Qualifications
SnowPro Core Certification or equivalent
Alteryx Designer Core/Advanced Certification; experience with Alteryx Server administration
Banking/financial services domain experience and knowledge of regulatory requirements
Experience with data lineage tools (e.g., Collibra, Alation) and data governance frameworks
Familiarity with orchestration tools (e.g., Apache Airflow) and ETL/ELT best practices
Knowledge of BI/analytics tools (Tableau, Power BI) and data visualization needs
Data Engineer
Data engineer job in Orlando, FL
Job Description
At Frontline Insurance, we are on a mission to Make Things Better, and our Data Engineer plays a pivotal role in achieving this vision. We strive to provide high quality service and proactive solutions to all our customers to ensure that we are making things better for each one.
What makes us different? At Frontline Insurance, our core values of Integrity, Patriotism, Family, and Creativity are at the heart of everything we do. We're committed to making a difference and achieving remarkable things together. If you're looking for a role as a Data Engineer where you can make a meaningful impact and grow your career, your next adventure starts here!
Our Data Engineer enjoys robust benefits:
Hybrid work schedule!
Health & Wellness: Company-sponsored Medical, Dental, Vision, Life, and Disability Insurance (Short-Term and Long-Term).
Financial Security: 401k Retirement Plan with a generous 9% match.
Work-Life Balance: Four weeks of PTO and Pet Insurance for your furry family members.
What you can expect as a Data Engineer:
Identify, analyze, organize, and store raw data from various mediums (RDBMS, flat files, APIs, etc.)
Develop, test, and deploy scalable data pipelines for critical business needs
Evaluate, maintain, and enhance current data pipelines and architecture
Maintain code repositories, adhering to proper branching flows
Build and maintain CI/CD framework and architecture
Deploy and maintain testing frameworks/suites for data pipelines and CI
Develop, deploy, and maintain APIs and endpoints for analyst and system use
Deploy and maintain Kubernetes cluster(s)
Perform peer code reviews
Collaborate with business, development, and analytics teams to gather requirements
Participate in data governance and stewardship program to enhance control and dissemination of data with best practices
Develop and maintain technical documentation for pipeline architecture and tooling
What we are looking for as a Data Engineer:
Strong object-oriented/functional programming skills
Python/Scala programming experience
Proficient in ANSI SQL
Familiarity with orchestration tools (Dagster preferred), Docker/Podman, CI/CD tools, and version control (git)
Familiarity and/or experience with Lightweight Directory Access Protocol (LDAP)
Experience and/or knowledge of Agile Development methodologies and SDLC
Excellent problem-solving and analytical skills
Proven ability to extract data from a variety of sources (Relational/Non-Relational Databases, APIs, FTP/SFTP, etc.)
Comfortable using cloud technology platforms (AWS preferred)
Proven abilities to take initiative and be innovative
College degree and/or 3-5 years of related experience.
Why work for Frontline Insurance?
At Frontline Insurance, we're more than just a workplace - we're a community of innovators, problem solvers, and dedicated professionals committed to our core values: Integrity, Patriotism, Family, and Creativity.
We provide a collaborative, inclusive, and growth-oriented work environment where every team member can thrive.
Frontline Insurance is an equal-opportunity employer that is committed to diversity and inclusion in the workplace. We prohibit discrimination and harassment of any kind based on race, color, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other protected characteristic as outlined by federal, state, or local laws.
Data Engineer
Data engineer job in Orlando, FL
General Description:
The Data Engineer will play a critical role within the Information Technology organization, partnering closely with data and analytics leadership to build and optimize scalable, reliable data pipelines and platforms that support enterprise analytics and decision-making. This role focuses on enabling data availability, quality, and accessibility across business domains, ensuring alignment with strategic data initiatives. The engineer will work with cross-functional teams to gather and translate business data requirements into technical solutions, support data integration efforts, and uphold data governance standards. Responsibilities may include contributing to the development of data products, enhancing data infrastructure, and supporting data platform modernization efforts to ensure trusted, timely data is accessible for business use.
Essential Duties and Responsibilities:
Guidewire Data Warehouse Management: Continue to support and enhance the Guidewire Enterprise Data Warehouse, ensuring data availability, accuracy, and efficiency in ETL operations.
Data Platform Implementation: Architect, design, and implement a scalable, on-prem or cloud-based enterprise data platform, integrating diverse data sources beyond Guidewire Insurance Suite
Data Integration & Engineering: Develop and oversee ETL/ELT pipelines to ingest, transform, and store data efficiently, leveraging modern tools.
Data Modeling & Architecture: Design and implement optimized data models for structured and unstructured data, supporting reporting, analytics, and AI/ML initiatives.
Data Governance & Security: Establish best practices for data governance, data quality, metadata management, and security compliance across all data assets.
Advanced Analytics Support: Enable self-service analytics, real-time data processing, and AI/ML-driven insights by integrating modern data technologies such as data lakes, streaming data, Graph and NoSQL databases.
Collaboration & Leadership: Act as a strategic partner to IT, business units, and analytics teams, aligning data initiatives with organizational goals. Mentor junior team members and foster a culture of data-driven decision-making.
Monitor the task queue, take, and update tickets as directed by your supervisor.
Successfully engage in multiple initiatives simultaneously.
Contributes to the development of project plans and may assign and monitor tasks.
Assist in the development and generation of new reports to be provided to senior management across functional departments.
Performs other duties as required.
Supplemental Information:
This job description has been prepared to indicate the general nature and level of the work that the employees perform within their classification. This description is not and cannot be interpreted as an inventory of all the duties, tasks, responsibilities, and qualifications required for the employees assigned to this job.
Education and / or Experience:
Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
5+ years of experience in data architecture, engineering, or related roles, preferably within the insurance industry.
Strong expertise in the Guidewire InsuranceSuite database schemas (PolicyCenter, BillingCenter, ClaimCenter) is a plus.
Ability to analyze and learn new, complex data source models and integrate them into the data platform's pipelines.
Experience implementing cloud-based data platforms in Azure and familiarity with data lakehouse architectures.
Proficiency in modern ETL/ELT tools (e.g., MS SSIS and Azure Data Factory) and database technologies (SQL, Databricks, etc.).
Hands-on experience with big data processing, streaming technologies (Kafka, Spark, Flink), and API-driven data integration.
Strong understanding of data security, compliance, and governance best practices (GDPR, CCPA, SOC2, etc.).
Familiarity with BI/reporting tools such as Power BI, Tableau, Looker.
Strong knowledge and experience implementing Data Mesh architecture is a plus.
Knowledge of machine learning frameworks and MLOps is a plus.
Familiarity with ticketing systems like Atlassian Jira used to assign and track work amongst multiple team members.
Must be resourceful, industrious, and willing to take on new tasks and proactively learn new technologies to keep up with business needs.
Must be able to work under tight deadlines efficiently and with high quality.
Must possess strong organizational skills with demonstrated attention to detail.
Must be flexible and able to adapt in a changing business environment.
Must possess a positive attitude and strong work ethic.
Excellent verbal and written communication skills and the ability to interact professionally with a diverse group (executives, managers, and subject matter experts).
Must be proficient in Microsoft Office (Excel, Word, PowerPoint).
Licenses and / or Certifications:
Azure Data Engineer Associate or higher preferred.
Data Engineer-Lead - Project Planning and Execution
Data engineer job in Orlando, FL
We are a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency across all levels. We are looking for a talented Data Engineer to join our team and contribute to developing robust data solutions that support our business goals.
This role is ideal for someone who enjoys combining technical problem-solving with stakeholder collaboration. You will collaborate with business leaders to understand data needs and work closely with a global engineering team to deliver scalable, timely, and high-quality data solutions that power insights and operations.
Responsibilities
* Own data delivery for specific business verticals by translating stakeholder needs into scalable, reliable, and well-documented data solutions.
* Participate in requirements gathering, technical design reviews, and planning discussions with business and technical teams.
* Partner with the extended data team to define, develop, and maintain shared data models and definitions.
* Design, develop, and maintain robust data pipelines and ETL processes using tools like Azure Data Factory and Python across internal and external systems.
* Proactively manage data quality, error handling, monitoring, and alerting to ensure timely and trustworthy data delivery.
* Perform debugging, application issue resolution, root cause analysis, and assist in proactive/preventive maintenance.
* Support incident resolution and perform root cause analysis for data-related issues.
* Create and maintain both business requirement and technical requirement documentation
* Collaborate with data analysts, business users, and developers to ensure the accuracy and efficiency of data solutions.
* Collaborate with platform and architecture teams to align with best practices and extend shared data engineering patterns.
Qualifications
* Minimum of 4 years of experience as a Data Engineer, working with cloud platforms (Azure, AWS).
* Proven track record of managing stakeholder expectations and delivering data solutions aligned with business priorities.
* Strong hands-on expertise in Azure Data Factory, Azure Data Lake, Python, and SQL
* Familiarity with cloud storage (Azure, AWS S3) and integration techniques (APIs, webhooks, REST).
* Experience with modern data platforms like Snowflake and Microsoft Fabric.
* Solid understanding of Data Modeling, pipeline orchestration and performance optimization
* Strong problem-solving skills and ability to troubleshoot complex data issues.
* Excellent communication skills, with the ability to work collaboratively in a team environment.
* Familiarity with tools like Power BI for data visualization is a plus.
* Experience working with or coordinating with overseas teams is a strong plus
Preferred Skills
* Knowledge of Airflow or other orchestration tools.
* Experience working with Git-based workflows and CI/CD pipelines
* Experience in the construction industry or a similar field is a plus but not required.
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together-by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************
Part-Time Data Architect
Data engineer job in Orlando, FL
Part-Time Data Architect (Remote)
TEWS has opportunities with leading companies for professionals at all career stages. Whether you're a seasoned consultant, a recent graduate, or transitioning into a new phase of your career, we are here to help.
TEWS has an enterprise-level client in need of a Data Architect to work approximately 20 hours per week. The candidate must be US based; no offshore candidates or sponsorship available.
Deep understanding of Azure Synapse Analytics, Azure Data Factory, and related Azure data tools.
Lead implementations of secure, scalable, and reliable Azure solutions.
Observe and recommend how to monitor and optimize Azure for performance and cost efficiency.
Expertise in implementing Data Vault 2.0 methodologies using WhereScape automation software.
Proficient in designing and optimizing fact and dimension table models.
Demonstrated ability to design, develop, and maintain data pipelines and workflows.
Strong skills in formulating, reviewing, and optimizing SQL code.
Expertise in data collection, storage, accessibility, and quality improvement processes.
Endorse and foster security best practices, access controls, and compliance standards for all data lake resources.
Proven track record of delivering consumable data using information marts.
Excellent communication skills to effectively liaise with technical and non-technical team members.
Ability to document designs, procedures, and troubleshooting methods clearly.
Proficiency in Python or PowerShell preferred.
Bachelor's or Master's degree in Computer Science, Information Systems, or another related field, or equivalent work experience.
A minimum of 7 years of experience with large and complex database management systems.
Job Responsibilities:
Responsible for enterprise-wide data design, balancing optimization of data access with batch loading and resource utilization factors. Knowledgeable in most aspects of designing and constructing data architectures, operational data stores, and data marts. Focuses on enterprise-wide data modeling and database design. Defines data architecture standards, policies, and procedures for the organization, structure, attributes, and nomenclature of data elements, and applies accepted data content standards to technology projects. Responsible for business analysis; data acquisition and access analysis and design; Database Management Systems optimization; and recovery strategy and load strategy design and implementation.
Essential Position Functions:
Evaluate and recommend data management processes.
Design, prepare and optimize data pipelines and workflows.
Perform knowledge transfer about troubleshooting and documenting Azure architectures and solutions.
TEWS is an equal opportunity employer and will consider all applications for employment without regard to age, color, sex, disability, national origin, race, religion, or veteran status.
Business Intelligence Engineer III
Data engineer job in Melbourne, FL
At Space Coast Credit Union (SCCU), our members are at the heart of everything we do. Since 1951, we've been committed to delivering financial services founded on integrity and a people-first philosophy.
As a Business Intelligence Engineer III in our Melbourne Headquarters, you'll lead the design and delivery of robust data pipelines and scalable data warehouse solutions that power actionable insights. Apply expertise in data architecture, modeling, mining, and integrity practices to transform complex data into clear, reliable analytics. As a senior member of the BI team, mentor colleagues on emerging technologies, champion best practices, and build systems for data collection, cleansing, and analysis, empowering smarter decisions across the organization.
Why Join SCCU?
• Member-Focused Mission: Be part of a not-for-profit organization that reinvests in its members.
• Hybrid and Flexible Schedule Options: Available for select positions. This position is hybrid, with 2 days per week required in the office.
• Career Growth: We prioritize internal promotions and offer on-the-job training.
Principal Duties and Responsibilities:
Professional proficiency in ETL/ELT, SQL Server, SSMS, SSRS, SSIS, and Oracle IDE environments.
Professional proficiency in Microsoft Azure enterprise environment, including MS Azure Data Factory, MS Azure Data Bricks, and MS Azure Synapse.
Professional proficiency in MS Visual Studio.
Specify, design, build, and support data warehousing and other BI reporting and data storage solutions.
Monitor and tune BI tools to ensure optimal efficiency and performance metrics.
Support upgrades, configuration, and troubleshooting for various Business Intelligence tools.
Develop and maintain multi-dimensional (OLAP) reporting databases.
Responsible for program design, coding, testing, debugging, and documentation of all BI systems.
Minimum Qualifications:
To perform this job successfully, an individual must be able to perform each essential task or duty satisfactorily. The requirements listed are representative of the minimum level of knowledge, skills, and abilities required to be successful in this role. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of this role.
Education and Training:
A four-year degree in Computer Science, Information Technology, Finance, Mathematics, or another related field is preferred; however, an equivalent combination of education and relevant knowledge or experience may be considered.
Prior Experience:
5 - 7 years of experience required in any combination of the following: the MS Azure suite of data reporting and integration tools, database development, reporting or analytics in an enterprise environment, Microsoft SQL coding, Oracle databases, or the development and maintenance of analytics reporting in an enterprise environment. Verified coursework may be considered toward equivalence, but experience is preferred.
Business Intelligence Engineer III: Compensation
• Starting Compensation: $111,100 - $117,665 annually
• Bonus Opportunity: Eligible for ONE SCCU Annual Bonus
SCCU Benefits
• Health & Wellness: Medical, dental, and vision insurance, plus an Employee Assistance Program.
• Financial Perks: 401(k) match (5%), HSA match, and SCCU-paid insurance (short/long-term disability, life insurance).
• Education Support: Tuition reimbursement after one year of service.
• Generous Time Off: 20+ days of PTO, birthday PTO, and 11 federal holidays.
• Exclusive Discounts: Lower rates on loans, credit cards, and no fee SCCU accounts!
Hours
Monday - Friday: 8:00am - 5:00pm
About SCCU
Since 1951, Space Coast Credit Union (SCCU) has proudly served our community, growing to over 685,000 members and managing $9 billion in assets. With 67 branches spanning Florida's east coast, we are the third-largest credit union in the state. In 2025, we expanded into Orange County to better serve the growing East Orlando market.
As a not-for-profit financial institution, SCCU is dedicated to putting our members first. Unlike traditional banks, we return profits to our members through better rates, lower fees, and enhanced services. While we offer the same financial products-like checking, savings, and loans-our focus remains on empowering our members and supporting their financial well-being. With local decision-making and a commitment to exceptional service, we strive to make a meaningful difference in the lives of those we serve.
At SCCU, we also prioritize our team members by fostering a supportive and collaborative environment that encourages career growth and development. As we continue to grow, we are seeking talented, member-focused professionals to join our team and help deliver innovative financial solutions and outstanding service.
I UNDERSTAND that this application is a legal document for purposes of my employment. Upon acceptance of an offer with SCCU, I UNDERSTAND that I will be required to complete background checks, employment verifications, and drug screening. I UNDERSTAND further that any misstatements or omissions in this application and pre-employment process can be considered falsification and will result in a decision not to hire me, or to discharge me if discovered after I am hired. I UNDERSTAND that the information requested regarding date of birth, race, and sex is for the sole purpose of gathering the above information accurately, and will not be used to discriminate against me in violation of any law. In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire.
SCCU is a drug-free workplace. I understand that as a condition of my employment, I will be required to submit to testing for the presence of drugs, and to any procedure to assess my qualifications for employment. I ALSO AGREE that if I am hired, my employment is for no definite time and may be terminated at any time without prior notice.