Data Engineer jobs at Randstad North America, Inc. - 797 jobs
VB .Net Software Systems Engineer
Randstad 4.6
Hi, this is Raju. I am a recruiter at Randstad Technologies, and I am reaching out regarding your background. I have an exciting opportunity to share with you.
*Candidates MUST be able to work on W2 to qualify for this role.*
Job Title: VB .Net Software Systems Engineer
Duration: 6 months (extendable up to 30 months)
Location: Hilliard, OH
Responsibilities:
Design, develop, analyze, evaluate, test, debug, document, and implement moderately complex software automation applications from collected specifications/requirements.
Under general direction, devise or modify procedures to solve complex problems while considering equipment capacity, bandwidth, operating time, and form of desired results.
Provide Windows Server administrative support for the production, UAT, and development environments. Participate in occasional evening change windows as required.
Work as a team member in Revenue Assurance, providing technical recommendations to solve business issues.
Responsible for producing high quality deliverables in a timely fashion.
Minimum Qualifications
BS in Engineering, Mathematics, CIS, or a related field, or equivalent relevant experience, is required.
The successful candidate must demonstrate the indicated enterprise experience below:
3 years enterprise experience with software development in an intranet web environment
3 years enterprise experience with VB.NET
3 years enterprise experience with C# (.NET)
3 years enterprise experience with JavaScript
3 years enterprise experience with JQuery
3 years enterprise experience with MVC
3 years enterprise experience with ANSI SQL
Strong analytical and troubleshooting skills
Competency to work at the highest technical level at all phases of the software development lifecycle
Ability to accept responsibility for multiple development projects
Ability to engage and network with contacts inside and outside the team
Please review the job requirements below and reply with your latest resume in Word format as soon as possible. Also, let me know the best contact number to reach you.
Note: If this job description does not suit your profile, please let me know your preferred job roles so I can contact you about similar openings; you are also welcome to refer a suitable candidate for this role. I hope to hear from you soon.
Additional Information
All your information will be kept confidential according to EEO guidelines.
$86k-112k yearly est. 60d+ ago
Greenhouse AI Engineer - Applied ML
iUNU, Inc. 3.9
Seattle, WA
A leading agriculture technology company is seeking an Applied Machine Learning Engineer to develop AI-driven solutions for greenhouse operations. You will create machine learning models, design systems, and implement algorithms to enhance operational efficiency and crop yields. Ideal candidates have 3-5 years of experience, robust skills in statistics and programming, and familiarity with ML frameworks. This role offers an opportunity to contribute to transformative aspects of agriculture in an inclusive environment.
$98k-145k yearly est. 3d ago
Flight Software Engineer
General Atomics 4.7
Huntsville, AL
General Atomics (GA), and its affiliated companies, is one of the world's leading resources for high-technology systems development ranging from the nuclear fuel cycle to remotely piloted aircraft, airborne sensors, and advanced electric, electronic, wireless and laser technologies.
General Atomics Electromagnetic Systems (GA-EMS) designs and manufactures first-of-a-kind electromagnetic and electric power generation systems. GA-EMS' expanding portfolio of specialized products and integrated system solutions supports critical fleet, space systems and satellites, missile defense, power and energy, and process and monitoring applications for defense, industrial, and commercial customers worldwide.
We currently have an exciting opportunity within our Flight Software Design Engineering group for a Missile Flight Software Engineer to support a missile system development program in our Huntsville, AL location. This role will develop requirements, design, implement, integrate and test missile related embedded systems.
DUTIES AND RESPONSIBILITIES:
Assess and/or develop requirements, and design, implement, integrate and test work products.
Responsible for direct hands-on development of software in support of real-time and embedded missile related systems.
Collaborates daily with GA and customer engineering and program management teams.
Provides documentation and makes technical presentations as required.
Support development, review, and editing of technical documents as required.
Performs other duties as assigned or required.
We operate on a 9x80 schedule with every other Friday off
We recognize and appreciate the value and contributions of individuals with diverse backgrounds and experiences and welcome all qualified individuals to apply.
Job Category
Engineering
Travel Percentage Required
0% - 25%
Full-Time/Part-Time
Full-Time Salary
State
Alabama
Clearance Level
Secret
Pay Range Low
81,080
City
Huntsville
Clearance Required?
Desired
Pay Range High
141,650
Recruitment Posting Title
Flight Software Engineer
Job Qualifications
Typically requires a bachelor's degree, master's degree, or PhD in engineering or a related technical discipline from an accredited institution and progressive engineering experience as follows: four or more years of experience with a bachelor's degree, or two or more years of experience with a master's degree. May substitute equivalent engineering experience in lieu of education.
US citizenship is required.
Must be able to obtain a DoD Secret clearance. Active Secret DoD security clearance is preferred.
Bachelor's degree in Electrical or Computer Engineering or a related engineering field is required.
Strong embedded software development discipline is required.
Embedded C/C++ programming experience is required.
Experience working with custom hardware, reading schematics, and working with hardware designers is required.
Must be a resourceful "self-starter" capable of researching details of target processor hardware and interfaces.
Experience with RTOS, ARM, I2C, SPI, Ethernet, and Linux is desired.
Experience with Visual Studio IDE is desired.
Experience with missile avionics flight code is highly desired.
Must possess the ability to understand new concepts quickly and apply them accurately throughout an evolving environment.
Strong communication, presentation, and interpersonal skills are required enabling an effective interface with other departments and/or professionals.
Must be able to work both independently and in a team environment.
Must be willing to work extended hours to meet deadlines and increase probability of project success.
Must be customer focused and able to work on a self-initiated basis or in a team environment and able to work extended hours and travel as required.
US Citizenship Required?
Yes
Experience Level
Mid-Level (3-7 years)
Relocation Assistance Provided?
Yes
Workstyle
Onsite
$68k-88k yearly est. 4d ago
Senior Flight Software Engineer
General Atomics 4.7
Huntsville, AL
General Atomics (GA), and its affiliated companies, is one of the world's leading resources for high-technology systems development ranging from the nuclear fuel cycle to remotely piloted aircraft, airborne sensors, and advanced electric, electronic, wireless and laser technologies.
General Atomics Electromagnetic Systems (GA-EMS) designs and manufactures first-of-a-kind electromagnetic and electric power generation systems. GA-EMS' expanding portfolio of specialized products and integrated system solutions supports critical fleet, space systems and satellites, missile defense, power and energy, and process and monitoring applications for defense, industrial, and commercial customers worldwide.
We currently have an exciting opportunity within our Flight Software Design Engineering group for a Missile Flight Software Engineer to support a missile system development program in our Huntsville, AL location. This role will develop requirements, design, implement, integrate and test missile related embedded systems.
DUTIES AND RESPONSIBILITIES:
Assess and/or develop requirements, and design, implement, integrate and test work products.
Responsible for direct hands-on development of software in support of real-time and embedded missile related systems.
Collaborates daily with GA and customer engineering and program management teams.
Provides documentation and makes technical presentations as required.
Support development, review, and editing of technical documents as required.
Performs other duties as assigned or required.
We operate on a 9x80 schedule with every other Friday off.
We recognize and appreciate the value and contributions of individuals with diverse backgrounds and experiences and welcome all qualified individuals to apply.
Job Category
Engineering
Travel Percentage Required
0% - 25%
Full-Time/Part-Time
Full-Time Salary
State
Alabama
Clearance Level
Secret
Pay Range Low
116,480
City
Huntsville
Clearance Required?
Desired
Pay Range High
208,505
Recruitment Posting Title
Senior Flight Software Engineer
Job Qualifications
Typically requires a bachelor's degree, master's degree, or PhD in engineering or a related technical discipline from an accredited institution and progressive engineering experience as follows: twelve or more years of experience with a bachelor's degree, ten or more years of experience with a master's degree, or seven or more years with a PhD. May substitute equivalent engineering experience in lieu of education.
US citizenship is required.
Must be able to obtain a DoD Secret clearance. Active Secret DoD security clearance is preferred.
Bachelor's degree in Electrical or Computer Engineering or a related engineering field is required.
Strong embedded software development discipline is required.
Embedded C/C++ programming experience is required.
Experience working with custom hardware, reading schematics, and working with hardware designers is required.
Must be a resourceful "self-starter" capable of researching details of target processor hardware and interfaces.
Experience with RTOS, ARM, I2C, SPI, Ethernet, and Linux is desired.
Experience with Visual Studio IDE is desired.
Experience with missile avionics flight code is highly desired.
Experience with software unit, integration, and functional testing is desired.
Must possess the ability to understand new concepts quickly and apply them accurately throughout an evolving environment.
Strong communication, presentation, and interpersonal skills are required enabling an effective interface with other departments and/or professionals.
Must be able to work both independently and in a team environment.
Technical task leadership experience is desired.
Must be willing to work extended hours to meet deadlines and increase probability of project success.
Must be customer focused and able to work on a self-initiated basis or in a team environment and able to work extended hours and travel as required.
US Citizenship Required?
Yes
Experience Level
Senior (8+ years)
Relocation Assistance Provided?
Yes
Workstyle
Onsite
$87k-107k yearly est. 4d ago
Data Scientist
Aerovironment 4.6
Annapolis, MD
The primary role of the **Data Scientist** is to drive mission-focused insights from complex datasets. This role involves managing, modeling, and interpreting large-scale government data holdings to support decision-making and operational success. You will combine expertise in mathematics, statistics, computer science, and domain-specific knowledge to deliver actionable conclusions and communicate them effectively to both technical and non-technical audiences.
**Position Responsibilities**
+ Build and assess analytic models tailored to mission-specific needs
+ Communicate principal conclusions clearly, using mathematics, statistics, and computer science methods
+ Develop reproducible workflows and ensure data integrity
+ Develop qualitative and quantitative methods for exploring and assessing datasets in varying states of organization and cleanliness.
+ Organize, clean, and curate large datasets for analysis
+ Present complex technical findings in a way that is accessible to technical and non-technical stakeholders
**Experience**
+ Bachelor's degree in Mathematics, Applied Mathematics, Applied Statistics, Machine Learning, Data Science, Computer Science or related field or equivalent combination of education and experience.
+ Minimum 3+ years' relevant work experience.
+ Experience with software development working with Python in a Unix environment.
+ Experience using the Unix command line.
+ Practical knowledge in Python Machine Learning and Data Visualization
+ Practical knowledge of data ETL, including loading data from SQL, CSV, JSON, and Excel sources, web scraping (Beautiful Soup, Scrapy, etc.), and data wrangling/cleaning (see the illustrative sketch after this list)
+ Proficiency in statistical packages using any of the following: Python, R, STATA, SPSS, etc.
+ An active TS/SCI with polygraph
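Purely as an illustration of the ETL and data-wrangling skills listed above (not part of the posting itself), the following minimal Python sketch loads data from CSV, JSON, and SQL sources and performs basic cleaning with pandas; the file, table, and column names are hypothetical.

```python
import json
import sqlite3

import pandas as pd

# Hypothetical sources: a CSV export, a JSON feed, and a small SQL table.
readings = pd.read_csv("readings.csv", parse_dates=["timestamp"])
with open("site_metadata.json") as fh:
    metadata = pd.json_normalize(json.load(fh))
with sqlite3.connect("mission.db") as conn:
    sites = pd.read_sql_query("SELECT site_id, site_name FROM sites", conn)

# Basic wrangling: de-duplicate, drop rows missing a key, and join the sources.
readings = readings.drop_duplicates().dropna(subset=["site_id"])
merged = readings.merge(metadata, on="site_id", how="left").merge(sites, on="site_id", how="left")

# A small, reproducible summary for downstream analysis or reporting.
print(merged.groupby("site_name")["value"].agg(["count", "mean", "std"]))
```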
**Additional Requirements**
+ Experience using the Atlassian Tool Suite.
+ Experience with development of any of the following; Hadoop, Pig, MapReduce, or HDFS
+ Working knowledge of other object-oriented programming languages such as Java or C++
+ Working knowledge of front-end data visualization libraries (e.g., D3.js, Raphael.js)
**Physical Demands**
+ Ability to work in an office environment (Constant)
+ Required to sit and stand for long periods; talk, hear, and use hands and fingers to operate a computer and telephone keyboard (Frequent)
**Salary Range: $107,000 to $212,000**
The AV pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Determination of official compensation or salary relies on several factors including, but not limited to, level of position, job responsibilities, geographic location, scope of relevant work experience, educational background, certifications, contract-specific affordability, organizational requirements, alignment with local internal equity as well as alignment with market data.
**Clearance Level**
Top Secret with Poly
**ITAR Requirement:**
_This position requires access to information that is subject to compliance with the International Traffic in Arms Regulations ("ITAR") and/or the Export Administration Regulations ("EAR"). In order to comply with the requirements of the ITAR and/or the EAR, applicants must qualify as a U.S. person under the ITAR and the EAR, or a person to be approved for an export license by the governing agency whose technology comes under its jurisdiction. Please understand that any job offer that requires approval of an export license will be conditional on AeroVironment's determination that it will be able to obtain an export license in a time frame consistent with AeroVironment's business requirements. A "U.S. person" according to the ITAR definition is a U.S. citizen, U.S. lawful permanent resident (green card holder), or protected individual such as a refugee or asylee. See 22 CFR § 120.15. Some positions will require current U.S. Citizenship due to contract requirements._
**Benefits** : AV offers an excellent benefits package including medical, dental vision, 401K with company matching, a 9/80 work schedule and a paid holiday shutdown. For more information about our company benefit offerings please visit: ********************************* .
We also encourage you to review our company website at ******************** to learn more about us.
Principals only need apply. NO agencies please.
**Who We Are**
Based in California, AeroVironment (AVAV) is a global leader in unmanned aircraft systems (UAS) and tactical missile systems. Founded in 1971 by celebrated physicist and engineer, Dr. Paul MacCready, we've been at the leading edge of technical innovation for more than 45 years. Be a part of the team that developed the world's most widely used military drones and created the first submarine-launched reconnaissance drone, and has seven innovative vehicles that are part of the Smithsonian Institution's permanent collection in Washington, DC.
Join us today in developing the next generation of small UAS and tactical missile systems that will deliver more actionable intelligence to our customers so they can proceed with certainty - and succeed.
**What We Do**
Building on a history of technological innovation, AeroVironment designs, develops, produces, and supports an advanced portfolio of unmanned aircraft systems (UAS) and tactical missile systems. Agencies of the U.S. Department of Defense and allied military services use the company's hand-launched UAS to provide situational awareness to tactical operating units through real-time, airborne reconnaissance, surveillance, and target acquisition.
_We are proud to be an EEO/AA Equal Opportunity Employer, including disability/veterans. AeroVironment, Inc. is an Equal Employment Opportunity (EEO) employer and welcomes all qualified applicants. Qualified applicants will receive fair and impartial consideration without regard to race, sex, color, religion, national origin, age, disability, protected veteran status, genetic data, sexual orientation, gender identity or other legally protected status._
**ITAR**
U.S. Citizenship required
**About AV:**
**AV isn't for everyone. We hire the curious, the relentless, the mission-obsessed. The best of the best.**
We don't just build defense technology-we redefine what's possible. As the premier autonomous systems company in the U.S., AV delivers breakthrough capabilities across air, land, sea, space, and cyber. From AI-powered drones and loitering munitions to integrated autonomy and space resilience, our technologies shape the future of warfare and protect those who serve.
Founded by legendary innovator Dr. Paul MacCready, AV has spent over 50 years pushing the boundaries of what unmanned systems can do. Our heritage includes seven platforms in the Smithsonian-but we're not building history, we're building what's next.
**If you're ready to build technology that matters-with speed, scale, and purpose-there's no better place to do it than AV.**
$69k-93k yearly est. 41d ago
Data Engineer/Analyst
Southern Company 4.5
Birmingham, AL
Onsite 4 days a week (Monday through Thursday); remote option on Fridays.
The Data Engineer/Analyst role is located in Birmingham, Alabama. The position will be filled as a member of the Operations & Commercial Technology Solutions organization, within Southern Power Technology Solutions. The Data Engineer/Analyst role will perform advanced and descriptive analytics that lead to improved business profitability, increased operational efficiencies, and better customer experience. This role requires a combination of conceptual thinking, analytical expertise, business acumen, strategic mindset, and customer focus. This role is responsible for discovering business insights using statistics, machine learning, and visualization techniques. The successful candidate will need to acquire strong knowledge of our internal business. This role focuses on supporting various departments across Southern Power. Ability to collaborate with stakeholders across the company is essential to success in this role.
JOB QUALIFICATIONS:
Bachelor's degree in Engineering, Computer Science, Information Technology, Data Science, or a related field is required
A master's degree in Engineering, Computer Science, Information Technology, Data Science, or a related field is preferred
3+ years of hands-on experience with data analytics or programming, with a progressively increasing level of responsibility
Experience querying data from various data sources
Excellent analytics and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.
Proficient with one or more programming languages used for analytics, preferably Python and SQL
Experience with Databricks, Power BI, Tableau, Plotly/Dash is a plus
Experience working with operational, industrial, IoT, or timeseries data is a plus
Strong interest in AI and aptitude to learn how to develop AI solutions for internal business partners
Ability to effectively collaborate in a highly cross-functional work environment, engage diverse stakeholders, ask questions to understand client needs, and partner with colleagues across different teams
Written communication skills and experience summarizing key insights in concise emails, a few bullet points, and executive summaries
Verbal communication skills and experience effectively presenting analytics outcomes and methodology in practical terms to business audiences
Experience managing and prioritizing multiple projects simultaneously
Behaviors rooted in our corporate values: Safety First, Intentional Inclusion, Act with Integrity, Superior Performance
Knowledge of the electric utility industry is a plus
MAJOR JOB RESPONSIBILITIES:
Build relationships with internal clients to understand their goals, processes, and challenges
Promptly respond to requests from internal clients, understand their business questions, and perform analytics to meet their business needs
Proactively suggest advanced analytics solutions to clients
Write programs to pull data from various data sources (see the illustrative sketch after this list)
Identify, gather, and prepare data for your projects
Analyze structured and unstructured data
Fulfill high volumes of ad-hoc analytics requests
Develop and deploy analytical models
Monitor, diagnose, and optimize data pipelines and analytical solutions
Develop dashboards that automatically pull data from various internal databases
Summarize key insights from your analytics projects and provide conclusions and recommendations
Produce analytics and models to enhance internal business processes, reduce operational costs, increase revenues, optimize decision making, and support capacity growth
Provide insights and tools that quantify, track, and improve data quality
Establish external and internal relationships to share best practices in data science, stay abreast of industry trends and emerging technologies
Continuously learn new data, new platforms, and new methods. Demonstrate curiosity and creativity in the rapidly evolving world of data and AI.
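To make the data-pull and dashboarding responsibilities above concrete, here is a hedged, minimal Python sketch that queries a relational source and regenerates a simple interactive chart; the database, table, and column names are invented for illustration and are not part of the posting.

```python
import sqlite3

import pandas as pd
import plotly.express as px

# Hypothetical local database standing in for an internal operational data source.
with sqlite3.connect("operations.db") as conn:
    df = pd.read_sql_query(
        "SELECT plant_id, reading_date, net_generation_mwh FROM generation_readings",
        conn,
        parse_dates=["reading_date"],
    )

# Roll daily readings up to a monthly series per plant.
monthly = (
    df.groupby(["plant_id", pd.Grouper(key="reading_date", freq="MS")])["net_generation_mwh"]
    .sum()
    .reset_index()
)

# A simple interactive chart that a scheduled job could regenerate for stakeholders.
fig = px.line(monthly, x="reading_date", y="net_generation_mwh", color="plant_id",
              title="Monthly net generation by plant")
fig.write_html("monthly_generation.html")
```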
Behavioral Attributes
Opportunistic Drive - Committed to delivering technology solutions that help Southern Power achieve their business objectives
Positive Can-Do Attitude - Must be willing to take full responsibility for duties and work effectively under the pressure of deadlines and shifting priorities.
Self-Starter - Able to work in a professional environment with limited direct supervision.
Results-Oriented - Acts with speed and decisiveness; takes initiative and does what it takes to meet commitments.
Safety Focused - Accepts responsibility for the safety of self and co-workers.
Commitment to continuous learning and improvement - Stays abreast of new technologies and techniques in the market; looks for opportunities to improve through strategy and innovation.
$77k-97k yearly est. Auto-Apply 3d ago
Lead Data Engineer (King Of Prussia, PA, US, 19406)
UGI Corp 4.7
King of Prussia, PA
When you work for AmeriGas, you become a part of something BIG! Founded in 1959, AmeriGas is the nation's premier propane company, serving over 1.5 million residential, commercial, industrial and motor fuel propane customers. Together, over 6,500 dedicated professionals will deliver over 1 billion gallons of propane from 1,800+ distribution points across the United States.
Position Summary
The Lead Data Engineer is a hands-on technical leader responsible for leading data engineering initiatives, managing a team of 1-3 direct reports, and personally building robust data models and ETL pipelines to meet organizational goals. This role requires proficiency in Snowflake, SQL, and Databricks, with extensive experience integrating data from multiple source systems. The Lead Data Engineer will be deeply involved in coding and implementation work while also serving as a thought leader in best data engineering practices, mentoring team members, and establishing technical standards across the organization.
Duties and Responsibilities
Hands-on development:
* Personally design, build, and maintain complex ETL/ELT pipelines from diverse source systems using Snowflake and Databricks (see the illustrative sketch after this list)
Direct technical contribution:
* Write production-quality code and SQL on a daily basis, serving as a primary technical contributor to critical data engineering projects
* Lead and manage a team of 1-3 data engineers, providing technical guidance, code reviews, mentorship, and performance management
* Architect and implement scalable data models and infrastructure that align with business objectives
Thought leadership:
* Drive innovation and establish best practices in data engineering, serving as the subject matter expert for modern data platform capabilities
* Manage the full life-cycle development process for data engineering projects, from requirements gathering through deployment and maintenance
* Collaborate with data science and BI teams to enable deployment of scalable data solutions across the company's ecosystem
* Establish and enforce data engineering best practices, coding standards, and quality assurance protocols through both leadership and example
* Provide expert-level support and hands-on troubleshooting for SQL optimization, data warehousing, and cloud data platform architecture
* Adhere to stringent quality assurance and documentation standards using version control and code repositories (e.g., Git, GitHub)
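As a rough sketch of the Snowflake-based ELT work described above (an illustration only, not the company's actual pipeline), the snippet below stages rows from a source extract and merges them into a target table using the snowflake-connector-python package; the account, credentials, tables, and columns are all hypothetical.

```python
import pandas as pd
import snowflake.connector

# Extract: pull an incremental slice from a source extract (a CSV file here, for simplicity).
orders = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])

# Light transform before loading, then convert to plain Python types for binding.
orders["order_total"] = orders["quantity"] * orders["unit_price"]
rows = [
    (int(r.order_id), r.order_date.date(), float(r.order_total))
    for r in orders.itertuples(index=False)
]

# Hypothetical connection details; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="etl_service", password="********",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()
try:
    # Load the rows into a temporary staging table, then merge into the target (ELT style).
    cur.execute(
        "CREATE TEMPORARY TABLE stg_orders (order_id NUMBER, order_date DATE, order_total NUMBER(12,2))"
    )
    cur.executemany("INSERT INTO stg_orders VALUES (%s, %s, %s)", rows)
    cur.execute(
        """
        MERGE INTO ANALYTICS.CORE.FACT_ORDERS tgt
        USING stg_orders src ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET tgt.order_total = src.order_total
        WHEN NOT MATCHED THEN INSERT (order_id, order_date, order_total)
            VALUES (src.order_id, src.order_date, src.order_total)
        """
    )
finally:
    cur.close()
    conn.close()
```

In practice the merge logic would typically live in versioned SQL models and run under an orchestrator rather than as a one-off script.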
Knowledge, Skills and Abilities
* Proficiency in Snowflake, including advanced features such as data sharing, streams, tasks, and performance optimization
* Expert-level SQL skills for complex queries, stored procedures, and database optimization with ability to write highly efficient, production-ready code
* Extensive hands-on experience with Databricks, including Delta Lake, Apache Spark, and cluster management
* Proven experience designing and personally implementing ETL/ELT processes from multiple heterogeneous source systems
* Experience with QlikSense and QlikView for data integration and analytics (strongly preferred)
* Advanced working knowledge of Python and/or other programming languages for data engineering workflows
* Strong understanding of modern data architecture principles, including data warehousing, data lakes, and lakehouse architectures
* Excellent communication skills with ability to effectively convey complex technical concepts to both technical and non-technical stakeholders
* Business acumen with ability to identify risks, opportunities, and generate innovative solutions to complex data challenges
* Strong project management skills with ability to prioritize competing demands and deliver results on time
* Collaborative mindset with experience working across data science, analytics, engineering, and business teams
Minimum Qualifications
* Bachelor's degree in Computer Science, Data Engineering, Information Systems, Mathematics, Statistics, or related technical discipline required; Master's Degree preferred
* 5+ years of hands-on, individual contributor experience in data engineering with progressively increasing responsibility
* 2+ years of experience in a technical leadership or team lead capacity while maintaining significant hands-on coding responsibilities
AmeriGas Propane, Inc. is an Equal Opportunity Employer. The Company does not discriminate on the basis of race, color, sex, national origin, disability, age, gender identity, sexual orientation, veteran status, or any other legally protected class in its practices.
AmeriGas is a Drug Free Workplace. Candidates must be willing to submit to a pre-employment drug screen and a criminal background check. Successful applicants shall be required to pass a pre-employment drug screen as a condition of employment, and if hired, shall be subject to substance abuse testing in accordance with AmeriGas policies. As a federal contractor that engages in safety-sensitive work, AmeriGas cannot permit employees in certain positions to use medical marijuana, even if prescribed by an authorized physician. Similarly, applicants for such positions who are actively using medical marijuana may be denied hire on that basis.
$78k-97k yearly est. 13d ago
Senior Data Engineer
Crusoe 4.1
San Francisco, CA
Job Description
Crusoe's mission is to accelerate the abundance of energy and intelligence. We're crafting the engine that powers a world where people can create ambitiously with AI - without sacrificing scale, speed, or sustainability.
Be a part of the AI revolution with sustainable technology at Crusoe. Here, you'll drive meaningful innovation, make a tangible impact, and join a team that's setting the pace for responsible, transformative cloud infrastructure.
About This Role:
Join Crusoe Energy as a Senior Data Engineer, an early and pivotal hire on our growing central Data Science and Engineering team. This full-time position offers you the unique opportunity to architect and build the foundational data platform infrastructure that powers Crusoe's AI and cloud operations. You'll go beyond building pipelines: you'll design and implement the systems that make those pipelines possible, playing a key role in Crusoe's mission to meet the ever-increasing demand for AI infrastructure and innovate sustainably.
What You'll Be Working On:
Architect Data Platform Infrastructure: Design and build scalable, reliable data infrastructure systems that serve as the foundation for analytics, machine learning, and operational intelligence across Crusoe.
Build High-Performance Data Systems: Develop robust data processing frameworks and storage solutions optimized for the unique demands of AI infrastructure and cloud operations.
Champion Data Quality & Reliability: Implement data validation, monitoring, and observability systems that ensure data integrity and platform reliability at scale.
Collaborate Across Engineering Teams: Partner with software engineers, data scientists, and operations teams to understand requirements and deliver infrastructure that accelerates their work.
Drive Technical Excellence: Establish best practices for data engineering, contribute to architectural decisions, and mentor team members as the team grows.
What You'll Bring to the Team:
Strong Software Engineering Foundation: Demonstrate proficiency in Python and at least one systems-level language (Go preferred; Java, C, or C++ also valued), with the ability to write production-quality, maintainable code.
Data Infrastructure Expertise: Possess deep experience designing and building data platforms, including data warehouses, data lakes, streaming systems, and ETL/ELT frameworks.
Distributed Systems Knowledge: Show understanding of distributed computing principles, with experience building systems that scale reliably.
SQL Mastery: Exhibit advanced SQL skills for data modeling, query optimization, and database design.
Infrastructure-as-Code Mindset: Demonstrate familiarity with modern DevOps practices including CI/CD, containerization, and infrastructure automation.
Ownership & Autonomy: As an early hire on the data engineering team, you'll be expected to drive projects from conception to production with minimal supervision.
Collaboration Skills: Communicate effectively with technical and non-technical stakeholders to translate requirements into robust solutions.
Bonus Points:
Experience with Google Cloud Platform (GCP) data services (BigQuery, Dataflow, Pub/Sub, Cloud Storage).
Experience with Apache Beam.
Background in building infrastructure for AI/ML workloads or cloud computing platforms.
Experience on a growing or early-stage data team, contributing to foundational development.
Familiarity with data orchestration tools (e.g., Airflow, Dagster, Prefect); a minimal orchestration sketch follows this list.
Experience with real-time data processing and streaming architectures.
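For orientation only, here is a minimal sketch of the kind of orchestration the bonus points mention, written with Airflow's TaskFlow API (assuming a recent Airflow 2.x); the DAG name, schedule, and task bodies are placeholders, not Crusoe's actual pipelines.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def daily_metrics_pipeline():
    """Hypothetical daily pipeline: extract raw events, aggregate them, and publish a summary."""

    @task
    def extract() -> list[dict]:
        # Placeholder for pulling rows from an upstream system (API, object store, database, ...).
        return [{"event": "vm_start", "count": 10}, {"event": "vm_stop", "count": 7}]

    @task
    def transform(rows: list[dict]) -> dict:
        # Aggregate the raw rows into a single daily summary.
        return {row["event"]: row["count"] for row in rows}

    @task
    def load(summary: dict) -> None:
        # Placeholder for writing to a warehouse table or metrics store.
        print(f"Writing daily summary: {summary}")

    load(transform(extract()))


daily_metrics_pipeline()
```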
Benefits:
Industry competitive pay
Restricted Stock Units in a fast growing, well-funded technology company
Health insurance package options that include HDHP and PPO, vision, and dental for you and your dependents
Employer contributions to HSA accounts
Paid Parental Leave
Paid life insurance, short-term and long-term disability
Teladoc
401(k) with a 100% match up to 4% of salary
Generous paid time off and holiday schedule
Cell phone reimbursement
Tuition reimbursement
Subscription to the Calm app
MetLife Legal
Company paid commuter benefit; $300 per month
Crusoe is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, disability, genetic information, pregnancy, citizenship, marital status, sex/gender, sexual preference/ orientation, gender identity, age, veteran status, national origin, or any other status protected by law or regulation.
$126k-178k yearly est. 20d ago
BI Data Engineer
NCS Multistage LLC 4.1
Houston, TX
Job Description
Job Title - Business Intelligence (BI) Data Engineer
Department - Information Technology
Reports to - Director Business Intel
The Business Intelligence (BI) Data Engineer will be responsible for designing, building, and maintaining the data infrastructure, pipelines, and analytics solutions that drive data-informed decision making across the organization. This role bridges data engineering and business intelligence, ensuring data is clean, reliable, and transformed into meaningful insights through platforms like Microsoft Fabric, Power BI, SQL Server, and Snowflake. This role requires the ability to solve complex data challenges, experience across cloud and on-premises platforms, and the ability to turn raw data into trusted, actionable intelligence for business stakeholders.
Key Areas of Responsibility
Data Engineering & Pipelines
Design and implement relational and dimensional databases, including schema design, indexing strategies, and performance tuning (see the dimensional-model sketch after this group of bullets)
Develop, maintain, and optimize ETL/ELT pipelines using Apache Airflow or Azure Data Factory (Fabric Data Factory)
Design scalable data models and pipelines for SQL Server and Snowflake
Deliver reporting and analytics solutions using Microsoft BI, including Fabric and Power BI
Ensure high availability, reliability, and performance of data processes
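As an illustrative sketch of dimensional modeling for a Power BI-style star schema (not NCS's actual schema), the following Python snippet derives dimension and fact tables from a flat extract with pandas; the source file and column names are assumptions.

```python
import pandas as pd

# Hypothetical flat extract of sales transactions pulled from a source system.
sales = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

# Customer dimension: one row per customer, with a surrogate key.
dim_customer = sales[["customer_id", "customer_name", "region"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

# Date dimension: a simple calendar derived from the order dates.
dim_date = pd.DataFrame({"date": pd.date_range(sales["order_date"].min(), sales["order_date"].max())})
dim_date["date_key"] = dim_date["date"].dt.strftime("%Y%m%d").astype(int)

# Fact table: measures keyed by dimension surrogate keys (a star schema BI tools can model directly).
fact_sales = (
    sales.merge(dim_customer, on=["customer_id", "customer_name", "region"])
    .assign(date_key=lambda d: d["order_date"].dt.strftime("%Y%m%d").astype(int))
    [["date_key", "customer_key", "order_id", "quantity", "net_amount"]]
)

# Stage the modeled tables for load into SQL Server, Snowflake, or Fabric.
for name, frame in {"dim_customer": dim_customer, "dim_date": dim_date, "fact_sales": fact_sales}.items():
    frame.to_csv(f"{name}.csv", index=False)
```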
Business Intelligence & Analytics
Build, maintain, and optimize dashboards and reports in Power BI and Fabric
Translate complex data sets into clear visualizations and metrics for business stakeholders
Partner with teams to identify KPIs and deliver actionable insights
Data Governance & Quality
Implement and monitor data validation, error handling, and performance tuning strategies
Contribute to best practices in data security, compliance, and governance
Collaboration & Strategy
Work closely with data scientists, analysts, and business units to support cross-functional analytics initiatives
Participate in architectural discussions to improve scalability and efficiency of data solutions
Core Database Administration (DBA) Skills
Backup, Recovery & High Availability
Design and implement backup and restore strategies (full, differential, transaction log backups)
Knowledge of disaster recovery planning and RPO/RTO requirements
Experience with high availability solutions:
SQL Server: Always On Availability Groups, Failover Clustering
Snowflake: Time Travel, Fail-safe
Ability to test and document recovery processes
Performance Monitoring & Tuning
Diagnose slow queries, deadlocks, and bottlenecks
Use tools like SQL Profiler, Extended Events, DMVs, Query Store, or Performance Monitor
Tune indexes, statistics, and query plans
Optimize ETL job performance and concurrency
Security & Compliance
Implement role-based access control (RBAC), encryption at rest/in transit, and auditing
Understand GDPR, SOC2, HIPAA, or other compliance frameworks
Manage user provisioning, privilege management, and data masking
Maintenance & Automation
Set up and monitor database maintenance plans (index rebuilds, integrity checks)
Automate housekeeping tasks via SQL Agent, PowerShell, or Fabric pipelines
Capacity Planning & Storage Management
Forecast storage growth and manage file groups/partitions
Understand I/O characteristics and underlying hardware/cloud configurations
Advanced/Strategic DBA Skills
Capacity and scalability planning for BI workloads
Data lifecycle management and archiving strategies
Collaboration with data architects to align database design with business goals
Documentation and governance of data assets and metadata
Cloud DBA/Modern Data Platform Skills (for hybrid or cloud environments)
Snowflake Administration: Roles, warehouses, resource monitors, credit usage optimization
Azure SQL/Synapse/Fabric Data Warehouse administration
Familiarity with IAM, networking, and cost control in cloud data platforms
Experience with infrastructure as code (IaC) tools for database provisioning (e.g., Terraform, ARM templates)
Support and uphold HS&E policies and procedures of NCS and the customer
Align individual goals with NCS corporate goals, while adhering to the NCS Promise
Participate in your Personal Development for Success (PDS)
Other duties, relevant to the position, shall be assigned as required
Knowledge, Skills and Abilities
Bachelor's degree in Computer Science, Information Technology, or equivalent experience
3+ years of experience in BI development, data engineering, or similar roles
Strong proficiency in SQL Server (database design, query optimization, stored procedures, performance tuning)
Hands-on experience with Snowflake (warehousing, schema design, data sharing, performance optimization)
Practical knowledge of Apache Airflow or Azure Data Factory (Fabric Data Factory) for orchestration and workflow management
Proficiency with the Microsoft BI stack, including Fabric and Power BI
Track record of building and maintaining well-designed databases, complex data pipelines, and reporting solutions
Strong analytical skills and ability to explain technical concepts clearly to business audiences
Experience with Python or other scripting languages for data manipulation
Knowledge of CI/CD practices for data pipeline deployment
Exposure to data governance frameworks and compliance standards
Familiarity with APIs and data integration tools
Understanding of AI-powered BI tools, including how to prepare and connect datasets for Microsoft Copilot in Power BI/Fabric
Awareness of how to design data models for AI-driven analytics and natural language queries
Additional Information
FLSA Status: Exempt
Employment Classification: Full-time, Regular
Work schedule: 5 days on, 2 days off; Monday to Friday 8:00am - 5:00pm, on-call 24/7 for support
Travel: 15-20% domestic travel required; some international travel may be required
Target Discretionary Bonus: Eligible
Special Equipment: Laptop
Criminal background check required for all positions
Safety sensitive positions will require additional pre-employment testing
Core Competencies
Teamwork/Collaboration - Able to work cooperatively with other individuals
Service Focus - Builds & maintains customer satisfaction and provides excellent service to internal & external customers
Decision Making - Able to make decisions and solve problems of varied levels of complexity using logical, systematic, and sequential approach
Ethics & Integrity - Trustworthiness and ethical behavior with consideration for impact & consequence when making decisions/taking action
Problem Solving - Ability to approach a problem by using a logical, systematic, sequential approach
Continuous Improvement - Ongoing improvement of products, services, or processes through incremental & breakthrough improvements
Accountability - Obligation or willingness to be answerable for an outcome
$84k-119k yearly est. 8d ago
Data Engineer
Floworks International LLC 4.2
Houston, TX
FloWorks is a leading, privately held specialty industrial supplier of pipe, valves, fittings, and related products, as well as a provider of technical solutions to the energy and industrial sectors. Headquartered in Houston, Texas, FloWorks is dedicated to delivering exceptional products, expertise, and service to its customers.
Job Information:
As the Data Engineer, you are responsible for building, managing, and optimizing cloud-based data solutions to support business processes and analytics across the organization. This role requires expertise in ETL development, data modeling, cloud technologies, and business intelligence to ensure data integrity, insightful analysis, and effective reporting.
Key Responsibilities:
Data Analytics:
Build and manage cloud ETL processes for data extraction, transformation, and loading from multiple sources into data lakes and data warehouses primarily within Microsoft Data Fabric.
Apply business rules, audit, and stage data to ensure data integrity and compliance.
Develop fact and dimension tables to support Power BI report development and other business intelligence needs.
Create visualizations and reports to meet business requirements and support decision-making.
Provide business analysis, problem-solving, and creativity to identify KPIs and metrics that drive business goals.
Ensure timely and accurate performance on assigned projects, maintaining compliance with project budgets and deadlines.
Proactively engage in projects, recognize and resolve problems, and implement solutions independently.
Collaborate with cross-functional teams to gather complete datasets and communicate findings company-wide.
Train stakeholders on best practices for data reporting and self-service analytics.
System Integration and IPAAS solutions:
Integrate data across systems using REST and SOAP APIs, including authentication (OAuth), API keys, pagination, rate limits, retries, and error handling (see the illustrative sketch after this list).
Design and manage scalable data ingestion pipelines that pull data from SaaS platforms such as Salesforce, NetSuite, Workday, or ERP systems.
Build and maintain internal data APIs to support curated datasets for analysts and applications.
Translate API payloads (JSON, XML) into internal data models for loading into data lakes, warehouses, or MDM environments.
Support Master Data Management (MDM) processes by integrating and synchronizing core business entities across platforms, ensuring consistent, governed, and high‑quality master data throughout the ecosystem.
Implement middleware workflows using platforms such as Azure Logic Apps, Azure Data Factory, MuleSoft, Boomi, or Informatica Cloud.
Develop event‑driven integrations using messaging systems such as Azure Service Bus, Kafka, or RabbitMQ.
Build end‑to‑end orchestration workflows that call APIs, transform data, and deliver it to destination systems.
Apply integration patterns such as ETL/ELT, Change Data Capture (CDC), event streaming, batch vs. real‑time ingestion, and data synchronization.
Ensure secure API access (OAuth flows, token refresh logic) and apply governance practices for PII, auditability, and cross‑platform data flows.
Support cloud‑aligned practices including automated deployment of API connectors or middleware workflows using CI/CD.
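To illustrate the REST-integration concerns called out above (API keys, pagination, rate limits, retries, error handling), here is a minimal, hypothetical Python sketch using the requests library with automatic retries and simple page-based pagination; the endpoint, token handling, and response shape are assumptions, not FloWorks' actual integrations.

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

BASE_URL = "https://api.example-saas.com/v1"   # hypothetical SaaS endpoint
API_TOKEN = "********"                          # a real integration would refresh OAuth tokens from a secrets store

# Session with automatic retries and backoff for transient failures and rate limiting (HTTP 429/5xx).
session = requests.Session()
retries = Retry(total=5, backoff_factor=1, status_forcelist=[429, 500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))
session.headers.update({"Authorization": f"Bearer {API_TOKEN}"})


def fetch_all(resource: str, page_size: int = 200) -> list[dict]:
    """Pull every page of a paginated collection and return the combined records."""
    records, page = [], 1
    while True:
        resp = session.get(
            f"{BASE_URL}/{resource}",
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()          # surface non-retryable errors explicitly
        batch = resp.json().get("data", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records


orders = fetch_all("orders")
print(f"Fetched {len(orders)} order records")
```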
Qualifications:
Bachelor's degree in technology, mathematics, statistics, accounting, finance, or a related quantitative discipline.
Over 5 years of experience in data analytics, with strong proficiency in Microsoft Fabric.
Prior in-depth hands-on experience with system integrations utilizing Boomi is required for this position.
Expert in SQL (a must-have skill).
Highly experienced in cloud technologies, with a strong preference for Microsoft Fabric, DBT, and Azure. Experience with Databricks, Snowflake, and AWS may also be considered.
Proficient in Python programming and data modeling using the Kimball Method (Star Schema).
Skilled in analytical and visualization tools, with a strong preference for Power BI. Experience with Tableau may also be considered.
Experience with, and a passion for, training data science and machine learning models.
Familiarity with Git and source control concepts.
Experience with Databricks, Airflow, Python, AI, AWS, and data integrations with ERPs or Salesforce is a plus.
Ability to work with Azure DevOps and cross-functional teams to define analytics use cases and translate them into technical solutions.
Strong intellectual and analytical curiosity, adaptability, and independence.
Physical Demands
Frequently required to stand
Frequently required to walk
Continually required to sit
Continually required to utilize hand and finger dexterity
Occasionally balance, bend, stoop, kneel or crawl
Continually required to talk or hear
Continually utilize visual acuity to read technical information and/or use a keyboard
Occasionally required to lift/push/carry items up to 25 pounds
Occasionally work near moving mechanical parts
Occasionally exposure to outside weather conditions
Occasionally loud noise (examples: shop tool noises, electric motors, moving mechanical equipment)
Work Environment
This role operates in a professional office environment with flexibility for hybrid work. Standard office equipment such as computers, phones, and printers are used. Occasional visits to warehouses or operational sites may be required.
The Perks of Working Here
FloWorks offers a competitive benefits package designed to support your health, financial well-being, and work-life balance. Highlights include:
Medical, Dental & Vision Insurance with multiple plan options
Company-paid Life and Disability Insurance
401(k) with company match
Health Savings & Flexible Spending Accounts
Supplemental coverage (Accident, Critical Illness, Hospital Indemnity)
Employee Assistance Program (includes 3 free counseling sessions)
Identity Theft Protection at discounted rates
This information indicates the general nature and level of work performed by associates in this role. It is not designed to contain a comprehensive inventory of all duties, responsibilities, and qualifications required of associates assigned to this role. This description supersedes any previous or undated descriptions for this role. Management retains the right to add or change the duties of the position at any time. Questions about the duties and responsibilities of this position should be directed to the reporting Manager or Human Resources.
FloWorks is an equal opportunity employer and gives consideration for employment to qualified applicants without regard to race, color, religion, gender, gender identity, sexual orientation, national origin, genetics, disability, age, or protected veteran status. We are committed to fostering a culture where every individual is valued and empowered to contribute to shared success.
FloWorks participates in the US Government's E-Verify program.
$84k-119k yearly est. Auto-Apply 2d ago
Data Engineer
Floworks International LLC 4.2
Houston, TX jobs
Job Description
FloWorks is a leading, privately held specialty industrial supplier of pipe, valves, fittings, and related products, as well as a provider of technical solutions to the energy and industrial sectors. Headquartered in Houston, Texas, FloWorks is dedicated to delivering exceptional products, expertise, and service to its customers.
Job Information:
As the Data Engineer, you are responsible for building, managing, and optimizing cloud-based data solutions to support business processes and analytics across the organization. This role requires expertise in ETL development, data modeling, cloud technologies, and business intelligence to ensure data integrity, insightful analysis, and effective reporting.
Key Responsibilities:
Data Analytics:
Build and manage cloud ETL processes for data extraction, transformation, and loading from multiple sources into data lakes and data warehouses primarily within Microsoft Data Fabric.
Apply business rules, audit, and stage data to ensure data integrity and compliance.
Develop fact and dimension tables to support Power BI report development and other business intelligence needs (see the sketch following this list).
Create visualizations and reports to meet business requirements and support decision-making.
Provide business analysis, problem-solving, and creativity to identify KPIs and metrics that drive business goals.
Ensure timely and accurate performance on assigned projects, maintaining compliance with project budgets and deadlines.
Proactively engage in projects, recognize and resolve problems, and implement solutions independently.
Collaborate with cross-functional teams to gather complete datasets and communicate findings company-wide.
Train stakeholders on best practices for data reporting and self-service analytics.
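To make the fact/dimension split above concrete, here is a minimal pandas sketch that carves a flat sales extract into a customer dimension and a sales fact table. The column names and sample rows are hypothetical and are not taken from FloWorks systems.

```python
import pandas as pd

# Hypothetical flat extract pulled from a source system.
sales = pd.DataFrame({
    "order_id":      [1001, 1002, 1003],
    "customer_name": ["Acme", "Beta Corp", "Acme"],
    "customer_city": ["Houston", "Dallas", "Houston"],
    "order_date":    ["2024-01-05", "2024-01-06", "2024-02-01"],
    "amount":        [250.0, 125.5, 310.0],
})

# Dimension: one row per distinct customer, with a surrogate key.
dim_customer = (
    sales[["customer_name", "customer_city"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus a foreign key back to the dimension.
fact_sales = sales.merge(dim_customer, on=["customer_name", "customer_city"])
fact_sales = fact_sales[["order_id", "customer_key", "order_date", "amount"]]

print(dim_customer)
print(fact_sales)
```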
System Integration and IPAAS solutions:
Integrate data across systems using REST and SOAP APIs, including authentication (OAuth), API keys, pagination, rate limits, retries, and error handling (illustrated in the sketch following this list).
Design and manage scalable data ingestion pipelines that pull data from SaaS platforms such as Salesforce, NetSuite, Workday, or ERP systems.
Build and maintain internal data APIs to support curated datasets for analysts and applications.
Translate API payloads (JSON, XML) into internal data models for loading into data lakes, warehouses, or MDM environments.
Support Master Data Management (MDM) processes by integrating and synchronizing core business entities across platforms, ensuring consistent, governed, and high‑quality master data throughout the ecosystem.
Implement middleware workflows using platforms such as Azure Logic Apps, Azure Data Factory, MuleSoft, Boomi, or Informatica Cloud.
Develop event‑driven integrations using messaging systems such as Azure Service Bus, Kafka, or RabbitMQ.
Build end‑to‑end orchestration workflows that call APIs, transform data, and deliver it to destination systems.
Apply integration patterns such as ETL/ELT, Change Data Capture (CDC), event streaming, batch vs. real‑time ingestion, and data synchronization.
Ensure secure API access (OAuth flows, token refresh logic) and apply governance practices for PII, auditability, and cross‑platform data flows.
Support cloud‑aligned practices including automated deployment of API connectors or middleware workflows using CI/CD.
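As a rough sketch of the API extraction pattern described in this list (OAuth, pagination, rate limits, and retries), the Python example below pulls all pages from a hypothetical REST endpoint. The URLs, parameter names, and response shape are assumptions for illustration only, not any specific vendor's API.

```python
import time
import requests

TOKEN_URL = "https://example.com/oauth/token"     # hypothetical OAuth endpoint
DATA_URL = "https://example.com/api/v1/invoices"  # hypothetical data endpoint


def get_token(client_id: str, client_secret: str) -> str:
    """Obtain a bearer token via the client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_all(client_id: str, client_secret: str, max_retries: int = 3) -> list:
    """Walk a paginated endpoint, refreshing the token on 401 and backing off on 429."""
    token = get_token(client_id, client_secret)
    page, records = 1, []
    while True:
        for attempt in range(max_retries):
            resp = requests.get(
                DATA_URL,
                headers={"Authorization": f"Bearer {token}"},
                params={"page": page, "page_size": 200},
                timeout=30,
            )
            if resp.status_code == 401:          # expired token: refresh and retry
                token = get_token(client_id, client_secret)
                continue
            if resp.status_code == 429:          # rate limited: exponential backoff
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()
            break
        else:
            resp.raise_for_status()              # retries exhausted
        batch = resp.json().get("results", [])
        if not batch:
            return records                       # empty page ends pagination
        records.extend(batch)
        page += 1
```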
Qualifications:
Bachelor's degree in technology, mathematics, statistics, accounting, finance, or a related quantitative discipline.
Over 5 years of experience in data analytics, with Microsoft Fabric experience strongly preferred.
Prior in-depth hands-on experience with system integrations utilizing Boomi is required for this position.
Expert in SQL (a must-have skill).
Highly experienced in cloud technologies, with a strong preference for Microsoft Fabric, DBT, and Azure. Experience with Databricks, Snowflake, and AWS may also be considered.
Proficient in Python programming and data modeling using the Kimball Method (Star Schema).
Skilled in analytical and visualization tools, with a strong preference for Power BI. Experience with Tableau may also be considered.
Experience with, and a passion for, training data science and machine learning models.
Familiarity with Git and source control concepts.
Experience with Databricks, Airflow, Python, AI, AWS, and data integrations with ERPs or Salesforce is a plus.
Ability to work with Azure DevOps and cross-functional teams to define analytics use cases and translate them into technical solutions.
Strong intellectual and analytical curiosity, adaptability, and independence.
Physical Demands
Frequently required to stand
Frequently required to walk
Continually required to sit
Continually required to utilize hand and finger dexterity
Occasionally balance, bend, stoop, kneel or crawl
Continually required to talk or hear
Continually utilize visual acuity to read technical information and/or use a keyboard
Occasionally required to lift/push/carry items up to 25 pounds
Occasionally work near moving mechanical parts
Occasional exposure to outside weather conditions
Occasional exposure to loud noise (examples: shop tool noises, electric motors, moving mechanical equipment)
Work Environment
This role operates in a professional office environment with flexibility for hybrid work. Standard office equipment such as computers, phones, and printers are used. Occasional visits to warehouses or operational sites may be required.
The Perks of Working Here
FloWorks offers a competitive benefits package designed to support your health, financial well-being, and work-life balance. Highlights include:
Medical, Dental & Vision Insurance with multiple plan options
Company-paid Life and Disability Insurance
401(k) with company match
Health Savings & Flexible Spending Accounts
Supplemental coverage (Accident, Critical Illness, Hospital Indemnity)
Employee Assistance Program (includes 3 free counseling sessions)
Identity Theft Protection at discounted rates
This information indicates the general nature and level of work performed by associates in this role. It is not designed to contain a comprehensive inventory of all duties, responsibilities, and qualifications required of associates assigned to this role. This description supersedes any previous or undated descriptions for this role. Management retains the right to add or change the duties of the position at any time. Questions about the duties and responsibilities of this position should be directed to the reporting Manager or Human Resources.
FloWorks is an equal opportunity employer and gives consideration for employment to qualified applicants without regard to race, color, religion, gender, gender identity, sexual orientation, national origin, genetics, disability, age, or protected veteran status. We are committed to fostering a culture where every individual is valued and empowered to contribute to shared success.
FloWorks participates in the US Government's E-Verify program.
$84k-119k yearly est. 4d ago
Staff Data Platform Engineer, Enterprise Data
Radiant Food Store 4.2
El Segundo, CA jobs
Radiant is an El Segundo, CA-based startup building the world's first mass-produced, portable nuclear microreactors. The company's first reactor, Kaleidos, is a 1-megawatt failsafe microreactor that can be transported anywhere power is needed and run for up to five years without the need to refuel. Portable nuclear power with rapid deploy capability can replace similar-sized diesel generators, and provide critical asset support for hospitals, data centers, remote sites, and military bases. Radiant's unique, practical approach to nuclear development utilizes modern software engineering to rapidly achieve safe, factory-built microreactors that leverage existing, well-qualified materials. Founded in 2020, Radiant is on track to test its first reactor next year at the Idaho National Laboratory, with initial customer deliveries beginning in 2028.
About the Role
We're building a new team to own Radiant's internal data infrastructure, executive analytics, and AI capabilities. As a Staff Data Platform Engineer, you'll be hands-on building the integrations and pipelines that connect Radiant's operational systems-Finance, HR, Supply Chain, Manufacturing, and Recruiting-into a unified data platform.
Your first priority will be the ION-Ramp integration: building the pipeline that enables three-way matching (PO + Receipt + Invoice) between our manufacturing execution system and payments platform. This directly reduces manual work for the Finance team. From there, you'll expand to connect 9 enterprise systems, build dbt transformation models, and ensure data quality across the platform.
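A three-way match of this kind can be expressed as a join across the three documents. The sketch below is only illustrative: the column names and sample data are invented and do not reflect the actual ION or Ramp schemas.

```python
import pandas as pd

# Hypothetical extracts; real schemas would come from the MES and payments platform.
pos = pd.DataFrame({"po_id": ["PO-1", "PO-2"], "po_qty": [10, 5], "po_amount": [1000.0, 400.0]})
receipts = pd.DataFrame({"po_id": ["PO-1", "PO-2"], "received_qty": [10, 4]})
invoices = pd.DataFrame({"po_id": ["PO-1", "PO-2"], "invoiced_qty": [10, 5], "invoiced_amount": [1000.0, 400.0]})

# Three-way match: PO, receipt, and invoice must agree before payment is released.
match = pos.merge(receipts, on="po_id", how="outer").merge(invoices, on="po_id", how="outer")
match["qty_ok"] = (match["po_qty"] == match["received_qty"]) & (match["po_qty"] == match["invoiced_qty"])
match["amount_ok"] = match["po_amount"] == match["invoiced_amount"]

# Only mismatches need human review; everything else can flow straight through.
exceptions = match[~(match["qty_ok"] & match["amount_ok"])]
print(exceptions[["po_id", "po_qty", "received_qty", "invoiced_qty"]])
```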
What You'll Do
Build integrations: Configure Fivetran connectors, build custom pipelines, and connect 9 enterprise systems (ION, Ramp, QuickBooks, Rippling, Ashby, Salesforce, SharePoint, Smartsheet, DocuSign)
Develop the ION-Ramp pipeline: Build the Snowflake → Databricks → Ramp pipeline that enables three-way matching for Finance
Write dbt models: Transform raw data into clean, documented, tested models for dashboards and analytics
Ensure data quality: Implement testing, monitoring, and alerting for data pipelines; catch issues before they reach dashboards (see the sketch following this list)
Support dashboards: Work with the PM and stakeholders to ensure the data models support Finance, Operations, and HR/Recruiting dashboards
Document everything: Maintain clear documentation for pipelines, models, and data definitions
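For the data quality point above, a minimal sketch of pre-load checks might look like the following. The table and column names are hypothetical, and a production pipeline would more likely rely on dbt tests or a framework such as Great Expectations.

```python
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of failed checks so bad rows never reach a dashboard."""
    failures = []
    if df["invoice_id"].isna().any():
        failures.append("invoice_id contains nulls")
    if df["invoice_id"].duplicated().any():
        failures.append("invoice_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("negative invoice amounts found")
    return failures


# Hypothetical staged batch; a real pipeline would read this from the warehouse.
batch = pd.DataFrame({"invoice_id": [1, 2, 2], "amount": [100.0, -5.0, 30.0]})
problems = run_quality_checks(batch)
if problems:
    raise ValueError(f"Data quality checks failed: {problems}")
```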
What We're Looking For
Required
5+ years of data engineering or software engineering experience
Python and SQL proficiency-you'll write both daily
Experience with data warehouses-Databricks, Snowflake, BigQuery, or Redshift
ETL/ELT pipeline development-Fivetran, Airbyte, Stitch, or custom pipeline experience
API integration experience-REST, GraphQL; you're comfortable reading API docs and building integrations with enterprise SaaS tools
Data modeling skills-dimensional modeling, schema design, understanding of how data will be used downstream
Preferred
Background in hardware or hardtech companies-you understand manufacturing, supply chain, and physical product development
Experience with dbt (data build tool)
Experience with data quality frameworks and testing (Great Expectations, dbt tests, etc.)
Comfortable building lightweight internal tools-Python/Flask, basic frontend-or using AI coding assistants (Cursor, Copilot) to extend your capabilities
Familiarity with MES, ERP, or supply chain systems (ION, SAP, NetSuite, etc.)
Why Join Radiant
Mission: Clean energy that can go anywhere-few problems matter more
Team: Work alongside exceptional people from SpaceX, Blue Origin, and other top engineering organizations
Compensation: Competitive compensation with equity
Benefits: Health/dental/vision, 401(k), and flexible PTO
Impact: The pipelines you build directly reduce manual work for Finance and give leadership real-time visibility
Variety: You'll work across 9 different systems-Finance, HR, Manufacturing, CRM, Documents-never boring
Modern stack: Databricks, Fivetran, dbt, Airbyte-best-in-class tools
Additional Information
This position requires the ability to work in the United States and eligibility for access to export-controlled information under ITAR/EAR.
Total Compensation and Benefits
Radiant's new hire compensation package includes base salary, substantial equity grants, and comprehensive benefits. Total compensation and level are determined through a holistic evaluation of your interview performance, experience, education, and qualifications.
Benefits and Perks for Eligible Employees:
Stock: Substantial incentive stock plan for all full-time employees.
Medical: 100% premium coverage for base Silver level plan for employee + 50% premium coverage for dependents. Platinum plans available.
One Medical: Sponsored memberships for eligible employees and their dependents.
Vision: 100% premium coverage for top tier plan + 50% for dependents.
Dental: 100% premium coverage for top tier plan (including orthodontia) + 50% for dependents.
Voluntary life, accident, hospital, critical illness, commuter and FSA/HSA are offered as employee contributed benefits.
8 weeks of paid parental leave for all parents. Additional paid pregnancy leave for CA employees.
Daily catered lunch. Free snacks and drinks.
Flexible PTO policy. Remote workday allocation.
Company and team-bonding events, happy hours and in-person camaraderie.
Beautiful El Segundo headquarters close to the Pacific Ocean.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Unfortunately, we are unable to provide visa sponsorship at this time.
This position involves access to technology that is subject to U.S. export controls. Any job offer made will be contingent upon the applicant's capacity to serve in compliance with U.S. export controls.
$115k-154k yearly est. Auto-Apply 20d ago
Senior Data Engineer
Trystar Inc. 4.4
Burnsville, MN jobs
Trystar is at the forefront of advancing power solutions, charged and driven by a committed, dynamic team, tackling complex challenges, and creating innovative solutions. Safety and integrity aren't just buzzwords. They are the north star guiding us as we aspire to wow our customers every day. We've created power solutions that are not only durable and unique but are also the result of serious teamwork from every corner of our organization. Individually and collectively, every team member at Trystar plays for each other and strives to deliver unmatched value and 100% accuracy to our customers every single day.
As a big name and pioneer in power solutions, we're not just part of the power conversation - we're shaping its future, and we're doing so across diverse industry sectors including healthcare, data centers, entertainment, education, government, and commercial construction - just to name a few! We're committed to growing and evolving our product lineup to stay on top of the game and that includes leaning into sustainable, resilient, and renewable energy solutions.
Our cutting-edge headquarters in Faribault, MN, is partially solar- and wind-powered by our own microgrid! You will find that we're also pushing boundaries across the country at our additional facilities in Troy, MI; Houston, TX; Merrimack, NH; Burnsville, MN; Murfreesboro, TN; Waukesha, WI; and in Montreal, Quebec, Canada. Together, we are Trystar, where we power the future and nurture innovation for a brighter tomorrow.
Trystar's team members are our most important asset, and we are in search of a Senior Data Engineer to join our IT team. As a Senior Data Engineer, you will play a crucial role in designing, developing, and maintaining our data platform. Your responsibilities will include building and optimizing data pipelines, ensuring data quality and integrity, and implementing data processing systems that support Trystar's various business needs. Additionally, you will collaborate with cross-functional teams to understand data requirements, develop data models, and provide actionable insights. Your expertise will be instrumental in driving Trystar's data strategy and supporting our commitment to innovation and operational excellence.
This role requires a dynamic individual with a passion for technology, a commitment to delivering exceptional user experiences, and cross-functional collaboration. This is an opportunity to play a pivotal role in driving Trystar's commitment to innovation, customer focus, and operational excellence.
We are looking for people who believe in our guiding principles and values of:
Safety - We believe everyone should leave Trystar facilities in the same or better condition than when they arrived.
Integrity - We're honest, transparent, and committed to doing what's right.
Customer focus - We have relentless focus on our customers and their success.
Right with speed - We use good judgement, make thoughtful decisions quickly, and execute them with purpose and intensity.
Play for each other - We're a team. We show up for each other and we know that through teamwork we achieve greatness.
Champion change - We know adaptation and improvement are requirements to survive and to thrive.
Enjoy the journey - We create an environment where our team feels appreciated and has fun along the way.
In this role you will get to:
Act as the strategic point of contact for all data engineering initiatives, collaborating with cross-functional teams to design, build, and maintain robust data pipelines.
Develop and implement scalable data solutions to support analytics, machine learning, and business intelligence activities.
Optimize and manage the performance of databases, ensuring data integrity and high availability.
Collaborate with stakeholders to gather, document, and translate business requirements into efficient data architecture.
Provide hands-on support and mentoring for junior data engineers, empowering them to effectively contribute to Trystar's data solutions.
Orchestrate data validation and testing procedures to ensure data quality and alignment with operational goals.
Develop and maintain detailed documentation for data processes, customizations, and resolutions.
Track, manage, and communicate data-related updates, ensuring all stakeholders are informed and aligned.
Support data governance initiatives, including the implementation of best practices for data management.
Job Requirements:
BASIC QUALIFICATIONS
Education: Bachelor's degree in Computer Science, Data Engineering & Analytics, or a related field.
Experience: Minimum of 7 years of experience in data engineering, with a focus on designing and implementing data solutions at scale.
Technical Skills: Proficiency in SQL, Python, and ETL tools. Experience with cloud data platforms such as Microsoft Fabric, Azure Synapse, Azure Data Factory, Azure Databricks, Power BI, or similar.
Strong understanding of data modeling, data warehousing, data lakehouses, semantic models, etc.
Proven ability to lead and mentor a team of data engineers, ensuring high performance and continuous improvement.
Excellent analytical and problem-solving skills, with a strategic approach to data management and architecture.
Strong communication skills, capable of translating complex technical concepts into business-friendly language.
Experience in managing large-scale data projects, including planning, execution, and delivery.
Knowledge of data governance principles and best practices, with a commitment to ensuring data quality and integrity.
ADDITIONAL QUALIFICATIONS
Familiarity with digital manufacturing and related systems like ERP, CRM, QMS etc.
Ability to troubleshoot technical issues and implement minor system enhancements.
Strong communication skills, with the ability to explain technical concepts in clear, business-friendly language.
Passion for learning and optimizing technologies and user experiences.
Exceptional organizational skills and attention to detail, with the ability to manage multiple tasks effectively.
Strong interpersonal skills and highly resourceful in both team environments and building business relationships.
Strategic mindset with a hands-on approach to problem-solving.
Willingness and ability to travel 20%.
$80k-110k yearly est. 28d ago
Data Engineer, People Analytics
Crusoe 4.1
San Francisco, CA jobs
Crusoe's mission is to accelerate the abundance of energy and intelligence. We're crafting the engine that powers a world where people can create ambitiously with AI - without sacrificing scale, speed, or sustainability.
Be a part of the AI revolution with sustainable technology at Crusoe. Here, you'll drive meaningful innovation, make a tangible impact, and join a team that's setting the pace for responsible, transformative cloud infrastructure.
About the Role:
A Data Engineer is a builder, an innovator, and a trusted strategic partner. You will sit at the intersection of data engineering, business consulting, and applied AI.
In this role, you will do more than just manage pipelines; you will architect and build the infrastructure powering the People Analytics team's insights and product suite. You will champion the use of practical AI and machine learning to modernize how we understand our workforce-from analyzing unstructured sentiment to predicting talent trends. You will own the full stack-extracting data from HCM and ATS systems like Rippling and Ashby, modeling it in Google Cloud Platform, and bringing it to life in self-service products-while partnering with leaders to design innovative workplace solutions that scale our culture.
What You'll Be Working On:
Full-Stack Data Engineering: Build and maintain resilient ETL pipelines to centralize data from our core HCM and ATS systems into Google Cloud Platform, BigQuery, and other people analytics products.
Semantic Modeling & Self-Service: Architect a robust semantic data layer (using dbt) that translates raw database schemas into business-friendly logic. You will enable non-technical leaders to ask natural language questions and get accurate answers.
Applied AI & Predictive Analytics: Leverage AI and LLMs to unlock insights from unstructured data (e.g., engagement survey comments, interview feedback) and build predictive models for attrition and headcount planning (see the sketch following this list).
Innovative Workplace Solutions: Go beyond dashboards to design data products that solve operational problems. This could mean automating manual HR workflows, building custom apps for internal mobility, or using data to redesign our organizational structure.
Consultative Partnership: Partner proactively with Talent, Finance, and People leaders. You will translate vague business anxieties into rigorous data questions, consulting on the "art of the possible" with modern analytics.
Visualization & Storytelling: Design and deploy high-impact Sigma workbooks that guide executives through complex narratives, ensuring data is not just viewed, but acted upon.
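As one possible illustration of the predictive-analytics item above, the sketch below fits a simple attrition model with scikit-learn. The features and sample data are invented for demonstration; real inputs would come from the HCM/ATS warehouse tables.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical feature set; real inputs would come from the people analytics warehouse.
people = pd.DataFrame({
    "tenure_months": [3, 24, 48, 6, 36, 12, 60, 9],
    "engagement":    [2.1, 4.0, 4.5, 2.8, 3.9, 3.0, 4.7, 2.5],
    "comp_ratio":    [0.85, 1.05, 1.10, 0.90, 1.00, 0.95, 1.15, 0.88],
    "left_company":  [1, 0, 0, 1, 0, 0, 0, 1],   # 1 = attrited
})

features = ["tenure_months", "engagement", "comp_ratio"]
model = LogisticRegression().fit(people[features], people["left_company"])

# Score everyone with a probability of leaving and surface the highest-risk rows.
people["attrition_risk"] = model.predict_proba(people[features])[:, 1]
print(people.sort_values("attrition_risk", ascending=False).head())
```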
What You'll Bring to the Team:
The "Hybrid" Skill Set: You are an Engineer who thinks like a Consultant. You love writing clean Python/SQL code, but you are equally passionate about solving human-centric business problems.
Core Tech Stack Mastery:
Warehouse: Deep experience managing data in Google Cloud Platform
Visualization: Expert proficiency in Sigma or similar BI platforms. You know how to use input tables, workbook parameters, and materialized views to build interactive tools.
HR/Recruiting Systems: Hands-on experience with Ashby, Rippling, or similar. You understand their API nuances and schemas.
Semantic Architecture: Experience building semantic layers that add business-logic context to raw data, ensuring consistency across all reporting.
AI/ML Curiosity: Experience or strong interest in applying AI/ML techniques (NLP, regression analysis, etc.) to people data. You want to use the tools Crusoe is empowering the world to build.
Communication: You can explain complex data architectures to non-technical stakeholders and act as a trusted advisor on data strategy
Bonus Skills:
Experience deploying or working with People Analytics SaaS platforms
Experience with Sigma.
Benefits:
Industry competitive pay
Restricted Stock Units in a fast growing, well-funded technology company
Health insurance package options that include HDHP and PPO, vision, and dental for you and your dependents
Employer contributions to HSA accounts
Paid Parental Leave
Paid life insurance, short-term and long-term disability
Teladoc
401(k) with a 100% match up to 4% of salary
Generous paid time off and holiday schedule
Cell phone reimbursement
Tuition reimbursement
Subscription to the Calm app
MetLife Legal
Company paid commuter benefit; $300/month
Compensation Range
Compensation will be paid in the range of $135,000 - $164,000, plus bonus. Restricted Stock Units are included in all offers. Compensation is determined by the applicant's knowledge, education, and abilities, as well as internal equity and alignment with market data.
Crusoe is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, disability, genetic information, pregnancy, citizenship, marital status, sex/gender, sexual preference/ orientation, gender identity, age, veteran status, national origin, or any other status protected by law or regulation.
$135k-164k yearly Auto-Apply 20h ago
Data Engineer, Manager
The Energy Authority Inc. 4.1
Jacksonville, FL jobs
Job Description
About The Energy Authority
The Energy Authority is a public power-owned, nonprofit corporation with offices in Jacksonville, Florida, and Bellevue (Seattle), Washington. TEA provides public power utilities with access to advanced resources and technology systems so they can respond competitively in the changing energy markets. Through partnership with TEA, utilities benefit from an experienced organization that is singularly focused on deriving the maximum value of their assets from the market.
Join Our Team as a Data Engineering Manager
Are you a strategic technical leader who thrives at the intersection of innovation, data, and collaboration? TEA is looking for a Data Engineering Manager to guide the development of our enterprise data ecosystem and drive cloud-first, data-centric solutions that support both internal teams and external clients. In this high-impact role, you'll partner directly with Managers and Directors across the organization, influencing data strategy while leading a team of talented engineers to build scalable, secure, and business-critical data products. If you love shaping strategy and rolling up your sleeves to solve complex problems-whether designing cloud architectures, reviewing critical implementations, or modeling best practices-this is an opportunity to make your mark. You'll play a pivotal role in TEA's cloud transformation and help define the future of data at TEA.
What You'll Do
Identify and articulate effective cloud-based strategies to meet TEA's evolving data engineering needs
Lead the design and development of business-critical and client-supporting cloud data solutions
Ensure the stability, integrity, security, and efficient operation of cloud data solutions supporting analytics and machine learning initiatives
Lead the development of TEA's Data Lake architecture, including data modeling, ELT processes, and data pipeline standards
Recruit, mentor, and develop engineers to build a high-performing and collaborative team
Influence decision-making and champion best practices across engineering and business stakeholder groups
Technical Skills We're Looking For
Expertise in Microsoft Azure (Azure Data Lake, Azure Databricks, Fabric, Azure Data Factory)
Deep expertise in Azure Databricks, including Notebook development, Workflows, and Asset Bundles
Proven ability to design and implement modern data architectures (data lakes, warehouses, lakehouses, data mesh, streaming pipelines)
Proficiency in Python, PySpark, SQL, and modern data engineering frameworks (see the sketch following this list)
Strong understanding of APIs, data integration patterns, data governance frameworks, and security/compliance standards
Experience optimizing pipelines, architectures, and queries for performance and cost
Knowledge of data governance, metadata management, and data security principles
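For the Python/PySpark item in this list, a minimal batch-transformation sketch is shown below. The paths, table, and column names are illustrative assumptions, not TEA's actual data model.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hourly-load-rollup").getOrCreate()

# Hypothetical meter readings landed in the data lake; paths and columns are illustrative.
readings = spark.read.parquet("/lake/raw/meter_readings")

# Roll raw readings up to hourly totals per meter.
hourly = (
    readings
    .withColumn("reading_hour", F.date_trunc("hour", F.col("reading_ts")))
    .groupBy("meter_id", "reading_hour")
    .agg(F.sum("kwh").alias("kwh_total"), F.count("*").alias("sample_count"))
)

# Write the curated layer, partitioned for downstream query performance.
hourly.write.mode("overwrite").partitionBy("reading_hour").parquet("/lake/curated/hourly_load")
```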
Language Skills
Ability to read, analyze, and interpret technical journals, financial reports, and legal documents
Ability to respond to inquiries from customers, regulatory agencies, or business stakeholders
Ability to write polished professional content such as presentations, speeches, and publications
Ability to effectively present information to senior management, public groups, or boards of directors
Management Responsibilities
Direct supervision of Enterprise Data Team personnel responsible for Data Engineering functions
Interviewing, hiring, training, and developing employees
Planning, assigning, and directing daily work
Conducting performance evaluations and providing ongoing feedback
Rewarding excellence and addressing performance issues
Ensuring compliance with organizational policies and applicable laws
Promoting strong communication, teamwork, and a culture of accountability
Education & Experience
Master's degree (M.A.) or equivalent preferred
4-10 years of related experience and/or technical leadership
Or an equivalent combination of education and experience
If you're ready to lead transformative data initiatives and help shape the future of TEA's data strategy, we'd love to meet you. Apply today and bring your expertise to a team where innovation, collaboration, and impact come together.
TEA Values
TEA employees share a common sense of purpose. When TEA accomplishes its mission, the result is improved quality of life for the citizens and businesses of the communities our clients serve.
TEA employees exceed the expectations of those they serve, deliver services with the highest standards of fair, honest, and ethical behavior, set the standard for service and expertise in our industry, embody a spirit of collaboration, and embrace TEA's founding entrepreneurial spirit by seizing opportunities to deliver value.
If you are self-motivated, driven to deliver excellence, and passionate about your career, TEA is the perfect place for you.
It's YOUR Future. It's OUR Future.
$86k-109k yearly est. 23d ago
Data Engineer, Manager
The Energy Authority 4.1
Jacksonville, FL jobs
The Energy Authority is a public power-owned, nonprofit corporation with offices in Jacksonville, Florida, and Bellevue (Seattle), Washington. TEA provides public power utilities with access to advanced resources and technology systems so they can respond competitively in the changing energy markets. Through partnership with TEA, utilities benefit from an experienced organization that is singularly focused on deriving the maximum value of their assets from the market.
Join Our Team as a Data Engineering Manager
Are you a strategic technical leader who thrives at the intersection of innovation, data, and collaboration? TEA is looking for a Data Engineering Manager to guide the development of our enterprise data ecosystem and drive cloud-first, data-centric solutions that support both internal teams and external clients. In this high-impact role, you'll partner directly with Managers and Directors across the organization, influencing data strategy while leading a team of talented engineers to build scalable, secure, and business-critical data products. If you love shaping strategy and rolling up your sleeves to solve complex problems-whether designing cloud architectures, reviewing critical implementations, or modeling best practices-this is an opportunity to make your mark. You'll play a pivotal role in TEA's cloud transformation and help define the future of data at TEA.
What You'll Do
Identify and articulate effective cloud-based strategies to meet TEA's evolving data engineering needs
Lead the design and development of business-critical and client-supporting cloud data solutions
Ensure the stability, integrity, security, and efficient operation of cloud data solutions supporting analytics and machine learning initiatives
Lead the development of TEA's Data Lake architecture, including data modeling, ELT processes, and data pipeline standards
Recruit, mentor, and develop engineers to build a high-performing and collaborative team
Influence decision-making and champion best practices across engineering and business stakeholder groups
Technical Skills We're Looking For
Expertise in Microsoft Azure (Azure Data Lake, Azure Databricks, Fabric, Azure Data Factory)
Deep expertise in Azure Databricks, including Notebook development, Workflows, and Asset Bundles
Proven ability to design and implement modern data architectures (data lakes, warehouses, lakehouses, data mesh, streaming pipelines)
Proficiency in Python, PySpark, SQL, and modern data engineering frameworks
Strong understanding of APIs, data integration patterns, data governance frameworks, and security/compliance standards
Experience optimizing pipelines, architectures, and queries for performance and cost
Knowledge of data governance, metadata management, and data security principles
Language Skills
Ability to read, analyze, and interpret technical journals, financial reports, and legal documents
Ability to respond to inquiries from customers, regulatory agencies, or business stakeholders
Ability to write polished professional content such as presentations, speeches, and publications
Ability to effectively present information to senior management, public groups, or boards of directors
Management Responsibilities
Direct supervision of Enterprise Data Team personnel responsible for Data Engineering functions
Interviewing, hiring, training, and developing employees
Planning, assigning, and directing daily work
Conducting performance evaluations and providing ongoing feedback
Rewarding excellence and addressing performance issues
Ensuring compliance with organizational policies and applicable laws
Promoting strong communication, teamwork, and a culture of accountability
Education & Experience
Master's degree (M.A.) or equivalent preferred
4-10 years of related experience and/or technical leadership
Or an equivalent combination of education and experience
If you're ready to lead transformative data initiatives and help shape the future of TEA's data strategy, we'd love to meet you. Apply today and bring your expertise to a team where innovation, collaboration, and impact come together.
TEA Values
TEA employees share a common sense of purpose. When TEA accomplishes its mission, the result is improved quality of life for the citizens and businesses of the communities our clients serve.
TEA employees exceed the expectations of those they serve, deliver services with the highest standards of fair, honest, and ethical behavior, set the standard for service and expertise in our industry, embody a spirit of collaboration, and embrace TEA's founding entrepreneurial spirit by seizing opportunities to deliver value.
If you are self-motivated, driven to deliver excellence, and passionate about your career, TEA is the perfect place for you.
It's YOUR Future. It's OUR Future.
$86k-109k yearly est. Auto-Apply 51d ago
Data Engineer
Lee County Electric Cooperative, Inc. 4.4
North Fort Myers, FL jobs
Category: Information Technology
Tracking Code: 864-376
Type: Full-Time/Regular
JOB TITLE: Data Engineer
Work Hours: M-F 8:00am - 5:00pm
Our benefits include:
* Company-wide annual incentive plan
* Medical, vision and dental insurance
* 401(k) plan with a generous 6% company match
* Company funded Pension Plan
* On-site wellness/medical facility
* Company paid Short & Long-Term Disability insurance
* Health Savings Account with an employer contribution
* Flexible Spending Accounts
* Paid time off and paid holidays
* Wellness program with financial rewards
* Tuition reimbursement
* Group life insurance
* Critical Illness and Accident Insurance
LCEC provides reliable, cost-competitive electricity to more than 250,000 members throughout a five-county service territory located in Southwest Florida. We employ approximately 460 skilled employees and are one of more than 900 electric distribution cooperatives located throughout the United States. LCEC has been recognized locally and statewide as an industry leader and continually receives acknowledgment for the work that our employees do in the community along with other civic, environmental and professional honors.
Position Summary: The Data Engineer is responsible for building and maintaining data pipelines, streaming systems, and transformation layers that power LCEC's new Microsoft-centered analytics ecosystem. This role is essential in modernizing our data platform, which integrates Apache Kafka, Apache Spark, Python, MongoDB, SQL Server, DataFrames, RAPIDS, Microsoft Fabric, Power BI, Copilot, and Purview.
Position Responsibilities
* Design, build, and maintain scalable batch and streaming data pipelines that support LCEC's Microsoft Fabric-based analytics ecosystem.
* Develop reliable data ingestion and transformation processes across layered architectures (e.g., Bronze/Silver/Gold) to enable operational analytics, BI, and advanced use cases (see the sketch following this list).
* Engineer high-performance, fault-tolerant solutions for both real-time and batch data processing.
* Design and implement logical and physical data models that align with enterprise analytics, semantic layers, and Power BI consumption.
* Collaborate closely with BI analysts, data consumers, and platform teams to ensure data products are well-modeled, discoverable, and trusted.
* Work in a managed data environment that maintains lineage, metadata, and thorough documentation.
* Apply engineering best practices, including code reviews, monitoring, optimization, and cost-aware design.
* Contribute to emerging analytics capabilities, including AI-assisted and Copilot-enabled data experiences.
* Maintain effective working relationships with employees and customers at all levels within LCEC. Ensure smooth operations, productive communications, and effective understanding during all interpersonal contacts. Provide current and accurate information to all requesters, courteously and in a timely manner.
* Support Storm Restoration efforts when needed. Work in emergency storm situations (e.g., hurricanes) and work long hours (>12 hours per day) for many continuous days/weeks as needed.
* Perform other related duties as assigned.
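As a rough illustration of the Bronze/Silver layering mentioned above, the PySpark sketch below promotes raw events from a Bronze table to a cleaned Silver table. Table paths and column names are hypothetical, and Delta Lake support is assumed to be available in the lakehouse.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: raw events landed as-is; the path and columns here are illustrative only.
bronze = spark.read.format("delta").load("/lakehouse/bronze/outage_events")

# Silver: typed, de-duplicated, and filtered records ready for modeling.
silver = (
    bronze
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])
    .filter(F.col("event_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/lakehouse/silver/outage_events")
```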
Education
* Bachelor's degree in computer science, Engineering, or a related field. (Required)
Experience
Minimum six (6) years' professional experience in data engineering (or a related role), including experience with:
* Apache Kafka, including producers, consumers, topic design, and retention concepts (see the consumer sketch following this list). (Required)
* Integrating data from MongoDB, SQL Server, APIs, and operational systems. (Required)
* Dimensional modeling, including star schemas, fact tables, and slowly changing dimensions. (Required)
* Apache Spark / PySpark for scalable batch and streaming workloads. (Required)
* Microsoft Fabric, including Lakehouse, Warehouse, OneLake, notebooks, and pipelines. (Required)
* Demonstrated experience with Power Platform tools, including Power Apps and Power Automate. (Required)
* Designing and operating ETL/ELT pipelines in production environments. (Required)
* Operating in governed environments using Microsoft Purview. (Required)
* Experience integrating data pipelines with machine learning or MLOps workflows. (Required)
* Experience implementing real-time monitoring, alerting, and observability. (Required)
* Experience optimizing data platforms for cost, performance, and scalability. (Required)
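For the Kafka experience listed above, a minimal consumer sketch using the kafka-python client is shown below; the topic name, broker address, and message fields are assumptions for illustration.

```python
import json
from kafka import KafkaConsumer  # kafka-python client; cluster details below are illustrative

consumer = KafkaConsumer(
    "meter-readings",                       # hypothetical topic name
    bootstrap_servers=["localhost:9092"],
    group_id="lakehouse-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    reading = message.value
    # In a real pipeline this event would be appended to the Bronze layer;
    # here we simply print the parsed fields.
    print(reading["meter_id"], reading.get("kwh"))
```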
Knowledge, Skills, and Abilities
* Advanced proficiency in Python for data processing, ETL/ELT, and automation. (Required)
* Expertise in SQL for complex transformations and performance tuning. (Required)
* Familiarity with Data Vault (or equivalent structured modeling approaches). (Required)
* Ability to design data models that support semantic layers and BI tools. (Required)
* Familiarity with Power BI and semantic modeling (DAX is a plus). (Required)
* Awareness of Copilot / AI-assisted analytics capabilities. (Required)
* Proficiency with Git/GitHub, CI/CD pipelines, and environment management. (Required)
* Strong documentation, communication, and collaboration skills. (Required)
* Strong skills in conceptual, logical, and physical data modeling. (Required)
* Ability to partner effectively with analysts, BI developers, and business stakeholders. (Required)
* Familiarity with Kafka schema registries and event schema governance (Preferred)
* Exposure to Microsoft Fabric Data Science Workloads. (Preferred)
Physical Demands and Working Environment: The physical demands and working environment characteristics described here must be met by an employee to successfully perform the essential functions of the job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
* Physical Demands: Constantly - sitting, talking, hearing; Frequently - repetitive motions, eye/hand/foot coordination; Occasionally - standing, walking, handling, grasping; Rarely - lifting, carrying, pushing, pulling, climbing, balancing, stooping, kneeling, crouching, crawling, reaching, feeling.
* Working Environment: Constantly - air-conditioned office environment; Occasionally - noise; Rarely - extreme cold, extreme heat, humidity, wet conditions, hazards, temperature changes, atmospheric conditions, vibration.
STORM DUTY REQUIREMENTS: Responding to storms may be considered a condition of employment. LCEC provides critical services to our community during an emergency. Employees may be required to participate in the response/recovery activities related to emergencies/disasters to maintain service to our LCEC members. Employees are required to perform their normal job duties or other assigned activities. Proper compensation will be made in accordance with the company's guidelines and procedures.
Please note that at the time a candidate is made a job offer, the candidate will be subject to a background check and a drug screening.
$69k-87k yearly est. 3d ago
Data Engineer
LCEC 4.4
Florida jobs
JOB TITLE: Data Engineer
Work Hours: M-F 8:00am - 5:00pm
Our benefits include:
Company-wide annual incentive plan
Medical, vision and dental insurance
401(k) plan with a generous 6% company match
Company funded Pension Plan
On-site wellness/medical facility
Company paid Short & Long-Term Disability insurance
Health Savings Account with an employer contribution
Flexible Spending Accounts
Paid time off and paid holidays
Wellness program with financial rewards
Tuition reimbursement
Group life insurance
Critical Illness and Accident Insurance
LCEC provides reliable, cost-competitive electricity to more than 250,000 members throughout a five-county service territory located in Southwest Florida. We employ approximately 460 skilled employees and are one of more than 900 electric distribution cooperatives located throughout the United States. LCEC has been recognized locally and statewide as an industry leader and continually receives acknowledgment for the work that our employees do in the community along with other civic, environmental and professional honors.
Position Summary: The Data Engineer is responsible for building and maintaining data pipelines, streaming systems, and transformation layers that power LCEC's new Microsoft-centered analytics ecosystem. This role is essential in modernizing our data platform, which integrates Apache Kafka, Apache Spark, Python, MongoDB, SQL Server, DataFrames, RAPIDS, Microsoft Fabric, Power BI, Copilot, and Purview.
Position Responsibilities
Design, build, and maintain scalable batch and streaming data pipelines that support LCEC's Microsoft Fabric-based analytics ecosystem.
Develop reliable data ingestion and transformation processes across layered architectures (e.g., Bronze/Silver/Gold) to enable operational analytics, BI, and advanced use cases.
Engineer high-performance, fault-tolerant solutions for both real-time and batch data processing.
Design and implement logical and physical data models that align with enterprise analytics, semantic layers, and Power BI consumption.
Collaborate closely with BI analysts, data consumers, and platform teams to ensure data products are well-modeled, discoverable, and trusted.
Work in a managed data environment that maintains lineage, metadata, and thorough documentation.
Apply engineering best practices, including code reviews, monitoring, optimization, and cost-aware design.
Contribute to emerging analytics capabilities, including AI-assisted and Copilot-enabled data experiences.
Maintain effective working relationships with employees and customers at all levels within LCEC. Ensure smooth operations, productive communications, and effective understanding during all interpersonal contacts. Provide current and accurate information to all requesters, courteously and in a timely manner.
Support Storm Restoration efforts when needed. Work in emergency storm situations (e.g., hurricanes) and work long hours (>12 hours per day) for many continuous days/weeks as needed.
Perform other related duties as assigned.
Education
Bachelor's degree in computer science, Engineering, or a related field. (Required)
Experience
Minimum six (6) years' professional experience in data engineering (or a related role), including experience with:
Apache Kafka, including producers, consumers, topic design, and retention concepts. (Required)
Integrating data from MongoDB, SQL Server, APIs, and operational systems. (Required)
Dimensional modeling, including star schemas, fact tables, and slowly changing dimensions. (Required)
Apache Spark / PySpark for scalable batch and streaming workloads. (Required)
Microsoft Fabric, including Lakehouse, Warehouse, OneLake, notebooks, and pipelines. (Required)
Demonstrated experience with Power Platform tools, including Power Apps and Power Automate. (Required)
Designing and operating ETL/ELT pipelines in production environments. (Required)
Operating in governed environments using Microsoft Purview. (Required)
Experience integrating data pipelines with machine learning or MLOps workflows. (Required)
Experience implementing real-time monitoring, alerting, and observability. (Required)
Experience optimizing data platforms for cost, performance, and scalability. (Required)
Knowledge, Skills, and Abilities
Advanced proficiency in Python for data processing, ETL/ELT, and automation. (Required)
Expertise in SQL for complex transformations and performance tuning. (Required)
Familiarity with Data Vault (or equivalent structured modeling approaches). (Required)
Ability to design data models that support semantic layers and BI tools. (Required)
Familiarity with Power BI and semantic modeling (DAX is a plus). (Required)
Awareness of Copilot / AI-assisted analytics capabilities. (Required)
Proficiency with Git/GitHub, CI/CD pipelines, and environment management. (Required)
Strong documentation, communication, and collaboration skills. (Required)
Strong skills in conceptual, logical, and physical data modeling. (Required)
Ability to partner effectively with analysts, BI developers, and business stakeholders. (Required)
Familiarity with Kafka schema registries and event schema governance (Preferred)
Exposure to Microsoft Fabric Data Science Workloads. (Preferred)
Physical Demands and Working Environment: The physical demands and working environment characteristics described here must be met by an employee to successfully perform the essential functions of the job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Physical Demands: Constantly - sitting, talking, hearing; Frequently - repetitive motions, eye/hand/foot coordination; Occasionally - standing, walking, handling, grasping; Rarely - lifting, carrying, pushing, pulling, climbing, balancing, stooping, kneeling, crouching, crawling, reaching, feeling.
Working Environment: Constantly - air-conditioned office environment; Occasionally - noise; Rarely - extreme cold, extreme heat, humidity, wet conditions, hazards, temperature changes, atmospheric conditions, vibration.
STORM DUTY REQUIREMENTS: Responding to storms may be considered a condition of employment. LCEC provides critical services to our community during an emergency. Employees may be required to participate in the response/recovery activities related to emergencies/disasters to maintain service to our LCEC members. Employees are required to perform their normal job duties or other assigned activities. Proper compensation will be made in accordance with the company's guidelines and procedures.
Please note that at the time a candidate is made a job offer, the candidate will be subject to a background check and a drug screening.
$68k-86k yearly est. 3d ago