
Data Engineer jobs at Florida Power and Light

- 539 jobs
  • Data Engineer - Rates & Load Forecasting Systems

    NextEra Energy, Inc. (4.2 company rating)

    Data engineer job at Florida Power and Light

    Florida Power & Light Company is the largest electric utility in the U.S., providing reliable energy to nearly 12 million Floridians. With one of the nation's most fuel-efficient, cost-effective power generation fleets and industry-leading reliability, we're redefining what's possible in energy. Want to be part of something powerful? Join our outstanding team and help shape the future of energy.

    Position Specific Description
    This position supports the Rates & Load Forecasting Systems team, which is responsible for designing, developing, and maintaining the automated data pipelines and analytical infrastructure used by the Rates & Load Forecasting teams. The Data Engineer builds and optimizes data flows from multiple enterprise and external sources to ensure clean, reliable, and timely access to data for regulatory filings, rate design, and forecasting models. The role requires strong technical expertise in SQL, Python, and ETL design, along with a working understanding of the business processes supporting regulatory and financial analysis.

    Job Duties & Responsibilities
    * Design, develop, and maintain automated ETL (Extract, Transform, Load) pipelines to support rate and forecasting data needs.
    * Integrate data from enterprise systems (e.g., SAP, AMI, billing, financials) into structured analytical environments.
    * Implement data validation and monitoring frameworks to ensure accuracy and reliability.
    * Optimize query performance, storage structures, and refresh schedules to improve reporting efficiency.
    * Collaborate with forecasting, rate design, and reporting teams to translate analytical requirements into scalable data solutions.
    * Maintain version-controlled code repositories and technical documentation.
    * Support automation of recurring processes using scripting and orchestration tools.
    * Work with big data and distributed computing platforms; perform other job-related duties as assigned.

    Requirements
    * Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
    * 2-4 years of experience in data engineering or analytics development.
    * Proficiency in SQL and Python.
    * Experience with relational databases and cloud-based data platforms.
    * Understanding of data governance, version control, and CI/CD practices.
    * Strong analytical and problem-solving skills with attention to detail.

    Preferred Qualifications
    * Deep knowledge of customer service and metering domains; experience modeling revenue and forecasting.
    * Programming skills in SQL, Python, and R; experience with Git and software engineering best practices.
    * Familiarity with data warehousing strategies (e.g., EDP, SAP) and customer service, metering, and accounting systems.
    * Experience with AWS, PostgreSQL, and shell scripting; big data/distributed computing.
    * Experience with machine learning and AI; common industry analytical tools and techniques.
    * Understanding of rates and revenue forecasting; ability to coordinate across multiple internal and external SMEs.

    Job Overview
    Employees in this role develop and maintain the automated data pipelines and systems that support the company's rate design, load forecasting, and reporting operations. The position ensures high data quality, efficient processing, and continuous improvement in automation and scalability.
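
    For a concrete flavor of the ETL duties described above, here is a minimal, generic sketch of an extract-transform-load step in Python. The file name, database URL, table, and column names are hypothetical, and the listing does not prescribe any particular library; pandas and SQLAlchemy are assumed purely for illustration.

    ```python
    # Minimal ETL sketch (illustrative only): pull raw meter reads from a CSV export,
    # clean them, and load them into an analytical table. All names are hypothetical.
    import pandas as pd
    from sqlalchemy import create_engine

    SOURCE_FILE = "ami_meter_reads.csv"          # hypothetical AMI extract
    TARGET_URL = "postgresql://localhost/rates"  # hypothetical analytics database
    TARGET_TABLE = "load_forecast_inputs"

    def extract(path: str) -> pd.DataFrame:
        """Read the raw export produced by the upstream system."""
        return pd.read_csv(path, parse_dates=["read_timestamp"])

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        """Basic validation: drop duplicates, reject negative usage, stamp the load time."""
        df = df.drop_duplicates(subset=["meter_id", "read_timestamp"])
        df = df[df["kwh"] >= 0]
        df["loaded_at"] = pd.Timestamp.now(tz="UTC")
        return df

    def load(df: pd.DataFrame) -> None:
        """Append the cleaned rows to the analytical table."""
        engine = create_engine(TARGET_URL)
        df.to_sql(TARGET_TABLE, engine, if_exists="append", index=False)

    if __name__ == "__main__":
        load(transform(extract(SOURCE_FILE)))
    ```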
    $87k-109k yearly est. 4d ago
  • Principal Geospatial Data Scientist

    NextEra Energy (4.2 company rating)

    Data engineer job at Florida Power and Light

    **Company:** NextEra Energy
    **Requisition ID:** 92194

    NextEra Energy Resources is one of America's largest wholesale electricity generators, harnessing diverse energy sources to power progress. We deliver tailored energy solutions that fuel economic growth, strengthen communities, and help customers achieve their energy goals. Ready to make a lasting impact? Take the next step in your career with us!

    **Position Specific Description**
    NextEra Energy is a Fortune 200 company that owns Florida Power & Light Company, the largest electric utility in the United States, serving approximately 12 million residents in Florida. Our diverse energy portfolio includes natural gas, nuclear, renewable energy, and battery storage solutions. With a strong commitment to meeting America's evolving energy needs sustainably, NextEra Energy Resources, LLC, is a leader in energy infrastructure development. Ready to make a meaningful impact? Join our team to elevate your career.

    The Principal Geospatial Data Scientist for the Enterprise NEE.GIS Geospatial DataOps team will drive innovation by integrating cutting-edge data science methodologies, including machine learning (ML), deep learning (DL), and artificial intelligence (AI), into geospatial analytics. This role involves leading cross-functional teams to design, build, and deliver advanced geospatial data and analytics solutions, enhancing operational efficiency and strategic decision-making. We place a strong emphasis on expertise in current technologies and processes, coupled with a proven track record of innovation. Ideal candidates will have extensive experience in geospatial data management and analytics, and demonstrate a passion for developing groundbreaking analyses, processes, or tools utilizing advanced data science techniques.

    Key Responsibilities
    The Principal Geospatial Data Scientist leads complex geospatial data projects by deploying advanced analytical techniques and modeling. This role involves transforming raw spatial data into strategic insights to drive decision-making and business innovation.
    + Acquire and preprocess geospatial data from diverse sources, including satellite imagery, GIS, vendors, and governmental agencies; automate workflows where feasible.
    + Develop and apply sophisticated statistical models and machine learning algorithms to perform large-scale spatial data analyses.
    + Design and implement dynamic visualizations (apps, dashboards, maps, and charts) to effectively communicate geospatial insights to non-technical audiences.
    + Collaborate with cross-functional stakeholders to discern data requirements and develop tailored solutions.
    + Construct predictive models to forecast spatial trends and patterns; continuously refine and advance model efficacy.
    + Synthesize varied data sources into unified geospatial datasets; ensure data integrity through routine audits and quality assurance.
    + Stay abreast of cutting-edge geospatial analytics techniques and GIS technology advancements.
    + Innovate by exploring new data sources and methodologies within geospatial analysis.
    + Perform additional job-related duties as necessary.

    **Preferred Qualifications:**
    + Expert knowledge of coordinate reference systems, including their reprojections and conversions, with a focus on understanding the implications of accuracy and performance impact on large-scale geospatial data analytics.
    + Proven experience in national- or global-scale data analysis and modeling.
    + Proficiency in programming languages, notably Python; familiarity with version control systems like Git and Agile methodologies.
    + Expertise in Safe Software FME and Esri Enterprise GIS platforms, as well as cloud computing environments like AWS, leveraging Kubernetes.
    + Extensive experience with machine learning, deep learning, and artificial intelligence applications.
    + Ability to coordinate multifaceted analysis projects with various subject matter experts to achieve cost-effective and competitive outcomes.
    + Background in the energy sector, particularly in Oil & Gas, Renewables, or Utilities.

    **Desired Qualifications**
    + GISP and/or ASPRS CMS certifications
    + DASCA or similar certification
    + Master's Degree
    + Doctoral Degree
    + Experience: 10+ years

    **Job Overview**
    This position is responsible for leading the development of algorithms, modeling techniques, and optimization methods that support many aspects of NextEra and FPL business. Employees in this role use knowledge of machine learning, optimization, statistics, and applied mathematics, along with abilities in software engineering focused on distributed computing and data storage infrastructure (i.e., "Big Data").

    **Job Duties & Responsibilities**
    + Provide thought leadership, set technical strategy, and identify possible uses of data science methods
    + Explain methods and results to upper-level executives
    + Develop machine learning, optimization, or other modeling solutions
    + Oversee related employee work; learn new techniques being developed
    + Prepare comprehensive documented observations, analyses, and interpretations of results, including technical reports, summaries, protocols, and quantitative analyses
    + Work with big data and distributed computing platforms
    + Develop software and contribute to product development
    + Perform other job-related duties as assigned

    **Required Qualifications**
    + Bachelor's Degree
    + Experience: 6+ years

    **Preferred Qualifications**
    + Master's Degree
    + Doctoral Degree

    NextEra Energy offers a wide range of benefits to support our employees and their eligible family members.

    **Employee Group:** Exempt
    **Employee Type:** Full Time
    **Job Category:** Science, Research, and Technology
    **Organization:** NextEra Energy Project Management, LLC
    **Relocation Provided:** Yes, if applicable

    NextEra Energy is an Equal Opportunity Employer. Qualified applicants are considered for employment without regard to race, color, age, national origin, religion, marital status, sex, sexual orientation, gender identity, gender expression, genetics, disability, protected veteran status or any other basis prohibited by law. NextEra Energy provides reasonable accommodation in its application and selection process for qualified individuals, including accommodations related to compliance with conditional job offer requirements, consistent with federal, state, and local laws. Supporting medical or religious documentation will be required where applicable and permitted by applicable law. To request a reasonable accommodation, please send an e-mail providing your name, telephone number and the best time for us to reach you. Alternatively, you may call **************. Please do not use this line to inquire about your application status.

    NextEra Energy will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.

    NextEra Energy **does not** accept any unsolicited resumes or referrals from **any third-party recruiting firms or agencies**.
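
    The preferred qualifications above emphasize coordinate reference systems and reprojection. As a generic illustration (not NextEra tooling), the sketch below converts WGS84 longitude/latitude points to Web Mercator, assuming the pyproj library and using made-up coordinates.

    ```python
    # Reprojection sketch (illustrative): convert WGS84 lon/lat points to Web Mercator.
    # pyproj is assumed here; the coordinates are arbitrary examples.
    from pyproj import Transformer

    # always_xy=True keeps (longitude, latitude) axis order regardless of the CRS definition.
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)

    points_lonlat = [(-80.06, 26.89), (-81.38, 28.54)]  # hypothetical site locations
    points_mercator = [transformer.transform(lon, lat) for lon, lat in points_lonlat]

    for (lon, lat), (x, y) in zip(points_lonlat, points_mercator):
        print(f"({lon:.2f}, {lat:.2f}) -> ({x:.1f}, {y:.1f})")
    ```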
    $66k-85k yearly est. 2d ago
  • Scada Ignition Engineer

    Hanwha Convergence USA (4.1 company rating)

    Georgetown, TX jobs

    The SCADA Engineer will be responsible for providing leadership and technical expertise in the design, development, and delivery of Hanwha Convergence SCADA/PPC solutions for the renewable energy industry. He or she will design, develop work packages for, troubleshoot, and continuously improve the SCADA system, including RTUs, RTACs, HMIs, and electrical control systems, on large-scale PV and/or BESS projects. He or she will also conduct applicable tests and commissioning in compliance with local and international codes and standards.

    **Attention external recruitment firms: we will not accept any unsolicited resumes at this time. Please do not contact any internal member of our company to discuss the position or to solicit candidates.**

    DUTIES:
    · Lead and manage assigned projects with available resources for successful project completion on schedule and within budget.
    · Provide project status reports to stakeholders, and support risk mitigation measures as needed to maintain project goals and objectives.
    · Lead the development of monitoring and control systems for utility-scale renewable energy projects, including but not limited to solar PV and battery energy storage systems.
    · Provide team oversight in the development of device points lists, IP address lists, logic diagrams, HMI mockups and assets, commissioning test plans, and completion checklists, utilizing company-defined documentation and standards.
    · Work within a team environment to define and implement product design standards and best practices that align with company goals and objectives.
    · Program and commission PPC, SCADA servers, data historians, and HMI systems.
    · Develop engineering work packages, construction work packages, inspection and test procedures, FAT/SAT, commissioning, and operation and maintenance procedures.
    · Identify applicable standards and collateral standards for the diverse applicable sites.
    · Lead any design changes required to ensure standards compliance or continuous improvement.
    · Deliver technical presentations to clients covering, but not limited to, SCADA, PPC (Plant Power Control), and HEIS (Hanwha Energy Integration System).
    · Mentor and train less experienced engineers and technicians.
    · Conduct/facilitate risk analysis activities as required.
    · Perform other duties and/or tasks as required.

    SKILLS/EXPERIENCE/EDUCATION:
    · Bachelor's degree in electrical, electronic, or computer engineering preferred.
    · Minimum 2+ years' direct experience with the Ignition SCADA application; other SCADA application engineering experience considered an asset.
    · Schweitzer Engineering RTAC platform experience considered an asset.
    · Strong knowledge of design, installation, and commissioning of SCADA networks using Fiber Optics, Serial RS-232/RS-485, Ethernet TCP/IP, and MQTT.
    · Strong knowledge of industrial automation protocols, including but not limited to Modbus RTU/TCP, DNP3, and OPC UA/DA.
    · Proficiency in reading and developing diagrams and schematics, including but not limited to power system, networking and control, electrical, mechanical, and civil layouts.
    · Ability to solve problems and identify root causes as part of an investigation.
    · In-depth understanding of power plant operating procedures and control system interaction with governing bodies such as Regional Compliance Entities, Independent System Operators (CAISO and ERCOT experience preferred), Transmission Operators, and Generator Operators.

    LANGUAGE SKILLS:
    · Ability to communicate effectively in English.
    · Communication in Korean is considered an asset.

    WORK ENVIRONMENT:
    · This position can be offered as work from home; however, being in the Georgetown, TX office is preferred, and candidates hired may be eligible for relocation assistance.
    · Fast paced, with priorities that often change.
    · Travel to customer sites is required, including the ability to travel internationally with a valid passport.
    · Must be legally entitled to work in the USA and prepared to travel abroad.

    Hanwha Convergence is proud to be an at-will Equal Opportunity Employer and prohibits discrimination against race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, pregnancy, citizenship, disability, protected veteran status and any other classification protected by applicable federal, state or local law. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. We are committed to the full inclusion of all qualified individuals. As part of this commitment, Hanwha Convergence will provide reasonable accommodations to all qualified individuals with disabilities to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment. Please contact us to request accommodations. Nothing in this statement shall imply implicitly or explicitly a guarantee of employment outside our at-will employment opportunity. You may view your privacy rights by reviewing Hanwha Convergence's Privacy Policy here or by contacting our HR Team for a copy.
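
    Among the protocols listed above, Modbus TCP is simple enough to illustrate in a few lines. The sketch below hand-builds a read-holding-registers request over a plain socket; the IP address, unit ID, and register range are hypothetical, and real SCADA work would normally go through Ignition or a dedicated driver rather than raw frames.

    ```python
    # Modbus TCP sketch (illustrative): read four holding registers starting at address 0.
    # The target IP address, unit ID, and register range are hypothetical.
    import socket
    import struct

    HOST, PORT = "192.168.1.50", 502   # hypothetical RTU/PLC address
    UNIT_ID, START_ADDR, QUANTITY = 1, 0, 4

    def _recv_exact(sock, n):
        """Read exactly n bytes from the socket."""
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("connection closed mid-response")
            buf += chunk
        return buf

    def read_holding_registers(host, port, unit, start, count):
        # MBAP header (transaction id, protocol id, remaining length, unit id)
        # followed by the PDU (function code 0x03, start address, register count).
        request = struct.pack(">HHHBBHH", 1, 0, 6, unit, 0x03, start, count)
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.sendall(request)
            _, _, length, _ = struct.unpack(">HHHB", _recv_exact(sock, 7))
            pdu = _recv_exact(sock, length - 1)   # function code + byte count + data
            if pdu[0] & 0x80:
                raise RuntimeError(f"Modbus exception code {pdu[1]}")
            byte_count = pdu[1]
            return list(struct.unpack(f">{byte_count // 2}H", pdu[2:2 + byte_count]))

    if __name__ == "__main__":
        print(read_holding_registers(HOST, PORT, UNIT_ID, START_ADDR, QUANTITY))
    ```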
    $76k-111k yearly est. 3d ago
  • SAP GRC Engineer

    Summit Group Solutions, LLC (4.4 company rating)

    Issaquah, WA jobs

    The SAP GRC Engineer supports the company's values and business goals as they relate to legal, ethical, and regulatory obligations; protects privacy; and maintains a secure technology environment. SAP GRC Engineers develop and execute security controls, defenses, and countermeasures to intercept and prevent internal/external attacks, infiltration of company data, and compromising of systems and accounts. SAP GRC Engineers research attempted/successful efforts to compromise systems security; design countermeasures; implement and maintain physical, technical, and administrative security controls; and provide information to management regarding the negative impact to the business.

    The SAP GRC Engineer is responsible for the creation and maintenance of General IT control objectives in the area of SAP GRC. This position will be responsible for ensuring that all SAP GRC IT control objectives are in compliance and running at full efficiency. In addition, this role will assist with the daily and monthly reporting of SOD (Segregation of Duties) activities from SAP GRC in support of meeting applicable compliance objectives. This is a cross-functional role, working closely with the SAP Security team and other functional teams to ensure security requirements and solutions meet compliance objectives.

    ROLE
    Provides GRC, security, and technical expertise to support the development of GRC objects to satisfy business requirements.
    Analyzes and administers GRC policies to control physical and virtual system access.
    Identifies and investigates GRC issues and develops solutions that address compliance requirements that can/do impact GRC and security.
    Identifies, develops, and implements mechanisms to detect incidents in order to enhance compliance and support of the standards and procedures.
    Assesses business role requirements, reviews authorization roles, and supports authorizations.
    Demonstrates a comprehensive skill set with testing authorizations for multiple environments and coordinates testing with business/technical users.
    Validates system configurations to ensure the safety of information systems assets and protects information systems from intentional or inadvertent access or destruction.
    Implements best practice when applying knowledge of information systems security standards/practices (e.g., access control and system hardening, system audit and log file monitoring, security policies, and incident handling).
    Identifies GRC gaps that expose Costco to potential exploits and develops short- and long-term prioritized remediation to address those gaps.
    Determines strategy and protocol for network behavior, analysis techniques, and tool implementation.
    Creates dashboards, configures alerts, implements and supports security software platforms, and monitors tools/apps.
    Identifies opportunities for streamlining and increasing effectiveness through continuous process improvement.
    Implements practices, processes, and procedures consistent with Costco's information security policy and IT standards.
    Develops and documents GRC events and incident handling procedures into Playbooks. Ensures that incident documentation is comprehensive, accurate, and complete.
    Triages, prioritizes, investigates, and coordinates security events and incident handling activities.
    Creates and/or remediates GITC (General IT Controls) in support of meeting audit objectives for all SAP modules and their supporting databases within the company SAP landscape (i.e., Finance, Retail, Warehouse Management, Payroll, HANA, etc.).
    Designs IT testing procedures to identify and evaluate risk exposures and determine the effectiveness and efficiency of controls.
    Assists with the creation of effective remediation solutions and/or exception documentation where applicable.
    Serves as the subject matter expert and point of contact for internal and external auditors.
    Assists project teams with creation and implementation of IT control objectives and integration into SAP GRC.
    Assists with the successful completion of the quarterly UAR (User Access Review) audit process.
    Collaborates with Internal Audit in developing, testing, and devising solutions to effectively meet applicable IT control objectives.
    Takes responsibility for continued personal growth in the areas of technology, business knowledge, Costco policies, and platforms.
    Participates in team activities and team planning with regard to improving team skills, awareness, and quality of work.

    REQUIRED
    Minimum of 12 years of experience with SAP GRC Access 10.0 and/or 12.0, with expertise using the following modules: Access Request Management (ARM), Access Risk Analysis (ARA), Emergency Access Management (EAM), User Access Review (UAR), Process Control (PC), SAP ETD.
    Minimum of 7 years of work experience in IT Risk Management, SOX compliance, and/or auditing with a strong background in IT controls.
    Minimum of 7 years of experience with SAP Security across various applications, including but not limited to S/4 HANA, ECC, BW, MDG, Fiori, PI/PO, eWM, and Solution Manager.
    Minimum of 7 years of experience with SOD conflict resolution.
    Direct "hands-on" experience in IT audits and functional experience using SAP GRC.
    Understanding of SAP cloud security.
    Strong understanding of Sarbanes-Oxley (SOX) and other compliance requirements that may impact controls.
    Expertise in working with internal and external auditors.
    Experience developing SAP GRC solutions that address Sarbanes-Oxley requirements.
    Effective communication and technical leadership; ability to fluently speak both technical and business language interchangeably.
    Ability to effectively mentor other team members on SAP compliance.
    Experience in successful project implementation and follow-up; strong time management skills.
    Strong conceptual, analytical, problem-solving, troubleshooting, and resolution skills.
    Ability to monitor and manage the progress of tasks and work independently.
    Ability to design, develop, and maintain SAP user management and security architecture across SAP environments, including hands-on role design and build across a number of complex SAP applications and databases.
    Scheduling flexibility to meet the needs of the business, including 24x7 on-call rotational support.

    RECOMMENDED
    Bachelor's degree in Accounting, Business, Information Technology, or Computer Science preferred.
    Documentation and presentation skills catered to a diverse technical and business audience.
    Technical knowledge of SAP landscapes and roadmaps.
    Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.

    Required Documents: Cover Letter, Resume
    Pay Range: $150,000 - $180,000 DOE plus Bonus and Restricted Stock Units (RSU)
    Location: Hybrid onsite 3 days per week in Issaquah, WA
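
    As a generic illustration of the segregation-of-duties reporting described above (not the SAP GRC product itself), the sketch below checks hypothetical user-role assignments against a conflict matrix; all role names and assignments are invented.

    ```python
    # Segregation-of-duties sketch (illustrative): flag users whose assigned roles
    # form a conflicting pair. Role names and assignments are hypothetical.
    from itertools import combinations

    # Pairs of roles that should not be held by the same user.
    SOD_CONFLICTS = {
        frozenset({"CREATE_VENDOR", "POST_PAYMENT"}),
        frozenset({"CREATE_PURCHASE_ORDER", "APPROVE_PURCHASE_ORDER"}),
    }

    user_roles = {
        "alice": {"CREATE_VENDOR", "DISPLAY_REPORTS"},
        "bob": {"CREATE_VENDOR", "POST_PAYMENT"},          # conflicting combination
        "carol": {"CREATE_PURCHASE_ORDER", "APPROVE_PURCHASE_ORDER"},
    }

    def find_sod_violations(assignments, conflicts):
        """Return {user: [conflicting role pairs]} for daily/monthly SOD reporting."""
        violations = {}
        for user, roles in assignments.items():
            hits = [pair for pair in (frozenset(p) for p in combinations(sorted(roles), 2))
                    if pair in conflicts]
            if hits:
                violations[user] = [tuple(sorted(pair)) for pair in hits]
        return violations

    if __name__ == "__main__":
        for user, pairs in find_sod_violations(user_roles, SOD_CONFLICTS).items():
            print(f"{user}: {pairs}")
    ```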
    $150k-180k yearly 3d ago
  • Senior Software Engineer

    Consol Partners (4.4 company rating)

    Austin, TX jobs

    Sr. Software Engineer (Fintech Startup)
    Direct Hire, W2 (no 3rd parties) - MUST be a US Citizen or Green Card Holder
    Hybrid - Austin 78701

    Required:
    * 5+ years of professional software engineering experience
    * 3+ years in Fintech or Payments
    * Backend expertise in Python, Node, or Go (no Java)
    * Strong API development experience
    * Proven experience designing and scaling cloud-native systems (AWS)
    * Experience with secure payment processing, reconciliation, and data integrity
    * Settlement and ledger accuracy experience
    * PCI DSS/NACHA/SOC2 implementation experience
    * Kafka experience
    * Familiarity with AI/ML model deployment and MLOps best practices

    Perks:
    * 100% company-paid benefits (Medical, Dental, Vision)
    * Competitive base salary + equity ($150-200k DOE)
    * Flexible PTO and hybrid work environment
    * Annual professional development budget
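
    As a generic illustration of the reconciliation and ledger-accuracy work listed in the requirements (not this company's actual system), the sketch below checks that a double-entry ledger balances and that internal payment records match a settlement report; all record structures and amounts are hypothetical.

    ```python
    # Reconciliation sketch (illustrative): verify the ledger balances to zero and
    # match internal payments against an external settlement file. Data is made up.
    from decimal import Decimal

    ledger_entries = [
        {"txn_id": "t1", "account": "cash",       "amount": Decimal("100.00")},
        {"txn_id": "t1", "account": "receivable", "amount": Decimal("-100.00")},
    ]

    internal_payments = {"p1": Decimal("100.00"), "p2": Decimal("42.50")}
    settlement_report = {"p1": Decimal("100.00"), "p2": Decimal("42.49")}  # off by a cent

    def ledger_is_balanced(entries):
        """Double-entry invariant: every transaction's postings must sum to zero."""
        totals = {}
        for e in entries:
            totals[e["txn_id"]] = totals.get(e["txn_id"], Decimal("0")) + e["amount"]
        return all(total == 0 for total in totals.values())

    def reconcile(internal, settled):
        """Return payment ids whose internal and settled amounts disagree."""
        return sorted(pid for pid in internal.keys() | settled.keys()
                      if internal.get(pid) != settled.get(pid))

    if __name__ == "__main__":
        print("ledger balanced:", ledger_is_balanced(ledger_entries))
        print("mismatched payments:", reconcile(internal_payments, settlement_report))
    ```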
    $150k-200k yearly 2d ago
  • Computer/Data Science Engineer (3676)

    Navarro Inc. (4.0 company rating)

    Niskayuna, NY jobs

    Job Description
    Navarro Research and Engineering is recruiting a Computer/Data Science Engineer in Niskayuna, NY. A DOE L clearance or DOD equivalent is required to be considered for this position.

    Navarro Research & Engineering is an award-winning federal contractor dedicated to partnering with clients to advance clean energy and deliver effective solutions for complex challenges in the nuclear and environmental fields. Joining Navarro means being a part of an exceptional team committed to quality and safety while also looking for innovative strategies to create value for the client's success. Headquartered in Oak Ridge, Tennessee, Navarro has active programs in place across the nation for DOE/NNSA, NASA, and the Department of Defense.

    The candidate will be experienced with Microsoft Power Platform tools and will:
    * Develop forms, automate business workflows, and provide data visualizations.
    * Work with the Buyer's subject matter experts (SMEs) to understand the functional requirements needed to meet various KLS office operational streamlining needs.
    * Provide a developer user guide documenting the custom application, workflows, and reporting setup, and develop an end-user guide for the Buyer.
    * Utilize Microsoft Power Platform tools effectively and answer any related technical questions.
    * Develop a project schedule with associated milestones to complete the Power Platform application development following completion of functional requirement gathering and documentation.
    * Give weekly status updates on the project and communicate schedule delays larger than 8 hours.

    Schedule: This position allows for some remote work offsite. 40 hours a week, Monday through Friday, for a minimum of 12 months with the option to extend the duration based upon business needs. The standard workday shall begin no earlier than 6:00 AM and end no later than 5:30 PM (Eastern Standard Time).

    Requirements
    * A DOE L clearance or DOD equivalent is required to be considered for this position.
    * Bachelor's degree in computer science/engineering or a related discipline, or equivalent work experience.
    * 4-6 years of experience in application development; 2+ years of Microsoft Power Platform application development required.
    * Strong verbal and written communication skills.
    * Ability to work independently and in a team environment.
    * Experience developing and maintaining project schedules.
    * Experience gathering and documenting functional requirements.
    * Ability to work in an agile environment and consistently maintain development documentation.
    * Required to learn, observe, and adhere to all pertinent Site Security, Site Safety, and Site Information Security requirements.
    * Knowledge/experience in database development is strongly preferred.
    * Due to the nature of the government contract and/or clearance requirements, US citizenship is required.

    Navarro is an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, race, religion, color, national origin, age, disability, veteran's status or any classification protected by applicable state or local law. EEO Employer/Vet/Disabled

    Benefits
    * Health Care Plan (Medical, Dental & Vision)
    * Retirement Plan (401k)
    * Life Insurance (Basic, Voluntary & AD&D)
    * Paid Time Off (Vacation & Public Holidays)
    * Short Term & Long Term Disability

    Salary: $90K to $180K annually
    $90k-180k yearly 3d ago
  • Sr. Data Engineer - Hybrid (Andover, MA)

    Enel (4.6 company rating)

    Massachusetts jobs

    Who We Are:
    Enel North America is a proven renewables leader delivering clean, flexible and sustainable energy solutions. As part of the Enel Group, we develop, build, own and operate renewable power plants and demand response solutions, with over 11 gigawatts (GW) of installed wind and solar capacity, over 1 GW of energy storage and nearly 5 GW of demand response in the US and Canada. For nearly 25 years, we've reliably powered modern life and driven climate action with our people, partners and communities by putting sustainability at the center of everything we do. Enel is a top-five industry leader for clean power capacity in the US, demand response in North America and utility-scale battery storage in Texas. We are a smart and passionate team working together to build the Enel North America that we want for the long term, one that is founded on strong financial, social and environmental values. Being on our team means being part of lasting progress to create a thriving and more sustainable world for our climate and communities. It means valuing safety, trust, innovation, proactivity, flexibility and respect in all we do. Our vision is ambitious, and we'll get there together.

    The Opportunity:
    Enel Green Power Operations and Maintenance seeks a Senior Data Engineering Specialist to manage ETL, integration, and analytics projects, supporting both O&M and other business lines. The role focuses on data quality, reliable ETL solutions, and actionable analytics. The ideal candidate combines technical expertise with strong problem-solving skills and a passion for turning data into business insights.

    What You'll Do at Enel North America:
    * Develop and manage ETL (Extract, Transform, Load) processes to ingest data from various sources with tools such as Python, Airflow, and Dagster.
    * Integrate structured and unstructured data from internal and external systems.
    * Lead automation of data workflows and ensure data quality and consistency.
    * Administer and optimize large-scale databases (PostgreSQL, MSSQL, NoSQL).
    * Identify and resolve bottlenecks in data processing and storage.
    * Optimize queries and data flows for efficiency and scalability.
    * Lead development of data structures within the AVEVA PI System.
    * Work closely with data scientists, analysts, and business stakeholders to understand data requirements.
    * Translate business needs into technical solutions.
    * Provide technical leadership to empower end users across the organization.
    * Collaborate on technical solutions with other business organizations (e.g., IT and SCADA).
    * Research and implement the use of AI within data engineering applications.

    Who You Are:
    Must have:
    * Strong coding skills in a high-level language such as Python or R.
    * Strong SQL skills.
    * Experience developing with APIs.
    * Experience with cloud-based environments (AWS and Azure).
    * Experience with data lakes and large-scale data pipelines.
    * Fundamental understanding of data storage and processing techniques.
    * Experience building ETL pipelines with tools such as Airflow, Dagster, or similar.
    * Experience with Power BI, Tableau, or a similar BI tool.
    * Proficiency in Microsoft Office.

    Preferred:
    * Project management skills.
    * Experience with machine learning and/or AI data techniques.
    * Experience in developing or administering the AVEVA PI platform, PI Asset Framework, and PI Web API, preferably within the power/utility industry.
    * Experience in PI Suite and/or other real-time data systems.

    What You've Accomplished:
    * Bachelor's degree (Master's a plus) in a quantitative discipline (e.g., statistics, mathematics, operations research, engineering, data science or computer science).
    * Minimum of 3-5 years working with databases, as an administrator or end user.

    Diversity, Equity & Inclusion:
    Enel North America is dedicated to providing equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, gender, national origin, citizenship, age, disability, sexual orientation, or genetic information. We will not discriminate, in any employment decision, against any individual or group on the basis of race, color, religion, sex, gender, national origin, citizenship, age, disability, sexual orientation, genetic information, or veterans/national guard/military reserve status. This shall be done in compliance with all applicable federal, state, and local laws in every location in which Enel North America has facilities. Enel North America maintains a drug-free workplace and performs pre-employment substance abuse testing and background checks, where permitted by law.

    Accessibility - If you require accessibility assistance applying for open positions please contact ************************.

    What Enel North America Offers You:
    The pay range for this position is $118,700.00 to $178,000.00 per year. The base pay actually offered will be based on several factors, including job-related knowledge, skills, work experience, education, and internal equity. At Enel, base pay is one part of your total compensation package. Please see below for additional information on Enel North America rewards.
    * Enel North America offers its regular full-time employees affordable, quality healthcare for you and your family, life insurance and disability benefits to provide security, and retirement benefits to help you plan for your financial future. In addition, we offer an array of other benefits such as flexible spending accounts, tuition reimbursement and a professional development allowance.
    * Benefits are effective as of day one!
    * Some additional perks to working with Enel North America include:
    * 401k with match, fully vested as of day one. Enel-NA matches 100% of the first 4% that you contribute, up to set IRS limits.
    * Generous PTO that supports work/life balance, including 4 weeks annually of vacation as well as personal days, volunteer days, your birthday off, paid holidays, and sick time. Proration may apply during the first year of employment.
    * Paid leave programs.
    * The opportunity to grow and develop your career with the support and mentorship of senior leaders.
    * The opportunity to work for one of the world's most recognizable and respected brands in the energy industry, one that believes by working together we can create a new energy era in which the world can become more sustainable.
    * An employee's eligibility for these benefits shall be subject to the governing documents for such plans and programs and/or company policy. The benefits described above may be modified or eliminated with or without notice in accordance with the governing documents and applicable law.
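
    Since the role names Airflow and Dagster for ETL orchestration, here is a minimal sketch assuming Dagster's asset API; the asset names and sample data are invented for illustration and are not Enel systems.

    ```python
    # Dagster sketch (illustrative): two software-defined assets forming a tiny pipeline,
    # assuming Dagster's asset API. Source data and names are hypothetical.
    import pandas as pd
    from dagster import asset, materialize

    @asset
    def raw_turbine_readings() -> pd.DataFrame:
        """Pretend extract step; in practice this might query the PI Web API or a SQL source."""
        return pd.DataFrame(
            {"turbine_id": ["T1", "T1", "T2"], "power_kw": [1500.0, -5.0, 1720.0]}
        )

    @asset
    def clean_turbine_readings(raw_turbine_readings: pd.DataFrame) -> pd.DataFrame:
        """Quality step: drop physically impossible readings before analytics use."""
        return raw_turbine_readings[raw_turbine_readings["power_kw"] >= 0].reset_index(drop=True)

    if __name__ == "__main__":
        # Materialize both assets in-process; in production these would run on a schedule.
        result = materialize([raw_turbine_readings, clean_turbine_readings])
        print("success:", result.success)
    ```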
    $118.7k-178k yearly 32d ago
  • Senior Data Engineer, Data Platforms

    Global Partners LP (4.2 company rating)

    Waltham, MA jobs

    The Senior Data Engineer, Data Platforms is a pivotal role on our Data Team, with broad responsibility. You're not just managing data; you're pioneering the very platforms that underpin our data and analytics engineering pursuits across the company's extensive landscape. You will own state-of-the-art big data platforms that power the Global Partners data stack, with your work standing as the backbone supporting all data-centric innovations. With platforms like AWS, Snowflake, Dagster, and dbt at your fingertips, and with deployment tools such as Kubernetes, Docker, and Terraform being second nature, you are primed to spearhead our pursuit of data excellence. Your vast experience, extending from data storage best practices to continuously assessing and integrating new technologies, ensures that Global Partners stays ahead of the curve. You are a problem solver and an automation enthusiast who will be responsible for deploying robust solutions for data engineering and data platform management.

    At the heart of it all, you're not just an engineer; you see the art in orchestrating data. As you engage with teams, provide strategic guidance, and champion the consistent adoption of best practices, you're also shaping the future of data analytics. If you're ignited by the prospect of being at the helm of technological evolution, where every decision melds strategy with data, join us. Global Partners offers a collaborative team and an environment where we actively invest to create a culture of data-driven excellence.

    At Global Partners, business starts with people. Since 1933, we've believed in taking care of our customers, our guests, our communities, and each other, and that belief continues to guide us. The Global Spirit is how we work to fuel that long-term commitment to success. As a Fortune 500 company with 90+ years of experience, we're proud to fuel communities, responsibly and sustainably. We show up every day with grit, passion, and purpose: anticipating needs, building lasting relationships, and creating shared value.

    * Architect and implement scalable, cloud-native data platforms that serve as the foundation for all data engineering initiatives across the organization, utilizing technologies such as AWS, GCP, or Azure, Python, Docker, and Kubernetes.
    * Automate deployment (CI/CD) pipelines for data infrastructure and applications, leveraging tools like Jenkins, GitLab CI, or GitHub Actions to ensure rapid, reliable deployments.
    * Implement Infrastructure as Code (IaC) practices using tools such as Terraform or CloudFormation to manage and version control cloud resources.
    * Develop and maintain robust data orchestration workflows using modern tools like Apache Airflow, Dagster, or Prefect, ensuring efficient data processing and transformation.
    * Develop automated solutions and self-service platforms that enable developers to efficiently set up, configure, and monitor their data environments.
    * Optimize data storage and processing systems, including data lakes and data warehouses (e.g., Snowflake, BigQuery, Redshift), to ensure cost-effectiveness and performance at scale.
    * Implement observability and monitoring solutions for data pipelines and infrastructure using tools like Prometheus, Grafana, or DataDog to ensure system reliability and performance.
    * Lead the adoption of DataOps practices, fostering collaboration between data engineering, data science, and operations teams to streamline the entire data lifecycle.

    Additional Job Description:
    * Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent experience in Data Engineering, DataOps, MLOps, or Software Engineering, with a minimum of 5 years' experience (or 7 years' experience in lieu of an applicable degree).
    * Strong proficiency in designing and implementing scalable, cloud-native (containerized) data platforms using Infrastructure as Code (e.g., Terraform, Docker, Kubernetes).
    * Advanced programming skills in Python focusing on data-intensive applications. Strong SQL proficiency and experience with cloud data warehouses (e.g., Snowflake, BigQuery) required.
    * Proven track record in designing and implementing CI/CD pipelines for data infrastructure and applications, using tools like Jenkins, GitLab CI, or GitHub Actions.
    * In-depth knowledge of big data technologies (e.g., Apache Spark, Kafka) and data orchestration tools (e.g., Apache Airflow, Dagster). Experience with data transformation frameworks like dbt and ETL/ELT processes in cloud environments.
    * Strong background in data security, governance, and metadata management. Experience implementing IAM/RBAC policies, encryption, and data access controls in cloud environments.
    * Proficiency in implementing monitoring, logging, and alerting solutions for data infrastructure (e.g., Prometheus, Grafana, ELK stack). Familiarity with serverless architectures is a plus.
    * Ability to design and develop automated tools and self-service platforms enabling efficient data environment setup and management for data scientists and analysts.
    * Experience in optimizing data storage and processing systems for cost-effectiveness and performance at scale. Familiarity with MLOps and integrating ML models into production.
    * Exceptional team player with strong communication skills, ability to work with cross-functional teams, and a willingness to mentor and share knowledge.
    * Proficiency in modern Agile development methodologies, coupled with excellent problem-solving abilities and a metrics-first mindset.
    * Bachelor's Degree

    Pay Range: $136,200.00 - $204,200.00
    The pay range for this position is outlined above. The final amount offered at the start of employment is determined based on factors including, but not limited to, experience level, knowledge, skills, abilities and geographic location, and the Company reserves the right to modify base salary at any time, including for reasons related to individual performance, Company or individual department/team performance and market factors.

    Our Commitments to You
    * Coins! We offer competitive salaries and opportunities for growth. We have an amazing Talent Development Team who create trainings for growth and job development.
    * Health & Wellness - Medical, Dental, Vision and Life Insurance, along with additional wellness support.
    * The Road Ahead - We offer 401k and a match component!
    * Professional Development - We provide tuition reimbursement; this benefit is offered after 6 months of service.

    What to Expect From the Hiring Process
    We value passion and potential. Please apply if you're qualified and interested; we'd love to hear from you. A member of our Talent Acquisition team will review your application and may connect you with the hiring manager if your experience is a strong match. Interviews are conducted virtually and in person, depending on the role. We'll provide more details about next steps if selected to move forward.

    Global Partners LP is an equal opportunity employer. We foster a company culture where ideas from all people help us grow, move and thrive. We embrace the diversity of all applicants and do not discriminate against race, color, religion, sex, age, national origin, sexual orientation, gender identity, disability, protected veteran status or any other basis prohibited by federal, state or local law. If you have a disability and need an accommodation to apply, please contact our recruiting department at ************ or 781-7GP-WORK.

    Disclaimer: At Global Partners, we don't use lie detector tests for any employment decisions. We follow all the rules and regulations, so we need to let you know: In Massachusetts, it's illegal to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
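
    As one possible illustration of the orchestration tools named above, the following is a minimal sketch assuming Prefect 2.x; the flow, task names, and data are hypothetical and not Global Partners code.

    ```python
    # Orchestration sketch (illustrative), assuming Prefect 2.x as one of the workflow
    # tools named in the listing. Source and destination are stand-ins, not real systems.
    from prefect import flow, task, get_run_logger

    @task(retries=3, retry_delay_seconds=30)
    def extract_fuel_prices() -> list[dict]:
        """Pretend extract; a real task might pull from an API or a warehouse stage."""
        return [{"site": "S-101", "price": 3.41}, {"site": "S-102", "price": 3.39}]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        """Fail fast on obviously bad records so downstream marts stay trustworthy."""
        bad = [r for r in rows if r["price"] <= 0]
        if bad:
            raise ValueError(f"rejected {len(bad)} rows with non-positive prices")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        logger = get_run_logger()
        logger.info("would load %d rows into the warehouse here", len(rows))

    @flow(name="daily-fuel-price-pipeline")
    def daily_fuel_price_pipeline():
        load(validate(extract_fuel_prices()))

    if __name__ == "__main__":
        daily_fuel_price_pipeline()
    ```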
    $136.2k-204.2k yearly 2d ago
  • Data Engineer

    QuantumScape Corp (3.9 company rating)

    San Jose, CA jobs

    QuantumScape is on a mission to transform energy storage with solid-state lithium-metal battery technology. The company's next-generation batteries are designed to enable greater energy density, faster charging and enhanced safety to support the transition away from legacy energy sources toward a lower-carbon future.

    About the team:
    The data engineering team at QuantumScape is small, agile, and central to the company's mission. We connect data from all manufacturing development areas to the engineers, analysts, and leaders who use it, enabling tighter feedback loops and driving clarity in decision-making. We achieve this by 1) improving access to data for both humans and AI agents, and 2) automating analytics to deliver actionable insights from our manufacturing development operations.

    What we need:
    As we expand to meet the demands of scaling our pilot line, you will be a critical new member with a high degree of impact. We are seeking a motivated data engineer who wants to help build our data product lifecycle from the ground up, accelerate our use of AI to augment engineering workflows, and directly contribute to the future of energy storage.

    What you'll do:
    * Design & Build Data Pipelines: Design, build, and maintain robust, scalable ETL/ELT pipelines using Python and advanced SQL to process data from manufacturing systems, test hardware, and enterprise databases into our GCP/BigQuery data warehouse.
    * Manage the Data Product Lifecycle: Own new data products end-to-end. You will collaborate with cross-functional stakeholders in Manufacturing, R&D, and Quality to gather requirements, define data models, and deliver trusted, high-quality data solutions.
    * Enable Analytics & BI: Develop and support analytics-ready datasets and interactive dashboards (e.g., using Plotly Dash) to report on production, yield, and key performance metrics for technical and executive audiences.
    * Ensure Data Quality & Reliability: Implement automated testing, validation, and monitoring to ensure data integrity, identify opportunities for improvement, and uphold SLAs for your data products.
    * Automate & Innovate: Serve as a key partner in our AI initiatives by preparing data for machine learning models focused on yield improvement, anomaly detection, and predictive analytics. Write reusable Python libraries to streamline common data analysis and automate engineering workflows.

    Skills you'll need:
    * A bachelor's degree in a technical field (such as Computer Science, Physics, Mathematics, Engineering, or a related quantitative field) plus 2 years of industry experience, or a master's degree in a technical field plus 1 year of industry experience.
    * Fluency in Python and an advanced working knowledge of SQL for complex data transformation and analysis.
    * Experience building, maintaining, and orchestrating ETL and ELT pipelines.
    * Experience with a major cloud data platform and a modern data warehouse (GCP & BigQuery preferred).
    * Experience using git and hosted services such as GitHub for version control, collaboration, and code review.
    * Excellent problem-solving and communication skills for cross-functional collaboration.
    * Excitement to work in a fast-paced, collaborative environment!

    Nice to have:
    * Hands-on experience with "modern data stack" tooling for transformation (e.g., dbt) and orchestration (e.g., Prefect).
    * Comfort with various data models and/or experience with data warehouse design.
    * Hands-on experience with the Python data science stack (Pandas, NumPy, SciPy, scikit-learn).
    * Hands-on experience with DevOps practices: CI/CD, infrastructure-as-code.
    * Hands-on experience building analytics dashboards using Plotly Dash.
    * Familiarity with manufacturing systems: Manufacturing Execution Systems (MES), SCADA, data acquisition, and product genealogy.
    * Understanding of Statistical Process Control (SPC) concepts.

    Compensation & Benefits:
    The expected salary range for this role is from $119,450 to $173,250, and the final salary will be determined by the candidate's experience and educational background. QuantumScape also offers an annual bonus and a generous RSU/equity package as part of its compensation plan. In addition, we offer a tremendous benefits plan including employee paid health care, an Employee Stock Purchase Plan (ESPP), and other benefits.

    We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive benefits and privileges of employment. Please contact us to request an accommodation.
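
    The nice-to-have Statistical Process Control (SPC) concepts can be illustrated generically: the sketch below computes 3-sigma control limits with pandas and flags out-of-control lots, using made-up measurements rather than any QuantumScape data.

    ```python
    # SPC sketch (illustrative): compute 3-sigma control limits for a process metric
    # and flag out-of-control lots. The measurements below are made up.
    import pandas as pd

    measurements = pd.DataFrame({
        "lot": [f"L{i:03d}" for i in range(1, 11)],
        "thickness_um": [5.01, 4.98, 5.03, 4.97, 5.00, 5.02, 4.99, 5.25, 5.01, 4.96],
    })

    def add_control_flags(df: pd.DataFrame, col: str) -> pd.DataFrame:
        """Append center line, upper/lower control limits, and an out-of-control flag."""
        center = df[col].mean()
        sigma = df[col].std(ddof=1)
        df = df.assign(center=center, ucl=center + 3 * sigma, lcl=center - 3 * sigma)
        df["out_of_control"] = (df[col] > df["ucl"]) | (df[col] < df["lcl"])
        return df

    if __name__ == "__main__":
        flagged = add_control_flags(measurements, "thickness_um")
        print(flagged[flagged["out_of_control"]])
    ```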
    $119.5k-173.3k yearly 27d ago
  • BI Data Engineer

    NCS Multistage LLC (4.1 company rating)

    Houston, TX jobs

    Job Title - Business Intelligence (BI) Data Engineer
    Department - Information Technology
    Reports to - Director IT Infrastructure & Ops

    The Business Intelligence (BI) Data Engineer will be responsible for designing, building, and maintaining the data infrastructure, pipelines, and analytics solutions that drive data-informed decision making across the organization. This role bridges data engineering and business intelligence, ensuring data is clean, reliable, and transformed into meaningful insights through platforms like Microsoft Fabric, Power BI, SQL Server, and Snowflake. This role requires the ability to solve complex data challenges, experience across cloud and on-premises platforms, and the ability to turn raw data into trusted, actionable intelligence for business stakeholders.

    Key Areas of Responsibility

    Data Engineering & Pipelines
    * Design and implement relational and dimensional databases, including schema design, indexing strategies, and performance tuning
    * Develop, maintain, and optimize ETL/ELT pipelines using Apache Airflow or Azure Data Factory (Fabric Data Factory)
    * Design scalable data models and pipelines for SQL Server and Snowflake
    * Deliver reporting and analytics solutions using Microsoft BI, including Fabric and Power BI
    * Ensure high availability, reliability, and performance of data processes

    Business Intelligence & Analytics
    * Build, maintain, and optimize dashboards and reports in Power BI and Fabric
    * Translate complex data sets into clear visualizations and metrics for business stakeholders
    * Partner with teams to identify KPIs and deliver actionable insights

    Data Governance & Quality
    * Implement and monitor data validation, error handling, and performance tuning strategies
    * Contribute to best practices in data security, compliance, and governance

    Collaboration & Strategy
    * Work closely with data scientists, analysts, and business units to support cross-functional analytics initiatives
    * Participate in architectural discussions to improve scalability and efficiency of data solutions

    Core Database Administration (DBA) Skills

    Backup, Recovery & High Availability
    * Design and implement backup and restore strategies (full, differential, transaction log backups)
    * Knowledge of disaster recovery planning and RPO/RTO requirements
    * Experience with high availability solutions: SQL Server (Always On Availability Groups, Failover Clustering); Snowflake (Time Travel, Fail-safe)
    * Ability to test and document recovery processes

    Performance Monitoring & Tuning
    * Diagnose slow queries, deadlocks, and bottlenecks
    * Use tools like SQL Profiler, Extended Events, DMVs, Query Store, or Performance Monitor
    * Tune indexes, statistics, and query plans
    * Optimize ETL job performance and concurrency

    Security & Compliance
    * Implement role-based access control (RBAC), encryption at rest/in transit, and auditing
    * Understand GDPR, SOC2, HIPAA, or other compliance frameworks
    * Manage user provisioning, privilege management, and data masking

    Maintenance & Automation
    * Set up and monitor database maintenance plans (index rebuilds, integrity checks)
    * Automate housekeeping tasks via SQL Agent, PowerShell, or Fabric pipelines

    Capacity Planning & Storage Management
    * Forecast storage growth and manage file groups/partitions
    * Understand I/O characteristics and underlying hardware/cloud configurations

    Advanced/Strategic DBA Skills
    * Capacity and scalability planning for BI workloads
    * Data lifecycle management and archiving strategies
    * Collaboration with data architects to align database design with business goals
    * Documentation and governance of data assets and metadata

    Cloud DBA/Modern Data Platform Skills (for hybrid or cloud environments)
    * Snowflake administration: roles, warehouses, resource monitors, credit usage optimization
    * Azure SQL/Synapse/Fabric Data Warehouse administration
    * Familiarity with IAM, networking, and cost control in cloud data platforms
    * Experience with infrastructure as code (IaC) tools for database provisioning (e.g., Terraform, ARM templates)

    * Support and uphold HS&E policies and procedures of NCS and the customer
    * Align individual goals with NCS corporate goals, while adhering to the NCS Promise
    * Participate in your Personal Development for Success (PDS)
    * Other duties, relevant to the position, shall be assigned as required

    Knowledge, Skills and Abilities
    * Bachelor's degree in Computer Science, Information Technology, or equivalent experience
    * 3+ years of experience in BI development, data engineering, or similar roles
    * Strong proficiency in SQL Server (database design, query optimization, stored procedures, performance tuning)
    * Hands-on experience with Snowflake (warehousing, schema design, data sharing, performance optimization)
    * Practical knowledge of Apache Airflow or Azure Data Factory (Fabric Data Factory) for orchestration and workflow management
    * Proficiency with the Microsoft BI stack, including Fabric and Power BI
    * Track record of building and maintaining well-designed databases, complex data pipelines, and reporting solutions
    * Strong analytical skills and ability to explain technical concepts clearly to business audiences
    * Experience with Python or other scripting languages for data manipulation
    * Knowledge of CI/CD practices for data pipeline deployment
    * Exposure to data governance frameworks and compliance standards
    * Familiarity with APIs and data integration tools
    * Understanding of AI-powered BI tools, including how to prepare and connect datasets for Microsoft Copilot in Power BI/Fabric
    * Awareness of how to design data models for AI-driven analytics and natural language queries

    Additional Information
    FLSA Status: Exempt
    Employment Classification: Full-time, Regular
    Work schedule: 5 days on, 2 days off; Monday to Friday 8:00am - 5:00pm, on-call 24/7 for support
    Travel: 15-20% domestic travel required; some international travel may be required
    Target Discretionary Bonus: Eligible
    Special Equipment: Laptop
    Criminal background check required for all positions
    Safety sensitive positions will require additional pre-employment testing

    Core Competencies
    * Teamwork/Collaboration - Able to work cooperatively with other individuals
    * Service Focus - Builds & maintains customer satisfaction and provides excellent service to internal & external customers
    * Decision Making - Able to make decisions and solve problems of varied levels of complexity using a logical, systematic, and sequential approach
    * Ethics & Integrity - Trustworthiness and ethical behavior with consideration for impact & consequence when making decisions/taking action
    * Problem Solving - Ability to approach a problem by using a logical, systematic, sequential approach
    * Continuous Improvement - Ongoing improvement of products, services, or processes through incremental & breakthrough improvements
    * Accountability - Obligation or willingness to be answerable for an outcome
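
    The backup duties listed above are commonly scripted; the following is a rough sketch that issues a SQL Server full backup through pyodbc. The server, database, driver, and backup path are hypothetical placeholders, and BACKUP DATABASE cannot run inside a transaction, hence the autocommit connection.

    ```python
    # Full-backup sketch (illustrative): run a SQL Server full backup via pyodbc.
    # Server, database, and path are hypothetical placeholders, not a real environment.
    import datetime
    import pyodbc

    SERVER = "sql-prod-01"                 # hypothetical server name
    DATABASE = "SalesMart"                 # hypothetical database
    BACKUP_DIR = r"\\backup-share\sql"     # hypothetical UNC path

    def run_full_backup(server: str, database: str, backup_dir: str) -> str:
        stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
        backup_file = f"{backup_dir}\\{database}_FULL_{stamp}.bak"
        conn_str = (
            "DRIVER={ODBC Driver 17 for SQL Server};"
            f"SERVER={server};DATABASE=master;Trusted_Connection=yes;"
        )
        # BACKUP DATABASE cannot run inside a user transaction, so use autocommit.
        with pyodbc.connect(conn_str, autocommit=True) as conn:
            # Values are inlined for brevity; real code should validate these inputs.
            conn.execute(
                f"BACKUP DATABASE [{database}] "
                f"TO DISK = N'{backup_file}' WITH INIT, CHECKSUM, STATS = 10"
            )
        return backup_file

    if __name__ == "__main__":
        print("backup written to", run_full_backup(SERVER, DATABASE, BACKUP_DIR))
    ```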
    $84k-119k yearly est. 56d ago
  • BI Data Engineer

    NCS Multistage LLC 4.1company rating

    Houston, TX jobs

    Job Description

    Job Title: Business Intelligence (BI) Data Engineer
    Department: Information Technology
    Reports to: Director, IT Infrastructure & Ops

    The Business Intelligence (BI) Data Engineer will be responsible for designing, building, and maintaining the data infrastructure, pipelines, and analytics solutions that drive data-informed decision making across the organization. This role bridges data engineering and business intelligence, ensuring data is clean, reliable, and transformed into meaningful insights through platforms like Microsoft Fabric, Power BI, SQL Server, and Snowflake. The role requires the ability to solve complex data challenges, experience across cloud and on-premises platforms, and the ability to turn raw data into trusted, actionable intelligence for business stakeholders.

    Key Areas of Responsibility

    Data Engineering & Pipelines
    * Design and implement relational and dimensional databases, including schema design, indexing strategies, and performance tuning
    * Develop, maintain, and optimize ETL/ELT pipelines using Apache Airflow or Azure Data Factory (Fabric Data Factory)
    * Design scalable data models and pipelines for SQL Server and Snowflake
    * Deliver reporting and analytics solutions using Microsoft BI, including Fabric and Power BI
    * Ensure high availability, reliability, and performance of data processes

    Business Intelligence & Analytics
    * Build, maintain, and optimize dashboards and reports in Power BI and Fabric
    * Translate complex data sets into clear visualizations and metrics for business stakeholders
    * Partner with teams to identify KPIs and deliver actionable insights

    Data Governance & Quality
    * Implement and monitor data validation, error handling, and performance tuning strategies
    * Contribute to best practices in data security, compliance, and governance

    Collaboration & Strategy
    * Work closely with data scientists, analysts, and business units to support cross-functional analytics initiatives
    * Participate in architectural discussions to improve scalability and efficiency of data solutions

    Core Database Administration (DBA) Skills

    Backup, Recovery & High Availability
    * Design and implement backup and restore strategies (full, differential, transaction log backups)
    * Knowledge of disaster recovery planning and RPO/RTO requirements
    * Experience with high availability solutions: SQL Server (Always On Availability Groups, Failover Clustering); Snowflake (Time Travel, Fail-safe)
    * Ability to test and document recovery processes

    Performance Monitoring & Tuning
    * Diagnose slow queries, deadlocks, and bottlenecks
    * Use tools like SQL Profiler, Extended Events, DMVs, Query Store, or Performance Monitor
    * Tune indexes, statistics, and query plans
    * Optimize ETL job performance and concurrency

    Security & Compliance
    * Implement role-based access control (RBAC), encryption at rest/in transit, and auditing
    * Understand GDPR, SOC2, HIPAA, or other compliance frameworks
    * Manage user provisioning, privilege management, and data masking

    Maintenance & Automation
    * Set up and monitor database maintenance plans (index rebuilds, integrity checks)
    * Automate housekeeping tasks via SQL Agent, PowerShell, or Fabric pipelines

    Capacity Planning & Storage Management
    * Forecast storage growth and manage file groups/partitions
    * Understand I/O characteristics and underlying hardware/cloud configurations

    Advanced/Strategic DBA Skills
    * Capacity and scalability planning for BI workloads
    * Data lifecycle management and archiving strategies
    * Collaboration with data architects to align database design with business goals
    * Documentation and governance of data assets and metadata

    Cloud DBA/Modern Data Platform Skills (for hybrid or cloud environments)
    * Snowflake administration: roles, warehouses, resource monitors, credit usage optimization
    * Azure SQL/Synapse/Fabric Data Warehouse administration
    * Familiarity with IAM, networking, and cost control in cloud data platforms
    * Experience with infrastructure as code (IaC) tools for database provisioning (e.g., Terraform, ARM templates)

    Other Responsibilities
    * Support and uphold HS&E policies and procedures of NCS and the customer
    * Align individual goals with NCS corporate goals, while adhering to the NCS Promise
    * Participate in your Personal Development for Success (PDS)
    * Other duties, relevant to the position, shall be assigned as required

    Knowledge, Skills and Abilities
    * Bachelor's degree in Computer Science, Information Technology, or equivalent experience
    * 3+ years of experience in BI development, data engineering, or similar roles
    * Strong proficiency in SQL Server (database design, query optimization, stored procedures, performance tuning)
    * Hands-on experience with Snowflake (warehousing, schema design, data sharing, performance optimization)
    * Practical knowledge of Apache Airflow or Azure Data Factory (Fabric Data Factory) for orchestration and workflow management
    * Proficiency with the Microsoft BI stack, including Fabric and Power BI
    * Track record of building and maintaining well-designed databases, complex data pipelines, and reporting solutions
    * Strong analytical skills and ability to explain technical concepts clearly to business audiences
    * Experience with Python or other scripting languages for data manipulation
    * Knowledge of CI/CD practices for data pipeline deployment
    * Exposure to data governance frameworks and compliance standards
    * Familiarity with APIs and data integration tools
    * Understanding of AI-powered BI tools, including how to prepare and connect datasets for Microsoft Copilot in Power BI/Fabric
    * Awareness of how to design data models for AI-driven analytics and natural language queries

    Additional Information
    * FLSA Status: Exempt
    * Employment Classification: Full-time, Regular
    * Work schedule: 5 days on, 2 days off; Monday to Friday, 8:00am - 5:00pm; on-call 24/7 for support
    * Travel: 15-20% domestic travel required; some international travel may be required
    * Target Discretionary Bonus: Eligible
    * Special Equipment: Laptop
    * Criminal background check required for all positions
    * Safety sensitive positions will require additional pre-employment testing

    Core Competencies
    * Teamwork/Collaboration - Able to work cooperatively with other individuals
    * Service Focus - Builds & maintains customer satisfaction and provides excellent service to internal & external customers
    * Decision Making - Able to make decisions and solve problems of varied levels of complexity using a logical, systematic, and sequential approach
    * Ethics & Integrity - Trustworthiness and ethical behavior with consideration for impact & consequence when making decisions/taking action
    * Problem Solving - Ability to approach a problem by using a logical, systematic, sequential approach
    * Continuous Improvement - Ongoing improvement of products, services, or processes through incremental & breakthrough improvements
    * Accountability - Obligation or willingness to be answerable for an outcome
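    For readers unfamiliar with the orchestration work this listing describes, here is a minimal sketch of a SQL Server to Snowflake incremental ELT task, assuming Airflow 2.x, pyodbc, and the snowflake-connector-python package. The server names, credentials, and the dbo.sales/stg_sales tables are placeholder assumptions for illustration, not NCS systems.

```python
"""Illustrative only: a minimal Airflow DAG sketching SQL Server -> Snowflake ELT."""
from datetime import datetime

import pyodbc
import snowflake.connector
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def sqlserver_to_snowflake_elt():

    @task
    def incremental_copy(watermark: str = "2024-01-01") -> int:
        # Extract: pull only rows changed since the watermark (incremental ELT).
        src = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=sqlprod;DATABASE=erp;Trusted_Connection=yes;"
        )
        with src:
            rows = src.cursor().execute(
                "SELECT sale_id, amount, updated_at FROM dbo.sales WHERE updated_at > ?",
                watermark,
            ).fetchall()

        # Load: land the rows in a Snowflake staging table; SQL models reshape them downstream.
        tgt = snowflake.connector.connect(
            account="my_account", user="etl_user", password="***",
            warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
        )
        with tgt.cursor() as cur:
            cur.executemany(
                "INSERT INTO stg_sales (sale_id, amount, updated_at) VALUES (%s, %s, %s)",
                [tuple(r) for r in rows],
            )
        tgt.close()
        return len(rows)

    incremental_copy()


sqlserver_to_snowflake_elt()
```

    In practice a role like this would track the watermark in a control table and stage larger loads through files rather than row-by-row inserts; the sketch only shows the shape of the extract/load pattern.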
    $84k-119k yearly est. 26d ago
  • Data Engineer

    Floworks International LLC 4.2company rating

    Houston, TX jobs

    FloWorks is a leading, privately held specialty industrial supplier of pipe, valves, fittings, and related products, as well as a provider of technical solutions to the energy and industrial sectors. Headquartered in Houston, Texas, FloWorks is dedicated to delivering exceptional products, expertise, and service to its customers.

    Job Information: As the Data Engineer, you are responsible for building, managing, and optimizing cloud-based data solutions to support business processes and analytics across the organization. This role requires expertise in ETL development, data modeling, cloud technologies, and business intelligence to ensure data integrity, insightful analysis, and effective reporting.

    Key Responsibilities:
    * Build and manage cloud ETL processes for data extraction, transformation, and loading from multiple sources into data lakes and data warehouses, primarily within Microsoft Fabric
    * Apply business rules, audit, and stage data to ensure data integrity and compliance
    * Develop fact and dimension tables to support Power BI report development and other business intelligence needs
    * Create visualizations and reports to meet business requirements and support decision-making
    * Provide business analysis, problem-solving, and creativity to identify KPIs and metrics that drive business goals
    * Ensure timely and accurate performance on assigned projects, maintaining compliance with project budgets and deadlines
    * Proactively engage in projects, recognize and resolve problems, and implement solutions independently
    * Collaborate with cross-functional teams to gather complete datasets and communicate findings company-wide
    * Train stakeholders on best practices for data reporting and self-service analytics

    Qualifications:
    * Bachelor's degree in technology, mathematics, statistics, accounting, finance, or a related quantitative discipline
    * Over 2 years of experience in data analytics
    * Expert in SQL (a must-have skill)
    * Highly experienced in cloud technologies, with a strong preference for Microsoft Fabric, dbt, and Azure; experience with Databricks, Snowflake, and AWS may also be considered
    * Proficient in Python programming and data modeling using the Kimball Method (star schema)
    * Skilled in analytical and visualization tools, with a strong preference for Power BI; experience with Tableau may also be considered
    * Experience with, and a passion for, training data science and machine learning models
    * Familiarity with Git and source control concepts
    * Experience with Databricks, Airflow, Python, AI, AWS, and data integrations with ERPs or Salesforce is a plus
    * Ability to work with Azure DevOps and cross-functional teams to define analytics use cases and translate them into technical solutions
    * Strong intellectual and analytical curiosity, adaptability, and independence

    Physical Demands
    * Frequently required to stand
    * Frequently required to walk
    * Continually required to sit
    * Continually required to utilize hand and finger dexterity
    * Occasionally balance, bend, stoop, kneel, or crawl
    * Continually required to talk or hear
    * Continually utilize visual acuity to read technical information and/or use a keyboard
    * Occasionally required to lift/push/carry items up to 25 pounds
    * Occasionally work near moving mechanical parts
    * Occasional exposure to outside weather conditions
    * Occasional loud noise (examples: shop tool noises, electric motors, moving mechanical equipment)

    Work Environment
    This role operates in a professional office environment with flexibility for hybrid work. Standard office equipment such as computers, phones, and printers is used. Occasional visits to warehouses or operational sites may be required.

    The Perks of Working Here
    FloWorks offers a competitive benefits package designed to support your health, financial well-being, and work-life balance. Highlights include:
    * Medical, Dental & Vision Insurance with multiple plan options
    * Company-paid Life and Disability Insurance
    * 401(k) with company match
    * Health Savings & Flexible Spending Accounts
    * Supplemental coverage (Accident, Critical Illness, Hospital Indemnity)
    * Employee Assistance Program (includes 3 free counseling sessions)
    * Identity Theft Protection at discounted rates

    This information indicates the general nature and level of work performed by associates in this role. It is not designed to contain a comprehensive inventory of all duties, responsibilities, and qualifications required of associates assigned to this role. This description supersedes any previous or undated descriptions for this role. Management retains the right to add or change the duties of the position at any time. Questions about the duties and responsibilities of this position should be directed to the reporting Manager or Human Resources.

    FloWorks is an equal opportunity employer and gives consideration for employment to qualified applicants without regard to race, color, religion, gender, gender identity, sexual orientation, national origin, genetics, disability, age, or protected veteran status. FloWorks is committed to fostering a culture where every individual is valued and empowered to contribute to shared success. FloWorks participates in the US Government's E-Verify program.
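    The listing above emphasizes Kimball-style fact and dimension modeling. Here is a tiny, self-contained pandas sketch of that pattern; the raw_orders columns and the surrogate-key approach are assumptions for illustration, not FloWorks' actual model.

```python
"""Illustrative only: a minimal Kimball-style star-schema build in pandas."""
import pandas as pd

raw_orders = pd.DataFrame(
    {
        "order_id": [1001, 1002, 1003],
        "order_date": ["2024-03-01", "2024-03-01", "2024-03-02"],
        "customer_name": ["Acme Valve Co", "Gulf Pipe LLC", "Acme Valve Co"],
        "customer_region": ["TX", "LA", "TX"],
        "net_amount": [5200.0, 1875.5, 990.0],
    }
)

# Dimension: one row per customer, with a surrogate key independent of source IDs.
dim_customer = (
    raw_orders[["customer_name", "customer_region"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(customer_key=lambda d: d.index + 1)
)

# Fact: one row per order, carrying the surrogate key plus additive measures.
fact_sales = raw_orders.merge(dim_customer, on=["customer_name", "customer_region"])[
    ["order_id", "order_date", "customer_key", "net_amount"]
]

print(dim_customer)
print(fact_sales)
```

    In a Fabric or dbt implementation the same fact/dimension split would be expressed as SQL models; the pandas version only shows the modeling idea.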
    $84k-119k yearly est. 60d ago
  • Data Engineer I - 009267

    EOG Resources 4.9company rating

    Houston, TX jobs

    The Data Engineer I is responsible for implementing business application services and integration projects. These projects involve developing, maintaining, and troubleshooting REST APIs and SQL/PL-SQL code bases in development, staging, and production environments specific to the oil and gas E&P cycle.
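    As a rough illustration of the REST-API-over-SQL work this listing describes, here is a minimal FastAPI endpoint backed by a parameterized query. FastAPI, the SQLite stand-in database, and the wells table schema are assumptions for the sketch; a production service would target the actual Oracle/SQL Server backend with connection pooling.

```python
"""Illustrative only: a minimal REST endpoint exposing well header data."""
import sqlite3

from fastapi import FastAPI, HTTPException

app = FastAPI()
DB_PATH = "eandp_demo.db"  # placeholder stand-in for the real database


@app.get("/wells/{well_id}")
def get_well(well_id: str) -> dict:
    # Parameterized query: avoids SQL injection, mirroring PL/SQL bind-variable habits.
    conn = sqlite3.connect(DB_PATH)
    try:
        row = conn.execute(
            "SELECT well_id, well_name, basin, spud_date FROM wells WHERE well_id = ?",
            (well_id,),
        ).fetchone()
    finally:
        conn.close()
    if row is None:
        raise HTTPException(status_code=404, detail="well not found")
    return dict(zip(["well_id", "well_name", "basin", "spud_date"], row))
```

    Run with `uvicorn module_name:app` against a database that contains the assumed wells table.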
    $96k-123k yearly est. 21d ago
  • Data Science Engineer

    Goodnight Midstream 4.2company rating

    Dallas, TX jobs

    We're seeking an outgoing and collaborative Data Science Engineer to join our team. You'll play a key role in building and integrating data tools and solutions that will directly impact our business operations. Goodnight Midstream is predominantly a MS Power Platform, Palantir Foundry, and Ignition/SCADA shop. This is an exempt-level position based out of our Dallas office, working a hybrid schedule. Applicants must be currently authorized to work in the United States on a full-time basis; Goodnight Midstream is not able to sponsor applicants for a work visa.

    This position requires someone who is not only technically skilled but also a great communicator, capable of working effectively across multiple teams to understand their needs and deliver innovative solutions. You'll be instrumental in developing and maintaining our data infrastructure, with a particular focus on integrating with industrial SCADA systems and SQL databases to drive insights and improve efficiency.

    Responsibilities:
    * Serve as an engineering resource for the development of Data Science projects.
    * Build and maintain data pipelines to support data-driven decision making across the organization.
    * Integrate data from diverse sources, including, but not limited to, APIs, industrial SCADA systems, and various SQL databases.
    * Develop and deploy AI-powered tools and models to solve complex business problems.
    * Collaborate with cross-functional teams, including engineers, operations, and management, to define project requirements and deliver effective solutions.
    * Manage data science projects from conception to completion, ensuring timely and successful delivery using professional project management tools.
    * Train end users on optimizing their use of the Data Science data platform.
    * Utilize a variety of AI tools to automate processes, enhance data analysis, and improve overall operational intelligence.
    * Manage and maintain the company's schema and data map.

    Education, Experience, and Technical Skills:
    * A bachelor's degree is required. While a technical background is preferred, we welcome applicants from a variety of educational fields, including engineering disciplines such as mechanical or electrical engineering.
    * Candidates should have at least two years of professional experience in a data-focused role, with proven expertise in building and integrating data systems.
    * Experience and proficiency with the following technologies is required: Python, Microsoft Power Platform, MS SQL Server, MS Excel.
    * Proficiency in programmatically interacting with APIs for data exchange and system integration.
    * Experience with the following technologies would be preferred and ideal: Palantir Foundry (specifically Quiver), Splunk, AI tools.
    * You should be able to demonstrate your skills without references.
    * The candidate should be highly skilled in the following areas: problem solving, prioritization, attention to detail, and verbal and written communication.
    * This position requires a self-starter and self-learner who demonstrates clear accountability and can adapt easily, whether working independently or with a team.
    * The candidate must have a strong sense of urgency and sound judgment, and must be able to work independently as well as in a group setting.
    * Ability to plan out, manage, document, and delegate multiple concurrent projects.

    Working Environment/Physical Requirements:
    Working conditions are normal for an office environment. This position remains sedentary at least 75% of the time. This position constantly operates computers, phones, printers, and other office productivity machines (e.g., fax machine, copier, printer, etc.). Occasionally required to move documents or files weighing up to 10 lbs. This position requires frequent communication and exchange of information verbally and in writing. This position must be able to inspect and observe information on a computer screen at least 80% of the time. Occasional travel to field facilities and other locations may be required.
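    As a sketch of the SCADA/SQL integration work this listing describes, the snippet below pulls tag history from a SQL Server historian table with pyodbc and rolls it up hourly with pandas. The server name, database, and tag_history schema are placeholder assumptions, not Goodnight Midstream's actual systems.

```python
"""Illustrative only: hourly rollup of SCADA tag history from SQL Server."""
import pandas as pd
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=scada-sql;DATABASE=historian;Trusted_Connection=yes;"
)

QUERY = """
    SELECT tag_name, sample_time, value
    FROM dbo.tag_history
    WHERE sample_time >= DATEADD(day, -1, SYSUTCDATETIME())
"""

with pyodbc.connect(CONN_STR) as conn:
    samples = pd.read_sql(QUERY, conn)

# Resample raw SCADA samples into hourly averages per tag for reporting/dashboards.
hourly = (
    samples.assign(sample_time=pd.to_datetime(samples["sample_time"]))
    .set_index("sample_time")
    .groupby("tag_name")["value"]
    .resample("1h")
    .mean()
    .reset_index()
)
print(hourly.head())
```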
    $83k-116k yearly est. 60d+ ago
  • Principal Data Engineer

    TXU Energy Services Co 4.1company rating

    Irving, TX jobs

    If you have what it takes to become part of the Vistra family and would like to start a promising career with a global leader, take a look at the exciting employment opportunities that are currently available and apply online.

    We are seeking a highly skilled and experienced Principal Data Engineer to join our team. As a Principal Data Engineer, you will be responsible for architecting, designing, and implementing scalable and robust data solutions that enable efficient data processing, storage, and retrieval. You will provide technical leadership, drive innovation, and ensure the integrity and reliability of our data infrastructure. This is a senior-level position that requires exceptional technical expertise, strong leadership capabilities, and a proven track record of successfully delivering complex data engineering projects.

    Job Description

    Key Accountabilities
    * Lead the design and development of scalable and reliable data pipelines, including data ingestion, processing, storage, and retrieval.
    * Develop data models and schemas that support efficient data storage, retrieval, and analytics, employing optimization techniques to enhance query performance and scalability.
    * Leverage big data technologies and frameworks (e.g., Hadoop, Spark, Hive) to process and analyze large volumes of data, enabling advanced analytics and machine learning initiatives.
    * Manage and optimize data infrastructure, including cloud-based platforms, containerization technologies, and distributed computing environments.
    * Work closely with other teams, including Data Science, Analytics, and Product, to understand their data needs and requirements.
    * Develop and implement best practices for data modeling, storage, and retrieval.
    * Ensure the security and privacy of our data and compliance with relevant regulations.
    * Develop and maintain documentation for all data processes and systems.
    * Evaluate new technologies and tools for data processing, storage, and retrieval and recommend solutions to improve the efficiency and scalability of our data infrastructure.
    * Propose and lead continuous improvement opportunities.
    * Mentor team members to develop their technical and leadership skills.

    Education, Experience, & Skill Requirements
    * 9-11 years of experience in data engineering, including experience in designing and building data pipelines.
    * Strong proficiency in data engineering technologies, such as ETL frameworks, big data processing, and SQL and NoSQL databases.
    * Deep understanding of database systems, data modeling, and data warehousing.
    * Experience with cloud-based data storage and processing technologies, such as AWS, Azure, or Google Cloud.

    Key Metrics
    * Understanding of data privacy and data governance policies.
    * Strong problem-solving and analytical skills.
    * Ability to work collaboratively in a team environment.
    * Strong communication and interpersonal skills.
    * Ability to lead and manage projects.
    * Experience in leading a team of data engineers and managing complex projects.
    * Strong problem-solving skills and ability to work in a fast-paced environment.
    * Excellent communication skills and ability to work collaboratively with cross-functional teams.
    * Experience with agile software development methodologies.

    #LI-Hybrid #LI-ND1 #Dice

    Job Family: Data Analytics
    Company: Vistra Corporate Services Company
    Locations: Irving, Texas

    We are a company of people committed to: Exceeding Customer Expectations, Great People, Teamwork, Competitive Spirit and Effective Communication. If this describes you, then apply today!
If you currently work for Vistra or its subsidiaries, please apply via the internal career site. It is the policy of the Company to comply with all employment laws and to afford equal employment opportunity to individuals in all aspects of employment, including in selection for job opportunities, without regard to race, color, religion, sex, sexual orientation, gender identity, pregnancy, national origin, age, disability, genetic information, military service, protected veteran status, or any other consideration protected by federal, state or local laws. If you are an individual with a disability and need assistance submitting an application or would like to request an accommodation, please email us at assistance@vistraenergy.com to make a request.
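    For context on the big-data accountabilities listed above, here is a compact PySpark batch job of the general kind described: ingest raw interval reads and write a partitioned daily aggregate. The paths, column names, and S3 layout are placeholder assumptions, not Vistra's actual pipeline.

```python
"""Illustrative only: a small PySpark rollup of raw interval reads."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("interval_read_rollup").getOrCreate()

reads = (
    spark.read.option("header", True)
    .csv("s3://raw-zone/interval_reads/")            # raw ingestion layer (assumed path)
    .withColumn("kwh", F.col("kwh").cast("double"))
    .withColumn("read_date", F.to_date("read_ts"))
)

daily_usage = (
    reads.groupBy("meter_id", "read_date")
    .agg(F.sum("kwh").alias("kwh_total"), F.count("*").alias("interval_count"))
)

# Partitioning by date keeps downstream analytical queries pruned and cheap.
daily_usage.write.mode("overwrite").partitionBy("read_date").parquet(
    "s3://curated-zone/daily_usage/"
)
```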
    $99k-129k yearly est. 8d ago
  • Data Engineer (AI-RPA)

    Padnos 3.8company rating

    Grandville, MI jobs

    Title: Data Engineer

    YOUR ROLE
    PADNOS is seeking a Data Engineer on our Data and Software team who thrives at the intersection of data, automation, and applied AI. This role builds intelligent data pipelines and robotic process automations (RPAs) that connect systems, streamline operations, and unlock efficiency across the enterprise. You'll design and develop pipelines using Python, SQL Server, and modern APIs, integrating services such as OpenAI, Anthropic, and Azure ML to drive automation and accelerate business processes. Your work will extend beyond traditional data engineering, applying AI models and API logic to eliminate manual effort and make data more actionable across teams.

    You will report directly to the IT Manager at PADNOS Corporate in Grandville, MI. This is an in-person role based in Grandville, Michigan. You must reside within daily commuting distance of Grandville, Michigan. We do not relocate, sponsor visas, or consider remote applicants.

    ACCOUNTABILITIES
    * Design and develop automated data pipelines that integrate AI and machine learning services to process, enrich, and deliver high-value data for analytics and automation use cases.
    * Build, maintain, and optimize SQL Server ELT workflows and Python-based automation scripts.
    * Connect to external APIs (OpenAI, Anthropic, Azure ML, and other SaaS systems) to retrieve, transform, and post data as part of end-to-end workflows.
    * Partner with business stakeholders to identify manual workflows and translate them into AI-enabled automations.
    * Work with software developers to integrate automation logic directly into enterprise applications.
    * Implement and monitor data quality, reliability, and observability metrics across pipelines.
    * Apply performance tuning and best practices for database and process efficiency.
    * Develop and maintain reusable Python modules and configuration standards for automation scripts.
    * Support data governance and version control processes to ensure consistency and transparency across environments.
    * Collaborate closely with analytics, software, and operations teams to prioritize and deliver automation solutions that create measurable business impact.

    MEASUREMENTS
    * Reduction in manual hours across teams through implemented automations.
    * Reliable and reusable data pipelines supporting AI and analytics workloads.
    * Delivery of production-ready automation projects in collaboration with business units.
    * Adherence to data quality and reliability standards.
    * Continuous improvement in data pipeline performance and maintainability.

    QUALIFICATIONS/EXPERIENCE
    * Bachelor's degree or equivalent experience in data engineering, computer science, or software development.
    * Must have personally owned an automated pipeline end-to-end (design → build → deploy → maintain).
    * Minimum 3 years of hands-on experience building production data pipelines using Python and SQL Server. Contract, academic, bootcamp, or coursework experience does not qualify.
    * Intermediate to advanced Python development skills, particularly for data and API automation.
    * Experience working with RESTful APIs and JSON data structures.
    * Familiarity with AI/ML API services (OpenAI, Anthropic, Azure ML, etc.) and their integration into data workflows.
    * Experience with modern data stack components such as Fivetran, dbt, or similar tools preferred.
    * Knowledge of SQL Server performance tuning and query optimization.
    * Familiarity with Git and CI/CD workflows for data pipeline deployment.
    * Bonus: Experience deploying or maintaining RPA or AI automation solutions.
PADNOS is an Equal Opportunity Employer and does not discriminate on the basis of race, color, religion, sex, age, national origin, disability, veteran status, sexual orientation or any other classification protected by Federal, State or local law.
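    As a rough sketch of the AI-enabled pipeline work this listing describes, the snippet below classifies free-text maintenance notes with an LLM API and writes the label back to SQL Server. It assumes the openai>=1.0 Python client with an API key in the environment; the model name, connection string, and maintenance_notes table are placeholder assumptions, not PADNOS' actual pipeline.

```python
"""Illustrative only: LLM-based enrichment of records landed in SQL Server."""
import pyodbc
from openai import OpenAI  # reads OPENAI_API_KEY from the environment

client = OpenAI()
CATEGORIES = "scale_buildup, pump_failure, electrical, other"


def classify_note(note: str) -> str:
    # A short, constrained prompt per record keeps the output machine-usable.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {
                "role": "system",
                "content": f"Classify the maintenance note into exactly one of: "
                           f"{CATEGORIES}. Reply with the label only.",
            },
            {"role": "user", "content": note},
        ],
    )
    return resp.choices[0].message.content.strip()


def enrich_and_load(rows: list[tuple[int, str]]) -> None:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=corp-sql;DATABASE=ops;Trusted_Connection=yes;"
    )
    with conn:
        cur = conn.cursor()
        for note_id, note in rows:
            cur.execute(
                "UPDATE dbo.maintenance_notes SET category = ? WHERE note_id = ?",
                classify_note(note), note_id,
            )


enrich_and_load([(1, "Pump 3 tripped on high amps, suspect worn impeller.")])
```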
    $79k-109k yearly est. 3d ago
  • Data Engineer, Manager

    The Energy Authority Inc. 4.1company rating

    Jacksonville, FL jobs

    Job Description

    About The Energy Authority
    The Energy Authority is a public power-owned, nonprofit corporation with offices in Jacksonville, Florida, and Bellevue (Seattle), Washington. TEA provides public power utilities with access to advanced resources and technology systems so they can respond competitively in the changing energy markets. Through partnership with TEA, utilities benefit from an experienced organization that is singularly focused on deriving the maximum value of their assets from the market.

    Join Our Team as a Data Engineering Manager
    Are you a strategic technical leader who thrives at the intersection of innovation, data, and collaboration? TEA is looking for a Data Engineering Manager to guide the development of our enterprise data ecosystem and drive cloud-first, data-centric solutions that support both internal teams and external clients. In this high-impact role, you'll partner directly with Managers and Directors across the organization, influencing data strategy while leading a team of talented engineers to build scalable, secure, and business-critical data products. If you love shaping strategy and rolling up your sleeves to solve complex problems, whether designing cloud architectures, reviewing critical implementations, or modeling best practices, this is an opportunity to make your mark. You'll play a pivotal role in TEA's cloud transformation and help define the future of data at TEA.

    What You'll Do
    * Identify and articulate effective cloud-based strategies to meet TEA's evolving data engineering needs
    * Lead the design and development of business-critical and client-supporting cloud data solutions
    * Ensure the stability, integrity, security, and efficient operation of cloud data solutions supporting analytics and machine learning initiatives
    * Lead the development of TEA's Data Lake architecture, including data modeling, ELT processes, and data pipeline standards
    * Recruit, mentor, and develop engineers to build a high-performing and collaborative team
    * Influence decision-making and champion best practices across engineering and business stakeholder groups

    Technical Skills We're Looking For
    * Expertise in Microsoft Azure (Azure Data Lake, Azure Databricks, Fabric, Azure Data Factory)
    * Deep expertise in Azure Databricks, including Notebook development, Workflows, and Asset Bundles
    * Proven ability to design and implement modern data architectures (data lakes, warehouses, lakehouses, data mesh, streaming pipelines)
    * Proficiency in Python, PySpark, SQL, and modern data engineering frameworks
    * Strong understanding of APIs, data integration patterns, data governance frameworks, and security/compliance standards
    * Experience optimizing pipelines, architectures, and queries for performance and cost
    * Knowledge of data governance, metadata management, and data security principles

    Language Skills
    * Ability to read, analyze, and interpret technical journals, financial reports, and legal documents
    * Ability to respond to inquiries from customers, regulatory agencies, or business stakeholders
    * Ability to write polished professional content such as presentations, speeches, and publications
    * Ability to effectively present information to senior management, public groups, or boards of directors

    Management Responsibilities
    * Direct supervision of Enterprise Data Team personnel responsible for Data Engineering functions
    * Interviewing, hiring, training, and developing employees
    * Planning, assigning, and directing daily work
    * Conducting performance evaluations and providing ongoing feedback
    * Rewarding excellence and addressing performance issues
    * Ensuring compliance with organizational policies and applicable laws
    * Promoting strong communication, teamwork, and a culture of accountability

    Education & Experience
    * Master's degree (M.A.) or equivalent preferred
    * 4-10 years of related experience and/or technical leadership, or an equivalent combination of education and experience

    If you're ready to lead transformative data initiatives and help shape the future of TEA's data strategy, we'd love to meet you. Apply today and bring your expertise to a team where innovation, collaboration, and impact come together.

    TEA Values
    TEA employees share a common sense of purpose. When TEA accomplishes its mission, the result is improved quality of life for the citizens and businesses of the communities our clients serve. TEA employees exceed the expectations of those they serve, deliver services with the highest standards of fair, honest, and ethical behavior, set the standard for service and expertise in our industry, embody a spirit of collaboration, and embrace TEA's founding entrepreneurial spirit by seizing opportunities to deliver value. If you are self-motivated, driven to deliver excellence, and passionate about your career, TEA is the perfect place for you. It's YOUR Future. It's OUR Future.
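    To illustrate the lakehouse ELT pattern this role references, here is a notebook-style PySpark sketch that lands raw files into a bronze Delta table and derives a de-duplicated silver table. It assumes a Databricks (or Delta-enabled Spark) runtime; the storage paths, table names, and settlement_id key are placeholder assumptions, not TEA's actual architecture.

```python
"""Illustrative only: a bronze/silver Delta pattern in PySpark."""
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Bronze: land raw settlement files as-is, stamped with load metadata.
bronze = (
    spark.read.json("abfss://raw@lakeacct.dfs.core.windows.net/settlements/")
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze_settlements")

# Silver: keep only the latest version of each settlement for analytics users.
latest = Window.partitionBy("settlement_id").orderBy(F.col("_ingested_at").desc())
silver = (
    spark.table("bronze_settlements")
    .withColumn("_rn", F.row_number().over(latest))
    .filter("_rn = 1")
    .drop("_rn")
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver_settlements")
```

    In a Databricks shop like the one described, this kind of logic would typically live in a Workflow-scheduled notebook packaged as an Asset Bundle; the snippet only shows the transformation shape.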
    $86k-109k yearly est. 10d ago
  • Data Engineer

    AES Drilling Fluids 4.5company rating

    Midland, TX jobs

    Full-time

    Description
    AES Completion Services has an opportunity for an ambitious and innovative Data Engineer to join our Engineering team in Midland, TX. This is a new role for the team that supports daily operations by ensuring accurate data collection, entry, and validation across inventory and accounting systems. This role manages high-volume data in Excel, oversees field invoicing, and assists with reporting to maintain accuracy and compliance. Key duties include reconciling data, identifying discrepancies, understanding data relationships, and translating findings for the Accounting team, especially Accounts Payable. Strong attention to detail, organization, and collaboration with field and office staff are essential in this fast-paced oilfield service environment.

    Fantastic benefits!
    * Affordable medical / dental / vision / dependent
    * Employer-paid life insurance
    * Vacation / sick pay / generous holidays
    * 401K (6% match)

    Work Location: Midland, TX

    RESPONSIBILITIES:

    Data Management & Analysis
    1. Enter, validate, and maintain operational and financial data with precision.
    2. Develop and manage Excel spreadsheets, pivot tables, and formulas to support reporting needs.
    3. Perform quality checks to ensure accuracy and consistency of field and office data.

    Invoicing Oversight
    1. Perform administrative functions, such as reviewing and writing reports, approving expenditures, enforcing rules, and making decisions about the purchase of materials or services, and update AES executive management concerning procurement and personnel needs.
    2. Monitor invoicing timelines to ensure prompt and correct billing to customers.
    3. Collaborate with field supervisors and accounting teams to resolve discrepancies.

    Operational Support
    1. Assist engineers and field personnel by preparing data reports and summaries.
    2. Support compliance with company standards, customer requirements, and industry best practices.
    3. Provide ad hoc analysis and reporting as required by management.

    Effectively use tools to manage and build customer relationships, including entertainment, business calls, technical support, conferences, and other appropriate outreach.

    Requirements
    * Bachelor's degree in Business, Engineering Technology, or a related degree is preferred.
    * Strong proficiency in Microsoft Excel (advanced formulas, pivot tables, data validation).
    * 3-5 years of experience in oilfield, data management, or invoicing roles preferred.
    * Excellent attention to detail and ability to spot errors in large datasets.
    * Strong organizational and communication skills with the ability to work across teams.
    * Knowledge of oilfield operations and terminology is a plus.

    AES Drilling Fluids is an equal opportunity employer. All persons shall have the opportunity to be considered for employment on the basis of their qualification for the job in question without regard to their race, color, religion, sex, national origin, age, disability, military/veteran status, genetic characteristics, or any other characteristic protected by applicable federal, state, or local law. AES Drilling Fluids regrets that it is unable to sponsor employment visas or consider individuals on time-limited visa status for this position.
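    The reconciliation duties above are the kind of task that scripts well. Here is a small pandas sketch that compares field tickets against invoiced amounts and flags exceptions for Accounts Payable; the file names, column headers, and the $0.01 tolerance are assumptions (and reading/writing .xlsx assumes openpyxl is installed).

```python
"""Illustrative only: flag discrepancies between field tickets and invoices."""
import pandas as pd

field_tickets = pd.read_excel("field_tickets.xlsx")  # assumed cols: ticket_id, well, qty, unit_price
invoices = pd.read_excel("invoices.xlsx")            # assumed cols: ticket_id, invoiced_amount

expected = field_tickets.assign(expected_amount=lambda d: d["qty"] * d["unit_price"])
recon = expected.merge(invoices, on="ticket_id", how="outer", indicator=True)

# Flag tickets never invoiced, invoices with no ticket, and amount mismatches over a cent.
recon["issue"] = None
recon.loc[recon["_merge"] == "left_only", "issue"] = "not invoiced"
recon.loc[recon["_merge"] == "right_only", "issue"] = "no matching field ticket"
amount_gap = (recon["expected_amount"] - recon["invoiced_amount"]).abs() > 0.01
recon.loc[(recon["_merge"] == "both") & amount_gap, "issue"] = "amount mismatch"

recon[recon["issue"].notna()].to_excel("reconciliation_exceptions.xlsx", index=False)
```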
    $83k-117k yearly est. 60d+ ago
