
Data engineer jobs in Burlington, NC - 365 jobs

  • SCADA Engineer

    CBTS (4.9 company rating)

    Data engineer job in Liberty, NC

    Responsibilities: Participate in daily written and verbal reports up to the Senior Manager level. Develop process plans for battery production lines within the Battery PE department with direct support from the Manager. Work in a collaborative team environment (supporting other engineers, the Construction Management Group, technicians, vendors, general contractors, etc.) to achieve project milestones. Handle equipment procurement, installation planning through execution, commissioning, pre-production trials, and launch of battery production lines. Follow the advanced cleanliness protocols (clean room) required for the assigned area of battery manufacturing. Lead and/or support equipment trials at vendors prior to shipping and on the manufacturing floor after installation, ensuring all targets are met for safety, quality, and productivity. Support shop-floor implementation of network systems, including andon and the Manufacturing Execution System (MES), as well as connectivity to higher-level IT-managed systems, i.e., Manufacturing Operations Management (MOM). Support battery network system plans and specifications, applying knowledge of OPC-UA, CC-Link, SQL, and SCADA system implementation. Configure manufacturing equipment PLCs (primarily Mitsubishi) for network connectivity using SLMP, Ethernet TCP, and Ethernet IP protocols. Handle punchlist item identification, root cause analysis, and countermeasure management; KPI summarization, mass-production handover, and production issue support; and cross-functional team engagement (domestic and international) to discuss open items and key project schedules/milestones. Create and maintain detailed schedules and budgets for assigned areas. Complete internal and external training.
    Requirements - what you bring: Bachelor's degree or higher in Engineering or a similar technical field. 3+ years of equivalent professional experience in a manufacturing environment. Experience in production preparation and execution of capital projects, preferably from initial strategy planning to the start of mass production. Project management experience with capital investments greater than $0.5M. Experience creating and maintaining detailed schedules, including milestone achievement, for manufacturing equipment installation. Ability to read and interpret 2D drawings, including building facilities and process equipment drawings. Proficiency with Microsoft Office products (Word, Excel, PowerPoint, etc.) and with 2D and 3D drafting software such as AutoCAD. PLC experience, including the ability to read, interpret, and modify programs. Ability to travel for business, both domestically and internationally, up to 10%. Ability to work weekends, holidays, and shutdown periods (such as the July and December shutdowns) as needed, based on project conditions and schedule. SQL/Oracle or other database experience. Experience with Cisco networks and maintaining a machine network. Experience using Ignition for programming andon visualization and other tools.
    Added bonus if you have: Mitsubishi PLC programming experience. SCADA systems experience. MES experience. Experience with Kepware or similar SCADA data software.
    $69k-92k yearly est. 16h ago
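
A note on the stack in the listing above: it ties PLC networking (OPC-UA, SLMP) to plain SQL storage. A minimal sketch of that tag-to-SQL flow follows, using the open-source asyncua client and SQLite; the endpoint URL, node ID, tag name, and table are all hypothetical, and a real line would more likely use Kepware or Ignition's historian.

```python
import asyncio
import sqlite3

from asyncua import Client  # pip install asyncua

async def poll_tag(samples: int = 60) -> None:
    """Poll one OPC-UA tag and append readings to a SQL table."""
    db = sqlite3.connect("scada_history.db")
    db.execute("CREATE TABLE IF NOT EXISTS tag_history (ts TEXT, tag TEXT, value REAL)")

    # Endpoint and node ID are placeholders for a real PLC or gateway.
    async with Client("opc.tcp://plc.example.local:4840") as client:
        node = client.get_node("ns=2;s=Line1.CellVoltage")
        for _ in range(samples):
            value = await node.read_value()
            db.execute(
                "INSERT INTO tag_history VALUES (datetime('now'), ?, ?)",
                ("Line1.CellVoltage", float(value)),
            )
            db.commit()
            await asyncio.sleep(1.0)  # 1 Hz polling, arbitrary for the demo

asyncio.run(poll_tag())
```
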

  • Senior Data Engineer

    The Clorox Company (4.6 company rating)

    Data engineer job in Durham, NC

    Clorox is the place that's committed to growth - for our people and our brands. Guided by our purpose and values, and with people at the center of everything we do, we believe every one of us can make a positive impact on consumers, communities, and teammates. Join our team. #CloroxIsThePlace
    **Your role at Clorox:** We are seeking an experienced and highly skilled Senior Data Engineer to join our enterprise data strategy and operations team. The ideal candidate will have extensive expertise in designing, building, and maintaining data pipelines and data solution architectures on cloud platforms, particularly Azure. This role involves leading data engineering products, optimizing data processing workflows, ensuring data quality and governance, and collaborating with cross-functional teams to support the analysis and insight generation that fuels large-scale ideation and roadmapping for revenue growth and margin improvement projects across the enterprise. The Senior Data Engineer plays a key role in leading and delivering enterprise-quality data solutions that enable data-driven business decisions. The role calls for an analytical, big-picture thinker with a product mindset and a strong background in business intelligence and cloud engineering who can leverage technology to build scalable data products that create value across the organization. This role will also serve as a key collaborator with our business analytics and enterprise technology stakeholders to innovate, build, and sustain the cloud data infrastructure that will further Clorox's digital transformation efforts.
    **In this role, you will:**
    + **Collaborate & Lead:** Work closely with business product owners, data scientists, analysts, and cross-functional stakeholders to understand the business' data needs and provide technical solutions. Influence business partners to align with the technical solutions and to adhere to technical architecture standards. Provide technical guidance to junior engineers, BI developers, and contractors to create efficient and effective data solutions.
    + **Architect & Innovate:** Strong proficiency in Python, Spark, SQL, PySQL, Pandas, and CI/CD methodologies is required, along with strong data ingestion, data modeling, and dimensional modeling skills using the medallion lakehouse architecture. Strong BI skills to build reports and dashboards using Power BI, Tableau, etc. Experience with reporting security (row-level, column-level, object-level, masking, etc.); with SQL and DML to recast backend data for data changes, restatements, and data processing errors; and with MLOps and supporting data science workflow pipelines. Knowledge of Gen AI frameworks and LLMs to support agentic products.
    + **Optimize and Scale:** Build and maintain data pipelines to integrate data from various source systems. Optimize data pipelines for performance, reliability, and cost-effectiveness. Work with enterprise infrastructure and technology teams to implement best practices for performance monitoring and cloud resource management, including scaling, cost control, and security.
    + **Ensure Quality and Governance:** Ensure safe custody, transport, and storage of data in the data platforms. Collaborate with Data Governance Stewards and business stakeholders to enforce business rules, data quality rules, and data cataloging activities. Ensure data quality, security, and compliance for the data products under this role.
    + **Enhance BI Capabilities:** Develop and manage business intelligence solutions that transform data into insights that drive business value. Help Analytics Product Owners and business leaders improve business decisions through data analytics, data visualization, and data modeling techniques and technologies.
    **What we look for:**
    + 7+ years of experience for candidates holding a BS degree in Computer Science, Information Systems, or a relevant field; 5-7 years for candidates holding an MS or PhD.
    + Experience architecting data solutions, cloud data engineering, and end-to-end data warehouse or lakehouse and business intelligence implementations.
    + 7+ years of experience in data engineering, data warehousing, and business intelligence, with substantial experience managing large-scale data projects.
    + 5+ years of experience implementing data solutions on cloud platforms such as Microsoft Azure and AWS.
    + 4+ years of business intelligence experience using technologies like Power BI and Tableau.
    + 4+ years of experience with Azure services like Data Factory, Databricks, and Delta Lake is an added advantage.
    + Experience with end-to-end support for data engineering solutions (data pipelines), including designing, developing, deploying, and supporting solutions for existing platforms.
    + Knowledge of or experience with Microsoft D365 Dataverse and reporting in Microsoft Fabric. #LI-HYBRID
    **Workplace type:** Hybrid - 3 days in the office, 2 days WFH
    **Our values-based culture connects to our purpose and empowers people to be their best, professionally and personally. We serve a diverse consumer base, which is why we believe teams that reflect our consumers bring fresh perspectives, drive innovation, and help us stay attuned to the world around us. That's why we foster an inclusive culture where every person can feel respected, valued, and fully able to participate, and ultimately able to thrive.**
    **[U.S.] Additional Information:** At Clorox, we champion people to be well and thrive, starting with our own people. To help make this possible, we offer comprehensive, competitive benefits that prioritize all aspects of wellbeing and provide flexibility for our teammates' unique needs. This includes robust health plans, a market-leading 401(k) program with a company match, flexible time off benefits (including half-day summer Fridays depending on location), inclusive fertility/adoption benefits, and more. We are committed to fair and equitable pay and are transparent with current and future teammates about our full salary ranges. We use broad salary ranges that reflect the competitive market for similar jobs and provide sufficient opportunity for growth as you gain experience and expand responsibilities, while also allowing for differentiation based on performance. Based on the breadth of our ranges, most new hires will start at Clorox in the first half of the applicable range. Your starting pay will depend on job-related factors, including relevant skills, knowledge, experience, and location. The applicable salary range for every role in the U.S. is based on your work location and is aligned to one of three zones according to the cost of labor in your area. Zone A: $128,000 - $252,200; Zone B: $117,400 - $231,200; Zone C: $106,700 - $210,200. All ranges are subject to change in the future. Your recruiter can share more about the specific salary range for your location during the hiring process. This job is also eligible for participation in Clorox's incentive plans, subject to the terms of the applicable plan documents and policies. Please apply directly to our job postings and do not submit your resume to any person via text message. Clorox does not conduct text-based interviews and encourages you to be cautious of anyone posing as a Clorox recruiter via unsolicited texts. To all recruitment agencies: Clorox (and its brand families) does not accept agency resumes. Please do not forward resumes to Clorox employees, including any members of our leadership team. Clorox is not responsible for any fees related to unsolicited resumes.
    **Who we are.** We champion people to be well and thrive every single day. We're proud to be in every corner of homes, schools, and offices - making daily life simpler and easier through our beloved brands. Working with us, you'll join a team of passionate problem solvers and relentless innovators fueled by curiosity, growth, and progress. We relish taking on new, interesting challenges that allow our people to collaborate and thrive at work. And most importantly, we care about each other as multifaceted, whole humans. Join us as we reimagine what's possible and work with purpose to make a difference in the world.
    **This is the place where doing the right thing matters.** Doing the right thing is the compass that guides every decision we make - and we're proud to be globally recognized and awarded for our continuous corporate responsibility efforts. Clorox is a signatory of the United Nations Global Compact and the Ellen MacArthur Foundation's New Plastics Economy Global Commitment. The Clorox Company and its Foundation prioritize giving back to the communities we call home and contribute millions annually in combined cash grants, product donations, and cause-marketing. For more information, visit TheCloroxCompany.com and follow us on social media at @CloroxCo.
    **Our commitment to diversity, inclusion, and equal employment opportunity.** We seek out and celebrate diverse backgrounds and experiences. We're always looking for fresh perspectives, a desire to bring your best, and a nonstop drive to keep growing and learning. The Clorox Company and its subsidiaries are an EEO/AA/Minorities/Women/LGBT/Protected Veteran/Disabled employer. Clorox is committed to providing reasonable accommodations for qualified applicants with disabilities and disabled veterans during the hiring and interview process. If you need assistance or accommodations due to a disability, please contact us at ***************** . Please note: this inbox is reserved for individuals with disabilities in need of assistance and is not a means of inquiry about positions/application statuses.
    $128k-252.2k yearly 60d+ ago
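
The Clorox posting leans on the medallion (bronze/silver/gold) lakehouse pattern. As a rough PySpark illustration of a bronze-to-silver hop, here is a hedged sketch; the paths, column names, and use of Parquet (rather than the Delta format that Databricks implies) are assumptions for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw landing zone, ingested as-is (path is hypothetical).
bronze = spark.read.json("s3a://lake/bronze/orders/")

# Silver: cleaned and conformed - deduplicate, type-cast, drop bad rows.
silver = (
    bronze
    .dropDuplicates(["order_id"])                        # assumed business key
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount").isNotNull())
)

# Partitioning by date aids pruning in downstream gold-layer queries.
silver.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://lake/silver/orders/"
)
```
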
  • Principal Clinical Data Scientist- Data Management

    Syneos Health, Inc.

    Data engineer job in Morrisville, NC

    Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs, and commercial insights into outcomes to address modern market realities. Our Clinical Development model brings the customer and the patient to the center of everything that we do. We are continuously looking for ways to simplify and streamline our work to not only make Syneos Health easier to work with, but to make us easier to work for. Whether you join us in a Functional Service Provider partnership or a Full-Service environment, you'll collaborate with passionate problem solvers, innovating as a team to help our customers achieve their goals. We are agile and driven to accelerate the delivery of therapies, because we are passionate to change lives. Discover what our 29,000 employees, across 110 countries, already know: WORK HERE MATTERS EVERYWHERE
    Why Syneos Health
    * We are passionate about developing our people, through career development and progression; supportive and engaged line management; technical and therapeutic area training; peer recognition; and a total rewards program.
    * We are committed to our Total Self culture - where you can authentically be yourself. Our Total Self culture is what unites us globally, and we are dedicated to taking care of our people.
    * We are continuously building the company we all want to work for and our customers want to work with. Why? Because when we bring together diversity of thoughts, backgrounds, cultures, and perspectives - we're able to create a place where everyone feels like they belong.
    Job Responsibilities
    Summary: The Principal Clinical Data Scientist provides strategic and operational leadership for end-to-end clinical data collection, cleaning, and quality oversight across complex clinical studies. This role serves as the functional lead for Clinical Data Science, ensuring clinical data deliverables are fit for purpose, compliant with regulatory and contractual requirements, and aligned with sponsor expectations and study timelines. The position partners cross-functionally to drive data quality, risk mitigation, analytics innovation, and timely delivery of clinical data milestones.
    Responsibilities
    * Serve as the Data Management Functional Lead for Clinical Data Science on complex, multi-scope clinical projects and act as the primary liaison between Clinical Data Science, Project Management, Clinical Monitoring, and other functional groups.
    * Develop the Data Management Plan.
    * Act as the central steward of clinical data quality through holistic review of clinical and operational data using detailed protocol and therapeutic area knowledge.
    * Ensure required data elements and corresponding data quality oversight steps are identified to support defined study analyses.
    * Coordinate cross-functional data cleaning activities to meet quality standards, timelines, and contractual obligations.
    * Communicate, troubleshoot, and resolve complex data-related issues; recommend solutions and escalate issues impacting patient safety, data integrity, or study analysis.
    * Develop Clinical Data Acquisition Plans and data flow diagrams for complex studies and align data flow with study protocols, regulatory requirements, and study endpoints.
    * Assess risks related to protocol design, program-level strategies, and study parameters that may impact data credibility and trial reliability.
    * Design and drive development of analytical tools and dashboards to identify potentially unreliable or high-risk data.
    * Perform analytic reviews as defined in the scope of work and data acquisition plans; identify root causes and implement systematic resolutions.
    * Demonstrate understanding of advanced technologies and assess their applicability to individual studies or programs.
    * Monitor and communicate project progress to Sponsors and internal teams using status reports, tracking tools, and metrics.
    * Ensure launch, delivery, and completion of Clinical Data Science milestones in compliance with contracts, SOPs, guidelines, and regulatory requirements.
    * Collect and analyze metrics to support continuous process improvement initiatives.
    * Review and manage Clinical Data Science budgets, identify out-of-scope activities, and initiate change orders through Project Management.
    * Plan, manage, and allocate Clinical Data Science resources and coordinate the work of assigned team members.
    * Develop and maintain project plans, specifications, and documentation in compliance with SOP requirements.
    * Maintain ongoing documentation and ensure Trial Master File (TMF) completeness and accuracy.
    * Participate in and present at internal, Sponsor, investigator, and third-party meetings.
    * Provide input to proposals, bid defenses, and RFP responses and promote new Clinical Data Science business opportunities aligned with Sponsor strategies.
    * Prepare documentation for and participate in internal and external audits.
    * Train and mentor junior team members and maintain proficiency in Clinical Data Science systems through ongoing training.
    * Perform other duties as assigned.
    Qualifications
    Education
    * Bachelor's degree in Biological Sciences, Computer Science, Mathematics, Data Science, or a related discipline required.
    * Master's degree preferred.
    * Equivalent relevant experience may be considered in lieu of a degree.
    Experience
    * Minimum of 10 years of experience in Clinical Data Management and/or Clinical Data Science.
    * At least 5 years of project management experience.
    * Experience with Clinical Data Science practices and relational database management systems.
    * In-depth knowledge of the drug development lifecycle, including risk-based data quality approaches and biometrics workflows.
    Skills & Knowledge
    * Expertise in protocol interpretation, data collection strategies, and data cleaning specification development.
    * Knowledge of ALCOA++ data quality principles.
    * Knowledge of medical terminology, clinical trial data, and ICH/GCP regulatory requirements.
    * Proficiency with Microsoft Word, Excel, PowerPoint, email, and Windows-based applications.
    * Strong leadership, communication, organizational, and time-management skills.
    * Ability to manage multiple priorities in a fast-paced, dynamic environment.
    * Ability to work independently and collaboratively across multidisciplinary teams.
    At Syneos Health, we believe in providing an environment and culture in which Our People can thrive, develop, and advance. We reward and recognize our people by providing valuable benefits and a quality-of-life balance.
    The benefits for this position may include a company car or car allowance; health benefits including medical, dental, and vision; a company-match 401k; eligibility to participate in the Employee Stock Purchase Plan; eligibility to earn commissions/bonus based on company and individual performance; and flexible paid time off (PTO) and sick time. Because certain states and municipalities have regulated paid sick time requirements, eligibility for paid sick time may vary depending on where you work. Syneos complies with all applicable federal, state, and municipal paid sick time requirements.
    Salary Range: $95,000.00 - $175,700.00. The base salary range represents the anticipated low and high of the Syneos Health range for this position. Actual salary will vary based on various factors such as the candidate's qualifications, skills, competencies, and proficiency for the role.
    Get to know Syneos Health: Over the past 5 years, we have worked with 94% of all Novel FDA Approved Drugs, 95% of EMA Authorized Products, and over 200 Studies across 73,000 Sites and 675,000+ Trial patients. No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health. ***************************
    Additional Information: Tasks, duties, and responsibilities as listed in this job description are not exhaustive. The Company, at its sole discretion and with no prior notice, may assign other tasks, duties, and job responsibilities. Equivalent experience, skills, and/or education will also be considered, so qualifications of incumbents may differ from those listed in the Job Description. The Company, at its sole discretion, will determine what constitutes equivalence to the qualifications described above. Further, nothing contained herein should be construed to create an employment contract. Occasionally, required skills/experiences for jobs are expressed in brief terms. Any language contained herein is intended to fully comply with all obligations imposed by the legislation of each country in which it operates, including the implementation of the EU Equality Directive, in relation to the recruitment and employment of its employees. The Company is committed to compliance with the Americans with Disabilities Act, including the provision of reasonable accommodations, when appropriate, to assist employees or applicants to perform the essential functions of the job.
    $95k-175.7k yearly 34d ago
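
For a feel of the holistic data review the Syneos role describes (flagging potentially unreliable clinical data before it reaches analysis), here is a small pandas sketch; the CSV source, column names, and plausibility limits are entirely invented.

```python
import pandas as pd

# Hypothetical extract of lab results from an EDC system.
labs = pd.read_csv("lab_results.csv")  # assumed columns: subject_id, visit, test, value

# Rule 1: values outside an assumed plausible physiological range.
range_flags = labs[(labs["test"] == "heart_rate") & ~labs["value"].between(30, 220)]

# Rule 2: duplicate records for the same subject/visit/test.
dup_flags = labs[labs.duplicated(subset=["subject_id", "visit", "test"], keep=False)]

# Rule 3: missing results that would block a clean visit.
missing_flags = labs[labs["value"].isna()]

queries = pd.concat([range_flags, dup_flags, missing_flags]).drop_duplicates()
print(f"{len(queries)} records need a data query")
```
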
  • Data Scientist

    Nextwave Resources (4.4 company rating)

    Data engineer job in Durham, NC

    Temp Data Scientist - Boston, MA or NC, NH, RI, TX, MA, or CO (18-month contract; probable extension or permanent conversion)
    Notes: Master's degree required. Python development to build time series models. Run SQL queries. Linux OS administration. Any web development experience preferred. Experience working with artificial intelligence, machine learning algorithms, neural networks, decision trees, modeling, cloud machine learning, time series analysis, and robotic process automation.
    Description: We are seeking a hands-on, experienced data scientist with financial services industry experience. As part of a small, nimble team, the associate's key differentiating abilities will be exceptional analytical skills and an ability to conceive of and develop differentiated products for the benefit of customers. Absolutely critical is the associate's ability to carry an initiative from idea through to execution.
    * 5+ years' experience in information security/technology risk management for large-scale, complex IT infrastructures and distributed environments, or an equivalent combination of related training and experience.
    * Analytic skills: in addition to the core regression, classification, and time series skills that accompany the data science role, experience with next best action (NBA) prediction, multi-armed bandits, online learning, A/B testing, and experimentation methods is preferred.
    * Natural programmer with proven industry experience in statistics and data modeling.
    * Experience with one or more of the following tools/frameworks: Python, scikit-learn, nltk, pandas, numpy, R, PySpark, Scala, SQL/big data tools, TensorFlow, PyTorch, etc.
    * Education: at least one advanced degree (Master's or PhD level) in a technical or mathematically oriented discipline, e.g., coursework or experience in fields such as statistics, machine learning, computer science, applied mathematics, econometrics, engineering, etc.
    * Extensive experience in written and oral communications/presentations, and the ability to produce a variety of business documents (business requirements, technical specs, slide presentations, etc.) that demonstrate command of language, clarity of thought, and orderliness of presentation.
    We are looking for an expert quantitative developer to advance the research and development of AI/ML methods as components in the delivery of creative investment management technology solutions. You will have experience combining multivariate statistical modeling, predictive machine learning methods, and open-source approaches to cloud computing and big data.
    $70k-100k yearly est. 60d+ ago
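
Since the core ask above is Python time-series modeling, a minimal statsmodels sketch of fitting and forecasting an ARIMA model follows; the synthetic series and the (1, 1, 1) order are illustrative choices, not anything the posting specifies.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic daily series standing in for a real financial feed.
idx = pd.date_range("2024-01-01", periods=200, freq="D")
y = pd.Series(np.cumsum(np.random.default_rng(0).normal(size=200)), index=idx)

fit = ARIMA(y, order=(1, 1, 1)).fit()  # (p, d, q) picked arbitrarily for the demo

forecast = fit.forecast(steps=14)      # two-week-ahead point forecast
print(forecast.tail())
```
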
  • Data Scientist

    Tek Spikes

    Data engineer job in Cary, NC

    Job Description
    Note: W2 only. Must be local to either Cary, NC or Irving, TX.
    Client: Caterpillar
    Education: Bachelor's or Master's degree required
    Qualifications: 5+ years of experience required
    Top Skills:
    • Proficiency in Python, SQL, and data science libraries (Pandas, Scikit-learn, TensorFlow)
    • Strong foundation in statistics, probability, and machine learning
    • Familiarity with cloud platforms (Azure, AWS, Snowflake) and data modeling
    • Excellent communication skills to explain technical concepts to non-technical stakeholders
    Job Duties:
    • A Data Scientist is responsible for analyzing large volumes of structured and unstructured data to extract actionable insights, build predictive models, and support data-driven decision-making. This role blends statistical expertise, programming skills, and business acumen to solve complex problems and drive innovation.
    • Data Collection & Preparation: Gather, clean, and validate data from various sources to ensure quality and usability
    • Exploratory Data Analysis: Identify trends, anomalies, and patterns in large datasets
    • Model Development: Design and implement machine learning models (e.g., regression, classification, clustering, NLP) to support forecasting and decision-making
    • Data Visualization: Create dashboards and reports using tools like Power BI to communicate findings
    • Automation & Optimization: Develop scripts and tools to automate data processing and model deployment
    • Collaboration: Work cross-functionally with product, engineering, and business teams to align data initiatives with strategic goals
    • Research & Innovation: Stay current with emerging technologies and methodologies in data science and apply them to business challenges
    $70k-97k yearly est. 26d ago
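
Given the Pandas/Scikit-learn stack this listing names, here is a small illustrative classification pipeline; the telemetry dataset, feature names, and target are made up for the example.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical machine-telemetry dataset: predict component failure.
df = pd.read_csv("machine_telemetry.csv")
X, y = df[["engine_hours", "oil_temp", "site"]], df["failed_within_30d"]

pre = ColumnTransformer([
    ("num", StandardScaler(), ["engine_hours", "oil_temp"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["site"]),
])

clf = Pipeline([("prep", pre), ("model", LogisticRegression(max_iter=1000))])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
clf.fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")
```
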
  • Big Data Consultant (Durham, NC, Westlake, TX)

    Sonsoft (3.7 company rating)

    Data engineer job in Durham, NC

    Sonsoft, Inc. is a USA-based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in the fields of Software Development, Software Consultancy, and Information Technology Enabled Services.
    Job Description (Green Card holders and US citizens):
    * At least 7+ years of overall experience with HDFS architecture and an understanding of all critical architectural concepts, including node types, node interaction, YARN, ZooKeeper, MapReduce, etc.
    * Hands-on experience in Hive, covering all concepts including Hive queries, UDFs, and different file formats such as ORC, Avro, and Parquet.
    * Hands-on experience developing with Sqoop and Spark.
    * Experience processing structured data - warehousing concepts such as de-duplication, cleansing, look-ups, transformation, data versioning, etc.
    * Hands-on experience developing and executing Oozie workflow definitions.
    * Knowledge of an HDFS distribution, preferably Cloudera, and an understanding of the distribution's monitoring and operational capabilities.
    * Knowledge of Flume and Kafka is a plus.
    * Hands-on experience in programming languages such as Java, Python, and Perl.
    * At least 4 years of experience translating functional/non-functional requirements to system requirements.
    * Experience working with business users to analyze and understand business data and scenarios.
    * Ability to work in a team environment with client-interfacing skills.
    * Experience and desire to work in a global delivery environment.
    * Experience leading medium to large sized teams.
    * Cloudera certification.
    * Knowledge of PL/SQL.
    Qualifications
    Basic Qualifications:
    * Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
    * At least 4 years of experience within Information Technologies.
    Additional Information
    Note: This is a Full-Time & Permanent job opportunity. Only US Citizens, Green Card Holders, GC-EAD, H4-EAD & L2-EAD candidates can apply. No OPT-EAD, H1B & TN candidates, please. Please mention your visa status in your email or resume.
    $78k-107k yearly est. 60d+ ago
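
The warehousing concepts in the Sonsoft listing (de-duplication, data versioning) map naturally onto Spark window functions. A hedged PySpark sketch, with invented paths and column names, that keeps only the latest version of each record:

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dedupe-demo").getOrCreate()

# Hypothetical customer feed in ORC, one row per change event.
raw = spark.read.orc("hdfs:///landing/customers/")

# Keep the newest record per customer_id (versioning + de-duplication).
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    raw.withColumn("rn", F.row_number().over(w))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

latest.write.mode("overwrite").parquet("hdfs:///warehouse/customers_current/")
```
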
  • Sr. Data Engineer

    Aspida Financial Services

    Data engineer job in Durham, NC

    Aspida is a tech-driven, nimble insurance carrier. Backed by Ares Management Corporation, a leading global alternative asset manager, we offer simple and secure retirement solutions and annuity products with speed and precision. More than that, we're in the business of protecting dreams: those of our partners, our producers, and especially our clients. Our suite of products, available through our elegant and intuitive digital platform, focuses on secure, stable retirement solutions with attractive features and downside protection. A subsidiary of Ares Management Corporation (NYSE: ARES) acts as the dedicated investment manager, capital solutions, and corporate development partner to Aspida. For more information, please visit ************** or follow them on LinkedIn.
    Who We Are: Sometimes, a group of people come together and create something amazing. They don't let egos get in the way. They don't settle for the status quo, and they don't complain when things get tough. Instead, they see a common vision for the future, and each person makes an unspoken commitment to building that future together. That's the culture, the moxie, and the story of Aspida. Our business focuses on annuities and life insurance. At first, it might not sound flashy, but that's why we're doing things differently than everyone else in our industry. We're dedicated to developing data-driven tech solutions, providing amazing customer experiences, and applying an entrepreneurial spirit to everything we do. Our work ethic is built on three main tenets: Get $#!+ Done, Do It with Moxie, and Have Fun. If this sounds like the place for you, read on, and then apply at aspida.com/careers.
    What We Are Looking For: As a Snowflake Senior Data Engineer, you will be responsible for architecting, developing, and optimizing data warehouse and lakehouse solutions using the Snowflake platform and integrating with internal systems. This role will design, build, and maintain robust and scalable data pipelines and infrastructure using modern cloud-based technologies (AWS and Azure). Experience with Python, SQL Server, Snowflake, DBT Core, Airflow, and Git is required. You will collaborate closely with data architects, data engineers, and business stakeholders to ensure data ingestion, storage, and access solutions on Snowflake meet performance, security, and compliance requirements for migration and analytics. This role reports to the VP of Data Engineering and is required to be onsite 3 days a week at our Durham, NC office.
    What You Will Do: Design and implement Snowflake lakehouse architectures. Develop and optimize Snowflake SQL queries and stored procedures for curating data. Manage Snowflake account configurations, resource monitors, virtual warehouses, and access controls. Integrate Snowflake with Oracle Financials and other third-party sources. Implement best practices for data partitioning, clustering, and caching to optimize performance and cost-efficiency. Participate in data ingestion automation, metadata management, and monitoring within Snowflake environments. Collaborate with security teams to enforce data governance, encryption, and compliance policies in Snowflake. Support CI/CD and automation of Snowflake deployment and pipeline orchestration. Provide knowledge transfer and technical guidance to team members and business stakeholders.
    What We Provide: Salaried, DOE. Long-Term Incentive Plan. Full-time; full benefits package available.
    What We Believe: At Aspida Financial Services, LLC, we are committed to creating a diverse and inclusive environment and are proud to be an equal opportunity employer. As such, Aspida does not and will not discriminate in employment and personnel practices on the basis of race, sex, age, handicap, religion, national origin, or any other basis prohibited by applicable law. Hiring, transferring, and promotion practices are performed without regard to the above listed items.
    What We Require: BS/MS in Computer Science, Data Engineering, Information Systems, or a related field. 7+ years in data warehousing and cloud data platforms, with 5+ years of hands-on lakehouse experience. Experience working on Snowflake migrations is preferred. Retail insurance or reinsurance experience and data governance knowledge are a plus. Agile and DevOps experience with automation and pipeline deployment. Certifications like SnowPro Core or relevant cloud certifications are desirable. Deep expertise in the Snowflake data warehouse platform, including performance tuning and cost optimization. Strong SQL skills and experience with Snowflake-specific features such as Time Travel, Zero-Copy Cloning, Streams, and Tasks. Experience with cloud ecosystems (Azure, AWS) and data orchestration tools (DBT, Airflow). Understanding of data security, role-based access control (RBAC), and compliance in Snowflake. Knowledge of data modeling principles for enterprise data warehouses. Excellent communication and collaboration skills to work with data teams.
    $78k-106k yearly est. 60d+ ago
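
The Aspida posting singles out Snowflake features such as Time Travel. For flavor, a minimal sketch with the snowflake-connector-python package follows; every connection parameter and the table name are placeholders, and the one-hour offset is arbitrary.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection parameters here are placeholders.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_USER",
    password="...",
    warehouse="ANALYTICS_WH",
    database="CURATED",
    schema="FINANCE",
)

cur = conn.cursor()
try:
    # Time Travel: read the table as it looked one hour ago,
    # e.g. to diff against the current state after a bad load.
    cur.execute("SELECT COUNT(*) FROM policies AT(OFFSET => -3600)")
    print("row count 1h ago:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```
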
  • Data Engineer

    Govserviceshub

    Data engineer job in Durham, NC

    Role: Data Engineer
    Duration: 12+ months contract (long-term renewable contract)
    Requirements (must-haves):
    1. SQL stored procedures
    2. Aurora Postgres
    3. AWS (S3, Glue, Lambda functions)
    4. Java API development
    5. Basic Snowflake experience
    6. Disaster recovery
    $78k-106k yearly est. 32d ago
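
Since the must-haves above pair Lambda with S3, here is a sketch of the common S3-triggered Lambda shape; the event wiring is the standard S3 notification format, but the downstream handling (staging into Aurora Postgres or a Glue job) is only noted in comments, and nothing here comes from the posting itself.

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Triggered by an S3 put event; reads each new object.

    In a fuller pipeline the payload would be staged into
    Aurora Postgres or handed to a Glue job (not shown).
    """
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        print(f"received s3://{bucket}/{key} ({len(body)} bytes)")
    return {"statusCode": 200, "body": json.dumps("ok")}
```
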
  • Data Engineer II

    GSO (4.7 company rating)

    Data engineer job in Greensboro, NC

    The Fresh Market & You: Our mission is to make everyday eating extraordinary for our guests. We create a warm, welcoming, memorable experience with exceptional, personal service. We're looking for a new team member who strives for excellence and brings positive energy, commitment, and a "can-do" attitude to work every day. We value teamwork and celebrate our successes as a team, and we will value your contribution!
    Added Benefits for choosing The Fresh Market Team: Team member discount up to 40%. Health, dental & vision insurance available for individual, spouse, partner, and family. 401K contribution and match for part-time and full-time team members. Personal time off and additional time off purchase plans available. And much more!
    About the Position: The Fresh Market currently has an opening for a Data Engineer in our Store Support Center (099). The Data Engineer will design, develop, and support robust data warehouse systems, bringing expertise in planning, managing databases, and ensuring efficient data storage, access, and security. The role calls for proficiency in technical administration for data warehouse development and maintenance, with a focus on coordinating data quality, performance, security, and overall management for organizational needs.
    What You'll Do: Collaborates with users to design, code, test, debug, and deploy databases that meet requirements and organizational needs. Acts as liaison between information technology and business units. Evaluates new data sources for adherence to the organization's quality standards and ease of integration. Manipulates and combines data from various sources to enlarge and enhance the data warehouse. Performs data extraction, ensures data accuracy, and troubleshoots and resolves bugs during data warehouse operations. Writes queries, stored procedures, and functions for database development. Develops strategies for warehouse implementation, data acquisition and access, and data archiving and recovery. Plans and executes data warehouse implementations following system requirements and anticipated usage. Provides technical support and coordination during warehouse design, testing, and movement to production. Designs and builds databases for data storage or processing. Evaluates existing database design to determine necessary updates, performance tuning, and integration requirements. Monitors databases' performance, scalability, and security and modifies procedures to optimize database designs. Builds data models and defines the structure, attributes, and nomenclature of data elements. Understands and utilizes ETL tools like Informatica and programming languages like Python. Designs and writes code for ETL data solutions. Implements and enforces standards and procedures to ensure data is managed consistently and properly integrated within the warehouse. Ensures sufficient data quality is maintained so that the data can effectively support the business process. Defines data elements, establishes policies and procedures related to the collection and accuracy of data, and performs tests on data systems. Possesses comprehensive knowledge of database technologies and solid coding and computer system skills. Works on projects independently with general supervision; work will have moderate technical complexity. Can communicate facts, policies, and practices related to the job area.
    Qualifications - at a minimum, what you'll need: Bachelor's degree in one of the following disciplines or similar: Computer Science, Database Systems, Information Systems, or an equivalent engineering degree. Requires 2-5 years of professional experience, or be deemed to possess that level of skill. Collaborate with other technical teams to assist in achieving team agile sprint goals. Proficiency with the following technologies: SQL language; Snowflake; ETL/ELT systems; cloud data warehouse systems. Excellent analytical, conceptual thinking, problem solving, planning, and execution skills. Ability to identify and evaluate new data management technologies and their business benefits. Demonstrated ability to use agile methodology to estimate and plan project outcomes. Ability to drive organizational change and build data capabilities that effectively balance the need to continuously exploit capabilities to optimize operational efficiency with the need to deliver innovative and agile infrastructure. Ability to communicate ideas in both technical and user-friendly language. Enthusiastic and willing to work in a team-oriented, collaborative Agile SCRUM environment. Strong customer-service orientation and a professional demeanor. Excellent written and oral communication, listening, interpersonal, and documentation skills. Highly self-motivated and directed. After-hours and weekend on-call support on a rotating basis.
    Preferred qualifications: Certification in Snowflake, SQL Server, or T-SQL.
    We are proud to be an Equal Opportunity Employer. REASONABLE ACCOMMODATIONS: Consistent with applicable laws, TFM will make reasonable accommodations for qualified applicants and team members, unless doing so would result in an undue hardship to TFM. This guiding principle applies to all aspects of employment, including hiring and job assignment, compensation, discipline, termination, and access to benefits and training. Qualified applicants will receive consideration for employment without regard to race, color, creed, religion, age, sex, gender, sexual orientation, gender identity, pregnancy and related medical conditions, national origin, genetic information, uniformed service, veteran status, disability, or any other basis prohibited by federal or state law. The statements in this job description are provided to describe the general nature and level of work expected in this role. While these statements include the essential functions of the job, they are not intended to be a complete list of all responsibilities, duties, and skills required. As we work as a team, there may be times team members are needed to perform duties outside of their normal responsibilities based on business needs. #CL-1 #LI-REMOTE
    $83k-105k yearly est. 39d ago
  • Hadoop Developer

    Stem Xpert

    Data engineer job in Durham, NC

    TekWissen provides a unique portfolio of innovative capabilities that seamlessly combines client insights, strategy, design, software engineering, and systems integration. Our tightly integrated offerings are tailored to each client's requirements and span the services spectrum from application development/maintenance and testing, IT consulting, and staffing for IT infrastructure management through strategic consulting and industry-oriented business processes.
    Job Details: Job Title: Hadoop Developer. Job Location: Durham, NC. Duration: 6+ months.
    Responsibilities:
    • Design, implement, and deploy custom applications on Hadoop.
    • Troubleshoot production issues with Hadoop.
    • Collaborate with other teams to design and develop data tools that support both operations and product use cases.
    Minimum Qualifications:
    • Experience with the Hadoop platform, including hands-on development; strong Shell, Perl, and Hive experience; a couple of years of experience with Hadoop.
    • Proficiency with the Software Development Life Cycle (SDLC).
    • Strong verbal and written communication skills.
    Preferred Qualifications:
    • Exposure to performance tuning and Control-M.
    $82k-106k yearly est. 60d+ ago
  • Hadoop Big Data Developer

    CapB Infotek

    Data engineer job in Cary, NC

    JOB DESCRIPTION: "Spark, Scala/Python, HIVE, Hadoop, BIGDATA developer with Exposure to Cloud (Azure Preferably). 4-5 Years of experience in Building and Implementing data ingestion and curation process developed using Big data tools such as Spark (Scala/python), Hive, Spark, HDFS, Sqoop, Hbase, Kerberos, Sentry and Impala etc. Ingesting huge volumes data from various platforms for Analytics needs and writing high-performance, reliable and maintainable ETL code Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance. .Hands on Experience on writing shell scripts. Complex SQL queries, Hadoop commands and Git.4 Good Hands-on creating Database, Schemas, Hive tables (External and Managed) with various file formats (Orc, Parquet, Avro and Text etc.), Complex Transformations, Partitioning, bucketing and Performance optimizations. .Recent Exposure to Cloud will be a good to have. Azure will be preferred.6.Spark Complex transformations, data frames, semi-structured data, utilities using spark, Spark Sql and spark configurations.7.Proficiency and extensive Experience with Spark & Scala/Python and performance tuning is a MUST. Monitoring performance of production jobs and advising any necessary infrastructure changes. .Ability to write abstracted reusable code components. .Code versioning experience using Bitbucket and CI/CD pipe line.
    $82k-106k yearly est. 60d+ ago
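
Item 4 in the listing above (partitioned, bucketed Hive tables in columnar formats) looks roughly like this in PySpark; the table, columns, and bucket count are invented for the sketch.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-table-demo")
    .enableHiveSupport()  # needed to write managed Hive tables
    .getOrCreate()
)

events = spark.read.parquet("hdfs:///landing/events/")  # hypothetical source

# Managed Hive table in ORC, partitioned by ingest date and
# bucketed by user_id to speed up joins on that key.
(
    events.write
    .format("orc")
    .partitionBy("ingest_date")
    .bucketBy(16, "user_id")
    .sortBy("user_id")
    .mode("overwrite")
    .saveAsTable("analytics.events_curated")
)
```
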
  • Data Access Engineer

    University of North Carolina at Chapel Hill (4.2 company rating)

    Data engineer job in Chapel Hill, NC

    A global higher education leader in innovative teaching, research, and public service, the University of North Carolina at Chapel Hill consistently ranks as one of the nation's top public universities. Known for its beautiful campus, world-class medical care, commitment to the arts, and top athletic programs, Carolina is an ideal place to teach, work, and learn. One of the best college towns and best places to live in the United States, Chapel Hill has diverse social, cultural, recreation, and professional opportunities that span the campus and community. University employees can choose from a wide range of professional training opportunities for career growth, skill development, and lifelong learning, and enjoy exclusive perks including numerous retail, restaurant, and performing arts discounts, savings on local child care centers, and special rates on select campus events. UNC-Chapel Hill offers full-time employees a comprehensive benefits package, paid leave, and a variety of health, life, and retirement plans and additional programs that support a healthy work/life balance.
    Primary Purpose of Organizational Unit: Our primary mission is providing instruction of the highest quality in physics and astronomy to undergraduate and graduate students at UNC-CH. Training in physics is fundamental among the natural sciences because it provides understanding of the forces governing the structure of matter, from subatomic particles to the large-scale structure of the universe. Our departmental instructional mission spans all segments of the student community. Over 60% of all college graduates from UNC-CH have taken a course in our department. Among them: undergraduate liberal arts majors who should master the power of quantitative reasoning; undergraduates seeking a degree in another of the natural sciences, who need physics as a foundation for their advanced scientific training; and the small number of undergraduates who major in physics, to whom we provide broad and rigorous training. Our graduate education and research advances frontiers of knowledge at the two extremes of space and time, from the very small (the realm of particle physics) to the very large (the realm of astronomy), while expanding the boundaries of knowledge in the many subfields spanning the length scales between. Forefront physics and astronomy research requires access to well-equipped laboratories and extensive computing capabilities; it also must be supported by comprehensive and accessible library collections. Again, in research at this advanced level, the bridging aspect of physics provides basic interdisciplinary insight for many other sciences. A strong managerial, administrative, and technical staff supports this instructional mission. Duties of these employees range from budget planning and management for the numerous research grants held by faculty in the department to maintaining course and student records. These activities are usually accomplished in a standard 40-hour week schedule.
    Position Summary: The Argus Array will be the largest optical telescope array ever assembled, with a collecting area comparable to the largest monolithic telescopes in the world. The Array will push our observations of the universe into a new regime, scanning the sky 100,000x faster than current nightly-cadence sky surveys. Argus will capture a continuous multi-color, 55-gigapixel movie of the night sky, shared with the entire astronomical community in real time through public transient alerts, images, and light curves with millions of epochs for hundreds of millions of stars. Joining our local team of astronomers, telescope instrumentalists, and engineers, the Data Access Engineer will build software systems to get Argus data products into the hands of astronomers around the world. Three years of prototyping efforts have demonstrated the core pipeline architecture and built an archive of representative data; we are now working to build pipelines and platforms for broad accessibility that scale to the full array. This scale-up involves integration with cloud services, existing distributed storage networks, and the Array's high-performance GPU-accelerated pipelines. In collaboration with a worldwide network of real-time data release and processing centers, the Data Access Engineer will take the alert distribution system to production, bringing streaming notifications and images of new and changing phenomena in the night sky to both professional and amateur astronomers alike. As Data Access Engineer, you will also oversee periodic data releases of our trillion-point light curves, establishing best practices for data versioning, integrity, and accessibility. Other responsibilities will include contributing to collaborative development of intuitive, API-first user interfaces for both internal quality assurance and public data access. Our project management philosophy emphasizes a small, local team; a flat management structure; and a highly collaborative working environment. We routinely produce and test prototypes and complex hardware in-house. While you will lead the data access systems, team members frequently collaborate across boundaries and contribute hands-on to various Argus hardware and software subsystems.
    Minimum Education and Experience Requirements: Relevant post-baccalaureate degree required (or foreign degree equivalent); for candidates demonstrating comparable independent research productivity, a relevant Bachelor's degree (or foreign degree equivalent) and 3 or more years of relevant experience will be accepted in substitution. May require terminal degree and licensure.
    Required Qualifications, Competencies, and Experience: We are searching for an engineer with deep Python expertise (5+ years) and demonstrated success in designing and optimizing high-throughput, distributed message systems (3+ years). Experience with large software projects, including proficiency across the software development lifecycle (version control, documentation, and testing), is required.
    Preferred Qualifications, Competencies, and Experience: Experience with both cloud-based relational databases (PostgreSQL and Apache Kafka or equivalents) and time-series databases optimized for astronomical data storage and retrieval is preferred. Prior experience translating scientific requirements into technical specifications and building researcher-friendly interfaces for complex datasets.
    Special Physical/Mental Requirements: Ability to sustain nighttime validation testing and monitoring (as part of a rota of qualified support personnel).
    Campus Security Authority Responsibilities: Not applicable.
    $69k-92k yearly est. 18d ago
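
To give a feel for the alert-distribution work in the two Argus listings (this one and the next), here is a minimal producer sketch using the confluent-kafka client; the broker address, topic, and alert schema are placeholders, since the postings do not pin down the stack beyond naming Kafka.

```python
import json

from confluent_kafka import Producer  # pip install confluent-kafka

producer = Producer({"bootstrap.servers": "broker.example.edu:9092"})  # placeholder

def on_delivery(err, msg):
    # Broker acknowledgement callback: surface failed sends.
    if err is not None:
        print(f"delivery failed: {err}")

# Hypothetical transient-alert payload.
alert = {
    "alert_id": "argus-000001",
    "ra_deg": 150.1,
    "dec_deg": 2.2,
    "mag": 17.4,
    "detected_at": "2026-01-01T03:14:15Z",
}

producer.produce(
    "transient-alerts",                 # placeholder topic
    key=alert["alert_id"].encode(),
    value=json.dumps(alert).encode(),
    on_delivery=on_delivery,
)
producer.flush()  # block until the broker confirms the send
```
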
  • Data Access Engineer

    UNC-Chapel Hill

    Data engineer job in Chapel Hill, NC

    The Argus Array will be the largest optical telescope array ever assembled, with a collecting area comparable to the largest monolithic telescopes in the world. The Array will push our observations of the universe into a new regime, scanning the sky 100,000x faster than current nightly-cadence sky surveys. Argus will capture a continuous multi-color, 55-gigapixel movie of the night sky, shared with the entire astronomical community in real time through public transient alerts, images, and light curves with millions of epochs for hundreds of millions of stars. Joining our local team of astronomers, telescope instrumentalists, and engineers, the Data Access Engineer will build software systems to get Argus data products into the hands of astronomers around the world. Three years of prototyping efforts have demonstrated the core pipeline architecture and built an archive of representative data; we are now working to build pipelines and platforms for broad accessibility that scale to the full array. This scale-up involves integration with cloud services, existing distributed storage networks, and the Array's high-performance GPU-accelerated pipelines. In collaboration with a worldwide network of real-time data release and processing centers, the Data Access Engineer will take the alert distribution system to production, bringing streaming notifications and images of new and changing phenomena in the night sky to both professional and amateur astronomers alike. As Data Access Engineer, you will also oversee periodic data releases of our trillion-point light curves, establishing best practices for data versioning, integrity, and accessibility. Other responsibilities will include contributing to collaborative development of intuitive, API-first user interfaces for both internal quality assurance and public data access. Our project management philosophy emphasizes a small, local team; a flat management structure; and a highly collaborative working environment. We routinely produce and test prototypes and complex hardware in-house. While you will lead the data access systems, team members frequently collaborate across boundaries and contribute hands-on to various Argus hardware and software subsystems.
    Required Qualifications, Competencies, and Experience: We are searching for an engineer with deep Python expertise (5+ years) and demonstrated success in designing and optimizing high-throughput, distributed message systems (3+ years). Experience with large software projects, including proficiency across the software development lifecycle (version control, documentation, and testing), is required.
    Preferred Qualifications, Competencies, and Experience: Experience with both cloud-based relational databases (PostgreSQL and Apache Kafka or equivalents) and time-series databases optimized for astronomical data storage and retrieval is preferred. Prior experience translating scientific requirements into technical specifications and building researcher-friendly interfaces for complex datasets.
    $78k-106k yearly est. 17d ago
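
    The data-release duties in this posting (versioning, integrity, accessibility) can be shown with a toy example: write a light-curve batch to Parquet and publish a checksum beside it. The column names, release tag, and file layout below are invented, and the sketch assumes pandas with pyarrow installed.

        import hashlib
        import pandas as pd  # pip install pandas pyarrow

        # Toy light-curve batch; real Argus light curves would have
        # millions of epochs per star.
        lc = pd.DataFrame({
            "star_id": [42, 42, 42],
            "mjd": [60000.1, 60000.2, 60000.3],  # epoch, Modified Julian Date
            "flux": [1.01, 0.99, 1.03],
        })

        release = "dr1"  # hypothetical release tag
        path = f"lightcurves_{release}.parquet"
        lc.to_parquet(path, index=False)

        # A simple integrity record: publish the SHA-256 alongside each
        # release file so downstream users can verify their downloads.
        digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
        print(f"{digest}  {path}")
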
  • Data Architect / Eng.

    International Market Centers 4.6company rating

    Data engineer job in High Point, NC

    We at Andmore are seeking a technically skilled and business-aware Data Architect to lead the development of scalable data infrastructure that powers B2B marketing analytics and decision-making. This role requires deep expertise in Snowflake and the ability to collaborate with marketing technology teams, front-end developers, and CRM specialists to deliver data solutions that support customer engagement and internal operations. Key Responsibilities: * Design and optimize data pipelines and models using dbt and Snowflake, tailored to B2B marketing use cases. * Design and implement logical and physical data models (e.g., dimensional modeling) to represent analytics use cases. * Lead technical discussions with front-end developers (e.g., Power BI specialists), marketing analysts, and CRM engineers to ensure data structures support reporting and analytics needs. * Collaborate with marketing operations and CRM teams (Microsoft Dynamics) to integrate and harmonize data across platforms, enabling unified customer views and actionable insights. * Translate functional marketing requirements into technical specifications by asking targeted questions and proposing scalable data architecture solutions. * Implement data governance and metadata management using tools like Snowflake or Microsoft Purview, ensuring compliance and transparency. * Support self-service analytics and dashboarding through Power BI, enabling marketing teams to explore campaign performance and audience behavior. * Monitor and troubleshoot data workflows, ensuring high availability, data quality, and performance across marketing data assets. * Contribute to the evolution of the marketing data stack, including experimentation with new tools and architecture patterns. Required Qualifications: * Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. * 7+ years of experience in data engineering, with a strong focus on cloud-based data platforms and data modeling. * Proven expertise in Snowflake. * Strong SQL skills and experience with dimensional modeling for marketing and activity data. * Experience working with marketing data sources such as CRM, web analytics, and campaign management platforms. * Excellent communication skills, with a focus on technical collaboration across teams. * Ability to design and implement scalable data architecture solutions. Recommended Skills & Tools: * Familiarity with Snowflake or Microsoft Purview for data governance and cataloging. * Experience integrating data from Microsoft Dynamics and other B2B marketing platforms. * Experience with dbt for data transformations. * Working knowledge of Power BI for dashboarding and reporting. * Understanding of data privacy, compliance (e.g., GDPR), and security best practices. * Strategic thinker with a proactive approach to problem-solving. * Strong interpersonal and technical collaboration skills. * Passion for continuous learning and staying current with data engineering trends.
    $92k-130k yearly est. 60d+ ago
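
    To make the dimensional-modeling emphasis concrete, here is a sketch of the kind of star-schema query such a model serves, run through the snowflake-connector-python client. The account, credentials, warehouse, and table names (fct_campaign_touch, dim_customer) are placeholders, not details from the posting.

        import snowflake.connector  # pip install snowflake-connector-python

        # All identifiers below are illustrative assumptions.
        conn = snowflake.connector.connect(
            account="myorg-myaccount",
            user="etl_user",
            password="***",
            warehouse="ANALYTICS_WH",
            database="MARKETING",
            schema="MARTS",
        )

        # A star-schema query: one fact table (campaign touches) joined to a
        # conformed customer dimension, the shape dimensional modeling aims for.
        query = """
            SELECT d.customer_segment,
                   COUNT(*)        AS touches,
                   SUM(f.clicked)  AS clicks
            FROM fct_campaign_touch AS f
            JOIN dim_customer       AS d ON f.customer_key = d.customer_key
            GROUP BY d.customer_segment
        """
        for segment, touches, clicks in conn.cursor().execute(query):
            print(segment, touches, clicks)

    In a dbt project, the fact and dimension tables themselves would be built as versioned SQL models rather than hand-maintained tables.
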
  • Cloud Data Architect

    ZP Group 4.0company rating

    Data engineer job in Durham, NC

    Piper Companies is seeking a Cloud Data Architect to join a financial services company located in Durham, NC. The Cloud Data Architect will play a pivotal role in designing, implementing, and managing the Data Lake. Responsibilities of the Cloud Data Architect * Develop, design, implement, and manage the Data Lake. * Manage canary releases and blue-green deployments. * Collaborate with product owners to define service level objectives, and manage the health and performance of cloud systems by implementing observability and responsive monitoring. * Ensure all data and infrastructure deployments comply with enterprise security and access controls, best practices, and standards. * Collaborate with partners and internal teams to design scalable and secure cloud architectures based on best practices and business requirements. * Implement security and compliance measures. Qualifications of the Cloud Data Architect * 8+ years of experience as a Data Architect * AWS experience * Hands-on experience architecting a Data Lake Compensation for the Cloud Data Architect * Salary Range: $165,000 * Full Benefits: Medical, Dental, Vision #LI-NT1 #LI-REMOTE This job opens for applications on 1/16/2026. Applications for this job will be accepted for at least 30 days from the posting date.
    $165k yearly 2d ago
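
    For the data-lake work this role centers on, a minimal sketch of landing a partitioned Parquet file in S3 with boto3 follows; the bucket name and Hive-style path are assumptions for illustration, not anything from the posting.

        import io
        import boto3          # pip install boto3
        import pandas as pd   # pip install pandas pyarrow

        # Placeholder bucket; a real lake location would come from infra config.
        BUCKET = "example-data-lake"

        df = pd.DataFrame({"trade_id": [1, 2], "amount": [100.0, 250.5]})

        # Hive-style partitioning (key=value path segments) is a common lake
        # layout that query engines such as Athena can prune on.
        key = "raw/trades/ingest_date=2026-01-16/part-0000.parquet"

        buf = io.BytesIO()
        df.to_parquet(buf, index=False)
        boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=buf.getvalue())
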
  • IBM DataCap Developer

    Krg Technology 4.0company rating

    Data engineer job in Cary, NC

    3+ years' extensive experience with DataCap and C#. Strong familiarity with the Java EE foundation stack: Servlet, JSP, JMS, EL, JCA, CDI, JTA, JPA, JAX-WS, JAX-RS, JMX, JAXR, JASPIC, JSTL, JDBC, JavaMail. Experience applying best practices and methodologies such as Test-Driven Development, Aspect-Oriented Programming, Dependency Injection, and Agile software development. Experience with any application middleware, with hands-on experience integrating with host/legacy applications. Solid experience with PL/SQL and understanding of RDBMS table structures (Oracle experience preferable). FileNet P8 (version 5.x required) platform development, enhancement, customization, and integration in a global environment is preferable. Hands-on knowledge of ECM/BPM APIs and package structure; experience in architecture and design of a Workflow & Process Roadmap. Hands-on experience designing and developing custom web-based applications using the FileNet base web toolkit is preferable. Ability to design, troubleshoot, and resolve DataCap- and FileNet-related issues in clustered environments is preferable. Working knowledge of FileNet-related tools such as FEM, WF Designer, and PCC is preferable. Solid understanding of and experience with principles of object-oriented design, GoF design patterns, Enterprise Integration Patterns, and SOA patterns; demonstrated ability to consult pattern catalogs to identify a standard solution to a common problem, to identify anti-patterns, and to propose corrective solutions.
    Responsibilities include: facilitating business process decisions with members of the Production and Tech teams; performing impact analysis of changes to components; coordinating with development teams during various SDLC stages; participating in testing efforts (unit, system, performance); working in a fast-paced environment; working with and supporting a Level 3 support team consisting of both onshore and offshore resources; writing technical documentation, including design documentation, training materials, and white papers; liaising with architects, developers, and infrastructure managers to ensure the platform meets all functional and non-functional requirements; leading requirements review meetings; platform delivery, including packaging, installation, and configuration; ensuring quality IT delivery through the application of standards and technology design principles; liaising with Production Management to ensure the platform is fully supportable and requires purely run-time involvement to perform its core functions; platform support, including troubleshooting, root cause analysis, and problem resolution; and identifying and assessing risks, determining impact to the platform, and defining mitigation plans.
    Desired but not required: any DataCap and FileNet certifications.
    Qualifications: Experience with DataCap, FileNet, and the Java EE foundation stack.
    Additional Information: All your information will be kept confidential according to EEO guidelines.
    $86k-109k yearly est. 9h ago
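
    Among the methodologies this posting lists, Test-Driven Development is the easiest to illustrate compactly. The sketch below is a generic pytest example, not DataCap-specific code; all names are invented, and the point is only the order of work: the test expresses the requirement, and the function exists to satisfy it.

        # test_discount.py -- run with `pytest` (pip install pytest)

        def apply_discount(amount: float, pct: float) -> float:
            # Implementation written to make the test below pass, TDD-style.
            return round(amount * (1 - pct / 100), 2)

        def test_apply_discount():
            # The test states the expected behavior independently of the
            # implementation details.
            assert apply_discount(200.0, 10) == 180.0
            assert apply_discount(99.99, 0) == 99.99
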
  • Senior Data Engineer

    Elder Research 3.9company rating

    Data engineer job in Cary, NC

    Job Title: Senior Data Engineer. Workplace: Hybrid; due to in-office requirements, candidates must be local to either Raleigh, NC or Charlottesville, VA. Relocation assistance is not available. Clearance Required: Not required, but candidates must be eligible for a clearance. Position Overview: Elder Research, Inc. (ERI) is seeking to hire a Senior Data Engineer with strong engineering skills who will provide technical support across multiple project teams by leading, designing, and implementing the software and data architectures necessary to deliver analytics to our clients, as well as providing consulting and training support to client teams in the areas of architecture, data engineering, ML engineering, and related areas. The ideal candidate will have a strong command of Python for data analysis and engineering tasks, a demonstrated ability to create reports and visualizations using tools like R, Python, SQL, or Power BI, and deep expertise in Microsoft Azure environments. The candidate will play a key role in collaborating with cross-functional teams, including software developers, cloud engineers, architects, business leaders, and power users, to deliver innovative data solutions to our clients. This role requires a consultative mindset, excellent communication skills, and a thorough understanding of the Software Development Life Cycle (SDLC). Candidates should have 7-12 years of relevant experience, including client-facing or consultative roles. The role will be based out of Raleigh, NC or Charlottesville, VA and will require 2-4 days of business travel to our customer site every six weeks.
    Key Responsibilities: Data Engineering & Analysis: * Develop, optimize, and maintain scalable data pipelines and systems in Azure environments. * Analyze large, complex datasets to extract insights and support business decision-making. * Create detailed and visually appealing reports and dashboards using R, Python, SQL, and Power BI. Collaboration & Consulting: * Work closely with software developers, cloud engineers, architects, business leaders, and power users to understand requirements and deliver tailored solutions. * Act as a subject-matter expert in data engineering and provide guidance on best practices. * Translate complex technical concepts into actionable business insights for stakeholders. Azure Expertise: * Leverage Azure services such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, and Azure Blob Storage for data solutions. * Ensure data architecture is aligned with industry standards and optimized for performance in cloud environments. SDLC Proficiency: * Follow and advocate for SDLC best practices in data engineering projects. * Collaborate with software development teams to ensure seamless integration of data solutions into applications.
    Required Qualifications: * Experience: 7-12 years in data engineering, analytics, or related fields, with a focus on Azure environments. * Education: Master's degree in Computer Science, Data Science, Engineering, or a related field. Technical Skills: * Programming: Advanced expertise in Python; experience with R is a plus. * Data Tools: Proficient in SQL, Power BI, and Azure-native data tools. * Azure Knowledge: Strong understanding of Azure services, including data integration, storage, and analytics solutions. * SDLC Knowledge: Proven track record of delivering data solutions following SDLC methodologies. * Consultative Skills: Strong client-facing experience with excellent communication and presentation abilities. * Due to customer requirements, candidates must be US citizens or permanent residents of the United States of America.
    Preferred Skills and Qualifications: * Certifications in Azure (e.g., Azure Data Engineer, Azure Solutions Architect). * Familiarity with Azure Functions, Event Grid, and Logic Apps. * Hands-on experience with machine learning frameworks and big data processing tools (e.g., Spark, Hadoop). * Familiarity with CI/CD pipelines and DevOps practices for data engineering workflows.
    Why apply to this position at Elder Research? * Competitive salary and benefits. * Important work: make a difference supporting U.S. national security. * Job stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract. * People-focused culture: we prioritize work-life balance and provide a supportive, positive, and collaborative work environment, as well as opportunities for professional growth and advancement. * Company stock ownership: all employees are provided with shares of the company each year based on company value and profits.
    $85k-119k yearly est. 58d ago
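
    As a small illustration of the Azure-native plumbing this role involves, the sketch below uploads a raw file into Blob Storage with the azure-storage-blob SDK. The connection string, container, and blob path are placeholders only.

        from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

        # Connection string and names are illustrative, not client details.
        service = BlobServiceClient.from_connection_string("<connection-string>")
        blob = service.get_blob_client(container="raw",
                                       blob="sales/2026/01/sales.csv")

        # Land a local file in the lake's raw zone; a pipeline tool such as
        # Azure Data Factory would typically orchestrate steps like this.
        with open("sales.csv", "rb") as fh:
            blob.upload_blob(fh, overwrite=True)
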
  • Principal Data Architect - S/4HANA & Domain Analytics

    ABB Ltd. 4.6company rating

    Data engineer job in Cary, NC

    At ABB, we help industries outrun - leaner and cleaner. Here, progress is an expectation - for you, your team, and the world. As a global market leader, we'll give you what you need to make it happen. It won't always be easy; growing takes grit. But at ABB, you'll never run alone. Run what runs the world. This position reports to: IS Manager. This person acts as the Architect for Operational Intelligence across two distinct landscapes: our $200M S/4HANA Public Cloud unit and our $6B legacy ECC core. While these environments run separately, you will implement a consistent 'Agentic Readiness' strategy for both. You will architect the Semantic Data Layers (using SAP Datasphere & SAC) that expose live operational data to the Enterprise AI team, ensuring that AI Agents can interact with our legacy ECC supply chain just as effectively as our modern S/4 cloud stack. This role will be responsible for defining and executing a roadmap to expose these critical operational datasets to modern AI and analytics platforms, enabling advanced insights, automation, and data-driven decision making across the enterprise. This role is critical to: * Modernize Legacy Access: Architect a 'Virtual' Semantic Layer over the $6B ECC landscape. This allows the Enterprise AI team to build Agents that can 'see' into the legacy factory floor without requiring a massive data migration. * Establish S/4 Standards: Define the 'Clean Core' data patterns for the new S/4 Public Cloud unit, setting the standard for how Agents interact with Cloud ERP. * Bridge the AI Gap: Ensure that the Enterprise AI team has consistent, governed access to both environments, preventing our legacy core from becoming a 'Data Black Hole' during the AI transformation. The work model for the role is: #LI-Hybrid This role is contributing to the ABB Installation Products Division of the Electrification Business Area.
    In this role, you will act as the Architect of the Semantic Layer for ABB's ERP landscape. You will lead the data strategy for our S/4HANA Public Cloud program, creating a governed, AI-ready data fabric that bridges our Tactical ERP Operations (S/4, ECC) and our Strategic Enterprise Analytics (Snowflake). Beyond standard reporting, you will design the 'Decoupling Layer' (using SAP Datasphere) that extracts business logic from raw ERP tables. This ensures that downstream consumers, whether human analysts in Snowflake or AI Agents in our automation layer, consume consistent, governed, 'apples-to-apples' data, regardless of whether the source is our modern S/4 unit or our legacy ECC core. You will be mainly accountable for: * Shaping the Semantic Architecture: Leading the strategy for the SAP Datasphere and SAC layer. You define how the business consumes operational data directly from the ERP, creating a virtual "Single Source of Truth" that shields the business from the complexity of underlying table structures. * Bridging ERP & Enterprise AI: Acting as the Federation Architect who enables data exchange between our ERP Core and the Corporate Data Platforms (Snowflake, Enterprise AI). You do not manage the Snowflake infrastructure; instead, you ensure that Supply Chain and Finance data is modeled, clean, and "Contract-Ready" for the Enterprise team to ingest. * Defining the Governance "Decision Tree": Building the central reporting catalog and ownership structure. You establish the clear rules of engagement: guiding the business on when to use Operational Reporting (Datasphere/SAC) for real-time action versus when to turn to the Enterprise Data Lake (Snowflake) for aggregated strategic analysis. * Enabling Agentic AI: Defining the data accessibility models for the SAP landscape. You ensure that our ERP data is not just "human-readable" on dashboards but "machine-readable" via APIs and Semantic Views, preparing our landscape for AI Agents that require structured, real-time context.
    Qualifications for the role: * The "Semantic Architect": 8-12 years in data architecture. You are a modeler, not a report developer. You think in "Semantic Layers," "Reusability," and "Data Products," not just "Dashboards." * The "S/4 Builder": Expert-level, hands-on experience with SAP Datasphere (formerly DWC) and SAP Analytics Cloud (SAC). You have at least one full S/4HANA implementation under your belt and understand CDS Views and OData services deeply. * The "Domain Expert": You speak the language of the physical supply chain. You know the difference between a "Material Document" and a "Financial Posting," and you can translate complex logistics operations into clean data models. * The "Federation Thinker": Strong proficiency in Snowflake data modeling and SQL. You must speak both "SAP" (CDS Views, Extractors) and "Cloud Native" languages to bridge the two worlds. * The "AI Ready" Mindset: Understanding of AI/ML data models and how to architect data for automation. You know what it takes to make data "Contract-Ready" for an AI Agent. * The "Strategic Communicator": Proven ability to define a roadmap and communicate it to C-level stakeholders. You don't just build technical solutions; you define the Data Strategy that guides the business.
    What's in it for you? * Ownership & Autonomy: This is a role for a self-starter who wants to own the Data Strategy, not just execute tickets. You will have a seat at the table to define the "Single Source of Truth." * Build the Future: This is a rare "Greenfield" opportunity to build a Modern Data Fabric from the ground up using the latest SAP stack (Datasphere, SAC). You aren't here to patch legacy code; you are here to build the Agentic Data Blueprint for the next decade. * Global Impact: You will be the architect behind our "Lighthouse" program. The standards you set today for the S/4 Public Cloud will become the roadmap for our entire $6B North American operation. It's a complex, intellectual challenge where your work directly shapes the future of the enterprise.
    More about us: ABB Installation Products Division (formerly Thomas & Betts) helps manage the connection, protection, and distribution of electrical power from source to socket. The Division's products are engineered to provide ease of installation and to perform in demanding and harsh conditions, helping to ensure safety and continuous operation for utilities, businesses, and people around the world. The Commercial Essentials product segment includes electrical junction boxes, commercial fittings, and strut and cable tray metal framing systems for commercial and residential construction. The Premier Industrial product segment includes multiple product lines, such as Ty-Rap cable ties, T&B Liquidtight Systems protection products, PVC-coated and nylon conduit systems, power connection and grounding systems, and cable protection systems of conduits and fittings for harsh and industrial applications. The Division also manufactures solutions for medium-voltage applications used in the utility market under its marquee brands, including Elastimold reclosers and switchgear, capacitor switches, current-limiting fuses, Homac distribution connectors, the Hi-Tech Valiant full-range current-limiting fuse for fire mitigation, faulted circuit indicators and distribution connectors, and cable accessories and apparatus with products for overhead and underground distribution. Manufacturing includes made-to-stock and custom-made solutions.
    #ABBCareers #RunwithABB #Runwhatrunstheworld We value people from different backgrounds. Could this be your story? Apply today or visit *********** to read more about us and learn about the impact of our solutions across the globe. Fraud Warning: Any genuine offer from ABB will always be preceded by a formal application and interview process. We never ask for money from job applicants. For current open positions you can visit our career website *********************************** and apply. Please refer to the detailed recruitment fraud caution notice using the link ***********************************/how-to-apply/fraud-warning.
    $79k-104k yearly est. 34d ago
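
    One concrete slice of the "machine-readable" goal above: CDS views exposed through OData can be consumed with plain HTTP. The sketch below assumes a hypothetical SAP Gateway OData v2 service; the URL, entity set, credentials, and field names are all invented for illustration.

        import requests  # pip install requests

        # Hypothetical OData endpoint exposing a CDS view.
        BASE = "https://erp.example.com/sap/opu/odata/sap/ZMATERIAL_DOC_SRV"

        resp = requests.get(
            f"{BASE}/MaterialDocuments",
            params={"$filter": "Plant eq '1000'", "$top": "5",
                    "$format": "json"},
            auth=("svc_user", "***"),
            timeout=30,
        )
        resp.raise_for_status()

        # Classic SAP Gateway OData v2 responses wrap rows in d.results.
        for doc in resp.json()["d"]["results"]:
            print(doc.get("MaterialDocument"), doc.get("PostingDate"))
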
  • Database Developer, Durham, NC

    Esrhealthcare

    Data engineer job in Durham, NC

    5+ years of hands-on Oracle development experience and strong Oracle development practices. Hands-on Unix experience; Korn shell scripting preferred, Perl scripting a strong plus. Hands-on Informatica experience; hands-on 24x7 support operations; hands-on data analysis, issue triaging, and issue resolution. Excellent SQL skills: 5+ years minimum required, including writing efficient SQL and SQL tuning. Experience working in a large data warehouse environment. Strong analytical and technical design skills. Strong written and oral communication skills. Attention to detail and the ability to juggle multiple projects simultaneously. Proven ability to work in a team environment with minimal supervision. Should be reliable, thorough, a fast learner, and self-motivated. RED, Attunity Replication, and Greenplum knowledge is an added advantage. Prior experience with Control-M preferred. Educational Qualifications: Master's degree in Computer Science (any), Engineering (any), or MBA (any); or, as an alternative, a Bachelor's degree with 5 years' experience and/or any suitable combination of education, training, or work experience. Location: Durham, NC
    $76k-100k yearly est. 4d ago
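
    On the "writing efficient SQL and SQL tuning" requirement: one standard Oracle practice is using bind variables, so a single parsed statement is reused across executions instead of being hard-parsed per literal. A sketch with the python-oracledb driver follows; the connection details and table name are placeholders.

        import oracledb  # pip install oracledb

        # Placeholder credentials and DSN.
        conn = oracledb.connect(user="dw", password="***", dsn="dbhost/DWPDB")
        cur = conn.cursor()

        # The :ddate bind variable lets Oracle reuse one parsed cursor across
        # calls, a standard tuning practice versus concatenating literals
        # into the SQL text.
        cur.execute(
            "SELECT COUNT(*) FROM fact_orders WHERE order_date = :ddate",
            ddate="2026-01-16",
        )
        print(cur.fetchone()[0])
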
  • Data Analytics Architect

    360 It Professionals 3.6company rating

    Data engineer job in Morrisville, NC

    360 IT Professionals is a software development company based in Fremont, California that offers complete technology services in mobile development, web development, cloud computing, and IT staffing. Merging information technology skills across all its services and operations, the company caters to its globally positioned clients by providing dynamic, feasible IT solutions. 360 IT Professionals works alongside its clients to deliver high-performance results based exclusively on each one-of-a-kind requirement. Our services are vast, and we produce software and web products. We specialize in mobile development, i.e., iPhone and Android apps. We use the Objective-C and Swift programming languages to create native applications for iPhone, and we use Android code to develop native applications for Android devices. To create applications that work across platforms, we use a number of frameworks such as Titanium, PhoneGap, and jQuery Mobile. Furthermore, we build web products and offer services such as web design, layouts, responsive design, graphic design, web application development using frameworks based on model-view-controller architecture, and content management systems. Our services also extend to the domain of cloud computing, where we provide Salesforce CRM to effectively manage one's business and ease all operations through an easy platform. Apart from this, we also provide IT staffing services that can help your organization to a great extent, as you can hire highly skilled personnel through us. We make sure that we deliver performance-driven products that are optimally developed as per your organization's needs. Take a shot at us for your IT requirements and experience a radical change.
    Job Description: Working with Data Architects to model data for use in BI modeling tools like Cognos and Tableau; understand the importance of denormalized and aggregate data for optimized reporting; and tune SQL generated by BI tools for performance optimization.
    What software tools/skills are needed to perform these daily responsibilities? SQL, Tableau, Cognos, Oracle DB, MySQL.
    What skills/attributes are a must-have? SQL scripting, Tableau report development, DB tuning.
    What skills/attributes are nice to have? Cognos Framework Manager, SQL scripting.
    Additional Information: Thanks & Regards, Preeti Nahar, Sr. Talent & Client Acquisition Specialist, 360 IT Professionals Inc. | 510-254-3300 Ext 140
    $91k-120k yearly est. 9h ago
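
    The point above about denormalized and aggregate data for optimized reporting can be shown in miniature: BI tools query a small pre-summarized table instead of scanning raw fact rows on every dashboard refresh. The sketch below uses Python's built-in sqlite3 purely for illustration; the schema is invented.

        import sqlite3

        # In-memory demo of the aggregate-table idea.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE sales (region TEXT, amount REAL)")
        db.executemany("INSERT INTO sales VALUES (?, ?)",
                       [("east", 100.0), ("east", 50.0), ("west", 75.0)])

        # The aggregate ("summary") table a Tableau or Cognos model would
        # point at instead of the raw fact table.
        db.execute("""
            CREATE TABLE sales_by_region AS
            SELECT region, SUM(amount) AS total_amount, COUNT(*) AS n_orders
            FROM sales GROUP BY region
        """)
        for row in db.execute("SELECT * FROM sales_by_region ORDER BY region"):
            print(row)  # ('east', 150.0, 2) then ('west', 75.0, 1)

    In a warehouse, the same idea appears as materialized views or scheduled summary tables, traded off against the freshness of the raw data.
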

Learn more about data engineer jobs

How much does a data engineer earn in Burlington, NC?

The average data engineer in Burlington, NC earns between $68,000 and $121,000 annually. This compares to the national data engineer salary range of $80,000 to $149,000.

Average data engineer salary in Burlington, NC

$91,000