*Top Skills' Details*
* Data engineering experience
* Experience with AWS/public cloud
* Strong Python experience
* Snowflake experience
* Data automation with AI (very important to this role)
* Building agents using LangChain or LangGraph (they are currently using LangGraph)
* AWS: understanding how to deploy on it
*Description*
Our client is currently seeking a Data Engineer with AI automation experience. This is a short-term role until they find funding to extend it; the manager said that could take 6 months given budget constraints. The person we place in the role could also be under consideration for the permanent position.
They can utilize someone who has experience in one or more of these areas:
AI engineer - Experience building AI agents, MCP server/tools, etc.
Data/ETL Engineer - Experience with Apache Airflow, Glue ETL, and Snowflake
Python developer with AWS experience - to build scripts for different use cases; strong AWS knowledge is important
UI developer (Angular) - we have a design ready for a dashboard and could use them to finish that project.
Ideally, someone would have experience in all 4 areas above, but they will also consider an SME in just one of them.
Data Engineer with AI Experience
What You Will Do:
* Design and implement AI Agents to monitor and optimize cloud resources based on findings and recommendations from Cloud Service Providers.
* Develop predictive models for drift detection, cost anomaly detection, and forecasting of public cloud resources and spend.
* Automate operational workflows using machine learning and intelligent scripting.
* Integrate AI-driven insights with Cloud Service Providers like AWS, GCP, Azure, and existing data and tools.
* Conduct anomaly detection for security, cost optimization, and performance analytics.
* Design, build, and maintain scalable ETL pipelines using AWS Glue and other cloud-native services.
* Utilize AWS Athena for interactive querying of data stored in data lakes.
* Manage and optimize data storage and processing using Snowflake cloud data platform.
* Orchestrate complex workflows and data pipelines using Apache Airflow DAGs.
* Continuously evaluate emerging AI technologies and tools for operational improvements.
* Maintain documentation and best practices for AI/ML integration in cloud systems.
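Several of the responsibilities above (predictive models, cost anomaly detection) reduce to flagging outliers in spend telemetry. As an illustrative sketch only - the function name, threshold, and data here are hypothetical, and a real implementation would pull billing data from the Cloud Service Provider's APIs - a simple z-score detector over daily spend might look like:

```python
from statistics import mean, stdev

def flag_cost_anomalies(daily_spend, threshold=3.0):
    """Flag indices of days whose spend deviates more than
    `threshold` standard deviations from the mean (z-score test)."""
    mu = mean(daily_spend)
    sigma = stdev(daily_spend)
    if sigma == 0:
        return []  # flat spend: nothing can be anomalous
    return [i for i, x in enumerate(daily_spend)
            if abs(x - mu) / sigma > threshold]

# Hypothetical daily spend in USD; day 5 is a spike.
spend = [102.0, 98.5, 101.2, 99.8, 100.4, 410.0, 97.9]
print(flag_cost_anomalies(spend, threshold=2.0))  # → [5]
```

Production anomaly detection would use more robust methods (seasonality-aware forecasting, per-service baselines), but the shape of the problem is the same: compare observed spend against an expected distribution and alert on the residual.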
Our Minimum Requirements Include:
* Bachelor's or Master's degree in Computer Science, Data Science, or related technical field, or equivalent experience.
* Proven ability to build and deploy ML models, with at least 2 years focused on cloud operations.
* Solid knowledge of cloud technologies (AWS, GCP, Azure, OCI).
* Experience with Python, PySpark, and ML libraries such as PyTorch, TensorFlow, or scikit-learn.
* Comfortable working with streaming data, APIs, and telemetry systems.
* Experience with AWS Glue ETL, AWS Athena, Snowflake, and Apache Airflow DAGs.
* Strong communication and multi-functional collaboration skills.
* Experience with Agile and DevOps operating models, including project tracking tools (e.g., Jira), Git (any Version Control systems), and CI/CD systems (e.g., GitLab, GitHub Actions, Jenkins).
* Proficient in general-purpose programming languages (Python, Golang, Bash) and development platforms and technologies.
Preferred Qualifications:
* Understanding of Cloud Technologies and Services of one or more providers including AWS, GCP, Azure, Oracle, and Alibaba.
* Established record of leading technical initiatives, delivering results, and a commitment to fostering a supportive work environment.
* Hard-working, dedicated to providing quality support for your customers.
* Full stack development experience with Angular for frontend and Flask for backend application development
*Additional Skills & Qualifications*
Cloud management and optimization
* Cloud account provisioning
* Multi-cloud management
* Managed services
* Public cloud infrastructure: the team governs all public cloud accounts (GCP, Azure, AWS, etc.) at the top level - governance, security, and so on - rather than individual deployments.
* They have administrative access to the clouds and are responsible for security, governance, and cost management.
* Cloudx platform automation: this group of developers helps rebuild different platforms to support various business units.
* They want to present cost-optimization data in a meaningful way.
* Governance control is done at scale through automation.
* Not every LLM is permitted; pipelines are built with only their custom models.
* Building the data pipelines: most deployments are done in AWS (public cloud).
* Snowflake: all of the data is aggregated here.
* Glue pipelines are used in AWS, with Python scripts for the data.
* They will dive deeper into AI in the coming months.
* Data on cloud spend is already available; they have released an agentic AI approach.
* MCP servers: build these to support the current environment, and build AI pipelines for them.
* Someone who knows data automation with AI would be a good fit.
* Building agents using LangChain or LangGraph; they are currently using LangGraph.
* AWS: understanding how to deploy on it. They deploy in containers, but they have boilerplate, so that experience isn't strictly necessary.
* Building pipelines in Python; ETL experience; taking data and making it meaningful in a pipeline.
* Working with developers on the team and with AI.
* Organization-wide deployments: monitor resources - if someone deletes a resource, how do we remediate it? Monitoring that data.
* Cost optimization, security, etc. are the main use cases.
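The monitoring scenario described above (detecting when someone deletes a governed resource) can be sketched as a simple inventory diff. All resource IDs and fields below are hypothetical, and a real version would read inventories from AWS Config or the provider's APIs rather than in-memory dicts:

```python
def detect_deleted_resources(previous, current):
    """Compare two resource-inventory snapshots (dicts mapping
    resource ID -> metadata) and return anything that vanished."""
    return {rid: meta for rid, meta in previous.items()
            if rid not in current}

# Hypothetical inventory snapshots keyed by resource ID.
before = {
    "i-0abc": {"type": "ec2-instance", "owner": "team-a"},
    "sg-01x": {"type": "security-group", "owner": "team-b"},
}
after = {
    "i-0abc": {"type": "ec2-instance", "owner": "team-a"},
}
print(detect_deleted_resources(before, after))
# → {'sg-01x': {'type': 'security-group', 'owner': 'team-b'}}
```

From there, remediation could mean alerting the owner, re-provisioning from infrastructure-as-code, or feeding the event to an agent - the diff is just the trigger.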
*Experience Level*
Intermediate Level
*Job Type & Location*
This is a Contract position based out of Morrisville, NC.
*Pay and Benefits*
The pay range for this position is $40.00 - $45.00/hr.
Eligibility requirements apply to some benefits and may depend on your job
classification and length of employment. Benefits are subject to change and may be
subject to specific elections, plan, or program terms. If eligible, the benefits
available for this temporary role may include the following:
* Medical, dental & vision
* Critical Illness, Accident, and Hospital
* 401(k) Retirement Plan - Pre-tax and Roth post-tax contributions available
* Life Insurance (Voluntary Life & AD&D for the employee and dependents)
* Short and long-term disability
* Health Spending Account (HSA)
* Transportation benefits
* Employee Assistance Program
* Time Off/Leave (PTO, Vacation or Sick Leave)
*Workplace Type*
This is a fully remote position.
*Application Deadline*
This position is anticipated to close on Jan 23, 2026.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
About TEKsystems and TEKsystems Global Services
We're a leading provider of business and technology services. We accelerate business transformation for our customers. Our expertise in strategy, design, execution and operations unlocks business value through a range of solutions. We're a team of 80,000 strong, working with over 6,000 customers, including 80% of the Fortune 500 across North America, Europe and Asia, who partner with us for our scale, full-stack capabilities and speed. We're strategic thinkers, hands-on collaborators, helping customers capitalize on change and master the momentum of technology. We're building tomorrow by delivering business outcomes and making positive impacts in our global communities. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at TEKsystems.com.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
$40-45 hourly 2d ago
Staff Full Stack Software Engineer, Platform Engineering
Cloudera 4.7
Data engineer job in Raleigh, NC
At Cloudera, we empower people to transform complex data into clear and actionable insights. With as much data under management as the hyperscalers, we're the preferred data partner for the top companies in almost every industry. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world's largest enterprises.
Ready to take cloud innovation to the next level? Join Cloudera's Anywhere Cloud team and help deliver a true "build your own pipeline, bring your own engine" experience, enabling data and AI workloads to run anywhere, without friction or vendor lock-in. We take the best of the public cloud - cost efficiency, scalability, elasticity, and agility - and extend it to wherever data lives: public clouds, private data centers, and even the edge. Powered by Kubernetes, our hybrid architecture separates compute and storage, giving customers maximum flexibility and optimized infrastructure usage.
We are looking for a Staff Full Stack Software Engineer to lead the architecture and delivery of AI-powered workflows that are core to our product. You will define the technical strategy, set quality and reliability standards, and deliver end-to-end systems that transform ambiguous customer needs into robust, measurable, and privacy-safe AI experiences. You'll partner closely with Product, Design, Data Science, and GTM to deliver high-impact features at scale.
As a Staff Full Stack Software Engineer you will:
Own the architecture: Design, evolve, and document the end-to-end AI workflow stack (prompting, retrieval, tools/function-calling, agents, orchestration, evaluation, observability, and safety) with clear interfaces, SLAs, and versioning.
Ship production systems: Build reliable, low-latency services that integrate foundation models (hosted and self-hosted), and traditional microservices.
Own end-to-end delivery of features from the user-facing aspect (UI) to the backend services.
Implement robust testing frameworks, including unit, regression, and end-to-end tests, to guarantee deterministic and predictable behavior from our AI-powered data platform. Establish safety guardrails and human-in-the-loop processes to maintain accuracy and ensure the production of ethical, responsible, and non-toxic outputs.
Optimize for cost & performance: Instrument, analyze, and optimize unit economics (token usage, caching, batching, distillation) and performance (p95 latency, throughput, autoscaling).
Drive data excellence: Shape data contracts, feedback loops, labeling strategies, and feature stores to continuously improve model and workflow quality.
Mentor and multiply: Provide technical leadership across teams, unblock complex projects, raise code/design standards, and mentor senior engineers.
Partner across functions: Translate product intent into technical plans, influence roadmaps with data-driven insights, and communicate trade-offs to executives and stakeholders.
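The cost-and-performance responsibilities above lean on metrics like p95 latency. As a minimal, illustrative sketch (the latency samples are hypothetical), the nearest-rank method computes a percentile like this:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the smallest sample value such
    that at least pct% of samples are <= it."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical request latencies in milliseconds.
latencies = [12, 15, 11, 14, 250, 13, 16, 12, 15, 14,
             13, 12, 900, 15, 14, 13, 12, 16, 15, 14]
print(percentile(latencies, 95))  # → 250
```

Note how the p95 (250 ms) exposes the tail that the mean would hide - which is exactly why latency SLAs for AI-backed services are usually stated in percentiles rather than averages.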
We are excited about you if you have:
Bachelor's degree in Computer Science or equivalent, and 6+ years of experience
Expertise in at least one primary language (Rust preferred) and ecosystem (e.g., Python, Go, or Java) and cloud-native architectures (containers, service mesh, queues, eventing).
Proven experience in integrating AI/ML models into user interfaces. This is more than just calling an API; you should have experience building features like AI-powered assistants, natural language interfaces (e.g., text-to-SQL), proactive suggestions, or intelligent data visualization.
Familiarity with the AI/ML ecosystem: You understand the fundamentals of LLMs, vector databases, RAG, and prompt engineering. Familiarity with tools such as MLflow, LangChain, or Hugging Face is a significant advantage.
Security & privacy mindset: Familiarity with data governance, PII handling, tenant isolation, and compliance considerations.
You might also have:
Platform thinking: Experience designing reusable AI workflow primitives, SDKs, or internal platforms used by multiple product teams.
Model ops: Experience with model lifecycle management, feature/embedding stores, prompt/version management, and offline/online eval systems.
Search & data infra: Experience with vector databases (e.g., Pinecone, Weaviate, pgvector), retrieval strategies, and indexing pipelines.
Observability: Built robust tracing/metrics/logging for AI systems; familiarity with quality dashboards and prompt diff tooling.
Cost strategy: Experience with model selection, distillation, caching layers, router policies, and autoscaling to manage spend.
Experience with managing machine learning workloads on container orchestration platforms like Kubernetes, including setting up GPU resources, managing distributed training jobs, and deploying models at scale.
Why this role matters:
This is more than cloud management; it's about building the foundation for a consistent, secure, and compliant cloud experience that gives organizations 100% access to 100% of their data, anywhere.
With the recent acquisition of Taikun, we are simplifying Kubernetes and cloud management even further, creating a platform that is unified, scalable, and future-ready.
If you are passionate about Kubernetes - not just using it, but building it at the core and managing workloads across hybrid clouds and datacenters - and obsessed with performance, DevOps, and more, this is where you belong.
This role is not eligible for immigration sponsorship
What you can expect from us:
Generous PTO Policy
Support work life balance with Unplugged Days
Flexible WFH Policy
Mental & Physical Wellness programs
Phone and Internet Reimbursement program
Access to Continued Career Development
Comprehensive Benefits and Competitive Packages
Paid Volunteer Time
Employee Resource Groups
EEO/VEVRAA
#LI-BV1
#LI-REMOTE
$110k-144k yearly est. 4d ago
Principal Software Engineer
Divihn Integration Inc.
Data engineer job in Raleigh, NC
Title: Senior Principal Software Systems Engineer (3 Openings) - Hybrid
Duration: 12 Months
Role is Hybrid: 3 days in office and 2 from home.
For further inquiries regarding the following opportunity, please contact our Talent Specialist.
Hema at **************
Description:
This is where your work saves lives
As a Senior Principal Software Systems Engineer in the software organization, you will be responsible for developing innovative healthcare solutions and supporting development and sustaining activities within connected Infusion Pump Platforms to meet customer needs and regulatory standards.
What you'll be doing:
o Drive the implementation of best practices in software systems development and product lifecycles in collaboration with development and verification teams, for Digital Applications that are part of infusion pumps ecosystem
o Be a technical leader providing team members guidance and feedback on technical work.
o Develop technical solutions to complex software system problems and deliver high-quality solutions on tight schedules
o Lead efforts with cross-functional team members (e.g. Commercial and Clinical) to document user needs and translate them into system requirements.
o Lead decomposition of system requirements into software subsystem requirements.
o Lead risk analysis activities for Digital Applications software from the capture of inherent hazards through mitigation implementation.
o Work with verification engineers to define test strategies for the development of verification and validation plans using requirement tracing methods.
o Participate in software design reviews for components or features.
o Perform product backlog and feature grooming/definition activities as part of Agile planning/execution
o Drive collaboration with internal and external stakeholders and enable the team with better processes, practices, and technical mentorship.
o Interface with manufacturing, service, and customer training staff through the design transfer process.
o Ensure compliance to the product development process and quality system.
What you'll need:
o Subject matter expertise in requirements management and risk management for complex, medically regulated, connected/interoperable system of systems
o Bachelor's degree in an engineering discipline with 10 + years of experience.
o Experience with Digital Applications (SaMD, MDDS) connected to regulated electro-mechanical devices in a clinical environment preferred.
o Experience with development in an agile environment with experience creating and maintaining product backlogs.
o Excellent oral and written communication skills.
o Experience in a regulated industry preferred.
o Excellent documentation skills.
If you're a passionate and innovative software systems engineer with a desire to shape the future of healthcare technology, we want to hear from you. Apply now to become a part of our dynamic team and help us create life-changing solutions for millions of people around the world.
$93k-124k yearly est. 3d ago
Data Scientist, Product Analytics
Meta 4.8
Data engineer job in Raleigh, NC
As a Data Scientist at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Oculus). By applying your technical skills, analytical mindset, and product intuition to one of the richest data sets in the world, you will help define the experiences we build for billions of people and hundreds of millions of businesses around the world. You will collaborate on a wide array of product and business problems with a wide range of cross-functional partners across Product, Engineering, Research, Data Engineering, Marketing, Sales, Finance and others. You will use data and analysis to identify and solve product development's biggest challenges. You will influence product strategy and investment decisions with data, be focused on impact, and collaborate with other teams. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.

Product leadership: You will use data to shape product development, quantify new opportunities, identify upcoming challenges, and ensure the products we build bring value to people, businesses, and Meta. You will help your partner teams prioritize what to build, set goals, and understand their product's ecosystem.

Analytics: You will guide teams using data and insights. You will focus on developing hypotheses and employ a varied toolkit of rigorous analytical approaches, different methodologies, frameworks, and technical approaches to test them.

Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
**Required Skills:**
Data Scientist, Product Analytics Responsibilities:
1. Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches
2. Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop strategies for our products that serve billions of people and hundreds of millions of businesses
3. Identify and measure success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends
4. Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations
5. Partner with Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions
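The forecasting and goal-monitoring work described above often starts from naive baselines before anything sophisticated is tried. As an illustrative sketch only (the metric name and numbers are hypothetical), a moving-average forecast of a product metric looks like:

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value of a metric as the mean of the
    last `window` observations (a naive baseline)."""
    if len(series) < window:
        raise ValueError("series shorter than window")
    return sum(series[-window:]) / window

# Hypothetical daily active users (in thousands).
dau = [120, 124, 123, 126, 129, 132]
print(moving_average_forecast(dau, window=3))  # → 129.0
```

Baselines like this matter because any proper forecasting model (seasonal decomposition, regression, etc.) should be judged by how much it beats them.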
**Minimum Qualifications:**
6. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
7. Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent
8. 4+ years of work experience in analytics, data querying languages such as SQL, scripting languages such as Python, and/or statistical/mathematical software such as R (minimum of 2 years with a Ph.D.)
9. 4+ years of experience solving analytical problems using quantitative approaches, understanding ecosystems, user behaviors and long-term product trends, and leading data-driven projects from definition to execution (including defining metrics, experiment design, and communicating actionable insights)
**Preferred Qualifications:**
10. Master's or Ph.D. Degree in a quantitative field
**Public Compensation:**
$147,000/year to $208,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
$147k-208k yearly 60d+ ago
Senior Data Engineer
The Clorox Company 4.6
Data engineer job in Durham, NC
Clorox is the place that's committed to growth - for our people and our brands. Guided by our purpose and values, and with people at the center of everything we do, we believe every one of us can make a positive impact on consumers, communities, and teammates. Join our team. #CloroxIsThePlace
**Your role at Clorox:**
We are seeking an experienced and highly skilled senior data engineer to join our enterprise data strategy and operations team. The ideal candidate will have extensive expertise in designing, building, and maintaining data pipelines and data solution architectures on cloud platforms, particularly Azure. This role involves leading data engineering products, optimizing data processing workflows, ensuring data quality and governance, and collaborating with cross-functional teams to support analysis and insights generation to fuel large-scale ideation and roadmapping associated with revenue growth and margin improvement projects at scale across the enterprise.
The Senior Data Engineer at Clorox will play a key role in leading and delivering enterprise-quality data solutions that enable data-driven business decisions. The role suits an analytical, big-picture thinker with a product mindset and a strong background in business intelligence and engineering on cloud platforms, who can leverage technology to build scalable data products that create value across the organization. This role will also serve as a key collaborator with our business analytics and enterprise technology stakeholders to innovate, build, and sustain the cloud data infrastructure that will further Clorox's digital transformation efforts.
**In this role, you will:**
+ **Collaborate & Lead** : Work closely with business product owners, data scientists, analysts, and cross-functional stakeholders to understand the business' data needs and provide technical solutions. Influence business partners to align to the technical solutions and to adhere to technical architecture standards. Provide technical guidance to junior engineers, BI developers, and contractors to create efficient and effective data solutions.
+ **Architect and Innovate** : Strong proficiency in Python, Spark, SQL, PySQL, Pandas, and CI/CD methodologies is required, along with strong data ingestion, data modeling, and dimensional modeling skills using the medallion lakehouse architecture. Strong BI skills to build reports and dashboards using Power BI, Tableau, etc. Experience with reporting security (row-level, column-level, object-level, masking, etc.); with SQL and DML to recast data in backend databases for data changes, restatements, data processing errors, etc.; with MLOps and supporting data science workflow pipelines; and knowledge of Gen AI frameworks and LLMs to support agentic products.
+ **Optimize and Scale:** Build and maintain data pipelines to integrate data from various source systems. Optimize data pipelines for performance, reliability and cost-effectiveness. Work with enterprise infrastructure and technology teams to implement best practices for performance monitoring, cloud resource management, including scaling, cost control and security.
+ **Ensure Quality and Governance:** Ensure safe custody, transport and storage of data in the data platforms. Collaborate with Data Governance Stewards and Business Stakeholders to enforce the business rules, data quality rules and data cataloging activities. Ensure data quality, security and compliance for the data products responsible under this role.
+ **Enhance BI Capabilities:** Develop and manage business intelligence solutions for the organization to transform data into insights that can drive business value. Help Analytics Product Owners and Business Leaders improve business decisions through data analytics, data visualization, and data modeling techniques and technologies.
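The medallion (bronze/silver/gold) pattern referenced above can be illustrated without Spark. The following is a toy, pure-Python sketch of the three layers; the field names and records are hypothetical, and a real lakehouse would implement each layer as Delta tables with Spark jobs:

```python
# Bronze: raw records as ingested (duplicates, nulls and all).
bronze = [
    {"order_id": "1", "amount": "10.50", "region": "east"},
    {"order_id": "1", "amount": "10.50", "region": "east"},  # duplicate
    {"order_id": "2", "amount": None, "region": "west"},     # bad record
]

# Silver: cleansed and deduplicated, with typed values.
seen = set()
silver = []
for rec in bronze:
    if rec["order_id"] in seen or rec["amount"] is None:
        continue  # drop duplicates and records missing an amount
    seen.add(rec["order_id"])
    silver.append({**rec, "amount": float(rec["amount"])})

# Gold: aggregated, business-ready view (revenue per region).
gold = {}
for rec in silver:
    gold[rec["region"]] = gold.get(rec["region"], 0.0) + rec["amount"]

print(gold)  # → {'east': 10.5}
```

The point of the pattern is that each layer has a contract: bronze preserves the raw feed, silver enforces quality rules, and gold serves BI tools like Power BI or Tableau directly.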
**What we look for:**
+ 7+ years of experience if the candidate holds a BS degree in Computer Science, Information Systems, or a relevant stream; 5-7 years of experience if the candidate holds an MS/PhD degree
+ Experience in architecting data solutions, cloud data engineering, end-to-end data warehouse or lakehouse implementations, and end-to-end business intelligence implementations
+ 7 plus years of experience with data engineering, data warehousing, and business intelligence, with substantial experience managing large-scale data projects
+ 5 plus years' experience with data solution implementations in cloud platform technologies like Microsoft Azure, AWS, etc.
+ 4 plus years with business intelligence using technologies like Power BI, Tableau, etc.
+ 4 plus years of experience with Azure services like Data Factory, Databricks, and Delta Lake will be an added advantage
+ Experience in end-to-end support for data engineering solutions (data pipelines), including designing, developing, deploying, and supporting solutions for existing platforms
+ Knowledge of or experience in Microsoft D365 Dataverse and reporting in Microsoft Fabric technology
#LI-HYBRID
**Workplace type:**
Hybrid - 3 days in the office, 2 days WFH
**Our values-based culture connects to our purpose and empowers people to be their best, professionally and personally. We serve a diverse consumer base which is why we believe teams that reflect our consumers bring fresh perspectives, drive innovation, and help us stay attuned to the world around us. That's why we foster an inclusive culture where every person can feel respected, valued, and fully able to participate, and ultimately able to thrive.**
**[U.S.]Additional Information:**
At Clorox, we champion people to be well and thrive, starting with our own people. To help make this possible, we offer comprehensive, competitive benefits that prioritize all aspects of wellbeing and provide flexibility for our teammates' unique needs. This includes robust health plans, a market-leading 401(k) program with a company match, flexible time off benefits (including half-day summer Fridays depending on location), inclusive fertility/adoption benefits, and more.
We are committed to fair and equitable pay and are transparent with current and future teammates about our full salary ranges. We use broad salary ranges that reflect the competitive market for similar jobs, provide sufficient opportunity for growth as you gain experience and expand responsibilities, while also allowing for differentiation based on performance. Based on the breadth of our ranges, most new hires will start at Clorox in the first half of the applicable range. Your starting pay will depend on job-related factors, including relevant skills, knowledge, experience and location. The applicable salary range for every role in the U.S. is based on your work location and is aligned to one of three zones according to the cost of labor in your area.
-Zone A: $128,000 - $252,200
-Zone B: $117,400 - $231,200
-Zone C: $106,700 - $210,200
All ranges are subject to change in the future. Your recruiter can share more about the specific salary range for your location during the hiring process.
This job is also eligible for participation in Clorox's incentive plans, subject to the terms of the applicable plan documents and policies.
Please apply directly to our job postings and do not submit your resume to any person via text message. Clorox does not conduct text-based interviews and encourages you to be cautious of anyone posing as a Clorox recruiter via unsolicited texts during these uncertain times.
To all recruitment agencies: Clorox (and its brand families) does not accept agency resumes. Please do not forward resumes to Clorox employees, including any members of our leadership team. Clorox is not responsible for any fees related to unsolicited resumes.
**Who we are.**
We champion people to be well and thrive every single day. We're proud to be in every corner of homes, schools, and offices-making daily life simpler and easier through our beloved brands. Working with us, you'll join a team of passionate problem solvers and relentless innovators fueled by curiosity, growth, and progress. We relish taking on new, interesting challenges that allow our people to collaborate and thrive at work. And most importantly, we care about each other as multifaceted, whole humans. Join us as we reimagine what's possible and work with purpose to make a difference in the world.
**This is the place where doing the right thing matters.**
Doing the right thing is the compass that guides every decision we make-and we're proud to be globally recognized and awarded for our continuous corporate responsibility efforts. Clorox is a signatory of the United Nations Global Compact and the Ellen MacArthur Foundation's New Plastics Economy Global Commitment. The Clorox Company and its Foundation prioritize giving back to the communities we call home and contribute millions annually in combined cash grants, product donations, and cause-marketing. For more information, visit TheCloroxCompany.com and follow us on social media at @CloroxCo.
**Our commitment to diversity, inclusion, and equal employment opportunity.**
We seek out and celebrate diverse backgrounds and experiences. We're always looking for fresh perspectives, a desire to bring your best, and a nonstop drive to keep growing and learning. Learn more about our Inclusion, Diversity, Equity, and Allyship (IDEA) journey.
The Clorox Company and its subsidiaries are an EEO/AA/Minorities/Women/LGBT/Protected Veteran/Disabled employer. Learn more to Know Your Rights (*********************************************************************************************** .
Clorox is committed to providing reasonable accommodations for qualified applicants with disabilities and disabled veterans during the hiring and interview process. If you need assistance or accommodations due to a disability, please contact us at ***************** . Please note: this inbox is reserved for individuals with disabilities in need of assistance and is not a means of inquiry about positions/application statuses.
$128k-252.2k yearly 60d+ ago
Principal Clinical Data Scientist- Data Management
Syneos Health, Inc.
Data engineer job in Morrisville, NC
Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs and commercial insights into outcomes to address modern market realities. Our Clinical Development model brings the customer and the patient to the center of everything that we do. We are continuously looking for ways to simplify and streamline our work to not only make Syneos Health easier to work with, but to make us easier to work for.
Whether you join us in a Functional Service Provider partnership or a Full-Service environment, you'll collaborate with passionate problem solvers, innovating as a team to help our customers achieve their goals. We are agile and driven to accelerate the delivery of therapies, because we are passionate to change lives.
Discover what our 29,000 employees across 110 countries already know:
WORK HERE MATTERS EVERYWHERE
Why Syneos Health
* We are passionate about developing our people through career development and progression; supportive and engaged line management; technical and therapeutic area training; and peer recognition and total rewards programs.
* We are committed to our Total Self culture - where you can authentically be yourself. Our Total Self culture is what unites us globally, and we are dedicated to taking care of our people.
* We are continuously building the company we all want to work for and our customers want to work with. Why? Because when we bring together diversity of thoughts, backgrounds, cultures, and perspectives - we're able to create a place where everyone feels like they belong.
Job Responsibilities
Summary
The Principal Clinical Data Scientist provides strategic and operational leadership for end-to-end clinical data collection, cleaning, and quality oversight across complex clinical studies. This role serves as the functional lead for Clinical Data Science, ensuring clinical data deliverables are fit for purpose, compliant with regulatory and contractual requirements, and aligned with sponsor expectations and study timelines. The position partners cross-functionally to drive data quality, risk mitigation, analytics innovation, and timely delivery of clinical data milestones.
Responsibilities
* Serve as the Data Management Functional Lead for Clinical Data Science on complex, multi-scope clinical projects and act as the primary liaison between Clinical Data Science, Project Management, Clinical Monitoring, and other functional groups.
* Develop and maintain the Data Management Plan.
* Act as the central steward of clinical data quality through holistic review of clinical and operational data using detailed protocol and therapeutic area knowledge.
* Ensure required data elements and corresponding data quality oversight steps are identified to support defined study analyses.
* Coordinate cross-functional data cleaning activities to meet quality standards, timelines, and contractual obligations.
* Communicate, troubleshoot, and resolve complex data-related issues; recommend solutions and escalate issues impacting patient safety, data integrity, or study analysis.
* Develop Clinical Data Acquisition Plans and data flow diagrams for complex studies and align data flow with study protocols, regulatory requirements, and study endpoints.
* Assess risks related to protocol design, program-level strategies, and study parameters that may impact data credibility and trial reliability.
* Design and drive development of analytical tools and dashboards to identify potentially unreliable or high-risk data.
* Perform analytic reviews as defined in the scope of work and data acquisition plans; identify root causes and implement systematic resolutions.
* Demonstrate understanding of advanced technologies and assess their applicability to individual studies or programs.
* Monitor and communicate project progress using status reports, tracking tools, and metrics to Sponsors and internal teams.
* Ensure launch, delivery, and completion of Clinical Data Science milestones in compliance with contracts, SOPs, guidelines, and regulatory requirements.
* Collect and analyze metrics to support continuous process improvement initiatives.
* Review and manage Clinical Data Science budgets, identify out-of-scope activities, and initiate change orders through Project Management.
* Plan, manage, and allocate Clinical Data Science resources and coordinate the work of assigned team members.
* Develop and maintain project plans, specifications, and documentation in compliance with SOP requirements.
* Maintain ongoing documentation and ensure Trial Master File (TMF) completeness and accuracy.
* Participate in and present at internal, Sponsor, investigator, and third-party meetings.
* Provide input to proposals, bid defenses, and RFP responses and promote new Clinical Data Science business opportunities aligned with Sponsor strategies.
* Prepare documentation for and participate in internal and external audits.
* Train and mentor junior team members and maintain proficiency in Clinical Data Science systems through ongoing training.
* Perform other duties as assigned.
Qualifications
Education
* Bachelor's degree in Biological Sciences, Computer Science, Mathematics, Data Science, or related discipline required.
* Master's degree preferred.
* Equivalent relevant experience may be considered in lieu of degree.
Experience
* Minimum of 10 years of experience in Clinical Data Management and/or Clinical Data Science.
* At least 5 years of project management experience.
* Experience with Clinical Data Science practices and relational database management systems.
* In-depth knowledge of the drug development lifecycle, including risk-based data quality approaches and biometrics workflows.
Skills & Knowledge
* Expertise in protocol interpretation, data collection strategies, and data cleaning specification development.
* Knowledge of ALCOA++ data quality principles.
* Knowledge of medical terminology, clinical trial data, and ICH/GCP regulatory requirements.
* Proficiency with Microsoft Word, Excel, PowerPoint, email, and Windows-based applications.
* Strong leadership, communication, organizational, and time-management skills.
* Ability to manage multiple priorities in a fast-paced, dynamic environment.
* Ability to work independently and collaboratively across multidisciplinary teams.
At Syneos Health, we believe in providing an environment and culture in which Our People can thrive, develop and advance. We reward and recognize our people by providing valuable benefits and a quality-of-life balance. The benefits for this position may include a company car or car allowance, Health benefits to include Medical, Dental and Vision, Company match 401k, eligibility to participate in Employee Stock Purchase Plan, Eligibility to earn commissions/bonus based on company and individual performance, and flexible paid time off (PTO) and sick time. Because certain states and municipalities have regulated paid sick time requirements, eligibility for paid sick time may vary depending on where you work. Syneos complies with all applicable federal, state, and municipal paid sick time requirements.
Salary Range:
$95,000.00 - $175,700.00
The base salary range represents the anticipated low and high of the Syneos Health range for this position. Actual salary will vary based on various factors such as the candidate's qualifications, skills, competencies, and proficiency for the role.
Get to know Syneos Health
Over the past 5 years, we have worked with 94% of all Novel FDA Approved Drugs, 95% of EMA Authorized Products and over 200 Studies across 73,000 Sites and 675,000+ Trial patients.
No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health.
***************************
Additional Information
Tasks, duties, and responsibilities as listed in this are not exhaustive. The Company, at its sole discretion and with no prior notice, may assign other tasks, duties, and job responsibilities. Equivalent experience, skills, and/or education will also be considered so qualifications of incumbents may differ from those listed in the Job Description. The Company, at its sole discretion, will determine what constitutes as equivalent to the qualifications described above. Further, nothing contained herein should be construed to create an employment contract. Occasionally, required skills/experiences for jobs are expressed in brief terms. Any language contained herein is intended to fully comply with all obligations imposed by the legislation of each country in which it operates, including the implementation of the EU Equality Directive, in relation to the recruitment and employment of its employees. The Company is committed to compliance with the Americans with Disabilities Act, including the provision of reasonable accommodations, when appropriate, to assist employees or applicants to perform the essential functions of the job.
$95k-175.7k yearly 34d ago
Data Scientist, Privacy
Datavant
Data engineer job in Raleigh, NC
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub you will play a crucial role in ensuring that privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research to keep us as industry leaders in this area, and stimulating discussions on re-identification risk. You will be supported in developing/consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high quality reports which summarise your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ A drive to understand real-world data in context rather than considering it in abstraction
+ Familiarity or proficiency with a programmable data-analysis language such as R or Python, and the desire to develop expertise in it
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (************************************************** . Know Your Rights (*********************************************************************** , explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here, (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request, you can find instructions for locating it here (******************************************************************************************************* . Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (**************************************** .
$104k-130k yearly 13d ago
AWS Data Migration Consultant
Slalom 4.6
Data engineer job in Raleigh, NC
Candidates can live within commutable distance to any Slalom office in the US. We have a hybrid and flexible environment.
Who You'll Work With
As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.
We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions.
As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.
What You'll Do
* Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
* Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
* Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
* Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
* Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS.
* Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards.
* Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
* Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
* Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.
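The DMS-led migration work described above can be sketched in a few lines of boto3. This is a minimal, illustrative sketch rather than Slalom's actual tooling: the schema, table names, and the commented-out ARNs are placeholder assumptions, and `build_table_mappings` is a hypothetical helper.

```python
import json


def build_table_mappings(schema: str, tables: list[str]) -> str:
    """Build a DMS TableMappings JSON document that selects specific tables."""
    rules = [
        {
            "rule-type": "selection",
            "rule-id": str(i + 1),
            "rule-name": f"include-{table}",
            "object-locator": {"schema-name": schema, "table-name": table},
            "rule-action": "include",
        }
        for i, table in enumerate(tables)
    ]
    return json.dumps({"rules": rules}, indent=2)


# Select two example tables from a SQL Server "dbo" schema (placeholder names).
mappings = build_table_mappings("dbo", ["orders", "customers"])
print(mappings)

# With AWS credentials configured and real endpoint/instance ARNs (placeholders
# here), the replication task itself would be created via boto3's DMS client:
#
#   import boto3
#   dms = boto3.client("dms")
#   dms.create_replication_task(
#       ReplicationTaskIdentifier="sqlserver-to-rds-migration",
#       SourceEndpointArn=SOURCE_ENDPOINT_ARN,            # placeholder
#       TargetEndpointArn=TARGET_ENDPOINT_ARN,            # placeholder
#       ReplicationInstanceArn=REPLICATION_INSTANCE_ARN,  # placeholder
#       MigrationType="full-load-and-cdc",  # bulk copy, then ongoing replication
#       TableMappings=mappings,
#   )
```

A `MigrationType` of `"full-load-and-cdc"` performs an initial bulk copy followed by change data capture, the usual pattern for minimizing cutover downtime in migrations like these.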
What You'll Bring
* 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
* Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
* Hands-on experience with AWS database services (RDS, EC2-hosted databases).
* Strong understanding of HA/DR solutions and cloud database design patterns.
* Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
* Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
* Strong troubleshooting and analytical skills to resolve complex database and performance issues.
* Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.
Nice to Have
* AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
* Experience with NoSQL databases or hybrid data architectures.
* Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
* Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
* Experience with DB2 on-premise or cloud-hosted environments.
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay ranges by location are:
In Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, and New Jersey: $105,000-$147,000 for the Consultant level, $120,000-$169,000 for the Senior Consultant level, and $133,000-$187,000 for the Principal level.
In all other markets: $96,000-$135,000 for the Consultant level, $110,000-$155,000 for the Senior Consultant level, and $122,000-$172,000 for the Principal level.
In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
We will accept applications until 1/31/2026 or until the positions are filled.
$133k-187k yearly 3d ago
Data Scientist
Ionna
Data engineer job in Raleigh, NC
Reports To: Chief of Staff
This role requires a full-time onsite presence in Durham, NC
Target Base Range: $110k-$120k
Please note: We are only able to consider candidates who are U.S. citizens or lawful permanent residents (green card holders) and who do not require current or future visa sponsorship of any sort.
Position Overview
We are seeking a highly skilled and analytical Data Scientist to join our team. The ideal candidate will leverage data-driven insights to solve complex business problems, develop predictive models, and support strategic decision-making. This role requires strong expertise in statistical analysis, machine learning, and data visualization, combined with the ability to communicate findings effectively to both technical and non-technical stakeholders. A successful candidate will be curious, self-directed, and thoughtful as they apply analysis to business issues.
Mission: Provide actionable insights to IONNA
Key Responsibilities
Collect, clean, and preprocess large datasets from multiple sources.
Develop and implement predictive models and machine learning algorithms to address business challenges.
Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
Design and execute experiments to validate hypotheses and measure impact.
Create dashboards and visualizations to communicate insights clearly and effectively.
Collaborate with cross-functional teams (engineering, product, marketing) to integrate data solutions into business processes.
Stay current with emerging technologies, tools, and best practices in data science and AI.
Required Qualifications
Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field, and a minimum of 3 years of relevant experience.
Proven experience in data analysis, statistical modeling, and machine learning.
Proficiency in programming languages such as Python or R.
Strong knowledge of SQL and experience working with relational databases.
Familiarity with data visualization tools (e.g., Tableau, Power BI, matplotlib, seaborn).
Excellent problem-solving skills and attention to detail.
Strong communication skills to present complex findings in a clear and actionable manner.
Preferred Qualifications
Experience with big data technologies (e.g., Hadoop, Spark).
Knowledge of cloud platforms (AWS, Azure, GCP).
Background in deploying machine learning models into production environments.
Understanding of business intelligence and data governance principles.
At IONNA, we believe in recognizing talent with a compensation package that truly stands out! Our competitive base salary is just the beginning. Dive into a world of rewarding bonus programs and a comprehensive benefits package that covers medical, dental, vision, and life insurance. Planning for your future? We've got you covered with a robust 401(K) plan after 90 days! Enjoy the freedom of unlimited PTO and a generous schedule of paid holidays, along with invaluable employee assistance programs. We know every candidate is unique, so our base salaries reflect your individual skills, experience, and certifications. Join us and thrive in a supportive environment where your contributions are celebrated!
We are committed to an inclusive and diverse team. IONNA is an equal opportunity employer. We do not discriminate based on race, color, ethnicity, ancestry, national origin, religion, sex, gender, gender identity, gender expression, sexual orientation, age, disability, veteran status, genetic information, marital status or any legally protected status.
$110k-120k yearly 29d ago
Head of AI/Data Consulting
Carimus
Data engineer job in Raleigh, NC
The Role
We are Carimus, a brand experience and digital transformation agency, now proudly part of the Spyrosoft Group. Since 2013, we've brought together the best of art and engineering to create meaningful impact in the digital world. By fusing strategy, creativity, and technology, we help brands break through and connect with their audiences on an emotional level. As part of Spyrosoft, we're expanding our capabilities and reach while staying true to our human-centered approach, crafting experiences that matter for both our clients and our team.
This role plays a pivotal part in shaping and scaling our AI and Data consulting practice. You'll work closely with our leadership, delivery, and commercial teams to define our vision, guide clients on meaningful AI strategies, and build a high-performing team that brings those strategies to life. As the face of our AI expertise, you'll help clients cut through the noise, make smart decisions, and create real business impact. Your blend of technical depth, commercial instinct, and strong relationships will be key to driving growth and delivering exceptional results.
Department: TBD
Classification: Exempt
Status: Full Time
Location: Raleigh, NC (Hybrid 3x per week)
Travel Requirement: 30-50%
What You'll Do
Build and lead a team of AI/Data Consultants who can advise clients on AI and data strategies, tools, architectures, and implementation approaches.
Partner with marketing and commercial teams to shape marketing and go-to-market strategies aimed at winning new clients in the US and Europe.
Own the P&L for the AI/Data Consulting unit, ensuring financial performance, scalability, and profitability.
Conduct workshops, strategic assessments, and executive-level discussions with key customers.
Represent Carimus and Spyrosoft externally as a credible AI voice, supporting business development, shaping proposals, and closing strategic deals.
Define a clear vision for how AI can drive tangible business outcomes and bring that vision to clients and internal teams.
Foster a culture of excellence, curiosity, accountability, and collaboration within the team.
Required Qualifications
Deep expertise in AI and Data Science, with an up-to-date understanding of modern AI (LLMs, agents, MLOps, data architectures, etc.).
Several years of experience in AI/Data consulting or in a software company offering AI/Data advisory services.
Demonstrated success in strategic advisory work with enterprise clients.
Proven ability to connect technical solutions to business value and ROI.
Prior P&L ownership or substantial commercial responsibility.
Exceptional interpersonal, communication, and client-facing skills.
Experience hiring, developing, and leading technical consulting teams.
Ability to travel to customer locations as needed.
Who We're Looking For
We're looking for an ambitious, well-connected AI/Data leader who is both a deep technologist and a business builder. Someone who can operate as a one-person practice at the start, shaping strategy and also defining how it gets executed. You bring credibility in the AI/Data space, understand real-world use cases beyond the hype, and can explain complex ideas in a clear, accessible way. You're skilled at developing client relationships, identifying opportunities, and crafting AI strategies along with practical implementation plans, whether we deliver them or empower clients to do so themselves. Above all, you're entrepreneurial, influential, and motivated to build something exceptional from the ground up, driving meaningful value for our clients and accelerating the growth of Carimus and Spyrosoft.
Our Values
At Carimus, these values guide every interaction and collaboration internally and with our clients.
Live in the ZOPD. We continually expand our skills by working in the Zone of Proximal Development. We take measured risks and incorporate new technology, but only what we can deliver with excellence.
Be Transparent & Tenacious. We don't hide from the truth and won't let our clients, either. We embrace reality, own our mistakes, and attack problems with teamwork and creativity.
Invest in Relationships. Life is better doing interesting things with people we like. We build trusting relationships and strong connections with our employees and our clients. We go further together.
Create Exceptional Experiences. We exceed expectations, yours and ours. We unite art and engineering in smart, compelling ways that inspire confidence and human connection. We excite and engage, from concept to launch.
Commit to Caring. Caring is in our blood, and in our name, “Care I Must.” We're proudest when we tackle real problems and advance positive change for people and the environment. Let's get to work.
Physical Requirements
Normal periods of sitting and standing in an office environment.
Lifting and/or pushing objects up to 35 lbs. on an occasional basis.
Travel Requirement 30-50%.
Carimus provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any kind, regardless of race, color, religion, age, sex, national origin, disability status, genetic information, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected under federal, state, or local laws.
$78k-106k yearly est. 44d ago
Big Data Consultant (Durham, NC, Westlake, TX)
Sonsoft 3.7
Data engineer job in Durham, NC
Sonsoft , Inc. is a USA based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services.
Green Card and USC:-
At least 7+ years of overall experience
HDFS architecture and understanding of all critical architectural concepts, including node types, node interaction, YARN, ZooKeeper, MapReduce, etc.
Hands-on experience in Hive: all concepts including Hive queries, UDFs, and different file formats like ORC, Avro, and Parquet
Hands-on experience in developing Sqoop and Spark jobs
Experience in processing structured data - warehousing concepts like de-duplication, cleansing, lookups, transformation, data versioning, etc.
Hands-on experience in developing Oozie workflow definitions and execution
Knowledge of an HDFS distribution, preferably Cloudera, and understanding of its monitoring and operational capabilities
Knowledge of Flume and Kafka is a plus
Hands-on experience in programming languages like Java, Python, and Perl
At least 4 years of experience in translating functional/non-functional requirements to system requirements
Experience in working with business users to analyze and understand business data and scenarios
Ability to work in a team environment with client-interfacing skills
Experience and desire to work in a Global delivery environment
Experience leading medium to large sized teams
Cloudera Certification
Knowledge of PL/SQL
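As a rough illustration of the warehousing concepts this posting lists (de-duplication, cleansing, data versioning), here is a minimal Python sketch; the record layout, field names, and versioning scheme are hypothetical, not from the posting:

```python
def clean_and_dedupe(records):
    """Trim whitespace, normalize names, and de-duplicate records keyed
    on customer_id, keeping only the highest version per key."""
    latest = {}
    for rec in records:
        cleaned = {
            "customer_id": rec["customer_id"].strip(),   # cleansing
            "name": rec["name"].strip().title(),
            "version": rec["version"],
        }
        key = cleaned["customer_id"]
        # Data versioning: retain only the newest version of each record.
        if key not in latest or cleaned["version"] > latest[key]["version"]:
            latest[key] = cleaned
    return list(latest.values())

rows = [
    {"customer_id": " C1 ", "name": "alice ", "version": 1},
    {"customer_id": "C1", "name": "Alice", "version": 2},
    {"customer_id": "C2", "name": "bob", "version": 1},
]
print(clean_and_dedupe(rows))
```

In a real Hadoop or Spark pipeline the same keep-latest-per-key pattern would be expressed as a window function or a reduce-by-key step rather than an in-memory dictionary.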
Qualifications
Basic Qualifications :-
Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
At least 4 years of experience within Information Technologies.
Additional Information
Note:-
This is a Full-Time & Permanent job opportunity for you.
Only US Citizens, Green Card Holders, GC-EAD, H4-EAD & L2-EAD candidates can apply.
No OPT-EAD, H1B & TN candidates, please.
Please mention your Visa Status in your email or resume.
$78k-107k yearly est. 1d ago
Data Scientist
Nextwave Resources 4.4
Data engineer job in Durham, NC
Temp
Data Scientist - Boston, MA or NC, NH, RI, TX, or CO (18-month contract; probable extension or permanent conversion)
Notes:
Master's Degree required
Python development to build time series models
Run SQL queries
Linux OS administration
Any web development experience preferred.
Experience working with Artificial Intelligence, Machine learning algorithms, neural networks, decision trees, modeling, Cloud Machine Learning, time series analysis and robotics process automation.
Description:
We are seeking a hands-on experienced data scientist with financial services industry experience. As part of a small, nimble team, the associate's key differentiating abilities will be exceptional analytical skills, and an ability to conceive of and develop differentiated products for the benefit of customers. Absolutely critical is the associate's ability to carry an initiative from idea through to execution.
5+ years' experience in Information security/technology risk management for large-scale, complex IT infrastructures and distributed environments or an equivalent combination of related training and experience
Analytic Skills: In addition to core regression, classification and time series skills that accompany the data science role, experience with next best action (NBA) prediction, multi-armed bandits, online learning, A/B testing, and experimentation methods are preferred
Natural programmer, with proven industry experience in statistics and data modeling
Experience with one or more of the following tools/frameworks: Python, scikit-learn, NLTK, pandas, NumPy, R, PySpark, Scala, SQL/big data tools, TensorFlow, PyTorch, etc.
Education- At least one advanced degree (Master or PhD level) in a technical or mathematically-oriented discipline, e.g., coursework or experience in fields such as statistics, machine learning, computer science, applied mathematics, econometrics, engineering, etc.
Extensive experience in written and oral communications/presentations, and ability to produce a variety of business documents (business requirements, technical specs, slide presentations, etc.) that demonstrate command of language, clarity of thought, and orderliness of presentation
We are looking for an expert quantitative developer to advance the research and development of AI/ML methods as components in the delivery of creative investment management technology solutions. You will have experience combining multivariate statistical modeling, predictive machine learning methods, and open-source approaches to cloud computing and big data.
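The posting above calls out multi-armed bandits and experimentation methods; a minimal epsilon-greedy bandit can be sketched in a few lines of standard-library Python (the arm conversion rates and hyperparameters below are illustrative, not from the posting):

```python
import random

def epsilon_greedy(true_rates, epsilon=0.1, steps=5000, seed=42):
    """Minimal epsilon-greedy bandit: with probability epsilon explore a
    random arm, otherwise exploit the arm with the best observed mean."""
    rng = random.Random(seed)
    counts = [0] * len(true_rates)   # pulls per arm
    values = [0.0] * len(true_rates) # estimated mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(len(true_rates))                        # explore
        else:
            arm = max(range(len(true_rates)), key=lambda a: values[a])  # exploit
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0
        counts[arm] += 1
        # Incremental running-mean update of the arm's estimated value.
        values[arm] += (reward - values[arm]) / counts[arm]
    return counts, values

# Three hypothetical variants with 5%, 10%, and 30% conversion rates.
counts, values = epsilon_greedy([0.05, 0.10, 0.30])
```

After enough steps the best arm (here index 2) dominates the pull counts, which is the behavior an A/B-testing or next-best-action system exploits to reduce the cost of experimentation.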
$70k-100k yearly est. 60d+ ago
Data Scientist
Tek Spikes
Data engineer job in Cary, NC
Job Description
Note: Only on W2
Must be local to either Cary, NC or Irving, TX
Client: Caterpillar
Education:
• Bachelor's or Master's degree is required
Qualifications:
• 5+ years of experience is required
Top Skills:
• Proficiency in Python, SQL, and data science libraries (Pandas, Scikit-learn, TensorFlow)
• Strong foundation in statistics, probability, and machine learning
• Familiarity with cloud platforms (Azure, AWS, Snowflake) and data modeling
• Excellent communication skills to explain technical concepts to non-technical stakeholders
Job Duties:
• A Data Scientist is responsible for analyzing large volumes of structured and unstructured data to extract actionable insights, build predictive models, and support data-driven decision-making.
• This role blends statistical expertise, programming skills, and business acumen to solve complex problems and drive innovation.
• Data Collection & Preparation: Gather, clean, and validate data from various sources to ensure quality and usability
• Exploratory Data Analysis: Identify trends, anomalies, and patterns in large datasets
• Model Development: Design and implement machine learning models (e.g., regression, classification, clustering, NLP) to support forecasting and decision-making
• Data Visualization: Create dashboards and reports using tools like Power BI, etc., to communicate findings
• Automation & Optimization: Develop scripts and tools to automate data processing and model deployment
• Collaboration: Work cross-functionally with product, engineering, and business teams to align data initiatives with strategic goals
• Research & Innovation: Stay current with emerging technologies and methodologies in data science and apply them to business challenges
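The exploratory-analysis duty above (identifying anomalies in large datasets) can be illustrated with a simple z-score outlier check in standard-library Python; the sensor readings and threshold are hypothetical examples, not Caterpillar data:

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Flag points whose z-score exceeds the threshold -- a common first
    pass at anomaly detection during exploratory data analysis."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [x for x in values if abs(x - mean) / stdev > threshold]

readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 42.0]  # 42.0 is a spike
# A looser threshold is used here because on a tiny sample the outlier
# itself inflates the standard deviation.
print(zscore_outliers(readings, threshold=2.0))  # -> [42.0]
```

Production anomaly detection would typically use robust statistics (median/MAD) or model-based methods, but the z-score is the standard baseline to compare against.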
$70k-97k yearly est. 25d ago
Data Scientist II
Insight Global
Data engineer job in Raleigh, NC
As a Senior Data Scientist II, you will leverage your advanced analytical skills to extract insights from complex datasets. Your expertise will drive data-driven decision-making and contribute to the development of innovative solutions. You will collaborate with cross-functional teams to enhance business strategies and drive growth through actionable data analysis.
· Leading the development of advanced AI and machine learning models to solve complex business problems
· Working closely with other data scientists and engineers to design, develop, and deploy AI solutions
· Collaborating with cross-functional teams to ensure AI solutions are aligned with business goals and customer needs
· Building models, performing analytics, and creating AI features
· Mentoring junior data scientists and providing guidance on AI and machine learning best practices
· Working with product leaders to apply data science solutions
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************.To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
3-5+ years of professional experience (post-grad) as a delivery-focused Data Scientist or ML Engineer
Understanding of how LLMs integrate into complex systems
Deep understanding of RAG
Strong in Python
Experience creating AI Agents and Agentic Workflows
Master's or PhD degree
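The requirements above center on RAG; the retrieval step can be sketched with a toy bag-of-words cosine-similarity ranker in standard-library Python. The corpus, query, and scoring are illustrative only: real RAG systems use dense embeddings and a vector store, then pass the retrieved text to an LLM.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Toy retrieval step of a RAG pipeline: rank documents by
    bag-of-words cosine similarity to the query."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for _, d in scored[:k]]

docs = [
    "snowflake stores data in micro partitions",
    "airflow schedules etl pipelines",
    "langgraph builds stateful ai agents",
]
print(retrieve("how do ai agents work", docs))
```

The retrieved passages would then be injected into the LLM prompt as grounding context, which is the "augmented generation" half of RAG.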
$70k-97k yearly est. 60d+ ago
Data Scientist
Advance Stores Company
Data engineer job in Raleigh, NC
We are seeking an experienced Data Scientist with strong expertise in data science and machine learning engineering, with hands-on experience designing and deploying ML solutions in production. This role focuses on building scalable ML solutions, productionizing models, and enabling robust ML platforms for enterprise-grade deployments.
This position is 4 days in office, 1 day remote per week, based at our corporate headquarters in Raleigh, North Carolina (North Hills)
Key Responsibilities
Build ML Models: Design and implement predictive and prescriptive models for regression, classification, and optimization problems. Apply advanced techniques such as structural time series modeling and boosting algorithms (e.g., XGBoost, LightGBM).
Train and Tune Models: Develop and tune machine learning models using Python, PySpark, TensorFlow, and PyTorch.
Collaboration & Communication: Work closely with stakeholders to understand business challenges, translate them into data science solutions, and own the end-to-end solutioning. Collaborate with cross-functional teams to ensure successful integration of models into business processes.
Monitoring & Visualization: Rapidly prototype and test hypotheses to validate model approaches. Build automated workflows for model monitoring and performance evaluation. Create dashboards using tools like Databricks and Palantir to visualize key model metrics like model drift, Shapley values etc.
Productionize ML: Build repeatable paths from experimentation to deployment (batch, streaming, and low-latency endpoints), including feature engineering, training, and evaluation.
Own ML Platform: Stand up and operate core platform components: model registry, feature store, experiment tracking, artifact stores, and standardized CI/CD for ML.
Pipeline Engineering: Author robust data/ML pipelines (orchestrated with Step Functions / Airflow / Argo) that train, validate, and release models on schedules or events.
Observability & Quality: Implement end-to-end monitoring, data validation, model/drift checks, and alerting SLA/SLOs.
Governance & Risk: Enforce model/version lineage, reproducibility, approvals, rollback plans, auditability, and cost controls aligned to enterprise policies.
Partner & Mentor: Collaborate with on-shore/off-shore teams; coach data scientists on packaging, testing, and performance; contribute to standards and reviews.
Hands-on Delivery: Prototype new patterns; troubleshoot production issues across data, model, and infrastructure layers.
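The monitoring responsibilities above mention model-drift checks; a crude mean-shift drift statistic can be sketched in standard-library Python. The feature values and the 2-sigma alert threshold are hypothetical; production monitoring would more likely use PSI or a Kolmogorov-Smirnov test.

```python
import statistics

def drift_score(baseline, current):
    """Crude drift check: absolute shift in the feature's mean, scaled
    by the baseline standard deviation (a z-like statistic)."""
    base_mean = statistics.fmean(baseline)
    base_std = statistics.stdev(baseline)
    return abs(statistics.fmean(current) - base_mean) / base_std

baseline = [100, 102, 98, 101, 99, 100, 103, 97]  # training-time feature values
current = [110, 112, 108, 111, 109]               # recent production values
score = drift_score(baseline, current)
print(score > 2.0)  # alert when the mean shifts by > 2 baseline std devs
```

A scheduled job computing this per feature, with alerts wired to the dashboards described above, is the simplest version of the drift monitoring the posting asks for.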
Required Qualifications
Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or related field.
Programming: 5+ years' experience with Python (pandas, PySpark, scikit-learn; familiarity with PyTorch/TensorFlow helpful), bash, and experience with Docker.
ML Experimentation: Design and implement predictive and prescriptive models for regression, classification, and optimization problems. Apply advanced techniques such as structural time series modeling and boosting algorithms (e.g., XGBoost, LightGBM).
ML Tooling: 5+ years' experience with SageMaker (training, processing, pipelines, model registry, endpoints) or equivalents (Kubeflow, MLflow/Feast, Vertex, Databricks ML).
Pipelines & Orchestration: 5+ years' experience with Databricks Asset Bundles (DABs), Airflow, or Step Functions; event-driven designs with EventBridge/SQS/Kinesis.
Cloud Foundations: 3+ years' experience with AWS/Azure/GCP across services like ECR/ECS, Lambda, API Gateway, S3, Glue/Athena/EMR, RDS/Aurora (PostgreSQL/MySQL), DynamoDB, CloudWatch, IAM, VPC, WAF.
Snowflake Foundations: Warehouses, databases, schemas, stages, Snowflake SQL, RBAC, UDFs, Snowpark.
CI/CD: 3+ years' hands-on experience with CodeBuild/CodePipeline or GitHub Actions/GitLab; blue/green, canary, and shadow deployments for models and services.
Feature Pipelines: Proven experience with batch/stream pipelines, schema management, partitioning, performance tuning; parquet/iceberg best practices.
Testing & Monitoring: Unit/integration tests for data and models, contract tests for features, reproducible training; data drift/performance monitoring.
Operational Mindset: Incident response for model services, SLOs, dashboards, runbooks; strong debugging across data, model, and infra layers.
Soft Skills: Clear communication, collaborative mindset, and a bias to automate & document.
Additional Qualification:
Experience in retail/manufacturing is preferred.
California Residents click below for Privacy Notice:
***************************************************
$70k-97k yearly est. Auto-Apply 15d ago
Lead Data Engineer
Tata Consulting Services 4.3
Data engineer job in Raleigh, NC
Lead Data Engineer - Snowflake, DBT and Qlik
* Design, develop, and maintain robust and scalable data transformation pipelines using dbt on the Snowflake platform.
* DBT Macro Development: Create and utilize Jinja-based DBT macros to promote code reusability, modularity, and dynamic SQL generation within DBT projects.
* Data Transformation & Orchestration: Implement and manage data transformation pipelines using DBT, integrating with various data sources and ensuring efficient data flow.
* Utilize advanced dbt concepts, including macros, materializations (e.g., incremental, view, table), snapshots, and configurations to build efficient data models.
* Write highly optimized and complex SQL queries for data manipulation, cleaning, aggregation, and transformation within dbt models.
* Implement and enforce best practices for dbt project structure, version control (Git), documentation, and testing.
* Collaborate with data analysts, engineers, and business stakeholders to understand data requirements and translate them into effective data models (e.g., star schema, snowflake schema).
* Design and implement logical and physical data models within dbt to support analytical and reporting needs.
* Leverage Snowflake features and functionalities for performance optimization, including virtual warehouses, clustering, caching, and query optimization.
* Manage and optimize data ingestion and integration processes from various sources into Snowflake.
* Ensure data quality, integrity, and lineage throughout the data transformation process.
* Implement and maintain DBT tests to ensure data quality, integrity, and adherence to business rules.
* Implement and maintain data governance policies and procedures within the dbt environment.
* Develop and execute automated tests for dbt models to ensure data accuracy and reliability.
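The incremental materializations mentioned above are dbt's mechanism for processing only new or changed rows; the same logic can be sketched in plain Python as a high-water-mark filter plus a merge on a unique key (table layout, key, and timestamp field here are hypothetical; in dbt this would be SQL guarded by `is_incremental()`):

```python
def incremental_merge(target, source, key="id", updated_at="updated_at"):
    """Mimic dbt's incremental materialization: pick up only source rows
    newer than the high-water mark already in the target, then merge on
    the unique key (new rows insert, existing rows update)."""
    high_water = max((r[updated_at] for r in target), default=0)
    fresh = [r for r in source if r[updated_at] > high_water]
    merged = {r[key]: r for r in target}
    for r in fresh:
        merged[r[key]] = r  # upsert on the unique key
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "amount": 10, "updated_at": 100}]
source = [
    {"id": 1, "amount": 15, "updated_at": 200},  # updated row
    {"id": 2, "amount": 20, "updated_at": 150},  # new row
    {"id": 3, "amount": 5,  "updated_at": 50},   # older than high-water mark
]
print(incremental_merge(target, source))
```

On Snowflake, dbt compiles this pattern into a `MERGE` statement, which is why the unique key and the freshness predicate are the two things an incremental model must get right.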
Required Skills:
* Proven hands-on experience with dbt in a production environment, including extensive use of macros and advanced modeling techniques.
* Expert-level proficiency in SQL for data querying, manipulation, and transformation.
* Strong experience with Snowflake, including performance tuning and optimization.
* Solid understanding of data warehousing concepts and ETL/ELT processes.
* Experience with version control systems, particularly Git.
* Familiarity with data modeling principles (star schema, snowflake schema).
Salary Range- $100,000-$110,000 a year
#LI-SP3
#LI-VX1
$100k-110k yearly 7d ago
Data and MLOps Engineer
DPR Construction 4.8
Data engineer job in Raleigh, NC
Job Description
DPR is looking for an experienced Data and MLOps Engineer to join our Data and AI team and work closely with the Data Platform, BI, and Enterprise Architecture teams to influence the technical direction of DPR's AI initiatives. You will work closely with cross-functional teams, including business stakeholders, data engineers, and technical leads, to ensure alignment between business needs and data architecture, and define data models for specific focus areas.
Data & MLOps Engineer
DPR is a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency at all levels. We are looking for a Data & MLOps Engineer to join our team and contribute to developing robust data solutions to support our AI, Machine Learning and Data Science team.
Position Overview
The Data & MLOps Engineer will be instrumental in the design and implementation of scalable, cloud-native solutions to meet the growing needs of our Data and AI team. The successful candidate will demonstrate the ability to abstract complexity and create reusable, scalable patterns that accelerate development. The Data & MLOps Engineer will design, build, and support the data pipelines and ML infrastructure that enable our teams to deliver reliable, high-impact analytics and AI workflows, collaborating closely with data engineers, software developers, data scientists, and product teams. The role bridges Data Engineering and ML Operations, focusing on scalable data processing, robust deployment practices, and continuous monitoring and improvement of production systems.
Responsibilities
Design distributed, cloud-native, scalable architecture for data and ML pipelines
Develop CI/CD pipelines and pipeline templates to be used across Data Engineering, AI/ML, and Data Science teams
Automate training, testing and deployment processes for machine learning models
Develop and maintain ETL pipelines to move data in real-time/stream, on-demand, and in batch emphasizing security, reusability, and data quality
Leverage Infrastructure-as-code platforms such as Terraform and Bicep to automate infrastructure provisioning and streamline deployments
Implement and manage APM and observability tools such as Azure App Insights or Datadog to monitor infrastructure, focusing on ML workloads
Contribute to preventive maintenance, technical debt reduction, and the promotion of clean code principles
Manage and maintain cloud infrastructure in both Azure and AWS environments
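The automated training/testing/deployment responsibility above usually ends in a promotion gate; here is a minimal sketch of one in Python. The metric names, thresholds, and latency budget are hypothetical illustrations, not DPR's actual pipeline:

```python
def should_promote(candidate_metrics, production_metrics, min_gain=0.01):
    """Minimal deployment gate for a model CI/CD pipeline: promote the
    candidate only if it beats production accuracy by at least min_gain
    and does not regress p95 latency by more than 10%."""
    gain = candidate_metrics["accuracy"] - production_metrics["accuracy"]
    latency_ok = (candidate_metrics["p95_latency_ms"]
                  <= production_metrics["p95_latency_ms"] * 1.1)
    return gain >= min_gain and latency_ok

prod = {"accuracy": 0.91, "p95_latency_ms": 120}
cand = {"accuracy": 0.93, "p95_latency_ms": 125}
print(should_promote(cand, prod))  # prints True
```

In a real pipeline this check would run as a CI step after automated evaluation, with the metrics pulled from an experiment tracker and the promotion writing a new version to the model registry.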
Qualifications
Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field
3-5 years of experience in Data Engineering, DevOps, MLOps, Software Engineering, or Site Reliability Engineering
Strong understanding of cloud infrastructure and experience working with at least one major cloud provider
Excellent troubleshooting and debugging skills, with a focus on data integrity and system optimization
Proficiency in at least one object-oriented programming language, preferably Python, with hands-on experience in ML frameworks like TensorFlow, PyTorch, or Scikit-learn
Proficiency in SQL, preferably Snowflake SQL
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together-by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************
$84k-111k yearly est. Auto-Apply 35d ago
Senior Data Engineer
Elder Research 3.9
Data engineer job in Raleigh, NC
Job Title: Senior Data Engineer
Workplace: Hybrid - Due to in-office requirements, candidates must be local to either Raleigh, NC or Charlottesville, VA. Relocation assistance is not available
Clearance Required: Not required, BUT YOU MUST BE ELIGIBLE FOR A CLEARANCE
Position Overview:
Elder Research, Inc. (ERI) is seeking to hire a Senior Data Engineer with strong engineering skills who will provide technical support across multiple project teams by leading, designing, and implementing the software and data architectures necessary to deliver analytics to our clients, as well as providing consulting and training support to client teams in the areas of architecture, data engineering, ML engineering, and/or related areas. The ideal candidate will have a strong command of Python for data analysis and engineering tasks, a demonstrated ability to create reports and visualizations using tools like R, Python, SQL, or Power BI, and deep expertise in Microsoft Azure environments. The candidate will play a key role in collaborating with cross-functional teams, including software developers, cloud engineers, architects, business leaders, and power users, to deliver innovative data solutions to our clients.
This role requires a consultative mindset, excellent communication skills, and a thorough understanding of the Software Development Life Cycle (SDLC). Candidates should have 7-12 years of relevant experience, including experience in client-facing or consultative roles. The role will be based out of Raleigh, NC or Charlottesville, VA and will require 2-4 days of business travel to our customer site every 6 weeks.
Key Responsibilities:
Data Engineering & Analysis:
Develop, optimize, and maintain scalable data pipelines and systems in Azure environments.
Analyze large, complex datasets to extract insights and support business decision-making.
Create detailed and visually appealing reports and dashboards using R, Python, SQL, and Power BI.
Collaboration & Consulting:
Work closely with software developers, cloud engineers, architects, business leaders, and power users to understand requirements and deliver tailored solutions.
Act as a subject-matter expert in data engineering and provide guidance on best practices.
Translate complex technical concepts into actionable business insights for stakeholders.
Azure Expertise:
Leverage Azure services such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, and Azure Blob Storage for data solutions.
Ensure data architecture is aligned with industry standards and optimized for performance in cloud environments.
SDLC Proficiency:
Follow and advocate for SDLC best practices in data engineering projects.
Collaborate with software development teams to ensure seamless integration of data solutions into applications.
Required Qualifications:
Experience: 7-12 years in data engineering, analytics, or related fields, with a focus on Azure environments.
Education: Master's degree in Computer Science, Data Science, Engineering, or a related field.
Technical Skills:
Programming: Advanced expertise in Python; experience with R is a plus.
Data Tools: Proficient in SQL, Power BI, and Azure-native data tools.
Azure Knowledge: Strong understanding of Azure services, including data integration, storage, and analytics solutions.
SDLC Knowledge: Proven track record of delivering data solutions following SDLC methodologies.
Consultative Skills: Strong client-facing experience with excellent communication and presentation abilities.
Due to customer requirements, candidates must be US Citizens or Permanent Residents of the United States of America.
Preferred Skills and Qualifications:
Certifications in Azure (e.g., Azure Data Engineer, Azure Solutions Architect).
Familiarity with Azure Functions, Event Grid, and Logic Apps.
Hands-on experience with machine learning frameworks and big data processing tools (e.g., Spark, Hadoop).
Familiarity with CI/CD pipelines and DevOps practices for data engineering workflows.
Why apply to this position at Elder Research?
Competitive Salary and Benefits
Important Work / Make a Difference supporting U.S. national security.
Job Stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract.
People-Focused Culture: we prioritize work-life-balance and provide a supportive, positive, and collaborative work environment as well as opportunities for professional growth and advancement.
Company Stock Ownership: all employees are provided with shares of the company each year based on company value and profits.
About Elder Research, Inc
People Centered. Data Driven
Elder Research is a fast-growing consulting firm specializing in predictive analytics. Having been in the data mining business for almost 30 years, we pride ourselves on our ability to find creative, cutting-edge solutions to real-world problems. We work hard to provide the best value to our clients and allow each person to contribute their ideas and put their skills to use immediately.
Our team members are passionate, curious, life-long learners. We value humility, servant-leadership, teamwork, and integrity. We seek to serve our clients and our teammates to the best of our abilities. In keeping with our entrepreneurial spirit, we want candidates who are self-motivated, with an innate curiosity and strong teamwork.
Elder Research believes in continuous learning and community - each week the entire company attends a Tech Talk and each office location provides lunch. Elder Research provides a supportive work environment with established parental, bereavement, and PTO policies. By prioritizing a healthy work-life balance - with reasonable hours, solid pay, low travel, and extremely flexible time off - Elder Research enables and encourages its employees to serve others and enjoy their lives.
Elder Research, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Elder Research is a Government contractor and many of our positions require US Citizenship.
$85k-119k yearly est. 58d ago
Big Data Consultant (Durham, NC, Westlake, TX)
Sonsoft 3.7
Data engineer job in Durham, NC
Sonsoft , Inc. is a USA based corporation duly organized under the laws of the Commonwealth of Georgia. Sonsoft Inc. is growing at a steady pace specializing in the fields of Software Development, Software Consultancy and Information Technology Enabled Services.
Green Card and USC
:-
At least 7 years of overall experience
HDFS architecture and understanding of all critical architectural concepts, including node types, node interaction, YARN, ZooKeeper, MapReduce, etc.
Hands-on experience with Hive: all concepts, including Hive queries, UDFs, and different file formats such as ORC, Avro, and Parquet
Hands-on experience developing with Sqoop and Spark
Experience processing structured data - warehousing concepts such as de-duplication, cleansing, lookups, transformation, and data versioning
Hands-on experience developing and executing Oozie workflow definitions
Knowledge of an HDFS distribution, preferably Cloudera, including its monitoring and operational capabilities
Knowledge of Flume and Kafka is a plus
Hands-on experience with programming languages such as Java, Python, and Perl
At least 4 years of experience translating functional/non-functional requirements into system requirements
Experience working with business users to analyze and understand business data and scenarios
Ability to work in a team environment, with client-interfacing skills
Experience and desire to work in a global delivery environment
Experience leading medium to large teams
Cloudera Certification
Knowledge of PL/SQL
Qualifications
Basic Qualifications :-
Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
At least 4 years of experience within the Information Technologies.
Additional Information
Note:-
This is a Full-Time & Permanent job opportunity for you.
Only US Citizen, Green Card Holder, GC-EAD, H4-EAD & L2-EAD can apply.
No OPT-EAD, H1B & TN candidates please.
Please mention your Visa Status in your email or resume.
$78k-107k yearly est. 60d+ ago
Senior Data Engineer
Elder Research 3.9
Data engineer job in Cary, NC
Job Title: Senior Data Engineer
Workplace: Hybrid - due to in-office requirements, candidates must be local to either Raleigh, NC or Charlottesville, VA. Relocation assistance is not available.
Clearance Required: Not required, but candidates must be eligible for a clearance.
Position Overview:
Elder Research, Inc. (ERI) is seeking to hire a Senior Data Engineer with strong engineering skills who will provide technical support across multiple project teams by leading, designing, and implementing the software and data architectures necessary to deliver analytics to our clients, as well as providing consulting and training support to client teams in the areas of architecture, data engineering, ML engineering, and related areas. The ideal candidate will have a strong command of Python for data analysis and engineering tasks, a demonstrated ability to create reports and visualizations using tools like R, Python, SQL, or Power BI, and deep expertise in Microsoft Azure environments. The candidate will play a key role in collaborating with cross-functional teams, including software developers, cloud engineers, architects, business leaders, and power users, to deliver innovative data solutions to our clients.
This role requires a consultative mindset, excellent communication skills, and a thorough understanding of the Software Development Life Cycle (SDLC). Candidates should have 7-12 years of relevant experience, including experience in client-facing or consultative roles. The role will be based out of Raleigh, NC or Charlottesville, VA and will require 2-4 days of business travel to our customer site every 6 weeks.
Key Responsibilities:
Data Engineering & Analysis:
* Develop, optimize, and maintain scalable data pipelines and systems in Azure environments.
* Analyze large, complex datasets to extract insights and support business decision-making.
* Create detailed and visually appealing reports and dashboards using R, Python, SQL, and Power BI.
Collaboration & Consulting:
* Work closely with software developers, cloud engineers, architects, business leaders, and power users to understand requirements and deliver tailored solutions.
* Act as a subject-matter expert in data engineering and provide guidance on best practices.
* Translate complex technical concepts into actionable business insights for stakeholders.
Azure Expertise:
* Leverage Azure services such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, and Azure Blob Storage for data solutions.
* Ensure data architecture is aligned with industry standards and optimized for performance in cloud environments.
SDLC Proficiency:
* Follow and advocate for SDLC best practices in data engineering projects.
* Collaborate with software development teams to ensure seamless integration of data solutions into applications.
Required Qualifications:
* Experience: 7-12 years in data engineering, analytics, or related fields, with a focus on Azure environments.
* Education: Master's degree in Computer Science, Data Science, Engineering, or a related field.
Technical Skills:
* Programming: Advanced expertise in Python; experience with R is a plus.
* Data Tools: Proficient in SQL, Power BI, and Azure-native data tools.
* Azure Knowledge: Strong understanding of Azure services, including data integration, storage, and analytics solutions.
* SDLC Knowledge: Proven track record of delivering data solutions following SDLC methodologies.
* Consultative Skills: Strong client-facing experience with excellent communication and presentation abilities.
* Due to customer requirements, candidates must be US Citizens or Permanent Residents of the United States of America.
Preferred Skills and Qualifications:
* Certifications in Azure (e.g., Azure Data Engineer, Azure Solutions Architect).
* Familiarity with Azure Functions, Event Grid, and Logic Apps.
* Hands-on experience with machine learning frameworks and big data processing tools (e.g., Spark, Hadoop).
* Familiarity with CI/CD pipelines and DevOps practices for data engineering workflows.
Why apply to this position at Elder Research?
* Competitive Salary and Benefits
* Important Work / Make a Difference supporting U.S. national security.
* Job Stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract.
* People-Focused Culture: we prioritize work-life balance and provide a supportive, positive, and collaborative work environment, as well as opportunities for professional growth and advancement.
* Company Stock Ownership: all employees receive shares of the company each year, based on company value and profits.
How much does a data engineer earn in Chapel Hill, NC?
The average data engineer in Chapel Hill, NC earns between $68,000 and $121,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Chapel Hill, NC
$91,000
What are the biggest employers of Data Engineers in Chapel Hill, NC?
The biggest employers of Data Engineers in Chapel Hill, NC are: