Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that the privacy of patients is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research that keeps us industry leaders in this area, and in stimulating discussions on re-identification risk. You will be supported in developing and consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high quality reports which summarise your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk
**What You Will Bring to the Table:**
+ Excellent communication skills. Meticulous attention to detail in the production of comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ A drive to understand real-world data in context rather than consider it in the abstract
+ Familiarity or proficiency with programmable data analysis software such as R or Python, and the desire to develop expertise in that language
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
#LI-BC1
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement here (**************************************************). To learn more about your legal rights and protections, see Know Your Rights (***********************************************************************) and explore the resources available through the EEOC. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please request it here (************************************************************** Id=**********48790029&layout Id=**********48795462) by selecting the 'Interview Accommodation Request' category. You will need your requisition ID when submitting your request; you can find instructions for locating it here (*******************************************************************************************************). Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy (****************************************).
$104k-130k yearly 23d ago
Data Scientist, Analytics (Technical Leadership)
Meta 4.8
Data scientist job in Raleigh, NC
We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field (e.g., Mathematics, Statistics, Operations Research), or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Masters or Ph.D. Degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@meta.com.
$210k-281k yearly 60d+ ago
Data Scientist II
Insight Global
Data scientist job in Raleigh, NC
As a Senior Data Scientist II, you will leverage your advanced analytical skills to extract insights from complex datasets. Your expertise will drive data-driven decision-making and contribute to the development of innovative solutions. You will collaborate with cross-functional teams to enhance business strategies and drive growth through actionable data analysis.
· Leading the development of advanced AI and machine learning models to solve complex business problems
· Working closely with other data scientists and engineers to design, develop, and deploy AI solutions
· Collaborating with cross-functional teams to ensure AI solutions are aligned with business goals and customer needs
· Building models, performing analytics, and creating AI features
· Mentoring junior data scientists and providing guidance on AI and machine learning best practices
· Working with product leaders to apply data science solutions
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
3-5+ years of professional experience (post-grad) as a delivery-focused Data Scientist or ML Engineer
Understanding of how LLMs integrate into complex systems
Deep understanding of RAG (Retrieval-Augmented Generation)
Strong in Python
Experience creating AI agents and agentic workflows
Master's or PhD degree
$70k-97k yearly est. 60d+ ago
Data Scientist
Advance Stores Company
Data scientist job in Raleigh, NC
We are seeking an experienced Data Scientist with strong expertise in data science and machine learning engineering, with hands-on experience designing and deploying ML solutions in production. This role focuses on building scalable ML solutions, productionizing models, and enabling robust ML platforms for enterprise-grade deployments.
This position is 4 days in office, 1 day remote per week, based at our corporate headquarters in Raleigh, North Carolina (North Hills)
Key Responsibilities
Build ML Models: Design and implement predictive and prescriptive models for regression, classification, and optimization problems. Apply advanced techniques such as structural time series modeling and boosting algorithms (e.g., XGBoost, LightGBM).
Train and Tune Models: Develop and tune machine learning models using Python, PySpark, TensorFlow, and PyTorch.
Collaboration & Communication: Work closely with stakeholders to understand business challenges, translate them into data science solutions, and contribute to the end-to-end solution. Collaborate with cross-functional teams to ensure successful integration of models into business processes.
Monitoring & Visualization: Rapidly prototype and test hypotheses to validate model approaches. Build automated workflows for model monitoring and performance evaluation. Create dashboards using tools like Databricks and Palantir to visualize key model metrics such as model drift, Shapley values, etc.
Productionize ML: Build repeatable paths from experimentation to deployment (batch, streaming, and low-latency endpoints), including feature engineering, training, and evaluation.
Own ML Platform: Stand up and operate core platform components, including the model registry, feature store, experiment tracking, artifact stores, and standardized CI/CD for ML.
Pipeline Engineering: Author robust data/ML pipelines (orchestrated with Step Functions / Airflow / Argo) that train, validate, and release models on schedules or events.
Observability & Quality: Implement end-to-end monitoring, data validation, model/drift checks, and alerting SLA/SLOs.
Governance & Risk: Enforce model/version lineage, reproducibility, approvals, rollback plans, auditability, and cost controls aligned to enterprise policies.
Partner & Mentor: Collaborate with on-shore/off-shore teams; coach data scientists on packaging, testing, and performance; contribute to standards and reviews.
Hands-on Delivery: Prototype new patterns; troubleshoot production issues across data, model, and infrastructure layers.
Required Qualifications
Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or related field.
Programming: 5+ years experience with Python (pandas, PySpark, scikit-learn; familiarity with PyTorch/TensorFlow helpful), bash, experience with Docker.
ML Experimentation: Design and implement predictive and prescriptive models for regression, classification, and optimization problems. Apply advanced techniques such as structural time series modeling and boosting algorithms (e.g., XGBoost, LightGBM).
ML Tooling: 5+ years experience with SageMaker (training, processing, pipelines, model registry, endpoints) or equivalents (Kubeflow, MLflow/Feast, Vertex, Databricks ML).
Pipelines & Orchestration: 5+ years' experience with Databricks DABs, Airflow, or Step Functions; event-driven designs with EventBridge/SQS/Kinesis.
Cloud Foundations: 3+ years experience with AWS/Azure/GCP on various services like ECR/ECS, Lambda, API Gateway, S3, Glue/Athena/EMR, RDS/Aurora (PostgreSQL/MySQL), DynamoDB, CloudWatch, IAM, VPC, WAF.
Snowflake Foundations: Warehouses, databases, schemas, stages, Snowflake SQL, RBAC, UDF, Snowpark.
CI/CD: 3+ years hands-on experience with CodeBuild/CodePipeline or GitHub Actions/GitLab; blue/green, canary, and shadow deployments for models and services.
Feature Pipelines: Proven experience with batch/stream pipelines, schema management, partitioning, performance tuning; Parquet/Iceberg best practices.
Testing & Monitoring: Unit/integration tests for data and models, contract tests for features, reproducible training; data drift/performance monitoring.
Operational Mindset: Incident response for model services, SLOs, dashboards, runbooks; strong debugging across data, model, and infra layers.
Soft Skills: Clear communication, collaborative mindset, and a bias to automate & document.
Additional Qualification:
Experience in retail/manufacturing is preferred.
California Residents click below for Privacy Notice:
***************************************************
$70k-97k yearly est. 16d ago
Senior Data Scientist, Data Management
ICON plc
Data scientist job in Raleigh, NC
Senior Data Scientist, Data Management - Office Based in Blue Bell, PA or Raleigh, NC
ICON plc is a world-leading healthcare intelligence and clinical research organization. We're proud to foster an inclusive environment driving innovation and excellence, and we welcome you to join us on our mission to shape the future of clinical development.
Symphony Health, part of the ICON plc family, is a team of curious thinkers and intellectual problem solvers driving the healthcare data industry forward. We leverage our large, integrated healthcare data repository and our analytic expertise to build customized, agile data solutions which answer the questions our clients -- life science manufacturers, payers, and providers -- have today, as well as those they'll have tomorrow. Together, we can help patients get the right drugs at the right times.
We are currently seeking a Senior Data Scientist, Data Management to join our diverse and dynamic team. As a Senior Data Scientist at ICON plc, you will play a crucial role in leveraging advanced analytical techniques and machine learning models to derive actionable insights from complex data sets. You will contribute to the success of our data-driven initiatives by developing sophisticated models, performing in-depth analyses, and guiding strategic decision-making across the organization.
What You Will Be Doing:
Designing and implementing advanced data models and algorithms to solve complex business problems and drive strategic insights.
Conducting in-depth analyses of large and diverse data sets to uncover patterns, trends, and correlations that inform decision-making.
Collaborating with cross-functional teams to understand business needs, translate them into analytical solutions, and communicate findings effectively.
Developing and deploying machine learning models and statistical techniques to enhance predictive capabilities and optimize business processes.
Providing thought leadership on data science best practices and emerging technologies to drive innovation and continuous improvement.
Your Profile:
Domain expertise in one or more targeted areas of the Revenue Cycle Management processes
Proficient in deterministic and probabilistic linkage concepts/methodologies to support patient journeys derived from anonymized RCM data
Advanced degree in a relevant field such as data science, statistics, computer science, or mathematics.
Extensive experience in data science, with a strong track record of developing and implementing complex models and analyses.
Proficiency in statistical analysis, machine learning techniques, and data visualization tools, with expertise in programming languages such as Python or R.
Excellent problem-solving skills and the ability to manage multiple projects and priorities effectively.
Outstanding communication and interpersonal skills, with the ability to present complex data insights clearly and collaborate with stakeholders across the organization.
Located in or around Blue Bell, PA or Raleigh, NC
#LI-MH1
#LI-HYBRID
What ICON can offer you:
Our success depends on the quality of our people. That's why we've made it a priority to build a diverse culture that rewards high performance and nurtures talent.
In addition to your competitive salary, ICON offers a range of additional benefits. Our benefits are designed to be competitive within each country and are focused on well-being and work life balance opportunities for you and your family.
Our benefits examples include:
Various annual leave entitlements
A range of health insurance offerings to suit you and your family's needs.
Competitive retirement planning offerings to maximize savings and plan with confidence for the years ahead.
Global Employee Assistance Programme, TELUS Health, offering 24-hour access to a global network of over 80,000 independent specialised professionals who are there to support you and your family's well-being.
Life assurance
Flexible country-specific optional benefits, including childcare vouchers, bike purchase schemes, discounted gym memberships, subsidised travel passes, health assessments, among others.
Visit our careers site to read more about the benefits ICON offers.
At ICON, inclusion & belonging are fundamental to our culture and values. We're dedicated to providing an inclusive and accessible environment for all candidates. ICON is committed to providing a workplace free of discrimination and harassment. All qualified applicants will receive equal consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
If, because of a medical condition or disability, you need a reasonable accommodation for any part of the application process, or in order to perform the essential functions of a position, please let us know or submit a request here.
Interested in the role, but unsure if you meet all of the requirements? We would encourage you to apply regardless - there's every chance you're exactly what we're looking for here at ICON whether it is for this or other roles.
Are you a current ICON Employee? Please click here to apply
$79k-110k yearly est. 13d ago
Sr. Data Scientist
DPR Construction 4.8
Data scientist job in Raleigh, NC
DPR Construction is seeking a skilled Senior Data Scientist to help advance our data-driven approach to building. In this role, you'll use statistical analysis, machine learning, and data visualization to turn complex construction and business data into actionable insights that improve project planning, cost forecasting, resource management, and safety. Working with project and operations teams, you'll build and deploy scalable, secure data solutions on cloud platforms like Azure and AWS, driving innovation and operational excellence across DPR's projects.
Responsibilities
Data analysis and modeling: Analyze large datasets to identify trends, bottlenecks, and areas for improvement in operational performance. Build predictive and statistical models to forecast demand, capacity, and potential issues.
Develop and deploy models: Build, test, and deploy machine learning and AI models to improve operational processes.
Analyze operational data: Examine data related to projects, production, supply chains, inventory, and quality control to identify patterns, trends, and inefficiencies.
Optimize processes: Use data-driven insights to streamline workflows, allocate resources more effectively, and improve overall performance.
Forecast and predict: Create predictive models to forecast outcomes, such as demand, and inform strategic decisions.
Communicate findings: Present findings and recommendations to stakeholders through reports, visualizations, and presentations.
Ensure reliability: Build and maintain reliable, scalable, and efficient data science systems and processes.
Collaboration: Partner with project managers, engineers, and business leaders to ensure data solutions are aligned with organizational goals and deliver tangible improvements.
Continuous Learning: Stay current with advancements in data science and machine learning to continually enhance the company's data capabilities.
Reporting and communication: Create dashboards and reports that clearly communicate performance trends and key insights to leadership and other stakeholders. Translate complex data into actionable recommendations.
Performance monitoring: Implement data quality checks and monitor the performance of models and automated systems, creating feedback loops for continuous improvement.
Experimentation: Design and evaluate experiments to quantify the impact of new systems and changes on operational outcomes.
Qualifications
Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
7+ years of experience in data science roles within AEC, product or technology organizations.
At least 4 years of experience working with cloud platforms, specifically Azure and AWS, for model deployment and data management.
Strong proficiency in Python or R for data analysis, modeling, and machine learning, with experience in relevant libraries (e.g., Scikit-learn, TensorFlow, PyTorch) and NLP frameworks (e.g., GPT, Hugging Face Transformers).
Expertise in SQL for data querying and manipulation, and experience with data visualization tools (e.g., Power BI, Tableau).
Solid understanding of statistical methods, predictive modeling, and optimization techniques.
Expertise in statistics and causal inference, applied in both experimentation and observational causal inference studies.
Proven experience designing and interpreting experiments and making statistically sound recommendations.
Strategic and impact-driven mindset, capable of translating complex business problems into actionable frameworks.
Ability to build relationships with diverse stakeholders and cultivate strong partnerships.
Strong communication skills, including the ability to bridge technical and non-technical stakeholders and collaborate across various functions to ensure business impact.
Ability to operate effectively in a fast-moving, ambiguous environment with limited structure.
Experience working with construction-related data or similar industries (e.g., engineering, manufacturing) is a plus.
Preferred Skills
Familiarity with construction management software (e.g., ACC, Procore, BIM tools) and knowledge of project management methodologies.
Hands-on experience with Generative AI tools and libraries.
Background in experimentation infrastructure or human-AI interaction systems.
Knowledge of time-series analysis, anomaly detection, and risk modeling specific to construction environments.
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************
$80k-103k yearly est. 60d+ ago
Senior Data Scientist, Navista
Cardinal Health 4.4
Data scientist job in Raleigh, NC
At Navista, our mission is to empower community oncology practices to deliver patient-centered cancer care. Navista, a Cardinal Health company, is an oncology practice alliance co-created with oncologists and practice leaders that offers advanced support services and technology to help practices remain independent and thrive. True to our name, our experienced team is passionate about helping oncology practices navigate the future.
We are seeking an innovative and highly skilled **Senior Data Scientist** with specialized expertise in Generative AI (GenAI), Large Language Models (LLMs), and Agentic Systems to join the Navista - Data & Advanced Analytics team supporting the growth of our Navista Application Suite and the Integrated Oncology Network (IoN). In this critical role, you will be at the forefront of designing, developing, and deploying advanced AI solutions that leverage the power of generative models and intelligent agents to transform our products and operations. You will be responsible for pushing the boundaries of what's possible, from foundational research to production-ready applications, working with diverse datasets and complex problem spaces, particularly within the oncology domain.
The ideal candidate will possess a deep theoretical understanding and practical experience in building, fine-tuning, and deploying LLMs, as well as architecting and implementing agentic frameworks. You will play a key role in shaping our AI strategy, mentoring junior team members, and collaborating with cross-functional engineering and product teams to bring groundbreaking AI capabilities to life, including developing predictive models from complex, often unstructured, oncology data.
**_Responsibilities_**
+ **Research & Development:** Lead the research, design, and development of novel Generative AI models and algorithms, including but not limited to LLMs, diffusion models, GANs, and VAEs, to address complex business challenges.
+ **LLM Expertise:** Architect, fine-tune, and deploy Large Language Models for various applications such as natural language understanding, generation, summarization, question-answering, and code generation, with a focus on extracting insights from unstructured clinical and research data.
+ **Agentic Systems Design:** Design and implement intelligent agentic systems capable of autonomous decision-making, planning, reasoning, and interaction within complex environments, leveraging LLMs as core components.
+ **Predictive Modeling:** Develop and deploy advanced predictive models and capabilities using both structured and unstructured data, particularly within the oncology space, to forecast outcomes, identify trends, and support clinical or commercial decision-making.
+ **Prompt Engineering & Optimization:** Develop advanced prompt engineering strategies and techniques to maximize the performance and reliability of LLM-based applications.
+ **Data Strategy for GenAI:** Work with data engineers to define and implement data collection, preprocessing, and augmentation strategies specifically tailored for training and fine-tuning generative models and LLMs, including techniques for handling and enriching unstructured oncology data (e.g., clinical notes, pathology reports).
+ **Model Evaluation & Deployment:** Develop robust evaluation metrics and methodologies for generative models, agentic systems, and predictive models. Oversee the deployment, monitoring, and continuous improvement of these models in production environments.
+ **Collaboration & Leadership:** Collaborate closely with machine learning engineers, software engineers, and product managers to integrate AI solutions into our products. Provide technical leadership and mentorship to junior data scientists.
+ **Innovation & Thought Leadership:** Stay abreast of the latest advancements in GenAI, LLMs, and agentic AI research. Proactively identify new opportunities and technologies that can enhance our capabilities and competitive advantage.
+ **Ethical AI:** Ensure the responsible and ethical development and deployment of AI systems, addressing potential biases, fairness, and transparency concerns.
**_Qualifications_**
+ 8-12 years of experience as a Data Scientist or Machine Learning Engineer, with a significant focus on deep learning and natural language processing, preferred
+ Bachelor's degree in related field, or equivalent work experience, preferred
+ Proven hands-on experience with Generative AI models (e.g., Transformers, GANs, VAEs, Diffusion Models) and their applications.
+ Extensive experience working with Large Language Models (LLMs), including fine-tuning, prompt engineering, RAG (Retrieval Augmented Generation), and understanding various architectures (e.g., GPT, Llama, BERT, T5).
+ Demonstrated experience in designing, building, and deploying agentic systems or multi-agent systems, including concepts like planning, reasoning, and tool use.
+ Strong experience working with unstructured data, particularly in the oncology domain (e.g., clinical notes, pathology reports, genomic data, imaging reports), and extracting meaningful features for analysis.
+ Demonstrated ability to create and deploy predictive capabilities and models from complex datasets, including those with unstructured components.
+ Proficiency in Python and deep learning frameworks such as PyTorch or TensorFlow.
+ Experience with relevant libraries and tools (e.g., Hugging Face Transformers, LangChain, LlamaIndex).
+ Strong understanding of machine learning fundamentals, statistical modeling, and experimental design.
+ Experience with at least one cloud platform (e.g., GCP, Azure) for training and deploying large-scale AI models.
+ Excellent problem-solving skills, with the ability to tackle complex, ambiguous problems and drive solutions.
+ Strong communication and presentation skills, capable of explaining complex technical concepts to technical and non-technical audiences.
+ Experience in the healthcare or life sciences industry, specifically with oncology data and research, highly preferred
+ Experience with MLOps practices for deploying and managing large-scale AI models, highly preferred
+ Familiarity with distributed computing frameworks (e.g., Spark, Dask), highly preferred
+ Experience contributing to open-source AI projects, highly preferred
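For candidates weighing whether their background matches the RAG (Retrieval Augmented Generation) requirement above, the core pattern is simple: retrieve the documents most relevant to a query, then assemble a prompt grounded in that context. The following is a minimal, library-free sketch; the function names and sample notes are hypothetical, and a production system would use embedding models and an actual LLM call rather than bag-of-words similarity.

```python
import math
import re
from collections import Counter

def tokens(text: str) -> Counter:
    """Lowercased bag-of-words vector for a piece of text."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = tokens(query)
    return sorted(docs, key=lambda d: cosine(q, tokens(d)), reverse=True)[:k]

def build_rag_prompt(query: str, docs: list[str]) -> str:
    """Ground a (hypothetical) LLM call in the retrieved context."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical unstructured oncology notes, as mentioned in the posting.
notes = [
    "Pathology report: tumor margins clear after resection.",
    "Clinical note: patient reports fatigue following chemotherapy.",
    "Scheduling note: follow-up visit booked for next month.",
]
prompt = build_rag_prompt("What did the pathology report say about margins?", notes)
print(prompt)
```

In a real deployment, the retrieval step would typically be backed by a vector store and the assembled prompt sent to a hosted model; libraries such as LangChain or LlamaIndex (both named in the posting) package exactly this workflow.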
**_What is expected of you and others at this level_**
+ Applies advanced knowledge and understanding of concepts, principles, and technical capabilities to manage a wide variety of projects
+ Participates in the development of policies and procedures to achieve specific goals
+ Recommends new practices, processes, metrics, or models
+ Works on or may lead complex projects of large scope
+ Projects may have significant and long-term impact
+ Provides solutions which may set precedent
+ Independently determines method for completion of new projects
+ Receives guidance on overall project objectives
+ Acts as a mentor to less experienced colleagues
**Anticipated salary range:** $123,400 - $176,300
**Bonus eligible:** Yes
**Benefits:** Cardinal Health offers a wide variety of benefits and programs to support health and well-being.
+ Medical, dental and vision coverage
+ Paid time off plan
+ Health savings account (HSA)
+ 401k savings plan
+ Access to wages before payday with myFlexPay
+ Flexible spending accounts (FSAs)
+ Short- and long-term disability coverage
+ Work-Life resources
+ Paid parental leave
+ Healthy lifestyle programs
**Application window anticipated to close:** 02/15/2026. If interested in this opportunity, please submit your application as soon as possible.
The salary range listed is an estimate. Pay at Cardinal Health is determined by multiple factors including, but not limited to, a candidate's geographical location, relevant education, experience and skills and an evaluation of internal pay equity.
\#LI-Remote
_Candidates who are back-to-work, people with disabilities, without a college degree, and Veterans are encouraged to apply._
_Cardinal Health supports an inclusive workplace that values diversity of thought, experience and background. We celebrate the power of our differences to create better solutions for our customers by ensuring employees can be their authentic selves each day. Cardinal Health is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, ancestry, age, physical or mental disability, sex, sexual orientation, gender identity/expression, pregnancy, veteran status, marital status, creed, status with regard to public assistance, genetic status or any other status protected by federal, state or local law._
_To read and review this privacy notice click_ here (***************************************************************************************************************************
$123.4k-176.3k yearly 43d ago
AWS Data Migration Consultant
Slalom 4.6
Data scientist job in Raleigh, NC
Candidates can live within commutable distance of any Slalom office in the US. We have a hybrid and flexible environment.
Who You'll Work With
As a modern technology company, we've never met a technical challenge we didn't like. We enable our clients to learn from their data, create incredible digital experiences, and make the most of new technologies. We blend design, engineering, and analytics expertise to build the future. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.
We are seeking an experienced Cloud Data Migration Architect with deep expertise in SQL Server, Oracle, DB2, or a combination of these platforms, to lead the design, migration, and optimization of scalable database solutions in the AWS cloud. This role will focus on modernizing on-premises database systems by architecting high-performance, secure, and reliable AWS-hosted solutions.
As a key technical leader, you will work closely with data engineers, cloud architects, and business stakeholders to define data strategies, lead complex database migrations, build out ETL pipelines, and optimize performance across legacy and cloud-native environments.
What You'll Do
* Design and optimize database solutions on AWS, including Amazon RDS, EC2-hosted instances, and advanced configurations like SQL Server Always On or Oracle RAC (Real Application Clusters).
* Lead and execute cloud database migrations using AWS Database Migration Service (DMS), Schema Conversion Tool (SCT), and custom automation tools.
* Architect high-performance database schemas, indexing strategies, partitioning models, and query optimization techniques.
* Optimize complex SQL queries, stored procedures, functions, and views to ensure performance and scalability in the cloud.
* Implement high-availability and disaster recovery (HA/DR) strategies including Always-On, Failover Clusters, Log Shipping, and Replication, tailored to each RDBMS.
* Ensure security best practices are followed including IAM-based access control, encryption, and compliance with industry standards.
* Collaborate with DevOps teams to implement Infrastructure-as-Code (IaC) using tools like Terraform, CloudFormation, or AWS CDK.
* Monitor performance using tools such as AWS CloudWatch, Performance Insights, Query Store, Dynamic Management Views (DMVs), or Oracle-native tools.
* Work with software engineers and data teams to integrate cloud databases into enterprise applications and analytics platforms.
What You'll Bring
* 5+ years of experience in database architecture, design, and administration with at least one of the following: SQL Server, Oracle, or DB2.
* Expertise in one or more of the following RDBMS platforms: Microsoft SQL Server, Oracle, DB2.
* Hands-on experience with AWS database services (RDS, EC2-hosted databases).
* Strong understanding of HA/DR solutions and cloud database design patterns.
* Experience with ETL development and data integration, using tools such as SSIS, AWS Glue, or custom solutions.
* Familiarity with AWS networking components (VPCs, security groups) and hybrid cloud connectivity.
* Strong troubleshooting and analytical skills to resolve complex database and performance issues.
* Ability to work independently and lead database modernization initiatives in collaboration with engineering and client stakeholders.
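The query-optimization work described above hinges on one recurring idea: indexing the columns your queries filter on. The sketch below illustrates it with Python's built-in sqlite3 as a stand-in for SQL Server, Oracle, or DB2 (the `orders` table and index names are hypothetical): adding an index switches the plan from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = 42"

# Before indexing: the planner has no choice but a full table scan.
before = conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall()

# An index on the filtered column lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall()

# The last column of each plan row is a human-readable detail string.
print("before:", before[0][-1])
print("after:", after[0][-1])
```

The same principle carries over to the commercial RDBMS platforms listed above, though each exposes its plans differently (SQL Server's Query Store and DMVs, Oracle's `EXPLAIN PLAN`, and so on).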
Nice to Have
* AWS certifications such as AWS Certified Database - Specialty or AWS Certified Solutions Architect - Professional.
* Experience with NoSQL databases or hybrid data architectures.
* Knowledge of analytics and big data tools (e.g., Snowflake, Redshift, Athena, Power BI, Tableau).
* Familiarity with containerization (Docker, Kubernetes) and serverless technologies (AWS Lambda, Fargate).
* Experience with DB2 on-premise or cloud-hosted environments.
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this position, the target base salary pay ranges are as follows:
In Boston, Houston, Los Angeles, Orange County, Seattle, San Diego, Washington DC, New York, and New Jersey: $105,000-$147,000 for the Consultant level, $120,000-$169,000 for the Senior Consultant level, and $133,000-$187,000 for the Principal level.
In all other markets: $96,000-$135,000 for the Consultant level, $110,000-$155,000 for the Senior Consultant level, and $122,000-$172,000 for the Principal level.
In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The salary pay range is subject to change and may be modified at any time.
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to attracting, developing and retaining highly qualified talent who empower our innovative teams through unique perspectives and experiences. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
We will accept applications until 1/31/2026 or until the positions are filled.
$133k-187k yearly 12d ago
Data Science Intern - Summer 2026
Bandwidth 4.5
Data scientist job in Raleigh, NC
Who We Are:
Bandwidth, a prior “Best of EC” award winner, is a global software company that helps enterprises deliver exceptional experiences through voice, messaging, and emergency services. Reaching 65+ countries and over 90 percent of the global economy, we're the only provider offering an owned communications cloud that delivers advanced automation, AI integrations, global reach, and premium human support. Bandwidth is trusted for mission-critical communications by the Global 2000, hyperscalers, and SaaS builders!
At Bandwidth, your music matters when you are part of the BAND. We celebrate differences and encourage BANDmates to be their authentic selves. #jointheband
What We Are Looking For:
As a Data Science Intern during Summer 2026, you'll be at the forefront of empowering Bandwidth to gain valuable insights from our large datasets. You'll work with leadership, product owners and team members to understand complex business needs, and then design, develop and implement data-driven solutions. You have an eye for detail, but can also think abstractly, analytically and creatively about big challenges. You can create novel solutions that will support several teams like NOC, TAC, Fraud Operations and Bandwidth as a whole by making sense of our data, and taking action.
What You'll Do:
Use analytical tools and statistical techniques to build methods for identifying, analyzing, and troubleshooting anomalies, trends, and patterns in network data
Use data analysis techniques and methods to get accurate and actionable insights across multiple datasets
Work with development teams to request improvements for statistical techniques and reporting methods
Work with teams to optimize data-driven workflows and provide actionable data solutions, data visualization, and analysis results
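To give a flavor of the anomaly-detection work described above, here is a minimal z-score sketch (the function name and call-volume numbers are hypothetical): flag any point more than a chosen number of standard deviations from the mean. Real network data would call for more robust methods, such as seasonal baselines or median-based statistics.

```python
import statistics

def zscore_anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of points more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # all points identical: nothing can be anomalous
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical per-minute call counts with one obvious spike.
calls_per_minute = [120, 118, 125, 122, 119, 121, 950, 117, 123, 120]
print(zscore_anomalies(calls_per_minute, threshold=2.0))  # → [6]
```

A caveat worth knowing for interviews: a single large spike inflates both the mean and the standard deviation, which is why robust alternatives (e.g., median absolute deviation) are often preferred in practice.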
What You Need:
Currently pursuing a Bachelor's degree in math, statistics, computer science, or a related field; or a Bachelor's degree in an unrelated field plus 2 years of work experience in statistical analysis
Experience with development on projects heavily involved with large data sets and statistical methods, preferably in Python
Familiarity with machine learning concepts, including data mining and unsupervised learning
Strong analytical and critical thinking skills, with high attention to detail, and the ability to collect, organize, analyze, and disseminate significant amounts of information
Highly self-motivated, with the ability to work independently and take ownership of issues, and willingness to overcome challenging problems while identifying opportunities for improvement
Strong communication skills and the ability to simplify and explain information and findings to leadership, team members, and other internal stakeholders
The Whole Person Promise:
At Bandwidth, we're pretty proud of our corporate culture, which is rooted in our “Whole Person Promise.” We promise all employees that they can have meaningful work AND a full life, and we provide a work environment geared toward enriching your body, mind, and spirit. How do we do that? Well…
Are you ready for an awesome internship experience? At Bandwidth we're all about making your time with us fun and fulfilling! Take a break with our 90-minute workout lunch to energize your day, or roll up your sleeves for some cool volunteer activities that give back to our community. You'll also get to meet and connect with our leaders who can share their wisdom and advice. And let's not forget the fun social activities to bond with your fellow interns!
Join us for a summer full of learning, laughter, and new experiences-let's make some great memories together!
Are you excited about the position and its responsibilities, but not sure if you're 100% qualified? Do you feel you can work to help us crush the mission? If you answered 'yes' to both of these questions, we encourage you to apply! You won't want to miss the opportunity to be a part of the BAND.
Applicant Privacy Notice
$63k-89k yearly est. 19d ago
Head of AI/Data Consulting
Carimus
Data scientist job in Raleigh, NC
The Role
We are Carimus, a brand experience and digital transformation agency, now proudly part of the Spyrosoft Group. Since 2013, we've brought together the best of art and engineering to create meaningful impact in the digital world. By fusing strategy, creativity, and technology, we help brands break through and connect with their audiences on an emotional level. As part of Spyrosoft, we're expanding our capabilities and reach while staying true to our human-centered approach, crafting experiences that matter for both our clients and our team.
This role plays a pivotal part in shaping and scaling our AI and Data consulting practice. You'll work closely with our leadership, delivery, and commercial teams to define our vision, guide clients on meaningful AI strategies, and build a high-performing team that brings those strategies to life. As the face of our AI expertise, you'll help clients cut through the noise, make smart decisions, and create real business impact. Your blend of technical depth, commercial instinct, and strong relationships will be key to driving growth and delivering exceptional results.
Department: TBD
Classification: Exempt
Status: Full Time
Location: Raleigh, NC (Hybrid 3x per week)
Travel Requirement: 30-50%
What You'll Do
Build and lead a team of AI/Data Consultants who can advise clients on AI and data strategies, tools, architectures, and implementation approaches.
Partner with marketing and commercial teams to shape marketing and go-to-market strategies aimed at winning new clients in the US and Europe.
Own the P&L for the AI/Data Consulting unit, ensuring financial performance, scalability, and profitability.
Conduct workshops, strategic assessments, and executive-level discussions with key customers.
Represent Carimus and Spyrosoft externally as a credible AI voice, supporting business development, shaping proposals, and closing strategic deals.
Define a clear vision for how AI can drive tangible business outcomes and bring that vision to clients and internal teams.
Foster a culture of excellence, curiosity, accountability, and collaboration within the team.
Required Qualifications
Deep expertise in AI and Data Science, with an up-to-date understanding of modern AI (LLMs, agents, MLOps, data architectures, etc.).
Several years of experience in AI/Data consulting or in a software company offering AI/Data advisory services.
Demonstrated success in strategic advisory work with enterprise clients.
Proven ability to connect technical solutions to business value and ROI.
Prior P&L ownership or substantial commercial responsibility.
Exceptional interpersonal, communication, and client-facing skills.
Experience hiring, developing, and leading technical consulting teams.
Ability to travel to customer locations as needed.
Who We're Looking For
We're looking for an ambitious, well-connected AI/Data leader who is both a deep technologist and a business builder: someone who can operate as a one-person practice at the start, shaping strategy and also defining how it gets executed. You bring credibility in the AI/Data space, understand real-world use cases beyond the hype, and can explain complex ideas in a clear, accessible way. You're skilled at developing client relationships, identifying opportunities, and crafting AI strategies along with practical implementation plans, whether we deliver them or empower clients to do so themselves. Above all, you're entrepreneurial, influential, and motivated to build something exceptional from the ground up, driving meaningful value for our clients and accelerating the growth of Carimus and Spyrosoft.
Our Values
At Carimus, these values guide every interaction and collaboration internally and with our clients.
Live in the ZOPD. We continually expand our skills by working in the Zone of Proximal Development. We take measured risks and incorporate new technology, but only what we can deliver with excellence.
Be Transparent & Tenacious. We don't hide from the truth and won't let our clients, either. We embrace reality, own our mistakes, and attack problems with teamwork and creativity.
Invest in Relationships. Life is better doing interesting things with people we like. We build trusting relationships and strong connections-with our employees and our clients. We go further together.
Create Exceptional Experiences. We exceed expectations-yours and ours. We unite art and engineering in smart, compelling ways that inspire confidence and human connection. We excite and engage, from concept to launch.
Commit to Caring. Caring is in our blood-and our name, “Care I Must.” We're proudest when we tackle real problems and advance positive change for people and the environment. Let's get to work.
Physical Requirements
Normal periods of sitting and standing in an office environment.
Lifting and/or pushing objects up to 35 lbs. on an occasional basis.
Travel Requirement 30-50%.
Carimus provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any kind, regardless of race, color, religion, age, sex, national origin, disability status, genetic information, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected under federal, state, or local laws.
$78k-106k yearly est. 53d ago
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation 4.3
Data scientist job in Raleigh, NC
**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology and the ability to create change and facilitate this transformation. They will have experience designing, developing, and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k but will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review, vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior & demonstrated team building & development skills to harness powerful teams
+ Ability to communicate effectively with different levels of seniority within the organization
+ Provide timely updates so that progress against each individual incident can be updated as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensures their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourages and maintains a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling which may include nights, weekends, and holidays
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, and Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
$135k yearly 60d+ ago
Senior Data Engineer
KAC 4.0
Data scientist job in Raleigh, NC
About Us
Grover Gaming, now proudly a part of Light & Wonder, is a leading force in the charitable gaming industry. Our mission is to deliver world-class electronic gaming solutions that support veteran, fraternal and charitable organizations across the country. With a strong focus on building relationships, game and product innovation, service, and support, we're transforming how our charitable partners raise money for the causes that matter most.
Position Overview
We're looking for a Senior Data Engineer to work on our data processing pipeline and lead the build-out of our data lake and data warehouse, helping us deliver more insights and scale our data infrastructure. You will have a chance to contribute to the company's evolving culture, bring innovative approaches, and learn from talented colleagues.
Key Responsibilities
Designs and develops programs and tools to support data ingestion, curation and provisioning of complex data to achieve analytics & reporting on our current technology stack
Designs and builds data extracts, integrations and transformations
Provides successful deployment and provisioning of data solutions to required environments
Designs and builds data architecture and applications that successfully enable speed, quality and efficient pipelines
Interacts with cross-functional teams to gather and define requirements.
Develops understanding of the data and builds business acumen.
Reviews discrepancies in requirements and resolves with stakeholders.
Identifies and recommends appropriate continuous improvement opportunities and ensures integrations are automated and have proper exception handling.
Qualifications
Core Qualifications
6 years as a Data Engineer on a data and analytics team
Bachelor's or Master's degree in a technical discipline such as Computer Science, Information Systems, or another technical field
Proficiency in data modeling and data warehousing (Synapse / Snowflake / Redshift / BigQuery)
Proficiency in SQL Server / MySQL (query optimization, performance tuning, schema design)
Advanced experience with Azure / AWS
Proficient in Python for ETL, automation, and data transformation
Preferred
Experience with building data pipelines and ETL using Azure Data Factory / Databricks.
Experience working with agile methodologies and working in cross-functional teams.
Experience with BI tools (Tableau / Power BI).
Experience with data pipeline workflow management tools such as: Airflow, Astronomer.
Understanding of gaming analytics, telemetry, or other event-driven data domains.
Mindset & Collaboration
People person, team player with a strong can-do mentality
Must be proactive, demonstrate initiative and be a logical thinker
Must be an inquisitive learner and have a thirst for improvement
Ability to provide work guidance to junior level developers
Strong time management and work organization skills
Ability to prioritize workloads and handle multiple tasks, at times meeting tight deadlines.
We are Grover Gaming!
At Grover Gaming, we build entertainment experiences that excite and inspire. From innovative electronic games to mission-driven partnerships, our work powers charitable gaming across the country, helping nonprofits fund the causes that matter most. We believe in doing what you love and doing it with purpose. Our team of innovators, creators, and problem-solvers is shaping the future of charitable gaming. Together, we are building more than games, we are #playingitforward by building community, impact, and opportunity.
Why Grover Gaming?
• Join a passionate team in one of the most exciting sectors of the gaming industry
• Be part of a mission-driven organization that supports charitable causes
• Competitive salary and benefits
• Opportunities for advancement and growth
• A culture built on innovation, integrity, and service
Don't meet every requirement? Studies show that women and people of color are less likely to apply for jobs unless they meet every single qualification. At Grover Gaming, we know that creativity, passion, and different perspectives are what make our games and impact truly special. We welcome people from all backgrounds and experiences. If this role excites you but your experience doesn't match every qualification, we still want to hear from you. You could be exactly the teammate we need!
#LI-Onsite #LI-RR1
Light & Wonder is an Equal Opportunity Employer and does not discriminate against applicants due to race, color, sex, age, national origin, religion, sexual orientation, gender identity, status as a veteran, basis of disability, or any other federal, state or local protected class. If you'd like more information about your equal employment opportunity rights as an applicant under the law, please click here for the EEOC Poster.
$97k-125k yearly est.
Data Engineer
Elder Research 3.9
Data scientist job in Raleigh, NC
Job Title: Data Engineer
Workplace: Hybrid - Due to in-office requirements, candidates must be local to either Raleigh, NC or Charlottesville, VA. Relocation assistance is not available
Clearance Required: Not required, but you must be eligible for a clearance.
Position Overview:
Elder Research, Inc. (ERI) is seeking a Data Engineer with strong engineering skills to provide technical support across multiple project teams by leading, designing, and implementing the software and data architectures needed to deliver analytics to our clients, and to provide consulting and training support to client teams in architecture, data engineering, ML engineering, and related areas. The ideal candidate will have a strong command of Python for data analysis and engineering tasks, a demonstrated ability to create reports and visualizations using tools like R, Python, SQL, or Power BI, and deep expertise in Microsoft Azure environments. The candidate will play a key role in collaborating with cross-functional teams, including software developers, cloud engineers, architects, business leaders, and power users, to deliver innovative data solutions to our clients.
This role requires a consultative mindset, excellent communication skills, and a thorough understanding of the Software Development Life Cycle (SDLC). Candidates should have 5-8 years of relevant experience, including client-facing or consultative roles. The role will be based out of Raleigh, NC or Charlottesville, VA and will require 2-4 days of business travel to our customer site every 6 weeks.
Key Responsibilities:
Data Engineering & Analysis:
Develop, optimize, and maintain scalable data pipelines and systems in Azure environments.
Analyze large, complex datasets to extract insights and support business decision-making.
Create detailed and visually appealing reports and dashboards using R, Python, SQL, and Power BI.
Collaboration & Consulting:
Work closely with software developers, cloud engineers, architects, business leaders, and power users to understand requirements and deliver tailored solutions.
Act as a subject-matter expert in data engineering and provide guidance on best practices.
Translate complex technical concepts into actionable business insights for stakeholders.
Azure Expertise:
Leverage Azure services such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure SQL Database, and Azure Blob Storage for data solutions.
Ensure data architecture is aligned with industry standards and optimized for performance in cloud environments.
SDLC Proficiency:
Follow and advocate for SDLC best practices in data engineering projects.
Collaborate with software development teams to ensure seamless integration of data solutions into applications.
Required Qualifications:
Experience: 5-8 years in data engineering, analytics, or related fields, with a focus on Azure environments.
Education: Bachelor's degree in Computer Science, Data Science, Engineering, or a related field (Master's degree preferred).
Technical Skills:
Programming: Advanced expertise in Python; experience with R is a plus.
Data Tools: Proficient in SQL, Power BI, and Azure-native data tools.
Azure Knowledge: Strong understanding of Azure services, including data integration, storage, and analytics solutions.
SDLC Knowledge: Proven track record of delivering data solutions following SDLC methodologies.
Consultative Skills: Strong client-facing experience with excellent communication and presentation abilities.
Due to customer requirements, candidates must be U.S. citizens or permanent residents of the United States of America.
Preferred Skills and Qualifications:
Certifications in Azure (e.g., Azure Data Engineer, Azure Solutions Architect).
Familiarity with Azure Functions, Event Grid, and Logic Apps.
Hands-on experience with machine learning frameworks and big data processing tools (e.g., Spark, Hadoop).
Familiarity with CI/CD pipelines and DevOps practices for data engineering workflows.
Why apply to this position at Elder Research?
Competitive Salary and Benefits
Important Work / Make a Difference supporting U.S. national security.
Job Stability: Elder Research is not a typical government contractor; we hire you for a career, not just a contract.
People-Focused Culture: we prioritize work-life balance and provide a supportive, positive, and collaborative work environment, as well as opportunities for professional growth and advancement.
About Elder Research, Inc
People Centered. Data Driven.
Elder Research is a fast-growing consulting firm specializing in predictive analytics. Having been in the data mining business for almost 30 years, we pride ourselves on our ability to find creative, cutting-edge solutions to real-world problems. We work hard to provide the best value to our clients and allow each person to contribute their ideas and put their skills to use immediately.
Our team members are passionate, curious, life-long learners. We value humility, servant-leadership, teamwork, and integrity. We seek to serve our clients and our teammates to the best of our abilities. In keeping with our entrepreneurial spirit, we want candidates who are self-motivated, with an innate curiosity and strong teamwork.
Elder Research believes in continuous learning and community - each week the entire company attends a Tech Talk and each office location provides lunch. Elder Research provides a supportive work environment with established parental, bereavement, and PTO policies. By prioritizing a healthy work-life balance - with reasonable hours, solid pay, low travel, and extremely flexible time off - Elder Research enables and encourages its employees to serve others and enjoy their lives.
Elder Research, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Elder Research is a Government contractor and many of our positions require US Citizenship.
$85k-119k yearly est.
Qlik Data Engineer
Akkodis
Data scientist job in Raleigh, NC
Akkodis is seeking a Qlik Data Engineer for a contract with a client in Raleigh, NC (remote). You will design and automate scalable data ingestion pipelines and implement optimized data models for efficient storage and retrieval. Proficiency with Qlik platforms and strong SQL expertise are essential for success in this role.
Rate Range: $49/hour to $53/hour; The rate may be negotiable based on experience, education, geographic location, and other factors.
Qlik Data Engineer job responsibilities include:
* Design and develop scalable ETL/ELT pipelines using Qlik tools (Qlik Replicate, Qlik Compose) for batch and real-time data processing.
* Automate data ingestion and application reloads using Qlik Automate or scripting (e.g., Python) to improve efficiency.
* Implement and optimize data models in Snowflake schema for efficient storage and retrieval.
* Monitor and troubleshoot data integration processes, ensuring performance and resolving bottlenecks.
* Collaborate with cross-functional teams to gather requirements and deliver actionable data solutions.
* Ensure data quality and governance by implementing validation frameworks and compliance measures.
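The last bullet above mentions validation frameworks for data quality. Those are typically configured inside the Qlik platform itself, but the scripting route the second bullet mentions (e.g., Python) can be sketched generically. A minimal, hypothetical validation pass, with invented rule names and records for illustration only (not any Qlik API):

```python
def validate(records, rules):
    """Apply each named rule to every record; collect (index, rule) failures."""
    failures = []
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                failures.append((i, name))
    return failures

# Illustrative rules: every record needs an id, and amounts must be non-negative.
rules = {
    "id_present":      lambda r: r.get("id") is not None,
    "amount_positive": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0,
}

records = [
    {"id": 1, "amount": 125.0},
    {"id": None, "amount": 80.0},   # fails id_present
    {"id": 3, "amount": -5.0},      # fails amount_positive
]

failures = validate(records, rules)
# failures == [(1, "id_present"), (2, "amount_positive")]
```

In a real pipeline the failure list would feed a quarantine table or an alert, so bad rows are surfaced rather than silently loaded.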
Required Qualifications:
* Bachelor's degree in Computer Science, Information Technology, or a related field.
* 3-7 years of experience in data engineering and Qlik platform development.
* Proven expertise in Qlik tools (Qlik Replicate, Qlik Compose, Qlik Sense) and strong proficiency in SQL for data integration and transformation.
* Hands-on experience with Snowflake and AWS cloud services, along with knowledge of ETL/ELT processes and data modeling techniques.
If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, feel free to contact me at ****************************.
Pay Details: $49.00 to $53.00 per hour
Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, EAP program, commuter benefits and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable.
Equal Opportunity Employer/Veterans/Disabled
Military connected talent encouraged to apply
To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to ******************************************************
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
* The California Fair Chance Act
* Los Angeles City Fair Chance Ordinance
* Los Angeles County Fair Chance Ordinance for Employers
* San Francisco Fair Chance Ordinance
Massachusetts Candidates Only: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
$49-53 hourly
Lead Data Engineer
Tata Consulting Services 4.3
Data scientist job in Raleigh, NC
Lead Data Engineer - Snowflake, dbt and Qlik
* Design, develop, and maintain robust and scalable data transformation pipelines using dbt on the Snowflake platform.
* dbt macro development: create and use Jinja-based dbt macros to promote code reusability, modularity, and dynamic SQL generation within dbt projects.
* Data transformation & orchestration: implement and manage data transformation pipelines using dbt, integrating with various data sources and ensuring efficient data flow.
* Utilize advanced dbt concepts, including macros, materializations (e.g., incremental, view, table), snapshots, and configurations to build efficient data models.
* Write highly optimized and complex SQL queries for data manipulation, cleaning, aggregation, and transformation within dbt models.
* Implement and enforce best practices for dbt project structure, version control (Git), documentation, and testing.
* Collaborate with data analysts, engineers, and business stakeholders to understand data requirements and translate them into effective data models (e.g., star schema, snowflake schema).
* Design and implement logical and physical data models within dbt to support analytical and reporting needs.
* Leverage Snowflake features and functionalities for performance optimization, including virtual warehouses, clustering, caching, and query optimization.
* Manage and optimize data ingestion and integration processes from various sources into Snowflake.
* Ensure data quality, integrity, and lineage throughout the data transformation process.
* Implement and maintain dbt tests to ensure data quality, integrity, and adherence to business rules.
* Implement and maintain data governance policies and procedures within the dbt environment.
* Develop and execute automated tests for dbt models to ensure data accuracy and reliability.
Required Skills:
* Proven hands-on experience with dbt in a production environment, including extensive use of macros and advanced modeling techniques.
* Expert-level proficiency in SQL for data querying, manipulation, and transformation.
* Strong experience with Snowflake, including performance tuning and optimization.
* Solid understanding of data warehousing concepts and ETL/ELT processes.
* Experience with version control systems, particularly Git.
* Familiarity with data modeling principles (star schema, snowflake schema).
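The star schema named in the last requirement can be sketched in a few lines: one fact table of measurements keyed to denormalized dimension tables. The example below uses Python's built-in sqlite3 purely for illustration (in this role it would be Snowflake SQL), and the table and column names are invented:

```python
import sqlite3

# Minimal star schema: one fact table joined to one dimension table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT          -- in a snowflake schema this would be
                               -- normalized into its own dim_category table
);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales  VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Typical analytical query: aggregate facts grouped by a dimension attribute.
rows = cur.execute("""
    SELECT p.product_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
# rows == [('Gadget', 75.0), ('Widget', 150.0)]
```

The snowflake variant trades the wide, query-friendly dimension for normalized sub-dimensions, adding joins but reducing redundancy.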
Salary Range- $100,000-$110,000 a year
#LI-SP3
#LI-VX1
$100k-110k yearly
Senior Data Analytics Engineer (Data Sciences)
Invitrogen Holdings
Data scientist job in Raleigh, NC
When you join us at Thermo Fisher Scientific, you'll be part of an inquisitive team that shares your passion for exploration and discovery. We have revenues of more than $40 billion and make the largest R&D investment in the industry. This gives our people the resources and opportunities to make significant contributions to the world.
If you are passionate about engineering and how it can drive decision-making within a world-leading life science company, then this is the role for you!
As a key member of our team, you will collaborate across all divisions and functions. You will develop, build, and maintain analytics and data solutions for customers with various technical backgrounds. Your work will generate data insights to improve efficiency, productivity, and revenue. The ideal candidate will possess strong business analytics and communication skills, experience with data analytics and reporting tools, ETL processes, modeling knowledge, and solid project management abilities.
Key Responsibilities:
Lead the architecture and implementation of data pipelines and Microsoft Fabric-based data models to enable unified financial reporting and analytics.
Build, review, and optimize ETL processes developed in Python, ensuring efficient data ingestion, transformation, and quality across diverse data sources (ERP, financial systems, operational platforms).
Partner with Finance leadership to define business requirements, translate them into robust technical builds, and deliver actionable insights for decision-making.
Establish and implement data engineering standards, including coding guidelines, documentation, version control, and automated testing for ETL pipelines.
Develop, test, and deploy scalable data models using Lakehouse architectures and Fabric Data Warehouses to support forecasting, planning, and profitability analysis.
Provide technical leadership and mentorship to data engineers and analysts, encouraging skill development in Fabric, Power BI, and modern Python data frameworks (Pandas, PySpark, SQLAlchemy).
Collaborate multi-functionally with Finance, IT, and Data Governance teams to ensure alignment with enterprise data architecture and security policies.
Keep up to date with developments in Microsoft Fabric, Power BI, Python, and data orchestration tools, and suggest their strategic implementation.
Run multiple concurrent initiatives - leading all aspects of planning, prioritization, communication, and customer engagement.
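The ETL responsibility above (Python-based ingestion, transformation, and quality checks) follows a standard extract-transform-load shape. A minimal sketch using only the standard library; in the stack described it would typically be Pandas or PySpark against Fabric, and the field names here are hypothetical:

```python
import csv
import io

# Extract: parse source records (a real pipeline would read from an ERP
# export or a Lakehouse file rather than an inline string).
raw = """region,revenue
EMEA,1200
emea,300
APAC,900
"""
reader = csv.DictReader(io.StringIO(raw))

# Transform: a data-quality step (canonical casing) plus aggregation.
totals = {}
for row in reader:
    region = row["region"].strip().upper()
    totals[region] = totals.get(region, 0) + int(row["revenue"])

# Load: materialize the result; a real pipeline would write to a
# warehouse table instead.
result = sorted(totals.items())
# result == [('APAC', 900), ('EMEA', 1500)]
```

Note how the casing fix in the transform step merges the duplicate "EMEA"/"emea" keys; without it the aggregate would silently split one region across two rows.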
Requirements/Qualifications:
Bachelor's degree (or equivalent experience) in a quantitative field such as Statistics, Computer Science, Mathematics, Data Science, or a related discipline; Master's degree (or equivalent experience) preferred
Proven experience in a data engineer or data science role with dynamic responsibilities and scope
Experience building models and analyzing large, complex data sets yielding opportunities for revenue and/or process improvement within an organization
Technical proficiency in Python and SQL
Proficiency in data engineering and reporting platforms (Databricks, Power BI, Tableau, SAS Analytics)
Interpersonal skills: outstanding verbal/written communication
Proven track record of achieving desired results without direct report authority
Knowledge of Agile/Scrum methodology is a plus
Hands-on experience with distributed computational framework, such as Spark
Command of statistical topics, including distributions, hypothesis testing, and experiment building
Minimal travel required (about 10%)
$78k-106k yearly est.
Senior Data Scientist
Insight Global
Data scientist job in Raleigh, NC
As a Senior Data Scientist, you will leverage your advanced analytical skills to extract insights from complex datasets. Your expertise will drive data-driven decision-making and contribute to the development of innovative solutions. You will collaborate with cross-functional teams to enhance business strategies and drive growth through actionable data analysis.
· Leading the development of advanced AI and machine learning models to solve complex business problems
· Working closely with other data scientists and engineers to design, develop, and deploy AI solutions
· Collaborating with cross-functional teams to ensure AI solutions are aligned with business goals and customer needs
· Building models, performing analytics, and creating AI features
· Mentoring junior data scientists and providing guidance on AI and machine learning best practices
· Working with product leaders to apply data science solutions
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to ********************.To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: ****************************************************
Skills and Requirements
3-5+ years of professional experience (post-grad) as a delivery-focused Data Scientist or ML Engineer
Understanding of how LLMs integrate into complex systems
Deep understanding of RAG
Strong in Python
Experience creating AI agents and agentic workflows
Master's or PhD
$79k-110k yearly est.
Senior Data Scientist
Cardinal Health 4.4
Data scientist job in Raleigh, NC
**_What Data Science contributes to Cardinal Health_** The Data & Analytics Function oversees the analytics life-cycle in order to identify, analyze and present relevant insights that drive business decisions and anticipate opportunities to achieve a competitive advantage. This function manages analytic data platforms, the access, design and implementation of reporting/business intelligence solutions, and the application of advanced quantitative modeling.
Data Science applies scientific methodologies, techniques, and tools from various disciplines to extract knowledge and insight from data and solve complex business problems on large data sets, integrating multiple systems.
At Cardinal Health's Artificial Intelligence Center of Excellence (AI CoE), we are pushing the boundaries of healthcare with cutting-edge Data Science and Artificial Intelligence (AI). Our mission is to leverage the power of data to create innovative solutions that improve patient outcomes, streamline operations, and enhance the overall healthcare experience.
We are seeking a highly motivated and experienced Senior Data Scientist to join our team as a thought leader and architect of our AI strategy. You will play a critical role in fulfilling our vision through delivery of impactful solutions that drive real-world change.
**_Responsibilities_**
+ Lead the Development of Innovative AI solutions: Be responsible for designing, implementing, and scaling sophisticated AI solutions that address key business challenges within the healthcare industry by leveraging your expertise in areas such as Machine Learning, Generative AI, and RAG Technologies.
+ Develop advanced ML models for forecasting, classification, risk prediction, and other critical applications.
+ Explore and leverage the latest Generative AI (GenAI) technologies, including Large Language Models (LLMs), for applications like summarization, generation, classification and extraction.
+ Build robust Retrieval Augmented Generation (RAG) systems to integrate LLMs with vast repositories of healthcare and business data, ensuring accurate and relevant outputs.
+ Shape Our AI Strategy: Work closely with key stakeholders across the organization to understand their needs and translate them into actionable AI-driven or AI-powered solutions.
+ Act as a champion for AI within Cardinal Health, influencing the direction of our technology roadmap and ensuring alignment with our overall business objectives.
+ Guide and mentor a team of Data Scientists and ML Engineers: provide technical guidance, mentorship, and support to a skilled and geographically distributed team, while fostering a collaborative and innovative environment that encourages continuous learning and growth.
+ Embrace an AI-Driven Culture: foster a culture of data-driven decision-making, promoting the use of AI insights to drive business outcomes and improve customer experience and patient care.
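The RAG responsibility above (integrating LLMs with document repositories so outputs stay grounded) reduces to: embed a query, retrieve the most similar passages, and prepend them to the prompt. A toy sketch of the retrieval step; production RAG uses a learned embedding model and a vector database, so bag-of-words cosine similarity stands in here purely to keep the example self-contained, and the corpus strings are invented:

```python
import math
from collections import Counter

def embed(text):
    # Stand-in "embedding": bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

corpus = [
    "prior authorization rules for specialty pharmacy claims",
    "warehouse safety procedures for forklift operators",
    "patient demographics fields in the claims data model",
]
index = [(doc, embed(doc)) for doc in corpus]

def retrieve(query, k=1):
    q = embed(query)
    ranked = sorted(index, key=lambda d: cosine(q, d[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

context = retrieve("patient demographics fields in claims data")
# The retrieved passage is then prepended to the LLM prompt so the
# model answers from the repository rather than from memory.
prompt = f"Answer using only this context:\n{context[0]}\n\nQuestion: ..."
```

The "evaluating retrievals without ground truth" qualification below is exactly about judging whether `context` actually supports the answer when no labeled relevance data exists.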
**_Qualifications_**
+ 8-12 years of experience with a minimum of 4 years of experience in data science, with a strong track record of success in developing and deploying complex AI/ML solutions, preferred
+ Bachelor's degree in related field, or equivalent work experience, preferred
+ GenAI Proficiency: Deep understanding of Generative AI concepts, including LLMs, RAG technologies, embedding models, prompting techniques, and vector databases, along with evaluating retrievals from RAGs and GenAI models without ground truth
+ Experience building production-ready Generative AI applications involving RAG, LLMs, vector databases, and embedding models.
+ Extensive knowledge of healthcare data, including clinical data, patient demographics, and claims data. Understanding of HIPAA and other relevant regulations, preferred.
+ Experience working with cloud platforms like Google Cloud Platform (GCP) for data processing, model training, evaluation, monitoring, deployment and support preferred.
+ Proven ability to lead data science projects, mentor colleagues, and effectively communicate complex technical concepts to both technical and non-technical audiences preferred.
+ Proficiency in Python, statistical programming languages, machine learning libraries (Scikit-learn, TensorFlow, PyTorch), cloud platforms, and data engineering tools preferred.
+ Experience in Cloud Functions, VertexAI, MLFlow, Storage Buckets, IAM Principles and Service Accounts preferred.
+ Experience in building end-to-end ML pipelines, from data ingestion and feature engineering to model training, deployment, and scaling preferred.
+ Experience in building and implementing CI/CD pipelines for ML models and other solutions, ensuring seamless integration and deployment in production environments preferred.
+ Familiarity with RESTful API design and implementation, including building robust APIs to integrate your ML models and GenAI solutions with existing systems preferred.
+ Working understanding of software engineering patterns, solutions architecture, information architecture, and security architecture with an emphasis on ML/GenAI implementations preferred.
+ Experience working in Agile development environments, including Scrum or Kanban, and a strong understanding of Agile principles and practices preferred.
+ Familiarity with DevSecOps principles and practices, incorporating coding standards and security considerations into all stages of the development lifecycle preferred.
**_What is expected of you and others at this level_**
+ Applies advanced knowledge and understanding of concepts, principles, and technical capabilities to manage a wide variety of projects
+ Participates in the development of policies and procedures to achieve specific goals
+ Recommends new practices, processes, metrics, or models
+ Works on or may lead complex projects of large scope
+ Projects may have significant and long-term impact
+ Provides solutions which may set precedent
+ Independently determines method for completion of new projects
+ Receives guidance on overall project objectives
+ Acts as a mentor to less experienced colleagues
**Anticipated salary range:** $121,600 - $173,700
**Bonus eligible:** Yes
**Benefits:** Cardinal Health offers a wide variety of benefits and programs to support health and well-being.
+ Medical, dental and vision coverage
+ Paid time off plan
+ Health savings account (HSA)
+ 401k savings plan
+ Access to wages before payday with myFlexPay
+ Flexible spending accounts (FSAs)
+ Short- and long-term disability coverage
+ Work-Life resources
+ Paid parental leave
+ Healthy lifestyle programs
**Application window anticipated to close:** 11/05/2025
*If interested in this opportunity, please submit your application as soon as possible.
The salary range listed is an estimate. Pay at Cardinal Health is determined by multiple factors including, but not limited to, a candidate's geographical location, relevant education, experience and skills and an evaluation of internal pay equity.
_Candidates who are back-to-work, people with disabilities, without a college degree, and Veterans are encouraged to apply._
_Cardinal Health supports an inclusive workplace that values diversity of thought, experience and background. We celebrate the power of our differences to create better solutions for our customers by ensuring employees can be their authentic selves each day. Cardinal Health is an Equal_ _Opportunity/Affirmative_ _Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, ancestry, age, physical or mental disability, sex, sexual orientation, gender identity/expression, pregnancy, veteran status, marital status, creed, status with regard to public assistance, genetic status or any other status protected by federal, state or local law._
_To read and review this privacy notice click_ here (***************************************************************************************************************************
$121.6k-173.7k yearly
Google Cloud Data & AI Engineer
Slalom 4.6
Data scientist job in Raleigh, NC
Who You'll Work With
As a modern technology company, our Slalom technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies.
You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant, or Principal at Slalom, you will be part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients.
What You'll Do
* Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub and more.
* Implement cloud-based data solutions for data ingestion, transformation, and storage; and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs.
* Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem.
* Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps).
* Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud.
* Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients.
* Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices.
What You'll Bring
* Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.).
* Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance.
* Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML.
* Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment.
* Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts.
* Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects.
* Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously.
* Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud.
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe.
About Us
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.
Compensation and Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance.
Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time.
East Bay, San Francisco, Silicon Valley:
* Consultant $114,000-$171,000
* Senior Consultant: $131,000-$196,500
* Principal: $145,000-$217,500
San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC:
* Consultant: $105,000-$157,500
* Senior Consultant: $120,000-$180,000
* Principal: $133,000-$199,500
All other locations:
* Consultant: $96,000-$144,000
* Senior Consultant: $110,000-$165,000
* Principal: $122,000-$183,000
EEO and Accommodations
Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process.
We are accepting applications until 12/31.
#LI-FB1
Qlik Data Engineer
Tata Consultancy Services
Data scientist job in Raleigh, NC
This Qlik Data Engineer/Developer (Qlik Replicate Automation for Data Ingestion) role, for a professional with 3-7 years of experience, focuses on the design, development, and maintenance of robust, automated data pipelines using Qlik's suite of tools (Qlik Replicate, Qlik Compose, Qlik Sense, and Qlik Automate). It requires a strong mix of data engineering skills, Qlik platform expertise, and collaboration with business stakeholders.
Roles and Responsibilities
* Design and Implementation:
o Design, develop, and maintain efficient and scalable data integration and ETL/ELT (Extract, Transform, Load / Extract, Load, Transform) pipelines for both batch and real-time data processing.
o Utilize Qlik data integration tools (e.g., Qlik Replicate, Qlik Compose) to ingest data from various sources (SQL databases, APIs, cloud platforms, flat files) into target data warehouses or data lakes.
o Implement data models (in snowflake schema) using the Qlik associative data model to ensure efficient data storage and retrieval.
* Automation and Optimization:
o Automate data ingestion processes, application reloads, and administrative tasks using Qlik Automate or scripting (e.g., Python) to improve efficiency and reduce manual intervention.
o Monitor and optimize data integration processes and Qlik applications for performance, identifying and resolving bottlenecks in data loading and query execution.
o Implement data quality frameworks, including validation, monitoring, and alerting systems to ensure data accuracy and consistency.
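The data-quality frameworks mentioned above typically combine rule-based validation with failure reporting for alerting. As a minimal sketch in Python (the scripting language the posting names), the rule names, field names, and thresholds below are illustrative assumptions, not part of any Qlik API:

```python
# Minimal data-quality validation sketch: apply simple completeness and
# range rules to ingested records and collect failures for alerting.
# Field names ("record_id", "amount") and rules are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class ValidationReport:
    total: int = 0
    # each failure is a (row_index, rule_name, detail) tuple
    failures: list = field(default_factory=list)

    @property
    def passed(self) -> bool:
        return not self.failures


def validate_rows(rows):
    """Apply basic completeness and range rules to ingested records."""
    report = ValidationReport(total=len(rows))
    for i, row in enumerate(rows):
        if not row.get("record_id"):           # completeness rule
            report.failures.append((i, "missing_key", "record_id is empty"))
        amount = row.get("amount")
        if amount is not None and amount < 0:  # range rule
            report.failures.append((i, "negative_amount", f"amount={amount}"))
    return report


rows = [
    {"record_id": "A1", "amount": 120.0},
    {"record_id": "",   "amount": 50.0},   # fails completeness
    {"record_id": "B2", "amount": -3.0},   # fails range
]
report = validate_rows(rows)
print(report.passed)         # False
print(len(report.failures))  # 2
```

In practice a report like this would feed a monitoring or alerting channel (for example, failing a pipeline run or notifying a team) rather than being printed.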
* Collaboration and Support:
o Collaborate closely with data engineers, analysts, data scientists, and business stakeholders to gather and understand data requirements and deliver solutions that enable actionable insights.
o Troubleshoot and resolve data integration issues, providing ongoing support for Qlik solutions.
o Document data pipelines, schemas, and business logic for knowledge sharing and efficient onboarding.
* Governance and Compliance:
o Implement data security measures, including encryption, access controls, and audit logging, to ensure compliance with relevant data governance policies.
o Manage the data lifecycle, including archival, retention, and deletion policies.
Required Qualifications and Skills
* Experience: 3-7 years of proven experience working directly with Qlik platforms (Qlik Sense, QlikView, Qlik Data Integration suite) as a developer or data integrator.
* Strong proficiency in SQL and experience with relational and non-relational databases.
* Solid understanding of data warehousing concepts, ETL processes, and data modeling techniques.
* Familiarity with cloud platforms (AWS), data platforms (Snowflake), and related data services.
* Experience with other programming/scripting languages (e.g., Python) for advanced automation is a plus.
Salary Range- $100,000-$110,000 a year
#LI-SP3
#LI-VX1
How much does a data scientist earn in Rocky Mount, NC?
The average data scientist in Rocky Mount, NC earns between $61,000 and $114,000 annually. This compares to the national average data scientist range of $75,000 to $148,000.