We are seeking experienced Data Scientists to join our team and drive impact across various product areas. As a Data Scientist, you will collaborate with cross-functional partners to identify and solve complex problems using data and analysis. Your role will involve shaping product development, quantifying new opportunities, and ensuring products bring value to users and the company. You will guide teams using data-driven insights, develop hypotheses, and employ rigorous analytical approaches to test them. You will tell data-driven stories, present clear insights, and build credibility with stakeholders. By joining our team, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
**Required Skills:**
Data Scientist, Analytics (Technical Leadership) Responsibilities:
1. Work with complex data sets to solve challenging problems using analytical and statistical approaches
2. Apply technical expertise in quantitative analysis, experimentation, and data mining to develop product strategies
3. Identify and measure success through goal setting, forecasting, and monitoring key metrics
4. Partner with cross-functional teams to inform and execute product strategy and investment decisions
5. Build long-term vision and strategy for programs and products
6. Collaborate with executives to define and develop data platforms and instrumentation
7. Effectively communicate insights and recommendations to stakeholders
8. Define success metrics, forecast changes, and set team goals
9. Support developing roadmaps and coordinate analytics efforts across teams
**Minimum Qualifications:**
10. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
11. 5+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
12. 8+ years of work experience leading analytics work in IC capacity, working collaboratively with Engineering and cross-functional partners, and guiding data-influenced product planning, prioritization and strategy development
13. Experience working effectively with and leading complex analytics projects across multiple stakeholders and cross-functional teams, including Engineering, PM/TPM, Analytics and Finance
14. Experience with predictive modeling, machine learning, and experimentation/causal inference methods (a toy experiment-analysis sketch follows this list)
15. Experience communicating complex technical topics in a clear, precise, and actionable manner
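As a toy illustration of the experimentation skills listed above (not part of the posting itself), the sketch below reads out a hypothetical A/B test with a two-proportion z-test. The metric, counts, and function name are invented for the example.

```python
# Illustrative only: minimal A/B-test readout. Counts below are hypothetical.
import numpy as np
from scipy.stats import norm

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))                          # two-sided p-value
    return p_b - p_a, z, p_value

# Hypothetical control vs. treatment counts
lift, z, p = two_proportion_ztest(conv_a=1_200, n_a=25_000, conv_b=1_320, n_b=25_000)
print(f"absolute lift={lift:.4f}, z={z:.2f}, p={p:.3f}")
```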
**Preferred Qualifications:**
16. 10+ years of experience communicating the results of analyses to leadership teams to influence the strategy
17. Master's or Ph.D. degree in a quantitative field
18. Bachelor's Degree in an analytical or scientific field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research)
19. 10+ years of experience doing complex quantitative analysis in product analytics
**Public Compensation:**
$210,000/year to $281,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
Data Scientist, Privacy
Datavant
Data engineer job in Columbia, SC
Datavant is a data platform company and the world's leader in health data exchange. Our vision is that every healthcare decision is powered by the right data, at the right time, in the right format. Our platform is powered by the largest, most diverse health data network in the U.S., enabling data to be secure, accessible and usable to inform better health decisions. Datavant is trusted by the world's leading life sciences companies, government agencies, and those who deliver and pay for care.
By joining Datavant today, you're stepping onto a high-performing, values-driven team. Together, we're rising to the challenge of tackling some of healthcare's most complex problems with technology-forward solutions. Datavanters bring a diversity of professional, educational and life experiences to realize our bold vision for healthcare.
As part of the Privacy Science team within Privacy Hub, you will play a crucial role in ensuring that patient privacy is safeguarded in the modern world of data sharing. As well as working on real data, you will be involved in exciting research that keeps us industry leaders in this area, and in stimulating discussions on re-identification risk. You will be supported in developing and consolidating data analysis and coding skills to become proficient in the analysis of large health-related datasets.
**You Will:**
+ Critically analyze large health datasets using standard and bespoke software libraries
+ Discuss your findings and progress with internal and external stakeholders
+ Produce high-quality reports that summarize your findings
+ Contribute to research activities as we explore novel and established sources of re-identification risk (a toy illustration follows this list)
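As a toy illustration of re-identification risk of the kind Privacy Hub studies (not Datavant's actual methodology), the sketch below computes equivalence-class sizes over hypothetical quasi-identifiers to get a k-anonymity level and the fraction of unique records. All column names and data are invented.

```python
# Illustrative only: k-anonymity-style uniqueness check on hypothetical data.
import pandas as pd

records = pd.DataFrame({
    "zip3": ["291", "291", "292", "292", "290"],
    "yob":  [1984, 1984, 1990, 1990, 1951],
    "sex":  ["F", "F", "M", "M", "F"],
})
quasi_identifiers = ["zip3", "yob", "sex"]

# Size of each equivalence class: records sharing the same quasi-identifier values
class_sizes = records.groupby(quasi_identifiers).size()

k = int(class_sizes.min())                      # smallest class = k-anonymity level
unique_records = int((class_sizes == 1).sum())  # classes of size 1 are unique records
unique_fraction = unique_records / len(records)

print(f"k-anonymity: {k}; unique records: {unique_records} ({unique_fraction:.0%})")
```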
**What You Will Bring to the Table:**
+ Excellent communication skills and meticulous attention to detail in producing comprehensive, well-presented reports
+ A good understanding of statistical probability distributions, bias, error and power as well as sampling and resampling methods
+ A drive to understand real-world data in context rather than considering it in the abstract
+ Familiarity or proficiency with a programmable data analysis language such as R or Python, and the desire to develop deeper expertise in it
+ Application of scientific methods to practical problems through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions
+ Strong time management skills and demonstrable experience of prioritising work to meet tight deadlines
+ Initiative and ability to independently explore and research novel topics and concepts as they arise, to expand Privacy Hub's knowledge base
+ An appreciation of the need for effective methods in data privacy and security, and an awareness of the relevant legislation
+ Familiarity with Amazon Web Services cloud-based storage and computing facilities
**Bonus Points If You Have:**
+ Experience creating documents using LaTeX
+ Detailed knowledge of one or more types of health information, e.g., genomics, disease, health images
+ Experience working with or supporting public sector organizations, such as federal agencies (e.g., CMS, NIH, VA, CDC), state health departments, or public health research partners. Familiarity with government data environments, procurement processes, or privacy frameworks in regulated settings is highly valued.
We are committed to building a diverse team of Datavanters who are all responsible for stewarding a high-performance culture in which all Datavanters belong and thrive. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status.
At Datavant our total rewards strategy powers a high-growth, high-performance, health technology company that rewards our employees for transforming health care through creating industry-defining data logistics products and services.
The range posted is for a given job title, which can include multiple levels. Individual rates for the same job title may differ based on their level, responsibilities, skills, and experience for a specific job.
The estimated total cash compensation range for this role is:
$104,000-$130,000 USD
To ensure the safety of patients and staff, many of our clients require post-offer health screenings and proof and/or completion of various vaccinations such as the flu shot, Tdap, COVID-19, etc. Any requests to be exempted from these requirements will be reviewed by Datavant Human Resources and determined on a case-by-case basis. Depending on the state in which you will be working, exemptions may be available on the basis of disability, medical contraindications to the vaccine or any of its components, pregnancy or pregnancy-related medical conditions, and/or religion.
This job is not eligible for employment sponsorship.
Datavant is committed to a work environment free from job discrimination. We are proud to be an Equal Employment Opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, or other legally protected status. To learn more about our commitment, please review our EEO Commitment Statement and the Know Your Rights notice, and explore the resources available through the EEOC for more information regarding your legal rights and protections. In addition, Datavant does not and will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay.
At the end of this application, you will find a set of voluntary demographic questions. If you choose to respond, your answers will be anonymous and will help us identify areas for improvement in our recruitment process. (We can only see aggregate responses, not individual ones. In fact, we aren't even able to see whether you've responded.) Responding is entirely optional and will not affect your application or hiring process in any way.
Datavant is committed to working with and providing reasonable accommodations to individuals with physical and mental disabilities. If you need an accommodation while seeking employment, please submit a request under the 'Interview Accommodation Request' category of our accommodations form. You will need your requisition ID when submitting your request. Requests for reasonable accommodations will be reviewed on a case-by-case basis.
For more information about how we collect and use your data, please review our Privacy Policy.
Databricks Data Engineer - Manager - Consulting - Location Open
EY 4.7
Data engineer job in Columbia, SC
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
**Technology - Data and Decision Science - Data Engineering - Manager**
We are looking for a dynamic and experienced Manager of Data Engineering to lead our team in designing and implementing complex cloud analytics solutions with a strong focus on Databricks. The ideal candidate will possess deep technical expertise in data architecture, cloud technologies, and analytics, along with exceptional leadership and client management skills.
**The opportunity:**
In this role, you will design and build analytics solutions that deliver significant business value. You will collaborate with other data and analytics professionals, management, and stakeholders to ensure that business requirements are translated into effective technical solutions. Key responsibilities include:
+ Understanding and analyzing business requirements to translate them into technical requirements.
+ Designing, building, and operating scalable data architecture and modeling solutions.
+ Staying up to date with the latest trends and emerging technologies to maintain a competitive edge.
**Key Responsibilities:**
As a Data Engineering Manager, you will play a crucial role in managing and delivering complex technical initiatives. Your time will be spent across various responsibilities, including:
+ Leading workstream delivery and ensuring quality in all processes.
+ Engaging with clients on a daily basis, actively participating in working sessions, and identifying opportunities for additional services.
+ Implementing resource plans and budgets while managing engagement economics.
This role offers the opportunity to work in a dynamic environment where you will face challenges that require innovative solutions. You will learn and grow as you guide others and interpret internal and external issues to recommend quality solutions. Travel may be required regularly based on client needs.
**Skills and attributes for success:**
To thrive in this role, you should possess a blend of technical and interpersonal skills. The following attributes will make a significant impact:
+ Lead the design and development of scalable data engineering solutions using Databricks on cloud platforms (e.g., AWS, Azure, GCP).
+ Oversee the architecture of complex cloud analytics solutions, ensuring alignment with business objectives and best practices.
+ Manage and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous improvement.
+ Collaborate with clients to understand their analytics needs and deliver tailored solutions that drive business value.
+ Ensure the quality, integrity, and security of data throughout the data lifecycle, implementing best practices in data governance.
+ Drive end-to-end data pipeline development, including data ingestion, transformation, and storage, leveraging Databricks and other cloud services.
+ Communicate effectively with stakeholders, including technical and non-technical audiences, to convey complex data concepts and project progress.
+ Manage client relationships and expectations, ensuring high levels of satisfaction and engagement.
+ Stay abreast of the latest trends and technologies in data engineering, cloud computing, and analytics.
+ Strong analytical and problem-solving abilities.
+ Excellent communication skills, with the ability to convey complex information clearly.
+ Proven experience in managing and delivering projects effectively.
+ Ability to build and manage relationships with clients and stakeholders.
**To qualify for the role, you must have:**
+ Bachelor's degree in Computer Science, Engineering, or a related field required; Master's degree preferred.
+ Typically no less than 4-6 years of relevant experience in data engineering, with a focus on cloud data solutions and analytics.
+ Proven expertise in Databricks and experience with Spark for big data processing.
+ Strong background in data architecture and design, with experience in building complex cloud analytics solutions.
+ Experience in leading and managing teams, with a focus on mentoring and developing talent.
+ Strong programming skills in languages such as Python, Scala, or SQL.
+ Excellent problem-solving skills and the ability to work independently and as part of a team.
+ Strong communication and interpersonal skills, with a focus on client management.
**Required Expertise for Managerial Role:**
+ **Strategic Leadership:** Ability to align data engineering initiatives with organizational goals and drive strategic vision.
+ **Project Management:** Experience in managing multiple projects and teams, ensuring timely delivery and adherence to project scope.
+ **Stakeholder Engagement:** Proficiency in engaging with various stakeholders, including executives, to understand their needs and present solutions effectively.
+ **Change Management:** Skills in guiding clients through change processes related to data transformation and technology adoption.
+ **Risk Management:** Ability to identify potential risks in data projects and develop mitigation strategies.
+ **Technical Leadership:** Experience in leading technical discussions and making architectural decisions that impact project outcomes.
+ **Documentation and Reporting:** Proficiency in creating comprehensive documentation and reports to communicate project progress and outcomes to clients.
**Large-Scale Implementation Programs:**
1. **Enterprise Data Lake Implementation:** Led the design and deployment of a cloud-based data lake solution for a Fortune 500 retail client, integrating data from multiple sources (e.g., ERPs, POS systems, e-commerce platforms) to enable advanced analytics and reporting capabilities.
2. **Real-Time Analytics Platform:** Managed the development of a real-time analytics platform using Databricks for a financial services organization, enabling real-time fraud detection and risk assessment through streaming data ingestion and processing.
3. **Data Warehouse Modernization:** Oversaw the modernization of a legacy data warehouse to a cloud-native architecture for a healthcare provider, implementing ETL processes with Databricks and improving data accessibility for analytics and reporting.
**Ideally, you'll also have:**
+ Experience with advanced data analytics tools and techniques.
+ Familiarity with machine learning concepts and applications.
+ Knowledge of industry trends and best practices in data engineering.
+ Familiarity with cloud platforms (AWS, Azure, GCP) and their data services.
+ Knowledge of data governance and compliance standards.
+ Experience with machine learning frameworks and tools.
**What we look for:**
We seek individuals who are not only technically proficient but also possess the qualities of top performers, including a strong sense of collaboration, adaptability, and a passion for continuous learning. If you are driven by results and have a desire to make a meaningful impact, we want to hear from you.
FY26NATAID
**What we offer you**
At EY, we'll develop you with future-focused skills and equip you with world-class experiences. We'll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.
+ We offer a comprehensive compensation and benefits package where you'll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $125,500 to $230,200. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $150,700 to $261,600. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
+ Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
+ Under our flexible vacation policy, you'll decide how much vacation time you need based on your own personal circumstances. You'll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
**Are you ready to shape your future with confidence? Apply today.**
EY accepts applications for this position on an ongoing basis.
For those living in California, please click here for additional information.
EY focuses on high-ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
**EY | Building a better working world**
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities, including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate-related inquiries, then select Option 1 for candidate queries, and finally select Option 2 for candidates with an inquiry; this will route you to EY's Talent Shared Services Team (TSS), or you may email the TSS.
Data Scientist I
UNUM Group 4.4
Data engineer job in Columbia, SC
When you join the team at Unum, you become part of an organization committed to helping you thrive. Here, we work to provide the employee benefits and service solutions that enable employees at our client companies to thrive throughout life's moments. And this starts with ensuring that every one of our team members enjoys opportunities to succeed both professionally and personally. To enable this, we provide:
* Award-winning culture
* Inclusion and diversity as a priority
* Performance Based Incentive Plans
* Competitive benefits package that includes: Health, Vision, Dental, Short & Long-Term Disability
* Generous PTO (including paid time to volunteer!)
* Up to 9.5% 401(k) employer contribution
* Mental health support
* Career advancement opportunities
* Student loan repayment options
* Tuition reimbursement
* Flexible work environments
* All the benefits listed above are subject to the terms of their individual Plans.
And that's just the beginning…
With 10,000 employees helping more than 39 million people worldwide, every role at Unum is meaningful and impacts the lives of our customers. Whether you're directly supporting a growing family, or developing online tools to help navigate a difficult loss, customers are counting on the combined talents of our entire team. Help us help others, and join Team Unum today!
General Summary:
This position is for a developing data scientist who is excited to transform data into actionable insights and to impact the business through their work. The role requires increasing technical expertise in computer programming, applied statistics, and data manipulation, and relies on developing business knowledge. The individual will participate in project work primarily within the functional area, with direction and review by their manager. The individual is expected to continuously increase business knowledge and take initiative in identifying and executing analytical approaches to support assigned projects.
Job Specifications
* Bachelor's degree in quantitative field is preferred
* 2+ years of professional experience or equivalent relevant work experience preferred
* Core Data Science Capabilities: Deep expertise in at least one of the following skillsets preferred, with basic capability in the others:
* Programming & Process automation: Experience using APIs, file I/O, databases, and analysis libraries. Understanding of programming in Jupyter notebooks and/or statistical packages. Understanding of process mapping and demonstrated application of scripting languages to automate processes. Exposure to data mining and web scraping.
* Data Visualization: Working knowledge of two or more data visualization tools and proficiency in static data visualization. Basic understanding of dynamic data visualization
* Statistics & Statistical modeling: Solid understanding of statistical inference and regression. Basic understanding of machine learning techniques, including random forests and neural nets. Basic understanding of feature selection and extraction, which may include analysis of categorical data, multiple comparisons, the Central Limit Theorem, bootstrapping, and permutation tests (a small bootstrap sketch follows this list)
* Data Extraction, Transformation, and Loading (ETL) skillsets preferred: Expertise in writing complex SQL queries that join multiple tables/databases. Ability to independently explore databases and tables to identify the best data sources for a given business problem, and to troubleshoot complex SQL queries written by others with little guidance.
* Core business capabilities: Basic communication skills, exposure to the financial services industry, attention to detail, and the ability to prioritize across multiple projects while keeping track of due dates
* Leadership Capabilities: Ability to coach or mentor team members, respond quickly and positively to change, and adhere to technical best practices, collaborating with team members to deliver high-quality output
* Preferred characteristics: Entrepreneurial self-starter. A thorough, results-oriented problem-solver, and a lifelong learner with voracious curiosity. Basic understanding of their organization.
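As a small illustration of the resampling methods named in the statistics bullet above (simulated data, not company data), the sketch below computes a percentile bootstrap confidence interval for a mean in Python.

```python
# Illustrative only: percentile bootstrap CI on simulated claim amounts.
import numpy as np

rng = np.random.default_rng(42)
claim_amounts = rng.lognormal(mean=7.0, sigma=0.8, size=500)  # hypothetical data

def bootstrap_ci(data, stat=np.mean, n_resamples=5_000, alpha=0.05, rng=rng):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    stats = np.array([
        stat(rng.choice(data, size=len(data), replace=True))
        for _ in range(n_resamples)
    ])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

low, high = bootstrap_ci(claim_amounts)
print(f"mean claim: {claim_amounts.mean():,.0f}  95% bootstrap CI: ({low:,.0f}, {high:,.0f})")
```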
Principal Duties and Responsibilities
* Design and execute analytical solutions using statistical, optimization, simulation and data mining methods with a focus on delivering actionable insights and partnership to deliver business value.
* Integrate large volumes of data from different sources (including DB2, SQL Server, Web APIs, and Teradata) to create data assets and perform analyses
* Apply validation, aggregation, and reconciliation techniques to create a rich, modeling-ready data framework.
* Construct predictive models using machine learning to explain and understand observed events, forecast expected behavior, or identify risk through scoring or clustering.
* Familiarity with AI coding assistants such as GitHub Copilot and Microsoft Copilot to speed up code development and self-learning.
* Knowledge of working with LLMs and agent libraries such as LangGraph, LangChain, and the OpenAI SDK.
* Efficiently interpret results and communicate findings and potential value to manager.
* Support integration of solutions within existing business processes using automation techniques.
* Understand theory and application of current and emerging statistical methods and tools.
* Knowledge of tools such as Python, VS Code, Databricks, Azure, AWS, Streamlit, Tableau, and SQL
* Perform other related duties as assigned
Unum and Colonial Life are part of Unum Group, a Fortune 500 company and leading provider of employee benefits to companies worldwide. Headquartered in Chattanooga, TN, with international offices in Ireland, Poland and the UK, Unum also has significant operations in Portland, ME, and Baton Rouge, LA - plus over 35 US field offices. Colonial Life is headquartered in Columbia, SC, with over 40 field offices nationwide.
Unum is an equal opportunity employer, considering all qualified applicants and employees for hiring, placement, and advancement, without regard to a person's race, color, religion, national origin, age, genetic information, military status, gender, sexual orientation, gender identity or expression, disability, or protected veteran status.
The base salary range for applicants for this position is listed below. Unless actual salary is indicated above in the job description, actual pay will be based on skill, geographical location and experience.
$60,500.00-$123,400.00
Additionally, Unum offers a portfolio of benefits and rewards that are competitive and comprehensive including healthcare benefits (health, vision, dental), insurance benefits (short & long-term disability), performance-based incentive plans, paid time off, and a 401(k) retirement plan with an employer match up to 5% and an additional 4.5% contribution whether you contribute to the plan or not. All benefits are subject to the terms and conditions of individual Plans.
Company:
Unum
Data Engineer - Senior Manager
PwC 4.8
Data engineer job in Columbia, SC
**Specialty/Competency:** Data, Analytics & AI
**Industry/Sector:** Not Applicable
**Time Type:** Full time
**Travel Requirements:** Up to 60%
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.
In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Growing as a strategic advisor, you leverage your influence, expertise, and network to deliver quality results. You motivate and coach others, coming together to solve complex problems. As you increase in autonomy, you apply sound judgment, recognising when to take action and when to escalate. You are expected to solve through complexity, ask thoughtful questions, and clearly communicate how things fit together. Your ability to develop and sustain high performing, diverse, and inclusive teams, and your commitment to excellence, contributes to the success of our Firm.
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
+ Craft and convey clear, impactful and engaging messages that tell a holistic story.
+ Apply systems thinking to identify underlying problems and/or opportunities.
+ Validate outcomes with clients, share alternative perspectives, and act on client feedback.
+ Direct the team through complexity, demonstrating composure through ambiguous, challenging and uncertain situations.
+ Deepen and evolve your expertise with a focus on staying relevant.
+ Initiate open and honest coaching conversations at all levels.
+ Make difficult decisions and take action to resolve issues hindering team effectiveness.
+ Model and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
As part of the Data and Analytics Engineering team, you design and implement thorough data architecture strategies that meet current and future business needs. As a Senior Manager, you lead large projects, innovate processes, and maintain operational excellence while interacting with clients at a strategic level to drive project success. You will also be responsible for developing and documenting data models, data flow diagrams, and data architecture guidelines, maintaining compliance with data governance and data security policies, and collaborating with business stakeholders to translate their data requirements into technical solutions.
**Responsibilities**
- Design and implement thorough data architecture strategies
- Lead large-scale projects and innovate processes
- Maintain operational excellence and client interactions
- Develop and document data models and data flow diagrams
- Adhere to data governance and security policies
- Collaborate with business stakeholders to translate data requirements into technical solutions
- Drive project success through strategic advising and problem-solving
- Foster a diverse and inclusive team environment
**What You Must Have**
- Bachelor's Degree
- 8 years of experience
**What Sets You Apart**
- Certification in Cloud Platforms [e.g., AWS Solutions Architect, AWS Data Engineer, Google Professional Cloud Architect, GCP Data Engineer, Microsoft Azure Solutions Architect, Azure Data Engineer Associate, Snowflake Core, Snowflake Architect, Databricks Data Engineer Associate] is a plus
- Designing and implementing thorough data architecture strategies
- Developing and documenting data models and data flow diagrams
- Maintaining data architecture compliance with governance and security policies
- Collaborating with stakeholders to translate data requirements into solutions
- Evaluating and recommending new data technologies and tools
- Leading data strategy engagements providing thought leadership
- Developing leading practices for Data Engineering, Data Science, and Data Governance
- Architecting and implementing cloud-based solutions meeting industry standards
PwC does not intend to hire experienced or entry-level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within PwC's applicable policy.
As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or, any other status protected by law.
For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles' Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for Employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all.
Applications will be accepted until the position is filled or the posting is removed, unless otherwise set forth on PwC's application deadlines webpage.
The salary range for this position is: $124,000 - $280,000. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications and location, and applicable employment laws. All hired individuals are eligible for an annual discretionary bonus. PwC offers a wide range of benefits, including medical, dental, vision, 401(k), holiday pay, vacation, personal and family sick leave, and more.
Data Scientist 1 - Healthcare
Baylor Scott & White Health 4.5
Data engineer job in Columbia, SC
Value-Based Care (VBC) Analytics is an independent organization covering the analytical and data science needs of the Baylor Scott & White Health Plan (Payer) and the Baylor Scott & White Quality Alliance (Accountable Care Organization). We are seeking a customer-facing Healthcare Data Scientist who will work closely with key business stakeholders within the value-based care team to develop use cases related to complex, difficult-to-solve business challenges. The ideal candidate will create machine learning models using appropriate techniques to derive predictive insights that enable stakeholders to act and improve business outcomes.
**ESSENTIAL FUNCTIONS OF THE ROLE**
+ Communication and Consulting: Summarize and effectively communicate complex data science concepts to inform stakeholders, gain approval, or prompt action from non-technical audiences based on data-driven recommendations.
+ Applied Machine Learning: Implement machine learning solutions within production environments at scale. Apply appropriate machine learning techniques that directly impact HEDIS/Stars initiatives
+ Data Collection and Optimization: Collect and analyze data from a variety of SQL environments (Snowflake, SQL Server) and other data sources, including vendor derived data, electronic health records, and claims data.
+ Analyze Healthcare Data: Conduct detailed analyses on complex healthcare datasets to identify trends, patterns, and insights within HEDIS/Stars and utilization data that support value-based care initiatives, particularly quality and adherence to standards of care.
+ Stay Informed: Stay up to date on the latest advancements in data science and healthcare analytics to continuously improve our methodologies and tools.
**KEY SUCCESS FACTORS**
The ideal candidate will have some of the following skills and an eagerness to learn the rest.
+ Healthcare Knowledge: Understanding of and prior experience in handling data pertaining to HEDIS, Stars measures, and regulatory specifications. Experience with administrative claims data sources such as medical/pharmacy claims, social determinants of health (SDOH), and electronic health records is also required.
+ Education: Bachelor's or advanced degree in mathematics, statistics, data science, Public Health or another quantitative field.
+ Effective Communication: Experienced in communicating findings and recommendations directly to Executive-level customers and healthcare professionals.
+ Analytics Skills: Academic or professional experience conducting analytics and experimentation using algorithms associated with advanced analytics topics, including binary classification algorithms, regression algorithms, Neural Network frameworks, Natural Language Processing, etc.
+ Technical Skills: Proficiency in common languages/tools for AI/ML such as Python and PySpark. Understanding of software engineering topics, including version control, CI/CD, and unit tests.
+ Problem Solving: A passion for solving puzzles and digging into data.
+ Technology Stack: Familiarity with deploying data science products at scale in a cloud environment such as Snowflake, Databricks or Azure AI/ML Studio.
**BENEFITS**
Our competitive benefits package includes the following
+ Immediate eligibility for health and welfare benefits
+ 401(k) savings plan with dollar-for-dollar match up to 5%
+ Tuition Reimbursement
+ PTO accrual beginning Day 1
Note: Benefits may vary based upon position type and/or level
**QUALIFICATIONS**
- EDUCATION - Master's degree, or Bachelor's degree plus 2 years of work experience above the minimum qualification
- EXPERIENCE - 3 Years of Experience
As a health care system committed to improving the health of those we serve, we are asking our employees to model the same behaviors that we promote to our patients. As of January 1, 2012, Baylor Scott & White Health no longer hires individuals who use nicotine products. We are an equal opportunity employer committed to ensuring a diverse workforce. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Sr Data Warehouse Lakehouse Developer
Lumen 3.4
Data engineer job in Columbia, SC
Lumen connects the world. We are igniting business growth by connecting people, data and applications - quickly, securely, and effortlessly. Together, we are building a culture and company from the people up - committed to teamwork, trust and transparency. People power progress.
We're looking for top-tier talent and offer the flexibility you need to thrive and deliver lasting impact. Join us as we digitally connect the world and shape the future.
**The Role**
We are seeking a Senior Data Warehouse/Lakehouse Developer to design, build, and optimize enterprise data solutions. This role combines advanced development expertise with strong analytical skills to translate business requirements into scalable, high-performance data systems. You will work closely with architects, product owners, and scrum teams, provide technical leadership, and ensure best practices in data engineering and testing.
**Location**
This is a work-from-home position available from any US-based location. You must be a US citizen or permanent resident (green card holder) to be considered.
**The Main Responsibilities**
**Design & Development**
+ Develop and maintain ETL/ELT processes for Data Warehouse and Lakehouse environments.
+ Create and optimize complex SQL queries, stored procedures, and data transformations.
+ Build and enhance source-to-target mapping documents.
+ Assist with build and data loading for User Acceptance Testing (UAT).
+ Estimate levels of effort (LOEs) for analysis, design, development, and testing tasks.
**Technical Leadership**
+ Provide technical leadership and mentorship to team members.
+ Collaborate with architects, system engineers, and product owners to understand and detail business/system requirements and logical/physical data models.
+ Participate in and consult on integrated application and regression testing.
+ Conduct training sessions for system operators, programmers, and end users.
**Analytical Expertise**
+ Analyze programming requests to ensure seamless integration with current applications.
+ Perform data analysis and mapping to ensure accuracy and consistency.
+ Generate test plans and test cases for quality assurance.
+ Research and evaluate problems, recommend solutions, and implement decisions.
**Continuous Improvement**
+ Monitor and optimize data pipelines for performance and reliability.
+ Stay current with emerging technologies and recommend improvements to architecture and processes.
+ Adapt to changing priorities and aggressive project timelines while managing multiple complex projects.
**What We Look For in a Candidate**
**Technical Skills**
+ Proficiency in SQL and at least one programming language (Python, Java, Scala).
+ Experience with ETL tools (Informatica, Kafka) and Lakehouse technologies (Azure Data Factory, PySpark).
+ Familiarity with databases (Databricks, Oracle, SQL Server).
+ Knowledge of modeling tools (Visio, ERwin, UML) and data analysis tools (TOAD, Oracle SQL Developer, DBeaver).
+ Strong understanding of data warehousing concepts and Lakehouse architecture.
**Analytical & Problem-Solving**
+ Ability to translate business requirements into technical solutions.
+ Strong troubleshooting and performance tuning skills.
+ Demonstrated organizational, oral, and written communication skills.
**Experience**
+ 6+ years of experience with a Bachelor's degree OR 4+ years with a Master's degree.
+ Proven ability to lead technical teams and manage projects.
+ Experience in applications development and systems analysis.
**Preferred Qualifications**
+ Project management experience.
+ Familiarity with CI/CD pipelines and version control (Git).
+ Exposure to big data frameworks (Spark, Hadoop) and cloud ecosystems (Azure, AWS, GCP).
**Compensation**
This information reflects the anticipated base salary range for this position based on current national data. Minimums and maximums may vary based on location. Individual pay is based on skills, experience and other relevant factors.
Location Based Pay Ranges
$82,969 - $110,625 in these states: AL AR AZ FL GA IA ID IN KS KY LA ME MO MS MT ND NE NM OH OK PA SC SD TN UT VT WI WV WY
$87,117 - $116,156 in these states: CO HI MI MN NC NH NV OR RI
$91,266 - $121,688 in these states: AK CA CT DC DE IL MA MD NJ NY TX VA WA
Lumen offers a comprehensive package featuring a broad range of Health, Life, Voluntary Lifestyle benefits and other perks that enhance your physical, mental, emotional and financial wellbeing. We're able to answer any additional questions you may have about our bonus structure (short-term incentives, long-term incentives and/or sales compensation) as you move through the selection process.
Learn more about Lumen's:
Benefits
Bonus Structure
Requisition #: 340407
**Background Screening**
If you are selected for a position, there will be a background screen, which may include checks for criminal records and/or motor vehicle reports and/or drug screening, depending on the position requirements. For more information on these checks, please refer to the Post Offer section of our FAQ page. Job-related concerns identified during the background screening may disqualify you from the new position or your current role. Background results will be evaluated on a case-by-case basis.
Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
**Equal Employment Opportunities**
We are committed to providing equal employment opportunities to all persons regardless of race, color, ancestry, citizenship, national origin, religion, veteran status, disability, genetic characteristic or information, age, gender, sexual orientation, gender identity, gender expression, marital status, family status, pregnancy, or other legally protected status (collectively, "protected statuses"). We do not tolerate unlawful discrimination in any employment decisions, including recruiting, hiring, compensation, promotion, benefits, discipline, termination, job assignments or training.
**Disclaimer**
The job responsibilities described above indicate the general nature and level of work performed by employees within this classification. It is not intended to include a comprehensive inventory of all duties and responsibilities for this job. Job duties and responsibilities are subject to change based on evolving business needs and conditions.
In any materials you submit, you may redact or remove age-identifying information such as age, date of birth, or dates of school attendance or graduation. You will not be penalized for redacting or removing this information.
Please be advised that Lumen does not require any form of payment from job applicants during the recruitment process. All legitimate job openings will be posted on our official website or communicated through official company email addresses. If you encounter any job offers that request payment in exchange for employment at Lumen, they are not for employment with us, but may relate to another company with a similar name.
Sr Data Engineer (MFT - IBM Sterling)
The Hertz Corporation 4.3
Data engineer job in Columbia, SC
**A Day in the Life:** The IBM Sterling Senior MFT Engineer will have expertise in Sterling B2B Integrator, Sterling File Gateway, Sterling Secure Proxy, and Control Center. This position requires expert-level knowledge of these technologies. You'll provide third-level support for core hardware, software, data, and infrastructure components within the B2B integration team. The software focus areas will include software infrastructure components, continuous improvement, and automation related to middleware and enterprise data movement within a mission-critical, multi-site enterprise environment.
The ideal candidate will have a passion for technology and will possess the ability to create change and facilitate this transformation. They will have experience tailoring software design and developing and implementing solutions in support of data quality and automation. They will need to understand the requirements of a project, test the performance of the infrastructure, and verify that the requirements have been successfully met.
We expect the starting salary to be around $135k but will be commensurate with experience.
**What You'll Do:**
TECHNICAL LEADERSHIP
+ Communication with internal and external business users on Sterling Integrator mappings
+ Making changes to existing partner integrations to meet internal and external requirements
+ Design, develop and implement solutions based on standards and processes that establish consistency across the enterprise data, reduce risks, and promote efficiencies in support of the organizational goals and objectives.
+ Diagnose and troubleshoot complex issues, restore services and perform root cause analysis.
+ Facilitate the review and vetting of these designs with the architecture governance bodies, as required.
+ Be aware of all aspects of security related to the Sterling environment and integrations
INNOVATIVE INFRASTRUCTURE & PROBLEM SOLVING
+ Strive for engineering excellence by simplifying, optimizing, and automating processes and workflows.
TEAMWORK & COMMUNICATION
+ Superior, demonstrated team-building and development skills to harness powerful teams
+ Ability to communicate effectively with different levels of leadership within the organization
+ Provide timely updates so that progress against each individual incident can be tracked as required
+ Write and review high quality technical documentation
CONTROL & AUDIT
+ Ensure that their workstation and all processes and procedures follow organization standards
CONTINUOUS IMPROVEMENT
+ Encourage and maintain a best-practice-sharing culture, always striving to find ways to improve service and change mindsets.
**What We're Looking For:**
+ Bachelor's degree in Engineering, Statistics, Computer Science or other quantitative fields, required
+ 5+ years of IT experience
+ 4+ years' experience working with Sterling B2B Integrator, Sterling File Gateway, PCM (preferred)
+ 3+ years' experience with scripting to enable automation of standard activities (example: Ansible, Python, Bash, Java)
+ Strong interpersonal and communication skills with Agile/Scrum experience.
+ Strong problem solving and critical thinking skills with a proven record for identifying and diagnosing problems, and solving complex problems with simple, logical solutions with the ability to develop custom setups.
+ Outstanding verbal, written, presentation, facilitation, and interaction skills, including ability to effectively communicate technical and non-technical issues and concepts to multiple organization levels.
+ Travel, transportation, or hospitality experience preferred
+ Experience designing application data models for mobile or web applications preferred
+ Excellent written and verbal communication skills.
+ Flexibility in scheduling which may include nights, weekends, and holidays
**What You'll Get:**
+ Up to 40% off the base rate of any standard Hertz Rental
+ Paid Time Off
+ Medical, Dental & Vision plan options
+ Retirement programs, including 401(k) employer matching
+ Paid Parental Leave & Adoption Assistance
+ Employee Assistance Program for employees & family
+ Educational Reimbursement & Discounts
+ Voluntary Insurance Programs - Pet, Legal/Identity Theft, Critical Illness
+ Perks & Discounts -Theme Park Tickets, Gym Discounts & more
The Hertz Corporation operates the Hertz, Dollar Car Rental, Thrifty Car Rental brands in approximately 9,700 corporate and franchisee locations throughout North America, Europe, The Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand. The Hertz Corporation is one of the largest worldwide airport general use vehicle rental companies, and the Hertz brand is one of the most recognized in the world.
**US EEO STATEMENT**
At Hertz, we champion and celebrate a culture of diversity and inclusion. We take affirmative steps to promote employment and advancement opportunities. The endless variety of perspectives, experiences, skills and talents that our employees invest in their work every day represent a significant part of our culture - and our success and reputation as a company.
Individuals are encouraged to apply for positions because of the characteristics that make them unique.
EOE, including disability/veteran
Data Modeler - Banking Industry - Columbia, SC - Remote
Esrhealthcare
Data engineer job in Columbia, SC
We are seeking a Data Modeler for an opportunity with our client in the banking industry in Columbia, SC.
Work arrangement: Hybrid (onsite Tuesday, Wednesday, Thursday); duration: 6 months
Data Modeler responsibilities:
Define and develop data modeling practices, standards, and guidelines
Create and manage conceptual, logical, and physical data models and metadata by working with stakeholders and others
Facilitate requirements gathering sessions with business units to capture data needs and translate them into model specifications
Collaborate with the development teams to ensure the database design meets application requirements
Conduct data analysis, and quality assurance to ensure data integrity and consistency
Implement database schema changes and modifications
Document data models, metadata, data flows, and data dictionaries
Provide technical support and guidance to users
Participate in model reviews
Train developers and others in modeling
Required knowledge, skills, and abilities:
Bachelor's or Master's degree in Computer Science, Information Systems or a related field
4+ years of experience in data modeling
Experience with data modeling tools like Erwin or ER/Studio
Experience with SQL
Experience with databases such as Oracle, SQL Server, or MySQL
Knowledge of data warehousing and business intelligence concepts
Knowledge of cloud-based data storage and processing platforms such as AWS (S3, Redshift), Snowflake, Databricks
Excellent problem-solving skills and attention to detail
Strong communication and interpersonal skills
Ability to work independently and in a team environment
About the client:
This is a consulting opportunity with one of the largest financial institutions in South Carolina. Our client is a growing lending institution with over $39 billion in assets. They place great value on their employees and consultants and offer a smoke-free work environment and business casual dress. Enjoy working in Columbia's revitalized downtown community, surrounded by institutions rich in culture, eclectic restaurants and retail stores. Interested? Learn more:
Click the apply button or contact our recruiter Carolyn to learn more about this position (#25-00732).
Authorized US Worker - US Citizens and those authorized to work in the US are encouraged to apply. We are unable to sponsor at this time. EOE/AA/V/D
DPP offers a range of compensation and benefits packages to our employees and their eligible dependents. Call today to learn more about working with DPP.
Remote
Data Engineer (Python, Java)
Nexonit
Data engineer job in Columbia, SC
Data Engineer (Python, Java) - Onsite - Columbia, SC
Authorized to work in the US
No H1B
Government Experience: Preferred
Overall experience: 11+ years
The South Carolina Department of Health & Human Services (SCDHHS) is the State Medicaid Agency for South Carolina. This agency is focused on the modernization of the State's Medicaid Management Information System (MMIS). The modernization is a major undertaking for SCDHHS and requires a major transformation of culture, processes, and technology. The modernization program supports the Department's transition from primarily a fee-for-service payor of claims toward a program and policy driver for health outcomes, primarily through managed care programs. The program's strategy supports significant innovation in MMIS thinking and mindset and is aligned with MMIS innovation at the national level as well.
Scope of the project:
The Data Operations Hub (DOH) serves as the centralized platform and team responsible for ingesting, validating, transforming, securing, and provisioning all enterprise data to support analytics, reporting, operational efficiency, and regulatory compliance.
Objectives to Be Fulfilled by Candidate:
As a Software Developer (Java) Consultant at South Carolina Department of Health and Human Services (SCDHHS), you will be a key member of the Data Operations Hub Team. Your role involves designing, developing, and implementing analytical data solutions that align with the Agency's strategic goals. You will leverage your expertise in data architecture, data modeling, data migrations and data integration, collaborating with cross-functional teams to achieve target state architecture goals.
Job Responsibilities
Represent the Data Operations Hub team in various forums, advising on Data Solutions.
Lead the design and maintenance of scalable data solutions, including data lakes and warehouses.
Collaborate with cross-functional teams to ensure data product solutions support business needs and enable data-driven decision-making.
Evaluate and select data technologies, driving the adoption of emerging technologies.
Develop architectural models, using Sparx or another available data modeling tool, and other artifacts to support data initiatives.
Serve as a subject matter expert in specific areas.
Apply modern data processing technologies such as Python, Java, and Airflow within data mesh and data lake architectures (a minimal Airflow sketch follows this list).
Contribute to the data engineering community and advocate for agency data practices.
Engage in hands-on coding and design to implement production solutions.
Optimize system performance by resolving inefficiencies.
Design reusable data frameworks using new technologies.
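As referenced in the responsibilities above, the sketch below shows a minimal Airflow DAG of the kind this role would maintain. It is an illustrative placeholder, not SCDHHS's actual pipeline: the DAG id, schedule, and ingest/validate callables are assumptions.

```python
# A minimal, hypothetical Airflow DAG: ingest a daily extract, then validate it.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Placeholder: pull a daily extract from an upstream source.
    print("ingesting daily extract")


def validate():
    # Placeholder: run row-count and null checks before provisioning the data.
    print("validating extract")


with DAG(
    dag_id="doh_daily_ingest_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task
```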
Required Skills (rank in order of Importance):
Experience as a Data Engineer or in a similar role, leading technologists to manage, anticipate, and solve complex technical issues within your domain of expertise.
Hands-on experience in system design, application development, and operational stability.
Expertise in architecture disciplines and programming languages.
Deep knowledge of data architecture, modeling, integration, cloud data services, data domain-driven design, best practices, and industry trends in data engineering.
Practical experience with AWS/Azure services and data engineering disciplines.
Advanced experience in one or more data engineering disciplines, e.g., ETL/ELT, event processing.
Proficiency in SQL, data warehousing solutions, and cloud-native relational databases, e.g., Snowflake, Athena, Postgres.
Strong problem-solving, communication, and interpersonal skills.
Ability to evaluate and recommend technologies for future state architecture.
Preferred Skills (rank in order of Importance):
Business architecture knowledge and experience with architecture assessment frameworks.
Required Education:
Bachelor's or Master's degree in Computer Science or related field.
Required Certifications:
NA
$75k-101k yearly est. 60d+ ago
Data Engineer
Cognizant 4.6
Data engineer job in Columbia, SC
We are seeking a motivated Data Engineer to join our Data Engineering team. The ideal candidate will have exposure to Python and SQL and will be responsible for designing, developing, and maintaining robust data pipelines for structured, semi-structured, and unstructured data. This role is ideal for someone passionate about building scalable data solutions and enabling advanced analytics across the organization.
**Key Responsibilities**
+ Design, develop, and optimize data pipelines using Python, Spark, and SQL.
+ Ingest, process, and analyze structured (e.g., relational databases), semi-structured (e.g., JSON, XML), and unstructured data (e.g., text, logs, images) from diverse sources.
+ Implement data quality checks, validation, and transformation logic to ensure data integrity and reliability (a minimal sketch of such a check follows this list).
+ Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
+ Develop and maintain data models, data dictionaries, and technical documentation.
+ Monitor, troubleshoot, and optimize data workflows for performance and scalability.
+ Ensure compliance with data governance, security, and privacy policies.
+ Support data migration, integration, and modernization initiatives, including cloud-based solutions (AWS, Azure, GCP).
+ Automate repetitive data engineering tasks and contribute to continuous improvement of data infrastructure.
+ Knowledge of any of the cloud platforms (AWS, Azure, GCP).
+ Understanding of data security, encryption, and compliance best practices.
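The sketch below illustrates the kind of data-quality step described in the responsibilities above, using PySpark (one of the tools named in the posting). The input path, column names, and thresholds are hypothetical placeholders, not an actual Cognizant pipeline.

```python
# A minimal, hypothetical data-quality check over a batch of ingested records.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("quality-check-sketch").getOrCreate()

# Hypothetical input path and columns; substitute real sources and schema.
claims = spark.read.parquet("s3://example-bucket/claims/")

# Basic checks: null keys, duplicate IDs, out-of-range amounts.
null_keys = claims.filter(F.col("claim_id").isNull()).count()
dupes = claims.groupBy("claim_id").count().filter("count > 1").count()
bad_amounts = claims.filter(
    (F.col("amount") < 0) | (F.col("amount") > 1_000_000)
).count()

assert null_keys == 0, f"{null_keys} rows missing claim_id"
assert dupes == 0, f"{dupes} duplicated claim_id values"
print(f"rows failing amount range check: {bad_amounts}")
```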
**Required Skills & Qualifications**
+ Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field
+ Good programming skills in Python and SQL
+ Good problem-solving, analytical, and communication skills
+ Ability to work collaboratively in a fast-paced environment
+ Familiarity with data orchestration tools (e.g., Airflow, Prefect)
+ Exposure to data lake and data warehouse architectures (e.g., Snowflake, Databricks, BigQuery)
+ Knowledge of containerization and CI/CD pipelines (e.g., Docker, Kubernetes, GitHub)
+ Familiarity with data visualization tools
+ Basic understanding of ETL/ELT concepts, data modeling, and data architecture
**Location**
New hires will be hired at the Cognizant office in **Plano, TX or Teaneck, NJ,** where you will work alongside other experienced Cognizant associates delivering technology solutions. Applicants must be willing to relocate to this major geographic area. While we attempt to honor candidate location preferences, business needs and position availability will determine final location assignment.
**Start Date**
New hires will start in **January or February 2026** . While we will attempt to honor candidate start date preferences, business need and position availability will determine final start date assignment. Exact start date will be communicated with enough time for you to plan effectively.
**Salary and Other Compensation:**
Applications are accepted on an ongoing basis.
The annual salary for this position is $65,000.00 depending on experience and other qualifications of the successful candidate. This position is also eligible for Cognizant's discretionary annual incentive program, based on performance and subject to the terms of Cognizant's applicable plans.
**Why Choose Us?**
Cognizant delivers solutions that draw upon the full power and scale of our associates. You will be supported by high-caliber experts and employ some of the most advanced and patented capabilities. Our associates' diverse backgrounds offer multifaceted perspectives and fuel new ways of thinking. We encourage lively discussions which inspire better results for our clients.
**Benefits**
Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
+ Medical/Dental/Vision/Life Insurance
+ Paid holidays plus Paid Time Off
+ 401(k) plan and contributions
+ Long-term/Short-term Disability
+ Paid Parental Leave
+ Employee Stock Purchase Plan
**Disclaimer**
The hourly rate, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
**Work Authorization**
Due to the nature of this position, Cognizant cannot provide sponsorship for U.S. work authorization (including participation in a CPT/OPT program) for this role.
_Cognizant is always looking for top talent. We are searching for candidates to fill future needs within the business. This job posting represents potential future employment opportunities with Cognizant. Although the position is not currently available, we want to provide you with the opportunity to express your interest in future employment opportunities with Cognizant. If a job opportunity that you may be qualified for becomes available in the future, we will notify you. At that time you can determine whether you would like to apply for the specific open position. Thank you for your interest in Cognizant career opportunities._
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
$65k yearly 12d ago
Sr Hadoop Developer
Jobsbridge
Data engineer job in Columbia, SC
Jobs Bridge Inc is among the fastest growing IT staffing / professional services organization with its own job portal. Jobs Bridge works extremely closely with a large number of IT organizations in the most in-demand technology skill sets.
Job Description
Skills: Hadoop, Pig, Hive, MapReduce, ETL with data warehouse, BI
Location: Columbia, SC
Total Experience: 5 yrs.
Max Salary: Not mentioned
Employment Type: Direct Jobs (Full Time)
Domain: Any
Description
Green Card holders can apply
• Data Modeling: deep expertise with modeling databases for enterprise-grade solutions, ideally, analytics solutions.
• Coding: Build highly performant and scalable enterprise-grade ETL processes for populating analytics Hadoop- and Oracle-based data warehouses. Work in Agile teams on Linux development environments.
• Data setup: Bring together data and create views of data sets stored in the Hadoop-based big data platform using Hive, Pig, Sqoop, and Oozie (a minimal sketch follows this list).
• Testing: Assist quality assurance testing teams. Where required, develop and conduct unit tests, develop system test data and perform system tests.
• Documentation: Develop program specifications and flowcharts, (dataflows, jobflows, etc.), for stand-alone products or systems. Prepare concise internal program documentation on product development and revisions. Prepare user guides and operational instruction manuals.
• Communication: Convey problems, solutions, updates and project status to peers, customers and management. Develop and maintain program, systems and user documentation.
• Planning: Prepare time estimates for assigned tasks. Attend post-implementation reviews.
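As referenced in the "Data setup" bullet above, the sketch below shows one way to expose a reusable analytics view over Hadoop-resident data. PySpark with Hive support is used here purely for illustration; the database, table, and view names are hypothetical, and the role's actual tooling is Hive, Pig, Sqoop, and Oozie.

```python
# A minimal, hypothetical sketch: join raw tables in a Hive metastore and
# publish an analytics-friendly view.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-view-sketch")
    .enableHiveSupport()   # requires a cluster configured with a Hive metastore
    .getOrCreate()
)

# Combine claim headers and member records into a single reporting view.
spark.sql("""
    CREATE OR REPLACE VIEW analytics.claims_enriched AS
    SELECT c.claim_id, c.service_date, c.amount, m.member_id, m.plan_code
    FROM   raw.claims  c
    JOIN   raw.members m ON m.member_id = c.member_id
""")
```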
Qualifications:
• Bachelor's/Master's degree in computer science, math, statistics, or a related discipline preferred, with 3+ years of database development/data mining experience, OR demonstrated ability to meet job requirements through a comparable number of years of technical work experience.
• Excellent understanding of Big Data Analytics platforms; Hands-on experience with the following technologies: Hadoop, HBase, Pig, SQL
• Solid experience as an RDBMS developer, ideally on Oracle, with stored procedures, query performance tuning and ETL.
• Solid experience with scripting language such as Python, Shell
• Extremely comfortable developing on Linux servers
• Some experience with web development stacks, analytics, NoSQL data stores, data modeling, analytical tools and libraries
• Solid understanding of Data Warehousing concepts and technologies.
• Strong foundational knowledge and experience with distributed systems and computing systems in general
• Experience working in Agile teams.
• Experience in healthcare will be a big plus.
Additional Information
Multiple Openings for GC/Citizen
$69k-89k yearly est. 1d ago
Lead Data Engineer
Omega Solutions 4.1
Data engineer job in Columbia, SC
San Francisco, CA (remote until COVID)
Mandatory Skills: (Oracle or PostgreSQL) and ETL Pipelines and Big Data and AWS
Responsibilities
· Use structured tools for analysis and presentation of concepts and models to enhance the BRD
· Develop, maintain and deliver training materials to the supply chain end-users
· Work collaboratively with external consultants, internal & external resources throughout the project lifecycle to ensure system modifications meet business needs
· Support day to day reporting needs where required
· Support production issues as relate to application functionality and integrations
Qualifications:
· Excellent spoken and written communication skills (verbal and non-verbal)
· Proven experience in managing data warehouses and ETL pipelines (Min. 2 years)
· Solid scripting capability for analysis and reporting (strong PL/SQL and expert SQL), with a performance mindset and functional precision
· Expert SQL development skills with the ability to write complex, efficient queries for data integration (a minimal sketch follows this list)
· Strong analytical skills to support BAs and ability to translate user stories / work closely with the tech team
· Strong problem-solving skills (Math skills required for data modeling)
· Ability to work as a back-end developer (PL/SQL)
· Ability to manage and complete multiple tasks within tight deadlines
· Possess expert level understanding of software development practices and project life cycles.
· Working experience with Java, cloud-native technologies
· Must have: Working experience in dealing with big data and data manipulation.
· Highly Desired: Familiarity with Oracle retail data structures (Retail Management System / Retail Planning Application System)
· Desired: Familiarity with DevOps practices such as CI/CD pipelines
· Desired: Retail experience is a plus. (fashion retail experience would be ideal)
· Desired: Working experience with cloud platforms namely AWS
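As referenced in the SQL qualification above, the sketch below illustrates a common data-integration pattern (an idempotent upsert) using PostgreSQL and psycopg2. The connection string, table, and columns are hypothetical placeholders; an Oracle/PL-SQL environment would typically use a MERGE statement instead.

```python
# A minimal, hypothetical upsert of daily sales rows into a reporting table.
import psycopg2

conn = psycopg2.connect("dbname=retail user=etl password=secret host=localhost")

UPSERT_SQL = """
    INSERT INTO item_sales_daily (item_id, sale_date, units, revenue)
    VALUES (%s, %s, %s, %s)
    ON CONFLICT (item_id, sale_date)
    DO UPDATE SET units = EXCLUDED.units, revenue = EXCLUDED.revenue
"""

rows = [
    (1001, "2024-06-01", 12, 359.88),
    (1002, "2024-06-01", 4, 79.96),
]

with conn, conn.cursor() as cur:   # commits on success, rolls back on error
    cur.executemany(UPSERT_SQL, rows)

conn.close()
```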
$85k-122k yearly est. 60d+ ago
Data Architect
Ask It Consulting
Data engineer job in Columbia, SC
Responsible for the overall design of the enterprise-wide data/information architecture, which maps to the enterprise architecture and balances the need for access against security and performance requirements. Knowledgeable in most aspects of designing and constructing data architectures, operational data stores, and data marts. Focuses on enterprise-wide data modeling and database design. Defines data/information architecture standards, policies and procedures for the organization, structure, attributes and nomenclature of data elements, and applies accepted data content standards to technology projects. Facilitates consistent business analysis, data acquisition and access analysis and design, Database Management Systems optimization, archiving and recovery strategy, load strategy design and implementation, security and change management at the enterprise level. May require a bachelor's or master's degree in a related area and strong experience in the field or in a related area.
Qualifications
Data stores, data marts, data modeling, database design, SharePoint software development, SQL databases, .NET programming, Nintex Workflow Designer
Additional Information
All your information will be kept confidential according to EEO guidelines.
$82k-112k yearly est. 60d+ ago
Mill Application Programmer Analyst - IT
Sylvamo
Data engineer job in Eastover, SC
Salary Range: $75,800-$90,000 (depending on experience)
Purpose: As a member of a mill-based Global Business IT team, you will be part of a team of professionals supporting a 24x7 manufacturing operation. This is an onsite role at a manufacturing facility.
As the Mill Application Programmer Analyst, you will support all aspects of a portfolio of applications and related infrastructure critical to the day-to-day operations of the mill impacting Safety, Environmental Compliance, Product Quality and Production. You will work with business partners and other IT team members to positively impact systems reliability and innovation.
Key Accountabilities:
Support and maintain an existing portfolio of Mill based applications/solutions supporting Manufacturing, Quality and Environmental operations.
Drive innovation by leveraging emerging technologies and creative solutions to enhance business value.
Lead initiatives that advance operational capabilities through research, change management, user education, and seamless implementation.
Contribute to the success of new technologies including Manufacturing Execution System solutions and other mill applications.
Work individually or as a member of a cross functional team implementing system upgrades up to and including full solution replacement.
Provide direct end user support and training in the use of existing solutions supporting Mill operations.
Maintain existing and develop new integrations between a variety of IT and OT systems including data historians, databases, Manufacturing Execution, Quality and Enterprise Resource Systems.
Administer and maintain Aveva PI Data Historian system as well as related Analytics (PI Asset Framework) and visualization (PI Vision, ParcView) solutions.
Troubleshoot and support existing reporting systems including the maintenance of existing reporting solutions as well as the development of new reports on a variety of platforms including Crystal Report and Power BI.
Develop and maintain T-SQL queries in support of reporting and integration (a minimal sketch follows this list).
Maintain and develop in house solutions developed using a variety of platforms/languages including Power Apps, PowerShell, Visual Basic, and other scripting languages.
Work independently, with business partners and/or vendors/contractors as needed to scope and develop innovative business solutions.
SharePoint Online Administration and Development.
Create and maintain system documentation.
Communicate and collaborate closely across business and technical communities, bridge gaps and drive value.
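As referenced in the T-SQL accountability above, the sketch below shows a reporting-style query run from Python via pyodbc. The server, database, table, and columns are hypothetical placeholders, not the mill's actual systems.

```python
# A minimal, hypothetical daily quality report pulled from SQL Server.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mill-sql.example.local;DATABASE=Quality;Trusted_Connection=yes;"
)

# Summarize off-spec readings per machine per day.
SQL = """
    SELECT machine_id,
           CAST(sample_time AS date) AS sample_day,
           COUNT(*)                  AS off_spec_count
    FROM   dbo.QualityReadings
    WHERE  in_spec = 0
    GROUP  BY machine_id, CAST(sample_time AS date)
    ORDER  BY sample_day, machine_id
"""

cursor = conn.cursor()
for machine_id, sample_day, off_spec_count in cursor.execute(SQL):
    print(machine_id, sample_day, off_spec_count)

conn.close()
```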
Knowledge and Experience:
Bachelor's degree in computer science, information systems, information technology, software engineering or equivalent field of study.
5+ years' experience developing, integrating and supporting applications, database and server infrastructure preferably in a manufacturing/industrial environment. Experience in the pulp and paper industry is preferred, but not essential.
Experience working with manufacturing historian, analytic and visualization systems is preferred.
Capable of working independently with minimal supervision or direction
Clear ability to manage change and coordinate multiple projects and deadlines.
Good communication skills (verbal and written). Must be an effective listener.
Strong interpersonal skills.
Demonstrated ability to work with cross-functional teams.
Experience with Microsoft Office business productivity applications.
Competencies
Safe and Well
Inclusive and Collaborative
Customer Focused
Trustworthy
Team-Oriented
Entrepreneurial Spirit
Agile
Operationally Excellent
$75.8k-90k yearly 60d+ ago
Data Scientist
Credence 3.7
Data engineer job in Sumter, SC
Job DescriptionOverview
At Credence, we support our clients' mission-critical needs, powered by technology. We provide cutting-edge solutions, including AI/ML, enterprise modernization, and advanced intelligence capabilities, to the largest defense and health federal organizations. Through partnership and trust, we increase mission success for war-fighters and secure our nation for a better future.
We are privately held, are repeatedly recognized as a top place to work, and have been on the Inc. 5000 Fastest Growing Private Companies list for the last 12 years. We practice servant leadership and believe that by focusing on the success of our clients, team members, and partners, we all achieve greater success.
Credence has an upcoming need for a Data Scientist supporting AFCENT A2 at Shaw AFB, SC 29152.
Responsibilities include, but are not limited to the duties listed below:
Provide expert analytic support through the development and publishing of threat operating areas and procedures to directly support USCENTCOM directed planning efforts. These products may include, but are not limited to, oral briefings and discussions, PowerPoint presentations, formal messages, background papers, collection requirements, production requirements, RFI answers, and staff summary packages for general officers and intelligence community SMEs.
Provide critical oversight of training on predictive and forensic analysis of S2S/WMD programs throughout the Middle East. Expertise shall center on Theater Ballistic Missile (TBM), cruise missile, and Long Range Rocket (LRR) operations; sub-categories include operational command and control; battle management; S2S system and subsystem capabilities, inventory, and disposition; tactics, techniques, and procedures; and denial and deception.
Provide expert analytic support to 9AF (AFCENT) planning by authoring and reviewing prioritized intelligence, collection, and production requirements, and integration of new TTPs to advance planning efforts. Member will serve as the adversary Airpower authority on 9AF (AFCENT)'s Red Team, during war-gaming and exercises, and in support of 9AF (AFCENT)'s adaptive planning processes.
As directed, leverage all formal and informal analytic exchanges, as well as training opportunities, to foster peer-to-peer collaboration, ensure thorough vetting of products, and deliberately pursue/maintain acuity on essential elements of tradecraft.
Maintain coordination/collaboration, critical to mission success, with: the National Command Authority; National, International, and Service Intelligence Centers; Combatant Commands, Air Components, and Wing Intelligence.
As a member of a focused analytic team, work with all available sources to develop new methods to support USAFCENT F2T development efforts. Member will be responsible for leveraging advanced analytic techniques to advance understanding of adversary TTPs to directly support USAFCENT operational priorities while balancing Intelligence Community requirements.
Effectively operating as a member of an analytical team in support of CJOA requirements
Travel to CJOA and/or within CONUS in order to provide/receive training
Briefing skills to include the ability to clearly articulate information
The following is required: Alternate Compensatory Control Measures (ACCM) and access to Special Access Programs (SAP) information.
Requirements
TS/SCI
Bachelor's degree or eight (8) years' USAF experience.
Six (6) years' comparable experience.
Formal Intelligence Analysis training.
Experience analyzing and briefing adversary surface-to-surface missile capabilities.
Experience with structured analytic techniques to develop new solutions for analytic gaps
Working knowledge of US Air Force Collection, Targeting, and Planning Processes
Working knowledge of USCENTCOM AOR, political/military threats, understanding of violent extremist organizations, terrorist organizations; working knowledge of intelligence analysis and TTPs.
Proficient in utilizing standard computer applications and intelligence related automation to support analytical efforts and product development
Advanced Analytic training: preferred.
Experience working with structured data and data scientists to develop analytic solutions to complex intelligence problems: preferred
$65k-90k yearly est. 30d ago
Data Architect Consultant
Intermountain Health 3.9
Data engineer job in Columbia, SC
We're looking for a technical, highly collaborative Data Architect - Consultant who can bridge strategy, engineering, and delivery. This role is responsible for defining the problem to solve, shaping solution approaches, and leading projects end‑to‑end with strong facilitation and execution skills. You'll play a key role in maturing our code review process, establishing data architecture best practices, and elevating the quality and consistency of our delivery. The ideal candidate combines hands-on technical fluency with exceptional communication, enabling them to partner closely with engineers, guide architectural decisions, and drive continuous improvement across teams.
**Essential Functions**
Lead the design, implementation, and management of complex data infrastructures across multiple clinical and enterprise projects.
Apply deep expertise in relational databases, cloud technologies, and big data tools to deliver scalable, efficient, and secure data solutions.
Manage multiple complex projects simultaneously, ensuring alignment with clinical program objectives and organizational priorities.
Provide leadership and direction on enterprise-level data integration strategies.
Mentor junior and senior team members, fostering technical growth through structured guidance and code reviews.
Collaborate with cross-functional teams and stakeholders to ensure high-quality data architectures and pipelines across cloud and on-premise systems.
**Skills**
+ Leadership & Project Management - Ability to lead teams and manage multiple complex initiatives concurrently.
+ Mentorship & Code Review - Skilled in guiding junior team members and enforcing best practices through structured reviews.
+ Collaboration & Stakeholder Management - Strong interpersonal skills to work effectively with clinical program leaders and technical teams.
+ Data Architecture & Design - Expertise in designing scalable and secure data solutions.
+ Cloud Infrastructure & Data Solutions - Proficiency in AWS, Azure, or similar platforms.
+ ETL/ELT Development & Data Integration - Building and optimizing data pipelines.
+ Database & Performance Optimization - Ensuring high availability and efficiency.
+ Data Modeling & Documentation - Creating clear, maintainable models and technical documentation.
+ Data Governance & Security - Implementing compliance and security best practices.
+ Coding/Programming - Strong programming skills for data engineering and architecture.
Minimum Qualifications:
+ Expert proficiency with SQL and extensive experience with traditional RDBMS (e.g., Oracle, SQL Server, PostgreSQL).
+ Extensive experience with cloud platforms such as AWS, Azure, or Google Cloud Platform for data architecture and storage solutions.
+ Mastery of programming languages such as Python and PySpark for data engineering tasks.
+ In-depth knowledge of ETL/ELT processes and tools, including both traditional (e.g., SSIS, Informatica) and cloud-native solutions (e.g., Azure Data Factory, Databricks).
+ Outstanding communication skills for collaborating with stakeholders and teams.
+ Expert understanding of Product Management, Project Management, or Program Management philosophies and methodologies, and capable of applying them to data architecture projects to ensure alignment with business goals and efficient execution.
+ Demonstrated ability to stay updated on industry trends and advancements.
+ Proven experience in providing mentorship and guidance to junior and senior architects.
Preferred Qualifications:
+ A Master's degree in an analytics-related field such as information systems, data science/analytics, statistics, computer science, or mathematics, plus 4 years of experience; or
+ 8 years of professional experience in an analytics role in an analytics-related field such as statistics, mathematics, information systems, computer science, or data science/analytics; or
+ A Bachelor's degree in an analytics-related field such as information systems, data science/analytics, statistics, computer science, or mathematics, plus 6 years of experience
+ Experience with Databricks, Apache Spark, and Delta Lake for real-time and batch data processing.
+ Proficiency in data streaming technologies such as Kafka, AWS Kinesis, or Azure Event Hubs (a minimal consumer sketch follows this list).
+ Experience working with APIs to retrieve and integrate data from external systems.
+ Experience developing APIs to provide data as a product.
+ Familiarity with CI/CD pipelines for data engineering workflows.
+ Knowledge of data governance frameworks and compliance standards (e.g., GDPR, HIPAA).
+ Experience in a healthcare environment
+ Familiarity with business intelligence tools such as Tableau, Power BI, or Looker for delivering insights from data architectures
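As referenced in the streaming qualification above, the sketch below shows a minimal Kafka consumer using the kafka-python client. The topic name, brokers, and message fields are hypothetical assumptions; Kinesis or Event Hubs would use their own SDKs.

```python
# A minimal, hypothetical consumer that reads JSON events from a Kafka topic.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "adt-events",                              # hypothetical clinical event topic
    bootstrap_servers=["broker1:9092"],
    group_id="architecture-sketch",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Placeholder handling: route or land the event in the lake/warehouse here.
    print(event.get("patient_id"), event.get("event_type"))
```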
Remain sitting or standing for long periods of time to perform work on a computer, telephone, or other equipment.
**Location:**
Lake Park Building
**Work City:**
West Valley City
**Work State:**
Utah
**Scheduled Weekly Hours:**
40
The hourly range for this position is listed below. Actual hourly rate dependent upon experience.
$60.06 - $94.57
We care about your well-being - mind, body, and spirit - which is why we provide our caregivers a generous benefits package that covers a wide range of programs to foster a sustainable culture of wellness that encompasses living healthy, happy, secure, connected, and engaged.
Learn more about our comprehensive benefits package here.
Intermountain Health is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
At Intermountain Health, we use the artificial intelligence ("AI") platform, HiredScore to improve your job application experience. HiredScore helps match your skills and experiences to the best jobs for you. While HiredScore assists in reviewing applications, all final decisions are made by Intermountain personnel to ensure fairness. We protect your privacy and follow strict data protection rules. Your information is safe and used only for recruitment. Thank you for considering a career with us and experiencing our AI-enhanced recruitment process.
All positions subject to close without notice.
$76k-101k yearly est. 5d ago
Data Scientist, Product Analytics
Meta 4.8
Data engineer job in Columbia, SC
As a Data Scientist at Meta, you will shape the future of people-facing and business-facing products we build across our entire family of applications (Facebook, Instagram, Messenger, WhatsApp, Oculus). By applying your technical skills, analytical mindset, and product intuition to one of the richest data sets in the world, you will help define the experiences we build for billions of people and hundreds of millions of businesses around the world. You will collaborate on a wide array of product and business problems with a wide range of cross-functional partners across Product, Engineering, Research, Data Engineering, Marketing, Sales, Finance, and others. You will use data and analysis to identify and solve product development's biggest challenges. You will influence product strategy and investment decisions with data, be focused on impact, and collaborate with other teams. By joining Meta, you will become part of a world-class analytics community dedicated to skill development and career growth in analytics and beyond.
Product leadership: You will use data to shape product development, quantify new opportunities, identify upcoming challenges, and ensure the products we build bring value to people, businesses, and Meta. You will help your partner teams prioritize what to build, set goals, and understand their product's ecosystem.
Analytics: You will guide teams using data and insights. You will focus on developing hypotheses and employ a varied toolkit of rigorous analytical approaches, different methodologies, frameworks, and technical approaches to test them.
Communication and influence: You won't simply present data, but tell data-driven stories. You will convince and influence your partners using clear insights and recommendations. You will build credibility through structure and clarity, and be a trusted strategic partner.
**Required Skills:**
Data Scientist, Product Analytics Responsibilities:
1. Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches
2. Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop strategies for our products that serve billions of people and hundreds of millions of businesses (a minimal experiment-analysis sketch follows this list)
3. Identify and measure success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends
4. Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations
5. Partner with Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions
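As referenced in responsibility 2 above, the sketch below shows a minimal experiment analysis: a Welch's t-test comparing a metric between control and treatment groups. The data is simulated for illustration only and does not represent Meta's experimentation tooling or methods.

```python
# A minimal, hypothetical A/B test readout on a simulated per-user metric.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)

# Simulated per-user engagement metric for control and treatment.
control = rng.normal(loc=5.0, scale=2.0, size=10_000)
treatment = rng.normal(loc=5.1, scale=2.0, size=10_000)

t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)  # Welch's t-test
lift = treatment.mean() / control.mean() - 1

print(f"relative lift: {lift:.2%}, p-value: {p_value:.4f}")
```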
**Minimum Qualifications:**
Minimum Qualifications:
6. Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience
7. Bachelor's degree in Mathematics, Statistics, a relevant technical field, or equivalent
8. 4+ years of work experience in analytics, data querying languages such as SQL, scripting languages such as Python, and/or statistical/mathematical software such as R (minimum of 2 years with a Ph.D.)
9. 4+ years of experience solving analytical problems using quantitative approaches, understanding ecosystems, user behaviors, and long-term product trends, and leading data-driven projects from definition to execution (including defining metrics, experiment design, and communicating actionable insights)
**Preferred Qualifications:**
Preferred Qualifications:
10. Master's or Ph.D. Degree in a quantitative field
**Public Compensation:**
$147,000/year to $208,000/year + bonus + equity + benefits
**Industry:** Internet
**Equal Opportunity:**
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
$147k-208k yearly 60d+ ago
Data Scientist I
Unum Group 4.4
Data engineer job in Columbia, SC
When you join the team at Unum, you become part of an organization committed to helping you thrive. Here, we work to provide the employee benefits and service solutions that enable employees at our client companies to thrive throughout life's moments. And this starts with ensuring that every one of our team members enjoys opportunities to succeed both professionally and personally. To enable this, we provide:
+ Award-winning culture
+ Inclusion and diversity as a priority
+ Performance Based Incentive Plans
+ Competitive benefits package that includes: Health, Vision, Dental, Short & Long-Term Disability
+ Generous PTO (including paid time to volunteer!)
+ Up to 9.5% 401(k) employer contribution
+ Mental health support
+ Career advancement opportunities
+ Student loan repayment options
+ Tuition reimbursement
+ Flexible work environments
**_*All the benefits listed above are subject to the terms of their individual Plans_** **.**
And that's just the beginning...
With 10,000 employees helping more than 39 million people worldwide, every role at Unum is meaningful and impacts the lives of our customers. Whether you're directly supporting a growing family, or developing online tools to help navigate a difficult loss, customers are counting on the combined talents of our entire team. Help us help others, and join Team Unum today!
**General Summary:**
This position is for a developing data scientist who is excited to transform data into actionable insights and impact the business through their work. The role requires increasing technical expertise in the fields of computer programming, applied statistics, and data manipulation, and relies on developing business knowledge. The individual will participate in project work primarily within the functional area, with direction and review by their manager. The individual is expected to continuously increase business knowledge and take initiative in identifying and executing analytical approaches to support assigned projects.
**Job Specifications**
+ Bachelor's degree in quantitative field is preferred
+ 2 years preferred of professional experience or equivalent relevant work experience
+ **Core Data Science Capabilities:** Deep expertise in at least one of the following skillsets preferred, with basic capability in the others:
+ Programming & Process Automation: Experience using APIs, file I/O, database, and analysis libraries. Understanding of programming in Jupyter notebooks and/or statistical packages. Understanding of process mapping and demonstrated application of scripting languages to automate processes. Exposure to data mining and web scraping.
+ Data Visualization: Working knowledge of two or more data visualization tools and proficiency in static data visualization. Basic understanding of dynamic data visualization
+ Statistics & Statistical modeling: Solid understanding of statistical inference and regression. Basic understanding of machine learning techniques including random forests and neural nets. Basic understanding of feature selection and extraction which may include analysis of categorical data, multiple comparisons, Central Limit Theorem, bootstrapping, and permutation tests
+ **Data Extraction, Transformation, and Loading** (ETL) skillsets preferred: Expertise in writing complex SQL queries that join multiple tables/databases. Given a business problem, independently explore databases/tables to identify best data sources. Demonstrates ability to troubleshoot complex SQL queries written by others with little guidance.
+ **Core business capabilities** : basic communication skills, exposure to financial services industry, and attention to detail and ability to prioritize while working on multiple projects keeping track of due dates
+ **Leadership Capabilities:** Ability to coach or mentor team members, respond quickly and positively to change, and adhere to technical best practices by collaborating with team members to deliver high-quality output
+ **Preferred characteristics** : Entrepreneurial self-starter. A thorough, results-oriented problem-solver, and a lifelong learner with voracious curiosity. Basic understanding of their organization.
**Principal Duties and Responsibilities**
- Design and execute analytical solutions using statistical, optimization, simulation and data mining methods with a focus on delivering actionable insights and partnership to deliver business value.
- Integrate large volume of data from different sources (including DB2, SQL Server, Web API and Teradata) to create data assets and perform analyses
- Apply validation, aggregation and reconciliation techniques to create rich modeling-ready data framework.
- Construct predictive models using machine learning to explain and understand observed events, forecast expected behavior, or identify risk through scoring or clustering (a minimal sketch follows this list).
- Familiarity with AI assistants such as GitHub Copilot and Microsoft Copilot to speed up code development and self-learning.
- Knowledge of working with LLMs and agent libraries such as LangGraph, LangChain, and the OpenAI SDK.
- Efficiently interpret results and communicate findings and potential value to manager.
- Support integration of solutions within existing business processes using automation techniques.
- Understand theory and application of current and emerging statistical methods and tools.
- Knowledge of tools: Python, VS Code, Databricks, Azure, AWS, Streamlit, Tableau, SQL
- Perform other related duties as assigned
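As referenced in the predictive-modeling duty above, the sketch below trains and scores a simple random forest with scikit-learn. The features and target are simulated placeholders; a production model would use governed data sources and the team's standard tooling.

```python
# A minimal, hypothetical risk-scoring model evaluated on a held-out set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=42)
X = rng.normal(size=(5_000, 8))                       # stand-in feature matrix
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=5_000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]            # risk scores for the holdout
print(f"holdout AUC: {roc_auc_score(y_test, scores):.3f}")
```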
Unum and Colonial Life are part of Unum Group, a Fortune 500 company and leading provider of employee benefits to companies worldwide. Headquartered in Chattanooga, TN, with international offices in Ireland, Poland and the UK, Unum also has significant operations in Portland, ME, and Baton Rouge, LA - plus over 35 US field offices. Colonial Life is headquartered in Columbia, SC, with over 40 field offices nationwide.
Unum is an equal opportunity employer, considering all qualified applicants and employees for hiring, placement, and advancement, without regard to a person's race, color, religion, national origin, age, genetic information, military status, gender, sexual orientation, gender identity or expression, disability, or protected veteran status.
The base salary range for applicants for this position is listed below. Unless actual salary is indicated above in the job description, actual pay will be based on skill, geographical location and experience.
$60,500.00-$123,400.00
Additionally, Unum offers a portfolio of benefits and rewards that are competitive and comprehensive including healthcare benefits (health, vision, dental), insurance benefits (short & long-term disability), performance-based incentive plans, paid time off, and a 401(k) retirement plan with an employer match up to 5% and an additional 4.5% contribution whether you contribute to the plan or not. All benefits are subject to the terms and conditions of individual Plans.
Company:
Unum
How much does a data engineer earn in Columbia, SC?
The average data engineer in Columbia, SC earns between $65,000 and $115,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.
Average data engineer salary in Columbia, SC
$87,000
What are the biggest employers of Data Engineers in Columbia, SC?
The biggest employers of Data Engineers in Columbia, SC are: