Data engineer jobs in Rock Hill, SC

- 685 jobs
  • Mid Level Software Engineer - Oracle Cloud Apps

    USAA 4.7 company rating

    Data engineer job in Charlotte, NC

    Why USAA? At USAA, our mission is to empower our members to achieve financial security through highly competitive products, exceptional service and trusted advice. We seek to be the #1 choice for the military community and their families. Embrace a fulfilling career at USAA, where our core values - honesty, integrity, loyalty and service - define how we treat each other and our members. Be part of what truly makes us special and impactful. The Opportunity As a dedicated Mid Level Software Engineer - Oracle Cloud Apps, you will collaborate closely with the Finance-IT and Accounting teams within USAA's Chief Financial Office (CFO). You will play a key role in Financial Close and Consolidation projects, using Oracle Cloud technologies to enhance financial processes and system efficiencies. You will provide support to the Enterprise by delivering best-in-class technology solutions, and you will be engaged in all phases of the software systems and application development lifecycle, which includes gathering and analyzing requirements, designing, testing, documenting, and implementing software, and responding to outages. We offer a flexible work environment that requires an individual to be in the office 4 days per week. This position can be based in one of the following locations: San Antonio, TX, Plano, TX, Phoenix, AZ, or Charlotte, NC. Relocation assistance is not available for this position. What you'll do: Design, develop, code, and test complex technical solutions. Investigate and resolve complex application and system technical problems and production issues through problem-solving techniques. Continually improve operations by conducting complex systems analysis and recommending changes in policies and procedures. Prepare and install complex solutions by resolving and designing system specifications, standards, and programming. Follow the software development lifecycle. Participate in design reviews and learn key system design principles. Mentor junior engineers and, over time, begin mentoring peer engineers; review teammates' code. Ensure risks associated with business activities are effectively identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures. What you have: Bachelor's Degree, or 4 additional years of experience beyond the minimum requirement in lieu of a degree, OR an approved certification from CodeUp, Galvanize, VetFIT (Veterans for IT) or eFIT (Employees for IT). 4 years of software development experience demonstrating depth of technical understanding within a specific discipline(s)/technology(s). 2 years of experience delivering technology solutions in all phases of the software systems and application development lifecycle, including leading code/design reviews. Basic understanding of one or more of the following: Java, Swift, Objective-C, Cobol, JavaScript, Kotlin, C++, HTML, CSS, SQL, Go, and Python. Developing level of business insight in the areas of business operations, risk management, industry practices and emerging trends. Experience supporting efforts to address production issues through fixing applications and systems. Experience articulating technical challenges and solutions. Basic understanding of cloud technologies and tools. What sets you apart: Strong understanding of the Financial & Insurance Industry technical and functional landscape. Deep knowledge of CFO processes and related business operations. Validated experience in driving the development and configuration of Oracle Cloud ERP modules (GL, AR, RM, etc.)
and Oracle EPM applications (FCCS, EPCM, etc.) Expertise in Oracle Fusion Cloud Reporting Applications, including: FDI (Fusion Data Intelligence), BI Publisher (BIP), Oracle Analytics Cloud (OAC) Demonstrated experience in implementing at least two modules across ERP or EPM platforms (Examples: Implemented GL & ARE; Implemented FCCS and PCMCS; or two FCCS implementations.) Compensation range: The salary range for this position is: $93,770.00 - $179,240.00. USAA does not provide visa sponsorship for this role. Please do not apply for this role if at any time (now or in the future) you will need immigration support (i.e., H-1B, TN, STEM OPT Training Plans, etc.). Compensation: USAA has an effective process for assessing market data and establishing ranges to ensure we remain competitive. You are paid within the salary range based on your experience and market data of the position. The actual salary for this role may vary by location. Employees may be eligible for pay incentives based on overall corporate and individual performance and at the discretion of the USAA Board of Directors. The above description reflects the details considered necessary to describe the principal functions of the job and should not be construed as a detailed description of all the work requirements that may be performed in the job. Benefits: At USAA our employees enjoy best-in-class benefits to support their physical, financial, and emotional wellness. These benefits include comprehensive medical, dental and vision plans, 401(k), pension, life insurance, parental benefits, adoption assistance, paid time off program with paid holidays plus 16 paid volunteer hours, and various wellness programs. Additionally, our career path planning and continuing education assists employees with their professional goals. For more details on our outstanding benefits, visit our benefits page on USAAjobs.com. Applications for this position are accepted on an ongoing basis, this posting will remain open until the position is filled. Thus, interested candidates are encouraged to apply the same day they view this posting. USAA is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
    $93.8k-179.2k yearly Auto-Apply 2d ago
  • Data Scientist

    Tata Consultancy Services 4.3 company rating

    Data engineer job in Charlotte, NC

    Must-Have Technical/Functional Skills: a strong Python and machine learning skillset. We are seeking an experienced Data Scientist to lead end-to-end AI/ML solution design and implementation across a range of business domains in financial services. You will be responsible for architecting robust, scalable, and secure data science solutions that drive innovation and competitive advantage in the BFSI sector. This includes selecting appropriate technologies, defining solution blueprints, ensuring production readiness, and mentoring cross-functional teams. You will work closely with stakeholders to identify high-value use cases and ensure seamless integration of models into business applications. Your deep expertise in machine learning, cloud-native architectures, and MLOps practices, together with financial domain knowledge, will be essential to influence strategy and deliver transformative business impact. * Proficient in Python, scikit-learn, TensorFlow, PyTorch, HuggingFace. * Strong BFSI domain knowledge. * Experience with NLP, LLMs (GPT), and deep learning. * Hands-on with MLOps pipelines and tools. * Experience with graph analytics tools (Neo4j, TigerGraph, NetworkX). Roles & Responsibilities * Architect and drive the design, development, and deployment of scalable ML/AI solutions. * Lead data science teams through complete project lifecycles - from ideation to production. * Define standards, best practices, and governance for AI/ML solutioning and model management. * Collaborate with data engineering, MLOps, product, and business teams. * Oversee integration of data science models into production systems. * Evaluate and recommend ML tools, frameworks, and cloud-native solutions. * Guide feature engineering, data strategy, and feature store design. * Promote innovation with generative AI, reinforcement learning, and graph-based learning. * Knowledge of Spark, PySpark, Scala. * Experience leading CoEs or data science accelerators. * Open-source contributions or published research. TCS Employee Benefits Summary: * Discretionary Annual Incentive. * Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans. * Family Support: Maternal & Parental Leaves. * Insurance Options: Auto & Home Insurance, Identity Theft Protection. * Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement. * Time Off: Vacation, Time Off, Sick Leave & Holidays. * Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing. Salary Range: $100,000-$130,000 a year
    $100k-130k yearly 27d ago
  • Data Scientist

    Isolved HCM

    Data engineer job in Charlotte, NC

    Summary/objective We are seeking a highly skilled Data Scientist to focus on building and deploying predictive models that identify customer churn risk and upsell opportunities. This role will play a key part in driving revenue growth and retention strategies by leveraging advanced machine learning, statistical modeling, and large-scale data capabilities within Databricks. Why Join Us? Be at the forefront of using Databricks AI/ML capabilities to solve real-world business challenges. Directly influence customer retention and revenue growth through applied data science. Work in a collaborative environment where experimentation and innovation are encouraged. Core Job Duties: Model Development * Design, develop, and deploy predictive models for customer churn and upsell propensity using Databricks ML capabilities. * Evaluate and compare algorithms (e.g., logistic regression, gradient boosting, random forest, deep learning) to optimize predictive performance. * Incorporate feature engineering pipelines that leverage customer behavior, transaction history, and product usage data. Data Engineering & Pipeline Ownership * Build and maintain scalable data pipelines in Databricks (using PySpark, Delta Lake, and MLflow) to enable reliable model training and scoring. * Collaborate with data engineers to ensure proper data ingestion, transformation, and governance. Experimentation & Validation * Conduct A/B tests and back testing to validate model effectiveness. * Apply techniques for model monitoring, drift detection, and retraining in production. Business Impact & Storytelling * Translate complex analytical outputs into clear recommendations for business stakeholders. * Partner with Product and Customer Success teams to design strategies that reduce churn, increase upsell and improve customer retention KPIs. Minimum Qualifications: * Master's or PhD in Data Science, Statistics, Computer Science, or related field (or equivalent industry experience). * 3+ years of experience building predictive models in a production environment. * Strong proficiency in Python (pandas, scikit-learn, PySpark) and SQL. * Demonstrated expertise using Databricks for: * Data manipulation and distributed processing with PySpark. * Building and managing models with MLflow. * Leveraging Delta Lake for efficient data storage and retrieval. * Implementing scalable ML pipelines within Databricks' ML Runtime. * Experience with feature engineering for behavioral and transactional datasets. * Strong understanding of customer lifecycle analytics, including churn modeling and upsell/recommendation systems. * Ability to communicate results and influence decision-making across technical and non-technical teams. Preferred Qualifications: * Experience with cloud platforms (Azure Databricks, AWS, or GCP). * Familiarity with Unity Catalog for data governance and security. * Knowledge of deep learning frameworks (TensorFlow, PyTorch) within Databricks. * Exposure to MLOps best practices (CI/CD for ML, model versioning, monitoring). * Background in SaaS, subscription-based businesses, or customer analytics. Physical Demands Prolonged periods of sitting at a desk and working on a computer. Must be able to lift up to 15 pounds. Travel Required: Limited Work Authorization: Employees must be legally authorized to work in the United States. 
FLSA Classification: Exempt Location: Any Effective Date: 9/16/2025 About isolved isolved is a provider of human capital management (HCM) solutions that help organizations recruit, retain and elevate their workforce. More than 195,000 employers and 8 million employees rely on isolved's software and services to streamline human resource (HR) operations and deliver employee experiences that matter. isolved People Cloud is a unified yet modular HCM platform with built-in artificial intelligence (AI) and analytics that connects HR, payroll, benefits, and workforce and talent management into a single solution that drives better business outcomes. Through the Sidekick Advantage, isolved also provides expert guidance, embedded services and an engaged community that empowers People Heroes to grow their companies and careers. Learn more at ******************* isolved is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. isolved is a progressive and open-minded meritocracy. If you are smart and good at what you do, come as you are. Visit ************************** for more information regarding our incredible culture and focus on our employee experience. Visit ************************* for a comprehensive list of our employee total rewards offerings.
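    For context only, here is a minimal, purely illustrative sketch of the kind of churn-propensity workflow this posting describes: training a gradient-boosted classifier with scikit-learn and tracking it with MLflow, as one might do on Databricks. The input file, column names, and hyperparameters are hypothetical and not taken from the posting.

```python
# Illustrative sketch: churn-propensity model logged to MLflow.
# "customer_features.csv" and its columns are hypothetical placeholders.
import mlflow
import mlflow.sklearn
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_features.csv")            # behavioral/transactional feature table
X = df.drop(columns=["customer_id", "churned"])
y = df["churned"]                                     # binary churn label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

with mlflow.start_run(run_name="churn_gbm_baseline"):
    model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("test_auc", auc)                # validation metric tracked per run
    mlflow.sklearn.log_model(model, "model")          # model artifact for later scoring
```

    In a Databricks setting, the same pattern scales out by swapping pandas for PySpark feature pipelines and reading from Delta Lake tables, with MLflow handling experiment tracking and model registration.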
    $68k-95k yearly est. 5d ago
  • Data Scientist

    Zone IT Solutions

    Data engineer job in Charlotte, NC

    Job Description We are looking for a talented Data Scientist skilled in Python and SQL. In this role, you will analyze large datasets, develop predictive models, and derive actionable insights that will guide business decisions. Requirements Proven experience as a Data Scientist or a similar role, with a strong focus on data analysis and modeling. Proficiency in programming languages, especially Python, and strong SQL skills for database management and querying. Experience with statistical analysis techniques and data visualization tools (e.g., Tableau, Matplotlib, Seaborn). Familiarity with machine learning frameworks and libraries (e.g., Scikit-learn, TensorFlow). Strong analytical skills and the ability to work with large datasets to extract meaningful information. Experience in data preprocessing, feature engineering, and model evaluation. Excellent problem-solving abilities and strong communication skills to present findings effectively. A degree in Computer Science, Mathematics, Statistics, or a related field is preferred. Benefits About Us We specialize in Digital, ERP, and larger IT Services. We offer flexible, efficient and collaborative solutions to any organization that requires IT experts. Our agile, agnostic, and flexible solutions will help you source the IT expertise you need. If you are looking for new opportunities, send your profile to ***************************. Also follow our LinkedIn page for new job opportunities and more. Zone IT Solutions is an equal opportunity employer and our recruitment process focuses on essential skills and abilities. We encourage applications from a diverse array of backgrounds, including individuals of various ethnicities, cultures, and linguistic backgrounds, as well as those with disabilities.
    $68k-95k yearly est. Easy Apply 23d ago
  • AI Data Scientist

    NationMind LLC

    Data engineer job in Charlotte, NC

    NationMind LLC is a technology consulting firm focused on software development and QA testing services. We help clients build reliable, scalable applications with a strong emphasis on automation, performance, and quality. Our team works across industries, delivering solutions that drive innovation and operational efficiency. We are currently hiring skilled professionals for an AI Data Scientist role to join our growing team. Job title: AI Data Scientist. Location: Charlotte, NC (Hybrid). Job description / Role Overview: We are looking for an experienced Data Scientist who can design, develop, and deploy advanced AI/ML models and data-driven solutions. The ideal candidate will have strong expertise in machine learning, deep learning, LLMs, and cloud-based data platforms, along with hands-on experience in data engineering, vector databases, and end-to-end deployment. Key Responsibilities: Model Development & Optimization - Build and fine-tune ML/DL models, including LLMs for NLP tasks; implement RAG (Retrieval-Augmented Generation) and agentic AI workflows for enterprise use cases; optimize models for performance, scalability, and cost efficiency. Data Engineering & Management - Design and maintain data pipelines for structured and unstructured data; work with vector databases (e.g., Pinecone, Milvus, Weaviate) for semantic search and embeddings; ensure data quality, governance, and compliance. Deployment & MLOps - Deploy models using Docker, Kubernetes, and cloud-native services (AWS, Azure, GCP); implement CI/CD pipelines for ML workflows and automated retraining; monitor model performance and drift using MLOps tools. Collaboration & Communication - Work closely with architects, engineers, and business stakeholders to translate requirements into solutions; present insights and recommendations using data visualization tools. Required Technical Skills: Programming - Python (Pandas, NumPy, Scikit-learn, PyTorch, TensorFlow), SQL; AI/ML Frameworks - LangChain, Hugging Face, LlamaIndex; Cloud Platforms - AWS SageMaker, Azure ML, GCP Vertex AI; Databases - SQL/NoSQL, Vector DBs (Pinecone, Milvus, Weaviate); Deployment - Docker, Kubernetes, Helm; MLOps Tools - MLflow, Kubeflow, Airflow; Visualization - Power BI, Tableau, Matplotlib, Plotly.
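    As a purely illustrative aside, the retrieval step of a RAG workflow like the one this posting describes can be sketched in a few lines. The embed() function below is a toy stand-in for a real embedding model, and the documents and query are invented; in practice the vectors would live in a vector database such as Pinecone, Milvus, or Weaviate rather than in memory.

```python
# Minimal sketch of RAG retrieval: embed documents, rank them against a query by
# cosine similarity, and assemble an augmented prompt for the LLM.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy embedding (hashed bag of tokens) used only so the sketch is self-contained.
    vec = np.zeros(128)
    for token in text.lower().split():
        vec[hash(token) % 128] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

documents = [
    "Refunds are processed within five business days.",
    "Premium accounts include priority support.",
    "Passwords must be rotated every 90 days.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vectors @ embed(query)        # cosine similarity (vectors are unit length)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

query = "How long do refunds take?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)                                  # this prompt would then be sent to the LLM
```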
    $68k-95k yearly est. 5d ago
  • Data Scientist

    Insight Global

    Data engineer job in Charlotte, NC

    Insight Global is seeking a Data Engineer for one of our large retail food service clients. This position will be within the BI & Analytics group supporting unattended retail solutions for clients and consumers across the US. This individual will be responsible for the following: - Develop the comprehensive knowledge required to support existing Production models and forecast solutions - Implement automated processes for efficiently generating and monitoring production forecasting models - Design and create algorithms and machine-learning models, and re-train and tune models for efficiency and scalability in a cloud environment - Develop ad hoc and statistical analyses to determine trends and significant relationships - Develop and validate predictive and prescriptive models to identify opportunities for business improvement - Identify unique opportunities and strategize new uses for data; actively engage with the business customer to identify measurable criteria - Present and communicate complex analyses and content appropriate for the audience - Collaborate and develop working relationships with IT, Data Engineers and Analytics teams to understand and share best practices and inter-dependencies between teams, data sources, technologies, and platforms - Be accountable for ensuring high-quality solutions are delivered within project timelines We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to Human Resources Request Form (****************************************** Og4IQS1J6dRiMo) . The EEOC "Know Your Rights" Poster is available here (*********************************************************************************************** . To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: *************************************************** . Skills and Requirements - Graduate studies or BS Degree in Data Science, Computer Science, Mathematics, Statistics, or related field. - 2+ years of experience in large complex data analysis and predictive modeling in a corporate setting - Experience with designing and building processes with large, complex data sets from multiple data sources - 2+ years of advanced proficiency in writing complex MS SQL, including aggregate analysis over large complex datasets, multiple joins and query optimization - Proficient coding abilities in the major programming languages including R and Python - Experience with "end to end" Machine Learning model development - Skilled in best practices for data handling and imputation - Familiar with a variety of ML modeling techniques including limitations and appropriate application of models such as forecasting, logistic regression, random forest, and gradient boosting. - Proficiency with MS Excel and pivot tables/charts. - Experience visualizing business insights with MS Power BI or other data visualization tools preferred.
- Demonstrates excellent verbal and written communication skills as well as the ability to present complex topics effectively and in a simplified and appropriate manner - Inquisitive, energetic initiative-taker, able to work independently or collaboratively in a fast-paced team environment. - Experience with Snowflake and Snowpark preferred. - Experience working with Azure DevOps; Sprint Agile projects preferred. - Experience with analysis within the retail industry is a plus.
    $68k-95k yearly est. 60d+ ago
  • Data Scientist

    Techstarsgroup

    Data engineer job in Charlotte, NC

    Use your Data Science skills in the fight against chronic diseases. Our client operates a pioneering disease management platform, focusing on delivering value-based care tailored to individuals with chronic conditions. By seamlessly integrating human expertise with cutting-edge software and analytics, the platform actively engages patients soon after diagnosis, ensuring they receive the most appropriate care swiftly. It offers continuous support by employing targeted, evidence-based interventions, significantly enhancing patient outcomes and making a meaningful difference in their health journey. As a Data Scientist, you will become an integral part of our Data team, tasked with enhancing our analytical capabilities to support our overarching mission. Your role will involve modeling data and crafting visualizations to communicate with stakeholders, delivering insights that prompt actionable measures. A background in working with healthcare data, including claims, prior authorizations, and electronic health records, is crucial for success in this role. You will work closely with teams across Product, Engineering, Clinical, and Operations to pinpoint opportunities for improving outcomes and monitoring the effectiveness of our interventions for our members and clients. Your contributions will be pivotal in establishing strong relationships with providers by developing data products that support practice transformation efforts. Within three months, you will: - Acquire a comprehensive understanding of our data platform, contributing to the enhancement of our data models and pipelines. - Forge relationships with oncology practices and providers, showcasing our expertise in developing data and analytics products. - Collaborate with stakeholders to grasp their business needs and translate these into technical specifications, involving the creation of data models, pipelines, and analytics dashboards using tools such as Looker or RStudio. This may include utilizing dbt to construct models for analyzing medical claims data to identify value care utilization across oncology practices or examining the dispersion of medical care among our members and its impact on out-of-pocket expenses. After six months, you will: - Assume responsibility for and lead the development of data models to assess the impact of our interventions with oncology practices, reporting outcomes both internally and externally. - Play a crucial role in enhancing our data infrastructure and devising a roadmap for a scalable and modular data architecture to accommodate our team's expansion. - Lead in the development of utilization and quality metrics, becoming the primary contact for stakeholder inquiries. You will also utilize our data assets for identifying business opportunities and strategic initiatives. Keys to success include: - Prioritizing our members. The mission of our organization, especially the experience of our members, is of utmost importance to you. - Being action-oriented. You have a knack for identifying and prioritizing the needs of your initiatives, ensuring that urgent and important tasks are addressed promptly. - Valuing diverse perspectives. You are humble, constantly seeking feedback, and are keen on learning and sharing knowledge. - Relevant experience. You have experience handling large healthcare datasets, preferably within a health plan or a healthcare-focused technology startup with advanced data structures and pipelines. 
Expertise in medical claims, pharmacy claims, eligibility files, and other pertinent healthcare data is vital for creating data marts for reporting and analysis. - Technical proficiency. Your skills in analytics, data modeling, and data transformation are essential. While familiarity with DBT is preferred, we welcome candidates who are eager to learn it swiftly. Experience with Python or R and tools like Looker for data analysis and visualization is advantageous. - Effective communication. You are adept at expressing your ideas clearly to both technical and non-technical team members and stakeholders. - Comfort with ambiguity. You have a proven track record of navigating through challenges and finding solutions in uncertain situations, particularly in fast-paced environments and ambitious startups.
    $68k-95k yearly est. 60d+ ago
  • Applied AI Data Scientist

    Thestaffed

    Data engineer job in Dallas, NC

    Our client, a top-tier Management Consulting firm, is seeking a highly skilled Applied AI Data Scientist for a top tier US Bank. Responsibilities and Requirements: Perform statistical analysis, clustering, and probability modeling to drive insights and inform AI-driven solutions Analyze graph-structured data to detect anomalies, extract probabilistic patterns, and support graph-based intelligence Build NLP pipelines with a focus on NER, entity resolution, ontology extraction, and scoring Contribute to AI/ML engineering efforts by developing, testing, and deploying data-driven models and services Apply ML Ops fundamentals, including experiment tracking, metric monitoring, and reproducibility practices Collaborate with cross-functional teams to translate analytical findings into production-grade capabilities Prototype quickly, iterate efficiently, and help evolve data science best practices across the team Solid experience in statistical modeling, clustering techniques, and probability-based analysis Hands-on expertise in graph data analysis, including anomaly detection and distribution pattern extraction Strong NLP skills with practical experience in NER, entity/ontology extraction, and related evaluation methods An engineering-forward mindset with the ability to build, deploy, and optimize real-world solutions (not purely theoretical) Working knowledge of ML Ops basics, including experiment tracking and key model metrics Proficiency in Python and common data science/AI libraries Strong communication skills and the ability to work collaboratively in fast-paced, applied AI environments
    $68k-94k yearly est. 18d ago
  • Lead Data Science Consultant - Global Payments & Liquidity

    Wells Fargo 4.6 company rating

    Data engineer job in Charlotte, NC

    We are seeking a visionary and hands-on Vice President-level Lead Data Scientist to join our Global Payments & Liquidity (GPL) team. In this role, you will work with senior leaders to uncover insights, solve complex problems, and build data-driven solutions that fuel growth, improve products, and deepen customer engagement. You will analyze product portfolios, identify strategic opportunities, and design solutions that deepen customer relationships and accelerate GPL's growth. In this role you will: * Partner and consult with business leaders to understand strategic goals and translate them into data science initiatives. * Lead the design and implementation of complex solutions and deliverables by utilizing advanced statistical techniques and Machine learning models. * Collaborate with data engineering and business analysts to ensure scalable and secure deployment of analytics solutions. * Enforce and establish best practices related to data quality and governance. * Ensure data quality, model accuracy, and reproducibility through rigorous validation and documentation. * Communicate insights and recommendations clearly to both technical and non-technical stakeholders. Required Qualifications: * 5+ years of data science experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education * 5+ years of hands-on programming using SQL * 5+ years of Python experience * 1+ years of experience in a leadership or mentorship role * Must have completed a Master's degree level or higher in a quantitative discipline such as mathematics, statistics, engineering, physics, economics, Business Analytics, or computer science Desired Qualifications: * Prior experience supporting Agile product teams in a corporate environment * Knowledge and understanding of banking products and services, including Transaction Services in areas such as payables and receivables and working capital solutions * Knowledge of cloud platforms (e.g., AWS, Azure) and data visualization tools (e.g., Power BI, Tableau). Job Expectations: * This role will be required to work on-site in Charlotte, NC. Candidates outside of commuting distance must be willing to relocate. Base Salary Range for Charlotte, NC: $139,000.00 - $217,000.00 annually. Role is eligible for annual discretionary performance bonus. Posting End Date: 28 Dec 2025 * Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. 
There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo. Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
    $139k-217k yearly 44d ago
  • Data Engineer

    Contact Government Services, LLC

    Data engineer job in Charlotte, NC

    Data Engineer Employment Type: Full-Time, Mid-level Department: Business Intelligence CGS is seeking a passionate and driven Data Engineer to support a rapidly growing Data Analytics and Business Intelligence platform focused on providing solutions that empower our federal customers with the tools and capabilities needed to turn data into actionable insights. The ideal candidate is a critical thinker and perpetual learner; excited to gain exposure and build skillsets across a range of technologies while solving some of our clients' toughest challenges. CGS brings motivated, highly skilled, and creative people together to solve the government's most dynamic problems with cutting-edge technology. To carry out our mission, we are seeking candidates who are excited to contribute to government innovation, appreciate collaboration, and can anticipate the needs of others. Here at CGS, we offer an environment in which our employees feel supported, and we encourage professional growth through various learning opportunities. Skills and attributes for success: - Complete development efforts across the data pipeline to store, manage, and provision data to data consumers. - Be an active and collaborative member of an Agile/Scrum team, following all Agile/Scrum best practices. - Write code to ensure the performance and reliability of data extraction and processing. - Support continuous process automation for data ingestion. - Achieve technical excellence by advocating for and adhering to lean-agile engineering principles and practices such as API-first design, simple design, continuous integration, version control, and automated testing. - Work with program management and engineers to implement and document complex and evolving requirements. - Help cultivate an environment that promotes customer service excellence, innovation, collaboration, and teamwork. - Collaborate with others as part of a cross-functional team that includes user experience researchers and designers, product managers, engineers, and other functional specialists. Qualifications: - Must be a US Citizen. - Must be able to obtain a Public Trust Clearance. - 7+ years of IT experience, including experience in design, management, and solutioning of large, complex data sets and models. - Experience developing data pipelines from many sources of structured and unstructured data sets in a variety of formats. - Proficiency in developing ETL processes and performing test and validation steps. - Proficiency in manipulating data (Python, R, SQL, SAS). - Strong knowledge of big data analysis and storage tools and technologies. - Strong understanding of agile principles and the ability to apply them. - Strong understanding of CI/CD pipelines and the ability to apply them. - Experience with relational databases such as PostgreSQL. - Work comfortably in version control systems such as Git repositories. Ideally, you will also have: - Experience creating and consuming APIs. - Experience with DHS and knowledge of DHS standards a plus. - Candidates will be given special consideration for extensive experience with Python. - Ability to develop visualizations utilizing Tableau or PowerBI. - Experience in developing Shell scripts on Linux. - Demonstrated experience translating business and technical requirements into comprehensive data strategies and analytic solutions. - Demonstrated ability to communicate across all levels of the organization and communicate technical terms to non-technical audiences.
Our Commitment: Contact Government Services (CGS) strives to simplify and enhance government bureaucracy through the optimization of human, technical, and financial resources. We combine cutting-edge technology with world-class personnel to deliver customized solutions that fit our client's specific needs. We are committed to solving the most challenging and dynamic problems. For the past seven years, we've been growing our government-contracting portfolio, and along the way, we've created valuable partnerships by demonstrating a commitment to honesty, professionalism, and quality work. Here at CGS we value honesty through hard work and self-awareness, professionalism in all we do, and delivering the best quality to our consumers, maintaining those relationships for years to come. We care about our employees. Therefore, we offer a comprehensive benefits package: - Health, Dental, and Vision - Life Insurance - 401k - Flexible Spending Account (Health, Dependent Care, and Commuter) - Paid Time Off and Observance of State/Federal Holidays. Contact Government Services, LLC is an Equal Opportunity Employer. Applicants will be considered without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Join our team and become part of government innovation! Explore additional job opportunities with CGS on our Job Board: ************************************* For more information about CGS please visit: ************************** or contact: Email: ******************* #CJ
    $77k-103k yearly est. Auto-Apply 60d+ ago
  • Senior Data Engineer

    CPI Security 4.7 company rating

    Data engineer job in Charlotte, NC

    Job Description CPI Security, a national leader in residential and commercial security solutions, is seeking a Senior Data Engineer to join us on our data transformation journey. This is an exciting, hands-on opportunity to implement a modern enterprise data platform at a company that has fully embraced the Snowflake platform. This role will work directly with line-of-business leaders and technical users to design and implement our cloud data warehouse using data vault modeling and dbt. CPI will leverage the data cloud for our data warehouse, machine learning, and AI journeys. The ideal person will have extensive experience building and implementing data warehouses in the cloud with deep expertise in data vault modeling and dbt. This is an on-site position at our HQ in Charlotte, NC. What You'll Do: Data Vault Implementation: Design and implement data vault 2.0 modeling patterns to build a scalable, audit-friendly enterprise data platform that supports business agility and data governance. Modern Data Engineering: Build and maintain automated data pipelines using dbt (Cloud/Core), Python, and Snowflake to transform raw data into business-ready datasets with comprehensive data quality testing. Cloud Data Platform Development: Architect and implement an enterprise data platform on Snowflake, including automated deployment pipelines, data quality frameworks, and monitoring solutions. While we are modernizing to a cloud data platform, on-premises work is still needed using SSIS and MSSQL Server. Data Mart & Dimensional Modeling: Design and build data marts using dimensional modeling techniques (Kimball methodology) to support business intelligence and analytics requirements. ETL/ELT Pipeline Development: Design and implement robust data transformation models using dbt, SQL, and Python to build scalable ingestion and processing pipelines. Data Quality & Testing: Implement comprehensive data quality testing frameworks using dbt tests, custom Python validations, and automated monitoring to ensure data accuracy and reliability. External Data Integration: Integrate and operationalize data from external systems such as CRM, ERP, and third-party platforms via secure cloud data sharing, CDC, and APIs. DataOps Implementation: Enable reliable, scalable, and automated data workflows by implementing DataOps best practices for continuous integration, testing, deployment, and monitoring across the data pipeline lifecycle. Cloud Migration Support: Play an integral role in planning, designing, and implementing data migration strategies from legacy systems to our modern cloud platform. What We're Looking For: Required Experience: 6+ years of data engineering experience with cloud data platforms 4+ years of experience with Snowflake (required) 4+ years of experience with dbt (Cloud and/or Core) 4+ years of Python development experience 4+ years of AWS experience (AWS Certified Developer preferred) 6+ years of experience building data warehouses and data marts Deep understanding of data vault 2.0 modeling methodology Strong experience with dimensional modeling (Kimball methodology) Proven experience with automated deployment and CI/CD pipelines Experience implementing data quality testing frameworks Technical Skills: MSSQL Server, SQL, and SSIS.
Advanced SQL and data modeling expertise with dimensional modeling and data vault modeling Strong dbt skills for data staging, cleaning, transformation, testing, and modeling Proficiency in Python programming for data engineering tasks Experience with agile / scrum teams for data engineering and analytics engineering. AWS cloud services (S3, Lambda, IAM, CloudFormation, etc.) Experience with data orchestration tools (Airflow, Prefect, or similar) Understanding of modern data engineering practices and agile methodologies Knowledge of data governance, security, and compliance requirements Preferred Qualifications: AWS Certified Developer certification Snowflake certifications in data engineering and/or architecture Experience with data vault automation tools (automate-dv package) Knowledge of modern BI and analytics platforms Soft Skills: Excellent oral and written communication skills to effectively deliver messages to a wide range of audiences - from business to technical Innovative and positive team member mindset Strong teamwork and interpersonal skills, with the ability to deliver results working independently or in a collaborative environment Agile development experience preferred Solution-oriented approach with strong problem-solving abilities Education: Bachelor's degree in Information Systems, Computer Science, Data Science, or related field of study preferred Work experience equivalent will be considered What Sets You Apart: Deep understanding of the complete data engineering lifecycle Experience with cloud data platform implementations Proven ability to work with cross-functional teams and stakeholders Passion for building modern, cloud-first data solutions Strong analytical and critical thinking skills Commitment to data quality and best practices Why Join CPI Security: Opportunity to build a modern enterprise data platform from the ground up Work with cutting-edge cloud technologies and data vault modeling Collaborative environment with experienced data professionals Competitive compensation and benefits package Professional development opportunities and certification support On-site position in Charlotte, NC with a dynamic, growing company CPI Security is an equal opportunity employer committed to diversity and inclusion in the workplace.
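    As an illustrative aside, one core data vault 2.0 idea this role works with is deterministic hash keys for hubs and links. In the platform described, this logic would typically live in dbt models or packages such as automate-dv; the minimal Python sketch below only shows the concept, and the business keys are hypothetical.

```python
# Illustrative sketch: deterministic hub hash keys, a data vault 2.0 building block.
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    # Normalize, join with a delimiter, and hash, so the same business key always
    # produces the same surrogate key on every load.
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

print(hub_hash_key("CUST-000123"))              # single-part business key (hypothetical)
print(hub_hash_key("CUST-000123", "US-EAST"))   # composite business key (hypothetical)
```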
    $86k-112k yearly est. 19d ago
  • Google Cloud Data & AI Engineer

    Slalom 4.6 company rating

    Data engineer job in Charlotte, NC

    Who You'll Work With Slalom is a modern technology company, and our Slalom Technologists are disrupting the market and bringing to life the art of the possible for our clients. We have a passion for building strategies, solutions, and creative products to help our clients solve their most complex and interesting business problems. We surround our technologists with interesting challenges, innovative minds, and emerging technologies. You will collaborate with cross-functional teams, including Google Cloud architects, data scientists, and business units, to design and implement Google Cloud data and AI solutions. As a Consultant, Senior Consultant or Principal at Slalom, you will be a part of a team of curious learners who lean into the latest technologies to innovate and build impactful solutions for our clients. What You'll Do * Design, build, and operationalize large-scale enterprise data and AI solutions using Google Cloud services such as BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub and more. * Implement cloud-based data solutions for data ingestion, transformation, and storage; and AI solutions for model development, deployment, and monitoring, ensuring both areas meet performance, scalability, and compliance needs. * Develop and maintain comprehensive architecture plans for data and AI solutions, ensuring they are optimized for both data processing and AI model training within the Google Cloud ecosystem. * Provide technical leadership and guidance on Google Cloud best practices for data engineering (e.g., ETL pipelines, data pipelines) and AI engineering (e.g., model deployment, MLOps). * Conduct assessments of current data architectures and AI workflows, and develop strategies for modernizing, migrating, or enhancing data systems and AI models within Google Cloud. * Stay current with emerging Google Cloud data and AI technologies, such as BigQuery ML, AutoML, and Vertex AI, and lead efforts to integrate new innovations into solutions for clients. * Mentor and develop team members to enhance their skills in Google Cloud data and AI technologies, while providing leadership and training on both data pipeline optimization and AI/ML best practices. What You'll Bring * Proven experience as a Cloud Data and AI Engineer or similar role, with hands-on experience in Google Cloud tools and services (e.g., BigQuery, Vertex AI, Dataflow, Cloud Storage, Pub/Sub, etc.). * Strong knowledge of data engineering concepts, such as ETL processes, data warehousing, data modeling, and data governance. * Proficiency in AI engineering, including experience with machine learning models, model training, and MLOps pipelines using tools like Vertex AI, BigQuery ML, and AutoML. * Strong problem-solving and decision-making skills, particularly with large-scale data systems and AI model deployment. * Strong communication and collaboration skills to work with cross-functional teams, including data scientists, business stakeholders, and IT teams, bridging data engineering and AI efforts. * Experience with agile methodologies and project management tools in the context of Google Cloud data and AI projects. * Ability to work in a fast-paced environment, managing multiple Google Cloud data and AI engineering projects simultaneously. * Knowledge of security and compliance best practices as they relate to data and AI solutions on Google Cloud. 
* Google Cloud certifications (e.g., Professional Data Engineer, Professional Database Engineer, Professional Machine Learning Engineer) or willingness to obtain certification within a defined timeframe. About Us Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value. At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all. Compensation and Benefits Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer yearly $350 reimbursement account for any well-being-related expenses, as well as discounted home, auto, and pet insurance. Slalom is committed to fair and equitable compensation practices. For this position the target base salaries are listed below. In addition, individuals may be eligible for an annual discretionary bonus. Actual compensation will depend upon an individual's skills, experience, qualifications, location, and other relevant factors. The target salary pay range is subject to change and may be modified at any time. East Bay, San Francisco, Silicon Valley: * Consultant $114,000-$171,000 * Senior Consultant: $131,000-$196,500 * Principal: $145,000-$217,500 San Diego, Los Angeles, Orange County, Seattle, Houston, New Jersey, New York City, Westchester, Boston, Washington DC: * Consultant $105,000-$157,500 * Senior Consultant: $120,000-$180,000 * Principal: $133,000-$199,500 All other locations: * Consultant: $96,000-$144,000 * Senior Consultant: $110,000-$165,000 * Principal: $122,000-$183,000 EEO and Accommodations Slalom is an equal opportunity employer and is committed to inclusion, diversity, and equity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veterans' status, or any other characteristic protected by federal, state, or local laws. Slalom will also consider qualified applications with criminal histories, consistent with legal requirements. Slalom welcomes and encourages applications from individuals with disabilities. Reasonable accommodations are available for candidates during all aspects of the selection process. Please advise the talent acquisition team if you require accommodations during the interview process. We are accepting applications until 12/31. #LI-FB1
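    As an illustrative aside, a small slice of the BigQuery work described above can be sketched with the standard google-cloud-bigquery Python client. The project, dataset, table, and query below are hypothetical placeholders, not client systems.

```python
# Illustrative sketch: run a transformation query in BigQuery from Python.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")  # hypothetical project

sql = """
    SELECT customer_id, COUNT(*) AS orders_last_30d
    FROM `example-analytics-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
"""

job = client.query(sql)          # starts the query job
for row in job.result():         # blocks until the job completes
    print(row["customer_id"], row["orders_last_30d"])
```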
    $145k-217.5k yearly 60d+ ago
  • Data Engineer

    Sharp Decisions 4.6 company rating

    Data engineer job in Charlotte, NC

    Experience Level: Mid (5-7 Years) W2 ONLY - NO 3RD PARTIES PLEASE CONTRACT / C2H Role Objectives * These roles will be part of the Data Strategy team spanning across the Client Capital Markets teams. * These roles will be involved in the active development of the data platform in close coordination with the Client team, beginning with the establishment of a reference data system for securities and pricing data, and later moving to other data domains. * The consulting team will need to follow internal development standards to contribute to the overall agenda of the Data Strategy team. Qualifications and Skills * Proven experience as a Data Engineer with experience in Azure cloud. * Experience implementing solutions using - * Azure cloud services * Azure Data Factory * Azure Data Lake Gen 2 * Azure Databases * Azure Data Fabric * API Gateway management * Azure Functions * Well-versed with Azure Databricks * Strong SQL skills with RDBMS or NoSQL databases * Experience with developing APIs using FastAPI or similar frameworks in Python * Familiarity with the DevOps lifecycle (git, Jenkins, etc.) and CI/CD processes * Good understanding of ETL/ELT processes * Experience in the financial services industry, financial instruments, asset classes and market data is a plus. #LI-EW1
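    As an illustrative aside, the FastAPI-based data APIs this posting mentions might look roughly like the sketch below: a small service exposing security reference data. The endpoint, model, and sample data are hypothetical and not taken from the posting; only the FastAPI and Pydantic usage is standard.

```python
# Illustrative sketch: a minimal reference-data API with FastAPI.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Reference Data API (illustrative)")

class Security(BaseModel):
    isin: str
    name: str
    currency: str

# Stand-in for data that would really come from Azure SQL / Databricks.
SECURITIES = {
    "US0378331005": Security(isin="US0378331005", name="Apple Inc.", currency="USD"),
}

@app.get("/securities/{isin}", response_model=Security)
def get_security(isin: str) -> Security:
    security = SECURITIES.get(isin)
    if security is None:
        raise HTTPException(status_code=404, detail="Unknown ISIN")
    return security

# Run locally with: uvicorn app:app --reload
```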
    $78k-101k yearly est. 3d ago
  • Data Engineer-Lead - Project Planning and Execution

    DPR Construction 4.8 company rating

    Data engineer job in Charlotte, NC

    We are a leading construction company committed to delivering high-quality, innovative projects. Our team integrates cutting-edge technologies into the construction process to streamline operations, enhance decision-making, and drive efficiency across all levels. We are looking for a talented Data Engineer to join our team and contribute to developing robust data solutions that support our business goals. This role is ideal for someone who enjoys combining technical problem-solving with stakeholder collaboration. You will collaborate with business leaders to understand data needs and work closely with a global engineering team to deliver scalable, timely, and high-quality data solutions that power insights and operations. Responsibilities * Own data delivery for specific business verticals by translating stakeholder needs into scalable, reliable, and well-documented data solutions. * Participate in requirements gathering, technical design reviews, and planning discussions with business and technical teams. * Partner with the extended data team to define, develop, and maintain shared data models and definitions. * Design, develop, and maintain robust data pipelines and ETL processes using tools like Azure Data Factory and Python across internal and external systems. * Proactively manage data quality, error handling, monitoring, and alerting to ensure timely and trustworthy data delivery. * Perform debugging, application issue resolution, root cause analysis, and assist in proactive/preventive maintenance. * Support incident resolution and perform root cause analysis for data-related issues. * Create and maintain both business requirement and technical requirement documentation * Collaborate with data analysts, business users, and developers to ensure the accuracy and efficiency of data solutions. * Collaborate with platform and architecture teams to align with best practices and extend shared data engineering patterns. Qualifications * Minimum of 4 years of experience as a Data Engineer, working with cloud platforms (Azure, AWS). * Proven track record of managing stakeholder expectations and delivering data solutions aligned with business priorities. * Strong hands-on expertise in Azure Data Factory, Azure Data Lake, Python, and SQL * Familiarity with cloud storage (Azure, AWS S3) and integration techniques (APIs, webhooks, REST). * Experience with modern data platforms like Snowflake and Microsoft Fabric. * Solid understanding of Data Modeling, pipeline orchestration and performance optimization * Strong problem-solving skills and ability to troubleshoot complex data issues. * Excellent communication skills, with the ability to work collaboratively in a team environment. * Familiarity with tools like Power BI for data visualization is a plus. * Experience working with or coordinating with overseas teams is a strong plus Preferred Skills * Knowledge of Airflow or other orchestration tools. * Experience working with Git-based workflows and CI/CD pipelines * Experience in the construction industry or a similar field is a plus but not required. DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world. 
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek. Explore our open opportunities at ********************
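For context, a minimal Python sketch of the kind of ETL step the responsibilities above describe, of the sort an Azure Data Factory pipeline might invoke; the file paths, column names, and quality checks are hypothetical and not taken from the posting:

```python
# Minimal sketch of an extract-transform-load step of the kind an
# orchestrator such as Azure Data Factory might call as a custom activity.
# Paths, column names, and thresholds are hypothetical placeholders.
import pandas as pd


def run_daily_load(source_csv: str, target_parquet: str) -> int:
    """Read raw project data, validate it, and write a cleaned extract."""
    raw = pd.read_csv(source_csv, parse_dates=["report_date"])

    # Basic data-quality gates: fail fast so monitoring/alerting can pick it up.
    if raw.empty:
        raise ValueError(f"No rows read from {source_csv}")
    if raw["project_id"].isna().any():
        raise ValueError("Null project_id values found in source extract")

    # Simple transformation: normalize column names and drop duplicates.
    cleaned = (
        raw.rename(columns=str.lower)
           .drop_duplicates(subset=["project_id", "report_date"])
    )

    cleaned.to_parquet(target_parquet, index=False)
    return len(cleaned)


if __name__ == "__main__":
    rows = run_daily_load("daily_projects.csv", "daily_projects.parquet")
    print(f"Loaded {rows} rows")
```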
    $83k-109k yearly est. Auto-Apply 28d ago
  • Data Platforms Engineer

    Adi Construction 4.2company rating

    Data engineer job in Charlotte, NC

ADI is seeking a passionate Data Platforms Engineer to design, develop, and maintain our enterprise data platforms. As the Engineer of Data Platforms, you will be part of a talented engineering and operations team focused on building and supporting secure, performant, and scalable next-gen data platforms that include Snowflake, Databricks, MS Fabric, PowerBI, DBT, Airflow, Azure data services, SQL Server databases and modern AI/ML technologies. This role will help ensure our data infrastructure meets both present and future needs while adhering to governance, compliance and operational excellence.

JOB DUTIES:
* Platform development: end-to-end engineering of data platforms, ensuring platform reliability, scalability, and extensibility that evolve with business goals and customer needs.
* Build reusable frameworks and automation using Snowflake Cloud, GitHub, Azure Data Factory, DBT, or similar tools.
* Maintain critical infrastructure, including OLTP databases, near real-time data pipelines, batch processing systems and frameworks that make data infrastructure accessible for engineering teams.
* Administer, configure, maintain and support data, analytics and cloud data platforms.
* Manage and administer data security and protection to guard the databases against threats or unauthorized access.
* Contribute to data modeling, partitioning, indexing, clustering, and performance optimization efforts in Snowflake and other data platforms, following established best practices.
* Build resource monitoring, cost optimization, data validation and quality checks into the data pipeline.
* Maintain runbooks and knowledge bases for the platform operations team.
* Ensure seamless integration of data systems across the company, collaborating with data engineering and analytics teams to align solutions with business needs.
* Perform after-hours support for critical production systems with occasional, scheduled maintenance or release deployments, and provide on-call support as needed.

YOU MUST HAVE:
* 2+ years' overall experience designing, deploying, managing and supporting RDBMS, data warehouse systems, data lakes, and BI solutions, including working with platforms such as SQL Server, Snowflake Cloud, Databricks, DBT, ADF (Azure Data Factory) and other Azure services.
* Advanced SQL proficiency.
* Proficiency in Snowflake Cloud advanced capabilities.
* Hands-on public cloud experience, including Continuous Integration, Continuous Deployment, configuration management, and provisioning automation with tools such as GitHub.
* Understanding of security practices and certificate management.
* Experience with orchestration tools.

WE VALUE:
* Certifications in cloud technologies, data management, or data engineering (e.g., Azure Data Engineer, Snowflake SnowPro Core/Advanced).
* Understanding of enterprise data management principles.
* Experience with AI/ML technologies and MLOps capabilities and tools.
* Experience with data platforms administration.

#LI-MH2 #LI-HYBRID
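As a minimal sketch, assuming hypothetical connection settings and table names, this is the kind of data-validation check a platform engineer might build into a Snowflake pipeline as part of the monitoring and quality duties listed above:

```python
# Minimal sketch of a row-count/freshness check against Snowflake of the kind
# a platform team might bake into a pipeline. Connection parameters and the
# table name are hypothetical placeholders, not from the posting.
import os
import snowflake.connector


def check_freshness(table: str, max_age_hours: int = 24) -> None:
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="RAW",             # hypothetical database
        schema="FINANCE",           # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Count rows and measure hours since the most recent load.
        cur.execute(
            f"SELECT COUNT(*), DATEDIFF(hour, MAX(loaded_at), CURRENT_TIMESTAMP()) "
            f"FROM {table}"
        )
        row_count, age_hours = cur.fetchone()
        if row_count == 0 or age_hours is None or age_hours > max_age_hours:
            # In a real platform this would raise an alert or page on-call.
            raise RuntimeError(
                f"{table} failed freshness check: {row_count} rows, {age_hours}h old"
            )
    finally:
        conn.close()


if __name__ == "__main__":
    check_freshness("RAW.FINANCE.INVOICES")
```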
    $76k-104k yearly est. Auto-Apply 54d ago
  • Data Scientist I

    Bank of America Corporation 4.7company rating

    Data engineer job in Charlotte, NC

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities and shareholders every day. Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates' physical, emotional, and financial wellness, recognizing and rewarding performance, and how we make an impact in the communities we serve. Bank of America is committed to an in-office culture with specific requirements for office-based attendance, which allows for an appropriate level of flexibility for our teammates and businesses based on role-specific considerations. At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!

The Data and AI team within Global Payment Solutions (GPS) at Bank of America is looking for a highly technical Research Engineer who is a self-starter and can translate complex concepts and generate actionable insights.

Who we are: The GPS Data and AI team drives and powers innovation within one of the largest payment businesses in the world. We are the core data scientists building end-to-end production-level solutions at scale. The team is a collection of individuals that welcomes challenging problems, because that is where the learning begins. Many of our projects begin with "think tank" sessions with business teams that are later translated into tangible solutions and products.

Representative Projects:
* Develop advanced statistical methods to determine the optimal interest-rate or product pricing strategies for clients
* Build a Generative AI-powered search platform that enables sales and product teams to access high-quality answers to product, servicing, and client-related questions
* Develop AI models that shift foreign currency conversion "up the payment stream," reducing reliance on beneficiary banks
* Apply advanced NLP techniques to generate near real-time insights into what clients are reaching out to servicing teams about, helping servicing teams anticipate client needs and improve service quality

Who we're looking for: As a Research Engineer on the GPS team, you will tackle one of our most important challenges: building models that transform data into revenue-driving insights and products. This role blends research, engineering, and product responsibilities, requiring you to design statistical models, build scalable ML pipelines, and deploy algorithms directly into our production systems. You will work across teams to solve problems in recommendation, stochastic optimization, and time-series forecasting, while running large A/B tests on new methods we design to push the boundaries of what data can achieve. If you are passionate about working in cross-functional teams and utilizing your skills for technical product management, statistical analytics, predictive modeling, and generating revenue, consider applying to this role.
Responsibilities:
* Frame abstract business problems using advanced data science and machine learning algorithms
* Work with stakeholders throughout the organization to identify opportunities for leveraging internal and external data to drive business solutions
* Own critical research and application development, strengthening the firm's competitive advantage
* Shape cross-team collaboration to advance the firm's data science capabilities

Required Skills:
* Minimum Bachelor's degree in a quantitative field such as computer science, math, statistics, or physics
* Minimum 3 years of experience translating mathematical models and algorithms into code (Python and/or C++)
* Strong foundation in probability, statistics, and applied machine learning (NLP, time-series analysis, Generative AI)
* Prior experience working in a highly technical, data-driven research environment
* Ability to communicate complex topics in a concise and coherent manner

Desired Skills:
* Graduate-level degree in a quantitative field such as computer science, math, statistics, or physics
* Demonstrated prior experience through work, research or passion projects in RAG, Agentic Frameworks, LLMs, Reinforcement Learning
* Energized by the high stakes and intensity of dynamic environments and ready to dive in whenever needed
* Excited to dive into new technical areas on a regular basis

Shift: 1st shift (United States of America)
Hours Per Week: 40
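As a purely illustrative sketch of the retrieval step behind a search platform like the one named in the representative projects, the example below uses TF-IDF similarity in place of a production embedding model and vector store; the passages and query are invented:

```python
# Illustrative-only retrieval step of the kind that might sit inside a
# retrieval-augmented search platform. The documents and query are made up;
# a production system would use a proper embedding model and vector store.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Wire transfers settle same day when initiated before the cutoff time.",
    "ACH credits typically settle within one to two business days.",
    "Foreign exchange conversion is applied at the prevailing spot rate.",
]

query = "How long does an ACH payment take to settle?"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)
query_vec = vectorizer.transform([query])

# Rank documents by cosine similarity to the query and keep the best match,
# which would then be passed to a language model as grounding context.
scores = cosine_similarity(query_vec, doc_matrix).ravel()
best = scores.argmax()
print(f"Top passage (score {scores[best]:.2f}): {documents[best]}")
```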
    $71k-94k yearly est. 7d ago
  • Data Scientist

    Zone It Solutions

    Data engineer job in Charlotte, NC

We are looking for a talented Data Scientist skilled in Python and SQL. In this role, you will analyze large datasets, develop predictive models, and derive actionable insights that will guide business decisions.

Requirements
* Proven experience as a Data Scientist or a similar role, with a strong focus on data analysis and modeling.
* Proficiency in programming languages, especially Python, and strong SQL skills for database management and querying.
* Experience with statistical analysis techniques and data visualization tools (e.g., Tableau, Matplotlib, Seaborn).
* Familiarity with machine learning frameworks and libraries (e.g., Scikit-learn, TensorFlow).
* Strong analytical skills and the ability to work with large datasets to extract meaningful information.
* Experience in data preprocessing, feature engineering, and model evaluation.
* Excellent problem-solving abilities and strong communication skills to present findings effectively.
* A degree in Computer Science, Mathematics, Statistics, or a related field is preferred.

Benefits

About Us
We specialize in Digital, ERP, and larger IT services. We offer flexible, efficient and collaborative solutions to any organization that requires IT experts. Our agile, agnostic, and flexible solutions will help you source the IT expertise you need. If you are looking for new opportunities, send your profile to ***************************. Also follow our LinkedIn page for new job opportunities and more. Zone IT Solutions is an equal opportunity employer and our recruitment process focuses on essential skills and abilities. We encourage applications from a diverse array of backgrounds, including individuals of various ethnicities, cultures, and linguistic backgrounds, as well as those with disabilities.
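A minimal sketch of the predict-and-evaluate workflow this role centers on, using a bundled scikit-learn dataset so the example is self-contained rather than tied to any real business data:

```python
# Minimal sketch of a predictive-modeling workflow: split data, scale features,
# fit a baseline classifier, and evaluate it on held-out data.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale features, then fit a simple baseline model.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Model evaluation on held-out data.
print(classification_report(y_test, model.predict(X_test)))
```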
    $68k-95k yearly est. Auto-Apply 52d ago
  • Data Scientist

    Tata Consulting Services 4.3company rating

    Data engineer job in Charlotte, NC

Artificial Intelligence (GenAI)/ML + Data Science

Must Have Technical/Functional Skills
Primary: Artificial Intelligence/ML
Secondary: Data Science, Python, NLP, Agile tools
Experience: 7 to 10+ years

Roles & Responsibilities
* Experience with NLP, deep learning, or time series analysis.
* Experience deploying models to production environments.
* Knowledge of regulatory requirements and compliance in banking and finance.
* Familiarity with MLOps practices and tools.
* Experience with Agile methodology and tools (JIRA or Rally).
* Proven experience as a Data Scientist in banking or a similar domain.
* Proficiency in Python or R, and experience with data science libraries (e.g., pandas, scikit-learn, TensorFlow, PyTorch).
* Hands-on experience with large language models (e.g., OpenAI GPT, Llama, or similar), including fine-tuning and prompt engineering.
* Strong knowledge of statistics, machine learning, and data mining techniques.
* Experience with data visualization tools (e.g., Tableau, Power BI).
* Experience with Big Data platforms (Hadoop).
* Familiarity with SQL and working with relational databases.
* Excellent problem-solving, communication, and collaboration skills.
* Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
* Analyze large financial datasets to extract insights and support business decisions.
* Develop, implement, and evaluate machine learning models and algorithms tailored to banking and finance use cases (e.g., risk modeling, fraud detection, customer segmentation).
* Apply and fine-tune large language models (LLMs) for tasks such as document analysis, customer communication, and regulatory compliance.
* Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
* Communicate findings and recommendations through reports, dashboards, and presentations.
* Work with data engineers to ensure data quality and pipeline reliability.

Salary Range: $95,000-$120,000 a year
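A brief, illustrative customer-segmentation sketch of the kind named among the use cases above, run on synthetic data with hypothetical features; real work would use governed banking data and far richer inputs:

```python
# Illustrative customer segmentation with k-means on synthetic data.
# Feature names and figures are invented for demonstration only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical features: monthly transaction count, average balance, product count.
customers = np.column_stack([
    rng.poisson(30, 500),
    rng.lognormal(8, 1, 500),
    rng.integers(1, 6, 500),
]).astype(float)

# Standardize features so no single scale dominates the distance metric.
scaled = StandardScaler().fit_transform(customers)
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scaled)

for seg in range(4):
    members = customers[segments == seg]
    print(f"Segment {seg}: {len(members)} customers, "
          f"avg balance {members[:, 1].mean():,.0f}")
```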
    $95k-120k yearly 48d ago
  • Lead Data Science Consultant - Global Payments & Liquidity

    Wells Fargo 4.6company rating

    Data engineer job in Charlotte, NC

We are seeking a visionary and hands-on **Vice President-level Lead Data Scientist** to join our **Global Payments & Liquidity (GPL)** team. In this role, you will work with senior leaders to uncover insights, solve complex problems, and build data-driven solutions that fuel growth, improve products, and deepen customer engagement. You will analyze product portfolios, identify strategic opportunities, and design solutions that deepen customer relationships and accelerate GPL's growth.

**In this role you will:**
+ Partner and consult with business leaders to understand strategic goals and translate them into data science initiatives.
+ Lead the design and implementation of complex solutions and deliverables by utilizing advanced statistical techniques and machine learning models.
+ Collaborate with data engineering and business analysts to ensure scalable and secure deployment of analytics solutions.
+ Establish and enforce best practices related to data quality and governance.
+ Ensure data quality, model accuracy, and reproducibility through rigorous validation and documentation.
+ Communicate insights and recommendations clearly to both technical and non-technical stakeholders.

**Required Qualifications:**
+ 5+ years of data science experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
+ 5+ years of hands-on programming using SQL
+ 5+ years of Python experience
+ 1+ years of experience in a leadership or mentorship role
+ **Must have completed a Master's degree or higher in a quantitative discipline such as mathematics, statistics, engineering, physics, economics, business analytics, or computer science**

**Desired Qualifications:**
+ Prior experience supporting Agile product teams in a corporate environment
+ Knowledge and understanding of banking products and services, including Transaction Services in areas such as payables and receivables and working capital solutions
+ Knowledge of cloud platforms (e.g., AWS, Azure) and data visualization tools (e.g., Power BI, Tableau)

**Job Expectations:**
+ **This role will be required to work on-site in Charlotte, NC. Candidates outside of commuting distance must be willing to relocate.**

**Base Salary Range for Charlotte, NC:** **$139,000.00 - $217,000.00 annually. Role is eligible for annual discretionary performance bonus.**

**Posting End Date:** 28 Dec 2025
**_*Job posting may come down early due to volume of applicants._**

**We Value Equal Opportunity**
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions.
There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

**Applicants with Disabilities**
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo (******************************************************************).

**Drug and Alcohol Policy**
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy (**********************************************************************) to learn more.

**Wells Fargo Recruitment and Hiring Requirements:**
a. Third-Party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

**Req Number:** R-503546
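As a purely illustrative sketch of the SQL-plus-Python analysis loop the required qualifications describe, the example below uses an in-memory SQLite table with invented payment figures; it is not drawn from the posting:

```python
# Hedged sketch of a SQL-plus-Python workflow: aggregate with SQL, then
# continue the analysis in pandas. The table and figures are invented.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE payments (client TEXT, product TEXT, volume_usd REAL);
    INSERT INTO payments VALUES
        ('Acme Corp', 'wires', 1200000),
        ('Acme Corp', 'ach', 450000),
        ('Globex', 'wires', 300000),
        ('Globex', 'liquidity', 900000);
    """
)

# Pull an aggregate with SQL, then compute each product's share per client.
df = pd.read_sql_query(
    "SELECT client, product, SUM(volume_usd) AS volume "
    "FROM payments GROUP BY client, product",
    conn,
)
share = df.assign(pct=df["volume"] / df.groupby("client")["volume"].transform("sum"))
print(share.sort_values(["client", "pct"], ascending=[True, False]))
```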
    $139k-217k yearly 42d ago

Learn more about data engineer jobs

How much does a data engineer earn in Rock Hill, SC?

The average data engineer in Rock Hill, SC earns between $66,000 and $116,000 annually. This compares to the national average data engineer range of $80,000 to $149,000.

Average data engineer salary in Rock Hill, SC

$87,000

What are the biggest employers of Data Engineers in Rock Hill, SC?

The biggest employers of Data Engineers in Rock Hill, SC are:
  1. Atria Wealth Solutions
  2. Shutterfly