Data Scientist with GenAI and Python
Data scientist job in Charlotte, NC
Dexian is seeking a Data Scientist with GenAI and Python for an opportunity with a client located in Charlotte, NC.
Responsibilities:
Design, develop, and deploy GenAI models, including LLMs, GANs, and transformers, for tasks such as content generation, data augmentation, and creative applications
Analyze complex data sets to identify patterns, extract meaningful features, and prepare data for model training, with a focus on data quality for GenAI
Develop and refine prompts for LLMs, and optimize GenAI models for performance, efficiency, and specific use cases
Deploy GenAI models into production environments, monitor their performance, and implement strategies for continuous improvement and model governance
Work closely with cross-functional teams (e.g., engineering, product) to understand business needs, translate them into GenAI solutions, and effectively communicate technical concepts to diverse stakeholders
Stay updated on the latest advancements in GenAI and data science, and explore new techniques and applications to drive innovation within the organization
Utilize Python and its extensive libraries (e.g., scikit-learn, TensorFlow, PyTorch, Pandas, LangChain) for data manipulation, model development, and solution implementation
Requirements:
Proven hands-on experience implementing GenAI projects using open-source LLMs (Llama, GPT-OSS, Gemma, Mistral) and proprietary APIs (OpenAI, Anthropic)
Strong background in Retrieval-Augmented Generation (RAG) implementations; a minimal sketch follows this list
In-depth understanding of embedding models and their applications
Hands-on experience with Natural Language Processing (NLP) solutions on text data
Strong Python development skills; should be comfortable with pandas and NumPy for data analysis and feature engineering
Experience building and integrating APIs (REST, FastAPI, Flask) for serving models
Fine-tuning and optimizing open-source LLMs/SLMs is a big plus
Knowledge of Agentic AI frameworks and Orchestration
Experience in ML and Deep Learning is an advantage
Familiarity with cloud platforms (AWS/Azure/GCP)
Experience working with Agile Methodology
Strong problem-solving, analytical, and interpersonal skills
Ability to work effectively in a team environment
Strong written and oral communication skills
Should have the ability to clearly express ideas
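To illustrate the RAG and embedding requirements above, here is a minimal, framework-agnostic retrieval sketch. The model name, sample documents, and prompt template are assumptions made for the example, and the final LLM call (open-source or proprietary API) is deliberately left as a placeholder.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Assumptions: sentence-transformers is installed; "all-MiniLM-L6-v2" is an
# illustrative embedding model choice; documents and prompt are made up.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Invoices are processed within 30 days of receipt.",
    "Refund requests must include the original order number.",
    "Support tickets are triaged by severity, then by age.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ q  # normalized vectors, so dot product == cosine
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

def build_prompt(query: str) -> str:
    """Assemble a grounded prompt; the LLM call itself is framework-specific."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How long does invoice processing take?"))
```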
Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals.
Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit ******************* to learn more.
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Senior Data Scientist
Data scientist job in Charlotte, NC
About Us:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit ********************
Role: Senior Data Scientist - Generative AI & Solution Architecture
Location: Charlotte, NC
Work Mode - Hybrid (2-3 days from client office)
Experience: 10+ Years
Job Description
We are looking for a Senior Data Scientist to lead the design and implementation of a Generative AI-driven Contract Risk Reporting System. This role involves building advanced RAG pipelines, a reporting engine, and an interactive chatbot. The ideal candidate combines deep technical expertise and strong AI/ML fundamentals with the ability to translate legal contract language into actionable solution requirements, while engaging directly with clients.
Key Responsibilities:
Architect and deliver AI solutions for contract risk analysis and reporting.
Design and implement LLM-based RAG systems, reporting dashboards, and conversational interfaces.
Translate legal domain requirements into technical specifications.
Collaborate with clients for requirement gathering, solution validation, and presentations.
Ensure MLOps best practices, model monitoring, and Model Risk Management (MRM) compliance.
Required Skills:
Expertise in Generative AI, LLMs, RAG architectures, and NLP techniques.
Strong foundation in machine learning algorithms, model evaluation, and feature engineering.
Hands-on experience with MLOps pipelines, model governance, and risk controls.
Proficiency in Python, LangChain, Vector Databases, and Cloud platforms (AWS/Azure/GCP).
Ability to interpret legal contract language and map it to technical solutions.
Excellent communication, client engagement, and solution design skills.
Preferred:
Experience in legal tech, contract analytics, or risk management solutions.
Familiarity with prompt engineering, domain-specific fine-tuning, and LLM optimization.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Principal Data Architect
Data scientist job in Raleigh, NC
Our client, a leader in their industry, has an excellent opportunity for a Principal Data Architect in Raleigh, NC, on a 12-month+ contract with an option to hire if the fit is right. This position is hybrid, with three days on-site and two days remote.
Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, 401k with company matching, and life insurance.
Rate: $95-$115/hour (W2)
Responsibilities:
Define and drive the long-term data architecture vision and provide strategic data leadership for the organization
Lead the architecture and implementation of scalable ingestion and transformation patterns, establishing Snowflake FinOps practices and enterprise standards for Data Vault, dbt, and ingestion tooling.
Spearhead research initiatives in emerging data technologies and architectural paradigms
Shape enterprise-wide data governance policies and standards
Provide thought leadership and industry influence
Represent the organization at high-profile industry events and conferences
Mentor senior data professionals and foster a culture of innovation
Requirements:
12+ years of experience in enterprise data architecture or engineering, with a strong focus on data platforms and AI-enablement
Proven experience designing and delivering large-scale data systems in cloud-native environments, specifically Snowflake and AWS
Proven experience designing enterprise-scale data ingestion frameworks using Snowflake (Snowpipe, Tasks, Streams), Data Vault modeling, dbt, and modern ingestion tools such as Qlik, Fivetran, or Matillion.
Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field
Strong communication and executive presentation skills
Player-coach mindset and a highly collaborative approach
Passion for experimentation and continuous learning
Strategic thinking with an eye for emerging trends
Preferred:
Experience in regulated industries such as banking, healthcare, or insurance
Skills, experience, and other compensable factors will be considered when determining pay rate. The pay range provided in this posting reflects a W2 hourly rate; other employment options may be available and may result in pay outside of the range provided.
W2 employees of Eliassen Group who are regularly scheduled to work 30 or more hours per week are eligible for the following benefits: medical (choice of 3 plans), dental, vision, pre-tax accounts, other voluntary benefits including life and disability insurance, 401(k) with match, and sick time if required by law in the worked-in state/locality.
Please be advised- If anyone reaches out to you about an open position connected with Eliassen Group, please confirm that they have an Eliassen.com email address and never provide personal or financial information to anyone who is not clearly associated with Eliassen Group. If you have any indication of fraudulent activity, please contact ********************.
Job ID: JN-092025-103677
Data Management Analyst
Data scientist job in Charlotte, NC
We need strong data management resources with hands-on data provisioning experience and the ability to distribute data.
Moderate to Advanced SQL skills (writing complex queries is a plus)
Commercial Lending (iHub, WICS, WICDR systems)/Commercial Banking Background
Metadata/Data Governance
Regulatory Reporting
Data Management Framework
SQL
Data Quality
Data Conversion Engineer
Data scientist job in Charlotte, NC
Summary/Objective
Are you looking to work at a high growth, innovative, and purpose driven FinTech company? If so, you'll love Paymentus. Recognized by Deloitte as one of the fastest growing companies in North America, Paymentus is the premier provider of innovative, reliable, and secure electronic bill payment and presentment for more than 1700 clients. We are a SaaS provider that enables companies to help their customers simplify their financial lives. We do that by making it easier for consumers and businesses to pay bills, plus move and manage money to achieve strong financial health. We continually build upon a massively scalable platform, supporting thousands of businesses and millions of transactions on a daily basis. We're looking for high performers to join our team who excel in their expertise and who can transform plans into action. You'll have the opportunity to grow in an environment where intelligence, innovation, and leadership are valued and rewarded.
About the Role
The Data Conversion Engineer serves as a key member of the Platform Integrations team, providing technical support and guidance on data conversion projects. Conversions are an integral part of ensuring adherence to Paymentus' standards for a successful launch. This role is essential to ensure all bill payment data converts properly and efficiently onto the Paymentus platform.
Responsibilities
Develop data conversion procedures using SQL, Java and Linux scripting
Augment and automate existing manual procedures to optimize accuracy and reduce time for each conversion
Develop and update conversion mappers to interpret incoming data and manipulate it to match Paymentus' specifications (a small illustrative mapper sketch follows this list)
Develop new specifications to satisfy new customers and products
Serve as the primary point of contact/driver for all technical conversion activities
Review conversion calendar and offer technical support and solutions to meet deadlines and contract dates
Maintain and update technical conversion documentation to share with internal and external clients and partners
Work in close collaboration with implementation, integration, product and development teams using exceptional communication skills
Adapt and creatively solve encountered problems under high stress and tight deadlines
Learn database structure, business logic and combine all knowledge to improve processes
Be flexible
Monitor new client conversions and existing client support if needed; provide daily problem solving, coordination, and communication
Manage multiple projects and conversion implementations
Ability to proactively troubleshoot and solve problems with limited supervision
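As a rough illustration of the conversion-mapper responsibility above, the sketch below reshapes a hypothetical client extract into a target layout with pandas. The column names, mapping rules, and file paths are invented for the example; real conversions would follow Paymentus' internal specifications and tooling (SQL, Java, Linux scripting).

```python
# Illustrative data-conversion "mapper" sketch using pandas.
# All layouts, rules, and paths here are hypothetical.
import pandas as pd

COLUMN_MAP = {            # incoming column -> target column (hypothetical)
    "CUST_NO": "account_number",
    "CUST_NAME": "account_name",
    "BAL_DUE": "balance_due",
    "DUE_DT": "due_date",
}

def convert(incoming_path: str) -> pd.DataFrame:
    """Read a client extract and reshape it to the target specification."""
    df = pd.read_csv(incoming_path, dtype=str)
    df = df.rename(columns=COLUMN_MAP)[list(COLUMN_MAP.values())]
    df["balance_due"] = pd.to_numeric(df["balance_due"], errors="coerce").round(2)
    df["due_date"] = pd.to_datetime(df["due_date"], errors="coerce").dt.strftime("%Y-%m-%d")
    return df.dropna(subset=["account_number"])

# Example usage: convert("client_extract.csv").to_csv("converted.csv", index=False)
```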
Qualifications
B.S. Degree in Computer Science or comparable experience
Strong knowledge of Linux and the command line interface
Exceptional SQL skills
Experience with logging/monitoring tools (AWS Cloudwatch, Splunk, ELK, etc.)
Familiarity with various online banking applications and understanding of third-party integrations is a plus
Effective written and verbal communication skills
Problem Solver - recognizes the need to resolve issues quickly and effectively, uses logic to solve problems; identifies problems and brings forward multiple solution options; knows who/when to involve appropriate people when troubleshooting issues
Communication - ability to use formal and informal written and/or verbal communication channels to inform others; articulates ideas and thoughts clearly both verbally and in writing
Dynamic and self-motivated; able to work on their own initiative and deliver the objectives required to maintain service levels
Strong attention to detail
Proficiency with raw data, analytics, or data reporting tools
Preferred Skills
Background in the Payments, Banking, E-Commerce, Finance and/or Utility industries
Experience with front end web interfaces (HTML5, Javascript, CSS3)
Cloud technologies (AWS, GCP, Azure)
Work Environment
This job operates in a professional office environment. This role routinely uses standard office equipment such as laptop computers, photocopiers and smartphones.
Physical Demands
This role requires sitting or standing at a computer workstation for extended periods of time.
Position Type/Expected Hours of Work
This is a full-time position. Days and hours of work are Monday through Friday, 40 hours a week. Occasional evening and weekend work may be required as job duties demand.
Travel
No travel is required for this position.
Other Duties
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.
Equal Opportunity Statement
Paymentus is an equal opportunity employer. We enthusiastically accept our responsibility to make employment decisions without regard to race, religious creed, color, age, sex, sexual orientation, national origin, ancestry, citizenship status, religion, marital status, disability, military service or veteran status, genetic information, medical condition including medical characteristics, or any other classification protected by applicable federal, state, and local laws and ordinances. Our management is dedicated to ensuring the fulfillment of this policy with respect to hiring, placement, promotion, transfer, demotion, layoff, termination, recruitment advertising, pay, and other forms of compensation, training, and general treatment during employment.
Reasonable Accommodation
Paymentus recognizes and supports its obligation to endeavor to accommodate job applicants and employees with known physical or mental disabilities who are able to perform the essential functions of the position, with or without reasonable accommodation. Paymentus will endeavor to provide reasonable accommodations to otherwise qualified job applicants and employees with known physical or mental disabilities, unless doing so would impose an undue hardship on the Company or pose a direct threat of substantial harm to the employee or others. An applicant or employee who believes he or she needs a reasonable accommodation of a disability should discuss the need for possible accommodation with the Human Resources Department, or his or her direct supervisor.
Data Engineer
Data scientist job in Charlotte, NC
Resource Type: W2 Only
Location: Charlotte, NC (Hybrid)
Level: Mid (5-7 Years)
Role Description
A leading Japanese bank is in the process of driving a Digital Transformation across its Americas Division as it continues to modernize technology, strengthen its data-driven approach, and support future growth. As part of this initiative, the firm is seeking an experienced Data Engineer to support the design and development of a strategic enterprise data platform supporting Capital Markets and affiliated securities businesses.
This role will contribute to the development of a scalable, cloud-based data platform leveraging Azure technologies, supporting multiple business units across North America and global teams.
Role Objectives
Serve as a member of the Data Strategy team, supporting broker-dealer and swap-dealer entities across the Americas Division.
Participate in the active development of the enterprise data platform, beginning with the establishment of reference data systems for securities and pricing data, and expanding into additional data domains.
Collaborate closely with internal technology teams while adhering to established development standards and best practices.
Support the implementation and expansion of the strategic data platform on the bank's Azure Cloud environment.
Contribute technical expertise and solution design aligned with the overall Data Strategy roadmap.
Qualifications and Skills
Proven experience as a Data Engineer, with strong hands-on experience in Azure cloud environments.
Experience implementing solutions using:
Azure Cloud Services
Azure Data Factory
Azure Data Lake Gen2
Azure Databases
Azure Data Fabric
API Gateway management
Azure Functions
Strong experience with Azure Databricks.
Advanced SQL skills across relational and NoSQL databases.
Experience developing APIs using Python (FastAPI or similar frameworks); see the sketch after this list.
Familiarity with DevOps and CI/CD pipelines (Git, Jenkins, etc.).
Strong understanding of ETL / ELT processes.
Experience within financial services, including exposure to financial instruments, asset classes, and market data, is a strong plus.
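As a small illustration of the Python API requirement above, here is a minimal FastAPI scoring-service sketch. The model artifact path, feature shape, and endpoint name are assumptions for the example, not part of the posting.

```python
# Minimal FastAPI model-serving sketch.
# Assumptions: "model.joblib" is a previously trained scikit-learn model and
# the request/response shapes are purely illustrative.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-scoring-service")
model = joblib.load("model.joblib")  # hypothetical artifact path

class ScoringRequest(BaseModel):
    features: list[float]

class ScoringResponse(BaseModel):
    prediction: float

@app.post("/score", response_model=ScoringResponse)
def score(req: ScoringRequest) -> ScoringResponse:
    # scikit-learn expects a 2-D array: one row per observation
    pred = model.predict([req.features])[0]
    return ScoringResponse(prediction=float(pred))

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)
```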
Senior Data Engineer
Data scientist job in Charlotte, NC
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs, we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our Challenge:
We are looking for a skilled senior data engineer with comprehensive experience in designing, developing, and maintaining scalable data solutions within the financial and regulatory domains, with proven expertise in leading end-to-end data architectures, integrating diverse data sources, and ensuring data quality and accuracy.
Additional Information
The base salary for this position will vary based on geography and other factors. In accordance with law, the base salary for this role if filled within New York, NY is $135k - $155k/year & benefits (see below).
Work location: New York City, NY (Hybrid, 3 days in a week)
The Role
Responsibilities:
Advanced proficiency in Python, SQL Server, Snowflake, Azure Databricks, and PySpark.
Strong understanding of relational databases, ETL processes, and data modeling.
Expertise in system design, architecture, and implementing robust data pipelines.
Hands-on experience with data validation, quality checks, and automation tools (Autosys, Control-M).
Skilled in Agile methodologies, SDLC processes, and CI/CD pipelines.
Effective communicator with the ability to collaborate with business analysts, users, and global teams.
Requirements:
Overall 10+ years of IT experience is required
Collaborate with business stakeholders to gather technical specifications and translate business requirements into technical solutions.
Develop and optimize data models and schemas for efficient data integration and analysis.
Lead application development involving Python, Pyspark, SQL, Snowflake and Databricks platforms.
Implement data validation procedures to maintain high data quality standards.
Strong experience in SQL (writing complex queries, joins, tables, etc.)
Conduct comprehensive testing (UT, SIT, UAT) alongside business and testing teams.
Provide ongoing support, troubleshooting, and maintenance in production environments.
Contribute to architecture and design discussions to ensure scalable, maintainable data solutions.
Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Senior Data Engineer
Data scientist job in Durham, NC
We are seeking an experienced Senior Big Data & Cloud Engineer to design, build, and deliver advanced API and data solutions that support financial goal planning, investment insights, and projection tools. This role is ideal for a seasoned engineer with 10+ years of hands-on experience in big data processing, distributed systems, cloud-native development, and end-to-end data pipeline engineering.
You will work across retail, clearing, and custody platforms, leveraging modern cloud and big data technologies to solve complex engineering challenges. The role involves driving technology strategy, optimizing large-scale data systems, and collaborating across multiple engineering teams.
Key Responsibilities
Design and develop large-scale data movement services using Apache Spark (EMR) or Spring Batch (see the sketch after this list).
Build and maintain ETL workflows, distributed pipelines, and automated batch processes.
Develop high-quality applications using Java, Scala, REST, and SOAP integrations.
Implement cloud-native solutions leveraging AWS S3, EMR, EC2, Lambda, Step Functions, and related services.
Work with modern storage formats and NoSQL databases to support high-volume workloads.
Contribute to architectural discussions and code reviews across engineering teams.
Drive innovation by identifying and implementing modern data engineering techniques.
Maintain strong development practices across the full SDLC.
Design and support multi-region disaster recovery (DR) strategies.
Monitor, troubleshoot, and optimize distributed systems using advanced observability tools.
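The Spark-based data movement responsibility above can be sketched with a short PySpark batch job. The S3 paths, column names, and aggregation logic are hypothetical and stand in for a real pipeline running on EMR.

```python
# Illustrative PySpark batch job sketch (Spark on EMR or local).
# Assumptions: input/output S3 paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade-aggregation").getOrCreate()

trades = spark.read.parquet("s3://example-bucket/raw/trades/")  # hypothetical path

daily_totals = (
    trades
    .withColumn("trade_date", F.to_date("trade_timestamp"))
    .groupBy("trade_date", "account_id")
    .agg(
        F.sum("notional").alias("total_notional"),
        F.count("*").alias("trade_count"),
    )
)

daily_totals.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/daily_trade_totals/"
)
spark.stop()
```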
Required Skills:
10+ years of experience in software/data engineering with strong big data expertise.
Proven ability to design and optimize distributed systems handling large datasets.
Strong communicator who collaborates effectively across teams.
Ability to drive architectural improvements and influence engineering practices.
Customer-focused mindset with commitment to delivering high-quality solutions.
Adaptable, innovative, and passionate about modern data engineering trends.
Principal Big Data Engineer
Data scientist job in Durham, NC
Immediate need for a talented Principal Big Data Engineer. This is a 12+ months contract opportunity with long-term potential and is located in Durham, NC (Onsite). Please review the job description below and contact me ASAP if you are interested.
Job ID: 25-94747
Pay Range: $63-$73/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
We are seeking a highly motivated Data Engineer to join the Data Aggregation team.
Data Aggregation is a growing area, and we are looking for a skilled engineer to drive the design and development of industry-leading, external-facing API solutions.
The comprehensive API / data solutions will seek to bring together retail, clearing and custody capabilities to help external fintech partners with the financial goal planning, investment advice and financial projections capabilities to better serve our clients and more efficiently partner with them to accomplish their financial objectives.
Key Requirements and Technology Experience:
Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science) required, with 10 years of working experience
Big Data Processing: Apache Spark (EMR), Scala, distributed computing, performance optimization
Cloud & Infrastructure: AWS (S3, EMR, EC2, Lambda, Step Functions), multi-region DR strategy
Databases: Cassandra/YugaByte (NoSQL), Oracle, PostgreSQL, Snowflake
Data Pipeline: ETL design, API integration, batch processing
DevOps & CI/CD: Jenkins, Docker, Kubernetes, Terraform, Git
Monitoring & Observability: Splunk, Datadog APM, Grafana, CloudWatch
Orchestration: Control-M job scheduling, workflow automation
Financial domain experience
Our client is a leading Financial Industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
AWS Data Engineer
Data scientist job in Charlotte, NC
We are looking for a skilled and experienced AWS Data Engineer with 10+ Years of experience to join our team. This role requires hands-on expertise in AWS serverless technologies, Big Data platforms, and automation tools. The ideal candidate will be responsible for designing scalable data pipelines, managing cloud infrastructure, and enabling secure, reliable data operations across marketing and analytics platforms.
Key Responsibilities:
Design, build, and deploy automated CI/CD pipelines for data and application workflows.
Analyze and enhance existing data pipelines for performance and scalability.
Develop semantic data models to support activation and analytical use cases.
Document data structures and metadata using Collibra or similar tools.
Ensure high data quality, availability, and integrity across platforms.
Apply SRE and DevSecOps principles to improve system reliability and security.
Manage security operations within AWS cloud environments.
Configure and automate applications on AWS instances.
Oversee all aspects of infrastructure management, including provisioning and monitoring.
Schedule and automate jobs using tools like Step Functions, Lambda, Glue, etc.
Required Skills & Experience:
Hands-on experience with AWS serverless technologies: Lambda, Glue, Step Functions, S3, RDS, DynamoDB, Athena, CloudFormation, CloudWatch Logs (a brief Lambda-to-Glue sketch follows this list).
Proficiency in Confluent Kafka, Splunk, and Ansible.
Strong command of SQL and scripting languages: Python, R, Spark.
Familiarity with data formats: JSON, XML, Parquet, Avro.
Experience in Big Data engineering and cloud-native data platforms.
Functional knowledge of marketing platforms such as Adobe, Salesforce Marketing Cloud, and Unica/Interact (nice to have).
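As a brief illustration of the Lambda/Glue/Step Functions automation named above, the sketch below shows a Lambda handler that starts a Glue job with boto3. The Glue job name and argument keys are hypothetical; in practice the handler would typically be invoked by Step Functions or EventBridge.

```python
# Sketch of an AWS Lambda handler that kicks off a Glue job run.
# Assumptions: the Glue job name and argument keys are hypothetical.
import json
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Start a Glue job run; Step Functions or EventBridge would invoke this."""
    run = glue.start_job_run(
        JobName="curate-marketing-events",            # hypothetical job name
        Arguments={"--run_date": event.get("run_date", "latest")},
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"JobRunId": run["JobRunId"]}),
    }
```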
Preferred Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
AWS, Big Data, or DevOps certifications are a plus.
Experience working in hybrid cloud environments and agile teams.
Life at Capgemini
Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
Flexible work
Healthcare including dental, vision, mental health, and well-being programs
Financial well-being programs such as 401(k) and Employee Share Ownership Plan
Paid time off and paid holidays
Paid parental leave
Family building benefits like adoption assistance, surrogacy, and cryopreservation
Social well-being benefits like subsidized back-up child/elder care and tutoring
Mentoring, coaching and learning programs
Employee Resource Groups
Disaster Relief
Disclaimer
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please get in touch with your recruiting contact.
Click the following link for more information on your rights as an Applicant **************************************************************************
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Palantir Data Engineer
Data scientist job in Charlotte, NC
Build and maintain data pipelines and workflows in Palantir Foundry.
Design, train, and deploy ML models for classification, optimization, and forecasting use cases.
Apply feature engineering, data cleaning, and modeling techniques using Python, Spark, and ML libraries.
Create dashboards and data applications using Slate or Streamlit to enable operational decision-making (a minimal Streamlit sketch follows below).
Implement generative AI use cases using large language models (GPT-4, Claude, etc.)
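As a minimal illustration of the Streamlit dashboarding item above, here is a small sketch. The CSV path and column names are placeholders; in Palantir Foundry the data would normally come from a Foundry dataset rather than a local file.

```python
# Minimal Streamlit dashboard sketch.
# Assumptions: file path, columns, and metrics are invented for the example.
import pandas as pd
import streamlit as st

st.title("Operational Decision Dashboard")

@st.cache_data
def load_data() -> pd.DataFrame:
    return pd.read_csv("daily_metrics.csv")  # hypothetical extract

df = load_data()
site = st.selectbox("Site", sorted(df["site"].unique()))
filtered = df[df["site"] == site]

st.metric("Total units", int(filtered["units"].sum()))
st.line_chart(filtered.set_index("date")["units"])
# Run with: streamlit run dashboard.py
```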
AWS Data Engineer (Only W2)
Data scientist job in Charlotte, NC
Title: AWS Data Engineer
Experience: 10 years
Must Have Skills:
• Strong experience in AWS services, primarily serverless, databases, storage services, container services, schedulers, and batch services.
• Experience in Snowflake and Data Build Tool.
• Expertise in DBT, NodeJS and Python.
• Expertise in Informatica, PowerBI, Database, Cognos.
Nice to Have Skills:
Detailed Job Description:
• Strong experience in AWS services, primarily serverless, databases, storage services, container services, schedulers, and batch services.
• Experience in Snowflake and Data Build Tool. Expertise in DBT, NodeJS, and Python.
• Expertise in Informatica, PowerBI, Database, Cognos.
• Proven experience in leading teams across locations.
• Knowledge of DevOps processes, Infrastructure as Code and their purpose.
• Good understanding of data warehouses, their purpose, and implementation
• Good communication skills.
Kindly share your resume at ******************
Teradata, SAS Developer
Data scientist job in Charlotte, NC
Job Title: Teradata, SAS Developer
Duration: Fulltime Employment (Permanent)
Must Have Technical/Functional Skills:
Prior experience with bank data sources (for internal candidates)
Experience with DTS (for internal candidates)
Familiarity/prior experience with AGILE tools and methodologies, use of Jira, knowledge of Story development and organization from business requirements.
Excellent oral and written communication skills
Interpersonal skills, as this role works closely with both LOB partners and Vendor representatives
Highly proficient with Teradata SQL in performing ETL, building tables and views, and organizing data into usable, ready-to-access form for reporting and analytics
Proficient with Korn Shell (KSH) in a Linux environment
Knowledge of BTEQ
Strong application development skills related to data mart management
Foundational development knowledge to support reporting and analytics
Provide consultative capability in addressing complex business issues, data investigations, and user research requests
Assist with application management activities where needed: process flows, Business Process Management (BPM), and application governance (Rise deliverables)
Roles & Responsibilities:
Coordinates and facilitates routines to support delivery of technology solutions - e.g. kick-offs, status reviews, stakeholders meetings, change controls, and tollgates.
Plans and coordinates delivery and dependencies across multiple technology teams.
Facilitates dependency management, risk management, and impediment removal for the defined deliverables.
Promotes and facilitates communication and collaboration across organizations to support deliverable completion and timelines. Articulates clear updates and the critical path.
Gathers and facilitates project updates for the deliverables to stakeholders and leadership pertaining to delivery, risks/issues, and schedule.
Ensures that execution is aligned with deliverable requirements by working with the sponsor and stakeholders.
Identifies emerging risks/issues, escalates as needed, and identifies the critical path to resolution.
Executes appropriate due diligence and financial management routines to deliver against financial commitments.
Ensures deliverables comply with Enterprise Change Management standards and maintains evidence and systems of record for change.
Supports resource planning for delivery/execution.
Strong MS Excel skills to ensure costs reconcile with other systems of record.
Data Modeler
Data scientist job in Columbia, SC
Job Title: Data Modeler (3NF / Canonical Models)
We are seeking an experienced Data Modeler to design and maintain high-quality conceptual, logical, and physical data models. The role focuses on 3NF and canonical data modeling to support enterprise applications and analytics.
Responsibilities:
Develop and maintain conceptual, logical, and physical data models
Define data modeling standards, guidelines, and best practices
Work with business and technical stakeholders to translate requirements into data models
Ensure data integrity, consistency, and quality across systems
Document data models, metadata, data flows, and data dictionaries
Collaborate with development teams and participate in model reviews
Requirements:
Bachelor's or Master's degree in Computer Science, Information Systems, or related field
4+ years of hands-on data modeling experience
Strong experience with 3NF and/or canonical data models (a small illustrative schema sketch follows this list)
Proficiency with data modeling tools (Erwin, ER/Studio)
Strong SQL and relational database knowledge (Oracle, SQL Server, MySQL)
Familiarity with data warehousing and cloud platforms (AWS, Snowflake, Databricks preferred)
Excellent communication and problem-solving skills
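To make the 3NF requirement concrete, here is a small illustrative physical model expressed with SQLAlchemy. The entities (customer, order header, order line) and their columns are invented for the example; they simply show primary-key/foreign-key structure in third normal form.

```python
# Illustrative 3NF-style physical model sketch using SQLAlchemy 2.0.
# Assumptions: entity and column names are hypothetical examples.
from sqlalchemy import ForeignKey, String, create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class Customer(Base):
    __tablename__ = "customer"
    customer_id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100))

class OrderHeader(Base):
    __tablename__ = "order_header"
    order_id: Mapped[int] = mapped_column(primary_key=True)
    customer_id: Mapped[int] = mapped_column(ForeignKey("customer.customer_id"))

class OrderLine(Base):
    __tablename__ = "order_line"
    # Composite key: each line depends on the whole key (order_id, line_no)
    order_id: Mapped[int] = mapped_column(ForeignKey("order_header.order_id"), primary_key=True)
    line_no: Mapped[int] = mapped_column(primary_key=True)
    product_code: Mapped[str] = mapped_column(String(20))
    quantity: Mapped[int]

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)  # materialize the physical model for inspection
```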
Snowflake Data Engineer
Data scientist job in Durham, NC
Experience in development, proficiency in SQL, and knowledge of Snowflake cloud computing environments
Knowledge of data warehousing concepts and metadata management
Experience with data modeling, data lakes, multi-dimensional models, and data dictionaries
Hands-on experience with Snowflake features like Time Travel and Zero-Copy Cloning (a minimal sketch follows this list)
Experience in query performance tuning and cost optimization in a cloud data platform
Knowledge of Snowflake warehousing, architecture, processing, and administration; DBT; and pipelines
Hands-on experience with PL/SQL and Snowflake
Excellent personal communication, leadership, and organizational skills
Should be well versed in various design patterns
Knowledge of SQL databases is a plus
Hands-on Snowflake development experience is a must
Work with various cross-functional groups and tech leads from other tracks
Work closely with the team and guide them technically and functionally; must be a team player with a good attitude
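For the Snowflake Time Travel and Zero-Copy Cloning items above, here is a minimal sketch using the Snowflake Python connector. The connection parameters, table names, and offset are placeholders; credentials would normally come from a secrets manager rather than inline values.

```python
# Sketch of Snowflake Time Travel and zero-copy cloning from Python.
# Assumptions: all connection parameters and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # placeholder; use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: read the table as it looked one hour ago (offset in seconds)
cur.execute("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)")
print("row count one hour ago:", cur.fetchone()[0])

# Zero-Copy Cloning: instant copy that shares underlying micro-partitions
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

cur.close()
conn.close()
```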
Data Scientist
Data scientist job in Durham, NC
Temp
Data Scientist - Boston, MA, or NC, NH, RI, TX, or CO (18-month contract; probable extension or permanent conversion)
Notes:
Master's degree required
Python development to build time series models (see the sketch at the end of this posting)
Running SQL queries
Linux OS administration
Any web development experience preferred.
Experience working with Artificial Intelligence, Machine learning algorithms, neural networks, decision trees, modeling, Cloud Machine Learning, time series analysis and robotics process automation.
Description:
We are seeking a hands-on, experienced data scientist with financial services industry experience. As part of a small, nimble team, the associate's key differentiating abilities will be exceptional analytical skills and an ability to conceive of and develop differentiated products for the benefit of customers. Absolutely critical is the associate's ability to carry an initiative from idea through to execution.
5+ years' experience in Information security/technology risk management for large-scale, complex IT infrastructures and distributed environments or an equivalent combination of related training and experience
Analytic Skills: In addition to core regression, classification and time series skills that accompany the data science role, experience with next best action (NBA) prediction, multi-armed bandits, online learning, A/B testing, and experimentation methods are preferred
A natural programmer with confirmed industry experience in statistics and data modeling
Experience with one or more of the following tools/frameworks: Python, scikit-learn, NLTK, pandas, NumPy, R, PySpark, Scala, SQL/big data tools, TensorFlow, PyTorch, etc.
Education: At least one advanced degree (Master's or PhD level) in a technical or mathematically oriented discipline, e.g., coursework or experience in fields such as statistics, machine learning, computer science, applied mathematics, econometrics, engineering, etc.
Extensive experience in written and oral communications/presentations, and ability to produce a variety of business documents (business requirements, technical specs, slide presentations, etc.) that demonstrate command of language, clarity of thought, and orderliness of presentation
We are looking for an expert quantitative developer to advance the research and development of AI/ML methods as components in the delivery of creative investment management technology solutions. You will have experience combining multivariate statistical modeling, predictive machine learning methods, and open-source approaches to cloud computing and Big Data.
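As a small illustration of the Python time-series modeling called out in the notes above, here is an ARIMA sketch with statsmodels. The synthetic monthly series and the (1, 1, 1) order are assumptions for the example, not a recommended specification.

```python
# Minimal time-series modeling sketch with statsmodels (ARIMA).
# Assumptions: the synthetic monthly series stands in for real financial data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly series with trend + noise (placeholder for real data)
idx = pd.date_range("2020-01-01", periods=48, freq="MS")
rng = np.random.default_rng(0)
series = pd.Series(100 + 0.8 * np.arange(48) + rng.normal(0, 2, 48), index=idx)

model = ARIMA(series, order=(1, 1, 1)).fit()   # illustrative order choice
forecast = model.forecast(steps=6)             # six months ahead
print(forecast)
```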
Data Scientist
Data scientist job in Charlotte, NC
Pay Range: $75.00 to $85.00/hour
The Data Scientist IV will support strategic marketing initiatives, drive data-driven decision-making, and optimize campaign performance in a competitive telecom environment. The ideal candidate will have deep expertise in predictive modeling, statistical analysis, and data engineering collaboration, with a proven ability to translate complex data into business impact.
Have 7+ years of experience in advanced analytics, with a proven ability to deliver insights that drive business growth.
Hold a Bachelor's degree in Analytics, Statistics, Data Science, Marketing, or a related field (Master's degree preferred).
Have strong technical skills and are proficient in analytical tools and platforms, including Python, R, SQL, Tableau, and speech analytics tools (e.g., CallMiner, Verint, NICE).
Have experience with propensity modeling, machine learning, and NLP techniques (preferably within telecom).
Excel at analyzing data to uncover trends, predict outcomes, and create actionable recommendations.
Possess strong communication and collaboration skills to work effectively across teams and stakeholders.
Demonstrate critical thinking and a strong understanding of the telecom industry, including customer acquisition, churn reduction, and pricing strategies.
Data Scientist
Data scientist job in Charlotte, NC
Must Have Technical/Functional Skills: Strong Python and Machine Learning skillset.
We are seeking an experienced Data Scientist to lead end-to-end AI/ML solution design and implementation across a range of business domains in financial services. You will be responsible for architecting robust, scalable, and secure data science solutions that drive innovation and competitive advantage in the BFSI sector. This includes selecting appropriate technologies, defining solution blueprints, ensuring production readiness, and mentoring cross-functional teams. You will work closely with stakeholders to identify high-value use cases and ensure seamless integration of models into business applications. Your deep expertise in machine learning, cloud-native architectures, MLOps practices, and financial domain knowledge will be essential to influence strategy and deliver transformative business impact.
* Proficient in Python, scikit-learn, TensorFlow, PyTorch, HuggingFace.
* Strong BFSI domain knowledge.
* Experience with NLP, LLMs (GPT), and deep learning.
* Hands-on with MLOps pipelines and tools.
* Experience with graph analytics tools (Neo4j, TigerGraph, NetworkX); a small NetworkX sketch follows this list.
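To illustrate the graph analytics item above, here is a tiny NetworkX sketch. The account nodes, transfer amounts, and the choice of PageRank as the metric are all made up for the example.

```python
# Small graph-analytics sketch with NetworkX.
# Assumptions: the edge list of accounts and transfer amounts is invented.
import networkx as nx

edges = [
    ("acct_A", "acct_B", 1200.0),
    ("acct_B", "acct_C", 800.0),
    ("acct_C", "acct_A", 150.0),
    ("acct_D", "acct_B", 60.0),
]

G = nx.DiGraph()
G.add_weighted_edges_from(edges)

# PageRank as a simple measure of which accounts concentrate flows
ranks = nx.pagerank(G, weight="weight")
print(sorted(ranks.items(), key=lambda kv: kv[1], reverse=True))
```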
Roles & Responsibilities
* Architect and drive the design, development, and deployment of scalable ML/AI solutions.
* Lead data science teams through complete project lifecycles - from ideation to production.
* Define standards, best practices, and governance for AI/ML solutioning and model management.
* Collaborate with data engineering, MLOps, product, and business teams.
* Oversee integration of data science models into production systems.
* Evaluate and recommend ML tools, frameworks, and cloud-native solutions.
* Guide feature engineering, data strategy, and feature store design.
* Promote innovation with generative AI, reinforcement learning, and graph-based learning.
* Knowledge of Spark, PySpark, Scala.
* Experience leading CoEs or data science accelerators.
* Open-source contributions or published research.
TCS Employee Benefits Summary:
* Discretionary Annual Incentive.
* Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
* Family Support: Maternal & Parental Leaves.
* Insurance Options: Auto & Home Insurance, Identity Theft Protection.
* Convenience & Professional Growth: Commuter Benefits & Certification & Training Reimbursement.
* Time Off: Vacation, Time Off, Sick Leave & Holidays.
* Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
Salary Range: $100,000-$130,000 a year
Data Scientist, Product Observability
Data scientist job in Cary, NC
WHAT MAKES US EPIC?
At the core of Epic's success are talented, passionate people. Epic prides itself on creating a collaborative, welcoming, and creative environment. Whether it's building award-winning games or crafting engine technology that enables others to make visually stunning interactive experiences, we're always innovating.
Being Epic means being a part of a team that continually strives to do right by our community and users. We're constantly innovating to raise the bar of engine and game development.
ANALYTICS What We Do
Our Data & Analytics teams build powerful stories and visuals that inform the games we make, the technology we develop, and business decisions that drive Epic.
What You'll Do
Epic Games is seeking to expand its Insights team with a focus on Observability. We are looking for a Data Scientist to assist us in delivering and maintaining an Epic-quality live service game. This role wears many hats, juggles dynamic priorities, and works toward constantly moving targets. You'll create Data Tools, Workbooks, and Dashboards to support global Epic Development and Publishing efforts.
In this role, you will
Expand on Epic's real-time Observability Framework by building, maintaining, and working with stakeholders to produce insightful product health monitoring and alerting.
Educate key stakeholders on real-time data and provide clear sources of truth to make high-velocity data-driven decisions.
Advocate for the player - coordinate across multiple disciplines, teams, and time zones to maintain a high-quality live service product.
Dig into data, utilizing existing tools and rapid ad-hoc querying and visualization to help resolve active live issues.
Drive Observability evolution - continuously re-evaluate existing dashboards for opportunities to improve our data story. Work with Data Teams to leverage the latest data features and tools.
What we're looking for
Direct experience building or maintaining observability for live service games, including familiarity with instrumentation, event schemas, real-time alerting, and health dashboards
Proficiency with Grafana, DataDog, Kibana, or Amazon CloudWatch
Proficiency with JPath, PromQL, SQL, or strong knowledge of scripting languages.
Prior experience in analytics, project management, release management, production, or analytics QA.
Strong in-the-moment problem-solving skills and ability to break large problems into manageable and actionable pieces.
Ability to communicate equally effectively with both technical and non-technical stakeholders at all levels.
Flexible availability - including on-call - and strong time management skills.
EPIC JOB + EPIC BENEFITS = EPIC LIFE
We pay 100% for benefits for both employees and dependents and offer coverage for supplemental medical, dental, vision, critical illness, telemedicine, Life and AD&D, long term disability insurance as well as weekly indemnity (short term disability) and a retirement savings plan with a competitive employer match. In addition to the EAP (employee assistance program), we also offer a robust mental well-being program through Modern Health, which provides free therapy and coaching for employees & dependents.
British Columbia Base Pay Range: $138,840-$203,632 CAD
ABOUT US
Epic Games spans across 25 countries with 46 studios and 4,500+ employees globally. For over 25 years, we've been making award-winning games and engine technology that empowers others to make visually stunning games and 3D content that bring environments to life like never before. Epic's award-winning Unreal Engine technology not only provides game developers the ability to build high-fidelity, interactive experiences for PC, console, mobile, and VR, it is also a tool being embraced by content creators across a variety of industries such as media and entertainment, automotive, and architectural design. As we continue to build our Engine technology and develop remarkable games, we strive to build teams of world-class talent.
Like what you hear? Come be a part of something Epic!
Epic Games deeply values diverse teams and an inclusive work culture, and we are proud to be an Equal Opportunity employer. Learn more about our Equal Employment Opportunity (EEO) Policy here.
Note to Recruitment Agencies: Epic does not accept any unsolicited resumes or approaches from any unauthorized third party (including recruitment or placement agencies) (i.e., a third party with whom we do not have a negotiated and validly executed agreement). We will not pay any fees to any unauthorized third party. Further details on these matters can be found here.
Sr. Data Scientist
Data scientist job in Charlotte, NC
DPR Construction is seeking a skilled Senior Data Scientist to help advance our data-driven approach to building. In this role, you'll use statistical analysis, machine learning, and data visualization to turn complex construction and business data into actionable insights that improve project planning, cost forecasting, resource management, and safety. Working with project and operations teams, you'll build and deploy scalable, secure data solutions on cloud platforms like Azure and AWS, driving innovation and operational excellence across DPR's projects.
Responsibilities
* Data analysis and modeling: Analyze large datasets to identify trends, bottlenecks, and areas for improvement in operational performance. Build predictive and statistical models to forecast demand, capacity, and potential issues.
* Develop and deploy models: Build, test, and deploy machine learning and AI models to improve operational processes.
* Analyze operational data: Examine data related to projects, production, supply chains, inventory, and quality control to identify patterns, trends, and inefficiencies.
* Optimize processes: Use data-driven insights to streamline workflows, allocate resources more effectively, and improve overall performance.
* Forecast and predict: Create predictive models to forecast outcomes, such as demand, and inform strategic decisions (a brief modeling sketch follows this list).
* Communicate findings: Present findings and recommendations to stakeholders through reports, visualizations, and presentations.
* Ensure reliability: Build and maintain reliable, scalable, and efficient data science systems and processes.
* Collaboration: Partner with project managers, engineers, and business leaders to ensure data solutions are aligned with organizational goals and deliver tangible improvements.
* Continuous Learning: Stay current with advancements in data science and machine learning to continually enhance the company's data capabilities.
* Reporting and communication: Create dashboards and reports that clearly communicate performance trends and key insights to leadership and other stakeholders. Translate complex data into actionable recommendations.
* Performance monitoring: Implement data quality checks and monitor the performance of models and automated systems, creating feedback loops for continuous improvement.
* Experimentation: Design and evaluate experiments to quantify the impact of new systems and changes on operational outcomes.
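As a rough sketch of the predictive modeling responsibilities above, the example below fits a gradient-boosting regressor to synthetic project data with scikit-learn. The features, target, and data are entirely invented for illustration.

```python
# Illustrative predictive-modeling sketch with scikit-learn.
# Assumptions: features, target, and data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical features: planned duration (days), crew size, change-order count
rng = np.random.default_rng(42)
X = rng.uniform([30, 5, 0], [720, 120, 40], size=(500, 3))
y = 0.9 * X[:, 0] + 2.0 * X[:, 2] + rng.normal(0, 10, 500)  # "actual duration"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

pred = model.predict(X_test)
print("MAE (days):", round(mean_absolute_error(y_test, pred), 1))
```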
Qualifications
* Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
* 7+ years of experience in data science roles within AEC, product or technology organizations.
* At least 4 years of experience working with cloud platforms, specifically Azure and AWS, for model deployment and data management.
* Strong proficiency in Python or R for data analysis, modeling, and machine learning, with experience in relevant libraries (e.g., Scikit-learn, TensorFlow, PyTorch) and NLP frameworks (e.g., GPT, Hugging Face Transformers).
* Expertise in SQL for data querying and manipulation, and experience with data visualization tools (e.g., Power BI, Tableau).
* Solid understanding of statistical methods, predictive modeling, and optimization techniques.
* Expertise in statistics and causal inference, applied in both experimentation and observational causal inference studies.
* Proven experience designing and interpreting experiments and making statistically sound recommendations.
* Strategic and impact-driven mindset, capable of translating complex business problems into actionable frameworks.
* Ability to build relationships with diverse stakeholders and cultivate strong partnerships.
* Strong communication skills, including the ability to bridge technical and non-technical stakeholders and collaborate across various functions to ensure business impact.
* Ability to operate effectively in a fast-moving, ambiguous environment with limited structure.
* Experience working with construction-related data or similar industries (e.g., engineering, manufacturing) is a plus.
Preferred Skills
* Familiarity with construction management software (e.g., ACC, Procore, BIM tools) and knowledge of project management methodologies.
* Hands-on experience with Generative AI tools and libraries.
* Background in experimentation infrastructure or human-AI interaction systems.
* Knowledge of time-series analysis, anomaly detection, and risk modeling specific to construction environments.
DPR Construction is a forward-thinking, self-performing general contractor specializing in technically complex and sustainable projects for the advanced technology, life sciences, healthcare, higher education and commercial markets. Founded in 1990, DPR is a great story of entrepreneurial success as a private, employee-owned company that has grown into a multi-billion-dollar family of companies with offices around the world.
Working at DPR, you'll have the chance to try new things, explore unique paths and shape your future. Here, we build opportunity together by harnessing our talents, enabling curiosity and pursuing our collective ambition to make the best ideas happen. We are proud to be recognized as a great place to work by our talented teammates and leading news organizations like U.S. News and World Report, Forbes, Fast Company and Newsweek.
Explore our open opportunities at ********************