International Logistics Data Quality Analyst
Data analyst job in Atlanta, GA
Log-Net, Inc provides the finest trade, transportation, and logistics information technology systems via the internet. We currently have a unique opportunity in the areas of international logistics process improvement and trading partner management. Your primary roles would be to support the ongoing deployment of a global logistics solution and to facilitate the management of all supplier interfaces for our clients, including coordinating and monitoring activity with the trading partners, managing the EDI processes, ensuring quality control standards are maintained, and setting and maintaining agreed-upon service level standards. If you enjoy a challenging, fast-paced, and demanding environment with customer interaction, we invite you to take a closer look at this opportunity.
Responsibilities
Understand the day-to-day issues our business faces that can be better understood with data
Serve as the point of contact for EDI and reporting around EDI
Successfully onboard new trading partners into LOG-NET and serve as their point of contact
Compile and analyze data related to business EDI issues
Develop a full understanding of LOG-NET's platform and its capabilities
Conduct analysis, presentations, and reports on the supply chain outlook
Develop carrier scorecard reports and analysis
Coordinate integrated logistics communication with trading partners and carriers
Coordinate client problem solving and troubleshooting
Be a self-starter with technical aptitude and an eagerness to learn
Qualifications
Bachelor's or Master's degree in Logistics, Supply Chain, Statistics, Applied Mathematics, Physics or equivalent experience preferred
EDI or API integration experience preferred
0-1 years of analytical experience
Experience with data analytics, SQL, and BigQuery preferred
Experience in international supply chain preferred
Authorization to work in the United States is required for this position
SALARY RANGE: USD 50,000-70,000
GIS Data Analyst
Data analyst job in Atlanta, GA
We are seeking a highly skilled GIS Data Analyst to support the creation, maintenance, and quality assurance of critical geospatial infrastructure data for enterprise-level Engineering initiatives. This role plays a key part in ensuring the accuracy and reliability of GIS data used across the organization - including compliance-related programs and operational analytics.
The ideal candidate is a hands-on GIS professional with strong analytical skills, advanced geospatial editing experience, and the ability to interpret field data into accurate digital representations. PTC (Positive Train Control) and rail experience are preferred but not required.
Key Responsibilities
Create, modify, and quality-check geospatial infrastructure data for engineering and business operations
Utilize GIS tools to ensure accurate topology and track geometry representation
Convert field-collected spatial data into a validated digital rail network aligned with organizational standards
Review, approve, and promote data change sets submitted by GIS/CAD technicians
Conduct regular inventory analysis including track mileage, asset counts, and spatial measurements (see the sketch after this list)
Collaborate with engineering sub-groups and business partners to support enterprise GIS initiatives
Contribute to the preparation, assembly, and deployment of geospatial data to support compliance programs and corporate systems
Support continuous improvement by recommending cost-saving initiatives leveraging GIS technologies
Assist senior GIS staff in additional GIS responsibilities as needed
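As a hedged sketch of the kind of track-mileage inventory analysis listed above, the snippet below sums centerline geometry lengths by track type with ESRI's arcpy (an assumed tool choice, since the posting names ArcGIS Pro); the geodatabase path and field name are hypothetical.

```python
from collections import defaultdict

import arcpy

# Hypothetical track centerline feature class in a file geodatabase.
track_fc = r"C:\data\rail_network.gdb\TrackCenterline"

# Sum geometry length by track type; SHAPE@LENGTH is reported in the feature
# class's linear unit, so convert to miles separately if the data are in feet or meters.
mileage_by_type = defaultdict(float)
with arcpy.da.SearchCursor(track_fc, ["TRACK_TYPE", "SHAPE@LENGTH"]) as cursor:
    for track_type, length in cursor:
        mileage_by_type[track_type] += length

for track_type, total_length in sorted(mileage_by_type.items()):
    print(f"{track_type}: {total_length:,.1f} (feature-class linear units)")
```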
Required Skills & Qualifications
Advanced proficiency with GIS software, ideally ESRI tools (ArcGIS Pro, Desktop, geodatabase editing, topology management)
Strong analytical, problem-solving, and data quality assurance capabilities
Ability to interpret engineering drawings, field data, and spatial reference materials
Familiarity working with infrastructure or utility network datasets
Excellent communication and collaboration skills
Bachelor's degree required - GIS, Computer Science, Software Engineering, IT, Geography, or related
Preferred Qualifications (Nice to Have)
Exposure to railroad infrastructure or linear transportation networks
Experience supporting Positive Train Control (PTC) data models or compliance initiatives
Working knowledge of CAD-to-GIS workflows
Experience with Enterprise GIS deployments in large-scale organizations
Soft Skills
Detail-oriented data stewardship mindset
Ability to make informed decisions and manage competing priorities
Strong teamwork and communication in a technical environment
Data Strategy Analyst
Data analyst job in Atlanta, GA
Required Skills & Experience
5+ years in data analytics
Advanced SQL skills
Experience building business strategy from data findings
Collaborate with internal and external partners to implement recommendations
Utilize statistical techniques to evaluate strategy performance
Nice to Have Skills & Experience
Banking experience
Collections & recovery experience
Job Description
We are hiring a Strategy Data Analyst to join one of our financial clients, working in the Consumer & Small Business Collections & Recovery Strategy department. The ideal individual should have a deep understanding of data analysis and the ability to develop SQL code to extract statistical and financial data across databases. The goal of this role is to analyze data on delinquent customers to identify the most likely channels to reach each customer and the payment recovery offer the bank should propose.
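As a rough, hedged illustration of the channel analysis this posting describes (not the client's actual method), the sketch below compares recovery rates by outreach channel and delinquency bucket on a tiny invented dataset; every column name and value is hypothetical.

```python
import pandas as pd

# Hypothetical contact-attempt records for delinquent accounts (illustrative only).
attempts = pd.DataFrame({
    "account_id":    [1, 1, 2, 3, 3, 4, 5, 5, 6],
    "channel":       ["sms", "email", "phone", "sms", "phone", "email", "phone", "sms", "email"],
    "days_past_due": [35, 35, 70, 40, 40, 95, 65, 65, 30],
    "recovered":     [1, 0, 1, 0, 1, 0, 1, 0, 0],  # 1 = payment recovered after contact
})

# Bucket delinquency so channel performance can be compared within each stage.
attempts["dpd_bucket"] = pd.cut(
    attempts["days_past_due"],
    bins=[0, 30, 60, 90, 9999],
    labels=["0-30", "31-60", "61-90", "90+"],
)

# Recovery rate and contact volume by delinquency bucket and channel.
summary = (
    attempts.groupby(["dpd_bucket", "channel"], observed=True)["recovered"]
    .agg(contacts="count", recovery_rate="mean")
    .reset_index()
    .sort_values(["dpd_bucket", "recovery_rate"], ascending=[True, False])
)
print(summary)
```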
Data Management Analyst
Data analyst job in Charlotte, NC
We need strong Data Management resources with hands-on data provisioning experience and the ability to distribute data.
Moderate to Advanced SQL skills (writing complex queries is a plus)
Commercial Lending (iHub, WICS, WICDR systems)/Commercial Banking Background
Metadata/Data Governance
Regulatory Reporting
Data Management Framework
SQL
Data Quality
Data Analyst III
Data analyst job in Columbia, SC
Hours/Schedule: 8:30am to 5pm with a 30-minute lunch, or 8 to 5 with an hour lunch. Potential OT.
Hybrid - Onsite 3x per week.
Day to day: There is a development and recurring or operational focus for the analyst. The development work is consultative, customer-facing, and requires understanding I/S business processes.
Work involves facilitating meetings, gathering and documenting requirements, interacting with management and multiple teams to complete work, designing and developing a solution, and presenting outcomes to customers.
This work requires a technical focus as well as a business perspective and consultative mindset. The recurring or operational work consists of repeatable tasks and also includes one-time ad hoc requests.
Recurring reports and data entry are well defined and are as automated as possible. Recurring work is reviewed annually at a minimum to ensure it continues to meet business needs.
This work requires knowledge of reporting and data tools and communication with customers.
The operational work focuses on timeliness, accuracy, consistency, and providing the key customer insights and analysis to help with business needs and decisions.
Responsibilities:
Creates and analyzes reports to support operations.
Ensures correctness of analysis, and reports findings in a concise manner to senior management.
Directly responsible for accuracy of data as financial and operational decisions are made based on the data provided.
Generates internal and external reports to support management in determining productivity and efficiencies of programs or operational processes. Revises existing reports and develops new reports based on changing methodologies.
Analyzes reports to ensure accuracy and quality. Tracks and verifies all reporting statistics.
Communicates and trains employees and managers on the complex database programs used to generate analytical data.
Designs, codes, and maintains complex database programs for the extraction and analysis of data to support financial and operational decisions.
Experience:
4 Years Research and analysis experience.
Skills:
Strong organizational, customer service, communications, and analytical skills.
Advanced experience using complex mathematical calculations and an understanding of mathematical and statistical concepts.
Knowledge of relevant computer support systems.
Ability to train subordinate staff, including providing assistance/guidance to staff in the design/execution of reporting needs.
Proven experience with report writing and technical requirements analysis, data and business process modeling/mapping, and methodology development.
Strong understanding of relational database structures, theories, principles, and practices.
Required Software and Other Tools: Advanced knowledge of Microsoft Office. Knowledge of programming languages across various software platforms, using DB2, SQL, and/or relational databases. Knowledge of tools such as Visual Basic and Macros useful in automating reporting processes.
Education:
Bachelor's degree in Statistics, Computer Science, Mathematics, Business, Healthcare, or other related field.
About US Tech Solutions:
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. To know more about US Tech Solutions, please visit ************************
US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Recruiter Details:
Name: Shailesh
Email: *********************************
Internal Id: 25-53080
Data Governance Analyst
Data analyst job in Charlotte, NC
Key Responsibilities:
Identify and document Key Data Elements utilized in high-profile dashboards, key business processes, and other end uses of data.
Work closely with stakeholders to understand business processes, IT architecture, and data flows (particularly downstream effects) and to document systems of record (or authoritative data sources), Data Owners, Key Data Element attributes, and Data Lineage.
Together with Data Owners, participate in the design and testing of data quality rules to be applied to each Key Data Element.
Maintain the Business Glossary and report inventory (regulatory reports and non-regulatory reports).
Capture data quality issues reported by stakeholders and input detailed information in the Data Quality Incident Management system for tracking purposes.
Produce and monitor Data Quality KPIs.
Support root cause analysis when a data quality issue is identified and/or a process didn't work as expected.
Document business requirement for future system and/or workflow enhancements and relate such requirements to the Data Governance framework.
Work with data consumers to understand the source, creation process and purpose of data.
Qualifications And Skills:
Demonstrated experience in requirements gathering, documenting functional specifications, designing test scripts, and conducting data analysis and gap analysis in tandem with Data Owners and other stakeholders
Ability to present facts, project plans, milestones, achievements and recommended solutions in a concise and intuitive manner.
Highly organized individual with exceptional attention to detail, a strong sense of accountability, and strong collaboration skills.
Work Experience:
Relevant experience within the Data Governance field for a Financial Institution with focus on: documenting data requirements and data quality rules criteria; data quality issue logging and tracking.
EEO:
“Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans.”
Financial Data Analyst
Data analyst job in Alpharetta, GA
Ready to build the future with AI?
At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges.
If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Financial Data Analyst at Alpharetta, GA.
Role: Financial Data Analyst
Location: Alpharetta, GA 30005 / 3 days from office
Hiring Type: Full-time with Genpact + Benefits
Responsibilities
Define and execute the product roadmap for AI tooling and data integration initiatives, driving products from concept to launch in a fast-paced, Agile environment.
Translate business needs and product strategy into detailed requirements and user stories.
Collaborate with engineering, data, and AI/ML teams to design and implement data connectors that enable seamless access to internal and external financial datasets (see the sketch after this list).
Partner with data engineering teams to ensure reliable data ingestion, transformation, and availability for analytics and AI models.
Evaluate and work to onboard new data sources, ensuring accuracy, consistency, and completeness of fundamental and financial data.
Continuously assess opportunities to enhance data coverage, connectivity, and usability within AI and analytics platforms.
Monitor and analyze product performance post-launch to drive ongoing optimization and inform future investments.
Facilitate alignment across stakeholders, including engineering, research, analytics, and business partners, ensuring clear communication and prioritization.
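As a hedged, minimal sketch of the connector pattern mentioned in the responsibilities above (not any actual Genpact system), the snippet below pulls records from a hypothetical REST endpoint, flattens them with pandas, and persists them to Parquet for downstream analytics; the URL, token, and field layout are placeholders.

```python
import pandas as pd
import requests

API_URL = "https://example.com/api/v1/fundamentals"  # hypothetical endpoint
API_TOKEN = "REPLACE_ME"                              # placeholder credential


def fetch_fundamentals(ticker: str) -> pd.DataFrame:
    """Pull fundamental records for one ticker and return a flat DataFrame."""
    resp = requests.get(
        API_URL,
        params={"ticker": ticker},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumes the endpoint returns a JSON list of records.
    return pd.json_normalize(resp.json())


def load_to_parquet(tickers: list[str], path: str) -> None:
    """Minimal ingestion step: fetch, concatenate, and persist for analytics/AI use."""
    frames = [fetch_fundamentals(t) for t in tickers]
    df = pd.concat(frames, ignore_index=True)
    df.to_parquet(path, index=False)


if __name__ == "__main__":
    load_to_parquet(["AAPL", "MSFT"], "fundamentals.parquet")
```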
Minimum qualifications
Bachelor's degree in Computer Science, Finance, or related discipline. MBA/Master's Degree desired.
5+ years of experience in a similar role
Strong understanding of fundamental and financial datasets, including company financials, market data, and research data.
Proven experience in data integration, particularly using APIs, data connectors, or ETL frameworks to enable AI or analytics use cases.
Familiarity with AI/ML data pipelines, model lifecycle, and related tooling.
Experience working with cross-functional teams in an Agile environment.
Strong analytical, problem-solving, and communication skills with the ability to translate complex concepts into actionable insights.
Prior experience in financial services, investment banking, or research domains.
Excellent organizational and stakeholder management abilities with a track record of delivering data-driven products.
Preferred qualifications
Deep understanding of Python, SQL, or similar scripting languages
Knowledge of cloud data platforms (AWS, GCP, or Azure) and modern data architectures (data lakes, warehouses, streaming)
Familiarity with AI/ML platforms
Understanding of data governance, metadata management, and data security best practices in financial environments.
Experience with API standards (REST, GraphQL) and data integration frameworks.
Demonstrated ability to partner with engineering and data science teams to operationalize AI initiatives.
Why join Genpact?
• Lead AI-first transformation - Build and scale AI solutions that redefine industries
• Make an impact - Drive change for global enterprises and solve business challenges that matter
• Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
• Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
• Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build
• Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress
Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up.
Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Business Data Analyst (Mortgage)
Data analyst job in Reston, VA
The Business Analyst is responsible for leading the functional requirements gathering team. The candidate works directly with internal customers to understand the business environment and needs, identifies relevant design, process, and specification issues, and then mentors/assists lower-level Business Analysts in documenting and translating these business requirements. The candidate may be required to manage business and/or system issues during the project life cycle as well as post-implementation.
Skills:
1) Expertise with the Software Development Lifecycle (SDLC)
2) Strong oral and written communication skills
3) In-depth knowledge of client-server, object-oriented, and web-based systems, applications, environments, and relevant tools/technology
4) Prior management experience
5) Strong analytical skills: ability to identify and evaluate several alternative solutions and help the team arrive at the best functional requirement set to meet the business need
6) Knowledge of requirements tools such as Rational Requisite Pro desired
Education/Work Experience:
Bachelor's Degree or equivalent. 10+ years of software development experience, including projects of similar scope and complexity.
Business Reporting Analyst
Data analyst job in Charlotte, NC
Job Title: Business Reporting Analyst
Pay - Depending on experience
6-month contract with potential extensions
WHAT WILL YOU DO?
· Develop and test dashboards, visualizations, semantic data models, and reports. Focus will be on reporting development using SAP Business Objects.
· Develop high level and detailed designs for existing reporting platforms, development activities along with process improvements.
· Work with business users to gather requirements, troubleshoot report issues, and drive resolutions towards completion. Participate in user acceptance testing.
· Effectively communicate and collaborate with business users and T&I support teams, providing an understanding of the problem and resolution.
· Identify development and data quality issues, performing thorough testing and validation of reports, visualizations and dashboards in conjunction with business users.
· Research and trouble-shoot user-reported issues and incidents related to the BI reporting environment including performance, data discrepancies, access issues, etc.
· Be a self-starter, someone who will take the initiative to learn new things and research solutions without prompting.
· Ability to multi-task various projects and tasks with tight deliverables.
· Fosters and maintains good relationships with colleagues to meet expected customer service levels.
· Develop and maintain positive, productive, and professional relationships with key business users to meet expected customer service levels.
WHAT DO YOU NEED TO SUCCEED
Must-Have*
· Minimum 3 years of BI report development experience using Business Objects 4.2 or above. Includes experience with Universe Design (IDT/UDT).
· Minimum of 1 year of SQL logic development and support experience.
Skills and Knowledge
· Strong SQL and data modeling experience; the capability to troubleshoot joins, conduct performance tuning across heterogeneous sources, and validate the end-to-end data flow process
· Direct experience cleaning data and compiling disparate data sources across multiple databases, ensuring consistent semantic layers and governed data definitions.
· Direct experience with Oracle, MS SQL Server, and Snowflake databases is strongly preferred.
· BS in Computer Science or a related technical field preferred.
· Direct experience with Tableau or other business intelligence platforms is strongly preferred.
· Excellent analytical and problem-solving skills.
· Excellent oral and written communication and interpersonal skills.
· Strong organizational, multi-tasking, and time-management skills preferred.
· Ability to work independently or within a team for problem resolution
· Must demonstrate ability to multi-task and be flexible.
Data Scientist - ML, Python
Data analyst job in McLean, VA
10+ years of experience in Information Technology required.
• Python Programming: At least 5 years of hands-on experience with Python, particularly in frameworks like FastAPI, Django, and Flask, and experience using AI frameworks.
• Access Control Expertise: Strong understanding of access control models such as Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC).
• API and Connector Development: Experience developing API connectors using Python for extracting and managing access control data from platforms like Azure, SharePoint, Java, .NET, WordPress, etc. (see the sketch after this list).
• AI and Machine Learning: Hands-on experience integrating AI into applications for automating tasks such as access control reviews and identifying anomalies.
• Cloud and Microsoft Technologies: Proficiency with Azure services and the Microsoft Graph API, and experience integrating Python applications with Azure for access control reviews and reporting.
• Reporting and Visualization: Experience using reporting libraries in Python (Pandas, Matplotlib, Plotly, Dash) to build dashboards and reports related to security and access control metrics.
• Communication Skills: Ability to collaborate with various stakeholders, explain complex technical solutions, and deliver high-quality solutions on time.
• PlainID: Experience or familiarity with PlainID platforms for identity and access management.
• Azure OpenAI: Familiarity with Azure OpenAI technologies and their application in access control and security workflows.
• Power BI: Experience with Microsoft Power BI for data visualization and reporting.
• Agile Methodologies: Experience working in Agile environments and familiarity with Scrum methodologies for delivering security solutions.
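As a hedged sketch of the API-connector work described above, the snippet below uses MSAL client-credentials auth to pull user records from the Microsoft Graph API into a Pandas DataFrame for access-review reporting; the tenant ID, client ID, secret, and selected fields are placeholders you would supply, not values from this posting.

```python
import msal
import pandas as pd
import requests

TENANT_ID = "your-tenant-id"          # placeholder
CLIENT_ID = "your-app-client-id"      # placeholder
CLIENT_SECRET = "your-client-secret"  # placeholder

GRAPH_BASE = "https://graph.microsoft.com/v1.0"


def get_token() -> str:
    """Acquire an app-only token for Microsoft Graph via client credentials."""
    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    if "access_token" not in result:
        raise RuntimeError(f"Token request failed: {result.get('error_description')}")
    return result["access_token"]


def fetch_users(token: str) -> pd.DataFrame:
    """Page through /users and return a flat DataFrame of basic account attributes."""
    headers = {"Authorization": f"Bearer {token}"}
    url = f"{GRAPH_BASE}/users?$select=id,displayName,userPrincipalName,accountEnabled"
    rows = []
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("value", []))
        url = payload.get("@odata.nextLink")  # present when more pages remain
    return pd.DataFrame(rows)


if __name__ == "__main__":
    df = fetch_users(get_token())
    print(df.head())
```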
Data Integration Specialist
Data analyst job in Raleigh, NC
This resource will support the NC HIEA Medicaid Services (HMS) project by leading technical assistance work for NC HIEA as we work with participants and their electronic health record (EHR) vendors on implementing, improving, and maintaining data integrations. Supports and monitors technical infrastructure for NC HIEA and SAS environments. Manages change orders, bug fixes, and updates for the NC HealthConnex Clinical Portal. Develops and supports data queries to assist with data analysis, verifying data extracts, data transformation, load job streams, and developing business analysis reports. Supports various analytic and outbound data services with a specific focus on developing queries utilizing SQL, SAS, and InterSystems HealthShare tools to perform exploratory data analysis (EDA) and reports as needed.
Required/Desired Skills
Skill / Required or Desired / Amount of Experience:
Effective Communication & Community Engagement - Clearly conveys information to diverse populations and healthcare stakeholders (Required, 5 years)
Technical Proficiency - Skilled in healthcare data systems, digital platforms, and outreach technologies (Required, 5 years)
Relationship Building - Builds trust with providers, patients, and partners to support outreach goals (Required, 5 years)
Data-Driven Outreach - Gathers, analyzes, and reports data to guide strategic engagement (Required, 5 years)
Organized & Adaptable - Manages multiple priorities and adjusts to evolving community and technical needs (Required, 5 years)
SAP Master Data Steward
Data analyst job in Mooresville, NC
DEHN protects. Two words, a big promise. The motto of our company has been both an obligation and an incentive for our family-owned company since 1910. Headquartered in Bavaria, Germany, DEHN's mission is to provide world-class lightning and surge protection solutions for people, building installations, and electrical/electronic devices and systems against the effects of lightning and surges. For the past 115 years, we have been leading the development of surge protection, lightning protection, and safety equipment, making DEHN the most experienced and trusted expert for a total protection concept.
Business Overview:
DEHN Inc. is the USA subsidiary for DEHN SE (ISO 9001/14001 certified). We focus on solutions for lightning and surge-related problems as they apply to the North American market. These solutions include education, technical assistance, system design, risk assessments and site surveys in addition to the lightning and electrical surge protection products. Our philosophy is to use best practices from the IEC and our experience globally and apply them to the USA IEEE and NEC standards.
By combining the best technologies and processes from international and domestic markets, DEHN assures the customer will receive the most comprehensive solution tailored to their specific application. Our customers include commercial, communications, energy, electronics, industrial, hospitality, infrastructure, medical, security & defense markets. Companies depend on DEHN solutions to ensure their facilities and assets run efficiently and without fail, protecting the plant, people, equipment and the critical services they provide to the public and industry. With over 115 years in business, we have two words… DEHN protects.
Position Overview
The SAP Master Data Steward supports and executes data management, data quality, and data cleansing activities to ensure accurate and reliable material master data across the organization. This role works across current and future SAP environments and partners closely with Procurement, Production, Data Governance, and IT to maintain high-quality master data for both purchased and manufactured materials.
Key Responsibilities
Review, validate, and approve new material master data creation and change requests.
Create material master records for purchased and manufactured materials using business-provided information and templates.
Execute data cleansing and retrofit activities before and after SAP go-live.
Perform quality checks to ensure accuracy and consistency of material master data.
Act as a liaison between business units and the Data Governance team.
Provide guidance to key users on the structure and interdependencies of master data fields.
Support rollout of new master data guidelines, standards, and policies.
Monitor and drive progress for new material creation and extension requests.
Execute master data cleanup activities, including authorization cleanup in coordination with IT.
Communicate best practices in master data control, governance, and data quality standards across business areas.
Identify and recommend process improvements to enhance data quality and prevent recurrence of data issues.
Assist with data migration activities for legacy data moving into SAP.
Partner with SAP Migration teams and business units on data governance transitions.
Coordinate dual-maintenance activities during cutover periods.
Participate in assigned projects related to master data, data quality, or deployment.
Work on-site as required.
Qualifications
Required
Experience in material master data creation, governance, or stewardship.
Working knowledge of ERP master data processes; SAP experience preferred (Materials, BOMs, Routings, PIRs, Source Lists).
Strong attention to detail and commitment to data accuracy.
Ability to manage a high-volume workload in a fast-paced environment.
Strong analytical skills and advanced Excel capabilities.
Effective communication and interpersonal skills with a customer-service mindset.
Ability to work collaboratively across business and technical teams.
Associate's degree in Business, IT, Supply Chain, or related field or equivalent professional experience.
Preferred
Experience in a purchasing or manufacturing environment.
Prior support experience with ERP systems in manufacturing settings (SAP preferred).
Bachelor's degree in Business, IT, Supply Chain, or related field or equivalent professional experience.
Minimum 3 years' experience in master data, data quality, purchasing, or related functions.
Clinical Data Analyst
Data analyst job in Raleigh, NC
Role: Technical Specialist - Expert
Duration: 12 months
The Technical Specialist will support the Medicaid Services project by working with healthcare participants and EHR vendors to implement and enhance data integrations. The role includes managing the technical infrastructure for SAS environments, handling updates and fixes for the Health Connex Clinical Portal, and developing data queries for analysis and reporting. The specialist will use SQL, SAS, and InterSystems HealthShare for exploratory data analysis and various outbound data services.
Key Skills:
• Strong communication with healthcare stakeholders
• Technical expertise in healthcare data systems
• Ability to build relationships with providers and partners
• Experience in data analysis and reporting
• Highly organized and adaptable
Principal Data Scientist with Gen AI
Data analyst job in McLean, VA
Title: Principal Data Scientist with Gen AI
Contract: W2
Exp: 10+
Duration: Long Term
Interview Mode: In-Person interview
Call Notes:
Looking for a Principal Data Scientist with a strong focus on Generative AI (GenAI) and expertise in Machine Learning who has transitioned into GenAI. Need someone with solid experience in RAG, Python (Jupyter), other software tooling, using agents in workflows, and a strong understanding of data.
Someone with advanced proficiency in Prompt Engineering, Large Language Models (LLMs), RAG, Graph RAG, MCP, A2A, multi-modal AI, Gen AI Patterns, Evaluation Frameworks, Guardrails, data curation, and AWS cloud deployments.
Strong preference for candidates who have built AI agents, MCP, A2A, and Graph RAG solutions and have deployed Gen AI applications to production.
Top Skills:
Machine Learning & Deep Learning - Required
GenAI - Required
Python - Required
RAG and/or Graph RAG - Required
MCP (Model Context Protocol) and A2A (Agent-to-Agent) is highly preferred
Job Description:
We are seeking a highly experienced Principal Gen AI Scientist with a strong focus on Generative AI (GenAI) to lead the design and development of cutting-edge AI Agents, Agentic Workflows and Gen AI Applications that solve complex business problems. This role requires advanced proficiency in Prompt Engineering, Large Language Models (LLMs), RAG, Graph RAG, MCP, A2A, multi-modal AI, Gen AI Patterns, Evaluation Frameworks, Guardrails, data curation, and AWS cloud deployments. You will serve as a hands-on Gen AI (data) scientist and critical thought leader, working alongside full stack developers, UX designers, product managers and data engineers to shape and implement enterprise-grade Gen AI solutions.
Key Responsibilities:
* Architect and implement scalable AI Agents, Agentic Workflows and GenAI applications to address diverse and complex business use cases.
* Develop, fine-tune, and optimize lightweight LLMs; lead the evaluation and adaptation of models such as Claude (Anthropic), Azure OpenAI, and open-source alternatives.
* Design and deploy Retrieval-Augmented Generation (RAG) and Graph RAG systems using vector databases and knowledge bases.
* Curate enterprise data using connectors integrated with AWS Bedrock's Knowledge Base/Elastic
* Implement solutions leveraging MCP (Model Context Protocol) and A2A (Agent-to-Agent) communication.
* Build and maintain Jupyter-based notebooks using platforms like SageMaker and MLFlow/Kubeflow on Kubernetes (EKS).
* Collaborate with cross-functional teams of UI and microservice engineers, designers, and data engineers to build full-stack Gen AI experiences.
* Integrate GenAI solutions with enterprise platforms via API-based methods and GenAI standardized patterns.
* Establish and enforce validation procedures with Evaluation Frameworks, bias mitigation, safety protocols, and guardrails for production-ready deployment.
* Design & build robust ingestion pipelines that extract, chunk, enrich, and anonymize data from PDFs, video, and audio sources for use in LLM-powered workflows, leveraging best practices like semantic chunking and privacy controls
* Orchestrate multimodal pipelines using scalable frameworks (e.g., Apache Spark, PySpark) for automated ETL/ELT workflows appropriate for unstructured media
* Implement embedding pipelines: map media content to vector representations using embedding models, and integrate with vector stores (AWS Knowledge Base/Elastic/Mongo Atlas) to support RAG architectures (see the sketch after this list)
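As a hedged, minimal sketch of the chunk-and-embed ingestion step described above (not the employer's actual pipeline), the snippet below splits extracted text into overlapping chunks and embeds them with the sentence-transformers library; that library, the model name, and the chunk sizes are assumptions for illustration, since the posting names AWS Bedrock Knowledge Base/Elastic rather than a specific embedding stack.

```python
from sentence_transformers import SentenceTransformer


def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Naive fixed-size chunking with overlap; a real pipeline would chunk semantically."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks


def embed_chunks(chunks: list[str], model_name: str = "all-MiniLM-L6-v2"):
    """Return (chunk, vector) pairs ready to upsert into a vector store."""
    model = SentenceTransformer(model_name)
    vectors = model.encode(chunks, normalize_embeddings=True)
    return list(zip(chunks, vectors))


if __name__ == "__main__":
    doc = "Extracted text from a PDF, transcript, or audio caption goes here. " * 20
    records = embed_chunks(chunk_text(doc))
    print(f"{len(records)} chunks embedded; vector dim = {len(records[0][1])}")
```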
Required Qualifications:
* 10+ years of experience in AI/ML, with 3+ years in applied GenAI or LLM-based solutions.
* Deep expertise in prompt engineering, fine-tuning, RAG, GraphRAG, vector databases (e.g., AWS KnowledgeBase / Elastic), and multi-modal models.
* Proven experience with cloud-native AI development (AWS SageMaker, Bedrock, MLFlow on EKS).
* Strong programming skills in Python and ML libraries (Transformers, LangChain, etc.).
* Deep understanding of Gen AI system patterns and architectural best practices, Evaluation Frameworks
* Demonstrated ability to work in cross-functional agile teams.
* Need Github Code Repository Link for each candidate. Please thoroughly vet the candidates.
Preferred Qualifications:
* Published contributions or patents in AI/ML/LLM domains.
* Hands-on experience with enterprise AI governance and ethical deployment frameworks.
* Familiarity with CI/CD practices for ML Ops and scalable inference APIs.
Data scientist
Data analyst job in Reston, VA
Job title: Data scientist
Fulltime
About Smart IT Frame:
At Smart IT Frame, we connect top talent with leading organizations across the USA. With over a decade of staffing excellence, we specialize in IT, healthcare, and professional roles, empowering both clients and candidates to grow together.
Note:
• In-person interview
Must Have:
• Data science
• Python
• SQL
• MLOps
• Risk Modelling
📩 Apply today or share profiles at ****************************
Data Scientist with GenAI and Python
Data analyst job in Charlotte, NC
Dexian is seeking a Data Scientist with GenAI and Python for an opportunity with a client located in Charlotte, NC.
Responsibilities:
Design, develop, and deploy GenAI models, including LLMs, GANs, and transformers, for tasks such as content generation, data augmentation, and creative applications
Analyze complex data sets to identify patterns, extract meaningful features, and prepare data for model training, with a focus on data quality for GenAI
Develop and refine prompts for LLMs, and optimize GenAI models for performance, efficiency, and specific use cases
Deploy GenAI models into production environments, monitor their performance, and implement strategies for continuous improvement and model governance
Work closely with cross-functional teams (e.g., engineering, product) to understand business needs, translate them into GenAI solutions, and effectively communicate technical concepts to diverse stakeholders
Stay updated on the latest advancements in GenAI and data science, and explore new techniques and applications to drive innovation within the organization
Utilize Python and its extensive libraries (e.g., scikit-learn, TensorFlow, PyTorch, Pandas, LangChain) for data manipulation, model development, and solution implementation
Requirements:
Proven hands-on experience implementing Gen AI projects using open-source LLMs (Llama, GPT OSS, Gemma, Mistral) and proprietary APIs (OpenAI, Anthropic)
Strong background in Retrieval-Augmented Generation implementations
In-depth understanding of embedding models and their applications
Hands-on experience with Natural Language Processing (NLP) solutions on text data
Strong Python development skills; should be comfortable with Pandas and NumPy for data analysis and feature engineering
Experience building and integrating APIs (REST, FastAPI, Flask) for serving models (see the sketch after this list)
Fine-tuning and optimizing open-source LLMs/SLMs is a big plus
Knowledge of Agentic AI frameworks and Orchestration
Experience in ML and Deep Learning is an advantage
Familiarity with cloud platforms (AWS/Azure/GCP)
Experience working with Agile Methodology
Strong problem solving, analytical and interpersonal skills
Ability to work effectively in a team environment
Strong written and oral communication skills
Should have the ability to clearly express ideas
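As a minimal, hedged sketch of the model-serving pattern mentioned in the requirements above (FastAPI chosen purely for illustration, with a stubbed model standing in for a real LLM or classifier), a prediction endpoint might look like this:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Model serving sketch")


class PredictRequest(BaseModel):
    text: str


class PredictResponse(BaseModel):
    label: str
    score: float


def fake_model(text: str) -> tuple[str, float]:
    """Stand-in for a real model; swap in an LLM call or trained pipeline here."""
    score = min(len(text) / 100.0, 1.0)
    return ("long" if score > 0.5 else "short"), score


@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    label, score = fake_model(req.text)
    return PredictResponse(label=label, score=score)

# Run locally with: uvicorn app:app --reload  (assuming this file is saved as app.py)
```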
Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals.
Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit ******************* to learn more.
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Snowflake Data Scientist (Need local to Charlotte, NC)
Data analyst job in Charlotte, NC
Job Title: Senior Snowflake Data Scientist
Long term Contract
For data scientists, an additional skill set is required in AI/ML, RAG & LLM models, and agentic AI.
The Senior Snowflake Data Scientist will lead the development, deployment, and operationalization of machine learning and statistical models that solve complex business problems and drive strategic decision-making. This role requires an expert blend of statistical rigor, advanced programming, and deep knowledge of leveraging Snowflake's ecosystem (e.g., Snowpark, Streamlit, external functions) for high-performance, in-warehouse data science.
Key Responsibilities
1. Advanced Modeling & Analysis
Model Development: Design, build, train, and validate sophisticated machine learning (ML) and statistical models (e.g., predictive, prescriptive, clustering, forecasting) to address key business challenges (e.g., customer churn, sales forecasting, risk modeling).
Feature Engineering: Utilize advanced SQL and Python/Snowpark to perform large-scale feature engineering, data transformation, and preparation directly within Snowflake, ensuring high data quality and low latency for modeling.
A/B Testing & Causal Inference: Design and analyze experiments (A/B tests) and employ causal inference techniques to measure the business impact of product features, strategies, and model outputs.
2. MLOps & Production Deployment
Operationalization: Lead the process of deploying trained models into production environments, utilizing Snowpark, Snowflake UDFs/UDTFs, and external functions for scalable inference and real-time scoring (see the sketch after this section).
Pipeline Automation: Collaborate with Data Engineering to integrate ML pipelines into CI/CD workflows, ensuring models are automatically retrained and redeployed using tools like Airflow or orchestration platforms.
Monitoring: Establish and maintain robust monitoring for model performance (drift, bias, accuracy) and operational health within the Snowflake environment.
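As a hedged illustration of in-warehouse scoring with Snowpark (a minimal sketch, assuming connection parameters are defined elsewhere; the table names and the toy scoring function are hypothetical stand-ins for a real trained model):

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

# Assumes connection_parameters (account, user, role, warehouse, etc.) are defined elsewhere.
session = Session.builder.configs(connection_parameters).create()


def churn_score(tenure_months: float, balance: float) -> float:
    """Toy scoring function standing in for a trained model's predict_proba."""
    raw = 0.03 * balance / 1000.0 - 0.02 * tenure_months
    return max(0.0, min(1.0, 0.5 + raw))


# Register the function as a Snowflake UDF so inference runs inside the warehouse.
score_udf = session.udf.register(
    churn_score,
    name="CHURN_SCORE",
    replace=True,
)

features = session.table("ANALYTICS.ML.CUSTOMER_FEATURES")  # hypothetical table
scored = features.with_column(
    "CHURN_SCORE", score_udf(col("TENURE_MONTHS"), col("BALANCE"))
)
scored.write.save_as_table("ANALYTICS.ML.CUSTOMER_SCORES", mode="overwrite")
```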
3. Data Visualization & Storytelling
Insight Generation: Conduct deep-dive exploratory data analysis (EDA) using complex Snowflake SQL to uncover hidden patterns, opportunities, and risks.
Visualization & Communication: Effectively communicate complex analytical findings, model outputs, and recommendations to technical and non-technical stakeholders and senior leadership using compelling data storytelling and visualization tools (e.g., Tableau, Power BI, or Snowflake Streamlit).
4. Platform & Technical Leadership
Best Practices: Define and promote best practices for statistical rigor, ML coding standards, and efficient data processing within the Snowflake ecosystem.
Mentorship: Provide technical guidance and mentorship to junior data scientists and analysts on modeling techniques and leveraging Snowflake's data science features.
Innovation: Stay current with the latest features of the Snowflake Data Cloud (e.g., Generative AI/LLMs, Unistore, Data Sharing) and propose innovative ways to leverage them for business value.
Minimum Qualifications
MS or Ph.D. in a quantitative discipline (e.g., Statistics, Computer Science, Engineering, Economics, or Mathematics).
7+ years of progressive experience in Data Science, with at least 3+ years of hands-on experience building and deploying ML solutions in a cloud data warehouse environment, preferably Snowflake.
Expert proficiency in Python (including packages like scikit-learn, NumPy, Pandas) and writing scalable code for data processing.
Expert-level command of Advanced SQL for complex data manipulation and feature engineering.
Proven experience with Machine Learning algorithms and statistical modeling techniques.
Strong understanding of MLOps principles for model lifecycle management.
Preferred Skills & Certifications
Snowflake SnowPro Advanced: Data Scientist Certification.
Hands-on experience developing solutions using Snowpark (Python/Scala).
Experience building data apps/dashboards using Snowflake Streamlit.
Familiarity with cloud platforms and services (AWS Sagemaker, Azure ML, or GCP Vertex AI) integrated with Snowflake.
Experience with workflow orchestration tools (e.g., Apache Airflow, dbt).
Data Scientist
Data analyst job in Chattanooga, TN
BUILT TO CONNECT
At Astec, we believe in the power of connection and the importance of building long-lasting relationships with our employees, customers and the communities we call home. With a team more than 4,000 strong, our employees are our #1 advantage. We invest in skills training and provide opportunities for career development to help you grow along with the business. We offer programs that support physical safety, as well as benefits and resources to enhance total health and wellbeing, so you can be your best at work and at home.
Our equipment is used to build the roads and infrastructure that connects us to each other and to the goods and services we use. We are an industry leader known for delivering innovative solutions that create value for our customers. As our industry evolves, we are using new technology and data like never before.
We're looking for creative problem solvers to build the future with us. Connect with us today and build your career at Astec.
LOCATION: Chattanooga, TN On-site / Hybrid (Role must report on-site regularly)
ABOUT THE POSITION
The Data Scientist will play a key role in establishing the analytical foundation of Astec Smart Services. This individual will lead efforts to build pipelines from source to cloud, define data workflows, build predictive models, and help guide the team's approach to turning data into customer value. He or she will work closely within Smart Services and cross-functionally to ensure insights are actionable and impactful. The role blends data architecture, data engineering, and data science to build the Smart Services analytical foundation. This person will be instrumental in helping to build Astec's digital transformation and aftermarket strategy.
Deliverables & Responsibilities
Data Engineering:
Build and maintain robust data pipelines for ingestion, transformation, and storage.
Optimize ETL processes for scalability and performance.
Data Architecture:
Design and implement data models that support analytics and operational needs.
Define standards for data governance, security, and integration.
Data Science:
Develop predictive models and advanced analytics to support business decisions.
Apply statistical and machine learning techniques to large datasets.
Apply strong business acumen to understand decision drivers for internal and external customers
Collaborate with individuals and departments across the company to ensure insights are aligned with customer needs and drive value.
To be successful in this role, your experience and competencies are:
Bachelor's degree in data science, engineering, or related field. (Adv. degrees a plus.)
5+ years of experience in data science, including at least 3 years in industrial or operational environments.
Strong communication and project management skills are critical.
Proficiency in data pipeline tools (e.g., Spark, Airflow) and cloud platforms (Azure, AWS, GCP).
Strong understanding of data modeling principles and database technologies (SQL/NoSQL).
Hands-on experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and statistical analysis.
Ability to work across data architecture design and data science experimentation.
Programming: Python, SQL, and optionally Scala or Java.
Familiarity with distributed systems and big data technologies.
Strong communication skills for translating technical insights into business value.
Ability to work across technical, commercial, and customer-facing teams.
Supervisor and Leadership Expectations
This role will not have supervisory or managerial responsibilities.
This role will have program management responsibilities.
Our Culture and Values
Employees that become part of Astec embody the values below throughout their work.
Continuous devotion to meeting the needs of our customers
Honesty and integrity in all aspects of business
Respect for all individuals
Preserving entrepreneurial spirit and innovation
Safety, quality and productivity as means to ensure success
EQUAL OPPORTUNITY EMPLOYER
As an Equal Opportunity Employer, Astec does not discriminate on the basis of race, creed, color, religion, gender (sex), sexual orientation, gender identity, marital status, national origin, ancestry, age, disability, citizenship status, a person's veteran status or any other characteristic protected by law or executive order.
Data Scientist
Data analyst job in Garner, NC
Accentuate Staffing is working with a client that is hiring an experienced Data Scientist to work on predictive analytics and join their data and AI team. This role combines advanced machine learning research with strategic business analytics to create scalable predictive solutions that drive efficiency and smarter decision-making across the enterprise.
The ideal candidate will bring a blend of technical expertise in machine learning, cloud platforms, and data engineering, alongside strong business acumen. You'll build and refine predictive models that help the company forecast sales, understand demand patterns, and make smarter operational decisions - from production planning to staffing and supply chain management.
Responsibilities:
Design and implement advanced predictive and machine learning models to support sales forecasting, demand planning, and strategic decision-making (see the sketch after this list).
Build and maintain scalable ETL/ELT data pipelines that integrate structured and unstructured data from multiple business sources.
Experiment with AI techniques such as NLP, computer vision, and generative models to explore innovative applications across the organization.
Partner with business and IT teams to define analytics requirements, operationalize models, and integrate outputs into dashboards and reporting platforms.
Develop and manage self-service analytics dashboards using Power BI, SAP Analytics Cloud, or similar tools to deliver actionable insights.
Ensure data integrity, quality, and governance across predictive systems.
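As a hedged, minimal sketch of the sales-forecasting modeling described above (synthetic data and scikit-learn chosen purely for illustration; the client's actual stack and features are not specified in the posting):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Synthetic monthly sales history with trend and seasonality (illustrative only).
rng = np.random.default_rng(0)
months = pd.date_range("2019-01-01", periods=60, freq="MS")
sales = 100 + 0.8 * np.arange(60) + 12 * np.sin(2 * np.pi * months.month / 12) + rng.normal(0, 4, 60)
df = pd.DataFrame({"month": months, "sales": sales})

# Lag features turn the forecasting task into supervised regression.
for lag in (1, 2, 3, 12):
    df[f"lag_{lag}"] = df["sales"].shift(lag)
df["month_of_year"] = df["month"].dt.month
df = df.dropna().reset_index(drop=True)

features = [c for c in df.columns if c.startswith("lag_")] + ["month_of_year"]
train, test = df.iloc[:-6], df.iloc[-6:]  # hold out the last 6 months

model = GradientBoostingRegressor(random_state=0)
model.fit(train[features], train["sales"])
preds = model.predict(test[features])
print("MAE on holdout:", round(mean_absolute_error(test["sales"], preds), 2))
```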
Qualifications:
Degree in Data Science, Computer Science, Statistics, or a related field.
Experience in predictive analytics, data science, or AI engineering within a business setting.
Proficiency in Python, R, SQL, and experience with cloud-based ML platforms such as Azure ML, AWS, or GCP.
Hands-on experience with data pipeline technologies (Azure Data Factory, Spark, Hadoop) and business intelligence tools (Power BI, Tableau, or SAP Analytics Cloud).
Strong understanding of machine learning model lifecycle management, from design through deployment and monitoring.
Exceptional communication and stakeholder engagement skills, with the ability to translate technical work into business value.
Project Management Analyst
Data analyst job in Newport News, VA
Oversees and manages the operational aspects of ongoing projects and serves as liaison between project management and planning, project team, and line management. Reviews status of projects and budgets; manages schedules and prepares status reports. Assesses project issues and develops resolutions to meet productivity, quality, and client-satisfaction goals and objectives. Develops mechanisms for monitoring project progress and for intervention and problem solving with project managers, line managers, and clients.
Experience in federal government contracting, compliance, SAP, MS Office Suite, FAR / DFARS, leading and influencing without direct authority.
Basic Qualifications
Bachelor's Degree and 3 years of experience, or Master's Degree and 1 year of experience. 4 years of related exempt experience can be substituted for a Bachelor's degree; 8 years of non-related exempt experience can be substituted for a Bachelor's degree.