Data Engineer - Charlotte, NC (2 Days Onsite) #25472
Data engineer job in Charlotte, NC
Our client is looking to hire a Data Engineer for a one-year contract. The role is hybrid, onsite Mondays and Thursdays in Charlotte, NC.
Must Haves:
2+ years working in Agile/SDLC delivery
Hands-on Python in production or pipeline work
Hands-on TensorFlow or PyTorch
Practical NLP experience
LLM / GenAI applied experience (At least one real build using LLMs: RAG, embeddings + vector DB, prompt workflows, evaluation, fine-tuning/LoRA, or deployment)
Data engineering fundamentals: clear ETL/ELT or data pipeline experience (lake/warehouse/API/streaming)
SQL + BI/reporting exposure (Can write real SQL and support dashboards/reports)
1+ year of experience on projects implementing solutions through a development life cycle (SDLC)
Bachelor's Degree
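The LLM/GenAI must-have above asks for at least one real build using RAG or embeddings with a vector store. A minimal sketch of the retrieval step, using a toy bag-of-words `embed()` as a stand-in for a real embedding model (the function names and sample documents here are illustrative, not from any specific posting):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model (e.g. a sentence-transformer):
    # a sparse bag-of-words vector keyed by lowercase tokens.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query: the "R" in RAG.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Stuff the retrieved context into the prompt sent to the LLM.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Invoices are due within 30 days of issue.",
    "The data warehouse refreshes nightly at 2am.",
    "Support tickets are triaged every morning.",
]
print(build_prompt("When does the warehouse refresh?", docs))
```

A production build would swap `embed()` for a real embedding model and the sorted list for a vector database, but the retrieve-then-prompt shape is the same.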
Data Scientist with GenAI and Python
Dexian is seeking a Data Scientist with GenAI and Python for an opportunity with a client located in Charlotte, NC.
Responsibilities:
Design, develop, and deploy GenAI models, including LLMs, GANs, and transformers, for tasks such as content generation, data augmentation, and creative applications
Analyze complex data sets to identify patterns, extract meaningful features, and prepare data for model training, with a focus on data quality for GenAI
Develop and refine prompts for LLMs, and optimize GenAI models for performance, efficiency, and specific use cases
Deploy GenAI models into production environments, monitor their performance, and implement strategies for continuous improvement and model governance
Work closely with cross-functional teams (e.g., engineering, product) to understand business needs, translate them into GenAI solutions, and effectively communicate technical concepts to diverse stakeholders
Stay updated on the latest advancements in GenAI and data science, and explore new techniques and applications to drive innovation within the organization
Utilize Python and its extensive libraries (e.g., scikit-learn, TensorFlow, PyTorch, Pandas, LangChain) for data manipulation, model development, and solution implementation
Requirements:
Proven hands-on experience implementing GenAI projects using open-source LLMs (Llama, GPT-OSS, Gemma, Mistral) and proprietary APIs (OpenAI, Anthropic)
Strong background in Retrieval-Augmented Generation (RAG) implementations
In-depth understanding of embedding models and their applications
Hands-on experience with Natural Language Processing (NLP) solutions on text data
Strong Python development skills. Should be comfortable with Pandas and NumPy for data analysis and feature engineering
Experience building and integrating APIs (REST, FastAPI, Flask) for serving models
Fine-tuning and optimizing open-source LLMs/SLMs is a big plus
Knowledge of agentic AI frameworks and orchestration
Experience in ML and Deep Learning is an advantage
Familiarity with cloud platforms (AWS/Azure/GCP)
Experience working with Agile Methodology
Strong problem-solving, analytical, and interpersonal skills
Ability to work effectively in a team environment
Strong written and oral communication skills, with the ability to express ideas clearly
Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals.
Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit ******************* to learn more.
Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.
Snowflake Data Scientist (must be local to Charlotte, NC)
Job Title: Senior Snowflake Data Scientist
Long term Contract
For this data scientist role, additional required skills include AI/ML, RAG and LLM models, and agentic AI experience.
The Senior Snowflake Data Scientist will lead the development, deployment, and operationalization of machine learning and statistical models that solve complex business problems and drive strategic decision-making. This role requires an expert blend of statistical rigor, advanced programming, and deep knowledge of leveraging Snowflake's ecosystem (e.g., Snowpark, Streamlit, external functions) for high-performance, in-warehouse data science.
Key Responsibilities
1. Advanced Modeling & Analysis
Model Development: Design, build, train, and validate sophisticated machine learning (ML) and statistical models (e.g., predictive, prescriptive, clustering, forecasting) to address key business challenges (e.g., customer churn, sales forecasting, risk modeling).
Feature Engineering: Utilize advanced SQL and Python/Snowpark to perform large-scale feature engineering, data transformation, and preparation directly within Snowflake, ensuring high data quality and low latency for modeling.
A/B Testing & Causal Inference: Design and analyze experiments (A/B tests) and employ causal inference techniques to measure the business impact of product features, strategies, and model outputs.
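The A/B testing responsibility above boils down to a standard significance test on two conversion rates. A minimal plain-Python sketch (the sample counts are made up for illustration; a real analysis would typically use statsmodels or run in Snowpark):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test: returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF tail via erf; doubled for a two-sided test.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 580/5000 vs control A at 500/5000.
z, p = two_proportion_z(500, 5000, 580, 5000)
print(f"z={z:.2f}, p={p:.4f}")
```

A small p-value here would justify shipping variant B; causal-inference methods extend the same idea to settings where clean randomization isn't available.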
2. MLOps & Production Deployment
Operationalization: Lead the process of deploying trained models into production environments, utilizing Snowpark, Snowflake UDFs/UDTFs, and external functions for scalable inference and real-time scoring.
Pipeline Automation: Collaborate with Data Engineering to integrate ML pipelines into CI/CD workflows, ensuring models are automatically retrained and redeployed using tools like Airflow or orchestration platforms.
Monitoring: Establish and maintain robust monitoring for model performance (drift, bias, accuracy) and operational health within the Snowflake environment.
3. Data Visualization & Storytelling
Insight Generation: Conduct deep-dive exploratory data analysis (EDA) using complex Snowflake SQL to uncover hidden patterns, opportunities, and risks.
Visualization & Communication: Effectively communicate complex analytical findings, model outputs, and recommendations to technical and non-technical stakeholders and senior leadership using compelling data storytelling and visualization tools (e.g., Tableau, Power BI, or Snowflake Streamlit).
4. Platform & Technical Leadership
Best Practices: Define and promote best practices for statistical rigor, ML coding standards, and efficient data processing within the Snowflake ecosystem.
Mentorship: Provide technical guidance and mentorship to junior data scientists and analysts on modeling techniques and leveraging Snowflake's data science features.
Innovation: Stay current with the latest features of the Snowflake Data Cloud (e.g., Generative AI/LLMs, Unistore, Data Sharing) and propose innovative ways to leverage them for business value.
Minimum Qualifications
MS or Ph.D. in a quantitative discipline (e.g., Statistics, Computer Science, Engineering, Economics, or Mathematics).
7+ years of progressive experience in Data Science, with at least 3+ years of hands-on experience building and deploying ML solutions in a cloud data warehouse environment, preferably Snowflake.
Expert proficiency in Python (including packages like scikit-learn, NumPy, Pandas) and writing scalable code for data processing.
Expert-level command of Advanced SQL for complex data manipulation and feature engineering.
Proven experience with Machine Learning algorithms and statistical modeling techniques.
Strong understanding of MLOps principles for model lifecycle management.
Preferred Skills & Certifications
Snowflake SnowPro Advanced: Data Scientist Certification.
Hands-on experience developing solutions using Snowpark (Python/Scala).
Experience building data apps/dashboards using Snowflake Streamlit.
Familiarity with cloud platforms and services (AWS Sagemaker, Azure ML, or GCP Vertex AI) integrated with Snowflake.
Experience with workflow orchestration tools (e.g., Apache Airflow, dbt).
AWS Data Engineer
We are looking for a skilled and experienced AWS Data Engineer with 10+ years of experience to join our team. This role requires hands-on expertise in AWS serverless technologies, Big Data platforms, and automation tools. The ideal candidate will be responsible for designing scalable data pipelines, managing cloud infrastructure, and enabling secure, reliable data operations across marketing and analytics platforms.
Key Responsibilities:
Design, build, and deploy automated CI/CD pipelines for data and application workflows.
Analyze and enhance existing data pipelines for performance and scalability.
Develop semantic data models to support activation and analytical use cases.
Document data structures and metadata using Collibra or similar tools.
Ensure high data quality, availability, and integrity across platforms.
Apply SRE and DevSecOps principles to improve system reliability and security.
Manage security operations within AWS cloud environments.
Configure and automate applications on AWS instances.
Oversee all aspects of infrastructure management, including provisioning and monitoring.
Schedule and automate jobs using tools like Step Functions, Lambda, Glue, etc.
Required Skills & Experience:
Hands-on experience with AWS serverless technologies: Lambda, Glue, Step Functions, S3, RDS, DynamoDB, Athena, CloudFormation, CloudWatch Logs.
Proficiency in Confluent Kafka, Splunk, and Ansible.
Strong command of SQL and scripting languages: Python, R, Spark.
Familiarity with data formats: JSON, XML, Parquet, Avro.
Experience in Big Data engineering and cloud-native data platforms.
Functional knowledge of marketing platforms such as Adobe, Salesforce Marketing Cloud, and Unica/Interact (nice to have).
Preferred Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
AWS, Big Data, or DevOps certifications are a plus.
Experience working in hybrid cloud environments and agile teams.
Life at Capgemini
Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:
Flexible work
Healthcare including dental, vision, mental health, and well-being programs
Financial well-being programs such as 401(k) and Employee Share Ownership Plan
Paid time off and paid holidays
Paid parental leave
Family building benefits like adoption assistance, surrogacy, and cryopreservation
Social well-being benefits like subsidized back-up child/elder care and tutoring
Mentoring, coaching and learning programs
Employee Resource Groups
Disaster Relief
Disclaimer
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Capgemini is committed to providing reasonable accommodations during our recruitment process. If you need assistance or accommodation, please get in touch with your recruiting contact.
Click the following link for more information on your rights as an Applicant **************************************************************************
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Data Engineer
Job Title: Data Engineer / SQL Server Developer (7+ Years)
Client: Wells Fargo
Rate: $60/hr
Interview Process: Code Test + In-Person Interview
Job Description
Wells Fargo is seeking a Senior Data Engineer / SQL Server Developer (7+ years) who can work across both database development and QA automation functions. The ideal candidate will have strong SQL Server expertise along with hands-on experience in test automation tools.
Key Responsibilities
Design, develop, and optimize SQL Server database structures, queries, stored procedures, triggers, and ETL workflows.
Perform advanced performance tuning, query optimization, and troubleshooting of complex SQL issues.
Develop and maintain data pipelines ensuring data reliability, integrity, and high performance.
Build and execute automated test scripts using Selenium, BlazeMeter, or similar frameworks.
Perform both functional and performance testing across applications and data processes.
Support deployments in containerized ecosystems, ideally within OpenShift.
Collaborate with architecture, QA, DevOps, and application teams to ensure seamless delivery.
Required Skills
Primary:
7+ years of hands-on SQL Server development (T-SQL, stored procedures, performance tuning, ETL).
Secondary:
Experience working with OpenShift or other container platforms.
Testing / QA Automation:
Strong experience with test automation tools like Selenium, BlazeMeter, JMeter, or equivalent.
Ability to design automated functional and performance test suites.
Ideal Candidate Profile
Senior-level developer capable of taking ownership of both development and test automation deliverables.
Strong analytical and debugging skills across data engineering and testing disciplines.
Experience working in large-scale enterprise environments.
Senior Data Engineer
We are
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron's progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honored with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,500+, and has 58 offices in 21 countries within key global markets.
Our Challenge:
We are looking for a skilled senior data engineer with comprehensive experience in designing, developing, and maintaining scalable data solutions within the financial and regulatory domains, with proven expertise in leading end-to-end data architectures, integrating diverse data sources, and ensuring data quality and accuracy.
Additional Information
The base salary for this position will vary based on geography and other factors. In accordance with law, the base salary for this role, if filled in New York, NY, is $135k-$155k/year plus benefits (see below).
Work location: New York City, NY (Hybrid, 3 days in a week)
The Role
Responsibilities:
Advanced proficiency in Python, SQL Server, Snowflake, Azure Databricks, and PySpark.
Strong understanding of relational databases, ETL processes, and data modeling.
Expertise in system design, architecture, and implementing robust data pipelines.
Hands-on experience with data validation, quality checks, and automation tools (Autosys, Control-M).
Skilled in Agile methodologies, SDLC processes, and CI/CD pipelines.
Effective communicator with the ability to collaborate with business analysts, users, and global teams.
Requirements:
Overall 10+ years of IT experience is required
Collaborate with business stakeholders to gather technical specifications and translate business requirements into technical solutions.
Develop and optimize data models and schemas for efficient data integration and analysis.
Lead application development involving Python, Pyspark, SQL, Snowflake and Databricks platforms.
Implement data validation procedures to maintain high data quality standards.
Strong experience in SQL (writing complex queries, joins, etc.)
Conduct comprehensive testing (UT, SIT, UAT) alongside business and testing teams.
Provide ongoing support, troubleshooting, and maintenance in production environments.
Contribute to architecture and design discussions to ensure scalable, maintainable data solutions.
Experience with financial systems (capital markets, credit risk, and regulatory compliance applications).
We offer:
A highly competitive compensation and benefits package.
A multinational organization with 58 offices in 21 countries and the possibility to work abroad.
10 days of paid annual leave (plus sick leave and national holidays).
Maternity & paternity leave plans.
A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
Retirement savings plans.
A higher education certification policy.
Commuter benefits (varies by region).
Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups.
Cutting edge projects at the world's leading tier-one banks, financial institutions and insurance firms.
A flat and approachable organization.
A truly diverse, fun-loving, and global work culture.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Data Engineer
Job Title: Azure Databricks Engineer (Onsite)
Years of Experience: 7-12 years
Full Time
We are seeking a highly skilled and motivated Technical Team Lead with extensive experience in Azure Databricks to join our dynamic team. The ideal candidate will possess a strong technical background, exceptional leadership abilities, and a passion for driving innovative solutions. As a Technical Team Lead, you will be responsible for guiding a team of developers and engineers in the design, development, and implementation of data-driven solutions that leverage Azure Databricks.
Responsibilities:
Lead and mentor a team of technical professionals, fostering a collaborative and high-performance culture.
Design and implement data processing solutions using Azure Databricks, ensuring scalability and efficiency.
Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Oversee the development lifecycle, from planning and design to deployment and maintenance.
Conduct code reviews and provide constructive feedback to team members to ensure code quality and adherence to best practices.
Stay up to date with industry trends and emerging technologies related to Azure Databricks and data engineering.
Facilitate communication between technical and non-technical stakeholders to ensure alignment on project goals.
Identify and mitigate risks associated with project delivery and team performance.
Mandatory Skills:
Proven expertise in Azure Databricks, including experience with Spark, data pipelines, and data lakes.
Strong programming skills in languages such as Python, Scala, or SQL.
Experience with cloud based data storage solutions, particularly Azure Data Lake Storage and Azure SQL Database.
Solid understanding of data modeling, ETL processes, and data warehousing concepts.
Demonstrated ability to lead technical teams and manage multiple projects simultaneously.
Preferred Skills:
Familiarity with Azure DevOps for CI/CD processes.
Experience with machine learning frameworks and libraries.
Knowledge of data governance and compliance standards.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
7-10 years of experience in data engineering, software development, or a related technical role.
Proven track record of leading technical teams and delivering successful projects.
Relevant certifications in Azure or data engineering are a plus.
If you are a passionate Technical Team Lead with a strong background in Azure Databricks and a desire to drive innovation, we encourage you to apply and join our team.
Data Conversion Engineer
Summary/Objective
Are you looking to work at a high growth, innovative, and purpose driven FinTech company? If so, you'll love Paymentus. Recognized by Deloitte as one of the fastest growing companies in North America, Paymentus is the premier provider of innovative, reliable, and secure electronic bill payment and presentment for more than 1700 clients. We are a SaaS provider that enables companies to help their customers simplify their financial lives. We do that by making it easier for consumers and businesses to pay bills, plus move and manage money to achieve strong financial health. We continually build upon a massively scalable platform, supporting thousands of businesses and millions of transactions on a daily basis. We're looking for high performers to join our team who excel in their expertise and who can transform plans into action. You'll have the opportunity to grow in an environment where intelligence, innovation, and leadership are valued and rewarded.
About the Role
The Data Conversion Engineer serves as a key component of the Platform Integrations team, providing technical support and guidance on data conversion projects. Conversions are an integral part in ensuring adherence to Paymentus' standards for a successful launch. This role is essential to ensure all bill payment data converts properly and efficiently onto the Paymentus platform.
Responsibilities
Develop data conversion procedures using SQL, Java and Linux scripting
Augment and automate existing manual procedures to optimize accuracy and reduce time for each conversion
Develop and update conversion mappers to interpret incoming data and manipulate it to match Paymentus' specifications
Develop new specifications to satisfy new customers and products
Serve as the primary point of contact and driver for all technical conversion activities
Review conversion calendar and offer technical support and solutions to meet deadlines and contract dates
Maintain and update technical conversion documentation to share with internal and external clients and partners
Work in close collaboration with implementation, integration, product and development teams using exceptional communication skills
Adapt and creatively solve encountered problems under high stress and tight deadlines
Learn database structure, business logic and combine all knowledge to improve processes
Be flexible
Monitor new client conversions and existing client support if needed; provide daily problem solving, coordination, and communication
Manage multiple projects and conversion implementations
Proactively troubleshoot and solve problems with limited supervision
Qualifications
B.S. Degree in Computer Science or comparable experience
Strong knowledge of Linux and the command line interface
Exceptional SQL skills
Experience with logging/monitoring tools (AWS Cloudwatch, Splunk, ELK, etc.)
Familiarity with various online banking applications and understanding of third-party integrations is a plus
Effective written and verbal communication skills
Problem Solver - recognizes the need to resolve issues quickly and effectively, uses logic to solve problems; identifies problems and brings forward multiple solution options; knows who/when to involve appropriate people when troubleshooting issues
Communication - uses formal and informal written and/or verbal communication channels to inform others; articulates ideas and thoughts clearly both verbally and in writing
Dynamic and self-motivated; able to work on their own initiative and deliver the objectives required to maintain service levels
Strong attention to detail
Proficiency with raw data, analytics, or data reporting tools
Preferred Skills
Background in the Payments, Banking, E-Commerce, Finance and/or Utility industries
Experience with front-end web interfaces (HTML5, JavaScript, CSS3)
Cloud technologies (AWS, GCP, Azure)
Work Environment
This job operates in a professional office environment. This role routinely uses standard office equipment such as laptop computers, photocopiers and smartphones.
Physical Demands
This role requires sitting or standing at a computer workstation for extended periods of time.
Position Type/Expected Hours of Work
This is a full-time position. Days and hours of work are Monday through Friday, 40 hours a week. Occasional evening and weekend work may be required as job duties demand.
Travel
No travel is required for this position.
Other Duties
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.
Equal Opportunity Statement
Paymentus is an equal opportunity employer. We enthusiastically accept our responsibility to make employment decisions without regard to race, religious creed, color, age, sex, sexual orientation, national origin, ancestry, citizenship status, religion, marital status, disability, military service or veteran status, genetic information, medical condition including medical characteristics, or any other classification protected by applicable federal, state, and local laws and ordinances. Our management is dedicated to ensuring the fulfillment of this policy with respect to hiring, placement, promotion, transfer, demotion, layoff, termination, recruitment advertising, pay, and other forms of compensation, training, and general treatment during employment.
Reasonable Accommodation
Paymentus recognizes and supports its obligation to endeavor to accommodate job applicants and employees with known physical or mental disabilities who are able to perform the essential functions of the position, with or without reasonable accommodation. Paymentus will endeavor to provide reasonable accommodations to otherwise qualified job applicants and employees with known physical or mental disabilities, unless doing so would impose an undue hardship on the Company or pose a direct threat of substantial harm to the employee or others. An applicant or employee who believes he or she needs a reasonable accommodation of a disability should discuss the need for possible accommodation with the Human Resources Department, or his or her direct supervisor.
Big Data Engineer
Hello,
This is Shivam from Centraprise Global working as Talent Acquisition Lead.
I came across your profile in our resume database and wanted to reach out regarding a job opportunity. If interested, please reply with your updated resume, contact details, and the best time to discuss the opportunity.
Job Title: Hadoop // Big Data Engineer
Location: Charlotte, NC // New York City, NY (onsite)
Duration: Full Time
Job Description
Must Have Technical/Functional Skills
Primary skills: Hadoop ecosystem (HDFS, Hive, Spark), PySpark, Python, Apache Kafka
Experience: Minimum 9 years
Roles & Responsibilities
Architectural Leadership:
Define end-to-end architecture for data platforms, streaming systems, and web applications.
Ensure alignment with enterprise standards, security, and compliance requirements.
Evaluate emerging technologies and recommend adoption strategies.
Data Engineering:
Design and implement data ingestion, transformation, and processing pipelines using Hadoop, PySpark, and related tools.
Optimize ETL workflows for large-scale datasets and real-time streaming.
Integrate Apache Kafka for event-driven architectures and messaging.
Application Development:
Build and maintain backend services using Python and microservices architecture.
Develop responsive, dynamic front-end applications using Angular.
Implement RESTful APIs and ensure seamless integration between components.
Collaboration & Leadership:
Work closely with product owners, business analysts, and DevOps teams.
Mentor junior developers and data engineers.
Participate in agile ceremonies, code reviews, and design discussions.
Required Skills & Qualifications:
Technical Expertise:
Strong experience with Hadoop ecosystem (HDFS, Hive, Spark).
Proficiency in PySpark for distributed data processing.
Advanced programming skills in Python.
Hands-on experience with Apache Kafka for real-time streaming.
Frontend development using Angular (TypeScript, HTML, CSS).
Architectural Skills:
Expertise in designing scalable, secure, and high-performance systems.
Familiarity with microservices, API design, and cloud-native architectures.
Additional Skills:
Knowledge of CI/CD pipelines, containerization (Docker/Kubernetes).
Exposure to cloud platforms (AWS, Azure, GCP).
Thanks & Regards,
Shivam Gupta | Talent Acquisition Lead
Desk: ************ Ext- 732
Data Engineer
W2 ONLY - NO CORP TO CORP - CONTRACT TO HIRE - NO VISA SPONSOR/TRANSFER - NO 3RD PARTY AGENCY CANDIDATES
Serve as subject matter expert and/or technical lead for large-scale data products.
Drive end-to-end solution delivery across multiple platforms and technologies, leveraging ELT solutions to acquire, integrate, and operationalize data.
Partner with architects and stakeholders to define and implement pipeline and data product architecture, ensuring integrity and scalability.
Communicate risks and trade-offs of technology solutions to senior leaders, translating technical concepts for business audiences.
Build and enhance data pipelines using cloud-based architectures.
Design simplified data models for complex business problems.
Champion Data Engineering best practices across teams, implementing leading big data methodologies (AWS, Hadoop/EMR, Spark, Snowflake, Talend, Informatica) in hybrid cloud/on-prem environments.
Operate independently while fostering a collaborative, transformation-focused mindset.
Work effectively in a lean, fast-paced organization, leveraging Scaled Agile principles.
Promote code quality management, FinOps principles, automated testing, and environment management practices to deliver incremental customer value.
Qualifications
5+ years of data engineering experience.
2+ years developing and operating production workloads in cloud infrastructure.
Bachelor's degree in Computer Science, Data Science, Information Technology, or related field.
Hands-on experience with Snowflake (including SnowSQL, Snowpipe).
Expert-level skills in AWS services, Snowflake, Python, Spark (certifications are a plus).
Proficiency in ETL tools such as Talend and Informatica.
Strong knowledge of Data Warehousing (modeling, mapping, batch and real-time pipelines).
Experience with DataOps tools (GitHub, Jenkins, UDeploy).
Familiarity with P&C Commercial Lines business.
Knowledge of legacy tech stack: Oracle Database, PL/SQL, Autosys, Hadoop, stored procedures, Shell scripting.
Experience using Agile tools like Rally.
Excellent written and verbal communication skills to interact effectively with technical and non-technical stakeholders.
Lead Data Engineer
Data engineer job in Charlotte, NC
We are looking for a Lead Data Engineer with strong communication skills and hands-on experience across Snowflake, AWS, Python, PySpark, MongoDB, and IICS. This role requires a technical leader who can guide a small engineering team while also building and optimizing scalable data pipelines in a cloud environment.
Long-term Contract
Location: Charlotte, NC
4 Days Onsite
***Interviews are actively happening***
***If you are interested in this role, please share your updated resume to proceed further***
Responsibilities
Lead and mentor a team of data engineers in day-to-day project delivery
Design, build, and optimize ETL/ELT pipelines using AWS Glue, Python, PySpark, and Snowflake
Work with business and technical stakeholders, deliver updates, and ensure smooth communication
Develop and maintain data workflows using IICS (Informatica Intelligent Cloud Services)
Manage data ingestion from multiple sources, including MongoDB and AWS services
Perform data modeling, SQL scripting, and performance tuning in Snowflake
Support deployment, monitoring, and troubleshooting of data pipelines
Ensure best practices for code quality, documentation, and cloud data architecture
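The ETL/ELT responsibilities above reduce to an extract-transform-load loop. A minimal pure-Python sketch of that shape (in the actual role this logic would live in AWS Glue/PySpark jobs landing in Snowflake; the source rows and column names here are hypothetical):

```python
# Extract: rows as they might arrive from a source such as MongoDB.
source_rows = [
    {"id": 1, "status": "ACTIVE ", "amount": "12.50"},
    {"id": 2, "status": "closed", "amount": "7.00"},
]

def transform(row: dict) -> dict:
    """Normalize types and casing before loading to the warehouse."""
    return {"id": row["id"],
            "status": row["status"].strip().upper(),
            "amount": float(row["amount"])}

# Load: in production this would be a Snowflake COPY INTO or MERGE;
# here we just collect the cleaned rows.
warehouse_table = [transform(r) for r in source_rows]
```

In Glue/PySpark the `transform` step becomes a DataFrame expression, which is where the performance-tuning work in Snowflake and Spark comes in.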
AWS Data Engineer (Only W2)
Data engineer job in Charlotte, NC
Title: AWS Data Engineer
Experience: 10 years
Must Have Skills:
• Strong experience in AWS services, primarily serverless, databases, storage services, container services, schedulers, and batch services.
• Experience in Snowflake and Data Build Tool (dbt).
• Expertise in dbt, NodeJS, and Python.
• Expertise in Informatica, Power BI, Database, and Cognos.
Nice to Have Skills:
Detailed Job Description:
• Proven experience in leading teams across locations.
• Knowledge of DevOps processes, Infrastructure as Code and their purpose.
• Good understanding of data warehouses, their purpose, and implementation
• Good communication skills.
Kindly share your resume at ******************
Deployment Engineer
Data engineer job in Charlotte, NC
We are seeking a highly skilled and customer-focused IT Support Specialist to join our team. In this role, you will provide hands-on technical support, contribute to key technology initiatives, and ensure an exceptional support experience for users across the organization. The ideal candidate thrives in fast-paced, high-profile environments and brings both strong technical acumen and outstanding communication skills.
Key Responsibilities
Team Collaboration
Work closely with IT peers and business stakeholders to provide a cohesive and seamless support experience.
Share knowledge and contribute to internal documentation, playbooks, and training materials.
Participate in team meetings and actively support knowledge-sharing culture.
Technical Expertise
Hands-on experience supporting Windows 11.
Skilled in Microsoft 365 (M365) services and applications support.
Proficient in Active Directory (AD) administration.
Experience with System Center Configuration Manager (SCCM).
Familiarity with Azure Active Directory (Azure AD).
Experience managing mobile devices through Microsoft Intune.
Working knowledge of iOS device management and support.
Understanding of ITIL practices and frameworks.
Non-Technical Expertise
Proven track record supporting users in fast-paced, high-profile environments.
Excellent communication skills, with the ability to interact effectively and professionally at all levels, including with high-visibility individuals.
Strong analytical and problem-solving skills with a focus on root cause analysis and continuous improvement.
CNC Applications Engineer
Data engineer job in Charlotte, NC
A leading manufacturing solutions provider is seeking a highly experienced CNC Applications Engineer with a strong background in part processing, Fanuc CNC controls, and CAM programming using Mastercam or Esprit. This role is crucial in supporting process development, machine optimization, and technical customer support for high-precision machining operations.
Key Responsibilities:
Part Processing Development:
Interpret part drawings and develop comprehensive machining strategies including tooling, workholding, and efficient operation sequencing.
CNC Programming & Optimization:
Generate, modify, and refine CNC programs using G-code, M-code, and CAM-generated output, ensuring high accuracy and process efficiency.
Fanuc Controls Expertise:
Troubleshoot, configure, and fine-tune programs and operations on Fanuc-controlled CNC equipment.
Technical Customer Support:
Provide remote and on-site support to customers for troubleshooting, training, and process validation during installations and upgrades.
Machine Setup & Validation:
Assist with machine setup, prove-outs, and first-article inspections to ensure part accuracy and adherence to specifications.
CAM Software Utilization:
Create toolpaths and NC code using CAM software (preferably Mastercam or Esprit) for a variety of CNC machines including multi-axis.
Process Improvement:
Identify and implement opportunities to reduce cycle times, enhance surface finishes, and extend tool life through process refinement.
Documentation:
Maintain detailed and accurate documentation of machining processes, programs, and tooling setups.
Required Qualifications:
5-10 years of experience in CNC machining, part processing, and programming.
Strong working knowledge of Fanuc CNC controls and G/M code.
Proficiency in CAM software (preferably Mastercam or Esprit).
Deep understanding of machining practices, tooling, speeds/feeds, and materials.
Ability to collaborate across teams and support customer-facing technical work.
Strong communication and documentation skills.
Able to train operators, programmers, or customers on processes and best practices.
Preferred Qualifications:
Experience with multi-axis CNC machines and turning centers.
Exposure to automation and robotics in CNC manufacturing environments.
Degree or technical certification in Manufacturing Technology, CNC Programming, or Mechanical Engineering.
Work Environment:
Combination of office-based engineering and hands-on work on the shop floor.
Occasional travel to customer sites for training, support, and machine setup.
Benefits:
Competitive compensation and performance-based bonuses
Medical, Dental, and Vision insurance
Paid Time Off (PTO)
401(k) with company match
Support for continuing education and training
Clear opportunities for career advancement in a fast-paced technical environment
JAVA SDET Automation Lead
Data engineer job in Charlotte, NC
Required Qualifications
Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
4 years of Information Technology experience.
Candidate must be located within commuting distance of Charlotte, NC or be willing to relocate to one of these areas.
This position may require travel in the US and Canada.
At least 4 years of software testing experience.
Preferred Qualifications:
4+ years of experience as a Test Automation Lead Engineer
Strong knowledge and hands-on experience in Playwright, Selenium, BDD/Cucumber, and Java automation.
Experience with LLMs, prompt engineering, and Copilot for test automation
Experienced in API automation using Karate
Experience with GitHub version control and collaboration tools
Strong understanding of Agile processes
Ability to drive the project and communicate with all stakeholders.
Experience with DevOps Pipeline integration
Strong communication and presentation skills
Ability to work on multiple projects and manage a dynamic working environment.
Senior AI Software Engineer
Data engineer job in Charlotte, NC
Immediate need for a talented Senior AI Software Engineer. This is a 24 months contract opportunity with long-term potential and is located in Charlotte, NC (Hybrid). Please review the job description below and contact me ASAP if you are interested.
Job ID:25-93571
Pay Range: $75 - $80/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
Design and build highly complex AI-enabled software solutions using Agile, XP, and TDD practices.
Develop scalable applications and APIs leveraging AWS services (Lambda, Glue, etc.) and Terraform for infrastructure automation.
Write efficient, well-structured Python code for AI workflows and cloud integrations.
Collaborate with data engineers, product owners, and other developers to deliver impactful solutions.
Participate in code reviews, paired programming, and mentor junior team members.
Translate complex AI concepts into manageable user stories and technical deliverables.
Optimize data pipelines and ensure robust integration with cloud platforms.
Provide technical guidance on AI architecture and best practices for sustainability and security.
Stay current with emerging AI technologies, including agentic AI development.
Comfortable working in a data-centric environment and managing multiple responsibilities.
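The API and Lambda responsibilities above center on Python entry points invoked by AWS. A minimal sketch of that handler shape (the event format and function name are assumptions; a real deployment would be provisioned through Terraform behind API Gateway):

```python
import json

def lambda_handler(event: dict, context=None) -> dict:
    """AWS Lambda-style handler: parse the request, do work, return a response."""
    body = json.loads(event.get("body", "{}"))
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a sample API Gateway-style event:
response = lambda_handler({"body": json.dumps({"name": "charlotte"})})
```

Keeping the handler a plain function like this makes it unit-testable outside AWS, which fits the TDD practices the role calls for.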
Key Requirements and Technology Experience:
Key Skills: Python, Agentic AI, AWS.
Associate's degree in Information Systems or related discipline AND 5 years of relevant experience
OR High School/GED with 6 years of related work experience.
Proven experience as a Senior AWS Software Engineer with hands-on expertise in:
Python, Terraform, AWS Glue, Lambda, and API development.
Experience with agentic AI development and AI-driven solutions.
Familiarity with front-end development (Angular or similar) is a plus; the main work is not front-end.
Strong understanding of cloud architecture and security best practices.
Experience with CI/CD pipelines and automation.
Ability to mentor and coach team members on AI and cloud technologies.
Portfolio of work showcasing AI or cloud-based projects (nice to have, not required).
Our client is a leader in the utility industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Senior Python Developer
Data engineer job in Charlotte, NC
Senior Python Developer (Banking / Risk)
Sponsorship: the client is not providing sponsorship now or in the future.
About the Role
Seeking a senior Python developer to design and build scalable risk technology solutions within a modern cloud environment. The role involves developing backend systems in Python, deploying applications on Azure, and collaborating on front-end interfaces (React, HTML, CSS).
Responsibilities
Develop and optimize scalable Python-based applications.
Deploy and integrate solutions using Azure cloud components.
Contribute to UI enhancements and application performance.
Implement secure authentication and robust unit testing.
Collaborate across teams and document technical deliverables.
Qualifications
5+ years of Python development experience; Databricks/Azure a plus.
Proficiency with SQL/NoSQL databases and distributed systems.
Experience in risk systems or VaR frameworks preferred.
Strong communication, organization, and Agile experience.
Exposure to Power BI and data visualization tools beneficial.
Excellent communication skills are paramount to this role.
Senior Back End Developer
Data engineer job in Charlotte, NC
Sr. Backend Engineer
AVIVA Talent Advisors is partnered with an innovative cybersecurity company redefining how organizations defend themselves. Backed by top-tier investors, they've built an Autonomous Defense & Remediation platform powered by agentic AI, enabling companies to identify, contain, and neutralize threats in seconds. Their technology integrates seamlessly with existing security stacks and helps teams scale without additional headcount or reliance on MSPs.
We are looking for two full time Sr. Backend Developers to join their team. You will architect and develop high-quality backend services in Python, build scalable systems, orchestrate AI-agent workflows, and collaborate closely with security experts to deliver enterprise-grade capabilities.
ABOUT THE ROLE
Architect and develop backend services in Python using FastAPI and Celery
Design asynchronous workflows with Celery + Redis
Build and maintain data models and persistence layers in MongoDB
Develop advanced AI-agent workflow orchestration
Build reliable, scalable, well-versioned APIs (authentication, tenancy, rate limiting, observability)
Improve system performance for high-throughput event ingestion and enrichment
Containerize and deploy services using Docker & Kubernetes
Collaborate closely with product and security SMEs to translate SOC needs into backend capabilities
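The Celery + Redis workflow design above reduces to enqueueing tasks off the request path and processing them in a worker. A stdlib sketch of that producer/worker shape (in the real stack the broker is Redis and the worker is a Celery process; the task fields here are illustrative):

```python
import queue
import threading

task_queue: "queue.Queue" = queue.Queue()  # stand-in for the Redis broker
results = {}

def worker():
    """Celery-worker analogue: pull tasks and run them until told to stop."""
    while True:
        task = task_queue.get()
        if task is None:  # sentinel: shut the worker down
            break
        # The "enrichment" step a real task would perform on an event.
        results[task["id"]] = task["payload"].upper()

t = threading.Thread(target=worker)
t.start()

# Producer side (what a FastAPI endpoint would do): enqueue and return immediately.
for i, payload in enumerate(["alert", "event"]):
    task_queue.put({"id": i, "payload": payload})

task_queue.put(None)  # sentinel queued after the tasks, so they drain first
t.join()
```

Swapping the in-memory queue for Redis and the thread for Celery workers gives the same pattern horizontal scale and durability.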
ABOUT YOU
5+ years of backend engineering experience
Strong Python development experience (FastAPI preferred)
Experience with asynchronous task processing using Celery + Redis
Strong understanding of MongoDB data modeling and persistence
Experience owning features end-to-end and explaining technical decisions to stakeholders
Familiarity with LangGraph / LangChain (tool calling, state machines, memory, guardrails)
Containerization with Docker and deployment/operations in Kubernetes (cloud or hybrid/on-prem)
Experience integrating with security platforms like CrowdStrike, Microsoft Defender, Splunk, Cribl, ServiceNow
Solid understanding of observability practices (metrics, tracing, structured logging, distributed debugging)
Thorough understanding of GitFlow and Agile methodologies
You thrive in early stage organizations with high autonomy, fast iteration, and no bureaucracy
Shipping meaningful products at enterprise scale excites you
Join a world-class engineering team with strong peers and high standards
ABOUT US
AVIVA Talent Advisors is a Certified Women Business Enterprise (WBE) executive search firm providing expertise in talent acquisition and talent intelligence. We take a human-centric approach to hiring niche talent and executives driving digital transformation across the enterprise. We're passionate about creating connections that bring talented people together and creating a platform for elevating women in leadership.
We are an Equal Employment and Affirmative Action employer F/M/Disability/Vet/Sexual Orientation/Gender Identity. All qualified applicants will receive consideration for employment without regard to race, creed, color, religion, national origin, sexual orientation, gender identity, disability, veteran status, sex, age, genetic information, or any other legally protected basis.
Software Engineer
Data engineer job in Charlotte, NC
In this role, you will:
Lead moderately complex initiatives and deliverables within technical domain environments
Resolve complex issues and lead a team to meet existing client needs or potential new clients' needs while leveraging solid understanding of the function, policies, procedures, or compliance requirements
Design, code, test, debug, and document for projects and programs associated with technology domain, including transformation programs, application upgrades and deployments
Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
Lead projects and act as an escalation point, provide guidance and direction to less experienced staff
Contribute to large scale planning of strategies
Required Qualifications:
7+ years of full stack Software Engineering experience
7+ years of J2EE experience (including RESTful or SOAP web services)
5+ years of JMS (Java Message Service) experience
3+ years of Kafka Platform experience, Confluent Platform experience, or a combination of both
3+ years of MongoDB experience
3+ years of experience with secure DevOps and deployment automation to cloud environments
3+ years of experience with Test Automation
Proficient in microservices architecture
Proficient in monitoring and observability tools like Splunk and Grafana
Strong understanding of middleware and application server concepts like Integration, Transactions and XA transaction, Security, Connection pooling, Load balancing and Messaging; clustered server environment and familiar with system performance related tasks
Must have excellent communication and teamwork skills; be self-directed, self-motivated, committed, and a quick learner.
.NET/AWS Software Engineer Lead (Only W2)
Data engineer job in Fort Mill, SC
Role: .NET/AWS Software Engineer Lead (Only W2)
Duration: Long Term Contract
Key Qualifications:
Required:
• 10+ years of hands-on software development experience
• Strong expertise in C#, .NET Core, OOP, SOLID principles
• Deep experience with:
o Multithreading & concurrency
o Asynchronous programming (async/await)
o Parallelism (TPL, Parallel LINQ)
o Performance tuning and high-throughput systems
• Hands-on experience with AWS cloud services
• Expertise in caching (Redis, ElastiCache, Memcached, or similar tools)
• Experience with SQL and NoSQL (SQL Server, DynamoDB, PostgreSQL)
• Experience with Docker; Kubernetes is a plus
• Excellent communication and problem-solving skills
Preferred:
• Exposure to AWS Landing Zone concepts
• Experience with Terraform (IaC modules, templates)
• Financial services or brokerage industry experience
• Event-driven architecture experience with Kafka, Kinesis, or RabbitMQ
• Knowledge of CloudFormation or other IaC tools
If I missed your call, please drop me an email.
Thank you,
Harish
Talent Acquisition
Astir IT Solutions, Inc - An E-Verified Company
Email:*******************
Direct : ***********788
50 Cragwood Rd. Suite # 219, South Plainfield, NJ 07080
***************