Data Engineer jobs at Hexaware Technologies - 49 jobs
Analytics, Data Science and IoT Engineer
Hexaware Technologies 4.2
Data engineer job at Hexaware Technologies
Role: Analytics, Data Science and IoT Engineer
Responsibilities:
* Understanding the requirement and the ability to relate it to statistical algorithms
* Knowing the acceptance criteria and ways to achieve them
* Complete understanding of business processes and data
* Performing EDA (Exploratory Data Analysis), data cleansing, preprocessing, and munging, and creating training data sets
* Using the right statistical models and other statistical methods
* Deploying the statistical model using the customer's preferred technology
* Building data pipelines, machine learning pipelines, and monitoring for continuous integration, continuous development, and continuous testing
* Investigating the statistical model and providing resolution for data drift and performance issues
The role offers:
* The opportunity to join a global team doing meaningful work that contributes to global strategy and individual development
* The chance to re-imagine, redesign, and apply technology to add value to the business and operations
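The monitoring responsibility above hinges on detecting data drift. One common drift metric is the Population Stability Index (PSI); a minimal stdlib sketch of the idea follows. The binning scheme, sample data, and threshold are illustrative assumptions, not Hexaware's method.

```python
import math

def psi(expected, actual, bins=5):
    """Population Stability Index: compare a live sample ('actual')
    against the training-time distribution ('expected')."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[0], edges[-1] = float("-inf"), float("inf")  # catch out-of-range values

    def bin_fracs(sample):
        n = len(sample)
        fracs = []
        for left, right in zip(edges, edges[1:]):
            count = sum(1 for x in sample if left <= x < right)
            fracs.append(max(count, 1) / n)  # floor at 1 to avoid log(0)
        return fracs

    e, a = bin_fracs(expected), bin_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [float(x) for x in range(100)]
drifted = [x + 200.0 for x in baseline]  # distribution shifted far right
```

A PSI above roughly 0.25 is often read as significant drift and would trigger the investigation step described above; the exact threshold is a judgment call per model.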
$78k-103k yearly est. Auto-Apply 60d+ ago
Big Data Engineer
Hexaware Technologies, Inc. 4.2
Job Description:
1. 3-5 years in data platform engineering.
2. Experience with CI/CD, IaC (Terraform), and containerization with Docker/Kubernetes.
3. Hands-on experience building backend applications such as APIs and services.
4. Proven track record of building scalable data engineering pipelines using Python, SQL, and dbt Core/Cloud.
5. Experience working with MWAA (Airflow) or a similar cloud-based data engineering orchestration tool.
6. Experience working with cloud ecosystems such as AWS, Azure, or GCP and modern data tools like Snowflake and Databricks.
7. Strong problem-solving skills; the ability to move in a fast-paced environment is a plus.
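The orchestration requirement above (MWAA/Airflow) comes down to running dependent tasks in order. A stdlib-only sketch of that core idea, assuming three hypothetical task names; the tiny runner below is illustrative, not Airflow's API:

```python
# Sketch: a minimal DAG run -- each task executes only after its
# dependencies, the essence of what Airflow schedules at scale.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """tasks: name -> callable; deps: name -> set of upstream task names."""
    order = list(TopologicalSorter(deps).static_order())
    results = {name: tasks[name]() for name in order}  # runs in topo order
    return order, results

staging = []
tasks = {
    "extract": lambda: staging.extend(["a", "b"]) or len(staging),
    "transform": lambda: [s.upper() for s in staging],
    "load": lambda: "loaded",
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
order, results = run_pipeline(tasks, deps)
```

In Airflow the same dependency graph would be declared with operators and `>>` chaining; the scheduling, retries, and monitoring are what the managed service adds.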
$78k-103k yearly est. Auto-Apply 60d+ ago
Process Intelligence Consultant / Senior Data Engineer
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
YOUR ROLE
We are looking for a hands-on Process Intelligence Consultant / Senior Data Engineer with professional experience in Process/Task Mining (KYP.ai, Celonis) and Business Intelligence tools (Power BI, Tableau, Qlik).
YOUR TASKS
* Drive end-to-end implementation of the process intelligence (process/task mining) technology across various customers.
* Collaborate with business teams to translate demands into technical requirements and solution design, followed by documentation.
* Design data models with the appropriate event and case calculation to support the integration of data of business processes from different systems.
* Build analytical dashboards to equip business users with insight about process execution and variants.
* Interpret dashboards and analytical outputs from KYP.ai and other analytical tools, converting them into clear, practical actions for process optimization.
* Develop executive-ready reports and presentations that communicate insights in a business-friendly manner.
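The event-and-case data modelling described in the tasks above can be reduced to a tiny example: given raw events (case id, activity, timestamp), derive each case's variant (its ordered activity path) and throughput time. The P2P-style activity names and schema below are illustrative, not a KYP.ai or Celonis format:

```python
# Sketch: the core of a process-mining event log -- group events by case,
# then compute each case's variant and end-to-end duration.
from collections import defaultdict
from datetime import datetime

events = [
    ("PO-1", "Create PO", "2024-01-01T09:00"),
    ("PO-1", "Approve PO", "2024-01-02T10:00"),
    ("PO-1", "Pay Invoice", "2024-01-05T12:00"),
    ("PO-2", "Create PO", "2024-01-03T09:00"),
    ("PO-2", "Pay Invoice", "2024-01-04T09:00"),  # skips approval
]

cases = defaultdict(list)
for case_id, activity, ts in events:
    cases[case_id].append((datetime.fromisoformat(ts), activity))

variants, throughput_h = {}, {}
for case_id, evts in cases.items():
    evts.sort()  # order events within the case by timestamp
    variants[case_id] = " -> ".join(a for _, a in evts)
    throughput_h[case_id] = (evts[-1][0] - evts[0][0]).total_seconds() / 3600
```

Dashboards built on this model then surface variant frequencies and duration outliers, which is where the process-optimization insight comes from.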
YOUR PROFILE
* 3-4 years of professional experience with Process/Task Mining (KYP.ai, Celonis) and Business Intelligence tools (Power BI, Tableau, Qlik).
* General understanding of business processes, preferably with experience in the finance area (P2P, C2C, R2R).
* General understanding of one of the leading ERP systems on the market (SAP, Oracle).
* Know-how of data structures & models, and experience with SQL & data visualization.
* Demonstrated ability to interpret complex data and communicate actionable insights to non-technical stakeholders.
* Strong communication and organization skills, with a logical approach to problem-solving, ability to explain solutions, good time management, and task prioritization skills.
* Very good command of English, both written and spoken.
It's not essential, but we appreciate if you also have:
* German at a communicative level.
* Higher education, preferably in quantitative methods, econometrics, mathematics, statistics, IT, or a related field.
* Story-telling skills for executive-level presentations.
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Well-being culture: medical care with Medicover, private life insurance, and a Sports card. We went one step further by creating our own Capgemini Helpline offering therapeutic support if needed, and the educational podcast 'Let's talk about wellbeing', which you can listen to on Spotify.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Continuous feedback and ongoing performance discussions thanks to our performance management tool GetSuccess supported by a transparent performance management policy.
* Enjoy a hybrid working model that fits your life: after completing onboarding, combine work in a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
GET TO KNOW US
Capgemini is committed to diversity and inclusion, ensuring fairness in all employment practices. We evaluate individuals based on qualifications and performance, not personal characteristics, striving to create a workplace where everyone can succeed and feel valued.
Do you want to get to know us better? Check our Instagram - @capgeminipl or visit our Facebook profile - Capgemini Polska. You can also find us on YouTube.
ABOUT CAPGEMINI
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 360,000 team members globally in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms.
Apply now!
$77k-101k yearly est. 5d ago
Cloud Data Engineer (Azure & Microsoft Fabric)
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
YOUR PROJECT
Join our Insights & Data team, delivering cutting-edge, cloud-native data solutions for clients across industries such as finance, logistics, automotive, and telecom.
Our projects focus on Azure-based data engineering, leveraging services like Microsoft Fabric, Synapse, Data Factory, and Azure AI.
We work end-to-end: from designing data architectures and building ETL/ELT pipelines to integrating with downstream systems and visualizing data.
You'll collaborate with AI and Data Science teams to bring GenAI, NLP, and RAG solutions into production environments.
YOUR TASKS
* Design and develop scalable data solutions in Azure using Microsoft Fabric
* Build and optimize data pipelines (ETL/ELT) with Azure Data Factory, Synapse, Dataflows, and Pipelines
* Integrate data from various structured and unstructured sources
* Collaborate with AI/ML teams to implement GenAI and NLP models
* Create and manage data models and data warehouses
* Ensure data quality, security, and compliance
* Document data architectures and processing workflows
* Contribute to CI/CD and DevOps practices for data environments
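Several of the tasks above (pipeline build, data quality) share one core shape: transform incoming rows and gate out the bad ones. A stdlib sketch of that step under an assumed order schema; the column names and rules are illustrative, not from any Fabric dataset:

```python
# Sketch: a transform step with a simple data-quality gate -- drop
# incomplete rows, normalize types, and keep the rejects for auditing.
from datetime import date

REQUIRED = ("order_id", "amount", "order_date")

def transform(rows):
    clean, rejected = [], []
    for row in rows:
        if any(row.get(k) in (None, "") for k in REQUIRED):
            rejected.append(row)  # quarantine instead of silently dropping
            continue
        clean.append({
            "order_id": str(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return clean, rejected
```

In a Data Factory or Fabric pipeline the equivalent gate would typically be a dataflow validation activity writing rejects to a quarantine table.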
YOUR PROFILE
* Hands-on experience with Azure data services, including Microsoft Fabric (Data Factory, Synapse, OneLake, Notebooks, Pipelines)
* Strong skills in SQL and Python for data processing and transformation
* Solid understanding of data modeling, warehousing, and big data processing
* Familiarity with Azure AI and Cognitive Services
* Ability to work closely with development and analytics teams
* Proficient in English (B2 or higher)
Nice to have
* Exposure to GenAI, LLMs, and Retrieval-Augmented Generation (RAG)
* Knowledge of TypeScript/JavaScript
* Experience with DevOps and CI/CD tools (e.g., Azure DevOps, GitHub Actions)
* Understanding of data governance and data security best practices
* Azure certifications (e.g., DP-203, AI-102)
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Enjoy a hybrid working model that fits your life: after completing onboarding, combine work in a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Architects, Google) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Practical benefits: private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and 40+ options on our NAIS benefit platform, including Netflix, Spotify or Sports card.
Apply now!
$77k-101k yearly est. 5d ago
Data Engineer with Databricks & PySpark
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
YOUR ROLE
Join our dynamic Insights & Data team of over 400 professionals delivering cutting-edge, data-driven solutions. We specialize in Cloud & Big Data engineering, building scalable architectures across AWS, Azure, and GCP. You'll be part of a team that manages the full Software Development Life Cycle (SDLC) using modern frameworks, agile methodologies, and DevOps best practices.
YOUR TASKS
* Develop and maintain data processing pipelines using Databricks and PySpark.
* Collaborate with senior engineers and architects to implement scalable data solutions.
* Work with cloud-native tools to ingest, transform, and store large datasets.
* Ensure data quality, consistency, and security in cloud environments.
* Participate in code reviews and contribute to continuous improvement initiatives.
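A PySpark pipeline like the one in the tasks above is easiest to code-review and test when the row-level logic is kept as a plain function, applied to the DataFrame only at the edge. A sketch of that pattern; the trip schema and fare bands are illustrative, and the Spark wiring is shown only in a comment:

```python
# Sketch: keep transformation logic as a plain function so it can be
# unit-tested without a Spark cluster.

def enrich(row):
    """Row-level transform: derive a fare band from a trip record."""
    fare = float(row["fare"])
    band = "high" if fare >= 30 else "standard" if fare >= 10 else "low"
    return {**row, "fare_band": band}

# On Databricks the same rule would typically be expressed with
# DataFrame expressions, e.g.:
#   df.withColumn("fare_band", F.when(F.col("fare") >= 30, "high")...)

trips = [{"trip_id": "t1", "fare": "42.0"}, {"trip_id": "t2", "fare": "7.5"}]
enriched = [enrich(t) for t in trips]
```

Separating the rule from the cluster runtime is a common practice on Databricks teams precisely because it makes the code-review and CI steps cheap.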
YOUR PROFILE
* You have hands-on experience in data engineering and are comfortable working independently on moderately complex tasks.
* You've worked with Databricks and PySpark in real-world projects.
* You're proficient in Python for data transformation and automation.
* You've used at least one cloud platform (AWS, Azure, or GCP) in a production environment.
* You communicate clearly and confidently in English.
Nice to Have
* Solid SQL skills and understanding of data modeling.
* Exposure to CI/CD pipelines, Terraform, or other DevOps tools.
* Familiarity with streaming technologies (e.g., Kafka, Spark Streaming).
* Knowledge of cloud data storage solutions (e.g., Data Lake, Snowflake, Synapse).
* Relevant certifications (e.g., Databricks Certified Data Engineer Associate).
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Enjoy a hybrid working model that fits your life: after completing onboarding, combine work in a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Architects, Google) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Practical benefits: private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and 40+ options on our NAIS benefit platform, including Netflix, Spotify or Sports card.
Apply now!
$77k-101k yearly est. 58d ago
Senior Data Visualisation Engineer (Power BI)
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
YOUR ROLE
Insights & Data is a thriving team of over 400 professionals focused on delivering advanced data-driven solutions. Our expertise lies in Cloud & Big Data engineering, where we build scalable architectures for processing large and complex datasets across AWS, Azure, and GCP. We manage the entire Software Development Life Cycle (SDLC), leveraging modern data frameworks, programming methodologies, and DevOps best practices to create impactful and efficient solutions.
We are looking for an ambitious Senior Data Visualization Engineer with expertise in Power BI both as a developer and administrator. Join our Data Management Services team to tackle exciting challenges in projects for international clients across diverse industries.
This role goes beyond technical skills: it involves direct collaboration with external clients, building strong partnerships, and ensuring the successful delivery of projects. We value honesty, open communication, and effective teamwork. If you thrive on leading diverse teams and contributing to an open, collaborative environment, we want you on board!
YOUR TASKS
* Collaborate with stakeholders to gather requirements and translate them into effective dashboard solutions with sound UI/UX.
* Design, develop, and implement interactive and visually appealing Power BI dashboards.
* Ensure thorough testing of reports and dashboards.
* Optimize reports for performance and ensure they meet user requirements.
* Manage and administer the Power BI service, including workspace management, data gateway configuration, and user access control.
* Monitor system performance, troubleshoot issues, and ensure high availability and reliability of the Power BI environment.
* Implement best practices for security, data governance, and compliance within Power BI.
* Integrate data from multiple sources, both on-premises and cloud-based, such as SQL, Oracle, mainframe, and Hadoop databases, files, and O365 sources.
* Develop and manage ETL processes to ensure data accuracy, consistency, and reliability.
* Create and manage semantic models to centralize and standardize data for use across multiple reports and dashboards.
* Participate in code reviews and provide feedback to ensure high-quality deliverables.
* Support the users in their development and usage of the Reporting tools.
Frequently used technologies:
* Power BI
* SQL
* Python
* Cloud Data Services: AWS / Azure / GCP
YOUR PROFILE
* Strong experience in Microsoft Power BI
* You have technical skills and a good business understanding/interest.
* You have mastered Power BI, in particular the cloud edition (strong proficiency in DAX and Power Query).
* Good knowledge of Azure Synapse and Power Platform stacks.
* Experience with Power BI administration, including workspace management and security settings.
* Proficiency in SQL and experience with integrating data from various sources.
* You can model data using Kimball dimensional modelling and Data Vault.
* Experience with Cognos BI reporting is a plus
* Knowledge of Business Intelligence methodology and agile development methods such as SCRUM is clearly an asset.
* Strong analytical and problem-solving skills.
* Excellent communication and collaboration skills.
* Ability to manage multiple tasks and prioritize effectively.
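The Kimball/dimensional modelling skill listed above can be shown in miniature: split flat sales rows into a customer dimension with surrogate keys and a fact table that references them. The table and column names below are illustrative only:

```python
# Sketch: Kimball-style star schema in miniature -- one dimension table
# with surrogate keys, one fact table keyed against it.

def to_star_schema(rows):
    dim_customer, key_of = [], {}
    fact_sales = []
    for row in rows:
        cust = row["customer"]
        if cust not in key_of:                    # new dimension member
            key_of[cust] = len(dim_customer) + 1  # surrogate key
            dim_customer.append({"customer_key": key_of[cust], "name": cust})
        fact_sales.append({"customer_key": key_of[cust], "amount": row["amount"]})
    return dim_customer, fact_sales

rows = [
    {"customer": "Acme", "amount": 120.0},
    {"customer": "Globex", "amount": 75.0},
    {"customer": "Acme", "amount": 30.0},
]
dims, facts = to_star_schema(rows)
```

In Power BI the dimension/fact split becomes the semantic model's relationships, which is what keeps DAX measures simple and fast.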
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Practical benefits: yearly financial bonus, private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and access to NAIS benefit platform.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Cutting-Edge Technology: Position yourself at the forefront of IT innovation, working with the latest technologies and platforms. Capgemini partners with top global enterprises, including 145 Fortune 500 companies.
* Enjoy a hybrid working model that fits your life: after completing onboarding, combine work in a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
Apply now!
$77k-101k yearly est. 58d ago
AWS Data Engineer
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
YOUR ROLE
Join our Insights & Data team, a community of over 400 professionals delivering impactful, data-driven solutions. We specialize in building scalable cloud-native data architectures and pipelines that power analytics and machine learning across various industries. Our work spans the full Software Development Life Cycle (SDLC), using modern data frameworks, agile practices, and DevOps principles.
YOUR TASKS
* Design and implement solutions for processing large-scale and unstructured datasets (Data Mesh, Data Lake, or Streaming Architectures).
* Develop, optimize, and test modern DWH/Big Data solutions based on the AWS cloud platform within CI/CD environments.
* Improve data processing efficiency and support migrations from on-premises systems to public cloud platforms.
* Collaborate with data architects and analysts to deliver high-quality cloud-based data solutions.
* Ensure data quality, consistency, and performance across AWS services and environments.
* Participate in code reviews and contribute to technical improvements.
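The Data Lake architectures in the tasks above usually imply Hive-style partitioned layouts in S3, which Glue and Athena can prune by date. A small sketch of building such a key; the bucket layout, table, and file names are illustrative:

```python
# Sketch: build a Hive-style partitioned S3 key,
# e.g. s3://bucket/orders/year=2024/month=03/day=07/part-0000.parquet
from datetime import date

def partition_key(table, event_date, filename):
    d = date.fromisoformat(event_date)
    return (f"{table}/year={d.year}/month={d.month:02d}/"
            f"day={d.day:02d}/{filename}")
```

Writing data under keys like this lets a query such as `WHERE year=2024 AND month=03` scan only the matching prefixes instead of the whole lake.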
YOUR PROFILE
* Proven experience in Big Data or Cloud projects involving processing and visualization of large and unstructured datasets, across various phases of the SDLC.
* Hands-on experience with the AWS cloud, including Storage, Compute (and Serverless), Networking, and DevOps services.
* Solid understanding of AWS services, ideally supported by relevant certifications.
* Familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark.
* Basic proficiency in at least one of the following programming languages: Python, Scala, Java, or Bash.
* Very good command of English (German language skills would be an advantage).
Nice to Have
* Experience with orchestration tools (e.g., Airflow, Prefect).
* Exposure to CI/CD pipelines and DevOps practices.
* Knowledge of streaming technologies (e.g., Kafka, Spark Streaming).
* Experience working with Snowflake or Databricks in a production or development environment.
* Relevant certifications in AWS, data engineering, or big data technologies.
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Practical benefits: yearly financial bonus, private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and access to NAIS benefit platform.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Cutting-Edge Technology: Position yourself at the forefront of IT innovation, working with the latest technologies and platforms. Capgemini partners with top global enterprises, including 145 Fortune 500 companies.
* Enjoy a hybrid working model that fits your life: after completing onboarding, combine work in a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
Apply now!
$77k-101k yearly est. 58d ago
Senior Data Engineer (Matillion & Snowflake)
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
YOUR PROJECT
Join our dynamic team working on innovative data management solutions. Our project focuses on creating advanced analytical and integration systems that support key business processes. We leverage the latest technologies, such as Matillion and Snowflake, to ensure efficient and secure data processing.
YOUR TASKS
* Design and implement advanced data pipelines.
* Collaborate with business teams to understand requirements and use cases.
* Document technical processes and procedures.
* Support Run & Support teams in resolving critical issues.
* Lead technical teams and promote best practices.
YOUR PROFILE
As a Senior Data Engineer, you will be responsible for designing, building, and maintaining advanced data solutions. We are looking for someone with deep technical knowledge and analytical skills who can work with complex data and understand its role within the business context. The ideal candidate has experience in the following areas:
General Expertise:
* Critical thinking and analytical skills.
* Deep and broad understanding of data domains.
* Effective communication skills, including documentation of design, build, test procedures, and maintenance.
* Presentation skills, including explaining technical concepts to a broader audience.
* Ability to lead technical teams, support best practices, resolve technical bottlenecks, and assist Run & Support teams with critical issues.
Technical Skills:
* ETL/ELT: Matillion, BODS, DBT.
* Snowflake and Redshift.
* Data warehousing.
* Coding languages (Python, NodeJS, etc.).
* Experience with cloud technologies and integration.
* Authentication (Basic, KeyPair, Token, etc.).
* Secure data transmission (encryption, etc.).
* General knowledge of Artificial Intelligence & ML.
* Analytics / Business Intelligence.
* SOX controls / audit / compliance.
* GDPR / SPI / PI handling.
* Data retention / archiving.
* User interface technologies (HTML, JS, etc.).
Communication & Collaboration:
* Estimation, design, build, and implementation of data pipelines based on business requirements and use cases.
* Documentation (SDD/TDD knowledge).
* Core standards & per tool.
* Team management.
* Project finances/budgeting.
Process Knowledge:
* OTC (Order-to-Cash).
* Invoicing.
* Omni-channel.
* Warranty / Returns.
System Knowledge:
* Salesforce.
* DCE (Digital Consumer Engagement).
* ERP and finance systems.
Our Expectations:
* Significant experience in a similar role.
* Deep technical knowledge and analytical skills.
* Excellent communication and presentation skills.
* Experience with cloud technologies and integration.
* Proficiency in Matillion and Snowflake.
* Ability to lead teams and manage projects.
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Enjoy a hybrid working model that fits your life: after completing onboarding, combine work in a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Architects, Google) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Practical benefits: private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and 40+ options on our NAIS benefit platform, including Netflix, Spotify or Sports card.
Apply now!
$77k-101k yearly est. 58d ago
Senior Azure Data Platform Engineer (DevOps)
Capgemini Holding Inc. 4.5
Cleveland, OH jobs
YOUR TASKS
* Collaborating with data scientists and software engineers to integrate (Gen) AI models into business applications, such as conversational chatbots and analytics.
* Managing, optimizing, and streamlining Data/AI workflows using AI orchestrators like LangChain
* Implementing Azure Cognitive Services for natural language processing and other AI functionalities
* Designing and managing efficient database schemas to meet application data requirements
* Performing database tuning and query optimization to ensure high performance
* Working closely with cross-functional teams to understand business requirements and translate them into technical solutions
* Documenting data workflows, processes, and architectures
* Ensuring that all data solutions adhere to security and compliance standards, including data encryption and access controls
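The database tuning and query optimization task above can be sketched with an execution plan check. This is illustrative only, using SQLite's `EXPLAIN QUERY PLAN` in place of the Azure SQL/Synapse plan tools the role would actually use; table and index names are made up.

```python
import sqlite3

# Illustrative sketch: verify that a filtered query switches from a full
# table scan to an index seek once a covering index exists. The same
# workflow applies to Azure SQL / Synapse via their execution-plan viewers.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

QUERY = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"

# Without an index the planner must scan every row.
scan_detail = conn.execute(QUERY).fetchall()[0][3]

# After adding an index, the same query can seek directly.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
seek_detail = conn.execute(QUERY).fetchall()[0][3]

print(scan_detail)  # a SCAN of the orders table
print(seek_detail)  # a SEARCH using idx_orders_customer
```

The detail strings differ slightly across SQLite versions, but the scan-versus-seek distinction is the signal a tuning pass looks for.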
YOUR PROFILE
* Proven experience with the Microsoft Azure DevOps environment and services, including Azure Synapse Analytics, SQL Dedicated Pools, and container distribution using Argo CD.
* At least 3+ years of experience in data engineering, with a focus on building and managing ETL pipelines in environments that utilize Oracle, PostgreSQL, and Azure Synapse.
* Hands-on experience with Kubernetes for container orchestration and management.
* Strong experience in using Jenkins and Nexus for continuous integration and continuous deployment (CI/CD) pipelines.
* Strong problem-solving and analytical skills, particularly in optimizing database performance and managing distributed systems.
* Good communication and collaboration skills, especially when working across cross-functional teams.
* Familiarity with Azure Cognitive Services and other AI/ML frameworks in cloud-based architectures.
* Solid understanding of DevOps practices, including container management, automated testing, and deployment processes.
WHAT YOU'LL LOVE ABOUT WORKING HERE
* Well-being culture: medical care with Medicover, private life insurance, and Sports card. But we went one step further by creating our own Capgemini Helpline offering therapeutical support if needed and the educational podcast 'Let's talk about wellbeing' which you can listen to on Spotify.
* Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
* Continuous feedback and ongoing performance discussions thanks to our performance management tool GetSuccess supported by a transparent performance management policy.
* Enjoy a hybrid working model that fits your life - after completing onboarding, combine work from a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
Apply now!
$77k-101k yearly est. 58d ago
Data Loss Prevention (DLP) Engineer - Work Remotely
Capgemini 4.5
Atlanta, GA jobs
A global leader in consulting, technology services and digital transformation, Capgemini is at the forefront of innovation to address the entire breadth of clients' opportunities in the evolving world of cloud, digital and platforms. Building on its strong 50-year heritage and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. It is a multicultural company of over 200,000 team members in more than 40 countries. The Group reported 2018 global revenues of EUR 13.2 billion.
About Infrastructure Services :
The Cloud Infrastructure Services Global Business Line is Capgemini's consulting and infrastructure build-and-run provisioning offering, and supports the group's cloud-based services. As part of the integrated cloud offering from Capgemini, Cloud Infrastructure Services delivers a broad range of cloud services to build and support the hybrid cloud estate by encompassing the leading public cloud players and leading private cloud technologies. With EUR 1.5 billion annual revenue, Cloud Infra Services helps clients virtualize and optimize their IT estates through infrastructure outsourcing services such as data center, helpdesk, network support, and service integration and service maintenance support. Our other services also include infrastructure transformation services-helping clients consolidate and migrate entire workloads and data centers.
Data Loss Prevention (DLP) Engineer Job Description:
• Design, architect, and implement Data Loss Prevention (DLP) technologies
• Work with the client to establish and define DLP parameters to be configured across the enterprise
• Establish monitoring and incident response processes based on the results of DLP events, utilizing a global team to enforce company policies
• Maintain Group Policy Objects (GPOs) and other security controls
• Lead a global team to perform required tasks and client deliverables
• Partner with other infrastructure team members to learn, develop, and enhance the skills needed to support DLP within this complex environment
• Have strong verbal and written skills enabling direct contact with client stakeholders and providing status of various operational activities
• Ability to gain the confidence of the client, team, and other provider teams by meeting operational objectives and improving processes where feasible
• Collaborate with internal infrastructure teams to resolve escalated support issues
• Have strong knowledge of the tools in use in order to lead and execute tool upgrade planning, utilization of available capabilities, and improvements to reduce manual workloads
Qualifications:
• 7+ years of overall IT security experience
• 5+ years' experience with PKI services in large, multi-domain environments
• 2+ years of AppViewX experience, including configuration and management as well as certificate management
• 2+ years' experience with PingIdentity, focusing on Biometric ID Management
• Experience working with and implementing solutions leveraging Kerberos, LDAP, GPOs, group and role mapping, DNS, and DHCP
• B.S. degree in Computer Science or related IT work experience in a global information technology environment
• Certifications in the PKI-related tools listed above are desired
Keywords: Data Loss Prevention Engineer, DLP Engineer
Location: Remote
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Click the following link for more information on your rights as an Applicant - http://*******************************************************************
Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by Capgemini.
$77k-99k yearly est. Auto-Apply 60d+ ago
Data Loss Prevention (DLP) Engineer - Work Remotely
Capgemini 4.5
Texas jobs
A global leader in consulting, technology services and digital transformation, Capgemini is at the forefront of innovation to address the entire breadth of clients' opportunities in the evolving world of cloud, digital and platforms. Building on its strong 50-year heritage and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. It is a multicultural company of over 200,000 team members in more than 40 countries. The Group reported 2018 global revenues of EUR 13.2 billion.
About Infrastructure Services :
The Cloud Infrastructure Services Global Business Line is Capgemini's consulting and infrastructure build-and-run provisioning offering, and supports the group's cloud-based services. As part of the integrated cloud offering from Capgemini, Cloud Infrastructure Services delivers a broad range of cloud services to build and support the hybrid cloud estate by encompassing the leading public cloud players and leading private cloud technologies. With EUR 1.5 billion annual revenue, Cloud Infra Services helps clients virtualize and optimize their IT estates through infrastructure outsourcing services such as data center, helpdesk, network support, and service integration and service maintenance support. Our other services also include infrastructure transformation services-helping clients consolidate and migrate entire workloads and data centers.
Data Loss Prevention (DLP) Engineer Job Description:
• Design, architect, and implement Data Loss Prevention (DLP) technologies
• Work with the client to establish and define DLP parameters to be configured across the enterprise
• Establish monitoring and incident response processes based on the results of DLP events, utilizing a global team to enforce company policies
• Maintain Group Policy Objects (GPOs) and other security controls
• Lead a global team to perform required tasks and client deliverables
• Partner with other infrastructure team members to learn, develop, and enhance the skills needed to support DLP within this complex environment
• Have strong verbal and written skills enabling direct contact with client stakeholders and providing status of various operational activities
• Ability to gain the confidence of the client, team, and other provider teams by meeting operational objectives and improving processes where feasible
• Collaborate with internal infrastructure teams to resolve escalated support issues
• Have strong knowledge of the tools in use in order to lead and execute tool upgrade planning, utilization of available capabilities, and improvements to reduce manual workloads
Qualifications:
• 7+ years of overall IT security experience
• 5+ years' experience with PKI services in large, multi-domain environments
• 2+ years of AppViewX experience, including configuration and management as well as certificate management
• 2+ years' experience with PingIdentity, focusing on Biometric ID Management
• Experience working with and implementing solutions leveraging Kerberos, LDAP, GPOs, group and role mapping, DNS, and DHCP
• B.S. degree in Computer Science or related IT work experience in a global information technology environment
• Certifications in the PKI-related tools listed above are desired
Keywords: Data Loss Prevention Engineer, DLP Engineer
Location: Remote
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Click the following link for more information on your rights as an Applicant - http://*******************************************************************
Applicants for employment in the US must have valid work authorization that does not now and/or will not in the future require sponsorship of a visa for employment authorization in the US by Capgemini.
$81k-105k yearly est. Auto-Apply 60d+ ago
Data Engineer
Capgemini Holding Inc. 4.5
Bogota, NJ jobs
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Requirements:
Advanced English skills (B2+).
Experience in ETL and Azure tools: A minimum of 2-5 years of experience in developing ETL processes using Azure Data Factory, Azure Databricks, and other Azure cloud tools.
Knowledge of Apache Spark and Databricks: Experience using Apache Spark for distributed data processing and knowledge of Databricks for data integration and transformation.
Proficiency in Python and SQL: Advanced skills in Python for automating processes and manipulating data, along with a strong understanding of SQL for querying and transforming data.
Handling large volumes of data: Ability to manage, transform, and optimize large volumes of data using cloud platforms and processing technologies like Spark.
Teamwork and communication: Collaborative skills for working with multidisciplinary teams and the ability to communicate effectively with both technical and non-technical stakeholders.
Responsibilities:
Develop and maintain efficient ETL pipelines using Azure Data Factory and Databricks to process and transform large datasets for analysis.
Analyze large datasets and build machine learning models using Python and Spark to extract actionable insights and solve business problems.
Utilize Apache Spark to process and analyze large volumes of data in a distributed environment, ensuring scalability and performance.
Deploy machine learning models to production using Azure and Databricks, and monitor their performance for continuous improvement.
Work closely with cross-functional teams (data engineers, business analysts, etc.) to understand requirements and communicate insights and results effectively to stakeholders.
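The ETL responsibilities above can be sketched as a minimal extract-transform-load run. This is a local, illustrative stand-in: in the actual role, Azure Data Factory would orchestrate the steps and Databricks/Spark would run the transform at scale; the CSV payload and table names here are made up.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed with one malformed row, to show the cleansing step.
RAW_CSV = """customer_id,amount
1,10.50
2,not_a_number
1,4.25
"""

def extract(text):
    """Extract: parse raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop malformed rows and cast types (basic data quality)."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["customer_id"]), float(row["amount"])))
        except ValueError:
            continue  # a real pipeline would quarantine and log this row
    return clean

def load(rows, conn):
    """Load: write the cleansed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 14.75 — the malformed row was dropped
```

The same three-stage shape carries over directly to ADF pipelines and Databricks notebooks; only the orchestration and compute layers change.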
What You'll Love About Working Here:
We recognize the significance of flexible work arrangements. Whether it's remote work or flexible work hours, you will get an environment that supports a healthy work-life balance.
At the heart of our mission is your career growth. Our array of career growth programs and diverse professions is crafted to support you in exploring a world of opportunities.
Equip yourself with valuable certifications in the latest technologies.
Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications.
1. Applies scientific methods to analyse and solve software engineering problems.
2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development, and maintenance.
3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. The software engineer builds skills and expertise in his/her software engineering discipline to reach standard software engineer skills expectations for the applicable role, as defined in Professional Communities.
5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
$80k-107k yearly est. 45d ago
Analytics Data science and IOT Engineer
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
Role: Analytics, Data Science and IoT Engineer
Responsibilities:
* Understanding the requirement and the ability to relate it to statistical algorithms
* Knowing the acceptance criteria and ways to achieve them
* Complete understanding of business processes and data
* Performing EDA (Exploratory Data Analysis), cleansing, data preprocessing, and data munging, and creating training data sets
* Using the right statistical models and other statistical methods
* Deploying the statistical model using the technology of the customer's preference
* Building data pipelines and machine learning pipelines, with monitoring activities set up for continuous integration, continuous development, and continuous testing
* Investigating the statistical model and providing resolution when there are data drift or performance issues
The Role Offers:
* The opportunity to join a global team to do meaningful work that contributes to global strategy and individual development
* The chance to re-imagine, redesign, and apply technology to add value to the business and operations
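The data-drift investigation mentioned above can be sketched with one simple monitoring signal: flag a feature whose live mean moves far from its training mean. This is a hedged, minimal illustration; production drift monitoring typically uses proper statistical tests (PSI, Kolmogorov-Smirnov), and all numbers here are made up.

```python
import statistics

def drifted(train, live, k=3.0):
    """Return True if the live mean sits more than k training
    standard deviations away from the training mean."""
    mu = statistics.mean(train)
    sigma = statistics.stdev(train)
    return abs(statistics.mean(live) - mu) > k * sigma

train = [10.0, 11.0, 9.0, 10.5, 9.5]   # training feature values (illustrative)
stable = [10.2, 9.8, 10.1]             # live batch close to training
shifted = [25.0, 26.0, 24.0]           # live batch far from training

print(drifted(train, stable))   # False — no drift flagged
print(drifted(train, shifted))  # True — model owners get paged
```

A check like this would run on each scoring batch, with the resolution step (retraining, feature fixes) triggered only when the flag fires.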
$78k-103k yearly est. Auto-Apply 60d+ ago
Big Data Engineer
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
Data Engineer - JD:
* Good communication and strong technical design skills
* Handle end-to-end ETL pipeline development and testing
* Strong experience in ETL / data warehousing
* Strong experience in Databricks, Unity Catalog, Medallion Architecture, Data Lineage, PySpark, and CTEs
* Good experience in data analysis
* Strong experience in SQL queries, stored procedures, views, functions, and UDFs (user-defined functions)
* Experience in Azure Cloud, ADF, Storage & Containers, Azure DB, Azure Data Lake
* Experience in SQL Server
* Experience in data migration and production support
Must-Have Skills:
(a) Databricks, PySpark, Medallion Architecture, CTEs
(b) Python, ADL, ADF, Azure cloud storage and container experience
(c) SQL, stored procedures, views, user-defined functions
(d) SQL Server
(e) Data analysis experience
Nice to Have:
(a) Collibra
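The user-defined function (UDF) skill listed above can be illustrated with a scalar function callable from SQL. Note this is only a sketch: SQL Server UDFs are written in T-SQL, whereas here the same idea is shown via SQLite's `create_function`, which registers a Python callable; the function and table names are invented.

```python
import sqlite3

def to_margin(revenue, cost):
    """Scalar UDF: gross margin ratio, guarding against divide-by-zero."""
    if not revenue:
        return None  # surfaces as SQL NULL
    return round((revenue - cost) / revenue, 4)

conn = sqlite3.connect(":memory:")
# Register the Python callable as a 2-argument SQL function named to_margin.
conn.create_function("to_margin", 2, to_margin)

conn.execute("CREATE TABLE lines (revenue REAL, cost REAL)")
conn.executemany("INSERT INTO lines VALUES (?, ?)", [(100.0, 60.0), (0.0, 5.0)])

rows = conn.execute("SELECT to_margin(revenue, cost) FROM lines").fetchall()
print(rows)  # [(0.4,), (None,)]
```

Keeping the divide-by-zero guard inside the UDF means every query using it inherits the same NULL-on-bad-input behavior, which is the usual reason to centralize logic in a UDF rather than repeating it per query.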
$78k-103k yearly est. Auto-Apply 15d ago
Big Data Engineer
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
JD:
• Experience with big data processing and distributed computing systems like Spark.
• Implement ETL pipelines and data transformation processes.
• Ensure data quality and integrity in all data processing workflows.
• Troubleshoot and resolve issues related to PySpark applications and workflows.
• Understand source, dependencies and data flow from converted PySpark code.
• Strong programming skills in Python and SQL.
• Experience with big data technologies like Hadoop, Hive, and Kafka.
• Understanding of data warehousing concepts and relational databases.
• Demonstrate and document code lineage.
• Integrate PySpark code with frameworks such as Ingestion Framework, DataLens, etc.
• Ensure compliance with data security, privacy regulations, and organizational standards.
• Knowledge of CI/CD pipelines and DevOps practices.
• Strong problem-solving and analytical skills.
• Excellent communication and leadership abilities.
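The "ensure data quality and integrity" requirement above can be sketched as a row-level validation gate. In PySpark this would typically be a `filter`/`withColumn` step over a DataFrame; the plain-Python version below is illustrative, and the column names and rules are invented.

```python
# Hedged sketch of a data-quality gate: split incoming rows into
# valid and rejected sets according to per-column rules, so bad
# records can be quarantined instead of silently corrupting outputs.

RULES = {
    "user_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def validate(rows):
    """Return (valid, rejected) lists according to RULES."""
    valid, rejected = [], []
    for row in rows:
        ok = all(rule(row.get(col)) for col, rule in RULES.items())
        (valid if ok else rejected).append(row)
    return valid, rejected

rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": -5, "email": "broken"},   # fails both rules
]
valid, rejected = validate(rows)
print(len(valid), len(rejected))  # 1 1
```

Writing the rejected set to a quarantine table (rather than dropping it) is what makes the later troubleshooting duty in this JD tractable.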
$78k-103k yearly est. Auto-Apply 60d+ ago
Big Data Engineer
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
Role: Big Data Engineer
Responsibilities:
* Implement data integration and data warehouse solutions using big data technologies
* Should be highly proficient in the use of big data / open-source technologies and standard techniques of data integration and data manipulation, with hands-on contribution
* Should be able to develop cost-efficient and performant data pipelines on the cloud platform
* In-depth understanding of modern big data technology, including data modeling and machine learning skills
* Knowledge of real-time data streaming and aggregation architectural patterns and practices
Essential Skills:
* 3+ years of hands-on knowledge of SQL as well as SQL/NoSQL databases
* Proficient in programming languages such as Python and PySpark/Scala/Java
* Experience with Apache Spark on-prem or in the cloud (Databricks / EMR / DataProc / HDInsight)
* Working knowledge of API development
* Experience in building data pipelines using ETL/ELT methods
* Experience with integration of data from multiple data sources
* Experience in one of the NoSQL databases, such as HBase, Cassandra, or MongoDB
* Knowledge of real-time/streaming/event-based data engineering frameworks, such as Flume, Pub/Sub, Kinesis, or Event Hubs
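The real-time aggregation pattern this posting asks about can be sketched as a tumbling-window rollup. This is an illustrative, in-memory stand-in for what a Kinesis or Event Hubs consumer would do; event fields and the window size are assumptions.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # tumbling window size (assumed)

def window_start(ts):
    """Map an event timestamp to the start of its fixed-size window."""
    return ts - (ts % WINDOW_SECONDS)

def aggregate(events):
    """Sum event values per (window_start, key) — a minimal streaming rollup."""
    totals = defaultdict(float)
    for ts, key, value in events:
        totals[(window_start(ts), key)] += value
    return dict(totals)

events = [
    (0, "clicks", 1.0),
    (30, "clicks", 2.0),
    (65, "clicks", 5.0),  # lands in the next 60-second window
]
print(aggregate(events))  # {(0, 'clicks'): 3.0, (60, 'clicks'): 5.0}
```

Real streaming engines add the parts elided here (watermarks for late events, state checkpointing, exactly-once sinks), but the window-key-then-reduce shape is the same.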
$78k-103k yearly est. Auto-Apply 1d ago
Big Data Lead
Hexaware Technologies 4.2
Data engineer job at Hexaware Technologies
Big Data Lead - JD:
* Lead (hands-on) a team of data engineers
* Good communication and strong technical design decision-making
* Strong experience in ETL / data warehousing
* Strong experience in Databricks, Unity Catalog, Medallion Architecture, Data Lineage, PySpark, and CTEs
* Good experience in data analysis
* Strong experience in SQL queries, stored procedures, views, functions, and UDFs (user-defined functions)
* Experience in Azure Cloud, ADF, Storage & Containers, Azure DB, Azure Data Lake
* Experience in SQL Server
* Experience in data migration and production support
Must-Have Skills:
(a) Databricks, PySpark, Medallion Architecture, CTEs
(b) Python, ADL, ADF, Azure cloud storage and container experience
(c) SQL, stored procedures, views, user-defined functions
(d) SQL Server
(e) Data analysis experience
Nice to Have:
(a) Collibra
$75k-97k yearly est. Auto-Apply 15d ago
Big Data Lead
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
Big Data Lead - JD:
* Lead (hands-on) a team of data engineers
* Good communication and strong technical design decision-making
* Strong experience in ETL / data warehousing
* Strong experience in Databricks, Unity Catalog, Medallion Architecture, Data Lineage, PySpark, and CTEs
* Good experience in data analysis
* Strong experience in SQL queries, stored procedures, views, functions, and UDFs (user-defined functions)
* Experience in Azure Cloud, ADF, Storage & Containers, Azure DB, Azure Data Lake
* Experience in SQL Server
* Experience in data migration and production support
Must-Have Skills:
(a) Databricks, PySpark, Medallion Architecture, CTEs
(b) Python, ADL, ADF, Azure cloud storage and container experience
(c) SQL, stored procedures, views, user-defined functions
(d) SQL Server
(e) Data analysis experience
Nice to Have:
(a) Collibra
$76k-101k yearly est. Auto-Apply 13d ago
Big Data Lead
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
Database Development & Management
• Strong experience in SQL development and query optimization.
• Hands-on experience with Snowflake (preferred), Oracle, or any relational database.
• Understanding of database indexing, partitioning, and performance tuning.
ETL/ELT Development
• Expertise in ETL/ELT development, preferably using Talend (or any other ETL tool).
• Strong proficiency in advanced SQL scripting for data transformation and processing.
• Ability to design, develop, and optimize data pipelines for structured/unstructured data.
• Experience in error handling, logging, and recovery mechanisms for ETL processes.
Data Warehousing & Modeling
• Understanding of data warehousing concepts, including Star Schema and dimension modeling.
• Hands-on experience with Snowflake (preferred) or other cloud/on-prem data warehouses.
• Ability to design and maintain fact and dimension tables for analytics and reporting.
• Knowledge of data partitioning and performance tuning techniques in DWH environments.
CI/CD & Version Control (Good to Have, Not Core Focus)
• Experience using Git for version control of ETL scripts and database objects.
• Exposure to CI/CD tools like TeamCity (preferred), Jenkins, or Azure DevOps for ETL deployment automation.
• Understanding of branching strategies, merging, and automated deployments for ETL processes.
• Familiarity with scheduled job execution and monitoring via CI/CD tools.
Cloud Exposure (Good to Have, Not Core Focus)
• Basic familiarity with Azure or AWS cloud environments.
• Understanding of Snowflake on Azure/AWS or Redshift on AWS (data storage, querying, and schema management).
• Exposure to cloud storage solutions (Azure Blob, AWS S3) for data ingestion and staging.
• Awareness of cloud-based ETL services (Azure Data Factory, AWS Glue) - preferred but not required.
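The Star Schema and dimension-modeling concepts this role requires can be sketched with one fact table joined to one dimension. This is a deliberately tiny illustration run on SQLite rather than Snowflake; the table names, surrogate keys, and figures are all invented.

```python
import sqlite3

# Minimal star-schema sketch: a sales fact table keyed to a date
# dimension via a surrogate date_key, queried with the standard
# fact-to-dimension join and a GROUP BY on a dimension attribute.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date, amount REAL);
INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_sales VALUES (20240101, 100.0), (20250101, 50.0), (20250101, 25.0);
""")

rows = conn.execute("""
    SELECT d.year, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.year ORDER BY d.year
""").fetchall()
print(rows)  # [(2024, 100.0), (2025, 75.0)]
```

The payoff of the star layout is exactly this query shape: the narrow fact table carries measures and keys, while descriptive attributes (here, `year`) live once in the dimension.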
$76k-101k yearly est. Auto-Apply 60d+ ago
Senior Principal Data Architect
Hexaware Technologies, Inc. 4.2
Data engineer job at Hexaware Technologies
We are seeking a seasoned Data & AI professional to join our high-impact team. In this customer-facing role, you will act as the trusted technical advisor to Fortune 1000 enterprises, helping them transform and modernize their data and AI landscapes.
You will combine deep technical expertise with strong consultative selling skills to drive strategic deals from initial discovery through to closed-won.
Key Responsibilities
• Lead consultative sales engagements for enterprise data, analytics, and AI services, including opportunity discovery workshops, solution visioning, and executive-level presentations.
• Translate complex customer business challenges into compelling technical solutions leveraging modern data platforms (Databricks, Snowflake, Microsoft Fabric, AWS, Google Cloud BigQuery, etc.).
• Design and present high-level and detailed architectures for large-scale data platform migrations, lakehouse implementations, real-time analytics, and AI/ML workloads.
• Serve as the technical authority during the sales cycle: scoping requirements, creating proof-of-concept roadmaps, responding to RFIs/RFPs, and delivering persuasive demos and whiteboard sessions.
• Collaborate closely with account executives to build and execute account strategies, expand existing relationships, and accelerate deal velocity.
• Act as a bridge between customers and internal delivery teams to ensure proposed solutions are realistic, innovative, and commercially viable.
• Stay ahead of industry trends and competitively position our offerings against legacy and emerging platforms.