Hadoop administrator job description
Updated March 14, 2024
7 min read
Example Hadoop administrator requirements on a job description
Hadoop administrator requirements can be divided into technical requirements and required soft skills. The lists below show the most common requirements included in Hadoop administrator job postings.
Sample Hadoop administrator requirements
- Hands-on experience in Hadoop administration (a few typical commands are sketched after this list)
- Expertise in setting up and managing Hadoop clusters
- Proficiency in Linux system administration
- Knowledge of Hadoop ecosystem components such as HDFS, YARN, Hive, Pig
- Understanding of network configuration and security
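To make the technical list above more concrete, here is a minimal sketch of the day-to-day health checks a Hadoop administrator runs from the command line. The cluster and paths are hypothetical, and exact options vary by Hadoop version and distribution.

```bash
# Everyday Hadoop cluster sanity checks (illustrative; run as an HDFS/YARN admin).

# Summarize HDFS capacity and list live/dead DataNodes:
hdfs dfsadmin -report

# List the NodeManagers YARN knows about and their states:
yarn node -list -all

# Scan HDFS metadata for missing or corrupt blocks:
hdfs fsck / -files -blocks | tail -n 20
```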
Sample required Hadoop administrator soft skills
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills
- Ability to work independently and as part of a team
- Flexibility to work in a fast-paced and dynamic environment
- Willingness to learn and adapt to new technologies
Hadoop administrator job description example 1
Infosys Public Services Hadoop administrator job description
Infosys is seeking a Hadoop Administrator who will be responsible for implementation and ongoing administration of Hadoop infrastructure. The candidate will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
Required Qualifications:
- The candidate can be located anywhere in the US. This position may require travel to the project location.
- Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- At least 7 years of information technology experience.
- At least 5 years of Hadoop administration experience (CDP).
- Experience in Hadoop installation, upgrades, and migration (CDP).
- Experience in managing clusters using Ambari, and in Solr administration.
- Experience with HDFS, Hive, NiFi, Ranger, YARN, Spark, ZooKeeper, and Livy services.
- Experience in Hadoop security (AD, Kerberos, Knox, Ranger, encryption techniques).
- Experience with HDFS snapshots and DR solutions (see the sketch after this list).
- Experience in shell scripting.
- Excellent troubleshooting and problem-solving skills.
- U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
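Several of the required qualifications above come down to a handful of HDFS commands. As a minimal sketch (the directory names, snapshot name, and NameNode addresses below are hypothetical), snapshots and DistCp-based DR replication look like this:

```bash
# Hypothetical paths and cluster addresses; adjust for your environment.

# Allow snapshots on a directory (one-time, requires HDFS admin rights):
hdfs dfsadmin -allowSnapshot /data/warehouse

# Take a read-only, point-in-time snapshot before a risky change:
hdfs dfs -createSnapshot /data/warehouse pre_upgrade

# Restore a file by copying it back out of the hidden .snapshot directory:
hdfs dfs -cp /data/warehouse/.snapshot/pre_upgrade/orders.csv /data/warehouse/

# Replicate the directory to a DR cluster with DistCp:
hadoop distcp hdfs://prod-nn:8020/data/warehouse hdfs://dr-nn:8020/data/warehouse
```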
Preferred Qualifications:
- RHEL knowledge, administration level preferred.
- Hadoop certification is good to have.
- MySQL, Vertica, and MongoDB database administration is good to have.
- Knowledge of integrating third-party client tools to connect to the Hadoop platform.
- Planning and coordination skills.
- Good communication and analytical skills.
The job entails sitting as well as working at a computer for extended periods of time. Should be able to communicate by telephone, email, or face to face.
About Us
Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.
Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.
EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin
Hadoop administrator job description example 2
BRMi Hadoop administrator job description
**Overview**
BRMi Technology is seeking a Cloudera/Hadoop Admin.
**Can be 100% remote in VA, MD, or FL**
**Responsibilities**
+ Responsible for the administration of the Cloudera CDP on-prem cluster
+ Implement Knox and Kerberos for cluster security and integrate with enterprise AD (see the Kerberos sketch after this list)
+ Develop scripts to automate and streamline operations and configuration
+ Maintain and update the cluster to support day-to-day operations in the enterprise
+ Perform minor and major upgrades, support monthly OS patches
+ Manage and remediate vulnerabilities
+ Support project teams in troubleshooting job issues and deployments where needed
+ Efficiently copy data within and between clusters; create and restore snapshots of HDFS directories
+ Research performance issues, configure the cluster per Cloudera best practices, and optimize specifications and parameters to fine-tune and proactively avoid performance issues
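To illustrate the Kerberos item above, here is a minimal sketch of creating a service principal and keytab with MIT Kerberos tools. The realm, host name, and keytab path are hypothetical, and Cloudera Manager can automate these steps on a CDP cluster.

```bash
# Hypothetical realm, host, and keytab path; shown with MIT Kerberos tools.

# Create a service principal for the NameNode with a random key:
kadmin.local -q "addprinc -randkey nn/master01.example.com@EXAMPLE.COM"

# Export its key to a keytab readable by the HDFS service account:
kadmin.local -q "ktadd -k /etc/security/keytabs/nn.service.keytab nn/master01.example.com@EXAMPLE.COM"

# Verify the keytab contents and obtain a ticket as that principal:
klist -kt /etc/security/keytabs/nn.service.keytab
kinit -kt /etc/security/keytabs/nn.service.keytab nn/master01.example.com@EXAMPLE.COM
```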
**Qualifications**
+ Hands-on experience with Cloudera installation, configuration, debugging, tuning, and administration
+ Strong hands-on experience implementing security (e.g., Kerberos, Ranger, TLS/SSL) and performing OS upgrades
+ Experience administering distributed applications: Hadoop, Spark, HBase, Kudu, MapReduce, Hive, Impala
+ Understanding of vulnerabilities and remediations
+ Experience with scripting or other automation
+ Experience with performance monitoring and tuning
+ Working knowledge of networks, Linux OS, and Unix shell scripting
+ Experience with Cloudera stack upgrades
+ Understanding of how Ranger's authorization mechanism works
+ Demonstrated ability to find the root cause of a problem in CDP clusters, optimize inefficient execution, and resolve resource contention scenarios (illustrated after this list)
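As a minimal sketch of the root-cause work named in the last item (the application ID and queue name below are hypothetical), triage of a failed or resource-starved YARN job often starts like this:

```bash
# Hypothetical application ID and queue name; requires YARN log aggregation.

# Pull the aggregated container logs for a finished job and scan for errors:
yarn logs -applicationId application_1660000000000_0042 | grep -i error | head -n 20

# See which applications are currently holding resources:
yarn application -list -appStates RUNNING

# Check utilization of a scheduler queue suspected of contention:
yarn queue -status default
```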
**BRMi will not sponsor applicants for work visas for this position.**
**This is a W2 opportunity only.**
**EOE/Minorities/Females/Vet/Disabled**
We are an equal opportunity employer that values diversity and commitment at all levels. All individuals, regardless of personal characteristics, are encouraged to apply. Employment policies and decisions on employment and promotion are based on merit, qualifications, performance, and business needs. The decisions and criteria governing the employment relationship with all employees are made in a nondiscriminatory manner, without regard to race, religion, color, national origin, sex, age, marital status, physical or mental disability, medical condition, veteran status, or any other factor determined to be unlawful by federal, state, or local statutes.
**Job Locations** _VA | MD | FL_
**Posted Date** _1 month ago_ _(9/7/2022 9:39 AM)_
**_ID_** _2022-3285_
**_# of Openings_** _1_
**_Category_** _IT Operations_
Hadoop administrator job description example 3
Epsilon Hadoop administrator job description
Love cutting-edge tech? We do too.
At Epsilon, we do more than collect and store data. We help some of the world's biggest brands discover real opportunities inside the data types, delimiters and decimals. Epsilon is a leading provider of multi-channel marketing services, technologies and database solutions.
Epsilon is seeking a Senior Hadoop Admin with a balance of skills in design, physical implementation, and performance tuning to work out of our Dallas, Texas location, supporting Data Warehouse and Big Data environments running on premises or with external cloud service providers for Database Marketing clients.
Duties and Responsibilities:
- Design, configure, and manage big data implementations on premises and in the cloud (e.g., Cloudera, open-source Hadoop)
- Support cloud and on-prem servers, including security configurations, patching, and troubleshooting
- Set up authentication for Hadoop services, including creating Kerberos principals and integrating with Active Directory
- Set up role-based authorization using Sentry and Ranger services
- Implement data protection using HDFS encryption at rest with KMS services (see the sketch after this list)
- Set up high-availability configurations for Hadoop services
- Tune performance of Hadoop clusters and Spark/Impala services
- Troubleshoot Python/Scala programs and provide feedback to data science engineers
- Plan and perform Hadoop cluster upgrades
- Monitor and report on performance metrics, system health, and logging compliance in the cloud
- Perform database administration tasks, including backup and recovery, troubleshooting, performance tuning, and SQL optimization
- Work closely with IT, developers, and competency teams to ensure that application availability and performance stay within agreed-on service levels
- Interact with vendor support to ensure open issues are resolved within defined SLAs
- Manage data replication, backup snapshots, backup storage, and archiving procedures
- Evaluate new database technologies and products and recommend them to technical management
- Provide 24x7x365 support through an on-call rotation
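As a minimal sketch of the encryption-at-rest duty above (the key and zone names are hypothetical, and a Hadoop KMS must already be configured), creating an HDFS encryption zone looks like this:

```bash
# Hypothetical key and zone names; assumes the Hadoop KMS is up and configured.

# Create an encryption key in the KMS:
hadoop key create warehouse_key

# Create an empty directory and make it an encryption zone backed by that key:
hdfs dfs -mkdir /secure/warehouse
hdfs crypto -createZone -keyName warehouse_key -path /secure/warehouse

# Confirm the zone exists:
hdfs crypto -listZones
```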
Preferred Skills
- Experience with the Cloudera or Hortonworks distribution
- 5 years of hands-on experience with cloud platforms (AWS or Azure)
- Experience with large project implementations using AWS EC2 and S3
- Experience with big data client tools such as SQuirreL and DbVisualizer
- Experience with Databricks cloud solutions
- Experience with Cloudera CDP public cloud implementations
Qualifications
- 4-6 years of experience in general database administration
- At least 3 years of experience managing and maintaining Hadoop (required)
- Knowledge of cloud and database technologies is a plus (e.g., AWS/Azure services, Oracle, SQL Server)
- Understanding of data warehousing MPP architecture, technologies, and concepts
- Ability to diagnose problems and resolve issues across various tiers (application, database, network, and appliance)
- Strong Unix/Linux and shell scripting skills
- Proficiency in backup and recovery methodologies and techniques
- Adherence to SDLC, change management, troubleshooting, and development methodologies
- Ability to work with minimal supervisory direction
- Excellent written and verbal communication skills
Education/Certification
B.S. in Computer Science or related discipline or equivalent work experience required.
AWS certification is a plus.
Keywords:
- Spark
- Hive
- Cluster Administration
- KMS/KTS
- Replication
- Cloudera
- OpenSource Hadoop
- Python/PySpark
- HBase