Job Description:
- Responsible for building the Hadoop platform and infrastructure from scratch.
- Aligning with development and architecture teams to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
- Responsible for creating the security layer for the Hadoop environment.
- Working with data delivery teams to set up new Hadoop users, including access to HDFS, Hive, Pig, MapReduce, etc.
- Cluster maintenance as well as creation and removal of nodes using tools like Cloudera Manager Express or other open source tools.
- Screen Hadoop cluster job performance and handle capacity planning.
- Monitor Hadoop cluster connectivity and security.
- Manage and review Hadoop log files.
- File system management and monitoring.
- HDFS support and maintenance.
- Responsible for building a Hadoop infrastructure that guarantees high data quality and availability.
- Collaborating with sysadmins to install operating system and Hadoop updates, patches, and version upgrades as required.
- Interact with the business users, Enterprise Architects and Technical Leads to gather the requirements.
- Experienced in building or administering a Hadoop cluster with HDFS, Spark, Kafka, ZooKeeper, Impala, Hive, YARN, Hue, Oozie, etc.
- Experienced in setting up high-availability and disaster-recovery infrastructure across different data centers.
- Experienced in building or administering a real-time streaming environment.
Preferred:
- Must have at least 7 years of experience as a Hadoop administrator.
- Must have experience building and administering an open source Hadoop environment or product without a licensed distribution, warranty, or support.
- Must have experience implementing and administering a Hadoop environment using a distribution such as Cloudera Express (preferred), Apache, or Hortonworks.
- Must have good UNIX or Linux experience, preferably Red Hat.
- Must have experience supporting high-availability and real-time streaming environments.
- General operational expertise: good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
- Experience with Hadoop ecosystem skills such as MapReduce, Hive, Pig, Spark, Kafka, and Oozie is an advantage.
- Knowledge of troubleshooting core Java applications is a plus.
- Knowledge of Puppet, Chef, or Ansible is good to have.
- Knowledge of or experience supporting MySQL and NoSQL databases such as Cassandra and HBase is good to have.
- Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
- At least 7 years of experience with Information Technology.
** U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
To apply for this job, please visit tinyurl.com.