Job Code: SG209
Hadoop SME

Experience: 7-13 years

Location: Singapore

Demonstrated experience designing, building, and migrating Hadoop solutions using a separation-of-storage-and-compute design pattern with object storage (e.g. Hadoop on a cloud-based object store).
- Must have worked with DistCp (Distributed Copy), Flume, Sqoop, and Hive.
- Possess strong Unix, SQL, and Python scripting skills.
- Should have knowledge of Hadoop configuration (e.g. core-site.xml, hdfs-site.xml) and Hadoop security models (Kerberos, Knox, Ranger, Sentry, ACLs).
- Proficient in Hadoop performance management (YARN, ZooKeeper, MapReduce) and scheduling tools (TWS or similar).
- Knowledge of any of S3 (Cleversafe), Alluxio, Teradata, or Java is good to have.
- Minimum 8-12 years of data platform / foundation experience.

Apply now: kindly share your CV with "SG209" as the subject line.