Proficiency in Scala and Spark is a must
Experience in developing Hadoop solutions
Extensive knowledge of Hadoop architecture, HDFS, and related file formats
Experience writing MapReduce jobs
The person should have Hadoop/ETL project work experience.
The person must have worked on the Hortonworks Hadoop Platform.
He/She should have working knowledge of HDFS, Oozie, Hive, and/or Pig