10+ years of software development experience using multiple computer languages.
Experience building large scale distributed data processing systems/applications or large-scale internet systems (cloud computing).
Experience handling very large data repositories and the related technologies
Experience with Hadoop (Apache/Cloudera) and/or other MapReduce platforms
Hands-on experience with the Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, HBase, Flume)
Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef)
Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho)
Hands-on experience with BI tools and reporting software (e.g. MicroStrategy, Cognos, Pentaho)
Hands-on experience with analytical tools, languages, or libraries (e.g. SAS, SPSS, R, Mahout)
Hands-on experience with "productionalizing" Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)
Experience with high-scale or distributed RDBMS platforms (e.g. Teradata, Netezza, Greenplum, Aster Data, Vertica)
Knowledge of NoSQL platforms (e.g. key-value stores, graph databases, RDF triple stores)
Exposure to cutting-edge technologies such as Hadoop, MapReduce, MySQL, NoSQL, Cassandra, and HBase