5+ years of research & distributed computing experience.
Experience working with large, unfiltered data sets using distributed computing tools such as MapReduce, Spark, Pig, Hive, or other big data frameworks.
Experience developing interactive visualizations, drill-down dashboards, and spatial and/or spatio-temporal visualizations.
Ability to translate traditional algorithms or MATLAB code to the MapReduce/Spark framework.
Fluency in a programming or scripting language (e.g., Python, Ruby, Java) for fast prototyping, with significant knowledge of its core libraries and common design patterns.
Proficiency with UNIX/Linux environments.
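To illustrate the algorithm-translation skill listed above, here is a minimal sketch of restructuring a traditional loop-based algorithm (word count) into explicit map and reduce phases. Plain Python is used as a stand-in so the example is self-contained; in actual PySpark the same structure would be expressed with RDD operations, as noted in the comment.

```python
from collections import Counter

def wordcount_loop(lines):
    """Traditional sequential algorithm: accumulate counts in one loop."""
    counts = {}
    for line in lines:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

def wordcount_mapreduce(lines):
    """Same algorithm restructured into MapReduce phases.

    The Spark equivalent of this structure would be roughly:
    rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
    """
    # Map phase: emit a (word, 1) pair for every word occurrence.
    pairs = [(word, 1) for line in lines for word in line.split()]
    # Shuffle + reduce phase: group by key and sum the counts per word.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["to be or not to be"]
assert wordcount_loop(lines) == wordcount_mapreduce(lines)
```

The point of the translation is that the loop's shared mutable state is replaced by independent per-record map output plus a commutative, associative reduction, which is what lets a framework like Spark distribute the work across a cluster.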