* Good understanding of ETL architecture and data warehouse (DW) design
* Good understanding of dimensional modeling
* Extremely proficient in SQL and database design
* Strong knowledge of scripting languages such as Shell and Python (must-have)
* Proficient in at least one ETL tool; Pentaho preferred
* Exposure to BI dashboards/OLAP preferred, e.g. Pentaho Dashboard (CDE)
* Hands-on experience with Hadoop/MapReduce required
* Good understanding of Pig, Hive, and Sqoop
* Exposure to columnar or MPP databases is a plus
The responsibilities of the Data Engineer/ETL Specialist include:
* The Data Engineer will have complete ownership of, and accountability for, the data platform, including the Data Warehouse and the Big Data platform
* The Data Engineer will be responsible for extracting data destined for the Data Warehouse from the operational systems that store it
* Work with the source system analysts/developers to understand the windows available for data extraction
* Program, test, implement, and maintain the data extraction programs needed to move data from the operational systems to the Data Warehouse
* The ETL Specialist is responsible for applying transformation rules as necessary to keep the data clean, consistent, and therefore usable by the user community
* Obtain complete knowledge of the physical database schema
* Evaluate new technologies in the Big Data ecosystem and implement the most relevant ones within the Data Warehouse
* Conduct R&D and recommend the most suitable solutions for the platform
* Build the Hadoop infrastructure from scratch and deliver a Big Data solution capable of processing terabytes of data and generating insights for the company and associated platforms
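As an illustration of the extract-transform-load cycle these responsibilities describe, the following is a minimal sketch in Python. It is not part of the role description; all table names, column names, and cleaning rules are hypothetical, and in-memory SQLite stands in for both the operational source and the warehouse:

```python
import sqlite3

# Hypothetical operational source system (stand-in: in-memory SQLite).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount TEXT, region TEXT)")
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, " 10.50", "north"), (2, "7.25 ", None), (3, "3.00", "south")],
)

def extract(conn):
    """Pull raw rows from the operational system."""
    return conn.execute("SELECT id, amount, region FROM orders").fetchall()

def transform(rows):
    """Apply illustrative cleaning rules: trim and cast amounts,
    default missing regions, normalize casing."""
    return [
        (oid, float(amount.strip()), (region or "unknown").upper())
        for oid, amount, region in rows
    ]

def load(conn, rows):
    """Write cleaned rows into a hypothetical warehouse fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders (id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)

# The ETL pass: extract from the source, clean, load into the warehouse.
warehouse = sqlite3.connect(":memory:")
clean = transform(extract(source))
load(warehouse, clean)
print(warehouse.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())
# → (3, 20.75)
```

In practice the same pattern would be expressed in an ETL tool such as Pentaho or in Hadoop-ecosystem jobs (Sqoop for extraction, Hive/Pig for transformation), but the extract/transform/load split is the same.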