Service Engineer - Big Data

Rinalytics Advisors Pvt. Ltd.
  • Bangalore, Pune
  • 10-15 lakh
  • 8-12 years
  • 31 Dec 2015

  • Software Design & Development

  • IT/ Technology - Software/ Services
Job Description

The Role

- Design and implement applications to support distributed processing using the Hadoop ecosystem.

- Build libraries, user-defined functions, and frameworks around Hadoop.

- Research, evaluate, and adopt new technologies, tools, and frameworks in the Hadoop ecosystem, such as Apache Kafka, Apache Spark, HDFS, Hive, and HBase.

- Develop user-defined functions (UDFs) to provide custom HBase/Hive capabilities (see the sketch after this list).

- Participate in the installation, configuration, and administration of single-node and multi-node clusters with technologies such as HDFS, Apache Kafka, and Apache Spark.
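
To give a sense of the Hive UDF work mentioned above, here is a minimal sketch of a custom Hive UDF in Java, assuming the classic org.apache.hadoop.hive.ql.exec.UDF simple-UDF API; the package, class, and function names are illustrative only, not part of the posting.

```java
package com.example.hive.udf; // hypothetical package, for illustration only

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Example UDF that lower-cases a string column. After packaging it into a JAR,
// it would typically be registered in Hive with:
//   ADD JAR /path/to/udf.jar;
//   CREATE TEMPORARY FUNCTION my_lower AS 'com.example.hive.udf.LowerCaseUDF';
public class LowerCaseUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // propagate SQL NULLs unchanged
        }
        return new Text(input.toString().toLowerCase());
    }
}
```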

Requirements

- 8-12 years of experience building and managing complex products/solutions.

- 5+ years of total experience in Java/J2EE technologies.

- 2+ years of hands-on development experience with Big Data technologies such as Hadoop (Apache Hadoop, MapR Hadoop, HBase).

- Proficient in streaming-based analytics using Apache Storm and Apache Spark.

- Expert-level programming in Java.

- Hands-on experience building web services on a Java/PHP/Python stack.

- Experience developing RESTful web services with the Spring framework (see the sketch after this list).

- Knowledge of web technologies and protocols (NoSQL/JSON/REST/JMS).
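
As an illustration of the Spring RESTful web services requirement, here is a minimal sketch of a Spring MVC controller in Java; the package, class, endpoint path, and stubbed response are assumptions made for the example, not details from the posting.

```java
package com.example.api; // hypothetical package, for illustration only

import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

// Illustrative controller exposing one GET endpoint; deployed inside a
// Spring web application it would serve GET /jobs/{id}/status.
@RestController
public class JobStatusController {

    @RequestMapping(value = "/jobs/{id}/status", method = RequestMethod.GET)
    public String status(@PathVariable("id") String id) {
        // Stubbed plain-text response for the requested job id.
        return "Job " + id + ": RUNNING";
    }
}
```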


Competencies/Skill sets for this job

Big Data, Hadoop, HBase, Apache HDFS, Java Frameworks

Job Posted By

Manav Das
Associate Partner

About Organisation

Rinalytics Advisors Pvt. Ltd.