Software Engineer - Analytics

Curve HR
  • Delhi, Gurgaon
  • Confidential
  • 1-5 years
  • 241 Views
  • 17 Sep 2015

  • Software Design & Development

  • IT/Technology - Security
Job Description

Job responsibilities:

- Enhance and build near-real-time (low-latency) infrastructure for reporting application statistics and making decisions based on them (a rough sketch of this kind of reporting automation follows this list)

- Candidates should have significant Hadoop experience and knowledge.

- In addition, candidates must be proficient with operating systems and high-availability environments, including Unix and Linux.

- Strong shell programming experience is also required for writing automation scripts.
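
For illustration only (not part of the original posting): a minimal Python sketch of the kind of reporting automation the responsibilities above describe, polling a directory for newly arrived app-stats files and printing a simple summary. The directory path, file format, and polling interval are assumptions.

```python
#!/usr/bin/env python3
# Hypothetical sketch: periodically summarise newly arrived app-stats files.
# The directory, CSV format, and polling cadence are assumptions for illustration.
import csv
import glob
import time

STATS_DIR = "/data/app-stats"      # assumed location of incoming stats files
POLL_INTERVAL_SECONDS = 60         # assumed near-real-time polling cadence

def summarise(path: str) -> None:
    """Print a per-file event count; a real job would push this to a dashboard."""
    with open(path, newline="") as handle:
        rows = list(csv.reader(handle))
    print(f"{path}: {len(rows)} events")

def main() -> None:
    seen: set[str] = set()
    while True:
        for path in glob.glob(f"{STATS_DIR}/*.csv"):
            if path not in seen:
                summarise(path)
                seen.add(path)
        time.sleep(POLL_INTERVAL_SECONDS)

if __name__ == "__main__":
    main()
```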

Skills:

- B.Tech/BS/BE/M.Tech/MS/ME in Computer Science or equivalent

- Experience in software development with Java/Python/Scala/Perl

- Experience with, and understanding of, Apache Hadoop and the MapReduce architecture (see the illustrative sketch after this list)

- Expert-level Unix/Linux experience

- Strong Shell Scripting (Unix Shell, Perl)

- Experience working on distributed systems

- Should have 2-5 years of hands-on experience designing and implementing big data solutions using the stack mentioned below:

- Hands-on experience with Hortonworks Hadoop is preferred.

- Should enjoy being challenged and solving complex problems on a daily basis.

- Should be very strong at troubleshooting issues and improving application performance.

- Strong communication skills
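
As a rough illustration of the MapReduce model named in the requirements above (not part of the original posting): a word-count mapper and reducer written for Hadoop Streaming, which pipes input lines to the script on stdin and reads tab-separated key/value pairs from stdout. The script name and input/output paths in the comment are assumptions.

```python
#!/usr/bin/env python3
# Illustrative Hadoop Streaming word count (mapper and reducer in one file).
# Hadoop Streaming sends input lines on stdin and expects key<TAB>value pairs
# on stdout. A hypothetical invocation (paths and file name are assumptions):
#   hadoop jar hadoop-streaming.jar -mapper "wordcount.py map" \
#       -reducer "wordcount.py reduce" -input <in> -output <out>
import sys

def mapper() -> None:
    # Emit each word with a count of 1.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer() -> None:
    # Reducer input arrives sorted by key, so counts for one word are contiguous.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```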


Job Posted By

About Organisation

Curve HR