- Develop improvements to big data components to optimize performance and reliability in cloud environments.
- Add new big data components to our Data Platform.
- Add support for new cloud platforms.
- Enhance orchestration of cloud resources, optimizing for cost, latency, and performance.
- Add new functionality to back-end components, for example supporting new data formats or adding new analytic functions.
- Analyze customer workloads, and identify and develop mechanisms to improve their efficiency and latency.
- Strong programming skills and expertise in at least one of Java, Python, or C/C++.
- Ability to design modular software using object-oriented principles.
- Experience in developing on Unix/Linux systems.
- Experience and interest in developing and debugging complex distributed systems.
- Experience developing complex standalone systems such as file systems and databases, or web services such as web search and ad serving.
- Familiarity with clouds such as AWS, GCE, or Rackspace, and experience with cloud libraries such as Boto or jclouds, is a plus.
- Familiarity with, and a track record of contributing to, big data projects such as Hadoop, HBase, or Cassandra is a plus.
- Excellent communication skills and strong troubleshooting and problem-solving abilities.