Design and build fault-tolerant, scalable big data platforms and solutions, primarily based on the Hadoop ecosystem.
Build high-throughput messaging frameworks to transport high-volume data.
Build framework(s) to support data pipelines on Spark.
Develop frameworks to deploy RESTful web services.
Build ETL, distributed caching, transactional, indexing, and messaging services.
Build High-Availability (HA) architectures and deployments primarily using big data technologies.
Contribute hands-on to business logic using the Hadoop ecosystem (Java MapReduce, Pig, Scala, HBase, Hive).
Experience with NoSQL, SQL, and in-memory platforms, along with ML and AI, will be an added advantage.
3+ years of hands-on experience building software products and solutions using Java as the primary programming language.
Experience with big data technologies (Hive, HBase, Spark, Kafka, Storm, HDFS, Splunk, Vertica, MemSQL, Cassandra) and an understanding of the concepts and technology ecosystem around both real-time and batch processing in Hadoop.
Expert-level experience with Java programming.
Experience with agile and Scrum methodologies.
Expert-level experience designing high-throughput data services.
Expert-level experience with build and continuous-integration tooling.