Big Data Architect And Lead

Pitney Bowes Software India Pvt. Ltd.
  • Pune
  • Confidential
  • 10-15 years
  • 09 Sep 2015

  • Project/ Program Management IT

  • IT/ Technology - Software/ Services
Job Description

Impact

As a Big Data Architect and Lead, you will be part of the Big Data and Analytics Platform Team. The team has built a multi-tenant Amazon platform using several technologies, including S3, IAM, EMR, Spark, Kinesis, Redshift, DynamoDB, and R. This platform enables Business Units, Clients, and Partners to publish data catalogs and store data in a common Data Lake, and enables data scientists to develop new value offerings and productize them for monetizable value.

The Job

* Architecting and delivering large-scale Big Data product solutions for customers, leveraging data sources available in Pitney Bowes and integrating them with external/client data sets to derive new monetizable value at scale.
* Working on the cutting edge of a wide range of innovative AWS use cases and AWS Big Data solutions, including S3, EMR, Spark, Redshift, and other integration tools.
* Working with Business Unit teams and Data Scientists to gather solution requirements and translate them into architecture/design using the Data & Analytics Platform.
* Designing processes that move raw data (structured and unstructured) from the data lake through processing and summarization into data marts and the data warehouse to support data discovery needs.
* Lead the build-out of the Hadoop platform and of Java applications
* Provide direction to team members, interfacing with geographically distributed teams
* Create technical documentation, including architecture, data flow, and server configuration diagrams
* Drive the project team to meet planned dates and deliverables
* Work with other big data developers to design scalable, supportable infrastructure
* Support applications until they are handed over to Production Operations
* Perform ongoing capacity management forecasts, including timing and budget considerations

Qualifications & Skills required

* BS Degree in Computer Science/Engineering
* 10+ years of IT experience
* 4+ years of Java development experience preferred

* Must Have Skills:

o 2+ years of production experience with major big data technologies and frameworks, including but not limited to Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and/or other NoSQL databases
o Strong experience with real-time analytics frameworks such as Storm and Esper is preferred
o Experience with one or more of the following tools for data integration / ETL: Talend, Informatica, and Pentaho.
o Experience with Amazon Redshift or similar large-scale data warehousing systems such as Vertica, Aster, Teradata, and Netezza.
o Experience with one or more of the following relational databases: MySQL, PostgreSQL, Oracle, SQL Server.
o Experience with Amazon Web Services technologies such as S3, SQS, EMR, DynamoDB, etc.
o Experience with SQL/shell scripting and procedural languages such as PL/SQL, C, and C++
o Exposure to data analytics and BI tools such as R, SAS, Tableau, and MATLAB is preferred
o Understanding of storage, filesystems, disks, mounts, and NFS
o Excellent customer service attitude, communication skills (written and verbal), and interpersonal skills
o Excellent analytical and problem-solving skills
o Experience leading technology initiatives

* Good to have Skills:

o Experience working in cross-functional, multi-location teams
o Deep understanding and practice of Agile (Scrum)

