The candidate will:
- Be responsible for building the foundation for the Analytic Center, a scalable platform designed to aggregate, compute, and analyze large volumes of data and provide insights to business and technology stakeholders.
- Be responsible for performing complex data manipulation and analytics, while working with business functions to identify and respond to complex technical and business problems.
- Be challenged with developing innovative solutions for parallelizing and optimizing analytical algorithms.
- Be part of a team of talented data scientists, analysts, developers, engineers, and administrators adhering to the organization's software development standards.
- Use big data fabric technologies to acquire, transform, and provision data, and communicate complex findings using visualization tools.
- Maintain a high standard of quality and adhere to best coding practices.
Skills required (essential):
- Strong foundation in compute architecture, parallel processing, and data engineering.
- Good understanding of the Retail Banking or Wealth Management business.
- Proficiency with Hadoop, Hive, Pig, Python, and other supporting technologies and frameworks.
- 5+ years of experience in object-oriented design and development and web application architecture.
- Good practical knowledge of Linux/Unix tools and scripting (Perl, ksh/bash, and Autosys).
- Experience in building solutions with SQL and NoSQL systems.
- Proven track record of building successful data products that leverage large datasets.
- Deep proficiency in mathematical and statistical model generation using tools such as R, SAS, etc.
- Familiarity with machine learning and NLP tools.
- Understanding of the SDLC and its best practices, and experience with a variety of development and change management tools.
- Experience working with an RDBMS and integrating it with applications.
- Bachelor's or advanced degree in an analytical or scientific field such as mathematics, statistics, or computer science.
- Prior experience in the finance industry is preferred.
- Deep expertise in mathematical and statistical model generation using tools such as R, SAS, SPSS, etc.
- Strong foundation in parallel processing, compute architecture, NoSQL systems, and UNIX shell scripting, including Perl.
- Good understanding of the Hadoop ecosystem and related frameworks.