Requirements:
- Experience working on big data platforms, implementing projects with high SLAs for data availability and data quality; exposure to public cloud technologies is a major plus
- 3+ years of experience in Big Data technology
- Experience with big data tools such as Spark, HDFS, Hive, Impala, Kudu, and Kafka, plus programming in Scala and Python
- Experience with relational SQL and NoSQL databases
- Good working knowledge of Bash scripting
- Substantial exposure to enterprise infrastructure, data processing, and enterprise software applications
- Experience delivering large-scale distributed systems
- Flexible and people- and organization-savvy: able to work at a fast pace in a matrix environment across regions, building effective relationships and communicating well with teams across time zones
- Working experience with version control and CI/CD tools such as Subversion/Git, Maven, and Jenkins
- Good knowledge of agile delivery practices
- Excellent communication and presentation skills
- Familiarity with the finance business domain, P&L attribution, and/or risk P&L
- Familiarity with microservice architecture, JSON, and API-based interfaces
- Experience redesigning legacy solutions using the Big Data technology stack
- Previous participation in a change management or process re-engineering team
- Experience at a major investment or global commercial bank
- Experience working in a global team spanning multiple regions and locations
- Self-motivated, enthusiastic, and a proven fast learner with strong problem-solving skills
- Takes ownership of assigned tasks through to resolution
- Delivers solutions accurately and on time, following IT best practices and standards
- BI and analytical skills