Description
About our client
Our client operates in the financial services industry and is headquartered in Singapore. It has branches in more than 15 countries, employs more than 25,000 people worldwide, and is listed on the Forbes Global 2000 (2022). Its core business is offering financial services to clients, ranging from investment banking to corporate and personal banking, and it is also well known for its residential home loan business.
Job description
Responsibilities:
- Play a key role in the Data Lake team by onboarding all critical upstream system data.
- Act as gatekeeper, ensuring that upstream teams and business users follow Data Lake best practices and principles.
- Handle Data Lake registration and the restoration of masked production data into UAT.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
Requirements:
- Strong data warehouse experience in the banking domain, including knowledge of upstream systems
- Demonstrated ability to solve complex problems
- Familiarity with test methodologies and testing tools (e.g. TestNG, JUnit), experience with the full SDLC, and participation in large-scale live roll-outs
- Hands-on experience in Control-M deployment, support, and performance monitoring
- Minimum of 5 years of strong ETL experience
- Good knowledge of and working experience with Hadoop, ETL (Informatica BDM), and SQL
- Experience in database design, programming, tuning, and query optimization
- Experience loading data files into EDW staging using Teradata (optional)
- Unix scripting experience
- Good communication and interpersonal skills are a must
- A great attitude and the flexibility to stretch and take on challenges will be key to success in this role