To provide true value to customers, banks need to understand them first. To understand customer behaviour, with the goal of offering relevant products and services, the bank collects details of customer transactions and then studies that data to see what deeper understanding can be gleaned.
To facilitate the process of getting insights, banks use a collection of databases called a data repository (also known as a data lake, data mart, data library, or data archive). This repository stores the data sets on which analytics are run to gain insights and predictions about customer behaviour. The business then takes a call on what to offer the customer based on these insights.
Getting to the data sets, however, is a major task. The different sources from which data will be gathered need to be identified. The data from each of these source databases is placed in a data warehouse. Getting data into the warehouse requires ETL tools that extract data from each source database, transform it into a format the data warehouse understands, and load it to create the data set for analytics.
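The extract-transform-load steps above can be sketched in miniature. This is a hypothetical illustration, not any specific bank's pipeline: the table names, schemas, and date formats are invented, and two in-memory SQLite databases stand in for a real source system and warehouse.

```python
import sqlite3

# Hypothetical source system: transactions with amounts stored as text
# and dates in DD/MM/YYYY format (an assumed legacy convention).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE txns (id INTEGER, amount TEXT, txn_date TEXT)")
source.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [(1, "120.50", "05/01/2024"), (2, "75.00", "06/01/2024")],
)

# Hypothetical warehouse: a normalised schema with numeric amounts
# and ISO YYYY-MM-DD dates.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_txns (id INTEGER, amount REAL, txn_date TEXT)")

def etl(src, dst):
    # Extract: pull the raw rows from the source database.
    rows = src.execute("SELECT id, amount, txn_date FROM txns").fetchall()
    # Transform: convert amounts to numbers and dates to the warehouse format.
    transformed = []
    for id_, amount, date in rows:
        day, month, year = date.split("/")
        transformed.append((id_, float(amount), f"{year}-{month}-{day}"))
    # Load: insert the cleaned rows into the warehouse fact table.
    dst.executemany("INSERT INTO fact_txns VALUES (?, ?, ?)", transformed)
    dst.commit()

etl(source, warehouse)
```

A real pipeline would repeat this extract/transform/load cycle for each identified source system, mapping each one's format onto the warehouse schema.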
In fast-moving, real-time businesses like banking, the data needs to be current and accurate, and changes need to be captured. The processes and technical architecture of the data repository need to accommodate these requirements. Many ways to build this technical infrastructure exist, and newer options such as cloud-based, open source, and hybrid deployments, along with connections to social and external data feeds, add more complexity.
Start small, or repurpose an existing data set, and scale up as you build the consensus and maturity to deliver these data sets.