Data: Surely it is an IT issue, right? (Part 1)
Data is not generally seen as a banking issue when, in fact, it is one of the few crucial advantages the industry has.
Ben Robinson argues that the banking sector needs to build on the competitive advantages it has, particularly around data, to deliver a better, fuller, richer banking experience for the consumer.
When discussing data, it’s very easy to couch the issues in technical language – models, warehouses, cubes and so on.
This makes it tempting to see data as an IT issue. However, data lies at the very core of what banks do; it represents the industry's only source of enduring competitive advantage, and its effective use will determine which banks are successful in the digital age.
An explosion of data – and processing capabilities
The amount of data being produced every year is increasing exponentially, by a compound rate of 40%, according to McKinsey estimates.
This is being driven by an explosion in take-up of smart devices (embedded with computer chips capable of recording and transmitting data) and in user-generated content such as photos, Tweets and instant messages.
As Eric Schmidt famously said in 2010: “Every two days now we create as much information as we did from the dawn of civilization up until 2003.”
The same is true in the world of banking. As banking digitises (moves online) and as payments dematerialise (move away from cash), the amount and variety of data is mushrooming.
The digital versus the physical
With the advent of mobile banking, the look-to-book ratio (the ratio of banking interactions, such as balance checks, to transactions) is increasing sharply.
A recent article on Barclays’ Pingit, for example, found that customers use the mobile app 26 times a month on average, compared with two branch visits per month – a ratio of 13 to 1.
This is likely to reach at least 500 to 1 if lessons from other industries, such as travel, are representative.
And it could possibly reach 5,000 to 1 when, with the machine-to-machine interaction of the internet of things, our fridges, wallets and cards all query our bank balance. The variety of data is also growing – banks are able to gain contextual and social data about their customers, for instance.
The good news for banks (and other companies) is that with improvements in processing power, this data can be turned into insights that will help drive a more intimate customer relationship.
Moore’s Law predicted the density of chip transistors would double roughly every two years, which has played out over time and allowed computers to become much more powerful.
But, it is also interesting to look at storage costs (“Kryder’s Law”). These have fallen even faster thanks to increasing disk density: 1GB of data cost more than $200,000 in 1980 compared with less than 3 cents today.
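The arithmetic behind both trends is simple compounding. As a purely illustrative sketch (using only the figures cited above – the 40% compound growth rate and the 1980-versus-today storage prices; all variable names are made up for illustration):

```python
# Data volumes growing at a 40% compound annual rate (the McKinsey estimate)
# double in roughly two to three years.
growth = 1.40
volume, years_to_double = 1.0, 0
while volume < 2.0:
    volume *= growth
    years_to_double += 1
print(years_to_double)  # -> 3 (1.4**2 = 1.96, 1.4**3 ≈ 2.74)

# Storage cost per GB: more than $200,000 in 1980 versus less than
# 3 cents today, per the figures cited in the text.
cost_1980, cost_today = 200_000.0, 0.03
print(f"{cost_1980 / cost_today:,.0f}x cheaper")  # -> 6,666,667x cheaper
```

In other words, even at the cited prices, a gigabyte of storage has become millions of times cheaper – which is why turning ballooning data volumes into insight has become economically feasible.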
Still, banks are doing very little with their data. A survey by Capgemini found, for example, that only 37% of customers believe that banks understand their needs and preferences adequately.
In order to capitalise on their data assets, banks will need to overcome several challenges.
The first involves tackling the prevailing mindset at most banks. Banks’ reputations rest on safeguarding customer assets.
For many bankers, therefore, the priority is to lock down customer data to ensure that it isn’t compromised in any way, for example being burnt onto a USB stick that gets left on a train, or stolen by someone who intends to commit fraud.
While the privacy of customers must be maintained, this mindset will limit banks’ ability to take advantage of data to improve the customer experience.
Changing the data mindset
The second challenge is one of data silos. Banks’ IT systems have been typically built to offer specific products and services (loans, credit cards etc) and the data is product- rather than customer-focused.
The data is moreover bound up in multiple different systems with no conformity on semantic standards (for example, a standard definition of what constitutes a customer).
Banks will need to solve this problem if they are to get data flowing easily and usefully through the entire organisation. The renewal of core banking software is the best means to consolidate data sets and – as echoed by a recent McKinsey study – represents the most fundamental step to digitisation.
A third challenge relates to using unstructured data.
Since few banks today have a single, consolidated view of their structured data, trying to enrich structured data with references to unstructured data might seem a challenge too far.