Data Governance and Data Quality

The Solvency II Directive (2009/138/EC) is a European Union (EU) Directive that codifies and harmonizes EU insurance regulation.

EU insurance legislation aims to create a single EU insurance market and enhance consumer protection. Its primary concern is the amount of capital that EU insurance companies must hold to reduce the risk of insolvency. Following the EU Parliament vote on the Omnibus II Directive on 11 March 2014, Solvency II should become compulsory for insurance companies on 1 January 2016, a date that has been postponed several times in the past.

Solvency II is somewhat similar to the banking regulations of Basel II. Its framework has three main areas (pillars):

  1. Pillar 1 consists of the quantitative requirements (for example, the amount of capital an insurer should hold);
  2. Pillar 2 sets out requirements for the governance and risk management of insurers, as well as for the effective supervision of insurers;
  3. Pillar 3 focuses on disclosure and transparency requirements.

Data Management and Solvency II

Effective data management is at the root of most Solvency II requirements. Insurance companies have usually generated risk reporting internally, even while depending to some extent on externally provided data. Without modification, this internal-only strategy looks less and less sustainable in the face of the Solvency II requirements. The UK Financial Services Authority's (FSA) internal model approval process (IMAP) thematic review findings explicitly identify data management as an area where insurance companies need to undertake significant work to achieve compliance.

Reporting and analysis requirements outlined in Solvency II are fundamentally more demanding than any existing reporting framework. The challenges are:

  • Establishing and monitoring the provenance of key data;
  • Assigning ownership; and
  • Implementing quality metrics.

The 2011 Report on the fifth Quantitative Impact Study (QIS5) for Solvency II by the European Insurance and Occupational Pensions Authority (EIOPA) underlined that only 20% of firms reported that their data were neither absent nor substantially incomplete (relating to core Pillar II data such as valuations, asset reference and pricing, and policyholder static data). The remaining 80% reported gaps where required data were unavailable or where their quality and completeness were in question. Some data were normally managed manually in spreadsheets, requiring a full audit and the application of a governance policy.

Overall, firms responding to the EIOPA survey reported six different initiatives necessary to deliver Pillar II requirements:

  • Data quality (validation);
  • Gap analysis (identification and mitigation);
  • Process (data governance);
  • Control (focusing on external source control);
  • Look-through (requirements on pooled fund valued position and security reference data); and
  • Infrastructure (data warehouse and workflow).
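
The first of these initiatives, data quality validation, can be pictured as a set of rule checks applied to each incoming record. The Python sketch below is purely illustrative: the field names, rules, and sample records are hypothetical, not taken from any specific Solvency II data standard.

```python
# Minimal data-quality validation sketch (hypothetical fields and rules).
def validate(record):
    """Return a list of rule violations for one asset record."""
    errors = []
    # Completeness: required fields must be present and non-empty.
    for field in ("isin", "price", "valuation_date"):
        if not record.get(field):
            errors.append(f"missing {field}")
    # Plausibility: prices must be positive numbers.
    price = record.get("price")
    if price is not None and (not isinstance(price, (int, float)) or price <= 0):
        errors.append("non-positive price")
    # Format: an ISIN is 12 characters long.
    isin = record.get("isin")
    if isin and len(isin) != 12:
        errors.append("malformed isin")
    return errors

records = [
    {"isin": "GB0002634946", "price": 101.5, "valuation_date": "2014-03-11"},
    {"isin": "XS123", "price": -4.0, "valuation_date": ""},
]
report = {r.get("isin", "?"): validate(r) for r in records}
```

A real platform would hold such rules in a maintained rulebook and log every violation for audit, but the shape of the check is the same.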

57% of EIOPA survey respondents said they had high or medium exposure to third-party providers such as risk modelers, third-party fund managers, custodians, and ex-EEA parent companies. Data audits must therefore extend across all these external providers.

Good quality data is the starting point for a successful Solvency II process, and many insurers have responded by establishing dedicated teams whose remit is to focus on data governance and data quality.

Data Quality and Governance

In the past, the attention of managers and academics was concentrated largely on the quality of products and on their development and management. Later, more and more attention was devoted to processes, on the premise that if the process is sound, the quality of the product will also be good.

In a World 2.0, attention should increasingly be devoted to the data. Thanks to the spread of information and telecommunication technology and of sensors, more and more data are available. Indeed, more and more people are talking about Big Data: this expression refers to data that are processable and have the three characteristics of the 3 V's: large Volume, the need to access them with Velocity, and great Variety (structured and unstructured, internal and external, and so on).

Data are becoming more and more important. Their analysis is labeled Business Intelligence: the use of data to support the building of information. Information is then used to make decisions. If this is the chain (data => information => decision), the quality of the data must be excellent; otherwise, the decisions at the end of the chain will be wrong.
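
The fragility of the data => information => decision chain can be made concrete with a toy calculation; the exposure figures and the decision rule below are invented purely for illustration.

```python
# Toy illustration of the data => information => decision chain
# (all numbers and the threshold are hypothetical).
def decide(exposures, limit=1000.0):
    """Information: total exposure. Decision: accept more risk or not."""
    total = sum(exposures)            # data -> information
    return "accept new business" if total < limit else "stop underwriting"

clean = [300.0, 250.0, 200.0]         # correct data
dirty = [300.0, 250.0, 2000.0]        # one mis-keyed value (200 -> 2000)

# A single bad datum flips the decision at the end of the chain.
```

One mistyped figure is enough to reverse the outcome, which is why quality controls belong at the data end of the chain, not only at the reporting end.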

Actually, until now attention to the quality of data has been relatively limited. In the world of Big Data this is no longer possible. How can the quality of the data be assured? It is time to move from the certification of products to the certification of data.

We believe that this can be achieved by applying Lean and Digitize to the data. This means acting on what we call the 3 P's: People, Processes, and Platforms:
Acting on People essentially means taking care of Data Governance. Data Governance is an integral part of corporate and ICT governance. It combines leadership, organizational structures, and processes to ensure that data provide value to the business. For effective, efficient, and economical Data Governance it is essential to define a proper organization.
Organization is important, but it is also essential to define Processes. In the case of data, these concern the creation, transformation, and loading of data, as well as the auditing and vetting of data already present in the databases. More precisely, the sub-processes are:

  • Data creation/capture/extraction and recording at the time of gathering;
  • Data manipulation/transformation (label preparation, copying of data to a ledger, and so on);
  • Classification and tagging of data (class, observation, and so on) and its recording;
  • Digitization/Transfer of the data;
  • Documentation of the data (capturing and recording the metadata);
  • Data storage and archiving;
  • Data presentation and dissemination (paper and electronic publications, web-enabled databases, and so on);
  • Using the data (analysis and manipulation).

Once the proper organization has been identified and the processes improved, it is necessary to evaluate the tools that can help in the certification of data (what we call here Platforms). Such tools are essential because of the sheer volume of the data, their diversity, and the speed at which the processes must be performed.
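
One task such a platform must support is the documentation sub-process: capturing and recording metadata at the moment data are gathered and transformed, so that provenance and ownership can later be audited. The Python sketch below uses an invented schema and field names to show the idea.

```python
import datetime

# Sketch of metadata capture at recording time (hypothetical schema).
def record_with_metadata(value, source, operator):
    """Wrap a raw datum with the metadata needed for later auditing."""
    return {
        "value": value,
        "metadata": {
            "source": source,            # provenance
            "recorded_by": operator,     # ownership
            "recorded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "transformations": [],       # appended at each processing step
        },
    }

def transform(item, name, fn):
    """Apply a transformation and log it in the item's audit trail."""
    item = {**item, "value": fn(item["value"])}
    item["metadata"]["transformations"].append(name)
    return item

item = record_with_metadata(100.0, source="custodian feed", operator="jdoe")
item = transform(item, "convert GBP->EUR (rate 1.2)", lambda v: v * 1.2)
```

Because every transformation is logged next to the value it produced, the full lineage of a figure in a report can be reconstructed without hunting through spreadsheets.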

Two aspects are important in the data certification:

  1. Accuracy refers to the closeness of measured values, observations, or estimates to the real or true value (or to a value that is accepted as being true);
  2. Precision (or Resolution) can be divided into two main types:
     o Statistical precision is the closeness with which repeated observations conform to each other. It says nothing about their relationship to the true value, so observations may have high precision but low accuracy;
     o Numerical precision is the number of significant digits with which an observation is recorded; it has become far more obvious with the advent of computers.
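
The distinction between accuracy and statistical precision can be demonstrated numerically. In the sketch below (the sample values are invented), repeated measurements cluster tightly around their own mean yet sit far from the true value: high precision, low accuracy.

```python
import statistics

true_value = 10.0
# Repeated observations: tightly clustered, but systematically biased.
observations = [12.01, 12.02, 11.99, 12.00, 11.98]

mean = statistics.mean(observations)
accuracy_error = abs(mean - true_value)      # closeness to the true value
precision = statistics.stdev(observations)   # spread of repeated observations

# Small spread (good precision) coexists with a large bias (poor accuracy).
```

In a data certification exercise, both quantities matter: a bias check against a trusted reference catches the accuracy problem that a repeatability check alone would miss.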

Plenty of work still lies ahead for many insurance companies. Working in these areas is important, but there are other substantial benefits: even where companies already have good data management systems, more formalized control and ownership can bring a better understanding of business activities and outcomes.