Challenges in the Banking and Insurance Industry and the Role of Technology

Michel Philippens, Business Development Manager Banking, SAS Belgium & Luxembourg
Sebastien Schouteten, Business Development Manager Insurance, SAS Belgium & Luxembourg

Leveraging high-performance risk technology to capitalize on Basel III reforms

Over the coming years, regulators will implement a wave of risk management reforms designed to bring greater stability to the banking system. Basel III, probably the most far-reaching of these regulations, defines new requirements regarding the amount and quality of capital and introduces new liquidity ratios: the Liquidity Coverage Ratio (LCR) and the Net Stable Funding Ratio (NSFR). These rules will put additional constraints on capital and liquidity and will change the competitive dynamics of the industry, ultimately leading many banks to revisit their business models.
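In simplified terms, the LCR requires a bank's stock of high-quality liquid assets to cover its total net cash outflows over a 30-day stress period, while the NSFR requires available stable funding to cover required stable funding over a one-year horizon:

\[ \text{LCR} = \frac{\text{high-quality liquid assets}}{\text{net cash outflows over 30 days}} \geq 100\%, \qquad \text{NSFR} = \frac{\text{available stable funding}}{\text{required stable funding}} \geq 100\% \]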

Despite the strong business impact of these regulations, banks should not ignore the technology side of these reforms. Merely addressing the compliance tasks related to Basel III is already a considerable challenge. Unlike Basel II, which focused predominantly on credit risk, Basel III introduces a more complete coverage of risks and forces banks to take a 360-degree view of risk. The introduction of liquidity risk into the Basel risk taxonomy, for example, has proven to be a daunting data integration challenge for many banks, requiring a cross-departmental view of balance sheet, collateral and cash-flow-related information.

However, if banks were only to focus on compliance, this would be a missed opportunity. The challenging new business environment is an incentive to adopt a more sophisticated approach to allocating and using scarce financial resources such as liquidity and capital. In order to generate the insights that enable more active steering of financial resources, most banks will have to rethink and revamp their existing Risk-IT architectures, the collection of systems used to measure, manage and control the data associated with the bank’s risk position. A joint paper by The Boston Consulting Group, Platinion and SAS (1) discusses how the enhancement of Risk-IT architectures is likely to evolve in a phased manner, going from the most basic “Compliance” level, through a subsequent “Transparency” level, to eventually arrive at the most sophisticated “Forecast-enabling” level.

Towards a transparent Risk-IT architecture

The “Transparency” maturity level is characterized by the business objective of actively managing and steering the bank against risk targets at different levels of the organization, such as the entity, business line, product or counterparty level. This implies that business users should have access to high-quality, timely and granular risk data, and should be able to display aggregated risk data and drill down from an aggregated risk dashboard to the single-transaction level in just a few mouse clicks. Banks moving from the “Compliance” level to the “Transparency” level will need to make a number of enhancements to their Risk-IT infrastructures, including:

  • The construction of conformed core data marts per risk domain (credit, market, liquidity, operational risk, integrated risk), combining aggregated risk result data and full position or risk factor details.
  • The automation of recurrent data quality, reconciliation and balance coverage checks to ensure the validity and completeness of the risk data (a minimal sketch of such a check follows this list).
  • The automation of previously manual processes, such as the (dis)aggregation of risk capital numbers.
  • Harmonization of risk reporting technologies and approaches across risk domains.
  • Adoption of next-generation reporting and visualization tools to satisfy business demands on flexibility, cost-effectiveness and performance. The introduction of “in-memory” reporting tools deployed on commodity grid or multi-processor hardware will allow “self-service” access to current and past risk data at any level of granularity – whilst avoiding the cost of setting up and maintaining multiple intermediate data- or semantic layers.
  • Introduction of mobile BI reporting applications to strengthen board and senior level use of risk reporting information.
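As a minimal illustration of the automated reconciliation checks mentioned above, the Python sketch below compares exposure totals in a hypothetical credit-risk data mart against general-ledger balances per business line and flags any gap above a tolerance. The table names, column names and threshold are assumptions chosen for brevity, not a description of any particular implementation.

    import pandas as pd

    # Toy reconciliation check: compare exposure totals in a credit-risk data mart
    # against general-ledger balances per business line and flag material gaps.
    # All names and the 0.1% tolerance are illustrative assumptions.
    TOLERANCE = 0.001

    def reconcile(risk_mart: pd.DataFrame, ledger: pd.DataFrame) -> pd.DataFrame:
        """Return one row per business line with mart total, ledger total and gap."""
        mart_totals = risk_mart.groupby("business_line")["exposure"].sum()
        ledger_totals = ledger.groupby("business_line")["balance"].sum()
        report = pd.concat([mart_totals, ledger_totals], axis=1).fillna(0.0)
        report["gap"] = report["exposure"] - report["balance"]
        report["gap_pct"] = report["gap"].abs() / report["balance"].abs().replace(0, 1)
        report["breach"] = report["gap_pct"] > TOLERANCE
        return report

    # Example with toy data: the Corporate gap of 2.0 stays below the tolerance.
    risk_mart = pd.DataFrame({"business_line": ["Retail", "Corporate"], "exposure": [1000.0, 2500.0]})
    ledger = pd.DataFrame({"business_line": ["Retail", "Corporate"], "balance": [1000.0, 2498.0]})
    print(reconcile(risk_mart, ledger))

Such a check can be scheduled to run after every data load, turning a previously manual control into an automated, auditable step.
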
From “Transparency” to “Forecast-enabling”

Going beyond the “Transparency” level, the maturity spectrum peaks at the “Forecast-enabling” level. Banks at this level are characterized by a forward-looking culture and manage the bank by taking preventive measures based on an array of positive and adverse scenarios for the bank’s key economic and regulatory indicators. Today, many institutions are still limited in their forecasting or stress testing capabilities and are far from a) offering their risk staff the ability to run simulations and stress tests on demand and b) obtaining the results of these simulations in near real time.

The benefits of having near-real-time simulation capabilities could, however, be substantial, as Ming Soong, CRO of UOB, explains (2): “the ability to understand market changes before your competitors give you the additional advantage. It means that we can take products to market a lot sooner or exit markets a lot sooner…. It will also enable risk management to demonstrate the effects of large credit decisions to senior managers. They will be able to see how particular credit decisions impact the entire portfolio in almost real time. Also, if we have the capacity and the capability to make complex credit decisions with any degree of accuracy or comfort in near-real time, it could endear us to our customers and create a safer environment for us to operate.”

Moving Risk-IT architectures to this maturity level will involve the implementation of a dedicated IT infrastructure for detailed stress testing and simulation. The combination of new-generation, commodity-priced grid or multi-processor hardware with in-memory analytical engines will then allow processes that normally run over days to be run in hours or minutes.
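To make tangible why such infrastructure compresses run times so dramatically, the toy Python sketch below revalues a hypothetical linear portfolio under a large number of independent market scenarios and spreads the work across processes. The position sizes, shock distribution and chunking are assumptions chosen for brevity; the point is simply that scenario runs are independent and therefore parallelize well on grid or multi-processor hardware.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    N_POSITIONS = 500
    N_CHUNKS = 10
    SCENARIOS_PER_CHUNK = 10_000

    def revalue_chunk(seed: int) -> np.ndarray:
        """Revalue the toy portfolio for one chunk of scenarios; return P&L per scenario."""
        rng = np.random.default_rng(seed)
        exposures = np.linspace(1.0, 5.0, N_POSITIONS)                             # assumed position sizes
        shocks = rng.normal(0.0, 0.02, size=(SCENARIOS_PER_CHUNK, N_POSITIONS))    # risk-factor shocks
        return shocks @ exposures                                                  # linear P&L approximation

    if __name__ == "__main__":
        # Chunks run concurrently; adding workers shortens wall-clock time almost linearly.
        with ProcessPoolExecutor() as pool:
            pnl = np.concatenate(list(pool.map(revalue_chunk, range(N_CHUNKS))))
        print(f"1st percentile of simulated P&L: {np.percentile(pnl, 1):.2f}")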

Costs and benefits

Banks that go beyond compliance to create greater transparency and enhanced forecasting or simulation capabilities will be better able to actively steer financial resources and will earn a significantly higher return on the investments required by these new regulations (1). Despite the IT costs associated with moving along the maturity spectrum of Risk-IT architectures, many industry participants are convinced that the long-term benefits outweigh the costs. Increasingly, the enhancements to Risk-IT architectures described in this paper are even perceived as a prerequisite rather than a source of potential differentiation.

In this context, it is useful to refer to the Basel Committee’s paper on Principles for effective risk data aggregation and risk reporting (3), which states: “Many in the banking industry recognize the benefits of improving their risk data aggregation capabilities and are working towards this goal. They see the improvements in terms of strengthening the capability and the status of the risk function to make judgments. This leads to gains in efficiency, reduced probability of losses and enhanced strategic decision-making, and ultimately increased profitability… Strong risk management capabilities are an integral part of the franchise value of a bank. Effective implementation of the Principles [for effective risk reporting] should increase the value of the bank. The Committee believes that the long-term benefits of improved risk data aggregation capabilities and risk reporting practices will outweigh the investment costs incurred by banks.”

Innovation in the Insurance industry

Much has been written about how conservative the insurance industry is compared to many other industries, especially when it comes to technology. For example, a 2010 report surveyed the use of social media across 30 different industries; insurance came in 28th, ahead of only zoos and funeral homes. So it beat animals and dead people!

But things are changing. Recently, many conferences have highlighted technology innovation within insurance. For example, one insurance company is using GPS information from smartphones to trigger location-based insurance offers, such as travel insurance when the insured is at an airport. Another instance of how advances in technology are helping insurers is the use of radio-frequency identification (RFID) devices, which are implanted in animals to track and identify livestock, helping insurers rate and price farm insurance more accurately.

These new data sources also create a big data challenge. First, consider the amount of data automotive telematics devices are expected to generate. Every second, a telematics device produces a data record containing information such as the date, time, speed, longitude, latitude, acceleration or deceleration, cumulative mileage and fuel consumption. Depending on the frequency and length of the trips, these records can represent approximately 5 to 15 MB of data annually, per customer. With a customer base of just 100,000 vehicles, this quickly adds up to around a terabyte of data per year!
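A quick back-of-the-envelope calculation shows where that figure comes from; the midpoint of the quoted per-customer range is an assumption.

    # Back-of-the-envelope check of the telematics data volumes quoted above.
    MB_PER_CUSTOMER_PER_YEAR = 10   # assumed midpoint of the 5-15 MB range
    CUSTOMERS = 100_000

    total_mb = MB_PER_CUSTOMER_PER_YEAR * CUSTOMERS
    print(f"{total_mb / 1_000_000:.1f} TB per year")   # -> 1.0 TB per year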

In the areas of Solvency II and fraud detection, the big data challenge is the same. Risk models developed by actuaries, whether stochastic or deterministic, need large amounts of detailed information: customer data (such as name, sex, age, insured capital, …), disparate data from across the company (liability data such as information about insurance products, but also asset data) and data from outside the company (economic data such as yield curves and volatilities, as well as data coming from different countries or entities). Moreover, the Solvency II regulation imposes additional requirements regarding data quality. For example, Article 48 of the Solvency II directive requires insurance companies to proactively assess the sufficiency and quality of the data used to calculate technical provisions.

Next, once all this data is collected, insurance companies need to prepare and analyze this vast amount of data. By using data exploration and analytics, insurers will be able to rank and weigh hundreds of new variables generated by telematics and develop highly accurate telematics pricing models based on a driver’s past and forecasted driving behavior. For example, insurers could use a correlation matrix to quickly identify which variables are related and to determine the strength of the relationship. Insurance companies cannot rely on traditional data mining technology to analyze all of this new data. Given the sheer size of the data, insurers must consider a distributed, in-memory environment and display the results of data exploration and analysis in a way that is meaningful but not overwhelming.
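A minimal sketch of the correlation-matrix idea is shown below, using a few invented telematics variables; the variable names and the generated data are assumptions made purely for illustration.

    import numpy as np
    import pandas as pd

    # Invented telematics variables for 1,000 drivers.
    rng = np.random.default_rng(0)
    n = 1_000
    speeding = rng.gamma(2.0, 1.0, n)                        # speeding events per 100 km
    harsh_braking = 0.6 * speeding + rng.normal(0, 0.5, n)   # deliberately correlated with speeding
    night_share = rng.uniform(0, 1, n)                       # share of night-time driving
    mileage = rng.normal(15_000, 3_000, n)                   # annual mileage

    telematics = pd.DataFrame({
        "speeding": speeding,
        "harsh_braking": harsh_braking,
        "night_share": night_share,
        "mileage": mileage,
    })

    # Pairwise correlations: values close to +1 or -1 point to strongly related variables,
    # which helps decide which candidates to keep when building a pricing model.
    print(telematics.corr().round(2))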

In the area of risk, before running the models, the data is prepared and analyzed to determine assumptions and create model points based on historical and forecasted factors. Many factors can influence these assumptions. For example, by exploring their large volumes of data, life insurers can identify factors that are not normally used to predict mortality rates (such as GDP, hospital bed availability, CO2 emissions, water quality, road safety, education, …). Once the data and assumptions are determined, stochastic simulations are run to project and calculate the capital required for the company’s business. Using this large volume of data to generate personalized pricing and provide risk information is very complex, and it requires high-performance analytics. The velocity of big data coming into an organization can also be very difficult to manage, so the ability to quickly access and process data arriving at varying speeds is critical. Insurance companies should consider a “stream it, score it, store it” approach.
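In its simplest form, a “stream it, score it, store it” pipeline could look like the toy Python sketch below. The scoring rule, field names and file-based storage are assumptions made for illustration; a production system would use a streaming platform and a properly trained model instead.

    import json
    from typing import Iterable

    def score(record: dict) -> float:
        """Toy risk score that penalizes high speed and harsh deceleration."""
        speed_part = 0.7 * record.get("speed", 0) / 130
        braking_part = 0.3 * max(0.0, -record.get("acceleration", 0)) / 5
        return speed_part + braking_part

    def stream_score_store(records: Iterable[dict], path: str = "scored_records.jsonl") -> None:
        with open(path, "a") as out:
            for record in records:                        # "stream it": process records as they arrive
                record["risk_score"] = score(record)      # "score it": apply the model immediately
                out.write(json.dumps(record) + "\n")      # "store it": persist the record plus its score

    # Example usage with two fabricated records
    stream_score_store([
        {"vehicle_id": "A1", "speed": 95, "acceleration": -4.2},
        {"vehicle_id": "A1", "speed": 40, "acceleration": 0.3},
    ])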

The amount and extreme velocity of data are transforming insurance into a big data industry, and analytics can help insurers turn this big data into a big opportunity. With Solvency II, insurers should not take a short-term view and aim only for the minimum needed to comply with the regulation: they would then see only the cost and lose the long-term benefits. Instead, they should capitalize on the investments they have made for purely regulatory reasons.

One of my colleagues told me a story about the delay in the implementation of the Solvency II directive. His story was so relevant that I am re-using it. Over the past few years many articles have been written on Solvency II in the anticipation that the directive would come into effect at the end of this year. However, when it comes to Solvency II, it seems insurance companies are feeling like Bill Murray in the movie “Groundhog Day”: no matter what they do each day at work, when they wake up the next day nothing has changed and the deadline is still two years away.

With the latest delay, insurance companies should not despair and put their Solvency II projects on hold, but should take advantage of what they have already implemented, especially in the areas of data management and reporting. A good data management strategy is a prerequisite for meeting Solvency II regulations. Combining a regulatory challenge with a proactive approach can yield benefits well beyond merely meeting the regulatory requirements. Some insurance companies have used innovative data quality and data management products not only to meet the rigorous data requirements of Solvency II, but also to gain benefits such as a reduction in the cost of detecting and investigating suspicious claims, thanks to more accurate and complete data. For many insurers, reporting is often the final hurdle to compliance. But this should not be the case. If insurers focus only on compliance with the Solvency II Quantitative Reporting Templates (QRTs), they will miss out on the extra benefits of a business intelligence solution that can support additional internal reporting requirements.

At the end of the movie, Bill Murray’s character uses the knowledge gained from reliving that day not just for his own gain, but more importantly to help the local community. Insurance companies should look at Solvency II in the same light. Lessons learnt from this exercise will help carriers in many aspects of their business beyond risk management, and, as in the movie, eventually everyone “lives happily ever after”.

Innovation is happening in the insurance industry, and faster than we may think. Insurers have to take this opportunity to create business value for their company.

References

  1. Moving Beyond Compliance: How Banks Should Leverage Risk Technology to Capitalize on Regulatory Reform, white paper by The Boston Consulting Group, Platinion and SAS, October 2011
  2. “Positive creativity solves complex risk puzzle”, from Insights: Big Data, Bigger Opportunities – High-Performance Analytics at the Speed of Light, SAS white paper, January 2012
  3. Principles for effective risk data aggregation and risk reporting, final paper issued by the Basel Committee on Banking Supervision, January 2013
