Sponsored by KWA Analytics

This article was paid for by a contributing third party.

Mounting pressure to capture real-time exposures spurs data aggregation efforts

A range of market factors and technological advances is driving the need for real-time data and analytics, and persuading firms to move away from their existing risk management infrastructures. In a recent Risk Live panel session sponsored by KWA Analytics, experts discussed this trend and the challenges risk managers face in capturing real-time data for risk management. This article explores the four themes that emerged from the discussion.

The panel

  • Ram Meenakshisundaram, Senior vice-president of quantitative services, KWA Analytics
  • Maithreyi Bharadwaj, Principal consultant, KWA Analytics 
  • Martin Campbell, Head of risk, Mizuho Securities Europe
  • Nicola Crawford, Former chief risk officer, National Bank of Kuwait
  • Julien Cuisinier, Head of financial risk, Artemis Fund Management

 
Drivers of change

Recent volatility caused by geopolitical events and rising interest rates as a result of high inflation has heightened the need to accurately calculate risk exposures in real time.

There is also increasing emphasis on climate change data. Julien Cuisinier, head of financial risk at Artemis Fund Management, noted that greater environmental, social and governance (ESG) and climate risk engagement is driving changes in technology across market risk and liquidity risk. “Firms are looking for timely data and analytics to prove their ESG and climate risk engagement and how to link that to their risk management systems,” he explained.

At the same time, regulators continue to focus on operational resilience, which has been driving advances in stress-testing. They are trying to be more comprehensive by introducing stress tests for emerging risks such as climate risk and cyber risk. Risk platforms therefore continue to improve their stress-testing capabilities, which rely on an accurate understanding of real-time positions.

The number of regulatory reporting requirements will only increase as regulators become tougher. “As well as the risk results and calculations, regulators want the source data so they can trace through the steps and check they are correct – and they want a very quick turnaround,” said Martin Campbell, head of risk at Mizuho Securities Europe.

Regulators are also driving the need for good-quality risk data and data aggregation for accurate and timely risk reporting. For example, there is renewed attention on BCBS 239 – the Basel Committee’s principles for effective risk data aggregation and risk reporting – and regulators are becoming less tolerant of banks’ inaction.
 

Integration challenges

Against this backdrop, risk managers must try to combine data from the front and back offices as well as finance departments while their systems are poorly integrated. Obtaining data in a timely manner and collating it with a common identifier continues to prove challenging.

In addition, risk managers are often dealing with multiple product-specific systems in the front office. These are difficult to integrate with middle- and back-office infrastructures, since each of the front-line systems has its own implementation of securities, risk analytics and data. Firms are typically using extract, transform and load (ETL) processes to try to put all the data into a common framework, which is costly.
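As a loose illustration of that kind of ETL step – not a description of any panellist’s implementation – the sketch below maps trade extracts from two hypothetical front-office systems onto one common schema so positions can be aggregated downstream. A pandas-based stack is assumed, and all system, column and field names are invented.

```python
# Illustrative sketch only: mapping system-specific trade extracts onto a
# common schema. Column names and conventions are hypothetical placeholders.
import pandas as pd

COMMON_COLUMNS = ["trade_id", "isin", "notional", "currency", "trade_date"]

def transform_system_a(raw: pd.DataFrame) -> pd.DataFrame:
    # System A already carries ISINs but labels its columns differently.
    out = raw.rename(columns={"TradeRef": "trade_id", "ISIN": "isin",
                              "Nominal": "notional", "Ccy": "currency",
                              "TradeDate": "trade_date"})
    out["trade_date"] = pd.to_datetime(out["trade_date"])
    return out[COMMON_COLUMNS]

def transform_system_b(raw: pd.DataFrame) -> pd.DataFrame:
    # System B reports notionals in thousands and dates as strings.
    out = pd.DataFrame({
        "trade_id": raw["deal_number"].astype(str),
        "isin": raw["security_isin"],
        "notional": raw["notional_k"] * 1_000,
        "currency": raw["ccy_code"],
        "trade_date": pd.to_datetime(raw["booked_on"]),
    })
    return out[COMMON_COLUMNS]

def load(frames: list[pd.DataFrame]) -> pd.DataFrame:
    # "Load" here is just concatenation into one frame; in practice this
    # step would write to a warehouse or data lake table.
    return pd.concat(frames, ignore_index=True)
```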

Ram Meenakshisundaram, senior vice-president of quantitative services at KWA Analytics, said that firms are using security masters and reference data masters to integrate all of their systems and collate them into data lakes, ensuring all the data is coherent. “Firms need to start taking ownership of things like securities – which are much more common across the market – so that they do not need to be implemented differently for every single system,” he said.
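Continuing the same hypothetical schema, the sketch below shows how a single security master keyed on a common identifier such as an ISIN might be used to enrich positions collated from several systems, with a basic completeness check before the result is loaded into a data lake. The tables and columns are placeholders, not anyone’s production data model.

```python
# Illustrative sketch only: enriching collated positions from one security
# master keyed on ISIN, and flagging records with missing reference data.
import pandas as pd

security_master = pd.DataFrame({
    "isin": ["XS0000000001", "US0000000002"],
    "instrument_type": ["corporate_bond", "equity"],
    "issuer": ["Issuer A", "Issuer B"],
})

positions = pd.DataFrame({
    "source_system": ["fo_rates", "fo_equities"],
    "isin": ["XS0000000001", "US0000000002"],
    "quantity": [5_000_000, 12_000],
})

# A left join keeps every position and exposes anything missing from the
# master - a common data-quality check before loading into the data lake.
enriched = positions.merge(security_master, on="isin", how="left")
unmatched = enriched[enriched["instrument_type"].isna()]
print(enriched)
print(f"{len(unmatched)} positions missing reference data")
```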

Nicola Crawford, former chief risk officer at the National Bank of Kuwait, explained that data ownership was a big issue because most data used by risk managers for analytics is owned by the front line, which is responsible for data quality and model risk controls. “Persuading the first line to be accountable and acknowledge that they own the data is one of the biggest challenges,” she said.

The European Central Bank has recently expressed dissatisfaction with what banks have achieved since the Principles for effective risk data aggregation and risk reporting were published 10 years ago. Maithreyi Bharadwaj, principal consultant at KWA Analytics, attributed failures in implementing these principles to weak oversight by management bodies and fragmented IT infrastructures within banks. “Data lakes can be helpful, but the systems that are being used to aggregate are not fit for purpose, and often the scope and ambition of remediation plans do not go far enough,” she said.
 

Capturing real-time data

Capturing real-time data is relatively straightforward, although it depends on the risk type and the level of investment in technology. However, there are cultural challenges for firms in accurately calculating risk in real time and in using reporting tools to capture real-time exposures. Crawford observed that there is a compromise between data quality and the speed at which firms need decisions to be made. “The balance between that risk-reward relationship has not been fully recognised and addressed by all banks,” she said. There is also a cultural issue around who has access to the data: the panellists stressed that a culture and practice must be built around the technology for real-time analytics to ensure the data is deployed to the right people.

There is often too much noise in market data – especially for stochastic volatility models – to calibrate volatility surfaces to the marketplace. The panellists explained that stochastic volatility surfaces cannot be calibrated in real time because of the computational time required. However, firms are using non-traditional methods, such as variational autoencoders from machine learning, to approximate the computation instead of running the full models. Although not as accurate as traditional methods, approximation offers near real-time calibration of volatility surfaces. Real-time data is also being used with deep calibration and deep hedging models, with further successes in regulatory reporting.
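For readers unfamiliar with the technique, the sketch below is a minimal, purely illustrative variational autoencoder that learns a low-dimensional representation of implied volatility surfaces, so a surface can be reconstructed from a few latent factors far faster than a full model calibration. It is not the panellists’ implementation: PyTorch is assumed, and the grid sizes and training data are random placeholders.

```python
# Minimal sketch of a variational autoencoder for implied volatility surfaces.
# All sizes and training data are hypothetical stand-ins.
import torch
import torch.nn as nn

GRID = 8 * 10      # e.g. 8 maturities x 10 strikes, flattened
LATENT = 4         # number of latent factors driving the surface

class VolSurfaceVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(GRID, 64), nn.ReLU())
        self.mu = nn.Linear(64, LATENT)
        self.logvar = nn.Linear(64, LATENT)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT, 64), nn.ReLU(),
            nn.Linear(64, GRID), nn.Softplus(),  # vols must be positive
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterisation trick: sample latent factors differentiably.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to a standard normal prior.
    recon_loss = nn.functional.mse_loss(recon, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + 1e-3 * kl

model = VolSurfaceVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
surfaces = torch.rand(256, GRID) * 0.4 + 0.1  # stand-in for historical vol grids
for _ in range(200):
    recon, mu, logvar = model(surfaces)
    loss = vae_loss(recon, surfaces, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained on historical surfaces, the decoder can generate an arbitrage-plausible surface from a handful of latent values in microseconds, which is what makes the near real-time approximation described by the panel feasible.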
 

Moving to the cloud

When firms move to the cloud, they end up with a huge amount of data that is challenging to manage. From a private cloud, risk managers will have access to almost all permissible front-office data. According to the panellists, firms are looking at various solutions to help them identify the meaningful information they can extract from this data explosion.

There is a cultural issue around regulatory reporting – in getting people to address the levels of governance, supervision, monitoring, performance, availability, reliability and resilience of data in the cloud. “The main challenge is getting people to understand the definitions of cloud and outsourcing, which are much broader and more intrusive than they realise,” said Campbell.

However, the cloud has made it simple for banks to update requirements and undertake regulatory reporting almost instantly. Historically, they have invested heavily in implementing changing regulatory requirements on legacy systems. In contrast, using a private cloud, banks and risk managers do not have to build out the regulatory reporting solution or generate the risk reports in-house. Part of the cost has shifted to the ETL process they must run to ensure the data complies with the service they have bought – which is far less costly overall.
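As a rough illustration of that kind of compliance check – with required fields and rules invented as stand-ins for whatever schema a vendor actually specifies – the sketch below validates an extract before it is uploaded to a hosted reporting service.

```python
# Illustrative sketch only: lightweight validation run as part of the ETL feed
# into a hosted regulatory-reporting service. Fields and rules are hypothetical.
import pandas as pd

REQUIRED_FIELDS = {"trade_id", "isin", "notional", "currency", "trade_date"}
ALLOWED_CURRENCIES = {"USD", "EUR", "GBP", "JPY"}

def validate_for_upload(df: pd.DataFrame) -> list[str]:
    """Return a list of issues; an empty list means the extract can be uploaded."""
    issues = []
    missing = REQUIRED_FIELDS - set(df.columns)
    if missing:
        issues.append(f"missing columns: {sorted(missing)}")
        return issues
    if df["trade_id"].duplicated().any():
        issues.append("duplicate trade_id values")
    if df["notional"].le(0).any():
        issues.append("non-positive notionals")
    bad_ccy = set(df["currency"]) - ALLOWED_CURRENCIES
    if bad_ccy:
        issues.append(f"unexpected currencies: {sorted(bad_ccy)}")
    return issues
```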

Data quality in a lot of swaps data repositories has been relatively poor until now, because each contributing system interprets and implements the data in its own way. According to KWA Analytics’ Meenakshisundaram, as data has moved into cloud environments, it has become much cleaner. “This is another reason risk data and regulatory reporting are more suited to the cloud than on the legacy side,” he said.
 

Conclusion

A lot of firms are still hampered in their data aggregation efforts by fragmented IT infrastructures, data ownership and data quality issues. Recent volatility and continued emphasis on climate risk, as well as regulatory focus on operational resilience and data quality, are pushing risk managers to assess their current infrastructures and move to new ones that allow them to capture real-time data. Where it is not possible to capture real-time exposures, firms are using techniques such as deep learning, deep hedging and big data analytics to approximate risk analytics with non-traditional methods. Cloud computing has helped with the consolidation of regulatory reporting and reduced operational risk. It has also helped improve data quality and is, overall, much more cost-effective than investing in legacy systems.
