QuartetFS: Optimising Collateral With Real-Time Analytics
Posted: 26 March 2013 | Author: Georges Bory | Source: QuartetFS
Collateral management has never received this much attention. Pre-global financial crisis, collateral management was a spreadsheet-based function, often a box-ticking exercise. However, it is now undergoing a huge shake-up, as incoming regulations redesign the clearing landscape. The US Dodd-Frank Act and its European equivalent, the European Market Infrastructure Regulation (EMIR), are transitioning the derivatives market from an over-the-counter (OTC) model to an exchange-traded one. With that, trades are becoming centrally cleared through Central Counterparties (CCPs).
This is a game changer for all market participants – dealers, prime brokers, custodians, asset managers and hedge funds alike. The need for financial institutions to have real-time access to their exposures, pledged collateral and collateral requirements across all asset classes and counterparties is no trivial matter.
Financial institutions – and the sell-side have been quicker out of the blocks – have been rapidly building new collateral management systems and operating models that comply with the incoming regulations while also addressing inventory/position management and collateral optimisation. Sitting on top of traditional collateral management, the most advanced institutions are now looking to put in place a collateral optimisation layer. Such a layer gives firms access to real-time, up-to-date collateral analytics at a group-wide level, and enables the efficient management of collateral to minimise costs and maximise return on assets.
There are a number of key factors necessary for success:
- Flexible front-office tools and analytics that help collateral trading desks pledge, substitute and recall their assets to increase revenue generation
- Ownership and customisation – firms need to be able to plug in their own bespoke algorithms that reflect their own collateral models
- A cross-silo asset pool – an effective optimisation layer needs to aggregate across a firm-wide asset pool for efficient and cost-effective optimisation
- An incrementally updated asset pool and a real-time optimisation framework, so that firms can react effectively to changes in credit or market events
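To make the optimisation idea concrete, here is a minimal sketch of a cheapest-to-deliver allocation over a firm-wide asset pool. The asset names, haircuts and opportunity costs are illustrative assumptions, not any firm's actual model – in practice a bespoke algorithm (as the second bullet notes) would replace the greedy rule used here.

```python
def allocate_collateral(requirement, assets):
    """Greedily pledge the cheapest eligible assets until the margin
    requirement is covered on a post-haircut basis.

    assets: list of dicts with 'id', 'value' (face value held),
            'haircut' (0-1) and 'cost' (opportunity cost per unit).
    Returns a list of (asset_id, pledged_face_value) pairs.
    """
    pledged = []
    remaining = requirement
    # Pledge low-cost collateral first, keeping high-quality
    # assets free for other uses.
    for asset in sorted(assets, key=lambda a: a["cost"]):
        if remaining <= 1e-9:
            break
        rate = 1.0 - asset["haircut"]        # post-haircut value per unit
        needed = remaining / rate            # face value still required
        take = min(asset["value"], needed)   # capped at what we hold
        pledged.append((asset["id"], take))
        remaining -= take * rate
    if remaining > 1e-9:
        raise ValueError("asset pool cannot cover the requirement")
    return pledged

# Hypothetical pool: cover a 100-unit margin call.
pool = [
    {"id": "corp_bond", "value": 80.0, "haircut": 0.15, "cost": 0.002},
    {"id": "govt_bond", "value": 100.0, "haircut": 0.02, "cost": 0.005},
    {"id": "cash", "value": 50.0, "haircut": 0.0, "cost": 0.010},
]
allocation = allocate_collateral(100.0, pool)
# All 80 of the corporate bond is used first; government bonds top up
# the residual; the more expensive cash stays unencumbered.
```

Run incrementally against a live asset pool, a rule like this is what lets a desk substitute cheaper collateral as prices and haircuts move, rather than re-running a batch process overnight.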
To comply with regulatory reform seamlessly, financial institutions need continuous, up-to-date information on exposures, collateral positions and requirements at a moment's notice, safe in the knowledge that they are truly optimising on the latest information available at that time. Real-time aggregation and analytics technologies offer the capabilities necessary to meet these needs. The most advanced tools allow users to manipulate data and perform instant 'What-If' analyses on large data volumes to evaluate alternative scenarios. For instance, technology can be used to monitor the effect of potential ratings changes, market shifts and asset withdrawals to make more efficient and informed business decisions.
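A 'What-If' scenario of the ratings-change kind can be sketched as a shock applied to haircuts across the pledged pool, recomputing post-haircut coverage before and after. The flat five-point haircut shift and the figures below are illustrative assumptions only, not a regulatory haircut schedule.

```python
def post_haircut_value(pool, haircut_shift=0.0):
    """Total collateral value after haircuts; an optional shift
    simulates e.g. a ratings downgrade raising all haircuts."""
    return sum(
        p["value"] * (1.0 - min(1.0, p["haircut"] + haircut_shift))
        for p in pool
    )

# Hypothetical pledged pool.
pool = [
    {"id": "govt_bond", "value": 100.0, "haircut": 0.02},
    {"id": "corp_bond", "value": 80.0, "haircut": 0.15},
]

base = post_haircut_value(pool)           # 98 + 68 = 166
shocked = post_haircut_value(pool, 0.05)  # 93 + 64 = 157
shortfall = base - shocked                # extra collateral to post: 9
```

Evaluating the same scenario across every counterparty at once is where real-time aggregation matters: the shortfall tells the desk how much additional collateral a downgrade would force it to find, before the downgrade happens.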
With Dodd-Frank clearing obligations kicking in from the end of February this year, EMIR following in summer 2014 and Basel III looking like 2015, the regulatory burden is only set to increase. The time is now for banks to invest in the right technology to streamline their business operations and meet the challenges that lie ahead.