An industry working group has issued a paper entitled “Cyber Threats and Data Recovery Challenges for Financial Market Infrastructures”, sponsored by the Committee on Payments and Market Infrastructures-International Organisation of Securities Commissions (CPMI-IOSCO) Working Group on Cyber Resilience. The industry working group includes representatives from DTCC, Euroclear, LCH, the Federal Reserve Bank of New York, TMX Group and the Reserve Bank of Australia.
The paper identifies measures that could bolster the financial services industry’s capability to protect data and maximise data recovery as the cyber threat landscape continues to evolve. It outlines the growing threat to data resilience and the need for the industry to collaborate on common standards and guidelines to minimise contagion. The paper sets out a framework that financial services firms can use to assess how best to improve their data protection and validation capabilities to fend off cyber threats.
There is no standard approach to identifying the types of data that need protection from a cyber attack. Traditional data replication strategies designed for physical or non-cyber disruptions have the potential to spread corrupted data to back-up systems. To tackle this challenge, the working group set out to identify tools to help firms address data recovery and validation concerns, and to identify which areas would most benefit from further industry collaboration.
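The risk the group describes, replication faithfully copying corrupted data into back-ups, can be illustrated with a minimal sketch. The snapshot names, payloads and SHA-256 manifest below are hypothetical and purely illustrative: the point is that recording a content digest when a snapshot is taken lets a firm detect a tampered or corrupted copy before restoring from it, which plain replication on its own does not.

```python
import hashlib

def record_checksum(manifest: dict, name: str, payload: bytes) -> None:
    """Store a SHA-256 digest for a snapshot at the time it is taken."""
    manifest[name] = hashlib.sha256(payload).hexdigest()

def verify_snapshot(manifest: dict, name: str, payload: bytes) -> bool:
    """Return True only if the snapshot still matches its recorded digest."""
    return manifest.get(name) == hashlib.sha256(payload).hexdigest()

# Hypothetical example: a snapshot is recorded, then a copy is silently altered.
manifest = {}
original = b'{"account": "A-1", "balance": 100}'
record_checksum(manifest, "snap-001", original)

corrupted = b'{"account": "A-1", "balance": 999}'  # tampered replica
print(verify_snapshot(manifest, "snap-001", original))   # True
print(verify_snapshot(manifest, "snap-001", corrupted))  # False
```

A replication pipeline that only mirrors bytes would propagate the tampered copy; checking digests before restore is one way to keep integrity validation separate from replication.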
The working group found that effective solutions and tools vary from firm to firm. No single tool provides data recovery in every integrity scenario, and firms typically use a range of tools as part of their data resilience design. Because the effectiveness of a firm’s response differs across scenarios, firms need to adopt a “tool-kit” approach. This alone will not be enough in future.
Five main themes
Firms need to re-evaluate how best to protect and leverage data. The working paper highlights five main themes:
- Data integrity issues require trade-offs between the speed and the accuracy of recovery, although the two-hour recovery time objective set out in regulatory guidance remains an important target.
- Recovery capabilities of existing systems were designed with physical and non-cyber outages in mind, and may not be as effective at maintaining data integrity during a cyber attack.
- Interconnections between firms increase the potential impact of a data integrity compromise across the industry.
- Recovery from a data integrity breach requires a high degree of trust in the available back-up data copies as well as coordination within the settlement ecosystem.
- When considering the recovery objective, the definition of critical services can vary across firms and across scenarios.
Firms’ business systems will often have evolved over time and be built on legacy platforms, and many firms also provide related services with internal crossover points that may depend on one another, the working group found. Some of these data interdependencies could enhance protection against hardware or software failures, but it is more likely that the mix will lead to data corruption and complicate recovery processes, the group said. Firms may therefore need to review their data protection infrastructure.
The working group identified a number of tools for data corruption scenarios that allow critical data to be stored and recovered outside firms’ on-site premises. The paper called for clear guidelines on minimising contagion and for support for resilience across the financial services industry. Firms and infrastructure providers need to focus on identifying tools that align with their objectives, identify the “restore points” that make sense for their business, and ensure they understand the impact of any shortfalls in their legacy technology.
The working paper provides a framework to help firms assess the internal integration points that have created interdependencies and to establish what data needs to be protected to provide services to clients:
- Identify the tools that best align with the firm’s objectives. Firms should identify tools that are attainable from a design perspective, and focus on implementing those that provide the most coverage and that bolster data recovery, reconciliation and replay capabilities.
- Define logical restore points. Firms should work with their participants and the wider community to identify restore points that make sense for their business. Firms need to establish which data is critical to restore if they are to continue to provide expected services to clients, and establish how this can then be “ring-fenced” to bolster their recovery capabilities to protect critical data.
- Understand legacy technology. Firms can improve data resilience by regularly conducting a comprehensive evaluation of their applications to understand any critical interdependencies and identify opportunities for enhanced resilience as the technology evolves.
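The “logical restore point” step above can be sketched in a few lines. The snapshot log and timestamps below are hypothetical; the sketch simply assumes each snapshot has already passed or failed an out-of-band integrity check, and selects the most recent clean copy taken before the compromise was detected.

```python
from datetime import datetime

# Hypothetical snapshot log: (timestamp, passed_integrity_check), oldest first.
snapshots = [
    (datetime(2024, 1, 1, 8, 0), True),
    (datetime(2024, 1, 1, 12, 0), True),
    (datetime(2024, 1, 1, 16, 0), False),  # failed integrity check
    (datetime(2024, 1, 1, 20, 0), False),
]

def latest_clean_restore_point(snapshots, detected_at):
    """Most recent snapshot that predates detection and passed verification."""
    candidates = [ts for ts, ok in snapshots if ok and ts <= detected_at]
    return max(candidates) if candidates else None

# If the compromise was detected at 18:00, the 12:00 snapshot is the
# latest copy that is both clean and earlier than the detection time.
restore_point = latest_clean_restore_point(snapshots, datetime(2024, 1, 1, 18, 0))
print(restore_point)  # 2024-01-01 12:00:00
```

In practice, which data must be restored to that point (and which can be replayed or reconciled afterwards) is exactly the business question the framework asks firms to settle with their participants.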
The paper highlights the need for greater industry collaboration on data resilience. Firms must consider how they can improve their data protection and resilience capabilities to defend against and recover from cyber threats, and assess how critical data can best be protected. The working paper concludes that, as cyber threats evolve, firms must continually reassess the risks they face and keep improving their protection, detection and response procedures in step.