Technology isn’t static, so why are so many businesses static when it comes to regulatory compliance? Our Principal Product Owner Danny Ivatt reveals why it’s time for a fresh approach…
Technology powers the world - yet regulatory and operational resilience approaches are rarely data and technology driven themselves, and have often lagged behind the technology ecosystems they exist to support.
Technology value chains are complex and global. Concentration on a few core providers, legacy technologies, emerging cyber security threats and disasters all represent risks of multi-billion-pound disruption to financial markets, supply chains and economies.
Whilst data and technology are the key enablers for investment in the components of business services, the approach to understanding, reporting and mitigating risks across these systems is often manual, time intensive and remedial rather than proactive, and dates from a time before cloud, cyber attacks and the kind of complex, distributed technology estates that power the modern enterprise.
Current approaches are neither continuous, nor data driven, and that needs to change.
Last year alone the FCA handed out £568m of fines in the UK – three times as much as in 2020 – and Bank of Ireland, Raphaels Bank and Metro Bank were among the high-profile companies penalised for operational failings.
Regulators globally are harmonising on an approach that mandates organisations to collect and report granular data that will help them understand and mitigate these risks. This increased scrutiny means that time is running out for financial organisations to rethink their approach to evidencing compliance and operational resilience using data.
Adopting a new way of working will help businesses avoid millions of pounds in fines from regulators and, critically, help to prevent catastrophic operational meltdowns that could cost them - and their customers - even more.
Firms need a fresh approach that satisfies data compliance requirements while also boosting long-term operational resilience. First though, they need to look at why the existing model isn’t working.
Compliance initiatives typically kick off by paying a team of consultants to run a manual data collection exercise to evidence compliance. The process can be lengthy as information is often siloed and in a variety of formats.
The work can involve months of poring over documents, determining which standards apply, then comparing these to large spreadsheets of data to determine if you’re compliant or not. That’s before the work starts to find the people responsible and get things fixed.
A request for a specific piece of data from a regulator often leads to an individual or team being dispatched into the depths of an organisation in search of the information. Not everyone who needs to be involved in the process is invested in the outcome, and it becomes painfully slow and inaccurate.
The requirements from different regulators in different regions, whilst subtly different, are often directionally the same. However, because each new compliance initiative is treated as a greenfield exercise, the knowledge isn’t retained afterwards: it has been outsourced along with the creation of the initial report that kicked off the programme.
Many organisations still think of these compliance initiatives as discrete programmes of work that are focused on a specific regulatory requirement. They are snapshot, point in time assessments. Risks aren’t static, so why is the work?
The compliance and operational resilience landscape is continuous and dynamic, so your approach needs to mirror this.
Rather than relying on manual data capture and snapshots that expire immediately, compliance can be as continuous and dynamic as the external landscape. Cyber security threats, the technology landscape, software, value chains, and concentration risks all continuously evolve, so the approach to managing these risks must do too.
Moving towards a model of continuous data driven compliance can offer organisations a radical transformation in terms of the time and cost it takes to be compliant and create more resilient organisations able to move at the speed of the threats they must confront.
Organisations facing multiple requests from regulators traditionally treat these as discrete projects that have an artificial ‘done’ state. By harmonising these requests into a single road map, you can treat them as related, but different concerns. This means that by improving your response to one, you can build upon that and improve your response to another. In this way, compliance requirements and associated projects become a persistent backlog and roadmap for a product-like approach.
Characteristic of a project-like approach, most organisations, when investing in software and tools that can enhance their compliance approach, choose platforms that only start to deliver outcomes at the end of implementation. By iteratively building a compliance product instead (using the technologies and approaches mentioned below), organisations can drastically shorten this wait, building their data-driven compliance capability by evidencing compliance with specific regulatory policy requirements against a continuous roadmap.
By iteratively building a data foundation, continuously collecting the metrics and controls needed to evidence compliance, organisations can start the shift to data driven compliance sooner rather than later. Much of this data is held across an organisation's applications, services and internal tools. Consideration must be given to data lineage, data ownership, quality, standards and the future state, but mapping this and then creating an entire data foundation can take years. A product-like approach invests in enough up front design, and then iteratively builds the foundation against real world compliance needs. In this way, value is created continuously.
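To make this concrete, here is a minimal sketch of what one continuously collected control check might look like. The control ("every production service must have a penetration test within the last 12 months"), the service names and the threshold are all illustrative assumptions, not any specific regulator's wording:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative control: every production service must have had a
# penetration test within the last 12 months. Names and thresholds
# are hypothetical examples, not a regulator's actual requirement.

@dataclass
class Service:
    name: str
    last_pen_test: datetime

def check_pen_test_control(services, max_age_days=365, now=None):
    """Return the names of services currently failing the control."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    return [s.name for s in services if s.last_pen_test < cutoff]

services = [
    Service("payments-api", datetime(2023, 11, 2)),
    Service("ledger", datetime(2021, 1, 15)),
]

failing = check_pen_test_control(services, now=datetime(2023, 12, 1))
print(failing)  # ['ledger']
```

Run on a schedule against live service metadata rather than once per audit, a check like this turns a point-in-time assessment into a continuously evidenced control.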
Conventional ways of storing and ordering data (spreadsheets, relational databases) rely on human intervention to align different but thematically similar regulatory requirements. Regulation A can be directionally similar to regulation B, but the precise nature of the standards might differ, and these tools cannot bridge that gap automatically. This is one reason why organisations still do so much manual compliance work, and why compliance initiatives are often clean-sheet exercises: adapting old work takes as long as starting afresh.
Knowledge graphs are an alternative way of showing the connections between different but related elements. In a knowledge graph, the connections themselves can also have properties. In this way, a line of one regulation can be linked to a section of another, with further links made to relevant internal policies and the underlying data and controls needed to evidence compliance.
New elements can be added to the graph at any time: a new regulatory requirement can first be linked to thematically similar pre-existing requirements or policies, gathering the required data via those connections. This can drastically shorten the time it takes to understand new compliance requirements and can be used to continuously create a dynamic topology of requirements, policy, services, applications, people and data. A knowledge graph doesn’t have a predefined structure, so it is always adaptable to new requirements.
Knowledge graphs can also be queried, again aligning with how people think and form connections themselves. Rather than simple static queries such as “give me the penetration testing data for this service”, you can ask “which parts of this regulation are already evidenced by data?”
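The idea can be sketched with a toy graph of (subject, relation, object) triples. The regulation clauses, policy and dataset names below are hypothetical, and a real implementation would use a graph database rather than an in-memory list, but the query pattern is the same: a clause is "evidenced" if it is covered by a policy backed by data, either directly or via a thematically similar clause:

```python
# A toy knowledge graph as (subject, relation, object) triples.
# All regulation clauses, policies and datasets are illustrative names.
triples = [
    ("RegA:1.2", "similar_to", "RegB:4.1"),
    ("RegA:1.2", "covered_by", "Policy:PenTesting"),
    ("RegB:4.1", "covered_by", "Policy:PenTesting"),
    ("Policy:PenTesting", "evidenced_by", "Data:pen_test_results"),
]

def neighbours(node, relation):
    """All objects linked from `node` by the given relation."""
    return [o for s, r, o in triples if s == node and r == relation]

def evidenced(clause):
    """Is this clause backed by data, directly or via a similar clause?"""
    for policy in neighbours(clause, "covered_by"):
        if neighbours(policy, "evidenced_by"):
            return True
    # Fall back to thematically similar clauses.
    return any(evidenced(sim) for sim in neighbours(clause, "similar_to"))

print(evidenced("RegA:1.2"))  # True
```

Because the connections themselves carry meaning, a new requirement linked to an existing, already-evidenced one inherits that evidence through the graph rather than through a fresh manual exercise.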
A digital twin is the organisation itself: its topology of places, infrastructure, systems, applications, people and tools, all characterised by the data that represents them.
Bringing together a compliance data foundation and a knowledge graph representation of compliance, a continuous compliance digital twin uses real-world data from your organisation to create a live, up-to-date, dynamic representation that can show risks, model impact, drive investment and highlight non-compliance.
Whereas querying a division to understand how compliant they are with new policy or regulation could take a long time, querying the digital twin takes moments. Modelling operational resilience scenarios in the real world tends to be either highly risky (impacting production services) or manual (not modelled with live data). A digital twin offers very little risk, but a high level of granularity and accuracy, as close as possible to the real world.
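A minimal sketch of such a scenario query, using hypothetical service and provider names, might model concentration risk by asking which business services would be hit if a single provider failed:

```python
# Toy model of concentration risk in a digital twin: which business
# services depend on a given provider? All names are illustrative.
depends_on = {
    "payments":   ["cloud-east", "card-network"],
    "settlement": ["cloud-east"],
    "reporting":  ["on-prem-dc"],
}

def impacted_services(failed_provider):
    """Services that would be disrupted if this provider failed."""
    return sorted(
        svc for svc, deps in depends_on.items() if failed_provider in deps
    )

print(impacted_services("cloud-east"))  # ['payments', 'settlement']
```

Because the scenario runs against the twin rather than production, the "failure" is simulated over live data with no risk to real services.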
Digital twin technology has already been put to use in projects such as LA’s SoFi Stadium, which collects data from every area of the site and then uses this to answer specific questions from event organisers, maintenance teams and so on. The technology is gathering momentum, and 61% of utility executives expect their organisation’s spending in the area to increase over the next three years.
For many organisations the pace of change can be daunting. However, technology has no intention of stopping, and it’s critical to keep up with it.
By embracing a continuous, data-driven approach to compliance, businesses can both head off risks running into millions of pounds and create more robust and efficient organisations.
Interested in seeing our latest blogs as soon as they get released? Sign up for our newsletter using the form below, and also follow us on LinkedIn.