Could quantum computing security solve the data sovereignty challenge?

Following the recent news that Microsoft is unable to guarantee the data sovereignty of UK policing data stored on Azure, some commentators have suggested that a swift move away from Microsoft’s cloud is probably the only option for government organisations keen to remain on the right side of UK law.

The issue came to light following a Freedom of Information request, which revealed that Microsoft could not guarantee that sensitive law-enforcement data, hosted on its public cloud infrastructure, would remain in the UK – a key legal requirement for many Government bodies, as well as regulated services such as water, electricity and gas providers.

The disclosure also revealed that data hosted in Microsoft’s public cloud infrastructure is regularly transferred and processed overseas – a process inherent to its public cloud architecture, and one that causes problems for any UK government users with regulatory limitations on offshoring UK data. For example, the new G-Cloud 14 framework recently introduced a UK-only data-hosting requirement, while under Part 3 of the Data Protection Act (DPA) 2018 there is no general approval for data transfers outside the UK: data processors may not send data offshore unless specifically instructed to do so, on a case-by-case basis, with each instance reported directly to the ICO.

So, what can Government departments and other public bodies do to ensure their data processing remains on the right side of the law?

Data sovereignty: The data infrastructure challenge

One option is for organisations to move away from Microsoft’s cloud-based products altogether. Doing so, however, would open a Pandora’s box of risks that may pose a greater threat to data integrity than the problem the move is intended to solve.

The first challenge lies simply in the scale of the transformation. For almost every public body, the size and tailored nature of existing data infrastructures make any transfer to new systems inherently risky. Not only would it require a significant investment of time, but reconfiguring aspects of the architecture – taking items out, adding them back in and designing the system to allow for regular updates – creates opportunities to lose files, corrupt key data and open up vulnerabilities to bad actors. There are also direct costs where cloud providers charge to pull back data stored on their infrastructure or to transfer it to other providers.

Transferring existing systems away from Azure creates timing and prioritisation issues too.

To illustrate: if there were an edict to change the side of the road we drive on, everyone would have to change at the same time; you could not have a transition period where some vehicles drove on the right while others drove on the left. Making changes to critical Government infrastructure is much the same: moving away from a particular vendor one sub-organisation at a time would likely generate a catastrophic set of interoperability problems, representing a significant risk to essential service delivery.

While it is relatively easy to build data storage systems organically, adding elements around a single cloud-based provider as required, doing the reverse has to happen all at once.

Take NHS IT systems, for example. The siloed nature of operations and IT infrastructure means approaches can vary significantly between Integrated Care Boards, Trusts, hospitals and even individual wards. Consequently, many critical business functions happen end-to-end within a silo, with little consistency across silos. This variation stores up a world of problems should a wholesale move away from a single provider be required. And while many Government departments are more joined up, you still have what we call ‘the pizza problem’: when everyone is using the same system (the pizza), making changes to data (toppings) that are stored and accessed across multiple departments risks creating unintended problems – accidentally taking someone else’s topping (data) when trying to separate your own slice.

So, if moving away from existing cloud-based providers is possible but inherently risky, what alternatives are there?

The quantum-resistant solution

One surprising answer may have its roots in the advent of quantum computing.

To protect data against quantum computers, new, secure data-storage solutions are being developed. Some of these disaggregate data and disseminate it across multiple storage endpoints: the disaggregation is at the bit level (digital ones and zeros), the dissemination is random, and none of the many endpoints holds all the binary digits for any data asset.

Reassembly of the data assets includes full integrity checks, but the approach means the data cannot be reconstructed by an attacker: even a quantum computer working at high speed would not be able to recreate the original information with only part of the story to work from.
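
To make the idea concrete, here is a minimal, hypothetical sketch in Python of bit-level disaggregation with a keyed scatter and an integrity check on reassembly. It illustrates the general technique described above, not Prizsm’s (or any vendor’s) actual algorithm: the endpoint count, the function names (shard_bits, reassemble) and the use of SHA-256 are all assumptions made for demonstration.

```python
# Illustrative sketch only: bit-level disaggregation across storage endpoints.
# A keyed PRNG decides which endpoint receives each bit, so no single
# endpoint ever holds a meaningful run of bits.
import hashlib
import random

NUM_ENDPOINTS = 5  # hypothetical number of cloud storage endpoints

def shard_bits(data: bytes, key: int) -> tuple[list[list[int]], str]:
    """Split data into individual bits and scatter them across endpoints."""
    bits = [(byte >> i) & 1 for byte in data for i in range(8)]
    rng = random.Random(key)                   # keyed PRNG drives the scatter
    shards = [[] for _ in range(NUM_ENDPOINTS)]
    for bit in bits:
        shards[rng.randrange(NUM_ENDPOINTS)].append(bit)
    digest = hashlib.sha256(data).hexdigest()  # integrity check for reassembly
    return shards, digest

def reassemble(shards: list[list[int]], key: int, length: int, digest: str) -> bytes:
    """Rebuild the original bytes; fails closed if the integrity check fails."""
    rng = random.Random(key)                   # same key -> same scatter order
    cursors = [0] * NUM_ENDPOINTS
    bits = []
    for _ in range(length * 8):
        e = rng.randrange(NUM_ENDPOINTS)       # replay the endpoint sequence
        bits.append(shards[e][cursors[e]])
        cursors[e] += 1
    data = bytes(
        sum(bits[i + j] << j for j in range(8)) for i in range(0, len(bits), 8)
    )
    assert hashlib.sha256(data).hexdigest() == digest, "integrity check failed"
    return data

secret = b"sensitive record"
shards, digest = shard_bits(secret, key=2024)
print(reassemble(shards, key=2024, length=len(secret), digest=digest))
```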

The implications of such a solution for data sovereignty laws are exciting.

What is data? And what does that mean for data sovereignty?

Loosely speaking, data sovereignty means that governments have control over data located within their jurisdictions. Information stored in the cloud can be subject to a variety of national laws, depending on where data is stored, processed or transmitted. With huge amounts of data stored outside national boundaries, it is becoming a critical data and national-security issue.

But what if that data were disaggregated and dispersed across multiple geographic jurisdictions? If data is broken up at the bit level and randomly distributed across multiple cloud endpoints, not only is that ‘data’ securely stored, but it should also meet data sovereignty rules, wherever in the world those endpoints are. Why? Because data only exists as data in its complete form.

Once anonymised and disaggregated at the bit level, the data is impossible to retrieve and reconstruct without an ‘algorithm key’. From a security perspective, if fragments secured in the cloud were accessed by storage providers, hackers or governments, all that could be retrieved are random fragments of binary digits, unintelligible on their own. From a legal perspective, those fragments would not constitute ‘data’, and therefore, once disaggregated, the concept of jurisdiction falls away.
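
Continuing the hypothetical sketch above (and reusing its shard_bits and reassemble functions and the shards, secret and digest values), the snippet below illustrates both points: a single endpoint’s fragment is a positionless run of bits, and without the correct key even a complete set of fragments cannot be put back together.

```python
# A single endpoint's view: positionless bits with no byte boundaries.
print(shards[0][:16])   # e.g. [1, 0, 0, 1, ...] - unintelligible on its own

# Without the right key the scatter order cannot be replayed: reassembly
# either reads fragments in the wrong order (failing the integrity check)
# or runs off the end of a shard entirely.
try:
    reassemble(shards, key=9999, length=len(secret), digest=digest)
except (AssertionError, IndexError) as exc:
    print(f"wrong key: reassembly fails ({exc.__class__.__name__})")
```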

Cloud protection ‘gateways’ such as these can also be installed on-premises, protecting data at source. And by choosing a platform that disaggregates data at the bit level across multiple cloud endpoints, public organisations can quickly comply with national regulations on data protection without risking data loss or architecture breakdown.

In short, the approach allows public organisations to keep existing data storage architecture in place and hone it in their own time.

Once a multi-cloud environment has been created, organisations can rebalance what data is held where across multiple providers, perhaps moving the most sensitive data first or creating schedules for different classifications of information.

Crucially, not only is the risk diluted by spreading storage across multiple endpoints; the more data is managed in this way, the more obfuscated it becomes and the more secure it is. Platform design also means that, in the event of connectivity loss to an endpoint or data corruption, the algorithm (accessed only by key holders) can recalculate the missing digits held on the affected endpoint, restoring information to data owners quickly and efficiently.
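
In classical storage terms, this recovery property is erasure coding. As a flavour of how lost digits can be recalculated, the standalone sketch below uses the simplest possible scheme – a single XOR parity shard, which tolerates the loss of any one endpoint. Real platforms would typically use stronger codes (e.g. Reed-Solomon); this is an assumption-laden illustration, not a description of any specific product.

```python
# Illustrative sketch only: one XOR parity shard lets a single lost
# endpoint's contents be recalculated from the survivors.
from functools import reduce

def add_parity(shards: list[bytes]) -> bytes:
    """XOR all data shards column-wise into one parity shard
    (all shards assumed equal length)."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*shards))

def recover(shards: list, parity: bytes) -> list:
    """Recompute a single missing shard (marked None) by XOR-ing the
    parity shard with the surviving data shards."""
    missing = shards.index(None)
    survivors = [s for s in shards if s is not None] + [parity]
    shards[missing] = bytes(
        reduce(lambda a, b: a ^ b, column) for column in zip(*survivors)
    )
    return shards

data_shards = [b"\x0f\xa0", b"\x33\x5c", b"\xc1\x07"]  # toy fragment data
parity = add_parity(data_shards)
damaged = [data_shards[0], None, data_shards[2]]       # endpoint 1 lost
print(recover(damaged, parity)[1] == data_shards[1])   # True: shard recovered
```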

It is a secure and resilient approach that not only reduces security risks but also simplifies the storage regime, reducing the need for additional support and ensuring continued availability of data while meeting data sovereignty requirements.

The FOI story throws up a variety of questions for the sector, but moving wholesale away from existing architecture is an unnecessarily risky approach. There are simpler, less risky and more cost-effective options out there. Getting it right will be key to building a more secure, reliable data infrastructure for all.

Contributor Details

Adrian Fern
Chief Technology Officer
Prizsm Technologies
