Charles Southwood, Regional VP Northern Europe and MEA at Denodo, discusses how the pandemic has become a digital catalyst for the public sector and what organisations need to do to prosper in this new data-driven age
In today’s digital climate, the value of data is beyond dispute. Regardless of size or sector, it is arguably every organisation’s single greatest asset. For those operating in the private sphere, data insights help to drive profit and boost productivity. In the public sector, however, data isn’t just about financial growth and operational efficiencies; it also holds the secrets to improving the lives of the general population.
The UK government recently made its commitment to a data-driven future clear, announcing the launch of a new National Data Strategy. This puts data at the heart of the UK’s COVID-19 recovery plan – proposing an overhaul of data usage across the public sector and encouraging the digital transformation projects needed to support it.
However, to fully embrace and benefit from this data-driven future, public sector organisations need to ensure that they have the infrastructure and tools in place to manage and understand their data effectively.
Data challenges
Across the public sector, data – and the insights that it provides – is often buried in a hugely complex network of applications and siloed databases. In the National Health Service (NHS), for example, patient data is typically spread across multiple systems and departments – GP notes, clinical systems, A&E systems and pharmacy records, to name just a few. To make matters worse, the legacy IT systems still heavily relied upon in many areas were not originally designed with sharing capabilities in mind. This makes data extremely difficult to find and retrieve, and time-consuming to translate into meaningful, actionable information.
In the past, the most common method for sharing data across multiple public sector agencies has been Extract, Transform and Load (ETL). Through this process, data is extracted from an existing source, transformed into a common format and loaded into a new location – such as a database server, data store or data warehouse. Once complete, the information can be made available to prescribed users under pre-set access and security protocols. However, ETL has been the standard method of mass data integration since the 1970s, so it is no surprise that certain limitations – most prominently and concerningly around security and governance – are becoming increasingly apparent.
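To make the pattern concrete, here is a minimal ETL sketch in Python. The file name, fields and SQLite target are invented for illustration and stand in for whatever source systems and warehouse an agency actually uses.

```python
import csv
import sqlite3
from datetime import date

# Minimal ETL sketch. "gp_export.csv", its fields and the SQLite
# target are hypothetical stand-ins for real agency systems.

def extract(path: str) -> list[dict]:
    """Extract: read raw rows from an existing source file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalise names and dates into a common format."""
    records = []
    for row in rows:
        name = row["name"].strip().title()
        # Normalise DD/MM/YYYY dates to ISO 8601.
        d, m, y = row["visit_date"].split("/")
        records.append((name, date(int(y), int(m), int(d)).isoformat()))
    return records

def load(records: list[tuple], db_path: str) -> None:
    """Load: copy the transformed records into a new repository."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS visits (name TEXT, visit_date TEXT)")
        conn.executemany("INSERT INTO visits VALUES (?, ?)", records)

# load(transform(extract("gp_export.csv")), "warehouse.db")
```

Note that the load step writes a physical copy of the data into a new repository – which is precisely where the problems discussed below begin.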
One of the most problematic aspects of ETL is its reliance on duplicating data. Every pipeline creates a new data repository, and these can quickly multiply into complex, siloed datasets, each with its own governance and security mechanisms. With the General Data Protection Regulation (GDPR) requiring robust personal data policies, strict record-keeping and time limits on how long data can be stored, this presents a very real governance problem with potentially serious consequences. This is where data virtualisation is now helping.
Supporting the next generation of public services
Data virtualisation has already taken the business landscape by storm. In fact, according to Gartner, by 2022, 60% of all organisations will implement it as one of the key delivery styles in their data integration architecture. For public sector organisations looking to make the most of their data, it is also proving to be an invaluable asset.
Often thought of as a ‘data fabric’ – due to its ability to ‘stitch together’ data from disparate sources regardless of location, format or latency – data virtualisation delivers information in real time and in the format required by each individual user. This means that all public sector data, no matter where it is stored – on-premises, in a cloud environment or in a data lake – can be brought together to create a complete, real-time view.
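As a simplified illustration of the concept – a toy sketch, not a description of any particular product – the Python below stitches together two hypothetical sources, a GP database and a pharmacy CSV export, into one unified view. The key point is that each source is queried in place at the moment the view is read; nothing is copied into a new repository.

```python
import csv
import sqlite3

# Toy 'virtual view': both sources are queried live, in place.
# The database, CSV file and field names are invented for illustration.

def gp_records():
    """Source 1: GP notes held in a relational database."""
    with sqlite3.connect("gp_system.db") as conn:
        yield from conn.execute("SELECT nhs_number, notes FROM patients")

def pharmacy_records():
    """Source 2: pharmacy records exported as CSV."""
    with open("pharmacy.csv", newline="") as f:
        for row in csv.DictReader(f):
            yield row["nhs_number"], row["prescription"]

def unified_patient_view():
    """Stitch both sources together at query time, on a shared key."""
    prescriptions = dict(pharmacy_records())
    for nhs_number, notes in gp_records():
        yield {
            "nhs_number": nhs_number,
            "gp_notes": notes,
            "prescription": prescriptions.get(nhs_number),
        }

# for patient in unified_patient_view():
#     print(patient)
```

Because the view reads its sources live, a change in either system is reflected the next time the view is queried – the ‘real-time’ property described above.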
In terms of security and compliance, data virtualisation offers a way for organisations to manage data centrally, whilst maintaining consistent standards for every dataset and laying down clear access rules for all data consumers. It also allows data to be accessed conveniently through front-end solutions, such as applications and dashboards, without the user needing specialised software or knowledge of where the data is actually stored. This enables public sector organisations to make much more efficient use of their time, staff and resources.
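A minimal sketch of that central enforcement point, continuing the hypothetical example above: every request passes through a single policy layer, so the same dataset is governed consistently no matter which application or dashboard asks for it. The roles and masking rules are invented for illustration.

```python
# One central policy layer, applied at query time to every consumer.
# The roles and masking rules below are invented for illustration.

POLICIES = {
    "analyst":   {"hidden_fields": {"nhs_number"}},  # analysts get anonymised rows
    "clinician": {"hidden_fields": set()},           # clinicians see full records
}

def apply_policy(record: dict, role: str) -> dict:
    """Redact any field the caller's role is not permitted to see."""
    hidden = POLICIES[role]["hidden_fields"]
    return {k: ("***" if k in hidden else v) for k, v in record.items()}

record = {"nhs_number": "943 476 5919", "gp_notes": "Follow-up in two weeks"}
print(apply_policy(record, "analyst"))
# -> {'nhs_number': '***', 'gp_notes': 'Follow-up in two weeks'}
```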
Whether within health care, social services, policing or the judiciary, data holds the key to improving services at a lower cost and to making multi-agency cooperation more efficient. In our data-intensive age, traditional integration methods are no longer fit for purpose and, as a result, the full potential of the insights held in data is not being realised. Data virtualisation can unlock a more efficient and responsive public sector – one that delivers services with the speed and quality that citizens are entitled to expect, and that has the information needed to benefit society as a whole.