Some thoughts on moving from an 'open' data to a 'closed' data world
Sometimes when you start to write an article, what ends up coming out of the fingertips is different from what you set out to convey. As I started this (what was supposed to be a short) article, I kept thinking about how we in the information security space are constantly on the back foot, because the fundamental design of how we secure our world starts with a soft, gooey middle of data that is interesting to the bad guys. It's a fundamentally weak starting structure. We work to secure it, but even that process requires more work still: extensive logging and alerting just to have a chance of keeping the bad guys at bay. So the thoughts that went through my head as I started were, "Wouldn't it be nice to know critical data is safe, even in the case of a breach? Wouldn't it be nice to know who was accessing your critical data, when they were accessing it, and what they were doing with it? Wouldn't it be nice if all the logging and alerting meant to catch bad guys was easier and more accurate?" Which led to this article. I believe we are on the verge of a sea change, where data will move from always being 'open' and visible to being 'closed' and locked except when explicitly needed.
Information security has used the same model for decades: a hardened perimeter with open, unsecured data on the inside. If, or I should say when, the perimeter is breached, the bad guys have unfettered access to use, view, or export your data. In this traditional model, the default state of data is open, or unsecured. Because of this open state, most organizations need an extensive logging and alerting infrastructure just to ferret out whether data has been viewed, tampered with, or exfiltrated without permission.
Now imagine instead that the default state of your data was closed, or secured. Imagine your data was in this secured state for its entire life cycle, from creation to transit to storage. The only time your data is ever unsecured is precisely when it needs to be viewed or processed. After that, it reverts right back to its default secured state. I believe this new data-centric security model is the way forward for companies that want to thrive.
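To make the idea concrete, here is a minimal sketch of that lifecycle: data is sealed the moment it is created, is unintelligible at rest, and is opened only at the precise point of use. The `seal`/`unseal` functions and the toy XOR-keystream construction are purely illustrative assumptions of mine, not a real design; a production system would use authenticated encryption (e.g. AES-GCM) from a vetted library.

```python
import hashlib
import secrets


def _keystream(key: bytes, n: int) -> bytes:
    # Toy keystream derived from the key. Real systems would use an
    # authenticated cipher such as AES-GCM, never this construction.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]


def seal(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; the same call both seals and
    # unseals, since XOR is its own inverse.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


unseal = seal

key = secrets.token_bytes(32)
record = seal(key, b"M&A target list")       # data is born sealed
assert record != b"M&A target list"          # at rest it is unintelligible
plaintext = unseal(key, record)              # opened only at point of use
assert plaintext == b"M&A target list"
```

The point of the sketch is the shape of the lifecycle, not the cipher: there is no window in which the record sits readable on disk waiting to be stolen.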
This data-first model has many security implications, but I want to specifically discuss how it helps with today's ever more stringent regulatory environments. With data sitting in an open state, knowing whether the wrong system or person accessed it is difficult. Consequently, it's a daunting and expensive prospect to set up the logging and reporting infrastructure correctly to protect the data. And when it's time to run reports proving you are handling customer data properly, or meeting regulatory requirements by allowing access only to the appropriate entities, that's another exhausting effort. With data in a closed state, however, the entire reporting effort changes. Since specific access must be granted to see the data in its unsecured state, it is much simpler and easier to report on who or what accessed it. Imagine you had diamonds in a store, but as soon as they were taken out of the store they turned to dust. Would that change how you set up your security guards, cameras, and safes?
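Here is a rough sketch of why the reporting effort shrinks. When every read requires an explicit grant, the audit trail is a side effect of the access path itself, and the compliance report is just the contents of that log. The `ClosedStore` class and all of its names are hypothetical illustrations of the pattern, not any particular product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ClosedStore:
    """Hypothetical store where data is closed by default: every read
    requires an explicit grant, and every attempt (allowed or denied)
    lands in the audit log automatically."""
    _records: dict = field(default_factory=dict)
    _grants: dict = field(default_factory=dict)   # record id -> principals
    audit_log: list = field(default_factory=list)

    def put(self, record_id: str, data: bytes, allowed: set) -> None:
        self._records[record_id] = data
        self._grants[record_id] = set(allowed)

    def read(self, record_id: str, principal: str) -> bytes:
        now = datetime.now(timezone.utc)
        if principal not in self._grants.get(record_id, set()):
            self.audit_log.append((now, principal, record_id, "DENIED"))
            raise PermissionError(f"{principal} has no grant for {record_id}")
        self.audit_log.append((now, principal, record_id, "OPENED"))
        return self._records[record_id]


store = ClosedStore()
store.put("hr-report-2024", b"...", allowed={"alice"})
store.read("hr-report-2024", "alice")          # allowed, and logged
try:
    store.read("hr-report-2024", "domain-admin")
except PermissionError:
    pass                                        # denied, and also logged

# The "compliance report" is simply the audit log.
report = [(who, rec, what) for _, who, rec, what in store.audit_log]
```

Instead of combing general-purpose system logs to reconstruct who might have seen the data, you report on the one narrow chokepoint through which the data can be opened at all.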
Let’s use an example to show how this can transform a business. Imagine you are a company with some critical files: incident response reports, HR reports, M&A information. In the old model, the IT team can view that information on any server where it is stored, since they are responsible for managing the network resources. To solve this problem, many companies either build secondary infrastructure or build alerts on the folders and files to see whether anyone has tampered with them. Most file access is benign, but the team still needs to be prepared to explain each instance of file access when reports are pulled.
Now imagine instead that the data was in a secured state by default, and only the appropriate people could pull it into a readable document. You wouldn’t care whether the domain admin or anything else touched the secured files. You wouldn’t even have to set up logging for file access, since all files would be secured and unintelligible to anyone who wasn’t specifically authorized. And if bad guys came and stole the data while it was in its default, secured state, you wouldn’t even need to report it (though there are reasons it could be useful to do so…see our other article).
In conclusion, there are generally two ways to structure systems in terms of how they treat data: leave the data open and harden the perimeter to prevent breaches, or keep the data in a secured state by default, only allowing it to be opened by specified users or systems. In the systems world, we’ve been operating in model one, but we need to transition to model two. Trying to prove what wasn’t done to the data is hard, time consuming, and virtually impossible to get 100% accurate. On the other hand, with data that is closed, reporting on who or what actually accessed the 'real' data is comparatively easy. It simplifies and completely transforms your logging and reporting methodology, which frees up your organization to spend time on innovation. Said another way, if your data is fundamentally closed, in an ironic twist you actually free your people to innovate on it. I believe this shift will happen, and that moving our security to a data-centric model will provide powerful benefits to companies: not only as a way to protect their most critical asset, and an easier means to navigate a regulatory environment that continues to grow more complex and stringent, but as a faster path to innovation and heightened customer trust. And that would be good for everyone.