
Data classification: What’s the point of brushing off 'digital dust'?

08 Aug 2018

To a lot of us, data is a boring but necessary evil. Our organisations create documents that end up being stored somewhere (often goodness knows where), gathering the equivalent of digital dust. We're accumulating vast stores of data we have no idea what to do with, and no hope of learning anything useful from. Layer on top of that the increasingly stringent data protection and compliance requirements, and it's easy to see that we've got our data management work cut out for us.

For IT security professionals, this data deluge only adds to the challenge of identifying, mitigating and remediating the ever-increasing number of cyber threats out there.  Much of the advice around security strategy has shifted from attempting to block threats at the perimeter, to protecting what matters most – sensitive data.  But therein lies another challenge.  With such high volumes of data, how is it possible to quickly identify which data is the highest priority for protection?

Data discovery and classification are not new, but they have seen a recent resurgence. When implemented properly, a solid data classification process will allow the IT security team to identify where sensitive data resides, set policies for handling it, implement appropriate technical controls, and educate users about current threats to the data they work with and best practices for keeping it safe.

Best practice for implementing a data classification strategy

Data classification is not a one-size-fits-all exercise. Every company has different needs to address, so a strategy must be tailored accordingly. However, the following five-point action plan can be used to create the foundation of an effective strategy:

Define what’s needed

Establish what the goals, objectives and strategic intent of the policy are. Make sure all relevant employees are aware of the policy and understand why it is being put in place. An effective data policy must also balance the confidentiality and privacy of employees/users against the integrity and availability of the data being protected. A policy that's too stringent can alienate staff and impede their ability to carry out their jobs, but if it's too lax, the very data the firm is trying to protect could be at risk.

Establish the scope

It's important to establish where the boundaries will be early on; otherwise the scope can quickly grow out of control. This is particularly important when considering partners and third parties. How far into their networks will, or can, you reach? Equally important is legacy and archived data. Where is it, and how will it be protected? Finally, make sure to note anything that's out of scope and ensure this is evaluated and adjusted regularly.

Discover all sensitive data that’s in scope

Once the data policy and scope have been established, the next task is to identify all the sensitive data that requires classification and protection. First, understand what data you are looking for. This could take many forms, ranging from confidential case files and personally identifiable information through to client IP, source code and proprietary formulas.

Next, focus on where this data is likely to be found, from endpoints and servers to on-site databases and the cloud. Remember that discovery is not a one-time event; it should be continuously re-evaluated, taking into account data at rest, data in motion and data in use across all platforms.
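As a rough illustration of what content-based discovery can look like in practice, the short Python sketch below walks a folder and flags files containing a couple of common personally identifiable information patterns. The patterns, file types and the example path are purely illustrative assumptions; a production discovery tool would cover far more data types, locations and file formats.

```python
import re
from pathlib import Path

# Illustrative patterns only - real discovery tools ship far broader rule sets.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b0\d{3}[ -]?\d{3}[ -]?\d{4}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_file(path: Path) -> dict:
    """Return a count of PII-like matches per pattern for one text file."""
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return {}
    return {
        name: len(pattern.findall(text))
        for name, pattern in PII_PATTERNS.items()
        if pattern.search(text)
    }

def discover(root: str) -> None:
    """Walk a directory tree and report files that appear to contain sensitive data."""
    for path in Path(root).rglob("*"):
        if path.suffix.lower() not in {".txt", ".csv", ".log"}:
            continue
        hits = scan_file(path)
        if hits:
            print(f"{path}: {hits}")

if __name__ == "__main__":
    discover("/data/shared")  # assumed example location - adjust to your environment
```

Running a scan like this on a schedule, rather than once, is one simple way to treat discovery as a continuous process rather than a one-off project.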

Evaluate all appropriate solutions

When the time comes to identify an appropriate data classification solution, there are plenty to choose from. Many of the best solutions today are automated, with classification based on context (file type, location and so on) and/or content (fingerprinting, regular expressions and so on). This option can be expensive and require a high degree of fine-tuning, but once up and running it is extremely fast, and classification can be repeated as often as desired.
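To make the context/content distinction concrete, here is a minimal, hypothetical sketch of how an automated engine might combine context rules (where a file lives, what type it is) with content rules (what the file contains) to assign a label. The labels, folders and patterns are assumptions for illustration, not a reference to any particular product.

```python
import re
from pathlib import Path

# Context rules: based on where a file lives or its type (illustrative only).
CONTEXT_RULES = [
    (lambda p: "finance" in p.parts, "Confidential"),
    (lambda p: p.suffix == ".sql", "Internal"),
]

# Content rules: based on what the file actually contains (illustrative only).
CONTENT_RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "Restricted"),      # SSN-like pattern
    (re.compile(r"confidential", re.IGNORECASE), "Confidential"),
]

SEVERITY = {"Public": 0, "Internal": 1, "Confidential": 2, "Restricted": 3}

def classify(path: Path) -> str:
    """Assign the most restrictive label suggested by context and content rules."""
    label = "Public"
    for test, rule_label in CONTEXT_RULES:
        if test(path) and SEVERITY[rule_label] > SEVERITY[label]:
            label = rule_label
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return label
    for pattern, rule_label in CONTENT_RULES:
        if pattern.search(text) and SEVERITY[rule_label] > SEVERITY[label]:
            label = rule_label
    return label
```

The fine-tuning mentioned above is essentially the work of expanding and adjusting rule sets like these until false positives and false negatives are at an acceptable level.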

An alternative to automated solutions is a manual approach, which allows users themselves to choose the classification of a file. This approach relies on a data expert to lead the classification process and can be time-intensive, but in law firms, where the correct classification is often intricate and/or subjective, a manual approach can be preferable.

A final option is to outsource the classification process to a service provider or consulting firm. This approach is rarely the most efficient or cost-effective, but it can provide a one-time classification of data and give any firm a good idea of where it stands in terms of compliance and risk.

Ensure feedback mechanisms are in place

The final stage is to ensure there are effective feedback mechanisms in place that allow swift reporting both up and down the firm's hierarchy. As part of this, data flows should be analysed regularly to ensure classified data isn't moving in unauthorised ways or resting in places it shouldn't be. Any issues or discrepancies should be flagged immediately for follow-up.
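One simple way to picture that kind of check: the hypothetical Python sketch below compares where classified files were found against an allowlist of approved locations and flags anything out of place. The labels, paths and inventory are assumed for illustration only.

```python
from pathlib import Path

# Approved storage locations per classification label (illustrative assumptions).
APPROVED_LOCATIONS = {
    "Restricted": [Path("/secure/vault")],
    "Confidential": [Path("/secure/vault"), Path("/data/finance")],
}

def flag_misplaced(inventory: dict[Path, str]) -> list[Path]:
    """Return classified files resting outside their approved locations."""
    misplaced = []
    for path, label in inventory.items():
        allowed = APPROVED_LOCATIONS.get(label)
        if allowed is None:
            continue  # labels such as Public or Internal are not location-restricted here
        if not any(path.is_relative_to(root) for root in allowed):
            misplaced.append(path)
    return misplaced

# Example: an inventory produced by an earlier discovery scan (hypothetical data).
inventory = {
    Path("/secure/vault/contract.docx"): "Restricted",
    Path("/home/alice/Downloads/clients.csv"): "Confidential",
}
print(flag_misplaced(inventory))  # e.g. [PosixPath('/home/alice/Downloads/clients.csv')]
```

The results of a check like this are exactly the kind of discrepancy that should feed back up and down the hierarchy for follow-up.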

With data now playing a pivotal role in nearly every firm around the world, the ability to track, classify and protect it is critical. An effective data classification strategy should form the cornerstone of any modern security initiative, allowing firms to quickly identify the data most valuable to them and their clients, and ensure it is safe at all times.

Article by Digital Guardian VP and GM EMEA, Jan van Vliet.
