Five Simple Facts About Hardening Corporate Data

Author

Frontline, Contributor

March 5, 2015

Last month, the Ponemon Institute published a report on 2014’s mega-breaches analyzing the impact they had on data protection practices. Unsurprisingly, the report confirmed corporate leaders are allocating funds to create a “stronger cyber defense posture.” Earmarked mainly for better detection and response, the top post-Target security investments in 2014 were security information and event management (SIEM), endpoint security, and intrusion prevention and detection software. Better detection and response is sure to help, but until we shift our focus from hardening the network to hardening the data, high-profile data losses will remain in the headlines.

Knowing that it’s all about the data, why are we still so focused on the network?

At a macro level, the pace of IT innovation is moving faster than even the most high-performing IT teams can keep up with. There’s no turning back from the cloud and mobility. Businesses are becoming increasingly globalized and distributed, generating more data in more places than ever before. In today’s borderless business environment, controlling the flow of data has become impractical and virtually impossible, but the “moat and castle” approach is still ingrained into IT security organizations.

Plus, first-generation information protection solutions required a significant investment of time and resources, only to have their value degraded by performance, management and complexity issues. While offerings have matured greatly since the days when DLP was the shiny new security thing, data-centric security solutions still suffer from residual bad PR.

Beyond rendering stolen data useless, in the era of mega-breaches a data-centric security strategy offers security teams one of their most precious commodities: control. They may not have control over where corporate data travels, but they have total control over protection policies, providing a much-needed home-court advantage. Given the size and scope of some of the recent mega-breaches, it’s time to get serious about data-centric security.

The first step for any information protection initiative is to classify your data. Accurate classification is critical to the success of any implementation, and is inherently linked to your protection strategy and architecture.  Getting this up-front part done right is more than half the battle, and here are some best practices that can streamline the process:

Classify and protect selectively

Most companies classify data along these general lines:

  • Public – such as website copy, product brochures, etc.
  • Internal – internal memos, phone lists
  • Confidential – product roadmaps, HR info
  • Secret – intellectual property, financial data, PII, etc.

For most organizations, the vast majority of data is likely to be “Public” or “Internal,” meaning it does not require protection. For the other levels, create clear policies so that it’s easy for the organization to determine which category a given piece of data falls into, along with clear guidelines as to which data warrants protection and why.

Limit the role of end-users

Perhaps the main reason why first generation solutions didn’t work is that they put way too much accountability on end-users to classify their data before they sent it out. This resulted in “pop-up” overload where workers could barely do anything without getting prompted by the protection system and as a result, would misclassify data. It is possible to classify data by content, location, users, and metadata without user intervention. Let your workers work, and hold your vendors accountable – have them help with policy creation and with automating classification and protection.
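Classification by content, location, and metadata can be fully automated. The sketch below shows one way a rule engine might label data without ever prompting the user; the patterns, paths, and department names are illustrative assumptions:

```python
import re

# Illustrative rules: classify by content pattern, file location, and
# author metadata, with no end-user pop-ups. All rules are assumptions.
CONTENT_RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "secret"),          # SSN-like PII
    (re.compile(r"confidential", re.IGNORECASE), "confidential"),
]
LOCATION_RULES = {"/finance/": "secret", "/hr/": "confidential"}
RANK = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

def classify(path: str, text: str, author_dept: str = "") -> str:
    """Pick the highest-sensitivity label any rule produces."""
    labels = ["internal"]  # default for anything on the corporate network
    labels += [label for pattern, label in CONTENT_RULES if pattern.search(text)]
    labels += [label for prefix, label in LOCATION_RULES.items() if prefix in path]
    if author_dept == "legal":  # metadata-based rule
        labels.append("confidential")
    return max(labels, key=RANK.get)

print(classify("/hr/review.docx", "Performance notes"))  # → confidential
```

Because every rule fires silently and the strictest match wins, workers keep working while misclassification risk stays with the policy, where it can be tuned centrally.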

Classify/protect close to the data source

Whether it is application generated reports or a data owner drafting a new memo, classifying and/or protecting the data before it travels anywhere ensures that the data does not get “tainted” or in any way altered, minimizing the chance of misclassification.

Do not over protect!

This is another common mistake many organizations make. A general guideline is that organizations should encrypt about 20% of their data; if you are protecting 80% of your data, then your policies are off. With unstructured data (files and emails), start by protecting documents. When it comes to protecting email, craft rules about attachments first, as 60-70% of email traffic tends to be known non-secret data. If you want to avoid a Sony-like situation, you can also protect based on users (“Protect all executive emails”) without overprotecting.
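Those two email rules, attachments first, then a user-based policy, can be sketched as a short decision function. The addresses and extensions here are hypothetical placeholders:

```python
# Sketch of attachment-first email protection with a user-based rule
# in the spirit of "protect all executive emails". The sender list and
# extension set are illustrative assumptions.
EXECUTIVES = {"ceo@example.com", "cfo@example.com"}
SENSITIVE_EXTENSIONS = {".xlsx", ".docx", ".pdf"}

def should_encrypt(sender: str, attachments: list[str]) -> bool:
    # Rule 1: attachments carry most of the risk, so evaluate them first.
    if any(name.lower().endswith(ext)
           for name in attachments
           for ext in SENSITIVE_EXTENSIONS):
        return True
    # Rule 2: user-based policy, e.g. protect all executive email.
    if sender.lower() in EXECUTIVES:
        return True
    # Everything else (the bulk of traffic) goes out unprotected,
    # which is what keeps the policy under the ~20% guideline.
    return False

print(should_encrypt("alice@example.com", ["notes.txt"]))  # → False
print(should_encrypt("ceo@example.com", []))               # → True
```

Ordering matters less than coverage here: both rules are cheap, but leading with attachments means the common case (plain mail from ordinary users) exits quickly without encryption.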

Don’t skimp on up-front planning

Protection is preventive and, as such, requires careful planning. Even though protection polices can be applied automatically, creating them can require careful thinking and a keen understanding of the business. If there is any upside from all the breaches it is that security is now front and center. It’s a great opportunity to align security with other business functions, especially legal and compliance, which often fund Information Protection projects and whose input can streamline classification and policy creation.

They say an ounce of prevention is worth a pound of cure. Well…it’s true. There’s been some debate as to whether Anthem was right or wrong to encrypt, but the bottom line is if it had found a way to protect that data, it would have been in a much different position.

This article was written by Frontline from Forbes and was legally licensed through the NewsCred publisher network.
