On-premises data loss prevention strategies alone are no longer sufficient to protect enterprise data against inadvertent or malicious exposure.
As more workers upload, store and share corporate data in private and public cloud environments, organizations have to confront the realities of protecting data that users access from anywhere at any time through a mix of sanctioned and unapproved devices on services with varying degrees of security. Monitoring and controlling data that is stored in cloud services and downloaded to devices outside the enterprise network has become critical for CIOs and CISOs in today's environments.
To implement effective data loss prevention (DLP) in the cloud, security administrators need to understand which cloud services employees are using and what type of data is being shared, as well as how, when and why this is happening. Unfortunately, for many organizations that's a lot easier said than done.
"Cloud data is hard to locate," says Richard Stiennon, chief research analyst and founder of research firm IT-Harvest. "It may be in so-called shadow IT servers set up by staff. It can be dispersed in thousands of fragments in a cloud storage solution like Google or Amazon Simple Storage Service or even a Hadoop database." Sometimes, data can reside in snapshot images of workloads, or be encrypted with a user's private keys and those belonging to the cloud service provider.
The tendency by remote workers to connect to and use unapproved cloud collaboration services has created a huge shadow IT problem, says Krishna Narayanaswamy, chief scientist at Netskope Inc., a cloud security startup in Los Altos, Calif. The average enterprise uses about 755 cloud apps, only a small fraction of which are actually approved for use by IT.
Several tools and services are commercially available that let CIOs and CISOs gain this sort of visibility and allow them to categorize and prioritize cloud applications based on the risk they pose to enterprise data. Such discovery is critical to enabling effective cloud DLP.
Organizations basically have two options for implementing DLP in the cloud. One is through the use of a cloud access security broker (CASB) service or software tool, and the other is to inspect data uploaded and stored in the cloud via an API within the application itself.
A CASB is an in-line proxy or gateway that sits between the enterprise and the cloud service provider and inspects data streaming into and out of cloud applications. Not all CASB architectures are alike, and services widely adopted for personal use, such as Twitter and Facebook, can complicate blanket traffic inspection because of employee privacy concerns.
"Organizations can direct connections to cloud applications through a gateway that does content inspection based on policies looking for keywords or access to specific cloud applications," says Fred Kost, senior vice president at cloud security vendor HyTrust Inc., in Mountain View, Calif. "This may benefit some organizations, depending on the connection to the cloud application and their ability to effectively inspect the application connection."
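The content inspection Kost describes can be sketched in a few lines. The following is a minimal illustration, not any vendor's implementation: a gateway-style check that scans an outbound payload against regular-expression patterns for common sensitive-data types (the pattern names and the block/allow policy are hypothetical).

```python
import re

# Hypothetical patterns an inline gateway might scan for in outbound uploads.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def inspect_payload(payload: str) -> list[str]:
    """Return the names of sensitive-data patterns found in the payload."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(payload)]

def enforce_policy(payload: str) -> str:
    """Block uploads containing sensitive data; allow everything else."""
    return "block" if inspect_payload(payload) else "allow"
```

A real gateway would of course terminate TLS, reassemble the application protocol, and apply far richer classifiers; the point here is only that policy enforcement reduces to matching content against rules before it leaves the network.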
CASBs offer a central control point for enforcing policies in an environment in which users might be remote or mobile, devices are both managed and unmanaged, and cloud services are hosted all over the world with different native security and compliance capabilities. The access security gateway market, which attempts to address on-premises, cloud and software-as-a-service deployments, is largely populated by startups: Adallom (acquired by Microsoft last July), Elastica (acquired by Blue Coat Systems in November), CipherCloud, Netskope and Skyhigh Networks.
In addition to providing CISOs greater visibility through audit logs and compliance reports, CASB technology can help security administrators enforce enterprise DLP policies pertaining to encryption, access, authentication and authorization. That's assuming the organization has classified its data and created the complex set of rules necessary for effective leak prevention.
"Rather than recreate DLP policies already in place, an enterprise can use a CASB to connect their on-premises DLP system to their cloud security provider and effectively extend their policies to data uploaded to the cloud," says Kamal Shah, senior vice president of products at Skyhigh Networks in Campbell, Calif.
Even companies that don't have an existing DLP mechanism in place for their internal network can specify and enforce leak prevention controls for their cloud data. Many CASBs offer templates to identify sensitive data and enable users to define policies similar to on-premises DLP.
While CASBs started to gain traction in 2015, APIs are familiar tools that enable developers to program interactions -- and security controls -- at the application level. Cloud service providers offer APIs as a way for enterprises and security vendors to analyze interactions related to data that has already been uploaded to the cloud. APIs can make it easier for enterprises to inspect these transactions for sensitive information -- Social Security numbers (SSNs), health information, intellectual property, credit card and financial data -- to impose the appropriate access and security policies. Cloud security is multifaceted, however. Most attacks happen over APIs, so programming complex interactions between data center and Web components, for example, carries its own risks.
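In API mode, the scanner works on data already at rest rather than on traffic in flight. A minimal sketch of the pattern, with the provider's listing and download calls abstracted behind two callables (stand-ins for a real storage API such as a Drive or S3 client, which are assumptions here, not a specific vendor's interface):

```python
import re
from typing import Callable, Iterable

# SSN-like strings, as one example of the sensitive data types named above.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_stored_files(list_files: Callable[[], Iterable[str]],
                      fetch: Callable[[str], str]) -> list[str]:
    """Flag files already stored in the cloud that contain SSN-like strings.

    list_files() and fetch(name) stand in for a provider's real API calls;
    a production scanner would page through results and handle binary formats.
    """
    return [name for name in list_files() if SSN.search(fetch(name))]
```

This after-the-fact inspection is exactly why Leichter notes a slightly greater exposure window in API mode: the data has already landed before the scan runs.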
APIs are less intrusive than proxies, but there is a slightly greater risk of exposure because data is inspected only after it arrives in the cloud, says Willy Leichter, global director of cloud security at CipherCloud in San Jose, Calif. "Where you need to apply conditional DLP policies that require more processing, the application API mode tends to work better." Cloud security gateways work well for enforcing DLP policies on structured data and somewhat less so with unstructured data, he says. "When you talk about unstructured data, it is always a larger problem, because you don't know what you don't know."
Contextual analysis of transactions
According to Gartner, by 2020 roughly 85% of large enterprises will use CASBs to gain greater visibility into their cloud services for security and compliance purposes. The analyst firm expects that the technology will give companies a much more granular understanding of how workers are consuming cloud services and the risks resulting from such use.
One of the most important qualities of CASBs is that they give enterprises the ability to add critical real-time context to security decisions in the cloud, according to Gartner. For instance, security administrators can use CASBs to develop and enforce policies like restricting or enabling data access based on the location or the time of day. Similarly, they can be used to encrypt certain types of data like SSNs and credit card numbers that are uploaded to the cloud or to deny access to devices that do not meet enterprise security policies.
"Many enterprises doing DLP in the cloud find contextual awareness to be a 'must-have' for cloud DLP due to the sheer volume of cloud transactions that need to be inspected," Narayanaswamy says.
Data loss prevention tools allow organizations to impose security controls on data based on its type and the context in which it is being used. But the policies need to be flexible enough to ensure that people who are authorized to access sensitive and secured data in the cloud can still get to it as needed. That means DLP enforcement must be not just content aware but also context aware: who is accessing the data, from where, why they might be accessing it and whether that access complies with the associated policies.
"If you think about cloud app traffic, there are tons of transactions and you need a way to narrow down that sea of data to the transactions you really care about," says Narayanaswamy. "Context means who's doing the action, on what device, from what location, in what application or category, what the app risk level is, where the data is being hosted and what the activity is," he says.
For instance, analyzing data that's uploaded from a desktop computer on the campus network to a cloud storage folder designated for sensitive data may not be as important as inspecting the data that's downloaded from that same sensitive folder to a remote user on a personal device. By classifying data using context-aware techniques, enterprises can reduce the false positives in detection that plague many legacy enterprise DLP systems, he says.
Context-aware DLP takes into account factors like whether an employee is accessing data on a managed or unmanaged device or where they are located, adds Shah. For instance, a company may need to specifically prevent employees from accessing sensitive information on a personal device or when they are abroad. Or some companies might only want employees to share sensitive data in the cloud with sanctioned business partners while prohibiting other kinds of use.
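The contextual factors Narayanaswamy and Shah list can be expressed as a simple decision function. The rules below are purely illustrative (the field names, the choice of "US" as the home country, and the block/allow outcomes are all assumptions made for the example, not any product's policy engine):

```python
from dataclasses import dataclass

@dataclass
class Context:
    user: str
    device_managed: bool   # corporate-managed vs. personal device
    country: str           # where the access originates
    activity: str          # e.g. "upload", "download", "share"
    data_sensitive: bool   # result of content classification

def decide(ctx: Context) -> str:
    """Illustrative context-aware rules: block sensitive downloads to
    unmanaged devices or from outside the (assumed) home country."""
    if ctx.data_sensitive and ctx.activity == "download":
        if not ctx.device_managed:
            return "block"
        if ctx.country != "US":
            return "block"
    return "allow"
```

Combining content classification with these contextual signals is what lets a CASB pass the routine upload from a managed campus desktop while stopping the same file's download to a personal device abroad.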
Consistent policies across environments
Regardless of the architecture that an organization chooses for cloud DLP, it is vital to have a consistent set of policies governing the manner in which protected data is secured and accessed in the cloud and within the enterprise.
The same content-scanning policies that are used for data stored on premises need to be applied to information stored and shared in the cloud, says Leichter. An organization's obligations to comply with relevant regulatory requirements like PCI DSS, HIPAA, GLBA and FINRA standards do not change because data has migrated to the cloud. The only thing that really changes is the set of actions needed to prevent and remediate sensitive data leaks.
To be truly effective, a cloud DLP policy must cover Internet traffic outbound from corporate devices to cloud services and data that is being downloaded from these sites to devices outside the enterprise network.
By 2017, every enterprise DLP provider will have developed at least one partnership with a CASB or acquired one, according to Gartner. With a central control point for cloud security, CIOs and CISOs can enforce security policies across multiple apps, instead of relying on each service's built-in DLP controls.
This can be achieved by logically separating the data classification rules from the enforcement policies, says Narayanaswamy. "Any other approach is knowingly creating a nightmare for your customers."
About the author
Jaikumar Vijayan is a freelance writer with over 20 years of experience covering the information technology industry. He is a frequent contributor to Christian Science Monitor Passcode, eWEEK, Dark Reading and several other publications.