SAN DIEGO -- Security professionals looking to protect sensitive corporate data in a cloud environment have new attack vectors to worry about and can run into complications with tools like data loss prevention (DLP) and encryption, according to Gartner.
The public cloud’s elasticity, scalability, complexity and multi-tenancy all present technical risks for protecting data in the cloud, Ramon Krikken, Gartner research director, said in a presentation at Gartner Catalyst Conference 2011. “Anyone with a credit card can sign up for these things, so you never know who your neighbor is,” he said.
Data integrity is an issue in public cloud environments, he said. “You don’t own the hardware. Hardware is what anchors trust.”
With multi-tenancy, there’s the risk of a cloud provider losing multiple clients’ data, said Trent Henry, Gartner research vice president. In the hybrid cloud model, where workloads move dynamically to a cloud provider, security professionals need to be aware of the sensitivity of data that may be moving and how security controls will operate in a public cloud environment, he said.
Content-aware DLP has proven helpful in the data center, but is problematic in the cloud due to a lack of integration, Henry said. “Providers need to enable a platform that we can plug content-aware DLP into, and DLP vendors need to make use of those platform capabilities,” he said.
Encryption options can also be limited, Krikken said. Building encryption into the application is one option, but then there's the question of where the keys go, he said. Another alternative, encrypting data before it's sent to the cloud, restricts what a company can do with a cloud service, because the cloud service needs to be able to perform operations on the data, he said.
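One workaround for the operations problem Krikken describes is a "blind index": the client keeps the encryption keys and stores a deterministic HMAC token alongside each ciphertext, so the provider can answer equality queries without ever seeing plaintext. The sketch below is illustrative only, not a technique named in the talk; the key handling and field names are assumptions.

```python
import hmac
import hashlib
import secrets

# Hypothetical index key, generated and held by the client;
# it is never sent to the cloud provider.
INDEX_KEY = secrets.token_bytes(32)

def blind_index(value: str) -> str:
    """Deterministic HMAC token for a field value. The provider can
    compare tokens for equality but cannot recover the plaintext."""
    return hmac.new(INDEX_KEY, value.lower().encode(), hashlib.sha256).hexdigest()

# The client would store this token next to the (separately encrypted)
# record; an equality search becomes a token comparison in the cloud.
stored = blind_index("alice@example.com")
query = blind_index("alice@example.com")

assert stored == query                            # equality search still works
assert blind_index("bob@example.com") != stored   # different values differ
```

The tradeoff is exactly the one Krikken flags: only operations the token scheme supports (here, equality) remain possible; range queries, substring search and aggregation on the encrypted field do not.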
Data masking is an alternative to encryption, but is just as complicated, Krikken said. “Don’t think it’s a silver bullet.”
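To make the tradeoff concrete, a minimal masking sketch (the field formats and rules here are assumed for illustration, not drawn from the presentation):

```python
def mask_email(email: str) -> str:
    """Keep the first character and the domain; hide the rest."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain

def mask_card(pan: str) -> str:
    """Standard last-four masking for a card number."""
    digits = pan.replace(" ", "")
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_email("alice@example.com"))   # a***@example.com
print(mask_card("4111 1111 1111 1111"))  # ************1111
```

Even this simple case hints at why masking is no silver bullet: masked values can break format validation, collide across records, and destroy the referential integrity that downstream cloud applications may depend on.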
“Preventative controls will only be effective up to a point,” he said. “Monitoring is going to be extremely important. … Without monitoring, you’re flying blind.” Monitoring in a cloud environment will require figuring out where logs originate and reside, he said.
Henry recommended attendees consider service quotas and throttles to thwart an economic threat stemming from cloud elasticity. For example, if a process runs amok, public cloud infrastructure costs could easily skyrocket, leading to what he described as an “economic denial-of-service,” in which service costs run far beyond a company’s budget.
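A spending throttle of the kind Henry recommends can be as simple as refusing new capacity once projected spend would exceed a hard budget cap. The sketch below is a hypothetical illustration; the class name, rates and budget figures are invented, not from the talk.

```python
class SpendGuard:
    """Hypothetical budget throttle: deny new cloud instances once the
    projected monthly spend would exceed a hard cap, blunting an
    'economic denial-of-service' from a runaway process."""

    HOURS_PER_MONTH = 24 * 30  # simple 30-day month for projection

    def __init__(self, hourly_rate: float, monthly_budget: float):
        self.hourly_rate = hourly_rate
        self.monthly_budget = monthly_budget
        self.running = 0  # instances currently granted

    def projected_spend(self) -> float:
        return self.running * self.hourly_rate * self.HOURS_PER_MONTH

    def request_instance(self) -> bool:
        cost_of_one = self.hourly_rate * self.HOURS_PER_MONTH
        if self.projected_spend() + cost_of_one > self.monthly_budget:
            return False  # throttle: this instance would bust the budget
        self.running += 1
        return True

# A runaway process asks for 100 instances at $0.50/hour against a
# $5,000/month budget; only the affordable ones are granted.
guard = SpendGuard(hourly_rate=0.50, monthly_budget=5000.0)
launched = sum(guard.request_instance() for _ in range(100))
print(launched)  # 13 instances fit under the cap; 87 requests are refused
```

Real cloud platforms expose billing alerts and account-level quotas that serve the same purpose; the point is that the cap is enforced before capacity is provisioned, not discovered on the invoice.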
He urged the audience to get involved in capacity planning. “There are real dollars at stake,” he said.