Cloud tokenization: Why it might replace cloud encryption

Expert Dave Shackleford says cloud tokenization technology is becoming an attractive alternative to cloud encryption, but problems persist.

As more organizations do business with cloud service providers, security and business leaders are increasingly concerned with the security of data transmitted to and stored in cloud environments. Particularly in light of a number of recent data breaches and the leaks by former NSA contractor Edward Snowden, many security teams are looking for new and different ways to more carefully control enterprise data and prevent exposure or compromise.

Currently, the most popular method for protecting data used with cloud services is encryption, but tokenization is another option that is gaining traction.

In this tip, we'll discuss how tokenization in the cloud differs from encryption, including potential benefits and pitfalls.

Cloud tokenization

Tokenization tools replace sensitive data fields with a separate value called a token. The sensitive data resides in a token cache or database locally, and only the token values are transmitted when using cloud applications or services. The process is reversed when token data returns to the local network, where de-tokenization replaces each token with its associated cleartext value.
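
To make the mechanics concrete, the following is a minimal sketch (in Python, with hypothetical names) of a local token vault that generates random tokens and performs the reverse lookup. It is illustrative only and does not reflect any particular vendor's implementation.

```python
import secrets

class TokenVault:
    """Illustrative local token cache: maps random tokens to cleartext values."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so a given value always maps to one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(16)  # random token; no relationship to the input
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Reverse lookup performed when token data returns to the local network.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # only the token leaves the network
original = vault.detokenize(token)             # cleartext is restored locally
```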

Tokenization can be applied in several ways. It can be performed by in-house applications against databases and other sensitive data stores, with the resulting tokens then used in cloud application and service transactions. Alternately, some cloud security vendors offer "cloud-enabled" tokenization platforms and gateways that transform data into tokens as it traverses the gateway, storing the token mapping in a local cache. Many products also integrate with internal databases for more secure storage of the token data.

Tokenization offers organizations a number of possible benefits, including an opportunity to reduce the complexity of managing encryption keys and infrastructure, and in many cases to avoid sharing keys with cloud service providers. In addition, if tokenized data is breached within a cloud provider's environment, nothing sensitive is exposed, because the tokens should bear no resemblance to the original data they replaced. Be forewarned, though: Most tokenization schemes preserve the same structure and data types by default; for example, a credit card number may be replaced with a token made up of the same number of digits.
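
As a rough illustration of that default behavior, the sketch below (a hypothetical helper, not a production scheme) generates a format-preserving token for a card number: random digits of the same length, optionally keeping the last four. A real tokenization product would also guard against collisions and manage the token mapping.

```python
import secrets

def format_preserving_token(card_number: str, keep_last: int = 4) -> str:
    """Illustrative only: replace digits with random digits, preserving the
    original length, separators and (optionally) the last few digits."""
    digits = [c for c in card_number if c.isdigit()]
    body_len = len(digits) - keep_last
    new_digits = [str(secrets.randbelow(10)) for _ in range(body_len)] + digits[-keep_last:]
    out, i = [], 0
    for c in card_number:
        if c.isdigit():
            out.append(new_digits[i])  # substitute a generated digit
            i += 1
        else:
            out.append(c)              # keep spaces or dashes as-is
    return "".join(out)

print(format_preserving_token("4111-1111-1111-1111"))  # e.g. "7298-0342-5561-1111"
```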

Tokenization versus encryption

There are several fundamental differences between tokenization and encryption that influence security. With tokenization, the original data is completely separate from the tokens created, while encrypted data retains a mathematical relationship to the original plaintext. Encrypted data is also tied to the encryption algorithm and keys used to produce it, which typically means the output length and structure are fixed. Conversely, tokens can be generated in a number of ways, so the output type and length can be manipulated as needed and need not bear any relationship to the length of the original value.

Though it offers a number of security benefits, cloud tokenization should not be viewed as an all-upside proposition. With encryption, if a key is compromised, all data encrypted with it (regardless of location) would be vulnerable. However, a compromised tokenization cache or database would be equally devastating and could be viewed as a single point of failure.

From the editors: More on tokenization and PCI

SearchSecurity's resident compliance expert explains how to reduce the scope of PCI DSS regulations by using tokenization for credit card data.

Forrester Research's John Kindervag walks through the process of implementing tokenization technology for PCI purposes.

Beyond this concern, there are additional pitfalls organizations may face when applying tokenization. Many cloud services simply may not support it. Also, given the nature of the data transformation, generated tokens may not suffice without storing a replica of the token database in the cloud provider's environment. And for tokenization platforms that store the database locally and perform lookups, additional latency or performance issues may arise.

As for whether tokenization technology is the right choice, a new cloud offering called tokenization-as-a-service (TaaS), from leading providers like Akamai Technologies, CardVault (built by 3Delta Systems) and Liaison Technologies, may make sense for some organizations. With TaaS, payment card data captured at a point-of-sale (PoS) system is encrypted with a key associated with the tokenization server in the cloud, and the payment data is transmitted to the TaaS environment. There, the vendor decrypts the data and, once the authorization is processed, hands a token back to the merchant, eliminating the need for the merchant to ever store payment card data. Tokenization can help reduce compliance scope for regulations like the Payment Card Industry Data Security Standard (PCI DSS); having a third party manage this could offer cost savings and ease implementation concerns.
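
A simplified, merchant-side sketch of that flow is shown below. It assumes symmetric encryption via the Python cryptography package's Fernet recipe purely for illustration; actual key provisioning, ciphers and token formats vary by TaaS provider.

```python
from cryptography.fernet import Fernet  # third-party package: cryptography

# Assumed setup: the TaaS provider provisions an encryption key to the PoS system.
taas_key = Fernet.generate_key()
pos_cipher = Fernet(taas_key)

# The PoS encrypts the card data and transmits only ciphertext to the TaaS cloud.
card_data = b"4111111111111111|12/26|123"
payload = pos_cipher.encrypt(card_data)

# --- inside the TaaS environment (simulated here) ---
cleartext = Fernet(taas_key).decrypt(payload)  # provider decrypts for authorization
token = "tok_5f3a9c1d"                         # hypothetical token returned to the merchant

# The merchant stores only the token, never the payment card data.
print(token)
```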

For other organizations, a hybrid strategy that employs both tokenization and encryption may be the best approach: tokenization for critical, real-time services that support it, and encryption for data at rest and for cloud services that support robust key management. Tokenization also tends to be faster than encryption and poses less management overhead. By using newer techniques such as in-memory tokenization, which leverages pre-generated token tables that can be rapidly distributed and used for lookups, some organizations are making real-time, cloud-based data processing more practical.
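
The following sketch illustrates the pre-generated token table idea behind in-memory tokenization, under the simplifying assumption that a single table of random tokens is built ahead of time and handed out on demand; distribution across nodes and collision handling are omitted.

```python
import secrets
import string

def build_token_table(size: int = 10_000, token_len: int = 12) -> list:
    """Pre-generate a pool of unique random tokens ahead of time."""
    alphabet = string.ascii_uppercase + string.digits
    tokens = set()
    while len(tokens) < size:
        tokens.add("".join(secrets.choice(alphabet) for _ in range(token_len)))
    return list(tokens)

class InMemoryTokenizer:
    """Hands out tokens from the pre-built table, avoiding per-value generation."""

    def __init__(self, table):
        self._free = iter(table)  # raises StopIteration if the table is exhausted
        self._map = {}

    def tokenize(self, value: str) -> str:
        if value not in self._map:
            self._map[value] = next(self._free)  # constant-time assignment from the table
        return self._map[value]

tokenizer = InMemoryTokenizer(build_token_table())
print(tokenizer.tokenize("sensitive-value"))
```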

Old technology for new problems

Tokenization is by no means a new technology, but its use in cloud computing scenarios is growing. For organizations looking to protect data used in cloud services, tokenization may offer a reasonable alternative to encryption, or, at a minimum, might serve as another tool in the security toolbox.

About the author:
Dave Shackleford is the owner and principal consultant of Voodoo Security LLC; lead faculty at IANS; and a SANS analyst, senior instructor and course author. He has consulted with hundreds of organizations in the areas of security, regulatory compliance, network architecture and engineering, and is a VMware vExpert with extensive experience designing and configuring secure virtualized infrastructures. He has previously worked as CSO at Configuresoft; as chief technology officer at the Center for Internet Security; and as a security architect, analyst and manager for several Fortune 500 companies. Dave is the author of the Sybex book Virtualization Security: Protecting Virtualized Environments, as well as the co-author of Hands-On Information Security from Course Technology. Recently, he co-authored the first published course on virtualization security for the SANS Institute. He currently serves on the board of directors at the SANS Technology Institute and helps lead the Atlanta chapter of the Cloud Security Alliance.

This was first published in April 2014
