Tokenization and Segmentation
The PCI Security Standards Council recently released its PCI DSS Tokenization Guidelines, which has sparked much discussion about when, or whether, a company should implement a tokenization infrastructure to improve the security of cardholder data in its environment and to reduce the cost of PCI compliance.
Within the world of PCI DSS, “tokenization” refers to the process of replacing the Primary Account Number (PAN), aka “your credit card number,” with another number or character string. This alternate representation of the PAN is called the token. It's called a token because it cannot be used as a credit card number. In fact, in most cases, the token doesn't even have the same format. It might be longer or shorter than the PAN, or it might have letters and special characters in it. But it's a token because it's something that can be traded in for the PAN when needed.
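To make the idea concrete, here is a minimal sketch of a token vault in Python. The class name, the `tok_` prefix, and the use of an in-memory dictionary are all illustrative assumptions; a real tokenization service runs on hardened, access-controlled infrastructure.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens to PANs (illustrative only)."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # The token has no mathematical relationship to the PAN; here it
        # is simply a random string, in a deliberately different format.
        token = "tok_" + secrets.token_hex(12)
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # "Trading in" the token for the PAN -- in practice only systems
        # inside the cardholder data environment may perform this step.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                     # token is not the PAN
assert vault.detokenize(token) == "4111111111111111"   # but it can be traded in
```

Note that the token is only useful to a system that can reach the vault; on its own it reveals nothing about the card number.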
The PCI DSS standards have very clear definitions about what subset of the IT infrastructure is subject to the requirements of PCI DSS. The guiding principle is that PCI-DSS applies to systems that collect, transmit, or process cardholder data, and the systems that are connected to them. In practice, this means all the machines that are in the same network segment as the machines processing cardholder data.
The idea behind tokenization is that there are many elements of a transaction processing environment that may be using the PAN as part of their processing but which don't actually use the PAN for payment processing. They use the PAN for other bookkeeping, order fulfillment, or business analytics purposes. By converting these systems to using a token instead of the PAN, these systems can be eliminated from the scope of the PCI DSS requirements, thereby improving security and reducing assessment and compliance costs.
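As a sketch of what such a converted system looks like, consider a business-analytics job that only ever sees tokens. The record layout and token values below are hypothetical stand-ins for the output of a tokenization service.

```python
from collections import defaultdict

# Hypothetical transaction records: the analytics system stores a card
# token, never the PAN itself.
transactions = [
    {"card_token": "tok_a1b2", "amount": 25.00},
    {"card_token": "tok_a1b2", "amount": 40.00},
    {"card_token": "tok_c3d4", "amount": 15.00},
]

# Per-card analysis still works, because the token is a stable stand-in
# for the PAN -- yet this system never handles cardholder data.
spend_per_card = defaultdict(float)
for t in transactions:
    spend_per_card[t["card_token"]] += t["amount"]

assert spend_per_card["tok_a1b2"] == 65.00
assert spend_per_card["tok_c3d4"] == 15.00
```

The point is that bookkeeping, fulfillment, and analytics workloads rarely need the real card number, only a consistent identifier for it.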
But it's not quite that simple.
There are significant additional considerations that have to be factored into the business case before going with a tokenization strategy, and one of them is a potential show-stopper. A system that is converted so that it requests tokens and uses those tokens for its internal processing is still part of the cardholder data environment. So you've incurred the expense of the conversion to tokens but have not reduced your compliance costs one bit. It may arguably be somewhat more secure, because that system may no longer be writing PANs to disk. But there is no cost-savings benefit to converting that machine to a token infrastructure.
It's only when you propagate the tokens outside of the Cardholder Data Environment, as defined by PCI DSS, that you can begin to realize compliance cost savings.
Network segmentation remains the primary consideration when improving the security of cardholder data, and the fewer machines that have to be in the cardholder data environment, the better. A tokenization infrastructure is not going to reduce the costs of PCI DSS compliance unless it enables machines to be moved out of the network segments that make up the cardholder data environment.