Still Using Stateful Tokens? There is a Better Way!

Lately, with regard to tokenization, we've been seeing . . .

As you follow the many discussions going on around tokenization solutions – stateless, vaultless, vaulted, etc. – let's step back and segment these conversations by the market each solution aims to serve.  Generally speaking, the market for tokenization divides into two main categories: organizations that want to control their own destiny, and those that feel better served letting others provide and manage the service for them.  With this in mind, some key criteria quickly come up that weigh on the technology decision – cost, manageability, transaction throughput and type, scalability and, last but not least, security standards.

This argument has been around for many years . . . to tokenize or not to tokenize

According to Markets and Markets (Tokenization Market, Global Forecast to 2022), “the rise in adoption of payment security trends and increase in the number of security breaches targeting payment transaction processes are among the major factors for the growth of the tokenization market around the globe.”  As organizations move deeper into the age of digital commerce, they inevitably collect cardholder data, and protecting that data is imperative.

The Payment Card Industry Data Security Standard (PCI DSS) provides guidelines mandating protection of cardholder and ACH numbers in systems.  It recommends tokenization as a way to actively remove live data from the infrastructure by replacing it with a “token” that is valueless to hackers.  In the event of a breach, the stolen tokens would be useless and valueless on the market unless the hacker also obtained the token table.  Achieving PCI compliance is a key driver for organizations implementing tokenization, with the added benefit of audit scope reduction.  Organizations sometimes elect to encrypt rather than tokenize, but for the purpose of this discussion let's look at some of the reasons customers use tokenization.

  • Compliance management requirements
  • Reduce audit scope and associated costs
  • Strengthen applications and processes (payment security and user authentication) by creating a surrogate value

If the surrogate value can be used more than once, then all bets are off for reducing risk.  Deriving a token for a single use greatly reduces the target for cyber-criminals.

So what works best?

Many talk about stateful / vaulted tokenization vs stateless / vaultless. Let’s look at the different tokenization types.

Stateful tokens are assigned to PANs through an index or token table – usually a two-column table in which each PAN and its corresponding token are listed across from each other.  The table needs a location where it can be stored and managed, preferably a separate hardware device – a database, a hardware security module (HSM) or another secure location.  To maintain integrity and avoid data loss, the state of the system (specifically, the token vault) must be constantly backed up and/or replicated. Since replication is not instantaneous, multiple copies of the vault can fall out of sync, resulting in cumbersome situations where multiple tokens are issued for the same PAN (often referred to as a “collision”), or multiple PANs are associated with the same token.
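The two-column vault described above can be sketched in a few lines of Python. This is a toy illustration, not any vendor's implementation: the `TokenVault` class and its method names are hypothetical, and a real vault would live in a hardened database or HSM and require the backup and replication just discussed.

```python
import secrets

class TokenVault:
    """Toy sketch of a stateful (vaulted) tokenizer.

    The vault is the two-column PAN <-> token table: one lookup per
    direction. All the state lives here, which is exactly what must
    be backed up, replicated, and kept in sync in production.
    """
    def __init__(self):
        self.pan_to_token = {}   # PAN   -> token
        self.token_to_pan = {}   # token -> PAN

    def tokenize(self, pan: str) -> str:
        if pan in self.pan_to_token:          # reuse the existing mapping
            return self.pan_to_token[pan]
        while True:                           # redraw on collision
            token = "".join(secrets.choice("0123456789")
                            for _ in range(len(pan)))
            if token not in self.token_to_pan:
                break
        self.pan_to_token[pan] = token
        self.token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self.token_to_pan[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert vault.tokenize("4111111111111111") == t   # stable mapping
assert vault.detokenize(t) == "4111111111111111"
```

Note that the collision check (`while True` loop) only works because this single copy of the vault sees every issued token; with two out-of-sync replicas, each could hand out a different token for the same PAN – the collision problem described above.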

Stateful tokenization solutions

  • Offer a centralized target for attacks on the token table and its random number generator
  • Require maintenance and continuous synchronization of token tables
  • Face scalability challenges in large deployments, including possible token collisions
  • Carry cost and sustainability implications (e.g., hardware-centric vs. non-hardware-centric organizations; organization size and maturity, etc.)

With stateless tokenization, multiple token tables are randomly generated one time, covering all possible PANs. This generation uses random numbers and a provably secure method. Every PAN in the numeric range has a token assigned to it for the life of the table(s). Since every PAN is pre-associated with a token, the tables are stateless: they never change. This eliminates the need to synchronize a database across data centers or constantly back it up. (See more in the Coalfire Systems HPE Secure Stateless Tokenization (SST) PCI DSS Technical Assessment white paper.)
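The idea of pre-generated static tables can be illustrated with a deliberately simplified sketch: one small random table per PAN digit position, generated once, so that every possible PAN already has a token with no vault and no state. This is a toy model only – the table structure, sizes, and function names are assumptions for illustration, not the actual SST algorithm, and a per-digit substitution like this would be far too weak for production use.

```python
import random

# One-time generation of static lookup tables (hypothetical sketch).
# A real product would use a FIPS-approved random number generator;
# here a seeded PRNG stands in so the "one-time" tables are
# reproducible for the demo.
rng = random.Random(2022)
PAN_LEN = 16
# One random permutation of the digits 0-9 per PAN position.
TABLES = [rng.sample("0123456789", 10) for _ in range(PAN_LEN)]

def tokenize(pan: str) -> str:
    """Every possible PAN maps through the static tables -
    no vault, no state, nothing to synchronize."""
    return "".join(TABLES[i][int(d)] for i, d in enumerate(pan))

def detokenize(token: str) -> str:
    """Reversible only by a party holding the same tables."""
    return "".join(str(TABLES[i].index(c)) for i, c in enumerate(token))

pan = "4111111111111111"
tok = tokenize(pan)
assert tokenize(pan) == tok      # deterministic: collisions cannot occur
assert detokenize(tok) == pan    # recoverable from the static tables
```

Because the mapping is a pure function of the fixed tables, two data centers holding the same tables always produce the same token for the same PAN – which is why no replication or synchronization is needed.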

The better way:  Secure Stateless Tokenization

What if there were a better way to tackle payment security breaches while enabling a streamlined customer experience and reducing many of the scalability challenges faced in large deployments – especially if you have a Hadoop or Big Data implementation?  We took this into account when we built Secure Stateless Tokenization (SST) into SecureData.

Secure Stateless Tokenization uses a set of static, pre-generated tables containing random numbers created using a FIPS random number generator and based on published and proven academic research. These static tables reside on the SecureData appliances, and are used to consistently produce a unique, random token for each clear text PAN input, resulting in a token that has no relationship to the original PAN. No “token vault” database is required, thus improving the speed, scalability, security, and manageability of the tokenization process.

Secure Stateless Tokenization (SST):

  • Performs tokenization and de-tokenization in memory (RAM), eliminating reads and writes to disk
  • Enables horizontal and vertical scalability for large deployments and data collections (i.e., Cloud, Hadoop, Big Data, IoT)
  • Creates random one-time tokens with no databases, no data synchronization, no collisions, and high performance
  • Provides an enhanced approach that maximizes the speed, scalability, security and manageability of the tokenization process
  • Provides a unified back-end platform that enables growth not just in tokenization but also expansion to protect PII data
  • Avoids the challenges of incorrect token mapping
  • Eliminates the cost (hardware, licenses, etc.) and complexity of managing token tables, which can become overwhelming for organizations that are not hardware-centric

Is encryption a better fit?

The tokenization vs. encryption argument has raged for many years, and vendors often raise it when they aren't succeeding with their own tokenization solutions.  Encryption has its place in data protection, but if you are required to meet PCI DSS mandates, tokenization is the better solution. A thought for a future discussion.
