Another standard for tokenization

It looks like a tokenization vendor has taken it upon itself to organize a group of vendors to create a standard for tokenization and its secure implementation. A standard that describes tokenization and how to do it securely is definitely a good idea, but I'm not sure that a vendor-led group that's not part of any recognized standards organization is the best way to create one.
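
For readers who haven't run into the technique, here's a rough sketch of what basic, vault-based tokenization does. The names are hypothetical (TokenVault is made up for illustration; it isn't any vendor's product or anything a standard defines): a sensitive value such as a card number is replaced by a random surrogate, and the only way back to the original is a lookup in a protected vault.

```python
import secrets

# A deliberately tiny, in-memory vault. Everything here is a sketch of
# the idea, not any vendor's product and not anything X9.119 specifies.
class TokenVault:
    def __init__(self) -> None:
        self._token_to_pan: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        """Replace a card number with a random surrogate of the same length."""
        while True:
            # Random digits, so the token has no mathematical relationship
            # to the original number; retry on the (unlikely) collision.
            token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
            if token not in self._token_to_pan:
                self._token_to_pan[token] = pan
                return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only the vault can do this."""
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. 2093847561029384, useless without the vault
print(vault.detokenize(token))  # 4111111111111111
```

The security of a scheme like this lives almost entirely in how well that vault is protected, which is exactly the sort of thing a standard for secure implementation has to spell out.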

A group organized by a single vendor probably won't have decision-making processes as transparent as those of established standards organizations, and that's a problem. All standards organizations have their flaws, but you at least know the process by which they create and review their standards, and those processes almost always try to keep any single vendor from unduly influencing the content of a particular standard. With vendor-led groups, however, there's often little attempt at transparency, and there may even be little incentive to act on the feedback the group gets while its standard is being developed. The result is often a standard that's less useful than it could be.

I also have to wonder why a vendor-led group is necessary in this particular case. The Accredited Standards Committee X9, the organization that creates American National Standards for the financial services industry, is already working on a standard (X9.119) that will define requirements for the secure implementation of tokenization. X9.119 will only reflect the requirements of the financial services industry, but because that industry is the main user of tokenization, that really shouldn't be a problem. I'm fairly confident that an organization that specializes in financial services standards has a better chance of creating a standard that reflects the industry's needs than a group of tokenization vendors does.

I'm also a bit puzzled that many tokenization vendors don't seem to participate in X9 at all. Voltage has been an X9 member for quite a while, and both the X9 community and Voltage have benefited from it: the other members of X9 have essentially received lots of free cryptography and security consulting from Voltage, and in return Voltage has received lots of useful information about exactly what its customers need.

I don't know why tokenization vendors don't actively participate in X9, but the fact that they don't makes me wonder how well they really understand their customers' requirements, and how well they'll be able to create a useful standard for the secure implementation of tokenization.

I'm fairly sure that Voltage will end up participating in this vendor-led effort to define a standard for secure implementations of tokenization, but based on what I know now, I'm not sure the standard this group creates will be very useful. In this particular case, I hope that I'm wrong. That doesn't happen very often, does it?

Do we really need another tokenization standard in addition to X9.119?

  • Marc

    Defining standards around token generation, use, and storage is a good thing. But I agree that a vendor organization will be of little use unless it drives those standards into the X9.119 forum.
    What I’m not totally clear on is the goal of the standards in the first place. If the goals are around generation and use, that’s good. Establish a security baseline and best practices around tokenization…
    Recently I’ve heard token vendors talking about an interoperability standard. From my perspective, tokens are a data devaluation tool. Interoperable tokens are more valuable than tokens that are not, and increasing the value of a token is not what we want…we want chunks of data that aren’t worth anything outside of some narrow context (the sketch after this comment illustrates the idea).
    What we don’t want is a way for attackers to start collecting tokens and using them for bad things. Sure, the token vendors say that tokens can only be used to do x, y, and z…they can’t be used to originate transactions. But that still leaves many cases where they can be used…and interoperability expands those scenarios.
    Standards are good in this space – there’s a lack of them now. I’m not sure an interoperability standard is such a good idea, though.
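
To make that "narrow context" point concrete, here's a rough sketch of tokens that are deliberately scoped to a single context. The scheme, the context keys, and the scoped_token function are all hypothetical (not from any vendor or from X9.119): the same card number yields unrelated tokens for different merchants, so a token stolen from one context is worthless in any other.

```python
import hashlib
import hmac

# Hypothetical context-scoped tokens, sketched with HMAC so the example is
# deterministic and self-contained; a real system would more likely use
# random tokens and per-context vaults. Each context (say, a merchant)
# gets its own key, so the same card number maps to unrelated tokens.
CONTEXT_KEYS = {
    "merchant-a": b"secret-key-for-merchant-a",
    "merchant-b": b"secret-key-for-merchant-b",
}

def scoped_token(pan: str, context: str) -> str:
    digest = hmac.new(CONTEXT_KEYS[context], pan.encode(), hashlib.sha256).hexdigest()
    # Map the first 16 hex characters to digits so the token looks like
    # the number it replaces.
    return "".join(str(int(c, 16) % 10) for c in digest[:16])

pan = "4111111111111111"
print(scoped_token(pan, "merchant-a"))  # one token for this PAN...
print(scoped_token(pan, "merchant-b"))  # ...an unrelated one in the other context
```

A token lifted from merchant-a's systems tells an attacker nothing usable at merchant-b. That's the devaluation Marc is describing, and it's exactly the boundary that interoperable tokens would erase.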
