Tokenization and encryption are the two technologies most commonly used to protect sensitive cardholder data, as the PCI DSS requires. Encryption is well defined and well understood; tokenization is not. So what exactly is tokenization? Here’s my definition. You can read more about it, including why I believe this definition makes sense and how it compares to the definition of encryption, in "Defining Tokenization and the Security It Provides" in this month’s ISSA Journal.
A tokenization scheme comprises two stateful, deterministic algorithms: tokenize and detokenize. These operate on two kinds of strings, called plaintexts and tokens. The tokenize algorithm produces a token from a plaintext, and the detokenize algorithm recovers a plaintext from a token that the tokenize algorithm has already created.
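To make the definition concrete, here is a minimal sketch of one way such a scheme could work: a lookup table (the state) maps each plaintext to a randomly generated token. This is my own illustration, not any particular product’s API; the class and method names are hypothetical.

```python
import secrets

class TokenizationScheme:
    """A vault-style tokenization scheme: the plaintext/token table is the state."""

    def __init__(self):
        self._token_for = {}   # plaintext -> token
        self._plain_for = {}   # token -> plaintext

    def tokenize(self, plaintext: str) -> str:
        # Deterministic: the same plaintext always yields the same token.
        if plaintext in self._token_for:
            return self._token_for[plaintext]
        # The token is drawn at random, independently of the plaintext.
        token = secrets.token_hex(8)
        while token in self._plain_for:  # avoid the (unlikely) collision
            token = secrets.token_hex(8)
        self._token_for[plaintext] = token
        self._plain_for[token] = plaintext
        return token

    def detokenize(self, token: str) -> str:
        # Only defined for tokens that tokenize previously created.
        return self._plain_for[token]
```

For example:

```python
scheme = TokenizationScheme()
t = scheme.tokenize("4111111111111111")
assert scheme.detokenize(t) == "4111111111111111"
assert scheme.tokenize("4111111111111111") == t  # deterministic
```

Note that both algorithms are stateful (they consult the shared table) and deterministic (given the same state, the same input always produces the same output), matching the definition above.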
A secure tokenization scheme is one in which the mutual information between a plaintext and the token that the tokenize algorithm creates from it is zero.
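In information-theoretic terms, if we write $P$ for the random variable modeling the plaintext and $T$ for the token created from it (my notation here, for illustration), the security condition is

$$I(P;T) = H(P) - H(P \mid T) = 0,$$

or equivalently $H(P \mid T) = H(P)$: observing a token leaves an adversary’s uncertainty about the plaintext unchanged. A table of independently random tokens, like the sketch above, satisfies this for an adversary who cannot read the table, because the token’s value is chosen without reference to the plaintext.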