IT Glossary



Tokenization

Tokenization is the process of replacing a piece of sensitive data, such as a credit card number, with a surrogate value known as a token. The original sensitive data generally still needs to be stored securely in one centralized location for subsequent reference, with strong protections around it. The security of a tokenization approach therefore depends on the security of those stored values and on the algorithm and process used to create the surrogate value and map it back to the original.
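The mapping described above can be sketched as a minimal token vault. This is a hypothetical illustration only: the class name, the use of random hex tokens, and the in-memory dictionaries are assumptions for clarity, not a production design, since a real vault would live in a hardened, access-controlled data store.

```python
import secrets

class TokenVault:
    """Minimal sketch of a tokenization vault (illustrative, not production code)."""

    def __init__(self):
        # In practice these mappings must live in one centralized,
        # strongly protected store, as the definition above notes.
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse an existing token so the same input always maps to one surrogate.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # A random token carries no information about the original value,
        # unlike encryption, where the ciphertext is derived from it.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with access to the vault can recover the original value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Because the token is generated randomly rather than derived from the card number, compromising a system that holds only tokens reveals nothing about the underlying data; the vault itself becomes the single point that must be protected.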

