Gartner Glossary

Tokenization

Tokenization refers to a process by which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The sensitive data still generally needs to be stored securely at one centralized location for subsequent reference and requires strong protections around it. The security of a tokenization approach depends on the security of the sensitive values and the algorithm and process used to create the surrogate value and map it back to the original value.
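The mechanism described above can be sketched in a few lines of Python. This is a minimal illustration, assuming a simple in-memory vault: the `TokenVault` class and its methods are hypothetical names, and a production system would add encryption at rest, access control, auditing, and durable storage.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault (not production-grade):
    maps surrogate tokens back to the original sensitive values."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # Generate a random surrogate with no mathematical
        # relationship to the original value.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Look the surrogate up in the centralized store to
        # recover the original value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # sample test card number
original = vault.detokenize(token)
```

Because the token is generated randomly rather than derived from the card number, compromising the token alone reveals nothing; the security of the scheme rests on protecting the vault itself, as the definition notes.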
