Tokenization

Tokenization is a process in which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The original sensitive data generally still must be stored in one centralized location for subsequent reference, with strong protections around it. The security of a tokenization approach therefore depends on the security of those stored sensitive values and on the algorithm and process used to create the surrogate value and map it back to the original.
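To make the mechanics concrete, here is a minimal sketch of vault-style tokenization in Python. The TokenVault class, its in-memory dictionaries, and the 16-byte random tokens are illustrative assumptions, not any vendor's actual API; a production vault would keep the mapping in hardened, encrypted, access-controlled storage.

```python
import secrets


class TokenVault:
    """Minimal in-memory token vault (illustrative only): maps random
    surrogate tokens to original sensitive values. A real deployment
    would replace these dicts with hardened, access-controlled storage."""

    def __init__(self):
        self._token_to_value = {}  # the centralized store of sensitive values
        self._value_to_token = {}  # lets repeated values reuse the same token

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random surrogate with no mathematical relationship
        # to the original value, so the token alone reveals nothing.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Map the surrogate back to the original sensitive value.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # example card number
print(token)                     # surrogate that is safe to pass around
print(vault.detokenize(token))   # original value, recovered from the vault
```

Because the token is generated randomly rather than derived from the card number, an attacker who obtains only tokens learns nothing about the underlying data; the vault itself is the component that must be protected, which is why the definition above stresses the security of the stored values and of the mapping process.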
