IT Glossary


Tokenization refers to a process by which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The original sensitive data generally still needs to be stored securely in one centralized location for subsequent reference, and that store requires strong protections. The security of a tokenization approach therefore depends on the security of the stored sensitive values and on the algorithm and process used to create the surrogate value and map it back to the original value.
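The process described above can be sketched in a few lines. This is a minimal illustrative example, not a production design: the class and method names (`TokenVault`, `tokenize`, `detokenize`) are hypothetical, and a real vault would encrypt stored values and enforce access controls. The key property shown is that the token is random, so it reveals nothing about the original value, and only the centralized vault can map it back.

```python
import secrets


class TokenVault:
    """Illustrative token vault: maps random surrogate tokens back to
    the sensitive values they replace. In practice the vault's contents
    would be encrypted at rest and tightly access-controlled."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # The surrogate is cryptographically random, so it carries no
        # information about the original value.
        token = secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the centralized vault can recover
        # the original value from a token.
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
original = vault.detokenize(token)
```

Downstream systems can store and pass around `token` freely; compromise of those systems exposes only surrogate values, shifting the protection burden onto the single vault.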
