Tokenization is the practice of replacing sensitive data with a non-meaningful surrogate value, called a token. Whether the tokenized data represents a credit card number, Social Security number, or other confidential information, the goal is to mitigate the risk of accidental or unauthorized access and use. This Fotrex paper evaluates the Vormetric Token Server's use cases and its deployment in a typical corporate environment.
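To make the concept concrete, the following is a minimal illustrative sketch of vault-based tokenization, not the Vormetric Token Server's actual API: a random token stands in for the sensitive value, and the mapping lives only in a protected store. The `TokenVault` class and its methods are hypothetical names introduced here for illustration.

```python
import secrets


class TokenVault:
    """Illustrative in-memory token vault (assumption: a real deployment
    would use a hardened, access-controlled token server, not a dict)."""

    def __init__(self) -> None:
        # Maps token -> original sensitive value.
        self._store: dict[str, str] = {}

    def tokenize(self, sensitive: str) -> str:
        # Generate a random token with no mathematical relationship
        # to the original value, so the token alone reveals nothing.
        token = secrets.token_hex(8)
        self._store[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        # Only systems authorized to query the vault can recover the value.
        return self._store[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"          # token carries no meaning
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Downstream systems can store and process the token freely; only the vault (and the systems allowed to query it) can map it back to the original data, which is what confines the compliance and breach-exposure scope.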