Tokenization (data security)

Tokenization is the process of replacing sensitive data with non-sensitive substitutes, called tokens, that preserve the essential characteristics of the original data, such as its format and length. In the context of data security, tokenization differs from encryption in that it does not apply a mathematical transformation to the data: a token is typically a randomly generated value whose mapping to the original is held in a secure lookup store (a token vault), so the token cannot be reversed without access to that mapping.
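
To make the contrast with encryption concrete, here is a minimal sketch of a vault-based tokenizer in Python. The `TokenVault` class and its in-memory dictionaries are illustrative assumptions, not a real library; a production system would back the vault with a hardened, access-controlled store. The point it demonstrates is that the token is pure random data with no mathematical relationship to the input, and the original value is recoverable only through the vault.

```python
import secrets


class TokenVault:
    """Illustrative in-memory token vault (assumption: a dict stands in
    for a secured, access-controlled data store)."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a random, format-preserving token."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random digit string of the same length as the input,
        # so the token preserves the original's format (digits assumed here).
        token = "".join(secrets.choice("0123456789") for _ in value)
        while token in self._token_to_value:  # avoid rare collisions
            token = "".join(secrets.choice("0123456789") for _ in value)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; possible only with vault access."""
        return self._token_to_value[token]


vault = TokenVault()
pan = "4111111111111111"        # example card number
token = vault.tokenize(pan)     # e.g. "8302419475620137"
assert vault.detokenize(token) == pan
```

Because the token is drawn at random rather than derived from the input, stealing tokens alone reveals nothing about the underlying data; an attacker would also need to compromise the vault itself.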
