Tokenization is the process of replacing sensitive data with unique, non-sensitive substitutes called tokens. A token has no exploitable value on its own; the original data can be recovered only through a secured mapping. This protects data privacy and simplifies the processing, storage and transmission of sensitive information such as credit card numbers or medical records.
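The idea can be sketched in a few lines of Python. This is a minimal illustration, not a production design: the `TokenVault` class and its method names are hypothetical, and a real system would keep the token-to-value mapping in a hardened, access-controlled store rather than an in-memory dictionary.

```python
import secrets


class TokenVault:
    """Hypothetical in-memory vault mapping tokens back to sensitive values."""

    def __init__(self):
        # token -> original sensitive value (a real vault would be secured)
        self._vault = {}

    def tokenize(self, value: str) -> str:
        # Replace the sensitive value with a random, meaningless token.
        token = secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with vault access can recover the original value.
        return self._vault[token]


vault = TokenVault()
card = "4111 1111 1111 1111"
token = vault.tokenize(card)

# Downstream systems store and transmit only the token.
print(token != card)                      # True
print(vault.detokenize(token) == card)    # True
```

Because the token is generated randomly rather than derived from the data, it cannot be reversed without the vault, which is what distinguishes tokenization from encryption.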