  1. Tokenization (data security) - Wikipedia

    To protect data over its full lifecycle, tokenization is often combined with end-to-end encryption to secure data in transit to the tokenization system or service, with a token replacing the original data on return.

  2. What is tokenization? | McKinsey

    Jul 25, 2024 · Tokenization is the process of creating a digital representation of a real thing. Tokenization can also be used to protect sensitive data or to efficiently process large amounts of data.

  3. What is tokenization? - IBM

    In data security, tokenization is the process of converting sensitive data into a nonsensitive digital replacement, called a token, that maps back to the original. Tokenization can help protect sensitive …

  4. Explainer: What is tokenization and is it crypto's next big thing?

    Jul 23, 2025 · In the crypto context, tokenization generally refers to the process of turning financial assets - such as bank deposits, stocks, bonds, funds and even real estate - into crypto assets. This means creating a record on digital...

  5. How Does Tokenization Work? Explained with Examples - Spiceworks

    Mar 28, 2023 · Tokenization is the process of hiding the contents of a dataset by replacing sensitive or private elements with a series of non-sensitive, randomly generated elements (called a token). …
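
    Several of the results above describe the same basic mechanism: a sensitive value is swapped for a randomly generated token, and the original value is kept only in a separate lookup store, often called a token vault. Below is a minimal Python sketch of that idea; the class and names are illustrative only and are not taken from any of the linked sources.

        import secrets

        class TokenVault:
            """Toy token vault: keeps the mapping between random tokens and originals."""

            def __init__(self):
                self._token_to_value = {}   # held apart from application data
                self._value_to_token = {}   # reuse one token per repeated value

            def tokenize(self, value: str) -> str:
                """Replace a sensitive value with a random token."""
                if value in self._value_to_token:
                    return self._value_to_token[value]
                token = "tok_" + secrets.token_hex(8)   # no mathematical link to the input
                self._token_to_value[token] = value
                self._value_to_token[value] = token
                return token

            def detokenize(self, token: str) -> str:
                """Only the vault can map a token back to the original value."""
                return self._token_to_value[token]

        vault = TokenVault()
        token = vault.tokenize("4111 1111 1111 1111")    # e.g. a card number (PAN)
        print(token)                     # something like 'tok_9f2c1a0b7d3e4f55'
        print(vault.detokenize(token))   # the original value, via the vault only

    Downstream systems can store and pass around the token; only the system holding the vault can recover the original value.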

  6. What is data tokenization? The different types, and key use cases

    Apr 16, 2025 · Data tokenization as a broad term is the process of replacing raw data with a digital representation. In data security, tokenization replaces sensitive data with randomized, nonsensitive …

  7. What is Data Tokenization? [Examples, Benefits & Real-Time …

    Jul 9, 2025 · Data tokenization is a method of protecting sensitive information by replacing it with a non-sensitive equivalent — called a token — that has no exploitable meaning or value outside of its …

  8. Data Tokenization - A Complete Guide - ALTR

    Aug 11, 2025 · Tokenization is a data security technique that replaces sensitive information—such as personally identifiable information (PII), payment card numbers, or health records—with a non …

  9. Tokenization takes the lead in the fight for data security

    Dec 15, 2025 · Tokenization is emerging as a cornerstone of modern data security, helping businesses separate the value of their data from its risk. During this VB in Conversation, Ravi Raghu, president, …

  10. Tokenization Explained: What Is Tokenization & Why Use It? - Okta

    Sep 2, 2024 · Tokenization involves protecting sensitive, private information by replacing it with a scrambled substitute called a token. Tokens can't be unscrambled and returned to their original state.

  11. What Is Tokenization? The Secret to Safer Transactions

    Feb 21, 2025 · Tokenization involves replacing sensitive data, such as credit card numbers or personal information, with randomly generated tokens. These tokens hold no intrinsic value, making them …

  12. What is Tokenization? - OpenText

    Tokenization is a process by which PANs, PHI, PII, and other sensitive data elements are replaced by surrogate values, or tokens. Tokenization is really a form of encryption, but the two terms are …

  13. What is Data Tokenization? [Examples & Benefits] | Airbyte

    Sep 10, 2025 · The fundamental principle behind tokenization lies in data substitution rather than data transformation. Unlike encryption, which mathematically converts data into ciphertext, tokenization …
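
    One way to see the substitution-versus-transformation distinction this result draws: ciphertext is derived mathematically from the input, so anyone holding the key can reverse it, whereas a token is a random stand-in whose only link to the original is a stored mapping. A rough sketch, using a deliberately toy XOR "cipher" purely for contrast (not a real encryption scheme, and not drawn from any of these sources):

        import secrets

        def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
            # Toy XOR transform for illustration only: the output is a
            # mathematical function of the plaintext and the key.
            return bytes(p ^ k for p, k in zip(plaintext, key))

        def toy_decrypt(ciphertext: bytes, key: bytes) -> bytes:
            return toy_encrypt(ciphertext, key)   # XOR is its own inverse

        pan = b"4111111111111111"
        key = secrets.token_bytes(len(pan))

        ciphertext = toy_encrypt(pan, key)
        assert toy_decrypt(ciphertext, key) == pan   # reversible with the key alone

        # Tokenization: the replacement is random, so there is nothing to reverse;
        # recovering the original requires the stored mapping (the vault).
        vault = {}
        token = "tok_" + secrets.token_hex(8)
        vault[token] = pan
        assert vault[token] == pan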

  14. What is Tokenization? - GeeksforGeeks

    6 days ago · Tokenization can be likened to teaching someone a new language by starting with the alphabet, then moving on to syllables, and finally to complete words and sentences.
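
    This result appears to use "tokenization" in the text-processing (NLP) sense, where raw text is broken into smaller units called tokens, such as characters, subwords, or words, rather than the data-security sense used by most of the other results. A minimal illustration of that sense:

        # Word-level tokenization: split a sentence into word tokens.
        sentence = "Tokenization splits text into smaller units."
        word_tokens = sentence.lower().replace(".", "").split()
        print(word_tokens)
        # ['tokenization', 'splits', 'text', 'into', 'smaller', 'units']

        # Character-level tokenization, the "alphabet first" end of the analogy.
        char_tokens = list("token")
        print(char_tokens)   # ['t', 'o', 'k', 'e', 'n']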

  15. What Is Data Tokenization and How Does It Work?

    Apr 14, 2025 · Data tokenization is a data protection method that replaces sensitive information with a unique, non-sensitive substitute known as a token. The token has no meaningful value or relation to …

  16. Asset tokenization - Wikipedia

    Asset tokenization is the transcription of an asset into a digital token on a blockchain or a digital platform with similar properties. [1] Most tokenized assets to date are stablecoins representing a claim on a …

  17. Tokenization: Definition, Benefits, and Use Cases Explained

    Jul 17, 2024 · Tokenization is the process of replacing sensitive data with unique identifiers to enhance security. This process ensures that sensitive information, such as credit card numbers or personal …

  18. Data Tokenization: Secure Sensitive Information Effectively

    Tokenization is the process of removing sensitive information from your internal system – where it’s vulnerable to hackers – and replacing it with a one-of-a-kind token that is unreadable, even if hackers …

  19. What Is Data Tokenization? Key Concepts and Benefits

    Jan 6, 2025 · Data tokenization is a security method for replacing sensitive data with non-sensitive, unique tokens. The original data is stored securely in a separate database, and the tokens are used …

  20. Tokenized financial assets: From pilot to scale | McKinsey

    Jun 20, 2024 · Tokenization, the process of creating a unique digital representation of an asset on a blockchain network, has reached a tipping point after many years of promise and experimentation. …
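
    The asset-tokenization results describe a different use of the word: representing ownership of a real-world or financial asset as digital tokens, typically on a blockchain, so that it can be divided and transferred. The sketch below is a deliberately simplified, blockchain-free view of the bookkeeping idea (fractional ownership tracked as token balances); it does not reflect how any particular platform works.

        # Toy ledger: one asset split into fungible "shares" tracked as balances.
        ASSET = "Building-42"           # hypothetical real-world asset
        TOTAL_TOKENS = 1_000_000        # each token = 1/1,000,000 of the asset

        balances = {"issuer": TOTAL_TOKENS}    # issuer starts with all tokens

        def transfer(sender: str, recipient: str, amount: int) -> None:
            """Move tokens between owners; on-chain this would be a transaction."""
            if balances.get(sender, 0) < amount:
                raise ValueError("insufficient balance")
            balances[sender] -= amount
            balances[recipient] = balances.get(recipient, 0) + amount

        transfer("issuer", "alice", 2_500)   # alice now owns 0.25% of the asset
        print(balances)                      # {'issuer': 997500, 'alice': 2500}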

  21. Tokenization could revolutionize access to alternatives, unlocking revenues for managers and distributors while enabling higher-quality portfolios for wealthy individuals.

  22. Tokenization: Key Factors to Consider and Countries Leading Global ...

    Dec 14, 2025 · Explore key factors for tokenization of assets and see which countries are leading global adoption of tokenized assets.

  23. What is Tokenization? | IXOPAY

    Oct 24, 2025 · Tokenization is the process of exchanging sensitive data for nonsensitive data called “tokens” that can be used in a database or internal system without bringing it into PCI scope.

  24. What is Tokenization? - TechTarget

    Feb 7, 2023 · Tokenization is the process of replacing sensitive data with unique identification symbols that retain all the essential information about the data without compromising its security.

  25. RWA Tokenization Will Reinvigorate DeFi In 2026 | IBTimes

    5 days ago · RWA tokenization is injecting fresh liquidity into DeFi, transforming idle capital and bridging traditional assets with onchain finance in 2026.

  26. Indian MP Pushes Tokenization Bill to Democratize Investment

    6 days ago · A new parliamentary push seeks to use tokenization to open high-value assets to India’s middle class through blockchain-based ownership.

  27. What Is Tokenization? - Akamai

    In the world of data security and payment processing, tokenization is the practice of protecting sensitive data by replacing it with a token — a unique and nonsensitive string of symbols randomly generated …