Data tokenization

Strengthen data security and reduce regulatory compliance burdens

Data tokenization solutions address a critical challenge faced by most organizations today—protecting sensitive and regulated data. Evolving regulations and extensive reputational and monetary risks from data breaches set a high bar for organizations to keep their data secure. Tokenization can help balance data accessibility with protection from unauthorized access, which is especially critical for payment card data and other types of regulated data.

OpenText™ Protect™ is a data tokenization solution that helps organizations protect their sensitive data by combining the security of the OpenText Cloud platform with OpenText tokenization experts.

What is data tokenization?

Data tokenization solutions substitute sensitive data elements with non-sensitive equivalents, called tokens, to protect data. Unlike encrypted data, tokenized data has no mathematical relation to the original data. Tokenization is typically used to protect sensitive data, such as credit card information (PCI), personally identifiable information (PII) and personal health information (PHI).

Tokenization replaces sensitive data throughout the enterprise with data surrogates that have no value on their own, reducing the footprint of sensitive data in enterprise systems and greatly decreasing the risk of losing sensitive data in a breach event.
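The vault-based substitution described above can be sketched in a few lines of Python. This is an illustrative toy, not OpenText Protect's implementation: a real solution encrypts the vault, enforces access controls and runs as a managed service.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens to sensitive values.

    Illustrative sketch of vault-based tokenization; a production vault
    would be encrypted and access-controlled.
    """

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Tokens are random, so they have no mathematical relation
        # to the original data (unlike ciphertext).
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with vault access can recover the original value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
assert token != "4111 1111 1111 1111"          # surrogate has no value on its own
assert vault.detokenize(token) == "4111 1111 1111 1111"
```

Downstream systems store and exchange only the token; a breach of those systems exposes surrogates with no exploitable value.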

OpenText Protect data tokenization overview

Protect is a cloud-based data tokenization solution. It enables tokenized data to be used for analytics and business insights while drastically improving data protection. The token-value pairs and sensitive data are encrypted and stored in a central vault that persists on the OpenText platform.

OpenText Protect data tokenization features

Tokenizes any data values

Stores any sensitive data values, such as credit card or Social Security numbers, in the OpenText encrypted cloud to narrow the scope of systems, applications and processes that need to be audited for compliance with standards, such as PCI DSS or HIPAA, greatly streamlining the process.

Preserves data format and length

Enables data analysis and other mission-critical business functions as usual with format-preserving tokens that maintain the length and format of the original data. Ensures business processes are not disrupted by tokenization.
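The idea of a format-preserving token can be illustrated with a simple sketch (this is an assumption about the general technique, not OpenText's algorithm): each letter or digit is replaced with a random character of the same class, while separators, length and layout are kept intact, so existing validation and storage schemas still accept the token.

```python
import secrets
import string

def format_preserving_token(value: str) -> str:
    # Replace each digit/letter with a random character of the same
    # class; keep dashes, spaces, etc., so length and format survive.
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(secrets.choice(string.digits))
        elif ch.isalpha():
            pool = string.ascii_uppercase if ch.isupper() else string.ascii_lowercase
            out.append(secrets.choice(pool))
        else:
            out.append(ch)  # preserve separators unchanged
    return "".join(out)

ssn = "123-45-6789"
tok = format_preserving_token(ssn)
assert len(tok) == len(ssn)              # same length
assert tok[3] == "-" and tok[6] == "-"   # same format
```

Because the token "looks like" the original, databases, forms and reports built for the real data keep working without schema changes.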

Supports referential integrity of data

Produces tokens that are stable over time and can be used as unique identifiers in place of sensitive data values.
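One common way to obtain stable tokens—shown here as an assumption, not necessarily how OpenText Protect does it—is a keyed hash: the same input always yields the same token, so tokens can act as join keys and unique identifiers across datasets. Vaulted solutions achieve the same stability by returning the stored token on repeat lookups.

```python
import hashlib
import hmac

# Assumption for illustration: the key would be managed by the
# tokenization service, never by client applications.
SECRET_KEY = b"demo-key"

def stable_token(value: str) -> str:
    # Keyed hash is deterministic: identical inputs produce identical
    # tokens, preserving referential integrity across systems.
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

a = stable_token("4111 1111 1111 1111")
b = stable_token("4111 1111 1111 1111")
assert a == b                                     # stable over time
assert stable_token("5500 0000 0000 0004") != a   # distinct values, distinct tokens
```

Analytics jobs can then group, join and count on the token column exactly as they would on the sensitive value, without ever touching the real data.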

Expertise and best practices

Leverages OpenText expert data security resources and proven processes to handle the day-to-day complexities of tokenization and encryption operations and frees IT resources to focus on other strategic initiatives.

OpenText Protect data tokenization benefits

Reduce risk

Replace PCI, PHI, PII and other types of sensitive data with surrogate data, vastly reducing the points of potential data compromise and breach.

Lower costs

Leverage the OpenText Cloud to eliminate the up-front costs associated with implementing an on-premises tokenization process.

Improve flexibility and scalability to meet business goals

Accelerate transaction times and dramatically improve scalability compared to traditional approaches with on-premises relational databases.

Partner for success

Leverage the OpenText experts for an end-to-end solution, from designing a tailored architecture to managing the solution.
