NuBridges Inc. has released an updated version of its enterprise tokenisation product, seeking to eliminate a key pain point for large companies implementing tokenisation: coordinating the issuing of tokens among multiple data centres.
Tokenisation has drawn interest from merchants in particular because the process replaces payment card data with a unique substitute value, or token, after a transaction authorization takes place. It supplements data encryption and in some cases limits the networks and systems that fall within the scope of the often onerous Payment Card Industry Data Security Standard (PCI DSS).
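The basic mechanism can be sketched as follows. This is a minimal, hypothetical illustration of vault-based tokenisation in general; Token Manager's actual API and token format are not public, so the class and method names here are illustrative only.

```python
import secrets

class TokenVault:
    """Minimal sketch of a tokenisation vault (illustrative, not NuBridges' design)."""

    def __init__(self):
        self._token_to_value = {}   # the "vault": tokens map back to real card data
        self._value_to_token = {}   # reverse index so a repeated value reuses its token

    def tokenise(self, pan: str) -> str:
        """Replace a primary account number with a random surrogate token."""
        if pan in self._value_to_token:
            return self._value_to_token[pan]
        # Keep the last four digits, a common format convention for payment tokens
        token = secrets.token_hex(6) + pan[-4:]
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenise(self, token: str) -> str:
        """Look up the original value; only the vault can do this."""
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenise("4111111111111111")
assert t != "4111111111111111"                      # downstream systems see only the token
assert vault.detokenise(t) == "4111111111111111"    # the vault alone can reverse it
```

Because the token is a random surrogate rather than a cipher of the card number, systems that store only tokens hold nothing cryptographically reversible, which is what can take them out of PCI DSS scope.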
Gary Palgon, vice president of product management for NuBridges, said large enterprises that process payment transactions from multiple data centres using tokenisation have struggled with collisions, in which two different values map back to the same token. This can occur when two data centres independently generate identical tokens for different values in separate transactions.
Token Manager 2.0 enables up to 10 data centres to generate unique tokens simultaneously. It ensures each data centre uses its own set of tokens, and tracks the issuing of all tokens via a centralised data vault.
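One simple way to guarantee that concurrently issued tokens never collide is to give each data centre a disjoint namespace, for example by embedding a centre identifier in every token it mints. The sketch below illustrates that idea under stated assumptions; NuBridges has not published how Token Manager 2.0 partitions its token sets, so this is one plausible approach, not the product's actual scheme.

```python
import secrets

class DataCentreTokeniser:
    """Illustrative per-data-centre token generator with a disjoint namespace."""

    def __init__(self, centre_id: int):
        # The article says the product supports up to 10 data centres
        assert 0 <= centre_id < 10
        self.centre_id = centre_id

    def new_token(self) -> str:
        # Prefixing with the centre ID means two centres can issue tokens
        # concurrently and still never produce the same token string.
        return f"{self.centre_id}-{secrets.token_hex(8)}"

dc_a, dc_b = DataCentreTokeniser(0), DataCentreTokeniser(1)
batch = {dc_a.new_token() for _ in range(1000)} | {dc_b.new_token() for _ in range(1000)}
assert len(batch) == 2000   # no cross-centre collisions
```

The centralised vault the article describes would then record which centre issued each token, which is also what makes the scheme useful for disaster recovery: any surviving centre can resolve tokens issued by a failed one.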
In addition to ensuring a scalable tokenisation process for a large infrastructure, Palgon said the feature supports disaster recovery and business continuity and preserves centralised key management.
A new feature aimed at personally identifiable information (PII) and protected health information (PHI) adds configuration options to break the 1:1 relationship between a token and its actual value. By default, all instances of a value are replaced by the same token, which Palgon said in some instances might make it possible to figure out a token's value.
"For instance, if I'm a DBA, and I know my own salary, I can query the database, find the token for my salary, and then I can look for that token elsewhere to know who else has the same salary," Palgon said. "Now we offer the opportunity to further obfuscate, so that there might be multiple tokens to represent the same figure."
He added that the new product allows for field-by-field configuration, so that highly sensitive information like dates of birth or health data can be given unique tokens, but less sensitive types of data can reuse tokens to increase efficiency of the system.
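The policy Palgon describes can be sketched as a per-field setting: highly sensitive fields get a fresh token for every occurrence of a value, defeating the frequency-analysis attack in his salary example, while less sensitive fields reuse one token per value. The field names and policy mechanism below are hypothetical illustrations, not the product's configuration interface.

```python
import secrets

# Hypothetical field-by-field policy: "unique" breaks the 1:1 token/value
# relationship; "reuse" keeps one token per value for efficiency.
FIELD_POLICY = {
    "date_of_birth": "unique",
    "zip_code": "reuse",
}

class FieldTokeniser:
    """Illustrative tokeniser that applies a per-field uniqueness policy."""

    def __init__(self):
        self._vault = {}            # token -> (field, value)
        self._reuse_index = {}      # (field, value) -> token, for "reuse" fields only

    def tokenise(self, field: str, value: str) -> str:
        if FIELD_POLICY[field] == "reuse" and (field, value) in self._reuse_index:
            return self._reuse_index[(field, value)]
        token = secrets.token_hex(8)
        self._vault[token] = (field, value)
        if FIELD_POLICY[field] == "reuse":
            self._reuse_index[(field, value)] = token
        return token

ft = FieldTokeniser()
# Two people with the same birth date get different tokens, so a database
# insider can't match their own token against other rows...
assert ft.tokenise("date_of_birth", "1980-01-01") != ft.tokenise("date_of_birth", "1980-01-01")
# ...while a repeated, less sensitive value maps to a single shared token.
assert ft.tokenise("zip_code", "30309") == ft.tokenise("zip_code", "30309")
```

The trade-off is vault growth: "unique" fields create one vault entry per occurrence rather than per value, which is why the article notes that reusing tokens for less sensitive data increases the efficiency of the system.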
Finally, Token Manager 2.0 allows for data to be encrypted at the point of capture or point of sale, eliminating the risk of data exposure during transit from point of capture to the Token Manager. While the data must be decrypted in the data centre in order to be tokenised, Palgon said this happens only within the memory of the tokenisation server, so the data value isn't written out anywhere other than the server's memory.
NuBridges released Token Manager last spring. The vendor said it has more than 100 enterprise tokenisation customers. Palgon said customers commonly seek to apply tokenisation to payment data, Social Security numbers, drivers' licenses and passport data, but in recent months there's been a flurry of interest among customers and prospects regarding PHI and how the technology can help with HIPAA and HITECH compliance.
John Pescatore, vice president and research fellow at Stamford, Conn.-based research firm Gartner Inc., said that NuBridges was a first-mover in supporting enterprise tokenisation, but most of the major key management vendors have since added tokenisation capabilities.
"That early lead has given them a head start on other features that large enterprises look for," Pescatore said via email, "and the idea of token coordination across different physical locations is one of those features."
Pescatore noted that Gartner continues to see a high level of interest in tokenisation among organizations, primarily as a way of reducing the scope and costs of the PCI DSS audit process.
"However, we are also seeing companies that do implement tokenisation for PCI reasons start to look at being able to offer encryption and tokenisation as internal services, to deal with sensitive data (mostly forms of PII) where encryption will be needed and tokenisation augments encryption," Pescatore said. "In general, we see steadily increasing demand for data encryption and see tokenisation as a needed capability for the majority of uses of server-side encryption."