How it works
Tokenize columns in production databases, and in copies of databases before they are shared with third-party developers or big data environments.
Dynamic Data Masking
Administrators can establish policies that determine the dynamic masking approach: return an entire field tokenized, or dynamically mask only parts of a field. For example, security teams can establish policies so that users with customer service representative credentials receive credit card numbers with only the last four digits visible, while customer service supervisors can access the full credit card number.
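A role-based partial-masking policy like the one above can be sketched as follows. This is an illustrative implementation, not Guardium's internal logic: the role names and the mask character are assumptions.

```python
def mask_card_number(pan: str, role: str) -> str:
    """Apply a role-based dynamic masking policy to a card number.

    Hypothetical policy for illustration: "supervisor" sees the full
    value; every other role sees only the last four digits, with
    separators (e.g. dashes) preserved.
    """
    if role == "supervisor":
        return pan

    visible = 4
    digits_remaining = sum(c.isdigit() for c in pan)
    masked = []
    for c in pan:
        if c.isdigit():
            # Reveal a digit only once we are within the last four.
            masked.append(c if digits_remaining <= visible else "*")
            digits_remaining -= 1
        else:
            masked.append(c)  # keep formatting characters as-is
    return "".join(masked)

print(mask_card_number("4111-1111-1111-1111", "representative"))
# → ****-****-****-1111
print(mask_card_number("4111-1111-1111-1111", "supervisor"))
# → 4111-1111-1111-1111
```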
IBM Guardium for Tokenization's format-preserving tokenization capabilities allow organizations to restrict access to sensitive assets without changing the existing database schema. A REST API implementation makes it fast, simple, and efficient for application developers to institute sophisticated tokenization capabilities.
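To illustrate the REST API pattern, the sketch below builds (but does not send) a tokenization request. The endpoint URL and the JSON field names are assumptions for illustration; the actual Guardium Tokenization REST paths and payload schema are product-specific and documented separately.

```python
import json
import urllib.request

# Hypothetical endpoint -- substitute your Tokenization server's address
# and the REST path from the product documentation.
TOKENIZE_URL = "https://tokenization-server.example.com/api/tokenize"


def build_tokenize_request(value: str, token_group: str) -> urllib.request.Request:
    """Build a POST request asking the Tokenization server to tokenize a value.

    The payload keys ("data", "tokengroup") are placeholders, not the
    product's confirmed schema.
    """
    body = json.dumps({"data": value, "tokengroup": token_group}).encode("utf-8")
    return urllib.request.Request(
        TOKENIZE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_tokenize_request("4111111111111111", "CustomerPII")
print(req.get_method(), req.full_url)
```

Because tokenization is format-preserving, the token returned for a 16-digit card number would itself be a 16-digit value, so the application's database schema and validation logic need no changes.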
IBM Guardium for Tokenization helps organizations comply with security policies and regulatory mandates such as PCI DSS, Sarbanes-Oxley, HIPAA, the GDPR, and more.
Guardium for Tokenization requires two virtual appliances, each deployed on a VMware hypervisor (ESXi Server 5.5 or higher): a virtual data security module (DSM) appliance and a Tokenization appliance.
The DSM virtual appliance may require additional resources based on the number of agents that are being managed.
Applications or databases that need tokenized data must make REST API calls to the Tokenization virtual appliance (server) to receive tokenized objects. The following minimum requirements apply:
- DSM CPU cores: 2 (minimum), 6 (recommended)
- DSM RAM: 4-16 GB
- DSM hard disk space: 100-200 GB
- Tokenization server CPU cores: 4
- Tokenization server RAM: 16 GB (minimum), 24 GB (recommended)
- Tokenization server hard disk space: 100 GB
See software requirements.