Thieves can't steal what isn't there!
Tokenization is a method of anonymization: the process of removing or transforming sensitive or protected information so that it cannot be reverted back to the original data. All identifying data is removed to create unlinkable data. Tokenized data is not mathematically reversible. Tokenization is data security.
Storing tokens reduces the amount of sensitive data in your environment and helps your business meet many types of privacy and data compliance requirements.
Tokenization replaces sensitive data on your systems with a unique set of numbers and letters that have no bearing on the original data. "No bearing" means the data being stored is not used to calculate the token value.
Tokenization is the most secure method of storing sensitive data. It provides both physical and logical separation of data elements. You store tokens in your systems; the original data is encrypted and safely stored on the AuricVault® servers. You cannot programmatically or mathematically determine the original data from the token itself. You must use the token, together with your credentials, to retrieve the original data from the AuricVault® service. This physical and logical separation of payment and privacy information significantly reduces a company's exposure to information theft and reduces the impact of any security breach.
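The flow above can be sketched in a few lines. This is a minimal illustration, not the AuricVault® API: the token is generated randomly, so it has no mathematical relationship to the data it stands in for, and the original value can only be recovered by asking the vault.

```python
import secrets

# Minimal sketch (not the AuricVault API): the token is random,
# so it has no bearing on the original data.
_vault = {}  # stand-in for the remote, encrypted vault storage

def tokenize(sensitive_data: str) -> str:
    """Store the data and return a random token for it."""
    token = secrets.token_hex(16)  # random, not derived from the data
    _vault[token] = sensitive_data  # a real service encrypts this at rest
    return token

def detokenize(token: str) -> str:
    """Retrieve the original data; a real service also checks credentials."""
    return _vault[token]

card = "4111 1111 1111 1111"
token = tokenize(card)
print(detokenize(token) == card)  # True: round-trip via the vault
```

Because the token is random rather than derived from the data, stealing a database of tokens yields nothing usable.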
The AuricVault® service securely protects and exchanges your sensitive data, and helps you meet your company's data compliance requirements. Auric Systems International has tokenization clients in over 31 countries across a broad range of industries, including aerospace, e-commerce, finance, healthcare, hospitality, mobile applications, and transportation.
- Convert sensitive data to unique token IDs.
- Store sensitive data remotely and redundantly.
Improve the Security of Your Sensitive Data
A token storage service is an easy way to protect sensitive financial, identification, and access data such as credit card numbers, social security numbers and security codes.
- Safely exchange data
- Secure sensitive information
- PCI, HIPAA, GDPR, Privacy Shield, etc. compliance
- Business to Business (B2B) data sharing benefits
- Reduce risk of data theft
- Reduce risk of data loss (redundant/archival storage)
- Legacy environment implementations
- Business continuity
Safely Exchange Data
A tokenization and storage service ought to provide ways to safely exchange sensitive data with clients and business partners. The AuricVault® service does that.
Simplify Your Compliance
A token storage service ought to simplify complying with data security requirements. The AuricVault® service complies with the following industry and governmental requirements:
- Payment Card Industry Data Security Standard: PCI DSS (credit cards)
- Personally Identifiable Information: PII
- Medical Data: U.S. Health Insurance Portability and Accountability Act (HIPAA)
- Personal Health Information: PHI
- EU-U.S. Privacy Shield
- Swiss-U.S. Privacy Shield
- General Data Protection Regulation: GDPR
- Canadian Personal Information Protection and Electronic Documents Act: PIPEDA
What Can I Store?
Tokenization is for all types of data. The AuricVault® service allows you to store up to 11,000 bytes of data, either plain ASCII or UTF-8 encoded characters.
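Because the limit is expressed in bytes and the service accepts UTF-8, a client-side size check should measure the encoded byte length, not the character count. A small illustrative check (the constant reflects the limit stated above; the function name is ours):

```python
AURICVAULT_LIMIT_BYTES = 11_000  # the service's stated storage limit

def fits_in_vault(data: str, limit: int = AURICVAULT_LIMIT_BYTES) -> bool:
    """UTF-8 characters occupy 1-4 bytes each, so measure the
    encoded length rather than the character count."""
    return len(data.encode("utf-8")) <= limit

print(fits_in_vault("a" * 11_000))  # True: ASCII is 1 byte per character
print(fits_in_vault("é" * 11_000))  # False: 'é' is 2 bytes in UTF-8
```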
The Privacy Rights Clearinghouse, which tracks private and privileged information losses, clearly indicates that Payment Cards, Personally Identifiable Information (PII), and Protected Health Information (PHI) are not the only types of sensitive data that could benefit from the safety and security that tokenization brings. Companies frequently have multiple silos of sensitive information that could be tokenized.
- Credit/debit card account number
- Banking account number
- Financial account number
- CNET: Equifax data breach may affect nearly half the US population
- New York Times: JPMorgan Chase Hacking Affects 76 Million Households
- ssa.gov: Identity Theft and Your Social Security Number
- Protected Health Information (PHI)
- Reuters: Your medical record is worth more to hackers than your credit card
- Answers to security challenge questions
- Biometric data
- Birth date
- Driver's license information
- Email address
- Foreign visa information
- Mother's maiden name
- Name and address
- National Insurance Number
- Personally Identifiable Information (PII)
- Passport Number
- Social Security Numbers
- ABC 15: Child’s Social Security number stolen in 2011, still being used
- ID Theft Center: Can Someone Steal Your Identity From Your Driver’s License?
- The Washington Times: 63K Social Security numbers compromised in UCF data breach
- Access Codes
- Password Hashes
- Security Codes
- ID Agent: 63% of Data Breaches Result From Weak or Stolen Passwords
Data detokenization is the reverse of the tokenization process: the original data is retrieved using the token and your service credentials. Bulk retrieval of original data from its tokenized state ought to be easily accomplished. It is simple with the AuricVault® service.
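The credentialed retrieval described above can be sketched with a toy client. The class, method names, and hard-coded credential check are illustrative, not the actual AuricVault® API; they show only the shape of single and bulk detokenization.

```python
from typing import Dict, List

class VaultClient:
    """Toy vault client: token lookups gated by service credentials.
    Illustrative only -- not the AuricVault API."""

    def __init__(self, credentials: str, store: Dict[str, str]):
        self._credentials = credentials
        self._store = store  # stand-in for the remote encrypted store

    def detokenize(self, token: str) -> str:
        # A real service validates credentials server-side over TLS.
        if self._credentials != "example-credential":
            raise PermissionError("invalid service credentials")
        return self._store[token]

    def detokenize_bulk(self, tokens: List[str]) -> Dict[str, str]:
        """Bulk retrieval: resolve many tokens in one credentialed pass."""
        return {t: self.detokenize(t) for t in tokens}

store = {"tok-a": "078-05-1120", "tok-b": "4111 1111 1111 1111"}
client = VaultClient("example-credential", store)
print(client.detokenize_bulk(["tok-a", "tok-b"]))
```

The key point is that the token alone is useless; retrieval always requires valid credentials, which is what keeps stored tokens safe.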
Every technology comes with tradeoffs.
Two potential tokenization disadvantages are:
Using a payment processor's own tokenization, for example, can lock you into that particular payment processor. To avoid lock-in, select a tokenization service that is payment-processor neutral, supports multiple payment processors, and will return your tokenized data if you need it for a future migration.
When selecting a tokenization service, consider how easily the data could be transferred to a different service in the future. With the AuricVault® service you'll always be able to retrieve your original data. Auric will even help you migrate to another service.
Using the tokenized data requires that it be detokenized and retrieved from a remote service. This introduces a small increase in transaction time, which is negligible in most situations. However, in high-speed automation scenarios, a company might find that sub-millisecond response times from local storage are more important than the data-separation security that tokenization provides.