CipherCloud unveils 'next generation' tokenization for Cloud Data Protection

Latest enhancement to the CipherCloud platform enables cloud services to comply with EU data residency requirements.

CipherCloud announces Next Generation Tokenization as a complement to the company’s suite of cloud data protection technologies. Tokenization protects sensitive data by replacing clear text data with a random “token” that bears no mathematical relationship to the original data. 

Tokenization is regarded as one of the strongest forms of data protection, and enterprises worldwide use it in critical security environments and for data residency purposes. The offering, an integral part of the CipherCloud platform, combines strong data protection with natural language search. In addition, CipherCloud becomes the first Cloud Access Security Broker (CASB) to offer multi-cloud tokenization for Salesforce, ServiceNow, Docusign, Marketo, Good Technology and integrations with other applications.

In the context of cloud computing, tokenization allows enterprises to retain the original data on premises while sending only the tokenized forms into the cloud. Because the clear text data never leaves the company, tokenization complies with some of the most stringent EU privacy laws, such as those in Germany and Luxembourg, that prohibit certain types of sensitive personal information from leaving national boundaries. Tokenization is also often used for PCI compliance and meets Requirement 3 of the PCI Data Security Standard (PCI DSS).
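The on-premises pattern described above can be sketched in a few lines of Python. This is an illustrative model only, not CipherCloud's implementation (which is not public): all class and method names here are hypothetical, and the key point is that the mapping from token back to clear text lives solely in a vault that never leaves the enterprise.

```python
import secrets

class TokenVault:
    """Hypothetical on-premises token vault (illustrative sketch only)."""

    def __init__(self):
        # The token-to-value map stays on premises; only tokens go to the cloud.
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        # A random token with no mathematical relationship to the original
        # value: unlike encryption, there is no key that reverses it, only
        # this vault's lookup table.
        token = "tok_" + secrets.token_hex(16)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Resolving a token is possible only inside the enterprise boundary.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("DE89 3704 0044 0532 0130 00")  # e.g. a German IBAN
assert token != "DE89 3704 0044 0532 0130 00"
assert vault.detokenize(token) == "DE89 3704 0044 0532 0130 00"
```

Because the token is random rather than derived from the data, a cloud provider holding only tokens can satisfy residency rules that forbid the clear text from crossing national boundaries.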

“We’re unveiling Next Generation Tokenization for Cloud Data Protection at Europe’s biggest security conference because the product is tailor made for the EU’s increasingly tougher regulatory environment for the cloud,” said Pravin Kothari, founder and CEO of CipherCloud. “Given the harsher financial fines of the Privacy Regulation, the consumption of multiple cloud applications by the enterprise requires security and compliance to be in lock step with business. We are giving enterprises a coherent strategy and the innovative technologies they need to navigate the privacy and residency regulations required for conducting business across the globe.”

Next Generation Tokenization enables organisations to protect their information assets and reduce risks through:
• Flexible, granular field-level protection with advanced policy-based controls that remain unobtrusive to the business process.
• Natural language search that enables complex searches, deep application integration, advanced reporting and charting.
• A solution that scales massively across data centres and geographical locations.
• Configuration on a per-field, per-word or partial-field basis.
• Multiple randomly generated tokens for repetitive words or strings, preventing frequency analysis attacks.
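The last point in the list above, defeating frequency analysis, can be sketched as follows. This is an assumed design for illustration, not CipherCloud's actual scheme: every occurrence of the same word receives a fresh random token, so an attacker counting token frequencies in the cloud learns nothing about how often each word appears, while the vault keeps a reverse index so search over tokenized data still works.

```python
import secrets
from collections import defaultdict

class MultiTokenVault:
    """Illustrative sketch: one word, many tokens, to resist frequency analysis."""

    def __init__(self):
        self._token_to_value = {}                   # on-premises lookup
        self._value_to_tokens = defaultdict(list)   # reverse index for search

    def tokenize(self, word: str) -> str:
        # A new random token on every call, even for a repeated word, so
        # token frequencies carry no information about word frequencies.
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = word
        self._value_to_tokens[word].append(token)
        return token

    def search_tokens(self, word: str) -> list:
        # Every token ever issued for a word; a query for the word can be
        # rewritten into a match against this token set in the cloud.
        return list(self._value_to_tokens[word])

vault = MultiTokenVault()
t1 = vault.tokenize("invoice")
t2 = vault.tokenize("invoice")
assert t1 != t2                                       # same word, distinct tokens
assert set(vault.search_tokens("invoice")) == {t1, t2}
```

The trade-off is a larger vault and a search query that expands into a set of tokens rather than a single value, which is why this approach is typically reserved for sensitive fields rather than applied wholesale.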
