By Steve Wilson

This blog is based on a new Constellation Research report, Preparing for the New Age of Cloud Security (Dec 7, 2021), by Liz Miller and me.

1. Zero Trust

One of the most popular slogans today is also one of the most confusing: Zero Trust. Of course we want to trust business partners and service providers to deliver predictable outcomes. But “zero trust” is a technicality, a policy perspective that does away with the traditional class system in which selected individuals or systems are granted more privileged access than others. That class system is the road to security hell, for it is increasingly possible to fake or falsely assume a trusted position, and thence wreak limitless damage.

A zero-trust access control policy means that all agents within a system are dealt with equally, regardless of identity or history. Essentially it equates to the “need to know” principle: all actors must have a demonstrable need for a given level of access.
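
To make the principle concrete, here is a minimal, illustrative Python sketch of a zero-trust style access check. Every name in it (authorize, verify_credential, the policy table) is a hypothetical placeholder rather than any product's API; the point is simply that every request is authenticated afresh and checked against an explicit need-to-know policy, with deny as the default.

```python
from dataclasses import dataclass

@dataclass
class Request:
    subject: str      # who is asking
    credential: str   # fresh proof of identity, e.g. a signed token
    resource: str     # what they want to touch
    action: str       # what they want to do with it

# Explicit need-to-know policy: (resource, action) -> subjects with a demonstrated need.
POLICY = {
    ("payroll-db", "read"): {"payroll-service"},
    ("payroll-db", "write"): {"payroll-admin"},
}

def verify_credential(subject: str, credential: str) -> bool:
    """Placeholder for checking a signature or token with the identity provider."""
    return credential.startswith(f"signed:{subject}:")

def authorize(req: Request) -> bool:
    # 1. Authenticate every request afresh -- no cached or implied trust.
    if not verify_credential(req.subject, req.credential):
        return False
    # 2. Check the explicit policy entry; anything not listed is denied.
    return req.subject in POLICY.get((req.resource, req.action), set())

print(authorize(Request("payroll-service", "signed:payroll-service:abc", "payroll-db", "read")))  # True
print(authorize(Request("ceo-laptop", "signed:ceo-laptop:xyz", "payroll-db", "write")))           # False: rank confers nothing
```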

Sadly, “zero trust” has become a catchphrase, much abused in security marketing. All it really means is: do not take anyone on their word. Equally, we should not accept “zero trust” as a technical description on its word alone; it must be backed by tangible security controls.

2. Exponentially Increasing Client-Device Capabilities

Moore’s Law leads to periodic paradigm shifts in computing focus, from the server to the client and back again. In the 1980s and 90s, mainframe computing gave way to minicomputers, workstations, and microcomputers. Then the sheer volume of data processing required computing to swing back to huge back-end systems and shared resources, which became known as the cloud, aka “someone else’s computers”.

The pendulum has swung yet again, driven by cryptography, with mounting preference (nay, mandates) for secure microcontroller units (MCUs) on the client side.  Local encryption key generation and storage, and integrated transaction signing are now standard in mobile devices, and IoT device capability is heading the same way. Managing fleets of connected automobiles, smart electricity meters, and medical devices -- to cite some popular examples -- takes a dynamic mix of processing and storage at the edge and in the cloud.

High quality client-side cryptography is the key (pardon the pun) to genuine strong authentication; that is, authentication using personally controlled, tangible devices that are individually accountable and resistant to phishing.
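
The underlying pattern is simple challenge-response signing with a key that never leaves the user’s device. The sketch below shows that pattern using the open-source pyca/cryptography package; on a real phone the private key would be generated and held inside a secure element rather than in application memory, and a local user-presence check (PIN, fingerprint, face) would gate the signing operation.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# Enrolment: the device generates a key pair locally and registers only the
# public key with the relying party.
device_private_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = device_private_key.public_key()

# Login: the server issues a fresh random challenge...
challenge = os.urandom(32)

# ...and the device signs it after a local user-presence check.
signature = device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# The server verifies the signature against the enrolled public key.
try:
    registered_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("authenticated: signature matches the enrolled device")
except InvalidSignature:
    print("rejected")
```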

The FIDO Alliance has brought about the biggest improvements ever seen in end-user authentication, by consumerising cryptography. Most users are blithely unaware that when they unlock a virtual credit card on their mobile phone via facial recognition, they are invoking hardware security and public key cryptography which until recently was confined to hardware security modules (HSMs) costing $25,000 each.

3. Quantum-Safe Cryptography

Quantum computing promises radical new architectures surpassing the classical limits of arithmetic and sequential data processing. In cybersecurity, there is a dark side to this power: many encryption algorithms, especially those used for digital signatures and authentication, are based on “one-way functions” that are easy to compute in one direction but practically impossible to reverse. There has always been an arms race between cryptologists and attackers who uncover secret keys through brute-force attacks using powerful conventional computers (like graphics cards and custom gate arrays). The main weapon against these brute-force attacks is for vendors and sysadmins to regularly increase the lengths of the keys used to protect data.
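
Some back-of-envelope arithmetic shows why longer keys have been such an effective counter-measure: each extra bit doubles the search space a brute-force attacker must cover. The guessing rate below is an assumed figure for illustration only, and the calculation treats the attack as a raw search of the key space.

```python
# Each additional key bit doubles the worst-case brute-force effort.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
GUESSES_PER_SECOND = 10**12  # assumed attacker capability, not a benchmark

for key_bits in (64, 80, 128, 256):
    worst_case_years = 2**key_bits / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{key_bits:3d}-bit key: up to {worst_case_years:.3g} years of guessing")
```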

Quantum computers will one day punch through the computational barriers that underpin most of today’s digital signature algorithms. For now, the threat tends to be exaggerated; some of the world’s best cryptologists advise that the arms race will continue, with longer keys still offering protection for many years to come.  Meanwhile the National Institute of Standards and Technology (NIST) is conducting a methodical search -- and sponsoring competitions -- for the next generation of quantum-safe cryptographic techniques, while practical quantum engineering progresses in leaps and bounds.

This is an especially difficult area to predict with confidence. The maintenance of enterprise cryptographic services must be left to experts. The better cloud providers, offering managed services, original research and insights, will be watching and preparing for the next generation of encryption.

4. Cloud Hardware Security Modules

For decades, hardware security modules (HSMs) have been specialised core components of critical security operations such as payments gateways, ATMs, and high-volume website encryption. Cloud HSMs are critical new building blocks for the future of virtualized cryptographic processing, offering:

  • compact, certifiable, tamper-resistant cryptographic primitives running in firmware
  • secure execution environments for running partitioned custom code
  • secure elements (dedicated microchips) holding private keys that are rarely if ever released outside the secure hardware (see the sketch after this list)
  • physically robust enclosures with tamper detection to scrub code and keys in the event of a mechanical or electronic intrusion
  • certification to high levels of assurance such as Common Criteria Evaluation Assurance Level 6 (EAL6) or FIPS 140-2 Level 4, and 
  • options to rent either shared or dedicated modules.
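
To make the key-custody point concrete, here is roughly how a cloud HSM is used in practice: the application holds only a key identifier, sends data in, and gets results back, while the private key itself stays inside the provider’s hardware. The sketch assumes Python with boto3, valid AWS credentials, and a hypothetical key alias; it uses AWS KMS purely as an example of a managed front-end to HSM-backed keys, and other providers and dedicated cloud HSM services expose similar interfaces (such as PKCS#11).

```python
import hashlib
import boto3

kms = boto3.client("kms")

payload = b'{"amount": 125.00, "currency": "AUD", "merchant": "example"}'
digest = hashlib.sha256(payload).digest()

# The private key never leaves the module; only the digest goes in and only
# the signature comes back.
response = kms.sign(
    KeyId="alias/payments-signing-key",   # hypothetical key alias
    Message=digest,
    MessageType="DIGEST",
    SigningAlgorithm="ECDSA_SHA_256",
)
signature = response["Signature"]
print(f"signature length: {len(signature)} bytes")
```

The design choice that matters is that compromise of the application host yields key handles, not keys.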

5. Data Protection as a Service

Many security challenges have more to do with regulatory trends and competitive pressure than with fraud or overt criminality. Increasingly stringent data protection rules are driving demand for encryption and confidentiality.

Managed cryptography is needed in response to the long-standing challenge of keeping encryption systems up to date. I see encryption, tokenization and anonymization (with qualifications) being delivered to customers as services via APIs, with the algorithmic complexity abstracted away. Ideally, higher order data protection services will come to be expressed in an algorithm-neutral manner. 
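
A sketch of what that algorithm-neutral shape could look like: callers name the outcome they need for each field, and the service decides (and can later upgrade) the mechanism behind it. All of the names here (protect, the policy table, the toy token vault) are hypothetical, invented for illustration rather than drawn from any vendor’s API.

```python
import hashlib
import secrets

TOKEN_VAULT = {}   # stands in for a managed, server-side token vault

def tokenise(value: str) -> str:
    token = secrets.token_hex(8)
    TOKEN_VAULT[token] = value      # a real service stores this mapping server-side
    return token

def pseudonymise(value: str, salt: bytes) -> str:
    return hashlib.sha256(salt + value.encode()).hexdigest()[:16]

# Policy names an outcome per field; the mechanism behind it can change over time.
POLICIES = {
    "card_number": tokenise,
    "email": lambda v: pseudonymise(v, salt=b"per-tenant-salt"),
}

def protect(record: dict) -> dict:
    """Apply the configured protection to each governed field, pass the rest through."""
    return {k: POLICIES[k](v) if k in POLICIES else v for k, v in record.items()}

print(protect({"card_number": "4111111111111111",
               "email": "alice@example.com",
               "purchase": "coffee"}))
```

Because the caller never names an algorithm, the provider is free to rotate keys or swap in stronger (eventually quantum-safe) primitives without breaking the interface.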

True anonymisation is a promise that is difficult to keep, but a range of technologies is available for providing qualified degrees of data-hiding -- including homomorphic encryption, fully homomorphic encryption (FHE), and differential privacy.  These are highly technical methods, needing careful fine-tuning and awareness of the compromises they entail.

Conventionally encrypted data loses all structure and cannot be processed for routine tasks such as sorting, reporting and statistical analysis. Homomorphic encryption is a class of algorithms that preserve some structure and enable some processing. Fully homomorphic encryption promises to enable all regular processing to be performed on encrypted data. This is new technology, not yet fully proven or accepted by the academic and regulatory communities. With FHE and similar techniques still in flux, I see these being offered as a cloud service in innovative new ways.
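
To show what “processing encrypted data” can mean, here is a toy demonstration of the homomorphic property using textbook RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. The tiny parameters and raw RSA are hopelessly insecure, and this is not how FHE schemes are built; it only illustrates the core idea of computing on data while it remains encrypted.

```python
# Textbook RSA with toy parameters -- for illustration of the homomorphic
# property only, never for real use.
p, q = 61, 53                 # toy primes
n = p * q                     # modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 6, 7
ciphertext_product = (encrypt(a) * encrypt(b)) % n   # work done on ciphertexts only
print(decrypt(ciphertext_product))                    # 42 == a * b
```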

“Infostructure”

In conclusion, with data itself becoming a critical new asset class, we need a broader, more integrated definition of “data protection” to transcend today’s siloed approaches to privacy and cybersecurity. The technical complexity, regulatory risk, and intensity of ongoing R&D will all drive the virtualization of cloud-based data protection, and a new breed of infostructure services.