From the Spring 2024 Issue

Shannon’s Dilemma: Securing a Future We Control but Cannot Know

Adam Firestone
Editor-in-Chief | United States Cybersecurity Magazine

The general sat back in her high-backed executive office chair and surveyed the network operations center.  A half-dozen engineers focused intently on the large curved high-definition displays in front of them, and two men stood to her right.

She nodded to the government program manager.  “Have we gotten all the necessary approvals?”  She knew the answer, but she hadn’t gotten to this point in her career by leaving anything to chance.  

He nodded.  “Yes, madam general.”

The general glanced at the company executive who had spent untold billions of the people’s money to achieve what she knew would be a revolutionary disruption in the global economy and balance of power.  “Are we ready to start?”

The executive bowed slightly.  “Yes, madam general.  All systems are on standby, waiting for your word.”

The general murmured approval.  “Begin.”

The executive bowed again and turned to the engineers.  “Execute.”

The culmination of the project, which had taken so much time and demanded so many resources, was anticlimactic:  Some typing and mouse clicking resulted in lines of text scrolling across a few of the engineers’ displays.

The general looked over to the executive again.  “And now?”

The executive beamed.  “And now, madam general, we are decrypting, reading, and indexing everything, every secret, that everyone in the world has transmitted electronically for the last forty years.  We are also doing those things in real time to new transmissions as they happen.  Within a day, we will have searchable, usable records of every document, every purchase, every access, every transaction, every conversation, every thought that anyone in the world conceives.  We anticipate that we will work through all the harvested information in approximately six months.  At that time, there will exist nothing that is secret from us, not on a personal level, not on a local level, not on a national level.”

The general smiled, and with the barest hint of a nod, said to herself: “And then nothing can stop us.”

If that sounds like the introduction to a thriller, it could be.  Unfortunately, it’s also a very real potential future scenario as the age of quantum computing dawns.  Some context behind what some call Q-Day, and others the Coming Quantumpocalypse, is useful.

Asymmetric Cryptography and Why You Care

The advent of broadband in the early 2000s and the explosion of complementary internet technologies in the 2010s created a functionally connected global environment.  The result was a dramatic increase in the velocity and volume of information creation, movement, and sharing.  The interconnected systems built on these near-real-time information conduits accelerated commercial opportunities, the efficiency of civil government, the effectiveness of military operations, and the impact of academic organizations, and generated countless opportunities for personal connection and growth.  Digital connectivity and speed became the indispensable enablers of the 21st century.

Underlying all of this, however, is the simple concept of trust.  Trust, as defined by the Merriam-Webster dictionary, is a confident reliance on the character, ability, strength, or truth of someone or something.  With respect to a digitally connected world, trust means that the confidentiality, integrity, availability, and privacy of the systems we use and the data they process can be relied upon.  Without this trust, rational actors simply wouldn’t engage in any meaningful or significant transactions remotely across the internet.

Since the 1970s, the necessary trust has been guaranteed by a branch of cryptology called asymmetric encryption or, alternatively, public key encryption.  Asymmetric encryption enables two parties to securely encrypt, transmit, decrypt, and use information without ever exposing a key that will decrypt the information.

Think of it this way:  Imagine that you regularly send packages of your delicious homemade chocolate chip cookies to a friend who lives far away.  Unfortunately, the package handlers enjoy your cookies too, and your friend has received a number of packages with nothing but crumbs in them.  You decide to put the cookies in a case secured with a lock.  This creates a problem:  You need to get the key to the lock to your friend, who is far away.  Since the package handlers can’t be trusted with cookies, you can’t trust them not to make a copy of the key.

To overcome this, when you next meet your friend, she gives you a case of unlocked padlocks.  She keeps all the keys.  Each time you send her cookies, you use one of the padlocks to secure the case.  Once the padlock is locked, nobody can open it, not you, not the package handlers – only your friend, who has the key.

In an asymmetric encryption scheme, each user has a pair of mathematically related cryptographic keys.  One is referred to as the public key and the other as the private key.  The public key, analogous to the unlocked padlock, can be openly published across insecure channels; in an encryption use case, its function is to encrypt information.  It cannot be used to decrypt information that has been encrypted with it.  When a message encrypted with a recipient’s public key is received, it is decrypted with the corresponding private key, analogous to the padlock’s key, which has presumably been safeguarded by the recipient.

The downside to asymmetric cryptography is that it’s computationally expensive; that is, it’s slower and less efficient than symmetric cryptography, which uses the same key to encrypt and decrypt.  As a result, the two branches are generally used in tandem:  A fast, efficient symmetric key algorithm, such as AES-256 in the Galois/Counter Mode (GCM) of operation, is used for bulk encryption (e.g., the contents of a PowerPoint file), and an asymmetric algorithm is used to encrypt the relatively small (e.g., 256-bit) symmetric key so that only the holder of the associated private key can decrypt it.
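
The hybrid pattern can be sketched in pure Python with deliberately toy components.  Everything here is invented for illustration and wildly insecure:  A textbook RSA keypair built from tiny primes wraps a random session key, and a hash-based keystream stands in for a real bulk cipher like AES-256-GCM.

```python
import hashlib
import secrets

# --- toy "asymmetric" keypair: textbook RSA with tiny illustrative primes ---
p, q = 61, 53
n = p * q                    # public modulus
e = 17                       # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real symmetric cipher: XOR the data against a
    # keystream derived by hashing the key with a block counter.
    out = bytearray()
    for i, start in enumerate(range(0, len(data), 32)):
        block = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[start:start + 32], block))
    return bytes(out)

# Sender: bulk-encrypt with the symmetric key, then wrap that key asymmetrically.
session_key = secrets.randbelow(n - 2) + 2            # random toy key < n
ciphertext = keystream_xor(session_key.to_bytes(2, "big"), b"secret slides")
wrapped_key = pow(session_key, e, n)                  # encrypt key with public key

# Recipient: unwrap the session key with the private exponent, then decrypt.
recovered_key = pow(wrapped_key, d, n)
plaintext = keystream_xor(recovered_key.to_bytes(2, "big"), ciphertext)
```

Only the small session key ever touches the slow asymmetric operation; the bulk data goes through the fast symmetric path, which is exactly the division of labor described above.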

Some variation of this methodology is used to create the trust frameworks necessary for almost every type of remote transaction.  From web surfing to ecommerce to telecommunications to bulk data storage and retrieval to banking to credit cards to interconnected system authentication to keyless remote entry, almost all automated information management technologies that enable modernity rely on asymmetric cryptography.

It’s the End of the World as We Know It: The Quantum Eclipse of Classical Asymmetric Cryptography

The mathematical functions that power classical asymmetric cryptography are often categorized as one-way or trapdoor functions.  At a high level, these are functions that are easy to compute in one direction, yet computationally infeasible to invert without special information.  (Strictly speaking, trapdoor functions are a sub-category of one-way functions.)

Mathematical mechanisms used as the basis for one-way functions in asymmetric cryptography include the factorization of large composite numbers into their prime factors (RSA), modular squaring (Rabin cryptosystem), and discrete logarithms (ElGamal, Diffie-Hellman key exchange).  All of these have provided more or less satisfactory asymmetric cryptosystems.  Some of them have been codified as part of prolifically adopted standards such as Transport Layer Security (TLS) 1.3 (RFC 8446) and Federal Information Processing Standard (FIPS) 140-3.
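
The asymmetry is easy to feel with a toy discrete-logarithm example (all parameters invented here, and tiny by cryptographic standards):  The forward direction is one fast built-in call, while the reverse direction, recovering the exponent, can only be done by grinding through candidates.

```python
# Forward direction: modular exponentiation is fast, even for big numbers.
p = 2_147_483_647            # the Mersenne prime 2**31 - 1, tiny for crypto
g = 7                        # a primitive root modulo p
secret = 90_001              # the "private" exponent
public = pow(g, secret, p)   # easy: built-in fast modular exponentiation

def brute_force_dlog(base, target, modulus, limit):
    # Reverse direction: try exponents one by one until the target matches.
    # This is the discrete logarithm problem, solved here only by brute force.
    value = 1
    for x in range(limit):
        if value == target:
            return x
        value = value * base % modulus
    return None

found = brute_force_dlog(g, public, p, 100_000)
# This search succeeds only because we capped it at a small exponent; at
# real parameter sizes (2048-bit moduli), it would outlast the universe.
```

The gap between the one-line forward computation and the open-ended reverse search is the whole point of a one-way function.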

Time, unfortunately, is the implacable enemy of all cryptosystems, and it has come calling for classical asymmetric cryptography.  More precisely, the advent of quantum computing threatens to make classical asymmetric cryptography non-viable in, depending on the analysis, anywhere from one to ten years.  This is due to the vast increases in computational speed promised by quantum computers.  

In a nutshell, classical computers (everything from a smartphone to a supercomputer) perform calculations by storing information in bits, each of which represents one of two values, 0 or 1.  Quantum computers use qubits (cue-bits – quantum bits) which, because of a phenomenon of quantum mechanics known as superposition, can exist in multiple states simultaneously while calculations are ongoing.  As the number of qubits supported by a quantum computer increases, the number of simultaneous computational states scales exponentially, significantly reducing the time it takes to perform certain calculations.  This increase in computational power is likely to have dramatic – and fatal – impacts on the viability of the mathematics that powers classical asymmetric cryptography.  (Less so for symmetric cryptography.)
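
A rough way to see that scaling:  Describing an n-qubit register in superposition classically takes 2**n complex amplitudes, so the simultaneous state space grows exponentially with qubit count.

```python
# An n-qubit register in superposition spans 2**n basis states at once,
# which is why the state space grows exponentially with qubit count.
for n in (1, 2, 10, 50, 300):
    print(f"{n:>3} qubits -> {2**n} simultaneous basis states")
```

By 300 qubits the count already dwarfs the number of atoms in the observable universe, which gives a sense of why quantum parallelism threatens problems that classical machines can only brute-force.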

This isn’t news, and both nation-states and private actors are responding to the threat (or promise, depending on your point of view) of quantum computing with something called the Harvest Now, Decrypt Later (HNDL) attack.

HNDL is a surveillance strategy involving the interception, copying, and long-term archival storage of encrypted data that is currently unreadable, based on an expectation that future advances in decryption technology enabled by quantum computing will make it readable.

To be clear, no quantum computer extant at the time of publication can defeat existing asymmetric encryption systems within a useful timeframe (i.e., in what cryptographers refer to as polynomial time).  It’s envisioned that such a quantum computer would require chips containing millions or billions of qubits.  The largest publicized number of qubits on a contemporary quantum computer to date is 1,225.  To be fair, the number of qubits on a chip doesn’t tell the entire story; attributes such as coherence time and noise play a significant role.  But the trend is clear:  Staggering amounts of resources are being poured into the quantum arms race, and faster and more capable quantum computers are being developed at an almost geometric rate.  Business models built around exploiting the power of quantum computing already exist.  (Quantum-Computing-as-a-Service (QCaaS), anyone?)

On what is colloquially referred to as either Q-Day or the Quantumpocalypse, that is, the day when quantum computers can defeat classical asymmetric cryptography at scale, the information age as we’ve come to know it would seem to be over.  

Or would it?

All is Not Lost: Post-Quantum Cryptography

Quantum computing’s looming threat to asymmetric cryptography has not gone unnoticed by the public sector, the private sector, or academia.  In 2016, the U.S. National Institute of Standards and Technology (NIST) called for submissions of Post-Quantum Cryptographic (PQC) algorithms.  In July 2022, NIST announced the competition’s first four winners:

  • For general encryption, and especially key encapsulation, NIST selected the CRYSTALS-Kyber algorithm.

  • For digital signatures, NIST selected three algorithms: CRYSTALS-Dilithium, FALCON, and SPHINCS+.

What makes these algorithms quantum resistant is the nature of their underlying mathematics.  CRYSTALS-Kyber (CRYSTALS = Cryptographic Suite for Algebraic Lattices) revolves around the inherent difficulty of solving the Learning With Errors (LWE) problem over module lattices.  This mathematical challenge is believed to be computationally infeasible, even for quantum computers.  However, it’s not just about difficult math.  Kyber offers comparatively small key sizes and fast encryption and decryption speeds, making it suitable for real-world applications.
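
The LWE idea can be sketched with a deliberately toy example (all parameters invented here and far too small to be secure):  Each public sample pairs a random vector with its inner product against a secret vector, plus a small random error, mod q.  Without the error, recovering the secret is routine linear algebra; the noise is what is believed to make the problem hard, even for quantum computers.

```python
import random

# Toy Learning With Errors (LWE) sketch -- illustrative only, NOT secure.
random.seed(0)
q = 97                      # small modulus, for illustration
n = 4                       # secret dimension (real schemes use hundreds)
s = [random.randrange(q) for _ in range(n)]       # the secret vector

def lwe_sample():
    # One public sample: (a, b) with b = <a, s> + e (mod q),
    # where e is a small random error term.
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])                 # small noise
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

samples = [lwe_sample() for _ in range(8)]

# Each sample sits within +/-1 of the exact inner product; an observer
# sees (a, b) but never learns which samples carry which error.
for a, b in samples:
    exact = sum(ai * si for ai, si in zip(a, s)) % q
    assert min((b - exact) % q, (exact - b) % q) <= 1
```

Kyber’s actual construction works over module lattices with carefully chosen parameters; this sketch only shows the shape of the underlying problem.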

However, as advanced and as promising as Kyber (which has been standardized as the Module-Lattice-Based Key Encapsulation Mechanism, or ML-KEM) is, it’s only part of the solution to the Quantumpocalypse.

It's Not Just the Algorithm: Implementation Proliferation Matters

In 2014, a vulnerability was found in OpenSSL, a software library that secures communications over computer networks, protects against eavesdropping, and provides tools and resources for implementing the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols.  It’s used by a majority of HTTPS websites.  The vulnerability, now known as Heartbleed, was due to an implementation flaw.  As a result of the flaw, attackers could read webserver memory and recover encryption keys.  Using these keys, attackers could gain access to authentication credentials.  At that point, attackers are free to initiate further attacks, eavesdrop on communications, impersonate users, and exfiltrate data.

Sounds pretty serious, no?

Despite the significance of the Heartbleed flaw, five years later, in 2019, and notwithstanding the publication of fixes and updates to OpenSSL, a significant number of webservers remained unpatched and still susceptible to the vulnerability.

Unfortunately, such a state of affairs is often the norm across the cyberscape.  While it’s tempting to attribute this to negligence or professional malpractice, it’s more often driven by other – and valid – business concerns and circumstances.  Just a few examples:  Some servers can’t be patched because of cost concerns or the criticality of the functions they perform.  Others are provided by third parties, and an impacted organization doesn’t have the ability to make the necessary changes.  Some organizations may not have the IT sophistication to understand the issues and take the necessary actions.

In sum, a technology that solves a current or future problem, like a shiny new PQC algorithm, doesn’t matter unless that technology is coupled with an implementation mechanism that makes it simple, fast, and cost effective to deploy and doesn’t adversely impact other business operations.  An example of an adverse impact might be a significant decrease in network throughput or computational speed.

An example of a company bringing this holistic solution perspective to the problems posed by the advent of quantum computing is American Binary (https://www.ambit.inc/).   American Binary offers a suite of products that, in combination, deliver post-quantum security that protects an organization’s information from the endpoints on which it is created to the network on which it travels and the infrastructure on which it is managed.

It's Not a Sprint, It’s a Marathon

The challenge posed by quantum computing has been apparent for years, and threat actors have been using that time proactively.  The bad news?  Much of what’s been confidential in the past is likely to be compromised in the future.  But that’s outweighed by the good news.  For the first time in decades, there are viable paths forward that make the challenge manageable from both technical and organizational perspectives.  In 1948, Claude Shannon, the inventor of information theory, noted that “We know the past but cannot control it.  We control the future but cannot know it.”  With the advent of rapidly implementable PQC solutions, we may not know the future, but we may be able to secure it.
