{"id":5832,"date":"2023-12-22T02:47:51","date_gmt":"2023-12-22T02:47:51","guid":{"rendered":"https:\/\/www.pufsecurity.com\/?post_type=dlp_document&p=5832"},"modified":"2024-01-10T05:58:22","modified_gmt":"2024-01-10T05:58:22","slug":"post-quantum-cryptography-pqc-on-the-road-to-preparedness","status":"publish","type":"dlp_document","link":"https:\/\/www.pufsecurity.com\/zh-hant\/document\/post-quantum-cryptography-pqc-on-the-road-to-preparedness\/","title":{"rendered":"Post-Quantum Cryptography (PQC) \u2013 On the Road to Preparedness"},"content":{"rendered":"\n

As more and more governments and private-sector organizations embark on standardizing post-quantum cryptography, the era of quantum computing seems imminent. In the face of this new wave, it is imperative to thoroughly equip ourselves for the forthcoming challenges and opportunities. This article will cover some basic concepts of quantum computing, how quantum computing relates to cryptography, and why preparing for post-quantum cryptography (PQC) is important. The article concludes with some perspectives on the transition to PQC and how it fits into overall system security.<\/p>\n\n\n\n

Let\u2019s start with a brief look at the historical evolution of quantum computing. The concept of quantum computing (QC) was first introduced when Paul Benioff described a quantum-mechanical model of computing in 1980. In 1994, Peter Shor published his quantum algorithm for factoring large integers, sparking interest among scientists and mathematicians. Nakamura and Tsai then demonstrated a working qubit using a superconducting circuit in 1999, and twelve years later D-Wave announced the first commercially available quantum computer. In 2016, cryptographers received a clear call to action when the US National Institute of Standards and Technology (NIST) issued its request for nominations for its post-quantum cryptography (PQC) standardization program<\/a>. The National Security Memorandum (NSM-10)<\/a> followed in 2022, urging the US to mitigate the threat that a cryptographically relevant quantum computer (CRQC) poses to current cryptographic protocols. In 2023, IBM launched the first quantum computer<\/a> with more than 1,000 qubits.<\/p>\n\n\n\n

Before exploring further, we need to understand the main difference between classical and quantum computing. Classical computing processes data in the form of binary bits, each taking the value 0 or 1, whereas quantum computing uses particles in a quantum state called \u201cqubits\u201d. Because the superposition property of quantum mechanics applies to qubits, a qubit can represent 0, 1, or any weighted combination of the two (until measurement, whereupon the superposition collapses to a definite value), as shown in Figure 1. By exploiting further quantum-mechanical properties such as entanglement, quantum algorithms can perform certain calculations far faster than any known classical algorithm. For example, Google claimed quantum supremacy in 2019 when its quantum computer performed a series of operations in 200 seconds that, the company claimed, would take a classical supercomputer 10,000 years.<\/p>\n\n\n
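The qubit behavior described above can be illustrated with a minimal sketch (not from the article, and purely classical simulation): a single qubit is modeled as a pair of amplitudes, and measurement collapses the superposition to 0 or 1 according to the Born rule, with probabilities given by the squared amplitude magnitudes.

```python
import random

def measure(a: complex, b: complex) -> int:
    """Simulate measuring a qubit with state a|0> + b|1>.

    Collapse yields 0 with probability |a|^2 and 1 with
    probability |b|^2 (assumes |a|^2 + |b|^2 == 1).
    """
    p0 = abs(a) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition: a = b = 1/sqrt(2). Unlike a classical bit,
# which is fixed at 0 or 1, repeated measurements split roughly
# evenly between the two outcomes.
amp = 2 ** -0.5
counts = [measure(amp, amp) for _ in range(10_000)]
print(counts.count(0), counts.count(1))
```

This only mimics the measurement statistics; it does not capture entanglement or interference, which is precisely where real quantum hardware gains its advantage over classical simulation.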