Quantum Computing, by Melanie Swan

Chapter 3

Quantum Computing: Basic Concepts

it seems that the laws of physics present no barrier to reducing the size of computers until bits are the size of atoms, and quantum behavior holds sway

— Richard P. Feynman (1985)

Abstract

Quantum computing is a research frontier in physical science with a focus on developing information processing at the quantum scale. Quantum computing involves the use of algorithms to exploit the special properties of quantum mechanical objects (such as superposition, entanglement, and interference) to perform computation. In physics, quantum mechanics is the body of laws that describe the behavior and interaction of electrons, photons, and other subatomic particles that make up the universe. Quantum computing engages the rules of quantum mechanics to solve problems using quantum information. Quantum information is information concerning the state of a quantum system which can be manipulated using quantum information algorithms and other processing techniques. Although quantum computing is farther along than may be widely known, it is an early-stage technology fraught with uncertainty. The overall aim in the longer term is to construct universal fault-tolerant quantum computers.

3.1 Introduction

Quantum computers are in the early stages of development and would likely be complementary to existing computational infrastructure, interacting with classical devices and being accessed either locally or as a cloud service. Currently, the top methods demonstrate 30–70 qubits of processing power and achieve fidelity rates above 99% (i.e. error rates below the 1% fault-tolerance threshold). However, there is uncertainty about the realizability of scalable universal quantum computers. Quantum computers may excel at solving certain types of problems, such as optimization. This could offer a step-up in computing such that it becomes possible to solve new classes of problems, but not all problems. For example, for well-known search and optimization problems, it may be possible to explore a fixed-size solution space in roughly the square root of the time required by a classical computer (a quadratic speedup).
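The square-root relationship can be made concrete with a short sketch (the helper name is ours, chosen for illustration): the optimal number of Grover iterations for searching N items is roughly (π/4)√N, so quadrupling the search space only doubles the quantum query count.

```python
import math

def grover_iterations(n_items: int) -> int:
    # Optimal number of Grover iterations to find one marked item
    # among n_items: roughly (pi / 4) * sqrt(n_items) quantum queries,
    # versus ~n_items / 2 checks on average for classical search.
    return math.floor((math.pi / 4) * math.sqrt(n_items))

for n in (1_000_000, 4_000_000):
    print(n, grover_iterations(n))
# Quadrupling the space only doubles the quantum query count.
```

This is why the speedup is described as quadratic rather than exponential: the search space must still be traversed, just in square-root time.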

Quantum computing is an early-stage technology with numerous risks and limitations (Dyakonov, 2018). The long-term goal of universal quantum computing is not immediate as many challenges including error correction need to be resolved. In the short term, the focus is on solving simple problems in which quantum computers offer an advantage over classical methods through NISQ devices (noisy intermediate-scale quantum devices) (Preskill, 2018).

3.1.1 Breaking RSA encryption

One of the biggest questions is when it might be possible to break existing cryptography standards with quantum computing. The current standard is 2048-bit RSA (Rivest–Shamir–Adleman) encryption, which is widely used for activities such as securely sending credit card details over the internet. Predictions vary as to when it may be possible to break the current standard, meaning to factor the 2048-bit integers used by the RSA method. Although this is not imminent, methods are constantly improving, and readying cryptographic systems for the quantum era is advised.

A 2019 report published by the US National Academies of Sciences predicts that breaking RSA encryption is unlikely within the next decade. The report indicates that any serious applications of quantum computing are at least 10 years away (Grumbling & Horowitz, 2019). Given the current state of quantum computing and the recent rates of progress, “it is highly unexpected that a quantum computer that can compromise RSA 2048 or comparable discrete logarithm-based public key cryptosystems will be built within the next decade” (Grumbling & Horowitz, 2019, 157). The report’s stance on error correction is that “The average error rate of qubits in today’s larger devices would need to be reduced by a factor of 10 to 100 before a computation could be robust enough to support [the required] error correction at scale” (Grumbling & Horowitz, 2019). The report further highlights that “at this error rate, the number of physical qubits held by these devices would need to increase at least by a factor of 10⁵ in order to create a useful number of effective logical qubits” (Grumbling & Horowitz, 2019). A 2016 National Institute of Standards and Technology (NIST) report reaches a similar conclusion, noting that some of the most aggressive estimates predict that quantum computers might be powerful enough to break 2048-bit RSA by 2030, at a potential cost of a billion dollars (Chen et al., 2016, 6).

One complication in making predictions is that the number of required qubits (processing power) needed to factor 2048-bit RSA integers varies by method. Different algorithms need different numbers of qubits (Mavroeidis et al., 2018). Although difficult to guess, “current estimates range from tens of millions to a billion physical qubits” (Mosca, 2018, 39). Newer estimates propose more granularity, indicating in more detail how a quantum computer might perform the calculation with 20 million noisy qubits (without error correction) in just 8 hours (Gidney & Ekera, 2019). (The method relies on modular exponentiation which is the most computationally expensive operation in Shor’s algorithm for factoring.) In 2012, a 4-qubit quantum computer factored the number 143, and in 2014, a similar device factored the number 56,153. However, scaling up quickly is not straightforward because the extent of error correction required is unknown. The recent result from Gidney & Ekera (2019) suggesting that 20 million qubits might be possible without error correction is potentially a landmark step.
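The role of modular exponentiation can be seen in a purely classical sketch of the number theory behind Shor's algorithm (function names are ours; the quantum speedup comes entirely from finding the order r efficiently, which is done here by brute force for toy moduli):

```python
from math import gcd

def order(a: int, N: int) -> int:
    # Smallest r > 0 with a**r = 1 (mod N). Shor's algorithm finds r
    # efficiently using quantum modular exponentiation; this classical
    # loop is exponential in the bit length of N.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_order(N: int, a: int) -> tuple:
    # Classical post-processing: if r is even and a**(r//2) is not
    # -1 mod N, then gcd(a**(r//2) +/- 1, N) are nontrivial factors.
    r = order(a, N)
    y = pow(a, r // 2, N)  # modular exponentiation, the costly step
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_from_order(15, 7))   # (3, 5)
print(factor_from_order(21, 2))   # (7, 3)
```

For a 2048-bit N, the while loop above would run for an astronomically long time; the quantum resource estimates cited in the text are for performing the equivalent of this order-finding step on quantum hardware.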

One of the nearest term remedies for post-quantum security is quantum cryptography, in the form of quantum key distribution (QKD), which has been foreseen in quantum information science roadmaps for some time (Los Alamos National Laboratory (LANL), 2004). QKD is the idea of issuing cryptographic keys generated with quantum computing methods and distributing them via global communications networks, satellite-based and terrestrial. QKD has been experimentally demonstrated and remains to be commercialized. The market for QKD is estimated to reach $980 million by 2024 from $85 million in 2019 (Gasman, 2019).

3.2 Basic Concepts: Bit and Qubit

Quantum computing springs from Nobel physicist Richard Feynman’s intuition that it should be possible to perform very powerful computations by using quantum building blocks. He suggested the idea of simulating physics with computers using a universal quantum simulator.

Feynman worked on many problems at the small scales of the quantum mechanical domain. He famously announced that “There’s plenty of room at the bottom” (Feynman, 1960), referring to the idea of miniaturization at the atomic scale such that the entire 24-volume set of the Encyclopedia Britannica might be printed on the head of a pin. These ideas helped to sponsor the nanotechnology industry in the 1990s and are likewise motivating the current development of quantum computing. To get an idea of scale, the nanometer scale is 10⁻⁹ m, the atomic scale (in terms of the Bohr radius being the probable distance from the nucleus to the electron) is 10⁻¹¹ m, and the electron scale (the size of the electron) is 10⁻¹⁵ m.

Feynman suggested that a quantum computer could be an efficient universal simulator of quantum mechanics. Such a “universal quantum simulator” (Feynman, 1982, 474) would be a different kind of computer that is not a traditional Turing machine. He posits two ways to simulate quantum mechanics with computers. One is reconceiving the notion of computers and building computers out of quantum mechanical elements that obey quantum mechanical laws. The other idea is trying to imitate quantum mechanical systems with classical systems.

Feynman’s key thought is that the more closely computing systems can be built in the structure of nature, the better they can simulate nature. He says that the “the various field theories have the same kind of behavior, and can be simulated in every way, apparently, with little latticeworks of spins and other things” (Feynman, 1982, 474–5). Assuming that the world is made in a discrete lattice, “the phenomena of field theories can be well imitated by many phenomena in solid state theory (which is simply the analysis of a latticework of crystal atoms, and in the case of the kind of solid state I mean each atom is just a point which has numbers associated with it, with quantum mechanical rules)” (Feynman, 1982, 475).

Feynman proposed the idea of the universal quantum simulator in 1982, and other theoretical developments followed. In 1985, David Deutsch proposed a universal quantum computer based on the idea that quantum gates could function in a similar fashion to traditional digital computing binary logic gates (Deutsch, 1985). In 2000, Charles Bennett and David DiVincenzo showed that it is possible to efficiently simulate any classical computation using a quantum computer (Bennett & DiVincenzo, 2000).

Other advances in recent decades have led toward the practical realizability of quantum computers. The first, in the 1990s, was the discovery of quantum error correction. Unlike classical bits that persistently stay in a 1 or 0 state, quantum bits are extremely sensitive to environmental noise and may decohere before they can be used to perform a computation. Quantum error correction overcomes some of the challenges of working in quantum mechanical domains.

The second is that, since 2012, there have been advances in superconducting materials and a proliferation of ways of making qubits, such that quantum systems have increased from 1–2 qubits to 50–100 qubits. A research goal is demonstrating quantum advantage: specific cases in which quantum computing confers an advantage over classical computing.

3.2.1 Quantum computing and classical computing

Quantum information processing is not only a potentially faster means of computing but also a new paradigm in that information is conceived and managed in a completely different way due to the different properties of quantum objects. According to W.D. Phillips, 1997 Nobel Prize winner in physics and NIST scientist, “Quantum information is a radical departure in information technology, more fundamentally different from current technology than the digital computer is from the abacus” (Williams, 2007).

Some of the special properties of quantum objects (be they atoms, ions, or photons) are superposition, entanglement, and interference (SEI properties). Superposition means that particles can exist across all possible states simultaneously. This is known as a superposition of states. For example, an electron may exist in two possible spin states simultaneously, referred to as 1 and 0, or spin-up and spin-down. Entanglement is the situation in which groups of particles are related and can interact in ways such that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance. Across large distances, this is called Bell pair entanglement or nonlocality. Interference relates to the wave-like behavior of particles. Interference can be constructive or destructive: when two waves come together, they either reinforce or diminish each other.
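These properties have direct numerical counterparts. A minimal sketch in plain Python (basis ordering and helper names are ours, chosen for illustration) prepares the canonical entangled Bell pair by applying a Hadamard gate and then a CNOT to two qubits starting in the 0 state:

```python
import math

def apply(gate, state):
    # Matrix-vector product: one circuit step acting on the state.
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

h = 1 / math.sqrt(2)
# Basis order: 00, 01, 10, 11 (left bit = first qubit).
H_on_first = [[h, 0, h, 0],   # Hadamard on the first qubit,
              [0, h, 0, h],   # identity on the second (H tensor I)
              [h, 0, -h, 0],
              [0, h, 0, -h]]
CNOT = [[1, 0, 0, 0],         # flips the second qubit
        [0, 1, 0, 0],         # when the first qubit is 1
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

bell = apply(CNOT, apply(H_on_first, [1, 0, 0, 0]))
# Result ~ (|00> + |11>)/sqrt(2): measuring one qubit instantly
# determines the other, yet neither has a definite state of its own.
print(bell)
```

The final state has amplitude only on 00 and 11, which is exactly the "cannot be described independently" property described above.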

Classical computing is based on electrical conductivity, using Boolean algebra (namely expressions evaluating as true/false, and/or, etc.) to manipulate bits. Quantum computing is based on quantum mechanics, using vectors and linear algebra to manipulate matrices of complex numbers. Aiming toward a universal model of quantum computation, the idea is to package the quantum mechanical matrix manipulations such that they run quantum states that are executed with a set of gates that offer the same kind of Boolean logic as in classical computing.

3.2.2 Bit and qubit

In classical computing, the bit is the fundamental computational unit. The bit is an abstract mathematical entity that is either a 0 or a 1. Computations are constructed as a series of manipulations of 0s and 1s. In the physical world, a bit might be represented in terms of a voltage inside a computer, a magnetic domain on a hard disk, or light in an optical fiber. The qubit (quantum bit) is the equivalent system in quantum mechanics. The qubit is likewise an abstract mathematical entity (a logical qubit), existing in a superposition state of being both a 0 and a 1, until collapsed by measurement at the end of the computation into a classical 0 or 1. The qubit can be instantiated in different ways in the physical world. There are realizations of qubits in atoms, photons, electrons, and other kinds of physical systems. The quantum state of a qubit is a vector in a 2D space: a linear combination of the 1 and the 0, with amplitudes that give the probability of finding it in the 1 or the 0 state. A model of computation can be built up by assigning states closer to the 0 as being 0 and states closer to the 1 as being 1 (when measured).

A bit is always in a state of either 1 or 0. A qubit exists in a state of being both 1 and 0 until it is collapsed into a 1 or a 0 at the end of the computation. A bit is a classical object that exists in an electronic circuit register. A qubit is a quantum object (an atom, photon, or electron) whose state can be pictured as a point on a 3D sphere called the Bloch sphere, with a different probability of being at any particular place on the sphere (and vector coordinates in the X, Y, and Z directions); formally, the state lives in a complex vector space called a Hilbert space. Figure 3.1 shows the physical space of the states of the bit and the qubit.
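The contrast can be expressed in a few lines of plain Python (an illustrative sketch; the function name is ours): a qubit is described by two complex amplitudes whose squared magnitudes give the measurement probabilities, and a classical bit corresponds to the two extreme cases.

```python
import math

def measure_probs(alpha: complex, beta: complex):
    # Born rule: a qubit alpha|0> + beta|1> collapses on measurement
    # to 0 with probability |alpha|^2 and to 1 with probability |beta|^2.
    return abs(alpha) ** 2, abs(beta) ** 2

# A classical bit is one of the two extremes:
print(measure_probs(1, 0))   # definitely 0
# A qubit can sit anywhere in between, e.g. an equal superposition
# (as produced by a Hadamard gate acting on |0>):
h = 1 / math.sqrt(2)
print(measure_probs(h, h))   # 0 or 1 with probability ~0.5 each
```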

The interpretation is that whereas a classical bit is either on or off (in the state of 1 or 0), a qubit can be on and off (1 and 0) at the same time, a property called superposition. One example of this is the spin of the electron in which the two levels can be understood as spin-up and spin-down. Another example is the polarization of a single photon in which the two states can be taken to be the vertical polarization and the horizontal polarization (single photons are often transmitted in communications networks on the basis of polarization). In a classical system, a bit needs to be in one state or the other. However, in a quantum mechanical system, the qubit can be in a coherent superposition of both states or levels of the system simultaneously, a property which is fundamental to quantum mechanics and indicates the greater potential range of computation in quantum systems.

Figure 3.1. Potential states of bit and qubit.

Compared to classical states, quantum states are much richer and have more depth. Superposition means that quantum states can have weight in all possible classical states. Each step in the execution of a quantum algorithm mixes the states into more complex superpositions. For example, starting with qubits in the state 0–0–0 might lead to a superposition of 1–0–0, 1–0–1, and 1–1–1. Then each of the three parts of the superposition state branches out into even more states. This expanding state space suggests how quantum computers could allow faster problem solving than is available with classical computers.
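The branching described above grows exponentially: n qubits span 2ⁿ basis states at once. A small sketch (the helper name is ours) shows the state-count growth for the uniform superposition produced by applying a Hadamard to every qubit:

```python
import math

def uniform_superposition(n_qubits: int):
    # Applying a Hadamard to each of n qubits in the all-zero state
    # yields an equal superposition over all 2**n basis states.
    dim = 2 ** n_qubits
    return [1 / math.sqrt(dim)] * dim

state = uniform_superposition(3)
print(len(state))                  # 8 amplitudes from just 3 qubits
print(sum(a * a for a in state))   # probabilities sum to ~1
```

Fifty qubits already span about 10¹⁵ amplitudes, which is why even modest qubit counts are hard to simulate classically.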

3.2.3 Creating qubits

A qubit can be created in any quantum system which has two levels of energy that can be manipulated (Steane, 1997). Qubits can be conceived as being similar to harmonic oscillators at the macroscale. Physical systems that vibrate in a wave-like form between two levels of energy are called harmonic oscillators. Some examples include electrical circuits with oscillating current, sound waves in gas, and pendulums. Harmonic oscillators can be modeled as a wave function that cycles between the peak and trough energy levels. The same wave function concept is true at the quantum scale. In this sense, whenever there is a quantum system with two levels of energy, it can be said to be a qubit and possibly engaged as a two-state quantum device. This implies that there can be many different ways of building qubits. Hence, the method for creating qubits might be an engineering choice similar to the way that different methods have been used in classical computing for the physical implementation of logic gates (methods have ranged over time and included vacuum tubes, relays, and most recently integrated circuits).

3.3 Quantum Hardware Approaches

3.3.1 The DiVincenzo criteria

The DiVincenzo criteria have been proposed as standards that constitute the five elements of producing a well-formed quantum computer (DiVincenzo, 2000). The criteria are having (1) a scalable system of well-characterized qubits, (2) qubits that can be initialized with fidelity (typically to the zero state), (3) qubits that have a long-enough coherence time for the calculation (with low error rates), (4) a universal set of quantum gates (that can be implemented in any system), and (5) the capability of measuring any specific qubit in the ending result.

There are several approaches to quantum computing (Table 3.1) (McMahon, 2018). Those with the most near-term focus are superconducting circuits, ion trapping, topological matter, and quantum photonics. Irrespective of the method, the objective is to produce quantum computing chips that perform computations with qubits, using a series of quantum logic gates that are built into quantum circuits, whose operation is programmed with quantum algorithms. Quantum systems may be accessed locally or as a cloud service. As of June 2019, one method, superconducting circuits, is commercially available. Verification of computational claims is a considerable concern. External parties such as academic scientists are engaged to confirm, verify, and benchmark the results of different quantum systems, for example, for Google (Villalonga et al., 2019) and for IonQ (Murali et al., 2019).

Table 3.1. Quantum computing hardware platforms.


3.3.2 Superconducting circuits: Standard gate model

The most prominent approach to quantum computing is superconducting circuits. Qubits are formed by an electrical circuit with oscillating current and controlled by electromagnetic fields. Superconductors are materials which have zero electrical resistance when cooled below a certain temperature. (In fact, it is estimated that more than half of the basic elements in the periodic table become superconducting if they are cooled to sufficiently low temperatures.) Mastering superconducting materials could be quite useful since, as a general rule, about 20% of electricity is lost due to resistance. The benefit of zero electrical resistance for quantum computing is that electrons can travel completely unimpeded without any energy dissipation. When the temperature drops below the critical level, two electrons (which usually repel each other) form a weak bond and become a so-called Cooper pair, which experiences no resistance when tunneling through metal and which can be manipulated in quantum computing.

Superconducting materials are used in quantum computing to produce superconducting circuits that look architecturally similar to classical computing circuits, but are made from qubits. There is an electrical circuit with oscillating current in the shape of a superconducting loop that has the circulating current and a corresponding magnetic field that can hold the qubits in place. Current is passed through the superconducting loop in both directions to create the two states of the qubit. More technically, the superconducting loop is a superconducting quantum interference device (SQUID) magnetometer (a device for measuring magnetic fields), which has two superconductors separated by thin insulating layers to form two parallel Josephson junctions. Josephson junctions are key to quantum computing because they are nonlinear superconducting inductors that create the energy levels needed to make a distinct qubit.

Specifically, the nonlinearity of the Josephson inductance breaks the degeneracy of the energy-level spacings, allowing the dynamics of the system to be restricted to only the two qubit states. The Josephson junctions are necessary to produce the qubits; otherwise, the superconducting loop would just be a circuit. The point is that the linear inductors in a traditional circuit are replaced with the Josephson junction, a nonlinear element that produces energy levels with different spacings from each other that can be used as a qubit. Brian Josephson (after whom the Josephson junction is named) was awarded the Nobel Prize in Physics in 1973 for work predicting the tunneling behavior of superconducting Cooper pairs.
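Why the nonlinearity matters can be seen numerically. In a plain harmonic oscillator all energy gaps are equal, so a drive pulse tuned to the 0-to-1 transition would also excite higher levels; the Josephson nonlinearity makes the gaps unequal, isolating the lowest two levels as a distinct qubit. A sketch with an arbitrary illustrative anharmonicity constant (not a real device parameter):

```python
def spacings(levels):
    # Gaps between consecutive energy levels.
    return [b - a for a, b in zip(levels, levels[1:])]

# Harmonic oscillator: E_n = n + 1/2 (in units of h-bar * omega),
# so every transition has exactly the same energy.
harmonic = [n + 0.5 for n in range(4)]
# With a nonlinear (Josephson-like) correction term, the gaps differ,
# so the 0 <-> 1 transition can be driven selectively as a qubit.
anharmonic = [n + 0.5 - 0.05 * n * (n + 1) for n in range(4)]

print(spacings(harmonic))    # all gaps equal: 1.0, 1.0, 1.0
print(spacings(anharmonic))  # unequal gaps: ~0.9, ~0.8, ~0.7
```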

As an example of a superconducting system, Google’s qubits are electrical oscillators constructed from aluminum (niobium is also used), which becomes superconducting when cooled to below 1 K (−272°C). The oscillators store small amounts of electrical energy. When the oscillator is in the 0 state, it has zero energy, and when the oscillator is in the 1 state, it has a single quantum of energy. The two states of the oscillator with 0 or 1 quantum of energy are the logical states of the qubit. The resonance frequency of the oscillators is 6 gigahertz (which corresponds to 300 millikelvin) and sets the energy differential between the 0 and 1 states. The frequency is low enough so that control electronics can be built from readily available commercial components and also high enough so that the ambient thermal energy does not scramble the oscillation and introduce errors. In another example, Rigetti has a different architecture. This system consists of a single Josephson junction qubit on a sapphire substrate. The substrate is embedded in a copper waveguide cavity. The waveguide is coupled to qubit transitions to perform quantum computations (Rigetti et al., 2012).

3.3.2.1 Superconducting materials

Superconducting materials are an active area of ongoing research (Table 3.2). The discovery of “high-temperature superconductors” in 1986 led to the feasibility of using superconducting circuits in quantum computing (and to the 1987 Nobel Prize in Physics) (Bednorz & Muller, 1986). Before high-temperature superconductors, ordinary superconductors were known materials that become superconducting at critical temperatures below 30 K (−243°C) when cooled with liquid helium. High-temperature superconductors constitute advanced materials because transition temperatures can be as high as 138 K (−135°C), and materials can be cooled to superconductivity with liquid nitrogen instead of helium. Initially, only certain compounds of copper and oxygen were found to have high-temperature superconducting properties (for example, varieties of copper oxide compounds such as bismuth strontium calcium copper oxide and yttrium barium copper oxide). However, since 2008, several compounds based on other metals (such as iron, aluminum, copper, and niobium) have been found to be superconducting at high temperatures too.

Table 3.2. Superconducting materials.


Also of experimental interest is a new class of hydrogen-based “room-temperature superconductors” (i.e. warmer than ever before) that have been discovered with high-pressure techniques. In 2015, hydrogen sulfide subjected to extremely high pressure (about 150 gigapascals) was found to have a superconducting transition near 203 K (−70°C) (Drozdov et al., 2015). In 2019, another project produced evidence for superconductivity above 260 K (−13°C) in lanthanum superhydride at megabar pressures (Somayazulu et al., 2019). Although experimentally demonstrated, such methods are far from development into practical use due to the specialized conditions required (a small amount of material is pressed between two high-pressure diamond points (Zurek, 2019)).

3.3.3 Superconducting circuits: Quantum annealing machines

Within the superconducting circuits approach to quantum computing, there are two architectures, the standard gate model (described above) and quantum annealing (invented first, but more limited). The two models are used for solving different kinds of problems. The universal gate model connotes a general-purpose computer, whereas the annealing machine is specialized. Quantum annealing machines have superconducting qubits with programmable couplings that are designed to solve QUBO problems (quadratic unconstrained binary optimization), a known class of NP-hard optimization problems that minimize a quadratic polynomial over binary variables.

In quantum annealing, the aim is to harness the natural evolution of quantum states over time. A problem is set up at the beginning and then the system runs such that quantum physics takes its natural evolutionary course. There is no control during the system’s evolution, and ideally, the ending configuration corresponds to a useful answer to the problem. As compared with quantum annealing, the gate model aims to more fully control and manipulate the evolution of quantum states during the operation. This is more difficult given the sensitivity of quantum mechanical systems, but having more control implies that a bigger and more general range of problems can be solved. The difference in approach explains why quantum annealing machines appeared first and have been able to demonstrate 2048 qubits, whereas only 30–70 qubits are currently achieved in the standard gate model.

Quantum annealing is an energy-based model related to the idea of using the quantum fluctuations of spinning atoms to find the lowest energy state of a system (Kadowaki & Nishimori, 1998). Annealing refers to the centuries-old technique used by blacksmiths to forge iron. In the thermal annealing process, the iron becomes uniformly hot enough that the atoms settle into the lowest-energy configuration, which makes the strongest material. Similarly, quantum annealing is based on the idea of finding the lowest energy configuration of a system.

Quantum annealing is deployed as a method for solving optimization problems by using quantum adiabatic evolution to find the ground state of a system (adiabatic means heat does not enter or leave the system). Run on a quantum computer, the quantum annealing process starts from a ground state which is the quantum mechanical superposition of all possible system states with equal weights. The system then evolves per the time-dependent Schrödinger equation in a natural physical evolution to settle in a low-energy state. The computational problem to be solved is framed in terms of an energy optimization problem in which the low-energy state signals the answer. (The quantum annealing process is described in more detail in Chapter 10.)

Overall, the quantum annealing process allows the system of spins (spinning atoms of qubits) to find a low-energy state. Superconducting circuits in the quantum annealing model can be thought of as programmable annealing engines (Kaminsky & Lloyd, 2004). Optimization problems are framed such that they can be instantiated in the form of an energy landscape minimization. Although annealing machines are not general-purpose quantum computers, one advantage is that since annealing systems constantly attempt to reach the lowest energy state, they are more tolerant and resistant to noise than gate model systems and may require much less error correction at large scales.

3.3.4 Ion trapping

Another prominent approach to quantum computing is trapped ions. In these quantum chips, ions are stored in electromagnetic traps and manipulated by lasers and electromagnetic fields. Ions are atoms that have lost or gained electrons, leaving them positively or negatively charged and therefore more easily manipulable. The advantage of ion trap qubits is that they have a long coherence time (making calculations easier) and (like annealing machines) may require less error correction at large scales. A single ion trap may accommodate 30–100 qubits, and 23 qubits have been demonstrated in a research context (Murali et al., 2019).

The IonQ quantum chip uses ytterbium ions, which unlike superconducting qubits, do not need to be supercooled to operate. Bulky cryogenic equipment is not required, and the entire system occupies about one cubic meter, as opposed to a much larger footprint for superconducting circuit machines. The chip is a few millimeters across. It is fabricated with silicon and contains 100 electrodes that confine and control the ions in an ultrahigh-vacuum environment.

To operate, the ion trap quantum computer holds the ions in a geometrical array (a linear array for IonQ). Laser beams encode and read information to and from individual ions by causing transitions between the electronic states of the ion. The ions influence each other through electrostatic interactions, and their coupling can be controlled. More specifically, the IonQ ions form a crystal structure because they repel each other (since they are all the same isotope of the same element, ytterbium-171). The electrodes underneath the ions hold the charged particles together in a linear array by applying electrical potentials. The lasers initialize the qubits, entangle them through coupling, and produce quantum logic gates to execute the computation. At the end of the computation, another laser causes ions to fluoresce if they are in a certain qubit state. The fluorescence is collected to measure each qubit and compute the result of the computation. One design principle is already becoming clear in such ion trap systems: the number of qubits scales as the square root of the number of gates.

3.3.5 Majorana fermions and topological quantum computing

An interesting and somewhat exotic approach for building a universal quantum computer is Majorana fermions. Qubits are made from particles in topological superconductors and electrically controlled in a computational model based on their movement trajectories (called “braiding”). One of the main benefits of topological quantum computing is physical error correction (error correction performed in the hardware, not later by software). The method indicates very low initial error rates as compared with other approaches (Freedman et al., 2002).

Topological superconductors are novel classes of quantum phases that arise in condensed matter, characterized by structures of Cooper pairing states (i.e. quantum computable states) that appear on the topology (the edge and core) of the superconductor (hence the name topological superconductors). The Cooper pairing states are a special class of matter called Majorana fermions (particles identified with their own antiparticles). Topological invariants constrained by the symmetries of the systems produce the Majorana fermions and ensure their stability.

As the Majorana fermions bounce around, their movement trajectories resemble a braid made out of different strands. The braids are wave functions that are used to develop the logic gates in the computation model (Wang, 2010). Majorana fermions appear in particle–antiparticle pairs and are assigned to quantum states or modes. The computation model is built up around the exchange of the so-called Majorana zero modes in a sequential process. The sequentiality of the process is relevant as changing the order of the exchange operations of the particles changes the final result of the computation. This feature is called non-Abelian, denoting that the steps in the process are non-commuting (non-exchangeable with one another). Majorana zero modes obey a new class of quantum statistics, called non-Abelian statistics, in which the exchange operations of particles are non-commutative.

The Majorana zero modes (modes indicate a specific state of a quantum object related to spin, charge, polarization, or other parameter) are an important and unique state of the Majorana fermionic system (unlike other known bosonic and fermionic matter phases). The benefit of the non-Abelian quantum statistics of the Majorana zero modes is that they can be employed for wave function calculations, namely to average over the particle wave functions in sequential order. The sequential processing of particle wave function behavior is important for constructing efficient logic gates for quantum computation. Researchers indicate that well-separated Majorana zero modes should be able to manifest non-Abelian braiding statistics suitable for unitary gate operations for topological quantum computation (Sarma et al., 2015).

Majorana fermions have so far been realized only under the specialized conditions of temperatures close to 1 K (−272°C) and high magnetic fields. However, there are recent proposals for more reliable platforms for producing Majorana zero modes (Robinson et al., 2019) and for generating more robust Majorana fermions in general (Jack et al., 2019).

3.3.6 Quantum photonics

Qubits are formed from either matter (atoms or ions) or light (photons). Quantum photonics is an important approach to quantum computing given its potential linkage to optical networks, since global communications networks are based on photonic transfer. In quantum photonics, single photons or squeezed states of light in silicon waveguides are used to represent qubits, and they are controlled in a computational model through cluster states (entangled states of multiple photons). Quantum photonics can be realized in computing chips or in free space. Single photons or squeezed states of light are sent through the chip or the free space for the computation and are then measured with photon detectors at the other end.

For photonic quantum computing, a cluster state of entangled photons must be produced. The cluster state is a resource state of multidimensional, highly entangled qubits, and there are various ways of generating and using it (Rudolph, 2016). The general process is to produce photons, entangle them, compute with them, and measure the result. One way of generating cluster states is in lattices of qubits with Ising-type interactions. Lattices translate well into computation. Cluster states are represented as graph states, in which the underlying graph is a connected subset of a d-dimensional lattice. The graph states are then instantiated as a computation graph with directed operations to perform the computation.
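The construction of a small cluster state can be sketched in a few lines of linear algebra. This is a minimal illustration under standard graph-state conventions (not from the original text): a three-qubit linear cluster state is built by preparing each qubit in |+> and applying controlled-Z along the lattice edges, then checked against its graph-state stabilizers.

```python
import numpy as np
from functools import reduce

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1, -1])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def kron(*ops):
    """Kronecker product of several operators (qubit 0 is leftmost)."""
    return reduce(np.kron, ops)

def cz(n, a, b):
    """Controlled-Z between qubits a and b of an n-qubit register."""
    dim = 2 ** n
    U = np.eye(dim)
    for i in range(dim):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[a] and bits[b]:
            U[i, i] = -1
    return U

# Linear three-qubit cluster state: |+>|+>|+>, then CZ on edges (0,1), (1,2).
plus = H @ np.array([1, 0])
state = cz(3, 1, 2) @ cz(3, 0, 1) @ kron(plus, plus, plus)

# Graph-state stabilizers K_i = X_i * prod(Z over neighbors) fix the state.
K0 = kron(X, Z, I)
K1 = kron(Z, X, Z)
K2 = kron(I, Z, X)
for K in (K0, K1, K2):
    print(np.allclose(K @ state, state))   # True for each stabilizer
```

The CZ interactions play the role of the Ising-type couplings mentioned above; once the cluster state exists, the computation itself proceeds purely by single-qubit measurements on it.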

3.3.6.1 Photonic time speed-ups

The usual speed-up in quantum computing compared with classical computing is due to the superposition of 0s and 1s, in that the quantum circuit can process 0s and 1s at the same time. This provides massive parallelism by processing all of the problem inputs at once. Photonics allows an additional speed-up beyond the regular speed-up of quantum computing. In photonic quantum computing, superposition can be used not only for problem inputs but also for processing gates (Procopio et al., 2015). Standard quantum architectures have fixed gate arrangements, whereas photonic quantum architectures allow the gate order to be superposed as well. This means that when computations are executed, they run through circuits that are themselves in superposition. The potential computational benefit of the superposition of optical quantum circuits is an exponential advantage over classical algorithms and a linear advantage over regular quantum algorithms.
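Superposing gate order can be sketched with the "quantum switch" construction associated with experiments of the Procopio et al. type. The following is a minimal numpy illustration (an assumption-laden sketch, not the authors' optical implementation): a control qubit determines whether gate U is applied before or after gate V, and putting the control in superposition applies both orderings coherently.

```python
import numpy as np

# Two single-qubit gates whose order matters (they do not commute).
U = np.array([[0, 1], [1, 0]])     # Pauli-X
V = np.diag([1, -1])               # Pauli-Z

P0 = np.diag([1, 0])               # control projector |0><0|
P1 = np.diag([0, 1])               # control projector |1><1|

# Quantum switch: control |0> applies U then V, control |1> applies V then U.
W = np.kron(P0, V @ U) + np.kron(P1, U @ V)

# The switch is unitary, so a superposed gate order is a valid circuit.
print(np.allclose(W @ W.conj().T, np.eye(4)))   # True

# With the control in |+>, both gate orderings act in superposition.
plus = np.array([1, 1]) / np.sqrt(2)
psi = np.array([1, 0], dtype=complex)
out = W @ np.kron(plus, psi)
```

Measuring the control in the |+>/|−> basis afterwards reveals whether U and V commute in a single run, which is the kind of query saving behind the claimed advantage.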

3.3.7 Neutral atoms, diamond defects, quantum dots, and nuclear magnetic resonance

Overall, there are many methods for generating qubits and computing with them (Table 3.3). In addition to the four main approaches (superconducting circuits, ion traps, Majorana fermions, and photonics), four additional approaches are discussed briefly. These include neutral atoms, diamond defects (nitrogen-vacancy defect centers), quantum dots, and nuclear magnetic resonance (NMR).

3.3.7.1 Neutral atoms

An early-stage approach to quantum computing is neutral atoms. Neutral atoms are regular uncharged atoms with balanced numbers of protons and electrons, as opposed to ions, which are charged because they have had an electron stripped away or added. Qubits are produced by exciting neutral atoms trapped in optical lattices or optical arrays, and the qubits are controlled in computation by another set of lasers. An optical lattice is made with interfering laser beams from multiple directions that hold the atoms in wells (an egg carton-shaped structure); another method holds the atoms in an array with optical tweezers. Unlike ions (which have strong interactions and repel each other), neutral atoms can be held in close confinement with each other and manipulated in computation. Atoms such as cesium and rubidium are excited into Rydberg states from which they can be manipulated to perform computation (Saffman, 2016). Researchers have been able to accurately program a two-rubidium-atom logic gate 97% of the time with the neutral atoms approach (Levine et al., 2018), as compared to 99% fidelity with superconducting qubits. A 3D array of 72 neutral atoms has also been demonstrated (Barredo et al., 2018).

Table 3.3. Qubit types by formation and control parameters.

Qubit type | Qubit formation (DiVincenzo criterion #1) | Qubit control for computation (DiVincenzo criteria #2–5)
1. Superconducting circuits | Electrical circuit with oscillating current | Electromagnetic fields and microwave pulses
2. Trapped ions | Ion (atom stripped of one electron) | Ions stored in electromagnetic traps and manipulated with lasers
3. Majorana fermions | Topological superconductors | Electrically controlled along non-Abelian “braiding” path
4. Photonic circuits | Single photons (or squeezed states) in silicon waveguides | Marshalled cluster state of multidimensional entangled qubits
5. Neutral atoms | Electronic states of atoms trapped by laser-formed optical lattice | Controlled by lasers
6. Quantum dots | Electron spins in a semiconductor nanostructure | Microwave pulses
7. Diamond center defects | Defect has an effective spin; the two levels of the spin define a qubit | Microwave fields and lasers

Source: Adapted from McMahon (2018).

3.3.7.2 Diamond defects (nitrogen-vacancy defect centers)

An interesting approach, although one that may have scalability challenges for commercial deployment, is diamond center defects. Imperfections in the crystal lattice within diamonds are commonplace and have been exploited for a variety of uses, from crystallography to the development of novel quantum devices. Defects may be the result of natural lattice irregularities or artificially introduced impurities. For quantum computing, impurities are introduced by implanting ions to make nitrogen-vacancy photonic centers. A nitrogen vacancy can be created in a diamond crystal by knocking out a carbon atom and replacing it with a nitrogen atom, and then knocking out a neighboring carbon atom so that there is a vacant site. The nitrogen vacancy produces a so-called F-center (from the German Farbe, “color”), a defect in a crystal lattice that is occupied by an unpaired electron. The unpaired electron creates an effective spin that can be manipulated as a qubit. The nitrogen-vacancy defect center is attractive for quantum computing because it produces a robust quantum state that can be initialized, manipulated, and measured with high fidelity at room temperature (Haque & Sumaiya, 2017).

3.3.7.3 Quantum dots

Another early-stage, semiconductor-based approach is quantum dots (nanoparticles of semiconducting material) (Loss & DiVincenzo, 1998). In this method, electrically controlled quantum dots that can be used as qubits are created from electron spins trapped in a semiconductor nanostructure, and electrical pulses are then used to control them for computation. A semiconductor-based structure is fabricated that is similar to that of classical processors. Metal electrodes are patterned on the semiconductor layer so that electrostatic fields from the wires can trap single electrons. The spin degrees of freedom of the electrons are used as qubits. Within the semiconductor nanostructure, there are small silicon chambers that keep the electron in place long enough to hybridize its charge and spin and to manipulate the electron spin–orbit interactions for computation (Petta et al., 2005). Coherence typically lasts longer in silicon than in other materials, but can be difficult to control. There has been some improvement in controlling qubit decoherence in quantum dot computing models (Kloeffel & Loss, 2013).

3.3.7.4 Nuclear magnetic resonance

Nuclear magnetic resonance (NMR) is one of the earliest approaches to quantum computing, but is seen as difficult to scale for commercial purposes. NMR uses the same technology that is used in medical imaging. The physics principle is that since atomic nuclei have spin and electric charge, they may be controlled through the application of an external magnetic field. In 2001, IBM demonstrated the first experimental realization of Shor’s factoring algorithm, using NMR (Vandersypen et al., 2001): a 7-qubit circuit performed the simplest instance of the algorithm by factoring the number 15 (into its prime factors of 3 and 5).
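The arithmetic behind that instance can be checked classically. A small sketch (not the NMR implementation itself) of the number theory underlying Shor's algorithm for N = 15: find the period r of a^x mod N, then read the factors from gcd(a^(r/2) ± 1, N).

```python
from math import gcd

def period(a, N):
    """Smallest r > 0 with a**r % N == 1. Found by brute force here;
    the quantum algorithm finds r efficiently via phase estimation."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7                  # a must be coprime to N
r = period(a, N)              # r = 4 for a = 7, N = 15
p = gcd(a ** (r // 2) - 1, N)
q = gcd(a ** (r // 2) + 1, N)
print(r, p, q)                # 4 3 5
```

The period-finding step is the only part that requires a quantum computer; the reduction from factoring to period finding, shown above, is purely classical.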

References

Barredo, D., Lienhard, V., Leseleuc, S. et al. (2018). Synthetic three-dimensional atomic structures assembled atom by atom. Nature 561:79–82.

Bednorz, J.G. & Muller, K.A. (1986). Possible high TC superconductivity in the Ba-La-Cu-O system. Zeit. Phys. B. 64(2):189–93.

Bennett, C.H. & DiVincenzo, D.P. (2000). Quantum information and computation. Nature 404:247–55.

Chen, L., Jordan, S., Liu, Y.-K. et al. (2016). Report on post-quantum cryptography. NIST Interagency Report 8105.

Deutsch, D. (1985). Quantum theory, the Church-Turing principle and the universal quantum computer. Proc. Roy. Soc. Lond. A. 400(1818):97–117.

DiVincenzo, D.P. (2000). The physical implementation of quantum computation. Fortschrit. Phys. 48(9–11):771–83.

Drozdov, A.P., Eremets, M.I., Troyan, I.A. et al. (2015). Conventional superconductivity at 203 kelvin at high pressures in the sulfur hydride system. Nature 525(7567):73–6.

Dyakonov, M. (2018). The case against quantum computing: The proposed strategy relies on manipulating with high precision an unimaginably huge number of variables. IEEE Spectr.

Feynman, R.P. (1960). There’s plenty of room at the bottom. Eng. Sci. 23(5):22–36.

Feynman, R.P. (1982). Simulating physics with computers. Int. J. Theor. Phys. 21(6):467–88.

Freedman, M.H., Kitaev, A., Larsen, M.J. & Wang, Z. (2002). Topological quantum computation. arXiv:quant-ph/0101025.

Gasman, L. (2019). Quantum key distribution (QKD) markets: 2019 to 2028. Inside Quantum Technology Report.

Gidney, C. & Ekera, M. (2019). How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits. arXiv:1905.09749 [quant-ph].

Grumbling, E. & Horowitz, M. (2019). Quantum Computing: Progress and Prospects. Washington, DC: US National Academies of Sciences.

Haque, A. & Sumaiya, S. (2017). An overview on the formation and processing of nitrogen-vacancy photonic centers in diamond by ion implantation. J. Manuf. Mater. Process. 1(1):6.

Jack, B., Xie, Y., Li, J. et al. (2019). Observation of a Majorana zero mode in a topologically protected edge channel. Science 364(6447):1255–59.

Kadowaki, T. & Nishimori, H. (1998). Quantum annealing in the transverse Ising model. Phys. Rev. E. 58(5355).

Kaminsky, W.M. & Lloyd, S. (2004). Scalable architecture for adiabatic quantum computing of NP-hard problems. In: Leggett A.J., Ruggiero B. & Silvestrini P. (Eds). Quantum Computing and Quantum Bits in Mesoscopic Systems. Boston, MA: Springer.

Kloeffel, C. & Loss, D. (2013). Prospects for spin-based quantum computing in quantum dots. Annu. Rev. Conden. Matt. Phys. 4:51–81.

Levine, H., Keesling, A., Omran, A. et al. (2018). High-fidelity control and entanglement of Rydberg-atom qubits. Phys. Rev. Lett. 121(123603).

Los Alamos National Laboratory (LANL). (2004). A Quantum Information Science and Technology Roadmap. LA-UR-04-1778.

Loss, D. & DiVincenzo, D.P. (1998). Quantum computation with quantum dots. Phys. Rev. A. 57(1):120–26.

Mavroeidis, V., Vishi, K., Zych, M.D. & Josang, A. (2018). The impact of quantum computing on present cryptography. Int. J. Adv. Comp. Sci. App. 9(3):1–10.

McMahon, P. (2018). Quantum Computing Hardware Landscape. San Jose, CA: QC Ware.

Mosca, M. (2018). Cybersecurity in an era with quantum computers: will we be ready? IEEE Secur. Priv. 16(5):38–41.

Murali, P., Linke, M. & Martonosi, M. (2019). Full-Stack, Real-System Quantum Computer Studies: Architectural Comparisons and Design Insights. International Symposium on Computer Architecture (ISCA), 2019, pp. 1–14.

Petta, J.R., Johnson, A.C., Taylor, J.M. et al. (2005). Coherent manipulation of coupled electron spins in semiconductor quantum dots. Science 309(5744): 2180–84.

Preskill, J. (2018). Quantum computing in the NISQ era and beyond. Quantum 2(79):1–20.

Procopio, L.M., Moqanaki, A., Araujo, M. et al. (2015). Experimental superposition of orders of quantum gates. Nat. Commun. 6(7913):1–6.

Rigetti, C., Poletto, S., Gambetta, J.M. et al. (2012). Superconducting qubit in waveguide cavity with coherence time approaching 0.1 ms. Phys. Rev. B. 86:100506(R).

Robinson, N.J., Altland, A., Egger, R. et al. (2019). Nontopological Majorana zero modes in inhomogeneous spin ladders. Phys. Rev. Lett. 122(2):027201.

Rudolph, T. (2016). Why I am optimistic about the silicon-photonic route to quantum computing. arXiv:1607.08535 [quant-ph].

Saffman, M. (2016). Quantum computing with atomic qubits and Rydberg interactions: progress and challenges. J. Phys. B: Atom. Mol. Opt. Phys. 49(202001):1–27.

Sarma, S.D., Freedman, M. & Nayak, C. (2015). Majorana zero modes and topological quantum computation. NPJ Quantum Inf. 1(15001).

Somayazulu, M., Ahart, M., Mishra, A.K. et al. (2019). Evidence for superconductivity above 260 K in lanthanum superhydride at megabar pressures. Phys. Rev. Lett. 122(027001).

Steane, A. (1997). Quantum computing. arXiv:quant-ph/9708022.

Vandersypen, L.M.K., Steffen, M., Breyta, G. et al. (2001). Experimental realization of Shor’s quantum factoring algorithm using nuclear magnetic resonance. Nature 414:883–7.

Villalonga, B., Boixo, S. & Nelson, B. (2019). A flexible high-performance simulator for the verification and benchmarking of quantum circuits implemented on real hardware. arXiv:1811.09599 [quant-ph].

Wang, Z. (2010). Topological Quantum Computation. Providence, RI: American Mathematical Society.

Williams, C.J. (2007). Quantum Information Science, NIST, and Future Technological Implications. Gaithersburg, MD: National Institute of Standards and Technology.

Zurek, E. (2019). Viewpoint: pushing towards room-temperature superconductivity. APS Phys. 12(1).
