      Quantum computing progress: Higher temps, better error correction

      news.movim.eu / ArsTechnica · Wednesday, 27 March - 22:24 · 1 minute

    [Image: Conceptual graphic of symbols representing quantum states floating above a stylized computer chip (credit: vital)]

    There's a strong consensus that tackling most useful problems with a quantum computer will require that the computer be capable of error correction. There is absolutely no consensus, however, about what technology will allow us to get there. A large number of companies, including major players like Microsoft, Intel, Amazon, and IBM, have committed to different technologies, while a collection of startups is exploring an even wider range of potential solutions.

    We probably won't have a clearer picture of what's likely to work for a few years. But there's going to be lots of interesting research and development work between now and then, some of which may ultimately represent key milestones in the development of quantum computing. To give you a sense of that work, we're going to look at three papers that were published within the last couple of weeks, each of which tackles a different aspect of quantum computing technology.

    Hot stuff

    Error correction will require connecting multiple hardware qubits to act as a single unit termed a logical qubit. This spreads a single bit of quantum information across multiple hardware qubits, making it more robust. Additional qubits are used to monitor the behavior of the ones holding the data and perform corrections as needed. Some error correction schemes require over a hundred hardware qubits for each logical qubit, meaning we'd need tens of thousands of hardware qubits before we could do anything practical.
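
    To make the idea concrete, here is a minimal sketch (my own illustration, not from the article) of the simplest such scheme, a three-qubit repetition code, simulated classically for bit-flip errors. The principle is the one larger codes build on: ancilla parity checks reveal where an error struck without reading out the encoded data itself.

```python
import random

def encode(logical_bit):
    """Spread one logical bit across three physical bits."""
    return [logical_bit] * 3

def apply_noise(bits, flip_prob=0.05):
    """Each physical bit flips independently with probability flip_prob."""
    return [b ^ int(random.random() < flip_prob) for b in bits]

def syndrome(bits):
    """The parities an ancilla would measure: neighbor comparisons only,
    never a direct readout of the data bits themselves."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Flip whichever single bit the syndrome implicates, if any."""
    fix = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    s = syndrome(bits)
    if s in fix:
        bits[fix[s]] ^= 1
    return bits

# One round: encode, suffer noise, correct, read out the logical bit.
block = correct(apply_noise(encode(1)))
print(block[0])  # 1, unless two or more bits flipped in the same round
```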

      IBM adds error correction to updated quantum computing roadmap

      news.movim.eu / ArsTechnica · Monday, 4 December - 15:40 · 1 minute

    [Image: The family portrait of IBM's quantum processors, with the two new arrivals (Heron and Condor) at right (credit: IBM)]

    On Monday, IBM announced that it has produced the two quantum systems that its roadmap had slated for release in 2023. One of these is based on a chip named Condor, which is the largest transmon-based quantum processor yet released, with 1,121 functioning qubits. The second is based on a combination of three Heron chips, each of which has 133 qubits. Smaller chips like Heron and its successor, Flamingo, will play a critical role in IBM's quantum roadmap—which also got a major update today.

    Based on the update, IBM expects to have error-corrected qubits working by the end of the decade, enabled by improvements to individual qubits made over several iterations of the Flamingo chip. While these systems probably won't place things like existing encryption schemes at risk, they should be able to reliably execute quantum algorithms that are far more complex than anything we can do today.

    We talked with IBM's Jay Gambetta about everything the company is announcing today, including existing processors, future roadmaps, what the machines might be used for over the next few years, and the software that makes it all possible. But to understand what the company is doing, we have to back up a bit to look at where the field as a whole is moving.

      Qubits 30 meters apart used to confirm Einstein was wrong about quantum

      news.movim.eu / ArsTechnica · Wednesday, 10 May, 2023 - 18:12 · 1 minute

    [Image: The quantum network is a bit bulkier than Ethernet; a long metallic pipe extends down a hallway lit in blue (credit: ETH Zurich / Daniel Winkler)]

    A new experiment uses superconducting qubits to demonstrate that quantum mechanics violates what's called local realism by allowing two objects to behave as a single quantum system no matter how large the separation between them. The experiment wasn't the first to show that local realism isn't how the Universe works—it's not even the first to do so with qubits.

    But it's the first to separate the qubits by enough distance to ensure that light isn't fast enough to travel between them while measurements are made. And it did so by cooling a 30-meter-long aluminum wire to just a few microkelvin. Because the qubits are so easy to control, the experiment brings new precision to these sorts of measurements. And the hardware setup may be essential for future quantum computing efforts.
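
    The distance requirement is simple arithmetic (mine, not the article's): light needs roughly 100 nanoseconds to cross 30 meters, so each measurement must finish inside that window to rule out any light-speed signal from the far qubit.

```python
# Back-of-the-envelope check of the timing constraint behind the
# 30-meter separation: the light-travel time sets the measurement budget.
c = 299_792_458               # speed of light, m/s
separation = 30               # distance between the qubits, m
budget_ns = separation / c * 1e9
print(f"{budget_ns:.0f} ns")  # ~100 ns to complete each measurement
```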

    Getting real about realism

    Albert Einstein was famously uneasy with some of the consequences of quantum entanglement. If quantum mechanics were right, then a pair of entangled objects would behave as a single quantum system no matter how far apart the objects were. Altering the state of one of them should instantly alter the state of the second, with the change seemingly occurring faster than light could possibly travel between the two objects. This, Einstein argued, almost certainly had to be wrong.
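
    What experiments like this actually quantify is a Bell inequality, usually in its CHSH form. A short sketch with standard textbook values (not numbers from this paper): any local-realist theory keeps the CHSH combination within |S| ≤ 2, while quantum mechanics on an entangled pair reaches 2√2 ≈ 2.83.

```python
import math

def E(a, b):
    # Correlation a singlet (maximally entangled) pair predicts between
    # detector angles a and b.
    return -math.cos(a - b)

a1, a2 = 0.0, math.pi / 2              # Alice's two measurement settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2), beyond the local-realist bound of 2
```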

      Breaking RSA with a Quantum Computer

      news.movim.eu / Schneier · Tuesday, 3 January, 2023 - 17:38 · 1 minute

    A group of Chinese researchers have just published a paper claiming that they can—although they have not yet done so—break 2048-bit RSA. This is something to take seriously. It might not be correct, but it’s not obviously wrong.

    We have long known from Shor’s algorithm that factoring with a quantum computer is easy. But it takes a big quantum computer, on the order of millions of qubits, to factor anything resembling the key sizes we use today. What the researchers have done is combine classical lattice reduction factoring techniques with a quantum approximate optimization algorithm. This means that they only need a quantum computer with 372 qubits, which is well within what’s possible today. (IBM will announce a 1,000-qubit quantum computer in a few months. Others are on their way as well.)
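
    A rough sanity check of that qubit count (my arithmetic, not the paper’s exact accounting): for a 2048-bit modulus, the sublinear term log N / log log N lands in the low hundreds, the right ballpark for the 372-qubit figure once the paper’s constant factors are included.

```python
import math

# Leading term of the claimed O(log N / log log N) qubit scaling for a
# 2048-bit RSA modulus. The exact 372-qubit figure includes constant
# factors from the paper that are not derived here.
n_bits = 2048
log_N = n_bits * math.log(2)       # natural log of a 2048-bit N
leading = log_N / math.log(log_N)  # O(log N / log log N) leading term
print(round(leading))              # ~196: hundreds of qubits, not millions
```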

    The Chinese group didn’t have that large a quantum computer to work with. They were able to factor 48-bit numbers using a 10-qubit quantum computer. And while there are always potential problems when scaling something like this up by a factor of 50, there are no obvious barriers.

    Honestly, most of the paper is over my head—both the lattice-reduction math and the quantum physics. And there’s the nagging question of why the Chinese government didn’t classify this research.

    But…wow…maybe…and yikes! Or not.

    “Factoring integers with sublinear resources on a superconducting quantum processor”

    Abstract: Shor’s algorithm has seriously challenged information security based on public key cryptosystems. However, to break the widely used RSA-2048 scheme, one needs millions of physical qubits, which is far beyond current technical capabilities. Here, we report a universal quantum algorithm for integer factorization by combining the classical lattice reduction with a quantum approximate optimization algorithm (QAOA). The number of qubits required is O(log N / log log N), which is sublinear in the bit length of the integer N, making it the most qubit-saving factorization algorithm to date. We demonstrate the algorithm experimentally by factoring integers up to 48 bits with 10 superconducting qubits, the largest integer factored on a quantum device. We estimate that a quantum circuit with 372 physical qubits and a depth of thousands is necessary to challenge RSA-2048 using our algorithm. Our study shows great promise in expediting the application of current noisy quantum computers, and paves the way to factor large integers of realistic cryptographic significance.

      SIKE Broken

      news.movim.eu / Schneier · Wednesday, 3 August, 2022 - 09:03

    SIKE is one of the new algorithms that NIST recently added to the post-quantum cryptography competition.

    It was just broken, really badly.

    We present an efficient key recovery attack on the Supersingular Isogeny Diffie-Hellman protocol (SIDH), based on a “glue-and-split” theorem due to Kani. Our attack exploits the existence of a small non-scalar endomorphism on the starting curve, and it also relies on the auxiliary torsion point information that Alice and Bob share during the protocol. Our Magma implementation breaks the instantiation SIKEp434, which aims at security level 1 of the Post-Quantum Cryptography standardization process currently run by NIST, in about one hour on a single core.

    News article.

      NIST Announces First Four Quantum-Resistant Cryptographic Algorithms

      news.movim.eu / Schneier · Wednesday, 6 July, 2022 - 16:49 · 1 minute

    NIST’s post-quantum cryptography standardization process is entering its final phases. It has announced the first four algorithms:

    For general encryption, used when we access secure websites, NIST has selected the CRYSTALS-Kyber algorithm. Among its advantages are comparatively small encryption keys that two parties can exchange easily, as well as its speed of operation.

    For digital signatures, often used when we need to verify identities during a digital transaction or to sign a document remotely, NIST has selected the three algorithms CRYSTALS-Dilithium, FALCON, and SPHINCS+ (read as “Sphincs plus”). Reviewers noted the high efficiency of the first two, and NIST recommends CRYSTALS-Dilithium as the primary algorithm, with FALCON for applications that need smaller signatures than Dilithium can provide. The third, SPHINCS+, is somewhat larger and slower than the other two, but it is valuable as a backup for one chief reason: It is based on a different math approach than all three of NIST’s other selections.
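
    For a sense of how one of these selections drops into code, here is a hedged sketch of a Kyber key-encapsulation round trip using the Open Quantum Safe Python bindings (liboqs-python). The calls follow that library’s published examples as best I recall them and may differ across versions; treat this as an outline of the KEM flow, not a vetted integration.

```python
import oqs  # liboqs-python bindings; assumed installed with Kyber enabled

kem_alg = "Kyber512"

# Receiver generates a keypair and publishes the public key.
with oqs.KeyEncapsulation(kem_alg) as receiver:
    public_key = receiver.generate_keypair()

    # Sender encapsulates: produces a ciphertext plus a shared secret.
    with oqs.KeyEncapsulation(kem_alg) as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same secret.
    secret_receiver = receiver.decap_secret(ciphertext)

assert secret_sender == secret_receiver  # both sides now share a key
```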

    NIST has not chosen a public-key encryption standard. The remaining candidates are BIKE, Classic McEliece, HQC, and SIKE.

    I have a lot to say on this process, and have written an essay for IEEE Security & Privacy about it. It will be published in a month or so.

      Quantum computing’s also-rans and their fatal flaws

      Chris Lee · news.movim.eu / ArsTechnica · Saturday, 30 November, 2019 - 15:00

    [Image: Extreme closeup of IBM's 16-qubit quantum computer from 2017 (credit: IBM Quantum Experience)]

    Last month, Google claimed to have achieved quantum supremacy—the overblown name given to the step of proving quantum computers can deliver something that a classical computer can't. That claim is still a bit controversial, so it may yet turn out that we need a better demonstration.

    Independent of that claim, it's notable that both Google and its critics at IBM have chosen the same type of hardware as the basis of their quantum computing efforts. So has a smaller competitor called Rigetti. All of which indicates that the quantum computing landscape has largely stabilized over the last decade. We are now in a position to pick some likely winners and some definite losers.

    Why are you a loser?

    But why did the winners win and the losers lose?
