• To people who do quantum computing: are qubits (after error correction) functionally equivalent and hence directly comparable across quantum computers, and is it a useful objective measure to compare progress? Or is it more an easy-to-mediatise stat?
    • Pretty much everything you read (especially when aimed at a non-expert audience) about quantum computing in the media is "easy-to-mediatize" information only.

      People building these things are trying to oversell their achievements while carefully avoiding making them easy to check, reproduce, or objectively compare to others. It's hard to objectively evaluate even for people who work in the field but haven't worked on the exact technology platform being reported on. Metrics are tailored to marketing goals; for example, IBM made up a performance metric called "quantum volume", only to basically stop using it when it seemed to no longer favour them.
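
      For context, here's a minimal sketch of how quantum volume is computed from benchmark results, assuming IBM's published definition (QV = 2^n for the largest width n at which random n-qubit, depth-n "model circuits" still produce the statistically expected "heavy" outputs more than two thirds of the time); the helper name is made up:

        # Sketch of the quantum volume metric, assuming the published definition:
        # QV = 2**n for the largest n where random n-qubit, depth-n model circuits
        # yield heavy outputs with probability > 2/3 (confidence bounds omitted).
        def quantum_volume(heavy_output_prob_by_width: dict[int, float]) -> int:
            passing = [n for n, p in heavy_output_prob_by_width.items() if p > 2 / 3]
            return 2 ** max(passing, default=0)

        # e.g. measured heavy-output probabilities per circuit width:
        print(quantum_volume({2: 0.85, 3: 0.78, 4: 0.70, 5: 0.61}))  # -> 16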

      That being said, it's also undeniable that quantum computing is making significant progress, error correction being a major milestone. What this ends up being actually used for, if anything, remains to be seen (I'm rather sure we'll find something).

    • I worked on a quantum computer for several years and can speak to this a bit: sorta. They're functionally equivalent in the sense that you can usually do the same computations, but there are a ton of details that shape how each particular modality behaves. Things like gate fidelities (how good the gates are), how fast the gates can "execute", how long it takes to initialize the quantum state so you can execute gates, how long the decoherence times are (how long before the quantum state is lost), and many (many) other differences. Some modalities even have restrictions on which qubits can interact with which other qubits, which will, among other things, impact algorithm design.
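
      As a back-of-the-envelope illustration of how a few of those figures combine (all numbers below are made up for illustration, not measurements of any real device): gate errors compound multiplicatively, and the whole circuit has to finish well within the coherence time.

        # Toy comparison of two hypothetical modalities; numbers are illustrative.
        def circuit_odds(gate_fidelity, gate_time_s, t2_s, n_gates):
            """Rough success probability: errors compound per gate, and the
            circuit must finish before the quantum state decoheres."""
            if n_gates * gate_time_s >= t2_s:  # circuit outlives the quantum state
                return 0.0
            return gate_fidelity ** n_gates

        # A fast-but-noisier device vs. a slower-but-cleaner one, 500-gate circuit:
        print(circuit_odds(0.995, 50e-9, 100e-6, 500))  # ~0.08
        print(circuit_odds(0.999, 10e-6, 1.0, 500))     # ~0.61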
      • Also the error correction strategies that are available to different architectures/platforms can make a huge difference in the future.
        • Yup! Though error correction was not something I spent a lot of time on. I worked primarily on "quantum OS" (really, just AMO control systems) so wasn't thinking much on the theoretical side of things.
    • No, they are not comparable. There are gate-model quantum computers like the one described in the article, and there are quantum annealers from companies like D-Wave that are geared toward solving a type of problem called QUBO: https://en.wikipedia.org/wiki/Quadratic_unconstrained_binary...

      The latter has released quantum computers with thousands of qubits, but these qubits are not comparable with the physical qubits in a gate-model computer (and especially not with logical qubits from one).
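
      For reference, QUBO just means minimizing a quadratic form over binary variables; a brute-force toy version (with a made-up 2-variable Q matrix) looks like this, and annealers attack the same form at much larger sizes:

        from itertools import product

        # Minimal brute-force QUBO: minimize x^T Q x over binary vectors x.
        # Toy Q: penalizes x0 AND x1 together, rewards each one alone.
        Q = [[-1, 2],
             [0, -1]]

        def energy(x):
            return sum(Q[i][j] * x[i] * x[j]
                       for i in range(len(x)) for j in range(len(x)))

        best = min(product((0, 1), repeat=2), key=energy)
        print(best, energy(best))  # -> (0, 1) with energy -1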

    • As far as I understand quantum error correction, the number of physical qubits might vary between systems with the same number of logical qubits. So, if you care about the overhead, they might not be equivalent in practice.
    • The idea is to implement something like the quantum equivalent of a Turing machine. What one universal quantum computer can do, another can. So yeah. However, connectivity and gate/measurement time will set some aspects of the performance but not the asymptotics.
    • It matters, but it's not functionally equivalent between different architectures.

      Since no one has many qubits, physical qubits are typically compared, as opposed to virtual qubits (the error-corrected ones).

      The other key figures of merit are the 1-qubit and 2-qubit gate fidelities (basically the success rates). The 2-qubit gate is typically more difficult and has a lower fidelity, so people often compare qubits by looking only at the 2-qubit gate fidelity. Every 9 added to the 2-qubit gate fidelity is expected to roughly decrease the ratio of physical to virtual qubits by an order of magnitude.
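
      As a toy illustration of that rule of thumb (assuming surface-code-like scaling with an error threshold around 1e-2; the helper and all constants are illustrative, and real overheads depend heavily on the code and the target logical error rate):

        # Toy surface-code overhead estimate. Assumptions (not vendor figures):
        # threshold p_th ~ 1e-2, logical error ~ (p / p_th)**((d + 1) / 2),
        # and roughly 2 * d**2 physical qubits per logical qubit at distance d.
        def physical_per_logical(two_qubit_error, target=1e-12, p_th=1e-2):
            d = 1
            while (two_qubit_error / p_th) ** ((d + 1) / 2) > target:
                d += 2  # surface-code distances are odd
            return 2 * d * d

        for err in (1e-3, 1e-4, 1e-5):  # fidelities 99.9%, 99.99%, 99.999%
            print(f"2q error {err:g}: ~{physical_per_logical(err)} physical per logical")

      (The exact factor per added 9 depends on the target logical error rate and how close to threshold you start.)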

      In architectures where qubits are fixed in place and can only talk to their nearest neighbours, moving information around requires swap gates, which are made up of the elementary 1- and 2-qubit gates. Some architectures have mobile qubits and all-to-all connectivity, so their proponents hope to avoid swap gates, considerably reducing the number of 2-qubit gates required to run an algorithm and thus resulting in fewer errors to deal with.
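
      To see how quickly that adds up, here's a minimal sketch of the cost of one long-range interaction on a 1-D nearest-neighbour chain, assuming the standard 3-CNOT SWAP decomposition (an upper bound, since real compilers track the qubit permutation instead of swapping back):

        # Toy cost of one long-range CNOT on a 1-D nearest-neighbour chain,
        # decomposing each SWAP into 3 CNOTs and swapping back afterwards.
        def two_qubit_gate_count(i: int, j: int) -> int:
            hops = abs(i - j) - 1        # SWAPs needed to make the qubits adjacent
            return 2 * hops * 3 + 1      # route there and back, plus the CNOT itself

        print(two_qubit_gate_count(0, 1))   # adjacent: 1 gate
        print(two_qubit_gate_count(0, 10))  # distant: 55 gates for one interaction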

      Some companies, particularly ones on younger architectures that perhaps have much better gate fidelities, argue that their scheme is better by virtue of being more "scalable" (having more potential in the future).

      It is expected that in the future the overall clock speed of the quantum computer will matter, as the circuits we ultimately want to run are expected to be massively long. Since we're far away from the point where this matters, clock speed is rarely brought up.

      In general, different architectures have different advantages. With different proponents holding different beliefs about what matters, it was once described to me as each architecture having its own religion.

      TL;DR: the two key stats are number of qubits and 2-qubit gate fidelity.

    • They should declare a standard unit called Q*bert.
      • Q sub BASIC?

        Does HN do subscript? I don't think I've seen it. I'm unsure if markdown supports it. I'd probably use subscripts a lot more if markdown had them.

  • So .. what can you actually do with that thing?

    Are there any real-world applications yet? Or is the real-world application quantum-state experiments?

    I think we are pretty far from using it as a general-purpose computer, or even for special (disruptive) use cases like factorization. So who could use it with benefit?

    • There won't be "real world applications" for many years to come.

      If I had to bet on what (impactful) application might come first, I'd guess simulation of chemical/physical properties used for drug development and materials science.

      • "Both organizations will integrate the 256-qubit superconducting quantum computer into its platform for hybrid quantum computing lineup and offer it to companies and research institutions"

        But they offer it for rent. Who would be a buyer for the quantum part of the hybrid?

        "Research institutions" but for what kind of research?

        Or is this rather wishful thinking/PR "we bring quantum computing to the market (just nobody uses it)"?

        • > "Research institutions" but for what kind of research?

          Quantum computing research. I'd guess a big chunk of revenue will come from universities and research institutes. Some companies might also pay for it, e.g. quantum computing startups in need of anything they can show before they have hardware, or startups that aren't even planning to build their own hardware.

          There are people working on finding useful problems that these devices might help with and how to best make use of them, how to build "infrastructure" for it. It's useful for them to have something to play with. Also, many organizations want to be (seen as) at the forefront of quantum computing, know the current capabilities, strengths and weaknesses of the various platforms, train and educate people about quantum computing and quantum technology in general, etc.

      • Is there some minimum number of qubits at which some minimum viable quantum-supreme task can theoretically be achieved?

        What would be required to factor a 1024 bit integer key?

        • You might as well ask what would be required to factor an 8 bit integer key. Because decades after the factorization of 21, we're still waiting for a quantum computer able to factor any product of two 4-bit primes with the general Shor algorithm.
          • Ok, so ... what would be required for the 8 bit key? Do we have reputable numbers? And are the qubits in the article equivalent to other qubits or are they lacking in some way?
            • You need a few dozen logical qubits. Qubits that have negligible errors and do not lose coherence. The problem is that a single logical qubit takes hundreds of physical qubits and advanced error correction techniques to construct.
              • The more I hear about quantum computing, the more it sounds like a make-believe grift. I first heard about it in a 2600 magazine in 1997 or so, and the claims and "way forward" then and now are roughly equivalent.

                Read as: I've heard for nearly 30 years that quantum is just around the corner, and we need post-quantum cryptography.

                Or, as reverend Sharpton said: "All hell's gunna break loose; and you're gunna need a Bitcoin!"

                • To be fair, the story has been consistent. The hardware is lacking, and the predictions are testable given advances in it.

                  When you compare it to the historical development of classical computers it's proceeding at a decent rate. Imagine if we'd needed hundreds of thousands of transistors before being able to demonstrate actually useful work by a classical computer. They likely never would have been developed in the first place.

                  Cryptography-wise, I'd expect dire warnings about any theoretical attack that's reasonably plausible. Better to react immediately than sit around waiting for it to materialize. It took over 15 years after the warnings for SHA-1 to be broken in practice, and I don't necessarily expect that SHA-2 ever will be, but we've moved on to SHA-3 nonetheless.

                    • To compare, ENIAC had 18,000 tubes and 1,200 relays, and could perform 5,000 additions (or 3 square roots) per second when it was decommissioned in 1956.

                      That was 80 years ago, for the military. So plotting that out: the first actual PC, the IBM PC 5150, came 25 years after ENIAC was decommissioned, with 29,000 transistors in its 8088. Twelve years later, the 586 had 3.1 million transistors; the P4 had 42 million; ten years later (2003) the P4XE had 169 million (but a year earlier there were only 65 million in the P4). Haswell, ten years after that, had 1.4 billion transistors. In 2023, AMD's Ryzen 7800X3D had 6.5 billion transistors.

                    here's a graph i threw together to see what the trendline was https://i.imgur.com/4ofV7Xr.png
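
                      For anyone who wants to reproduce that kind of trendline, here's a minimal log-linear least-squares fit over (year, transistor count) points like the ones above (rough figures, not a careful dataset):

                        import math

                        # Log-linear ("Moore's-law style") fit over the rough
                        # desktop-CPU transistor counts quoted above.
                        data = [(1981, 29_000), (1993, 3_100_000), (2000, 42_000_000),
                                (2003, 169_000_000), (2013, 1_400_000_000),
                                (2023, 6_500_000_000)]
                        n = len(data)
                        xs = [year for year, _ in data]
                        ys = [math.log2(count) for _, count in data]
                        slope = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) \
                                / (n * sum(x * x for x in xs) - sum(xs) ** 2)
                        print(f"doubling time ~ {1 / slope:.1f} years")  # -> ~2.3; Moore's law: ~2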

                    • Serious question: has such a calculation ever successfully predicted a technology trend? How much should we believe it and to what accuracy?
                      • Well, I did this a few times between 2011 and 2016 and predicted SSD/NVMe and spindle sizes through this year pretty accurately. Moore's "law" says the graph has a maximum angle, but I think most people take it to imply a minimum angle. The graph I plotted is a lot less steep than Moore's law ought to imply, but these are all desktop CPUs - workstation/server CPUs are over 100 billion transistors already and are well within the constraints of Moore's law.
        • > Is there some minimum number of qubits at which some minimum viable quantum-supreme task can theoretically be achieved?

          That is a very broad range of possibilities, so allow me to narrow it to cryptography. I am by no means an expert on this, but I spent the weekend reading about the quantum motivations for changing the cryptographic algorithms society uses, and as far as I can tell, nobody knows what the hard lower bound is for breaking the hardness assumptions in classical cryptography. The best guess is that it is many orders of magnitude higher than what current machines can do.

          We are so far from machines capable of this that it is unclear that a machine capable of it will be made in our lifetimes, despite optimism/fear/hope, depending on who you are, to the contrary.

          > What would be required to factor a 1024 bit integer key?

          I assume you mean a 1024-bit RSA key (if you mean a 1024-bit ECC key, I want to know what curve). You can crack 1024-bit RSA with 2051 logical qubits according to:

          https://arxiv.org/abs/quant-ph/0205095

          In order for it to actually work, it is believed that you will need between 1,000 and 10,000 physical qubits for every logical qubit, so it could take up to roughly 20 million physical qubits.

          Coincidentally, the following paper claims that cracking a 2048-bit RSA key can be done in 8 hours with 20 million physical qubits:

          https://arxiv.org/abs/1905.09749

          That sounds like it should not make sense given the previous upper estimate of 20 million physical qubits for a 1024-bit RSA key. As far as I can tell, there are different ways of implementing Shor’s algorithm and some ways use more qubits and some ways use less. The biggest factor in the number of physical qubits used is the error correction. If you can do better error correction, you can use fewer physical qubits.
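
          Putting those numbers together, here's a small sketch of the arithmetic (assuming Beauregard's 2n+3 logical-qubit circuit from the first paper and the 1,000-10,000 physical-per-logical overhead range; the second paper reaches similar totals with different trade-offs, and the helper name is made up):

            # Rough Shor resource arithmetic: Beauregard's circuit uses 2n + 3
            # logical qubits for an n-bit modulus; the physical count then depends
            # on the assumed error-correction overhead per logical qubit.
            def shor_qubits(rsa_bits: int, overhead: int) -> tuple[int, int]:
                logical = 2 * rsa_bits + 3
                return logical, logical * overhead

            for bits in (1024, 2048):
                logical, low = shor_qubits(bits, 1_000)
                _, high = shor_qubits(bits, 10_000)
                print(f"RSA-{bits}: {logical} logical, ~{low:,}-{high:,} physical")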

        • Factoring requires much better noise performance. So you don't have to consider the number of qubits yet for that particular application. A fundamental breakthrough is required.

          There might be applications other than factoring that can be addressed with the noisy qubits we can actually create.

          • What’s the purpose of quantum computing other than solving discrete logarithms or factorisation? It doesn’t seem very promising at the moment.
            • It is said to be useful for optimization problems like protein folding, although they have so few qubits that I am not sure how useful it really could be. A microcontroller with only 1024 bits of RAM, for example, has very limited usefulness, so I would not think a quantum computer with a similar number of qubits would be very useful either.
    • That was my thought. Somehow Ford managed to use one to improve its manufacturing process by having it solve a scheduling problem in 5 minutes that previously took an hour:

      https://www.dwavequantum.com/company/newsroom/press-release/...

      The news is light on technical details. Beyond that, I have no clue about useful applications.

  • If anyone's looking for a clear and concise intro book, the best I've found is N. David Mermin's Quantum Computer Science. It's geared towards CS students, so it's very approachable, requiring only familiarity with the basics of finite-dimensional complex vector spaces.
  • Those pictures look like something just came down from the roof to be repaired.
  • Fujitsu has been an astounding company ime, just premium hardware
    • > Fujitsu has been an astounding company ime, just premium hardware

      Has been. Now they are making "quantum". Whatever that means. (But hey, if Microsoft can do it, everybody can do it.)