Rose’s Law for Quantum Computers

When I first met Geordie Rose in 2002, I was struck by his ability to explain complex quantum physics and the “spooky” underpinnings of quantum computers. I had just read David Deutsch’s The Fabric of Reality, in which he predicts the possibility of such computers, and so I invited Rose to one of our tech conferences.


We first invested in 2003, and Geordie predicted that he would be able to demonstrate a two-qubit quantum computer within six months. There was a certain precision to his predictions. With one qubit under his belt, and a second coming, he went on to suggest that the number of qubits in a scalable quantum computing architecture should double every year. It sounded a lot like Gordon Moore’s prediction back in 1965, when he extrapolated from just five data points on a log-scale (his original plot is below).


So I called it “Rose’s Law” and that seemed to amuse him. Well, the decade that followed has been quite amazing. I commented on Rose’s Law four years ago on flickr, but I share this graph and some potential futures for the first time today.


So, how do we read the graph above? Like Moore’s Law, a straight line describes an exponential. But unlike Moore’s Law, the computational power of the quantum computer should grow exponentially with the number of entangled qubits as well. It’s like Moore’s Law compounded. (D-Wave just put together an animated visual of each processor generation in this video, bringing us to the present day.)


And now it gets mind-bending. If we suspend disbelief for a moment and use D-Wave’s early data on processing-power scaling (more on that below), then the very near future should be the watershed moment, when quantum computers surpass conventional computers and never look back. Moore’s Law cannot catch up. A year later, the quantum computer outperforms all computers on Earth combined. Double the qubits again the following year, and it outperforms the universe. What the…? you may ask. Meaning, it could solve certain problems that could not be solved by any non-quantum computer, even if the entire mass and energy of the universe were at its disposal and molded into the best possible computer.
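The arithmetic behind this compounding is easy to sketch. Here is a minimal Python illustration, where the 2003 starting qubit count, the classical baseline, and the 18-month classical doubling period are all assumptions chosen for illustration, not D-Wave's actual figures:

```python
# Rose's Law vs. Moore's Law, in toy form. Rose's Law: qubits double
# every year; the state space of n entangled qubits has 2**n dimensions,
# so capability compounds doubly exponentially. All starting values below
# are illustrative assumptions, not measured data.

def qubits(year, start_year=2003, start_qubits=2):
    """Qubit count under a strict yearly doubling."""
    return start_qubits * 2 ** (year - start_year)

def quantum_state_space(year):
    """Dimension of the state space: 2**n for n qubits."""
    return 2 ** qubits(year)

def moore_ops(year, start_year=2003, start_ops=1e12):
    """Classical capability doubling every 18 months (assumed baseline)."""
    return start_ops * 2 ** ((year - start_year) / 1.5)

for year in range(2003, 2016):
    q = qubits(year)
    # By 2015 the doubly-exponential term 2**q is astronomically large,
    # so we print it symbolically rather than evaluating it.
    print(year, q, f"2^{q} states", f"{moore_ops(year):.1e} classical ops")
```

The gap is visible immediately: the classical column grows by a fixed factor per year, while the exponent in the quantum column itself doubles.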


It is a completely different way to compute — as David Deutsch posits — harnessing the refractive echoes of many trillions of parallel universes to perform a computation.


First the caveat (the text in white letters on the graph). D-Wave has not built a general-purpose quantum computer. Think of it as an application-specific processor, tuned to perform one task — solving discrete optimization problems. This happens to map to many real-world applications, from finance to molecular modeling to machine learning, but it is not going to change our current personal computing tasks. In the near term, assume it will apply to scientific supercomputing tasks and commercial optimization tasks where a heuristic may suffice today, and perhaps it will be lurking in the shadows of an Internet giant’s data center improving image recognition and other forms of near-AI magic. In most cases, the quantum computer would be an accelerating coprocessor to a classical compute cluster.
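Discrete optimization of this kind is commonly cast as QUBO (quadratic unconstrained binary optimization), the problem class D-Wave's annealer is built around. A minimal classical sketch, brute-forcing a toy instance (the matrix Q below is an invented example for illustration, not anything from D-Wave's stack):

```python
# QUBO: find the bit vector x minimizing sum_ij Q[i][j] * x[i] * x[j].
# D-Wave's hardware anneals toward low-energy states; here we simply
# brute-force a tiny instance to show the shape of the problem.
from itertools import product

def qubo_energy(x, Q):
    """Energy of bit vector x under QUBO matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_qubo(Q):
    """Return (best_bits, best_energy) by exhaustive search over 2**n states."""
    n = len(Q)
    return min(
        ((x, qubo_energy(x, Q)) for x in product((0, 1), repeat=n)),
        key=lambda t: t[1],
    )

# Toy instance: diagonal terms reward setting each bit; the off-diagonal
# term penalizes setting bits 0 and 1 together.
Q = [
    [-1,  2,  0],
    [ 0, -1,  0],
    [ 0,  0, -1],
]
bits, energy = brute_force_qubo(Q)
```

The exhaustive search above costs 2**n evaluations, which is exactly why a physical annealer that explores the energy landscape in parallel is attractive for this class of problem.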


Second, the assumptions. There is a lot of room for surprises in the next three years. Will they hit a scaling wall, or discover a heretofore unknown fracturing of the physics? Perhaps local entanglement, noise, or some other technical hitch that does not loom large at small scales but grows exponentially with problem size, just as the theoretical performance grows exponentially with scale. I think the risk lies less in the steady qubit march, which has held true for a decade now, than in the relationship of qubit count to performance.


There is also the question of the programming model. Until recently, programming a quantum computer was more difficult than machine-coding an Intel processor. Imagine having to worry about everything from analog gate voltages to algorithmic transforms of programming logic into something native to quantum computing (Shor, Grover, and a few other bright minds have made the occasional mathematical breakthrough on that front). With the application-specific quantum processor, D-Wave has made it all much easier, and with their forthcoming Black Box overlay, programming moves to a higher level of abstraction, like training a neural network, with little understanding of the inner workings required.
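To give a feel for that level of abstraction: in a black-box model, the programmer supplies only an objective function over bit strings, and a generic solver hunts for a low-cost assignment. The random-search solver below is a classical stand-in for illustration only, not D-Wave's actual Black Box API:

```python
# Black-box optimization sketch: the user writes objective(bits) -> cost
# and knows nothing about the solver's internals. Here the "solver" is a
# trivial single-bit-flip hill climber, standing in for real hardware.
import random

def black_box_solve(objective, n_bits, iterations=2000, seed=0):
    rng = random.Random(seed)
    best = tuple(rng.randint(0, 1) for _ in range(n_bits))
    best_cost = objective(best)
    for _ in range(iterations):
        x = list(best)
        x[rng.randrange(n_bits)] ^= 1      # propose flipping one bit
        cost = objective(tuple(x))
        if cost <= best_cost:              # keep moves that don't hurt
            best, best_cost = tuple(x), cost
    return best, best_cost

# Example objective: Hamming distance to a hidden target pattern.
target = (1, 0, 1, 1, 0, 1)
objective = lambda x: sum(a != b for a, b in zip(x, target))
bits, cost = black_box_solve(objective, len(target))
```

The point is the interface, not the algorithm: the same objective-function contract could be served by random search, simulated annealing, or a quantum annealer without the programmer changing a line.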


In any case, the possibility of a curve like this raises many philosophical and cosmological questions about our compounding capacity to compute... the beginning of infinity if you will.


While it will be fascinating to see if the next three years play out like Rose’s prediction, for today, perhaps all we should say is that it’s not impossible. And what an interesting world this may be.

  • Steve Jurvetson PRO 3y

    Day trading has become a computational arms race.

    D. E. Shaw has been tinkering with quantum computation for years, for the hedge fund, the labs, and Schrödinger. They built an application-specific processor to study protein folding (called Anton, with an alleged 50x energy advantage).
  • rjp1965 3y

    Amen Steve. "I do find it fascinating to watch the pure theoreticians and their reaction to someone building machines that they want to philosophize about, not being able to build one themselves, and rendered just a bit obsolete when someone does."
  • vennettaj PRO 3y

    sorta out of topic...i started thinking about women...i think they/some just don't like to hang out with bragging piggies
  • GeordieRose 3y

    breic That's not correct. The noise rates for both the Rainier and Vesuvius processors are well below our targets, and will allow us to maintain the steady growth in computational power we've seen over each of the past 8 years.
  • David Gobel 3y

    Hi Steve and Geordie, Congratulations on all your progress and finding additional .gov investors! I applaud your go-to-market courage. These academic vs. entrepreneurial combats have gone on at least since two bicycle mechanics from Ohio cleaned Samuel Pierpont Langley's clock :-) Served him Wright.
  • Chucil 3y

    Schrödinger -> Schrödinger's Cat
    Quantum computing: Measurements leave Schrödinger's cat alive
  • David Seaton 3y

    This last crisis has made me a bit of Luddite I'm afraid. That's what has gotten me interested in evolutionary biology... Being so intelligent, but not nearly intelligent enough may be our fatal flaw as a species
  • SQLwriter 3y

    Not all of academia has been dissing D-Wave. Professor Marek Perkowski of Portland State University has been in the D-Wave camp for years.
  • Rodolfo 3y

    Even if they do not scale to universe-size computation, quantum computers will have the capability to disrupt completely the way we do business by making encryption trivial. Financial transactions, defense, telecommunications, you name it.
  • helen sotiriadis PRO 3y

    can i pre-order an upgrade? i'm hoping to at last own a computer that can keep up with me.
  • PhOtOnQuAnTiQuE PRO 3y

    And this year Physics Nobel goes to Serge Haroche and David Wineland, for their work on quantum optics communication and computation ; ) BBCnews-2012 Physics Nobel
  • Steve Jurvetson PRO 3y

    A new article: "The black box that could change the world"
  • photon~wave 3y

    Good summing-up about the state-of-the-art in quantum computing: Simulation: Quantum leaps by Geoff Brumfiel.
  • Steve Jurvetson PRO 2y

    And finally a move from Phys. Rev. B to the business lead in the NYT, and the second-most shared story for the day.
    "if it performs as Lockheed and D-Wave expect, the design could be used to supercharge even the most powerful systems, solving some science and business problems millions of times faster than can be done today.

    Ray Johnson, Lockheed’s chief technical officer, said his company would use the quantum computer to create and test complex radar, space and aircraft systems. It could be possible, for example, to tell instantly how the millions of lines of software running a network of satellites would react to a solar burst or a pulse from a nuclear explosion — something that can now take weeks, if ever, to determine.

    “This is a revolution not unlike the early days of computing,” he said. “It is a transformation in the way computers are thought about.” Many others could find applications for D-Wave’s computers. Cancer researchers see a potential to move rapidly through vast amounts of genetic data. The technology could also be used to determine the behavior of proteins in the human genome, a bigger and tougher problem than sequencing the genome. Researchers at Google have worked with D-Wave on using quantum computers to recognize cars and landmarks, a critical step in managing self-driving vehicles.

    Quantum computing has been a goal of researchers for more than three decades, but it has proved remarkably difficult to achieve.

    The D-Wave computer that Lockheed has bought uses a different mathematical approach than competing efforts. In the D-Wave system, a quantum computing processor, made from a lattice of tiny superconducting wires, is chilled close to absolute zero. It is then programmed by loading a set of mathematical equations into the lattice.

    The processor then moves through a near-infinity of possibilities to determine the lowest energy required to form those relationships. That state, seen as the optimal outcome, is the answer.

    The approach, which is known as adiabatic quantum computing, has been shown to have promise in applications like calculating protein folding, and D-Wave’s designers said it could potentially be used to evaluate complicated financial strategies or vast logistics problems."
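    The annealing process described above has a well-known classical cousin, simulated annealing, which gives some intuition for "settling into the lowest-energy state." A toy Ising-chain version, where the couplings and cooling schedule are arbitrary illustrative choices, not D-Wave's actual physics:

```python
# Classical simulated annealing on a 1-D Ising chain: program an energy
# function via couplings J, then cool the system toward a low-energy
# spin configuration. Intuition only; a quantum annealer tunnels rather
# than thermally hops, but the "load energies, settle to the minimum"
# workflow is analogous.
import math
import random

def anneal_ising(J, steps=5000, seed=1):
    """Minimize E(s) = -sum_i J[i]*s[i]*s[i+1] over spins s[i] in {-1,+1}."""
    rng = random.Random(seed)
    n = len(J) + 1
    s = [rng.choice((-1, 1)) for _ in range(n)]

    def energy(s):
        return -sum(J[i] * s[i] * s[i + 1] for i in range(len(J)))

    e = energy(s)
    for t in range(steps):
        temp = max(0.01, 2.0 * (1 - t / steps))   # linear cooling schedule
        i = rng.randrange(n)
        s[i] = -s[i]                              # propose a spin flip
        e_new = energy(s)
        if e_new <= e or rng.random() < math.exp((e - e_new) / temp):
            e = e_new                             # accept the move
        else:
            s[i] = -s[i]                          # reject: flip back
    return s, e

# Three ferromagnetic links plus one antiferromagnetic link.
spins, e = anneal_ising([1.0, 1.0, 1.0, -1.0])
```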
  • Steve Jurvetson PRO 2y

    News from today: it looks like the first red star (for 2013 - Competitive Performance with Classical Computers) has been met, according to this summary by MIT Tech Review:
    "Catherine McGeoch, a computer science professor at Amherst College, carried out the tests and will soon present her results in a peer reviewed paper at the International Conference on Computing Frontiers. Her verdict on D-Wave’s computer? “In some cases, really, really fast.”

    McGeoch is an expert in “experimental algorithmics” – algorithm racing, essentially – and conducted her tests using three examples of what are known as “optimization” problems. These are the mathematical core of conundrums such as figuring out the most efficient delivery route around a city, or how the atoms in a protein will move around when it meets a drug compound. In each trial, she pitted a D-Wave computer, a giant black cabinet hiding a chip cooled to 0.02 Kelvin, against software running on a Lenovo workstation with a 2.4GHz quad core Intel processor and 16GB RAM.

    Some of the results saw D-Wave’s machine win in spectacular fashion. On one problem well-matched to the hard-wired design of the machine’s super-cooled chip, it found the best result about 3,600 times more quickly than the best conventional software solver. It crossed the finish line in just under half a second, while the second finisher took 30 minutes.

    "At the problem sizes I tested, it’s thousands of times faster than anything I’m aware of."
  • Steve Jurvetson PRO 2y

    It's an interesting statement on the state of AI that real humans are hired en masse to read online material and craft a relevant conversational sentence (like the one from lucygray here) to hide a spam link. Pure robots don't quite do it right yet, and this crowdsourced labor is on the cusp of computability... it's barely human... and a sad way to make a living.
  • vennettaj PRO 2y

    oh come's that the AI sounds like a bad's up to no good..face it :p
  • Saffron Blaze PRO 4mo

    Any updates?
Uploaded on October 4, 2012