The quantum computing bubble (ft.com)
143 points by ipeev on Sept 5, 2022 | 141 comments



Link to the actual article by the Oxford physicist.

https://archive.ph/0VB0K

Valuation over value.


This is how I think about it.

It's insurance money. If you're a manager of a big company like IBM, Microsoft or Google, you have to align your current and future product portfolios in a way that shows your investors that your company will keep growing, even if your current products are stagnant.

You can surely say quantum computing won't do much in the next 5 years. But what about 10 years? 20 years? 30 years? The farther you look into the future, the bigger the probability of a huge tech breakthrough that could give the company that has it a massive edge on the market.

Even if there is only a 1% chance of QC producing a sort of transistor revolution, it becomes an arms race. If Google starts researching it, IBM will follow suit, and so will Microsoft. If in 30 years this turns out to be a big deal, no one will be 30 years behind.


Ah, the quantum ROI, where we affect the return by attempting to measure it.


This made me laugh, thank you!


I think you are describing the company dynamics accurately, but I can't help thinking this is just a terrible way to invest. No party has a concrete plan or vision for how to use it; they just throw money at it because there is a consensus of good feeling around it. Those good feelings were probably created through academic or corporate marketing efforts in the first place.


> but I can't help think this is just a terrible way to invest

Picking a company to invest in is only half the job; choosing how much to invest is the other half.

It's a terrible way to invest if you put all your money into it. QC "changing the world" is a tail-end event. You allocate according to the risk.


I am not talking about investing in companies. I am talking about how these companies invest their R&D capital in projects.

> QC "changing the world" is a tail-end event. You allocate according to the risk

But, nobody has a clear idea how it's a risk. They just invest because others invest. You might as well allocate some capital to protect against giant aggressive pigs ruining the southern US.

To put it another way. What value is being provided by the managers of this capital? If they just put money in everything that seems popular in tech media (because it's a tail risk), then couldn't anyone do their job?


> I am not talking about investing in companies. I am talking about how these companies invest their R&D capital in projects.

Some food for thought:

These companies do have economies of scale on their side for internal tech investments. Things like this are a cheaper bet for them than for anyone else. Investments that seem irrational in a vacuum (aka for anybody else) can be rational for these specialized parties.

They already have hardware specialists, contacts / relationships, software expertise, idle bodies, etc.


> a terrible way to invest. No party has a concrete plan or vision for how to use it, they just throw money because there is a consensus of good feeling around it.

Almost as if I were reading about cryptocoins.


The economics change when you control the money (the military apparatus funds most of the research, does it not?).


I mean these companies have Research and Development divisions. I was at IBM at the turn of the century when they spent 6 billion dollars on research. One of the pushes was to get research to focus on things they could market and make money with.

But the big research plays (Bell Labs, Xerox PARC) seem to get less and less funding, if they exist at all. A lot of the inventions from those places were monetized outside those companies. IBM had a chip fab in the research building… that business was long since spun off.

At the turn of the century IBM was researching quantum computing, but as I was leaving, selling services was IBM's big push.


It is not the amount of funding, it is the allocation that doomed R&D at big companies.

Forty years ago they were the best game in town for applied research (defining applied as, on success, having a fast track to commercialization). Later it became similar to university research: on success, you write some articles, get company wows, but business people have no idea where to stick it and half-heartedly throw a few applications at it to see if any stick. Most don't (e.g., Deep Blue, Watson).

At this point large-company R&D centers got passed over (by a lot) by VC-funded applied research and saw a (IMO well deserved) drop in funding.


+1 Insightful.

Also, in the spirit of helpfulness, "passed over" is the wrong turn of phrase as used here; it implies "declined or rejected" (as in, being passed over for a promotion at work). I think "surpassed" or "leap-frogged" are closer to your intended meaning. HTH! :)


Agreed, thank you! I appreciate corrections for my non-native English speakerness :)


Supposedly one of the really big, important things one could do with a quantum computer (QC) is quickly solve to optimality instances of NP-complete optimization problems, e.g., problems in scheduling, resource allocation, logistics, etc., which can be formulated as linear programming problems (needing just knowledge of linear equations) where we want all the variables to have whole number values, that is, integer linear programming (ILP).

Okay, integer linear programming problems ... To get all excited about quantum computing, one needs to get excited by the big money to be saved by solving all those important, practical ILP problems.

Okay, I had a good background in pure/applied math and in computing and got into ILP for scheduling the fleet at FedEx. Since the promised stock was 1+ years late, I ran off and got a Ph.D., in one of the best programs, in more, hopefully useful, pure/applied math, and much of that work was in ILP.

Here is some blunt truth about the NP-complete problems and the cartoon at the beginning of the famous book by Garey and Johnson: The math guys were talking to their manager explaining that they couldn't solve the manager's problem but neither could some long line of other math guys.

Here the blunt part is the meaning of "solve" -- with a computer program running in time only polynomial in the size of the problem, get an optimal solution to any instance of the problem, including the worst cases. And here optimal means down to the last penny to be saved. So, for some network deployment by AT&T that was to cost $1 billion, save down to the last penny, in polynomial time, including for the worst-case instance of the problem.

Yup, maybe the savings would be $51,937,228.21. And we do want to save that last penny. But if the manager would settle for saving just the first $51,900,000.00 in reasonable computer time for all or nearly all the actual instances of the manager's real problem, then there would be little or no difficulty. And we should be able to tell the manager that savings of more than $55 million, or some such, were impossible -- that is, have an upper bound.

So, much of the difficulty was saving the last $37,228.21, guaranteeing to do so, for all instances of the problem, including the worst cases.

Well, I can assure readers that should I have insisted on a career saving, e.g., $51,900,000.00 where savings of $55 million were impossible, then I would have spent the last several decades homeless on the streets or dead from being homeless on the streets -- no joke.

Bluntly, there just is no significant demand for solving ILP problems in practice. The "managers" don't want to get involved.

Selling pizzas from the back of a truck? Sure -- might sell 100 pizzas a day. Selling solutions to ILP and other NP-complete problems -- f'get about it.

Uh, since there is no significant demand for saving $51,900,000.00 with a bound of $55 million, there stands to be not significantly more demand for saving $51,937,228.21.

Thus, there stands to be no significant value for QC for solving NP-complete ILP problems. Sorry 'bout that. If some people want to get the $51,900,000.00 savings, they've been able to do that for decades and have voted loud and clear "We don't care.".

E.g., in one of my attempts, a guy sent me an ILP problem, we talked, and two weeks later I had running code that in 900 seconds on a slow computer got a feasible solution guaranteed to be within 0.025% of optimality. The problem had 600,000 variables and 40,000 constraints. I had done the work for free. Still, then, suddenly he was not interested.
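
To make the "good solution with a certified bound" idea concrete, here is a minimal sketch of the same pattern on a toy problem, using SciPy's MILP interface (scipy >= 1.9 assumed; the tiny model and the 0.025% gap are purely illustrative, not the problem described above):

    # Toy ILP: maximize 3x + 2y subject to x + y <= 4, x and y non-negative integers.
    # The mip_rel_gap option tells the solver to stop once the incumbent solution
    # is certified to be within 0.025% of optimal -- the "$51.9M with a $55M bound" idea.
    import numpy as np
    from scipy.optimize import Bounds, LinearConstraint, milp

    c = np.array([-3.0, -2.0])                    # milp minimizes, so negate to maximize
    cons = LinearConstraint(np.array([[1.0, 1.0]]), -np.inf, 4.0)
    res = milp(c=c,
               constraints=cons,
               integrality=np.ones_like(c),       # all variables integer
               bounds=Bounds(lb=0, ub=np.inf),
               options={"mip_rel_gap": 0.00025})  # accept anything within 0.025% of optimal
    print(res.x, -res.fun)                        # e.g. [4. 0.] 12.0

The gap tolerance, not exact optimality, is what a practical solver run like the one described above buys you.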

So be it.

There was another one: I was writing the code using the idea of a strongly feasible basis, and suddenly the customer was not interested and returned to some not very good heuristic code he had.

Better, a lot better, to sell something a lot of people actually want, e.g., a lot better to sell pizza.

And I am doing a startup that to me continues to look good, software running, but it has nothing to do with NP-complete or ILP and wouldn't be helped by QC.

So, to me, even if Google gets a good QC that can solve ILP problems, I don't believe that they will have many customers or much of a business, and there will be no big reason for IBM or Microsoft to worry.

Since there is no significant demand for using ILP to save money now, I don't see a significant demand for using QC on ILP to save money in the future.

Their employees might be better off selling pizzas. Let's see: from some of my arithmetic about the costs of pizza, one can do well for $2-3 a pizza. From a pizza truck in a good location one might be able to sell the pizzas for an average of $10 each, e.g., an extra $1 for anchovies! Might sell 100 pizzas a day for $1000 a day, maybe 20 days a month. Looks like a better career than QC research!

If there is no demand for pizzas, then there won't be much demand for pizzas with anchovies.

Uh, the Google QC researchers are well paid? Terrific -- park the pizza truck near the Google QC research building!!!!

For some parts of US national security, the situation for a good QC might be significantly different -- I doubt it, but maybe.


That's the positive-positive outlook, yeah.

The negative-positive outlook is that it is a disinformation campaign so one may maintain the lead along a particular trajectory of technical dominance. Whilst doing so, an extra game-theoretic safety precaution, which also amplifies the disinformation campaign, is to fund any research in the direction of the disinformation as both a distraction and an 'impossibility canary.'

Quite... deliciously deceptive.


I don't understand this argument at all. Of course it isn't making money yet-- that's because it's an early technology that is still being researched. Sure it might never mature, but it seems crazy to call it a "bubble" or to analyze it based on current sales figures.


That wasn't the argument made in the article. There are reasons to believe that the technology is fundamentally unsound, and will never be able to scale or make money.


It wasn't? The article leads with that argument.

"The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money."

I don't see any argument that the technology is fundamentally unsound or doesn't scale, even though that's an argument I'm pretty amenable to.


"The simple reason for this is that despite years of effort nobody has yet come close to building a quantum machine that is actually capable of solving practical problems. The current devices are so error-prone that any information one tries to process with them will almost instantly degenerate into noise. The problem only grows worse if the computer is scaled up (ie, the number of “qubits” increased)."

+ 5 subsequent paragraphs


The current mainstream view is that QC is a very hard but tractable engineering problem. There is no fundamental reason for quantum error-correction to not work, and a demonstration that it cannot would be a major and surprising breakthrough in fundamental physics. This has been our state of understanding for a couple of decades now, and the progress we have been seeing is consistent with this view.

To be clear, there are people making sophisticated arguments for fundamental barriers to quantum computation. For instance, the quantum-skeptical mathematician Gil Kalai writes about his thoughts on recent progress on QC at (https://gilkalai.wordpress.com/2022/05/26/waging-war-on-quan...). His view is considerably more nuanced than this FT article, and much more conducive to learning and discussion. I hope you take the time to read it, and I think I'll submit this to HN main as well.


As someone working in QC control systems, this holds up with my experience. We're fundamentally dealing with tough engineering problems, primarily on the hardware side of things. To be clear, these systems are very complex and rely on a multitude of software and hardware working, and working well. I'm of course biased, but I lean towards a positive outlook on QC.


It's not clear and obvious that this and the following 5 paragraphs mean that the technology is "fundamentally unsound" either. It's just that there are big problems we haven't yet figured out how to work with.

Though it is true that we don't know the degree to which we will be successful at developing this technology, and we know there are fundamental properties we will need to contend with (for better and for worse), this is entirely consistent with how "early technology" develops.

AFAIK, even "commercially oriented" quantum computing projects are better understood as being in a research stage at this time. When you do research, in general, it feels daunting and it's not at all obvious that things are going to work. (my field is biochem)


Some quantum error correcting codes were recently demonstrated experimentally: https://physics.aps.org/articles/v15/103


The first experimental demonstration was 18 years ago: https://www.nature.com/articles/nature03074

The problem is that none of this scales beyond toy systems with a handful of qubits. As soon as you try making it bigger, everything starts falling apart. I feel like this is a fundamental difference from digital logic, which is extremely easy to scale.


With 17 qubits. Not something I'd call breathtaking.


It argues that QC is not currently viable. I don’t see an argument that it will never be viable, which is what “fundamentally unsound” means.


It's not actually early technology, it's been developed since the 80's. And if the underlying theories are unsound - if it doesn't even work in theory - then putting more money in won't make it magically viable.


1. Seems a bit unfair to say it has been developed since the 80's. In the 80's a couple of people (e.g. Feynman) noticed that if you have a quantum simulator, you can simulate chemistry in a way that a classical computer is seemingly incapable of. But the transmon (one of the first possibly viable implementations of a qubit) was not developed until the 00's, and complete control of some of these systems (e.g. a transmon coupled to an oscillator, in order to make a memory) was not demonstrated until the 2010's. Lifetimes of quantum memories have also been growing exponentially for more than a decade (a trend that started in the 00's).

2. It is worth mentioning that by the standards of your comment, the time between conceiving of a classical computer (Babbage) and a scalable electronic computer (ENIAC and family) was about a century.

3. While ultimately there might be a "quantum winter" in the next few years because we (I work in the field) overpromised, this would not be the first time a tech that ultimately works gets disregarded for a decade or two because of mismanaging expectations (e.g. Liquid Crystal displays or neural networks, which were both developed for many decades before being commercially viable).

EDIT: And yes, there are some startups with misleadingly general pitches.


> e.g. Liquid Crystal displays

Liquid crystal displays were commercially viable 10 years before they became commercially available.

Source: my dad worked for a company that made display tubes for CRT manufacturers all over the world. He told me back in the seventies that they knew how to make TVs you could hang on the wall like a picture, but weren't making them because they were camping on a lucrative market.


Early (00s) LCDs sucked, I can't even imagine how much a 70's LCD would suck compared to a CRT.


To elaborate on the NN example:

The perceptron was developed in the 1940s and only became useful in the 2000-2010 range. So 60-70 years of development time.

If we take 50 years as a baseline (faster than historic examples; but underestimating), we’d expect a quantum computer some time in the 2030s.

Is anyone here hard objecting to QC in the 2030-2040 range?


Doesn't the same apply to nuclear fusion power generation?


If the output side is saturated with workload but the input side keeps growing toward a blow-up without significantly changing the output, or at worst affecting it negatively (quantum blockchain buzzword bingo), it may be fair to speak of a bubble.


I think his model of the situation is short-sighted, to say nothing of the callbacks to that management principle involving transistors.

If you're thinking that the whole purpose of QC will be quickly subsumed by wide algorithms with superpolynomial speedup, you might be missing the point. It's about how computers are built, not about stuffing one specific abstraction into another. If suddenly we discover we can build a machine that can generate random numbers a quadrillion times faster than any current hardware design, that's a new space in computation.

I mean consider how widely deployed the parallelism construct is now, and that Amdahl's law was elucidated in the 60's.

Parallelism was just one degree of freedom for us to climb the S-curve on; quantum computing seems to provide essentially a continuum of them.
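
For reference, Amdahl's law mentioned above is easy to make concrete; a small sketch, with the 95% parallel fraction chosen purely for illustration:

    # Amdahl's law: overall speedup = 1 / ((1 - p) + p / s),
    # where p is the parallelizable fraction and s is the number of workers.
    def amdahl_speedup(p: float, s: int) -> float:
        return 1.0 / ((1.0 - p) + p / s)

    for s in (2, 8, 64, 1024):
        print(s, round(amdahl_speedup(0.95, s), 1))
    # Prints 1.9, 5.9, 15.4, 19.6: even with 1024 workers, the serial 5%
    # caps the speedup near 1 / (1 - 0.95) = 20x.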


I think most people, including the author, would agree QC should be funded for fundamental research reasons. But that is clearly not the way it is being pitched to VC. Right now there is no clear use-case, that's what I felt he was warning against. If nothing materialises soon, he's probably correct to say this is a bubble.


> Right now there is no clear use-case

Why do people keep repeating that? You mean that if somebody creates a machine that can simulate chemistry and materials science in polynomial time, nobody would use it? That's crazy.


The key word being 'if' - the FT article cites research that disputes this: https://arxiv.org/pdf/2208.02199.pdf

I have no expertise in QC/chemistry so I can't make a judgement other than to say it doesn't seem universally agreed that QC will be a silver bullet here.


That's quite similar to claiming that quantum computers won't change anything for RSA cryptography, because we never had any proof that integer factorization can't be done in polynomial time on a classical computer.

It's true, in some sense.


The paper also shows that the proposed QC strategy for calculating energy levels - where you start with a guess and then optimise - may not lead to exponential speedup on a large class of important chemicals, because just coming up with an appropriate initial guess has no polynomial time algorithm at present. So it's not clear for what class of chemicals this would even theoretically become a tractable calculation.


Yes, that applies to the chemicals for which we currently have very good heuristics to guide the optimization, in a process that is usually validated empirically afterwards because of its imperfect reliability.

At the same time, it doesn't go to lengths to discuss how the heuristics needed for the quantum computer are much more general and self-validating (if it converges, it converges).

But then, nobody knows if the classical algorithms will improve enough so that when we actually get quantum computers they will still be needed. That's the article's point... which is not different from the RSA point in my post.


I figure it's more like a standoff between these shops. Just consider that, with hindsight, no one would want to build ENIAC; everyone would want to design the solid-state transistor.


I have a fair amount of experience in this space. It’s like, at a vacuum tube era, at best. There is a definite opportunity for advancement, but it is still extremely early.

We are building user interfaces that make it easier to “play around” with quantum computing phenomena—especially with music and art—with the idea that our aesthetic sensibilities may help drive discovery.


> It’s like, at a vacuum tube era, at best.

How is it even remotely close?

Vacuum tubes were a thriving industry, producing many groundbreaking products and services.


I think the comparison is still reasonable: while we do not have scalable quantum computers, the technologies developed for them have actually seen a lot of use: squeezed light and non-classical light, color centers, Josephson junctions, nonlinear-optics at the single-photon level, to name a few "terms of art" that should be google-able, are crucial for precision sensing and telecom.


I am not aware of a single telecom application that uses any of these technologies. Could you give some examples?


The nonlinear optics is what enables optical signal modulators. For modulation of classical optical signals it is not necessary to have a "strong nonlinearity at the single photon level". However, that is being developed for quantum computing applications and as a side effect it makes today's optical modulators much better.


Telecom signal modulators are almost exclusively based on the Kerr effect or acousto-optics. Both effects have been known for at least 100 years and have nothing to do with nonlinear optics. The development of modern modulators was always driven by telecom and then applied to scientific experiments, not the other way round.


There might be some semantic misunderstanding here. "Kerr effect" is "nonlinear optics" in my book - it works both at the low GHz frequency of electro-optics and at the high THz frequencies of pure optics. I agree with the rest of your statement (and it does not seem to be in conflict with the initial comparison: vacuum tubes were being developed for 50 years due to the needs of the radio industry before they became viable for computation).

And the direction of the symbiotic improvements is not necessarily as clear: there are academic groups solely focused on quantum optical effects, whose research then gets reused by the telecom industry and vice versa. These quantum research groups would be dead in the water if it were not for the fab capabilities initially developed by telecom folks, but they have certainly surpassed them by far now (in one-off "hero" devices).

P.S. Same with acousto-optic devices. What you would find in a quantum computing lab is far more impressive than what is being deployed today. Even if the quantum computing field is a bust, the tech they developed would improve telecom state of the art by orders of magnitude.


Right now it is at the pre-transistor stage of the 1920s and 1930s.

Vacuum tubes are analogous to the computers that we have now. In 1925 a patent for the concept of a FET was filed. It wasn't until 1948 that we had a working transistor.

That took 23 years to go from concept to useful invention. It isn't too surprising that a quantum computer is harder.


They are saying that quantum computing is at the stage that classical computing was at when it still relied on vacuum tubes.

Not that quantum computing is at the same stage as the vacuum tube industry was at at some unspecified time.


It still doesn't wash. Vacuum tube computers were used in WWII to compute artillery tables. That presumably changed (or ended) lives.

I don't think it's a reasonable comparison at all.


Yet before that stage, they were a new tech that no one even had ideas of how to use.

So far, quantum computing is used in labs, not for any real useful purpose.


> Yet before that stage, they were a new tech

As far as I know - and we still had some tube radios at home when I grew up - they were used pretty much right away. For radios, and for replacing mechanical relay switches, over which their obvious advantage was that, since the tubes were not mechanical, they would last longer -- especially important in systems with lots of them.

So I'm not completely sure, and somebody with definite knowledge should chime in, but I think they were not some "we have no idea what to do with this" tech but were useful from early on, as soon as the manufacturing method was solid. They were used for "normal" electronics well before they were used for the early computers. Even though Google search - which I just tried - also seems to have a bias and mostly associates those vacuum tubes with computers, forgetting all about the many other uses. It was the predecessor of diodes and transistors in addition to being better than mechanical relays.

https://www.engineering.com/story/vacuum-tubes-the-world-bef...

https://www.nps.gov/features/safr/feat0001/virtualships/vrmo...

(just one of the earliest commercial use cases as example for how short the period was between invention and wide-spread use)

> Dr. Lee deForest invented the vacuum tube in 1906. His tube, which he called the "audion," was first developed as a detector of radio waves and was quickly adopted by shipboard operators. Later experimentation, by deForest and others, showed the ability of the vacuum tube to generate radio signals with far greater precision than earlier systems. By 1914 the essentials of tube-based transmitters had been worked out.


Sure, tube-based transmitters had been, in a lab, "worked out". But we're talking about 20 years before commercial use really happened:

https://en.wikipedia.org/wiki/History_of_radio#The_first_vac...

Radio existed (as you say) before the tube, but it took time to get the tube into real-world use. I guess this is somewhat like computing; computing has existed in many forms, mechanical through to the IC designs we have now.

But quantum? It's in a lab, and real-world applications are being worked out.

Paralleling back to radio again, people knew tubes would be useful, but it took 20+ years to make them mass-producible and usable, and to redesign radios to use them. This, to me, sounds like quantum computing.

We can think of things I suppose to use them for (perhaps I was hasty on this point in my prior post), but we aren't there yet. And really, looking at the tube -- did anyone think they'd be used to make massive computers even?

I bet quantum computing will be the same way -- uses we're not aware of now.


You need to get your information from places other than Wikipedia. The vacuum tube was invented in 1904 and entered mass production by RCA in 1920 after several rounds of improvements. D-Wave started demonstrating its QC product in 2009 and here we are 13 years later with basically nothing. Are you going to tell us that QC will be commercially viable in three years?


It's honestly just a really bad analogy. Evacuated tubes were not pursued en masse as an end unto themselves the way quantum computing is being pursued. Their development was a side fascination for a handful of researchers, or they were advanced because they were directly applicable to some goal of a researcher with maybe some mild tweaking. They were not a moonshot, and they developed along similar lines as most useful "inventions" of today.

Quantum computing isn't even really comparable to ENIAC, because the fundamental parts of ENIAC were known to be capable of doing what was required before the machine was built. There were analogous precursors to basically all the parts of the machine. The engineering challenges came with integration and at scale, but fundamentally there wasn't a question of whether it could be done or not. We can't say the same yet of quantum computing.

IMO, quantum computing is more like space travel ca. the 1950s. We've got a lot of information: rockets look promising, we've learned a lot about the environment pilots will need to operate in up there, etc but no one really knows how far this can go and certainly we can't say if it will ever be profitable.


> It's honestly just a really bad analogy. Evacuated tubes were not pursued en masse as an end unto themselves the way quantum computing is being pursued

No, it's not. The vacuum tube was sold for commercial and industrial use starting in 1915, for rectification. And most of the research spending went into it after it had shown some promise in commercial application.

Not arguing whether pursuing something just for research is bad, just saying vacuum tube research was nothing like quantum computer research.


> just saying vacuum tube research was nothing like quantum computer research.

That's what I'm saying. Vacuum tubes had various evacuated tubes as their precursors which had uses in experimentation and industrial applications. This led to a step by step development process with continuous subsequent innovations building on each other. Quantum computing on the other hand is kind of an all or nothing proposition with many problems that must be solved which only produce value when functioning as part of the whole.

I don't think the development of classical computing is a good analogy either as the shared memory computer had various electromechanical precursors that had utility all of their own.


To clarify the analogy:

Quantum computing with today's technology :: Classical computing in the early era of vacuum tubes

The point of the analogy is to communicate that we are at an early stage of development. Also, I just like vacuum tubes.


> Quantum computing with today's technology :: Classical computing in the early era of vacuum tubes

There's still a big difference.

Even in the earliest era of computing with vacuum tubes (and even before), they were building machines that produced useful results: artillery tables, H-bomb simulations, cryptanalysis, etc.

Most things these machines were used for were simply impractical without them.

There's nothing even remotely analogous with quantum computing.


Well, to be fair: quantum computers can do practical computations—but because “normal” computers are so unbelievably well developed, it is not practical to do them with quantum computers.

The earliest era of computing was mechanical. The phenomenon of thermionic emission (the basis of vacuum tubes) was observed by Edison in 1883. The Colossus computer, with thousands of vacuum tubes, was the first practical use of vacuum tubes for computation, in 1943. We cannot yet assemble thousands of qubits together. Once we can do that, we will be able to perform many useful functions. However, even with thousands of qubits, many of the functions will be more practical to run on a classical computer (because classical computers are amazing and continue to develop).


> The Colossus computer, with thousands of vacuum tubes, was the first practical use for computation, in 1943.

I don't think this is true. Perhaps the Colossus is what we would call the "first computer" by some definition, but electromechanical computing devices built for specific purposes preceded it. There wasn't a 60-year gap of investment with no practical application of classical computing.

Classical computing was built up over time, with practical utility along the way. The vacuum tube eventually became a useful component of it.


No, current quantum computers cannot do any practical computations. The gate errors are astronomical compared to even the earliest digital computers. Any algorithm with more than a few gates will produce just noise.


So it's a meaningless description then of a tech's development. Somewhere between new/useless and mature/ubiquitous.


Could you share some links to your work / research re: interfaces with focus on art? I'd be very interested to check it out.

https://quantumdelta.nl/ is some kind of hub but landing pages offer too much hype and too little content :)

thank you.


I will have material to share in about a month!


"The Quantum Computing Bubble" - https://news.ycombinator.com/item?id=32630815

"Separating Quantum Hype From Quantum Reality" - https://news.ycombinator.com/item?id=32691220



This is something of a low-effort article, with a short-sighted focus on immediate profitability. There are many scientific programs that didn't really become private-free-market revenue generators for decades at least (the US space program, for example).

An article with a little more depth might examine the future of trapped-ion quantum computing, for example:

https://en.wikipedia.org/wiki/Trapped_ion_quantum_computer

As for the 'make money off new drugs' mentality, that's not really where QM chemical simulation in molecular dynamics seems all that promising - it's more about things like the design of new catalysts to improve the efficiency of various industrial processes.

If QM computation is eventually developed, the devices will almost certainly be large and extremely expensive (kind of like the cutting-edge chip fab machines of today in scale). For most businesses, it's unlikely the benefit of owning one will justify the cost, so it'll probably be a national lab / research center type thing.


The key consideration with investments is ROI. When the investor is a government, it can afford to take the long view. For most companies and institutional investors, this works less well.

The key mechanism to protect inventions is patents. Patents have a limited shelf life. If you file a lot of patents today and it takes 30 years before you can apply them, they will have expired by then and others are free to take your inventions and build on that. So, if quantum computing requires another three decades to start making money, most of the companies that are currently being invested in will have failed and their patent portfolios and investment will be worthless. Their patents will have expired, their founding scientists will have moved on or retired, etc. At best those companies may be in a position to file more patents. So, any investors investing right now are making bets on how long it will take before there's a meaningful market to get an ROI and which companies are positioned best to take a chunk out of that market. The further that is out, the higher the risk of losing their investment.

There are billions flowing into quantum computing, and the article is simply making the point that in terms of revenue potential there seems to be a lot of uncertainty about the practicality of current approaches, a lack of any real revenue (beyond consulting people on how awesome it would be if we had working quantum computing, etc.), and a lack of perspective on when all this will change. Very valid points. There are a few big companies investing in this stuff but none of them is betting their company on it. It's a side show at MS, Google, IBM, etc.

A long shot that might create some viable business decades from now, but if it all fails, their stocks will be fine. There's enough substance there for them to want a finger in the pie if it does take off, but none of these companies seems to be counting on that happening any time soon.


There’s also billions going into commercial fusion reactors, which haven’t turned net positive yet. The goal of the investment is to build that capability tho, same (I think?) as with quantum computing. Weird critique imo.


It's just saying that quantum startups are overvalued. Which may be true. It's certainly phrased in the article as an opinion.


Thing is, fusion is known to be possible - the sun and hydrogen bombs. Quantum computing lacks equivalent existence proofs. There's a lot of abstract theory but no real indication that it is possible to scale up in the real physical world to useful problem sizes, and there are reasons to doubt that it is.


Neither of those is contained fusion. I think gp is implying that it might be impossible to have net energy gain for contained fusion (which currently seems to be the case even with the most advanced designs).


Contained fusion might be impossible but I don't think we have anything as suggestive that quantum computing is possible as we do for fusion.


As the tech is still unproven, it's research, rather than building capability, that they're spending money on. I hope all these investors understand this...


The physics underpinning fusion was proven in the 1950's. Since then it's been an engineering problem.

The physics underpinning QC was arguably proven in the 2020's. It's not quite done (in the way that fusion was not quite done in the 50's) but there is a fairly clear set of demonstrations that QCs with error correction are possible. However, the engineering barriers are fierce and there is still a possibility that they are insurmountable. In addition, there are concerns that while QC will work, the class of problems that are in NP and also in BQP may be very small. Even if a problem is in that group, it may be that the algorithms we have are not superquadratic, or even quadratic - meaning that the improvement they offer over classical algorithms may be marginal.

Worse, there are often very good heuristic approaches to some of these problems, which means that although a superquadratic QC approach would be an amazing breakthrough of computer science (genuinely amazing and worthy of accolades and prizes and fundamentally important for our understanding of the universe etc.) it would offer only marginal economic value (possibly). Now, this is not true of some problems where there are exponential explosions and no good heuristics... but there is an even worse catch, which is that the quantum algorithms offer computer scientists fresh insight into what's tripping up the classical approaches. In this scenario it can be that an amazing breakthrough happens in QC, and someone uses that to get an insight that pushes the classical approach close enough to the QC approach as to render the QC approach marginal.
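
To see why a merely quadratic speedup can end up economically marginal, here is a back-of-the-envelope sketch; the per-step times are invented for illustration (fast classical steps versus slow, error-corrected logical quantum steps), not measured figures:

    import math

    t_classical = 1e-9   # assumed: 1 ns per classical step
    t_quantum = 1e-3     # assumed: 1 ms per error-corrected logical quantum step

    # Grover-style quadratic speedup: classical ~ N steps, quantum ~ sqrt(N) steps.
    # Crossover where the two costs are equal: sqrt(N) = t_quantum / t_classical.
    crossover_N = (t_quantum / t_classical) ** 2
    classical_runtime_min = crossover_N * t_classical / 60
    print(f"crossover at N ~ {crossover_N:.0e}, "
          f"i.e. a classical run of ~ {classical_runtime_min:.0f} minutes")
    # With these assumptions the break-even point is N ~ 1e12 (a ~17 minute classical
    # run); larger error-correction overheads or classical parallelism push it out further.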

The theoretical picture is moving very fast though - so we will have to see.

On the other hand, the practical side is moving more slowly. We see announcements that make one think that a Moore's-law type of scaling is happening, but hidden in the small print there are often (always, as far as I can decode) catches that mean that while the results look great they are still very much mired in problems. For example, are all the bits on a QC usable at once? Can they be used to form an actual algorithm? How long does the machine run for? How long does it take to start? Some of the answers are jarring - often only a small subset of a machine can be used in an actual problem-solving episode; sometimes the machines run for a few steps only; sometimes the machines take 24hrs or longer to start.

It has taken 70 years to nearly build fusion reactors, it took 70 years to create mRNA vaccines. It may well take 70 years (from now) to build practical, valuable quantum computers. And something could go wrong on that path that just renders them moot.


I forgot to mention something else - the exact results of QC algorithms are read probabilistically from the instruments that read out the state of the machine. The confidence intervals for these results are important. I do not think that the results for every demonstration are read to 7 sigma... Be careful about this, because if you are seeing a result of an exact algorithm that's read to 3 sigma then it's probably best to rate it as equivalent to a heuristic that gives a result within 0.01% 99.99% of the time (I am being generous). It's all in the small print in the annexes...
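
For readers who want those sigma levels as probabilities, a quick sketch (assuming two-sided Gaussian tails, which is itself an assumption about how the error bars are quoted):

    from scipy.stats import norm

    for k in (3, 5, 7):
        p_outside = 2 * norm.sf(k)   # probability mass beyond +/- k sigma
        print(f"{k} sigma: wrong roughly {p_outside:.1e} of the time")
    # 3 sigma ~ 2.7e-3 (right about 99.7% of the time), 7 sigma ~ 2.6e-12 --
    # a very different standard of evidence.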


One point of interest was the paper on quantum computing applied to quantum chemistry[1]. In that paper, they did not find generic exponential speedup for a list of chemistry problems with current quantum algorithms. There are 3 problems with this: a speedup does not need to be exponential in order to be incredibly valuable; a speedup does not need to be extremely generic, just enough to cover real-world use cases; and quantum algorithms are still in their infancy, and it's unclear how much more we might discover in the next 10-20 years.

Furthermore, the paper itself links to a github repository[2] with a list of papers that either imply or use an exponential advantage in quantum chemistry. Now would be a good time to mention that I am not an expert in chemistry, nor have I read the entirety of this list of papers so I am not in a position to go through each and every one to decide how generic their results are or what the limitations are. Perhaps all these papers have fundamental limitations that prevent it from being useful in normal chemistry, only in weird souped-up problems specifically devised for a quantum advantage.

Either way, this paper is by no means conclusive on the subject. There's a ton of more research to be done in multiple fields to know for sure.

[1] https://arxiv.org/pdf/2208.02199.pdf [2] https://github.com/seunghoonlee89/Refs_EQA_GSQC


The reason exponential speedups are required is due to the extreme cost of quantum computing R&D and extremely limited quantum computers that come out of it.

I can provision 1k CPU based servers or ~20 4x GPU based servers in a cloud computing environment for an hour for <$400. These are mature technologies with massive economies of scale behind them. A quantum computer needs to not only outperform scale out GPU/CPU performance on a particular problem set, it needs to crush it.


> The reason exponential speedups are required is due to the extreme cost of quantum computing R&D and extremely limited quantum computers that come out of it.

Hmmm. I'm no mathematician; but I thought the value of an "exponential speedup" is if you are trying to solve a problem with "exponential complexity".

I don't know if "exponential complexity" is a thing; I'm pretty sure "exponential speedup" isn't. Is it correct to say that a quantum factoring machine has an "exponential speedup"? Isn't it more accurate to say that the exponential difficulty is a property of the classical algorithms, not of the problem itself?


In an informal setting, specifying an exponential speed up is equivalent to specifying a "linear time solution to a problem for which the best known classical algorithm has exponential complexity".

My point was that we have immense amounts of classical compute available. QC systems will not be economically viable unless they deliver gains which are >10x classical computers.
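
To make the distinction concrete with plain arithmetic (illustrative only, no claim about any specific algorithm): a 10x gain shaves a fixed factor off the runtime, while an exponential-to-polynomial separation changes how the cost grows with problem size.

    # Illustrative growth comparison: exponential vs polynomial cost, vs a flat 10x gain.
    for n in (30, 40, 50, 60):
        exponential = 2 ** n      # e.g. brute force over n bits
        polynomial = n ** 3       # e.g. a hypothetical polynomial-time alternative
        print(f"n={n}: 2^n = {exponential:.1e}, n^3 = {polynomial:.1e}, "
              f"ratio = {exponential / polynomial:.1e}")
    # A constant 10x speedup moves each 2^n figure down one notch; the
    # exponential/polynomial gap keeps widening without bound as n grows.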


OK, thanks.

I'm one of those pedants that cringes when anyone uses "exponential" to mean "very fast". In a technical forum like this, I expect people to use a word like that fairly precisely; I'm cool with the version "a linear-time solution to a problem with exponential-time complexity for classical algorithms".

I'm also OK with your claim about the economic viability of QC; I'm not OK with the implication that there is anything exponential about a 10x performance gain.


Note that the author only recently finished his PhD in an area tangential to quantum computing, so he is not a giant in the field like, say, Aaronson, whose thoughts on this industry I would be interested in knowing. I do in fact agree with most of the article. However, I also think that if you measure the ratio of scientific/technological impact to funding, I would place quantum computing far higher than many other hype bubbles such as crypto, blockchain, and web3.

In other words the size of funding QC is getting is nowhere close to the other hype bubbles and there are some significant peer-reviewed results that have been generated from it, so for the time being you can still give it the benefit of the doubt.

For example it has definitely enhanced our understanding of quantum chemistry and computational complexity, and anyone who invests time learning QC will end up having solid new insight about how the world works and deep engineering knowledge of electronics, which you can't say about many other bubbles.

For example, compare how many QC startups YC has funded (I think 0?) compared to blockchain, crypto, AI-assisted medicine and web3. There is no comparison. Picking on QC is far below my list if you want to have a go at hype bubbles.


>That means these firms are collecting orders of magnitude more in financing than they're able to earn in actual revenue — a growing bubble that could eventually burst.

>"The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about 'how quantum computers will help their business,'"

well, that makes QC a bona fide tech industry.


Surely Accenture is already training thousands of people to help businesses transform their strategy towards web4 qApps and the quantumverse.


Not all bubbles are equal in risk, folly, or long-term sustainability.

In the case of the Internet Bubble stocks were down 78%, but it was not hard to do well in the end given diversification and a long enough horizon.

In the case of the Dutch Tulip bubble there was no good ending for anyone except those who got out early.

Some bubbles like NFTs generate strong opinions but have yet to have final judgment from history.

I think the quantum computing bubble is different than all three, but closer to the Internet than to Tulips. In which case the conventional strategy would be to diversify and expect a long time horizon.


> In the case of the Internet Bubble stocks were down 78%, but it was not hard to do well in the end given diversification and a long enough horizon.

This is untrue. If you were invested in what became the stars of that era: Amazon, Red Hat, Cisco, a few others, you eventually made decent money, although far worse than if you had stayed out and bought the dip.

If you had a diversified portfolio of 'new economy' stocks which didn't include a few winners like this, you might have lost over 95% of your money and never got it back. Lots and lots of stocks simply disappeared or were bought for peanuts. Many others, including lots of very very highly rated ones like Yahoo never exceeded their bubble-era peaks.


Let me give an example of why I think it's true; the difference between our opinions may be based on tightening up the premises for "diversified" and "time horizon".

By diversification I’m assuming a NASDQ index fund, which many of the hot new Internet stocks, as well as larger establishes tech companies benefiting from bubble were part of.

If you invested in NASDAQ everything at the absolute worst peak of the bubble: the initial crash put you at a -78% return, and it took 21 years to recover all losses and earn a 300% return.

Why do you think that's not "doing well" for an index fund closely tracking the bubble?
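
For what it's worth, the annualized rate implied by those figures is easy to compute; whether "a 300% return" means ending at 4x or 3x the initial stake is ambiguous, so the sketch below shows both readings:

    # Annualized return implied by "a 300% total return over 21 years".
    years = 21
    for label, multiple in (("read as ending at 4x", 4.0), ("read as ending at 3x", 3.0)):
        cagr = multiple ** (1 / years) - 1
        print(f"{label}: about {cagr * 100:.1f}% per year")
    # Roughly 6.8%/yr or 5.4%/yr -- modest, but far from a wipeout.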

You could say alternative scenarios would’ve done better but that’s always the case.

The main point is, for someone with a long time horizon who was diversified, this turned out way way better than a lot of other bubbles turned out.


I'll get downvoted into oblivion for this, but literally most of my old academic friends are in on this grift, old advisors own some of the largest enterprises in this 'sector', and all these people (in private) regard quantum computing as a money grab. At best as a way to fund research.

And that's it! The author of this article is 100% right. Markets are fully aware though; go ahead and try to short any publicly traded QC stock lol. You can't. There are no shares to borrow and no liquidity on puts....


This article is kind of crap. I kind of expected better from the FT.

> The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money.

ORLY? I guess I should go massively short IBM shares then. https://newsroom.ibm.com/image/2022%20IBM%20Quantum%20Roadma...

> Shor's algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called "quantum-secure" ones.

ORLY? New cryptography can take 20 years or more to be fully deployed to all National Security Systems. NSS equipment is often used for decades after deployment. National security information intelligence value varies depending on classification, sensitivity, and subject, but it can require protection for many decades. -NSA

The solutions we do have do not work very well. Only the weakest, FALCON-512 (a bad name, as it offered only 64 bits of quantum security, and now the dual lattice attack seems to reduce this to 20?), actually fits the TLS use case without breaking the internet. The signatures are just too big. Cloudflare has testing that proves this.

If that wasn't enough, this person is completely unaware of the annual survey of quantum researchers that actually puts the arrival of a cryptanalytically relevant quantum computer at 2030 or so. Peter Shor is actually one of the people polled in the survey; this person is not. And if your parts are still clean, you can look at the survey's estimates since 2018. These estimates are clearly trending towards sooner and sooner, instead of further and further away.

If you still have doubts, read this: https://www.whitehouse.gov/briefing-room/statements-releases...


According to the OP, the quantum computers that are feasible today have no real-world use, and likely won't have any real-world uses in the near term, despite claims to the contrary by executives and salespeople. Quantum computing research, the OP asserts, is an academic pursuit, not a commercial one, but it is funded by investors who think it's the latter. The entire piece can be summarized as "Look! The emperor has no clothes!"

Quoting:

> Billions of dollars have poured into the field in recent years, culminating with the public market debuts of prominent quantum computing companies like IonQ, Rigetti and D-Wave ... These three jointly still have a market capitalisation of $3bn, but combined expected sales of about $32mn this year (and about $150mn of net losses), according to Refinitiv.

> The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money. The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about "how quantum computers will help their business", as opposed to genuinely harnessing any advantages that quantum computers have over classical computers.


How did this come about? I think it's a very simple case of misunderstanding and overpromising. There are many interdisciplinary fields that mesh with computation. We have DNA computers and neural Turing machines. QC is a subfield of quantum mechanics, one of many with some interesting applications, but nothing shows that it has open-ended potential to revolutionize computation. But it has the word 'computation' in it, and in the past decades the VCs with the most money come from computer science. So you have a combination of quantum (spooky, mysterious) with computation (that one I know). I never got why QC was seen as so promising; it's an interesting exercise on paper but is not the 2nd coming of anything. Wish that money had gone to fusion instead; that one we understand now more than ever, and it has very real positive consequences.


> nothing shows that it has open-ended potential to revolutionize computation

Well, we do know that P <= BQP <= PSPACE, and we have one important example that lies in BQP but not in P (for all we know). It's just not clear how important that particular example is for the kind of computing we do today, if it ever becomes practical. It looks like it'd rather result in a one-time nuisance for sysadmins, like Y2K was.

The hope was for applications in new areas like materials and drug design. The author has posted a link to one paper suggesting that we might not see exponential speedups in chemical simulations, but that's not an outright refutation either.
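
For a sense of scale on that one important example (factoring), here is a rough sketch comparing the asymptotic cost formulas for a 2048-bit modulus; leading constants and lower-order terms are ignored, so treat the outputs as order-of-magnitude only:

    import math

    n_bits = 2048
    ln_N = n_bits * math.log(2)   # natural log of a 2048-bit modulus

    # General number field sieve (best known classical factoring algorithm):
    # exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))
    gnfs = math.exp((64 / 9) ** (1 / 3) * ln_N ** (1 / 3) * math.log(ln_N) ** (2 / 3))

    # Shor's algorithm: polynomial, roughly (log N)^3 logical gate operations.
    shor = n_bits ** 3

    print(f"GNFS ~ 10^{math.log10(gnfs):.0f} operations")
    print(f"Shor ~ 10^{math.log10(shor):.0f} logical gate operations")
    # Roughly 10^35 vs 10^10: the asymptotic gap is real; the question raised in the
    # thread is whether hardware good enough to exploit it can be built.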


It was definitely sold as the next step in computation.

A real performant quantum computer could potentially revolutionize a lot of industries. But selling it as an accelerator of molecular dynamics simulations is not quite as sexy.


"Quantum computing as a field is obvious bullshit"

https://scottlocklin.wordpress.com/2019/01/15/quantum-comput...


Further confirmation of my bias and/or evidence against QC.

Operator Imprecision and Scaling of Shor’s Algorithm

https://arxiv.org/ftp/arxiv/papers/0804/0804.3076.pdf


Well, hopefully the quantum industry has fundraised enough to last them for a while. It isn't the exact picture, but minus potential shareholder lawsuits for tanking stocks, these are exactly the sorts of startups that should be comfortable with taking 70%+ haircuts to their valuation.


The observation that important technologies required a long time from inception to practical use is common here. That is true, but it ignores the fact that there were a tremendous number of possible technologies available that could have eventually worked out. Only a very small number ever did.


I don't believe in quantum supremacy.[1] I think that someone will eventually come up with a binary computer equivalent for Shor's Algorithm[2].

I further posit that there are no quantum algorithms without binary equivalents.

[1] https://en.wikipedia.org/wiki/Quantum_supremacy

[2] https://en.wikipedia.org/wiki/Shor%27s_algorithm


Are you extending that to pure polynomial speedups like Grover's?

Because I find it really hard to believe there will ever be an O(sqrt(n)) classical algorithm for unstructured search. How could there possibly be?


Based on 5 minutes of reading about Grover's algorithm, I don't see how a practical quantum computer (an analog system) can repeat an operation 2^64 times to break a 128-bit key without some bias towards interaction in the qubits involved. You'd have to have 1/(2^128) attenuation between each and every channel for that to even work reliably. That's 380 dB of signal-to-noise ratio. Typically 40 or 50 dB of isolation is all you get between channels in communications systems.

[Edit -- Additional considerations] In order for Grover's algorithm to work, you have to implement your conventional algorithm in quantum hardware, with perfect fidelity. Digital computers can do this just fine because every gate is also a comparator, so signal-to-noise ratio isn't an issue. I fail to see how a quantum gate can possibly operate with enough fidelity to even just copy the input state after 2^64 stages, let alone implement complex logic AND the Grover diffusion operator.

[1] https://en.wikipedia.org/wiki/Grover%27s_algorithm
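
A quick sanity check of the numbers in the comment above, as plain arithmetic (no claim about any particular hardware; the dB figure assumes the 1/2^128 ratio is interpreted as a power ratio):

    import math

    key_bits = 128
    N = 2 ** key_bits

    # Grover needs on the order of (pi/4) * sqrt(N) oracle iterations.
    iterations = (math.pi / 4) * math.sqrt(N)
    print(f"Grover iterations ~ {iterations:.1e} (about 2^{math.log2(iterations):.0f})")

    # The 1/2^128 attenuation figure expressed in decibels, as a power ratio:
    print(f"10 * log10(2^128) ~ {10 * math.log10(N):.0f} dB")
    # ~1.4e19 iterations and ~385 dB -- the same ballpark as the figures quoted above.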


Isn't the whole point of QC that a classical computer needs to do 2^64 calculations to try 64 bits of data in a function BUT a QC just needs 64 qubits, because each qubit can be both 0 and 1 simultaneously? This effectively turns a time problem (2^64 operations) into a memory one (64 qubits and 1 operation over all of them, maybe repeated 10 times for error detection).


Qubits are NEVER both 0 and 1 simultaneously... they might be either at any given moment. A qubit is a vector of unit length in 3 dimensions, best illustrated with the Bloch sphere [1]. In this sphere, up, where Z=1, is written as |0> because of history.

One of the most common quantum logic gates [2], the Hadamard gate, performs a rotation about a diagonal axis halfway between X and Z. This gate is used many, many times in Grover's algorithm.

There are many such rotations in quantum computing. Error correction can only ensure that a state is at either end of an axis, not at the correct point anywhere on the sphere.

[1] https://en.wikipedia.org/wiki/Bloch_sphere

[2] https://en.wikipedia.org/wiki/Quantum_logic_gate#Hadamard_ga...
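
A tiny numerical illustration of the point about superposition, using state vectors rather than Bloch vectors (just numpy; purely to show what applying H to |0> produces):

    import numpy as np

    ket0 = np.array([1.0, 0.0])                  # |0>
    H = np.array([[1.0, 1.0],
                  [1.0, -1.0]]) / np.sqrt(2)     # Hadamard gate

    state = H @ ket0                             # (|0> + |1>) / sqrt(2)
    probabilities = np.abs(state) ** 2           # Born rule: measurement statistics
    print(state, probabilities)                  # [0.707 0.707] [0.5 0.5]
    # The qubit is in one definite superposition state; measuring it gives 0 or 1
    # with probability 1/2 each -- not "both values at once".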


Isn't that the point of quantum error correction codes?


You would think so, but no. The correction is only to ensure the resultant measured qubit is correct.

Grover's algorithm apparently relies on very small phase angles iterated very many times, which is beyond the capabilities of current forms of quantum error correction.


Engineering can never supersede the limits of Physics. Engineering follows Physics.


It's not exactly a secret that we are very far away from useful quantum computers. Every comment I've ever seen from people in that industry, except those in a financial position to benefit, has said so.


Frankly, the article reads as if the author has an axe to grind.

On utility, there's more than just Shor's: unstructured search [1], finance ([2], [3]). Even if quantum computers ultimately prove commercially unfruitful, that doesn't render the field a useless endeavor. Like String Theory, it can beget findings in other areas, regardless of whether you can profit from them: novel classical recommendation algorithms ([4]), quantum algorithms for SAT that could possibly help automated theorem proving ([5]).

Part of the difficulty of quantum computing is that to show speedup, you need to find complexity bounds on classical problems whose runtime is actively being researched, e.g. neural networks ([6]).

As for whether they're financially worthwhile, while there is valid concern ([7], [8]), it's far too early to tell: it's hardware, not software. Also, it's my understanding that private investment in quantum computing is much larger than public funding in the US, and both pale in comparison to China's investment. Thus, I wouldn't want to see investors shy away if the government is unwilling to make up the difference!

[1] - https://en.wikipedia.org/wiki/Grover%27s_algorithm

[2] - https://arxiv.org/abs/1905.02666

[3] - https://arxiv.org/abs/1908.08040

[4] - https://scottaaronson.blog/?p=3880

[5] - https://cstheory.stackexchange.com/questions/36428/do-any-qu...

[6] - https://arxiv.org/abs/1912.01198

[7] - https://www.microsoft.com/en-us/research/project/topological...

[8] - https://arxiv.org/abs/2110.03137


Anybody have any ideas on how to short the quantum computing space?


There are publicly traded quantum companies (IonQ, etc.)


Great. A link to one of the most formidable paywalls on the Internet...


Try the 'Bypass Paywalls Clean' plugin. It's working for me.


Commenting on this, now that it's making its third or whatever pass on HN. Also Scott Aaronson's comments are interesting: https://scottaaronson.blog/?p=6670

> The most prominent application by far is the Shor algorithm for factorising large numbers into their constituent primes, which is exponentially faster than any known corresponding scheme running on a classical computer. Since most cryptography currently used to protect our internet traffic are based on the assumed hardness of the prime factorisation problem, the sudden appearance of an actually functional quantum computer capable of running Shor’s algorithm would indeed pose a major security risk.

> Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.

Note that Shor's algorithm breaks not just factoring, but also discrete log, including elliptic curve discrete log. That includes classic DH and DSA of course, as well as ECDSA and ECDH, whether they're over Bitcoin's curve, the other NIST curves, Brainpool, {curve,ed}{25519,448}, pairing-friendly curves, everything. Almost all broadly deployed public-key crypto uses RSA or elliptic curves. Those alternative public-key algorithms are still being worked out, and will take years to broadly deploy, so if a QC gets built, it will probably be able to break into straggling systems for some years. There is also a risk that the replacements will eventually fall to quantum or even classical attack, especially considering that a significant fraction of the proposed replacements already have fallen (most recently SIKE) or been weakened (eg, every multivariate quadratic sig). They may also have other security problems, eg implementation bugs or side-channel attacks.
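For anyone unfamiliar with why period finding breaks these systems, here's a toy sketch in Python of just the classical post-processing step of Shor's algorithm (my own illustration; the quantum part's only job is to find the period r efficiently, which is brute-forced here for a tiny N):

    from math import gcd

    N, a = 15, 7   # toy modulus and a base coprime to N

    # Period finding: smallest r > 0 with a^r = 1 (mod N).
    # A quantum computer does this step in polynomial time; here we brute-force it.
    r = next(x for x in range(1, N) if pow(a, x, N) == 1)   # r = 4

    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        p = gcd(pow(a, r // 2) - 1, N)
        q = gcd(pow(a, r // 2) + 1, N)
        print(p, q)   # 3 5 -- the factors of 15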

The surviving quantum-secure algorithms are all either pretty inefficient (McEliece and SPHINCS+, and CSIDH and SQISign, but those are also bleeding-edge), or use structured lattices (Kyber, Falcon, Dilithium, NTRU and NTRU Prime, etc.) or structured codes that look kind of like structured lattices (BIKE, HQC). So we'll have most of our eggs in just a couple of baskets again, and outside of applications that can use McEliece and SPHINCS+, they'll be newer, less-tested baskets. Also, while fast, the structured lattice and structured code systems still use significantly more bandwidth than elliptic curves.

Using long-term symmetric keys instead of or in addition to public-key crypto is possible in some applications, but it's obnoxious and limiting: you'd end up with some combination of Kerberos derivatives (with trusted third parties acting as single points of security failure), mailed smartcards or other secrets, and physical in-person meetings to set up shared keys.

So the bigger issue in my view is that outside of Bitcoin, breaking crypto is mostly a net negative for society. Transitioning to quantum-secure crypto is also a negative, in that it will take a ton of work and the replacements are less efficient than elliptic curves, and may have security problems. (It's also probably unavoidable because governments will try to build QCs to break crypto even if private industry doesn't.) So all this money is being spent on something whose first major application will be negative, if it even works at all. Hopefully the positive stuff will outweigh this.


Sometimes I wonder if we should try building a news site optimised for seeing the effect of appeals to authority.

To write an article for the site, we would need to:

1. Write a headline with no mentions of any experts.

2. Write another headline mentioning at least one expert in it.

3. Write the content without mentioning any experts.

4. Write the content and sprinkle names of experts as needed.

5. Publish.

Now, the reader would then:

1. Be exposed to the no-experts version of the article - both headline and content.

2. Once finished, the reader will be prompted to write their thoughts on the article.

3. Click “Reveal”.

4. The reader would then skim or read the whole article again, but this time it would mention the experts.

5. Prompt the reader to evaluate how their thoughts had changed after reading the expert version of the article.

I’m so gullible that seeing experts in anything, especially when names of prestigious institutions or titles are tacked onto them, tends to shut down the reasoning part of my brain altogether.

Bear in mind, the site I proposed is not a place to police how articles should be written; rather, it’s all about increasing its readers’ awareness of how much mentions of an authority can affect their initial reasoning and judgement, and sometimes make them stop reasoning altogether. My view is that mentions of an authority are useful for calibrating our judgements after we have tried to reason on our own, not before.

And yeah, I have no opinion on the original post. Just like to go off on a tangent once in a while.


There is value in knowing what experts say. I, and 99% of people in IT, will never be in a position to evaluate whether quantum computing is feasible or not. The only thing we can do is try to evaluate an expert's credibility and pick a side, so to speak.

The same goes for health advice, but this time it's 99.9%+: if you're not in the field, you can only listen and hope you are good at estimating who is more credible, likely to do better research, or making more truthful claims. Trying to evaluate the claims yourself is a recipe for being wrong and creating your own bubble.

If the article says "random guy X says quantum computing is a scam because Y", there is nothing I can take away from it, because it's very easy to make Y both incorrect and plausible-sounding to me. If I know it's an Oxford physics professor making the claim, I can at least learn that Y is a serious enough reason not to be easily dismissed.

Appeal to authority is bad as an argument when people knowledgeable in the field try to debate a certain point. In other cases it's very useful to know who makes the claims and very often it's the only thing that gives the claims credibility.


I love the idea.

I've always wondered if the format of long-form expert opinions could be replaced by a knowledge graph that is independent of the expert.

E.g. instead of article "Economist John rejects minimum wage"

Root node "Minimum wage is not the best solution to problem x" -> because -> <node to define problem>, <node to define alternative solutions> -> because -> <leaf nodes of studies or models>

In this way other experts could add to the graph, and the differences between branches of the argument could be more easily compared or automatically updated. Articles could still be written, but they could reference specific nodes or edges of the graph, which would add clarity to the discussion. (A rough sketch of one possible representation is below.)
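A minimal sketch of one way such a graph could be represented in Python (purely illustrative; the Node class and the minimum-wage nodes are made-up names echoing the example above):

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Node:
        claim: str
        because: List["Node"] = field(default_factory=list)  # supporting nodes: sub-claims or evidence

    root = Node(
        "Minimum wage is not the best solution to problem X",
        because=[
            Node("Definition of problem X", because=[Node("Study A"), Node("Model B")]),
            Node("Alternative solutions exist", because=[Node("Study C")]),
        ],
    )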


I've thought about this before, but notating it as a structure that needs logical supports instead of a graph.


When you say logical supports, do you mean something like "A because B and C"? Why is this better than a graph?

If you can't tell I don't know much about formal logic. Apologies for the naive questions.


I'm not saying it's better. I'm saying I've thought of a similar idea but with a different notation.


I didn't mean to come across as argumentative, I was just looking for more details so I could understand better.


Oh, you didn't. I did. Sorry - mind on other things!


Well, I don’t know, I do think that expert opinions can carry weight. Like: “according to Dr Malcolm Alan of the department of geology at Harvard university, this kind of rock usually indicates…”. I mean knowing that a professor of geology said that is evidence that a fact is true, isn’t it?


Not necessarily true if the "expert" goes against most other experts.


Having done that, you could then write a paper on your findings. You would then be an expert! You could then rerun the experiment using your paper as the story in order to validate your initial findings…


Appeals to findings/evidence would be different, nah?


> 5. Prompt the reader to evaluate how their thoughts had changed after reading the expert version of the article.

Basing the perceived quality of an article on an appeal to authority doesn't make much sense either.

The Royal Society's motto is literally "take nobody's word for it."


To be pedantic, their motto is literally Nullius in verba which translates as you describe.


> at some point the claims will be found out and the funding will dry up.

Someone hasn't been watching the cryptocurrency markets.

That's partly tongue in cheek. But there are countless examples of the market remaining irrational longer than one can stay solvent.

Witness the continued success of BTC and Ether, amid newer options that outperform them on every tech-related metric, often by many orders of magnitude. I conclude that marketing hype and the first mover advantage form the vast bulk of valuation in a novel tech that people don't understand.

This is not to take away from the author's point at all - I would hope that anyone who invests in quantum computing reads the criticism from an insider who can actually read the papers.

However, as irrational and harmful as it is, I don't expect BTC to drop to zero before the day quantum computing actually does follow through. Rationality really isn't our thing.


>> I conclude that marketing hype and the first mover advantage form the vast bulk of valuation in a novel tech that people don't understand.

Well said. Having worked in the blockchain space as a developer and founder since 2017, I've also come to the same conclusion. The formula for success is hype + first-mover advantage. Aside from that, it's all about social climbing and politics around those projects.

It's surprising how long the first-mover advantage lasts, and it's weird to see that even developers who should know better are getting pulled into learning poorly designed (or outdated) technologies. They're conflating the financial achievements of projects with their technological achievements.

I guess that's what happens when big investors are laser-focused on making as much money as possible instead of also trying to drive innovation forward.

IMO, the inability to separate the two is a major reason why we have such significant financial bubbles in the tech sector.


Crypto is a bad example because it's pretty pointless in general. You can have better, faster, more efficient, safer, easier-to-use tech, but at the end of the day you're only achieving the same thing: skipping over regulations that apply to traditional currencies but somehow don't apply to crypto. That's assuming good will, as one can argue that creating a distributed Ponzi scheme is another goal. It turns out that to achieve those goals you can get away with a simple idea and implementation like BTC.

It's really not a surprise it's not about tech when the goals and challenges were never technical.

On the other hand, in areas where it is about tech, we see the superior option winning over established players all the time: Google, WhatsApp, and AMD, to name three of many examples from various points in this millennium.


> it's pretty pointless in general

Tell that to Wikileaks, after Mastercard and VISA stopped letting people donate due to political pressure.

Tell that to people charged double digit percentages just for the privilege of spending hours transferring money across borders.

What about Canada freezing hundreds of accounts "linked" to civil protests (however much one may disagree with the protesters' cause)?

People complain that micro-transactions aren't cost-effective, but there are, right now, secure, decentralized, fee-less crypto techs - even quantum-resistant ones - that allow for the transfer of millionths of a cent.

With no disrespect, I feel that people who say crypto has no use-case are being let down by their imagination and research skills.

> skipping over regulations that apply to traditional currencies

There is actually at least one cryptocurrency working on becoming recognized as a legitimate currency; Nano.

> in areas where it is about tech we see superior one winning over established players all the time.

People could immediately see that Google gave better search results; that WhatsApp was instant and free; and I don't think AMD is as clearcut an example as you state.

Notice that in my comment I stated the condition that people can't understand the tech by themselves - such is the case with crypto. People can't understand the proofs; the difference between POW, POS, and Block Lattice; whether a coin is decentralized or not, etc, just by using it.

Arguing against myself a bit, I did think people would notice how slow and expensive their BTC transactions were; or how awkward setting an account up is, or how often claimed roadmap milestones were put back - but they haven't. I don't think I accounted for the sheer volume and reach of bag holders and maxis.

BTC uses the same amount of power as Argentina - it's perverse. The people claiming it's better in any way, on any metric outside adoption are just wrong. The fact that BTC holders and miners can claim otherwise, and people apparently believe them, is kinda my whole point.

It feels like there's a strong parallel with QC, or nuclear energy, or even cannabis. People get deluded by hype from vested interests if they have no way to accurately judge for themselves.


> "amid newer options that outperform them on every tech-related metric"

What outperforms Bitcoin in terms of decentralization, scalability, and resilience?



We changed the url from https://futurism.com/the-byte/oxford-physicist-unloads-quant..., which points to this. We wouldn't do that if the site was hardwalled, but given that there are workarounds posted in this thread, it's more important to have the original source.


For FT articles at least, I usually see links to pirated content here, rather than workarounds?


Archive.ph is not really pirated content, if you don't consider archiving publicly available information to be piracy.

The FT usually publishes its articles without a paywall for SEO reasons and to get traction on social media. That's why there is almost always an archived version of the article. That way the paywall is not being circumvented in any way.


This is definitely pirating.


I agree: Quantum computing = Quantum quatsch!


The article must have been written by someone who is not an expert in this field, or by an expert who is suppressing information in order to mislead with his conclusion.

I say this because:

1. The author says nothing about the breakthroughs in quantum error correction that are allowing IBM to promise a leap from 89 qubits today to 4,000 qubits in 2025 (still not enough on its own for a cryptographically relevant quantum computer (CRQC) running Shor's algorithm for exponential speedup in breaking e.g. RSA-2048, which some research suggests would take ~20M qubits, including those for quantum error correction).

2. He did not mention Grover's algorithm, which provides a quadratic speedup over the classical counterpart (in the time complexity of searching for a particular string in an unsorted list of N items). Even a quadratic speedup is considerable when N is large.

3. He did not mention the breakthrough by University of Chicago researchers showing that multiple quantum computers can be entangled over tuned optical fibers to act as a single quantum computer. This still doesn't mean we could get from 4,000 qubits to 20M in 2025 by networking 5,000 of the machines IBM has promised for that year, but it does provide a trajectory for networked quantum computing as a horizontal scaling strategy.

4. He did not mention the $100B allocated this year by the White House/Congress for CRQC research.

What is his motive in giving us such an incomplete story with such a skewed conclusion? Is he working for a hedge fund that is shorting some stock? Or is he just a layperson trying to sound intelligent by writing about a field in which he's not sufficiently informed?



