Yeah, most quantum science at the moment is largely fraudulent. It’s not just Microsoft. It’s being developed because it’s being taught in business schools as the next big thing, not because anybody has any way to use it.
Any of the “quantum computers” you see in the news are really just press releases about corporate simulators running on classical hardware, modeling how they think it might work if it did work, and far too slow to be used for anything.
Quantum science is not fraudulent, incredible leaps are being made with the immense influx of funding.
Quantum industry is a different beast entirely, with scientific rigour being corrupted by stock price management.
It’s an objective fact that quantum computers do exist now, but only at a very basic prototype level. Don’t trust anything a journalist says about them, but they are real, and they are built on technology we had no idea would ever be possible.
Well, I love being wrong! Are you able to show a documented quantum experiment that was carried out on a quantum computer (and not an emulator using a traditional architecture)?
How about a use case that isn’t simply for breaking encryption, benchmarking, or something deeply theoretical that they have no way to know how to actually program for or use in the real world?
I’m not requesting these proofs to be snarky, but simply because I’ve never seen anything else beyond what I listed.
When I see all the large corporations mentioning the processing power of these things, they’re simply counting how many times they can get an emulated qubit to flip, and then claiming grandiose things for investors. That’s pretty much it. To me, that’s fraudulent (or borderline) corporate BS.
Hell yes! I’d love to share some stuff.
One good example of a quantum computer is the Lukin group neutral atoms work. As the paper discusses, they managed to perform error correction procedures making 48 actual logical qubits and performing operations on them. Still not all that practically useful, but it exists, and is extremely impressive from a physics experiment viewpoint.
There are also plenty of meaningful reports on non-emulated machines from the corporate world. From the big players examples include the Willow chip from Google and Heron from IBM being actual real quantum devices doing actual (albeit basic) operations. Furthermore there are a plethora of smaller companies like OQC and Pasqal with real machines.
On applications, this review is both extensive and sober, outlining the known applications with speedups, costs and drawbacks. Among the most exciting are Fermi-Hubbard model dynamics (condensed matter stuff), which is predicted to have exponential speedup with relatively few resources. These all depend on a relatively narrow selection of tricks, though. Among interesting efforts to fundamentally expand what tricks are available is this work from the Babbush group.
Let me know if that’s not what you were looking for.
I appreciate the reply!
I made the attempt, but couldn’t parse that first link. I gathered that it was about error correction, due to the absolutely massive number of errors that crop up in QC, but I admit I can’t get much further with it, as the industry language in that paper is thick. Error reduction is good, but it still isn’t operating on any viable data, and there’s still a massive number of errors even post-correction. It’s more of a small refinement to an existing questionable system, which is okay, but doesn’t really do much unless I’m misunderstanding.
I’m skeptical of the Willow (and other) examples. We already have different types of chips for different kinds of operations, such as CPUs, GPUs, NPUs, etc. This is just one more kind of chip that will be found in computers of the future. Of course, these can sometimes be combined into a single chip too, but you get the idea.
The factorization of large integers is one operation that a quantum computer can, in principle, do efficiently (this is Shor’s algorithm). Since factoring is an essential part of public/private key cryptography, those encryption schemes have recently been upgraded with post-quantum algorithms that a quantum computer cannot so easily unravel.
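To make the factoring point concrete, here’s a toy Python sketch of Shor’s algorithm with the quantum part faked: the period-finding step (the only thing the quantum computer actually speeds up) is brute-forced classically, while the rest is the ordinary classical post-processing that turns a period into factors. Function names and structure here are my own illustration, not any real library’s API.

```python
from math import gcd

def find_period(a, n):
    # Brute-force the period r of a^x mod n, i.e. the smallest r with
    # a^r = 1 (mod n). THIS is the step a quantum computer makes fast;
    # classically it takes exponential time for large n.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a=2):
    # Classical post-processing: recover factors of n from the period.
    if gcd(a, n) != 1:
        return gcd(a, n), n // gcd(a, n)  # lucky guess, done already
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # odd period: retry with a different base a
    half = pow(a, r // 2, n)
    if half == n - 1:
        return None  # trivial square root: retry with a different a
    p, q = gcd(half - 1, n), gcd(half + 1, n)
    if p > 1 and q > 1 and p * q == n:
        return p, q
    return None

print(shor_classical(15))  # → (3, 5)
```

The point of the sketch: everything except `find_period` is cheap classical arithmetic, which is why “quantum factoring” really means “quantum period finding”.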
With quantum computing, a system of qubits can be set up in such a way that it’s like a machine that physically simulates the problem. It runs this experiment over and over again and measures the outcome, until one answer is the clear winner. For the right type of problem, and with enough qubits, this is unbelievably fast.
Problem is, this only works for problems with a verifiable answer (like cryptography); otherwise the system never knows when the computation is “complete”. It’s also worth noting that none of these organizations publish their benchmarking algorithms, so when they talk about speed, they aren’t exactly being forthright. I can write code that runs faster on an Apple IIe than on a modern x64 processor; that doesn’t mean the Apple IIe is faster. Then factor in how fast quantum systems decohere and it’s… not really useful, in power expenditure or financially, for much beyond a large corporation or government breaking encryption.
Use cases are generally problems with a very large number of factors that are not feasible to calculate with normal computers; think chemical/medicine simulation, logistics optimization, or public transport timetables.
So that’s the part that gets me stuck. When there is no clear answer, a QC has no way to check the result on its own (otherwise you wouldn’t need the QC, since they can only produce binary guesses of true/false outcomes on a massive scale). How can it decide that it is “correct” and that the task is completed?
Computations based on guesses of true / false can only be so accurate with no way to check the result in the moment.
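The resolution to the “how does it know it’s correct” question, at least for problems like factoring, is that checking a candidate answer classically is trivial even when finding one is hard. Here’s a toy Python sketch of that sample-then-verify loop; the “quantum device” is entirely fake (just a seeded random number generator that occasionally emits a true factor), so treat every name here as hypothetical illustration.

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the toy example is reproducible

def noisy_factoring_device(n, shots=200):
    # Hypothetical stand-in for a noisy quantum device: each "shot"
    # sometimes yields a true factor of n, but mostly yields garbage.
    true_factors = [d for d in range(2, n) if n % d == 0]
    return [random.choice(true_factors) if random.random() < 0.3
            else random.randrange(2, n) for _ in range(shots)]

def extract_answer(n, samples):
    # The classical check is one modulo operation per candidate.
    # Finding a factor is hard; verifying one is instant. That
    # asymmetry is what lets a probabilistic machine converge on a
    # trusted answer: keep only verified samples, take the most common.
    good = [s for s in samples if n % s == 0]
    return Counter(good).most_common(1)[0][0] if good else None

f = extract_answer(91, noisy_factoring_device(91))  # 91 = 7 * 13
print(f, 91 % f == 0)
```

So the machine never has to “decide” it’s correct: a classical computer filters the noisy shots, and only answers that pass the cheap check survive. For problems without such a cheap check, the objection above really does bite.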
So glad we deregulated the market so everything is a crypto scam now.
🌎🧑🚀🔫
I just saw on LinkedIn that in 12 months “quantum AI” is going to be where it’s at. Uh… really? Do I hear “crypto-quantum AI?”
QUANTUM AI? IN my blockchain? It’s more likely than you think!
deleted by creator
‘distributed compute’ using blockchain to farm out AI instances is a web3 thang.
I used a hybrid of near-shore telepresence and on-site scrum sessions to move fast and put the quantum metaverse on a content-addressable de-fi AI blockchain
Fascinating. Where do I sign up?
That sounds like something they say your washing detergent has to clean stains better.
Crypto-quantum AI+ MaXX?
It’s…not shocking exactly, but a little surprising and a lot disappointing that so much of finance is now targeted at “let’s make a thing that we read about in sci fi novels we read as kids.”
Focusing on STEM and not the humanities means we have a bunch of engineers who think “book thing cool” and have zero understanding of how allegory works.
Elno has just reinforced that if you lie enough to become a billionaire, the market will reward you for YEARS. Possibly forever, if you don’t let them find out you’re a power-hungry maniac who wants to ruin the whole country.
Most competent engineers don’t think that. They know and understand the limitations of what they’re working on. They just do it because the finance bros pay.