DRAFT: Quantum Illusion


In this post, I am going to debunk quantum computing and then explain why "quantum computing" is an indicator of the dilapidated state of science today.

 

I must first state plainly that logic must be used to establish the merit of QCs (quantum computers).  We must base this argument on logic because logic is the only established mechanism classical computers currently use, and therefore our only basis for comparison.  Logic is also the ONLY mechanism we have for understanding anything.  Without agreement on this assumption, even science, and everything we understand today through science, would be rendered an "opinion". 

Intel Corporation does not manufacture "opinion" machines or "guessing" machines.  It makes logical machines.  In fact, these logical devices are so accurate that if they were wrong even one time in a billion calculations, people would not buy them.

 

I actually can't believe I have to establish up front that logic must be used in this argument.  In science, this was always assumed.  Perhaps this is an early indication of the state of science, where metascience is now "alternatively" more appealing.  Very strange.

 

Enter:  the new "Guessing Machines"

 

I'm going to break down the claims made by QC scientists.   If you take the time to listen to a QC lecture or read an article about quantum computing, you will see these claims being made.

 

1) 

 

The scaling problem itself is a result of quantum decoherence, or rather of the need to eliminate it. The problem is that, as good ol' Schrödinger was only too keen to point out, quantum systems need to be isolated from the rest of the world in order to work. Interactions with the external world cause the system to decohere, collapsing down into a binary state, just like a normal computer.
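To make the decoherence point concrete, here is a toy numerical sketch of my own (not anything from the article): a single qubit in an equal superposition, written as a density matrix, whose off-diagonal "coherence" terms decay as the qubit interacts with its surroundings. The decay rate `gamma` is an arbitrary number chosen for the demo.

```python
import numpy as np

# Density matrix for a qubit in the equal superposition (|0> + |1>)/sqrt(2).
# The off-diagonal entries (the "coherences") are what make it quantum.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]])

gamma = 0.5  # arbitrary decoherence rate, illustration only

for step in range(6):
    # Pure dephasing: interaction with the environment shrinks the
    # off-diagonal terms while leaving the populations (diagonal) alone.
    decay = np.exp(-gamma * step)
    rho_t = np.array([[0.5,         0.5 * decay],
                      [0.5 * decay, 0.5        ]])
    print(f"t={step}: coherence = {rho_t[0, 1]:.3f}")

# By the last step the matrix is effectively diag(0.5, 0.5): a plain
# classical coin flip, not a superposition, i.e. the "collapse to a
# binary state" described above.
```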

 

First, there’s the question of knowing whether it’s even working in the first place. A widely known tenet of quantum mechanics is that merely observing a phenomenon changes its outcome. So, watch a quantum particle, or a qubit, or anything quantum for that matter, and you change its behavior. That means it’s actually very difficult to tell whether a quantum computer is behaving in the way we’d expect or need it to.
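The measurement-disturbance point can be illustrated the same way (again, a toy sketch of my own, not a description of how any real machine is read out): measuring a qubit in superposition forces it into |0> or |1> with probabilities given by the squared amplitudes, and afterwards the superposition is gone.

```python
import numpy as np

rng = np.random.default_rng(0)

# State vector for (|0> + |1>)/sqrt(2): amplitudes for |0> and |1>.
state = np.array([1, 1]) / np.sqrt(2)

def measure(state):
    """Simulate a projective measurement in the computational basis."""
    probs = np.abs(state) ** 2           # Born rule
    outcome = rng.choice([0, 1], p=probs)
    collapsed = np.zeros(2)
    collapsed[outcome] = 1.0             # the superposition is destroyed
    return outcome, collapsed

outcome, state_after = measure(state)
print("outcome:", outcome)              # 0 or 1, at random
print("state after:", state_after)      # no longer a superposition
```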

 

In fact, the currently available so-called quantum computers aren’t actually verified to be working the way they're supposed to. They’re simply built on the right theory, a bit of finger-crossing, and judged by their output.

 

A quantum computer could, in theory, be used to calculate solutions in days, maybe even hours, that would take a normal computer thousands of years to produce. While some of the answers it spits out are verifiable (a complex cryptographic key generated by a quantum computer, for instance, can be tested by using it to encrypt and decrypt a message), there are others that can’t be tested. Simply put, quantum computers are often used to solve the types of problems for which we have no other confirmation mechanism. We often can't double-check their work.
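The verifiable case is worth pinning down. A standard example (mine, not the article's) is factoring, the problem underlying the cryptographic keys mentioned above: finding the prime factors of a large number is expensive, but checking a claimed answer is just a multiplication.

```python
def verify_factoring(n, claimed_factors):
    """Cheap check of an expensive-to-find answer: do the claimed
    factors divide n and multiply back to exactly n?"""
    product = 1
    for f in claimed_factors:
        if f <= 1 or n % f != 0:
            return False
        product *= f
    return product == n

# However the factors were produced, the check itself is trivial:
print(verify_factoring(15, [3, 5]))   # True
print(verify_factoring(15, [2, 7]))   # False
```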

There may, however, be a way around this. A team of scientists from the University of Vienna has developed a technique called “blind quantum computing” that they believe could help. The idea is quite straightforward: it involves mathematical traps, essentially intermediate steps in a calculation, which humans (or at least other computers) can definitely work out in advance. If those answers come out incorrect, then the overall answer must contain an error, too. Instead of checking the final solution, the scheme keeps an eye out for problem spots along the way.
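A rough classical analogy for those traps (my own sketch of the general idea, not the Vienna group's actual protocol; the function names are hypothetical): mix a handful of problems whose answers you already know into the real workload, and if the untrusted solver gets any of the known ones wrong, reject the whole batch.

```python
import random

def run_with_traps(untrusted_solver, real_inputs, trap_cases, rng=random.Random(0)):
    """Interleave known-answer 'trap' problems with the real workload.

    trap_cases: list of (input, expected_output) pairs precomputed by the
    verifier. If the solver misses any trap, the batch is rejected rather
    than trusting the unverifiable answers.
    """
    jobs = [("real", x, None) for x in real_inputs] + \
           [("trap", x, expected) for x, expected in trap_cases]
    rng.shuffle(jobs)  # the solver cannot tell traps from real work

    results = {}
    for kind, x, expected in jobs:
        y = untrusted_solver(x)
        if kind == "trap" and y != expected:
            raise RuntimeError(f"trap failed on input {x}: got {y}, expected {expected}")
        if kind == "real":
            results[x] = y
    return results

# Toy usage: pretend the "hard" task is squaring, so traps are easy to precompute.
solver = lambda x: x * x
print(run_with_traps(solver, real_inputs=[7, 11], trap_cases=[(3, 9), (4, 16)]))
```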

 

 

There is a very good reason you do not understand this . . . QCs, as she described them, are not functional. Adding more ambiguity into logic does not equate to solving harder math problems.

Example of a typical logical computer: 1 + 2 = 3

Example of QC computing: A + B = C

Which one of these computers provides an answer that is useful? The other huge drawback with QCs (as described) is that QC theory says that A + B might equal C. Sometimes it equals C. Sometimes it doesn't. So, not only do we add ambiguity with QCs, we also add more error. It is ridiculous that these claims made by QC "scientists" are going unchallenged.
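To spell out the contrast drawn above in code (a toy illustration with made-up numbers, not a model of any real device): a classical adder returns the same answer every time, while a probabilistic "adder" only returns the right answer some fraction of the time.

```python
import random

rng = random.Random(42)

def classical_add(a, b):
    """Deterministic: 1 + 2 is 3, every single time."""
    return a + b

def noisy_add(a, b, p_correct=0.8):
    """Probabilistic caricature: correct only with probability p_correct
    (0.8 is an arbitrary number chosen for the demo)."""
    return a + b if rng.random() < p_correct else a + b + rng.choice([-1, 1])

print([classical_add(1, 2) for _ in range(5)])  # [3, 3, 3, 3, 3]
print([noisy_add(1, 2) for _ in range(5)])      # mostly 3, sometimes not
```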
