Characteristics of quantum computers

> The power of a quantum computer is measured in qubits, the basic unit of measurement in a quantum computer. (Source)

I facepalm every time I read a phrase like that. It has done me no good: my eyesight is starting to go, and soon I'll have to see Meklon.

I think it's time to systematize the basic parameters of a quantum computer. There are several:

  1. Number of qubits
  2. Coherence time (decoherence time)
  3. Error rate
  4. Processor architecture
  5. Price, availability, operating conditions, depreciation period, programming tools, etc.

Number of qubits


This one is obvious: the more, the better. In practice, qubits cost money, and ideally you buy exactly as many as the task requires. A developer of exclusive slot machines needs one qubit per machine (to generate randomness). Brute-forcing RSA-2048 takes at least 2048 qubits.

The most famous quantum algorithms are named after Grover and Shor. Grover's algorithm speeds up brute-forcing hashes. To bring down Bitcoin, you need computers with at least 256 qubits on board (you can fiddle with Bitcoin's difficulty parameters, but let's settle on that round number). Shor's algorithm factors numbers: factoring an n-bit number takes at least n qubits.
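To get a feel for the numbers, here is a back-of-the-envelope sketch of why Grover's quadratic speedup still leaves an astronomical amount of work for a 256-bit hash. The figures ignore constant factors and error correction entirely:

```python
# Rough query counts for searching a preimage space of 2^n candidates.
# Classical exhaustive search needs ~2^n tries; Grover needs ~sqrt(2^n)
# oracle calls. Order-of-magnitude sketch only.
def classical_queries(n_bits: int) -> int:
    return 2 ** n_bits            # try every candidate

def grover_queries(n_bits: int) -> int:
    return 2 ** (n_bits // 2)     # ~sqrt(2^n) oracle calls

print(classical_queries(256))     # ~1.2e77 tries
print(grover_queries(256))        # ~3.4e38 calls: still astronomical
```

A quadratic speedup halves the exponent, which is why 256-bit hashes translate into a 256-qubit (not 128-qubit) requirement but "only" ~2^128 work.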

Current maximum: 50 qubits (or is it already 72?). And 50 qubits really is a limit: the limit of quantum computer simulation. In theory, we can simulate any number of qubits on classical hardware. In practice, each additional qubit doubles the resources the classical simulator needs. Add to this the rumors about qubit counts doubling every year, and ask yourself: how do you debug algorithms for 256/512/1024/2048 qubits? There is no simulator for them, and you cannot set a breakpoint on a quantum processor.
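The doubling is easy to see in memory terms: a full state-vector simulator stores 2^n complex amplitudes. The sketch below assumes double-precision complex numbers (16 bytes per amplitude); real simulators have tricks, but the exponential wall stands:

```python
# Memory needed to hold the full state vector of n qubits:
# 2^n complex amplitudes at 16 bytes each (two 64-bit floats).
# Every extra qubit doubles the requirement.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(n, "qubits:", statevector_bytes(n) / 2**40, "TiB")
# 30 qubits fit in a beefy workstation; 50 qubits need ~16 PiB.
```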

Coherence time (decoherence time)


Coherence here is not quite the coherence you may know from optics. I prefer to compare it to DRAM refresh. A RAM stick holds billions of cells, each storing a charge: zero or one. That charge has a very interesting property: it leaks away. An initially "one" cell becomes 0.99, then 0.98, and so on; accordingly, the zeros accumulate 0.01, 0.02, 0.03... The charge has to be refreshed ("regenerated"): anything below one half is reset to zero, everything else is restored to one.
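The refresh analogy can be written down as a toy model; the leak rate and tick count below are made up purely for illustration:

```python
# Toy model of DRAM refresh: charges leak a little each tick;
# refresh snaps anything >= 0.5 back to 1.0 and the rest to 0.0.
def leak(cells, rate=0.01):
    return [max(c - rate, 0.0) for c in cells]

def refresh(cells):
    return [1.0 if c >= 0.5 else 0.0 for c in cells]

cells = [1.0, 0.0, 1.0]
for _ in range(30):           # 30 ticks of leakage: 1.0 drifts to 0.70
    cells = leak(cells)
print(refresh(cells))         # data recovered: [1.0, 0.0, 1.0]
```

Refresh works because a classical bit can be measured and rewritten at will. That is exactly the step a quantum processor is denied, which is the point of the next paragraph.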

Quantum processors cannot be refreshed. Accordingly, you get a single cycle for the entire computation, lasting until the first qubit "leaks". The time until the first "leak" is called the decoherence time; coherence is the state in which no qubit has leaked yet. More grown-up explanations can be found elsewhere.

Decoherence is related to the number of qubits: the more qubits, the harder it is to maintain coherence. On the other hand, with a large number of qubits, some of them can be used to correct the errors decoherence causes. It follows that the raw qubit count decides nothing by itself: you can double the number of qubits and spend 90% of them on fighting decoherence.

This is roughly where the concept of a logical qubit comes in. Roughly speaking, if you have a 100-qubit processor but 40 of those qubits are devoted to fighting decoherence, you have 60 logical qubits: the ones your algorithm actually runs on. For now the concept of logical qubits is rather theoretical; I personally have not heard of practical implementations.

Errors and Corrections


Another scourge of quantum processors. Invert a qubit, and with 2% probability the operation fails. Entangle two qubits, and the error probability reaches 8%. Now take a 256-bit number, hash it with SHA-256, count the gate operations involved, and estimate the probability of performing ALL of them without a single error.
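Assuming independent gate errors, the probability of a fully error-free run is (1 - p)^n. A sketch with illustrative gate counts (these are not real SHA-256 circuit tallies):

```python
# Probability that every gate in a circuit succeeds, assuming
# independent errors with per-gate failure probability p_error.
def success_probability(n_ops: int, p_error: float) -> float:
    return (1 - p_error) ** n_ops

# 100 single-qubit gates at 2% error: already only ~13% of runs survive.
print(success_probability(100, 0.02))
# 1000 two-qubit gates at 8% error: effectively zero.
print(success_probability(1000, 0.08))
```

A real SHA-256 circuit needs far more than a thousand gates, which is why raw (uncorrected) hardware cannot run it end to end.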

Mathematicians offer a solution: error correction. The algorithms exist. But implementing a single entanglement of two logical qubits requires about 100,000 physical qubits. The collapse of Bitcoin is not coming soon.

Processor Architecture


Strictly speaking, there are no quantum computers, only quantum processors. What use is RAM when the working time is limited to milliseconds? I program in Q#, but that is a high-level language: allocate yourself 15 qubits and do whatever you like with them. Want to entangle the first qubit with the tenth? Done. Want to entangle the first six? Also done.

There is no such freedom on a real processor. Ask it to entangle the first qubit with the fifteenth, and the compiler generates 26 extra operations. If you're lucky. If you're unlucky, it generates a hundred. The point is that a qubit can only be entangled with its neighbors, and I have not seen more than 6 neighbors per qubit. Compilers that optimize quantum programs do exist in principle, but so far they are rather theoretical.
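Where those extra operations come from can be sketched under a simple assumption: a linear chain where only adjacent qubits interact, with one SWAP decomposing into 3 CNOTs. Actual topologies and gate costs vary by processor; the numbers below are illustrative:

```python
# On a linear chain where only neighboring qubits can interact,
# entangling qubits i and j means routing them together with SWAPs.
# Assumption: one SWAP = 3 CNOTs (a common decomposition).
def routing_overhead(i: int, j: int) -> int:
    distance = abs(i - j)
    swaps = max(distance - 1, 0)   # hops needed to make them adjacent
    return swaps * 3               # extra CNOTs before the real CNOT

print(routing_overhead(0, 1))      # neighbors: no overhead
print(routing_overhead(0, 14))     # 13 SWAPs -> 39 extra CNOTs
```

And every one of those extra gates burns coherence time and adds its own error probability.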

Each processor has its own instruction set, and the connections between qubits differ. In an ideal world, we would have arbitrary Rx, Ry, Rz and their combinations, plus free entanglement across a dozen qubits, plus Swap: look at the operators in Quirk. In real life, we have a handful of qubit pairs, where the entangling gate CNOT(q[0], q[1]) costs one operation while CNOT(q[1], q[0]) already costs 7. And meanwhile the coherence melts away...

Price, availability, operating conditions, depreciation period, programming tools...


Prices are not advertised, availability to the ordinary citizen is near zero, depreciation periods are not tracked in practice, and programming tools are only being born. The documentation lives on arxiv.org.

So, what information should we demand from the experts when a new quantum computer is released?


Besides the list above, I like the suggestions from PerlPower and Alter2:

> It would be nice if every article about a new quantum computer started with two characteristics: the number of simultaneously entangled qubits and the qubit hold time.
>
> Or better yet, with the runtime of a simple benchmark, for example, finding the prime factors of the number 91.
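For scale: that benchmark task is trivial for a classical machine. A trial-division sketch finds the factors of 91 in microseconds:

```python
# Classical baseline for the proposed benchmark: factor a small
# number by trial division.
def factor(n: int) -> list[int]:
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)     # leftover prime
    return factors

print(factor(91))   # -> [7, 13]
```

The point of the benchmark is not the answer but the comparison: a quantum processor honestly running Shor's algorithm on 91 would say far more about its real capabilities than a qubit count.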

Source text: Characteristics of quantum computers