
Researchers have now used neural networks to pinpoint qubits in large-scale quantum computers

So far, quantum computers have remained in an esoteric realm with limited applications, but researchers are working to scale them up. One way of implementing a large-scale, fault-tolerant quantum computer architecture with silicon qubits is to position individual phosphorus atoms on a 2D grid. Within this grid, nanoelectronic wires control the one-qubit and two-qubit quantum logic gates that perform computations.

However, this design relies heavily on the precise placement of phosphorus atoms on the silicon grid. Since quantum computing leverages entanglement between qubits to achieve its computational power, an uncertainty in the location of a qubit atom of the order of a single atomic lattice site can change the qubit-qubit interactions by orders of magnitude. This leads to errors in two-qubit gate operations and inaccurate results for a given computation, and the effect becomes even more pronounced in large-scale quantum computing architectures.
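To get a rough feel for why a single-lattice-site error matters so much, the toy Python sketch below (not taken from the paper) models the coupling between two donor atoms as an exponentially decaying envelope modulated by valley-interference oscillations, a commonly used simplification for phosphorus donors in silicon; every numerical value in it is an assumption chosen purely for illustration.

```python
import math

# Toy model (illustrative only, not from the paper): the coupling J between
# two phosphorus donors in silicon is approximated here as an exponentially
# decaying envelope modulated by valley-interference oscillations, which makes
# J very sensitive to atomic-scale placement. All values below are assumed.
A_SI = 0.543                      # silicon lattice constant in nm (assumed)
BOHR = 1.8                        # effective donor Bohr radius in nm (assumed)
K0 = 0.85 * 2 * math.pi / A_SI    # approximate valley wavevector (assumed)

def coupling(d_nm: float) -> float:
    """Relative donor-donor coupling at separation d_nm (arbitrary units)."""
    envelope = math.exp(-2 * d_nm / BOHR)
    oscillation = math.cos(K0 * d_nm) ** 2
    return envelope * oscillation

d_target = 10 * A_SI              # intended separation: 10 lattice constants
d_shifted = d_target + A_SI / 4   # placement error of roughly one lattice site

j_target, j_shifted = coupling(d_target), coupling(d_shifted)
print(f"J(target)  = {j_target:.3e}")
print(f"J(shifted) = {j_shifted:.3e}")
print(f"ratio      = {j_target / j_shifted:.1f}x")  # an order of magnitude or more
```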

To help deal with this problem, in 2016, researchers at the University of Melbourne used computed scanning tunneling microscope (STM) images of phosphorus atom wave functions to pinpoint their spatial locations in silicon. This allowed qubit atom locations to be identified with single-lattice-site precision. The next challenge was to scale this method of spatial pinpointing to large-scale, fault-tolerant quantum computers.

To develop such a framework, the researchers have now leveraged the power of deep learning, training a convolutional neural network (CNN) on a dataset of 100,000 computed STM images. The trained model was then tested on 17,600 test images that included the blurring and asymmetry noise typically present in realistic measurement environments.
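The paper's exact network architecture and image format are not reproduced in this article, but a minimal PyTorch sketch of the general approach – classifying small simulated STM-style images into one of a set of candidate lattice-site configurations – might look something like the following, where the image size, number of classes, and layer widths are all placeholder assumptions.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumptions: 32x32 grayscale STM-style images, 25 candidate
# lattice-site classes, arbitrary layer widths). The model in the paper may
# differ substantially.
NUM_CLASSES = 25  # e.g. a 5x5 grid of candidate donor positions (assumed)

class StmCnn(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                            # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                            # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = StmCnn()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimisation step over a batch of (image, label) pairs."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test with random tensors standing in for computed STM images;
# in practice the training data would be simulated images with blurring
# and asymmetry noise added, as described above.
dummy_images = torch.randn(8, 1, 32, 32)
dummy_labels = torch.randint(0, NUM_CLASSES, (8,))
print(f"loss: {train_step(dummy_images, dummy_labels):.3f}")
```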

The CNN classified the test images with an accuracy of above 98 per cent, confirming that this machine learning-based technique can process qubit measurement data with high throughput, high precision, and minimal human interaction.

Furthermore, the proposed technique also has the potential to scale up to qubits consisting of more than one phosphorus atom. In such a setting, the number of possible image configurations increases exponentially. The team stated that a machine learning-based framework is well suited to this situation, as it can accommodate any number of possible configurations.
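As a rough back-of-the-envelope illustration (the counting model here is an assumption, not taken from the paper), if a qubit is built from k donor atoms placed somewhere within an n-site patch of the lattice, the number of distinct placements a classifier would have to distinguish grows combinatorially with both k and n:

```python
from math import comb

# Rough illustration (assumed counting model): number of ways to place k
# phosphorus donors within an n-site lattice patch.
for k in (1, 2, 3):
    counts = [comb(n, k) for n in (9, 16, 25, 36)]
    print(f"{k} donor(s): {counts}")
# 1 donor(s): [9, 16, 25, 36]
# 2 donor(s): [36, 120, 300, 630]
# 3 donor(s): [84, 560, 2300, 7140]
```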

This work shows how machine learning techniques such as the one developed here could play a crucial role in this aspect of realising a full-scale, fault-tolerant universal quantum computer – the ultimate goal of the global research effort.

If you are interested in finding out more, you can read the paper published in Nature.
