Determination of the semion code threshold using neural decoders
Publication date
2020
Publisher
American Physical Society
Abstract
We compute the error threshold for the semion code, the companion of the Kitaev toric code with the same gauge symmetry group Z_2. The application of statistical mechanical mapping methods is highly discouraged for the semion code, since the code is non-Pauli and non-Calderbank-Shor-Steane (CSS). Thus, we use machine learning methods, taking advantage of the near-optimal performance of some neural network decoders: multilayer perceptrons and convolutional neural networks (CNNs). We find the values p_eff = 9.5% for uncorrelated bit-flip and phase-flip noise, and p_eff = 10.5% for depolarizing noise. We contrast these values with a similar analysis of the Kitaev toric code on a hexagonal lattice using the same methods. For the convolutional neural networks, we use the ResNet architecture, which allows us to implement very deep networks and results in better performance and scalability than the multilayer perceptron approach. We analyze and compare both approaches in detail and provide a clear argument favoring the CNN as the numerical method best suited for the semion code.
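To illustrate the kind of ResNet-style syndrome decoder described in the abstract, below is a minimal sketch in PyTorch. It is not the authors' implementation: the lattice size, number of residual blocks, channel count, the square arrangement of two syndrome channels, and the 4-way output over logical (homology) classes are all illustrative assumptions; a decoder for the semion code on a hexagonal lattice would require a different syndrome embedding.

# Minimal sketch of a ResNet-style neural syndrome decoder (PyTorch).
# Hypothetical choices: square L x L layout, 2 syndrome channels,
# 4 output classes corresponding to logical (homology) sectors.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection (basic ResNet block)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)  # skip connection keeps deep networks trainable

class SyndromeDecoder(nn.Module):
    """CNN mapping a measured syndrome pattern to a logical-error class."""
    def __init__(self, lattice_size=8, channels=32, num_blocks=6, num_classes=4):
        super().__init__()
        self.stem = nn.Conv2d(2, channels, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(*[ResidualBlock(channels) for _ in range(num_blocks)])
        self.head = nn.Linear(channels * lattice_size * lattice_size, num_classes)

    def forward(self, syndromes):
        # syndromes: (batch, 2, L, L) tensor of 0/1 stabilizer outcomes
        x = torch.relu(self.stem(syndromes))
        x = self.blocks(x)
        return self.head(x.flatten(start_dim=1))  # logits over logical classes

# Usage with placeholder data: random syndromes for a batch of 16 samples.
model = SyndromeDecoder(lattice_size=8)
fake_syndromes = torch.randint(0, 2, (16, 2, 8, 8)).float()
logits = model(fake_syndromes)           # shape (16, 4)
predicted_class = logits.argmax(dim=1)   # most likely logical sector

In practice such a decoder is trained by supervised learning on simulated noise: error configurations are sampled at a given physical error rate, their syndromes and resulting logical classes are recorded, and the logical failure rate of the trained network as a function of the error rate is used to locate the threshold.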
Description
©2020 American Physical Society.
We thank G. Dauphinais for useful discussions at the early stage of this research. The authors thankfully acknowledge the resources from the supercomputer "Cierzo," HPC infrastructure of the Centro de Supercomputación de Aragón (CESAR), and the technical expertise and assistance provided by BIFI (Universidad de Zaragoza). S.V. especially thanks Héctor Villarrubia Rojo for computational resources and technical assistance. We acknowledge financial support from the Spanish MINECO grants MINECO/FEDER Projects No. FIS2017-91460-EXP and No. PGC2018-099169-B-I00FIS2018 and from CAM/FEDER Project No. S2018/TCS-4342 (QUITEMAD-CM). The research of M.A.M.-D. has been partially supported by the U.S. Army Research Office through Grant No. W911NF-14-1-0103. S.V. acknowledges the support of an FPU MECD grant.