Differentiable programming of isometric tensor networks

Geng, Chenhua and Hu, Hong-Ye and Zou, Yijian (2022) Differentiable programming of isometric tensor networks. Machine Learning: Science and Technology, 3 (1). 015020. ISSN 2632-2153

Geng_2022_Mach._Learn.__Sci._Technol._3_015020.pdf - Published Version

Abstract

Differentiable programming is a new programming paradigm that enables large-scale optimization through automatic calculation of gradients, also known as auto-differentiation. The concept emerged from deep learning and has since been generalized to tensor network optimization. Here, we extend differentiable programming to tensor networks with isometric constraints, with applications to the multiscale entanglement renormalization ansatz (MERA) and tensor network renormalization (TNR). By introducing several gradient-based optimization methods for isometric tensor networks and comparing them with the Evenbly–Vidal method, we show that auto-differentiation performs better in both stability and accuracy. We numerically test our methods on the 1D critical quantum Ising spin chain and the 2D classical Ising model. We calculate the ground-state energy of the 1D quantum model, the internal energy of the classical model, and the scaling dimensions of scaling operators, and find that they all agree well with theory.
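To illustrate the general idea of gradient-based optimization under an isometric constraint, the following is a minimal, hypothetical sketch (not the authors' implementation) in JAX. It auto-differentiates a toy cost function of an isometry W (satisfying W W† = 1) and retracts each gradient step back onto the isometric manifold via an SVD-based polar projection; the bond dimensions, cost function, and learning rate are illustrative placeholders only.

```python
import jax
import jax.numpy as jnp

def isometrize(M):
    # Project an arbitrary matrix onto the nearest isometry via SVD (polar retraction).
    U, _, Vh = jnp.linalg.svd(M, full_matrices=False)
    return U @ Vh

def cost(W, H):
    # Toy stand-in for a tensor-network cost, e.g. an energy-like expectation value.
    return jnp.real(jnp.trace(W @ H @ W.conj().T))

chi_in, chi_out = 8, 4                       # hypothetical bond dimensions
H = jax.random.normal(jax.random.PRNGKey(0), (chi_in, chi_in))
H = (H + H.T) / 2                            # symmetric "Hamiltonian"
W = isometrize(jax.random.normal(jax.random.PRNGKey(1), (chi_out, chi_in)))

grad_cost = jax.grad(cost, argnums=0)        # auto-differentiation of the cost
lr = 0.05
for step in range(200):
    g = grad_cost(W, H)
    W = isometrize(W - lr * g)               # gradient step, then retract to the isometric manifold
```

In an actual MERA or TNR calculation, the cost would be the contracted network (e.g. the ground-state energy or free energy), but the pattern of auto-differentiated gradient plus isometric retraction is the same.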

Item Type: Article
Subjects: STM Open Library > Multidisciplinary
Depositing User: Unnamed user with email support@stmopenlibrary.com
Date Deposited: 14 Jul 2023 11:05
Last Modified: 11 May 2024 09:41
URI: http://ebooks.netkumar1.in/id/eprint/1884
