SweepContractor.jl

Overview

A Julia package for the contraction of tensor networks using the sweep-line-based contraction algorithm laid out in the paper General tensor network decoding of 2D Pauli codes. This algorithm is primarily designed for two-dimensional tensor networks but contains graph manipulation tools that allow it to function for generic tensor networks.

[Animation of the sweep-line contraction algorithm]

Below I have provided some examples of SweepContractor.jl at work. Scripts with working versions of each of these examples are also included in the package. For more detailed documentation, consult the help pages by typing ? in the Julia REPL.

Feel free to contact me with any comments, questions, or suggestions at [email protected]. If you use SweepContractor.jl for research, please cite arXiv:2101.04125 and/or doi:10.5281/zenodo.5566841.

Example 1: ABCD

Consider the following four-tensor network, taken from the tensor network review Hand-waving and Interpretive Dance:

[Diagram of the ABCD tensor network]

where each tensor is defined entrywise (matching the code below, with all indices running from 0 to 2) by

A_ij = i^2 - 2j,   B_ijk = -3^i*j + k,   C_ij = j,   D_ijk = i*j*k.

First we need to install SweepContractor.jl, which we do by running

import Pkg
Pkg.add("SweepContractor")

Now that it's installed we can use the package by running

using SweepContractor

Next we need to define our network. We do this by initialising a LabelledTensorNetwork, which allows us to have a tensor network with elements labelled by an arbitrary type, in our case Char.

LTN = LabelledTensorNetwork{Char}()

Next, we populate this with our four tensors, which are each specified by giving a list of neighbouring tensors, an array consisting of the entries, and a two-dimensional location.

LTN['A'] = Tensor(['D','B'], [i^2-2j for i=0:2, j=0:2], 0, 1)
LTN['B'] = Tensor(['A','D','C'], [-3^i*j+k for i=0:2, j=0:2, k=0:2], 0, 0)
LTN['C'] = Tensor(['B','D'], [j for i=0:2, j=0:2], 1, 0)
LTN['D'] = Tensor(['A','B','C'], [i*j*k for i=0:2, j=0:2, k=0:2], 1, 1)

Finally, we want to contract this network. To do this we need to specify a target bond dimension and a maximum bond dimension; in our case, we will use 2 and 4.

value = sweep_contract(LTN,2,4)

To avoid underflows or overflows on large networks, sweep_contract does not simply return a float; it returns a tuple (f::Float64, i::Int64), which represents the value f*2^i. In this case it returns (1.0546875, 10). By running ldexp(value...) we can see that this corresponds to the exact value of the network, 1080.
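For example, we can unpack the returned pair ourselves with Julia's built-in ldexp; this is a quick check using only Base functions, nothing package-specific:

f, i = value     # here (1.0546875, 10)
ldexp(f, i)      # 1.0546875 * 2^10 = 1080.0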

Note there are two speedups that can be made to this code. Firstly, sweep_contract copies the input tensor network; the variant sweep_contract! is instead allowed to modify its input, skipping this copy step. Secondly, sweep_contract is designed to work on arbitrary tensor networks, and starts by flattening the network down into two dimensions. If our network is already well-structured, we can run the contraction in fast mode, which skips these steps.

value = sweep_contract!(LTN,2,4; fast=true)

Example 2: 2d grid (open)

Next, we move on to the sort of network this code was primarily designed for: a two-dimensional network. Here we consider a square grid network of linear size L, with each index of dimension d. For convenience, we can once again use a LabelledTensorNetwork, with labels now corresponding to coordinates in the grid. To construct such a network with Gaussian random entries we can use the code below.
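The construction assumes the scalars L and d (and, for the contraction afterwards, the bond dimensions χ and τ) have already been defined; the values here are only an illustrative choice, not prescribed by the package:

L = 8            # linear size of the grid
d = 2            # dimension of each index
χ, τ = 16, 32    # target and maximum bond dimensions for the sweep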

LTN = LabelledTensorNetwork{Tuple{Int,Int}}();
for i=1:L, j=1:L
    adj=Tuple{Int,Int}[];
    i>1 && push!(adj,(i-1,j))
    j>1 && push!(adj,(i,j-1))
    i<L && push!(adj,(i+1,j))
    j<L && push!(adj,(i,j+1))
    LTN[i,j] = Tensor(adj, randn(d*ones(Int,length(adj))...), i, j)
end

Note that the conditional push! statements impose open boundary conditions. Once again we can now contract this network by running the sweep contractor (in fast mode), for some choice of bond dimensions χ and τ:

value = sweep_contract!(LTN,χ,τ; fast=true)

Example 3: 2d grid (periodic)

But what about contracting a 2d grid with periodic boundary conditions? Such a network contains a small number of long-range bonds. Thankfully, however, SweepContractor.jl can run on such graphs by first planarising them.

We might start by taking the above code and directly changing the boundary conditions, but this would result in the boundary edges overlapping other edges in the network (e.g. the edge from (1,1) to (2,1) would overlap the edge from (1,1) to (L,1)), which the contractor cannot deal with. As a crude workaround we simply shift the position of each tensor by a small random amount:

LTN = LabelledTensorNetwork{Tuple{Int,Int}}();
for i=1:L, j=1:L
    adj=[
        (mod1(i-1,L),mod1(j,L)),
        (mod1(i+1,L),mod1(j,L)),
        (mod1(i,L),mod1(j-1,L)),
        (mod1(i,L),mod1(j+1,L))
    ]
    LTN[i,j] = Tensor(adj, randn(d,d,d,d), i+0.1*rand(), j+0.1*rand())
end

Here the mod1 function imposes our periodic boundary conditions, and rand() slightly shifts each tensor. Once again we can now run the sweep contraction, but we cannot use fast mode, as the network is no longer planar:

value = sweep_contract!(LTN,χ,τ)
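As a sanity check of the wrap-around indexing: mod1 maps onto the range 1:L rather than 0:L-1, which is exactly the neighbour wrapping we want. This is plain Julia, independent of the package (L here is just an example period):

L = 5
mod1(0, L)      # == 5: stepping left from site 1 wraps to site L
mod1(L + 1, L)  # == 1: stepping right from site L wraps to site 1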

Example 4: 3d lattice

If we can impose periodic boundary conditions, can we go further away from 2D? How about 3D? We sure can! For this we can just add another dimension to the above construction for a 2d grid:

LTN = LabelledTensorNetwork{Tuple{Int,Int,Int}}();
for i=1:L, j=1:L, k=1:L
    adj=Tuple{Int,Int,Int}[];
    i>1 && push!(adj,(i-1,j,k))
    i<L && push!(adj,(i+1,j,k))
    j>1 && push!(adj,(i,j-1,k))
    j<L && push!(adj,(i,j+1,k))
    k>1 && push!(adj,(i,j,k-1))
    k<L && push!(adj,(i,j,k+1))
    LTN[i,j,k] = Tensor(
        adj,
        randn(d*ones(Int,length(adj))...),
        i+0.01*randn(),
        j+0.01*randn()
    )
end

value = sweep_contract!(LTN,χ,τ)

Example 5: Complete network

So how far from two-dimensional can we go? The further we stray from two dimensions, the more inefficient the contraction becomes, but for small examples arbitrary connectivity is permissible. The extreme case is a completely connected network of n tensors:

TN=TensorNetwork(undef,n);
for i=1:n
    TN[i]=Tensor(
        setdiff(1:n,i),
        randn(d*ones(Int,n-1)...),
        randn(),
        randn()
    )
end

value = sweep_contract!(TN,χ,τ)

Here we have used a TensorNetwork instead of a LabelledTensorNetwork. In a LabelledTensorNetwork each tensor can be labelled by an arbitrary type, which is accomplished by storing the network as a dictionary; this flexibility can incur significant overheads. A TensorNetwork is built on vectors, so each tensor must be labelled by an integer from 1 to n, but it can be significantly faster. While less flexible, TensorNetwork should be preferred in performance-sensitive settings.
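As an illustration, the ABCD network from Example 1 could be rewritten with integer labels 1 (A), 2 (B), 3 (C) and 4 (D); this is a sketch of the same construction, just with the unlabelled type:

TN = TensorNetwork(undef, 4)
TN[1] = Tensor([4,2],   [i^2-2j for i=0:2, j=0:2],          0, 1)   # A
TN[2] = Tensor([1,4,3], [-3^i*j+k for i=0:2, j=0:2, k=0:2], 0, 0)   # B
TN[3] = Tensor([2,4],   [j for i=0:2, j=0:2],               1, 0)   # C
TN[4] = Tensor([1,2,3], [i*j*k for i=0:2, j=0:2, k=0:2],    1, 1)   # D
value = sweep_contract!(TN, 2, 4)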
