diff --git a/docs/refs.bib b/docs/refs.bib
index 0beca3c2..1fdf1de1 100644
--- a/docs/refs.bib
+++ b/docs/refs.bib
@@ -95,4 +95,14 @@ @misc{cotengra
   howpublished={https://github.com/jcmgray/cotengra},
   url={https://github.com/jcmgray/cotengra},
 }
+@article{arute2019quantum,
+  title={Quantum supremacy using a programmable superconducting processor},
+  author={Arute, Frank and Arya, Kunal and Babbush, Ryan and Bacon, Dave and Bardin, Joseph C and Barends, Rami and Biswas, Rupak and Boixo, Sergio and Brandao, Fernando GSL and Buell, David A and others},
+  journal={Nature},
+  volume={574},
+  number={7779},
+  pages={505--510},
+  year={2019},
+  publisher={Nature Publishing Group}
+}
 }
\ No newline at end of file
diff --git a/docs/src/transformations.md b/docs/src/transformations.md
index 55ed2e03..70ad67ac 100644
--- a/docs/src/transformations.md
+++ b/docs/src/transformations.md
@@ -4,7 +4,7 @@ In tensor network computations, it is good practice to apply various transformat
 
 A crucial reason why these methods are indispensable lies in their ability to drastically reduce the problem size of the contraction path search and also the contraction. This doesn't necessarily involve reducing the maximum rank of the Tensor Network itself, but more importantly, it reduces the size (or rank) of the involved tensors.
 
-Our approach has been significantly inspired by the ideas presented in the [Quimb](https://quimb.readthedocs.io/) library, explained in [this paper](https://arxiv.org/pdf/2002.01935.pdf).
+Our approach is based on [gray2021hyper](@cite), whose ideas are also implemented in the [quimb](https://quimb.readthedocs.io/) library.
 
 In Tenet, we provide a set of predefined transformations which you can apply to your `TensorNetwork` using both the `transform`/`transform!` functions.
 
@@ -249,7 +249,7 @@ fig #hide
 
 ## Example: RQC simplification
 
-Here we show how can we reduce the complexity of the tensor network by applying a tranformation to it. We take as an example the Sycamore circuit from the [Google's quantum supremacy paper](https://www.nature.com/articles/s41586-019-1666-5)
+Local transformations can dramatically reduce the complexity of tensor networks. As an example, take the Random Quantum Circuit (RQC) run on the Sycamore chip in Google's quantum advantage experiment [arute2019quantum](@cite).
 
 ```@setup plot
 using Makie
diff --git a/ext/TenetMakieExt.jl b/ext/TenetMakieExt.jl
index 2fd2cc62..18338554 100644
--- a/ext/TenetMakieExt.jl
+++ b/ext/TenetMakieExt.jl
@@ -16,9 +16,8 @@ Plot a [`TensorNetwork`](@ref) as a graph.
 
 # Keyword Arguments
 
-  - `inds` Whether to show the index labels. Defaults to `false`.
-  - `layout` Algorithm used to map graph vertices to a (2D or 3D) coordinate system.
-    The algorithms implemented in the `NetworkLayout` package are recommended.
+  - `labels` If `true`, show the labels of the tensor indices. Defaults to `false`.
+  - Any other keyword arguments are passed to `GraphMakie.graphplot`.
 """
 function Makie.plot(tn::TensorNetwork; kwargs...)
     f = Figure()
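
As a quick sanity check of the `transform`/`transform!` API referenced in the transformations.md changes, a minimal sketch (not part of this diff): the two-tensor network is made up, and `Tenet.DiagonalReduction` stands in for whichever transformation the docs page documents.

```julia
using Tenet

# Toy network: `Tensor(data, inds)` pairs an array with symbolic
# indices; the shared index :j is contracted between the two tensors.
tn = TensorNetwork([
    Tensor(rand(2, 2), (:i, :j)),
    Tensor(rand(2, 2), (:j, :k)),
])

# Out-of-place: returns a transformed copy, leaving `tn` untouched.
# `Tenet.DiagonalReduction` is an assumed example transformation.
simplified = transform(tn, Tenet.DiagonalReduction)

# In-place variant mutates `tn` directly.
transform!(tn, Tenet.DiagonalReduction)
```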
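Similarly, a hedged usage sketch for the updated `plot` docstring: the `labels` keyword and the forwarding of other kwargs to `GraphMakie.graphplot` come from the docstring itself, while the backend choice and the example network are assumptions.

```julia
using Tenet
using CairoMakie  # loading a Makie backend activates the TenetMakie extension

tn = TensorNetwork([
    Tensor(rand(2, 2, 2), (:i, :j, :k)),
    Tensor(rand(2, 2), (:k, :l)),
])

# Show index labels on the graph; any other keyword argument
# (e.g. a `layout` from NetworkLayout.jl) is forwarded to
# `GraphMakie.graphplot`.
fig = plot(tn; labels=true)
```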