Introduction to the ZX-Calculus (pennylane.ai)
62 points by EvgeniyZh on June 6, 2023 | 13 comments


Wow, I didn’t expect to see ZX here. It’s related to my PhD. You can find a lot of ZX-related literature at [1]. [2] is a really nice introduction. If you want to play with it in Python, take a look at pyzx [3]; a quick sketch of what that looks like follows the links below.

[1] https://zxcalculus.com/

[2] https://arxiv.org/abs/2012.13966

[3] https://github.com/Quantomatic/pyzx
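
For a taste of what playing with pyzx looks like, here is a minimal sketch based on the examples in its README (exact function signatures may differ between versions, so treat this as approximate):

    import pyzx as zx

    # random Clifford+T circuit to experiment with
    circuit = zx.generate.CNOT_HAD_PHASE_circuit(qubits=4, depth=40)

    g = circuit.to_graph()                    # view the circuit as a ZX-diagram
    zx.simplify.full_reduce(g)                # simplify it with the ZX rewrite rules
    optimized = zx.extract_circuit(g.copy())  # turn the diagram back into a circuit

    # check the rewritten circuit still implements the same linear map
    print(zx.compare_tensors(circuit, optimized))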


Why is it useful in a world where no quantum computer physically exists?


> Why is it useful in a world where no quantum computer physically exists?

Turing's and Church's investigations into the various theoretical computing calculi are enormously useful, even though both were done well before the existence of anything we'd recognize as a modern general-purpose computer.


Yeah, but at least you had analog devices that carried out mechanical work through computation (looms, relays, etc.). Nothing physically comparable exists for quantum computing, and nothing is even close to being _real_ besides a model of a theoretical physical computation that has to be checked by classical means. So I don't really get why a language for it exists.


But we do have a ton of "analog quantum computers": basically any experiment in quantum materials or quantum chemistry is something we cannot simulate classically and something that informs us about analogous systems. No different from using single-purpose analog electronics as analogs to predict the trajectories of spacecraft, ocean currents, bombs, or planes.

In both the classical and the quantum case, building a digital computer (a computer that has the potential to scale to large system sizes, unlike analog computers) is what is difficult.

Babbage and Lovelace imagined a digital computer in the 1830s. It took 120 years until an actual digital computer existed, and it took that creation, around 1950, for people to stop saying "digital computers are impossible because of noise". See "Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components" by von Neumann to see how incredibly surprised theorists were at the time.

Quantum digital computers were conceptualized in the late 1980s, and the existence of quantum error correction (the equivalent of von Neumann's paper) was shown in the late 1990s. Compared to how long conception-to-realization took for classical computers (120 years), the quantum side of things is not that terrible (30 years and counting).


Let me give you one example to illustrate the general principle. Any quantum computer we build will be too noisy to do any useful computation as is. To counter this, it is necessary to employ (quantum) error-correction at the software level to reduce the noise sufficiently to do useful computation.

While a lot of quantum error-correction algorithms have been developed using traditional techniques, they are still not good enough, and there still appears to be lots of room for improvement. The use of the ZX-calculus has already started to yield novel insights into how to design better algorithms. The change of language gives a new perspective that helps quite a lot.

Hence, ZX calculus is helping to bring quantum computers much closer to reality.


That's very true indeed, and for photonic QCs ZX has become vital for all software.


That's not a bad approximation, but consider that we did have computers being used in anger (to kill people, even) very shortly after. Where are the quantum computers today?


I have a ZX-81 in the basement that won't take too much work to get running.


Not specifically ZX, but rather any non-trivial technique for modeling quantum dynamics (or some subset of such dynamics) is extremely useful: it lets you study where the boundary between classical and quantum algorithms lies.

Consider that I am excited even about the worst case scenario: I am a researcher working on building quantum hardware -- if one day techniques like these show that you can actually simulate a complete quantum computer efficiently in a classical computer, then my whole field will die. But that would be a monumental win because we would have learnt incredibly deep truths about the laws of math and nature.

Or more realistically and practically, currently such techniques let us model small parts of the quantum hardware and let us make informed decisions about how to proceed with building it.


There are many quantum computers around: IBM's and Google's superconducting ones, our ion traps, optical ones by e.g. Quandela, and they are all heavily used. Compilers exist for them, and they heavily employ the ZX-calculus for optimisation. The ZX-calculus is also the main language for optical quantum computing.
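
To make the optimisation point concrete, here is a hedged sketch of the kind of pass such compilers run, using pyzx to reduce the T-count of a small circuit (the QASM snippet is made up for illustration; exact pyzx signatures may vary by version):

    import pyzx as zx

    # toy OPENQASM 2.0 circuit, purely for illustration
    qasm = """
    OPENQASM 2.0;
    include "qelib1.inc";
    qreg q[2];
    cx q[0], q[1];
    t q[1];
    cx q[0], q[1];
    tdg q[1];
    """

    c = zx.Circuit.from_qasm(qasm)
    print("T-count before:", c.tcount())

    g = c.to_graph()                 # circuit -> ZX-diagram
    zx.simplify.full_reduce(g)       # the ZX rewriting does the optimisation
    c_opt = zx.extract_circuit(g)    # ZX-diagram -> circuit
    print("T-count after:", c_opt.tcount())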

Within the year, we expect them to do a task that is impossible to perform on classical hardware in any realistic timeframe.

Btw, the ZX-calculus, as part of categorical quantum mechanics, is an alternative formalism for QM, independent of quantum computing. It has been proven to be complete for Hilbert spaces and linear maps.
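
For reference, the two families of ZX generators denote these linear maps (standard definitions, with n inputs and m outputs); completeness means every true equation between such maps can be derived purely by diagram rewriting:

    % Z- and X-spiders with phase \alpha
    Z^m_n(\alpha) = |0\rangle^{\otimes m}\langle 0|^{\otimes n} + e^{i\alpha}\,|1\rangle^{\otimes m}\langle 1|^{\otimes n}
    X^m_n(\alpha) = |{+}\rangle^{\otimes m}\langle{+}|^{\otimes n} + e^{i\alpha}\,|{-}\rangle^{\otimes m}\langle{-}|^{\otimes n}

    % spider fusion: connected spiders of the same colour merge, phases add
    Z(\alpha) \text{ fused with } Z(\beta) \;=\; Z(\alpha + \beta)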


These are basically interaction nets with different rules than those used for lambda calculus.

> Interaction nets are a graphical model of computation devised by Yves Lafont in 1990

https://en.wikipedia.org/wiki/Interaction_nets


Not really: interaction nets and linear logic don't have the compact closed categorical semantics we use here. Ross Duncan's PhD is about the differences (and that was long before the extra structures of ZX were developed).



