Chapter 1 Preface

This is an open-source project hosted on GitHub, made possible by its many contributors. The website is licensed under CC BY-NC-SA 4.0. We are looking for talented people and researchers to contribute: the issues list of the GitHub repository collects open tasks and enhancements that would further improve this book, helping make recent developments in quantum information accessible to a larger audience.

The aim of this book is twofold:

  • First, we want to bridge the gap between introductory material in quantum computation and research material.
  • Second, you should be able to use this book as a resource for state-of-the-art algorithms. Readers and scholars should find statements of theorems (along with their citations) and the runtimes of the best quantum subroutines in the literature, ready to be used in new quantum algorithms or applications.

These lecture notes were used to teach at:

  • Politecnico di Milano (2019) - Quantum machine learning in practice.
  • Politecnico di Milano (2021) - Applied quantum computing.

Are you using these lecture notes as support for your course? Write us an email!

This book is dedicated to all cypherpunks: civil liberties through complex mathematics.

1.1 Abstract

In these lecture notes, we explore how we can leverage quantum computers and quantum algorithms for information processing. It has long been known that quantum computation can offer computational advantages over classical computation, and here we explore the consequences of this intuition in current domains of computer science.

Why are we studying quantum algorithms? Studying how to use quantum mechanical systems is fascinating in itself, but faster algorithms are not the only reason for studying quantum computing. Studying quantum computation might also reveal profound insights into new ways of processing information. For instance, it can give us ideas on processing data securely (though quantum cryptography is not discussed in these notes). Understanding the computational capabilities of quantum machines is certainly interesting in its own right, and might lead us to understand the computational limits of nature: what can be computed in this world? Last but not least, the interplay between classical and quantum computation has inspired many new classical algorithms (e.g., the dequantizations of quantum machine learning algorithms, classical algorithms for Gibbs sampling, simulations of QAOA, and so on). This, in turn, has improved our understanding of physics, and ultimately of the world itself.

Another reason for studying quantum algorithms is that quantum computers pose a significant challenge to the strong Church-Turing thesis, which says that any “reasonable” model of computation can be efficiently simulated on a probabilistic Turing machine (i.e., a Turing machine with access to randomness). However, there are physical processes that we do not know how to simulate efficiently on classical computers, but for which we have efficient quantum algorithms. This is strong evidence that the strong Church-Turing thesis might be false!

You might often hear that there are only two real quantum algorithms: phase estimation and Grover’s algorithm. This is somewhat true, but it is true in the same way that there are only 12 notes in the Western tempered scale, and yet Pink Floyd were able to write The Dark Side of the Moon (and other musicians came up with “the rest” of the music).

The common thread of these algorithms is that they are faster than their best classical counterparts. Oftentimes (especially in machine learning) the runtime depends only poly-logarithmically on the number of elements of the dataset, and is usually only linear in the number of features (classical algorithms are often either linear in the number of elements and quadratic in the number of features, or depend on the number of nonzero components of the matrix and polynomially on other parameters of the matrix). The runtime of a quantum machine learning algorithm also often depends on characteristics of the matrix that represents the data under analysis, such as its rank, its Frobenius norm (or other matrix norms), its sparsity, its condition number, and the error we tolerate in the analysis. For this, along with an error-corrected quantum computer, we assume quantum access to a dataset. In other words, we assume that the data is stored in a quantum memory: the quantum analogue of classical random-access memory.
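As an illustration (not part of the book's algorithms, just a toy computation with NumPy), here is how one might compute, for a sample data matrix, the parameters that typically appear in the runtimes of quantum machine learning algorithms:

```python
import numpy as np

# Toy data matrix: 100 samples, 20 features (arbitrary example values).
rng = np.random.default_rng(42)
A = rng.standard_normal((100, 20))

rank = np.linalg.matrix_rank(A)                 # rank of the data matrix
frobenius = np.linalg.norm(A, 'fro')            # Frobenius norm
sparsity = np.max(np.count_nonzero(A, axis=1))  # max number of nonzeros per row

# Condition number: ratio of largest to smallest nonzero singular value.
singular_values = np.linalg.svd(A, compute_uv=False)
condition_number = singular_values[0] / singular_values[-1]

print(rank, frobenius, sparsity, condition_number)
```

For a dense random matrix like this one, the sparsity equals the number of features; the interesting regimes for quantum algorithms are those where the rank, sparsity, norms, or condition number are small compared to the matrix dimensions.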

We will see that, for a new QML algorithm, one often needs to make sure that the actual performance of the quantum algorithm offers concrete advantages over the effective runtime and the accuracy offered by the best classical algorithms. As we don’t have access to big enough quantum computers yet, we can only assess the performance of these quantum algorithms via classical simulation.

These lecture notes should prepare the future quantum data analyst to understand the potential and limitations of quantum computers, so as to unlock new capabilities in information processing and machine learning. The hope is that this kind of technology can foster further technological advancements that benefit society and humankind, as soon as the hardware that supports this kind of computation becomes ready.

Last but not least, we will also cover important algorithms that are not necessarily related to machine learning, but are the quantum counterparts of important classical algorithms. Don’t be swayed by the “lack” of exponential speedups. Remember: the square root of 365 days is a little less than 3 weeks. Besides this, big polynomial speedups, small polynomial speedups on important problems, and polynomial speedups that introduce new algorithmic techniques are all very welcome in quantum computer science. All in all, quantum algorithms (to me) should be seen as a way of making impossible things possible.
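To put the “square root of 365 days” remark in numbers: a Grover-type quadratic speedup turns roughly N queries into roughly the square root of N. A quick sketch in plain Python (purely illustrative arithmetic, not an algorithm from the book):

```python
import math

# A quadratic speedup at different scales: unstructured search over N items
# takes ~N classical queries but only ~sqrt(N) Grover iterations.
for n in (365, 10**6, 10**12):
    print(f"N = {n:>14}: classical ~{n} queries, quantum ~{math.isqrt(n)}")

# The "square root of a year" from the text:
print(math.sqrt(365))  # ~19.1 days, i.e. a little less than 3 weeks
```

Even a “mere” quadratic speedup compresses a year of work into under three weeks, and the gap only widens as the problem size grows.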

While reading these lecture notes you should always remember the words of the good Simon Martiel:

“(quantum) Theoretical computer science is the fun part of mathematics.”

To all of you, happy reading.

1.2 Changelog

  • August 2020: Migrated the old blog to bookdown
  • December 2020: Moved the thesis to bookdown
  • January 2021: First system for typesetting algorithms, more appendix on linear algebra
  • February 2021: New subroutines for estimating \(\ell_1\) norms.
  • March 2021: quantumalgorithms.org is proudly supported by the Unitary Fund, and quantumalgorithms.org is a project of the QOSF mentorship program: 5 students started creating new content!
  • April 2021: Mobile version working, search functionality added, q-means, finding the minimum, a new algorithm for dimensionality reduction, and factor score ratio estimation.
  • June 2021: Quantum Monte Carlo algorithms, lower bounds techniques for query complexity of quantum algorithms, quantum algorithms for graph problems. The output of the mentorship program of the QOSF foundation!
  • January 2022: In the past months we improved the overall quality of the website (graphics, typos) and added some content (Deutsch-Jozsa, Bernstein-Vazirani, swap and Hadamard tests).
  • February 2022: We are happy to bring our swag (t-shirts, stickers, lanyards) to QIP. We added a whole chapter on perceptron algorithms and we doubled the chapter on quantum Monte Carlo algorithms with applications to finance and trace distance!
  • June 2022: We are participating in the UnitaryHack! We also presented this work at QNLP2022, where we distributed our swag! The website can now also be compiled with the latest versions of RStudio, pandoc, and bookdown.

Coming soon:

  • quantum perceptrons
  • quantum algorithms for dynamic programming
  • quantum convolutional neural networks
  • quantum random feature sampling

1.3 Teaching using this book

  • Lecture 1 - Axioms of quantum mechanics: Chapter ??, section ??
  • Lecture 2 - Quantum computer science: oracle model, BPP vs BQP, from Boolean to quantum circuits, Solovay-Kitaev, randomized algorithms, Markov inequality, union bound and applications. Section ??.
  • Lecture 3 - Foundational quantum algorithms, Section ??: Deutsch-Jozsa, Bernstein-Vazirani, swap test, Hadamard test.
  • Lecture 4 - Oracles, data representation and data loaders: Oracles for accessing the data, data representation, sparse models. Chapter 2.
  • Lecture 5 - QFT and Grover’s algorithm: Section 4.2, Section 4.1
  • Lecture 6 - Phase estimation, amplitude amplification, counting, finding the minimum, and applications: Beginning of Chapter 4.
  • Lecture 7 (optional) - More foundational quantum algorithms: Simon’s and Shor’s algorithms (#TODO)
  • Lecture 8 (optional): Quantum perceptron models: This is a very simple quantum machine learning algorithm that shows the applications of different techniques seen so far. (#TODO, very soon!)
  • Lecture 9: Quantum numerical linear algebra pt. 1: HHL algorithm, block-encodings and quantum singular value estimation. Mostly from Chapter 6.
  • Lecture 10: Quantum numerical linear algebra pt. 2: Quantum singular value transformation, Hamiltonian simulation. Theory and examples.
  • Lecture 11: Distance, inner product, trace estimation: From Chapter 4, especially Section 4.7, (#TODO trace estimations).
  • Lecture 12: Quantum Monte Carlo algorithms and applications: Content from Chapter 7.
  • Lecture 13: Quantum machine learning pt. 1: QPCA and QSFA. Chapter 8 Hamiltonian simulation with density matrices (#TODO).
  • Lecture 14: Quantum machine learning pt. 2: Random feature sampling (#TODO, to convert from .tex to markdown, check github repository).
  • Lecture 15: Classical simulation of quantum algorithms: How to simulate a quantum circuit? (#TODO) How to simulate a quantum machine learning algorithm? Section 11.
  • Lecture 16: Lower bounds: Adversarial and polynomial method in Chapter 13, state discrimination (missing).
  • Lecture 17 (optional): Between the qubits and me: what do we have between the theory of our algorithms and the hardware? An overview of different hardware architectures, error correction codes, and compilation techniques. (#TODO).
  • Lecture 18: Quantum algorithms on graphs: Backtracking algorithms, NP-complete problems on graphs (#TODO).
  • Lecture 19: Quantum optimization: Introduction to optimization. Quantum simplex, quantum zero-sum games. (#TODO).
  • Lecture 20: Quantum optimization: Quantum SDP algorithms, quantum interior point methods, quantum branch-and-bound algorithms (#TODO).