# ARITH 2022 Keynotes

## Number Representations for Deep Learning

**Bill Dally, chief scientist at NVIDIA**

The current resurgence of artificial intelligence is due to advances in deep learning. Systems based on deep learning now exceed human capability in speech recognition, object classification, and playing games like Go. Deep learning has been enabled by powerful, efficient computing hardware. The algorithms used have been around since the 1980s, but only in the last decade, when powerful GPUs became available to train networks, has the technology become practical. Advances in DL are now gated by hardware performance. In the last decade, the efficiency of DL inference on GPUs has improved by 1000x. Much of this gain was due to improvements in data representation, starting with FP32 in the Kepler generation of GPUs and scaling down to Int8 and FP8 in the Hopper generation. This talk will review this history and discuss further improvements in number representation, including logarithmic representation, optimal clipping, and per-vector quantization.
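To make the last idea concrete, here is a minimal sketch of per-vector quantization to Int8: each vector gets its own scale factor, so an outlier in one vector does not inflate the quantization step of the others. This is an illustrative example, not NVIDIA's implementation; the function names and the simple max-based scale (rather than optimal clipping) are assumptions for clarity.

```python
import numpy as np

def quantize_per_vector(x, bits=8):
    """Quantize each row of x to signed integers with a per-row scale."""
    qmax = 2 ** (bits - 1) - 1                    # 127 for Int8
    scale = np.abs(x).max(axis=1, keepdims=True) / qmax
    scale = np.where(scale == 0, 1.0, scale)      # avoid divide-by-zero
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximation of the original floats."""
    return q.astype(np.float32) * scale

# A small-magnitude vector and a large-magnitude vector: each keeps
# its own resolution instead of sharing one tensor-wide scale.
x = np.array([[0.02, -0.50, 0.49],
              [30.0, -7.50, 15.0]], dtype=np.float32)
q, s = quantize_per_vector(x)
x_hat = dequantize(q, s)
```

With a single tensor-wide scale, the first row would be quantized with a step sized for the value 30.0 and lose almost all its precision; per-vector scales bound the rounding error of each row by half of that row's own step.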

## Fully Homomorphic Encryption over the Discretized Torus

**Marc Joye, chief scientist at Zama**

First posed as a challenge in 1978 by Rivest et al., fully homomorphic encryption—the ability to evaluate any function over encrypted data—was only solved in 2009 in a breakthrough result by Gentry (Commun. ACM, 2010). After a decade of intense research, practical solutions have emerged and are being pushed for standardization.

This talk explains the inner workings of TFHE, a torus-based fully homomorphic encryption scheme. Specifically, it describes its implementation on a discretized version of the torus. It also explains in detail the technique of programmable bootstrapping, which makes it possible to evaluate any function homomorphically during a bootstrapping operation.
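The discretization the abstract refers to can be sketched in a few lines: the real torus $\mathbb{T} = \mathbb{R}/\mathbb{Z}$ (reals modulo 1) is replaced by its finite subgroup of multiples of $1/q$, with elements stored as $q$-bit integers. The choice $q = 2^{32}$ and the helper names below are illustrative assumptions, not the scheme's actual API.

```python
# Discretized torus: t in [0, 1) is represented by the integer
# round(t * Q) modulo Q, here with Q = 2^32 (an assumed parameter).
Q = 2 ** 32

def encode(t: float) -> int:
    """Map a torus element to its nearest discretized representative."""
    return round(t * Q) % Q

def decode(i: int) -> float:
    """Map a representative back to a real in [0, 1)."""
    return (i % Q) / Q

def torus_add(a: int, b: int) -> int:
    # Addition modulo 1 on the real torus becomes integer addition
    # modulo Q, which is just native wrap-around machine arithmetic.
    return (a + b) % Q

# 0.75 + 0.5 wraps around to 0.25 on the torus.
wrapped = decode(torus_add(encode(0.75), encode(0.5)))  # → 0.25
```

This wrap-around structure is why the torus is a natural home for the scheme: ciphertext noise lives in the low-order bits of these integers, and additions come for free from machine arithmetic.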

## Integer multiplication in time $O(n \log n)$

**Joris van der Hoeven, research director at CNRS**