Since the beginning of the field of high-performance computing (HPC) after World War II, computing resources and simulation complexity have increased rapidly, enabling significant scientific and technological advances. However, large numerical simulations carry substantial environmental costs, both to fabricate the hardware and to power the computations.
In this talk, we discuss the power consumption of HPC and efforts to improve computation efficiency at both the software and hardware levels. By analyzing data from the TOP500 supercomputer ranking, we show that despite substantial and steady efficiency gains since 1970, the total power footprint of HPC keeps rising. Indeed, much as in Jevons’ paradox, the gained efficiency is harnessed to increase model complexity and to extend simulation to new research and industry domains.
In recent years, AI models, specifically neural networks, have grown rapidly in complexity, reaching billions of parameters. This growth in complexity enables greater flexibility, finer model accuracy, and impressive feats such as text generation by large language models that is difficult to distinguish from human-written text. However, it also comes with a spike in power consumption, both during training and inference.
To curb energy usage, we can reduce model complexity to match the accuracy actually required by the problem at hand, either by computing with reduced precision or by using a simpler model (which may also improve model explainability). This requires weighing the allocated resources against the expected requirements and outcomes of the computation, raising questions that are specific, contextual, and political. Faced with a shrinking energy budget, we should regulate our usage of computation resources and use numerical simulation judiciously.
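As a minimal illustration of the precision trade-off mentioned above (not taken from the talk itself), the sketch below emulates single-precision (binary32) arithmetic in Python, which natively uses double precision (binary64). The helper `to_f32` is defined here for the example, not an existing API; it rounds a value to the nearest binary32 float via the standard `struct` module. Summing 0.1 a million times shows how reduced precision accumulates a visible error:

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python float (binary64) to the nearest binary32 value."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Sum 0.1 one million times in double and in (emulated) single precision.
n = 1_000_000
acc64 = 0.0
acc32 = 0.0
for _ in range(n):
    acc64 += 0.1
    acc32 = to_f32(acc32 + to_f32(0.1))

print(f"double: {acc64:.6f}")  # stays very close to 100000
print(f"single: {acc32:.6f}")  # drifts noticeably away from 100000
```

Whether such an error is acceptable depends on the application, which is precisely the kind of contextual judgment the abstract calls for.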
Pablo de Oliveira Castro (https://sifflez.org) is a professor at the University of Versailles Saint Quentin and co-coordinates the first year of the Université Paris-Saclay master’s degree in High-Performance Computing and Simulation. He received his Ph.D. in 2010 on Parallel Data Flow Languages at the CEA. His research interests include floating-point arithmetic, compilers, and high-performance computing. He also participates in an interdisciplinary group, Écopolien (https://ecopolien.org), that brings together researchers and teachers concerned about the climate crisis.
Quantum information theory has several puzzling properties. In particular, it predicts that the information carried by quantum objects is incompatible with the use of the logical ‘or’. While in our intuition of the world a cat hidden in a box is either dead or alive, according to quantum mechanics this assertion is no longer valid. Hence, we need a new concept: we say that a quantum cat can be dead ‘and’ alive. This new concept is, however, hard to accept: couldn’t we imagine a new theory, replacing quantum information theory, that is again compatible with our ‘or’? In 1964, Bell proved that, in some sense, this puzzling feature of quantum theory is a necessity: there does not exist any ‘reasonable’ theory of information (which would replace quantum information theory) compatible with this logical ‘or’. Bell’s seminal proof rests on quantum correlations, which are obtained when multiple parties perform independent measurements on a shared quantum state, and on very weak assumptions.

I will first review Bell’s theorem. Then, I will discuss some of its main foundational consequences and some of its concrete applications to the certification of quantum devices. Finally, I will discuss recent generalizations of this theorem to causal quantum networks, in which multiple parties perform independent measurements on several independent quantum states. In particular, I will show that Real Quantum Theory (the information theory obtained by replacing complex numbers with real numbers in standard quantum information theory) can be ruled out by a Bell-like experiment, and discuss related questions in quantum distributed computing.
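To give a concrete flavor of the quantum correlations the abstract refers to (an illustration, not part of the talk), the standard CHSH form of Bell’s inequality can be stated as follows, where $E(a,b)$ denotes the correlator of two parties’ measurement outcomes for settings $a$ and $b$:

```latex
% CHSH inequality: any local hidden-variable model (a 'reasonable'
% theory compatible with the logical 'or') must satisfy
\[
  S \;=\; E(a,b) + E(a,b') + E(a',b) - E(a',b') \;\le\; 2,
\]
% whereas measurements on a shared entangled quantum state can reach
% values up to Tsirelson's bound
\[
  S \;\le\; 2\sqrt{2}.
\]
```

Observing $S > 2$ experimentally is what rules out the ‘reasonable’ classical theories mentioned above.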
In our presentation, we will analyze the energy footprints of various types of computations, with a particular focus on the influence of computational complexity.
Diagrammatic proofs are ubiquitous in certain areas of mathematics, especially in category theory. Mechanising such proofs is tedious because proof assistants (such as Coq) are text-based. We present a prototype diagram editor that makes this process easier, building on coq-lsp, a VS Code extension for the Coq proof assistant, and on a web application (you can try it on my personal website).