
We have the following stage(s) available:

Bayesian update as a fixed-point transformation

Supervisors
Kostas Chatzikokolakis (CNRS, Ecole Polytechnique) and Catuscia Palamidessi (INRIA, Ecole Polytechnique)

Description
The most distinguishing feature of Bayesian probability theory is its mathematical framework for performing probabilistic inference. It is most often used to judge the relative validity of hypotheses in the face of noisy, sparse, or uncertain data, or to adjust the parameters of a specific model. To evaluate the probability of a hypothesis, or of a parameter, the Bayesian method assumes a prior probability distribution, which is then updated to a posterior probability in the light of new, relevant data. The update of the probability is calculated using Bayes' theorem. The foundations of Bayesian probability theory were laid down by Thomas Bayes (1702–1761), who formulated a special case of what is now called Bayes' theorem, and by Pierre-Simon Laplace (1749–1827), who introduced the general version. Although it had been almost abandoned in the last century in favor of a collection of techniques known as "frequentist statistics", in the 1980s there was a sudden comeback of interest in the Bayesian method (the so-called "Bayesian revolution"), mostly attributed to the discovery of Markov chain Monte Carlo methods, which allow the update to be performed efficiently. Since then, there has been a dramatic growth in research and applications of Bayesian probability, and it is now commonly employed in many scientific disciplines, from astrophysics to neuroscience. In Computer Science in particular, it is used in Machine Learning, Probabilistic Programming, Quantitative Information Flow, and Privacy, to mention just a few areas. Bayes' theorem is formulated as follows:

    P(H | E) = P(E | H) P(H) / P(E)

where H is the hypothesis, E is the observed evidence, P(H) is the prior probability of the hypothesis, P(E | H) is the likelihood of the evidence given the hypothesis, and P(H | E) is the posterior probability.
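To make the update rule concrete, here is a minimal numerical sketch (not part of the original posting; the coin hypotheses and their likelihood values are illustrative assumptions) of repeated Bayesian updating over two hypotheses about a coin:

```python
# Illustrative sketch: Bayesian update over two hypotheses about a coin.
# H1: the coin is fair (P(heads) = 0.5); H2: the coin is biased (P(heads) = 0.8).
# These hypotheses and probabilities are assumptions chosen for the example.

def bayes_update(prior, likelihoods):
    """Apply Bayes' theorem once: posterior_i ∝ P(E | H_i) * P(H_i)."""
    unnormalized = [p * l for p, l in zip(prior, likelihoods)]
    evidence = sum(unnormalized)  # P(E) = sum_i P(E | H_i) P(H_i)
    return [u / evidence for u in unnormalized]

prior = [0.5, 0.5]       # uniform prior over H1, H2
heads_lik = [0.5, 0.8]   # P(heads | H1), P(heads | H2)
tails_lik = [0.5, 0.2]   # P(tails | H1), P(tails | H2)

posterior = prior
for outcome in "HHTHHHHH":  # a run of mostly-heads observations
    posterior = bayes_update(posterior,
                             heads_lik if outcome == "H" else tails_lik)

print(posterior)  # the probability mass concentrates on H2 (biased coin)
```

Each observation multiplies the prior by the corresponding likelihood and renormalizes; after eight tosses dominated by heads, the posterior assigns most of its mass to the biased-coin hypothesis.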
Goals of the stage
The asymptotic behavior of the Bayesian update has been investigated by Joseph L. Doob [1948] and by David A. Freedman [1963, 1965], who provided general conditions for convergence in the case of Bayes estimators. However, much remains to be done: in particular, the study of convergence conditions specific to various applications, and the study of the convergence speed, namely how to measure it (according to various criteria dictated by the application), what parameters influence it, and so on. In particular, we aim at studying three main application domains:
Methodology
We will formalize the Bayesian update as a probabilistic transformation on probability distributions, and we will study the convergence conditions and the convergence speed of this transformation towards a fixed point, which will represent the true hypothesis. We will use know-how from the theory of (deterministic) programming languages, where such operators are used to provide semantics to recursive programs. A typical condition for the existence of a fixed point is the contraction mapping principle: let (X, d) be a metric space. A map T : X → X is called a contraction mapping on X if there exists q ∈ [0, 1) such that

    d(T(x), T(y)) ≤ q · d(x, y)   for all x, y ∈ X.

By the Banach fixed-point theorem, if (X, d) is complete, such a T admits a unique fixed point x* = T(x*), and the iterates T^n(x) converge to x* from any starting point x.
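The fixed-point view can be sketched concretely. In the toy example below (an illustration, not the project's actual formalization; the two-hypothesis setup and the likelihood values are assumptions), one Bayesian update step is treated as a map T on the probability simplex, and iterating T drives the distribution toward a point mass on the favored hypothesis, which is a fixed point of the transformation:

```python
# Illustrative sketch: one Bayesian update step as a map T on distributions
# over two hypotheses. Likelihood values (0.5, 0.8) are assumed for the example.

def T(dist, likelihoods=(0.5, 0.8)):
    """One Bayesian update step: a transformation on probability distributions."""
    unnorm = [p * l for p, l in zip(dist, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def tv_distance(p, q):
    """Total variation distance between two finite distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

dist = [0.5, 0.5]          # start from the uniform prior
target = [0.0, 1.0]        # candidate fixed point: point mass on H2
for _ in range(50):
    dist = T(dist)         # iterate the update transformation

print(tv_distance(dist, target))       # shrinks toward 0 as iterates converge
print(tv_distance(T(target), target))  # the point mass is left unchanged by T
```

The second printed distance is exactly zero: updating a point mass returns the same point mass, so it is a fixed point of T, and the iterates from the uniform prior approach it. Measuring how fast such distances shrink, and under which metrics, is precisely the kind of convergence-speed question the stage addresses.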
Bibliography
[1] Subhashis Ghosal. A Review of Consistency and Convergence of Posterior Distributions.
[2] M.S. Alvim, K. Chatzikokolakis, C. Palamidessi, and G. Smith. Measuring Information Leakage using Generalized Gain Functions. Proc. of CSF, pages 265-279, 2012.
[3] Cynthia Dwork. A Firm Foundation for Private Data Analysis. Communications of the ACM, Vol. 54, No. 1, pages 86-95, 2011.

Salary
Standard INRIA internship gratification.

Continuation as PhD
The internship is intended to be used by the candidate to produce his/her master's thesis. There will be the possibility of continuing to work with the team as a PhD student. Preference will be given to students who are willing to continue as PhDs.

Candidate's required level
Master's student (2nd year).

Proposed duration
4-6 months.

Prerequisites
Aptitude for formal and mathematical reasoning. Some knowledge of Probability Theory would be welcome.

Contact people
Kostas Chatzikokolakis or Catuscia Palamidessi
