Thu 24 Oct 2024 16:00 - 16:20 at San Gabriel - Probabilistic Programming and Analysis 1 Chair(s): Di Wang

The design of online learning algorithms typically aims to optimize the incurred loss or \emph{cost}, e.g., the number of classification mistakes made by the algorithm. The goal of this paper is to build a type-theoretic framework to \emph{prove} that a given algorithm actually achieves its stated cost bound.

Online learning algorithms often rely on randomness, their loss functions are often defined as expectations, precise bounds are often non-polynomial (e.g., logarithmic), and proofs of optimality often rely on potential-based arguments. Accordingly, we present p$\lambda$-amor, a type-theoretic graded modal framework for analysing (expected) costs of higher-order \emph{probabilistic} programs with recursion. p$\lambda$-amor is an effect-based framework that uses graded modal types to represent potentials, cost and probability at the type level. It extends prior work ($\lambda$-amor) on cost analysis for \emph{deterministic} programs. We prove p$\lambda$-amor sound relative to a Kripke step-indexed model which relates potentials with probabilistic coupling. We use p$\lambda$-amor to prove cost bounds for several examples from the online machine learning literature. Finally, we describe an extension of p$\lambda$-amor with a graded comonad and describe the relationship between the different modalities.
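As context for the kind of bound the abstract describes, here is a minimal sketch (not from the paper; all names and parameters are illustrative) of randomized weighted majority, a classic online learning algorithm whose \emph{expected} mistake count is bounded via a potential argument by roughly $(1+\varepsilon)\,m^* + \ln(N)/\varepsilon$, where $m^*$ is the best expert's mistake count and $N$ the number of experts — exactly the shape of randomized, expectation-based, logarithmic bound that p$\lambda$-amor aims to certify at the type level.

```python
import random

def randomized_weighted_majority(predictions, outcomes, eps=0.1, seed=0):
    """Sketch of randomized weighted majority (Hedge).

    predictions: list of rounds; each round is a list of expert predictions.
    outcomes: the true label for each round.
    Maintains one weight per expert; each round it follows an expert drawn
    with probability proportional to its weight, then multiplies the weight
    of every erring expert by (1 - eps). The potential (total weight) decays
    with each expected mistake, which yields the logarithmic regret bound.
    """
    rng = random.Random(seed)
    n = len(predictions[0])
    w = [1.0] * n
    mistakes = 0
    for preds, y in zip(predictions, outcomes):
        # Sample an expert with probability proportional to its weight.
        total = sum(w)
        r = rng.random() * total
        acc = 0.0
        choice = 0
        for i, wi in enumerate(w):
            acc += wi
            if r <= acc:
                choice = i
                break
        if preds[choice] != y:
            mistakes += 1
        # Multiplicatively penalize every expert that erred this round.
        for i in range(n):
            if preds[i] != y:
                w[i] *= (1.0 - eps)
    return mistakes
```

With a single always-correct expert the algorithm makes no mistakes; with only always-wrong experts it errs every round, matching the two extremes of the bound.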

Thu 24 Oct

Displayed time zone: Pacific Time (US & Canada)

16:00 - 17:40
Probabilistic Programming and Analysis 1 (OOPSLA 2024) at San Gabriel
Chair(s): Di Wang Peking University
16:00
20m
Talk
A modal type-theory of expected cost in higher-order probabilistic programs (Remote)
OOPSLA 2024
Vineet Rajani University of Kent, Gilles Barthe MPI-SP; IMDEA Software Institute, Deepak Garg MPI-SWS
DOI
16:20
20m
Talk
Distributions for Compositionally Differentiating Parametric Discontinuities
OOPSLA 2024
Jesse Michel Massachusetts Institute of Technology, Kevin Mu University of Washington, Xuanda Yang University of California San Diego, Sai Praveen Bangaru MIT, Elias Rojas Collins MIT, Gilbert Bernstein University of Washington, Seattle, Jonathan Ragan-Kelley Massachusetts Institute of Technology, Michael Carbin Massachusetts Institute of Technology, Tzu-Mao Li Massachusetts Institute of Technology; University of California at San Diego
DOI
16:40
20m
Talk
Exact Bayesian Inference for Loopy Probabilistic Programs Using Generating Functions
OOPSLA 2024
Lutz Klinkenberg RWTH Aachen University, Christian Blumenthal RWTH Aachen University, Mingshuai Chen Zhejiang University, Darion Haase RWTH Aachen University, Joost-Pieter Katoen RWTH Aachen University
DOI
17:00
20m
Talk
Hopping Proofs of Expectation-Based Properties: Applications to Skiplists and Security Proofs
OOPSLA 2024
Martin Avanzini Inria, Gilles Barthe MPI-SP; IMDEA Software Institute, Benjamin Gregoire INRIA, Georg Moser University of Innsbruck, Gabriele Vanoni IRIF, Université Paris Cité
DOI
17:20
20m
Talk
Learning Abstraction Selection for Bayesian Program Analysis
OOPSLA 2024
Yifan Zhang Peking University, Yuanfeng Shi Peking University, Xin Zhang Peking University
DOI Pre-print