Welcome to the homepage of the

Mathematical Optimization for Data Science Group

Department of Mathematics and Computer Science, Saarland University, Germany

Symmetries in PAC-Bayesian Learning

A. Beck and P. Ochs

Abstract:
Symmetries are known to improve the empirical performance of machine learning models, yet theoretical guarantees explaining these gains remain limited. Prior work has focused mainly on compact group symmetries and often assumes that the data distribution itself is invariant, an assumption rarely satisfied in real-world applications. In this work, we extend generalization guarantees to the broader setting of non-compact symmetries, such as translations, and to non-invariant data distributions. Building on the PAC-Bayes framework, we adapt and tighten existing bounds, demonstrating the approach on McAllester's PAC-Bayes bound while showing that it applies to a wide range of PAC-Bayes bounds. We validate our theory with experiments on a rotated MNIST dataset with a non-uniform rotation group, where the derived guarantees not only hold but also improve upon prior results. These findings provide theoretical evidence that, for symmetric data, symmetric models are preferable beyond the narrow setting of compact groups and invariant distributions, opening the way to a more general understanding of symmetries in machine learning.
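
For context, McAllester's PAC-Bayes bound mentioned in the abstract is typically stated as follows; the version below is a common textbook variant, and the notation and exact constants are not taken from the paper. For any prior $P$ over hypotheses and any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $Q$,
\[
  \mathbb{E}_{h \sim Q}\big[L(h)\big] \;\le\; \mathbb{E}_{h \sim Q}\big[\hat{L}_S(h)\big] \;+\; \sqrt{\frac{\operatorname{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
\]
where $L$ denotes the population risk and $\hat{L}_S$ the empirical risk on $S$, and $\operatorname{KL}(Q \,\|\, P)$ is the Kullback-Leibler divergence between posterior and prior. The paper adapts and tightens bounds of this type for symmetric models under non-compact group actions and non-invariant data distributions.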
pdf Bibtex arXiv
Latest update: 20.10.2025
Citation:
A. Beck, P. Ochs:
Symmetries in PAC-Bayesian Learning. [pdf]
Technical Report, ArXiv e-prints, arXiv:2510.17303, 2025.
Bibtex:
@techreport{BO25,
  title       = {Symmetries in PAC-Bayesian Learning},
  author      = {A. Beck and P. Ochs},
  year        = {2025},
  institution = {ArXiv e-prints},
  number      = {arXiv:2510.17303},
}


MOP Group
©2017-2025
The author is not responsible for the content of external pages.