luno-experiments

This repository contains the code for the experiments in the paper

Emilia Magnani, Marvin Pförtner, Tobias Weber, Philipp Hennig, "Linearization Turns Neural Operators into Function-Valued Gaussian Processes", International Conference on Machine Learning 2025.

It provides a simplified and refactored version of the original experimental code used in the paper.
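To give a feel for the core idea, here is a toy NumPy sketch of the linearization behind LUNO (illustrative only, not the repository's implementation): a network f(x; θ) linearized around the trained weights θ* is affine in θ, so a Gaussian posterior over θ induces a Gaussian over outputs with mean f(x; θ*) and covariance J Σ Jᵀ.

```python
import numpy as np

def f(x, theta):
    # A tiny nonlinear stand-in "network": one tanh unit with 3 weights.
    return np.tanh(theta[0] * x + theta[1]) * theta[2]

theta_star = np.array([0.8, -0.1, 1.5])  # trained (MAP) weights
Sigma = 0.05 * np.eye(3)                 # Gaussian posterior covariance over weights

def jacobian(x, theta, eps=1e-6):
    # Finite-difference Jacobian of f w.r.t. theta at a batch of inputs x.
    J = np.zeros((len(x), len(theta)))
    for j in range(len(theta)):
        d = np.zeros_like(theta)
        d[j] = eps
        J[:, j] = (f(x, theta + d) - f(x, theta - d)) / (2 * eps)
    return J

x = np.linspace(-2.0, 2.0, 5)
J = jacobian(x, theta_star)

mean = f(x, theta_star)   # predictive mean: the network's own output
cov = J @ Sigma @ J.T     # predictive covariance: J Sigma J^T
std = np.sqrt(np.diag(cov))
```

The paper's setting replaces this toy regression model with a neural operator, so the induced Gaussian lives over output *functions* rather than finite-dimensional vectors.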

Quick Start

Installation

Clone the repository, including the luno submodule, which contains the core implementation of the LUNO method:

git clone --recurse-submodules https://github.com/2bys/luno-experiments.git

Then, install the dependencies with uv (recommended):

# Install the luno submodule and this package in development mode
uv pip install -e deps/luno
uv pip install -e .

Running Experiments

The repository supports two main categories of experiments:

  1. Low-Data Regime Experiments (APEBENCH datasets):

    • diff_lin_1, diff_ks_cons_1, diff_hyp_diff_1, diff_burgers_1
  2. Out-of-Distribution Experiments (Advection-Diffusion-Reaction):

    • base_2, flip_2, pos_2, pos_neg_2, pos_neg_flip_2

We use the submit package to launch experiments on our SLURM-based ML cloud. The same commands can also be run locally.

Training

# Train models using the provided script
python3 submit/submit.py --mode slurm --script train \
  --data_name diff_lin_1 diff_ks_cons_1 diff_hyp_diff_1 diff_burgers_1 \
  --num_epochs 100 \
  --batch_size 5 \
  --num_train_samples 25 \
  --seed 0

Evaluation

# Evaluate trained models
python3 submit/submit.py --mode slurm --script evaluate \
  --data_name <dataset_name>

Replace <dataset_name> with one of the supported datasets.

Project Structure

  • luno_experiments/ - Main package containing experiment code
    • scripts/ - Training, evaluation, and data generation scripts
    • data/ - Data loading and processing utilities
    • nn/ - Neural network implementations
    • uncertainty/ - Uncertainty quantification methods
    • plotting/ - Visualization utilities
  • scripts/ - Shell scripts for running experiments
  • data/ - Dataset storage
  • results/ - Experiment results and outputs

The original plotting code will be released soon.

Methods

The repository implements several uncertainty quantification methods:

  • Input perturbations
  • Ensemble methods
  • Sampling-based approaches (ISO/LA)
  • LUNO-based approaches (ISO/LA)
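The two baseline schemes in the list above can be sketched in a few lines of NumPy (a hedged illustration with a hypothetical stand-in model; function names and signatures here are not the repository's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x, w):
    # Stand-in for a trained neural operator: a fixed nonlinear map.
    return np.sin(w * x)

def ensemble_uq(x, weights):
    # Ensemble method: predict with several independently trained models
    # and report the mean and spread across members.
    preds = np.stack([model(x, w) for w in weights])
    return preds.mean(axis=0), preds.std(axis=0)

def input_perturbation_uq(x, w, noise_std=0.1, n_samples=100):
    # Input perturbations: add Gaussian noise to the input and measure
    # how much the prediction varies across noisy copies.
    xs = x + noise_std * rng.normal(size=(n_samples, *x.shape))
    preds = model(xs, w)
    return preds.mean(axis=0), preds.std(axis=0)

x = np.linspace(0.0, np.pi, 8)
m_ens, s_ens = ensemble_uq(x, weights=[0.9, 1.0, 1.1])
m_pert, s_pert = input_perturbation_uq(x, w=1.0)
```

The sampling- and LUNO-based approaches (ISO/LA) instead derive uncertainty from a Laplace approximation to the weight posterior; see the luno submodule for the actual implementation.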

Dependencies

Key dependencies include:

  • linox - Linear operator framework
  • laplax - Laplace approximation
  • flax - Neural network library
  • wandb - Experiment tracking
  • apebench - Benchmark datasets

More detailed information on how to run the code will be added in the near future. For questions, guidance on usage, or access to original checkpoints, please contact the authors.
