Haofei Xu · Daniel Barath · Andreas Geiger · Marc Pollefeys
Paper | Project Page | Models
ReSplat is a feed-forward recurrent model for 3D Gaussian splatting that iteratively refines Gaussians using the rendering error as a gradient-free feedback signal for test-time adaptation.
Key features:
- Compact initialization: Predicts Gaussians in a subsampled space (16× fewer Gaussians than prior per-pixel methods)
- Recurrent refinement: Weight-sharing recurrent module that uses rendering error to predict per-Gaussian parameter updates
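As an illustration of the refinement idea, the loop below is a toy sketch only: the renderer, the update module, and all names and shapes are hypothetical stand-ins, not the actual ReSplat implementation. It shows the structure described above: render, compare against the target view, and let a weight-sharing update module map the rendering error to per-Gaussian parameter deltas, with no gradients flowing through the renderer.

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.random((8, 8))        # toy ground-truth view
gaussians = rng.random((16, 3))    # 16 Gaussians, 3 parameters each (hypothetical)

def render(g):
    # Stand-in renderer: a fixed linear projection of the Gaussian
    # parameters onto the image (the real model rasterizes 3D Gaussians).
    W = np.ones((g.size, target.size)) / g.size
    return (g.reshape(-1) @ W).reshape(target.shape)

def update_module(g, error):
    # Stand-in for the learned recurrent module: map the pooled rendering
    # error to a per-Gaussian delta. The real module is a neural network
    # whose weights are shared across refinement iterations.
    return -0.5 * error.mean() * np.ones_like(g)

for step in range(3):  # same module weights reused at every iteration
    error = render(gaussians) - target          # gradient-free feedback signal
    gaussians = gaussians + update_module(gaussians, error)
```

Each pass shrinks the rendering error without ever backpropagating through the renderer, which is what makes this form of feedback usable for test-time adaptation.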
## Installation

This codebase is developed with Python 3.12, PyTorch 2.7.0, and CUDA 12.8.
We recommend setting up a virtual environment (e.g., conda or venv) before installation:
```
# conda
conda create -y -n resplat python=3.12
conda activate resplat

# or venv
# python -m venv /path/to/venv/resplat
# source /path/to/venv/resplat/bin/activate
```

Then install the dependencies:

```
# torch 2.7.0, cuda 12.8
pip install torch==2.7.0 torchvision==0.22.0 --index-url https://download.pytorch.org/whl/cu128
pip install -r requirements.txt

# Install gsplat 1.5.3
pip install --no-build-isolation git+https://github.com/nerfstudio-project/gsplat.git@v1.5.3

# Install pointops (kNN)
cd src/model/encoder/pointops && python setup.py install && cd ../../../..
```

## Model Zoo

Pre-trained models are available in the Model Zoo.
Download the weights and place (or symlink) them in the `pretrained` directory:

```
ln -s YOUR_MODEL_PATH pretrained
```

## Camera Conventions

The camera intrinsic matrices are normalized: the first row is divided by the image width and the second row by the image height.
The camera extrinsic matrices follow the OpenCV camera-to-world convention (+X right, +Y down, +Z pointing into the screen).
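For concreteness, the intrinsics normalization described above can be written as a small helper. This is an illustrative sketch, not code from the repository; the function name and the example focal length and principal point are made up.

```python
import numpy as np

def normalize_intrinsics(K, width, height):
    # Normalize a 3x3 pinhole intrinsic matrix as described above:
    # first row divided by image width, second row by image height.
    K = np.asarray(K, dtype=np.float64).copy()  # keep the input unchanged
    K[0] /= width
    K[1] /= height
    return K

# Hypothetical 640x480 camera with focal length 500 px, principal point at the center
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
K_norm = normalize_intrinsics(K, width=640, height=480)
```

After normalization the focal lengths and principal point are expressed as fractions of the image size (here the principal point becomes (0.5, 0.5)), so the same matrix is valid at any rendering resolution.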
## Datasets

See DATASETS.md for detailed instructions on preparing the RealEstate10K, DL3DV, and ACID datasets.
Symlink the downloaded datasets to the `datasets` directory:

```
ln -s YOUR_DATASET_PATH datasets
```

## Inference and Evaluation

Check scripts/infer_colmap.sh for running our pre-trained models on COLMAP datasets.
A demo scene can be downloaded here to quickly try our method.
Evaluation scripts are also provided in scripts/ for reproducing the results in our paper.
## Training

ReSplat is trained in two stages: (1) initial Gaussian prediction and (2) recurrent refinement.
The training scripts in scripts/ contain the exact commands and hyperparameters used for the experiments in our paper; please refer to them for detailed configurations.
Before training, download the pre-trained depth model and set up your wandb account for logging (in particular, set wandb.entity=YOUR_ACCOUNT).
## Citation

If you find this work useful, please consider citing:

```
@article{xu2025resplat,
  title={ReSplat: Learning Recurrent Gaussian Splatting},
  author={Xu, Haofei and Barath, Daniel and Geiger, Andreas and Pollefeys, Marc},
  journal={arXiv preprint arXiv:2510.08575},
  year={2025}
}
```

## Acknowledgements

Our codebase builds upon several excellent open-source projects: pixelSplat, MVSplat, MVSplat360, UniMatch, Depth Anything V2, DepthSplat, Pointcept, 3DGS, gsplat, and DL3DV. We thank all the authors for their great work.
