Fast Training of Neural Lumigraph Representations using Meta Learning

Project Page | Paper | Data

Alexander W. Bergman, Petr Kellnhofer, Gordon Wetzstein, Stanford University.
Official Implementation for Fast Training of Neural Lumigraph Representations using Meta Learning.

Usage

To get started, create a conda environment with all dependencies:

conda env create -f environment.yml
conda activate metanlrpp
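
As an optional sanity check that the environment is usable (this assumes environment.yml installs PyTorch, which the training code requires):

import torch

print(f"PyTorch {torch.__version__}; CUDA available: {torch.cuda.is_available()}")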

Code Structure

The code is organized as follows:

  • experiment_scripts: directory containing scripts for training and testing MetaNLR++ models.
    • pretrain_features.py: pre-train encoder and decoder networks
    • train_sdf_ibr_meta.py: train meta-learned initialization for encoder, decoder, aggregation fn, and neural SDF
    • test_sdf_ibr_meta.py: specialize meta-learned initialization to a specific scene
    • train_sdf_ibr.py: train NLR++ model from scratch without meta-learned initialization
    • test_sdf_ibr.py: evaluate performance on withheld views
  • configs: directory containing configs to reproduce experiments in the paper
    • nlrpp_nlr.txt: configuration for training NLR++ on the NLR dataset
    • nlrpp_dtu.txt: configuration for training NLR++ on the DTU dataset
    • nlrpp_nlr_meta.txt: configuration for training the MetaNLR++ initialization on the NLR dataset
    • nlrpp_dtu_meta.txt: configuration for training the MetaNLR++ initialization on the DTU dataset
    • nlrpp_nlr_metaspec.txt: configuration for training MetaNLR++ on the NLR dataset using the learned initialization
    • nlrpp_dtu_metaspec.txt: configuration for training MetaNLR++ on the DTU dataset using the learned initialization
  • data_processing: directory containing utility functions for processing data
  • torchmeta: torchmeta library for meta-learning
  • utils: directory containing various utility functions for rendering and visualization
  • loss_functions.py: file containing loss functions for evaluation
  • meta_modules.py: contains meta-learning wrappers around standard modules using torchmeta (see the sketch after this list)
  • modules.py: contains standard modules for coordinate-based networks
  • modules_sdf.py: extends standard modules for coordinate-based network representations of signed distance functions
  • modules_unet.py: contains encoder and decoder modules used for image-space feature processing
  • scheduler.py: utilities for training schedule
  • training.py: training script
  • sdf_rendering.py: functions for rendering SDF
  • sdf_meshing.py: functions for meshing SDF
  • checkpoints: contains checkpoints for some pre-trained models (additional/ablation models available by request)
  • assets: contains paths to checkpoints which are used as assets, and pre-computed buffers over multiple runs (if necessary)
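
For orientation, the sketch below shows the MAML-style pattern that the bundled torchmeta library enables: an inner loop adapts a copy of the parameters to one scene, and an outer loop updates the shared initialization so that adaptation works well. The network, data, and hyperparameters here are hypothetical stand-ins, not the actual MetaNLR++ modules:

import torch
import torch.nn.functional as F
from torchmeta.modules import MetaModule, MetaSequential, MetaLinear
from torchmeta.utils.gradient_based import gradient_update_parameters

# Hypothetical stand-in network; the real meta-wrapped MetaNLR++ modules
# live in meta_modules.py.
class TinyMetaNet(MetaModule):
    def __init__(self):
        super().__init__()
        self.net = MetaSequential(
            MetaLinear(3, 64), torch.nn.ReLU(), MetaLinear(64, 1))

    def forward(self, x, params=None):
        return self.net(x, params=self.get_subdict(params, 'net'))

model = TinyMetaNet()
meta_optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for _ in range(4):  # each iteration stands in for one training scene
    x, y = torch.randn(128, 3), torch.randn(128, 1)
    # Inner loop: adapt the parameters to this scene with one gradient step.
    inner_loss = F.mse_loss(model(x), y)
    params = gradient_update_parameters(model, inner_loss, step_size=1e-2)
    # Outer loop: update the initialization based on post-adaptation loss.
    outer_loss = F.mse_loss(model(x, params=params), y)
    meta_optimizer.zero_grad()
    outer_loss.backward()
    meta_optimizer.step()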

Getting Started

Pre-training Encoder and Decoder

Pre-train the encoder and decoder using the FlyingChairsV2 training dataset as follows:

python experiment_scripts/pretrain_features.py --experiment_name XXX --batch_size X --dataset_path /path/to/FlyingChairs2/train

Alternatively, use the checkpoint in the checkpoints directory.
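
Conceptually, this stage trains the encoder and decoder so that images map into feature space and back with low reconstruction error. The following is a rough, hypothetical sketch of such an objective; the actual architectures and losses are defined in modules_unet.py and pretrain_features.py:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy encoder/decoder; the repo's actual U-Net-style modules
# are defined in modules_unet.py.
encoder = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(32, 32, 3, padding=1))
decoder = nn.Sequential(nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(32, 3, 3, padding=1))
optimizer = torch.optim.Adam(
    [*encoder.parameters(), *decoder.parameters()], lr=1e-4)

images = torch.rand(4, 3, 128, 128)   # stand-in for FlyingChairsV2 frames
features = encoder(images)            # image -> feature space
reconstruction = decoder(features)    # feature space -> image
loss = F.mse_loss(reconstruction, images)
optimizer.zero_grad()
loss.backward()
optimizer.step()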

Training NLR++

Train an NLR++ model from scratch (without a meta-learned initialization) using the following command:

python experiment_scripts/train_sdf_ibr.py --config_filepath configs/nlrpp_dtu.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --checkpoint_img_encoder /path/to/pretrained/encdec

Note that we have uploaded our processed version of the DTU data here, and the NLR data can be found here.
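
For context, rendering a neural SDF requires finding ray-surface intersections (the repo's renderer lives in sdf_rendering.py). One standard technique for SDFs is sphere tracing; below is a simplified sketch with an analytic sphere standing in for the learned SDF:

import torch
import torch.nn.functional as F

def sdf(points):
    # Placeholder: analytic unit sphere standing in for the neural SDF.
    return points.norm(dim=-1, keepdim=True) - 1.0

def sphere_trace(origins, directions, num_steps=64):
    t = torch.zeros(origins.shape[0], 1)
    for _ in range(num_steps):
        points = origins + t * directions
        t = t + sdf(points)  # advance each ray by its distance to the surface
    return origins + t * directions  # approximate surface points

origins = torch.tensor([[0.0, 0.0, -3.0]]).repeat(4, 1)
directions = F.normalize(
    torch.randn(4, 3) + torch.tensor([0.0, 0.0, 3.0]), dim=-1)
print(sphere_trace(origins, directions))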

Meta-learned Initialization (MetaNLR++)

Meta-learn the initialization for the encoder, decoder, aggregation function, and neural SDF using the following command:

python experiment_scripts/train_sdf_ibr_meta.py --config_filepath configs/nlrpp_dtu_meta.txt --experiment_name XXX --dataset_path /path/to/dtu/meta/training --reference_view 24 --checkpoint_img_encoder /path/to/pretrained/encdec

Some optimized initializations for the DTU and NLR datasets can be found in the data directory. Additional models can be provided upon request.

Training MetaNLR++ from Initialization

Use the meta-learned initialization to specialize to a specific scene using the following command:

python experiment_scripts/test_sdf_ibr_meta.py --config_filepath configs/nlrpp_dtu_metaspec.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --reference_view 24 --meta_initialization /path/to/learned/meta/initialization
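
Specialization amounts to loading the meta-learned weights and continuing optimization on a single scene. A minimal, hypothetical illustration of that pattern (the placeholder model and checkpoint layout do not reflect the repo's actual formats):

import torch
import torch.nn as nn

# Hypothetical placeholder model and checkpoint layout; the actual modules
# and checkpoint format are defined by the repo's training scripts.
model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))
state = torch.load('/path/to/learned/meta/initialization', map_location='cpu')
model.load_state_dict(state)  # start from the meta-learned weights...

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# ...then run the usual per-scene optimization loop from this starting point.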

Evaluation

Evaluate the converged model on withheld views using the following command:

python experiment_scripts/test_sdf_ibr.py --config_filepath configs/nlrpp_dtu.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --checkpoint_path_test /path/to/checkpoint/to/evaluate
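
Reconstruction quality on withheld views is commonly summarized with PSNR. As a reference, here is a small sketch of its computation, assuming images are float tensors scaled to [0, 1]:

import torch

def psnr(rendered: torch.Tensor, target: torch.Tensor) -> float:
    # PSNR in dB for images with a peak value of 1.0.
    mse = torch.mean((rendered - target) ** 2)
    return float(10.0 * torch.log10(1.0 / mse))

rendered = torch.rand(3, 256, 256)  # stand-in for a rendered withheld view
target = torch.rand(3, 256, 256)    # stand-in for the ground-truth view
print(f"PSNR: {psnr(rendered, target):.2f} dB")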

Citation & Contact

If you find our work useful in your research, please cite

@inproceedings{bergman2021metanlr,
  author = {Bergman, Alexander W. and Kellnhofer, Petr and Wetzstein, Gordon},
  title = {Fast Training of Neural Lumigraph Representations using Meta Learning},
  booktitle = {NeurIPS},
  year = {2021},
}

If you have any questions, or would like access to specific ablations or baselines presented in the paper or supplement (the code released here is only a subset of the source code used to generate the results), please feel free to contact the authors. Alex can be contacted via e-mail at [email protected].
