PyTorch implementations of the NeRF model described in "NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis"

Overview

PyTorch NeRF and pixelNeRF

NeRF: Open NeRF in Colab

Tiny NeRF: Open Tiny NeRF in Colab

pixelNeRF: Open pixelNeRF in Colab

This repository contains minimal PyTorch implementations of the NeRF model described in "NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis" and the pixelNeRF model described in "pixelNeRF: Neural Radiance Fields from One or Few Images". While there are other PyTorch implementations out there (e.g., this one and this one for NeRF, and the authors' official implementation for pixelNeRF), I personally found them somewhat difficult to follow, so I decided to do a complete rewrite of NeRF myself. I tried to stay as close to the authors' text as possible, and I added comments in the code referring back to the relevant sections/equations in the paper. The final result is a tight 357 lines of heavily commented code (303 sloc—"source lines of code"—on GitHub) all contained in a single file. For comparison, this PyTorch implementation has approximately 970 sloc spread across several files, while this PyTorch implementation has approximately 905 sloc.
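To give a feel for the pieces the code comments point back to, here is a minimal, self-contained sketch of the positional encoding from Section 5.1 (Eq. 4) of the NeRF paper. It is not copied from run_nerf.py; the function name and defaults are only illustrative.

```python
import torch


def positional_encoding(x, num_freqs=10):
    """Map inputs to [sin(2^0 * pi * x), cos(2^0 * pi * x), ..., sin(2^(L-1) * pi * x), cos(2^(L-1) * pi * x)].

    Follows Eq. (4) of the NeRF paper; num_freqs plays the role of L
    (the paper uses L = 10 for positions and L = 4 for view directions).
    """
    freqs = 2 ** torch.arange(num_freqs, dtype=x.dtype, device=x.device) * torch.pi
    scaled = x[..., None] * freqs                                  # (..., 3, num_freqs)
    enc = torch.cat([torch.sin(scaled), torch.cos(scaled)], dim=-1)  # (..., 3, 2 * num_freqs)
    return enc.flatten(start_dim=-2)                               # (..., 3 * 2 * num_freqs)
```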

run_tiny_nerf.py trains a simplified NeRF model inspired by the "Tiny NeRF" example provided by the NeRF authors. This NeRF model does not use fine sampling and the MLP is smaller, but the code is otherwise identical to the full model code. At only 155 sloc, it might be a good place to start for people who are completely new to NeRF. If you prefer your code more object-oriented, check out run_nerf_alt.py and run_tiny_nerf_alt.py.
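To make the "smaller MLP" distinction concrete, below is a rough sketch of the kind of network the tiny model uses: positionally encoded points in, RGB and density out. The layer widths and the choice to drop view-direction conditioning are assumptions for illustration; the actual architecture in run_tiny_nerf.py may differ.

```python
import torch
import torch.nn as nn


class TinyNeRFMLP(nn.Module):
    """A small NeRF-style MLP: encoded 3D point in, (RGB, sigma) out.

    Hypothetical sizes; the real run_tiny_nerf.py model may use different
    widths/depths and may or may not condition on view direction.
    """

    def __init__(self, pos_enc_dim=60, hidden_dim=128):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(pos_enc_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 4),  # RGB + density
        )

    def forward(self, encoded_points):
        out = self.layers(encoded_points)
        rgb = torch.sigmoid(out[..., :3])  # colors in [0, 1]
        sigma = torch.relu(out[..., 3:])   # non-negative density
        return rgb, sigma
```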

A Colab notebook for the full model can be found here, while a notebook for the tiny model can be found here. The generate_nerf_dataset.py script was used to generate the training data of the ShapeNet car.

For the following test view:

run_nerf.py generated the following after 20,100 iterations (a few hours on a P100 GPU):

Loss: 0.00022201683896128088

while run_tiny_nerf.py generated the following after 19,600 iterations (~35 minutes on a P100 GPU):

Loss: 0.0004151524917688221

The advantages of streamlining NeRF's code become readily apparent when trying to extend NeRF. For example, training a pixelNeRF model only required making a few changes to run_nerf.py, bringing it to 370 sloc (notebook here). For comparison, the official pixelNeRF implementation has approximately 1,300 pixelNeRF-specific (i.e., not related to the image encoder or dataset) sloc spread across several files. The generate_pixelnerf_dataset.py script was used to generate the training data of ShapeNet cars.
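As a rough illustration of the kind of change involved (not the actual code in run_pixelnerf.py), pixelNeRF conditions the NeRF MLP on image features sampled from a source view: each query point is projected into the source image, and the encoder feature at that pixel is concatenated with the point's positional encoding. A hypothetical helper for that step might look like this; the projection convention (camera looking down -z) and the intrinsics handling are assumptions.

```python
import torch
import torch.nn.functional as F


def sample_image_features(feature_map, points_cam, focal, center):
    """Bilinearly sample per-point features from a source-view feature map.

    Hypothetical helper illustrating the pixelNeRF idea: project each 3D
    point (already expressed in the source camera's frame) onto the image
    plane and grab the encoder feature at that pixel.

    feature_map: (1, C, H, W) CNN features of the source image.
    points_cam:  (N, 3) query points in the source camera frame.
    focal, center: pinhole intrinsics (focal length, principal point).
    """
    # Perspective projection to pixel coordinates (camera looks down -z).
    uv = focal * points_cam[:, :2] / -points_cam[:, 2:3] + center  # (N, 2)
    H, W = feature_map.shape[-2:]
    # Normalize pixel coordinates to [-1, 1] for grid_sample.
    grid = torch.stack([2 * uv[:, 0] / (W - 1) - 1,
                        2 * uv[:, 1] / (H - 1) - 1], dim=-1)
    grid = grid.view(1, -1, 1, 2)
    feats = F.grid_sample(feature_map, grid, align_corners=True)  # (1, C, N, 1)
    # (N, C); these features would be concatenated with each point's encoding.
    return feats[0, :, :, 0].t()
```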

For the following source object and view:

and target view:

run_pixelnerf.py generated the following after 73,243 iterations (~12 hours on a P100 GPU; the full pixelNeRF model was trained for 400,000 iterations, which took six days):

Loss: 0.004468636587262154

The "smearing" is an artifact caused by the bounding box sampling method.

Similarly, the code for training an "object-centric NeRF" (i.e., one where the object is rotated instead of the camera) is identical to run_tiny_nerf.py (notebook here). Rotating an object is equivalent to holding the object stationary and rotating both the camera and the lighting in the opposite direction, which is how the object-centric dataset is generated in generate_obj_nerf_dataset.py.
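A hypothetical sketch of that equivalence (the pose handling in generate_obj_nerf_dataset.py may look different): spinning the object by an angle theta about the world z-axis is the same as keeping the object fixed and left-multiplying the camera-to-world pose by the inverse rotation.

```python
import numpy as np


def object_rotation_to_camera_pose(c2w, theta):
    """Convert an object rotation into the equivalent camera rotation.

    Illustrative only. c2w is a (4, 4) camera-to-world pose and theta is
    the object's rotation angle (radians) about the world z-axis; the
    returned pose views the stationary object from the equivalent angle.
    """
    c, s = np.cos(-theta), np.sin(-theta)
    rot_inv = np.array([
        [c, -s, 0.0, 0.0],
        [s,  c, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])
    return rot_inv @ c2w
```

The same inverse rotation would be applied to the light direction so that shading stays consistent with the original object-rotation setup.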

For the following test view:

run_tiny_obj_nerf.py generated the following after 19,400 iterations (~35 minutes on a P100 GPU):

Loss: 0.0005469498573802412
