Generic template to bootstrap your PyTorch project with PyTorch Lightning, Hydra, W&B, and DVC.

Overview

NN Template

PyTorch Lightning · Conf: hydra · Logging: wandb · Code style: black

Generic template to bootstrap your PyTorch project. Click on Use this Template and avoid writing boilerplate code for:

  • PyTorch Lightning, lightweight PyTorch wrapper for high-performance AI research.
  • Hydra, a framework for elegantly configuring complex applications.
  • DVC, track large files, directories, or ML models. Think "Git for data".
  • Weights and Biases, organize and analyze machine learning experiments. (educational account available)

nn-template is opinionated so you don't have to be. If you use this template, please add a reference to it in your README.

Usage Examples

Check out the mwe branch to view a minimum working example on MNIST.

Structure

.
├── conf                # Hydra compositional config
│   ├── default.yaml    # current experiment configuration
│   ├── data
│   ├── hydra
│   ├── logging
│   ├── model
│   ├── optim
│   └── train
├── data                # datasets
├── experiments         # local logs
├── README.md
├── requirements.txt    # basic requirements
└── src
    ├── common          # common Python modules
    ├── pl_data         # PyTorch Lightning datamodules and datasets
    ├── pl_modules      # PyTorch Lightning modules
    └── run.py          # entry point to run current conf

Data Version Control

DVC runs alongside git and uses the current commit hash to version control the data.

Initialize the dvc repository:

$ dvc init

To start tracking a file or directory, use dvc add:

$ dvc add data/ImageNet

DVC stores information about the added file (or a directory) in a special .dvc file named data/ImageNet.dvc, a small text file with a human-readable format. This file can be easily versioned like source code with Git, as a placeholder for the original data (which gets listed in .gitignore):

$ git add data/ImageNet.dvc data/.gitignore
$ git commit -m "Add raw data"

Making changes

When you make a change to a file or directory, run dvc add again to track the latest version:

$ dvc add data/ImageNet
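
Then, as before, commit the updated .dvc file so that the new data version is tied to a Git commit (the commit message here is just an example):

$ git add data/ImageNet.dvc
$ git commit -m "Update dataset"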

Switching between versions

The regular workflow is to use git checkout first (to switch to a branch, or to check out a specific commit or revision of a .dvc file), and then run dvc checkout to sync the data:

$ git checkout <...>
$ dvc checkout

Read more in the docs!

Weights and Biases

Weights & Biases helps you keep track of your machine learning projects. Use tools to log hyperparameters and output metrics from your runs, then visualize and compare results and quickly share findings with your colleagues.

This is an example of a simple dashboard.

Quickstart

Log in to your wandb account by running wandb login once. Configure the logging in conf/logging/*.


Read more in the docs. The log method is particularly useful; it is accessible from inside a PyTorch Lightning module via self.logger.experiment.log.
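
For illustration only, a minimal sketch of a LightningModule that logs through both channels; the class name, metric names, and sizes below are hypothetical and not part of the template:

import pytorch_lightning as pl
import torch
from torch import nn
from torch.nn import functional as F


class LitClassifier(pl.LightningModule):
    """Hypothetical module: logs via self.log and via the raw wandb run."""

    def __init__(self, in_features: int = 28 * 28, num_classes: int = 10):
        super().__init__()
        self.model = nn.Linear(in_features, num_classes)

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self.model(x.flatten(start_dim=1))
        loss = F.cross_entropy(logits, y)
        # Standard Lightning logging: works with any configured logger.
        self.log("train/loss", loss)
        # Direct access to the underlying wandb run for custom payloads.
        self.logger.experiment.log({"train/batch_size": float(x.shape[0])})
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)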

W&B is our logger of choice, but that is a purely subjective decision. Since we are using Lightning, you can replace wandb with the logger you prefer (you can even build your own). More about Lightning loggers here.

Hydra

Hydra is an open-source Python framework that simplifies the development of research and other complex applications. The key feature is the ability to dynamically create a hierarchical configuration by composition and override it through config files and the command line. The name Hydra comes from its ability to run multiple similar jobs - much like a Hydra with multiple heads.

The basic functionalities are intuitive: it is enough to change the configuration files in conf/* according to your preferences. Everything will be logged to wandb automatically.

Consider creating new root configurations conf/myawesomeexp.yaml instead of always using the default conf/default.yaml.
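
With recent Hydra versions you can select an alternative root configuration directly from the command line through the --config-name flag (assuming src/run.py is the entry point, as in the structure above):

PYTHONPATH=. python src/run.py --config-name myawesomeexp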

Sweeps

You can easily perform hyperparameter sweeps, which override the configuration defined in /conf/*.

The easiest is a grid search, which executes the code with every possible combination of the specified hyperparameters:

PYTHONPATH=. python src/run.py -m optim.optimizer.lr=0.02,0.002,0.0002 optim.lr_scheduler.T_mult=1,2 optim.optimizer.weight_decay=0,1e-5

You can explore aggregate statistics or compare and analyze each run in the W&B dashboard.


We recommend going through at least the Basic Tutorial, and the docs on Instantiating objects with Hydra.
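
As a rough, standalone sketch of what instantiating objects from a Hydra config looks like (it reuses the optim.optimizer key from the sweep example above; the config_path and the exact layout of your configuration are assumptions):

import hydra
import torch
from omegaconf import DictConfig, OmegaConf


@hydra.main(config_path="../conf", config_name="default")
def main(cfg: DictConfig) -> None:
    # Print the fully composed configuration.
    print(OmegaConf.to_yaml(cfg))

    # Assuming cfg.optim.optimizer contains something like:
    #   _target_: torch.optim.Adam
    #   lr: 0.001
    model = torch.nn.Linear(8, 1)
    optimizer = hydra.utils.instantiate(cfg.optim.optimizer, params=model.parameters())
    print(optimizer)


if __name__ == "__main__":
    main()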

PyTorch Lightning

Lightning makes coding complex networks simple. It is not a high-level framework like Keras, but it enforces a neat code organization and encapsulation.

You should be somewhat familiar with PyTorch and PyTorch Lightning before using this template.

Environment Variables

System specific variables (e.g. absolute paths to datasets) should not be under version control, otherwise there will be conflicts between different users.

The best way to handle system specific variables is through environment variables.

You can define new environment variables in a .env file in the project root. A copy of this file (e.g. .env.template) can be under version control to ease new project configurations.

To define a new variable, write inside .env:

export MY_VAR=/home/user/my_system_path

You can dynamically resolve the variable from Python code with:

get_env('MY_VAR')

and in the Hydra .yaml configuration files with:

${env:MY_VAR}
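
For illustration, a minimal sketch of what such a get_env helper might look like, assuming the python-dotenv package is installed; the template's own helper (under src/common) may be implemented differently:

import os

from dotenv import load_dotenv


def get_env(env_name: str) -> str:
    # Load the project .env file (if present) and return the requested variable.
    load_dotenv()
    value = os.environ.get(env_name)
    if value is None:
        raise KeyError(f"Environment variable {env_name} is not defined")
    return value


if __name__ == "__main__":
    print(get_env("MY_VAR"))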
Comments
  • Curious if you checked out DAGsHub

    Hi @lucmos, this looks like an awesome repo. I stumbled on it while doing some research on project templates for ML projects. I'm one of the creators of DAGsHub which is a platform built on Git, DVC, and MLflow. It integrates with GitHub and provides a free DVC remote and MLflow server so that you can track experiments and share your data & models in one UI.

    Here's an example project to showcase the abilities: https://dagshub.com/OperationSavta/SavtaDepth

    It seems really in line with what you're creating here, and I would love to hear your thoughts about it.

    opened by deanp70 4
  • Streamlit UI - Weights and Biases login

    The template is really awesome.

    I had a small issue: when I run the Streamlit UI without being logged in to Weights and Biases, the UI just hangs in the loading state without giving me any feedback about what is happening, so I have to log in to wandb manually from the console first. Is there any way to solve this issue? For example, could the UI give feedback when I'm not logged in?

    Thanks!

    opened by andreim14 1
  • load_model

    Hi, this issue concerns the function from nn_core.serialization import load_model. Suppose I train a PyTorch model with class MyLightningModule and save the checkpoint to model_path. Suppose now that the class MyLightningModule receives some minor changes, e.g. a new class variable is added; let's call this version MyLightningModuleV2. When I load a model using this function, like:

    self.model = load_model(module_class=MyLightningModuleOld, checkpoint_path=Path(model_path), map_location=self.device).to(self.device).eval()

    I get an error because the checkpoint refers to the model of class MyLightningModule, and therefore the new variable is (obviously) missing. To make it work, I need to load the model with the old version of the class, that is, MyLightningModule, and then manually set "model.new_variable" to the value I want, like the following:

    self.model = load_model(module_class=MyLightningModuleOld, checkpoint_path=Path(model_path), map_location=self.device).to(self.device).eval()
    self.model.new_variable = False
    

    It would be nice to have this option in the load_model function to avoid creating multiple versions of the same class.

    opened by framolfese 0
  • PyTorch Lightning EcoCI integration to check for compatibility with latest & upcoming releases

    Hey Valentino & Luca,

    I am just catching up with some bookmarks and remembered your repo here :). As someone who constantly fusses about the ideal project structure, I find this actually pretty cool. I have been using an adapted version of the data science cookiecutter for generic ML projects, but nothing as sophisticated as this, with code stubs.

    Haven't thoroughly played with it yet, though, besides creating an example folder and looking at the pl_module.py and datamodule.py files, which look good to me!

    In any case, long story short, I was wondering if you'd be interested in PyTorch Lightning's ecosystem CI to make sure that the template stays fresh and relevant with respect to upcoming version releases (it comes with free CPU and multi-GPU CI tests): https://devblog.pytorchlightning.ai/stay-ahead-of-breaking-changes-with-the-new-lightning-ecosystem-ci-b7e1cf78a6c7

    If you are interested in that, I am sure my colleague @Borda would be happy to assist with questions & technical details -- he built this thing, so he probably knows best :)

    opened by rasbt 4
Releases
  • 0.2.3 (Dec 15, 2022)

    What's Changed

    • Bump dependency versions by @lucmos in https://github.com/grok-ai/nn-template/pull/79
    • Version 0.2.3 by @lucmos in https://github.com/grok-ai/nn-template/pull/80

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.2.2...0.2.3

  • 0.2.2 (Jun 13, 2022)

    What's Changed

    • Update README.md by @Flegyas in https://github.com/grok-ai/nn-template/pull/70
    • Improve documentation by @Flegyas in https://github.com/grok-ai/nn-template/pull/71
    • Update documentation by @Flegyas in https://github.com/grok-ai/nn-template/pull/72
    • Add asciinema gif in the README and docs by @lucmos in https://github.com/grok-ai/nn-template/pull/74
    • Add papers by @lucmos in https://github.com/grok-ai/nn-template/pull/76
    • Update precommits versions by @lucmos in https://github.com/grok-ai/nn-template/pull/75
    • Version 0.2.2 by @lucmos in https://github.com/grok-ai/nn-template/pull/77

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.2.1...0.2.2

  • 0.2.1 (Mar 1, 2022)

    Changelog for nn-template 0.2.1 (2022-03-01)

    What's Changed

    • Fix status badge in the documentation by @lucmos in https://github.com/grok-ai/nn-template/pull/64
    • Minor fixes post release by @lucmos in https://github.com/grok-ai/nn-template/pull/65
    • Fix typos in the documentation by @mikcnt in https://github.com/grok-ai/nn-template/pull/67
    • Fix broken relative links due to mike root folder by @lucmos in https://github.com/grok-ai/nn-template/pull/68
    • Version 0.2.1 by @lucmos in https://github.com/grok-ai/nn-template/pull/69

    New Contributors

    • @mikcnt made their first contribution in https://github.com/grok-ai/nn-template/pull/67

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.2.0...0.2.1

  • 0.2.0 (Mar 1, 2022)

    We are very pleased to present NN Template 0.2.0!

    Changelog for nn-template 0.2.0 (2022-03-01)

    Summary

    • Cookiecutter parametrization
    • CI/CD Integration via GitHub Actions
    • Automate testing of your projects
    • Logic decoupling thanks to nn-template-core
    • Advanced restore options for trainings
    • Documentation website
    • Support for Python logging (with colors!)

    What's Changed

    • Refactor configuration by @lucmos in https://github.com/grok-ai/nn-template/pull/8
    • Refactor project to a python package by @lucmos in https://github.com/grok-ai/nn-template/pull/10
    • Add tooling configuration by @lucmos in https://github.com/grok-ai/nn-template/pull/9
    • Refactor codebase to be compliant to the pre-commits by @lucmos in https://github.com/grok-ai/nn-template/pull/11
    • Refactor the project root management by @lucmos in https://github.com/grok-ai/nn-template/pull/12
    • Added wandb to .gitignore by @Flegyas in https://github.com/grok-ai/nn-template/pull/14
    • Refactor logging by @lucmos in https://github.com/grok-ai/nn-template/pull/15
    • Enable pin-memory if not on CPU by @lucmos in https://github.com/grok-ai/nn-template/pull/16
    • Factor our PyTorch Module from the Lightning Module by @lucmos in https://github.com/grok-ai/nn-template/pull/17
    • Force the .cache folder to be in the PROJECT_ROOT by @lucmos in https://github.com/grok-ai/nn-template/pull/19
    • Add the configuration to the Lightning checkpoints by @lucmos in https://github.com/grok-ai/nn-template/pull/20
    • Use extend-ignore instead of ignore in .flake8 by @lucmos in https://github.com/grok-ai/nn-template/pull/21
    • Fix formatting by @lucmos in https://github.com/grok-ai/nn-template/pull/22
    • Log the code used in the current experiment to wandb by @lucmos in https://github.com/grok-ai/nn-template/pull/18
    • Functionalities decoupling via external library (nn-core). by @Flegyas in https://github.com/grok-ai/nn-template/pull/23
    • Add tests by @lucmos in https://github.com/grok-ai/nn-template/pull/24
    • Implement resuming behaviour by @lucmos in https://github.com/grok-ai/nn-template/pull/25
    • Refactor NNLogger usages by @lucmos in https://github.com/grok-ai/nn-template/pull/27
    • Add CI on pre-commits and tests by @lucmos in https://github.com/grok-ai/nn-template/pull/26
    • Remove some trigger from the Test Suite workflow by @lucmos in https://github.com/grok-ai/nn-template/pull/28
    • Overwrite Lightning logging configuration by @lucmos in https://github.com/grok-ai/nn-template/pull/29
    • Ensure tags are defined asking interactively for them by @lucmos in https://github.com/grok-ai/nn-template/pull/30
    • Introduce the seed index concept by @lucmos in https://github.com/grok-ai/nn-template/pull/31
    • Force execution of init.py on direct execution by @lucmos in https://github.com/grok-ai/nn-template/pull/33
    • Move functions from template to core by @lucmos in https://github.com/grok-ai/nn-template/pull/34
    • Add functionality to upload the run files in the storage to wandb by @lucmos in https://github.com/grok-ai/nn-template/pull/35
    • Move ui_utils entirely to nn-core by @lucmos in https://github.com/grok-ai/nn-template/pull/36
    • Add dynamic parametrized badges for the Test Suite and docs by @lucmos in https://github.com/grok-ai/nn-template/pull/45
    • Fix files hashing in workflow cache keys by @lucmos in https://github.com/grok-ai/nn-template/pull/46
    • Add seed_index determinism test by @lucmos in https://github.com/grok-ai/nn-template/pull/44
    • Refactor references to organization name into grok-ai by @lucmos in https://github.com/grok-ai/nn-template/pull/48
    • Push the default version in mike on release by @lucmos in https://github.com/grok-ai/nn-template/pull/49
    • Improve docs status badge to monitor the github-pages environment by @lucmos in https://github.com/grok-ai/nn-template/pull/50
    • Fix mike rebasing and pushing logic on release by @lucmos in https://github.com/grok-ai/nn-template/pull/51
    • Add a DAG in the post hook interactive setup by @lucmos in https://github.com/grok-ai/nn-template/pull/47
    • Skip test if no dataset is provided by @Flegyas in https://github.com/grok-ai/nn-template/pull/52
    • Fix remote parametrization in the README by @lucmos in https://github.com/grok-ai/nn-template/pull/53
    • Fix type hint in dataset.py by @lucmos in https://github.com/grok-ai/nn-template/pull/55
    • Improve the "add git remote" message in the post hook by @lucmos in https://github.com/grok-ai/nn-template/pull/54
    • Update nn-template-core dependency to 0.0.7 by @lucmos in https://github.com/grok-ai/nn-template/pull/56
    • Update docs by @lucmos in https://github.com/grok-ai/nn-template/pull/57
    • Add custom collate function by @Flegyas in https://github.com/grok-ai/nn-template/pull/58
    • Set metadata as a cached property in DataModule by @Flegyas in https://github.com/grok-ai/nn-template/pull/59
    • Pass run tags to the WandbLogger by @Flegyas in https://github.com/grok-ai/nn-template/pull/60
    • Feature/bump core by @Flegyas in https://github.com/grok-ai/nn-template/pull/61
    • Version 0.2.0 by @Flegyas in https://github.com/grok-ai/nn-template/pull/62

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.1.0...0.2.0

Owner
Luca Moschella
PhD student at University of Rome La Sapienza in Computer Science.