SAVI2I: Continuous and Diverse Image-to-Image Translation via Signed Attribute Vectors

License: CC BY-NC-SA 4.0 · Python 3.6

[Paper] [Project Website]

PyTorch implementation of SAVI2I. We propose a simple yet effective signed attribute vector (SAV) that facilitates continuous translation along diverse mapping paths across multiple domains.
For more video results, please see our webpage.
Contact: Qi Mao ([email protected])
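
As a quick intuition for what the SAV enables, continuous translation amounts to decoding a fixed content feature with attribute vectors interpolated between the source and target domains. The snippet below is a minimal conceptual sketch, not the released SAVI2I code: content_encoder, generator, and the signed attribute vectors sav_src / sav_trg are hypothetical placeholders.

# Conceptual sketch only: these names are placeholders, not APIs from this repository.
import torch

def continuous_translate(content_encoder, generator, x_src, sav_src, sav_trg, steps=8):
    content = content_encoder(x_src)                     # domain-invariant content feature
    frames = []
    for alpha in torch.linspace(0.0, 1.0, steps):
        sav = (1.0 - alpha) * sav_src + alpha * sav_trg  # walk the path between signed attribute vectors
        frames.append(generator(content, sav))           # decode one intermediate translation
    return frames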

Paper

Continuous and Diverse Image-to-Image Translation via Signed Attribute Vectors
Qi Mao, Hsin-Ying Lee, Hung-Yu Tseng, Jia-Bin Huang, Siwei Ma, and Ming-Hsuan Yang
arXiv preprint, 2020

Citation

If you find this work useful for your research, please cite our paper:

    @article{mao2020continuous,
      author  = "Mao, Qi and Lee, Hsin-Ying and Tseng, Hung-Yu and Huang, Jia-Bin and Ma, Siwei and Yang, Ming-Hsuan",
      title   = "Continuous and Diverse Image-to-Image Translation via Signed Attribute Vectors",
      journal = "arXiv preprint arXiv:2011.01215",
      year    = "2020"
    }

Quick Start

Prerequisites

  • Linux or Windows
  • Python 3+
  • We suggest using two P100 16GB GPUs or one V100 32GB GPU.

Install

  • Clone this repo:
git clone https://github.com/HelenMao/SAVI2I.git
cd SAVI2I
  • This code requires PyTorch 0.4.0+ and Python 3+. Please install the dependencies with:
conda create -n SAVI2I python=3.6
source activate SAVI2I
pip install -r requirements.txt 

Training Datasets

Download the datasets for each task into the datasets folder:

./datasets
  • Style translation: Yosemite (summer <-> winter) and Photo2Artwork (Photo, Monet, Van Gogh, and Ukiyo-e)
      • You can follow the instructions for the CycleGAN datasets to download the Yosemite and Photo2artwork datasets.
  • Shape-variation translation: CelebA-HQ (Male <-> Female) and AFHQ (Cat, Dog, and Wildlife)
      • We split CelebA-HQ into male and female domains according to the annotated labels and manually fine-tune the images (a rough sketch of such a split is shown after this list).
      • You can follow the instructions for the StarGAN-v2 datasets to download the CelebA-HQ and AFHQ datasets.
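
The CelebA-HQ gender split described above can be approximated with a short script. This is a minimal sketch, not code from this repository: it assumes the standard CelebA annotation file list_attr_celeba.txt and that your image filenames match its entries; the output folder names are arbitrary, and the manual fine-tuning mentioned above is not covered.

# Hypothetical helper (not part of SAVI2I): split images into male/female folders
# using the "Male" column of CelebA's list_attr_celeba.txt.
import os
import shutil

def split_by_gender(img_dir, attr_file, out_dir):
    with open(attr_file) as f:
        f.readline()                              # line 1: number of images
        attrs = f.readline().split()              # line 2: the 40 attribute names
        male_idx = attrs.index("Male")
        for line in f:                            # remaining lines: filename + +/-1 values
            parts = line.split()
            fname, values = parts[0], parts[1:]
            domain = "male" if values[male_idx] == "1" else "female"
            src = os.path.join(img_dir, fname)
            if os.path.isfile(src):               # skip entries without a matching image
                dst = os.path.join(out_dir, domain)
                os.makedirs(dst, exist_ok=True)
                shutil.copy(src, dst)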

Training

Notes

For low-level style translation tasks, we suggest setting --type=1 to use the corresponding network architectures.
For shape-variation translation tasks, we suggest setting --type=0 to use the corresponding network architectures.

  • Yosemite
python train.py --dataroot ./datasets/Yosemite/ --phase train --type 1 --name Yosemite --n_ep 700 --n_ep_decay 500 --lambda_r1 10 --lambda_mmd 1 --num_domains 2
  • Photo2artwork
python train.py --dataroot ./datasets/Photo2artwork/ --phase train --type 1 --name Photo2artwork --n_ep 100 --n_ep_decay 0 --lambda_r1 10 --lambda_mmd 1 --num_domains 4
  • CelebAHQ
python train.py --dataroot ./datasets/CelebAHQ/ --phase train --type 0 --name CelebAHQ --n_ep 30 --n_ep_decay 0 --lambda_r1 1 --lambda_mmd 1 --num_domains 2
  • AFHQ
python train.py --dataroot ./datasets/AFHQ/ --phase train --type 0 --name AFHQ --n_ep 100 --n_ep_decay 0 --lambda_r1 1 --lambda_mmd 10 --num_domains 3

Pre-trained Models

Download the pre-trained models and save them into

./models

or download them with the following script:

bash ./download_models.sh

Testing

Reference-guided

python test_reference_save.py --dataroot ./datasets/CelebAHQ --resume ./models/CelebAHQ/00029.pth --phase test --type 0 --num_domains 2 --index_s A --index_t B --num 5 --name CelebAHQ_ref  

Latent-guided

python test_latent_rdm_save.py --dataroot ./datasets/CelebAHQ --resume ./models/CelebAHQ/00029.pth --phase test --type 0 --num_domains 2 --index_s A --index_t B --num 5 --name CelebAHQ_rdm  

License

All rights reserved.
Licensed under the CC BY-NC-SA 4.0 (Attribution-NonCommercial-ShareAlike 4.0 International).
The code is for academic research use only. For commercial use, please contact [email protected].

Acknowledgements

The code and network architectures are inspired by:

Owner
Qi Mao
PhD student at the Institute of Digital Media, Peking University.