Neural Architecture Search with Random Labels (RLNAS)

Introduction

This project provides an implementation of Neural Architecture Search with Random Labels (CVPR 2021 poster) in PyTorch. Experiments are evaluated on multiple datasets (NAS-Bench-201 and ImageNet) and multiple search spaces (DARTS-like and MobileNet-like). RLNAS achieves comparable or even better results than state-of-the-art NAS methods such as PC-DARTS and Single Path One-Shot, even though those counterparts use full ground-truth labels for searching. We hope our findings can inspire new understandings of the essence of NAS.

Requirements

  • PyTorch 1.4
  • Python 3.5+

Search results

1. Results in NAS-Bench-201 search space

nas_201_results

2. Results in DARTS search space

darts_search_sapce_results

Architecture visualization

1) Architecture searched on CIFAR-10

  • RLDARTS = Genotype(
    normal=[
    ('sep_conv_5x5', 0), ('sep_conv_3x3', 1),
    ('dil_conv_3x3', 0), ('sep_conv_5x5', 2),
    ('sep_conv_3x3', 0), ('dil_conv_5x5', 3),
    ('dil_conv_5x5', 1), ('dil_conv_3x3', 2)],
    normal_concat=[2, 3, 4, 5],
    reduce=[
    ('sep_conv_5x5', 0), ('dil_conv_3x3', 1),
    ('sep_conv_3x3', 0), ('sep_conv_5x5', 2),
    ('dil_conv_3x3', 1), ('sep_conv_3x3', 3),
    ('max_pool_3x3', 1), ('sep_conv_5x5', 2)],
    reduce_concat=[2, 3, 4, 5])
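
For reference, each pair in these genotypes is (operation, input node index), and the container follows the Genotype convention of the DARTS codebase; a minimal sketch of its definition:

from collections import namedtuple

# DARTS-style genotype container: 'normal'/'reduce' list the
# (operation, input node index) pairs of each cell, and the *_concat
# fields give the intermediate nodes whose outputs are concatenated.
Genotype = namedtuple('Genotype', 'normal normal_concat reduce reduce_concat')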

  • Normal cell: architecture_searched_on_cifar10

  • Reduction cell: architecture_searched_on_cifar10

2) Architecture searched on ImageNet-1k without FLOPs constraint

  • RLDARTS = Genotype(
    normal=[
    ('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
    ('sep_conv_3x3', 1), ('sep_conv_3x3', 2),
    ('sep_conv_3x3', 0), ('sep_conv_5x5', 1),
    ('sep_conv_3x3', 0), ('sep_conv_3x3', 1)],
    normal_concat=[2, 3, 4, 5],
    reduce=[
    ('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
    ('sep_conv_5x5', 0), ('sep_conv_3x3', 2),
    ('sep_conv_5x5', 0), ('sep_conv_5x5', 2),
    ('sep_conv_3x3', 2), ('sep_conv_3x3', 4)],
    reduce_concat=[2, 3, 4, 5])

  • Normal cell: architecture_searched_on_imagenet_no_flops_constrain

  • Reduction cell: architecture_searched_on_cifar10

3) Architecture searched on ImageNet-1k with 600M FLOPs constraint

  • RLDARTS = Genotype(
    normal=[
    ('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
    ('skip_connect', 1), ('sep_conv_3x3', 2),
    ('sep_conv_3x3', 1), ('sep_conv_3x3', 2),
    ('skip_connect', 0), ('sep_conv_3x3', 4)],
    normal_concat=[2, 3, 4, 5],
    reduce=[
    ('sep_conv_3x3', 0), ('max_pool_3x3', 1),
    ('sep_conv_3x3', 0), ('skip_connect', 1),
    ('sep_conv_3x3', 0), ('dil_conv_3x3', 1),
    ('skip_connect', 0), ('sep_conv_3x3', 1)],
    reduce_concat=[2, 3, 4, 5])

  • Normal cell: architecture_searched_on_imagenet_no_flops_constrain

  • Reduction cell: architecture_searched_on_cifar10

3. Results in MobileNet search space

The MobileNet-like search space proposed in ProxylessNAS is adopted in this paper. The SuperNet contains 21 choice blocks, and each block has 7 alternatives: 6 MobileNet blocks (combinations of kernel size {3, 5, 7} and expand ratio {3, 6}) and 'skip-connect'.
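
As a minimal sketch (an illustration, not the repository's exact code), the per-block choice set described above can be enumerated as follows:

from itertools import product

# 6 MobileNet (MBConv) blocks: kernel size {3, 5, 7} x expand ratio {3, 6},
# plus 'skip_connect', giving 7 alternatives per choice block.
KERNEL_SIZES = [3, 5, 7]
EXPAND_RATIOS = [3, 6]
CHOICES = ['mbconv_k%d_e%d' % (k, e) for k, e in product(KERNEL_SIZES, EXPAND_RATIOS)] + ['skip_connect']

NUM_BLOCKS = 21
assert len(CHOICES) == 7  # the SuperNet spans 7 ** 21 candidate paths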

mobilenet_search_sapce_results

Architecture visualization

mobilenet_search_sapce_results

Usage

  • RLNAS in NAS-Bench-201

1) Enter the work directory

cd nas_bench_201

2) Train the supernet with random labels

bash ./scripts-search/algos/train_supernet.sh cifar10 0 1
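
Here "random labels" means each training image is paired with a label drawn uniformly at random once before training, rather than its ground-truth class. A minimal PyTorch sketch of the idea (the wrapper name and details are hypothetical, not the repository's exact dataset code):

import torch
from torch.utils.data import Dataset

class RandomLabelDataset(Dataset):
    """Wrap a dataset so every sample keeps its image but gets a fixed random label."""
    def __init__(self, base_dataset, num_classes, seed=0):
        self.base = base_dataset
        g = torch.Generator().manual_seed(seed)
        # Draw one random label per sample up front so the labels stay fixed
        # across epochs (shuffled labels, not fresh per-batch noise).
        self.labels = torch.randint(0, num_classes, (len(base_dataset),), generator=g)

    def __len__(self):
        return len(self.base)

    def __getitem__(self, idx):
        image, _ = self.base[idx]
        return image, self.labels[idx].item()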

3) Evolution search with angle

bash ./scripts-search/algos/evolution_search_with_angle.sh cifar10 0 1

4) Calculate correlation

bash ./scripts-search/algos/cal_correlation.sh cifar10 0 1
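
The correlation step measures how well the angle-based ranking agrees with the ground-truth accuracy ranking from NAS-Bench-201. A minimal sketch with hypothetical placeholder scores, using Kendall's tau from scipy:

from scipy.stats import kendalltau

# Placeholder values for illustration: angle scores of a few candidate
# architectures and their ground-truth accuracies queried from NAS-Bench-201.
angle_scores = [0.12, 0.45, 0.33, 0.80, 0.27]
gt_accuracies = [88.1, 91.5, 90.2, 93.4, 89.7]

tau, p_value = kendalltau(angle_scores, gt_accuracies)
print('Kendall tau = %.3f (p = %.3g)' % (tau, p_value))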
  • RLNAS in DARTS search space

1) Enter the work directory

cd darts_search_space

search architecture on CIFAR-10

cd cifar10/rlnas/

or search architecture on ImageNet

cd imagenet/rlnas/

2) Train the supernet with random labels

cd train_supernet
bash run_train.sh

3) Evolution search with angle

cd evolution_search
cp ../train_supernet/models/checkpoint_epoch_50.pth.tar ./model_and_data/
cp ../train_supernet/models/checkpoint_epoch_0.pth.tar ./model_and_data/
bash run_server.sh
bash run_test.sh
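
The two checkpoints copied above correspond to the supernet at initialization (epoch 0) and after training (epoch 50); the angle metric ranks a candidate by the angle between the weight vectors it selects from these two checkpoints. A minimal sketch of the idea (an illustration, not the repository's exact code):

import torch

def architecture_angle(init_state, trained_state, selected_param_names):
    """Angle between a candidate's selected weights at epoch 0 and epoch 50."""
    # Concatenate the flattened weights of the operations this candidate
    # picks from the supernet, from both checkpoints.
    v0 = torch.cat([init_state[name].flatten() for name in selected_param_names])
    v1 = torch.cat([trained_state[name].flatten() for name in selected_param_names])
    cos = torch.dot(v0, v1) / (v0.norm() * v1.norm())
    return torch.acos(cos.clamp(-1.0, 1.0)).item()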

4) Architecture evaluation

cd retrain_architetcure

Add the searched architecture to genotypes.py

bash run_retrain.sh
  • RLNAS in MobileNet search space

The commands follow almost the same steps as RLNAS in the DARTS search space, except that you need to run 'bash run_generate_flops_lookup_table.sh' before the evolution search.
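
The lookup table lets the evolution search check the FLOPs constraint cheaply: it maps each (block index, choice) to that block's FLOPs, so a candidate's total is just a sum. A hypothetical sketch (the table format in the repository may differ):

FLOPS_LIMIT = 600e6  # 600M FLOPs constraint

def candidate_flops(candidate, flops_table, base_flops=0):
    """candidate: list of per-block choice names; flops_table: dict[(block_idx, choice)] -> flops."""
    return base_flops + sum(flops_table[(i, choice)] for i, choice in enumerate(candidate))

def satisfies_constraint(candidate, flops_table, base_flops=0):
    # Candidates over the limit are filtered out before evaluation.
    return candidate_flops(candidate, flops_table, base_flops) <= FLOPS_LIMIT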

Note: set up a server for the distributed search

tmux new -s mq_server
sudo apt update
sudo apt install rabbitmq-server
sudo service rabbitmq-server start
sudo rabbitmqctl add_user test test
sudo rabbitmqctl set_permissions -p / test '.*' '.*' '.*'

Before searching, please modify the host and username in the config file evolution_search/config.py.
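
The exact variable names depend on the repository's evolution_search/config.py; a hypothetical example of the kind of edit meant here, matching the RabbitMQ user created above:

# evolution_search/config.py (hypothetical field names -- check the actual file)
host = '127.0.0.1'   # address of the machine running rabbitmq-server
username = 'test'    # the RabbitMQ user added with rabbitmqctl above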

Citation

If you find that this project helps your research, please consider citing some of the following papers:

@inproceedings{zhang2021neural,
  title={Neural Architecture Search with Random Labels},
  author={Zhang, Xuanyang and Hou, Pengfei and Zhang, Xiangyu and Sun, Jian},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2021}
}
@inproceedings{hu2020angle,
  title={Angle-based search space shrinking for neural architecture search},
  author={Hu, Yiming and Liang, Yuding and Guo, Zichao and Wan, Ruosi and Zhang, Xiangyu and Wei, Yichen and Gu, Qingyi and Sun, Jian},
  booktitle={European Conference on Computer Vision},
  pages={119--134},
  year={2020},
  organization={Springer}
}
@inproceedings{guo2020single,
  title={Single path one-shot neural architecture search with uniform sampling},
  author={Guo, Zichao and Zhang, Xiangyu and Mu, Haoyuan and Heng, Wen and Liu, Zechun and Wei, Yichen and Sun, Jian},
  booktitle={European Conference on Computer Vision},
  pages={544--560},
  year={2020},
  organization={Springer}
}