Distance correlation and related E-statistics in Python

Overview

dcor


dcor: distance correlation and related E-statistics in Python.

E-statistics are functions of distances between statistical observations in metric spaces.

Distance covariance and distance correlation are dependency measures between random vectors introduced in [SRB07] with a simple E-statistic estimator.

This package offers functions for calculating several E-statistics such as:

  • Estimator of the energy distance [SR13].
  • Biased and unbiased estimators of distance covariance and distance correlation [SRB07].
  • Estimators of partial distance covariance and partial distance correlation [SR14].

It also provides tests based on these E-statistics:

  • Test of homogeneity based on the energy distance.
  • Test of independence based on distance covariance.
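
A quick usage sketch (a minimal example; the function names follow the documentation linked below, but exact signatures may vary slightly between versions):

    import numpy as np
    import dcor

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 3))                       # 100 observations of a 3-d random vector
    y = x ** 2 + rng.normal(scale=0.1, size=(100, 3))   # nonlinearly dependent on x

    # E-statistics
    dcor.distance_covariance(x, y)         # biased (V-statistic) estimator
    dcor.distance_correlation(x, y)
    dcor.u_distance_correlation_sqr(x, y)  # unbiased (U-statistic) estimator, squared
    dcor.energy_distance(x, y)

    # Hypothesis tests based on these statistics
    dcor.homogeneity.energy_test(x, y, num_resamples=200)
    dcor.independence.distance_covariance_test(x, y, num_resamples=200)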

Installation

dcor is available on PyPI and can be installed using pip:

pip install dcor

It is also available for conda using the conda-forge channel:

conda install -c conda-forge dcor

Previous versions of the package were published in the vnmabus channel. That channel will no longer be updated with new releases, so users are advised to switch to the conda-forge channel.

Requirements

dcor works on Python 3.5 or above, as well as Python 2.7, on all operating systems.

Documentation

The documentation can be found at https://dcor.readthedocs.io/en/latest/?badge=latest

References

[SR13] Gábor J. Székely and Maria L. Rizzo. Energy statistics: a class of statistics based on distances. Journal of Statistical Planning and Inference, 143(8):1249–1272, 2013. URL: http://www.sciencedirect.com/science/article/pii/S0378375813000633, doi:10.1016/j.jspi.2013.03.018.
[SR14] Gábor J. Székely and Maria L. Rizzo. Partial distance correlation with methods for dissimilarities. The Annals of Statistics, 42(6):2382–2412, 12 2014. doi:10.1214/14-AOS1255.
[SRB07] Gábor J. Székely, Maria L. Rizzo, and Nail K. Bakirov. Measuring and testing dependence by correlation of distances. The Annals of Statistics, 35(6):2769–2794, 12 2007. doi:10.1214/009053607000000505.
Comments
  • Is there a fast way of doing pairwise distance correlation (dcor.distance_correlation)

    Hi,

    I am trying to do a pairwise distance correlation for every column in a pandas dataframe of shape (1000, 10000): I want to compute the correlation of each column with every other column.

    If I run the following code:

    dist_corr = lambda column1, column2: dcor.distance_correlation(column1, column2)
    d_corr = df.apply(lambda col1: df.apply(lambda col2: dist_corr(col1, col2)))
    

    This takes far too long (many hours) and in some cases does not finish. Is there a more optimised implementation? Any advice would be much appreciated.
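
    One possible speed-up, sketched below under the assumption that the data fit comfortably in a NumPy array: convert the dataframe to NumPy once, compute only the upper triangle (distance correlation is symmetric), and parallelise the outer loop with joblib. Note that 10,000 columns still give roughly 5 x 10^7 pairs, so it will remain expensive.

    import numpy as np
    import pandas as pd
    import dcor
    from joblib import Parallel, delayed

    # Synthetic stand-in for the (1000, 10000) dataframe in the question.
    rng = np.random.default_rng(0)
    df = pd.DataFrame(rng.normal(size=(1000, 200)))

    cols = df.to_numpy(dtype=np.float64)
    n_feat = cols.shape[1]

    def upper_row(i):
        # Distance correlation is symmetric, so only j > i is needed.
        return [dcor.distance_correlation(cols[:, i], cols[:, j])
                for j in range(i + 1, n_feat)]

    rows = Parallel(n_jobs=-1)(delayed(upper_row)(i) for i in range(n_feat))

    # Assemble the symmetric matrix (a non-constant column has correlation 1 with itself).
    result = np.eye(n_feat)
    for i, vals in enumerate(rows):
        result[i, i + 1:] = vals
        result[i + 1:, i] = vals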

    thank you

    opened by amjass12 8
  • Numba energy permutation test

    Note: this builds on #27, and changes from that branch will appear here until that is merged.

    This re-implements most of the energy distance functions and permutation tests using numba. This provides significant performance improvements. I have some benchmarks below, which compare numba to pure Python (note: this isn't comparing numba to original code that used numpy tricks, it's comparing my changes with and without the JIT). The results suggest that numba improves performance for any number of permutations above 250. I expect this will also be true of multiple different permutation tests in the same program.

    [benchmark plot: numba vs. pure Python runtime against number of permutations]

    And with slightly higher limits: [benchmark plot]

    However, the costs are:

    • Some ugly numba workarounds, like re-implementing the permutation function using nested loops
    • We lose the ability to pass in arbitrary average functions; the average parameter is now a string, either mean or median
    • We lose the use of the RandomState object and have to rely on np.random.seed() alone
    • Startup costs associated with the JIT compiler: for fewer than 250 permutations, JIT compilation slows down the task.
    opened by multimeric 8
  • Counting the distance from a point to itself

    Hi, I've hit a bit of a problem. I was trying to work out why I was getting different results from the ecp R package versus dcor. After some intense investigation, I think the cause seems to be at the point of taking the mean of each within-sample distance. Note, this is before we apply the coefficient or consider the between-sample distances. Precisely, I'm referring to the mean taken here: https://github.com/vnmabus/dcor/blob/161a6f5928ec0f30ce89fcfd5e90e6ed9315e383/dcor/_energy.py#L41-L42

    In all the Székely and Rizzo papers (e.g. Székely & Rizzo, 2004), this mean is defined as the arithmetic mean over all n^2 entries of the within-sample distance matrix (diagonal included), the same as you have used in dcor: \frac{1}{n^2} \sum_{i,j} \|x_i - x_j\|

    However in the Matteson and James papers I have been looking at (e.g. Matteson & James, 2014; James et al., 2016), they seem to define it as follows:

    \frac{1}{\binom{n}{2}} \sum_{i < j} \|x_i - x_j\|

    What they seem to be doing here is summing the lower triangle of the matrix, excluding the diagonal, and then dividing by the combination n choose 2. So if we had a sample with 5 items, the full distance matrix would have 5 x 5 = 25 entries, but the lower triangle would only have 10 entries. They would sum these distances and divide by 5 choose 2, which is 10. So this is also a mean, but it is the mean excluding the diagonal, which is of course always 0 in a within-sample distance matrix. The ultimate outcome is that their "mean" is actually \frac{n}{n-1} \mu, which is larger than it should be, because it does not count the 0s on the diagonal.
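
    A quick numeric check of that factor (a sketch, using SciPy only to build a toy within-sample distance matrix):

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    x = np.random.default_rng(0).normal(size=(5, 1))
    d = squareform(pdist(x))   # 5 x 5 within-sample distance matrix, zero diagonal
    n = d.shape[0]

    full_mean = d.mean()                    # divides by n * n, as in dcor / Székely & Rizzo
    offdiag_mean = d.sum() / (n * (n - 1))  # divides by n * (n - 1), as in ecp / Matteson & James

    np.isclose(offdiag_mean, full_mean * n / (n - 1))  # True: they differ by the factor n / (n - 1)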

    Note that this is also visible in the implementation of their work, in the ecp package. Here, they sum the matrix but then divide by n \times (n - 1), which is equivalent to the above, but not equivalent to the true mean: https://github.com/zwenyu/ecp/blob/65a9bb56308d25ce3c6be4d6388137f428118248/src/energyChangePoint.cpp#L112

    My question is this: are they simply wrong? If not, is there any theory supporting this alternative formula? And if there is, should it be something supported in dcor? Fortunately it kind of already is, thanks to my customizable average feature, but it could be called out specifically. I appreciate your input here, as you likely understand this domain better than I do.

    opened by multimeric 8
  • __version__ returns 0.0. Version number is on a separate file

    Hello. Thank you for this very useful package. I need to query the version installed and check that it is >=0.5.3.

    In dcor/__init__.py:

    try:
        with open(_os.path.join(_os.path.dirname(__file__), '..', 'VERSION'), 'r') as version_file:
            __version__ = version_file.read().strip()
    except IOError as e:
        if e.errno != _errno.ENOENT:
            raise

    __version__ = "0.0"

    You are reading the version from the VERSION file, but at the end the version number is forced to 0.0 anyway. It always returns 0.0 when I do:

    import dcor
    print(dcor.__version__)
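
    A workaround until this is fixed (a sketch; it queries the installed distribution metadata instead of the broken attribute, and importlib.metadata needs Python 3.8 or later):

    from importlib.metadata import version

    print(version("dcor"))  # version reported by the installer, e.g. '0.5.3'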

    opened by sagarsimha 6
  • error in import dcor

    Hello,

    I installed the Python dcor package, and I got the following error whenever I tried to import dcor.

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/dcor/__init__.py", line 14, in <module>
        from . import independence  # noqa
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/dcor/independence.py", line 13, in <module>
        from ._dcor import u_distance_correlation_sqr
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/dcor/_dcor.py", line 27, in <module>
        from ._fast_dcov_avl import _distance_covariance_sqr_avl_generic
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/dcor/_fast_dcov_avl.py", line 89, in <module>
        _generate_partial_sum_2d(compiled=True))
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/decorators.py", line 200, in wrapper
        disp.compile(sig)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler_lock.py", line 32, in _acquire_compile_lock
        return func(*args, **kwargs)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/dispatcher.py", line 768, in compile
        cres = self._compiler.compile(args, return_type)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/dispatcher.py", line 81, in compile
        raise retval
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/dispatcher.py", line 91, in _compile_cached
        retval = self._compile_core(args, return_type)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/dispatcher.py", line 109, in _compile_core
        pipeline_class=self.pipeline_class)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler.py", line 551, in compile_extra
        return pipeline.compile_extra(func)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler.py", line 331, in compile_extra
        return self._compile_bytecode()
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler.py", line 393, in _compile_bytecode
        return self._compile_core()
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler.py", line 373, in _compile_core
        raise e
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler.py", line 364, in _compile_core
        pm.run(self.state)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler_machinery.py", line 347, in run
        raise patched_exception
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler_machinery.py", line 338, in run
        self._runPass(idx, pass_inst, state)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler_lock.py", line 32, in _acquire_compile_lock
        return func(*args, **kwargs)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler_machinery.py", line 302, in _runPass
        mutated |= check(pss.run_pass, internal_state)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/compiler_machinery.py", line 275, in check
        mangled = func(compiler_state)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typed_passes.py", line 95, in run_pass
        raise_errors=self._raise_errors)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typed_passes.py", line 66, in type_inference_stage
        infer.build_constraint()
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typeinfer.py", line 938, in build_constraint
        self.constrain_statement(inst)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typeinfer.py", line 1274, in constrain_statement
        self.typeof_assign(inst)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typeinfer.py", line 1345, in typeof_assign
        self.typeof_global(inst, inst.target, value)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typeinfer.py", line 1444, in typeof_global
        typ = self.resolve_value_type(inst, gvar.value)
      File "/Users/sanghoonkim/anaconda3/lib/python3.7/site-packages/numba/typeinfer.py", line 1366, in resolve_value_type
        raise TypingError(msg, loc=inst.loc)
    numba.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
    Untyped global name '_dyad_update': cannot determine Numba type of <class 'function'>

    File "anaconda3/lib/python3.7/site-packages/dcor/_fast_dcov_avl.py", line 70:
    def _partial_sum_2d(x, y, c, ix, iy, sx_c, sy_c, c_sum, l_max,

        dyad_update = _dyad_update_compiled if compiled else _dyad_update
        ^
    
    opened by sanghoonkim0918 6
  • OSError: [Errno 36] File name too long when importing dcor

    Importing dcor failed because a file name is too long.

    Ubuntu 20.04, Python 3.8.10, dcor 0.5.3, numba 0.53.1 (also with 0.54.1)

    >>> import dcor
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/quentin/.local/lib/python3.8/site-packages/dcor/__init__.py", line 14, in <module>
        from . import independence  # noqa
      File "/home/quentin/.local/lib/python3.8/site-packages/dcor/independence.py", line 11, in <module>
        from ._dcor import u_distance_correlation_sqr
      File "/home/quentin/.local/lib/python3.8/site-packages/dcor/_dcor.py", line 26, in <module>
        from ._fast_dcov_mergesort import _distance_covariance_sqr_mergesort_generic
      File "/home/quentin/.local/lib/python3.8/site-packages/dcor/_fast_dcov_mergesort.py", line 208, in <module>
        _distance_covariance_sqr_mergesort_generic_impl_compiled = numba.njit(
      File "/home/quentin/.local/lib/python3.8/site-packages/numba/core/decorators.py", line 221, in wrapper
        disp.compile(sig)
      File "/home/quentin/.local/lib/python3.8/site-packages/numba/core/dispatcher.py", line 891, in compile
        cres = self._cache.load_overload(sig, self.targetctx)
      File "/home/quentin/.local/lib/python3.8/site-packages/numba/core/caching.py", line 644, in load_overload
        return self._load_overload(sig, target_context)
      File "/home/quentin/.local/lib/python3.8/site-packages/numba/core/caching.py", line 651, in _load_overload
        data = self._cache_file.load(key)
      File "/home/quentin/.local/lib/python3.8/site-packages/numba/core/caching.py", line 495, in load
        overloads = self._load_index()
      File "/home/quentin/.local/lib/python3.8/site-packages/numba/core/caching.py", line 511, in _load_index
        with open(self._index_path, "rb") as f:
    OSError: [Errno 36] File name too long: '/home/quentin/.local/lib/python3.8/site-packages/dcor/__pycache__/_fast_dcov_mergesort._generate_distance_covariance_sqr_mergesort_generic_impl.locals._distance_covariance_sqr_mergesort_generic_impl-163.py38.nbi'
    
    opened by Quentin62 5
  • Add configurable average function

    Closes #22, see discussion there.

    Uncertainties:

    • Have I covered all public APIs, ensuring they can all be configured?
    • The test statistic ends up being negative, and therefore has a p-value of 1, when used to compare a standard normal and a t distribution in test_different_distributions. Does this make sense, or does it reveal a flaw in the code somewhere?
    opened by multimeric 4
  • Clarification of distance correlation - dcor vs scipy

    Hi!

    I have started using dcor as I need to find pairwise correlations between two variables/vectors for every pairwise comparison in a dataframe. I am using the distance correlation because I need to capture not only linear but also non-linear relationships.

    Having read the documentation, I know this is the correct implementation for this purpose. However, as I understand it, SciPy also provides a distance correlation function. I am getting different results from dcor and SciPy and was wondering if you could explain why. I am unsure whether SciPy is actually using the same distance correlation, or whether its implementation contains something obvious I have missed that leads to the different results:

    from scipy.spatial import distance
    distance.correlation(data['column1'], data['column2'])
    # 0.57

    import dcor
    dcor.distance_correlation(data['column1'], data['column2'])
    # 0.41

    There is a large discrepancy here and I would appreciate clarification!
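
    For reference, scipy.spatial.distance.correlation is not Székely's distance correlation at all: it is the correlation distance, i.e. one minus the Pearson correlation coefficient, which would already explain a large discrepancy. A quick check (a sketch with synthetic data):

    import numpy as np
    from scipy.spatial import distance

    rng = np.random.default_rng(0)
    u = rng.normal(size=100)
    v = u ** 2 + rng.normal(scale=0.1, size=100)

    # SciPy's "correlation" is the correlation *distance*: 1 - Pearson correlation.
    np.isclose(distance.correlation(u, v), 1 - np.corrcoef(u, v)[0, 1])  # True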

    thank you!

    opened by amjass12 4
  • Distance correlation of matrix and vector.

    dcor returns a scalar for the distance correlation of a matrix and a vector. I cannot yet understand why this is the case: isn't the distance correlation defined between two vectors? I would therefore expect a vector of correlations as the output.

    Could you explain what's going on?
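
    A short sketch of the two interpretations (assuming, as in the dcor documentation, that rows are observations and columns are vector components):

    import numpy as np
    import dcor

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 4))   # matrix: 100 observations of a 4-dimensional random vector
    y = rng.normal(size=100)        # vector: 100 scalar observations

    dcor.distance_correlation(x, y)  # scalar: x is treated as one multivariate sample

    # If a per-column result is wanted instead, loop over the columns explicitly:
    per_column = np.array([
        dcor.distance_correlation(x[:, j], y) for j in range(x.shape[1])
    ])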

    opened by CompRhys 4
  • Numba support

    I'm trying to use distance correlations as a metric for computing UMAP embeddings. This requires Numba support.

    Is there a fundamental reason why dcor.correlation_distance can't support Numba, or is it just a matter of going over the code?
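
    For context, the naive biased estimator itself is easy to express in a Numba-friendly way. The sketch below is not dcor's implementation; it is a minimal O(n^2) version for one-dimensional inputs, written with explicit loops so that @numba.njit can compile it (for example, inside code that UMAP will JIT):

    import numpy as np
    from numba import njit

    @njit
    def naive_distance_correlation(x, y):
        """Biased (V-statistic) distance correlation of two 1-d samples."""
        n = x.shape[0]
        a = np.empty((n, n))
        b = np.empty((n, n))
        for i in range(n):
            for j in range(n):
                a[i, j] = abs(x[i] - x[j])
                b[i, j] = abs(y[i] - y[j])

        # Row means and grand means (the matrices are symmetric, so row means equal column means).
        a_row = np.zeros(n)
        b_row = np.zeros(n)
        for i in range(n):
            for j in range(n):
                a_row[i] += a[i, j]
                b_row[i] += b[i, j]
        a_row /= n
        b_row /= n
        a_mean = a_row.sum() / n
        b_mean = b_row.sum() / n

        # Double-centred products give dCov^2, dVar^2(x) and dVar^2(y).
        dcov2 = 0.0
        dvarx = 0.0
        dvary = 0.0
        for i in range(n):
            for j in range(n):
                aij = a[i, j] - a_row[i] - a_row[j] + a_mean
                bij = b[i, j] - b_row[i] - b_row[j] + b_mean
                dcov2 += aij * bij
                dvarx += aij * aij
                dvary += bij * bij
        dcov2 /= n * n
        dvarx /= n * n
        dvary /= n * n

        denom = np.sqrt(dvarx * dvary)
        if denom <= 0.0:
            return 0.0
        if dcov2 < 0.0:  # guard against tiny negative rounding errors
            dcov2 = 0.0
        return np.sqrt(dcov2 / denom)

    Note that UMAP expects a dissimilarity, so in practice something like 1 - naive_distance_correlation(x, y) would be used as the metric.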

    opened by asemic-horizon 3
  • Process killed due to very large array

    Hello, I am trying to get the distance correlation between two very large vectors (25k elements each), and the dcor function gets killed due to an out-of-memory error. How can we fix that?

    dcor.distance_correlation(np.array(x, dtype=np.float32), np.array(y, dtype=np.float32), exponent=0.5)
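
    Some quick arithmetic on why this runs out of memory: the naive estimator materialises full n x n pairwise distance matrices, and several of them are needed at once.

    n = 25_000
    bytes_per_matrix = n * n * 8      # one float64 distance matrix
    print(bytes_per_matrix / 2**30)   # about 4.7 GiB per matrix

    The O(n log n) implementations whose modules appear elsewhere on this page (_fast_dcov_avl, _fast_dcov_mergesort) avoid building these matrices for one-dimensional data, but, as far as I know, only for the default exponent=1; whether they can be selected explicitly through a method argument depends on the installed version, so check the documentation.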

    opened by IslamAAli 2
  • Implement energy distance in terms of distance covariance

    We can implement energy distance in terms of distance covariance, as shown in https://arxiv.org/pdf/1910.08883.pdf.

    We need to study:

    • How this affects the current parameters of energy distance.
    • How to allow users to optionally access the different implementations of distance covariance, as well as the old energy distance implementation (if needed).
    enhancement 
    opened by vnmabus 0
  • Study and implement energy-based clustering

    As mentioned in https://doi.org/10.1016/j.jspi.2013.03.018 (https://pages.stat.wisc.edu/~wahba/stat860public/pdf4/Energy/JSPI5102.pdf), the energy distance can be used to implement a linkage method for hierarchical clustering.

    We should study the best way to implement it, if possible in a manner compatible with existing hierarchical clustering methods, such as scipy methods and scikit-learn AgglomerativeClustering class.

    enhancement 
    opened by vnmabus 0
  • Add distance skewness and symmetry test

    In https://doi.org/10.1016/j.jspi.2013.03.018 (https://pages.stat.wisc.edu/~wahba/stat860public/pdf4/Energy/JSPI5102.pdf) a measure of asymmetry, distance skewness, is described, as well as a test of symmetry using it. We should attempt to implement it in this package.

    • [ ] Implement distance skewness.
    • [ ] Implement symmetry test.
    enhancement help wanted good first issue 
    opened by vnmabus 0
  • Add goodness-of-fit tests

    Energy distance can be used to perform goodness-of-fit tests, as mentioned in https://doi.org/10.1016/j.jspi.2013.03.018 (https://pages.stat.wisc.edu/~wahba/stat860public/pdf4/Energy/JSPI5102.pdf).

    It would be useful to create a new submodule goodness that could include some of the following:

    • [ ] Two-parameter exponential distribution goodness-of-fit test.
    • [ ] Uniform distribution goodness-of-fit test.
    • [ ] Univariate normality goodness-of-fit test.
    • [ ] Multivariate normality goodness-of-fit test.
    • [ ] Pareto distribution goodness-of-fit test.
    • [ ] Poisson distribution goodness-of-fit test.
    • [ ] Stable distributions goodness-of-fit test.
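
    As a stopgap while these are unimplemented, a rough approximation (a sketch, not one of the calibrated goodness-of-fit tests proposed in the paper) is to compare the data against a Monte Carlo sample from the hypothesised model with the existing energy test of homogeneity:

    import numpy as np
    import dcor

    rng = np.random.default_rng(0)
    sample = rng.normal(size=(200, 1))       # data whose normality we want to assess
    reference = rng.normal(size=(2000, 1))   # Monte Carlo sample from the hypothesised model

    # Permutation-based energy test of homogeneity between data and reference sample.
    result = dcor.homogeneity.energy_test(sample, reference, num_resamples=200)
    print(result)  # contains the statistic and the p-value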
    enhancement help wanted good first issue 
    opened by vnmabus 0
  • Improve performance of pairwise distances computation

    The computation of pairwise distances is the main bottleneck of the naive algorithm for distance covariance. Currently we use SciPy's cdist for NumPy arrays, and a broadcasting computation otherwise.

    Any performance improvement to this function is thus well received.
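
    For reference, a minimal sketch of the two strategies mentioned above (not dcor's actual code): cdist for NumPy arrays, and a broadcasting fallback that also works for other array types at the cost of an (n, n, dim) intermediate.

    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(0)
    x = rng.normal(size=(500, 3))

    d_cdist = cdist(x, x)                                                  # SciPy path
    d_broadcast = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)   # generic fallback

    np.allclose(d_cdist, d_broadcast)  # True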

    enhancement help wanted 
    opened by vnmabus 0
Releases (0.6)
  • 0.6(Dec 26, 2022)

    What's Changed

    Typing

    • Fixes wrong types in u_distance_stats_sqr.
    • Add missing types in rowwise.

    Documentation

    • New documentation theme.
    • Added links in the theory.
    • Added examples to the documentation.
    • Warning added to partial distance correlation/covariance docstrings by @jltorrecilla in https://github.com/vnmabus/dcor/pull/47

    Performance

    • Improve the computation time of distances for Numpy arrays, which improves performance for energy distance and the naive case of distance covariance/correlation.
    • Improve AVL algorithm for distance covariance performance to bring it closer to mergesort.
    • Refactor distance covariance to be able to compute distance correlation without additional calls to the covariance function.

    New Contributors

    • @jltorrecilla made their first contribution in https://github.com/vnmabus/dcor/pull/47

    Full Changelog: https://github.com/vnmabus/dcor/compare/0.5.7...0.6

  • 0.5.7(Sep 2, 2022)

    What's Changed

    • Fix error with zero denominator. by @vnmabus in https://github.com/vnmabus/dcor/pull/37
    • Add first typing support

    Full Changelog: https://github.com/vnmabus/dcor/compare/0.5.6...0.5.7

  • 0.5.6(Jun 4, 2022)

  • 0.5.5(May 31, 2022)

    What's Changed

    • Parallelize the permutation test with joblib. by @lemiceterieux in https://github.com/vnmabus/dcor/pull/33
    • Added compatibility with the Python Array Standard for most functionality.

    Full Changelog: https://github.com/vnmabus/dcor/compare/0.5.3...0.5.5

  • 0.5.4(May 31, 2022)

    What's Changed

    • Parallelize the permutation test with joblib. by @lemiceterieux in https://github.com/vnmabus/dcor/pull/33
    • Added compatibility with the Python Array Standard for most functionality.

    Full Changelog: https://github.com/vnmabus/dcor/compare/0.5.3...0.5.4

  • 0.5(Aug 23, 2020)

  • 0.4(Apr 30, 2020)

Owner
Carlos Ramos Carreño
Software engineer and mathematician. PhD student in Machine Learning at Universidad Autónoma de Madrid.
"Learning and Analyzing Generation Order for Undirected Sequence Models" in Findings of EMNLP, 2021

undirected-generation-dev This repo contains the source code of the models described in the following paper "Learning and Analyzing Generation Order f

Yichen Jiang 0 Mar 25, 2022
DALL-Eval: Probing the Reasoning Skills and Social Biases of Text-to-Image Generative Transformers

DALL-Eval: Probing the Reasoning Skills and Social Biases of Text-to-Image Generative Transformers Authors: Jaemin Cho, Abhay Zala, and Mohit Bansal (

Jaemin Cho 98 Dec 15, 2022
Realtime_Multi-Person_Pose_Estimation

Introduction Multi Person PoseEstimation By PyTorch Results Require Pytorch Installation git submodule init && git submodule update Demo Download conv

tensorboy 1.3k Jan 05, 2023
Datasets and source code for our paper Webly Supervised Fine-Grained Recognition: Benchmark Datasets and An Approach

Introduction Datasets and source code for our paper Webly Supervised Fine-Grained Recognition: Benchmark Datasets and An Approach Datasets: WebFG-496

21 Sep 30, 2022
The implementation of the CVPR2021 paper "Structure-Aware Face Clustering on a Large-Scale Graph with 10^7 Nodes"

STAR-FC This code is the implementation for the CVPR 2021 paper "Structure-Aware Face Clustering on a Large-Scale Graph with 10^7 Nodes" 🌟 🌟 . 🎓 Re

Shuai Shen 87 Dec 28, 2022
Supplementary materials to "Spin-optomechanical quantum interface enabled by an ultrasmall mechanical and optical mode volume cavity" by H. Raniwala, S. Krastanov, M. Eichenfield, and D. R. Englund, 2022

Supplementary materials to "Spin-optomechanical quantum interface enabled by an ultrasmall mechanical and optical mode volume cavity" by H. Raniwala,

Stefan Krastanov 1 Jan 17, 2022
Tensorflow Repo for "DeepGCNs: Can GCNs Go as Deep as CNNs?"

DeepGCNs: Can GCNs Go as Deep as CNNs? In this work, we present new ways to successfully train very deep GCNs. We borrow concepts from CNNs, mainly re

Guohao Li 612 Nov 15, 2022
1st Solution For ICDAR 2021 Competition on Mathematical Formula Detection

This project releases our 1st place solution on ICDAR 2021 Competition on Mathematical Formula Detection. We implement our solution based on MMDetection, which is an open source object detection tool

yuxzho 94 Dec 25, 2022
Official pytorch code for SSAT: A Symmetric Semantic-Aware Transformer Network for Makeup Transfer and Removal

SSAT: A Symmetric Semantic-Aware Transformer Network for Makeup Transfer and Removal This is the official pytorch code for SSAT: A Symmetric Semantic-

ForeverPupil 57 Dec 13, 2022
Multi-Task Deep Neural Networks for Natural Language Understanding

New Release We released Adversarial training for both LM pre-training/finetuning and f-divergence. Large-scale Adversarial training for LMs: ALUM code

Xiaodong 2.1k Dec 30, 2022
Scalable, event-driven, deep-learning-friendly backtesting library

...Minimizing the mean square error on future experience. - Richard S. Sutton BTGym Scalable event-driven RL-friendly backtesting library. Build on

Andrew 922 Dec 27, 2022
Fast EMD for Python: a wrapper for Pele and Werman's C++ implementation of the Earth Mover's Distance metric

PyEMD: Fast EMD for Python PyEMD is a Python wrapper for Ofir Pele and Michael Werman's implementation of the Earth Mover's Distance that allows it to

William Mayner 433 Dec 31, 2022
MiniSom is a minimalistic implementation of the Self Organizing Maps

MiniSom Self Organizing Maps MiniSom is a minimalistic and Numpy based implementation of the Self Organizing Maps (SOM). SOM is a type of Artificial N

Giuseppe Vettigli 1.2k Jan 03, 2023
"NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search".

NAS-Bench-301 This repository containts code for the paper: "NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search". The

AutoML-Freiburg-Hannover 57 Nov 30, 2022
[CVPR2021] De-rendering the World's Revolutionary Artefacts

De-rendering the World's Revolutionary Artefacts Project Page | Video | Paper In CVPR 2021 Shangzhe Wu1,4, Ameesh Makadia4, Jiajun Wu2, Noah Snavely4,

49 Nov 06, 2022
Experimental Python implementation of OpenVINO Inference Engine (very slow, limited functionality). All codes are written in Python. Easy to read and modify.

PyOpenVINO - An Experimental Python Implementation of OpenVINO Inference Engine (minimum-set) Description The PyOpenVINO is a spin-off product from my

Yasunori Shimura 7 Oct 31, 2022
Using this you can control your PC/Laptop volume by Hand Gestures (pinch-in, pinch-out) created with Python.

Hand Gesture Volume Controller Using this you can control your PC/Laptop volume by Hand Gestures (pinch-in, pinch-out). Code Firstly I have created a

Tejas Prajapati 16 Sep 11, 2021
pytorch implementation of GPV-Pose

GPV-Pose Pytorch implementation of GPV-Pose: Category-level Object Pose Estimation via Geometry-guided Point-wise Voting. (link) UPDATE A new version

40 Dec 01, 2022
Code for ICDM2020 full paper: "Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning"

Subg-Con Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning (Jiao et al., ICDM 2020): https://arxiv.org/abs/2009.10273 Over

34 Jul 06, 2022
Person Re-identification

Person Re-identification Final project of Computer Vision Table of content Person Re-identification Table of content Students: Proposed method Dataset

Nguyễn Hoàng Quân 4 Jun 17, 2021