The implementation of the FOLD-R++ algorithm

Overview

FOLD-R-PP

This repository contains the implementation of the FOLD-R++ algorithm. The goal of FOLD-R++ is to learn an answer set program for a classification task.

Installation

Prerequisites

FOLD-R++ is developed in Python 3 only. NumPy is the only dependency:

python3 -m pip install numpy

Instruction

Data preparation

The FOLD-R++ algorithm takes tabular data as input; the first line of the tabular data should contain the feature name of each column. FOLD-R++ does not need any encoding for training. It can handle numeric, categorical, and even mixed-type features (a single column containing both categorical and numeric values) directly. However, numeric features should be specified before loading the data; otherwise they are treated as categorical features (only literals with = and != would be generated).

Many UCI datasets can be found in the data directory, and the code for data preparation should be added to datasets.py.

For example, the UCI breast-w dataset can be loaded with the following code:

columns = ['clump_thickness', 'cell_size_uniformity', 'cell_shape_uniformity', 'marginal_adhesion',
           'single_epi_cell_size', 'bare_nuclei', 'bland_chromatin', 'normal_nucleoli', 'mitoses']
nums = columns
data, num_idx, columns = load_data('data/breastw/breastw.csv', attrs=columns, label=['label'], numerics=nums, pos='benign')

columns lists all the features needed, nums lists all the numeric features, label gives the feature name of the label, and pos indicates the positive value of the label.
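
For a dataset with mixed-type features, only the columns listed in nums are treated as numeric; the remaining columns fall back to categorical handling. A minimal sketch (the file path and column names below are hypothetical, not a dataset shipped with this repository):

columns = ['age', 'job', 'balance', 'housing']
nums = ['age', 'balance']   # only these two columns get <= and > literals
data, num_idx, columns = load_data('data/bank/bank.csv', attrs=columns, label=['label'],
                                   numerics=nums, pos='yes')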

Training

The FOLD-R++ algorithm generates an explainable model, represented as an answer set program, for classification tasks. Here is a training example for the breast-w dataset:

X_train, Y_train = split_xy(data_train)
X_pos, X_neg = split_X_by_Y(X_train, Y_train)
rules1 = foldrpp(X_pos, X_neg, [])

We now have a rule set rules1 in a nested intermediate representation. Flatten and decode the nested rules into an answer set program:

fr1 = flatten(rules1)
rule_set = decode_rules(fr1, attrs)
for r in rule_set:
    print(r)

The training process can be started with: python3 main.py
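
For reference, the steps above can be combined into a short end-to-end script. This is only a sketch that reuses the functions shown in this README; the 80/20 split below is illustrative and not necessarily how main.py splits the data:

import random

attrs = columns                            # the column names returned by load_data
random.shuffle(data)
split = int(len(data) * 0.8)
data_train, data_test = data[:split], data[split:]

X_train, Y_train = split_xy(data_train)
X_pos, X_neg = split_X_by_Y(X_train, Y_train)
rules1 = foldrpp(X_pos, X_neg, [])

fr1 = flatten(rules1)
for r in decode_rules(fr1, attrs):
    print(r)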

An answer set program compatible with s(CASP) is generated, as shown below. The abN predicates represent the exceptions (abnormalities) to the learned default rules.

% breastw dataset (699, 10).
% the answer set program generated by foldr++:

label(X,'benign'):- bare_nuclei(X,'?').
label(X,'benign'):- bland_chromatin(X,N6), N6=<4.0,
                    clump_thickness(X,N0), N0=<6.0,
                    bare_nuclei(X,N5), N5=<1.0, not ab7(X).
label(X,'benign'):- cell_size_uniformity(X,N1), N1=<2.0,
                    not ab3(X), not ab5(X), not ab6(X).
label(X,'benign'):- cell_size_uniformity(X,N1), N1=<4.0,
                    bare_nuclei(X,N5), N5=<3.0,
                    clump_thickness(X,N0), N0=<3.0, not ab8(X).
ab2(X):- clump_thickness(X,N0), N0=<1.0.
ab3(X):- bare_nuclei(X,N5), N5>5.0, not ab2(X).
ab4(X):- cell_shape_uniformity(X,N2), N2=<1.0.
ab5(X):- clump_thickness(X,N0), N0>7.0, not ab4(X).
ab6(X):- bare_nuclei(X,N5), N5>4.0, single_epi_cell_size(X,N4), N4=<1.0.
ab7(X):- marginal_adhesion(X,N3), N3>4.0.
ab8(X):- marginal_adhesion(X,N3), N3>6.0.

% foldr++ costs:  0:00:00.027710  post: 0:00:00.000127
% acc 0.95 p 0.96 r 0.9697 f1 0.9648 

Testing in Python

The testing data X_test can be predicted with the predict function in Python.

Y_test_hat = predict(rules1, X_test)
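
A quick sanity check of the predictions can be computed directly from the predicted and actual labels. This is an illustrative, self-contained snippet, not part of the repository's API:

X_test, Y_test = split_xy(data_test)      # the held-out split from data preparation
Y_test_hat = predict(rules1, X_test)
acc = sum(y == y_hat for y, y_hat in zip(Y_test, Y_test_hat)) / len(Y_test)
print('accuracy', round(acc, 4))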

The classify function can also be used to classify a single example.

y_test_hat = classify(rules1, x_test)
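
For example, individual test examples can be inspected one at a time (illustrative only; X_test and Y_test as obtained from split_xy above):

for x, y in zip(X_test[:5], Y_test[:5]):
    print(classify(rules1, x), y)   # predicted label vs. actual label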

Justification by using s(CASP)

Classification and justification can be conducted with s(CASP), but the data also needs to be converted into predicate format. The decode_test_data function can be used to generate predicates for the testing data.

data_pred = decode_test_data(data_test, attrs)
for p in data_pred:
    print(p)
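
To run the program with s(CASP), the decoded rules and the test predicates can be written to a single file together with a query. A minimal sketch (the file name and query are illustrative):

with open('breastw.lp', 'w') as f:
    for r in rule_set:                     # the decoded answer set program from above
        f.write(r + '\n')
    for p in data_pred:                    # the generated test-data predicates
        f.write(p + '\n')
    f.write("?- label(1, X).\n")           # hypothetical query for the example with id 1

The resulting file can then be loaded by the s(CASP) interpreter to obtain the classification together with its justification.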

Here is an example of the generated testing-data predicates along with the answer set program for the acute dataset:

% acute dataset (120, 7) 
% the answer set program generated by foldr++:

ab2(X):- a5(X,'no'), a1(X,N0), N0>37.9.
label(X,'yes'):- not a4(X,'no'), not ab2(X).

% foldr++ costs:  0:00:00.001990  post: 0:00:00.000040
% acc 1.0 p 1.0 r 1.0 f1 1.0 

id(1).
a1(1,37.2).
a2(1,'no').
a3(1,'yes').
a4(1,'no').
a5(1,'no').
a6(1,'no').

id(2).
a1(2,38.1).
a2(2,'no').
a3(2,'yes').
a4(2,'yes').
a5(2,'no').
a6(2,'yes').

id(3).
a1(3,37.5).
a2(3,'no').
a3(3,'no').
a4(3,'yes').
a5(3,'yes').
a6(3,'yes').

s(CASP)

All the resources of s(CASP) can be found at https://gitlab.software.imdea.org/ciao-lang/sCASP.

Citation

@misc{wang2021foldr,
      title={FOLD-R++: A Toolset for Automated Inductive Learning of Default Theories from Mixed Data}, 
      author={Huaduo Wang and Gopal Gupta},
      year={2021},
      eprint={2110.07843},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}