Relaxed Tree

The relaxed tree algorithm, also known as the relaxed hierarchy algorithm, solves multi-class classification problems by exploiting the relaxed hierarchy structure of the data.

At each node, a binary classifier separates the data into three groups. Labels $$1$$ and $$−1$$ mark the positive and negative sample groups assigned by the classifier, while confusing classes, labeled $$0$$, are ignored by the binary classifier (this is what "relaxed" refers to). Each child of a node contains either groups $$0$$ and $$1$$, or groups $$0$$ and $$−1$$.
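
The three-way relabeling at a single node can be sketched in plain Python. This is an illustrative, hypothetical helper (`relaxed_labels` and its arguments are not part of Shogun's API): classes assigned to the positive group receive label 1, classes in the negative group receive −1, and all remaining (confusing) classes receive 0 so the node's binary classifier ignores them.

```python
# Hypothetical sketch of the "relaxed" three-way relabeling at one node.
# Not Shogun's internals; names and signature are illustrative only.
def relaxed_labels(sample_classes, positive_group, negative_group):
    """Map each sample's class to +1, -1, or 0 for one tree node."""
    out = []
    for c in sample_classes:
        if c in positive_group:
            out.append(1)       # positive group
        elif c in negative_group:
            out.append(-1)      # negative group
        else:
            out.append(0)       # confusing class: ignored at this node
    return out

print(relaxed_labels([0, 1, 2, 3, 1], {0}, {1}))  # → [1, -1, 0, 0, -1]
```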

See [GK11] for a detailed introduction.

Example

Imagine we have files with training and test data. We create CDenseFeatures (here 64-bit floats a.k.a. RealFeatures) and CMulticlassLabels as

features_train = RealFeatures(f_feats_train)
features_test = RealFeatures(f_feats_test)
labels_train = MulticlassLabels(f_labels_train)
labels_test = MulticlassLabels(f_labels_test)

features_train = RealFeatures(f_feats_train);
features_test = RealFeatures(f_feats_test);
labels_train = MulticlassLabels(f_labels_train);
labels_test = MulticlassLabels(f_labels_test);

RealFeatures features_train = new RealFeatures(f_feats_train);
RealFeatures features_test = new RealFeatures(f_feats_test);
MulticlassLabels labels_train = new MulticlassLabels(f_labels_train);
MulticlassLabels labels_test = new MulticlassLabels(f_labels_test);

features_train = Shogun::RealFeatures.new f_feats_train
features_test = Shogun::RealFeatures.new f_feats_test
labels_train = Shogun::MulticlassLabels.new f_labels_train
labels_test = Shogun::MulticlassLabels.new f_labels_test

features_train <- RealFeatures(f_feats_train)
features_test <- RealFeatures(f_feats_test)
labels_train <- MulticlassLabels(f_labels_train)
labels_test <- MulticlassLabels(f_labels_test)

features_train = shogun.RealFeatures(f_feats_train)
features_test = shogun.RealFeatures(f_feats_test)
labels_train = shogun.MulticlassLabels(f_labels_train)
labels_test = shogun.MulticlassLabels(f_labels_test)

RealFeatures features_train = new RealFeatures(f_feats_train);
RealFeatures features_test = new RealFeatures(f_feats_test);
MulticlassLabels labels_train = new MulticlassLabels(f_labels_train);
MulticlassLabels labels_test = new MulticlassLabels(f_labels_test);

auto features_train = some<CDenseFeatures<float64_t>>(f_feats_train);
auto features_test = some<CDenseFeatures<float64_t>>(f_feats_test);
auto labels_train = some<CMulticlassLabels>(f_labels_train);
auto labels_test = some<CMulticlassLabels>(f_labels_test);


In order to run CRelaxedTree, we need to set the machine used to compute the confusion matrix and choose the kernel.

mll = MulticlassLibLinear()
kernel = GaussianKernel()

mll = MulticlassLibLinear();
kernel = GaussianKernel();

MulticlassLibLinear mll = new MulticlassLibLinear();
GaussianKernel kernel = new GaussianKernel();

mll = Shogun::MulticlassLibLinear.new
kernel = Shogun::GaussianKernel.new

mll <- MulticlassLibLinear()
kernel <- GaussianKernel()

mll = shogun.MulticlassLibLinear()
kernel = shogun.GaussianKernel()

MulticlassLibLinear mll = new MulticlassLibLinear();
GaussianKernel kernel = new GaussianKernel();

auto mll = some<CMulticlassLibLinear>();
auto kernel = some<CGaussianKernel>();
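
To make the role of the machine for the confusion matrix concrete, here is a minimal pure-Python sketch of what a confusion matrix is: for each pair (true class i, predicted class j), it counts how often class i is predicted as class j. This is illustrative only (the `confusion_matrix` helper below is hypothetical, not Shogun's implementation).

```python
# Illustrative sketch, not Shogun's implementation: the confusion matrix
# counts how often each true class i is predicted as class j. Classes
# that are frequently confused end up grouped together (label 0) when
# the tree estimates its initial partition.
def confusion_matrix(true_labels, predicted_labels, num_classes):
    mat = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(true_labels, predicted_labels):
        mat[t][p] += 1
    return mat

print(confusion_matrix([0, 0, 1, 1], [0, 1, 1, 1], 2))  # → [[1, 1], [0, 2]]
```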


We create an instance of the CRelaxedTree classifier, set the labels, and set the machine for the confusion matrix and the kernel. The confusion matrix is used to estimate the initial partition of the dataset, and the kernel is used to train the model.

machine = RelaxedTree()
machine.set_labels(labels_train)
machine.set_machine_for_confusion_matrix(mll)
machine.set_kernel(kernel)

machine = RelaxedTree();
machine.set_labels(labels_train);
machine.set_machine_for_confusion_matrix(mll);
machine.set_kernel(kernel);

RelaxedTree machine = new RelaxedTree();
machine.set_labels(labels_train);
machine.set_machine_for_confusion_matrix(mll);
machine.set_kernel(kernel);

machine = Shogun::RelaxedTree.new
machine.set_labels labels_train
machine.set_machine_for_confusion_matrix mll
machine.set_kernel kernel

machine <- RelaxedTree()
machine$set_labels(labels_train)
machine$set_machine_for_confusion_matrix(mll)
machine$set_kernel(kernel)

machine = shogun.RelaxedTree()
machine:set_labels(labels_train)
machine:set_machine_for_confusion_matrix(mll)
machine:set_kernel(kernel)

RelaxedTree machine = new RelaxedTree();
machine.set_labels(labels_train);
machine.set_machine_for_confusion_matrix(mll);
machine.set_kernel(kernel);

auto machine = some<CRelaxedTree>();
machine->set_labels(labels_train);
machine->set_machine_for_confusion_matrix(mll);
machine->set_kernel(kernel);


Then we train and apply it to test data, which here gives CMulticlassLabels.

machine.train(features_train)
labels_predict = machine.apply_multiclass(features_test)

machine.train(features_train);
labels_predict = machine.apply_multiclass(features_test);

machine.train(features_train);
MulticlassLabels labels_predict = machine.apply_multiclass(features_test);

machine.train features_train
labels_predict = machine.apply_multiclass features_test

machine$train(features_train)
labels_predict <- machine$apply_multiclass(features_test)

machine:train(features_train)
labels_predict = machine:apply_multiclass(features_test)

machine.train(features_train);
MulticlassLabels labels_predict = machine.apply_multiclass(features_test);

machine->train(features_train);
auto labels_predict = machine->apply_multiclass(features_test);


We can evaluate test performance via e.g. CMulticlassAccuracy.

eval = MulticlassAccuracy()
accuracy = eval.evaluate(labels_predict, labels_test)

eval = MulticlassAccuracy();
accuracy = eval.evaluate(labels_predict, labels_test);

MulticlassAccuracy eval = new MulticlassAccuracy();
double accuracy = eval.evaluate(labels_predict, labels_test);

eval = Shogun::MulticlassAccuracy.new
accuracy = eval.evaluate labels_predict, labels_test

eval <- MulticlassAccuracy()
accuracy <- eval$evaluate(labels_predict, labels_test)

eval = shogun.MulticlassAccuracy()
accuracy = eval:evaluate(labels_predict, labels_test)

MulticlassAccuracy eval = new MulticlassAccuracy();
double accuracy = eval.evaluate(labels_predict, labels_test);

auto eval = some<CMulticlassAccuracy>();
auto accuracy = eval->evaluate(labels_predict, labels_test);
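
The quantity computed by CMulticlassAccuracy is simply the fraction of predicted labels that match the ground truth. A minimal plain-Python sketch of that computation (the `multiclass_accuracy` helper is illustrative, not Shogun's implementation):

```python
# Illustrative sketch: multiclass accuracy is the fraction of samples
# whose predicted label equals the ground-truth label.
def multiclass_accuracy(predicted, ground_truth):
    assert len(predicted) == len(ground_truth)
    correct = sum(p == t for p, t in zip(predicted, ground_truth))
    return correct / len(predicted)

print(multiclass_accuracy([0, 1, 2, 1], [0, 1, 1, 1]))  # → 0.75
```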


References

 [GK11] T. Gao and D. Koller. Discriminative learning of relaxed hierarchy for large-scale visual recognition. In IEEE International Conference on Computer Vision, 2072–2079. 2011.