11. S. Lazebnik and M. Raginsky: Learning Nearest Neighbor Quantizers from Labeled Data by Information Loss Minimization

This poster presents an information-theoretic clustering method that produces discriminative quantized representations of continuous data supplied with class labels.
The method learns a set of prototypes in the feature space such that the index of the nearest prototype to a given feature vector approximates a sufficient statistic for its class label. We apply the method to the construction of visual vocabularies for bag-of-features image classification, and our experiments show that vocabularies learned with the proposed
method yield higher classification accuracy than standard vocabularies obtained with k-means.
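The core idea, that a good quantizer's index should retain as much information about the class label as possible, can be illustrated with a minimal sketch. The code below is not the authors' algorithm; it is an illustrative example (with a hypothetical toy dataset and prototype sets) showing that the empirical mutual information I(q(X); Y) between the nearest-prototype index and the label distinguishes a discriminative vocabulary from an uninformative one:

```python
import numpy as np

def quantize(X, prototypes):
    # Assign each feature vector to the index of its nearest prototype.
    d = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def mutual_information(z, y, k, c):
    # Empirical mutual information I(Z; Y) in nats, estimated from
    # index/label pairs via the joint histogram.
    joint = np.zeros((k, c))
    for zi, yi in zip(z, y):
        joint[zi, yi] += 1
    joint /= joint.sum()
    pz = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pz @ py)[nz])).sum())

# Toy labeled data (hypothetical): the label depends only on the first
# coordinate, so only prototypes separated along that axis are informative.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0, 0, 1, 1])

protos_discriminative = np.array([[0.0, 0.5], [1.0, 0.5]])  # split on axis 0
protos_uninformative = np.array([[0.5, 0.0], [0.5, 1.0]])   # split on axis 1

mi_good = mutual_information(quantize(X, protos_discriminative), y, 2, 2)
mi_bad = mutual_information(quantize(X, protos_uninformative), y, 2, 2)
print(mi_good, mi_bad)  # mi_good = log 2, mi_bad = 0
```

Since I(X; Y) is fixed by the data, minimizing the information loss I(X; Y) - I(q(X); Y) is equivalent to placing prototypes so that the quantizer index preserves label information, which is what distinguishes the two prototype sets above.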