Semantic Channel and Shannon’s Channel Mutually Match for Multi-label Classification
Abstract
A semantic channel consists of a set of membership functions or truth functions that indicate the denotations of a set of labels. In multi-label learning, we obtain a semantic channel from a sampling distribution or a Shannon's channel. If the sample is large enough, we can directly convert a Shannon's channel into a semantic channel by the third kind of Bayes' theorem; otherwise, we can optimize the membership functions by a generalized Kullback–Leibler formula. In multi-label classification, we partition the instance space with the maximum semantic information criterion, which is a special Regularized Least Squares (RLS) criterion and is equivalent to the maximum likelihood criterion. To simplify the learning, we may obtain only the truth functions of some atomic labels and use them to construct the truth functions of compound labels. When learning a label, instances are divided into three kinds (positive, negative, and unclear) instead of two kinds as in the One-vs-Rest or Binary Relevance (BR) method. Each label is learned independently, as in the BR method; however, a label may be trained without negative examples, and many binary classifications are not needed. In label selection, the classifier selects, for a given instance, the compound label with the most semantic information. This classifier already takes the correlation between labels into consideration. As a predictive model, the semantic channel does not change with the prior probability distribution (source) of instances, so it still works when the source changes. The classifier, however, varies with the source and hence can overcome the class-imbalance problem. It is shown that the increase of the aged population changes the classifier for the label "Old person" and has been driving the evolution of the semantic meaning of "Old". The CM iteration algorithm for the classification of unseen instances is introduced.
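The sketch below illustrates the two core steps described above for the case of a large sample: converting a Shannon's channel P(y_j|x) into a semantic channel of truth functions, and selecting labels by the maximum semantic information criterion. It is a minimal illustration, assuming the proportional form T(θ_j|x) = P(y_j|x) / max_x P(y_j|x) for the third kind of Bayes' theorem and I(x; θ_j) = log[T(θ_j|x)/T(θ_j)] for the semantic information; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

# Assumption: rows of shannon_channel are P(y_j | x) over a discretized
# instance space; the "third kind of Bayes' theorem" is applied in the
# proportional form T(theta_j | x) = P(y_j | x) / max_x P(y_j | x).

def semantic_channel_from_shannon(shannon_channel):
    """Convert a Shannon's channel P(y_j|x) (shape: labels x instances)
    into a semantic channel of truth functions T(theta_j|x) in [0, 1]."""
    return shannon_channel / shannon_channel.max(axis=1, keepdims=True)

def classify(truth_functions, p_x):
    """Maximum semantic information classifier:
    pick j* = argmax_j log(T(theta_j|x) / T(theta_j)) for every instance x.
    The logical probability T(theta_j) = sum_x P(x) T(theta_j|x) depends on
    the source P(x), so the classifier changes when the source changes,
    while the truth functions themselves do not."""
    logical_prob = truth_functions @ p_x                 # T(theta_j), one per label
    semantic_info = (np.log(truth_functions + 1e-12)
                     - np.log(logical_prob)[:, None])    # I(x; theta_j)
    return semantic_info.argmax(axis=0)                  # best label index per x

# Toy usage: 3 labels over 5 instance bins (hypothetical numbers).
P_y_given_x = np.array([[0.7, 0.5, 0.2, 0.05, 0.0],
                        [0.3, 0.4, 0.6, 0.45, 0.2],
                        [0.0, 0.1, 0.2, 0.50, 0.8]])
p_x = np.array([0.3, 0.25, 0.2, 0.15, 0.1])

T = semantic_channel_from_shannon(P_y_given_x)
print(classify(T, p_x))   # label chosen for each instance bin
```

Because only `p_x` enters the classification step, re-running `classify` with a different source distribution (for example, one with a larger aged population) shifts the decision boundaries without retraining the truth functions, which is the behaviour the abstract attributes to the semantic channel as a predictive model.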