Conference Papers, Year: 2022

Exploring the Pertinence of Distance Functions for Nominal Multi-label Data

Payel Sadhukhan

Abstract

Data with nominal features constitute a good fraction of multi-label datasets. Dealing with high-dimensional nominal data differs from handling data with numeric features, chiefly because distance functions that work well on numeric datasets may not function optimally in a nominal feature space, failing to return the true separations of the points. We have further observed that, in a multi-label dataset, an imbalance exists in the distribution of the nominal features, which further aggravates the learning. In this work, we focus on assessing the suitability of four distance functions (Euclidean, Hamming, Jaccard and Kulsinski) in a binary-nominal context. Additionally, we propose and explore an ensemble of two classifiers, one modelled using the Jaccard distance and the other using the Kulsinski distance. An empirical study involving five binary-nominal datasets, four evaluation metrics and three multi-label classifiers is used to evaluate the pertinence of each distance function and of the ensemble. We find that the proposed ensemble gives the best outcome in all but one case.
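The paper itself provides no code; the minimal Python sketch below only illustrates how the four distances compared in the abstract behave on binary nominal vectors, using their classical definitions (as found, e.g., in scipy.spatial.distance). The exact formulations and preprocessing used in the study may differ. The function name `binary_distances` and the example vectors are illustrative assumptions.

```python
import numpy as np

def binary_distances(u, v):
    """Classical definitions of the four distances on binary (0/1)
    vectors; an illustrative sketch, not the paper's implementation."""
    u = np.asarray(u, dtype=bool)
    v = np.asarray(v, dtype=bool)
    n = u.size
    ctt = np.sum(u & v)    # positions where both vectors are 1
    ctf = np.sum(u & ~v)   # 1 in u, 0 in v
    cft = np.sum(~u & v)   # 0 in u, 1 in v
    return {
        # For binary vectors, Euclidean distance reduces to the
        # square root of the number of mismatching positions.
        "euclidean": np.sqrt(ctf + cft),
        # Fraction of positions that disagree.
        "hamming": (ctf + cft) / n,
        # Mismatches relative to positions where at least one vector
        # is 1; joint absences (0/0) are ignored, unlike Hamming.
        "jaccard": (ctf + cft) / (ctt + ctf + cft),
        # Kulsinski dissimilarity: also discounts joint presences,
        # but retains a dependence on the vector length n.
        "kulsinski": (ctf + cft - ctt + n) / (cft + ctf + n),
    }

if __name__ == "__main__":
    u = [1, 0, 1, 1, 0, 0]
    v = [1, 1, 0, 1, 0, 0]
    for name, d in binary_distances(u, v).items():
        print(f"{name}: {d:.3f}")
```

Run on the two example vectors above (two mismatches, two joint presences), the four functions already disagree noticeably: Hamming gives 0.333 while Jaccard gives 0.5 and Kulsinski 0.75, which is the kind of divergence on sparse binary data that motivates comparing them.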
Embargoed file: access restricted until Wednesday, January 1, 2025.

Dates and versions

hal-04668666, version 1 (07-08-2024)


Cite

Payel Sadhukhan. Exploring the Pertinence of Distance Functions for Nominal Multi-label Data. 18th IFIP International Conference on Artificial Intelligence Applications and Innovations (AIAI), Jun 2022, Hersonissos, Greece. pp.206-216, ⟨10.1007/978-3-031-08337-2_18⟩. ⟨hal-04668666⟩