The documents distributed by this server have been provided by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Towards Ontology-based Training-less Multi-label Text Classification

Author: Wael Alkhatib, Saba Sabrin, Svenja Neitzel, and Christoph Rensing
Date: April 2018
Kind: In proceedings
Book title: Proceedings of the 23rd International Conference on Applications of Natural Language to Information Systems
Keywords: semantics; statistics; feature selection; ontology; text classification; typed dependencies
Number of characters: 18717
Abstract: In the under-explored research area of multi-label text classification, a substantial amount of research has taken place on adapting and transforming traditional classifiers to directly handle multi-label datasets. The performance of traditional statistical and probabilistic classifiers suffers from the high dimensionality of the feature space, training overhead, and label imbalance. In this work, we propose a novel ontology-based approach for training-less multi-label text classification. We transform the classification task into a graph matching problem by developing a shallow domain ontology to be used as a training-less classifier. Thereby, we overcome the challenges of feature engineering and label imbalance of traditional methods. Our intensive experiments, using the EUR-Lex dataset, prove that our method provides a comparable performance to the state-of-the-art techniques in terms of Macro F1-Score.

If the paper is not available from this page, you might contact the author(s) directly via the "People" section on our KOM Homepage.
