ABSTRACT
Many classification problems require a classifier to assign a single document to more than one category; this is known as multi-labelled classification. The categories in such problems are usually neither conditionally independent of each other nor mutually exclusive, so state-of-the-art single-label classification algorithms cannot be applied directly without discarding information about the relations among categories. In this paper, we model correlations among categories with the maximum entropy method and derive a classification algorithm for multi-labelled documents. Our experiments show that this method significantly outperforms combining independently trained single-label classifiers.
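To make the idea concrete, the following is a minimal sketch (not the paper's actual model) of a conditional maximum-entropy classifier defined over whole label vectors, where pairwise weights reward or penalise label co-occurrence — the correlation information that independent single-label classifiers discard. All function names and the toy data are our own illustrative assumptions; enumerating all 2^L label vectors is only feasible for small label sets.

```python
import numpy as np
from itertools import product


def fit_maxent_multilabel(X, Y, n_iter=500, lr=0.5):
    """Maximum-entropy model over full label vectors (illustrative sketch).

    Each candidate label vector y in {0,1}^L is scored as
        s(x, y) = sum_j y_j (w_j . x) + sum_{j<k} y_j y_k v_{jk},
    and P(y | x) is the softmax of s over all 2^L candidates.  The
    pairwise weights v_{jk} capture correlations between labels j and k.
    The log-likelihood is concave, so plain gradient ascent suffices.
    """
    n, d = X.shape
    L = Y.shape[1]
    S = np.array(list(product([0, 1], repeat=L)), dtype=float)  # (2^L, L)
    pair_mask = np.triu(np.ones((L, L)), k=1)                   # use j < k only
    W = np.zeros((L, d))                                        # per-label weights
    V = np.zeros((L, L))                                        # pairwise weights
    # Empirical feature counts (fixed across iterations).
    emp_W = Y.T @ X                                             # (L, d)
    emp_V = (Y.T @ Y) * pair_mask                               # (L, L)
    for _ in range(n_iter):
        scores = (X @ W.T) @ S.T                                # (n, 2^L)
        scores = scores + np.einsum('sj,jk,sk->s', S, V * pair_mask, S)
        scores -= scores.max(axis=1, keepdims=True)             # stable softmax
        P = np.exp(scores)
        P /= P.sum(axis=1, keepdims=True)                       # P(y | x_i)
        # Expected feature counts under the current model.
        exp_W = (P @ S).T @ X
        exp_V = np.einsum('ns,sj,sk->jk', P, S, S) * pair_mask
        W += lr * (emp_W - exp_W) / n
        V += lr * (emp_V - exp_V) / n
    return W, V, S, pair_mask


def predict(X, W, V, S, pair_mask):
    """Return the most probable label vector for each row of X."""
    scores = (X @ W.T) @ S.T
    scores = scores + np.einsum('sj,jk,sk->s', S, V * pair_mask, S)
    return S[scores.argmax(axis=1)].astype(int)


# Toy usage: three one-hot "documents"; the third carries both labels.
X = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
Y = np.array([[1, 0], [0, 1], [1, 1]])
W, V, S, mask = fit_maxent_multilabel(X, Y)
pred = predict(X, W, V, S, mask)
```

Because the model normalises over label *vectors* rather than individual labels, the learned v_{jk} shift probability mass toward or away from label combinations, which is how correlation among categories enters the classifier.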