{ "id": "2107.11159", "version": "v1", "published": "2021-07-23T12:10:46.000Z", "updated": "2021-07-23T12:10:46.000Z", "title": "Learning Discriminative Representations for Multi-Label Image Recognition", "authors": [ "Mohammed Hassanin", "Ibrahim Radwan", "Salman Khan", "Murat Tahtali" ], "categories": [ "cs.CV" ], "abstract": "Multi-label recognition is a fundamental, and yet is a challenging task in computer vision. Recently, deep learning models have achieved great progress towards learning discriminative features from input images. However, conventional approaches are unable to model the inter-class discrepancies among features in multi-label images, since they are designed to work for image-level feature discrimination. In this paper, we propose a unified deep network to learn discriminative features for the multi-label task. Given a multi-label image, the proposed method first disentangles features corresponding to different classes. Then, it discriminates between these classes via increasing the inter-class distance while decreasing the intra-class differences in the output space. By regularizing the whole network with the proposed loss, the performance of applying the wellknown ResNet-101 is improved significantly. Extensive experiments have been performed on COCO-2014, VOC2007 and VOC2012 datasets, which demonstrate that the proposed method outperforms state-of-the-art approaches by a significant margin of 3:5% on large-scale COCO dataset. Moreover, analysis of the discriminative feature learning approach shows that it can be plugged into various types of multi-label methods as a general module.", "revisions": [ { "version": "v1", "updated": "2021-07-23T12:10:46.000Z" } ], "analyses": { "keywords": [ "multi-label image recognition", "learning discriminative representations", "discriminative feature", "method outperforms state-of-the-art approaches", "method first disentangles features corresponding" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }