arXiv Analytics


arXiv:2408.01767 [cs.LG]

Comparison of Embedded Spaces for Deep Learning Classification

Stefan Scholl

Published 2024-08-03 (Version 1)

Embedded spaces are a key feature in deep learning. Good embedded spaces represent the data well to support classification and advanced techniques such as open-set recognition, few-shot learning and explainability. This paper presents a compact overview of different techniques to design embedded spaces for classification. It compares different loss functions and constraints on the network parameters with respect to the achievable geometric structure of the embedded space. The techniques are demonstrated with two- and three-dimensional embeddings for the MNIST, Fashion MNIST and CIFAR-10 datasets, allowing visual inspection of the embedded spaces.
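The abstract does not specify a particular architecture. As an illustrative sketch of the general setup it describes, the following minimal PyTorch model maps inputs to a two-dimensional embedding (the penultimate layer) and attaches a linear classifier head trained with plain cross-entropy; the layer sizes and choice of loss are assumptions for illustration, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

class EmbeddingClassifier(nn.Module):
    """Small CNN whose penultimate layer is a low-dimensional embedding,
    followed by a linear classification head (illustrative sketch only)."""
    def __init__(self, embed_dim: int = 2, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),      # 2-D embedded space to visualize
        )
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        z = self.backbone(x)                # embedding, usable for scatter plots
        return self.head(z), z

# Dummy MNIST-shaped batch to show the training signal (cross-entropy on logits);
# the returned embeddings z can be plotted directly because embed_dim == 2.
model = EmbeddingClassifier()
x = torch.randn(8, 1, 28, 28)
logits, z = model(x)
loss = nn.functional.cross_entropy(logits, torch.randint(0, 10, (8,)))
```

Different loss functions or parameter constraints (e.g., normalizing the embeddings or the head weights) would replace or augment the cross-entropy term above to shape the geometry of the embedded space.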

Related articles:
arXiv:2107.01606 [cs.LG] (Published 2021-07-04)
A Comparison of the Delta Method and the Bootstrap in Deep Learning Classification
arXiv:2307.02973 [cs.LG] (Published 2023-07-06)
Pruning vs Quantization: Which is Better?
arXiv:1606.00930 [cs.LG] (Published 2016-06-02)
Comparison of 14 different families of classification algorithms on 115 binary datasets