arXiv Analytics

arXiv:2006.00038 [cs.LG]

Quasi-orthonormal Encoding for Machine Learning Applications

Haw-minn Lu

Published 2020-05-29, Version 1

Most machine learning models, especially artificial neural networks, require numerical rather than categorical data. We briefly describe the advantages and disadvantages of common encoding schemes. For example, one-hot encoding is commonly used for attributes with a few unrelated categories, and word embeddings are used for attributes with many related categories (e.g., words). Neither is suitable for encoding attributes with many unrelated categories, such as diagnosis codes in healthcare applications. Applying one-hot encoding to diagnosis codes, for example, can result in extreme high-dimension, low-sample-size problems or artificially induce machine learning artifacts, not to mention the explosion in computing resources required. Quasi-orthonormal encoding (QOE) fills this gap. We briefly show how QOE compares to one-hot encoding. We provide example code showing how to implement QOE using popular ML libraries such as TensorFlow and PyTorch, along with a demonstration of QOE applied to MNIST handwriting samples.
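Although this listing does not reproduce the paper's code, the core idea can be sketched briefly. The sketch below is an illustration, not the paper's implementation: the helper names (quasi_orthonormal_codes, encode, decode) are hypothetical, and it approximates a quasi-orthonormal set with normalized random Gaussian vectors, which are only nearly orthogonal in moderately high dimension, whereas the paper develops more structured constructions.

import numpy as np

def quasi_orthonormal_codes(n_categories, dim, seed=0):
    # Approximate a quasi-orthonormal set: random unit vectors in
    # dimension `dim` are nearly orthogonal with high probability,
    # so `dim` can be far smaller than `n_categories` (contrast with
    # one-hot encoding, which requires dim == n_categories).
    rng = np.random.default_rng(seed)
    vecs = rng.standard_normal((n_categories, dim))
    vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
    return vecs

def encode(labels, codebook):
    # Map integer category labels to their code vectors.
    return codebook[labels]

def decode(encoded, codebook):
    # Recover labels via the nearest code (maximum inner product).
    return np.argmax(encoded @ codebook.T, axis=1)

# 1000 unrelated categories embedded in 64 dimensions instead of 1000.
codebook = quasi_orthonormal_codes(n_categories=1000, dim=64)
labels = np.array([3, 42, 999])
assert (decode(encode(labels, codebook), codebook) == labels).all()

Such a codebook could then serve as a fixed (non-trainable) embedding matrix in TensorFlow or PyTorch, which is how an encoding of this kind would typically be wired into a network.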

Comments: Accepted and submitted to the 19th Python in Science Conference (SciPy 2020)
Categories: cs.LG, stat.ML
Related articles:
arXiv:1212.1100 [cs.LG] (Published 2012-12-05)
Making Early Predictions of the Accuracy of Machine Learning Applications
arXiv:2202.03051 [cs.LG] (Published 2022-02-07, updated 2022-05-17)
Using Partial Monotonicity in Submodular Maximization
arXiv:2501.03840 [cs.LG] (Published 2025-01-07)
Machine learning applications in archaeological practices: a review