arXiv:2409.14235 [cs.LG]

Structure Learning via Mutual Information

Jeremy Nixon

Published 2024-09-21, Version 1

This paper presents a novel approach to machine learning algorithm design based on information theory, specifically mutual information (MI). We propose a framework for learning and representing functional relationships in data using MI-based features. Our method aims to capture the underlying structure of information in datasets, enabling more efficient and generalizable learning algorithms. We demonstrate the efficacy of our approach through experiments on synthetic and real-world datasets, showing improved performance on tasks such as function classification, regression, and cross-dataset transfer. This work contributes to the growing field of metalearning and automated machine learning, offering a new perspective on how to leverage information theory for algorithm design and dataset analysis, and proposing new mutual-information-theoretic foundations for learning algorithms.
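To give a rough sense of the kind of MI-based dataset features the abstract describes, the sketch below estimates the mutual information between each input variable and the target and treats the resulting MI profile as a coarse signature of the dataset's functional structure. This is a minimal illustration, not the paper's implementation: the choice of scikit-learn's mutual_info_regression as the MI estimator and the toy linear/quadratic/noise targets are assumptions made here for demonstration only.

```python
# Minimal sketch (illustrative only, not the paper's code): per-feature mutual
# information with the target as a simple descriptor of functional structure.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

def mi_profile(X, y):
    """Estimate MI between each input column and the target."""
    return mutual_info_regression(X, y, random_state=0)

n = 2000
X = rng.uniform(-1.0, 1.0, size=(n, 3))

# Hypothetical synthetic targets with different functional relationships.
datasets = {
    "linear":    X[:, 0] + 0.1 * rng.normal(size=n),       # depends on x0
    "quadratic": X[:, 1] ** 2 + 0.1 * rng.normal(size=n),  # depends on x1
    "noise":     rng.normal(size=n),                        # depends on nothing
}

for name, y in datasets.items():
    print(f"{name:>9}: MI per feature = {np.round(mi_profile(X, y), 3)}")
```

In this toy setting, informative inputs yield high MI estimates while irrelevant inputs and pure noise stay near zero, so the MI profile distinguishes the three relationship types; a framework along the lines sketched in the abstract would build its features from such MI estimates.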

Related articles:
arXiv:1206.6452 [cs.LG] (Published 2012-06-27)
Smoothness and Structure Learning by Proxy
arXiv:2105.10350 [cs.LG] (Published 2021-05-20)
Definite Non-Ancestral Relations and Structure Learning
arXiv:1608.07934 [cs.LG] (Published 2016-08-29)
Relevant based structure learning for feature selection