{ "id": "2310.00183", "version": "v1", "published": "2023-09-29T23:09:54.000Z", "updated": "2023-09-29T23:09:54.000Z", "title": "On the Equivalence of Graph Convolution and Mixup", "authors": [ "Xiaotian Han", "Hanqing Zeng", "Yu Chen", "Shaoliang Nie", "Jingzhou Liu", "Kanika Narang", "Zahra Shakeri", "Karthik Abinav Sankararaman", "Song Jiang", "Madian Khabsa", "Qifan Wang", "Xia Hu" ], "categories": [ "cs.LG", "cs.AI" ], "abstract": "This paper investigates the relationship between graph convolution and Mixup techniques. Graph convolution in a graph neural network involves aggregating features from neighboring samples to learn representative features for a specific node or sample. On the other hand, Mixup is a data augmentation technique that generates new examples by averaging features and one-hot labels from multiple samples. One commonality between these techniques is their utilization of information from multiple samples to derive feature representation. This study aims to explore whether a connection exists between these two approaches. Our investigation reveals that, under two mild conditions, graph convolution can be viewed as a specialized form of Mixup that is applied during both the training and testing phases. The two conditions are: 1) \\textit{Homophily Relabel} - assigning the target node's label to all its neighbors, and 2) \\textit{Test-Time Mixup} - Mixup the feature during the test time. We establish this equivalence mathematically by demonstrating that graph convolution networks (GCN) and simplified graph convolution (SGC) can be expressed as a form of Mixup. We also empirically verify the equivalence by training an MLP using the two conditions to achieve comparable performance.", "revisions": [ { "version": "v1", "updated": "2023-09-29T23:09:54.000Z" } ], "analyses": { "keywords": [ "equivalence", "multiple samples", "data augmentation technique", "graph neural network", "conditions" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }