arXiv Analytics

arXiv:2201.12204 [cs.LG]

From data to functa: Your data point is a function and you should treat it like one

Emilien Dupont, Hyunjik Kim, S. M. Ali Eslami, Danilo Rezende, Dan Rosenbaum

Published 2022-01-28 (Version 1)

It is common practice in deep learning to represent a measurement of the world on a discrete grid, e.g. a 2D grid of pixels. However, the underlying signal represented by these measurements is often continuous, e.g. the scene depicted in an image. A powerful continuous alternative is then to represent these measurements using an implicit neural representation, a neural function trained to output the appropriate measurement value for any input spatial location. In this paper, we take this idea to its next level: what would it take to perform deep learning on these functions instead, treating them as data? In this context we refer to the data as functa, and propose a framework for deep learning on functa. This view presents a number of challenges around efficient conversion from data to functa, compact representation of functa, and effectively solving downstream tasks on functa. We outline a recipe to overcome these challenges and apply it to a wide range of data modalities including images, 3D shapes, neural radiance fields (NeRF) and data on manifolds. We demonstrate that this approach has various compelling properties across data modalities, in particular on the canonical tasks of generative modeling, data imputation, novel view synthesis and classification.
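To make the core idea concrete, the sketch below fits an implicit neural representation to a single image: a small MLP is trained so that f(x, y) outputs the pixel value at continuous coordinates (x, y), and the trained weights then stand in for the image itself (a "functum" in the paper's terminology). This is only a minimal illustration under assumed choices (a plain ReLU MLP, a random toy image, ad hoc hyperparameters), not the paper's actual recipe, which uses SIREN networks with latent modulations and meta-learning for efficient conversion from data to functa.

```python
# Minimal sketch, assuming a plain coordinate MLP (not the paper's SIREN-with-modulations setup):
# fit f(x, y) -> RGB so the network weights become a continuous representation of one image.
import torch
import torch.nn as nn

class ImplicitImage(nn.Module):
    def __init__(self, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),          # RGB value for any (x, y) in [0, 1]^2
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(coords)

# Toy target: a random 32x32 RGB image standing in for a real data point.
H = W = 32
image = torch.rand(H, W, 3)
ys, xs = torch.meshgrid(torch.linspace(0, 1, H), torch.linspace(0, 1, W), indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)    # (H*W, 2) grid coordinates
targets = image.reshape(-1, 3)                           # (H*W, 3) pixel values

model = ImplicitImage()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    loss = ((model(coords) - targets) ** 2).mean()       # reconstruction MSE on the grid
    loss.backward()
    opt.step()

# Because the representation is continuous, it can be queried off-grid,
# e.g. rendered at 4x the original resolution.
hi_ys, hi_xs = torch.meshgrid(torch.linspace(0, 1, 4 * H),
                              torch.linspace(0, 1, 4 * W), indexing="ij")
hi_coords = torch.stack([hi_xs, hi_ys], dim=-1).reshape(-1, 2)
upsampled = model(hi_coords).reshape(4 * H, 4 * W, 3)
```

Deep learning "on functa" then means operating on such network-based representations (in the paper, on compact latent modulation vectors rather than full weight sets) for tasks like generation, imputation, and classification.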

Related articles:
arXiv:1506.00619 [cs.LG] (Published 2015-06-01)
Blocks and Fuel: Frameworks for deep learning
arXiv:1603.06430 [cs.LG] (Published 2016-03-21)
Deep Learning in Bioinformatics
arXiv:1602.02220 [cs.LG] (Published 2016-02-06)
Improved Dropout for Shallow and Deep Learning