arXiv Analytics

arXiv:1511.07497 [cs.CV]

Constrained Structured Regression with Convolutional Neural Networks

Deepak Pathak, Philipp Krähenbühl, Stella X. Yu, Trevor Darrell

Published 2015-11-23 (Version 1)

Convolutional Neural Networks (CNNs) have recently emerged as the dominant model in computer vision. If provided with enough training data, they can predict almost any visual quantity. In a discrete setting, such as classification, CNNs not only predict a label but often also a confidence, in the form of a probability distribution over the output space. In continuous regression tasks, such a probability estimate is often lacking. We present a regression framework which models the output distribution of neural networks. This output distribution allows us to infer the most likely labeling subject to a set of physical or modeling constraints. These constraints capture the intricate interplay between different input and output variables, and complement the output of a CNN. However, they may not hold everywhere. Our setup further allows us to learn a confidence with which a constraint holds, in the form of a distribution over constraint satisfaction. We evaluate our approach on the problem of intrinsic image decomposition, and show that constrained structured regression significantly improves on the state of the art.
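The central idea of the abstract, attaching an output distribution to a continuous CNN regression rather than a single point estimate, can be pictured with a minimal sketch. The code below is not the authors' model; it is a generic per-pixel mean/log-variance head trained with a Gaussian negative log-likelihood, with all module names, layer sizes, and the PyTorch dependency chosen purely for illustration.

```python
# Minimal sketch (not the paper's architecture): a CNN regression head that
# predicts a per-pixel Gaussian output distribution (mean and log-variance),
# trained with the Gaussian negative log-likelihood. Sizes are illustrative.
import torch
import torch.nn as nn

class ProbabilisticRegressionHead(nn.Module):
    def __init__(self, in_channels=64):
        super().__init__()
        self.mean = nn.Conv2d(in_channels, 1, kernel_size=1)     # predicted value
        self.log_var = nn.Conv2d(in_channels, 1, kernel_size=1)  # predicted uncertainty

    def forward(self, features):
        return self.mean(features), self.log_var(features)

def gaussian_nll(mean, log_var, target):
    # Negative log-likelihood of the target under the predicted Gaussian;
    # the log-variance term plays the role of a per-pixel confidence.
    return 0.5 * (log_var + (target - mean) ** 2 / log_var.exp()).mean()

# Toy usage: features from some CNN backbone, target is a continuous label map.
features = torch.randn(2, 64, 32, 32)
target = torch.randn(2, 1, 32, 32)
head = ProbabilisticRegressionHead()
mean, log_var = head(features)
loss = gaussian_nll(mean, log_var, target)
loss.backward()
```

With such a distributional output, constraints between output variables could then be imposed at inference time by searching for the most likely labeling under the predicted distribution; how the paper combines this with learned constraint-satisfaction confidences is described in the full text.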

Related articles: Most relevant | Search more
arXiv:1412.4564 [cs.CV] (Published 2014-12-15)
MatConvNet - Convolutional Neural Networks for MATLAB
arXiv:1412.6296 [cs.CV] (Published 2014-12-19)
Generative Modeling of Convolutional Neural Networks
arXiv:1605.06402 [cs.CV] (Published 2016-05-20)
Ristretto: Hardware-Oriented Approximation of Convolutional Neural Networks