Uniform Title: Discriminative models and dimensionality reduction for regression
Name: Kim, Minyoung (author); Pavlovic, Vladimir (chair); Metaxas, Dimitris (internal member); Elgammal, Ahmed (internal member); Kumar, Sanjiv (outside member); Rutgers University, Graduate School - New Brunswick
Description: Many prediction problems that arise in computer vision and robotics can be formulated within a regression framework. Unlike traditional regression problems, vision and robotics tasks are often characterized by a varying number of output variables with complex dependency structures. The problems are further aggravated by the high dimensionality of the input. In this thesis, I address two challenging tasks related to learning of regressors in such settings: (1) developing discriminative approaches that can handle structured output variables, and (2) reducing the dimensionality of the input while preserving the statistical correlation with the output.
A complex dependency structure in the output variables can be effectively captured by probabilistic graphical models. In contrast to traditional joint data modeling for probabilistic models, I propose conditional models and a discriminative learning approach that are directly related to the ultimate prediction objective. While discriminative learning of structured models such as Conditional Random Fields (CRFs) has attracted significant interest in the past, learning structured models in the regression setting has rarely been explored. In this work I first extend the CRF and the discriminatively trained HMM methods to the structured output regression problem. I propose two different approaches, based on directed and undirected models respectively. In the second (undirected) approach, parameter learning is cast as a convex optimization problem, accompanied by a new technique that effectively handles the density integrability constraint. Experiments in several problem domains, including human motion and robot-arm state estimation, indicate that the new models yield prediction accuracy comparable to or better than that of state-of-the-art approaches.
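The directed-model idea can be illustrated with a toy chain-structured regressor in which each output y_t depends on the current input x_t and the previous output y_{t-1}. The sketch below is only a schematic least-squares fit of such a chain, not the thesis's CRF or discriminatively trained HMM estimators; the function name, dimensions, and simulated data are invented for illustration.

```python
import numpy as np

def fit_chain_regressor(X, Y):
    """Least-squares fit of a chain model y_t ~ A y_{t-1} + B x_t.

    X: (T, dx) inputs; Y: (T, dy) structured outputs along a chain.
    Returns the transition matrix A (dy, dy) and input map B (dy, dx).
    """
    Z = np.hstack([Y[:-1], X[1:]])                 # regressors for t = 1..T-1
    Theta, *_ = np.linalg.lstsq(Z, Y[1:], rcond=None)
    dy = Y.shape[1]
    return Theta[:dy].T, Theta[dy:].T              # A, B

# Simulate a chain with known parameters and fit the conditional model.
rng = np.random.default_rng(0)
T, dx, dy = 2000, 3, 2
A_true = 0.5 * np.eye(dy)                          # stable transition
B_true = 0.5 * rng.normal(size=(dy, dx))
X = rng.normal(size=(T, dx))
Y = np.zeros((T, dy))
for t in range(1, T):
    Y[t] = A_true @ Y[t - 1] + B_true @ X[t] + 0.01 * rng.normal(size=dy)

A_hat, B_hat = fit_chain_regressor(X, Y)
```

Because the model is fit conditionally (outputs regressed on inputs and past outputs), the estimate directly targets the prediction objective rather than a joint density over (x, y).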
In the second part of the thesis, I consider the task of finding a low-dimensional representation of the input covariates while preserving the statistical correlation in regressing the output. This task, known as dimensionality reduction for regression (DRR), is particularly useful for visualizing high-dimensional data, efficiently designing regressors with a reduced input dimension, and eliminating noise in the input data by uncovering the information essential for predicting the output. While dimensionality reduction methods are common in many machine learning tasks, their use in regression settings has not been widespread. A number of recent methods for DRR have been proposed in the statistics community but suffer from several limitations, including non-convexity and the need for slicing of a potentially high-dimensional output space. I address these issues by proposing a novel approach based on covariance operators in reproducing kernel Hilbert spaces (RKHSes) that provides a closed-form DRR solution without the need for explicit slicing. The benefits of this approach are demonstrated in a comprehensive set of evaluations on several important regression problems in computer vision and pattern recognition.
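The closed-form, slicing-free flavor of such RKHS covariance-operator methods can be conveyed by a kernel-CCA-style generalized eigenproblem over centered Gram matrices. This is a schematic stand-in, not the thesis's actual estimator; the function name, RBF kernel choice, bandwidth sigma, and regularizer eps are all illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def kernel_drr(X, Y, d=1, sigma=1.0, eps=1e-3):
    """Closed-form spectral sketch of kernel-based DRR.

    Centers RBF Gram matrices of inputs and outputs, then solves one
    regularized generalized eigenproblem for coefficient vectors whose
    kernel projections Z = Kx @ alpha are maximally aligned with the
    output kernel -- no slicing of the output space is needed.
    """
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n                 # centering matrix

    def rbf(A):
        sq = np.sum(A**2, 1)[:, None] + np.sum(A**2, 1)[None, :] - 2 * A @ A.T
        return np.exp(-np.maximum(sq, 0.0) / (2 * sigma**2))

    Kx, Ky = H @ rbf(X) @ H, H @ rbf(Y) @ H
    R = Kx + n * eps * np.eye(n)                        # regularized input Gram
    A = Kx @ Ky @ Kx                                    # cross-covariance term
    A = 0.5 * (A + A.T)                                 # symmetrize numerically
    _, V = eigh(A, R @ R)                               # generalized eigenproblem
    alpha = V[:, ::-1][:, :d]                           # top-d coefficient vectors
    return Kx @ alpha                                   # low-dimensional embedding
```

The whole computation is a single eigendecomposition, which mirrors the closed-form character of the covariance-operator approach; in contrast, slicing-based estimators must first partition the (possibly high-dimensional) output space into bins.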
Note: Includes bibliographical references (p. 94-97).
Collection: Graduate School - New Brunswick Electronic Theses and Dissertations
Organization Name: Rutgers, The State University of New Jersey
Rights: The author owns the copyright to this work.