Correct Constraint

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The experts below are selected from a list of 42 experts worldwide, ranked by the ideXlab platform.

Robert Nowak - One of the best experts on this subject based on the ideXlab platform.

  • Finite Sample Prediction and Recovery Bounds for Ordinal Embedding
    arXiv: Machine Learning, 2016
    Co-Authors: Lalit Jain, Kevin Jamieson, Robert Nowak
    Abstract:

    The goal of ordinal embedding is to represent items as points in a low-dimensional Euclidean space given a set of constraints in the form of distance comparisons like "item $i$ is closer to item $j$ than item $k$". Ordinal constraints like this often come from human judgments. To account for errors and variation in judgments, we consider the noisy situation in which the given constraints are independently corrupted by reversing the correct constraint with some probability. This paper makes several new contributions to this problem. First, we derive prediction error bounds for ordinal embedding with noise by exploiting the fact that the rank of a distance matrix of points in $\mathbb{R}^d$ is at most $d+2$. These bounds characterize how well a learned embedding predicts new comparative judgments. Second, we investigate the special case of a known noise model and study the maximum likelihood estimator. Third, knowledge of the noise model enables us to relate prediction errors to embedding accuracy. This relationship is highly non-trivial since we show that the linear map corresponding to distance comparisons is non-invertible, but there exists a nonlinear map that is invertible. Fourth, two new algorithms for ordinal embedding are proposed and evaluated in experiments.
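    The setting described above can be sketched in a few lines of NumPy: sample noisy triplet comparisons from ground-truth points, then fit an embedding by gradient descent on a logistic loss over the triplets. This is an illustrative toy realization, not the paper's algorithms; all names, dimensions, and hyperparameters here are hypothetical choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical ground-truth items in R^2 (toy data, not from the paper).
    n, d = 20, 2
    X_true = rng.normal(size=(n, d))

    def sample_triplets(X, m, flip_prob=0.1):
        """Sample m triplets (i, j, k) meaning "i is closer to j than to k",
        each independently reversed with probability flip_prob (the noise
        model described in the abstract)."""
        triplets = []
        for _ in range(m):
            i, j, k = rng.choice(len(X), size=3, replace=False)
            if np.sum((X[i] - X[j])**2) > np.sum((X[i] - X[k])**2):
                j, k = k, j          # order so the comparison is correct
            if rng.random() < flip_prob:
                j, k = k, j          # corrupt by reversing the correct constraint
            triplets.append((i, j, k))
        return triplets

    def embed(triplets, n, d, steps=300, lr=0.5):
        """Minimize an average logistic loss log(1 + exp(-margin)) over
        triplets by full-batch gradient descent (one simple realization)."""
        X = rng.normal(scale=0.1, size=(n, d))
        for _ in range(steps):
            grad = np.zeros_like(X)
            for i, j, k in triplets:
                # margin > 0 when the constraint d(i,j) < d(i,k) is satisfied
                margin = np.sum((X[i] - X[k])**2) - np.sum((X[i] - X[j])**2)
                w = 1.0 / (1.0 + np.exp(np.clip(margin, -30, 30)))
                grad[i] += -w * 2 * ((X[i] - X[k]) - (X[i] - X[j]))
                grad[j] += -w * 2 * (X[i] - X[j])
                grad[k] += -w * 2 * (X[k] - X[i])
            X -= lr * grad / len(triplets)
        return X

    def triplet_accuracy(X_hat, X_ref, m=200):
        """Fraction of fresh noiseless comparisons the embedding predicts."""
        correct = 0
        for _ in range(m):
            i, j, k = rng.choice(len(X_ref), size=3, replace=False)
            true = np.sum((X_ref[i] - X_ref[j])**2) < np.sum((X_ref[i] - X_ref[k])**2)
            pred = np.sum((X_hat[i] - X_hat[j])**2) < np.sum((X_hat[i] - X_hat[k])**2)
            correct += (true == pred)
        return correct / m

    triplets = sample_triplets(X_true, 400, flip_prob=0.1)
    X_hat = embed(triplets, n, d)
    acc = triplet_accuracy(X_hat, X_true)
    ```

    Even with 10% of constraints reversed, the learned embedding predicts held-out comparisons well above chance, which is the kind of prediction-error behavior the paper's bounds characterize.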

  • NIPS - Finite Sample Prediction and Recovery Bounds for Ordinal Embedding
    2016
    Co-Authors: Lalit Jain, Kevin Jamieson, Robert Nowak
    Abstract:

    The goal of ordinal embedding is to represent items as points in a low-dimensional Euclidean space given a set of constraints like "item $i$ is closer to item $j$ than item $k$". Ordinal constraints like this often come from human judgments. The classic approach to solving this problem is known as non-metric multidimensional scaling. To account for errors and variation in judgments, we consider the noisy situation in which the given constraints are independently corrupted by reversing the correct constraint with some probability. The ordinal embedding problem has been studied for decades, but most past work pays little attention to the question of whether accurate embedding is possible, apart from empirical studies. This paper shows that under a generative data model it is possible to learn the correct embedding from noisy distance comparisons. In establishing this fundamental result, the paper makes several new contributions. First, we derive prediction error bounds for embedding from noisy distance comparisons by exploiting the fact that the rank of a distance matrix of points in $\mathbb{R}^d$ is at most $d+2$. These bounds characterize how well a learned embedding predicts new comparative judgments. Second, we show that the underlying embedding can be recovered by solving a simple convex optimization. This result is highly non-trivial since we show that the linear map corresponding to distance comparisons is non-invertible, but there exists a nonlinear map that is invertible. Third, two new algorithms for ordinal embedding are proposed and evaluated in experiments.
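    The rank fact that both abstracts rely on is easy to verify numerically: for points $x_1, \dots, x_n$ in $\mathbb{R}^d$, the squared-distance matrix decomposes as $D = g\mathbf{1}^\top + \mathbf{1}g^\top - 2XX^\top$, where $g$ holds the squared norms, so its rank is at most $1 + 1 + d = d + 2$. A small NumPy check (illustrative only; the point sizes here are arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, d = 50, 3
    X = rng.normal(size=(n, d))

    # Squared-distance matrix D[i, j] = ||x_i - x_j||^2, built via the
    # decomposition D = g 1^T + 1 g^T - 2 X X^T (g = vector of squared norms).
    g = np.sum(X**2, axis=1)
    D = g[:, None] + g[None, :] - 2 * X @ X.T

    # Each of the three terms has rank at most 1, 1, and d respectively,
    # so rank(D) <= d + 2 even though D is n x n.
    rank = np.linalg.matrix_rank(D)
    ```

    Despite $D$ being $50 \times 50$, its rank is bounded by $d + 2 = 5$, which is what makes the low-rank machinery behind the prediction error bounds applicable.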

S. Minton - One of the best experts on this subject based on the ideXlab platform.

  • Conditional Constraint networks for interleaved planning and information gathering
    IEEE Intelligent Systems, 2005
    Co-Authors: J.l. Ambite, C.a. Knoblock, M. Muslea, S. Minton
    Abstract:

    We have developed Heracles II, a framework for mixed-initiative planning and information gathering. Heracles II maps the hierarchical task structure of the planning domain into a conditional constraint network. It also ensures correct constraint propagation in the presence of cycles, user interaction, and asynchronous sources. We have applied the Heracles II framework to several domains, including travel planning and geospatial data integration.
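    The core idea of a conditional constraint network — constraints that fire only when a gating condition holds, propagated to a fixed point so cycles terminate — can be sketched minimally as follows. This is a hypothetical simplification for illustration; the class names, variables, and travel example are invented and not taken from Heracles II.

    ```python
    class Network:
        """Toy conditional constraint network: each constraint recomputes an
        output variable from input variables, optionally gated by a condition."""

        def __init__(self):
            self.values = {}
            self.constraints = []   # (inputs, output, fn, condition)

        def add(self, inputs, output, fn, condition=None):
            self.constraints.append((inputs, output, fn, condition))

        def set(self, var, value):
            self.values[var] = value
            self.propagate()

        def propagate(self):
            # Fixed-point iteration: re-fire constraints until no value
            # changes, so propagation terminates even when the network
            # contains cycles among constraints.
            changed = True
            while changed:
                changed = False
                for inputs, output, fn, cond in self.constraints:
                    if cond is not None and not cond(self.values):
                        continue                     # conditional branch disabled
                    if all(i in self.values for i in inputs):
                        new = fn(*(self.values[i] for i in inputs))
                        if self.values.get(output) != new:
                            self.values[output] = new
                            changed = True

    # Travel-planning flavor: the hotel constraint fires only for overnight trips.
    net = Network()
    net.add(["depart", "duration"], "return", lambda day, nights: day + nights)
    net.add(["duration"], "needs_hotel", lambda nights: nights >= 1)
    net.add(["city"], "hotel", lambda c: f"hotel-in-{c}",
            condition=lambda v: v.get("needs_hotel"))
    net.set("city", "Rome")
    net.set("depart", 10)
    net.set("duration", 2)
    ```

    Setting `duration` activates the `needs_hotel` constraint, which in turn enables the conditional hotel branch on the next propagation pass — the kind of interleaving of planning structure and information gathering the abstract describes.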

Lalit Jain - One of the best experts on this subject based on the ideXlab platform.

  • Finite Sample Prediction and Recovery Bounds for Ordinal Embedding
    arXiv: Machine Learning, 2016
    Co-Authors: Lalit Jain, Kevin Jamieson, Robert Nowak
    Abstract:

    The goal of ordinal embedding is to represent items as points in a low-dimensional Euclidean space given a set of constraints in the form of distance comparisons like "item $i$ is closer to item $j$ than item $k$". Ordinal constraints like this often come from human judgments. To account for errors and variation in judgments, we consider the noisy situation in which the given constraints are independently corrupted by reversing the correct constraint with some probability. This paper makes several new contributions to this problem. First, we derive prediction error bounds for ordinal embedding with noise by exploiting the fact that the rank of a distance matrix of points in $\mathbb{R}^d$ is at most $d+2$. These bounds characterize how well a learned embedding predicts new comparative judgments. Second, we investigate the special case of a known noise model and study the maximum likelihood estimator. Third, knowledge of the noise model enables us to relate prediction errors to embedding accuracy. This relationship is highly non-trivial since we show that the linear map corresponding to distance comparisons is non-invertible, but there exists a nonlinear map that is invertible. Fourth, two new algorithms for ordinal embedding are proposed and evaluated in experiments.

  • NIPS - Finite Sample Prediction and Recovery Bounds for Ordinal Embedding
    2016
    Co-Authors: Lalit Jain, Kevin Jamieson, Robert Nowak
    Abstract:

    The goal of ordinal embedding is to represent items as points in a low-dimensional Euclidean space given a set of constraints like "item $i$ is closer to item $j$ than item $k$". Ordinal constraints like this often come from human judgments. The classic approach to solving this problem is known as non-metric multidimensional scaling. To account for errors and variation in judgments, we consider the noisy situation in which the given constraints are independently corrupted by reversing the correct constraint with some probability. The ordinal embedding problem has been studied for decades, but most past work pays little attention to the question of whether accurate embedding is possible, apart from empirical studies. This paper shows that under a generative data model it is possible to learn the correct embedding from noisy distance comparisons. In establishing this fundamental result, the paper makes several new contributions. First, we derive prediction error bounds for embedding from noisy distance comparisons by exploiting the fact that the rank of a distance matrix of points in $\mathbb{R}^d$ is at most $d+2$. These bounds characterize how well a learned embedding predicts new comparative judgments. Second, we show that the underlying embedding can be recovered by solving a simple convex optimization. This result is highly non-trivial since we show that the linear map corresponding to distance comparisons is non-invertible, but there exists a nonlinear map that is invertible. Third, two new algorithms for ordinal embedding are proposed and evaluated in experiments.

J.l. Ambite - One of the best experts on this subject based on the ideXlab platform.

  • Conditional Constraint networks for interleaved planning and information gathering
    IEEE Intelligent Systems, 2005
    Co-Authors: J.l. Ambite, C.a. Knoblock, M. Muslea, S. Minton
    Abstract:

    We have developed Heracles II, a framework for mixed-initiative planning and information gathering. Heracles II maps the hierarchical task structure of the planning domain into a conditional constraint network. It also ensures correct constraint propagation in the presence of cycles, user interaction, and asynchronous sources. We have applied the Heracles II framework to several domains, including travel planning and geospatial data integration.

Kevin Jamieson - One of the best experts on this subject based on the ideXlab platform.

  • Finite Sample Prediction and Recovery Bounds for Ordinal Embedding
    arXiv: Machine Learning, 2016
    Co-Authors: Lalit Jain, Kevin Jamieson, Robert Nowak
    Abstract:

    The goal of ordinal embedding is to represent items as points in a low-dimensional Euclidean space given a set of constraints in the form of distance comparisons like "item $i$ is closer to item $j$ than item $k$". Ordinal constraints like this often come from human judgments. To account for errors and variation in judgments, we consider the noisy situation in which the given constraints are independently corrupted by reversing the correct constraint with some probability. This paper makes several new contributions to this problem. First, we derive prediction error bounds for ordinal embedding with noise by exploiting the fact that the rank of a distance matrix of points in $\mathbb{R}^d$ is at most $d+2$. These bounds characterize how well a learned embedding predicts new comparative judgments. Second, we investigate the special case of a known noise model and study the maximum likelihood estimator. Third, knowledge of the noise model enables us to relate prediction errors to embedding accuracy. This relationship is highly non-trivial since we show that the linear map corresponding to distance comparisons is non-invertible, but there exists a nonlinear map that is invertible. Fourth, two new algorithms for ordinal embedding are proposed and evaluated in experiments.

  • NIPS - Finite Sample Prediction and Recovery Bounds for Ordinal Embedding
    2016
    Co-Authors: Lalit Jain, Kevin Jamieson, Robert Nowak
    Abstract:

    The goal of ordinal embedding is to represent items as points in a low-dimensional Euclidean space given a set of constraints like "item $i$ is closer to item $j$ than item $k$". Ordinal constraints like this often come from human judgments. The classic approach to solving this problem is known as non-metric multidimensional scaling. To account for errors and variation in judgments, we consider the noisy situation in which the given constraints are independently corrupted by reversing the correct constraint with some probability. The ordinal embedding problem has been studied for decades, but most past work pays little attention to the question of whether accurate embedding is possible, apart from empirical studies. This paper shows that under a generative data model it is possible to learn the correct embedding from noisy distance comparisons. In establishing this fundamental result, the paper makes several new contributions. First, we derive prediction error bounds for embedding from noisy distance comparisons by exploiting the fact that the rank of a distance matrix of points in $\mathbb{R}^d$ is at most $d+2$. These bounds characterize how well a learned embedding predicts new comparative judgments. Second, we show that the underlying embedding can be recovered by solving a simple convex optimization. This result is highly non-trivial since we show that the linear map corresponding to distance comparisons is non-invertible, but there exists a nonlinear map that is invertible. Third, two new algorithms for ordinal embedding are proposed and evaluated in experiments.