Minimax

14,000,000 Leading Edge Experts on the ideXlab platform

The Experts below are selected from a list of 315 Experts worldwide ranked by ideXlab platform

Hwajeong Kim - One of the best experts on this subject based on the ideXlab platform.

  • Unstable minimal surfaces of annulus type in manifolds
    Advances in Geometry, 2009
    Co-Authors: Hwajeong Kim
    Abstract:

    Unstable minimal surfaces are the unstable stationary points of the Dirichlet integral. In order to obtain unstable solutions, the method of the gradient flow together with the Minimax principle is generally used, an application of which was presented in [Struwe, J. Reine Angew. Math. 349: 1–23, 1984] for minimal surfaces in Euclidean space. We extend this theory to obtain unstable minimal surfaces in Riemannian manifolds. In particular, we consider minimal surfaces of annulus type.

  • Unstable minimal surfaces of annulus type in manifolds
    arXiv: Differential Geometry, 2006
    Co-Authors: Hwajeong Kim
    Abstract:

    Unstable minimal surfaces are the unstable stationary points of the Dirichlet integral. In order to obtain unstable solutions, the method of the gradient flow together with the Minimax principle is generally used. The application of this method for minimal surfaces in Euclidean space was presented in \cite{s3}. We extend this theory to obtain unstable minimal surfaces in Riemannian manifolds. In particular, we handle minimal surfaces of annulus type, i.e. we prescribe two Jordan curves of class $C^3$ in a Riemannian manifold and prove the existence of unstable minimal surfaces of annulus type bounded by these curves.
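
For orientation, the Minimax principle referred to in these abstracts can be sketched in the standard mountain-pass form (notation ours, not the papers'): given the Dirichlet integral $E$ and a suitable family $\Gamma$ of paths $\gamma$ connecting two stable (minimizing) solutions, one sets

```latex
\beta \;=\; \inf_{\gamma \in \Gamma} \; \sup_{t \in [0,1]} E(\gamma(t)).
```

If $\beta$ strictly exceeds the values of $E$ at the path endpoints and a compactness condition of Palais–Smale type holds, then $\beta$ is a critical value of $E$, and a critical point at level $\beta$ is the desired unstable solution.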

Sergio Verdu - One of the best experts on this subject based on the ideXlab platform.

  • Minimax Rényi Redundancy
    IEEE Transactions on Information Theory, 2018
    Co-Authors: Semih Yagli, Yucel Altug, Sergio Verdu
    Abstract:

    The redundancy for universal lossless compression of discrete memoryless sources in Campbell’s setting is characterized as a Minimax Rényi divergence, which is shown to be equal to the maximal $\alpha$-mutual information via a generalized redundancy-capacity theorem. Special attention is placed on the analysis of the asymptotics of Minimax Rényi divergence, which is determined up to a term vanishing in blocklength.

  • Minimax Rényi Redundancy
    International Symposium on Information Theory, 2017
    Co-Authors: Semih Yagli, Yucel Altug, Sergio Verdu
    Abstract:

    The redundancy for universal lossless compression in Campbell's setting is characterized as a Minimax Rényi divergence, which is shown to be equal to the maximal α-mutual information via a generalized redundancy-capacity theorem. Special attention is placed on the analysis of the asymptotics of Minimax Rényi divergence, which is determined up to a term vanishing in blocklength.
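
The Rényi divergence underlying this characterization is easy to sketch numerically. The helper below is illustrative only (the function names are ours, not from the papers); it computes $D_\alpha(P\|Q)$ for finite distributions and shows that as $\alpha \to 1$ it recovers the Kullback–Leibler divergence.

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P||Q) in nats, for finite distributions:
    D_alpha = (1/(alpha-1)) * log( sum_x p(x)^alpha * q(x)^(1-alpha) )."""
    assert abs(sum(p) - 1) < 1e-12 and abs(sum(q) - 1) < 1e-12
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of D_alpha."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
# As alpha -> 1, the Renyi divergence approaches the KL divergence.
print(renyi_divergence(p, q, 0.999), kl_divergence(p, q))
```

Rényi divergence is nondecreasing in $\alpha$, which is one reason the Minimax characterization above depends on the order $\alpha$.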

Peter Rudlof - One of the best experts on this subject based on the ideXlab platform.

  • On Minimax and related modules
    Canadian Journal of Mathematics, 1992
    Co-Authors: Peter Rudlof
    Abstract:

    A module M is called a Minimax module if it has a finitely generated submodule U such that M/U is Artinian. This paper investigates Minimax modules and some generalized classes over commutative Noetherian rings. One of our main results is: M is Minimax iff every decomposition of a homomorphic image of M is finite. From this we deduce that:
    - all couniform modules are Minimax;
    - all modules of finite codimension are Minimax;
    - essential covers of Minimax modules are Minimax.
    With the aid of these corollaries we completely determine the structure of couniform modules and modules of finite codimension. We then examine the following variants of the Minimax property:
    - replace “U finitely generated” by “U coatomic” (i.e. every proper submodule of U is contained in a maximal submodule);
    - replace “M/U Artinian” by “M/U semi-Artinian” (i.e. every proper submodule of M/U contains a minimal submodule).
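
A standard illustrative example, not taken from the paper: over $R = \mathbb{Z}$ (a commutative Noetherian ring), the module $M = \mathbb{Z}[1/p]$ is Minimax:

```latex
U = \mathbb{Z} \subseteq M = \mathbb{Z}[1/p], \qquad
M/U \;\cong\; \mathbb{Z}(p^{\infty}) \ \text{(the Pr\"ufer $p$-group, which is Artinian)}.
```

Here $U$ is finitely generated and $M/U$ is Artinian, so $M$ is Minimax even though $M$ itself is neither finitely generated nor Artinian.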

Alexander Rakhlin - One of the best experts on this subject based on the ideXlab platform.

  • Empirical Entropy, Minimax Regret and Minimax Risk
    Bernoulli, 2017
    Co-Authors: Alexander Rakhlin, Karthik Sridharan, Alexandre B. Tsybakov
    Abstract:

    We consider the random design regression model with square loss. We propose a method that aggregates empirical minimizers (ERM) over appropriately chosen random subsets and reduces to ERM in the extreme case, and we establish sharp oracle inequalities for its risk. We show that, under the $\varepsilon^{-p}$ growth of the empirical $\varepsilon$-entropy, the excess risk of the proposed method attains the rate $n^{-2/(2+p)}$ for $p\in(0,2)$ and $n^{-1/p}$ for $p>2$ where $n$ is the sample size. Furthermore, for $p\in(0,2)$, the excess risk rate matches the behavior of the Minimax risk of function estimation in regression problems under the well-specified model. This yields a conclusion that the rates of statistical estimation in well-specified models (Minimax risk) and in misspecified models (Minimax regret) are equivalent in the regime $p\in(0,2)$. In other words, for $p\in(0,2)$ the problem of statistical learning enjoys the same Minimax rate as the problem of statistical estimation. On the contrary, for $p>2$ we show that the rates of the Minimax regret are, in general, slower than for the Minimax risk. Our oracle inequalities also imply the $v\log(n/v)/n$ rates for Vapnik-Chervonenkis type classes of dimension $v$ without the usual convexity assumption on the class; we show that these rates are optimal. Finally, for a slightly modified method, we derive a bound on the excess risk of $s$-sparse convex aggregation improving that of Lounici [Math. Methods Statist. 16 (2007) 246-259] and providing the optimal rate.
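
The regime switch in the rates quoted above can be checked by simple arithmetic (a sanity check of ours, not part of the paper): the two exponents agree at the boundary $p = 2$, and for $p > 2$ the exponent $1/p$ is strictly smaller, i.e. the rate $n^{-1/p}$ decays strictly slower.

```python
# Excess-risk exponents from the abstract: the rate is n ** (-exponent),
# so a smaller exponent means a slower rate.
def exponent(p):
    """Excess-risk exponent under epsilon**(-p) empirical entropy growth."""
    return 2 / (2 + p) if p <= 2 else 1 / p

# The two regimes match at the boundary p = 2 ...
assert exponent(2) == 0.5
# ... and for p > 2 the exponent 1/p is strictly below 2/(2+p),
# consistent with the slower rates reported in that regime.
for p in (3, 4, 10):
    assert 1 / p < 2 / (2 + p)
print("exponent at p = 1:", exponent(1))
```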

  • A Stochastic View of Optimal Regret through Minimax Duality
    Conference on Learning Theory, 2009
    Co-Authors: Jacob Abernethy, Alekh Agarwal, Peter L Bartlett, Alexander Rakhlin
    Abstract:

    We study the regret of optimal strategies for online convex optimization games. Using von Neumann's Minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional (the minimizer over the player's actions of expected loss) defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary.

  • A Stochastic View of Optimal Regret through Minimax Duality
    arXiv: Learning, 2009
    Co-Authors: Jacob Abernethy, Alekh Agarwal, Peter L Bartlett, Alexander Rakhlin
    Abstract:

    We study the regret of optimal strategies for online convex optimization games. Using von Neumann's Minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional (the minimizer over the player's actions of expected loss) defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary.
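
Von Neumann's Minimax theorem, which these papers build on, is easy to illustrate in its simplest finite form. The sketch below (function name and grid search ours, not the papers' method) approximates the game value $\max_p \min_j \sum_i p_i A_{ij}$ for a two-row zero-sum matrix game by searching the row player's mixed strategies on a grid.

```python
from fractions import Fraction

def minimax_value(A, grid=100):
    """Approximate max over mixed strategies p of min_j sum_i p_i * A[i][j]
    for a 2-row zero-sum game, by grid search over p = (p, 1 - p)."""
    best = None
    for k in range(grid + 1):
        p = Fraction(k, grid)  # probability of playing row 0
        worst = min(p * A[0][j] + (1 - p) * A[1][j] for j in range(len(A[0])))
        best = worst if best is None else max(best, worst)
    return best

# Matching pennies: the equilibrium mixes 50/50 and the game value is 0.
A = [[1, -1], [-1, 1]]
print(minimax_value(A))  # -> 0
```

By the Minimax theorem this max-min value equals the column player's min-max value, which is the duality the regret analysis above exploits in an infinite-dimensional setting.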

Feng Liang - One of the best experts on this subject based on the ideXlab platform.

  • From Minimax Shrinkage Estimation to Minimax Shrinkage Prediction
    Statistical Science, 2012
    Co-Authors: Edward I. George, Feng Liang
    Abstract:

    In a remarkable series of papers beginning in 1956, Charles Stein set the stage for the future development of Minimax shrinkage estimators of a multivariate normal mean under quadratic loss. More recently, parallel developments have seen the emergence of Minimax shrinkage estimators of multivariate normal predictive densities under Kullback–Leibler risk. We here describe these parallels, emphasizing the focus on Bayes procedures and the derivation of the superharmonic conditions for Minimaxity, as well as further developments of new Minimax shrinkage predictive density estimators, including multiple shrinkage estimators, empirical Bayes estimators, normal linear model regression estimators, and nonparametric regression estimators.
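
Stein's phenomenon, the starting point of this line of work, can be illustrated with a textbook Monte Carlo comparison (our sketch, not one of the paper's estimators): for $X \sim N(\theta, I_d)$ with $d \ge 3$, the James–Stein estimator $(1 - (d-2)/\|x\|^2)\,x$ has uniformly smaller quadratic risk than the MLE $x$ itself.

```python
import random

def js_vs_mle_risk(theta, n_reps=4000, seed=0):
    """Monte Carlo squared-error risk of the MLE x versus the
    James-Stein estimator (1 - (d-2)/||x||^2) * x for X ~ N(theta, I_d)."""
    rng = random.Random(seed)
    d = len(theta)
    mle_loss = js_loss = 0.0
    for _ in range(n_reps):
        x = [t + rng.gauss(0.0, 1.0) for t in theta]
        norm_sq = sum(xi * xi for xi in x)
        shrink = 1.0 - (d - 2) / norm_sq
        js = [shrink * xi for xi in x]
        mle_loss += sum((xi - t) ** 2 for xi, t in zip(x, theta))
        js_loss += sum((ji - t) ** 2 for ji, t in zip(js, theta))
    return mle_loss / n_reps, js_loss / n_reps

# At theta = 0 with d = 10, the MLE's risk is about d = 10 while the
# James-Stein risk drops to about 2 -- the gain from shrinkage is largest
# near the shrinkage target.
mle_risk, js_risk = js_vs_mle_risk([0.0] * 10)
print(mle_risk, js_risk)
```

Since the MLE is itself Minimax here, the dominating James–Stein estimator is Minimax as well, which is the pattern the shrinkage-prediction results above extend to predictive densities.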