Parametrization

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The Experts below are selected from a list of 73,578 Experts worldwide ranked by the ideXlab platform

Xin Zhang - One of the best experts on this subject based on the ideXlab platform.

  • probing the dynamics of dark energy with divergence free Parametrizations a global fit study
    Physics Letters B, 2011
    Co-Authors: Xin Zhang
    Abstract:

    The CPL Parametrization is widely used for investigating the properties of dark energy with observational data. However, the CPL Parametrization can describe only the past evolution of dark energy, not its future evolution, since w(z) diverges in the distant future. In a recent paper [J.Z. Ma, X. Zhang, Phys. Lett. B 699 (2011) 233], a robust, novel Parametrization for dark energy, w(z) = w0 + w1 (ln(2+z)/(1+z) − ln 2), was proposed, successfully avoiding the future divergence problem of the CPL Parametrization. An oscillating Parametrization (motivated by an oscillating quintom model) can also avoid the future divergence problem. In this Letter, we use these two divergence-free Parametrizations to probe the dynamics of dark energy over the whole evolutionary history. In light of the data from the 7-year WMAP temperature and polarization power spectra, the matter power spectrum of SDSS DR7, and the SN Ia Union2 sample, we perform a full Markov Chain Monte Carlo exploration for the two dynamical dark energy models. We find that the best-fit dark energy model is a quintom model whose equation of state (EOS) crosses −1 during the evolution. However, although the quintom model is more favored, the cosmological constant still cannot be excluded.

  • probing the dynamics of dark energy with novel Parametrizations
    Physics Letters B, 2011
    Co-Authors: Xin Zhang
    Abstract:

    We point out that the CPL Parametrization has the problem that its equation of state w(z) diverges in the far future, so this model can properly describe only the past evolution, not the future evolution. To overcome this difficulty, in this Letter we propose two novel Parametrizations for dark energy, the logarithm form w(z) = w0 + w1 (ln(2+z)/(1+z) − ln 2) and the oscillating form w(z) = w0 + w1 (sin(1+z)/(1+z) − sin(1)), which successfully avoid the future divergence problem of the CPL Parametrization, and use them to probe the dynamics of dark energy over the whole evolutionary history. Our divergence-free Parametrizations prove very successful in exploring the dynamical evolution of dark energy and have powerful predictive capability for the ultimate fate of the universe. Constraining the CPL model and the new models with current observational data, we show that the new models are more favored. The features of the new models and their predictions for the future evolution are discussed in detail.
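The divergence behavior described in the two abstracts above is easy to check numerically. The sketch below compares the standard CPL form, w(z) = w0 + w1 z/(1+z), with the two divergence-free forms as z approaches −1 (the far-future limit); the parameter values w0 = −1, w1 = 0.5 are illustrative choices, not fitted values from the papers.

```python
import math

def w_cpl(z, w0=-1.0, w1=0.5):
    # Chevallier-Polarski-Linder: w(z) = w0 + w1 * z / (1 + z)
    return w0 + w1 * z / (1.0 + z)

def w_log(z, w0=-1.0, w1=0.5):
    # Logarithm form: w(z) = w0 + w1 * (ln(2+z)/(1+z) - ln 2)
    return w0 + w1 * (math.log(2.0 + z) / (1.0 + z) - math.log(2.0))

def w_osc(z, w0=-1.0, w1=0.5):
    # Oscillating form: w(z) = w0 + w1 * (sin(1+z)/(1+z) - sin 1)
    return w0 + w1 * (math.sin(1.0 + z) / (1.0 + z) - math.sin(1.0))

# All three agree at z = 0 (w = w0) by construction.
print(w_cpl(0.0), w_log(0.0), w_osc(0.0))

# Approaching z = -1 (the distant future), CPL diverges ...
print(w_cpl(-1 + 1e-6))   # large negative, diverging as z -> -1

# ... while both new forms tend to finite limits, since
# ln(2+z)/(1+z) -> 1 and sin(1+z)/(1+z) -> 1 as z -> -1:
# w_log -> w0 + w1*(1 - ln 2),  w_osc -> w0 + w1*(1 - sin 1).
print(w_log(-1 + 1e-9), w_osc(-1 + 1e-9))  # finite
```

This is exactly the "future divergence problem": the CPL w(z) blows up in the limit, while the logarithm and oscillating forms remain bounded over the whole evolutionary history.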

Ruth Lazkoz - One of the best experts on this subject based on the ideXlab platform.

  • supernova and baryon acoustic oscillation constraints on new polynomial dark energy Parametrizations current results and forecasts
    Monthly Notices of the Royal Astronomical Society, 2012
    Co-Authors: Irene Sendra, Ruth Lazkoz
    Abstract:

    In this work we introduce two new polynomial Parametrizations of dark energy and explore their correlation properties. The parameters to fit are the equation of state values at z = 0 and z = 0.5, which have naturally low correlation and have already been shown to improve the popular Chevallier-Polarski-Linder (CPL) Parametrization. We test our models with low redshift astronomical probes: type Ia supernovae and baryon acoustic oscillations (BAO), in the form of both current and synthetic data. Specifically, we present simulations of measurements of the radial and transversal BAO scales similar to those expected in a BAO high precision spectroscopic redshift survey similar to EUCLID. According to the Bayesian deviance information criterion (DIC), which penalizes large errors and correlations, we show that our models perform better than the CPL re-Parametrization proposed by Wang (in terms of z = 0 and z = 0.5). This is due to the combination of a lower correlation and smaller relative errors. The same holds for a frequentist perspective: our Figure-of-Merit is larger for our Parametrizations.

  • sn and bao constraints on new polynomial dark energy Parametrizations current results and forecasts
    arXiv: Cosmology and Nongalactic Astrophysics, 2011
    Co-Authors: Irene Sendra, Ruth Lazkoz
    Abstract:

    In this work we introduce two new polynomial Parametrizations of dark energy and explore their correlation properties. The parameters to fit are the equation of state values at z=0 and z=0.5, which have naturally low correlation and have already been shown to improve the popular Chevallier-Polarski-Linder (CPL) Parametrization. We test our models with low redshift astronomical probes: type Ia supernovae and baryon acoustic oscillations (BAO), in the form of both current and synthetic data. Specifically, we present simulations of measurements of the radial and transversal BAO scales similar to those expected in a BAO high precision spectroscopic redshift survey similar to EUCLID. According to the Bayesian deviance information criterion (DIC), which penalizes large errors and correlations, we show that our models perform better than the CPL re-Parametrization proposed by Wang (in terms of z=0 and z=0.5). This is due to the combination of a lower correlation and smaller relative errors. The same holds for a frequentist perspective: our Figure-of-Merit is larger for our Parametrizations.
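Both versions of the abstract above rank models with the Bayesian deviance information criterion. A minimal sketch of how DIC is assembled from posterior samples follows; the deviance numbers are entirely made up for illustration and are not values from the paper.

```python
def dic(deviance_samples, deviance_at_mean):
    """Deviance information criterion: DIC = D_bar + p_D, where
    D_bar is the posterior mean deviance (D = -2 ln L) and
    p_D = D_bar - D(theta_bar) is the effective number of parameters."""
    d_bar = sum(deviance_samples) / len(deviance_samples)
    p_d = d_bar - deviance_at_mean
    return d_bar + p_d

# Toy illustration with made-up deviance chains: a parametrization with
# smaller errors and correlations yields a lower mean deviance and fewer
# effective parameters, hence a lower (preferred) DIC.
model_a = dic([102.0, 104.0, 103.0, 101.0], deviance_at_mean=100.5)
model_b = dic([100.2, 100.8, 100.4, 100.6], deviance_at_mean=100.0)
print(model_a, model_b)  # the model with the lower DIC is preferred
```

This is how DIC "penalizes large errors and correlations": broad, correlated posteriors inflate both the mean deviance and p_D.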

  • exploring cosmological expansion Parametrizations with the gold snia data set
    Journal of Cosmology and Astroparticle Physics, 2005
    Co-Authors: Ruth Lazkoz, S Nesseris, Leandros Perivolaropoulos
    Abstract:

    We use the SnIa gold data set to compare LCDM (lambda cold dark matter) with ten representative Parametrizations of the recent Hubble expansion history H(z). For the comparison we use two statistical tests: the usual χ²_min and a statistical test that we call the p-test, which depends on both the value of χ²_min and the number n of Parametrization parameters. The p-test measures the confidence level to which the parameter values corresponding to LCDM are excluded from the viewpoint of the Parametrization tested. For example, for a linear equation of state Parametrization w(z) = w0+w1z, the LCDM parameter values (w0 = −1, w1 = 0) are excluded at the 75% confidence level. We use a flat prior and Ω0m = 0.3. All Parametrizations tested are consistent with the gold data set at their best fit. According to both statistical tests, the worst fits among the ten Parametrizations correspond to the Chaplygin gas, the brane world and the Cardassian Parametrizations. The best fit is achieved by oscillating Parametrizations which can exclude the parameter values corresponding to LCDM at the 85% confidence level. Even though this level of significance does not provide a statistically significant exclusion of LCDM (it is less than 2σ) and does not by itself constitute conclusive evidence for oscillations in the cosmological expansion, when combined with similar independent recent evidence for oscillations coming from the cosmic microwave background and matter power spectra, it becomes an issue worthy of further investigation.
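Confidence levels like the quoted 75% can be obtained by converting Δχ² = χ²(LCDM) − χ²_min into a chi-square CDF value with n degrees of freedom. The sketch below is one plausible reading of such a test, not necessarily the authors' exact p-test statistic; it uses the closed-form chi-square CDF available for even n (stdlib only).

```python
import math

def exclusion_confidence(delta_chi2, n_params=2):
    """Confidence level at which a reference point (e.g. LCDM: w0=-1, w1=0)
    is excluded, as the chi-square CDF of delta_chi2 with n_params dof.
    Closed form valid for even n_params only:
    P(chi2_n <= x) = 1 - exp(-x/2) * sum_{k=0}^{n/2-1} (x/2)^k / k!"""
    assert n_params % 2 == 0, "closed form implemented for even dof only"
    half = delta_chi2 / 2.0
    tail = sum(half**k / math.factorial(k) for k in range(n_params // 2))
    return 1.0 - math.exp(-half) * tail

# For the linear EOS w(z) = w0 + w1*z (two parameters), an exclusion at
# 75% confidence corresponds to delta_chi2 = -2*ln(0.25) ~ 2.77:
print(exclusion_confidence(2.77, 2))  # ~0.75
```

For n = 2 this reduces to 1 − exp(−Δχ²/2), which is why a modest Δχ² already yields a sizable, but sub-2σ, exclusion level.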

  • exploring cosmological expansion Parametrizations with the gold snia dataset
    arXiv: Astrophysics, 2005
    Co-Authors: Ruth Lazkoz, S Nesseris, Leandros Perivolaropoulos
    Abstract:

    We use the SnIa Gold dataset to compare LCDM with 10 representative Parametrizations of the recent Hubble expansion history $H(z)$. For the comparison we use two statistical tests: the usual $\chi_{min}^2$, which is insensitive to the number of Parametrization parameters, and a statistic we call the p-test, which depends on both the value of $\chi_{min}^2$ and the number $n$ of Parametrization parameters. The p-test measures the confidence level to which the parameter values corresponding to LCDM are excluded from the viewpoint of the Parametrization tested. For example, for a linear equation of state Parametrization $w(z)=w_0 + w_1 z$ the LCDM parameter values ($w_0=-1$, $w_1=0$) are excluded at 75% confidence level. We use a flat prior and $\Omega_{0m}=0.3$. All Parametrizations tested are consistent with the Gold dataset at their best fit. According to both statistical tests, the worst fits among the 10 Parametrizations correspond to the Chaplygin gas, the brane world and the Cardassian Parametrizations. The best fit is achieved by oscillating Parametrizations which can exclude the parameter values corresponding to LCDM at 85% confidence level. Even though this level of significance does not provide a statistically significant exclusion of LCDM (it is less than $2\sigma$) and does not by itself constitute conclusive evidence for oscillations in the cosmological expansion, when combined with similar independent recent evidence for oscillations coming from the CMB and matter power spectra it becomes an issue worthy of further investigation.

Joan Solà - One of the best experts on this subject based on the ideXlab platform.

  • Impact of landmark Parametrization on monocular EKF-SLAM with points and lines
    International Journal of Computer Vision, 2011
    Co-Authors: Joan Solà, Teresa Vidal-calleja, Javier Civera, Jose Maria Martinez-monti
    Abstract:

    This paper explores the impact that landmark Parametrization has on the performance of monocular, EKF-based, 6-DOF simultaneous localization and mapping (SLAM) in the context of undelayed landmark initialization. Undelayed initialization in monocular SLAM challenges the EKF because of the combination of non-linearity with the large uncertainty associated with the unmeasured degrees of freedom. In the EKF context, the goal of a good landmark Parametrization is to improve the model's linearity as much as possible, improving the filter consistency and achieving more robust and accurate localization and mapping. This work compares the performances of eight different landmark Parametrizations: three for points and five for straight lines. It highlights and justifies the keys for satisfactory operation: the use of parameters behaving proportionally to inverse-distance, and landmark anchoring. A unified EKF-SLAM framework is formulated as a benchmark for points and lines that is independent of the Parametrization used. The paper also defines a generalized linearity index suited for the EKF, and uses it to compute and compare the degrees of linearity of each Parametrization. Finally, all eight Parametrizations are benchmarked employing analytical tools (the linearity index) and statistical tools (based on Monte Carlo error and consistency analyses), with simulations and real imagery data, using the standard and the robocentric EKF-SLAM formulations.
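The two keys singled out above, inverse-distance behavior and anchoring, can be illustrated with the standard anchored inverse-depth point conversion used in monocular SLAM. The angle convention below is one common choice; conventions differ between papers, so treat it as an assumption of this sketch.

```python
import math

def idp_to_euclidean(anchor, azimuth, elevation, rho):
    """Convert an anchored inverse-distance point (IDP) landmark to a
    Euclidean 3-D point: p = anchor + (1/rho) * m(azimuth, elevation),
    where rho is the inverse distance and m is a unit direction vector.
    Angle convention (one common choice, assumed here):
    m = (cos(el)*sin(az), -sin(el), cos(el)*cos(az))."""
    m = (math.cos(elevation) * math.sin(azimuth),
         -math.sin(elevation),
         math.cos(elevation) * math.cos(azimuth))
    return tuple(a + mi / rho for a, mi in zip(anchor, m))

# A landmark 10 m straight ahead of an anchor at the origin
# (inverse distance rho = 0.1 per meter):
p = idp_to_euclidean((0.0, 0.0, 0.0), azimuth=0.0, elevation=0.0, rho=0.1)
print(p)  # a point ~10 m along the optical axis
```

The point of parametrizing with rho rather than depth is that distant (even near-infinite) landmarks map to small rho values near zero, where the measurement model stays close to linear, which is what the paper's linearity index quantifies.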

  • consistency of the monocular ekf slam algorithm for three different landmark Parametrizations
    International Conference on Robotics and Automation, 2010
    Co-Authors: Joan Solà
    Abstract:

    We benchmark in this article three different landmark Parametrizations in monocular 6DOF EKF-SLAM. These Parametrizations are homogeneous points (HP), inverse-distance points (IDP, better known as inverse-depth), and the new anchored homogeneous points (AHP). The description used for them is chosen to highlight their differences and similarities, showing that they are just incremental variations of one another. We show for the first time a complete comparison of HP against IDP, two methods that are gaining popularity, and also introduce for the first time AHP, whose description falls precisely between the other two. The benchmarking is done by running all algorithms on the same data and by using the well-established NEES consistency analysis. Our conclusion is that the new AHP Parametrization is the most interesting one for monocular EKF-SLAM (followed by IDP and then HP) because it greatly postpones the onset of EKF inconsistency.
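The NEES consistency analysis mentioned above reduces to a simple statistic: the normalized estimation error squared, e^T P^{-1} e, which for a consistent filter is chi-square distributed with dim(e) degrees of freedom. A 2-D sketch follows (the 6-DOF SLAM case is identical in form, just higher-dimensional).

```python
def nees(error, cov):
    """Normalized estimation error squared for a 2-D state:
    NEES = e^T P^{-1} e. For a consistent filter this is chi-square
    distributed with dim(e) degrees of freedom (expected value 2 here).
    The 2x2 covariance is inverted in closed form."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    ex, ey = error
    return (ex * (inv[0][0] * ex + inv[0][1] * ey)
            + ey * (inv[1][0] * ex + inv[1][1] * ey))

# An error consistent with the filter's covariance gives NEES near dim(e):
print(nees((0.1, -0.1), ((0.01, 0.0), (0.0, 0.01))))   # close to 2
# An overconfident filter (covariance too small) inflates NEES,
# which is exactly the EKF inconsistency the benchmark detects:
print(nees((0.1, -0.1), ((0.001, 0.0), (0.0, 0.001))))  # close to 20
```

"Postponing the apparition of inconsistency" then means keeping the average NEES inside its chi-square bounds for longer as the trajectory grows.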

Jose Maria Martinez-monti - One of the best experts on this subject based on the ideXlab platform.

  • Impact of landmark Parametrization on monocular EKF-SLAM with points and lines
    International Journal of Computer Vision, 2011
    Co-Authors: Joan Solà, Teresa Vidal-calleja, Javier Civera, Jose Maria Martinez-monti
    Abstract:

    This paper explores the impact that landmark Parametrization has on the performance of monocular, EKF-based, 6-DOF simultaneous localization and mapping (SLAM) in the context of undelayed landmark initialization. Undelayed initialization in monocular SLAM challenges the EKF because of the combination of non-linearity with the large uncertainty associated with the unmeasured degrees of freedom. In the EKF context, the goal of a good landmark Parametrization is to improve the model's linearity as much as possible, improving the filter consistency and achieving more robust and accurate localization and mapping. This work compares the performances of eight different landmark Parametrizations: three for points and five for straight lines. It highlights and justifies the keys for satisfactory operation: the use of parameters behaving proportionally to inverse-distance, and landmark anchoring. A unified EKF-SLAM framework is formulated as a benchmark for points and lines that is independent of the Parametrization used. The paper also defines a generalized linearity index suited for the EKF, and uses it to compute and compare the degrees of linearity of each Parametrization. Finally, all eight Parametrizations are benchmarked employing analytical tools (the linearity index) and statistical tools (based on Monte Carlo error and consistency analyses), with simulations and real imagery data, using the standard and the robocentric EKF-SLAM formulations.

Pierrealexandre Beaufort - One of the best experts on this subject based on the ideXlab platform.

  • automatic surface mesh generation for discrete models a complete and automatic pipeline based on reParametrization
    Journal of Computational Physics, 2020
    Co-Authors: Christophe Geuzaine, Pierrealexandre Beaufort, Jeanfrancois Remacle
    Abstract:

    Triangulations are a ubiquitous input for the finite element community. However, most raw triangulations obtained by imaging techniques are unsuitable as-is for finite element analysis. In this paper, we give a robust pipeline for handling those triangulations, based on the computation of a one-to-one Parametrization for automatically selected patches of input triangles, which makes each patch amenable to remeshing by standard finite element meshing algorithms. Using only geometrical arguments, we prove that a discrete Parametrization of a patch is one-to-one if (and only if) its image in the parameter space is such that all parametric triangles have a positive area. We then derive a non-standard linear discretization scheme based on mean value coordinates to compute such one-to-one Parametrizations, and show that the scheme does not discretize a Laplacian on a structured mesh. The proposed pipeline is implemented in the open source mesh generator Gmsh, where the creation of suitable patches is based on triangulation topology and Parametrization quality, combined with feature edge detection. Several examples illustrate the robustness of the resulting implementation.
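The positive-area criterion stated in the abstract is straightforward to check on a discrete Parametrization; a minimal 2-D sketch follows (an illustration of the criterion, not Gmsh's actual implementation).

```python
def signed_area(tri):
    """Twice the signed area of a 2-D parametric triangle
    ((x0, y0), (x1, y1), (x2, y2)); positive when counter-clockwise."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    return (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)

def parametrization_is_one_to_one(param_triangles):
    """Per the paper's criterion: a discrete Parametrization of a patch
    is one-to-one iff every image triangle in the parameter plane has
    positive area (no flipped or degenerate triangles)."""
    return all(signed_area(t) > 0.0 for t in param_triangles)

# Two triangles sharing an edge, both counter-clockwise: injective map.
good = [((0, 0), (1, 0), (0, 1)), ((1, 0), (1, 1), (0, 1))]
# Reordering one triangle's vertices flips its orientation (negative
# area), which folds the map over itself: not injective.
bad = [((0, 0), (1, 0), (0, 1)), ((1, 0), (0, 1), (1, 1))]
print(parametrization_is_one_to_one(good))  # True
print(parametrization_is_one_to_one(bad))   # False
```

A patch failing this test is exactly one that cannot be safely remeshed in parameter space, which is why the pipeline splits the input into patches until each admits a fold-free Parametrization.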