V-Model


Thomas Cartier-Michaud - One of the best experts on this subject based on the ideXlab platform.

  • A gyro-kinetic model for trapped electron and ion modes
    The European Physical Journal D, 2014
    Co-Authors: Thomas Drouot, Etienne Gravier, T. Réveillé, Pierre Bertrand, Alain Ghizzo, Y. Sarazin, Xavier Garbet, Thomas Cartier-Michaud
    Abstract:

    In tokamak plasmas, ion temperature gradient (ITG) instabilities and trapped electron modes (TEM) are held responsible for the turbulence giving rise to anomalous transport. The present work focuses on building a model that includes both trapped kinetic ions and trapped kinetic electrons. For this purpose, the dimensionality is reduced by averaging the motion over the cyclotron gyration and the “banana” orbits, which is justified because the instabilities of interest are characterized by frequencies of the order of the low trapped-particle precession frequency. Moreover, a set of action-angle variables is used. The final model is 4D: a two-dimensional phase space parametrized by the first two adiabatic invariants, namely the particle energy and the trapping parameter. The trapped ion and electron modes (TIM and TEM) are then studied through a linear analysis of the model. This work is being carried out in order to include trapped electrons in an existing semi-Lagrangian code in which TIM modes are already taken into account, and it can be considered a first step towards including kinetic trapped electrons in the 5D gyrokinetic code GYSELA [J. Abiteboul et al., ESAIM Proc. 32, 103 (2011)].
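
For orientation, the reduced model described in this abstract can be written schematically as a bounce-averaged Vlasov equation for the banana-center distribution of each trapped species, closed by a quasineutrality condition. The form below is only an illustrative sketch under those assumptions: the symbols (f̄_s, Φ̄, Ω_d, J_{0,s}, Z_s, E, κ, α, ψ) and the polarisation closure are generic placeholders, not the paper's exact operators or normalisation.

```latex
% Schematic sketch (not the paper's exact normalisation): bounce-averaged
% Vlasov equation for the banana-center distribution f_s(psi, alpha; E, kappa)
% of a trapped species s, advected by the gyro-bounce-averaged ExB drift and by
% the energy-dependent toroidal precession, closed by quasineutrality.
\begin{align}
  \frac{\partial \bar{f}_s}{\partial t}
  - \left[ \mathcal{J}_{0,s}\,\bar{\Phi},\, \bar{f}_s \right]_{\alpha,\psi}
  + \frac{\Omega_d(\kappa)\,E}{Z_s}\,
    \frac{\partial \bar{f}_s}{\partial \alpha} &= 0, \\
  \sum_s Z_s \int \mathrm{d}E\,\mathrm{d}\kappa \;
    \mathcal{J}_{0,s}\,\bar{f}_s \;+\; \text{(polarisation terms)} &= 0 .
\end{align}
% The invariants E and kappa enter only as parameters, so the dynamics is
% effectively 2D in (alpha, psi) -- hence the "4D" model quoted above.
```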

Kerk L Phillips - One of the best experts on this subject based on the ideXlab platform.

  • Integrating microsimulation models of tax policy into a DGE macroeconomic model
    Public Finance Review, 2019
    Co-Authors: Jason Matthew DeBacker, Richard Evans, Kerk L Phillips
    Abstract:

    This article proposes a method for integrating individual effective tax rates and marginal tax rates computed from a microsimulation (partial equilibrium) model of tax policy with a dynamic general equilibrium model of tax policy that can provide macroeconomic analysis or dynamic scores of tax reforms. Our approach captures the rich heterogeneity, realistic demographics, and tax-code detail of the microsimulation model and allows this detail to inform a general equilibrium model with a relatively high degree of heterogeneity. In addition, we propose a functional form in which tax rates depend jointly on the levels of both capital income and labor income.
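
As a rough illustration of the final point, the sketch below fits a smooth tax-rate surface tau(x, y), depending jointly on labor income x and capital income y, to synthetic microsimulation-style output. The functional form, the synthetic data, and all names here are assumptions chosen for illustration; they are not the parameterisation or estimation procedure used in the article.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical microsimulation output: labor income x, capital income y, and
# an effective tax rate per record (stand-in data, not real tax-calculator output).
rng = np.random.default_rng(0)
x = rng.lognormal(mean=10.5, sigma=0.8, size=5000)   # labor income
y = rng.lognormal(mean=9.0, sigma=1.2, size=5000)    # capital income
etr = 0.35 - 0.30 * np.exp(-(x + 0.7 * y) / 60_000)  # synthetic "data"

def tau(params, x, y):
    """Illustrative smooth tax-rate surface in both income types.

    tau(x, y) = max_rate - (max_rate - min_rate) * exp(-(x + w*y) / s)
    This is NOT the article's functional form, just a placeholder that
    depends jointly on labor and capital income.
    """
    min_rate, max_rate, w, s = params
    return max_rate - (max_rate - min_rate) * np.exp(-(x + w * y) / s)

def residuals(params):
    return tau(params, x, y) - etr

fit = least_squares(residuals, x0=[0.0, 0.4, 1.0, 50_000.0])
print("fitted parameters:", fit.x)

# A fitted tau(x, y) (and an analogous marginal-rate surface) could then be
# embedded in the DGE households' budget constraints, so the general-equilibrium
# model faces tax rates consistent with the microsimulation detail.
```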

Daniel W Apley - One of the best experts on this subject based on the ideXlab platform.

  • A better understanding of model updating strategies in validating engineering models
    Computer Methods in Applied Mechanics and Engineering, 2009
    Co-Authors: Ying Xiong, Wei Chen, Kwok-Leung Tsui, Daniel W Apley
    Abstract:

    Our objective in this work is to provide a better understanding of the various model updating strategies that utilize mathematical means to update a computer model based on both physical and computer observations. We examine different model updating formulations, e.g. calibration and bias-correction, as well as different solution methods. Traditional approaches to calibration treat certain computer model parameters as fixed over the physical experiment, but unknown, and the objective is to infer values for the so-called calibration parameters that provide a better match between the physical and computer data. In many practical applications, however, certain computer model parameters vary from trial to trial over the physical experiment, in which case there is no single calibrated value for a parameter. We pay particular attention to this situation and develop a maximum likelihood estimation (MLE) approach for estimating the distributional properties of the randomly varying parameters which, in a sense, calibrates them to provide the best agreement between physical and computer observations. Furthermore, we employ the newly developed u-pooling method (by Ferson et al.) as a validation metric to assess the accuracy of an updated model over a region of interest. Using the benchmark thermal challenge problem as an example, we study several possible model updating formulations using the proposed methodology. The effectiveness of the various formulations is examined. The benefits and limitations of using the MLE method versus a Bayesian approach are presented. Our study also provides insights into the potential benefits and limitations of using model updating for improving the predictive capability of a model.
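
A minimal sketch of the MLE idea for a trial-to-trial varying parameter follows, assuming a toy scalar computer model that is linear in the varying parameter and a Gaussian distribution for it. The model m, the generated data, and the variance decomposition are invented for illustration; they are not the thermal-challenge-problem setup or the paper's estimator.

```python
import numpy as np
from scipy.optimize import minimize

# Toy computer model m(x, theta).  In the paper's setting, theta varies from
# trial to trial over the physical experiment, so no single calibrated value
# exists; instead we estimate the parameters of its distribution.
def m(x, theta):
    return theta * np.sqrt(x) + 2.0

# Hypothetical physical observations y_i at inputs x_i, generated with
# theta_i ~ N(1.5, 0.2^2) plus measurement noise (illustration only).
rng = np.random.default_rng(1)
x_obs = rng.uniform(1.0, 10.0, size=40)
theta_true = rng.normal(1.5, 0.2, size=40)
y_obs = m(x_obs, theta_true) + rng.normal(0.0, 0.05, size=40)

def neg_log_lik(p):
    """Negative log-likelihood over (mu, log_sigma, log_noise).

    Because the toy model is linear in theta, marginalising theta ~ N(mu, sigma^2)
    gives y_i ~ N(m(x_i, mu), sigma^2 * x_i + noise^2)."""
    mu, log_sigma, log_noise = p
    var = np.exp(2 * log_sigma) * x_obs + np.exp(2 * log_noise)
    resid = y_obs - m(x_obs, mu)
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

res = minimize(neg_log_lik, x0=[1.0, np.log(0.5), np.log(0.1)], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"estimated theta distribution: N({mu_hat:.3f}, {sigma_hat:.3f}^2)")
```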

Igor Mezic - One of the best experts on this subject based on the ideXlab platform.

  • A methodology for meta-model-based optimization in building energy models
    Energy and Buildings, 2012
    Co-Authors: Bryan Eisenhower, Zheng O'Neill, Satish Narayanan, Vladimir A Fonoberov, Igor Mezic
    Abstract:

    As building energy models become more accurate and numerically efficient, model-based optimization of building design and operation is becoming more practical. The state of the art typically couples an optimizer with a building energy model, which tends to be time consuming and often leads to suboptimal results because of the mathematical properties of the energy model. To mitigate this issue, we present an approach that begins by sampling the parameter space of the building model around its baseline. An analytical meta-model is then fit to this data, and optimization can be performed using different optimization cost functions or optimization algorithms with very little computational effort. Uncertainty and sensitivity analysis is also performed to identify the most influential parameters for the optimization. A case study is explored using an EnergyPlus model of an existing building which contains over 1000 parameters. When using a cost function that penalizes thermal comfort and energy, a 45% annual energy reduction is achieved while simultaneously increasing thermal comfort by a factor of two. We compare the optimization using the meta-model approach with an approach using the EnergyPlus model integrated with the optimizer on a smaller problem with only seven optimization parameters, illustrating good performance.
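
The workflow described here (sample around the baseline, fit an analytical meta-model, optimise the meta-model) can be sketched as follows. The quadratic response surface, the toy stand-in for the building simulation, and the sampling scheme are illustrative assumptions; they are not the EnergyPlus model, the sampling method, or the meta-model family used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Stand-in for an expensive building-energy simulation (e.g. one EnergyPlus run);
# returns a scalar cost combining energy use and a comfort penalty.
def expensive_simulation(p):
    energy = 100 + 20 * (p[0] - 0.3) ** 2 + 15 * (p[1] - 0.7) ** 2 + 5 * p[0] * p[1]
    comfort_penalty = 10 * (p[2] - 0.5) ** 2
    return energy + comfort_penalty

n_params, n_samples = 3, 200
rng = np.random.default_rng(2)

# 1) Sample the parameter space around the baseline (uniform here for simplicity).
X = rng.uniform(0.0, 1.0, size=(n_samples, n_params))
y = np.array([expensive_simulation(p) for p in X])

# 2) Fit an analytical meta-model (quadratic response surface as a stand-in).
meta = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-6))
meta.fit(X, y)

# 3) Optimise the cheap meta-model instead of the simulator.
res = minimize(lambda p: meta.predict(p.reshape(1, -1))[0],
               x0=np.full(n_params, 0.5),
               bounds=[(0.0, 1.0)] * n_params)
print("meta-model optimum:", res.x, "predicted cost:", res.fun)
print("check with one extra simulation:", expensive_simulation(res.x))
```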

Daniel Rueckert - One of the best experts on this subject based on the ideXlab platform.

  • A flexible graphical model for multi-modal parcellation of the cortex
    NeuroImage, 2017
    Co-Authors: Sarah Parisot, Sofia Ira Ktena, Salim Arslan, Markus D. Schirmer, Ben Glocker, Daniel Rueckert
    Abstract:

    Advances in neuroimaging have provided a tremendous amount of in-vivo information on the brain's organisation. Its anatomy and cortical organisation can be investigated from the point of view of several imaging modalities, many of which have been studied for mapping functionally specialised cortical areas. There is strong evidence that a single modality is not sufficient to fully identify the brain's cortical organisation. Combining multiple modalities in the same parcellation task has the potential to provide more accurate and robust subdivisions of the cortex. Nonetheless, existing brain parcellation methods are typically developed and tested on single modalities using a specific type of information. In this paper, we propose Graph-based Multi-modal Parcellation (GraMPa), an iterative framework designed to handle the large variety of available input modalities to tackle the multi-modal parcellation task. At each iteration, we compute a set of parcellations from different modalities and fuse them based on their local reliabilities. The fused parcellation is used to initialise the next iteration, forcing the parcellations to converge towards a set of mutually informed modality-specific parcellations, where correspondences are established. We explore two different multi-modal configurations for group-wise parcellation using resting-state fMRI, diffusion MRI tractography, myelin maps and task fMRI. Quantitative and qualitative results on the Human Connectome Project database show that integrating multi-modal information yields a stronger agreement with well-established atlases and more robust connectivity networks that provide a better representation of the population.
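
The fusion step can be illustrated, very roughly, as a reliability-weighted vote over vertices across per-modality parcellations. The weighting, the assumption of a shared label space, and the toy data below are simplified placeholders, not the graphical-model formulation of GraMPa.

```python
import numpy as np

def fuse_parcellations(labelings, reliabilities, n_labels):
    """Reliability-weighted vote over per-modality parcellations.

    labelings:     (n_modalities, n_vertices) integer parcel labels,
                   assumed already brought into a common label space.
    reliabilities: (n_modalities, n_vertices) local reliability weights.
    Returns the fused (n_vertices,) labeling.
    """
    n_mod, n_vert = labelings.shape
    votes = np.zeros((n_vert, n_labels))
    for m in range(n_mod):
        votes[np.arange(n_vert), labelings[m]] += reliabilities[m]
    return votes.argmax(axis=1)

# Toy example: 3 modalities, 10 vertices, 4 parcels.
rng = np.random.default_rng(3)
labelings = rng.integers(0, 4, size=(3, 10))
reliabilities = rng.uniform(0.1, 1.0, size=(3, 10))
fused = fuse_parcellations(labelings, reliabilities, n_labels=4)
print(fused)

# In the full framework, the fused parcellation would initialise the next
# iteration's modality-specific parcellations, and the loop repeats until the
# modality-specific results converge towards mutually informed parcellations.
```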