The Experts below are selected from a list of 812,367 Experts worldwide ranked by the ideXlab platform.
Jouni Helske - One of the best experts on this subject based on the ideXlab platform.
-
KFAS: Exponential Family State Space Models in R
Journal of Statistical Software, 2017. Co-Authors: Jouni Helske. Abstract: State space modeling is an efficient and flexible method for statistical inference of a broad class of time series and other data. This paper describes the R package KFAS for state space modeling with observations from an exponential family, namely the Gaussian, Poisson, binomial, negative binomial and gamma distributions. After introducing the basic theory behind Gaussian and non-Gaussian state space models, an illustrative example of Poisson time series forecasting is provided. Finally, a comparison to alternative R packages suitable for non-Gaussian time series modeling is presented.
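KFAS itself is an R package, so its API is not reproduced here; as a language-agnostic illustration of the Gaussian filtering recursions that underlie such packages, the following is a minimal numpy sketch of the Kalman filter for the local level model (all names are illustrative, not part of KFAS):

```python
import numpy as np

def kalman_filter_local_level(y, sigma_eps2, sigma_eta2, a1=0.0, p1=1e7):
    """Kalman filter for the local level model
    y_t = alpha_t + eps_t,  alpha_{t+1} = alpha_t + eta_t,
    with a diffuse-like initial variance p1."""
    n = len(y)
    a = np.empty(n)   # filtered state means
    p = np.empty(n)   # filtered state variances
    a_t, p_t = a1, p1
    for t in range(n):
        v = y[t] - a_t              # one-step prediction error
        f = p_t + sigma_eps2        # prediction error variance
        k = p_t / f                 # Kalman gain
        a[t] = a_t + k * v          # measurement update: mean
        p[t] = p_t * (1.0 - k)      # measurement update: variance
        a_t = a[t]                  # time update: mean
        p_t = p[t] + sigma_eta2     # time update: variance
    return a, p

# Usage: filter a noisy random-walk signal
rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(0, 0.5, 200))   # latent random walk
y = level + rng.normal(0, 1.0, 200)          # noisy observations
est, _ = kalman_filter_local_level(y, sigma_eps2=1.0, sigma_eta2=0.25)
```

The filtered means track the latent level with markedly less noise than the raw observations, which is the basic payoff of the state space formulation.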
-
KFAS: Exponential Family State Space Models in R
arXiv: Computation, 2016. Co-Authors: Jouni Helske. Abstract: State space modelling is an efficient and flexible method for statistical inference of a broad class of time series and other data. This paper describes an R package, KFAS, for state space modelling with observations from an exponential family, namely the Gaussian, Poisson, binomial, negative binomial and gamma distributions. After introducing the basic theory behind Gaussian and non-Gaussian state space models, an illustrative example of Poisson time series forecasting is provided. Finally, a comparison to alternative R packages suitable for non-Gaussian time series modelling is presented.
Atsushi Kawakami - One of the best experts on this subject based on the ideXlab platform.
-
A State-Space Realization Form of Multi-Input, Multi-Output Two-Dimensional Systems
Systems, Man and Cybernetics, 1993. Co-Authors: Atsushi Kawakami. Abstract: In this paper, we propose a method for obtaining a state-space realization form of two-dimensional transfer function matrices (2DTFM). The realization can be obtained directly from the coefficients of the given 2DTFM. It contains arbitrary constants that, when designated appropriately, allow the coefficients to be specified in various ways. We also examine several properties of this state-space realization and present conditions under which it exists.
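The paper's construction is for two-dimensional MIMO systems and is not reproduced here; as background, a sketch of the standard one-dimensional SISO analogue, the controllable canonical realization, which is likewise built directly from the transfer-function coefficients (the function name is illustrative):

```python
import numpy as np

def tf_to_statespace(num, den):
    """Controllable canonical state-space realization of a 1-D SISO
    transfer function H(z) = num(z) / den(z), with deg(num) <= deg(den)."""
    num = np.asarray(num, float)
    den = np.asarray(den, float)
    den = den / den[0]                             # make denominator monic
    n = len(den) - 1                               # state dimension
    num = np.concatenate([np.zeros(n + 1 - len(num)), num])
    d = num[0]                                     # direct feed-through term
    A = np.zeros((n, n))                           # companion matrix:
    A[0, :] = -den[1:]                             #   top row = -denominator coeffs
    A[1:, :-1] = np.eye(n - 1)                     #   shifted identity below
    B = np.zeros((n, 1)); B[0, 0] = 1.0
    C = (num[1:] - d * den[1:]).reshape(1, n)
    D = np.array([[d]])
    return A, B, C, D

# Usage: realize H(z) = (z + 0.5) / (z^2 - 0.9 z + 0.2)
A, B, C, D = tf_to_statespace([1.0, 0.5], [1.0, -0.9, 0.2])
```

The Markov parameters C A^k B recover the impulse response of H(z), which is the usual check that the realization is correct.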
-
A State-Space Realization Form of Multi-Input, Multi-Output Two-Dimensional Systems
IFAC Proceedings Volumes, 1992. Co-Authors: Atsushi Kawakami. Abstract: In this paper, a method for obtaining a state-space realization form of two-dimensional transfer function matrices (2DTFM) is proposed. The realization can be obtained directly from the coefficients of the given 2DTFM. It contains arbitrary constants that, when designated appropriately, allow the coefficients to be specified in various ways. Several properties of this state-space realization are examined, and conditions under which it exists are presented.
Anne C Smith - One of the best experts on this subject based on the ideXlab platform.
-
Estimating a State-Space Model from Point Process Observations
Neural Computation, 2003. Co-Authors: Anne C. Smith. Abstract: A widely used signal processing paradigm is the state-space model. The state-space model is defined by two equations: an observation equation that describes how the hidden state or latent process is observed, and a state equation that defines the evolution of the process through time. Inspired by neurophysiology experiments in which neural spiking activity is induced by an implicit (latent) stimulus, we develop an algorithm to estimate a state-space model observed through point process measurements. We represent the latent process modulating the neural spiking activity as a Gaussian autoregressive model driven by an external stimulus. Given the latent process, neural spiking activity is characterized as a general point process defined by its conditional intensity function. We develop an approximate expectation-maximization (EM) algorithm to estimate the unobservable state-space process, its parameters, and the parameters of the point process. The EM algorithm combines a point process recursive nonlinear filter algorithm, the fixed interval smoothing algorithm, and the state-space covariance algorithm to compute the complete data log likelihood efficiently. We use a Kolmogorov-Smirnov test based on the time-rescaling theorem to evaluate agreement between the model and point process data. We illustrate the model with two simulated data examples: an ensemble of Poisson neurons driven by a common stimulus and a single neuron whose conditional intensity function is approximated as a local Bernoulli process.
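The full EM algorithm is beyond a listing like this, but the recursive point-process filter at its core admits a compact sketch. The following numpy code filters a latent AR(1) state observed through Poisson spiking, using a common Gaussian approximation to the posterior at each step (the model parameters and function names are illustrative, not the authors' implementation):

```python
import numpy as np

def point_process_filter(spikes, rho, q, mu, dt):
    """Approximate recursive filter for a latent AR(1) state x_t with
    x_t = rho * x_{t-1} + N(0, q), observed through Poisson spiking with
    conditional intensity lambda_t = exp(mu + x_t)."""
    n = len(spikes)
    x = np.empty(n)                         # posterior means
    s2 = np.empty(n)                        # posterior variances
    x_post, s2_post = 0.0, q / (1 - rho**2)  # stationary prior
    for t in range(n):
        # one-step prediction
        x_pred = rho * x_post
        s2_pred = rho**2 * s2_post + q
        # Gaussian-approximation update using the predicted intensity
        lam = np.exp(mu + x_pred)
        x_post = x_pred + s2_pred * (spikes[t] - lam * dt)
        s2_post = 1.0 / (1.0 / s2_pred + lam * dt)
        x[t], s2[t] = x_post, s2_post
    return x, s2

# Usage: recover a slow latent modulation from simulated spike counts
rng = np.random.default_rng(1)
T, dt, rho, q, mu = 2000, 0.001, 0.998, 1e-4, np.log(20.0)
lat = np.zeros(T)
for t in range(1, T):
    lat[t] = rho * lat[t - 1] + rng.normal(0, np.sqrt(q))
spikes = rng.poisson(np.exp(mu + lat) * dt)   # binned spike counts
est, var = point_process_filter(spikes, rho, q, mu, dt)
```

The update step mirrors the Kalman filter, with the Poisson log likelihood replacing the Gaussian observation term, which is what makes the recursion tractable for spike-train data.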
Carl Edward Rasmussen - One of the best experts on this subject based on the ideXlab platform.
-
Variational Gaussian Process State-Space Models
Neural Information Processing Systems, 2014. Co-Authors: Roger Frigola, Yutian Chen, Carl Edward Rasmussen. Abstract: State-space models have been successfully used for more than fifty years in different areas of science and engineering. We present a procedure for efficient variational Bayesian learning of nonlinear state-space models based on sparse Gaussian processes. The result of learning is a tractable posterior over nonlinear dynamical systems. In comparison to conventional parametric models, we offer the possibility to straightforwardly trade off model capacity and computational cost whilst avoiding overfitting. Our main algorithm uses a hybrid inference approach combining variational Bayes and sequential Monte Carlo. We also present stochastic variational inference and online learning approaches for fast learning with long time series.
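The variational GP machinery is too involved for a sketch, but the sequential Monte Carlo ingredient of the authors' hybrid inference scheme is simple to illustrate. A minimal bootstrap particle filter for a nonlinear state-space model (the tanh dynamics and all names here are illustrative assumptions, not the paper's model):

```python
import numpy as np

def bootstrap_particle_filter(y, f, sigma_x, sigma_y, n_particles=500, seed=0):
    """Bootstrap particle filter for the nonlinear state-space model
    x_t = f(x_{t-1}) + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0, 1, n_particles)          # initial particle cloud
    means = np.empty(len(y))
    for t in range(len(y)):
        x = f(x) + rng.normal(0, sigma_x, n_particles)   # propagate
        logw = -0.5 * ((y[t] - x) / sigma_y) ** 2        # Gaussian log weights
        w = np.exp(logw - logw.max())
        w /= w.sum()                                     # normalize
        means[t] = np.sum(w * x)                         # filtered mean
        x = rng.choice(x, size=n_particles, p=w)         # resample
    return means

# Usage: track a nonlinear (tanh) dynamical system from noisy observations
rng = np.random.default_rng(2)
f = lambda x: 0.9 * np.tanh(1.5 * x)
xs = [0.0]
for _ in range(100):
    xs.append(f(xs[-1]) + rng.normal(0, 0.3))
xs = np.array(xs[1:])
y = xs + rng.normal(0, 0.5, 100)
est = bootstrap_particle_filter(y, f, sigma_x=0.3, sigma_y=0.5)
```

In the paper this kind of particle sweep supplies the samples over state trajectories, while variational Bayes handles the GP transition function; the sketch shows only the former half.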
M Chouikha - One of the best experts on this subject based on the ideXlab platform.
-
Adaptive State-Space Filtering with FIR Convergence Behaviour
International Conference on Acoustics, Speech, and Signal Processing, 1992. Co-Authors: W. Edmonson, W. Alexander, M. Chouikha. Abstract: An adaptive filter is described that addresses the problems of convergence to a stable filter, the optimization of a multimodal performance surface, and the computational complexity of implementing an infinite impulse response (IIR) filter. The adaptive filter is based on describing the adaptive IIR filter in its state-space form. The adaptive state-space filter algorithm can best be described as a recursive form of the matrix minimum principle. It is a two-part algorithm: the first part is a recursive algorithm for optimizing a predictor matrix, which describes the transformation from past data to future data; the second part determines the system parameters from the optimized predictor matrix through its decomposition and the use of projection techniques. Convergence to a stable filter is also proven.
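The predictor-matrix algorithm itself is not reproduced here; as a small illustration of why the state-space form is attractive for adaptive IIR filtering, the sketch below runs a SISO filter in state-space form and shows that the stability check the paper relies on reduces to an eigenvalue test on the state matrix (all names are illustrative):

```python
import numpy as np

def statespace_filter(A, B, C, D, u):
    """Run a SISO filter given in state-space form:
    x[t+1] = A x[t] + B u[t],  y[t] = C x[t] + D u[t]."""
    x = np.zeros(A.shape[0])
    y = np.empty(len(u))
    for t, ut in enumerate(u):
        y[t] = C @ x + D * ut
        x = A @ x + B * ut
    return y

def is_stable(A):
    """A discrete-time state-space filter is stable exactly when every
    eigenvalue of A lies strictly inside the unit circle."""
    return bool(np.all(np.abs(np.linalg.eigvals(A)) < 1.0))

# Usage: a stable second-order IIR section in companion form
A = np.array([[0.5, -0.3], [1.0, 0.0]])
B = np.array([1.0, 0.0])
C = np.array([1.0, 0.2])
D = 0.0
impulse = np.zeros(20); impulse[0] = 1.0
h = statespace_filter(A, B, C, D, impulse)   # impulse response
```

In the direct-form parameterization, checking stability requires testing a polynomial's roots after every adaptation step; in state-space form the same test is a single eigenvalue computation on A, which is one reason adaptive schemes like the one above are formulated this way.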