Structured Representation

The experts below are selected from a list of 58,617 experts worldwide, ranked by the ideXlab platform.

Raz Kupferman - One of the best experts on this subject based on the ideXlab platform.

  • Mean Field Variational Approximation for Continuous-Time Bayesian Networks
    Journal of Machine Learning Research, 2010
    Co-Authors: Ido Cohn, Nir Friedman, Tal El-Hay, Raz Kupferman
    Abstract:

    Continuous-time Bayesian networks are a natural structured representation language for multi-component stochastic processes that evolve continuously over time. Despite the compact representation provided by this language, inference in such models is intractable even in relatively simple structured networks. We introduce a mean field variational approximation in which a product of inhomogeneous Markov processes is used to approximate a joint distribution over trajectories. This variational approach leads to a globally consistent distribution that can be queried efficiently. Additionally, it provides a lower bound on the probability of the observations, making it attractive for learning tasks. Here we describe the theoretical foundations of the approximation and an efficient implementation that exploits the wide range of highly optimized ordinary differential equation (ODE) solvers, experimentally characterize the processes for which the approximation is suitable, and demonstrate its application to a large-scale real-world inference problem.
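
    As a rough illustration of the forward half of this scheme, the sketch below propagates the product-form marginals of a toy two-component, binary-state CTBN with a standard ODE solver. The rate matrices are invented for the example, and the full variational procedure in the paper also iterates matching backward equations, which are omitted here; this is not the authors' implementation.

      # Mean-field sketch for a toy 2-component, binary-state CTBN.
      # Hypothetical rates; illustrative only, not the authors' code.
      import numpy as np
      from scipy.integrate import solve_ivp

      # Q[i][s] = rate matrix of component i given the other component
      # is in state s (rows sum to zero, as for any CTMC generator).
      Q = {
          0: [np.array([[-1.0, 1.0], [2.0, -2.0]]),
              np.array([[-3.0, 3.0], [0.5, -0.5]])],
          1: [np.array([[-0.2, 0.2], [1.0, -1.0]]),
              np.array([[-2.0, 2.0], [0.1, -0.1]])],
      }

      def master_eq(t, p):
          # p holds the two mean-field marginals (p0, p1), each of length 2.
          # Each component evolves under its rate matrix averaged over the
          # other component's current marginal (the mean-field coupling).
          p0, p1 = p[:2], p[2:]
          Qbar0 = p1[0] * Q[0][0] + p1[1] * Q[0][1]
          Qbar1 = p0[0] * Q[1][0] + p0[1] * Q[1][1]
          return np.concatenate([p0 @ Qbar0, p1 @ Qbar1])

      # Both components start deterministically in state 0.
      sol = solve_ivp(master_eq, (0.0, 5.0), [1.0, 0.0, 1.0, 0.0])
      print("marginals at t=5:", sol.y[:, -1])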

  • Mean Field Variational Approximation for Continuous-Time Bayesian Networks
    Uncertainty in Artificial Intelligence, 2009
    Co-Authors: Ido Cohn, Nir Friedman, Tal El-Hay, Raz Kupferman
    Abstract:

    Continuous-time Bayesian networks are a natural structured representation language for multi-component stochastic processes that evolve continuously over time. Despite this compact representation, inference in such models is intractable even in relatively simple structured networks. Here we introduce a mean field variational approximation in which a product of inhomogeneous Markov processes is used to approximate a distribution over trajectories. This variational approach leads to a globally consistent distribution that can be queried efficiently. Additionally, it provides a lower bound on the probability of the observations, making it attractive for learning tasks. We provide the theoretical foundations of the approximation and an efficient implementation that exploits the wide range of highly optimized ordinary differential equation (ODE) solvers, experimentally characterize the processes for which the approximation is suitable, and show applications to a large-scale real-world inference problem.

Ido Cohn - One of the best experts on this subject based on the ideXlab platform.

  • Mean Field Variational Approximation for Continuous-Time Bayesian Networks
    Journal of Machine Learning Research, 2010
    Co-Authors: Ido Cohn, Nir Friedman, Tal El-Hay, Raz Kupferman
    Abstract:

    Continuous-time Bayesian networks are a natural structured representation language for multi-component stochastic processes that evolve continuously over time. Despite the compact representation provided by this language, inference in such models is intractable even in relatively simple structured networks. We introduce a mean field variational approximation in which a product of inhomogeneous Markov processes is used to approximate a joint distribution over trajectories. This variational approach leads to a globally consistent distribution that can be queried efficiently. Additionally, it provides a lower bound on the probability of the observations, making it attractive for learning tasks. Here we describe the theoretical foundations of the approximation and an efficient implementation that exploits the wide range of highly optimized ordinary differential equation (ODE) solvers, experimentally characterize the processes for which the approximation is suitable, and demonstrate its application to a large-scale real-world inference problem.

  • Mean Field Variational Approximation for Continuous-Time Bayesian Networks
    Uncertainty in Artificial Intelligence, 2009
    Co-Authors: Ido Cohn, Nir Friedman, Tal El-Hay, Raz Kupferman
    Abstract:

    Continuous-time Bayesian networks are a natural structured representation language for multi-component stochastic processes that evolve continuously over time. Despite this compact representation, inference in such models is intractable even in relatively simple structured networks. Here we introduce a mean field variational approximation in which a product of inhomogeneous Markov processes is used to approximate a distribution over trajectories. This variational approach leads to a globally consistent distribution that can be queried efficiently. Additionally, it provides a lower bound on the probability of the observations, making it attractive for learning tasks. We provide the theoretical foundations of the approximation and an efficient implementation that exploits the wide range of highly optimized ordinary differential equation (ODE) solvers, experimentally characterize the processes for which the approximation is suitable, and show applications to a large-scale real-world inference problem.

Nuno Lourenco - One of the best experts on this subject based on the ideXlab platform.

  • DENSER: Deep Evolutionary Network Structured Representation
    Genetic Programming and Evolvable Machines, 2019
    Co-Authors: Filipe Assuncao, Nuno Lourenco, Penousal Machado, Bernardete Ribeiro
    Abstract:

    Deep Evolutionary Network Structured Representation (DENSER) is a novel evolutionary approach for the automatic generation of deep neural networks (DNNs) that combines the principles of genetic algorithms (GAs) with those of dynamic structured grammatical evolution (DSGE). The GA level encodes the macro structure of the networks under evolution, i.e., the sequence of layers and the learning and data-augmentation methods (among others); the DSGE level specifies the parameters of each GA evolutionary unit and the valid ranges of those parameters. The use of a grammar makes DENSER a general-purpose framework for generating DNNs: one only needs to adapt the grammar to handle different network and layer types, different problems, or different parameter ranges. DENSER is tested on the automatic generation of convolutional neural networks (CNNs) for the CIFAR-10 dataset, with the best-performing networks reaching accuracies of up to 95.22%. Furthermore, we take the fittest networks evolved on CIFAR-10 and apply them to the classification of MNIST, Fashion-MNIST, SVHN, Rectangles, and CIFAR-100. The results show that the DNNs discovered by DENSER during evolution generalise, are robust, and scale. The most impressive result is the 78.75% classification accuracy on the CIFAR-100 dataset, which, to the best of our knowledge, sets a new state of the art among methods that seek to design CNNs automatically.
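
    To make the two-level representation concrete, here is a minimal sketch of such a genotype. The unit types, parameter names, and value ranges below are invented for illustration (the real system derives them from a grammar); this is not the DENSER code base.

      # Toy two-level genotype in the spirit of DENSER (hypothetical
      # layer types and ranges; the real ranges come from a grammar).
      import random

      # Inner (DSGE-like) level: parameters and valid ranges per unit type.
      PARAM_RANGES = {
          "conv":     {"filters": (16, 256), "kernel": (2, 5)},
          "pool":     {"kernel": (2, 4)},
          "dense":    {"units": (32, 512)},
          "learning": {"lr": (1e-4, 1e-1)},
      }

      def sample_unit(kind):
          # Sample each parameter uniformly from its declared range.
          params = {}
          for name, (lo, hi) in PARAM_RANGES[kind].items():
              params[name] = (random.uniform(lo, hi) if isinstance(lo, float)
                              else random.randint(lo, hi))
          return {"kind": kind, "params": params}

      def sample_genotype(max_layers=6):
          # Outer (GA-like) level: the macro structure is a sequence of
          # evolutionary units, each expanded by the inner level.
          layers = [sample_unit(random.choice(["conv", "pool"]))
                    for _ in range(random.randint(2, max_layers))]
          layers.append(sample_unit("dense"))
          return {"layers": layers, "learning": sample_unit("learning")}

      print(sample_genotype())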

  • DENSER: Deep Evolutionary Network Structured Representation
    arXiv: Neural and Evolutionary Computing, 2018
    Co-Authors: Filipe Assuncao, Nuno Lourenco, Penousal Machado, Bernardete Ribeiro
    Abstract:

    Deep Evolutionary Network Structured Representation (DENSER) is a novel approach for automatically designing Artificial Neural Networks (ANNs) using Evolutionary Computation (EC). The algorithm not only searches for the best network topology (e.g., number of layers, type of layers), but also tunes hyper-parameters such as learning or data-augmentation parameters. The automatic design is achieved using a representation with two distinct levels, where the outer level encodes the general structure of the network, i.e., the sequence of layers, and the inner level encodes the parameters associated with each layer. The allowed layers and hyper-parameter value ranges are defined by means of a human-readable Context-Free Grammar. DENSER was used to evolve ANNs for two widely used image classification benchmarks, obtaining average accuracies of up to 94.27% on the CIFAR-10 dataset and 78.75% on CIFAR-100. To the best of our knowledge, our CIFAR-100 results are the best among models generated by methods that aim at the automatic design of Convolutional Neural Networks (CNNs), and are amongst the best reported for manually designed and fine-tuned CNNs.
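
    For a sense of what such a grammar can look like, here is a made-up fragment (not the grammar shipped with DENSER): each production names a layer type and bounds the values its hyper-parameters may take, using an invented [name, type, count, minimum, maximum] convention.

      # Hypothetical grammar fragment, embedded as a string for reference.
      # Each bracketed block reads [name, type, count, minimum, maximum].
      GRAMMAR = """
      <features>   ::= <conv> | <conv> <pooling>
      <conv>       ::= layer:conv [num-filters,int,1,32,256] [kernel,int,1,2,5]
      <pooling>    ::= layer:pool-max [kernel,int,1,2,4]
                     | layer:pool-avg [kernel,int,1,2,4]
      <classifier> ::= layer:fc [num-units,int,1,64,2048]
      <learning>   ::= learning:gradient-descent [lr,float,1,0.0001,0.1]
      """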

  • SGE: A Structured Representation for Grammatical Evolution
    Artificial Evolution (Lecture Notes in Computer Science), 2016
    Co-Authors: Nuno Lourenco, Francisco B Pereira, Ernesto Costa
    Abstract:

    This paper introduces Structured Grammatical Evolution, a new genotypic representation for Grammatical Evolution in which each gene is explicitly linked to a non-terminal of the grammar being used. This one-to-one correspondence ensures that the modification of a gene does not affect the derivation options of other non-terminals, thereby increasing locality. The performance of the new representation is assessed on a set of benchmark problems. The results confirm the effectiveness of the proposed approach, which outperforms standard grammatical evolution on all selected optimization problems.
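
    The mechanism can be sketched in a few lines, under an invented toy grammar and genotype: the genotype keeps a separate list of integers per non-terminal, so mutating a gene of one non-terminal can never change how any other non-terminal is derived.

      # SGE-style decoding sketch: one gene list per non-terminal
      # (toy grammar and genotype; not the authors' implementation).
      GRAMMAR = {
          "<expr>": [["<expr>", "+", "<term>"], ["<term>"]],
          "<term>": [["x"], ["1"]],
      }
      genotype = {"<expr>": [0, 1], "<term>": [0, 1]}

      def derive(symbol, geno, used):
          # Expand `symbol`, consuming genes only from its own list.
          if symbol not in GRAMMAR:                  # terminal symbol
              return symbol
          choice = geno[symbol][used[symbol]]
          used[symbol] += 1
          production = GRAMMAR[symbol][choice]
          return " ".join(derive(s, geno, used) for s in production)

      used = {nt: 0 for nt in GRAMMAR}
      print(derive("<expr>", genotype, used))        # -> x + 1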

Nir Friedman - One of the best experts on this subject based on the ideXlab platform.

  • Mean Field Variational Approximation for Continuous-Time Bayesian Networks
    Journal of Machine Learning Research, 2010
    Co-Authors: Ido Cohn, Nir Friedman, Tal El-Hay, Raz Kupferman
    Abstract:

    Continuous-time Bayesian networks are a natural structured representation language for multi-component stochastic processes that evolve continuously over time. Despite the compact representation provided by this language, inference in such models is intractable even in relatively simple structured networks. We introduce a mean field variational approximation in which a product of inhomogeneous Markov processes is used to approximate a joint distribution over trajectories. This variational approach leads to a globally consistent distribution that can be queried efficiently. Additionally, it provides a lower bound on the probability of the observations, making it attractive for learning tasks. Here we describe the theoretical foundations of the approximation and an efficient implementation that exploits the wide range of highly optimized ordinary differential equation (ODE) solvers, experimentally characterize the processes for which the approximation is suitable, and demonstrate its application to a large-scale real-world inference problem.

  • Mean Field Variational Approximation for Continuous-Time Bayesian Networks
    Uncertainty in Artificial Intelligence, 2009
    Co-Authors: Ido Cohn, Nir Friedman, Tal El-Hay, Raz Kupferman
    Abstract:

    Continuous-time Bayesian networks are a natural structured representation language for multi-component stochastic processes that evolve continuously over time. Despite this compact representation, inference in such models is intractable even in relatively simple structured networks. Here we introduce a mean field variational approximation in which a product of inhomogeneous Markov processes is used to approximate a distribution over trajectories. This variational approach leads to a globally consistent distribution that can be queried efficiently. Additionally, it provides a lower bound on the probability of the observations, making it attractive for learning tasks. We provide the theoretical foundations of the approximation and an efficient implementation that exploits the wide range of highly optimized ordinary differential equation (ODE) solvers, experimentally characterize the processes for which the approximation is suitable, and show applications to a large-scale real-world inference problem.

  • Structured Representation of Complex Stochastic Systems
    National Conference on Artificial Intelligence, 1998
    Co-Authors: Nir Friedman, Daphne Koller, Avi Pfeffer
    Abstract:

    This paper considers the problem of representing complex systems that evolve stochastically over time. Dynamic Bayesian networks provide a compact representation for stochastic processes. Unfortunately, they are often unwieldy because they cannot explicitly model the complex organizational structure of many real-life systems: the fact that processes are typically composed of several interacting subprocesses, each of which can, in turn, be further decomposed. We propose a hierarchically structured representation language that extends both dynamic Bayesian networks and the object-oriented Bayesian network framework of [9], and show that our language allows such systems to be described in a natural and modular way. Our language supports a natural representation for certain system characteristics that are hard to capture with more traditional frameworks. For example, it allows us to represent systems in which some processes evolve at a different rate than others, or in which the processes interact only intermittently. We provide a simple inference mechanism for our representation via translation to Bayesian networks, and suggest ways in which the inference algorithm can exploit the additional structure encoded in our representation.
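
    As a small illustration of the compositional idea (the class and process names here are invented, not the paper's language), a system can be written as a hierarchy of subprocesses, each advancing at its own rate when the whole system is stepped.

      # Toy hierarchical process composition; a deterministic stand-in
      # for the stochastic updates the paper's language describes.
      class Leaf:
          """A primitive process that updates its state every `rate` ticks."""
          def __init__(self, name, rate=1):
              self.name, self.rate, self.state, self.clock = name, rate, 0, 0

          def step(self):
              self.clock += 1
              if self.clock % self.rate == 0:   # slower processes skip ticks
                  self.state += 1               # placeholder for a stochastic move

      class Composite:
          """A process composed of subprocesses, themselves possibly composite."""
          def __init__(self, name, children):
              self.name, self.children = name, children

          def step(self):
              for child in self.children:
                  child.step()

      # An engine that evolves every tick alongside a gearbox, a slower
      # subprocess that only changes state every 5 ticks.
      system = Composite("vehicle", [Leaf("engine", rate=1),
                                     Composite("drivetrain",
                                               [Leaf("gearbox", rate=5)])])
      for _ in range(10):
          system.step()
      # After 10 ticks the engine has moved 10 times, the gearbox only twice.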
