Associative Memory

The Experts below are selected from a list of 276 Experts worldwide ranked by ideXlab platform

Dan Ventura - One of the best experts on this subject based on the ideXlab platform.

  • Optically Simulating a Quantum Associative Memory
    Physical Review A, 2000
    Co-Authors: John C. Howell, John A. Yeazell, Dan Ventura
    Abstract:

    This paper discusses the realization of a quantum Associative Memory using linear integrated optics. An Associative Memory produces a full pattern of bits when presented with only a partial pattern. Quantum computers have the potential to store large numbers of patterns and hence to far surpass any classical neural network realization of an Associative Memory. In this work, two 3-qubit Associative Memories implemented with linear integrated optics are discussed, along with corrupted, invented, and degenerate memories.
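
    The pattern-completion behavior described above can be illustrated classically. Below is a minimal sketch, in Python with NumPy, of a Hopfield-style Associative Memory recovering a full bit pattern from a partial cue; it illustrates the memory concept only, not the paper's optical realization, and the patterns and sizes are invented for the example.

        import numpy as np

        # Two stored bipolar (+1/-1) patterns; Hebbian outer-product weights.
        patterns = np.array([[1,  1,  1,  1, -1, -1, -1, -1],
                             [1, -1,  1, -1,  1, -1,  1, -1]])
        n = patterns.shape[1]
        W = sum(np.outer(p, p) for p in patterns) / n
        np.fill_diagonal(W, 0)  # no self-connections

        # Partial cue: only the first four bits are known; the rest are 0.
        x = np.array([1, 1, 1, 1, 0, 0, 0, 0])

        # Synchronous recall dynamics: iterate until a fixed point.
        for _ in range(10):
            x_next = np.where(W @ x >= 0, 1, -1)
            if np.array_equal(x_next, x):
                break
            x = x_next

        print(x)  # -> [ 1  1  1  1 -1 -1 -1 -1], the full first pattern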

  • Quantum Associative Memory
    Information Sciences, 2000
    Co-Authors: Dan Ventura, Tony R. Martinez
    Abstract:

    This paper combines quantum computation with classical neural network theory to produce a quantum computational learning algorithm. Quantum computation (QC) uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum Associative Memory (QuAM) with a capacity exponential in the number of neurons. This paper combines two quantum computational algorithms to produce such a quantum Associative Memory. The result is an exponential increase in the capacity of the Memory when compared to traditional Associative memories such as the Hopfield network. The paper covers necessary high-level quantum mechanical and quantum computational ideas and introduces a QuAM. Theoretical analysis proves the utility of the Memory, and it is noted that a small version should be physically realizable in the near future.
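
    The storage-plus-search structure of the QuAM can be mimicked on a classical statevector. The sketch below (Python with NumPy; the pattern set and query are invented for illustration) prepares an equal superposition over the stored patterns and applies one round of amplitude amplification, reflecting about the initial state, to complete a partial query. Note that this is generic amplitude amplification, not Ventura and Martinez's modified Grover operator, which additionally handles non-uniform initial amplitudes and spurious marked states.

        import numpy as np

        n = 4                                   # pattern length in bits
        stored = ['0110', '1010', '1111']       # the memory set
        dim = 2 ** n

        # "Storage": equal amplitude on each stored pattern only.
        psi0 = np.zeros(dim)
        for s in stored:
            psi0[int(s, 2)] = 1.0
        psi0 /= np.linalg.norm(psi0)

        # Partial query: the first two bits are known to be '10'.
        marked = np.array([format(i, f'0{n}b').startswith('10')
                           for i in range(dim)])

        # One amplification round: oracle phase flip, then reflection
        # about the initial state, 2|psi0><psi0| - I.
        psi = psi0.copy()
        psi[marked] *= -1.0
        psi = 2.0 * psi0 * (psi0 @ psi) - psi

        probs = psi ** 2
        best = int(np.argmax(probs))
        print(format(best, f'0{n}b'), round(probs[best], 3))  # 1010 0.926

    A single round suffices here because the matching fraction of the stored superposition is large; in general the number of rounds grows as the inverse square root of that fraction.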

  • Quantum Associative Memory
    arXiv: Quantum Physics, 1998
    Co-Authors: Dan Ventura, Tony R. Martinez
    Abstract:

    This paper combines quantum computation with classical neural network theory to produce a quantum computational learning algorithm. Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum Associative Memory with a capacity exponential in the number of neurons. This paper combines two quantum computational algorithms to produce such a quantum Associative Memory. The result is an exponential increase in the capacity of the Memory when compared to traditional Associative memories such as the Hopfield network. The paper covers necessary high-level quantum mechanical and quantum computational ideas and introduces a quantum Associative Memory. Theoretical analysis proves the utility of the Memory, and it is noted that a small version should be physically realizable in the near future.

  • Quantum Associative Memory with exponential capacity
    1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227), 1998
    Co-Authors: Dan Ventura, Tony R. Martinez
    Abstract:

    Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts by taking advantage of quantum parallelism. The unique characteristics of quantum theory may also be used to create a quantum Associative Memory with a capacity exponential in the number of neurons. This paper covers necessary high-level quantum mechanical ideas and introduces a simple quantum Associative Memory. Furthermore, it provides discussion, empirical results and directions for future work.

Tony R. Martinez - One of the best experts on this subject based on the ideXlab platform.

His three entries, Quantum Associative Memory (Information Sciences, 2000), Quantum Associative Memory (arXiv: Quantum Physics, 1998), and Quantum Associative Memory with exponential capacity (IEEE IJCNN, 1998), are co-authored with Dan Ventura and duplicate verbatim the abstracts listed above under Dan Ventura.

Waldo Fajardo - One of the best experts on this subject based on the ideXlab platform.

  • Continuous classifying Associative Memory
    International Journal of Intelligent Systems, 2002
    Co-Authors: Antonio B. Bailón, Miguel Delgado, Waldo Fajardo
    Abstract:

    In this article we present the so-called continuous classifying Associative Memory, which is able to store continuous patterns while avoiding the problems of spurious states and data dependency. This Memory model is based on our previously developed classifying Associative Memory and enables continuous patterns to be stored and recovered. We also show that the behavior of the continuous classifying Associative Memory may be adjusted to predetermined goals by selecting its internal operating functions.

  • Extension from a Linear Associative Memory to a Linguistic Linear Associative Memory
    International Journal of Intelligent Systems, 1998
    Co-Authors: Armando Blanco, Miguel Delgado, Waldo Fajardo
    Abstract:

    We present a linguistic extension of a crisp model by using a codification method that allows us to implement a fuzzy system on a discrete decision model. The paper begins with an introduction to the representation of fuzzy information, followed by a discussion of the codification method and the extension of a linear Associative Memory to a linguistic linear Associative Memory. Finally, we enumerate the advantages and disadvantages of the resulting linguistic linear Associative Memory.
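
    For reference, the crisp linear Associative Memory that the paper extends can be written in a few lines. The following is a minimal sketch in Python with NumPy; the fuzzy codification layer described in the abstract is not reproduced, and the key/value vectors are invented for illustration.

        import numpy as np

        # Associate key vectors (columns of X) with value vectors
        # (columns of Y) through a single weight matrix W.
        X = np.array([[1., 0., 1.],
                      [0., 1., 1.],
                      [1., 1., 0.]])   # three 3-dimensional keys
        Y = np.array([[1., 0., 0.],
                      [0., 1., 1.]])   # three 2-dimensional values

        # Least-squares weights; recall is exact here because the
        # keys are linearly independent.
        W = Y @ np.linalg.pinv(X)

        print(np.round(W @ X[:, 0], 6))  # -> [1. 0.], the first value
        print(np.round(W @ X, 6))        # recalls every stored pair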

Xiu Chun-bo - One of the best experts on this subject based on the ideXlab platform.

  • Associative Memory Based on the Cognitive Psychology
    Computer Simulation, 2010
    Co-Authors: Xiu Chun-bo
    Abstract:

    Drawing on the theory of cognitive processes, a novel method for Associative Memory is proposed to enlarge the Memory capacity and enhance the Associative success rate. Dynamic connection weights are designed to store the information of the sample patterns, and the weights can be chosen dynamically according to the current input pattern. The information in the input pattern not only provides the initial values for the Associative Memory but also plays a heuristic searching role in the Associative process, which enhances the Memory capacity and the Associative success rate. Furthermore, Associative Memory for similar patterns and multi-valued patterns can also be accomplished by the method. The simulation results demonstrate the validity of the algorithm.
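
    The abstract does not specify the weight-selection rule formally, but the idea of connection weights chosen dynamically from the current input can be sketched as follows (Python with NumPy): the similarity between the current state and each stored pattern gates how strongly that pattern's outer product contributes to the effective weight matrix. The gating function and patterns here are assumptions made for illustration, not the paper's algorithm.

        import numpy as np

        patterns = np.array([[1.,  1., -1., -1.],
                             [1., -1.,  1., -1.]])

        def recall(x, steps=5):
            x = x.astype(float)
            for _ in range(steps):
                # Similarity of the current state to each stored pattern
                # gates that pattern's contribution to the weights.
                sims = patterns @ x / patterns.shape[1]
                gate = np.maximum(sims, 0.0)   # assumed heuristic gating
                W = sum(g * np.outer(p, p) for g, p in zip(gate, patterns))
                x = np.where(W @ x >= 0, 1.0, -1.0)
            return x

        print(recall(np.array([1, 1, -1, 0])))  # -> [ 1.  1. -1. -1.]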

Haruaki Yamazaki - One of the best experts on this subject based on the ideXlab platform.

  • Complex-valued multidirectional Associative Memory
    Electrical Engineering in Japan, 2007
    Co-Authors: Masaki Kobayashi, Haruaki Yamazaki
    Abstract:

    The Hopfield model is a representative Associative Memory. It was improved to the Bidirectional Associative Memory (BAM) by Kosko and to the Multidirectional Associative Memory (MAM) by Hagiwara. These models have two or more layers, and since they have symmetric connections between layers, convergence is ensured. MAM can deal with tuples of many patterns, such as (x1, x2, …), where xm is the pattern on layer m. Noest, Hirose, and Nemoto proposed the complex-valued Hopfield model. Lee proposed a complex-valued Bidirectional Associative Memory. Zemel proved the rotation invariance of the complex-valued Hopfield model, meaning that the rotated pattern is also stored. In this paper, the complex-valued Multidirectional Associative Memory is proposed, and its rotation invariance is proved. Moreover, it is shown by computer simulation that the differences between the angles of given patterns are automatically reduced. First, we define the complex-valued Multidirectional Associative Memory. Then we define the energy function of the network and use it to prove that the network is guaranteed to converge. Next, we define the learning law and show a characteristic of the recall process: the differences between the angles of given patterns are automatically reduced. In particular, we prove the following theorem: in the case that only one tuple of patterns is stored, if patterns with different angles are given to each layer, the differences are automatically reduced. Finally, we investigate whether the differences of angles influence the noise robustness. They are found to reduce the noise robustness, because the input to each layer becomes small; we show this by computer simulations. © 2007 Wiley Periodicals, Inc. Electr Eng Jpn, 159(1): 39–45, 2007. DOI 10.1002/eej.20387
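
    The rotation invariance claimed above is easy to check in a single-layer, continuous-phase sketch (Python with NumPy). This is a complex-valued Hopfield-style memory rather than the paper's multidirectional network, and the stored pattern is invented for illustration: if x is stored, any global rotation exp(i*phi)*x is a fixed point of the recall dynamics as well.

        import numpy as np

        # One stored pattern of unit-modulus complex states.
        x = np.exp(2j * np.pi * np.array([0, 1, 2, 3]) / 4)
        W = np.outer(x, np.conj(x))      # Hebbian, Hermitian weights
        np.fill_diagonal(W, 0)           # no self-connections

        def update(z):
            # Project each component of W z back onto the unit circle.
            h = W @ z
            return h / np.abs(h)

        # The stored pattern is a fixed point ...
        print(np.allclose(update(x), x))              # True
        # ... and so is any global rotation of it.
        rot = np.exp(0.7j) * x
        print(np.allclose(update(rot), rot))          # True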

  • Complex-valued Multidirectional Associative Memory
    IEEJ Transactions on Electronics Information and Systems, 2005
    Co-Authors: Masaki Kobayashi, Haruaki Yamazaki
    Abstract:

    The Hopfield model is a representative Associative Memory. It was improved to the Bidirectional Associative Memory (BAM) by Kosko and to the Multidirectional Associative Memory (MAM) by Hagiwara. These models have two or more layers, and since they have symmetric connections between layers, convergence is ensured. MAM can deal with tuples of many patterns, such as (x1, x2, …), where xm is the pattern on layer m. Noest, Hirose, and Nemoto proposed the complex-valued Hopfield model. Lee proposed a complex-valued Bidirectional Associative Memory. Zemel proved the rotation invariance of the complex-valued Hopfield model, meaning that the rotated pattern is also stored. In this paper, the complex-valued Multidirectional Associative Memory is proposed, and its rotation invariance is proved. Moreover, it is shown by computer simulation that the differences between the angles of given patterns are automatically reduced. First, we define the complex-valued Multidirectional Associative Memory. Then we define the energy function of the network and use it to prove that the network is guaranteed to converge. Next, we define the learning law and show a characteristic of the recall process: the differences between the angles of given patterns are automatically reduced. In particular, we prove the following theorem: in the case that only one tuple of patterns is stored, if patterns with different angles are given to each layer, the differences are automatically reduced. Finally, we investigate whether the differences of angles influence the noise robustness. They reduce the noise robustness, because the input to each layer becomes small; we show this by computer simulations.

  • Multidirectional Associative Memory with a hidden layer
    Systems and Computers in Japan, 2002
    Co-Authors: Masaki Kobayashi, Motonobu Hattori, Haruaki Yamazaki
    Abstract:

    MAM (Multidirectional Associative Memory) is an extension of BAM (Bidirectional Associative Memory) and an Associative Memory model that can deal with multiple associations. If the training set has common terms, the conventional MAM often recalls convolved patterns. IMAM (Improved Multidirectional Associative Memory) can store such sets, but its structure is complex and its storage capacity is extremely small because it must use a correlation matrix. In this paper, we propose a MAM with a hidden layer and its learning method. The structure is as simple as that of MAM, and it can store training sets that include common terms. By computer simulation, we show that the storage capacity is far larger than that of correlation learning and that the model is robust against noise. © 2002 Wiley Periodicals, Inc. Syst Comp Jpn, 33(6): 1–9, 2002. DOI 10.1002/scj.10105
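
    For context, the classical Kosko BAM that MAM generalizes fits in a few lines; the following Python with NumPy sketch shows bidirectional recall from a noisy cue. The hidden-layer learning method proposed in the paper is not reproduced here, and the patterns are invented for the example.

        import numpy as np

        # Associate pairs (x_k, y_k) of bipolar patterns: W = sum y x^T.
        X = np.array([[1, 1, 1,  1,  1,  1],
                      [1, 1, 1, -1, -1, -1]])   # layer-A patterns
        Y = np.array([[1,  1, -1, -1],
                      [1, -1,  1, -1]])         # layer-B patterns
        W = sum(np.outer(y, x) for x, y in zip(X, Y))

        sgn = lambda v: np.where(v >= 0, 1, -1)

        # Bidirectional recall from a layer-A cue with one flipped bit.
        x = np.array([1, 1, 1, 1, 1, -1])
        for _ in range(3):
            y = sgn(W @ x)      # layer A -> layer B
            x = sgn(W.T @ y)    # layer B -> layer A
        print(x, y)             # -> X[0] and Y[0]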