Processing Architecture

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 169,176 Experts worldwide, ranked by the ideXlab platform

Xiaoniu Yang - One of the best experts on this subject based on the ideXlab platform.

  • Big Data Processing Architecture for Radio Signals Empowered by Deep Learning: Concept, Experiment, Applications and Challenges
    IEEE Access, 2018
    Co-Authors: Shilian Zheng, Shichuan Chen, Lifeng Yang, Jiawei Zhu, Zhenxing Luo, Xiaoniu Yang
    Abstract:

    In modern society, the demand for radio spectrum resources is increasing. As the information carriers of wireless data transmission, radio signals exhibit the big data characteristics of volume, variety, value, and velocity. Uniformly handling these radio signals and extracting value from them remains an open problem. In this paper, a big data Processing Architecture for radio signals is presented, and a new approach to end-to-end signal Processing based on deep learning is discussed in detail. A radio signal intelligent search engine is used as an example to verify the Architecture, and its system components and experimental results are introduced. In addition, applications of the Architecture in cognitive radio, spectrum monitoring, and cyberspace security are introduced. Finally, challenges are discussed, such as unified representation of radio signal features, distortionless compression of wideband sampled data, and deep neural networks for radio signals.
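The "end-to-end" idea above maps raw sampled I/Q data directly to a decision, with learned features replacing hand-crafted ones. A minimal, framework-free sketch of that pipeline shape is shown below; the kernel weights, pooling step, and class names are hypothetical illustrations, not the paper's network.

```python
# Toy end-to-end pipeline on raw I/Q samples (hedged sketch, not the
# authors' deep network): envelope -> 1-D convolution -> pooling -> label.

def conv1d(x, k):
    """Valid-mode 1-D convolution of sequence x with kernel k."""
    n = len(x) - len(k) + 1
    return [sum(x[i + j] * k[j] for j in range(len(k))) for i in range(n)]

def classify(iq):
    """iq: list of (I, Q) pairs; returns a hypothetical class label."""
    mag = [(i * i + q * q) ** 0.5 for i, q in iq]   # signal envelope
    feat = conv1d(mag, [0.25] * 4)                  # smoothed "feature map"
    score = sum(feat) / len(feat)                   # global average pooling
    return "constant-envelope" if score > 0.9 else "varying-envelope"
```

A real system would learn the kernel and threshold from labeled signal data, but the data flow (raw samples in, decision out, no manual feature engineering) is the same.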

Wei Jun-yin - One of the best experts on this subject based on the ideXlab platform.

  • Ontology Processing Architecture for Chinese Healthcare Informative Data Set
    Computer Engineering, 2009
    Co-Authors: Wei Jun-yin
    Abstract:

    Because healthcare information data sets do not support semantic integration, automatic Processing, or reasoning, this paper proposes a Knowledge Base (KBS) Processing Architecture built on an ontology database with the mathematical foundations of Description Logics. It brings the digitalization of healthcare information into the track of knowledge engineering and provides a scalable, normative, dependable, and maintainable KBS. Using the pre-marriage medical examination data sets as a reference case, the results show that the Processing Architecture can implement automatic Processing and reasoning of knowledge.
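The core reasoning task a Description Logics knowledge base supports is subsumption: deciding whether one concept is a subclass of another by following axioms transitively. A minimal sketch of that check is below; the concept names and axioms are invented toy examples, not the paper's healthcare data set.

```python
# Toy subsumption reasoning over subclass axioms (hypothetical concepts).
# A DL reasoner generalizes this to full concept expressions; here each
# concept has at most one declared parent.
subclass_of = {
    "PreMarriageExam": "MedicalExam",
    "MedicalExam": "HealthcareRecord",
}

def is_a(concept, ancestor):
    """True if `concept` is subsumed by `ancestor` under the axioms."""
    while True:
        if concept == ancestor:
            return True
        if concept not in subclass_of:
            return False
        concept = subclass_of[concept]      # climb the hierarchy
```

This transitive closure is what lets queries posed against a general concept automatically retrieve data recorded under its specializations.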

M. Weinhardt - One of the best experts on this subject based on the ideXlab platform.

  • PACT XPP—A Self-Reconfigurable Data Processing Architecture
    The Journal of Supercomputing, 2003
    Co-Authors: V. Baumgarte, G. Ehlers, F. May, A. Nückel, M. Vorbach, M. Weinhardt
    Abstract:

    The eXtreme Processing Platform (XPP™) is a new runtime-reconfigurable data Processing Architecture. It is based on a hierarchical array of coarse-grain, adaptive computing elements and a packet-oriented communication network. The strength of the XPP™ technology originates from the combination of array Processing with unique, powerful run-time reconfiguration mechanisms. Parts of the array can be configured rapidly in parallel while neighboring computing elements are Processing data. Reconfiguration is triggered externally or even by special event signals originating within the array, enabling self-reconfiguring designs. The XPP™ Architecture is designed to support different types of parallelism: pipelining, instruction-level, data-flow, and task-level parallelism. This technology is therefore well suited for applications in multimedia, telecommunications, simulation, signal Processing (DSP), graphics, and similar stream-based application domains. The anticipated peak performance of the first commercial device running at 150 MHz is estimated to be 57.6 GigaOps/sec, with a peak I/O bandwidth of several GByte/sec. Simulated applications achieve up to 43.5 GigaOps/sec (32-bit fixed point).
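The quoted figures imply how many operations the array must retire per clock cycle, which is a quick sanity check on the claimed degree of parallelism:

```python
# Back-of-the-envelope check on the XPP throughput figures quoted above.
clock_hz = 150e6            # first commercial device clock
peak_ops_per_sec = 57.6e9   # anticipated peak
sim_ops_per_sec = 43.5e9    # best simulated application (32-bit fixed point)

peak_ops_per_cycle = peak_ops_per_sec / clock_hz   # 384 ops every cycle
sim_ops_per_cycle = sim_ops_per_sec / clock_hz     # 290 ops every cycle
utilization = sim_ops_per_sec / peak_ops_per_sec   # ~75.5% of peak
```

So the peak figure corresponds to 384 operations completing in parallel each cycle, and the best simulated application sustains roughly three quarters of that.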

Gregory Tauer - One of the best experts on this subject based on the ideXlab platform.

  • FUSION - Towards hard+soft data fusion: Processing Architecture and implementation for the joint fusion and analysis of hard and soft intelligence data
    International Conference on Information Fusion, 2012
    Co-Authors: Geoff A Gross, Rakesh Nagi, Kedar Sambhoos, Daniel R Schlegel, Stuart C Shapiro, Gregory Tauer
    Abstract:

    Historically, data fusion has focused on Processing hard or physical sensor data while soft or human observed data has been neglected within fusion processes. This human observed data has much to offer towards obtaining comprehensive situational awareness, particularly in a domain such as intelligence analysis where subtle connections and interactions are difficult to observe with physical sensors. This paper describes the Processing Architecture designed and implemented for the fusion of hard and soft data in the multi-university research initiative on network-based hard and soft information fusion. The Processing elements designed to fuse and reason over the hard and soft data include: natural language Processing to form propositional graphs from linguistic observations; conversion of the propositional graphs to attributed graphical form; alignment and tagging of the uncertainties extant in the human observations; conversion of hard data tracks to a graphical format; association of entities and relations in the observational hard and soft data graphs; and the matching of situations of interest to the cumulative data or evidential graph. To illustrate these Processing elements within the integrated Processing Architecture, a small synthetic data set entitled the bomber buster scenario is utilized, presenting examples of each Processing element along the Processing flow. The value of fusing hard and soft information is illustrated by demonstrating that individually, neither hard nor soft information could provide the situation estimate.
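The association step above links entities appearing in the hard-sensor track graph with entities extracted from soft reports. A minimal sketch of attribute-based association is shown below; the entity records, attribute names, and exact-match rule are hypothetical simplifications (the paper's system reasons over attributed graphs with uncertainty, not exact matches).

```python
# Hedged sketch: associate entities across a hard-track graph and a
# soft-report graph by matching shared attributes (toy data, toy rule).
hard_tracks = [
    {"id": "T1", "location": "market", "type": "vehicle"},
]
soft_entities = [
    {"id": "E7", "location": "market", "type": "vehicle"},
    {"id": "E8", "location": "depot",  "type": "person"},
]

def associate(hard, soft):
    """Return (hard_id, soft_id) pairs whose attributes agree."""
    pairs = []
    for h in hard:
        for s in soft:
            if h["location"] == s["location"] and h["type"] == s["type"]:
                pairs.append((h["id"], s["id"]))
    return pairs
```

Associated pairs are what allow evidence from a human report and a physical sensor track to accumulate on a single node of the evidential graph.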

Thierry Collette - One of the best experts on this subject based on the ideXlab platform.