Semantic Relation

14,000,000 Leading Edge Experts on the ideXlab platform


The experts below are selected from a list of 49,011 experts worldwide, ranked by the ideXlab platform

Guo-dong Zhou - One of the best experts on this subject based on the ideXlab platform.

  • Kernel-Based Semantic Relation Detection and Classification via Enriched Parse Tree Structure
    Journal of Computer Science and Technology, 2011
    Co-Authors: Guo-dong Zhou, Qiao Ming Zhu
    Abstract:

    This paper proposes a tree kernel method for semantic relation detection and classification (RDC) between named entities. It resolves two critical problems in previous tree kernel methods for RDC. First, a new tree kernel is presented to better capture the inherent structural information in a parse tree by extending the standard convolution tree kernel with context-sensitiveness and approximate matching of sub-trees. Second, an enriched parse tree structure is proposed to derive the necessary structural information, e.g., proper latent annotations, from a parse tree. Evaluation on the ACE RDC corpora shows that both the new tree kernel and the enriched parse tree structure contribute significantly to RDC, and that our tree kernel method substantially outperforms the state-of-the-art ones.
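
As background for the kernel described above, here is a minimal sketch of the standard convolution tree kernel that it extends, which counts common sub-trees with a decay factor. The tree encoding and the λ value are illustrative choices, not taken from the paper:

```python
# Standard convolution tree kernel (Collins-Duffy style) sketch.
# Trees are nested tuples: (label, child, child, ...); a 1-tuple is a leaf.

LAMBDA = 0.5  # decay factor that down-weights larger sub-trees

def nodes(tree):
    """Yield the sub-tree rooted at every node."""
    yield tree
    for child in tree[1:]:
        yield from nodes(child)

def delta(n1, n2):
    """Decayed count of common sub-trees rooted at n1 and n2."""
    if n1[0] != n2[0]:
        return 0.0
    if len(n1) == 1 and len(n2) == 1:    # matching leaves/pre-terminals
        return LAMBDA
    if [c[0] for c in n1[1:]] != [c[0] for c in n2[1:]]:
        return 0.0                       # different productions
    prod = LAMBDA
    for c1, c2 in zip(n1[1:], n2[1:]):
        prod *= 1.0 + delta(c1, c2)
    return prod

def tree_kernel(t1, t2):
    """Sum delta over all node pairs of the two trees."""
    return sum(delta(a, b) for a in nodes(t1) for b in nodes(t2))
```

The paper's contribution replaces this context-free matching with context-sensitive and approximate sub-tree matching; the recursion above is only the baseline being extended.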

  • Employing Constituent Dependency Information for Tree Kernel-Based Semantic Relation Extraction between Named Entities
    ACM Transactions on Asian Language Information Processing, 2011
    Co-Authors: Longhua Qian, Guo-dong Zhou, Qiao Ming Zhu
    Abstract:

    This article proposes a new approach to dynamically determining the tree span for tree kernel-based semantic relation extraction between named entities. The basic idea is to employ constituent dependency information to keep the necessary nodes and their head children along the path connecting the two entities in the syntactic parse tree, while removing noisy information from the tree, eventually leading to a dynamic syntactic parse tree. This article also explores various entity features and their possible combinations via a unified syntactic and semantic tree framework, which integrates both structural syntactic parse information and entity-related semantic information. Evaluation on the ACE RDC 2004 English and 2005 Chinese benchmark corpora shows that our dynamic syntactic parse tree substantially outperforms all previous tree spans, indicating its effectiveness in representing the structural nature of relation instances while removing redundant information. Moreover, the unified parse and semantic tree significantly outperforms the single syntactic parse tree, largely due to the remarkable contributions of entity-related semantic features such as entity type, subtype, and mention level, as well as their bi-gram combinations. Finally, the best performance so far in semantic relation extraction is achieved via a composite kernel, which combines this tree kernel with a linear, state-of-the-art, feature-based kernel.
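
The dynamic tree-span idea above can be sketched as pruning a constituent tree down to the nodes on the paths from the root to the two entity mentions. This simplified version omits the head-child retention rules the article additionally applies, and all node names are illustrative:

```python
# Prune a constituent tree to the root-to-entity paths ("dynamic tree span").
# Nodes are dicts: {"label": str, "children": [node, ...]}.
# Assumes both entity labels occur in the tree; head-child retention omitted.

def path_to(tree, target, path=None):
    """Return the node list from the root down to the node labeled `target`."""
    path = (path or []) + [tree]
    if tree["label"] == target:
        return path
    for child in tree["children"]:
        found = path_to(child, target, path)
        if found:
            return found
    return None

def dynamic_span(tree, e1, e2):
    """Keep only nodes on the root-to-e1 and root-to-e2 paths."""
    keep = {id(n) for n in path_to(tree, e1) + path_to(tree, e2)}

    def prune(node):
        kids = [prune(c) for c in node["children"] if id(c) in keep]
        return {"label": node["label"], "children": kids}

    return prune(tree)
```

Sub-trees that hang off the connecting path (the "noisy information" in the abstract) simply disappear, which is the effect the dynamic span is after.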

  • Tree Kernel-Based Semantic Relation Extraction with Rich Syntactic and Semantic Information
    Information Sciences, 2010
    Co-Authors: Guo-dong Zhou, Longhua Qian, Jianxi Fan
    Abstract:

    This paper proposes a novel tree kernel-based method with rich syntactic and semantic information for the extraction of semantic relations between named entities. Given a parse tree and an entity pair, we first construct a rich semantic relation tree structure that integrates both syntactic and semantic information. We then propose a context-sensitive convolution tree kernel, which enumerates both context-free and context-sensitive sub-trees by considering the paths of their ancestor nodes as their contexts, in order to capture structural information in the tree structure. An evaluation on the Automatic Content Extraction/Relation Detection and Characterization (ACE RDC) corpora shows that the proposed tree kernel-based method outperforms other state-of-the-art methods.

  • Exploiting Constituent Dependencies for Tree Kernel-Based Semantic Relation Extraction
    Proceedings of the 22nd …, 2008
    Co-Authors: Longhua Qian, Qiao Ming Zhu, Fang Kong, Guo-dong Zhou, Peide Qian
    Abstract:

    This paper proposes a new approach to dynamically determining the tree span for tree kernel-based semantic relation extraction. It exploits constituent dependencies to keep the nodes and their head children along the path connecting the two entities, while removing noisy information from the syntactic parse tree, eventually leading to a dynamic syntactic parse tree. This paper also explores entity features and their combinations in a unified parse and semantic tree, which integrates both structured syntactic parse information and entity-related semantic information. Evaluation on the ACE RDC 2004 corpus shows that our dynamic syntactic parse tree outperforms all previous tree spans, and that the composite kernel combining this tree kernel with a linear, state-of-the-art, feature-based kernel achieves the best performance so far.

Longhua Qian - One of the best experts on this subject based on the ideXlab platform.

  • Employing Constituent Dependency Information for Tree Kernel-Based Semantic Relation Extraction between Named Entities
    ACM Transactions on Asian Language Information Processing, 2011
    Co-Authors: Longhua Qian, Guo-dong Zhou, Qiao Ming Zhu

  • Tree Kernel-Based Semantic Relation Extraction with Rich Syntactic and Semantic Information
    Information Sciences, 2010
    Co-Authors: Guo-dong Zhou, Longhua Qian, Jianxi Fan

  • Exploiting Constituent Dependencies for Tree Kernel-Based Semantic Relation Extraction
    Proceedings of the 22nd …, 2008
    Co-Authors: Longhua Qian, Qiao Ming Zhu, Fang Kong, Guo-dong Zhou, Peide Qian

Yunchuan Sun - One of the best experts on this subject based on the ideXlab platform.

  • Semantic Relation Computing Theory and Its Application
    Journal of Network and Computer Applications, 2016
    Co-Authors: Yunchuan Sun, Rongfang Bie, Cheng Lu, Junsheng Zhang
    Abstract:

    Semantic relations among objects are primary semantic factors, which play the most important role for humans and smart systems/machines in understanding and controlling situations in the context of connected systems. However, few existing works study semantic relations from a mathematical viewpoint, though this would be the basis for further research on semantics. Existing research focuses on the representation of semantic relations and reasoning with relations, but seldom concentrates on semantic relation computing, including accurate reasoning, integrity checking, and redundancy checking. In this paper, we aim to explore an algebraic computing approach to semantic relations. A mathematical computing theory for establishing semantic relations, semantic relation space theory, is proposed for the first time, including the concepts of a semantic relation basis, an orthogonal basis, and some basic operations on semantic relations. The proposed theory can be used to represent the semantic relations among objects in an accurate way and to deduce implicit relations in connected smart systems, especially for automatic reasoning and autonomous computing. Furthermore, integrity and consistency issues among semantic relations are also discussed based on the theory. A case study in the scientific research domain shows the feasibility and effectiveness of the proposed theory.
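
The deduction of implicit relations described above can be illustrated by a toy closure of a fact set under relation-composition rules. The rule table here is hypothetical and merely stands in for the paper's algebraic operations:

```python
# Toy closure of a fact set under relation-composition rules.
# Facts are (subject, relation, object) triples; rules map a relation
# pair (r1, r2) to the relation implied by r1 followed by r2.

RULES = {
    ("partOf", "partOf"): "partOf",        # part-of is transitive
    ("locatedIn", "partOf"): "locatedIn",  # hypothetical example rule
}

def deduce(facts, rules=RULES):
    """Repeatedly apply the rules until no new triple can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for a, r1, b in list(facts):
            for c, r2, d in list(facts):
                if b == c and (r1, r2) in rules:
                    new = (a, rules[(r1, r2)], d)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts
```

Integrity checking in the paper's sense would then amount to verifying that the closed set contains no contradictory triples, and redundancy checking to identifying triples already implied by the rest.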

  • Theory for Semantic Relation Computing and Its Application in Semantic Link Network
    Semantics Knowledge and Grid, 2012
    Co-Authors: Yunchuan Sun, Hongli Yan, Rongfang Bie
    Abstract:

    Semantic relations among different objects are one of the most important kinds of semantics, playing the primary role for people and intelligent systems in grasping a situation accurately in the context of connected systems. Semantic relations are viewed as the most important elements in most existing information models (such as the ER model, RDF, and SLN) during the mapping from the physical world into the cyber world, where efficient ways of representing semantic relations have been developed. However, all these models deal with semantic relations in a simple and intuitive way; few works go deep into the study of semantic relations from a mathematical viewpoint. This paper aims to explore a mathematical theory of semantic relations between objects, including representation methods, normal forms of operations, and a semantic orthogonal basis for semantic relations. We also discuss the reasoning mechanism based on the proposed theory. Finally, applications of the developed theory in semantic link networks are discussed.

  • The Schema Theory for Semantic Link Network
    Future Generation Computer Systems, 2010
    Co-Authors: Hai Zhuge, Yunchuan Sun
    Abstract:

    The Semantic Link Network (SLN) is a loosely coupled semantic data model for managing Web resources. Its nodes can be any type of resource, and its edges can be any semantic relation. Potential semantic links can be derived according to reasoning rules on semantic relations. This paper proposes the schema theory for the SLN, including the concepts, rule-constraint normal forms, and relevant algorithms. The theory provides the basis for normalized management of semantic link networks. A case study demonstrates the proposed theory.

Yoshimasa Tsuruoka - One of the best experts on this subject based on the ideXlab platform.

  • Task-Oriented Learning of Word Embeddings for Semantic Relation Classification
    Conference on Computational Natural Language Learning, 2015
    Co-Authors: Kazuma Hashimoto, Makoto Miwa, Pontus Stenetorp, Yoshimasa Tsuruoka
    Abstract:

    We present a novel learning method for word embeddings designed for relation classification. Our word embeddings are trained by predicting words between noun pairs using lexical relation-specific features on a large unlabeled corpus. This allows us to explicitly incorporate relation-specific information into the word embeddings. The learned word embeddings are then used to construct feature vectors for a relation classification model. On a well-established semantic relation classification task, our method significantly outperforms a baseline based on a previously introduced word embedding method, and compares favorably to previous state-of-the-art models that use syntactic information or manually constructed external resources.
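
The step of turning learned embeddings into feature vectors for a relation classifier can be sketched as follows; the exact features in the paper differ, and this layout (the two noun embeddings plus the mean of the in-between words) is only an assumed illustration:

```python
import numpy as np

# Feature vector for a noun pair: the two noun embeddings concatenated with
# the mean embedding of the words between them.

def relation_features(tokens, i, j, embed):
    """tokens: word list; i < j index the two nouns; embed maps word -> vector."""
    between = tokens[i + 1 : j]
    if between:
        mid = np.mean([embed[w] for w in between], axis=0)
    else:
        mid = np.zeros_like(embed[tokens[i]])
    return np.concatenate([embed[tokens[i]], mid, embed[tokens[j]]])
```

Such a vector would then be fed to any standard classifier; the paper's claim is that embeddings trained on the between-words prediction task make these features more informative than generic embeddings.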

  • Simple Customization of Recursive Neural Networks for Semantic Relation Classification
    Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 2013
    Co-Authors: Kazuma Hashimoto, Yoshimasa Tsuruoka, Makoto Miwa, Takashi Chikayama
    Abstract:

    In this paper, we present a recursive neural network (RNN) model that works on a syntactic tree. Our model differs from previous RNN models in that it allows for an explicit weighting of important phrases for the target task. We also propose averaging parameters during training. Our experimental results on semantic relation classification show that both phrase categories and task-specific weighting significantly improve the prediction accuracy of the model. We also show that averaging the model parameters is effective in stabilizing learning and improves generalization capacity. The proposed model achieves scores competitive with state-of-the-art RNN-based models.
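
The phrase-weighted composition described above can be sketched with a toy binary tree network; the dimensions, matrix, and category-weight table are illustrative stand-ins, not learned values from the paper:

```python
import numpy as np

# Recursive composition over a binary syntactic tree with per-category
# scalar weights. A tree is either a word (str) or (category, left, right).

D = 4                                       # embedding dimension (illustrative)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(D, 2 * D))  # composition matrix (untrained)
CATEGORY_WEIGHT = {"NP": 1.5, "VP": 1.0}    # learned jointly in the real model
DEFAULT_WEIGHT = 0.5

def compose(tree, embed):
    """Bottom-up composition; returns a D-dimensional phrase vector."""
    if isinstance(tree, str):
        return embed[tree]
    cat, left, right = tree
    h = np.tanh(W @ np.concatenate([compose(left, embed), compose(right, embed)]))
    return CATEGORY_WEIGHT.get(cat, DEFAULT_WEIGHT) * h
```

The scalar per-category factor is the "explicit weighting of important phrases": categories the task cares about (e.g. noun phrases holding the entities) can amplify their sub-tree's contribution to the root vector.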

Xiangrong Zhang - One of the best experts on this subject based on the ideXlab platform.

  • Bio-Semantic Relation Extraction with Attention-Based External Knowledge Reinforcement
    BMC Bioinformatics, 2020
    Co-Authors: Zhijing Li, Yuchen Lian, Xiangrong Zhang, Chen Li
    Abstract:

    BACKGROUND Semantic resources such as knowledge bases contain high-quality structured knowledge and therefore require significant effort from domain experts. Using these resources to reinforce information retrieval from unstructured text may further exploit the potential of such unstructured text resources and their curated knowledge. RESULTS This paper proposes a novel method that uses a deep neural network model adopting prior knowledge to improve performance in the automated extraction of biological semantic relations from the scientific literature. The model is based on a recurrent neural network that combines an attention mechanism with semantic resources, i.e., UniProt and BioModels. Our method is evaluated on the BioNLP and BioCreative corpora, sets of manually annotated biological text. The experiments demonstrate that the method outperforms current state-of-the-art models and that the structured semantic information improves the results of bio-text mining. CONCLUSION The experimental results show that our approach can effectively make use of external prior knowledge and improve performance on the protein-protein interaction extraction task. The method should generalize to other types of data, although it is validated on biomedical texts.
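
The attention over external knowledge can be illustrated schematically: token states are scored against a knowledge embedding (e.g. a protein vector from a resource such as UniProt) and combined with softmax weights. The shapes and the dot-product scoring are assumptions for illustration, not the paper's exact architecture:

```python
import numpy as np

# Attention over token states conditioned on an external knowledge vector.

def knowledge_attention(states, know):
    """states: (T, D) token states; know: (D,) knowledge embedding."""
    scores = states @ know                  # relevance of each token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over the T tokens
    return weights @ states                 # knowledge-weighted summary
```

Tokens whose states align with the knowledge vector dominate the summary, which is how the prior knowledge "reinforces" the extraction.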
