Sequence Model

The experts below are selected from a list of 513,015 experts worldwide, ranked by the ideXlab platform.

Ho Lam Lau - One of the best experts on this subject based on the ideXlab platform.

  • managing xml by the nested relational Sequence database system
    Web-Age Information Management, 2003
    Co-Authors: Ho Lam Lau
    Abstract:

    We have established the Nested Relational Sequence Model (NRSM), an extension of the Nested Relational Data Model designed to handle XML data. The NRSM incorporates the value order, the sibling order and the ancestor order. In this paper, we demonstrate NRS operations by examples and illustrate how XML queries can be formulated. We discuss the work related to implementing the NRSD system and present our preliminary results on its use, which confirm that the NRSD reduces the data size.

  • querying xml data by the nested relational Sequence database system
    International Database Engineering and Applications Symposium, 2003
    Co-Authors: Ho Lam Lau
    Abstract:

    In this paper, we present the nested relational Sequence Model (NRSM), an extension of the nested relational data Model designed to handle XML data. We also introduce a set of algebraic operations for the NRSM that manipulate XML documents via NRS relations. We demonstrate the NRS operations by examples and illustrate how XML queries can be formulated within the NRSM. We also introduce the ongoing work of translating XQuery into NRS operations.

  • using index to manage the ordering structure of xml data in the nested relational Sequence database system
    The Web Conference, 2003
    Co-Authors: Ho Lam Lau
    Abstract:

    In this poster, we introduce the mechanism for handling the ordering structure of XML data in the Nested Relational Sequence Database System (NRSD), which is built upon the Nested Relational Sequence Model (NRSM) we developed earlier. We show that storing and querying XML data over an NRS relation become practical once an index system is incorporated into the NRSD. Our preliminary experimental results show that the NRSD minimizes the storage size while maintaining the three types of order in XML data: the ancestor order, the sibling order and the value order.
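
    The abstract does not describe how the NRSD index encodes these three orders, so the following is only an editorial sketch under an assumption: Dewey-style ordinal labels are one standard way to index ancestor and sibling order for XML nodes, with the textual value kept attached to each labelled node.

    ```python
    import xml.etree.ElementTree as ET

    def dewey_labels(node, prefix=(), out=None):
        """Assign each element a tuple label: the prefix relation captures
        ancestor order, the last component captures sibling order, and the
        element's text (the value) stays attached to the labelled node."""
        if out is None:
            out = {}
        out[node] = prefix
        for i, child in enumerate(node):          # children in document order
            dewey_labels(child, prefix + (i,), out)
        return out

    def is_ancestor(a_label, d_label):
        # a is an ancestor of d iff a's label is a proper prefix of d's
        return len(a_label) < len(d_label) and d_label[:len(a_label)] == a_label

    root = ET.fromstring("<book><title>XML</title><author>Lau</author></book>")
    labels = dewey_labels(root)   # root -> (), title -> (0,), author -> (1,)
    ```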

  • querying xml data based on nested relational Sequence Model
    Poster Proceedings of the World Wide Web WWW'2002, 2002
    Co-Authors: Ho Lam Lau
    Abstract:

    We propose the Nested Relational Sequence Model (NRSM) [6], an extension of the well-established Nested Relational Data Model (NRDM) [2,3,4] that caters for two important features of XML documents: their nesting structure and their node ordering [1,5]. The NRSM supports composite and multi-valued attributes, which are essential for representing hierarchically structured objects such as XML data. In addition, the NRSM extends the NRDM to support ordering of XML data by allowing nested tuple Sequences in a nested Sequence relation (or an NRS relation). An important feature of our Model is that XML data having the same label along the same path can be collapsed into the same data node, which eliminates a substantial amount of redundancy in an XML document. An NRS relation R is defined by R = (N, O, S), where N is the NRS name, O is the NRS occurrence and S is the NRS schema. In Figure 1 we show an example of mapping an XML data tree into an NRS relation. Within the NRSM we define a set of algebraic operations, which is employed to formulate queries over an NRS relation that contains XML data. These operations enable users to retrieve XML information and to integrate XML documents in a systematic manner: taking one or more NRS relations as input, an NRS query returns an NRS relation that represents an XML data tree. We summarize these operations in Table 1.
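
    As a rough editorial illustration of the R = (N, O, S) definition and of the path-collapsing idea (the class and field names below are hypothetical, not the NRSD implementation):

    ```python
    from collections import defaultdict
    import xml.etree.ElementTree as ET

    class NRSRelation:
        """Toy stand-in for an NRS relation R = (N, O, S):
        N = name, O = occurrence (ordered value sequences grouped per label
        path), S = schema (the label paths seen, in document order).
        Nodes sharing the same label along the same path collapse into one
        schema entry, so repeated tags add values rather than new nodes."""
        def __init__(self, name):
            self.name = name                        # N
            self.occurrence = defaultdict(list)     # O: path -> ordered values
            self.schema = []                        # S: label paths

        def load(self, element, path=()):
            path = path + (element.tag,)
            if path not in self.schema:
                self.schema.append(path)
            text = (element.text or "").strip()
            if text:
                self.occurrence[path].append(text)  # value order preserved
            for child in element:                   # sibling order preserved
                self.load(child, path)

    xml = "<books><book><title>A</title></book><book><title>B</title></book></books>"
    r = NRSRelation("books_db")
    r.load(ET.fromstring(xml))
    # r.schema -> [('books',), ('books','book'), ('books','book','title')]
    # r.occurrence[('books','book','title')] -> ['A', 'B']   (collapsed node)
    ```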

Vadim Sheinin - One of the best experts on this subject based on the ideXlab platform.

  • sql to text generation with graph to Sequence Model
    arXiv: Computation and Language, 2018
    Co-Authors: Zhiguo Wang, Yansong Feng, Vadim Sheinin
    Abstract:

    Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq Models, which may not fully capture the inherent graph-structured information in SQL queries. In this paper, we first introduce a strategy to represent the SQL query as a directed graph and then employ a graph-to-Sequence Model to encode the global structure information into node embeddings. This Model can effectively learn the correlation between the SQL query pattern and its interpretation. Experimental results on the WikiSQL dataset and Stackoverflow dataset show that our Model significantly outperforms the Seq2Seq and Tree2Seq baselines, achieving state-of-the-art performance.
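
    The exact graph construction is not given in the abstract, so the following is a simplified, hypothetical sketch of the idea: turn a toy SELECT query into a directed graph whose nodes are keywords, columns, the table, and condition strings, so that a graph encoder can see the query structure instead of a flat token sequence.

    ```python
    def sql_to_graph(select_cols, table, where):
        """Return the edge list of a directed graph for a toy
        SELECT ... FROM ... WHERE ... query.
        where: list of (column, operator, value) triples."""
        edges = [("SELECT", table)]
        for col in select_cols:
            edges.append(("SELECT", col))
            edges.append((col, table))
        if where:
            edges.append((table, "WHERE"))
            for col, op, val in where:
                edges.append(("WHERE", f"{col} {op} {val}"))
        return edges

    # SELECT name, age FROM people WHERE age > 30
    edges = sql_to_graph(["name", "age"], "people", [("age", ">", "30")])
    # A graph-to-sequence model would encode these edges into node embeddings
    # and decode the natural-language description of the query.
    ```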

  • exploiting rich syntactic information for semantic parsing with graph to Sequence Model
    arXiv: Computation and Language, 2018
    Co-Authors: Zhiguo Wang, Liwei Chen, Vadim Sheinin
    Abstract:

    Existing neural semantic parsers mainly utilize a Sequence encoder, i.e., a sequential LSTM, to extract word order features while neglecting other valuable syntactic information such as dependency graphs or constituent trees. In this paper, we first propose to use the syntactic graph to represent three types of syntactic information, i.e., word order, dependency and constituency features. We further employ a graph-to-Sequence Model to encode the syntactic graph and decode a logical form. Experimental results on benchmark datasets show that our Model is comparable to the state-of-the-art on Jobs640, ATIS and Geo880. Experimental results on adversarial examples demonstrate that the robustness of the Model is also improved by encoding more syntactic information.
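
    A hedged illustration of the syntactic-graph idea (not the authors' code): merge word-order, dependency, and constituency information into a single edge-labelled graph. The dependency and constituent edges below are hand-written for the toy sentence instead of coming from a real parser.

    ```python
    tokens = ["book", "a", "flight", "to", "Boston"]

    edges = []
    # 1) word-order edges: connect consecutive tokens
    for i in range(len(tokens) - 1):
        edges.append((tokens[i], tokens[i + 1], "next"))
    # 2) dependency edges (hand-specified for this toy example)
    edges += [("book", "flight", "dobj"), ("flight", "a", "det"),
              ("book", "to", "prep"), ("to", "Boston", "pobj")]
    # 3) constituency edges: non-terminal nodes link to what they cover
    edges += [("NP", "a", "child"), ("NP", "flight", "child"),
              ("PP", "to", "child"), ("PP", "Boston", "child"),
              ("VP", "book", "child"), ("VP", "NP", "child"), ("VP", "PP", "child")]

    # A graph-to-sequence model would encode this edge list into node
    # embeddings and decode a logical form for the sentence.
    ```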

  • exploiting rich syntactic information for semantic parsing with graph to Sequence Model
    Empirical Methods in Natural Language Processing, 2018
    Co-Authors: Zhiguo Wang, Liwei Chen, Vadim Sheinin
    Abstract:

    Existing neural semantic parsers mainly utilize a Sequence encoder, i.e., a sequential LSTM, to extract word order features while neglecting other valuable syntactic information such as dependency or constituent trees. In this paper, we first propose to use the syntactic graph to represent three types of syntactic information, i.e., word order, dependency and constituency features; we then employ a graph-to-Sequence Model to encode the syntactic graph and decode a logical form. Experimental results on benchmark datasets show that our Model is comparable to the state-of-the-art on Jobs640, ATIS, and Geo880. Experimental results on adversarial examples demonstrate that the robustness of the Model is also improved by encoding more syntactic information.

  • sql to text generation with graph to Sequence Model
    Empirical Methods in Natural Language Processing, 2018
    Co-Authors: Zhiguo Wang, Yansong Feng, Vadim Sheinin
    Abstract:

    Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq Models, which may not fully capture the inherent graph-structured information in SQL queries. In this paper, we propose a graph-to-Sequence Model to encode the global structure information into node embeddings. This Model can effectively learn the correlation between the SQL query pattern and its interpretation. Experimental results on the WikiSQL dataset and Stackoverflow dataset show that our Model outperforms the Seq2Seq and Tree2Seq baselines, achieving state-of-the-art performance.

Olga G Troyanskaya - One of the best experts on this subject based on the ideXlab platform.

Zhiguo Wang - One of the best experts on this subject based on the ideXlab platform.

  • sql to text generation with graph to Sequence Model
    arXiv: Computation and Language, 2018
    Co-Authors: Zhiguo Wang, Yansong Feng, Vadim Sheinin
    Abstract:

    Previous work approaches the SQL-to-text generation task using vanilla Seq2Seq Models, which may not fully capture the inherent graph-structured information in SQL queries. In this paper, we first introduce a strategy to represent the SQL query as a directed graph and then employ a graph-to-Sequence Model to encode the global structure information into node embeddings. This Model can effectively learn the correlation between the SQL query pattern and its interpretation. Experimental results on the WikiSQL dataset and Stackoverflow dataset show that our Model significantly outperforms the Seq2Seq and Tree2Seq baselines, achieving state-of-the-art performance.

  • exploiting rich syntactic information for semantic parsing with graph to Sequence Model
    arXiv: Computation and Language, 2018
    Co-Authors: Zhiguo Wang, Liwei Chen, Vadim Sheinin
    Abstract:

    Existing neural semantic parsers mainly utilize a Sequence encoder, i.e., a sequential LSTM, to extract word order features while neglecting other valuable syntactic information such as dependency graphs or constituent trees. In this paper, we first propose to use the syntactic graph to represent three types of syntactic information, i.e., word order, dependency and constituency features. We further employ a graph-to-Sequence Model to encode the syntactic graph and decode a logical form. Experimental results on benchmark datasets show that our Model is comparable to the state-of-the-art on Jobs640, ATIS and Geo880. Experimental results on adversarial examples demonstrate that the robustness of the Model is also improved by encoding more syntactic information.

  • exploiting rich syntactic information for semantic parsing with graph to Sequence Model
    Empirical Methods in Natural Language Processing, 2018
    Co-Authors: Zhiguo Wang, Liwei Chen, Vadim Sheinin
    Abstract:

    Existing neural semantic parsers mainly utilize a Sequence encoder, i.e., a sequential LSTM, to extract word order features while neglecting other valuable syntactic information such as dependency or constituent trees. In this paper, we first propose to use the syntactic graph to represent three types of syntactic information, i.e., word order, dependency and constituency features; we then employ a graph-to-Sequence Model to encode the syntactic graph and decode a logical form. Experimental results on benchmark datasets show that our Model is comparable to the state-of-the-art on Jobs640, ATIS, and Geo880. Experimental results on adversarial examples demonstrate that the robustness of the Model is also improved by encoding more syntactic information.

  • a graph to Sequence Model for amr to text generation
    Meeting of the Association for Computational Linguistics, 2018
    Co-Authors: Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea
    Abstract:

    The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph. The current state-of-the-art method uses a Sequence-to-Sequence Model, leveraging an LSTM to encode a linearized AMR structure. Although able to Model non-local semantic information, a Sequence LSTM can lose information from the AMR graph structure, and thus faces challenges with large graphs, which result in long Sequences. We introduce a neural graph-to-Sequence Model, using a novel LSTM structure for directly encoding graph-level semantics. On a standard benchmark, our Model shows superior results to existing methods in the literature.
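
    The paper's graph-encoding LSTM cell is not reproduced here; the numpy toy below only illustrates the general idea of iteratively refreshing each AMR node's state from its neighbours' states (a plain sigmoid blend stands in for the actual gated cell).

    ```python
    import numpy as np

    def graph_state_step(states, edges, W, U):
        """One update step: states maps node -> hidden vector,
        edges is a list of (src, dst) pairs over AMR concepts."""
        new_states = {}
        for node, h in states.items():
            neighbours = [states[s] for s, d in edges if d == node]
            m = np.mean(neighbours, axis=0) if neighbours else np.zeros_like(h)
            gate = 1.0 / (1.0 + np.exp(-(W @ h + U @ m)))   # sigmoid gate
            new_states[node] = gate * h + (1.0 - gate) * np.tanh(U @ m)
        return new_states

    # Toy AMR graph for "the boy wants to go": want-01 -> boy, want-01 -> go-01
    dim = 4
    rng = np.random.default_rng(0)
    states = {n: rng.standard_normal(dim) for n in ["want-01", "boy", "go-01"]}
    edges = [("want-01", "boy"), ("want-01", "go-01")]
    W, U = rng.standard_normal((dim, dim)), rng.standard_normal((dim, dim))
    states = graph_state_step(states, edges, W, U)   # repeat for several steps
    ```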

  • a graph to Sequence Model for amr to text generation
    arXiv: Computation and Language, 2018
    Co-Authors: Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea
    Abstract:

    The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph. The current state-of-the-art method uses a Sequence-to-Sequence Model, leveraging LSTM for encoding a linearized AMR structure. Although being able to Model non-local semantic information, a Sequence LSTM can lose information from the AMR graph structure, and thus faces challenges with large graphs, which result in long Sequences. We introduce a neural graph-to-Sequence Model, using a novel LSTM structure for directly encoding graph-level semantics. On a standard benchmark, our Model shows superior results to existing methods in the literature.

Hermann Ney - One of the best experts on this subject based on the ideXlab platform.

  • towards two dimensional Sequence to Sequence Model in neural machine translation
    arXiv: Computation and Language, 2018
    Co-Authors: Parnia Bahar, Christopher Brix, Hermann Ney
    Abstract:

    This work investigates an alternative Model for neural machine translation (NMT) and proposes a novel architecture, where we employ a multi-dimensional long short-term memory (MDLSTM) for translation Modeling. In the state-of-the-art methods, source and target sentences are treated as one-dimensional Sequences over time, while we view translation as a two-dimensional (2D) mapping using an MDLSTM layer to define the correspondence between source and target words. We extend beyond the current Sequence-to-Sequence backbone NMT Models to a 2D structure in which the source and target sentences are aligned with each other in a 2D grid. Our proposed topology shows consistent improvements over the attention-based Sequence-to-Sequence Model on two WMT 2017 tasks, German↔English.
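
    As a rough illustration of the 2D view (not the paper's MDLSTM cell): a hidden state is computed for every (source position i, target position j) pair from its left and lower neighbours, so the grid itself encodes the source-target correspondence. A tanh update stands in for the multi-dimensional LSTM cell.

    ```python
    import numpy as np

    def two_d_states(src_emb, tgt_emb, W):
        """src_emb, tgt_emb: lists of d-dimensional word embeddings."""
        I, J, d = len(src_emb), len(tgt_emb), src_emb[0].shape[0]
        H = np.zeros((I + 1, J + 1, d))             # row/column 0 = zero padding
        for i in range(1, I + 1):
            for j in range(1, J + 1):
                x = np.concatenate([src_emb[i - 1], tgt_emb[j - 1],
                                    H[i - 1, j], H[i, j - 1]])
                H[i, j] = np.tanh(W @ x)            # stand-in for the MDLSTM cell
        return H[1:, 1:]                            # one state per (i, j) pair

    d = 8
    rng = np.random.default_rng(1)
    src = [rng.standard_normal(d) for _ in "das Haus ist klein".split()]
    tgt = [rng.standard_normal(d) for _ in "the house is small".split()]
    W = rng.standard_normal((d, 4 * d))
    H = two_d_states(src, tgt, W)   # shape (4, 4, 8); the last column would
                                    # feed the output layer predicting target words
    ```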

  • towards two dimensional Sequence to Sequence Model in neural machine translation
    Empirical Methods in Natural Language Processing, 2018
    Co-Authors: Parnia Bahar, Christopher Brix, Hermann Ney
    Abstract:

    This work investigates an alternative Model for neural machine translation (NMT) and proposes a novel architecture, where we employ a multi-dimensional long short-term memory (MDLSTM) for translation Modelling. In the state-of-the-art methods, source and target sentences are treated as one-dimensional Sequences over time, while we view translation as a two-dimensional (2D) mapping using an MDLSTM layer to define the correspondence between source and target words. We extend beyond the current Sequence-to-Sequence backbone NMT Models to a 2D structure in which the source and target sentences are aligned with each other in a 2D grid. Our proposed topology shows consistent improvements over the attention-based Sequence-to-Sequence Model on two WMT 2017 tasks, German↔English.