Inference Rule

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The Experts below are selected from a list of 23487 Experts worldwide ranked by ideXlab platform

Damián Islas Mondragón - One of the best experts on this subject based on the ideXlab platform.

Xingxing He - One of the best experts on this subject based on the ideXlab platform.

  • contradiction separation based dynamic multi clause synergized automated deduction
    Information Sciences, 2018
    Co-Authors: Yang Xu, Xiaomei Zhong, Shuwei Chen, Xingxing He
    Abstract:

    Resolution, as a famous Rule of Inference, has played a key role in automated reasoning for over five decades. A number of variants and refinements of resolution have also been studied; essentially, they are all based on binary resolution, that is, the Rule that cuts a complementary pair of literals while every deduction involves only two clauses. In the present work, we consider an extension of the binary resolution Rule, proposed as a novel contradiction separation based Inference Rule for automated deduction, targeted at handling multiple (two or more) clauses dynamically and in a synergized way, with binary resolution as its special case. This contradiction separation based dynamic multi-clause synergized automated deduction theory is then proved to be sound and complete. The development of this new extension is motivated not only by our aim to show that such a new Rule of Inference can be generic, but also by our hope that this Inference Rule can provide a basis for more efficient automated deduction algorithms and systems.
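    The binary resolution Rule that the paper extends can be sketched in a few lines. This is a minimal propositional illustration only (clause names and literals are invented); the contradiction separation Rule of the abstract generalizes this two-clause cut to several clauses handled synergistically.

```python
# A minimal sketch of binary resolution on propositional clauses.
# Clauses are frozensets of literal strings; "~p" negates "p".

def negate(lit):
    """Return the complementary literal: p <-> ~p."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def binary_resolve(c1, c2):
    """Yield every resolvent of the two clauses: for each
    complementary pair, cut both literals and union the rest."""
    for lit in c1:
        if negate(lit) in c2:
            yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

# Example: {p, q} and {~p, r} resolve on the pair p / ~p to {q, r};
# resolving {p} against {~p} yields the empty clause (a refutation).
c1 = frozenset({"p", "q"})
c2 = frozenset({"~p", "r"})
print(sorted(next(binary_resolve(c1, c2))))  # ['q', 'r']
```

    Note how every step consumes exactly two clauses; the abstract's point is that this restriction is what the contradiction separation Rule lifts.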

  • ISKE - Some synergized clause selection strategies for contradiction separation based automated deduction
    2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE), 2017
    Co-Authors: Shuwei Chen, Yang Xu, Yan Jiang, Xingxing He
    Abstract:

    The synergized dynamic contradiction separation based automated deduction theory provides a novel logic-based automated deduction framework, which extends the static binary resolution Inference Rule to a dynamic, multiple-clause, contradiction separation based deduction mechanism. This mechanism is characterized as a dynamic, multi-clause, synergized, goal-oriented and robust automated reasoning framework. To further improve its efficiency and feasibility, this paper proposes some strategies for clause or literal selection during the deduction process, which mainly consider the synergized effect of multiple clauses. Some examples are given to illustrate the feasibility of the proposed strategies.

Dale Miller - One of the best experts on this subject based on the ideXlab platform.

  • Influences between logic programming and proof theory
    2018
    Co-Authors: Dale Miller
    Abstract:

    The earliest and most popular use of logic in computer science views computation as something that happens independent of logic: e.g., registers change, tokens move in a Petri net, messages are buffered and retrieved, and a tape head advances along a tape. Logics (often modal or temporal logics) are used to make statements about such computations. Model checkers and Hoare proof systems employ this computation-as-model approach. Early in the 20th century, some logicians invented various computational systems, such as Turing machines, Church's λ-calculus, and Post correspondence systems, which were shown to all compute the same set of recursive functions. With the introduction of high-level programming languages, such as LISP, Pascal, Ada, and C, it was clear that any number of ad hoc computation systems could be designed to compute these same functions. Eventually, the many different programming languages were classified via the four paradigms of imperative, object-oriented, functional, and logic programming. The latter two can be viewed as attempts to make less ad hoc computational systems by relying on aspects of symbolic logic. Unlike most programming languages, symbolic logic is a formal language that has well-defined semantics and that has been studied using model theory [17], category theory [8, 9], recursion theory [5, 6], and proof theory [3, 4]. The computation-as-deduction approach to programming languages takes as its computational elements objects from logic, namely, terms, formulas, and proofs. This approach has the potential to allow the direct application of logic's rich metatheory to proving properties of specific programs and entire programming languages. The influence of proof theory on logic programming: the first thing that proof theory has offered the logic programming paradigm is a clean and straightforward means of differentiating itself from functional programming.
    From the proof theory perspective, functional programs correspond to proofs (usually in natural deduction), and computation corresponds to proof normalization: that is, programs correspond to non-normal proofs and computation is seen as a series of normalization steps (using either β-conversion or cut-elimination). This programs-as-proofs correspondence is known as the Curry-Howard isomorphism [16]. In contrast, proof search is a good characterization of computation in logic programming. Here, quantificational formulas are used to encode both programs and goals (think of the Rules and queries in database theory). Sequents are used to encode the state of a computation and (cut-free) proofs are used to encode computation traces: changes in sequents model the dynamics of computation. Although cut-elimination is not part of computation, it can be used to reason about computation. Also, the proof-theoretic notions of Inference Rule, schematic variable, proof checking, and proof search are directly implementable. The proof-normalization and proof-search styles of computational specification remain distinct even in light of numerous recent developments in proof theory: for example, linear logic, game semantics, and higher-order quantification have all served to illustrate differences, not similarities, between these two styles.
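    The proof-search view of computation described above can be illustrated with a toy Prolog-style backward chainer over propositional Horn clauses. The program and atom names below are invented for illustration; a real logic programming language would of course use first-order terms and unification.

```python
# A toy illustration of computation as proof search:
# backward chaining over propositional Horn clauses.
# Each rule is (head, [body atoms]); facts have an empty body.
program = [
    ("ancestor_ab", ["parent_ab"]),
    ("ancestor_ac", ["parent_ab", "ancestor_bc"]),
    ("ancestor_bc", ["parent_bc"]),
    ("parent_ab", []),
    ("parent_bc", []),
]

def prove(goal, depth=10):
    """Search for a proof of `goal`: try each rule whose head
    matches and recursively prove every atom in its body."""
    if depth == 0:
        return False
    return any(
        head == goal and all(prove(g, depth - 1) for g in body)
        for head, body in program
    )

print(prove("ancestor_ac"))  # True: the search finds a cut-free proof
```

    The trace of goals explored by `prove` is exactly the "computation trace as proof" reading in the abstract: each recursive call corresponds to applying an Inference Rule backwards to a sequent.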

  • A Multi-Focused Proof System Isomorphic to Expansion Proofs
    Journal of Logic and Computation, 2014
    Co-Authors: Kaustuv Chaudhuri, Stefan Hetzl, Dale Miller
    Abstract:

    The sequent calculus is often criticized for requiring proofs to contain large amounts of low-level syntactic details that can obscure the essence of a given proof. Because each Inference Rule introduces only a single connective, sequent proofs can separate closely related steps---such as instantiating a block of quantifiers---by irrelevant noise. Moreover, the sequential nature of sequent proofs forces proof steps that are syntactically non-interfering and permutable to nevertheless be written in some arbitrary order. The sequent calculus thus lacks a notion of canonicity: proofs that should be considered essentially the same may not have a common syntactic form. To fix this problem, many researchers have proposed replacing the sequent calculus with proof structures that are more parallel or geometric. Proof-nets, matings, and atomic flows are examples of such revolutionary formalisms. We propose, instead, an evolutionary approach to recover canonicity within the sequent calculus, which we illustrate for classical first-order logic. The essential element of our approach is the use of a multi-focused sequent calculus as the means for abstracting away low-level details from classical cut-free sequent proofs. We show that, among the multi-focused proofs, the maximally multi-focused proofs that collect together all possible parallel foci are canonical. Moreover, if we start with a certain focused sequent proof system, such proofs are isomorphic to expansion proofs---a well known, minimalistic, and parallel generalization of Herbrand disjunctions---for classical first-order logic. This technique appears to be a systematic way to recover the "essence of proof" from within sequent calculus proofs.
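    The permutability problem mentioned above can be made concrete with a small sketch (standard one-sided classical sequent rules; the particular formulas are invented for illustration): two ∨-introductions acting on disjoint formulas produce the same top sequent in either order, yet the sequent calculus records the two orders as distinct proofs.

```latex
% Two non-interfering disjunction steps, applied in either order:
\[
\dfrac{\dfrac{\vdash \Gamma,\, A,\, B,\, C,\, D}
             {\vdash \Gamma,\, A \vee B,\, C,\, D}\ \vee}
      {\vdash \Gamma,\, A \vee B,\, C \vee D}\ \vee
\qquad
\dfrac{\dfrac{\vdash \Gamma,\, A,\, B,\, C,\, D}
             {\vdash \Gamma,\, A,\, B,\, C \vee D}\ \vee}
      {\vdash \Gamma,\, A \vee B,\, C \vee D}\ \vee
\]
```

    Maximal multi-focusing, as proposed in the paper, collapses such inessentially different orderings into one canonical proof.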

  • a formal framework for specifying sequent calculus proof systems
    Theoretical Computer Science, 2013
    Co-Authors: Dale Miller, Elaine Pimentel
    Abstract:

    Intuitionistic logic and intuitionistic type systems are commonly used as frameworks for the specification of natural deduction proof systems. In this paper we show how to use classical linear logic as a logical framework to specify sequent calculus proof systems and to establish some simple consequences of the specified sequent calculus proof systems. In particular, derivability of an Inference Rule from a set of Inference Rules can be decided by bounded (linear) logic programming search on the specified Rules. We also present two simple and decidable conditions that guarantee that the cut Rule and non-atomic initial Rules can be eliminated.

  • A Systematic Approach to Canonicity in the Classical Sequent Calculus
    2012
    Co-Authors: Kaustuv Chaudhuri, Stefan Hetzl, Dale Miller
    Abstract:

    The sequent calculus is often criticized for requiring proofs to contain large amounts of low-level syntactic details that can obscure the essence of a given proof. Because each Inference Rule introduces only a single connective, sequent proofs can separate closely related steps---such as instantiating a block of quantifiers---by irrelevant noise. Moreover, the sequential nature of sequent proofs forces proof steps that are syntactically non-interfering and permutable to nevertheless be written in some arbitrary order. The sequent calculus thus lacks a notion of canonicity: proofs that should be considered essentially the same may not have a common syntactic form. To fix this problem, many researchers have proposed replacing the sequent calculus with proof structures that are more parallel or geometric. Proof-nets, matings, and atomic flows are examples of such revolutionary formalisms. We propose, instead, an evolutionary approach to recover canonicity within the sequent calculus, which we illustrate for classical first-order logic. The essential element of our approach is the use of a multi-focused sequent calculus as the means of abstracting away the details from classical cut-free sequent proofs. We show that, among the multi-focused proofs, the maximally multi-focused proofs that make the foci as parallel as possible are canonical. Moreover, such proofs are isomorphic to expansion proofs---a well known, minimalistic, and parallel generalization of Herbrand disjunctions---for classical first-order logic. This technique is a systematic way to recover the desired essence of any sequent proof without abandoning the sequent calculus.

  • CSL - A Systematic Approach to Canonicity in the Classical Sequent Calculus
    2012
    Co-Authors: Kaustuv Chaudhuri, Stefan Hetzl, Dale Miller
    Abstract:

    The sequent calculus is often criticized for requiring proofs to contain large amounts of low-level syntactic details that can obscure the essence of a given proof. Because each Inference Rule introduces only a single connective, sequent proofs can separate closely related steps---such as instantiating a block of quantifiers---by irrelevant noise. Moreover, the sequential nature of sequent proofs forces proof steps that are syntactically non-interfering and permutable to nevertheless be written in some arbitrary order. The sequent calculus thus lacks a notion of canonicity: proofs that should be considered essentially the same may not have a common syntactic form. To fix this problem, many researchers have proposed replacing the sequent calculus with proof structures that are more parallel or geometric. Proof-nets, matings, and atomic flows are examples of such revolutionary formalisms. We propose, instead, an evolutionary approach to recover canonicity within the sequent calculus, which we illustrate for classical first-order logic. The essential element of our approach is the use of a multi-focused sequent calculus as the means of abstracting away the details from classical cut-free sequent proofs. We show that, among the multi-focused proofs, the maximally multi-focused proofs that make the foci as parallel as possible are canonical. Moreover, such proofs are isomorphic to expansion proofs---a well known, minimalistic, and parallel generalization of Herbrand disjunctions---for classical first-order logic. This technique is a systematic way to recover the desired essence of any sequent proof without abandoning the sequent calculus.

Le Sun - One of the best experts on this subject based on the ideXlab platform.

  • context sensitive Inference Rule discovery a graph based method
    International Conference on Computational Linguistics, 2016
    Co-Authors: Xianpei Han, Le Sun
    Abstract:

    Inference Rule discovery aims to identify entailment relations between predicates, e.g., ‘X acquire Y –> X purchase Y’ and ‘X is author of Y –> X write Y’. Traditional methods discover Inference Rules by computing distributional similarities between predicates, with each predicate represented as one or more feature vectors of its instantiations. These methods, however, have two main drawbacks. First, they are mostly context-insensitive and cannot accurately measure the similarity between two predicates in a specific context. Second, traditional methods usually model predicates independently, ignoring the rich inter-dependencies between them. To address these two issues, this paper proposes a graph-based method, which discovers Inference Rules by effectively modelling and exploiting both the context and the inter-dependencies between predicates. Specifically, we propose a graph-based representation, the Predicate Graph, which captures the semantic relevance between predicates using both predicate-feature co-occurrence statistics and the inter-dependencies between predicates. Based on the predicate graph, we propose a context-sensitive random walk algorithm, which learns context-specific predicate representations by distinguishing context-relevant information from context-irrelevant information. Experimental results show that our method significantly outperforms traditional Inference Rule discovery methods.
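    The predicate-graph idea can be sketched with a tiny random walk over a bipartite predicate-feature graph. The data below is an invented toy example (not the paper's dataset or algorithm): predicates that share argument features accumulate visit mass, which serves as a crude similarity proxy for candidate Inference Rules such as ‘X acquire Y –> X purchase Y’.

```python
# Toy sketch: a random walk over a predicate-feature graph as a
# similarity proxy for Inference Rule discovery (invented data).
import random

# Bipartite adjacency: predicate -> features of its instantiations.
pred_feats = {
    "acquire":  {"company:google", "company:youtube"},
    "purchase": {"company:google", "company:youtube"},
    "write":    {"person:austen", "book:emma"},
}
feat_preds = {}
for p, feats in pred_feats.items():
    for f in feats:
        feat_preds.setdefault(f, set()).add(p)

def walk_similarity(start, steps=2000, seed=0):
    """Estimate relatedness to `start`: repeatedly hop
    predicate -> shared feature -> predicate and count visits."""
    rng = random.Random(seed)
    visits = {}
    for _ in range(steps):
        f = rng.choice(sorted(pred_feats[start]))
        p = rng.choice(sorted(feat_preds[f]))
        visits[p] = visits.get(p, 0) + 1
    return visits

scores = walk_similarity("acquire")
# "purchase" shares every feature with "acquire", "write" shares none,
# so "purchase" is visited often and "write" never.
print(scores.get("purchase", 0) > scores.get("write", 0))  # True
```

    The paper's context-sensitive walk goes further by biasing the transition probabilities toward context-relevant features; this sketch only shows the uniform-walk baseline.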

  • COLING - Context-Sensitive Inference Rule Discovery: A Graph-Based Method.
    2016
    Co-Authors: Xianpei Han, Le Sun
    Abstract:

    Inference Rule discovery aims to identify entailment relations between predicates, e.g., ‘X acquire Y –> X purchase Y’ and ‘X is author of Y –> X write Y’. Traditional methods discover Inference Rules by computing distributional similarities between predicates, with each predicate represented as one or more feature vectors of its instantiations. These methods, however, have two main drawbacks. First, they are mostly context-insensitive and cannot accurately measure the similarity between two predicates in a specific context. Second, traditional methods usually model predicates independently, ignoring the rich inter-dependencies between them. To address these two issues, this paper proposes a graph-based method, which discovers Inference Rules by effectively modelling and exploiting both the context and the inter-dependencies between predicates. Specifically, we propose a graph-based representation, the Predicate Graph, which captures the semantic relevance between predicates using both predicate-feature co-occurrence statistics and the inter-dependencies between predicates. Based on the predicate graph, we propose a context-sensitive random walk algorithm, which learns context-specific predicate representations by distinguishing context-relevant information from context-irrelevant information. Experimental results show that our method significantly outperforms traditional Inference Rule discovery methods.

Shuwei Chen - One of the best experts on this subject based on the ideXlab platform.

  • contradiction separation based dynamic multi clause synergized automated deduction
    Information Sciences, 2018
    Co-Authors: Yang Xu, Xiaomei Zhong, Shuwei Chen, Xingxing He
    Abstract:

    Resolution, as a famous Rule of Inference, has played a key role in automated reasoning for over five decades. A number of variants and refinements of resolution have also been studied; essentially, they are all based on binary resolution, that is, the Rule that cuts a complementary pair of literals while every deduction involves only two clauses. In the present work, we consider an extension of the binary resolution Rule, proposed as a novel contradiction separation based Inference Rule for automated deduction, targeted at handling multiple (two or more) clauses dynamically and in a synergized way, with binary resolution as its special case. This contradiction separation based dynamic multi-clause synergized automated deduction theory is then proved to be sound and complete. The development of this new extension is motivated not only by our aim to show that such a new Rule of Inference can be generic, but also by our hope that this Inference Rule can provide a basis for more efficient automated deduction algorithms and systems.

  • ISKE - Some synergized clause selection strategies for contradiction separation based automated deduction
    2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE), 2017
    Co-Authors: Shuwei Chen, Yang Xu, Yan Jiang, Xingxing He
    Abstract:

    The synergized dynamic contradiction separation based automated deduction theory provides a novel logic-based automated deduction framework, which extends the static binary resolution Inference Rule to a dynamic, multiple-clause, contradiction separation based deduction mechanism. This mechanism is characterized as a dynamic, multi-clause, synergized, goal-oriented and robust automated reasoning framework. To further improve its efficiency and feasibility, this paper proposes some strategies for clause or literal selection during the deduction process, which mainly consider the synergized effect of multiple clauses. Some examples are given to illustrate the feasibility of the proposed strategies.