Deductive Logic

The experts below are selected from a list of 243 experts worldwide, ranked by the ideXlab platform.

Tiffany Barnes - One of the best experts on this subject based on the ideXlab platform.

  • Evolution of an Intelligent Deductive Logic Tutor Using Data-Driven Elements
    International Journal of Artificial Intelligence in Education, 2017
    Co-Authors: Behrooz Mostafavi, Tiffany Barnes
    Abstract:

    Deductive logic is essential to a complete understanding of computer science concepts, and is thus fundamental to computer science education. Intelligent tutoring systems with individualized instruction have been shown to increase learning gains. We seek to improve the way deductive logic is taught in computer science by developing an intelligent, data-driven logic tutor. We have augmented Deep Thought, an existing computer-based logic tutor, with data-driven methods: intelligent problem selection based on the student’s current proficiency, automatically generated on-demand hints, and determination of student problem-solving strategies based on clustering previous students. As a result, student tutor completion (the amount of the tutor that students completed) steadily improved as data-driven methods were added to Deep Thought, allowing students to be exposed to more logic concepts. We also gained additional insights into the effects of different coursework and teaching methods on tutor effectiveness.
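
    A minimal sketch (in Python) of the kind of proficiency-based problem selection this abstract describes. The names (Problem, select_next), the difficulty scale, and the scoring heuristic are illustrative assumptions, not Deep Thought's actual implementation.

      from dataclasses import dataclass

      @dataclass
      class Problem:
          pid: str
          rules: set            # inference rules the problem exercises, e.g. {"MP", "DS"}
          difficulty: float     # 0.0 (easy) .. 1.0 (hard)

      def select_next(problems, proficiency, stretch=0.15):
          """Pick the problem whose difficulty sits just above the student's
          average estimated proficiency on the rules it exercises."""
          def score(p):
              avg = sum(proficiency.get(r, 0.0) for r in p.rules) / len(p.rules)
              return abs(p.difficulty - (avg + stretch))   # smaller is better
          return min(problems, key=score)

      # Usage: proficiency estimates would come from logged student attempts.
      pool = [Problem("p1", {"MP"}, 0.2), Problem("p2", {"MP", "DS"}, 0.5)]
      print(select_next(pool, {"MP": 0.4, "DS": 0.1}).pid)   # -> p2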

  • Towards Data-Driven Mastery Learning
    Learning Analytics and Knowledge, 2015
    Co-Authors: Behrooz Mostafavi, Michael Eagle, Tiffany Barnes
    Abstract:

    We have developed a novel data-driven mastery learning system to improve learning in complex procedural problem-solving domains. This new system was integrated into an existing logic proof tool and assigned as homework in a deductive logic course. Student performance and dropout were compared across three systems: the Deep Thought logic tutor, Deep Thought with integrated hints, and Deep Thought with our data-driven mastery learning system. Results show that the data-driven mastery learning system increases mastery of target tutor actions, improves tutor scores, and lowers the rate of tutor dropout relative to Deep Thought, with or without provided hints.
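
    A hedged sketch of the mastery-gating idea described above: the tutor advances a student only once an estimated mastery of each target tutor action crosses a threshold. The update rule, threshold, and names are illustrative assumptions rather than the system's actual logic.

      MASTERY_THRESHOLD = 0.85   # assumed cutoff for "mastered"
      LEARN_RATE = 0.3           # weight given to the most recent attempt

      def update_mastery(mastery, action, correct):
          """Exponentially weighted estimate of success on one tutor action."""
          prev = mastery.get(action, 0.0)
          mastery[action] = prev + LEARN_RATE * ((1.0 if correct else 0.0) - prev)

      def may_advance(mastery, target_actions):
          """Gate: only advance when every target action is above the threshold."""
          return all(mastery.get(a, 0.0) >= MASTERY_THRESHOLD for a in target_actions)

      mastery = {}
      for action, correct in [("MP", True), ("MP", True), ("DS", False), ("DS", True)]:
          update_mastery(mastery, action, correct)
      print(may_advance(mastery, {"MP", "DS"}))   # False: neither estimate has reached 0.85 yet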

  • Automatic Generation of Proof Problems in Deductive Logic
    Educational Data Mining, 2011
    Co-Authors: Behrooz Mostafavi, Tiffany Barnes, Marvin J Croy
    Abstract:

    Automatic problem generation for learning tools can provide the quantity and variation of problems necessary for an intelligent tutoring system. However, this requires an understanding of problem difficulty and corresponding features of student performance. Our goal is to automatically generate new proof problems in Deep Thought – an online propositional logic learning tool – for individual students based on their performance and a set of instructor parameters. In an initial exploratory study, we evaluate the generated problems against the original set of stored problems, using collected student data and instructor feedback.
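
    An illustrative sketch of one generic way proof problems can be generated automatically: chain conditional premises forward so that repeated modus ponens derives a target conclusion, with the chain length standing in for an instructor-set difficulty parameter. This is a sketch under assumptions, not the generator used in Deep Thought.

      import random

      def generate_problem(atoms=("p", "q", "r", "s"), steps=3, seed=None):
          """Build premises whose repeated modus ponens derives the conclusion."""
          rng = random.Random(seed)
          derived = [rng.choice(atoms)]              # one atomic premise to start
          premises = list(derived)
          for _ in range(steps):                     # each step adds one conditional premise
              nxt = rng.choice([a for a in atoms if a not in derived])
              premises.append(f"({derived[-1]} -> {nxt})")
              derived.append(nxt)                    # modus ponens yields the consequent
          return {"premises": premises, "conclusion": derived[-1], "difficulty": steps}

      print(generate_problem(seed=0))
      # e.g. premises of the form ['q', '(q -> s)', '(s -> p)', '(p -> r)'] with conclusion 'r'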

  • Towards an Intelligent Tutoring System for Propositional Proof Construction
    Proceedings of the 2008 conference on Current Issues in Computing and Philosophy, 2008
    Co-Authors: Marvin J Croy, Tiffany Barnes, John C Stamper
    Abstract:

    This article reports on recent efforts to develop an intelligent tutoring system for proof construction in propositional logic. The report centers on data derived from an undergraduate, general education course in deductive logic taught at the University of North Carolina at Charlotte. Within this curriculum, students use instructional Java applets to practice state-transition problem solving, truth-functional analysis, proof construction, and other aspects of propositional logic. Two project goals are addressed here: 1) identifying at-risk students at an early stage in the semester, and 2) generating a visual representation of student proof efforts as a step toward understanding those efforts. Also discussed is the prospect of developing a Markov Decision Process approach to providing students with individualized help.
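
    A hedged sketch of the Markov Decision Process idea mentioned above: prior student proof attempts are treated as a graph of proof states, value iteration scores each state by how cheaply it reaches a completed proof, and a hint points to the highest-valued successor. The rewards, discounting, and toy graph are illustrative assumptions.

      GOAL_REWARD = 100.0   # value of a completed proof
      STEP_COST = -1.0      # small penalty per proof step
      DISCOUNT = 0.9

      def value_iterate(successors, goals, sweeps=50):
          """successors: proof state -> list of states reached in logged attempts."""
          value = {s: (GOAL_REWARD if s in goals else 0.0) for s in successors}
          for _ in range(sweeps):
              for state, nexts in successors.items():
                  if state in goals or not nexts:
                      continue
                  value[state] = max(STEP_COST + DISCOUNT * value[n] for n in nexts)
          return value

      def hint_for(state, successors, value):
          """Suggest moving toward the most valuable successor state."""
          return max(successors[state], key=lambda n: value[n])

      # Toy graph: from "start" a student can wander to a dead end or head for the goal.
      graph = {"start": ["dead_end", "mid"], "mid": ["goal"], "dead_end": [], "goal": []}
      v = value_iterate(graph, goals={"goal"})
      print(hint_for("start", graph, v))   # -> mid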

Behrooz Mostafavi - One of the best experts on this subject based on the ideXlab platform.

  • Evolution of an Intelligent Deductive Logic Tutor Using Data-Driven Elements
    International Journal of Artificial Intelligence in Education, 2017
    Co-Authors: Behrooz Mostafavi, Tiffany Barnes
    Abstract:

    Deductive logic is essential to a complete understanding of computer science concepts, and is thus fundamental to computer science education. Intelligent tutoring systems with individualized instruction have been shown to increase learning gains. We seek to improve the way deductive logic is taught in computer science by developing an intelligent, data-driven logic tutor. We have augmented Deep Thought, an existing computer-based logic tutor, with data-driven methods: intelligent problem selection based on the student’s current proficiency, automatically generated on-demand hints, and determination of student problem-solving strategies based on clustering previous students. As a result, student tutor completion (the amount of the tutor that students completed) steadily improved as data-driven methods were added to Deep Thought, allowing students to be exposed to more logic concepts. We also gained additional insights into the effects of different coursework and teaching methods on tutor effectiveness.

  • Towards Data-Driven Mastery Learning
    Learning Analytics and Knowledge, 2015
    Co-Authors: Behrooz Mostafavi, Michael Eagle, Tiffany Barnes
    Abstract:

    We have developed a novel data-driven mastery learning system to improve learning in complex procedural problem-solving domains. This new system was integrated into an existing logic proof tool and assigned as homework in a deductive logic course. Student performance and dropout were compared across three systems: the Deep Thought logic tutor, Deep Thought with integrated hints, and Deep Thought with our data-driven mastery learning system. Results show that the data-driven mastery learning system increases mastery of target tutor actions, improves tutor scores, and lowers the rate of tutor dropout relative to Deep Thought, with or without provided hints.

  • Automatic Generation of Proof Problems in Deductive Logic
    Educational Data Mining, 2011
    Co-Authors: Behrooz Mostafavi, Tiffany Barnes, Marvin J Croy
    Abstract:

    Automatic problem generation for learning tools can provide the quantity and variation of problems necessary for an intelligent tutoring system. However, this requires an understanding of problem difficulty and corresponding features of student performance. Our goal is to automatically generate new proof problems in Deep Thought – an online propositional logic learning tool – for individual students based on their performance and a set of instructor parameters. In an initial exploratory study, we evaluate the generated problems against the original set of stored problems, using collected student data and instructor feedback.

Zoubida Lounis - One of the best experts on this subject based on the ideXlab platform.

  • Availability Analysis of Safety Critical Systems Using Advanced Fault Tree and Stochastic Petri Net Formalisms
    Journal of Loss Prevention in The Process Industries, 2016
    Co-Authors: Mohammed Talebberrouane, Faisal Khan, Zoubida Lounis
    Abstract:

    Failure scenario analysis constitutes one of the cornerstones of risk assessment and availability analysis. After a detailed review of available methods, this paper identifies two distinct formalisms for analyzing failure scenarios and system availability: generalized stochastic Petri nets (GSPN) and fault tree driven Markov processes (FTDMP). The FTDMP formalism is a combination of the Markov process and the fault tree, and aims to overcome fault tree limitations while maintaining the use of deductive logic. The GSPN is a Petri net with probabilistic analysis using Monte Carlo simulation. The effectiveness of both methods is studied through an emergency flare system including a knockout drum. It is observed that GSPN provides a robust and reliable mechanism for accident scenario analysis. It provides additional information, such as event frequencies in operating and failing modes and expected occurrence timings and durations resulting from different complex sequences, even for multi-state variables, which could be used to design a safety management system. Although FTDMP is a powerful formalism, it provides limited information.
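
    A minimal sketch of Monte Carlo availability estimation in the spirit of the stochastic analysis described above: alternate exponentially distributed up and down times for a single component and compare the simulated availability with the analytical steady-state value MTTF / (MTTF + MTTR). The rates and horizon are illustrative assumptions, not figures from the flare system study.

      import random

      def simulate_availability(failure_rate, repair_rate, horizon=1e6, seed=1):
          """Alternate exponential up/down times and return the fraction of time up."""
          rng = random.Random(seed)
          t, uptime, up = 0.0, 0.0, True
          while t < horizon:
              rate = failure_rate if up else repair_rate
              dwell = min(rng.expovariate(rate), horizon - t)   # time spent in current state
              if up:
                  uptime += dwell
              t += dwell
              up = not up
          return uptime / horizon

      failure_rate, repair_rate = 1e-3, 1e-1                    # per hour (assumed)
      simulated = simulate_availability(failure_rate, repair_rate)
      analytical = repair_rate / (failure_rate + repair_rate)   # = MTTF / (MTTF + MTTR)
      print(f"simulated {simulated:.4f} vs analytical {analytical:.4f}")   # both close to 0.9901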

Mohammed Talebberrouane - One of the best experts on this subject based on the ideXlab platform.

  • Availability Analysis of Safety Critical Systems Using Advanced Fault Tree and Stochastic Petri Net Formalisms
    Journal of Loss Prevention in The Process Industries, 2016
    Co-Authors: Mohammed Talebberrouane, Faisal Khan, Zoubida Lounis
    Abstract:

    Failure scenario analysis constitutes one of the cornerstones of risk assessment and availability analysis. After a detailed review of available methods, this paper identifies two distinct formalisms for analyzing failure scenarios and system availability: generalized stochastic Petri nets (GSPN) and fault tree driven Markov processes (FTDMP). The FTDMP formalism is a combination of the Markov process and the fault tree, and aims to overcome fault tree limitations while maintaining the use of deductive logic. The GSPN is a Petri net with probabilistic analysis using Monte Carlo simulation. The effectiveness of both methods is studied through an emergency flare system including a knockout drum. It is observed that GSPN provides a robust and reliable mechanism for accident scenario analysis. It provides additional information, such as event frequencies in operating and failing modes and expected occurrence timings and durations resulting from different complex sequences, even for multi-state variables, which could be used to design a safety management system. Although FTDMP is a powerful formalism, it provides limited information.

Peter Halfpenny - One of the best experts on this subject based on the ideXlab platform.

  • Rationality and the Sociology of Scientific Knowledge
    Sociological Theory, 1991
    Co-Authors: Peter Halfpenny
    Abstract:

    Within the positivist tradition, natural scientific knowledge is the epitome of rationality. In philosophy, this view was fostered by the logical positivist Reichenbach's (1938) distinction between the context of discovery and the context of justification. To the former belong all the external determinants of the generation of scientific theories: psychological, social, political, and historical. To the latter belongs only rational calculation on the basis of disinterested observations. The distinction between the two contexts promotes the view that however scientific theories are discovered, they are justified or rejected solely by precise reasoning about the available evidence, ideally by following the canons of deductive logic. In sociology, this positivist conception of scientific knowledge made it immune to sociological analysis. Although the sociologist might seek to explain the origin of theories or to examine the social consequences of science, scientific knowledge itself was self-explanatory. At the most, its internal logic could be scrutinized by analytic philosophers, but otherwise its probity could be entrusted to the democratic community of scientists, among whom truth and reason would prevail. In the positivist tradition, Reichenbach's analytic distinction is converted into a division of labor between disciplines. Natural scientists take care of scientific knowledge; philosophers act as no more than their handmaidens, helping to sort out logical muddles; while sociologists concern themselves only with deviations from the ideal: scientific errors, false beliefs, and the irrational resistance to theories. Sociological explanations of these deviations are to be found in the improper social locations of the offending scientists or in the inappropriate structures of the organizations within which they work, either of which allow personal, social, or other factors to distort knowledge. Sociologists focus on scientists and scientific institutions, not on scientific knowledge. In sum, in the positivist tradition, the paradigm of rationality is deductive logic, the model of science is rational calculation on the basis of facts, and the sociology of scientific knowledge is an oxymoron. The positivist hegemony, in particular the belief that the positivist reconstruction is the correct description of natural scientific knowledge, came under increasing challenge in the 1960s. The notion that scientific knowledge is underdetermined by evidence and that it has an ineradicable conventional character gained widespread acceptance. This view made scientific knowledge amenable to sociological investigation: the epistemic barrier that positivists had erected around it was breached. No longer was scientific knowledge different from any other corpus of knowledge, such