The experts below are selected from a list of 792 experts worldwide, ranked by the ideXlab platform.

Wojciech Tylman - One of the best experts on this subject based on the ideXlab platform.

  • Misuse-based intrusion detection using Bayesian networks
    International Journal of Critical Computer-based Systems, 2010
    Co-Authors: Wojciech Tylman
    Abstract:

    This paper presents an application of Bayesian networks to the process of intrusion detection in computer networks. The presented system, called Basset (Bayesian system for intrusion detection), extends the functionality of Snort, an open-source network intrusion detection system (NIDS), by incorporating Bayesian networks as additional processing stages. The flexible nature of this solution allows it to be used for both misuse-based and anomaly-based detection; this paper concentrates on misuse-based detection. The ultimate goal is to provide better detection capabilities and less chance of false alerts by creating a platform capable of evaluating Snort alerts in a broader context, that of other alerts and network traffic in general. The ability to include on-demand information from third-party programmes is also an important feature of the presented approach to intrusion detection.
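
    As a rough illustration of the idea of evaluating an alert in a broader context, the toy model below scores a Snort-style alert together with one piece of corroborating traffic evidence using a two-evidence Bayesian computation. All probabilities and variable names here are invented for illustration; they are not taken from the Basset system.

```python
# Toy Bayesian scoring of an IDS alert in context. The CPT values below are
# assumptions made up for this sketch, not Basset's actual model.

def posterior_attack(alert_fired: bool, suspicious_traffic: bool) -> float:
    """P(attack | alert, traffic), by enumeration over the hidden cause."""
    p_attack = 0.01                      # prior: an attack is underway
    # P(alert fires | attack) vs. P(alert fires | benign): hit/false-alarm rates
    p_alert = {True: 0.90, False: 0.05}
    # P(suspicious traffic | attack) vs. benign: corroborating context
    p_traffic = {True: 0.70, False: 0.10}

    def likelihood(attack: bool) -> float:
        pa = p_alert[attack] if alert_fired else 1 - p_alert[attack]
        pt = p_traffic[attack] if suspicious_traffic else 1 - p_traffic[attack]
        return pa * pt

    joint_attack = p_attack * likelihood(True)
    joint_benign = (1 - p_attack) * likelihood(False)
    return joint_attack / (joint_attack + joint_benign)

print(round(posterior_attack(True, False), 3))   # alert alone: ≈ 0.057
print(round(posterior_attack(True, True), 3))    # alert + anomalous traffic: ≈ 0.56
```

    The point of the sketch is the contrast between the two calls: the same Snort alert is weak evidence on its own but strong evidence once the surrounding traffic corroborates it, which is the "broader context" the abstract describes.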

  • DepCoS-RELCOMEX - Anomaly-Based Intrusion Detection Using Bayesian Networks
    2008 Third International Conference on Dependability of Computer Systems DepCoS-RELCOMEX, 2008
    Co-Authors: Wojciech Tylman
    Abstract:

    This paper presents an application of Bayesian networks to the process of intrusion detection in computer networks. The presented system, called Basset (Bayesian system for intrusion detection), extends the functionality of Snort, an open-source NIDS, by incorporating Bayesian networks as additional processing stages. The flexible nature of this solution allows it to be used for both misuse-based and anomaly-based detection; this paper concentrates on anomaly-based detection. The ultimate goal is to create a hybrid, misuse-anomaly-based solution that will allow interaction between these two techniques of intrusion detection. The ability to alter its behaviour based on historical data is also an important feature of the described system.

  • DepCoS-RELCOMEX - Misuse-Based Intrusion Detection Using Bayesian Networks
    2008 Third International Conference on Dependability of Computer Systems DepCoS-RELCOMEX, 2008
    Co-Authors: Wojciech Tylman
    Abstract:

    This paper presents an application of Bayesian networks to the process of intrusion detection in computer networks. The presented system, called Basset (Bayesian system for intrusion detection), extends the functionality of Snort, an open-source NIDS, by incorporating Bayesian networks as additional processing stages. The flexible nature of this solution allows it to be used for both misuse-based and anomaly-based detection; this paper concentrates on misuse-based detection. The ultimate goal is to provide better detection capabilities and less chance of false alarms by creating a platform capable of evaluating Snort alerts in a broader context, that of other alerts and network traffic in general. The ability to include on-demand information from third-party programs is also an important feature of the presented approach to intrusion detection.

Giovanni Vigna - One of the best experts on this subject based on the ideXlab platform.

  • NDSS - Using Generalization and Characterization Techniques in the Anomaly-Based Detection of Web Attacks.
    2020
    Co-Authors: William Robertson, Giovanni Vigna, Christopher Krügel, Richard A. Kemmerer
    Abstract:

    The custom, ad hoc nature of web applications makes learning-based anomaly detection systems a suitable approach to provide early warning about the exploitation of novel vulnerabilities. However, anomaly-based systems are known for producing a large number of false positives and for providing poor or non-existent information about the type of attack that is associated with …

  • RAID - Swaddler: an approach for the anomaly-based detection of state violations in web applications
    Lecture Notes in Computer Science, 2007
    Co-Authors: Marco Cova, Viktoria Felmetsger, Davide Balzarotti, Giovanni Vigna
    Abstract:

    In recent years, web applications have become tremendously popular, and nowadays they are routinely used in security-critical environments, such as medical, financial, and military systems. As the use of web applications for critical services has increased, the number and sophistication of attacks against these applications have grown as well. Most approaches to the detection of web-based attacks analyze the interaction of a web application with its clients and back-end servers. Even though these approaches can effectively detect and block a number of attacks, there are attacks that cannot be detected only by looking at the external behavior of a web application. In this paper, we present Swaddler, a novel approach to the anomaly-based detection of attacks against web applications. Swaddler analyzes the internal state of a web application and learns the relationships between the application's critical execution points and the application's internal state. By doing this, Swaddler is able to identify attacks that attempt to bring an application into an inconsistent, anomalous state, such as violations of the intended workflow of a web application. We developed a prototype of our approach for the PHP language and evaluated it with respect to several real-world applications.
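
    A minimal sketch of the kind of invariant learning described above, assuming a simple per-checkpoint numeric-range model. The class, the checkpoint names, and the example values are invented for illustration; Swaddler's actual state models are richer than a min/max range.

```python
# Hypothetical sketch of the Swaddler idea: during training, record the
# values of application variables observed at each instrumented execution
# point; at detection time, flag states that violate the learned
# per-checkpoint invariants (here, a simple numeric-range invariant).

from collections import defaultdict

class StateAnomalyDetector:
    def __init__(self):
        # checkpoint -> variable -> (min, max) observed during training
        self.ranges = defaultdict(dict)

    def train(self, checkpoint, state):
        for var, value in state.items():
            lo, hi = self.ranges[checkpoint].get(var, (value, value))
            self.ranges[checkpoint][var] = (min(lo, value), max(hi, value))

    def is_anomalous(self, checkpoint, state):
        for var, value in state.items():
            bounds = self.ranges[checkpoint].get(var)
            if bounds is None or not (bounds[0] <= value <= bounds[1]):
                return True   # unseen variable or out-of-range value
        return False

det = StateAnomalyDetector()
# Training: at the (hypothetical) "checkout" point the cart total is positive.
for total in (10, 25, 300):
    det.train("checkout", {"cart_total": total})

print(det.is_anomalous("checkout", {"cart_total": 50}))   # False: within range
print(det.is_anomalous("checkout", {"cart_total": -5}))   # True: workflow violation
```

    The negative total models the paper's notion of an inconsistent internal state, for example one reached by skipping an intended step of the application's workflow.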

R. A. Maxion - One of the best experts on this subject based on the ideXlab platform.

  • Performance Evaluation of Anomaly-Based Detection Mechanisms
    2020
    Co-Authors: R. A. Maxion
    Abstract:

    Common practice in anomaly-based intrusion detection is that one size fits all: a single anomaly detector should detect all anomalies. Compensation for any performance shortcomings is sometimes effected by resorting to correlation techniques, which could be seen as making use of detector diversity. Such diversity is intuitively based on the assumption that detector coverage differs, perhaps widely, from detector to detector, each covering some disparate portion of the anomaly space. Diversity, then, enhances detection coverage by combining the coverages of individual detectors across multiple sub-regions of the anomaly space, resulting in an overall detection coverage that is superior to the coverage of any one detector. No studies have been done, however, in which measured effects of diversity in anomaly detectors have been obtained. This paper explores the effects of using diverse anomaly-detection algorithms (algorithmic diversity) in intrusion detection. Experimental results indicate that while performance/coverage improvements can in fact be effected by combining diverse detection algorithms, the gains are surprisingly not the result of combining large, non-overlapping regions of the anomaly space. Rather, the gains are seen at the edges of the space, and are heavily dependent on the parameter values of the detectors, as well as on the characteristics of the anomalies. As a consequence of this study, defenders can be provided with detailed knowledge of diverse detectors, how to combine and parameterize them, and under what conditions, to effect diverse detection performance that is superior to the performance of a single detector.
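
    The coverage argument can be made concrete with a toy example. The two detectors, their coverage intervals, and the one-dimensional "anomaly space" below are all invented for illustration; real detector coverage regions are neither known in advance nor this clean.

```python
# Toy illustration of algorithmic diversity: two detectors covering
# different (invented) regions of a 1-D anomaly space, combined by alarm-OR.

def detector_a(x):          # covers low-magnitude anomalies: x in [2, 5)
    return 2 <= x < 5

def detector_b(x):          # covers high-magnitude anomalies: x in [4, 9)
    return 4 <= x < 9

anomalies = list(range(10))
cov_a = sum(detector_a(x) for x in anomalies)
cov_b = sum(detector_b(x) for x in anomalies)
cov_union = sum(detector_a(x) or detector_b(x) for x in anomalies)
print(cov_a, cov_b, cov_union)   # 3 5 7: the union beats either detector alone
```

    Note that the two intervals overlap only at their edges; the paper's finding is that in practice the measured gains come from exactly such edge regions and depend heavily on detector parameterization.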

  • Benchmarking anomaly-based detection systems
    Proceedings of the International Conference on Dependable Systems and Networks (DSN 2000), 2000
    Co-Authors: R. A. Maxion
    Abstract:

    Anomaly detection is a key element of intrusion detection and other detection systems in which perturbations of normal behavior suggest the presence of intentionally or unintentionally induced attacks, faults, defects, etc. Because most anomaly detectors are based on probabilistic algorithms that exploit the intrinsic structure (or regularity) embedded in data logs, a fundamental question is whether or not such structure influences detection performance. If detector performance is indeed a function of environmental regularity, it would be critical to match detectors to environmental characteristics. In intrusion-detection settings, however, this is not done, possibly because such characteristics are not easily ascertained. This paper introduces a metric for characterizing structure in data environments, and tests the hypothesis that intrinsic structure influences probabilistic detection. In a series of experiments, an anomaly detection algorithm was applied to a benchmark suite of 165 carefully calibrated, anomaly-injected data sets of varying structure. The results showed performance differences of as much as an order of magnitude, indicating that current approaches to anomaly detection may not be universally dependable.
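
    One plausible regularity metric in the spirit of the abstract is the normalized conditional entropy of next-symbol-given-current-symbol in an event log, sketched below. The exact metric used in the paper may differ; this formulation is an assumption for illustration, with 0 meaning perfectly regular and values near 1 meaning near-random.

```python
# Sequence-regularity index: H(next | current) / log2(alphabet size).
# 0.0 = perfectly regular (deterministic transitions); near 1.0 = random.
# This is an assumed formulation, not necessarily the paper's exact metric.

import math
from collections import Counter

def regularity_index(seq):
    pairs = Counter(zip(seq, seq[1:]))     # bigram counts
    firsts = Counter(seq[:-1])             # counts of each "current" symbol
    n = len(seq) - 1
    h_cond = 0.0                           # conditional entropy H(next | current)
    for (a, b), c in pairs.items():
        p_pair = c / n                     # P(current=a, next=b)
        p_cond = c / firsts[a]             # P(next=b | current=a)
        h_cond -= p_pair * math.log2(p_cond)
    h_max = math.log2(len(set(seq)))       # entropy of a uniform alphabet
    return h_cond / h_max if h_max else 0.0

print(regularity_index("ababababababab"))          # 0.0: deterministic alternation
print(regularity_index("abbabaabbbaababababbab"))  # higher: irregular mixture
```

    On data like this, a detector tuned for highly regular logs can be expected to behave very differently from one facing near-random logs, which is the matching problem the paper raises.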

Chimin Zhou - One of the best experts on this subject based on the ideXlab platform.

  • HTTP-sCAN: Detecting HTTP-flooding attack by modeling multi-features of web browsing behavior from noisy dataset
    2020
    Co-Authors: Jin Wang, Min Zhang, Keping Long, Xiaolong Yang, Chimin Zhou
    Abstract:

    HTTP-flooding attack disables the victimized web server by sending a large number of HTTP GET requests. Recent research tends to detect the attacks with anomaly-based approaches, which detect HTTP-flooding by modeling the behavior of normal web users. However, most of the existing anomaly-based detection approaches usually cannot filter the web-crawling traces of unknown search bots mixed in the normal web browsing logs. These web-crawling traces can bias the detection model in the training phase, thus further influencing the performance of the anomaly-based detection schemes. This paper proposes a novel anomaly-based HTTP-flooding detection scheme (HTTP-sCAN), which can eliminate the influence of the web-crawling traces with a cluster algorithm. The simulation results show that HTTP-sCAN is immune to the interferences of unknown search sessions, and can detect all HTTP-flooding attacks.
    Keywords: IP network, DDoS, relative entropy, cluster algorithm

  • APCC - HTTP-sCAN: Detecting HTTP-flooding attack by modeling multi-features of web browsing behavior from noisy dataset
    2013 19th Asia-Pacific Conference on Communications (APCC), 2013
    Co-Authors: Jin Wang, Min Zhang, Xiaolong Yang, Keping Long, Chimin Zhou
    Abstract:

    HTTP-flooding attack disables the victimized web server by sending a large number of HTTP GET requests. Recent research tends to detect the attacks with anomaly-based approaches, which detect HTTP-flooding by modeling the behavior of normal web users. However, most of the existing anomaly-based detection approaches usually cannot filter the web-crawling traces of unknown search bots mixed in the normal web browsing logs. These web-crawling traces can bias the detection model in the training phase, thus further influencing the performance of the anomaly-based detection schemes. This paper proposes a novel anomaly-based HTTP-flooding detection scheme (HTTP-sCAN), which can eliminate the influence of the web-crawling traces with a cluster algorithm. The simulation results show that HTTP-sCAN is immune to the interferences of unknown search sessions, and can detect all HTTP-flooding attacks.
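
    A minimal sketch of the idea of clustering training sessions so that stray crawler traces do not bias the baseline profile. The session features, distance threshold, and greedy single-link clustering below are invented stand-ins for the paper's actual features and algorithm.

```python
# Illustrative sketch only: cluster training sessions by simple browsing
# features and keep the dominant cluster as the "normal" baseline, so that
# crawler sessions hidden in a noisy training log are excluded from it.

def cluster(sessions, radius=2.0):
    """Greedy single-link clustering: join a cluster if any member is close."""
    clusters = []
    for s in sessions:
        for c in clusters:
            if any(sum((a - b) ** 2 for a, b in zip(s, m)) ** 0.5 <= radius
                   for m in c):
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters

# Invented (requests/min, revisit-ratio*10) features per training session;
# the last two sessions resemble high-rate, no-revisit crawler traces.
train = [(3, 1), (4, 2), (2, 1), (3, 2), (4, 1), (30, 0), (28, 0)]
clusters = cluster(train)
baseline = max(clusters, key=len)   # dominant cluster = human browsing profile
print(len(clusters), len(baseline))  # 2 5: crawler traces fall outside baseline
```

    Training the anomaly model only on the dominant cluster is what makes the scheme tolerant of unknown search-bot sessions mixed into the logs.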

Jin Wang - One of the best experts on this subject based on the ideXlab platform.

  • HTTP-sCAN: Detecting HTTP-flooding attack by modeling multi-features of web browsing behavior from noisy web-logs
    China Communications, 2015
    Co-Authors: Jin Wang, Min Zhang, Xiaolong Yang, Keping Long, Jie Xu
    Abstract:

    HTTP-flooding attack disables the victimized web server by sending a large number of HTTP GET requests. Recent research tends to detect HTTP-flooding with anomaly-based approaches, which detect HTTP-flooding by modeling the behavior of normal web surfers. However, most of the existing anomaly-based detection approaches usually cannot filter the web-crawling traces from unknown search bots mixed in normal web browsing logs. These web-crawling traces can bias the baseline profile of anomaly-based schemes in their training phase, and further degrade their detection performance. This paper proposes a novel web-crawling-trace-tolerant method to build the baseline profile, and designs a new anomaly-based HTTP-flooding detection scheme (abbr. HTTP-sCAN). The simulation results show that HTTP-sCAN is immune to the interferences of unknown web-crawling traces, and can detect all HTTP-flooding attacks.

  • APCC - HTTP-sCAN: Detecting HTTP-flooding attack by modeling multi-features of web browsing behavior from noisy dataset
    2013 19th Asia-Pacific Conference on Communications (APCC), 2013
    Co-Authors: Jin Wang, Min Zhang, Xiaolong Yang, Keping Long, Chimin Zhou
    Abstract:

    HTTP-flooding attack disables the victimized web server by sending a large number of HTTP GET requests. Recent research tends to detect the attacks with anomaly-based approaches, which detect HTTP-flooding by modeling the behavior of normal web users. However, most of the existing anomaly-based detection approaches usually cannot filter the web-crawling traces of unknown search bots mixed in the normal web browsing logs. These web-crawling traces can bias the detection model in the training phase, thus further influencing the performance of the anomaly-based detection schemes. This paper proposes a novel anomaly-based HTTP-flooding detection scheme (HTTP-sCAN), which can eliminate the influence of the web-crawling traces with a cluster algorithm. The simulation results show that HTTP-sCAN is immune to the interferences of unknown search sessions, and can detect all HTTP-flooding attacks.