Log Analysis



The Experts below are selected from a list of 360 Experts worldwide ranked by ideXlab platform

Latifur Khan - One of the best experts on this subject based on the ideXlab platform.

  • LogLens: A Real-Time Log Analysis System
    International Conference on Distributed Computing Systems, 2018
    Co-Authors: Biplob Debnath, Mohiuddin Solaimani, Muhammad Ali Gulzar, Nipun Arora, Cristian Lumezanu, Bo Zong, Hui Zhang, Guofei Jiang, Latifur Khan
    Abstract:

    Administrators of most user-facing systems depend on periodic log data to get an idea of the health and status of production applications. Logs report information which is crucial to diagnosing the root cause of complex problems. In this paper, we present a real-time log analysis system called LogLens that automates the process of anomaly detection from logs with no (or minimal) target system knowledge and user specification. In LogLens, we employ unsupervised machine learning techniques to discover patterns in application logs, and then leverage these patterns along with real-time log parsing to design advanced log analytics applications. Compared to existing systems, which are primarily limited to log indexing and search capabilities, LogLens presents an extensible system for supporting both stateless and stateful log analysis applications. Currently, LogLens runs at the core of a commercial log analysis solution handling millions of logs generated from large-scale industrial environments, and has reported up to a 12096x reduction in man-hours spent troubleshooting operational problems compared to the manual approach.
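The abstract describes discovering patterns (templates) in application logs without prior system knowledge. As a rough illustration of the idea, not LogLens itself, the following hypothetical Python sketch masks variable fields (here, only numeric tokens) and groups log lines that share the same masked token sequence:

```python
import re
from collections import defaultdict

def tokenize(line):
    """Split a log line into tokens, masking purely numeric tokens as wildcards."""
    return tuple(
        "<*>" if re.fullmatch(r"\d+", tok) else tok
        for tok in line.split()
    )

def discover_patterns(lines):
    """Group log lines by their masked token sequence (a crude template)."""
    patterns = defaultdict(list)
    for line in lines:
        patterns[tokenize(line)].append(line)
    return patterns

logs = [
    "connection from 10 closed",
    "connection from 42 closed",
    "disk usage at 91 percent",
]
templates = discover_patterns(logs)
# The two "connection" lines collapse into one template; "disk usage" forms another.
```

Real systems use far more robust template mining (clustering, edit distance, frequent-token heuristics); this only shows the masking-and-grouping core of the approach.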

Stefan J Boddie - One of the best experts on this subject based on the ideXlab platform.

  • A Transaction Log Analysis of a Digital Library
    International Journal on Digital Libraries, 2000
    Co-Authors: Steve Jones, Sally Jo Cunningham, Rodger J Mcnab, Stefan J Boddie
    Abstract:

    As experimental digital library testbeds gain wider acceptance and develop significant user bases, it becomes important to investigate the ways in which users interact with the systems in practice. Transaction logs are one source of usage information, and the information on user behavior can be culled from them both automatically (through calculation of summary statistics) and manually (by examining query strings for semantic clues on search motivations and searching strategy). We have conducted a transaction log analysis of user activity in the Computer Science Technical Reports Collection of the New Zealand Digital Library; we report the insights gained and identify resulting search interface design issues. Specifically, we present the user demographics available for our library, discuss the use of operators and search options in queries, and examine patterns in query construction and refinement. We also describe common mistakes in searching, and examine the distribution of query terms appearing in the logs.
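The "automatic" side of transaction log analysis that the abstract mentions, summary statistics over query strings, can be sketched in a few lines. This is a hypothetical illustration; the operator list and sample queries are assumptions, not data from the study:

```python
from collections import Counter

BOOLEAN_OPS = {"and", "or", "not"}  # assumed operator set for illustration

def query_term_stats(queries):
    """Compute term frequencies and the fraction of queries using
    boolean operators, from a transaction log of search queries."""
    terms = Counter()
    with_ops = 0
    for q in queries:
        toks = q.lower().split()
        if any(t in BOOLEAN_OPS for t in toks):
            with_ops += 1
        terms.update(t for t in toks if t not in BOOLEAN_OPS)
    return terms, with_ops / len(queries)

queries = ["java AND threads", "garbage collection", "parsing NOT yacc"]
terms, op_rate = query_term_stats(queries)
# op_rate is 2/3: two of the three queries use a boolean operator.
```

Statistics like `op_rate` are exactly the kind of aggregate the paper derives to study how often searchers actually use advanced query features.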

Kousuke Nogami - One of the best experts on this subject based on the ideXlab platform.

  • Log Analysis in an HTTP Proxy Server for Accurately Estimating Web QoE
    Consumer Communications and Networking Conference, 2018
    Co-Authors: Anan Sawabe, Hiroshi Yoshida, Kousuke Nogami
    Abstract:

    The users' perceived quality of web page browsing, so-called "Web QoE", is becoming an important consideration for mobile network operators. The means by which operators can grow their customer base is shifting from ensuring high network quality of service (QoS) in terms of throughput to improving the quality of experience (QoE) of the users of their networks. Operators hence need to estimate Web QoE from the vast number of logs stored on their network equipment, e.g., HTTP proxy servers. Generally, HTTP proxy servers record connection logs not on a per-web-access basis but rather on a per-HTTP-connection basis, and a single web access typically consists of multiple HTTP connections. Mobile network operators therefore need to estimate web sessions from many HTTP connection logs in an HTTP proxy server. To estimate web sessions, earlier studies took three approaches: (1) content-type based, (2) time based, and (3) mixed. These approaches, however, misestimate web sessions in some cases. When a user accesses multiple web pages in a short time, they may not distinguish which HTTP sessions compose a single web page, and may thus aggregate multiple web sessions into a single session. As a result, the estimation accuracy of web sessions decreases, and the estimation accuracy of Web QoE correspondingly decreases. In this paper, to estimate web sessions more accurately, we focus on the number of HTTP sessions in misestimated web sessions and propose a method for detecting erroneous estimations of web sessions based on statistical hypothesis testing. An experiment conducted on an operational LTE network showed that our method decreases the mean absolute error of web session estimation by 0.09 points compared with the conventional method, and comes within 0.03 points of the estimation accuracy limit.
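To make the session-estimation problem concrete, here is a hypothetical Python sketch: a time-based grouping of per-connection timestamps into web sessions, plus a crude outlier check on session size standing in for the paper's statistical hypothesis test (the idle gap and z-score threshold are arbitrary illustration values, not the paper's parameters):

```python
from statistics import mean, stdev

def group_sessions(timestamps, gap=1.0):
    """Time-based session estimation: a new web session starts whenever
    the idle gap between consecutive connections exceeds `gap` seconds."""
    sessions, current = [], [timestamps[0]]
    for t in timestamps[1:]:
        if t - current[-1] > gap:
            sessions.append(current)
            current = [t]
        else:
            current.append(t)
    sessions.append(current)
    return sessions

def flag_suspect(sessions, z=2.0):
    """Flag sessions whose connection count deviates strongly from the mean,
    a simplified stand-in for the paper's hypothesis-testing step."""
    counts = [len(s) for s in sessions]
    if len(counts) < 2 or stdev(counts) == 0:
        return []
    m, sd = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if abs(c - m) / sd > z]

# Six connection timestamps (seconds); gaps > 1s split them into 3 sessions.
ts = [0.0, 0.2, 0.4, 5.0, 5.1, 10.0]
sessions = group_sessions(ts)
```

The failure mode the paper targets is visible here: if two pages are fetched within the idle gap, their connections merge into one oversized session, which the statistical check on connection counts is meant to catch.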

John P Rouillard - One of the best experts on this subject based on the ideXlab platform.

  • Real-Time Log File Analysis Using the Simple Event Correlator (SEC)
    LISA '04 Proceedings of the 18th USENIX conference on System administration, 2004
    Co-Authors: John P Rouillard
    Abstract:

    Log analysis is an important way to keep track of computers and networks. The use of automated analysis always results in false reports; however, these can be minimized by proper specification of recognition criteria. Current analysis approaches fail to provide sufficient support for recognizing the temporal component of log analysis. Temporal sequences of events fall into distinct patterns that can be used to reduce false alerts and improve the efficiency of response to problems. This paper discusses these patterns while describing the rationale behind, and implementation of, a ruleset created at the CS department of the University of Massachusetts at Boston for SEC, the Simple Event Correlator.
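SEC itself is a Perl tool driven by rule files, but the temporal pattern at the heart of such rulesets, "alert once when N matching events occur within a time window, then suppress repeats", can be sketched in Python. This is an illustrative approximation of the idea, not SEC's actual rule engine or syntax:

```python
from collections import deque

def threshold_alerts(events, threshold=3, window=60.0):
    """Raise one alert when `threshold` matching events arrive within
    `window` seconds, then suppress further alerts for one window.
    `events` is a sorted list of event timestamps (seconds)."""
    recent = deque()          # timestamps inside the sliding window
    alerts = []
    alerted_until = float("-inf")
    for t in events:
        recent.append(t)
        while recent and t - recent[0] > window:
            recent.popleft()  # drop events that fell out of the window
        if len(recent) >= threshold and t > alerted_until:
            alerts.append(t)
            alerted_until = t + window  # suppress duplicate alerts
            recent.clear()
    return alerts

# e.g. login-failure timestamps: a burst at 0-25s, another at 200-215s
fails = [0, 10, 20, 25, 200, 210, 215]
alerts = threshold_alerts(fails)
```

Correlating events this way, rather than alerting on every individual match, is exactly the false-report reduction the abstract argues for: one burst of failures produces one alert instead of seven.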

  • Real-Time Log File Analysis Using the Simple Event Correlator (SEC)
    USENIX Large Installation Systems Administration Conference, 2004
    Co-Authors: John P Rouillard
    Abstract:

    Originally published at the USENIX LISA 2004 conference, November 2004, Atlanta, Georgia, USA. Log analysis is an important way to keep track of computers and networks. The use of automated analysis always results in false reports; however, these can be minimized by proper specification of recognition criteria. Current analysis approaches fail to provide sufficient support for recognizing the temporal component of log analysis. Temporal sequences of events fall into distinct patterns that can be used to reduce false alerts and improve the efficiency of response to problems. This paper discusses these patterns while describing the rationale behind, and implementation of, a ruleset created at the CS department of the University of Massachusetts at Boston for SEC, the Simple Event Correlator.

Tengsheng Moh - One of the best experts on this subject based on the ideXlab platform.

  • Detecting Web Attacks Using Multi-Stage Log Analysis
    International Conference on Advanced Computing, 2016
    Co-Authors: Melody Moh, Santhosh Pininti, Sindhusha Doddapaneni, Tengsheng Moh
    Abstract:

    Web-based applications have gained universal acceptance in every sector of life, including social, commercial, government, and academic communities. Even with the recent emergence of cloud technology, most cloud applications are accessed and controlled through web interfaces. Web security has therefore continued to be fundamentally important and extremely challenging. One major security issue of web applications is SQL-injection attacks. Most existing solutions for detecting these attacks use log analysis, and employ either pattern matching or machine learning methods. Pattern matching methods can be effective and dynamic; however, they cannot detect new kinds of attacks. Supervised machine learning methods can detect new attacks, yet they rely on an off-line training phase. This work proposes a multi-stage log analysis architecture that combines both pattern matching and supervised machine learning methods. It uses logs generated by the application during attacks to effectively detect attacks and to help prevent future attacks. The architecture is described in detail; a proof-of-concept prototype is implemented and hosted on Amazon AWS, using Kibana for pattern matching and Bayes Net for machine learning, and is evaluated on 10,000 logs for detecting SQL injection attacks. Experimental results show that the two-stage system combines the advantages of both approaches and substantially improves detection accuracy. The proposed work is significant in advancing web security, and the multi-stage log analysis concept is highly applicable to many intrusion detection applications.
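The two-stage idea, cheap signatures first, a trained classifier as a fallback for novel attacks, can be sketched without the paper's actual stack (Kibana and Bayes Net). Everything below is a hypothetical toy: the signature list, the tiny naive Bayes classifier, and the training samples are all assumptions for illustration:

```python
import math
import re
from collections import Counter

# Stage 1: a few illustrative SQL-injection signatures (not a real ruleset).
KNOWN_SQLI = [r"(?i)union\s+select", r"(?i)or\s+1\s*=\s*1", r"(?i);\s*drop\s+table"]

def stage1(query):
    """Pattern matching against known attack signatures."""
    return any(re.search(p, query) for p in KNOWN_SQLI)

class TinyBayes:
    """Stage 2: a minimal naive Bayes over query tokens, trained offline
    on labeled logs (0 = benign, 1 = attack)."""
    def __init__(self):
        self.counts = {0: Counter(), 1: Counter()}
        self.totals = {0: 0, 1: 0}

    def fit(self, samples):
        for text, label in samples:
            toks = text.lower().split()
            self.counts[label].update(toks)
            self.totals[label] += len(toks)

    def predict(self, text):
        # Add-one smoothed log-likelihood per class; pick the larger.
        scores = {
            label: sum(
                math.log((self.counts[label][t] + 1) / (self.totals[label] + 1))
                for t in text.lower().split()
            )
            for label in (0, 1)
        }
        return max(scores, key=scores.get)

def detect(query, model):
    """Two-stage detection: signatures first, classifier fallback."""
    return stage1(query) or model.predict(query) == 1

model = TinyBayes()
model.fit([
    ("select name from users", 0),
    ("get product list", 0),
    ("admin' -- comment bypass", 1),
    ("sleep(5) timing probe", 1),
])
```

The division of labor mirrors the abstract's argument: stage 1 catches known attacks cheaply and deterministically, while stage 2 generalizes to inputs no signature covers.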