Uninitialized Variable


The Experts below are selected from a list of 57 Experts worldwide, ranked by the ideXlab platform.

Yannick Moy - One of the best experts on this subject based on the ideXlab platform.

  • Gem #68: let's SPARK! - part 1
    ACM SIGAda Ada Letters, 2011
    Co-Authors: Yannick Moy
    Abstract:

    In this Gem and the next one, we present a simple walk-through of SPARK's capabilities and its integration with GPS. In this first Gem, we show how to set up a SPARK project and prove that your SPARK programs are free from Uninitialized Variable accesses and that they execute without run-time errors.

  • Gem #69: let's SPARK! - part 2
    ACM SIGAda Ada Letters, 2011
    Co-Authors: Yannick Moy
    Abstract:

    In this Gem and the previous one, we give you a simple walkthrough of SPARK's capabilities and its integration with GPS. In the previous Gem, we showed how to set up a SPARK project and prove that your SPARK programs are free from Uninitialized Variable accesses and that they execute without run-time errors. In this Gem, we show how to prove that your SPARK programs respect given contracts.
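
    A note on the defect class these Gems address: SPARK analyzes Ada code, so the snippet below is not SPARK but a minimal, hand-written C sketch (all names and values are invented) of the two things the Gems prove about a program: that no Variable is read before it is assigned, and that user-supplied contracts, written here as plain assertions, hold on every execution.

        #include <assert.h>
        #include <stdio.h>

        /* Contract-style wrapper: the assertions play the role of a pre- and
         * postcondition that a prover (or a run-time check) must discharge. */
        static int saturating_increment(int counter, int limit)
        {
            assert(counter >= 0 && counter <= limit);     /* precondition  */
            int result = (counter < limit) ? counter + 1 : limit;
            assert(result >= counter && result <= limit); /* postcondition */
            return result;
        }

        int main(void)
        {
            int total;      /* declared but never assigned */
            int steps = 3;

            /* BUG: 'total' is read before any assignment; a flow analysis in
             * the spirit of SPARK's would reject this program instead of
             * letting it run with an unpredictable value. */
            for (int i = 0; i < steps; i++)
                total = saturating_increment(total, 10);

            printf("total = %d\n", total);
            return 0;
        }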

Ravindra Naik - One of the best experts on this subject based on the ideXlab platform.

  • WCRE - Precise Detection of Uninitialized Variables Using Dynamic Analysis - Extending to Aggregate and Vector Types
    2012 19th Working Conference on Reverse Engineering, 2012
    Co-Authors: Anushri Jana, Ravindra Naik
    Abstract:

    An Uninitialized Variable is a Variable that is declared in a program but is not assigned a definite known value before it is used. Compilers for modern programming languages (such as Java) check for their presence, but in languages such as C and COBOL they are a critical cause of incorrect results, wrong data entries, and run-time failures. Tools based on static and dynamic analysis techniques are available to detect uses of Uninitialized Variables. However, static analysis tools suffer from false positives (precision), while dynamic analysis tools are unable to cover all classes of Variables (completeness). In this paper, we present a technique based on dynamic program analysis, together with results from a prototype tool, for detecting the first use of Uninitialized Variables. Using a combination of source and binary instrumentation, the technique is able to track Variables of basic types, individual array elements, and fields of structures. We verified the completeness and precision of the technique on two open-source case studies with very large input datasets and compare the results with other open-source tools. The prototype tool is a clear winner in terms of precision and coverage of Variables, but more work is required to further reduce the size of the instrumentation information and to improve the performance of its analyzer.
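
    The paper instruments real C and COBOL programs at the source and binary level; the fragment below is only an assumed, hand-written C illustration of the core idea: every tracked location carries a shadow "definitely assigned" flag, instrumented writes set the flag, and instrumented reads check it, so the first use of a location that was never assigned is reported at run time.

        #include <stdbool.h>
        #include <stdio.h>

        #define N 4

        /* Shadow state: one "definitely assigned" flag per tracked array
         * element; the same idea extends to scalars and structure fields. */
        static int  a[N];
        static bool a_def[N];

        /* Instrumented write: store the value and mark the element defined. */
        static void write_a(int i, int v) { a_def[i] = true; a[i] = v; }

        /* Instrumented read: report the first use of an element that was
         * never assigned, which is what the dynamic analysis reports as an
         * Uninitialized Variable access. */
        static int read_a(int i)
        {
            if (!a_def[i])
                fprintf(stderr, "uninitialized read: a[%d]\n", i);
            return a[i];
        }

        int main(void)
        {
            write_a(0, 42);          /* a[0] becomes defined              */
            int ok  = read_a(0);     /* fine                              */
            int bad = read_a(2);     /* reported: a[2] was never assigned */
            printf("ok=%d bad=%d\n", ok, bad);
            return 0;
        }

    In the tool described above, the shadow state is maintained by the instrumentation itself rather than by hand-written wrappers, which is what lets it cover individual array elements and structure fields without changing the program.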

Yang Xue - One of the best experts on this subject based on the ideXlab platform.

  • Path Sensitive Defect Analysis for BPEL Under Dead Path Semantic
    Journal of Software, 2012
    Co-Authors: Yang Xue
    Abstract:

    Software defects are an important indicator for measuring the adequacy of software testing. In order to improve the reliability and robustness of BPEL-based Web service composition, this paper proposes a defect detection method that incorporates dead paths into a path-sensitive analysis. A dead path is a special BPEL semantic: it carries no execution information, but it can connect two executed path segments. Using the abstract value of a Variable and interval arithmetic, the method can identify unreachable paths and dead paths, and it can also merge the conditions of the property state at join points. A case study on detecting an Uninitialized Variable related to a dead path and an executing path is carried through the analysis, and a verification is implemented to illustrate the effectiveness of this approach.
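
    BPEL processes are not C programs, so the following is only a rough, hand-written C sketch (types and names are invented, and the dead-path semantic is simplified) of the abstract-interpretation machinery the abstract describes: each Variable is tracked as a "may be uninitialized" flag plus an interval of possible values, and the states arriving from different path segments, including a dead path on which the Variable was never assigned, are merged at a join point.

        #include <limits.h>
        #include <stdbool.h>
        #include <stdio.h>

        /* Abstract value of one Variable: a "may be uninitialized" flag plus
         * an interval for the values it can hold when it is initialized.    */
        typedef struct {
            bool may_uninit;
            long lo, hi;
        } absval;

        /* Nothing assigned yet: possibly uninitialized, empty interval.     */
        static absval unassigned(void)
        {
            return (absval){ .may_uninit = true, .lo = LONG_MAX, .hi = LONG_MIN };
        }

        /* The Variable was assigned the constant c on this path.            */
        static absval constant(long c)
        {
            return (absval){ .may_uninit = false, .lo = c, .hi = c };
        }

        /* Join at a control-flow merge point: the Variable may be
         * uninitialized if it may be on either incoming path, and the
         * interval is the convex hull of the incoming intervals.            */
        static absval join(absval p, absval q)
        {
            absval r;
            r.may_uninit = p.may_uninit || q.may_uninit;
            r.lo = p.lo < q.lo ? p.lo : q.lo;
            r.hi = p.hi > q.hi ? p.hi : q.hi;
            return r;
        }

        int main(void)
        {
            absval x_taken = constant(5);    /* executed path: x := 5        */
            absval x_dead  = unassigned();   /* dead path: x never assigned  */
            absval x_join  = join(x_taken, x_dead);

            if (x_join.may_uninit)
                printf("defect: x may be read uninitialized after the join "
                       "(interval when assigned: [%ld, %ld])\n",
                       x_join.lo, x_join.hi);
            return 0;
        }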

François Irigoin - One of the best experts on this subject based on the ideXlab platform.

  • Centre de Recherche en Informatique - École des Mines de Paris
    2008
    Co-Authors: Thi Viet Nga Nguyen, François Irigoin
    Abstract:

    Software validation and verification are critical for systems whose failure is unacceptable. This is a very expensive process, typically costing 50% of the total software development costs. We show in this paper how program analyses can be used to perform efficient and effective verifications automatically, e.g., array bound checking and Uninitialized Variable checking. These verifications are based on array regions, analyses that collect information about the way array elements are used and defined by programs. With the useful knowledge of program properties provided by these static analyses, the amount of additional code necessary to instrument the application is reduced to a minimum. Run-time checks are generated only when we do not know how to prove that they are useless. Dynamic errors that may potentially cause failures are caught, and their consequences can be reduced by exception handling. The information provided by our analyses is helpful for the debugging process of large-scale programs. All verifications are implemented in PIPS, the research compiler developed at École des Mines de Paris. Qualitative and quantitative experiments have been done with industrial applications of up to hundreds of thousands of lines of code.

  • Advanced Program Analyses and Verifications
    2003
    Co-Authors: Thi Viet Nga Nguyen, François Irigoin
    Abstract:

    Software validation and verification are critical for systems whose failure is unacceptable. This is a very expensive process, typically costing 50% of the total software development costs. We show in this paper how program analyses can be used to perform efficient and effective verifications automatically, e.g., array bound checking and Uninitialized Variable checking. These verifications are based on array regions, analyses that collect information about the way array elements are used and defined by programs. With the useful knowledge of program properties provided by these static analyses, the amount of additional code necessary to instrument the application is reduced to a minimum.

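    PIPS analyzes whole applications rather than hand-annotated snippets, so the fragment below is only a hand-written C illustration (the helper function and its messages are invented) of the generation strategy described above: run-time checks are emitted only where the static array-region analysis cannot prove an access safe, and a triggered check aborts in place of raising an exception.

        #include <stdio.h>
        #include <stdlib.h>

        #define N 10

        /* A bound check of the kind the verifier would insert; accesses
         * proven safe by the array-region analysis stay unchecked. */
        static void check_bound(int i, int n, const char *where)
        {
            if (i < 0 || i >= n) {
                fprintf(stderr, "array bound violation at %s: index %d, size %d\n",
                        where, i, n);
                exit(EXIT_FAILURE);   /* stands in for raising an exception */
            }
        }

        int main(int argc, char **argv)
        {
            int a[N];

            /* The loop bounds are statically known, so a region analysis can
             * prove every write stays inside a[0..N-1]: no check generated. */
            for (int i = 0; i < N; i++)
                a[i] = i * i;

            /* This index depends on program input, so the access cannot be
             * proven safe and the run-time check is kept. */
            int k = (argc > 1) ? atoi(argv[1]) : 3;
            check_bound(k, N, "a[k]");
            printf("a[%d] = %d\n", k, a[k]);
            return 0;
        }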
