The experts below are selected from a list of 291 experts worldwide, ranked by the ideXlab platform.
Martin Rinard - One of the best experts on this subject based on the ideXlab platform.
-
information Flow Analysis of android applications in droidsafe
Network and Distributed System Security Symposium, 2015
Co-Authors: Michael I Gordon, Jeff H Perkins, Limei Gilham, Nguyen D Nguyen, Martin Rinard
Abstract: We present DroidSafe, a static information flow analysis tool that reports potential leaks of sensitive information in Android applications. DroidSafe combines a comprehensive, accurate, and precise model of the Android runtime with static analysis design decisions that enable the DroidSafe analyses to scale to this model. This combination is enabled by accurate analysis stubs, a technique that enables the effective analysis of code whose complete semantics lies outside the scope of Java, and by a combination of analyses that together can statically resolve communication targets identified by dynamically constructed values such as strings and class designators. Our experimental results demonstrate that 1) DroidSafe achieves unprecedented precision and accuracy for Android information flow analysis (as measured on a standard, previously published set of benchmark applications) and 2) DroidSafe detects all malicious information flow leaks inserted into 24 real-world Android applications by three independent, hostile Red-Team organizations. The previous state-of-the-art analysis, in contrast, detects less than 10% of these malicious flows.
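The kind of leak DroidSafe reports can be illustrated with a toy taint-propagation sketch. This is not DroidSafe's actual algorithm, and the API names (`getDeviceId`, `sendTextMessage`) stand in for hypothetical sensitive sources and sinks; the sketch only shows the core idea of marking values that originate at a source and flagging when a marked value reaches a sink:

```python
# Toy taint propagation over a list of assignments (lhs, rhs).
# SOURCES/SINKS are hypothetical API names, not a real Android model.

SOURCES = {"getDeviceId"}        # assumed sensitive-data APIs
SINKS = {"sendTextMessage"}      # assumed exfiltration APIs

def find_leaks(stmts):
    """stmts: list of (lhs, rhs) pairs meaning lhs = rhs.
    Returns (value, sink) pairs where a tainted value reaches a sink."""
    tainted, leaks = set(), []
    for lhs, rhs in stmts:
        if rhs in SOURCES or rhs in tainted:
            tainted.add(lhs)                # taint enters or propagates
        if lhs in SINKS and rhs in tainted:
            leaks.append((rhs, lhs))        # tainted value flows into a sink
    return leaks

program = [
    ("id", "getDeviceId"),        # id = getDeviceId()
    ("msg", "id"),                # msg = id
    ("sendTextMessage", "msg"),   # sendTextMessage(msg)
]
print(find_leaks(program))        # [('msg', 'sendTextMessage')]
```

A real analysis like DroidSafe must additionally model aliasing, the Android lifecycle, and dynamically constructed communication targets, which is precisely what the abstract's "accurate analysis stubs" address.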
Hanne Riis Nielson - One of the best experts on this subject based on the ideXlab platform.
-
ESOP - Interprocedural Control Flow Analysis
Programming Languages and Systems, 1999
Co-Authors: Flemming Nielson, Hanne Riis Nielson
Abstract: Control flow analysis is a widely used approach for analysing functional and object-oriented programs. As applications become more demanding, the analysis also needs to be more precise in its ability to deal with mutable state (or side effects) and to perform polyvariant (or context-sensitive) analysis. Several insights in data flow analysis and abstract interpretation show how to do so for imperative programs, but these techniques have not had much impact on control flow analysis. We show how to incorporate a number of key insights from data flow analysis (involving such advanced interprocedural techniques as call strings and assumption sets) into control flow analysis (using abstract interpretation to induce the analyses from a collecting semantics).
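The call-strings idea mentioned in the abstract can be shown with a deliberately tiny example (an assumed illustration, not the paper's formalization): an identity function called from two sites. A context-insensitive analysis merges both arguments and reports both values at every return; tagging each fact with a k-limited call string keeps the contexts apart:

```python
# Toy k-limited call strings: map each truncated call string to the set of
# values the called function can return in that context.

def returns_at(call_sites, k=1):
    """call_sites: list of (site, argument) pairs for calls to an identity
    function.  Returns {call_string: possible return values}."""
    results = {}
    for site, arg in call_sites:
        ctx = (site,)[-k:] if k > 0 else ()   # k-limited call string
        results.setdefault(ctx, set()).add(arg)
    return results

sites = [("site1", 1), ("site2", 2)]
print(returns_at(sites, k=1))   # {('site1',): {1}, ('site2',): {2}}
print(returns_at(sites, k=0))   # context-insensitive: {(): {1, 2}}
```

With k=1 each call site gets back only its own argument; with k=0 the analysis degenerates to the context-insensitive result, which is exactly the precision loss the interprocedural techniques are designed to avoid.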
-
Data Flow Analysis
Principles of Program Analysis, 1999
Co-Authors: Flemming Nielson, Hanne Riis Nielson, Chris Hankin
Abstract: In this chapter we introduce techniques for data flow analysis. Data flow analysis is the traditional form of program analysis which is described in many textbooks on compiler writing. We will present analyses for the simple imperative language While that was introduced in Chapter 1. This includes a number of classical data flow analyses: Available Expressions, Reaching Definitions, Very Busy Expressions and Live Variables. We introduce an operational semantics for While and demonstrate the correctness of the Live Variables analysis. We then present the notion of Monotone Frameworks and show how the examples may be recast as such frameworks. We continue by presenting a worklist algorithm for solving flow equations and we study its termination and correctness properties. The chapter concludes with a presentation of some advanced topics, including interprocedural data flow analysis and shape analysis.
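The worklist algorithm the chapter describes can be sketched compactly for Live Variables, a backward "may" analysis with the standard equations LVexit(n) = ∪ { LVentry(s) : s a successor of n } and LVentry(n) = (LVexit(n) \ kill(n)) ∪ gen(n). The three-node CFG below is an assumed example, not one taken from the book:

```python
# Worklist solver for Live Variables over an explicit CFG.
# gen(n): variables read at n; kill(n): variables written at n.

def live_variables(nodes, succ, gen, kill):
    entry = {n: set() for n in nodes}        # LVentry, initially empty
    work = list(nodes)
    while work:
        n = work.pop()
        exit_n = set().union(*(entry[s] for s in succ[n])) if succ[n] else set()
        new_entry = (exit_n - kill[n]) | gen[n]
        if new_entry != entry[n]:
            entry[n] = new_entry
            # a change at n can affect every predecessor of n
            work.extend(p for p in nodes if n in succ[p])
    return entry

# [x := 2]^1; [y := 4]^2; [z := x + y]^3   (nothing live after node 3)
nodes = [1, 2, 3]
succ = {1: {2}, 2: {3}, 3: set()}
gen  = {1: set(), 2: set(), 3: {"x", "y"}}
kill = {1: {"x"}, 2: {"y"}, 3: {"z"}}
print(live_variables(nodes, succ, gen, kill))
# {1: set(), 2: {'x'}, 3: {'x', 'y'}}
```

Termination follows because entry sets only grow and are bounded by the finite set of program variables, which is the monotone-framework argument the chapter develops in general.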
-
Interprocedural Control Flow Analysis (Extended version)
DAIMI Report Series, 1998
Co-Authors: Flemming Nielson, Hanne Riis Nielson
Abstract: Control flow analysis is a widely used approach for analysing functional and object-oriented programs, and recently it has also been used successfully to analyse more challenging notions of computation involving concurrency. However, as applications become more demanding, the analysis also needs to be more precise in its ability to deal with mutable state (or side effects) and to perform polyvariant (or context-sensitive) analysis. Several insights in data flow analysis and abstract interpretation show how to do so for imperative programs, but these techniques have not had much impact on control flow analysis because of the less abstract way in which they are normally expressed. In this paper we show how to incorporate a number of key insights from data flow analysis, involving such advanced interprocedural techniques as call strings and assumption sets, using abstract interpretation to induce the analyses from a general collecting semantics.
Flemming Nielson - One of the best experts on this subject based on the ideXlab platform.
-
ESOP - Interprocedural Control Flow Analysis
Programming Languages and Systems, 1999
Co-Authors: Flemming Nielson, Hanne Riis Nielson
Abstract: Control flow analysis is a widely used approach for analysing functional and object-oriented programs. As applications become more demanding, the analysis also needs to be more precise in its ability to deal with mutable state (or side effects) and to perform polyvariant (or context-sensitive) analysis. Several insights in data flow analysis and abstract interpretation show how to do so for imperative programs, but these techniques have not had much impact on control flow analysis. We show how to incorporate a number of key insights from data flow analysis (involving such advanced interprocedural techniques as call strings and assumption sets) into control flow analysis (using abstract interpretation to induce the analyses from a collecting semantics).
-
Data Flow Analysis
Principles of Program Analysis, 1999
Co-Authors: Flemming Nielson, Hanne Riis Nielson, Chris Hankin
Abstract: In this chapter we introduce techniques for data flow analysis. Data flow analysis is the traditional form of program analysis which is described in many textbooks on compiler writing. We will present analyses for the simple imperative language While that was introduced in Chapter 1. This includes a number of classical data flow analyses: Available Expressions, Reaching Definitions, Very Busy Expressions and Live Variables. We introduce an operational semantics for While and demonstrate the correctness of the Live Variables analysis. We then present the notion of Monotone Frameworks and show how the examples may be recast as such frameworks. We continue by presenting a worklist algorithm for solving flow equations and we study its termination and correctness properties. The chapter concludes with a presentation of some advanced topics, including interprocedural data flow analysis and shape analysis.
-
Interprocedural Control Flow Analysis (Extended version)
DAIMI Report Series, 1998
Co-Authors: Flemming Nielson, Hanne Riis Nielson
Abstract: Control flow analysis is a widely used approach for analysing functional and object-oriented programs, and recently it has also been used successfully to analyse more challenging notions of computation involving concurrency. However, as applications become more demanding, the analysis also needs to be more precise in its ability to deal with mutable state (or side effects) and to perform polyvariant (or context-sensitive) analysis. Several insights in data flow analysis and abstract interpretation show how to do so for imperative programs, but these techniques have not had much impact on control flow analysis because of the less abstract way in which they are normally expressed. In this paper we show how to incorporate a number of key insights from data flow analysis, involving such advanced interprocedural techniques as call strings and assumption sets, using abstract interpretation to induce the analyses from a general collecting semantics.
Uday P. Khedker - One of the best experts on this subject based on the ideXlab platform.
-
Bidirectional data Flow Analysis for type inferencing
Computer Languages, Systems & Structures, 2020
Co-Authors: Uday P. Khedker, Dhananjay M. Dhamdhere, Alan Mycroft
Abstract: Tennenbaum's data flow analysis based formulation of type inferencing is termed bidirectional in the "Dragon Book"; however, it fails to qualify as a formal data flow framework and is not amenable to complexity analysis. Further, the types discovered are imprecise. Here, we define a formal data flow framework (based on bidirectional data flow analysis) which discovers more precise type information and is amenable to complexity analysis. We compare data flow analyses with the more general constraint-based analyses and observe that data flow analyses represent program analyses without unbounded auxiliary store. We show that if unlimited auxiliary store is allowed, then no data flow analysis would need more than two passes; if auxiliary store is disallowed, then type inferencing requires bidirectional data flow analysis.
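The need for bidirectional flow in type inferencing can be illustrated with a toy propagation (an assumed illustration, not the paper's framework): across a copy statement x := y, a type fact must flow forward (rhs to lhs) and backward (a requirement at a use of x constrains the definition of y):

```python
# Toy bidirectional type propagation over copy statements x := y.

def infer(copies, known):
    """copies: list of (lhs, rhs) for x := y.  known: {var: type} seed facts
    (e.g. from annotations or typed uses).  Propagates types along each copy
    in both directions until a fixpoint is reached."""
    types = dict(known)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in copies:
            for a, b in ((rhs, lhs), (lhs, rhs)):   # forward, then backward
                if a in types and types.get(b) != types[a]:
                    types[b] = types[a]
                    changed = True
    return types

# y's type is only known from a use of x: backward flow recovers it.
print(infer([("x", "y")], {"x": "int"}))   # {'x': 'int', 'y': 'int'}
```

A purely forward analysis would leave y untyped here; a purely backward one would fail in the symmetric case, which is the intuition behind requiring bidirectional analysis when no unbounded auxiliary store (such as a constraint set) is available.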
-
Data Flow Analysis: Theory and Practice
2009
Co-Authors: Uday P. Khedker, Amitabha Sanyal, Bageshri Karkare
Abstract: This work provides an in-depth treatment of the data flow analysis technique. Apart from including interprocedural data flow analysis, this book is the first to extend detailed coverage of analysis beyond bit vectors. Supplemented by numerous examples, it equips readers with a combination of mutually supportive theory and practice, presenting mathematical foundations and including a study of data flow analysis implementation through use of the GNU Compiler Collection (GCC). Readers can experiment with the analyses described in the book by accessing the authors' web page, where they will find the source code of gdfa (the generic data flow analyzer).
-
Bidirectional data Flow Analysis for type inferencing
Computer Languages, Systems & Structures, 2003
Co-Authors: Uday P. Khedker, Dhananjay M. Dhamdhere, Alan Mycroft
Abstract: Tennenbaum's data flow analysis based formulation of type inferencing is termed bidirectional in the "Dragon Book"; however, it fails to qualify as a formal data flow framework and is not amenable to complexity analysis. Further, the types discovered are imprecise. Here, we define a formal data flow framework (based on bidirectional data flow analysis) which discovers more precise type information and is amenable to complexity analysis. We compare data flow analyses with the more general constraint-based analyses and observe that data flow analyses represent program analyses without unbounded auxiliary store. We show that if unlimited auxiliary store is allowed, then no data flow analysis would need more than two passes; if auxiliary store is disallowed, then type inferencing requires bidirectional data flow analysis.
Alan Mycroft - One of the best experts on this subject based on the ideXlab platform.
-
Bidirectional data Flow Analysis for type inferencing
Computer Languages, Systems & Structures, 2020
Co-Authors: Uday P. Khedker, Dhananjay M. Dhamdhere, Alan Mycroft
Abstract: Tennenbaum's data flow analysis based formulation of type inferencing is termed bidirectional in the "Dragon Book"; however, it fails to qualify as a formal data flow framework and is not amenable to complexity analysis. Further, the types discovered are imprecise. Here, we define a formal data flow framework (based on bidirectional data flow analysis) which discovers more precise type information and is amenable to complexity analysis. We compare data flow analyses with the more general constraint-based analyses and observe that data flow analyses represent program analyses without unbounded auxiliary store. We show that if unlimited auxiliary store is allowed, then no data flow analysis would need more than two passes; if auxiliary store is disallowed, then type inferencing requires bidirectional data flow analysis.
-
Bidirectional data Flow Analysis for type inferencing
Computer Languages, Systems & Structures, 2003
Co-Authors: Uday P. Khedker, Dhananjay M. Dhamdhere, Alan Mycroft
Abstract: Tennenbaum's data flow analysis based formulation of type inferencing is termed bidirectional in the "Dragon Book"; however, it fails to qualify as a formal data flow framework and is not amenable to complexity analysis. Further, the types discovered are imprecise. Here, we define a formal data flow framework (based on bidirectional data flow analysis) which discovers more precise type information and is amenable to complexity analysis. We compare data flow analyses with the more general constraint-based analyses and observe that data flow analyses represent program analyses without unbounded auxiliary store. We show that if unlimited auxiliary store is allowed, then no data flow analysis would need more than two passes; if auxiliary store is disallowed, then type inferencing requires bidirectional data flow analysis.