Integrity Verification

The Experts below are selected from a list of 10878 Experts worldwide, ranked by the ideXlab platform.

Zhuo Feng - One of the best experts on this subject based on the ideXlab platform.

  • DAC - A Spectral Graph Sparsification Approach to Scalable Vectorless Power Grid Integrity Verification
    Proceedings of the 54th Annual Design Automation Conference 2017, 2017
    Co-Authors: Zhiqiang Zhao, Zhuo Feng
    Abstract:

    Vectorless Integrity Verification is becoming increasingly critical to the robust design of nanoscale power delivery networks (PDNs). To dramatically improve the efficiency and capability of vectorless Integrity Verification, this paper introduces a scalable multilevel Integrity Verification framework that leverages a hierarchy of almost linear-sized spectral power grid sparsifiers, which well preserve effective resistances between nodes, together with a recent graph-theoretic algebraic multigrid (AMG) algorithmic framework. As a result, the vectorless Integrity Verification solutions obtained on coarse-level problems can effectively help find the solution of the original problem. Extensive experimental results show that the proposed vectorless Verification framework always obtains worst-case scenarios efficiently and accurately, even for very large power grid designs.
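
    The formulation that such multilevel frameworks accelerate can be made concrete with a small sketch. Vectorless verification is commonly cast as a linear program: the worst-case IR drop at a node is a linear function of the (unknown but bounded) load currents, so one adjoint solve against the conductance matrix yields the LP objective. The sketch below illustrates only that baseline formulation, not the authors' sparsification or AMG machinery; the grid size, pad placement, and current budgets are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import linprog
    from scipy.sparse import lil_matrix
    from scipy.sparse.linalg import spsolve

    def grid_conductance(n, g_wire=1.0, g_pad=0.5, pads=(0,)):
        """Conductance matrix (grid Laplacian plus pad conductances) of an n-by-n grid."""
        N = n * n
        G = lil_matrix((N, N))
        def idx(r, c):
            return r * n + c
        for r in range(n):
            for c in range(n):
                u = idx(r, c)
                for dr, dc in ((0, 1), (1, 0)):   # right and down neighbours
                    if r + dr < n and c + dc < n:
                        v = idx(r + dr, c + dc)
                        G[u, u] += g_wire
                        G[v, v] += g_wire
                        G[u, v] -= g_wire
                        G[v, u] -= g_wire
        for p in pads:                            # pads tie selected nodes to VDD
            G[p, p] += g_pad
        return G.tocsc()

    n = 3
    N = n * n
    G = grid_conductance(n, pads=(0, N - 1))

    # Worst-case IR drop at node k: maximize v_k = (G^{-1} e_k)^T i over all feasible
    # current vectors i, i.e. an LP whose objective comes from one adjoint solve.
    k = N // 2                                    # node under verification (assumed)
    e_k = np.zeros(N)
    e_k[k] = 1.0
    c = spsolve(G, e_k)                           # adjoint solve: G x = e_k (G is symmetric)

    i_local_max = 0.05                            # per-node current bound (assumed, in A)
    i_total_max = 0.20                            # global current budget (assumed, in A)
    res = linprog(-c,                             # linprog minimizes, so negate to maximize
                  A_ub=np.ones((1, N)), b_ub=[i_total_max],
                  bounds=[(0.0, i_local_max)] * N, method="highs")
    print(f"worst-case IR drop at node {k}: {-res.fun:.4f} V")
    ```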

  • DAC - Scalable vectorless power grid current Integrity Verification
    Proceedings of the 50th Annual Design Automation Conference on - DAC '13, 2013
    Co-Authors: Zhuo Feng
    Abstract:

    To deal with the growing phenomenon of electromigration (EM), power grid current Integrity Verification becomes indispensable to designing reliable power delivery networks (PDNs). Unlike previous works that focus on vectorless voltage Integrity Verification of power grids, in this work we present, for the first time, a scalable vectorless power grid current Integrity Verification framework. By taking advantage of multilevel power grid Verifications, large-scale power grid current Integrity Verification tasks can be completed very efficiently. Additionally, a novel EM-aware geometric power grid reduction method is proposed to preserve the geometric and electrical properties of the original grid on the coarse-level power grids, which makes it possible to quickly identify the potential "hot wires" that may carry greater-than-desired currents in a given power grid design. The proposed multilevel power grid Verification algorithm provides flexible tradeoffs between current Integrity Verification cost and solution quality, while the desired upper/lower bounds for worst-case currents flowing through a wire can also be computed efficiently. Extensive experimental results show that our current Integrity Verification approach can efficiently handle very large power grid designs with good solution quality.
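
    The same LP view extends from node voltages to branch currents, which is the quantity EM verification bounds: the current through a wire is a linear function of the load currents, so an upper bound (and, by negating the objective, a lower bound) follows from a small LP solve. The sketch below shows this baseline on a hypothetical four-node ring and does not reflect the paper's multilevel reduction or EM-aware coarsening; all conductances and current budgets are assumed values.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    g = 1.0                                       # wire conductance (assumed, in S)
    # Hypothetical 4-node ring: wires (0,1), (1,2), (2,3), (3,0); node 0 tied to VDD via a pad.
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    G = np.zeros((4, 4))
    for u, v in edges:
        G[u, u] += g
        G[v, v] += g
        G[u, v] -= g
        G[v, u] -= g
    G[0, 0] += 0.5                                # pad conductance at node 0 (assumed)

    def worst_case_branch_current(u, v, i_local=0.05, i_total=0.10):
        """Upper bound on the current through wire (u, v) over all feasible load currents."""
        e = np.zeros(4)
        e[u], e[v] = 1.0, -1.0
        c = g * np.linalg.solve(G, e)             # I_uv = g * (v_u - v_v) = c^T i
        res = linprog(-c,                         # maximize c^T i
                      A_ub=np.ones((1, 4)), b_ub=[i_total],
                      bounds=[(0.0, i_local)] * 4, method="highs")
        return -res.fun

    for u, v in edges:
        print(f"wire ({u},{v}): worst-case current {worst_case_branch_current(u, v):.4f} A")
    ```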

Zhiqiang Zhao - One of the best experts on this subject based on the ideXlab platform.

  • DAC - A Spectral Graph Sparsification Approach to Scalable Vectorless Power Grid Integrity Verification
    Proceedings of the 54th Annual Design Automation Conference 2017, 2017
    Co-Authors: Zhiqiang Zhao, Zhuo Feng
    Abstract:

    Vectorless Integrity Verification is becoming increasingly critical to the robust design of nanoscale power delivery networks (PDNs). To dramatically improve the efficiency and capability of vectorless Integrity Verification, this paper introduces a scalable multilevel Integrity Verification framework that leverages a hierarchy of almost linear-sized spectral power grid sparsifiers, which well preserve effective resistances between nodes, together with a recent graph-theoretic algebraic multigrid (AMG) algorithmic framework. As a result, the vectorless Integrity Verification solutions obtained on coarse-level problems can effectively help find the solution of the original problem. Extensive experimental results show that the proposed vectorless Verification framework always obtains worst-case scenarios efficiently and accurately, even for very large power grid designs.

Cuiping Li - One of the best experts on this subject based on the ideXlab platform.

  • DASFAA (1) - Secure Data Aggregation with Integrity Verification in Wireless Sensor Networks
    Database Systems for Advanced Applications, 2018
    Co-Authors: Hui Peng, Yuncheng Wu, Juru Zeng, Hong Chen, Ke Wang, Cuiping Li
    Abstract:

    In recent years, wireless sensor networks (WSNs) have become a useful tool for environmental monitoring and information collection due to their strong sensory ability. However, because WSNs rely on wireless communication and are usually deployed in outdoor environments, they are vulnerable to attack, which can lead to privacy disclosure of the monitored environment. SUM, one of the most common WSN queries, is important for acquiring a high-level understanding of the monitored environment and establishes the basis for other advanced queries. In this paper, we present a secure hash-based privacy preservation mechanism called HP2M, which not only preserves the privacy of the monitored environment during SUM aggregation queries but also achieves exact SUM aggregation. Furthermore, an Integrity Verification mechanism is proposed to verify the Integrity of the SUM aggregation result, which can alert the system once data packets transmitted through the network are modified. One main characteristic of HP2M and the proposed Integrity Verification mechanism is that they are lightweight, with small bandwidth consumption. Finally, numerical experiments are performed to demonstrate the efficiency of our proposed approach.
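
    As a rough illustration of the general idea (keyed blinding for exact SUM plus lightweight tags for tamper detection), the sketch below has each sensor mask its reading with a key-derived pad and attach an HMAC, with the sink checking the tags and removing the pads. It is not the paper's HP2M construction or its in-network aggregation protocol; the modulus, key setup, and epoch counter are assumptions made for the example.

    ```python
    import hashlib
    import hmac
    import secrets

    M = 2 ** 32                                   # additive group for blinding (assumed)

    def prf(key: bytes, epoch: int) -> int:
        """Pseudorandom pad derived from a sensor's key and the query epoch."""
        return int.from_bytes(hmac.new(key, epoch.to_bytes(8, "big"),
                                       hashlib.sha256).digest()[:4], "big")

    keys = {i: secrets.token_bytes(16) for i in range(5)}    # sink shares one key per sensor
    readings = {0: 17, 1: 23, 2: 5, 3: 40, 4: 11}            # plaintext sensor readings
    epoch = 42

    # Each sensor blinds its reading and attaches an HMAC so in-transit modification is detectable.
    packets = {}
    for i, x in readings.items():
        blinded = (x + prf(keys[i], epoch)) % M
        tag = hmac.new(keys[i], f"{epoch}:{blinded}".encode(), hashlib.sha256).hexdigest()
        packets[i] = (blinded, tag)

    # The sink verifies every tag, sums the blinded values, and removes the pads.
    total = 0
    for i, (blinded, tag) in packets.items():
        expected = hmac.new(keys[i], f"{epoch}:{blinded}".encode(), hashlib.sha256).hexdigest()
        assert hmac.compare_digest(tag, expected), f"packet from sensor {i} was modified"
        total = (total + blinded) % M
    exact_sum = (total - sum(prf(keys[i], epoch) for i in keys)) % M
    print("exact SUM:", exact_sum)                           # 96, the true sum of the readings
    ```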

Guoren Wang - One of the best experts on this subject based on the ideXlab platform.

  • Outsourced data Integrity Verification based on blockchain in untrusted environment
    World Wide Web, 2020
    Co-Authors: Hao Kun, Junchang Xin, Zhiqiong Wang, Guoren Wang
    Abstract:

    Outsourced data, as a significant component of cloud services, has been widely used due to its convenience, low overhead, and high flexibility. To guarantee the Integrity of outsourced data, the data owner (DO) usually adopts a third-party auditor (TPA) to execute the data Integrity Verification scheme. However, during the Verification process the DO cannot fully confirm the reliability of the TPA, and handing the Verification of data Integrity over to an untrusted TPA may lead to data security threats. In this paper, we focus on the problem of Integrity Verification of outsourced data in an untrusted environment, that is, how to improve the security and efficiency of data Integrity Verification without relying on an untrusted TPA. To address the problem, we design a decentralized model based on blockchain, consisting of collaborative Verification peers (VPs), each of which maintains a replica of the entire blockchain to prevent malicious tampering. Based on this model, we present an advanced data Integrity Verification algorithm that allows the DO to store and check the Verification information by writing to and retrieving from the blockchain. In addition, to improve concurrent performance, we extend the algorithm by introducing the Verification group (VG), which consists of several VPs organized by Inner-Group and Inter-Group consensus protocols. We conduct a complete security analysis as well as extensive experiments on our proposed approach, and the evaluation results demonstrate that our approaches achieve superior performance.
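
    A minimal single-node sketch of the workflow the paper decentralizes is given below: the DO writes per-chunk digests into an append-only hash chain (the structure each VP would replicate) and later checks re-fetched chunks against the retrieved records. The VP consensus protocols (Inner-Group/Inter-Group) and the VG extension are omitted entirely, and names such as Ledger and report.db are illustrative.

    ```python
    import hashlib
    import json
    from dataclasses import dataclass

    @dataclass
    class Block:
        index: int
        prev_hash: str
        payload: dict                              # e.g. {"file": ..., "chunk": ..., "digest": ...}

        def hash(self) -> str:
            body = json.dumps({"i": self.index, "p": self.prev_hash, "d": self.payload},
                              sort_keys=True)
            return hashlib.sha256(body.encode()).hexdigest()

    class Ledger:
        """Append-only hash chain; each verification peer (VP) would hold a replica."""
        def __init__(self):
            self.chain = [Block(0, "0" * 64, {"genesis": True})]
        def append(self, payload: dict) -> None:
            self.chain.append(Block(len(self.chain), self.chain[-1].hash(), payload))
        def well_formed(self) -> bool:
            return all(b.prev_hash == a.hash() for a, b in zip(self.chain, self.chain[1:]))

    def digest(chunk: bytes) -> str:
        return hashlib.sha256(chunk).hexdigest()

    # The DO records a digest for every outsourced chunk instead of trusting a TPA.
    ledger = Ledger()
    outsourced = [b"chunk-0 ...", b"chunk-1 ...", b"chunk-2 ..."]
    for idx, chunk in enumerate(outsourced):
        ledger.append({"file": "report.db", "chunk": idx, "digest": digest(chunk)})

    # Later, the DO re-fetches a chunk from the cloud and checks it against the ledger.
    fetched = b"chunk-1 ..."                       # what the cloud returns for chunk 1
    record = ledger.chain[2].payload               # the entry written when chunk 1 was outsourced
    print(ledger.well_formed() and record["digest"] == digest(fetched))   # True if intact
    ```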

  • APWeb/WAIM (2) - Decentralized Data Integrity Verification Model in Untrusted Environment
    Web and Big Data, 2018
    Co-Authors: Hao Kun, Junchang Xin, Zhiqiong Wang, Zhuochen Jiang, Guoren Wang
    Abstract:

    Outsourced data, as a significant component of cloud services, has been widely used due to its convenience, low overhead, and high flexibility. To guarantee the Integrity of outsourced data and reduce the computational overhead, the data owner (DO) usually adopts a third-party auditor (TPA) to execute the Verification scheme. However, handing the Verification of data over to the TPA may lead to security vulnerabilities, since the TPA is not fully trusted. In this paper, we propose a novel solution for data Integrity Verification in an untrusted outsourced environment. First, we design a decentralized model based on blockchain, consisting of collaborative Verification peers (VPs). Based on our proposed model, we present an advanced data Integrity Verification algorithm that allows the DO to store and check Verification results by writing to and retrieving from the blockchain. Moreover, each VP maintains a replica of the entire blockchain to prevent malicious tampering. We evaluate our proposed approach on a real outsourced data service scenario. Experimental results demonstrate that our proposed approach is efficient and effective.

Sarala Ghimire - One of the best experts on this subject based on the ideXlab platform.

  • A data Integrity Verification method for surveillance video system
    Multimedia Tools and Applications, 2020
    Co-Authors: Sarala Ghimire
    Abstract:

    Owing to their massive popularity and the growing awareness of evidentiary requirements, the use of surveillance systems has increased tremendously. Although video data recorded by a surveillance system contains important information and provides crucial evidence, it is susceptible to malicious alteration. Thus, the authenticity and Integrity of the visual evidence need to be examined before investigation proceedings. In this paper, we propose an Integrity Verification method for surveillance videos. The proposed method combines randomized hashing with elliptic curve cryptography (ECC) for video data Integrity Verification. In the proposed approach, video content of a predefined size (a segment) is randomized with a unique random value, and then a hash algorithm is applied. The hash algorithm uses a random initialization vector generated from a secret key. In addition, the combination of the randomized hash output and the key is encrypted with the ECC encryption algorithm, which ensures additional security of the data. The experimental results obtained from computer simulation and an accident data recorder (ADR)-embedded system show that the proposed method achieves perfect forgery detection for various kinds of tampering such as copy-move, insert, and delete. A complexity analysis based on execution time for videos of different sizes shows a minimal overhead of less than 4% per segment, and the method consumes less memory than the conventional approach that hashes individual frames.
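
    The per-segment randomized keyed hashing step can be sketched as follows: each segment is combined with a fresh random value and hashed under a key-derived initialization vector, and verification recomputes the digest. The ECC encryption of the hash/key combination described in the paper is omitted here, and the segment size and key handling are assumptions for the example.

    ```python
    import hashlib
    import hmac
    import os
    import secrets

    SEGMENT_BYTES = 1 << 20                       # segment size (assumed, 1 MiB)

    def segment_tag(segment: bytes, key: bytes) -> tuple:
        """Randomize the segment with a fresh nonce, then hash it under a key-derived IV."""
        nonce = secrets.token_bytes(16)                               # unique random value per segment
        iv = hmac.new(key, b"iv" + nonce, hashlib.sha256).digest()    # IV generated from the secret key
        return nonce, hashlib.sha256(iv + nonce + segment).hexdigest()

    def verify_segment(segment: bytes, key: bytes, nonce: bytes, stored: str) -> bool:
        iv = hmac.new(key, b"iv" + nonce, hashlib.sha256).digest()
        return hmac.compare_digest(stored, hashlib.sha256(iv + nonce + segment).hexdigest())

    key = secrets.token_bytes(32)
    video = os.urandom(3 * SEGMENT_BYTES)         # stand-in for recorded footage
    tags = [segment_tag(video[off:off + SEGMENT_BYTES], key)
            for off in range(0, len(video), SEGMENT_BYTES)]

    # Copy-move, insert, or delete tampering changes some segment's bytes, so its tag fails.
    seg1 = video[SEGMENT_BYTES:2 * SEGMENT_BYTES]
    print(verify_segment(seg1, key, *tags[1]))            # True: untouched segment
    print(verify_segment(seg1[::-1], key, *tags[1]))      # False: tampered segment
    ```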

  • Using Blockchain for Improved Video Integrity Verification
    IEEE Transactions on Multimedia, 2020
    Co-Authors: Sarala Ghimire, Jae Young Choi, Bumshik Lee
    Abstract:

    A video record plays a crucial role in providing evidence for crime scenes or road accidents. However, the main problem with video records is that they are often vulnerable to various tampering attacks. Although visual evidence must undergo Integrity Verification before an investigation, forgeries remain difficult to detect by human vision alone. In this paper, we propose a novel video Integrity Verification method (IVM) that takes advantage of a blockchain framework. The proposed method applies an effective blockchain model to centralized video data, combining a hash-based message authentication code with elliptic curve cryptography to verify the Integrity of a video. In our method, video content of a predetermined size (a segment) is key-hashed in real time and stored in a chronologically chained fashion, thus establishing an irrefutable database. The Verification process applies the same procedure to the video segment and generates a hash value that can be compared with the hash in the blockchain. The proposed IVM is implemented in a PC environment as well as on an accident data recorder-embedded system for Verification. The experimental results show that the proposed method has better detection capability and robustness against various kinds of tampering, such as copy-move, insert, and delete, compared with other state-of-the-art methods. An analysis of execution time as the number of blocks in the blockchain increases shows that the proposed method incurs minimal overhead.
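
    A hedged sketch of the chaining idea is shown below: each video segment gets a keyed HMAC, and each block hashes the previous block together with the segment's tag, so reordering, inserting, or deleting segments breaks the chain on re-verification. This is a generic illustration rather than the paper's IVM; the segment size and block layout are assumptions, and the ECC component is not modeled.

    ```python
    import hashlib
    import hmac
    import os
    import secrets

    SEGMENT_BYTES = 1 << 20                       # segment size (assumed, 1 MiB)

    def build_chain(segments, key):
        """Key-hash each segment and chain it to the hash of the previous block."""
        chain, prev = [], b"\x00" * 32
        for idx, seg in enumerate(segments):
            tag = hmac.new(key, seg, hashlib.sha256).digest()              # keyed segment MAC
            block_hash = hashlib.sha256(prev + idx.to_bytes(4, "big") + tag).digest()
            chain.append({"idx": idx, "tag": tag, "prev": prev, "hash": block_hash})
            prev = block_hash
        return chain

    def verify(segments, key, chain):
        """Rebuild the chain for the presented video and compare it block by block."""
        rebuilt = build_chain(segments, key)
        if len(rebuilt) != len(chain):            # segments were inserted or deleted
            return False
        return all(hmac.compare_digest(a["hash"], b["hash"])
                   for a, b in zip(rebuilt, chain))

    key = secrets.token_bytes(32)
    video = [os.urandom(SEGMENT_BYTES) for _ in range(4)]   # stand-in footage segments
    chain = build_chain(video, key)

    print(verify(video, key, chain))                        # True: untouched recording
    tampered = [video[0], video[2], video[1], video[3]]     # copy-move style reordering
    print(verify(tampered, key, chain))                     # False: chain mismatch detected
    ```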