Protocol Execution

The experts below are selected from a list of 25,365 experts worldwide, ranked by the ideXlab platform.

Daniel Wichs - One of the best experts on this subject based on the ideXlab platform.

  • Leakage-Resilient Public-Key Cryptography in the Bounded-Retrieval Model
    International Cryptology Conference, 2009
    Co-Authors: Joel Alwen, Yevgeniy Dodis, Daniel Wichs
    Abstract:

    We study the design of cryptographic primitives resilient to key-leakage attacks, where an attacker can repeatedly and adaptively learn information about the secret key, subject only to the constraint that the overall amount of such information is bounded by some parameter ℓ. We construct a variety of leakage-resilient public-key systems, including the first known identification schemes (ID), signature schemes, and authenticated key agreement protocols (AKA). Our main result is an efficient three-round AKA in the random-oracle model, which is resilient to key-leakage attacks that can occur prior to and after a protocol execution. Our AKA protocol can be used as an interactive encryption scheme with qualitatively stronger privacy guarantees than non-interactive encryption schemes (constructed in prior and concurrent works), which are inherently insecure if the adversary can perform leakage attacks after seeing a ciphertext. Moreover, our schemes can be flexibly extended to the Bounded-Retrieval Model, allowing us to tolerate a very large absolute amount of adversarial leakage ℓ (potentially many gigabytes of information), only by increasing the size of the secret key and without any other loss of efficiency in communication or computation. Concretely, given any leakage parameter ℓ, security parameter λ, and any desired fraction 0 < δ ≤ 1, our schemes have the following properties: secret key size is ℓ(1 + δ) + O(λ); public key size is O(λ), independent of ℓ; communication complexity is O(λ/δ), independent of ℓ; computation reads O(λ/δ²) locations of the secret key, independent of ℓ. Lastly, we show that our schemes allow for repeated "invisible updates" of the secret key, allowing us to tolerate up to ℓ bits of leakage in between any two updates, and an unlimited amount of leakage overall. These updates require that the parties can securely store a short "master update key" (e.g., on a separate secure device protected against leakage), which is only used for updates and not during protocol execution. The updates are invisible in the sense that a party can update its secret key at any point in time, without modifying the public key or notifying the other users.
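The concrete bounds in the abstract can be made tangible with a small back-of-the-envelope calculation. The sketch below is illustrative only (not from the paper); the hidden constants in the O(·) terms are assumed to be 1, and the example leakage value is an arbitrary choice.

```python
# Illustrative sizes for the stated bounds: secret key ~ l*(1+delta) + O(lambda),
# public key ~ O(lambda), communication ~ O(lambda/delta).
# All O(.) constants are assumed to be 1 for this sketch.

def scheme_sizes(leak_bits: int, sec_param: int, delta: float) -> dict:
    """Approximate sizes in bits for leakage bound `leak_bits`,
    security parameter `sec_param`, and fraction 0 < delta <= 1."""
    assert 0 < delta <= 1
    return {
        "secret_key": int(leak_bits * (1 + delta)) + sec_param,
        "public_key": sec_param,                  # independent of leak_bits
        "communication": int(sec_param / delta),  # independent of leak_bits
    }

# Example: tolerate ~1 GB (8e9 bits) of leakage at lambda = 128, delta = 0.5.
sizes = scheme_sizes(8 * 10**9, 128, 0.5)
print(sizes["secret_key"])     # grows with the leakage bound (~1.5x)
print(sizes["public_key"])     # stays 128 bits regardless of leakage bound
```

The point of the calculation is the asymmetry the abstract emphasizes: only the secret key scales with ℓ, while the public key and communication depend only on λ and δ.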

  • Leakage-Resilient Public-Key Cryptography in the Bounded-Retrieval Model
    IACR Cryptology ePrint Archive, 2009
    Co-Authors: Joel Alwen, Yevgeniy Dodis, Daniel Wichs
    Abstract:

    We study the design of cryptographic primitives resilient to key-leakage attacks, where an attacker can repeatedly and adaptively learn information about the secret key, subject only to the constraint that the overall amount of such information is bounded by some parameter ℓ. We construct a variety of leakage-resilient public-key systems, including the first known identification schemes (ID), signature schemes, and authenticated key agreement protocols (AKA). Our main result is an efficient three-round leakage-resilient AKA in the random-oracle model. This protocol ensures that session keys are private and authentic even if (1) the adversary leaks a large fraction of the long-term secret keys of both users prior to the protocol execution and (2) the adversary completely learns the long-term secret keys after the protocol execution. In particular, our AKA protocol provides qualitatively stronger privacy guarantees than leakage-resilient public-key encryption schemes (constructed in prior and concurrent works), since such schemes necessarily become insecure if the adversary can perform leakage attacks after seeing a ciphertext. Moreover, our schemes can be flexibly extended to the Bounded-Retrieval Model, allowing us to tolerate a very large absolute amount of adversarial leakage ℓ (potentially many gigabytes of information), only by increasing the size of the secret key and without any other loss of efficiency in communication or computation. Concretely, given any leakage parameter ℓ, security parameter λ, and any desired fraction 0 < δ ≤ 1, our schemes have the following properties: secret key size is ℓ(1 + δ) + O(λ), so the attacker can learn approximately a (1 − δ) fraction of the secret key; public key size is O(λ), independent of ℓ; communication complexity is O(λ/δ), independent of ℓ; all computation reads at most O(λ/δ) locations of the secret key, independent of ℓ.
    Lastly, we show that our schemes allow for repeated "invisible updates" of the secret key, allowing us to tolerate up to ℓ bits of leakage in between any two updates, and an unlimited amount of leakage overall. These updates require that the parties can securely store a short "master update key" (e.g., on a separate secure device protected against leakage), which is only used for updates and not during protocol execution. The updates are invisible in the sense that a party can update its secret key at any point in time, without modifying the public key or notifying the other users.
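The "invisible update" idea, updating a secret key without changing the public key, can be illustrated with a toy model. The sketch below is NOT the paper's construction: it simply stores a secret as additive shares modulo a prime and re-randomizes the shares, so their sum (and hence anything derived from it) is unchanged. The modulus and the number of shares are arbitrary assumptions for the illustration.

```python
import secrets

# Toy illustration (NOT the paper's scheme) of an invisible update:
# store the secret as many additive shares mod a prime; an update
# re-randomizes the shares without changing their sum, so a public key
# derived from the sum is untouched and no other party need be notified.

P = 2**127 - 1  # a Mersenne prime chosen as the share modulus (assumption)

def share(secret: int, n: int) -> list[int]:
    """Split `secret` into n additive shares mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def invisible_update(shares: list[int]) -> list[int]:
    """Re-randomize shares; the sum (and hence the public key) is unchanged."""
    masks = [secrets.randbelow(P) for _ in range(len(shares) - 1)]
    masks.append((-sum(masks)) % P)  # masks sum to zero mod P
    return [(s + m) % P for s, m in zip(shares, masks)]

sk = share(123456789, n=8)
sk2 = invisible_update(sk)
assert sum(sk) % P == sum(sk2) % P == 123456789  # same key, fresh shares
```

Leakage gathered before an update tells the adversary little about the fresh shares afterward, which is the intuition behind tolerating ℓ bits between any two updates.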

Phillip J Schulte - One of the best experts on this subject based on the ideXlab platform.

  • Influence of Clinical Trial Site Enrollment on Patient Characteristics, Protocol Completion, and End Points: Insights From the ASCEND-HF Trial (Acute Study of Clinical Effectiveness of Nesiritide in Decompensated Heart Failure)
    Circulation-heart Failure, 2016
    Co-Authors: Stephen J Greene, Adrian F Hernandez, Jie Lena Sun, Marco Metra, Javed Butler, Andrew P Ambrosy, Justin A Ezekowitz, Randall C Starling, John R Teerlink, Phillip J Schulte
    Abstract:

    Background— Most international acute heart failure trials have failed to show benefit with respect to key end points. The impact of site enrollment and protocol execution on trial performance is unclear.
    Methods and Results— We assessed the impact of varying site enrollment volume among all 7141 acute heart failure patients from the ASCEND-HF trial (Acute Study of Clinical Effectiveness of Nesiritide in Decompensated Heart Failure). Overall, 398 sites enrolled ≥1 patient, and median enrollment was 12 patients (interquartile range, 5–23). Patients from high-enrolling sites (>60 patients/site) tended to have lower ejection fraction, worse New York Heart Association functional class, and lower utilization of guideline-directed medical therapy, but fewer comorbidities and lower B-type natriuretic peptide levels. Every 10-patient increase (up to 100 patients) in site enrollment correlated with a lower likelihood of protocol noncompletion (odds ratio, 0.93; 95% confidence interval [CI], 0.89–0.98). After adjustment, increasing site enrollment predicted a higher risk of persistent dyspnea at 6 hours (per 10-patient increase: odds ratio, 1.02; 95% CI, 1.01–1.03) but not at 24 hours (odds ratio, 0.99; 95% CI, 0.98–1.00). Higher site enrollment was independently associated with a lower risk of 30-day death or rehospitalization (per 10-patient increase: odds ratio, 0.98; 95% CI, 0.96–0.99) but not 180-day mortality (hazard ratio, 0.99; 95% CI, 0.98–1.01). The influence of increasing site enrollment on clinical end points varied across geographic regions, with the strongest associations in Latin America and Asia-Pacific (all interaction P
    Conclusions— In this large acute heart failure trial, site enrollment correlated with protocol completion and was independently associated with trial end points. Individual and regional site performance present challenges to be considered in the design of future acute heart failure trials.
    Clinical Trial Registration— URL: http://www.clinicaltrials.gov. Unique identifier: NCT00475852.
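The per-10-patient odds ratios reported above compound multiplicatively on the log-odds scale. The arithmetic sketch below is illustrative only (it is not from the paper and assumes the reported association is log-linear across the enrollment range), extrapolating the OR of 0.93 per 10-patient increase for protocol noncompletion:

```python
# Per-10-patient odds ratios compound multiplicatively: an extra
# Delta patients implies OR_per_10 ** (Delta / 10), assuming the
# association is log-linear (an assumption, not a reported result).

def compounded_or(or_per_10: float, extra_patients: int) -> float:
    """Odds ratio implied by `extra_patients` additional enrollees."""
    return or_per_10 ** (extra_patients / 10)

# Going from a low-enrolling to a much higher-enrolling site (+50 patients):
print(round(compounded_or(0.93, 50), 3))  # ≈ 0.696
```

So under this (assumed) log-linear reading, 50 extra enrollees would correspond to roughly 30% lower odds of protocol noncompletion, which is why a seemingly modest per-10-patient OR can matter across the observed enrollment range.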

Joel Alwen - One of the best experts on this subject based on the ideXlab platform.

  • Leakage-Resilient Public-Key Cryptography in the Bounded-Retrieval Model
    International Cryptology Conference, 2009
    Co-Authors: Joel Alwen, Yevgeniy Dodis, Daniel Wichs
    Abstract:

    We study the design of cryptographic primitives resilient to key-leakage attacks, where an attacker can repeatedly and adaptively learn information about the secret key, subject only to the constraint that the overall amount of such information is bounded by some parameter ℓ. We construct a variety of leakage-resilient public-key systems, including the first known identification schemes (ID), signature schemes, and authenticated key agreement protocols (AKA). Our main result is an efficient three-round AKA in the random-oracle model, which is resilient to key-leakage attacks that can occur prior to and after a protocol execution. Our AKA protocol can be used as an interactive encryption scheme with qualitatively stronger privacy guarantees than non-interactive encryption schemes (constructed in prior and concurrent works), which are inherently insecure if the adversary can perform leakage attacks after seeing a ciphertext. Moreover, our schemes can be flexibly extended to the Bounded-Retrieval Model, allowing us to tolerate a very large absolute amount of adversarial leakage ℓ (potentially many gigabytes of information), only by increasing the size of the secret key and without any other loss of efficiency in communication or computation. Concretely, given any leakage parameter ℓ, security parameter λ, and any desired fraction 0 < δ ≤ 1, our schemes have the following properties: secret key size is ℓ(1 + δ) + O(λ); public key size is O(λ), independent of ℓ; communication complexity is O(λ/δ), independent of ℓ; computation reads O(λ/δ²) locations of the secret key, independent of ℓ. Lastly, we show that our schemes allow for repeated "invisible updates" of the secret key, allowing us to tolerate up to ℓ bits of leakage in between any two updates, and an unlimited amount of leakage overall. These updates require that the parties can securely store a short "master update key" (e.g., on a separate secure device protected against leakage), which is only used for updates and not during protocol execution. The updates are invisible in the sense that a party can update its secret key at any point in time, without modifying the public key or notifying the other users.

  • Leakage-Resilient Public-Key Cryptography in the Bounded-Retrieval Model
    IACR Cryptology ePrint Archive, 2009
    Co-Authors: Joel Alwen, Yevgeniy Dodis, Daniel Wichs
    Abstract:

    We study the design of cryptographic primitives resilient to key-leakage attacks, where an attacker can repeatedly and adaptively learn information about the secret key, subject only to the constraint that the overall amount of such information is bounded by some parameter ℓ. We construct a variety of leakage-resilient public-key systems, including the first known identification schemes (ID), signature schemes, and authenticated key agreement protocols (AKA). Our main result is an efficient three-round leakage-resilient AKA in the random-oracle model. This protocol ensures that session keys are private and authentic even if (1) the adversary leaks a large fraction of the long-term secret keys of both users prior to the protocol execution and (2) the adversary completely learns the long-term secret keys after the protocol execution. In particular, our AKA protocol provides qualitatively stronger privacy guarantees than leakage-resilient public-key encryption schemes (constructed in prior and concurrent works), since such schemes necessarily become insecure if the adversary can perform leakage attacks after seeing a ciphertext. Moreover, our schemes can be flexibly extended to the Bounded-Retrieval Model, allowing us to tolerate a very large absolute amount of adversarial leakage ℓ (potentially many gigabytes of information), only by increasing the size of the secret key and without any other loss of efficiency in communication or computation. Concretely, given any leakage parameter ℓ, security parameter λ, and any desired fraction 0 < δ ≤ 1, our schemes have the following properties: secret key size is ℓ(1 + δ) + O(λ), so the attacker can learn approximately a (1 − δ) fraction of the secret key; public key size is O(λ), independent of ℓ; communication complexity is O(λ/δ), independent of ℓ; all computation reads at most O(λ/δ) locations of the secret key, independent of ℓ.
    Lastly, we show that our schemes allow for repeated "invisible updates" of the secret key, allowing us to tolerate up to ℓ bits of leakage in between any two updates, and an unlimited amount of leakage overall. These updates require that the parties can securely store a short "master update key" (e.g., on a separate secure device protected against leakage), which is only used for updates and not during protocol execution. The updates are invisible in the sense that a party can update its secret key at any point in time, without modifying the public key or notifying the other users.

K A Patel - One of the best experts on this subject based on the ideXlab platform.

  • Coexistence of High-Bit-Rate Quantum Key Distribution and Data on Optical Fiber
    arXiv: Quantum Physics, 2012
    Co-Authors: K A Patel, J F Dynes, Iris Choi, A W Sharpe, A R Dixon, Z L Yuan, R V Penty, A J Shields
    Abstract:

    Quantum key distribution (QKD) uniquely allows the distribution of cryptographic keys with security verified by quantum mechanical limits. Both protocol execution and subsequent applications require the assistance of classical data communication channels. While using separate fibers is one option, it is economically more viable if data and quantum signals are simultaneously transmitted through a single fiber. However, noise-photon contamination arising from the intense data signal has severely restricted both the QKD distances and secure key rates. Here, we exploit a novel temporal-filtering effect for noise-photon rejection. This allows high-bit-rate QKD over fibers up to 90 km in length and populated with error-free bidirectional Gb/s data communications. With a bit rate and range sufficient for important information infrastructures, such as smart cities and 10-Gbit Ethernet, QKD is a significant step closer toward wide-scale deployment in fiber networks.
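The temporal-filtering idea can be sketched with a toy model (this is an illustration, not the experiment's analysis): noise photons, such as Raman scattering from the classical data channel, arrive roughly uniformly in time, while QKD signal photons arrive in known narrow time slots, so gating the detector rejects noise in proportion to the duty cycle. The gate and clock-period values below are assumed example numbers.

```python
import random

# Toy model of temporal filtering: noise photons arrive uniformly within
# each clock period, so a detector gated to a window of width `gate_ns`
# inside a period of width `period_ns` accepts only a fraction
# ~ gate_ns / period_ns of the noise, while synchronized signal photons
# (arriving inside the gate) are kept.

def accepted_noise_fraction(gate_ns: float, period_ns: float,
                            n_photons: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of the fraction of uniform noise photons
    that fall inside the detection gate."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_photons)
               if rng.uniform(0.0, period_ns) < gate_ns)
    return hits / n_photons

# Example: a 50 ps gate in a 1 ns clock period rejects ~95% of the noise.
frac = accepted_noise_fraction(gate_ns=0.05, period_ns=1.0)
print(frac)  # close to the duty cycle, 0.05
```

Narrowing the gate (or shortening it relative to the clock period) is what lets the quantum channel survive alongside intense classical traffic on the same fiber.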

  • Coexistence of High-Bit-Rate Quantum Key Distribution and Data on Optical Fiber
    Physical Review X, 2012
    Co-Authors: K A Patel, J F Dynes, Iris Choi, A W Sharpe, A R Dixon, Z L Yuan, R V Penty, A J Shields
    Abstract:

    Quantum key distribution (QKD) uniquely allows the distribution of cryptographic keys with security verified by quantum mechanical limits. Both protocol execution and subsequent applications require the assistance of classical data communication channels. While using separate fibers is one option, it is economically more viable if data and quantum signals are simultaneously transmitted through a single fiber. However, noise-photon contamination arising from the intense data signal has severely restricted both the QKD distances and secure key rates. Here, we exploit a novel temporal-filtering effect for noise-photon rejection. This allows high-bit-rate QKD over fibers up to 90 km in length and populated with error-free bidirectional Gb/s data communications. With a bit rate and range sufficient for important information infrastructures, such as smart cities and 10-Gbit Ethernet, QKD is a significant step closer toward wide-scale deployment in fiber networks.

Stephen J Greene - One of the best experts on this subject based on the ideXlab platform.

  • Influence of Clinical Trial Site Enrollment on Patient Characteristics, Protocol Completion, and End Points: Insights From the ASCEND-HF Trial (Acute Study of Clinical Effectiveness of Nesiritide in Decompensated Heart Failure)
    Circulation-heart Failure, 2016
    Co-Authors: Stephen J Greene, Adrian F Hernandez, Jie Lena Sun, Marco Metra, Javed Butler, Andrew P Ambrosy, Justin A Ezekowitz, Randall C Starling, John R Teerlink, Phillip J Schulte
    Abstract:

    Background— Most international acute heart failure trials have failed to show benefit with respect to key end points. The impact of site enrollment and protocol execution on trial performance is unclear.
    Methods and Results— We assessed the impact of varying site enrollment volume among all 7141 acute heart failure patients from the ASCEND-HF trial (Acute Study of Clinical Effectiveness of Nesiritide in Decompensated Heart Failure). Overall, 398 sites enrolled ≥1 patient, and median enrollment was 12 patients (interquartile range, 5–23). Patients from high-enrolling sites (>60 patients/site) tended to have lower ejection fraction, worse New York Heart Association functional class, and lower utilization of guideline-directed medical therapy, but fewer comorbidities and lower B-type natriuretic peptide levels. Every 10-patient increase (up to 100 patients) in site enrollment correlated with a lower likelihood of protocol noncompletion (odds ratio, 0.93; 95% confidence interval [CI], 0.89–0.98). After adjustment, increasing site enrollment predicted a higher risk of persistent dyspnea at 6 hours (per 10-patient increase: odds ratio, 1.02; 95% CI, 1.01–1.03) but not at 24 hours (odds ratio, 0.99; 95% CI, 0.98–1.00). Higher site enrollment was independently associated with a lower risk of 30-day death or rehospitalization (per 10-patient increase: odds ratio, 0.98; 95% CI, 0.96–0.99) but not 180-day mortality (hazard ratio, 0.99; 95% CI, 0.98–1.01). The influence of increasing site enrollment on clinical end points varied across geographic regions, with the strongest associations in Latin America and Asia-Pacific (all interaction P
    Conclusions— In this large acute heart failure trial, site enrollment correlated with protocol completion and was independently associated with trial end points. Individual and regional site performance present challenges to be considered in the design of future acute heart failure trials.
    Clinical Trial Registration— URL: http://www.clinicaltrials.gov. Unique identifier: NCT00475852.