Security Parameter

The Experts below are selected from a list of 360 Experts worldwide, ranked by the ideXlab platform.

Ron Steinfeld - One of the best experts on this subject based on the ideXlab platform.

  • GGHLite: More Efficient Multilinear Maps from Ideal Lattices
    Theory and Application of Cryptographic Techniques, 2014
    Co-Authors: Adeline Langlois, Damien Stehlé, Ron Steinfeld
    Abstract:

    The GGH Graded Encoding Scheme [9], based on ideal lattices, is the first plausible approximation to a cryptographic multilinear map. Unfortunately, using the security analysis in [9], the scheme requires very large parameters to provide security for its underlying “encoding re-randomization” process. Our main contributions are to formalize, simplify and improve the efficiency and the security analysis of the re-randomization process in the GGH construction. This results in a new construction that we call GGHLite. In particular, we first lower the size of a standard deviation parameter of the re-randomization process of [9] from exponential to polynomial in the security parameter. This first improvement is obtained via a finer security analysis of the “drowning” step of re-randomization, in which we apply the Rényi divergence instead of the conventional statistical distance as a measure of distance between distributions. Our second improvement is to reduce the number of randomizers needed from $\Omega(n \log n)$ to 2, where n is the dimension of the underlying ideal lattices. These two contributions allow us to decrease the bit size of the public parameters from $O(\lambda^5 \log \lambda)$ for the GGH scheme to $O(\lambda \log^2 \lambda)$ in GGHLite, with respect to the security parameter λ (for a constant multilinearity parameter κ).
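
    For reference (this is the standard definition, not something specific to this paper), one common normalization of the Rényi divergence of order $a > 1$ between discrete distributions $P$ and $Q$ with $\mathrm{Supp}(P) \subseteq \mathrm{Supp}(Q)$ is

    $$ R_a(P \,\|\, Q) \;=\; \Bigl(\sum_{x} \frac{P(x)^a}{Q(x)^{a-1}}\Bigr)^{\frac{1}{a-1}}, \qquad \text{compared with the statistical distance} \qquad \Delta(P, Q) \;=\; \tfrac{1}{2}\sum_{x} \bigl|P(x) - Q(x)\bigr|. $$

    Roughly speaking, statistical-distance arguments need $\Delta$ to be negligible, whereas Rényi-divergence arguments can tolerate $R_a$ being merely bounded, which is what permits the much smaller "drowning" standard deviation quoted above.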

  • Faster Fully Homomorphic Encryption
    International Conference on the Theory and Application of Cryptology and Information Security, 2010
    Co-Authors: Damien Stehlé, Ron Steinfeld
    Abstract:

    We describe two improvements to Gentry’s fully homomorphic scheme based on ideal lattices and its analysis: we provide a more aggressive analysis of one of the hardness assumptions (the one related to the Sparse Subset Sum Problem) and we introduce a probabilistic decryption algorithm that can be implemented with an algebraic circuit of low multiplicative degree. Combined, these improvements lead to a faster fully homomorphic scheme, with a $\widetilde{O}(\lambda^{3.5})$ bit complexity per elementary binary add/mult gate, where λ is the security parameter. These improvements also apply to the fully homomorphic schemes of Smart and Vercauteren [PKC’2010] and van Dijk et al. [Eurocrypt’2010].
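
    The "algebraic circuit of low multiplicative degree" mentioned above is measured as the degree, in the circuit inputs, of the polynomial the circuit computes; keeping this degree low is what keeps homomorphic evaluation of the decryption circuit cheap. The toy Python degree tracker below (purely illustrative bookkeeping with hypothetical names such as Wire, add and mul; it is unrelated to the paper's actual decryption algorithm) makes the quantity concrete.

```python
# Toy bookkeeping for the multiplicative degree of an algebraic circuit:
# additions keep the maximum degree of their inputs, multiplications add degrees.
# This only illustrates the quantity "multiplicative degree" mentioned above;
# it has nothing to do with the paper's actual decryption circuit.
from dataclasses import dataclass

@dataclass
class Wire:
    degree: int                      # degree as a polynomial in the circuit inputs

def inp():
    return Wire(1)

def add(a, b):
    return Wire(max(a.degree, b.degree))

def mul(a, b):
    return Wire(a.degree + b.degree)

# Example: (x1 * x2 + x3) * x4 has multiplicative degree 3.
x1, x2, x3, x4 = inp(), inp(), inp(), inp()
out = mul(add(mul(x1, x2), x3), x4)
print(out.degree)   # 3
```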

  • Efficient Public Key Encryption Based on Ideal Lattices
    International Conference on the Theory and Application of Cryptology and Information Security, 2009
    Co-Authors: Damien Stehlé, Ron Steinfeld, Keisuke Tanaka, Keita Xagawa
    Abstract:

    We describe public key encryption schemes with security provably based on the worst-case hardness of the approximate Shortest Vector Problem in some structured lattices, called ideal lattices. Under the assumption that the latter is exponentially hard to solve even with a quantum computer, we achieve CPA security against subexponential attacks, with (quasi-)optimal asymptotic performance: if n is the security parameter, both keys are of bit-length ${\widetilde{O}}(n)$ and the amortized costs of both encryption and decryption are ${\widetilde{O}}(1)$ per message bit. Our construction adapts the trapdoor one-way function of Gentry et al. (STOC'08), based on the Learning With Errors problem, to structured lattices. Our main technical tools are an adaptation of Ajtai's trapdoor key generation algorithm (ICALP'99) and a re-interpretation of Regev's quantum reduction between the Bounded Distance Decoding problem and sampling short lattice vectors.
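
    The abstract does not reproduce the scheme itself. As a rough, generic illustration of encryption over structured (ideal) lattices of the kind the paper builds on, the toy Ring-LWE-style bit encryption below is a sketch under assumed toy parameters (n = 16, q = 7681) with hypothetical helper names (polymul, small); it is not the paper's construction and is far too small to be secure.

```python
# A toy Ring-LWE-style bit encryption over Z_q[x]/(x^n + 1), meant only as a
# generic illustration of encryption based on structured/ideal lattices; it is
# NOT the scheme of the paper above, and the parameters are insecure toys.
import numpy as np

n, q = 16, 7681                     # ring dimension and modulus (toy values)
rng = np.random.default_rng(0)

def polymul(a, b):
    """Multiply in Z_q[x]/(x^n + 1) (negacyclic convolution)."""
    full = np.convolve(a, b)
    res = np.zeros(n, dtype=np.int64)
    for i, c in enumerate(full):
        if i < n:
            res[i] += c
        else:
            res[i - n] -= c         # reduce using x^n = -1
    return res % q

def small():
    return rng.integers(-1, 2, n)   # ternary "small" noise/secret polynomial

# Key generation: public key (a, b = a*s + e), secret key s
a = rng.integers(0, q, n)
s, e = small(), small()
b = (polymul(a, s) + e) % q

def encrypt(bit):
    m = np.zeros(n, dtype=np.int64)
    m[0] = bit * (q // 2)           # encode the bit in the constant coefficient
    r, e1, e2 = small(), small(), small()
    u = (polymul(a, r) + e1) % q
    v = (polymul(b, r) + e2 + m) % q
    return u, v

def decrypt(u, v):
    m = (v - polymul(u, s)) % q
    # the constant coefficient is near q/2 iff the encrypted bit was 1
    c = int(m[0])
    return 1 if q // 4 < c < 3 * q // 4 else 0

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("toy encryption/decryption round-trips")
```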

Jesper Buus Nielsen - One of the best experts on this subject based on the ideXlab platform.

  • A Tamper and Leakage Resilient von Neumann Architecture
    Public Key Cryptography, 2015
    Co-Authors: Sebastian Faust, Pratyay Mukherjee, Jesper Buus Nielsen, Daniele Venturi
    Abstract:

    We present a universal framework for tamper and leakage resilient computation on a random access machine (RAM). The RAM has one CPU that accesses a storage, which we call the disk. The disk is subject to leakage and tampering. So is the bus connecting the CPU to the disk. We assume that the CPU is leakage- and tamper-free. For a fixed value of the security parameter, the CPU has constant size. Therefore the code of the program to be executed is stored on the disk, i.e., we consider a von Neumann architecture. The most prominent consequence of this is that the code of the program executed will be subject to tampering.
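
    As a toy illustration of this execution model only (not the paper's construction; the names Disk, tamper and run below are hypothetical), the sketch shows a small trusted CPU fetching both instructions and data from an untrusted disk, so tampering with the disk can rewrite the program itself.

```python
# Toy illustration of the von Neumann RAM model described above: the CPU is
# small and trusted, while both the program and the data live on a disk the
# adversary may tamper with between CPU steps. All names are hypothetical.

class Disk:
    """Untrusted storage holding code and data; the adversary can overwrite cells."""
    def __init__(self, program, data):
        self.cells = list(program) + list(data)

    def read(self, addr):
        return self.cells[addr]

    def write(self, addr, value):
        self.cells[addr] = value

    def tamper(self, addr, value):
        # Models adversarial tampering; note it can hit code as well as data.
        self.cells[addr] = value


def run(disk, code_len, steps=100):
    """A trusted, constant-size CPU: fetches every instruction from the disk."""
    pc, acc = 0, 0
    for _ in range(steps):
        if pc >= code_len:
            break
        op, arg = disk.read(pc)          # instruction fetch goes over the bus
        if op == "LOAD":
            acc = disk.read(arg)
        elif op == "ADD":
            acc += disk.read(arg)
        elif op == "STORE":
            disk.write(arg, acc)
        pc += 1
    return acc


# Without protection, tampering with the *code* changes the computation:
program = [("LOAD", 3), ("ADD", 4), ("STORE", 5)]
disk = Disk(program, [10, 20, 0])
print(run(disk, len(program)))           # 30
disk.tamper(1, ("ADD", 3))               # adversary rewrites an instruction
print(run(disk, len(program)))           # 20, computed by the tampered program
```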

  • Robust Multiparty Computation with Linear Communication Complexity
    International Cryptology Conference, 2006
    Co-Authors: Martin Hirt, Jesper Buus Nielsen
    Abstract:

    We present a robust multiparty computation protocol. The protocol is for the cryptographic model with open channels and a poly-time adversary, and allows n parties to securely evaluate any poly-sized circuit in the presence of an active adversary, with resilience t < n/2. The total communication complexity in bits over the point-to-point channels is ${\mathcal{O}}(S n \kappa + n {\mathcal{BC}})$, where S is the size of the circuit being securely evaluated, κ is the security parameter and ${\mathcal{BC}}$ is the communication complexity of one broadcast of a κ-bit value. This means the average number of bits sent and received by a single party is ${\mathcal{O}}(S \kappa + {\mathcal{BC}})$, which is almost independent of the number of participating parties. This is the first robust multiparty computation protocol with this property.
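
    The per-party figure quoted above is simply the protocol's total communication shared among the n parties,

    $$ \frac{{\mathcal{O}}(S n \kappa + n\,{\mathcal{BC}})}{n} \;=\; {\mathcal{O}}(S \kappa + {\mathcal{BC}}), $$

    and the residual dependence on n (the "almost" in "almost independent") sits inside the single ${\mathcal{BC}}$ term.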

  • Upper Bounds on the Communication Complexity of Optimally Resilient Cryptographic Multiparty Computation
    International Conference on the Theory and Application of Cryptology and Information Security, 2005
    Co-Authors: Martin Hirt, Jesper Buus Nielsen
    Abstract:

    We give improved upper bounds on the communication complexity of optimally-resilient secure multiparty computation in the cryptographic model. We consider evaluating an n-party randomized function f and show that if f can be computed by a circuit of size c, then $\mathcal{O}(cn^2\kappa)$ is an upper bound for active security with optimal resilience t < n/2 and security parameter κ. This improves on the communication complexity of previous protocols by a factor of at least n. This improvement comes from the fact that in the new protocol, only $\mathcal{O}(n)$ messages (of size $\mathcal{O}(\kappa)$ each) are broadcast during the whole protocol execution, in contrast to previous protocols which require at least $\mathcal{O}(n)$ broadcasts per gate. Furthermore, we improve the upper bound on the communication complexity of passive secure multiparty computation with resilience t
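
    The broadcast savings claimed above can be summarized as follows: for a circuit with c gates, earlier protocols need

    $$ c \cdot \mathcal{O}(n) \;=\; \mathcal{O}(cn) \ \text{broadcasts in total,} $$

    whereas the new protocol uses only $\mathcal{O}(n)$ broadcasts over the entire execution, independently of the circuit size.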

  • Multiparty Computation from Threshold Homomorphic Encryption
    BRICS Report Series, 2000
    Co-Authors: Ronald Cramer, Ivan Damgård, Jesper Buus Nielsen
    Abstract:

    We introduce a new approach to multiparty computation (MPC), basing it on homomorphic threshold cryptosystems. We show that given keys for any sufficiently efficient system of this type, general MPC protocols for n players can be devised which are secure against an active adversary that corrupts any minority of the players. The total number of bits sent is O(nk|C|), where k is the security parameter and |C| is the size of a (Boolean) circuit computing the function to be securely evaluated. An earlier proposal by Franklin and Haber with the same complexity was only secure for passive adversaries, while all earlier protocols with active security had complexity at least quadratic in n. We give two examples of threshold cryptosystems that can support our construction and lead to the claimed complexities.
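
    The construction assumes a threshold cryptosystem that is homomorphic. As a minimal illustration of the homomorphic property such a system must provide, here is a toy, non-threshold, deliberately insecure Paillier-style encryption in Python (tiny parameters chosen only for demonstration; the paper's actual instantiations and their threshold decryption protocols are not reproduced here).

```python
# A minimal, insecure toy of an additively homomorphic cryptosystem
# (Paillier-style), meant only to illustrate the property
# "E(a) * E(b) decrypts to a + b". Real instantiations would use large primes
# and a *threshold* decryption protocol, which this sketch does not implement.
import math
import random

p, q = 1009, 1013                  # toy primes; wildly insecure, illustration only
n = p * q
n2 = n * n
g = n + 1                          # standard Paillier choice of generator
lam = math.lcm(p - 1, q - 1)       # Carmichael function of n

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 37, 55
ca, cb = encrypt(a), encrypt(b)
assert decrypt((ca * cb) % n2) == (a + b) % n   # homomorphic addition
print("E(a)*E(b) decrypts to", decrypt((ca * cb) % n2))
```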

  • Multiparty Computation from Threshold Homomorphic Encryption
    IACR Cryptology ePrint Archive, 2000
    Co-Authors: Ronald Cramer, Ivan Damgård, Jesper Buus Nielsen
    Abstract:

    We introduce a new approach to multiparty computation (MPC), basing it on homomorphic threshold cryptosystems. We show that given keys for any sufficiently efficient system of this type, general MPC protocols for n parties can be devised which are secure against an active adversary that corrupts any minority of the parties. The total number of bits broadcast is O(nk|C|), where k is the security parameter and |C| is the size of a (Boolean) circuit computing the function to be securely evaluated. An earlier proposal by Franklin and Haber with the same complexity was only secure for passive adversaries, while all earlier protocols with active security had complexity at least quadratic in n. We give two examples of threshold cryptosystems that can support our construction and lead to the claimed complexities.

Eshan Chattopadhyay - One of the best experts on this subject based on the ideXlab platform.

  • Explicit Non-Malleable Extractors, Multi-Source Extractors, and Almost Optimal Privacy Amplification Protocols
    arXiv: Cryptography and Security, 2016
    Co-Authors: Eshan Chattopadhyay
    Abstract:

    We make progress in the following three problems: 1. Constructing optimal seeded non-malleable extractors; 2. Constructing optimal privacy amplification protocols with an active adversary, for any security parameter; 3. Constructing extractors for independent weak random sources, when the min-entropy is extremely small (i.e., near logarithmic). For the first two problems, the best known non-malleable extractors by Chattopadhyay, Goyal and Li [CGL16], and by Cohen [Coh16a, Coh16b] all require seed length and min-entropy at least $\log^2 (1/\epsilon)$, where $\epsilon$ is the error of the extractor. As a result, the best known explicit privacy amplification protocols with an active adversary, which achieve 2 rounds of communication and optimal entropy loss in [Li15c, CGL16], can only handle security parameter up to $s=\Omega(\sqrt{k})$, where $k$ is the min-entropy of the shared secret weak random source. For larger $s$ the best known protocol with optimal entropy loss in [Li15c] requires $O(s/\sqrt{k})$ rounds of communication. In this paper we give an explicit non-malleable extractor that only requires seed length and min-entropy $\log^{1+o(1)} (n/\epsilon)$, which also yields a 2-round privacy amplification protocol with optimal entropy loss for security parameter up to $s=k^{1-\alpha}$ for any constant $\alpha>0$. For the third problem, previously the best known extractor supporting the smallest min-entropy, due to Li [Li13a], requires min-entropy $\log^{2+\delta} n$ and uses $O(1/\delta)$ sources, for any constant $\delta>0$. A very recent result by Cohen and Schulman [CS16] improves this, and constructs explicit extractors that use $O(1/\delta)$ sources for min-entropy $\log^{1+\delta} n$, for any constant $\delta>0$. In this paper we further improve their result, and give an explicit extractor that uses $O(1)$ (an absolute constant) sources for min-entropy $\log^{1+o(1)} n$.
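
    For context (this is the standard notion introduced by Dodis and Wichs, stated in its usual form rather than in this paper's exact conventions), a seeded non-malleable extractor $\mathrm{nmExt}:\{0,1\}^n \times \{0,1\}^d \to \{0,1\}^m$ is $(k,\epsilon)$ non-malleable if, for every source $X$ with min-entropy at least $k$ and every seed-tampering function $f:\{0,1\}^d \to \{0,1\}^d$ with no fixed points,

    $$ \bigl(\mathrm{nmExt}(X, U_d),\ \mathrm{nmExt}(X, f(U_d)),\ U_d\bigr) \;\approx_{\epsilon}\; \bigl(U_m,\ \mathrm{nmExt}(X, f(U_d)),\ U_d\bigr), $$

    i.e., the output on the honest seed looks uniform even given the output on the tampered seed.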

  • Explicit Non-Malleable Extractors, Multi-Source Extractors, and Almost Optimal Privacy Amplification Protocols
    Foundations of Computer Science, 2016
    Co-Authors: Eshan Chattopadhyay
    Abstract:

    We make progress in the following three problems: 1. Constructing optimal seeded non-malleable extractors; 2. Constructing optimal privacy amplification protocols with an active adversary, for any possible security parameter; 3. Constructing extractors for independent weak random sources, when the min-entropy is extremely small (i.e., near logarithmic). For the first two problems, the best known non-malleable extractors by Chattopadhyay, Goyal and Li, and by Cohen all require seed length and min-entropy with quadratic loss in parameters. As a result, the best known explicit privacy amplification protocols with an active adversary, which achieve two rounds of communication and optimal entropy loss, were sub-optimal in the min-entropy of the source. In this paper we give an explicit non-malleable extractor that works for nearly optimal seed length and min-entropy, and yields a two-round privacy amplification protocol with optimal entropy loss for almost all ranges of the security parameter. For the third problem, we improve upon a very recent result by Cohen and Schulman and give an explicit extractor that uses an absolute constant number of sources, each with almost logarithmic min-entropy. The key ingredient in all our constructions is a generalized, and much more efficient, version of the independence preserving merger introduced by Cohen, which we call a non-malleable independence preserving merger. Our construction of the merger also simplifies that of Cohen and Schulman, and may be of independent interest.

Willy Susilo - One of the best experts on this subject based on the ideXlab platform.

  • Generalized Public-Key Cryptography with Tight Security
    Information Sciences, 2019
    Co-Authors: Fuchun Guo, Willy Susilo
    Abstract:

    Tightly secure public-key cryptographic schemes enjoy the advantage that the security parameter can be selected optimally to achieve a given security level. Security models in the multi-user setting with corruptions (MU-C) capture more realistic threats in practice, and many efforts have been devoted to constructing tightly MU-C secure schemes; to date, we have many concrete constructions. Nevertheless, a general study of how to achieve tight security in public-key cryptography has been lacking. In this paper, we take a closer look at key generation in public-key cryptography. We first generalize the key generation algorithms of traditional schemes and discuss the requirements for achieving tight security. We notice that for some schemes (e.g. key-unique schemes), these requirements inherently cannot be satisfied, and hence these schemes cannot achieve tight security. This is in accordance with the impossibility results on tight reductions by Bader et al. (EUROCRYPT 2016). To further study possible constructions, we extend the key generation of public-key cryptographic schemes to obtain a different framework. To demonstrate its applications, we illustrate how to construct tightly secure key-unique schemes under the extended framework. This circumvents the impossibility results of tight security for key-unique schemes.
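
    To see why tightness matters for parameter selection (stated here in generic terms rather than as this paper's formalism): a security reduction typically guarantees a bound of the form

    $$ \mathrm{Adv}^{\mathrm{scheme}}(\mathcal{A}) \;\le\; L \cdot \mathrm{Adv}^{\mathrm{problem}}(\mathcal{B}), $$

    where the loss factor $L$ may grow with the number of users, queries or corruptions. To guarantee roughly $\kappa$ bits of security for the scheme, the security parameter must then be chosen so that the underlying problem offers about $\kappa + \log_2 L$ bits; a tight reduction keeps $L$ a small constant, so parameters can be chosen for the hardness of the underlying problem alone, which is the optimal selection referred to above.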