Jelle Don, Serge Fehr, Yu-Hsuan Huang, Jyun-Jie Liao and Patrick Struck
Hide-and-Seek and the Non-Resignability of the BUFF Transform
To appear in Theory of Cryptography Conference - TCC 2024, Lecture Notes in Computer Science.
Abstract:
The BUFF transform, due to Cremers et al. (S&P’21), is a generic transformation for digital signature schemes, with the purpose of obtaining additional security guarantees beyond unforgeability: exclusive ownership, message-bound signatures, and non-resignability. Non-resignability (which essentially challenges an adversary to re-sign an unknown message for which it only obtains the signature) turned out to be a delicate matter, as recently Don et al. (CRYPTO’24) showed that the initial definition is essentially unachievable; in particular, it is not achieved by the BUFF transform. This led to the introduction of new, weakened versions of non-resignability, which are (potentially) achievable. In particular, it was shown that a salted variant of the BUFF transform achieves some weakened version of non-resignability. However, the salting requires additional randomness and leads to slightly larger signatures. Whether the original BUFF transform also achieves some meaningful notion of non-resignability remained a natural open question.
In this work, we answer this question in the affirmative. We show that the BUFF transform satisfies the (almost) strongest notions of non-resignability one can hope for, facing the known impossibility results. Our results cover both the statistical and the computational case, and both the classical and the quantum setting. At the core of our analysis lies a new security game for random oracles that we call Hide-and-Seek. While seemingly innocent at first glance, it turns out to be surprisingly challenging to rigorously analyze.
Jelle Don, Serge Fehr, Yu-Hsuan Huang, and Patrick Struck
On the (In)Security of the BUFF Transform
To appear in Advances in Cryptology - CRYPTO 2024, Lecture Notes in Computer Science.
Abstract:
The BUFF transform is a generic transformation for digital signature schemes, with the purpose of obtaining additional security properties beyond standard unforgeability, e.g., exclusive ownership and non-resignability. In the call for additional post-quantum signatures, these were explicitly mentioned by NIST as "additional desirable security properties", and some of the submissions indeed refer to the BUFF transform with the purpose of achieving them, while some other submissions follow the design of the BUFF transform without mentioning it explicitly.
In this work, we show the following negative results regarding the non-resignability property in general, and the BUFF transform in particular. In the plain model, we observe by means of a simple attack that any signature scheme for which the message has high entropy given the signature does not satisfy the non-resignability property (while non-resignability is trivially not satisfied if the message can be efficiently computed from its signature). Given that, under the BUFF transform, the message has high entropy given the signature, it follows that the BUFF transform does not achieve non-resignability whenever the random oracle is instantiated with a hash function, no matter which hash function.
When considering the random oracle model (ROM), the matter becomes slightly more delicate since prior works did not rigorously define the non-resignability property in the ROM. For the natural extension of the definition to the ROM, we observe that our impossibility result still holds, despite there having been positive claims about the non-resignability of the BUFF transform in the ROM. Indeed, prior claims of the non-resignability of the BUFF transform rely on faulty argumentation.
On the positive side, we prove that a salted version of the BUFF transform satisfies a slightly weaker variant of non-resignability in the ROM, covering both classical and quantum attacks, if the entropy requirement in the (weakened) definition of non-resignability is statistical; for the computational variant, we show yet another negative result.
Thomas Attema, Serge Fehr, Nicolas Resch
Generalized Special-Sound Interactive Proofs and their Knowledge Soundness
In Theory of Cryptography Conference - TCC 2023, volume 14371 of Lecture Notes in Computer Science, pages 424-454.
Abstract:
A classic result in the theory of interactive proofs shows that a special-sound Sigma-protocol is automatically a proof of knowledge. This result is very useful to have, since the latter property is typically tricky to prove from scratch, while the former is often easy to argue — if it is satisfied. While classic Sigma-protocols often are special-sound, this is unfortunately not the case for many recently proposed, highly efficient interactive proofs, at least not in this strict sense. Motivated by this, the original result was recently generalized to k-special-sound Sigma-protocols (for arbitrary, polynomially bounded k), and to multi-round versions thereof. This generalization is sufficient to analyze (e.g.) Bulletproofs-like protocols, but is still insufficient for many other examples.
In this work, we push the relaxation of the special soundness property to the extreme, by allowing an arbitrary access structure Gamma to specify for which subsets of challenges it is possible to compute a witness, when given correct answers to these challenges (for a fixed first message). Concretely, for any access structure Gamma, we identify parameters t and kappa, and we show that any Gamma-special-sound Sigma-protocol is a proof of knowledge with knowledge error kappa if t is polynomially bounded. Similarly for multi-round protocols.
We apply our general result to a couple of simple but important example protocols, where we obtain a tight knowledge error as an immediate corollary. Beyond these simple examples, we analyze the FRI protocol. Here, showing the general special soundness notion is non-trivial, but can be done (for a certain range of parameters) by recycling some of the techniques used to argue ordinary soundness of the protocol (as an IOP). Again as a corollary, we then derive that the FRI protocol, as an interactive proof by using a Merkle-tree commitment, has a knowledge extractor with almost optimal knowledge error, with the caveat that the extractor requires (expected) quasi-polynomial time.
Finally, building on the technique for the parallel repetition of k-special-sound Sigma-protocols, we show the same strong parallel repetition result for Gamma-special-sound Sigma-protocols and their multi-round variants.
Manuel Barbosa, Gilles Barthe, Christian Doczkal, Jelle Don, Serge Fehr, Benjamin Gregoire, Yu-Hsuan Huang, Andreas Hülsing, Yi Lee, and Xiaodi Wu
Fixing and Mechanizing the Security Proof of Fiat-Shamir with Aborts and Dilithium
In Advances in Cryptology - CRYPTO 2023, volume 14085 of Lecture Notes in Computer Science, pages 358-389.
Abstract:
We extend and consolidate the security justification for the Dilithium signature scheme. In particular, we identify a subtle but crucial gap that appears in several ROM and QROM security proofs for signature schemes that are based on the Fiat-Shamir with aborts paradigm, including Dilithium. The gap lies in the CMA-to-NMA reduction and was uncovered when trying to formalize a variant of the QROM security proof by Kiltz, Lyubashevsky, and Schaffner (Eurocrypt 2018). The gap was confirmed by the authors, and there seems to be no simple patch for it. We provide new, fixed proofs for the affected CMA-to-NMA reduction, both for the ROM and the QROM, and we perform a concrete security analysis for the case of Dilithium to show that the claimed security level is still valid after addressing the gap. Furthermore, we offer a fully mechanized ROM proof for the CMA-security of Dilithium in the EasyCrypt proof assistant. Our formalization includes several new tools and techniques of independent interest for future formal verification results.
Serge Fehr and Yu-Hsuan Huang
On the Quantum Security of HAWK
In Conference on Post-Quantum Cryptography - PQCRYPTO 2023, volume 14154 of Lecture Notes in Computer Science, pages 405-416.
Abstract:
In this paper, we prove the quantum security of the signature scheme HAWK, proposed by Ducas, Postlethwaite, Pulles and van Woerden (ASIACRYPT 2022). More precisely, we reduce its strong unforgeability in the quantum random oracle model (QROM) to the hardness of the one-more SVP problem, which is the computational problem on which also the classical security analysis of HAWK relies. Our security proof deals with the quantum aspects in a rather black-box way, making it accessible also to non-quantum-experts.
Gabriele Spini, Emiliano Mancini, Thomas Attema, Mark Abspoel, Jan de Gier, Serge Fehr, Thijs Veugen, Maran van Heesch, Daniël Worm, Andrea De Luca, Ronald Cramer, and Peter M.A. Sloot
New Approach to Privacy-Preserving Clinical Decision Support Systems for HIV Treatment
In Journal of Medical Systems, volume 46, number 84 (2022).
Abstract:
Background -
HIV treatment prescription is a complex process. Clinical decision support systems (CDSS) are a category of health information technologies that can assist clinicians to choose optimal treatments based on clinical trials and expert knowledge. The usability of some CDSSs for HIV treatment would be significantly improved by using the knowledge obtained by treating other patients. This knowledge, however, is mainly contained in patient records, whose usage is restricted due to privacy and confidentiality constraints.
Methods -
A treatment effectiveness measure, containing valuable information for HIV treatment prescription, was defined and a method to extract this measure from patient records was developed. This method uses an advanced cryptographic technology, known as secure Multiparty Computation (henceforth referred to as MPC), to preserve the privacy of the patient records and the confidentiality of the clinicians’ decisions.
Findings -
Our solution enables the computation of an effectiveness measure of an HIV treatment, the average time-to-treatment-failure, while preserving privacy. Experimental results show that our solution, although at proof-of-concept stage, has good efficiency and provides a result to a query within 24 minutes for a dataset of realistic size.
Interpretation -
This paper presents a novel and efficient approach to HIV clinical decision support systems that harnesses the potential and insights acquired from treatment data, while preserving the privacy of patient records and the confidentiality of clinician decisions.
Thomas Attema, Serge Fehr and Michael Klooss
Fiat-Shamir Transformation of Multi-Round Interactive Proofs
In Theory of Cryptography Conference - TCC 2022, volume 13747 of Lecture Notes in Computer Science, pages 113-142.
Also: In Journal of Cryptology, 36, 36 (2023).
Abstract:
The celebrated Fiat-Shamir transformation turns any public-coin interactive proof into a non-interactive one, which inherits the main security properties (in the random oracle model) of the interactive version. While originally considered in the context of 3-move public-coin interactive proofs, i.e., so-called Σ-protocols, it is now applied to multi-round protocols as well. Unfortunately, the security loss for a (2μ+1)-move protocol is, in general, approximately Q^μ, where Q is the number of oracle queries performed by the attacker. In general, this is the best one can hope for, as it is easy to see that this loss applies to the μ-fold sequential repetition of Σ-protocols, but it raises the question whether certain (natural) classes of interactive proofs feature a milder security loss.
In this work, we give positive and negative results on this question. On the positive side, we show that for (k_1,...,k_μ)-special-sound protocols (which cover a broad class of use cases), the knowledge error degrades linearly in Q, instead of Q^μ. On the negative side, we show that for t-fold parallel repetitions of typical (k_1,...,k_μ)-special-sound protocols with t ≥ μ (and assuming for simplicity that t and Q are integer multiples of μ), there is an attack that results in a security loss of approximately (1/2)·Q^μ/μ^(μ+t).
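As a purely illustrative back-of-the-envelope comparison of the three loss factors appearing in this abstract (the sample values of Q, μ and t are chosen here for illustration and do not come from the paper):

```python
# Toy comparison of the security-loss factors for the Fiat-Shamir
# transformation of a (2*mu+1)-move protocol; illustrative values only.
Q, mu, t = 1000, 2, 4

generic_loss = Q**mu                      # generic loss ~ Q^mu
linear_loss = Q                           # loss for (k_1,...,k_mu)-special-sound protocols
attack_loss = 0.5 * Q**mu / mu**(mu + t)  # attack on t-fold parallel repetition

print(generic_loss, linear_loss, attack_loss)  # 1000000 1000 7812.5
```

Even for a modest μ = 2, the gap between the generic Q^μ bound and the linear-in-Q bound is already a factor of Q.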
Jelle Don, Serge Fehr, and Yu-Hsuan Huang
Adaptive versus Static Multi-oracle Algorithms, and Quantum Security of a Split-key PRF
In Theory of Cryptography Conference - TCC 2022, volume 13747 of Lecture Notes in Computer Science, pages 33-51.
Abstract:
In the first part of the paper, we show a generic compiler that transforms any oracle algorithm that can query multiple oracles adaptively, i.e., can decide on which oracle to query at what point dependent on previous oracle responses, into a static algorithm that fixes these choices at the beginning of the execution. Compared to naive ways of achieving this, our compiler controls the blow-up in query complexity for each oracle individually, and causes a very mild blow-up only.
In the second part of the paper, we use our compiler to show the security of the very efficient hash-based split-key PRF proposed by Giacon, Heuer and Poettering (PKC 2018), in the quantum random-oracle model. Using a split-key PRF as the key-derivation function gives rise to a secure KEM combiner. Thus, our result shows that the hash-based construction of Giacon et al. can be safely used in the context of quantum attacks, for instance to combine a well-established but only classically-secure KEM with a candidate KEM that is believed to be quantum-secure.
Our security proof for the split-key PRF crucially relies on our adaptive-to-static compiler, but we expect our compiler to be useful beyond this particular application. Indeed, we discuss a couple of other, known results from the literature that would have profited from our compiler, in that these works had to go through serious complications in order to deal with adaptivity.
Thomas Attema and Serge Fehr
Parallel Repetition of (k_1,...,k_μ)-Special-Sound Multi-Round Interactive Proofs
In Advances in Cryptology - CRYPTO 2022, volume 13508 of Lecture Notes in Computer Science, pages 415-443.
Abstract:
On many occasions, the knowledge error κ of an interactive proof is not small enough, and thus needs to be reduced. This can be done generically by repeating the interactive proof in parallel. While there have been many works studying the effect of parallel repetition on the soundness error of interactive proofs and arguments, the effect of parallel repetition on the knowledge error has largely remained unstudied. Only recently it was shown that the t-fold parallel repetition of any interactive protocol reduces the knowledge error from κ down to κ^t + ν for any non-negligible term ν. This generic result is suboptimal in that it does not give the knowledge error κ^t that one would expect for typical protocols, and, worse, the knowledge error remains non-negligible.
In this work we show that indeed the t-fold parallel repetition of any (k_1,...,k_μ)-special-sound multi-round public-coin interactive proof optimally reduces the knowledge error from κ down to κ^t. At the core of our results is an alternative, in some sense more fine-grained, measure of quality of a dishonest prover than its success probability, for which we show that it characterizes when knowledge extraction is possible. This new measure then turns out to be very convenient when it comes to analyzing the parallel repetition of such interactive proofs.
While parallel repetition reduces the knowledge error, it is easily seen to increase the completeness error. For this reason, we generalize our result to the case of s-out-of-t threshold parallel repetition, where the verifier accepts if s out of t of the parallel instances are accepting. An appropriately chosen threshold s allows both the knowledge error and completeness error to be reduced simultaneously.
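A heuristic numeric illustration of the shape of these error bounds (assuming, for simplicity of this sketch, that the t repetitions behave independently; the paper's extractor-based analysis does not rely on such an independence assumption):

```python
from math import comb

def repeated_error(t, s, kappa):
    """Probability of at least s 'successes' out of t independent trials,
    each succeeding with probability kappa. With s = t this is the plain
    parallel-repetition error kappa**t; with s < t it models the
    s-out-of-t threshold variant."""
    return sum(comb(t, j) * kappa**j * (1 - kappa)**(t - j)
               for j in range(s, t + 1))

kappa, t = 0.5, 8
print(repeated_error(t, t, kappa))  # plain parallel repetition: 0.5**8 = 0.00390625
print(repeated_error(t, 6, kappa))  # 6-out-of-8 threshold: 37/256 = 0.14453125
```

The threshold variant pays with a larger error, but in return a few failing instances no longer make an honest prover fail, which is what allows the completeness error to shrink simultaneously.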
Jelle Don, Serge Fehr, Christian Majenz, and Christian Schaffner
Efficient NIZKs and Signatures from Commit-and-Open Protocols in the QROM
In Advances in Cryptology - CRYPTO 2022, volume 13508 of Lecture Notes in Computer Science, pages 729-757.
Abstract:
Commit-and-open Sigma-protocols are a popular class of protocols for constructing non-interactive zero-knowledge arguments and digital-signature schemes via the Fiat-Shamir transformation. Instantiated with hash-based commitments, the resulting non-interactive schemes enjoy tight online-extractability in the random oracle model. Online extractability improves the tightness of security proofs for the resulting digital-signature schemes by avoiding lossy rewinding or forking-lemma based extraction.
In this work, we prove tight online extractability in the quantum random oracle model (QROM), showing that the construction supports post-quantum security. First, we consider the default case where committing is done by element-wise hashing. In a second part, we extend our result to Merkle-tree based commitments. Our results yield a significant improvement of the provable post-quantum security of the digital-signature scheme Picnic.
Our analysis makes use of a recent framework by Chung et al. for analysing quantum algorithms in the QROM using purely classical reasoning. Therefore, our results can to a large extent be understood and verified without prior knowledge of quantum information science.
Jelle Don, Serge Fehr, Christian Majenz, and Christian Schaffner
Online-Extractability in the Quantum Random-Oracle Model
In Advances in Cryptology - EUROCRYPT 2022, volume 13277 of Lecture Notes in Computer Science, pages 677-706.
Also: Accepted to QIP 2022 and QCRYPT 2022.
Abstract:
We show the following generic result: When a quantum query algorithm in the quantum random-oracle model outputs a classical value t that is promised to be in some tight relation with H(x) for some x, then x can be efficiently extracted with almost certainty. The extraction is by means of a suitable simulation of the random oracle and works online, meaning that it is straight-line, i.e., without rewinding, and on-the-fly, i.e., during the protocol execution and (almost) without disturbing it. The technical core of our result is a new commutator bound that bounds the operator norm of the commutator of the unitary operator that describes the evolution of the compressed oracle (which is used to simulate the random oracle above) and of the measurement that extracts x.
We show two applications of our generic online extractability result. We show tight online extractability of commit-and-open Σ-protocols in the quantum setting, and we offer the first complete post-quantum security proof of the textbook Fujisaki-Okamoto transformation, i.e., without adjustments to facilitate the proof, including concrete security bounds.
Thomas Attema, Ronald Cramer, and Serge Fehr
Compressing Proofs of k-Out-Of-n Partial Knowledge
In Advances in Cryptology - CRYPTO 2021, volume 12828 of Lecture Notes in Computer Science, pages 65-89.
Abstract:
In a proof of partial knowledge, introduced by Cramer, Damgård and Schoenmakers (CRYPTO 1994), a prover knowing witnesses for some k-subset of n given public statements can convince the verifier of this claim without revealing which k-subset. Their solution combines Σ-protocol theory and linear secret sharing, and achieves linear communication complexity for general k,n. Especially the "one-out-of-n" case k = 1 has seen myriad applications during the last decades, e.g., in electronic voting, ring signatures, and confidential transaction systems.
In this paper we focus on the discrete logarithm (DL) setting, where the prover claims knowledge of DLs of k-out-of-n given elements. Groth and Kohlweiss (EUROCRYPT 2015) have shown how to solve the special case k = 1 with logarithmic (in n) communication, instead of linear as prior work. However, their method takes explicit advantage of k = 1 and does not generalize to k > 1. Alternatively, an indirect approach for solving the considered problem is by translating the k-out-of-n relation into a circuit and then applying communication-efficient circuit ZK. Indeed, for the k = 1 case this approach has been highly optimized, e.g., in ZCash. Our main contribution is a new, simple honest-verifier zero-knowledge proof protocol for proving knowledge of k out of n DLs with logarithmic communication and for general k and n, without requiring any generic circuit ZK machinery. Our solution puts forward a novel extension of the compressed Σ-protocol theory (CRYPTO 2020), which we then utilize to compress a new Σ-protocol for proving knowledge of k-out-of-n DLs down to logarithmic size. The latter Σ-protocol is inspired by the CRYPTO 1994 approach, but a careful re-design of the original protocol is necessary for the compression technique to apply. Interestingly, even for k = 1 and general n our approach improves prior direct approaches as it reduces prover complexity without increasing the communication complexity. Besides the conceptual simplicity, we also identify regimes of practical relevance where our approach achieves asymptotic and concrete improvements, e.g., in proof size and prover complexity, over the generic approach based on circuit-ZK.
Finally, we show various extensions and generalizations of our core result. For instance, we extend our protocol to proofs of partial knowledge of Pedersen (vector) commitment openings, and/or to include a proof that the witness satisfies some additional constraint, and we show how to extend our results to non-threshold access structures.
Kai-Min Chung, Serge Fehr, Yu-Hsuan Huang, and Tai-Ning Liao
On the Compressed-Oracle Technique, and Post-Quantum Security of Proofs of Sequential Work
In Advances in Cryptology - EUROCRYPT 2021, volume 12697 of Lecture Notes in Computer Science, pages 598-629.
Also: Accepted to QCRYPT 2021.
Abstract:
We revisit the so-called compressed oracle technique, introduced by Zhandry for analyzing quantum algorithms in the quantum random oracle model (QROM). To start off with, we offer a concise exposition of the technique, which easily extends to the parallel-query QROM, where in each query-round the considered algorithm may make several queries to the QROM in parallel. This variant of the QROM allows for a more fine-grained query-complexity analysis.
Our main technical contribution is a framework that simplifies the use of (the parallel-query generalization of) the compressed oracle technique for proving query complexity results. With our framework in place, whenever applicable, it is possible to prove quantum query complexity lower bounds by means of purely classical reasoning. More than that, for typical examples the crucial classical observations that give rise to the classical bounds are sufficient to conclude the corresponding quantum bounds. We demonstrate this on a few examples, recovering known results but also obtaining new results. Our main target is the hardness of finding a q-chain with fewer than q parallel queries, i.e., a sequence x_0, x_1, ..., x_q with x_i = H(x_(i-1)) for all 1 ≤ i ≤ q.
The above problem of finding a hash chain is of fundamental importance in the context of proofs of sequential work. Indeed, as a concrete cryptographic application of our techniques, we prove quantum security of the Simple Proofs of Sequential Work by Cohen and Pietrzak.
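The q-chain structure underlying this hardness result can be sketched in a few lines (SHA-256 stands in here for the random oracle H; this is an illustration of the object, not of the proof technique):

```python
import hashlib

def H(x: bytes) -> bytes:
    """Stand-in for the random oracle H, instantiated with SHA-256."""
    return hashlib.sha256(x).digest()

def q_chain(x0: bytes, q: int) -> list:
    """Compute the chain x_0, x_1, ..., x_q with x_i = H(x_(i-1)).
    Each query depends on the previous output, so the honest computation
    is inherently sequential: q chain links require q query-rounds."""
    xs = [x0]
    for _ in range(q):
        xs.append(H(xs[-1]))
    return xs

xs = q_chain(b"seed", 5)
assert all(xs[i] == H(xs[i - 1]) for i in range(1, len(xs)))
```

The quantum lower bound referred to above says that no algorithm, even one making parallel quantum queries, can produce such a chain with fewer than q query-rounds, except with small probability.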
Serge Fehr and Chen Yuan
Robust Secret Sharing with Almost Optimal Share Size and Security Against Rushing Adversaries
In Theory of Cryptography Conference - TCC 2020, volume 12552 of Lecture Notes in Computer Science, pages 470-498.
Abstract:
We show a robust secret sharing scheme for a maximal threshold t < n/2 that features an optimal overhead in share size, offers security against a rushing adversary, and runs in polynomial time. Previous robust secret sharing schemes for t < n/2 either suffered from a suboptimal overhead, offered no (provable) security against a rushing adversary, or ran in superpolynomial time.
Jelle Don, Serge Fehr, and Christian Majenz
The Measure-and-Reprogram Technique 2.0: Multi-round Fiat-Shamir and More
In Advances in Cryptology - CRYPTO 2020, volume 12172 of Lecture Notes in Computer Science, pages 602-631.
Also: Accepted to QCRYPT 2020.
Abstract:
We revisit recent works by Don, Fehr, Majenz and Schaffner and by Liu and Zhandry on the security of the Fiat-Shamir (FS) transformation of Σ-protocols in the quantum random oracle model (QROM). Two natural questions that arise in this context are: (1) whether the results extend to the FS transformation of multi-round interactive proofs, and (2) whether Don et al.’s O(q^2) loss in security is optimal.
Firstly, we answer question (1) in the affirmative. As a byproduct of solving a technical difficulty in proving this result, we slightly improve the result of Don et al., equipping it with a cleaner bound and an even simpler proof. We apply our result to digital signature schemes showing that it can be used to prove strong security for schemes like MQDSS in the QROM. As another application we prove QROM-security of a non-interactive OR proof by Liu, Wei and Wong.
As for question (2), we show via a Grover-search based attack that Don et al.’s quadratic security loss for the FS transformation of Σ-protocols is optimal up to a small constant factor. This extends to our new multi-round result, proving it tight up to a factor depending on the number of rounds only, i.e., it is constant for constant-round interactive proofs.
Serge Fehr and Serge Vaudenay
Sublinear Bounds on the Distinguishing Advantage for Multiple Samples
In International Workshop on Security - IWSEC 2020, volume 12231 of Lecture Notes in Computer Science, pages 165-183.
Abstract:
The maximal achievable advantage of a (computationally unbounded) distinguisher to determine whether a source Z is distributed according to distribution P0 or P1, when given access to one sample of Z, is characterized by the statistical distance d(P0,P1). Here, we study the distinguishing advantage when given access to several i.i.d. samples of Z. For n samples, the advantage is then naturally given by d(P0⊗n,P1⊗n), which can be bounded as d(P0⊗n,P1⊗n) ≤ n d(P0,P1). This bound is tight for some choices of P0 and P1; thus, in general, a linear increase in the distinguishing advantage is unavoidable.
In this work, we show new and improved bounds on d(P0⊗n,P1⊗n) that circumvent the above pessimistic observation. Our bounds assume, necessarily, certain additional information on P0 and/or P1 beyond, or instead of, a bound on d(P0,P1); in return, the bounds grow as √n, rather than linearly in n. Thus, whenever applicable, our bounds show that the number of samples necessary to distinguish the two distributions is substantially larger than what the standard bound would suggest.
Such bounds have already been suggested in previous literature, but our new bounds are more general and (partly) stronger, and thus applicable to a larger class of instances.
In a second part, we extend our results to a modified setting, where the distinguisher only has indirect access to the source Z. By this we mean that instead of obtaining samples of Z, the distinguisher now obtains i.i.d. samples that are chosen according to a probability distribution that depends on the (one) value produced by the source Z.
Finally, we offer applications of our bounds to the area of cryptography. We show on a few examples from the cryptographic literature how our bounds give rise to improved results. For instance, importing our bounds into the analyses of Blondeau et al. for the security of block ciphers against multidimensional linear and truncated differential attacks, we obtain immediate improvements to their results.
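The gap between d(P0⊗n,P1⊗n) and the linear bound n·d(P0,P1) can be seen directly on a toy example (the particular distributions below are chosen only for illustration):

```python
import math
from itertools import product

def tv(p, q):
    """Statistical (total variation) distance between two distributions
    given as dicts over the same finite support."""
    return 0.5 * sum(abs(p[x] - q[x]) for x in p)

def power(p, n):
    """The n-fold product distribution p^(⊗n)."""
    return {xs: math.prod(p[x] for x in xs) for xs in product(p, repeat=n)}

P0 = {'0': 0.5, '1': 0.5}
P1 = {'0': 0.4, '1': 0.6}
d = tv(P0, P1)  # 0.1
for n in (1, 2, 4, 8):
    # d(P0^(⊗n), P1^(⊗n)) grows noticeably slower than the linear bound n*d
    print(n, round(tv(power(P0, n), power(P1, n)), 4), n * d)
```

Already for n = 4 the exact distance is about 0.163, well below the linear bound 0.4, consistent with the √n-type growth that the bounds in the paper capture.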
Koen de Boer, Léo Ducas, and Serge Fehr
On the Quantum Complexity of the Continuous Hidden Subgroup Problem
In Advances in Cryptology - EUROCRYPT 2020, volume 12106 of Lecture Notes in Computer Science, pages 341-370.
Abstract:
The Hidden Subgroup Problem (HSP) aims at capturing all problems that are susceptible to be solvable in quantum polynomial time following the blueprints of Shor’s celebrated algorithm. Successful solutions to this problem over various commutative groups allow one to efficiently perform number-theoretic tasks such as factoring or finding discrete logarithms.
The latest successful generalization (Eisenträger et al. STOC 2014) considers the problem of finding a full-rank lattice as the hidden subgroup of the continuous vector space Rm, even for large dimensions m. It unlocked new cryptanalytic algorithms (Biasse-Song SODA 2016, Cramer et al. EUROCRYPT 2016 and 2017), in particular to find mildly short vectors in ideal lattices.
The cryptanalytic relevance of such a problem raises the question of a more refined and quantitative complexity analysis. In the light of the increasing physical difficulty of maintaining a large entanglement of qubits, the degree of concern may differ depending on whether the above algorithm requires only linearly many qubits or a much larger polynomial number of qubits.
This is the question we start addressing with this work. We propose a detailed analysis of (a variation of) the aforementioned HSP algorithm, and conclude on its complexity as a function of all the relevant parameters. Our modular analysis is tailored to support the optimization of future specialization to cases of cryptanalytic interests. We suggest a few ideas in this direction.
Jelle Don, Serge Fehr, Christian Majenz, and Christian Schaffner
Security of the Fiat-Shamir Transformation in the Quantum Random-Oracle Model
In Advances in Cryptology - CRYPTO 2019, volume 11693 of Lecture Notes in Computer Science, pages 356-383.
Also: Accepted to QCRYPT 2019 and QIP 2020.
Abstract:
The famous Fiat-Shamir transformation turns any public-coin three-round interactive proof, i.e., any so-called Σ-protocol, into a non-interactive proof in the random-oracle model. We study this transformation in the setting of a quantum adversary that in particular may query the random oracle in quantum superposition. Our main result is a generic reduction that transforms any quantum dishonest prover attacking the Fiat-Shamir transformation in the quantum random-oracle model into a similarly successful quantum dishonest prover attacking the underlying Σ-protocol (in the standard model). Applied to the standard soundness and proof-of-knowledge definitions, our reduction implies that both these security properties, in both the computational and the statistical variant, are preserved under the Fiat-Shamir transformation even when allowing quantum attacks. Our result improves and completes the partial results that have been known so far, but it also proves wrong certain claims made in the literature. In the context of post-quantum secure signature schemes, our results imply that for any Σ-protocol that is a proof-of-knowledge against quantum dishonest provers (and that satisfies some additional natural properties), the corresponding Fiat-Shamir signature scheme is secure in the quantum random-oracle model. For example, we can conclude that the non-optimized version of Fish, which is the bare Fiat-Shamir variant of the NIST candidate Picnic, is secure in the quantum random-oracle model.
Serge Fehr and Chen Yuan
Towards Optimal Robust Secret Sharing with Security against a Rushing Adversary
In Advances in Cryptology - EUROCRYPT 2019, volume 11478 of Lecture Notes in Computer Science, pages 472-499.
Abstract:
Robust secret sharing enables the reconstruction of a secret-shared message in the presence of up to t (out of n) incorrect shares. The most challenging case is when n = 2t+1, which is the largest t for which the task is still possible, up to a small error probability 2^(-k) and with some overhead in the share size.
Recently, Bishop, Pastro, Rajaraman and Wichs [3] proposed a scheme with an (almost) optimal overhead of Õ(k). This seems to answer the open question posed by Cevallos et al. [6], who proposed a scheme with overhead of O(n+k) and asked whether the linear dependency on n was necessary or not. However, a subtle issue with Bishop et al.'s solution is that it (implicitly) assumes a non-rushing adversary, and thus it satisfies a weaker notion of security compared to the scheme by Cevallos et al. [6], or to the classical scheme by Rabin and Ben-Or [13].
In this work, we almost close this gap. We propose a new robust secret sharing scheme that offers full security against a rushing adversary, and that has an overhead of Õ(k·n^𝜀), where 𝜀>0 is arbitrary but fixed. This n^𝜀-factor is obviously worse than the polylog(n)-factor hidden in the Õ-notation of the scheme of Bishop et al. [3], but it greatly improves on the linear dependency on n of the best known scheme that features security against a rushing adversary (when k is substantially smaller than n).
A small variation of our scheme has the same Õ(k) overhead as the scheme of Bishop et al. and achieves security against a rushing adversary, but suffers from a (slightly) superpolynomial reconstruction complexity.
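Schemes in this regime are built on top of plain (non-robust) Shamir secret sharing. As a minimal sketch of that base primitive (the modulus and parameters are illustrative; note this sketch detects or corrects nothing — adding robustness against incorrect shares at low overhead is exactly the subject of the work above):

```python
import random

P = 2**61 - 1  # a prime modulus; the choice is illustrative

def share(secret, n, t):
    """Split `secret` into n Shamir shares; any t shares reveal nothing."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    return [(i, sum(c * pow(i, j, P) for j, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at 0 from any t+1 *correct* shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any t+1 correct shares reconstruct the secret, while any t shares are statistically independent of it; with incorrect shares in the mix, plain Lagrange interpolation can silently output a wrong value, which is the failure mode that robust secret sharing addresses.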
Serge Fehr
Classical Proofs for the Quantum Collapsing Property of Classical Hash Functions
In Theory of Cryptography Conference - TCC 2018, volume 11240 of Lecture Notes in Computer Science, pages 315-338.
Abstract:
Hash functions are of fundamental importance in theoretical and in practical cryptography, and with the threat of quantum computers possibly emerging in the future, it is an urgent objective to understand the security of hash functions in the light of potential future quantum attacks. To this end, we reconsider the collapsing property of hash functions, as introduced by Unruh, which replaces the notion of collision resistance when considering quantum attacks. Our contribution is a formalism and a framework that offers significantly simpler proofs for the collapsing property of hash functions. With our framework, we can prove the collapsing property for hash domain extension constructions entirely by means of decomposing the iteration function into suitable elementary composition operations. In particular, given our framework, one can argue purely classically about the quantum-security of hash functions; this is in contrast to previous proofs which are in terms of sophisticated quantum-information-theoretic and quantum-algorithmic reasoning.
Frédéric Dupuis, Serge Fehr, Philippe Lamontagne, and Louis Salvail
Secure Certification of Mixed Quantum States with Application to Two-Party Randomness Generation
In Theory of Cryptography Conference - TCC 2018, volume 11240 of Lecture Notes in Computer Science, pages 282-314.
Abstract:
We investigate sampling procedures that certify that an arbitrary quantum state on n subsystems is close to an ideal mixed state 𝜑^(⊗n) for a given reference state 𝜑, up to errors on a few positions. This task makes no sense classically: it would correspond to certifying that a given bitstring was generated according to some desired probability distribution. However, in the quantum case, this is possible if one has access to a prover who can supply a purification of the mixed state.
In this work, we introduce the concept of mixed-state certification, and we show that a natural sampling protocol offers secure certification in the presence of a possibly dishonest prover: if the verifier accepts then he can be almost certain that the state in question has been correctly prepared, up to a small number of errors.
We then apply this result to two-party quantum coin-tossing. Given that strong coin tossing is impossible, it is natural to ask “how close can we get”. This question has been well studied and is nowadays well understood from the perspective of the bias of individual coin tosses. We approach and answer this question from a different—and somewhat orthogonal—perspective, where we do not look at individual coin tosses but at the global entropy instead. We show how two distrusting parties can produce a common high-entropy source, where the entropy is an arbitrarily small fraction below the maximum.
Serge Fehr, Pierre Karpman, and Bart Mennink
Short Non-Malleable Codes from Related-Key Secure Block Ciphers
In IACR Transactions on Symmetric Cryptology, volume 2018, issue 1. Presented at the International Conference on Fast Software Encryption - FSE 2018.
Abstract:
A non-malleable code is an unkeyed randomized encoding scheme that offers the strong guarantee that decoding a tampered codeword either results in the original message, or in an unrelated message.
We consider the simplest possible construction in the computational split-state model, which encodes a message m simply as k||E_k(m) for a uniformly random key k, where E is a block cipher.
This construction is comparable to, but greatly simplifies over, the one of Kiayias et al. (ACM CCS 2016), who eschewed this simple scheme in fear of related-key attacks on E.
In this work, we prove this construction to be a strong non-malleable code in the split-state tampering model as long as E is (i) a pseudorandom permutation under leakage and (ii) related-key secure with respect to an arbitrary but fixed key relation.
Both properties are believed to hold for "good" block ciphers, such as AES-128, making this non-malleable code very efficient with short codewords of length |m| + 2κ (where κ is the security parameter, e.g., 128 bits), without significant security penalty.
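The encoding k||E_k(m) is simple enough to sketch. Below, a toy xor-pad stands in for the block cipher E (illustration only: the pad satisfies none of the required security properties, and a real instantiation would use e.g. AES-128):

```python
import hashlib

def E(k: bytes, m: bytes) -> bytes:
    # Toy stand-in for a block cipher: xor with a key-derived pad.
    # This offers neither PRP-under-leakage nor related-key security.
    pad = hashlib.shake_256(k).digest(len(m))
    return bytes(a ^ b for a, b in zip(m, pad))

def encode(m: bytes, k: bytes):
    # The codeword is the pair (k, E_k(m)); in the split-state model
    # the two halves are tampered with independently.
    return (k, E(k, m))

def decode(left: bytes, right: bytes) -> bytes:
    # The xor stand-in is its own inverse; with a real cipher this
    # would be block-cipher decryption under key `left`.
    return E(left, right)
```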
Serge Fehr and Louis Salvail
Quantum Authentication and Encryption with Key Recycling
In Advances in Cryptology - EUROCRYPT 2017, volume 10212 of Lecture Notes in Computer Science, pages 311-338. Springer-Verlag, 2017.
Also: Accepted to QCRYPT 2017.
Abstract:
We propose an information-theoretically secure encryption scheme for classical messages with quantum ciphertexts that offers detection of eavesdropping attacks, and re-usability of the key in case no eavesdropping took place: the entire key can be securely re-used for encrypting new messages as long as no attack is detected.
This is known to be impossible for fully classical schemes, where there is no way to detect plain eavesdropping attacks.
This particular application of quantum techniques to cryptography was originally proposed by Bennett, Brassard and Breidbart in 1982, even before the invention of quantum key distribution, and a simple candidate scheme was suggested, but no rigorous security analysis was given.
The idea was picked up again in 2005, when Damgård, Pedersen and Salvail suggested a new scheme for the same task, but now with a rigorous security analysis. However, their scheme is much more demanding in terms of quantum capabilities: it requires the users to have a quantum computer.
In contrast, and like the original scheme by Bennett et al., our new scheme requires from the honest users merely to prepare and measure single BB84 qubits.
As such, we not only show the first provably-secure scheme that is within reach of current technology, but we also confirm Bennett et al.'s original intuition that a scheme in the spirit of their original construction is indeed secure.
Gabriele Spini and Serge Fehr
Cheater Detection in SPDZ Multiparty Computation
In International Conference on Information Theoretic Security - ICITS 2016, volume 10015 of Lecture Notes in Computer Science, pages 151– 176. Springer-Verlag, 2016.
Abstract:
In this work we revisit the SPDZ multiparty computation protocol by Damgård et al. for securely computing a function in the presence of an unbounded number of dishonest parties. The SPDZ protocol is distinguished by its fast performance. A downside of the SPDZ protocol is that one single dishonest party can force the computation to fail, meaning that the honest parties have to abort the computation without learning the outcome, whereas the cheating party may actually learn it. Furthermore, the dishonest party can launch such an attack without being identified as the cheater. This is a serious obstacle for practical deployment: there are various reasons why a party may want the computation to fail, and without cheater detection there is little incentive for such a party not to cheat. As such, in many cases, the protocol will actually fail to do its job.
In this work, we enhance the SPDZ protocol to allow for cheater detection: a dishonest party that forces the protocol to fail will be identified as the cheater. As a consequence, in typical real-life scenarios, parties will actually have little incentive to cheat, and if cheating still takes place, the cheater can be identified and discarded and the computation can possibly be re-done, until it succeeds.
The challenge lies in adding this cheater detection feature to the original protocol without increasing its complexity significantly. In case no cheating takes place, our new protocol is as efficient as the original SPDZ protocol which has no cheater detection. In case cheating does take place, there may be some additional overhead, which is still reasonable in size though, and since the cheater knows he will be caught, this is actually unlikely to occur in typical real-life scenarios.
Frédéric Dupuis, Serge Fehr, Philippe Lamontagne, and Louis Salvail
Adaptive Versus Non-Adaptive Strategies in the Quantum Setting with Applications
In Advances in Cryptology - CRYPTO 2016, volume 9816 of Lecture Notes in Computer Science, pages 33-59. Springer-Verlag, 2016.
Abstract:
We prove a general relation between adaptive and non-adaptive strategies in the quantum setting, i.e., between strategies where the adversary can or cannot adaptively base its action on some auxiliary quantum side information. Our relation holds in a very general setting, and is applicable as long as we can control the bit-size of the side information, or, more generally, its "information content". Since adaptivity is notoriously difficult to handle in the analysis of (quantum) cryptographic protocols, this gives us a very powerful tool: as long as we have enough control over the side information, it is sufficient to restrict ourselves to non-adaptive attacks.
We demonstrate the usefulness of this methodology with two examples.
The first is a quantum bit commitment scheme based on 1-bit cut-and-choose. Since bit commitment implies oblivious transfer (in the quantum setting), and oblivious transfer is universal for two-party computation, this implies the universality of 1-bit cut-and-choose, and thus solves the main open problem of [9]. The second example is a quantum bit commitment scheme proposed in 1993 by Brassard et al. It was originally suggested as an unconditionally secure scheme, back when this was thought to be possible. We partly restore the scheme by proving it secure in (a variant of) the bounded quantum storage model.
In both examples, the fact that the adversary holds quantum side information obstructs a direct analysis of the scheme, and we circumvent it by analyzing a non-adaptive version, which can be done by means of known techniques, and applying our main result.
Serge Fehr and Max Fillinger
On the Composition of Two-Prover Commitments, and Applications to Multi-Round Relativistic Commitments
In Advances in Cryptology - EUROCRYPT 2016, volume 9666 of Lecture Notes in Computer Science, pages 477-496. Springer-Verlag, 2016.
Also: Accepted to QCRYPT 2015.
Abstract:
We consider the related notions of two-prover and of relativistic commitment schemes. In recent work, Lunghi et al. proposed a new relativistic commitment scheme with a multi-round sustain phase that enables to keep the binding property alive as long as the sustain phase is running.
They prove security of their scheme against classical attacks; however, the proven bound on the error parameter is very weak: it blows up doubly exponentially in the number of rounds.
In this work, we give a new analysis of the multi-round scheme of Lunghi et al., and we show a linear growth of the error parameter instead (also considering classical attacks only). Our analysis is based on a new composition theorem for two-prover commitment schemes. The proof of our composition theorem is based on a better understanding of the binding property of two-prover commitments that we provide in the form of new definitions and relations among them.
As an additional consequence of these new insights, our analysis is actually with respect to a strictly stronger notion of security than that considered by Lunghi et al.
Serge Fehr and Max Fillinger
Multi-Prover Commitments Against Non-Signaling Attacks
In Advances in Cryptology - CRYPTO 2015, volume 9216 of Lecture Notes in Computer Science, pages 403-421. Springer-Verlag, 2015.
Also: Accepted to QCRYPT 2015.
Abstract:
We reconsider the concept of two-prover (and more generally: multi-prover) commitments, as introduced in the late eighties in the seminal work by Ben-Or et al. As was recently shown by Crépeau et al., the security of known two-prover commitment schemes not only relies on the explicit assumption that the two provers cannot communicate, but also depends on what their information processing capabilities are. For instance, there exist schemes that are secure against classical provers but insecure if the provers have quantum information processing capabilities, and there are schemes that resist such quantum attacks but become insecure when considering general so-called non-signaling provers, which are restricted solely by the requirement that no communication takes place.
This poses the natural question whether there exists a two-prover commitment scheme that is secure under the sole assumption that no communication takes place, and that does not rely on any further restriction of the information processing capabilities of the dishonest provers; no such scheme is known.
In this work, we give strong evidence for a negative answer: we show that any single-round two-prover commitment scheme can be broken by a non-signaling attack. Our negative result is as bad as it can get: for any candidate scheme that is (almost) perfectly hiding, there exists a strategy that allows the dishonest provers to open a commitment to an arbitrary bit (almost) as successfully as the honest provers can open an honestly prepared commitment, i.e., with probability (almost) 1 in case of a perfectly sound scheme. In the case of multi-round schemes, our impossibility result is restricted to perfectly hiding schemes.
On the positive side, we show that the impossibility result can be circumvented by considering three provers instead: there exists a three-prover commitment scheme that is secure against arbitrary non-signaling attacks.
Ronald Cramer, Ivan Damgård, Nico Döttling, Serge Fehr, and Gabriele Spini
Linear Secret Sharing Schemes from Error Correcting Codes and Universal Hash Functions
In Advances in Cryptology - EUROCRYPT 2015, volume 9057 of Lecture Notes in Computer Science, pages 313-363. Springer-Verlag, 2015.
Abstract:
We present a novel method for constructing linear secret sharing schemes (LSSS) from linear error correcting codes and linear universal hash functions in a blackbox way. The main advantage of this new construction is that the privacy property of the resulting secret sharing scheme essentially becomes independent of the code we use, only depending on its rate. This allows us to fully harness the algorithmic properties of recent code constructions such as efficient encoding and decoding or efficient list-decoding. Choosing the error correcting codes and universal hash functions involved carefully, we obtain solutions to the following open problems:
(1) A linear near-threshold secret sharing scheme with both linear time sharing and reconstruction algorithms and large secrets (i.e. secrets of size Ω(n)). Thus, the computational overhead per shared bit in this scheme is constant.
(2) An efficiently reconstructible robust secret sharing scheme for n/3 ≤ t ≤ (1-ε) n/2 corrupted players (for any constant ε > 0) with shares of optimal size O(1+λ/n) and secrets of size Ω(n+λ), where λ is the security parameter.
Ivan Damgård, Serge Fehr, Louis Salvail, and Christian Schaffner
Secure Identification and QKD in the Bounded-Quantum-Storage Model
In Theoretical Computer Science, volume 560, part 1, pages 12-26, Theoretical Aspects of Quantum Cryptography - 30 Years of BB84. Elsevier, 2014.
A preliminary version of this paper appeared in CRYPTO 2007.
Abstract:
We consider the problem of secure identification: user U proves to server S that he knows an agreed (possibly low-entropy) password w, while giving away as little information on w as possible - the adversary can exclude at most one possible password for each execution. We propose a solution in the bounded-quantum-storage model, where U and S may exchange qubits, and a dishonest party is assumed to have limited quantum memory. No other restriction is posed upon the adversary. An improved version of the proposed identification scheme is also secure against a man-in-the-middle attack, but requires U and S to additionally share a high-entropy key k. However, security is still guaranteed if one party loses k to the attacker but notices the loss. In both versions, the honest participants need no quantum memory, and noise and imperfect quantum sources can be tolerated. The schemes compose sequentially, and w and k can securely be re-used. A small modification to the identification scheme results in a quantum-key-distribution (QKD) scheme, secure in the bounded-quantum-storage model, with the same re-usability properties of the keys, and without assuming authenticated channels. This is in sharp contrast to known QKD schemes (with unbounded adversary) without authenticated channels, where authentication keys must be updated, and unsuccessful executions can cause the parties to run out of keys.
Serge Fehr and Stefan Berens
On the Conditional Rényi Entropy
In IEEE Transactions on Information Theory, volume 60, number 11, pages 6801-6810. IEEE, 2014.
Abstract:
The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy notions, like the min-entropy or the collision entropy. In contrast to the Shannon entropy, there seems to be no commonly accepted definition for the conditional Rényi entropy: several versions have been proposed and used in the literature.
In this work, we reconsider the definition for the conditional Rényi entropy of general order as proposed by Arimoto in 1975.
We show that this particular notion satisfies several natural properties. In particular, we show that it satisfies monotonicity under conditioning, meaning that conditioning can only reduce the entropy, and (a weak form of) chain rule, which implies that the decrease in entropy due to conditioning is bounded by the number of bits one conditions on. None of the other suggestions for the conditional Rényi entropy satisfies both these properties.
Finally, we show a natural interpretation of the conditional Rényi entropy in terms of (unconditional) Rényi divergence, and we show consistency with a recently proposed notion of conditional Rényi entropy in the quantum setting.
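Arimoto's notion is easy to evaluate on concrete distributions, which makes the two properties above checkable numerically. A small sketch (helper names are our own; orders α ≠ 1, entropies in bits):

```python
import math

def renyi_H(p, a):
    """(Unconditional) Rényi entropy of order a != 1, in bits."""
    return math.log2(sum(q ** a for q in p)) / (1 - a)

def arimoto_H(pxy, a):
    """Arimoto's conditional Rényi entropy H_a(X|Y), in bits, for a
    joint distribution given as pxy[x][y] and order a != 1."""
    inner = sum(
        sum(pxy[x][y] ** a for x in range(len(pxy))) ** (1 / a)
        for y in range(len(pxy[0]))
    )
    return (a / (1 - a)) * math.log2(inner)
```

For example, for the joint distribution pxy = [[0.4, 0.1], [0.1, 0.4]] and α = 2, one can check that H_2(X|Y) ≤ H_2(X) (monotonicity under conditioning) and H_2(X|Y) ≥ H_2(X) − log2|Y| (the weak chain rule).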
Harry Buhrman, Serge Fehr, and Christian Schaffner
On the Parallel Repetition of Multi-Player Games: The No-Signaling Case
In Theory of Quantum Computation, Communication, and Cryptography - TQC 2014, LIPICS, pages 24-35. Schloss Dagstuhl, 2014.
Abstract:
We consider the natural extension of two-player nonlocal games to an arbitrary number of players. An important question for such nonlocal games is their behavior under parallel repetition. For two-player nonlocal games, it is known that both the classical and the non-signaling value of any game converges to zero exponentially fast under parallel repetition, given that the game is non-trivial to start with (i.e., has classical/non-signaling value <1). Very recent results show similar behavior for the quantum value of a two-player game under parallel repetition. For nonlocal games with three or more players, very little is known up to present on their behavior under parallel repetition; this is true for the classical, the quantum and the non-signaling value.
In this work, we show a parallel repetition theorem for the non-signaling value of a large class of multi-player games, for an arbitrary number of players. Our result applies to all multi-player games for which all possible combinations of questions have positive probability; this class in particular includes all free games, in which the questions to the players are chosen independently. Specifically, we prove that if the original game has a non-signaling value smaller than 1, then the non-signaling value of the n-fold parallel repetition is exponentially small in n.
Our parallel repetition theorem for multi-player games is weaker than the known parallel repetition results for two-player games in that the rate at which the non-signaling value of the game decreases not only depends on the non-signaling value of the original game (and the number of possible responses), but on the complete description of the game. Nevertheless, we feel that our result is a first step towards a better understanding of the parallel repetition of nonlocal games with more than two players.
Martin Müller-Lennert, Frédéric Dupuis, Oleg Szehr, Serge Fehr, Marco Tomamichel
On quantum Rényi Entropies: A New Generalization and Some Properties
In Journal of Mathematical Physics, volume 54, 122203. AIP Publishing, 2013.
Abstract:
The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.
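The divergence underlying this generalization has since become known as the sandwiched Rényi divergence; as a sketch of the defining formulas (support conditions and the limits α → 1, ∞ are treated carefully in the paper):

```latex
\widetilde{D}_\alpha(\rho \,\|\, \sigma)
  = \frac{1}{\alpha - 1}
    \log \mathrm{Tr}\!\left[
      \left( \sigma^{\frac{1-\alpha}{2\alpha}} \, \rho \,
             \sigma^{\frac{1-\alpha}{2\alpha}} \right)^{\!\alpha}
    \right],
\qquad
\widetilde{H}_\alpha(A|B)_\rho
  = - \min_{\sigma_B} \widetilde{D}_\alpha
      \bigl( \rho_{AB} \,\|\, \mathbb{1}_A \otimes \sigma_B \bigr).
```

At α → 1 this recovers the von Neumann (conditional) entropy, and at α → ∞ the conditional min-entropy.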
Ronald Cramer, Serge Fehr, Carles Padró
Algebraic Manipulation Detection Codes
In SCIENCE CHINA Mathematics, volume 56, issue 7, pages 1349-1358. Science China Press, 2013.
Abstract:
Algebraic manipulation detection codes are a cryptographic primitive that was introduced by Cramer et al. (Eurocrypt 2008). It encompasses several methods that were previously used in cheater detection in secret sharing. Since its introduction, a number of additional applications have been found. This paper contains a detailed exposition of the known results about algebraic manipulation detection codes as well as some new results.
Marco Tomamichel, Serge Fehr, Jedrzej Kaniewski, Stephanie Wehner
One-Sided Device Independent QKD and Position-Based Cryptography from Monogamy Games
In Advances in Cryptology - EUROCRYPT 2013, volume 7881 of Lecture Notes in Computer Science, pages 609-625. Springer-Verlag, 2013.
Also (under a slightly different title): In New J. Phys. 15 (2013) 103002.
Abstract:
A serious concern with quantum key distribution (QKD) schemes is that, when under attack, the quantum devices in a real-life implementation may behave differently than modeled in the security proof. This can lead to real-life attacks against provably secure QKD schemes.
In this work, we show that the standard BB84 QKD scheme is one-sided device-independent. This means that security holds even if Bob's quantum device is arbitrarily malicious, as long as Alice's device behaves as it should.
Thus, we can completely remove the trust into Bob's quantum device for free, without the need for changing the scheme, and without the need for hard-to-implement loophole-free violations of Bell inequality, as is required for fully (meaning two-sided) device-independent QKD.
For our analysis, we introduce a new quantum game, called a monogamy-of-entanglement game, and we show a strong parallel repetition theorem for this game.
This new notion is likely to be of independent interest and to find additional applications. Indeed, besides the application to QKD, we also show a direct application to position-based quantum cryptography: we give the first security proof for a one-round position-verification scheme that requires only single-qubit operations.
Serge Fehr, Ran Gelles, Christian Schaffner
Security and Composability of Randomness Expansion from Bell Inequalities
In Physical Review A, volume 87, issue 1, pages 012335. APS, 2013.
Abstract:
The nonlocal behavior of quantum mechanics can be used to generate guaranteed fresh randomness from an untrusted device that consists of two nonsignalling components; since the generation process requires some initial fresh randomness to act as a catalyst, one also speaks of randomness expansion.
Colbeck and Kent proposed the first method for generating randomness from untrusted devices, however, without providing a rigorous analysis. This was addressed subsequently by Pironio et al. [Nature 464 (2010)], who aimed at deriving a lower bound on the min-entropy of the data extracted from an untrusted device, based only on the observed non-local behavior of the device.
Although that article succeeded in developing important tools towards the desired goal, it failed in putting the tools together in a rigorous and correct way, and the given formal claim on the guaranteed amount of min-entropy needs to be revisited.
In this paper we show how to combine the tools provided by Pironio et al. so as to obtain a meaningful and correct lower bound on the min-entropy of the data produced by an untrusted device, based on the observed non-local behavior of the device. Our main result confirms the essence of the improperly formulated claims of Pironio et al., and puts them on solid ground.
We also address the question of composability and show that different untrusted devices can be composed in an alternating manner under the assumption that they are not entangled. This enables superpolynomial randomness expansion based on two untrusted yet unentangled devices.
Serge Fehr, Jonathan Katz, Fang Song, Hong-Sheng Zhou, Vassilis Zikas
Feasibility and Completeness of Cryptographic Tasks in the Quantum World
In Theory of Cryptography Conference - TCC 2013, volume 7785 of Lecture Notes in Computer Science, pages 281-296. Springer Verlag, 2013.
Abstract:
It is known that cryptographic feasibility results can change by moving from the classical to the quantum world. With this in mind, we study the feasibility of realizing functionalities in the framework of universal composability, with respect to both computational and information-theoretic security. With respect to computational security, we show that existing feasibility results carry over unchanged from the classical to the quantum world; a functionality is "trivial" (i.e., can be realized without setup) in the quantum world if and only if it is trivial in the classical world. The same holds with regard to functionalities that are complete (i.e., can be used to realize arbitrary other functionalities).
In the information-theoretic setting, the quantum and classical worlds differ. In the quantum world, functionalities in the class we consider are either complete, trivial, or belong to a family of simultaneous-exchange functionalities (e.g., XOR). However, other results in the information-theoretic setting remain roughly unchanged.
Harry Buhrman, Serge Fehr, Christian Schaffner, and Florian Speelman
The Garden-Hose Model
In Innovations in Theoretical Computer Science - ITCS 2013, pages 145-158. ACM, 2013.
Also: Accepted to QIP 2012.
Abstract:
We define a new model of communication complexity, called the garden-hose model. Informally, the garden-hose complexity of a function f: {0,1}^n × {0,1}^n -> {0,1} is given by the minimal number of water pipes that need to be shared between two parties, Alice and Bob, in order for them to compute the function f as follows: Alice connects her ends of the pipes in a way that is determined solely by her input x ∈ {0,1}^n and, similarly, Bob connects his ends of the pipes in a way that is determined solely by his input y ∈ {0,1}^n. Alice turns on the water tap that she also connected to one of the pipes. Then, the water comes out on Alice's or Bob's side depending on the function value f(x,y).
We prove almost-linear lower bounds on the garden-hose complexity for concrete functions like inner product, majority, and equality, and we show the existence of functions with exponential garden-hose complexity. Furthermore, we show a connection to classical complexity theory by proving that all functions computable in log-space have polynomial garden-hose complexity.
We consider a randomized variant of the garden-hose complexity, where Alice and Bob hold pre-shared randomness, and a quantum variant, where Alice and Bob hold pre-shared quantum entanglement, and we show that the randomized garden-hose complexity is within a polynomial factor of the deterministic garden-hose complexity. Examples of (partial) functions are given where the quantum garden-hose complexity is logarithmic in n while the classical garden-hose complexity can be lower bounded by n^c for constant c>0.
Finally, we show an interesting connection between the garden-hose model and the (in)security of a certain class of quantum position-verification schemes.
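The mechanics of the model are easy to simulate. A minimal sketch (function and variable names are ours), together with a small 3-pipe protocol for 1-bit equality that illustrates the rules:

```python
def garden_hose(tap_pipe, alice_pairs, bob_pairs):
    """Simulate the water flow.  alice_pairs and bob_pairs are
    symmetric dicts pairing up pipe ends on the respective side;
    the tap feeds into tap_pipe on Alice's side.  Returns the side
    ('Alice' or 'Bob') on which the water spills out."""
    pipe, side = tap_pipe, 'Bob'   # water traverses tap_pipe to Bob's side
    pairs = {'Alice': alice_pairs, 'Bob': bob_pairs}
    while pipe in pairs[side]:
        pipe = pairs[side][pipe]                     # through the connector
        side = 'Alice' if side == 'Bob' else 'Bob'   # back across
    return side

def equality_protocol(x, y):
    # 3-pipe protocol for 1-bit equality: Alice attaches the tap to
    # pipe 1+x; Bob connects pipe 1+y to pipe 3.  The water exits on
    # Alice's side iff x == y.
    return garden_hose(1 + x, {}, {1 + y: 3, 3: 1 + y})
```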
Eli Ben-Sasson, Serge Fehr, and Rafail Ostrovsky
Near-Linear Unconditionally-Secure Multiparty Computation with a Dishonest Minority
In Advances in Cryptology - CRYPTO 2012, volume 7417 of Lecture Notes in Computer Science, pages 663-680 . Springer-Verlag, 2012.
Abstract:
In the setting of unconditionally-secure MPC, where dishonest players are unbounded and no cryptographic assumptions are used, it was known since the 1980's that an honest majority of players is both necessary and sufficient to achieve privacy and correctness, assuming secure point-to-point and broadcast channels. The main open question that was left is to establish the exact communication complexity.
We settle the above question by showing an unconditionally-secure MPC protocol, secure against a dishonest minority of malicious players, that matches the communication complexity of the best known MPC protocol in the honest-but-curious setting. More specifically, we present a new n-player MPC protocol that is secure against a computationally-unbounded malicious adversary that can adaptively corrupt up to t < n/2 of the players. For polynomially-large binary circuits that are not too unshaped, our protocol has an amortized communication complexity of O(n log n + k/n^c) bits per multiplication (i.e. AND) gate, where k denotes the security parameter and c is an arbitrary non-negative constant. This improves on the previously most efficient protocol with the same security guarantee, which offers an amortized communication complexity of O(n^2 k) bits per multiplication gate. For any k polynomial in n, the amortized communication complexity of our protocol matches the O(n log n) bit communication complexity of the best known MPC protocol with passive security.
We introduce several novel techniques that are of independent interest and that we believe will have wider applicability. One is a novel idea of computing authentication tags by means of a mini MPC, which allows us to avoid expensive double-sharings; the other is a batch-wise multiplication verification that allows us to speed up Beaver's "multiplication triples".
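The "multiplication triples" mentioned at the end go back to Beaver's classic trick, which the following minimal sketch illustrates (a generic textbook version with plain additive shares over an illustrative prime field, not the authenticated sharings of the actual protocol):

```python
import random

P = 2**31 - 1  # prime modulus; all parameters are illustrative

def share(v, n=3):
    """Additively secret-share v among n parties, modulo P."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    return parts + [(v - sum(parts)) % P]

def open_(shares):
    """Publicly reconstruct a shared value."""
    return sum(shares) % P

def beaver_mul(xs, ys, a_s, bs, cs):
    """Given shares of x, y and of a precomputed triple (a, b, c=ab),
    produce shares of x*y.  Only d = x-a and e = y-b are made public;
    they leak nothing since a and b are uniformly random."""
    d = open_([(x - a) % P for x, a in zip(xs, a_s)])
    e = open_([(y - b) % P for y, b in zip(ys, bs)])
    # x*y = (d+a)(e+b) = d*e + d*b + e*a + c, computed share-wise;
    # the public d*e term is added by party 0 only.
    return [(c + d * b + e * a + (d * e if i == 0 else 0)) % P
            for i, (a, b, c) in enumerate(zip(a_s, bs, cs))]
```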
Niek Bouman, Serge Fehr, Carlos Gonzalez-Guillen, and Christian Schaffner
An All-But-One Entropic Uncertainty Relation, and Application to Password-based Identification
In Theory of Quantum Computation, Communication, and Cryptography - TQC 2012. Lecture Notes in Computer Science. Springer-Verlag, 2012.
Also: Accepted to QCRYPT 2011.
Abstract:
Entropic uncertainty relations are quantitative characterizations of Heisenberg's uncertainty principle, which make use of an entropy measure to quantify uncertainty.
In quantum cryptography, they are often used as convenient tools in security proofs.
We propose a new entropic uncertainty relation. It is the first such uncertainty relation that lower bounds the uncertainty of all but one measurement outcome with respect to an arbitrary (and in particular an arbitrarily large) set of possible measurements, and, at the same time, uses the min-entropy as entropy measure, rather than the Shannon entropy. This makes it especially suited for quantum cryptography.
As application, we propose a new quantum identification scheme
in the bounded quantum storage model. It makes use of our new
uncertainty relation at the core of its security proof. In contrast to
the original quantum identification scheme proposed by Damgård
et al., our new scheme also offers some security in case the
bounded-quantum-storage assumption fails to hold. Specifically, our
scheme remains secure against an adversary that has unbounded storage
capabilities but is restricted to single-qubit operations. The scheme
by Damgård et al., on the other hand, completely breaks down under
such an attack.
Alfonso Cevallos, Serge Fehr,
Rafail Ostrovsky, and
Yuval Rabani
Unconditionally-Secure Robust Secret Sharing with Compact Shares
In Advances in Cryptology - EUROCRYPT 2012, volume 7237 of Lecture Notes in Computer Science, pages 195-208. Springer-Verlag, 2012.
Abstract:
We consider the problem of reconstructing a shared secret in the presence of faulty shares, with unconditional security. We require that any t shares give no information on the shared secret, and reconstruction is possible even if up to t out of the n shares are incorrect. The interesting setting is n/3 <= t < n/2, where reconstruction of a shared secret in the
presence of faulty shares is possible, but only with an increase in the share size, and only if one admits a small failure probability. The goal of this work is to minimize this overhead in the share size. Known schemes either have a Omega(k n)-overhead in share size, where k is the security parameter, or they have a close-to-optimal overhead of order O(k + n) but have an exponential running time (in n).
In this paper, we propose a new scheme that has a close-to-optimal overhead in the share size of order O(k + n log(n)), and a polynomial running time. Interestingly, the shares in our new scheme are prepared in the very same way as in the well-known scheme by Rabin and Ben-Or, which relies on message authentication, but we use a message authentication code with short tags and keys and with correspondingly weak security. The short tags and keys give us the required saving in the share size. Surprisingly, we can compensate for the weakened security of the authentication and achieve an exponentially small (in k) failure probability by means of a more sophisticated reconstruction procedure.
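To illustrate the kind of short-tag authentication the scheme builds on, here is a minimal sketch of a one-time polynomial MAC over a small prime field (the field size and names are illustrative, not the paper's exact parameters): tags and keys are single field elements, so the MAC is correspondingly weak, exactly the trade-off the abstract describes.

```python
Q = 257  # small prime: one-element tags and keys, so forgery succeeds with prob. about len(m)/Q

def mac_tag(key, chunks):
    """One-time polynomial MAC: the message is a list of chunks in F_Q,
    the key is (a, b), and the tag is b + sum_i chunks[i] * a^(i+1) mod Q."""
    a, b = key
    return (b + sum(c * pow(a, i + 1, Q) for i, c in enumerate(chunks))) % Q

def mac_verify(key, chunks, tag):
    return mac_tag(key, chunks) == tag
```

For two fixed distinct messages, the tags can only collide for the (at most deg-many) roots of the difference polynomial in a, which is what makes the small failure probability quantifiable.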
Harry Buhrman,
Nishanth Chandran,
Serge Fehr,
Ran Gelles,
Vipul Goyal,
Rafail Ostrovsky, and
Christian Schaffner
Position-Based Quantum Cryptography: Impossibility and Constructions
In Advances in Cryptology - CRYPTO 2011, volume 6841 of Lecture Notes in Computer Science, pages 429-446. Springer-Verlag, 2011.
Also: in SIAM Journal on Computing (SICOMP), volume 43, issue 1.
Also: Accepted to QIP 2011 as plenary talk.
Abstract:
In this work, we study position-based cryptography in the quantum setting. The aim is to use the geographical position of a party as its only credential. On the negative side, we show that if adversaries are allowed to share an arbitrarily large entangled quantum state, no secure position-verification is possible at all. We show a distributed protocol for computing any unitary operation on a state shared between the different users, using local operations and one round of classical communication. Using this surprising result, we break any position-verification scheme of a very general form. On the positive side, we show that if adversaries do not share any entangled quantum state but can compute arbitrary quantum operations, secure position-verification is achievable. Jointly, these results suggest the interesting question whether secure position-verification is possible in case of a bounded amount of entanglement. Our positive result can be interpreted as resolving this question in the simplest case, where
the bound is set to zero.
In models where secure positioning is achievable, it has a number of interesting applications. For example, it enables secure communication over an insecure channel without having any pre-shared key, with the guarantee that only a party at a specific location can learn the content of the conversation. More generally, we show that in settings where secure position-verification is achievable, other position-based cryptographic schemes are possible as well, such as secure position-based authentication and position-based key agreement.
Serge Fehr (Ed.)
5th International Conference on Information Theoretic Security - ICITS 2011
Proceedings, volume 6673 of Lecture Notes in Computer Science. Springer-Verlag, 2011.
Niek Bouman, and Serge Fehr
Secure Authentication from a Weak Key, Without Leaking Information
In Advances in Cryptology - EUROCRYPT 2011, volume 6632 of Lecture Notes in Computer Science, pages 246-265. Springer-Verlag, 2011.
Abstract:
We study the problem of authentication based on a weak key in the information-theoretic setting. A key is weak if its min-entropy is an arbitrarily small fraction of its bit length. This problem has recently received considerable attention, with different solutions optimizing different parameters. We study the problem in an extended setting, where the weak key serves as a one-time session key that is derived from a public source of randomness with the help of a (potentially also weak) long-term key. Our goal now is to authenticate a message by means of the weak session key in such a way that (nearly) no information on the long-term key is leaked. Ensuring privacy of the long-term key is vital for the long-term key to be re-usable. Previous work has not considered such a privacy issue, and previous solutions do not seem to satisfy this requirement. We show the existence of a practical four-round protocol that provides message authentication from a weak session key and that avoids non-negligible leakage on the long-
term key. The security of our scheme also holds in the quantum setting where the adversary may have limited quantum side information on the weak session key. As an application of our scheme, we show the existence of an identification scheme in the bounded quantum storage model that is secure against a man-in-the-middle attack and that is truly password-based: it does not need any high-entropy key, in contrast to the scheme proposed by Damgård et al.
Niek Bouman, and Serge Fehr
Sampling in a Quantum Population, and Applications
In Advances in Cryptology - CRYPTO 2010, volume 6223 of Lecture Notes in Computer Science, pages 724-741. Springer-Verlag, 2010.
Abstract:
We propose a framework for analyzing classical sampling strategies for estimating the Hamming weight of a large string from a few sample positions, when applied to a multi-qubit quantum system instead. The framework shows how to interpret the result of such a strategy and how to define its accuracy when applied to a quantum system. Furthermore, we show how the accuracy of any strategy in the quantum setting relates to its accuracy in the classical setting, which is well understood for the important examples.
We show the usefulness of our framework by using it to obtain new and simple security proofs for the following quantum-cryptographic schemes: BB84 quantum-key-distribution, and quantum oblivious-transfer from bit-commitment.
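As a point of reference, the classical sampling strategy being generalized is easy to state in code. The following sketch (function names are ours) estimates the relative Hamming weight of a bit string from k uniformly sampled positions:

```python
import random

def relative_hamming_weight(bits):
    """Fraction of 1s in the whole string."""
    return sum(bits) / len(bits)

def sample_estimate(bits, k, rng=random):
    """Classical sampling strategy: read k uniformly random positions (without
    replacement) and use the empirical frequency of 1s as an estimate of the
    relative Hamming weight of the whole string."""
    positions = rng.sample(range(len(bits)), k)
    return sum(bits[i] for i in positions) / k
```

Classically, Hoeffding-type bounds make the estimate's accuracy precise; the paper's framework is about what the very same strategy guarantees when the sampled positions are qubits of a quantum state.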
Serge Fehr, Dennis Hofheinz, Eike Kiltz, and Hoeteck Wee
Encryption Schemes Secure Against Chosen-Ciphertext Selective Opening Attacks
In Advances in Cryptology - EUROCRYPT 2010, volume 6110 of Lecture Notes in Computer Science, pages 381-402. Springer-Verlag, 2010.
Abstract:
Imagine many small devices send data to a single receiver, encrypted
using the receiver's public key. Assume an adversary that has the
power to adaptively corrupt a subset of these devices. Given the
information obtained from these corruptions, do the ciphertexts from
uncorrupted devices remain secure?
Recent results suggest that conventional security notions for
encryption schemes (like IND-CCA security) do not
suffice in
this setting. To fill this gap, the notion of security against
selective-opening attacks (SOA security) has been introduced. It
has been shown that lossy encryption implies SOA security against a
passive, i.e., only eavesdropping and corrupting, adversary
(SO-CPA). However, the known results on SOA security against an
active adversary (SO-CCA) are rather limited. Namely, while
there exist feasibility results, the (time and space) complexity of
currently known SO-CCA secure schemes depends on the number of
devices in the setting above.
In this contribution, we devise a new solution to the selective
opening problem that does not build on lossy encryption. Instead, we
combine techniques from non-committing encryption and hash proof
systems with a new technique (dubbed ``cross-authentication codes'')
to glue several ciphertext parts together. The result is a rather
practical
SO-CCA secure public-key encryption scheme that does not suffer from
the efficiency drawbacks of known schemes. Since we build upon hash
proof systems, our scheme can be instantiated using standard
number-theoretic assumptions such as decisional Diffie-Hellman
(DDH), decisional composite residuosity (DCR), and quadratic
residuosity (QR). Besides, we construct a conceptually very simple
and comparatively efficient SO-CPA secure scheme from (slightly
enhanced) trapdoor one-way permutations.
We stress that our schemes are completely independent of the number
of challenge ciphertexts, and we do not make assumptions about the
underlying message distribution (beyond being efficiently
samplable). In particular, we do not assume efficient conditional
re-samplability of the message distribution. Hence, our schemes are
secure in arbitrary settings, even if it is not known in
advance how many ciphertexts might be considered for corruptions.
Serge Fehr
Quantum Cryptography
In Foundations of Physics, volume 40, number 5, pages 494-531. Springer-Verlag, 2010.
Abstract:
Quantum cryptography makes use of the quantum-mechanical behavior of nature for the design and analysis of cryptographic schemes. Optimally (but not always), quantum cryptography allows for the design of cryptographic schemes whose security is guaranteed solely by the laws of nature. This is in sharp contrast to standard cryptographic schemes, which can be broken in principle, i.e., when given sufficient computing power. From a theory point of view, quantum cryptography offers a beautiful interplay between the mathematics of adversarial behavior and quantum information theory.
In this review article, we discuss the traditional application of quantum cryptography, quantum key distribution (QKD), from a modern perspective, and we discuss some recent developments in the context of quantum two-party cooperation (2PC). QKD allows two distant parties to communicate in a provably-secure way in the presence of
an outside eavesdropper, whereas 2PC is concerned with protecting information against possibly malicious insiders. We show the basic idea of constructing quantum cryptographic schemes, but we also show some connections to quantum information theory as needed for the rigorous security analyses, and we discuss some of the relevant quantum-information-theoretic results.
Ivan Damgård, Serge Fehr, Carolin Lunemann, Louis Salvail, and Christian Schaffner
Improving the Security of Quantum Protocols via Commit-and-Open
In Advances in Cryptology - CRYPTO '09, volume 5677 of Lecture Notes in Computer Science, pages 408-427. Springer-Verlag, 2009.
Also: Accepted to QIP 2010.
Abstract:
We consider two-party quantum protocols starting with a transmission of some random BB84 qubits followed by classical messages. We show a general "compiler" improving the security of such protocols: if the original protocol is secure against an "almost honest" adversary, then the compiled protocol is secure against an arbitrary computationally bounded (quantum) adversary. The compilation preserves the number of qubits sent and the number of rounds up to a constant factor. The compiler also preserves security in the bounded-quantum-storage model (BQSM), so if the original protocol was BQSM-secure, the compiled protocol can only be broken by an adversary who has large quantum memory and large computing power. This is in contrast to known BQSM-secure protocols, where security breaks down completely if the adversary has larger quantum memory than expected. We show how our technique can be applied to quantum identification and oblivious transfer protocols.
Serge Fehr, and Christian Schaffner
Composing Quantum Protocols in a Classical Environment
In Theory of Cryptography Conference - TCC '09, volume 5444 of Lecture Notes in Computer Science, pages 350-367. Springer-Verlag, 2009.
Abstract:
We propose a general security definition for cryptographic quantum
protocols that implement classical non-reactive two-party tasks. The
definition is expressed in terms of simple quantum-information-
theoretic conditions which must be satisfied by the protocol to be
secure. The conditions are uniquely determined by the ideal
functionality F defining the cryptographic task to be implemented.
We then show the following composition result. If quantum protocols
pi_1,...,pi_l securely implement ideal functionalities F_1,...,F_l
according to our security definition, then any purely classical
two-party protocol, which makes sequential calls to F_1,...,F_l,
is equally secure as the protocol obtained by replacing the calls
to F_1,...,F_l with the respective quantum protocols pi_1,...,pi_l.
Hence, our approach yields the minimal security requirements which
are strong enough for the typical use of quantum protocols as
subroutines within larger classical schemes. Finally, we show that
recently proposed quantum protocols for secure identification and
oblivious transfer in the bounded-quantum-storage model satisfy our
security definition, and thus compose in the above sense.
Alexandra Boldyreva, Serge Fehr, and Adam O'Neill
On Notions of Security for Deterministic Encryption, and Efficient Constructions Without Random Oracles
In Advances in Cryptology - CRYPTO '08, volume 5157 of Lecture Notes in Computer Science, pages 335-359. Springer-Verlag, 2008.
Abstract:
The study of deterministic public-key encryption was initiated by Bellare et al. (CRYPTO '07), who provided the "strongest possible" notion of security for this primitive (called PRIV) and constructions in the random oracle (RO) model. We focus on constructing efficient deterministic encryption schemes without random oracles. To do so, we propose a slightly weaker notion of security, saying that no partial information about encrypted messages should be leaked as long as each message is a-priori hard-to-guess given the others (while PRIV did not have the latter restriction). Nevertheless, we argue that this version seems adequate for many practical applications. We show equivalence of this definition to single-message and indistinguishability-based ones, which are easier to work with. Then we give general constructions of both chosen-plaintext (CPA) and chosen-ciphertext-attack (CCA) secure deterministic encryption schemes, as well as efficient instantiations of them under standard number-theoretic assumptions. Our constructions build on the recently-introduced framework of Peikert and Waters (STOC '08) for constructing CCA-secure probabilistic encryption schemes, extending it to the deterministic-encryption setting as well.
Ronald Cramer, Yevgeniy Dodis, Serge Fehr, Carles Padró, and Daniel Wichs
Detection of Algebraic Manipulation with Applications to Robust Secret Sharing and Fuzzy Extractors
In Advances in Cryptology - EUROCRYPT '08, volume 4965 of Lecture Notes in Computer Science, pages 471-488. Springer-Verlag, 2008.
Abstract:
Consider an abstract storage device Σ(G) that can hold a single element
x from a fixed, publicly known finite group G. Storage is private in the
sense that an adversary does not have read access to Σ(G) at all. However, Σ(G)
is non-robust in the sense that the adversary can modify its contents by adding
some offset Δ ∈ G. Due to the privacy of the storage device, the value Δ can only
depend on an adversary's a priori knowledge of x. We introduce a new primitive
called an algebraic manipulation detection (AMD) code, which encodes a source
s into a value x stored on Σ(G) so that any tampering by an adversary will be detected.
We give a nearly optimal construction of AMD codes, which can flexibly
accommodate arbitrary choices for the length of the source s and security level.
We use this construction in two applications:
- We show how to efficiently convert any linear secret sharing scheme into a
robust secret sharing scheme, which ensures that no unqualified subset of
players can modify their shares and cause the reconstruction of some value
s' ≠ s.
- We show how to build nearly optimal robust fuzzy extractors for several natural
metrics. Robust fuzzy extractors enable one to reliably extract and later
recover random keys from noisy and non-uniform secrets, such as biometrics,
by relying only on non-robust public storage. In the past, such constructions
were known only in the random oracle model, or required the entropy
rate of the secret to be greater than half. Our construction relies on a randomly
chosen common reference string (CRS) available to all parties.
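For intuition, here is a toy sketch of an AMD-style encoding over a small prime field (illustrative parameters and names; the actual construction works over a large field, where the roughly (d+1)/|F| probability of an undetected manipulation becomes negligible, and requires that the characteristic does not divide d+2):

```python
import random

Q = 101  # toy prime field for illustration; a real instantiation uses a large field

def amd_encode(s):
    """Encode s in F_Q^d as (s, x, f(x, s)) with fresh random x, where
    f(x, s) = x^(d+2) + sum_i s[i] * x^(i+1)  (a sketch of the AMD-code idea)."""
    d = len(s)
    x = random.randrange(Q)
    tag = (pow(x, d + 2, Q) + sum(si * pow(x, i + 1, Q) for i, si in enumerate(s))) % Q
    return (list(s), x, tag)

def amd_verify(word):
    s, x, tag = word
    d = len(s)
    return tag == (pow(x, d + 2, Q) + sum(si * pow(x, i + 1, Q) for i, si in enumerate(s))) % Q

def add_offset(word, ds, dx, dtag):
    """Algebraic manipulation: the adversary blindly adds a fixed offset to every component."""
    s, x, tag = word
    return ([(si + di) % Q for si, di in zip(s, ds)], (x + dx) % Q, (tag + dtag) % Q)
```

A fixed nonzero offset goes undetected only when the random x happens to be a root of a low-degree difference polynomial, so detection fails for at most d+1 of the Q possible values of x.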
Ivan Damgård, Serge Fehr, Louis Salvail, and Christian Schaffner
Cryptography in the Bounded Quantum-Storage Model
In SIAM Journal on Computing, 37(6): 1865-1890, 2008.
Abstract:
We initiate the study of two-party cryptographic primitives with
unconditional security, assuming that the adversary's {\em quantum}
memory is of bounded size. We show that oblivious transfer and bit
commitment can be implemented in this model using protocols where
honest parties need no quantum memory, whereas an adversarial player
needs quantum memory of size at least $n/2$ in order to break the
protocol, where $n$ is the number of qubits transmitted. This is in
sharp contrast to the classical bounded-memory model, where we can
only tolerate adversaries with memory of size quadratic in honest
players' memory size. Our protocols are efficient, non-interactive
and can be implemented using today's technology. On the technical
side, a new entropic uncertainty relation involving min-entropy is
established.
Serge Fehr, and Christian Schaffner
Randomness Extraction via Delta-Biased Masking in the Presence of a Quantum Attacker
In Theory of Cryptography Conference - TCC '08, volume 4948 of Lecture Notes in Computer Science, pages 465-481. Springer-Verlag, 2008.
Abstract:
Randomness extraction is of fundamental importance for information-theoretic
cryptography. It allows one to transform a raw key about which an attacker has some
limited knowledge into a fully secure random key, on which the attacker has
essentially no information.
We show a new randomness-extraction technique which also works in the case where the
attacker has quantum information on the raw key. Randomness extraction is done by
xor'ing a so-called $\delta$-biased mask to the raw key. To date, only very
few techniques are known to work against a quantum attacker, much in contrast to
the classical (non-quantum) setting, which is much better understood and for which
a vast amount of different techniques for randomness extraction are known.
We show two applications of the new result. We first show how to encrypt a long
message with a short key, information-theoretically secure against a quantum
attacker, provided that the attacker has enough quantum uncertainty on the message.
This generalizes the concept of entropically-secure encryption to the case of a
quantum attacker.
And, as a second application, we show how the new randomness-extraction technique
allows one to do error-correction without leaking partial information to a quantum
attacker. Such a technique is useful in settings where the raw key may contain
errors, since standard error-correction techniques may provide the attacker with
information on, say, a secret key that was used to obtain the raw key.
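To make the masking step concrete, here is a toy sketch in which the mask bits come from an AGHP-style "powering" generator over GF(2^8) (an assumed, illustrative instantiation of a $\delta$-biased family; the field size, reduction polynomial, and names are ours, and the bias bound is not re-derived here), and extraction is just an XOR:

```python
def gf_mul(a, b, poly=0x11B, m=8):
    """Carry-less multiplication in GF(2^8) modulo the polynomial x^8+x^4+x^3+x+1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << m):
            a ^= poly
    return r

def small_bias_mask(x, y, n):
    """n mask bits r_i = <x^i, y>: the GF(2) inner product of the bit strings of
    x^i (computed in GF(2^8)) and y -- an AGHP-style powering generator."""
    bits, cur = [], 1
    for _ in range(n):
        cur = gf_mul(cur, x)
        bits.append(bin(cur & y).count("1") & 1)
    return bits

def xor_mask(key_bits, mask_bits):
    """The extraction step itself: xor the delta-biased mask onto the raw key."""
    return [k ^ r for k, r in zip(key_bits, mask_bits)]
```

The paper's result is the security statement, namely that this XOR yields a key on which even a quantum attacker has essentially no information; the code above only shows the mechanics of the masking.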
Ivan Damgård, Serge Fehr, Louis Salvail, and Christian Schaffner
Secure Identification and QKD in the Bounded-Quantum-Storage Model
In Advances in Cryptology - CRYPTO '07, volume 4622 of Lecture Notes in Computer Science, pages 342-359. Springer-Verlag, 2007.
Also: Accepted to QIP 2008.
Abstract:
We consider the problem of secure identification: user U proves to
server S that he knows an agreed (possibly low-entropy) password w,
while giving away as little information on w as possible, namely
the adversary can exclude at most one possible password for each
execution of the scheme. We propose a solution in the bounded-
quantum-storage model, where U and S may exchange qubits, and a
dishonest party is assumed to have limited quantum memory. No
other restriction is posed upon the adversary.
An improved version of the proposed identification scheme is also
secure against a man-in-the-middle attack, but requires U and S to
additionally share a high-entropy key k. However, security is still
guaranteed if one party loses k to the attacker but notices the
loss. In both versions of the scheme, the honest participants need
no quantum memory, and noise and imperfect quantum sources can be
tolerated. The schemes compose sequentially, and w and k can
securely be re-used.
A small modification to the identification scheme results in a
quantum-key-distribution (QKD) scheme, secure in the bounded-
quantum-storage model, with the same re-usability properties of the
keys, and without assuming authenticated channels. This is in sharp
contrast to known QKD schemes (with unbounded adversary) without
authenticated channels, where authentication keys must be updated,
and unsuccessful executions can cause the parties to run out of keys.
Ivan Damgård, Serge Fehr, Renato Renner, Louis Salvail, and Christian Schaffner
A Tight High-Order Entropic Quantum Uncertainty Relation With Applications
In Advances in Cryptology - CRYPTO '07, volume 4622 of Lecture Notes in Computer Science, pages 360-378. Springer-Verlag, 2007.
Also: Accepted to QIP 2008.
Masayuki Abe, and Serge Fehr
Perfect NIZK with Adaptive Soundness
In Theory of Cryptography Conference - TCC '07, volume 4392 of Lecture Notes in Computer Science, pages 118-136. Springer-Verlag, 2007.
Abstract:
This paper presents a very simple and efficient adaptively-sound
perfect NIZK argument system for any NP-language. In contrast to
recently proposed schemes by Groth, Ostrovsky and Sahai, our
scheme does not pose any restriction on the statements to be
proven. Besides, it enjoys a number of desirable properties: it
allows one to re-use the common reference string (CRS), it can handle
arithmetic circuits, and the CRS can be set-up very efficiently
without the need for an honest party. We then show an application
of our techniques in constructing efficient NIZK schemes for
proving arithmetic relations among committed secrets, whereas
previous methods required expensive generic NP-reductions.
The security of the proposed schemes is based on a strong
non-standard assumption, an extended version of the so-called
Knowledge-of-Exponent Assumption (KEA) over bilinear groups. We
give some justification for using such an assumption by showing
that the commonly-used approach for proving NIZK arguments sound
does not allow for adaptively-sound statistical NIZK arguments
(unless NP is in P/poly). Furthermore, we show that the assumption
used in our construction holds with respect to generic adversaries
that do not exploit the specific representation of the group
elements. We also discuss how to avoid the non-standard assumption
in a pre-processing model.
Ivan Damgård, Serge Fehr, Louis Salvail, and Christian Schaffner
Oblivious Transfer and Linear Functions
In Advances in Cryptology - CRYPTO '06, volume 4117 of Lecture Notes in Computer Science, pages 427-444. Springer-Verlag, 2006.
Abstract:
We study unconditionally secure 1-out-of-2 Oblivious Transfer
(1-2 OT). We first point out that a standard security
requirement for 1-2 OT of bits, namely that the receiver only
learns one of the bits sent, holds if and only if the receiver
has no information on the XOR of the two bits. We then
generalize this to 1-2 OT of strings and show that the security
can be characterized in terms of binary linear functions.
More precisely, we show that the receiver learns only one of
the two strings sent if and only if he has no information on
the result of applying any binary linear function (which
non-trivially depends on both inputs) to the two strings.
We then argue that this result not only gives new insight into
the nature of 1-2 OT, but it in particular provides a very
powerful tool for analyzing 1-2 OT protocols. We demonstrate
this by showing that with our characterization at hand, the
reducibility of 1-2 OT (of strings) to a wide range of weaker
primitives follows by a very simple argument. This is in sharp
contrast to previous literature, where reductions of 1-2 OT to
weaker flavors have rather complicated and sometimes even
incorrect proofs.
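The bit-OT observation in the first paragraph can be checked mechanically: knowing one of the two bits together with their XOR determines the other, which is why "no information on the XOR" is exactly the right secrecy condition. A two-line illustration (function name is ours):

```python
def other_bit(known_bit, xor_value):
    """If a 1-2 OT receiver learns one of the two bits *and* b0 XOR b1,
    the remaining bit is determined; hence any information on the XOR
    translates into information on the unchosen bit."""
    return known_bit ^ xor_value
```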
Ivan Damgård, Serge Fehr, Louis Salvail, and Christian Schaffner
Cryptography in the Bounded Quantum-Storage Model
In 46th Symposium on Foundations of Computer Science (FOCS), pages 449-458, 2005.
Also: Invited talk at QIP 2006.
Abstract:
We initiate the study of two-party cryptographic primitives with
unconditional security, assuming that the adversary's {\em quantum}
memory is of bounded size. We show that oblivious transfer and bit
commitment can be implemented in this model using protocols where
honest parties need no quantum memory, whereas an adversarial player
needs quantum memory of size at least $n/2$ in order to break the
protocol, where $n$ is the number of qubits transmitted. This is in
sharp contrast to the classical bounded-memory model, where we can
only tolerate adversaries with memory of size quadratic in honest
players' memory size. Our protocols are efficient, non-interactive
and can be implemented using today's technology. On the technical
side, a new entropic uncertainty relation involving min-entropy is
established.
Ronald Cramer, Serge Fehr, and Martijn Stam
Black-Box Secret Sharing from Primitive Sets in Algebraic Number Fields
In Advances in Cryptology - CRYPTO '05, volume 3621 of Lecture Notes in Computer Science, pages 344-360. Springer-Verlag, 2005.
Abstract:
A {\em black-box} secret sharing scheme (BBSSS) for a given access
structure works in exactly the same way over any finite Abelian
group, as it only requires black-box access to group operations and
to random group elements. In particular, there is no dependence on
e.g.\ the structure of the group or its order. The expansion factor
of a BBSSS is the length of a vector of shares (the number of group
elements in it) divided by the number of players $n$.
At CRYPTO 2002 Cramer and Fehr proposed a threshold BBSSS with an
asymptotically minimal expansion factor $\Theta(\log n)$.
In this paper we propose a BBSSS that is based on a new paradigm,
namely, {\em primitive sets in algebraic number fields}. This leads
to a new BBSSS with an expansion factor that is absolutely minimal
up to an additive term of at most~2, which is an improvement by a
constant additive factor.
We provide good evidence that our scheme is considerably more
efficient in terms of the computational resources it requires.
Indeed, the number of group operations to be performed is
$\tilde{O}(n^2)$ instead of $\tilde{O}(n^3)$ for sharing and
$\tilde{O}(n^{1.6})$ instead of $\tilde{O}(n^{2.6})$ for
reconstruction.
Finally, we show that our scheme, as well as that of Cramer and
Fehr, has asymptotically optimal randomness efficiency.
Masayuki Abe, and Serge Fehr
Adaptively Secure Feldman VSS and Applications to Universally-Composable Threshold Cryptography
In Advances in Cryptology - CRYPTO '04, volume 3152 of Lecture Notes in Computer Science, pages 317-334. Springer-Verlag, 2004.
Abstract:
We propose the first distributed discrete-log key generation (DLKG) protocol from scratch which
is adaptively-secure in the non-erasure model, and at the same time completely avoids the use
of interactive zero-knowledge proofs. As a consequence, the protocol can be proven secure in a
universally-composable (UC) like framework which prohibits rewinding. We prove the security in
what we call the single-inconsistent-player (SIP) UC model, which guarantees arbitrary
composition as long as all protocols are executed by the same players. As applications, we
propose a fully UC threshold Schnorr signature scheme, a fully UC threshold DSS signature
scheme, and a SIP UC threshold Cramer-Shoup cryptosystem.
Our results are based on a new adaptively-secure Feldman VSS scheme. Although adaptive security
was already addressed by Feldman in the original paper, the scheme requires secure communication,
secure erasure, and either a linear number of rounds or digital signatures to resolve disputes.
Our scheme overcomes all of these shortcomings, but on the other hand requires some restriction
on the corruption behavior of the adversary, which however disappears in some applications
including our new DLKG protocol.
We also propose several new adaptively-secure protocols, which may find other applications, like
a distributed trapdoor-key generation protocol for Pedersen's commitment scheme, an
adaptively-secure Pedersen VSS scheme (as a {\em committed} VSS), or distributed-verifier proofs
for proving relations among commitments or even any NP relations in general.
Ivan Damgård, Serge Fehr, and Louis Salvail
Zero-Knowledge Proofs and String Commitments Withstanding Quantum Attacks
In Advances in Cryptology - CRYPTO '04, volume 3152 of Lecture Notes in Computer Science, pages 254-272. Springer-Verlag, 2004.
Abstract:
The concept of zero-knowledge (ZK) has become of fundamental importance
in cryptography. However, in a setting where entities are modeled by
quantum computers, classical arguments for proving ZK fail to hold since,
in the quantum setting, the concept of rewinding is not generally
applicable. Moreover, known classical techniques that avoid rewinding
have various shortcomings in the quantum setting.
We propose new techniques for building {\em quantum} zero-knowledge (QZK)
protocols, which remain secure even under (active) quantum attacks. We
obtain computational QZK proofs and perfect QZK arguments for any NP
language in the common reference string model. This is based on a general
method converting an important class of classical honest-verifier ZK (HVZK)
proofs into QZK proofs. This leads to quite practical protocols if the
underlying HVZK proof is efficient. These are the first proof protocols
enjoying these properties, in particular the first to achieve perfect QZK.
As part of our construction, we propose a general framework for building
unconditionally hiding (trapdoor) string commitment schemes, secure against
quantum attacks, as well as concrete instantiations based on specific
(believed to be) hard problems. This is of independent interest, as these
are the first unconditionally hiding string commitment schemes withstanding
quantum attacks.
Finally, we give a partial answer to the question whether QZK is possible
in the plain model. We propose a new notion of QZK, {\em non-oblivious
verifier} QZK, which is strictly stronger than honest-verifier QZK but
weaker than full QZK, and we show that this notion can be achieved by means
of efficient (quantum) protocols.
Ivan Damgård, Serge Fehr, Kirill Morozov, and Louis Salvail
Unfair Noisy Channels and Oblivious Transfer
In Theory of Cryptography Conference - TCC '04, volume 2951 of Lecture Notes in Computer Science, pages 355-373. Springer-Verlag, 2004.
Abstract:
In a paper from EuroCrypt'99, Damg{\aa}rd, Kilian and Salvail show
various positive and negative results on constructing Bit Commitment
(BC) and Oblivious Transfer (OT) from Unfair Noisy Channels (UNC),
i.e., binary symmetric channels where the error rate is only known
to be in a certain interval $[\gamma ..\delta]$ and can be chosen
adversarially. They also introduce a related primitive called
$PassiveUNC$.
We prove in this paper that any OT protocol that can be constructed
based on a $PassiveUNC$ and is secure against a passive adversary
can be transformed using a generic ``compiler'' into an OT protocol
based on a $UNC$ which is secure against an active adversary. Apart
from making positive results easier to prove in general, this also
allows correcting a problem in the EuroCrypt'99 paper: There, a
positive result was claimed on constructing from UNC an OT that is
secure against active cheating. We point out that the proof sketch
given for this was incomplete, and we show that a correct proof of a
much stronger result follows from our general compilation result and
a new technique for transforming between weaker versions of OT with
different parameters.
Serge Fehr
Secure Multi-Player Protocols: Fundamentals, Generality, and Efficiency
PhD Dissertation, University of Aarhus, Denmark, 2003.
Ronald Cramer, Serge Fehr, Yuval Ishai, and Eyal Kushilevitz
Efficient Multi-Party Computation over Rings
In Advances in Cryptology - EUROCRYPT '03, volume 2656 of Lecture Notes in Computer Science, pages 596-613. Springer-Verlag, 2003.
Abstract:
Secure multi-party computation (MPC) is an active research area, and a
wide range of literature can be found nowadays suggesting improvements
and generalizations of existing protocols in various directions.
However, all current techniques for secure MPC apply to functions that
are represented by (boolean or arithmetic) circuits over finite
{\em fields}. We are motivated by two limitations of these techniques:
- {\sc Generality.} Existing protocols do not apply to computation
over more general algebraic structures (except via a brute-force
simulation of computation in these structures).
- {\sc Efficiency.} The best known {\em constant-round} protocols
do not efficiently scale even to the case of large finite fields.
Our contribution goes in these two directions. First, we propose a basis
for unconditionally secure MPC over an arbitrary finite {\em ring}, an
algebraic object with a much less nice structure than a field, and obtain
efficient MPC protocols requiring only a {\em black-box access} to the
ring operations and to random ring elements. Second, we extend these
results to the constant-round setting, and suggest efficiency improvements
that are relevant also for the important special case of fields. We
demonstrate the usefulness of the above results by presenting a novel
application of MPC over (non-field) rings to the round-efficient secure
computation of the maximum function.
Masayuki Abe, Ronald Cramer, and Serge Fehr
Non-Interactive Distributed-Verifier Proofs and Proving Relations
among Commitments
In Advances in Cryptology - ASIACRYPT '02, volume 2501 of Lecture Notes in Computer Science, pages 206-223. Springer-Verlag, 2002.
Abstract:
A {\em commitment multiplication proof}, CMP for short, allows a player who is
committed to secrets $s$, $s'$ and $s'' = s \cdot s'$, to prove, without revealing
$s,s'$ or $s''$, that indeed $s'' = s s'$. CMP is an important building block for
secure general multi-party computation as well as threshold cryptography.
In the standard cryptographic model, a CMP is typically done interactively using
zero-knowledge protocols. In the random oracle model it can be done
non-interactively using the Fiat-Shamir heuristic. An
alternative non-interactive solution in the distributed setting, where at most a
certain fraction of the verifiers are malicious, was presented in [Abe99] for
Pedersen's discrete log based commitment scheme. This CMP essentially consists of a
few invocations of Pedersen's verifiable secret sharing scheme (VSS) and is secure
in the standard model.
In the first part of this paper, we improve that CMP by arguing that a building
block used in its construction in fact already constitutes a CMP. This not only
leads to a simplified exposition, but also saves on the required number of
invocations of Pedersen's VSS. Next we show how to construct non-interactive
proofs of partial knowledge [CDS94] in this distributed setting. This allows for
instance to prove non-interactively the knowledge of $\ell$ out of $m$ given secrets,
without revealing which ones. We also show how to construct efficient non-interactive
zero-knowledge proofs for circuit satisfiability in the distributed setting.
In the second part, we investigate generalizations to other homomorphic commitment
schemes, and show that on the negative side, Pedersen's VSS cannot be generalized
to arbitrary (black-box) homomorphic commitment schemes, while on the positive side,
commitment schemes based on $q$-one-way-group-homomorphisms [CD98], which cover a
wide range of currently used schemes, suffice.
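The homomorphic structure exploited here can be illustrated with a toy Pedersen commitment. This is only an illustrative sketch, not the paper's construction: the parameters below are tiny and insecure, and in practice $h$ must be chosen so that $\log_g h$ is unknown to the committer.

```python
# Toy Pedersen commitment in the order-q subgroup of Z_p^* (insecure demo parameters).
p, q = 23, 11          # q divides p - 1
g, h = 2, 4            # both of order q mod p (here h = g^2; in practice log_g h must be unknown)

def commit(s, r):
    """Commit to secret s with randomness r: C = g^s * h^r mod p."""
    return (pow(g, s % q, p) * pow(h, r % q, p)) % p

# Homomorphism: the product of two commitments is a commitment to the sum
# of the secrets (with summed randomness) -- the basis for proving relations
# among committed values.
c1, c2 = commit(3, 5), commit(4, 6)
assert (c1 * c2) % p == commit(3 + 4, 5 + 6)
```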
Serge Fehr, and Ueli Maurer
Linear VSS and Distributed Commitments Based on Secret Sharing and
Pairwise Checks
In Advances in Cryptology - CRYPTO '02, volume 2442 of Lecture Notes in Computer Science, pages 565-580. Springer-Verlag, 2002.
Abstract:
We present a general treatment of all non-cryptographic (i.e.,
information-theoretically secure) linear verifiable-secret-sharing (VSS) and
distributed-commitment (DC) schemes, based on an underlying secret sharing
scheme, pairwise checks between players, complaints, and accusations of the
dealer. VSS and DC
are main building blocks for unconditional secure multi-party computation
protocols. This general approach covers all known linear VSS and DC schemes.
The main theorem states that the security of a scheme is equivalent to a pure
linear-algebra condition on the linear mappings (e.g.~described as matrices and
vectors) describing the scheme. The security of all known schemes follows as
corollaries whose proofs are pure linear-algebra arguments, in contrast to some
hybrid arguments used in the literature. Our approach is demonstrated for the
CDM DC scheme, which we generalize to be secure against mixed adversary settings
(some curious and some dishonest players), and for the classical BGW VSS scheme,
for which we show that some of the checks between players are superfluous, i.e.,
the scheme is not optimal. More generally, our approach, establishing the minimal
conditions for security (and hence the common denominator of the known schemes),
can lead to the design of more efficient VSS and DC schemes for general adversary
structures.
Ronald Cramer, and Serge Fehr
Optimal Black-Box Secret Sharing over Arbitrary Abelian Groups
In Advances in Cryptology - CRYPTO '02, volume 2442 of Lecture Notes in Computer Science, pages 272-287. Springer-Verlag, 2002.
Abstract:
A {\em black-box} secret sharing scheme for the threshold access structure
$T_{t,n}$ is one which works over any finite Abelian group $G$. Briefly, such
a scheme differs from an ordinary linear secret sharing scheme (over, say, a
given finite field) in that distribution matrix and reconstruction vectors
are defined over $\Z$ and are designed {\em independently} of the group $G$
from which the secret and the shares are sampled. This means that perfect
completeness and perfect privacy are guaranteed {\em regardless} of which
group $G$ is chosen. We define the black-box secret sharing problem as the
problem of devising, for an arbitrary given $T_{t,n}$, a scheme with minimal
expansion factor, i.e., where the length of the full vector of shares divided
by the number of players $n$ is minimal.
Such schemes are relevant for instance in the context of distributed
cryptosystems based on groups with secret or hard to compute group order. A
recent example is secure general multi-party computation over black-box rings.
In 1994 Desmedt and Frankel have proposed an elegant approach to the black-box
secret sharing problem based in part on polynomial interpolation over
cyclotomic number fields. For arbitrary given $T_{t,n}$ with $0 < t < n-1$, the
expansion factor of their scheme is $O(n)$. This is the best previous general
approach to the problem.
Using certain low degree integral extensions of $\Z$ over which there exist
pairs of sufficiently large Vandermonde matrices with co-prime determinants,
we construct, for arbitrary given $T_{t,n}$ with $0 < t < n-1$, a black-box secret
sharing scheme with expansion factor $O(\log n)$, which we show is minimal.
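The "black-box" requirement — that sharing and reconstruction use only the group operation, inversion, and random sampling, independently of which Abelian group $G$ is chosen — can be seen in its simplest form in additive $n$-out-of-$n$ sharing. This sketch illustrates only that black-box principle; the paper's $O(\log n)$ threshold scheme for general $T_{t,n}$ is far more involved.

```python
import random

def share(secret, n, op, inv, sample):
    """Split `secret` into n shares combining to it under the group operation."""
    shares = [sample() for _ in range(n - 1)]
    acc = shares[0]
    for s in shares[1:]:
        acc = op(acc, s)
    shares.append(op(secret, inv(acc)))  # last share fixes the total
    return shares

def reconstruct(shares, op):
    acc = shares[0]
    for s in shares[1:]:
        acc = op(acc, s)
    return acc

# The same code runs over any Abelian group; here, Z_m under addition:
m = 1000
zm = dict(op=lambda a, b: (a + b) % m, inv=lambda a: (-a) % m,
          sample=lambda: random.randrange(m))
assert reconstruct(share(42, 5, **zm), zm["op"]) == 42

# ... and over bitstrings under XOR (every element is its own inverse):
xor = dict(op=lambda a, b: a ^ b, inv=lambda a: a,
           sample=lambda: random.randrange(256))
assert reconstruct(share(0b1010, 3, **xor), xor["op"]) == 0b1010
```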
Ronald Cramer, Ivan Damgård, and Serge Fehr
On the Cost of Reconstructing a Secret, or VSS with Optimal Reconstruction Phase
In Advances in Cryptology - CRYPTO '01, volume 2139 of Lecture Notes in Computer Science, pages 503-523. Springer-Verlag, 2001.
Abstract:
Consider a scenario where an $l$-bit secret has been distributed among $n$
players by an honest dealer using some secret sharing scheme. Then, if all
players behave honestly, the secret can be reconstructed in one round with zero
error probability, and by broadcasting $nl$ bits.
We ask the following question: how close to this ideal can we get if up to $t$
players (but not the dealer) are corrupted by an adaptive, active adversary with
unbounded computing power? - and where in addition we of course require that the
adversary does not learn the secret ahead of reconstruction time. It is easy to
see that $t= \lfloor (n-1)/2 \rfloor$ is the maximal value of $t$ that can be
tolerated, and furthermore, we show that the best we can hope for is a one-round
reconstruction protocol where every honest player outputs the correct secret or
``failure''. For any such protocol with failure probability at most
$2^{-\Omega(k)}$, we show a lower bound of $\Omega(nl + k n^2)$ bits on the
information communicated. We further show that this is tight up to a constant
factor.
The lower bound trivially applies as well to VSS schemes, where also the dealer
may be corrupt. Using generic methods, the scheme establishing the upper bound
can be turned into a VSS with efficient reconstruction. However, the
distribution phase becomes very inefficient. Closing this gap, we present a new
VSS protocol where the distribution complexity matches that of the previously best
known VSS, but where the reconstruction phase meets our lower bound up to a
constant factor. The reconstruction is a factor of $n$ better than previous VSS
protocols. We show an application of this to multi-party computation with
pre-processing, improving the complexity of earlier similar protocols by a factor
of $n$.
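The honest-dealer baseline above — one broadcast round of $n$ field elements, i.e. roughly $nl$ bits for an $l$-bit secret — can be sketched with plain Shamir sharing over $GF(p)$. This is only the ideal-case baseline the paper starts from, not its VSS protocol.

```python
import random

# Shamir t-out-of-n secret sharing over GF(p): with all players honest,
# reconstruction is a single broadcast of n field elements.
p = 2**31 - 1  # a prime; secret and shares live in GF(p)

def share(secret, t, n):
    """Random degree-t polynomial f with f(0) = secret; share of player i is f(i)."""
    coeffs = [secret] + [random.randrange(p) for _ in range(t)]
    def f(x):
        return sum(c * pow(x, j, p) for j, c in enumerate(coeffs)) % p
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(points):
    """Lagrange interpolation at x = 0 from any t+1 shares."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        secret = (secret + yi * num * pow(den, p - 2, p)) % p
    return secret

shares = share(123456, t=2, n=5)
assert reconstruct(shares[:3]) == 123456   # any t+1 shares suffice
```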
Serge Fehr
Efficient Construction of the Dual Span Program
Manuscript, May 1999.
Abstract:
We consider monotone span programs as a tool for representing,
we will say {\em computing}, general access structures. It is
known that if an access structure is computed by a monotone span
program, then the dual access structure is computed by a monotone
span program of the same size. We will strengthen this result by
proving that such a span program not only exists, but can be
efficiently computed.
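What it means for a monotone span program to "compute" an access structure is a span-membership condition: a player set is qualified exactly when the target vector lies in the span of the rows labelled by that set. A minimal sketch of that acceptance test, over the rationals with exact arithmetic (this illustrates the definition only, not the paper's dual construction):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a rational matrix via Gaussian elimination with exact arithmetic."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def accepts(rows, target):
    """target lies in span(rows) iff appending it does not raise the rank."""
    return rank(rows) == rank(rows + [target])

# A 2-out-of-3 threshold structure from a Vandermonde span program, target e = (1, 0):
M = {1: [1, 1], 2: [1, 2], 3: [1, 3]}   # row of player i is (1, i)
e = [1, 0]
assert accepts([M[1], M[2]], e)          # any two players are qualified
assert not accepts([M[1]], e)            # a single player is not
```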