10:00. Henry Bambury (ENS Paris, Inria and DGA). Provably Reducing Near-Hypercubic Lattices.
Abstract: A recurring question in estimating parameters for lattice-based cryptography is the following: what is the largest dimension in which an attacker can be allowed to solve SVP in order for the scheme to remain secure? This question comes in two flavours: provable or heuristic. While we expect heuristic algorithms to perform better in practice, Ducas (DCC 2023) proposed a provable algorithm for reducing the hypercubic (Z^n) lattice that only requires SVP oracles in dimension n/2, bridging the gap between what is provable and what is heuristic. In this talk, we show that Ducas’ algorithm can be relaxed to sqrt(2)-approximate-SVP oracles, and provide an algorithm that reduces most n-dimensional NTRU lattices using SVP oracles in dimension n/2. This answers a conjecture of Gama, Howgrave-Graham and Nguyen from Eurocrypt 2006. If time allows, we compare the (provable) results with (heuristic) asymptotics for the primal attack.
Joint work with Phong Q. Nguyen
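As background for the abstract above, a reminder of the standard definition (not specific to the talk): for a lattice $\mathcal{L} \subset \mathbb{R}^n$, the $\gamma$-approximate Shortest Vector Problem is
\[
  \gamma\text{-SVP}:\quad \text{find } v \in \mathcal{L}\setminus\{0\} \text{ such that } \|v\| \le \gamma\cdot\lambda_1(\mathcal{L}),
  \qquad \lambda_1(\mathcal{L}) := \min_{w \in \mathcal{L}\setminus\{0\}} \|w\|,
\]
with exact SVP corresponding to $\gamma = 1$. For the hypercubic lattice $\mathbb{Z}^n$ one has $\lambda_1(\mathbb{Z}^n) = 1$, so a $\sqrt{2}$-approximate-SVP oracle may return any nonzero integer vector of squared norm at most 2.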
11:00. Julien Devevey (ANSSI). G+G: A Fiat-Shamir Lattice Signature Based on Convolved Gaussians.
Abstract: Adapting Schnorr’s signature to the lattice setting has been a long-standing problem. It was solved by Lyubashevsky in 2009 by introducing rejection sampling and the Fiat-Shamir with Aborts framework, which adds a layer of complexity to the scheme, as underlined by the fact that no correct proof of the framework was given before [Devevey et al., Crypto 2023; Barbosa et al., Crypto 2023]. Getting rid of rejection sampling is possible, but comes at the cost of a significant increase in signature size due to the use of flooding techniques (see, e.g., [Damgård et al., Crypto 2012]). In this talk we propose a new adaptation, which relies on Gaussian convolution rather than on the flooding or rejection sampling of previous approaches. It does not involve any abort, can be proved secure in the ROM and QROM using existing analyses of the Fiat-Shamir transform, and enjoys smaller signature sizes (both asymptotically and for concrete security levels).
Joint work with Damien Stehlé and Alain Passelègue
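The name “convolved Gaussians” refers to the standard convolution property of Gaussian distributions, recalled here as background rather than as a description of the scheme itself:
\[
  x \sim \mathcal{N}(0, \sigma_1^2), \ y \sim \mathcal{N}(0, \sigma_2^2) \ \text{independent}
  \;\Longrightarrow\;
  x + y \sim \mathcal{N}(0, \sigma_1^2 + \sigma_2^2).
\]
Up to a small statistical distance, an analogous statement holds for discrete Gaussians over a lattice once the widths are sufficiently above the smoothing parameter.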
13:45. Geoffroy Couteau (CNRS, IRIF, Université Paris Cité). Fast Public-Key Silent OT and More from Constrained Naor-Reingold.
Abstract: Pseudorandom Correlation Functions (PCFs) allow two parties, given correlated evaluation keys, to locally generate arbitrarily many pseudorandom correlated strings, e.g. Oblivious Transfer (OT) correlations, which can then be used by the two parties to jointly run secure computation protocols.
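To make the target correlation concrete, here is a minimal Python sketch (illustrative only, not part of the construction) of a single random-OT correlated pair: the sender holds two random messages, the receiver holds a random choice bit together with the corresponding message. A PCF lets the two parties derive many such pairs locally from short correlated keys, instead of receiving them from a trusted sampler as below.

    import secrets

    def sample_ot_correlation(nbytes: int = 16):
        """Sample one random-OT correlation in the clear (illustration only).

        With a PCF, the two outputs would instead be derived locally from
        short correlated evaluation keys, with no trusted sampler."""
        m0 = secrets.token_bytes(nbytes)   # sender's first random message
        m1 = secrets.token_bytes(nbytes)   # sender's second random message
        b = secrets.randbelow(2)           # receiver's random choice bit
        return (m0, m1), (b, m1 if b else m0)

    # Sanity check: the receiver's message matches its choice bit.
    (m0, m1), (b, mb) = sample_ot_correlation()
    assert mb == (m1 if b else m0)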
In this work, we provide a novel and simple approach for constructing PCFs for OT correlation, by relying on constrained pseudorandom functions for a class of constraints containing a weak pseudorandom function (wPRF). We then show that tweaking the Naor-Reingold pseudorandom function and relying on low-complexity pseudorandom functions allow us to instantiate our paradigm. We further extend our ideas to obtain efficient public-key PCFs, which allow the distribution of correlated keys between parties to be non-interactive: each party can generate a pair of public/secret keys, and any pair of parties can locally derive their correlated evaluation key by combining their secret key with the other party’s public key.
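For readers unfamiliar with the pseudorandom function being tweaked, here is a toy Python sketch of the classical (unconstrained) Naor-Reingold PRF over a small prime-order group; the constrained variant and the tweak used in this work are not shown, and the parameters are purely illustrative.

    import secrets

    # Toy parameters: a safe prime p = 2q + 1 and a generator g of the
    # order-q subgroup of quadratic residues modulo p.  Real instantiations
    # use cryptographically large groups.
    q = 1019
    p = 2 * q + 1   # 2039
    g = 4           # 4 = 2^2 lies in, and generates, the order-q subgroup

    def nr_keygen(n: int):
        """Key: a_0, a_1, ..., a_n, each uniform in Z_q^*."""
        return [1 + secrets.randbelow(q - 1) for _ in range(n + 1)]

    def nr_eval(key, x_bits):
        """F_key(x) = g^( a_0 * prod_{i : x_i = 1} a_i )  computed in <g>."""
        exponent = key[0]
        for a_i, x_i in zip(key[1:], x_bits):
            if x_i:
                exponent = (exponent * a_i) % q
        return pow(g, exponent, p)

    # Example: evaluate the PRF on an 8-bit input.
    key = nr_keygen(8)
    print(nr_eval(key, [1, 0, 1, 1, 0, 0, 1, 0]))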
In addition to these theoretical contributions, we detail various optimizations and provide concrete instantiations of our paradigm relying on the Boneh-Ishai-Passelègue-Sahai-Wu wPRF and the Goldreich-Applebaum-Raykov wPRF. Putting everything together, we obtain public-key PCFs with a throughput of 15k-40k OT/s, which is of a similar order of magnitude to the state-of-the-art interactive PCFs and about 4 orders of magnitude faster than state-of-the-art public-key PCFs.
As a side result, we also show that public-key PCFs can serve as a building block to construct reusable designated-verifier non-interactive zero-knowledge proofs (DV-NIZK) for NP. Combined with our instantiations, this yields simple and efficient reusable DV-NIZKs for NP in pairing-free groups.
Joint work with Dung Bui, Geoffroy Couteau, Pierre Meyer, Alain Passelègue, and Mahshid Riahinia
14:45. Maxime Bombar (Université de Bordeaux). From Structured Codes to Secure Computation, and Back Again.
Abstract: It is well known that random linear codes are hard to decode, even with the help of quantum computers. In particular, they offer a compelling alternative to lattice-based post-quantum cryptography, which has just been standardised. However, this traditional cryptography only protects data in transit, while a growing need today is to perform computations on that data. While lattices also enable (fully) homomorphic encryption, it still requires heavy computations which may not be suitable for all situations. In particular, when data is distributed among multiple users, secure Multiparty Computation (MPC) can offer a more practical solution. However, the main bottleneck in MPC protocols is often the cost of communication between all parties. Fortunately, it has long been known that extremely efficient online MPC protocols exist, provided that all parties have access to a trusted source of correlated randomness, independent of the protocol’s inputs. Building efficient MPC protocols is therefore reduced to distributing a large amount of correlated randomness to all the parties. To this end, new primitives known as Pseudorandom Correlation Generators (PCGs) have recently been developed, based on the hardness of decoding random linear codes with a specific algebraic structure. These are now considered one of the most promising approaches for performing MPC in this so-called “correlated randomness model”.
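As a concrete illustration of the correlated randomness model (a textbook example, not taken from the talk): once a multiplication (“Beaver”) triple has been distributed, two parties holding additive shares of x and y can compute shares of x*y with a single cheap opening, regardless of whether the triple came from a trusted dealer, as in the toy Python sketch below, or was expanded from a short seed by a PCG.

    import secrets

    P = 2**61 - 1  # prime modulus for additive secret sharing (toy choice)

    def share(v):
        """Split v into two additive shares modulo P."""
        r = secrets.randbelow(P)
        return r, (v - r) % P

    def deal_beaver_triple():
        """Trusted dealer: correlated randomness, independent of the inputs."""
        a, b = secrets.randbelow(P), secrets.randbelow(P)
        return share(a), share(b), share(a * b % P)

    def beaver_multiply(x_shares, y_shares, triple):
        """Online phase: open d = x - a and e = y - b, then compute locally."""
        (a0, a1), (b0, b1), (c0, c1) = triple
        x0, x1 = x_shares
        y0, y1 = y_shares
        # In the protocol each party broadcasts x_i - a_i and y_i - b_i;
        # here the openings are simply reconstructed in place.
        d = (x0 - a0 + x1 - a1) % P
        e = (y0 - b0 + y1 - b1) % P
        z0 = (c0 + d * b0 + e * a0 + d * e) % P
        z1 = (c1 + d * b1 + e * a1) % P
        return z0, z1  # z0 + z1 = x * y (mod P)

    # Example: multiply 12 and 34 on secret shares.
    z0, z1 = beaver_multiply(share(12), share(34), deal_beaver_triple())
    assert (z0 + z1) % P == 12 * 34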