Abstract
The Learning Parity with Noise (LPN) problem is the average-case analogue of the NP-complete problem of decoding linear codes, and it has been extensively studied in learning theory, coding theory and cryptography, with applications to quantum-resistant cryptographic schemes. However, LPN also suffers from large public-key sizes, the common drawback that hinders code-based cryptography from being practical. In this paper, we study a sparse variant of LPN whose public matrix consists of sparse vectors rather than uniformly random ones. We show a win-win argument that at least one of the following statements is true: (i) the hardness of sparse LPN is implied by that of standard LPN under the same noise rate; or (ii) there exist new black-box constructions of public-key encryption schemes and oblivious transfer protocols from standard LPN. Since the second statement relies on noise regimes believed infeasible for LPN-based public-key cryptography, we consider the first statement more likely to hold, i.e. sparse LPN is as hard as standard LPN. Finally, we give a (heuristic) method to further compress the sparse public matrix by evaluating pseudorandom functions with keys made public, whose security again rests on the aforementioned win-win technique.
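For intuition, the contrast between standard LPN and the sparse variant can be sketched in a few lines. The following is an illustrative toy sketch, not the paper's construction: it produces LPN samples `b = A·s + e mod 2`, where each row of the public matrix `A` is either uniform over `{0,1}^n` (standard LPN) or has exactly `sparsity` nonzero entries (sparse LPN). The parameter names `n`, `m`, `tau`, and `sparsity` are placeholders chosen here for illustration.

```python
# Toy sketch of LPN sample generation over GF(2) (illustrative, not the
# paper's scheme). Each sample is b_i = <a_i, s> + e_i mod 2, where e_i is
# a Bernoulli(tau) noise bit.
import random

def lpn_samples(n, m, tau, sparsity=None, seed=0):
    """Return (A, b, s) with b = A*s + e mod 2 for a random secret s.

    If sparsity is None, each row of A is uniform over {0,1}^n (standard
    LPN); otherwise each row has exactly `sparsity` ones (sparse LPN),
    so it can be stored as a short list of indices instead of n bits.
    """
    rng = random.Random(seed)
    s = [rng.randrange(2) for _ in range(n)]  # secret vector
    A, b = [], []
    for _ in range(m):
        if sparsity is None:
            row = [rng.randrange(2) for _ in range(n)]  # uniform row
        else:
            row = [0] * n
            for idx in rng.sample(range(n), sparsity):  # sparse row
                row[idx] = 1
        e = 1 if rng.random() < tau else 0  # Bernoulli(tau) noise bit
        b.append((sum(a * x for a, x in zip(row, s)) + e) % 2)
        A.append(row)
    return A, b, s
```

A sparse row can be represented by the positions of its few nonzero entries, which is what makes the public matrix (and hence the public key) much smaller than in the uniform case.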
| Original language | English |
|---|---|
| Pages (from-to) | 1939-1947 |
| Number of pages | 9 |
| Journal | Computer Journal |
| Volume | 65 |
| Issue number | 8 |
| DOIs | |
| State | Published - 1 Aug 2022 |
| Externally published | Yes |
Keywords
- Learning Parity with Noise
- Sparsely Learning Parity with Noise
- oblivious transfer
- privacy amplification
- pseudorandom functions
- public-key encryption
- win-win argument