As you probably know, a digital signature is a virtual "seal" that proves that some message has originated from a particular person in possession of a private key.

Let's look at this "seal" from an information-theoretic perspective.

We want to reach 128-bit security, i.e. a random guess should have a 1 in 2^128 chance of producing a valid signature.

Then the public key is an encoding of a trapdoor function that can map X 256-bit messages into some thumbprint, where there are 2^256 (or more) ways to construct this trapdoor.


Let's assume we have Sam as the signer, a trusted arbiter Alice in place of a trapdoor function, and a relying party Rob. Using an arbiter means minimal information (≤1 bit) is disclosed on each signature verification attempt (see [1]).

Then we have an easy and consistent signature scheme (a minimal code sketch follows below):
- Let the private key be a random bitstream PK of length p.
- To sign a message, Sam computes N different hash functions H1 … HN of the message digest and uses them as indexes into PK. He then publishes these indexes together with the corresponding bits of the private key.
- To verify the message, Rob passes the indexes and bits to Alice, who returns true/false depending on whether the bits match her copy of the private key.
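
Here is a minimal sketch of this scheme, assuming SHA-256 as the source of the N index-deriving hash functions and modelling Alice as an in-process object that holds the same bitstream as Sam; all names and the toy-sized parameters (N_HASHES, KEY_BITS) are illustrative, not a real construction.

import hashlib
import secrets

N_HASHES = 8          # toy value; the sizing below arrives at ~128-131 for 128-bit security
KEY_BITS = 1 << 20    # toy value; the sizing below arrives at ~185 Mbit for 1M signatures


def indexes(message: bytes) -> list[int]:
    """Derive N_HASHES pseudo-random indexes into PK from the message digest."""
    digest = hashlib.sha256(message).digest()
    return [
        int.from_bytes(hashlib.sha256(bytes([i]) + digest).digest(), "big") % KEY_BITS
        for i in range(N_HASHES)
    ]


class Sam:
    """Signer: holds the private bitstream PK and discloses selected bits of it."""

    def __init__(self) -> None:
        self.pk = secrets.randbits(KEY_BITS)

    def sign(self, message: bytes) -> list[tuple[int, int]]:
        return [(i, (self.pk >> i) & 1) for i in indexes(message)]


class Alice:
    """Trusted arbiter: checks disclosed (index, bit) pairs against her copy of PK."""

    def __init__(self, pk: int) -> None:
        self.pk = pk

    def verify(self, message: bytes, signature: list[tuple[int, int]]) -> bool:
        return [i for i, _ in signature] == indexes(message) and all(
            ((self.pk >> i) & 1) == bit for i, bit in signature
        )


sam = Sam()
alice = Alice(sam.pk)                         # the arbiter holds the same bitstream
sig = sam.sign(b"pay Rob 5 BTC")
print(alice.verify(b"pay Rob 5 BTC", sig))    # True
print(alice.verify(b"pay Rob 500 BTC", sig))  # False: derived indexes no longer match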

Sound familiar? Yes, it's a Bloom filter, and a collision in the filter means two signatures share some subset of disclosed bits, which in turn means enough information has been disclosed for Rob to forge a signature using the same bits and hash functions.
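
To make that concrete, here is a small self-contained simulation under the same illustrative assumptions (SHA-256 index derivation, a deliberately tiny key so the effect shows up quickly): Rob records every (index, bit) pair disclosed by legitimate signatures, and as soon as those cover all indexes of a message Sam never signed, Rob can assemble a "valid" signature for it himself.

import hashlib
import secrets

N_HASHES = 8
KEY_BITS = 1 << 16       # deliberately tiny key so full coverage happens quickly


def indexes(message: bytes) -> list[int]:
    digest = hashlib.sha256(message).digest()
    return [
        int.from_bytes(hashlib.sha256(bytes([i]) + digest).digest(), "big") % KEY_BITS
        for i in range(N_HASHES)
    ]


pk = secrets.randbits(KEY_BITS)
target_idx = indexes(b"pay Rob 1000000 BTC")   # a message Sam never signs
leaked: dict[int, int] = {}                    # Rob's record of disclosed (index, bit) pairs

for count in range(1, 100_001):
    for i in indexes(f"invoice {count}".encode()):  # Sam signs a legitimate message...
        leaked[i] = (pk >> i) & 1                   # ...and Rob records the leaked bits
    if all(i in leaked for i in target_idx):
        forged = [(i, leaked[i]) for i in target_idx]
        print(f"forged signature assembled after {count} legitimate signatures")
        break

With realistic parameters the key is far larger precisely so that a million signatures still do not disclose enough bits for this to work.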

And the formulas for Bloom filter sizing are widely known. Let's calculate how large the private key has to be to sign 1M messages at 128-bit security, i.e. a collision probability of about 3·10^-39 [3].
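
As a rough check, the sketch below plugs n = 10^6 and p = 2^-128 into the standard formulas m = -n·ln(p)/(ln 2)^2 and k = (m/n)·ln 2; the calculator in [3] applies its own rounding and reports k = 131, so the exact figures differ slightly, but the order of magnitude is the point.

from math import ceil, log, log2

n = 1_000_000          # number of signatures the key must support
p = 2 ** -128          # acceptable forgery probability, about 2.9e-39

m = ceil(-n * log(p) / log(2) ** 2)   # private-key size in bits
k = round(m / n * log(2))             # number of hash functions / disclosed bits
index_bits = ceil(log2(m))            # bits needed to encode one index into PK
thumbprint = k * (index_bits + 1)     # each hash discloses an index plus one key bit

print(f"key size       : {m} bits ≈ {m / 8 / 2**20:.1f} MiB")
print(f"hash functions : {k}")
print(f"thumbprint     : {thumbprint} bits ≈ {thumbprint // 8} bytes")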

It seems we need a minimum of about 22 MB of key data to guarantee security, PLUS it has to be scrambled in some way if we want Rob to be able to verify the signature himself, because real-life customizable trapdoor functions are not as ideal as Alice (see GMSS or XMSS signatures, for example).

The message thumbprint size would then be about 131·(28+1) = 3799 bits (475 bytes): 131 hash functions, each disclosing a 28-bit index into the ~185-Mbit key plus the corresponding key bit.

And it turns out there is a paper [2] that arrives at exactly the same numbers using much more complicated calculations :) (see Table 1).


Meanwhile, everybody knows SSL certificates aren't 22 MB but only about 5-20 KB in length. Something just doesn't seem right.

So... is anybody else still surprised that DSA, ECDSA, Winternitz, *MSS and other signature schemes let Rob compute the private key so easily if Sam does not rigorously follow the restrictions on nonces, padding and so on?

This is also the reason I consider most signature schemes to be insecure (as in, fundamentally susceptible to cryptanalysis, especially under a chosen-message attack).


[1] Johansson, T.: On the Construction of Perfect Authentication Codes that Permit Arbitration. In: Stinson, D.R. (ed.) CRYPTO 1993. LNCS, vol. 773, pp. 343–354. Springer, Heidelberg (1994)

[2] Hanaoka, G., Shikata, J., Zheng, Y., Imai, H.: Unconditionally Secure Digital Signature Schemes Admitting Transferability. In: Okamoto, T. (ed.) ASIACRYPT 2000. LNCS, vol. 1976, pp. 130–142. Springer, Heidelberg (2000)

[3] http://hur.st/bloomfilter?n=1000000&p=3.0E-39
