# Disproving a Coincidence with Code Golf

In this post, we use code golf^{1} to prove that claims made by the
designers of some Russian standard cryptographic algorithms are
factually wrong. The content of this page is based on a joint work
with Xavier Bonnetain and Shizhu Tian [5].

## A Seemingly Impossible Code Golf Challenge

How small a function can you write that takes as input an integer `x` in [0, 255] and outputs `pi[x]`, where `pi` is the table given below? Here, "small" means that the length of the corresponding source code, *measured in bits*, is short.

**Send me your best implementation in the language of your choice and I will publish it on this page!**

**It does not have to be shorter than our best implementations.**

```c
unsigned char pi[] = {
    0xFC,0xEE,0xDD,0x11,0xCF,0x6E,0x31,0x16,0xFB,0xC4,0xFA,0xDA,0x23,0xC5,0x04,0x4D,
    0xE9,0x77,0xF0,0xDB,0x93,0x2E,0x99,0xBA,0x17,0x36,0xF1,0xBB,0x14,0xCD,0x5F,0xC1,
    0xF9,0x18,0x65,0x5A,0xE2,0x5C,0xEF,0x21,0x81,0x1C,0x3C,0x42,0x8B,0x01,0x8E,0x4F,
    0x05,0x84,0x02,0xAE,0xE3,0x6A,0x8F,0xA0,0x06,0x0B,0xED,0x98,0x7F,0xD4,0xD3,0x1F,
    0xEB,0x34,0x2C,0x51,0xEA,0xC8,0x48,0xAB,0xF2,0x2A,0x68,0xA2,0xFD,0x3A,0xCE,0xCC,
    0xB5,0x70,0x0E,0x56,0x08,0x0C,0x76,0x12,0xBF,0x72,0x13,0x47,0x9C,0xB7,0x5D,0x87,
    0x15,0xA1,0x96,0x29,0x10,0x7B,0x9A,0xC7,0xF3,0x91,0x78,0x6F,0x9D,0x9E,0xB2,0xB1,
    0x32,0x75,0x19,0x3D,0xFF,0x35,0x8A,0x7E,0x6D,0x54,0xC6,0x80,0xC3,0xBD,0x0D,0x57,
    0xDF,0xF5,0x24,0xA9,0x3E,0xA8,0x43,0xC9,0xD7,0x79,0xD6,0xF6,0x7C,0x22,0xB9,0x03,
    0xE0,0x0F,0xEC,0xDE,0x7A,0x94,0xB0,0xBC,0xDC,0xE8,0x28,0x50,0x4E,0x33,0x0A,0x4A,
    0xA7,0x97,0x60,0x73,0x1E,0x00,0x62,0x44,0x1A,0xB8,0x38,0x82,0x64,0x9F,0x26,0x41,
    0xAD,0x45,0x46,0x92,0x27,0x5E,0x55,0x2F,0x8C,0xA3,0xA5,0x7D,0x69,0xD5,0x95,0x3B,
    0x07,0x58,0xB3,0x40,0x86,0xAC,0x1D,0xF7,0x30,0x37,0x6B,0xE4,0x88,0xD9,0xE7,0x89,
    0xE1,0x1B,0x83,0x49,0x4C,0x3F,0xF8,0xFE,0x8D,0x53,0xAA,0x90,0xCA,0xD8,0x85,0x61,
    0x20,0x71,0x67,0xA4,0x2D,0x2B,0x09,0x5B,0xCB,0x9B,0x25,0xD0,0xBE,0xE5,0x6C,0x52,
    0x59,0xA6,0x74,0xD2,0xE6,0xF4,0xB4,0xC0,0xD1,0x66,0xAF,0xC2,0x39,0x4B,0x63,0xB6
};
```

Why this table? Why do we want a small implementation? Is it even possible to find one? Read on!

## Why this Challenge Matters

### What is π?

π (as in the Greek letter) is a component used by the latest standard
symmetric cryptographic algorithms in Russia, namely the block cipher
Kuznyechik and the cryptographic hash function Streebog. It is what we
cryptographers call an **S-box**, i.e. a small non-linear function with
an input small enough that it is possible to specify said function
simply by providing all of its outputs. That is what the C array above
is: `pi[x]` is equal to π(x).

Both Streebog and Kuznyechik are standards in Russia and at the IETF
(RFC 6986 and 7801 respectively). Streebog is also an ISO/IEC standard
and Kuznyechik might become one. In a previous paper on this S-box
[3], I found that it could be described by a very simple
algorithm which I called *TKlog*. This work and its context are
summarized on this page.

### Is its TKlog Structure a Coincidence?

Since the publication of that structure, the designers of these algorithms have claimed, during an ISO/IEC meeting, that the presence of said structure was a mere coincidence [4]. They claim to have in fact generated it by picking random permutations and then keeping one with good enough cryptographic properties.

However, the set of TKlogs is beyond tiny compared to the set of all
256! ≈ 2^1684 permutations of 8-bit values, meaning that their claim
is *a priori* very hard to believe. To put these unfathomable numbers
into perspective: there are about 10^80 ≈ 2^266 atoms in the universe,
and a random permutation is a TKlog with a probability just under
(2^-266)^6. Suppose that you are participating in a
special lottery where you have to pick 1 atom in the whole universe,
and that you win if you guessed the right one. The probability of
obtaining a TKlog when choosing an 8-bit permutation uniformly at
random is a bit below the probability of winning this atomic lottery
*6 times in a row*. What I am trying to say is that the probability of
obtaining a TKlog by chance is *small*. Like, really, *really really*
small.

To try to reconcile my work with their claims of randomness,
the designers of π asserted that, for any permutation, it is possible
to find a simple structure implementing it. While the probability of
having a TKlog specifically is tiny, the probability of having *some*
structure would then be very high. Had they picked the next randomly
generated S-box, I would have found another, equally unlikely,
structure.

It sounds plausible! Indeed, suppose that you have a coin and want to
check if it is biased. To do this, you look at the number of tails you
get when tossing it 10000 times and find 4970. The probability that
you get *exactly* this number is quite small:

P[X = 4970] = C(10000, 4970) × 2^-10000 ≈ 0.0067 .

But it is not that interesting. What would be much more meaningful in
this case is the probability that you get 4970 tails *or fewer*,
namely

P[X ≤ 4970] = Σ_{i=0}^{4970} C(10000, i) × 2^-10000 ≈ 0.28 ,

which is actually quite high.

The situation would then be the same for the presence of a structure
in an S-box. Sure, each specific structure has a tiny probability
(like that of getting exactly 4970 tails), but the same holds for the
structure of *all* the S-boxes. It is similar to the case of the coin:
getting exactly 4971 or 4969 tails is also unlikely, but it doesn't
mean anything. In the end, it is not that weird to get at most 4970
tails, and we cannot deduce that the coin is biased from the fact that
it yielded 4970 tails. For π, we similarly could not deduce that its
structure is not a coincidence from the fact that said structure has a
low probability.

Except this is completely wrong.

### No, it's Not

As plausible as it may sound, the argument about π detailed above is
completely wrong because a random permutation does not have a
structure and *certainly not one as simple as a TKlog*.

To see why, we need to find a way to sort the different S-box
structures. If two structures can always be ranked (i.e. if we can
find a partial ordering of the S-box structures), then we can look at
probabilities that are far more meaningful: instead of the
probability that π has a *specific* structure (here, the TKlog), we
can instead look at the probability that π is *at least as structured
as a TKlog*. As in the example above with the coin, this would give us
a probability of having a structure like the one we found *or better*,
which would be far more informative.

And that is why we will code golf!

### To Prove it, We Need a Small Implementation of π

The ordering we propose to use is very simple and takes inspiration
from Kolmogorov complexity. Intuitively, if a function is "simple"
or "too structured" then it should be possible to implement it using a
short program. Here, short means that the program literally takes
little space, i.e. that it can be encoded with few bits. The length
of the smallest program is then an integer, and those are very easy to
order: 1 is smaller than 5, which is smaller than 47, etc. Likewise,
a function that can be implemented in 400 bits is simpler than one
that needs at least 2000. Finding small implementations is precisely
the goal of code golf, which is why we need to grab our metaphorical
golf clubs. We must however be careful to choose a meaningful
language: if we just define a language such that `p()` is a predefined
function that implements π, then we haven't said much.

One problem remains: we can now *compare* the simplicity of
two functions by comparing the length of their smallest
implementation. But in order to say something about the simplicity of
a given function compared to all the others, we need some *absolute*
quantification of said simplicity. This implies bounding the length of
the program needed to implement a random permutation.

It is surprisingly simple to obtain a (very coarse) bound suited to our purpose. In fact, Claude Shannon used the same kind of argument to prove that almost all Boolean functions mapping n bits to 1 are as difficult to implement as the hardest ones [6]. Suppose that all bit strings are valid programs in the language of your choice. Then at most 2^(ℓ+1) functions have an implementation at most ℓ bits long: there are at most 1 + 2 + 4 + ⋯ + 2^ℓ = 2^(ℓ+1) − 1 such programs, so it is impossible that they correspond to 2^(ℓ+1) distinct functions or more.

Thus, in our case, if a permutation can be implemented by a program of length ℓ, then there are at most 2^(ℓ+1) permutations with an implementation as small as this, and the probability that a random permutation is that simple is at most equal to 2^(ℓ+1)/256! ≈ 2^(ℓ+1−1684). This bound is anything but tight! It assumes that all strings are valid programs corresponding to distinct permutations. It is not the case, so we have an upper bound, but that is all we need.

## Implementations too Small to be Normal

We set out to find a small implementation of π, i.e. one which needs
much fewer than 1684 bits. We chose the C language: we hope you are
convinced that we did not tamper with the C standard^{2} to allow
the small implementation described below. To push our approach
further, we also looked at ARM machine code as it lends itself well to
very short implementations.

Using the TKlog structure from [3], Xavier Bonnetain devised the following implementation. It is explained on this page.

```c
p(x){unsigned char*t="@`rFTDVbpPBvdtfR@\xacp?\xe2>4\xa6\xe9{z\xe3q5\xa7\xe8",a=2,l=0,b=17;while(++l&&a^x)a=2*a^a/128*29;return l%b?t[l%b]^t[b+l/b]^b:t[l/b]^188;}
```

The program above contains 161 characters, all of which are ASCII, meaning that they can be encoded on 7 bits each. In total, our implementation is 161 × 7 = 1127 bits long. We present other implementations in [5].

We then have that a random permutation is at least as simple as π with probability at most 2^(1128−1684) = 2^-556, i.e. roughly the probability of winning the atomic lottery twice in a row. It is more likely than winning it 6 times in a row, but it is still extremely unlikely.

This is only a bound on how unlikely this level of structure is when
looking at the C language. If we instead look at ARM machine code (see
*below*), we obtain a program length of 640 bits and thus a probability
bound of 2^(641−1684) = 2^-1043. That is roughly the probability of
winning the atomic lottery 4 times in a row.

Overall, the fact that a random permutation is as structured as π is incredibly unlikely. And I do literally mean "incredibly", i.e. that it should not be believed that such an unlikely event happened by chance.

## Conclusion

There are only two possible conclusions to our results:

- the designers of π are *incredibly* lucky, or
- π was not designed in the way they claim.

**The probability of the first option is negligible. How then did they generate it?**

Finally, if you have a short implementation of π in some language, send it to me and I will put it with the others here!

## Acknowledgement

I thank Xavier Bonnetain for his help when drafting this document and for his proofreading.

## Bibliography

1. Alex Biryukov, Léo Perrin, and Aleksei Udovenko. **Reverse-engineering the S-box of Streebog, Kuznyechik and STRIBOBr1**. In Marc Fischlin and Jean-Sébastien Coron, editors, *Advances in Cryptology – EUROCRYPT 2016, Part I*, volume 9665 of *Lecture Notes in Computer Science*, pages 372–402. Springer, Heidelberg, May 2016. link to eprint.iacr.org.
2. Léo Perrin and Aleksei Udovenko. **Exponential S-boxes: a link between the S-boxes of BelT and Kuznyechik/Streebog**. *IACR Transactions on Symmetric Cryptology*, 2016(2):99–124, 2017. link to tosc.iacr.org.
3. Léo Perrin. **Partitions in the S-Box of Streebog and Kuznyechik**. *IACR Transactions on Symmetric Cryptology*, 2019(1):302–329, 2019. link to tosc.iacr.org.
4. Joseph Cox. **Experts Doubt Russian Claims That Cryptographic Flaw Was a Coincidence**. Vice, Motherboard (8th of May 2019). link.
5. Xavier Bonnetain, Léo Perrin and Shizhu Tian. **Anomalies and Vector Space Search: Tools for S-Box Reverse-Engineering**. link to eprint.iacr.org.
6. Claude Shannon. **The synthesis of two-terminal switching circuits**. *The Bell System Technical Journal*, 28(1):59–98, Jan 1949.

## Footnotes

^{1}

The analysis I present in this note proves, using only the simplicity of its structure, that π cannot be the output of a random generation. As discussed in the description of said structure, it also has very peculiar cryptographic properties that interact in a special way with other components of the algorithms using this S-box. This alone should be sufficient to rule out a random generation. Still, it turns out that we don't even need these arguments: studying π "in a vacuum" can tell us everything we need.

^{2}

A compelling argument being that ANSI C89 is literally older than each of the authors of [5].