Ideal-to-Realized Security Assurance In Cryptographic Keys (Part 2)

If you missed the first article in this series, please read Ideal-to-Realized Security Assurance In Cryptographic Keys (Part 1) – Why “128 bits of key length” does not necessarily mean “128 bits of security.”

Part 2: Collision Attacks

I will define the two methods of attack that combine to form what we generically deem collision attacks – the birthday attack and the meet-in-the-middle attack.

Birthday Attack

As the name implies, this attack is based upon the birthday paradox, which goes a little something like this: if you have, say, 23 people in a room, the probability that two of them share a birthday is above 50%. In fact, 23 is the smallest number of people for which that probability exceeds 50%. This attack relies on the idea of producing duplicates, or collisions, at a rate that exceeds intuitive expectation. For our purposes, we can define it simply as:

For N different values, one can expect the first collision to occur in the vicinity of the square root of N randomly chosen values. You are simply waiting for the second occurrence of a single value within the same value set.
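To make the square-root estimate concrete, here is a minimal Python sketch (my own illustration, with an arbitrarily chosen N, not anything prescribed by the attack itself) that repeatedly draws random values from a space of N possibilities and counts how many draws occur before the first repeat:

```python
import secrets

def draws_until_collision(n: int) -> int:
    """Draw uniformly from n possible values until a value repeats."""
    seen = set()
    while True:
        value = secrets.randbelow(n)
        if value in seen:
            return len(seen) + 1   # the draw that produced the collision
        seen.add(value)

# With N = 2**20 values, the birthday bound predicts a first collision
# on the order of sqrt(2**20) = 2**10 = 1024 draws.
N = 2**20
trials = 200
average = sum(draws_until_collision(N) for _ in range(trials)) / trials
print(f"average draws to first collision: {average:.0f} (sqrt(N) = {int(N ** 0.5)})")
```

Run it a few times and the average settles near the square root of N (on the order of a thousand here), nowhere near N itself.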

To those who actually analyze the basis for this attack, it isn’t a paradox in the sense that it contradicts itself logically. It is a paradox only in that the mathematics is sound while it contradicts one’s natural cognitive reaction to the problem. In other words, one may be apprehensive about the validity of the problem, in terms of how sound the conclusion is, which makes this a “contrary to popular belief” case rather than a “contradicting itself” case.

Meet-in-the-Middle Attack

Unlike its relative above, this attack relies on the overlap of two value sets. Not only does this attack see broader practical application, it is also much more flexible. Consider the following scenario:

Let N be the number of all possible values, and let P and Q be the sizes of two sets of elements. If the chance of any given pair matching is 1/N, one can expect a collision when PQ/N is close to 1. Taking P and Q both roughly equal to the square root of N is, of course, the birthday bound, which is the most efficient split.

This is where the flexibility of meet-in-the-middle attacks comes into play. Sometimes elements are easier to find for one set than for the other, and exploiting that imbalance is valid as long as PQ remains close to N.
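Here is an equally small sketch of the two-set version, again only an illustration with sizes I’ve picked so it runs quickly: one set of P precomputed values, Q fresh draws, and a count of how many draws land in the precomputed set. The PQ/N rule of thumb says to expect roughly P*Q/N matches, and the split between P and Q can be as lopsided as the attacker finds convenient:

```python
import secrets

# Rule of thumb: with N possible values, a stored set of size P and
# Q fresh draws, expect about P*Q/N matches between the two sets.
N = 2**24
P = 2**14              # values "stored" ahead of time
Q = 2**11              # values computed on the fly (deliberately unbalanced)

stored = {secrets.randbelow(N) for _ in range(P)}
matches = sum(1 for _ in range(Q) if secrets.randbelow(N) in stored)

print(f"observed matches: {matches}, expected about {P * Q / N:.1f}")
```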
The difference between these two attacks is simple to outline. Birthday attacks are concerned with the duplicate occurrence of single values within one set, whereas meet-in-the-middle attacks rely on the overlap of two sets.

So, why are we concerned with this?

Smaller, 128-bit keys, or similar cryptographic values, provide us with only n/2 bits of security against collision attacks; in the 128-bit case, that leaves 2^64 complexity, which is far below our desired level of security. This is the primary basis for our concern with collision attacks and their effect on insufficient key lengths. Otherwise, using 128-bit keys might not be so risky. The complexity this imposes makes it the most cumbersome issue to deal with, alongside the intricacies of getting entropy right the first time around. So, here we have it. The second antagonist – the collision attack. It’s riding shotgun with entropy, so you’d better approach them with equal caution.
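Restating the arithmetic behind that concern (nothing new here, just the birthday bound applied to the two key sizes discussed in this series):

```latex
% Collision (birthday) bound: for an n-bit value space, a collision is
% expected after roughly 2^{n/2} trials.
\[
\sqrt{2^{128}} = 2^{64} \quad\text{(128-bit key)}
\qquad\qquad
\sqrt{2^{256}} = 2^{128} \quad\text{(256-bit key)}
\]
```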
Even if we can ensure n bits of entropy in pursuit of n bits of security, we must still append extra key material, at the mode-of-operation level, to blunt the effectiveness of collision attacks. This is, in itself, a security risk, due to the added complexity, which is exactly what we do not want to impose upon our decision-making process. It is far easier to achieve sufficient security via simplicity, in our case. Again, stay with larger keys. 256-bit, if you can help it. I’ll explain that further below.

I’m sure there will be those who can’t find satisfaction in taking the simpler route, so for you lot, consider the following routine for using n-bit keys to induce n-bit security:

First, we must satisfy the criterion of a cryptographically sound keying source, be it pseudo-random or a password derivative (where key-space complexity is accounted for in password-choosing etiquette). With this, we establish an n-bit key carrying n bits of entropy. Then, by appending extra keying material in a specialized way, we effectively mitigate the impact of certain collision attacks; however, we do so at the superfluous cost of complexity. Who cares whether or not we’ve successfully stood up this seemingly mediocre construction of security? We’ve committed the cardinal sin of cryptography – complexity. Dead end.
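Purely as an illustration of what “appending extra keying material” can look like in code (the function name, the choice of HMAC-SHA-256 as the combiner, and the 128-bit salt are my own assumptions for this sketch, not a construction prescribed here), this is the shape of the extra machinery you sign up for:

```python
import hmac
import hashlib
import secrets

def derive_working_key(base_key: bytes) -> tuple[bytes, bytes]:
    """Combine a 128-bit base key with fresh random keying material.

    Returns (salt, working_key); the salt must travel with the message
    so the receiver can re-derive the same working key.
    """
    if len(base_key) != 16:                # 16 bytes = 128 bits
        raise ValueError("expected a 128-bit base key")
    salt = secrets.token_bytes(16)         # the "extra keying material"
    # HMAC-SHA-256 as an extractor: the working key depends on both inputs.
    working_key = hmac.new(salt, base_key, hashlib.sha256).digest()
    return salt, working_key

# Every use of the base key now involves one more value to generate,
# transmit, and handle correctly: the added complexity in question.
base_key = secrets.token_bytes(16)
salt, working_key = derive_working_key(base_key)
```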

Why 256-bit keys are better

Obviously, it’s rather difficult to successfully deploy a 128-bit key. It’s not impossible to arrive at 128-bit security via a 128-bit key, but the point is that a 256-bit key has the flexibility to arrive at this level of security with much more simplicity. This is rather appealing when we consider our design goal and the essential gist of how we wish to obtain cryptographic security to begin with – in the most simplistic manner possible. This is why I am led to suggest the use of 256-bit keys. Simplicity is almost always the most conservative choice. Besides, why not hop on this bargain-basement bit-bandwagon and reap security in the process?

The reason for choosing 256-bit keys is simple. They are cryptographically effective, conservatively security-conscious, flexible, resilient to collision attacks, and capable of providing 128 bits of security much more easily, while alleviating problematic entropy issues. They more appropriately cater to our design goal. Remember how we defined our desired security level as 128 bits? Well, 256-bit keys, when facing collision-attack methodology, render 2^128 complexity. That’s right. How much more convenient can security get, for this paradigm? This establishes a short, but unequivocally provable, golden rule, as follows:

2n bits of key for n bits of security.
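In code, the rule is about as short as it reads; a minimal sketch, assuming the operating system’s CSPRNG (exposed through Python’s secrets module) is an acceptable keying source:

```python
import secrets

TARGET_SECURITY_BITS = 128                 # n: the design security level
KEY_BITS = 2 * TARGET_SECURITY_BITS        # 2n: the golden rule

# A 256-bit key drawn from the OS CSPRNG; the collision bound on this
# key space is 2^(256/2) = 2^128, matching the 128-bit design goal.
key = secrets.token_bytes(KEY_BITS // 8)
assert len(key) * 8 == 256
```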

Kudos to those who swear by this rule, as this is the ultimate portrayal of penance from the wicked ways of complexity. What more of a utopian arrangement could one ask for than affordably obtaining more security via more simplicity? Alice and Bob would be proud of you right now.

By designing around this rule, with 128-bit security in mind, you assure yourself of the security margin deemed sufficient in today’s cryptographic stance, and you introduce the defining pillar of security – simplicity. In juxtaposition with any other approach, this philosophy is empyreal.

The purpose of this is to point out that security isn’t all about the length of your key, but the quality of it. For security’s sake, as well as simplicity’s, 256-bit keys are advised. Sure, in this case, we’ve used a larger key, but not just because of its size. It delivers, with ease, a quality that a smaller key would have to work much harder to obtain. It’s a win-win situation for us, really.

This is a prime example of the old saying, “It’s not size that matters, but how you use what you’ve got.”

Alright, so we’ve tackled quite a few areas in a short amount of time, ranging from entropy to collision attacks. As we near the end, you may be wondering whether there was ever a theme; there is. In fact, it’s a strategy – conservatism. Levels of security are among the most taken-for-granted aspects of cryptographic security. This article addresses only one facet of that, which is key length; it’s just as important to address other facets, such as block length. First things first, though, so it’s reasonable to begin this quest for conservatism with the length of the cryptographic key. Be strict with where your keys come from, but don’t be stingy with bits; both a secure source of derivation and sufficient length are important for security.

I’ll leave you with two words to hum, or chant, or at least embroider on your cryptographic regimen, for future reference – entropic and enough. Two bells should ring at the thought of these words. What we need is uncertainty; we want random, unpredictable sources. This corresponds to the sense of being maximally “entropic,” as we wish our keys to be. Moreover, we want to set a length that caters to a design-strength goal while approaching attacks conservatively, so that our level of security is rendered more accurately; to get the n bits of security we want, we double that figure and use 2n bits of key. When we say that we wish to set a workload of 2^n steps, it would be nice to have a little breathing room. This corresponds to the sense of having “enough” bits.

The current state of the art has progressed, with its share of regression as well. While the community may not embrace it as hoped, academia has, over decades of vigorous analysis, given us strategic ways of making cryptography an art that satisfies conventional needs on various levels; it has given us tactics that work, even if, in a sense, we’re confined to infancy in design as we face unknown advances in cryptanalysis. The bottom line is that in the realized, practical world, our goal is to mimic the ideal, theoretical world, in hopes that our assumed level of security is achievable with reasonable precision.

What we’ve learned is that mimicking the ideal world goes hand in hand with mimicking the ideal user; failure in the realized world is, a majority of the time, a product of user failure. Thus, it’s fundamental that we approach ideal marks with an ideal mentality; if we don’t, the system fails. It’s easy to design bad cryptography; it’s even easier to use good cryptography badly. The seriousness of this matter is that sometimes this isn’t blatantly obvious; sometimes, it may even be indistinguishable until it’s too late. This echoes the importance of the ideal user establishing policies that do not tolerate, in any form, laxity in design and deployment.

To have any conceivable hope of obtaining a level of security even close to our presumed design goal, it all boils down to the excellence of our strategy and policy, from conceptualization to implementation; this article outlines a small fragment of what should be approached this way. Why is this so worthy of a fiery sermon? Well, you see, the quality of how we use cryptography is, ultimately, what practical cryptography is all about. Good security relies on the good usage of good cryptography. Setting aside bits and figures, this is the “key” to wholesome cryptography. It is the gospel.

