*If you would like to read the next article in this series, please check out Ideal-to-Realized Security Assurance In Cryptographic Keys (Part 2) – Collision Attacks.*

**Part 1: Why “128 bits of key length” does not necessarily mean “128 bits of security”**

As most users will admit, they’re still finger painting when it comes to the meticulously intricate art of cryptography. Because of this, key length tends to receive “this-is-what-everyone-else-is-using” treatment. Well, it’s time to clean the paint from under your nails and dip your brush into a bucket of know-how. After reading this, you’ll understand the idealized theory of “getting your cryptographic key’s worth” and achieving a desired security level. Throughout this discussion, we will refer to 128-bit keys, 256-bit keys, and the prudent philosophy of achieving 128-bit security levels with the latter.

But as you read, please remember that we are talking strictly about keys here. This reminder matters because other values and aspects, such as block lengths, confidentiality modes of operation, and the mitigation of *information leakage*, must also match this philosophy in order for us to achieve our proposed security level. In reality, though, 256-bit blocks aren’t commonplace among block ciphers, which is why 128-bit security levels are very rarely achieved in their entirety. We’ll save that discussion for another rainy day. So, for clarity’s sake, regard this as idealized and informal, and as assuming that the other factors that combine with cryptographic keys to define a security level are soundly constructed.

So, shall we move on to the focal point of this write-up?

## How secure is 128-bit?

To answer simply: sufficiently secure. For nearly all applicable scenarios, 128 bits of *security* is enough. Note my emphasis on the word “security.” This is where we pierce the epidermal layer of cryptography. To do so, we must establish an important rule of thumb: an increase in key length does not automatically imply an increase in security.

Rule of thumb

: The smaller the key, the inherently smaller the security margin. In some cases, a small-enough key will fail to offer any security, regardless of any other cryptographic factors. The longer the key, the more *variable* the security margin, where “variable” depends on the amount of entropy in the key. Therefore, whereas security, or the lack thereof, is relative to length for small keys, security for longer keys is relative to entropy, keying source, and a myriad of other cryptographic factors.

56-bit keys, for example, won’t amount to a hill of cryptographic beans, regardless of their entropy. A 128-bit key, however, if utilized correctly, will provide an ample supply of bits for the security margin. This article will explain the meaning of “*if utilized correctly*” and how, for reasons of simplicity, we aim for 128-bit security without necessarily using 128-bit keys.
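To make that gap concrete, here is a rough back-of-the-envelope sketch. The guess rate of 10¹² keys per second is a hypothetical assumption chosen only for illustration, not a measured figure for any real attacker:

```python
# Back-of-the-envelope: expected time to brute-force a key, assuming a
# hypothetical attacker testing 10**12 keys per second. On average, an
# exhaustive search succeeds after trying half the key space.
GUESSES_PER_SECOND = 10**12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_half_exhaust(key_bits: int) -> float:
    """Expected years to find a key of `key_bits` bits by brute force."""
    return (2 ** (key_bits - 1)) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"56-bit:  {years_to_half_exhaust(56):.2e} years")   # well under a year
print(f"128-bit: {years_to_half_exhaust(128):.2e} years")  # astronomically long
```

Even granting the attacker a few more orders of magnitude of speed, the 128-bit figure stays far beyond any practical horizon, while the 56-bit figure collapses to hours.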

Too small, and you’re out of luck. Large enough, and you have some breathing room to work with. For the minuscule cost of a few extra bits, establishing a larger key length is an advantageous investment for your system. If we define “too small” as cryptographically insufficient and “large enough” as cryptographically sufficient, we can state: if our cryptographic key is too small, it is definitely insecure; if it is large enough, it is only *potentially* secure. The former is a constant state of susceptibility to compromise, while the latter is a variable state of resilience to compromise. If your hand is decent and you play your cards right, the probability of obtaining a comfortable margin of security is favorable.

## Entropy and the woes thereof

Entro-what?

Oh, great, now what? Just when we thought we were getting somewhere, a new term pops up: *entropy*. No worries. Entropy is just a snazzy term for a measurement of randomness. It denotes the amount of uncertainty in the key material, which ultimately gives us the unpredictability necessary to achieve *n* bits of security with an *n*-bit key. Achieving this is an arduous task, however, and the great majority of attempts end in an insecurity-induced fiasco.
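For the curious, the standard way to quantify this uncertainty is Shannon entropy. A minimal sketch, using two toy distributions to show that only a *uniform* choice delivers the full bit count:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries a full bit of uncertainty...
print(shannon_entropy([0.5, 0.5]))    # 1.0
# ...while a heavily biased one carries almost none.
print(shannon_entropy([0.99, 0.01]))  # ≈ 0.08
```

The same arithmetic scales up: an *n*-bit key only carries *n* bits of entropy when every one of its 2ⁿ values is equally likely.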

Password etiquette and key space complexity

This holds true in many cases, as the bulk of keys are derived from poorly chosen passwords or passphrases, due to a lack of proper etiquette. You may ask, “What in the heck does this have to do with entropy?” It has everything to do with increased predictability, which is what we should fear. The more common the password or passphrase, the more predictable it is; the more predictable, the higher the probability an attacker will guess it. Such imbalance in predictability and probability leaves a system prone to dictionary attacks, which greatly reduce the work of a traditional exhaustive key search by fixing the search to operate on a selective pool of commonly chosen values. This collaterally reduces the effective entropy; so, that’s what the heck it has to do with it. The key source plays an extremely vital role in the presence of entropy, and it is poor practice such as this that often negates the purpose of a strong algorithm or a hefty key length.
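Here is a small illustration of why a dictionary attack is so devastating. The dictionary size of one million is a hypothetical figure for the sake of the arithmetic; the point is that the effective entropy of a key derived from a pool of likely values is only the logarithm of the pool size, no matter how long the derived key is:

```python
import math

def effective_bits(pool_size: int) -> float:
    """Entropy, in bits, of a value chosen uniformly from `pool_size` options."""
    return math.log2(pool_size)

# A key derived from one of a million common passwords (hypothetical pool)...
print(effective_bits(1_000_000))  # ≈ 19.93 bits
# ...versus a key chosen uniformly from the full 128-bit space.
print(effective_bits(2 ** 128))   # 128 bits
```

A 128-bit key derived from such a pool still *looks* like 128 bits on the wire, but an attacker only has to search roughly 2²⁰ candidates, not 2¹²⁸.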

If you can help it, you’ll find that it is much more cryptographically appealing to generate keys with a sound method, such as a cryptographically secure pseudo-random number generator, which provides a source that is sufficiently unpredictable and statistically random. This plays to the favor of information theory, mitigates the fallacies of neglecting that theory, and frustrates the otherwise practical effectiveness of specialized exhaustive searches that exploit lax attitudes toward choosing good, strong values.
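In Python, for instance, the standard-library `secrets` module exposes the operating system’s cryptographically secure random source; a minimal sketch of generating a 256-bit key this way:

```python
import secrets

# Draw 32 bytes (256 bits) of key material from the OS's
# cryptographically secure random source.
key = secrets.token_bytes(32)

print(len(key) * 8)  # 256 -- bits of key material
print(key.hex())     # hex-encoded for display; never log real keys
```

Keys produced this way are uniformly distributed over the full key space, which is precisely the property that password-derived keys usually lack.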

Sporadically, you’ll see key space complexity instantiated properly, but oftentimes complexity reduction ruins any chance of a satisfyingly entropic key that would otherwise render the assumed *n*-bit security. This is a critical issue, as we must have this uncertainty in order to obtain *n* bits of security from a key length of *n* bits. The less known the key is, the better.

To make “*n*-bit key” literal in meaning is to prove the key source capable of providing enough entropy to guarantee *n* bits of security. This kind of concreteness is, in conventional practice, increasingly toilsome to achieve.

Easy enough, right?

Seemingly, entropy is a lucid criterion. Wrong. As you can see, between poorly chosen passwords and their effect on complexity reduction, an array of things can go wrong and prevent you from ever fulfilling this requirement. This operose task becomes your burden.

Let’s just assume, for a moment, that we can achieve this. We’re still not out of the woods just yet. Now we face another serious problem: collision attack methodology.

In this first segment, we’ve been introduced to “128-bit,” how it corresponds to cryptographic values in general, such as keys, and how it exists as a security design goal. We’ve also taken a concise look at entropy and key space complexity, and at why they are usually handled more properly through a cryptographically secure pseudo-random number generator than through typically predictable passwords and passphrases.

In the second and final segment, we’ll take a brief glimpse into the two attacks that compose the collision attack methodology, affectionately known as the birthday attack and the meet-in-the-middle attack. Furthermore, we’ll justify the argument for using 256-bit keys when claiming a design goal of 128-bit security, by defining a conservative golden rule to simplify the process, and we’ll extend this argument to counter the complexities of attempting to use 128-bit cryptographic values to obtain 128-bit security. We’ll conclude by compressing the gist of this article into one tactful strategy, which is the cornerstone of sensible cryptography: being conservative.
