Block Party: A Cryptographic Standard Shindig

Tetris, Legos, David Robinson, and Belgium

If you were to ask my opinion on the best contributions to cryptography over the past few decades, my answer wouldn’t really be specific to cryptography at all. It’s not a stream cipher, a block cipher, a hash function, or a protocol. In fact, it’s really nothing we “use”, as consumers of security, per se. It is, however, something that gauges what we use: a cryptographic competition. Such a competition is essentially a selection process that pits cryptographers and their competing designs against one another. These designs are pummeled by round after round of cryptanalysis. Beyond scrutinizing the security of the designs, the process introduces various design metrics and criteria to examine their performance and efficiency in a myriad of software and hardware environments. A potential standard must obviously be secure, but it’s vital that it also makes good time and keeps costs down in the process; otherwise, it’s not going to cut it. Even robust security isn’t enough to make up for an inefficient, poorly performing cryptographic design.

Some of y’all are wondering where you’ve seen this type of situation before. Here’s a hint: What has more blocks than a Tetris and Lego love affair? No, not David Robinson’s career statistics. Nice try, though. If you answered with “What is the Advanced Encryption Standard selection process?” you’d be correct here, and on Jeopardy too. Our current block cipher standard, a Belgian-designed 128-bit SPN, or Substitution-Permutation Network, dubbed “Rijndael,” was the champion of that competition. This five-year process demonstrated the fruits of an open competition. The outcome is overflowing with goodness, and there are solid reasons for having faith in the standard. After all, it survived the onslaught of the most relentless group of cryptanalysts ever assembled. Furthermore, it’s still surviving post-analysis. If that’s not enough, consider this: it’s receiving more cryptanalytic attention than any other block cipher, including the two runner-up finalists, Twofish and Serpent.

The competition, from start to finish, is much like a sandbox for design methodologies. You get the best cryptographers together, have them toss in a variety of Feistels, SPNs, et cetera, based on a slew of different design strategies relating to both security and implementation performance, and you’ve got the ideal conditions for weeding out the good, the bad, and the ugly. Not only can you weed out what’s not secure and what doesn’t perform well, but you can start to build metrics for what constitutes a secure, well-performing block cipher in software and hardware environments where processing and memory constraints can become incredibly demanding. This is necessary for practical cryptography. In theory, where conditions are arbitrary and idealistic, one can concentrate on security alone. In practice, everything is more demanding; it’s not a secure design alone that wins, it’s a useful design.

I’d like a side order of hash browns, please

Now that we’ve learned all of these good lessons on how to select a cryptographic standard, what can we do with them? It turns out that we’re in dire need of another standard, and this time, it’s probably going to be an even more intricate process. As most of us know, hash functions have taken a rather hard hit over the past few years. Within a relatively short period of time, a wave of cryptanalysis of various hash functions surfaced, and along with it came a slew of opinionated takes on the implications. Even a cursory glance at most of these views makes the problem apparent: the general community has an extremely limited understanding of what’s going on, beyond reiterations of the obvious, and even those are only partially comprehended. Without further ado, or rehashing (pun slightly intended) what the media has already done a fantastic job of over-hyping, I’ll elaborate on a “structural” perspective on the matter that isn’t as common as it should be. In other words, unless one practices cryptography in the realm of academia, one likely hasn’t a clue. Hopefully, this will provide the nudge that non-academics may need in order to be “clued in”, so to speak.

The main observation I wish to address concerns the design strategy underlying the majority of conventional hash functions, namely the MD4 family; this family consists of MD4, MD5, HAVAL, RIPEMD, SHA-1, and SHA-2, along with their respective output-length extensions, where applicable. Basically, in the MD4 family of hash functions, we’re dealing with a common structure: the conversion of a block cipher into a one-way hash function using a Davies-Meyer feedforward construction. These functions, for the most part, can be summarized as exhibiting the structure of a UFN, or Unbalanced Feistel Network, that is both source-heavy and heterogeneous. Over the past decade or so, up until recent months, we’ve seen quite a bit of cryptanalysis of these functions, including work by Biham, R. Chen, Chabaud, Joux, Dobbertin, Wang, Yu, Yin, Feng, H. Chen, Lai, Van Rompay, Biryukov, Preneel, Vandewalle, Hawkes, Paddon, Rose, et cetera.
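To make the Davies-Meyer idea concrete, here is a minimal sketch in Python. It uses AES-128 (via the third-party pycryptodome package) as a stand-in block cipher; the cipher choice, the 16-byte block width, and the all-zero IV are my own illustrative assumptions, not the actual internals of MD5 or SHA-1, whose compression functions are built around dedicated ciphers, not AES.

```python
# Davies-Meyer feedforward, sketched with AES-128 as a stand-in cipher.
# Assumption: pycryptodome is installed (pip install pycryptodome).
from Crypto.Cipher import AES

BLOCK = 16  # bytes; AES block/key size, standing in for the real widths

def davies_meyer(h: bytes, m: bytes) -> bytes:
    """One compression step: H_i = E_{m_i}(H_{i-1}) XOR H_{i-1}.

    The message block keys the cipher, the previous chaining value is
    the plaintext, and the feedforward XOR makes the step one-way even
    though the cipher E itself is invertible.
    """
    assert len(h) == BLOCK and len(m) == BLOCK
    e = AES.new(m, AES.MODE_ECB).encrypt(h)
    return bytes(x ^ y for x, y in zip(e, h))

def toy_hash(data: bytes) -> bytes:
    """Iterate the step over zero-padded blocks; real designs also apply
    Merkle-Damgard strengthening (length padding), omitted here."""
    if len(data) % BLOCK:
        data += b"\x00" * (BLOCK - len(data) % BLOCK)
    h = b"\x00" * BLOCK  # illustrative IV; real functions fix specific IVs
    for i in range(0, len(data), BLOCK):
        h = davies_meyer(h, data[i:i + BLOCK])
    return h

print(toy_hash(b"block party").hex())
```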

Since cryptanalysis reduced SHA-1’s collision complexity from the generic level of 2^80 down to 2^63, the media has been an uninvited catalyst for nonsensical thinking. Many suggestions revolve around the panic of rushing to make infrastructure alterations to support SHA-2 and phase out the use of SHA-1. Before I go any further, I want to briefly stress the importance of not rushing cryptography at the implementation level; there is nothing quite as detrimental to security as altering cryptography on a whim at the implementation level. It’s at this level that the effectiveness of cryptography is largely dictated.
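For context on those numbers: the generic 2^80 figure is just the birthday bound for a 160-bit output, while 2^63 is the published attack complexity, which no formula will hand you. A quick back-of-the-envelope sketch, assuming nothing beyond the standard birthday-paradox approximation:

```python
# Birthday-bound arithmetic for generic collision resistance: an n-bit
# hash falls to brute force after roughly 2^(n/2) evaluations.
import math

def birthday_work(n_bits: int) -> float:
    """Evaluations for ~50% collision odds: about 1.177 * 2^(n/2)."""
    return 1.1774 * math.sqrt(2.0 ** n_bits)

for name, n in [("SHA-1", 160), ("SHA-256", 256)]:
    print(f"{name}: generic collision work ~ 2^{math.log2(birthday_work(n)):.1f}")

# The gap between SHA-1's generic 2^80 and the 2^63 attack:
print(f"speedup: 2^17 = {2 ** 17:,}x less work than brute force")
```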

First, this [migration to a larger hash function output] should have been initiated long beforehand. Advocating primitives with at least 128-bit security levels should already be standard practice, even if just in the de facto sense. Second, while SHA-256, for example, should be sufficient to satisfy this desire for 128-bit security, it is not a terminal solution. At best, it is an interim one, and here’s why: recent cryptanalysis has yielded enhanced methodologies and extensions that apply, in some form or other, to each construction within the MD4 family. It’s not time to use functions within that family that merely support larger outputs; it’s time to explore the route of designing hash functions that not only support larger output lengths, but are composed of entirely different design strategies. It’s time to analyze other strategies for constructing cryptographically secure hash functions.

By “other strategies,” we’re simply looking at anything but the UFN nature of the majority of existing conventional hash functions. There is one peculiar strategy that seems, in my opinion, to be the ideal point of interest in a case such as this: the wide trail strategy. Sound familiar? The underlying primitive of the AES, Rijndael, is a child of this strategy, as are a variety of similar ciphers in the family that preceded it. Vincent Rijmen, the “Rij” in “Rijndael,” along with Paulo Barreto, gave us Whirlpool; it’s an interesting design that employs Merkle-Damgård strengthening and a Miyaguchi-Preneel hashing construction, built around a block cipher that is similar, in some respects, to Rijndael. This 512-bit hash function relies on the wide trail strategy as well, and it should be a model for the direction, or one of the directions, we should investigate. Another effect, both direct and indirect, is that such a direction would prompt further, more extensive and rigorous cryptanalysis of the wide trail strategy. Not only would this bolster confidence in the strategy as a methodology for constructing cryptographically secure hash functions, it would also provide a better look into the same general principles on which Rijndael is based, which is certainly a beneficial tactic.
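To show how Miyaguchi-Preneel differs from the Davies-Meyer feedforward sketched earlier, here is a companion sketch. AES-128 again stands in for Whirlpool’s actual cipher W (a 512-bit, Rijndael-like design), and the 16-byte sizes and 64-bit length field are illustrative assumptions; real Whirlpool hashes 64-byte blocks and uses a 256-bit length field.

```python
# Miyaguchi-Preneel step, sketched with AES-128 standing in for
# Whirlpool's 512-bit Rijndael-like cipher W.
from Crypto.Cipher import AES  # pip install pycryptodome

BLOCK = 16

def miyaguchi_preneel(h: bytes, m: bytes) -> bytes:
    """One step: H_i = E_{H_{i-1}}(m_i) XOR m_i XOR H_{i-1}.

    Unlike Davies-Meyer, the chaining value keys the cipher and the
    message block is the plaintext; both are folded back in via XOR."""
    e = AES.new(h, AES.MODE_ECB).encrypt(m)
    return bytes(x ^ y ^ z for x, y, z in zip(e, m, h))

def md_strengthen(data: bytes) -> bytes:
    """Merkle-Damgard strengthening: append a 1 bit, zero padding, and
    the message length, so inputs of different lengths pad distinctly.
    (Real Whirlpool uses a 256-bit length field; 64 bits here.)"""
    bit_len = 8 * len(data)
    data += b"\x80"
    while (len(data) + 8) % BLOCK:
        data += b"\x00"
    return data + bit_len.to_bytes(8, "big")

def toy_whirlpool_style(data: bytes) -> bytes:
    h = b"\x00" * BLOCK  # Whirlpool's IV is in fact the all-zero string
    padded = md_strengthen(data)
    for i in range(0, len(padded), BLOCK):
        h = miyaguchi_preneel(h, padded[i:i + BLOCK])
    return h

print(toy_whirlpool_style(b"wide trail").hex())
```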

The Real Issue and The Right Atmosphere

In conclusion, I feel that the issue at hand, the real issue, is much deeper than simply extending the length of hash function output to provide a larger margin of generic security. I feel this is a matter of structural significance, where we need to allocate our analytical effort to strategies other than the UFN composition of the MD4 family, lest we find ourselves in a position where every construction within the only family we have is susceptible to attacks that have the potential to become more efficient. Note that I’m certainly not declaring UFNs inherently insecure, by any means. I’m simply suggesting a change in design policy, so to speak. Look at it as a way of approaching hash function design the way we’ve approached block cipher design.

Despite the fact that hash functions are the jack-of-all-trades, wearer-of-many-hats functions of cryptography, we seem to know a bit more about block cipher design, given that almost all of the conventional hash functions we have are derivatives of the same family, while block ciphers boast a variety of contrasting designs. Hosting a competition for a federal standard is the most fruitful method for milking the cryptographic community of its design prowess. The constraints of a standard provide the right atmosphere for intense analysis, which ensures that criteria are balanced between conservatism and compromise. By doing so, we maximize the instances in which the function is suitable for implementation, while providing the cryptographic security we would expect from such a function.

Saying our goodbyes to SHA-1 is long overdue. SHA-256 is responsible for keeping its belt tight and pants up until 2011, when we’re expected to have a new hash function standard. Beginning in 2008, we’ll start this trek down a lengthy path to understanding what is, arguably, the most versatile of cryptographic functions. We owe them that much. Just as with the AES selection process, we’re going to see a large collaboration between the best of the best, from continents galore, pooling together schools of thought and methodologies of design. Thanks to NIST, and all of the cryptographers involved, it’s time to share some optimism. The understanding of hash function design that we’ll gain from the process alone will be just as big a deal as the standard that is chosen. Without a doubt, this will be one of the biggest steps cryptography has taken. Let the cryptographic kumite begin.
