Simplicity can take many forms within the context of cryptography, but we'll look at three "acts of simplicity" that are concerned with the effectiveness of cryptography in practice. In conclusion - and I know I harp on this quite often - I'll throw in my spiel on using a MAC; it will be short, sweet, and with enough substance to get the point across. My rationale is simply (pun intended) this: after thinking through these three acts of simplicity, I started to see the correlation between them, and how it all pointed towards what I sometimes call cryptographic conservatism - which is, simply (there I go again), the philosophy of being resourceful and positively paranoid. Someone with this frame of mind designs with security first and foremost: eliminating unnecessary complexities and using a primitive for everything it can securely be used for (e.g., using the same block cipher in both your encryption and authentication schemes), rather than introducing more primitives where one will do. This type of cryptographer also wouldn't be quick to write off seemingly "unlikely" attack models; they would let cryptography address such models, recognizing that the insider (implied "ally") is sometimes potentially more dangerous than the outsider (implied "adversary"). Good paranoia - the kind that isn't put-on-your-tin-foil-hat, incompetence-driven paranoia - is a crucial step towards building a thorough cryptographic solution. Be reasonable, but not scantily clad. Minimize the assumptions you have to make, wherever you can. That, in essence, summarizes what you'll read throughout the rest of this article.
Keeping your ducks in a row, and their feathers to themselves
The first act of simplicity is a general principle that exists well beyond the realm of cryptography; when applied to cryptography, however, it becomes a necessity - at least, if you understand the importance of simplifying the analysis of a system to the greatest extent possible. With that said, it shouldn't come as much of a surprise that I mean the modularization of components.
This is a fairly basic concept that coincides with keeping your ducks in a row; it's easier to account for each of them that way, and no duck can lead the others astray. There may be cases of dependence between modules in some scenarios, but stress - and I mean stress - independence between them. (This is reasonable; we can't always eliminate dependencies altogether, but we can minimize them.) Ideally, each module should be secure on its own, so that any insecurity stays local to that module. This separation of "ducks" into their own respective modules makes analysis much simpler, and simple module-to-system interfacing - done properly, as with everything in cryptography - is a cornerstone of why modularity is good. Being modularized pays off tremendously.
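To make the "ducks in a row" idea concrete, here is a minimal sketch of my own (the names and interfaces are hypothetical, not from any particular library): each module hides behind a narrow bytes-in/bytes-out interface, keeps its own key material, and the composing layer never reaches into a module's internals.

```python
from typing import Protocol, Tuple

class Encryptor(Protocol):
    """One duck: confidentiality, behind a narrow bytes-in/bytes-out interface."""
    def encrypt(self, nonce: bytes, plaintext: bytes) -> bytes: ...
    def decrypt(self, nonce: bytes, ciphertext: bytes) -> bytes: ...

class Authenticator(Protocol):
    """Another duck: integrity, with its own independent key material."""
    def tag(self, data: bytes) -> bytes: ...
    def verify(self, data: bytes, tag: bytes) -> bool: ...

def send(enc: Encryptor, auth: Authenticator,
         nonce: bytes, message: bytes) -> Tuple[bytes, bytes]:
    """The composing layer touches each module only through its interface,
    so a flaw inside one module stays local to that module."""
    ciphertext = enc.encrypt(nonce, message)
    return ciphertext, auth.tag(nonce + ciphertext)
```

The point isn't the crypto (any concrete `Encryptor`/`Authenticator` can be plugged in); it's that analysis of `send` reduces to analyzing two independent ducks plus one tiny composition.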
Recycling isn't just for aluminum cans anymore
The second act of simplicity is the recycling of primitives. To begin - and I'm being objective here - the majority of cryptographic software you'll find floating around is pollution. It's either missing something (a MAC, for instance) or horribly implementing what it does include. On occasion, I'll find an implementation that offers dozens of primitives, from block ciphers to stream ciphers to hash functions and so on; this introduces complexities that can affect security at both the cryptographic and implementation levels. Remember: the more options you add, the more complexity you introduce. Most users aren't cryptographers, so don't burden them with a plethora of possible configurations that may or may not end up being secure. Recycle primitives where you can; in many cases, we can use the same primitive in multiple schemes - AES, for example. Consider an encryption scheme that uses AES in CTR mode alongside an authentication scheme that uses CMAC-AES. Set a configuration that is secure by default, and give users only secure choices - or none at all. Limit them to as few choices as possible, such that the implications of each choice are obvious: "this is secure; that is not." It's difficult enough for a cryptographer to design secure primitives, and comparably difficult for a programmer to implement them; if the user is then given opportunities to configure the implementation insecurely, we're reducing the potential effectiveness of cryptography in practice. And that's the primary goal of practical cryptography - designing systems that are both useful and secure.
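The real-world pairing above is AES-CTR plus CMAC-AES: one block cipher recycled into two schemes. As a self-contained sketch of that same "one primitive, two schemes" idea - using HMAC-SHA256 as the single shared primitive, since the Python standard library has no AES; the function names are mine, not a real API - we can build both a CTR-style encryption scheme and a MAC from one keyed function:

```python
import hmac
import hashlib

def prf(key: bytes, data: bytes) -> bytes:
    """The single recycled primitive: HMAC-SHA256 used as a PRF
    (standing in for the block cipher in the AES-CTR/CMAC-AES pairing)."""
    return hmac.new(key, data, hashlib.sha256).digest()

def ctr_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Scheme 1, encryption: CTR-style. Keystream block i is
    PRF(key, nonce || i), XORed with the data. The same call decrypts."""
    out = bytearray()
    for i in range(0, len(data), 32):
        keystream = prf(key, nonce + i.to_bytes(8, "big"))
        out.extend(b ^ k for b, k in zip(data[i:i + 32], keystream))
    return bytes(out)

def mac(key: bytes, data: bytes) -> bytes:
    """Scheme 2, authentication: the same primitive, a different key."""
    return prf(key, data)
```

No mode knobs, no cipher menu: the one configuration is the secure-by-default one, which is exactly the point.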
Oh c'mon. What are the odds of that ever happening?
The third act of simplicity is threat modeling. I can't generalize this, as it's scenario-specific, but I can suggest some conservative ways of thinking about it that generalize to many cases. Let cryptography do its job; that is, minimize trust, and don't assume more than you have to. Oftentimes, you may think of security in terms of insiders and outsiders, where the former group consists of your friends (trusted buddies) and the latter of your foes. But look at this model for a moment. History has shown that it's not outlandish for supposed insiders to betray the trust of their circle; in fact, because they're implicitly trusted, they have easy access to exactly what's being kept from outsiders. How do you address this? Don't be quick to trust even those on the inside. This may sound a little paranoid, but considering that insiders are potentially more harmful than outsiders, it minimizes the trust we hand out - and that makes it good paranoia. Be a conservative "threat analyst," and don't dismiss seemingly remote possibilities. I'd rather place my bets on addressing a threat cryptographically than on assuming that no adversary exists with the cleverness, aptitude, and resources to exploit it. Threats differ from scenario to scenario, but this principle applies to almost any of them.
How these acts of simplicity relate
What do these "acts" have in common? They all simplify analysis by minimizing things like dependencies, complexity, and trust. When failure occurs in a cryptographic system, it's generally not the fault of the mathematics behind it; more often than not, it's an implementation error that results in some exploitable insecurity. Modularity helps retain independent separation, and recycling primitives reduces the complexity that comes from introducing unnecessary primitives. There are even situations where the system "architects" got the implementation right but were far too lax with their security policies, failing to address threats that should have been addressed. In that case, even if there aren't any implementation vulnerabilities to exploit, the implementation may not provide the security that a given scenario and its threat model call for. Conservative threat modeling helps ensure that this doesn't happen.
A short, obligatory advocacy of using a MAC
Oh, and I can't forget my spiel on MACs. This ties in with threat modeling that some might call conservative, but in reality it's a necessity in the majority of cases; it's not extra security, but a vital component in the first place. I'd go as far as to say that in most scenarios where confidentiality must be preserved, integrity must be preserved too. In fact, failing to preserve integrity can lead to a loss of confidentiality as well. As I've often found when analyzing a system, encryption is the only service provided - what you might call "confidentiality preservation" - while nothing protects against an adversary's ability to manipulate information, which is often more detrimental than merely being able to divulge it. This is a realistic threat, and in most cases it should be explicitly addressed; it's the downfall of many threat models, a gaping hole that is all too often left open. Preserve integrity through message authentication. Use a MAC.
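As a small sketch of what "use a MAC" looks like in code - my own illustrative names, with HMAC-SHA256 as the MAC and the ciphertext treated as opaque bytes from whatever encryption scheme you've chosen - tag in the encrypt-then-MAC order, cover the nonce along with the ciphertext, and verify in constant time before ever decrypting:

```python
import hmac
import hashlib

def seal(mac_key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Encrypt-then-MAC: the tag covers the nonce and the ciphertext,
    so neither can be tampered with undetected."""
    return hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()

def open_checked(mac_key: bytes, nonce: bytes,
                 ciphertext: bytes, tag: bytes) -> bool:
    """Verify before decrypting; hmac.compare_digest avoids the timing
    leak of an early-exit byte comparison."""
    expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

Only if `open_checked` returns True should the ciphertext ever reach the decryption module; a forged or bit-flipped message is rejected without giving the adversary anything to work with.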
Summing things up a bit
To wrap things up, we've discussed three simple measures that are steps towards a conservatively secure system, and we concluded with an often-omitted component that is, in the majority of scenarios, vital - not optional conservatism. Call me simple-minded, being from the spectacular state of North Carolina, but simplicity has shown me that systems which abide by it are potentially useful from a security point of view; the likelihood of those that don't abide by it being secure is probably comparable to the frequency of snowball fights in the Kalahari (i.e., not worth counting on). Bottom line: be overbearing about your security, and don't make unnecessary trade-offs at its expense; 'tis a good resolution.