Bruce Schneier really got me thinking recently when he posted an article on his website titled Security Design: Stop Trying to Fix the User. Bruce is a well-known thinker and writer in the information security field, and while he isn't always on the money with his observations, he does have a way of making us see outside the box that IT pros often find themselves boxed into. Bruce's basic thesis in his article is that the "problem isn't the users: it's that we've designed our computer systems' security so badly that we demand the user do all of these counterintuitive things." He then asks us why we can't design computing systems that allow users to "choose easy-to-remember passwords" or "click on links in emails with wild abandon" or even "plug a USB stick into a computer without facing a myriad of viruses."
Why indeed. Instead of striving to build IT systems that make it easy for users while still being secure, we argue instead that the trade-off between usability and security is intrinsic, axiomatic, and inescapable. We then infer from this axiom that we must divide our effort between securing the technology and educating the user in order to achieve maximum security effectiveness in a man-machine environment. In other words, organizations must allocate funds to build programs that enhance security awareness among their employees. We should educate users on how to create complex but easy-to-remember passphrases. We must warn them not to click on links in emails from senders they don't recognize. We need to caution them against picking up USB memory sticks they may find littering the company parking lot. And if our users fail to adhere to the security policies we lay out for them, we should punish them accordingly.
The wrong way to try to fix users
If you take the common position that a secure IT environment equals secure technology securely implemented plus end-user security awareness, then you need to find ways to successfully change user behavior toward actions that enhance security instead of degrading it. Unfortunately, most organizations don’t do this in the real world because they think IT security is an IT problem, not a problem in human psychology.
It does little good to post information security posters throughout your workplace; employees just find these condescending and ignore them. It's like the User Account Control (UAC) prompt that appears in Microsoft Windows when you want to install a piece of untrusted software on your computer; you simply click OK because it doesn't matter if Windows doesn't trust the software -- you trust it, otherwise you wouldn't have downloaded it. If you still think such posters are going to change the behavior of your employees, you should read the Magic Quadrant for Security Awareness Computer-Based Training Vendors report that Gartner published a couple of years ago to find out just how ineffective most corporate security awareness initiatives are in general.
Bring in behavioral science
Along these lines, Deanna Caputo, a behavioral scientist in MITRE's Social, Behavioral, and Linguistic Sciences Department, wrote an article a few years ago about how lessons learned and strategies developed by behavioral scientists can help organizations reduce the danger of employees falling for phishing attacks that try to trick users into coughing up sensitive information like passwords or credit card numbers. Larger organizations might find it worthwhile to hire someone like Deanna on a consulting basis to develop an information security awareness campaign that actually works, rather than just putting up a poster over the coffee machine or telling users to go read this article from the Microsoft Safety & Security Center, which explains how to recognize phishing email messages, links, or phone calls. Web articles like that are similar to posters and are likely to garner just as little attention from your employees.
The team approach
The problem with involving behavioral science experts in developing information security awareness campaigns is that it costs money to hire people who have not just academic qualifications but solid experience in cybersecurity risk management and mitigation. In fact, I can almost hear the bean counters in upper management at your company groaning at the mere mention of paying to bring in some Ph.D. to help them solve a problem. So what else can you do to fix the broken users in your organization?
One approach I've seen that actually seems to work is to get the users themselves involved in developing your awareness campaign. Gather users into teams based on location or department, and task them with assessing the risks the organization (and their own jobs) face as a result of inadequate information security awareness, and then with coming up with some simple ways of reducing those risks by correcting their own behavior.
The success of such a team approach rests largely on the fact that the users themselves take responsibility for their own actions in the workplace. Of course, you'll need to motivate your employees in some fashion to get them to take such an exercise seriously, but that's where you can always get the bean counters involved. After all, which costs more: hiring a Ph.D. on a contract basis for several months to develop a top-down program that tries to influence user behavior without their knowledge, or catering in a few nice dinners for after-hours teams of employees to help get them motivated to change their own behavior? And please, make their rewards real, like some nice food for their meetings, and forget about awarding them badges, verbal recognition, or a nice block of Lucite. Management should stay away from these team meetings and let the employees involved organize and run them. The teams should be allowed to bring in IT staff for questioning about various issues to supplement the learning employees are achieving on their own.
Putting it into practice
If your users are the broken part of your information security environment, let them fix themselves instead of trying to fix them yourself. Human beings can do remarkable things when you give them the opportunity to deal with stuff themselves instead of having it forced on them. One of the best resources I've found for business leaders who want to effectively implement a team approach with their employees is Shawn Casemore's book Operational Empowerment: Collaborate, Innovate, and Engage to Beat the Competition. It's full of practical, actionable advice on how you can engage your employees by challenging them to discover and introduce improvements that can make their jobs easier and more secure. The payoff for the organization is that when users fix themselves, many of the problems the organization faces almost miraculously disappear -- including the kind of security issues Bruce Schneier talks about in his article.
Photo credit: FreeRange Stock