Sunday, December 6, 2009

So Long, And No Thanks for the Externalities

The Rational Rejection of Security Advice by Users

This is a fascinating analysis of the true cost of computer security. Since we have well established that I am a reactionary, anti-political-correctness extremist, you have some idea where this is going.
It is often suggested that users are hopelessly lazy and unmotivated on security questions. They choose weak passwords, ignore security warnings, and are oblivious to certificate errors. We argue that users' rejection of the security advice they receive is entirely rational from an economic perspective. The advice offers to shield them from the direct costs of attacks, but burdens them with far greater indirect costs in the form of effort.

Looking at various examples of security advice we find that the advice is complex and growing, but the benefit is largely speculative or moot. For example, much of the advice concerning passwords is outdated and does little to address actual threats, and fully 100% of certificate error warnings appear to be false positives. Further, if users spent even a minute a day reading URLs to avoid phishing, the cost (in terms of user time) would be two orders of magnitude greater than all phishing losses.

Thus we find that most security advice simply offers a poor cost-benefit tradeoff to users and is rejected. Security advice is a daily burden, applied to the whole population, while an upper bound on the benefit is the harm suffered by the fraction that become victims annually. When that fraction is small, designing security advice that is beneficial is very hard. For example, it makes little sense to burden all users with a daily task to spare 0.01% of them a modest annual pain.
(Continue reading (PDF -- twelve pages)...)
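The "two orders of magnitude" claim in the quoted passage is easy to sanity-check with back-of-envelope arithmetic. Here is a minimal sketch of that calculation in Python; all of the figures below (user population, value of time, annual phishing losses) are my own illustrative assumptions, not numbers taken from the paper:

```python
# Back-of-envelope cost-benefit check for "a minute a day reading URLs".
# Every figure here is an assumed, illustrative value.

users = 200e6            # assumed online user population
seconds_per_day = 60     # one minute per day spent inspecting URLs
hourly_value = 7.25      # assumed value of user time, $/hour (US minimum wage)

# Aggregate annual cost of that one minute a day, valued as time
annual_effort_cost = users * (seconds_per_day / 3600) * hourly_value * 365

annual_phishing_losses = 60e6   # assumed direct annual phishing losses, $

ratio = annual_effort_cost / annual_phishing_losses
print(f"effort cost: ${annual_effort_cost / 1e9:.1f}B/year, "
      f"{ratio:.0f}x the assumed phishing losses")
```

Even with very conservative inputs, the aggregate time cost lands in the billions of dollars per year, roughly a hundred times the assumed losses -- which is exactly the shape of the paper's argument: a tiny daily effort, multiplied across everyone, dwarfs the harm it prevents.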

This is the kind of economic analysis that Congress and regulators rarely do with government programs. Instead, they look at the feel-good factor and ignore the cost entirely. One line in particular caught my eye: "... economists have long studied how misaligned incentives often produce undesired outcomes ...". This practically defines government programs. I know that if many liberals read this document, their heads would explode. "How can you say this? Do you know the risk you are exposing users to?" Well, yes, as a matter of fact, I do. But just try having a rational discussion about bicycle helmets or seat belts with a liberal. They feel that no risk of any kind, in any amount, is acceptable. Hence, the disclaimer at the end:
Note: this paper is not to be read as an encouragement to end-users to ignore security policies or advice. The opinions expressed are those of the author.
They cannot seem to get their heads around calculated risk assessment. Since they can't, they feel that no one else can either, and regulators must force everyone to "be safe" -- whatever that means in real terms (while simultaneously ignoring the unintended and often very undesirable consequences of their nanny statism).

Or, maybe it's because most legislators are lawyers, and everything is a "slip-and-fall" situation. The possibility of getting sued and having someone awarded a settlement astronomically out of proportion to the injury makes no risk acceptable these days.
