Let us have a little less of “hands across the sea,” and a little more of that elemental distrust that is the security of nations. War loves to come like a thief in the night; professions of eternal amity provide the night.
-Ambrose Bierce, “The Devil’s Dictionary”
As I’ve said before, the best practical measure of security I can think of is “cost-to-break.” It’s a good reflection of the relative difficulty someone faces in overcoming a particular measure or control. It also helps that it deals in currency, a consistent unit (within a given economy) for a lot of modeling purposes; you can even factor in things like windows of opportunity and risk using a financial model of security.
If that makes sense, the next concept is critical: cost-to-break isn’t static…it changes. It is subject to many influences, from changes in laws and law enforcement on the penalty side to changes in technical difficulty. In fact, it’s fair to say that cost-to-break will normally trend down more than up for a given control or security measure, and that the rate of decline depends on a number of factors, including (but not limited to) the amount of attention and innovation the bad guys bring to bear, changes in overall technology capability, and time.
In the world of cryptography, we are used to extremely tight tolerances for mechanisms and controls. In that world, the advent of something like efficient factoring of the products of large primes would have an immediate and dramatic effect. But when we leave the black-and-white world of classical cryptography and hard math and get into the fudgier, economic, “shades of grey” world of things like authentication, the game changes quite a bit.
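To make the factoring point concrete, here is a toy sketch (my own illustration, not from any cited source): trial division factors a small semiprime instantly, but the work grows so quickly with the size of the modulus that real RSA-scale products of large primes stay far out of reach, and an efficient factoring algorithm would erase that gap overnight.

```python
def trial_factor(n):
    """Find a factor of n by brute-force trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f  # smallest prime factor and its cofactor
        f += 1
    return n, 1  # n itself is prime

# A toy "RSA modulus": the product of two small primes, 43 * 47.
p, q = trial_factor(2021)
print(p, q)  # -> 43 47
```

The loop runs on the order of the square root of n, which is why doubling the key size makes this approach astronomically more expensive, and why the cost-to-break of RSA is a step function waiting on a mathematical breakthrough rather than a gentle slope.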
Unlike in the factoring example, the creation of a single tool won’t have a discontinuous effect on the strength of any individual authenticator; but the cumulative effect over time will still lower the cost-to-break, and therefore the pragmatic security value, of any authenticator. This is, in effect, a classical commoditization curve, albeit a non-intuitive one. On the “product-and-goods” side, when a product is ubiquitously available and homogeneous in quality, it differentiates on price alone, and prices plummet along a commodity decline curve. On this darker side, there is a similar commoditization curve; here, it is the cost-to-break that plummets over time.
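As a sketch of that curve (my own toy model; the numbers are invented, not measured), cost-to-break can be modeled as a value that loses a constant fraction each year to attacker attention, better tooling, and cheaper hardware:

```python
def cost_to_break(initial_cost, annual_decline, years):
    """Dollar cost to defeat a control after `years`, assuming a constant
    fractional decline per year (a simple commoditization curve)."""
    return initial_cost * (1 - annual_decline) ** years

# Hypothetical control: $100,000 to break today, losing 30% of that cost per year.
for year in (0, 2, 5, 10):
    print(year, round(cost_to_break(100_000, 0.30, year)))
```

The exact shape matters less than the direction: under any sustained decline rate, the control’s pragmatic value erodes until breaking it is effectively a commodity.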
So I naturally loved Dan Goodin’s “Why passwords have never been weaker—and crackers have never been stronger.” There’s some good stuff in there around the heart of the problem, such as having 25 separate accounts but, on average, using only 6.5 passwords to protect them (he cites this Microsoft Research article), and diving into the leveraging of Moore’s law and the ready availability of password-cracking tools. Ultimately, though, I saw it as an expression of the inevitable commoditization of the world’s dominant electronic authentication form factor: the password. I’ll make a few points here in bullet form for clarity…
- Passwords have been around forever
- Discipline is poor around choosing the right ones, changing them often, and protecting them, and this won’t really change (if anything, I was surprised that the average of 6.5 passwords per person, which I’ll call PPP because I like how that sounds, is as high as it is! Can PPP really climb to 20 or 40, practically speaking, without introducing weaker controls for managing them?)
- They are used too widely
- Discipline on the backend is lax and rare
- Attack techniques are advanced and keep getting more effective over time
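The last bullet is easy to demonstrate. Here is a minimal sketch (the “breach dump” and wordlist below are invented) of the cheapest technique in the cracker’s toolbox: an offline dictionary attack against fast, unsalted hashes. Real crackers run billions of guesses per second on GPUs with rule-mangled wordlists; the principle is the same.

```python
import hashlib

def sha256_hex(password):
    """Fast, unsalted hash -- exactly what a cracker hopes to find in a dump."""
    return hashlib.sha256(password.encode()).hexdigest()

# Pretend breach dump of password hashes (invented for illustration).
leaked_hashes = {sha256_hex(p) for p in ("letmein", "qwerty123")}

# Tiny stand-in for a real multi-billion-entry wordlist.
wordlist = ["password", "letmein", "dragon", "qwerty123", "monkey"]

# Hash each guess and check it against the dump -- no rate limit, no lockout.
cracked = [w for w in wordlist if sha256_hex(w) in leaked_hashes]
print(cracked)  # -> ['letmein', 'qwerty123']
```

Nothing here is clever; it is the offline, unthrottled nature of the attack plus cheap compute that makes it devastating against common passwords.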
This will only get worse. So what should we do about it? Here are the first two things – feel free to comment on them or add more (via comments):
- We can “beef up” passwords quite a bit with better “best practice” back-end controls (and more front-end options for helping raise the PPP) applied more consistently across the industry (remember what happened to Mat Honan when two large companies didn’t build according to the same principles, which I blogged on here). But the decline of passwords is inevitable and shouldn’t be lamented; we should plan for it, and we should also accept that passwords are going to be part of our collective lives for quite some time to come.
- We should move to stronger forms of authentication that commoditize more slowly (from a security perspective), and we should beef those up likewise…but ultimately all form factors are destined for the same decline in cost-to-break, in direct proportion to the attention they receive from the bad guy community: OTPs, certificates, grid cards, magic squares, knowledge bases and so on, all of them. This leads to the third and final thing we should do…
- In the end, we have to build systems that use as many intuitive feeds and form factors as possible concurrently, with context and intelligence governing them and adapting to them.
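On the first point, one concrete back-end “best practice” can be sketched with Python’s standard library (a sketch only; the iteration count is a placeholder to tune for your hardware): store a per-user random salt plus a deliberately slow PBKDF2 hash, so a stolen credential database can’t be cracked at raw-hash speed.

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # placeholder; tune upward as hardware gets faster

def hash_password(password, salt=None):
    """Return (salt, digest) using salted, deliberately slow PBKDF2-HMAC-SHA256."""
    salt = salt or os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-derive the digest and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # -> True
print(verify_password("letmein", salt, digest))                       # -> False
```

The salt defeats precomputed tables and forces attackers to crack each user separately; the iteration count turns billions of guesses per second into thousands. It doesn’t stop commoditization, but it buys back a lot of cost-to-break.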
If we can actually follow that third point to its logical conclusion, we will see an improvement in trust on the Internet and a meaningful impact on all of our lives.
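In toy form, that last idea of context and intelligence governing authentication might look like this (all signal names, weights, and the threshold are invented for illustration): score several contextual signals and let the score decide whether a password alone suffices or a second factor is required.

```python
# Invented risk signals and weights -- a real system would learn these
# from behavior and telemetry rather than hard-coding them.
SIGNAL_WEIGHTS = {
    "new_device": 0.4,
    "unusual_geolocation": 0.3,
    "odd_login_hour": 0.1,
    "recent_password_reset": 0.2,
}

def risk_score(signals):
    """Sum the weights of the risk signals that fired (0.0 = nothing unusual)."""
    return sum(SIGNAL_WEIGHTS[s] for s in signals)

def required_auth(signals, step_up_threshold=0.5):
    """Step up to a second factor only when the context looks risky."""
    if risk_score(signals) >= step_up_threshold:
        return "password + second factor"
    return "password"

print(required_auth([]))                                     # -> password
print(required_auth(["new_device", "unusual_geolocation"]))  # -> password + second factor
```

The point of the sketch is the shape, not the numbers: no single form factor has to carry the whole load, because the system adapts how much proof it demands to the context of the request.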
PS: In the spirit of fun from one of my posts last week, here is a haiku for it:
Passwords float on ether
Access rights float to hacker’s shores
We need not words alone