Bruce Schneier at LSE on the economics of security
Posted by william in psychology, Faster/smaller/better... on March
22nd, 2007
In his second talk in two days at the LSE (this one co-sponsored by
the BCS) Bruce Schneier spoke about the economics of information
security. Economics is a useful tool for shining a light on
information security questions which make no sense from the
technology point of view, he said, describing 10 trends in info
security:
1. Economic value of information: an old notion with new
implications. It's now normal to have companies whose physical assets
are worth less than information assets which can be used for
marketing, process streamlining, personalisation, law enforcement and
forensics. "If information didn't have value, computer security
wouldn't exist."
2. Networks as critical infrastructure. If it's important these days
it comes over the net. When did you last get something important in
the mail?
3. Third parties controlling information. "Your information isn't
controlled by you." Our existing legal protections are written in
terms of your person, home, car - things under your control. But your
emails reside at Google or your ISP, the merchant controls your
shopping data and the hospital (or in the UK a central authority)
your medical records. Paris Hilton didn't leak messages from hr own
phone - they were hacked from T-Mobile. It's all stored elsewhere
under someone else's control.
4. Criminals are thriving on the net. It used to be hobbyists
defacing web sites; now the dominant hackers are criminals trying to
take your money with spam, fraud due to impersonation and denial-of-
service extortion. There's a business model for spam and a market for
bot networks. It's global and by some accounts comparable in value to
the market for illegal drugs. "They're not going away and we're not
going to solve this."
5. Complexity is the worst enemy of security. If computers get better
faster and cheaper, why is security getting worse? The answer is
complexity. If we wanted a secure operating system we'd start by
going back to DOS, but we love complexity. Security is getting better
like everything else, but complexity is getting worse faster.
6. Slower patching; faster exploits. There's a weird business model
for software: we build it, throw it out there, and fix it later. You
can either have patches fast, or well tested. But not both. Hence
Microsoft's move to a monthly "patch Tuesday".
7. Sophistication of automatic worms. They used to be simple
creatures. Now they're polymorphic, metamorphic, they use Google for
vulnerability assessment. New worms don't advertise their presence
with a cheeky message. They report back to their owner for
instructions: to sniff passwords, collect a keystream, infect other
platforms. It's no longer about novelty to score style points; it's
about repeatedly doing what's effective.
8. Untrustworthiness of the endpoints. The traditional security
model, on which something like PGP encryption or SSL is based,
requires trusted end points. This model fails. The attackers use
Trojans or spyware. The bad guy captures your keystream, or your
decrypted data, or runs background transactions once you're
authenticated. We're trying, eg with Microsoft Vista, but these
endpoints are hard to fix.
9. The end user seen as attacker. The aim of DRM is to protect
someone else from you. It reduces your functionality, pisses you off
and you can't delete it. It sounds like a hacking tool; it looks like
malicious code (eg the Sony rootkit). The security expert can't
protect you and protect from you at the same time. "If I protect you
it makes it harder for Sony. If I make it easier for Sony it's
easier for the bad guy." So we're set to get more and more invasive
tools that assume we are the bad guys.
10. Regulatory pressure. It's hard to get people to buy security:
it's a "fear" sell, and selling it on greed never works. But what does
work to sell security is regulation. Fear of failing an audit is way
bigger than fear of data theft. "It annoys me no end, but there you
have it."
Things are getting worse not better. They're getting more
complicated. The non-technical aspects are more important than the
technical. And increasingly the driver is economics and not computer
science.
The basic economics of security are that if you lose £1000 by being
mugged and it happens once every ten years, your expected loss is
£100 a year, so it's worth spending up to £100 a year preventing it.
But this model breaks down as the likelihood approaches zero and the
effects become catastrophic: you're trying to multiply zero by
infinity, and there's no numeric grasp of the risk.
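The mugging arithmetic above is the standard annualised loss expectancy (ALE) calculation from risk analysis. A minimal sketch, with the figures from the talk (the function name is mine, not Schneier's):

```python
# Annualised loss expectancy: the expected loss per year from a risk.
# Rational spending on prevention should not exceed this figure.

def annualised_loss_expectancy(loss_per_event: float,
                               events_per_year: float) -> float:
    """Expected yearly loss = single-event loss * annual frequency."""
    return loss_per_event * events_per_year

# A £1000 mugging once every ten years:
ale = annualised_loss_expectancy(1000, 1 / 10)
print(ale)  # 100.0 -> worth spending up to £100 a year on prevention
```

The breakdown Schneier describes is visible here too: as `events_per_year` tends to zero and `loss_per_event` tends to infinity, the product is undefined in any useful sense.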
Economics gives us the idea of an externality - a cost borne by
someone other than the person responsible. We pollute a river to make
chemicals, those downstream suffer. To correct this either the
authorities fine the polluter, or those who suffer sue. We need a
similar fix for the problem of buggy software. A lot of security
paradoxes can be explained by externalities: phone security, data
thefts, buggy software, insecure home computers. We need to align the
ability to mitigate risk with financial responsibility.
The recommended next steps are:
1. Understand the security problem and stakeholders
2. Understand the security and non-security tradeoffs
3. Align the economic incentives (otherwise the problem will never
get solved).
4. Implement countermeasures to reduce risk.
5. Iterate as technology changes things, making them faster, easier
and cheaper for the bad guys as well as the good guys.