I don't normally copy directly from others, but the following post by Bruce Schneier provides an introduction to one of the most important topics in Information Security. I have tried long and hard to write about it, but the topic is messy, controversial and hard evidence is thin. Here goes...
Bruce Schneier writes in Forbes about the booming vulnerabilities market:
All of these agencies have long had to wrestle with the choice of whether to use newly discovered vulnerabilities to protect or to attack. Inside the NSA, this was traditionally known as the "equities issue," and the debate was between the COMSEC (communications security) side of the NSA and the SIGINT (signals intelligence) side. If they found a flaw in a popular cryptographic algorithm, they could either use that knowledge to fix the algorithm and make everyone's communications more secure, or they could exploit the flaw to eavesdrop on others -- while at the same time allowing even the people they wanted to protect to remain vulnerable. This debate raged through the decades inside the NSA. From what I've heard, by 2000, the COMSEC side had largely won, but things flipped completely around after 9/11.
It's probably worth reading the rest of the article too - I only excerpted the one paragraph discussing the Equities Debate.
What's it about? In a nutshell, the intelligence community debated long and hard about whether to allow the world's infosec infrastructure to remain vulnerable, so as to assist spying. Or not... Is the choice really that stark? The answer is difficult to prove, but I'm going to put my money on YES: for the NSA it is either/or. The reason is that when the NSA goes for vulnerabilities, there are all sorts of flow-on effects:
Limiting research, either through government classification or legal threats from vendors, has a chilling effect. Why would professors or graduate students choose cryptography or computer security if they were going to be prevented from publishing their results? Once these sorts of research slow down, the increasing ignorance hurts us all.
I remember this from my early formative years at university - security work was considered a bad direction to head in. As you got into it, you found all of these restrictions and regulations. It just had a bad taste; the brighter people were bright enough to go elsewhere.
In general, it seems that it is the times when the NSA erred on the side of "YES! let them be weak" that we are now counting the cost of. If you look back through the last couple of decades, the mantra is very clear: security is an afterthought. That's in large part because almost nobody coming out of the training camps is steeped in it. We got away with this for a decade or two while the Internet was in its benign phase - the 1990s, when spam was about the worst of it.
But that's all changed now; the chickens are coming home to roost. For one example, look at the timeline of CA attacks over the last decade: there is a noticeable spike in 2011. For another, look at Stuxnet and Flame as cyberweapons of inspiration.
Which brings costs to everyone.
I personally think the Equity Issue within the NSA is perhaps the single most important information security influence, ever. Their mission is twofold: to protect and to listen. By choosing vulnerability over protection, they have made us all suffer. We are now in the cost-amortisation phase; for the next decade we will suffer a non-benign Internet risk environment.
Next time you read of the US government banging the cyberwar drum in order to rustle up budget for cyberwarriors, ask them whether they've re-thought the equity issue, and why we should provide funds for something they created in the first place.

Posted by iang at June 16, 2012 06:50 AM | TrackBack