October 05, 2013

The NSA's breach of RSA Inc's crypto: what to do? Where do we stand? My Answer: avoid American crypto

We now know -- on the balance of probabilities -- that the NSA conducted a three-phase attack on the crypto world. The first step was to insert a dodgy random number generator (RNG) into a NIST standard, called Dual_EC. The second step was to convince major suppliers to implement that RNG and set it as the default. The third step is: Profit! Which is to say, defeat your crypto.

That third step is effected by decrypting your traffic: if you know how the random numbers were fed into the protocol, and can predict them with some degree of crunchability, the keys fall out. We have no direct information on that third step, but the information that has come out in the post-Snowden world is damning. We can conclude that this was a phased and deliberate approach.
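To see why predictability is fatal, here is a toy sketch (everything in it is hypothetical and deliberately weakened): a protocol derives its session key directly from an RNG whose state the attacker can enumerate, with a 16-bit seed standing in for a backdoored generator.

```python
import hashlib

def weak_session_key(seed: int) -> bytes:
    # Hypothetical protocol: the session key is derived straight
    # from the RNG output -- here, a mere 16-bit seed.
    return hashlib.sha256(seed.to_bytes(4, "big")).digest()

def xor_encrypt(key: bytes, msg: bytes) -> bytes:
    # Toy stream cipher: XOR against the repeating key.
    return bytes(m ^ key[i % len(key)] for i, m in enumerate(msg))

# The victim encrypts under a "random" session key.
victim_seed = 40321
ciphertext = xor_encrypt(weak_session_key(victim_seed), b"ATTACK AT DAWN")

# The attacker enumerates all 2**16 RNG states -- "some degree of
# crunchability" -- using a known-plaintext crib to spot the hit.
recovered = None
for guess in range(2**16):
    candidate = xor_encrypt(weak_session_key(guess), ciphertext)
    if candidate.startswith(b"ATTACK"):
        recovered = candidate
        break
```

Real attacks are subtler, of course, but the shape is the same: once the RNG's internal state is guessable, every key derived from it is guessable too.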

What then to do? As Jon Callas of Silent Circle puts it:

The problem one faces with the BULLRUN documents gives a decision tree. The first question is whether you think they're credible. If you don't think BULLRUN is credible, then there's an easy conclusion -- stay the course. If you think it is credible, then the next decision is whether you think that the NIST standards are flawed, either intentionally or unintentionally; in short, was BULLRUN *successful*. If you think they're flawed, it's easy; you move away from them.

The hard decision is the one that comes next -- I can state it dramatically as "Do you stand with the NSA or not?" which is an obnoxious way to put it, as there are few of us who would say, "Yes, I stand with the NSA." You can phrase it less dramatically as standing with NIST, or even less dramatically as standing with "the standard." You can even state it as whether you believe BULLRUN was successful, or lots of other ways.

Where do we stand? We need to answer a bunch of questions in order to get to a conclusion.

The first question surrounds the nature of defaults. RSA Inc's alleged crime against its customers was to set the dodgy RNG as a default. Some will argue that this leaves the user the choice and responsibility of adjusting the defaults, whereas others will argue that the customer buys from RSA so that it gets a secure-by-default product.

Who is right? Peter points to compelling evidence that defaults are sticky: "Software Defaults as De Facto Regulation: The Case of Wireless APs," Rajiv Shah and Christian Sandvig, TPRC'07, September 2005:

Our results show that default settings play a powerful role in how people use technology. People are hesitant to change the manufacturer’s default settings and defer to them. While this argument is well known to scholars in this area, this study found empirical evidence to quantify this effect using multiple measures from two very different sources of data (one of them very large). In our empirical study, we found that most people do not change default settings.

Time and time again we've found that, out in userland, systems are insecure because the configuration issues are beyond the users. Users cannot deal with crypto and security decisions, and they are not asking us to offer them these choices. Users ask us to supply a secure product, and forcing "freedom of choice" on them is nonsense if we know, empirically, that they cannot adequately handle any choice.

The evidence and observation now suggests that setting an insecure default will leave a majority insecure, and is therefore a monumental fail:

Specifically, we found that when a manufacturer sets a default setting to ‘ON’, 96–99 percent of users follow the manufacturer’s suggestion. When a manufacturer sets a default setting to ‘OFF’, and users are exhorted to change the setting by the media, instruction manuals, and online help, only 28–57 percent of users will do so. About half of the users of the most popular product changed no defaults at all, and there was a small positive association between changing one default setting and changing another, even though the qualitative nature of the default settings we considered was quite different. There is also a suggestion that those living in areas with lower incomes, lower levels of education, and higher minority populations are less likely to change defaults, ...

The choice of default is integral to the provision of security, and an insecure default is an insecure product. Security defaults must then be secure. Indeed, I go even further than Shah & Sandvig, and ban choice altogether:

There is only one mode and it is secure.

If you want to deliver security to users, take the choice away, and this mess shouldn't happen.
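What does "only one mode" look like in practice? A minimal sketch, assuming nothing beyond the Python standard library; the construction (an HMAC-SHA256 keystream with encrypt-then-MAC) is illustrative only, not a vetted design. The point is the API shape: the caller gets no algorithm, mode, or RNG knobs to misconfigure.

```python
import hashlib, hmac, os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # HMAC-SHA256 in counter mode; purely illustrative, not a vetted AEAD.
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, message: bytes) -> bytes:
    # One mode, no parameters: nonce is generated internally,
    # the cipher and MAC are fixed, and the output is self-describing.
    nonce = os.urandom(16)
    ct = bytes(m ^ s for m, s in zip(message, _keystream(key, nonce, len(message))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_(key: bytes, sealed: bytes) -> bytes:
    nonce, ct, tag = sealed[:16], sealed[16:-32], sealed[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return bytes(c ^ s for c, s in zip(ct, _keystream(key, nonce, len(ct))))
```

One function to seal, one to open, and nothing for the user to get wrong in between.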

Back to RSA: we can conclude that they lived dangerously in promoting an NSA-influenced RNG as the default. Their decision was a dramatically bad and damaging one, and I'd rank this mistake alongside the famous Debian bug.

And our next question: Was their choice justified at the time? And, when or if did it become unjustified? Who knew what, when?

We can make an argument that when RSA did the contract work for the US government around 2004-2005, Dual_EC was a good idea. This argument survives because:

  • the RNG was based on Elliptic Curve cryptography, which was strongly recommended by the NSA in, for example, Suite B,
  • the NSA had designed the RNG, said it was good, and had impeccable credentials for that recommendation, and finally
  • NIST reviewed and accepted Dual_EC as a standard.

But this argument quickly became controversial, as Dual_EC was criticised almost immediately on becoming a standard. Criticisms continued to mount, and Dual_EC didn't survive unscathed for long: in 2007, cryptographers at Microsoft showed that the design appeared architected as if for a backdoor. At that point, the scales of cryptojustice tipped the other way: Dual_EC had been fatally undermined, and it behoved all suppliers of security to re-think its place in the world.
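For the curious, the suspected trapdoor can be sketched with a toy analogue, using the multiplicative group mod a prime instead of the real elliptic-curve arithmetic, and ignoring Dual_EC's output truncation (which merely costs the attacker a small brute-force). All numbers here are made up; the point is the hidden relationship between the two public constants P and Q.

```python
p = 2**61 - 1           # a toy prime modulus
Q = 5                   # public constant, analogue of Dual_EC's point Q
d = 123456789           # the secret trapdoor exponent
P = pow(Q, d, p)        # public constant P, secretly chosen as Q^d

def rng_step(state):
    # Each step publishes Q^state and moves to P^state,
    # mirroring Dual_EC's output r = x(sQ) and next state s' = x(sP).
    output = pow(Q, state, p)
    next_state = pow(P, state, p)
    return output, next_state

s0 = 0xDEADBEEF                 # the victim's seed
r1, s1 = rng_step(s0)           # first "random" output
r2, s2 = rng_step(s1)           # second "random" output

# Whoever knows d turns one observed output into the next state:
# r1^d = (Q^s0)^d = (Q^d)^s0 = P^s0 = s1.
recovered_state = pow(r1, d, p)
predicted_r2 = pow(Q, recovered_state, p)
```

Anyone without d faces a discrete-log problem; whoever chose P and Q with d in hand can read the stream forever after seeing one output. That asymmetry is what made the 2007 observation so damning.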

What did RSA do at this point? Nothing, or at least nothing that affected their users, which is the same thing. In the face of severe criticism of Dual_EC, RSA left it as the default.

Should they have acted? As their job is to provide secure crypto software, they have a duty of care on exactly that point: they knew or should have known that Dual_EC was no longer secure, and to leave in Dual_EC was an epic fail.

As of the day of writing, RSA's website reads:
Crypto Kernel

RSA BSAFE Crypto Kernel offers versions of popular cryptographic algorithms optimized for both small code size and high performance. Unlike alternatives such as open source, *our technology is backed by highly regarded cryptographic experts*.

My emphasis.

In short, RSA Inc. was negligent. RSA did not manage that default for the benefit of users until it became so blindingly embarrassing that NIST itself struck Dual_EC off the standard. It continued to act in the interests of one customer, the US government, and against the interests of its other customers, until it was too late.

RSA therefore deserves to be excoriated as a security provider, and dropped for its failure. It deserves to lose all business outside its one favoured customer. Not because it made a mistake, as Debian did for example, but because RSA did not take reasonable care, the care due to customers in the security business, by failing to rectify the mistake when the writing was on the wall.

The same logic would apply to any other supplier that set and left Dual_EC as a default and/or was influenced by the NSA to favour them and not their smaller, more vulnerable customers. Which leads us to the next question:

How can we damn RSA and not the others? Are we just on a witch-hunt? Who are the others? Are we being unfair?

This is where we have to rely more on reasoned logic than on facts. We know that this attack happened. If it happened, was it limited only to RSA, or was it a broad-based attack on many suppliers? Reason suggests that it must have happened to others, because

  • the attack cost a lot of time and money to push through,
  • it was set up well in advance of any attack event, and therefore
  • would not be precise enough to target anyone in particular.

It has to have been a campaign; it has to have targeted as many suppliers as possible.

Then, who? Perhaps RSA Inc are only guilty of being the honest ones, or the ones caught out? Who else was likely at risk, possibly influenced?

This influence would likely have happened at the intersection of those suppliers with the most interesting customers, and those that the NSA had most influence on.

Who has the most interesting customers? Well, let's say that the NSA was sticking to its mandate of spying on foreigners, and ignore any Americans. This would mean that "interesting suppliers" are those that sell worldwide, to interesting foreigners.

But, influence is only strong on USA suppliers who have to export (and seek a USA export licence), or those who engage in large US government contracts. Either way, we can suggest that all USA suppliers who export are most at risk. In a nutshell, speaking as foreigners:

American-influenced cryptography should be avoided.

Simple enough to say, but this is rather more dramatic than it sounds at first blush. Under this conclusion, not only RSA's BSafe product and any similar FIPS-approved products, but also Java's JCE/JCA (Java Cryptography Extension/Architecture) and Microsoft's CAPI (Cryptographic API) are tainted. Which leaves most of finance, most of mobile and most of the desktop in a state of uncertainty. And, inevitably, questions will circulate around Apple, IBM, Google and others that ship and use crypto.

Even OpenSSL has FIPS-approved distributions. Linux ships with SELinux security modifications with influence from the NSA. Lavabit was running a website in the USA. This is going to be one very busy microscope.

Does this pass the laugh test? Unfortunately, yes. Too much information has come from Snowden (and too many hints existed before) that in sum suggests this was a persistent and deliberate campaign. Recall, the 'crown jewels' disclosure revealed that the NSA were happy to destroy the security credentials of major American Internet companies in order to get an advantage! By which I mean, Google, Facebook, Microsoft, etc, companies that have still not responded to the allegations in a meaningful way, so their credibility is damaged.

It's serious stuff. Worse, we don't have easy solutions. We don't have enough independent sources of crypto. We don't have enough cryptoplumbers to go around, as the stuff is complicated and esoteric, and users have never paid heed to it. We don't have enough evidence to know which other countries are also impacted, whether there are other, non-USA products that they got at.

What if you are an American supplier of cryptographic trust? You're in a bind:

And absolutely, this is an emotional response. It's protest. Intellectually, I believe that AES and SHA2 are not compromised. Emotionally, I am angry and I want to distance myself from even the suggestion that I am standing with the NSA. As Coderman and Iang put it, I want to *signal* my fury. I am so pissed off about this stuff that I don't *care* about baby and bathwater, wheat and chaff, or whatever else. I also want to signal reassurance to the people who use my system that yes, I actually give a damn about this issue.

Avoiding American-influenced crypto is just today's logic, based on what we know, today. The crypto industry is now in a crisis of trust. This is going to get worse.

Posted by iang at October 5, 2013 03:20 PM | TrackBack

[quote]Avoiding American-influenced crypto is just today's logic, based on what we know, today. The crypto industry is now in a crisis of trust. This is going to get worse.[/quote]

Quite so, iang, this is only the beginning of a fundamental change in the way that things are to be better done with virtual betas in the future. And those few words have been chosen very careful to try and accurately reflect and convey what is to be expected, for it is already launching programs with stealthy agents into all critical components and strategic elements of global remote command and virtual machine and SCADA control systems.

And things will be considerably better after they have finished being significantly worse for corrupt and corrupted systems admin of present day strife policies.

And fortunately, there is nothing nobody can do to stop the intelligent march of information progress and any and all who would be foolish and stupid enough to try, just merely identify themselves as persons of interest to be appropriately and summarily dealt with.

New Great Games have New Great Games Players and they be not guided nor mindful of stupid rules and regulations designed by generations with secrets to keep in order to profit from unfair and/or inequitable advantage with privileges.

Posted by: amanfromMars at October 6, 2013 12:54 PM

A minor point here is that this "Crisis of Trust" is maybe not just about the NSA and/or American Cryptography. The NSA is merely the first major national Intelligence agency to get publicly caught at it.

I think it likely that all the world's major Intelligence agencies (China, Russia, Israel, etc.) are doing similar things, and that all Crypto everywhere is currently suspect.

If you want to avoid US Crypto, where are you going to go? It's likely that everywhere else has similar or worse risks. Would you really trust an "international" Encryption algorithm that might have been influenced by China, the KGB or lord knows who?

I think the only solution here is the hard way - fix the existing Cryptography so it's done in an open and "proven" manner.

In any case, frankly, I don't think the possible weaknesses in NSA-influenced cryptography are the "tipping point"; there are so many non-cryptographic ways to get access to any target's data that even if we could wave a "magic wand" and make everyone's cryptography secure, we'd still be vulnerable to many kinds of attack/disclosure from various levels of attackers.

We are all vulnerable to un-patched systems, zero-day exploits, viruses, poorly designed infrastructure, hacked operating systems and hardware, agents sent in as moles, infiltrated contractors, janitors and outsourcing, disgruntled ex-employees, bribery... the list of ways to get your data seems to be endless, and almost nobody can afford to defend against the most well-resourced and determined opponents.

If these vulnerabilities are real, then for most people (99.999%+), how much worse is it that maybe one likely uninterested Intelligence agency may be able to read your emails and web browsing activity? Do you seriously think you could keep the NSA, China, the CIA or the FBI from figuring out your secrets if they wanted to?

If your answer is no, then nothing has changed for you.

On the other hand, should we be upset?
Yeah, damn right we should.

Should we ensure that US government agencies have to follow US law?
Yeah! It's about damn time.

Should this be an issue in determining who we re-elect next election?
Absolutely, and let's make a big stink and educate everyone so they can make a similarly informed decision on who supported the NSA in lying so much about this.

Posted by: David Donahue at October 8, 2013 04:28 PM

David: Yes, all. It's very messy. But even though the current situation raises more questions than it answers, it is still my view that (a) we are in a crisis of trust in the security industry, and (b) the American product is now firmly enmeshed in that untrust space.

What to do? I don't recommend people switch from CAPI to the Chinese offering, or from JCE to GOST. These alternatives aren't reasonable and they don't exist.

Instead, I think it is time for the individual security practitioner to start taking more responsibility. Part of that responsibility is realising that outsourcing components to big American name-brands no longer works: it is no longer taking responsibility for security.

When it comes to crypto protocols, the very space of the NSA breach, I have a simple suggestion: It's your job. Do it.

Posted by: Iang (It's your job. Do it!) at October 9, 2013 03:48 AM

... As an example of the far reaching implications, Ylonen says that since the Snowden documents about the NSA were leaked, Finland stopped electronically communicating top secret material between embassies, preferring to courier this kind of information instead. ...

Posted by: the ripples spread... at October 10, 2013 11:26 AM

PrivateSky was shut down at the beginning of the year, after introducing a beta web-based version and one for Outlook, and had "tens of thousands of heavily active users".

Brian Spector, CEO of CertiVox, told IT Security Guru: "Towards the end of 2012, we heard from the National Technical Assistance Centre (NTAC), a division of GCHQ and a liaison with the Home Office, [that] they wanted the keys to decrypt the customer data. We did it before Lavabit and Silent Circle and it was before Snowden happened.

"It is the same in the USA with FISMA, and it is essentially a national security warrant. So in late 2012 we had the choice to make - either architect the world's most secure encryption system on the planet, so secure that CertiVox cannot see your data, or spend £500,000 building a backdoor into the system to mainline data to GCHQ so they can mainline it over to the NSA.

"It would be antithetical to the values and message we are selling our customers in the first place."

Posted by: Avoid British cryptography? at December 14, 2013 08:36 AM