October 05, 2006

How the Classical Scholars dropped security from the canon of Computer Science

Once upon a time we all went to CompSci school and had lots of fun.

Then it all stopped. It stopped at different times at different places, of course, but it was always for the same reason. "You can't have fun," wagged the finger of some professor talking a lot of Greek.

Well, it wasn't quite like that, but there is a germ of truth in the story. Have a look over at this post (spotted on EC) where the poster declares -- after a long-winded description of the benefits of a classical education -- that the computer science schools should add security to their canons, or core curricula. (Did I get the plurals right? If you know the answer you will have trouble following the rest of this post...)

Teaching someone to validate input is easy. Teaching someone why they need to be rabid about doing it every single time - so that they internalize the importance of security - is hard. It's the ethics and values part of secure coding I really hate having to retrofit, not the technical bit. As it says in Proverbs 22:6, "Train a child in the way he should go, and when he is old he will not turn from it."
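
(The technical bit really is that easy. Here is a minimal sketch in Python; the field rules are invented for illustration:)

    import re

    ACCOUNT_RE = re.compile(r"[0-9]{8,12}")    # hypothetical account-number format

    def validate_account(raw: str) -> str:
        """reject anything that is not a well-formed account number"""
        if not ACCOUNT_RE.fullmatch(raw):
            raise ValueError("malformed account number")
        return raw

    def validate_amount(raw: str) -> int:
        """parse an amount in cents; refuse negatives and absurd sizes"""
        amount = int(raw)                      # non-numeric input raises ValueError
        if not 0 < amount <= 10_000_000:
            raise ValueError("amount out of range")
        return amount

Easy. Doing it at every boundary, on every input, every time, is the part no code box can enforce.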

This poster has it wrong, and I sense years in the classroom, under the ruler, doing verbs, adverbs and whatnots. No fun at all.

Of course, the reason security is hard is that they -- the un-fun classical scholars -- don't provide any particular view as to why it is necessary, and modern programmers might not be able to eloquently fill in the gap, but they do make very economic decisions. Economics trumps ethics in all known competitions, so talking about "ethics and values of secure coding" is just more Greek.

So what happened to the fun of it all? I speak of those age-old core skills now gone, the skills now bemoaned in the board rooms of just-hacked corporations the world around, as they frown over their SB1386s. To find out what happened to them, we have to go back a long time, to a time when titles mattered.

Time was, a junior programmer didn't have a title, and his first skills were listening, coding and frothing.

He listened, he coded and frothed, all under close supervision, and perchance entered the world's notice as a journeyman, being an unsupervised listener, coder and frother.

In those days, the journeyman was called a hacker, because he was capable of hacking some bits and pieces together, of literally making a hack of something. It was a bit of a joke, really, but our hacker could get the job done, and it was better to be known as one than not to be known at all. (One day he would aspire to become a guru or a wizard, but that's another story.)

There was another lesser meaning to hacker, which derived from one of the journeyman's core classes in the canon -- breaching security. He was required to attack others' work. Not so as to break it but so as to understand it. To learn, and to appreciate why certain odd things were done in very certain but very odd ways.

Breaching security was not only fun, it was a normal and necessary part of computer science. If you haven't done it, you aren't well rounded, and this is partly why I muck around in such non-PC areas as poking holes in SSL. If I can poke holes in SSL, so the theory of secure protocols goes, then I'm ready -- perhaps -- to design my own.

Indeed breaching security or its analogue is normal and essential in many disciplines; cryptology for example teaches that you should spend a decade or so attacking others' designs -- cryptanalysis -- before you ever should dare to make your own -- cryptography. Can you imagine doing a civil engineering course without bending some beams?

(Back to the story.) And in due course, when a security system was breached, it became known as being hacked. Not because the verb was so intended, but by process of elimination some hacker had done it, pursuant to his studies. (Gurus did not need to practice, only discuss over cappuccinos. Juniors listened and didn't do anything unless told, mostly cleaning out the Atomic for another brew.)

You can see where we are going now, so I'll hit fast-forward. More time passed ... more learning a.k.a. hacking ... The press grasped the sexy term and reversed the sense of its meaning.

Some company had its security breached. Hacking became annoying. The fun diminished... Then the viruses, then the trojans, the DDOS, the phishing, the ....

And it all got lumped together under one bad name and made bad. Rustication for some, "Resigned" for others, Jail for some unfortunates.

That's how computer science lost the security skills from the canon. It was dropped from the University curricula by people who didn't understand that it was there for a reason. Bureaucrats, lawmakers, police, and especially corporates who didn't want to do security and preferred to blame others; etc, etc. The list of naysayers is very long.

Having stripped it from computer science, we are now in a world where security is not taught, and we need to ask what they suggest in its place:

There are some bright security spots in the academic environs. For example, one professor I talked to at Stanford in the CS department - in a non-security class, no less - had his students "red team" and "blue team" their homework, to stress that any and all homework had to be unhackable. Go, Stanford! Part of your grade was your homework, but your grade was reduced if a classmate could hack it. As it should be.
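
(To give the game some flesh: a grading harness for such an exercise might look like the sketch below. The date-parsing homework and the attack list are invented for illustration.)

    from datetime import date

    def parse_date(s: str) -> date:
        """blue team: one student's homework submission"""
        y, m, d = s.split("-")
        return date(int(y), int(m), int(d))

    # red team: classmates' attack inputs; each must be rejected
    # cleanly, not crash the program or yield a bogus date
    ATTACKS = ["2006-13-01", "2006-02-30", "-1-1-1", "", "not-a-date"]

    def grade(submission) -> int:
        repelled = 0
        for attack in ATTACKS:
            try:
                submission(attack)        # if this returns, the attack got through
            except (ValueError, TypeError):
                repelled += 1             # clean rejection: the defence held
        return repelled

    print(grade(parse_date), "of", len(ATTACKS), "attacks repelled")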

Right, pretend hacking exercises. Also, security conferences. Nice in spirit, but the implementation is a rather poor copy of the original. Indeed, if you think about the dynamics of hacking games and security conferences ("Nobody ever got hacked going to Blue Hat??") we aren't likely to sigh with relief.

And then:

One of my colleagues in industry has an even more draconian (but necessary) suggestion for enforcing change upon universities. ... He decided that one way to get people's attention was to ask Congress to tie research funds to universities to changing the computer science curriculum. I dare say if universities' research grants were held up, they might find the extra time or muster the will to change their curricula!

Heaven help us! Are we to believe that the solution to the security training quandary is to ask the government to tell us how to do security training? Tell me this is a joke, and the Hello Kitty People haven't taken over our future:

The Hello Kitty people are those teenagers who put their personal lives on MySpace and then complain that their privacy is being violated. They are the TV viewers who think that the Hurricane Katrina rescue or the Iraq war were screwed up only because, they belatedly discover, we don't have actual Hello Kitties in political power. When, inevitably, some of the world's Kitties, unknown beyond their cute image, turn out to be less than fully trustworthy, the chorus of yowling Kitty People becomes an unbearable cacophony.

(We've written before about how perhaps the greatest direct enemy of Internet security is the government, so we won't repeat that today.)

Here is a test. A corporation such as Oracle could do this, instead of blaming the government or the hackers or other corporations for its security woes. Or Microsoft could do it, or anyone, really.

Simply instruct all your people to breach security. Fill in the missing element in the canon. Breach something, today, and learn.

Obviously, the Greeks will complain about the errant irresponsibility of such support for crime ... but just as obviously, if they were to do that, their security knowledge would go up by leaps and bounds.

Sadly, if this were taken seriously, modern corporations such as Oracle would collapse in a heap. It's far cheaper to drop the training, blame "the hackers" and ask the government for a subsidy.

Even more sadly, we just don't have a better version of training than "weapons free." But let's at least realise that this is the issue: you classicals, you bureaucrats, you Sonys and Oracles and Suns, you scared insecure corporations have brought us to where we are now, so don't blame the Universities, blame yourselves.

And, in the name of all that is sacred, *don't ask the government for help!*

Posted by iang at October 5, 2006 06:58 PM
Comments

around the time we were doing some work on the original payment gateway (for what has since come to be called e-commerce)
http://www.garlic.com/~lynn/aadsm5.htm#asrn2
http://www.garlic.com/~lynn/aadsm5.htm#asrn3

I frequently commented that to take a standard, well-tested application and turn it into an industrial strength dataprocessing "service" typically required 4-10 times the original effort. as mentioned in the above references ... two of the people that we had worked with when we were doing our high-availability (ha/cmp) product ... were now responsible for the commerce server. old posting about an old ha/cmp meeting
http://www.garlic.com/~lynn/95.html#13

misc. past posts mentioning ha/cmp
http://www.garlic.com/~lynn/subtopic.html#hacmp

we have also made the statement that that payment gateway could be considered the original SOA ... part of that is from coming up with 3-tier architecture in the 80s
http://www.garlic.com/~lynn/subnetwork.html#3tier

when we started ha/cmp we did a detailed failure mode analysis on tcp/ip protocol and code. some of the items we turned up ... showed up in attacks nearly 10 years later .... i.e. from the standpoint of doing industrial strength dataprocessing ... a failure mode analysis is a superset of some types of security threat analysis .... or you upgrade the term threat analysis to cover any type of problem ... not solely limited to stuff related to human attackers.
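
a sketch of what that hostile-environment assumption looks like at the code level ... the length-prefixed protocol here is invented for illustration, but the failure modes are the classic ones ... hung peers, partial reads, absurd length fields:

    # reading one length-prefixed message while assuming the peer is hostile
    import socket
    import struct

    MAX_MSG = 64 * 1024   # refuse anything bigger ... resource-exhaustion countermeasure

    def recv_exact(sock: socket.socket, n: int) -> bytes:
        """handle partial reads; a peer may dribble bytes or stall"""
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("peer closed mid-message")
            buf += chunk
        return buf

    def read_message(sock: socket.socket) -> bytes:
        sock.settimeout(10.0)                       # failure mode: peer never answers
        (length,) = struct.unpack("!I", recv_exact(sock, 4))
        if length > MAX_MSG:                        # failure mode: hostile length field
            raise ValueError("message too large")
        return recv_exact(sock, length)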

of course part of the history was having done a lot of work in the 60s & 70s on operating systems that were used for commercial time-sharing services
http://www.garlic.com/~lynn/subtopic.html#timeshare

where the implicit assumption was that the full community of users could have hostile intent with regard to each other ... compared to today, where very few people take seriously the default assumption that the standard operating environment is extremely hostile.

there used to be the old joke that computer science institutional memory is no more than five years. this somewhat goes along with the current idea that free software is something new ... as opposed to being the default in the 50s, 60s, and 70s. some amount of the change was gov. litigation and the resulting 23jun69 unbundling announcement that started the trend of charging for software ... some number of past posts about the transition from free software to charge-for/licensed software
http://www.garlic.com/~lynn/subtopic.html#unbundle


some past posts mentioning identifying failure modes ... some of which didn't actually materialize (in attacks) until nearly ten years later:
http://www.garlic.com/~lynn/2004g.html#11 Infiniband - practicalities for small clusters
http://www.garlic.com/~lynn/2004g.html#42 command line switches [Re: [REALLY OT!] Overuse of symbolic constants]
http://www.garlic.com/~lynn/2005c.html#51 [Lit.] Buffer overruns
http://www.garlic.com/~lynn/2006e.html#11 Caller ID "spoofing"
http://www.garlic.com/~lynn/2006i.html#6 The Pankian Metaphor

some number of past posts mentioning it taking 4-10 times the effort to turn a well-tested application into an industrial strength service
http://www.garlic.com/~lynn/2001f.html#75 Test and Set (TS) vs Compare and Swap (CS)
http://www.garlic.com/~lynn/2001n.html#91 Buffer overflow
http://www.garlic.com/~lynn/2001n.html#93 Buffer overflow
http://www.garlic.com/~lynn/2002n.html#11 Wanted: the SOUNDS of classic computing
http://www.garlic.com/~lynn/2003g.html#62 IBM says AMD dead in 5yrs ... -- Microsoft Monopoly vs. IBM
http://www.garlic.com/~lynn/2003j.html#15 A Dark Day
http://www.garlic.com/~lynn/2003p.html#37 The BASIC Variations
http://www.garlic.com/~lynn/2004b.html#8 Mars Rover Not Responding
http://www.garlic.com/~lynn/2004b.html#48 Automating secure transactions
http://www.garlic.com/~lynn/2004k.html#20 Vintage computers are better than modern crap !
http://www.garlic.com/~lynn/2004l.html#49 "Perfect" or "Provable" security both crypto and non-crypto?
http://www.garlic.com/~lynn/2004p.html#23 Systems software versus applications software definitions
http://www.garlic.com/~lynn/2004p.html#63 Systems software versus applications software definitions
http://www.garlic.com/~lynn/2004p.html#64 Systems software versus applications software definitions
http://www.garlic.com/~lynn/2005b.html#40 [Lit.] Buffer overruns
http://www.garlic.com/~lynn/2005i.html#42 Development as Configuration
http://www.garlic.com/~lynn/2005n.html#26 Data communications over telegraph circuits
http://www.garlic.com/~lynn/2006n.html#20 The System/360 Model 20 Wasn't As Bad As All That

Posted by: Lynn Wheeler at October 5, 2006 02:17 PM

there is something of an analogy with driving ... long ago you could have an environment that didn't have bumpers, safety glass, collapsible steering column, padded dashboard, safety belts, door locks, anti-theft devices, alarms, air bags, guard rails, impact barriers, crush zones, stop signs, and a ton of other countermeasures for all the possible bad things that might happen.

part of the current programming state of the art ... is that it appears that nearly all of the countermeasures have to be re-invented from scratch for every new effort (if there is any attempt to address the issues at all).

note that in some of the scenarios ... a particular piece of software is more like a single component ... like the engine ... so a lot of the necessary countermeasures aren't part of the engine operation ... it would be part of other components.

a basic issue is designing and implementing an overall infrastructure that has all the necessary countermeasures designed in as part of the fundamental operation .... aftermarket piecemeal is significantly less effective than having it all designed in from the start.

re:
http://www.garlic.com/~lynn/aadsm25.htm#37 How the Classical Scholars dropped security from the canon of Computer Science

another scenario ... is the x9.59 paradigm change.

if you take the security acronym PAIN

P -- privacy (or sometimes CAIN, with confidentiality)
A -- authentication
I -- integrity
N -- non-repudiation

much of the current retail payment environment is dependent on privacy (information hiding and encryption). the x9a10 financial standard working group was given the requirement to preserve the integrity of the financial infrastructure for all retail payments (as part of the x9.59 standard work)
http://www.garlic.com/~lynn/x959.html#x959
http://www.garlic.com/~lynn/subpubkey.html#x959

x9.59 changed the paradigm and requires integrity and authentication ... but no longer requires data hiding encryption (privacy) as part of the transaction in order to preserve the integrity of the financial infrastructure for all retail payments.
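
a toy illustration of that paradigm in python ... the transaction below travels in the clear, yet any alteration or forgery is detectable. note that x9.59 itself uses digital signatures; the shared-key MAC here is just the simplest stand-in, and all the names/keys are invented:

    import hashlib
    import hmac
    import json

    SHARED_KEY = b"demo key ... not a real one"

    def sign_txn(txn: dict) -> dict:
        body = json.dumps(txn, sort_keys=True).encode()
        mac = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
        return {"body": txn, "mac": mac}      # body goes over the wire unencrypted

    def verify_txn(wire: dict) -> bool:
        body = json.dumps(wire["body"], sort_keys=True).encode()
        expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, wire["mac"])

    wire = sign_txn({"payer": "alice", "payee": "bob", "cents": 1250})
    assert verify_txn(wire)                   # integrity + authentication hold
    wire["body"]["cents"] = 999999            # an intercepted, altered transaction ...
    assert not verify_txn(wire)               # ... fails verification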

this somewhat relates to past comments about moving away from the naked payment/transaction paradigm:
http://www.garlic.com/~lynn/aadsm24.htm#5 New ISO standard aims to ensure the security of financial transactions on the Internet
http://www.garlic.com/~lynn/aadsm24.htm#7 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#8 Microsoft - will they bungle the security game?
http://www.garlic.com/~lynn/aadsm24.htm#9 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#10 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#12 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#14 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#22 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#26 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#27 DDA cards may address the UK Chip&Pin woes
http://www.garlic.com/~lynn/aadsm24.htm#30 DDA cards may address the UK Chip&Pin woes
http://www.garlic.com/~lynn/aadsm24.htm#31 DDA cards may address the UK Chip&Pin woes
http://www.garlic.com/~lynn/aadsm24.htm#38 Interesting bit of a quote
http://www.garlic.com/~lynn/aadsm24.htm#41 Naked Payments IV - let's all go naked
http://www.garlic.com/~lynn/aadsm24.htm#42 Naked Payments II - uncovering alternates, merchants v. issuers, Brits bungle the risk, and just what are MBAs good for?
http://www.garlic.com/~lynn/aadsm24.htm#46 More Brittle Security -- Agriculture
http://www.garlic.com/~lynn/aadsm25.htm#20 Identity v. anonymity -- that is not the question
http://www.garlic.com/~lynn/aadsm25.htm#25 RSA SecurID SID800 Token vulnerable by design
http://www.garlic.com/~lynn/aadsm25.htm#28 WESII - Programme - Economics of Securing the Information Infrastructure
http://www.garlic.com/~lynn/2006m.html#15 OpenSSL Hacks
http://www.garlic.com/~lynn/2006m.html#24 OT - J B Hunt
http://www.garlic.com/~lynn/2006o.html#35 the personal data theft pandemic continues
http://www.garlic.com/~lynn/2006o.html#37 the personal data theft pandemic continues
http://www.garlic.com/~lynn/2006o.html#40 the personal data theft pandemic continues

Posted by: Lynn Wheeler at October 5, 2006 03:14 PM

i guess i have to admit to being on a roll.

one of the big issues is inadequate design and/or assumptions ... in part, failing to assume that the operating environment is an extremely hostile environment with an enormous number of bad things that can happen.

i would even assert that the enormous amounts of money spent attempting to patch an inadequate implementation can be orders of magnitude larger than the cost of doing it right in the first place.

some of this is security proportional to risk ... where it is also fundamental that what may be at risk is correctly identified.

this is my old standby security proportional to risk posting
http://www.garlic.com/~lynn/2001h.html#61
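
and the arithmetic behind the slogan, with invented numbers ... note that loss_per_incident is exactly the "what is at risk" estimate that has to be right:

    incidents_per_year = 0.05        # estimated chance of a breach in a year
    loss_per_incident = 2_000_000    # what is actually at risk
    countermeasure_cost = 50_000     # annual cost of the proposed defence
    risk_reduction = 0.80            # fraction of incidents it prevents

    expected_loss = incidents_per_year * loss_per_incident   # 100,000/yr
    savings = expected_loss * risk_reduction                 #  80,000/yr
    print("worth doing" if savings > countermeasure_cost else "not worth it")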

these are postings in a pki/ca security proportional risk thread from today
http://www.garlic.com/~lynn/2006s.html#4
http://www.garlic.com/~lynn/2006s.html#5

and as i've observed before, possibly part of the "yes card" exploits has been the assumption that the attacks would be on the chipcard ... where, in fact, the "yes card" attacks are on the terminal and infrastructure (and the chipcard attack countermeasures pretty much failed to address terminal/infrastructure attacks at all)
http://www.garlic.com/~lynn/subintegrity.html#yescard

Posted by: Lynn Wheeler at October 5, 2006 07:43 PM

But the real solution is to make the right folks responsible for the task. We had a big problem with stolen credit cards until the credit card providers were financially responsible for (almost) all the losses incurred. Suddenly, security measures were taken which reduced the problem to whatever the loss-bearers were willing to tolerate. That works. But when they can make the problem an "externality" (as happens with identity theft these days) the problem will not be contained. The people who can contain it lack the motivation. End of story.

Bruce Schneier writes clearly about these topics at Schneier.com so you can read it for free.

Posted by: Richard Karpinski at October 6, 2006 11:21 AM

Funny... yes, incentives are part of the story, but not the whole of it. It's the Internet, remember, and the attackers don't like your funny incentives. Also, your defenders need to get their training somewhere. And if you don't do that, then the company goes out and hires some company like Counterpane or Symantec or ..., and we're back with muddled incentives again.

Here's something you should read: someone reaching out and communicating with the blackhats. Coz he needs their help, and they might even do it ... not because he got his ethics out of a cornflakes packet.

http://www.wired.com/wired/archive/15.01/cybercop_pr.html

Posted by: Reading is free, learning costs... at January 3, 2007 03:06 PM


Teaching Viruses

Over two years ago, George Ledin wrote an essay in "Communications of the ACM," where he advocated teaching worms and viruses to computer science majors: "Computer science students should learn to recognize, analyze, disable, and remove malware. To do so, they must study currently circulating viruses and worms, and program their own. Programming is to computer science what field training is to police work and clinical experience is to surgery. Reading a book is not enough. Why does industry hire convicted hackers as security consultants? Because we have failed to educate our majors."

This spring semester, he taught the course at Sonoma State University. It got a lot of press coverage. No one wrote a virus for a class project. No new malware got into the wild. No new breed of supervillain graduated.

Teaching this stuff is just plain smart.

Essay:
http://www.csl.sri.com/neumann/insiderisks05.html#175

http://www.sonoma.edu/pubs/newsrelease/archives/001090.html
http://www1.pressdemocrat.com/apps/pbcs.dll/article?AID=/20070522/NEWS/705220312/1033/NEWS01 or http://tinyurl.com/ytrbzs
http://blogs.pcworld.com/staffblog/archives/004452.html
http://www1.pressdemocrat.com/apps/pbcs.dll/article?AID=/20070526/NEWS/705260309/1043/OPINION01 or http://tinyurl.com/2e2anv
http://www.hardocp.com/news.html?news=MjU5NzgsLCxoZW50aHVzaWFzdCwsLDE
http://technews.acm.org/archives.cfm?fo=2007-05-may/may-25-2007.html#313412 or http://tinyurl.com/yuur5l
http://www.calstate.edu/pa/clips2007/may/22may/virus.shtml

Posted by: from Schneier's crypto-gram at June 25, 2007 12:51 AM