Sense About Science lecture

From The Cory Doctorow Wiki
A quote from the Sense About Science lecture, on a portrait of Doctorow by Jonathan Worth, CC BY-SA.


There is no way to fight oppression without free and open devices and networks. So it's up to us to demand the freedom layer on our devices and to enable that struggle. To be cyber-optimists, to secure the network and to use it to coordinate our struggle for our freedom. To jailbreak every device, to crack every censor-wall, to seize the means of information, and with it liberate the planet.



May 19, 2013

Website, Guardian


MP3 (34 MB)


«I gave the annual Sense About Science lecture last week in London, and The Guardian recorded and podcasted it. It’s based on the Waffle Iron Connected to a Fax Machine talk I gave at Re:publica in Berlin the week before.»


Please help with

Feel free to help out with any of this on the transcript below:

  • Add more paragraphs to make it easier to read
  • Add more sub-headings to make it easier to read
  • Add timestamps at the start of each paragraph to make it easier to use
  • Check for typos and errors
  • Fix tagged issues
  • Add explanatory links to things that not everyone might understand
  • Add more relevant tags

Thank you!



Thank you very much. I'm very honored to be here tonight. Especially given how amazing this building is. It’s just, just beautiful. I am technically now British, although I haven't been issued my accent yet. But I love coming to these buildings, it makes me feel so old-worldly.

Detergent that digests clothes

So the speech tonight starts with a story that is really, I have to admit, playing to the crowd. It's a story about the engineers versus the marketing people. But it’s a true story, or at least a true enough story. It was told to me by a friend who used to work for one of the major packaged goods companies, companies like Procter & Gamble and Unilever, that specialize in marketing-driven product design. They go out and they figure out what people are after and then they come back to the technical people and get them to build it for them.

Well one day the marketing people had gone off for the weekend someplace where they had had massages and Ayurvedic treatments and whatnot, and they come back with an AMAZING idea for the people in the bowels, the engineering department. They come to them and they say "Listen, we've worked it out. What people want is detergent that makes their clothes newer when you wash them." So the engineers, they kind of huddled for a bit, and they tried for a while to explain the second law of thermodynamics, and they gave it up as a bad job. They were really struggling with it, because the marketing department is their bosses, and they can't just say "no", especially given that the market research looks like there's a huge opportunity for this. And they realize that the marketing department does not speak with technical precision. That when they say "new" they don't mean "new", they mean "the appearance of new".

By happy coincidence they had been working with hot-water-activated enzymes that attack fiber ends. Now a broken fiber has twice as many ends as an intact fiber, and the thing that makes clothes look pilled and old and dulled is those broken fibers. So if you put this enzyme in the wash with some hot water, well, then it will attack the fiber ends and you will get twice as many broken fibers as intact fibers, and your clothes will LOOK newer. Now your clothes won't actually be newer. Your clothes will be MUCH older.

They will be partially digested by the wash-water, and in every sense apart from the sense that the marketing department cared about, this was not making your clothes newer at all. But it was exactly what marketing was looking for, and off they went. It's now stuff that you can buy for your wash. It kind of feels like a harmless delusion here. And I think that a lot of the time, when we go along with things that sound like they're technically right, but on closer examination are the opposite of technically right, it’s often a kind of a harmless delusion off in the corner: who cares if it's really newer or if it’s just newer-ish? Everybody's happy, what's the problem? But not every one of our technical delusions is harmless, and that's what I am going to talk about tonight.

Not harmless

I believe that we increasingly live in a world that is made out of computers. These days we are apt to no longer have houses; we have computers we put our bodies into. Take the computers out of our houses, and our houses cease to be habitable. Our houses are in some important sense just a case for a computer that we happen to live in. A 747 is a flying Sun Solaris box in an exotic aluminum case connected to some very, very badly secured SCADA controllers. Cars are computers you put your body into that race down the motorway at 120 km/h, surrounded by other people likewise trapped in computers racing down the motorway at 120 km/h. And it's not just that we put our bodies into computers these days, we increasingly put computers into our bodies.

Some of you are my generation and grew up with Walkmans; some of you are younger and grew up with iPods. Most of us will have logged enough punishing earbud-hours that come the day, if we live long enough, we will all have hearing aids. When we get our hearing aids they’re vanishingly unlikely to be beige, retro, hipster, transistor-driven analog circuitry. They will be computers that we put in our bodies. And because everything we do today increasingly involves computers, every problem that we have and every problem that our regulators and lawmakers seek to solve involves a computer, and that means that increasingly policymakers are going to look at regulating computers as an effective means of solving social problems. Now as it turns out, we have a 15-year-running experiment in achieving policy goals through the regulation of computers, and that of course is the copyright wars, and particularly the anti-copying technology we’ve tried to deploy in our devices: the digital rights management technology that tries to stop you making copies and doing other things you are not supposed to do.

This technology has been around a long time and predates the term digital rights management. We have had DRM longer than we have had the term DRM. Back in the 1980s there were all sorts of DRM schemes that were field-tested in the name of preventing copies of programs delivered on floppy disks – the Chuckie Egg era of DRM – or embedded in our satellite receivers to stop us from receiving satellite programming without paying a subscription fee. And if you cast your mind back to those days, the one thing you will recall is that none of them actually worked really well. You always had a mate who could make you a copy of Chuckie Egg. It was just rubbish, because conceptually this is what it was trying to do: You have a supplier, a company that makes software or operates the head end of a satellite uplink. And they make an encrypted text available to you, a ciphertext that’s been scrambled, a message that’s been scrambled for you. And they also give you a program for unscrambling that. Either that's in the satellite receiver, or it's on the disk, or it’s part of your computer. And the idea is that that program has the keys, and it takes the keys and applies them to the scrambled message, and what comes out is the unscrambled message: the program to run, the de-scrambled satellite feed. Every time you invoke the de-scrambling program it checks to see whether it is presently allowed to do so. It consults its own internal logic to see whether it's meant to be doing it, or it tries some external tests.

You may remember there was a time where, if you were unlucky enough to work with AutoCAD, you had to lug 1200 pounds of manual along, because the thing it would do every time you fired it up was say "Please turn to page 972 and go to paragraph 4 and tell me what word 5 is". So it makes some external check to see if it was allowed to de-scramble itself. And then it would de-scramble if it was allowed to. But when it was done de-scrambling, it would throw the de-scrambled file away along with all of its working files. And it went to some pains to obscure whatever temporary directory or memory locations it was storing that stuff in, so that you would have to run the de-scrambling the next time you wanted to access this material. And this bears a kind of cargo-cultish resemblance to a cryptographic security system, but it doesn't really stand up to any rigorous inspection. In real crypto you have this idea that you have a sender and a receiver who are allies. They are on the same side, and in the middle you have an adversary who is trying to intercept and de-scramble the message that the sender and the receiver are exchanging. So you have Alice and Bob who are exchanging a message, and Carol who is trying to intercept it. Alice and Bob have a shared secret, a key that they share, or some set of keys that they share, that allows them to de-scramble the message.

Carol doesn't. And so Carol, even though she can intercept the message – because it's going over the radio waves, or because she has broken into an office and gotten it, or because of some other factor – even though she can grab the message, and even though she knows how it was scrambled – because we don't use proprietary scrambling methods, we use ones that are worked through by rigorous peer review, because as Bruce Schneier says, “Anyone can invent a scrambling system that is so secure that he can't think of a way of breaking it”; the only experimental methodology for knowing whether or not your security system is any good is to tell as many people as you can about it and see if any of them can think of a way of breaking it – so Carol has the scrambling method, she has the scrambled text, and the only thing she does not have is the keys. That's how real crypto-systems work. But in the voodoo crypto of Digital Rights Management, the decoding app has the keys, and the decoding app is in the hands of the adversary. So we don't have Alice and Bob and Carol in this scenario. There is just Alice and Bob here.

You have Bob sending Alice both the scrambled message and the keys and the method for de-scrambling it, but thinking that somehow Alice is going to be too stupid to figure out how to hang on to those keys when Bob tells her to get rid of them. Bob is engaged in what we call in security "wishful thinking" here. Because in reality it's not just one Alice. In this situation Alice is everybody in the world who has an interest in getting that ciphertext, or any of the other ciphertexts that are protected with the same keys. All of the adversaries in the world – and as soon as one of them works out where the keys are, or how they are being stored, well, that's it. It’s game over for all of them. This is a break-once, break-everywhere system. And the most powerful adversary in the world who has an interest in breaking this cryptosystem is the adversary you have to contend with. Not the average person on the street, but the one with the electron tunneling microscope, or the one who's got access to a lab full of computer technicians, or just a really good debugger. And that is the person who is going to add the "Save as" command to the program that allows you to save the file back to the hard drive after it has been de-scrambled.
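To make the asymmetry concrete, here is a toy sketch (not real cryptography – the XOR “cipher” and all names here are illustrative only) of why shipping the keys along with the scrambled message to the person you are trying to restrict is a break-once, break-everywhere design:

```python
# Toy illustration of the DRM model: the "adversary" (the user) is handed
# the ciphertext, the descrambling method AND the key. Nothing here is a
# real cipher or a real product; it is a sketch of the logic only.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Symmetric toy scrambler: applying it twice restores the input.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

SECRET_KEY = b"k3y"  # shipped inside the player, as the DRM model requires

def drm_player(ciphertext: bytes) -> None:
    plaintext = xor_cipher(ciphertext, SECRET_KEY)
    print(plaintext.decode())   # "play" the content...
    del plaintext               # ...then throw the clear copy away

# But the user controls the machine the player runs on, so nothing stops
# a "Save as": invoke the very same descrambler and keep the output.
ciphertext = xor_cipher(b"the movie", SECRET_KEY)
saved = xor_cipher(ciphertext, SECRET_KEY)   # break once...
assert saved == b"the movie"                 # ...break everywhere
```

Once one person extracts `SECRET_KEY` (or just the descrambled output), every file protected with the same key is open to everyone.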

Now, if the person who extracts those keys isn't up to it, she can share those keys with someone who is, or she can just decrypt a bunch of files and share them, or she can write up instructions for extracting the keys and send them around. It all comes to the same thing. And to this day, people who advocate for this strange idea of security argue that, if nothing else, it presents a speed bump on the way to decrypting it. But it presents no speed bump, quite the contrary. Once you've rendered a file in the clear and saved it out, it can be infinitely copied – it is fecund and it reproduces. By contrast, the files that you get legitimately have been neutered at the factory and they can't reproduce themselves, and so, as with any ecosystem, if you have a neuter competing with something that is infinitely fecund, the infinitely fecund thing quickly fills up the field, and it's only the people who want to get it legitimately, through legitimate channels, that are presented with any kind of speed bump on the way to getting it. And it's always easier to get the DRM-unrestricted version than the DRM-restricted version. So that's what this talk is about: it’s about the harm from this paradigm of computer regulation, the harm from DRM. And I want to signpost the things that I don't want to talk about tonight, the stuff that always comes up when we talk about DRM: pointless arguments about whether DRM leads to sales or whether it leads to a net drop in sales, whether DRM is fair and can embody all the things of fair dealing and fair use, whether DRM is a good strategy or a bad strategy for artists, whether DRM ultimately works at restricting copying. I think those are the wrong sorts of questions to ask unless you happen to be in my weird line of business.
If you're an artist like me – I write science fiction novels – and if you're like me, trying to sell as many books as possible in order to put food on the table for your family, it is important to know the answers to those questions. But from the perspective of a policy maker, or a computer scientist, or someone who cares about public policy, these questions are entirely beside the point for the purpose of tonight's talk. But for the record, let me say that I think that DRM does lead to a net decrease in sales, that it can't accommodate fair dealing, that it's not a good business strategy, and that it isn't very effective at stopping copying. But like I said, none of that matters to anyone except for the tiny minority of people who happen to be in my industry, the entertainment industry. For wider society, the effects of DRM are independent of the answers to those questions. That is: even if you could show me ironclad proof that DRM is good for sales, that it's totally fair, that it's great for the entertainment industry and wholly effective at limiting copying, I still think that there are reasons to oppose it on grounds that are much more important than any of that stuff. And to understand why, we have to unpack some of the nerd complacency behind discussions about this stuff and realize that it's impossible to talk about technological questions without examining and weighing the legal code at the same time as we consider the software code.

Now the main body of legal code around how we interact with copy restriction technologies emanates from a UN specialized body called WIPO, the World Intellectual Property Organization. This is the UN specialized agency that has the same relationship to dumb global copyright law that Mordor has to evil. In 1996 WIPO enacted its Internet treaties: the WIPO Copyright Treaty – the WCT – and the WIPO Performances and Phonograms Treaty – the WPPT. These are, in all likelihood, the two most important treaties that you've never heard of. And what they did was set out to create special protection for DRM, to make DRM effective in law even as it was ineffective in technology. So what they said was that all member states have to pass laws that say it's illegal to break DRM; they have to make it illegal to extract the keys from DRM; they have to make it illegal to tell people how to extract keys from DRM; they have to make it illegal to host the keys or instructions for extracting the keys; and they have to make it illegal to automate the extraction of keys – to make a tool that extracts keys. And they also have to regulate anyone who's an intermediary where this stuff might end up being hosted. So YouTube, your favorite blog platform, Twitter, Facebook – any place where people talk to one another – has to be regulated so that it's illegal to host decrypted files on it, keys, or instructions for removing keys. And moreover, all of those hosts must expeditiously remove anything that's claimed to be any of those classes of documents. So if you get a notice that says ‘this is an infringing file’, ‘this is a key’, or ‘this is instructions for removing a key or a program that automates removing keys’, and you operate a place where people can talk to each other, you must immediately remove that, or you face being named in any eventual suit that arises from any infraction there, any copyright infringement that takes place.
And many of the member states that were parties to these treaties enacted exemplary damages rules that made it very attractive for companies to automatically comply with any request, even if it didn't pass the least rigorous giggle test. In the United States it is 250,000 dollars per infringement in statutory damages, so very quickly that piles up to, you know, eleven times the global GDP. So companies that receive any kind of notice tend to expeditiously remove this material instantaneously. And in this country these laws were enacted through the European Union Copyright Directive – the EUCD – and were most recently kind of bolstered and reaffirmed through the Digital Economy Act.

Now where does all this stuff get us? Well, it means that it's illegal to reverse engineer or interoperate with any technology that is any part of a copy restriction system, and that's a pretty sweet setup for a company. Let's play out how this works in the real world. Think about DVDs. In 1996 DVDs hit the market. Now DVDs are not hard to read, and it's not hard to make technologies that can interoperate with them. But anyone who wants to implement a DVD player has to license the keys from the consortium that makes DVDs and that controls the key licensing for them. If you get the keys in any other way you fall afoul of the ‘no extracting, no distributing, no automating the extraction of keys’ rule. And as a condition of licensing keys, if you want to license the keys and implement a DVD player that does stuff with DVDs that people own – that they bought, not DVDs that they pirated, DVDs they bought – you have to agree to a whole bunch of things that have nothing to do with stopping piracy. You have to agree to things that can be most charitably called profit maximization schemes. For example, you have to implement a flag for video that says ‘Don't allow this video to be skipped’, and this is supposed to be used for those F.B.I. warnings, but it's also used for ads, as you'll have discovered if you have a DVD player. You have to implement region checking. Now it's true that these days, if you go to the high street, all the DVD players are region-free. That's only because when they formed the DVD consortium they forgot to hit up their major members for a litigation budget to sue non-compliant companies. So when companies started coming out with these, there were just too many of them to sue; they would pop up on the Pacific Rim and then disappear, and another company would pop up next door. With Blu-ray they raised a huge war chest, and you will not see region-free Blu-ray players in the high street any time soon.
And you have to implement something called robustness, and this is where it starts to get really interesting for people who have nothing to do with the entertainment industry. Because the robustness requirement is that the device has to be robust against user modification. It has to be designed to prevent users from modifying it. Now what does it mean to be resistant to user modification in DVD players? The actual keys for playing DVDs were extracted about ten years ago by a Norwegian teenager in an afternoon, and they've been floating around the Internet ever since. So what does it mean to make something that is not user-modifiable when the keys can be downloaded from the internet trivially?

[18:31] I think that there's an argument to be had about that. I don't want to have that argument. The one thing that I will say is that there's a whole class of technologies that absolutely fail the ‘resistant to user modification’ test, and those are the free and open source software technologies, because every one of them has a license that requires something like ‘in order to use this technology you must agree to distribute whatever you make with it in code that is in the preferred form for modification by its users’. That's the kind of standard term for free and open source software licensing. So we can say as a first-order fact that when you implement a robustness requirement you ban anything that's free and open source software. And so that's Linux and all the things that are around it. Because it wouldn't do to implement an anti-user or use-restriction system in code that's designed to be modified by users. If you have code that somewhere in it says ‘allow this file to be copied zero times’ and the user can simply go in and change that to ‘twenty million times’, it ceases to be an effective use control system. It has to be designed to keep the users out. Now why does it matter that not anyone can interoperate with DVDs? Well, there's lots of good reasons to like interoperability. If you follow the history of competition law you'll see that, you know, the spares market has been an area of regulation and litigation for years and years, and that generally economists come down on the side that it's really good to be able to buy third-party add-ons for whatever it is you already own; it's nice to not have to buy your dishes from the same people who sold you your dishwasher. But there are some specific reasons that I'll get into tonight. One is that a company may not see the value in adding features to the product that you use that you really need.
Either because it incorrectly discounts the value of the feature – it just decides that there's no profit to be made there; the world is full of companies that have said things like, the chairman of IBM said that the world probably has room in it for as many as six computers – so they may have just missed the fact that there's a market for their product there. Or they may be correct in assuming that you don't constitute a market that is big enough for them to pursue, but you have other reasons to want it; for example, if you have a sensory or physical disability you may not constitute enough of a market for anyone to commercially pursue it, and you may have to rely on cooperative efforts or efforts led by kind of public-spirited societies like the [RNIB? 21:08] to add features. Or sometimes it's because there's a thing that you could do for yourself if someone was allowed to make a feature for this device, but they make more money by keeping you from doing it.
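The earlier ‘allow this file to be copied zero times’ point can be made concrete with a toy sketch (all names here are hypothetical, not from any real DRM system): when a use restriction ships as ordinary, open, user-modifiable code, the restriction is just an editable value, which is why a robustness requirement necessarily excludes free and open source software.

```python
# Hypothetical sketch: a "use control" written as ordinary open code.
# If the user can read and edit the source, the restriction is just a
# constant waiting to be changed -- it cannot be "robust" against its owner.

class UseControl:
    max_copies = 0  # "allow this file to be copied zero times"

    def may_copy(self, copies_made: int) -> bool:
        # Permit another copy only while under the limit.
        return copies_made < self.max_copies

control = UseControl()
assert control.may_copy(0) is False      # the restriction holds...

UseControl.max_copies = 20_000_000       # ...until the user edits one line
assert control.may_copy(0) is True       # the "effective" control is gone
```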

[21:24] And to really dig into how that works, let's get back to DVDs and compare them to CDs. Now DVDs have been around since 1996. CDs have been around a lot longer, another decade or so. But these days they are made on the same factory floors, they are read in the same drives, you read and write them with the same lasers. I'd like to propose a little Gedankenexperiment: Let's go back to 1996 – to the heyday of the Virgin Megastore and the HMV. You walk into one of these stores and you drop a thousand quid on DVDs and a thousand pounds on CDs, and you take them back and you stick them in a vault. Now normally what happens when you buy technology and stick it in a vault is that it depreciates to zero. I speak from experience; I spent the dotcom years buying laptops. But a funny thing happens if you stick CDs in a vault for ten years in 1996. When you open that vault up again in 2006 they have appreciated; they have a dividend to pay you. Because when you bought the DVDs and CDs all you could do with them was play them. But now you can rip them, you can mash them up, you can put them on your iPod, you can make them into ringtones, alarm tones, you can share them, you can put them on a server, you can put them on the cloud, you can stream them, you can do all of these things. That's what happens when the companies that didn't originate the product are allowed to extend the useful life of the product. Now DVDs are identical technology to CDs, but they're illegal to interoperate with because they have some scrambling on them and it's illegal to descramble a file without permission. DVDs in those ten years have not appreciated at all. DVDs have been around since 1996 – we're getting up to the twentieth anniversary of the DVD – and not one feature has been added to DVDs since 1996. Today you can do the same thing you could do with a DVD in 1996. You can watch it. Period.
All of the other features lurking in potentia for your DVDs – those are features that have been picked from your pocket. All that value has been removed from you by force, thanks to a law that most of us have never heard about and don't know enough about to care about.

[23:29] But interoperability is really only the first casualty of DRM, and I've saved the greatest consequence of it for last, and that's transparency. So how does DRM work? Well, effectively, to make DRM work you have to enable an anti-feature in computers, a feature that causes computers to disobey their owners. Nobody wants DRM. People might buy devices with DRM in them because they don't care about it, but nobody got up this morning and said ‘I wonder if there's a phone out there that really restricts my music’ and went to the shops looking for one. You need to be able to design computers that disobey their owners in some important way. You ask your computer to do something, you say ‘copy that file’. Normally what your computer does is say ‘Yes, master’ – that's what we expect of our servants – but rather than doing that, what your computer says in this instance is ‘I can't let you do that, Dave’. Now in order to accomplish the trick, in order to stop you from making a copy, the computer has to be running a program that watches to see if you're making that copy. A program you could call, like, HAL9000.exe sitting on your desktop. And the most efficient way to implement HAL9000.exe – the way to stop you from dragging it into the trash, because nobody wants it there – is to redesign computers so that they are blind to certain programs and certain processes. If you ask them what's going on in them, they misreport what's going on in them. And to understand how that works, let's go back to another DRM story.

[25:02] In 2005, Sony BMG shipped 50 audio CD titles – about six million CDs – all over the world, and they were loaded with some software that automatically installed itself on your computer when you put the disc in your CD drive. That software stealthily changed your computer's operating system, so that if there was a file on your computer whose name was prepended with $sys$, it wouldn't report it. So if there was a file called $sys$hello.txt and you said ‘list those files’ or opened the window, that file wouldn't show up. All the other files would; that file wouldn't. And if you ran the process monitor, if you said ‘what programs are running on this computer?’, and the process name was prepended with $sys$, your computer wouldn't report on that either.
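A toy model of that hiding behaviour might look like this (an illustrative sketch, not Sony's actual rootkit code; `$sys$` is the magic prefix commonly reported for the XCP rootkit, and the function names here are made up):

```python
# Toy model of a rootkit-style "blind spot": a patched directory listing
# silently drops any entry whose name starts with a magic prefix.

MAGIC = "$sys$"

def honest_listing(files):
    # What the operating system is supposed to report: everything.
    return list(files)

def rootkit_listing(files):
    # The blind spot: entries with the magic prefix are never reported.
    return [f for f in files if not f.startswith(MAGIC)]

files = ["report.doc", "$sys$hello.txt", "$sys$virus.exe"]
print(rootkit_listing(files))   # only 'report.doc' is shown
```

Anything else that adopts the same prefix, including a virus, inherits the same invisibility, which is exactly what happened next.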

[25:52] You may remember, those of you who are old computer gamers, that Infocom did a Zork-style Hitchhiker's Guide to the Galaxy game that Douglas Adams helped write. You couldn't solve the game without going into one room and saying ‘look’, and the game would say ‘there is nothing here’, and you'd type ‘look’ and the game would say ‘there's nothing here’, and you'd type ‘look’ and the game would say ‘OK, there's something here’, and then you'd type ‘look’ again and it would reveal it. [26:14] You couldn't win the game without knowing that. Now, that was very funny for Douglas Adams to have implemented, but it is not how we want our computers to work in the rest of our lives.

[26:25] So what happened when Sony did this? Well, they put a mote in your computer's eye. They made it blind to any process or any program that was prepended with this magic string of characters. And immediately virus writers started prepending all of their viruses with $sys$. Of course they did, why wouldn't they? There was a $sys$-shaped hole in your computer's immune system. We can't expect anything but that infections will rush in opportunistically to fill that hole. Dan Kaminsky, who is a very good security researcher, did a very clever piece of work to figure out how widespread the Sony rootkit was – that's what this piece of software is called, a rootkit, something that changes your computer's operating system to blind it. And he reckons that 300,000 American military and government networks were infected with the Sony rootkit by people who did nothing more than bring home a Celine Dion CD. As a Canadian, I apologize for the CD and for Celine Dion.

[27:27] So here, alas, is the true cost of DRM: When you add DRM to a system you create a legal requirement for opacity in that system, for weak security. You make it illegal for researchers to tell you what's going on inside a computer, because to tell you what's going on inside a computer is to compromise the DRM. If you explain how the DRM works, you necessarily explain how to get around it. And that matters for reasons that are much more significant than the future of the entertainment industry. Now, I have to say, I had a really special weekend last weekend, at the Bank holiday. I invited information around for drinks, and we sat down, we drank some Chardonnay, we laughed, we cried, we hugged, and when it was all over information confessed that it does not want to be free. In fact, information confessed that there's only one thing it wants in this whole world, and that's for us to stop anthropomorphizing information. Information does not want to be free, but people want to be free.

[28:26] We have computers on our desks, and we have computers in our pockets. We have computers we insert into our bodies, and we have computers into which we insert our bodies. And they have the power to liberate us or to enslave us. When computers don't tell us what they're doing, they expose us to horrible, horrible risks. Last year, in the United States, the Federal Trade Commission, which is a kind of consumer watchdog agency, settled with seven companies in the hire purchase business and a software company that made security software for hire purchase laptops. Now who hire-purchases a laptop? Vulnerable people, right? Why else would you pay seven times over the odds to own a laptop, except that you know that in order to participate in the 21st century you need to have a computer, and you can't afford to buy it outright? Now these companies understood that some of these laptops would be stolen and some of them would be lost, and that sometimes the people you do hire purchase with turn out not to be great credit risks and they just stop paying and disappear. And so they wanted to be able to track them down.

[29:26] So they asked this company, DesignerWare, to make a laptop security system for them that would allow them to operate the computer over a network without any outward indication of the computer being operated. And when these seven companies in the hire purchase business and DesignerWare entered into their settlement with the Federal Trade Commission, they stipulated to the Federal Trade Commission that they had used these laptops' cameras deliberately to secretly videotape their customers having sex. To secretly video-record their children in the nude. To audio-record their confidential conversations. To intercept their passwords and their e-mails, including confidential information about their finances, their medical history and their confidential correspondence with their lawyers. And they also went rooting around on their hard drives, just looking for entertaining files to pass around the office.

[30:13] And the Federal Trade Commission, when they settled with these eight companies, they said "you must no longer do this" (as you'd hope) – "unless you put it somewhere in the fine print", in the user agreement. Somewhere in that thing we've all clicked through a hundred times a day. Consumers Union, which is the American version of Which?, reckoned it would take 27 hours a day to review all the user agreements you interact with in a day. They all come down to the same thing, you know: by being dumb enough to be a customer of this firm, you agree that they're allowed to come over to your house and punch you in the mouth and wear your underwear and, you know, clean out all the food in your fridge and make long distance calls. So none of us read them, because we know – if you've ever read one of them, you know before you click the agree button – there's one thing you're sure of: you don't agree. And we kinda hope that we'll get something like a uniform commercial code or some kind of restriction on this. But actually we regulate the internet and the devices we connect to it as though this kind of adhesion contract is just par for the course.

[31:14] And it gets worse, because it's not just creepy spying on poor people through their laptops, it's out-and-out what I have to call war crimes. Companies in this country, like the manufacturer of FinFisher, and companies in France, like the manufacturers of Vupen, sell Deep Packet Inspection software that allows for the extraction of all internet traffic to dictators in the Middle East. And we've had reports from Bahrain, we've had reports from Syria, we had evidence from the looted intelligence ministry in Egypt after the Egyptian revolution, that this software is used to track down dissidents, to figure out who to arrest, who to disappear, and ultimately sometimes who to kill.

[31:58] Now you may think that these dangers are abstract and far off in the future, and after all I am a science fiction writer, so you'd be forgiven for thinking that. But last November a security researcher named Barnaby Jack gave a presentation in Australia about the work he'd been doing on implanted defibrillators. And I assume some of you are medical people, so you know just how cool this is. If your heart isn't keeping the beat, your doctor can anesthetize you, she can open your chest cavity, she can spread your ribs, she can reach inside and she can attach a device directly to your heart. A computer and a battery: the computer listens to your heart, and the battery shocks it back into rhythm when you start to falter, and this saves people's lives. And doctors want to read the telemetry off of them, but it's a little messy attaching a cable to something that's lodged in your chest cavity. So it has a wireless interface, because everything has a wireless interface. London is a giant microwave oven. And this is where Barnaby Jack comes in, because from thirty feet (ten meters) away he can detect your implanted defibrillator's wireless interface, compromise it, reprogram its firmware, cause it to seek out other wireless defibrillators and reprogram them, and then cause them to deliver lethal shocks, either randomly or at a set time in the future. Being able to tell people what their devices are doing, and giving them the freedom to change how those devices work, is not going to be a matter of life and death in the future; it's a matter of life and death now.

[33:22] Now at the start of this talk I said that lawmakers are going to go on making this mistake for a long time, because everything we do in the future will have a computer in it, so every problem will involve a computer. And every problem will arrive at the same solution: 'just make me a computer that does everything but doesn't run the program that causes the problem'. And this runs up against a theoretical limit. We have this idea in computer science of a Turing complete machine, named for Alan Turing. That is a computer that is general purpose and can execute any program that can be expressed in symbolic logic. It was the great breakthrough of computer science after the Second World War, this idea that you wouldn't build one computer or another, but rather that you'd build a kind of pluripotent machine that, depending on the code you loaded into it, could be any other machine, could do any logical operation. And that's amazing; that's where we are today. This is the amazing place we've arrived at, where the general purpose computer has proven so malleable, and has enjoyed such amazing economies of scale because it's so useful in so many ways, that it's everywhere. It's in our light switches. I saw a presentation by Vint Cerf, one of the fathers of the Internet, in Paris last week, where he quoted the price of embedding a fully programmable computer and IPv6 networking stack in a light bulb at sixty cents. That's where we've arrived at with our general purpose computers.
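The "pluripotent machine" idea Doctorow describes can be sketched in a few lines. This is a hypothetical illustration, not from the talk: a toy interpreter whose behaviour is entirely determined by the program loaded into it, so the same "hardware" becomes a different machine with each program.

```python
# A toy universal machine: one interpreter, many machines.
# Which machine it is depends entirely on the loaded program.

def run(program, x):
    """Execute a list of (opcode, argument) instructions on one register."""
    for op, arg in program:
        if op == "add":
            x += arg
        elif op == "mul":
            x *= arg
    return x

doubler = [("mul", 2)]    # this program makes it a doubling machine
add_ten = [("add", 10)]   # this one makes the same machine an adder

print(run(doubler, 21))   # 42
print(run(add_ten, 32))   # 42
```

The point of the sketch is that there is nothing in `run` that knows about doubling or adding; swap the program and the machine's behaviour changes completely, which is exactly what makes "a computer that runs everything except one program" so hard to define.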

[34:47] As wonderful as general purpose computers are, we have no theoretical basis for a computer that is Turing complete minus one: a computer that can run all the code we can compile, all the code we can conceive of, except for a program that scares voters or causes serious problems. The closest we can come to that is a computer with spyware on it out of the box. For example: the BBC was worried that its American TV partners Fox, HBO and Warner wouldn't allow it to air the broadcasts it got early enough unless it had DRM. They thought that the terrestrial broadcasts would be limited, and those broadcasts would go to Sky, which has DRM in its signals. And so they successfully petitioned Ofcom [the communications regulator in the UK], on the basis of a report that Ofcom suppressed and that I subsequently published in The Guardian after it was leaked to me, to add DRM to our TV receivers. So it's now a requirement to run code that you're not allowed to audit, that you're not allowed to see, in your house, connected to your telly, which is connected to a camera and connected to a microphone and connected to all sorts of other devices, in order to watch American telly. Ironically, they said that their big risk was HBO, and HBO had just signed a ten-year deal with Sky. So I fail to see how that presented any particular risk; no matter what Ofcom and the BBC did, this is where they were going to end up. The reason that they did it is that they reckoned it wasn't a big deal, that they were just subtracting the "illegally copy TV shows" file or program from the general purpose computer in there. And there is no way to subtract that; all you can do is add spyware to it.
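Why does "subtracting" a program collapse into adding spyware? One hypothetical sketch, under the assumption that the blocker identifies the forbidden program by its bytes: any such check is evaded by a trivial change that leaves the behaviour identical, which is why enforcement drifts toward continuously watching everything the user runs instead.

```python
import hashlib

# A naive attempt at "Turing complete minus one": refuse to run any
# program whose hash appears on a blacklist of forbidden code.
BANNED = {hashlib.sha256(b"print('forbidden program')").hexdigest()}

def allowed(program_bytes):
    """Return True unless these exact bytes are blacklisted."""
    return hashlib.sha256(program_bytes).hexdigest() not in BANNED

original = b"print('forbidden program')"
tweaked = original + b"  # one harmless byte appended"

print(allowed(original))  # False: the exact bytes are blocked
print(allowed(tweaked))   # True: same behaviour, trivially evades the check
```

Recognizing the forbidden *behaviour* rather than the bytes runs into Rice's theorem (no general procedure can decide a non-trivial property of what an arbitrary program does), so in practice the blocker has to observe the running system, which is just spyware by another name.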

[36:28] And we've heard people say, well, it doesn't matter, the general purpose computing era is over, because of course everybody has tablets these days, and they're like appliances. But there's no conceptual reason why making a tablet that is an appliance, designed to do very few things and do them well, requires you, as a condition of doing that, to also add spyware to stop a user who wants to make it do more. The only reason the most common brand of tablet comes with spyware that stops you running certain programs on it is that the manufacturer has tied it to its own store, and they want to be sure that they can collect a 30% royalty every time you run code on that device, because they know it's come from their store. And the second most popular kind of tablet, the Android tablets, has no such requirement. There isn't anyone, as I understand it, who goes out and buys an iPad because they were hoping to get a tablet that would only run code from one store, even if they wanted to buy it from another store. They just bought it because they don't care. That doesn't mean that they want spyware; it means that they're ignorant of the risks of spyware. Which is where talks like this come in.

[37:36] We are the masters of the technological universe, we people who care about science and those of us involved in the computer industry. And those of us who are really switched on to computers tend to be contemptuous of DRM; we say we can make it go away with the wave of a debugger. But that doesn't make it harmless. The computer industry's own complacency about DRM is the most dangerous thing about it. Because the copyright wars were just the warm-up, the first skirmish. The coming century has a thousand fights over the model of DRM, over computers that say 'yes, master' and computers that say 'I can't let you do that, Dave'.

Cyber-utopianism vs cyber-realism

[38:14] I've been called a cyber-utopian, and you may be tempted to dismiss all of this as cyber-utopian folly. After all, the entertainment industry demanded copy-proof bits, a computer that could somehow make information go from one place to another without copying it (an idea so stupid that it would make Claude Shannon and Alan Turing soil themselves with laughter), and we gave them clothes that dissolved in the wash, and that seemed to make them happy. So what's the harm? But here's the thing about cyber-utopianism: from the very beginning, those of us who cared about this stuff, who advocated for networks and devices as forces for freedom, have warned that they also had the potential for oppression.

[38:54] From the beginning, the cyber-utopian movement, such as it is, has been mobilized by intense optimism about the power of crypto to enable cheap organization and secrecy from oppressive regimes, and by stark, frank terror over what would happen if cryptography did not become the norm in our everyday conversations, if the freedom layer was not included in our devices and our networks. These days it's fashionable to talk about cyber-realism as an alternative to cyber-utopianism, which says the internet isn't any kind of special thing, maybe it's not even a thing at all, and if it is a thing it's not really important to the struggle for justice. Except in as much as it's a barrier to the struggle for justice: a siphon feeding activists' signals intelligence to the spooks working for dictators, or a warm, distracting bath that takes real activists' energy and diffuses it through meaningless clicktivism.

[39:44] At its core, this realism seems to be saying that the means of information are irrelevant to the reality of the world. That is, it turns its back on the whole of human strategic history, which says that coordination is the key to victory and that communications infrastructure is the key to coordination. A history that includes everything from Caesar tattooing secret messages on a messenger's skull and waiting for his hair to grow back in before sending him off across the lines, right up through Trotsky sending troops to seize the post and telegraph office on the eve of the Russian Revolution, and up to this very moment, when organizations as diverse as Mexican drug cartels are kidnapping Motorola engineers and forcing them to build private cellular networks, and FBI agents are inadvertently revealing that they have full-text search of all the phone calls made in America.

[40:33] The internet is not nothing. Nor is it irrelevant except as a means of buying and selling things. Nor is it the world's greatest pornography distribution system or the second coming of the telephone. If it is any of those things, it is purely incidental. Because what the internet really is is the nervous system of the 21st century. Everything we do today involves the internet, and everything we do tomorrow will require it. It will either be a nexus of control or a nexus of liberation.

[41:03] Now, when I was an activist in the 80s, I spent 98% of my time stuffing envelopes and 2% of my time figuring out what to put in them. I am glad that the internet has given us the stamps and the envelopes and the address books for free. Clicktivism is the greatest boon to activist organizing in history. It's the gateway drug to deeper forms of engagement. Savvy activist organizations like this one offer smooth gradients of engagement, from just clicking a petition all the way up to taking to the streets or devoting your life to it. It's a marked contrast to the activist world we had before, where engagement was either total or non-existent, which meant that usually activism ended when you became employed, or when your unemployment became so dire you couldn't afford to think of anything else, or, importantly, when you had children. Which meant that disenfranchised people in particular, parents, and especially women who were mothers, were denied participation in the struggle for their own liberation.

[41:59] It is a privileged thing indeed to sneer at clicktivism, at the idea that you can only be an activist some of the time. It is the statement of someone who is not worried about the next meal, someone who isn't at home with their kids, someone who doesn't have to worry about losing a job in order to attend a protest. Inclusive movements cannot afford to be made up only of those with nothing to lose and those with the leisure to lose a little.

[42:20] It's up to us to build the future. If we build spyware and rootkits into our computers, if we put a mote in their eyes, we will make a future in which computers are levers for turning petty tyrants into global monsters. Or we can resist. We can refuse to flap our hands at the silly marketing departments and the deluded politicos and the Hollyweird fat cats who threaten to abandon the web and take their precious content back to AOL if they don't get their way in the design of the internet.

[42:50] I am an artist, and my livelihood depends on the sale of my entertainment products and my ability to extract meaningful sums of money from the world in exchange for amusing stories that help you pass the hours between the cradle and the grave. And I think DRM is rubbish and no help to me, for reasons I've gone into at considerable length in this talk. But even if I were convinced that DRM was the only way to earn a living telling my funny, made-up fairy tales, I would not go for it. I would sooner get a real job. Because as much as I want to take my family to Disneyland Paris for fun weekends, or buy my daughter nice clothes, or even pay our mortgage, I want a free and fair world for my daughter even more, and I think you should too. There is no way to fight oppression without free and open devices and networks. So it's up to us to demand the freedom layer on our devices and to enable that struggle. To be cyber-optimists, to secure the network and to use it to coordinate our struggle for our freedom. To jailbreak every device, to crack every censor-wall, to seize the means of information, and with it liberate the planet.

Thank you.