My talk on the Internet of Things, wealth disparity, surveillance, evidence-based policy and the future of the world

From The Cory Doctorow Wiki

Cory Doctorow talks about how massive surveillance is a tool the elites use to maintain a huge gap in wealth between rich and poor, and what copyright, DRM and the internet have to do with the fights over evidence-based policy, climate change, corruption and poverty.

Quote

«The fight here is not about cryptography. It's not about computers. It's not even about the Internet of Things. The real problems that we have today are much greater than that. We have things like climate change and sectarian conflict and vast economic disparity, corruption and poverty. But all those fights, as important as they are, more important than any fight we have about the Internet, all those fights will be fought and won or lost on the Internet. So the policy questions raised by the Internet of Things are not the most important questions […] . But they are the most foundational questions. All the other policy questions are contingent on how we answer the policy questions arising from the Internet of Things. On whether we believe that the default posture of our devices should be 'Yes masters' or 'I can't let you do that, Dave.'»

Content

Download

MP3 (108 MB)

Metadata

Published

January 16, 2015

Website

craphound.com, Mindenki Joga Radio Show (Facebook)

About

«Here’s the audio from last night’s talk on the Internet of Things at Central European University in Budapest! It was recorded by the Mindenki Joga Radio Show.»

License

Please help with

Feel free to help out with any of this on the transcript below:

  • Add sub-headings to make it easier to read
  • Check for typos and errors
  • Fix tagged issues
  • Add explanatory links to things that not everyone might understand
  • Add more tags

Thank you!

Transcript

[Introductory niceties]

[00:51] We live today in a world made of computers. We put our bodies into computers.

[00:58] A modern house, and a modern office building especially, is a computer that our bodies cohabit with. When you remove the computers from modern buildings – buildings that are a hybrid of insulation and automation – they cease to be habitable almost immediately. In Florida, when they turned the computers off in the sub-prime houses that were foreclosed on after the crisis in 2008, what they discovered is that turning the computers off for an appreciable length of time let so much mold and decay set in that those houses were rendered permanently uninhabitable. They had to be scraped to the ground so builders could start over again. In an important sense, the most significant thing about those houses is that they are giant case mods that we happen to live in.

[01:43] Cars are computers. Not self-driving cars, but contemporary, modern cars. Every year at conferences like Defcon or CCC we see people stand up and demonstrate attacks on car informatics, using things like the Bluetooth unlocking interface to take control of the brakes or the steering. The most salient fact about your car is its informatics, not its engine.

[02:07] A plane is a computer. A Boeing 747 is a very fancy flying Sun Solaris workstation in a really, really expensive aluminum case, connected to extremely badly secured and tragic SCADA controllers.

[02:24] And not only do we keep our bodies inside of computers, increasingly we put computers inside of our bodies. Many of you will know someone who has a pacemaker or an implanted defibrillator, but even more than that: if you're like me and you grew up with a Walkman, or if you're a little younger and you grew up with mp3 players, you will have logged enough punishing earbud-hours that, should you live long enough and not be killed by a self-driving car, you will some day get a hearing aid. And that hearing aid is vanishingly unlikely to be an analogue, retro, transistorized, beige, plastic hearing aid. It will be a computer that you put in your body. And depending on how that computer is configured, it will know what you hear, it may tell other people what you hear, it may stop you from hearing things that are there, and it may make you hear things that aren't there, with or without your consent.

[03:14] I mentioned implanted defibrillators before. Implanted defibrillators are wonderful technology. If you know someone who would die because their heart is no longer able to sustain its rhythm – today they can go to their doctor, and she will cut them open, spread their ribs and attach a computer with a battery to their heart. And it will listen to their heartbeat, and if their heart stops beating it will shock them back to life like defibrillator paddles do. This is amazing stuff. And of course doctors want to read the telemetry off these devices to see what they're doing, and they want to update the firmware and add new features. And computers that are in your chest cavity are inconvenient to attach USB cables to, so they have wireless interfaces. A few years ago a researcher, now deceased, named Barnaby Jack gave a presentation on how, from 30 feet away, he could compromise the wireless interfaces in implanted defibrillators and cause them to seek out other implanted defibrillators, take them over, and then at preset dates administer lethal shocks to all their wearers. When Dick Cheney had his defibrillator implanted, he had the wireless interface turned off.

[04:21] I'm a very frequent flier – I'm changing the climate, ask me how – and I know that the first rule of the road warrior is ABC: Always Be Charging, because your laptop is your lifeline to the world. And so whenever I go into a new room I automatically scan the baseboards for plugs. And I was in an airport lounge and feeling very smug about having seized the only plug in the room to charge my laptop, and a man walked up to me and very cheekily asked me if he could use my plug. And I looked over my glasses at him and said, "I'm charging my laptop before the flight," and he rolled up his trouser leg and he showed me the robotic prosthesis that was attached to his leg from the knee down and he said "I need to charge my leg before the flight", and I said, "it's all yours."

[05:07] So we live in a world made up of computers, where our bodies are inside of computers and where computers are inside of our bodies. And computers pose new regulatory challenges that are, in some ways, without precedent in the history of technology regulation. And because computers have become so intrinsic to our condition, our answers to these regulatory questions are some of the most significant questions we have in policy circles today. Now, historically, when a technology was involved with a social problem, we solved that problem with a technology mandate. So, for example, radios do a lot of useful social things – they bind us together as a community through radio broadcast, they allow for efficient emergency services dispatch, they are critical to air traffic control – and radio is a very fragile technology. Depending on how a radio emitter and a radio receiver are built, they will or won't work, and they can interfere with each other in ways that are quite dreadful and which can render whole classes of devices over very large areas completely unusable. Indeed, the first radio transmitters were things called spark gap generators which, if you were to operate one today, would effectively render all the radios around you unusable for as long as you insisted on using your big, dumb spark gap generator.

[06:27] Now, the way that we solved that problem was with a mandate. We have, in virtually every country in the world, a radio regulator, and if you are going to make a device for sale in a country or for import to that country, you are required to give a prototype of that device to a regulator, who will inspect it and decide whether that radio is designed to emit only in the bands it is supposed to emit in and won't interfere with other radio equipment. And although, with a lot of effort, someone who is skilled as an electronics engineer could re-engineer that radio, could turn a baby monitor into an air traffic control device, that person could also build one from scratch with parts. And the likelihood that someone would accidentally turn their baby monitor into an air traffic control device is extremely low. But today our regulatory model for radios has fallen apart, because modern radios are software defined radios. Instead of having a crystal – a quartz crystal whose resonant frequency, whose vibrational frequency, determines how that radio emits – today we have an oscillator, a signal processing algorithm and an analog-to-digital/digital-to-analog converter, and the characteristics of the radio depend on what software is used in conjunction with them. Which means that by loading new code into your baby monitor you really can make it into an air traffic control device. So this is an enormous regulatory challenge which we don't really have any answers for. This idea – that before the computer leaves the factory you have to decide what kinds of instructions it will interpret and which ones it won't, and that way we'll solve our social problem – just doesn't work with computers.
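
[Illustration, not part of the talk: in a software-defined radio, the emission frequency is literally just a number in the code. A minimal Python sketch of the idea, with a made-up sample rate and illustrative frequencies:]

    import math

    SAMPLE_RATE = 250_000_000  # samples per second; a hypothetical DAC rate

    def carrier_samples(freq_hz, n_samples):
        """Synthesize a sine carrier at freq_hz -- the 'radio' is just this number."""
        return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
                for t in range(n_samples)]

    # The same hardware becomes a different radio when you change one constant:
    baby_monitor = carrier_samples(49_860_000, 64)   # ~49.86 MHz, a legacy baby monitor channel
    airband      = carrier_samples(121_500_000, 64)  # 121.5 MHz, the aviation emergency frequency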

[08:21] And to explain why, I have to delve into some computer science fundamentals, and particularly into the idea of Turing completeness. Before the Second World War, if you wanted to electronically compute something important, you built a calculator to compute just that thing. If you wanted to tabulate elections you built an election tabulating machine. If you wanted to calculate ballistics tables you built a ballistics table calculating machine. The war changed all that. During WWII, Bletchley Park in the United Kingdom – a great collection of mathematicians and early computer scientists led by Alan Turing and notably assisted by Polish mathematicians in exile – worked together to create what we think of as the modern computer. And that work was critically augmented by what they called "The Hungarians": John von Neumann and his colleagues, who went to the Institute for Advanced Study and worked on the early computer architecture in Princeton. And together they designed this revolutionary architecture for computing that we call the von Neumann machine, and it is Turing complete. A Turing complete von Neumann machine is a computer that can execute all the instructions that we can express in symbolic logic. Any valid program can run on any Turing complete machine, and every computer that you've used in your life, almost certainly, was a Turing complete machine. What that means is that the programs you're running on computers today – computers that are tens of millions of times more powerful than the ones Turing and von Neumann built – would run on those early computers. Now, they would run very slowly, they might take more time than we have left before the universe collapses, but they would run.
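
[Illustration, not part of the talk: a minimal Python sketch of what "one machine, any program" means. The machine below is fixed; all of the behavior lives in the transition table you load into it:]

    def run_turing_machine(program, tape, state="start", max_steps=10_000):
        """A single general-purpose machine. `program` maps
        (state, symbol) -> (new_symbol, move, new_state)."""
        cells = dict(enumerate(tape))  # sparse tape; blank cells read as "_"
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            new_symbol, move, state = program[(state, cells.get(head, "_"))]
            cells[head] = new_symbol
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # "Loading new code" just means handing the same machine a new table.
    # This program flips every bit and halts at the first blank:
    invert = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine(invert, "10110"))  # -> 01001_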

[10:10] Now Turing completeness is amazing, because it means that instead of designing a new machine every time we need to do a new thing, a new calculation, we just load new code onto the machine we have. And Turing completeness, once we discovered it, we hardly seem to be able to get away from it. It seems to be almost latent in the structure of the universe. You may know a collectible card game called Magic: The Gathering. Magic: The Gathering, given a large enough deck and the right rule set, is a Turing complete computer, and you can compute anything you can compute on a laptop with a very large Magic: The Gathering deck. And when I say very large I do mean a deck that might stretch to the sun and beyond, but with a large enough deck and enough time you can compute anything that any Turing complete machine can compute.

[10:56] Indeed, we now try to build machines that aren't Turing complete, and we usually fail. So why would we want to build a Turing incomplete machine? Well, say you just designed a new social media platform, and on your social media platform everyone gets a page, and on those pages they all get a glittery unicorn at the top – a GIF that glitters and animates. And you want to give them the awesome self-expression power of choosing how the unicorn looks. So you write a toy scripting language that lets the unicorn do three or four really simple things, and you think that's safe. And then, inevitably, at a conference like CCC or Blackhat or Defcon some programmer stands up and says: I took the two or three instructions that you included, figured out how to build a full Turing complete instruction set out of your toy unicorn animation language, and wrote a virus in it. So this means that computers can't be appliances in the way we usually think of appliances. When we say, oh, I've sold you a router and all it is is a router and it will never be a rendering station or a car informatics system, what we mean is that it would be dumb and weird to use it as a rendering station or a car informatics system. What we don't mean is that it's technically challenging or impossible to use it as that.

[12:24] But firms are under extreme pressure today to tether their devices to ecosystems that they control. They look at Apple's 10 billion dollar 2013 App Store revenue and they say: wouldn't it be great if we could design computers that would only run the code that came from our store, and not the code that came from other people's stores?

[12:46] The investors that they court speak glowingly of businesses that have what they call "moats" and "walls". A "wall" is some cost that the customer would have to pay to change their loyalties, so that if you stop being a user of one ecosystem and start being a customer of another, you find yourself having to throw away all of your gear and buy all new stuff, because it's been designed not to interoperate. [some fumbling with words] And then there are businesses with a "moat", and that's a business where it would cost another company a lot of money to come in and compete with you. And so they want to design devices and computers that only talk to the ones that are blessed by them.

[13:28] Now historically there has been a kind of economic equilibrium between "moats" and the rents that they allow the firms that create them to extract. If your customers have to spend money to replace their printers to get one that uses cheaper ink, then you can overcharge them over the life of the printer, and on average they will stick with your printer rather than throw it away and buy one that takes cheaper ink. Hand waving aside [DU] questions about net present value and so on. But generally, if your ink costs a lot more than the extra value you deliver, you would expect your customers to defect to a printer that was cheaper – to throw away their brand new printer and get another one rather than buy all of your extremely expensive ink.
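
[Illustration, not part of the talk: the equilibrium described here as a back-of-the-envelope calculation, with entirely made-up numbers:]

    # Hypothetical figures, just to make the limit on rent extraction concrete.
    printer_price   = 100.0  # switching cost: throw my printer away, buy a rival's
    my_ink_per_year = 90.0   # what I charge for consumables per year
    rival_ink_year  = 50.0   # what the rival charges
    years_of_use    = 5

    ink_premium = (my_ink_per_year - rival_ink_year) * years_of_use  # 200.0
    # The moat only holds while the rent stays below the cost of defecting:
    print("customer defects" if ink_premium > printer_price else "customer stays")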

[14:18] So that's normally the limit on how much rent you can extract from your moat. It's what your competitor can come in and offer your customers that would make it a good deal not to buy your very expensive consumable or participate in your very expensive ecosystem. Generally, if you try to lock your customers out of your rivals' products, they will just unlock[?] themselves, thanks to the miracle of the Turing complete von Neumann architecture. They will just – if you have a program that checks to make sure that all the software running on a phone is blessed by Apple, someone will make a program that makes sure that that first program doesn't run, right? And then they can buy their software from anyone, and that's the end of it.

Basics of cryptography

[15:05] Now to understand how that works and why that works, and particularly why that doesn't work, we have to talk about cryptography for a moment, as well as computer science foundations. And again we go back to Alan Turing here and the work he did on cryptography. When you talk to cryptographers about how crypto works, they tend to use examples involving three people: Alice, Bob and Carol – although today we might say Alice, Bob and Clapper. Alice and Bob are two friends who trust each other and want to send each other messages, and Carol is their enemy, and Carol wants to read their messages. And cryptographers try to figure out how Alice and Bob can talk to each other without Carol getting in the middle of things. And cryptographers start from certain assumptions. They assume that Carol knows that Alice and Bob are sending each other a message. Because Alice and Bob are probably sending that message over a medium that is a bit noisy, like a radio, where everyone inside the radio's transmission zone can see the message going by, or maybe by satellite, where it blankets a whole continent, or maybe they're using the public switched internet or a phone. In all of those cases we assume that the adversary, Carol, knows that the message exists, and we actually assume that she can get the message, that she can receive it in transit. Because if you know it exists, if it's coming from a satellite, if it's being blanketed over a whole continent, then it's not hard for Carol to get a copy of that message. So Carol has the scrambled message. Alice and Bob also assume that Carol knows how they scrambled it.

[16:43] The reason that Carol will know how Alice and Bob scrambled it is that we don't know how to make security systems whose security is provable by any methodology other than telling other people how they work. Before we had science we had a thing that looked a lot like science, called alchemy. And alchemists were all engaged in similar labor – they wanted to transform base metals into precious ones – and they faced a kind of weird game-theory outcome. Because if you figured out how to turn lead into gold and then all the other alchemists got the secret from you, gold would become worthless, and all of your life's work would be wasted. So alchemists didn't tell each other what they thought they'd learned. And human beings have a bottomless capacity for self-deception, which is very hard to check when you don't tell anyone else what you think you know. And as a consequence alchemists all discovered for themselves, in the hardest way possible, that drinking mercury was a very bad idea. So alchemy stalled. We didn't get any kind of advance on the alchemical project until alchemists started publishing. And we call the period before they started publishing "the dark ages" and we call the period when they started publishing "the enlightenment". And like every other discipline of the enlightenment, the only way to know security works is peer review. There is no security in obscurity for the same reason there is no physics in obscurity. If you don't tell other people what you think you know, you're probably kidding yourself. So Alice and Bob have learned the lessons of Alan Turing's adversaries, who had secret ciphers that Alan Turing broke along with the [DU] and the Hungarians, and who continued to send messages in these broken ciphers and got all their U-boats sunk. Having learned that lesson, they told everyone they could find how their cipher worked, so that all the dumb mistakes they made could be corrected. Because anyone can design a security system that works against people stupider than them, and they want to make one that works on people smarter than them. So Carol knows what Alice and Bob have done to scramble the message. Carol has the message. So how are Alice and Bob to retain their secrecy? Well, Alice and Bob have a key, and when they run the message through the algorithm with the key, it gets scrambled in a way that can't be descrambled unless you also have the key. The maths of the crypto are strong and good and have withstood peer scrutiny, and we believe they work. We believe in a really foundational way that they work, and that means that you can give Carol the message, and so long as you never give her the key, it doesn't matter.
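
[Illustration, not part of the talk: a minimal Python sketch of the Alice/Bob/Carol model, using a one-time pad built from the standard library. The point is that the algorithm is completely public and all of the secrecy lives in the key:]

    import os

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        """One-time pad: XOR each message byte with a key byte.
        Publishing this function costs Alice and Bob nothing."""
        return bytes(d ^ k for d, k in zip(data, key))

    message = b"meet me in the square at noon"
    key = os.urandom(len(message))        # shared by Alice and Bob; Carol never sees it

    ciphertext = xor_cipher(message, key)          # what Carol intercepts in transit
    assert xor_cipher(ciphertext, key) == message  # Bob, holding the key, recovers it
    # Carol holds the ciphertext and this entire source file, and it doesn't matter.

(Real systems use peer-reviewed, published ciphers like AES rather than hand-rolled pads; the sketch only shows where the secrecy lives.)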

Stopping someone from using their own device as they want – Crypto applied to software

[19:31] So how does it work when you try to stop someone from running software that they want on a device that they own, in this crypto model? I want you to be able to play DVDs, but I don't want you to be able to rip DVDs – I just want you to play them in the optical drive in your computer. Or I want you to be able to download a movie from Netflix and watch it once, but not save it to watch later. How do we make that work? Well, we give you a player – the Netflix player or the DVD player – and that player has an algorithm, a good algorithm, one that is published widely and is understood, and that player has the scrambled movie, which is what Netflix sent you or what you brought home from the store on a DVD, and that player has the key. And so long as you never find out what the key is, you have designed a piece of software that decodes the DVD, right?

[20:26] Well, you may have spotted the problem with this. Because you are Carol, but you're also Alice. Bob has sent you the key because you're Alice, you're supposed to read the message, but Bob hopes that you never find the key, because you're also Carol and you're not supposed to be able to read the message except in the way Bob says you can. And anybody in the world can be Alice. All you need to do to be an Alice is get a Netflix subscription or buy a DVD player. Bored lab students with the weekend off and electron tunneling microscopes are Alice. And we have hidden the key in a piece of equipment that we let Alice take home with her, and we hope that she'll never figure out where we hid it. This doesn't work, for the same reason that we don't keep even really good bank safes in bank robbers' living rooms. If the secret is in equipment on the adversary's premises, it won't stay secret.
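
[Illustration, not part of the talk: why the DRM version of the model fails, sketched in Python. The player we ship to Alice must contain the scrambled movie, the algorithm and the key, so the owner of the equipment can always recover the key:]

    class Player:
        """A toy DRM player: everything needed to play is inside it."""
        def __init__(self, scrambled: bytes, key: bytes):
            self.scrambled = scrambled
            self._key = key  # "hidden" -- but on equipment Alice owns

        def play(self) -> bytes:
            return bytes(b ^ k for b, k in zip(self.scrambled, self._key))

    key = b"\x55\x67\x12\x0e\x41"
    scrambled = bytes(b ^ k for b, k in zip(b"movie", key))
    player = Player(scrambled, key)

    # Alice-as-Carol doesn't need to break the cipher; the key is on her premises:
    with open("saved_movie.bin", "wb") as f:
        f.write(player.play())   # or she just reads player._key directly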

[21:32] So why do we still have it? How does it work, how is it that we have Netflix and iTunes and iPhones and so on? Well, we swallow spiders to catch that fly. There is a network of global treaties that began in 1996 with two UN WIPO treaties – the WCT and the WPPT – and then turned into global laws like the 2001 EUCD, the American DMCA – Digital Millennium Copyright Act – of 1998 and Canada's recent Bill C-11. All those laws endeavor to protect this weird Alice, Bob and Carol business model – these walled gardens. In the name of preventing piracy they make it a felony to produce a tool that circumvents an effective means of access control. That is, if you do anything that assists people in doing things that the manufacturer has prohibited, you commit a felony. Now, it doesn't actually stop people from creating software to jailbreak their devices – they still do that. But the fact that they can't do it in a way that is open, in a way that allows them to raise capital and market products and put adverts on the sides of buses, means that the industry that uses these digital locks can maintain a source of secondary income for themselves by depriving users of features that the law would allow but that no one is allowed to add, because they would have to break the digital lock to get there.

[23:05] So a good example of this is CDs and DVDs. If you buy a CD and you bring it home and you own a fruit-flavored laptop, Apple will give you a free – if really slow – program called iTunes, and it will do something totally legal: it will rip your CD and stick it right on your iPhone. Now, if you own a DVD, which is read in the same drive and is made in the same factory and can be organized with your copy of iTunes, it would be a felony to give you a copy of a program that would let you rip it to your computer and stick it on your iPhone. Instead you're supposed to buy that DVD again, as a download from the iTunes Store.

[23:52] And so, for companies that would like to sell you something you already own, a prohibition on breaking digital locks is a great way to make more money. It's a way to take the surplus value latent in your property and take it in for themselves. So you can imagine that if you had a thousand dollars worth of CDs and a thousand dollars worth of DVDs in 1996, today those DVDs can only do what they could do when you bought them: you can just watch them. Which is amazing – when you think of a technology to which no new features have been added since 1996, that's a shocking thing, right? But CDs, because you can rip them, you can turn into alarm tones and ringtones, you can mash them up and stream them and give them to your friends and put them as background music in your YouTube videos and do a million things. All that latent value that used to be in your CD has now been unleashed, because there's no prohibition on giving you a tool that allows you to do legal things with your CDs. The model that digital locks enable you can think of as the urinary tract infection business model. With the CD, all the value flows in a kind of healthy gush. With the DVD, the value comes at a painful dribble. Every time you want to do something new with the DVD, you have to go buy that right again.

[25:22] So this is a very, very attractive proposition to industry and to investors. They really like this. I had dinner recently with a friend of mine who is an analyst for a VC firm that invests in Western companies doing hardware startups in China, and they said: we're only investing in companies that have an ecosystem, where there's a software ecosystem that ties the hardware into a suite of services. And that suite of services will never be competed with by industry or by investment capital, because it's unlawful to do so.

[25:57] But it's not enough to make it a crime to manufacture products that allow users to jailbreak their devices. Because individual users might figure out how to get around this all on their own, or you might get nonprofit software entities that have different risk profiles and different victory conditions, like VLC – which is based here in Hungary [the VideoLAN project behind VLC is actually based in France] – that circumvent and don't charge anything for it. And because, of course, as I said, anyone can become Alice by buying your product and scrutinizing it closely. So these laws don't just prohibit firms from releasing interoperable products, they criminalize disclosing information that might help you break a digital lock. They make it a crime to tell you about flaws in the digital lock that could be used to unlock it, because that's what individuals need in order to make their own players. It's what VLC relies on in order to make its own player.

[26:55] Now, that turns the defects in devices that are covered by these anti-circumvention statutes into reservoirs of long-lived vulnerabilities, and those long-lived vulnerabilities threaten our own lives. Because bugs in software aren't just used to jailbreak phones. Your phone is a supercomputer in your pocket that knows everything about you. It knows who your friends are, what you talk to them about, everywhere you go and how you log in to your bank account. It has a camera, and you take it with you to the toilet and into the bedroom, and it has a microphone that's on, potentially, while you discuss sensitive things with people around you in the room. It knows what your doctor e-mailed you, it knows what you said to your lawyer. And so a vulnerability in your iPhone doesn't just let you break out of Apple's software ecosystem, it also opens you up to enormous risks.

[27:54] An Internet of Things world is a world where you are potentially under continuous surveillance. If you think about what a voice activated system means, it means a system where you are never out of range of a microphone. What a gesture activated system means is a system where you are never out of range of a camera. And malware authors, the people who attack us through our computers, rely on the same vulnerabilities that are a crime to report under anti-piracy laws. They rely on them to figure out how to subvert devices' security models. As do spies. And any limit on vulnerability disclosure increases the length of time that a vulnerability is live in the field before the manufacturer can issue a patch for it. So think about what this means for a future of devices that you don't know about, that you're not allowed to know the vulnerabilities in, that are ever more intimately woven into your life, and where people who discover critical flaws face jail time – as the Russian programmer Dmitry Sklyarov did when he revealed flaws in Adobe's e-book reader and the FBI put him in jail in America.

[29:10] It means that our devices become not honest servants, but potential traitors in every way. Today we're already seeing the first steps of that. Think of the Euromaidan protests in Kiev. The old regime was extremely hostile to the uprising in [DU], and they did what they could to compromise and attack the people who came, including investigating and threatening their families, trying to assemble dossiers on who was sympathetic to the movement and so on. In America now there's widespread use of these devices called stingrays. A stingray is a fake cell phone tower that briefly pulses all the phones around it and gets them to answer back with something called the IMEI number, which is their unique ID number. And your carrier, which is liable to a subpoena or just a straight-up attack from the state – your carrier can associate your IMEI with your identity. So imagine if, in Maidan, rather than having to send secret police and provocateurs around to find out who was throwing Molotovs or beating the drums or doing all the other things on the other side of the line, they could've just pressed a button and gotten the name and address of everybody protesting in the square.

[30:34] And now think forward a couple of years. We now have these new smart meters going in – Google just bought Nest, which is a really big smart meter company.

Smart meters are a super cool technology. If you, like me, are worried about coal power and climate, one of the promises of the smart meter is that it's going to allow us to minimize our reliance on coal. Right now we turn on coal generators when the power grid hits peak load – that's when we turn on the coal. But there is an alternative to turning on coal when the power grid hits peak load, which is turning down everyone's thermostat by one degree, so that we don't have to turn on the coal plants. That's great! But you can imagine that the people who are deploying these systems would think, a priori: "If we're going to turn down people's thermostats by one degree, we don't want them walking over and turning them back up again. We need to design those thermostats so that they're not under user control, but under remote control. So that the user is treated as an adversary and not the owner of the device." In fact, in most cases, people don't own their thermostats; they're owned by the power company, especially in these smart models.

[31:42] Now imagine that the next Maidan is in Minsk and everyone's got smart meters, and Lukashenko turns on his stingray. He gets everyone's address, and then presses a button and turns off the heat of everyone who showed up for a demonstration in Minsk. The power of a state to exercise coercive force over the citizenry, in a world in which we are treated as adversaries by devices that we live inside of and that are inside our bodies, is horrific.

[32:18] In America, the collapse of the subprime housing industry sent the finance industry searching for new things to securitize, new loans to securitize. And the latest securitization mania is for cars. They have started offering subprime loans to people who aren't good credit risks to buy cars. And they securitize those loans, they generate bonds based on them, and the way that they maintain the value of those bonds, which is contingent on the car payments being made, is by fitting the cars with a networked, remote controlled, location sensitive ignition override. And if you don't make your payments, your car won't start. And if you have a condition on your lease that says you can't drive out of a certain region and you drive out of that region, your car won't start. We've already seen the negative consequences of this. There is in the New York Times an account of a woman who had a lease that specified she wouldn't take her car outside her county lines, and she took her children on a trip to the woods just over the county line, outside of mobile phone reception. And she turned the car off and they walked in the woods, and they walked back to the car, and the car wouldn't start, and they had no mobile phone reception. Because they were on the wrong side of the county line, they needed to speak to the finance company to restart their car. When you imagine how that will work when it's not accidental but deliberate, the risks seem very large. Not only that, but if Congress was willing to give the entertainment industry a statute that says it's a felony to show someone how to listen to music on an unapproved device, what will Congress make of the YouTube videos that are already there explaining how to remove the override from your car? They're gonna go to Congress and say: "Why can't we have a law that says it's illegal to show someone how to steal a car, if it's illegal to show them how to listen to music the wrong way?" And so we can expect that this doctrine will be under enormous pressure to expand.

[34:30] And where it ends is anyone's guess. But I saw a presentation a few years ago from Hugh Herr, who is the head of MIT's prosthetics laboratory. And Herr did this amazing presentation – he uses slides, which I am in awe of, 'cus I can't use slides – and he shows all these slides of these amazing things that his lab has done. And then the last slide is a picture of him, and he's clinging to a rock, in Gore-Tex, and he's rock climbing and he has no legs, he has prosthetics. And he's been walking up and down the stage this whole time [DU ?] and he goes "oh, didn't I mention?" and he rolls his pant legs up – he'd lost his legs to frostbite, he's robot from the knee down. And then he's sort of jumping up and down the stage like a mountain goat. It is the coolest reveal[?] I've ever seen. And the first question anyone asked was "how much are those legs?" And he named a price that you could buy a brownstone on the Lower East Side for, or a house in Mayfair, right? And then the next question was "who can afford those legs?" And he said "why, anyone", right? If it's a forty year mortgage on a house or a sixty year mortgage on legs, you're going to take the legs. Well, we've already seen with subprime mortgages what repossession looks like. What does repossession look like when it's your legs?

[35:52] Now, worryingly, the world's security services have hit on a strategy for cybersecurity capability that is all offense and no defense. The Snowden revelations included the news that GCHQ and the NSA collaborated on programs that are, depending on which side of the ocean you're on, called Bullrun or Edgehill – costing two hundred fifty million dollars a year – that are devoted to deliberately introducing vulnerabilities into our devices, so that they can be used to attack their adversaries. Including subverting standards like the National Institute of Standards and Technology's (NIST) cryptographic random number generator – their elliptic curve random number generator. Which is a shocking turn of affairs. It's like discovering that the security services have been secretly mixing sand into all the concrete so they can make buildings fall down if they need to.

[36:44] Your building, remember, is just as uninhabitable without its IT as it is without its structural supports. In addition to this, many of the world's governments have started buying vulnerabilities from security researchers. Historically, security researchers who discovered vulnerabilities disclosed them to the manufacturers. In fact they had to force the manufacturers to respond to them. It used to be you would go to Microsoft and say "I found a critical bug in Windows" and they'd say "OK, whatever", and you'd go back like six months later and say "Haven't you guys fixed that!?" and they'd go "We're busy." And now what hackers do is they go to Microsoft and they say "We've found a bug in your software, and we've got a paper about it accepted by a conference next month – you'd better fix it before then!" Except there's another thing you can do if you find a vulnerability now, which is you can sell it to the spies, and they will weaponize it and they will use it to attack us. And if the spies have a vulnerability that they're using to attack their enemies, that vulnerability is also there to be discovered by criminals and voyeurs and other bad guys who can come and use it to attack us.

[37:52] This week David Cameron, the Tory [a political party in the UK] prime minister of the United Kingdom, where I live as you can tell by my accent [This is a joke, Cory is from Canada and sounds like it], said we should have no means of communication that we cannot read. And the FBI and the New York attorney general have gone on record saying they would like mandates for backdoors on phones that do full disk encryption and have encrypted protocols to talk to one another.

[38:21] But as we saw with Alice and Bob and Carol, if your computer has a program that is insecure, that has a backdoor, the only way for that backdoor to subsist once you know it's there is for it to be impossible for you to install a better program on your computer. When the FBI says "we want a backdoor in the operating system of your phone", what they mean is "we don't want you to be able to change the operating system of your phone." That is absolutely necessary as a precondition for them to be able to backdoor your phone. And as we saw with Alice, Bob, Carol and the Digital Millennium Copyright Act, the only way to preserve that is to make it a felony for you to know about vulnerabilities that would allow you to install better software on your phone.

[39:13] So even if you trust the secret police in your country, what it means is that the vulnerabilities in your phone are going to take much longer for you to learn about. And you may be compromised – like, say, Cassidy Wolf, the former Miss Teen USA, who last year had someone install drive-by malware on her phone, which he used to hijack her camera and keyboard. He took incidental nude photos of her as she walked around her room and then demanded by e-mail live sex shows in front of the camera he had taken control of, or he would use the social media passwords he'd harvested to put those nude photos all over the Internet. She called the FBI, who arrested him and found that he had over 140 victims around the world, including minor children in the EU. That's what it means to not know if there's a backdoor or vulnerability lurking in your device.

[40:04] Now, the FBI is not the Stasi. The modern world does exist in a state of constant surveillance, and the Snowden revelations – which encompassed full feed captures of major Internet trunk lines, mass harvesting of data from enormous online services like Google and Facebook, and the deliberate introduction of flaws into operating systems and devices – shocked many of us. In Europe, especially in the former Soviet states, we like to draw comparisons to the surveillance habits of the Stalinist security agencies like the Stasi. But a cursory look at the figures shows that this is a remarkably inapt comparison. At its peak in 1989 the Stasi operated in a largely pre-Internet, pre-computer era and really had a hard job of it, spying on people. In 1989 there were about sixteen million people in the G.D.R., and there were 294 000 operatives of one kind or another in the pay of the Stasi, including 173 000 unofficial informants, snitches. And today – it's hard to know how big the NSA is, but there are four million Americans with some sort of security clearance, 1.4 million of them with top secret clearance, and about seven billion people on earth. And the U.S. intelligence services – along with the Five Eyes allies in the U.K., New Zealand, Australia and Canada, and the wider circle of trusted states in the E.U., whose security services are much smaller than the U.S.'s, just a rounding error against the U.S. numbers – manage to spy on practically all seven billion human beings on earth. If we assume that every American with any kind of clearance is actually a front for the NSA, then there is a ratio of one spook for every 1750 people. If we assume that only those with a top secret clearance – which again would be a very large superset of NSA operatives – are fronting for the NSA, it's a ratio of one spook to every 5000 people under surveillance. By contrast, the Stasi needed one spook for every sixty people. They did it retail. It was artisanal surveillance.
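
[Illustration, not part of the talk: the ratios in this passage as a worked calculation, using the figures given above:]

    gdr_population   = 16_000_000    # East Germany, 1989
    stasi_operatives = 294_000       # including 173,000 unofficial informants
    print(gdr_population // stasi_operatives)   # ~54 -> about one spook per 60 people

    world_population = 7_000_000_000
    any_clearance    = 4_000_000     # every US clearance holder, as a worst case
    top_secret       = 1_400_000
    print(world_population // any_clearance)    # 1750 -> one spook per 1,750 people
    print(world_population // top_secret)       # 5000 -> one spook per 5,000 people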

[42:05] Since 1990 the productivity gains of I.C.T. have given spies a two-order-of-magnitude lift in their surveillance capability. The Stasi used an army to surveil a nation; the NSA uses a battalion to surveil a planet. We tend to see NSA surveillance as a kind of extreme aberration, because of the sheer number of people that are put under surveillance. But maybe there's another explanation for how the NSA got here. Maybe they only modestly increased the resources and staff in their surveillance project, but because of ICT's incredible gifts to surveillance, roughly the same amount of effort got them a geometric expansion in capacity. Indeed, since the end of the Cold War, the fastest growing U.S. surveillance agencies have only grown fourfold, and most of them have grown much more modestly than that. So although they have increased their surveillance capacity by more than a hundredfold, they've made a relatively small investment since the end of the Cold War.

Why spy? – Inequality, social stability, guard labor and wealth distribution

[43:04] Which brings us to a kind of existential question, which is: why bother spying at all? Generally a state's spying is part of a wider program to maintain social stability. They worry that either internal or external forces will compromise the status quo, which will destabilize the state and endanger those for whom the state seems legitimate, the people who think the state serves them well. For example, at the time of the Stasi the leaders of the Soviet states believed, correctly, that the US and NATO wanted to overturn not just their governments, but their very system of governance. They believed, also correctly, that there were people within their borders who shared this goal, and the same is of course true of autocratic states today. You know, as much as the Kim family are not very nice people, they aren't paranoid when they imagine that there are lots of people in North Korea who want to get rid of them. And a few years ago Syria was a really good example of this, prior to the civil war. Although people in the country lived in enormous poverty, the Assad family enjoyed spectacular and widely publicized material wealth. They rang up huge iTunes bills, they bought haute couture clothes off of Paris runways, they shopped Chelsea for handmade furniture, they spent thousands – this is true – circumventing economic sanctions on their regime to import illegal copies of the last Harry Potter movie.

[44:28] And this is also true in the United States. There are plenty of people who object not just to the current US government, but to America's system of governance. There are domestic terrorists in America, there are populist movements in America, there are anti-authoritarian movements like Occupy in America, and of course there are jihadists in America, and they all point to a state whose best-served members have reason to fear that the prosperity they receive thanks to the status quo is in danger. It's not wrong to believe that if you enjoy a lot of wealth, and other people object to the system that provides it and would like to get rid of it, then your quality of life is threatened by those people. That doesn't say anything about the legitimacy of their aims, it just says that that's true.

[45:18] Now, with social stability comes comfort[?]. Generally speaking, people who feel well served by a state work for its continuance, or at least don't work against its continuance. And historically states have used a combination of social programs and what the economist Samuel Bowles calls 'guard labor' – the [??] apparatus, surveillance, national armies, police, jails and so forth – to attain social stability. States that have a lot of social programs don't need to spend as much on guard labor – you can think of the Nordic states – and states that have a lot of guard labor generally don't do as much redistribution, like Bahrain. And most states use a mix of guard labor and social programs, like Saudi Arabia, where you have a certain class who get a lot of money from the oil fields and you have an underclass who are not citizens, but who are effectively long-term residents, even multi-generational guest workers, who don't have those rights. But the guard labor, and the legitimacy created by that larger pool of wealthy people – much like the middle class in Western states – even where there's a lot of poverty, creates a legitimacy and a certain [??] stability.

[46:20] But now we live in an era of an expanding wealth gap. There's profound economic inequality, both in and out of the Western states. The 400 richest Americans hold more wealth than the rest of the country combined, over three hundred sixty million people. And the majority of the richest Americans – six of the Forbes top ten – inherited their wealth. Which creates a picture of a kind of static, dynastic form of government where the elites are created by birth and not by virtue. The OECD says wealth disparities are at their worst in over 50 years, and the three richest people on earth have a net worth that's higher than the 48 poorest countries on earth combined.

[47:02] And these extreme wealth gaps are the source of real social instability. For example, real wealth gaps and power gaps endanger evidence-based policy. In the old Soviet system the best expression of that was Lysenkoism. For political reasons a scientist who had the ear of Stalin, Lysenko, rejected Darwinian selection and insisted that people could be perfected genetically through behavioral changes – that you could change people while they were alive, and those changes would then be reflected in their germ plasm and carried on to their children. This was obviously ideologically very attractive to the Stalinist project, and so they used it in wheat growing. You may remember the famines of the Stalinist era, tens of millions of people dead – that was Lysenko. If evidence-based policy is inadmissible in your halls of power, then your state is destabilized, because evidence-denying policy produces bad outcomes. That's why we do evidence-based policy. So modern states whose ruling elites depend on the denial of climate change can't have policies that are grounded in the reality of climate change, any more than states whose ruling elites are based on extremist religious doctrines that demand the repression of women can allow evidence-driven policies that point out the impact on industry, on public life, and on national prosperity of excluding 52% of the population from public life. And social instability is a threat to extreme wealth. Your wealth is only as good as the guns surrounding it and the social consensus that you deserve it.

[48:43] Historically there's been a hard limit on wealth inequality: the more unequal a society is, the more people within its borders and around the world question its legitimacy, and so the more it has to spend on guard labor. Eventually it becomes cheaper to just redistribute some wealth than to keep paying for higher and higher walls. But we have a new status quo, an ICT-supercharged surveillance status quo. The elites in the former Soviet bloc enjoyed tremendous wealth and privilege even during the darkest days of Stalin's famines. But remember, they had to pay for one spy for every sixty people, and arguably they were under-investing, because after all the Stasi didn't know, until the wall came down, that it was about to come down. They weren't spying enough to figure out what was about to happen. If the Stasi could have given itself a two-orders-of-magnitude efficiency boost, spending the same amount of money and getting a hundred times more spying, if every snitch could watch 5000 instead of 60 people, imagine how much more wealth the elites could have hoarded in their dachas.

[49:47] We don't need to imagine it. You can look at countries like Ethiopia, one of the three least developed nations in the world, which has become a turnkey surveillance state although it has virtually no domestic ICT capability. It was able to go on the global market and buy NSA-grade surveillance capability for its ruling elites, which has allowed those ruling elites to enjoy enormous privilege, concentrated almost exclusively in the hands of government. They bought weaponized security vulnerabilities to attack their dissidents, including a permanent resident of the United States, an Ethiopian dissident who was attacked in Washington D.C. by an Ethiopian cyberweapon bought on the open market, which they used to spy on his Skype sessions to find out who he was communicating with back in Ethiopia. He's presently suing the Ethiopian government in an American court.

[50:44] The level of wealth disparity and the corruption it engenders is intimately bound up with the story of ICT. The ability it gives states to assert wholesale control at fire sale prices is a key factor in the ability of the wealth gap to spread as wide as it has. We're in a global arms race between ICT's power to spy and the appetites of elites to amass greater fortunes, poised against ICT's power to shield us from surveillance and allow an effective dialogue about fairness and political change to grow without fear of reprisals. And this is profoundly worrying. An Internet of Things surveillance state will put the spies under your skin and literally inside your bed, in your toilet and in your wallet. But the conclusion isn't foreordained. Turing's legacy is with us: cryptography works.

[51:31] In the same way that we find it very hard not to make Turing complete computers, we find it incredibly, surprisingly easy to make ciphers that we can't figure out how to break, that no one can figure out how to break. In fact, it is the belief of modern cryptographers that using the computer in your pocket you can scramble a message so thoroughly that if all the hydrogen atoms in the universe were turned into computers and did nothing from now until the universe ran out of energy but try to break that message, they would never be able to read it unless you told them the key. So we have an unprecedented thing, a thing new on this earth: not just the ability to communicate with one another – which is itself a profound change in the body politic – but to communicate with each other and ensure that only the people we choose to share communications with get them. This is an amazing and astounding thing.
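
[Illustration, not part of the talk: the hydrogen-atom image is rhetorical, but even a far more modest back-of-the-envelope, with assumed numbers, makes the point about brute force:]

    from math import log10

    keyspace = 2 ** 128            # a modern 128-bit symmetric key
    guesses_per_second = 10 ** 18  # a generously assumed planet-scale attacker
    seconds_per_year = 3.156e7

    years = keyspace / guesses_per_second / seconds_per_year
    print(f"~10^{log10(years):.0f} years")  # ~10^13 years, roughly a thousand
                                            # times the age of the universe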

[52:20] In a corrupted world, the only sustainable policies are the ones that generate economic surplus for their beneficiaries, who use that surplus to lobby for the continuance of the policy. One of the reasons that Western governments spy is that spying is conducted through public-private partnerships. Companies like [??] reap extraordinary windfall profits from surveilling on behalf of the NSA, which they use to lobby for the expansion of American surveillance.

[52:50] Every walled garden exists to extract monopoly rents for its proprietors, to make printer ink more expensive than champagne, to make 3D printer nylon more expensive than filet mignon. And as Amazon founder Jeff Bezos once said in one of his more hippie-dippie moments: 'your margin is my opportunity'. The ten billion dollars that Apple takes in from the App Store is ten billion dollars that could be taken in by its rivals, if they could operate their own competing app stores. And they could use that wealth to lobby to continue the policy of allowing people to know how their computers work. Which is critical, because when it's only legal to install already approved and transparent[?] code on your devices, states can introduce new modes of surveillance effectively for free. They can just quietly visit Apple and tell them that everything installed from the App Store must from now on have surveillance backdoors, as they apparently did with Microsoft and Skype. The criminals will discover those backdoors, the autocratic regimes of the world will be able to buy access to those backdoors on the open market, and because it's a felony to jailbreak your iPhone and install third party code, those backdoors will endure. A walled garden is a panopticon, and always must be a panopticon.

[54:05] Corporations, and the states that they colonize, each have a perverse incentive to make it impossible for users to prevent themselves from being spied upon. Walled gardens are the perfect model for a feudal-style wealth disparity. Walled gardens allow manufacturers to arbitrarily change, disable or modify what you can do with your things. You become a tenant of your ICT, not its owner. In a walled-garden world, property is something only the very powerful get; everyone else gets a license – 22 000 words that you click 'I agree' to and have never read, because you know in your heart that what it says is: 'by being dumb enough to do this you agree that we're allowed to come over to your house and punch your grandmother and wear your underwear and make long distance calls and eat all the food in your fridge.'

[54:54] There is no better example of the expansion of this doctrine today than the streaming world. Streaming doesn't actually require that you not be able to save things – it's not as though you would sign up for Netflix, download the only three movies you'd ever want to watch and then resign – but nevertheless they are committed to the idea that if you're streaming things then you're not downloading. As though there were some kind of difference between streaming and downloading, as though there were a way for a file to be shown on your screen without it being downloaded to your computer. What they mean when they say 'we are streaming this, not downloading' is 'we think that your streaming client doesn't have a save button.' And streaming has become so critical – this consensus hallucination has become so critical – to the business models of the feudal Internet that Netflix, the BBC and other large streaming entities convinced the World Wide Web Consortium, which has historically been the staunchest ally we've had for open and free standards on the Internet – they inveigled [persuaded] it – to introduce DRM as a standard feature of browsers to protect streaming media. And the very requirements for these standards are themselves a secret. You have to sign a nondisclosure agreement to find out what the requirements for this standard are, and violating the standard will be a felony, as will reporting vulnerabilities in devices that implement the standard. And that means that it will be illegal to know the vulnerabilities in everything that's controlled by a browser.

[56:23] Now the fight here is not about cryptography. It's not about computers. It's not even about the Internet of Things. The real problems that we have today are much greater than that. We have things like climate change and sectarian conflict and vast economic disparity, corruption and poverty. But all those fights, as important as they are, more important than any fight we have about the Internet, all those fights will be fought and won or lost on the Internet. So the policy questions raised by the Internet of Things are not the most important questions – I know you wanted me to say that – but they're not the most important questions. But they are the most foundational questions. All the other policy questions are contingent on how we answer the policy questions arising from the Internet of Things. On whether we believe that the default posture of our devices should be 'Yes masters' or 'I can't let you do that, Dave.'

[57:17] People ask me if I'm an optimist or a pessimist. And you know, I don't know if that's a productive question. Because if I were optimistic about the future of our devices, and I thought that we had it within us to muster the political will to make devices that are responsive to their users and treat their users as the legitimate creators of policy over them and not as their adversaries, I would get up every morning and do everything I could to further that. And if I were pessimistic, and I thought that the chances were slim and that we're going to end up in a kind of horrific Orwellian nightmare by way of Kafka, by way of Huxley, I would get up every morning and do exactly the same thing I would do if I were optimistic.

[58:04] So optimism and pessimism aren't a useful question. What's a more useful question is hope. And hope is why, if your ship sinks in the middle of the sea, you tread water. Not because you have a very good chance of being picked up, but because everyone who was rescued treaded water until someone came and picked them up. And if your ship sinks in the middle of the sea and you are with people you love, people who couldn't kick as hard as you, you wouldn't let them sink. You would hold on to them and kick twice as hard. And those of us who think about these issues, we are surrounded by people – our children, our parents, people around us who are not as clued in to these technological issues – who are going to be as subject to the outcomes of these fights as we are. And I have enough hope to do everything I can to try to influence those outcomes. And the fact that my neighbors haven't yet picked up on how important this is isn't cause for despair, it's cause to kick twice as hard.

[58:57] So with that I'll thank you and entertain your questions. Thank you.

Questions and Answers

[59:08] So I'll remind you that a long, rambling statement followed by 'what do you think about that?' is technically a question, but not a good one. And I like to alternate between men and women at my questions [??] because they tend to be a bit of a sausage fest otherwise.

[59:28] [Question from audience:] What can we do about it?

[59:33] What can we do about it? In addition to making good choices about what kind of technology we use – and that's an important one; I know it's hard to choose not to use heavy surveillance technologies like Facebook because your friends are all there – you can support organizations that are working to improve this stuff. The Electronic Frontier Foundation, for whom I've just gone back to work, is an international NGO that works on this. The Committee to Protect Journalists, Amnesty International, the Free Software Foundation and the Free Software Foundation Europe, Creative Commons and many other organizations do excellent work on this, and they offer many different ways to participate.

[1:00:10] You can also avail yourself of the best of anti-surveillance technology which, as I said before, really works. The Reset the Net project, which is at resetthenet.org, has a pack of tools – pack.resetthenet.org – which will show you how to turn your Mac, Windows, Linux, iOS or Android device into a much more secure device. It won't be secure against the security services deciding that you personally should be surveilled, but it will mean that surveilling you has an incremental cost, and that's critical. Because one of the reasons the security services are able to surveil as widely as they do is that there's no marginal cost to adding a new person to the surveillance net. Once PRISM is in Google's data center, the next Gmail user costs nothing. And so only by creating an incremental cost of surveillance will we force the security services to surveil in a way that reflects actual suspicion, as opposed to the idea that we will just, you know, spy on them all and let God sort them out.
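[Editor's note: a back-of-envelope illustration of the marginal-cost argument above. All figures are invented purely for illustration: dragnet surveillance has a large fixed cost and near-zero marginal cost, while attacking encrypted endpoints costs something per target, so the totals diverge quickly as the target count grows:]

```python
# All figures below are invented for illustration only.
DRAGNET_FIXED_COST = 250_000_000   # build the PRISM-style taps once
DRAGNET_MARGINAL_COST = 0          # the next Gmail user costs nothing
TARGETED_ATTACK_COST = 50_000      # per-target cost against an encrypted endpoint

def dragnet_total(targets: int) -> int:
    """Total cost of dragnet surveillance: fixed cost, free per target."""
    return DRAGNET_FIXED_COST + DRAGNET_MARGINAL_COST * targets

def targeted_total(targets: int) -> int:
    """Total cost when every target must be attacked individually."""
    return TARGETED_ATTACK_COST * targets

for n in (1_000, 1_000_000, 100_000_000):
    print(f"{n:>11,} targets: dragnet ${dragnet_total(n):,} "
          f"vs targeted ${targeted_total(n):,}")
```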

[1:01:11] Are there women or people who identify as women who would like to ask a question?

[1:01:35] [Question] One of the questions I have picks up on what you were saying: the fact that so much of the policy is driven by these anxieties, and how we got to this point. The level of surveillance the government is looking for, specifically in response to the absolutely horrific events that took place in Paris. One of the more troubling responses, among the many troubling responses, has been that the European ministers of interior and justice have issued this statement that is basically a call for more surveillance, and it's a call to put the pressure and the onus on the Googles and the Facebooks and the Twitters. And it's a very opportunistic call. We know David Cameron and others would love to have this kind of power. So I guess my question is to get your response specifically to that, and how do we deal with the role that anxiety plays in the justification for this kind of surveillance?

[1:02:42] [Answer] So I'm not privy to their thoughts, but my speculation is that they understand – because they had actually already correctly identified the assailants in that terrorist attack, and had put them under surveillance, and had lost track of them – that the problem isn't that they aren't casting the net wide enough. That they in fact, somewhere in their heart of hearts, know this. I'm also willing to believe that they also understand that, as horrific as the crime was, the actual death toll from terrorism is a rounding error, and that terrorism is not an existential threat, it's a horrific form of crime. And terrorists are not hundred-foot-tall robot ninjas with laser eyes, they're just dumb people who kill people and then get killed, and they're not an existential threat to Western society. And so I think that's true. I mean, I think that they live in the real world, and maybe they have doublethink – some politicians clearly do. I mean, I'm willing to believe there are Texas oil men who are genuine creationists who also, at the same time, tell their geologists to drill for oil where it would be if dinosaurs were real, and they're able to hold both of those realities in their head. And it may be that there are people in the halls of power who are able to unwind that and say: I know what a mortality table looks like and terrorists don't appear on it, and I know how surveillance works and this doesn't work – and at the same time are feeling a visceral, gut-level horror that gets them to it[?]

[1:04:16] But I'm more inclined to believe that they are opportunistic, and I know that's a cynical thing to say, but I think that they're opportunistic – and that just like the 700-page PATRIOT Act was not written in the 48 hours after September eleventh, but was instead written – not because someone planned 9/11, but because there is a kind of person in power who would like more surveillance, and who, when they contemplate a disaster on the horizon, rather than making contingency plans for how to help their neighbors, writes legislation for what to do when their neighbors are scared. I think that there are people in the halls of power who do this, and I don't think that events are particularly related to the response. I think that to the extent that the events are related to the response, there's a volume knob out there, a rheostat, that is about how much surveillance they think they can call for after the event. I think the endgame is a lot more surveillance, for the reasons that I talked about in the talk: because there's a business model for surveillance, because there's great social instability that's occasioned by wealth disparity, and policy is created by ruling elites – and the more wealth disparity there is, the more that policy is dictated by ruling elites, and they value the status quo and they also destabilize it by being so wealthy. And so I think that's the underlying cause. The way that we address that, in addition to having evidence-based, good government, the way that we get there, is in part by changing the technical and political reality on the ground. An understanding of what Turing completeness means makes the nonsensical nature of the call to force devices to have back doors more widely apparent. The more we understand how signal processing and language processing algorithms work, the more we understand it is nonsense to say 'design me a Google that can find terrorism' or 'design me a Facebook filter for terrorism'. Once that perception is more widespread, the realpolitik of saying it changes. Nobody who raves about obesity suggests removing gravity to solve it, but people who rave about terrorism suggest removing Turing completeness to solve it. And gravity and Turing completeness enjoy the same level of confidence among the skilled practitioners who study them and believe in them. But the nonsensical nature of 'remove Turing completeness to solve terrorism' is not as apparent.
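[Editor's note: the computer-science point behind this is the classic undecidability argument, sketched below. This is the standard textbook diagonalization construction, not something from the talk: if a perfect detector for 'bad' computation existed, you could feed it a program built to contradict its own verdict, so no such detector can exist:]

```python
# Suppose someone claims to have built the impossible oracle:
def halts(program_source: str, program_input: str) -> bool:
    """Pretend this perfectly predicts whether a program halts."""
    raise NotImplementedError("no such oracle can exist")

def paradox(source: str) -> None:
    """A program that does the opposite of whatever the oracle predicts."""
    if halts(source, source):
        while True:   # the oracle said we halt, so loop forever
            pass
    # the oracle said we loop forever, so halt immediately

# Imagine feeding paradox() its own source code: any answer from
# halts() is then wrong. The oracle -- and by extension any perfect
# filter for 'bad' computation on a Turing-complete machine --
# cannot exist.
```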

[1:06:55] The realpolitik of a response to a terror attack would change in the face of that. In the same way, people who are accustomed to using crypto to secure themselves from people they don't trust – even if they trust their governments – like voyeurs, like foreign spies, like corporate rivals, like offshore corporate espionage – to the extent that they understand that there are two kinds of crypto, crypto that works and crypto that doesn't, and that crypto that works doesn't have a back door – those people will have a different kind of response to a politician who will no longer look brave and principled and reasonable and like he's taking a middle ground, as David Cameron manages to do. Or I just heard Boris Johnson, who's the muppet who runs London, on the radio saying that they don't want to abolish crypto, they just want a golden key to crypto. And there's only one computer science department that's developing that golden key, and it's at Hogwarts. But John Humphrys, who's normally a bit of a terrier on BBC Radio 4, said 'ah, a golden key, alright then'. John Humphrys would have said 'You're going to do what about gravity to solve obesity?' When we change that, we will change the political reality of surveillance as well.
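[Editor's note: a toy sketch of why a 'golden key' is not a middle ground, using the third-party `cryptography` package; the escrow scheme here is a simplified illustration, not any real proposal. Mandating an escrowed copy of every message under one master key means the whole system's security collapses to the secrecy of that single key:]

```python
from cryptography.fernet import Fernet

golden_key = Fernet.generate_key()   # the government's 'golden key'
escrow = Fernet(golden_key)

def send(recipient_key: bytes, message: bytes) -> tuple[bytes, bytes]:
    """Encrypt for the recipient, plus a mandated escrow copy."""
    return Fernet(recipient_key).encrypt(message), escrow.encrypt(message)

alice_key = Fernet.generate_key()
_, escrow_copy = send(alice_key, b"private message")

# Whoever steals or leaks the golden key reads everything, forever --
# Alice's own key no longer matters:
print(Fernet(golden_key).decrypt(escrow_copy))
```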

[1:08:48] [Question] [Hard to hear exactly; here's a summary: He asks for some thoughts on a private torrent network in Hungary which is not being targeted by politicians. He thinks the reason might be that the politicians realize that these are 300 000 voters who would act as one if they were attacked.]

[1:10:00] [Answer] There is something to this. You just described the formation of the Pirate Party in some important sense – a political consciousness among people who view what they do as non-political has been at the core of every social change movement since its earliest days. I am not sanguine about a political movement grounded in something that is illegal. Not because I don't think that's a way to get a political movement off the ground, but because I think that its members are at a first-order risk that is not in the political calculus of other movements. When every member of your movement is a presumptive criminal, people doing things that are lawful, like organizing and being a leader, can be targeted – not for their lawful activities but for their presumptive unlawful activities. This creates enormous power imbalances that make those sorts of political movements very difficult. I'm not saying that they're not important or not a good thing to try, but they're fraught in ways other movements aren't. An example of this is file sharing. In the US the penalties for file sharing are very harsh. In the mid-2000s there was a young computer science student at [??] College in Pennsylvania who was maintaining a piece of free and open source software called Flatland – a search engine for searching local networks and indexing files. So it does for Windows file sharing what Google does for HTTP: you run it on your campus network and it finds all your professors' PowerPoints and all the mp3 files of all the lectures, but also the music and videos that the students in the dorms are sharing – just like Google has all the torrents and all the EU documents. And it's not illegal to do what he was doing. But the recording industry didn't like it and they wanted to make an example of him. So the RIAA went after him for file sharing. And they knew he was file sharing because he was an American college student. And to him, unlike the other thirty thousand students they targeted, they didn't say 'your penalty is you give us all your money'. They said 'your penalty is you have to change majors. You can't get a computer science degree, because we want to send a message to people pursuing computer science degrees that this kind of research is not good for your health.'

[1:12:24] We took him on as a client at the EFF and we embarrassed them into dropping that claim, but it shows you what happens when you're presumptively guilty before you've done anything political. It puts you at tremendous risk. In fact, a parallel might be the marijuana legalization movement. I knew people in the American legalization movement who stopped smoking pot while they were campaigning for legalization, because they understood that they needed clean hands to defend pot smoking – including people who smoked pot for medical reasons, who used it to control epilepsy or other serious conditions; they allowed those conditions to worsen because they knew that politically they couldn't do their work otherwise. So if you wanted to create a movement around file sharing political consciousness, I think your first action should be to stop file sharing, at least illegal file sharing. Not because of whatever views I may have on file sharing, but because tactically it's a sound move.

[1:13:20] Maybe one more question from someone who identifies as a woman.

[1:13:29] [Question on the Internet of Things and how there's a "bait and switch" when it comes to the business model of profiting off of the data collected from the people using the IoT products]

[1:14:25] I totally agree, but to get back to why I actually feel like we might win this one: assume that Internet of Things companies are all trying to create ecosystems that effect transfer pricing and monopoly rent extraction on IoT ecosystems, to use econ jargon – basically, ecosystems that let them rip off the people who buy them. Then there is a business model in jailbreaking them. Because all of the things that you are being charged to do in those IoT ecosystems are things that you could get for yourself for free if you could install any code you wanted. And the things that they want to charge you to do aren't things that are unlawful, they're just things that are more profitable if you have to pay to do them. It's a kind of helplessness business model. And I believe very strongly that the nonprofit sector can do good, but I think that when the nonprofit sector is aligned with a for-profit sector that sees an opportunity for windfall profits from doing what the nonprofit sector wants of it, then you can get a lot further. Now there's a huge risk, and that risk you can see in greenwashing: you can take on the formal characteristics of a cause-oriented nonprofit project and then serve none of its goals, but hijack it and use it. You may get the same profits and end up being the new boss, same as the old boss. That's something we need to be vigilant about. You don't want someone to jailbreak your iPhone just to lock you into their ecosystem. You want someone jailbreaking your iPhone so you can use whatever software you want and trust whatever expert you want to certify that the software you use is fit to use. Maybe it's Apple. But if Apple were so sure that its users really trusted it and didn't trust anyone else, it wouldn't need it to be a felony to trust someone other than Apple to install software on your devices.

[1:16:29] So there is nothing intrinsic to the technological model of the Internet of Things that says that the things must treat you as their adversary, that things must necessarily be treacherous. That's a business question. And the reason that Silicon Valley venture capital is on the side of the surveillance business model is not out of an ideological commitment to surveillance, it's out of a total, howling void where their ethical center should be. And they are just as happy to play the other side of that. For every investor who's willing to invest in a moat and a wall, there is another one willing to invest in a bridge and a battering ram. And they're happy to do that. And I know there are parts of my talk that read like Marxist economics, but this is Hayek. This is what Hayek thought would happen. This is pure market economics: if you are pulling in windfall profits above the marginal cost, competition will drive the price of the goods down to their marginal cost, to the benefit of wider society. That's not untrue in some cases, and this is one of those cases where it can be true – and even more so, it's one of those cases where corruption has prevented it from being true, where by corruption I just mean policy that's pursued not because of evidence but because of influence from powerful, moneyed interests. There is surplus capital available for corruption in the service of jailbreaking, just as much as there is in the service of prohibiting jailbreaking.

[1:18:03] So I don't think it's a foregone conclusion, but I see a plausible way from here to there. And I think that the foundational question of 'how do we solve all the other problems of corruption?' starts with not having devices that are already corrupted, that come surveillance-ready out of the box – because we're never going to fight corruption if our devices are already surveillance-ready. We have to have the infrastructure first.

[1:18:27] All right well thank you all very, very much. Thank you.