DEF CON 23 – Fighting Back in the War on General Purpose Computers

From The Cory Doctorow Wiki

Content[edit | edit source]

Quote[edit | edit source]

 

Metadata[edit | edit source]

Published[edit | edit source]

December 25, 2015

Website[edit | edit source]

YouTube

Download[edit | edit source]

m4b from defcon.org

Download all of Defcon 23 from Archive.org

About[edit | edit source]

«EFF's Apollo 1201 project is a 10-year mission to abolish all DRM, everywhere in the world, within a decade. We're working with security researchers to challenge the viability of the dread DMCA, a law that threatens you with jail time and fines when you do your job: discover and disclose defects in systems that we rely on for life and limb.»

License[edit | edit source]

Creative Commons Attribution 3.0

Summary[edit | edit source]

Please help with[edit | edit source]

Feel free to help out with any of this:

  • Add a short summary above
  • Add paragraphs to make it easier to read
  • Add sub-headings to make it easier to read
  • Check for typos and errors
  • Fix tagged issues
  • Add explanatory links to things that not everyone might understand
  • Add more tags

Thank you!

Transcript[edit | edit source]

[00:33] Two years ago I came to Def Con and I gave a talk where I predicted that things were going to get really ugly in the Internet of Things. After all, we live in a world made out of computers, and not just in that metaphorical sense where, if you download those glossy videos promoting the Internet of Things, it all looks like it's been set dressed by someone involved in the Tron production and everything is white and curvilinear and jumpsuity, and people walk into these houses and they wave their hands and the lights come on. I had someone the other day say "Yeah, I've had those awesome computer-controlled lights for months now, and I went to a hotel and I was like 'You mean I have to touch a light switch, like an animal?'" And then they walk into the kitchen and they wave their hand again and the lights come on and they say "Tea, black, hot, Earl Grey," right? And it feels futuristic, like tomorrow we will live in a world made of computers. One of the implications that may occur to you as you watch this person walk through their gesture- and voice-controlled house is that a house you can gesture and talk to no matter where you are is a house where there's a microphone listening to you and a camera watching you wherever you are – and this is where things start to get ugly with the Internet of Things.

[01:53] Now we don't have to wait for the Tron future to live in a world made out of computers. We already inhabit a world where the most salient fact about many of the things that we put our bodies into and that we put inside of our bodies is their internal logic. It's the computers that they run on. They are, in effect, special purpose computers that do things like let you live inside of them, or drive around in them, or keep your heart going. So when you hear about, for example, subprime car lending – which is the newest way to monetize poor people – you give them a car loan, and then you turn the return on that car loan into a bond, and you float that on the street, and the way that you securitize that car loan is you fit the ignition system on that car with a network-connected, location-aware ignition override that makes sure that you're adhering to the terms of the lease or the loan, or it disables your ignition system. So if you drive outside the tri-county area, if that's a term of your loan, they can remotely shut down your car. And you hear about that and you realize that the most important fact about that car is the computer inside of it. It doesn't have to be one of those Jeeps that just got recalled because over the Internet you can disable the steering and the brakes. These things are designed to be disabled over the Internet, and people have broken into those systems and used them to shut down cars in the middle of the highway or immobilize every car ever sold out of a dealership. And it's not just cars that are computers we put our bodies into. A 747 is a flying Solaris workstation in a fancy aluminum case connected to some tragically badly secured SCADA controllers.

[03:41] And the thing about the Internet of Things, the thing about a world made out of computers, is that it's inheriting the worst fact about the early years of computers, which is the inkjet printer business model – where things are sold in a way that is intended to act as a platform, where the manufacturer controls that platform and gets to monetize it at high margins by controlling who can sell add-ons for it, who can sell consumables for it, and who can add features to it. This is very useful if you're the manufacturer. For example, you can charge a lot of money for the extra stuff that plugs into it, like whatever you use up in it – the chargers or the gasoline or the windshield wiper fluid. It's also great if you're the manufacturer because you can make covenants to third parties who might subsidize the purchase. So if you make smart thermostats, you might be able to get a power company to subsidize buying millions of them for everybody who lives in a district, by being able to warrant that you'll never sell or allow software for it that lets the user of that thermostat adjust it when the power company turns it down or up. So if the power company says "oh, we're running out of headroom in the power infrastructure, we're going to turn down everyone's air conditioning by two degrees," they want to be able to make sure that you don't just walk over to it and turn the air conditioning back to where you had it before. And that's a great way to subsidize the hardware. So everybody wants this inkjet printer business model, where you get to control the software, the consumables, and every other piece of that ecosystem. And we have given a gift to people who want to design these inkjet printer business models – a legislative gift, in this country, in the form of a section of the Digital Millennium Copyright Act of 1998 called Section 1201, the anti-circumvention component of the Digital Millennium Copyright Act.

[05:40] This is a law that says that if you break a lock that is used to secure access to a copyrighted work, then even if accessing that copyrighted work is lawful, breaking the lock is not. So all you need to do to make sure that nobody can plug in stuff that you don't want plugged in, or run software that you don't want run, or add consumables that you don't want added to your platform, is put the thinnest credible lock you can imagine around the system. Then the government will spend an unlimited number of tax dollars prosecuting people who remove that lock to add otherwise lawful functionality. You'd be crazy not to take Uncle Sam up on this offer to defend your dumb business model with every tax dollar at their disposal.

[06:23] You don't even have to have action taken against you under the DMCA for the DMCA to work to stop people from breaking these inkjet printer business model tools, because it has such incredibly horrific penalties that all you need to do is ask yourself 'am I willing to go to jail for five years and pay $500,000 in fines on my first offense to see whether or not I can unlock functionality in this device?' The answer is usually no. You don't even need a lot of prosecutions to get everybody who's capable of doing that unlocking to say 'actually, there's probably something better somewhere else out there with fewer penalties; my risk calculus says I'd rather not do this.' Now, the interesting thing about DMCA 1201 is that there's almost no litigation history. We don't really know whether or not the courts would find that $500,000 fines and five years in jail for listening to music the wrong way, or watching TV the wrong way, or plugging some additional functionality or an unauthorized charger into your device, is right and proportional, because the other side gets to decide when to prosecute people for violating 1201. And, generally speaking, they only go after people who they think they can win against, who have really bad facts and don't look like the kind of people you'd want to stand up in front of a judge. So one of the only cases where 1201 has ever been litigated was when 2600 magazine – and don't get me wrong, 2600 is an amazing magazine, I've written for it, Emmanuel does God's work – published DeCSS for decrypting DVDs, and the film industry said 'we're going to fight this one all the way to the end. Because 2600 magazine is in New York, in the Second Circuit, where the judges don't really understand technology – not like out in California, where they're really clued in on this stuff – and 2600 calls itself 'the hacker quarterly,' and we can be reliably assured that the magazine and all of its supporters are going to show up in court wearing black t-shirts that say things that judges don't understand and find vaguely disturbing. These are the people we want to fight in court.' And we got our butts handed to us in the 2600 case. The court said there's no free speech interest in magazines being allowed to publish math. And they said that this is about whether or not people should be allowed to defend their investments, or whether anyone who can figure out how to steal their stuff should be able to just because they know how to remove a lock. It's a terrible judgment.

[09:10] Now, a couple of years later, we had a great chance to stand up the right kind of defendant in front of a judge, because Ed Felten – who was then at Princeton and is now deputy CTO of the White House – led, or worked with, a team that broke SDMI, which was this really crazy, dumb idea to watermark digital music so that when you converted it to analog and then back to digital again, that watermark would somehow survive, even against an adversary who wanted to remove it, and that watermark would be detectable by analog-to-digital converters, which would just say 'I'm sorry, I refuse to convert this analog music back to digital, because it has been marked as non-convertible and we won't do it.' The SDMI consortium spent hundreds of millions of dollars, they spent years on it, with the smartest people going working on it, and they offered a big bounty to anyone who could break it, on condition that you signed a non-disclosure agreement. But of course Ed and his team didn't want a bounty; they wanted tenure and publication. So they wrote a paper on it, they submitted it to the USENIX Security Symposium, and the record industry went bananas. They threatened Ed and USENIX and said 'if Ed gives this paper at a technical conference, we're going to sue him and the technical conference.' And we were like 'oh yeah, this is the one we want.' Because we really want a judge to decide whether or not record executives should be in charge of what kind of math Princeton professors can talk about at learned conferences. That's the question I want to stand up in front of a judge all day long. And if we get the right answer, well, then 2600 can publish all the DeCSS that they want. So as soon as we, the Electronic Frontier Foundation, stepped up to represent Ed, the record industry dropped the threat. And not only did they drop the threat, they offered us a covenant saying they would never pursue Ed over SDMI, ever, just to stop us from going to the judge and asking for what's called a declaratory judgment, which is when you say 'I've got this threat in writing' – they put it in writing – 'we've got this threat in writing from the record industry against my client; even though they've withdrawn the threat, my client has the right to know how that threat would turn out.' And because they had given us a covenant saying 'we're never going to go after Ed,' the judge said 'you don't have any standing to ask that question, because you know the answer. The answer is they're never going after Ed.'

[11:30] So there's almost no litigation history on 1201, which works great if your hope is that 1201 will intimidate people who don't know how a lawsuit might turn out, because there's no case law on it, and you just want them to stay the hell away from anything that interferes with your dumb inkjet cartridge business model. Now, we do have a little bit more case law – and this is actually kind of interesting, and it's an interesting example of how the Internet of Things has changed the calculus here. In 2004 there were two companies that sued over the DMCA. One was Lexmark, which is a division of IBM, which sued a competitor that was refilling inkjet cartridges and changing the software in them. The cartridge had a bit that was set to say 'I have been discharged,' and that bit was not supposed to be settable back to 'I am now full again.' They were resetting that bit to say 'I am full and I have never been discharged,' and they were refilling the cartridges and selling them on. And there was another one, Skylink, that made garage door openers, and there was a third-party company that was making replacement handsets to open the garage door. In both cases – one was consumables, the other one was add-ons – they went to court, to the Federal Circuit, who are not known for their technological [??] – this is the circuit where all the dumb patent cases are heard and where we've gotten the worst judgments. But they went to the Federal Circuit and they said 'we think that 1201 protects us from people refilling inkjet cartridges; these are copyrighted inkjet cartridges, and copyright law protects us from refilling. These are copyrighted garage door openers.' The judge took a good look at both of these and said 'you know what, I've been looking for your copyrighted work in this and I can only find one, and it's the DRM.' The only code running in these things was the DRM. It's supposed to be against the law to remove DRM that controls access to a copyrighted work, but if the copyrighted work that the DRM is protecting is the DRM itself, it just feels a bit circular. So the judge bounced both of those.

[13:37] Now, things have changed since 2004. Here we are in 2015, and every single computerized system has real, substantial copyrighted works inside of it. Because these days you don't build PLC-, FPGA-, or special-purpose-based controllers for most of our little things. Controllers are so cheap now that for 60 cents you can buy a TCP/IP stack in an embedded controller that you can stick in your light bulb, so why would you build something small and lightweight that didn't have a substantial copyrighted work in it, when you can get some lightweight version of Linux already on a chip for pennies? So everything, from light bulbs on up, now has a copyrighted work inside it. So as soon as you add a lock to it, it's against the law to remove that lock, because now it's protecting access to a real, bona fide copyrighted work beyond the DRM itself. That means that we now have this restriction on jailbreaking HVAC systems, insulin pumps, 747s, implanted pacemakers and implanted defibrillators, and it just keeps going. We just keep getting more and more things that are protected by digital locks, where it's against the law to remove those locks because someone has used them to protect a business model. So those tall, willowy office towers that you see going up in the finance districts of the great cities of the world, where you look at them and think 'how the hell does something that tall and skinny stay up?' Well, they use computer-controlled dynamic load adjustment to readjust themselves against seismic and wind stresses from moment to moment. That building is a case mod that bankers hang out in, and it's protected under the DMCA.

[15:37] You just saw that 1.4 million Jeeps were recalled because they could be accessed over the Internet and have their steering and brakes disabled. A client of EFF's, Chris Roberts, has claimed that there are ways to get into the control systems of United's planes through the in-flight Wi-Fi system.

[16:00] The thing about this is that it's not only crazy for business purposes – and not only is it a way for companies to rip you off by charging you a lot of money for consumables or locking you out of features that you might otherwise want to have, although there's a lot of that, make no mistake – it's also deadly for security. Because we have exactly one methodology for determining whether or not a security system works, and that's disclosure and adversarial peer review. It's when your friends tell you about the dumb mistakes you made and your enemies make fun of you for having made them. If that sounds familiar, it's because it's the methodology that we used to go from the Dark Ages to the Enlightenment. Before the Enlightenment we had one kind of science; it was called 'alchemy.' Alchemists never subjected their findings to adversarial peer review. They fell prey to the most frail of human frailties, which is our ability to deceive ourselves about what we think we know. So they would conduct experiments and they would go 'eeh, I think it came out the way I predicted it would; I'm not going to tell anyone else about it, because I've discovered something awesome.' And this is why every alchemist discovered for himself, the hardest way possible, that drinking mercury was a bad idea. It wasn't until alchemists started publishing and submitting themselves to adversarial peer review that we found out about the dumb mistakes. We call that moment the Enlightenment, we call what came out of it science, and it's what we use to determine whether anything works. It's why our bridges stand up, and it's why our security stands up when it does. It's because we allow third parties who don't like us to look at the stuff that we've done and figure out the dumb mistakes that we've made, in order to humiliate us in the public sphere. That's the only methodology we have for knowing whether security works.

[17:46] But under 1201 it's against the law to reveal information that could be used to remove a lock, e.g. 'here's where the key was hidden,' or 'there is a buffer overrun,' or 'there is some other component of this that makes it insecure and would allow an adversary to gain access to it and override the access controls built into the system that the manufacturer was hoping would be intact through its entire duty cycle.' What that means is not that vulnerabilities don't exist – they do – and not that vulnerabilities don't get discovered by hostile parties – they do. One of the things we saw when the NSA's Tailored Access Operations group manual leaked – which is their manual of pre-made hacks that field agents can request from their IT people to use in the field – is that the NSA routinely discovers and weaponizes 'zero days' that they use against other people, and that, in particular, those zero days are longer lived when they are in systems where it's against the law for independent security researchers to disclose them. So they last longer. So making it illegal to disclose vulnerabilities doesn't stop those vulnerabilities from existing. It doesn't stop those vulnerabilities from being discovered, and it doesn't stop them from being exploited. It just makes it harder for normal people – who don't anticipate vulnerabilities in the systems that they rely on for life and limb – to find out about them. Because, after all, your phone is more than a supercomputer in your pocket that you use to throw pigs at birds. Your phone is a supercomputer in your pocket that knows who all your friends are, and knows what you talk to them about, and where you are, and where you go. It knows how your banker and you talk to each other and authenticate your conversations, and how your lawyer and you talk to each other and authenticate your conversations. Maybe it's the auth token for your house's front door lock and your car. And it has a camera and a microphone, and the only way you can know whether those are on or off is if the phone's security module is intact; otherwise the camera and microphone could be covertly operated. And making it against the law to tell you about vulnerabilities in that phone, in order to make sure that you don't run software that didn't come from the manufacturer's app store, is grotesque. Not just because it rips off independent software vendors and limits innovation – it's grotesque because it puts you at risk, in every conceivable way, of vulnerabilities being discovered and festering and being used to exploit you.

[20:11] So every three years the Copyright Office holds these hearings on 1201, where they ask 'are there any ways that 1201 is getting in people's way, and should we grant an exemption until the next hearing?' And they just concluded one. You probably heard a couple of examples from it. For example, John Deere tractors. A farmer put forward a petition to allow him to jailbreak his John Deere tractor. The story went that he got out there to till the fields with his John Deere tractor and it wouldn't run, and he called tech support and they said 'yeah, looks like the inflation sensor on one of your tires has gone south; we can dispatch a part in a couple of days.' And he said 'well, the tire is fine, can I just go into the firmware and disable that sensor?' And they said 'no, you can't.' He said 'well, I really want to be able to tweak that.' Now, John Deere doesn't really care whether or not you tweak it, but what they do care about is the fact that they do centimeter-accurate soil density surveys through the torque sensors in the wheels while you till your fields. And they sell that information to seed companies like Monsanto, but not to the farmer. And the only way for the farmer to know about her field's soil density is to pay the seed company and covenant to only use that seed company's seed. So farmers went to the Copyright Office and said 'we want to jailbreak our John Deere tractors,' and John Deere came back and said 'no, that's not your tractor, it's our tractor. You've only been licensed the tractor; it's a copyrighted work.' GM [General Motors] uses this to lock mechanics out of their cars, because they want to make sure that only mechanics who sign a contract that says 'I will buy original GM parts, which you charge major markups on, and not third-party parts' can get diagnostics off the motor. And so GM filed a petition whose basic summary was 'you remember that ad where we said that's not your father's Oldsmobile? We weren't speaking metaphorically.'

[22:11] If you want to read stuff that'll make the hair on the back of your neck stand up, look up '2015 Copyright Office 1201 triennial proceedings' and read what the security researchers wrote about how this stuff is getting in their way. There's a guy named Jay Radcliffe, a security researcher at Rapid7, who is also a type 1 diabetic. If you're a type 1 diabetic, you get an insulin pump instead of relying on human beings, who are, after all, the world's shittiest lab techs, to figure out when you need more insulin. With an insulin pump your insulin dose is titrated very carefully and very tightly through the day, in really good lockstep with your blood sugar, and you live many more years. But Jay Radcliffe has audited insulin pumps and he won't get one. He is prepared to sacrifice years off his life to not get one, because he knows what the wireless interface in this thing can get access to, and after all, insulin can be fatally overdosed on pretty trivially. So he won't get one, and he reports on this. He also said he's audited a bunch of other medical devices, and he estimates that 40% of the code in implanted medical devices has never been audited. We also heard from security researchers who work on voting machines. They say that voting machines are horrifically insecure, but because the machines have a lock that protects access to a copyrighted work, they can't tell the people who are procuring them for their local elections about the vulnerabilities in them. The most hair-raising thing in it was a filing from about half a dozen super-eminent, Ivy League, Big Ten security researchers, in which they said 'one of us, not naming any names, has been advised by counsel not to tell you what area she or he works in, because counsel believes that merely disclosing it will bring a 1201 action against him or her.' So the first rule of 1201 Club is 'you don't talk about 1201 Club.'

[24:09] Now, Internet of Things investors are really big on this 1201 stuff. They all want these devices to ship with ecosystems: lock-in for add-ons, lock-in for consumables. And you know, in a real market, where that lock-in can be broken, this stuff doesn't work. Keurig put a lock on its 2.0 coffee pod machines, so you could only buy coffee pods from Keurig and not from third parties. Because it's a competitive market, they lost 25% of their market share in the first year. And they did an investor call where the CEO had to eat his hat and say 'we made a really dumb mistake.' Because nobody woke up this morning and said 'gosh, I wish there were fewer vendors who could supply software for my cochlear implant, or fewer ways that I could read my ebooks, or fewer people who would sell me coffee for my coffee pod machine.' In the absence of one of these legal locks, there's no way that you can make this stuff work. But in 2015 we have a system whereby this competition is getting harder and harder to come by. And the future looks pretty grim in terms of 1201 and how it interacts with the Internet of Things. I'm working on a catalogue of design fiction – you know, ideas about what kind of devices we don't have and should have, or might see in the future as a result of 1201. It's called 'The Catalog of Missing Devices.' One of the ideas that our UX futurist types came up with is a product that unlocks your fridge so it can chill third-party butter.

[25:42] I told you about those subprime cars. There's another really interesting example of where the Internet of Things is headed. There's a guy named Hugh Herr who runs the prosthetics lab at the MIT Media Lab. He's got awesome slides of all of the ways that computers have been woven tightly into people's bodies to improve their lives in immeasurable ways. Hands and feet, even neural prostheses that use very powerful magnets to suppress activity in parts of the brain that cause otherwise untreatable depression. So I saw him do this talk – it was amazing – and he shows his slides, and the last slide is him. He's clinging to the side of a mountain like a gecko, and he's all in Gore-Tex and super-ripped, and you can see that from the knees down he's just got stumps. And his stumps are in these great robotic mountain-climbing prostheses. But he's been standing there like this the whole time. And he says 'oh yeah, didn't I mention?,' rolls up his pants, and he's a robot from the knees down. And he starts running up and down the stage like a mountain goat, leaping into the air. It's incredible, the best demo I've ever seen. So the first question anyone asked was 'how much did your legs cost?' and he names a price that could buy you a terraced house in Mayfair or a brownstone on the Lower East Side. The second question anyone asks is 'who can afford those legs?' and he says 'well, of course anyone can afford those legs. If it's a choice between a 60-year mortgage to get your house and a 60-year mortgage to have legs, everyone's going to pick the legs all day long,' which I think is probably true. But when you combine subprime immobilizers and legs, you get somewhere very ugly, right? When you miss a payment, the legs walk you back to the repo depot so they can take them back. And anyone who pwns those legs – because it's against the law to tell you about vulnerabilities in them – can make your legs take you anywhere they want.

[27:42] EFF loves litigating stupid tech laws. This is one of our secret superpowers that I think a lot of people don't appreciate. America has this amazing back door to its legislative system. In other countries, without strong constitutional traditions, the way it works is that lawmakers make an incredibly stupid law, and then you have to wait until there are new lawmakers, and enough of them, to make that law go away. You have to bring pressure against them; you have to get a majority of those lawmakers to change that law. But because America has an independent judiciary and a strong constitutional tradition, if you can just convince enough federal judges – sometimes only one – that a law violates the Constitution, they can make the law go away. Now, this has some downsides, because lawmakers can make dumb laws with impunity, knowing that it's good red meat for their base and that the judges are going to make them go away before they cause too much harm. But on balance it means that we get a legislative second resort in the courts. And that means that when you have ideological blindness in Congress to bad technology ideas, if we can get the right defendant in front of a judge, we can make the fact that Congress is full of people who don't understand technology irrelevant to the legislative landscape.

[28:57] Our best example of this was Bernstein. Some of you probably know about Bernstein. In the early 90s, governments had this weird idea, one that it's hard to imagine anyone having in 2015, that civilians shouldn't have access to crypto. It's kind of a crazy idea; it's amazing that anyone could ever have had that really dumb idea back in the early 90s. For those of you who don't know, this idea has come back, and in 2015 the FBI wants to ban civilian access to crypto. Anyway, the NSA had this idea that civilians shouldn't be able to use crypto, so they classed it as a munition and made it illegal to "traffic" in strong crypto. And people made lots of arguments in front of Congress about why this was dumb. John Gilmore – one of EFF's founders, employee number six at Sun, one of the principal authors of GCC and Solaris, one of the designers of the SPARC chip – made a computer that was optimized for brute-forcing DES, which was the cipher that the NSA argued was sufficient for civilian use. For a quarter million dollars he made a computer that could exhaust DES in two hours. And he said this means the entire American banking system can be beaten for 250 grand, with a thing the size of a bar fridge, by a guy who looks like a hippie. And Congress said 'yeah, that's very nice, why don't you leave now?' We made arguments about international competitiveness, economists joined in, nobody cared.
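[Editor's note: to give a sense of the arithmetic behind that DES claim, here is a rough back-of-the-envelope sketch. DES uses 56-bit keys, so exhausting the keyspace in a fixed time implies a concrete key-test rate. The two-hour figure is the talk's; the actual EFF DES cracker took considerably longer, so treat the numbers as illustrative only.]

```python
# Back-of-the-envelope: what key-test rate exhausts the DES keyspace
# in a given time? Figures are illustrative, taken from the talk's
# claims, not from any real machine's hardware specs.

KEYSPACE = 2 ** 56          # DES uses 56-bit keys

def keys_per_second(hours: float) -> float:
    """Key-test rate needed to try every DES key within `hours`."""
    return KEYSPACE / (hours * 3600)

# Exhausting the full keyspace in two hours (the talk's figure)
rate = keys_per_second(2)
print(f"{rate:.2e} keys/sec")        # on the order of 1e13 keys/sec

# On average a brute-force search finds the key halfway through
avg_hours = (KEYSPACE / 2) / rate / 3600
print(f"average time: {avg_hours} hours")  # prints "average time: 1.0 hours"
```

The point of the arithmetic is that a 56-bit keyspace is small enough for a well-funded attacker with special-purpose hardware, which was exactly Gilmore's argument to Congress.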

[30:17] But then we found Daniel Jay Bernstein, who you may know today as DJB, an eminent cryptographer and a professor and researcher, then at UC Berkeley, who was publishing strong ciphers on Usenet. Remember Usenet? And we argued that his source code was a form of expressive speech covered by the First Amendment – that programmers had a First Amendment right to publish source. And the Ninth Circuit said 'you're absolutely right, Congress is wrong. Classing this as ammunition violates the First Amendment,' and we can have strong crypto. That's how we got crypto. That's how we got here, today. So we love litigating bad laws with good clients.

[31:11] I left EFF about 10 years ago to go be a novelist and make up stories to help you pass the long, boring hours between the cradle and the grave. And after about 10 years of that, I looked around at this stuff and said 'this is dire.' As much as I'm interested in making sure that I can live the cushy, glamorous life of a science fiction writer, I also don't want my daughter growing up in a world that makes dystopian science fiction look like a My Little Pony episode.

[31:41] So I came back to EFF to work on a project that we call Apollo 1201. It's a mission to kill all the DRM in the world within a decade. And we're going to do it with your help. Because we know that people in this room, people who come to this conference, people who work in this field, violate 1201 all day long. It's impossible to do security research without doing it. But it's the love that dare not speak its name. We have a pact, effectively, with the people who want to defend 1201 and keep it intact. And it's that security researchers just don't make a big deal out of the 1201-violating parts of their research. So they research mobile malware, and if the paper starts with 'I got some apps,' they don't say 'I took a jailbroken phone and decrypted a bunch of apps in memory,' they just say 'I got apps and then I discovered this interesting thing.' So long as nobody talks about it too loudly, the other side doesn't come back, and everybody kind of trundles along. But meanwhile devices that we rely on every day become reservoirs of long-lived digital pathogens that threaten you, me and everybody we love.

[32:54] So we want to talk to you about this stuff. As they said in Ocean's Eleven: 'we're putting together a team.' We want to know about the work that you're doing. We want to know in particular when you're scared about 1201. And we want to help you figure out how to structure that research so that it's as litigation-hardened as possible. So that if you decide that a critical piece of your research is describing the 1201 elements, or someone on the other side decides that they want to make an example of you and put your head on a pike, the way your research is structured is optimized for making sure that the judgment that comes out of it is a shining beacon on a hill for everyone else who's thinking about 1201, and not a terrifying icon of how bad it is when you go up against the machine.

[33:39] We want to make this structured in such a way that 1201 eventually goes away altogether. Because with 1201 gone, DRM goes too. Nobody wants DRM, and once it's not illegal to eliminate DRM, people will eliminate DRM. Because DRM is only used to protect high margins, and as Jeff Bezos said in a moment of alarming candor to the book publishers, 'your margin is my opportunity.' So every one of those stores that's taking 30% out of the hide of app and software vendors is ripe to be disrupted by someone who makes another store that takes 20% out of their hides, or 10% of their hides, or monetizes it differently by having some kind of platform strategy. Every device that has a high-priced consumable, every John Deere tractor that's selling farmers back the information they're generating by driving their own tractors around the field – every one of those is a market opportunity, and every one of those will be taken advantage of, and the DRM will disappear, as soon as the status of DRM is legally ambiguous or positive for people who break it.

[34:44] And it's not hard to break DRM. Hiding keys in devices that are owned by your adversary is a bad idea for the same reason that keeping money in a safe that you leave in the bank robber's house is a bad idea. Eventually those keys will always be subject to interrogation. Because remember, in DRM crypto you don't have Alice and Bob and Carol, you just have Alice and Bob. Bob gives Alice an enciphered message, and Bob gives Alice a device that has the key, and then Bob crosses his fingers and hopes that Alice never figures out where the key is and puts it in another device that does things Bob doesn't like. That is the wishful-thinking business model, it's the abstinence-based-education business model of crypto, and it doesn't work.
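The Alice-and-Bob problem above can be made concrete with a toy sketch. This is not real cryptography (the XOR "cipher," the key, and the class names are all illustrative inventions); it only shows the structural flaw: the player Bob ships must contain the key, so Alice can use the key without the player.

```python
# Toy illustration of the DRM threat model: Bob's "player" necessarily
# embeds the decryption key, so Alice can decrypt with her own tools.
# The XOR cipher and all names here are illustrative, not real crypto.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data against a repeating key (toy cipher)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

SECRET_KEY = b"hidden-in-player"  # shipped to Alice inside the device

class Player:
    """Bob's approved client: holds the key and enforces Bob's rules."""
    def play(self, ciphertext: bytes) -> bytes:
        return xor_cipher(ciphertext, SECRET_KEY)

ciphertext = xor_cipher(b"the movie", SECRET_KEY)

# Alice doesn't need the Player at all: the key is inside the artifact
# she was handed, so once she extracts it she decrypts directly,
# in a device that ignores Bob's rules.
print(xor_cipher(ciphertext, SECRET_KEY))  # b'the movie'
```

Real DRM uses obfuscation and hardware to make the key extraction step harder, but the structure is the same: the adversary holds the ciphertext, the key, and the device, all at once.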

[35:33] So here's my pitch to you. If you're a hacker and you're doing security research, come and talk to us. I'm really easy to find – I'm the first 'Cory' in Google. Just type 'Cory' into Google; the first result that comes up is my home page. It has one email address on it, the same email address my mom and my wife use to get in touch with me. Email me and tell me about your 1201 stuff. We want to talk to you and figure out how we can help you – talk to you about the contours of the law and give you good advice. It's the thing we've been doing for 25 years now, and we're awesome at it.

[36:01] But if you're a designer, if you're a UX person – I think those people come to events like this too – get in touch with me about the Catalog of Missing Devices. Because we're putting together a design intervention as part of the 1201 project to help people start to realize what's missing because of 1201. This is an underappreciated fact about 1201: there are all these devices missing from the field, and it's hard to notice what's not there. With a patent fight there's a device everybody loves, some patent troll has a dumb patent, they make that device disappear, everybody gets angry. But with DMCA 1201 the device just never shows up in the field, and people don't even notice it's not there. They kind of assume that maybe physics is the reason that you stick a CD in your computer and the computer wakes up and runs some manufacturer-supplied software that rips, mixes and burns your CD to put on your mobile device, but you stick a DVD in and all it lets you do is the same thing you could do with it in 1996, which is watch it. And people are like 'oh, I guess there just must be something technically impossible about doing more stuff with DVDs.' So we're making this catalog of all the stuff that was stolen from your future.

[37:11] And if you're into this stuff, if you're a UX or UI designer, if you're a product designer, get in touch with me and talk to me about contributing to the Catalog of Missing Devices. We can also offer you tax receipts for the value of the consulting that you do on this stuff. Because we're a 501(c)(3), we've got lots of ways to make this good for firms that want to work on it with us. And we'll also make sure that you get credit, and you'll be helping out in an important way.

[37:35] And then my last plea to all of you: if you're a W3C member, if you're a member of the World Wide Web Consortium, or if you work for a company that's a member of the World Wide Web Consortium, also get in touch with me. Because the World Wide Web Consortium last year took the dangerous and awful step of adding DRM to the realm of technologies that they're willing to standardize for the web. Which means that our web-based front ends, which are supposed to replace plugin-based or app-based front ends for everything from our pacemakers to our thermostats, will all have components that are unlawful to report vulnerabilities in, and will thus be subject to having those vulnerabilities fester in them. And we have a project to reform the way that the W3C deals with this, so if you are involved with the W3C I really want to hear from you. So: the DMCA has been festering since 1998. It's an unsightly boil on the American legal system, and it has spread all over the world thanks to the US Trade Representative. And once it stops being enforced in America, every one of the countries that has adopted its own version of 1201 at the behest of this government will be poised to remove its own 1201 laws. Because after all, if you're in a suicide pact and the other side backs out and says 'no, we won't make these products that our people like[?],' then it's natural for you to want to back out too. With your help we are going to squeeze this zit, forever and everywhere in the world. Thank you.

Questions and Answers[edit | edit source]

[39:10] I have about 10 minutes for Q&A. I remind you that a long, rambling statement followed by 'what do you think of that?' is a question, but not a good one. I will alternate between people who identify as women or non-binary and men, because otherwise the Q&A is always a sausage fest.

[39:58] So the question is: 'Are there no cases where DRM works? Like, for example, subsidy consoles, where they sell the console below cost and then use money from the games to recoup those costs. Or other models where I take a picture and I send it to you, but I want to make sure you don't share it on – how do I do that without DRM?' I think that those are two separate cases, so I'm going to take them separately.

[40:24] In the second case, the Snapchat case: what Snapchat, or Wickr, or all those other 'disappearing ink' tools are good for is a system in which you trust the other party but you don't trust their operations security (OPSEC). So you say to them 'we're going to share a document and we're both going to delete it after we've looked at it, but sometimes you forget, because human beings are crappy computers.' And that works really well. Auto-enforcing OPSEC – great! It's totally terrible at stopping people you don't trust from sharing information that you shared with them and asked them not to share on. Because if you send a photo to my device, at bare minimum I can just take a picture of it. But also, it's my device, and you've hidden some keys in it and you expect that I'm not going to find them. Again, we're back to wishful thinking. Would the world be a better place if we could figure out how to turn off gravity? Well, as someone with chronic back pain, yes. But are we going to defend things that don't have any nexus with turning off gravity, but let people pretend that they do, and have all these horrible side effects? No.

[41:32] As to 'can we protect subsidy hardware by allowing the state to spend an unlimited number of tax dollars to prosecute people who do otherwise lawful things to that subsidized hardware?' – I guess the answer depends on whether or not you think the state should be deciding which business models work. I think that we have lots of hardware that works without subsidy. Nobody came down off a mountain with two stone tablets that said 'the only way a console shall ever be monetized is through discount hardware.' In the games world it's especially interesting, because the games world has a long history of not being protected by law. Back in the days of floppy- and CD-based software piracy, the games industry was one of the first major constituencies to go to the lawmaking bodies and say 'we're being pirated into the ground.' And of course lawmakers hated the games industry, so their response was 'that's awesome.' So the games industry invented network-based play like Warcraft, which quickly became larger than all of the other games ever invented. So they'll adapt or they'll die, right? I mean, we had video games before we had subsidy hardware; we'll have video games after we have subsidy hardware. And if the question is 'do we get to know about vulnerabilities in consoles in our living rooms with cameras and microphones, or do we get to have subsidy hardware for Mario reboots?' I'll take knowing about the vulnerabilities in the cameras.

[42:55] Do we have any people who identify as women or non-binary who'd like to ask the next question? Anyone?

[43:11] All right do we have any dudes who'd like to ask a question? Thank you.

[43:16] [Question:] There's been discussion about applying DRM to allow people to protect their private information when they share it with a website or some sort of service. Can you comment on what you think about that?

[43:31] Yeah, so: what about IRM, which is what this was called when Microsoft first launched it – Information Rights Management? The idea is: I encrypt the data, I send it to you, you have a client that has the keys to decrypt it, the client looks for business rules that come along with the encrypted document – like 'no print, no forward, no view after six hours' – and then it obeys those rules; it enforces those rules against you. So again, this is a great tool inside a firm if you want to make sure that OPSEC just happens automatically. If the thing is that you and I have agreed that after we've both looked at this document and dispensed with it we're not going to retain it – we have, like, a seven-year document retention window, and we want to dispose of everything after seven years so that if there's ever a lawsuit against us the discovery process won't involve old documents we forgot to get rid of, that we have to pay outside counsel $400 an hour to review – then that's a great tool.

[44:25] But if the idea is that I'm a giant company based in Mountain View [Google] or Redmond [Microsoft] or Cupertino [Apple], and you are going to send me a bunch of personally identifying, compromising information, but you're going to lock it up with DRM, and I will run the client that you trust, and that software will make sure that I never do anything untoward with it – that seems really technologically implausible. We're back to 'it would be awesome if gravity didn't work sometimes.' If you don't trust your adversary, and you send them both the document and the keys to decrypt the document, and then you ask them nicely not to do bad things with that information, it seems really unlikely that that'll work.
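The IRM scheme described above can be sketched in a few lines of toy code (everything here – the "encryption," the rule names, the clients – is hypothetical and for illustration only). The compliant client enforces the business rules; a rogue client built around the same key simply doesn't.

```python
# Sketch of why client-enforced "business rules" (no-print, no-forward,
# expiry) only bind cooperative clients: the recipient holds the
# decryption key either way. Toy cipher and names, not real IRM.
def decrypt(blob: dict, key: int) -> str:
    """Toy 'decryption': undo a per-character XOR with the key."""
    return "".join(chr(c ^ key) for c in blob["payload"])

def compliant_view(blob: dict, key: int) -> str:
    """The vendor-approved client: checks rules before decrypting."""
    if blob["rules"].get("no_view"):
        raise PermissionError("rules forbid viewing this document")
    return decrypt(blob, key)

key = 42  # the recipient must be given this to read anything at all
doc = {
    "rules": {"no_view": True},
    "payload": [ord(c) ^ key for c in "quarterly numbers"],
}

# The compliant client dutifully enforces the rule...
try:
    compliant_view(doc, key)
except PermissionError:
    pass

# ...but a rogue client with the same key just reads the payload,
# ignoring the rules entirely.
print(decrypt(doc, key))
```

Inside a firm, where everyone runs the compliant client in good faith, this automates retention policy; against an untrusted counterparty, the rules are a polite request.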

[45:13] We have attacks on DRM-encrypted ebooks that are totally non-encryption-breaking, like screen scraping, where you just run a screen-scraping toolkit that advances the page once a second, screenshots the predictable rectangle the page shows up in, and runs it through OCR [optical character recognition]. So if you send me a million-line spreadsheet and it's in a machine-readable typeface, it's just not that hard for me to convert that back to machine-readable data with no restrictions.

[45:46] So the fact that it would be awesome if that did work doesn't mean that it does work. It needs to be technologically plausible as well as a good idea. And the problem is that the way we defend that today is by saying 'you go to jail if you report a vulnerability that would allow you to do that kind of thing.' As we've seen, it doesn't stop bad guys, whatever your definition of bad guy is – whether that's state-level actors that sit in your own government's capital, or state-level actors that sit in some other government's capital, or script kiddies or identity thieves or voyeurs. It doesn't stop them from violating the law. It's a bit like the rule against extracting keys to rip DVDs, right? If you're already prepared to share a DVD and break the law, the fact that extracting keys to rip the DVD also breaks the law is not much of a deterrent. It's like 'I was going to break this law, but then I found out I'd be breaking another law, so I stopped breaking the law,' right? It doesn't seem likely that that's going to happen.

[46:40] I'm all for having things that make companies better about their document handling and retention, especially with respect to personal information. And I love ideas like going to insurance underwriters and reinsurers and saying 'you're allowing the companies that you write policies for to treat personal information as though the only cost associated with it is hard drives, and really what you should be doing is factoring in the full cost of the eventual breach of that information, because we have exactly one gold standard for not having information breaches, and that's not collecting and retaining information.' Companies are able to subsidize themselves by getting insurers to write them cheap policies, because for some reason insurers are acting like there's no risk to retaining personally identifiable information, and that's crazily dumb.

[47:26] I think that not passing laws that require deep data retention, like the ones we have in the EU and like those being proposed in the US, is also a really good way to get away from this stuff. Right now a lot of firms are bound through compliance to gather and retain tons of personally identifiable information, and that's just asking for trouble. David Cameron, the Prime Minister of the United Kingdom, a country I've just emigrated from (those two facts are not coincidences), has just announced that everyone who operates a porn site that is accessible within the United Kingdom is going to have to gather and retain proof of age – which in practice means credit card numbers – for everyone who visits the website, to make sure that they're over 18. Or they'll be blocked at the national firewall of the UK, which was instituted in the last Parliament. And given that porn sites are no better than the Office of Personnel Management at protecting themselves from breaches, what he's just said is: we are going to gather, and then release, a net-worth-indexed record of the pornography tastes of everyone in the United Kingdom. This is a crazily bad idea that we don't need DRM to fix in terms of privacy; we just need to stop mandating that companies retain personally identifiable information, because we should assume that the breach will always occur.

[49:03] [Restating a question:] Is there a list of companies that are friendly towards 1201 versus not friendly? There is a bit of one. If you look at the 1201 docket at the Copyright Office – the 1201 triennial exemption docket – you will see the companies that insisted that 1201 must stay intact. And Apple is one of them. Although Steve Jobs wrote that letter saying DRM sucks, and although one of the jailbreaking exemptions granted at the last couple of triennials is an exemption for phones – though not tablets, because the Copyright Office says it can't tell the difference between a tablet and a laptop, which suggests that they probably shouldn't be regulating either of them – Apple, every year when this exemption comes up, says 'the sky will fall!' if you're allowed to decide what software runs on your iPhone. And you can also see the companies that insisted that DRM should be a [?] product of the W3C, if you look at them. They are the big ones.

[50:05] On the other side, there are companies that play both sides of the fence. When Amazon was hoping to open a music store to rival Apple's DRM-locked music store, they launched the MP3 store, and their slogan was 'DRM – Don't Restrict Me.' Then they bought Audible, which sells digital audiobooks, and they said 'by the way, if you make audiobooks and sell them through our platform – which is responsible for 90% of audiobook sales in the world and is the sole supplier of spoken-word material to the iTunes Store – you must have DRM.' So many of those companies play both sides of the fence.

Thank you! Support EFF!